The American public is more broadly engaged in the arts than previously understood. Americans believe that the arts not only play a vital role in personal well-being and healthier communities, but are also core to a well-rounded education.

