America has forgotten the value of the humanities at the moment it needs them most
Neil deGrasse Tyson recently tweeted, “In school, rarely do we learn how data become facts, how facts become knowledge, and how knowledge becomes wisdom.” A librarian replied, “Hi Neil, That’s literally what we teach. Thanks for the shoutout! Sincerely, The Humanities.”
When a champion of critical thinking like Tyson is unclear on the very purpose of the humanities, it’s fair to say higher education is facing a public relations crisis. That crisis is also reflected in a recent Pew Research Center poll showing that a majority of Republicans believe higher education has a “negative effect” on the country.
This is a serious crisis. Universities face untenable budgets and a dire faculty job market at the same time the public is questioning the value of a college education in light of rising tuition and student loan burdens. But the transformation in public attitudes toward universities is not based on a concrete loss of value: Higher education continues to correlate with improved employability and incomes. U.S. universities continue — for the time being — to maintain a global competitive edge.
Instead, people’s attitudes about college reflect a changed political perception about the role that higher education plays in American life. Rightward shifts in attitudes toward government investment and the value of social mobility and diversity have transformed the idea of public investment in education from a staple of American society to a partisan wedge issue. It’s not that the university is no longer providing what it used to — though it is having increasing difficulty doing so. Rather, the right has abandoned the premise of liberal arts education because conservatives increasingly perceive it not as a driver of broad social and economic advancement but as a mere incubator for liberal ideas in the narrow political sense.
American universities evolved from earlier European models that aimed to prepare society’s elite for leadership or to generate new knowledge through research. Though both missions had clear public benefits, the earliest universities were privately funded. As industrialization and modern bureaucratic states empowered the middle class in the 19th century, though, universities also began to serve as a pathway to social advancement, with land-grant universities founded across the United States. But college remained mostly an elite affair. ...