Higher Ed 2.0
By Susan Carini | Emory Magazine | Oct. 28, 2013
Call it sensational, but it had the intended effect. On September 9 of last year, a Newsweek cover demanded, "Is College a Lousy Investment?"
It is a question that would have been unthinkable in the historic period of growth that higher education experienced in the aftermath of World War II. Then, it was perceived as a public good and not—as some consider it now—a private benefit conferring economic reward.
For close observers of higher education, the negative turn is now decades old, beginning after 1970 as critics voiced concerns over unchecked expansion. In the 1980s and 1990s, the drumbeat continued as the costs of attending college outstripped the economic returns. And since 2008, the economic downturn and shrinking government support for research have only intensified the pressures on the industry.
Even before the economy faltered, the US Department of Education issued a 2006 report titled "A Test of Leadership: Charting the Future of US Higher Education." One sentence stands out: "What we have learned . . . makes clear that American higher education has become what, in the business world, would be called a mature enterprise: increasingly risk-averse, at times self-satisfied, and unduly expensive." Issues that higher education sometimes has seemed slow to address include runaway prices, chronic inefficiencies, uneven outcomes, lifetime faculty tenure, arcane research, and scattered authority.
The uncertainty has even breached the walls of the ivory tower. A survey of the American public and of more than a thousand college and university presidents, conducted this past spring by the Pew Research Center in association with the Chronicle of Higher Education, revealed significant concerns not only about the costs of education but also about its direction and goals.