Is university a waste of time? In a piece for the Atlantic, Bryan Caplan quotes a statistic that would seem to settle the question:
“When we look at countries around the world, a year of education appears to raise an individual’s income by 8 to 11 percent.”
Not bad, especially when multiplied across the typical duration of an undergraduate degree course. However, there’s a complication:
“…increasing education across a country’s population by an average of one year per person raises the national income by only 1 to 3 percent. In other words, education enriches individuals much more than it enriches nations.”
Caplan ascribes the mismatch to “credential inflation”:
“As the average level of education rises, you need more education to convince employers you’re worthy of any specific job. One research team found that from the early 1970s through the mid‑1990s, the average education level within 500 occupational categories rose by 1.2 years… all American workers’ education rose by 1.5 years in that same span—which is to say that a great majority of the extra education workers received was deployed not to get better jobs, but to get jobs that had recently been held by people with less education.”
Have we turned education into a treadmill, with each generation having to invest ever greater amounts of time and money just to stand still? Or are students bringing added value to the jobs market, not just acquiring credentials?
At the very least, one might expect a vastly expanded graduate population to be super-literate and highly-cultured. Caplan cites evidence to the contrary:
“In 2003, the United States Department of Education gave about 18,000 Americans the National Assessment of Adult Literacy. The ignorance it revealed is mind-numbing. Fewer than a third of college graduates received a composite score of ‘proficient’—and about a fifth were at the ‘basic’ or ‘below basic’ level… Tests of college graduates’ knowledge of history, civics, and science have had similarly dismal results.”
But are these things that universities should be expected to teach? If schools don’t equip students with the fundamentals, then isn’t it too late by the time they get to college? In fact, there’s evidence that a university education can, and should, make a difference:
“One ambitious study at the University of Michigan tested natural-science, humanities, and psychology and other social-science majors on verbal reasoning, statistical reasoning, and conditional reasoning during the first semester of their first year. When the same students were retested the second semester of their fourth year, each group had sharply improved in precisely one area. Psychology and other social-science majors had become much better at statistical reasoning. Natural-science and humanities majors had become much better at conditional reasoning… In the remaining areas, however, gains after three and a half years of college were modest or nonexistent.”
Why can’t students improve across the board? After all, most workplaces require a mix of reasoning skills, not just those most relevant to a particular academic discipline. This mix might be acquired if undergraduates had a broader-based education; however, that would require them to put the hours in:
“Fifty years ago, college was a full-time job. The typical student spent 40 hours a week in class or studying. Effort has since collapsed across the board. “Full time” college students now average 27 hours of academic work a week—including just 14 hours spent studying.”
For all the talk of ‘lifelong learning’, the reality is that young adults have a once-in-a-lifetime opportunity to develop their minds. To allow so many students to waste these golden years – and to shoulder so much debt to so little effect – is shameful.