A dispute is emerging in conservative thinking about higher education. On one side are those who embrace “creative destruction.” Nimble, cheap, online competitors will drive many brick-and-mortar institutions out of business, and good riddance to those bloated, government-backed guilds that care more about indoctrinating students than they do about preparing them to get jobs. Resisting the destruction of the old model is, on this account, both futile and “wicked.”
On the other side are those who acknowledge the defects, even the “decadence,” of much of higher education but worry that the “transformative agenda” is “about exploiting the decadence to root out the quality as well.” The “disruptive logic of the market” may lead not only to a comeuppance for conservatism’s many enemies in academia but also to the “disappearance of close reading of the ‘real books’ of philosophy, literature, theology, and so forth . . . because they’re unreliable and not cost-efficient.”
A weapon of choice for the creative destruction camp is the claim that college is merely a very expensive screening device, one that flags not what students have learned in college but the attributes they possessed beforehand, the very attributes that got them admitted in the first place. If college signals a student’s preexisting strengths rather than cultivating those strengths, then even those who are eager to preserve the study of great works may have to concede that they are better off finding another home than bunking with flim-flam artists.
Higher education is “worth it” for a lot of students because many employers prefer candidates with college degrees. But in a recent Minding the Campus essay, Richard Vedder argues, as he has before, that employers favor degrees only because they are not allowed to use better screening tests. In Griggs v. Duke Power Co. (1971) the Supreme Court “outlawed testing that had a ‘disparate impact’ on minorities.” Since employers can’t give I.Q. tests to applicants, they use college degrees, whose cost is borne by students, parents, and taxpayers. Vedder explains that college degrees are reasonably good screening devices because degree holders have, on average, stronger cognitive skills and motivation than those without degrees. But they had those before they got into college, where students neither work hard nor learn much.
But this argument is specious. While students’ weak performance on the Collegiate Learning Assessment (CLA), among other measures of achievement, should worry us, employers themselves seem to think that students do learn something in college.
First, it’s not true that employers use college degrees to screen job candidates because they have no other means of doing so. According to this Wall Street Journal story, reporting employer interest in the CLA, companies “such as General Mills Inc. and Procter & Gamble Co.” have developed “their own job-applicant assessments.” And according to a survey conducted by the Educational Testing Service, more than a quarter of businesses are already using its Graduate Record Examination “to evaluate job applicants.” Indeed, if employers can’t get away with using tests to screen candidates, why does Vedder recommend that employers use the CLA, which seems on its face as vulnerable to disparate impact complaints as other aptitude tests are?
Second, a recent survey of human resources professionals shows that employers discriminate among different kinds of education, and not merely on the basis of prestige. Most strikingly, when asked to choose between a candidate bearing a degree from an average school, completed entirely in the classroom, and a candidate with an online-only degree from a top school, 56 percent opted for the average but traditional degree. Only 17 percent went the other way. That finding holds even though 45 percent of the HR professionals surveyed thought that online degrees required more discipline than traditional degrees, compared to just 23 percent who thought online degrees required less discipline.
Perhaps employers think a traditional degree signals a candidate’s social skills, or that students learn more in traditional programs (43 percent think so, 49 percent think they learn about the same amount, and 4 percent think online students learn more). Or maybe they think that traditional programs are tougher (39 percent say online-only programs are easier, 41 percent think they are about as difficult, and 13 percent think they are more difficult). Whether some or all of these reasons are in play, hiring professionals think that college is not merely a measure of the talents students possessed before they enrolled but a measure of something they gain there, which programs can be more or less successful at providing.
While I am reporting the results of only one survey, there is little evidence that employers think students learn nothing in college. Employers unquestionably wish colleges were doing much better, but that is a far cry from the thesis Vedder and the partisans of creative destruction defend. Perhaps, then, those who care about the education of our young should identify and support the outposts of excellence in American higher education, rather than summoning a flood to wash the good away with the bad.