College Calculus

As the supply of college grads expands, many are taking jobs that shouldn’t require a degree. Illustration by Leo Espinosa

If there is one thing most Americans have been able to agree on over the years, it is that getting an education, particularly a college education, is a key to human betterment and prosperity. The consensus dates back at least to 1636, when the legislature of the Massachusetts Bay Colony established Harvard College as America’s first institution of higher learning. It extended through the establishment of “land-grant colleges” during and after the Civil War, the passage of the G.I. Bill during the Second World War, the expansion of federal funding for higher education during the Great Society era, and President Obama’s efforts to make college more affordable. Already, the cost of higher education has become a big issue in the 2016 Presidential campaign. Three Democratic candidates—Hillary Clinton, Martin O’Malley, and Bernie Sanders—have offered plans to reform the student-loan program and make college more accessible.

Promoters of higher education have long emphasized its role in meeting civic needs. The Puritans who established Harvard were concerned about a shortage of clergy; during the Progressive Era, John Dewey insisted that a proper education would make people better citizens, with enlarged moral imaginations. Recently, as wage stagnation and rising inequality have emerged as serious problems, the economic arguments for higher education have come to the fore. “Earning a post-secondary degree or credential is no longer just a pathway to opportunity for a talented few,” the White House Web site states. “Rather, it is a prerequisite for the growing jobs of the new economy.” Commentators and academic economists have claimed that college doesn’t merely help individuals get higher-paying jobs; it raises wages throughout the economy and helps ameliorate rising inequality. In an influential 2008 book, “The Race Between Education and Technology,” the Harvard economists Claudia Goldin and Lawrence F. Katz argued that technological progress has dramatically increased the demand for skilled workers, and that, in recent decades, the American educational system has failed to meet the challenge by supplying enough graduates who can carry out the tasks that a high-tech economy requires. “Not so long ago, the American economy grew rapidly and wages grew in tandem, with education playing a large, positive role in both,” they wrote in a subsequent paper. “The challenge now is to revitalize education-based mobility.”

The “message from the media, from the business community, and even from many parts of the government has been that a college degree is more important than ever in order to have a good career,” Peter Cappelli, a professor of management at Wharton, notes in his informative and refreshingly skeptical new book, “Will College Pay Off?” (PublicAffairs). “As a result, families feel even more pressure to send their kids to college. This is at a time when more families find those costs to be a serious burden.” During recent decades, tuition and other charges have risen sharply—many colleges charge more than fifty thousand dollars a year in tuition and fees. Even if you factor in the expansion of financial aid, Cappelli reports, “students in the United States pay about four times more than their peers in countries elsewhere.”

Despite the increasing costs—and the claims about a shortage of college graduates—the number of people attending and graduating from four-year educational institutions keeps going up. In the 2000-01 academic year, American colleges awarded almost 1.3 million bachelor’s degrees. A decade later, the figure had jumped nearly forty per cent, to more than 1.7 million. About seventy per cent of all high-school graduates now go on to college, and half of all Americans between the ages of twenty-five and thirty-four have a college degree. That’s a big change. In 1980, only one in six Americans twenty-five and older was a college graduate. Fifty years ago, it was fewer than one in ten. To cater to all the new students, colleges keep expanding and adding courses, many of them vocationally inclined. At Kansas State, undergraduates can major in Bakery Science and Management or Wildlife and Outdoor Enterprise Management. They can minor in Unmanned Aircraft Systems or Pet Food Science. Oklahoma State offers a degree in Fire Protection and Safety Engineering and Technology. At Utica College, you can major in Economic Crime Investigation.

In the fast-growing for-profit college sector, which now accounts for more than ten per cent of all students, vocational degrees are the norm. DeVry University—which last year taught more than sixty thousand students, at more than seventy-five campuses—offers majors in everything from multimedia design and development to health-care administration. On its Web site, DeVry boasts, “In 2013, 90% of DeVry University associate and bachelor’s degree grads actively seeking employment had careers in their field within six months of graduation.” That sounds impressive—until you notice that the figure includes those graduates who had jobs in their field before graduation. (Many DeVry students are working adults who attend college part-time to further their careers.) Nor is the phrase “in their field” clearly defined. “Would you be okay rolling the dice on a degree in communications based on information like that?” Cappelli writes. He notes that research by the nonprofit National Association of Colleges and Employers found that, in the same year, just 6.5 per cent of graduates with communications degrees were offered jobs in the field. It may be unfair to single out DeVry, which is one of the more reputable for-profit education providers. But the example illustrates Cappelli’s larger point: many of the claims that are made about higher education don’t stand up to scrutiny.

“It is certainly true that college has been life changing for most people and a tremendous financial investment for many of them,” Cappelli writes. “It is also true that for some people, it has been financially crippling. . . . The world of college education is different now than it was a generation ago, when many of the people driving policy decisions on education went to college, and the theoretical ideas about why college should pay off do not comport well with the reality.”

No idea has had more influence on education policy than the notion that colleges teach their students specific, marketable skills, which they can use to get a good job. Economists refer to this as the “human capital” theory of education, and for the past twenty or thirty years it has gone largely unchallenged. If you’ve completed a two-year associate’s degree, you’ve got more “human capital” than a high-school graduate. And if you’ve completed a four-year bachelor’s degree, you’ve got more “human capital” than someone who attended a community college. Once you enter the labor market, the theory says, you will be rewarded with a better job, brighter career prospects, and higher wages.

There’s no doubt that college graduates earn more money, on average, than people who don’t have a degree. And for many years the so-called “college wage premium” grew. In 1970, according to a recent study by researchers at the Federal Reserve Bank of New York, people with a bachelor’s degree earned about sixty thousand dollars a year, on average, and people with a high-school diploma earned about forty-five thousand dollars. Thirty-five years later, in 2005, the average earnings of college graduates had risen to more than seventy thousand dollars, while high-school graduates had seen their earnings fall slightly. (All these figures are inflation-adjusted.) The fact that the college wage premium went up at a time when the supply of graduates was expanding significantly seemed to confirm the Goldin-Katz theory that technological change was creating an ever-increasing demand for workers with a lot of human capital.

During the past decade or so, however, a number of things have happened that don’t easily mesh with that theory. If college graduates remain in short supply, their wages should still be rising. But they aren’t. In 2001, according to the Economic Policy Institute, a liberal think tank in Washington, workers with undergraduate degrees (but not graduate degrees) earned, on average, $30.05 an hour; last year, they earned $29.55 an hour. Other sources show even more dramatic falls. “Between 2001 and 2013, the average wage of workers with a bachelor’s degree declined 10.3 percent, and the average wage of those with an associate’s degree declined 11.1 percent,” the New York Fed reported in its study. Wages have been falling most steeply of all among newly minted college graduates. And jobless rates have been rising. In 2007, 5.5 per cent of college graduates under the age of twenty-five were out of work. Today, the figure is close to nine per cent. If getting a bachelor’s degree is meant to guarantee entry to an arena in which jobs are plentiful and wages rise steadily, the education system has been failing for some time.

And, while college graduates are still doing a lot better than nongraduates, some studies show that the earnings gap has stopped growing. The figures need careful parsing. If you lump college graduates in with people with advanced degrees, the picture looks brighter. But almost all the recent gains have gone to folks with graduate degrees. “The four-year-degree premium has remained flat over the past decade,” the Federal Reserve Bank of Cleveland reported. And one of the main reasons it went up in the first place wasn’t that college graduates were enjoying significantly higher wages. It was that the earnings of nongraduates were falling.

Many students and their families extend themselves to pay for a college education out of fear of falling into the low-wage economy. That’s perfectly understandable. But how sound an investment is it? One way to figure this out is to treat a college degree like a stock or a bond and compare the cost of obtaining one with the accumulated returns that it generates over the years. (In this case, the returns come in the form of wages over and above those earned by people who don’t hold degrees.) When the research firm PayScale did this a few years ago, it found that the average inflation-adjusted return on a college education is about seven per cent, which is a bit lower than the historical rate of return on the stock market. Cappelli cites this study along with one from the Hamilton Project, a Washington-based research group that came up with a much higher figure—about fifteen per cent—but by assuming, for example, that all college students graduate in four years. (In fact, the four-year graduation rate for full-time, first-degree students is less than forty per cent, and the six-year graduation rate is less than sixty per cent.)
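
In stylized form, these calculations treat a degree as an investment with an up-front cost and a long stream of payoffs, and solve for the rate of return that balances the two. As a minimal sketch, with purely illustrative symbols rather than anything drawn from the studies themselves: if a degree costs C and yields a wage premium p_t in each year t of a T-year career, its implied annual return is the rate r that satisfies

\[
\sum_{t=1}^{T} \frac{p_t}{(1+r)^{t}} = C.
\]

PayScale’s seven per cent and the Hamilton Project’s fifteen per cent are, in effect, estimates of r under different assumptions about what C includes and how long students take to graduate.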

These types of studies, and there are lots of them, usually find that the financial benefits of getting a college degree are much larger than the financial costs. But Cappelli points out that for parents and students the average figures may not mean much, because they disguise enormous differences in outcomes from school to school. He cites a survey, carried out by PayScale for Businessweek in 2012, that showed that students who attend M.I.T., Caltech, and Harvey Mudd College enjoy an annual return of more than ten per cent on their “investment.” But the survey also found almost two hundred colleges where students, on average, never fully recouped the costs of their education. “The big news about the payoff from college should be the incredible variation in it across colleges,” Cappelli writes. “Looking at the actual return on the costs of attending college, careful analyses suggest that the payoff from many college programs—as much as one in four—is actually negative. Incredibly, the schools seem to add nothing to the market value of the students.”

So what purpose does college really serve for students and employers? Before the human-capital theory became so popular, there was another view of higher education—as, in part, a filter, or screening device, that sorted individuals according to their aptitudes and conveyed this information to businesses and other hiring institutions. By completing a four-year degree, students could signal to potential employers that they had a certain level of cognitive competence and could carry out assigned tasks and work in a group setting. But a college education didn’t necessarily imbue students with specific work skills that employers needed, or make them more productive.

Kenneth Arrow, one of the giants of twentieth-century economics, came up with this account, and if you take it seriously you can’t assume that it’s always a good thing to persuade more people to go to college. If almost everybody has a college degree, getting one doesn’t differentiate you from the pack. To get the job you want, you might have to go to a fancy (and expensive) college, or get a higher degree. Education turns into an arms race, which primarily benefits the arms manufacturers—in this case, colleges and universities.

The screening model isn’t very fashionable these days, partly because it seems perverse to suggest that education doesn’t boost productivity. But there’s quite a bit of evidence that seems to support Arrow’s theory. In recent years, more jobs have come to demand a college degree as an entry requirement, even though the demands of the jobs haven’t changed much. Some nursing positions are among them, along with jobs for executive secretaries, salespeople, and distribution managers. According to one study, just twenty per cent of executive assistants and insurance-claims clerks have college degrees, but more than forty-five per cent of the job openings in the field require one. “This suggests that employers may be relying on a B.A. as a broad recruitment filter that may or may not correspond to specific capabilities needed to do the job,” the study concluded.

It is well established that students who go to élite colleges tend to earn more than graduates of less selective institutions. But is this because Harvard and Princeton do a better job of teaching valuable skills than other places, or because employers believe that they get more talented students to begin with? An exercise carried out by Lauren Rivera, of the Kellogg School of Management, at Northwestern, strongly suggests that it’s the latter. Rivera interviewed more than a hundred recruiters from investment banks, law firms, and management consulting firms, and she found that they recruited almost exclusively from the very top-ranked schools, and simply ignored most other applicants. The recruiters didn’t pay much attention to things like grades and majors. “It was not the content of education that elite employers valued but rather its prestige,” Rivera concluded.

If higher education serves primarily as a sorting mechanism, that might help explain another disturbing development: the tendency of many college graduates to take jobs that don’t require college degrees. Practically everyone seems to know a well-educated young person who is working in a bar or a mundane clerical job, because he or she can’t find anything better. Doubtless, the Great Recession and its aftermath are partly to blame. But something deeper, and more lasting, also seems to be happening.

In the Goldin-Katz view of things, technological progress generates an ever-increasing need for highly educated, highly skilled workers. But, beginning in about 2000, for reasons that are still not fully understood, the pace of job creation in high-paying, highly skilled fields slowed significantly. To demonstrate this, three Canadian economists, Paul Beaudry, David A. Green, and Benjamin M. Sand, divided the U.S. workforce into a hundred occupations, ranked by their average wages, and looked at how employment has changed in each category. Since 2000, the economists showed, the demand for highly educated workers has declined, while job growth in low-paying occupations has increased strongly. “High-skilled workers have moved down the occupational ladder and have begun to perform jobs traditionally performed by lower-skilled workers,” they concluded, thus “pushing low-skilled workers even further down the occupational ladder.”
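
The mechanics of that exercise are simple enough to sketch in code. The toy version below, in Python, uses wholly invented numbers and stipulates the pattern the economists found instead of estimating it from census data; it merely ranks a hundred hypothetical occupations by average wage and compares employment growth at the two ends of the ladder.

```python
import random

random.seed(0)

# A hypothetical wage ladder: each occupation gets an average wage, an
# employment level in 2000, and a later employment level. The growth
# pattern (low-wage jobs expanding, high-wage jobs shrinking) is assumed
# here for illustration, not estimated from real data.
occupations = []
for i in range(100):
    wage = 20_000 + 1_000 * i                      # dollars per year
    emp_2000 = random.randint(200, 1_000)          # thousands of workers
    growth = 0.25 - 0.004 * i + random.uniform(-0.05, 0.05)
    occupations.append((wage, emp_2000, emp_2000 * (1 + growth)))

# Rank occupations from lowest- to highest-paid, as in the study.
occupations.sort(key=lambda occ: occ[0])

def employment_growth(group):
    emp_before = sum(e for _, e, _ in group)
    emp_after = sum(e for _, _, e in group)
    return (emp_after - emp_before) / emp_before

print(f"Bottom quartile of wages: {employment_growth(occupations[:25]):+.1%}")
print(f"Top quartile of wages:    {employment_growth(occupations[-25:]):+.1%}")
```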

Increasingly, the competition for jobs is taking place in areas of the labor market where college graduates didn’t previously tend to compete. As Beaudry, Green, and Sand put it, “having a B.A. is less about obtaining access to high paying managerial and technology jobs and more about beating out less educated workers for the Barista or clerical job.” Even many graduates in science, technology, engineering, and mathematics—the so-called STEM subjects, which receive so much official encouragement—are having a tough time getting the jobs they’d like. Cappelli reports that only about a fifth of recent graduates with STEM degrees got jobs that made use of that training. “The evidence for recent grads suggests clearly that there is no overall shortage of STEM grads,” he writes.

Why is this happening? The short answer is that nobody knows for sure. One theory is that corporate cost-cutting, having thinned the ranks of workers on the factory floor and in routine office jobs, is now targeting supervisors, managers, and other highly educated people. Another theory is that technological progress, after favoring highly educated workers for a long time, is now turning on them. With rapid advances in processing power, data analysis, voice recognition, and other forms of artificial intelligence, computers can perform tasks that were previously carried out by college graduates, such as analyzing trends, translating foreign-language documents, and filing tax returns. In “The Second Machine Age” (Norton), the M.I.T. professors Erik Brynjolfsson and Andrew McAfee sketch a future where computers will start replacing doctors, lawyers, and many other highly educated professionals. “As digital labor becomes more pervasive, capable, and powerful,” they write, “companies will be increasingly unwilling to pay people wages that they’ll accept, and that will allow them to maintain the standard of living to which they’ve been accustomed.”

Cappelli stresses the change in corporate hiring patterns. In the old days, Fortune 500 companies such as General Motors, Citigroup, and I.B.M. took on large numbers of college graduates and trained them for a lifetime at the company. But corporations now invest less in education and training, and, instead of promoting someone, or finding someone in the company to fill a specialized role, they tend to hire from outside. Grooming the next generation of leadership is much less of a concern. “What employers want from college graduates now is the same thing they want from applicants who have been out of school for years, and that is job skills and the ability to contribute now,” Cappelli writes. “That change is fundamental, and it is the reason that getting a good job out of college is now such a challenge.”

Obtaining a vocational degree or certificate is one strategy that many students employ to make themselves attractive to employers, and, on the face of it, this seems sensible. If you’d like to be a radiology technician, shouldn’t you get a B.A. in radiology? If you want to run a bakery, why not apply to Kansas State and sign up for that major in Bakery Science? But narrowly focussed degrees are risky. “If you graduate in a year when gambling is up and the casinos like your casino management degree, you probably have hit it big,” Cappelli writes. “If they aren’t hiring when you graduate, you may be even worse off getting a first job with that degree anywhere else precisely because it was so tuned to that group of employers.” During the dot-com era, enrollment in computer-science and information-technology programs rose sharply. After the bursting of the stock-market bubble, many of these graduates couldn’t find work. “Employers who say that we need more engineers or IT grads are not promising to hire them when they graduate in four years,” Cappelli notes. “Pushing kids into a field like health care because someone believes there is a need there now will not guarantee that they all get jobs and, if they do, that those jobs will be as good as workers in that field have now.”

So what’s the solution? Some people believe that online learning will provide a viable low-cost alternative to a live-in college education. Bernie Sanders would get rid of tuition fees at public universities, raising some of the funds with a new tax on financial transactions. Clinton and O’Malley would also expand federal support for state universities, coupling this funding with lower interest rates on student loans and incentives for colleges to hold down costs. Another approach is to direct more students and resources to two-year community colleges and other educational institutions that cost less than four-year colleges. President Obama recently called for all qualified high-school students to be guaranteed a place in community college, and for tuition fees to be eliminated. Such policies would reverse recent history. In a new book, “Learning by Doing: The Real Connection between Innovation, Wages, and Wealth” (Yale), James Bessen, a technology entrepreneur who also teaches at Boston University School of Law, points out that “the policy trend over the last decade has been to starve community colleges in order to feed four-year colleges, especially private research universities.” Some of the discrepancies are glaring. Richard Vedder, who teaches economics at Ohio University, calculated that in 2010 Princeton, which had an endowment of close to fifteen billion dollars, received state and federal benefits equivalent to roughly fifty thousand dollars per student, whereas the nearby College of New Jersey got benefits of just two thousand dollars per student. There are sound reasons for rewarding excellence and sponsoring institutions that do important scientific research. But is a twenty-five-to-one difference in government support really justified?

Perhaps the strongest argument for caring about higher education is that it can increase social mobility, regardless of whether the human-capital theory or the signalling theory is correct. A recent study by researchers at the Federal Reserve Bank of San Francisco showed that children who are born into households in the poorest fifth of the income distribution are six times as likely to reach the top fifth if they graduate from college. Providing access to college for more kids from deprived backgrounds helps nurture talents that might otherwise go to waste, and it’s the right thing to do. (Of course, if college attendance were practically universal, having a degree would send a weaker signal to employers.) But increasing the number of graduates seems unlikely to reverse the over-all decline of high-paying jobs, and it won’t resolve the income-inequality problem, either. As the economist Lawrence Summers and two colleagues showed in a recent simulation, even if we magically summoned up college degrees for a tenth of all the working-age American men who don’t have them—by historical standards, a big boost in college-graduation rates—we’d scarcely change the existing concentration of income at the very top of the earnings distribution, where C.E.O.s and hedge-fund managers live.
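
The logic of that simulation can be illustrated with a toy calculation. In the sketch below, the income figures are invented lognormal stand-ins, not the data Summers and his colleagues used; the point is only that handing graduate-level wages to a tenth of non-graduates barely moves the share of income captured by the top one per cent, because the very top of the distribution sits far above even typical graduate earnings.

```python
import random

random.seed(1)

def grad_wage():
    # Invented stand-in for graduate annual earnings (lognormal dollars).
    return random.lognormvariate(11.2, 0.8)

grads = [grad_wage() for _ in range(50_000)]
nongrads = [random.lognormvariate(10.6, 0.7) for _ in range(100_000)]

def top_share(incomes, fraction=0.01):
    """Share of total income going to the top `fraction` of earners."""
    ranked = sorted(incomes, reverse=True)
    cutoff = int(len(ranked) * fraction)
    return sum(ranked[:cutoff]) / sum(ranked)

before = top_share(grads + nongrads)

# "Summon up" degrees for a tenth of non-graduates by giving them draws
# from the graduate wage distribution, then recompute the top share.
upgraded = [grad_wage() if random.random() < 0.10 else w for w in nongrads]
after = top_share(grads + upgraded)

print(f"Top-one-per-cent income share before: {before:.1%}, after: {after:.1%}")
```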

Being more realistic about the role that college degrees play would help families and politicians make better choices. It could also help us appreciate the actual merits of a traditional broad-based education, often called a liberal-arts education, rather than trying to reduce everything to an economic cost-benefit analysis. “To be clear, the idea is not that there will be a big financial payoff to a liberal arts degree,” Cappelli writes. “It is that there is no guarantee of a payoff from very practical, work-based degrees either, yet that is all those degrees promise. For liberal arts, the claim is different and seems more accurate, that it will enrich your life and provide lessons that extend beyond any individual job. There are centuries of experience providing support for that notion.” ♦
