Academically Adrift was released in mid-January to widespread and well-deserved notice in the popular and trade press and in the blogosphere. ABC Nightly News did a segment on it—probably a first for an empirically based and fairly technical analysis of data on the intellectual performance of college students. The Chronicle of Higher Education and Inside Higher Ed ran numerous columns and attracted abundant comments. Perhaps the first thing to note about the book that underlies Arum and Roksa’s article is that it struck a resonant chord. The initial comments generally divided between those who saw confirmation of something they already suspected and those suspicious that Arum and Roksa had employed, as one writer put it, “an oversimplistic [sic] metric.”

Those who have doubts about the value of the Collegiate Learning Assessment as a general tool or about the “performance task” component of the CLA on which the authors relied will have time to pore over the details, but I judge the analysis to more than meet ordinary standards of social science rigor. And skeptics face the hurdle that some of Arum and Roksa’s key findings are consistent with other studies, including the much larger data set of the National Survey of Student Engagement. With that in mind, I will proceed on the assumption that their factual findings are accurate and that the pressing question is what to make of those findings.

In another indication of the timeliness of Academically Adrift, less than a week after its release the Lumina Foundation issued a report titled The Degree Qualifications Profile, specifically aimed at restoring to American undergraduate education a better and more explicit set of “expected student learning outcomes.” The Lumina Foundation has for several years been at the forefront of the movement that Arum and Roksa refer to in their first sentence as “college for all.” More exactly, Lumina advocates that 60% of Americans should have two- or four-year college degrees by 2025. (The current figure is 39%.) Lumina’s goal would appear a practical impossibility given the facilities and resources of American higher education as we know it, but it might be achievable if Americans were to consent to a system of mass online college degree programs. Lumina’s new report aims at staving off a major threat to its goal of increasing the percentage of Americans who hold college degrees: the possibility that Americans will, to the contrary, begin in large numbers to forego a traditional college education.

Not long after the Lumina Foundation and President Obama began (with others) to advocate for a massive expansion of American higher education, the first signs began to appear of what has been called the “higher education bubble.” That is to say, commentators and analysts began to note a new willingness of Americans to question the value of seeking a traditional college degree—a questioning that, given the house-of-cards structure of American higher education’s financial arrangements, could lead to a very rapid change of fortune for many colleges and universities. The idea of a higher education bubble is built on an analogy with the housing bubble or the high-tech bubble: people bid up the price of a good on the assumption that, despite very high prices, the purchase can be justified by the prospect of even higher returns. The logic runs: houses are very expensive, but because the price of housing is rising so rapidly, I can afford to purchase beyond my means, knowing that I will recoup my investment upon subsequent sale. But once buyers begin to doubt that the price of housing will continue to rise, they withdraw from the market, and prices then indeed begin to tumble.

College degrees are not houses, or stocks, or tulip bulbs, so we can wonder how strong an analogy this really is. You cannot lose a college degree in a foreclosure, or sell the one you have earned at a distressed price. But the logic of the marketplace that treats a college degree as a kind of good that can be sold and purchased definitely has some bearing on the situation. For the last fifty years or so, American higher education has marketed itself to students very much as a kind of financial investment. Getting a college degree was explicitly promoted as the key to getting ahead, either directly by treating the bachelor’s degree as a general workplace credential or indirectly by treating the degree as a necessary prerequisite to professional study and certification. The former is what is now largely in doubt, although post-graduate study in the professions, especially law, has also come under increasing scrutiny as a career path grown less secure.

The idea that college is an excellent investment paying a high return in terms of lifetime income has been buttressed by numerous studies over the years purporting to show that college graduates earn a large premium over a lifetime in comparison to individuals who earn only a high school diploma or who attend some college without completing a four-year degree. For a while a figure originally posited by the College Board of a $1 million lifetime earnings premium was commonly cited, but it turned out to be baseless, and other estimates came in with still substantial but much lower figures. Even these figures have begun to lose some of their luster. That could be because so many college graduates these days are plainly unemployed or are visibly working in occupations far below the level for which a college degree would be needed. Several economists (Gonzalo Castex, Kartik Athreya, Janice Eberly) have recently argued that the premium earned by college graduates can be seen as partly a reward (in Richard Vedder’s paraphrase) “for risk-averse persons to take the risks of going to college—the risk they might not make it through.” The drop-out rate (now euphemistically called the “non-completion rate”) is in fact substantial. Only about 63% of students who begin a four-year college program earn a degree, and the time to completion now averages six years. Moreover, those who drop out still typically have student loans to pay off.

As the perception grows that college is not, as once thought, a sure bet, but on the contrary a fairly risky proposition, so does the understanding that college debt can be a major impediment to getting started in life. The New York Times has run front-page stories about young men and women deemed unsuitable by potential marriage partners because of their high levels of student debt. By federal law, student debt cannot be discharged through bankruptcy, and the interest on loans can consume a significant portion of a young person’s income.

These are strong disincentives to attending college, but the situation is further complicated by the growing recognition that college has opportunity costs (the student foregoes income, practical training, and workplace seniority for four or more years); that the higher education industry has played fast and loose with its sales pitch (omitting in its calculation of college benefits what happens to those who fail to complete the degree); and that new alternatives are emerging, such as employer acceptance of online credentials that can be earned while working full-time.

The “higher education bubble,” if there is one, however, has one more crucial component: the perception that the college degree itself doesn’t necessarily represent much of anything in the way of intellectual attainment or enhanced practical skill. This doubt is widely expressed among Americans but rather hard to pin down. The extraordinary reception for Academically Adrift is due in large measure to Arum and Roksa’s answering a question that has been in the air. The Lumina Foundation’s report, The Degree Qualifications Profile, similarly is an attempt to get at the heart of this anxiety, but it proceeds on the assumption that to fix the problem colleges and universities simply need a crisper enunciation of their proper educational goals.

Perhaps the Lumina Foundation’s approach would meet with some sympathy from Arum and Roksa, whose diagnosis of the problem focuses on the readiness with which colleges have substituted “social engagement” for “academic rigor.” A new vision of degree requirements does not number among the recommendations they offer (improve elementary and secondary school preparation; use educational leadership to create a culture of learning on campus; etc.), but it is in the same ameliorating spirit. In Academically Adrift, they explicitly say that their findings do not warrant a view that higher education is in “crisis,” and they see no existential threat to the institution despite their finding that over a third of the students who make it to graduation have learned next to nothing for their efforts. As they put it, “Limited learning on college campuses is not a crisis because the institutional actors implicated in the system are receiving the organizational outcomes that they seek, and therefore neither the institutions themselves nor the system as a whole is in any way challenged or threatened.”

In other words, the phenomena they describe of students on average spending very little time studying and consequently learning very little are the new normal. They might be right about this, of course, but I wonder whether their picture would seem quite so unproblematic if considered in terms of (1) the “higher education bubble” hypothesis, and (2) the longer historical developments that have brought us to this point. The bubble, being a matter of public confidence in a system built on borrowing and long-term debt, is something that lies well outside the parameters of Arum and Roksa’s investigation, and I wouldn’t expect them to have weighed it as a relevant factor. But it is easy to see how it could be relevant: as higher education has grown enormously expensive for students, its price far outstripping the rate of inflation, colleges and universities have had to seek all sorts of expedients to attract people willing to pay—or willing to borrow—the extravagant sums needed to cover tuition, room and board, and other fees. That has meant appealing to students who have little interest in the traditional forms of intellectual pursuit that were once central to the enterprise of higher education. The substitution of what Arum and Roksa call “social engagement” for academic rigor makes perfectly good sense as a form of marketing to students who are only marginally interested in academic rigor. But as the leavening of such students in college courses increases, other considerations have to give way too. Colleges need not just to attract but to retain these marginal students, and that means easing course requirements and inflating grades. More ambitious students can be segregated into honors courses and a handful of rigorous majors where they pose no threat as rate-busters.

So yes, students on the whole may receive the “organizational outcomes” they desire, but there is a price to be paid for this accommodation and contrary to Arum and Roksa, it might well be an existential one for some colleges and universities. At some point, the game is up. People could decide that $20,000, $40,000, or $50,000 a year in tuition is too much to pay for an education in an intellectual Potemkin village.

The bubble hypothesis, however, treats a college education mainly as a commodity that people will, in the long term, evaluate as they would any other economic good. That’s an important perspective, but it can also be a cramped way of looking at things, and we ought also to consider the larger claims of higher education. A college degree isn’t simply a marketplace credential. It is meant as an outward recognition of a student’s attainment of a certain kind of wholeness and maturity, mostly intellectual in character but extending beyond that into the realm of important cultural values. How do Arum and Roksa’s findings sit with this perspective?

Mostly they show us a system of higher education that has sidled away from rigorous course requirements and an emphasis on learning per se. From a longer perspective, however, that isn’t just a matter of dropping the ball to play a different game of “social engagement.” A more serious displacement has occurred that seems a little over the horizon in their study. We are dealing with an institution that has, to an extraordinary degree, simply forgotten why it exists in the first place. Peripheral matters are pursued with vigor and often emotional cathexis (“diversity,” “sustainability”) while central and once abiding concerns become mere husks of themselves—old books gathering dust in unused libraries.

When a college or university decides it has something more on its mind than undergraduate learning, the possibilities are endless. Some of these possibilities are longstanding and, if not hallowed by tradition, at least welded to the enterprise. Others are recent accretions. The accretions are easy to spot. As I sat down to write this, I got a call from a grad student at Columbia University eager to learn my view of the new field of “fat studies,” which has emerged in the last decade as a specialization within women’s studies and lesbian studies. A few days ago I wrote a brief piece about the 540 colleges and universities, 145 of them in the United States, that have become official signatories to “Academic Impact,” an initiative sponsored by U.N. Secretary-General Ban Ki-moon. Signatory institutions commit to advancing ten principles, which include the right of everyone in the world to an education, and using education to “encourage” global citizenship, to “advance” conflict resolution, to “address” poverty, and to “promote” sustainability.

Fat Studies is simply an application to corpulent women of the well-trod path of creating an academic interest group out of an identity forged in resentment. “Academic Impact” is an effort to make higher education an instrument for fostering a worldview congenial to those who see human welfare and the rule of law through the lens of post-national institutions. Higher education won’t topple because of these two developments, but they illustrate the ease with which the university these days treats itself as a universal utility, good for advancing almost any item on someone’s social and political agenda. On any given day of the week, it is easy to find at least one new way in which higher education sets out to divert itself from the task of teaching undergraduates the knowledge and skills they ought to learn.

The rationalizations for these diversions are significant. There is an important social scientific question lurking here. How does an institution that owes its existence and continued support to its ability to carry out one central purpose end up scanting that purpose in favor of a cloud of largely irrelevant concerns?

I suppose there are three kinds of answers available: mission creep, capture, and public choice theory. The first gives us a picture of the university as having little resistance to opportunities to “do more.” Mission creep has been in evidence at least since World War II enlisted university-based science in work such as the Manhattan Project and the development of radar, and it was capped by the 1944 Servicemen’s Readjustment Act, which gave colleges and universities their first large-scale experience with federally funded students. By the early 1960s, the university was well on its way to re-sizing itself for the baby boomers. Mass higher education from the start, however, was more than just scaling up. It meant the rise of what University of California President Clark Kerr called “the multiversity,” an institution that would serve a virtually unlimited set of ever-changing public goods.

From this perspective, today’s amorphous diversion-piled-on-diversion approach to undergraduate instruction is nothing new. It is the fulfillment of a logic that has been evident for over half a century, or arguably even longer. After all, American higher education has had a strong utilitarian element since the passage in 1862 of the Morrill Act, which created the basis for the land-grant public universities. We are not good at saying no to invitations to add to the curriculum if doing so might serve some practical good.

Which brings us to “capture.” The university as an institution has historically been defined by its allegiance to high principles such as the pursuit of truth, the acquisition of knowledge as a basis for a free society, and the transmission of the ideals of civilization. These principles have philosophical heft but seldom match the more down-to-earth motives of interest groups intent on advancing their own political fortunes. Another way of accounting for what has diverted higher education from its central task is that it has been captured in the last half-century by interest groups with a strong commitment to using the university for various kinds of social transformation that have only a thin connection to the foundational principles of the institution.

Perhaps the most explicit expression of this counter-agenda for higher education was the 1962 manifesto of the New Left group Students for a Democratic Society, The Port Huron Statement, which denounced the majority of students of its time as “apathetic” and higher education as “a training center for those who want to spend their lives in political pretense.” It dismissed “searching for truth” as a pretext for the university’s actual function, which was merely to “help the student get by, modestly but comfortably, in the big society beyond.” Against this, the SDS manifesto called for a form of higher education that would “serve as a significant source of social criticism and an initiator of new modes and molders of attitudes.”

The original SDS dissolved into factional disputes at the end of the 1960s, but the idea that the university ought to “serve as a significant source of social criticism and an initiator of new modes and molders of attitudes” has retained a substantial following, and we see it reverberating in Fat Studies, the U.N. “Academic Impact” initiative, and countless other places. But the idea of using the instrument of higher education as a means to effect social change is far from universally accepted. As the radical left gained power on American campuses, it came to see the continuing value of positioning higher education as a “training center” for those whose aspirations were to “get by” or even to prosper “in the big society beyond.” What developed was a hybrid institution that presents itself to the general public as concerned with national economic priorities and the practical preparation of students for the marketplace, but presents itself to faculty members and students as pursuing goals such as social justice, diversity, and sustainability.

“Capture” by a political interest group or ideologically linked set of interest groups goes a long way towards explaining the content of the programs that diverge from the foundational principles of undergraduate learning, but as an explanation it has a loose end. How does an institution of the size and complexity of American higher education get “captured” by an ideology (or congeries of ideologies) that represents a view strongly rejected by a majority of the population at large, at odds with the institution’s historical commitments, and divergent from the interests of many of its own faculty members? Higher education in America is not an organizationally unified enterprise, but a collection of some four thousand colleges and universities operating under a wide variety of legal arrangements. On its face, “capture” would seem to be an unlikely occurrence.

This is the point where we need to consider some version of public choice theory. The capture of higher education is very much a story of people successfully “professionalizing” their personal political views and making out of this straw the bricks of academic careers. The enterprise of turning a political conviction into an academic career is closely matched by the use of the word “studies” to denote an area of inquiry that is extra-disciplinary by the old standards. Cultural studies, area studies, African-American studies, women’s studies, queer studies, and a host of more arcane fields bear this stamp. The humanities and social sciences have borne the brunt of this politicization, but the natural sciences are not immune: witness the rise of sustainability as a robust campus ideology. Administrators and other non-faculty employees also figure prominently in the rise of academic policies that reflect the values and interests of the stewards more strongly than those of the public they supposedly serve. We now have a form of higher education where, according to the Department of Education’s IPEDS data, non-faculty administrative employees outnumber faculty members, 51% to 49%. Whose interests are served by this proliferation of administrative positions? The administrators themselves are the foremost beneficiaries. And very few of them are hired who have not been vetted for their commitment to the idea that the university is ultimately about achieving social transformation in American society.

The university, of course, has never been purely and simply focused on undergraduate learning. We expect scholars to pursue research and scholarship that in most cases is more specialized and more advanced than the material taught to undergraduate students. We expect universities to prepare graduate students in the learned professions. We expect—or at least we did until recently—that colleges will pay some attention to shaping the character as well as the minds of students. We expect colleges and universities will also provide some space for sports and for an organized extra-curriculum of amateur arts, music, theater, games, and conviviality. And lastly, we expect that higher education will allow room for college students in their late adolescence to act up. Drinking, general rowdiness, and not-very-wise exploits are among the social facts that any objective description of campus life for the last thousand years would have to comprehend.

Higher education has survived the debilities that come from too many Casaubons, too many spoiled rich kids, too much football, too many Beat poets, and too many Animal Houses. Surely there has always been some percentage of students who do enough work to get by but who learn little in college. But something has indeed changed. The “35% of students at four-year colleges [who] report that they spend five or fewer hours per week studying alone” is not a historical norm. It is a change that cannot really be comprehended in terms of what colleges are supposed to do.

If it is not a crisis, we should make it one, for we are wasting the lives of individuals in a pointless and pointlessly expensive pursuit. We are squandering national resources. And we are undermining the real purposes of higher education. The third of the students who graduate from college having learned next to nothing are not harmless ballast. They are rather the clientele that the university most caters to, because they are “at risk” of leaving. But this takes us back to the bubble. Ultimately the bubble and the forfeiting of mission are joined. Higher education may founder on the loss of public confidence in the market value of a college degree, but that foundering would be the long-term consequence of colleges and universities simply losing their bearings.