The syndrome has become all too common. A provocative op-ed piece appears in a major newspaper (for preference, The New York Times). Its logic is fragile and its evidence is thin, but the writing is crisp and the examples are pungent, and the assault on sacred cows arouses a storm of discussion (much of it sharply critical, but no matter). It goes viral. And almost immediately, publishers come calling. “This should be a book,” they coo, and the author, entranced by a bit of sudden fame (not to mention, perhaps, a decent advance), eagerly agrees. He or she sets to work, and soon enough the original 800 words expand to 50,000. But far from reinforcing the original logic and evidence, the new accretions of text only strain them further, while smothering the original provocations under thick layers of padded anecdote, pop sociology, and oracular pronouncement. Call the syndrome Friedmanitis, after a prominent early victim, the New York Times columnist Tom Friedman.
Mark C. Taylor’s unbelievably misguided book provides an almost textbook example. In April 2009, he published an incendiary New York Times op-ed entitled “End the University as We Know It,” which denounced graduate education as the “Detroit of higher learning,” demanded the abolition of tenure, and called for the replacement of traditional academic departments by flexible, short-lived “problem-focused programs.” Widely criticized (by me, too, in this magazine), the piece stayed at the top of the Times’s “most e-mailed” list for a cyber-eternity of four days. Enter Alfred A. Knopf.
Just sixteen months later, the book is here, and the signs of the syndrome are all too evident. Taylor, the chair of the Department of Religion at Columbia, has enveloped his original argument in an overblown, cliché-ridden theoretical framework about the ongoing shift from a “world of walls and grids” to a “world of networks.” The globe, Taylor declares, with a certain lack of originality, has become “more interconnected.” “Global financial capitalism” is replacing “industrial and consumer capitalism.” And “as cross-cultural communication grows, it transforms old assumptions and ideas.” Recounting a lengthy anecdote about a course he taught partly via video conferencing, Taylor remarks, “That was the Aha! moment in which I knew the world had changed.” (The world is flat!) Abandoning his earlier facile comparison of higher education to the auto industry, Taylor now likens it with equal facility to the financial sector, and speaks in doom-laden tones of the “education bubble.”
In their elaborate new packaging, the arguments remain incendiary, but they are no more convincing than when Taylor first presented them in the Times. Incendiary does not mean true. And, ironically, there is no better demonstration that this is so than the book itself. Taylor is the great avatar of interdisciplinarity, of drawing eclectically on half a dozen different fields to illuminate a single problem such as global water supplies or diabetes. In Crisis on Campus he certainly practices what he preaches, invoking history, sociology, computer science, philosophy, economics, and much else besides to discuss the state of higher education.
Taylor has tried to draw on so much, however, that he has ended up mastering very little of it. Consider a subject he ought to know well: higher education in late twentieth-century America. “As the fields of expertise became more restricted,” Taylor writes, “the research of scholars working in them became more homogeneous, and communication among scholars with different concerns became less common.” But is this really true? Much research has certainly grown narrower and more specialized, but that is different from saying that subfields and disciplines have become closed off from each other. In my own field of history, the late twentieth century saw one wave after another of outreach to other disciplines: to sociology and economics in the heyday of the “new social history,” then to anthropology and literary criticism when the “new cultural history” arose. In the same period, literary studies turned to continental philosophy (post-structuralism) and to history (the new historicism). Art history and classics saw similar movements. In fact, a common—and not unfounded—complaint about the humanities today is that too much of the field has become an indistinguishable mass of “cultural studies.” Where are Taylor’s walls?
Things do not get much better when Taylor turns to earlier history. He traces the origins of the modern university entirely to Kant at the end of the eighteenth century, mistakenly seeing earlier universities as concerned with little but theology. (Where does he think Adam Smith worked?) As for political science, in a year that has seen the near collapse of the euro, and the continuing rise of strident patriotism and xenophobia throughout the West, he confidently predicts the end of the “domination” of the nation-state.
Most egregiously—and surprisingly, for someone with a considerable background in philosophy—Taylor fails to distinguish adequately between forms of communication and forms of knowledge. Do hypertext, multimedia, and social networking affect the way we know things? Of course. But the ways in which they do depend on the objects of knowledge themselves. On its own, the technology signifies nothing. And while communication can take place in any direction along myriad pathways, the acquisition of real knowledge cannot. It demands sequence, it demands order, it demands logic. The new technologies have supplemented conventional forms of learning and argumentation in fascinating ways, but they cannot replace them.
Taylor, however, is so enraptured by his networks that he loses sight of these important realities. Near the end of his book, he positively swoons: “No longer constrained by words in black-and-white, ordered in straight lines and right angles, you become free to reconfigure words with any color, image or sound in designed texts that can be layered and even set in motion.” Yes, we can do this, but for what purpose? Just for the sake of doing it? It is worth noting that despite the supposed superiority of new media to the boring old written word, Taylor himself has chosen the most traditional of forms—“words in black-and-white, ordered in straight lines and right angles”—to put across his own ideas.
And yet Taylor wants to rush ahead and remodel universities to fit his brave new networked world. In particular, he wants to dethrone conventional models of academic expertise. In a swipe at the principle of peer review, he indicates his scorn with snarky scare quotes: “the work done by faculty members can be judged only by other ‘experts’ in the same field.” But who else should we trust with the advancement and the transmission of knowledge? Taylor gives a clue in an anecdote he likes so much he cites it, at length, twice. It involves a student of his acquaintance who cannot find a doctoral program to fit her needs. Her ideal program, she tells him, would draw eclectically from “religion, history, anthropology, ethnography, philosophy, and psychology” to illuminate “the impact religious specifics (texts, philosophies, rituals, etc.) have on the mind of the religious individual.” Except that “I still cannot find any advisor who studies something quite like this.” Taylor thinks this story is a condemnation of a rigid, outmoded university system. But how can anyone know if a program even makes sense, if there is no professor competent to oversee it? Should we simply take the student’s word for it, and applaud her for the creative initiative she has shown?
Taylor apparently thinks so, and suggests that in his ideal university, students will even create their own courses out of smaller modules. The trouble here is that Taylor seems to think of academic disciplines as toolkits that can be quickly raided for implements to tackle a given problem. They are not. They are complex, difficult bodies of knowledge whose theoretical underpinnings have to be mastered to some degree before one can even know which aspects of them can be usefully applied to which tasks. This is why even the most brilliant and hard-working graduate student would very likely flounder in the program proposed by Taylor’s ambitious acquaintance. Serious interdisciplinarity hardly requires a separate Ph.D. in each of the subjects concerned, but it does require respect for the disciplines.
It might seem from Taylor’s hostility to academic authority that his politics run to standard left-wing anti-elitism, but this is far from the case. To begin with, he has no truck with the scholarly progeny of 1960s-era identity politics, stridently attacking gender studies and ethnic studies programs as “politically motivated” and “divisive.” Far from decrying corporate influence on the academy, he would like to see more of it (he particularly likes the idea of universities partnering with for-profit companies to sell on-line courses), and he sees research for the sake of pure knowledge (which he weirdly traces back, again, to Kant) as an idea whose time has passed. In fact, in the history of education, Taylor resembles no one so much as the British utilitarians of the nineteenth century, with their emphasis on profitable knowledge and their frequent contempt for the ivory tower. Not surprisingly, Taylor often conflates economic profitability and intellectual profitability, as in his blanket pronouncement that “the scholarly monograph has no future.” As an economic venture, perhaps not. But as an intellectual one? How does he know?
Taylor’s hostility to universities in their current form of course leaves him with little concern for the future of the professoriate. He wants to slash graduate programs, assuming blithely that their current size is due solely to the need for cheap labor to teach discussion sections, although many universities want to maintain a critical mass of students for very good intellectual reasons as well. He wants to eliminate whole departments—“religion, history, anthropology, ethnography, philosophy, and psychology,” perhaps?—with students taking online courses at other universities where necessary, without regard for what might be lost to the world of learning by firing senior scholars. And he is entirely hostile to academic tenure. He dismisses—in a few sentences—the idea that it might protect academic freedom, noting that he has never personally seen it under threat, and that in forty years of teaching he has never met a professor “who was more willing to express his or her views after tenure than before.”
On the second of these points, I can only conclude that Taylor and I know a very different set of academics. As to the first of them, well, Taylor’s personal experience came at Williams College and Columbia University. Perhaps he should think for a moment of what it might be like to teach at a large public university in a state where Tea Party members increasingly dominate the legislature, denouncing “radical professors” and calling for the further slashing of university budgets. Would he feel entirely free, at such an institution, to start a research project on, say, homoeroticism in American poetry? The evolution of dinosaurs? The history of racial discrimination in American evangelical churches? Corruption in the state senate? Lifetime tenure, for all its problems, still provides a very real safeguard for the advancement of unpopular ideas.
Taylor is obviously right to say that university systems today, in this country and abroad, face an unprecedented crisis. Costs continue to spiral upwards even as revenue shrinks. Successive cohorts of graduate students move from the Ph.D. to the unemployment lines, or to the wilderness of adjuncting. While magnificent advances in knowledge continue to take place, many tenured professors produce little of real scholarly value. But it is one thing to say that universities have problems. It is another to argue, as Taylor is effectively arguing, that the universities are the problem—that the system that allegedly began with Kant (in fact it began much earlier) has reached the end of its intellectual and social usefulness, and needs to be swept away in favor of something radically new and untested, in accordance with technologies that are still evolving at breakneck speed. That is a reckless, wrong-headed idea, and it has no place in serious discussions of higher education’s future, even if it creates a buzz on an op-ed page.
David A. Bell, a TNR contributing editor, is Professor of History at Princeton, and was previously Dean of Faculty in the School of Arts and Sciences at Johns Hopkins.