This is the last installment in The Vanishing University, a four-part series exploring the tech-driven future of higher education in America. Here are parts one, two, and three.
“Not enough people are innovating enough in higher education,” gripes Larry Summers, the economist who served for five years as president of Harvard. “General Electric looks nothing like it looked in 1975. Harvard, Yale, Princeton, or Stanford look a lot like they looked in 1975. They’re about the same size to within a factor of two; they’re about the same number of buildings; they operate on about the same calendar; they have many of the same people or some number of the same people in significant positions.”
But is Summers right?
Think of the college library. A musty, magnificent space—rooms topped by cathedral ceilings, golden light angling in with otherworldly might, illuminating rows of students camped out among the shelves or hunched over wooden tables scarred by decades of frustrated pencil marks and the thud of limb-tearing textbooks.
That is what it was. No longer. Over the last several decades, the university library has become less vital, its books getting dusty with disuse, its edge-worn card system replaced by digital catalogs and powerful scanning machines that could put entire tomes online in minutes. Some schools, like the University of Chicago, moved much of their physical collections underground. Others, like the University of Texas at San Antonio, rethought the idea of a library, opening study spaces without physical books at all. Instead of going to libraries for resources and information, most students these days congregate there mainly to toss ideas back and forth, write essays together, work on group projects.
A massive transition now underway in the global economy will soon obliterate the need for such a space entirely. Future workers need—ironically enough—education that is both available at mass scale and intensely specialized. Universities face a seemingly impossible crisis: how to offer teaching that is accessible to several times as many people as in years past, yet individualized and affordable.
Shocking as it might seem, there is one catch-all answer that could be the remedy to many of these concerns: Cut the campus loose. Axe the physical constraints. The library? Classrooms? Professors? Take it all away. The future of the university is up in the air.
Change is upon us
“All universities are very much struggling to answer the question of: What does [digitization] mean, and as technology rapidly changes, how can we leverage it?” says James Soto Antony, the director of the higher-education program at Harvard’s graduate school of education. “Colleges afraid of asking that question do so at their own peril.”
Rapid-fire innovation out of Silicon Valley has allowed students to chat over an array of messaging apps from their dorm-room beds and work at lightning-fast speed across digital platforms. The same radical disruptions are taking place, simultaneously, in other spaces on campuses: Ancient classrooms and musty hallways are no longer a requirement for university education, as they have been for the last several centuries.
First, there came free or cheap digital learning platforms like Coursera and the University of Phoenix, offering distance learning to people who wouldn’t otherwise have gone to a traditional college. Now traditional campuses—hundred-year-old state flagships and community colleges and Ivy League schools alike—are also offering lecture courses for free across the internet; in some cases, they’re even allowing degrees to be fully or partially completed online. Then there are all the new tech-savvy alternative schools and coding bootcamps floating around, daring traditional schools to up the ante.
Much of this drive is corporatized. Think Apple’s pushing of iPads into classrooms, Google’s mighty success with its suite of ubiquitous tools like Gmail and Google Docs, or Microsoft’s launch of adaptive learning software; all these large companies are, of course, vying for chunks of the roughly $250 billion (and still growing) education technology market. “Schools are trying to stay relevant,” says Mike Silagadze, who leads a startup called Top Hat—one of hundreds of smaller edu-tech ventures that are currently trying to streamline the learning and teaching experience in modest but markedly useful ways.
Students who’ve witnessed the tech shift in whatever form at their schools have largely embraced it. Texas A&M University is currently debuting a first-of-its-kind online course, which will replace a mandatory introductory economics lecture: Students can pass the entire semester (with flying colors, at that) without ever having to see the professors, or one another, in person. At many other universities across the US, textbooks are going the way of Netflix. That means they no longer need to exist in physical form. In some cases, even the teachers are going virtual.
But despite all the frenzied excitement around it, change isn’t actually happening very quickly. Innovation in higher education “hasn’t yet been pursued on a scale and with a degree of energy that is commensurate with the real challenge,” Summers notes. The sluggishness owes to the fact that a university is made up of hundreds of stubborn, rooted parts. It is beholden to countless traditions and generations of students served in the past. And so disruption comes slowly, even with impatient Silicon Valley breathing down schools’ necks.
A recent poll of professors showed that a sizable percentage does not support online learning—which makes sense, given that it partly threatens their very jobs.
And per a recent report on digital transformation in higher education, only 50% of respondents (including students, university leaders, and education technology company founders) expect the traditional university “model” to be disrupted by 2025. To be sure, the bulk of people do believe technology is the future of education; they’re just skeptical of the degree to which technology belongs there. “Moving a university is like moving a cemetery—you can’t expect any help from the inhabitants,” says Barb Oakley, a professor who taught traditional classes at several universities for years before signing up to teach with Coursera as well.
There’s the shock of it, too. Two students working on a problem set via Google Docs are one thing. An entire campus plucked up and migrated over to the web? That idea gives parents tremendous cause to worry that they’ll soon be shipping their kids off to little dorm prisons—in which all they’ll do is chat and watch videos on their computers, ordering Seamless in, never stepping out into sunlight.
Michele Borba, a psychologist and the author of a book on empathy training, regularly consults with employers on how to instill empathy in recent college graduates, many of whom nowadays have gotten through four years of a degree without being forced to work closely with others. “The one thing we are losing is human connection,” she warns of the digital age at large. “You don’t learn empathy facing a screen. We’ve lost emotional identification; we’re seeing such a character gap in kids. It’s destructive.”
But at the same time, there’s so much to be gained. Soft skills essential to the future workforce will include more than knowing how to hold a one-on-one conversation. People growing up in the digital age will also have to navigate online relationships, maybe even work for years with bosses they never meet in person. Oakley’s Coursera class, for instance, called “Learning How to Learn,” is an interactive program that teaches soft skills like discipline and repetition—skills that are, most of the time, nearly impossible to get across in a lecture setting.
“I’m standing there in front of what I’m trying to teach and suddenly I pop closer so I’m half-body on the screen!” she says. “Or I could go upside down! It shocks. It has more of an impact. I can use great visuals. I can sketch metaphors.” It’s a whole new world.
Days of future past
The famed Oxford tutorial system—in which fresh-faced pupils meet individually with sagacious mentors for lively debates, the platonic ideal of higher education—was developed when less than 1% of the population went to university. The last major industrial revolution already took education from a world of individual apprenticeships to a universal, mass-scale system: Years of standardized schooling suddenly became necessary to prepare children for the new way to live, and later, as the economy grew into a marketplace for specialized professions, in-depth education at universities turned into a requirement as well.
It is untenable for universities to continue existing as sanctums for a small group of elite students, taught by top scholars.
Technology isn’t only refashioning the ways in which we live and work, but also changing what we need to learn for these new schemes of existence: It’s returning us to a need for specialized learning, for individualized education that is custom-tailored to one’s needs. A world in which most of our learning is more self-directed and practical is, in many ways, a return to the apprenticeship model that existed before industrialization. There’s just a distinctly modern twist: in lieu of stale, dated books on untouched library shelves, the knowledge lives in the cloud instead of on the campus.
Here’s the most likely outcome. Higher education in the future will be a continuum of services.
Online “cloud” teaching is cheaper, so universities can offer online-based (or majority-online) degrees at the lowest rate—a cheap(ish) degree, available to everyone with internet access and completed entirely digitally. Meanwhile, other students will pay a premium to interact with professors and have more of a traditional campus experience. At the highest end, the richest or most elite students may get the full Oxford tutorial experience, brushing elbows with the best of scholars; they’ll just have to pay through the nose for it.
Digital learning is already on its way to becoming the norm, and it’s more than possible that one day only the most moneyed students will want the full physical experience—like a cushy first-class upgrade on a flight that most people take only as a basic mode of transport. The majority of college students will likely fall in the middle category, receiving a hybrid of interaction with professors in mid-size classes and large lectures online.
There will always be some professors on campus—perhaps just fewer of them. Educating undergraduates and graduate students is only one service universities offer, after all: They also produce research and scholarship, and AI can’t yet publish in top journals or conduct groundbreaking lab research. That’ll put a large premium on soft skills, of course, because in-person learning will become a scarcer, more valuable commodity. As University of Illinois economics professor David Albouy points out, “AI might be better—it is better—at lots of things, but I have a comparative advantage when it comes to teaching because I am good at the mushy human stuff.”
These days, college education is almost a necessity for employment. Universities and students alike have to come to terms with the fact that those who can pay the most will also receive the most scarce and valuable skills. In college education, as is the case with many other goods and services in the modern economy, technology has radically broadened the world’s access—at the price of heightened inequality.