Now that educators have absorbed the initial sucker punch of ChatGPT's arrival, the conversation has quickly turned to finding ways to prevent students from cheating with it. These are necessary efforts, but even if teachers come up with innumerable ways of foiling the machines, they will not have averted the real threat that AI poses to higher education. ChatGPT did not emerge out of nowhere. It is just the latest phase in a decades-long effort by Big Tech to reduce knowledge to a monetizable, on-demand product, and this ambition is helping to transform the very idea of what it means to learn.
Such claims will be dismissed as the cry of fear that greets any era of technological change, especially by the small subset of media and academic commentators who have posited that ChatGPT could end up exercising a mostly positive influence on education. AI writing apps, the argument goes, will force us to rethink education, shake off old habits, and, importantly, put that tired old workhorse of student assignments, the term paper, out to pasture once and for all.
Rick Salutin, writing for the Toronto Star and rabble.ca, suggests that the advent of ChatGPT might rekindle the oral tradition in the classroom, and playfully notes that the Socratic method could become a new-old paradigm for teaching. To underscore the possible benefits of a return to orality, he cites a line in Plato’s Phaedrus warning about the threat that writing poses to wisdom. What Salutin doesn’t note, however, is that the warning comes from a myth Socrates recounts about the introduction of the written word into the pre-literate, oral culture of ancient Egypt. As the communications theorist Walter Ong observed, ancient oral cultures were highly dependent on memory for the preservation and transmission of knowledge—something that encouraged feats of memorization that are unthinkable today. In the digital age, all information is accessible on demand. The idea of knowledge as something that someone builds and holds within their mind, as it were (metaphors are inescapable when speaking about the mind), is rapidly being eclipsed by a view of the Internet as a prosthetic brain that more or less obviates the need for memorization.
Innumerable studies have found that when people read online they tend to skim material and retain significantly less of it; they are also less likely to engage with the content in a thoughtful and sustained way, taking less time to reflect, draw connections, or ask questions. Nicholas Carr, whose book The Shallows was a finalist for the Pulitzer Prize, notes that this sort of brisk, superficial reading can be exhilarating in an age when we are all in search of talking points to post to our social media profiles or immediately deploy in a Twitter scrimmage. It is also gold for a Big Tech industry seeking to expose viewers to as many ads as possible by channeling them algorithmically through as much content as possible, as quickly as possible. But it is inimical to the sort of deep reading that allows for the rich, critical engagement with complex ideas that marks education at its best.
ChatGPT will only further entrench this relationship to knowledge and learning. What’s most alarming about ChatGPT is not that students will use it to cheat, but that many will come to see it as a fitting surrogate for their own mental effort. They will casually delegate their learning to a machine because they don’t see any difference between what the human mind does and what the machine does, beyond the fact that the latter does it faster. This is no fault of their own; it is a product of the times and the media environment we live in. In the end, the fear that AI will provide a superior version of the mind is misplaced: The real danger is that we are already well on the way to downgrading the mind into something resembling AI.
Essay assignments at their best are designed to encourage a sustained encounter with the existing literature on a given topic, while promoting the ability to synthesize debates, critically engage with different perspectives, and articulate a balanced and informed position of one’s own. The reality, of course, is that most students cut corners, pull all-nighters, cherry-pick sources, rush their readings, and are generally less committed to the task than their profs would hope. But what we will see more and more is not just students who are insufficiently committed to the hard work of learning, but students who see no point in it whatsoever. After all, why spend hours sweating over something that a bot can produce in a couple of seconds?
This is why I’m skeptical of the suggestion that we take Socrates as our model and look to revive something of an oral tradition in the classroom. Many of us teaching in colleges and universities already strive to make our lectures more of a dialogical experience, and it seems inevitable that we’ll rely more and more on oral assessments in response to ChatGPT. But these efforts will be small counter-currents in a vast sea of change. Any return to orality will take place within a capital-driven media environment that is already giving rise to a diminished view of learning and, for that matter, of the human mind. It will not be a step forward but a sad concession to grim necessity.