ChatGPT attends university
When colleges ban AI tools like ChatGPT, they (inadvertently) select for the sneaky, savvy students
A new technological era has dawned with the introduction of artificial intelligence (AI) tools such as OpenAI’s ChatGPT and xAI’s Grok, both large language models (LLMs) capable of generating large volumes of relatively high-quality text from a simple prompt. Tools like these have the potential to upend how many sectors operate—including higher education. In response, a growing number of colleges and universities have banned or restricted the use of AI tools like ChatGPT and Grok, including institutions in New York and Alabama and schools such as Cambridge University and Imperial College London.¹
The Skills Explanation of Higher Education
There are many reasons—though not all good ones—why colleges and universities are banning or restricting AI. But the most prominent rationale is straightforward: if students use tools like ChatGPT and Grok to write essays or finish assignments, they won’t actually learn the material. This aligns with a widely accepted rationale for attending college: acquiring valuable skills such as critical thinking and effective writing, which benefit both society and future employers.
If students rely too much on artificial intelligence to complete their degrees, they may fail to internalize these key skills. Consider Carnegie Mellon University’s mission statement:
To create a transformative educational experience for students focused on deep disciplinary knowledge; problem solving; leadership, communication, and interpersonal skills; and personal health and well-being. … To impact society in a transformative way—regionally, nationally, and globally—by engaging with partners outside the traditional borders of the university campus.
Carnegie Mellon is not alone. George Mason University promises “a college experience like no other” with the goal of helping students grow as “individuals, scholars, and professionals.” Arizona State University says its redefined liberal arts education will help students become “socially aware, critically thinking global citizens who strive to bring about positive change.”
These mission statements reflect what many people take to be the central purpose of college: the development of valuable knowledge and skills. This is the skills explanation of higher education. There are other potential benefits to college—networking, social life, even meeting a life partner with similar earning potential—but the main argument advanced by educators and employers alike is that college develops human capital. Bryan Caplan captures this perspective well in The Case against Education:²
The key question isn’t whether employers care a lot about grades and diplomas, but why. The … popular answer is that schools teach their students useful job skills. Low grades, no diploma, few skills. This simple, popular answer is not utterly wrong. Literacy and numeracy are crucial in most occupations. Yet the education-as-skills story—better known to social scientists as ‘human capital theory’—dodges puzzling questions.
If the skills explanation is right, it makes sense for universities to worry about AI. If an LLM can write better and faster than a student, then students might not acquire the skills employers value—and the whole enterprise of higher education risks being rendered pointless.
The Signaling Explanation of Higher Education
Despite its popularity, the skills explanation has a competitor: the signaling explanation, grounded in signaling theory. Developed independently in economics³ and evolutionary biology⁴ in the 1970s, signaling theory centers on a key insight: many desirable traits—like intelligence, reliability, or conscientiousness—aren’t immediately observable. Signals help reveal these hidden qualities.
Some traits, like height, are obvious. But others—like punctuality—are not. To be effective, a signal must be costly or hard to fake. Lifting 200 pounds above your head is a reliable signal of strength; simply claiming you can do it is not. A peacock’s colorful plumage is a costly, hard-to-fake signal of biological fitness, much as a diamond engagement ring is a costly signal of commitment.
A functional signaling system includes the following elements:⁵
Some individuals have a quality that is hard to detect directly.
Others value accurate information about that quality.
Without reliable signals, deception becomes possible and profitable.
The cost of signaling ultimately benefits the sender.
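The logic behind these elements can be sketched numerically. The following is a minimal Spence-style model with illustrative numbers of my own choosing (the productivities and signal costs are not from the article): two worker types differ in hidden productivity, the degree costs the low type more, and employers pay a wage based on the signal.

```python
# Minimal Spence-style signaling sketch (numbers are invented for illustration).
# Two worker types differ in productivity; the signal (a degree) costs the
# low-productivity type more. Employers pay a wage based on the signal.

PRODUCTIVITY = {"high": 2.0, "low": 1.0}   # value of each type to the employer
SIGNAL_COST  = {"high": 0.5, "low": 1.5}   # cost of completing the degree

WAGE_WITH_DEGREE    = PRODUCTIVITY["high"]  # employer infers type from signal
WAGE_WITHOUT_DEGREE = PRODUCTIVITY["low"]

def chooses_degree(worker_type: str) -> bool:
    """A worker signals only if the wage premium exceeds the signal's cost."""
    premium = WAGE_WITH_DEGREE - WAGE_WITHOUT_DEGREE
    return premium > SIGNAL_COST[worker_type]

print(chooses_degree("high"))  # True: the premium (1.0) beats the cost (0.5)
print(chooses_degree("low"))   # False: the premium (1.0) is below the cost (1.5)
```

With these numbers, only the high type finds the degree worth its cost, so holding one separates the types. The signal is informative to employers even if the degree itself teaches nothing.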
How does this apply to higher education? When employers hire, they’re making an investment decision under uncertainty. As economist Michael Spence puts it:⁶
In most job markets the employer is not sure of the productive capabilities of an individual at the time he hires him. Nor will this information necessarily become available … immediately after hiring. … The fact that these capabilities are not known beforehand makes the decision one under uncertainty.
Because traits like conscientiousness and intelligence are opaque, employers rely on costly, hard-to-fake signals to reduce hiring risk. Graduating from college serves this purpose well: it signals, to varying degrees, that a person is intelligent enough to gain admission, conscientious enough to complete assignments, and conformist enough to follow institutional rules for years.
This doesn’t mean skills are irrelevant, especially in fields like engineering or computer science. But signaling theorists argue that most of the value of a college degree comes from what it signals, not what it teaches.
What’s the evidence? Too much for a short article, but consider these common behaviors:
Students are glad when class is canceled.
Subjects (e.g., Shakespeare, philosophy) often have little labor market relevance.
The final (graduation) year of a degree yields disproportionate earnings gains—the “sheepskin effect.”
High-quality education (e.g., MIT’s online courses) is freely available.
Failing a class is worse than forgetting everything learned in it.
Students often seek “easy A” classes.
Cheating, if undetected, is considered “just as good.”
If college were primarily about skills, these behaviors would be puzzling. If students were paying for useful content, they shouldn’t celebrate canceled classes any more than they’d be happy getting only half their Starbucks order. But if college is mainly about signaling, then skipping class or forgetting content post-graduation isn’t so surprising—so long as the grade and the credential remain.
How Higher Education May Evolve Given Artificial Intelligence
In institutions where LLMs are banned to prevent cheating, a new dynamic is emerging: students are competing not just with each other but with their schools—trying to use AI tools without getting caught. And there’s growing evidence that this is already happening.
Owen Terry, a Columbia University undergrad, reports:⁷
The common fear among teachers is that AI is actually writing our essays for us, but that isn’t what happens. … The more effective, and increasingly popular, strategy is to have the AI walk you through the writing process step by step. … Depending on the topic, you might even be able to have it write each paragraph … then rewrite them yourself to make them flow better.
Some students will become especially adept at using AI tools to streamline their work without detection. Others will get caught and face penalties. Just as some students are better at schoolwork, some will be better at integrating AI into their academic workflow undetected.
Will signaling still hold in an AI-saturated academic world? Likely, yes. As Caplan notes:⁸
Signaling models have three basic elements. First, there must be different types of people. … Second, an individual’s type must be non-obvious. … Third, types must visibly differ on average. … A signal doesn’t have to be definitive, just better than nothing.
That last point is key: signals need not be perfect. Even in an AI world, graduating from college may still signal intelligence, conformity, and conscientiousness—though the form those traits take may shift.
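Caplan’s point that a signal need only be “better than nothing” can be put in Bayesian terms. The probabilities below are invented for illustration: even a noisy, imperfect signal shifts an employer’s belief about a hidden trait.

```python
# Bayesian reading of "better than nothing" (probabilities invented for illustration).
# A degree is a noisy signal: some without the hidden trait hold one, and some
# with the trait do not. It still moves the employer's belief.

P_TRAIT = 0.5                  # prior: share of applicants with the hidden trait
P_DEGREE_GIVEN_TRAIT = 0.8     # trait-holders usually finish college
P_DEGREE_GIVEN_NO_TRAIT = 0.3  # but some without the trait finish too

# Bayes' rule: P(trait | degree) = P(degree | trait) * P(trait) / P(degree)
p_degree = (P_DEGREE_GIVEN_TRAIT * P_TRAIT
            + P_DEGREE_GIVEN_NO_TRAIT * (1 - P_TRAIT))
posterior = P_DEGREE_GIVEN_TRAIT * P_TRAIT / p_degree

print(round(posterior, 3))  # belief rises from 0.50 to about 0.73
```

The degree here is far from definitive, yet it raises the employer’s estimate of the trait from 50 percent to roughly 73 percent, which is exactly what “better than nothing” requires.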
And especially at schools that ban AI, successfully using such tools undetected may, at least initially, send an additional signal: the graduate is savvy, tech-literate, and skilled at integrating powerful tools into complex workflows. Those traits are likely to matter in a post-LLM economy.
The best students, then, won’t merely avoid getting caught using AI. They’ll learn to use it well—and that, too, will be part of the evolving signal sent to future employers.
Notes
Cindy Gordon, “How Are Educators Reacting to Chat GPT?” Forbes, April 30, 2023, https://www.forbes.com/sites/cindygordon/2023/04/30/how-are-educators-reacting-to-chat-gpt/.
Bryan Caplan, The Case against Education: Why the Education System Is a Waste of Time and Money (Princeton: Princeton University Press, 2019).
Michael Spence, “Job Market Signaling,” Quarterly Journal of Economics 87, no. 3 (1973): 355–374, https://doi.org/10.2307/1882010.
Amotz Zahavi, “Mate Selection—A Selection for a Handicap,” Journal of Theoretical Biology 53, no. 1 (1975): 205–214, https://www.sciencedirect.com/science/article/abs/pii/0022519375901113?via%3Dihub.
Rebecca Bliege Bird and Eric Alden Smith, “Signaling Theory, Strategic Interaction, and Symbolic Capital,” Current Anthropology 46, no. 2 (2005), https://www.journals.uchicago.edu/doi/10.1086/427115.
Spence, “Job Market Signaling.”
Owen Kichizo Terry, “I’m a Student. You Have No Idea How Much We’re Using ChatGPT,” Chronicle of Higher Education, May 12, 2023, https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt.
Caplan, The Case against Education.