College writing is low rent
Any schmuck can "write" a solid essay with AI. Higher education must adapt.
Please like, share, comment, and subscribe. It helps grow the newsletter and podcast without a financial contribution on your part. Anything is very much appreciated. And thank you, as always, for reading and listening.
EDIT: Bear in mind there is a serious difference between the value of writing (writing is a form of thinking; so is debating a sharp, pre-trained LLM), and the value of writing assignments as an evaluation tool. They are not even close to the same!
This is an open letter to my students and the higher education system, to be widely read and circulated. The views in this article are my own. They do not represent Arizona State University, or any other institution of higher education.
I’ve taught at five universities, including the University of Maryland and Georgetown, across several states and Washington D.C. This piece draws on that depth of teaching (and research) experience.
Dear Students:
As students, you are constantly evaluated (sometimes explicitly, sometimes implicitly). You are evaluated to decide if you will be admitted to a university. You are evaluated every time you turn in an assignment, make a comment in class, or interact with group members. Whether one likes this approach to higher education or not, it is the nature of the beast. You are evaluated. Constantly. (And to be fair to colleges and universities, employers and coworkers will constantly evaluate you too).
And they don’t just evaluate you by an absolute or independent standard. They evaluate you relative to your classmates. Did you write a better paper than the students around you? Is your presentation better put together? Are you more on the ball than others? Partly this is because generic human evaluation is relative—is she prettier or smarter than others around her? Is he funnier and taller? Our social lives are replete with evaluating people relative to their peers and competitors. Colleges and universities do this, too, sorting stronger and weaker students into different piles: at the level of the university or college itself, the nature of the school and major, whether you are graded on a curve, and so forth.
Since evaluation is relative, and large language models (LLMs) like ChatGPT, Grok, and other AI chatbots exist, college writing has become a poor indicator of talent and hard work—it has become low rent. This is unfortunate but true. Your essay or research paper no longer signals that you are intelligent and hardworking relative to your classmates. Why? Because anybody with access to an LLM and the assignment can produce a decent college essay, with increasing sophistication as AI improves. Even a person who knows little to nothing about the class can do this. So, college essays are now low rent—they are a cheap signal of the qualities that higher education does and should care about. College essays admittedly were never an amazing measure of collegial potential, but their value in this respect has plummeted to zero over the last couple of years.
So why does that matter? Partly because the university or college you attend needs a reliable indicator—one that is hard to fake—that a student understands the class material. Likewise, employers want to know that the degree you earned reflects your understanding. Copying, pasting, or uploading essay instructions into an LLM, and then turning the output in as your own, is no indicator or reflection of your academic ability and virtue. As Michael Spence, the economist who pioneered work on the signaling function of higher education, explains:
The signal, to be effective, must be one which the employer can observe, but which is more easily or less expensively obtained by high-productivity individuals than by low-productivity individuals. Education serves as such a signal—not because it necessarily increases productivity, but because it is a costly activity that high-productivity individuals find easier to complete.
College writing no longer fulfills that role due to LLMs and AI. If anybody with access to an LLM can turn in a passable paper, then it is no longer a hard-to-fake signal—unlike, say, the ability to lift two hundred pounds over one’s head, which is hard, if not impossible, without the needed muscle. So what is higher education to do? It must adapt or go extinct. It may resort to more in-person testing and exams, though this is harder than many who have never taught a college class appreciate. It may use LLMs as tutors, or as blank slates to be taught basic concepts. And so forth. Whatever path higher education takes, it must teach students—without the wrong kind of deskilling—how to effectively use LLMs without undercutting their value as employees and citizens.
So we now have a handle on the problem. What can you, the student, do about it? Do not forget (or fail to learn) how to write. Just because a device or computer can do something better than you hardly means that you should cease to do it, or never learn to begin with. Such a conclusion would, sometimes, be a mistake. It is true that sometimes technology renders a skill obsolete. Other times, it requires the retention of the skill to better use the technology. For example, modern farmers have shed the knowledge of how to operate a horse-drawn plough, but still retain a deep knowledge of soil quality, rain and weather patterns, seasons, and crop yield. In the latter case, the technology amplifies the knowledge of the farmer. Writing skills are more like the latter case than the former (though perhaps not in every case), and those skills can become even more productive when augmented with LLMs.
Someone with strong writing skills will be better and clearer in their thinking than someone who offloads writing tasks completely to LLMs and AI. More to the point, strong writing skills will enable you to better spot bad writing and thinking—whether by other people or by AI—and to leverage the power of LLMs to improve that writing. Learning how best to use LLMs is crucial; many people wrongly use them like search engines instead of as iterated tools of inquiry. The sooner higher education addresses LLMs and artificial intelligence, to empower rather than deskill students, the better.
Because of this, I am redesigning my own classes to better leverage LLMs and AI within the classroom to empower you, the student, rather than to undercut and deskill your abilities. The idea will be to chart a course, as philosophers, using LLMs and AI to better understand deep and important philosophical questions. We can use LLMs to help us become better philosophers by using such tools to find research, devise examples and counterexamples to arguments, generate different versions of sentences and paragraphs, reformat in different styles, and proofread drafts. At the end of the day, LLMs and AI are merely tools to use in the exploration of philosophy—they possess neither a life full of experiences nor the existential angst underlying questions of God, ethics, death and the afterlife, society and politics, art, and so forth. Maybe they will someday, but for now, philosophy is an exclusively human enterprise aimed at fundamental and meaningful questions—we must proceed treating AI as a tool, rather than as a replacement. To accomplish this and other pedagogical goals in the philosophy classroom, though, I need your help. I cannot do this alone. And neither can you.
Take care of yourselves and each other,
Jimmy Alfonso Licon