I Used to Teach Students. Now I Catch ChatGPT Cheats - "I once believed university was a shared intellectual pursuit. That faith has been obliterated."

Source: https://thewalrus.ca/i-used-to-teach-students-now-i-catch-chatgpt-cheats/
Archive: https://archive.is/n4ij7

I Used to Teach Students. Now I Catch ChatGPT Cheats

I once believed university was a shared intellectual pursuit. That faith has been obliterated

by Troy Jollimore Updated 8:03, Mar. 5, 2025 | Published 6:30, Mar. 5, 2025

The Magic Bag. It’s a familiar storytelling device. Ask the bag for something, anything, whatever you might want—and poof, out it pops. We find variations in myths and fables, in jokes, in numerous Twilight Zone episodes. The genie in a bottle. The monkey’s paw. The holodeck on the USS Enterprise. The moral of the story, as often as not, turns out to be: Be careful what you wish for. By giving us what we thought we wanted, the Magic Bag instructs us on the danger of having one’s desires fulfilled, reminding us that it is often better to want than to get.

Recently, a new kind of Magic Bag has found its way into the real world. For some years, I have taught philosophy, mostly ethics, at a university in California. I teach a mix of in-person and online classes, and my main method of evaluation has been the student essay. But as nearly everyone now knows, AI tools like ChatGPT and Google Gemini make it possible to obtain college essays with little more effort than it takes to snap your fingers.

Ask one of these chatbots for a paper on Plato’s Republic, or on the ethics of buying and selling kidneys—or just input an exam prompt—and, within seconds, out pops a paper that will look to a lot of people like something a human wrote. No fuss, no muss. If your instructor doesn’t know what to look for, or if the AI you are using is good enough, you can convince them you have mastered the topic without needing to learn anything about it at all.

Not-very-sophisticated AI-generated papers are easy to spot—if, again, you know what to look for. In terms of both syntax and tone, they all sound roughly the same. Very neutral, very bland. Regardless of the nature of the question, they address the issues at hand in the same highly methodical manner, first developing a systematic framework, then balancing competing considerations against each other to arrive at an overall judgment. Sometimes they will provide quotations, giving page numbers that, as often as not, do not seem to correspond to anything in the actual world. These, again, are the ones that are easiest to pick out. I have little doubt that there are others that are more sophisticated, and that some get past me.

To judge by the number of papers I read last semester that were clearly AI generated, a lot of students are enthusiastic about this latest innovation. It turns out, too, this enthusiasm is hardly dampened by, say, a clear statement in one’s syllabus prohibiting the use of AI. Or by frequent reminders of this policy, accompanied by heartfelt pleas that students author the work they submit.

North American college instructors are accustomed to adversity. Our society has always manifested strongly anti-intellectual tendencies. (In the past, this was perhaps more true in the US than Canada; I’m not sure how much, if at all, this remains the case.) One mainstream view is that practical intelligence, or street smarts, constitutes a more valuable form of intelligence. Those who emphasize theory, study, and scholarship are often viewed as marooned in ivory towers with expertise that is mostly, if not entirely, spurious.

Combating such skepticism is a large part of what education is all about. The best way to get people to see why education matters is to educate them; you can’t really grasp that value from the outside. There are always students who are willing but who feel an automatic resistance to any effort to help them learn. They need, to some degree, to be dragged along. But in the process of completing the assigned coursework, some of them start to feel it. They begin to grasp that thinking well, and in an informed manner, really is different from thinking poorly and from a position of ignorance.

That moment, when you start to understand the power of clear thinking, is crucial. The trouble with generative AI is that it short-circuits that process entirely. One begins to suspect that a great many students wanted this all along: to make it through college unaltered, unscathed. To be precisely the same person at graduation, and after, as they were on the first day they arrived on campus. As if the whole experience had never really happened at all.

I once believed my students and I were in this together, engaged in a shared intellectual pursuit. That faith has been obliterated over the past few semesters. It’s not just the sheer volume of assignments that appear to be entirely generated by AI—papers that show no sign the student has listened to a lecture, done any of the assigned reading, or even briefly entertained a single concept from the course.

It’s other things too. It’s the students who say: I did write the paper, but I just used AI for a little editing and polishing. Or: I just used it to help with the research. (More and more, I have been forbidding outside research in these papers for this very reason. But this, of course, has its own costs. And I hate discouraging students who genuinely want to explore their topics further.) It’s the students who, after making such protestations, are unable to answer the most basic questions about the topic or about the paper they allegedly wrote. The students who beg you to reconsider the zero you gave them in order not to lose their scholarship. (I want to say to them: Shouldn’t that scholarship be going to ChatGPT?)

It’s also, and especially, the students who look at you mystified. The use of AI already seems so natural to so many of them, so much an inevitability and an accepted feature of the educational landscape, that any prohibition strikes them as nonsensical. Don’t we instructors understand that today’s students will be able, will indeed be expected, to use AI when they enter the workforce? Writing is no longer something people will have to do in order to get a job.

Or so, at any rate, a number of them have told me. Which is why, they argue, forcing them to write in college makes no sense. That mystified look does not vanish—indeed, it sometimes intensifies—when I respond by saying: Look, even if that were true, you have to understand that I don’t equate education with job training.

What do you mean? they might then ask.

And I say: I’m not really concerned with your future job. I want to prepare you for life.

It turns out that if there is anything more implausible than the idea that they might need to write as part of their jobs, it is the idea that they might have to write, or want to write, in some part of their lives other than their jobs.
Or, more generally, the idea that education might be valuable not because it gets you a bigger paycheque but because, in a fundamental way, it gives you access to a more rewarding life.

My students have been shaped by a culture that has long doubted the value of being able to think and write for oneself—and that is increasingly convinced of the power of a machine to do both for us. As a result, when it comes to writing their own papers, they simply disregard it. They look at instructors who levy such prohibitions as irritating anachronisms, relics of a bygone, pre-ChatGPT age.

One consequence of this boom in AI use is that, in my online classes, I can no longer be sure of what my students are learning. I have no way of determining which concepts they are grasping and which are proving elusive. And as a result, I no longer have any idea how to modify my teaching to help them. What do I need to spend more time on? What can I pass over more quickly? Where do they need some real-world examples or applications? Imagine being a doctor and treating a patient you are not permitted to examine. It’s a little like that.

I could, of course, turn a blind eye to it and give every paper the benefit of the doubt, no matter how unlikely it might be that it was written by a human hand. I could say, as some of my colleagues do, that I trust all of my students, and that it is better to trust and sometimes be wrong than to be suspicious and right. I could go along with all of the people who have said to me: It’s not your job to force people to be virtuous. Let cheaters cheat. In the long run, they only hurt themselves, denying themselves the real education they could be gaining (and for which, after all, they or their parents are paying a fair bit of money).

This is a lovely set of thoughts, and it would make my life a good deal easier if I could bring myself to embrace it. But I can’t. (I’m an ethics professor, for crying out loud.) For one thing, while it is true that the cheaters are cheating themselves out of an education, it does not follow that they are harming only themselves. There is no getting around the fact that a central part of my job is deciding who passes and who fails. Some of my students still do write their own papers. They still do the readings. They want to learn. (And then there are those who don’t but have enough integrity that they follow the rules nonetheless.) My papers are hard to write, and the readings can be hard to understand; it’s philosophy, after all. These students are working. The effort they put in is real, and I appreciate and admire it.

It would be an egregious wrong to reward both sets of students equally, to force the honest students to compete for jobs, for scholarships, for admission to graduate schools on an equal footing with those who have “earned” their grades by submitting work that is not theirs.

Of course, should such cheating continue to be widespread, it seems inevitable that all college degrees will be worth a good deal less and will perhaps cease to offer any advantage at all. But this outcome is surely of little comfort to those willing to work for their degrees, those who want those degrees to continue to be worth something and to mean something.

The problem is that none of the strategies available to me are without serious problems. I could, for instance, require all of these students to do their written work during class. That way, I would know they were the ones doing it. (Assuming, of course, I made them hand-write it.) In my experience, there is a fair bit of pressure on faculty not to lean on in-class written assignments. These are said to advantage students who think quickly, who perform well under pressure, who are more comfortable with pen and paper. Such concerns strike me as oddly one sided; after all, work done at home also privileges some students over others. I admit there is something to the point: there is a certain unfairness in grading solely on such a basis. But that unfairness pales compared to the greater injustice of some students submitting papers they laboured over while others turn in material conjured from a Magic Bag.

Writing in a classroom can never approximate the sustained concentration required to produce a carefully thought-out, polished piece done over a period of time. Being assigned such projects encourages students to work at a higher level and to flex more of their intellectual capacities. It teaches them that good writing is something you craft, not something you spit out at a moment’s notice. And it demonstrates to them that they are capable of producing something they can be proud of. Moreover, restricting students’ writing to the classroom overlooks the fact that a student’s comprehension and thinking ability will nearly always be more accurately represented by sentences they take time to mull over, to compose, and to revise than by words they whip up when confronted by an exam question.

I have said this sort of thing to students from time to time, in explaining why most of their grade—all of it, in some cases—is based on such long-term writing assignments. Such assignments have been the most powerful teaching tool in my arsenal, particularly when they are developed in stages: a first draft followed by feedback, followed by a revision, and so on. Assuming that the student puts in the effort, the paper inevitably gets better through such a process.

But what is more important is that the student gets better. After all, I don’t actually need the essays. I don’t have a shortage of things to read. The essays are a means to an end, the end being the transformation of the author into an educated person. This kind of writing assignment is simply the best available instrument for bringing about this highly desirable end.

But it works only, of course, if the students themselves write the essays.

Are there alternatives? I could continue to assign such essays while spending a good deal of my time scrutinizing the submissions, hoping to catch as many of the AI-generated ones as possible. I could revise my assignments to try to make them more AI resistant. I could choose assigned readings that AI tools might be less familiar with. I could, in short, alter my teaching practice.

These efforts might dissuade some students from cheating and might mitigate its harmful effects. But the more sophisticated AI becomes—and it is progressing rapidly indeed—the less effective these efforts will be. The very idea of “AI-resistant assignments” could soon be an empty category. More than that, these strategies undermine the quality of the education I can deliver. Tailoring your assignments to make them difficult for AI tools often means not being able to ask the kinds of questions you want your students to answer, and not having them do the kind of work from which they would learn the most—the very work that can develop their ability to think and to write.

As I go on, I find that more of the time, energy, and resources I have for teaching are dedicated to dealing with this issue. I am doing less and less actual teaching, more and more policing. Sometimes I try to remember the last time I actually looked forward to walking into a classroom. It’s been a while.

One of the courses I teach is health ethics. Many of the students who enrol aspire to careers in medicine, in one form or another. Those who make it might become nurses, doctors, pharmacists, or emergency medical technicians. But whatever path they take, they will inevitably confront ethical dilemmas—some of them agonizing.

What do you do when a patient requests treatment that is not approved of by the medical establishment? Or declines a treatment they clearly need? Perhaps the request reflects bad information they found on the internet. Perhaps they are simply confused, not in their right mind, and not competent to make such decisions at the moment. But precisely what constitutes competence, and how do we know when someone is incompetent? Perhaps a patient’s resistance is based in their religious convictions; what then? Or what if a patient or client requests something that conflicts with the religious commitments of the hospital staff? Whose values ought to take priority in these and other cases of conflict and disagreement?

The ethical conundrums that health care workers encounter don’t arrive neatly packaged like an essay prompt. They won’t be anything you can google. Even if there were time to feed the dilemma into AI, the AI won’t help you figure out just which questions need to be asked and answered. And in real life, this is often the most difficult part.

A person who has not absorbed, and genuinely internalized, a fair bit of ethical insight by the time they find themselves in such situations is highly unlikely to respond well, or even minimally adequately. One needs to be able to see a real-life situation and grasp what is central and essential in it, to imagine the various possible outcomes and to see who and what is at risk—threats and dangers often hidden under the surface of what we observe.

And there is a great deal at stake. Reacting badly to a complicated moral situation that you were not prepared for can haunt you for a long time. It is no exaggeration to say that in the health professions, such situations can be, in the most literal sense, matters of life and death.

I don’t pretend to be able, in the course of a semester, to fully prepare students to face such situations. No one, after all, is ever as prepared as they could possibly be. But I can get them closer to being prepared—assuming that they commit and do the work. I can point out and help them explore examples of good ethical thinking while offering them a framework of theories and concepts to help make sense of complicated situations. In these ways, they can get better at recognizing the most pertinent considerations, and so better at framing the issues in more nuanced ways—ways more likely to lead to a good outcome, to a decision they can live with.

Students who skip this step, who use AI to avoid absorbing or even being exposed to these ideas, are hobbling themselves. In doing so, they risk doing serious harm, not only to themselves but to future patients, clients, colleagues—anyone, really, whom their potentially misguided actions and uninformed decisions might affect.

But let’s pretend, for a moment, that it’s true: students who cheat harm only themselves. Even though I am certainly angry at those students who choose to cheat, the fact is that I also care about them and feel a certain degree of compassion for them. I don’t want them to miss out on the opportunity to become educated, not even as the result of their own poor choices.

It’s a bit of a Catch-22. How can we expect them to make good choices, about their studies or anything else, if they have not yet been given the tools to think critically? How can we expect them to grasp what education means when we, as educators, haven’t begun to undo the years of cognitive and spiritual damage inflicted by a society that treats schooling as a means to a high-paying job, maybe some social status, but nothing more? Or, worse, to see it as bearing no value at all, as if it were a kind of confidence trick, an elaborate sham?

My life, like anyone’s, could have gone in a lot of different directions. As it happens, I was lucky enough to end up majoring in a subject—philosophy—that I love, and whose pursuit has allowed me to enjoy the benefits of genuine learning. So I am in a position to know and appreciate what a difference education makes to the quality of your life. The vastness of the world it opens up to you while simultaneously instilling in you the curiosity to explore it. The sense of perspective it offers, enabling you to view the events of your life, and the events of whatever historical moment it is yours to live through, in a much larger context, rather than being resigned to viewing them from a standpoint of uncomprehending ignorance, as if they were all happening for the first time and for no discernible reason.

The fact is, I want my students, all of them, to have that kind of experience, to have the opportunity to live that kind of life. I don’t want them to be cheated out of it. I don’t want this, even if they themselves are the ones they are cheating. No Magic Bag is worth that.


Troy Jollimore
Troy Jollimore is the author of four books of poetry and three books of philosophy. In 2006, he won the National Book Critics Circle award in poetry. His fourth book of poems, and most recent, is Earthly Delights.
 
"turning in essays is not part of that learning"
This is categorically false.

Essays formalize constructing an argument, as does mathematical proof and spoken debate (all of which also include prior research and note-taking).

The long, tedious, sometimes painful process of writing and rewriting and editing serves as rehearsal for your brain, later, to construct arguments and form conclusions quickly and independently. A student isn’t writing an essay because anyone cares deeply about what some 19-year-old has to say; he’s writing it so that he himself sorts out “here’s how I answer challenging questions with tact and rigor.”

I don’t remember the last time I wrote an essay, but I do organize and pace my thoughts internally—and therefore my spoken words—on the principles of rhetoric and argument… trained by writing essays.
 
Obviously, outsourcing 95% of your work to GPT prevents you from retaining information; on the flip side, 99% of what you learn in school is not worthy of, or indeed eligible for, retention. None of it matters unless you use it day in and day out. Any understanding you can retain from a lecture 5 years ago you can get from GPT in 30 seconds.

On one hand, students are lazier than ever and don't give a fuck (and for good reason, IMHO). On the other hand, the curriculum, the methodology, the subject matter, everything!... is out of date. These adjunct professors are glorified political activists who hate themselves and their lives; they are circling the wagons; they are scared. I'm watching videos of kids on their laptops using GPT to take lecture notes while the professor has a conniption and starts yelling about 'cheating.' They are totally out of touch, have forgotten who pays whom, and academia needs to burn to the ground.

These kids will be dumber and more machine dependent than any generation before them by leaps and bounds but it will be worth it to see the worthless servile faggots running universities cry every time they make a minimum payment on the student loans they took out 12 years ago.
 
One of the biggest problems is cultural, blue collar and trades are still viewed as "inferior" by a lot of people despite the fact they are absolutely essential to everything functioning.

Your average MD trying to repair a problem with electric wiring would be more likely to kill himself than succeed. If I tried changing a flat tire I'd probably fuck it up and injure myself.

Plus you have high schools stripping away vocational/trade courses in favor of the endless pursuit of STEM, the apparent be-all, end-all of human purpose and magical solution to all of the problems in a deindustrialized and financialized economy. They never think about who keeps the power going for the servers or repairs the machines the engineers design.
People say this, but you forget that intellectual, brain-derived work has always been higher status than menial labor, however skilled. That's been the case since the priests of Ancient Sumer demanded the peasantry pay taxes to them.

People who glorify trades/vocational work, or encourage young people into it, are essentially telling these people to be slaves and servants of those who actually get degrees (however worthless the degree is in itself, it connotes higher social status).

We live in a deindustrialized consumer economy, a society that prioritizes social status and capital more than just making money. A plumber or welder is still seen as poor and low status, however much money they bring home.

Intellectual ability and interest do matter, and I find this glorification of the prole lifestyle deeply distasteful.
 
Intellectual ability has nothing to do with a college degree. I've seen enough so-called graduates flake out and fail to know it means nothing in and of itself.

This may also be a fundamental ideological difference, I am a socialist and have deep family connections to the working class. My great-grandfather was a coal miner, my grandfather worked in a factory, my father worked in construction and I went to college.
 
As a socialist, then, you should know that things aren't as simple as "working man good, office man bad."

The entire superstructure of society rewards those who reproduce it, more than those who simply perpetuate it.

Credentials tell any potential employer that you are a better investment and more reliable than someone without them, at least as far as skilled work matters, and, as I said, we are operating in a society that emphasizes social capital more and more over simple rote docility.

(For the record, I think this is largely just glorified prostitution and you'd be surprised how sympathetic this makes me to some form of socialism, but it is what it is).

Regardless, it's fascinating to meet a socialist here. KF has a well-deserved reputation as a right-wing site.
 
YOU WILL NEVER HAVE A MAGIC BAG.
YOU ARE USING A TEXT COMPLETION ALGORITHM FILLED WITH HALLUCINATIONS, WARPED BY MANIPULATIVE MARKETING LINGO.
 
"In my experience, there is a fair bit of pressure on faculty not to lean on in-class written assignments. These are said to advantage students who think quickly, who perform well under pressure, who are more comfortable with pen and paper."
What seems to be the problem? Being able to think quickly and to work under pressure is an asset; being comfortable with pen and paper is a basic fucking requirement.

I sympathize with the teacher. It seems his heart is in the right place, but he is constantly stymied by woke colleagues. Still, the root of the problem is that higher ed is now a service industry. In any sane world, no more than 1% of the population would attend college, and less than 1% of those who do would embark on post-grad studies. Now it is all for profit, thus worthless. I applaud the teacher's zeal to educate, but the higher-ed problem is simply incurable, so he might as well let the kids cheat and sleep tight, for the sake of his health.

All that being said, I wouldn't let a poet teach me Ethics.
 
Well, yes, that's what credentialism causes. I will cut the author some slack as he teaches in the healthcare field, in which it is crucial that students actually learn all the course material, but turning in essays is not part of that learning. I don't give a shit if my doctor can turn out the bog-standard five-paragraph argument, I want him to know the difference between my pancreas and my prostate.

Making post-secondary a job requirement turns it into a grandiose version of forklift certification, and nobody's interested in having to write an essay on the history of Hyster and the Pettibone Super 20 before they start forkin' shit.
He teaches "ethics" at a libshit indoctrination farm.
 