Is ChatGPT killing higher education? AI is creating a cheating utopia, and universities don't know how to respond.

Frank Rumpenhorst/Picture Alliance via Getty Images

What’s the point of college if no one’s actually doing the work?

It’s not a rhetorical question. More and more students are not doing the work. They’re offloading their essays, their homework, even their exams, to AI tools like ChatGPT or Claude. These are not just study aids. They’re doing everything.

We’re living in a cheating utopia — and professors know it. It’s becoming increasingly common, and faculty are either too burned out or unsupported to do anything about it. And even if they wanted to do something, it’s not clear that there’s anything to be done at this point.

So what are we doing here?

James Walsh is a features writer for New York magazine’s Intelligencer and the author of the most unsettling piece I’ve read about the impact of AI on higher education.

Walsh spent months talking to students and professors who are living through this moment, and what he found isn’t just a story about cheating. It’s a story about ambivalence and disillusionment and despair. A story about what happens when technology moves faster than our institutions can adapt.

I invited Walsh onto The Gray Area to talk about what all of this means, not just for the future of college but the future of writing and thinking. As always, there’s much more in the full podcast, so listen and follow The Gray Area on Apple Podcasts, Spotify, Pandora, or wherever you find podcasts. New episodes drop every Monday.

This interview has been edited for length and clarity.

Let’s talk about how students are cheating today. How are they using these tools? What does the process look like?

It depends on the type of student, the type of class, the type of school you’re going to. Whether or not a student can get away with that is a different question, but there are plenty of students who are taking their prompt from their professor, copying and pasting it into ChatGPT and saying, “I need a four to five-page essay,” and copying and pasting that essay without ever reading it.

One of the funniest examples I came across is a number of professors are using this so-called Trojan horse method where they’re dropping non-sequiturs into their prompts. They mention broccoli or Dua Lipa, or they say something about Finland in the essay prompts just to see if people are copying and pasting the prompts into ChatGPT. If they are, ChatGPT or whatever LLM they’re using will say something random about broccoli or Dua Lipa.
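The check itself is trivial to automate. As a purely hypothetical sketch (the function and data here are illustrative, not anything the professors in the piece describe using), scanning submissions for the planted non-sequitur is a few lines:

```python
# Hypothetical "Trojan horse" check: a professor plants a non-sequitur
# term in the essay prompt, then scans submissions for it. A student who
# pasted the prompt into an LLM unedited will often echo the term back.

def find_trojan_hits(essays, planted_terms):
    """Return IDs of essays that mention any planted non-sequitur term."""
    hits = []
    for essay_id, text in essays.items():
        lowered = text.lower()
        if any(term.lower() in lowered for term in planted_terms):
            hits.append(essay_id)
    return hits

submissions = {
    "student_a": "Attachment theory explains how early bonds shape adults.",
    "student_b": "As requested, this essay also considers broccoli and Finland.",
}
print(find_trojan_hits(submissions, ["broccoli", "Dua Lipa", "Finland"]))
# ['student_b']
```

Of course, as the next exchange notes, any student who actually reads their own output will just delete the stray sentence, which is why the trick only catches the laziest cases.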

Unless you’re incredibly lazy, covering that up takes just a little effort.

Every professor I spoke to said, “So many of my students are using AI and I know that so many more students are using it and I have no idea,” because it can essentially write 70 percent of your essay for you, and if you do that other 30 percent to cover all your tracks and make it your own, it can write you a pretty good essay.

And there are these platforms, these AI detectors, and there’s a big debate about how effective they are. They will scan an essay and assign a score, say a 70 percent chance that this is AI-generated. And that’s really just looking at the language and deciding whether or not that language was created by an LLM.

But it doesn’t account for big ideas. It doesn’t catch the students who ask AI, “What should I write this essay about?”, outsource the actual thinking, and then just do the writing. It’s like paint-by-numbers at that point.

Did you find that students are relating very differently to all of this? What was the general vibe you got?

It was a pretty wide perspective on AI. I spoke to a student at the University of Wisconsin who said, “I realized AI was a problem last fall, walking into the library and at least half of the students were using ChatGPT.” And it was at that moment that she started thinking about her classroom discussions and some of the essays she was reading.

The one example she gave that really stuck with me was that she was taking some psych class, and they were talking about attachment theories. She was like, “Attachment theory is something that we should all be able to talk about [from] our own personal experiences. We all have our own attachment theory. We can talk about our relationships with our parents. That should be a great class discussion. And yet I’m sitting here in class and people are referencing studies that we haven’t even covered in class, and it just makes for a really boring and unfulfilling class.” That was the realization for her that something is really wrong. So there are students like that.

And then there are students who feel like they have to use AI because if they’re not using AI, they’re at a disadvantage. Not only that, AI is going to be around no matter what for the rest of their lives. So they feel as if college, to some extent now, is about training them to use AI.

What’s the general professor’s perspective on this? They seem to all share something pretty close to despair.

Yes. Those are primarily the professors in writing-heavy classes or computer science classes. There were professors who I spoke to who actually were really bullish on AI. I spoke to one professor who doesn’t appear in the piece, but she is at UCLA and she teaches comparative literature, and used AI to create her entire textbook for this class this semester. And she says it’s the best class she’s ever had.

So I think there are some people who are optimistic, [but] she was an outlier in terms of the professors I spoke to. For the most part, professors were, yes, in despair. They don’t know how to police AI usage. And even when they know an essay is AI-generated, the recourse there is really thorny. If you’re going to accuse a student of using AI, there’s no real good way to prove it. And students know this, so they can always deny, deny, deny. And the sheer volume of AI-generated essays or paragraphs is overwhelming. So that, just on the surface level, is extremely frustrating and has a lot of professors down.

Now, if we zoom out and think also about education in general, this raises a lot of really uncomfortable questions for teachers and administrators about the value of each assignment and the value of the degree in general.

How many professors do you think are now just having AI write their lectures?

There’s been a little reporting on this. I don’t know how many are. I know that there are a lot of platforms that are advertising themselves or asking professors to use them more, not just to write lectures, but to grade papers, which of course, as I say in the piece, opens up the very real possibility that right now an AI is grading itself and offering comments on an essay that it wrote. And this is pretty widespread stuff. There are plenty of universities across the country offering teachers this technology. And students love to talk about catching their professors using AI.

I’ve spoken to another couple of professors who are like, I’m nearing retirement, so it’s not my problem, and good luck figuring it out, younger generation. I just don’t think people outside of academia realize what a seismic change is coming. This is something that we’re all going to have to deal with professionally.

And it’s happening much, much faster than anyone anticipated. I spoke with somebody who works on education at Anthropic, who said, “We expected students to be early adopters and use it a lot. We did not realize how many students would be using it and how often they would be using it.”

Is it your sense that a lot of university administrators are incentivized to not look at this too closely, that it’s better for business to shove it aside?

I do think there’s a vein of AI optimism among a certain type of person, a certain generation, who saw the tech boom and thought, I missed out on that wave, and now I want to adopt. I want to be part of this new wave, this future, this inevitable future that’s coming. They want to adopt the technology and aren’t really picking up on how dangerous it might be.

I used to teach at a university. I still know a lot of people in that world. A lot of them tell me that they feel very much on their own with this, that the administrators are pretty much just saying, Hey, figure it out. And I think it’s revealing that university admins were quickly able, during Covid, for instance, to implement drastic institutional changes to respond to that, but they’re much more content to let the whole AI thing play out.

I think they were super responsive to Covid because it was a threat to the bottom line. They needed to keep the operation running. AI, on the other hand, doesn’t threaten the bottom line in that way, or at least it doesn’t yet. AI is a massive, potentially extinction-level threat to the very idea of higher education, but they seem more comfortable with a degraded education as long as the tuition checks are still cashing. Do you think I’m being too harsh?


I genuinely don’t think that’s too harsh. I think administrators may not fully appreciate the power of AI and exactly what’s happening in the classroom and how prevalent it is. I did speak with many professors who go to administrators or even just older teachers, TAs going to professors and saying, This is a problem.

I spoke to one TA at a writing course at Iowa who went to his professor, and the professor said, “Just grade it like it was any other paper.” I think they’re just turning a blind eye to it. And that is one of the ways AI is exposing the rot underneath education.

It’s this system that hasn’t been updated in forever. And in the case of the US higher ed system, it’s like, yeah, for a long time it’s been this transactional experience. You pay X amount of dollars, tens of thousands of dollars, and you get your degree. And what happens in between is not as important.

The universities, in many cases, also have partnerships with AI companies, right?

Right. And what you said about universities can also be said about AI companies. For the most part, these are companies or companies within nonprofits that are trying to capture customers. One of the more dystopian moments was when we were finishing this story, getting ready to completely close it, and I got a push alert that was like, “Google is letting parents know that they have created a chatbot for children under [thirteen years old].” And it was kind of a disturbing experience, but they are trying to capture these younger customers and build this loyalty.

There’s been reporting from the Wall Street Journal on OpenAI and how they have been sitting on an AI that would be really, really effective at essentially watermarking their output. And they’ve been sitting on it, they have not released it, and you have to wonder why. And you have to imagine they know that students are using it, and in terms of building loyalty, an AI detector might not be the best thing for their brand.
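For context on what “watermarking” means here: nothing about OpenAI’s unreleased system is public, but published schemes (such as the “green list” watermark of Kirchenbauer et al., 2023) bias generation toward a pseudo-random subset of the vocabulary seeded by the preceding token, so a detector that knows the seeding scheme can count how often that subset appears. A minimal, illustrative sketch of the detection side, with my own hypothetical function name and a toy hashing scheme in place of the real cryptographic one:

```python
import hashlib

def green_fraction(tokens, green_ratio=0.5):
    """Fraction of token bigrams whose second token lands in the
    'green list' pseudo-randomly seeded by the first token.
    Watermarked text is generated to push this fraction above
    green_ratio; unwatermarked text should hover near it."""
    if len(tokens) < 2:
        return 0.0
    hits = 0
    for prev, cur in zip(tokens, tokens[1:]):
        # Deterministic pseudo-random bucket for the (prev, cur) pair.
        digest = hashlib.sha256(f"{prev}|{cur}".encode()).hexdigest()
        if int(digest, 16) % 100 < green_ratio * 100:
            hits += 1
    return hits / (len(tokens) - 1)
```

A real detector would turn this fraction into a z-score against the expected rate under the null hypothesis (no watermark), flagging text only when the excess is statistically overwhelming — which is what makes such a scheme far more reliable than the stylistic guesswork of current AI detectors.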

This is a good time to ask the obligatory question, Are we sure we’re not just old people yelling at clouds here? People have always panicked about new technologies. Hell, Socrates panicked about the written word. How do we know this isn’t just another moral panic?

I think there are a lot of different ways we could respond to that. It’s not a generational moral panic. This is a tool that’s available, and it’s available to us just as it’s available to students. Society and our culture will decide what the morals are, and that is changing, along with the definition of cheating. So who knows? It might be a moral panic today and not be one in a year.

However, I think somebody like Sam Altman, the CEO of OpenAI, is one of the people who said, “This is a calculator for words.” And I just don’t really understand how that is compatible with other statements he’s made about AI potentially being lights out for humanity, or statements made by people at Anthropic about the power of AI to potentially be a catastrophic event for humans. And these are the people who are closest to it and thinking about it the most, of course.

I have spoken to some people who say there is a possibility, and I think there are people who use AI who would back this up, that we’ve maxed out the AI’s potential to supplement essays or writing. That it might not get much better than it is now. And I think that’s a very long shot, one that I would not want to bank on.

Is your biggest fear at this point that we are hurtling toward a post-literate society? I would argue, if we are post-literate, then we’re also post-thinking.

It’s a very scary thought that I try not to dwell on: the idea that my profession and what I’m doing is just feeding the machine, that my most important reader now is a robot, and that there are going to be fewer and fewer readers. That’s really scary, not just because of subscriptions, but because, as you said, it means fewer and fewer people thinking and engaging with these ideas.

I think ideas can certainly be expressed in other mediums and that’s exciting, but I don’t think anybody who’s paid attention to the way technology has shaped teen brains over the past decade and a half is thinking, Yeah, we need more of that. And the technology we’re talking about now is orders of magnitude more powerful than the algorithms on Instagram.

 
I graduated back in 2021, so that was right before the specific "tutor" (cheating) LLMs really started to become widespread. I'm very unsurprised that AI is the new thing, school administrations would turn a blind eye to international students blatantly cheating, and it turned higher education into a rat race.
Tuition fees keep getting higher, while job prospects keep trending downwards, so I can hardly blame anybody for taking advantage of whatever helps them get ahead. For a lot of these schools, it's a mess they created, and society is going to pay for it. We already have with a lot of the international students cheating their asses off blatantly.
 
There are (at least theoretically) a few things you should get from a university.

1. a degree - this is basically a certification by an authority that you know what you're talking about.

The problem is, this is not really any different from a certification you could get from a dedicated certification authority (e.g. a Cisco CCNA certification). In the olden days, the university offered you access to books and things, and teachers to educate you... and also the degree at the end. But the books and teachers are now online, so all you really get is that degree, which is the same as a certification.

2. a program of study - the idea here is that in addition to your (hopefully useful) degree, the university would make you a better person by forcing you to study things that you didn't even know were important. They force you to take philosophy and history classes, for example.

If you don't agree that a university should do this, here's a great video that makes the argument:


The problem is, universities became ideologically captured and as a result, today when you meet someone with a prestigious degree, you no longer think, "this person probably knows lots of stuff" - you think, "this person is probably about to start whining about privilege and transphobia."

The political Left, which controls most institutions, is openly racist against whites, and so they're not going to teach you the western cannon. They're just going to tell you it's bad.

3. a network, and a culture - there used to be a unique culture at Harvard vs. Yale. Used to be that you had songs and camaraderie that would follow you and help you in your professional career.

The problem is, we built too many universities and accepted the expectation that everyone should go to one. Worse, the universities created "degrees" for every Tom, Dick, and Harry. They were chasing that sweet, sweet tuition loan money. As a result, you can get a degree in ... well, let's just let some modern students explain it themselves:


These are not serious people, and they are not studying serious things. NYU may be exceptionally bad, but every university contains this rot. I have two undergraduate degrees and one graduate degree, all from different institutions. But if I meet someone from an alma mater, I have basically nothing in common with them. Our football team is probably the last tenuous connection any of us retain.

University is no longer prestigious. It's basically high school with extra steps.

All of these things could be restored.

Any institution could identify these (and probably other) items of value that they could offer to their students, and focus on creating and preserving them. But they wont, because they want money, and they hate white people.
 
lol. the fag who wrote this article is outing himself as a fucking nerd who's dumb enough to believe that college is a meritocracy. bro, all that extra work you did staying up late at night while your other group members fucked off went straight into the garbage. your meticulously written term paper got a whole ten seconds of attention from the professor who ran it through the plaigarism detector, checked it off, and instantly forgot about it. all the nerd shit you did while your much cooler classmates were out partying, making friends, and getting laid was a complete waste. the guy who skated through every class faking his ass off through every assignment is probably your boss at Vox now, and he's sending you a reminder that your article is due TONIGHT from the penthouse suite in Fiji his friend got him for a week, while you complain to your wife's boyfriend that you're the only one who does any work. you failed to understand the rules, you lost a game you didn't even know you were playing, and now your busybody ass is grousing about the next generation of winners, who rightfully give zero fucks about your gay assignments, and will go on to do much cooler shit than you while being half your age. that's why you're a columnist and a college professor, two of the lamest and most useless professions in the world.
I'd normally rate a text wall like this autistic by default, but there is truth in what you say here; and the problem is that that truth is part of the reason why we have the competency crisis now.
 
The real problem is that schools don't teach you how to think and learn. They teach you to memorize and regurgitate what you are told or read. And guess what, LLMs are better at that than anything else. The education system has been fundamentally flawed from the get-go, because what they should be training children in is assessing information and getting meaningful conclusions from that assessment. Being able to recall a thing you memorized is a useful skill, but if you have no idea what to do with it, what value is that in the long run? More value, and probably more engagement from kids, would be gained by training them in logic, gathering data, and decision making. Do it up like escape rooms or video games or whatever to keep the little monsters from being too bored. Present them with a problem that needs solving, have the tools to solve the problem be accessible, and leave them to it. A person who's used to problem-solving can easily seek out information they need for their task, but someone who only knows how to repeat something they heard is going to be lost quickly.
 
University is no longer prestigious. It's basically high school with extra steps.
And that's in America. Your education system is still ahead of many countries'. Take Argentina, which has a dogshit system below college. Where you can pass early grades without actually knowing how to read. Where some teachers are told not to give students failing grades. Where, as of 2024, Aprender test results show that only 14.2% of 5th and 6th grade students are learning math at a satisfactory level. And those tests aren't like SOL tests in the US: the scores can be as shit as that, and schools face no major consequence. We have a system where, if you want your kid to actually learn, you need money to get them into an engineering-focused school, or just straight up a private school. Otherwise, your kid won't get challenged, and when it's time to go to college... the entrance exams fuck their shit up, even though the tests aren't that hard anymore.

And this isn't even getting into how teachers have practically no authority. If a teacher tells a student to shut the fuck up in just slightly too harsh a tone, that student can complain to family, who will then raise hell. In some cases, they even threaten to kill teachers. In others, lawsuits happen, and the courts end up ruling against the school. Now, imagine teachers with no authority having to deal with bullies. Imagine trying to teach kids how to express themselves properly when they have no incentive to actually learn. It's why, when I find some random video on TikTok/Facebook/etc. showing footage of Buenos Aires from previous decades, commenters notice: "wow, even when speaking casually, people sounded smarter back then."

The other side of this is bad parenting, obviously, but this thread is about academia, so that's why I focused on that side.
It's like there's two parallel systems: dogshit public schools and okay private schools that require not being poor. Then in college, everyone has to figure out how to make things work out for themselves. And then people wonder why so many students drop out there. This also isn't helped by strikes that leave colleges without activity on random days. Funny that these strikes almost never seemed to happen unless Argentina got a non-lefty president.
 
Here is my perspective:
I went to a very traditional university where, to pass philosophy exams, you had to WRITE three short essays with pen and paper and no electronics (you could pick three topics out of five). And before my university, in high school, we had a form of class test that included writing essays. I also had some oral exams that required formulating answers to the questions using my own mind. Believe it or not, the same applied to some of the tests in my CS classes. You had to write code by hand.

That automatically solves the problem of cheating. However, it creates another problem: no college today employs enough hard-working people to spend time marking these types of tests, or to spend hours doing oral examinations.

As a result all the credentials are even more laughable than before.
 
The real problem is that schools don't teach you how to think and learn. They teach you to memorize and regurgitate what you are told or read. And guess what, LLMs are better at that than anything else. [...]
Hard agree. I got out of academia before ChatGPT became a thing, but even back then I would get papers full of paragraphs that were lifted straight out of the textbook. I would hold study sessions where I asked questions, and there were kids who immediately started flipping through the book.
 
There is a simple solution to the problem. It's called exams. But the same schools that charge infinity money and spend infinity money will tell you that what was commonly done a few decades ago in terms of exams is now simply impossible. Grades didn't mean shit before AI, and they are not going to mean shit after AI. There is a world full of dummies out there with near-perfect GPAs. Even 10 or 15 years ago, it was so bad that looking at college transcripts was already a waste of time.
 
Are guns causing homicides in black Chicago neighborhoods?
Is HIV/AIDS killing tens of thousands of faggots a year?
Does alcohol being legal lead to a frightening amount of drunk driving incidents that kill people per annum?
Should we consider if HFCS is making people fat?

Each year I live I slowly grow more misanthropic.
 
Right now I have high school teachers using AI to read and grade papers most likely written by AI. I don't blame them; there's a huge push to do frequent "formative assessments" that go beyond multiple choice and go deeper into Bloom's taxonomy. When a typical teacher has a roster pushing 200, how is anyone going to read all that shit and give meaningful feedback?
The whole thing is so ludicrous to me. First you have these True Believers in education who think every piece of shit retard can be the next Albert Einstein without thinking wether or not we need infinite Einsteins let alone the fact that, no, not everyone can be Einstein if you just give the right assessments or teach a particular way. It's a fucking flea circus and it's been that way for a long time. Now comes AI and we've created this situation where AI is talking to AI and the teachers and students are only pretending to accomplish anything. I just have to laugh. Someone needs to update Brazil and Idiocracy.
In the meantime I'm going to make sure my own children write their fucking essays themselves because someone in this world needs to know how to write without tech help. I don't even care if they end up with a C compared to the cheater's A. Actual skill is that important to our family.
Anyone can teach themselves how to write a prompt with about a half hour of fucking around. Actually putting together your own thoughts takes a lifetime of diligence and practice, but no one is willing to do that or figure out an efficient system for adequately assessing if someone can.
 
Essays need to die.

We all have our own attachment theory.
"People" taking a psych class? Probably. Normal people? No, most of the world doesn't subscribe to your pedophilic religion.

They force you to take philosophy and history classes, for example.

If you don't agree that a university should do this, here's a great video that makes the argument:
Philosophy is garbage. It used to be science, then it became sperging of people who also advanced science, now it's just sperging of privileged fucks too stupid for science. Philosophy is hazing, and it shouldn't exist.

History is important, but it's a waste of time to teach "history" at a university. The best you can do is an equivalent of the Soviet "history of the Party".
 