Everyone Is Cheating Their Way Through College - ChatGPT has unraveled the entire academic project.

Source: https://nymag.com/intelligencer/art...eating-education-college-students-school.html
Archive: https://archive.is/DzOE6

Chungin “Roy” Lee stepped onto Columbia University’s campus this past fall and, by his own admission, proceeded to use generative artificial intelligence to cheat on nearly every assignment. As a computer-science major, he depended on AI for his introductory programming classes: “I’d just dump the prompt into ChatGPT and hand in whatever it spat out.” By his rough math, AI wrote 80 percent of every essay he turned in. “At the end, I’d put on the finishing touches. I’d just insert 20 percent of my humanity, my voice, into it,” Lee told me recently.

Lee was born in South Korea and grew up outside Atlanta, where his parents run a college-prep consulting business. He said he was admitted to Harvard early in his senior year of high school, but the university rescinded its offer after he was suspended for sneaking out during an overnight field trip before graduation. A year later, he applied to 26 schools; he didn’t get into any of them. So he spent the next year at a community college, before transferring to Columbia. (His personal essay, which turned his winding road to higher education into a parable for his ambition to build companies, was written with help from ChatGPT.) When he started at Columbia as a sophomore this past September, he didn’t worry much about academics or his GPA. “Most assignments in college are not relevant,” he told me. “They’re hackable by AI, and I just had no interest in doing them.” While other new students fretted over the university’s rigorous core curriculum, described by the school as “intellectually expansive” and “personally transformative,” Lee used AI to breeze through with minimal effort. When I asked him why he had gone through so much trouble to get to an Ivy League university only to off-load all of the learning to a robot, he said, “It’s the best place to meet your co-founder and your wife.”

By the end of his first semester, Lee checked off one of those boxes. He met a co-founder, Neel Shanmugam, a junior in the school of engineering, and together they developed a series of potential start-ups: a dating app just for Columbia students, a sales tool for liquor distributors, and a note-taking app. None of them took off. Then Lee had an idea. As a coder, he had spent some 600 miserable hours on LeetCode, a training platform that prepares coders to answer the algorithmic riddles tech companies ask job and internship candidates during interviews. Lee, like many young developers, found the riddles tedious and mostly irrelevant to the work coders might actually do on the job. What was the point? What if they built a program that hid AI from browsers during remote job interviews so that interviewees could cheat their way through instead?
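For readers unfamiliar with LeetCode, the "algorithmic riddles" it drills are typically short, self-contained puzzles of this shape. The following is an illustrative sketch of the classic "two sum" exercise; the problem and code are not from the article, just an example of the genre:

```python
# Illustrative LeetCode-style puzzle (not from the article): "two sum".
# Given a list of numbers and a target, return the indices of the two
# numbers that add up to the target, or None if no pair exists.
def two_sum(nums, target):
    seen = {}  # maps each value already visited to its index
    for i, n in enumerate(nums):
        complement = target - n
        if complement in seen:
            return [seen[complement], i]
        seen[n] = i
    return None

print(two_sum([2, 7, 11, 15], 9))  # -> [0, 1]
```

Interview prep means producing dozens of solutions like this under time pressure, which is the tedium Lee's tool was built to sidestep.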

In February, Lee and Shanmugam launched a tool that did just that. Interview Coder’s website featured a banner that read F*CK LEETCODE. Lee posted a video of himself on YouTube using it to cheat his way through an internship interview with Amazon. (He actually got the internship, but turned it down.) A month later, Lee was called into Columbia’s academic-integrity office. The school put him on disciplinary probation after a committee found him guilty of “advertising a link to a cheating tool” and “providing students with the knowledge to access this tool and use it how they see fit,” according to the committee’s report.

Lee thought it absurd that Columbia, which had a partnership with ChatGPT’s parent company, OpenAI, would punish him for innovating with AI. Although Columbia’s policy on AI is similar to that of many other universities’ — students are prohibited from using it unless their professor explicitly permits them to do so, either on a class-by-class or case-by-case basis — Lee said he doesn’t know a single student at the school who isn’t using AI to cheat. To be clear, Lee doesn’t think this is a bad thing. “I think we are years — or months, probably — away from a world where nobody thinks using AI for homework is considered cheating,” he said.

In January 2023, just two months after OpenAI launched ChatGPT, a survey of 1,000 college students found that nearly 90 percent of them had used the chatbot to help with homework assignments. In its first year of existence, ChatGPT’s total monthly visits steadily increased month-over-month until June, when schools let out for the summer. (That wasn’t an anomaly: Traffic dipped again over the summer in 2024.) Professors and teaching assistants increasingly found themselves staring at essays filled with clunky, robotic phrasing that, though grammatically flawless, didn’t sound quite like a college student — or even a human.

Two and a half years later, students at large state schools, the Ivies, liberal-arts schools in New England, universities abroad, professional schools, and community colleges are relying on AI to ease their way through every facet of their education. Generative-AI chatbots — ChatGPT but also Google’s Gemini, Anthropic’s Claude, Microsoft’s Copilot, and others — take their notes during class, devise their study guides and practice tests, summarize novels and textbooks, and brainstorm, outline, and draft their essays. STEM students are using AI to automate their research and data analyses and to sail through dense coding and debugging assignments. “College is just how well I can use ChatGPT at this point,” a student in Utah recently captioned a video of herself copy-and-pasting a chapter from her Genocide and Mass Atrocity textbook into ChatGPT.

Sarah, a freshman at Wilfrid Laurier University in Ontario, said she first used ChatGPT to cheat during the spring semester of her final year of high school. (Sarah’s name, like those of other current students in this article, has been changed for privacy.) After getting acquainted with the chatbot, Sarah used it for all her classes: Indigenous studies, law, English, and a “hippie farming class” called Green Industries. “My grades were amazing,” she said. “It changed my life.” Sarah continued to use AI when she started college this past fall. Why wouldn’t she? Rarely did she sit in class and not see other students’ laptops open to ChatGPT.

Toward the end of the semester, she began to think she might be dependent on the website. She already considered herself addicted to TikTok, Instagram, Snapchat, and Reddit, where she writes under the username maybeimnotsmart. “I spend so much time on TikTok,” she said. “Hours and hours, until my eyes start hurting, which makes it hard to plan and do my schoolwork. With ChatGPT, I can write an essay in two hours that normally takes 12.”

Teachers have tried AI-proofing assignments, returning to Blue Books or switching to oral exams. Brian Patrick Green, a tech-ethics scholar at Santa Clara University, immediately stopped assigning essays after he tried ChatGPT for the first time. Less than three months later, teaching a course called Ethics and Artificial Intelligence, he figured a low-stakes reading reflection would be safe — surely no one would dare use ChatGPT to write something personal. But one of his students turned in a reflection with robotic language and awkward phrasing that Green knew was AI-generated. A philosophy professor across the country at the University of Arkansas at Little Rock caught students in her Ethics and Technology class using AI to respond to the prompt “Briefly introduce yourself and say what you’re hoping to get out of this class.”

It isn’t as if cheating is new. But now, as one student put it, “the ceiling has been blown off.” Who could resist a tool that makes every assignment easier with seemingly no consequences? After spending the better part of the past two years grading AI-generated papers, Troy Jollimore, a poet, philosopher, and Cal State Chico ethics professor, has concerns. “Massive numbers of students are going to emerge from university with degrees, and into the workforce, who are essentially illiterate,” he said. “Both in the literal sense and in the sense of being historically illiterate and having no knowledge of their own culture, much less anyone else’s.”

That future may arrive sooner than expected when you consider what a short window college really is. Already, roughly half of all undergrads have never experienced college without easy access to generative AI. “We’re talking about an entire generation of learning perhaps significantly undermined here,” said Green, the Santa Clara tech ethicist. “It’s short-circuiting the learning process, and it’s happening fast.”
 
Maybe something more happened, but does no one else see what absolute bullshit this is? Because of some dumb teen bullshit unrelated to his academics, he got his offer rescinded. Then, even though he'd been admitted to Harvard, he couldn't get into any other school afterwards?

If this kid had taken part in looting and rioting for "social justice," there likely wouldn't have been any issues, but do dumb kid shit and you get ruined over it.
 
Ever since I've always assumed that the majority of teachers either just read the first page or skim whatever you turn in, and I would bet the same applies to college.
To truly read and understand a paper takes time, and they've got 20-plus other kids' papers to grade. They check whether anything bad stands out and then skim the rest.

The irony of them bitching about AI is that they and the TAs use computerized grading tools to make their own jobs easier, but students aren't allowed the same. This "do as I say and not as I do" is another big problem with education, especially as education continues to be treated as a profit center by cramming classes full of students. Little wonder college attendance continues to decline.
I am confident people who can't get much out of AI are simply poor users of it.
I’ve witnessed retards attempting to use AI and then get pissy when it gives bad results. As someone else mentioned, it’s a great aid for people. But there are millions of dumbasses that will upload their assignment, say “do my homework” and then get a red ass about poor results.
The Ivy League schools these days are crammed full of applicants. Acceptance rates have dropped significantly over the last several years. They're in a position to drop anyone they want, because there are tens of thousands of other great applicants who don't commit random acts of dumbfuckery like this.
 
Just consider the absurdity of paying a university tens of thousands of dollars so you can go and cheat your way through with AI, learning not a single thing other than how to use AI, which you could do from your phone in between jerking off to TikTok thots.

I am a longtime hater of how much time university wastes, seemingly by its very design. Nonetheless, there were five classes taught by the same guy that very much defined how I started my career in software engineering. Everything else was stupid, especially spending six months learning programming when you could learn it from free tutorials in a few weeks if you were sufficiently motivated.

These people also think they are smart and clever for using AI to do all this for them. Much like the original guy in the article.
 
For my high-school Advanced English class, I was supposed to write this ten-page paper on Hamlet. I blew it off till the last minute, then wrote two pages, and the remaining eight pages devolved into pure madness about how everyone in the play mutated into strawberry-ice-cream monsters, Hamlet's castle was destroyed by a giant ice cream monster, and he killed his wife after discovering she'd cheated, because his child had waffle cone instead of hair.

My teacher gave me a 100% on it, and I was too afraid to ask her whether she'd ever read past the first two pages. Ever since I've always assumed that the majority of teachers either just read the first page or skim whatever you turn in, and I would bet the same applies to college.
But did everyone clap?
 
College is a fraud: grossly overpriced, and you're basically paying for a piece of paper that says you're allowed to have a slightly better job. But if you don't go into debt for this piece of paper, you aren't allowed the better jobs. Stuff like being a doctor or engineer makes sense, but there's plenty of other stuff that doesn't really require a degree and could easily just be on-site job training. But again, it's all a big scam.
To throw more fuel on it, the whole jig breaks down over the fact that most graders can't be bothered to check anything either, or to think about the material beyond verifying it matches the course's correct answers.
 
I have no problem with the rest of your post, but this commonly held attitude about education has always baffled me. Is it really that hypocritical? Do teachers and TAs have any obligation to spend as much time per student on an assignment as each student does? The educators aren't the ones who need the practice or need to demonstrate competency. Teachers and their assistants just need to deliver an evaluation, and it's expensive to do that at scale without technology.
 
Meanwhile David Hogg got into Harvard by not even being in a school shooting.

These institutions only serve for the children of rich shitlibs and kikes to network with each other, simple as.
 
They do have a responsibility to provide constructive feedback on essays. When a teacher just writes a grade on the paper and leaves no feedback (which is extremely common), kids never learn how to write properly.

Somehow teachers managed to actually read every student paper before automated grading technology existed. They're also given several paid hours a day to grade.
 
Miss Sneed learned to write through a northeastern education. ChatGPT apparently did too. Her handwritten shit is sometimes accused of being AI.

I always shoot from the hip, so I’m immune. But I’ll use AI to collect my thoughts or automate googling shit (then treat it like a robotic pet grad student and catch mistakes).

Almost like it's a tool for people who are at the level of checking someone else's work, but not much else; almost like it's cheating for people below that level? 🤔
 
Then go back to pencil and paper and exam halls. We managed. Practical skills can be evaluated in person too. De-digitalise the entire thing; return to parchment if you have to. I've no sympathy for any academics setting essays they can't tell were written by ChatGPT. If you've automated everything to the point that you can't tell whether a student cheated, your course is worth shit.
I think this is likely the best solution. U.S. students are so far in the toilet compared to students from other countries that they'll be used as a case study by other countries in why you shouldn't let the thrall of technology replace what actually works and gets the job done. There is a point where technology is good and has its application, but its use in schools (beyond the computer lab) is absolutely turning students into future adults who won't be able to function and run society.

It's actually happening, btw. We used to do a quick little test before one of my classes, but after three semesters of the entire year acing them, they switched to pen and paper, and all is well in the world.
I hope more schools do this.
Arguments like this are braindead. If you're treating the course as a set of mindless busywork that gatekeeps you from a magical piece of paper, then you're setting yourself up to fail. Sure, it's going to be fun and easy cheating your way through a basic freshman subject, but good luck getting through the more complex shit later on.
I'd also say that they'll be a liability to any future company or lab if they don't learn how to do things the right way. Imagine how tickled their employer will be when they cost them millions through their severe lack of reasoning skills and atrophied problem-solving abilities, because they never learned to do these things properly.
I’m still more worried about Mgumbo paying someone to pass her nursing board exam than I am about some dangerhair skating through sociology class. I’ve heard so many stories through the grapevine it’s terrifying.
That's the reason I can't jibe with the whole digitization thing beyond a certain point: it is creating, enabling, and embedding into the system dangerous levels of idiocy, people who will, through their incompetence and laziness, harm or kill others or create disasters.
I agree with Otter. Computers are a luxury, not a necessity. However, youngsters cheating their homework could also be detrimental.
People act like the times before the advent of widespread personal computing and handheld digital devices were the dark ages. But those generations managed to send men to the moon, unlock the secrets of the atom, and discover much of what underpins modern medicine. If they'd had a device they turned to that told them what they wanted to hear, thought for them, and regurgitated answers based on the slop of the day, do you think they would have been able to figure any of this out?

I think we are going to go backwards soon. The society of today was built on the ability of previous generations to reason, research, and extrapolate enough to create theories and test them, without some shiny electrical device in their hands to distract them and encourage laziness and intellectual pablum.
 
They're also given several paid hours a day to grade.
I missed the boat on this when I started teaching. Forty kids per class, six classes. Brutal. It's why I escaped the classroom. I would actually be all for a different model where someone other than the teacher grades and provides feedback, or at least collaborates with the teacher to provide it. It serves as verification of instruction and reduces bias; it's hard not to give pity points or grade punitively. Right now I'm watching student teachers try out Cograder, an AI assistant, and it's pretty good at reducing the strain of reading so many papers while still prompting teachers to give their own feedback where it counts.
 
You can absolutely use AI to write about real subjects, weird subjects, and highly technical ones. You just need the right training data. It's not that hard if you know how to use it, and it's not hard to make it sound real. So much of what I hear people say about AI sounds like they last used it 18 months ago, or like they "know" AI only from its default output, without custom training.
 
While I find most degrees to be fart-huffing credentialism, I'm kinda happy my school had a capstone project, where the entire class semester is spent building a solution, followed by an oral exam with just you and the school chair, where they can ask damn near whatever they want. I'm not saying everyone got fluffed; in fact, I'd almost say they tailored the oral exam to who they saw as weak, middle, or strong. But it at least felt like I could say I learned something, even if half my class hours were spent listening to music in the lab.

While I feel I could write a War and Peace about how to save academia (burn it to the ground, along with the administrative bureaucracy), I'll put it aside so I don't sperg. But as someone already stated, the teachers aren't in charge of the school; it's all middle-management penny-pinchers.
 