A New Headache for Honest Students: Proving They Didn’t Use A.I.

NY Times - Archive
Students are resorting to extreme measures to fend off accusations of cheating, including hours-long screen recordings of their homework sessions.

A few weeks into her sophomore year of college, Leigh Burrell got a notification that made her stomach drop.
She had received a zero on an assignment worth 15 percent of her final grade in a required writing course. In a brief note, her professor explained that he believed she had outsourced the composition of her paper — a mock cover letter — to an A.I. chatbot.
“My heart just freaking stops,” said Ms. Burrell, 23, a computer science major at the University of Houston-Downtown.
But Ms. Burrell’s submission was not, in fact, the instantaneous output of a chatbot. According to Google Docs editing history that was reviewed by The New York Times, she had drafted and revised the assignment over the course of two days. It was flagged anyway by a service offered by the plagiarism-detection company Turnitin that aims to identify text generated by artificial intelligence.

Panicked, Ms. Burrell appealed the decision. Her grade was restored after she sent a 15-page PDF of time-stamped screenshots and notes from her writing process to the chair of her English department.
Still, the episode made her painfully aware of the hazards of being a student — even an honest one — in an academic landscape distorted by A.I. cheating.
Generative A.I. tools including ChatGPT are reshaping education for the students who use them to cut corners. According to a Pew Research survey conducted last year, 26 percent of teenagers said they had used ChatGPT for schoolwork, double the rate of the previous year. Student use of A.I. chatbots to compose essays and solve coding problems has sent teachers scrambling for solutions.
But the specter of A.I. misuse, and the imperfect systems used to root it out, may also be affecting students who are following the rules. In interviews, high school, college and graduate students described persistent anxiety about being accused of using A.I. on work they had completed themselves — and facing potentially devastating academic consequences.
In response, many students have imposed methods of self-surveillance that they say feel more like self-preservation. Some record their screens for hours at a time as they do their schoolwork. Others make a point of composing class papers using only word processors that track their keystrokes closely enough to produce a detailed edit history.

The next time Ms. Burrell had to submit an assignment for the class in which she had been accused of using A.I., she uploaded a 93-minute YouTube video documenting her writing process. It was annoying, she said, but necessary for her peace of mind.
“I was so frustrated and paranoid that my grade was going to suffer because of something I didn’t do,” she said.

These students’ fears are borne out by research reported in The Washington Post and Bloomberg Businessweek indicating that A.I.-detection software, a booming industry in recent years, often misidentifies work as A.I.-generated.
A new study of a dozen A.I.-detection services by researchers at the University of Maryland found that they had erroneously flagged human-written text as A.I.-generated about 6.8 percent of the time, on average.
“At least from our analysis, current detectors are not ready to be used in practice in schools to detect A.I. plagiarism,” said Soheil Feizi, an author of the paper and an associate professor of computer science at Maryland.

Turnitin, which was not included in the analysis, said in 2023 that its software mistakenly flagged human-written sentences about 4 percent of the time. A detection program from OpenAI that had a 9 percent false-positive rate, according to the company, was discontinued after six months.
Turnitin did not respond to requests for comment for this article, but has said that its scores should not be used as the sole determinant of A.I. misuse.
“We cannot mitigate the risk of false positives completely given the nature of A.I. writing and analysis, so, it is important that educators use the A.I. score to start a meaningful and impactful dialogue with their students in such instances,” Annie Chechitelli, Turnitin’s chief product officer, wrote in a 2023 blog post.
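For a sense of scale, here is a minimal back-of-the-envelope sketch in Python (not from the article; the per-semester assignment counts and the independence assumption are mine) showing how an error rate like the 6.8 percent average from the Maryland study compounds for an honest student over a semester:

# Rough sketch: chance an honest student is falsely flagged at least once,
# treating each assignment as an independent check (a simplification).
def p_at_least_one_false_flag(rate: float, assignments: int) -> float:
    return 1 - (1 - rate) ** assignments

RATE = 0.068  # average false-positive rate from the Maryland study cited above

for n in (4, 8, 12):  # assignment counts per semester are assumptions
    print(f"{n} assignments: {p_at_least_one_false_flag(RATE, n):.0%} chance of at least one false flag")

Under those assumptions, the chance of at least one false flag runs from roughly 25 percent over four assignments to more than half over twelve, which is the exposure students like Ms. Burrell are trying to insure against with screen recordings and edit histories.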
Some students are mobilizing against the use of A.I.-detection tools, arguing that the risk of penalizing innocent students is too great. More than 1,000 people have signed an online petition started by Kelsey Auman last month, one of the first of its kind, that calls on the University at Buffalo to disable its A.I.-detection service.
A month before her graduation from the university’s master of public health program, Ms. Auman was told by a professor that three of her assignments had been flagged by Turnitin. She reached out to other members of the 20-person course, and five told her that they had received similar messages, she recalled in a recent interview. Two said that their graduations had been delayed.
Ms. Auman, 29, was terrified she would not graduate. She had finished her undergraduate studies well before ChatGPT arrived on campuses, and it had never occurred to her to stockpile evidence in case she was accused of cheating using generative A.I.

“You just assume that if you do your work, you’re going to be fine — until you aren’t,” she said.

Ms. Auman said she was concerned that A.I.-detection software would punish students whose writing fell outside “algorithmic norms” for reasons that had nothing to do with artificial intelligence. In a 2023 study, Stanford University researchers found that A.I.-detection services were more likely to misclassify the work of students who were not native English speakers. (Turnitin has disputed those findings.)
After Ms. Auman met with her professor and exchanged lengthy emails with the school’s Office of Academic Integrity, she was notified that she would graduate as planned, without any charges of academic dishonesty.
“I’m just really glad I’m graduating,” she said. “I can’t imagine living in this feeling of fear for the rest of my academic career.”
John Della Contrada, a spokesman for the University at Buffalo, said that the school was not considering discontinuing its use of Turnitin’s A.I.-detection service in response to the petition.
“To ensure fairness, the university does not rely solely on A.I.-detection software when adjudicating cases of alleged academic dishonesty,” he wrote in an email, adding that the university guaranteed due process for accused students, a right to appeal and remediation options for first-time offenders. (Ms. Burrell’s school, the University of Houston-Downtown, warns faculty members that plagiarism detectors including Turnitin “are inconsistent and can easily be misused,” but still makes them available.)
Other schools have determined that detection software is more trouble than it is worth: The University of California, Berkeley; Vanderbilt; and Georgetown have all cited reliability concerns in their decisions to disable Turnitin’s A.I.-detection feature.

“While we recognize that A.I. detection may give some instructors peace of mind, we’ve noticed that overreliance on technology can damage a student-and-instructor relationship more than it can help it,” Jenae Cohn, executive director of the Center for Teaching and Learning at U.C. Berkeley, wrote in an email.

Sydney Gill, an 18-year-old high school senior in San Francisco, said she appreciated that teachers were in an extremely difficult position when it came to navigating an academic environment jumbled by A.I. She added that she had second-guessed her writing ever since an essay she entered in a writing competition in late 2023 was wrongly marked as A.I.-generated.
That anxiety persisted as she wrote college application essays this year. “I don’t want to say it’s life-changing, but it definitely altered how I approach all of my writing in the future,” she said.
In 2023, Kathryn Mayo, a professor at Cosumnes River College in Sacramento, started using A.I.-detection tools from Copyleaks and Scribbr on essays from students in her photo-history and photo-theory classes. She was relieved, at first, to find what she hoped would be a straightforward fix in a complex and frustrating moment for teachers.
Then she ran some of her own writing through the service, and was notified that it had been partly generated using A.I. “I was so embarrassed,” she said.

She has since changed some of her assignment prompts to make them more personal, which she hopes will make them harder for students to outsource to ChatGPT. She tries to engage any student whom she seriously suspects of misusing A.I. in a gentle conversation about the writing process.
Sometimes students sheepishly admit to cheating, she said. Other times, they just drop the class.
 
From today onward, all assignments must have the word NIGGER as a header, and the word NIGGER must also be hidden at least 15 times in the presentation, as AI is not allowed to write the word NIGGER.

As an actual solution, make all the important assignments and tests into a live presentation. No printed shit, all handwritten. The point shouldn't be to forbid AI from being used but to make it so much of a hassle that studying is just easier.
 
They can't make you prove a thing and they know it. If I were a student and somebody tried to claim I used AI to write a paper or whatever, I'd demand they prove it or restore the grade, and if they got pissy about it, make it quite clear I will refer the matter to a lawyer if they try to fuck with my education, to say nothing of that kind of baseless slander. That's a pretty serious accusation in academia. One they'd best have more proof for than "their own AI program says so" before they start making it. Just the accusation alone can fuck you permanently in some fields like law or medicine.
 
I’ve taught. It makes me feel like I’m going insane second-guessing this shit. I have low standards, but I don’t like wasting my time reading stuff that a robot wrote. And sometimes it does feel like something is too academic, but then I see a telltale sign that it’s real writing, and it’s like oh, they’re not a cheater, they’re just the one in fifty that isn’t a fucking moron.

college kids hate doing homework too, like they’ve got this extreme sense of entitlement (don’t like it, take a different class, fuck off). I may just give up on it.
 
It's so funny how much snake oil and smoke and mirrors makes up AI as an industry. Even the AI detection service is just bullshitting these people. Just retards infinitely scamming each other, very Indian.

Does Hinduism contain some kind of AI-like deity or something that could have been a forewarning of all this?
 
I was told the words "key elements" in my finals essay would sound plagiarized to the district medal commission. I did cheat: a handler smuggled the essay outline to me
("Safir, the first hour's up, do you need a dictionary?" "Nah thank you." "Ok, I'll bring it.") Bullet points, under 50 words total. All the essay text was my own, written directly to the page, no drafts. They made me rewrite it and replace the offending phrase with "important elements". All the other medal contenders needed extensive rewrites so I guess the nasty woman wanted me to suffer for "equity".

Even if you've got a way to show that it was typed out by hand, there's no guarantee that you didn't just prompt an AI to write it and are just keying in whatever it spit out back to you.
1. Writing a thing usually involves a lot of editing. (I now realize I wrote school essays straight to the page in longhand with no drafts, but I've since lost the skill, and I doubt "digital natives" ever had it). It's not enough to just type up AI shit, and simulating real organic editing is hard.
2. A major advantage of AI is speeding up dumb work. Wasting 5 hours on writing an essay and wasting 5 hours on typing up an AI essay are equally terrible.

She has since changed some of her assignment prompts to make them more personal
This is a professor of photography who teaches "photo history" and "photo theory". Why does this require essays at all, not to mention "personal" essays? How do you personalize the trichrome carbro process?
 
Really don't see a good solution to this. Fact of the matter is schools have to teach writing, and to write anything of substance students will need more time than the 45 minutes they have in class, and if they have any time outside of class they can consult AI. There are ways to stop the worst offenders who just copy and paste an entire essay from one prompt and hand it in without revision, but there are much more subtle ways to use AI than that.
 
lol "fuck you, teach, it's on you to prove I did use AI."

Any failure of an instructor to do just that while leveling the accusation should be immediately met with litigation. Fuck that bullshit. I did my work. Go to hell with your cheating accusations.
 
Look at the following facts:
sophomore year of college
assignment worth 15 percent of her final grade in a required writing course.
her paper — a mock cover letter
she had drafted and revised the assignment over the course of two days
This college is giving 15 percent of a sophomore class grade for writing a cover letter. It still takes students 2 days to write it.

College is a scam and full of worthless idiots. The professor should be banned from teaching for money, the college should be shut down and turned into a high school, and the dumb bitch in this article should go work at Walmart.
 
This college is giving 15 percent of a sophomore class grade for writing a cover letter. It still takes students 2 days to write.
I actually hadn't even noticed that, that's hilarious. That also means the professor thinks he needs AI to decide if a cover letter (like one page of personal information) was actually written by a student. He better have about 500 students or that makes no sense. I can identify writing styles on this very forum and I do it for free.
 
"Please prove your innocence!" isn't how it works in actual court, lmao. I don't have to prove I cheated, YOU have to prove I cheated, and the best way to prove that, is to force people to write shit out by hand, and do oral presentations for their work. Of course schools will chafe at this, because they're lazy faggots who don't want to go back to the manual ways themselves. God forbid teachers do actual work.
 
This college is giving 15 percent of a sophomore class grade for writing a cover letter. It still takes students 2 days to write it.
This is insane. It’s the kind of exercise you should sit down and be able to knock out in 15-30 minutes, absolute tops. In front of someone, using pen and paper.
It’s not on the students to prove they didn’t cheat. It’s up to the academics to set up a system where students can produce work without accusation. If that’s in person and with pen and paper so be it, but if you accuse me of cheating, the onus is on YOU to prove it.
 