A teacher caught students using ChatGPT on their first assignment to introduce themselves. Her post about it started a debate.


Jaures Yip
Sep 8, 2024, 10:41 AM GMT+2


Professor Megan Fritts caught several students using ChatGPT in the first week of the semester. picture alliance/Getty Images
  • A teacher's students used ChatGPT for a simple introductory assignment in an ethics and technology class.
  • Professor Megan Fritts shared her concerns on X, sparking debate on AI's role in education.
  • Educators are divided on AI's impact, with some feeling it undermines critical thinking skills.

Professor Megan Fritts' first assignment to her students was what she considered an easy A: "Briefly introduce yourself and say what you're hoping to get out of this class."
Yet many of the students enrolled in her Ethics and Technology course decided to introduce themselves with ChatGPT.
"They all owned up to it, to their credit," Fritts told Business Insider. "But it was just really surprising to me that — what was supposed to be a kind of freebie in terms of assignments — even that they felt compelled to generate with an LLM."
When Fritts, an assistant professor of philosophy at the University of Arkansas at Little Rock, took her concern to X, formerly Twitter, in a tweet that has now garnered 3.5 million views, some replies argued that students would obviously combat "busywork" assignments with similarly low-effort AI-generated answers.
Second week of the semester and I've already had students use (and own up to using) ChatGPT to write their first assignment: "briefly introduce yourself and say what you're hoping to get out of this class". They are also using it to word the *questions they ask in class*.
— Megan Fritts (@freganmitts) August 28, 2024

However, Fritts said the assignment was meant not only to help students get acquainted with the online Blackboard discussion board; she was also "genuinely curious" about the introductory question.
"A lot of students who take philosophy classes, especially if they're not majors, don't really know what philosophy is," she said. "So I like to get an idea of what their expectations are so I can know how to respond to them."
The AI-written responses, however, did not reflect what the students, as individuals, were expecting from the course. Instead, they regurgitated a description of what a technology ethics class is, which tipped Fritts off that they were generated by ChatGPT or a similar chatbot.
"When you're a professor, and you've read dozens and dozens of AI essays, you can just tell," she said.

The calculator argument — why ChatGPT is not just another problem-solving tool

While a common defense permeating Fritts' replies likened ChatGPT for writing to a calculator for math problems, she said that viewing LLMs as just another problem-solving tool is a "mistaken" comparison, especially in the context of humanities.

Calculators reduce the time needed to solve mechanical operations that students are already taught to produce a singular correct solution. But Fritts said that the aim of humanities education is not to create a product but to "shape people" by "giving them the ability to think about things that they wouldn't naturally be prompted to think about."
"The goal is to create liberated minds — liberated people — and offloading the thinking onto a machine, by definition, doesn't achieve that," she said.

Lasting impacts on students

Beyond cheating on papers, Fritts said that students have, in general, become compromised in their thinking ability — and they've noticed.
"They're like, 'When I was young, I used to love to read, and now I can't. I can't even get through the chapter of a book,'" she said. "'My attention span is so bad, and I know it's from looking at my phone, always having YouTube or TikTok on.' And they're sad about it."
Fritts said that technology addiction has affected students' general agency when interacting with information. She cited a 2015 paper by Professor Charles Harvey, chair of the Department of Philosophy and Religion at the University of Central Arkansas, which examines the effects that interactions with technology could have had on human agency and concentration.
Harvey wrote that two different eye-tracking experiments indicated that the vast majority of people skim online text quickly, "skipping down the page" rather than reading line by line. Deep reading of paper texts is being snipped into "even smaller, disconnected" thoughts.
"The new generations will not be experiencing this technology for the first time. They'll have grown up with it," Fritts said. "I think we can expect a lot of changes in the really foundational aspects of human agency, and I'm not convinced those changes are going to be good."

Teachers are getting tired

Fritts acknowledged that educators have some obligation to teach students how to use AI in a productive and edifying way. However, she said that placing the burden of fixing the cheating trend on scholars teaching AI literacy to students is "naive to the point of unbelievability."
"Let's not deceive ourselves that students are using AI because they're just so psyched about the new tech and unsure of the right way to use it in the classroom," Fritts said.
"And I'm not trying to slam them," she added. "All of us are inclined to take measures to make things easier for us."
But Fritts also feels just as "pessimistic" about the alternative solution — educators and institutions forming a "united front" in keeping AI out of the classroom.
"Which isn't going to happen because so many educators are now fueled by sentiments from university administration," Fritts said. "They're being encouraged to incorporate this into the curriculum."
At least 22 state departments of education have released official guidelines for AI use in schools, The Information recently reported. A 2024 survey by EdWeek Research Center found that 56% of over 900 educators anticipated AI use to rise. And some are excited for it.
Curby Alexander, an associate education professor at Texas Christian University, previously told BI that he uses AI to help brainstorm ideas and develop case studies "without taking up a lot of class time."
ASU's Anna Cunningham, a Dean's Fellow, and Joel Nishimura, an associate professor in the Mathematical and Natural Sciences department, wrote an op-ed encouraging instructors to have students teach ChatGPT agents programmed with misunderstandings.
"With this, we are on the cusp of being able to give all students as many opportunities as they want to learn by teaching," they wrote.
OpenAI even partnered with Arizona State University to offer students and faculty full access to ChatGPT Enterprise for tutoring, coursework, research, and more.
However, many educators remain skeptical. Some professors have even reverted to pen and paper to combat ChatGPT usage, but Fritts said many are tired of fighting the seemingly inevitable. And students are left in the middle of education and AI's love-hate relationship.
"I think it, understandably, creates a lot of confusion and makes them feel like the professors who are saying 'Absolutely not' are maybe philistines or behind the times or unnecessarily strict," Fritts said.
Fritts is not the only professor voicing concerns about AI use among students. In a Reddit thread titled "ChatGPT: It's getting worse," several users who identified as professors lamented increased AI usage in classrooms, especially in online courses. One commented, "This is one reason I'm genuinely considering leaving academia."
A professor in another post that received over 600 upvotes said that ChatGPT was "ruining" their love of teaching. "The students are no longer interpreting a text, they're just giving me this automated verbiage," they wrote. "Grading it as if they wrote it makes me feel complicit. I'm honestly despairing."

 
A couple thoughts:

Engineering Ethics is a pretty weak class. In my experience it's a 1-credit-hour, check-the-box part of a degree program. Most students who go into STEM fields don't enjoy writing at the best of times, and having a class that requires it is a real drag that teaches them very little.

Minimum word counts in assignments encourage unclear writing and bullshit fluff. AI can seem real appealing for hitting a word count on something you give no shit about. STEM students should be encouraged to write succinctly, in bulleted lists when appropriate.

Resorting to so-called AI in writing will definitely reduce critical thinking skills. Being forced to write 1000 words about some topic that you don't fucking care about actually forces you to learn a little bit about that topic.

Schools need to fail students.
 
can't blame the students on this one, i always hated those 'introduce yourself' assignments. not sure anybody likes them. if i had chatGPT when i was in school i would have used it on these assignments.
the aim of humanities education is not to create a product but to "shape people" by "giving them the ability to think about things that they wouldn't naturally be prompted to think about."
"The goal is to create liberated minds — liberated people


lmao. the aim of modern humanities isn't to liberate people, but to mold them into unthinking progressive bugmen.
 
The flip side of minimum word counts is maximum word counts. I've had papers returned where the professor made us go back and rewrite pages' worth of material to cut extraneous verbiage. Short and concise is the key to good writing, not flowery and overcomplicated prose. Get to the point, get through the point, get to the next point, in such a way that the reader can't miss it.
 
I saw this in my own online class recently. One student's self-introduction post started off: “here is the introduction lengthened to 200 characters for you!”

At least be proactive enough to cut that part off.

Idk they’re convenient for working adults such as myself and, I guess, ppl with other extenuating circumstances (?) but in general online classes are a joke that shouldn’t count as regular education.
 
>Ethics and Technology course
Every single student who wrote more than "get these bullshit creds checked off so I can focus on my weed-out programming course" was lying anyway
 
some replies argued that students would obviously combat "busywork" assignments with similarly low-effort AI-generated answers.

Based students. The article is about a college class, but the same concept applies to public school as well. There's only so many hours in a day. If the student can't do this in the hours they're assigned, then it's worthless busywork with no reward for the student.

The idea that you have to do free labor for a bullshit assignment that will go into the trash after being read for 5 mins during the prime years of your life is asinine.

Long term projects are obviously exempt
 
I dislike ChatGPT. It was sold to me as a way to get shit done quicker, yet in every writing assignment I've used it for, it just seems to regurgitate the Wikipedia page on a topic. I always have to go back and rewrite and organize everything myself, and at that point you might as well research and write the whole thing yourself.
In my opinion, students would benefit from more English/writing classes. I took two semesters, and the last one was a class called "writing for engineering." It is one of the few electives that was worth attending twice a week. I ended up learning the best way to write coherently.
If you use ChatGPT and copy-paste the answer without any embellishment, you deserve to fucking fail
 
How hard is it for these kids to write "My name is Rhyleigh and I enjoy making TikTok videos and consooming product"?

Most of the introductions are going to be boring anyway. It shouldn't take more than an hour to put together some properly formatted drivel about your boring self. Everyone's probably going to have some measure of the same.
 
There's a prevalent notion in education that everyone needs to be able to write and express themselves and it's just not true.

Some people simply don't want to; they get nothing out of it. And some people just shouldn't; we get nothing out of it.
 
I had a teacher who gave this as an optional assignment, and it had to answer 10 questions about yourself. I expected it to be worthless drivel and didn't do it. Then instead of reading them with half attention, this guy read each one, thought about them, and asked personal questions of every student who had completed it, questions that got people to understand themselves and each other better.
 
Thinking is too hard, so I asked ChatGPT about the subject.
Using ChatGPT in a classroom setting presents both opportunities and challenges from an ethical standpoint. Here are some key considerations:

Opportunities:
  1. Accessibility: ChatGPT can provide students with additional support and resources, offering explanations, answering questions, and providing feedback on assignments. This can be especially helpful for students who need extra assistance outside regular classroom hours.
  2. Personalization: It can cater to different learning styles and paces, offering tailored explanations or alternative approaches to problem-solving.
  3. Engagement: Interactive tools like ChatGPT can make learning more engaging by integrating technology into education and encouraging students to interact with digital resources.
Challenges:
  1. Dependence: There's a risk that students might become overly reliant on AI for answers and assistance, potentially hindering their critical thinking and problem-solving skills.
  2. Accuracy: While ChatGPT is designed to provide useful and accurate information, it’s not infallible. There’s a risk of disseminating incorrect or misleading information, so it’s crucial to verify facts and cross-check with reliable sources.
  3. Privacy and Data Security: Using AI tools involves handling personal data. It’s important to ensure that any data collected is managed securely and ethically, in compliance with privacy regulations.
  4. Academic Integrity: There’s a potential for misuse, such as students using ChatGPT to complete assignments or exams dishonestly. Teachers need to set clear guidelines on how to use AI tools ethically and monitor their use appropriately.
  5. Equity: Not all students might have equal access to technology, which could create disparities in learning opportunities. Ensuring that AI tools complement rather than replace traditional teaching methods can help address this issue.
In summary, ChatGPT can be a valuable educational tool when used thoughtfully and ethically. It’s essential for educators to establish clear guidelines and ensure that its use enhances rather than detracts from the learning experience.
 
Thinking is too hard, so I asked ChatGPT about the subject.
You should feed chatgpt the thread itself and ask it to write a better prompt, because whatever you asked only 1 out of 8 points is actually related to the context.

"The machine knows me better than I know myself".

That's the real problem here.

Nah, you're just a boomer who doesn't realize that they're actually right about that.
 