Kids aren’t as sneaky as they think they are.
They do try, as Holly Distefano has seen in her middle school English language arts classes. When she poses a question to her seventh graders over her school’s learning platform and watches the live responses roll in, there are times when too many are suspiciously similar. That’s when she knows students are using an artificial intelligence tool to write an answer.
“I really think that they have become so accustomed to it, they lack confidence in their own writing,” Distefano, who teaches in Texas, says. “In addition to just so much pressure on them to be successful, to get good grades, really a lot is expected of them.”
Distefano is sympathetic — but still expects better from her students.
“I’ve shown them examples of what AI is — it’s not real,” she says. “It’s like margarine to me.”
Educators have been trying to curb the use of AI-assisted cheating since ChatGPT exploded onto the scene.
It’s a formidable challenge. For instance, there’s a corner of TikTok reserved for tech influencers who rack up thousands of views and likes teaching students how to most effectively use AI programs to generate their essays, including step-by-step instructions on bypassing AI detectors. And search interest in software that purports to “humanize” AI-generated content spiked in the fall, dipped sharply, then climbed to its peak around the end of April, according to Google Trends data.
While the overall proportion of students who say they’ve cheated hasn’t fluctuated by much in recent years, students also say generative AI is making academic dishonesty easier.
But there may be a solution on the horizon, one that will help ensure students have to put more effort into their schoolwork than entering a prompt into a large language model.
Teachers are transitioning away from question-and-answer assignments or straightforward essays — in favor of projects.
It’s not especially high-tech or even particularly ingenious. Yet proponents say it’s a strategy that pushes students to focus on problem-solving while instructing them on how to use AI ethically.
Becoming ‘AI-Proof’
During this past school year, Distefano says her students’ use of AI to cheat on their assignments has reached new heights. She’s spent more time coming up with ways to stop or slow their ability to plug questions and assignments into an AI generator, including by giving out hard copy work.
It used to be mainly a problem with take-home assignments, but Distefano has increasingly seen students use AI during class. Kids have long been adept at getting around whatever firewalls schools put on computers, and their drive to circumvent AI blockers is no different.
Between schoolwork, sports, clubs and everything else middle schoolers are juggling, Distefano can see why they’re tempted by the allure of a shortcut. But she worries about what her students are missing out on when they avoid the struggle that comes with learning to write.
“To get a student to write is challenging, but the more we do it, the better we get,” she says. “But if we’re bypassing that step, we’re never going to get that confidence. The downfall is they’re not getting that experience, not getting that feeling of, ‘This is something I did.’”
Distefano is not alone in trying to beat back the onslaught of AI cheating. Blue books, which college students use to complete exams by hand, have had a resurgence as professors try to eliminate the risk of AI intervention, reports The Wall Street Journal.
Richard Savage, the superintendent of California Online Public Schools, says AI cheating is not a major issue among his district’s students. But Savage says it’s a simple matter for teachers to identify when students do turn to AI to complete their homework. If a student does well in class but fails their thrice-yearly “diagnostic exams,” that’s a clear sign of cheating. It would also be tough for students to fake their way through live, biweekly progress meetings with their teachers, he adds.
Savage says educators in his district will spend the summer working on making their lesson plans “AI-proof.”
“AI is always changing, so we’re always going to have to modify what we do,” he says. “We’re all learning this together. The key for me is not to be AI-averse, not to think of AI as the enemy, but think of it as a tool.”
‘Trick Them Into Learning’
Doing that requires teachers to work a little differently.
Leslie Eaves, program director for project-based learning at the Southern Regional Education Board, has been devising solutions for educators like Distefano and Savage.
Eaves authored the board’s guidelines for AI use in K-12 education, released earlier this year. Rather than exile AI, the report recommends that teachers use AI to enhance classroom activities that challenge students to think more deeply and critically about the problems they’re presented with.
It also outlines what students need to become what Eaves calls “ethical and effective users” of artificial intelligence.
“The way that happens is through creating more cognitively demanding assignments, constantly thinking in our own practice, ‘In what way am I encouraging students to think?’” she says. “We do have to be more creative in our practice, to try and do some new things to incorporate more student discourse, collaborative hands-on assignments, peer review and editing, as a way to trick them into learning because they have to read someone else’s work.”
In an English class lesson on “The Odyssey,” Eaves offers as an example, students could focus on reading and discussion, use pen and paper to sketch out the plot structure, and use AI to create an outline for an essay based on their work, before moving on to peer-editing their papers.
Eaves says that the teachers she’s working with to take a project-based approach to their lesson plans aren’t panicking about AI but rather seem excited about the possibilities.
And it’s not only English teachers who are looking to shift their instruction so that AI is less a tool for cheating and more a tool that helps students solve problems. She recounts that an automotive teacher realized he had to change his teaching strategy because when his students adopted AI, they “stopped thinking.”
“So he had to reshuffle his plan so kids were re-designing an engine for use in racing, [figuring out] how to upscale an engine in a race car,” Eaves says. “AI gave you a starting point — now what can we do with it?”
When it comes to getting through to students on AI ethics, Savage says the messaging should be a combination of digital citizenship and the practical ways that using AI to cheat will stunt students’ opportunities. Students with an eye on college, for example, give up the opportunity to demonstrate their skills and hurt their competitiveness for college admissions and scholarships when they turn over their homework to AI.
Making the shift to more project-based classrooms will be a heavy lift for educators, he says, but districts will have to change, because generative AI is here to stay.
“The important thing is we don’t have the answers. I’m not going to pretend I do,” Savage says. “I know what we can do, when we can get there, and then it’ll probably change. The answer is having an open mind and being willing to think about the issue and change and adapt.”