
More Schools Are Considering Education-Focused AI Tools. What’s the Best Way to Use Them?

A new report shows AI teaching assistants could be beneficial in the classroom, but may also carry invisible bias.

By Lauren Coffey     Aug 22, 2025

Discussion about the use of AI in the classroom has become as commonplace as pencils or notebooks, but many schools have struggled to implement and deploy the ubiquitous technology. A new report looks at how, and whether, AI tools built specifically for the education sector can ultimately help educators.

Common Sense Media, a nonprofit that helps parents navigate technology and media, released its risk assessment of “AI Teacher Assistants” earlier this month. Unlike general-purpose chatbots such as ChatGPT, AI teacher assistants are built specifically for classroom use. These tools, which include offerings from Google and MagicSchool, aim to save teachers time while improving student outcomes.

“As we see adoption of these tools continue to skyrocket, districts are really asking questions,” says Robbie Torney, senior director of AI programs at Common Sense Media. “It’s looking at, ‘Are they safe? Are they trustworthy? Do they use data responsibly?’ We’re trying to be comprehensive into how they fit into school as a whole.”

The report focused less on the use of the tools for administrative tasks, such as syllabus building, and more on pedagogical work, like creating discussion questions based on an AP U.S. History reading.

Torney recommends that institutions set guardrails for these tools early, based on the goals they hope to achieve.

“My main takeaway is that this is not a go-it-alone technology,” he says. “If you're a school leader and you as a staff haven't had a conversation on how to use these things and what they’re good at and not good at, that’s where you get into these potential dangers.”

Paul Shovlin, an AI faculty fellow at the Center for Teaching and Learning at Ohio University, says the K-12 sector seems to have adopted the new tools at a quicker pace than its higher education counterparts.

“I think they are becoming more prevalent,” he says. “This is just a feeling, but I feel K-12 has picked up on platforms sooner than higher ed; and there are some concerns related to them.”

A frequently cited danger is the inherent bias that technology brings. The Common Sense Media report dubbed this “invisible influence”: researchers fed the teaching assistants prompts using “white-coded” and “Black-coded” student names. While each individual response about the hypothetical students appeared innocuous, Torney says that when a large number of chats were compared, researchers found that white-coded female names drew more “supportive” responses, while Black-coded names received shorter, less helpful answers.

“I’m always surprised how difficult it is to see bias; sometimes it’s obvious, sometimes it's invisible and hard to detect,” Torney says. “If you are just generating outputs on a one-off basis, you may not be able to see the differences in outputs based on one student versus another. It could be truly invisible and you may only see them at the aggregate level.”
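To make the aggregate-level point concrete, here is a minimal, hypothetical sketch (not taken from the report) of how such an audit might look. It assumes responses have already been collected from prompts that were identical except for the student name, and it uses response length as a crude stand-in for how detailed or supportive an answer is.

```python
from statistics import mean

# Hypothetical data: assistant responses collected from many prompts that were
# identical except for the student name used ("white-coded" vs. "Black-coded").
responses_by_group = {
    "white_coded": [
        "Great question! Let's walk through the reading together step by step...",
        "You're on the right track. Here are three discussion prompts you could try...",
    ],
    "black_coded": [
        "See chapter 3.",
        "Review the assigned pages.",
    ],
}

def word_count(text: str) -> int:
    """Crude proxy for how detailed a response is."""
    return len(text.split())

# Any single response may look innocuous on its own; a disparity only becomes
# visible when many responses are averaged per group.
for group, responses in responses_by_group.items():
    avg_len = mean(word_count(r) for r in responses)
    print(f"{group}: {len(responses)} responses, avg length {avg_len:.1f} words")
```

A real audit would use far more prompts and richer measures than word count, but the structure is the same: compare outputs across name groups in aggregate rather than one chat at a time.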

Shovlin notes that the companies themselves can have biases that show up in their products.

“There are affordances and limitations with any technology and I don’t want to completely discount these platforms, but I’m highly skeptical because they are commercial products and there is that imperative built into how they create these things and market them,” he says. “This industry that has created these tools also has embedded bias as a result of who is doing the coding originally. If it’s dominated by one identity, it will be baked into the algorithms.”

Emma Braaten, director of digital learning at the Friday Institute for Educational Innovation at North Carolina State University, also advises checking a company’s terms and conditions to ensure data privacy, and warns against fully trusting specific companies or products just because they have been trustworthy in the past.

“There are educators who trust this program or platform because we've used it before,” Braaten says, urging educators to think more deeply. “How do we review and revisit that [tool] as they incorporate AI? Do we give a blanket of trust or start to review and think critically about those?”

Braaten also stresses the importance of keeping a “human in the loop,” ensuring that both students and teachers stay at the forefront when employing AI.

“That piece both for students and educators is a huge focus to think about; making sure all these groups stay in the loop and not just give it all away to the tool,” she says. “When we have a teaching assistant in the classroom space, it’s looking at … do we have guidance to make lessons to include both technology and the human connection in that space?”

Each of the experts interviewed by EdSurge acknowledges that the tools, when used correctly, offer benefits for teachers that outweigh their potential pitfalls. The report pushed for educators to ground the tools in their own lesson plans, rather than having the tools generate lessons on their own.

“The [AI] model is not as good as the curriculum you're teaching from,” Torney says. “If you're teaching from an adopted curriculum, the output will be so much better than getting a random generated lesson about fractions.”

And as adoption continues, experts stress the importance of finding the right way to adapt to the technology.

“You can't just block AI with one sweeping wave of your hand; at this point it's embedded into so many things,” Braaten says. “There’s looking at that integration into the products themselves, but also how you're part of that system and how you incorporate it into your application [are what] we have to be critical thinkers about.”
