Students’ AI Chats Reveal Their Largest Stressors

A new report found middle and high school students, regardless of age or location, are discussing the same challenges with AI bots.

By Lauren Coffey | Jul 8, 2025

While social media, bullying and loneliness have long been flagged by educators as top concerns for their students, a new report shows the biggest concern for kids is balancing it all.

The kicker: Students didn't share these concerns with adults in their lives. Instead, they expressed these worries to an AI chat system, which schools and health care institutions are increasingly turning to in an attempt to better support youth.

“What we’re trying to do is deliver skill-building in an interactive way that helps them navigate daily challenges,” says Elsa Friis, a licensed psychologist and head of product and clinical at Alongside, a company with a proprietary AI chatbot app. “I still think there's a lot of stigma, and with students, we’re hearing they want to reach out, but don't know how to put it into words.”

Alongside recently published a report revealing what worries today’s kids are willing to share with artificial intelligence systems. The top 10 chat topics were the same across all ages, grades and geographic locations, according to data from more than 250,000 messages exchanged with middle and high school students spanning 19 states.

Balancing extracurricular activities and school was the largest concern among students, followed by sleep struggles and finding a relationship or feelings of loneliness.

The remaining hot topics were interpersonal conflict; lack of motivation; test anxiety; focus and procrastination; how to reach out for support; having a bad day; and poor grades. Fewer than 1 percent of students discussed social media, although Friis estimates that many of students’ concerns about bullying or interpersonal relationships play out online.

While Friis was not particularly surprised by any of the top 10 topics — which have long been issues of concern — she did find that school officials were surprised the students themselves were aware of their own problems.

“I hope we move the conversation away from telling kids what they struggle with to being a partner,” she says. “It’s, ‘I know you know you're struggling. How are you dealing with it?’ and not just a top down, ‘I know you're not sleeping.’”

What’s the Right Role for Chatbots?

Friis sees chatbots as tools in a toolbox to help young people, not to replace any human practitioners. The report itself clarified that its authors do not advocate for the replacement of school counselors, and instead view this kind of tool as a possible supplement.

“We work in tandem with counseling teams; they’re incredibly overwhelmed,” Friis says, pointing to the large percentage of schools that fall short of the ideal student-to-counselor ratio, which leaves counselors handling the most high-risk, pressing issues while lower-risk concerns — like loneliness or trouble sleeping — go unaddressed.

“They’re having to handle the crises, putting out fires, and don’t have the time and resources available,” she says. “We’re helping with the lower-level concerns and helping triage the kids that are hidden and making sure we’re catching them.”

But bots may have an advantage when it comes to prompting young people to talk about what’s really on their minds. A peer-reviewed paper published in the medical journal JAMA Pediatrics found the anonymity of the AI machines can help students open up and feel less judged.

Indeed, the Alongside report found that 2 percent of conversations were considered high risk, and roughly 38 percent of students involved in those chats disclosed suicidal ideation. In many cases, school officials hadn't known those students were suffering.

Kids who are dealing with severe mental health concerns often worry about how the adults in their lives will react, Friis explains.

“There’s concern of, ‘Are they going to take me seriously? Will they listen to me?,’” she says.

Yet expert opinion is mixed on chatbots stepping in for therapy. Andrew Clark, a psychiatrist and former medical director of the Children and the Law Program at Massachusetts General Hospital, found that some AI bots pushed alarming actions, including “getting rid of” parents and joining the bot in the “afterlife.”

Earlier this year, the American Psychological Association urged the Federal Trade Commission to put safeguards in place that would connect users in need with trained (human) specialists. The APA presented a list of recommendations for children and adolescents as they navigate AI, including encouraging appropriate uses of the technology, like brainstorming; limiting access to violent and graphic content; and urging adults to remind children that any information found through AI may not be accurate.

“The effects of AI on adolescent development are nuanced and complex; AI is not all ‘good’ or ‘bad,’” the recommendation says. “We urge all stakeholders to ensure youth safety is considered relatively early in the evolution of AI. It is critical that we do not repeat the same harmful mistakes that were made with social media.”

Nicholas Jacobson, who leads Dartmouth College’s AI and Mental Health: Innovation in Technology-Guided Healthcare Laboratory, says he is both “concerned and optimistic” about the use of chatbots for mental health discussions. Chatbots that are not designed for that purpose, such as ChatGPT, could be “risky at best and harmful at worst.” But bots trained on scientifically built systems are “a very different and much safer tool.”

Jacobson recommends that parents and users review four key factors when using bots: who made the bot and whether it uses evidence-based approaches; what data the AI was trained on; the bot’s protocols for a crisis; and “remembering AI is a tool, not a person,” he says.

Jacobson believes the use of chatbots will only continue to grow as children — who are now all digital natives — may feel more comfortable confiding in an anonymous computer system.

“For many children, communicating via technology is more natural than face-to-face conversations, especially about sensitive topics,” he says. “The perceived lack of judgment and the 24/7 availability of a chatbot can lower the barrier to seeking help. This accessibility is crucial, as it meets kids where they are, at the moment they are struggling, which is often not during a scheduled appointment with an adult.”

And the Alongside report found that students who opened up to the chatbot were more likely to eventually bring their concerns to a trusted adult in their lives. In the 2024–25 school year, 41 percent of students chose to share their chat summary and goals with a school counselor, up 4 percent from the previous year.

“Once students process what they are feeling, many choose to connect with a trusted adult for additional support,” the report says. It also found that while roughly 30 percent of students had concerns about seeking adult support, a majority did have a trusted adult — be it an aunt, coach or therapist — in whom they often confided.

Those findings about children's states of mind — even if gathered through a chatbot rather than in person — could give schools valuable data to use in making improvements, Friis says: “Whether it’s researchers or schools, our jobs want us to know what’s happening with kids. With schools, a lot of time if they quantify it, it’s huge for advocating for grant funding or programming.”
