How Analyzing Data Can Prevent Cyberbullying and Suicide

By Charley Locke     Oct 29, 2014

Sticks and stones aside, words can hurt, whether said out loud or on a screen. And for teenagers today, four in ten of whom report experiencing cyberbullying in the past year, the free speech of the Internet has enabled new online forms of harassment--and easy access to information about self-harm.

But the Internet also offers new tools for preventing cyberbullying and suicide. By using data from student searches and posts on social media, analytics tools like Securly and Mevoked offer educators, parents and students a way to recognize harmful behavior and intervene.

“We use Securly as the filter for our Chromebooks,” says Lisa DeLapo, director of instructional innovation at Holy Names High School in Oakland, California. “From the Securly dashboard, the administrators can see what students have and haven’t been able to access,” she explains. “If I want to see what kids are posting on Twitter or Facebook, I can--everything on our Chromebooks gets logged by Securly.”

The cloud-based filter system works by flagging suspicious combinations of keywords in student posts and searches and passing along concerning actions to administrators. “We don’t just flag on keywords, which can be problematic and alert you to a lot of false positives,” explains Awais Ahsan, director of demand generation at Securly. “We more so flag on sentiment, and we’re trying to train our computer algorithm to be as complex as possible.”
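The contrast Ahsan draws, between flagging on raw keywords and flagging on sentiment, can be illustrated with a toy sketch. This is not Securly's actual algorithm; the word lists and threshold below are hypothetical, and the point is only that requiring a negative sentiment signal alongside a keyword match cuts down on false positives:

```python
# Illustrative sketch only (not Securly's real algorithm): flag a post
# when it contains a watch word AND scores as negative overall, rather
# than on keywords alone.

# Hypothetical watch-list and negative-sentiment word list.
WATCH_WORDS = {"hurt", "kill", "worthless", "hate"}
NEGATIVE_WORDS = {"never", "alone", "hopeless", "can't", "hate", "worthless"}

def sentiment_score(words):
    """Crude negativity score: fraction of words that are negative."""
    if not words:
        return 0.0
    return sum(1 for w in words if w in NEGATIVE_WORDS) / len(words)

def should_flag(post, threshold=0.2):
    """Flag only when a watch word appears AND overall sentiment is
    negative, reducing keyword-only false positives."""
    words = [w.strip(".,!?") for w in post.lower().split()]
    has_watch_word = any(w in WATCH_WORDS for w in words)
    return has_watch_word and sentiment_score(words) >= threshold

# A benign post containing a keyword is not flagged; a negative one is.
print(should_flag("I could just kill for some pizza right now"))   # False
print(should_flag("I feel worthless and alone, nobody would care"))  # True
```

A keyword-only filter would flag both posts above; gating on the sentiment score lets the benign one through, which is the kind of false positive Ahsan describes.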

At Holy Names, Securly has enabled educators to step in and make a difference with troubled students. “For a student who was dealing with emotional issues and talking about very adult behavior, we were able to intercept and talk to the student, and get that student some help,” explains DeLapo.

The social media filtering makes an especially significant difference at schools like Holy Names, a Catholic all-girls high school where most students bring their school-supplied Chromebooks home. “Most of our students are economically disadvantaged, and use our device as their only device,” DeLapo explains. “Students take Chromebooks home, and the Securly filters continue there.”

Securly recognizes the power of expanding its filters beyond classroom walls. On September 29, the company launched a Kickstarter campaign to develop Securly for Parents, which will offer the data analytics tools to families. “Schools should certainly teach digital citizenship and how to stay safe online, but that’s all for naught if it’s not being reinforced at home,” Ahsan explains.

DeLapo seconds the importance of extending Securly’s tools to families. “Right now, we’re the only policemen of what students are seeing online,” she says. “If parents also have the insights, that can help students use social media appropriately.”

Ahsan envisions Securly as eventually connecting educators and parents in monitoring all social use of technology by students. “A tool for parents to log in and see a view for their particular child, and have alerts through SMS for these activities, would complete the picture in our eyes,” he says.

Complete access to browser history certainly could lead to concerns about student privacy, as when the ACLU accused a Tennessee district of violating constitutional rights by searching through students' social media posts. As one student at Holy Names said, "I get it when they monitor us during class, but outside of that time, I don't know that they should have that right."

In response, Ahsan points out that so far, Securly only functions on school-issued devices, which students ostensibly should not be using for personal social media in the first place. As for extending Securly to personal devices, he sees it as part of an individual family conversation about trust and responsibility. “Parents need to trust their kids and not monitor everything they do, but also be able to intervene,” he explains. And if flagging suspicious posts can prevent harm to self or others, it may be worth the intrusion.

Arun Ravi, founder of Mevoked, agrees that data about individual technology usage can be pivotal in identifying and addressing concerns. “One of the biggest issues in mental health is to [provide resources and support] while someone is going through an issue, rather than five or six months later,” Ravi says. “Suicide is never a data point, it’s patterns and ideation.”

Like Securly, Mevoked analyzes social, mobile and online data, but it focuses on mental health and on connecting individuals with online and in-person resources. “We want to fill in the gap of identifying negative behavior and be the conduit to managing your condition,” Ravi says.

Ravi explains that Mevoked accumulates data about how individuals use technology, data that is already largely accessible. “There’s no barrier in collecting this data,” he says. “We’re doing exactly what Google does when they advertise to you, using the same algorithms to assess mental health.”

Ravi, who first trialed Mevoked with parents to use with their children, encountered resistance not in the collection of private data about student usage--both Ravi and Ahsan stress that student data is deleted when a school or individual no longer uses the application--but in the societal stigma against talking about mental health. “Parents don’t tell other parents they’re using [Mevoked],” explains Ravi. “Everyone talks about mental health as an issue, but rarely does anyone bring it up, especially at the school level.”

By offering Mevoked to schools, Ravi hopes “to put the onus on educators to take a more active interest” in student mental health. As he explains it, Mevoked “can tell schools this is a red flag, go talk to this student, without [sharing the specific post of a student and] identifying the problem.”

Beyond educators and parents, Mevoked has begun to offer data analysis to students themselves. “We give [the Mevoked app] directly to teenagers to keep track of their own mental health,” says Ravi. This fall, Mevoked is conducting a six-week pilot of app use with thirty students at Lewis & Clark College in Portland, Oregon.

“We want to see whether we can use Mevoked as a link to the campus counseling center, to bridge the gap between the user and help,” explains Mingming Caressi, a senior psychology major at Lewis & Clark who is conducting the study with Mevoked. Caressi sees Mevoked’s tools as helpful for all individuals, not only those diagnosed with depression. She hopes Mevoked will offer “a resource for temporary emotional distress as well, which everyone experiences, especially in college.” Ideally, used for prevention rather than treatment, the data analysis can eliminate the need for intervention altogether.

As Ahsan and Ravi both see it, the data analysis made available through Securly and Mevoked ideally can make students more aware of how they use technology, and how it affects their moods and mental health. As Ahsan affirms, “Even if we’re able to intervene and make a difference for one student in one district, that can shape the conversation with all students about how to use technology in a safe way.”

Editor's Note: Mevoked is now focused on clinical applications rather than an application for monitoring children, including a partnership with Mayo Clinic.
