Why the Social Media Addiction Case Isn’t Over Yet

By Nadia Tamez-Robledo     Apr 8, 2026

Algorithms. Beauty filters. Endless scrolling.

The “social media addiction” case against Meta and Google in a California courtroom ultimately came down to these elements, legal experts say. A jury found the companies negligent in designing apps where tweens and teens would come to spend roughly one-fifth of their day.

Joseph McNally, a former federal prosecutor and director of Emerging Torts and Litigation at McNicholas & McNicholas in California, says jurors agreed with the novel legal argument that Meta and Google were negligent in their design of Instagram and YouTube, respectively, contributing to the plaintiff’s mental health problems. The parent companies of Snapchat and TikTok settled with the plaintiffs before trial.

McNally and other experts tell EdSurge the verdict will affect thousands of similar cases and influence how tech companies roll out their features — and that the legal tussle over where liability falls when it comes to youth mental health isn’t over yet. With the social media giants vowing to appeal, the case could end up before the U.S. Supreme Court.

Email Evidence

The impact of the internal company emails presented at trial was undeniable, McNally says. Internal Meta communications showed that employees raised alarms about the potential harm a beauty filter posed to teen girls. Documents also showed the company knew that users much younger than 13 — the minimum age required for sign-up — were on its platforms, he adds.

“They looked the other way because — the plaintiffs argued — they had a long-term benefit, long-term value of hooking those users early,” McNally says. “I think that the emails painted a picture of a company whose own employees were raising concerns about features in the product, and the plaintiff effectively used those emails to show that they knew about the risk of the product.”

“Addictive” Design

If Meta and Google had settled, the court wouldn’t have had cause to grapple with the legal question of whether social media companies can be held liable for harm caused by their design. But from the defense’s perspective, tech companies had been solidly protected by Section 230 in the past, explains Princess Uchekwe, corporate attorney and founder of The Chief Counsel in New York. That’s the part of the 1996 Communications Decency Act that shields websites and online platforms from being sued over content posted by users.

Just one day before the California verdict, a New Mexico jury found Meta liable in a $375 million consumer protection lawsuit over its failure to protect children from social media harm on its platforms.

“What the lawyers for the plaintiffs were arguing is, essentially, it's not the content that we have a problem with,” Uchekwe says. “It's the fact that when people use your platform, you have implemented certain features that make it almost impossible for people to leave. You can scroll into the bottomless pit of hell on Instagram, and nothing ever tells you, ‘Maybe you should pause.’”

The Appeal of an Appeal

The $6 million in damages is a drop in the bucket for the two social media giants, but McNally says there are potential benefits to appealing the ruling anyway, given the thousands more consumer lawsuits pending against social media companies around the country, with school districts joining as plaintiffs.

One benefit is that an appellate court might find that the long-standing protections social media companies have relied on should have applied. The verdict barreled through the defenses raised by Section 230, which shields platforms from claims of harm caused by third-party content, a policy that makes a free and open internet possible.

“[Section] 230 has resulted in the dismissal of hundreds of lawsuits over the years where they would've otherwise faced hundreds of millions of dollars in liability,” McNally says. “An appeal [based on] Section 230, which is a federal statute, could make its way up to the Supreme Court, who would have the final word on the scope. [If the] court of appeals remanded it back to the trial court and said, ‘Look, Section 230 applies,’ it would essentially bar these claims [of harm caused by the design].”

Uchekwe says failure to win an appeal could be “almost devastating” for tech companies because of the sheer scale of damages they could have to pay across thousands of similar lawsuits, along with the cost of restructuring how their apps function. That could mean rethinking features like targeted algorithms, endless scrolling and notifications that draw users back into the app.

“Not only social media companies,” Uchekwe says, “all tech companies that have implemented things like that, especially if they have children as a base, are going to have to start reconsidering.”

First Amendment Question

There’s also a First Amendment case to be made, McNally adds. Some legal experts, including UC Berkeley law professor Erwin Chemerinsky, argue that the “addictive” algorithms that came under fire during the trial are protected free speech. If that argument succeeds on appeal, it could stop the legal cases arguing product liability in their tracks.

“If the Supreme Court overturned it based on Section 230 and the First Amendment, it’s unlikely there's going to be a new trial. It would likely be dismissed,” McNally says. “I won't say that with certainty, but the prospects of dismissal would be pretty good for the defendants.”

Ripple Effect

McNally says the fact that a jury found Meta and Google’s app features “unreasonably unsafe for its users” creates challenges for the companies in the many similar lawsuits they’re facing. Still, plaintiffs in those cases must prove a direct link between the social media companies and the harm they’re alleging.

“I think it's going to result in some cases probably moving closer to settlement, but in all those cases, I think that the defendants are going to be looking closely at the causation issue,” McNally says. “There's probably other cases out there where the evidence of causation is not as strong, and those cases may be harder for a plaintiff to get across the finish line.”

Uchekwe predicts that if the verdict sticks, tech companies — especially those with users who are under 18 — will be forced to retool their app features to encourage users to spend less time on their platforms. That could hurt the companies’ ad revenue and their ability to gather data on users.

“Undoing some of those things may decrease their bottom line, but I'm not sure it will do it to the extent that it's detrimental to their revenue,” Uchekwe says. “If you weigh the benefits of putting these safeguards in for children versus your revenue, I never think that your profit should come at the expense of a generation of people.”

Learn more about EdSurge operations, ethics and policies here. Learn more about EdSurge supporters here.
