Data Privacy

Google Gets $170M Fine and Pledges to Protect Children on YouTube. Will It Matter?

By Tony Wan     Sep 4, 2019

How much is $170 million? At first glance, it seems like a hefty fine—and it’s what the Federal Trade Commission ordered Google to pay as a settlement over complaints that YouTube had violated the Children’s Online Privacy Protection Rule, or COPPA. But is it really all that much?

In April 2018, privacy advocacy groups filed a complaint alleging that YouTube had been illegally collecting personal data about minors who use its service without their parents’ consent, and using that information for advertising purposes. The complaint charges that Google skirted COPPA compliance by claiming that YouTube does not have viewers under the age of 13 (even though it doesn’t take much effort to find content that is clearly directed at young children).

According to a statement from FTC chairman Joseph Simons and commissioner Christine Wilson, the $170 million fine is “almost 30 times higher than the largest civil penalty previously imposed under COPPA.” That was a $5.7 million fine levied against TikTok, a social video app, in February 2019.

But for a company that reported generating $39 billion in revenue in the second quarter of 2019, the fine amounts to barely a slap on the wrist, wrote Rohit Chopra, an FTC commissioner, in his dissenting statement on the settlement: “The terms of the settlement were not even significant enough to make Google issue a warning to its investors.”

He later added: “In my view the Commission often makes a low opening bid for monetary relief … Financial penalties need to be meaningful or they will not deter misconduct.” (Recode’s Peter Kafka called the fine “a rounding error.”)

Perhaps more materially meaningful are the actions that YouTube has pledged to take. In a blog post, YouTube CEO Susan Wojcicki said the company will “treat data from anyone watching children’s content on YouTube as coming from a child, regardless of the age of the user.” The company will also require video uploaders to self-report whether their material is targeted at kids, and pledges to stop serving personalized ads and to disable comments and notifications on children’s content. It will also establish a $100 million fund to support the creation of “thoughtful, original children’s content on YouTube and YouTube Kids globally.”

In addition to requiring users to flag their own content, Wojcicki noted that her team will also “use machine learning to find videos that clearly target young audiences.” That comment may well be intended as an additional safeguard, one that Rebecca Kelly Slaughter, another dissenting FTC commissioner, is looking for.

But privacy advocates are not sure whether such technology is accurate or reliable. “There are a lot of questions about whether machine learning today is sophisticated enough to identify children’s content,” says Amelia Vance, director of education policy at the Future of Privacy Forum. “It may be easy when you’re talking about nursery rhymes. But what about a video that, say, explores ‘Beauty and the Beast’ in the context of Stockholm Syndrome?” she asks.

While the fine may not amount to much, the ruling could portend important updates to federal laws that govern how internet companies are supposed to safeguard children’s privacy, Vance adds. No longer can content creators and media platforms feign ignorance about whether children access their materials, or how they may be tracked by third party advertising tools.

Vance says that the timing of the fine is also noteworthy in light of the FTC’s recent call for public comments for updates to COPPA. The agency is hosting a public workshop in about a month, on Oct. 7, to consider changes to the law, which was last amended in 2013. Much has been discovered about data collection practices since then, thanks to revelations involving other large technology companies like Facebook.

“I don’t think the timing [of this fine] is a coincidence with the call that the FTC put out for comments about COPPA,” she says. “I think you will see potential significant changes from the workshop that will change how companies will interact with kids and parents.” According to Vance, among the proposals under consideration is raising the age for kids that are protected by COPPA. The rule currently only covers children up to age 13.

