Meta has finally been held liable for harming teenagers. Now what?


Meta lost a lawsuit brought by the state of New Mexico last week, marking the first time the company has been held liable in court for endangering the safety of children. That alone was a remarkable decision, but the very next day Meta lost again, when a jury in Los Angeles found that the company knowingly designed its apps to be addictive for children and teenagers, endangering the mental health of the twenty-year-old plaintiff, known only as KGM.

These precedents open the door for a wave of lawsuits alleging that Meta deliberately hooked teenage users despite knowing its apps could have negative psychological effects on teenagers. Thousands of cases like KGM's are pending, and more than 40 state attorneys general have filed lawsuits against Meta similar to the New Mexico case.

And while social media platforms are protected by law from being held responsible for what users post on them, this time it was not the content on these platforms that was on trial. It was the design itself: features like infinite scrolling and round-the-clock notifications.

“They took the playbook that was used against the tobacco industry years ago, and instead of focusing on things like content, they focused on these addictive features, how the platform was designed. With content, as opposed to design, you have this First Amendment argument,” Fitzpatrick said. “In these two cases, at least, it turned out to be the winning argument.”

In the New Mexico case, a jury found Meta liable for violating the state’s Unfair Practices Act after a six-week trial, ordering the company to pay the maximum fine of $5,000 per violation, for a total of $375 million. The Los Angeles case, in which the jury held Meta 70% and YouTube 30% responsible for plaintiff KGM’s harm, will cost the companies a combined $6 million. (Snap and TikTok settled before trial.)

“It’s nothing to the Metas of the world,” Fitzpatrick said. “But when you take that $6 million and multiply it by all the cases against them, it becomes a big number.”

“We respectfully disagree with these rulings and will appeal,” a Meta spokesperson told TechCrunch. “Reducing something as complex as teen mental health to a single cause risks not addressing the many, broader issues teens face today, and ignores the fact that many teens rely on digital communities to find connection and belonging.”

During the trial, new internal Meta documents were revealed that show a pattern of inaction regarding the known negative impact of its platforms on minors, as well as a concerted effort to increase the amount of time teens spend on its apps, even during school hours or through “finstas,” the “fake Instagram” accounts that teens create to hide from parents or teachers.

One document reported the results of a 2019 study in which Meta conducted 24 in-person, one-on-one interviews with people whose use of the product was flagged as problematic, a designation that applied to about 12.5% of users.

“The best external research shows that Facebook has a negative impact on people’s well-being,” the report said.

Multiple documents cited statements from Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri about prioritizing teen engagement. Zuckerberg even commented that for Facebook Live to succeed with teenagers, he “guessed, we’d have to be very good at not alerting parents/teachers.”

In other documents, Meta employees talked about the company’s goals to increase retention of teenage users.

“We learned that one of the things we need to optimize is sneaking a peek at your phone in the middle of Chemistry :),” one employee wrote in an email to Meta CPO Chris Cox.

“No one thinks they want to increase the number of times they open Instagram in a day,” Meta VP of Product Max Eulenstein wrote in an internal email in January 2021. “But that’s exactly what our product teams are trying to do.”

A Meta spokesperson told TechCrunch that many of the newly released documents date back nearly a decade, and that the company is listening to parents, experts, and law enforcement about how its platforms can be improved.

“We don’t focus on time spent by teenagers today,” the spokesperson said, pointing to Instagram Teen Accounts, which were introduced in 2024 and offer built-in safety features for teen users. These protections include, for example, keeping accounts private by default and allowing only people they follow to tag or mention them in posts. Instagram also sends teens time-limit reminders telling them to leave the app after 60 minutes, a setting that users under 16 can only change with parental permission.

The revelations come as no surprise to Kelly Stonelake, a former director of product marketing at Meta who worked at the company from 2009 to 2024. (Stonelake is currently suing Meta for alleged gender-based discrimination and harassment.)

“The mountain of unsealed evidence really reflects my firsthand experience,” she told TechCrunch.

At Meta, Stonelake led teen “go-to-market” strategies for the VR social app Horizon Worlds. She says she raised concerns about the lack of effective content moderation tools in the metaverse, but her objections were not taken seriously.

The US government has taken a keen interest in children’s online safety, especially after Meta whistleblower Frances Haugen leaked damning internal documents in 2021 revealing that Meta knew Instagram was harming teenage girls.

While Congress has proposed numerous bills aimed at keeping children safe online, many of these efforts would do more to surveil adults and censor speech than to protect minors, privacy advocates say.

“There is no universe in which a censorship or ‘age verification’ law, passed under the guise of child safety, does not lead to massive online censorship of content and speech that Trump dislikes,” Evan Greer, director of Fight for the Future, said in a statement.

Stonelake once lobbied on Capitol Hill for the Kids Online Safety Act, the legislative effort with the most momentum, which has garnered support from companies like Microsoft, Snap, X, and Apple. But as the bill evolved and changed, she began to criticize it.

“I’m urging a no vote on the current version,” she said, referring to the bill’s preemption provisions, which would override state regulations on tech companies. “The final version has language that closes the courthouse doors to school districts, to bereaved families, to counties — and it’s atrocious.”

That language could have blocked, for example, the lawsuit the state of New Mexico brought against Meta.

“We need people to come to the table with solutions instead of what they’re doing now,” Stonelake said. “A real solution needs to be complex and nuanced and take into account multiple priorities.”


