Meta and YouTube Found Liable for Addicting Children in Social Media Trial

Dan Saltman

Categories: Facebook, Instagram, Law, Meta, Social Media, YouTube

Quick Story Summary
  • A Los Angeles jury found Meta and YouTube liable for negligently designing addictive platforms that caused harm to a minor, marking a first in U.S. legal history.
  • The March 25, 2026 verdict awarded $6 million in damages, assigning 70% responsibility to Meta and 30% to YouTube, including punitive damages tied to findings of misconduct.
  • Plaintiffs successfully argued that platform features like infinite scroll, autoplay, and algorithmic recommendations constitute defective product design, bypassing Section 230 protections.
  • Internal company documents presented at trial suggested awareness of youth engagement risks, including strategies to attract younger users despite age restrictions.
  • This case is a bellwether among 1,500+ similar lawsuits and could shape future litigation, with additional major trials and state-level actions already underway.
  • The ruling reinforces concerns about long-term digital footprints and addictive design, highlighting the difficulty users face when trying to disengage or delete accumulated content.

Billions of people use social media every day. For most, it is a routine part of life. But for a growing number of users, particularly children and teenagers, that routine has tipped into something more serious. Studies have consistently linked excessive social media use to anxiety, depression, and declining mental health in young people, and pressure on the companies responsible has been building for years.

What has been missing, until now, is accountability. Lawmakers have held hearings. Researchers have published warnings. Parents have filed complaints. But the tech industry has largely avoided legal consequences, shielded by decades-old regulation written before platforms like Instagram and YouTube even existed.

That changed in March 2026. A Los Angeles jury became the first in American history to find social media companies liable for addicting a child, ruling that Meta and YouTube had negligently designed their platforms in ways that caused real and documented harm. The verdict sent a signal to an entire industry: the era of impunity may be coming to an end.

What the Verdict Actually Says

On March 25, 2026, after more than 44 hours of deliberation across nine days, a Los Angeles County Superior Court jury found both Meta and YouTube negligent in the design and operation of their platforms. The jury concluded that those design choices were a substantial factor in causing documented harm to the plaintiff.

The jury awarded $3 million in compensatory damages and recommended a further $3 million in punitive damages, bringing the total to $6 million. Meta was found 70% responsible, with YouTube bearing the remaining 30%. The punitive award reflected the jury’s finding of malice, oppression, or fraud in how the companies had operated.

This was the first social media addiction lawsuit in American history to reach a jury, and it came after a seven-week trial that put some of the most powerful executives in the world on the stand.

Who Is the Plaintiff?

The plaintiff is a 20-year-old woman identified in court documents only by her initials, K.G.M. Her legal team referred to her as Kaley throughout the trial. She began using YouTube at age 6 and Instagram at age 9. By the time she finished elementary school, she had already posted 284 videos to YouTube. She told the court she was on social media “all day long” as a child, and that she eventually withdrew from her family because of how much time she was spending on these platforms.

Kaley testified that her early use of social media triggered addiction-like dependency and worsened depression and suicidal thoughts. She said she began suffering anxiety and depression at the age of 10 and was later formally diagnosed with both. A former therapist who treated her, Victoria Burke, testified that social media and Kaley’s sense of self were closely linked, and that activity on those platforms could “make or break her mood” on any given day.

Meta and YouTube argued in their defence that Kaley’s mental health struggles predated her social media use, and that her family history, learning difficulties, and home environment were more significant contributing factors. Meta’s attorney told the court that not one of her therapists had identified social media as a cause. The jury disagreed.

Where and When It Happened

The trial was held at Los Angeles County Superior Court and lasted seven weeks. Jury deliberations ran over nine days before a verdict was reached on March 25, 2026. Judge Carolyn B. Kuhl presided. The case was selected as a bellwether trial under California’s Judicial Council Coordination Proceedings, meaning it was chosen to help guide the outcome of hundreds of related cases filed across the state.

How the Case Was Built

The legal strategy in this case was deliberate and significant. Rather than challenging specific content posted on these platforms, an approach that would have run into the strong legal protections offered by Section 230 of the Communications Decency Act, lawyers targeted the design of the platforms themselves.

The argument was that features like infinite scroll, autoplay video, push notifications, like counts, and algorithmic recommendation systems were engineered specifically to maximise engagement rather than to serve users, and that these choices constituted defective product design. Plaintiffs’ attorneys described these features as the digital equivalent of a casino floor: deliberately structured to prevent users from stopping.

Internal documents from Meta shown to jurors revealed the extent of the company’s awareness. One document read: “If we wanna win big with teens, we must bring them in as tweens.” Another internal memo showed that 11-year-olds were four times as likely to return to Instagram compared to competing apps, even though the platform officially requires users to be at least 13 years old.

This design-focused approach matters legally because Section 230 protects platforms from liability for third-party content, but legal experts have argued it does not necessarily shield companies from product liability claims based on their own engineering decisions. Courts are increasingly making this distinction.

Who Took the Stand

The trial was notable for the seniority of those called to testify. Meta CEO Mark Zuckerberg appeared in court, making it the first time he had faced a jury alongside families who said his company’s products had harmed their children. Instagram head Adam Mosseri also testified, pushing back on the concept of social media addiction and characterising heavy platform use instead as “problematic usage.” YouTube’s VP of Engineering, Cristos Goodrow, told the court that YouTube was “not designed to maximize time.”

The plaintiff’s lead attorney, Mark Lanier, said he hoped the proceedings would produce transparency and accountability so the public could see that these companies had been “orchestrating an addiction crisis in our country and, actually, the world.”

How Meta and YouTube Responded

Both companies rejected the jury’s findings and announced their intention to appeal.

A Meta spokesperson told reporters: “We respectfully disagree with the verdict and are evaluating our legal options. Teen mental health is profoundly complex and cannot be linked to a single app.”

Google’s response was similar. A spokesperson said: “We disagree with the verdict and plan to appeal. This case misunderstands YouTube, which is a responsibly built streaming platform, not a social media site.”

Meta President Dina Powell McCormick, speaking at a summit the same day the verdict was announced, said the company would keep working to protect young people on its platform.

Why This Verdict Matters

This was not an isolated lawsuit. It is the first of more than 1,500 similar cases against social media companies to go to trial, and the outcome is expected to shape how those cases proceed. The Social Media Victims Law Center, which represents many of the plaintiffs, said: “For years, social media companies have profited from targeting children while concealing their addictive and dangerous design features. Today’s verdict is a referendum from a jury to an entire industry that accountability has arrived.”

James Steyer, founder of online safety organisation Common Sense Media, said: “Social media giants would never have faced trial if they had prioritised kids’ safety over engagement. Instead, they buried their own research showing children were being harmed, and used kids and society as guinea pigs in massive, uncontrolled, and wildly profitable experiments.”

The verdict came just one day after a separate jury in New Mexico ordered Meta to pay $375 million for failing to protect young users on Instagram and Facebook. That case will move into a second phase in May, where a judge will decide whether Meta created a public nuisance and whether additional penalties are required.

What Happens Next

This verdict is one data point in a much larger legal story. A federal bellwether trial involving school districts and parents nationwide is scheduled for summer 2026 in the Northern District of California. That case consolidates claims against Meta, YouTube, TikTok, and Snap, alleging that their platforms caused significant mental health harm to young users.

More than 40 state attorneys general are pursuing separate litigation against Meta. TikTok and Snap both settled with the plaintiff in the Los Angeles case before it reached trial, though they remain involved in other legal proceedings.

Peter Ormerod, an associate professor of law at Villanova University, described the verdict as “a momentous development” but noted it is “one step in a much longer saga.” He said he does not expect immediate large-scale changes to the platforms, and that reaching something comparable to the tobacco industry’s master settlement would require further losses for the companies on appeal and in additional bellwether trials.

That comparison to the tobacco litigation of the 1990s is one that legal experts, advocates, and the plaintiffs’ attorneys themselves have all drawn. In that era, internal industry documents revealed that tobacco companies had known for decades about the addictive and lethal nature of their products. The documents presented in the Los Angeles trial suggest a parallel pattern of internal awareness and external denial.

What This Means for You

The platforms at the centre of this case are used by hundreds of millions of people worldwide, most of whom are adults. The design features that plaintiffs argue harmed Kaley, including infinite scroll, algorithmic recommendations, and notification systems, are features that every user encounters, not just young people.

Years of engagement with these platforms leave behind a substantial digital footprint. Posts, comments, liked content, followed accounts, and interaction history build up into a record that platforms retain and use. As we covered in our analysis of the TikTok settlement, this footprint is not easy to remove. Platforms make bulk deletion deliberately difficult, requiring users to delete content one item at a time, a process that can take hours or days for anyone with years of activity.

This is not incidental. It serves the same purpose as addictive design: making it harder to disengage.

A Turning Point for Big Tech

The jury’s decision on March 25, 2026, does not end the legal battle against Meta and YouTube. Both companies have signalled they will appeal, and the road to any industry-wide settlement comparable to what followed the tobacco litigation is long. But this verdict does something that no regulatory announcement or legislative proposal has yet achieved: it places corporate responsibility for addictive design in front of a jury of ordinary people, and those people found the platforms liable.

For parents, for users, and for anyone who has spent significant time on these platforms, the verdict is a validation of something that many people have felt for years: that the design of these products was not neutral, and that the consequences of that design were not accidental. If you want to understand more about how these cases have unfolded, our full breakdown of the TikTok settlement covers the wider litigation landscape in detail.

Take Back Control of Your Digital Footprint

The platforms at the centre of this trial were designed to keep you engaged and to make it as difficult as possible to leave. That applies not just to the addictive features the jury ruled on, but to the content you have accumulated over years of use. Posts, comments, likes, and tagged photos build up into a permanent record that platforms retain and use, and most offer no way to remove it in bulk.

Redact was built to change that. It allows you to bulk-delete posts, comments, likes, and other content across more than 25 platforms, including Instagram, Facebook, YouTube, Twitter, Reddit, and TikTok.

Rather than spending hours removing content one item at a time, Redact automates the process while keeping you in control of what gets removed and what stays. Your data is processed locally and is never stored or uploaded to third-party servers.

The platforms made it hard to leave a lighter footprint. Redact makes it manageable.