
Amy Neville sobbed on a Wednesday afternoon in late March, standing outside the Los Angeles Superior Court. Not quietly. The kind of sobbing that accumulates over years of late-night ER visits, therapists' waiting rooms, and the gradual realization that your child had been harmed and no one had been held accountable. When the verdict was announced, she hugged strangers. Over the six weeks of the trial, she had met parents in the courthouse lobby who now felt like family.
A jury had just found Meta and YouTube liable for deliberately building addictive platforms that damaged the mental health of a young woman. Depending on who you asked, it was either a historic reckoning or a dangerously simplistic verdict. Most likely both.
KGM v. Meta Platforms & Google (YouTube): Key Case Information
| Case name | KGM v. Meta Platforms, Inc. & Google LLC (YouTube) |
|---|---|
| Court | Los Angeles Superior Court, California |
| Verdict date | March 25, 2026 |
| Plaintiff | Kaley (KGM), age 20 — identity protected by court |
| Defendants | Meta Platforms (Instagram, Facebook) & Google/Alphabet (YouTube) |
| Damages awarded | $6 million total — $3M compensatory + $3M punitive |
| Liability split | Meta 70% / YouTube 30% |
| Trial duration | ~6 weeks; deliberations lasted nearly 9 days |
| Verdict breakdown | 10–2 jury split in favor of plaintiff on every question |
| Concurrent verdict | Meta separately ordered to pay $375M in New Mexico (child exploitation case) |
| Broader litigation | First of 20+ bellwether trials; 1,600+ plaintiffs in consolidated California cases |
| Both companies’ stance | Disagree with verdict; both intend to appeal |
| Official reference | The Guardian — Full Verdict Report |
The plaintiff, now 20, is known in court only as Kaley; her full name is protected. She began using YouTube at age six, Instagram at nine. By the time she graduated from elementary school, she had uploaded 284 videos to the internet.
She testified that by the time she was ten, she was depressed and self-harming. When she finally saw a therapist at thirteen, she was diagnosed with social anxiety and body dysmorphic disorder. Kaley attributes both directly to her years on YouTube and Instagram. After deliberating for nearly nine days, the jury found in her favor.
The $6 million in damages was split between $3 million in compensatory damages and $3 million in punitive damages, which are awarded only when a jury finds that a defendant "acted with malice, oppression, or fraud." That is not a small legal detail. Punitive damages require a higher standard of proof: the jury found something closer to intent than mere negligence. Meta is responsible for 70% of the award; Google covers the remaining 30%.
It’s worth taking a moment to consider that. These are two of the world’s most valuable businesses. For both of them, six million dollars is essentially a rounding error. However, it was never really about the number. The point was what a twelve-person jury decided to believe following six weeks of testimony from executives, addiction specialists, whistleblowers, and a 20-year-old woman who described her own unraveling.
During closing arguments, Kaley's lead lawyer, Mark Lanier, put it plainly: "How do you make a child never put down the phone? We refer to that as 'the engineering of addiction.'" He compared the companies to Big Tobacco, which publicly denied for decades what its own internal research had already shown. The parallel isn't flawless, but it isn't wrong either.
Silicon Valley, it seems, has been making a similar wager for years, betting that the legal system would take too long to catch up. In Los Angeles, at least, that wager just lost.
In court, Kaley admitted that she sometimes went to the school restroom to check her notifications. The rush of a like pulled her out of class, a tiny dopamine flicker engineered in a Menlo Park office building. She described gradually becoming estranged from her own face as she applied Instagram filters to nearly every picture she shared. Autoplay video, infinite scroll, algorithmically timed push notifications: these are all deliberate design choices. That was the plaintiff's legal team's argument, and the jury agreed.
For its part, Meta said the decision was wrong. The company "respectfully disagrees," according to a representative, who added that teen mental health "is profoundly complex and cannot be linked to a single app." Technically true. It's also the kind of statement that sounds a little less reasonable after nine days of jury deliberation than it did in a press release. Google's YouTube took a slightly different tack, describing itself as "a responsibly built streaming platform, not a social media site." The jury, after hearing from the people who built it, evidently did not find that distinction persuasive.
The day before the Los Angeles verdict, a different jury in New Mexico found Meta liable in a child sexual exploitation case, ordering the company to pay $375 million in civil penalties. Two separate verdicts on consecutive days, both finding the same company liable for its products' effects on children. Forrester research director Mike Proulx called the moment a "breaking point." That framing is difficult to dispute.
How far any of this goes is still unknown. Kaley's case is the first of more than twenty "bellwether trials," smaller cases intended to gauge how juries will react before the larger legal waves arrive. The next one is set for July. In June, a separate federal lawsuit with hundreds of plaintiffs goes to trial in San Francisco. TikTok and Snap settled their part of Kaley's case before trial; the terms were never made public. Google and Meta say they will appeal. They probably will. This is far from over.
For months, the political climate around children and social media has been intensifying globally. Australia has banned minors from social media outright. A pilot program in the UK is exploring what an actual under-16 ban might look like. Responding to the Los Angeles verdict, Britain's prime minister said the current situation was "not good enough." Even the Duke and Duchess of Sussex, who have long campaigned against the harms of social media, called the decision a "reckoning." Whether any of that momentum produces real regulation anywhere remains to be seen.
The mismatch between the scale of the alleged harm and the scale of the remedy is unsettling. Design decisions made by engineers optimizing for time-on-app may have affected hundreds of thousands of young people; one woman received six million dollars. While the courts sort out what comes next, the companies will likely keep running their platforms essentially as before, file their appeals, and spend more on legal fees than on the judgment itself.
But verdicts matter beyond their dollar amounts. In that Los Angeles courtroom, a jury of ordinary people was asked a straightforward question: did these companies know their products were harming children, and did they keep going anyway? By a vote of 10–2, they said yes. Whatever happens on appeal, that answer won't go away.
