On the afternoon of March 25, 2026, outside Los Angeles Superior Court, something happened that Silicon Valley had spent years and considerable legal resources trying to prevent. A group of parents sobbed as they hugged each other in the sunlight. Some had lost children to suicide; others were watching their children battle anxiety, depression, and body image issues they directly linked to social media. The jury had returned. YouTube and Meta were found liable. The damages came to six million dollars. And for the first time in the history of the internet, a jury had decided that a social media platform should be treated like a defective product.
The case centered on a 20-year-old woman from Chico, California, identified only as Kaley. She began using YouTube at six. Instagram at nine. No age verification, parental gate, or algorithmic pause stopped her. By the time she was ten, she told the court, she was already struggling with anxiety and depression. She became fixated on her appearance, applying filters to her selfies to enlarge her eyes and shrink her nose, chasing a version of herself that existed only inside the app. She was eventually diagnosed with body dysmorphia. Her testimony that she would sneak out of class to the restroom to check how many likes her posts had received is hard to hear and even harder to forget.
What really set this trial apart from the dozens that preceded it was the strategy her attorneys used. For thirty years, tech companies have been shielded by Section 230 of the Communications Decency Act, a 1996 law that protects online platforms from liability for user-posted content. Cases built around what people saw on social media routinely stalled. Lead trial attorney Mark Lanier, a Texas litigator with a preacher’s sense of human connection, went in a completely different direction. His argument focused on architecture rather than content. The endless scroll that never lets you reach a stopping point. The notification pings timed to pull you back in. The beauty filters woven so deeply into Instagram’s interface that a nine-year-old could reach them without any friction whatsoever. These features, Lanier argued, were not accidental byproducts of the design. They were the design. “How do you make a child never put down the phone?” he told the jury. “That’s called the engineering of addiction.”
| Category | Details |
|---|---|
| Case Name | K.G.M. v. Meta Platforms, Inc., et al. |
| Court | Los Angeles Superior Court, California |
| Verdict Date | March 25, 2026 |
| Plaintiff | “Kaley” / KGM (identity protected, age 20) |
| Defendants | Meta Platforms, Inc. (Instagram/Facebook) & Google LLC (YouTube) |
| Verdict | Liable on all counts — negligent design, failure to warn |
| Total Damages Awarded | $6 million ($3M compensatory + $3M punitive) |
| Meta’s Share | 70% (~$4.2 million) |
| Google/YouTube’s Share | 30% (~$1.8 million) |
| Settled Before Trial | Snap (Snapchat) and TikTok reached undisclosed settlements |
| Lead Trial Lawyer (Plaintiff) | Mark Lanier |
| Key Legal Strategy | Product defect / design liability (bypassing Section 230) |
| Related Cases | ~2,000 pending lawsuits; New Mexico Meta verdict ($375M, day prior) |
| Meta CEO Testimony | Mark Zuckerberg testified February 18, 2026 |
| Instagram Head Testimony | Adam Mosseri testified during trial |
| Key Comparison | 1990s Big Tobacco litigation |

Comparing the platforms to “digital casinos,” Lanier’s team built its case around internal Meta documents that were, by any measure, striking. One memo described the company’s goal of bringing users in “as tweens” if Meta wanted to “win big with teens.” Another internal analysis found that eleven-year-olds were four times more likely to keep returning to Instagram than to competing apps, despite the platform’s stated minimum age of thirteen. Testifying in February, Zuckerberg maintained that protecting young users has always been a top priority. Instagram head Adam Mosseri described a teen’s sixteen-hour Instagram session as “problematic” rather than proof of addiction. The jury, apparently, did not find that distinction persuasive.
One could be disappointed by the $6 million verdict. Meta’s market value is in the trillions; so is Google’s. To businesses that size, $6 million is barely a rounding error on a quarterly earnings call. Lanier told reporters he had expected a higher number. During the punitive damages phase, he had stood before the jury with a jar of M&Ms, each candy representing a billion dollars of the companies’ combined value, and urged jurors to speak to Meta in “Meta money.” The jury foreman, who gave only his first name, said jurors stuck to the written law. Another juror, Victoria, put it more bluntly: “We wanted them to feel it. We wanted them to understand how unacceptable this was.”
In reality, the verdict may accomplish something different and possibly more significant than any dollar amount. The decision is what attorneys call a bellwether, a test case meant to forecast the outcome of roughly 2,000 similar pending lawsuits. It came just one day after a jury in a separate trial in New Mexico ordered Meta to pay $375 million for failing to keep children safe on its platforms. Two states, two juries, two days in a row. The analogies to the Big Tobacco litigation of the 1990s are not exaggerated. That legal campaign took years to gain traction, but once it did, it never stopped. After decades of deliberate harm, it forced an entire industry to change its methods, its target audience, and its obligations to the public.
Both companies plan to appeal. Meta argued that teen mental health is “profoundly complex” and cannot be reduced to a single app. Google maintained that YouTube is a “responsibly built streaming platform,” not social media, a claim the jury, after sitting through five weeks of evidence, did not seem to find convincing. But an appeal alone will not end the fight. The cases keep coming, the juries keep listening, and the internal documents keep surfacing. Watching it all unfold, one gets the sense that the industry’s long era of consequence-free growth is quietly coming to an end, not with a single dramatic moment but with a steady accumulation of verdicts, parents on the courthouse steps, and jurors who trusted their own judgment about what an addicted child looks like.
