A California jury found Meta Platforms and Google‑owned YouTube liable in a first‑of‑its‑kind lawsuit that sought to hold social media companies responsible for harm to children using their platforms, awarding the plaintiff $3 million in damages.
After more than 40 hours of deliberations across nine days, jurors concluded that both companies were negligent in the design or operation of their platforms and that their conduct was a substantial factor in causing harm to the plaintiff, a 20‑year‑old woman who said her heavy use of social media as a child led to addiction and worsened her mental health.
The verdict’s financial penalty could grow significantly. Jurors also found that Meta and YouTube acted with malice, a finding that under California law denotes highly egregious conduct and opens the door to punitive damages. Jurors will hear additional evidence in a second phase of the trial before deciding whether to impose them.
The decision comes one day after a New Mexico jury found that Meta violated the state’s consumer protection law by failing to disclose risks its platforms pose to children and by misleading the public about safety on services that include Facebook, Instagram and WhatsApp. The verdict capped a nearly seven-week trial in Santa Fe.
What Was the Social Media Trial About?
The plaintiff, identified in court documents as KGM and referred to as Kaley during the trial, testified that she began using YouTube at age 6 and Instagram at age 9 and spent much of her childhood on social media. She told jurors she was online “all day long” as a child.
Kaley said her early and sustained use of the platforms led to compulsive behavior and exacerbated her mental health struggles as she grew older. Her lawyers argued that the companies failed to adequately protect young users or warn families about potential risks tied to prolonged use.
Meta and YouTube were the only defendants to go to trial after TikTok and Snap, the parent company of Snapchat, reached settlements before proceedings began. The terms of those settlements were not disclosed.
Arguments Over Platform Design and Addiction
Lawyers representing Kaley, led by trial attorney Mark Lanier, focused on platform design rather than user‑generated content. They argued that features like infinite scrolling feeds, autoplay functions and frequent notifications were intentionally designed to keep users engaged, particularly children.
The plaintiffs were required to show that the companies’ negligence was a substantial factor in causing harm, not the sole cause. Lanier emphasized that even a small contributing factor could meet that legal standard if it played a meaningful role.
Jurors were instructed not to consider the specific posts or videos Kaley viewed. That limitation stems from Section 230 of the Communications Decency Act of 1996, which generally shields technology companies from liability for content posted by users. The case instead centered on whether the structure and mechanics of the platforms themselves could be considered harmful.
Who Testified in the Trial?
Jurors heard about a month of testimony, arguments and evidence. Kaley testified about her experiences, and Meta executives Mark Zuckerberg and Adam Mosseri, the head of Instagram, were also called to the stand.
Meta said in a statement that it plans to challenge the outcome.
“We respectfully disagree with the verdict and are evaluating our legal options,” the company said. A Meta spokesperson added that teen mental health is “profoundly complex and cannot be linked to a single app.”
YouTube’s CEO, Neal Mohan, was not called to testify.
Google spokesperson Jose Castañeda said the verdict misrepresents YouTube, “which is a responsibly built streaming platform, not a social media site.”
How Meta and YouTube Defended Themselves
Meta consistently argued that Kaley’s mental health struggles were rooted in factors separate from her social media use, including what it described as a turbulent home life. The company said that none of her therapists identified social media as the cause of her mental health issues.
YouTube’s defense focused less on Kaley’s medical records and more on how she used the platform. Lawyers argued that YouTube functions more like television than a traditional social media network and pointed to data showing her usage declined as she got older.
According to YouTube, Kaley averaged about one minute per day watching YouTube Shorts since its launch in 2020. Shorts is the platform’s short‑form, vertical video feature that includes the infinite scroll function criticized by the plaintiffs.
Why the Case Could Have Broader Impact
The lawsuit was selected as a bellwether trial, meaning it is intended to help predict how juries may respond to similar claims. Thousands of lawsuits have been filed nationwide accusing social media companies of harming children through addictive design practices.
Laura Marquez‑Garrett, attorney with the Social Media Victims Law Center and counsel of record for Kaley, described the trial as “a vehicle, not an outcome,” emphasizing its role in bringing internal company documents and decision‑making into the public record.
“This case is historic no matter what happens because it was the first,” Marquez‑Garrett said.
She compared the litigation to past mass‑tort cases involving asbestos, tobacco and talcum powder, arguing that companies often continue selling harmful products despite mounting evidence. She said families affected by social media harms will continue pressing forward with litigation.
How Can Social Media Platforms Harm Children?
The verdict comes amid years of intensifying scrutiny of social media companies over child safety, addiction and mental health. Lawmakers, regulators and parents have raised questions about whether platforms knowingly design products that contribute to depression, eating disorders and suicide among young users.
Plaintiffs hope social media companies will ultimately face accountability comparable to that of earlier mass‑tort defendants, forcing changes to how platforms are designed and how risks are disclosed. The outcome of the punitive damages phase, along with future bellwether trials, could help determine whether that comparison holds.
What People Are Saying
Aura’s Dr. Scott Kollins, clinical psychologist and expert in child and adolescent mental health, to Newsweek in a statement: “This isn’t complicated, and it’s preventable. The kids struggling the most are on their phones constantly: checking them seven times more, sending five times more messages, and staying up on them at night. They’ll tell you themselves: the biggest pressure they feel is being online, even more than smoking or drinking.”
Casey Waughn, senior associate in the technology group at national law firm Armstrong Teasdale, to Newsweek in a statement: “This is another area where navigating the patchwork of state and federal privacy laws that regulate this space can be complex, but this seems to be an area where public perception, regulators, and lawmakers are relatively in step, which can be rare in the digital space. This verdict could further signal to lawmakers that there is public support for oversight and safety measures when it comes to youth online, and spur further legislative activity in the space.”
Update: 3/25/26, 4:35 p.m. ET: This article was updated with further information and remarks.
This article includes reporting by the Associated Press.
