Meta and Google Found Liable: 70% of Blame Assigned to Meta in Landmark Social Media Addiction Trial
A Los Angeles jury delivered a verdict that the plaintiff called a vindication: Meta and Google were found liable for harm caused by social media use, with jurors assigning 70% responsibility to Meta and 30% to YouTube. The panel awarded $3 million in compensatory damages to a young woman who said prolonged use of Instagram and YouTube from childhood contributed to mental-health struggles.
Why this matters now
The ruling landed after a trial spanning roughly a month and lengthy deliberations by California jurors that exceeded 40 hours over nine days. It is the first of a cluster of novel civil actions targeting the architecture of social platforms and will likely influence hundreds of similar lawsuits now moving through U.S. courts. The plaintiff, identified in court by initials and also named Kaley in courtroom testimony, said near-constant platform use contributed to depression, anxiety and body dysmorphia; jurors concluded the design and operation of those services were a substantial factor in her harm. The jury placed the lion’s share of legal responsibility on Meta, with YouTube held partially liable.
Meta’s share of liability: what the jury decided and why
Jurors found that Meta’s Instagram and Google’s YouTube were negligent in design or operation and that negligence substantially contributed to the plaintiff’s injuries. The verdict allocated 70% responsibility to Meta and 30% to YouTube, and ordered $3 million in compensatory damages. Testimony at trial touched on the allegation that platform features were engineered to keep young users engaged; plaintiff’s counsel framed that as a deliberate design choice that produced addiction-like effects.
Key courtroom testimony included direct statements from senior company figures and lawyers. Mark Lanier, plaintiff’s lawyer, told jurors, “How do you make a child never put down the phone? That’s called the engineering of addiction.” Mark Zuckerberg, chief executive of Meta, testified that he intended the platforms to be beneficial, saying, “It’s very important to me that what we do […] is a positive force in their lives.” Adam Mosseri, head of Instagram, characterized excessive use as “problematic use,” adding that there was no scientific evidence presented to the court that social media is addictive in a clinical sense. Luis Li, lawyer for YouTube, questioned whether the record showed addiction to YouTube in the plaintiff’s medical files, asking jurors to apply common sense about shifting interest in platforms.
Deep analysis: causes, consequences and legal ripple effects
The verdict rests on two intertwined findings from jurors: first, that platform design choices could be judged negligent, and second, that those choices were a substantial factor in one user’s long-term harm. The trial highlighted competing narratives — the plaintiff’s claim of near-constant use and mental-health consequences versus the defendants’ arguments that other life events contributed to the plaintiff’s challenges and that platform use was not the primary cause. Jurors’ allocation of 70% to Meta signals a willingness to scrutinize product design as a proximate source of harm rather than confining liability to user content or discrete interactions.
Legal observers will watch how this verdict interacts with existing law referenced at trial, including protections that historically shield online platforms from certain liability. The ruling may act as a bellwether for the consolidated wave of cases that include hundreds of plaintiffs and numerous school districts, testing whether courts will treat design and warning practices as legally actionable across jurisdictions.
The plaintiff testified that continuous platform engagement made her fear missing out and affected her self-worth; jurors accepted that narrative enough to assign damages and split responsibility between the companies. Meta has stated it disagrees with the verdict and is evaluating its legal options, and the judgment follows other civil findings against platform companies in separate state trials.
As this legal landscape unfolds, questions remain about remedies that might follow: will design changes, disclosure regimes or new industry standards emerge from litigation pressure — or will appeals and divergent rulings create a patchwork of outcomes? The jury’s decision assigned clear legal accountability, but the broader policy and regulatory response is still to be determined.
What does this mean for parents, schools and policymakers who must reconcile youth well‑being with ubiquitous digital platforms? The Los Angeles verdict has raised that question in stark terms, and it will be central to the next phase of litigation and public debate about technology, responsibility and the protection of young users on social platforms.
Will the judgment that placed primary fault on Meta change how platforms are built, or simply redirect the battles to higher courts?