Instagram and the New Mexico Trial: Depositions Put Safety, Scale and a CEO’s Words on Trial

In a Santa Fe courtroom, jurors sat in silence as recorded depositions were played back, placing Instagram and the people who steer its product decisions under a stark, human light. The tapes of Mark Zuckerberg and Adam Mosseri crystallized a tension between the company’s global reach and repeated warnings about harms to children.

What did Mark Zuckerberg and Instagram leaders say in depositions?

The jury heard Zuckerberg acknowledge that when a platform serves billions of users it will include “some very small percent” who are criminals, and that perfection cannot be the standard for the company. Adam Mosseri, identified in depositions as the leader of Instagram, was asked whether the platform should do everything possible to keep teens safe; he answered, “I think we should do what we can.” Mosseri also said the company would “prioritize safety over profits.”

Those statements were shown as prosecutors presented video testimony to buttress claims that risks to children—ranging from sexual solicitation to harms tied to algorithmic recommendations—were not fully disclosed or prevented. Jurors were told that family members of company employees experienced sexual solicitation on the service and that in 2020 Meta estimated a daily figure of hundreds of thousands of children receiving sexually inappropriate communications on Instagram, a count the company later described as overly broad.

How do prosecutors frame the risks to children and what evidence was shown?

New Mexico’s attorney general, Raul Torrez, has accused the company of prioritizing engagement over child safety and of knowingly enabling predators to exploit the platform. Prosecutors emphasized internal assessments and the role of features such as the “People you may know” recommendation tool, which the company identified as a principal driver of identified inappropriate contacts in earlier years.

Jurors also heard that some enforcement actions were not permanent: about thirty percent of adults whose accounts were disabled for targeting children later returned and resumed problematic behavior, a figure offered to illustrate the persistence of the challenge. The trial record includes warnings from child safety organizations—Thorn and the National Center for Missing and Exploited Children—that certain product moves could increase risks for children.

What is Meta saying it has done, and what remains contested?

Company representatives pushed back in opening statements. Meta attorney Kevin Huff emphasized efforts to remove harmful content and the company’s disclosure of risk, while a Meta spokesperson outlined long-standing rules against child exploitation and investment in detection technology and safety features. The company pointed to changes such as teen accounts launched with default protections in 2024 and to ongoing transparency about content removals and misses.

The depositions also touched on policy choices with privacy implications: Zuckerberg said he authorized end-to-end encryption for a messaging product despite warnings from child-safety groups that the move could carry risks for minors, framing the decision around user privacy. Mosseri described product experiments—such as a recommendations reset and attempts to carve out subsets of adults from recommended follows—that the company framed as steps to reduce harm even while acknowledging limits when serving a very large user base.

The trial is being positioned as potentially precedent-setting, with observers noting it could influence other lawsuits raising similar claims about platform design, safety trade-offs and disclosure practices. The answers given in the taped depositions—short, candid and at times defensive—have made the courtroom a close, human stage for questions about technology, governance and responsibility.

Back in the Santa Fe courtroom where the day began with a hush as videos rolled, jurors were left with competing narratives: executives describing constraints and ongoing work, and prosecutors presenting internal figures and warnings about real children harmed on the platform. As the trial continues, Instagram will remain a central reference point in a broader debate about how far large tech companies must go to prevent known harms while serving billions of users.