Jury Hits Meta with $375 Million Verdict in New Mexico Child Safety Trial

After six weeks of testimony and one day of deliberations, a Santa Fe jury found Meta willfully violated state consumer protection law by exposing children to predators on its platforms. The $375 million damage award is the first of its kind at the state level — and only the beginning of what the legal system may demand.


The Verdict

On March 24, 2026, a jury in Santa Fe, New Mexico returned a verdict finding that Meta — the parent company of Facebook, Instagram, and WhatsApp — willfully violated New Mexico's Unfair Practices Act, according to CNBC. The jury awarded $375 million in damages.

Deliberations began Monday, March 23, one day after closing arguments concluded. The verdict arrived within roughly 24 hours.

Meta immediately announced its intention to contest the outcome. "We respectfully disagree with the verdict and will appeal," a Meta spokesperson said in a statement reported by CNBC. "We work hard to keep people safe on our platforms and are clear about the challenges of identifying and removing bad actors or harmful content. We will continue to defend ourselves vigorously, and we remain confident in our record of protecting teens online."

What New Mexico Alleged

New Mexico Attorney General Raúl Torrez filed suit against Meta in 2023, following an undercover state investigation in which investigators created a fake social media profile of a 13-year-old girl. Torrez told CNBC that the account "was simply inundated with images and targeted solicitations" from individuals seeking to exploit children.

The state's core theory, as argued by Linda Singer, an attorney representing the state, in closing arguments reported by the Associated Press, was that Meta's failure to protect children was not accidental. "The safety issues that you've heard about in this case, weren't mistakes," Singer told the jury. "They were a product of a corporate philosophy that chose growth and engagement over children's safety. And young people in this state and around the country have borne the cost."

Singer further cited what she described as internal Meta research that was not publicly disclosed: "It was included in Meta's internal research — again this was research that didn't get disclosed by Meta — one-in-three teens experienced problematic use. They knew these kids were struggling with problematic use — again, addiction."

On the question of damages, Singer urged jurors to impose a civil penalty that could exceed $2 billion: the maximum $5,000 penalty per violation, across two counts of consumer protection violations, applied to an estimated 208,700 monthly Meta users under the age of 18 in New Mexico, according to AP News. The jury's $375 million award fell well below that ceiling but is the largest verdict of its kind in a state-level social media child safety case.
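Singer's $2 billion figure follows directly from the statutory arithmetic she described: the maximum per-violation penalty, multiplied by two counts, multiplied by the estimated number of underage users. A quick sketch of that calculation, using the figures reported by AP News:

```python
# Figures reported by AP News from the state's closing argument
penalty_per_violation = 5_000   # maximum civil penalty per violation
counts = 2                      # two counts of consumer protection violations
underage_users = 208_700        # estimated monthly Meta users under 18 in New Mexico

max_penalty = penalty_per_violation * counts * underage_users
print(f"${max_penalty:,}")  # $2,087,000,000, just over $2 billion
```

The jury's $375 million award works out to roughly 18 percent of that theoretical maximum.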

Meta's Defense

Meta attorney Kevin Huff argued throughout the trial that the company has invested heavily in child protection. According to AP News, Huff told the jury that "Meta has 40,000 people working to make its apps as safe as possible" and highlighted automated safety tools the company has deployed.

Huff also acknowledged imperfection while arguing it did not equal liability: "No one can, with billions of pieces of content every day, even the best system, cannot catch all of it." He further argued that Meta had disclosed the limitations of its platforms in user agreements, ads, and public statements, and that "common sense also says that parents and teens know that there is bad content on the internet, and on Facebook and Instagram specifically," per AP News.

Meta's attorneys characterized the state's evidence as cherry-picked and described the billions-of-dollars damages request as "a shocking number," according to AP News.

What Happens Next in New Mexico

The verdict concludes only the first phase of the New Mexico trial. According to CNBC, a second phase — conducted without a jury — will commence in summer 2026. In that phase, a judge will determine whether Meta created a public nuisance and whether it should be required to fund public programs intended to address the harms found by the jury.

That second phase could substantially increase Meta's total financial exposure from the New Mexico case beyond the $375 million damages figure.

The Broader Legal Storm

The New Mexico verdict lands in the middle of what legal experts and observers have compared to the landmark Big Tobacco litigation of the 1990s. Multiple simultaneous proceedings are underway at both the state and federal levels.

As of late March 2026, a separate jury in Los Angeles Superior Court has been deliberating in a case involving both Meta and Google's YouTube, according to CNBC. That so-called bellwether case centers on alleged harm to a single plaintiff — identified only as K.G.M. — who claims she became addicted to social media apps while underage. Its outcome is expected to influence thousands of similar lawsuits consolidated under California's Judicial Council Coordination Proceedings.

Beyond California, a federal trial is scheduled for later in 2026 in the Northern District of California, in which school districts and parents across multiple states allege that Meta, YouTube, TikTok, and Snap collectively caused measurable mental health harm to children and teenagers, per CNBC.

The legal distinction between the New Mexico case and the others is notable. Under Section 230 of the federal Communications Decency Act, tech companies enjoy broad immunity from liability for content posted by third parties. New Mexico crafted its case to sidestep that shield by targeting not the content itself but Meta's algorithmic design choices and business practices, specifically the algorithms that the state alleged actively promoted addictive and harmful content to minors, per AP News.

Why This Verdict Matters

The New Mexico verdict is the first time a jury has found Meta liable under a state consumer protection framework for how it designed and operated its platforms in relation to children. While Meta has settled some claims and faced regulatory actions in Europe and elsewhere, a jury finding of willful misconduct in a U.S. court carries different legal and reputational weight.

The "willful" finding in particular is significant. It means jurors concluded Meta did not simply fail to prevent harm — it knowingly operated in a way that violated the law. That finding could inform how other juries and judges evaluate similar evidence in the wave of litigation still pending.

For the industry broadly, the verdict signals that the Big Tobacco analogy is not purely rhetorical. In the tobacco cases, a decades-long pattern of internal research contradicting public statements eventually became the fulcrum of massive liability. The internal Meta documents and studies introduced at trial in New Mexico — including the alleged undisclosed research showing that one-in-three teens experienced what the state characterized as problematic use — follow a structurally similar pattern.

Meta has pledged to appeal. But with multiple juries now deliberating simultaneously and a federal mega-case still to come, the company's legal exposure is expanding faster than any single verdict can resolve.