A New Mexico jury found Meta liable for endangering children on its platforms, concluding the company failed to protect minors from predators and other online threats. The decision followed roughly a day of deliberations after a six-week trial in Santa Fe.

Jurors awarded $375 million in damages, far less than the $2.2 billion the state sought, but the award still stands as one of the first major jury verdicts addressing child safety on social media platforms.

The case centered on allegations that Meta, the parent company of Facebook, Instagram, and WhatsApp, allowed minors to be exposed to sexual exploitation, online solicitation, and human trafficking risks. State officials argued the company knowingly failed to implement adequate safeguards.

New Mexico Attorney General Raúl Torrez called the ruling a “historic victory,” accusing Meta of prioritizing profits over the safety of children and ignoring internal warnings about the risks its platforms posed.

Meta said it plans to appeal the verdict, stating it disagrees with the outcome and maintains that it has taken steps to address harmful content and protect users.

During the trial, jurors heard testimony from 40 witnesses, including whistleblowers, and reviewed extensive internal documents and communications. The state’s attorneys argued that Meta’s algorithms contributed to the problem by directing adults toward content posted by teenage users while downplaying known risks.

The jury determined that Meta violated New Mexico’s Unfair Practices Act by misleading consumers about the safety of its platforms for children.

The case is not yet concluded. A second phase is scheduled to begin May 4, when a judge will consider additional penalties and potential changes to the company’s operations.

The verdict comes as similar cases continue nationwide, including a closely watched trial in California that could further shape legal standards for social media companies and child safety.