A New Mexico state court jury on Tuesday found that Meta must pay $375 million in civil damages, concluding that the company willfully violated state consumer protection laws designed to shield children from online predators. The verdict follows a high-profile trial in which the state attorney general accused Meta of failing to safeguard minors using Facebook and Instagram.
The case began in Santa Fe last month, focusing on allegations that Meta misled residents about the safety of its platforms. New Mexico Attorney General Raúl Torrez initiated the lawsuit in 2023 following an undercover investigation in which a fake social media profile of a 13-year-old girl was targeted with solicitations from predators. During deliberations, the jury concluded that Meta had violated the state’s Unfair Practices Act and assigned damages based on the number of violations.
Meta issued a statement rejecting the verdict and announcing plans to appeal. “We respectfully disagree with the verdict. We work hard to keep people safe on our platforms and will continue to defend ourselves vigorously,” a company spokesperson said, emphasizing Meta’s efforts to protect teens online despite the challenges of policing harmful content at scale.
Attorney Linda Singer, representing New Mexico, had urged the jury to impose a civil penalty potentially exceeding $2 billion. In response, Torrez called the decision “a historic victory for every child and family harmed by Meta’s failure to prioritize safety.” He highlighted internal company documents suggesting executives were aware of the risks their products posed to minors, ignored warnings, and misrepresented safety measures to the public.
The trial’s next phase, set to begin on May 4, will be conducted without a jury. A judge will determine whether Meta created a public nuisance and whether it should fund programs addressing the alleged harms. State lawyers are pushing for reforms including stricter age verification, removal of predators from the platforms, and safeguards addressing encrypted communications that could shield bad actors.
Evidence presented during the trial revealed internal Meta communications discussing how CEO Mark Zuckerberg’s 2019 plan to enable end-to-end encryption on Messenger could limit law enforcement access to reports of child sexual abuse material—then numbering approximately 7.5 million. Torrez argued that these disclosures underscored Meta’s awareness of risks and failure to act responsibly.
The case is part of a broader wave of social media litigation. Torrez has a similar lawsuit pending against Snap, filed in 2024, and experts have compared these trials to the Big Tobacco litigation of the 1990s, given allegations that companies misled the public about risks associated with their products. Other ongoing and upcoming cases include federal and state-level actions against Meta, YouTube, TikTok, and Snap, with plaintiffs citing negative mental health impacts on minors linked to app usage.
The New Mexico verdict sets a potential precedent for how social media companies are held accountable for child safety and app design practices, signaling a heightened regulatory and legal focus on platform responsibility in the United States.