Meta Confronts Landmark Jury Trial Over Child Exploitation Allegations in New Mexico
The social media giant Meta is facing a pivotal jury trial in Santa Fe, New Mexico, with opening statements scheduled for 9 February. It is the second major trial of 2026 concerning alleged harms to children linked to the company's platforms.
Serious Allegations Against Social Media Platforms
New Mexico Attorney General Raúl Torrez accuses Meta of knowingly enabling predators to exploit children through Facebook and Instagram. The lawsuit alleges that the company's design choices and profit incentives consistently prioritised user engagement over child safety, creating what Torrez has described as "the largest marketplace for predators and paedophiles globally."
The state's legal action specifically accuses Meta of facilitating dangerous environments where children face risks of sexual exploitation, solicitation, sextortion, and human trafficking. According to court filings, the company allegedly allowed unmoderated groups devoted to commercial sex and enabled the distribution of child sexual abuse material across its networks.
Meta's Response and Safety Measures
In response to these allegations, a Meta spokesperson defended the company's approach to youth safety, stating: "While the New Mexico attorney general makes sensationalist, irrelevant and distracting arguments by cherry-picking select documents, we're focused on demonstrating our longstanding commitment to supporting young people."
The company highlighted its decade-long efforts to collaborate with parents, experts, and law enforcement, pointing to specific safety measures including Teen Accounts with built-in protections and parental management tools. Meta maintains that these initiatives demonstrate meaningful progress in addressing platform safety concerns.
Legal Precedents and Broader Context
This New Mexico trial follows closely on the heels of another high-profile case in Los Angeles involving hundreds of US families and schools. Those plaintiffs allege that multiple social media platforms, including Meta, have knowingly contributed to youth addiction and mental health problems. While Snap and TikTok have reached settlements in that case, Meta and YouTube continue to face legal proceedings.
Significantly, Meta's attempts to invoke Section 230 protections—which typically shield platforms from liability for user-generated content—were denied by a judge in June 2024. This ruling allowed the New Mexico case to proceed based on allegations concerning platform design and internal company decisions rather than speech-related issues.
Expected Evidence and Key Witnesses
The seven-week trial is expected to feature evidence from multiple sources. Key witnesses for the state will likely include educators and law enforcement officials who can testify about harms they have witnessed on Meta's platforms. Whistleblowers may also provide insights into internal company discussions about safety protocols.
Notably, the attorney general's office has already deposed Meta chief executive Mark Zuckerberg, and portions of his testimony may be presented in court. While teens and families directly affected by platform harms are not expected to testify, their experiences will be represented through other evidence.
Revelations from Internal Documents
Recent disclosures from the attorney general's office have revealed startling allegations based on internal Meta documents. These include claims that the company may have profited from placing advertisements from major corporations alongside content that sexualised children.
Perhaps most concerning are internal estimates suggesting approximately 100,000 children on Facebook and Instagram experience online sexual harassment daily. The filings also contain chat excerpts allegedly showing users discussing methods to lure minors into sexual engagement.
Operation MetaPhile and AI Concerns
Among the evidence expected to be presented are details from "Operation MetaPhile," an investigation that led to the 2024 arrest of three men charged with sexually preying on children through Meta's platforms. Undercover agents posing as children were allegedly contacted by suspects who found minors through specific design features on Facebook and Instagram.
A particularly troubling revelation emerged last week concerning AI chatbot companions. Internal documents reportedly show that Zuckerberg approved allowing minors to access these chatbots despite safety staff warnings about potential sexual interactions. According to the filings, Meta rejected recommendations to implement reasonable guardrails, with one employee noting that disabling chatbots for children was "a Mark-level decision."
Industry observers describe the trial as a potential turning point in holding social media platforms accountable for their impact on young users, with implications that could reshape digital safety standards for years to come.