Mark Zuckerberg Says Criminal Behavior on Facebook 'Inevitable' in Child Safety Trial Deposition
Meta CEO Mark Zuckerberg acknowledged in a taped deposition played during a high-stakes child safety trial that criminal activity, including harms to children, is an unavoidable reality on platforms like Facebook and Instagram that serve billions of users.

The comments, revealed March 4-5, 2026, in a New Mexico courtroom, came as prosecutors played excerpts from Zuckerberg's pretrial deposition to support allegations that Meta violated state consumer protection laws by failing to adequately disclose or mitigate risks of child sexual exploitation and mental health damage on its services.
"I just think if you're serving billions of people, the unfortunate reality is that some very small percent of them are going to be criminals, and we should work as hard as we can to stop that activity from happening," Zuckerberg said in the deposition. "I don't think that the standard for our platforms would be that you should assume that it will ever be perfect."
The statement drew sharp reactions from critics and child safety advocates, who argue Meta prioritizes engagement and profits over robust protections. Zuckerberg's words were part of broader testimony addressing Meta's efforts — or perceived shortcomings — in combating predatory behavior, underage access and harmful content.
The ongoing bellwether trial, brought by New Mexico Attorney General Raúl Torrez, accuses Meta of knowingly allowing dangerous conditions to persist on Facebook, Instagram and related apps. Prosecutors presented internal documents and executive statements claiming the company downplayed known risks to maintain user growth and advertising revenue.
Instagram head Adam Mosseri echoed that view in his own deposition, played alongside Zuckerberg's, noting the inevitability of some bad actors in vast online communities. Both emphasized Meta's investments in safety tools, including AI detection, content moderation teams and billions of dollars spent annually on enforcement.
Zuckerberg defended the company's approach, highlighting thousands of employees dedicated to trust and safety, proactive removals of violating content and partnerships with law enforcement. He stressed the challenge of balancing privacy features like end-to-end encryption — which limits direct message scanning — with safety needs. "Our job is to build products that balance these things in appropriate ways," he said. "Safety is obviously extremely important. People also care a lot about privacy and security, too."
The trial builds on years of scrutiny over Meta's handling of youth safety. It follows Zuckerberg's February 2026 testimony in a separate Los Angeles addiction lawsuit, where he faced questions on algorithmic design and underage verification. In that case, he acknowledged improvements in detecting children under 13 but said he wished the company had acted sooner.
New Mexico's suit focuses on consumer protection violations, alleging Meta misrepresented platform safety to users and parents. Prosecutors pointed to cases of sexual exploitation facilitated through the apps, including grooming and sextortion schemes targeting minors. They argue Meta's scale amplifies these issues, with harms like depression, anxiety and suicide linked to young users' exposure to the platforms.
Meta counters that it discloses risks, removes harmful content aggressively and cannot eliminate every violation on open platforms. Company lawyers note that adversarial actors constantly evade detection systems, but say Meta continually upgrades its defenses.
The case has spotlighted broader industry challenges. Social media giants face mounting lawsuits and regulatory pressure over youth mental health and exploitation. Section 230 protections shield platforms from liability for user content, but states like New Mexico seek to hold companies accountable for design choices and disclosures.
Public reaction to Zuckerberg's remarks has been swift and critical. Advocacy groups called the statement an admission of defeat on child protection, urging stronger federal legislation. On social media, users debated whether billions of users inherently doom platforms to host crime or if better tools could minimize it further.
Zuckerberg has long maintained that perfection is unattainable but progress is ongoing. In past congressional hearings, he apologized to families affected by platform harms and pledged reforms.
As the New Mexico trial continues, depositions from other executives, including former policy head Nick Clegg, reinforced the argument that harmful content damages business interests, hurting both advertising revenue and brand trust. Clegg noted advertisers avoid proximity to toxic material.
The outcome could influence hundreds of similar suits nationwide, potentially reshaping how platforms approach safety, moderation and transparency. For Meta, the case tests the limits of scale: serving billions inevitably includes risks, but critics say Zuckerberg's words underscore insufficient urgency in addressing them.
With testimony ongoing and more internal records expected, the trial highlights enduring tensions between innovation, privacy, safety and corporate responsibility in the social media era.
© Copyright 2026 IBTimes AU. All rights reserved.