Stock Markets February 23, 2026

Internal Documents Show Meta Executives Warned Encryption Would Hinder Child-Safety Investigations

Court filings reveal senior policy and safety leaders raised serious reservations as company moved to end-to-end encrypt Messenger and Instagram DMs

By Derek Hwang

Internal records filed in a New Mexico state court case show Meta proceeded with plans to implement default end-to-end encryption for messages tied to Facebook and Instagram despite internal warnings that the change would sharply reduce the company's ability to detect and report child sexual exploitation and other threats to law enforcement. The documents include emails and chat messages from 2019 in which senior safety and policy executives characterize the plan as irresponsible and warn it would eliminate the proactive detection capabilities the company relied on to refer cases to authorities.

Key Points

  • Internal 2019 communications show senior Meta executives privately warned default end-to-end encryption for Messenger would sharply reduce proactive detection and reporting of child sexual exploitation.
  • A February 2019 briefing estimated reporting to NCMEC would have declined from 18.4 million to 6.4 million (a 65% drop) had Messenger been encrypted, and listed specific numbers of cases that would have lacked proactive data for law enforcement.
  • Meta says it developed additional safety features and controls, including special account restrictions for minors and reporting pathways, before launching default encryption for Facebook and Instagram messages in 2023.

Internal company materials submitted in a New Mexico state court case portray a sharp divide between Meta's public statements and the private concerns expressed by senior safety and policy executives as the company prepared its 2019 public announcement of plans to implement default end-to-end encryption for Facebook Messenger, later extended to Instagram direct messages.

The filings, made public in connection with a lawsuit brought by New Mexico Attorney General Raul Torrez, contain emails, chat exchanges and briefing papers gathered in discovery. They show that some of Meta's most senior content and safety leaders concluded the move would materially diminish the company's capacity to identify and refer child-exploitation cases and other serious threats to law enforcement.


What the documents show

A chat exchange dated March 2019 captures Monika Bickert, Meta's head of content policy at the time, writing, "We are about to do a bad thing as a company. This is so irresponsible," as executives prepared for a public rollout of the encryption plan by Chief Executive Mark Zuckerberg.

According to the materials, other senior executives voiced similarly stark warnings. The filings say that in internal communications Bickert accused the company of making "gross misstatements of our ability to conduct safety operations." In one message she said she was not inclined to assist Zuckerberg in presenting the move as primarily a privacy protection: "I’m not very invested in helping him sell this, I must say."

One of Bickert's criticisms, as recorded in the documents, was operational: with end-to-end encryption in place, the company would be unable to find some forms of illicit activity proactively. She wrote that with end-to-end encryption, "there is no way to find the terror attack planning or child exploitation" and therefore the company could not readily refer such cases to law enforcement.


Quantified impacts in company briefings

A Meta briefing document from February 2019, included in the filings, attempted to quantify the effect encryption would have on the company's reporting of child nudity and sexual exploitation imagery to the National Center for Missing and Exploited Children (NCMEC). It estimated that the prior year's reporting would have dropped from 18.4 million reports to 6.4 million if Messenger had been encrypted - a 65% decline.

A subsequent update to that same briefing added more tangible case-level impacts, stating Meta would have been "unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings." Those figures are presented in the filings as internal assessments of the company’s reduced ability to supply proactive data to authorities under an encrypted messaging regime.


Safety executives flagged platform-specific risks

Safety teams singled out the combination of public social networking features and private messaging as particularly hazardous for minors. In a 2019 email excerpted in the filings, Antigone Davis, Meta's Global Head of Safety, warned that Facebook's social graph made it easy for predatory adults to locate both other adults and children, and to transition interactions into Messenger for more private contact. Davis wrote: "FB [Facebook] allows pedophiles to find each other and kids via social graph with easy transition to Messenger."

By contrast, Davis noted that WhatsApp - Meta's then-existing encrypted messaging product - did not pose the same risk because it was not directly integrated with a broad social network and did not offer the same ease of making social connections. She wrote that making Messenger end-to-end encrypted would be "far, far worse than anything we have seen/gotten a glimpse of on WA (WhatsApp)."


Legal and regulatory context

The filings were submitted in a case filed by New Mexico Attorney General Raul Torrez that alleges Meta allowed predators unfettered access to underage users and connected them with victims, often contributing to real-world abuse and human trafficking. That lawsuit has proceeded to trial this month, making it the first of its kind against the company to reach a jury.

The case arrives amid a broader wave of litigation and regulatory scrutiny alleging harms to young people linked to Meta's platforms. The court filings note that a coalition of more than 40 state attorneys general is pursuing claims related to youth mental health, that some school districts have brought suits, and that a separate civil action in Los Angeles County Superior Court - in which Mark Zuckerberg testified last week - was brought by attorneys representing a teenager who says she was harmed by the company's products.


Company response and mitigation steps

Meta spokesperson Andy Stone, responding to inquiries about the filings, said the concerns raised internally by Bickert and Davis prompted the company to design and deploy additional safety features before launching encrypted messaging for Facebook and Instagram in 2023. Stone said that while messages are now encrypted by default, users retain the ability to report objectionable messages for review and potential referral to law enforcement.

Stone described several of the measures Meta developed to address child safety within encrypted chats, including special account settings for underage users that prevent adult accounts from initiating contact with minors they do not know. He said the 2019 concerns helped motivate these features and other tools intended to detect and prevent abuse while preserving encryption.


Why this matters

The documents in the New Mexico case illuminate the tensions Meta faced between preserving user privacy through encryption and maintaining the operational visibility used by safety teams to detect serious crimes and threats. Internal warnings recorded in early 2019 show that some safety and policy leaders believed the company’s public messaging understated the operational trade-offs and that the proposed encryption rollout risked substantially curtailing its ability to proactively flag harmful activity to law enforcement.

As the litigation proceeds, the filings underscore ongoing debates inside major platform companies over the balance between encryption, detection of illicit activity and user safety - debates that have legal, regulatory and public policy implications that continue to play out in courtrooms and government investigations.


Key takeaways

  • Internal Meta documents from 2019 indicate senior safety and policy executives warned that default end-to-end encryption for Messenger would reduce the company's ability to detect and report child sexual exploitation and other threats.
  • Company briefings estimated a drop from 18.4 million to 6.4 million in reports to NCMEC for a prior year if Messenger had been encrypted - a 65% decline - and listed specific categories of cases that would have lacked proactive data for law enforcement.
  • Meta has since added features it says are designed to mitigate those risks, and it launched default end-to-end encrypted messaging for Facebook and Instagram in 2023.

Risks

  • Reduced proactive visibility - Encryption on social-network-linked messaging could limit the company's ability to detect child exploitation and other serious threats, affecting law enforcement referrals and public safety across the social media sector.
  • Legal and regulatory exposure - The company faces litigation and multi-state investigations alleging harms to children and failures to keep users safe, creating legal risk for the technology and social media sectors.
  • Operational trade-offs - Measures intended to preserve privacy may compromise established safety operations, creating uncertainty for product design, moderation practices and compliance strategies within tech platforms.
