Internal company materials submitted in a New Mexico state court case portray a sharp divide between Meta's public statements and the private concerns of senior safety and policy executives as the company prepared its 2019 announcement of plans to make end-to-end encryption the default for Facebook's messaging service, a change later extended to Instagram direct messages.
The filings, made public in connection with a lawsuit brought by New Mexico Attorney General Raul Torrez, contain emails, chat exchanges and briefing papers gathered in discovery. They show that some of Meta's most senior content and safety leaders concluded the move would materially diminish the company's capacity to identify and refer child-exploitation cases and other serious threats to law enforcement.
What the documents show
A chat exchange dated March 2019 captures Monika Bickert, Meta's head of content policy at the time, writing, "We are about to do a bad thing as a company. This is so irresponsible," as executives prepared for Chief Executive Mark Zuckerberg's public rollout of the encryption plan.
According to the materials, other senior executives voiced similarly stark warnings. The filings say that in internal communications Bickert accused the company of making "gross misstatements of our ability to conduct safety operations." In one message she said she was not inclined to assist Zuckerberg in presenting the move as primarily a privacy protection: "I’m not very invested in helping him sell this, I must say."
One of Bickert's criticisms, as recorded in the documents, was operational: with end-to-end encryption in place, the company would be unable to find some forms of illicit activity proactively. She wrote that with end-to-end encryption, "there is no way to find the terror attack planning or child exploitation" and therefore the company could not readily refer such cases to law enforcement.
Quantified impacts in company briefings
A Meta briefing document from February 2019, included in the filings, attempted to quantify the effect encryption would have on the company's reporting of child nudity and sexual exploitation imagery to the National Center for Missing and Exploited Children (NCMEC). It estimated that the prior year's reporting would have dropped from 18.4 million reports to 6.4 million if Messenger had been encrypted - a 65% decline.
A subsequent update to that same briefing added more tangible case-level impacts, stating Meta would have been "unable to provide data proactively to law enforcement in 600 child exploitation cases, 1,454 sextortion cases, 152 terrorist cases [and] 9 threatened school shootings." Those figures are presented in the filings as internal assessments of the company’s reduced ability to supply proactive data to authorities under an encrypted messaging regime.
Safety executives flagged platform-specific risks
Safety teams singled out the combination of public social networking features and private messaging as particularly hazardous for minors. In a 2019 email excerpted in the filings, Antigone Davis, Meta's Global Head of Safety, warned that Facebook's social graph made it easy for predatory adults to find one another and to locate children, and then to move those interactions into Messenger for more private contact. Davis wrote: "FB [Facebook] allows pedophiles to find each other and kids via social graph with easy transition to Messenger."
By contrast, Davis noted that WhatsApp - Meta's then-existing encrypted messaging product - did not present the same risk because it was not directly integrated with a broad social network and did not make social connections as easy to form. She wrote that making Messenger end-to-end encrypted would be "far, far worse than anything we have seen/gotten a glimpse of on WA (WhatsApp)."
Legal and regulatory context
The filings were submitted in Torrez's lawsuit, which alleges that Meta allowed predators unfettered access to underage users and connected them with victims, in some cases contributing to real-world abuse and human trafficking. The lawsuit proceeded to trial this month, making it the first case of its kind against the company to reach a jury.
The case arrives amid a broader wave of litigation and regulatory scrutiny alleging harms to young people linked to Meta's platforms. The court filings note that a coalition of more than 40 state attorneys general is pursuing claims related to youth mental health, that some school districts have brought suits, and that in a separate civil action in Los Angeles County Superior Court - brought on behalf of a teenager who says she was harmed by the company's products - Mark Zuckerberg testified last week.
Company response and mitigation steps
In response to inquiries, Meta spokesperson Andy Stone said the concerns Bickert and Davis raised internally prompted the company to design and deploy additional safety features before launching encrypted messaging for Facebook and Instagram in 2023. Stone said that while messages are now encrypted by default, users retain the ability to report objectionable messages for review and potential referral to law enforcement.
Stone described several of the measures Meta developed to address child safety within encrypted chats, including special account settings for underage users that prevent adult accounts from initiating contact with minors they do not know. He said the 2019 concerns helped motivate those settings and other tools intended to detect and prevent abuse while preserving encryption.
Why this matters
The documents in the New Mexico case illuminate the tension Meta faced between protecting user privacy through encryption and preserving the operational visibility its safety teams used to detect serious crimes and threats. Internal warnings from early 2019 show that some safety and policy leaders believed the company's public messaging understated these trade-offs and that the proposed rollout risked substantially curtailing its ability to proactively flag harmful activity to law enforcement.
As the litigation proceeds, the filings underscore ongoing debates inside major platform companies over the balance between encryption, detection of illicit activity and user safety - debates with legal, regulatory and public policy implications that continue to play out in courtrooms and government investigations.
Key takeaways
- Internal Meta documents from 2019 indicate senior safety and policy executives warned that default end-to-end encryption for Messenger would reduce the company's ability to detect and report child sexual exploitation and other threats.
- A company briefing estimated that reports to NCMEC for the prior year would have dropped from 18.4 million to 6.4 million had Messenger been encrypted - a 65% decline - and listed specific categories of cases in which law enforcement would have lacked proactive data.
- Meta has since added features it says are designed to mitigate those risks, and it launched default end-to-end encrypted messaging for Facebook and Instagram in 2023.