February 23, 2026

Internal Survey Shows Nearly 1 in 5 Young Teens Report Seeing Unwanted Sexual Content on Instagram

Court filing includes deposition excerpts where Instagram head cites limits of self-reported surveys and user privacy constraints

By Jordan Park

Portions of a deposition released in a federal lawsuit indicate that about 19% of Instagram users aged 13 to 15 told Meta they encountered nudity or sexual images on the platform that they did not want to see. The filing also reports that roughly 8% of users in that age group said they had seen someone harm themselves or threaten self-harm on Instagram. Company testimony emphasized the limitations of self-reported survey data and privacy protections for private messages.

Key Points

  • About 19% of Instagram users aged 13 to 15 reported seeing nudity or sexual images they did not want to see, according to a Meta survey cited in a court filing.
  • Roughly 8% of users in that age group reported seeing someone harm themselves or threaten self-harm on Instagram, per the deposition excerpts.
  • Instagram leadership emphasized the limits of self-reported surveys and the need to respect privacy when reviewing private messages; most explicit images were reportedly shared via direct messages.

Portions of a court filing made public Friday in a federal lawsuit in California include excerpts from a March 2025 deposition of Adam Mosseri, the head of Instagram, indicating that nearly 1 in 5 users aged 13 to 15 told Meta they had seen "nudity or sexual images on Instagram" that they did not want to view.

The statistic comes from a company survey of users about their experiences on Instagram, Meta spokesperson Andy Stone said in the filing. Stone clarified that the figure reflects self-reported answers rather than a systematic review of posts on the platform.

In his deposition, Mosseri said the company does not release survey results "in general" and cautioned against drawing strong conclusions from self-reported data, calling such surveys "notoriously problematic." The testimony also indicated that about 8% of users in the 13-to-15 age bracket said they had "seen someone harm themselves or threaten to do so on Instagram."

Mosseri told the court that the majority of the sexually explicit images reported by those young users were sent in private messages between individuals rather than in public posts. He emphasized that Meta must weigh privacy considerations when reviewing messages, saying, "A lot of people don't want us reading their messages."

The filing comes as Meta faces allegations in multiple lawsuits asserting that its products can harm young users. In the United States, thousands of suits in federal and state courts contend the company engineered addictive features and contributed to a youth mental-health crisis. The deposition excerpts and the public filing do not alter the company's position that internal survey data have limitations and that privacy must be respected when reviewing private communications.

Separately, the company announced in late 2025 that it would remove images and videos "containing nudity or explicit sexual activity, including when generated by AI," while allowing potential exceptions for medical and educational content. The filing does not detail how that policy change relates to the survey results cited in the deposition.

As presented in the March 2025 deposition, the account in the court record centers on self-reported user experiences, company testimony about data-disclosure practices, and the tension between content review and users' privacy expectations.

Risks

  • Ongoing litigation and public scrutiny of platform safety practices could heighten regulatory and legal risk for social media companies - impacts technology and communications sectors.
  • Reliance on self-reported survey data introduces uncertainty in assessing the prevalence and nature of harmful content - impacts companies and stakeholders trying to evaluate user-safety metrics.
  • Privacy constraints on reviewing private messages create operational limits for content moderation, complicating efforts by platforms to identify and remove harmful private communications - impacts platform moderation policies and compliance functions.
