West Virginia's attorney general has taken legal action against Apple, asserting that the technology giant permitted its iCloud storage service to become what one internal Apple message described as the "greatest platform for distributing child porn." The lawsuit, filed in Mason County Circuit Court, accuses Apple of prioritizing its user-privacy policies over measures that would prevent the dissemination of child sexual abuse material.
The complaint, announced by the attorney general's office, identifies actions and omissions by Apple that the state says allowed abusive images to be stored and circulated through iCloud. The filing requests statutory and punitive damages and asks a judge to order Apple to deploy more effective detection systems and to adopt safer product-design practices.
Allegations and evidence cited
The state highlights an internal text message from 2020 attributed to Apple’s then-anti-fraud chief, which said that because of Apple’s priorities the company had become "the greatest platform for distributing child porn." West Virginia’s attorney general, JB McCuskey, said in a statement that such content constitutes a permanent record of a child’s trauma and that every sharing or viewing re-victimizes the child.
"These images are a permanent record of a child’s trauma, and that child is revictimized every time the material is shared or viewed," McCuskey said. "This conduct is despicable, and Apple’s inaction is inexcusable."
The lawsuit asserts that Apple stores and synchronizes data through iCloud without proactively detecting abusive material. The state describes Apple’s prior and evolving technical choices, some of which the company initiated and later reversed, as enabling the spread of these images.
Apple’s historical approaches to scanning and encryption
The filing contrasts Apple’s choices with the approaches taken by other major technology companies. Alphabet’s Google, Microsoft and other providers have long used databases of identifiers for known child sexual abuse material, supplied by organizations such as the National Center for Missing and Exploited Children and similar clearinghouses, to check files uploaded to their services.
By contrast, until 2022 Apple did not scan all files uploaded to its iCloud storage offerings. During that period iCloud data was not end-to-end encrypted, meaning law enforcement could obtain access with a warrant. At one point Apple explored implementing end-to-end encryption for iCloud that would have rendered data inaccessible to law enforcement; the state’s filing says the company later abandoned that plan following concerns that it would hinder investigations.
In August 2021 Apple announced a system called NeuralHash, intended to scan images on users’ devices before upload in an effort to detect known abuse material while preserving privacy. The state says NeuralHash was criticized by outside researchers who warned it could generate false positives and by privacy advocates who feared it could be expanded for broader government surveillance. Apple delayed deployment of NeuralHash the following month and ultimately canceled the project in December 2022, the state said.
That same month Apple introduced an option for end-to-end encryption for iCloud data, according to the state’s statement. The filing argues that NeuralHash was inferior to other available tools and could be evaded, and that Apple’s storage and synchronization practices permitted abusive images to continue circulating.
Other features, reporting and comparisons
Although Apple did not move forward with scanning uploaded images in iCloud, the company implemented a Communication Safety feature that blurs nudity and other sensitive content sent to or from a child’s device. Federal law requires U.S.-based technology companies to report instances of child sexual abuse material to the National Center for Missing and Exploited Children. In 2023 Apple made 267 such reports, the state said, compared with 1.47 million by Google and 30.6 million by Meta Platforms.
The state’s civil claims mirror allegations in a proposed class-action lawsuit filed in late 2024 in federal court in California by individuals who appear in such images. Apple has sought to have that private suit dismissed, asserting that it is protected from liability under Section 230 of the Communications Decency Act, which provides broad legal shields for internet companies with respect to user-generated content.
What the state is seeking
West Virginia’s complaint seeks both monetary relief and injunctive remedies. The state is asking a court to require Apple to implement more robust detection of abusive material and to change aspects of its product design to reduce the circulation of such content. The filing frames these remedies as necessary to protect children and to hold the company accountable for decisions the state says left its platform vulnerable to misuse.
The case marks what the attorney general’s office describes as a novel government effort to hold a major technology company legally accountable for how its data storage platform has been used in the spread of child sexual abuse material.