POINT PLEASANT, W.Va. — West Virginia Attorney General JB McCuskey has sued Apple in Mason County Circuit Court, alleging that the company's ironclad privacy features have turned iCloud into a haven for child sexual abuse material (CSAM). McCuskey, a Republican, claims Apple places user secrecy above child safety, citing an internal 2020 iMessage exchange in which company executive Eric Friedman called iCloud "the greatest platform for distributing child porn."

The exchange, between Friedman, then Apple's anti-fraud chief, and security head Herve Sibert, surfaced during the 2021 Epic Games v. Apple litigation. Friedman lamented Apple's privacy-first posture, writing that the company had "chosen to not know" the full scope of CSAM moving through its systems. West Virginia's 45-page complaint uses the admission to argue that iCloud's end-to-end encryption puts that material beyond the reach of law enforcement, and even of Apple itself.

McCuskey seeks statutory damages, punitive damages and a court order requiring Apple to deploy CSAM detection tools. "These images are a permanent record of a child's trauma, and that child is revictimized every time the material is shared or viewed," he said in a statement. The suit accuses Apple of undermining the state's child protection efforts by failing to report material to the National Center for Missing & Exploited Children (NCMEC).

Apple reported just 267 instances of CSAM to NCMEC in 2023, a figure dwarfed by Google's 1.47 million and Meta's more than 30 million. Prosecutors blame Apple's limited scanning, which is confined to iCloud Mail, the one service that lacks end-to-end encryption and remains open to warrant-based access. Most iCloud photos and files go unscanned, and for users who opt into Advanced Data Protection, the data is locked behind keys that only the user holds.

Apple pushed back with a statement highlighting its existing safeguards. The company blocks children from sending or receiving nude images in Messages, and on Thursday it unveiled a "Report to Apple" button that lets U.S. users flag junk content directly to the company. Spokespeople insist the tool predates the lawsuit and is aimed at a broad range of harms across devices and apps.

This clash echoes years of tension over encryption, which law enforcement decries as a shield for criminals and privacy groups hail as essential protection against surveillance. Apple has flirted with middle-ground solutions before. In 2021, it proposed NeuralHash, an on-device system designed to match photos against fingerprints of known CSAM before upload, without decrypting files. Backlash from advocates including the Electronic Frontier Foundation killed the plan over fears the technology would creep into broader censorship.
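NeuralHash's internals are proprietary, but the basic mechanism it described, comparing a perceptual fingerprint of each photo against a list of known-bad fingerprints and flagging near matches, can be sketched in a few lines of Python. The hash width, threshold and stand-in `perceptual_hash` function below are illustrative assumptions, not Apple's actual design:

```python
# Illustrative sketch of on-device hash matching in the spirit of systems
# like NeuralHash. The hash function, width and threshold are placeholders.
import hashlib

HASH_BITS = 96        # assumed fingerprint width, for illustration only
MATCH_THRESHOLD = 8   # assumed maximum Hamming distance for a "match"

def perceptual_hash(image_bytes: bytes) -> int:
    """Stand-in for a real perceptual hash. A true implementation derives
    a fingerprint that survives resizing, cropping and re-encoding."""
    digest = hashlib.sha256(image_bytes).digest()
    return int.from_bytes(digest[: HASH_BITS // 8], "big")

def hamming_distance(a: int, b: int) -> int:
    """Count the bits on which two fingerprints differ."""
    return bin(a ^ b).count("1")

def flags_before_upload(image_bytes: bytes, known_hashes: set[int]) -> bool:
    """Flag an image whose fingerprint falls within the threshold of any
    entry in the known-CSAM hash list, before the file leaves the device."""
    h = perceptual_hash(image_bytes)
    return any(hamming_distance(h, k) <= MATCH_THRESHOLD for k in known_hashes)
```

The point of such a design, and the source of the backlash, is that matching happens on the device before anything is encrypted or uploaded; critics feared the same pipeline could later be repointed at other kinds of content.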

Apple went ahead anyway with optional Advanced Data Protection, which encrypts most iCloud data end-to-end. Critics, including West Virginia, say the feature prioritizes secrecy over scanning, letting predators operate freely. The state points to Apple's abandoned plan to encrypt iCloud backups by default, scrapped amid FBI pressure, as proof the company knew the risks but chose privacy.
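In an end-to-end scheme like Advanced Data Protection, the service stores only ciphertext and the decryption key never leaves the user's devices. A minimal sketch of that property, using AES-GCM from the third-party `cryptography` package (the single-key handling here is a simplification, not Apple's actual key hierarchy):

```python
# Minimal sketch of client-side ("end-to-end") encryption: the server only
# ever receives ciphertext, and only the user's device holds the key.
# A simplification, not Apple's actual Advanced Data Protection design.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_for_upload(plaintext: bytes, user_key: bytes) -> bytes:
    """Encrypt on-device before upload; the nonce is prepended to the blob."""
    nonce = os.urandom(12)
    return nonce + AESGCM(user_key).encrypt(nonce, plaintext, None)

def decrypt_after_download(blob: bytes, user_key: bytes) -> bytes:
    """Decrypt on-device; without user_key, the stored blob is opaque."""
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(user_key).decrypt(nonce, ciphertext, None)

# The key is generated and kept on the user's device; the provider never sees it.
user_key = AESGCM.generate_key(bit_length=256)
stored_blob = encrypt_for_upload(b"family photo bytes", user_key)
assert decrypt_after_download(stored_blob, user_key) == b"family photo bytes"
```

That property is exactly what the complaint targets: once only ciphertext reaches Apple's servers, no server-side scan is possible.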

Pressure is mounting across the industry. Google and Meta scan uploads against NCMEC hash databases; Apple has demurred, citing privacy. Now facing suit, the company must respond within 30 days. McCuskey's office frames the case as a public health imperative, not just a law enforcement matter: material that goes undetected, it argues, strains state resources and retraumatizes victims whose images circulate unchecked.
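Scanning of this sort is straightforward for providers that receive plaintext uploads. Real deployments use robust perceptual hashes such as PhotoDNA; the sketch below simplifies that to an exact SHA-256 lookup against a hypothetical hash list to show why the step disappears under end-to-end encryption:

```python
# Sketch of server-side scanning as practiced by providers that receive
# unencrypted uploads. Real systems match robust perceptual hashes; an
# exact SHA-256 lookup against a hypothetical list stands in for that here.
import hashlib

# Hypothetical, empty here; a real service loads NCMEC-supplied digests.
KNOWN_HASHES: set[str] = set()

def scan_upload(upload_bytes: bytes) -> bool:
    """Return True if an upload matches the known-hash list. The check is
    only possible because the server sees plaintext; end-to-end encryption
    removes the plaintext, and with it this entire step."""
    return hashlib.sha256(upload_bytes).hexdigest() in KNOWN_HASHES
```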

The case could set precedents: no state has previously sued over CSAM stored in the cloud, though a 2023 Louisiana suit sought $1.2 billion from Apple on similar grounds, and child safety groups have cheered the filing. As smartphones and social feeds brim with toxic content, regulators are eyeing big tech ever harder. Apple, with 2.2 billion active devices, sits at the epicenter.