NewsWorld
AI-powered predictive news aggregation · © 2026 NewsWorld. All rights reserved.
All Articles

West Virginia sues Apple for allegedly letting child abuse spread in iCloud

The Verge · Feb 19, 2026 · Collected from RSS


Full Article

West Virginia has filed a lawsuit against Apple, accusing the company of allowing the distribution and storage of child sexual abuse material (CSAM) in iCloud. In a lawsuit filed on Thursday, West Virginia Attorney General JB McCuskey claims that by abandoning a CSAM detection system in favor of end-to-end encryption, iCloud has become a “secure frictionless avenue for the possession, protection, and distribution [of] CSAM,” violating the state’s consumer protection laws.

Apple initially outlined plans to launch a system that checks iCloud photos against a known list of CSAM images in 2021. The move was met with significant backlash from privacy advocates, some of whom claimed the company was building a surveillance system, leading Apple to stop development of the feature nearly a year later. At the time, Apple’s software head Craig Federighi told The Wall Street Journal that “child sexual abuse can be headed off before it occurs... That’s where we’re putting our energy going forward.”

Now, West Virginia alleges Apple “knowingly and intentionally designed its products with deliberate indifference to the highly preventable harms.” McCuskey believes other states could take legal action against Apple as well; he told reporters during a press conference that he thinks they’ll “see the leadership that this office has taken” and “join us in this fight.”

The lawsuit claims Apple made 267 CSAM reports to the National Center for Missing & Exploited Children, far fewer than the more than 1.47 million reports made by Google and the more than 30.6 million made by Meta. It also cites an internal message between Apple executives in which Apple’s fraud head Eric Friedman allegedly states iCloud is the “greatest platform for distributing child porn.”

Many online platforms, including Google, Reddit, Snap, Meta, and others, use tools like Microsoft’s PhotoDNA or Google’s Content Safety API to detect, remove, and report CSAM in the photos and videos sent through their systems. Apple currently doesn’t offer these capabilities, but it has since rolled out some features focused on child safety, including parental controls that require kids to get permission to text new numbers, as well as a tool that automatically blurs nude images for minors in iMessage and other apps. McCuskey argues that these safeguards aren’t enough to protect children.

“Apple has knowingly designed a set of tools that dramatically reduces friction for possessing, collecting, safeguarding, and spreading CSAM, all the while engineering an encryption shield that makes it much more likely for bad actors to use Apple to protect their illicit activities,” the lawsuit claims.

Emma Roth



Read Original at The Verge

Related Articles

The Hill · 3 days ago
West Virginia alleges Apple failed to detect, report child sexual abuse material

West Virginia in a new lawsuit is accusing Apple of knowingly permitting its iCloud platform to be used to store and distribute child sexual abuse material. The state’s attorney general, John McCuskey, filed the complaint in the Circuit Court of Mason County on Thursday, demanding the company adopt “effective” detection measures for this illicit content. ...

Engadget · 3 days ago
West Virginia is suing Apple alleging negligence over CSAM materials

The office of the Attorney General for West Virginia announced Thursday that it has filed a lawsuit against Apple alleging that the company had "knowingly" allowed its iCloud platform "to be used as a vehicle for distributing and storing child sexual abuse material." The state alleges this went on for years but drew no action from the tech giant "under the guise of user privacy."

In the lawsuit, the state repeatedly cites a text from Apple executive Eric Friedman, in which he calls iCloud "the greatest platform for distributing child porn" in a conversation with another Apple executive. These messages were first uncovered by The Verge in 2021 within discovery documents for the Epic Games v. Apple trial. In the conversation, Friedman says that while some other platforms prioritize safety over privacy, Apple's priorities "are the inverse." The state further alleges that detection technology to help root out and report CSAM exists, but that Apple chooses not to implement it.

Apple did consider scanning iCloud Photos for CSAM in 2021, but abandoned those plans after pushback stemming from privacy concerns. In 2024, Apple was sued by a group of more than 2,500 victims of child sexual abuse, citing nearly identical claims and alleging that Apple's failure to implement these features led to the victims' harm as images of them circulated through the company's servers. At the time, Apple told Engadget, "child sexual abuse material is abhorrent and we are committed to fighting the ways predators put children at risk. We are urgently and actively innovating to combat these crimes without compromising the security and privacy of all our users."

The case in West Virginia would mark the first time a governmental body has brought such an action against the iPhone maker. The state says it is seeking injunctive relief that would compel Apple to implement effective CSAM detection measures, as well as damages. We have reached out to Apple for comment on the suit and will update if we hear back.

The Verge · about 3 hours ago
Vibe camera shootout: Camp Snap Pro vs. Flashback One35 V2

Fun vibes. Okay-ish photos. | Photo: Antonio G. Di Benedetto / The Verge There's been a surge of interest over the last few years in inexpensive digital cameras. Younger folks are snapping up old point-and-shoots because they view the aesthetic as more authentic and more appealing than smartphone images. Companies are even rereleasing old tech at new prices. And there are cameras like the original Camp Snap: a $70 single-button point-and-shoot with no screen, designed as a modern take on a disposable film camera. It's cheap enough to send off with a kid to summer camp and accessible enough for just about anyone to enjoy its lo-fi aesthetic. I've been testing two charming examples of this formula: the $99 Camp S … Read the full story at The Verge.

The Verge · about 3 hours ago
America desperately needs new privacy laws

This is The Stepback, a weekly newsletter breaking down one essential story from the tech world. For more on the dire state of tech regulation, follow Adi Robertson. The Stepback arrives in our subscribers' inboxes at 8AM ET. Opt in for The Stepback here. How it started In 1973, long before the modern digital era, the US Department of Health, Education, and Welfare (HEW) published a report called "Records, Computers, and the Rights of Citizens." Networked computers seemed "destined to become the principal medium for making, storing, and using records about people," the report's foreword began. These systems could be a "powerful management … Read the full story at The Verge.

The Verge · about 18 hours ago
Arturia’s FX Collection 6 adds two new effects and a $99 intro version

Arturia launched a new version of its flagship effects suite, FX Collection, which includes two new plugins, EFX Ambient and Pitch Shifter-910. FX Collection 6 also marks the introduction of an Intro version with a selection of six effects covering the basics for $99. That pales in comparison to the 39 effects in the full FX Collection Pro, but that also costs $499. Pitch Shifter-910 is based on the iconic Eventide H910 Harmonizer from 1974, an early digital pitch shifter and delay with a singular character. Arturia does an admirable job preserving its glitchy quirks. Pitch Shifter-910 is not a transparent effect that lets you create natu … Read the full story at The Verge.

The Verge · about 23 hours ago
Georgia says Elon Musk’s America PAC violated election law

Of course, it’s the guy who constantly complains about voter fraud who may have committed voter fraud. | Image: The Verge For all his bluster about voter fraud, Elon Musk has been one of the most flagrant flouters of US election law. Now his America PAC has been slapped with a reprimand by the Georgia State Election Board for sending out pre-filled absentee ballot applications. State law prohibits anyone, other than an authorized relative, from sending an absentee ballot application prefilled with the elector's information. Residents of Chattooga, Cherokee, Coweta, Floyd, and Whitfield counties reported receiving absentee ballot applications from America PAC, partially pre-filled. According to the State Election Board, the applications also failed to note tha … Read the full story at The Verge.