Apple Faces Lawsuit Over Dropped CSAM Detection Plans for iCloud

Apple faces legal action over its decision to abandon a system designed to detect child sexual abuse material (CSAM) in iCloud. The lawsuit, filed in Northern California, accuses Apple of neglecting to implement promised measures to combat the circulation of CSAM, potentially affecting thousands of victims.

The lawsuit, reported by The New York Times, stems from Apple’s 2021 announcement of a CSAM detection tool. This system aimed to use digital signatures from the National Center for Missing and Exploited Children (NCMEC) to identify known CSAM in users’ iCloud libraries. However, Apple halted its rollout after privacy advocates raised concerns about the potential misuse of the technology, including its use as a government surveillance tool.
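For context, the announced design amounted to comparing a fingerprint of each photo uploaded to iCloud against a database of signatures for already-identified material. The snippet below is only a conceptual sketch in Python: it uses an ordinary SHA-256 digest and a hypothetical KNOWN_SIGNATURES set, whereas Apple’s proposal relied on a perceptual hash (NeuralHash) and on-device cryptographic matching against a blinded database, neither of which is reproduced here.

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a provider's database of known signatures;
# in real deployments these hash values come from organizations such as NCMEC.
# The entry below is a placeholder (the SHA-256 of an empty byte string).
KNOWN_SIGNATURES = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_signature(path: Path) -> str:
    """Return a SHA-256 digest of the file's bytes (illustrative only)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_known_material(path: Path) -> bool:
    """Check whether the file's digest appears in the known-signature set."""
    return file_signature(path) in KNOWN_SIGNATURES
```

In the protocol Apple described, the comparison would have run on the device against a blinded copy of the database, so the company would learn nothing about photos that did not match; the plain set lookup above is purely didactic.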

Filed by a 27-year-old woman under a pseudonym, the lawsuit claims that Apple’s inaction forces victims to endure ongoing trauma as their abuse images remain accessible online. The plaintiff, who was abused as an infant, reports receiving frequent notifications from law enforcement about new cases involving her images. The suit estimates a class of 2,680 victims who could seek compensation, with potential damages exceeding $1.2 billion.

Attorney James Marsh, representing the plaintiff, highlighted the systemic impact of Apple’s decision, pointing to the company’s failure to deliver the child safety features it had widely promoted. The lawsuit also references a similar case filed in August, in which a 9-year-old girl and her guardian sued Apple for not addressing CSAM on iCloud.

Apple responded by emphasizing its commitment to child safety while maintaining user privacy. Spokesperson Fred Sainz stated, “Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk. We are urgently innovating to combat these crimes without compromising security and privacy.” He also pointed to features such as Communication Safety, which warns children before they view or send explicit content, as part of the company’s broader prevention efforts.

This lawsuit adds to mounting scrutiny of Apple’s approach to addressing CSAM. In a separate critique earlier this year, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused the company of underreporting such material.

Apple’s decision to abandon its CSAM detection plans raises pressing questions about the balance between user privacy and protecting vulnerable individuals. While privacy concerns are valid, the absence of proactive measures leaves victims of child exploitation without crucial safeguards.

Related Articles

Search Rivals Urge EU to Fully Enforce Market Fairness Rules on Google
The European Union is under mounting pressure to broaden its investigation into...

Google introduces AI-powered tool in Shopping tab to match your fashion ideas with similar items
Google has rolled out a new feature, “Vision Match,” in its Shopping...

YouTube introduces $7.99 per month Premium Lite subscription with ad-free experience
YouTube has introduced a new subscription tier called Premium Lite, which aims...

AI Still Faces Skepticism as a Viable ‘Co-Scientist’ in Research
Google recently unveiled its “AI co-scientist,” a cutting-edge artificial intelligence tool designed...