Saturday, 28 December 2024

Apple Faces Lawsuit Over Dropped CSAM Detection Plans for iCloud

Apple faces legal action over its decision to abandon a system designed to detect child sexual abuse material (CSAM) in iCloud. The lawsuit, filed in Northern California, accuses Apple of neglecting to implement promised measures to combat the circulation of CSAM, potentially affecting thousands of victims.

The lawsuit, reported by The New York Times, stems from Apple’s 2021 announcement of a CSAM detection tool. This system aimed to use digital signatures from the National Center for Missing and Exploited Children (NCMEC) to identify known CSAM in users’ iCloud libraries. However, Apple halted its rollout after privacy advocates raised concerns about the potential misuse of the technology, including its use as a government surveillance tool.

Filed by a 27-year-old woman under a pseudonym, the lawsuit claims that Apple’s inaction forces victims to endure ongoing trauma as their abuse images remain accessible online. The plaintiff, who was abused as an infant, reports receiving frequent notifications from law enforcement about new cases involving her images. The suit estimates a class of 2,680 victims who could seek compensation, with damages exceeding $1.2 billion.

Attorney James Marsh, representing the plaintiff, highlighted the systemic impact of Apple’s choice, emphasizing the company’s failure to implement its widely promoted child safety features. The lawsuit also references a similar case filed in August, where a 9-year-old girl and her guardian sued Apple for not addressing CSAM on iCloud.

Apple responded by emphasizing its commitment to child safety while maintaining user privacy. Spokesperson Fred Sainz stated, “Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk. We are urgently innovating to combat these crimes without compromising security and privacy.” He also pointed to features such as Communication Safety, which warns children when they receive or attempt to send content containing nudity, as part of broader prevention efforts.

This lawsuit adds to mounting scrutiny of Apple’s approach to addressing CSAM. In a separate critique earlier this year, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused the company of underreporting such material.

Apple’s decision to abandon its CSAM detection plans raises pressing questions about the balance between user privacy and protecting vulnerable individuals. While privacy concerns are valid, the absence of proactive measures leaves victims of child exploitation without crucial safeguards.
