Apple Faces Lawsuit Over Dropped CSAM Detection Plans for iCloud

Apple faces legal action over its decision to abandon a system designed to detect child sexual abuse material (CSAM) in iCloud. The lawsuit, filed in Northern California, accuses Apple of neglecting to implement promised measures to combat the circulation of CSAM, potentially affecting thousands of victims.

The lawsuit, reported by The New York Times, stems from Apple’s 2021 announcement of a CSAM detection tool. The system was designed to compare digital fingerprints (hashes) of known CSAM, supplied by the National Center for Missing and Exploited Children (NCMEC), against images in users’ iCloud libraries. However, Apple halted the rollout after privacy advocates raised concerns that the technology could be misused, including as a government surveillance tool.
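To make the mechanism concrete: hash-matching systems of this kind compare a fingerprint of each image against a database of known-CSAM fingerprints and flag an account only after a threshold number of matches. The Python sketch below illustrates that general idea only; it is not Apple’s actual design, which used a proprietary perceptual hash (“NeuralHash”) and a private set intersection protocol, and the hash set, threshold constant, and function names here are illustrative placeholders.

    import hashlib

    # Placeholder fingerprint database. In a real deployment the hashes
    # would come from NCMEC; Apple's proposed system used a perceptual
    # hash ("NeuralHash") rather than a cryptographic one, so resized or
    # re-encoded copies of an image would still match.
    KNOWN_FINGERPRINTS = {"placeholder-fingerprint-1", "placeholder-fingerprint-2"}

    # Apple publicly described a human-review threshold of roughly 30 matches.
    MATCH_THRESHOLD = 30

    def fingerprint(image_bytes: bytes) -> str:
        # SHA-256 stands in for a perceptual hash here; it only matches
        # byte-identical files, which is one reason this is illustrative only.
        return hashlib.sha256(image_bytes).hexdigest()

    def count_matches(library: list[bytes]) -> int:
        # Count images whose fingerprint appears in the known set.
        return sum(1 for img in library if fingerprint(img) in KNOWN_FINGERPRINTS)

    def should_flag_account(library: list[bytes]) -> bool:
        # Flag only once the match count crosses the threshold, mirroring
        # the threshold design Apple described in 2021.
        return count_matches(library) >= MATCH_THRESHOLD

In Apple’s announced design, the comparison would have run on-device against a blinded copy of the database, so Apple could learn nothing about an account’s matches until the threshold was crossed; that same matching machinery is what critics argued a government could repurpose to scan for other content.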

Filed by a 27-year-old woman under a pseudonym, the lawsuit claims that Apple’s inaction forces victims to endure ongoing trauma as their abuse images remain accessible online. The plaintiff, who was abused as an infant, reports receiving frequent notifications from law enforcement about new cases involving her images. The suit estimates a class of 2,680 victims who could seek compensation, with damages exceeding $1.2 billion.

Attorney James Marsh, representing the plaintiff, pointed to the systemic impact of Apple’s decision not to deploy the child safety features it had widely promoted. The lawsuit also references a similar case filed in August, in which a 9-year-old girl and her guardian sued Apple for not addressing CSAM on iCloud.

Apple responded by emphasizing its commitment to child safety while maintaining user privacy. Spokesperson Fred Sainz stated, “Child sexual abuse material is abhorrent, and we are committed to fighting the ways predators put children at risk. We are urgently innovating to combat these crimes without compromising security and privacy.” He also pointed to features such as Communication Safety, which warns children when they receive or attempt to send explicit content, as part of the company’s broader prevention efforts.

This lawsuit adds to mounting scrutiny of Apple’s approach to addressing CSAM. In a separate critique earlier this year, the UK’s National Society for the Prevention of Cruelty to Children (NSPCC) accused the company of underreporting such material.

Apple’s decision to abandon its CSAM detection plans raises pressing questions about the balance between user privacy and protecting vulnerable individuals. While privacy concerns are valid, the absence of proactive measures leaves victims of child exploitation without crucial safeguards.
