Apple sued for not implementing CSAM detection in iCloud
In 2021, Apple announced, and then quickly shelved, a controversial set of child-safety features that would have scanned iCloud Photos for known Child Sexual Abuse Material (CSAM) and warned children about explicit images in iMessage.
The system would have analyzed pictures on Apple devices, including those used by children, but the company found itself in the eye of a storm over privacy concerns raised by security experts and advocacy groups. Apple abandoned the system, saying it would "take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features."
In the years since, Apple has stayed quiet about any CSAM-related features. Now (via Engadget), a lawsuit from a victim, filed in the US District Court for Northern California, alleges that Apple's failure to build the promised safeguards has allowed images of her abuse to keep circulating online.
First reported by The New York Times, the lawsuit says the 27-year-old and her mother receive constant notifications about individuals being charged with possession of the images. The lawsuit demands financial compensation for 2,680 victims whose images have been leaked online.
Apple spokesperson Fred Sainz told Engadget that CSAM "is abhorrent, and we are committed to fighting the ways predators put children at risk." Sainz said the company was "urgently and actively" looking for ways "to combat these crimes without compromising the security and privacy of all our users."