
Apple pulls controversial CSAM plans to scan photos in iOS and iPadOS for nude pictures of children from its website

Apple has deleted all references to its CSAM initiative. (Image source: Jeremy Bezanger)
Rumour had it that Apple would integrate CSAM scans into iOS 15.2 and iPadOS 15.2, released earlier this week. However, the company has now deleted all references to these plans from its website, indicating that the feature has been pulled.

A few days ago, Mark Gurman reported that Apple would include several new features in iOS 15.2, released earlier this week. For Gurman, one of the most notable was the ability for iOS to scan 'for nude photos sent to or from children in Messages'. By nude photos, Gurman was referring to child sexual abuse material (CSAM), which Apple had discussed at great length on its website.

For some reason, Apple has removed all its anti-CSAM initiative material from its website. You can still view Apple's rationale for integrating this functionality into iOS and iPadOS via The Wayback Machine. Its removal does not necessarily mean that Apple has made a U-turn, but it would be the second time that the company has postponed its CSAM plans.

Seemingly, Apple has decided against making CSAM scans a part of iOS 15.2 and iPadOS 15.2. Apple insists that people must activate the feature for it to scan photos, and that it would only do so within the Messages app. Additionally, the company sought to reassure people that no images are analysed in the cloud and that the risk of an account being reported in error would be one in one trillion.

However, the Council of the European Union and Edward Snowden saw things differently. In short, Snowden described Apple's CSAM scanning as 'mass surveillance' and warned that it could be put to nefarious uses in the future. An Apple spokesperson told The Verge that the 'company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection'.

Alex Alderson, 2021-12-16 (Update: 2021-12-16)