Apple pulls references to its controversial plan to scan iOS and iPadOS devices for nude pictures of children from its website
A few days ago, Mark Gurman reported that Apple would include several new features in iOS 15.2, which was released earlier this week. For Gurman, one of the most notable was the ability for iOS to scan 'for nude photos sent to or from children in Messages'. By nude photos, Gurman meant child sexual abuse material (CSAM), which Apple had discussed at great length on its website.
For some reason, Apple has removed all of its anti-CSAM initiative material from its website. You can still view Apple's rationale for integrating this functionality into iOS and iPadOS via The Wayback Machine. The removal does not necessarily mean that Apple has reversed course, but if the feature has been shelved again, it would be the second time the company has postponed its CSAM plans.
It seems, though, that Apple has decided against making CSAM scans a part of iOS 15.2 and iPadOS 15.2. Apple insists that people must activate the feature for it to scan photos, and that it would only do so within the Messages app. Additionally, the company sought to reassure people that no images are analysed in the cloud and that the odds of an account being reported in error are one in a trillion.
However, the Council of the European Union and Edward Snowden believed otherwise. In short, Snowden described Apple's CSAM detection as 'mass surveillance' and warned that it could be put to nefarious uses in the future. An Apple spokesperson told The Verge that the 'company’s position hasn’t changed since September, when it first announced it would be delaying the launch of the CSAM detection'.