Apple will scan iMessages and iCloud photos to fight child abuse
Apple has officially announced that it will implement new measures to fight child abuse with the upcoming iOS 15 and iPadOS 15 updates for mobile devices like the iPhone 12 (from US$564 on Amazon). Even though the cause is a good one, privacy advocates are worried about all three of the measures that Apple is planning to take.
First of all, the Cupertino-based company will start to analyze pictures in the native Messages app on devices that are used by children. When a minor receives or sends a message that includes explicit content, the picture will be automatically blurred, and both the child and their parents will be warned. A rough sketch of this flow follows below.
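In practice, the flow described above might look roughly like the following sketch. The `classify_explicitness` function, the 0.9 threshold, and the notification fields are all hypothetical stand-ins; Apple has not published its on-device classifier or its policies.

```python
# Minimal sketch of the Messages safety flow described above.
# The classifier and the 0.9 threshold are hypothetical placeholders;
# Apple's actual on-device model and decision rules are not public.

def classify_explicitness(image_bytes: bytes) -> float:
    """Stand-in for an on-device ML classifier returning 0.0 (benign)
    to 1.0 (explicit). A real implementation would run a trained model."""
    return 0.0  # placeholder

def handle_incoming_image(image_bytes: bytes, user_is_minor: bool) -> dict:
    score = classify_explicitness(image_bytes)
    if user_is_minor and score > 0.9:     # threshold is an assumption
        return {"display": "blurred",     # blur before showing the picture
                "warn_child": True,       # on-screen warning for the child
                "notify_parents": True}   # alert linked parent accounts
    return {"display": "normal", "warn_child": False, "notify_parents": False}
```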
Furthermore, Apple will start matching photos that are uploaded to and stored on its iCloud servers against a database of known child abuse material. To do that, the company utilizes its so-called NeuralHash technology, and this procedure is exclusively used to identify and report abusive material. Apple vows that the privacy of its customers will not be violated, even though critics have pointed out that the same technology could, in principle, be used to identify all kinds of content in the pictures of iCloud users. In fact, tech companies like Google, Facebook, Twitter and Microsoft have used comparable hash-matching technology for years, which most customers probably weren't aware of.
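To illustrate the general idea of hash-based matching, here is a minimal sketch using a simple "average hash". This is emphatically not NeuralHash, which derives its fingerprint from a neural network; the sketch only demonstrates how an image can be reduced to a compact fingerprint and compared against a set of known fingerprints.

```python
# Illustrative perceptual-hash matching using a simple "average hash".
# This is NOT Apple's NeuralHash; it only shows the general
# match-against-known-hashes idea described in the article.
from PIL import Image  # pip install Pillow

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to an 8x8 grayscale image and set one bit per pixel
    that is brighter than the mean, yielding a 64-bit fingerprint."""
    pixels = list(Image.open(path).convert("L").resize((size, size)).getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def matches_known_database(path: str, known_hashes: set[int],
                           max_distance: int = 5) -> bool:
    """Flag the image if its fingerprint is within a small Hamming
    distance of any fingerprint in the database of known material."""
    h = average_hash(path)
    return any(bin(h ^ k).count("1") <= max_distance for k in known_hashes)
```

Because the fingerprint is derived from image content rather than raw bytes, a re-compressed or resized copy of a known picture still matches. That tolerance to small changes is what distinguishes perceptual hashing from ordinary cryptographic hashing, where a single changed byte produces an entirely different hash.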
Last but not least, Siri and Spotlight search will provide parents and kids with more useful information regarding abuse. Later this year, all of these features will be activated on Apple devices that are used in the United States, although the iPhone maker plans to bring them to more countries in the future.