Since the launch of iOS 18, "Enhanced Visual Search" has been enabled by default in the iPhone's Photos app. The feature lets the device match places that appear in a user's photos against a global index maintained by Apple. However, as iPhone user and developer Jeff Johnson points out in a recent blog post, the feature not only raises privacy concerns by being enabled by default, it also curiously received very little attention at launch.
In his post, Johnson details the only two places where, according to his research, Apple mentions the new feature. The first is an update to the legal notice on Apple's website concerning photos and privacy; the second is a blog post titled "Combining Machine Learning and Homomorphic Encryption in the Apple Ecosystem." While both documents highlight Apple's focus on security, the blog post gives a clearer picture of how heavily AI is involved in processing the photos subjected to Enhanced Visual Search.
Enhanced Visual Search processing begins with an on-device AI model that checks whether a photo contains a potential landmark. If one is detected, the phone creates a code representing the landmark, which Apple calls an embedding. The embedding is then encrypted and sent to Apple's servers through a third-party relay so that Apple cannot tie the query to the user's location. Once the servers receive the code, they find similar landmarks in the database and send the results back to the device for decryption. Finally, the phone uses another AI model to select the best-matching landmark and tags the photo with its name.
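The pipeline above can be sketched in simplified form. This is a toy illustration, not Apple's implementation: the embedding model is faked with a hash, and the "encryption" is a shared additive mask standing in for real homomorphic encryption (whose actual point is that the server can compute on ciphertexts it cannot read). All names here are hypothetical.

```python
import hashlib

def embed(label: str) -> list[float]:
    # Hypothetical stand-in for the on-device embedding model:
    # derive a short numeric vector from a detected landmark label.
    digest = hashlib.sha256(label.encode()).digest()
    return [b / 255 for b in digest[:8]]

MASK = 0.5  # toy "key"; real homomorphic encryption is far more than this

def encrypt(vec: list[float]) -> list[float]:
    # Placeholder for homomorphic encryption. With a shared additive
    # mask, distances between "ciphertexts" equal distances between
    # plaintexts, so the server can rank matches without the raw values.
    return [v + MASK for v in vec]

# Toy server-side index of known landmark embeddings, stored encrypted.
SERVER_INDEX = {
    name: encrypt(embed(name))
    for name in ["Eiffel Tower", "Golden Gate Bridge", "Colosseum"]
}

def server_lookup(enc_query: list[float]) -> str:
    # Server ranks candidates by squared distance between ciphertexts;
    # it never sees the plaintext embedding.
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(SERVER_INDEX, key=lambda name: dist(SERVER_INDEX[name], enc_query))

def tag_photo(detected_label: str) -> str:
    # 1. On-device model produces an embedding for the candidate landmark.
    query = embed(detected_label)
    # 2. Device encrypts it and (in Apple's design) routes it through a
    #    third-party relay so the server never learns who is asking.
    # 3. Device receives the match, decrypts/reranks, and tags the photo.
    return server_lookup(encrypt(query))
```

Calling `tag_photo("Eiffel Tower")` returns `"Eiffel Tower"`, since the query's masked embedding lands at distance zero from the index entry.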
This may be a nice quality-of-life feature for some, but for others, including Johnson, it is yet another privacy vulnerability. Ultimately, users can decide whether to enable the feature and whether its privacy risks are worth it. However, one must question how many of a user's privacy decisions are already being made behind the scenes.
Source(s)
Jeff Johnson via Y Combinator (Hacker News) and Apple