Although the iPhone 15 Pro received Apple Intelligence right alongside the iPhone 16 lineup, it notably missed out on Visual Intelligence. That's finally set to change, though. Apple has officially confirmed that it plans to bring Visual Intelligence to the iPhone 15 Pro and iPhone 15 Pro Max, with the feature possibly arriving with the stable iOS 18.4 update.
The reason Visual Intelligence has so far been missing from the iPhone 15 series comes down to its implementation: the feature could only be triggered with the dedicated Camera Control button. Since the iPhone 15 Pro series lacks that button, and Apple did not provide an alternative way to invoke the feature, it remained exclusive to the iPhone 16 lineup. That changed with the iPhone 16e, which ships with Visual Intelligence despite lacking Camera Control.
So, how did Apple do it? As demonstrated during the iPhone 16e announcement, Visual Intelligence can be mapped to the Action Button or added as a shortcut in Control Center. We now have official confirmation from Apple (via Daring Fireball) that both of these new ways to summon Visual Intelligence are coming to the iPhone 15 Pro and iPhone 15 Pro Max, letting users of those devices try out this handy Apple Intelligence feature for the first time. Apple says it'll arrive "in a future software update," but won't say exactly when. It isn't live in the first iOS 18.4 beta, but it wouldn't be far-fetched to expect it in the final stable release.
For the uninitiated, Visual Intelligence is a lot like Google Lens on Android. It uses a combination of computer vision and generative AI to quickly surface details about places, identify plants and animals, summarize or translate text, and look up items on Google, all just by pointing your camera.
Source(s)
Via Daring Fireball (linked in the article)