
Apple will introduce Visual Intelligence to its last-gen iPhones in a forthcoming iOS update.
The Apple Intelligence feature has, until now, required the Camera Control button, which those phones lack.
Apple will soon give iPhone 15 Pro and 15 Pro Max owners the opportunity to use one of its new Apple Intelligence features that they've not yet had access to.
Visual Intelligence is already available on the iPhone 16 family of phones, and will be accessible on the iPhone 16e when it arrives next week. Last-gen Pro users, however, have been left out for good reason – the feature requires the Camera Control button, which their handsets don't have.
However, a future iOS update will reportedly introduce the option to switch its use to the Action button instead, which the iPhone 15 Pro and its super-sized stablemate do have.
A Control Center shortcut is also said to be coming as an alternative.
Both of these will be available on the iPhone 16e from the off, as it too lacks a Camera Control button.
Visual Intelligence is essentially Apple's answer to Google Lens.
You currently activate it on supported iPhones by holding down the Camera Control button and pointing the camera at an object you want information on, such as a building or something you'd like to buy online.
It'll then give you the option to ask ChatGPT for details or search for the item through Google. Of course, you'll also need to enable Apple Intelligence on your device and sign in to ChatGPT through Settings to get the full functionality.
According to John Gruber of Daring Fireball, Apple has yet to reveal when Visual Intelligence will be made available on last-gen iPhones – i.e. which iOS update will carry the changes. However, he guesses it could come with iOS 18.4, which should be available in beta form "any day now", so we're likely to find out more when that arrives.
We'll let you know more as and when we find out.