You don’t need me to tell you that Apple Intelligence has endured a rocky start to life.
Not only did Apple delay its AI toolset until after the launch of the supposedly AI-packed iPhone 16 line in September, but the Apple Intelligence features that have since arrived on the best iPhones are, by and large, novelties rather than needle-movers.
“I believed in Apple Intelligence, but Apple let me down,” wrote TechRadar’s Senior AI Writer John-Anthony Disotto in a recent article summarizing Apple’s woes. And as a long-time iPhone user, I too have felt the sting of seeing my favored tech brand trail behind the likes of Google and Samsung in the Great AI Race.
But all is not lost. For all its shortcomings, there are green shoots to be found in Apple Intelligence, particularly in the way Apple’s AI tools are presented and accessed. They might not be the most powerful features, but they do work pretty darn seamlessly. After using Apple Intelligence every day for a week during a recent vacation to Madeira, Portugal, I’ve come around to the idea that Visual Intelligence, in particular, is a genuinely useful addition to the best iPhones.
Out in the field
Apple’s take on Google Lens launched with iOS 18.2 in December – a whole three months after the iPhone 16 launched, by the way – and while I briefly played around with Visual Intelligence at the time, I never really found a reason to use it consistently on a day-to-day basis in London.
In Madeira, though, I made a point of putting Visual Intelligence through its paces, not least because it’s a place of unique natural beauty, exotic wildlife, and region-specific customs.
Visual Intelligence is activated with a long press of the Camera Control toggle on all iPhone 16 models, and via the Action Button, Lock Screen, or Control Center on the iPhone 16e, iPhone 15 Pro, and iPhone 15 Pro Max. It uses the iPhone’s camera to identify, interpret, and act on visual information, and is integrated with both ChatGPT and Google Search.
As an iPhone 16 Pro user, I always summon Visual Intelligence via Camera Control, and the feature has given that button a new lease of life in my eyes.
I almost never use Camera Control to take pictures, but Apple’s physical-meets-haptic addition to the iPhone 16 series is a perfect trigger point for Visual Intelligence. Equivalent features on Samsung and Pixel phones can be activated by pressing the side button and power button, respectively, but I like that my iPhone 16 Pro has what I now see as a dedicated Visual Intelligence toggle.
[Embedded TikTok video from @techradar]
As for how Visual Intelligence responded to my visual queries in Madeira? It served up useful and accurate information every single time. Granted, the feature relied on ChatGPT in the majority of cases, but I don’t see that as a problem so long as ChatGPT remains free and seamlessly integrated into Visual Intelligence. It might not be an Apple tool, but it’s certainly an iPhone tool.
As you can see via the TikTok video above, Visual Intelligence was able to determine my location based on a fairly nondescript valley and an even more obscure botanical garden. It correctly translated both Portuguese and Latin text, and explained that the lizard I stumbled across was a common wall lizard (Podarcis muralis). It identified a traditional Portuguese spice crusher and even told me where I could purchase some rather delicious Madeira wine.
And even in cases where Visual Intelligence couldn’t pinpoint my exact whereabouts or determine the origin of the grilled fish on my plate, it gave me something to work with. For instance: “This is a stunning waterfall cascading down a lush, green cliff, surrounded by dense vegetation. The area appears to be a popular spot for visitors, with people enjoying the natural scenery, taking photos, and relaxing near the water’s edge.” All correct, even if it wasn’t able to identify the waterfall in question as Madeira’s 25 Fontes.
The point being: Visual Intelligence works, and it shouldn’t be dismissed as a useless gimmick by those who have only tried Apple Intelligence’s more superfluous features.
The bigger, better competition
Of course, the unfortunate situation for Apple is that many of the best Android phones offer comparable or better versions of the same tool. Gemini Live is Visual Intelligence on steroids, and when Google’s AI assistant helps you cook up a three-course meal or engages in a conversation about the local music scene, it genuinely feels like you’re interacting with the future.
Ironically, Gemini Live is now available for free on iOS as well as Android, and the harsh reality for Apple is that there’s really no reason for iPhone owners to use Visual Intelligence over Gemini Live at the time of writing. AI is moving at such a pace that the former already feels outdated – Apple Intelligence is not the most intelligent software on iPhone.
However, I’m convinced that Apple does have a functioning AI platform to build on at next week’s WWDC 2025 conference. Yes, Apple Intelligence is a long way behind the competition, but features like Visual Intelligence prove that there is genuine utility in what Apple has produced thus far. Apple is still the king of UI and design, and I hope that we’ll soon see an Apple Intelligence that isn’t so reliant on third-party platforms to be useful.