How to use Visual Intelligence on the iPhone

News Room

One of the Apple Intelligence features that hasn’t been delayed is Visual Intelligence, which uses your iPhone’s camera to identify and answer questions about whatever’s around you in the world.

It lets you snap a pizza restaurant storefront and find out its opening hours, for example, or point your camera at a plant and find out what it’s called and how to care for it. If you’ve used Google Lens, you’ll get the idea.

This isn’t available to everyone, though. You need to be running iOS 18.2 on the iPhone 16, iPhone 16 Plus, iPhone 16 Pro, or iPhone 16 Pro Max; iOS 18.3 on the iPhone 16e; or iOS 18.4 on the iPhone 15 Pro or iPhone 15 Pro Max. You’ll also need Apple Intelligence turned on, via Apple Intelligence & Siri in Settings.

How to launch Visual Intelligence

If you have an iPhone 16 model with the Camera Control on the right-hand side, you can press and hold it to bring up the camera and Visual Intelligence.

If you’ve got an iPhone 16e, iPhone 15 Pro, or iPhone 15 Pro Max, you’ve got a few different options to choose from: you can assign Visual Intelligence to the Action button, add it as a Lock Screen control, or launch it from Control Center.

How to use Visual Intelligence

There are all kinds of ways to use Visual Intelligence. Most of the time, it’ll be able to recognize and respond to prompts about anything you show it, so try experimenting and see what you get.

Outside of those options, you’ve got two features you can use, which appear as buttons onscreen whenever Visual Intelligence is looking at something: Ask, which sends a question about the image to ChatGPT, and Search, which runs a Google image search on what you’re viewing.

To exit Visual Intelligence at any time, swipe up from the bottom of the screen.
