Visual Intelligence is an Apple Intelligence feature that's exclusive to the iPhone 16 and iPhone 16 Pro models because it relies on the Camera Control button. Visual Intelligence is available as of iOS 18.2, and this guide outlines what it can do.
To use Visual Intelligence, press and hold the Camera Control button for a couple of seconds to activate Visual Intelligence mode.
A single press of Camera Control just opens the Camera app, so the distinct press-and-hold gesture is required. Make sure you're not already in the Camera app, because Visual Intelligence won't activate while the camera is in use.
The Visual Intelligence interface features a view from the camera, a button to capture a photo, and dedicated "Ask" and "Search" buttons. Ask queries ChatGPT, and Search sends an image to Google Search.
Visual Intelligence works from a captured photo of whatever you're looking at: snap a photo, which you can do with the Camera Control button, and then select an option. It does not work on a live camera view, and you cannot use photos that you took previously.
If you're out somewhere and want more information about a restaurant or a retail store, click and hold Camera Control, then either click Camera Control again to take a photo or tap the name of the location at the top of the display.
Take a photo of text from the Visual Intelligence interface. Choose the "Summarize" option to get a summary of what's written.
Whenever you capture an image of text with Camera Control, there is an option to hear it read aloud. Tap the "Read Aloud" button at the bottom of the display, and the text is read out in your selected Siri voice.
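Under the hood this is standard iOS text-to-speech. As an illustrative sketch (not Apple's internal Visual Intelligence code), the public analogue is AVFoundation's `AVSpeechSynthesizer`; the sample text here is invented:

```swift
import AVFoundation

// Illustrative sketch: speaking recognized text with the public
// AVSpeechSynthesizer API. The recognized text below is a made-up example.
let recognizedText = "Open daily from 9 AM to 5 PM."

let utterance = AVSpeechUtterance(string: recognizedText)
utterance.voice = AVSpeechSynthesisVoice(language: "en-US")  // pick a voice locale
utterance.rate = AVSpeechUtteranceDefaultSpeechRate

let synthesizer = AVSpeechSynthesizer()
synthesizer.speak(utterance)  // speaks asynchronously through the device speaker
```

The system feature presumably routes through the user's chosen Siri voice; an app using this API selects a voice explicitly.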
If text that you capture with Visual Intelligence is not in your device language (translation is limited to English at this time), you'll see a "Translate" option. Tap it to get an instant translation.
If there's a link in an image that you capture with Visual Intelligence, you'll see a link that you can tap to visit the website.
If there is an email address in an image, you can tap it to compose an email in the Mail app. Similarly, if there is a phone number, you'll see an option to call it.
Using Visual Intelligence on something that has a date will give you an option to add that event to your calendar.
For phone numbers, email addresses, and street addresses, Apple says you can add the information to a contact in the Contacts app. You can also open addresses in the Maps app.
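The date, phone-number, and link detection described above mirrors what Foundation's `NSDataDetector` exposes to developers. A minimal sketch, assuming text has already been recognized from the photo (the sample string is invented):

```swift
import Foundation

// Illustrative sketch: finding dates, phone numbers, and links in captured
// text with Foundation's NSDataDetector. The sample text is made up.
let captured = "Grand opening June 5 at 7 PM. Call 415-555-0123 or visit https://example.com"

let types: NSTextCheckingResult.CheckingType = [.date, .phoneNumber, .link]
let detector = try! NSDataDetector(types: types.rawValue)
let range = NSRange(captured.startIndex..., in: captured)

for match in detector.matches(in: captured, options: [], range: range) {
    switch match.resultType {
    case .date:
        print("Event date:", match.date ?? "unknown")      // could feed a Calendar event
    case .phoneNumber:
        print("Phone:", match.phoneNumber ?? "unknown")    // could offer a call action
    case .link:
        print("Link:", match.url?.absoluteString ?? "")    // could open in Safari
    default:
        break
    }
}
```

Each match type maps naturally onto the actions the article lists: a date becomes a calendar event, a phone number becomes a call or contact entry, a link becomes a tappable URL.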
Visual Intelligence can be used to scan a QR code. With QR codes, you don't actually need to snap an image, you simply need to point the camera at the QR code and then tap the link that pops up.
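For comparison, QR decoding is available to any app through the public Vision framework; this is a hedged sketch of that API, not Apple's Visual Intelligence internals (the function name is mine):

```swift
import Vision
import CoreImage

// Illustrative sketch: decoding a QR code from a camera frame with the
// public Vision framework. decodeQRCode is a hypothetical helper name.
func decodeQRCode(in image: CGImage) throws -> String? {
    let request = VNDetectBarcodesRequest()
    request.symbologies = [.qr]  // restrict detection to QR codes
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    // payloadStringValue holds the encoded text, e.g. a URL to open
    return request.results?.first?.payloadStringValue
}
```

In a live pipeline you would feed each camera frame to the handler, which matches the point-and-tap behavior the system feature provides.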
You can take a photo of anything and tap the "Ask" option to send it to ChatGPT along with a question about it. If you capture a picture of an item with Visual Intelligence and want to know what it is, for example, tap Ask and type "What is this?" to get an answer in the ChatGPT interface.
Visual Intelligence uses the opt-in ChatGPT Siri integration. By default, no data is collected, but if you sign in with an OpenAI account, ChatGPT can retain your conversations.
You can take a picture of any item that you see and tap on the "Search" option to use Google Image Search to find it on the web. This is a feature that's useful for locating items that you might want to buy.