Visual Intelligence, a feature that initially focused on analysing what we could see through the camera, now also lets us interact directly with whatever appears on our iPhone screen. From looking up information about objects to creating calendar events, Apple Intelligence is there to assist us. Let’s take a look at how to use Visual Intelligence with the content on our iPhone screen and everything we can achieve with it.
Visual Intelligence now understands what we see on the screen
Visual Intelligence used to be mainly about recognising elements in the physical world through the camera, but with the new features in iOS 26, Apple has expanded this capability so we can analyse any element shown on our iPhone, from a link in a screenshot to text we want to summarise. How do we use this feature? The process is extremely simple:
- We take a screenshot by pressing the side button and the volume up button at the same time.
- As soon as the screenshot interface appears, we’ll see the new Visual Intelligence options at the bottom of the screen.
From this point, the iPhone offers us different possibilities depending on the image’s content. We can use a finger to highlight just the part we’re interested in so that Apple Intelligence focuses on it, make quick queries, or even interact with visible links without leaving the screenshot view.
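Apple doesn’t document the internals of this analysis, but the first step, turning screen pixels back into text, can be approximated with the Vision framework. Here’s a minimal sketch of that idea, not Apple’s implementation; the `recognizeText` helper is our own hypothetical name:

```swift
import UIKit
import Vision

// Hypothetical helper: extract readable text from a screenshot image,
// roughly the kind of on-device recognition such a feature relies on.
func recognizeText(in screenshot: UIImage) throws -> [String] {
    guard let cgImage = screenshot.cgImage else { return [] }

    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // favour accuracy over speed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try handler.perform([request])

    // Each observation offers ranked candidates; keep the best string.
    return request.results?.compactMap { observation in
        observation.topCandidates(1).first?.string
    } ?? []
}
```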
What we can do with Visual Intelligence in a screenshot
Once the screenshot is on screen, the Visual Intelligence tools open up a wide range of possibilities. The most notable include:
- Search for similar items on the web to find products, related images or extra information with just one tap on the search button in the bottom right corner.
- Highlight a specific area and search only within that object, which is key when we want to focus on what truly matters in a busy screenshot.
- Ask questions about what’s on the screen by tapping the ChatGPT button in the bottom left corner to get quick explanations or extra details about the content.
- Add events to the calendar directly when the system detects dates or relevant information, by tapping the new event button that appears automatically when applicable (see the EventKit sketch after this list).
- Access detected links by tapping the link shown at the bottom. If there are several, we can swipe through them and open the one we’re interested in (the link-detection sketch below shows the general idea).
- Summarise the visible text in the screenshot to get a more concise version, using the text options that appear in the central area next to the links (a summarisation sketch also follows the list).
- Listen to the content aloud using the reading option, which is again part of the sliding menu (the speech sketch below illustrates this).
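On the developer side, creating an event from a detected date boils down to EventKit. This sketch assumes iOS 17 or later for write-only calendar access; the `addEvent` helper and the one-hour default duration are our own assumptions:

```swift
import EventKit

// Hypothetical helper: save a calendar event built from a date the
// system detected, similar in spirit to the screenshot suggestion.
func addEvent(titled title: String, startingAt start: Date) async throws {
    let store = EKEventStore()

    // Write-only access (iOS 17+) is enough for creating events.
    guard try await store.requestWriteOnlyAccessToEvents() else { return }

    let event = EKEvent(eventStore: store)
    event.title = title
    event.startDate = start
    event.endDate = start.addingTimeInterval(3600) // assume a one-hour slot
    event.calendar = store.defaultCalendarForNewEvents

    try store.save(event, span: .thisEvent)
}
```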
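The tappable links likely come from ordinary link detection run over the recognised text. A minimal sketch with Foundation’s NSDataDetector, where `detectLinks` is a hypothetical name of ours:

```swift
import Foundation

// Hypothetical helper: find URLs in text recognised from a screenshot,
// the kind of detection that could power the links shown at the bottom.
func detectLinks(in text: String) -> [URL] {
    guard let detector = try? NSDataDetector(
        types: NSTextCheckingResult.CheckingType.link.rawValue
    ) else { return [] }

    let range = NSRange(text.startIndex..., in: text)
    return detector.matches(in: text, options: [], range: range)
        .compactMap { $0.url }
}
```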
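Summarisation is the sort of task the on-device Apple Intelligence models handle. For illustration only, this sketch uses the FoundationModels framework Apple introduced alongside iOS 26; we’re assuming the model is available on the device, and the `summarize` helper and its prompt are our own:

```swift
import FoundationModels

// Hypothetical helper: ask the on-device model for a short summary.
// Requires a device with Apple Intelligence enabled.
func summarize(_ text: String) async throws -> String {
    let session = LanguageModelSession()
    let response = try await session.respond(
        to: "Summarise the following text in two sentences: \(text)"
    )
    return response.content
}
```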
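Finally, reading the content aloud maps onto long-standing system speech APIs. A minimal sketch with AVFoundation’s speech synthesiser; the `speak` helper and the hard-coded voice are assumptions of ours:

```swift
import AVFoundation

// Keep a strong reference; deallocating the synthesizer stops speech.
let synthesizer = AVSpeechSynthesizer()

// Hypothetical helper: read recognised text aloud, comparable to the
// screenshot view's listen option.
func speak(_ text: String) {
    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = AVSpeechSynthesisVoice(language: "en-US")
    synthesizer.speak(utterance)
}
```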
This set of features turns something as common as a screenshot into a gateway and context for artificial intelligence, whether that means Apple Intelligence’s models, the ChatGPT integration or smart searches via Google: valuable context that speeds up any query.
We take a screenshot, see the suggestions and act immediately. Privacy, of course, is maintained at all times, since no information leaves our device or Apple’s encrypted Private Cloud Compute until we tap the ChatGPT or Google search button.
When we’re done interacting with the screenshot, we can dismiss it with the small x or save it to Photos using the checkmark at the top of the screen. Either way, we’re working with a feature that integrates smoothly into our everyday iPhone use and significantly streamlines our interaction with artificial intelligence.
On Hanaringo | We can now go shopping inside ChatGPT