Apple said it’s bringing Visual Intelligence, its AI-powered image analysis tech, to the iPhone screen in iOS 26.
Apple says Visual Intelligence now makes it easier and faster to act on content displayed on your iPhone, and it works automatically with any app. For example, if you open a social media app and spot a gray jacket, you can invoke Visual Intelligence — by pressing the same button combination you use to take a screenshot — to run an image search for the jacket in Google Search and other apps you use frequently.
On-screen, Visual Intelligence also offers context-based shortcuts, like quickly adding an event to your calendar: it can extract the date, time, and location and pre-populate them in a calendar entry. There's also an option to send a screenshot to ChatGPT for analysis and additional information.
Apple’s head of software engineering, Craig Federighi, said that developers can integrate Apple Intelligence into their apps.
“For developers, you can use app intents to integrate search capabilities from your apps into this experience,” Federighi said onstage at WWDC 2025. “We’re also making it possible for you to search visually across your most-used apps using Visual Intelligence with the iPhone camera.”
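The "app intents" Federighi mentions refers to Apple's App Intents framework, which lets apps expose actions and searches to the system. As a rough illustration only — the intent name, parameter, and `searchCatalog` helper below are hypothetical, and the exact Visual Intelligence hooks Apple ships in iOS 26 may differ — an app-defined search intent looks roughly like this:

```swift
import AppIntents

// Hypothetical sketch: an app exposes a search action via App Intents.
// "ProductSearchIntent" and searchCatalog(_:) are illustrative names,
// not Apple API; only the AppIntent protocol shape is real.
struct ProductSearchIntent: AppIntent {
    static var title: LocalizedStringResource = "Search Products"

    // The query the system passes in — e.g. derived from what the
    // user highlighted on screen.
    @Parameter(title: "Query")
    var query: String

    func perform() async throws -> some IntentResult & ReturnsValue<String> {
        // Placeholder for the app's own search logic.
        let topResult = searchCatalog(query)
        return .result(value: topResult)
    }

    private func searchCatalog(_ query: String) -> String {
        // Stand-in implementation for illustration.
        return "Top match for \(query)"
    }
}
```

By declaring an intent like this, an app's search capability becomes available to system features such as the Visual Intelligence experience Federighi describes.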