Apple has introduced a new feature called Visual Intelligence with the iPhone 16. Based on the demo Apple showed during its September 2024 event, Visual Intelligence appears to be Apple’s version of Google Lens.
The new feature is activated by Camera Control, a new touch-sensitive button on the right side of the device. With just a click, Visual Intelligence can identify objects, provide information, and offer actions based on what you point it at. Aim it at a restaurant to instantly pull up menus and ratings, or snap a flyer for an event to add it directly to your calendar. Curious about a dog’s breed? Point and click to find out. Eyeing a bike for purchase? Click to search for it online.
Apple claims that Visual Intelligence is private, meaning the company doesn’t know what you clicked on.
This is a developing story…