Apple launched the new iPhone 16 series with a Camera Control button, which now supports new AI features.
The iPhone 16 series launched this year with a new Camera Control button that lets you activate the camera and even capture photos with a single tap. Now, Apple has added new features to this interface with the iOS 18.2 update, which rolled out recently.
The new version brings visual intelligence to the iPhone 16, 16 Plus, 16 Pro and 16 Pro Max models. It is essentially Apple’s version of Google Lens, but the company is using both ChatGPT and Google Search to power the feature.
iPhone 16 Pro Camera Control Update: What It Offers
People using the new iPhone models who have installed the iOS 18.2 update this week can see the new feature in the Camera app. When you open the camera, Apple greets you with a message about visual intelligence: “Learn about the objects and places around you and get more information about what you see.”
Apple also notes that the images your iPhone uses to identify objects and places will not be stored on the device and are only shared with Apple to process what’s in view.
– Click the Camera Control button to activate the camera
– Long-press the Camera Control button to open the camera in visual intelligence mode
– Move the camera around and tap the button to search for the image on the internet
– Apple will give you the option of asking ChatGPT about the query or running a Google Search for it
The company clearly states that when you use either of these platforms, their terms of service and privacy policies apply.
The visual intelligence feature works as advertised in most cases but still relies on external services, which we hope will change once Apple has its own AI tools to handle these tasks in future updates.