Google’s AI-based image recognition system, Google Lens, will be making its way into stock camera apps, the company revealed during its I/O 2018 conference this week. The tool, which identifies objects, landmarks, and more in the images you capture with your phone, was first revealed last year; it launched initially on Google’s own Pixel smartphones before arriving on Android and iOS more broadly.
For now, Google Lens is only available through the Google Photos app, but that will change in the coming months. The company plans to launch Google Lens in stock mobile camera apps, starting with its own Pixel handsets; other Android smartphones will get support later. According to The Verge, a total of 10 models will offer Google Lens in their stock camera apps.
Joining the expansion announcement are three new Google Lens features: smart text selection, style match, and real-time functionality for instant results.
With smart text selection, Google Lens can identify words within images and surface relevant information, such as pulling up details about a dish after the user captures an image of its name on a menu or other document. Style match, meanwhile, helps find objects similar to ones captured in an image, such as related outfits or home decor.
Finally, all of this information will arrive faster thanks to real-time functionality. With this addition, Google Lens proactively identifies and surfaces information about items in the user’s surroundings as they point their phone’s camera around.
Google says these features will start rolling out in the next few weeks.