With the Pixel 2 smartphone, Google introduced an exciting new software feature called Google Lens. Google Lens uses artificial intelligence to power its visual recognition algorithms and provides information about whatever your smartphone's camera is pointed at: for example, what type of flower you are looking at, or reviews and other information about a restaurant. You can also identify landmarks; look up movies, books, or works of art; and scan barcodes, QR codes, and business cards.
Unfortunately, the first implementation wasn't terribly easy or straightforward to use. You had to take a picture, open it in Google Photos, and then tap the Lens icon to trigger a Google Lens scan. That's too many steps for the feature to be as useful as it could potentially be.
Thankfully, Lens will soon be integrated into Google Assistant. When you open Assistant, a Lens icon will appear near the bottom right of the display. Tapping it opens a Google Lens camera; tap any object of interest in the preview window and the app will provide whatever information is available.
As usual, the new feature will be rolled out gradually. Pixel phones using Assistant in English in the United States, United Kingdom, Australia, Canada, India, and Singapore will be served first over the coming weeks, but we'd expect the new feature to make it to other regions soon after.