Google has announced that its new Lens feature, first shown at this year's I/O developer conference, will arrive as an update to the Assistant. It has already been available as part of the Google Photos app for over a month.
According to the search giant, Google Lens remains exclusive to Pixel smartphones for now, and there is no word on when it will reach the broader ecosystem of Android devices.
A spiritual successor to the Google Goggles app, Google Lens offers information based on visual analysis of an object. You can use Lens to identify text, landmarks, barcodes, art, books, movies and more. It uses artificial neural networks and machine learning to detect and identify what it sees. Integrated into Google Assistant, Lens can go even further and provide quick help based on its analysis, becoming a kind of visual search engine.
Here is what Google Lens with Google Assistant can do for you:
- Text: Save information from business cards, follow URLs, call phone numbers and navigate to addresses.
- Landmarks: Recognize landmarks and learn about their history.
- Art, books and movies: Learn more about a movie from its poster, look up a book to see its rating and a synopsis, or check out information about an artist.
- Barcodes: Quickly look up products by barcode, or scan QR codes.
Google revealed that the Lens integration in Assistant will roll out over the coming weeks to Pixel phones set to English as the system language in the US, UK, Australia, Canada, India and Singapore.
How to use Lens in Google Assistant
It is very simple. Once the Lens integration has arrived in your Assistant, just tap the Lens icon in the bottom-right corner to trigger your phone’s camera. Bring the object you want to scan into the frame and tap on it. Assistant and Lens will then analyze the object, offer you information and suggest related actions to perform.