During Google I/O 2018 today, a couple of recurring themes included the continued expansion of Google’s AI technology and the merging of that technology into more Google products. One of those combinations is Google Lens, the AI-driven software that lets users explore their world through the lens of their smartphone camera. Google is expanding Google Lens to work directly within the camera app on supported devices and gave it three new features to benefit users.
Google Lens was launched only one year ago at Google I/O, where it was added to Google Photos and Google Assistant. This gave users the ability to take a picture from their library and search for more information on the image. Now the camera app gets the same capability without a user having to jump out into Photos or the Assistant, for a much faster and more seamless experience. Devices that will benefit from this enhancement of the camera app obviously include Google’s own Pixel phones, as well as devices from LG, Motorola, Xiaomi, Sony, HMD Global (Nokia), Transsion, TCL, OnePlus, BQ and ASUS.
In terms of new features and capabilities, Google Lens will now be able to:
Perform smart text selection – instead of highlighting text on a web page or in a document and then copying, pasting, or querying for more information, users will be able to do the same thing with text that appears on objects the camera is pointed at. Google gives the example of a restaurant menu and the ability to quickly look up information on a dish you may not be familiar with. Other uses might include copying gift card codes or Wi-Fi connection information.
Style match – Google recognized that people sometimes spot an item whose style they like and want to use it merely as inspiration for further research. The question is really more along the lines of “what are some things like this item?” Google Lens will now be able to pull up options to answer that question.
Real-time information – by building the Google Lens technology directly into the camera app, users no longer have to first take a picture and then perform a query. Instead, they can simply pan their phone around, pointing the camera at whatever they find interesting, and pull up information in real time.
The improved Google Lens experience is slated to start rolling out to users in the next few weeks.
Be sure to stay with TalkAndroid for more coverage from Google I/O 2018.