Google Lens will work in real time, add smart text selection, and integrate into the camera app on multiple devices

It was a big day yesterday for the Android community, with a number of new features for popular Google services shown off during the keynote: changes coming in Android P, a Google News revamp powered by machine learning, and a nice little update for Google Assistant as well. A standout from the keynote was the latest version of Google Lens, which adds smart text selection, a Style Match feature to help search for similar items, and a real-time mode that can proactively surface information instantly.

First up is the new real-time feature (shown above), which integrates Google Lens directly into the camera application. Simply open the camera and point your smartphone at objects around you. Tiny dots appear on certain items as Google Lens works to figure out what they are, and it will place a colored dot on anything it thinks it has identified. Tap that dot to bring up information about the object you're looking at.

Next up is a new smart text selection feature (above) that can be used in a number of different situations. The idea is to point the camera at some text, whether it's a street sign, a restaurant menu, or even a full document. One example demonstrated yesterday showed a food menu where tapping a word on the screen brought up a Google search showing what that entree is, how it looks, and what ingredients it contains. Another example showed how quickly an entire document can be captured as a PDF just by pointing the camera at it for a couple of seconds.

The last new feature shown off yesterday at Google I/O 2018 for Google Lens is something the team calls Style Match (above). The idea is that you'll be able to point your camera at an object, like a piece of clothing or a lamp, and be shown items that look similar to it. While you may end up pulling up the exact item from an online store to purchase, you'll also be shown similar alternatives in case the exact item is too expensive (or not of high enough quality).

These are just a few of the new features coming to Google Lens in the next few weeks. Lens will be integrated directly into the camera applications on supported devices from LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ, Asus, and of course the Google Pixel.

Source: The Keyword