- Google Lens, a visual search tool that uses a phone’s camera to identify real-world objects, can now perform a few new tricks.
- Google unveiled three new improvements to Lens at its annual I/O conference on Tuesday.
- Among them is the ability to identify your surroundings in real time.
Among the slew of new products and features announced at this week’s Google I/O conference were three big additions to Google Lens, the company’s visual search feature.
Lens was initially announced at last year’s I/O event, when it was still in a beta phase of sorts. Since then, it has rolled out to Google Pixel 2 users, as well as to all Android and iPhone users through the Google Photos app as of this past March.
Google also announced at this year’s I/O that Lens will now be available through the camera app on supported devices from LG, Motorola, Xiaomi, Sony Mobile, HMD/Nokia, Transsion, TCL, OnePlus, BQ and Asus.
The three new Lens features will roll out to users in the next few weeks. Get a head start now and learn how they work:
1. Smart text selection lets you highlight a word on a menu and quickly pull up relevant photos and information.
According to Google, this works with any kind of text: recipes, gift card codes, WiFi passwords. For example, if you’re at a restaurant and see an unfamiliar dish on the menu, you can highlight its name and Lens will pull up a description, along with a photo to help you decide whether it appeals to you.
2. Google has jumped on the style match train.
Most features like this, such as Pinterest’s (which is also called Lens), identify an item and pull up similar matches, not exact ones. Google’s Lens claims to do both. You can use Lens on outfits and home decor pieces to get more information on a specific item, such as reviews, where to buy it and its price. Lens will also pull up similar styles.
3. Lens has upped its “Shazam” qualities.
It now works in real time, meaning you can pan your phone around the room and Lens will automatically identify items and pull up information about them. This includes home decor, furniture and books, as seen above.
In October, when Lens became available on the Pixel 2, users could run similar searches on things like artwork, landmarks, movies and even flyers posted along the footpath. It had its shortcomings, though: Lens struggled to identify handwriting, and search results weren’t always accurate or thorough.
As the feature rolls out steadily to users over the next few weeks, we’ll have to wait and see how much the company has sharpened and refined the “smart” magnifying glass technology in the past year.