Embedded Vision

Google Lens leverages computer vision and artificial intelligence to understand images

May 23, 2017
3 min read

At Google’s I/O Developer Conference, held May 17-19, CEO Sundar Pichai announced a new technology called "Google Lens" that uses computer vision and artificial intelligence to enable your smartphone to understand what you’re looking at and act on that information.

"So for example if you run into something and you want to know what it is—say a flower—you can invoke Google Lens from your assistant, point your phone at it and we can tell you what flower it is," said Sundar Pichai, Google CEO at the conference. "Or, if you’ve ever been at a friend’s place and you’ve crawled under a desk just to get the username and password from a Wi-Fi router, you can point your phone at it and we can automatically do the hard work for you."

He added, "Or, if you’re walking in a street downtown and you see a set of restaurants across you, you can point your phone, because we know where you are, and we have our Knowledge Graph, and we know what you’re looking at, we can give you the right information in a meaningful way."

Pichai noted that Google was built because they "started understanding text and web pages, so the fact that computers can understand images and video has profound implications for our core mission."

Google Lens will first ship in Google Assistant and Google Photos, and will come to other products in the future. Scott Huffman, Vice President, Assistant, also provided some details on how Google Lens can be used.

"So, last time I traveled to Osaka, I came across a line of people waiting to try something that smelled amazing. Now, I don’t speak Japanese so I couldn’t read the sign out front. But Google Translate knows over a hundred languages, and my Assistant will help with visual translation. I just tap the Google Lens icon, point the camera and my Assistant can instantly translate them into English."

Lens will reportedly roll out later this year. Many people are likely excited about Google Lens, but, as with Google Glass, which failed to take off seemingly due to privacy concerns, others may find it too intrusive as technology creeps further and further into people’s everyday lives. It will be one of numerous interesting storylines to follow over the next several months as artificial intelligence continues to proliferate.

View a transcript from the Google I/O Developer Conference.
View Pichai’s keynote from the conference.

Share your vision-related news by contacting James Carroll, Senior Web Editor, Vision Systems Design


About the Author

James Carroll

Former VSD Editor James Carroll joined the team in 2013. Carroll covered machine vision and imaging from numerous angles, including application stories, industry news, market updates, and new products. In addition to writing and editing articles, Carroll managed the Innovators Awards program and webcasts.
