Google Lens gets integrated into the camera app
Point your camera at real world text, then copy and paste the words. Or aim your lens at a snazzy dresser to call up similar clothing on your device.
Google is integrating its Google Lens augmented reality helper into the camera app, unlocking features like grabbing real-world text to copy and paste, and finding objects and clothing that match a style in front of you.
Starting next week, Lens will be integrated inside the camera app for Google's own Pixel phone as well as the new LG G7, plus other devices from makers like Motorola, Sony, Nokia, OnePlus and Asus, according to Aparna Chennapragada, Google's vice president of AR, VR and vision-based products. Chennapragada was presenting at Google I/O, the company's giant developer conference, which serves as one of Google's big annual events to unveil new products and services people can expect to see in the year ahead.
The world's two biggest phone makers, Apple and Samsung, weren't on the list of partners.
"This way it makes it super easy on things right in front of you already in the camera," Chennapragada said.
She demoed three new features available with the new Lens.
Smart text selection lets you copy and paste words from the real world into your phone. So if you're reading a book, you can tap to highlight text on the physical page through your phone's camera app. Or if you're looking at a menu full of unfamiliar dishes, you can tap on any word to pull up a card with information about that dish.
Style Match allows you to direct your camera at an object to find things that are similar, like a friend's lamp you want to copy in your decor or an outfit on a passerby that you want to replicate for yourself.
Chennapragada also provided an update on real-time results: Lens will use machine learning to anchor information on top of objects in your camera's view as you move. Over time, Google is aiming to overlay live results on top of things like storefronts or street signs. For example, if you point your camera at a concert poster, Lens will automatically offer a music video by that artist, which you can dive into with a tap, she said.
"This is an example of how the camera is not just answering questions, but it is putting the answers right where the questions are," she said.
Mixed, augmented and virtual reality together make up a buzzy category of technology that giants like Google believe will define the next era of computing. They all use a device -- whether smart glasses, a headset or a phone -- to put you in the middle of digital images. Those can be an all-encompassing virtual world, with VR, or digital overlays on top of the normal world around you, with AR and MR.
Google has been among the most active companies in VR and AR. It introduced Cardboard, its dirt-cheap 360-video viewer, in 2014 to turn your phone into a DIY virtual-reality device. Its more advanced Google Daydream headset arrived in 2016, also leveraging a mobile phone as the brains and screen for its VR experiences. Last year, Google announced plans for a standalone VR headset, the Lenovo Mirage Solo with Daydream, which finally arrived this month. And at last year's I/O, it unveiled Google Lens, a visual search tool that uses a phone's camera to overlay information about the objects it identifies through the viewfinder.