The opening keynote of Google I/O has just wrapped up, and it brought quite a few novelties: Google Photos updates, an Assistant smarter than ever, and the long-awaited Android P beta arriving on new devices. But those weren't the only announcements; there was also room for Lens, Google's visual search tool.
Lens was announced just a year ago, but only recently has it begun rolling out to more devices. Now Google has announced interesting improvements that address some of the weak points we had already spotted in everyday use. This is everything that changes.
Google Lens ‘live’ through the camera
In our review of Lens we already pointed out that one of this feature's weak points was having to use it from the Google Photos app. That is, you had to take a photo, open it, and tap the Lens icon to get the information. Lens integration was later extended to Assistant, but it still fell short of the idea of 'point and learn'.
With the new Lens, that is exactly what you do: point the camera at the object you want to recognize and the app does the rest. According to Google, this new feature is made possible by machine learning, running both on the device itself and using data in the cloud.
Improved text selection and Style Match
The second novelty Lens introduces is improved text selection, another of the weak points we noticed in its early version. In our experience, while it worked great with printed text, it often failed with handwritten letters.
For now, Google does not plan to improve handwriting recognition, but it will offer more text-selection options. For example, if we are at a restaurant and don't recognize a dish on the menu, Lens will show us a picture of what that recipe looks like. That is, beyond letting us copy text, it also understands it in order to offer additional information.
Finally, Google Lens moves into fashion with Style Match, a feature that lets us find similar clothing or decor items just by pointing the camera at them. And of course, it will point us to stores that sell them.
For now, Google has not given an arrival date for the new Lens, saying only that it will begin rolling out in the coming weeks. What we do know is that the camera integration will be available on devices from LG, Motorola, Xiaomi, Sony, Nokia, TCL, OnePlus, BQ and Asus.
More information | Google
Xataka Android | Android P Beta, all the news: more intelligent, more simple and more control
The news Google Lens (finally) integrated into the camera, improves text selection and helps you buy clothes was originally published in Xataka Android by Amparo Babiloni.
May 8, 2018