This is how Google Maps will use your phone’s camera to improve your location



This week Google Maps has begun testing its new augmented reality navigation, Live View. This new feature lets us see direction arrows overlaid on our surroundings while pointing the camera at them.

However, Google Maps will not use our camera only to paint the turn-by-turn navigation arrows with ARCore’s augmented reality: it will also use the camera as another sensor to improve location accuracy.

Global Localization

Over the last few years Google has tried to improve the accuracy of the Google Maps blue dot using GPS and the compass, but the physical limitations of both sensors make it difficult to achieve the best possible accuracy, especially in urban environments.

To solve that problem, Google is now experimenting with Global Localization, which combines the new Visual Positioning Service (VPS), Street View and machine learning to identify the position and orientation of our phone with greater precision.


GPS falls short in urban environments, especially around tall buildings. Satellite signals bounce off the buildings, which can place our phone’s location several metres away from where we actually are, sometimes even on another street.

GPS also cannot determine orientation, that is, which way we are facing. To remedy that, the phone’s compass is used, measuring the Earth’s magnetic and gravitational fields together with the relative motion of the device, but these readings are not entirely accurate either. The sensors are easily thrown off by magnetic objects in the environment, such as cars, pipes and buildings, or by the phone’s own electronic components, and can sometimes produce errors of up to 180 degrees.
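As a rough illustration of why the heading is so fragile, below is a minimal Kotlin sketch of how an Android app typically derives the azimuth from the accelerometer and magnetometer using the standard SensorManager helpers. The headingDegrees function and its parameters are illustrative assumptions, not Google’s code; any magnetic interference in the raw readings feeds straight into the resulting angle.

    import android.hardware.SensorManager

    // Derives the compass heading (azimuth) from raw accelerometer and
    // magnetometer readings, the same way the standard Android sensor
    // APIs compute it. Returns null if no rotation matrix can be built
    // (for example, while the device is in free fall).
    fun headingDegrees(accelerometer: FloatArray, magnetometer: FloatArray): Float? {
        val rotationMatrix = FloatArray(9)
        if (!SensorManager.getRotationMatrix(rotationMatrix, null, accelerometer, magnetometer)) {
            return null
        }
        val orientation = FloatArray(3)
        SensorManager.getOrientation(rotationMatrix, orientation)
        // orientation[0] is the azimuth in radians; convert to 0-360 degrees.
        val azimuthDegrees = Math.toDegrees(orientation[0].toDouble()).toFloat()
        return (azimuthDegrees + 360f) % 360f
    }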

Given these limitations of GPS and the compass, Google is testing a new, complementary technology that adds our phone’s camera to those two sensors. The new Google Maps Live View will be able to determine our location based on images rather than signals, thanks to the new Visual Positioning Service. The images captured by our phone are compared against the Street View imagery of the area and, with a machine learning algorithm, the position of the device can be determined with greater accuracy to guide people towards their destination.
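Google has not published the internals of the Visual Positioning Service, but the core idea of comparing a query image against a georeferenced Street View index can be pictured as a nearest-neighbour search over image descriptors. The Kotlin sketch below is purely conceptual: ReferenceImage, similarity and localize are hypothetical names, the descriptors are plain float vectors, and a real system would use learned features and full pose estimation rather than a single best match.

    import kotlin.math.sqrt

    // A Street View reference image with a known position and a
    // precomputed feature descriptor (a plain float vector here).
    data class ReferenceImage(val lat: Double, val lng: Double, val descriptor: FloatArray)

    // Cosine similarity between two descriptors of equal length.
    fun similarity(a: FloatArray, b: FloatArray): Double {
        var dot = 0.0; var normA = 0.0; var normB = 0.0
        for (i in a.indices) {
            dot += a[i] * b[i]
            normA += a[i] * a[i]
            normB += b[i] * b[i]
        }
        return dot / (sqrt(normA) * sqrt(normB))
    }

    // Picks the reference image whose descriptor best matches the query,
    // i.e. the most plausible camera position in this toy model.
    fun localize(query: FloatArray, references: List<ReferenceImage>): ReferenceImage? =
        references.maxByOrNull { similarity(query, it.descriptor) }

In this toy model the answer is simply the location of the best-matching reference image; the real service refines both position and orientation from many matched features at once.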

This technique also has a drawback: the image taken at that moment may differ from the one captured in Street View. Some elements change over time, such as billboards or trees. For this reason, the machine learning model has to learn to focus on the permanent structures in the scene, such as buildings. One simple way to picture that is shown in the sketch below.
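The idea can be reduced to dropping features detected on transient objects before matching and keeping only those anchored on stable structures. The LabeledFeature class and the label set here are illustrative assumptions, standing in for whatever scene understanding the real model learns.

    // Hypothetical feature carrying a semantic label from a segmentation step.
    data class LabeledFeature(val label: String, val descriptor: FloatArray)

    // Labels assumed, for illustration, to correspond to permanent structures.
    val stableLabels = setOf("building", "facade", "bridge")

    // Keep only features on stable structures before matching them against
    // the Street View index, ignoring billboards, trees, vehicles and so on.
    fun stableFeatures(features: List<LabeledFeature>): List<LabeledFeature> =
        features.filter { it.label in stableLabels }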


More information | Google

We also recommend

Hashtags come to reviews in Google Maps: this is how to use them well

Google Maps launches the "For you" tab in more than 130 countries: this is how you configure your new personalized recommendations

The phone's display has been devouring chassis and sensors, but it has not been easy


The article This is how Google Maps will use your phone’s camera to improve your location was originally published in Xataka Android by Cosmos.

