Google explains how the algorithm behind the Google Pixel's high-resolution zoom works

The Google Pixel is one of the best examples of what mobile photography can do. Despite keeping a single lens on the back, Google's devices manage to capture high-quality night shots and preserve detail when we zoom, all thanks to AI and algorithms, what we call "computational photography".

It all happens in the background the moment you press the shutter button, but behind every decision there is a series of processes that help make the end result as good as possible. This time, Google has published the paper explaining how the algorithm behind its high-resolution zoom works (Night Sight uses it too). We will break it down below, but it can be summed up simply: take several pictures and combine them.


Multiple RAW photos combined to obtain the best result

Let's start with the basics. As Google points out, "compared to DSLR cameras, smartphone cameras have a smaller sensor, which limits their spatial resolution; a smaller aperture, which limits their light-gathering ability; and smaller pixels, which reduce their signal-to-noise ratio". On top of that, they have to use color filter arrays, so each pixel captures only one color and the full image must be rebuilt by chromatic interpolation (a process known as "demosaicing"), which reduces resolution even further.
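To make that "demosaicing" step concrete, here is a minimal sketch of the conventional approach the quote alludes to (plain bilinear interpolation over an RGGB Bayer pattern, not Google's code): each pixel's two missing colors are averaged from its nearest same-color neighbors.

```python
import numpy as np

def bilinear_demosaic(raw):
    """Fill in the two missing colors at every pixel of an RGGB Bayer
    mosaic by averaging the nearest same-color samples."""
    h, w = raw.shape
    rgb = np.zeros((h, w, 3))
    mask = np.zeros((h, w, 3))
    # Each sensor pixel measured only one color (RGGB layout):
    rgb[0::2, 0::2, 0] = raw[0::2, 0::2]; mask[0::2, 0::2, 0] = 1  # R
    rgb[0::2, 1::2, 1] = raw[0::2, 1::2]; mask[0::2, 1::2, 1] = 1  # G
    rgb[1::2, 0::2, 1] = raw[1::2, 0::2]; mask[1::2, 0::2, 1] = 1  # G
    rgb[1::2, 1::2, 2] = raw[1::2, 1::2]; mask[1::2, 1::2, 2] = 1  # B
    out = np.empty_like(rgb)
    for c in range(3):
        # Normalized 3x3 box filter: average of whichever same-color
        # neighbors were actually sampled around each pixel.
        p = np.pad(rgb[:, :, c], 1)
        m = np.pad(mask[:, :, c], 1)
        acc = sum(p[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        cnt = sum(m[1 + dy:h + 1 + dy, 1 + dx:w + 1 + dx]
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1))
        out[:, :, c] = acc / np.maximum(cnt, 1)
    # Measured samples are kept exactly as captured.
    return np.where(mask > 0, rgb, out)
```

Since two thirds of each color plane is interpolated rather than measured, fine detail inevitably gets smeared, which is exactly the loss Google wants to avoid.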

What Google has done is replace this conventional color interpolation with a multi-frame, high-resolution algorithm that reconstructs the full RGB image from a burst of RAW frames (between seven and 15, depending on the light) taken from slightly different angles. But how can the angles differ if we hold still while taking the photo? Easy: by exploiting the hand tremor that occurs when shooting handheld.
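Before the frames can be combined, those tiny hand-tremor offsets have to be measured. As an illustration, here is a minimal sketch of one textbook way to estimate the shift between two frames, phase correlation; the article doesn't say this is Google's method, and the real pipeline aligns locally, tile by tile, rather than with one global shift.

```python
import numpy as np

def estimate_shift(base, frame):
    """Estimate the global (dy, dx) translation of `frame` relative to
    `base` by phase correlation: the normalized cross-power spectrum of
    two shifted images transforms back into a spike at the shift."""
    f0 = np.fft.fft2(base)
    f1 = np.fft.fft2(frame)
    cross = np.conj(f0) * f1
    cross /= np.maximum(np.abs(cross), 1e-12)
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts.
    if dy > base.shape[0] // 2:
        dy -= base.shape[0]
    if dx > base.shape[1] // 2:
        dx -= base.shape[1]
    return dy, dx
```

Note that this sketch only recovers whole-pixel shifts; the offsets that matter here are fractions of a pixel, which would require interpolating around the correlation peak.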

Image: The Google algorithm, summarized.

The process takes barely 100 milliseconds per 12-megapixel RAW frame. Once the raw frames are in, the algorithm picks a base frame (remember that each photo has a slightly different angle) and aligns and merges the rest of the pictures onto it. That is, the overlapping of the signals (aliasing) is used to reconstruct a higher-resolution signal.

The whole process takes just 100 milliseconds per frame and happens in the background without any user intervention
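A toy version of that align-and-merge step, assuming each frame is described by a single (dy, dx) offset relative to the base frame (the real algorithm handles local motion and much more), could splat every frame's samples onto a grid twice as fine as the sensor:

```python
import numpy as np

def merge_onto_fine_grid(frames, shifts, scale=2):
    """Splat burst frames onto a grid `scale` times finer than the sensor.

    `frames` is a list of HxW arrays and `shifts[i]` is the (dy, dx)
    offset, in pixels, at which frame i's grid sits relative to the base
    frame. Because hand shake gives each frame a different sub-pixel
    offset, their samples land on different cells of the fine grid, and
    averaging them recovers detail that no single frame holds.
    """
    h, w = frames[0].shape
    acc = np.zeros((h * scale, w * scale))
    cnt = np.zeros_like(acc)
    ys, xs = np.mgrid[0:h, 0:w]
    for frame, (dy, dx) in zip(frames, shifts):
        # Nearest fine-grid cell for every sample of this frame.
        fy = np.clip(np.rint((ys + dy) * scale).astype(int), 0, h * scale - 1)
        fx = np.clip(np.rint((xs + dx) * scale).astype(int), 0, w * scale - 1)
        np.add.at(acc, (fy, fx), frame)
        np.add.at(cnt, (fy, fx), 1)
    return acc / np.maximum(cnt, 1)  # cells nobody hit stay at zero
```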

Next, the algorithm detects the photo's gradients, that is, its corners, edges, and textures. Depending on the scene, it decides how much the resolution of the signal can be improved and how much each previously captured frame contributes to that improvement. This is where isotropic Gaussian radial basis functions come into play, but the basic idea is that the algorithm analyzes changes in the signal to locate the edges of objects and preserve the integrity of the photo.
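As a rough illustration of that idea (a deliberate simplification, not the paper's actual kernel construction), one can shape a Gaussian RBF weighting kernel from the local gradient structure tensor, so that it stays wide on flat areas and narrows across strong edges, keeping them sharp:

```python
import numpy as np

def local_kernel(patch, radius=2):
    """Gaussian RBF weighting kernel for the center of `patch`, shaped
    by the local gradients: wide on flat areas, narrow across strong
    edges so they are not smeared during the merge."""
    gy, gx = np.gradient(patch.astype(np.float64))
    # The 2x2 structure tensor summarizes edge strength and direction.
    tensor = np.array([[(gy * gy).mean(), (gy * gx).mean()],
                       [(gy * gx).mean(), (gx * gx).mean()]])
    evals, evecs = np.linalg.eigh(tensor)
    # Stronger gradient along an eigenvector -> smaller sigma that way.
    sigmas = 1.0 / np.sqrt(1.0 + evals)
    cov_inv = evecs @ np.diag(1.0 / sigmas ** 2) @ evecs.T
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    d = np.stack([ys, xs], axis=-1).astype(np.float64)
    # w(d) = exp(-0.5 * d^T C^-1 d) for every offset d in the window.
    expo = np.einsum('...i,ij,...j->...', d, cov_inv, d)
    kernel = np.exp(-0.5 * expo)
    return kernel / kernel.sum()
```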

Image: If the model is not applied, the photo comes out with imperfections, since we are combining several photos taken in motion.

An obvious question: what happens if we are moving? If we take a photo from a car, won't the moving elements end up misaligned when the photos are merged? Not at all. Google uses a local motion robustness model that builds masks over the moving elements and discards them from the merge. That way, the artifacts are removed and the picture comes out frozen.
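A bare-bones sketch of that robustness idea (a simplified stand-in, not the paper's actual model): compare each aligned frame against the base frame and smoothly fade out pixels whose difference is larger than sensor noise can explain, so moving objects are excluded from the merge.

```python
import numpy as np

def robustness_mask(base, aligned, noise_sigma=0.02):
    """Per-pixel weight in [0, 1]: close to 1 where an aligned frame
    agrees with the base frame, close to 0 where it still differs
    (e.g. an object that moved between shots)."""
    diff = np.abs(aligned - base)
    # Differences on the order of sensor noise are trusted; larger
    # ones fade out smoothly and are excluded from the merge.
    return np.exp(-(diff / (3.0 * noise_sigma)) ** 2)

# Usage: merged = mask * aligned + (1.0 - mask) * base
```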

Putting all these techniques together, Google manages to let us zoom without losing too much quality, even with a single lens and no dedicated telephoto. It is also the basis of Night Sight, which we have already put to the test and which proved to behave more than decently.

More information | Google



The article Google explains how the algorithm behind the Google Pixel's high-resolution zoom works was originally published in Xataka Android by Jose García Nieto.
