Google explains how the algorithm behind the Google Pixel's high-resolution zoom works

The Google Pixel is one of the leading exponents of mobile photography. Despite keeping a single lens on the back, Google's devices are able to capture high-definition night-time images and preserve detail when we zoom, all thanks to AI and algorithms: what we call "computational photography".

It all happens in the background as soon as you press the shutter button, but behind every decision there is a series of processes that help make the end result as close to perfect as possible. On this occasion, Google has published the paper that explains how the algorithm behind the high-resolution zoom works (it is also used by Night Sight). We will go into detail below, but it can be summed up in six words: take several pictures and combine them.

Multiple RAW photos combined to obtain the best result

https://www.youtube.com/watch?v=iDn5HXMQNzE

We start with the basics. As Google points out, "compared to DSLR cameras, smartphone cameras have a smaller sensor, which limits their spatial resolution; a smaller aperture, which limits their light-gathering ability; and smaller pixels, which reduce their signal-to-noise ratio". In addition, they have to use color filter arrays, interpolating the missing colors to reconstruct the image (a process known as "demosaicing"), which reduces the resolution even further.
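
To see what that means in practice, here is a minimal sketch in Python (ours, not Google's code) of an RGGB Bayer mosaic: each sensor pixel records only one of the three color channels, so two out of every three color samples have to be interpolated back.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate an RGGB Bayer color filter array: keep one channel per pixel."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # R at even rows, even cols
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # G at even rows, odd cols
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # G at odd rows, even cols
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # B at odd rows, odd cols
    return mosaic

rgb = np.random.rand(4, 4, 3)  # toy 4x4 RGB image
raw = bayer_mosaic(rgb)
# Two of every three color samples are missing and must be interpolated:
print(f"samples kept: {raw.size} of {rgb.size}")  # samples kept: 16 of 48
```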

What Google has done is replace this conventional color interpolation with a multi-frame, high-resolution algorithm that reconstructs the RGB image from a burst of RAW photos (between seven and 15, depending on the light) taken from slightly different angles. But how can they come from different angles if we hold still when taking a photo? Easy: by exploiting the natural hand tremor that occurs when shooting handheld.
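
To get an intuition for why tremor helps, here is a toy one-dimensional simulation (our own illustration with made-up numbers, not Google's implementation): each frame samples the scene on a coarse grid with a random sub-pixel offset, so the burst as a whole samples the scene far more densely than any single frame.

```python
import numpy as np

rng = np.random.default_rng(0)

def scene(x):
    # 25 cycles over [0, 1]: above a single frame's Nyquist limit (20 cycles
    # for 40 pixels), so any one frame records an aliased version of it.
    return np.sin(2 * np.pi * 25 * x)

n_frames, n_pixels = 12, 40
pitch = 1.0 / n_pixels

frames = []
for _ in range(n_frames):
    offset = rng.uniform(0, pitch)             # random hand-tremor shift
    xs = offset + pitch * np.arange(n_pixels)  # this frame's sample grid
    frames.append((xs, scene(xs)))

# One frame gives 40 samples; the burst gives 480 distinct sample positions
# of the same scene: enough density to recover the 25-cycle detail.
all_xs = np.concatenate([xs for xs, _ in frames])
print(len(np.unique(all_xs)))  # 480
```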

Image: the Google algorithm, summarized.

The process takes just 100 milliseconds per 12-megapixel RAW photo. Once it has the raw images, the algorithm picks a base frame (remember that each photo is taken from a slightly different angle) and aligns and merges the rest of the pictures with that base frame. In other words, the overlap between the aliased signals is used to reconstruct a signal of higher resolution.
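
Below is a heavily simplified align-and-merge sketch (our own, limited to whole-pixel shifts estimated by phase correlation; the real pipeline aligns at sub-pixel accuracy): each frame is registered against the base frame and the aligned burst is averaged, which already shows the noise benefit of merging.

```python
import numpy as np

rng = np.random.default_rng(1)
base = rng.random((64, 64))  # stand-in for the (noise-free) base frame

def estimate_shift(ref, frame):
    """Estimate the whole-pixel shift aligning `frame` to `ref`
    by phase correlation."""
    xcorr = np.fft.fft2(ref) * np.conj(np.fft.fft2(frame))
    corr = np.fft.ifft2(xcorr / (np.abs(xcorr) + 1e-12)).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    h, w = ref.shape  # unwrap circular offsets to signed shifts
    return (dy if dy <= h // 2 else dy - h,
            dx if dx <= w // 2 else dx - w)

# Simulate a burst: shifted, noisy copies of the base frame.
shifts = [(1, -2), (-3, 1), (2, 2)]
burst = [np.roll(base, s, axis=(0, 1)) + rng.normal(0, 0.05, base.shape)
         for s in shifts]

# Align every frame onto the base grid, then merge by averaging.
aligned = [np.roll(f, estimate_shift(base, f), axis=(0, 1)) for f in burst]
merged = np.mean(aligned, axis=0)
print(f"noise std of one frame:  {np.std(aligned[0] - base):.3f}")  # ~0.05
print(f"noise std after merging: {np.std(merged - base):.3f}")      # ~0.03
```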

The whole process takes just 100 milliseconds per frame and happens in the background without any user intervention

The algorithm then detects the gradients in the photo, that is to say, the corners, edges, and textures. Depending on the scene, it decides how much the resolution of the signal can be improved and how much each of the frames obtained above contributes to that improvement. This is where Gaussian radial basis function kernels come into play, whose shape adapts to the local structure, but the basic idea is that the algorithm analyzes changes in the signal to work out where the edges of objects are and to maintain the integrity of the photo.
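
Here is a rough sketch of that idea (our simplification, with invented k_detail and sigma parameters; not the paper's exact formulation): the structure tensor of a patch steers the covariance of a Gaussian kernel, keeping it round in flat areas and squeezing it across edges so they stay sharp.

```python
import numpy as np

def kernel_covariance(patch, k_detail=8.0, sigma=1.0):
    """Derive a 2x2 Gaussian kernel covariance from a patch's structure tensor."""
    gy, gx = np.gradient(patch.astype(float))
    # Structure tensor: summarizes local gradient strength and orientation.
    J = np.array([[np.mean(gx * gx), np.mean(gx * gy)],
                  [np.mean(gx * gy), np.mean(gy * gy)]])
    eigvals, eigvecs = np.linalg.eigh(J)  # ascending eigenvalues
    # Strong gradients shrink the kernel across the edge; it stays wide
    # along the edge, so noise is still averaged away in that direction.
    along = sigma
    across = sigma / (1.0 + k_detail * eigvals[1])
    return eigvecs @ np.diag([along**2, across**2]) @ eigvecs.T

flat = np.full((8, 8), 0.5)                       # textureless patch
edge = np.tile(np.repeat([0.0, 1.0], 4), (8, 1))  # vertical step edge
print(np.round(kernel_covariance(flat), 3))  # ~identity: round kernel
print(np.round(kernel_covariance(edge), 3))  # squeezed across the edge
```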

Image: without the robustness model, the photo would come out with artifacts, since we are combining several photos taken in motion.

A question one might ask is: what happens if we are moving? If we take a photo from a car, the moving objects will be smeared when the photos are merged, right? Not at all. Google uses a local motion robustness model that builds a set of masks in which moving elements are detected and excluded from the merge. That way, the artifacts are removed and the picture comes out frozen.
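
A toy version of such a robustness mask might look like this (our own reading of the idea, with invented noise_sigma and k parameters): pixels where an aligned frame differs from the base frame by more than sensor noise can explain get a weight near zero, so moving objects are kept out of the merge.

```python
import numpy as np

def robustness_mask(base, aligned, noise_sigma=0.05, k=3.0):
    """Per-pixel merge weight in [0, 1]: ~1 where frames agree,
    ~0 where the difference exceeds what sensor noise can explain."""
    excess = np.maximum(np.abs(aligned - base) - k * noise_sigma, 0.0)
    return np.exp(-excess**2 / (2 * noise_sigma**2))

rng = np.random.default_rng(2)
base = rng.random((32, 32))
aligned = base + rng.normal(0, 0.05, base.shape)  # static scene + sensor noise
aligned[10:20, 10:20] += 0.8                      # an object moved here

mask = robustness_mask(base, aligned)
print(f"static region weight: {mask[:8, :8].mean():.2f}")       # ~1.00
print(f"moving region weight: {mask[12:18, 12:18].mean():.2f}")  # ~0.00

# Weighted merge: where the mask is 0, the output falls back to the base frame.
merged = (base + mask * aligned) / (1.0 + mask)
```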

Putting all these techniques together, Google manages to let us zoom in without losing too much quality, even with a single lens and no dedicated telephoto. It is also the basis of the night mode, which we have already put to the test and which has proven to behave more than decently.

More information | Google

The article Google explains how the algorithm behind the Google Pixel's high-resolution zoom works was originally published in Xataka Android by Jose García Nieto.

