In an interview, Google engineer Marc Levoy talks about the camera software in the Pixel and Pixel XL. Levoy says HDR+ is meant to be enabled for every image; the mode is turned on by default.

HDR photography has historically been slow at times, but the Pixel does HDR without delay. Google solved the speed problem in an interesting way: by the time the user presses the shutter release, the pictures have already been captured.

The shutter press is used instead to decide which of the already-captured images are selected and then combined into a single photograph. This approach is made possible in part by the image signal processor in the Snapdragon 821.
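The zero-lag behavior described above can be sketched as a ring buffer that continuously receives viewfinder frames, with the shutter press merely selecting frames that already exist. This is a minimal illustration of the idea, not Google's actual pipeline; the class and method names are invented for the example.

```python
from collections import deque

class ZeroShutterLagBuffer:
    """Keep the last N frames; the shutter press picks from the past."""

    def __init__(self, capacity=9):
        # Old frames drop off automatically once capacity is reached.
        self.frames = deque(maxlen=capacity)

    def on_frame(self, frame):
        # Called for every frame the sensor delivers while the viewfinder runs.
        self.frames.append(frame)

    def on_shutter(self, burst_size=3):
        # The shutter press selects already-captured frames instead of
        # starting a new exposure, so there is no perceived delay.
        return list(self.frames)[-burst_size:]

buf = ZeroShutterLagBuffer()
for i in range(12):
    buf.on_frame(f"frame{i}")

burst = buf.on_shutter()
print(burst)  # the three most recent frames
```

The buffer never blocks on the shutter: capture and selection are decoupled, which is what removes the traditional HDR delay.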

HDR in the Google Pixel has other peculiarities as well. HDR usually means combining several images taken with different exposures; the Pixel camera instead underexposes every frame and uses math to produce a good end result.
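The underexpose-and-merge idea can be demonstrated numerically: each frame captures only a fraction of the light (protecting highlights from clipping), the frames are averaged to suppress noise, and a digital gain restores the intended brightness. This is a toy single-pixel sketch, not the real merge algorithm; the light fraction and noise level are made-up values.

```python
import numpy as np

rng = np.random.default_rng(0)
scene = 200.0  # "true" brightness of one pixel

# Nine deliberately underexposed frames: each captures 1/4 of the light,
# plus some per-frame sensor noise.
frames = scene / 4 + rng.normal(0, 5, size=9)

# Averaging the frames suppresses the noise; a digital gain of 4 then
# restores the intended brightness -- math instead of a long exposure.
merged = frames.mean() * 4
print(merged)  # close to 200
```

Because no single frame was bright enough to blow out highlights, the merged result keeps highlight detail that a single long exposure would have lost.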

Regarding the lack of optical image stabilization, Levoy argues that it is not necessary, since HDR+ takes several pictures with short exposures rather than a single one with a long exposure. The Google engineer concludes that he believes mobile cameras will become increasingly software-driven in the future.

Mathematically speaking, take a picture of a shadowed area – it’s got the right color, it’s just very noisy because not many photons landed in those pixels. But the way the mathematics works, if I take nine shots, the noise will go down by a factor of three – by the square root of the number of shots that I take. And so just taking more shots will make that shot look fine.

Maybe it’s still dark, maybe I want to boost it with tone mapping, but it will not be noisy. One of the design principles we wanted to adhere to was: no ghosts, ever. Every shot looks the same except for object motion. Nothing is blown out in one shot and not in another, nothing is noisier in one shot and not in another. That makes the alignment really robust.

The post "How Google achieved blazing HDR+ on the Pixel, Pixel XL" first appeared on Forums.