The Pixel's review embargo lifted earlier today, and reviewers have been very impressed with both the speed of the phone's 12.3MP shooter and the quality of the images it captures. In his review of the Pixel, David said it has "the best smartphone camera on the market." Marc Levoy, the lead of a computational photography team at Google Research, sat down with The Verge to discuss just how much the software does to make the Pixel's camera so damn good.

It's worth mentioning that Levoy is no rookie; before the Pixel, he worked on Google Jump (a 360-degree GoPro rig intended for VR), Google Glass's burst mode, and the Nexus line's HDR+ mode, and he's lectured on photography at Stanford. That's a pretty impressive resume. It's also worth noting that many of these features are made possible by the Snapdragon 821's Hexagon DSP.

Levoy says that HDR+ should be left on at all times, and that "[he] can't think of any reason to switch it off." He can say that because the main drawback of HDR+ has historically been speed; on the Nexus 6P and 5X, for instance, you'd get a delay of a few seconds after each shot. On the Pixel, however, HDR+ is effectively instant, and here's Levoy's explanation of why:

"The moment you press the shutter it's not actually taking a shot — it already took the shot. It took lots of shots! What happens when you press the shutter button is it just marks the time when you pressed it, uses the images it's already captured, and combines them together."
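What Levoy describes is essentially a zero-shutter-lag ring buffer: the camera captures frames continuously, and the shutter press only marks a timestamp and merges frames already in hand. Here's a minimal sketch of the idea in Python; the class name, `merge()` placeholder, and frame format are all hypothetical, not Google's actual pipeline.

```python
from collections import deque

class ZeroShutterLagBuffer:
    """Hypothetical sketch of zero-shutter-lag capture: keep the N most
    recent frames; a shutter press merely marks a time and merges frames
    that were already captured."""

    def __init__(self, capacity=9):
        # deque with maxlen silently drops the oldest frame as new ones arrive
        self.frames = deque(maxlen=capacity)

    def on_new_frame(self, timestamp, frame):
        # Called continuously by the camera pipeline, even before any shutter press.
        self.frames.append((timestamp, frame))

    def on_shutter(self, press_time):
        # No new capture happens here: just select frames taken at or before
        # the press and hand them to the merge step.
        chosen = [frame for (t, frame) in self.frames if t <= press_time]
        return merge(chosen)

def merge(frames):
    # Placeholder for HDR+'s align-and-combine step: here, a plain
    # element-wise average of the burst.
    return [sum(px) / len(frames) for px in zip(*frames)]
```

The key design point is that `on_shutter` does no capturing at all, which is why the shot feels instant: all the waiting happened invisibly, before you pressed anything.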


Some images from David's review.

HDR images are usually made by combining several shots bracketed at different exposures, but the way the Pixel does HDR is completely different. Instead of varying the exposure across the burst, it deliberately underexposes every shot (Levoy says the Pixel still captures good color in low light, and the noise can be removed) and then makes the result pretty with math wizardry.

"Mathematically speaking, take a picture of a shadowed area — it's got the right color, it's just very noisy because not many photons landed in those pixels. But the way the mathematics works, if I take nine shots, the noise will go down by a factor of three — by the square root of the number of shots that I take. And so just taking more shots will make that shot look fine. Maybe it's still dark, maybe I want to boost it with tone mapping, but it won't be noisy. One of the design principles we wanted to adhere to was no ghosts, ever. Every shot looks the same except for object motion. Nothing is blown out in one shot and not in the other, nothing is noisier in one shot and not in the other. That makes alignment really robust."
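Levoy's square-root claim is easy to check numerically: averaging N frames with independent noise cuts the noise's standard deviation by a factor of √N, so a 9-frame burst lands close to 3×. The sketch below uses Gaussian noise from the stdlib as a stand-in for sensor noise; the function names and parameters are illustrative, not from any real camera pipeline.

```python
import random

def noisy_frame(true_value, sigma, n_pixels, rng):
    # One underexposed frame: the true signal plus Gaussian noise as a
    # stand-in for sensor/shot noise.
    return [true_value + rng.gauss(0, sigma) for _ in range(n_pixels)]

def average_frames(frames):
    # Element-wise mean across the burst (a simplified stand-in for the
    # HDR+ merge; real merging also aligns frames first).
    return [sum(px) / len(frames) for px in zip(*frames)]

def noise_std(frame, true_value):
    # Standard deviation of the residual noise left in a frame.
    return (sum((p - true_value) ** 2 for p in frame) / len(frame)) ** 0.5

rng = random.Random(42)
single = noisy_frame(100.0, sigma=8.0, n_pixels=50_000, rng=rng)
burst = [noisy_frame(100.0, sigma=8.0, n_pixels=50_000, rng=rng) for _ in range(9)]
merged = average_frames(burst)

# Noise reduction from merging a 9-frame burst: close to sqrt(9) = 3.
ratio = noise_std(single, 100.0) / noise_std(merged, 100.0)
print(round(ratio, 1))
```

Note that averaging only recovers the noise, not the brightness: the merged frame is still as dark as the underexposed inputs, which is why Levoy mentions boosting it afterwards with tone mapping.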

When asked about the Pixel's lack of OIS, Levoy explained that HDR+ doesn't really need it, since the mode he wants customers using relies on short exposures that are far less susceptible to hand shake. This ties in well with his belief that photography will become more and more software-driven. If the Pixel can do this well already, there's no telling how good these software-based cameras could get in the future.