Right after Google finally made the Pixel 4 official today, the company also took a lot of time to talk about its camera. Two lenses on the back and a lot of computational magic are supposed to help you take better pictures with it than with its predecessors. You'll get a live HDR+ preview with dual exposure controls, learning-based white balancing, a wider-range portrait mode, and an option for astrophotography.

Googler Marc Levoy took the stage to tell us all about the computational photography the Pixel 4 is capable of. The first feature that should help you better judge your shots is the live HDR+ preview in the viewfinder, which approximates the finished, processed picture before you hit the shutter. You get what you see, basically.

Using dual exposure, you can create artistic images such as this one.

Dual exposure controls help you create more artistic images by letting you separately control highlights and shadows through their own sliders. The shadow slider doesn't change the exposure itself but the tone mapping, which affects darker areas much more than bright ones. You can see these in action in a video in our previous coverage.
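Google hasn't detailed the exact curve behind the shadow slider, but a minimal sketch of the idea, assuming a simple gamma-style tone curve, could look like this: dark values move a lot, bright values barely budge, and the exposure setting itself is left alone. The function name and curve below are purely illustrative.

```python
import numpy as np

def apply_shadow_slider(image, strength):
    """Lift or crush shadows with a tone curve instead of scaling exposure.
    strength > 0 brightens shadows, strength < 0 darkens them. The gamma-style
    curve here is an assumption for illustration, not the Pixel 4's actual one."""
    img = np.clip(image, 0.0, 1.0)       # work on values normalized to [0, 1]
    # A gamma curve moves values near 0 much more than values near 1,
    # which is what makes this tone mapping rather than an exposure change.
    gamma = 2.0 ** (-strength)           # e.g. strength 1.0 -> gamma 0.5
    return img ** gamma

# Example: lift the shadows of a (stand-in) frame without touching exposure.
frame = np.random.rand(4, 4)
lifted = apply_shadow_slider(frame, strength=0.7)
```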

Machine-learning-based white balancing was first used on the Pixel 3 for Night Sight shots. Google is expanding this computational photography feature to all camera modes on the Pixel 4. It should help you get truer colors, especially in tricky lighting situations involving multiple light sources or snow.

Portrait mode sees some improvements thanks to the dual-camera setup and machine learning improvements to dual-pixel PDAF. This should help the software handle hair and pet fur more easily and should allow you to take portraits of bigger objects and of subjects farther away. The bokeh effect sees some enhancements, too, and now more closely resembles what you'd get from an SLR.

Night Sight officially received the astrophotography feature we covered earlier. With the help of a tripod or some stable object to lean the Pixel 4 against, the phone can take long-exposure shots of the night sky, which can lead to some stunning imagery of the Milky Way, assuming you're far away from other light sources. It does so by exposing individual shots for up to 16 seconds and computing a final image made up of up to 15 individual frames. It can take the Pixel about four minutes to finish one of these shots.
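To put those numbers in context: 15 frames at 16 seconds each is 240 seconds of exposure, which lines up with the roughly four-minute capture time. Below is a hypothetical sketch of the frame-stacking idea only, not Google's actual pipeline (which also aligns frames and rejects outliers): averaging many noisy exposures suppresses random sensor noise, so faint stars stand out.

```python
import numpy as np

# The numbers from Google's presentation: up to 16-second exposures,
# up to 15 frames merged into one final image.
EXPOSURE_S = 16
FRAMES = 15
print(f"total capture time: {EXPOSURE_S * FRAMES} s (~{EXPOSURE_S * FRAMES / 60:.0f} min)")
# -> total capture time: 240 s (~4 min)

def merge_frames(frames):
    """Hypothetical stand-in for the merge step: averaging aligned frames
    reduces random sensor noise, which is what makes faint stars visible.
    The real pipeline also aligns frames and rejects outliers; omitted here."""
    return np.mean(np.stack(frames), axis=0)

# Simulate 15 noisy exposures of the same (already aligned) patch of sky.
rng = np.random.default_rng(0)
sky = np.zeros((64, 64))
sky[32, 32] = 0.2                                    # one faint star
shots = [sky + rng.normal(0, 0.1, sky.shape) for _ in range(FRAMES)]
merged = merge_frames(shots)                         # noise drops by ~sqrt(15)
```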

[Image gallery: sample shots provided by Google]

Hopefully, images you take yourself will look as good as these samples provided by Google.

Levoy also teased extremely high-dynamic range photos that are supposed to come to the Pixel 4 later through a software update. This could allow you to take an image of a scene lit by the moon without overexposing the natural satellite itself.

Google now also actively recommends pinching to zoom before you take an image on the Pixel 4. The company managed to improve Super Res Zoom, which uses the small shakes of your hands to reconstruct more detail in faraway subjects. The addition of the telephoto lens, which lets subjects appear closer without having to resort to digital zoom right away, helps with that, too. That means you'll be able to take relatively crisp images even when you zoom in digitally up to 8x.
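Google's actual merge algorithm is far more involved, but the core multi-frame idea can be shown in a toy 1-D example: each slightly shifted handheld frame samples the scene at a different sub-pixel offset, and combining them onto a finer grid recovers detail that no single frame contains. Everything below (function names, the naive merge, the assumed known shifts) is illustrative, not the Pixel's implementation.

```python
import numpy as np

def super_res_1d(low_res_frames, shifts, factor):
    """Toy 1-D illustration of the multi-frame idea behind Super Res Zoom:
    each handheld frame samples the scene at a slightly different sub-pixel
    offset, so together the frames contain more detail than any single one.
    This naive version just drops each frame's samples onto a finer grid;
    robust alignment and merging, as in the real pipeline, are omitted."""
    n = len(low_res_frames[0])
    high_res = np.zeros(n * factor)
    counts = np.zeros(n * factor)
    for frame, shift in zip(low_res_frames, shifts):
        # `shift` is this frame's sub-pixel offset, in high-res pixels.
        idx = np.arange(n) * factor + shift
        high_res[idx] += frame
        counts[idx] += 1
    counts[counts == 0] = 1
    return high_res / counts

# Example: a fine pattern sampled by two frames shifted by half a pixel.
scene = np.array([0, 1, 0, 1, 0, 1, 0, 1], dtype=float)   # stand-in "detail"
frames = [scene[0::2], scene[1::2]]                        # two coarse samplings
print(super_res_1d(frames, shifts=[0, 1], factor=2))       # recovers the pattern
```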