Smartphone cameras have come a long way, but can they ever take images that rival those from a "real" camera? According to Google software engineer Florian Kainz, the answer is yes. Using a custom camera app and some post-capture editing, Kainz shows what the camera sensors in the Pixel and Nexus 6P can do in low-light situations.

Kainz says his little project was inspired by a chat with Google's Gcam team, which focuses on computational photography. The most visible Gcam contribution to Google's products is HDR+, which allows phones like the Pixel to take photos in dim lighting by combining as many as 10 burst shots. HDR+ isn't perfect, though. Sometimes there just isn't enough light to produce a good image.

64 4-second exposures, shot with Nexus 6P

Kainz reasoned that, with long exposures and lots of individual frames, it should be possible to get much better photos out of a smartphone's tiny camera sensor. To test this, he built a simple camera app with manual control over exposure time, ISO, and focus distance. The image at the top, showing the Point Reyes Lighthouse at night, was Kainz's first test. It's a composite of 32 frames at a 4-second exposure and ISO 1600, taken with the Nexus 6P. He also covered the lens and captured 32 completely black frames. The RAW image files were exported to Photoshop, where the mean of the 32 lighthouse frames was computed and the mean of the 32 black frames was subtracted from it. Below you can see what a single HDR+ photo of the same scene looked like, for comparison. Impressive, no?
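If you're curious what that averaging and dark-frame subtraction looks like in practice, here's a minimal sketch in Python. It assumes the burst has already been converted to linear 16-bit TIFFs, and the file names and libraries are illustrative; Kainz did this step in Photoshop, not with a script.

```python
# Minimal sketch of frame averaging + dark-frame subtraction.
# Assumes the RAW frames were converted to linear 16-bit TIFFs beforehand;
# the file name patterns below are hypothetical.
import glob
import numpy as np
import imageio.v2 as imageio

def mean_stack(paths):
    """Average a set of same-sized frames in float to avoid clipping."""
    frames = [imageio.imread(p).astype(np.float64) for p in paths]
    return np.mean(frames, axis=0)

# Mean of the 32 light frames: random sensor noise averages out.
light = mean_stack(sorted(glob.glob("lighthouse_frame_*.tif")))

# Mean of the 32 lens-covered frames: estimates the black level and
# fixed-pattern noise, which is then subtracted from the light stack.
dark = mean_stack(sorted(glob.glob("black_frame_*.tif")))

result = np.clip(light - dark, 0, 65535).astype(np.uint16)
imageio.imwrite("lighthouse_stacked.tif", result)
```

Averaging N frames cuts random noise by roughly a factor of the square root of N, which is why 32 individually unimpressive exposures can add up to one clean image.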

A single 6P HDR+ frame

Kainz did more tests with a variety of editing tricks, which you can read about in the full post. Below are some of the photos taken with the Pixel, though that phone only supports a maximum exposure of 2 seconds. Kainz speculates that longer exposures could allow even better low-light photos on smartphones, assuming the subject is not moving and you have a tripod. Amusingly, the Pixel photos were taken in June and July of last year, long before the phone was announced.

64 2-second exposures, shot with Google Pixel

Many of the editing tweaks used to produce these images could be automated, too. You could imagine an Android app or a mode in the camera app that does most of this for you. Of course, processing a few dozen RAW files would take a while on a phone. Maybe it could be built into Google Photos? If you want to see more photos from Kainz's experiments, there's a Google Photos album.

64 2-second exposures, shot with Pixel

Source: Google Research