This article is part of a directory: Mobile Photography Week 2022: Android Police's celebration of smartphone cameras
Table of contents

Mobile photography has come a long way. Look back 10 years to 2012: smartphones were already the norm, but many people still relied on a trusty point-and-shoot camera on vacation. Sure, some great camera phones back then took above-average photos. But we couldn't have imagined today's multi-camera monsters, whose protruding lenses and computational photography prowess make our top-rated smartphones near full-fledged DSLR replacements for many folks.

But what will things look like about a decade from now? What will 2030's best camera phone be like, and what will power it? While we can't tell the future, we can make some educated guesses.


How did we get here?

Cameras were often treated as an afterthought in the early days of smartphones. The first iPhone featured a measly 2MP camera on its back in 2007, and even the first Android phone, the HTC Dream, offered nothing more than a single 3.2MP rear camera. Neither offered a selfie camera. You could take images with these in broad daylight, and they looked okay on the devices' small screens. However, you could forget about sharing them on bigger screens or using them for anything resembling professional work.

While there were some feature phones with unique and interesting takes on photography even before smartphones, the early smartphones needed to dedicate much more space to batteries, processors, and displays.

Source: Wikimedia

The Nokia Lumia 1020 with its camera grip accessory

Things started to change about eight to 10 years ago. In 2013, phones like the Nokia Lumia 1020 with its 41MP camera showed the world what smartphone cameras could do with the right combination of hardware and software. Setting aside the questionable dual-camera 3D phone fad, phones soon began to show what they could really do.

The Nexus 6P, with its proto-Pixel image processing, launched in 2015. The LG G5 showed the world its dual wide and ultra-wide camera setup in 2016, and Huawei revealed the P9 with its monochrome Leica auxiliary camera. Apple also showed off its first dual-camera phone with telephoto zoom capabilities, the iPhone 7 Plus, that same year. Even earlier, the 2014 HTC One (M8) was one of the first phones to use a secondary lens to capture auxiliary depth information and augment the main camera's image.

The Nexus 6P

The rest is history. Phones have gained ever more cameras and ever more complex algorithms to make the most of the raw image data, and that's where we are today.

What can the best camera phones do these days?

The best camera phones are all about multiple lenses and computational enhancements, eking every last bit of information out of physically constrained cameras limited by their small size.

The Honor Magic 4 Pro, for all its flaws, takes this computational photography trend to the extreme. The camera system constantly monitors the output from all the lenses on its back, and when you press the shutter button, it creates a composite from multiple cameras, based on which gives the best result. Honor calls this Ultra Fusion Photography. While the Magic 4 Pro falls behind on pure picture quality, it offers a glimpse at what the future of smartphone photography is all about: the right combination of hardware and software to overcome the physical flaws of small sensors and cameras that can't compete with DSLRs on a physical level.
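The core idea of this kind of per-pixel fusion can be sketched in a few lines. The following is a toy illustration, not Honor's actual pipeline: it assumes the frames are already aligned, and it simply weights each camera's sample by how far it sits from clipping to pure black or white, so the best-exposed source dominates the composite.

```python
def fuse_pixels(candidates):
    """Blend the same pixel as sampled by several cameras (0-255 values).

    Each sample is weighted by its distance from the nearest clipping
    extreme (0 or 255), so a well-exposed sample outweighs a blown-out
    or crushed one. The +1 keeps fully clipped samples from zeroing out.
    """
    weights = [min(v, 255 - v) + 1 for v in candidates]
    total = sum(weights)
    return sum(v * w for v, w in zip(candidates, weights)) / total


def fuse_frames(frames):
    """Fuse aligned frames (lists of rows of 0-255 values) pixel by pixel."""
    height, width = len(frames[0]), len(frames[0][0])
    return [[fuse_pixels([f[y][x] for f in frames]) for x in range(width)]
            for y in range(height)]
```

A real pipeline would also align the frames, match color between sensors, and weight by sharpness and noise, but the principle is the same: every output pixel is drawn mostly from whichever camera captured it best.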


The Honor Magic 4 Pro

The Pixel phones have arguably perfected this formula, proving in particular how important it is to iterate on things other than the camera module. 2018's Pixel 3 was the first in a long line of Pixels to use the Sony IMX363 sensor, which remained an integral part of Pixel hardware all the way through the Pixel 5 and the Google Pixel 6a, with the company only switching to a new hardware base for the flagship Pixel 6 and 6 Pro. Despite relying on the same camera hardware, Google improved image quality each year. That wasn't done solely in software, though, as the company also kept improving its image processing hardware.

Google started equipping its phones with a dedicated imaging chip in 2017: the Pixel 2's Pixel Visual Core, which evolved into the Pixel 4's Pixel Neural Core. Better imaging chips, with continually improved bandwidth, let photos be processed faster while delivering better results. The Pixel 5 was the oddball here, shipping without a custom imaging chip, though advances in the image processing of Qualcomm's regular processors made up for the loss.


In the right conditions, the resulting images can rival semi-professional DSLR photography, something we could only dream of 10 years ago.

What will 2030's best camera phone look like?

With all this in mind, we can make an educated guess at how this synergy of software and hardware will progress. We can't look into the future, but it's clear that manufacturers will keep refining existing camera hardware, with its stacked lenses, until they've wrung every last bit of performance out of it.

2030's best camera phone might surprise us from today's perspective with a smaller camera bump. Smartphone camera arrays keep growing to accommodate complicated stacked lenses, allowing for optical zoom capabilities and more. But according to Wired, this trend might reverse in the near future. A new lens technology that relies on a single flat lens made from one sheet of glass is looking to enter the market. The company behind it, Metalenz, uses what it calls optical metasurfaces: glass wafers covered in thousands of nanostructures, patterned to bend light and correct the usual shortcomings of single-lens cameras.

Source: Metalenz

This technology could also make its way to the front of 2030's phones. Metalenz promises that its technology can function as a 3D scanner for biometric face recognition, which would make the huge cutouts at the top of your smartphone (think iPhone X and later) redundant for this purpose.

The best camera phone of 2030 may not have a camera cutout in the screen at all. We're already seeing mainstream devices like the Samsung Galaxy Z Fold 3 opt for an under-display camera. While the quality of these cameras is currently only good enough for video conferencing, manufacturers are pouring money into research and development in this area, and the Galaxy Z Fold 4 already does a better job of hiding its camera. In the coming years, we expect further breakthroughs in making displays transparent enough, and software smart enough, to enable excellent under-display cameras.

Source: Metalenz

A Metalenz wafer

The under-display camera trend also means smartphone manufacturers are no longer limited on space for the front camera, and selfie cameras are becoming increasingly important: Gen Z smartphone owners use the front-facing camera more often than the rear-facing ones. It stands to reason that 2030's best camera phone will come with multiple front-facing cameras, at the very least two. There's precedent, too. The Pixel 3, for example, came with two front lenses, letting you take both regular close-up selfies and wide-angle group selfies.

Even if 2030's best camera phone doesn't opt for highly sophisticated under-display technology, its form factor might still let people take selfies with its best cameras. Foldables may be the norm for high-end phones in 2030, and we will likely see more form factors than the clamshell Samsung Galaxy Z Flip 4 and the book-style Z Fold 4, such as rollable or multi-fold phones. These new form factors, combined with extra screens, could make it possible to tuck in a cheap under-display selfie camera for video calls and facial recognition while the main camera array handles all those important selfies.

The Samsung Galaxy Z Fold 3's under-display camera up close

Another staple of social media apps might become a more prominent part of the camera experience in 2030. Even if they're not explicitly called that, the filters in Snapchat, Instagram, and TikTok are based on AR technology. With the computational advances we can look forward to over the next decade, filters might become a natural part of the images we take, making it hard to tell a filter apart from reality. To an extent, some smartphone manufacturers already offer just that: Chinese phone makers, in particular, love to set face-smoothing filters as the default option for selfies. While the results look somewhat unnatural, they're good enough for people who like taking such selfies daily.

Still, things could end up different from what we imagine today. CNN coverage from about 10 years ago, even if written with a healthy dose of sarcasm, imagined us using triangular smartphones and smart glasses rather than the flat rectangular slab of 2012, which has simply kept growing to today's size.

Lytro light field camera

Similarly, Lytro, a promising startup whose innovative camera let you refocus an image after you took it, went bankrupt in 2018. By then, it had already pivoted away from its so-called light-field cameras and invested more in VR than photography. Other experiments, like the 2014 Sony QX1 with its modular, DSLR-like lens-mounting system, never made it beyond experimentation and niche interest.

Hence, our idea of 2030's best camera phone might seem boring, but that's what the smartphone market is these days. There are barely any big breakthroughs left to make, and manufacturers focus on iterating on what works rather than trying to disrupt the market with something completely different. The Nothing Phone 1 is an example of this: apart from the flashy lights on its back, it doesn't change the idea of what a smartphone is, even if the company positions it as drastically different from the competition.

Source: Sony

Sony's QX10 lens-style camera, a sibling of the QX1

And really, once we expand the idea of what a smartphone is (a tool we carry everywhere that lets us communicate with the outside world), we can make the leap to another emerging form factor: smart glasses. With the option to show others the world as we see it, there might be a future where we can instantly share what we're looking at with friends and family, without the barriers of apps or the hassle of finding the perfect framing; the image will be exactly what we see. Throw in some metaverse ideas, and we might accompany our friends in VR while they're out and about.