Under-display cameras are poised to be The Hot New Feature, following up on the similar success of the in-display fingerprint sensor. It almost seems like science fiction: You can't see the camera, but it can see you. The technology promises to eliminate the last impediment to the all-screen phone dream. But how does it work, and when will you actually be able to buy a phone that has one?

To answer these questions, we sat down to pick the brain of OTI Lumionics' Michael Helander. His company makes some of the materials used in OLED display manufacturing (including for under-display sensors), so he's familiar with all the technical ins and outs, and he was happy to tell us how it works.

Diagram of the light passing through a standard OLED display.

Two types of screens

According to Helander, there are two engineering approaches to designing under-display cameras: You either do everything you can to make the entire display as transparent as possible above the camera, or you make tiny transparent holes between the pixels of an otherwise opaque screen.

In the first case, that means changing materials and rearranging things in the area above the camera. Certain metals in the various layers can be replaced by transparent conducting materials like indium tin oxide, and the structure of the display itself can be rearranged to reroute anything that might interfere with optimal transparency in that area. Anything that can't ultimately be moved or made transparent can be made as small as possible.

Diagram of the light that passes through a design optimized for transparency. 

There are a few limitations to this route, primarily brightness, uniformity, and resolution. Typically, OLED pixels are designed to be reflective on one side and transparent on the other, ensuring most of the light produced goes in one direction: toward you. Making the display transparent in one section interferes with that sort of design, and it can make the area over the camera look distinctly different and less bright than the rest of the screen. Compensating by cranking brightness and calibrating that tiny section differently can create long-term issues of its own, like burn-in around the camera. We're also told that all the rerouting and transparency-increasing steps often mean accepting a lower display resolution in that particular spot: a handful of big pixels in a sea of smaller ones. This is allegedly the approach ZTE is taking in devices like the Axon 20 5G.
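To get a feel for why that compensation is risky, here's a back-of-the-envelope sketch in Python. Every number is made up for illustration, and the aging law is a loose assumption (real OLED wear depends heavily on the emitter materials), but it shows how a modest light loss over the camera can snowball into noticeably faster wear:

```python
# Toy model of why boosting brightness over the camera region risks
# burn-in. All figures are illustrative assumptions, not measurements
# from any real panel.

transmittance = 0.6        # assumed: fraction of light surviving the stack
boost = 1 / transmittance  # drive level needed to visually match the panel

# Assumed aging law: wear accelerates roughly as drive ** n, with n
# somewhere around 1.5-2 depending on the emitter material.
acceleration_exponent = 1.8
relative_wear = boost ** acceleration_exponent

print(f"Required drive boost: {boost:.2f}x")          # 1.67x
print(f"Relative aging rate: ~{relative_wear:.1f}x")  # ~2.5x faster
```

In other words, even a modest brightness bump to hide the camera region can age those pixels at more than twice the rate of their neighbors, which is exactly the burn-in problem described above.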

The second method is a little different. Rather than making the entire display stack transparent across one area, you can carve out individual transparent "holes" between the pixels and rely on them to transmit light through the screen. You can do this in a few different ways, like cutting down on display resolution to make room for one hole in every X pixels, or just shuffling and rerouting things to create regular patterned gaps.

Diagram of the light that passes through with a patterned cathode.

As before, this means rerouting some components to ensure a clear line through the screen, but you don't have to worry about making the whole display stack transparent, just specific spots at regular intervals. If your resolution is low enough, you can accommodate these extra holes without any loss, but at very high pixel densities, it can mean giving up some pixels and accepting a lower resolution. Importantly, though, the individual pixels above the camera have the same brightness and performance characteristics as the pixels elsewhere on the display, so you shouldn't have as many issues with uniformity. This second route is what we're told Xiaomi is planning for its upcoming phones, and it sounds like it may work best out of the solutions available right now.
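The resolution tradeoff in this patterned approach is simple arithmetic. As a rough illustration (the density and hole spacing below are hypothetical, not figures from Xiaomi or anyone else), sacrificing one pixel in every four costs less perceived sharpness than the raw pixel count suggests, because linear density scales with the square root of pixel count:

```python
# Hypothetical numbers showing how carving transparent "holes" between
# pixels affects effective resolution in the camera region.

base_ppi = 400     # assumed panel density
hole_every_n = 4   # assumed: sacrifice one pixel in every 4 for a hole

pixels_kept = 1 - 1 / hole_every_n  # fraction of pixels remaining

# Linear resolution (pixels per inch) scales with the square root of
# pixel count per area, so the density drop is gentler than the pixel loss.
effective_ppi = base_ppi * pixels_kept ** 0.5

print(f"Pixels kept: {pixels_kept:.0%}")               # 75%
print(f"Effective density: ~{effective_ppi:.0f} ppi")  # ~346 ppi
```

In this made-up example, a 25% pixel sacrifice only drops the perceived density from 400 to roughly 346 ppi, which helps explain why this route can stay visually close to the rest of the panel.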

Cathode patterning: Note the evenly spaced "holes" light can travel more easily through. 

Other costs are involved

Now, whichever route manufacturers take, an under-display camera won't perform quite like a conventional front-facing one. Either way, the camera is going to get a little less light with more stuff in the way, and there are other optical effects these designs have to fight, like reflection and diffraction from the various materials, layers, and holes the light travels through. These problems can't be fully eliminated, but Helander tells us they can be reduced by advances in materials science and engineering and compensated for in software; he claims machine learning models already handle many of them pretty well. Some of these issues also result in "softer" looking images, mimicking some of the effects of the beauty filters so many people enjoy, so it isn't all bad.
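As a concrete (if heavily simplified) example of what that software compensation can look like, here's a minimal unsharp-mask sketch in Python. It's a stand-in only: real devices model the display stack's actual diffraction behavior, and per Helander increasingly lean on machine learning, while this generic sharpening just claws back some of that "softness":

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image: np.ndarray, sigma: float = 2.0,
                 amount: float = 1.5) -> np.ndarray:
    """Sharpen an image by boosting its high-frequency detail.

    A generic Gaussian stands in for the display stack's real (and far
    more complex) blur; pixel values are assumed to be in [0, 1].
    """
    blurred = gaussian_filter(image, sigma=sigma)
    detail = image - blurred  # the fine structure the blur removed
    return np.clip(image + amount * detail, 0.0, 1.0)

# Usage on a synthetic grayscale frame
frame = np.random.rand(480, 640)
restored = unsharp_mask(frame)
```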

This also opens a pretty big door for the future. Right now, most under-display prototypes just put the camera in the same place and extend the screen to cover it, but nothing stops manufacturers from applying these solutions across the entire display and putting the camera wherever they like. The camera could move down to the center of the screen, making it easier to maintain the effect of eye contact on video calls, or several cameras could sit under the screen in different places. Some day, other optical sensors, like the infrared cameras used for face unlock, could move under the screen too. Eventually, desktop monitors could do the same.

There are a lot of potential use cases beyond just taking selfies once all the kinks are worked out, but it won't happen overnight.

Mid-range first, flagships come later

Before this technology can replace the notch or the hole-punch cutout, it needs to be scaled up. And given the engineering costs involved, Helander tells us that, counterintuitively, we'll see it roll out in the mid-range market first. Right now, the sacrifices required in resolution and brightness mean the technology probably won't be a good fit for the flagship space, where customers expect the very best, for a while. A big gray square or circle on the screen at max brightness, a resolution drop in one corner, or an overall lower display resolution won't play at the thousand-dollar price point, but they're more acceptable in a mid-range product, and the details of ZTE's upcoming device lend further evidence to that argument.

In Helander's estimation, it could be 2022 or 2023 before the engineering problems are solved, production ramps up, the feature works its way up and down the market, and the technology goes mainstream. In the meantime, most of us will have to make do with actually being able to see our camera in a bezel, notch, or hole-punch cutout.