We’ve all been oohing and aahing over Material You and its fabulous vibes since Android 12 rolled it out last year. But it isn’t just beautiful; it’s a technical achievement, with surprising complexity behind those dapper accent colors. Google actually invented its own color space so it could account for perceptually accurate, contrasting “tones” when picking the wallpaper-derived colors that personalize your device’s palette.

Recently, I got the chance to sit down with Google’s James O’Leary, design engineer on the Android team and one of the architects behind Material You, to dive into how it all actually works, from color-picking to contrast-enhancing accessibility, and I even got a bit of the history behind how and why it was developed in the first place.

If you wish to make an apple pie from scratch…

Google’s been toying with the idea of dynamic color for a long time, according to O’Leary, with folks like Matias Duarte (VP of design and a subject of many Android memes) and other designers at Google working on “some really cool concepts for years and years,” covering a range from merely dynamic app colors down to the level of system integration Android 12 users on Pixels now enjoy.

But it was the Pixel 6 that really lit a fire under Google to deliver Material You:

“We felt there was a really powerful moment with the Pixel brand this year between the Tensor CPU, this gorgeous new device, and we wanted to have the software and the design play into this moment. So we saw this deep collaboration at the top level, an understanding across our hardware division and the designers there, Material and the designers there, and also Android (my org) and the designers there, that we were going to go all in and make it happen this year.”

The strategy would change a little during the development process — both from below as technical challenges appeared, and from above to fit with the changes in executive planning. But at its simplest, Material You wasn’t just meant to be a pretty coat of paint on top of Android, even though that’s how most of us see it. It’s actually a powerful tool for designers.

One of the most critical features Material You offers, and one we generally gloss over, is that the colors it chooses aren’t just attractive; they contrast correctly for their use cases in different parts of the UI. All that automation ultimately arrives at a palette of accent and base colors that designers might have chosen themselves, had they been willing to spend time matching colors against every background you’ve got on your phone. But the simplification is the point: generating a palette with usable relative contrast, automatically.

Anyone can pick a set of colors for a theme. Fewer people can pick colors that actually look good together. Fewer still can choose those colors knowing that, when it comes to contrast, text and buttons will always be perfectly legible no matter how or in what combination they’re used. That’s what Material You does, and it wasn’t easy.

As Carl Sagan once said, “If you wish to make an apple pie from scratch, you must first invent the universe.” Google, in its usual Googley fashion, ultimately decided that the best way to make dynamic themes work was to invent its own color space, different from any other color space that exists, just to have a perceptually accurate value for lightness it can use to select colors with perfect contrast in these dynamically generated themes.

It didn’t start there, though. At first, the engineers thought this would be a much simpler problem.

Hue, color, and tone

Google didn’t initially aim to create its own color space. In fact, the idea of a color space didn’t even enter into the equation at first. Engineers just wanted an automated system that could choose colors. But they ran into a bit of cruft left over from the early era of computing, which prevented a simple solution.

The HSL system (hue, saturation, lightness), developed in the 1970s to represent the RGB color space, is one of the default ways designers working digitally pick colors. And, unfortunately, it’s not very good.

“So it turns out that this thing we’ve been using for design in print, digital, what have you, over 50 years — anything that’s been done on a computer — is just not even close to being accurate.”

One example Google showed me (demonstrated below) compared colors that should all sit at a lightness value of 50 in HSL. Perceptually, though, many of them are closer to a lightness of 30 or 90. Color science is full of admittedly difficult, hard-to-summarize multi-dimensional concepts, but all of this is a complicated way of saying that colors which should be equally light in the HSL color space often aren’t, so their relative contrast is unpredictable.

[Image: a row of swatches that all share HSL lightness 50, yet vary wildly in perceived lightness]
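You can verify this yourself with a few lines of code. The Kotlin sketch below is my own back-of-the-envelope math, not anything from Google: it converts a few fully saturated HSL colors at lightness 50 to sRGB, then measures their perceptual lightness as CIELAB L*, the measure of lightness color scientists actually trust.

```kotlin
import kotlin.math.abs
import kotlin.math.roundToInt

// Convert HSL (hue in degrees, saturation and lightness in 0..1) to sRGB in 0..1.
fun hslToRgb(h: Double, s: Double, l: Double): DoubleArray {
    val c = (1 - abs(2 * l - 1)) * s
    val hp = (h % 360) / 60.0
    val x = c * (1 - abs(hp % 2 - 1))
    val (r, g, b) = when {
        hp < 1 -> Triple(c, x, 0.0)
        hp < 2 -> Triple(x, c, 0.0)
        hp < 3 -> Triple(0.0, c, x)
        hp < 4 -> Triple(0.0, x, c)
        hp < 5 -> Triple(x, 0.0, c)
        else -> Triple(c, 0.0, x)
    }
    val m = l - c / 2
    return doubleArrayOf(r + m, g + m, b + m)
}

// Perceptual lightness: relative luminance of the sRGB color, mapped to CIELAB L* (0..100).
fun lstar(rgb: DoubleArray): Double {
    fun linear(u: Double) = if (u <= 0.04045) u / 12.92 else Math.pow((u + 0.055) / 1.055, 2.4)
    val y = 0.2126 * linear(rgb[0]) + 0.7152 * linear(rgb[1]) + 0.0722 * linear(rgb[2])
    return if (y <= 216.0 / 24389.0) y * 24389.0 / 27.0 else 116.0 * Math.cbrt(y) - 16.0
}

fun main() {
    // All three of these are "lightness 50" according to HSL...
    for (hue in listOf(60.0, 120.0, 240.0)) {
        val l = lstar(hslToRgb(hue, 1.0, 0.5))
        println("HSL hue $hue at 50% lightness is actually L* ${l.roundToInt()}")
    }
    // ...but yellow prints ~97, green ~88, and blue ~32.
}
```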

That might not seem like an issue (anyone can use any colors they want), but as we stressed earlier, Google wanted to make sure the selected colors would have the right contrast for a consistent, reliable, and accessible experience. That means the easy HSL-based answer was out.

To address this issue, Google needed a color space that measured “tone,” the company’s chosen word for perceptually accurate lightness. With a mind for tone, the engineers could automate away the issue of contrast. In O’Leary’s words, “All you have to worry about for legibility is this tone dimension and how far apart the colors are in this dimension.”
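To make that concrete, here’s a minimal sketch, assuming (as Google’s documentation describes) that tone behaves like CIELAB L*. The contrast ratio between two colors then falls straight out of the gap between their tones, with hue and colorfulness free to be anything:

```kotlin
import kotlin.math.max
import kotlin.math.min

// Invert L* ("tone") back to relative luminance Y.
fun toneToLuminance(tone: Double): Double {
    val ft = (tone + 16.0) / 116.0
    return if (tone > 8.0) ft * ft * ft else tone / (24389.0 / 27.0)
}

// WCAG contrast ratio between two tones; hue and chroma never enter into it.
fun contrastRatio(tone1: Double, tone2: Double): Double {
    val y1 = toneToLuminance(tone1)
    val y2 = toneToLuminance(tone2)
    return (max(y1, y2) + 0.05) / (min(y1, y2) + 0.05)
}

fun main() {
    // A gap of roughly 40 tones reliably clears WCAG's 3:1 bar, and roughly 50
    // lands around 4.5:1, regardless of which hues the palette picked.
    println("Tones 40 vs 80: %.2f:1".format(contrastRatio(40.0, 80.0)))  // ~3.8
    println("Tones 40 vs 90: %.2f:1".format(contrastRatio(40.0, 90.0)))  // ~5.0
    println("Tones 0 vs 100: %.2f:1".format(contrastRatio(0.0, 100.0))) // 21.0, black on white
}
```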

The idea for tone was taken from a much more accurate color space model developed in the 1970s called LAB. Google had hoped it would be a drop-in replacement for HSL, and though it was better, it still wasn’t ideal: it was spatially limited in ways that produced inconsistent results for designers. Finally, the team tried a bleeding-edge and fairly radical color space called CAM16. It still wasn’t perfect, but it was close enough that Google could borrow its hue and colorfulness measures and bolt that concept of “tone” on top. Hue, color, and tone won out, and the HCT color space was created.

LAB was accurate and reliable, but the “stretched-out blues” and gaps in coverage were an issue for designers.

Color spaces are definite “XKCD Standards” territory, but Google doesn’t intend for HCT to replace any of the systems we’re used to elsewhere, and you probably won’t have to worry about whether your next TV supports it. HCT simply reduces the work a designer has to do when checking contrast for accessibility standards and better UI design, automating away what was once a tedious process and even heading off problems like color combinations that fail for users with vision deficiencies. It’s all about contrast.

Back in the day, when “skeuomorphic” design was the norm, contrast was almost built into the design process. Emulating real-life objects, which carried their own colors and naturally evolved interfaces in a vaguely implied three-dimensional space, provided the shading and contextual cues you needed for contrast. But the recent switch to a more minimal approach in software design introduced new challenges, and a rut for designers to fall into.

"Something that really bothered me was that after the industry switched over to “Flat design” in the early 2010’s, we kind of got stuck. Like, we could pair two colors, and it would usually just be this stark white, some sort of colorful color on top, depending on the platform, the app, that kind of thing. But when I got to Google and started working with designers as an engineer, I realized that they were bound by this contrast rule, they didn’t have the tools to be able to express that rule until it was almost too late. By the time you’re doing contrast checking and accessibility, you’re getting ready to ship the software. And in order to unlock the full range of colors — “free” design, so they weren’t in this messy place where they had to pick color pairs, and they couldn’t even really know up-front what pairs would work — we had to fundamentally make it possible to talk about contrast and lightness.

“One analogy I think of is ‘fish don’t talk about how they’re swimming around in water.’ We needed to make contrast and tone part of the system in order to unlock possibilities past this stasis of flat design.

“We really went all-out with it as soon as we figured out the rules; we were like, ‘okay, let’s make almost infinite numbers of variations, programmatically, personalized to the user.’ But that’s the core thing there: You need to invent the color space so people can finally talk about the problem.”

Talking about the issue still wasn’t enough. Converting colors between spaces so they can actually be displayed, what color scientists call “gamut mapping,” was another problem of its own, one of many that came up as the clock ticked away during Android 12’s development. “Oh no!” O’Leary said of his months developing the bits and pieces that went into it, “I have discovered yet another rabbit hole, or yet another problem.”
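Gamut mapping sounds exotic, but the core move is easy to sketch: if a requested color can’t be displayed in sRGB, hold its lightness and hue steady and walk its chroma down until it can. HCT’s actual solver does this in CAM16, and far more efficiently; the toy below (my own code) shows the same trade-off in the simpler CIELAB LCh space.

```kotlin
import kotlin.math.PI
import kotlin.math.cos
import kotlin.math.sin

// CIELAB (D65) -> gamma-encoded sRGB. Out-of-gamut colors land outside 0..1.
fun labToSrgb(l: Double, a: Double, b: Double): DoubleArray {
    fun fInv(t: Double) =
        if (t > 6.0 / 29.0) t * t * t else 3.0 * (6.0 / 29.0) * (6.0 / 29.0) * (t - 4.0 / 29.0)
    val fy = (l + 16.0) / 116.0
    val x = 0.95047 * fInv(fy + a / 500.0)
    val y = fInv(fy)
    val z = 1.08883 * fInv(fy - b / 200.0)
    fun gamma(u: Double) = if (u <= 0.0031308) 12.92 * u else 1.055 * Math.pow(u, 1.0 / 2.4) - 0.055
    return doubleArrayOf(
        gamma(3.2406 * x - 1.5372 * y - 0.4986 * z),
        gamma(-0.9689 * x + 1.8758 * y + 0.0415 * z),
        gamma(0.0557 * x - 0.2040 * y + 1.0570 * z)
    )
}

fun inGamut(rgb: DoubleArray) = rgb.all { it in 0.0..1.0 }

// Gamut-map by holding lightness and hue fixed and reducing chroma until the
// color is displayable. (A binary search would be faster; this is the idea.)
fun mapToGamut(lightness: Double, chroma: Double, hueDeg: Double): DoubleArray {
    val rad = hueDeg * PI / 180.0
    var c = chroma
    while (c > 0) {
        val rgb = labToSrgb(lightness, c * cos(rad), c * sin(rad))
        if (inGamut(rgb)) return rgb
        c -= 0.5
    }
    return labToSrgb(lightness, 0.0, 0.0) // worst case: the gray at that lightness
}
```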

But finally, building on all the work done with color on Android over the last half-decade, Material You landed with Android 12.

How it picks the colors

Material Design isn’t a black box of machine learning-powered magic; it was precisely crafted to produce the results that it does. You might actually be surprised to learn the first stage of the color picker shares one of its most basic features with one of the oldest image formats used on the internet — one that no one can agree how to pronounce: “the algorithms there are pretty much the same algorithms that reduce the colors in a GIF to compress it.”

Monet's "Impression, Sunrise" as an example. Left: Original image. Right: Reduced to its main colors

First, Material You distills your background into 256 main colors. Then it scores each of those colors twice: once based on colorfulness, and once based on how much of the overall image shares its hue, “giving you a good indicator of if this color is expressing a large part of the image. You don’t just pick the one ‘pop-y’ pixel in the image, so you have to include this population-based component, too.”
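A toy version of that first step fits in a dozen lines. This is emphatically not Google’s quantizer (the production one belongs to the same family of algorithms GIF encoders use), but it shows the shape of the job: collapse millions of pixels into a few hundred counted colors.

```kotlin
// Bucket every pixel into a coarse RGB grid (top 5 bits per channel) and
// count how often each bucket occurs, keeping the most common buckets.
fun quantize(pixels: IntArray, maxColors: Int = 256): Map<Int, Int> {
    val counts = HashMap<Int, Int>()
    for (argb in pixels) {
        val key = argb and 0x00F8F8F8 // drop alpha and the low 3 bits per channel
        counts[key] = (counts[key] ?: 0) + 1
    }
    return counts.entries
        .sortedByDescending { it.value }
        .take(maxColors)
        .associate { it.key to it.value } // color -> pixel population
}
```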

“So we go through all the colors we extracted, we score them based on that colorfulness and percentage component, and then the top four are presented in the color picker. But anyone that’s used a Pixel will also point out, there’s not always four colors there. We also filter out some colors — those that aren’t colorful, close to grayscale.”
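And here’s a hypothetical sketch of that scoring pass. The hue window, the weighting, and the grayscale cutoff below are illustrative stand-ins rather than Google’s actual numbers:

```kotlin
import kotlin.math.abs

data class Candidate(val argb: Int, val hue: Double, val chroma: Double, val population: Int)

// Shortest distance between two hue angles, in degrees.
fun hueDistance(a: Double, b: Double): Double {
    val d = abs(a - b) % 360.0
    return if (d > 180.0) 360.0 - d else d
}

fun pickSeeds(candidates: List<Candidate>, maxSeeds: Int = 4): List<Candidate> {
    val total = candidates.sumOf { it.population }.toDouble()

    // Fraction of all pixels whose hue lands near this candidate's hue:
    // the "population" component that keeps one pop-y pixel from winning.
    fun hueShare(c: Candidate) = candidates
        .filter { hueDistance(it.hue, c.hue) < 15.0 }
        .sumOf { it.population } / total

    return candidates
        .filter { it.chroma >= 5.0 } // drop colors too close to grayscale
        .sortedByDescending { it.chroma + 100.0 * hueShare(it) } // colorfulness + population
        .take(maxSeeds) // the top few become the wallpaper-picker options
}
```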

This is how the “source color” is chosen, and from there, the “core palette” is developed, based on “tonal palettes” — all a lot of complicated color jargon for saying Google looks at hue, color, and tone to pick sets of colors that will go well with the source.
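The open-source Material Color Utilities make that structure concrete. Roughly speaking (the chroma recipe below is lifted from a reading of that code, so treat the exact values as illustrative), the source color’s hue and chroma spawn five tonal palettes, each of which can then be sampled at any tone:

```kotlin
// One tonal palette: a fixed hue and chroma, sampled at the standard tone stops.
data class TonalPalette(val hue: Double, val chroma: Double) {
    val toneStops = listOf(0, 10, 20, 30, 40, 50, 60, 70, 80, 90, 95, 99, 100)
}

// The "core palette": five tonal palettes derived from one source color.
fun corePalette(sourceHue: Double, sourceChroma: Double) = mapOf(
    "primary" to TonalPalette(sourceHue, maxOf(48.0, sourceChroma)),
    "secondary" to TonalPalette(sourceHue, 16.0),                 // same hue, muted
    "tertiary" to TonalPalette((sourceHue + 60.0) % 360.0, 24.0), // rotated hue
    "neutral" to TonalPalette(sourceHue, 4.0),                    // near-gray, tinted
    "neutralVariant" to TonalPalette(sourceHue, 8.0)
)
```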


They’re all tossed together and processed further to fill specific “roles” in the UI, and bam: you’ve got a set of colors for buttons, one for backgrounds, another for icons, and everything else you need, all with contrast ratios that already meet accessibility requirements. You can actually see part of the generated palette in Android 12’s Easter egg.
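For a flavor of what those role assignments look like, here’s a sketch following Material 3’s documented tone mapping. Notice that every foreground/background pair sits at least 40 tones apart, which is where the guaranteed contrast comes from:

```kotlin
// A UI role is just a palette plus the tone to sample it at.
data class Role(val palette: String, val tone: Int)

val lightScheme = mapOf(
    "primary" to Role("primary", 40),
    "onPrimary" to Role("primary", 100),         // 60 tones above primary
    "primaryContainer" to Role("primary", 90),
    "onPrimaryContainer" to Role("primary", 10), // 80 tones below its container
    "background" to Role("neutral", 99),
    "onBackground" to Role("neutral", 10)
)

// Dark mode flips to the other end of each tonal palette.
val darkScheme = mapOf(
    "primary" to Role("primary", 80),
    "onPrimary" to Role("primary", 20),
    "primaryContainer" to Role("primary", 30),
    "onPrimaryContainer" to Role("primary", 90),
    "background" to Role("neutral", 10),
    "onBackground" to Role("neutral", 90)
)
```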

All this ultimately gets distilled into a set of light and dark color themes automatically generated by your Android 12-powered phone, replacing boring presets with living, breathing themes that change over time. It’s a pretty killer vibe compared to how things used to be.

The end result is simple, but it doesn’t automate away everything a designer does. While it makes the job of building a color palette with the right relative contrast simpler, it also brings challenges of its own.

“There’s still that place for design, and it’s probably the most important part of this. Because I can write any formulas I want. The perfect contrast is just black on white — we can just make our phones black on white, completely optimize the algorithms, and [have] a quote-unquote ‘optimal UI.’ But though the computational part is important, that doesn’t mean the design part goes away. If anything, it’s more important and more of a challenge. Because now these designers have to think in a formula-driven, engaging-with-engineers type of world, where before you could just go to Figma, pick the red you liked, and you were done.”

Material You in more places

Most of us know Material You from its appearance on Pixels with the Android 12 update. Many have claimed that Google’s implementation is proprietary and that other companies hoping to build their own versions will need to wait on Google to release it. But, according to O’Leary, those tools have been ready and waiting for other developers and OEMs for months on GitHub.

The Material Color Utilities are available for everyone from smartphone makers to homebrew app developers, and they will be expanded to cover more platforms and modules over time. New features are coming, too, like “harmonization,” which dynamically adjusts even more colors to fit a source; a gradient-rotating tool for realistic gradient blending; and a tool for computing shadows to highlight content on top of a background.
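Harmonization is easier to show than to describe. The hue-only toy below nudges an app’s brand color partway toward the wallpaper’s hue, capped so the brand stays recognizable. The half-distance-up-to-15-degrees rule mirrors my reading of the open-source utilities, but the real thing operates on full HCT colors, so treat the specifics as illustrative:

```kotlin
import kotlin.math.abs
import kotlin.math.min

// Signed shortest rotation from hue a to hue b, in (-180, 180].
fun shortestAngle(a: Double, b: Double): Double {
    var d = (b - a) % 360.0
    if (d > 180.0) d -= 360.0
    if (d <= -180.0) d += 360.0
    return d
}

// Rotate the design hue halfway toward the source hue, at most 15 degrees.
fun harmonizeHue(designHue: Double, sourceHue: Double): Double {
    val diff = shortestAngle(designHue, sourceHue)
    val rotation = min(abs(diff) / 2.0, 15.0)
    val direction = if (diff >= 0) 1.0 else -1.0
    return (designHue + direction * rotation + 360.0) % 360.0
}

fun main() {
    // A brand red (hue ~25) on a blue wallpaper (hue ~282) shifts toward
    // blue, but only by the 15-degree cap: it stays a red.
    println(harmonizeHue(25.0, 282.0)) // 10.0
}
```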

Material You will be expanding to more phones from companies like Samsung and OnePlus in the future, too (which makes sense, given it may soon be a requirement).

At the end of our interview, O’Leary told me, half-joking, that after working on it for well over a year, his favorite feature in Material You is the simple fact that “it shipped.” But once he’d collected his thoughts, I think he put what we all love about it best:

“It’s just so cool to have a phone that doesn’t feel like this staid piece of modern art that’s stuck on the same presentation. It feels alive! There’s this full range of expression… But it’s so cool to have a phone that’s able to be both really elegant and deeply designed, and also have it be joyful — it’s not just “here’s red on white, yellow on white, green on white.” Instead, there’s a sense of infinite possibilities and playfulness.”