Google is "upgrading" how key attestation works on Android, implementing what it calls Remote Key Provisioning. I know this sounds like just another overly technical security thing (and it is), but it could have a real impact on customers by establishing a longer but more secure chain of trust. Perhaps best of all, this change could fix a lot of the little issues Android phones run into as a result of hardware-backed security problems, like recovering more gracefully from security vulnerabilities. It might even be how Google fixed the Widevine issue on Pixels that broke HD playback in apps like Netflix for over a year.

This is a gross oversimplification, but the way Android used to handle attestation was relatively simple. There's a key stored on basically every Android phone, inside a secure element and kept separate from your own data — separate from Android itself, even. The bits required for that key are provided by the device manufacturer when the phone is made, signed by a root key provided by Google. In more practical terms, apps that need to do something sensitive can prove that the bundled secure hardware environment can be trusted, and this is the basis on which a larger chain of trust can be built, allowing things like biometric data, user data, and secure operations of all kinds to be stored or transmitted safely.
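
If you're curious what that looks like from an app's perspective, here's a minimal sketch of how an app can ask the Android Keystore for a hardware-backed key with an attestation challenge and then pull out the certificate chain that leads up to Google's root. The alias and challenge value are made up for illustration; in a real flow the challenge would come from the server doing the verification.

```kotlin
import android.security.keystore.KeyGenParameterSpec
import android.security.keystore.KeyProperties
import java.security.KeyPairGenerator
import java.security.KeyStore

// Hypothetical alias and challenge; a real challenge comes from your server.
val alias = "attestation_demo_key"
val challenge = "server-provided-nonce".toByteArray()

val spec = KeyGenParameterSpec.Builder(alias, KeyProperties.PURPOSE_SIGN)
    .setDigests(KeyProperties.DIGEST_SHA256)
    .setAttestationChallenge(challenge) // ask the secure hardware to attest this key
    .build()

val generator = KeyPairGenerator.getInstance(
    KeyProperties.KEY_ALGORITHM_EC, "AndroidKeyStore")
generator.initialize(spec)
generator.generateKeyPair()

// The returned chain runs from the freshly generated key up toward a Google root —
// that chain is what a server verifies to decide the device's hardware is trustworthy.
val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
keyStore.getCertificateChain(alias)?.forEach { println(it) }
```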

Previously, Android devices that wanted to support this process needed to have that key securely installed at the factory, but Google is changing from in-factory private key provisioning to in-factory public key extraction with over-the-air certificate provisioning, paired with short-lived certificates.

As even that description makes it sound, the new approach is a more complicated system, but it fixes a lot of issues in practice. Previously, device makers had to "install" those keys securely themselves when a phone was manufactured, and even if you trust Samsung or Motorola to do that, it's a point of failure. What if that key is logged, or what if the environment it's stored in is later found to be vulnerable to some kind of attack after it leaves the factory? The whole chain of trust that Android and customers rely on breaks down, with no easy way to fix it.

This new system uses a public/private key pair generated on the device when it's made. The private part of the key never even leaves the device, so it never has to be installed or exist on potentially insecure hardware, and smartphone manufacturers don't have to worry about it anymore. The public half of the key is sent to Google, and this simple public/private keypair serves as the foundation for all later provisioning. Using these keys, Google will generate a series of short-lived certificates, allowing the Android Keystore to use them for authentication instead. Those certificates are rotated and expire regularly, so every app that needs to do something securely can get its own.
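
The change is invisible to apps, which keep calling the Keystore the same way; what changes is the shape of the certificate chain that comes back. As a rough sketch, reusing the hypothetical alias from the earlier example, you can walk that chain and see each certificate's validity window — which is where the "short-lived" part of remote provisioning shows up:

```kotlin
import java.security.KeyStore
import java.security.cert.X509Certificate

// Continuing from the sketch above: inspect the chain the Keystore hands back.
val keyStore = KeyStore.getInstance("AndroidKeyStore").apply { load(null) }
val chain = keyStore.getCertificateChain("attestation_demo_key").orEmpty()

chain.filterIsInstance<X509Certificate>().forEachIndexed { i, cert ->
    println("cert[$i] subject=${cert.subjectX500Principal}")
    println("        valid ${cert.notBefore} .. ${cert.notAfter}")
    // Each certificate should be signed by the next one up the chain;
    // the final link is checked against Google's published root.
    chain.getOrNull(i + 1)?.let { issuer -> cert.verify(issuer.publicKey) }
}
```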

[Image: Google's new chain of trust in Android 13]

This is a longer chain of trust, but potentially a better one. For one, it reduces the chance that a key is accidentally leaked through some mechanism. And if/when a device is known to have a vulnerability, these separate certificates can be revoked without touching the actual key itself. After the compromise is fixed, the device still has that grain of trust stored deep inside to build the chain back up again, so devices can more easily "recover" to a secure state — the private key was never revoked (and, hopefully, never exposed). Before, Google would simply have to revoke the built-in key itself, breaking the chain more permanently.
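
To make the revocation side concrete: Google publishes a status list for attestation certificates that verifiers can check a chain against. The endpoint below is the one documented for key attestation, but the exact JSON layout I'm parsing here ("entries" keyed by lowercase hex serial numbers) is an assumption for illustration rather than a guaranteed contract.

```kotlin
import java.net.URL
import java.security.cert.X509Certificate
import org.json.JSONObject

// Fetch the published attestation status list (do this off the main thread on Android).
// Assumed layout: a top-level "entries" object keyed by certificate serial (hex).
fun revokedSerials(): Set<String> {
    val body = URL("https://android.googleapis.com/attestation/status").readText()
    val entries = JSONObject(body).getJSONObject("entries")
    return entries.keys().asSequence().toSet()
}

// A chain stops being trustworthy if any certificate in it has been pulled.
fun chainIsRevoked(chain: List<X509Certificate>): Boolean {
    val revoked = revokedSerials()
    return chain.any { it.serialNumber.toString(16).lowercase() in revoked }
}
```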

It could end up being unrelated (and we've reached out to Google to confirm), but this change might also make a dent in an annoying issue that crops up once in a while, where apps on Android devices lose HD playback. The root of that issue is a loss of secure L1 Widevine status — Widevine is Google's DRM solution, and its higher security levels (like L1) rely on the same hardware-backed security key to provide the first link in its own chain of trust, proving a device has a safe environment for playback.
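
You can actually see which side of that line a device falls on. As a small sketch, an app can ask the platform's MediaDrm API for the Widevine security level — the UUID and the "securityLevel" property name used here are the commonly used Widevine values, and the result is typically "L1" (hardware-backed) or "L3" (software fallback, the state affected devices drop into):

```kotlin
import android.media.MediaDrm
import java.util.UUID

// Widevine's DRM scheme UUID (a fixed, public value).
val WIDEVINE_UUID: UUID = UUID.fromString("edef8ba9-79d6-4ace-a3c8-27dcd51d21ed")

fun widevineSecurityLevel(): String {
    val drm = MediaDrm(WIDEVINE_UUID)
    return try {
        drm.getPropertyString("securityLevel") // typically "L1" or "L3"
    } finally {
        drm.close() // API 28+; use release() on older versions
    }
}
```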

In the past, some devices that lost Widevine certification have had to be physically sent back to the manufacturer — presumably for a new key to be installed. Others that never supported the hardware-backed Widevine security levels, like the OnePlus 5 and OnePlus 5T, needed the same treatment to install the secure key in the first place. In recent years, we've reported on devices like Pixels randomly losing Widevine certification, possibly due to key-related issues like this.

If these issues are in any way related to a security vulnerability with the factory-installed key, this new system could make it much easier for phones to recover from the problem, revoking the associated certificates instead and more gracefully creating new ones later when whatever issue caused the problem is resolved.

In fact, the timing of this announcement could be telling. Google fixed the Widevine issue for Pixels with the recent March update, which may have also delivered today's announced change in functionality to Pixel devices — though it's not clear to me whether it's possible to switch to this newer system via an update alone. Again, it's speculative for now, but we've reached out to Google to confirm, and we'll update if we hear more.

Google is making Remote Key Provisioning's new attestation and private key scheme mandatory in Android 13, and it's an option for devices on Android 12 — in both cases, we assume this applies to devices launching with each version, but Google isn't explicit. More technical details regarding key format changes, provisioning, and chain structure changes are also documented in the announcement for developers. It doesn't seem like this will have any impact on rooting or ROMing (though it will probably further enhance the security of SafetyNet in its own way), but we'll probably just have to wait and see to be sure.