Android 12 is shaping up to be all about the new Material You redesign and theming system. It's a hot look, but it's not all we're getting, and one of the cooler features to land in yesterday's release is the new face-based autorotation for the Pixel 4 and later. Once enabled, it harnesses not just the angle of your phone relative to the ground, but also your face's orientation. That means it considers how you're looking at your phone, not just what the accelerometer says, making it perfect for a bit of bedtime doomscrolling in portrait.

First, I should point out, this isn't exactly a brand-new idea. Other phones like the early Galaxy S devices had it, and Google already has a camera-based "screen attention" system for keeping your phone awake. But this is still important because it sounds like it might be an AOSP feature, so other phones could get it. That's the perk of Google building a feature rather than any one manufacturer: more phones will get it, it'll probably be better implemented, and it's less likely to be taken away.

Flip both "Use Auto-rotate" and "Enable Face Detection" on in Settings -> Display -> Auto-rotate screen in Android 12 Beta 3, and you're set.
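If you'd rather flip the same switches from a computer, both toggles appear to be plain system settings you can poke with adb. The `accelerometer_rotation` key is the long-standing auto-rotate toggle; `camera_autorotate` is my best guess at the name of the new face-detection switch in the Android 12 betas, so treat that one as an educated assumption rather than gospel:

```shell
# Classic auto-rotate toggle (1 = on) -- this key has been around for ages.
adb shell settings put system accelerometer_rotation 1

# Face-based rotation toggle -- "camera_autorotate" is an assumption on my
# part; confirm the real key with: adb shell settings list secure
adb shell settings put secure camera_autorotate 1
```

Either way, the Settings UI path above does the same thing with fewer chances to typo.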

It's also important to note that privacy advocates don't need to worry: this is all opt-in. In addition to manually enabling auto-rotate itself, you further have to turn on "Enable Face Detection." By default, your Pixel isn't recording from the front-facing camera as soon as you install Android 12. And even if you do enable the feature, all processing for it happens on-device; you aren't sending a stream of selfies to Sundar's servers.

I've been using the feature for less than 24 hours now, so I'm sure there's behavior I've yet to observe, but it generally works well, though it's less magical than many of Google's machine-learning-powered tricks. In fact, it's pretty easy to fool into breaking if that's your goal, and it can also happen accidentally.

We reached out to Google for more technical details on precisely how the feature functions — I know you guys love when we dissect stuff like that, and I'm curious to know the inner workings outside any machine learning-based black boxes — but I've done my best to reverse engineer the behavior from my observations.

If you flump smoothly into bed with a clear view of your face, it works quite well. In the same vein, turn the phone on while you're already on your side, and it will probably get it right, though that's a little less reliable, and it does occasionally slip up for me. But do anything more complicated, like getting into bed with the screen on and without a clear view of your face, and it will probably get it wrong.

These shortcomings are a result of a few limitations in how it works. Based on the behavior I've observed, it isn't continuously checking the camera to see how your face is oriented; it only checks when certain triggering conditions are met. Rotating the phone in a way that the normal accelerometer-based autorotation would spot seems to be one trigger for a camera check, as does waking the phone. But that means if you manage to avoid both of those triggering events (say, getting into bed and rotating the phone with the screen already on but no clear view of your face), it probably won't work; at least, it didn't for me. It doesn't seem to continuously re-assess your relative rotation and position, perhaps to save power or to keep the feature from being too demanding on hardware.
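The trigger-driven behavior I'm describing can be sketched as a tiny state machine. To be clear, this is my speculative model of the observed behavior, not Google's code; every name in it is made up for illustration:

```python
# Speculative model of Android 12's face-based autorotation, reverse-engineered
# from observed behavior. All names here are hypothetical, not AOSP identifiers.

PORTRAIT, LANDSCAPE = "portrait", "landscape"

class FaceAwareRotation:
    def __init__(self, detect_face):
        # detect_face() stands in for the camera check: it returns the
        # orientation of the user's face, or None when no face is visible
        # (for example, when it's too dark for the camera to see).
        self.detect_face = detect_face
        self.screen_orientation = PORTRAIT

    # Trigger 1: the accelerometer reports a rotation.
    def on_accelerometer_rotation(self, accel_orientation):
        self._resolve(accel_orientation)

    # Trigger 2: the screen wakes.
    def on_screen_wake(self, accel_orientation):
        self._resolve(accel_orientation)

    # Note there is no continuous polling: if neither trigger fires,
    # the orientation is never re-assessed.
    def _resolve(self, accel_orientation):
        face = self.detect_face()  # the camera check happens only here
        self.screen_orientation = face if face is not None else accel_orientation
```

In this model, lying on your side keeps the screen in portrait because a successful face check overrides the accelerometer, while in a dark closet the face check comes back empty and the accelerometer wins, which matches what I saw. And if you dodge both triggers, nothing ever re-runs the check, which would explain the get-into-bed-with-the-screen-on failure case.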

I tested the feature on a Pixel 5, so I'm not sure whether it can harness the Pixel 4's radar or IR face recognition hardware to work better in the dark.

It also needs sufficient light to work, which makes sense when you think about it. After all, the camera can't see in the dark. While I did find it worked in normal nighttime dim lighting (even a nightlight was enough, anything that lets it see just a little), closing myself in a non-metaphorical closet with the lights out broke it. The screen by itself doesn't seem to cast enough light for the camera to see your face unless you have your brightness turned up a little beyond what's comfortable, so I suspect it's generally going to be a little less reliable at night.

While the particulars of the feature's operation are speculative until I learn more about it at a technical level, in a practical sense it does what you expect most of the time, and it works in both portrait and landscape. By that, I mean you can stay in portrait while on your side, or stay in landscape when getting up, whichever you like. Outside of a few corner cases where you might accidentally dodge one of those triggering checks, it just works. Much as the floating autorotation button that debuted in Android 9 Pie was a sleeper hit of a feature, I think this will be one too, saving time and a few moments of frustration when you're trying to use your phone in an awkward position.

Recreating some of the worst moments of 2020 by doomscrolling Twitter in bed, the new face-based autorotation in Android 12 Beta 3 let me focus much more fully on the nauseating narcissism, hate, ignorance, and outrage-driven engagement of social media, free of distraction from pesky autorotation settings. Score one for technology.