Let me preface this post by saying that I love Android's notification shade. I love the toggles for things like airplane mode and dark theme, I love the platform's increasingly rich and smart notification quick actions, notification bundling, and just how Android handles notifications in general. But after three months of using the Pixel 4 XL, I've come to an increasingly annoying realization: the notification shade is having absolutely ruinous effects on the phone's facial recognition performance, something I've never experienced on the iPhone. And the simple reason is fingerprints.

Sensors that have to see things don't like fingerprints, for obvious reasons. Smudge up your phone's camera lens and you'll see the effects readily: sharp lines become vague blurs, and faces are distorted into a muddied mess. Unfortunately, those smudges have the same basic effect when they're on the sensor array responsible for authenticating your face on the Pixel 4. The sensors don't see (or, in the case of the dot projector, transmit) as clearly, and you become much more likely to get a failed read. This problem has become so frequent in my experience that I've started preemptively wiping my Pixel 4 on my shirt when I remove it from my pocket. If the smudges are bad enough, the phone just straight up won't recognize me at all.

And while it's true that the iPhone can suffer from similar issues, they tend to occur far less frequently because of the way iOS's interface was designed. The number of times you typically reach for the very top of the screen on iOS pales in comparison to Android, simply because iOS has never emphasized actions on that portion of the phone. Most iPhone users check their notifications on the lockscreen (or not at all), and Control Center, with its quick toggles, is accessed via the top right of the phone, away from the sensor array. In general, Apple discourages developers from including many interface elements near the top of the phone, and when they are there, asks them to be offset such that there's a low probability that interacting with them will result in fingers ending up near the Face ID array. While Android is emphatically moving in this direction with the newest iteration of Material, that does little to address the notification bar.

I'm not necessarily suggesting that the notification shade be moved. And I'm not suggesting notifications themselves should be deprioritized in any way to reduce interactions with the shade—which, again, I love—in order to prevent these issues with facial recognition. That's letting the tail wag the dog. But I do think this represents a real problem, and one I suspect the Pixel team gave less thought than it deserved during the development of the facial recognition hardware. It's also a case where I believe the fact that the Pixel and Android teams are "firewalled" is probably having some unintended consequences: there are probably good design solutions to this problem (designers, feel free to chime in), but because the Pixel phone is developed independently from the Android operating system, the time and seriousness devoted to this issue were probably less than they should have been.

And, unfortunately, this problem only seems to be worsening for me. As my Pixel 4 XL's oleophobic coating inevitably starts to fade (partly from my constantly cleaning it!), the perniciousness of my oily fingers only becomes more obvious and more disruptive, and I get more and more failed face unlock reads. It's not the end of the world, certainly, but man, it is disappointing. For what it's worth, Google's suggestion is to simply not reach so far up the display, though after 10 years of smartphoning, I don't see my shade-swipe muscle memory changing.