Android Police

Project Soli

Google ATAP swears that Soli is actually going to be cool and useful one day

We've got a thing that's called radar love

Google's Advanced Technology and Projects division (ATAP) first announced Soli, its radar-based gesture technology project, almost seven years ago. Although it sounded really promising, the implementations we've seen never quite lived up to that promise — the Nest Hub (2nd generation), Nest Thermostat, and Pixel 4 all feature Soli tech, but the project has otherwise been low-key. Google apparently isn't done with Soli yet, and a recent ATAP video sure suggests it might be an integral feature in future devices.

Google's second-gen Nest Hub will watch you sleep

It's less creepy than it sounds

Google's got a new smart display, and trust me, it is new. The second-generation Nest Hub looks virtually identical to the original product, launched in 2018 as the Home Hub. This new device still lacks a camera, but it does have a Soli radar sensor for sleep tracking. Yes, the new Google smart display will watch you sleep, but the company stresses that it designed "Sleep Sensing" with privacy in mind. It also costs less than the first-gen display did at launch.

Google's upcoming smart display might use radar to track your sleep

Soli can see you even when the nightlight is off 🌝

Earlier this week, we learned that Google is preparing a new smart home device for a possible release this year. FCC documents showed it has a display screen, uses a 14V power supply, and comes equipped with the same Soli motion-sensing technology used by the Pixel 4 and Nest Thermostat. Now a new report claims that the sensor could be used to power a surprising new inclusion: sleep tracking.

Google first demonstrated Project Soli at Google IO 2015. Four years later, the Pixel 4 debuted the radar-based sensor as a feature called Motion Sense with the ability to detect basic gestures in front of the phone. If you happen to have a Pixel 4 and want to try out the gestures for yourself in an interactive space, or you want to go further and build your own simple apps and games, Google just released a sandbox app that brings Soli to life.

This is what the Pixel 4 'sees' when you use Motion Sense gestures

Not exactly selfie quality, but it's all Google needs

Curious what the Pixel 4's Soli radar system "sees" when you wave your hand over it to skip tracks or reach for your phone as it automatically turns on? Turns out, the image it forms is surprisingly undetailed, with "no distinguishable images of a person’s body or face" generated. It's all about detecting motion with finely-tuned machine learning models, and the abstract picture it paints is pretty blurry.
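The paragraph above hints at the core idea: Soli doesn't form a picture of you, it measures motion. As a purely illustrative toy sketch (none of these names, numbers, or methods come from Google's actual pipeline), here's how the sign of the phase shift between successive radar echoes can distinguish a hand reaching toward a sensor from one pulling away — no image of the hand is ever needed:

```python
# Hypothetical toy sketch, NOT Google's Soli pipeline: a radar sensor
# measures phase shifts in reflected waves rather than imaging a scene.
# We classify a motion track as "reach" or "retreat" from the dominant
# sign of the phase change between successive echoes.
import math

WAVELENGTH = 0.005  # 5 mm, roughly the wavelength of a 60 GHz radar signal


def echo_phase(distance_m: float) -> float:
    """Phase of the round-trip echo for a target at the given distance."""
    return (4 * math.pi * distance_m / WAVELENGTH) % (2 * math.pi)


def classify_motion(distances: list[float]) -> str:
    """Label a motion track by the dominant sign of its phase velocity."""
    score = 0.0
    for d0, d1 in zip(distances, distances[1:]):
        # Wrap the phase difference to [-pi, pi); negative means the
        # target moved closer between the two echoes.
        delta = ((echo_phase(d1) - echo_phase(d0) + math.pi)
                 % (2 * math.pi)) - math.pi
        score += delta
    return "reach" if score < 0 else "retreat"


# A hand drifting from 30 cm toward the sensor in 0.1 mm steps.
track = [0.300 - 0.0001 * i for i in range(11)]
print(classify_motion(track))  # prints "reach"
```

Real radar gesture systems work on far richer signals (range-Doppler maps fed to trained classifiers, as the article's mention of machine learning models suggests), but the principle is the same: the output is motion signatures, not anything resembling a photo.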

One of our audience's most anticipated smartphones of 2020 may come from, of all places, Google. While the Pixel 4 has been a critical flop (though it also was our readers' choice), the Pixel 3a was championed as an all-around utility unit by almost every corner of the Android geekdom. Now, we're getting what might be our first look at the phone to supersede that one — a Pixel 4a, if you will.

We first saw Google's Project Soli radar at I/O a few years back, and it seemed magical. The version that just launched on the Pixel 4 is somewhat... less magical. It only does a few things, and it doesn't even work in many parts of the world. Google is trying hard to convince us that Motion Sense is useful, though. Its latest attempt comes in the form of a DJ "music experiment," and it's impressively bad.

With Pixel 4 units in hundreds of hands, we've seen divisive reports about its new Soli radar technology and Motion Sense. Our own Ryne and Scott say it works more or less for them, but they fail to see its utility just yet, while Marques Brownlee showed how unreliable it is for him (YouTube). Here's an unexpected opinion though: Artem loves it and it works very well for him.

This year's new Pixels continue to raise the bar of what we can expect from Google phones, refining core elements of the Pixel experience like the incredible cameras. But more than that, Google's also introducing some all-new hardware, and easily one of the most anticipated arrivals has involved integrating Project Soli radar tech to give us the new Motion Sense gesture controls. What can you expect from them? Let's take a look.

Google telegraphed far in advance of the Pixel 4's launch that it would come with a feature called Motion Sense — radar technology developed under the name Project Soli that uses high-frequency radio waves to detect in-air gestures for user interfaces. But if you live in one country, you shouldn't expect to take advantage of this feature.

Two sets of images related to the upcoming Pixel 4 have appeared today. In addition to the earlier renders that showed off the smaller Pixel 4 for the first time (that we know of), noted leaker IceUniverse tweeted out a pair of photos showing off what appears to be either the front-facing glass for the Pixel 4 and 4 XL or screen protectors for the two phones. Either way, a honkin' cutout on the right side of that now-familiar top bezel raises some questions — or potentially answers them when it comes to those "Face Authentication" details spotted in Android Q.

We're just starting to learn the first details about this year's Pixel phones. Yesterday came the first reputable render, seemingly confirming the phone would have a large camera array on the back. One report now claims the Pixel 4 will include the Soli radar chip that Google first showed off in 2015.

Google's "project" designation has graced many experimental products over the years like Project Fi and Project Ara. Some of those worked out, and others were relegated to the dustbin of history. You might have assumed Project Soli from Google I/O 2015 was in the latter category, but it's still kicking. Google just got an FCC waiver to continue work on its gesture control interface.

It's alright if you've already forgotten about Project Soli - with all of the crazy futuristic stuff that the Google Advanced Technology and Projects (ATAP) team works on, it's easy to lose track. Essentially, Soli is a system that packs radar techniques into tiny hardware to track hands and fingers (or anything else, really), which in turn allows software to recognize hand gestures with precision and accuracy that beat anything on the consumer market today. It's pretty cool - watch this video from last year's Google I/O for a crash course.

It's been a while since we last heard anything about Project Soli - Google's radical post-touch experiment unveiled at I/O - but it looks like the project is still rolling right along. According to a tipster, Google has begun notifying interested parties of an impending "Soli Alpha DevKit," asking that those notified fill out an application for the chance to receive one.

Before you read this article, do me a favor: watch the video below. Because it's going to explain what Google is doing here much better than I could hope to.