Before you read this article, do me a favor: watch the video below. Because it's going to explain what Google is doing here much better than I could hope to.
Got it? Good. Pretty amazing, right?
For those of you who can't or don't want to watch it, fine, I guess that's what writers are for or whatever! Project Soli is, at its root, a fingernail-sized radar chip paired with an advanced set of algorithms that interpret the data the array feeds back to a connected device. The purpose of those algorithms is to analyze the fine-grained motions of your hands and fingers. See? My explanation makes it sound really boring, even though it's super awesome. This screengrab may get the point across a little more clearly.
This mockup shows someone controlling a color slider in a drawing app on a Nexus 9 without touching the screen. Simply by sliding their thumb against their forefinger, the user tells the device what they want to do: the embedded radar chip (in concept, anyway; these demos are almost certainly not live) takes the feedback it's getting in real time, and the algorithms work out that this is what the person's fingers are doing. It is that sensitive.
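To make the idea a little more concrete, here's a toy Python sketch of the basic physics involved. This is purely my own illustration, not Google's actual pipeline; the function names, the 60 GHz carrier frequency, and the gain value are all assumptions on my part. The one real piece of radar math is the Doppler relation v = f_d × λ / 2, which converts a measured frequency shift into the radial speed of whatever is moving, here, a thumb sliding against a forefinger.

```python
# Illustrative sketch only -- not Google's Soli code. Assumes a 60 GHz
# millimeter-wave radar; all names and constants are hypothetical.

def doppler_velocity(doppler_hz_frames):
    """Estimate average radial velocity (m/s) from per-frame Doppler
    shift readings (Hz), using the radar relation v = f_d * wavelength / 2."""
    wavelength = 3e8 / 60e9  # ~5 mm carrier wavelength at 60 GHz
    if not doppler_hz_frames:
        return 0.0
    mean_shift = sum(doppler_hz_frames) / len(doppler_hz_frames)
    return mean_shift * wavelength / 2.0

def update_slider(value, doppler_hz_frames, gain=2.0):
    """Nudge a 0..1 slider by the sensed finger velocity: positive
    Doppler (motion toward the sensor) pushes the value up."""
    v = doppler_velocity(doppler_hz_frames)
    return min(1.0, max(0.0, value + gain * v))
```

The point of the sketch is just how little motion it takes: a 20 Hz Doppler shift at this wavelength corresponds to a fingertip moving about 5 centimeters per second, which is exactly the scale of a thumb-against-forefinger slide.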
Radar isn't something you'll find in the phones or tablets of today because, frankly, I don't think there's been a huge use for it. An array this tiny would have very limited range, and I don't think there are a whole lot of use cases for ultra-short-range mobile radar yet. ATAP thought differently, and began building Project Soli.
To be clear, the implications here are amazing if Soli proves to be as practical and seamless as the demos make it look. Touching your smartwatch could be a thing of the past, as could physical controls in your vehicle, your Bluetooth speakers, or really anything else. How practical is it? I have no idea. How affordable is it? Again, no clue (I'm guessing the chips are not prohibitively costly, though).
But wow. This is one of those things that could mark a real leap in how consumer electronics are designed and used. Let's hope Google can get it figured out. Also, yes, this really does bring us one step closer to Minority Report gesture magic.