Google has telegraphed far in advance of the Pixel 4's launch that it will come with a feature called Motion Sense — a radar technique developed under the name Project Soli that uses high-frequency radio waves to detect in-air gestures for user interfaces. But if you live in one country, you shouldn't expect to take advantage of this feature.
Two sets of images related to the upcoming Pixel 4 have appeared today. In addition to the earlier renders that showed off the smaller Pixel 4 for the first time (that we know of), noted leaker IceUniverse tweeted out a pair of photos showing what appears to be either the front-facing glass for the Pixel 4 and 4 XL or screen protectors for the two phones. Either way, a honkin' cutout on the right side of that now-familiar top bezel raises some questions — or potentially answers them when it comes to those "Face Authentication" details spotted in Android Q.
We're just starting to learn the first details about this year's Pixel phones. Yesterday came the first reputable render, seemingly confirming the phone would have a large camera array on the back. One report now claims the Pixel 4 will include the Soli radar chip that Google first showed off in 2015.
Google's "project" designation has graced many experimental products over the years like Project Fi and Project Ara. Some of those worked out, and others were relegated to the dustbin of history. You might have assumed Project Soli from Google I/O 2015 was in the latter category, but it's still kicking. Google just got an FCC waiver to continue work on its gesture control interface.
Google I/O 2016 came to a close on Friday, and it marked our sixth year in attendance at Google's annual developer bash. A great many topics were covered by divisions all over Google, with lots of announcements big and small, both consumer- and developer-facing. Our first I/O recap covered the front half of the show, while here Mark takes a look at some of the announcements from Google's ATAP division and shares some thoughts on the show and the news overall this year.
It's alright if you've already forgotten about Project Soli - with all of the crazy futuristic stuff that the Google Advanced Technology and Projects (ATAP) team works on, it's easy to get confused. Essentially, Soli is a system that adapts radar-style techniques into tiny hardware to enable the tracking of hands and fingers (or anything else, really), which in turn allows software to recognize hand gestures with precision and accuracy that beat anything on the consumer market today. It's pretty cool - watch this video from last year's Google I/O for a crash course.
It's been a while since we last heard anything about Project Soli - Google's radical post-touch experiment unveiled at I/O - but it looks like the project is still rolling right along. According to a tipster, Google has begun notifying interested parties of an impending "Soli Alpha DevKit," asking that those notified fill out an application for the chance to receive one.
Google says it's looking for pretty much everything when it comes to possible applications - health, art, interactive installations, robotics, HCI, VR, and more are all specifically called out as fair game in Google's email.
The email says that those selected to receive a DevKit will get a development board and SDK, along with the opportunity to participate in a Soli Alpha developer workshop at some point in the future.
Before you read this article, do me a favor: watch the video below. Because it's going to explain what Google is doing here much better than I could hope to.
Got it? Good. Pretty amazing, right?
For those of you who can't or don't want to watch it, fine, I guess that's what writers are for or whatever! Project Soli is, at its root, a fingernail-sized radar chip and an advanced set of algorithms that interpret the data the array feeds back to a connected device. The purpose of those algorithms is to analyze the fine-grained motions of your hands and fingers.
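If you'd rather see that idea in code than in prose, here's a minimal, purely illustrative Python sketch of the kind of pipeline that description implies: take one burst of radar samples (simulated here), compute a Doppler spectrum with an FFT, and map the dominant motion signature to a coarse gesture label. The sample rate, thresholds, and gesture names below are assumptions made up for this sketch; they are not Google's actual algorithms or any Soli SDK API.

```python
import numpy as np

def doppler_spectrum(samples: np.ndarray, sample_rate: float):
    """Return (frequencies, magnitudes) for one windowed burst of radar samples."""
    windowed = samples * np.hanning(len(samples))
    magnitudes = np.abs(np.fft.rfft(windowed))
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / sample_rate)
    return freqs, magnitudes

def classify_gesture(samples: np.ndarray, sample_rate: float = 2000.0) -> str:
    """Guess a coarse gesture from the dominant Doppler frequency.

    The thresholds and labels here are invented for illustration only.
    """
    freqs, magnitudes = doppler_spectrum(samples, sample_rate)
    if magnitudes.max() < 1.0:            # barely any reflected energy: nothing moving
        return "no_motion"
    peak_freq = freqs[np.argmax(magnitudes)]
    if peak_freq < 50:                    # slow, tiny movement (e.g. a finger rub)
        return "micro_gesture"
    if peak_freq < 300:                   # moderate hand speed
        return "swipe"
    return "flick"

# Example: a simulated 200 Hz Doppler tone standing in for a hand moving at swipe speed.
t = np.arange(0, 0.1, 1.0 / 2000.0)
burst = np.sin(2 * np.pi * 200.0 * t)
print(classify_gesture(burst))            # prints "swipe"
```

The real system obviously does far more than pick a peak frequency (it tracks range, velocity, and motion over time), but the chip-plus-signal-processing split is the core of what makes Soli fit in a phone.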