Remember that "Voice Access" talk that was supposed to happen at I/O but was removed from the schedule? It turns out that, while it wasn't the full-on in-app voice craziness we had hoped for, Google did have some news about voice interactions to share.
Specifically, with Android M, Google has introduced the Voice Interaction API, which will allow apps to get a better handle on a user's voice-initiated requests. Check out the video below, from the presenters of an I/O sandbox talk on voice actions.
The new API, as Google Search Developer Advocate Jarek Wilkiewicz explains, shouldn't be confused with custom voice actions.
A while ago, we posted about information we'd received indicating that sometime soon, Google's search functionality (and other actions) would be expanding beyond the Search app, moving into other apps for device-wide search interaction and - eventually - app-specific actions.
It appears that isn't the only Search trick Google is working on, though. According to the information available to us, Google is working on functionality known internally, for now, as KITT (get it?) or "Android Eyes Free." This functionality would allow users to interact with Search without touching or even looking at their device, getting just what they want without any distraction.
We've recently seen plenty of rumors related to Google's future plans for its Search app, from automatically remembering where you parked to reminding you of things when you're with another person, to reminding you to pay bills, down to something as simple as setting a proper timer. Clearly, Google's got plenty of plans for what will happen inside Search. But today, we've got something a little different - this time, it relates to how Google's voice assistant will break out of Search, entering other Google apps to help you do more with your voice and perform more actions with Search in general.
Google Glass gives wearers access to notifications, the ability to take pictures of what they see, and other bite-size nuggets of general tech geekery, but the device relies on tactile swipes and voice commands to manage it all. Atheer One, a pair of smart glasses that was recently funded on Indiegogo, promises users the ability to interact with its virtual UI elements using just their hands.
Don't expect an experience even remotely comparable to the one displayed in the video above, though. Atheer One will overlay the UI of a 26-inch Android tablet about 50 centimeters from your face, which you can interact with in mid-air much as you would a regular tablet (assuming you owned a flying tablet).