We've recently seen plenty of rumors related to Google's future plans for its Search app, from automatically remembering where you parked to reminding you of things when you're with another person, to reminding you to pay bills, down to something as simple as setting a proper timer. Clearly, Google's got plenty of plans for what will happen inside Search. But today, we've got something a little different - this time, it relates to how Google's voice assistant will break out of Search, entering other Google apps to help you do more with your voice and perform more actions with Search in general. The hope is that eventually, these modular actions will apply to third-party apps as well, but that will depend on Google's willingness to open up the relevant APIs.
We do not have possession of any APKs or unreleased devices, so please don't ask for them.
As with any rumor, it's important to discuss first how confident we are in the information at hand. This particular rumor gets a confidence level of 8. With a project of this scope, a lot could change substantially before release. Because of this, and because we don't have much teardown-based evidence that it's on the way, we have to deduct points. From the information available to us, this project is being actively worked on, but the actual UX may still be under exploration.
That said, we are confident that some form of this functionality will be implemented, even if it isn't exactly what we lay out here.
This rumor is a little more complicated than some of the others. Essentially, it appears that Google wants to put the "Ok Google" hotword and/or voice-based actions just about everywhere, with a focus on adding specialized actions for individual apps. This would mean that users could, for example, say "Ok Google" inside the Photos app to open a voice box, which would then allow them to perform actions specific to the Photos app, like sharing or perhaps starting up the editor.
Google is apparently exploring this idea with new navigation buttons, including a "Google" button that would replace the traditional home button. We'll discuss the buttons further later in this post. Replacing the home button would obviously be a major shift in how users reach the home screen, and we'll explore how that may work in a future post.
Something very important to note right now, though, is that this interface will likely be part of the Google experience, and as such may not appear on non-Nexus/GPE devices. It's clear that Google is trying to build its own experience (with the Google Now launcher being one part) to differentiate its own vanilla Android experience from partners/competitors.
We also have reason to believe that, in some apps, Google is experimenting with functionality that would enable the "Ok Google" prompt to provide suggested actions instead of simply listening. For example, if you were having a conversation with someone in Gmail, the prompt may suggest replying to that person, or performing actions related to the message chain like finding a movie, looking up the hours of a restaurant mentioned in the conversation, etc. This functionality is probably further out, as it would likely require more build-out on Google's predictive/assistive technology, but simpler suggestions like composing an email or creating an appointment are already in exploration (we'll discuss that in a moment).
As with many rumors, we won't be able to provide source images here, but be assured that we reproduce interfaces and experiences as faithfully as we can to demonstrate what we're talking about. First, check out the animation below to see what the "Ok Google" hotword experience might look like in the Photos app when a user begins a "share" command and then cancels it.
In implementing this functionality, Google seems to be experimenting with new navigation buttons. This is where things get a little strange.
Let's take a look at how these work. The back button appears to work (or not work, depending on your point of view) as it always has; the button on the right is "Recents," which appears to lead to the multitasking view we got a glimpse of in our post about Hera - the unification of Chrome and Search on Android; and the "Google" button, represented variously as a lowercase g or as the Google logo, would trigger a search prompt wherever you are. The obvious question is "how do you get to the home screen?" It appears - from what information we have - that users would get to the home screen by swiping right from the recents list. As mentioned before, we plan on discussing the home screen further in a future post, so please refrain from freaking out and reserve judgment until such time (or, preferably, until all of this is actually released in its final form).
You'll also notice that a major element of the experience is the red "g" in a circle, a style of iconography we've already seen in Android Wear.
The circular transition animations are likewise becoming more prominent in Google's Android apps. The separation of the main app interface also looks similar to the treatment app interfaces get in the rumored Recents menu we showed in our post about Project Hera (the unification of Search and Chrome on Android).
As mentioned earlier, there's reason to believe that Google will suggest actions for users to take, depending on the app and - at some point in the future - the context of the content you're looking at. Below we've mocked up an example of what this may look like (the Gmail screenshot was taken from the Play Store listing, for those wondering).
From an implementation perspective, our information indicates that Google plans to create and deploy new actions using a modular structure. To get an idea of what this means, think about telling Google to send an email. Google needs to know who you're sending it to, what the body should contain, and what the subject is. Each of those parameters would be a module that could be unplugged and put into other actions. These pieces of code would evidently be able to snap together (metaphorically) to create new actions on Google's side. Whether anyone besides Google would be able to create such actions is unclear. We've already seen evidence of this in Google Glass. The XE16 update carried a 4000-line ModularActionProtos class that may be a start on this approach.
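To make the idea concrete, here's a minimal sketch of how parameter modules might snap together into actions. Every name here is invented for illustration; nothing in this sketch reflects Google's actual implementation (which, per the Glass teardown, appears to use generated protobuf classes like ModularActionProtos).

```python
# Hypothetical sketch of "modular actions": each parameter an action needs
# (recipient, subject, body, ...) is a standalone module that can be
# reused across actions. All names are invented for illustration only.

class ParamModule:
    """One reusable slot of an action, e.g. 'recipient' or 'subject'."""
    def __init__(self, name, prompt):
        self.name = name
        self.prompt = prompt  # what the assistant would ask to fill the slot

class Action:
    """An action is a verb plus a set of parameter modules snapped together."""
    def __init__(self, verb, modules):
        self.verb = verb
        self.modules = modules

    def missing(self, filled):
        # Which modules still need input from the user?
        return [m for m in self.modules if m.name not in filled]

# The same modules plug into different actions.
recipient = ParamModule("recipient", "Who should I send it to?")
subject = ParamModule("subject", "What's the subject?")
body = ParamModule("body", "What should it say?")

send_email = Action("send_email", [recipient, subject, body])
send_sms = Action("send_sms", [recipient, body])  # reuses two modules

# "Ok Google, email Alice" fills only the recipient slot, so the
# assistant would still need to prompt for the remaining modules:
todo = send_email.missing({"recipient": "Alice"})
print([m.prompt for m in todo])
```

The appeal of this structure, if it works the way our information suggests, is that Google could assemble new voice actions largely from existing pieces rather than building each one from scratch.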
From the information available to us, it seems that there will be a simple onboarding process to get hotword detection working outside Search - for now, the functionality is being referred to as "Ok Google everywhere." The onboarding process will start with a simple suggestion to try the service, followed by an explanation of where the prompts work and a voice-training step to be sure Google responds only to that user. Note that the interface represented here is likely under construction, and may not appear this way if and when this feature becomes public.
As always, it's worth noting that this functionality appears to be a work in progress. Since we're talking about pre-release features, things could always change. We feel confident in the advent of modular actions, but the layout changes related to them are more of a moving target, considering there are still months of potential development ahead. What is clear, however, is that Google wants to make voice actions a significantly more common part of Android's interface, and is experimenting with one (really awesome) way to do that.