Smart speakers add a layer of convenience to a household, but have you ever thought of them as a mole planted in your own living room? Back in April, Amazon admitted its employees reviewed anonymized recordings to improve its speech recognition system. Now a fresh controversy has emerged around Google and its Assistant, with the company confirming it does the very same thing, though there's still no need to worry.
The news comes from Belgian news service VRT NWS, which reports on Google's use of subcontractors to analyze Assistant audio samples. Google doesn't listen to what's going on in your home in real time; it makes recordings when Assistant is triggered, then has humans examine them to train Assistant to better understand what you're saying. The process works much like Crowdsource, except that only authorized employees can access voice recordings, and only after securely logging in.
Although Google doesn't appear to have nefarious intentions behind this effort, the fact that other people can listen to what you say to your device is a potential privacy concern. While account information is not available to the reviewer, they can still hear the subject's actual voice and the entire request, which may often include personal details. Worse, Assistant can be triggered inadvertently, either by an accidental button press or by something that merely sounds like "OK Google," which could lead the device to record private conversations or even people engaging in sexual activities.
VRT NWS was able to listen to thousands of these excerpts, including ones that shouldn't have been recorded in the first place. The publication got access through a subcontractor who violated their non-disclosure agreement with Google.
Beyond simply revealing how accessible these recordings can be, the investigation also sheds some light on questionable policies, or the lack thereof. For instance, while the people reviewing audio may end up hearing disturbing things like acts of physical violence, there's supposedly no formal policy in place for reporting such incidents.
In response to this story, Google has issued a statement explaining that it uses language experts to review and transcribe about 0.2% of queries, and that they are instructed to ignore conversations not explicitly meant for Assistant. The company also confirms that an employee leaked Dutch recordings, saying it's investigating and will take action.
Ultimately, Google accessing your queries, whether spoken to Assistant or typed into a Search box, is nothing new. It's also not very different from actual people handling Duplex calls, and thereby gaining access to similarly sensitive data. While Google could perhaps have been more upfront about exactly how it handles these recordings, the fact that it reviews them to improve its service is far from surprising.
If you'd rather stop Google from accessing your recordings, you can manually delete specific ones in your Google Account, or prevent them from being saved in the first place.