
Unless you change a setting, Google stores anything you say to Google Assistant on its servers. The company employs contractors to listen to those recordings to improve Google Assistant's performance. A contractor leaked over 1,000 recordings, some of a personal nature.

Google Assistant on a device isn't intelligent. If you have a Google Home, all it can do on its own is listen for the wake word. Once you say the wake word, Google Home sends everything that follows to cloud servers for interpretation. Those cloud servers provide Google Home's real intelligence.
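To picture the split, here's a minimal Python sketch of that architecture. Everything in it is illustrative: the detector and the cloud call are stand-in stubs invented for this example, not Google's actual code.

```python
def detect_wake_word(chunk: bytes) -> bool:
    """Stand-in for the lightweight on-device detector, the only
    'intelligence' that runs locally."""
    return b"ok google" in chunk  # toy heuristic for this sketch

def send_to_cloud(audio: bytes) -> str:
    """Stand-in for the round trip to Google's servers, where the
    real speech recognition and intent parsing happen."""
    return f"response to {len(audio)} bytes of audio"

def device_loop(mic_chunks):
    """Listen locally until the wake word, then ship what follows."""
    for chunk in mic_chunks:
        if detect_wake_word(chunk):
            # Everything heard after the wake word leaves the device.
            utterance = b"".join(mic_chunks)
            return send_to_cloud(utterance)

# Toy run over fake microphone chunks:
chunks = iter([b"background noise", b"ok google", b"what time is it"])
print(device_loop(chunks))
```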

What happens next depends on your settings. If you’re a new Google Assistant user, by default Google discards your recordings once it interprets your command and sends a response back. But Google only recently made that behavior the default, and it didn’t retroactively apply the new choice to existing users.
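In code terms, that retention choice boils down to a flag checked after interpretation. The sketch below is hypothetical (the setting flag and both helper stubs are invented for illustration), but it matches the behavior described above:

```python
from dataclasses import dataclass

@dataclass
class UserSettings:
    store_recordings: bool  # hypothetical stand-in for the real preference

def interpret(audio: bytes) -> str:
    """Stub for the cloud speech-to-text and intent parsing."""
    return "response"

def save_for_review(audio: bytes) -> None:
    """Stub for whatever storage feeds the human review pipeline."""

def handle_command(audio: bytes, settings: UserSettings) -> str:
    response = interpret(audio)
    if settings.store_recordings:
        # Long-time users who never changed the setting land here.
        save_for_review(audio)
    # Otherwise the audio is discarded once the response is sent back;
    # that's the new-user default, and it wasn't applied retroactively.
    return response

print(handle_command(b"...", UserSettings(store_recordings=False)))
```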

So if you're a long-time Google Assistant user and you haven't changed your preference, Google stores your voice recordings on its servers. The company uses these recordings to improve the Google Assistant service, in part by having humans listen to them. The idea is that a human can listen to the command sent, examine the response given, find any errors, and flag them for correction.
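As a rough sketch of what that review step does (the data structure and the comparison here are purely illustrative, not Google's actual tooling):

```python
from dataclasses import dataclass

@dataclass
class ReviewItem:
    heard: str       # what the human reviewer hears in the recording
    transcript: str  # what the Assistant thought was said
    response: str    # what the Assistant answered

def needs_correction(item: ReviewItem) -> bool:
    # The reviewer compares the audio to the machine transcript and
    # flags mismatches so the recognition model can be improved.
    return item.heard.lower() != item.transcript.lower()

item = ReviewItem(heard="turn off the lights",
                  transcript="turn off the flights",
                  response="Sorry, I didn't understand that.")
print(needs_correction(item))  # True: flagged for correction
```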

A contractor employed for just this purpose recently leaked over 1,000 recordings that came from Google Assistant. Some of these recordings revealed that Google Assistant did occasionally record when no one spoke the wake word. Typically these were false positives: a person's Google Home thought it detected the wake word and started recording, but it was wrong.
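False positives are a built-in trade-off of how wake-word detectors generally work: they score incoming audio and activate above some confidence threshold, so speech that merely sounds like the wake word can trip them. A toy illustration (the scores and threshold below are made up):

```python
THRESHOLD = 0.8  # illustrative; real devices tune this trade-off

def should_activate(score: float) -> bool:
    """Start recording when the detector's confidence crosses the threshold."""
    return score >= THRESHOLD

# Made-up detector scores for three audio snippets:
snippets = {
    "ok google": 0.97,        # true positive
    "hey cool doodle": 0.83,  # sounds close enough: false positive
    "random chatter": 0.12,   # correctly ignored
}
for phrase, score in snippets.items():
    print(phrase, "->", "recording" if should_activate(score) else "idle")
```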

Some of the leaked recordings contained personal information, such as medical details, and the contractor claimed that in some cases the voices could potentially be linked to actual users.

In the past few months, similar reports have come out about Alexa. The main difference here is that you can opt out with Google Assistant; that is, it's possible to use a Google Home and not have your voice stored. You can't opt out with Alexa.

But the most disturbing part is that a contractor could leak these voice recordings in the first place. It's not clear how the contractor managed to copy the data, and Google says it's now investigating and plans to find the leaker. Hopefully, along the way, the company adds more security precautions that prevent data from being removed from its servers. [TechCrunch]
