Google Assistant: as at Amazon, employees listen to your conversations
The orders you give your voice assistant, along with recordings of fragments of accidentally captured conversations, are listened to by Google employees, reveals an investigation by the Belgian outlet VRT News. The goal is to improve speech recognition. But this practice, already observed at Amazon, remains opaque, since the terms of use of neither assistant mention that conversations can be listened to by humans.
The Flemish-language Belgian outlet VRT News conducted a revealing investigation into voice and conversation recordings from Google Home speakers and Google Assistant. The journalists say they were contacted in April by a Dutch subcontractor, after it became known that Amazon employees listen to recordings from its speakers to improve the accuracy of voice recognition. He explained that his role is exactly the same as that of Amazon's employees, and allowed them to listen to more than 1,000 recordings.
Google subcontractors may listen to your private conversations
The site explains that Google needs humans to improve the accuracy of its algorithms. It is not what you say that interests Google so much as the way you say it. But there is still a problem: some recordings, for example when you ask your speaker to turn off the lights, are made perfectly knowingly. Others are less so: the assistant can trigger accidentally as soon as a word or a sequence of syllables even vaguely resembles the wake phrase "Ok Google" / "Hey Google". Fortunately, these recordings are anonymized. But is that enough?
Obviously not, if the journalists are to be believed. Among these conversations, VRT News was able to retrieve personal addresses and visit the authors of some of the recordings to collect their reactions. And that is not all: of the 1,000 recordings, 153 were accidental. Very private conversations in the bedroom, between parents and children, and even business conversations containing a striking density of sensitive information thus end up on Google's servers, and are sometimes listened to by operators.
There are other types of very private requests: for example, people who ask their assistant for health advice (which indirectly reveals their state of health). There is also, unsurprisingly, an impressive number of sexually explicit requests. But another aspect of the problem is the recordings that suggest their author is in danger. What to do in these cases? VRT News's source says Google has no specific policy in this area. The site tells the story of an employee who had to deal with a recording in which it was clear that a woman was in danger.
A practice that lacks transparency
According to VRT, the only truly fixed rule that subcontractors must follow is to classify recordings containing bank information and passwords as sensitive. Reading the Flemish site's investigation, one can only regret that the firm is not more transparent about the assistant's behavior and the use of these recordings.
Google exercised its right of reply to VRT following the publication of the article, which we have translated: "We work with language experts around the world to improve our speech recognition technology by making transcriptions of a small number of audio clips. This work is crucial to developing the technology that makes products like Google Assistant possible."
Google adds: "Our language experts only evaluate around 0.2% of all audio clips, excerpts that are not linked to data that personally identifies their author. We recently learned that one of these experts may have violated our data security policy by leaking audio clips in Dutch. We are actively investigating; when we find a breach of our policies, we take swift action, which can go as far as terminating our agreement with the partner."