‘Do You Work For NSA?’: Amazon Employees Listen to You Talking to Alexa - Report

© AP Photo / Jeff Chiu
March 2, 2016 photo shows an Echo Dot in San Francisco. Amazon.com is introducing two devices, the Amazon Tap and Echo Dot, that are designed to amplify the role that its voice-controlled assistant Alexa plays in people's homes and lives.
There are thousands of people listening to what you tell your digital assistant – and sometimes even what you don’t. Amazon claims it’s for improving the machine’s voice recognition algorithms.

Amazon employees are listening to what people tell Alexa — the company's voice recognition-based AI software — in the intimacy of their homes, a Bloomberg report revealed Thursday.

In order to improve the machine learning algorithms, thousands of people listen to snippets of audio detected by Alexa, checking whether the digital assistant correctly parsed the request and provided the desired response. The audio is checked, annotated and then fed back into the system, thus training the voice recognition neural network. Working nine hours per day, Amazon employees review up to 1,000 audio clips each, two workers from Amazon's Budapest division told Bloomberg.

In the process, the employees gain access to things people would like to keep private, such as bank identification information, Bloomberg reports.

"You don't necessarily think of another human listening to what you're telling your smart speaker in the intimacy of your home," Florian Schaub, a professor at the University of Michigan who has researched privacy issues related to smart speakers, told Bloomberg. "I think we've been conditioned to the [assumption] that these machines are just doing magic machine learning. But the fact is there is still manual processing involved."

The way Amazon's system is designed, Alexa (or its physical embodiment, the Echo smart speaker) listens to everything that happens in the room, including background conversations. Officially, the system does not store any audio until the wake word is detected ("Alexa" or "Echo" by default). However, the system often activates by mistake, triggered by a mondegreen (a misheard word or phrase), background noise or even a working TV set. Regardless of what woke the system, Amazon employees are required to analyse and annotate whatever audio it picked up.

Because of this, Amazon workers often catch bizarre things: actual examples range from a woman singing badly off-key in the shower, to children crying for help, to what was believed to be a domestic sexual assault, the two sources told Bloomberg. However, the company's privacy policy prohibits them from intervening.

"We take the security and privacy of our customers' personal information seriously," an Amazon spokesman said in an emailed statement to Bloomberg. "We only annotate an extremely small sample of Alexa voice recordings in order [to] improve the customer experience."

The employees use group chats to share audio clips and seek help with particularly unintelligible words. As a way of relieving stress, possibly criminal or simply embarrassing clips often end up in those chats as well, the employees say.

Can Amazon track users by abusing this system? It is unclear, but the possibility exists. Screenshots of the review process provided to Bloomberg indicate that each audio clip is tied to an account number and a device serial number. While the reviewers have no access to a person's last name or home address, their superiors might.

Google and Apple use the same method of human analysis for audio recorded by their Google Home and Siri systems. However, their privacy rules are said to be stricter: the recordings are tagged only with a random identifier. Google also says its system distorts the audio to make identification by voice impossible. Apple disclosed that it stores recorded audio for six months, after which the audio is stripped of all identifying information but can still be used for machine learning.

Interestingly, some consumers have had their suspicions about unwanted ears confirmed while talking to Alexa. According to Bloomberg's sources, the employees often pick up questions like "Do you work for the NSA?" or "Alexa, is someone else listening to us?"
