
Apple’s Siri Listens to People Having Sex, Talking to Doctors, Whistle-Blower Says

Although the US-based tech giant has repeatedly voiced its commitment to guarding its customers’ privacy, a report by the British outlet The Guardian suggests that Apple is sending confidential user information to contractors for “quality control” and upgrades of its voice assistant.

Apple regularly sends its contractors around the world Siri recordings containing private information, ranging from medical questions and drug deals to a couple having sex, without disclosing the practice in its privacy documentation, The Guardian reports.

According to Apple, the contractors are supposed to listen to and evaluate the voice assistant’s responses “to help Siri and dictation … understand you better and recognise what you say”; however, the outlet notes that the company does not explicitly state that humans do this job.

The company points out that only a small portion of randomly selected Siri requests, making up less than 1% of its daily activations, is analysed. These recordings are said to typically be only a few seconds long and pseudonymised.

“User requests are not associated with the user’s Apple ID. Siri responses are analysed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements”, Apple stated to the media.

However, as one of the contractors working for the tech giant revealed, the voice assistant, which is often activated accidentally by a word that sounds like its wake phrase, or even simply by the movement of an arm wearing an Apple Watch, can record sensitive private information.

“The sound of a zip, Siri often hears as a trigger”, the insider, who asked to remain unnamed, told the outlet.

The contractor noted that the Apple Watch and HomePod smart speaker are responsible for most accidental activations, while the watch can record “some snippets that will be 30 seconds”.

According to the source, such recordings are accompanied by “user data showing location, contact details”.

“There have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on”, the whistle-blower said.

According to the insider, sometimes Apple’s contractor “can definitely hear a doctor and patient, talking about the medical history of the patient”.

“You’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch”, according to the source.

At the same time, staffers are said to be encouraged to report such activations only “as a technical problem”, but “there’s nothing about reporting the content”, the whistle-blower says, adding that the data could be misused.

“Apple is subcontracting out, there’s a high turnover. It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings]”, the source said.

These revelations come hot on the heels of a report by El País that Apple has subcontractors in Spain who listen to users’ verbal requests in a variety of languages, including French and German.

However, Apple might not be the only company that allegedly employs humans to listen to users’ requests and accidental recordings. Amazon also reportedly passes on some information recorded by Alexa to human listeners, while Google employees are said to be tasked with the same job for Google Assistant.
