16:59 GMT, 05 August 2020

    A new study on voice assistants has produced a list of more than 1,000 word sequences that can trigger the devices to begin invading users’ privacy by listening to nearby conversations.

    While owners of voice assistants have been repeatedly reassured that their devices will remain inactive until called upon, new research conducted by Germany’s Ruhr-Universität Bochum (RUB) and the Bochum Max Planck Institute for Cyber Security and Privacy has identified over 1,000 words and phrases that inadvertently activate the machines.

    The list of terms includes words in English, German and Chinese.

    For example, in the video below, the character Phil Dunphy of the ABC sitcom “Modern Family” is overheard saying “Hey Jerry” during an episode of the show, which triggers a device running Apple’s Siri assistant to activate.

    The researchers note that the television character’s greeting was “confused with ‘Hey Siri.’”

    Three additional examples were posted by researchers, including one in which a Google Nest device confuses a character on television saying “OK, who is reading” with “OK Google.”

    It’s worth noting that, despite the accessible data and examples from the research, the study’s full paper has yet to be officially published, according to Ars Technica.

    However, a brief write-up published by authors Lea Schönherr, Maximilian Golla, Jan Wiele, Thorsten Eisenhofer, Dorothea Kolossa and Thorsten Holz shows that the devices can intrude on consumers’ private conversations and, by extension, their privacy in general.

    “The devices are intentionally programmed in a somewhat forgiving manner, because they are supposed to be able to understand their humans. Therefore, they are more likely to start up once too often rather than not at all,” Kolossa said in the RUB news release on the research.

    While voice assistants are marketed as devices that remain inactive until called upon, the best practice for maintaining one’s privacy may be to unplug, turn off or disable the machines until they are needed, or to refrain from enabling voice assistant features altogether.

    “From a privacy point of view, this is of course alarming, because sometimes very private conversations can end up with strangers,” Holz said. “From an engineering point of view, however, this approach is quite understandable, because the systems can only be improved using such data. The manufacturers have to strike a balance between data protection and technical optimization.”

    Related:

    US Man Rescued From Sinking Car by Firefighters After Yelling ‘Siri, Call 911’
    ‘Hey Siri, I’m Getting Pulled Over’: iPhone Shortcut Allows Users to Record Police Stop
    Privacy Provoked: Apple Apologizes for Sharing Siri Data, Announces New Policies
    Apple’s Voice Assistant App Siri Called Israel ‘Zionist Occupation State’ After Wikipedia Hack
    Russia's MiG-35 to Be Equipped With Voice Assistant Helping Pilot in Tough Situations
    Tags:
    Google, Siri, voice assistants, research, study