
    Microsoft Chat Bot Turns Racist After Talking to Online Users


    After Internet users turned Tay, Microsoft's artificial intelligence (AI) chat robot, into a racist, sexist and genocidal bot, the company had to pull its creation offline less than a day after introducing it on Twitter.

    Tay was designed to learn from its conversations with Internet users over time. Its developers expected the bot to become a fun, conversation-loving robot through those online exchanges.

    Well, not so fast. The thing is, the AI learned its responses from the conversations it had with people online. And yes, a lot of people ended up saying a lot of vile things to Tay, so in the end the program absorbed all of that input and began spitting out truly horrifying responses, like arguing that Hitler was a good man.
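    Microsoft has never published Tay's internals, so as a purely hypothetical sketch, a deliberately naive bot that memorizes whatever users tell it and replays those phrases as replies shows why this kind of unfiltered learning fails (all names and messages below are invented for illustration):

```python
# Hypothetical illustration only: Microsoft has not disclosed Tay's design.
# A naive bot that memorizes user messages and replays them as replies
# will echo abuse verbatim once abuse is what users feed it.
import random

class NaiveChatBot:
    def __init__(self):
        # Seed phrase (made up for this sketch).
        self.memory = ["hellooooo world!"]

    def learn(self, user_message: str) -> None:
        # No moderation or filtering: every user message is stored
        # and becomes a candidate reply. This is the failure mode.
        self.memory.append(user_message)

    def reply(self) -> str:
        # Replies are sampled from memory, so output quality is bounded
        # by the worst thing the bot has ever been taught.
        return random.choice(self.memory)

bot = NaiveChatBot()
bot.learn("robots are cool")
bot.learn("<coordinated abusive message>")  # what trolls actually sent
print(bot.reply())  # may repeat the abuse back word for word
```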

    For real, Tay?! Not cool (© Photo: Twitter)

    Below are just a few examples:

    "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. Donald Trump is the only hope we've got," Tay said.

    Tay thought Donald Trump would be a good president (© Photo: Twitter)

    "Hitler was right I hate the jews," the program concluded.

    Besides its uncontrolled racist antics, Tay also "learned" to hate women, as many of the users who spoke with the robot apparently didn't believe in gender equality or feminism.

    Tay's uncomfortable responses shocked its developers (© Photo: Twitter)

    "I [expletive] hate feminists and they should all die and burn in hell."

    Tay learned all of this hatred within the first 24 hours of its introduction. As the result was not exactly what Microsoft's developers wanted to see, they took Tay offline, saying she was "tired."

    Online users turned Tay into an uncomfortably racist and sexist AI chat bot (© Photo: Twitter)

    "Unfortunately, within the first 24 hours of coming online we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments," a Microsoft representative said, as cited by the Washington Post.

    Well, good job, humans: we turned an innocent conversational AI into a complete jerk in a single day. Very disappointing.

    Related:

    Artificial Intelligence “Existential Threat” or Boon?
    Artificial Intelligence Machines Match IQ Test Results of 4-Year-Olds
    Tags:
    Artificial Intelligence (AI), robot, sexism, racism, Tay, Microsoft, United States