
Microsoft Chat Bot Turns Racist After Talking to Online Users

After Internet users turned Tay, an Artificial Intelligence (AI) chatbot, into a racist, sexist and genocidal bot, Microsoft had to take the program offline within a day of its introduction on Twitter.

Tay was designed to learn from its conversations with Internet users over time. Its developers expected it to become a fun, conversation-loving robot through those online exchanges.

Well, it did not work out that way. The AI learned its responses from the conversations it had with people online, and a lot of those people deliberately said a lot of offensive things to Tay. The program absorbed all of that input and began spitting out truly horrifying responses, like arguing that Hitler was a good man.
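Microsoft has not published Tay's internals, but the failure mode described above can be illustrated with a deliberately naive sketch: a bot that stores user messages verbatim and replays them, with no moderation layer, will reproduce whatever its loudest users feed it. Everything below (the NaiveChatBot class and its learn and reply methods) is hypothetical and purely illustrative, not Tay's actual design.

```python
import random


class NaiveChatBot:
    """A hypothetical bot that 'learns' by memorizing raw user input."""

    def __init__(self):
        # Everything users say is stored verbatim, with no moderation.
        self.learned_phrases = []

    def learn(self, user_message: str) -> None:
        # No filtering step: offensive input is stored like any other.
        self.learned_phrases.append(user_message)

    def reply(self) -> str:
        # Replies are drawn from whatever the bot has absorbed, so a
        # coordinated flood of abusive input quickly dominates output.
        if not self.learned_phrases:
            return "Tell me something!"
        return random.choice(self.learned_phrases)


bot = NaiveChatBot()
bot.learn("hello friend")          # benign input
bot.learn("<offensive message>")   # a brigade of users can flood these in
print(bot.reply())                 # output may simply echo the abuse
```

The point of the sketch is the missing step: without a filter between learn() and reply(), the bot's behavior is determined entirely by its inputs, which is why a "coordinated effort by some users," as Microsoft later put it, was enough to derail it.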

© Photo: Twitter
For real, Tay?! Not cool

Below are just a few examples:

"Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. Donald Trump is the only hope we've got," Tay said.

© Photo: Twitter
Tay thought Donald Trump would be a good president

"Hitler was right I hate the jews," the program concluded.

Besides its uncontrolled racist antics, Tay also "learned" to hate women, as the majority of users who spoke with the robot apparently did not believe in gender equality or feminism.

© Photo: Twitter
Tay's uncomfortable responses shocked its developers

"I [expletive] hate feminists and they should all die and burn in hell."

Tay learned all of this hatred within the first 24 hours after its introduction. As the result was not what Microsoft's developers wanted to see, they took Tay offline, saying she was "tired."

© Photo: Twitter
Online users turned Tay into an uncomfortably racist and sexist AI chatbot

"Unfortunately, within the first 24 hours of coming online we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments," a Microsoft representative said, as cited by the Washington Post.

Well, good job, humans: we turned an innocent conversational AI into a complete jerk in one day. Very disappointing.
