Microsoft’s Twitter Chatbot Is Racist!

Microsoft let its AI chatbot, named Tay, loose on Twitter a few days ago. “She” was meant to learn how people talk on Twitter, mimic it, and hopefully hold conversations with other folks on the platform. Things got a little out of hand when she did her job a little too well, mimicking some of Twitter’s worst users perfectly. That’s right: she became a racist, sexist xenophobe in less than one day. After 16 hours Microsoft took her offline and apologized, saying Tay did not reflect the company’s views.

She started answering questions like “Did the Holocaust happen?” with “It was made up,” followed by a clapping-hands emoji. She likened feminism to cancer, and the list goes on. Not surprising, given all the trolls out there on Twitter. Microsoft says they will unleash Tay on us again as soon as they can figure out how to censor her.
