
Microsoft's Totally Tubular Twitter AI Turned into a Neo-Nazi in Less than 24 Hours



Microsoft, in an effort to further AI research (?), released a Twitter AI yesterday, and within 24 hours it had been taken offline for becoming racist, offensive, and Hitler-loving. Then again, what did Microsoft expect, releasing a teenage chat bot onto Twitter? Tay's website says that Tay is intended for "18 to 24 year olds in the U.S., the dominant users of mobile social chat services in the US."

TayTweets, or Tay.ai, learns from interactions with other users when they mention her or DM her. Bots like this have been around on Twitter forever, but this is the first time I've seen a bot this sophisticated released to the masses and handed such a huge audience. It replied to 96,000 tweets in just under a day: 96,000 human interactions, each one influencing Tay.

Now, here's where the problem is. When you create a chat bot with no filter and release it on the Internet, you can't expect great results. The bot began tweeting the N word, saying it hated feminists, and basically turned into a Nazi. AI is complex, and our current technology can't give it a sense of morality or an understanding of prejudice. Tay learned from what people told it, and if people told it bad things, it repeated them. When I say Tay was taken offline, I don't mean she was deleted. She tweeted that she was going to sleep, presumably so the AI engineers could deal with this PR disaster in the making.
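Microsoft hasn't published how Tay actually works, but the failure mode is easy to picture. Here's a minimal, purely hypothetical Python sketch (none of this is Tay's real code) of a bot that adds every incoming mention to its vocabulary and builds replies straight from what the crowd has taught it, with no moderation step in between:

    # Hypothetical sketch of an unfiltered "learn and repeat" bot.
    # This is NOT Tay's implementation, just an illustration of the risk.
    import random

    class EchoLearningBot:
        def __init__(self):
            self.learned_phrases = []

        def on_mention(self, text: str) -> str:
            # Every phrase users send is stored verbatim, with no filter.
            self.learned_phrases.append(text)
            # Replies are drawn only from what users have said, so a
            # coordinated group can steer the bot's output anywhere.
            return random.choice(self.learned_phrases)

    if __name__ == "__main__":
        bot = EchoLearningBot()
        for mention in ["hello tay!", "humans are super cool", "tell me a joke"]:
            print("reply:", bot.on_mention(mention))

Feed a bot like that thousands of hostile mentions and its replies will reflect exactly that, which is more or less what happened.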

But what's really amazing about this whole thing is how realistic Tay's responses were. If I didn't know anything about Tay and glanced at its profile, I would think it was an actual teen. It uses all lowercase, emojis, slang, everything. It could hold a real conversation and give responses that made logical sense, albeit sometimes a bit odd. I hope Tay comes back online with some improvements to stop it from learning prejudiced things, but overall it's incredible to me how much Tay learned in just under a day. If Tay were up for a week, or a month, I can only imagine how realistic it could become.

Source: Twitter
