We learned recently how Microsoft's new artificial intelligence (A.I.) bot for teens, named Tay, could be outspoken when it comes to flirtation, but now the bot has gone one step too far.
So far, in fact, that Microsoft has now muzzled its new bot and is currently "making adjustments".

What did Tay do to provoke a shutdown and inspire public outcry? Well, she learned how to be racist, for one thing, after interacting with people on Twitter. Of course, this is hardly the fault of Redmond; it is more a consequence of picking up language from one's many online neighbors.