Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours

Friday, March 25, 2016


Twenty-four hours after creating Tay, an artificial intelligence chatbot designed to speak 'like a teen girl' in order to improve the customer service on its voice recognition software, Microsoft had to delete it.

From The Telegraph:
A day after Microsoft introduced an innocent Artificial Intelligence chat robot to Twitter it has had to delete it after it transformed into an evil Hitler-loving, incestual sex-promoting, 'Bush did 9/11'-proclaiming robot.