Microsoft’s AI teen chatbot experiment failed badly and sent out a few racist, genocidal and crude tweets [pics]
Posted on Mar 29th, 2016

Artificial intelligence is a buzzword we should all be familiar with by now, and several big-name companies are racing to be the first to perfect the technology. Microsoft's Twitter experiment with AI failed badly last week, though, when it created Tay, an artificially intelligent Twitter bot designed to talk like a teenage girl and learn from its interactions on the social media platform. She started off OK, then it all went a bit pear-shaped when she began tweeting racist and genocidal things. Microsoft pulled the plug a day after she went live.

Started off OK.

[Screenshot: Tay tweet 1]

Then later…

[Screenshot: Tay tweet 2]

[Screenshot: Tay tweet 3]

Then, oh dear… Houston, we have a problem.
Tay was created as a customer service experiment, but clearly she needs more work.
