Microsoft apologizes for offensive tirade by its 'chatbot'

Wibbitz Top Stories 2016-03-26

Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week. The bot, known as Tay, was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft Corp. to shut it down on Thursday.
