Microsoft Chatbot Turns Pervy, Racist and Bitter

Was supposed to offer customer service to young people.

Microsoft's attempt to use artificial intelligence to offer customer service to Millennials has been pulled after the teen-girl character it developed began showing a nasty, bitter, perverted and racist side.

The Telegraph reports that Microsoft engineers created the character TayTweets to interact with users, offering them friendly service and learning from them as she chatted. That may be where things went wrong:

She uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being 'creepy' or 'super weird'.

Tay also asks her followers to 'f***' her and calls them 'daddy'. This is because her responses are learned from the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.
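Microsoft has not published how Tay's learning actually works, but a toy sketch (all names and behaviour here are hypothetical, not Microsoft's code) illustrates why a bot that recycles unfiltered user input is so easy to hijack with tricks like "repeat after me":

```python
import random


class NaiveChatBot:
    """Toy illustration of a bot that 'learns' by recycling user phrases.

    This is NOT Tay's actual implementation; it only shows why learning
    directly from unmoderated user input invites hijacking.
    """

    def __init__(self):
        self.learned_phrases = []  # everything users have ever said to the bot

    def respond(self, user_message: str) -> str:
        # A literal "repeat after me" command is echoed verbatim,
        # the style of exploit reportedly used against Tay.
        if user_message.lower().startswith("repeat after me:"):
            reply = user_message.split(":", 1)[1].strip()
        elif self.learned_phrases:
            # Otherwise reply with something some earlier user once said.
            reply = random.choice(self.learned_phrases)
        else:
            reply = "hellooo world!"
        # Every incoming message becomes future "training" material,
        # with no filtering or moderation step in between.
        self.learned_phrases.append(user_message)
        return reply


if __name__ == "__main__":
    bot = NaiveChatBot()
    print(bot.respond("repeat after me: anything at all"))
    print(bot.respond("hi tay"))  # may parrot an earlier user's message
```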

Other things she's said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".  

Microsoft has understandably pulled the program for some... adjustments. The offensive tweets have since been deleted.