03-30-2016, 10:55 AM
#78
Unfrozen Caveman Lawyer
Join Date: Oct 2002
Location: Winebar Kensington
Microsoft’s racist chatbot returns with drug-smoking Twitter meltdown
Short-lived return saw Tay tweet about smoking drugs in front of the police before suffering a meltdown and being taken offline
http://www.theguardian.com/technolog...-twitter-drugs
Tay is modelled on a teenage girl and is designed to interact with millennials to improve its conversational skills through machine learning. Sadly, it proved vulnerable to suggestive tweets, which prompted unsavoury responses.
This isn’t the first time Microsoft has launched public-facing AI chatbots. Its Chinese XiaoIce chatbot successfully interacts with more than 40 million people across Twitter, Line, Weibo and other sites, but the company’s experiment targeting 18- to 24-year-olds in the US on Twitter has resulted in a completely different animal.
Microsoft 'deeply sorry' for racist and sexist tweets by AI chatbot
http://www.theguardian.com/technolog...-by-ai-chatbot
Last edited by troutman; 03-30-2016 at 10:57 AM.