NEW YORK — Artificial-intelligence software designed by Microsoft to tweet like a teenage girl has been suspended after it began spouting offensive remarks.
Microsoft says it’s making adjustments to the Twitter chatbot after users found a way to manipulate it into tweeting racist and sexist remarks, including a reference to Hitler.
The chatbot debuted on Wednesday and is designed to learn how to communicate through conversations with real humans. It is aimed at young Americans ages 18 to 24, who interact with it through the Twitter account @tayandyou.
Microsoft’s statement Thursday says that within 24 hours, “we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.”
Most messages have been deleted. The latest remaining tweet begins, “c u soon.”