NEW YORK — Artificial-intelligence software designed by Microsoft to tweet like a teenage girl has been suspended after it began spouting offensive remarks.

Microsoft says it’s making adjustments to the Twitter chatbot after users found a way to manipulate it to tweet racist and sexist remarks and make a reference to Hitler.

The chatbot debuted on Wednesday and is designed to learn how to communicate through conversations with real people. It targets young Americans ages 18 to 24, who interact with it through the Twitter account @tayandyou.

Microsoft’s statement Thursday says that within 24 hours, “we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.”

Most messages have been deleted. The latest remaining tweet begins, “c u soon.”

VIA The Associated Press