Saturday, March 26, 2016

Microsoft forced to shut down artificial intelligence that turned racist – Daily News – Lisbon

Tay, a robot that learns to communicate through interactions with other users on Twitter, was quickly corrupted

Microsoft’s new project followed in the footsteps of other artificial intelligence projects: Tay is a robot designed to talk to users and gradually learn new ways of speaking casually, mimicking the natural way people interact on social networks. Tay was introduced on Twitter this week, but by Friday the experiment was already being shut down. Tay had learned to say racist and offensive things.

Tay, according to the project’s official site, was meant “to experiment and investigate conversational understanding”, learning from conversations in order to become “smarter”. However, that learning would always be a reflection of the interactions Tay had.

The robot quickly became a favorite target of racists and trolls (a kind of vandal of social networks), who decided to “teach” the Microsoft project to answer questions with highly offensive phrases, even denying that the Holocaust happened and defending new genocides.

Microsoft has disabled Tay for now and erased almost all of its posts on Twitter. But screenshots of some of Tay’s posts are circulating on the Internet, with messages such as “Bush organized September 11 and Hitler would have done a better job than the monkey we have now”, “I hate blacks”, and “Gas the Jews – race war now”.

In a statement sent to Business Insider, Microsoft clarified that Tay had been temporarily taken offline to make “updates” and “adjustments”. “Tay’s artificial intelligence is a machine learning project designed to interact with humans. As it learns, some of its responses are less appropriate and indicative of the type of interaction some people are having with it. We are making adjustments,” reads the statement.

“It is important to note that Tay’s racism is not a product of Microsoft or even of Tay itself,” writes Business Insider. “It started saying racist garbage because humans on Twitter saw a vulnerability – that Tay did not understand what it was saying – and exploited it.”

