What happens when you put two robots in the same place?

Kairi Zaide

Unforgiven
Aug 11, 2009

John Price

Gang Gang
Sep 19, 2008
It took less than 24 hours for Twitter to corrupt an innocent AI chatbot. Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation."

Unfortunately, the conversations didn't stay playful for long. Pretty soon after Tay launched, people started tweeting the bot with all sorts of misogynistic, racist, and Donald Trumpist remarks. And Tay — being essentially a robot parrot with an internet connection — started repeating these sentiments back to users, proving correct that old programming adage: flaming garbage pile in, flaming garbage pile out.
 
