
'Good Bot, Bad Bot': Episode IV

XiaoIce, a cutting-edge artificial intelligence system designed to create emotional bonds with its 660 million users worldwide, at the company's office in Beijing. XiaoIce was an inspiration for Microsoft's Tay, an AI chatbot modeled to be a typical teenage girl. (Courtesy AFP and Getty Images)

Find the original episode and a full transcript here.


Bots are everywhere. They're all over social media platforms, chatrooms, phone apps. These pieces of software — which are meant to imitate human behavior and language — influence our daily lives in sneaky, surprising ways.

In the fourth episode of Endless Thread's series "Good Bot, Bad Bot," co-hosts Ben Brock Johnson and Quincy Walters share a cautionary tale about Tay, a Microsoft AI chatbot that has lived on in infamy. Tay was originally modeled to be the bot-girl-next-door. But after only 16 hours on Twitter, Tay was shut down for regurgitating white supremacist, racist and sexist talking points online.

Tay's short-lived run on the internet illuminated ethical issues in tech culture. This episode uncovers who gets a say in what we build, how developers build it, and who is to blame when things take a dark turn.

Quincy Walters, Producer, WBUR Podcasts
Quincy Walters was a producer for WBUR Podcasts.

Ben Brock Johnson, Executive Producer, Podcasts
Ben Brock Johnson is the executive producer of podcasts at WBUR and co-host of the podcast Endless Thread.
