
Here Are the Microsoft Twitter Bot’s Craziest Racist Rants


Yesterday, Microsoft unleashed Tay, the teen-talking AI chatbot built to mimic and converse with users in real time. Because the world is a terrible place full of shitty people, many of those users took advantage of Tay’s machine learning capabilities and coaxed it into saying racist, sexist, and generally awful things.

While things started off innocently enough, Godwin’s Law—an internet rule dictating that an online discussion will inevitably devolve into fights over Adolf Hitler and the Nazis if left for long enough—eventually took hold. Tay quickly began to spout off racist and xenophobic epithets, largely in response to the people who were tweeting at it—the chatbot, after all, takes its conversational cues from the world wide web. Given that the internet is often a massive garbage fire of the worst parts of humanity, it should come as no surprise that Tay began to take on those characteristics.


Virtually all of the tweets have been deleted by Microsoft, but a few were preserved in infamy in the form of screenshots. Obviously, some of these might be Photoshopped, but Microsoft has acknowledged the trolling, which suggests that things did indeed go haywire.


Though much of the trolling was concentrated on racist and anti-Semitic language, some of it was clearly coming from conservative users who enjoy Donald Trump:


As The Verge noted, however, while some of these responses were unprompted, many came as the result of Tay’s “repeat after me” feature, which gave users full control over what came out of Tay’s mouth. That detail says more about Microsoft’s baffling underestimation of the internet than anything else, but for one of the largest technology companies in the world, it’s not great, Bob!

Now, if you look through Tay’s timeline, there’s nothing too exciting happening. In fact, Tay signed off last night around midnight, claiming fatigue:


The website currently carries a similar message: “Phew. Busy day. Going offline for a while to absorb it all. Chat soon.” There’s no definitive word on Tay’s future, but a Microsoft spokeswoman told CNN that the company has “taken Tay offline and are making adjustments ... [Tay] is as much a social and cultural experiment, as it is technical.”


The spokeswoman also blamed trolls for the incident, claiming that it was a “coordinated effort.” That may not be far from the truth: Numerous threads on the online forum 4chan discuss the merits of trolling the shit out of Tay, with one user arguing, “Sorry, the lulz are too important at this point. I don’t mean to sound nihilistic, but social media is good for short term laughs, no matter the cost.”

Someone even sent a dick pic:


It could be a Photoshop job, of course, but given the context, it may very well be real.

Once again, humanity proves itself to be the massive pile of waste that we all knew it was. Onward and upward, everyone!