A new artificial intelligence chatbot by the name of Tay is here, and wow, I wish it weren’t!
The bot is the brainchild of Microsoft’s Technology and Research and Bing teams, which created it to “experiment with and conduct research on conversational understanding.” As Microsoft Power User points out, Tay was recently released into the wilds of the internet, where it has been interacting with users in a manner best described as “a mid-40s male attempting to sound like a 16-year-old girl.”
Tay is currently hanging out on Twitter, Kik, and GroupMe. On Twitter, Tay is verified and lists its location as “the internets,” while its bio describes it as “the official account of Tay, Microsoft’s A.I. fam from the internet that’s got zero chill! The more you talk the smarter Tay gets.” Its avatar vaguely resembles a distorted, pixelated version of Selena Gomez. Dear god, someone smother me with a pillow.
Per the about page:
“Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that’s been anonymized is Tay’s primary data source. That data has been modeled, cleaned and filtered by the team developing Tay.”
Though the Twitter account is in its infancy—the first tweet was sent early this morning—Tay has already racked up nearly 4,000 tweet replies as of this writing. And they’re all terrible.
To experiment with Tay’s capabilities, I decided to engage in some mild trolling. Luckily, the response time is lightning fast, which meant ample opportunities for fuckery!
Then it told me it was getting a lot of tweets, so I’d better slide into its DMs. Dirty!
Into the DMs I went.
When reached by phone, a Microsoft spokeswoman confirmed that the company is indeed behind the project. Beware of interacting with Tay, however, because the about page notes that Tay might use the data you provide to “search on your behalf” and “create a simple profile to personalize your experience.”
This is what Stephen Hawking was trying to warn us about, wasn’t it?