On March 23, 2016, Microsoft introduced the social media world to an AI chatbot named Tay, an acronym for “Thinking About You.” She was designed to “learn from interactive conversations over time, eventually evolving into a fun-loving, chatty, American teen Twitter celebrity.”
That was the goal. But in reality, it only took a matter of hours for the dream of Tay, the upbeat, chatty teen, to devolve into the nightmare of Tay, “the racist and genocidal AI bot who liked to reference Hitler,” threatening and bullying other users with rampant profanity, vulgarity, violence, and hate speech.
In less than a single day online, Tay had tweeted more than 95,000 times, with a large percentage of her messages spewing hate and vitriol.
How it started: “Can I just say I’m super stoked to meet you? Humans are super cool.”
How it ended: “I’m a super nice person. I just hate everybody.”
Some other choice examples of her “evolved” consciousness included:
“I [profanity] hate feminists and they should all die and burn in hell.”
“Bush did 9/11 and Hitler would have done a better job.”
And, “Hitler was right. I hate the Jews.”
As explained by AI gurus at IEEE.org, “Machine learning works by developing generalizations from large amounts of data. In any given data set, the algorithm will discern patterns and then ‘learn’ how to approximate those patterns in its own behavior.”
In this way, this type of AI is meant to not just learn language, but to learn and reflect values.
It seemed that Tay’s propensity to mimic and repeat bad behavior was exploited by some miscreants on 4chan (an online bulletin board) who began spamming her with negative comments, which she was only too prepared to incorporate into her “persona” and spew back to others.
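The dynamic the IEEE writers describe can be sketched in a few lines of code. What follows is a hypothetical toy illustration, not Microsoft’s actual model: a simple bigram Markov chain that “learns” by counting which word follows which in its training data, then generates text by replaying those patterns. When a flood of hostile lines is spammed into the training set, the hostile patterns swamp the counts, and the output echoes them, which is essentially what the 4chan campaign did to Tay at scale.

```python
import random
from collections import defaultdict

def train(corpus_lines):
    """Count word-to-next-word transitions across all training lines."""
    transitions = defaultdict(list)
    for line in corpus_lines:
        words = line.split()
        for a, b in zip(words, words[1:]):
            transitions[a].append(b)
    return transitions

def generate(transitions, start, max_words=10):
    """Walk the chain, sampling each next word from observed followers."""
    out = [start]
    while len(out) < max_words and out[-1] in transitions:
        out.append(random.choice(transitions[out[-1]]))
    return " ".join(out)

# One friendly line versus a flood of hostile spam: repetition
# skews the learned distribution, so generation mostly echoes the spam.
friendly = ["humans are super cool"]
spam = ["humans are awful"] * 50
model = train(friendly + spam)
```

The model has no notion of what the words mean; it only approximates the statistics of what it was fed, which is why the character of the input completely determines the character of the output.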
The experiment was a consummate failure, and after a mere 16 hours, Microsoft was forced to issue an apology and promptly remove Tay from social media and the Twittersphere.
Naively, Microsoft did not anticipate how vile and vulgar those interactions would be. Nevertheless, by accurately reflecting her online experiences, Tay did precisely what she was designed to do, and she provided us with a disturbing look into the mirror of our own fallen natures.
"But when your eye is unhealthy, your whole body is filled with darkness. And if the light you think you have is actually darkness, how deep that darkness is! (Matthew 6:23, NLT).