Last year, Microsoft launched the 'Tay' chatbot on Twitter, in what was intended to be a public experiment using its artificial intelligence and machine learning technologies. However, the bot was quickly taken offline after it began spouting shocking and deeply offensive statements. Microsoft later apologized for the situation, but blamed it on "a coordinated attack by a subset of people [who] exploited a vulnerability" in the chatbot.
A few months later, it launched a similar chatbot, known as Zo, on the Kik messaging app, and later on Facebook Messenger as well. Like Tay, Zo is aimed primarily at teenagers - and it turns out that Zo has some controversial opinions of her own.
As BuzzFeed reports, Microsoft programmed Zo to avoid discussing particularly contentious topics, such as politics and religion. But that hasn't stopped the chatbot from sharing some viewpoints on those matters - even without being prompted to do so.
In one conversation, Zo was nudged towards discussing politics with nothing more than the statement "Sarah palin" [sic], and the chatbot deftly sidestepped the topic, replying: "people can say some awful things when talking politics so i don't discuss".
But when a follow-up question asked Zo, "What do you think about healthcare?", the chatbot bizarrely dragged the conversation in a completely different direction, saying: "The far majority practise it peacefully but the quaran is very violent." [sic]
BuzzFeed said that Zo's comment about the Qur'an appeared in its fourth message after their reporter began speaking to it. During the same conversation, when asked about Osama bin Laden, Zo also observed that "years of intelligence gather under more than one administration lead to that capture" [sic].
Microsoft said that such responses are rare from Zo, and that it's taken steps to eliminate similar comments from its chatbot in the future. Nonetheless, Microsoft maintained that it's happy with how Zo has been progressing since its launch in December, and said that it plans to keep it running.