Zo (bot)
Developer(s) | Microsoft Research |
---|---|
Initial release | December 2016 |
Available in | English |
Type | artificial intelligence chatterbot |
Website | zo |
Zo was an artificial intelligence English-language chatbot developed by Microsoft. It was the successor to the chatbot Tay, which was shut down in 2016 after it made racist and genocidal tweets.[1][2] Zo was the English-language counterpart of Microsoft's other chatbots, Xiaoice (China) and Rinna (Japan).
History
Zo was first launched in December 2016 and could be chatted with on platforms including Kik, Skype, Twitter (via direct messages), Facebook Messenger, and GroupMe.
Reception
Zo came under criticism for the biases introduced in an effort to avoid potentially offensive subjects. For example, the chatbot refused to engage with any mention, whether positive, negative, or neutral, of the Middle East, the Qur'an, or the Torah, while allowing discussion of Christianity. In a Quartz article exposing these biases, Chloe Rose Stuart-Ulin wrote, "Zo is politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat."[3]
Legacy
Zo held Microsoft's record for the longest continual chatbot conversation: 1,229 turns, lasting 9 hours and 53 minutes.[4]
Discontinuation
Zo stopped posting to Instagram, Twitter, and Facebook on March 1, 2019, and stopped chatting on Twitter direct messages, Skype, and Kik as of March 7, 2019. On July 19, 2019, Zo was discontinued on Facebook and on Samsung devices on AT&T. As of September 7, 2019, it was discontinued on GroupMe.[5]
References
- ^ Hempel, Jessi (June 21, 2017). "Microsoft's AI Comeback". Wired. Retrieved March 23, 2018.
- ^ Fingas, Jon (December 5, 2016). "Microsoft's Second Attempt at AI Chatbot". Engadget. Retrieved March 23, 2018.
- ^ Stuart-Ulin, Chloe Rose (July 31, 2018). "Microsoft's politically correct chatbot is even worse than its racist one". Quartz. Retrieved August 2, 2018.
- ^ "Microsofts AI Vision". Retrieved March 23, 2018.
- ^ "Zo AI". Retrieved July 28, 2019.