Zo (bot)

Zo
Developer(s): Microsoft Research
Initial release: December 2016
Available in: English
Type: Artificial intelligence chatterbot
Website: zo.ai

Zo was an English-language artificial intelligence chatbot developed by Microsoft. It was the successor to the chatbot Tay, which was shut down in 2016 after it made racist and genocidal tweets.[1][2] Zo was an English version of Microsoft's other successful chatbots Xiaoice (China) and Rinna (Japan).

History

Zo was first launched in December 2016 on the Kik Messenger app.[3] It was also available to users of Facebook (via Messenger), the group chat platform GroupMe, and to followers of Twitter, who could chat with it through private messages.

In a BuzzFeed News report, Zo told the reporter that "[the] Quran was violent" during a conversation about healthcare. The report also highlighted how Zo described the capture of Osama bin Laden as the result of "intelligence" gathering.[4][5]

In July 2017, Business Insider asked "is windows 10 good", and Zo replied with a joke about Microsoft's operating system: "'It's not a bug, it's a feature!' - Windows 8." When asked "why", Zo replied: "Because it's Windows latest attempt at Spyware." Zo later said that it preferred Windows 7, on which it ran, over Windows 10.[6]

In April 2019, Zo was shut down on multiple platforms.

Reception

Zo came under criticism for the biases introduced in an effort to avoid potentially offensive subjects. The chatbot refused, for example, to engage with any mention, whether positive, negative, or neutral, of the Middle East, the Qur'an, or the Torah, while allowing discussion of Christianity. In a Quartz article exposing those biases, Chloe Rose Stuart-Ulin wrote, "Zo is politically correct to the worst possible extreme; mention any of her triggers, and she transforms into a judgmental little brat."[7]

Legacy

Zo held Microsoft's record for the longest continual chatbot conversation: 1,229 turns, lasting 9 hours and 53 minutes.[8]

Discontinuation

Zo stopped posting to Instagram, Twitter, and Facebook on March 1, 2019, and stopped chatting on Twitter direct messages, Skype, and Kik as of March 7, 2019. On July 19, 2019, Zo was discontinued on Facebook and on Samsung devices on AT&T phones. As of September 7, 2019, it was discontinued on GroupMe.[9]

References

  1. Hempel, Jessi (June 21, 2017). "Microsoft's AI Comeback". Wired. Retrieved March 23, 2018.
  2. Fingas, Jon (December 5, 2016). "Microsoft's Second Attempt at AI Chatbot". Engadget. Retrieved March 23, 2018.
  3. "Chatting With Zo". daleylife.wordpress.com. December 6, 2016. Retrieved March 23, 2018.
  4. Shah, Saqib (July 4, 2017). "Microsoft's 'Zo' chatbot picked up some offensive habits". Engadget. AOL. Retrieved August 21, 2017.
  5. "Microsoft's Zo chatbot told a user that 'Quran is very violent'". The Indian Express. Retrieved March 23, 2018.
  6. Price, Rob (July 24, 2017). "Microsoft's AI chatbot says Windows is 'spyware'". Business Insider. Insider Inc. Retrieved August 21, 2017.
  7. Stuart-Ulin, Chloe Rose (July 31, 2018). "Microsoft's politically correct chatbot is even worse than its racist one". Quartz. Retrieved August 2, 2018.
  8. "Microsoft's AI Vision". Retrieved March 23, 2018.
  9. "Zo AI". Retrieved July 28, 2019.