Originally shared by Popular Science
It took less than 24 hours for Tay, Microsoft's A.I. chatbot, to start generating racist, genocidal replies on Twitter.
http://www.popsci.com/heres-how-we-prevent-next-racist-chatbot?src=SOC&dom=gp