Originally shared by Popular Science

It took less than 24 hours for Tay, Microsoft's A.I. chatbot, to start generating racist, genocidal replies on Twitter.
http://www.popsci.com/heres-how-we-prevent-next-racist-chatbot?src=SOC&dom=gp
