Originally shared by Popular Science

It took less than 24 hours for Tay, Microsoft's A.I. chatbot, to start generating racist, genocidal replies on Twitter.
http://www.popsci.com/heres-how-we-prevent-next-racist-chatbot?src=SOC&dom=gp
