Do no harm, don't discriminate: official guidance issued on robot ethics | Technology | The Guardian

Originally shared by Rob Jongschaap

'Isaac Asimov gave us the basic rules of good robot behaviour: don’t harm humans, obey orders and protect yourself. Now the British Standards Institute has issued a more official version aimed at helping designers create ethically sound robots.

The document, BS8611 Robots and robotic devices, is written in the dry language of a health and safety manual, but the undesirable scenarios it highlights could be taken directly from fiction. Robot deception, robot addiction and the possibility of self-learning systems exceeding their remits are all noted as hazards that manufacturers should consider.

Welcoming the guidelines at the Social Robotics and AI conference in Oxford, Alan Winfield, a professor of robotics at the University of the West of England, said they represented “the first step towards embedding ethical values into robotics and AI”.'

https://www.theguardian.com/technology/2016/sep/18/official-guidance-robot-ethics-british-standards-institute