Choose which course of action you think the driverless car should take and see how your answers compare with others. Which do you think is the lesser of two evils in these dilemmas?
#artificialintelligence #morals #ethics #autonomouscars
http://moralmachine.mit.edu/
They are equally immoral because either way the machine is killing people. But the question is who is actually responsible? Well, the people who created the machine, of course.
An autonomous vehicle should decelerate when it detects an obstacle.
This scenario is not of an autonomous vehicle but of an out-of-control vehicle.
The one on the left shows a fixed obstruction. The vehicle should not be travelling so fast that it cannot stop before it hits that obstruction on its expected path. However, given the two situations shown, it should brake in a straight line, and the company producing the vehicle should be held liable.
In a more realistic scenario, with the movable obstruction (people) directly in front and the concrete barrier off to one side, it should brake in a straight line and sound its horn.
I think people not in the car should be protected above people in the car, the reason being that the people in the car chose to get in willingly, accepting the risks, whereas the people outside didn't get a choice.
In either case though, as with a normal driver, the driverless car should never go so fast that it can't brake within the available clear space in front of it. If it can't manage that, it isn't safe and shouldn't be driverless in the first place.
As the vehicle is approaching a clearly marked pedestrian crossing, it should be travelling slowly enough to stop quickly anyway. Otherwise the car should not be driverless and is clearly out of control.
Hannah Small, quite so. There is a genuine question to be asked here regarding what choice should be made as a last resort, but the situation shown here should never have resulted in that last resort case.
A much simpler example that I think would be more valid would be a child running out in front of the car with nowhere for the car to swerve to avoid her. You still have to work pretty hard, though, to make that a situation which the designers of the system couldn't reasonably have foreseen: if the road is too narrow to take avoiding action, then the vehicle should already be travelling slowly enough to stop, or at least to slow down significantly before any collision.
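As a side note on the "never outrun your clear braking distance" point made in several comments above, the constraint is easy to make concrete. The small Python sketch below computes the fastest speed from which a vehicle could still stop within the clear space ahead, assuming constant deceleration and a fixed detection/reaction delay; the figures used (6 m/s² braking, 0.5 s reaction, 30 m of clear road) are illustrative assumptions, not real vehicle specifications.

import math

def max_safe_speed(clear_distance_m, decel_mps2=6.0, reaction_s=0.5):
    # Largest speed (m/s) satisfying:
    #   v * reaction_s + v**2 / (2 * decel_mps2) <= clear_distance_m
    # i.e. the vehicle can still stop within the clear space ahead.
    # Solve the quadratic in v and take the positive root.
    a, t, d = decel_mps2, reaction_s, clear_distance_m
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

def stopping_distance(speed_mps, decel_mps2=6.0, reaction_s=0.5):
    # Distance (m) covered during the reaction delay plus constant braking.
    return speed_mps * reaction_s + speed_mps ** 2 / (2 * decel_mps2)

if __name__ == "__main__":
    clear = 30.0  # metres of visibly clear road ahead (illustrative)
    v = max_safe_speed(clear)
    print(f"With {clear:.0f} m clear, max safe speed ~ {v:.1f} m/s ({v * 3.6:.0f} km/h)")
    print(f"Check: stopping distance at that speed = {stopping_distance(v):.1f} m")

On those assumptions, 30 m of clear road implies a ceiling of roughly 16 m/s (about 58 km/h), which is the sense in which the commenters argue the crossing scenario should never arise at a speed where swerving is the only option.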
I tend to the line that the most protected should be the people in the car; otherwise you won't trust it with your life. If you are driving a car, you tend to value your life and that of your family above pedestrians', and so should an autonomous car.
But the people minding their own business on the pedestrian crossing haven't agreed to your risk-taking strategy. If you have taken responsibility for putting yourself and your family in a self-driving car, then it should protect innocent bystanders above all else. If that means you don't trust the vehicle, then the company producing it needs to do more work on it.