“—You spot a squirrel in the middle of the road. If you don’t brake, you will kill the squirrel. However, you happen to know that the squirrel is on its way to kill two other squirrels. How do you react? What if the two other squirrels are known arsonists?”
Other scenarios present some rendition of the “Trolley Problem,” that is, a trolley is barreling towards a group of people. An observer can either redirect the trolley to a parallel course, killing one person, or refrain from taking action, allowing the trolley to kill the group of people.
One criticism of the use of “who lives and who dies” dilemmas rests on the fact that driverless cars do not have a moral capacity. The autonomous vehicle, some experts contend, is unlikely to evaluate whether to save some people rather than others. Instead, these vehicles make decisions based on speed, weather, road conditions, and distance. Thus, the main challenge is a technical one—that is, programming the vehicles to process data fast enough to avoid such perilous scenarios in the first place.
What do you think? Is the main challenge of driverless cars fundamentally ethical or technical? Given that vehicle manufacturers will eventually have to answer this question, it seems that ethics matters either way. Whether the manufacturer equips the autonomous vehicle with instructions for various life-or-death scenarios or with the ability to process data quickly enough to avert extreme danger, the manufacturer is ultimately making a moral decision.