Welcome to the Nexus of Ethics, Psychology, Morality, Philosophy and Health Care

Thursday, December 8, 2016

Morality in transportation

Jeffrey C. Peters
The Conversation by way of Salon
Originally posted November 19, 2016

A common fantasy for transportation enthusiasts and technology optimists is for self-driving cars and trucks to form the basis of a safe, streamlined, almost choreographed dance. In this dream, every vehicle — and cyclist and pedestrian — proceeds unimpeded on any route, as the rest of the traffic skillfully avoids collisions and even eliminates stop-and-go traffic. It’s a lot like the synchronized traffic chaos in “Rush Hour,” a short movie by Black Sheep Films.

Today, autonomous cars are becoming more common, but safety is still a question. More than 30,000 people die on U.S. roads every year — nearly 100 a day. That’s despite the best efforts of government regulators, car manufacturers and human drivers alike. Early statistics from autonomous driving suggest that widespread automation could drive the death toll down significantly.

There’s a key problem, though: Computers like rules — solid, hard-and-fast instructions to follow. How should we program them to handle difficult situations? The hypotheticals are countless: What if the car has to choose between hitting one cyclist or five pedestrians? What if the car must decide to crash into a wall and kill its occupant, or slam through a group of kindergartners? How do we decide? Who does the deciding?
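
To make the "computers like rules" point concrete, here is a minimal, purely illustrative sketch of what a hard-and-fast rule might look like once written down. The Outcome class, the casualty estimates, and the "minimize expected casualties" rule are all hypothetical, invented for this example; no real autonomous-driving system is claimed to work this way. The sketch simply shows that any explicit rule forces the machine to take a side in exactly the dilemmas described above.

```python
# Illustrative only: a deliberately naive "harm-minimization" rule.
# All names and numbers here are hypothetical, made up for this sketch.
from dataclasses import dataclass


@dataclass
class Outcome:
    description: str
    expected_casualties: int  # crude stand-in for "harm"


def choose_maneuver(outcomes: list[Outcome]) -> Outcome:
    """Pick the outcome with the fewest expected casualties.

    Once this rule is written down, the machine applies it mechanically,
    even in cases (occupant vs. pedestrians) where people disagree about
    what the right answer is, or who should get to decide.
    """
    return min(outcomes, key=lambda o: o.expected_casualties)


if __name__ == "__main__":
    dilemma = [
        Outcome("swerve into the wall (occupant at risk)", 1),
        Outcome("continue toward the group of pedestrians", 5),
    ]
    print(choose_maneuver(dilemma).description)
```

Under this toy rule the car always sacrifices its occupant to spare the larger group, which is precisely the kind of choice the article asks whether we are prepared to encode, and who gets to encode it.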

The article is here.