Autonomous Cars Will Need Ethics, Morals and NASCAR Mode

Imagine you’re riding in an autonomous car down a narrow two-lane road with a double-yellow center line. Suddenly you come upon a stalled car in your lane. How long does the self-driving car wait before deciding it is acceptable to break the law and cross the line to pass?

Here’s an even tougher scenario: An out-of-control trolley with five people on board is directly in the robot car’s path. The only way to avoid it is to swerve and run down a pedestrian. Does the autonomous car choose the lesser of two evils, or smash into the trolley?

These and thousands of similar ethical, moral and legal questions are being considered by engineers and programmers as they develop autonomous vehicles and try to take human error out of the driving equation. The effort underscores how difficult it will be to go from vehicles with autonomous features such as automatic emergency braking to cars that truly drive themselves.

So far, observers describe the driving style of Google’s autonomous cars as that of an overly cautious elderly person: safe, but unable to avoid every mishap. The company says its self-driving cars have been involved in about a dozen crashes over six years, none of which were their fault.

It is easy to make autonomous vehicles drive more virtuously than humans, but that very virtue could make them vulnerable as more of them mix with conventional cars and drivers on the road.

“We found very quickly that engineers and philosophers can talk to each other much better than we ever expected,” says Chris Gerdes, director of Stanford University’s Center for Automotive Research, speaking at the recent TU-Automotive conference in suburban Detroit.

“We can take frameworks that had been developed in philosophy and map them to the actual software we were putting in the car.”

Gerdes is addressing complex autonomous driving issues by bringing philosophers, engineers and programmers together at Stanford to answer questions such as how to determine when the desire for safety outweighs the need for legality.

The trolley example is a famous thought experiment among philosophers and ethicists known as “The Trolley Car Problem,” which deals with weighing consequences to arrive at the best outcome. In practice, it is less about life and death and more about when it makes sense to cross double-yellow lines and the like. “Automated vehicle ethics revolve around trolley-car problems,” Gerdes says.

An autonomous vehicle’s “morals” are based on what are called deontological ethics. These are like the 10 Commandments, Gerdes says, and focus on what the vehicle should not do, such as speeding or driving through yellow lights. All these wholesome values promise to make autonomous vehicles far safer than today’s cars and trucks, but they still do not fully prepare ultra-virtuous self-drivers to coexist with the millions of human drivers who will dominate roads for decades, especially in high-stress situations.
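To make the distinction concrete, here is a minimal, hypothetical sketch of how such rules might be mapped to software, in the spirit of Gerdes’ philosophy-to-code approach (the rule names, costs and functions below are invented for illustration, not taken from Stanford’s actual system). Deontological rules become constraints with penalties attached, and the planner pays a penalty only when the safety cost of staying legal grows even higher:

```python
# Hypothetical illustration only -- not code from Stanford or any automaker.
# Deontological rules carry a "moral cost" the planner pays if it breaks them.
# A maneuver is chosen by weighing that cost against estimated safety risk.

RULE_VIOLATION_COST = {
    "cross_double_yellow": 50.0,        # illegal, but sometimes justified
    "exceed_speed_limit": 80.0,
    "strike_pedestrian": float("inf"),  # never permitted, at any price
}

def maneuver_cost(maneuver):
    """Total cost = safety risk + penalties for any rules the maneuver breaks."""
    penalty = sum(RULE_VIOLATION_COST[rule] for rule in maneuver["violations"])
    return maneuver["risk"] + penalty

def choose_maneuver(candidates):
    """Pick the candidate maneuver with the lowest combined cost."""
    return min(candidates, key=maneuver_cost)

# The stalled-car scenario: waiting is legal, but its risk (rear-end
# collision, blocked traffic) eventually outweighs the cost of crossing
# the double-yellow line, which becomes the lesser evil.
candidates = [
    {"name": "wait_behind_stalled_car", "violations": [], "risk": 120.0},
    {"name": "pass_over_double_yellow",
     "violations": ["cross_double_yellow"], "risk": 10.0},
]
print(choose_maneuver(candidates)["name"])  # -> pass_over_double_yellow
```

The design choice worth noting: a rule such as never striking a pedestrian gets an infinite cost and can never be traded away, while traffic-code rules get finite costs the planner may pay when legality and safety genuinely conflict.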

“A Google car would never be able to make a left turn, never be able to pull away from the curb and into traffic, because aggressive taxi drivers would quickly learn to exploit its algorithms,” Wall Street Journal writer Holman Jenkins points out in a recent commentary.

Why would a cab driver bully an autonomous taxi? You mean, aside from the fact that it is designed to eliminate his job?

Unfortunately, Gerdes does not spend much time talking about the need for what I will call a NASCAR mode: an algorithm that directs the car to arbitrarily play chicken with aggressive human drivers and swap paint occasionally with conventional taxis, just to remind everyone it is a robot that does not know fear or pain. It should be a mandatory feature for all self-driving vehicles transporting anyone in New York or other large cities.

It’s important to send individuals out into the world with strong ethics and morals, but they still won’t survive unless they know how to demand respect and defend themselves if necessary. That goes for people, and soon, robot cars.
