self-driving cars

Licensed to kill

I've written before about the coming revolution of self-driving cars. I'm fascinated by the moral questions that arise as the liability for accidents transfers from individual, fallible drivers to software written by fallible programmers. 

In "Why Self-Driving Cars Must Be Programmed to Kill," Technology Review considers the thorny question of how we should program vehicles to behave when a crash is unavoidable. When, for example, should the driver and passengers be sacrificed to save a larger group of people who are suddenly in harm's way? 
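
To make the dilemma concrete, here's a toy sketch (mine, not anything from the article or from any manufacturer) of what "programmed to kill" could mean in practice: score each available maneuver by its expected fatalities and pick the least bad one. Every probability and casualty count below is invented.

    # A deliberately crude illustration of the dilemma, with invented numbers.
    # Score each possible maneuver by expected fatalities, then pick the minimum.

    def expected_fatalities(maneuver):
        """Sum of probability * deaths over a maneuver's possible outcomes."""
        return sum(prob * deaths for prob, deaths in maneuver["outcomes"])

    maneuvers = [
        # Swerving into the barrier probably kills the car's lone occupant.
        {"name": "swerve into barrier", "outcomes": [(0.9, 1), (0.1, 0)]},
        # Braking in the lane risks the five pedestrians ahead.
        {"name": "brake in lane", "outcomes": [(0.6, 5), (0.4, 0)]},
    ]

    choice = min(maneuvers, key=expected_fatalities)
    print(f"Least-harm maneuver: {choice['name']} "
          f"(expected fatalities: {expected_fatalities(choice):.1f})")

Written out this way, "sacrifice the occupant" is just a minimum taken over weights that somebody had to choose, which is exactly what makes the question so uncomfortable.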

Some unexpected ways that self-driving cars will change America

I've been thinking and writing a bit about the advent of self-driving vehicles and how they will change things like safety and liability for accidents. But driverless vehicles will change more than our chances of dying in a car wreck - they may change how we perceive each other.

This past weekend, my wife and I witnessed another driver make one of the stupidly reckless, me-first maneuvers that are commonplace around the nation's capital. Nothing out of the ordinary there. When I'm on the job, running with lights and sirens, I see people behave in ways so frustrating that I'm left questioning humanity. Most people try to get out of the way as best they can. But a few cut me off, try to outrun me, or get the jump on everyone else by drafting behind me. Some just sit there, blocking traffic, taking no apparent action that would help an emergency vehicle responding to an emergency. It's not their emergency, after all.

When we witnessed the latest instance of bad driving, my wife said, "We need self-driving cars, now." It reminded me of this article from a resident of a town where Google is testing its self-driving vehicles. By his account, the cars are cautious, obey the law without fail, and give pedestrians a lot of room. Another observation: the cars are so careful that other drivers have learned they can cut them off when changing lanes.

People tout the safety and efficiency benefits that automated cars would bring: you can pack far more of them onto overloaded highways and route them to minimize commute times for everyone (more about that in another post). It's been predicted that they'll cut traffic accidents by 90%; applied to the socially tolerated roadway carnage of more than 36,000 deaths a year, and assuming deaths fell in proportion, that would mean roughly 32,000 lives saved annually.

But what will they do to our souls? Americans equate cars with freedom, with movement and liberty. We love the road, love our cars... and hate other drivers. Driving to work is like being forced to hang out on the comment boards of YouTube, or in the more unpleasant corners of Reddit: you're packed in with a bunch of anonymous strangers who seem determined to behave in the most stupid or meanly opportunistic ways possible.

The current popularity of zombie movies & TV shows is no great surprise: we're constant participants in a madcap scramble amidst people who appear to have lost their minds. We're daily survivors of the driving dead. 

As self-driving cars begin to appear on the road, expect them to be harassed, exploited, and abused. They'll be the Google Glass of the street. Their owners will be mocked, their masculinity (and perhaps humanity) questioned.

Then, something will happen. Enough of the vehicles will be among us that a strange shift will take place. People commuting in a self-driving car, sliding along at a mutually beneficial pace with other vehicles, will look out of their window and see other people. Not zombies, perhaps, but people. Because they're locked in a regulated traffic pattern, they may pace each other the whole way into the city. The gulf between cars will be narrower than it is now. They'll be neighbors for the whole trip. Someone will roll down a window, maybe, and yell a good-natured insult about whatever sports team's logo is plastered on the rear bumper. Freed from the soul-crushing grind that driving is becoming for many people, they won't default to hatred and alienation. 

Or, because this is the U.S., that other person might take out a gun and begin shooting.

But not necessarily. 

As I write this, it sounds like some kind of utopian vision in which sameness makes us all better people. That kind of thing has been skewered in so many dystopian novels that I wonder why we're so terrified of equality. But that's a question for another day. In the meantime, we're on the verge of a revolutionary change in how we move around, one that may transform transportation the way the Internet transformed our handling of information. Could it also make us better people? 

All I need is an automated car and a GPS to steer by

Have I really never mentioned the podcast 99% Invisible before? That was an oversight.

In this episode, they talk about driverless cars and the "automation paradox," which goes something like this:

  1. Humans are weak at performing a dangerous task.
    Example: Navigating with a paper map while driving in unfamiliar places.
  2. We create a form of automation that replaces our imperfect performance.
    Example: Driving with a GPS and letting it handle all the navigation for you.
  3. People become weaker at the task because they increasingly rely on the automated solution.
    Example: When was the last time you used a paper map to locate something?
  4. When the automation fails or doesn't perform as expected, humans are more prone to failing at the now-unfamiliar task.
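
To make the loop in steps 3 and 4 concrete, here's a toy model of skill decay; every number in it is invented for illustration, not drawn from the episode or from any study.

    # Toy model of the automation paradox, with made-up numbers: a practiced
    # navigator errs 2% of the time, and each year of leaning on the GPS
    # erodes 25% of whatever skill remains.

    def manual_error_rate(years_of_reliance, base_error=0.02, yearly_decay=0.25):
        """Chance of botching the task when the automation fails and the
        human has to take over."""
        skill = (1 - base_error) * (1 - yearly_decay) ** years_of_reliance
        return 1 - skill

    for years in (0, 2, 5, 10):
        print(f"{years:>2} years of GPS reliance -> "
              f"{manual_error_rate(years):.0%} chance of getting lost")

The exact curve is fiction, but the direction isn't: the better the automation works, the less practice we get, and the worse the rare manual takeover goes.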

As part of the discussion of driverless vehicles, this episode touches on a question that's been intriguing me for some time: what do we do when someone is killed by a driverless car?

The vast majority of vehicle accidents are due to human error. Automation will reduce that number, but there will still be collisions, and lives will be lost. Right now, we have a two-tiered means of assigning responsibility for damage to lives and property: we hold the individual driver responsible, and we distribute some of the financial liability across all drivers in the form of insurance.

But what happens when the fault is in the software, and virtually all cars run on a single platform (say, Google's)? Will the company take on the cumulative liability for the fatalities caused by errors in its product? Or do we establish a national fund to dispense payouts in the event of fatalities?

That liability might be too great even for Google to bear, but would a tax-funded system amount to a massive governmental payout to private industry? And would the industry then have less incentive to perfect its technology and save lives, since it would be insulated from liability? 

Anyway, 99% Invisible is all about design, and it's pretty great. Listen to the full episode, with lots of supporting video, right here.