Haven't I mentioned the podcast 99% Invisible before? That was an oversight.
In this episode, they talk about driverless cars and the "automation paradox," which goes something like this:
- Humans are weak at performing a dangerous task.
  Example: Navigating with a paper map while driving in unfamiliar places.
- We create a form of automation that replaces our imperfect performance.
  Example: Driving with a GPS and letting it handle all the navigation for you.
- People become weaker at the task because they increasingly rely on the automated solution.
  Example: When was the last time you used a paper map to locate something?
- When the automation fails or doesn't perform as expected, humans are more prone to failing at the difficult task.
As part of its discussion of driverless vehicles, this episode touches on a question that's been intriguing me for some time: how will we handle it when someone is killed by a driverless car?
The vast majority of vehicle accidents are due to human error. Automation will reduce that number, but there will still be collisions, and lives will be lost. Right now, we have a two-tiered means of addressing the responsibility for damage to lives and objects: we hold the individual responsible, and we distribute some of the financial liability throughout all drivers in the form of insurance.
But what happens when the fault lies in the software, and virtually all cars are running on a single platform (say, Google's)? Will the company take on the cumulative liability for the fatalities caused by errors in its product? Or do we establish a national fund to dispense payouts in the event of fatalities?
That liability might be too great even for Google to bear, but wouldn't a tax-funded system amount to a massive governmental subsidy of private industry? And wouldn't the industry have less incentive to perfect its technology and save lives, since it would be insulated from liability?
Anyway, 99% Invisible is all about design, and it's pretty great. Listen to the full episode, with lots of supporting video, right here.