
Does Tesla have blood on its hands? The complex morality of autonomy

We don’t live in a black and white world. Reality has a whole messy collection of grays too, not to mention stripes and plaids. And while we like simple answers that cut through the morass, real life doesn’t always provide them.

On Thursday the National Highway Traffic Safety Administration (NHTSA, pronounced “nit-sah”) announced plans to investigate the May 7 fatality of a Florida man behind the wheel of – but not driving – a Tesla Model S. The car has an Autopilot feature that allows it to take full control of highway driving, and during this accident, the car was in control.


So is Tesla at fault? The real answers are far from black and white.

Beta testing at 80 mph

Tesla’s Autopilot feature is a “beta” that’s disabled every time you turn the car off. This driver (and every driver who wants the feature) had to turn it on and click through the warnings. And there are many warnings. Among them is this one:

Warning: Traffic-Aware Cruise Control can not detect all objects and may not detect a stationary vehicle or other object in the lane of travel. There may be situations in which Traffic-Aware Cruise Control does not detect a vehicle, bicycle, or pedestrian. Depending on Traffic-Aware Cruise Control to avoid a collision can result in serious injury or death.

Maybe the driver is responsible. That warning is pretty clear — but disclaimers are just that: disclaimers. You don’t get to absolve yourself of responsibility simply because you post a note saying you aren’t responsible. If a restaurant hung a sign saying “eat at your own risk,” would that get it off the hook for food poisoning?

That said, what does “beta” mean in this context? Cars aren’t computers. We’re fine dealing with “beta” software on a computer, where crashes are as frequent as unpopped kernels in a bag of popcorn. Crashes on the highway don’t lead to rebooting; they lead to twisted metal. Given the potential outcomes alone, unfinished software shouldn’t be released to users.


A note on Tesla’s website carries more than a tinge of defensiveness, as though a project manager at the company is already preparing to be excoriated for the death. The blog post is titled “A Tragic Loss,” but it opens not with notes of sadness but with this comment on the incidence of collisions:

“This is the first known fatality in just over 130 million miles where Autopilot was activated. Among all vehicles in the US, there is a fatality every 94 million miles. Worldwide, there is a fatality approximately every 60 million miles.”

It’s as if the company were saying, “Hey, we didn’t do it! Lots of people die every year!!” Only in the final paragraph of the note does the company acknowledge that “the customer who died in this crash had a loving family and we are beyond saddened by their loss… We would like to extend our deepest sympathies to his family and friends.”

Humans will be humans

A tale of two cars

David Weinberger, a senior researcher at Harvard’s Berkman Center, wrote an essay for us last year titled, “Should your self-driving car kill you to save a schoolbus full of kids?”

Today’s cars should do all they can to preserve the life of the driver and passenger, he argued, because that’s about as far as today’s tech can go. In the future, when cars are completely networked, they’ll know all about their passengers as well — at which point cars will need to make moral judgments.

Imagine this scenario: Two autonomous cars are about to crash, and the computers driving can save either one, but not both. One has a 25-year-old mother in it. The other has a 70-year-old childless man in it. Do we program our cars to always prefer the life of someone young? Of a parent? Do we give extra weight to the life of a medical worker beginning a journey to an Ebola-stricken area, or a renowned violinist, or a promising scientist or a beloved children’s author?

But it’s not Tesla’s fault, at least not completely. When Tesla enabled the Autopilot feature, people invariably posted videos of themselves jumping in the backseat while the car steered down the highway. One man was caught napping behind the wheel of his Tesla as the car blithely drove itself down the highway. Even in a fully autonomous vehicle, which Tesla doesn’t claim to manufacture, we should be awake and alert as 5,000 pounds of steel, leather, and batteries zips us along at 80 miles per hour.

Cars aren’t toys, and cars that can steer themselves and avoid obstacles shouldn’t turn us into passengers or children.

For another thing, records reveal that the driver had eight speeding tickets in six years. In theory, a self-driving car could turn him into a better driver, one who obeys the speed limits and doesn’t change lanes recklessly. That’s in the future, of course, when cars are fully autonomous. Today’s cars are hardly smart enough.

Perhaps the trillion-dollar question in this case – “Is it Tesla’s fault?” – should be rephrased as, “How do you deal with human nature?”

It’s inevitable that people will act recklessly – the videos of people pulling stupid stunts are evidence of that. How do self-driving cars (and the people who program them) deal with that? Google has said it wants to make its cars drive more like humans. After all, human drivers expect other vehicles on the road to act as they would, and humans aren’t good drivers. Imagine if the car in front of you came to a full stop at that yellow light as it’s supposed to, rather than tearing through as you would. Would that catch you by surprise? Having a car that anticipates human foibles and knows enough to accelerate through a yellow light may reduce accidents.

A speed bump on the road to autonomy

The ultimate point of self-driving vehicles is just that: reducing accidents. Call them what they really are: collisions, and usually avoidable ones at that. More than a million people die every year in vehicle crashes, and the vast majority of them are caused simply because humans are human. We look at cell phones. We get distracted by others, our own thoughts, the radio, passing street signs, UFOs, whatever.

While this incident was a tragedy, it shouldn’t detract from the larger goal of reducing vehicular deaths. If designed right, computers will be much better drivers than we are – they never tire, they don’t get distracted, they come to a full stop and respect yellow lights. The road to complete autonomy for cars is potholed and full of obstacles. But let’s keep the destination in our sights.
