
Tesla’s Autopilot is in the hot seat again over driver misuse

Tesla isn’t doing enough to prevent misuse of its Autopilot feature, according to the National Transportation Safety Board (NTSB), which called the company out over the issue. 


During a hearing on Tuesday about a March 2018 crash in which the driver died while misusing Autopilot, the NTSB said that Tesla needs to do more to improve the feature’s safety.

According to multiple reports, the NTSB made Autopilot safety recommendations to six automakers — including Volkswagen, BMW AG, and Nissan — in 2017, and Tesla is the only one that has yet to respond. 

The board also determined that the driver in the 2018 crash was playing a video game on his phone instead of watching the road. During Tuesday’s hearing, the NTSB said that while the driver was clearly distracted and relying solely on Autopilot, the car’s forward-collision warning and automatic emergency braking systems did not activate, according to CNBC. 

“If you own a car with partial automation, [you do] not own a self-driving car. So don’t pretend you do,” said NTSB Chair Robert Sumwalt during the hearing. 


Tesla’s Autopilot has come under scrutiny before, but mostly over drivers’ actions rather than the technology itself. Drivers have fallen asleep behind the wheel, letting Autopilot take control of their cars — behavior the company explicitly warns against. 

“While using Autopilot, it is your responsibility to stay alert, keep your hands on the steering wheel at all times, and maintain control of your car. Before enabling Autopilot, the driver first needs to agree to ‘keep your hands on the steering wheel at all times’ and to always ‘maintain control and responsibility for your vehicle,’” Tesla’s support page on Autopilot reads.

Tesla had its first Autopilot fatality in 2016, but the NTSB reported that the driver was at fault for not paying attention. However, the NTSB also said that Tesla “could have taken further steps to prevent the system’s misuse,” according to Reuters. 

Digital Trends reached out to Tesla to comment on Tuesday’s hearing, as well as to find out what it is doing to ensure that drivers use the Autopilot feature properly. We will update this story when we hear back. 

Allison Matyus
Former Digital Trends Contributor