
Germany tells Tesla drivers to pay attention and follow instructions

Just last week the German Transport Ministry called Tesla’s Autopilot feature a “significant traffic hazard,” and now we know what the government agency intended to do about it. It wrote a letter. Tesla owners in Germany received a notice from the Federal Motor Authority telling them to stay in control of their cars and follow the manufacturer’s instructions, according to Electrek.

The official notification raised some eyebrows and drew pushback. The criticism rested on the assumption that Tesla’s Autopilot is a fully autonomous feature, which it is not. Tesla has consistently stated that Autopilot is a driver assistance feature. When drivers engage it, they are given a visual reminder to stay alert and maintain control. Additional visual and audio alerts appear if drivers keep their hands off the wheel or if the Autopilot system senses an impending danger.


The letter sent to Tesla owners states that public perception of the system made it necessary. Apparently, too many people expect too much of the system, driving with Autopilot engaged as if it were actually capable of autonomous control, even though the manufacturer warns it is not.

The letter repeats several times that Autopilot is for driver assistance only and therefore driver attention and control are required at all times. Owners are directed to read the Tesla owner’s manual, especially the chapter “Driver Assistance — restrictions.” That chapter outlines the Autopilot system’s limitations and describes the alerts that will be given.

Summarizing the warnings and strictures, the letter states, as translated from the original German: “In this context, there is executed the following: ‘It is for the driver to stay alert, drive safely, and at any time to keep control of the vehicle.’”

The 10-to-15-year shift to autonomous vehicles has begun, and three trends will be evident until the technology matures. Drivers will assume semi-autonomous or assistance systems are more capable than they actually are. Carmakers will walk a thin line between marketing and caution. And government regulators and consumer groups will stay watchful, weighing the eventual traffic-safety benefits of fully self-driving cars against drivers’ irresponsible use of incrementally improving assistance features.

Bruce Brown, Contributing Editor
Tesla posts exaggerate self-driving capacity, safety regulators say

The National Highway Traffic Safety Administration (NHTSA) is concerned that Tesla’s use of social media and its website makes false promises about the automaker’s Full Self-Driving (FSD) software. The warning dates back to May, but was made public in an email to Tesla released on November 8.

The NHTSA opened an investigation in October into 2.4 million Tesla vehicles equipped with the FSD software, following three reported collisions and a fatal crash. The investigation centers on FSD’s ability to perform in “relatively common” reduced-visibility conditions, such as sun glare, fog, and airborne dust.

In these instances, it appears that “the driver may not be aware that he or she is responsible” for making appropriate operational selections, or may not “fully understand” the nuances of the system, NHTSA said.

Meanwhile, “Tesla’s X (Twitter) account has reposted or endorsed postings that exhibit disengaged driver behavior,” Gregory Magno, the NHTSA’s vehicle defects chief investigator, wrote to Tesla in an email.

The postings, which included reposted YouTube videos, may encourage viewers to see FSD as a “Robotaxi” instead of a partially automated, driver-assist system that requires “persistent attention and intermittent intervention by the driver,” Magno said.

In one of a number of Tesla posts on X, the social media platform owned by Tesla CEO Elon Musk, a driver was seen using FSD to reach a hospital while suffering a heart attack. In another post, a driver said he had used FSD for a 50-minute ride home. Meanwhile, third-party comments on the posts promoted the advantages of using FSD while under the influence of alcohol or when tired, NHTSA said.

Tesla’s official website also promotes conflicting messaging about the capabilities of the FSD software, the regulator said.

NHTSA has requested that Tesla revisit its communications to ensure its messaging remains consistent with FSD’s approved instructions, namely that the software provides only a driver-assist system requiring drivers to remain vigilant and maintain constant readiness to intervene.

Tesla last month unveiled the Cybercab, an autonomous-driving EV with no steering wheel or pedals. The vehicle has been promoted as a robotaxi, a self-driving vehicle operated as part of a ride-hailing service, such as the one already offered by Alphabet-owned Waymo.

But Tesla’s self-driving technology has remained under the scrutiny of regulators. FSD relies on multiple onboard cameras to feed machine-learning models that, in turn, help the car make decisions based on what it sees.

Meanwhile, Waymo’s technology relies on premapped roads, sensors, cameras, radar, and lidar (laser-based ranging), which may be costly but has met the approval of safety regulators.
