
Stickers on street signs can confuse self-driving cars, researchers show

Image: street signs (Vicnt/123RF)
Engineers developing autonomous cars certainly have their work cut out as they try to perfect the technology to make the safest vehicles possible, but it’s often the unexpected issues that pop up along the way that can leave them scratching their heads.

A few months ago, for example, it was revealed that bird poop had been causing havoc with the sensors on autonomous cars, with a direct hit obscuring their ability to “see,” making the vehicle about as safe as a human driver tootling along with their eyes closed. While Waymo has overcome the poop problem with the development of tiny water squirters and wipers that spring into action the moment the gloop hits the sensor, another issue has just reared its ugly head that clearly requires urgent attention if we’re ever to see self-driving technology rolled out in a meaningful way.


Interested in testing the all-important sensors that help a car to make sense of its surroundings and make decisions at speed, security researchers at the University of Washington recently tampered with a street sign — under lab conditions, of course — to see if it would confuse the technology.

It did.


The researchers said that by printing stickers and attaching them in a particular way to different street signs, they were able to confuse the cameras used by “most” autonomous vehicles, Car and Driver reported.

Rather worryingly, the team managed to confuse a self-driving car into thinking a regular “stop” sign was a 45-mph speed limit sign, simply by adding a few carefully placed stickers to it.

The alterations can be small enough to go largely unnoticed by humans because the camera’s software uses an algorithm to interpret the image, and it does so in a profoundly different way to how a human does. So the sign used in the test still clearly shows the word “stop,” despite the addition of the graffiti-like stickers that trick the car into thinking it means something else.

The researchers suggest that if hackers are able to access the algorithm, they could use an image of the road sign to create a customized, slightly altered version capable of confusing the car’s camera.
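For readers curious about the mechanics, this kind of trick belongs to a family of techniques known as “adversarial examples.” The sketch below, written in Python with PyTorch, shows the general idea in its simplest form; the pretrained model, the stand-in image, and the target label are all assumptions made purely for illustration and are not the researchers’ actual setup.

```python
# Minimal, hypothetical sketch of a targeted "adversarial example" attack.
# This is NOT the researchers' method; the model, image, and label are stand-ins.
import torch
import torch.nn.functional as F
from torchvision import models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

# Stand-in for a photo of a street sign (random pixels, purely illustrative).
image = torch.rand(1, 3, 224, 224, requires_grad=True)
target = torch.tensor([919])  # hypothetical label the attacker wants the model to output

# Ask the model how the image would need to change to look more like the target class.
loss = F.cross_entropy(model(image), target)
loss.backward()

# Nudge every pixel a tiny amount in that direction: barely visible to a human eye,
# but potentially decisive for the classifier.
epsilon = 0.03
adversarial = (image.detach() - epsilon * image.grad.sign()).clamp(0, 1)

print("prediction:", model(adversarial).argmax(dim=1).item())
```

The point the sketch illustrates is that each pixel changes only slightly, yet the sum of those tiny nudges can be enough to push the model toward a completely different label, which is why small stickers can have an outsized effect.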

The implications of such confusion aren’t hard to imagine. A self-driving car speeding through a stop sign that it mistook for a speed limit sign could put it in the path of an oncoming vehicle, though if both cars were equipped with self-driving tech, their systems should act to prevent a catastrophic collision. In such cases, tampering with street signs has the potential to cause huge amounts of chaos on the roads rather than anything more serious.

But what happens if the entire sign is fake, having been put up by pranksters? That does happen from time to time, so how will the driverless car tell the difference between a fake sign and a genuine one? While the car’s mapping technology will add to its knowledge of its immediate surroundings, information on temporary signs for construction or incidents may have to be transmitted to driverless cars ahead of time to avoid issues. The technology could also take into account contextual information, prompting it to ignore, say, a (fake) 80-mph sign in a residential area.
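As a thought experiment, the kind of contextual check described above might look something like the snippet below. It is purely a hypothetical sketch; the zone categories and speed thresholds are invented for illustration and don’t reflect any real vehicle’s logic.

```python
# Hypothetical sketch of a contextual plausibility check for detected signs.
# Zone categories and speed thresholds are invented for illustration only.
PLAUSIBLE_MAX_MPH = {"residential": 35, "arterial": 55, "highway": 85}

def sign_is_plausible(detected_limit_mph: int, zone: str) -> bool:
    """Return True if a detected speed-limit sign is believable for the current zone."""
    return detected_limit_mph <= PLAUSIBLE_MAX_MPH.get(zone, 85)

# A (fake) 80-mph sign spotted in a residential area would be flagged and ignored.
print(sign_is_plausible(80, "residential"))  # False
```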

Trevor Mogg
Contributing Editor