
I was on the self-driving bus that crashed in Vegas. Here’s what really happened

This was going to be a pretty ordinary story about the technology that lets the Las Vegas self-driving shuttle bus cruise around the city's old Fremont Street district. But then we had a fender-bender just an hour into our ride in the shuttle, and things changed. By the time I got back to the airport, the Internet was blowing up with third-hand accounts of the accident involving the new autonomous bus, and the more the story has gone viral, the less it resembles what actually happened.

Get on the bus

The downtown Las Vegas self-driving shuttle isn't exactly new. Similar buses have been plying the Strip for a while this year, with no incidents. The new shuttle is set to run a 0.6-mile loop north of the Strip in the old downtown area. It's a joint project of the City of Las Vegas, AAA, the Regional Transportation Commission of Southern Nevada, and Keolis North America, which operates mass transit in Las Vegas.


The shuttle bus itself is a Navya Arma, an autonomous electric vehicle from France that's already in use in several European cities. Let's face it, self-driving tech is here: We've seen self-driving forklifts, and fleets of trucks will drive themselves around the U.K. next year. Horseless carriages are now driverless in Phoenix, thanks to Waymo. The Vegas self-driving shuttle holds about 12 people, including an attendant from Keolis. The attendant is kind of like an elevator operator: they don't really need to be there, but they make people feel more comfortable about using the new tech.

The organizers of the new shuttle line held a press event to launch the service. They got NASCAR driver Danica Patrick, magicians Penn & Teller, the Mayor of Las Vegas, and various other dignitaries to talk it up. Among the points made was that over 90 percent of traffic collisions are due to human error, and the organizers hope the new shuttle bus will make Las Vegas streets a little safer.

So what really happened?

Once the speechifying was over, the press and the public in attendance were invited to take a loop ride on the bus. The little shuttle did about 10 laps carrying people around, and when the crowd thinned out I went to take a ride and get some photos.


The bus drives very conservatively. If it senses a person walking across the street ahead, it stops. If there’s traffic on the street when it’s at a stop, it waits for the road to clear. It goes along at about 20 mph, and it’s a really gentle ride. The self-driving shuttle does exactly what it’s supposed to do.
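
To make that behavior concrete, here's a rough sketch of the kind of stop-and-wait rules at work. This is not Navya's or Keolis's actual code; every name and threshold below is invented purely to illustrate how a conservative policy like the one I observed might look, written in Python.

# Hypothetical sketch of the cautious driving policy described above.
# None of these names or numbers come from Navya or Keolis; they're illustrative only.

MAX_SPEED_MPH = 20  # the shuttle cruises at roughly this speed

def choose_action(pedestrian_ahead: bool, at_stop: bool,
                  cross_traffic: bool, obstacle_distance_m: float) -> str:
    """Pick a driving action using simple, conservative rules."""
    if pedestrian_ahead:
        return "stop"                      # yield to anyone crossing the street
    if at_stop and cross_traffic:
        return "wait"                      # hold at the stop until the road clears
    if obstacle_distance_m < 10.0:
        return "stop"                      # keep a safe gap behind a stopped vehicle
    return f"cruise at {MAX_SPEED_MPH} mph"

# The situation from our ride: stopped a reasonable distance behind the delivery truck.
print(choose_action(pedestrian_ahead=False, at_stop=False,
                    cross_traffic=False, obstacle_distance_m=6.0))  # -> "stop"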

On our ride, we encountered a medium-large articulated delivery truck stopped in the street. The driver was trying to back his trailer into an alleyway on the left. The shuttle bus very obediently stopped a reasonable distance from the truck and waited for it to move. That’s where things went wrong.

What the autonomous shuttle bus didn’t expect was that the truck would back up towards it. As the driver was swinging the trailer into the alley, the tractor portion of the truck was coming right at us – very slowly. We had plenty of time to watch it happen. I was taking pictures.


The driver of the truck was probably watching where his trailer was going, and didn’t notice where we were. The so-called “crash” happened in super slow motion, and merely dented the plastic panels on the front of the shuttle. It was no big deal, although the Keolis attendant was understandably upset.

Analyzing the situation

This collision, like 90 percent of traffic incidents on our roads, was the result of human error. The truck driver got a ticket from the Las Vegas police. We could see his mirrors the whole time and he should have seen us. But I don’t want to be too harsh on the guy – driving a big truck in Las Vegas is a tough job, and he’s only human. His error could have happened to anyone.


On the other side, the shuttle did exactly what it was programmed to do, and that’s a critical point. The self-driving program didn’t account for the vehicle in front unexpectedly backing up. We had about 20 feet of empty street behind us (I looked) and most human drivers would have thrown the car into reverse and used some of that space to get away from the truck. Or at least leaned on the horn and made our presence harder to miss. The shuttle didn’t have those responses in its program.

My suggestion to Navya and Keolis: if the shuttle doesn't already have rear-facing cameras and LIDAR, add them, and program the shuttle to reverse out of the way when something is coming toward it. A horn the attendant can sound would be a good feature, too. But here's the key thing about autonomous cars: we humans will learn from this accident, add those features, and make all future shuttle buses better. In a very short while, any self-driving shuttle will know what to do in this kind of situation, the same way cars like the 2018 Audi A8 already steer themselves through traffic jams.
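
For what it's worth, the missing behavior is easy to describe in code. The sketch below is my own illustration of the reverse-or-honk fallback I'm suggesting, not anything from Navya's software; every name and threshold in it is made up.

# Hypothetical sketch of a reverse-or-honk fallback for a stopped shuttle.
# Invented names and thresholds; not Navya's API or actual logic.

from dataclasses import dataclass

@dataclass
class Perception:
    closing_speed_mps: float     # > 0 means the vehicle ahead is backing toward us
    clear_space_behind_m: float  # free road reported by rear-facing cameras/LIDAR

def evasive_action(p: Perception) -> str:
    if p.closing_speed_mps <= 0:
        return "hold"                           # nothing approaching; stay put
    if p.clear_space_behind_m > 3.0:
        return "sound horn and reverse slowly"  # use some of the empty street behind
    return "sound horn"                         # no room to back up, so make noise

# The scenario from our ride: truck creeping toward us, roughly 20 feet (6 m) of clear street behind.
print(evasive_action(Perception(closing_speed_mps=0.5, clear_space_behind_m=6.0)))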

So there you have it. As usual, the reality is not as sensational as most of the news out there would have you believe. This incident? Just another item checked off in the history of self-driving cars. Next time you're in Vegas, give the self-driving shuttle a chance.

Jeff Zurschmeide