How crowdsourced lidar could give your car X-ray-like superpowers

One of my uncles always tells a story about how, when he was a kid, his mom would tell him she could see around corners. They would be out walking someplace, only for his mom to tell him the details of some vehicle or person that was about to appear around a bend in the road. A few seconds later and, sure enough, that vehicle or person would appear, exactly as described. Magic, surely?

Of course, it wasn’t magic at all: His mom — my grandmother — was just taller than he was, and could see over walls and other obstacles that he wasn’t able to. What appeared to be some kind of superpower was really just about having a superior vantage point.

Now, researchers from the U.K.’s University of Cambridge, University of Oxford, and University College London want to give every car on the planet the ability to see around corners. And, with genuine magic being in short supply, they’ve come up with a way to shift the world’s vantage points using a combination of lidar, augmented reality, and crowdsourcing.

If it works as promised — and that’s a big if — it could transform the way we drive by letting drivers “see through” objects, alerting them to potential hazards without distracting them in the process. For good measure, it will even “beam” that information directly into your eye.

Crowdsourcing lidar

Lidar (light detection and ranging) refers to the depth-sensing, bounced-laser mapping technology that allows many self-driving cars to perceive the world around them. As it happens, those last four words — “the world around them” — are the bit that the researchers behind this project want to change. To give drivers something akin to X-ray vision that allows them to spot obstacles hidden from view — such as a motorcyclist momentarily obscured behind a vehicle — they want to build a massive crowdsourced map of lidar data gathered from all road users.

For an analogy of what this might look like, think of that scene from Christopher Nolan’s 2008 movie The Dark Knight in which Batman hacks every cell phone in Gotham City and converts them into a high-frequency generator, stitching together all the location data to build a three-dimensional schematic of the city, from buildings to people. As Lucius Fox, the perturbed Wayne Enterprises boss, says: “You took my sonar concept and applied it to every phone in the city. Half the city feeding you sonar; you can image all of Gotham.”

The idea of car-to-car communication for collaborative purposes isn’t exactly science fiction. Starting with Waze, many mapping apps have used the driving data of different users to build up a pretty detailed picture of what is happening on the road in terms of the free flow of traffic. Tesla, meanwhile, collects large amounts of road data from vehicle owners via its Full Self-Driving beta test fleet. In 2017, Tesla asked vehicle owners if they were willing to provide videos collected using their cars’ onboard Autopilot cameras. This data, while collected by individual vehicles, is combined to make the overall fleet smarter and better able to deal with obstacles.

What this latest lidar project adds to that is the gathering of 360-degree point cloud data that can be aggregated to give every road user a clear view of their surroundings.
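To make the aggregation idea concrete, here is a minimal, purely illustrative sketch (the poses, points, and function names are all made up for this example, and real systems work with dense 3D clouds rather than a handful of 2D points): each scanner reports points in its own local frame, so fusing scans means transforming every point into a shared world frame using the scanner’s pose, then pooling the results.

```python
import math

# Hypothetical sketch of crowdsourced point cloud fusion. Each scan is a
# (pose, points) pair: the scanner's world pose (x, y, heading in radians)
# and the 2D points it measured in its own local frame.

def to_world(points, pose):
    """Rotate and translate local (x, y) lidar points into the world frame."""
    px, py, heading = pose
    cos_h, sin_h = math.cos(heading), math.sin(heading)
    return [(px + x * cos_h - y * sin_h,
             py + x * sin_h + y * cos_h) for x, y in points]

def fuse(scans):
    """Merge every scan into one shared, world-frame point cloud."""
    cloud = []
    for pose, points in scans:
        cloud.extend(to_world(points, pose))
    return cloud

# Two vehicles on the same street, facing each other, each see a point
# the other cannot; fusing their scans gives both a fuller picture.
scans = [
    ((0.0, 0.0, 0.0), [(5.0, 0.0)]),        # car A at the origin, facing east
    ((10.0, 0.0, math.pi), [(3.0, 0.0)]),   # car B at x=10, facing west
]
merged = fuse(scans)
# merged ≈ [(5.0, 0.0), (7.0, 0.0)] in the shared world frame
```

The key design point is that fusion only works once every contribution is expressed in a common coordinate frame — which is why accurate vehicle positioning matters as much as the lidar returns themselves.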

As Jana Skirnewskaja, a researcher on the team, told Digital Trends, these are still relatively early days for the project. So far, the team has carried out a proof of concept by scanning Malet Street, a busy street in London, using multiple lidar scanners in various positions. This data was then used to build a 3D model.

3D Model of Malet St, Central London, Based on LiDAR Data

“We scanned Malet Street from 10 different positions using 10 different data scanners,” Skirnewskaja told Digital Trends. “This allows us to fully re-create the street how it is at that moment, so any objects — hidden or not — will be [represented in] the point cloud. This allows us to erase objects that we don’t want to see, and choose the objects which are hidden … and project them.”
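The erase-and-select workflow Skirnewskaja describes can be sketched as simple filtering over a segmented point cloud. This is an assumption-laden illustration — the labels, point format, and helper names below are invented for the example, and real pipelines would segment objects from raw points first:

```python
# Illustrative sketch: once the aggregated cloud has been segmented into
# labeled objects, "erasing" an occluding vehicle and picking out a hidden
# pedestrian amounts to filtering points by object label.

cloud = [
    {"xyz": (4.0, 1.0, 0.5), "label": "parked_van"},   # blocks the driver's view
    {"xyz": (6.5, 1.2, 0.9), "label": "pedestrian"},   # hidden behind the van
    {"xyz": (2.0, -3.0, 0.1), "label": "road"},
]

def erase(cloud, label):
    """Drop every point belonging to an object we don't want to see."""
    return [p for p in cloud if p["label"] != label]

def select(cloud, label):
    """Keep only the points of the hidden object we want to project."""
    return [p for p in cloud if p["label"] == label]

visible = erase(cloud, "parked_van")
to_project = select(visible, "pedestrian")
# to_project → [{"xyz": (6.5, 1.2, 0.9), "label": "pedestrian"}]
```

Because the crowdsourced cloud contains points captured from other vantage points, the “hidden” pedestrian exists in the data even though no line of sight from the driver’s own car reaches them — that is the whole trick.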

Beaming the information into drivers’ eyes

As it happens, this is only one half of the project. The other, equally impressive, bit involves projecting this information directly into the eye of the driver in ultra-high definition. This in-car technology, Skirnewskaja believes, could be a valuable alternative to 2D windscreen AR projection, as well as to emerging technology like augmented reality contact lenses.

“What our studies have shown is that it [causes] no harm at all to the pupil, to the human eye,” she said. “It can project, [directly] into the driver’s eye, any object. We can also use augmented reality to layer objects so that we project different objects, like road obstacles or signs or people or trees, at different sizes [to indicate] distances. The further away an object is, the smaller it will be. That can be realized.”
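The depth cue Skirnewskaja describes — farther objects rendered smaller — follows the familiar pinhole-camera relationship, where apparent size scales with 1/distance. A minimal sketch, assuming a simple pinhole model with an arbitrary made-up scale factor (the real holographic optics are far more involved):

```python
# Rough sketch of distance-scaled AR overlays: under a pinhole model,
# an overlay's rendered size is proportional to real_size / distance.
# SCALE is an arbitrary illustration value, not a real system parameter.

SCALE = 0.5

def apparent_size(real_size, distance):
    """Scale an overlay so that more distant objects render smaller."""
    return SCALE * real_size / distance

near = apparent_size(1.8, 10.0)   # a 1.8 m pedestrian 10 m away
far = apparent_size(1.8, 40.0)    # the same pedestrian at 40 m
# near is 4x far: quadrupling the distance shrinks the overlay fourfold
```

Rendering layered objects at sizes consistent with their true distances is what lets the driver read depth at a glance instead of parsing explicit range numbers.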

It means that, as a driver sits behind the wheel, they could have hazard information superimposed on the real world. “[Our work] has shown that we can already project in-eye 3D augmented reality objects on the road, and that these are properly aligned and not distracting the driver,” Skirnewskaja said.

She said that, initially, this will likely be fixed information, such as highlighting permanent obstacles that have caused other drivers problems. But, in the long term, it could be possible to track dynamic objects as well. In addition to gathering lidar data from other vehicles, Skirnewskaja said that cities could install lidar sensors along the sides of roads, similar to the way CCTV cameras are used today.

“We hope that it can be expanded further so that we can connect every car and project this information of road obstacles in real time,” she explained.

The team aims to work with established automotive companies as part of the project; Skirnewskaja suggested that these include Jaguar Land Rover and VW. At present, the researchers are working to miniaturize the optical components used in their experimental holographic setup so that they can be fitted into a car. After this, they plan to carry out vehicle tests on public roads in the city of Cambridge.

There’s no word on when this technology might ultimately go live, but, provided it works as well as described, it’ll certainly be worth waiting for.

A paper describing the work was recently published in the journal Optics Express.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…