A friend of mine who works in games design recently showed me a 3D model of the Earth, rendered in great detail using topographically accurate satellite data, so that we could soar through canyons and our respective neighborhoods at high speed like a pair of joyriding Supermen. “Let’s see if we can go underwater,” he said, exhilarated, as we flew out over the Pacific.
We couldn’t. The model, so stunningly accurate on land, apparently had zero data with which to model the undersea environment. It was an unrendered void beneath the water’s glassy surface, as if this were some subaquatic version of The Truman Show, and we had reached the end of the world.
Neither of us was particularly surprised. The shock would have been if the oceans had been rendered. Where would that information have come from? And how accurate would it have been? It would have meant the model’s creators knew something that even the world’s foremost oceanographers do not.
For all the justifiable excitement around exploring space in the 2020s (Elon Musk is “highly confident” that humans will be rocketing toward Mars by 2026), our planet’s oceans remain a largely uncharted and unknown domain that’s much closer to home. Water covers around 71 percent of Earth’s surface, with the freshwater stuff we drink accounting for a minuscule 3 percent, little more than a rounding error. But the overwhelming majority of the Earth’s oceans — up to 95 percent — are an unexplored mystery.
While we’re still a long way off from a Google Street View equivalent for the undersea world, a new project being carried out by researchers at Stanford University could pave the way for just such a thing in the future — and a whole lot more besides. Picture being able to fly an airplane over a stretch of water and see, with absolute clarity, what’s hiding beneath the waves.
It sounds impossible. As it turns out, it’s just really, really difficult.
The issue with lidar, the trouble with sonar
“Imaging underwater environments from an airborne system is a challenging task, but one that has many potential applications,” Aidan James Fitzpatrick, a graduate student in Stanford University’s department of electrical engineering, told Digital Trends.
The obvious candidate for this imaging job is lidar. Lidar is the bounced laser technology most famous for helping (non-Tesla) autonomous vehicles to perceive the world around them. It works by emitting pulsed light waves and then measuring how long they take to bounce off objects and return to the sensor. Doing this allows the sensor to calculate how far the light pulse traveled and, as a result, to build up a picture of the world around it. While self-driving cars remain the best-known use of lidar, it can be used as a powerful mapping tool in other contexts as well. For example, researchers used it in 2016 to uncover a long-lost city hidden beneath dense foliage cover in the Cambodian jungle.
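To make the time-of-flight idea concrete, here is a minimal sketch in Python. It is a generic, back-of-the-envelope illustration rather than any real lidar system’s code: the range to an object falls straight out of the echo delay and the speed of light.

```python
# Toy time-of-flight range calculation, the principle behind lidar.
# The sensor timestamps an outgoing pulse and its returning echo;
# the round trip at the speed of light gives the distance to the target.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def range_from_time_of_flight(round_trip_seconds: float) -> float:
    """Distance to the reflecting object, in meters."""
    # Divide by two because the pulse travels out and back.
    return SPEED_OF_LIGHT * round_trip_seconds / 2.0

# A pulse that returns after 200 nanoseconds bounced off something about 30 m away.
print(range_from_time_of_flight(200e-9))  # ~29.98
```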
Lidar isn’t appropriate for this kind of mapping, though. Although advanced, high-power lidar systems perform well in extremely clear waters, much of the ocean — especially coastal water — tends to be murky and opaque to light. As a result, Fitzpatrick said, much of the underwater imaging performed to date has relied on in-water sonar systems that use sound waves able to propagate through murky waters with ease.
Unfortunately, there’s a catch here, too. In-water sonar systems are mounted to, or towed by, a slow-moving boat. Imaging from an airborne vehicle would be far more effective, since it could cover a much greater area in less time. But conventional sonar can’t simply be flown: sound waves cannot pass from air into water and then back again without losing 99.9999 percent of their energy.
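That startling number is a consequence of the acoustic impedance mismatch between air and water. As a rough sanity check (the textbook normal-incidence formula with approximate impedance values, not the researchers’ own calculation), a few lines of Python give the same order of magnitude:

```python
# Back-of-the-envelope check on why sonar can't simply be flown: the fraction
# of acoustic power that crosses a boundary between two media depends on how
# badly their acoustic impedances (density x speed of sound) are mismatched.
# Textbook normal-incidence formula; impedance values are rough approximations.

def power_transmission(z1: float, z2: float) -> float:
    """Fraction of acoustic power transmitted across an interface."""
    return 4 * z1 * z2 / (z1 + z2) ** 2

Z_AIR = 1.2 * 343       # ~410 Pa*s/m  (air: ~1.2 kg/m^3, sound at ~343 m/s)
Z_WATER = 1000 * 1480   # ~1.48e6 Pa*s/m (water: ~1000 kg/m^3, sound at ~1480 m/s)

one_way = power_transmission(Z_AIR, Z_WATER)   # ~0.001, i.e. only ~0.1% gets through
round_trip = one_way ** 2                      # ~1e-6, i.e. ~99.9999% lost out and back
print(f"one way: {one_way:.4%}, out and back: {round_trip:.6%}")
```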
What comes to PASS
Consequently, while lidar and radar systems have mapped the entire Earth’s landscape (emphasis on the “land”), only around 5 percent of the world’s oceans have been imaged and mapped in similar detail. That’s the equivalent of a world map that shows only Australia and leaves everything else dark, like some unexplored Age of Empires map.
“Our goal is to propose a technology which can be mounted on a flying vehicle to provide large-scale coverage while using an imaging technique that is robust in murky water,” Fitzpatrick said. “To do this, we are developing what we have coined a Photoacoustic Airborne Sonar System. PASS exploits the benefits of light propagation in air and sound propagation in water to image underwater environments from an airborne system.”
PASS works like this: First, a custom laser system fires a burst of infrared light that is absorbed by the first centimeter or so of water. That absorption causes the water to expand thermally, creating sound waves that can travel down into the water.
“These sound waves now act as an in-water sonar signal that was remotely generated using the laser,” Fitzpatrick continued. “The sound waves will reflect off underwater objects and travel back toward the water surface. Some of this sound – only about 0.06 percent – crosses the air-water interface and travels up toward the airborne system. High-sensitivity sound receivers, or transducers, capture these sound waves. The transducers [then] convert the sound energy to electrical signals which can be passed through image reconstruction algorithms to form a perceptible image.”
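To give a flavor of that last step, here is a deliberately simplified sketch of how echo timing maps to depth, assuming a flat water column and the textbook speed of sound in seawater. The PASS team’s actual reconstruction algorithms build full three-dimensional images and are not reproduced here.

```python
# Minimal sketch of turning echo arrival times into depths, the basic idea
# behind the reconstruction step. Assumes a pulse generated at the surface,
# a single vertical path, and a textbook sound speed for seawater; the real
# PASS algorithms form full 3D images and are far more involved.

SOUND_SPEED_WATER = 1480.0  # meters per second, roughly, for seawater

def depth_from_echo(echo_delay_seconds: float) -> float:
    """Depth of the reflecting object below the surface, in meters."""
    # The acoustic pulse travels down to the object and back up to the surface.
    return SOUND_SPEED_WATER * echo_delay_seconds / 2.0

# Echoes arriving 27 ms and 40 ms after the laser pulse hits the surface
# would correspond to reflectors roughly 20 m and 30 m down.
for delay in (0.027, 0.040):
    print(f"{delay * 1000:.0f} ms -> {depth_from_echo(delay):.1f} m")
```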
The things that lie beneath
So far, PASS is a work in progress. The team has demonstrated high-resolution, three-dimensional imaging in a controlled lab environment. But that demonstration, Fitzpatrick acknowledged, took place in a “container the size of a large fish tank,” although the technology is now “close to the stage” where it could be deployed over a large swimming pool.
There is, of course, a slight difference between a large swimming pool and the entirety of Earth’s oceans, and bridging that gap will require considerably more work. In particular, one big challenge to solve before testing in larger, less controlled environments is imaging through water with turbulent surface waves. Fitzpatrick said this is a head-scratcher, but one that “surely has feasible solutions,” some of which the team is already working on.
“PASS could be used to map the depths of uncharted waters, survey biological environments, search for lost wreckages, and potentially much more,” he said. “Isn’t it strange,” he added, “that we have yet to explore the entirety of the Earth we live on? Maybe PASS can change this.”
Combining light and sound to overcome the air-water interface problem would be a game changer. And after that? Bring on the army of mapping drones to finally show us what lies beneath the ocean’s surface.
A paper describing the PASS project was recently published in the journal IEEE Access.