
Exoskeletons with autopilot: A peek at the near future of wearable robotics

Automation makes things easier. It also makes things potentially scarier as you put your well-being in the hands of technology that has to make spur-of-the-moment calls without first consulting you, the user. A self-driving car, for instance, must be able to spot a traffic jam or swerving cyclist and react appropriately. If it can do this effectively, it’s a game-changer for transportation. If it can’t, the results may be fatal.


At the University of Waterloo, Canada, researchers are working on just this problem — only applied to the field of wearable robot exosuits. These suits, which can range from industrial wearables reminiscent of Aliens’ Power Loader to assistive suits for individuals with mobility impairments resulting from age or physical disabilities, are already in use as augmentation devices to aid their wearers. But they’ve been entirely manual in their operation. Now, researchers want to give them a mind of their own.

To that end, the University of Waterloo investigators are developing A.I. tools like computer vision that will allow exosuits to sense their surroundings and adjust movements accordingly — such as being able to spot flights of stairs and climb them automatically or otherwise respond to different walking environments in real time. Should they pull it off, it will forever change the usefulness of these assistive devices. Doing so isn’t easy, however.

The biggest challenge for robotic exoskeletons

“Control is generally regarded as one of the biggest challenges to developing robotic exoskeletons for real-world applications,” Brokoslaw Laschowski, a Ph.D. candidate in the university’s Systems Design Engineering department, told Digital Trends. “To ensure safe and robust operation, commercially available exoskeletons use manual controls like joysticks or mobile interfaces to communicate the user’s locomotor intent. We’re developing autonomous control systems for robotic exoskeletons using wearable cameras and artificial intelligence, [so as to alleviate] the cognitive burden associated with human control and decision-making.”

[Image: the wearable exoskeleton's chest-mounted camera. Credit: University of Waterloo]

As part of the project, the team first had to build ExoNet, which it claims is the largest-ever open-source image dataset of human walking environments, to train its A.I.-powered environment classification system. The data was gathered by having people wear a chest-mounted camera and walk around local environments while their movements were recorded; the resulting footage was then used to train neural networks.
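To give a sense of how a dataset like this gets used, here is a minimal sketch using off-the-shelf PyTorch tools. It assumes the frames are sorted into one folder per environment class; the folder path and class names are illustrative placeholders, not ExoNet's actual layout or label set.

```python
# Minimal sketch of loading an ExoNet-style dataset with PyTorch.
# Assumes frames are organized as root/<class_name>/<image>.jpg;
# path and class names are placeholders, not ExoNet's real structure.
import torch
from torchvision import datasets, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),        # match the CNN's expected input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

walking_frames = datasets.ImageFolder("exonet_frames/", transform=preprocess)
loader = torch.utils.data.DataLoader(walking_frames, batch_size=32, shuffle=True)

print(walking_frames.classes)  # e.g. ["level_ground", "stairs_up", "stairs_down", ...]
```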

“Our environment classification system uses deep learning,” Laschowski continued. “However, high-performance deep-learning algorithms tend to be quite computationally expensive, which is problematic for robotic exoskeletons with limited operating resources. Therefore, we’re using efficient convolutional neural networks with minimal computational and memory storage requirements for the environment classification. These deep-learning algorithms can also automatically and efficiently learn optimal image features directly from training data, rather than using hand-engineered features as is traditionally done.”
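For readers who want a picture of what that looks like in practice, here is a rough sketch in PyTorch. MobileNetV2 stands in as one well-known example of an efficient CNN; the architecture, class count, and training step below are assumptions for illustration, not the team's actual model.

```python
# Sketch: fine-tuning a lightweight CNN for environment classification.
# MobileNetV2 is one example of an "efficient" architecture; the team's
# exact model and number of classes may differ.
import torch
import torch.nn as nn
from torchvision import models

NUM_ENVIRONMENTS = 6  # placeholder: one output per walking-environment class

model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.last_channel, NUM_ENVIRONMENTS)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.CrossEntropyLoss()

# One illustrative training step on a synthetic batch; in practice the
# batches would come from a DataLoader over the camera frames. Features
# are learned by backpropagation rather than hand-engineered.
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_ENVIRONMENTS, (8,))

optimizer.zero_grad()
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```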

John McPhee, a professor of Systems Design Engineering at the University of Waterloo, told Digital Trends: “Essentially, we are replacing manual controls — [like] stop, start, lift leg for step — with an automated solution. One analogy is an automatic powertrain in a car, which replaces manual shifting. Nowadays, most people drive automatics because it is more efficient, and the user can focus on their environment more rather than operating the clutch and stick. In a similar way, an automated high-level controller for an exo will open up new opportunities for the user [in the form of] greater environmental awareness.”
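McPhee's transmission analogy maps neatly onto code: the high-level controller is essentially a lookup from the classified environment to a locomotion mode, with a conservative fallback when the classifier is unsure. The mode names and confidence threshold below are illustrative assumptions, not the Waterloo team's actual control logic.

```python
# Simplified sketch of an automated high-level controller: the vision
# classifier's output selects a locomotion mode, the way an automatic
# transmission selects a gear. All names here are illustrative assumptions.
LOCOMOTION_MODES = {
    "level_ground": "walk",
    "stairs_up":    "stair_ascent",
    "stairs_down":  "stair_descent",
    "incline":      "ramp_ascent",
}

def select_mode(environment_label: str, confidence: float,
                threshold: float = 0.9) -> str:
    """Pick a gait mode, falling back to a safe default when unsure."""
    if confidence < threshold:
        return "walk"  # low confidence: keep the conservative default
    return LOCOMOTION_MODES.get(environment_label, "walk")
```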

As with a self-driving car, the researchers note that the human user will be able to override the automated control system if the need arises. It will still take some faith to trust, for instance, that your exosuit will spot a descending flight of stairs before launching down them, but the wearer can take control whenever necessary.
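In software terms, that override can be a simple arbitration rule: any explicit command from the wearer beats the automated suggestion. A sketch, with hypothetical names:

```python
# Sketch of override arbitration: an explicit user command always takes
# precedence over the automated suggestion. Names are assumptions.
from typing import Optional

def arbitrate(auto_mode: str, user_command: Optional[str]) -> str:
    """Return the mode to execute; the human always wins."""
    return user_command if user_command is not None else auto_mode

# e.g. the classifier suggests stair descent, but the wearer holds "stop":
assert arbitrate("stair_descent", "stop") == "stop"
assert arbitrate("stair_descent", None) == "stair_descent"
```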

Still prepping for prime time

Right now, the project is a work in progress. “We’re currently focusing on optimizing our A.I.-powered environment classification system, specifically improving the classification accuracy and real-time performance,” said Laschowski. “This technical engineering development is essential to ensuring safe and robust operation for future clinical testing using robotic exoskeletons with autonomous control.”
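That real-time performance goal is concrete: the classifier has to keep pace with the camera. A quick, back-of-the-envelope way to check per-frame latency, using the same stand-in MobileNetV2 from the earlier sketch (the numbers are illustrative, not the team's reported results):

```python
# Sketch: timing per-frame inference to check real-time headroom.
# A 30 fps camera leaves roughly 33 ms per frame.
import time
import torch
from torchvision import models

model = models.mobilenet_v2(weights=None)  # architecture only; weights don't affect timing
model.eval()
dummy_frame = torch.randn(1, 3, 224, 224)  # one stand-in camera frame

with torch.no_grad():
    model(dummy_frame)                     # warm-up run
    start = time.perf_counter()
    for _ in range(100):
        model(dummy_frame)
    per_frame_ms = (time.perf_counter() - start) / 100 * 1000

print(f"~{per_frame_ms:.1f} ms per frame (budget at 30 fps: ~33 ms)")
```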

[Image: a wearable robot exoskeleton in use. Credit: University of Waterloo]

Should all go to plan, however, it hopefully won’t be long before such algorithms can be deployed in commercially available exosuits. These are already becoming more widespread, thanks to innovative companies like Sarcos Robotics, and are being used in ever more varied settings. They can also greatly enhance human capabilities beyond what the wearer could manage without the suit.

In some ways, it’s highly reminiscent of the original conception of the cyborg, not as some nightmarish Darth Vader or RoboCop amalgamation of half-human and half-machine, but, as researchers Manfred Clynes and Nathan Kline wrote in the 1960s, as “an organizational system in which … robot-like problems [are] taken care of automatically, leaving [humans] free to explore, to create, to think, and to feel.” Shorn of its faintly hippy vibes (this was the ’60s), the idea still stands: By letting robots autonomously take care of the mundane problems associated with navigation, the human users can focus on more important, engaging things. After all, most people don’t have to consciously think about the minutiae of moving one foot in front of the other when they walk. Why should someone in a robot exosuit have to do so?

The latest paper dedicated to this research was recently published in the journal IEEE Transactions on Medical Robotics and Bionics.

Luke Dormehl
Former Digital Trends Contributor