Exoskeletons with autopilot: A peek at the near future of wearable robotics

Automation makes things easier. It also makes things potentially scarier as you put your well-being in the hands of technology that has to make spur-of-the-moment calls without first consulting you, the user. A self-driving car, for instance, must be able to spot a traffic jam or swerving cyclist and react appropriately. If it can do this effectively, it’s a game-changer for transportation. If it can’t, the results may be fatal.

At the University of Waterloo, Canada, researchers are working on just this problem — only applied to the field of wearable robot exosuits. These suits, which can range from industrial wearables reminiscent of Aliens’ Power Loader to assistive suits for individuals with mobility impairments resulting from age or physical disabilities, are already in use as augmentation devices to aid their wearers. But they’ve been entirely manual in their operation. Now, researchers want to give them a mind of their own.

To that end, the University of Waterloo investigators are developing A.I. tools like computer vision that will allow exosuits to sense their surroundings and adjust movements accordingly — such as being able to spot flights of stairs and climb them automatically or otherwise respond to different walking environments in real time. Should they pull it off, it will forever change the usefulness of these assistive devices. Doing so isn’t easy, however.

The biggest challenge for robotic exoskeletons

“Control is generally regarded as one of the biggest challenges to developing robotic exoskeletons for real-world applications,” Brokoslaw Laschowski, a Ph.D. candidate in the university’s Systems Design Engineering department, told Digital Trends. “To ensure safe and robust operation, commercially available exoskeletons use manual controls like joysticks or mobile interfaces to communicate the user’s locomotor intent. We’re developing autonomous control systems for robotic exoskeletons using wearable cameras and artificial intelligence, [so as to alleviate] the cognitive burden associated with human control and decision-making.”
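To make the idea concrete, here's a minimal sketch of that perception-to-action loop. The `camera`, `classifier`, and `controller` objects, and the environment labels, are hypothetical stand-ins rather than the team's actual software interfaces.

```python
# Hypothetical sketch of a camera-driven high-level exoskeleton controller.
# None of these objects or labels come from the Waterloo project itself.

ENVIRONMENT_TO_MODE = {
    "level-ground": "walk",
    "incline-stairs": "stair-ascent",
    "decline-stairs": "stair-descent",
}

def control_step(camera, classifier, controller):
    """Run one perception-to-action cycle of the autonomous controller."""
    frame = camera.read()                     # latest chest-camera image
    environment = classifier.predict(frame)   # e.g., "incline-stairs"
    mode = ENVIRONMENT_TO_MODE.get(environment, "walk")
    controller.set_mode(mode)                 # mid-level controller executes the gait
```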

[Image: wearable robot exoskeleton camera. Credit: University of Waterloo]

As part of the project, the team had to develop an A.I.-powered environment classification system, built on a dataset called ExoNet, which it claims is the largest-ever open-source image dataset of human walking environments. The data was gathered by having people wear a chest-mounted camera and walk around local environments while it recorded their movement and locomotion. The images were then used to train neural networks.
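As a rough illustration of how such a dataset might feed a training pipeline, the sketch below loads an ExoNet-style image folder with PyTorch and torchvision. The folder layout, class structure, and preprocessing are assumptions for illustration, not the actual ExoNet format.

```python
# Loading an ExoNet-style dataset of walking-environment images.
# The "exonet/" folder layout is an assumption, not ExoNet's real structure.
import torch
from torchvision import datasets, transforms

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),                    # typical efficient-CNN input size
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],  # ImageNet statistics
                         std=[0.229, 0.224, 0.225]),
])

# Expects images sorted into per-class subfolders, e.g. exonet/incline-stairs/
dataset = datasets.ImageFolder("exonet/", transform=preprocess)
loader = torch.utils.data.DataLoader(dataset, batch_size=32, shuffle=True)
```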

“Our environment classification system uses deep learning,” Laschowski continued. “However, high-performance deep-learning algorithms tend to be quite computationally expensive, which is problematic for robotic exoskeletons with limited operating resources. Therefore, we’re using efficient convolutional neural networks with minimal computational and memory storage requirements for the environment classification. These deep-learning algorithms can also automatically and efficiently learn optimal image features directly from training data, rather than using hand-engineered features as is traditionally done.”
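The article doesn't name the team's exact architecture, but a lightweight network such as MobileNetV2 is representative of the kind of efficient CNN Laschowski describes. This sketch fine-tunes one for environment classification, reusing the `loader` from the previous snippet; the number and names of classes are assumptions.

```python
# Fine-tuning a representative efficient CNN (MobileNetV2) for environment
# classification. This is illustrative, not the Waterloo team's actual model.
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 3  # e.g., level ground, stair ascent, stair descent (assumed)

model = models.mobilenet_v2(weights=models.MobileNet_V2_Weights.DEFAULT)
model.classifier[1] = nn.Linear(model.last_channel, NUM_CLASSES)  # new head

optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

model.train()
for images, labels in loader:  # `loader` from the data-loading sketch above
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()
```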

John McPhee, a professor of Systems Design Engineering at the University of Waterloo, told Digital Trends: “Essentially, we are replacing manual controls — [like] stop, start, lift leg for step — with an automated solution. One analogy is an automatic powertrain in a car, which replaces manual shifting. Nowadays, most people drive automatics because it is more efficient, and the user can focus on their environment more rather than operating the clutch and stick. In a similar way, an automated high-level controller for an exo will open up new opportunities for the user [in the form of] greater environmental awareness.”
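McPhee's transmission analogy maps naturally onto a small finite-state machine that “shifts” gait modes as the classified environment changes. The states and transition rules below are illustrative guesses, not the project's actual controller logic.

```python
# A toy finite-state machine for automatic gait-mode "shifting".
# States and transitions are invented for illustration.

TRANSITIONS = {
    ("walk", "incline-stairs"): "stair-ascent",
    ("walk", "decline-stairs"): "stair-descent",
    ("stair-ascent", "level-ground"): "walk",
    ("stair-descent", "level-ground"): "walk",
}

def next_mode(current_mode: str, environment: str) -> str:
    """Shift gait mode automatically, staying put when no rule applies."""
    return TRANSITIONS.get((current_mode, environment), current_mode)
```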

As with a self-driving car, the researchers note that the human user will retain the ability to override the automated control system if the need arises. It will still take a bit of faith to trust, for instance, that your exosuit will spot a descending flight of stairs before starting down it, but the wearer can take control whenever necessary.
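In code, that override might amount to a simple arbitration rule: a manual command always wins, and an uncertain classification keeps the suit in its current mode rather than committing to a risky transition. Again, this is a hypothetical sketch, not the team's safety logic.

```python
def arbitrate(autonomous_mode: str, confidence: float, current_mode: str,
              manual_command: str | None = None, threshold: float = 0.9) -> str:
    """Pick the gait mode to execute for this control cycle (illustrative)."""
    if manual_command is not None:
        return manual_command    # the wearer's input always takes priority
    if confidence < threshold:
        return current_mode      # too uncertain: don't switch modes
    return autonomous_mode       # confident autonomous decision
```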

Still prepping for prime time

Right now, the project is a work in progress. “We’re currently focusing on optimizing our A.I.-powered environment classification system, specifically improving the classification accuracy and real-time performance,” said Laschowski. “This technical engineering development is essential to ensuring safe and robust operation for future clinical testing using robotic exoskeletons with autonomous control.”
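Real-time performance here largely comes down to per-frame inference latency. One crude way to gauge it is to time the classifier on synthetic frames, as in the sketch below; this is a generic benchmark, not the team's evaluation method.

```python
# Rough per-frame latency check for a MobileNetV2-sized classifier.
import time
import torch
from torchvision import models

model = models.mobilenet_v2(weights=None)  # same architecture as sketched above
model.eval()
frame = torch.randn(1, 3, 224, 224)        # one synthetic camera frame

with torch.no_grad():
    start = time.perf_counter()
    for _ in range(100):
        model(frame)
    per_frame_ms = (time.perf_counter() - start) / 100 * 1000

print(f"Average inference latency: {per_frame_ms:.1f} ms per frame")
```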

[Image: wearable robot exoskeleton in use. Credit: University of Waterloo]

Should all go to plan, it hopefully won’t be too long before such algorithms can be deployed in commercially available exosuits. These devices are already becoming more widespread, thanks to innovative companies like Sarcos Robotics, and are being used in ever more varied settings. They can also greatly enhance human capabilities beyond what the wearer could manage without the suit.

In some ways, it’s highly reminiscent of the original conception of the cyborg, not as some nightmarish Darth Vader or RoboCop amalgamation of half-human and half-machine, but, as researchers Manfred Clynes and Nathan Kline wrote in the 1960s, as “an organizational system in which … robot-like problems [are] taken care of automatically, leaving [humans] free to explore, to create, to think, and to feel.” Shorn of its faintly hippy vibes (this was the ’60s), the idea still stands: By letting robots autonomously take care of the mundane problems associated with navigation, the human users can focus on more important, engaging things. After all, most people don’t have to consciously think about the minutiae of moving one foot in front of the other when they walk. Why should someone in a robot exosuit have to do so?

The latest paper dedicated to this research was recently published in the journal IEEE Transactions on Medical Robotics and Bionics.

Luke Dormehl
Former Digital Trends Contributor