
Brain-controlled third arm lets you take your multitasking to the next level


For whatever reason, some seriously smart folks in the tech community seem to be obsessed with adding extra appendages to the human body — and they’re getting more ambitious all the time. First, it was the functioning 3D-printed extra thumb prosthesis, made by a graduate student at London’s Royal College of Art. Then, it was the robotic Double Hand, dreamed up by augmented-human startup YouBionic. Now, researchers from the Advanced Telecommunications Research Institute in Kyoto, Japan, are taking the next logical step with a robotic third arm that lets its wearers take their multitasking to warp speed. Oh, and did we mention that it’s mind-controlled, too?


“Instead of a robot arm system, I would call it a [brain-machine interface] (BMI) system for multitasking,” Christian Penaloza, a researcher on the project, told Digital Trends. “Traditional BMI systems are used mostly to recover or replace a lost function of a person with a disability, but not to enhance the capabilities of healthy users. Common BMI systems require the user to concentrate on a particular task, such as controlling a robot arm or wheelchair, while the body stays still. That means that the user can only do a single task. Due to the current limitations of BMI systems, it is more convenient for healthy users to use their own bodies instead.”

Image credit: Hiroshi Ishiguro Laboratory, ATR

What the researchers at the Advanced Telecommunications Research Institute have developed is a brain-machine interface built for multitasking. The robotic arm used in their demonstration is controlled via two electrodes attached to the user’s head to capture brain activity. Without demanding the person’s full attention, the system can decode their intention and direct the arm to perform simple actions, such as grasping a bottle. That means users can engage in one task while simultaneously carrying out a second, hands-free task.
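The team hasn’t released its decoding code, but the basic shape of such a pipeline is well established in BMI research. As a rough illustration, here is a minimal Python sketch of a two-electrode loop: band-pass filter the EEG into the motor-related mu/beta band, extract band-power features, and have a small classifier decide between “grasp” and “rest” before signaling the arm. Everything here (the 250 Hz sampling rate, the confidence threshold, the placeholder actuator call) is an assumption for illustration, not the ATR system.

# Hypothetical two-electrode BMI pipeline sketch -- not the ATR team's code.
# Assumes EEG arrives as (n_samples, 2) arrays at 250 Hz; the classifier is
# trained offline on labeled "intend to grasp" vs. "rest" windows.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.linear_model import LogisticRegression

FS = 250            # sampling rate in Hz (assumption)
WINDOW = FS         # classify one-second windows

def bandpass(x, lo=8.0, hi=30.0):
    """Keep the mu/beta band, where motor-related activity shows up."""
    b, a = butter(4, [lo / (FS / 2), hi / (FS / 2)], btype="band")
    return filtfilt(b, a, x, axis=0)

def features(window):
    """Log band-power per electrode: a classic, tiny feature vector."""
    filtered = bandpass(window)
    return np.log(np.var(filtered, axis=0))

# --- offline training on previously recorded, labeled windows ---
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, WINDOW, 2))   # stand-in for real EEG
y_train = rng.integers(0, 2, size=200)        # 1 = "grasp", 0 = "rest"
clf = LogisticRegression().fit(
    np.array([features(w) for w in X_train]), y_train
)

# --- online loop: decode intent without demanding full attention ---
def decode(window, threshold=0.8):
    """Fire only when the classifier is confident, so the arm doesn't
    act on stray activity while the user is busy with another task."""
    p_grasp = clf.predict_proba(features(window).reshape(1, -1))[0, 1]
    return p_grasp > threshold

new_window = rng.normal(size=(WINDOW, 2))
if decode(new_window):
    print("send grasp command to robotic arm")   # placeholder actuator call

The confidence threshold matters for exactly the multitasking point Penaloza raises: the arm should only act on a clear, sustained intention, so that ordinary brain activity from the user’s other task doesn’t trigger spurious grasps.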

“In our experiments, we used a human-like robot arm for participants to grasp a bottle, while they did a different task [of] balancing a ball,” Penaloza continued. “[In terms of real-world applications] we could think of future use cases for this particular system, such as future construction or manufacturing workers who can use a third arm to increase their productivity, or even astronauts in space. However, the applications do not have to be limited to a robotic arm. Perhaps in the future, we could use the system to control other devices — household devices, cell phones, or machinery — while we do another task.”

A paper describing the work was recently published in the journal Science Robotics.

Luke Dormehl
Former Digital Trends Contributor