The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) has produced some remarkable advances in robotics recently, from origami robots that transform themselves to artificial intelligence that can sense people through walls. Its newest project lets you control a robot just by watching it and correct its mistakes with a simple hand gesture.
The team demonstrated the results of their research with a short video showing a human supervising a robot as it drills holes in a piece of wood. The interface works on people the robot has never encountered before, so no user training is required.
The brain sensors can quickly detect when a person notices that the robot is about to make a mistake. The person can then use a hand gesture to show the robot the correct action to perform. CSAIL Director Daniela Rus said the two sensors working in tandem enabled an almost instantaneous response.
“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” Rus said. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”
Controlling a robot with your brain often requires you to learn how to “think” in a certain way so the sensors can interpret the commands correctly. That may be manageable in a controlled laboratory environment with a trained operator, but you can imagine how difficult it would be on a noisy construction site, for example.
Everyone’s familiar with the “uh-oh” moment you get when you realize something is about to go haywire. For this project, the brain-wave scanners could quickly detect signals known as “error-related potentials” (ErrPs), which occur naturally when people notice mistakes. Detecting an ErrP causes the robot to pause so the human operator can gesture the correct action if needed.
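Conceptually, the supervision loop is straightforward: the robot carries out its planned action, an EEG classifier watches for an ErrP, and when one appears the robot pauses and waits for an EMG-detected gesture to redirect it. The sketch below is only an illustration of that flow under stated assumptions; the class names, threshold, and gesture vocabulary are placeholders, not the CSAIL team's actual implementation.

```python
"""Illustrative sketch of an EEG/EMG-supervised control loop.

All names, thresholds, and the gesture vocabulary here are hypothetical
stand-ins for the real signal-processing pipeline described in the article.
"""
import random

ERRP_THRESHOLD = 0.8  # assumed classifier confidence treated as an "uh-oh" signal


class EEGStream:
    """Stand-in for an EEG classifier that scores error-related potentials."""
    def errp_probability(self) -> float:
        return random.random()  # placeholder for a real ErrP classifier output


class EMGStream:
    """Stand-in for an EMG classifier that recognizes left/right arm gestures."""
    def read_gesture(self) -> str:
        return random.choice(["left", "right"])  # placeholder gesture decoding


class Robot:
    """Minimal mock robot that 'drills' named targets."""
    def drill(self, target: str) -> None:
        print(f"drilling at {target}")

    def pause(self) -> None:
        print("pausing: supervisor flagged a likely mistake")


def supervise(robot: Robot, eeg: EEGStream, emg: EMGStream, planned_targets: list[str]) -> None:
    for target in planned_targets:
        # If the EEG stream shows an error-related potential while the robot
        # lines up, pause and let a muscle gesture pick the corrected target.
        if eeg.errp_probability() > ERRP_THRESHOLD:
            robot.pause()
            gesture = emg.read_gesture()
            target = f"{gesture}-hand target"  # redirect based on the gesture
        robot.drill(target)


if __name__ == "__main__":
    supervise(Robot(), EEGStream(), EMGStream(), ["target A", "target B"])
```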
“What’s great about this approach is that there’s no need to train users to think in a prescribed way,” Joseph DelPreto, lead author of a paper on the research, said. “The machine adapts to you, and not the other way around.”
In the study, the Baxter robot chose the correct drill spot 70 percent of the time on its own. With human supervision, that figure rose to 97 percent.
“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” DelPreto said. “This helps make communicating with a robot more like communicating with another person.”