
MIT researchers develop a robot system controlled by brainwaves

Supervising Robots with Brain and Muscle Signals

The MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) has come up with some amazing advances in robotics recently, from origami robots that transform themselves to artificial intelligence that can sense people through walls. Its newest project lets you supervise a robot just by watching it, correcting its mistakes with a simple hand gesture.


The team demonstrated the results of their research with a short video showing a human supervising a robot drilling holes in a piece of wood. The interface works on people the robot has never encountered before, meaning there’s no training involved.


The brain sensors can quickly detect when a person notices that the robot is about to make a mistake. With a hand movement, the person can then instruct the robot to perform the correct action. CSAIL Director Daniela Rus said the two sensors working in tandem enabled an almost instantaneous response.

“This work combining EEG and EMG feedback enables natural human-robot interactions for a broader set of applications than we’ve been able to do before using only EEG feedback,” Rus said. “By including muscle feedback, we can use gestures to command the robot spatially, with much more nuance and specificity.”

Controlling a robot with your brain often requires you to learn how to “think” in a certain way so the sensors can interpret the commands correctly. That’s manageable in a controlled laboratory environment with a trained operator, but you can imagine how difficult it might be on a noisy construction site, for example.

Everyone’s familiar with the “uh-oh” moment you get when you realize something is about to go haywire. For this project, the brain-wave scanners could quickly detect signals known as “error-related potentials” (ErrPs), which occur naturally when people notice mistakes. An ErrP causes the robot to pause, so the human operator can redirect the operation if needed.
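The pause-and-correct loop described above can be sketched in a few lines. This is an illustrative assumption, not CSAIL’s actual pipeline: the function names, the threshold-based ErrP check, and the sign-based gesture classifier are all placeholders for what would really be trained classifiers running on live EEG and EMG streams.

```python
# Illustrative sketch only: in the real system, detect_errp and
# classify_gesture would be classifiers over EEG and EMG signals.

def detect_errp(eeg_sample, threshold=0.5):
    """Flag an error-related potential when the EEG feature crosses a threshold."""
    return eeg_sample > threshold

def classify_gesture(emg_sample):
    """Map a (simulated) EMG reading to a corrective gesture: left or right."""
    return "left" if emg_sample < 0 else "right"

def supervise(robot_choice, eeg_sample, emg_sample):
    """One supervision step: pause on an ErrP and apply the human's gesture."""
    if detect_errp(eeg_sample):
        # Robot pauses; the human's gesture overrides the planned action.
        return classify_gesture(emg_sample)
    return robot_choice

# The robot planned to drill the "left" target; the observer's EEG flags a
# mistake, and their EMG gesture redirects it to the "right" target.
print(supervise("left", eeg_sample=0.9, emg_sample=0.7))  # -> right
# No ErrP detected, so the robot's own choice stands.
print(supervise("left", eeg_sample=0.1, emg_sample=0.7))  # -> left
```

The key design point the researchers describe is that the EEG channel only needs to answer a binary question (“did the human just notice a mistake?”), while the EMG channel carries the spatial correction.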

“What’s great about this approach is that there’s no need to train users to think in a prescribed way,” Joseph DelPreto, lead author of a paper on the research, said. “The machine adapts to you, and not the other way around.”

In the study, Baxter the robot chose the correct drill spot 70 percent of the time on its own. With human supervision, that number rose to 97 percent.

“By looking at both muscle and brain signals, we can start to pick up on a person’s natural gestures along with their snap decisions about whether something is going wrong,” DelPreto said. “This helps make communicating with a robot more like communicating with another person.”

Mark Austin
Former Digital Trends Contributor