“A lot of jobs are difficult to do remotely, particularly in manufacturing and industry,” Jeffrey Lipton, one of the postdoctoral researchers who worked on the project, told Digital Trends. “A system like this could one day allow humans to supervise robots from a distance. This would enable employees to work from home, and could even open up manufacturing jobs to people with physical limitations, [such as those] who can’t lift heavy or bulky objects. Many industrial jobs are also terrible for human health — imagine servicing the inside of an airplane or working out on an oil rig. They can be dangerous, cramped and uncomfortable, but right now they need a human mind to understand, make decisions, and do movements. We think this model of teleoperation could allow us to keep humans safe and away from these sites while leveraging human mental capabilities.”
MIT’s system places the user inside a virtual reality control room with multiple sensor displays, letting the operator see everything the robot sees at any given moment. To execute tasks, the human makes gestures, captured by handheld controllers, which the robot mirrors. The controls available to the user are rendered virtually rather than existing as physical hardware, which allows them to change depending on what the robot needs to carry out at any given time.
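At its core, that mirroring comes down to a coordinate-frame transform: a controller pose tracked in the VR room’s frame gets mapped into the robot’s base frame and used as an end-effector target. The short Python sketch below illustrates that one step under stated assumptions; the transform values, frame names, and the simulated controller reading are illustrative placeholders, not the CSAIL team’s actual code or calibration.

```python
import numpy as np

# Minimal sketch of the mirroring step, assuming it reduces to a
# rigid-body frame transform: a controller pose captured in the VR
# room's frame is mapped into the robot's base frame and treated as
# an end-effector target. Values below are illustrative, not calibrated.

# Homogeneous transform from the VR room frame to the robot base frame,
# assumed known from a one-time calibration (hypothetical numbers).
ROOM_TO_ROBOT = np.array([
    [0.0, -1.0, 0.0, 0.6],   # rotate 90 degrees about z, offset 0.6 m in x
    [1.0,  0.0, 0.0, 0.0],
    [0.0,  0.0, 1.0, 0.2],   # raise 0.2 m to the robot's base height
    [0.0,  0.0, 0.0, 1.0],
])

def mirror(controller_pose_room: np.ndarray) -> np.ndarray:
    """Map a 4x4 controller pose in the room frame to a robot-frame target."""
    return ROOM_TO_ROBOT @ controller_pose_room

if __name__ == "__main__":
    # Simulated controller reading: identity orientation, hand held
    # 0.4 m in front of the operator at chest height (1.2 m up).
    pose = np.eye(4)
    pose[:3, 3] = [0.4, 0.0, 1.2]
    print(mirror(pose))  # end-effector target pose in the robot's frame
```

A real teleoperation loop would run this mapping continuously against live tracking data and feed the result to a motion controller, but the frame change shown here is the piece that lets a gesture made in the virtual control room reappear as robot motion on the factory floor.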
“We hope to extend this work to many different robots and scale up the trials to tasks beyond assembly,” Lipton continued, describing future research the CSAIL scientists hope to carry out. For more on the Baxter project, you can check out a research paper published earlier this year, titled “Baxter’s Homunculus: Virtual Reality Spaces for Teleoperation in Manufacturing.”