“In this work, we developed a soft gripper that uses tactile sensing to model the objects it’s interacting with,” Michael Tolley, a roboticist at UC San Diego, told Digital Trends. “By rotating the object around in-hand, similar to what you would do when you reach into your pocket and feel for your keys, the gripper can map out a point cloud representing the object. Our gripper is unique in its ability to twist, sense, and model objects, allowing the gripper to operate in low light, low visibility, and uncertain conditions.”
The robot hand has three soft, flexible fingers that are actuated pneumatically. Sensors embedded in the robot's "skin" detect what it is holding and transmit that data to a control board, which builds a three-dimensional model of the object for reference.
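To make the idea concrete, here is a minimal sketch of how tactile contact readings could be turned into a point cloud as the object is rotated in-hand. The sensor format, finger spacing, and all values are assumptions for illustration, not details of the UC San Diego gripper.

```python
import math

# Hypothetical tactile readings: for each in-hand rotation angle (radians),
# each of the three fingers reports the radial distance (mm) from the
# rotation axis to its contact point, plus the contact height (mm).
# These names and numbers are illustrative only.
readings = [
    (0.0,          [(14.9, 3.0), (15.1, 9.0), (15.0, 15.0)]),
    (math.pi / 3,  [(15.2, 3.0), (14.8, 9.0), (15.1, 15.0)]),
    (2 * math.pi / 3, [(15.0, 3.0), (15.0, 9.0), (14.9, 15.0)]),
]

# Assume the three fingers are spaced 120 degrees apart around the object.
FINGER_OFFSETS = [0.0, 2 * math.pi / 3, 4 * math.pi / 3]

def build_point_cloud(readings):
    """Convert (rotation angle, per-finger radius/height) readings into
    Cartesian (x, y, z) contact points on the object's surface."""
    cloud = []
    for rotation, fingers in readings:
        for offset, (radius, height) in zip(FINGER_OFFSETS, fingers):
            theta = rotation + offset  # total angle around the object
            cloud.append((radius * math.cos(theta),
                          radius * math.sin(theta),
                          height))
    return cloud

points = build_point_cloud(readings)
print(len(points))  # 3 rotation steps x 3 fingers = 9 contact points
```

Each in-hand twist adds another ring of contact points, so the cloud densifies with every rotation step, which is why the gripper can model objects without any camera input.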
When the gripper was attached to an industrial Fetch Robotics robot arm for testing, the researchers demonstrated that it could carry out a range of precision tasks, such as the aforementioned screwing in of lightbulbs, along with turning screwdrivers and holding individual sheets of paper.
“[As the next stage of research,] we are interested in incorporating techniques from machine learning to allow the gripper to semantically identify the objects it’s manipulating,” Tolley said. “We are also prototyping a 3D-printed version of the gripper for more consistent fabrication. We would [additionally] like to test with a wider variety of objects that also contain uncertainty in their positioning and orientation.”
In the real world, Tolley says he hopes a robot gripper such as this might be useful for tasks like fruit picking, or potentially working as an assistive robot in the home. To reach this point, it will need to be further put through its paces with a more extensive set of real-world objects.