News Release
See, Think, Predict: Engineers build a soft robotics perception system inspired by humans
Photos: David Baillot/Jacobs School/University of California San Diego. Photo gallery: https://www.flickr.com/photos/jsoe/albums/72157705778950921
San Diego, Calif., Jan. 30, 2019 -- An international team of researchers has developed a perception system for soft robots inspired by the way humans process information about their own bodies in space and in relation to other objects and people. They describe the system, which includes a motion capture system, soft sensors, a neural network, and a soft robotic finger, in the Jan. 30 issue of Science Robotics.
The researchers’ ultimate goal is to build a system that can predict a robot’s movements and internal state without relying on external sensors, much like humans do every day. In their Science Robotics paper, they show that they have achieved this goal for a soft robotic finger. The work has applications in human-robot interaction and wearable robotics, as well as soft devices to correct disorders affecting muscles and bones.
The system is meant to mimic the various components humans rely on to navigate their environment: the motion capture system stands in for vision; the neural network for brain functions; the sensors for touch; and the finger for the body interacting with the outside world. The motion capture system is needed only to train the neural network and can be discarded once training is complete.
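The paper itself is not a code release, but the training scheme described above can be illustrated with a minimal sketch, written here in PyTorch. The sensor count, network size, and data below are hypothetical assumptions for illustration, not details taken from the study: the motion capture system supplies ground-truth finger poses that supervise a recurrent network reading only the embedded sensors, and once training ends, predictions come from the sensors alone.

import torch
import torch.nn as nn

# Hypothetical dimensions: 4 redundant strain sensors sampled over 50 time
# steps; a 3-D fingertip position from the motion capture system is the label.
NUM_SENSORS, SEQ_LEN, POSE_DIM = 4, 50, 3

class PerceptionNet(nn.Module):
    """Recurrent model: embedded-sensor time series in, finger pose out."""
    def __init__(self, hidden=64):
        super().__init__()
        self.rnn = nn.LSTM(NUM_SENSORS, hidden, batch_first=True)
        self.head = nn.Linear(hidden, POSE_DIM)

    def forward(self, sensors):                # (batch, SEQ_LEN, NUM_SENSORS)
        out, _ = self.rnn(sensors)
        return self.head(out[:, -1, :])        # pose estimate at the last step

model = PerceptionNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

# Placeholder tensors standing in for logged sensor signals and mocap labels.
sensor_log = torch.randn(32, SEQ_LEN, NUM_SENSORS)
mocap_pose = torch.randn(32, POSE_DIM)

for epoch in range(100):                       # supervised training phase
    optimizer.zero_grad()
    loss = loss_fn(model(sensor_log), mocap_pose)
    loss.backward()
    optimizer.step()

# After training, the motion capture system can be discarded: the model
# predicts the finger's pose from the embedded sensors alone.
with torch.no_grad():
    predicted_pose = model(torch.randn(1, SEQ_LEN, NUM_SENSORS))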
“The advantages of our approach are the ability to predict complex motions and forces that the soft robot experiences (which is difficult with traditional methods) and the fact that it can be applied to multiple types of actuators and sensors,” said Michael Tolley, a professor of mechanical and aerospace engineering at the University of California San Diego and the paper’s senior author. “Our method also includes redundant sensors, which improves the overall robustness of our predictions.”
Researchers embedded soft strain sensors at arbitrary locations within the soft robotic finger, knowing that they would be responsive to a wide variety of motions, and used machine learning techniques to interpret the sensors’ signals. This allowed the team, which includes researchers from the Bioinspired Robotics and Design Lab at UC San Diego, to predict both the forces applied to the finger and its movements. This approach will enable researchers to develop models that can predict the forces and deformations experienced by soft robotic systems as they move.
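As a companion to the training sketch above, the following sketch (again PyTorch, with all names and dimensions as hypothetical assumptions rather than details from the paper) shows the kind of joint prediction this paragraph describes: a recurrent network reads a stream of redundant strain signals and outputs both an estimated tip motion and an estimated contact force, and because the sensors are redundant, losing one channel still leaves usable input.

import torch
import torch.nn as nn

# Hypothetical setup: 4 redundant strain sensors; the network jointly predicts
# a 3-D tip displacement and a scalar contact force from the same signal.
NUM_SENSORS, HIDDEN, SEQ_LEN = 4, 64, 50

class ForceMotionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.LSTM(NUM_SENSORS, HIDDEN, batch_first=True)
        self.motion_head = nn.Linear(HIDDEN, 3)  # predicted tip displacement
        self.force_head = nn.Linear(HIDDEN, 1)   # predicted contact force

    def forward(self, sensors):                  # (batch, SEQ_LEN, NUM_SENSORS)
        out, _ = self.rnn(sensors)
        last = out[:, -1, :]
        return self.motion_head(last), self.force_head(last)

model = ForceMotionNet()
signals = torch.randn(1, SEQ_LEN, NUM_SENSORS)   # placeholder sensor stream
motion, force = model(signals)

# Redundancy in practice: zeroing out one channel (e.g., a failed sensor)
# still leaves the model three informative channels to predict from.
degraded = signals.clone()
degraded[..., 0] = 0.0
motion_d, force_d = model(degraded)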
This is important because the techniques traditionally used in robotics for processing sensor data can’t capture the complex deformations of soft systems. In addition, the information the sensors capture is equally complex. As a result, sensor design, placement and fabrication in soft robots are difficult tasks that could be vastly improved if researchers had access to robust models. This is what the research team is hoping to provide.
Next steps include scaling up the number of sensors to better mimic the dense sensing capabilities of biological skin and closing the loop for feedback control of the actuator.
Soft robot perception using embedded soft sensors and recurrent neural networks
Thomas George Thuruthel and Cecilia Laschi, The BioRobotics Institute, Scuola Superiore Sant'Anna, Pisa, Italy; and Benjamin Shih and Michael Thomas Tolley, Department of Mechanical and Aerospace Engineering, UC San Diego
Overview of the modeling architecture and its parallel to the human perceptive system.
A close-up of the soft robotic finger that provides haptic and movement feedback to the perception system.
Media Contacts
Ioana Patringenaru
Jacobs School of Engineering
858-822-0899
ipatrin@ucsd.edu