In brief

The iCub robot collects tactile data with its forearm and can distinguish between 23 different textures.

© 2019 A*STAR Institute for Infocomm Research

Giving robots the human touch

25 Oct 2019

Researchers at A*STAR have designed a hybrid machine learning model to enhance touch capabilities in robots.

Search and rescue operations rely primarily on humans and trained animals. However, these rescuers are often hampered by hazardous conditions such as unstable structures, poor lighting or bad weather. This can cost valuable time, with consequences for the success of rescue efforts.

One way to overcome such limitations is to use robots instead. In such applications, a robot’s ability to distinguish textures—for instance, between human skin, fabric, concrete and metal—is critical.

Scientists at the Institute for Infocomm Research (I2R), in collaboration with the National University of Singapore and Nanyang Technological University, Singapore, have developed an approach to equip robots with sensitive touch capabilities. They presented their findings in a paper at the 2019 International Conference on Robotics and Automation (ICRA).

Yan Wu, a Research Scientist at I2R and senior author on the paper, explained that their system mimics the way humans distinguish different textures by touch. “Touch is performed in a two-stage fashion,” said Wu. In the first stage, contact yields an initial conjecture about the coarse properties of a surface. This is followed by a sliding stage, during which finer details are sensed by gently rubbing the surface. Temporal signals also play an important role in this sensing, he added.

Robots must then learn to associate certain tactile signals with particular types of surfaces. To facilitate this tactile learning, the group applied a hybrid machine learning approach, combining convolutional neural networks (CNNs) with long short-term memory (LSTM) networks.

In essence, when their robot touches a surface, a tactile map loosely equivalent to a camera image is generated, along with time-sequence information. The CNN analyzes the ‘image’ data while the LSTM looks for patterns in the temporal data. Together, the two allow the robot to classify each tactile signal into one of 23 textures.
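To make this division of labor concrete, here is a minimal sketch, in PyTorch, of a CNN-LSTM classifier of the kind described above. The taxel grid size, sequence length and layer dimensions are illustrative assumptions, not the architecture reported in the paper.

```python
# Hedged sketch of a CNN-LSTM texture classifier. All shapes and layer
# sizes are assumptions for illustration, not the authors' exact network.
import torch
import torch.nn as nn

class TactileCNNLSTM(nn.Module):
    def __init__(self, num_textures=23, hidden_size=64):
        super().__init__()
        # CNN: extracts spatial features from each tactile "image" (one taxel frame)
        self.cnn = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d((2, 2)),
            nn.Flatten(),  # -> 32 * 2 * 2 = 128 features per frame
        )
        # LSTM: models how those spatial features evolve during the sliding motion
        self.lstm = nn.LSTM(input_size=128, hidden_size=hidden_size, batch_first=True)
        self.classifier = nn.Linear(hidden_size, num_textures)

    def forward(self, x):
        # x: (batch, time, 1, height, width) -- a sequence of tactile frames
        b, t = x.shape[:2]
        feats = self.cnn(x.reshape(b * t, *x.shape[2:]))  # per-frame spatial features
        _, (h_n, _) = self.lstm(feats.reshape(b, t, -1))  # summarize the time sequence
        return self.classifier(h_n[-1])  # logits over the 23 texture classes

# Example: a batch of 4 recordings, each 50 frames from a hypothetical 6x10 taxel array
model = TactileCNNLSTM()
logits = model(torch.randn(4, 50, 1, 6, 10))
print(logits.shape)  # torch.Size([4, 23])
```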

The researchers showed that their CNN-LSTM architecture outperformed prior state-of-the-art machine learning techniques by as much as 10 percent in terms of texture classification accuracy.

Wu’s group is now looking into improving this approach by expanding the tactile dataset to include more material surfaces. “This will allow us to improve the robustness of the architecture and help build an open database for the research community to work on common problems and benchmark their solutions against ours,” he said.

The A*STAR-affiliated researchers contributing to this research are from the Institute for Infocomm Research (I2R).


References

Taunyazov, T., Koh, H. F., Wu, Y., Cai, C. & Soh, H. Towards Effective Tactile Identification of Textures using a Hybrid Touch Approach. 2019 International Conference on Robotics and Automation (ICRA).

About the Researcher


Yan Wu

Assistant Head of Robotics and Autonomous Systems Department

Institute for Infocomm Research

Yan Wu received his PhD degree in electrical engineering from Imperial College London in 2013. Between August 2012 and November 2013, he worked at the Institute of Child Health, University College London, UK, as an Honorary Research Associate, and at the NHS Great Ormond Street Hospital for Children as a Research Fellow. He joined A*STAR’s Institute for Infocomm Research (I2R) in December 2013, where he is currently a lead scientist investigating robot dexterity. He is also the Assistant Head of the Robotics and Autonomous Systems Department at I2R. A Senior Member of the IEEE and a member of the Neuro-robotics Systems Technical Committee of the IEEE Robotics and Automation Society, Wu is interested in assistive robotics and technologies, multimodal robot learning, and human-robot interaction.

This article was made for A*STAR Research by Wildtype Media Group