TY - JOUR
T1 - Tactile Transfer Learning and Object Recognition With a Multifingered Hand Using Morphology-Specific Convolutional Neural Networks
AU - Funabashi, Satoshi
AU - Yan, Gang
AU - Fei, Hongyi
AU - Schmitz, Alexander
AU - Jamone, Lorenzo
AU - Ogata, Tetsuya
AU - Sugano, Shigeki
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2024/6/1
Y1 - 2024/6/1
N2 - Multifingered robot hands can be extremely effective in physically exploring and recognizing objects, especially if they are extensively covered with distributed tactile sensors. Convolutional neural networks (CNNs) have been proven successful in processing high dimensional data, such as camera images, and are, therefore, very well suited to analyze distributed tactile information as well. However, a major challenge is to organize tactile inputs coming from different locations on the hand in a coherent structure that could leverage the computational properties of the CNN. Therefore, we introduce a morphology-specific CNN (MS-CNN), in which hierarchical convolutional layers are formed following the physical configuration of the tactile sensors on the robot. We equipped a four-fingered Allegro robot hand with several uSkin tactile sensors; overall, the hand is covered with 240 sensitive elements, each one measuring three-axis contact force. The MS-CNN layers process the tactile data hierarchically: at the level of small local clusters first, then each finger, and then the entire hand. We show experimentally that, after training, the robot hand can successfully recognize objects by a single touch, with a recognition rate of over 95%. Interestingly, the learned MS-CNN representation transfers well to novel tasks: by adding a limited amount of data about new objects, the network can recognize nine types of physical properties.
KW - Convolutional neural network (CNN)
KW - multifingered hand
KW - object recognition
KW - tactile sensing
UR - http://www.scopus.com/inward/record.url?scp=85141555805&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85141555805&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2022.3215723
DO - 10.1109/TNNLS.2022.3215723
M3 - Article
C2 - 36327180
AN - SCOPUS:85141555805
SN - 2162-237X
VL - 35
SP - 7587
EP - 7601
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 6
ER -