TY - JOUR
T1 - Tactile Object Property Recognition Using Geometrical Graph Edge Features and Multi-Thread Graph Convolutional Network
AU - Kulkarni, Shardul
AU - Funabashi, Satoshi
AU - Schmitz, Alexander
AU - Ogata, Tetsuya
AU - Sugano, Shigeki
N1 - Publisher Copyright:
© 2016 IEEE.
PY - 2024/4/1
Y1 - 2024/4/1
N2 - Performing dexterous tasks with a multi-fingered robotic hand remains challenging. Tactile sensors provide touch states and object features for multi-fingered tasks, yet the variety in shapes, sizes, textures, deformabilities, and masses of everyday objects makes the task conditions diverse. Despite these challenges, humans accomplish such difficult tasks by producing a sensorimotor representation of their body. This combined tactile and proprioceptive representation enables humans to accommodate the diversity of everyday objects. Drawing on this concept, this paper presents a method for object property recognition using Graph Convolutional Networks (GCNs), leveraging robot hand proprioception and morphology with spatial embeddings derived from geometrical graph edge features acquired from real tactile sensor alignments on an Allegro Hand. Additionally, a Multi-Thread GCN (MT-GCN) architecture is introduced to process these edge features together with multi-modal data in a graph. Training data were acquired using a data glove, from tri-axial tactile sensors distributed across the fingertips, finger phalanges, and palm of an Allegro Hand, producing a total of 1152 tactile measurements. The MT-GCN with the proposed edge features, tactile features, and joint angles achieved a high recognition rate of 86.08% for six classes of object property combinations from eight objects. The effect of variation in graph adjacency on the MT-GCN was examined. t-SNE analysis showed that the proposed network formed clusters following the robot hand configuration. Furthermore, analysis of the learned parameters in the edge feature encoder demonstrated its ability to discern joint positions on the hand, acquiring proprioceptive features effectively. Consequently, we confirmed that the proposed method is effective for multi-fingered dexterous tasks.
AB - Performing dexterous tasks with a multi-fingered robotic hand remains challenging. Tactile sensors provide touch states and object features for multi-fingered tasks, yet the variety in shapes, sizes, textures, deformabilities, and masses of everyday objects makes the task conditions diverse. Despite these challenges, humans accomplish such difficult tasks by producing a sensorimotor representation of their body. This combined tactile and proprioceptive representation enables humans to accommodate the diversity of everyday objects. Drawing on this concept, this paper presents a method for object property recognition using Graph Convolutional Networks (GCNs), leveraging robot hand proprioception and morphology with spatial embeddings derived from geometrical graph edge features acquired from real tactile sensor alignments on an Allegro Hand. Additionally, a Multi-Thread GCN (MT-GCN) architecture is introduced to process these edge features together with multi-modal data in a graph. Training data were acquired using a data glove, from tri-axial tactile sensors distributed across the fingertips, finger phalanges, and palm of an Allegro Hand, producing a total of 1152 tactile measurements. The MT-GCN with the proposed edge features, tactile features, and joint angles achieved a high recognition rate of 86.08% for six classes of object property combinations from eight objects. The effect of variation in graph adjacency on the MT-GCN was examined. t-SNE analysis showed that the proposed network formed clusters following the robot hand configuration. Furthermore, analysis of the learned parameters in the edge feature encoder demonstrated its ability to discern joint positions on the hand, acquiring proprioceptive features effectively. Consequently, we confirmed that the proposed method is effective for multi-fingered dexterous tasks.
KW - Deep learning
KW - feature detection
KW - graph neural networks
KW - representation learning
KW - robot sensing systems
KW - tactile sensors
UR - http://www.scopus.com/inward/record.url?scp=85186090738&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85186090738&partnerID=8YFLogxK
U2 - 10.1109/LRA.2024.3367271
DO - 10.1109/LRA.2024.3367271
M3 - Article
AN - SCOPUS:85186090738
SN - 2377-3766
VL - 9
SP - 3894
EP - 3901
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 4
ER -