TY - JOUR
T1 - End-to-End Tactile Feedback Loop
T2 - From Soft Sensor Skin over Deep GRU-Autoencoders to Tactile Stimulation
AU - Geier, Andreas
AU - Tucker, Rawleigh
AU - Somlor, Sophon
AU - Sawada, Hideyuki
AU - Sugano, Shigeki
N1 - Funding Information:
Manuscript received February 24, 2020; accepted July 14, 2020. Date of publication July 29, 2020; date of current version August 14, 2020. This letter was recommended for publication by Associate Editor Prof. Ki-Uk Kyung and Editor Allison M. Okamura upon evaluation of the Reviewers' comments. This work was supported in part by JSPS Grants-in-Aid No. 19H01130 and No. 19K14948, the Tateishi Science and Technology Foundation Research Grant (S), and in part by the Research Institute for Science and Engineering of Waseda University. (Corresponding author: Andreas Geier.) Andreas Geier is with the Faculty of Science and Engineering, Department of Modern Mechanical Engineering, Waseda University, 169-8050 Tokyo, Japan, and also with the Rostock Medical Center, Department of Orthopaedics, University Hospital of Rostock, 18057 Rostock, Germany (e-mail: a_geier@sugano.mech.waseda.ac.jp).
Publisher Copyright:
© 2020 IEEE.
PY - 2020/10
Y1 - 2020/10
N2 - Tactile feedback is a key sensory channel that contributes to our ability to perform precise manipulations. In this regard, sensor skin provides robots with the sense of touch, making them increasingly capable of dexterous object manipulation. However, in applications like teleoperation, the complex sensory input of an infinite number of different textures must be projected to the human user's skin in a meaningful manner. In addressing this issue, a deep gated recurrent unit-based autoencoder (GRU-AE) that captured the perceptual dimensions of tactile textures in latent space was deployed to implicitly understand unseen textures. The expression of unknown textures in this latent space allowed for the definition of a control law to effectively drive tactile displays and to convey tactile feedback in a psychophysically meaningful manner. The approach was experimentally verified by evaluating the prediction performance of the GRU-AE on seen and unseen data gathered during active tactile exploration of objects commonly encountered in daily living. A user study on a custom-made tactile display was conducted in which real tactile perceptions in response to active tactile object exploration were compared to the emulated tactile feedback produced by the proposed tactile feedback loop. The results suggest that the deep GRU-AE for tactile display control offers an effective and intuitive method for efficient end-to-end tactile feedback during active tactile texture exploration.
AB - Tactile feedback is a key sensory channel that contributes to our ability to perform precise manipulations. In this regard, sensor skin provides robots with the sense of touch, making them increasingly capable of dexterous object manipulation. However, in applications like teleoperation, the complex sensory input of an infinite number of different textures must be projected to the human user's skin in a meaningful manner. In addressing this issue, a deep gated recurrent unit-based autoencoder (GRU-AE) that captured the perceptual dimensions of tactile textures in latent space was deployed to implicitly understand unseen textures. The expression of unknown textures in this latent space allowed for the definition of a control law to effectively drive tactile displays and to convey tactile feedback in a psychophysically meaningful manner. The approach was experimentally verified by evaluating the prediction performance of the GRU-AE on seen and unseen data gathered during active tactile exploration of objects commonly encountered in daily living. A user study on a custom-made tactile display was conducted in which real tactile perceptions in response to active tactile object exploration were compared to the emulated tactile feedback produced by the proposed tactile feedback loop. The results suggest that the deep GRU-AE for tactile display control offers an effective and intuitive method for efficient end-to-end tactile feedback during active tactile texture exploration.
KW - AI-based methods
KW - Haptics and haptic interfaces
KW - soft sensors and actuators
UR - http://www.scopus.com/inward/record.url?scp=85089874372&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85089874372&partnerID=8YFLogxK
U2 - 10.1109/LRA.2020.3012951
DO - 10.1109/LRA.2020.3012951
M3 - Article
AN - SCOPUS:85089874372
SN - 2377-3766
VL - 5
SP - 6467
EP - 6474
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 4
M1 - 9152113
ER -