TY - JOUR
T1 - Utilization of Image/Force/Tactile Sensor Data for Object-Shape-Oriented Manipulation
T2 - Wiping Objects with Turning Back Motions and Occlusion
AU - Saito, Namiko
AU - Shimizu, Takumi
AU - Ogata, Tetsuya
AU - Sugano, Shigeki
N1 - Publisher Copyright:
© 2021 IEEE.
PY - 2022/4/1
Y1 - 2022/4/1
N2 - There has been an increasing demand for housework robots that can handle various objects. It is, however, difficult to achieve object-shape-oriented tasks with conventional approaches because they require dealing with multiple surfaces, invisible areas, and occlusion; robots must perceive shapes and adjust their movements even when the shapes cannot be seen directly. Humans usually tackle such problems by integrating several types of sensory information. Inspired by this human perception mechanism, in this study we consider the effective utilization of image/force/tactile data in constructing a multimodal deep neural network (DNN) model for object-shape perception and motion generation. As an example task, we have a robot wipe around the outside of objects that imitate light shades. The wiping motions include moments when the robot's hand must move away from the surface as well as the turns required to wipe the next surface, even though some parts of the surfaces, such as the backside or parts occluded by the robot's arm, cannot be seen directly. If the DNN model uses continuous visual information, it is adversely affected by the occluded images. Hence, the best-performing DNN model is one that uses an image from the initial time step to approximately perceive the shape and size, and then generates motions by integrating this perception with tactile and force sensing. We conclude that an effective approach to object-shape-oriented manipulation is to initially utilize image data to outline the target shape and, thereafter, to use force and tactile data to understand concrete features while performing the task.
AB - There has been an increasing demand for housework robots that can handle various objects. It is, however, difficult to achieve object-shape-oriented tasks with conventional approaches because they require dealing with multiple surfaces, invisible areas, and occlusion; robots must perceive shapes and adjust their movements even when the shapes cannot be seen directly. Humans usually tackle such problems by integrating several types of sensory information. Inspired by this human perception mechanism, in this study we consider the effective utilization of image/force/tactile data in constructing a multimodal deep neural network (DNN) model for object-shape perception and motion generation. As an example task, we have a robot wipe around the outside of objects that imitate light shades. The wiping motions include moments when the robot's hand must move away from the surface as well as the turns required to wipe the next surface, even though some parts of the surfaces, such as the backside or parts occluded by the robot's arm, cannot be seen directly. If the DNN model uses continuous visual information, it is adversely affected by the occluded images. Hence, the best-performing DNN model is one that uses an image from the initial time step to approximately perceive the shape and size, and then generates motions by integrating this perception with tactile and force sensing. We conclude that an effective approach to object-shape-oriented manipulation is to initially utilize image data to outline the target shape and, thereafter, to use force and tactile data to understand concrete features while performing the task.
KW - Deep learning in grasping and manipulation
KW - perception for grasping and manipulation
KW - recognition
KW - sensorimotor learning
UR - http://www.scopus.com/inward/record.url?scp=85122058590&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85122058590&partnerID=8YFLogxK
U2 - 10.1109/LRA.2021.3136657
DO - 10.1109/LRA.2021.3136657
M3 - Article
AN - SCOPUS:85122058590
SN - 2377-3766
VL - 7
SP - 968
EP - 975
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 2
ER -