TY - JOUR
T1 - A framework of physically interactive parameter estimation based on active environmental groping for safe disaster response work
AU - Kamezaki, Mitsuhiro
AU - Uehara, Yusuke
AU - Azuma, Kohga
AU - Sugano, Shigeki
N1 - Funding Information:
This research was supported in part by JSPS KAKENHI Grant Number 18KT0063, in part by the Industrial Cluster Promotion Project in Fukushima Prefecture, in part by the Institute for Disaster Response Robotics, Future Robotics Organization, Waseda University, and in part by the Research Institute for Science and Engineering, Waseda University.
Publisher Copyright:
© 2021, The Author(s).
PY - 2021/12
Y1 - 2021/12
AB - Disaster response robots are expected to perform complicated tasks such as traveling over unstable terrain, climbing slippery steps, and removing heavy debris. To complete such tasks safely, the robots must obtain not only visual-perceptual information (VPI), such as surface shape, but also haptic-perceptual information (HPI), such as the surface friction of objects in the environment. VPI can be obtained from laser sensors and cameras. In contrast, HPI can basically be obtained only from the results of physical interaction with the environment, e.g., reaction force and deformation. However, current robots have no function for estimating HPI. In this study, we propose a framework to estimate such physically interactive parameters (PIPs), including hardness, friction, and weight, which are vital parameters for safe robot-environment interaction. For effective estimation, we define the ground groping mode (GGM) and the object groping mode (OGM). The endpoint of the robot arm, which has a force sensor, actively touches, pushes, rubs, and lifts objects in the environment under hybrid position/force control, and the three kinds of PIPs are estimated from the measured reaction force and the displacement of the arm endpoint. The robot finally judges the accident risk based on the estimated PIPs, e.g., safe, attentional, or dangerous. We prepared environments that had the same surface shape but different hardness, friction, and weight. The experimental results indicated that the proposed framework could estimate PIPs adequately and was useful for judging risk and safely planning tasks.
KW - Active environmental touch
KW - Disaster response robot
KW - Groping
KW - Physically interactive parameter
UR - http://www.scopus.com/inward/record.url?scp=85116392055&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85116392055&partnerID=8YFLogxK
U2 - 10.1186/s40648-021-00209-1
DO - 10.1186/s40648-021-00209-1
M3 - Article
AN - SCOPUS:85116392055
SN - 2197-4225
VL - 8
JO - ROBOMECH Journal
JF - ROBOMECH Journal
IS - 1
M1 - 22
ER -