TY - JOUR
T1 - Automatic estimation of the position and orientation of the drill to be grasped and manipulated by the disaster response robot based on analyzing depth camera information
AU - Nishikawa, Keishi
AU - Ohya, Jun
AU - Ogata, Hiroyuki
AU - Hashimoto, Kenji
AU - Matsuzawa, Takashi
AU - Imai, Asaki
AU - Kimura, Shunsuke
AU - Takanishi, Atsuo
N1 - Funding Information:
The authors of this paper acknowledge the support of Toshiki Kurosawa, Kazuya Miyakawa and Kanaki Nakao of Waseda University.
Publisher Copyright:
© 2019, Society for Imaging Science and Technology.
PY - 2019/1/13
Y1 - 2019/1/13
N2 - Towards realizing a disaster response robot that can locate and manipulate a drill placed at an arbitrary position with an arbitrary posture at disaster sites, this paper proposes a method that estimates the position and orientation of the drill to be grasped and manipulated by the robot arm, utilizing information acquired by a depth camera. In this paper's algorithm, first, using a conventional method, the target drill is detected on the basis of an RGB image captured by the depth camera, and 3D point cloud data representing the target is generated by combining the detection results with the depth image. Second, using our proposed method, the generated point cloud data is processed to estimate the proper position and orientation for grasping the drill. More specifically, a pass-through filter is applied to the 3D point cloud data obtained in the first step. Then, the point cloud is divided, and features are classified so that the chuck and handle are identified. The grasping position is obtained by computing the centroid of the chuck's point cloud, and the grasping orientation is obtained by applying Principal Component Analysis. Experiments were conducted on a simulator. The results show that our method can accurately estimate the proper configuration for autonomously grasping a normal-type drill.
AB - Towards realizing a disaster response robot that can locate and manipulate a drill placed at an arbitrary position with an arbitrary posture at disaster sites, this paper proposes a method that estimates the position and orientation of the drill to be grasped and manipulated by the robot arm, utilizing information acquired by a depth camera. In this paper's algorithm, first, using a conventional method, the target drill is detected on the basis of an RGB image captured by the depth camera, and 3D point cloud data representing the target is generated by combining the detection results with the depth image. Second, using our proposed method, the generated point cloud data is processed to estimate the proper position and orientation for grasping the drill. More specifically, a pass-through filter is applied to the 3D point cloud data obtained in the first step. Then, the point cloud is divided, and features are classified so that the chuck and handle are identified. The grasping position is obtained by computing the centroid of the chuck's point cloud, and the grasping orientation is obtained by applying Principal Component Analysis. Experiments were conducted on a simulator. The results show that our method can accurately estimate the proper configuration for autonomously grasping a normal-type drill.
UR - http://www.scopus.com/inward/record.url?scp=85080859060&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85080859060&partnerID=8YFLogxK
U2 - 10.2352/ISSN.2470-1173.2019.7.IRIACV-452
DO - 10.2352/ISSN.2470-1173.2019.7.IRIACV-452
M3 - Conference article
AN - SCOPUS:85080859060
SN - 2470-1173
VL - 2019
JO - IS&T International Symposium on Electronic Imaging Science and Technology
JF - IS&T International Symposium on Electronic Imaging Science and Technology
IS - 7
M1 - 452
T2 - 2019 Intelligent Robotics and Industrial Applications Using Computer Vision Conference, IRIACV 2019
Y2 - 13 January 2019 through 17 January 2019
ER -