TY - GEN
T1 - Dynamic motion generation by flexible-joint robot based on deep learning using images
AU - Wu, Yuheng
AU - Takahashi, Kuniyuki
AU - Yamada, Hiroki
AU - Kim, Kitae
AU - Murata, Shingo
AU - Sugano, Shigeki
AU - Ogata, Tetsuya
N1 - Funding Information:
The work was supported by JSPS KAKENHI Grant Numbers 15J12683, 15H01710, 2522005; by the Program for Leading Graduate Schools, "Graduate Program for Embodiment Informatics" of the Ministry of Education, Culture, Sports, Science and Technology; and by "Fundamental Study for Intelligent Machine to Coexist with Nature," Research Institute for Science and Engineering, Waseda University.
Publisher Copyright:
© 2018 IEEE.
PY - 2018/9
Y1 - 2018/9
N2 - Robots with flexible joints have recently attracted attention from researchers because such robots can passively adapt to environmental changes and realize dynamic motion that exploits inertia. In previous research, body-model acquisition using deep learning was proposed and dynamic motion learning was achieved. However, using the end-effector position as the visual feedback signal for training limits what the robot can know to the relation between the task and itself, rather than the relation between the environment and itself. In this research, we propose using images as the feedback signal so that the robot can grasp the overall situation within the task environment. Motion learning is performed via deep learning on raw image data. In an experiment, we had the robot perform the task motions once to acquire motor and image data. We then used a convolutional auto-encoder to extract image features from the raw image data, and these features were combined with the motor data to train a recurrent neural network. As a result, motion learning through deep learning from image data allowed the robot to acquire environmental information and perform tasks that require consideration of environmental changes, taking advantage of its capacity for passive adaptation.
AB - Robots with flexible joints have recently attracted attention from researchers because such robots can passively adapt to environmental changes and realize dynamic motion that exploits inertia. In previous research, body-model acquisition using deep learning was proposed and dynamic motion learning was achieved. However, using the end-effector position as the visual feedback signal for training limits what the robot can know to the relation between the task and itself, rather than the relation between the environment and itself. In this research, we propose using images as the feedback signal so that the robot can grasp the overall situation within the task environment. Motion learning is performed via deep learning on raw image data. In an experiment, we had the robot perform the task motions once to acquire motor and image data. We then used a convolutional auto-encoder to extract image features from the raw image data, and these features were combined with the motor data to train a recurrent neural network. As a result, motion learning through deep learning from image data allowed the robot to acquire environmental information and perform tasks that require consideration of environmental changes, taking advantage of its capacity for passive adaptation.
UR - http://www.scopus.com/inward/record.url?scp=85070380587&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85070380587&partnerID=8YFLogxK
U2 - 10.1109/DEVLRN.2018.8761020
DO - 10.1109/DEVLRN.2018.8761020
M3 - Conference contribution
AN - SCOPUS:85070380587
T3 - 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
SP - 169
EP - 174
BT - 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 Joint IEEE 8th International Conference on Development and Learning and Epigenetic Robotics, ICDL-EpiRob 2018
Y2 - 16 September 2018 through 20 September 2018
ER -