Robust modeling of dynamic environment based on robot embodiment

Kuniaki Noda*, Mototaka Suzuki, Naofumi Tsuchiya, Yuki Suga, Tetsuya Ogata, Shigeki Sugano

*Corresponding author for this work

Research output: Conference article, peer-reviewed

6 Citations (Scopus)

Abstract

Recent studies in embodied cognitive science have shown that more complex and nontrivial behaviors can emerge from quite simple designs if the designer properly takes the dynamics of the system-environment interaction into account. In this paper, we report tentative classification experiments on several objects using the human-like autonomous robot "WAMOEBA-2Ri". In modeling the environment, we focus not only on its static aspects but also on its dynamic aspects, including those of the system itself. The visualized results of these experiments show that integrating the multimodal sensor dataset acquired through system-environment interaction ("grasping") enables robust categorization of several objects. Finally, in the discussion, we demonstrate a possible application in which "invariance in motion" emerges as a consequence of extending this approach.

Original language: English
Pages (from-to): 3565-3570
Number of pages: 6
Journal: Proceedings - IEEE International Conference on Robotics and Automation
Volume: 3
Publication status: Published - 2003
Event: 2003 IEEE International Conference on Robotics and Automation - Taipei, Taiwan, Province of China
Duration: 2003 Sept 14 - 2003 Sept 19

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

