Robust modeling of dynamic environment based on robot embodiment

Kuniaki Noda*, Mototaka Suzuki, Naofumi Tsuchiya, Yuki Suga, Tetsuya Ogata, Shigeki Sugano

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

6 Citations (Scopus)


Recent studies in embodied cognitive science have shown that surprisingly complex and nontrivial behaviors can emerge from quite simple designs when the designer properly takes the dynamics of the system-environment interaction into account. In this paper, we report tentative classification experiments on several objects using the human-like autonomous robot "WAMOEBA-2Ri". In modeling the environment, we focus not only on its static aspects but also on its dynamic aspects, including those of the system itself. The visualized results of these experiments show that integrating the multimodal sensor data acquired through system-environment interaction ("grasping") enables robust categorization of the objects. Finally, in the discussion, we demonstrate a possible application in which "invariance in motion" emerges as a consequence of extending this approach.
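The abstract does not specify the paper's actual categorization pipeline, but the general idea of fusing per-modality sensor features and grouping objects can be illustrated with a generic sketch. The function names, the simple concatenation-based fusion, and the use of plain k-means are all assumptions for illustration, not the authors' method:

```python
import numpy as np

def fuse_modalities(*modalities):
    """Concatenate feature vectors from several sensor modalities
    (e.g. vision, tactile, joint torque) into one multimodal vector.
    (Hypothetical fusion scheme, not the paper's.)"""
    return np.concatenate([np.asarray(m, dtype=float).ravel() for m in modalities])

def kmeans_categorize(X, n_categories, n_iter=50, seed=0):
    """Group fused multimodal samples (rows of X) with plain k-means,
    standing in for whatever categorization the paper actually uses."""
    rng = np.random.default_rng(seed)
    X = np.asarray(X, dtype=float)
    # Initialize centers from distinct random samples.
    centers = X[rng.choice(len(X), n_categories, replace=False)].copy()
    for _ in range(n_iter):
        # Assign each sample to its nearest center (squared Euclidean distance).
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d.argmin(axis=1)
        # Move each center to the mean of its assigned samples.
        for k in range(n_categories):
            members = X[labels == k]
            if len(members):
                centers[k] = members.mean(axis=0)
    return labels
```

With feature vectors from two visibly different objects, samples sharing an object end up in the same cluster while samples from different objects separate, which is the "robust categorization" property the abstract refers to.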

Original language: English
Pages (from-to): 3565-3570
Number of pages: 6
Journal: Proceedings - IEEE International Conference on Robotics and Automation
Publication status: Published - 2003 Dec 9
Event: 2003 IEEE International Conference on Robotics and Automation - Taipei, Taiwan, Province of China
Duration: 2003 Sept 14 - 2003 Sept 19

ASJC Scopus subject areas

  • Software
  • Control and Systems Engineering
  • Artificial Intelligence
  • Electrical and Electronic Engineering

