Abstract
Recent studies in embodied cognitive science have shown that more complex and nontrivial behaviors can emerge from quite simple designs when the designer properly takes the dynamics of the system-environment interaction into account. In this paper, we report tentative classification experiments on several objects using the human-like autonomous robot "WAMOEBA-2Ri". In modeling the environment, we focus not only on its static aspects but also on its dynamic aspects, including those of the system itself. The visualized results of these experiments show that integrating the multimodal sensor data acquired through system-environment interaction ("grasping") enables robust categorization of the objects. Finally, in the discussion, we demonstrate a possible application in which, by extending this approach, "invariance in motion" emerges as a consequence.
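As a rough illustration of the kind of multimodal integration the abstract describes, the sketch below concatenates per-grasp feature vectors from several sensory channels and clusters them so that grasps of the same object group together. All feature names, dimensions, the synthetic data, and the k-means step are assumptions for illustration only; the abstract does not specify the paper's actual sensor features or its categorization and visualization method.

```python
# Hypothetical sketch: per-grasp multimodal feature vectors (names and
# dimensions are illustrative, not taken from the paper) are concatenated,
# normalized, and clustered to group objects by their interaction signature.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

def grasp_features(stiffness, hue):
    """Synthetic stand-ins for one grasp: visual, tactile, and proprioceptive cues."""
    vision = hue + 0.05 * rng.standard_normal(8)               # e.g. a coarse color histogram
    tactile = stiffness + 0.1 * rng.standard_normal(4)         # e.g. a grip-force profile
    proprio = 1.0 / stiffness + 0.1 * rng.standard_normal(4)   # e.g. finger aperture at rest
    return np.concatenate([vision, tactile, proprio])

# Simulate repeated grasps of three object types (soft/red, hard/green, hard/blue).
prototypes = [(0.5, 0.2), (2.0, 0.5), (2.0, 0.8)]
X = np.vstack([grasp_features(s, h) for s, h in prototypes for _ in range(30)])

# Normalize each feature so no single modality dominates, then cluster.
Z = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
print(labels.reshape(3, 30))  # grasps of the same object should fall in one cluster
```

The point of the sketch is only that combining channels which individually overlap (color alone, or stiffness alone) can yield a cleaner separation of objects than any single modality.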
| Original language | English |
| --- | --- |
| Pages (from-to) | 3565-3570 |
| Number of pages | 6 |
| Journal | Proceedings - IEEE International Conference on Robotics and Automation |
| Volume | 3 |
| Publication status | Published - 2003 Dec 9 |
| Event | 2003 IEEE International Conference on Robotics and Automation - Taipei, Taiwan, Province of China |
| Duration | 2003 Sept 14 → 2003 Sept 19 |
ASJC Scopus subject areas
- Software
- Control and Systems Engineering
- Artificial Intelligence
- Electrical and Electronic Engineering