Robot oriented state space construction

Hiroshi Ishiguro*, Ritsuko Sato, Toru Ishida

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

31 Citations (Scopus)

Abstract

In most previous work, the state space of a sensor-based robot has been determined based on human intuition; however, a state space constructed from the human viewpoint is not always appropriate for the robot. Since the robot has a different body, different sensors, and different tasks, we consider that the robot should have its own internal state space determined by its actions, sensors, and tasks. This paper proposes an approach to constructing such a robot-oriented state space by statistically analyzing the actions, sensor patterns, and rewards given as results of task executions. In the state space construction, the robot creates sensor pattern classifiers called Empirically Obtained Perceivers (EOPs), the combinations of which represent the internal states of the robot. We have confirmed that the robot can construct original state spaces through its vision sensor and achieve navigation tasks with the obtained state spaces in a complicated simulated world.
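The abstract only outlines the idea, so the following is a minimal illustrative sketch, not the authors' algorithm: hypothetical EOP-like classifiers are built from (sensor pattern, action, reward) experiences, and the robot's internal state is the combination of their outputs. All names (EOP, build_eops, internal_state) and the prototype-matching rule are assumptions for illustration.

```python
# Hypothetical sketch (not the paper's code): states as combinations of
# empirically obtained sensor-pattern classifiers ("EOP"-like), derived
# from rewarded task executions.
import numpy as np


class EOP:
    """A simple sensor-pattern classifier: fires when the current sensor
    vector is close to a stored prototype pattern."""

    def __init__(self, prototype, threshold=0.5):
        self.prototype = np.asarray(prototype, dtype=float)
        self.threshold = threshold

    def perceive(self, sensor_vector):
        # Return 1 if the sensor vector matches the prototype, else 0.
        distance = np.linalg.norm(np.asarray(sensor_vector, dtype=float) - self.prototype)
        return int(distance < self.threshold)


def build_eops(experiences, n_prototypes=8, threshold=0.5, seed=0):
    """Pick prototype patterns from rewarded experiences, a crude stand-in
    for the statistical analysis of actions, sensor patterns, and rewards
    described in the abstract."""
    rng = np.random.default_rng(seed)
    rewarded = np.array([s for s, a, r in experiences if r > 0])
    idx = rng.choice(len(rewarded),
                     size=min(n_prototypes, len(rewarded)),
                     replace=False)
    return [EOP(rewarded[i], threshold) for i in idx]


def internal_state(eops, sensor_vector):
    """Robot-oriented state: the combination (bit vector) of EOP outputs."""
    return tuple(eop.perceive(sensor_vector) for eop in eops)


if __name__ == "__main__":
    # Toy experiences: (sensor pattern, action, reward).
    rng = np.random.default_rng(1)
    experiences = [(rng.random(4), a, r)
                   for a, r in [("forward", 1), ("turn", 0), ("forward", 1),
                                ("turn", 1), ("forward", 0), ("turn", 1)]]
    eops = build_eops(experiences, n_prototypes=3, threshold=0.8)
    print(internal_state(eops, rng.random(4)))  # e.g. (1, 0, 1)
```

In this sketch the state space is discrete and grows with the number of classifiers; the paper itself constructs the classifiers from vision-sensor data during navigation tasks, which this toy example does not attempt to reproduce.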

Original language: English
Pages: 1496-1501
Number of pages: 6
Publication status: Published - 1996 Dec 1
Externally published: Yes
Event: Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS. Part 3 (of 3) - Osaka, Japan
Duration: 1996 Nov 4 - 1996 Nov 8

Other

Other: Proceedings of the 1996 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS. Part 3 (of 3)
City: Osaka, Japan
Period: 96/11/4 - 96/11/8

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Computer Vision and Pattern Recognition
  • Computer Science Applications
