A framework for integrating sensory information in a humanoid robot

I. Fermin*, Hiroshi G. Okuno, H. Ishiguro, H. Kitano

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

In this paper, we propose a framework for integrating sensory information based on the idea that stimuli perceived through different sensors are spatio-temporally correlated over a short time period. Robotic applications must process information from multiple sensors, for instance when observing a visible talking person. How can such information be related in a simple way, without resorting to high-level representations? This is the question we address. We propose a new framework based on a correlation measure over low-level sensor data. This low-level correlation measure can serve as a data-integration engine that supports high-level task descriptions. A coherent approach from the sensor level to the task level is developed for building a robot that can handle a large number of sensors and actuators. An example of how this approach can be applied to a visual-sound integration task is also presented.
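The paper itself is not reproduced here, but the core idea stated in the abstract can be illustrated with a minimal sketch: treat two low-level sensor streams as time series over a short window and use their normalized correlation as a cue that they belong to the same event (e.g., a visible talking person). The signal names, window length, and threshold below are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (assumed, not the authors' algorithm): correlate two
# low-level sensor streams over a short time window.
import numpy as np

def normalized_correlation(a: np.ndarray, b: np.ndarray) -> float:
    """Pearson-style correlation of two equal-length 1-D signals."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float(np.dot(a, b) / len(a))

def correlated_within_window(audio_energy: np.ndarray,
                             visual_motion: np.ndarray,
                             threshold: float = 0.6) -> bool:
    """Return True if the two streams co-vary over the short window."""
    return normalized_correlation(audio_energy, visual_motion) > threshold

# Example: for a visible talking person, audio energy and lip-region motion
# rise and fall together within a short window, so the correlation is high.
t = np.linspace(0.0, 1.0, 100)
audio_energy = np.abs(np.sin(8 * np.pi * t)) + 0.1 * np.random.rand(100)
visual_motion = np.abs(np.sin(8 * np.pi * t)) + 0.1 * np.random.rand(100)
print(correlated_within_window(audio_energy, visual_motion))  # likely True
```

A higher-level task description could then consume this boolean (or the raw correlation value) as an integration cue, without requiring a symbolic model of either sensor stream.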

Original language: English
Title of host publication: IEEE International Conference on Intelligent Robots and Systems
Pages: 1748-1753
Number of pages: 6
Volume: 3
Publication status: Published - 2000
Externally published: Yes
Event: 2000 IEEE/RSJ International Conference on Intelligent Robots and Systems - Takamatsu
Duration: 2000 Oct 31 - 2000 Nov 5


Keywords

  • Correlation measure
  • Humanoid robot
  • Learning
  • Sensor integration
  • Spatial-time information

ASJC Scopus subject areas

  • Control and Systems Engineering
