Sound and visual tracking for humanoid robot

Hiroshi G. Okuno*, Kazuhiro Nakadai, Tino Lourens, Hiroaki Kitano

*Corresponding author for this work

Research output: Article › peer-review

17 Citations (Scopus)

Abstract

Mobile robots capable of auditory perception usually adopt the "stop-perceive-act" principle to avoid the motor noise generated while they move. Although this principle reduces the complexity of auditory processing for mobile robots, it restricts their auditory capabilities. In this paper, sound and visual tracking are investigated to compensate for each other's drawbacks in tracking objects and to attain robust object tracking. Visual tracking may be difficult in the case of occlusion, while sound tracking may yield ambiguous localization due to the nature of auditory processing. For this purpose, we present an active audition system for a humanoid robot. The audition system of a highly intelligent humanoid requires localization of sound sources and identification of the meaning of each sound in the auditory scene. The active audition reported in this paper focuses on improved sound source tracking by integrating audition, vision, and motor control. Given multiple sound sources in the auditory scene, the humanoid SIG actively moves its head to improve localization by aligning its microphones orthogonal to the direction of the sound source and by capturing possible sound sources with vision. The system adaptively cancels motor noise using motor control signals. The experimental results demonstrate the effectiveness of sound and visual tracking.
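As a rough, hypothetical illustration of the geometric idea behind the head motion described in the abstract (not the actual system reported in the paper), the Python sketch below estimates a sound source's azimuth from the interaural time difference between two microphones and issues a proportional head-turn command toward it. The 0.20 m microphone spacing, 16 kHz sampling rate, far-field propagation model, and all function names are assumptions introduced for this example only.

import numpy as np

# All constants below are assumptions for this illustration, not values from the paper.
SOUND_SPEED = 343.0   # speed of sound in air (m/s)
MIC_DISTANCE = 0.20   # assumed spacing between the two head microphones (m)
SAMPLE_RATE = 16000   # assumed sampling rate (Hz)

def estimate_itd(left, right, fs=SAMPLE_RATE):
    """Delay of the right channel relative to the left (s), via cross-correlation.
    Positive means the sound reached the left microphone first."""
    corr = np.correlate(right, left, mode="full")
    lag = np.argmax(corr) - (len(left) - 1)
    return lag / fs

def itd_to_azimuth(itd, mic_distance=MIC_DISTANCE):
    """Far-field model: itd = (d / c) * sin(theta); positive azimuth is to the left."""
    s = np.clip(itd * SOUND_SPEED / mic_distance, -1.0, 1.0)
    return float(np.arcsin(s))

def head_turn_command(azimuth, gain=0.5):
    """Proportional head-turn command (rad). Facing the source drives the ITD
    toward zero, where its sensitivity to azimuth is highest -- the geometric
    motivation for aligning the microphone baseline orthogonal to the source."""
    return gain * azimuth

if __name__ == "__main__":
    # Synthetic test: the same burst reaches the right microphone 5 samples late,
    # so the source should be localized to the left of the robot.
    t = np.arange(0, 0.05, 1.0 / SAMPLE_RATE)
    burst = np.sin(2 * np.pi * 440 * t) * np.hanning(len(t))
    delay = 5
    left = np.concatenate([burst, np.zeros(delay)])
    right = np.concatenate([np.zeros(delay), burst])

    itd = estimate_itd(left, right)
    az = itd_to_azimuth(itd)
    print("ITD = %.3f ms, azimuth = %.1f deg, head turn = %.1f deg"
          % (itd * 1e3, np.degrees(az), np.degrees(head_turn_command(az))))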

Original language: English
Pages (from-to): 253-266
Number of pages: 14
Journal: Applied Intelligence
Volume: 20
Issue number: 3
DOI
Publication status: Published - May 2004
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Artificial Intelligence
