Abstract
Sound localization for a robot or an embedded system is usually solved by using the Interaural Phase Difference (IPD) and the Interaural Intensity Difference (IID). These values are calculated by using the Head-Related Transfer Function (HRTF). However, the HRTF depends on the shape of the head and also changes as the environment changes. Therefore, sound localization without an HRTF is needed for real-world applications. In this paper, we present a new sound localization method based on auditory epipolar geometry with motion control. Auditory epipolar geometry is an extension of epipolar geometry in stereo vision to audition, and auditory and visual epipolar geometry can share the sound source direction. The key idea is to exploit the additional inputs obtained by motor control in order to compensate for the degradation of the IPD and IID caused by room reverberation and the body of the robot. The proposed system can localize and extract two simultaneous sound sources in a real-world room.
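As a rough illustration of the IPD cue the abstract refers to, the sketch below (not the paper's algorithm) estimates a source azimuth from the phase difference between two microphone signals under a free-field, far-field model; the microphone spacing, sample rate, and signal parameters are assumed values for the example.

```python
import numpy as np

fs = 16000          # sample rate [Hz] (assumed)
d = 0.18            # microphone baseline [m] (assumed)
c = 343.0           # speed of sound [m/s]

def estimate_direction(left, right, freq):
    """Estimate azimuth [rad] from the IPD at one frequency bin.

    Uses the far-field model IPD = 2*pi*freq*d*sin(theta)/c, which ignores
    the head/body effects (HRTF) that the paper's method is designed to
    work without.
    """
    n = len(left)
    bin_idx = int(round(freq * n / fs))
    L = np.fft.rfft(left)[bin_idx]
    R = np.fft.rfft(right)[bin_idx]
    ipd = np.angle(L * np.conj(R))          # phase difference in (-pi, pi]
    s = np.clip(ipd * c / (2 * np.pi * freq * d), -1.0, 1.0)
    return np.arcsin(s)

# Synthetic check: a 500 Hz tone delayed between channels by the lag
# corresponding to a 30-degree azimuth.
theta_true = np.deg2rad(30.0)
tau = d * np.sin(theta_true) / c            # inter-microphone delay [s]
t = np.arange(2048) / fs
left = np.sin(2 * np.pi * 500 * t)
right = np.sin(2 * np.pi * 500 * (t - tau))
theta_est = estimate_direction(left, right, 500.0)
```

Note that the phase wraps for high frequencies (when the inter-microphone delay exceeds half a period), which is one reason single-frequency IPD alone is ambiguous and why the paper combines it with IID and motor control.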
Original language | English |
---|---|
Title of host publication | IEEE International Conference on Intelligent Robots and Systems |
Pages | 1395-1401 |
Number of pages | 7 |
Volume | 3 |
Publication status | Published - 2001 |
Externally published | Yes |
Event | 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems - Maui, HI. Duration: 2001 Oct 29 → 2001 Nov 3 |
Other
Other | 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems |
---|---|
City | Maui, HI |
Period | 01/10/29 → 01/11/3 |
Keywords
- Active audition
- Humanoid
- Localization
- Sensor fusion
ASJC Scopus subject areas
- Control and Systems Engineering