TY - GEN
T1 - Extracting the Relationship between the Spatial Distribution and Types of Bird Vocalizations Using Robot Audition System HARK
AU - Sumitani, Shinji
AU - Suzuki, Reiji
AU - Matsubayashi, Shiho
AU - Arita, Takaya
AU - Nakadai, Kazuhiro
AU - Okuno, Hiroshi G.
PY - 2018/12/27
Y1 - 2018/12/27
N2 - For a deeper understanding of the ecological functions and semantics of wild bird vocalizations (i.e., songs and calls), it is important to clarify the fine-scaled, detailed relationships between the characteristics of vocalizations and their behavioral contexts. However, obtaining such data through conventional recordings or human observation takes considerable time and effort. Our approach to this problem is to bring a robot into the field. We are developing a portable observation system called HARKBird, based on the robot audition software HARK and microphone arrays, to understand temporal patterns of vocalization characteristics and their behavioral contexts. In this paper, we introduce a prototype system that localizes vocalizations of wild birds in 2D in real time and classifies their song types after recording. We show that the system can estimate the positions of a target individual's songs and classify those songs with sufficient quality to discuss their song-behavior relationships.
AB - For a deeper understanding of the ecological functions and semantics of wild bird vocalizations (i.e., songs and calls), it is important to clarify the fine-scaled, detailed relationships between the characteristics of vocalizations and their behavioral contexts. However, obtaining such data through conventional recordings or human observation takes considerable time and effort. Our approach to this problem is to bring a robot into the field. We are developing a portable observation system called HARKBird, based on the robot audition software HARK and microphone arrays, to understand temporal patterns of vocalization characteristics and their behavioral contexts. In this paper, we introduce a prototype system that localizes vocalizations of wild birds in 2D in real time and classifies their song types after recording. We show that the system can estimate the positions of a target individual's songs and classify those songs with sufficient quality to discuss their song-behavior relationships.
UR - http://www.scopus.com/inward/record.url?scp=85062998103&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85062998103&partnerID=8YFLogxK
U2 - 10.1109/IROS.2018.8594130
DO - 10.1109/IROS.2018.8594130
M3 - Conference contribution
AN - SCOPUS:85062998103
T3 - IEEE International Conference on Intelligent Robots and Systems
SP - 2485
EP - 2490
BT - 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS 2018
Y2 - 1 October 2018 through 5 October 2018
ER -