A beat-tracking robot for human-robot interaction and its evaluation

Kazumasa Murata*, Kazuhiro Nakadai, Ryu Takeda, Hiroshi G. Okuno, Toyotaka Torii, Yuji Hasegawa, Hiroshi Tsujino

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

19 Citations (Scopus)

Abstract

Human-robot interaction through music in real environments is essential for humanoids, because such a robot can make interaction with people more enjoyable. We thus developed a beat-tracking robot that steps, sings, and scats in time with musical beats predicted from a robot-embedded microphone, as a first step toward a robot that can hold a music session with people. This paper first describes the beat-tracking robot and then evaluates it in detail on the following three points: adaptation to tempo changes, robustness against environmental noise, including the periodic noise generated by stepping, singing, and scatting, and human-robot interaction using clapping sounds. The results showed that our beat-tracking robot drastically improved noise robustness and adaptation to tempo changes, so that it can hold a simple sound session with people.
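The record above contains no code, but the general idea of beat tracking from a single microphone signal can be illustrated with a small, self-contained sketch. The snippet below is not the authors' algorithm; it is a generic illustration, assuming a spectral-flux onset-strength envelope and a simple autocorrelation-based tempo estimate, written in plain NumPy. All function and parameter names are hypothetical.

```python
import numpy as np

def estimate_tempo(signal, sr, frame_len=1024, hop=512):
    """Estimate tempo (BPM) of a mono signal via spectral-flux onset
    strength and autocorrelation. Illustrative only; not the method
    evaluated in the paper."""
    # Frame the signal and compute magnitude spectra (simple STFT).
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    mags = np.empty((n_frames, frame_len // 2 + 1))
    for i in range(n_frames):
        frame = signal[i * hop : i * hop + frame_len] * window
        mags[i] = np.abs(np.fft.rfft(frame))

    # Onset strength: half-wave-rectified spectral flux per frame.
    flux = np.maximum(0.0, np.diff(mags, axis=0)).sum(axis=1)
    flux -= flux.mean()

    # Autocorrelate the onset envelope; a peak at lag L frames
    # corresponds to a beat period of L * hop / sr seconds.
    ac = np.correlate(flux, flux, mode="full")[len(flux) - 1:]

    # Search lags corresponding to 60-180 BPM.
    frame_rate = sr / hop
    min_lag = int(frame_rate * 60 / 180)
    max_lag = int(frame_rate * 60 / 60)
    lag = min_lag + int(np.argmax(ac[min_lag:max_lag + 1]))
    return 60.0 * frame_rate / lag

if __name__ == "__main__":
    # Synthetic test: one click every 0.5 s, i.e. roughly 120 BPM.
    sr = 16000
    clicks = np.zeros(sr * 10)
    clicks[::sr // 2] = 1.0
    print(round(estimate_tempo(clicks, sr), 1))  # expect roughly 120
```

A batch estimator like this ignores the issues the paper actually evaluates: real-time prediction of upcoming beats, adaptation to tempo changes, and suppression of the robot's own periodic noise from stepping, singing, and scatting.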

Original language: English
Title of host publication: 2008 8th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2008
Pages: 79-84
Number of pages: 6
DOIs
Publication status: Published - 2008
Externally published: Yes
Event: 2008 8th IEEE-RAS International Conference on Humanoid Robots, Humanoids 2008 - Daejeon
Duration: 2008 Dec 1 - 2008 Dec 3

ASJC Scopus subject areas

  • Artificial Intelligence
  • Computer Vision and Pattern Recognition
  • Human-Computer Interaction
