A musical robot that synchronizes with a coplayer using non-verbal cues

Angelica Lim*, Takeshi Mizumoto, Tetsuya Ogata, Hiroshi G. Okuno

*Corresponding author for this work

Research output: Article, peer-reviewed

5 Citations (Scopus)

Abstract

Music has long been used to strengthen bonds between humans. In our research, we develop musical coplayer robots with the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for inter-player communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
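The abstract does not spell out how the audio and visual streams are combined. As one illustrative reading only, the sketch below fuses an audio beat-tracker's tempo estimate with a visually derived one (e.g. from the period of a flutist's gesture) by confidence weighting; the names `TempoEstimate` and `aggregate_tempo` and the weighting scheme are assumptions for illustration, not the paper's actual algorithm.

```python
# Hypothetical audio-visual tempo aggregation: weight each modality's
# tempo estimate by its confidence. This is a minimal sketch, assuming
# per-modality (bpm, confidence) pairs are already available.

from dataclasses import dataclass


@dataclass
class TempoEstimate:
    bpm: float         # estimated tempo in beats per minute
    confidence: float  # in [0, 1], e.g. from onset clarity or gesture tracking


def aggregate_tempo(audio: TempoEstimate, visual: TempoEstimate) -> float:
    """Confidence-weighted fusion of audio and visual tempo estimates."""
    total = audio.confidence + visual.confidence
    if total == 0.0:
        # Neither modality is reliable; fall back to the audio estimate.
        return audio.bpm
    return (audio.bpm * audio.confidence + visual.bpm * visual.confidence) / total


if __name__ == "__main__":
    # A clear visual cue can dominate while the audio beat tracker is still
    # locking on, which is one way a robot could adapt to tempo changes quickly.
    audio = TempoEstimate(bpm=118.0, confidence=0.3)
    visual = TempoEstimate(bpm=126.0, confidence=0.9)
    print(f"fused tempo: {aggregate_tempo(audio, visual):.1f} BPM")
```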

Original language: English
Pages (from-to): 363-381
Number of pages: 19
Journal: Advanced Robotics
Volume: 26
Issue number: 3-4
DOI
Publication status: Published - 2012
Externally published: Yes

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Software
  • Human-Computer Interaction
  • Hardware and Architecture
  • Computer Science Applications
