Abstract
Music has long been used to strengthen bonds between humans. In our research, we develop musical co-player robots in the hope that music may improve human-robot symbiosis as well. In this paper, we underline the importance of non-verbal, visual communication for ensemble synchronization at the start of, during, and at the end of a piece. We propose three cues for inter-player communication, and present a theremin-playing, singing robot that can detect them and adapt its play to a human flutist. Experiments with two naive flutists suggest that the system can recognize naturally occurring flutist gestures without requiring specialized user training. In addition, we show how the use of audio-visual aggregation can allow a robot to adapt to tempo changes quickly.
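The aggregation idea in the abstract can be illustrated with a minimal sketch. The code below is a hypothetical illustration, not the authors' system: it fuses a beat period estimated from audio with one implied by a detected visual cue, weighting each modality by a confidence score, so that a strong cue in either modality can pull the tempo estimate quickly. All names and parameters here are assumptions for illustration.

```python
# Hypothetical sketch of audio-visual tempo aggregation (not the paper's
# implementation): fuse a beat period estimated from audio onsets with a
# period implied by detected visual cues, weighting each by confidence.

from dataclasses import dataclass


@dataclass
class TempoEstimate:
    period_s: float    # estimated inter-beat interval in seconds
    confidence: float  # 0..1, how much this modality is trusted right now


def fuse_tempo(audio: TempoEstimate, visual: TempoEstimate,
               prev_period_s: float, smoothing: float = 0.5) -> float:
    """Confidence-weighted fusion of audio and visual beat periods,
    smoothed against the previous estimate to avoid jitter."""
    total = audio.confidence + visual.confidence
    if total == 0.0:
        return prev_period_s  # no evidence in either modality: hold tempo
    fused = (audio.confidence * audio.period_s +
             visual.confidence * visual.period_s) / total
    # Exponential smoothing: a higher `smoothing` reacts faster to changes.
    return (1.0 - smoothing) * prev_period_s + smoothing * fused


if __name__ == "__main__":
    # The audio tracker is fairly sure the flutist slowed to 0.60 s/beat;
    # a visual cue (e.g. a slowing arm swing) weakly suggests 0.55 s/beat.
    audio = TempoEstimate(period_s=0.60, confidence=0.8)
    visual = TempoEstimate(period_s=0.55, confidence=0.4)
    print(f"fused period: {fuse_tempo(audio, visual, prev_period_s=0.50):.3f} s")
```

Because each modality carries its own confidence, a clear visual gesture can dominate when the audio is ambiguous (and vice versa), which is one plausible way a robot could react to tempo changes faster than with audio beat tracking alone.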
Original language | English |
---|---|
Pages (from-to) | 363-381 |
Number of pages | 19 |
Journal | Advanced Robotics |
Volume | 26 |
Issue number | 3-4 |
DOI | |
Publication status | Published - 2012 |
Externally published | Yes |
ASJC Scopus subject areas
- Control and Systems Engineering
- Software
- Human-Computer Interaction
- Hardware and Architecture
- Computer Science Applications