Abstract
Our goal is to develop a coplayer music robot capable of presenting musical expression together with humans. Although many instrument-performing robots exist, they may have difficulty playing with human performers due to the lack of a synchronization function. To play with human performers, the robot has to follow variations in the human performance, such as temporal fluctuations. To cope with erroneous synchronization, we classify synchronization and musical expression into two levels: (1) the melody level and (2) the rhythm level. The idea is as follows: when synchronization with the melody is reliable, the robot responds to the pitch it hears; when synchronization is uncertain, it tries to follow the rhythm of the music. Our method estimates the score position for the melody level and the tempo for the rhythm level. The reliability of the score-position estimate is extracted from the probability distribution of the score position. The experimental results demonstrate that our method outperforms an existing score-following system on 16 of 20 polyphonic songs. The error in the prediction of the score position is reduced by 69% on average. The results also reveal that the switching mechanism alleviates the error in the estimation of the score position.
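To make the two-level switching idea concrete, the following is a minimal sketch, not the paper's actual algorithm. It assumes a particle-filter-style probability distribution over score positions (arrays `particles` and `weights`), and uses a simple variance-based heuristic as a stand-in for the paper's reliability measure; all names and the threshold are hypothetical.

```python
import numpy as np

def predict_score_position(particles, weights, tempo_bpm, dt,
                           reliability_threshold=0.5):
    """Illustrative two-level switching between melody and rhythm following.

    particles : candidate score positions in beats (hypothetical representation)
    weights   : normalized probabilities over the candidates
    tempo_bpm : current tempo estimate used at the rhythm level
    dt        : time elapsed since the last estimate, in seconds
    """
    # Melody level: point estimate of the score position from the distribution.
    position = float(np.sum(weights * particles))

    # Reliability heuristic (assumption, not the paper's measure):
    # a concentrated distribution is treated as a reliable alignment.
    variance = float(np.sum(weights * (particles - position) ** 2))
    reliability = 1.0 / (1.0 + variance)

    if reliability >= reliability_threshold:
        # Reliable alignment: respond to the estimated melody position.
        return position, "melody"
    # Uncertain alignment: fall back to the rhythm level and advance
    # the predicted position by the tempo estimate instead.
    beats_advanced = tempo_bpm / 60.0 * dt
    return position + beats_advanced, "rhythm"

# Example usage with a toy distribution over score positions (in beats).
particles = np.array([10.0, 10.1, 10.2, 14.0])
weights = np.array([0.4, 0.3, 0.2, 0.1])
print(predict_score_position(particles, weights, tempo_bpm=120.0, dt=0.5))
```

The returned label ("melody" or "rhythm") indicates which level drove the prediction; a real system would use it to decide whether to play the pitch at the estimated score position or to simply keep time with the tempo.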
Original language | English |
---|---|
Article number | 384651 |
Journal | Eurasip Journal on Advances in Signal Processing |
Volume | 2011 |
DOI | |
Publication status | Published - 2011 |
Externally published | Yes |
ASJC Scopus subject areas
- Signal Processing
- Hardware and Architecture
- Electrical and Electronic Engineering