TY - JOUR
T1 - Query-by-Example music retrieval approach based on musical genre shift by changing instrument volume
AU - Itoyama, Katsutoshi
AU - Goto, Masataka
AU - Komatani, Kazunori
AU - Ogata, Tetsuya
AU - Okuno, Hiroshi G.
PY - 2009
Y1 - 2009
N2 - We describe a novel Query-by-Example (QBE) approach in Music Information Retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis is that the musical genre shifts (changes) in relation to the volume balance of different instruments. On the basis of this hypothesis, we aim to clarify the relationship between a change in the volume balance of a query and the shift in the musical genre of retrieved similar pieces, and thus help a user generate alternative queries without choosing other pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then lets a user remix those parts to change the acoustic features that represent the musical mood of the piece. The distribution of these features is modeled by a Gaussian Mixture Model for each musical piece, and the Earth Mover's Distance between the mixtures of different pieces is used as the degree of their mood similarity. Experimental results showed that the shift was actually caused by volume changes of the vocal, guitar, and drum parts.
AB - We describe a novel Query-by-Example (QBE) approach in Music Information Retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis is that the musical genre shifts (changes) in relation to the volume balance of different instruments. On the basis of this hypothesis, we aim to clarify the relationship between a change in the volume balance of a query and the shift in the musical genre of retrieved similar pieces, and thus help a user generate alternative queries without choosing other pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then lets a user remix those parts to change the acoustic features that represent the musical mood of the piece. The distribution of these features is modeled by a Gaussian Mixture Model for each musical piece, and the Earth Mover's Distance between the mixtures of different pieces is used as the degree of their mood similarity. Experimental results showed that the shift was actually caused by volume changes of the vocal, guitar, and drum parts.
UR - http://www.scopus.com/inward/record.url?scp=84872702150&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84872702150&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:84872702150
SN - 2413-6700
SP - 205
EP - 212
JO - Proceedings of the International Conference on Digital Audio Effects, DAFx
JF - Proceedings of the International Conference on Digital Audio Effects, DAFx
T2 - 12th International Conference on Digital Audio Effects, DAFx 2009
Y2 - 1 September 2009 through 4 September 2009
ER -