Query-by-Example music retrieval approach based on musical genre shift by changing instrument volume

Katsutoshi Itoyama*, Masataka Goto, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

We describe a novel Query-by-Example (QBE) approach to Music Information Retrieval that allows a user to customize query examples by directly modifying the volume of different instrument parts. The underlying hypothesis is that the musical genre shifts (changes) in relation to the volume balance of the different instruments. On the basis of this hypothesis, we aim to clarify the relationship between a change in the volume balance of a query and the shift in the musical genre of the retrieved similar pieces, and thus help instruct a user in generating alternative queries without having to choose other pieces. Our QBE system first separates all instrument parts from the audio signal of a piece with the help of its musical score, and then lets a user remix those parts to change the acoustic features that represent the musical mood of the piece. The distribution of those features is modeled by a Gaussian Mixture Model for each musical piece, and the Earth Mover's Distance between the mixtures of different pieces is used as the degree of their mood similarity. Experimental results showed that the genre shift was actually caused by volume changes of the vocal, guitar, and drum parts.
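The similarity computation outlined in the abstract admits a compact sketch: fit a Gaussian Mixture Model to each piece's frame-level acoustic features, then compute the Earth Mover's Distance between the two mixtures, treating the component weights as transported mass and the Euclidean distance between component means as one possible ground distance. The sketch below in Python is an illustration under those assumptions, not the authors' implementation; the feature arrays, function names, and number of mixture components are hypothetical, and feature extraction and score-informed source separation are assumed to happen elsewhere.

import numpy as np
from scipy.optimize import linprog
from sklearn.mixture import GaussianMixture


def fit_gmm(features, n_components=8, seed=0):
    """Fit a GMM to the frame-level acoustic features of one piece."""
    gmm = GaussianMixture(n_components=n_components,
                          covariance_type="diag", random_state=seed)
    gmm.fit(features)
    return gmm


def emd_between_gmms(gmm_a, gmm_b):
    """Earth Mover's Distance between two fitted GMMs.

    Component weights act as the transported mass; the ground distance is
    the Euclidean distance between component means (one simple choice).
    """
    wa, wb = gmm_a.weights_, gmm_b.weights_
    ma, mb = gmm_a.means_, gmm_b.means_
    n, m = len(wa), len(wb)
    ground = np.linalg.norm(ma[:, None, :] - mb[None, :, :], axis=-1)

    # Transportation problem: minimize sum_ij f_ij * ground_ij subject to
    # row sums of f equal wa, column sums equal wb, and f_ij >= 0.
    c = ground.ravel()
    A_eq, b_eq = [], []
    for i in range(n):                 # each source component fully shipped
        row = np.zeros((n, m))
        row[i, :] = 1.0
        A_eq.append(row.ravel())
        b_eq.append(wa[i])
    for j in range(m):                 # each target component fully filled
        col = np.zeros((n, m))
        col[:, j] = 1.0
        A_eq.append(col.ravel())
        b_eq.append(wb[j])
    result = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                     bounds=(0, None), method="highs")
    return result.fun  # weights sum to 1, so the optimal cost is the EMD


# Hypothetical usage: features_a and features_b are (n_frames, n_dims)
# arrays of mood-related features extracted from two (remixed) pieces;
# a smaller distance means a higher mood similarity.
# distance = emd_between_gmms(fit_gmm(features_a), fit_gmm(features_b))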

Original language: English
Pages (from-to): 205-212
Number of pages: 8
Journal: Proceedings of the International Conference on Digital Audio Effects, DAFx
Publication status: Published - 2009
Externally published: Yes
Event: 12th International Conference on Digital Audio Effects, DAFx 2009 - Como, Italy
Duration: 2009 Sept 1 - 2009 Sept 4

ASJC Scopus subject areas

  • Signal Processing
