TY - GEN
T1 - Toward enabling a natural interaction between human musicians and musical performance robots
T2 - 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
AU - Petersen, Klaus
AU - Solis, Jorge
AU - Takanishi, Atsuo
PY - 2008
Y1 - 2008
N2 - Our research aims to develop an anthropomorphic flutist robot as a benchmark for the better understanding of interaction between musicians and musical performance robots from a musical point of view. As a long-term goal of our research, we would like to enable such robots to play actively together with a human band and create novel ways of musical expression. For this purpose, we focus on enhancing the perceptual capabilities of the flutist robot to process musical information coming from the aural and visual perceptual channels. In this paper, we introduce, as a first approach, a hands-free gesture-based control interface designed to modify musical parameters in real-time. In particular, we describe a set of virtual controllers that a composer can manipulate through gestures with a musical instrument. The gestures are identified by 2-D motion-sensitive areas which graphically represent common control interfaces used in music production. The resulting information from the vision processing is then transformed into MIDI messages, which are subsequently played by the flutist robot. In order to verify the effectiveness of the proposed gestural interface, we performed experiments on musical interaction with human partners. From the experimental results we concluded that our method satisfies the technical and idiosyncratic requirements for being a suitable tool for musical performance.
UR - http://www.scopus.com/inward/record.url?scp=52949100629&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=52949100629&partnerID=8YFLogxK
U2 - 10.1109/ROMAN.2008.4600689
DO - 10.1109/ROMAN.2008.4600689
M3 - Conference contribution
AN - SCOPUS:52949100629
SN - 9781424422135
T3 - Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
SP - 340
EP - 345
BT - Proceedings of the 17th IEEE International Symposium on Robot and Human Interactive Communication, RO-MAN
Y2 - 1 August 2008 through 3 August 2008
ER -