TY - GEN
T1 - Implementation of a musical performance interaction system for the Waseda flutist robot
T2 - 23rd IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010
AU - Petersen, Klaus
AU - Solis, Jorge
AU - Takanishi, Atsuo
PY - 2010/12/1
Y1 - 2010/12/1
N2 - The flutist robot WF-4RIV at Waseda University is able to play the flute at the level of an intermediate human player. So far, the robot has been able to play a statically sequenced duet with another musician, communicating only by keeping eye contact. To extend the interactive capabilities of the flutist robot, we have described in previous publications the implementation of a Music-based Interaction System (MbIS). The purpose of this system is to combine information from the robot's visual and aural sensor input signal processing systems to enable musical communication with a partner musician. In this paper, we focus on the part of the MbIS that is responsible for mapping information from the sensor processing system to meaningful modulation of the robot's musical output. We propose a two-skill-level approach to enable musicians of different ability levels to interact with the robot. When interacting with the flutist robot, the device's physical capabilities and limitations need to be taken into account. In the beginner-level interaction system, the user's input to the robot is filtered to adjust it to the state of the robot's breathing system. The advanced-level stage uses both the aural and the visual sensor processing information. In a teaching phase, the musician teaches the robot a tone sequence (by actually performing it) and associates it with a particular instrument movement. In a performance phase, the musician can trigger these taught sequences by performing the corresponding movements. Experiments to validate the functionality of the MbIS approach have been performed, and the results are presented in this paper.
UR - http://www.scopus.com/inward/record.url?scp=78651480614&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=78651480614&partnerID=8YFLogxK
U2 - 10.1109/IROS.2010.5652576
DO - 10.1109/IROS.2010.5652576
M3 - Conference contribution
AN - SCOPUS:78651480614
SN - 9781424466757
T3 - IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
SP - 2283
EP - 2288
BT - IEEE/RSJ 2010 International Conference on Intelligent Robots and Systems, IROS 2010 - Conference Proceedings
Y2 - 18 October 2010 through 22 October 2010
ER -