Robotic interface for embodied interaction via dance and musical performance

Kenji Suzuki*, Shuji Hashimoto

*Corresponding author for this work

    Research output: Contribution to journal › Article › peer-review

    10 Citations (Scopus)

    Abstract

    A physically embodied robotic interface is proposed for collaborative work between humans and machines in a multimodal musical environment. The robotic interface is regarded as a "moving instrument" that displays reactive motion on stage while producing sound and music through embedded stereo speakers according to the context of the performance. In this paper, we introduce four musical platforms that utilize robotic and information technology in different circumstances. These provide effective design environments for artists such as musicians, composers, and choreographers, not only for music creation but also for media coordination, including motion and visual effects. The architecture, called the MIDI network, enables them to control the robot's movement as well as to compose music. Each of the developed robotic systems works as a kind of reflector, creating a multimodal acoustic and visual space. The proposed approach of equipping musical instruments with autonomous mobility promises a new type of computer music performance.
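    The MIDI network described in the abstract routes both musical events and robot-motion commands over the standard MIDI protocol. As a minimal sketch of how one message stream can carry both kinds of data, the following encodes raw MIDI channel messages per the MIDI 1.0 specification; the particular channel and controller-number mapping for robot motion is an illustrative assumption, not the authors' actual design.

    ```python
    def note_on(channel: int, note: int, velocity: int) -> bytes:
        """Encode a MIDI Note On message (status byte 0x9n)."""
        return bytes([0x90 | (channel & 0x0F), note & 0x7F, velocity & 0x7F])

    def control_change(channel: int, controller: int, value: int) -> bytes:
        """Encode a MIDI Control Change message (status byte 0xBn).
        A mobile-robot platform could map a controller number to,
        e.g., wheel speed (a hypothetical mapping for illustration)."""
        return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

    # Play middle C on channel 0 while commanding a (hypothetical)
    # motion controller that listens on channel 1, controller 16:
    sound_msg = note_on(0, 60, 100)         # bytes 0x90 0x3C 0x64
    motion_msg = control_change(1, 16, 64)  # bytes 0xB1 0x10 0x40
    ```

    Because both message types share the same three-byte framing, a single MIDI stream (or network of streams) can interleave musical notes and actuator commands, which is what lets composers treat robot movement as just another voice in the score.
    
    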

    Original language: English
    Pages (from-to): 656-671
    Number of pages: 16
    Journal: Proceedings of the IEEE
    Volume: 92
    Issue number: 4
    DOIs
    Publication status: Published - Apr 2004

    Keywords

    • Computer music
    • Human-computer interaction (HCI)
    • Hyperinstruments
    • Multimodal human-machine interaction
    • Robotic interface
    • User interface human factors

    ASJC Scopus subject areas

    • Electrical and Electronic Engineering
