Inter-modality mapping in robot with recurrent neural network

Tetsuya Ogata*, Shun Nishide, Hideki Kozima, Kazunori Komatani, Hiroshi G. Okuno

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

23 Citations (Scopus)


A system for mapping between different sensory modalities was developed for a robot, enabling it to generate motions that express auditory signals and to generate sounds corresponding to observed object movements. A recurrent neural network model with parametric bias, which has good generalization ability, is used as the learning model. Since the correspondences between auditory and visual signals are too numerous to memorize, this ability to generalize is indispensable. The system was implemented in the "Keepon" robot, which was shown a box object being manipulated: horizontal reciprocating or rotating motions paired with the sound of friction, and falling or overturning motions paired with the sound of collision. Keepon responded appropriately not only to learned events but also to unknown ones, and generated various sounds in accordance with the observed motions.
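The recurrent neural network with parametric bias (RNNPB) mentioned in the abstract attaches a small, per-sequence "parametric bias" (PB) vector to the network input: during training each sequence learns its own PB, and during recognition the weights are frozen while the PB is fit to an observed sequence by gradient descent on the prediction error. The following is a minimal illustrative sketch of that mechanism, not the authors' implementation; the class name, layer sizes, and the use of numerical gradients for PB inference are all assumptions made for brevity.

```python
import numpy as np

class RNNPB:
    """Minimal Elman-style RNN with parametric bias (illustrative sketch).

    The PB vector is concatenated to the input at every time step, so a
    fixed PB conditions which sequence the network regenerates.
    """

    def __init__(self, n_in, n_hid, n_pb, seed=0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0.0, 0.3, (n_hid, n_in + n_pb))
        self.W_hid = rng.normal(0.0, 0.3, (n_hid, n_hid))
        self.W_out = rng.normal(0.0, 0.3, (n_in, n_hid))

    def forward(self, seq, pb):
        """Predict seq[t+1] from seq[t] and the PB vector."""
        h = np.zeros(self.W_hid.shape[0])
        preds = []
        for x in seq[:-1]:
            xin = np.concatenate([x, pb])
            h = np.tanh(self.W_in @ xin + self.W_hid @ h)
            preds.append(self.W_out @ h)
        return np.array(preds)

    def loss(self, seq, pb):
        """Mean squared one-step prediction error."""
        return float(np.mean((self.forward(seq, pb) - seq[1:]) ** 2))

    def recognize(self, seq, n_pb, steps=50, lr=0.1, eps=1e-3):
        """Infer a PB for an observed sequence with weights frozen,
        using numerical gradient descent; returns the best PB found."""
        pb = np.zeros(n_pb)
        best_pb, best_loss = pb.copy(), self.loss(seq, pb)
        for _ in range(steps):
            grad = np.zeros(n_pb)
            for i in range(n_pb):
                d = np.zeros(n_pb)
                d[i] = eps
                grad[i] = (self.loss(seq, pb + d)
                           - self.loss(seq, pb - d)) / (2 * eps)
            pb -= lr * grad
            cur = self.loss(seq, pb)
            if cur < best_loss:
                best_pb, best_loss = pb.copy(), cur
        return best_pb
```

In the inter-modal setting of the paper, the same idea lets a PB inferred from one modality (e.g. a sound sequence) drive generation in another (a motion sequence), because similar sequences end up close together in PB space.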

Original language: English
Pages (from-to): 1560-1569
Number of pages: 10
Journal: Pattern Recognition Letters
Issue number: 12
Publication status: Published - 2010 Sept 1
Externally published: Yes


Keywords

  • Dynamical systems
  • Generalization
  • Inter-modal mapping
  • Recurrent neural network with parametric bias

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence


