Voice-awareness control for a humanoid robot consistent with its body posture and movements

Takuma Otsuka, Kazuhiro Nakadai, Toru Takahashi, Kazunori Komatani, Tetsuya Ogata, Hiroshi G. Okuno

Research output: Contribution to journal › Article › peer-review

6 Citations (Scopus)


This paper presents voice-awareness control consistent with a robot's head movements. For natural spoken communication between robots and humans, robots must behave and speak the way humans expect them to. Consistency between the robot's voice quality and its body motion is one of the most striking factors in the naturalness of robot speech. Our control is based on a new model of spectral envelope modification for vertical head motion, and left-right balance modulation for horizontal head motion. We assume that a pitch-axis rotation (vertical head motion) and a yaw-axis rotation (horizontal head motion) affect the voice quality independently. The spectral envelope modification model is constructed from an analysis of human vocalizations. The left-right balance model is established by measuring impulse responses with a pair of microphones. Experimental results show that the voice-awareness is perceivable in a robot-to-robot dialogue when the robots stand up to 150 cm apart. The dynamic change in voice quality is also confirmed in the experiment.
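The two independent controls described in the abstract can be illustrated with a minimal sketch. The gain curves below are generic assumptions for illustration only: the paper's actual left-right balance model is fitted from measured impulse responses, and its spectral envelope model from analyzed human vocalizations. Here, horizontal (yaw) head motion is mapped to a constant-power left/right balance, and vertical (pitch) head motion to a simple spectral tilt; the function names and parameters are hypothetical.

```python
import math

def lr_balance(yaw_deg, max_yaw=90.0):
    """Left/right channel gains for a horizontal head angle.
    Constant-power panning (illustrative assumption, not the
    paper's impulse-response-derived model). Positive yaw is
    taken here to mean the head turns toward the right channel."""
    theta = max(-max_yaw, min(max_yaw, yaw_deg)) / max_yaw  # normalize to [-1, 1]
    phi = (theta + 1.0) * math.pi / 4.0                     # map to [0, pi/2]
    return math.cos(phi), math.sin(phi)                     # (left_gain, right_gain)

def spectral_tilt_gain(freq_hz, pitch_deg, slope_db_per_octave=1.5):
    """Toy spectral-envelope modification for a vertical head angle:
    looking up boosts high frequencies, looking down attenuates them
    (assumed linear-in-dB tilt, not the fitted model from the paper)."""
    octaves = math.log2(max(freq_hz, 1.0) / 1000.0)          # octaves above/below 1 kHz
    tilt_db = slope_db_per_octave * (pitch_deg / 30.0) * octaves
    return 10.0 ** (tilt_db / 20.0)

# Facing forward: the two channels are balanced and power is preserved.
l, r = lr_balance(0.0)
```

A per-sample stereo renderer would multiply each output channel by these gains and apply the tilt as a filter; the constant-power form keeps the total energy (`l**2 + r**2 == 1`) stable as the head turns.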

Original language: English
Pages (from-to): 80-88
Number of pages: 9
Issue number: 1
Publication status: Published - 2010 Mar 1
Externally published: Yes


Keywords

  • 2D voice manipulation
  • Human robot interaction
  • Robot speech signal control
  • Source filter model
  • Voice awareness

ASJC Scopus subject areas

  • Human-Computer Interaction
  • Developmental Neuroscience
  • Cognitive Neuroscience
  • Artificial Intelligence
  • Behavioral Neuroscience

