Interactive biped locomotion based on visual/auditory information

Yu Ogura, Yusuke Sugahara, Yoshiharu Kaneshima, Naoki Hieda, Hun Ok Lim, Atsuo Takanishi

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

17 Citations (Scopus)

Abstract

This paper describes an interactive locomotion method for a biped humanoid robot. The method consists of two main parts: a pattern generator and a human-robot interface. The human-robot interface is used to achieve real-time interactive locomotion. In particular, visual information and voice instructions are employed to determine locomotion parameters such as step length, step direction, and the number of steps. The motion of the lower limbs is generated by the online pattern generator based on the locomotion parameters. Continuous locomotion experiments are carried out in real time using WABIAN-RV. The experimental results show the feasibility of the proposed interactive locomotion method.
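The pipeline the abstract describes (voice/visual input → locomotion parameters → online pattern generator → footsteps) can be sketched as below. This is a minimal illustrative sketch, not the paper's actual implementation: the command vocabulary, parameter values, and the `generate_footsteps` helper are all assumptions for illustration.

```python
# Hypothetical sketch of the interface-to-pattern-generator pipeline:
# recognized voice commands select locomotion parameters (step length,
# step direction, number of steps), and a toy online pattern generator
# turns them into successive footstep placements. All names and numeric
# values are illustrative assumptions.

from dataclasses import dataclass
import math


@dataclass
class LocomotionParams:
    step_length: float      # metres per step (assumed value)
    step_direction: float   # heading change per step, radians (assumed)
    num_steps: int


# Assumed mapping from recognized voice commands to parameters.
VOICE_COMMANDS = {
    "forward": LocomotionParams(0.10, 0.0, 4),
    "turn left": LocomotionParams(0.05, math.radians(15), 4),
    "turn right": LocomotionParams(0.05, math.radians(-15), 4),
    "stop": LocomotionParams(0.0, 0.0, 0),
}


def generate_footsteps(params, x=0.0, y=0.0, heading=0.0):
    """Toy pattern generator: emit one (x, y, heading) pose per step."""
    steps = []
    for _ in range(params.num_steps):
        heading += params.step_direction
        x += params.step_length * math.cos(heading)
        y += params.step_length * math.sin(heading)
        steps.append((x, y, heading))
    return steps


steps = generate_footsteps(VOICE_COMMANDS["forward"])
# four 0.10 m steps straight ahead along the x axis
```

In the real system the parameters would instead be updated online each step from the vision and speech interfaces, which is what makes the locomotion interactive.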

Original language: English
Title of host publication: IEEE ROMAN 2002 - 11th IEEE International Workshop on Robot and Human Interactive Communication, Proceedings
Pages: 253-258
Number of pages: 6
DOIs
Publication status: Published - 2002 Dec 1
Event: 11th IEEE International Workshop on Robot and Human Interactive Communication, IEEE ROMAN 2002 - Berlin, Germany
Duration: 2002 Sept 25 - 2002 Sept 27

Publication series

Name: Proceedings - IEEE International Workshop on Robot and Human Interactive Communication

Conference

Conference: 11th IEEE International Workshop on Robot and Human Interactive Communication, IEEE ROMAN 2002
Country/Territory: Germany
City: Berlin
Period: 02/9/25 - 02/9/27

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Human-Computer Interaction
