Interactive biped locomotion based on visual/auditory information

Yu Ogura, Yusuke Sugahara, Yoshiharu Kaneshima, Naoki Hieda, Hun Ok Lim, Atsuo Takanishi

Research output: Conference contribution

17 Citations (Scopus)

Abstract

This paper describes an interactive locomotion method for a biped humanoid robot. The method consists of two main parts: a pattern generator and a human-robot interface. The human-robot interface is used to achieve real-time interactive locomotion. In particular, visual information and voice instructions are employed to determine locomotion parameters such as step length, step direction, and the number of steps. The motion of the lower limbs is generated by the online pattern generator based on the locomotion parameters. Continuous locomotion experiments are carried out in real time using WABIAN-RV. The experimental results show the feasibility of the proposed interactive locomotion method.
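The abstract names the locomotion parameters (step length, step direction, number of steps) that the interface derives from voice and visual input and hands to the online pattern generator. The sketch below is illustrative only and not the authors' implementation: all function names, command vocabulary, numeric values, and units are assumptions chosen to show how such a mapping could look.

```python
# Hypothetical sketch (not the paper's code): mapping voice/visual input to
# locomotion parameters consumed by an online walking-pattern generator.
from dataclasses import dataclass


@dataclass
class LocomotionParameters:
    """The parameters named in the abstract; units are assumed for illustration."""
    step_length_m: float       # forward step length in metres
    step_direction_deg: float  # heading change per step in degrees
    num_steps: int


def parameters_from_voice(command: str) -> LocomotionParameters:
    """Map a recognized voice instruction to locomotion parameters.

    The command vocabulary and numeric values are illustrative only.
    """
    table = {
        "forward": LocomotionParameters(0.15, 0.0, 4),
        "turn left": LocomotionParameters(0.10, 15.0, 4),
        "turn right": LocomotionParameters(0.10, -15.0, 4),
        "stop": LocomotionParameters(0.0, 0.0, 0),
    }
    return table.get(command.lower(), LocomotionParameters(0.0, 0.0, 0))


def parameters_from_vision(target_offset_m: float,
                           target_bearing_deg: float) -> LocomotionParameters:
    """Derive parameters from a visually tracked target (assumed interface).

    Values are clamped so the pattern generator only ever receives
    parameters inside a safe range.
    """
    step_length = max(0.0, min(0.20, target_offset_m / 4.0))
    step_direction = max(-20.0, min(20.0, target_bearing_deg))
    num_steps = 0 if target_offset_m < 0.05 else 4
    return LocomotionParameters(step_length, step_direction, num_steps)


if __name__ == "__main__":
    # Each input channel produces a parameter set that would be passed
    # to the online pattern generator for the next walking cycle.
    print(parameters_from_voice("turn left"))
    print(parameters_from_vision(target_offset_m=0.6, target_bearing_deg=10.0))
```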

Original language: English
Title of host publication: IEEE ROMAN 2002 - 11th IEEE International Workshop on Robot and Human Interactive Communication, Proceedings
Pages: 253-258
Number of pages: 6
DOI
Publication status: Published - 2002 Dec 1
Event: 11th IEEE International Workshop on Robot and Human Interactive Communication, IEEE ROMAN 2002 - Berlin, Germany
Duration: 2002 Sep 25 - 2002 Sep 27

Publication series

Name: Proceedings - IEEE International Workshop on Robot and Human Interactive Communication

Conference

Conference: 11th IEEE International Workshop on Robot and Human Interactive Communication, IEEE ROMAN 2002
Country/Territory: Germany
City: Berlin
Period: 02/9/25 - 02/9/27

ASJC Scopus subject areas

  • Software
  • Artificial Intelligence
  • Human-Computer Interaction

