Converting text into agent animations: Assigning gestures to text

Yukiko I. Nakano, Masashi Okamoto, Daisuke Kawahara, Qing Li, Toyoaki Nishida

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

26 Citations (Scopus)

Abstract

This paper proposes a method for assigning gestures to text based on lexical and syntactic information. First, our empirical study identified lexical and syntactic information strongly correlated with gesture occurrence and suggested that syntactic structure is more useful for judging gesture occurrence than local syntactic cues. Based on the empirical results, we have implemented a system that converts text into an animated agent that gestures and speaks synchronously.
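
The abstract describes a pipeline in which lexical and syntactic cues in the input text trigger gestures that an animated agent performs in synchrony with speech. As a very rough illustration of the lexical-cue side of such a pipeline, the sketch below (Python) tags tokens with candidate gesture types; the cue words, gesture labels, and matching rule are purely hypothetical assumptions for demonstration and are not taken from the paper, which relies on empirically derived lexical and syntactic information.

```python
# Illustrative sketch only: a toy rule-based gesture assigner driven by
# lexical cues. Cue lists and gesture labels are assumptions, not the
# authors' empirically derived features.

GESTURE_CUES = {
    "deictic": {"this", "that", "here", "there"},          # pointing gestures
    "iconic": {"big", "small", "round", "long", "wide"},   # size/shape depiction
    "beat": {"first", "second", "then", "finally"},        # rhythmic emphasis
}


def assign_gestures(tokens):
    """Return (token, gesture_or_None) pairs based on lexical cues alone."""
    tagged = []
    for token in tokens:
        gesture = None
        lowered = token.lower().strip(".,!?")
        for label, cues in GESTURE_CUES.items():
            if lowered in cues:
                gesture = label
                break
        tagged.append((token, gesture))
    return tagged


if __name__ == "__main__":
    sentence = "First, put this big box over there."
    for token, gesture in assign_gestures(sentence.split()):
        print(f"{token:10s} -> {gesture}")
```

A full system along the lines the abstract suggests would additionally consult syntactic structure (e.g., phrase boundaries from a parse) rather than local cues alone, and would align the chosen gestures with the speech timeline of the animated agent.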

Original language: English
Title of host publication: HLT-NAACL 2004 - Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics, Short Papers
Publisher: Association for Computational Linguistics (ACL)
Pages: 153-156
Number of pages: 4
ISBN (Electronic): 1932432248, 9781932432244
Publication status: Published - 2004
Externally published: Yes
Event: 2004 Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics - Short Papers, HLT-NAACL 2004 - Boston, United States
Duration: 2004 May 2 - 2004 May 7

Publication series

Name: HLT-NAACL 2004 - Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics, Short Papers

Conference

Conference: 2004 Human Language Technology Conference of the North American Chapter of the Association for Computational Linguistics - Short Papers, HLT-NAACL 2004
Country/Territory: United States
City: Boston
Period: 04/5/2 - 04/5/7

ASJC Scopus subject areas

  • Language and Linguistics
  • Linguistics and Language
