Abstract
This paper describes a multimodal database consisting of image data of human gestures and corresponding speech data for research on multimodal interaction systems. The purpose of the database is to provide a foundation for the research and development of multimodal interactive systems. Our primary concern in selecting the utterances and gestures for inclusion was to identify the kinds of expressions and gestures that artificial systems could produce and recognize. A total of 25 kinds of gestures and utterances were each repeated four times during the recording of every subject. The speech and gestures of 48 subjects in total were recorded and converted into files; in the first version, the files for 12 subjects were released on CD-ROMs.
Original language | English |
---|---|
Pages | 2247-2250 |
Number of pages | 4 |
Publication status | Published - 1999 |
Externally published | Yes |
Event | 6th European Conference on Speech Communication and Technology, EUROSPEECH 1999 - Budapest, Hungary Duration: 1999 Sept 5 → 1999 Sept 9 |
Conference
Conference | 6th European Conference on Speech Communication and Technology, EUROSPEECH 1999 |
---|---|
Country/Territory | Hungary |
City | Budapest |
Period | 99/9/5 → 99/9/9 |
Keywords
- database
- gesture
- multimodal
ASJC Scopus subject areas
- Computer Science Applications
- Software
- Linguistics and Language
- Communication