This paper presents a robot quizmaster that has auditory functions (i.e., ears) for moderating a multiplayer quiz game. The most basic form of oral interaction in a quiz game is that a quizmaster reads a question aloud, and each player may answer it whenever the answer comes to mind. A critical problem in such oral interaction is that if multiple players speak almost simultaneously to answer, it is difficult for a 'human' quizmaster to recognize the overlapping answers and judge the correctness of each one. To avoid this problem, players have conventionally been required to push a button, raise a hand, or say 'Yes' simply to obtain the right to answer a question before giving it. This requirement, however, inhibits natural oral interaction. In this paper we propose a 'robot' quizmaster that can identify the player who correctly answers a question first, even when multiple players utter answers almost at the same time. Since our robot uses its own microphones (ears) embedded in its head, individual players are not required to wear small pin microphones close to their mouths. To localize, separate, and recognize overlapping utterances captured by the ears, we use the robot audition software HARK and the automatic speech recognizer Julius. Experimental results showed the effectiveness of our approach.
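The pipeline the abstract describes — localize and separate overlapping utterances, recognize each one, then award the question to whoever gave the correct answer earliest — can be sketched as follows. This is a minimal illustration only: the data class and function names are hypothetical stand-ins, not the paper's implementation or the actual HARK/Julius APIs, and player identity is assumed to be tied to the estimated direction of arrival.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class SeparatedUtterance:
    """One utterance after localization and separation.

    All fields are hypothetical stand-ins for what a robot-audition
    pipeline (e.g., HARK for separation, Julius for recognition)
    could plausibly provide.
    """
    direction_deg: float  # estimated direction of arrival; identifies the player
    onset_sec: float      # time at which the utterance began
    text: str             # recognized transcript of the utterance

def first_correct_answerer(utterances: List[SeparatedUtterance],
                           correct_answer: str) -> Optional[float]:
    """Return the direction (i.e., player) whose correct answer started
    earliest, or None if no player answered correctly."""
    correct = [u for u in utterances
               if u.text.strip().lower() == correct_answer.strip().lower()]
    if not correct:
        return None
    # Earliest onset wins, even if the utterances overlapped in time.
    return min(correct, key=lambda u: u.onset_sec).direction_deg
```

For example, if two players begin answering within a fraction of a second of each other, the function still ranks them by utterance onset rather than by when each recognition result happened to complete, which is the property the abstract's "first correct answerer" judgment requires.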
Title of host publication: 2014 IEEE/SICE International Symposium on System Integration, SII 2014
Publisher: Institute of Electrical and Electronics Engineers Inc.
Published: 2014 Jan 30
Event: 7th IEEE/SICE International Symposium on System Integration, SII 2014 - Tokyo, Japan
Duration: 2014 Dec 13 → 2014 Dec 15
ASJC Scopus subject areas
- Control and Systems Engineering
- Computer Networks and Communications
- Information Systems