TY - GEN
T1 - Rapid Prototyping of Robust Language Understanding Modules for Spoken Dialogue Systems
AU - Fukubayashi, Yuichiro
AU - Komatani, Kazunori
AU - Nakano, Mikio
AU - Funakoshi, Kotaro
AU - Tsujino, Hiroshi
AU - Ogata, Tetsuya
AU - Okuno, Hiroshi G.
N1 - Publisher Copyright:
© 2008 IJCNLP 2008 - 3rd International Joint Conference on Natural Language Processing, Proceedings of the Conference. All rights reserved.
PY - 2008
Y1 - 2008
N2 - Language understanding (LU) modules for spoken dialogue systems in the early phases of their development need to be (i) easy to construct and (ii) robust against various expressions. Conventional methods of LU are not suitable for new domains, because they take a great deal of effort to make rules or transcribe and annotate a sufficient corpus for training. In our method, the weightings of the Weighted Finite State Transducer (WFST) are designed on two levels and are simpler than those for conventional WFST-based methods. Therefore, our method needs much less training data, which enables rapid prototyping of LU modules. We evaluated our method in two different domains. The results revealed that our method outperformed baseline methods with less than one hundred utterances as training data, which can be reasonably prepared for new domains. This shows that our method is appropriate for rapid prototyping of LU modules.
AB - Language understanding (LU) modules for spoken dialogue systems in the early phases of their development need to be (i) easy to construct and (ii) robust against various expressions. Conventional methods of LU are not suitable for new domains, because they take a great deal of effort to make rules or transcribe and annotate a sufficient corpus for training. In our method, the weightings of the Weighted Finite State Transducer (WFST) are designed on two levels and are simpler than those for conventional WFST-based methods. Therefore, our method needs much less training data, which enables rapid prototyping of LU modules. We evaluated our method in two different domains. The results revealed that our method outperformed baseline methods with less than one hundred utterances as training data, which can be reasonably prepared for new domains. This shows that our method is appropriate for rapid prototyping of LU modules.
UR - http://www.scopus.com/inward/record.url?scp=70450149333&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=70450149333&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:70450149333
T3 - IJCNLP 2008 - 3rd International Joint Conference on Natural Language Processing, Proceedings of the Conference
SP - 210
EP - 216
BT - IJCNLP 2008 - 3rd International Joint Conference on Natural Language Processing, Proceedings of the Conference
PB - Association for Computational Linguistics (ACL)
T2 - 3rd International Joint Conference on Natural Language Processing, IJCNLP 2008
Y2 - 7 January 2008 through 12 January 2008
ER -