A new efficient measure for accuracy prediction and its application to multistream-based unsupervised adaptation

Tetsuji Ogawa, Sri Harish Mallidi, Emmanuel Dupoux, Jordan Cohen, Naomi H. Feldman, Hynek Hermansky

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Citations (Scopus)

Abstract

A new efficient measure for predicting estimation accuracy is proposed and successfully applied to multistream-based unsupervised adaptation of ASR systems, addressing data uncertainty when ground truth is unknown. The proposed measure extends the M-measure, which predicts confidence in the output of a probability estimator by measuring the divergences between probability estimates spaced at specific time intervals. In this study, the M-measure was extended to incorporate latent phoneme information, resulting in improved reliability. Experimental comparisons carried out in a multistream-based ASR paradigm demonstrated that the extended M-measure yields a significant improvement over the original M-measure, especially under narrow-band noise conditions.
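The record does not define the M-measure itself, but the published baseline formulation accumulates a divergence (typically symmetric KL) between pairs of posterior vectors separated by a range of time intervals, averaged over an utterance. The Python sketch below illustrates that baseline computation only, under stated assumptions: the posteriorgram is a (T, K) NumPy array of per-frame class posteriors, and the interval set `deltas` is a stand-in chosen for illustration, not taken from the paper. The phoneme-conditioned extension proposed by the authors is not reproduced here.

import numpy as np

def symmetric_kl(p, q, eps=1e-12):
    # Symmetric KL divergence between two discrete distributions;
    # eps guards against log(0) for near-zero posterior entries.
    p = np.clip(p, eps, None)
    q = np.clip(q, eps, None)
    return float(np.sum(p * np.log(p / q)) + np.sum(q * np.log(q / p)))

def m_measure(posteriors, deltas=range(5, 80, 5)):
    # Baseline M-measure of a posteriorgram.
    # posteriors: (T, K) array; each row is a posterior distribution
    # over K classes (e.g. phonemes) for one frame.
    # For each frame spacing d, average the divergence between frames
    # t and t + d, then average over all spacings. Larger values mean
    # the estimator discriminates between temporally distant frames,
    # which has been reported to correlate with recognition accuracy.
    T = posteriors.shape[0]
    per_delta = []
    for d in deltas:
        if d >= T:
            break
        divs = [symmetric_kl(posteriors[t], posteriors[t + d])
                for t in range(T - d)]
        per_delta.append(np.mean(divs))
    return float(np.mean(per_delta))

In a multistream setting such as the one described in the abstract, a score of this kind would be computed per stream and used to select or weight streams without reference transcripts, which is what makes the adaptation unsupervised.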

Original language: English
Title of host publication: 2016 23rd International Conference on Pattern Recognition, ICPR 2016
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 2222-2227
Number of pages: 6
ISBN (Electronic): 9781509048472
DOIs
Publication status: Published - 2016 Jan 1
Event: 23rd International Conference on Pattern Recognition, ICPR 2016 - Cancun, Mexico
Duration: 2016 Dec 4 - 2016 Dec 8

Publication series

Name: Proceedings - International Conference on Pattern Recognition
Volume: 0
ISSN (Print): 1051-4651

Other

Other: 23rd International Conference on Pattern Recognition, ICPR 2016
Country/Territory: Mexico
City: Cancun
Period: 16/12/4 → 16/12/8

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
