Alpha-EM gives fast hidden Markov model estimation: Derivation and evaluation of alpha-HMM

Yasuo Matsuyama*, Ryunosuke Hayashi

*Corresponding author for this work

    Research output: Conference contribution

    5 Citations (Scopus)

    Abstract

    A fast learning algorithm for hidden Markov models is derived from convex divergence optimization. The method uses the alpha-logarithm, instead of the traditional logarithm, as a surrogate function for processing the likelihood ratio, which allows a stronger curvature than the logarithm provides. The presented method includes the ordinary Baum-Welch re-estimation algorithm as a proper subset. Fast learning is achieved by exploiting time-shifted information as the iterations progress. The computational complexity, which directly determines CPU time, remains almost the same as that of the logarithmic version, since only stored results are reused for the speedup. A software implementation and its speed are examined on test data, and the results show that the presented method is creditable.
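    Since the abstract rests on replacing the logarithm with the alpha-logarithm when processing likelihood ratios, a minimal sketch of that surrogate may help. The snippet below assumes the alpha-logarithm of the alpha-EM family, L^(α)(x) = (2/(1+α))(x^((1+α)/2) − 1), which tends to log x as α → −1; the function name alpha_log and the sample ratio values are illustrative only, and the actual alpha-HMM re-estimation formulas and their stored-quantity bookkeeping are given in the paper itself.

```python
import numpy as np

def alpha_log(x, alpha=-1.0):
    """Alpha-logarithm L^(alpha)(x) = (2 / (1 + alpha)) * (x**((1 + alpha) / 2) - 1).

    In the limit alpha -> -1 this equals log(x), so the ordinary
    logarithmic surrogate (and hence Baum-Welch) appears as the
    special case alpha = -1 of this family.
    """
    x = np.asarray(x, dtype=float)
    if np.isclose(alpha, -1.0):
        return np.log(x)
    p = (1.0 + alpha) / 2.0
    return (np.power(x, p) - 1.0) / p

# Likelihood ratios between successive iterates are the quantities the
# surrogate acts on; the values here are made up for illustration.
r = np.array([0.5, 1.0, 2.0, 4.0])

print(alpha_log(r, alpha=-0.999))   # close to log(r): alpha-log -> log as alpha -> -1
print(np.log(r))
print(alpha_log(r, alpha=-2.0))     # a more strongly curved member of the family
```

    The α = −1 branch makes the "proper subset" claim concrete: the logarithmic objective is recovered exactly at that value, while other choices of α change the curvature applied to the stored likelihood ratios.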

    Original language: English
    Host publication title: Proceedings of the International Joint Conference on Neural Networks
    DOI
    Publication status: Published - 2010
    Event: 2010 6th IEEE World Congress on Computational Intelligence, WCCI 2010 - 2010 International Joint Conference on Neural Networks, IJCNN 2010 - Barcelona
    Duration: 18 Jul 2010 - 23 Jul 2010


    ASJC Scopus subject areas

    • Software
    • Artificial Intelligence

