TY - JOUR
T1 - The Alpha-HMM Estimation Algorithm
T2 - Prior Cycle Guides Fast Paths
AU - Matsuyama, Yasuo
PY - 2017/7/1
Y1 - 2017/7/1
N2 - The estimation of generative structures for sequences is becoming increasingly important for preventing such data sources from becoming a flood of disorganized information. Obtaining hidden Markov models (HMMs) has been a central method for structuring such data. However, users have been aware of the slow speed of this algorithm. In this study, we devise generalized and fast estimation methods for HMMs by employing a geometric information measure that is associated with a function called the alpha-logarithm. Using the alpha-logarithmic likelihood ratio, we exploit prior iterations to guide rapid convergence. The parameter alpha is used to adjust the utilization of previous information. A fixed-point approach using a causal shift and a series expansion is responsible for this gain. For software implementations, we present probability scaling to avoid underflow, where we generalize flaw corrections to the de facto standard. For the update mechanism, we begin with a method called shotgun surrogates, in relation to the parameter alpha. Then, we obtain a dynamic version that employs the controlling and undoing of alpha. Experiments on biological sequences and brain signals for practical state models demonstrate that a significant speedup is achieved compared to the Baum-Welch method. The effects of restricting the state models are also reported.
KW - Alpha-hidden Markov model estimation
KW - convergence speedup
KW - dynamic surrogate
KW - message passing
KW - shotgun surrogates
UR - http://www.scopus.com/inward/record.url?scp=85019179561&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85019179561&partnerID=8YFLogxK
U2 - 10.1109/TSP.2017.2692724
DO - 10.1109/TSP.2017.2692724
M3 - Review article
AN - SCOPUS:85019179561
SN - 1053-587X
VL - 65
SP - 3446
EP - 3461
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 13
M1 - 7895145
ER -