TY - GEN
T1 - Hyperparameter Learning of Stochastic Image Generative Models with Bayesian Hierarchical Modeling and Its Effect on Lossless Image Coding
AU - Nakahara, Yuta
AU - Matsushima, Toshiyasu
N1 - Funding Information:
This work was supported by JSPS KAKENHI Grant Numbers JP17K06446 and JP19K04914.
Publisher Copyright:
© 2021 IEEE.
PY - 2021
Y1 - 2021
N2 - The explicit assumption of a stochastic data generative model is a remarkable feature of lossless compression of general data in information theory. However, most current lossless image coding methods focus on coding procedures without an explicit assumption of the stochastic generative model, which makes it difficult to discuss the theoretical optimality of a coding procedure with respect to the stochastic generative model. In this paper, we address this difficulty by constructing a stochastic generative model through a reinterpretation of a previous coding procedure. An important problem in our approach is how to learn the hyperparameters of the stochastic generative model, because the optimality of our coding algorithm is guaranteed only asymptotically and the hyperparameter setting still affects the expected code length for finite-length data. For this problem, we use Bayesian hierarchical modeling and confirm its effect by numerical experiments. To the best of our knowledge, this is the first study in lossless image coding that assumes such an explicit stochastic generative model and learns its hyperparameters.
AB - The explicit assumption of a stochastic data generative model is a remarkable feature of lossless compression of general data in information theory. However, most current lossless image coding methods focus on coding procedures without an explicit assumption of the stochastic generative model, which makes it difficult to discuss the theoretical optimality of a coding procedure with respect to the stochastic generative model. In this paper, we address this difficulty by constructing a stochastic generative model through a reinterpretation of a previous coding procedure. An important problem in our approach is how to learn the hyperparameters of the stochastic generative model, because the optimality of our coding algorithm is guaranteed only asymptotically and the hyperparameter setting still affects the expected code length for finite-length data. For this problem, we use Bayesian hierarchical modeling and confirm its effect by numerical experiments. To the best of our knowledge, this is the first study in lossless image coding that assumes such an explicit stochastic generative model and learns its hyperparameters.
UR - http://www.scopus.com/inward/record.url?scp=85123424050&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85123424050&partnerID=8YFLogxK
U2 - 10.1109/ITW48936.2021.9611418
DO - 10.1109/ITW48936.2021.9611418
M3 - Conference contribution
AN - SCOPUS:85123424050
T3 - 2021 IEEE Information Theory Workshop, ITW 2021 - Proceedings
BT - 2021 IEEE Information Theory Workshop, ITW 2021 - Proceedings
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2021 IEEE Information Theory Workshop, ITW 2021
Y2 - 17 October 2021 through 21 October 2021
ER -