We analyze the extended stochastic complexity (ESC) proposed by K. Yamanishi. ESC applies to learning algorithms in both on-line prediction and batch-learning settings. Yamanishi derived an upper bound on ESC that holds uniformly for all data sequences, as well as a bound on the asymptotic expectation of ESC. However, his analysis concentrates mainly on worst-case performance, and a lower bound has not been derived. In this paper, we show that ESC has properties analogous to those of Bayesian statistics: a Bayes-rule property and asymptotic normality. Using these properties, we then derive an asymptotic formula for ESC, in the sense of both almost-sure and mean convergence, within an error of o(1).
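For context, a sketch of the quantity under study may help. The abstract itself does not state the definition; the following is an assumed formulation based on Yamanishi's decision-theoretic extension of stochastic complexity, where the hypothesis class H, prior pi, loss function L, and scale parameter lambda > 0 are all notation introduced here for illustration:

\[
I_{\lambda}(y^{n} \mid x^{n}) \;=\; -\frac{1}{\lambda}\,
\ln \int_{\mathcal{H}} \exp\!\Big(-\lambda \sum_{t=1}^{n} L\big(y_{t}, h(x_{t})\big)\Big)\, \pi(dh)
\]

Under this formulation, taking the logarithmic loss with lambda = 1 recovers Rissanen's stochastic complexity, \( -\ln \int_{\mathcal{H}} p(y^{n} \mid x^{n}, h)\, \pi(dh) \), which is what makes ESC a natural object for the Bayesian-style properties (Bayes rule, asymptotic normality) discussed above.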
IEICE Transactions on Fundamentals of Electronics, Communications and Computer Sciences
Published - 1 January 1999
ASJC Scopus subject areas
- Computer Graphics and Computer-Aided Design