## Abstract

The Expectation–Maximization (EM) algorithm is a simple meta-algorithm that has long been used as a methodology for statistical inference when the observed data contain missing measurements, or when the data consist of observables and unobservables. Its general properties are well studied, and it can be applied to individual problems in countless ways. In this paper, we introduce the em algorithm, an information-geometric formulation of the EM algorithm, together with its extensions and applications to various problems. Specifically, from the geometric perspective provided by Amari, we formulate an outlier-robust inference algorithm; an algorithm for computing channel capacity; parameter estimation methods on the probability simplex; particular multivariate analysis methods, such as principal component analysis in a space of probability models and modal regression; matrix factorization; and the learning of generative models, which have recently attracted attention in deep learning.
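To make the "observables and unobservables" setting concrete, the following is a minimal sketch of the classical EM algorithm (not the information-geometric em variant discussed in the paper) for a two-component one-dimensional Gaussian mixture with unit variances, where the component label of each point is the unobserved variable. All function and variable names here are illustrative, not taken from the paper.

```python
import math
import random

def em_gmm_1d(data, iters=50):
    """EM for a two-component 1-D Gaussian mixture with unit variances.

    The hidden variable is the component label of each observation.
    E-step: compute responsibilities (posterior of the hidden label).
    M-step: re-estimate the mixing weight and the two component means.
    """
    pi, mu1, mu2 = 0.5, min(data), max(data)  # crude initialization
    for _ in range(iters):
        # E-step: responsibility of component 1 for each data point
        resp = []
        for x in data:
            p1 = pi * math.exp(-0.5 * (x - mu1) ** 2)
            p2 = (1.0 - pi) * math.exp(-0.5 * (x - mu2) ** 2)
            resp.append(p1 / (p1 + p2))
        # M-step: maximize the expected complete-data log-likelihood
        n1 = sum(resp)
        pi = n1 / len(data)
        mu1 = sum(r * x for r, x in zip(resp, data)) / n1
        mu2 = sum((1.0 - r) * x for r, x in zip(resp, data)) / (len(data) - n1)
    return pi, mu1, mu2

# Synthetic data: two well-separated clusters around -2 and 3
random.seed(0)
data = [random.gauss(-2, 1) for _ in range(200)] + \
       [random.gauss(3, 1) for _ in range(200)]
pi, mu1, mu2 = em_gmm_1d(data)
```

Each iteration alternates between inferring the unobserved labels given the current parameters and re-fitting the parameters given those inferred labels; the paper's em algorithm recasts exactly this alternation as alternating e- and m-projections between two statistical manifolds.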

| Original language | English |
| --- | --- |
| Pages (from-to) | 39-77 |
| Number of pages | 39 |
| Journal | Information Geometry |
| Volume | 7 |
| DOIs | |
| Publication status | Published - 2023 Dec |

## Keywords

- Bregman divergence
- EM algorithm
- Generative models
- Information geometry
- Information theory
- Robust statistics
- em algorithm

## ASJC Scopus subject areas

- Statistics and Probability
- Geometry and Topology
- Computer Science Applications
- Computational Theory and Mathematics
- Applied Mathematics