Abstract
Likelihood optimization methods for learning algorithms are generalized, and faster algorithms are provided. The idea is to transfer the optimization to a general class of convex divergences between two probability density functions. The first part explains why such optimization transfer is significant. The second part contains a derivation of the generalized ICA (Independent Component Analysis); experiments on brain fMRI maps are reported. The third part discusses this optimization transfer in the generalized EM (Expectation-Maximization) algorithm. Hierarchical descendants of this algorithm, such as vector quantization and self-organization, are also explained.
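The abstract does not spell out the divergence family; a plausible formulation consistent with "a general class of convex divergences between two probability density functions" is the Csiszár f-divergence, sketched below. Maximum-likelihood estimation corresponds to one member of this family: with $f(u) = u \log u$, and $p$ the data density and $q$ the model density, $D_f$ reduces to the Kullback-Leibler divergence.

$$
D_f(p \,\|\, q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,
\qquad f \text{ convex},\ f(1) = 0 .
$$

For example, $f(u) = u \log u$ gives $D_f(p \,\|\, q) = \int p(x) \log \frac{p(x)}{q(x)}\, dx = D_{\mathrm{KL}}(p \,\|\, q)$, so optimizing over the whole convex class generalizes likelihood maximization.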
| Original language | English |
| --- | --- |
| Title of host publication | Proceedings of the International Joint Conference on Neural Networks |
| Pages | 1883-1888 |
| Number of pages | 6 |
| Volume | 2 |
| Publication status | Published - 2002 |
| Event | 2002 International Joint Conference on Neural Networks (IJCNN '02) - Honolulu, HI |
| Duration | May 12, 2002 → May 17, 2002 |
Other

| Other | 2002 International Joint Conference on Neural Networks (IJCNN '02) |
| --- | --- |
| City | Honolulu, HI |
| Period | 02/5/12 → 02/5/17 |
ASJC Scopus subject areas

- Software