Abstract
The learning efficiency of a simplified version of adaptive natural gradient descent (ANGD) for soft committee machines was evaluated. Statistical-mechanical techniques were employed, which extract order parameters and make the stochastic learning dynamics deterministic in the limit of large input dimension N [1,2]. ANGD was found to perform as well as natural gradient descent (NGD). The key conditions affecting the learning plateau in ANGD were also identified.
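Since the abstract centers on the ANGD update rule, a minimal sketch may help fix ideas. The Python snippet below is an illustrative assumption, not the paper's implementation: it pairs a soft committee machine student, y(x) = Σᵢ erf(wᵢ·x/√2), with the general adaptive natural gradient scheme of Amari, Park, and Fukumizu, in which the inverse Fisher matrix is estimated online rather than inverted exactly as in plain NGD. All sizes and step parameters (N, K, eta, eps) are hypothetical.

```python
import numpy as np
from scipy.special import erf

rng = np.random.default_rng(0)

N, K = 50, 2          # input dimension, number of hidden units (assumed)
eta, eps = 0.1, 0.01  # learning rate and matrix adaptation rate (assumed;
                      # a decaying eps_t is also common in adaptive NGD)

B = rng.standard_normal((K, N)) / np.sqrt(N)  # teacher weights (target rule)
J = rng.standard_normal((K, N)) / np.sqrt(N)  # student weights

G_inv = np.eye(K * N)  # running estimate of the inverse Fisher matrix

def g(u):                       # hidden-unit activation
    return erf(u / np.sqrt(2))

def g_prime(u):                 # its derivative
    return np.sqrt(2.0 / np.pi) * np.exp(-u**2 / 2)

for t in range(1000):
    x = rng.standard_normal(N)
    u = J @ x                               # student local fields
    y, y_target = g(u).sum(), g(B @ x).sum()

    # Gradient of the student output w.r.t. the flattened weights J.
    grad_f = (g_prime(u)[:, None] * x[None, :]).ravel()

    # Online update of the inverse Fisher estimate:
    # G_inv <- (1 + eps) G_inv - eps (G_inv grad_f)(G_inv grad_f)^T
    v = G_inv @ grad_f
    G_inv = (1 + eps) * G_inv - eps * np.outer(v, v)

    # Natural-gradient step on the squared error (y - y_target)^2 / 2,
    # with the 1/N scaling standard in the statistical-mechanics setup.
    grad_loss = (y - y_target) * grad_f
    J -= (eta / N) * (G_inv @ grad_loss).reshape(K, N)
```

In this sketch the plateau behavior referred to in the abstract would show up as a long stretch of nearly constant generalization error before the student units specialize to the teacher units.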
Original language | English
---|---
Article number | 056120
Pages (from-to) | 056120-1 to 056120-14
Journal | Physical Review E - Statistical, Nonlinear, and Soft Matter Physics
Volume | 69
Issue number | 5 1
Publication status | Published - 2004 May 1
Externally published | Yes
ASJC Scopus subject areas
- Statistical and Nonlinear Physics
- Statistics and Probability
- Condensed Matter Physics