Improving generalization ability of universal learning networks with superfluous parameters

Min Han*, Kotaro Hirasawa, Jinglu Hu, Junichi Murata, Chun zhi Jin

*Corresponding author for this work

Research output: Conference article, peer-reviewed

1 citation (Scopus)

Abstract

The parameters in large-scale neural networks can be divided into two classes: one class is necessary for a certain purpose, while the other is not directly needed. The parameters in the latter class are defined as superfluous parameters. How to use these superfluous parameters effectively is an interesting subject. In this paper, it is studied how the generalization ability for dynamic systems can be improved by making use of a network's superfluous parameters, and a calculation technique is proposed that uses second-order derivatives of the criterion function with respect to the superfluous parameters. To investigate the effectiveness of the proposed method, simulations of modeling a nonlinear robot dynamic system are carried out. The simulation results show that the proposed method is useful for improving the generalization ability of neural networks that model nonlinear dynamic systems.
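The abstract does not spell out the computation, but the idea of exploiting second-order derivatives of the criterion with respect to a superfluous parameter subset can be sketched. The code below is a minimal, assumption-laden toy, not the paper's Universal Learning Network formulation: the feedforward net, the choice of the hidden bias `b1` as the superfluous subset, and the Hessian-trace penalty with weight `lam` are all hypothetical stand-ins for the method the abstract describes.

```python
import jax
import jax.numpy as jnp

# Toy network; `b1` plays the role of a hypothetical "superfluous"
# parameter subset (an assumption for illustration only).
def predict(params, x):
    h = jnp.tanh(params["w1"] @ x + params["b1"])
    return params["w2"] @ h + params["b2"]

# Ordinary criterion: mean squared error over a batch.
def criterion(params, xs, ys):
    preds = jax.vmap(lambda x: predict(params, x))(xs)
    return jnp.mean((preds - ys) ** 2)

def regularized_criterion(params, xs, ys, lam=1e-3):
    # Second-order derivatives of the criterion w.r.t. the superfluous
    # subset only; the trace of this Hessian measures curvature along
    # the superfluous directions.
    def loss_wrt_superfluous(b1):
        return criterion({**params, "b1": b1}, xs, ys)
    hess = jax.hessian(loss_wrt_superfluous)(params["b1"])
    return criterion(params, xs, ys) + lam * jnp.trace(hess)

key = jax.random.PRNGKey(0)
k1, k2, k3 = jax.random.split(key, 3)
params = {
    "w1": jax.random.normal(k1, (8, 3)) * 0.5,
    "b1": jnp.zeros(8),
    "w2": jax.random.normal(k2, (2, 8)) * 0.5,
    "b2": jnp.zeros(2),
}
xs = jax.random.normal(k3, (32, 3))
ys = jnp.sin(xs[:, :2])  # toy targets

# One gradient step on the regularized criterion.
grads = jax.grad(regularized_criterion)(params, xs, ys)
params = jax.tree_util.tree_map(lambda p, g: p - 0.05 * g, params, grads)
print(regularized_criterion(params, xs, ys))
```

Driving the Hessian trace down flattens the criterion along the superfluous directions, making the fitted model less sensitive to those parameters; this is one plausible reading of how second-order information about superfluous parameters could help generalization, under the assumptions stated above.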

Original language: English
Pages (from-to): V-407 - V-412
Journal: Proceedings of the IEEE International Conference on Systems, Man and Cybernetics
Volume: 5
Publication status: Published - 1999
Externally published: Yes
Event: 1999 IEEE International Conference on Systems, Man, and Cybernetics 'Human Communication and Cybernetics' - Tokyo, Japan
Duration: 12 Oct 1999 - 15 Oct 1999

ASJC Scopus subject areas

  • Control and Systems Engineering
  • Hardware and Architecture
