Effective training methods for function localization neural networks

Takafumi Sasakawa*, Jinglu Hu, Katsunori Isono, Kotaro Hirasawa

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

1 Citation (Scopus)

Abstract

Inspired by Hebb's cell assembly theory of how the brain works, we have developed a function localization neural network (FLNN). The main part of an FLNN is structurally the same as an ordinary feedforward neural network, but it is regarded as consisting of several overlapping modules that are switched according to input patterns. An FLNN constructed in this way has been shown to have better representation ability than an ordinary neural network. However, the backpropagation (BP) training algorithm for such an FLNN is prone to getting stuck in local minima. In this paper, we mainly discuss methods for improving BP training of the FLNN by exploiting the structural properties of the network. Two methods are proposed, and numerical simulations are used to show the effectiveness of the improved BP training methods.
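The abstract does not spell out the module-selection rule or the improved BP methods, so the following Python sketch only illustrates the general idea it describes: a feedforward network whose hidden units are grouped into overlapping modules, with an input-dependent switch deciding which module is active, and BP updates flowing only through the active module. All names, sizes, and the switching rule are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a function-localization-style feedforward net (hypothetical
# details; the paper's exact module-selection rule and improved BP training
# methods are not given in this abstract). Hidden units form two overlapping
# "modules", and a simple input-dependent switch activates one per pattern.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_out = 4, 8, 1
W1 = rng.normal(0.0, 0.5, (n_hidden, n_in))
b1 = np.zeros(n_hidden)
W2 = rng.normal(0.0, 0.5, (n_out, n_hidden))
b2 = np.zeros(n_out)

# Two overlapping modules over the 8 hidden units (units 2-5 are shared).
modules = [np.arange(0, 6), np.arange(2, 8)]

def select_module(x):
    # Hypothetical switch: pick a module from a coarse feature of the input.
    return 0 if x[0] >= 0.5 else 1

def forward(x):
    mask = np.zeros(n_hidden)
    mask[modules[select_module(x)]] = 1.0
    h = np.tanh(W1 @ x + b1) * mask            # only the active module fires
    y = W2 @ h + b2
    return y, h, mask

def bp_step(x, t, lr=0.1):
    # One plain backpropagation step; gradients reach only the active module.
    global W1, b1, W2, b2
    y, h, mask = forward(x)
    e = y - t                                   # squared-error gradient
    dW2, db2 = np.outer(e, h), e
    dh = (W2.T @ e) * (1.0 - h ** 2) * mask     # inactive units get no update
    dW1, db1 = np.outer(dh, x), dh
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1
    return float(0.5 * e @ e)

# Toy usage: fit a target that behaves differently in the two input regimes.
X = rng.random((200, n_in))
T = np.where(X[:, 0] >= 0.5, X[:, 1:3].sum(axis=1), X[:, 2] - X[:, 3])[:, None]
for epoch in range(50):
    loss = sum(bp_step(x, t) for x, t in zip(X, T)) / len(X)
print("mean squared error:", loss)
```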

Original language: English
Title of host publication: International Joint Conference on Neural Networks 2006, IJCNN '06
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 4785-4790
Number of pages: 6
ISBN (Print): 0780394909, 9780780394902
DOIs
Publication status: Published - 2006 Jan 1
Event: International Joint Conference on Neural Networks 2006, IJCNN '06 - Vancouver, BC, Canada
Duration: 2006 Jul 16 - 2006 Jul 21

Publication series

Name: IEEE International Conference on Neural Networks - Conference Proceedings
ISSN (Print): 1098-7576

Conference

Conference: International Joint Conference on Neural Networks 2006, IJCNN '06
Country/Territory: Canada
City: Vancouver, BC
Period: 06/7/16 - 06/7/21

ASJC Scopus subject areas

  • Software

