TY - JOUR
T1 - Improvement of generalization ability for identifying dynamical systems by using universal learning networks
AU - Hirasawa, Kotaro
AU - Kim, Sung Ho
AU - Hu, Jinglu
AU - Murata, Junichi
AU - Han, Min
AU - Jin, Chunzhi
N1 - Copyright 2007 Elsevier B.V., All rights reserved.
PY - 2001
Y1 - 2001
N2 - This paper studies how the generalization ability of models of dynamical systems can be improved by taking advantage of the second order derivatives of the outputs with respect to the external inputs. The proposed method can be regarded as a direct implementation of the well-known regularization technique using the higher order derivatives of Universal Learning Networks (ULNs). ULNs consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function and each pair of nodes can be connected by multiple branches with arbitrary time delays. A generalized learning algorithm is derived for ULNs, in which both the first order derivatives (gradients) and the higher order derivatives are incorporated. First, the method for computing the second order derivatives of ULNs is discussed. Then, a new method for implementing the regularization term is presented. Finally, simulation studies on the identification of a nonlinear dynamical system with noise are carried out to demonstrate the effectiveness of the proposed method. The simulation results show that the proposed method can significantly improve the generalization ability of neural networks, in particular in that (1) a robust network is obtained even when branches of the trained ULNs are destroyed, and (2) the obtained performance does not depend on the initial parameter values.
AB - This paper studies how the generalization ability of models of dynamical systems can be improved by taking advantage of the second order derivatives of the outputs with respect to the external inputs. The proposed method can be regarded as a direct implementation of the well-known regularization technique using the higher order derivatives of Universal Learning Networks (ULNs). ULNs consist of a number of interconnected nodes, where each node may contain any continuously differentiable nonlinear function and each pair of nodes can be connected by multiple branches with arbitrary time delays. A generalized learning algorithm is derived for ULNs, in which both the first order derivatives (gradients) and the higher order derivatives are incorporated. First, the method for computing the second order derivatives of ULNs is discussed. Then, a new method for implementing the regularization term is presented. Finally, simulation studies on the identification of a nonlinear dynamical system with noise are carried out to demonstrate the effectiveness of the proposed method. The simulation results show that the proposed method can significantly improve the generalization ability of neural networks, in particular in that (1) a robust network is obtained even when branches of the trained ULNs are destroyed, and (2) the obtained performance does not depend on the initial parameter values.
KW - Generalization ability
KW - Regularization technique
KW - Robustness
KW - Second order derivatives
KW - Universal learning networks
UR - http://www.scopus.com/inward/record.url?scp=0035200301&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=0035200301&partnerID=8YFLogxK
U2 - 10.1016/S0893-6080(01)00117-4
DO - 10.1016/S0893-6080(01)00117-4
M3 - Article
C2 - 11771719
AN - SCOPUS:0035200301
SN - 0893-6080
VL - 14
SP - 1389
EP - 1404
JO - Neural Networks
JF - Neural Networks
IS - 10
ER -