TY - GEN
T1 - Multiplication units in feedforward neural networks and its training
AU - Li, Dazi
AU - Hirasawa, K.
AU - Hu, Jinglu
AU - Murata, J.
N1 - Publisher Copyright:
© 2002 Nanyang Technological University.
PY - 2002
Y1 - 2002
AB - This paper proposes the application of neural networks with multiplication units to the parity-N problem, the mirror symmetry problem, and a function approximation problem. It is well known that higher-order terms in neural networks, such as sigma-pi units, can considerably improve the computational power of neural networks, but how real neurons achieve this is still unclear. We use a single multiplication unit to construct the full higher-order terms over all the inputs, an approach that has proved very efficient for the parity-N problem. Our earlier work on applying multiplication units to other problems suffered from the drawback of gradient-based algorithms, such as backpropagation, which easily become stuck at local minima due to the complexity of the network. To overcome this problem, we consider a novel random search, RasID, for training neural networks with multiplication units; under a pure random search scheme, it performs an intensified search where good solutions are easy to find locally and a diversified search to escape from local minima. The method shows its advantage in the training of neural networks with multiplication units.
UR - http://www.scopus.com/inward/record.url?scp=84965025652&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=84965025652&partnerID=8YFLogxK
DO - 10.1109/ICONIP.2002.1202134
M3 - Conference contribution
AN - SCOPUS:84965025652
T3 - ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing: Computational Intelligence for the E-Age
SP - 75
EP - 79
BT - ICONIP 2002 - Proceedings of the 9th International Conference on Neural Information Processing
A2 - Rajapakse, Jagath C.
A2 - Yao, Xin
A2 - Wang, Lipo
A2 - Fukushima, Kunihiko
A2 - Lee, Soo-Young
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 9th International Conference on Neural Information Processing, ICONIP 2002
Y2 - 18 November 2002 through 22 November 2002
ER -