TY - JOUR
T1 - Recurrent neural networks with multi-branch structure
AU - Yamashita, Takashi
AU - Mabu, Shingo
AU - Hirasawa, Kotaro
AU - Furuzuki, Takayuki
PY - 2008
Y1 - 2008
N2 - Universal Learning Networks (ULNs) provide a generalized framework for many kinds of neural network structures with supervised learning. Multi-Branch Neural Networks (MBNNs), which are built on the ULN framework, have already been shown to improve the representation ability of feedforward neural networks (FNNs). The multi-branch structure of MBNNs can easily be extended to recurrent neural networks (RNNs), because ULNs allow nodes to be connected by multiple branches with arbitrary time delays. In this paper, therefore, RNNs with a multi-branch structure are proposed and shown to have better representation ability than conventional RNNs. RNNs can represent dynamical systems and are useful for time series prediction. The performance of RNNs with a multi-branch structure was evaluated on a time series prediction benchmark. Simulation results showed that RNNs with a multi-branch structure obtained better performance than conventional RNNs, and that they improved the representation ability even with smaller network sizes.
AB - Universal Learning Networks (ULNs) provide a generalized framework for many kinds of neural network structures with supervised learning. Multi-Branch Neural Networks (MBNNs), which are built on the ULN framework, have already been shown to improve the representation ability of feedforward neural networks (FNNs). The multi-branch structure of MBNNs can easily be extended to recurrent neural networks (RNNs), because ULNs allow nodes to be connected by multiple branches with arbitrary time delays. In this paper, therefore, RNNs with a multi-branch structure are proposed and shown to have better representation ability than conventional RNNs. RNNs can represent dynamical systems and are useful for time series prediction. The performance of RNNs with a multi-branch structure was evaluated on a time series prediction benchmark. Simulation results showed that RNNs with a multi-branch structure obtained better performance than conventional RNNs, and that they improved the representation ability even with smaller network sizes.
KW - Multi-branch
KW - Recurrent neural networks
KW - Time series prediction
KW - Universal learning networks
UR - http://www.scopus.com/inward/record.url?scp=62249102875&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=62249102875&partnerID=8YFLogxK
U2 - 10.1002/ecj.10157
DO - 10.1002/ecj.10157
M3 - Article
AN - SCOPUS:62249102875
SN - 8756-663X
VL - 91
SP - 37
EP - 44
JO - Electronics and Communications in Japan, Part II: Electronics (English translation of Denshi Tsushin Gakkai Ronbunshi)
JF - Electronics and Communications in Japan, Part II: Electronics (English translation of Denshi Tsushin Gakkai Ronbunshi)
IS - 9
ER -