TY - JOUR
T1 - Stacked residual recurrent neural network with word weight for text classification
AU - Cao, Wei
AU - Song, Anping
AU - Hu, Jinglu
N1 - Funding Information:
This research is supported by the Major Research plan of the National Natural Science Foundation of China (Grant No. 91630206)
PY - 2017/8/1
Y1 - 2017/8/1
N2 - Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that each word in a sentence contributes equal importance, which differs from the real world. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency and in order to further improve performance, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN into a deep network with a residual architecture and introduce a word-weight-based network to account for the weight of each word. Our proposed method is able to learn the hierarchical meaning of each word in a sentence and consider each word's weight for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.
AB - Neural networks, and in particular recurrent neural networks (RNNs), have recently been shown to give state-of-the-art performance on some text classification tasks. However, most existing methods assume that each word in a sentence contributes equal importance, which differs from the real world. For example, in sentiment analysis the word "awesome" is much more important than any other word in the sentence "This movie is awesome". Motivated by this deficiency and in order to further improve performance, this paper proposes a Stacked Residual RNN with Word Weight method: we extend the stacked RNN into a deep network with a residual architecture and introduce a word-weight-based network to account for the weight of each word. Our proposed method is able to learn the hierarchical meaning of each word in a sentence and consider each word's weight for the text classification task. Experimental results indicate that our method achieves high performance compared with state-of-the-art approaches.
KW - Long Short-Term Memory
KW - Recurrent Neural Networks
KW - Residual Networks
KW - Text classification
KW - Word weight
UR - http://www.scopus.com/inward/record.url?scp=85028065923&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85028065923&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:85028065923
SN - 1819-656X
VL - 44
SP - 277
EP - 284
JO - IAENG International Journal of Computer Science
JF - IAENG International Journal of Computer Science
IS - 3
ER -