Abstract
A deep neural network (DNN) is called a deep rectified network (DRN) if it uses Rectified Linear Units (ReLUs) as its activation function. In this paper, we show that its parameters can be seen as playing two roles simultaneously: one determines the subnetwork corresponding to each input, while the other serves as the parameters of that subnetwork. This observation leads us to propose a method that combines a DNN and an SVM into a deep classifier. For a DRN trained by a common tuning algorithm, a multilayer gated bilinear classifier is designed to mimic its functionality. Its parameter set is duplicated into two independent sets that play different roles. One set is used to generate gate signals that determine the subnetwork corresponding to each input, and it is kept fixed while the classifier is optimized. The other set serves as the parameters of the subnetworks, which are linear classifiers, so those parameters can be implicitly optimized by applying SVM optimization. Since the DRN serves only to generate gate signals, we show in experiments that it can be trained by supervised or unsupervised learning, and even by transfer learning.
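The abstract's central observation can be illustrated with a minimal sketch (not the paper's code; all weight values below are made up for illustration): a one-hidden-layer ReLU network is exactly equivalent to a "gated bilinear" form in which the same parameters both select the active linear subnetwork (via binary gate signals) and serve as that subnetwork's weights.

```python
# Hypothetical small network: 2 inputs, 3 hidden ReLU units, 1 output.
W1 = [[0.5, -1.2], [1.1, 0.3], [-0.7, 0.9]]   # hidden-layer weights (assumed values)
b1 = [0.1, -0.4, 0.2]
W2 = [[1.0, -0.5, 0.8]]                        # output-layer weights (assumed values)
b2 = [0.05]

def relu_net(x):
    """Standard forward pass with ReLU activations."""
    h = [max(sum(w * xi for w, xi in zip(row, x)) + b, 0.0)
         for row, b in zip(W1, b1)]
    return [sum(w * hi for w, hi in zip(row, h)) + b
            for row, b in zip(W2, b2)]

def gated_bilinear(x):
    """Same computation, written as the abstract describes it."""
    pre = [sum(w * xi for w, xi in zip(row, x)) + b for row, b in zip(W1, b1)]
    g = [1.0 if p > 0.0 else 0.0 for p in pre]   # role 1: gate signals select the subnetwork
    h = [gi * pi for gi, pi in zip(g, pre)]      # role 2: the same weights act linearly
    return [sum(w * hi for w, hi in zip(row, h)) + b
            for row, b in zip(W2, b2)]

x = [0.6, -0.3]
assert relu_net(x) == gated_bilinear(x)  # identical outputs for any input
```

Once the gates are frozen, `gated_bilinear` is linear in the second copy of the parameters, which is what allows those parameters to be optimized with SVM machinery.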
Original language | English |
---|---|
Host publication title | 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 140-146 |
Number of pages | 7 |
Volume | 2017-May |
ISBN (Electronic) | 9781509061815 |
DOI | |
Publication status | Published - 30 Jun 2017 |
Event | 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States Duration: 14 May 2017 → 19 May 2017 |
ASJC Scopus subject areas
- Software
- Artificial Intelligence