A deep neural network (DNN) is called a deep rectified network (DRN) if it uses Rectified Linear Units (ReLUs) as its activation function. In this paper, we show that the parameters of a DRN play two roles simultaneously: they determine the subnetwork corresponding to each input, and they serve as the parameters of those subnetworks. This observation leads us to propose a method for combining a DNN and an SVM into a deep classifier. For a DRN trained by a common tuning algorithm, we design a multilayer gated bilinear classifier that mimics its functionality. Its parameter set is duplicated into two independent sets that play different roles. One set generates the gate signals that determine the subnetwork corresponding to each input, and is kept fixed while the classifier is optimized. The other set serves as the parameters of the subnetworks, which are linear classifiers, so these parameters can be implicitly optimized by SVM optimization. Since the DRN is used only to generate gate signals, our experiments show that it can be trained by supervised or unsupervised learning, and even by transfer learning.
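The two-role view of the parameters can be illustrated with a minimal NumPy sketch (an assumption-laden toy, not the paper's implementation). For a one-hidden-layer ReLU network, one copy of the weights (here called `W_gate`, a hypothetical name) only decides which units are active, i.e. produces the gate signals, while a duplicated copy (`W_val`, together with the output weights `v`) acts as the linear classifier of the selected subnetwork. With the gates held fixed, the score is linear in the second copy, which is what permits SVM-style optimization:

```python
import numpy as np

rng = np.random.default_rng(0)
d, h = 4, 8                       # input dimension, hidden units
W_gate = rng.normal(size=(h, d))  # fixed copy: generates gate signals
W_val = W_gate.copy()             # duplicated copy: subnetwork parameters
v = rng.normal(size=h)            # output weights

def gated_score(x):
    # Gate signals come from the fixed copy; the score is then
    # linear in (W_val, v) once the gates g are determined.
    g = (W_gate @ x > 0).astype(float)
    return v @ (g * (W_val @ x))

def relu_score(x):
    # The original ReLU network the gated classifier mimics.
    return v @ np.maximum(W_gate @ x, 0.0)

x = rng.normal(size=d)
# While the two copies remain equal, the gated bilinear classifier
# reproduces the ReLU network exactly.
assert np.isclose(gated_score(x), relu_score(x))
```

Optimizing `W_val` and `v` with the gates frozen then amounts to training a (per-subnetwork) linear classifier, which is the step the paper hands to an SVM solver.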
|Title of host publication: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Proceedings
|Publisher: Institute of Electrical and Electronics Engineers Inc.
|Published: 2017 Jun 30
|Conference: 2017 International Joint Conference on Neural Networks, IJCNN 2017 - Anchorage, United States
|Duration: 2017 May 14 → 2017 May 19