TY - GEN
T1 - Multi-class support vector machine simplification
AU - Nguyen, Ducdung
AU - Matsumoto, Kazunori
AU - Hashimoto, Kazuo
AU - Takishima, Yasuhiro
AU - Takatori, Daichi
AU - Terabe, Masahiro
PY - 2008
Y1 - 2008
AB - In support vector learning, the computational complexity of the testing phase scales linearly with the number of support vectors (SVs) included in the solution, the support vector machine (SVM). Among the different approaches, reduced set methods speed up the testing phase by replacing the original SVM with a simplified one that consists of a smaller number of vectors, called reduced vectors (RVs). In this paper we introduce an extension of the bottom-up method for binary-class SVMs to multi-class SVMs. The extension includes calculations for optimally combining two multi-weighted SVs, a selection heuristic for choosing a good pair of SVs to replace with a newly created vector, and an algorithm for reducing the number of SVs included in an SVM classifier. We show that our method possesses key advantages over others in terms of applicability, efficiency, and stability. Constructing each RV requires only finding the single maximum point of a one-variable function. Experimental results on public datasets show that the simplified SVMs can run up to 100 times faster than the original SVMs with almost no loss in predictive accuracy.
KW - Kernel-based methods
KW - Reduced set method
KW - Support vector machines
UR - http://www.scopus.com/inward/record.url?scp=58349097796&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=58349097796&partnerID=8YFLogxK
DO - 10.1007/978-3-540-89197-0_74
M3 - Conference contribution
AN - SCOPUS:58349097796
SN - 354089196X
SN - 9783540891963
T3 - Lecture Notes in Computer Science
SP - 799
EP - 808
BT - PRICAI 2008: Trends in Artificial Intelligence
T2 - 10th Pacific Rim International Conference on Artificial Intelligence, PRICAI 2008
Y2 - 15 December 2008 through 19 December 2008
ER -