Optimization on support vector machines

João Pedro Pedroso*, Noboru Murata

*Corresponding author for this work

Research output: Contribution to conference › Paper › peer-review

2 Citations (Scopus)

Abstract

In this paper we deal with the optimization problem involved in determining the maximal-margin separating hyperplane in support vector machines. We consider three different formulations, based on the L2 norm distance (the standard case), the L1 norm, and the L∞ norm. We consider separation in the original space of the data (i.e., there are no kernel transformations). For each of these cases, we focus on the following problem: having the optimal solution for a given training data set, one is given a new training example. The purpose is to use the information about the solution of the problem without the additional example in order to speed up the new optimization problem. We also consider the case of reoptimization after removing an example from the data set. We report results obtained for some standard benchmark problems.
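The non-Euclidean formulations mentioned in the abstract reduce to linear programs: minimizing ||w||_1 subject to the separation constraints maximizes the margin measured in the dual (L∞) norm. As a minimal sketch of one such formulation (the function name, variable split, and toy data are illustrative assumptions, not taken from the paper), the problem can be posed with `scipy.optimize.linprog`:

```python
import numpy as np
from scipy.optimize import linprog

def l1_norm_svm(X, y):
    """Hard-margin separating hyperplane minimizing ||w||_1
    subject to y_i (w . x_i + b) >= 1 for all i.
    X: (n, d) data matrix; y: (n,) labels in {-1, +1}.
    Returns (w, b). Assumes the data are linearly separable."""
    n, d = X.shape
    # LP variables: [u (d), v (d), b_pos, b_neg], all >= 0,
    # with w = u - v and b = b_pos - b_neg (splits free variables).
    c = np.concatenate([np.ones(2 * d), [0.0, 0.0]])  # objective: sum(u) + sum(v) = ||w||_1
    # Separation constraint y_i (w . x_i + b) >= 1, rewritten as
    # -y_i x_i . u + y_i x_i . v - y_i b_pos + y_i b_neg <= -1.
    Yx = y[:, None] * X
    A_ub = np.hstack([-Yx, Yx, -y[:, None], y[:, None]])
    b_ub = -np.ones(n)
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (2 * d + 2))
    w = res.x[:d] - res.x[d:2 * d]
    b = res.x[2 * d] - res.x[2 * d + 1]
    return w, b

# Toy separable data (assumed for illustration only).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-1.0, -1.0], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = l1_norm_svm(X, y)
print(np.all(y * (X @ w + b) >= 1 - 1e-6))  # every example respects the margin
```

Warm-starting such an LP after adding or removing one example, as the paper studies, amounts to reoptimizing from the previous basis rather than solving from scratch.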

Original language: English
Pages: 399-404
Number of pages: 6
Publication status: Published - 2000
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'2000) - Como, Italy
Duration: 2000 Jul 24 - 2000 Jul 27


ASJC Scopus subject areas

  • Software
  • Artificial Intelligence

