Support vector machines with different norms: Motivation, formulations and results

João Pedro Pedroso*, Noboru Murata

*Corresponding author for this work

Research output: Article, peer-reviewed

32 citations (Scopus)

Abstract

We introduce two formulations for training support vector machines, based on considering the L1 and L∞ norms instead of the currently used L2 norm, and maximising the margin between the separating hyperplane and each data set using L1 and L∞ distances. We exploit the geometrical properties of these different norms, and discuss what kind of results should be expected for them. Formulations in mathematical programming for linear problems corresponding to the L1 and L∞ norms are also provided, for both the separable and non-separable cases. We report results obtained for some standard benchmark problems, which confirmed that the performance of all the formulations is similar. As expected, the CPU time required for machines solvable with linear programming is much shorter.
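For reference, a standard linear-programming form of an L1-norm soft-margin support vector machine, of the kind the abstract alludes to, can be sketched as follows (the symbols C, ξ, u and v are generic notation for this sketch and not necessarily those used in the paper):

\[
\begin{aligned}
\min_{u,\,v,\,b,\,\xi}\quad & \sum_{j} (u_j + v_j) \;+\; C \sum_{i} \xi_i \\
\text{s.t.}\quad & y_i\big((u - v)^\top x_i + b\big) \;\ge\; 1 - \xi_i, \qquad i = 1,\dots,m,\\
& u \ge 0,\quad v \ge 0,\quad \xi \ge 0,
\end{aligned}
\]

where w = u − v linearizes \(\|w\|_1 = \sum_j |w_j|\). Minimizing \(\|w\|_1\) corresponds to maximizing the margin measured in the L∞ distance, while the companion choice of minimizing \(\|w\|_\infty\) maximizes an L1-distance margin and is likewise expressible as a linear program. The separable case is recovered by dropping the slack variables ξ.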

Original language: English
Pages (from-to): 1263-1272
Number of pages: 10
Journal: Pattern Recognition Letters
Volume: 22
Issue number: 12
DOI
Publication status: Published - 2001

ASJC Scopus subject areas

  • Software
  • Signal Processing
  • Computer Vision and Pattern Recognition
  • Artificial Intelligence
