The most robust loss function for boosting

Takafumi Kanamori*, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

*Corresponding author for this work

Research output: Chapter

13 Citations (Scopus)

Abstract

A boosting algorithm can be understood as a gradient descent algorithm on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function that makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm is useful for highly noisy data in comparison with other competitors.
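The abstract describes the idea only at a high level; the sketch below is an illustrative reconstruction, not the paper's exact method. It assumes an AdaBoost-style weight update derived from an exponential loss whose sample weights are truncated at a hypothetical threshold `c`, so that extreme outliers (very large negative margins) contribute zero gradient. The names `robust_boost` and `c` are assumptions for illustration.

```python
# Minimal sketch of truncation-based robust boosting (illustrative assumptions,
# not the paper's exact loss): AdaBoost-style functional gradient descent where
# the exponential-loss weights w_i = exp(-y_i F(x_i)) are clipped to zero once
# they exceed a threshold c, so extreme outliers stop influencing the fit.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def robust_boost(X, y, n_rounds=50, c=5.0):
    """Boost decision stumps with truncated exponential-loss weights.

    y must take values in {-1, +1}. `c` is an assumed truncation level,
    not the paper's parameterization.
    """
    n = len(y)
    F = np.zeros(n)                    # current ensemble scores F(x_i)
    stumps, alphas = [], []
    for _ in range(n_rounds):
        margin = y * F
        w = np.exp(-margin)
        w[w > c] = 0.0                 # truncation: outliers get zero weight
        if w.sum() == 0:
            break
        w /= w.sum()
        stump = DecisionTreeClassifier(max_depth=1).fit(X, y, sample_weight=w)
        h = stump.predict(X)
        # Weighted error and the usual AdaBoost step size
        err = np.clip(w[h != y].sum(), 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        F += alpha * h
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas

def predict(stumps, alphas, X):
    scores = sum(a * s.predict(X) for s, a in zip(stumps, alphas))
    return np.sign(scores)
```

In this sketch the truncation acts like a redescending influence function from robust statistics: once a point's weight would exceed `c`, its influence on subsequent rounds is cut to zero rather than growing exponentially, which is the failure mode of plain AdaBoost on mislabeled data.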

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: Nikhil R. Pal, Srimanta Pal, Nikola Kasabov, Rajani K. Mudi, Swapan K. Parui
Publisher: Springer Verlag
Pages: 496-501
Number of pages: 6
ISBN (Print): 3540239316, 9783540239314
DOI
Publication status: Published - 2004

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3316
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (General)
