The most robust loss function for boosting

Takafumi Kanamori*, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Chapter

13 Citations (Scopus)

Abstract

A boosting algorithm can be understood as a gradient descent algorithm on a loss function. It is often pointed out that the typical boosting algorithm, AdaBoost, is seriously affected by outliers. In this paper, loss functions for robust boosting are studied. Based on a concept from robust statistics, we propose a positive-part truncation of the loss function that makes the boosting algorithm robust against extreme outliers. Numerical experiments show that the proposed boosting algorithm handles highly noisy data well in comparison with other competitors.
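
The truncation idea from the abstract can be sketched in code. The following is a minimal illustrative sketch, not the authors' algorithm: it assumes an AdaBoost-style loop over decision stumps and one plausible truncated loss, min(exp(-y F(x)), exp(tau)), whose derivative (and hence the sample weight) vanishes once the margin falls below -tau. The threshold tau, the function names, and the stump learner are all assumptions made for illustration.

```python
import numpy as np

def stump_predict(X, feature, threshold, polarity):
    # Decision stump: label +1/-1 by thresholding a single feature.
    return polarity * np.where(X[:, feature] <= threshold, 1.0, -1.0)

def fit_stump(X, y, w):
    # Exhaustive search for the stump minimizing weighted 0-1 error.
    best, best_err = None, np.inf
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for pol in (1.0, -1.0):
                err = np.sum(w * (stump_predict(X, f, t, pol) != y))
                if err < best_err:
                    best_err, best = err, (f, t, pol)
    return best, best_err

def robust_boost(X, y, n_rounds=50, tau=3.0):
    """Gradient-descent boosting on a truncated exponential loss (sketch).

    Labels y must be in {-1, +1}. With loss min(exp(-m), exp(tau)) in
    the margin m = y * F(x), points whose margin is already below -tau
    sit on the flat part of the loss, so their gradient weight is zero
    and extreme outliers stop driving the fit.
    """
    F = np.zeros(len(y))               # current ensemble score F(x_i)
    ensemble = []
    for _ in range(n_rounds):
        margin = y * F
        # Sample weight = loss derivative magnitude; zero once truncated,
        # unlike AdaBoost's unbounded exp(-margin).
        w = np.where(margin > -tau, np.exp(-margin), 0.0)
        if w.sum() == 0:
            break
        w /= w.sum()
        (f, t, pol), err = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # AdaBoost-style step size
        F += alpha * stump_predict(X, f, t, pol)
        ensemble.append((alpha, f, t, pol))
    return ensemble
```

The only line that differs from plain AdaBoost is the weight computation: AdaBoost's exp(-margin) grows without bound for badly misclassified points, whereas the truncated loss caps their influence, which is the robustness mechanism the abstract describes.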

Original language: English
Title of host publication: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Editors: Nikhil R. Pal, Srimanta Pal, Nikola Kasabov, Rajani K. Mudi, Swapan K. Parui
Publisher: Springer Verlag
Pages: 496-501
Number of pages: 6
ISBN (Print): 3540239316, 9783540239314
DOIs
Publication status: Published - 2004

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 3316
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science (all)
