Robust loss functions for boosting

Takafumi Kanamori*, Takashi Takenouchi, Shinto Eguchi, Noboru Murata

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

37 Citations (Scopus)

Abstract

Boosting is known to be a gradient descent algorithm over loss functions. It is often pointed out that the typical boosting algorithm, AdaBoost, is highly sensitive to outliers. In this letter, loss functions for robust boosting are studied. Based on concepts from robust statistics, we propose a transformation of loss functions that makes boosting algorithms robust against extreme outliers. Next, truncation of loss functions is applied to contamination models that describe the occurrence of mislabeled examples near decision boundaries. Numerical experiments illustrate that the proposed loss functions derived from the contamination models are useful for handling highly noisy data in comparison with other loss functions.
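To give a rough feel for the truncation idea, the Python sketch below contrasts AdaBoost's exponential loss, which grows without bound as the margin m = y f(x) becomes very negative, with a clipped variant whose contribution from extreme outliers is bounded. This is a minimal illustration only: the clipped-exponential form and the threshold `c` are assumptions for exposition, not the paper's actual transformation or its contamination-model-derived losses.

```python
import numpy as np

# Margins m = y * f(x): positive = correctly classified, large
# negative = badly misclassified (a potential outlier or mislabel).
margins = np.array([1.2, 0.8, 0.3, -0.4, -5.0])  # last entry: an outlier

def exp_loss(m):
    """AdaBoost's exponential loss; unbounded as m -> -infinity."""
    return np.exp(-m)

def truncated_exp_loss(m, c=2.0):
    """Illustrative truncated loss (hypothetical form, not the paper's):
    clip the exponential loss at its value at margin -c, so any point
    with margin below -c contributes the same bounded amount."""
    return np.minimum(np.exp(-m), np.exp(c))

# The single outlier dominates the exponential empirical risk but has
# only a bounded effect on the truncated risk.
print("exponential risk:", exp_loss(margins).mean())            # ~30.3
print("truncated risk:  ", truncated_exp_loss(margins).mean())  # ~2.07
```

Because the truncated loss is flat for margins below -c, its (sub)gradient there is zero, so a gradient-descent-style boosting update effectively ignores extreme outliers instead of concentrating weight on them.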

Original language: English
Pages (from-to): 2183-2244
Number of pages: 62
Journal: Neural Computation
Volume: 19
Issue number: 8
DOIs
Publication status: Published - Aug 2007

ASJC Scopus subject areas

  • Arts and Humanities (miscellaneous)
  • Cognitive Neuroscience
