Evaluating evaluation metrics on the bootstrap

Tetsuya Sakai*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

161 Citations (Scopus)

Abstract

This paper describes how the Bootstrap approach to statistics can be applied to the evaluation of IR effectiveness metrics. First, we argue that Bootstrap Hypothesis Tests deserve more attention from the IR community, as they are based on fewer assumptions than traditional statistical significance tests. We then describe straightforward methods for comparing the sensitivity of IR metrics based on Bootstrap Hypothesis Tests. Unlike the heuristics-based "swap" method proposed by Voorhees and Buckley, our method estimates the performance difference required to achieve a given significance level directly from Bootstrap Hypothesis Test results. In addition, we describe a simple way of examining the accuracy of rank correlation between two metrics based on the Bootstrap Estimate of Standard Error. We demonstrate the usefulness of our methods using test collections and runs from the NTCIR CLIR track for comparing seven IR metrics, including those that can handle graded relevance and those based on the Geometric Mean.
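The abstract's core technique, a bootstrap hypothesis test over per-topic scores, can be illustrated with a minimal sketch. This is an assumption-laden illustration, not Sakai's exact procedure: it resamples paired per-topic score differences of two hypothetical systems, shifts them so the null hypothesis (zero mean difference) holds, and reports an achieved significance level (ASL).

```python
import random

def bootstrap_test(x, y, b=1000, seed=0):
    """Paired bootstrap hypothesis test (sketch).

    x, y: per-topic effectiveness scores of two systems on the same topics.
    Returns an achieved significance level for H0: mean difference is zero.
    """
    rng = random.Random(seed)
    diffs = [xi - yi for xi, yi in zip(x, y)]
    n = len(diffs)
    observed = sum(diffs) / n
    # Shift the differences so the resampled population satisfies H0
    # (mean zero); this is the standard shift method for bootstrap tests.
    shifted = [d - observed for d in diffs]
    extreme = 0
    for _ in range(b):
        sample = [rng.choice(shifted) for _ in range(n)]
        if abs(sum(sample) / n) >= abs(observed):
            extreme += 1
    return extreme / b
```

A small ASL indicates that the observed mean difference between the two systems is unlikely under the null hypothesis; the paper uses such ASLs to compare how readily different IR metrics detect significant system differences.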

Original language: English
Title of host publication: Proceedings of the Twenty-Ninth Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Pages: 525-532
Number of pages: 8
Publication status: Published - 2006 Oct 31
Externally published: Yes
Event: 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval - Seattle, WA, United States
Duration: 2006 Aug 6 - 2006 Aug 11

Publication series

Name: Proceedings of the Twenty-Ninth Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Volume: 2006

Conference

Conference: 29th Annual International ACM SIGIR Conference on Research and Development in Information Retrieval
Country/Territory: United States
City: Seattle, WA
Period: 06/8/6 - 06/8/11

Keywords

  • Bootstrap
  • Evaluation
  • Graded relevance
  • Test collection

ASJC Scopus subject areas

  • Engineering(all)
  • Information Systems
  • Software
  • Applied Mathematics
