A computationally efficient information estimator for weighted data

Hideitsu Hino*, Noboru Murata

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution

3 Citations (Scopus)

Abstract

The Shannon information content is a fundamental quantity, and estimating it from an observed dataset is of great importance in the fields of statistics, information theory, and machine learning. In this study, an estimator of the information content for a given set of weighted data is proposed. The empirical data distribution varies depending on the weights. The notable features of the proposed estimator are its computational efficiency and its ability to deal with weighted data. The proposed estimator is extended to estimate the cross entropy, entropy, and KL divergence from weighted data. The estimators are then applied to classification with one-class samples and to distribution-preserving data compression problems.
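The paper's own construction is quantile-based and computationally efficient (per the keywords); as a rough illustration of the problem setting only, not the authors' method, the sketch below estimates the entropy of 1-D weighted samples with a plug-in estimator: a weighted Gaussian kernel density estimate is evaluated at the sample points, and the entropy is the weighted average of the negative log-density. The bandwidth rule and effective-sample-size correction are illustrative assumptions.

```python
import numpy as np

def weighted_entropy(x, w, bandwidth=None):
    """Plug-in entropy estimate for 1-D weighted samples.

    Illustrative sketch (not the Hino-Murata estimator): the density is
    a weighted Gaussian KDE and the entropy is the weighted average of
    -log density over the sample points.
    """
    x = np.asarray(x, dtype=float)
    w = np.asarray(w, dtype=float)
    w = w / w.sum()                        # normalise the weights
    if bandwidth is None:
        # Silverman-style rule using the weighted standard deviation
        # and the effective sample size (assumed heuristics).
        mu = np.sum(w * x)
        sd = np.sqrt(np.sum(w * (x - mu) ** 2))
        n_eff = 1.0 / np.sum(w ** 2)
        bandwidth = 1.06 * sd * n_eff ** (-0.2)
    # Pairwise Gaussian kernel matrix: K[i, j] = K_h(x_i - x_j).
    diff = (x[:, None] - x[None, :]) / bandwidth
    K = np.exp(-0.5 * diff ** 2) / (bandwidth * np.sqrt(2.0 * np.pi))
    dens = K @ w                           # weighted KDE at each sample
    return -np.sum(w * np.log(dens))

# Sanity check with uniform weights: samples from N(0, 1), whose true
# differential entropy is 0.5 * log(2 * pi * e) ~ 1.4189.
rng = np.random.default_rng(0)
x = rng.normal(size=2000)
w = np.ones_like(x)
h = weighted_entropy(x, w)
```

With non-uniform weights the same code estimates the entropy of the reweighted empirical distribution, which is the setting the abstract describes.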

Original language: English
Title of host publication: Artificial Neural Networks and Machine Learning, ICANN 2011 - 21st International Conference on Artificial Neural Networks, Proceedings
Pages: 301-308
Number of pages: 8
Edition: PART 2
DOIs
Publication status: Published - 2011
Event: 21st International Conference on Artificial Neural Networks, ICANN 2011 - Espoo, Finland
Duration: 2011 Jun 14 - 2011 Jun 17

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Number: PART 2
Volume: 6792 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 21st International Conference on Artificial Neural Networks, ICANN 2011
Country/Territory: Finland
City: Espoo
Period: 11/6/14 - 11/6/17

Keywords

  • Information
  • entropy
  • non-parametric
  • quantile

ASJC Scopus subject areas

  • Theoretical Computer Science
  • Computer Science(all)
