TY - JOUR
T1 - Candidate-Label Learning
T2 - A Generalization of Ordinary-Label Learning and Complementary-Label Learning
AU - Katsura, Yasuhiro
AU - Uchida, Masato
N1 - Funding Information:
This work was supported in part by the Japan Society for the Promotion of Science through Grants-in-Aid for Scientific Research (C) (20K11800).
Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Nature Singapore Pte Ltd.
PY - 2021/7
Y1 - 2021/7
AB - A supervised learning framework has been proposed for situations in which each training instance is provided with a complementary label, which specifies a class to which the pattern does not belong. In the existing literature, complementary-label learning has been studied independently of ordinary-label learning, which assumes that each training instance is provided with a label specifying the class to which the pattern belongs. However, providing a complementary label is equivalent to providing all the remaining labels as candidates for the one true class. In this paper, we exploit the fact that the loss functions for one-versus-all and pairwise classification, which correspond to ordinary-label learning and complementary-label learning, satisfy additivity and duality, and we provide a framework that directly bridges these existing supervised learning frameworks. We also show that the complementary labels generated from the probabilistic model assumed in the existing literature are equivalent to ordinary labels generated from a mixture of the ground-truth probabilistic model and a uniform distribution. From this finding, the relationship between our work and the existing work follows naturally. Furthermore, we derive the classification risk and an error bound for any loss function that satisfies additivity and duality.
KW - Complementary-label learning
KW - Statistical inference
KW - Statistical learning theory
KW - Supervised classification
UR - http://www.scopus.com/inward/record.url?scp=85131830343&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85131830343&partnerID=8YFLogxK
U2 - 10.1007/s42979-021-00681-x
DO - 10.1007/s42979-021-00681-x
M3 - Article
AN - SCOPUS:85131830343
SN - 2662-995X
VL - 2
JO - SN Computer Science
JF - SN Computer Science
IS - 4
M1 - 288
ER -