Classification with unknown class-conditional label noise on non-compact feature spaces

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Standard

Classification with unknown class-conditional label noise on non-compact feature spaces. / Reeve, Henry W. J.; Kaban, Ata.

32nd Annual Conference on Learning Theory (COLT 19). Vol. 99. Proceedings of Machine Learning Research, 2019. p. 2624-2651 (Proceedings of Machine Learning Research; Vol. 99).

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Harvard

Reeve, HWJ & Kaban, A 2019, Classification with unknown class-conditional label noise on non-compact feature spaces. in 32nd Annual Conference on Learning Theory (COLT 19), Proceedings of Machine Learning Research, vol. 99, pp. 2624-2651, 32nd Annual Conference on Learning Theory (COLT 19), Phoenix, Arizona, United States, 25/06/19. <http://proceedings.mlr.press/v99/>

APA

Reeve, H. W. J., & Kaban, A. (2019). Classification with unknown class-conditional label noise on non-compact feature spaces. In 32nd Annual Conference on Learning Theory (COLT 19) (Vol. 99, pp. 2624-2651). (Proceedings of Machine Learning Research; Vol. 99). Proceedings of Machine Learning Research. http://proceedings.mlr.press/v99/

Vancouver

Reeve HWJ, Kaban A. Classification with unknown class-conditional label noise on non-compact feature spaces. In 32nd Annual Conference on Learning Theory (COLT 19). Vol. 99. Proceedings of Machine Learning Research. 2019. p. 2624-2651. (Proceedings of Machine Learning Research).

Author

Reeve, Henry W. J. ; Kaban, Ata. / Classification with unknown class-conditional label noise on non-compact feature spaces. 32nd Annual Conference on Learning Theory (COLT 19). Vol. 99. Proceedings of Machine Learning Research, 2019. pp. 2624-2651 (Proceedings of Machine Learning Research).

BibTeX

@inproceedings{bf264f4c15594e27bd145917a26dc8db,
title = "Classification with unknown class-conditional label noise on non-compact feature spaces",
abstract = "We investigate the problem of classification in the presence of unknown class-conditional label noise, in which the labels observed by the learner have been corrupted with some unknown class-dependent probability. In order to obtain finite sample rates, previous approaches to classification with unknown class-conditional label noise have required that the regression function is close to its extrema on sets of large measure. We shall consider this problem in the setting of non-compact metric spaces, where the regression function need not attain its extrema. In this setting we determine the minimax optimal learning rates (up to logarithmic factors). The rate displays interesting threshold behaviour: when the regression function approaches its extrema at a sufficient rate, the optimal learning rates are of the same order as those obtained in the label-noise free setting. If the regression function approaches its extrema more gradually, then classification performance necessarily degrades. In addition, we present an adaptive algorithm which attains these rates without prior knowledge of either the distributional parameters or the local density. This identifies for the first time a scenario in which finite sample rates are achievable in the label noise setting, but they differ from the optimal rates without label noise.",
keywords = "Label noise, minimax rates, non-parametric classification, metric spaces",
author = "Reeve, {Henry W. J.} and Ata Kaban",
year = "2019",
month = aug,
day = "17",
language = "English",
volume = "99",
series = "Proceedings of Machine Learning Research",
publisher = "Proceedings of Machine Learning Research",
pages = "2624--2651",
booktitle = "32nd Annual Conference on Learning Theory (COLT 19)",
note = "32nd Annual Conference on Learning Theory (COLT 19) ; Conference date: 25-06-2019 Through 28-06-2019",

}

RIS

TY - GEN

T1 - Classification with unknown class-conditional label noise on non-compact feature spaces

AU - Reeve, Henry W. J.

AU - Kaban, Ata

PY - 2019/8/17

Y1 - 2019/8/17

N2 - We investigate the problem of classification in the presence of unknown class-conditional label noise, in which the labels observed by the learner have been corrupted with some unknown class-dependent probability. In order to obtain finite sample rates, previous approaches to classification with unknown class-conditional label noise have required that the regression function is close to its extrema on sets of large measure. We shall consider this problem in the setting of non-compact metric spaces, where the regression function need not attain its extrema. In this setting we determine the minimax optimal learning rates (up to logarithmic factors). The rate displays interesting threshold behaviour: when the regression function approaches its extrema at a sufficient rate, the optimal learning rates are of the same order as those obtained in the label-noise free setting. If the regression function approaches its extrema more gradually, then classification performance necessarily degrades. In addition, we present an adaptive algorithm which attains these rates without prior knowledge of either the distributional parameters or the local density. This identifies for the first time a scenario in which finite sample rates are achievable in the label noise setting, but they differ from the optimal rates without label noise.

AB - We investigate the problem of classification in the presence of unknown class-conditional label noise, in which the labels observed by the learner have been corrupted with some unknown class-dependent probability. In order to obtain finite sample rates, previous approaches to classification with unknown class-conditional label noise have required that the regression function is close to its extrema on sets of large measure. We shall consider this problem in the setting of non-compact metric spaces, where the regression function need not attain its extrema. In this setting we determine the minimax optimal learning rates (up to logarithmic factors). The rate displays interesting threshold behaviour: when the regression function approaches its extrema at a sufficient rate, the optimal learning rates are of the same order as those obtained in the label-noise free setting. If the regression function approaches its extrema more gradually, then classification performance necessarily degrades. In addition, we present an adaptive algorithm which attains these rates without prior knowledge of either the distributional parameters or the local density. This identifies for the first time a scenario in which finite sample rates are achievable in the label noise setting, but they differ from the optimal rates without label noise.

KW - Label noise

KW - minimax rates

KW - non-parametric classification

KW - metric spaces

M3 - Conference contribution

VL - 99

T3 - Proceedings of Machine Learning Research

SP - 2624

EP - 2651

BT - 32nd Annual Conference on Learning Theory (COLT 19)

PB - Proceedings of Machine Learning Research

T2 - 32nd Annual Conference on Learning Theory (COLT 19)

Y2 - 25 June 2019 through 28 June 2019

ER -
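
Note on the noise model described in the abstract: the labels are corrupted with an unknown class-dependent probability. A minimal LaTeX sketch of the usual binary class-conditional noise formulation is given below; the flip probabilities pi_0 and pi_1 are illustrative symbols and need not match the paper's own notation.

\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
% Class-conditional label noise (sketch): the clean label Y is flipped to the
% observed label \tilde{Y} with a probability that depends only on the class,
% not on the feature X. Symbols \pi_0, \pi_1 are illustrative assumptions.
\[
  \mathbb{P}\bigl(\tilde{Y} = 1 - y \mid Y = y,\ X = x\bigr) = \pi_{y},
  \qquad y \in \{0, 1\},
\]
% The learner observes pairs (X, \tilde{Y}); the probabilities \pi_0, \pi_1 are
% unknown but satisfy \pi_0 + \pi_1 < 1, so noisy labels remain informative.
\end{document}

Under this formulation the regression function of the noisy labels is an affine transformation of the clean regression function, which is why the abstract's conditions on how the regression function approaches its extrema govern the attainable learning rates.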