Learning kernel logistic regression in the presence of class label noise

Jakramate Bootkrajang, Ata Kabán

Research output: Contribution to journal › Article › peer-review

35 Citations (Scopus)
478 Downloads (Pure)

Abstract

The classical machinery of supervised learning machines relies on a correct set of training labels. Unfortunately, there is no guarantee that all of the labels are correct. Labelling errors are increasingly noticeable in today's classification tasks, as the scale and difficulty of these tasks increase to the point where perfect label assignment becomes nearly impossible. Several algorithms have been proposed to alleviate the problem, of which a robust Kernel Fisher Discriminant is a successful example. However, for classification, discriminative models are of primary interest, and rather curiously, the very few existing label-robust discriminative classifiers are limited to linear problems.

In this paper, we build on the widely used and successful kernelising technique to introduce a label-noise robust Kernel Logistic Regression classifier. The main difficulty that we need to bypass is how to determine the model complexity parameters when no trusted validation set is available. We propose to adapt the Multiple Kernel Learning approach for this new purpose, together with a Bayesian regularisation scheme. Empirical results on 13 benchmark data sets and two real-world applications demonstrate the success of our approach.
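The abstract gives no implementation details, but the core idea of a label-noise robust kernel classifier can be illustrated with a short sketch. The code below is a minimal, hedged illustration rather than the authors' method: it fits a kernel logistic regression to noisy labels by placing a fixed label-flip layer, gamma01 = P(observed 1 | true 0) and gamma11 = P(observed 1 | true 1), on top of the sigmoid posterior, using an RBF kernel and a simple Gaussian (L2-in-RKHS) regulariser in place of the paper's Multiple Kernel Learning and Bayesian regularisation scheme. The function names, kernel width, flip rates and regularisation constant are all illustrative assumptions.

```python
# A minimal sketch (not the paper's implementation) of label-noise robust
# kernel logistic regression. Observed labels are assumed to be flipped
# versions of the true labels with fixed probabilities:
#   gamma01 = P(observed = 1 | true = 0),  gamma11 = P(observed = 1 | true = 1).
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist

def rbf_kernel(A, B, width=1.0):
    """RBF (Gaussian) kernel matrix between the rows of A and the rows of B."""
    return np.exp(-cdist(A, B, "sqeuclidean") / (2.0 * width ** 2))

def fit_robust_klr(X, y_noisy, gamma01=0.2, gamma11=0.8, lam=1e-2, width=1.0):
    """Fit alpha, b by maximising the likelihood of the *noisy* labels."""
    K = rbf_kernel(X, X, width)
    n = len(y_noisy)

    def objective(theta):
        alpha, b = theta[:n], theta[n]
        s = 1.0 / (1.0 + np.exp(-(K @ alpha + b)))     # P(true label = 1 | x)
        p = gamma01 * (1.0 - s) + gamma11 * s          # P(observed label = 1 | x)
        p = np.clip(p, 1e-12, 1.0 - 1e-12)
        nll = -np.sum(y_noisy * np.log(p) + (1 - y_noisy) * np.log(1 - p))
        nll += 0.5 * lam * alpha @ K @ alpha           # Gaussian prior in the RKHS
        # Gradient: chain rule through the noise layer and the sigmoid.
        dp = -(y_noisy / p - (1 - y_noisy) / (1 - p))
        df = dp * (gamma11 - gamma01) * s * (1.0 - s)
        grad_alpha = K @ df + lam * (K @ alpha)
        grad_b = df.sum()
        return nll, np.append(grad_alpha, grad_b)

    theta0 = np.zeros(n + 1)
    res = minimize(objective, theta0, jac=True, method="L-BFGS-B")
    return res.x[:n], res.x[n]

def predict_proba(X_train, alpha, b, X_new, width=1.0):
    """Posterior of the *true* label, i.e. with the noise layer removed."""
    K_new = rbf_kernel(X_new, X_train, width)
    return 1.0 / (1.0 + np.exp(-(K_new @ alpha + b)))
```

At prediction time the noise layer is dropped, so predict_proba estimates the clean-label posterior. In the paper, by contrast, the flip probabilities are learned from the data, the kernel is a learned combination of candidate kernels, and the regularisation strength is set by Bayesian regularisation rather than fixed by hand.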
Original language: English
Pages (from-to): 3641-3655
Number of pages: 15
Journal: Pattern Recognition
Volume: 47
Issue number: 11
Early online date: 21 May 2014
DOIs
Publication status: Published - 1 Nov 2014

Bibliographical note

In Press, Accepted Manuscript - Accepted 11 May 2014, Available online 21 May 2014

Keywords

  • Classification
  • Label noise
  • Model selection
  • Multiple Kernel Learning
