VaB-AL: Incorporating Class Imbalance and Difficulty with Variational Bayes for Active Learning
Research output: Chapter in Book/Report/Conference proceeding › Conference contribution
Colleges, Schools and Institutes
- Seoul National University
- University of British Columbia
- Samsung SDS
Active Learning for discriminative models has largely been studied with a focus on individual samples, with less emphasis on how classes are distributed or which classes are hard to deal with. In this work, we show that this is harmful. We propose a method based on Bayes' rule that can naturally incorporate class imbalance into the Active Learning framework. We derive that three terms should be considered together when estimating the probability of a classifier making a mistake for a given sample: i) the probability of mislabelling a class, ii) the likelihood of the data given a predicted class, and iii) the prior probability on the abundance of a predicted class. Implementing these terms requires a generative model and an intractable likelihood estimation. Therefore, we train a Variational Auto-Encoder (VAE) for this purpose. To further tie the VAE to the classifier and facilitate VAE training, we use the classifier's deep feature representations as input to the VAE. By considering all three probabilities, in particular the data imbalance, we can substantially improve the potential of existing methods under a limited data budget. We show that our method can be applied to classification tasks on multiple different datasets – including one that is a real-world dataset with heavy data imbalance – significantly outperforming the state of the art.
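The three terms listed in the abstract can be sketched as a Bayes'-rule decomposition of the misclassification probability. The notation below is a hypothetical reconstruction for illustration, not the paper's exact derivation: let $x$ be a sample, $y$ its true label, and $\hat{y}$ the classifier's prediction.

```latex
% Hedged sketch: marginalise the mistake probability over classes,
% then expand the posterior p(y=c | x) with Bayes' rule.
\begin{align}
p(\hat{y} \neq y \mid x)
  &= \sum_{c} p(\hat{y} \neq y \mid y = c,\, x)\; p(y = c \mid x) \\
  &\approx \sum_{c}
     \underbrace{p(\hat{y} \neq y \mid y = c)}_{\text{(i) mislabelling prob.}}
     \;\cdot\;
     \frac{\overbrace{p(x \mid y = c)}^{\text{(ii) likelihood}}
     \;\cdot\;
     \overbrace{p(y = c)}^{\text{(iii) class prior}}}{p(x)}
\end{align}
```

Under this reading, term (ii) is the quantity that is intractable to compute directly and is approximated with the VAE trained on the classifier's deep features, while term (iii) is where class imbalance enters the scoring of candidate samples.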
Not yet published as of 08/06/2021.
Title of host publication: 2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
Publication status: Accepted/In press - 1 Mar 2021
Name: Proceedings. IEEE Computer Society Conference on Computer Vision and Pattern Recognition.