# Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

## Standard

**Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators.** / Reeve, Henry W. J.; Kaban, Ata. *30th International Conference on Algorithmic Learning Theory (ALT'19).* Proceedings of Machine Learning Research, 2019. p. 682-699 (Proceedings of Machine Learning Research; Vol. 98).


## Harvard

Reeve, HWJ & Kaban, A 2019, Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators. in *30th International Conference on Algorithmic Learning Theory (ALT'19).* Proceedings of Machine Learning Research, vol. 98, Proceedings of Machine Learning Research, pp. 682-699, 30th International Conference on Algorithmic Learning Theory (ALT'19), Chicago, United States, 22/03/19. <http://proceedings.mlr.press/v98/reeve19a.html>

## APA

Reeve, H. W. J., & Kaban, A. (2019). Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators. In *30th International Conference on Algorithmic Learning Theory (ALT'19)* (pp. 682-699). (Proceedings of Machine Learning Research; Vol. 98). Proceedings of Machine Learning Research. http://proceedings.mlr.press/v98/reeve19a.html

## Vancouver

Reeve HWJ, Kaban A. Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators. In 30th International Conference on Algorithmic Learning Theory (ALT'19). Proceedings of Machine Learning Research. 2019. p. 682-699. (Proceedings of Machine Learning Research; Vol. 98). http://proceedings.mlr.press/v98/reeve19a.html

## Author

Reeve, Henry W. J. ; Kaban, Ata. / **Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators**. 30th International Conference on Algorithmic Learning Theory (ALT'19). Proceedings of Machine Learning Research, 2019. pp. 682-699 (Proceedings of Machine Learning Research; Vol. 98).

## Bibtex

@inproceedings{reeve19a,
  title     = {Exploiting geometric structure in mixture proportion estimation with generalised {B}lanchard-{L}ee-{S}cott estimators},
  author    = {Reeve, Henry W. J. and Kaban, Ata},
  booktitle = {30th International Conference on Algorithmic Learning Theory (ALT'19)},
  series    = {Proceedings of Machine Learning Research},
  volume    = {98},
  pages     = {682--699},
  year      = {2019},
  publisher = {Proceedings of Machine Learning Research},
  url       = {http://proceedings.mlr.press/v98/reeve19a.html},
}

## RIS

TY - GEN

T1 - Exploiting geometric structure in mixture proportion estimation with generalised Blanchard-Lee-Scott estimators

AU - Reeve, Henry W. J.

AU - Kaban, Ata

PY - 2019

Y1 - 2019

N2 - Mixture proportion estimation is a building block in many weakly supervised classification tasks (missing labels, label noise, anomaly detection). Estimators with finite sample guarantees help analyse algorithms for such tasks, but so far exist only for Euclidean and Hilbert space data. We generalise the framework of Blanchard, Lee and Scott to allow extensions to other data types, and exemplify its use by deducing novel estimators for metric space data, and for randomly compressed Euclidean data – both of which make use of favourable geometry to tighten guarantees. Finally we demonstrate a theoretical link with the state-of-the-art estimator specialised for Hilbert space data.

AB - Mixture proportion estimation is a building block in many weakly supervised classification tasks (missing labels, label noise, anomaly detection). Estimators with finite sample guarantees help analyse algorithms for such tasks, but so far exist only for Euclidean and Hilbert space data. We generalise the framework of Blanchard, Lee and Scott to allow extensions to other data types, and exemplify its use by deducing novel estimators for metric space data, and for randomly compressed Euclidean data – both of which make use of favourable geometry to tighten guarantees. Finally we demonstrate a theoretical link with the state-of-the-art estimator specialised for Hilbert space data.

KW - Mixture proportion estimation

KW - metric spaces

KW - covering dimension

KW - random projections

KW - Gaussian width

M3 - Conference contribution

T3 - Proceedings of Machine Learning Research

SP - 682

EP - 699

BT - 30th International Conference on Algorithmic Learning Theory (ALT'19)

PB - Proceedings of Machine Learning Research

T2 - 30th International Conference on Algorithmic Learning Theory (ALT'19)

Y2 - 22 March 2019 through 24 March 2019

ER -