Testing and learning on distributions with symmetric noise invariance

Ho Chung Leon Law, Christopher Yau, Dino Sejdinovic

Research output: Contribution to journal › Conference article › peer-review

2 Citations (Scopus)

Abstract

Kernel embeddings of distributions and the Maximum Mean Discrepancy (MMD), the resulting distance between distributions, are useful tools for fully nonparametric two-sample testing and learning on distributions. However, it is rare that all possible differences between samples are of interest: discovered differences can be due to different types of measurement noise, data collection artefacts or other irrelevant sources of variability. We propose distances between distributions which encode invariance to additive symmetric noise, aimed at testing whether the assumed true underlying processes differ. Moreover, we construct invariant features of distributions, leading to learning algorithms robust to the impairment of the input distributions with additive symmetric noise.
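The two ingredients the abstract describes, MMD-based comparison and features invariant to additive symmetric noise, can be made concrete with a short sketch. The Python snippet below is illustrative only and not the authors' code: the Gaussian bandwidth, the random test frequencies and the toy data are all assumptions. It computes the standard unbiased MMD^2 estimator, then a well-known invariant construction in the spirit of the abstract: normalising the empirical characteristic function to unit modulus. Additive symmetric noise has a real-valued characteristic function, so wherever that factor is positive it cancels under the normalisation, leaving only the phase of the underlying process.

import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # k(x, y) = exp(-||x - y||^2 / (2 sigma^2)); sigma is an illustrative choice.
    sq = np.sum(X**2, axis=1)[:, None] + np.sum(Y**2, axis=1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-sq / (2.0 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # Standard unbiased estimate of MMD^2: diagonal (same-point) terms excluded.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.mean()

def phase_features(X, freqs):
    # Empirical characteristic function at `freqs`, normalised to unit modulus.
    # Additive symmetric noise multiplies the characteristic function by a real
    # factor, so (where that factor is positive) these features are unchanged.
    ecf = np.exp(1j * X @ freqs.T).mean(axis=0)
    return ecf / np.abs(ecf)

# Toy check: the same exponential signal under light vs. heavy symmetric noise.
rng = np.random.default_rng(0)
X = rng.exponential(size=(2000, 1)) + rng.normal(scale=0.1, size=(2000, 1))
Y = rng.exponential(size=(2000, 1)) + rng.normal(scale=1.0, size=(2000, 1))
freqs = rng.normal(scale=0.5, size=(20, 1))  # random test frequencies (assumed)

print("MMD^2:", mmd2_unbiased(X, Y))
print("phase-feature distance:",
      np.linalg.norm(phase_features(X, freqs) - phase_features(Y, freqs)))

On this toy data the MMD^2 estimate converges to a positive constant because the two samples differ in their noise levels, while the phase-feature distance shrinks toward zero as the sample size grows: exactly the kind of invariance to additive symmetric noise that the abstract targets.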

Original language: English
Pages (from-to): 1344-1354
Number of pages: 11
Journal: Advances in Neural Information Processing Systems
Volume: 2017-December
Publication status: Published - 1 Jan 2017
Event: 31st Annual Conference on Neural Information Processing Systems, NIPS 2017 - Long Beach, United States
Duration: 4 Dec 2017 - 9 Dec 2017

ASJC Scopus subject areas

  • Computer Networks and Communications
  • Information Systems
  • Signal Processing

