Robust randomized optimization with k nearest neighbors

Research output: Contribution to journal › Article

Abstract

Modern applications of machine learning typically require the tuning of a multitude of hyperparameters. With this motivation in mind, we consider the problem of optimization given a set of noisy function evaluations. We focus on robust optimization, in which the goal is to find a point in the input space such that the function remains high when perturbed by an adversary within a given radius. For a broad class of problems, including situations where the metric space is unbounded, we identify the minimax optimal rate, which turns out to be of order n^{-λ/(2λ+1)}, where n is the sample size and λ quantifies the smoothness of the function. The optimal rate is achieved (up to logarithmic factors) by a conceptually simple algorithm based on k-nearest neighbor regression.
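The abstract's algorithmic idea can be illustrated with a minimal sketch (not the authors' implementation): estimate the function from noisy evaluations by k-nearest neighbor regression, then pick the candidate point whose estimate stays highest over the whole adversarial ball. The function names, the finite candidate grid approximating the ball, and the values of k and the radius are all illustrative assumptions.

```python
import numpy as np

def knn_estimate(x, X, y, k):
    """k-nearest neighbor regression estimate of the function at x."""
    dists = np.linalg.norm(X - x, axis=1)
    nearest = np.argsort(dists)[:k]
    return y[nearest].mean()

def robust_knn_maximiser(X, y, candidates, k, radius):
    """Return the candidate whose k-NN estimate remains highest when the
    point is perturbed by an adversary within `radius`.

    X, y       : noisy function evaluations (n x d inputs, n responses)
    candidates : candidate points to choose from (m x d); the adversarial
                 ball is approximated over this finite grid (an assumption
                 of the sketch, not of the paper)
    """
    estimates = np.array([knn_estimate(c, X, y, k) for c in candidates])
    robust_values = []
    for c in candidates:
        # the adversary may move the chosen point anywhere within `radius`,
        # so score c by the worst estimate over that ball
        in_ball = np.linalg.norm(candidates - c, axis=1) <= radius
        robust_values.append(estimates[in_ball].min())
    return candidates[int(np.argmax(robust_values))]

# Toy usage: a 1-d bump observed with noise; the robust maximiser should
# return a point near 0, where the function stays high under perturbation.
rng = np.random.default_rng(0)
X = rng.uniform(-2.0, 2.0, size=(500, 1))
y = np.exp(-X[:, 0] ** 2) + 0.1 * rng.normal(size=500)
grid = np.linspace(-2.0, 2.0, 201).reshape(-1, 1)
print(robust_knn_maximiser(X, y, grid, k=25, radius=0.3))
```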

Bibliographic note

(In Special Issue on Mathematics of Big Data: Deep Learning, Approximation and Optimization)

Details

Original language: English
Pages (from-to): 819-836
Number of pages: 18
Journal: Analysis and Applications
Volume: 17
Issue number: 5
Publication status: Published - 1 Sep 2019

Keywords

  • Optimization for machine learning, metric spaces, non-parametric methods
