## Abstract

Modern applications of machine learning typically require the tuning of a multitude of hyperparameters. With this motivation in mind, we consider the problem of optimization given a set of noisy function evaluations. We focus on robust optimization, in which the goal is to find a point in the input space such that the function value remains high when the point is perturbed by an adversary within a given radius. We identify the minimax optimal rate for this problem, which turns out to be of order $n^{-\lambda/(2\lambda+1)}$, where $n$ is the sample size and $\lambda$ quantifies the smoothness of the function, for a broad class of problems, including situations where the metric space is unbounded. The optimal rate is achieved (up to logarithmic factors) by a conceptually simple algorithm based on k-nearest neighbor regression.
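The abstract's approach can be illustrated with a minimal sketch: use k-nearest neighbor regression to estimate the function from noisy evaluations, then select the point whose estimated value stays highest under any perturbation within the given radius. All function names and parameters below are illustrative assumptions, not the paper's actual algorithm or notation.

```python
import numpy as np

def knn_robust_maximizer(X, y, radius, k=5):
    """Illustrative sketch (not the paper's exact method): pick the
    sample point whose k-NN-estimated function value remains highest
    when perturbed anywhere within `radius`."""
    n = len(X)
    # pairwise distances between all sample points
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    # k-NN regression estimate of the function at each sample point
    nn = np.argsort(dists, axis=1)[:, :k]
    f_hat = y[nn].mean(axis=1)
    # robust value: worst case of the estimate over the radius-ball
    robust = np.array([f_hat[dists[i] <= radius].min() for i in range(n)])
    # return the point maximizing the worst-case (adversarially perturbed) value
    return X[np.argmax(robust)]
```

For a smooth concave function the robust maximizer coincides with the ordinary maximizer; the two differ when the function has narrow spikes that an adversary within the radius can exploit.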

| Original language | English |
| --- | --- |
| Pages (from-to) | 819-836 |
| Number of pages | 18 |
| Journal | Analysis and Applications |
| Volume | 17 |
| Issue number | 5 |
| DOIs | |
| Publication status | Published - 1 Sept 2019 |

### Bibliographical note

In Special Issue on Mathematics of Big Data: Deep Learning, Approximation and Optimization.

## Keywords

- Optimisation for machine learning
- metric spaces
- non-parametric methods
- Optimization for machine learning

## ASJC Scopus subject areas

- Analysis
- Applied Mathematics