Abstract
Many fundamental machine learning tasks can be formulated as learning with vector-valued functions, where several scalar-valued functions are learned together. Although generalization analyses exist for specific algorithms under the empirical risk minimization principle, a unifying analysis of vector-valued learning under a regularization framework is still lacking. In this paper, we initiate the generalization analysis of regularized vector-valued learning algorithms by presenting bounds with a mild dependency on the output dimension and a fast rate in the sample size. Our analysis relaxes existing assumptions on the restrictiveness of hypothesis-space constraints, the smoothness of loss functions, and low-noise conditions. To understand the interaction between optimization and learning, we further use our results to derive the first generalization bounds for stochastic gradient descent with vector-valued functions. We apply our general results to multi-class classification and multi-label classification, obtaining the first bounds with a logarithmic dependency on the output dimension for extreme multi-label classification with Frobenius regularization. As a byproduct, we derive a Rademacher complexity bound for loss function classes defined in terms of a general strongly convex function.
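The regularized objective in this setting can be sketched as follows; the notation ($W$, $\ell$, $\lambda$) is illustrative and not necessarily the paper's exact symbols:

```latex
% Regularized empirical risk minimization for a vector-valued linear
% predictor f_W(x) = W x with C outputs and n training examples.
% Frobenius regularization is one choice discussed in the abstract;
% the symbols here are illustrative, not taken from the paper.
\min_{W \in \mathbb{R}^{C \times d}} \;
  \frac{1}{n} \sum_{i=1}^{n} \ell\bigl(W x_i,\, y_i\bigr)
  \;+\; \frac{\lambda}{2}\, \lVert W \rVert_{\mathrm{F}}^{2}
```

Multi-class and multi-label classification arise as special cases through the choice of loss $\ell$; the bounds' dependence on the output dimension $C$ is the quantity of interest for the extreme multi-label regime.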
Original language | English |
---|---|
Title of host publication | AAAI'21 Proceedings of the Thirty-fifth AAAI Conference on Artificial Intelligence |
Publisher | AAAI Press |
Pages | 10338-10346 |
Number of pages | 9 |
ISBN (Print) | 9781577358664 |
Publication status | Published - 18 May 2021 |
Event | 35th AAAI Conference on Artificial Intelligence, Vancouver, Canada. Duration: 2 Feb 2021 → 9 Feb 2021. Conference number: 35. https://aaai.org/Conferences/AAAI-21/ |
Publication series
Name | Proceedings of the AAAI Conference on Artificial Intelligence |
---|---|
Publisher | AAAI Press |
Number | 12 |
Volume | 35 |
ISSN (Print) | 2159-5399 |
ISSN (Electronic) | 2374-3468 |
Conference
Conference | 35th AAAI Conference on Artificial Intelligence |
---|---|
Abbreviated title | AAAI-21 |
Country/Territory | Canada |
City | Vancouver |
Period | 2/02/21 → 9/02/21 |
Internet address |
Keywords
- Learning Theory