- Title
Convergences of Regularized Algorithms and Stochastic Gradient Methods with Random Projections.
- Authors
Lin, Junhong; Cevher, Volkan
- Abstract
We study the least-squares regression problem over a Hilbert space, covering nonparametric regression over a reproducing kernel Hilbert space as a special case. We first investigate regularized algorithms adapted to a projection operator onto a closed subspace of the Hilbert space. We prove convergence results with respect to a variety of norms, under a capacity assumption on the hypothesis space and a regularity condition on the target function. As a consequence, we obtain optimal rates for regularized algorithms with randomized sketches, provided that the sketch dimension is proportional to the effective dimension up to a logarithmic factor. As a byproduct, we obtain similar results for Nyström regularized algorithms. Our results provide optimal, distribution-dependent rates without any saturation effect for sketched/Nyström regularized algorithms, in both the attainable and non-attainable cases, in the well-conditioned regimes. We then study stochastic gradient methods with projection onto the subspace, allowing multiple passes over the data and minibatches, and we derive similar optimal statistical convergence results.
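As a concrete illustration of the Nyström approach the abstract mentions (this is a minimal sketch, not the paper's code): restrict the kernel ridge regression solution to the span of a few randomly subsampled landmark points, which acts as the projection onto a finite-dimensional subspace. The Gaussian kernel, landmark count `m`, and regularization `lam` below are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(A, B, gamma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of A and B.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def nystrom_krr(X, y, m=20, lam=1e-3, gamma=1.0, seed=0):
    """Plain-Nystrom regularized least squares: solve kernel ridge
    regression restricted to the span of m random landmark points."""
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    idx = rng.choice(n, size=m, replace=False)
    Xm = X[idx]                          # landmarks (uniform subsampling)
    Knm = gaussian_kernel(X, Xm, gamma)  # n x m cross-kernel
    Kmm = gaussian_kernel(Xm, Xm, gamma) # m x m landmark kernel
    # Normal equations of the projected problem:
    # (Knm^T Knm + n*lam*Kmm) alpha = Knm^T y
    alpha = np.linalg.solve(Knm.T @ Knm + n * lam * Kmm, Knm.T @ y)
    return lambda Xnew: gaussian_kernel(Xnew, Xm, gamma) @ alpha

# Toy usage: fit a noisy sine with 30 landmarks instead of all 200 points.
rng = np.random.default_rng(1)
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)
f = nystrom_krr(X, y, m=30, lam=1e-4, gamma=1.0)
err = np.mean((f(X) - np.sin(X[:, 0])) ** 2)
```

The cost of the solve drops from O(n^3) to O(n m^2); the paper's point is that, under its assumptions, taking the subspace dimension proportional to the effective dimension (up to a log factor) preserves the optimal statistical rate.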
- Subjects
RANDOM projection method; HILBERT space; SUBSPACES (Mathematics); ALGORITHMS
- Publication
Journal of Machine Learning Research, 2020, Vol 21, Issue 1-25, p1
- ISSN
1532-4435
- Publication type
Article