Annals of Statistics, volume 44, issue 4
Nonparametric stochastic approximation with large step-sizes
Aymeric Dieuleveut ¹, Francis Bach ¹
¹ Laboratoire d'informatique de l'école normale supérieure
Publication type: Journal Article
Publication date: 2016-07-07
Journal: Annals of Statistics
scimago Q1
SJR: 5.335
CiteScore: 9.3
Impact factor: 3.2
ISSN: 0090-5364, 2168-8966
Statistics and Probability
Statistics, Probability and Uncertainty
Abstract
We consider the random-design least-squares regression problem within the reproducing kernel Hilbert space (RKHS) framework. Given a stream of independent and identically distributed input/output data, we aim to learn a regression function within an RKHS $\mathcal{H}$, even if the optimal predictor (i.e., the conditional expectation) is not in $\mathcal{H}$. In a stochastic approximation framework where the estimator is updated after each observation, we show that the averaged unregularized least-mean-square algorithm (a form of stochastic gradient), given a sufficiently large step-size, attains optimal rates of convergence for a variety of regimes for the smoothnesses of the optimal prediction function and the functions in $\mathcal{H}$.
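For illustration only, here is a minimal sketch of an averaged, unregularized kernel least-mean-square recursion with a constant step-size, in the spirit of the algorithm the abstract describes. The function names, the Gaussian kernel, and the step-size and bandwidth values are assumptions made for the example, not the paper's setup or results.

```python
import numpy as np


def gaussian_kernel(x, y, bandwidth=1.0):
    # Gaussian RBF kernel; the bandwidth value is an illustrative assumption.
    diff = np.asarray(x, dtype=float) - np.asarray(y, dtype=float)
    return float(np.exp(-np.sum(diff ** 2) / (2.0 * bandwidth ** 2)))


def averaged_kernel_lms(xs, ys, step_size=0.5, kernel=gaussian_kernel):
    """One pass of unregularized kernel LMS with a constant step-size,
    followed by uniform averaging of the iterates.

    Each iterate satisfies
        g_t = g_{t-1} + step_size * (y_t - g_{t-1}(x_t)) * K(x_t, .),
    so it lies in the span of {K(x_i, .)} and is stored via its coefficients.
    Returns the coefficients of the averaged estimator (1/n) * sum_{t=1}^n g_t.
    """
    n = len(xs)
    coeffs = np.zeros(n)  # coefficients a_i of the last iterate g_n
    for t in range(n):
        # evaluate the current iterate g_t at the new input x_{t+1}
        prediction = sum(coeffs[i] * kernel(xs[i], xs[t]) for i in range(t))
        # LMS (stochastic-gradient) update: one new kernel coefficient per observation
        coeffs[t] = step_size * (ys[t] - prediction)
    # g_t contributes the coefficient a_i for every t >= i, hence the triangular weights
    weights = (n - np.arange(n)) / n
    return coeffs * weights


def predict(avg_coeffs, xs, x_new, kernel=gaussian_kernel):
    # Evaluate the averaged estimator at a new input.
    return sum(c * kernel(xi, x_new) for c, xi in zip(avg_coeffs, xs))
```

The iterates are stored through their kernel-expansion coefficients, so averaging reduces to reweighting the coefficient added at each step; the constant (large) step-size and the absence of explicit regularization mirror the setting analyzed in the paper, while everything else here is a simplification.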