Journal of Complexity, volume 82, pages 101825
Online Regularized Learning Algorithm for Functional Data
Ming Yuan, Yuan Mao, Zheng-Chu Guo
Publication type: Journal Article
Publication date: 2024-06-01
Journal: Journal of Complexity
Scimago: Q1
SJR: 1.115
CiteScore: 3.1
Impact factor: 1.8
ISSN: 0885-064X, 1090-2708
Subject areas: General Mathematics; Statistics and Probability; Applied Mathematics; Control and Optimization; Numerical Analysis; Algebra and Number Theory
Abstract
In recent years, functional linear models have attracted growing attention in statistics and machine learning, with the aim of recovering the slope function or its functional predictor. This paper considers an online regularized learning algorithm for functional linear models in reproducing kernel Hilbert spaces. Convergence analyses of the excess prediction error and the estimation error are provided with polynomially decaying step-sizes and constant step-sizes, respectively. Fast convergence rates can be derived via a capacity-dependent analysis. By introducing an explicit regularization term, we lift the saturation boundary of unregularized online learning algorithms when the step-size decays polynomially, and establish fast convergence rates for the estimation error without a capacity assumption. However, it remains an open problem to obtain capacity-independent convergence rates for the estimation error of the unregularized online learning algorithm with decaying step-size. We also show that the convergence rates of both the prediction error and the estimation error with constant step-size are competitive with those in the literature.
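To make the setting concrete, the following is a minimal toy sketch, not the paper's actual algorithm: a functional linear model discretized on a grid, trained with an online stochastic-gradient update that includes an explicit regularization term and a polynomially decaying step-size. All names and parameter choices here (the regularization parameter `lam`, the step-size schedule, the simulated curves) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1]; functional covariates X_t and the slope function
# are represented by their values on this grid.
m = 100
grid = np.linspace(0.0, 1.0, m)
ds = grid[1] - grid[0]

def inner(f, g):
    """L2([0, 1]) inner product approximated by a Riemann sum."""
    return float(np.dot(f, g) * ds)

# Hypothetical ground-truth slope function, used only to simulate data.
beta_true = np.sin(2.0 * np.pi * grid)

beta = np.zeros(m)   # online estimate of the slope function
lam = 1e-3           # explicit regularization parameter (illustrative)

for t in range(1, 5001):
    # Toy functional covariate: a scaled random walk, i.e. a rough "curve".
    X = np.cumsum(rng.standard_normal(m)) / np.sqrt(m)
    # Response from the functional linear model y = <beta_true, X> + noise.
    y = inner(beta_true, X) + 0.1 * rng.standard_normal()

    eta = 0.5 / np.sqrt(t)  # polynomially decaying step-size
    # Stochastic gradient of the regularized squared loss at (X, y).
    grad = (inner(beta, X) - y) * X + lam * beta
    beta -= eta * grad

# L2 estimation error of the online iterate after one pass over the stream.
err = np.sqrt(inner(beta - beta_true, beta - beta_true))
```

Replacing the decaying schedule with a constant step-size trades faster initial progress for a noise floor proportional to the step-size, which is roughly the trade-off the abstract's two convergence analyses address; the paper's RKHS formulation replaces this crude grid representation with a kernel-based one.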