Online gradient descent algorithms for functional data learning
Publication type: Journal Article
Publication date: 2022-06-01
Scimago: Q1
Web of Science: Q1
SJR: 0.850
CiteScore: 3.5
Impact factor: 1.8
ISSN: 0885-064X, 1090-2708
Subject areas: General Mathematics, Statistics and Probability, Applied Mathematics, Control and Optimization, Numerical Analysis, Algebra and Number Theory
Abstract
The functional linear model is a widely applied framework for regression problems, including those with intrinsically infinite-dimensional data. Online gradient descent methods, despite their demonstrated power for processing online or large-scale data, are not well studied for learning with functional data. In this paper, we study reproducing kernel-based online learning algorithms for functional data and derive convergence rates for the expected excess prediction risk under both online and finite-horizon step-size settings. It is well understood that nontrivial uniform convergence rates for the estimation task depend on the regularity of the slope function. Surprisingly, the convergence rates we derive for the prediction task require no regularity assumption on the slope function. Our analysis reveals the intrinsic difference between the estimation task and the prediction task in functional data learning.
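The algorithm family the abstract describes can be illustrated with a small simulation. The sketch below is not the paper's exact procedure, only a minimal discretized version under stated assumptions: input curves and the slope function live on a fixed grid of [0, 1], the kernel is a Gaussian RBF (the analysis allows a general Mercer kernel), the slope `beta_true`, the step-size schedule, and the curve-generating process are all illustrative choices. Each online step predicts the L^2 inner product of the current slope estimate with the incoming curve and takes an RKHS gradient step, which on the grid means applying the kernel matrix to the curve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Discretize [0, 1] on a grid of m points; integrals become grid averages.
m = 100
grid = np.linspace(0.0, 1.0, m)

# Gaussian RBF kernel matrix K(s, t) on the grid (illustrative kernel choice).
sigma = 0.2
G = np.exp(-((grid[:, None] - grid[None, :]) ** 2) / (2 * sigma**2))

# Ground-truth slope function for the simulation (purely illustrative).
beta_true = np.sin(2 * np.pi * grid)

def sample(n):
    """Draw n random smooth curves X(t) and responses Y = <beta, X>_{L^2} + noise."""
    X = np.zeros((n, m))
    for k in range(1, 6):  # random Fourier sums give smooth curves
        X += rng.normal(size=(n, 1)) * np.sin(k * np.pi * grid) / k
    y = X @ beta_true / m + 0.05 * rng.normal(size=n)
    return X, y

# One pass of online gradient descent over a stream of T samples.
T = 2000
X, y = sample(T)
b = np.zeros(m)                      # current slope estimate on the grid
for t in range(T):
    eta = 1.0 / (1 + t) ** 0.5       # polynomially decaying step size
    y_hat = X[t] @ b / m             # prediction <b, X_t>_{L^2}
    # The RKHS gradient of the squared loss is (y_hat - y) * L_K X_t,
    # where (L_K X_t)(s) = ∫ K(s, u) X_t(u) du ≈ G @ X_t / m on the grid.
    b -= eta * (y_hat - y[t]) * (G @ X[t]) / m

# Excess prediction risk on fresh data, compared with the zero predictor.
Xte, yte = sample(500)
risk = np.mean((Xte @ b / m - yte) ** 2)
risk0 = np.mean(yte**2)
print(risk, risk0)
```

With a decaying step size the one-pass estimate should predict better than the trivial zero predictor; the paper's point is that such prediction-risk guarantees need no smoothness assumption on the slope, in contrast to estimation-error guarantees.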
Top-30

Journals
- Journal of Complexity: 4 publications (21.05%)
- Applied and Computational Harmonic Analysis: 2 publications (10.53%)
- Inverse Problems: 2 publications (10.53%)
- Journal of Computational Design and Engineering: 1 publication (5.26%)
- Foundations of Computational Mathematics: 1 publication (5.26%)
- Axioms: 1 publication (5.26%)
- IEEE Transactions on Transportation Electrification: 1 publication (5.26%)
- Applied Mathematics: 1 publication (5.26%)
- Neural Networks: 1 publication (5.26%)
- IEEE Transactions on Information Theory: 1 publication (5.26%)
- Statistics and Computing: 1 publication (5.26%)
- Journal of Approximation Theory: 1 publication (5.26%)
- IEEE Transactions on Automatic Control: 1 publication (5.26%)
Publishers
- Elsevier: 8 publications (42.11%)
- Springer Nature: 3 publications (15.79%)
- Institute of Electrical and Electronics Engineers (IEEE): 3 publications (15.79%)
- IOP Publishing: 2 publications (10.53%)
- Oxford University Press: 1 publication (5.26%)
- MDPI: 1 publication (5.26%)
- SPIE-Intl Soc Optical Eng: 1 publication (5.26%)
- We do not take into account publications without a DOI.
- Statistics recalculated weekly.
Metrics

Total citations: 19
Citations from 2024: 14 (73.69%)
Cite this

GOST
Chen X. et al. Online gradient descent algorithms for functional data learning // Journal of Complexity. 2022. Vol. 70. p. 101635.
GOST all authors (up to 50)
Chen X., Tang B., Fan J., Guo X. Online gradient descent algorithms for functional data learning // Journal of Complexity. 2022. Vol. 70. p. 101635.
RIS
TY - JOUR
DO - 10.1016/j.jco.2021.101635
UR - https://doi.org/10.1016/j.jco.2021.101635
TI - Online gradient descent algorithms for functional data learning
T2 - Journal of Complexity
AU - Chen, Xiaming
AU - Tang, Bohao
AU - Fan, Jun
AU - Guo, Xin
PY - 2022
DA - 2022/06/01
PB - Elsevier
SP - 101635
VL - 70
SN - 0885-064X
SN - 1090-2708
ER -
BibTex (up to 50 authors)
@article{2022_Chen,
author = {Xiaming Chen and Bohao Tang and Jun Fan and Xin Guo},
title = {Online gradient descent algorithms for functional data learning},
journal = {Journal of Complexity},
year = {2022},
volume = {70},
publisher = {Elsevier},
month = {jun},
url = {https://doi.org/10.1016/j.jco.2021.101635},
pages = {101635},
doi = {10.1016/j.jco.2021.101635}
}