Volume 22, Issue 06, Pages 1095-1131

Iterative Kernel Regression with Preconditioning

Publication type: Journal Article
Publication date: 2024-04-22
Scimago: Q1
WoS: Q1
SJR: 1.144
CiteScore: 3.9
Impact factor: 2.4
ISSN: 0219-5305, 1793-6861
Subject areas: Applied Mathematics, Analysis
Abstract

Kernel methods are popular in nonlinear and nonparametric regression due to their solid mathematical foundations and optimal statistical properties. However, scalability remains the primary bottleneck in applying kernel methods to large-scale regression. This paper aims to improve the scalability of kernel methods. We combine Nyström subsampling and the preconditioned conjugate gradient method to solve regularized kernel regression. Our theoretical analysis indicates that achieving optimal convergence rates requires only [Formula: see text] memory and [Formula: see text] time (up to logarithmic factors). Numerical experiments show that our algorithm outperforms existing methods in time efficiency and prediction accuracy on large-scale datasets. Notably, compared to the FALKON algorithm [A. Rudi, L. Carratino and L. Rosasco, Falkon: An optimal large scale kernel method, in Advances in Neural Information Processing Systems (Curran Associates, 2017), pp. 3891–3901], which is widely regarded as an optimal large-scale kernel method, our method is more flexible (applicable to non-positive definite kernel functions) and has a lower algorithmic complexity. Additionally, our theoretical analysis relaxes the restrictive conditions on hyperparameters previously imposed in convergence analyses.
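To make the abstract's recipe concrete, the following is a minimal sketch of the general idea: kernel ridge regression solved by conjugate gradient, preconditioned with a rank-m Nyström approximation built from uniformly subsampled landmark columns. This is an illustrative reconstruction, not the authors' algorithm: the Gaussian kernel, the uniform landmark sampling, the Woodbury-based preconditioner, and all names such as `nystrom_pcg_krr` are assumptions made for this example.

```python
# Sketch only: Nystrom-preconditioned CG for kernel ridge regression.
# Not the paper's implementation; kernel, sampling scheme, and preconditioner
# are illustrative choices.
import numpy as np
from scipy.linalg import cholesky, solve_triangular
from scipy.sparse.linalg import LinearOperator, cg


def gaussian_kernel(X, Y, sigma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of X and Y."""
    sq_dists = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-sq_dists / (2.0 * sigma ** 2))


def nystrom_pcg_krr(X, y, lam=1e-3, m=100, sigma=1.0, maxiter=300, seed=0):
    """Solve (K + n*lam*I) alpha = y by CG, preconditioned with a rank-m
    Nystrom approximation of K built from m uniformly sampled landmarks."""
    n = X.shape[0]
    rng = np.random.default_rng(seed)
    idx = rng.choice(n, size=min(m, n), replace=False)

    K = gaussian_kernel(X, X, sigma)      # full kernel, kept dense here for clarity
    K_nm = K[:, idx]                      # n x m block of sampled columns
    K_mm = K_nm[idx, :]                   # m x m block on the landmarks

    # Nystrom approximation K ~ B B^T with B = K_nm L^{-T}, where K_mm = L L^T.
    jitter = 1e-10 * np.trace(K_mm) / len(idx)
    L = cholesky(K_mm + jitter * np.eye(len(idx)), lower=True)
    B = solve_triangular(L, K_nm.T, lower=True).T

    # Preconditioner: invert (B B^T + c I) exactly via the Woodbury identity,
    # (B B^T + c I)^{-1} = (I - B (B^T B + c I)^{-1} B^T) / c, with c = n*lam.
    c = n * lam
    L_in = cholesky(B.T @ B + c * np.eye(B.shape[1]), lower=True)

    def apply_precond(v):
        t = solve_triangular(L_in, B.T @ v, lower=True)
        t = solve_triangular(L_in.T, t, lower=False)
        return (v - B @ t) / c

    A = LinearOperator((n, n), matvec=lambda v: K @ v + c * v, dtype=np.float64)
    M = LinearOperator((n, n), matvec=apply_precond, dtype=np.float64)
    alpha, info = cg(A, y, M=M, maxiter=maxiter)
    return alpha, idx, info


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    X = rng.uniform(-1.0, 1.0, size=(500, 2))
    y = np.sin(3 * X[:, 0]) * np.cos(2 * X[:, 1]) + 0.05 * rng.standard_normal(500)
    alpha, _, info = nystrom_pcg_krr(X, y, lam=1e-4, m=80, sigma=0.5)
    y_hat = gaussian_kernel(X, X, sigma=0.5) @ alpha
    print("CG exit flag:", info, "training RMSE:", np.sqrt(np.mean((y_hat - y) ** 2)))
```

Note that the sketch forms the full kernel matrix densely only for readability; in a genuinely large-scale setting the products K v would be computed in blocks or on the fly, which is where the memory savings claimed in the abstract come from.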


Top-30

Journals
1. Mathematics of Computation: 1 publication, 100%

Publishers
1. American Mathematical Society: 1 publication, 100%

  • We do not take into account publications without a DOI.
  • Statistics recalculated weekly.

Cite this

GOST
Shi L., Zhang Z. Iterative Kernel Regression with Preconditioning // Analysis and Applications. 2024. Vol. 22. No. 06. pp. 1095-1131.
RIS
TY - JOUR
DO - 10.1142/s0219530524500131
UR - https://www.worldscientific.com/doi/10.1142/S0219530524500131
TI - Iterative Kernel Regression with Preconditioning
T2 - Analysis and Applications
AU - Shi, Lei
AU - Zhang, Zihan
PY - 2024
DA - 2024/04/22
PB - World Scientific
SP - 1095-1131
IS - 06
VL - 22
SN - 0219-5305
SN - 1793-6861
ER -
BibTeX
@article{2024_Shi,
author = {Lei Shi and Zihan Zhang},
title = {Iterative Kernel Regression with Preconditioning},
journal = {Analysis and Applications},
year = {2024},
volume = {22},
publisher = {World Scientific},
month = {apr},
url = {https://www.worldscientific.com/doi/10.1142/S0219530524500131},
number = {06},
pages = {1095--1131},
doi = {10.1142/s0219530524500131}
}
MLA
Shi, Lei, et al. “Iterative Kernel Regression with Preconditioning.” Analysis and Applications, vol. 22, no. 06, Apr. 2024, pp. 1095-1131. https://www.worldscientific.com/doi/10.1142/S0219530524500131.