Journal of Complexity, Volume 27, Issue 1, pp. 55–67

Optimal learning rates for least squares regularized regression with unbounded sampling

Publication type: Journal Article
Publication date: 2011-02-01
Scimago quartile: Q1
Web of Science quartile: Q1
SJR: 0.850
CiteScore: 3.5
Impact factor: 1.8
ISSN: 0885-064X, 1090-2708
Subject areas: General Mathematics; Statistics and Probability; Applied Mathematics; Control and Optimization; Numerical Analysis; Algebra and Number Theory
Abstract
A standard assumption in theoretical study of learning algorithms for regression is uniform boundedness of output sample values. This excludes the common case with Gaussian noise. In this paper we investigate the learning algorithm for regression generated by the least squares regularization scheme in reproducing kernel Hilbert spaces without the assumption of uniform boundedness for sampling. By imposing some incremental conditions on moments of the output variable, we derive learning rates in terms of regularity of the regression function and capacity of the hypothesis space. The novelty of our analysis is a new covering number argument for bounding the sample error.
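The scheme the abstract refers to is least squares regularized regression in a reproducing kernel Hilbert space, i.e. kernel ridge regression: minimize the empirical squared loss plus a squared RKHS-norm penalty, whose minimizer has the finite expansion given by the representer theorem. The sketch below is an illustrative implementation of that general scheme, not the paper's code; the Gaussian kernel, bandwidth, and regularization parameter are arbitrary choices for the example.

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

def krr_fit(X, y, lam=1e-2, sigma=1.0):
    # Regularized least squares in the RKHS: minimize
    #   (1/n) * sum_i (f(x_i) - y_i)^2 + lam * ||f||_K^2.
    # By the representer theorem the minimizer is
    #   f(x) = sum_i alpha_i K(x, x_i),  alpha = (K + n*lam*I)^{-1} y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, sigma)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    return alpha

def krr_predict(X_train, alpha, X_new, sigma=1.0):
    return gaussian_kernel(X_new, X_train, sigma) @ alpha

# Example: noisy samples of sin(x). Note the Gaussian output noise is
# unbounded, which is exactly the sampling setting the paper studies.
rng = np.random.default_rng(0)
X = rng.uniform(0, np.pi, size=(200, 1))
y = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(200)
alpha = krr_fit(X, y, lam=1e-3, sigma=0.5)
pred = krr_predict(X, alpha, np.array([[np.pi / 2]]), sigma=0.5)
```

The paper's contribution concerns the rates at which such an estimator converges to the regression function as the sample size grows, under moment conditions on the output rather than uniform boundedness.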

Top-30

Journals

  • Analysis and Applications: 6 publications, 12.24%
  • International Journal of Wavelets, Multiresolution and Information Processing: 4 publications, 8.16%
  • Automatica: 4 publications, 8.16%
  • Abstract and Applied Analysis: 3 publications, 6.12%
  • IEEE Transactions on Automatic Control: 3 publications, 6.12%
  • Foundations of Computational Mathematics: 2 publications, 4.08%
  • Advances in Computational Mathematics: 2 publications, 4.08%
  • International Journal of Computer Mathematics: 2 publications, 4.08%
  • Neural Computation: 1 publication, 2.04%
  • SIAM Journal on Numerical Analysis: 1 publication, 2.04%
  • Machine Learning: 1 publication, 2.04%
  • Journal of Inequalities and Applications: 1 publication, 2.04%
  • Complex Analysis and Operator Theory: 1 publication, 2.04%
  • Acta Mathematica Sinica, English Series: 1 publication, 2.04%
  • Journal of Complexity: 1 publication, 2.04%
  • Journal of Mathematical Analysis and Applications: 1 publication, 2.04%
  • Journal of Multivariate Analysis: 1 publication, 2.04%
  • Applied and Computational Harmonic Analysis: 1 publication, 2.04%
  • Computers and Mathematics with Applications: 1 publication, 2.04%
  • Physica D: Nonlinear Phenomena: 1 publication, 2.04%
  • Neural Networks: 1 publication, 2.04%
  • Information Sciences: 1 publication, 2.04%
  • Statistics and Probability Letters: 1 publication, 2.04%
  • Journal of Approximation Theory: 1 publication, 2.04%
  • Applicable Analysis: 1 publication, 2.04%
  • Communications and Control Engineering: 1 publication, 2.04%
  • IEEE Control Systems Letters: 1 publication, 2.04%
  • Journal of Computational and Applied Mathematics: 1 publication, 2.04%
  • IEEE Transactions on Pattern Analysis and Machine Intelligence: 1 publication, 2.04%
  • Bernoulli: 1 publication, 2.04%

Publishers

  • Elsevier: 16 publications, 32.65%
  • World Scientific: 10 publications, 20.41%
  • Springer Nature: 9 publications, 18.37%
  • Institute of Electrical and Electronics Engineers (IEEE): 5 publications, 10.2%
  • Taylor & Francis: 3 publications, 6.12%
  • Hindawi Limited: 3 publications, 6.12%
  • MIT Press: 1 publication, 2.04%
  • Society for Industrial and Applied Mathematics (SIAM): 1 publication, 2.04%
  • Bernoulli Society for Mathematical Statistics and Probability: 1 publication, 2.04%
  • We do not take into account publications without a DOI.
  • Statistics recalculated weekly.

Metrics: 49

Cite this

GOST
Wang C., Zhou D. Optimal learning rates for least squares regularized regression with unbounded sampling // Journal of Complexity. 2011. Vol. 27. No. 1. pp. 55-67.
RIS
TY - JOUR
DO - 10.1016/j.jco.2010.10.002
UR - https://doi.org/10.1016/j.jco.2010.10.002
TI - Optimal learning rates for least squares regularized regression with unbounded sampling
T2 - Journal of Complexity
AU - Wang, Cheng
AU - Zhou, Ding-Xuan
PY - 2011
DA - 2011/02/01
PB - Elsevier
SP - 55-67
IS - 1
VL - 27
SN - 0885-064X
SN - 1090-2708
ER -
BibTeX
@article{2011_Wang,
author = {Cheng Wang and Ding-Xuan Zhou},
title = {Optimal learning rates for least squares regularized regression with unbounded sampling},
journal = {Journal of Complexity},
year = {2011},
volume = {27},
publisher = {Elsevier},
month = {feb},
url = {https://doi.org/10.1016/j.jco.2010.10.002},
number = {1},
pages = {55--67},
doi = {10.1016/j.jco.2010.10.002}
}
MLA
Wang, Cheng, and Ding-Xuan Zhou. “Optimal learning rates for least squares regularized regression with unbounded sampling.” Journal of Complexity, vol. 27, no. 1, Feb. 2011, pp. 55-67. https://doi.org/10.1016/j.jco.2010.10.002.