Volume 19, Issue 01, pp. 107-124

Optimal learning with Gaussians and correntropy loss

Publication type: Journal Article
Publication date: 2019-11-11
Scimago: Q1
WoS: Q1
SJR: 1.144
CiteScore: 3.9
Impact factor: 2.4
ISSN: 0219-5305, 1793-6861
Subject areas: Applied Mathematics; Analysis
Abstract

Correntropy-based learning has achieved great success in practice over recent decades. It originated in information-theoretic learning and provides an alternative to the classical least squares method in the presence of non-Gaussian noise. In this paper, we investigate the theoretical properties of learning algorithms generated by Tikhonov regularization schemes associated with Gaussian kernels and the correntropy loss. By choosing an appropriate scale parameter for the Gaussian kernel, we show polynomial decay of the approximation error under a Sobolev smoothness condition. In addition, we employ a tight upper bound on the uniform covering number of the Gaussian RKHS to improve the estimate of the sample error. Based on these two results, we show that the proposed algorithm with a varying Gaussian kernel achieves the minimax rate of convergence (up to a logarithmic factor) without knowledge of the smoothness level of the regression function.
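The regularization scheme the abstract describes — an empirical correntropy loss plus a Tikhonov penalty in a Gaussian RKHS — is commonly solved in practice by half-quadratic (iteratively reweighted) kernel ridge regression. The sketch below illustrates that standard approach on toy data with gross outliers; it is not the paper's own code, and the function names, bandwidths, and data are all assumptions made for illustration.

```python
import numpy as np

def gaussian_gram(X, s):
    """Gram matrix K[i, j] = exp(-(x_i - x_j)^2 / s^2) for 1-D inputs."""
    d2 = (X[:, None] - X[None, :]) ** 2
    return np.exp(-d2 / s ** 2)

def correntropy_krr(X, y, s_kernel=0.2, s_loss=0.5, lam=1e-3, n_iter=30):
    """Tikhonov-regularized regression with the correntropy (Welsch) loss,
    solved by half-quadratic iteration: each step is a weighted kernel
    ridge regression with weights w_i = exp(-r_i^2 / s_loss^2)."""
    n = len(X)
    K = gaussian_gram(X, s_kernel)
    # Least-squares warm start (plain kernel ridge regression).
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y)
    for _ in range(n_iter):
        r = y - K @ alpha                  # current residuals
        w = np.exp(-r ** 2 / s_loss ** 2)  # large residuals get weight ~ 0
        # Stationarity of the weighted problem: (W K + n*lam*I) alpha = W y
        alpha = np.linalg.solve(w[:, None] * K + n * lam * np.eye(n), w * y)
    return alpha, K

# Toy data: a sine curve with a few large, non-Gaussian outliers.
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 60)
y = np.sin(2 * np.pi * X) + 0.05 * rng.standard_normal(60)
y[::10] += 3.0  # gross outliers
alpha, K = correntropy_krr(X, y)
fit = K @ alpha
```

Because the correntropy weights vanish for large residuals, the fit tracks the sine curve rather than the outliers; with the least squares loss, the same outliers would bias the estimate — which is the robustness motivation the abstract refers to.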


Top-30

Journals

Mathematical Foundations of Computing
7 publications, 29.17%
Journal of Complexity
2 publications, 8.33%
Journal of Mathematics
2 publications, 8.33%
International Journal of Wavelets, Multiresolution and Information Processing
1 publication, 4.17%
Analysis and Applications
1 publication, 4.17%
Entropy
1 publication, 4.17%
Journal of Approximation Theory
1 publication, 4.17%
Neurocomputing
1 publication, 4.17%
Inverse Problems
1 publication, 4.17%
AIMS Mathematics
1 publication, 4.17%
IEEE Transactions on Neural Networks and Learning Systems
1 publication, 4.17%
Foundations of Computational Mathematics
1 publication, 4.17%
Neural Networks
1 publication, 4.17%
Neural Computation
1 publication, 4.17%
Applied and Computational Harmonic Analysis
1 publication, 4.17%
IEEE Transactions on Instrumentation and Measurement
1 publication, 4.17%

Publishers

American Institute of Mathematical Sciences (AIMS)
8 publications, 33.33%
Elsevier
6 publications, 25%
World Scientific
2 publications, 8.33%
Institute of Electrical and Electronics Engineers (IEEE)
2 publications, 8.33%
Hindawi Limited
2 publications, 8.33%
MDPI
1 publication, 4.17%
IOP Publishing
1 publication, 4.17%
Springer Nature
1 publication, 4.17%
MIT Press
1 publication, 4.17%
  • We do not take into account publications without a DOI.
  • Statistics recalculated weekly.

Metrics: 24 citing publications
Cite this

GOST
Lv F., Fan J. Optimal learning with Gaussians and correntropy loss // Analysis and Applications. 2019. Vol. 19. No. 01. pp. 107-124.
RIS
TY - JOUR
DO - 10.1142/S0219530519410124
UR - https://doi.org/10.1142/S0219530519410124
TI - Optimal learning with Gaussians and correntropy loss
T2 - Analysis and Applications
AU - Lv, Fusheng
AU - Fan, Jun
PY - 2019
DA - 2019/11/11
PB - World Scientific
SP - 107-124
IS - 01
VL - 19
SN - 0219-5305
SN - 1793-6861
ER -
BibTeX
@article{2019_Lv,
author = {Fusheng Lv and Jun Fan},
title = {Optimal learning with Gaussians and correntropy loss},
journal = {Analysis and Applications},
year = {2019},
volume = {19},
publisher = {World Scientific},
month = {nov},
url = {https://doi.org/10.1142/S0219530519410124},
number = {01},
pages = {107--124},
doi = {10.1142/S0219530519410124}
}
MLA
Lv, Fusheng, and Jun Fan. “Optimal learning with Gaussians and correntropy loss.” Analysis and Applications, vol. 19, no. 01, Nov. 2019, pp. 107-124. https://doi.org/10.1142/S0219530519410124.