Regularization Theory and Neural Networks Architectures

Neural Computation, volume 7, issue 2, pages 219-269

Publication type: Journal Article
Publication date: 2008-04-04
Quartiles: Scimago Q1, Web of Science Q3
SJR: 0.829
CiteScore: 5.6
Impact factor: 2.1
ISSN: 0899-7667, 1530-888X
Subject areas: Cognitive Neuroscience; Arts and Humanities (miscellaneous)
Abstract

We had previously shown that regularization principles lead to approximation schemes that are equivalent to networks with one layer of hidden units, called regularization networks. In particular, standard smoothness functionals lead to a subclass of regularization networks, the well known radial basis functions approximation schemes. This paper shows that regularization networks encompass a much broader range of approximation schemes, including many of the popular general additive models and some of the neural networks. In particular, we introduce new classes of smoothness functionals that lead to different classes of basis functions. Additive splines as well as some tensor product splines can be obtained from appropriate classes of smoothness functionals. Furthermore, the same generalization that extends radial basis functions (RBF) to hyper basis functions (HBF) also leads from additive models to ridge approximation models, containing as special cases Breiman's hinge functions, some forms of projection pursuit regression, and several types of neural networks. We propose to use the term generalized regularization networks for this broad class of approximation schemes that follow from an extension of regularization. In the probabilistic interpretation of regularization, the different classes of basis functions correspond to different classes of prior probabilities on the approximating function spaces, and therefore to different types of smoothness assumptions. In summary, different multilayer networks with one hidden layer, which we collectively call generalized regularization networks, correspond to different classes of priors and associated smoothness functionals in a classical regularization principle. Three broad classes are (1) radial basis functions that can be generalized to hyper basis functions, (2) some tensor product splines, and (3) additive splines that can be generalized to schemes of the type of ridge approximation, hinge functions, and several perceptron-like neural networks with one hidden layer.
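The abstract states the construction without formulas. As a hedged illustration only (the kernel choice, function names, and parameter values below are assumptions for the sketch, not taken from the paper), the radial basis function subclass of regularization networks can be summarized as follows: minimizing a regularized empirical error H[f] = sum_i (y_i - f(x_i))^2 + lambda * smoothness(f) yields an approximation f(x) = sum_i c_i G(x, x_i) with one hidden unit per data point, where the basis function G is fixed by the chosen smoothness functional (here a Gaussian) and the coefficients solve the linear system (G + lambda*I) c = y.

import numpy as np

def gaussian_kernel(X, Z, sigma=1.0):
    # Gaussian radial basis function G(x, z) = exp(-||x - z||^2 / (2 sigma^2)).
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def fit_regularization_network(X, y, lam=1e-2, sigma=1.0):
    # Coefficients c solve (G + lam * I) c = y, one Gaussian hidden unit per data point.
    G = gaussian_kernel(X, X, sigma)
    return np.linalg.solve(G + lam * np.eye(len(X)), y)

def predict(X_new, X_train, c, sigma=1.0):
    # f(x) = sum_i c_i G(x, x_i): a network with one layer of radial hidden units.
    return gaussian_kernel(X_new, X_train, sigma) @ c

# Toy usage: noisy samples of a smooth 1-D function (illustrative data only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(50, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(50)
c = fit_regularization_network(X, y, lam=1e-2, sigma=0.5)
X_test = np.linspace(-3, 3, 7).reshape(-1, 1)
print(predict(X_test, X, c, sigma=0.5))

Replacing the fixed Gaussian centers and scales with adjustable ones gives the hyper basis function generalization mentioned in the abstract; swapping the radial basis for additive or ridge-type basis functions gives the other classes of generalized regularization networks.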

Top-30 Journals

Lecture Notes in Computer Science: 90 publications, 9.04%
Neurocomputing: 35 publications, 3.51%
Neural Networks: 22 publications, 2.21%
Neural Computation: 22 publications, 2.21%
IEEE Transactions on Neural Networks: 22 publications, 2.21%
IEEE Transactions on Neural Networks and Learning Systems: 15 publications, 1.51%
Pattern Recognition: 9 publications, 0.9%
Neural Computing and Applications: 9 publications, 0.9%
IEEE Transactions on Pattern Analysis and Machine Intelligence: 8 publications, 0.8%
Automatica: 7 publications, 0.7%
Signal Processing: 7 publications, 0.7%
Neural Processing Letters: 6 publications, 0.6%
IEEE Access: 6 publications, 0.6%
Perspectives in Neural Computing: 6 publications, 0.6%
Engineering Applications of Artificial Intelligence: 5 publications, 0.5%
Fuzzy Sets and Systems: 5 publications, 0.5%
IFAC Proceedings Volumes: 5 publications, 0.5%
Applied Soft Computing Journal: 5 publications, 0.5%
Applied Sciences (Switzerland): 5 publications, 0.5%
Communications in Computer and Information Science: 5 publications, 0.5%
Applied Intelligence: 4 publications, 0.4%
IEEE Transactions on Automatic Control: 4 publications, 0.4%
IEEE Transactions on Information Theory: 4 publications, 0.4%
IEEE Transactions on Image Processing: 4 publications, 0.4%
PLoS ONE: 4 publications, 0.4%
Knowledge-Based Systems: 4 publications, 0.4%
IEEE Transactions on Systems Man and Cybernetics Part B (Cybernetics): 4 publications, 0.4%
IEEE Transactions on Cybernetics: 4 publications, 0.4%
Studies in Computational Intelligence: 4 publications, 0.4%
Publishers

Institute of Electrical and Electronics Engineers (IEEE): 293 publications, 29.42%
Springer Nature: 209 publications, 20.98%
Elsevier: 203 publications, 20.38%
Wiley: 26 publications, 2.61%
MDPI: 25 publications, 2.51%
MIT Press: 24 publications, 2.41%
Association for Computing Machinery (ACM): 12 publications, 1.2%
Taylor & Francis: 12 publications, 1.2%
World Scientific: 10 publications, 1%
SAGE: 8 publications, 0.8%
IOP Publishing: 8 publications, 0.8%
Society for Industrial and Applied Mathematics (SIAM): 7 publications, 0.7%
AIP Publishing: 6 publications, 0.6%
American Physical Society (APS): 6 publications, 0.6%
Hindawi Limited: 6 publications, 0.6%
Public Library of Science (PLoS): 5 publications, 0.5%
SPIE-Intl Soc Optical Eng: 4 publications, 0.4%
Institute for Operations Research and the Management Sciences (INFORMS): 4 publications, 0.4%
Frontiers Media S.A.: 4 publications, 0.4%
Oxford University Press: 4 publications, 0.4%
IGI Global: 4 publications, 0.4%
Walter de Gruyter: 4 publications, 0.4%
Cold Spring Harbor Laboratory: 4 publications, 0.4%
ASME International: 3 publications, 0.3%
Institute of Mathematical Statistics: 3 publications, 0.3%
American Chemical Society (ACS): 3 publications, 0.3%
Trans Tech Publications: 3 publications, 0.3%
Emerald: 2 publications, 0.2%
American Geophysical Union: 2 publications, 0.2%
  • Publications without a DOI are not taken into account.
  • Statistics are recalculated weekly.

Metrics

Citations: 996

Cite this

GOST
Girosi F., Jones M., Poggio T. Regularization Theory and Neural Networks Architectures // Neural Computation. 2008. Vol. 7. No. 2. pp. 219-269.

RIS
TY - JOUR
DO - 10.1162/neco.1995.7.2.219
UR - https://doi.org/10.1162/neco.1995.7.2.219
TI - Regularization Theory and Neural Networks Architectures
T2 - Neural Computation
AU - Girosi, Federico
AU - Jones, Michael
AU - Poggio, Tomaso
PY - 2008
DA - 2008/04/04
PB - MIT Press
SP - 219-269
IS - 2
VL - 7
SN - 0899-7667
SN - 1530-888X
ER -

BibTeX
@article{2008_Girosi,
author = {Federico Girosi and Michael Jones and Tomaso Poggio},
title = {Regularization Theory and Neural Networks Architectures},
journal = {Neural Computation},
year = {2008},
volume = {7},
publisher = {MIT Press},
month = {apr},
url = {https://doi.org/10.1162/neco.1995.7.2.219},
number = {2},
pages = {219--269},
doi = {10.1162/neco.1995.7.2.219}
}

MLA
Girosi, Federico, et al. “Regularization Theory and Neural Networks Architectures.” Neural Computation, vol. 7, no. 2, Apr. 2008, pp. 219-269. https://doi.org/10.1162/neco.1995.7.2.219.