Mathematics of Computation, vol. 91, no. 336, pp. 1763–1788

Stochastic gradient descent for linear inverse problems in Hilbert spaces

Zhi Hang Wu 1
Peter Mathé 2

2 Weierstraß Institute for Applied Analysis and Stochastics, Mohrenstraße 39, 10117 Berlin, Germany
Publication type: Journal Article
Publication date: 2021-11-10
Scimago quartile: Q1
SJR: 1.460
CiteScore: 3.9
Impact factor: 2.2
ISSN: 0025-5718, 1088-6842
Subject areas: Computational Mathematics; Applied Mathematics; Algebra and Number Theory
Abstract

We investigate stochastic gradient descent (SGD) for solving fully infinite-dimensional ill-posed problems in Hilbert spaces. We allow for batch-size versions of SGD in which the randomly chosen batches incur noise fluctuations. Based on the corresponding bias–variance decomposition, we provide bounds for the root mean squared error. These bounds take into account the discretization levels, the decay of the step size, which is more flexible than in existing results, and the underlying smoothness in terms of general source conditions. This makes SGD applicable to severely ill-posed problems. The obtained error bounds exhibit three stages in the performance of SGD; in particular, the pre-asymptotic behavior can be clearly seen. Numerical studies verify the theoretical predictions.
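The batch-size SGD scheme summarized above can be sketched on a small discretized toy problem. The kernel below (a smoothing integral-type operator whose singular values decay, making inversion unstable), the polynomially decaying step size, the batch size, and all numerical parameters are illustrative choices for this sketch, not the paper's setting or results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative discretized ill-posed problem: a smoothing kernel matrix
# A[i, j] = min(t_i, t_j) / n, whose rapidly decaying singular values
# make naive inversion of A x = y unstable.
n = 50
t = np.linspace(0, 1, n)
A = np.minimum.outer(t, t) / n
x_true = np.sin(np.pi * t)
delta = 1e-3                                    # assumed noise level
y = A @ x_true + delta * rng.standard_normal(n)  # noisy data

def sgd(A, y, n_iter=20000, batch=5, eta0=100.0, alpha=0.5):
    """Batch SGD: x_{k+1} = x_k + eta_k * A_B^T (y_B - A_B x_k) / |B|,
    with polynomially decaying step size eta_k = eta0 / k**alpha.
    Parameter choices here are ad hoc, for illustration only."""
    m, dim = A.shape
    x = np.zeros(dim)
    for k in range(1, n_iter + 1):
        idx = rng.choice(m, size=batch, replace=False)  # random batch
        r = y[idx] - A[idx] @ x                         # batch residual
        x += (eta0 / k**alpha) * (A[idx].T @ r) / batch
    return x

x_hat = sgd(A, y)
err = np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true)
print(f"relative error: {err:.3f}")
```

Early iterations with large steps recover the smooth (low-frequency) components of `x_true`, while the decaying step size damps the noise-driven variance later on, mirroring the multi-stage behavior the error bounds describe.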

