Stochastic gradient descent for linear inverse problems in Hilbert spaces
We investigate stochastic gradient descent (SGD) for solving ill-posed problems in their full infinite-dimensional formulation in Hilbert spaces. We allow for mini-batch versions of SGD, where the randomly chosen batches incur noise fluctuations. Based on the corresponding bias-variance decomposition, we provide bounds for the root mean squared error. These bounds take into account the discretization levels, the decay of the step size, which is more flexible than in existing results, and the underlying smoothness in terms of general source conditions. This allows SGD to be applied to severely ill-posed problems. The obtained error bounds exhibit three stages in the performance of SGD; in particular, the pre-asymptotic behavior is clearly visible. Numerical studies verify the theoretical predictions.
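For orientation, a minimal sketch of a mini-batch SGD iteration of the kind studied here is given below, for a linear operator equation with data components $y_i^{\delta}$ and associated operators $A_i$. The notation (batches $B_k$, step sizes $\gamma_k$, decay exponent $\theta$) is an illustrative assumption, not taken verbatim from the paper.

% Hypothetical sketch of one mini-batch SGD step for A x = y with noisy data y^delta;
% the symbols B_k, gamma_k, theta are assumed notation for illustration only.
\[
  x_{k+1} \;=\; x_k \;-\; \frac{\gamma_k}{|B_k|} \sum_{i \in B_k} A_i^{*}\bigl(A_i x_k - y_i^{\delta}\bigr),
  \qquad \gamma_k = \gamma_0\, k^{-\theta}, \quad 0 \le \theta < 1.
\]

A polynomially decaying step size of this form is one common choice; the flexibility in $\theta$ alluded to in the abstract concerns precisely how such decay schedules may be chosen.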