Convergence of Stochastic Gradient Descent Schemes for Łojasiewicz-Landscapes

Year:    2024

Authors:    Steffen Dereich, Sebastian Kassing

Journal of Machine Learning, Vol. 3 (2024), Iss. 3 : pp. 245–281

Abstract

In this article, we consider the convergence of stochastic gradient descent (SGD) schemes, including momentum stochastic gradient descent (MSGD), under weak assumptions on the underlying landscape. More explicitly, we show that, on the event that the SGD scheme stays bounded, the scheme converges if there are only countably many critical points or if the objective function satisfies Łojasiewicz inequalities around all critical levels, as all analytic functions do. In particular, we show that for neural networks with an analytic activation function, such as softplus, the sigmoid or the hyperbolic tangent, SGD converges on the event of staying bounded, provided the random variables modelling the signal and the response in the training are compactly supported.
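
For orientation, the following is a minimal LaTeX sketch of the standard forms of the objects named in the abstract; the notation (step sizes \gamma_n, momentum parameter \mu, noise terms \varepsilon_{n+1}, exponent \beta) is generic and need not match the paper's.

% SGD of Robbins-Monro type: a gradient step corrupted by noise, with
% step sizes that sum to infinity but have summable squares.
\[
  \theta_{n+1} = \theta_n - \gamma_n \bigl( \nabla f(\theta_n) + \varepsilon_{n+1} \bigr),
  \qquad \sum_n \gamma_n = \infty, \quad \sum_n \gamma_n^2 < \infty.
\]
% A common momentum (heavy-ball) variant, MSGD:
\[
  v_{n+1} = \mu \, v_n - \gamma_n \bigl( \nabla f(\theta_n) + \varepsilon_{n+1} \bigr),
  \qquad \theta_{n+1} = \theta_n + v_{n+1}.
\]
% Łojasiewicz inequality near a critical point \theta^*, with constants
% C > 0 and \beta \in [1/2, 1); analytic functions satisfy it automatically.
\[
  \bigl| f(\theta) - f(\theta^*) \bigr|^{\beta} \le C \, \bigl\| \nabla f(\theta) \bigr\|
  \quad \text{for all } \theta \text{ in a neighbourhood of } \theta^*.
\]

Roughly, the inequality is what replaces the countability assumption: near a critical level it converts gradient smallness into control of the function gap, which prevents bounded iterates from drifting along a continuum of critical points without converging.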

Journal Article Details

Publisher Name:    Global Science Press

Language:    English

DOI:    https://doi.org/10.4208/jml.240109

Published online:    2024-01

Copyright:    © Global Science Press

Pages:    37

Keywords:    Stochastic gradient descent, Stochastic approximation, Robbins-Monro, Almost sure convergence, Łojasiewicz inequality.

Author Details

Steffen Dereich

Sebastian Kassing

Cited By

  1. Dereich, Steffen, Jentzen, Arnulf and Kassing, Sebastian. "On the Existence of Minimizers in Shallow Residual ReLU Neural Network Optimization Landscapes." SIAM Journal on Numerical Analysis, Vol. 62 (2024), Iss. 6, p. 2640. https://doi.org/10.1137/23M1556241