Characterizing Dynamical Stability of Stochastic Gradient Descent in Overparameterized Learning
Authors
Dennis Chemnitz and Maximilian Engel
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Sep 08, 2025
Abstract
For overparameterized optimization tasks, such as those found in modern machine learning, global minima are generally not unique. In order to understand generalization in these settings, it is vital to study to which minimum an optimization algorithm converges. The possibility of having minima that are unstable under the dynamics imposed by the optimization algorithm limits the potential minima that the algorithm can find. In this paper, we characterize the global minima that are dynamically stable/unstable for both deterministic and stochastic gradient descent (SGD). In particular, we introduce a characteristic Lyapunov exponent that depends on the local dynamics around a global minimum and rigorously prove that the sign of this Lyapunov exponent determines whether SGD can accumulate at the respective global minimum.
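As a rough illustration of the stability notion described above (not the paper's exact construction), the sketch below numerically estimates a Lyapunov exponent for SGD linearized around a global minimum of a toy overparameterized least-squares problem. The data, per-sample losses, step size, and problem sizes are all assumptions made for the example.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical overparameterized least-squares setup: n < d samples, so the set
# of global minima {theta : X @ theta = 0} is a (d - n)-dimensional manifold.
d, n, eta = 5, 3, 0.4                  # dimension, samples, learning rate (all assumed)
X = rng.standard_normal((n, d))        # per-sample losses f_i(theta) = 0.5 * (x_i @ theta)**2

# Orthonormal basis Q of the row space of X, i.e. the directions transverse to
# the manifold of minima; linearized SGD leaves this subspace invariant.
Q, _ = np.linalg.qr(X.T)               # Q has shape (d, n)
Y = X @ Q                              # per-sample data in transverse coordinates

def sgd_jacobian(i):
    # One linearized SGD step at the minimum, restricted to the transverse
    # subspace: v -> (I - eta * y_i y_i^T) v, with y_i = Q^T x_i.
    return np.eye(n) - eta * np.outer(Y[i], Y[i])

# Estimate the top Lyapunov exponent by averaging the log growth rate of a
# random perturbation under a long product of these random Jacobians.
v = rng.standard_normal(n)
v /= np.linalg.norm(v)
T = 100_000
log_growth = 0.0
for _ in range(T):
    v = sgd_jacobian(rng.integers(n)) @ v
    norm = np.linalg.norm(v)
    log_growth += np.log(norm)
    v /= norm

print(f"estimated Lyapunov exponent: {log_growth / T:+.4f}")
# Negative: perturbations decay on average, so SGD can accumulate at this minimum.
# Positive: perturbations grow on average, so the minimum is dynamically unstable.

The paper defines its characteristic Lyapunov exponent via the random dynamics that SGD induces near a global minimum; the toy quadratic model above only mimics that idea for illustration.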
Author Details
Dennis Chemnitz (Author)
Maximilian Engel (Author)
Citation Information
APA Format
Dennis Chemnitz & Maximilian Engel. Characterizing Dynamical Stability of Stochastic Gradient Descent in Overparameterized Learning. Journal of Machine Learning Research.
BibTeX Format
@article{paper521,
  title   = {Characterizing Dynamical Stability of Stochastic Gradient Descent in Overparameterized Learning},
  author  = {Dennis Chemnitz and Maximilian Engel},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/24-1547.html}
}