JMLR

Towards Understanding Gradient Flow Dynamics of Homogeneous Neural Networks Beyond the Origin

Authors
Akshay Kumar, Jarvis Haupt
Research Topics
Machine Learning
Paper Information
  • Journal:
    Journal of Machine Learning Research
Abstract

Recent works on the training dynamics of homogeneous neural network weights under gradient flow with small initialization have established that, in the early stages of training, the weights remain small and near the origin but converge in direction. Building on this, this paper studies the gradient flow dynamics of homogeneous neural networks with locally Lipschitz gradients after they escape the origin. Insights from this analysis are used to characterize the first saddle point encountered by gradient flow after escaping the origin. It is also shown that, for homogeneous feed-forward neural networks, under certain conditions the sparsity structure that emerges among the weights before the escape is preserved after escaping the origin and until the next saddle point is reached.
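The phenomenon the abstract describes can be illustrated on a toy model. Below is a minimal sketch (my own hypothetical example, not from the paper): Euler-discretized gradient flow on the 2-homogeneous scalar model f(u, v) = u·v with loss L(u, v) = ½(u·v − 1)², started from a small initialization. The weight norm stays near the origin for an initial phase, then escapes and the iterates converge to a minimizer.

```python
import math

def gradient_flow_uv(eps=1e-3, lr=1e-2, steps=3000):
    """Euler discretization of gradient flow on L(u, v) = 0.5 * (u*v - 1)^2,
    a toy 2-homogeneous model. With small init eps, the weights grow roughly
    like eps * exp(t) while near the origin, so escape takes ~log(1/eps) time.
    Returns the final weights and the trajectory of weight norms."""
    u = v = eps
    norms = []
    for _ in range(steps):
        r = u * v - 1.0            # residual of the fit
        gu, gv = v * r, u * r      # gradients of L w.r.t. u and v
        u, v = u - lr * gu, v - lr * gv
        norms.append(math.hypot(u, v))
    return u, v, norms
```

Running this with eps = 1e-3 shows the norm still on the order of the initialization after 100 steps, while by the end the product u·v has converged to 1: the long plateau near the origin followed by a sharp escape is the small-initialization regime the paper's analysis picks up after.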

Citation Information
APA Format
Akshay Kumar & Jarvis Haupt. Towards Understanding Gradient Flow Dynamics of Homogeneous Neural Networks Beyond the Origin. Journal of Machine Learning Research.
BibTeX Format
@article{paper661,
  title = {Towards Understanding Gradient Flow Dynamics of Homogeneous Neural Networks Beyond the Origin},
  author = {Akshay Kumar and Jarvis Haupt},
  journal = {Journal of Machine Learning Research},
  url = {https://www.jmlr.org/papers/v26/25-1089.html}
}