
Stabilizing Sharpness-Aware Minimization Through A Simple Renormalization Strategy

Authors
Chengli Tan, Jiangshe Zhang, Junmin Liu, Yicheng Wang, Yunda Hao
Paper Information
  • Journal:
    Journal of Machine Learning Research
Abstract

Recently, sharpness-aware minimization (SAM) has attracted much attention because of its surprising effectiveness in improving generalization performance. However, compared to stochastic gradient descent (SGD), it is more prone to getting stuck at saddle points, which may result in performance degradation. To address this issue, we propose a simple renormalization strategy, dubbed Stable SAM (SSAM), which rescales the gradient of the descent step so that its norm matches that of the ascent step. Our strategy is easy to implement and flexible enough to integrate with SAM and its variants, at almost no additional computational cost. Using elementary tools from convex optimization and learning theory, we also conduct a theoretical analysis of sharpness-aware training, revealing that, compared to SGD, the effectiveness of SAM is assured only in a limited learning-rate regime. We then show that SSAM extends this learning-rate regime, so that with this minor modification it can consistently outperform SAM. Finally, we demonstrate the improved performance of SSAM on several representative data sets and tasks.
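The renormalization described in the abstract amounts to a one-line change to the SAM update: after computing the gradient at the perturbed point, rescale it so its norm equals that of the ascent-step gradient. Below is a minimal sketch of this idea in plain NumPy, assuming a toy quadratic objective; the function names (loss, grad, ssam_step) and the hyperparameter values are illustrative and not taken from the paper.

import numpy as np

def loss(w):
    # Toy quadratic objective, for illustration only.
    return 0.5 * np.dot(w, w)

def grad(w):
    # Analytic gradient of the toy objective.
    return w

def ssam_step(w, lr=0.1, rho=0.05, eps=1e-12):
    g = grad(w)                            # gradient at the current point (ascent step)
    g_norm = np.linalg.norm(g)
    w_adv = w + rho * g / (g_norm + eps)   # SAM ascent: perturb toward higher loss
    g_adv = grad(w_adv)                    # gradient at the perturbed point (descent step)
    # SSAM renormalization: rescale the descent gradient so its norm
    # matches that of the ascent gradient.
    g_adv = g_adv * g_norm / (np.linalg.norm(g_adv) + eps)
    return w - lr * g_adv                  # descent update

w = np.array([1.0, -2.0])
for _ in range(10):
    w = ssam_step(w)
print(loss(w))

Integrating the same rescaling into an existing SAM implementation would only require multiplying the descent gradient by the ratio of the two gradient norms before the weight update, which is why the abstract describes the strategy as coming at almost no additional computational cost.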

Citation Information
APA Format
Tan, C., Zhang, J., Liu, J., Wang, Y., & Hao, Y. (2025). Stabilizing Sharpness-Aware Minimization Through A Simple Renormalization Strategy. Journal of Machine Learning Research, 26(68), 1–35. http://jmlr.org/papers/v26/24-0065.html
BibTeX Format
@article{JMLR:v26:24-0065,
  author  = {Chengli Tan and Jiangshe Zhang and Junmin Liu and Yicheng Wang and Yunda Hao},
  title   = {Stabilizing Sharpness-Aware Minimization Through A Simple Renormalization Strategy},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {68},
  pages   = {1--35},
  url     = {http://jmlr.org/papers/v26/24-0065.html}
}