Efficiently Escaping Saddle Points in Bilevel Optimization

Authors
Minhui Huang, Xuxing Chen, Kaiyi Ji, Shiqian Ma, Lifeng Lai
Research Topics
Computational Statistics
Paper Information
  • Journal: Journal of Machine Learning Research
Abstract

Bilevel optimization is one of the fundamental problems in machine learning and optimization. Recent theoretical developments in bilevel optimization have focused on finding first-order stationary points in the nonconvex-strongly-convex setting. In this paper, we analyze algorithms that can escape saddle points in nonconvex-strongly-convex bilevel optimization. Specifically, we show that the perturbed approximate implicit differentiation (AID) method with a warm start strategy finds an $\epsilon$-approximate local minimum of bilevel optimization in $\tilde{O}(\epsilon^{-2})$ iterations with high probability. Moreover, we propose the inexact NEgative-curvature-Originated-from-Noise algorithm (iNEON), which can escape saddle points and find a local minimum of stochastic bilevel optimization. As a by-product, we provide the first nonasymptotic analysis of the perturbed multi-step gradient descent ascent (GDmax) algorithm, showing that it converges to a local minimax point of minimax problems.
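For background, the nonconvex-strongly-convex bilevel problem has the standard form (this formulation and the implicit-differentiation hypergradient below are standard background material, not text from the paper):

$$\min_{x \in \mathbb{R}^{d_1}} \; \Phi(x) := f\bigl(x, y^*(x)\bigr) \quad \text{s.t.} \quad y^*(x) = \arg\min_{y \in \mathbb{R}^{d_2}} g(x, y),$$

where $g(x, \cdot)$ is strongly convex, so that

$$\nabla \Phi(x) = \nabla_x f\bigl(x, y^*(x)\bigr) - \nabla^2_{xy} g\bigl(x, y^*(x)\bigr) \bigl[\nabla^2_{yy} g\bigl(x, y^*(x)\bigr)\bigr]^{-1} \nabla_y f\bigl(x, y^*(x)\bigr).$$

AID estimates $y^*(x)$ with a few inner gradient steps (warm-started from the previous outer iterate) and solves the linear system above inexactly; the perturbed variant injects a small random perturbation whenever the hypergradient is small, so that strict saddle points of $\Phi$ are escaped with high probability. The following minimal Python sketch illustrates this mechanism on a toy problem; the function names, oracle interface, step sizes, and explicit linear solve are illustrative assumptions, not the authors' implementation.

import numpy as np

def inner_gd(x, y, grad_g_y, steps=50, lr=0.1):
    # Approximate y*(x) by gradient descent on the strongly convex
    # lower-level problem, warm-started from the previous solution y.
    for _ in range(steps):
        y = y - lr * grad_g_y(x, y)
    return y

def aid_hypergrad(x, y, o):
    # grad_x f - (grad_xy^2 g) [grad_yy^2 g]^{-1} grad_y f; an explicit
    # solve stands in for the inexact solve (e.g., conjugate gradient).
    v = np.linalg.solve(o["hess_g_yy"](x, y), o["grad_f_y"](x, y))
    return o["grad_f_x"](x, y) - o["jac_g_xy"](x, y) @ v

def perturbed_aid(x, y, o, T=100, lr=0.05, eps=1e-3, radius=1e-2, seed=0):
    rng = np.random.default_rng(seed)
    for _ in range(T):
        y = inner_gd(x, y, o["grad_g_y"])           # warm-started inner loop
        g = aid_hypergrad(x, y, o)
        if np.linalg.norm(g) <= eps:                # candidate saddle point:
            u = rng.normal(size=x.shape)            # step in a random
            x = x + radius * u / np.linalg.norm(u)  # direction of norm radius
        else:
            x = x - lr * g                          # hypergradient step
    return x, y

# Toy instance: g(x, y) = 0.5*||y - x||^2 gives y*(x) = x, and
# f(x, y) = 0.5*(y[0]**2 - y[1]**2) makes Phi(x) = 0.5*(x[0]**2 - x[1]**2),
# which has a strict saddle at x = 0 that the perturbation escapes.
o = {"grad_g_y":  lambda x, y: y - x,
     "hess_g_yy": lambda x, y: np.eye(2),
     "jac_g_xy":  lambda x, y: -np.eye(2),         # cross derivative of g
     "grad_f_x":  lambda x, y: np.zeros(2),
     "grad_f_y":  lambda x, y: np.array([y[0], -y[1]])}
x, y = perturbed_aid(np.zeros(2), np.zeros(2), o)
print(x)  # the second coordinate has moved away from the saddle at 0

The warm start matters here: reusing the previous inner solution keeps the number of inner gradient steps per outer iteration small, which is what makes the overall iteration complexity bound of the paper achievable.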

Citation Information
APA Format
Huang, M., Chen, X., Ji, K., Ma, S., & Lai, L. (2025). Efficiently escaping saddle points in bilevel optimization. Journal of Machine Learning Research, 26(1), 1--61. http://jmlr.org/papers/v26/22-0136.html
BibTeX Format
@article{JMLR:v26:22-0136,
  author  = {Minhui Huang and Xuxing Chen and Kaiyi Ji and Shiqian Ma and Lifeng Lai},
  title   = {Efficiently Escaping Saddle Points in Bilevel Optimization},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {1},
  pages   = {1--61},
  url     = {http://jmlr.org/papers/v26/22-0136.html}
}