Efficiently Escaping Saddle Points in Bilevel Optimization
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Jul 30, 2025
Abstract
Bilevel optimization is one of the fundamental problems in machine learning and optimization. Recent theoretical developments in bilevel optimization focus on finding first-order stationary points for nonconvex-strongly-convex cases. In this paper, we analyze algorithms that can escape saddle points in nonconvex-strongly-convex bilevel optimization. Specifically, we show that perturbed approximate implicit differentiation (AID) with a warm-start strategy finds an $\epsilon$-approximate local minimum of bilevel optimization in $\tilde{O}(\epsilon^{-2})$ iterations with high probability. Moreover, we propose the inexact NEgative-curvature-Originated-from-Noise Algorithm (iNEON), an algorithm that can escape saddle points and find a local minimum of stochastic bilevel optimization. As a by-product, we provide the first nonasymptotic analysis of the perturbed multi-step gradient descent ascent (GDmax) algorithm, showing that it converges to a local minimax point for minimax problems.
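To make the escape mechanism described in the abstract concrete, below is a minimal, self-contained Python/NumPy sketch: AID-style hypergradient steps with a warm-started inner solver, plus a small random perturbation whenever the hypergradient is nearly zero. The toy problem instance, all step sizes and constants, and the helper names (grad_y_g, aid_hypergradient, perturbed_aid) are illustrative assumptions of ours, not the paper's actual algorithm, schedule, or code.

import numpy as np

rng = np.random.default_rng(0)

# Toy instance:  g(x, y) = 0.5 * y @ B @ y - x @ y   (strongly convex in y),
#                f(x, y) = 0.5 * y[0]**2 - 0.5 * y[1]**2 + 0.25 * y[1]**4,
# so y*(x) = B^{-1} x and Phi(x) = f(x, y*(x)) has a strict saddle at x = 0.
B = np.array([[2.0, 0.5], [0.5, 1.0]])  # = grad_yy g, symmetric positive definite

def grad_y_g(x, y):
    return B @ y - x

def grad_y_f(y):
    return np.array([y[0], -y[1] + y[1] ** 3])

def aid_hypergradient(y, beta=0.3, k=20):
    # AID idea: solve (grad_yy g) v = grad_y f with a few fixed-point
    # iterations instead of an exact matrix inverse.  For this instance
    # grad_x f = 0 and grad_xy g = -I, so the hypergradient estimate is v.
    b, v = grad_y_f(y), np.zeros_like(y)
    for _ in range(k):
        v -= beta * (B @ v - b)
    return v

def perturbed_aid(x, T=500, eta=0.1, beta=0.3, inner=10, eps=1e-3, radius=1e-2):
    y = np.zeros_like(x)          # inner variable, reused across outer steps
    t_noise = -np.inf             # iteration of the most recent perturbation
    for t in range(T):
        for _ in range(inner):    # a few warm-started inner gradient steps
            y = y - beta * grad_y_g(x, y)
        h = aid_hypergradient(y, beta)
        if np.linalg.norm(h) <= eps and t - t_noise > 25:
            x = x + radius * rng.standard_normal(x.shape)  # escape noise
            t_noise = t
        else:
            x = x - eta * h       # ordinary hypergradient descent step
    return x

print(perturbed_aid(np.zeros(2)))  # leaves the saddle at 0 for a local minimum

On this instance the iterates start exactly at the strict saddle x = 0, so only the injected noise can move them toward one of the two local minima; a faithful implementation would follow the paper's step sizes and perturbation schedule rather than the ad hoc constants used here.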
Author Details
Shiqian Ma
Minhui Huang
Xuxing Chen
Kaiyi Ji
Lifeng Lai
Research Topics & Keywords
Computational Statistics
Citation Information
APA Format
Shiqian Ma, Minhui Huang, Xuxing Chen, Kaiyi Ji, & Lifeng Lai. Efficiently Escaping Saddle Points in Bilevel Optimization. Journal of Machine Learning Research.
BibTeX Format
@article{paper319,
  title   = {Efficiently Escaping Saddle Points in Bilevel Optimization},
  author  = {Shiqian Ma and Minhui Huang and Xuxing Chen and Kaiyi Ji and Lifeng Lai},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/22-0136.html}
}