
Distributed Stochastic Bilevel Optimization: Improved Complexity and Heterogeneity Analysis

Authors
Youcheng Niu, Jinming Xu, Ying Sun, Yan Huang, Li Chai
Research Topics
Computational Statistics
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

This paper considers solving a class of nonconvex-strongly-convex distributed stochastic bilevel optimization (DSBO) problems with personalized inner-level objectives. Most existing algorithms require computational loops for hypergradient estimation, leading to computational inefficiency. Moreover, the impact of data heterogeneity on convergence in bilevel problems has not yet been explicitly characterized. To address these issues, we propose LoPA, a loopless personalized distributed algorithm that leverages a tracking mechanism to iteratively approximate inner-level solutions and Hessian-inverse matrices without relying on extra computation loops. Our theoretical analysis explicitly characterizes the heterogeneity across nodes (denoted by $b$) and establishes a sublinear rate of $\mathcal{O}\!\left( \frac{1}{(1-\rho)K} + \frac{(b/\sqrt{m})^{2/3}}{(1-\rho)^{2/3} K^{2/3}} + \frac{1}{\sqrt{K}}\left( \sigma_{\mathrm{p}} + \frac{\sigma_{\mathrm{c}}}{\sqrt{m}} \right) \right)$ without requiring bounded local hypergradients, where $\sigma_{\mathrm{p}}$ and $\sigma_{\mathrm{c}}$ denote the gradient sampling variances associated with the inner- and outer-level variables, respectively. We also integrate LoPA with a gradient tracking scheme to eliminate the impact of data heterogeneity, yielding an improved rate of $\mathcal{O}\!\left( \frac{1}{(1-\rho)^2 K} + \frac{1}{\sqrt{K}}\left( \sigma_{\mathrm{p}} + \frac{\sigma_{\mathrm{c}}}{\sqrt{m}} \right) \right)$. The computational complexity of LoPA is $\mathcal{O}(\epsilon^{-2})$ to reach an $\epsilon$-stationary point, matching the communication complexity thanks to the loopless structure, which outperforms existing counterparts for DSBO. Numerical experiments validate the effectiveness of the proposed algorithm.
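To make the loopless idea concrete, the sketch below shows, under illustrative assumptions, how a single node could refresh its inner-solution estimate and a Hessian-inverse-vector product with one cheap update each per iteration and plug them into a hypergradient step. It is a toy quadratic Python example, not the authors' LoPA algorithm and not a distributed implementation; all problem data, step sizes, and helper names are assumptions made for illustration.

# Minimal single-node sketch of a loopless bilevel update (a generic
# illustration, NOT the authors' LoPA and not distributed): the inner
# solution and a Hessian-inverse-vector product are each refreshed with one
# update per iteration instead of dedicated inner loops. All problem data,
# step sizes, and helper names below are illustrative assumptions, and the
# toy objectives are quadratic rather than nonconvex.
import numpy as np

rng = np.random.default_rng(0)
d = 5  # dimension of the outer variable x and the inner variable y

# Toy strongly convex inner objective g(x, y) = 0.5*y'Ay - y'Bx and
# outer objective f(x, y) = 0.5*||y - c||^2.
A = 2.0 * np.eye(d)
B = 0.1 * rng.standard_normal((d, d))
c = rng.standard_normal(d)

def grad_y_g(x, y):  return A @ y - B @ x      # gradient of g in y
def hess_yy_g(x, y): return A                  # Hessian of g in y
def jac_xy_g(x, y):  return -B.T               # mixed second derivative of g
def grad_y_f(x, y):  return y - c              # gradient of f in y
def grad_x_f(x, y):  return np.zeros(d)        # gradient of f in x

x = np.zeros(d)
y = np.zeros(d)   # running estimate of the inner solution y*(x)
z = np.zeros(d)   # running estimate of [hess_yy_g]^{-1} grad_y_f
alpha, beta, gamma = 0.1, 0.1, 0.05  # inner, tracking, and outer step sizes

for k in range(2000):
    # One inner-level gradient step per outer iteration (no inner loop).
    y = y - alpha * grad_y_g(x, y)
    # One tracking step toward the Hessian-inverse-vector product, i.e.
    # toward the fixed point of z = z - beta*(hess_yy_g @ z - grad_y_f).
    z = z - beta * (hess_yy_g(x, y) @ z - grad_y_f(x, y))
    # Hypergradient estimate: grad_x_f - jac_xy_g @ z, then one outer step.
    hypergradient = grad_x_f(x, y) - jac_xy_g(x, y) @ z
    x = x - gamma * hypergradient

print("outer iterate after", k + 1, "iterations:", np.round(x, 3))

In a distributed setting, nodes would additionally exchange these quantities over the network (e.g., with the gradient tracking scheme mentioned in the abstract for the improved variant), which this single-node sketch deliberately omits.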

Author Details
Youcheng Niu
Author
Jinming Xu
Author
Ying Sun
Author
Yan Huang
Author
Li Chai
Author
Research Topics & Keywords
Computational Statistics
Research Area
Citation Information
APA Format
Niu, Y., Xu, J., Sun, Y., Huang, Y., & Chai, L. (2025). Distributed stochastic bilevel optimization: Improved complexity and heterogeneity analysis. Journal of Machine Learning Research, 26(76), 1–58.
BibTeX Format
@article{JMLR:v26:24-0187,
  author  = {Youcheng Niu and Jinming Xu and Ying Sun and Yan Huang and Li Chai},
  title   = {Distributed Stochastic Bilevel Optimization: Improved Complexity and Heterogeneity Analysis},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {76},
  pages   = {1--58},
  url     = {http://jmlr.org/papers/v26/24-0187.html}
}