
Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity

Authors
Jie Peng, Qing Ling, Qiankun Shi, Kun Yuan, Xiao Wang
Research Topics
Computational Statistics
Paper Information
  • Journal: Journal of Machine Learning Research
  • Added to Tracker: Dec 30, 2025
Abstract

In this paper, we establish tight lower bounds for Byzantine-robust distributed first-order stochastic methods in both strongly convex and non-convex stochastic optimization. We reveal that when the distributed nodes have heterogeneous data, the convergence error comprises two components: a non-vanishing Byzantine error and a vanishing optimization error. We establish the lower bounds on the Byzantine error and on the minimum number of queries to a stochastic gradient oracle for achieving an arbitrarily small optimization error. Nevertheless, we also identify significant discrepancies between our established lower bounds and the existing upper bounds. To fill this gap, we leverage the techniques of Nesterov's acceleration and variance reduction to develop novel Byzantine-robust distributed stochastic optimization methods that provably match these lower bounds, up to at most logarithmic factors, implying that our established lower bounds are tight.
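The abstract concerns Byzantine-robust distributed stochastic optimization, where a server aggregates stochastic gradients from workers, some of which may send arbitrary (adversarial) vectors, and honest workers hold heterogeneous data. The sketch below is a minimal, generic illustration of that setting using coordinate-wise median aggregation on synthetic heterogeneous quadratics; it is not the paper's accelerated, variance-reduced method, and every name and constant in it (stochastic_gradient, local_targets, step counts, noise levels) is an illustrative assumption.

```python
# Generic sketch: Byzantine-robust distributed SGD with coordinate-wise
# median aggregation. NOT the paper's method; illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
dim, n_nodes, n_byzantine, n_steps, lr = 10, 20, 4, 200, 0.1

# Heterogeneous local objectives: f_i(x) = 0.5 * ||x - b_i||^2 with a
# distinct minimizer b_i per node (models data heterogeneity).
local_targets = rng.normal(size=(n_nodes, dim))

def stochastic_gradient(x, node):
    """One noisy gradient-oracle query for the given node's local objective."""
    return (x - local_targets[node]) + 0.1 * rng.normal(size=dim)

x = np.zeros(dim)
for _ in range(n_steps):
    msgs = []
    for i in range(n_nodes):
        if i < n_byzantine:
            # Byzantine nodes may send arbitrary vectors.
            msgs.append(10.0 * rng.normal(size=dim))
        else:
            msgs.append(stochastic_gradient(x, i))
    # Robust aggregation: coordinate-wise median instead of the mean.
    agg = np.median(np.stack(msgs), axis=0)
    x -= lr * agg

# Distance to the minimizer of the honest nodes' average objective.
x_star = local_targets[n_byzantine:].mean(axis=0)
print("distance to honest optimum:", np.linalg.norm(x - x_star))
```

Because the honest nodes' minimizers differ, no robust aggregator can drive this distance all the way to zero; that residual corresponds to the non-vanishing Byzantine error the abstract describes, while the remaining part is the vanishing optimization error.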

Citation Information
APA Format
Jie Peng, Qing Ling, Qiankun Shi, Kun Yuan & Xiao Wang. Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity. Journal of Machine Learning Research.
BibTeX Format
@article{paper662,
  title   = {Optimal Complexity in Byzantine-Robust Distributed Stochastic Optimization with Data Heterogeneity},
  author  = {Jie Peng and Qing Ling and Qiankun Shi and Kun Yuan and Xiao Wang},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/25-0613.html}
}