
Frequentist Guarantees of Distributed (Non)-Bayesian Inference

Authors
Bohan Wu, César A. Uribe
Research Topics
Bayesian Statistics
Paper Information
  • Journal: Journal of Machine Learning Research
  • Added to Tracker: Sep 08, 2025
Abstract

We establish frequentist properties, i.e., posterior consistency, asymptotic normality, and posterior contraction rates, for the distributed (non)-Bayesian inference problem for a set of agents connected over a network. These results are motivated by the need to analyze large, decentralized datasets, where distributed (non)-Bayesian inference has become a critical research area across multiple fields, including statistics, machine learning, and economics. Our results show that, under appropriate assumptions on the communication graph, distributed (non)-Bayesian inference retains parametric efficiency while enhancing robustness in uncertainty quantification. We also explore the trade-off between statistical efficiency and communication efficiency by examining how the design and size of the communication graph impact the posterior contraction rate. Furthermore, we extend our analysis to time-varying graphs and apply our results to exponential family models, distributed logistic regression, and decentralized detection models.
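
For concreteness, the sketch below illustrates the standard distributed (non)-Bayesian "social learning" belief update commonly studied in this line of work: each agent geometrically averages its neighbors' beliefs according to a mixing matrix over the communication graph and then performs a local Bayesian update with its own observation. This is a minimal illustrative sketch, not the paper's exact algorithm; the Gaussian likelihood, ring graph, parameter grid, and variable names are assumptions made for the example.

    # Minimal sketch (assumed setup, not taken from the paper): distributed
    # non-Bayesian belief updates over a ring communication graph.
    import numpy as np

    rng = np.random.default_rng(0)

    n_agents = 5
    thetas = np.linspace(-2.0, 2.0, 21)   # finite grid of candidate parameters
    theta_star = 0.4                      # parameter generating the agents' data

    # Doubly stochastic mixing matrix for a ring communication graph.
    A = np.zeros((n_agents, n_agents))
    for i in range(n_agents):
        A[i, i] = 0.5
        A[i, (i - 1) % n_agents] = 0.25
        A[i, (i + 1) % n_agents] = 0.25

    def log_lik(x, theta, sigma=1.0):
        # Gaussian log-likelihood of one observation x under mean theta.
        return -0.5 * ((x - theta) / sigma) ** 2

    # Uniform initial beliefs for every agent, stored in log space.
    log_beliefs = np.full((n_agents, thetas.size), -np.log(thetas.size))

    for t in range(500):
        # 1) Consensus step: geometric averaging of neighbors' beliefs,
        #    i.e., arithmetic averaging of log-beliefs with weights A[i, j].
        log_beliefs = A @ log_beliefs
        # 2) Local Bayesian update with each agent's private observation.
        x = theta_star + rng.normal(size=n_agents)
        log_beliefs += log_lik(x[:, None], thetas[None, :])
        # Renormalize each agent's belief over the parameter grid.
        log_beliefs -= np.logaddexp.reduce(log_beliefs, axis=1, keepdims=True)

    # As t grows, every agent's belief concentrates near theta_star.
    print("posterior modes:", thetas[np.argmax(log_beliefs, axis=1)])

Under this kind of update, the paper's frequentist guarantees concern how fast each agent's belief contracts around the true parameter and how that rate depends on the mixing matrix, i.e., on the design and size of the communication graph.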

Citation Information
APA Format
Wu, B., & Uribe, C. A. Frequentist Guarantees of Distributed (Non)-Bayesian Inference. Journal of Machine Learning Research.
BibTeX Format
@article{paper487,
  title   = {Frequentist Guarantees of Distributed (Non)-Bayesian Inference},
  author  = {Bohan Wu and César A. Uribe},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/23-1504.html}
}