
How good is your Laplace approximation of the Bayesian posterior? Finite-sample computable error bounds for a variety of useful divergences

Authors
Mikołaj J. Kasprzak, Ryan Giordano, Tamara Broderick
Research Topics
Bayesian Statistics
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

The Laplace approximation is a popular method for constructing a Gaussian approximation to the Bayesian posterior and thereby approximating the posterior mean and variance. But approximation quality is a concern. One might consider using rate-of-convergence bounds from certain versions of the Bayesian Central Limit Theorem (BCLT) to provide quality guarantees. But existing bounds require assumptions that are unrealistic even for relatively simple real-life Bayesian analyses; more specifically, existing bounds either (1) require knowing the true data-generating parameter, (2) are asymptotic in the number of samples, (3) do not control the Bayesian posterior mean, or (4) can be computed only for strongly log-concave models. In this work, we provide the first computable bounds on quality that simultaneously (1) do not require knowing the true parameter, (2) apply to finite samples, (3) control posterior means and variances, and (4) apply generally to models that satisfy the conditions of the asymptotic BCLT. Moreover, we substantially improve the dimension dependence of existing bounds; in fact, we achieve the lowest-order dimension dependence possible in the general case. We compute exact constants in our bounds for a variety of standard models, including logistic regression, and numerically demonstrate their utility. We provide a framework for the analysis of more complex models.
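To make the approximation under discussion concrete, here is a minimal sketch of the standard Laplace construction for Bayesian logistic regression, one of the models the paper treats. It illustrates the technique itself, not the paper's error bounds; the synthetic data, the Gaussian prior scale, and all variable names are assumptions introduced for the example.

# Minimal illustrative sketch: Laplace approximation for Bayesian
# logistic regression. Data, prior variance, and names are assumptions,
# not taken from the paper.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
n, d = 500, 3
X = rng.normal(size=(n, d))
theta_true = np.array([1.0, -2.0, 0.5])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-X @ theta_true)))

def neg_log_posterior(theta, prior_var=10.0):
    # Negative unnormalized log posterior: logistic log-likelihood
    # plus an isotropic Gaussian prior N(0, prior_var * I).
    logits = X @ theta
    log_lik = y @ logits - np.sum(np.logaddexp(0.0, logits))
    log_prior = -0.5 * theta @ theta / prior_var
    return -(log_lik + log_prior)

def hessian(theta, prior_var=10.0):
    # Exact Hessian of the negative log posterior for this model.
    p = 1.0 / (1.0 + np.exp(-(X @ theta)))
    w = p * (1.0 - p)
    return X.T @ (w[:, None] * X) + np.eye(d) / prior_var

# MAP estimate: the posterior mode, found by numerical optimization.
res = minimize(neg_log_posterior, np.zeros(d), method="BFGS")
theta_map = res.x

# Laplace approximation: N(theta_map, H^{-1}), with H the Hessian of
# the negative log posterior at the mode.
cov_laplace = np.linalg.inv(hessian(theta_map))
print("Approximate posterior mean:", theta_map)
print("Approximate posterior variances:", np.diag(cov_laplace))

The paper's contribution is a set of finite-sample, computable bounds on how far this Gaussian's mean and covariance can be from the true posterior's; the sketch above only produces the approximation whose quality those bounds certify.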

Citation Information
APA Format
Kasprzak, M. J., Giordano, R., & Broderick, T. (2025). How good is your Laplace approximation of the Bayesian posterior? Finite-sample computable error bounds for a variety of useful divergences. Journal of Machine Learning Research, 26(87), 1–81. http://jmlr.org/papers/v26/24-0619.html
BibTeX Format
@article{JMLR:v26:24-0619,
  author  = {Miko{\l}aj J. Kasprzak and Ryan Giordano and Tamara Broderick},
  title   = {How good is your Laplace approximation of the Bayesian posterior? Finite-sample computable error bounds for a variety of useful divergences},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {87},
  pages   = {1--81},
  url     = {http://jmlr.org/papers/v26/24-0619.html}
}