
Posterior Concentrations of Fully-Connected Bayesian Neural Networks with General Priors on the Weights

Authors
Insung Kong, Yongdai Kim
Research Topics
Machine Learning, Bayesian Statistics
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

Bayesian approaches to training deep neural networks (BNNs) have received significant interest and have been effectively utilized in a wide range of applications. Several studies have examined the posterior concentration properties of BNNs. However, most of these studies focus solely on BNN models with sparse or heavy-tailed priors. Surprisingly, there are currently no theoretical results for BNNs with Gaussian priors, which are the priors most commonly used in practice. This gap arises from the absence of approximation results for Deep Neural Networks (DNNs) that are non-sparse and have bounded parameters. In this paper, we present a new approximation theory for non-sparse DNNs with bounded parameters. Building on this approximation theory, we show that BNNs with non-sparse general priors can achieve near-minimax optimal posterior concentration rates around the true model.
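As context for the abstract (a minimal sketch, not the paper's construction), the code below writes down the unnormalized log posterior of a fully-connected regression BNN with i.i.d. Gaussian priors on every weight and bias, i.e. the non-sparse Gaussian-prior setting whose concentration behavior the paper analyzes. The layer widths, prior scale sigma_w, and noise level sigma_noise are illustrative assumptions.

import numpy as np

def init_params(widths, rng):
    """Draw weights and biases from the i.i.d. N(0, 1) Gaussian prior."""
    return [(rng.normal(0.0, 1.0, (m, n)), rng.normal(0.0, 1.0, n))
            for m, n in zip(widths[:-1], widths[1:])]

def forward(params, x):
    """Fully-connected network with ReLU activations, linear output layer."""
    h = x
    for W, b in params[:-1]:
        h = np.maximum(h @ W + b, 0.0)
    W, b = params[-1]
    return h @ W + b

def log_posterior(params, x, y, sigma_w=1.0, sigma_noise=0.1):
    """Gaussian log prior on every parameter plus Gaussian log likelihood
    (unnormalized; constants independent of the parameters are dropped)."""
    log_prior = sum(-0.5 * (np.sum(W**2) + np.sum(b**2)) / sigma_w**2
                    for W, b in params)
    resid = y - forward(params, x)
    log_lik = -0.5 * np.sum(resid**2) / sigma_noise**2
    return log_prior + log_lik

# Illustrative usage on a toy 1-D regression problem.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, (50, 1))
y = np.sin(3 * x) + 0.1 * rng.normal(size=(50, 1))
params = init_params([1, 16, 16, 1], rng)
print(log_posterior(params, x, y))

Posterior concentration then concerns how quickly the posterior defined by such a log density contracts around the true regression function as the sample size grows; the paper's contribution is to establish near-minimax optimal rates for this non-sparse setting.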

Citation Information
APA Format
Kong, I., & Kim, Y. (2025). Posterior concentrations of fully-connected Bayesian neural networks with general priors on the weights. Journal of Machine Learning Research, 26(94), 1–60. http://jmlr.org/papers/v26/24-0425.html
BibTeX Format
@article{JMLR:v26:24-0425,
  author  = {Insung Kong and Yongdai Kim},
  title   = {Posterior Concentrations of Fully-Connected Bayesian Neural Networks with General Priors on the Weights},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {94},
  pages   = {1--60},
  url     = {http://jmlr.org/papers/v26/24-0425.html}
}