Online Bernstein-von Mises theorem
Paper Information
- Journal: Journal of Machine Learning Research
- Added to Tracker: Mar 03, 2026
Abstract
Online learning is an inferential paradigm in which parameters are updated incrementally from sequentially available data, in contrast to batch learning, where the entire dataset is processed at once. In this paper, we assume that mini-batches from the full dataset become available sequentially. The Bayesian framework, which updates beliefs about unknown parameters after observing each mini-batch, is naturally suited for online learning. At each step, we update the posterior distribution using the current prior and new observations, with the updated posterior serving as the prior for the next step. However, this recursive Bayesian updating is rarely computationally tractable unless the model and prior are conjugate. When the model is regular, the updated posterior can be approximated by a normal distribution, as justified by the Bernstein-von Mises theorem. We adopt a variational approximation at each step and investigate the frequentist properties of the final posterior obtained through this sequential procedure. Under mild assumptions, we show that the accumulated approximation error becomes negligible once the mini-batch size exceeds a threshold depending on the parameter dimension. As a result, the sequentially updated posterior is asymptotically indistinguishable from the full posterior.
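The recursive procedure in the abstract — a Gaussian prior updated mini-batch by mini-batch, with each approximate posterior serving as the next prior — can be sketched in code. The snippet below is a minimal illustration, not the paper's method: it uses a Laplace (Gaussian) approximation for a logistic regression model, a standard non-conjugate example where exact recursive Bayesian updating is intractable. All function names and parameters here are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def laplace_step(m, S, X, y, n_newton=30):
    """One online update: combine the Gaussian prior N(m, S) with a
    logistic-regression mini-batch (X, y) and return a Gaussian
    (Laplace) approximation of the updated posterior, which then
    serves as the prior for the next mini-batch."""
    P = np.linalg.inv(S)               # prior precision
    w = m.copy()
    for _ in range(n_newton):          # Newton's method for the MAP
        p = sigmoid(X @ w)
        grad = X.T @ (y - p) - P @ (w - m)
        H = (X.T * (p * (1.0 - p))) @ X + P   # negative Hessian of log-posterior
        w = w + np.linalg.solve(H, grad)
    # Posterior covariance = inverse negative Hessian at the MAP
    p = sigmoid(X @ w)
    H = (X.T * (p * (1.0 - p))) @ X + P
    return w, np.linalg.inv(H)
```

Streaming mini-batches through `laplace_step` yields a final Gaussian whose mean and covariance track the full-data posterior; the theorem's content is a quantitative guarantee that the accumulated approximation error from these per-step Gaussian replacements vanishes once the mini-batch size is large enough relative to the parameter dimension.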
Author Details
- Jeyong Lee
- Minwoo Chae
- Junhyeok Choi
Citation Information
APA Format
Lee, J., Chae, M., & Choi, J. Online Bernstein-von Mises theorem. Journal of Machine Learning Research.
BibTeX Format
@article{paper965,
  title = {Online Bernstein-von Mises theorem},
  author = {Jeyong Lee and Minwoo Chae and Junhyeok Choi},
  journal = {Journal of Machine Learning Research},
  url = {https://www.jmlr.org/papers/v27/25-0989.html}
}