Bayes Meets Bernstein at the Meta Level: an Analysis of Fast Rates in Meta-Learning with PAC-Bayes
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Jul 15, 2025
Abstract
Bernstein's condition is a key assumption that guarantees fast rates in machine learning. For example, under this condition, the Gibbs posterior with prior $\pi$ has an excess risk in $O(d_{\pi}/n)$, as opposed to $O(\sqrt{d_{\pi}/n})$ in the general case, where $n$ denotes the number of observations and $d_{\pi}$ is a complexity parameter which depends on the prior $\pi$. In this paper, we examine the Gibbs posterior in the context of meta-learning, i.e., when learning the prior $\pi$ from $T$ previous tasks. Our main result is that Bernstein's condition always holds at the meta level, regardless of its validity at the observation level. This implies that the additional cost to learn the Gibbs prior $\pi$, which will reduce the term $d_\pi$ across tasks, is in $O(1/T)$, instead of the expected $O(1/\sqrt{T})$. We further illustrate how this result improves on the standard rates in three different settings: discrete priors, Gaussian priors, and mixtures of Gaussian priors.
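For readers skimming the abstract, the following is a minimal LaTeX sketch of the objects behind these rates, written in standard PAC-Bayes notation; the loss $\ell$, risk $R$, empirical risk $r_n$, inverse temperature $\alpha$, and minimizer $\theta^*$ are conventional assumptions, not notation taken verbatim from the paper.

% A sketch, not the paper's own statement: the loss \ell, risk R,
% empirical risk r_n, inverse temperature \alpha, and risk minimizer
% \theta^* below are standard PAC-Bayes conventions assumed here.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
Given $n$ i.i.d.\ observations $Z_1, \dots, Z_n$, a loss $\ell$, the risk
$R(\theta) = \mathbb{E}[\ell(\theta, Z)]$, and the empirical risk
$r_n(\theta) = \frac{1}{n} \sum_{i=1}^{n} \ell(\theta, Z_i)$, the Gibbs
posterior with prior $\pi$ and inverse temperature $\alpha > 0$ is
\[
  \hat{\rho}_n(\mathrm{d}\theta) \propto
  \exp\bigl( -\alpha n \, r_n(\theta) \bigr) \, \pi(\mathrm{d}\theta).
\]
Bernstein's condition requires a constant $c > 0$ such that, for all $\theta$,
\[
  \mathbb{E}\bigl[ \bigl( \ell(\theta, Z) - \ell(\theta^*, Z) \bigr)^2 \bigr]
  \le c \, \bigl( R(\theta) - R(\theta^*) \bigr),
\]
where $\theta^*$ minimizes $R$. Under this condition the Gibbs posterior
attains excess risk $O(d_\pi / n)$; without it, only $O(\sqrt{d_\pi / n})$
is guaranteed in general.
\end{document}

The paper's contribution, per the abstract, is that the analogous condition holds automatically at the meta level: when the prior $\pi$ itself is learned from $T$ previous tasks (e.g., within the discrete, Gaussian, or Gaussian-mixture families mentioned above), the additional estimation cost is $O(1/T)$ rather than $O(1/\sqrt{T})$.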
Author Details
Charles Riou
Pierre Alquier
Badr-Eddine Chérief-Abdellatif
Research Topics & Keywords
Bayesian Statistics
Citation Information
APA Format
Riou, C., Alquier, P., & Chérief-Abdellatif, B.-E. (2025). Bayes meets Bernstein at the meta level: An analysis of fast rates in meta-learning with PAC-Bayes. Journal of Machine Learning Research, 26(2), 1–60.
BibTeX Format
@article{JMLR:v26:23-025,
author = {Charles Riou and Pierre Alquier and Badr-Eddine Ch{\'e}rief-Abdellatif},
title = {Bayes Meets Bernstein at the Meta Level: an Analysis of Fast Rates in Meta-Learning with PAC-Bayes},
journal = {Journal of Machine Learning Research},
year = {2025},
volume = {26},
number = {2},
pages = {1--60},
url = {http://jmlr.org/papers/v26/23-025.html}
}