
On the Natural Gradient of the Evidence Lower Bound

Authors
Nihat Ay, Jesse van Oostrum, Adwait Datar
Paper Information
  • Journal:
    Journal of Machine Learning Research
Abstract

This article studies the Fisher-Rao gradient, also referred to as the natural gradient, of the evidence lower bound (ELBO), which plays a central role in generative machine learning. It reveals that the gap between the evidence and its lower bound, the ELBO, has an essentially vanishing natural gradient under unconstrained optimization. As a result, maximization of the ELBO is equivalent to minimization of the Kullback-Leibler divergence from a target distribution, the primary objective function of learning. Building on this insight, we derive a condition under which this equivalence persists even when optimization is constrained to a model. This condition yields a geometric characterization, which we formalize through the notion of a cylindrical model.
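For context, the equivalence described in the abstract rests on the standard evidence decomposition; the notation below is generic and not taken from the paper. For an observation $x$, latent variable $z$, model $p_\theta$, and variational distribution $q$:

% Generic ELBO decomposition; notation is illustrative, not the paper's.
\[
\log p_\theta(x)
  = \underbrace{\mathbb{E}_{q(z)}\!\left[\log \frac{p_\theta(x, z)}{q(z)}\right]}_{\mathrm{ELBO}(q,\,\theta)}
  + \underbrace{D_{\mathrm{KL}}\!\big(q(z)\,\big\|\,p_\theta(z \mid x)\big)}_{\text{evidence gap}}.
\]

% Natural (Fisher-Rao) gradient: the ordinary gradient rescaled by the
% inverse Fisher information matrix G(theta).
\[
\tilde{\nabla} F(\theta) = G(\theta)^{-1} \nabla F(\theta).
\]

The abstract's claim is that, in the unconstrained setting, the natural gradient of the gap term essentially vanishes, so following the natural gradient of the ELBO amounts to following that of the evidence itself, i.e., minimizing the Kullback-Leibler divergence from the target distribution.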

Citation Information
APA Format
Ay, N., van Oostrum, J., & Datar, A. (2025). On the natural gradient of the evidence lower bound. Journal of Machine Learning Research, 26.
BibTeX Format
@article{paper708,
  title   = {On the Natural Gradient of the Evidence Lower Bound},
  author  = {Nihat Ay and Jesse van Oostrum and Adwait Datar},
  journal = {Journal of Machine Learning Research},
  volume  = {26},
  year    = {2025},
  url     = {https://www.jmlr.org/papers/v26/24-0606.html}
}