
Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models

Authors
Xuefeng Gao Hoang M. Nguyen Lingjiong Zhu
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

Score-based generative models are a recent class of deep generative models with state-of-the-art performance in many applications. In this paper, we establish convergence guarantees for a general class of score-based generative models in the 2-Wasserstein distance, assuming accurate score estimates and a smooth, log-concave data distribution. We specialize our results to several concrete score-based generative models with specific choices of forward processes modeled by stochastic differential equations, and obtain an upper bound on the iteration complexity for each model, which demonstrates the impact of different choices of the forward process. We also provide a lower bound when the data distribution is Gaussian. Numerically, we experiment with score-based generative models with different forward processes for unconditional image generation on CIFAR-10, and find that the experimental results are in good agreement with our theoretical predictions on the iteration complexity.
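The abstract's setting admits a small self-contained illustration. The sketch below is not the paper's algorithm; it is a minimal NumPy example, under assumed choices, of the ingredients the abstract names: an Ornstein-Uhlenbeck forward SDE (one common choice of forward process), its exact score for a one-dimensional Gaussian data distribution (the case in which the paper proves a lower bound), an Euler-Maruyama discretization of the reverse-time SDE, and the closed-form 2-Wasserstein distance between one-dimensional Gaussians used to check the output. All parameter values (T, the step count, the data mean and standard deviation) are illustrative.

import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D Gaussian data distribution N(mu, sigma^2). Gaussian data makes the
# exact score available in closed form and matches the setting of the paper's
# lower bound; the values of mu and sigma here are illustrative.
mu, sigma = 2.0, 0.5

# Forward process: the OU SDE dX_t = -X_t dt + sqrt(2) dW_t, one common choice
# of forward SDE. Then X_t | X_0 is Gaussian and the marginal of X_t is
# N(mu * e^{-t}, sigma^2 e^{-2t} + 1 - e^{-2t}).
def score(x, t):
    """Exact score grad log p_t(x) of the forward marginal at time t."""
    m = mu * np.exp(-t)
    v = sigma**2 * np.exp(-2.0 * t) + 1.0 - np.exp(-2.0 * t)
    return -(x - m) / v

# Reverse-time SDE, discretized with Euler-Maruyama:
# dY_s = [Y_s + 2 * score(Y_s, T - s)] ds + sqrt(2) dB_s, with Y_0 ~ N(0, 1)
# approximating p_T (the forward marginal is close to N(0, 1) for large T).
T, n_steps, n_samples = 5.0, 200, 100_000
h = T / n_steps
y = rng.standard_normal(n_samples)
for k in range(n_steps):
    t = T - k * h  # forward time corresponding to reverse step k
    drift = y + 2.0 * score(y, t)
    y = y + h * drift + np.sqrt(2.0 * h) * rng.standard_normal(n_samples)

# 2-Wasserstein distance between two 1-D Gaussians:
# W2^2 = (mu1 - mu2)^2 + (sigma1 - sigma2)^2.
w2 = np.sqrt((y.mean() - mu) ** 2 + (y.std() - sigma) ** 2)
print(f"sample mean {y.mean():.3f}, std {y.std():.3f}, W2 ~ {w2:.4f}")

Increasing n_steps at a fixed horizon T shrinks the discretization error, which is the kind of iteration-complexity trade-off the abstract refers to; swapping in a different forward SDE changes the drift, the score formula, and the resulting error.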

Citation Information
APA Format
Gao, X., Nguyen, H. M., & Zhu, L. (2025). Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models. Journal of Machine Learning Research, 26(43), 1-54. http://jmlr.org/papers/v26/24-0902.html
BibTeX Format
@article{JMLR:v26:24-0902,
  author  = {Xuefeng Gao and Hoang M. Nguyen and Lingjiong Zhu},
  title   = {Wasserstein Convergence Guarantees for a General Class of Score-Based Generative Models},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {43},
  pages   = {1--54},
  url     = {http://jmlr.org/papers/v26/24-0902.html}
}