Learning conditional distributions on continuous spaces
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Jul 15, 2025
Abstract
We investigate sample-based learning of conditional distributions on multi-dimensional unit boxes, allowing for different dimensions of the feature and target spaces. Our approach clusters data near varying query points in the feature space to create empirical measures in the target space. We employ two distinct clustering schemes: one based on a fixed-radius ball and the other on nearest neighbors. We establish upper bounds for the convergence rates of both methods and, from these bounds, deduce optimal configurations for the radius and the number of neighbors. We propose to incorporate the nearest neighbors method into neural network training, as our empirical analysis indicates that it performs better in practice. For efficiency, our training process uses approximate nearest neighbors search with random binary space partitioning. Additionally, we employ the Sinkhorn algorithm and a sparsity-enforced transport plan. Our empirical findings demonstrate that, with a suitably designed structure, the neural network can locally adapt to a suitable level of Lipschitz continuity.
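The abstract outlines two ingredients that a short sketch can make concrete: forming an empirical conditional measure by clustering samples near a feature-space query point (here via exact k-nearest neighbors), and comparing such measures with entropically regularized optimal transport solved by Sinkhorn iterations. The Python sketch below is purely illustrative and is not the authors' implementation; the synthetic data, the query points, k = 20, and eps = 0.05 are hypothetical choices, and the paper's approximate nearest-neighbors search (random binary space partitioning) and sparsity-enforced transport plan are not reproduced here.

# Minimal illustrative sketch (not the authors' implementation): build an
# empirical conditional measure at a query point via k-nearest neighbors,
# then compare two such measures with entropically regularized optimal
# transport solved by plain Sinkhorn iterations. All data, query points,
# k, and eps below are hypothetical placeholder choices.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic samples (X_i, Y_i): features in [0, 1]^2 and scalar targets.
n = 1000
X = rng.uniform(size=(n, 2))
Y = 0.5 + 0.25 * np.sin(2 * np.pi * X[:, :1]) + 0.05 * rng.standard_normal((n, 1))

def knn_empirical_measure(X, Y, x_query, k=20):
    # Targets of the k nearest feature points to x_query, each with weight 1/k:
    # an empirical estimate of the conditional law of Y given X = x_query.
    dists = np.linalg.norm(X - x_query, axis=1)
    idx = np.argsort(dists)[:k]          # exact k-NN; the paper uses approximate search
    return Y[idx], np.full(k, 1.0 / k)

def sinkhorn_plan(a, b, C, eps=0.05, n_iter=500):
    # Standard Sinkhorn iterations for entropically regularized OT between
    # weight vectors a, b with cost matrix C; returns the transport plan.
    K = np.exp(-C / eps)
    u, v = np.ones_like(a), np.ones_like(b)
    for _ in range(n_iter):
        u = a / (K @ v)
        v = b / (K.T @ u)
    return u[:, None] * K * v[None, :]

# Empirical conditional measures at two query points and their entropic OT cost.
atoms1, w1 = knn_empirical_measure(X, Y, np.array([0.3, 0.7]))
atoms2, w2 = knn_empirical_measure(X, Y, np.array([0.8, 0.2]))
C = np.sum((atoms1[:, None, :] - atoms2[None, :, :]) ** 2, axis=-1)  # squared-distance cost
P = sinkhorn_plan(w1, w2, C)
print("entropic OT cost between the two conditional estimates:", float(np.sum(P * C)))

As described in the abstract, the paper incorporates the nearest-neighbors construction into neural network training and uses approximate nearest-neighbors search and a sparsity-enforced transport plan as efficiency refinements of these same two ingredients.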
Author Details
Cyril Benezet
Ziteng Cheng
Sebastian Jaimungal
Citation Information
APA Format
Benezet, C., Cheng, Z., & Jaimungal, S. (2025). Learning conditional distributions on continuous spaces. Journal of Machine Learning Research, 26(105), 1–64. http://jmlr.org/papers/v26/24-0924.html
BibTeX Format
@article{JMLR:v26:24-0924,
author = {Cyril Benezet and Ziteng Cheng and Sebastian Jaimungal},
title = {Learning conditional distributions on continuous spaces},
journal = {Journal of Machine Learning Research},
year = {2025},
volume = {26},
number = {105},
pages = {1--64},
url = {http://jmlr.org/papers/v26/24-0924.html}
}