Composite Goodness-of-fit Tests with Kernels
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Jul 15, 2025
Abstract
We propose kernel-based hypothesis tests for the challenging composite testing problem, where we are interested in whether the data comes from any distribution in some parametric family. Our tests make use of minimum distance estimators based on kernel-based distances such as the maximum mean discrepancy. As our main result, we show that we are able to estimate the parameter and conduct our test on the same data (without data splitting), while maintaining a correct test level. We also prove that the popular wild bootstrap will lead to an overly conservative test, and show that the parametric bootstrap is consistent and can lead to significantly improved performance in practice. Our approach is illustrated on a range of problems, including testing for goodness-of-fit of a non-parametric density model, and an intractable generative model of a biological cellular network.
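To make the procedure described in the abstract concrete, below is a minimal illustrative sketch (in Python, assuming NumPy and SciPy) of a composite goodness-of-fit test of this kind: the parameter of a unit-variance Gaussian location family is estimated by minimising the maximum mean discrepancy (MMD) on the observed data, the same MMD is used as the test statistic, and the rejection threshold is calibrated with a parametric bootstrap. The Gaussian family, the kernel bandwidth, and all function names are assumptions made for illustration; this is not the authors' implementation.

# Minimal sketch of a composite MMD goodness-of-fit test with a parametric
# bootstrap. The unit-variance Gaussian location family, kernel bandwidth,
# and all function names are illustrative assumptions, not the paper's code.
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(0)

def gaussian_kernel(x, y, bandwidth=1.0):
    # Gaussian (RBF) kernel matrix between one-dimensional samples x and y.
    d2 = (x[:, None] - y[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def mmd2(x, y, bandwidth=1.0):
    # Biased (V-statistic) estimate of the squared MMD between samples x and y.
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

def fit_min_mmd(x, n_model=300, bandwidth=1.0):
    # Minimum-MMD estimate of the location parameter theta of N(theta, 1),
    # using a fixed set of model samples so the objective is smooth in theta.
    base = rng.standard_normal(n_model)
    def objective(theta):
        return mmd2(x, base + theta, bandwidth)
    return minimize_scalar(objective, bounds=(-10.0, 10.0), method="bounded").x

def composite_mmd_test(x, n_boot=100, level=0.05, bandwidth=1.0):
    # Composite test of H0: the data come from N(theta, 1) for some theta.
    # The parameter is estimated and the test run on the same data; the
    # threshold comes from a parametric bootstrap under the fitted model.
    n = len(x)
    theta_hat = fit_min_mmd(x, bandwidth=bandwidth)
    stat = mmd2(x, rng.standard_normal(300) + theta_hat, bandwidth)
    boot_stats = []
    for _ in range(n_boot):
        xb = rng.standard_normal(n) + theta_hat         # sample from the fitted model
        theta_b = fit_min_mmd(xb, bandwidth=bandwidth)  # re-estimate on bootstrap data
        boot_stats.append(mmd2(xb, rng.standard_normal(300) + theta_b, bandwidth))
    threshold = np.quantile(boot_stats, 1.0 - level)
    return stat, threshold, stat > threshold

# Example: data drawn from the model family should rarely be rejected.
data = rng.standard_normal(200) + 1.5
stat, thr, reject = composite_mmd_test(data)
print(f"MMD^2 = {stat:.4f}, threshold = {thr:.4f}, reject H0: {reject}")

The bootstrap step re-estimates the parameter on each bootstrap sample, mirroring the abstract's point that estimation and testing are carried out on the same data (without data splitting) with the threshold calibrated under the fitted model.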
Author Details
Oscar Key
Arthur Gretton
François-Xavier Briol
Tamara Fernandez

Research Topics & Keywords
Nonparametric Statistics

Citation Information
APA Format
Key, O., Gretton, A., Briol, F.-X., & Fernandez, T. (2025). Composite goodness-of-fit tests with kernels. Journal of Machine Learning Research, 26(51), 1–60. http://jmlr.org/papers/v26/24-0276.html
BibTeX Format
@article{JMLR:v26:24-0276,
author = {Oscar Key and Arthur Gretton and Fran{\c{c}}ois-Xavier Briol and Tamara Fernandez},
title = {Composite Goodness-of-fit Tests with Kernels},
journal = {Journal of Machine Learning Research},
year = {2025},
volume = {26},
number = {51},
pages = {1--60},
url = {http://jmlr.org/papers/v26/24-0276.html}
}