
Composite Goodness-of-fit Tests with Kernels

Authors
Oscar Key, Arthur Gretton, François-Xavier Briol, Tamara Fernandez
Research Topics
Nonparametric Statistics
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

We propose kernel-based hypothesis tests for the challenging composite testing problem, where we are interested in whether the data comes from any distribution in some parametric family. Our tests make use of minimum distance estimators based on kernel-based distances such as the maximum mean discrepancy. As our main result, we show that we are able to estimate the parameter and conduct our test on the same data (without data splitting), while maintaining a correct test level. We also prove that the popular wild bootstrap will lead to an overly conservative test, and show that the parametric bootstrap is consistent and can lead to significantly improved performance in practice. Our approach is illustrated on a range of problems, including testing for goodness-of-fit of a non-parametric density model, and an intractable generative model of a biological cellular network.
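The testing recipe in the abstract — estimate the parameter by minimum MMD distance on the data, compute the test statistic on the same data, and calibrate with a parametric bootstrap that re-estimates the parameter on each bootstrap sample — can be sketched on a toy problem. The following is a minimal illustration for a Gaussian location family, not the paper's implementation: the Gaussian kernel bandwidth, the grid-search estimator, and all sample sizes are arbitrary choices made for brevity.

```python
import numpy as np

def gaussian_kernel(a, b, bw=1.0):
    # k(a, b) = exp(-(a - b)^2 / (2 * bw^2)) for 1-d inputs
    d = a[:, None] - b[None, :]
    return np.exp(-d**2 / (2 * bw**2))

def mmd2(x, y, bw=1.0):
    # Biased (V-statistic) estimate of the squared maximum mean discrepancy
    return (gaussian_kernel(x, x, bw).mean()
            + gaussian_kernel(y, y, bw).mean()
            - 2 * gaussian_kernel(x, y, bw).mean())

def fit_theta(x, base, grid):
    # Minimum-MMD estimator for the location family {N(theta, 1)}:
    # model samples are theta + base, with base ~ N(0, 1) fixed across theta
    vals = [mmd2(x, t + base) for t in grid]
    return grid[int(np.argmin(vals))]

def composite_mmd_test(x, rng, n_boot=50):
    grid = np.linspace(-3, 3, 61)
    base = rng.standard_normal(200)
    # Estimate and test on the SAME data (no data splitting)
    theta_hat = fit_theta(x, base, grid)
    stat = len(x) * mmd2(x, theta_hat + base)
    boot = []
    for _ in range(n_boot):
        # Parametric bootstrap: draw from the fitted model, then re-estimate
        # the parameter on each bootstrap sample before recomputing the statistic
        xb = theta_hat + rng.standard_normal(len(x))
        tb = fit_theta(xb, base, grid)
        boot.append(len(xb) * mmd2(xb, tb + base))
    p = (1 + sum(b >= stat for b in boot)) / (1 + n_boot)
    return theta_hat, stat, p

rng = np.random.default_rng(0)
x = rng.standard_normal(100) + 0.5   # data truly from the Gaussian location family
theta_hat, stat, p = composite_mmd_test(x, rng)
print(theta_hat, stat, p)
```

Because the data lie in the model family here, the bootstrap p-value should typically be non-small; a wild bootstrap calibration of the same statistic would, per the paper's result, be overly conservative.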

Citation Information
APA Format
Key, O., Gretton, A., Briol, F.-X., & Fernandez, T. (2025). Composite goodness-of-fit tests with kernels. Journal of Machine Learning Research, 26(51), 1–60.
BibTeX Format
@article{JMLR:v26:24-0276,
  author  = {Oscar Key and Arthur Gretton and Fran{\c{c}}ois-Xavier Briol and Tamara Fernandez},
  title   = {Composite Goodness-of-fit Tests with Kernels},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {51},
  pages   = {1--60},
  url     = {http://jmlr.org/papers/v26/24-0276.html}
}