Minimax Optimal Two-Sample Testing under Local Differential Privacy
Paper Information
Journal: Journal of Machine Learning Research
Added to Tracker: Dec 30, 2025
Abstract
We explore the trade-off between privacy and statistical utility in private two-sample testing under local differential privacy (LDP) for both multinomial and continuous data. We begin with the multinomial case, where we introduce private permutation tests using practical privacy mechanisms such as Laplace, discrete Laplace, and Google's RAPPOR. We then extend this approach to continuous data via binning and study its uniform separation under LDP over Hölder and Besov smoothness classes. The proposed tests for both discrete and continuous cases rigorously control type I error for any finite sample size, strictly adhere to LDP constraints, and achieve minimax optimality under LDP. The attained minimax rates reveal inherent privacy-utility trade-offs that are unavoidable in private testing. To address scenarios with unknown smoothness parameters in density testing, we propose a Bonferroni-type adaptive test that ensures robust performance without prior knowledge of the smoothness parameters. We validate our theoretical findings with extensive numerical experiments and demonstrate the practical relevance and effectiveness of our proposed methods.
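To make the multinomial setting concrete, the following is a minimal sketch of one of the privacy mechanisms the abstract names: the Laplace mechanism applied locally to each user's one-hot-encoded multinomial observation. The noise scale `2/eps` follows from the L1 sensitivity of a one-hot vector; this is the standard textbook construction, and the paper's exact test statistic and calibration may differ.

```python
import numpy as np

def laplace_ldp_report(category, k, eps, rng):
    """Privatize one multinomial observation (an index in {0, ..., k-1})
    under eps-LDP by adding Laplace noise to its one-hot encoding.
    A one-hot vector has L1 sensitivity 2, so Laplace noise with scale
    2/eps satisfies eps-local differential privacy."""
    onehot = np.zeros(k)
    onehot[category] = 1.0
    return onehot + rng.laplace(scale=2.0 / eps, size=k)

# Each user releases only the noisy vector; averaging the reports
# gives an unbiased estimate of the multinomial probability vector.
rng = np.random.default_rng(0)
k, eps, n = 5, 1.0, 20000
data = rng.integers(0, k, size=n)          # raw (private) observations
reports = np.array([laplace_ldp_report(x, k, eps, rng) for x in data])
p_hat = reports.mean(axis=0)               # noisy frequency estimate
```

The extra variance injected by the noise (on the order of `1/eps**2` per coordinate) is the source of the privacy-utility trade-off the abstract refers to: a private test must separate the two samples despite this inflated variance, which is what drives the slower minimax separation rates.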
Author Details
Ilmun Kim
Jongmin Mun
Seungwoo Kwak
Research Topics & Keywords
Hypothesis Testing
Citation Information
APA Format
Kim, I., Mun, J., & Kwak, S. Minimax Optimal Two-Sample Testing under Local Differential Privacy. Journal of Machine Learning Research.
BibTeX Format
@article{paper678,
  title   = {Minimax Optimal Two-Sample Testing under Local Differential Privacy},
  author  = {Ilmun Kim and Jongmin Mun and Seungwoo Kwak},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/24-2016.html}
}