JMLR

Classification in the high dimensional Anisotropic mixture framework: A new take on Robust Interpolation

Authors
Stanislav Minsker, Mohamed Ndaoud, Yiqiu Shen
Research Topics
Machine Learning
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Sep 08, 2025
Abstract

We study the classification problem under the two-component anisotropic sub-Gaussian mixture model in high dimensions and in the non-asymptotic setting. First, we derive lower bounds and matching upper bounds for the minimax risk of classification in this framework. We also show that in the high-dimensional regime, the linear discriminant analysis classifier turns out to be sub-optimal in the minimax sense. Next, we give a precise characterization of the risk of classifiers based on solutions of the $\ell_2$-regularized least squares problem. We deduce that the interpolating solutions may outperform the regularized classifiers under mild assumptions on the covariance structure of the noise, and we present concrete examples of this phenomenon. Our analysis also demonstrates the robustness of interpolation to certain models of corruption. To the best of our knowledge, this peculiar fact has not yet been investigated in the rapidly growing literature related to interpolation. We conclude that interpolation is not only benign but can also be optimal, and in some cases robust.
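The comparison at the heart of the abstract can be made concrete with a small numerical sketch. The snippet below (illustrative only; the mixture parameters, sample sizes, and isotropic noise are assumptions for the demo, not the paper's setup) generates data from a two-component Gaussian mixture $x_i = y_i \mu + \xi_i$ with labels $y_i \in \{-1, +1\}$, then compares the $\ell_2$-regularized least squares classifier with the minimum-norm interpolating solution obtained as the ridgeless limit:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic two-component Gaussian mixture (illustrative parameters):
# x_i = y_i * mu + noise, with labels y_i in {-1, +1} and d > n
# so that exact interpolation of the training labels is possible.
n, d = 50, 200
mu = np.zeros(d)
mu[0] = 3.0
y = rng.choice([-1.0, 1.0], size=n)
X = np.outer(y, mu) + rng.normal(size=(n, d))

def ridge_classifier(X, y, lam):
    """Least squares solution w = (X^T X + lam I)^{-1} X^T y.

    lam = 0 is the minimum-norm interpolating solution (via the
    pseudo-inverse), which fits the training labels exactly when d > n.
    """
    if lam == 0:
        return np.linalg.pinv(X) @ y          # min-norm interpolator
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# Evaluate the sign classifier sign(<w, x>) on fresh mixture samples.
n_test = 2000
y_test = rng.choice([-1.0, 1.0], size=n_test)
X_test = np.outer(y_test, mu) + rng.normal(size=(n_test, d))

for lam in [0.0, 1.0, 100.0]:
    w = ridge_classifier(X, y, lam)
    err = np.mean(np.sign(X_test @ w) != y_test)
    print(f"lambda = {lam:>6}: test error = {err:.3f}")
```

Under an anisotropic noise covariance (replace `rng.normal(size=(n, d))` with correlated noise), the paper's results describe when the interpolating choice `lam = 0` can match or beat the regularized ones; this isotropic sketch merely shows that interpolation need not be harmful even with zero training error.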

Citation Information
APA Format
Minsker, S., Ndaoud, M., & Shen, Y. Classification in the high dimensional Anisotropic mixture framework: A new take on Robust Interpolation. Journal of Machine Learning Research.
BibTeX Format
@article{paper502,
  title   = {Classification in the high dimensional Anisotropic mixture framework: A new take on Robust Interpolation},
  author  = {Stanislav Minsker and Mohamed Ndaoud and Yiqiu Shen},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/24-1366.html}
}