
On Inference for the Support Vector Machine

Authors
Jakub Rybak, Heather Battey, Wen-Xin Zhou
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 15, 2025
Abstract

The linear support vector machine has a parametrised decision boundary. The paper considers inference for the corresponding parameters, which indicate the effects of individual variables on the decision boundary. The proposed inference is via a convolution-smoothed version of the SVM loss function, this having several inferential advantages over the original SVM, whose associated loss function is not everywhere differentiable. Notably, convolution-smoothing comes with non-asymptotic theoretical guarantees, including a distributional approximation to the parameter estimator that scales more favourably with the dimension of the feature vector. The differentiability of the loss function produces other advantages in some settings; for instance, by facilitating the inclusion of penalties or the synthesis of information from a large number of small samples. The paper closes by relating the linear SVM parameters to those of some probability models for binary outcomes.
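
For intuition, below is a minimal sketch of convolution smoothing as applied to the hinge loss underlying the linear SVM. The Gaussian kernel, the bandwidth h, and the function names here are illustrative assumptions, not the paper's exact construction. Writing a = 1 - u for a margin u = y * x'beta and Z ~ N(0, 1), the Gaussian-smoothed loss E[max(0, a + hZ)] has the closed form a*Phi(a/h) + h*phi(a/h), which is everywhere differentiable in u, unlike the hinge loss itself.

import numpy as np
from scipy.stats import norm

def smoothed_hinge(u, h=0.5):
    """Gaussian-convolution-smoothed hinge loss max(0, 1 - u).

    With a = 1 - u and Z ~ N(0, 1):
        E[max(0, a + h*Z)] = a * Phi(a/h) + h * phi(a/h).
    The Gaussian kernel and the bandwidth h are illustrative choices.
    """
    a = 1.0 - np.asarray(u, dtype=float)
    return a * norm.cdf(a / h) + h * norm.pdf(a / h)

def smoothed_hinge_grad(u, h=0.5):
    """Derivative in u: -Phi((1 - u)/h). Smooth everywhere,
    whereas the hinge loss has a kink at u = 1."""
    a = 1.0 - np.asarray(u, dtype=float)
    return -norm.cdf(a / h)

# Margins u_i = y_i * x_i' beta; the smoothed empirical risk
# (plus a penalty, if desired) can be minimised by any gradient method.
u = np.array([-0.5, 0.9, 1.0, 2.0])
print(smoothed_hinge(u))       # close to max(0, 1 - u) for small h
print(smoothed_hinge_grad(u))  # lies in (-1, 0) and is smooth in u

As h tends to 0 the smoothed loss recovers the ordinary hinge loss; it is the smooth, well-behaved gradient that makes the inferential arguments, penalisation, and gradient-based fitting described in the abstract tractable.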

Citation Information
APA Format
Rybak, J., Battey, H., & Zhou, W.-X. (2025). On inference for the support vector machine. Journal of Machine Learning Research, 26(85), 1–54.
BibTeX Format
@article{JMLR:v26:23-1581,
  author  = {Jakub Rybak and Heather Battey and Wen-Xin Zhou},
  title   = {On Inference for the Support Vector Machine},
  journal = {Journal of Machine Learning Research},
  year    = {2025},
  volume  = {26},
  number  = {85},
  pages   = {1--54},
  url     = {http://jmlr.org/papers/v26/23-1581.html}
}