JMLR

On Inference for the Support Vector Machine

Authors
Wen-Xin Zhou, Jakub Rybak, Heather Battey
Paper Information
  • Journal:
    Journal of Machine Learning Research
  • Added to Tracker:
    Jul 30, 2025
Abstract

The linear support vector machine has a parametrised decision boundary. The paper considers inference for the corresponding parameters, which indicate the effects of individual variables on the decision boundary. The proposed inference is via a convolution-smoothed version of the SVM loss function, this having several inferential advantages over the original SVM, whose associated loss function is not everywhere differentiable. Notably, convolution-smoothing comes with non-asymptotic theoretical guarantees, including a distributional approximation to the parameter estimator that scales more favourably with the dimension of the feature vector. The differentiability of the loss function produces other advantages in some settings; for instance, by facilitating the inclusion of penalties or the synthesis of information from a large number of small samples. The paper closes by relating the linear SVM parameters to those of some probability models for binary outcomes.
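The abstract's central device is convolution-smoothing: replacing the non-differentiable SVM hinge loss with its convolution against a smoothing kernel. As a minimal illustration (not the paper's exact construction), convolving the hinge loss max(0, 1 − u) with a Gaussian kernel of bandwidth h yields a closed form that is everywhere differentiable and recovers the hinge loss as h → 0; the bandwidth h and the Gaussian choice here are illustrative assumptions.

```python
import math

def smoothed_hinge(u, h=0.1):
    """Hinge loss max(0, 1 - u) convolved with a Gaussian kernel of bandwidth h.

    With x = 1 - u and Z ~ N(0, 1), the convolution has the closed form
        E[max(0, x + h*Z)] = x * Phi(x/h) + h * phi(x/h),
    where Phi and phi are the standard normal CDF and density.
    As h -> 0 this recovers the ordinary (non-smooth) hinge loss.
    """
    x = 1.0 - u
    z = x / h
    Phi = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))   # standard normal CDF
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # density
    return x * Phi + h * phi

def smoothed_hinge_grad(u, h=0.1):
    """Derivative in u: -Phi((1 - u)/h), defined everywhere,
    unlike the hinge loss, which has a kink at u = 1."""
    z = (1.0 - u) / h
    return -0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
```

Because the smoothed loss has a well-defined gradient everywhere, standard gradient-based optimisers and classical asymptotic arguments apply directly, which is what enables the distributional approximations and penalised extensions the abstract describes.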

Citation Information
APA Format
Zhou, W.-X., Rybak, J., & Battey, H. On Inference for the Support Vector Machine. Journal of Machine Learning Research.
BibTeX Format
@article{paper167,
  title   = {On Inference for the Support Vector Machine},
  author  = {Wen-Xin Zhou and Jakub Rybak and Heather Battey},
  journal = {Journal of Machine Learning Research},
  url     = {https://www.jmlr.org/papers/v26/23-1581.html}
}