arXiv:2105.14084 [cs.LG]

Support vector machines and linear regression coincide with very high-dimensional features

Navid Ardeshir, Clayton Sanford, Daniel Hsu

Published 2021-05-28 (Version 1)

The support vector machine (SVM) and minimum Euclidean norm least squares regression are two fundamentally different approaches to fitting linear models, but they have recently been connected in models for very high-dimensional data through a phenomenon of support vector proliferation, where every training example used to fit an SVM becomes a support vector. In this paper, we explore the generality of this phenomenon and make the following contributions. First, we prove a super-linear lower bound on the dimension (in terms of sample size) required for support vector proliferation in independent feature models, matching the upper bounds from previous works. We further identify a sharp phase transition in Gaussian feature models, bound the width of this transition, and give experimental support for its universality. Finally, we hypothesize that this phase transition occurs only in much higher-dimensional settings in the $\ell_1$ variant of the SVM, and we present a new geometric characterization of the problem that may elucidate this phenomenon for the general $\ell_p$ case.
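For context, a standard derivation (not quoted from the paper) shows why support vector proliferation forces the two solutions to coincide. Both problems minimize $\|w\|_2$; they differ only in their constraints:

$$\text{SVM:}\quad y_i \langle w, x_i \rangle \ge 1 \;\;\forall i, \qquad \text{least squares:}\quad \langle w, x_i \rangle = y_i \;\;\forall i.$$

Since $y_i \in \{-1, +1\}$, an SVM constraint is active exactly when $\langle w, x_i \rangle = y_i$, so when every example is a support vector the two programs share the same binding constraints and hence the same minimizer.

The sketch below is a minimal numerical illustration, not code from the paper; it assumes NumPy, and the values of $n$, $d$, and the trial count are arbitrary illustrative choices. It uses the dual form of the criterion above: the hard-margin SVM (without intercept) returns the minimum-norm interpolator exactly when every coordinate of $y \odot (XX^\top)^{-1} y$ is strictly positive, and it estimates how often this event occurs for independent Gaussian features as the dimension $d$ grows past the sample size $n$.

import numpy as np

rng = np.random.default_rng(0)
n, trials = 32, 200   # illustrative sample size and Monte Carlo trial count

def proliferates(X, y):
    # Dual coefficients of the minimum-norm interpolator: alpha = (X X^T)^{-1} y.
    alpha = np.linalg.solve(X @ X.T, y)
    # Every example is a support vector (so the SVM and the min-norm
    # interpolator coincide) iff all implied SVM dual variables
    # y_i * alpha_i are strictly positive.
    return bool(np.all(y * alpha > 0))

for d in [64, 256, 1024, 4096, 16384]:
    hits = 0
    for _ in range(trials):
        X = rng.standard_normal((n, d))       # independent Gaussian features
        y = rng.choice([-1.0, 1.0], size=n)   # random binary labels
        hits += proliferates(X, y)
    print(f"d = {d:6d}: proliferation frequency = {hits / trials:.2f}")

According to the abstract, this frequency should exhibit a sharp phase transition, moving from near 0 to near 1 only once $d$ grows super-linearly in $n$.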
