arXiv Analytics

arXiv:0803.3490 [cs.LG]

Robustness and Regularization of Support Vector Machines

Huan Xu, Constantine Caramanis, Shie Mannor

Published 2008-03-25, updated 2008-11-11 (Version 2)

We consider regularized support vector machines (SVMs) and show that they are precisely equivalent to a new robust optimization formulation. This equivalence of robust optimization and regularization has implications for both algorithms and analysis. On the algorithmic side, the equivalence suggests more general SVM-like classification algorithms that explicitly build in protection against noise while simultaneously controlling overfitting. On the analysis side, the equivalence of robustness and regularization provides a robust optimization interpretation for the success of regularized SVMs. We use this robustness interpretation of SVMs to give a new proof of consistency of (kernelized) SVMs, thus establishing robustness as the reason regularized SVMs generalize well.
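For concreteness, the regularized (linear, hinge-loss) SVM objective that the abstract refers to is min_w Σ_i max(0, 1 − y_i⟨w, x_i⟩) + c‖w‖, which the paper shows coincides with a robust formulation over bounded input perturbations. The sketch below minimizes this objective by subgradient descent; it is an illustration only, not the paper's method, and the toy data, step size, and omission of a bias term are my own assumptions.

```python
import numpy as np

def svm_subgradient(X, y, c=0.1, lr=0.01, epochs=200):
    """Minimize sum_i max(0, 1 - y_i <w, x_i>) + c * ||w||_2
    by subgradient descent (bias term omitted for brevity)."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1                # points with nonzero hinge loss
        # Subgradient of the hinge-loss sum over the active points
        grad = -(y[active, None] * X[active]).sum(axis=0)
        norm = np.linalg.norm(w)
        if norm > 0:
            grad = grad + c * w / norm      # subgradient of c * ||w||_2
        w = w - lr * grad
    return w

# Toy linearly separable data (assumed for illustration)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(2, 1, (20, 2)), rng.normal(-2, 1, (20, 2))])
y = np.array([1] * 20 + [-1] * 20)
w = svm_subgradient(X, y)
acc = np.mean(np.sign(X @ w) == y)
```

The regularizer c‖w‖ is exactly the term the paper reinterprets as the worst-case effect of adversarial perturbations of the training points, rather than as a generic capacity control.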

Journal: Journal of Machine Learning Research, vol. 10, pp. 1485-1510, 2009
Categories: cs.LG, cs.AI
Related articles:
arXiv:2310.00183 [cs.LG] (Published 2023-09-29)
On the Equivalence of Graph Convolution and Mixup
Xiaotian Han et al.
arXiv:1810.00123 [cs.LG] (Published 2018-09-29)
Generalization and Regularization in DQN
arXiv:1203.4523 [cs.LG] (Published 2012-03-20, updated 2012-09-11)
On the Equivalence between Herding and Conditional Gradient Algorithms