Efficient variable selection in support vector machines via the alternating direction method of multipliers

Gui Bo Ye*, Yifei Chen, Xiaohui Xie

*Corresponding author for this work

Research output: Contribution to journal › Conference article published in journal › peer-review

43 Citations (Scopus)

Abstract

The support vector machine (SVM) is a widely used tool for classification. Although commonly understood as a method of finding the maximum-margin hyperplane, it can also be formulated as a regularized function estimation problem, corresponding to a hinge loss function plus an ℓ2-norm regularization term. The doubly regularized support vector machine (DrSVM) is a variant of the standard SVM, which introduces an additional ℓ1-norm regularization term on the fitted coefficients. The combined ℓ1 and ℓ2 regularization, termed the elastic net penalty, has the property of achieving simultaneous variable selection and margin maximization within a single framework. However, because both the loss function and the regularization term are nondifferentiable, no efficient method has been available to solve DrSVM for large-scale problems. Here we develop an efficient algorithm based on the alternating direction method of multipliers (ADMM) to solve the optimization problem in DrSVM. The utility of the method is illustrated using both simulated and real-world data.
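The abstract describes minimizing a hinge loss plus an elastic net penalty via ADMM. The sketch below is an illustrative consensus-ADMM splitting for that objective, not the paper's exact algorithm: auxiliary variables `a = Aw` (rows of `A` are yᵢxᵢ) and `z = w` decouple the nondifferentiable hinge and ℓ1 terms, so each subproblem has a closed form (a linear solve, the hinge proximal map, and soft thresholding). All names, parameter values, and the simulated data are assumptions for illustration.

```python
import numpy as np

def drsvm_admm(X, y, lam1=0.5, lam2=0.1, rho=1.0, n_iter=200):
    """Illustrative ADMM for the DrSVM objective
        sum_i hinge(y_i x_i^T w) + lam1*||w||_1 + (lam2/2)*||w||_2^2,
    using the splitting a = A w, z = w (A has rows y_i x_i).
    This is a sketch, not the authors' implementation."""
    n, p = X.shape
    A = y[:, None] * X
    # w-update solves (lam2 I + rho A^T A + rho I) w = rho A^T (a - u1) + rho (z - u2);
    # the matrix is fixed, so factor (here: invert) it once. Use Cholesky for larger p.
    Minv = np.linalg.inv(lam2 * np.eye(p) + rho * (A.T @ A) + rho * np.eye(p))
    w = np.zeros(p); z = np.zeros(p); a = np.zeros(n)
    u1 = np.zeros(n); u2 = np.zeros(p)          # scaled dual variables
    t = 1.0 / rho                                # prox step size
    for _ in range(n_iter):
        w = Minv @ (rho * (A.T @ (a - u1)) + rho * (z - u2))
        # a-update: elementwise prox of the hinge loss max(0, 1 - a)
        v = A @ w + u1
        a = np.where(v > 1.0, v, np.where(v < 1.0 - t, v + t, 1.0))
        # z-update: soft thresholding, the prox of lam1*||.||_1
        q = w + u2
        z = np.sign(q) * np.maximum(np.abs(q) - lam1 * t, 0.0)
        # dual ascent on the two consensus constraints
        u1 += A @ w - a
        u2 += w - z
    return z  # z carries the exact zeros, i.e. the variable selection

# Simulated check: only the first 3 of 20 features carry signal.
rng = np.random.default_rng(0)
n, p = 200, 20
Xs = rng.standard_normal((n, p))
w_true = np.zeros(p); w_true[:3] = [2.0, -2.0, 1.5]
ys = np.sign(Xs @ w_true + 0.1 * rng.standard_normal(n))
w_hat = drsvm_admm(Xs, ys)
acc = np.mean(np.sign(Xs @ w_hat) == ys)
```

Returning `z` rather than `w` reflects a standard ADMM design choice: the soft-thresholded block holds exact zeros, so it is the natural sparse estimate of the coefficients.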

Original language: English
Pages (from-to): 832-840
Number of pages: 9
Journal: Journal of Machine Learning Research
Volume: 15
Publication status: Published - 2011
Externally published: Yes
Event: 14th International Conference on Artificial Intelligence and Statistics, AISTATS 2011 - Fort Lauderdale, FL, United States
Duration: 11 Apr 2011 - 13 Apr 2011

