Gradient hard thresholding pursuit for sparsity-constrained optimization

Xiao Tong Yuan, Ping Li, Tong Zhang

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-review

45 Citations (Scopus)

Abstract

Hard Thresholding Pursuit (HTP) is an iterative greedy selection procedure for finding sparse solutions of underdetermined linear systems. This method has been shown to have strong theoretical guarantees and impressive numerical performance. In this paper, we generalize HTP from compressed sensing to a generic problem setup of sparsity-constrained convex optimization. The proposed algorithm iterates between a standard gradient descent step and a hard truncation step, with or without debiasing. We prove that our method enjoys strong guarantees analogous to those of HTP in terms of rate of convergence and parameter estimation accuracy. Numerical evidence shows that our method is superior to state-of-the-art greedy selection methods when applied to the learning tasks of sparse logistic regression and sparse support vector machines.
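The iteration described in the abstract — a gradient descent step followed by hard truncation to the k largest-magnitude coordinates — can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the function name, fixed step size, iteration cap, and stopping rule are assumptions, and the optional debiasing step is omitted.

```python
import numpy as np

def grahtp(grad, x0, k, step=0.1, n_iter=100):
    """Illustrative sketch of gradient hard thresholding pursuit.

    grad   : callable returning the gradient of the smooth objective f
    x0     : initial iterate (NumPy array)
    k      : target sparsity level (number of nonzeros kept)
    step   : gradient step size (assumed constant here)
    n_iter : maximum number of iterations (assumed cap)
    """
    x = x0.copy()
    for _ in range(n_iter):
        # 1) standard gradient descent step
        z = x - step * grad(x)
        # 2) hard truncation: keep only the k largest-magnitude entries
        supp = np.argsort(np.abs(z))[-k:]
        x_new = np.zeros_like(z)
        x_new[supp] = z[supp]
        # simple stopping rule (an assumption): stop when the iterate is stable
        if np.allclose(x_new, x):
            break
        x = x_new
    return x
```

For instance, with the least-squares objective f(x) = ½‖Ax − b‖², the gradient passed in would be `lambda x: A.T @ (A @ x - b)`, and the truncation step enforces the sparsity constraint ‖x‖₀ ≤ k at every iteration.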

Original language: English
Title of host publication: 31st International Conference on Machine Learning, ICML 2014
Publisher: International Machine Learning Society (IMLS)
Pages: 1322-1330
Number of pages: 9
ISBN (Electronic): 9781634393973
Publication status: Published - 2014
Externally published: Yes
Event: 31st International Conference on Machine Learning, ICML 2014 - Beijing, China
Duration: 21 Jun 2014 - 26 Jun 2014

Publication series

Name: 31st International Conference on Machine Learning, ICML 2014
Volume: 2

Conference

Conference: 31st International Conference on Machine Learning, ICML 2014
Country/Territory: China
City: Beijing
Period: 21/06/14 - 26/06/14

Bibliographical note

Publisher Copyright:
Copyright © 2014 by the International Machine Learning Society (IMLS). All rights reserved.
