Efficient sparse modeling with automatic feature grouping

Leon Wenliang Zhong, James T. Kwok

Research output: Contribution to journal › Journal Article › peer-review

65 Citations (Scopus)

Abstract

For high-dimensional data, it is often desirable to group similar features together during the learning process. This can reduce the estimation variance and improve the stability of feature selection, leading to better generalization. Moreover, it can also help in understanding and interpreting the data. The octagonal shrinkage and clustering algorithm for regression (OSCAR) is a recent sparse-modeling approach that uses an ℓ1-regularizer and a pairwise ℓ∞-regularizer on the feature coefficients to encourage such feature grouping. Computationally, however, its optimization procedure is very expensive. In this paper, we propose an efficient solver based on the accelerated gradient method. We show that its key proximal step can be solved by a simple and highly efficient iterative group-merging algorithm. Given d input features, this reduces the empirical time complexity from O(d^2) to O(d^5) for the existing solvers down to just O(d). Experimental results on a number of toy and real-world datasets demonstrate that OSCAR is a competitive sparse-modeling approach, with the added ability of automatic feature grouping.
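The proximal step mentioned in the abstract can be illustrated with a short sketch. It relies on the known equivalence of the OSCAR penalty with a sorted, decreasing-weight ℓ1 (OWL/SLOPE-style) penalty, whose proximal operator reduces to sorting followed by a pool-adjacent-violators (PAVA) pass that merges adjacent groups of coefficients; this merging behavior is what produces automatic feature grouping. The function name and implementation below are illustrative, not the authors' code:

```python
import numpy as np

def oscar_prox(v, lam1, lam2):
    """Sketch of the proximal operator of the OSCAR penalty
        lam1 * ||z||_1 + lam2 * sum_{i<j} max(|z_i|, |z_j|),
    using its equivalent decreasing-weight form
        sum_i (lam1 + lam2*(d - i)) * |z|_[i],
    where |z|_[1] >= ... >= |z|_[d] are the sorted magnitudes.
    Steps: sort |v| in decreasing order, subtract the weights,
    restore monotonicity by merging adjacent violating groups
    (pool-adjacent-violators), clip at zero, undo the sort,
    and restore the signs."""
    d = v.size
    sign = np.sign(v)
    a = np.abs(v)
    order = np.argsort(-a)                        # decreasing-|v| order
    w = lam1 + lam2 * np.arange(d - 1, -1, -1)    # decreasing weights
    diff = a[order] - w
    # Pool-adjacent-violators: merge adjacent groups until the
    # sequence is non-increasing; each merged group takes its mean.
    vals, sizes = [], []
    for x in diff:
        vals.append(x)
        sizes.append(1)
        while len(vals) > 1 and vals[-1] > vals[-2]:
            s = sizes[-1] + sizes[-2]
            m = (vals[-1] * sizes[-1] + vals[-2] * sizes[-2]) / s
            vals[-2:] = [m]
            sizes[-2:] = [s]
    z_sorted = np.concatenate(
        [np.full(s, max(val, 0.0)) for val, s in zip(vals, sizes)]
    )
    z = np.empty(d)
    z[order] = z_sorted                           # undo the sort
    return sign * z
```

With lam2 > 0, nearby coefficients are pulled to exactly equal magnitudes (one merged group), while lam1 alone recovers ordinary soft-thresholding; each element of the input is touched a constant number of times, consistent with the near-linear cost reported in the abstract.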

Original language: English
Article number: 6238378
Pages (from-to): 1436-1447
Number of pages: 12
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 23
Issue number: 9
DOIs: yes
Publication status: Published - 2012

Keywords

  • Accelerated gradient descent
  • feature grouping
  • sparse modeling
  • structured sparsity

