Core vector regression for very large regression problems

Ivor W. Tsang*, James T. Kwok, Kimo T. Lai

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-review

Abstract

In this paper, we extend the recently proposed Core Vector Machine algorithm to the regression setting by generalizing the underlying minimum enclosing ball problem. The resultant Core Vector Regression (CVR) algorithm can be used with any linear or nonlinear kernel and obtains provably approximately optimal solutions. Its asymptotic time complexity is linear in the number of training patterns m, while its space complexity is independent of m. Experiments show that CVR has performance comparable to SVR, but is much faster and produces far fewer support vectors on very large data sets. It is also successfully applied to large 3D point sets in computer graphics for the modeling of implicit surfaces.
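
The paper formulates CVR as a generalized minimum enclosing ball (MEB) problem in a kernel-induced feature space and solves it with a core-set based quadratic program. The snippet below is only a minimal sketch of the core-set idea behind that complexity claim, assuming plain Euclidean points and the simple Bădoiu–Clarkson farthest-point update rather than the paper's kernelized formulation; the function name meb_approx and its parameters are illustrative, not from the paper. It shows why each of the O(1/eps^2) iterations costs one linear pass over the m points, while the state that must be kept (the current center) does not grow with m.

```python
import numpy as np

def meb_approx(X, eps=0.1, seed=None):
    """(1 + eps)-approximate minimum enclosing ball via the simple
    Badoiu-Clarkson update: repeatedly pull the center toward the
    current farthest point. Runs O(1/eps^2) iterations, each a single
    pass over the data, so time is linear in the number of points m
    and the stored state (the center) is independent of m."""
    rng = np.random.default_rng(seed)
    c = X[rng.integers(len(X))].astype(float)    # start at a random data point
    n_iter = int(np.ceil(1.0 / eps ** 2))
    for i in range(1, n_iter + 1):
        d = np.linalg.norm(X - c, axis=1)        # distances to current center
        far = X[np.argmax(d)]                    # farthest point acts as a core point
        c += (far - c) / (i + 1)                 # shrinking step toward it
    radius = np.max(np.linalg.norm(X - c, axis=1))
    return c, radius

# Toy usage on a synthetic 3D point set (hypothetical data, not the paper's).
X = np.random.default_rng(0).normal(size=(100_000, 3))
center, radius = meb_approx(X, eps=0.1)
print(center, radius)
```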

Original language: English
Title of host publication: ICML 2005 - Proceedings of the 22nd International Conference on Machine Learning
Editors: L. De Raedt, S. Wrobel
Pages: 913-920
Number of pages: 8
Publication status: Published - 2005
Event: ICML 2005: 22nd International Conference on Machine Learning - Bonn, Germany
Duration: 7 Aug 2005 - 11 Aug 2005

Publication series

Name: ICML 2005 - Proceedings of the 22nd International Conference on Machine Learning

Conference

Conference: ICML 2005: 22nd International Conference on Machine Learning
Country/Territory: Germany
City: Bonn
Period: 7/08/05 - 11/08/05
