Leave-one-out bounds for kernel methods

Tong Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Journal Article › peer-review

94 Citations (Scopus)

Abstract

In this article, we study leave-one-out style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability for regularized kernel formulations. Using this result, we derive bounds on expected leave-one-out cross-validation errors, which lead to expected generalization bounds for various kernel algorithms. In addition, we also obtain variance bounds for leave-one-out errors. We apply our analysis to several classification and regression problems and compare the resulting bounds with previous results.
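As background for the leave-one-out analysis the abstract describes, the sketch below illustrates the classical closed-form leave-one-out identity for kernel ridge regression: with C = (K + λI)⁻¹ and α = Cy, the i-th leave-one-out residual equals αᵢ / Cᵢᵢ, so no model needs to be refit. This is a standard identity used for illustration only, not the paper's stability-based bounds; the kernel choice and function names here are the author's own.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and rows of Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_residuals_krr(K, y, lam):
    # Exact leave-one-out residuals for kernel ridge regression:
    # e_i = y_i - f^{(-i)}(x_i) = alpha_i / C_ii,
    # where C = (K + lam*I)^{-1} and alpha = C @ y.
    C = np.linalg.inv(K + lam * np.eye(len(y)))
    alpha = C @ y
    return alpha / np.diag(C)

# Sanity check against explicit refitting on synthetic data.
rng = np.random.default_rng(0)
X = rng.normal(size=(8, 2))
y = rng.normal(size=8)
K = rbf_kernel(X, X)
lam = 0.5

e_fast = loo_residuals_krr(K, y, lam)
for i in range(len(y)):
    idx = [j for j in range(len(y)) if j != i]
    Ki = K[np.ix_(idx, idx)]
    alpha_i = np.linalg.solve(Ki + lam * np.eye(len(idx)), y[idx])
    e_refit = y[i] - K[i, idx] @ alpha_i
    assert abs(e_fast[i] - e_refit) < 1e-8
```

The identity follows from the Sherman-Morrison formula and is what makes leave-one-out error an attractive, cheaply computable quantity to bound for regularized kernel estimators.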

Original language: English
Pages (from-to): 1397-1437
Number of pages: 41
Journal: Neural Computation
Volume: 15
Issue number: 6
DOIs
Publication status: Published - Jun 2003
Externally published: Yes
