Abstract
In this article, we study leave-one-out-style cross-validation bounds for kernel methods. The essential element in our analysis is a bound on the parameter estimation stability of regularized kernel formulations. Using this result, we derive bounds on expected leave-one-out cross-validation errors, which in turn yield expected generalization bounds for various kernel algorithms. In addition, we obtain variance bounds for leave-one-out errors. We apply our analysis to several classification and regression problems and compare the resulting bounds with previous results.
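The paper's bounds control the expected leave-one-out error of regularized kernel estimators. As a hedged illustration of the quantity being bounded (not the paper's derivation), the sketch below computes exact leave-one-out residuals for kernel ridge regression using the standard closed-form shortcut, which avoids refitting the model n times. The RBF kernel, the regularization parameter `lam`, the bandwidth `gamma`, and the synthetic data are all assumptions made for this example.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian RBF kernel matrix between rows of X and Y (illustrative choice).
    sq = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def loo_errors_krr(X, y, lam=0.1, gamma=1.0):
    """Exact leave-one-out residuals for kernel ridge regression.

    Uses the standard closed-form identity
        e_i^loo = (y_i - yhat_i) / (1 - H_ii),
    where H = K (K + lam*n*I)^{-1} is the smoother ("hat") matrix.
    This is a well-known shortcut for linear smoothers, not the
    stability argument used in the paper.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + lam * n * np.eye(n))
    residuals = y - H @ y          # in-sample residuals
    return residuals / (1.0 - np.diag(H))

# Usage on synthetic data: the mean squared LOO error is an
# (almost unbiased) estimate of the generalization error that
# bounds of this kind aim to control.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
print("LOO MSE:", np.mean(loo_errors_krr(X, y) ** 2))
```

With `lam > 0` the diagonal entries of `H` are strictly below one, so the division is safe; the shortcut returns exactly the residuals one would get by retraining on each size-(n-1) subset.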
| Original language | English |
|---|---|
| Pages (from-to) | 1397-1437 |
| Number of pages | 41 |
| Journal | Neural Computation |
| Volume | 15 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - Jun 2003 |
| Externally published | Yes |