Abstract
In this paper, we prove a general leave-one-out-style cross-validation bound for kernel methods. We apply this bound to some classification and regression problems, and compare the results with previously known bounds. One aspect of our analysis is that the derived expected generalization bounds reflect both approximation (bias) and learning (variance) properties of the underlying kernel methods. We are thus able to demonstrate the universality of certain learning formulations.
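As background for the leave-one-out analysis the abstract describes, the sketch below illustrates the leave-one-out cross-validation error for a standard kernel method (kernel ridge regression), computed via the well-known closed-form identity for linear smoothers rather than by retraining n times. This is only an illustrative example of the general technique, not the paper's bound; the function names and the RBF kernel choice are assumptions made for the sketch.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and rows of Y.
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def loo_cv_error(X, y, lam=0.1, gamma=1.0):
    """Exact leave-one-out squared error for kernel ridge regression.

    For the smoother matrix H = K (K + lam*I)^{-1} with fitted values
    yhat = H y, the leave-one-out residual at point i has the closed
    form (y_i - yhat_i) / (1 - H_ii), so no retraining is required.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    yhat = H @ y
    loo_resid = (y - yhat) / (1.0 - np.diag(H))
    return np.mean(loo_resid ** 2)
```

The closed-form identity makes the leave-one-out error as cheap as a single fit, which is what makes leave-one-out bounds of the kind studied in the paper practical to relate to the empirical behavior of kernel methods.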
| Original language | English |
|---|---|
| Title of host publication | Computational Learning Theory - 14th Annual Conference on Computational Learning Theory, COLT 2001 and 5th European Conference on Computational Learning Theory, EuroCOLT 2001, Proceedings |
| Editors | David Helmbold, Bob Williamson |
| Publisher | Springer Verlag |
| Pages | 427-443 |
| Number of pages | 17 |
| ISBN (Print) | 9783540423430 |
| DOIs | |
| Publication status | Published - 2001 |
| Externally published | Yes |
| Event | 14th Annual Conference on Computational Learning Theory, COLT 2001 and 5th European Conference on Computational Learning Theory, EuroCOLT 2001 - Amsterdam, Netherlands |
| Duration | 16 Jul 2001 → 19 Jul 2001 |
Publication series
| Name | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
|---|---|
| Volume | 2111 |
| ISSN (Print) | 0302-9743 |
| ISSN (Electronic) | 1611-3349 |
Conference
| Conference | 14th Annual Conference on Computational Learning Theory, COLT 2001 and 5th European Conference on Computational Learning Theory, EuroCOLT 2001 |
|---|---|
| Country/Territory | Netherlands |
| City | Amsterdam |
| Period | 16/07/01 → 19/07/01 |
Bibliographical note
Publisher Copyright: © Springer-Verlag Berlin Heidelberg 2001.