Abstract
We investigate the generalization performance of a class of learning problems in Hilbert function spaces. We introduce a concept of scale-sensitive effective data dimension, and show that it characterizes the convergence rate of the underlying learning problem. Using this concept, we can naturally extend results for parametric estimation problems in finite-dimensional spaces to non-parametric kernel learning methods. We derive upper bounds on the generalization performance and show that the resulting convergence rates are optimal under various circumstances.
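As a rough illustration of the abstract's central quantity, a commonly used formulation of the scale-sensitive effective dimension of a kernel problem is the trace-style sum of eigenvalue ratios \(\sum_i \mu_i/(\mu_i + \lambda)\), where the \(\mu_i\) are (normalized) kernel eigenvalues and \(\lambda\) is the regularization scale. The sketch below assumes this formulation and a Gaussian (RBF) kernel; it is not taken from the paper itself, only a hedged numerical illustration of how the effective dimension varies with scale:

```python
import numpy as np

def effective_dimension(K, lam):
    """Scale-sensitive effective dimension of a kernel matrix K at scale lam,
    computed as sum_i mu_i / (mu_i + lam) over eigenvalues mu_i of K / n.
    (Assumed formulation for illustration, not the paper's exact definition.)"""
    n = K.shape[0]
    mu = np.linalg.eigvalsh(K) / n
    mu = np.clip(mu, 0.0, None)  # guard against tiny negative eigenvalues
    return float(np.sum(mu / (mu + lam)))

# RBF kernel on synthetic 1-D data (hypothetical example data)
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, size=200)
K = np.exp(-(x[:, None] - x[None, :]) ** 2)

# The effective dimension shrinks as the regularization scale lam grows,
# interpolating between the full data dimension and zero.
for lam in (1e-4, 1e-2, 1.0):
    print(f"lam={lam:g}  d_eff={effective_dimension(K, lam):.2f}")
```

Each ratio \(\mu_i/(\mu_i + \lambda)\) lies in \([0, 1)\), so the effective dimension is always below the sample size \(n\) and decreases monotonically in \(\lambda\), which is what makes it a scale-sensitive substitute for the raw dimension in finite-dimensional bounds.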
| Original language | English |
|---|---|
| Title of host publication | NIPS 2002 |
| Subtitle of host publication | Proceedings of the 15th International Conference on Neural Information Processing Systems |
| Editors | Suzanna Becker, Sebastian Thrun, Klaus Obermayer |
| Publisher | MIT Press Journals |
| Pages | 454-461 |
| Number of pages | 8 |
| ISBN (Electronic) | 0262025507, 9780262025508 |
| Publication status | Published - 2002 |
| Externally published | Yes |
| Event | 15th International Conference on Neural Information Processing Systems, NIPS 2002 - Vancouver, Canada Duration: 9 Dec 2002 → 14 Dec 2002 |
Publication series
| Name | NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems |
|---|---|
Conference
| Conference | 15th International Conference on Neural Information Processing Systems, NIPS 2002 |
|---|---|
| Country/Territory | Canada |
| City | Vancouver |
| Period | 9/12/02 → 14/12/02 |
Bibliographical note
Publisher Copyright: © NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems. All rights reserved.