Effective Dimension and Generalization of Kernel Learning

Tong Zhang*

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-reviewed

16 Citations (Scopus)

Abstract

We investigate the generalization performance of some learning problems in Hilbert function spaces. We introduce a concept of scale-sensitive effective data dimension, and show that it characterizes the convergence rate of the underlying learning problem. Using this concept, we can naturally extend results for parametric estimation problems in finite-dimensional spaces to non-parametric kernel learning methods. We derive upper bounds on the generalization performance and show that the resulting convergence rates are optimal under various circumstances.
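In this line of work, a scale-sensitive effective dimension is commonly computed from the eigenvalues of the (normalized) kernel Gram matrix as d(λ) = Σ_i μ_i / (μ_i + λ), which interpolates between the full rank as λ → 0 and zero as λ → ∞. The sketch below is illustrative only, not the paper's exact definition; the RBF kernel, bandwidth, and sample sizes are arbitrary assumptions for the example.

```python
import numpy as np

def effective_dimension(K, lam):
    """Scale-sensitive effective dimension of a kernel Gram matrix K at
    regularization scale lam: d(lam) = sum_i mu_i / (mu_i + lam), where
    mu_i are the eigenvalues of K / n."""
    n = K.shape[0]
    mu = np.linalg.eigvalsh(K / n)
    mu = np.clip(mu, 0.0, None)  # guard against tiny negative eigenvalues
    return float(np.sum(mu / (mu + lam)))

# Toy data: Gaussian (RBF) kernel on 50 points in [0, 1]
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(50, 1))
sq_dists = (X - X.T) ** 2                 # pairwise squared distances
K = np.exp(-sq_dists / (2 * 0.1 ** 2))    # RBF kernel, bandwidth 0.1

for lam in (1.0, 0.1, 0.01):
    print(f"lambda={lam}: d(lambda) = {effective_dimension(K, lam):.2f}")
```

The quantity is monotonically decreasing in λ: stronger regularization makes the problem behave as if it lived in a lower-dimensional space, which is the intuition behind finite-dimensional rates carrying over to kernel methods.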

Original language: English
Title of host publication: NIPS 2002
Subtitle of host publication: Proceedings of the 15th International Conference on Neural Information Processing Systems
Editors: Suzanna Becker, Sebastian Thrun, Klaus Obermayer
Publisher: MIT Press Journals
Pages: 454-461
Number of pages: 8
ISBN (Electronic): 0262025507, 9780262025508
Publication status: Published - 2002
Externally published: Yes
Event: 15th International Conference on Neural Information Processing Systems, NIPS 2002 - Vancouver, Canada
Duration: 9 Dec 2002 – 14 Dec 2002

Publication series

Name: NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems

Conference

Conference: 15th International Conference on Neural Information Processing Systems, NIPS 2002
Country/Territory: Canada
City: Vancouver
Period: 9/12/02 – 14/12/02

Bibliographical note

Publisher Copyright:
© NIPS 2002: Proceedings of the 15th International Conference on Neural Information Processing Systems. All rights reserved.
