Some theoretical results concerning the convergence of compositions of regularized linear functions

Tong Zhang*

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-review

2 Citations (Scopus)

Abstract

Recently, sample complexity bounds have been derived for problems involving linear functions such as neural networks and support vector machines. In this paper, we extend some theoretical results in this area by deriving dimension-independent covering number bounds for regularized linear functions under certain regularization conditions. We show that such bounds lead to a class of new methods for training linear classifiers with theoretical advantages similar to those of the support vector machine. Furthermore, we present a theoretical analysis of these new methods from the asymptotic statistical point of view. This technique provides a better description of the large-sample behavior of these algorithms.
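To make the class of methods described in the abstract concrete, the sketch below trains a regularized linear classifier by minimizing an empirical loss plus a norm penalty on the weight vector. This is a minimal illustration only, not the paper's algorithm: the logistic loss, the squared 2-norm regularizer, and all hyperparameter values (lam, lr, epochs) are assumptions chosen for readability; the paper's regularization conditions are more general.

```python
import numpy as np

def train_regularized_linear(X, y, lam=0.1, lr=0.1, epochs=200):
    """Gradient descent on (1/n) * sum_i log(1 + exp(-y_i w.x_i)) + (lam/2) * ||w||_2^2.

    X: (n, d) feature matrix, y: labels in {-1, +1}. Returns the weight vector w.
    """
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)                       # y_i * w.x_i
        sigma = 1.0 / (1.0 + np.exp(np.clip(margins, -30, 30)))  # -dl/dm for logistic loss
        grad = -(X.T @ (sigma * y)) / n + lam * w   # empirical gradient + regularizer gradient
        w -= lr * grad
    return w

# Toy usage on synthetic linearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = np.sign(X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]))
w_hat = train_regularized_linear(X, y)
print("training accuracy:", np.mean(np.sign(X @ w_hat) == y))
```

The regularization term is what makes the covering number (and hence the sample complexity) depend on the norm of the weights rather than on the input dimension, which is the sense in which such methods share the theoretical advantages of the support vector machine.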

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 12 - Proceedings of the 1999 Conference, NIPS 1999
Publisher: Neural Information Processing Systems Foundation
Pages: 370-378
Number of pages: 9
ISBN (Print): 0262194503, 9780262194501
Publication status: Published - 2000
Externally published: Yes
Event: 13th Annual Neural Information Processing Systems Conference, NIPS 1999 - Denver, CO, United States
Duration: 29 Nov 1999 - 4 Dec 1999

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 13th Annual Neural Information Processing Systems Conference, NIPS 1999
Country/Territory: United States
City: Denver, CO
Period: 29/11/99 - 4/12/99
