Regularized winnow methods

Tong Zhang*

*Corresponding author for this work

Research output: Chapter in Book/Conference Proceeding/Report › Conference paper published in a book › peer-review

22 Citations (Scopus)

Abstract

In theory, the Winnow multiplicative update has certain advantages over the Perceptron additive update when there are many irrelevant attributes. Recently, there has been much effort on enhancing the Perceptron algorithm using regularization, leading to a class of linear classification methods called support vector machines. Similarly, it is possible to apply the regularization idea to the Winnow algorithm, which yields methods we call regularized Winnows. We show that the resulting methods compare with the basic Winnows much as a support vector machine compares with the Perceptron. We investigate algorithmic issues and learning properties of the derived methods. Experimental results are also provided to illustrate the different methods.
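The abstract contrasts the Perceptron's additive update with Winnow's multiplicative update. The sketch below illustrates that contrast on a single mistaken example; it is a minimal illustration under standard textbook formulations, not the paper's regularized variants, and the function names and learning rate are illustrative.

```python
import math

def perceptron_update(w, x, y, eta=1.0):
    # Perceptron additive update on a mistake: w_i <- w_i + eta * y * x_i
    return [wi + eta * y * xi for wi, xi in zip(w, x)]

def winnow_update(w, x, y, eta=1.0):
    # Winnow multiplicative update on a mistake: w_i <- w_i * exp(eta * y * x_i)
    # (unnormalized exponentiated-gradient form; Winnow starts from
    # uniform positive weights rather than zeros)
    return [wi * math.exp(eta * y * xi) for wi, xi in zip(w, x)]

# Four attributes, of which only the first and third are active; true label +1.
x = [1.0, 0.0, 1.0, 0.0]
w_add = perceptron_update([0.0] * 4, x, y=1)   # weights move additively toward x
w_mul = winnow_update([1.0] * 4, x, y=1)       # active weights scale by e, inactive stay at 1
```

Because inactive (zero) attributes leave their weights untouched while active ones are rescaled exponentially, Winnow's mistake bound degrades only logarithmically in the number of irrelevant attributes, which is the advantage the abstract refers to.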

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 13 - Proceedings of the 2000 Conference, NIPS 2000
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262122413, 9780262122412
Publication status: Published - 2001
Externally published: Yes
Event: 14th Annual Neural Information Processing Systems Conference, NIPS 2000 - Denver, CO, United States
Duration: 27 Nov 2000 - 2 Dec 2000

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 14th Annual Neural Information Processing Systems Conference, NIPS 2000
Country/Territory: United States
City: Denver, CO
Period: 27/11/00 - 2/12/00
