New framework for modeling learning dynamics

Y. W. Tong*, K. Y. Michael Wong, S. Li

*Corresponding author for this work

Research output: Contribution to conference › Conference Paper › peer-review

Abstract

An important issue in neural computing concerns the description of learning dynamics with macroscopic dynamical variables. Recent progress on on-line learning addresses only the often unrealistic case of an infinite training set. We introduce a new framework for modeling batch learning of restricted sets of examples, which is widely applicable to any learning cost function and fully takes into account the temporal correlations introduced by recycling the examples. Here we illustrate the technique using the Adaline rule learning random or teacher-generated examples.
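The Adaline setting described above can be sketched in code. The following is a minimal illustrative example, not the paper's actual experiment: batch gradient descent on the summed squared error over a fixed, repeatedly recycled set of p examples, with targets generated by a fixed teacher vector. All variable names and parameter values (N, p, eta, epochs) are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

N, p = 50, 100      # input dimension, number of training examples (illustrative)
eta = 0.3           # learning rate (illustrative)
epochs = 500

# Teacher-generated examples: targets produced by a fixed teacher vector,
# so the rule is trained on a restricted, realizable example set.
teacher = rng.standard_normal(N) / np.sqrt(N)
X = rng.standard_normal((p, N))
y = X @ teacher

# Batch Adaline (LMS) learning: every epoch reuses the same p examples,
# which is what introduces the temporal correlations the framework models.
w = np.zeros(N)
for _ in range(epochs):
    err = y - X @ w             # residuals on the full recycled batch
    w += eta * X.T @ err / p    # gradient step on the mean squared error

train_mse = np.mean((y - X @ w) ** 2)
```

Because the targets are exactly realizable by the teacher, the training error of this sketch decays toward zero; replacing `y` with random labels would instead illustrate the random-examples case mentioned in the abstract.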

Original language: English
Pages: 1164-1168
Number of pages: 5
Publication status: Published - 1999
Externally published: Yes
Event: International Joint Conference on Neural Networks (IJCNN'99) - Washington, DC, USA
Duration: 10 Jul 1999 - 16 Jul 1999

Conference

Conference: International Joint Conference on Neural Networks (IJCNN'99)
City: Washington, DC, USA
Period: 10/07/99 - 16/07/99
