Abstract
An important issue in neural computing concerns the description of learning dynamics with macroscopic dynamical variables. Recent progress on on-line learning addresses only the often unrealistic case of an infinite training set. We introduce a new framework for modelling batch learning of restricted sets of examples, widely applicable to any learning cost function, which fully takes into account the temporal correlations introduced by the recycling of the examples. Here we illustrate the technique using the Adaline rule learning random or teacher-generated examples.
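The setting the abstract describes can be sketched in code: batch (full-gradient) training of an Adaline unit on a fixed, restricted set of teacher-generated examples that is recycled every epoch. This is only an illustrative sketch, not the paper's analysis; the sizes `N`, `P`, the learning rate, and all variable names are assumptions for demonstration.

```python
import numpy as np

# Illustrative sketch: Adaline (LMS) rule trained in batch mode on a fixed,
# restricted set of P teacher-generated examples that is recycled each epoch.
# N, P, eta and the epoch count are arbitrary choices, not values from the paper.
rng = np.random.default_rng(0)
N, P = 50, 100                        # input dimension, number of stored examples
teacher = rng.standard_normal(N) / np.sqrt(N)   # linear teacher vector
X = rng.standard_normal((P, N))       # the restricted training set
y = X @ teacher                       # teacher-generated targets

w = np.zeros(N)                       # student weights
eta = 0.05 / P                        # batch learning rate
for epoch in range(2000):
    err = y - X @ w                   # residuals on the whole recycled batch
    w += eta * X.T @ err              # Adaline/LMS batch gradient step

train_mse = np.mean((y - X @ w) ** 2)
```

Because every epoch revisits the same P examples, the weight updates are correlated across time; it is precisely these recycling-induced correlations that an infinite-training-set (on-line) analysis ignores and that the framework in the abstract is designed to capture.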
| Original language | English |
|---|---|
| Pages | 1164-1168 |
| Number of pages | 5 |
| Publication status | Published - 1999 |
| Externally published | Yes |
| Event | International Joint Conference on Neural Networks (IJCNN'99), Washington, DC, USA, 10 Jul 1999 → 16 Jul 1999 |
Conference
| Conference | International Joint Conference on Neural Networks (IJCNN'99) |
|---|---|
| City | Washington, DC, USA |
| Period | 10/07/99 → 16/07/99 |