Online learning via congregational gradient descent

Kim L. Blackmore, Robert C. Williamson, Iven M.Y. Mareels, William A. Sethares

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

1 Citation (Scopus)

Abstract

We propose and analyse a population-based version of stepwise gradient descent suitable for a wide range of learning problems. The algorithm is motivated by genetic algorithms, which update a population of solutions rather than the single representative that is typical for gradient descent. This modification of traditional gradient descent (as used, for example, in the backpropagation algorithm) avoids getting trapped in local minima. We use an averaging analysis of the algorithm to relate its behaviour to an associated ordinary differential equation. We derive a result on how long one must wait so that, with a given high probability, the algorithm is within a specified neighbourhood of the global minimum. We also analyse the effect of different population sizes. An example is presented that corroborates our theory well.
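The abstract describes the algorithm only at a high level. As a rough illustration of the general idea, the Python sketch below runs a population of stepwise gradient descent iterates in parallel and periodically resamples the weakest members around the current best. The objective (the Rastrigin test function), the resampling rule, and all parameter values are hypothetical stand-ins chosen for illustration; they are not the update rule analysed in the paper.

# Illustrative sketch of population-based ("congregational") gradient
# descent on a multimodal objective. The selection/restart rule and the
# constants below are assumptions, not the paper's actual algorithm.
import numpy as np

def rastrigin(x):
    # Standard multimodal test function with many local minima.
    return 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

def grad_rastrigin(x):
    # Gradient of the Rastrigin function.
    return 2 * x + 20 * np.pi * np.sin(2 * np.pi * x)

def congregational_gd(f, grad, dim=2, pop_size=20, steps=2000,
                      lr=0.01, noise=0.1, restart_every=200, seed=0):
    rng = np.random.default_rng(seed)
    pop = rng.uniform(-5.0, 5.0, size=(pop_size, dim))  # initial population
    for t in range(steps):
        # Each member takes one noisy stepwise gradient descent step.
        for i in range(pop_size):
            pop[i] -= lr * grad(pop[i]) + noise * rng.standard_normal(dim)
        # Periodically resample the worst half of the population near the
        # current best member (a hypothetical selection rule).
        if (t + 1) % restart_every == 0:
            scores = np.array([f(p) for p in pop])
            order = np.argsort(scores)
            best = pop[order[0]].copy()
            for i in order[pop_size // 2:]:
                pop[i] = best + rng.standard_normal(dim)
    scores = np.array([f(p) for p in pop])
    return pop[np.argmin(scores)], scores.min()

x_star, f_star = congregational_gd(rastrigin, grad_rastrigin)
print("best point:", x_star, "value:", f_star)

On a multimodal objective like this, a single gradient descent run started in a poor basin typically stalls in a local minimum, whereas maintaining a population of iterates makes it far more likely that at least one member descends into the neighbourhood of the global minimum, which is the intuition behind the paper's analysis.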

Original language: English
Title of host publication: Proceedings of the 8th Annual Conference on Computational Learning Theory, COLT 1995
Publisher: Association for Computing Machinery (ACM)
Pages: 265-272
Number of pages: 8
ISBN (Electronic): 0897917235, 9780897917230
DOIs
Publication status: Published - 5 Jul 1995
Event: 8th Annual Conference on Computational Learning Theory, COLT 1995 - Santa Cruz, United States
Duration: 5 Jul 1995 - 8 Jul 1995

Publication series

Name: Proceedings of the 8th Annual Conference on Computational Learning Theory, COLT 1995
Volume: 1995-January

Conference

Conference: 8th Annual Conference on Computational Learning Theory, COLT 1995
Country/Territory: United States
City: Santa Cruz
Period: 5/07/95 - 8/07/95
