Local minima and attractors at infinity for gradient descent learning algorithms

Kim L. Blackmore*, Robert C. Williamson, Iven M.Y. Mareels

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

2 Citations (Scopus)

Abstract

In the paper 'Learning Nonlinearly Parametrized Decision Regions', an online scheme for learning a very general class of decision regions is given, together with conditions on both the parametrization and on the sequence of input examples under which good learning is guaranteed to occur. In this paper, we discuss these conditions, in particular the requirement that the relevant cost function have no non-global local minima and, more specifically, no attractor at infinity. Somewhat simpler sufficient conditions are given, and a number of examples are discussed.
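The phenomenon of an attractor at infinity can be illustrated with a simple sketch (not taken from the paper; the cost function and step size here are hypothetical choices for illustration): for a cost such as J(θ) = exp(−θ), the infimum 0 is never attained at any finite θ, so gradient descent drifts upward without bound rather than converging to a minimizer.

```python
import math

def gradient_descent(grad, theta0, lr=0.5, steps=200):
    """Plain gradient descent: theta <- theta - lr * grad(theta)."""
    theta = theta0
    for _ in range(steps):
        theta = theta - lr * grad(theta)
    return theta

# J(theta) = exp(-theta) is strictly decreasing; its gradient is
# -exp(-theta), so every step moves theta upward. The cost tends to 0
# but no finite theta achieves it: an attractor at infinity.
grad_J = lambda t: -math.exp(-t)

theta_short = gradient_descent(grad_J, theta0=0.0, steps=100)
theta_long = gradient_descent(grad_J, theta0=0.0, steps=10000)
print(theta_short, theta_long)  # iterates keep growing with more steps
```

Running more iterations always yields a larger parameter value, which is the signature of this failure mode: the cost decreases monotonically yet the iterates never settle at a finite point.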

Original language: English
Pages (from-to): 231-234
Number of pages: 4
Journal: Journal of Mathematical Systems, Estimation, and Control
Volume: 6
Issue number: 2
Publication status: Published - 1996