TY - JOUR
T1 - Local minima and attractors at infinity for gradient descent learning algorithms
AU - Blackmore, Kim L.
AU - Williamson, Robert C.
AU - Mareels, Iven M.Y.
PY - 1996
Y1 - 1996
AB - In the paper 'Learning Nonlinearly Parametrized Decision Regions', an online scheme for learning a very general class of decision regions is given, together with conditions on both the parametrization and the sequence of input examples under which good learning can be guaranteed to occur. In this paper we discuss these conditions, in particular the requirement that there be no non-global local minima of the relevant cost function, and the more specific requirement that there be no attractor at infinity. Somewhat simpler sufficient conditions are given, and a number of examples are discussed.
UR - http://www.scopus.com/inward/record.url?scp=0029776684&partnerID=8YFLogxK
M3 - Article
AN - SCOPUS:0029776684
SN - 1052-0600
VL - 6
SP - 231
EP - 234
JO - Journal of Mathematical Systems, Estimation, and Control
JF - Journal of Mathematical Systems, Estimation, and Control
IS - 2
ER -