Abstract
In the early 1980s, Łojasiewicz [in Seminari di Geometria 1982-1983, Università di Bologna, Istituto di Geometria, Dipartimento di Matematica, 1984, pp. 115-117] proved that a bounded solution of a gradient flow for an analytic cost function converges to a well-defined limit point. In this paper, we show that the iterates of numerical descent algorithms for an analytic cost function share this convergence property if they satisfy certain natural descent conditions. The results obtained are applicable to a broad class of optimization schemes and strengthen classical "weak convergence" results for descent methods to "strong limit-point convergence" for a large class of cost functions of practical interest. The result requires neither that the cost have isolated critical points, nor any convexity of the cost, nor any nondegeneracy conditions on the Hessian of the cost at critical points.
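For context, the sketch below states the standard Łojasiewicz gradient inequality for an analytic cost f near a critical point x*, together with one illustrative sufficient-decrease condition of the kind the abstract calls a "natural descent condition". The constants c, θ, σ and the exact form of the descent condition are assumptions for illustration, not a verbatim statement of the paper's hypotheses.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Standard Lojasiewicz gradient inequality for an analytic cost f near a
% critical point x^*: the gradient norm dominates a power of the cost gap
% on a neighborhood U of x^*.
\[
  \exists\, c > 0,\ \theta \in [\tfrac12, 1),\ \text{neighborhood } U \ni x^* :
  \qquad
  \lvert f(x) - f(x^*) \rvert^{\theta} \;\le\; c\,\lVert \nabla f(x) \rVert
  \quad \text{for all } x \in U.
\]

% One illustrative descent condition (with a fixed constant sigma > 0):
% each accepted step must decrease the cost proportionally to the
% gradient norm times the step length.
\[
  f(x_k) - f(x_{k+1}) \;\ge\; \sigma\, \lVert \nabla f(x_k) \rVert\,
  \lVert x_{k+1} - x_k \rVert .
\]

\end{document}
```

Roughly, a condition of this kind lets the Łojasiewicz inequality bound the total length of the iterate trajectory, which is what upgrades mere accumulation at the critical set to convergence to a single limit point.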
| Original language | English |
| --- | --- |
| Pages (from-to) | 531-547 |
| Number of pages | 17 |
| Journal | SIAM Journal on Optimization |
| Volume | 16 |
| Issue number | 2 |
| DOIs | |
| Publication status | Published - 2006 |