Newton-like methods for numerical optimization on manifolds

Knut Hüper*, Jochen Trumpf

*Corresponding author for this work

    Research output: Contribution to journal › Conference article › peer-review

    31 Citations (Scopus)

    Abstract

    Many problems in signal processing require the numerical optimization of a cost function defined on a smooth manifold. In particular, orthogonally or unitarily constrained optimization problems tend to occur in signal processing tasks involving subspaces. In this paper we consider Newton-like methods for solving such problems. Under the assumption that the parameterization of the manifold is linked to so-called Riemannian normal coordinates, our algorithms can be considered intrinsic Newton methods. Moreover, even in the absence of such a relationship, we can still prove local quadratic convergence to a critical point of the cost function by means of analysis on manifolds. Our approach is demonstrated by a detailed example: computing the dominant eigenspace of a real symmetric matrix.
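    As a rough illustration of the kind of Newton-like scheme the abstract discusses (this is not the authors' algorithm), the classical Rayleigh quotient iteration can be viewed as a Newton-type method on the unit sphere for the symmetric eigenvalue problem. A minimal NumPy sketch, assuming a starting vector with a large component along the dominant eigenvector:

    ```python
    import numpy as np

    def rayleigh_quotient_iteration(A, x0, max_iter=20, tol=1e-12):
        # Newton-like iteration on the unit sphere: each step solves a
        # shifted linear system and renormalizes (a "retraction" back onto
        # the sphere); convergence is locally cubic for symmetric A.
        x = x0 / np.linalg.norm(x0)
        for _ in range(max_iter):
            rho = x @ A @ x                   # Rayleigh quotient (eigenvalue estimate)
            if np.linalg.norm(A @ x - rho * x) < tol:
                break                         # converged to an eigenpair
            try:
                y = np.linalg.solve(A - rho * np.eye(A.shape[0]), x)
            except np.linalg.LinAlgError:
                break                         # shift hit an exact eigenvalue
            x = y / np.linalg.norm(y)         # renormalize onto the sphere
        return x, x @ A @ x

    # Usage: dominant eigenpair of a small symmetric matrix, started
    # near the dominant eigenvector (eigenvalues are 3 ± sqrt(3) and 3).
    A = np.array([[4.0, 1.0, 0.0],
                  [1.0, 3.0, 1.0],
                  [0.0, 1.0, 2.0]])
    x, rho = rayleigh_quotient_iteration(A, np.array([1.0, 0.2, 0.0]))
    ```

    Like the methods in the paper, the iteration works with intrinsic quantities on the constraint manifold rather than with an unconstrained reparameterization; which eigenpair it converges to depends on the starting point.
    
    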

    Original language: English
    Pages (from-to): 136-139
    Number of pages: 4
    Journal: Conference Record of the Asilomar Conference on Signals, Systems and Computers
    Volume: 1
    Publication status: Published - 2004
    Event: Thirty-Eighth Asilomar Conference on Signals, Systems and Computers - Pacific Grove, CA, United States
    Duration: 7 Nov 2004 - 10 Nov 2004
