Abstract
Many problems in signal processing require the numerical optimization of a cost function defined on a smooth manifold. In particular, orthogonally or unitarily constrained optimization problems tend to arise in signal processing tasks involving subspaces. In this paper we consider Newton-like methods for solving these types of problems. Under the assumption that the parameterization of the manifold is linked to so-called Riemannian normal coordinates, our algorithms can be considered intrinsic Newton methods. Moreover, even if no such relationship exists, we can still prove local quadratic convergence to a critical point of the cost function by means of analysis on manifolds. Our approach is demonstrated with a detailed example: computing the dominant eigenspace of a real symmetric matrix.
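The abstract's example, computing a dominant eigenspace of a real symmetric matrix via a Riemannian Newton method, can be illustrated with a simplified single-vector analogue (subspace dimension p = 1). The sketch below is not the paper's algorithm; it is a generic Riemannian Newton iteration for the Rayleigh quotient on the unit sphere, with a projected Hessian solved in the tangent space and a normalization retraction. All function names and the warm-start strategy are illustrative assumptions.

```python
import numpy as np

def newton_eig_step(A, x):
    """One illustrative Riemannian Newton step for the Rayleigh quotient
    rho(x) = x^T A x on the unit sphere (p = 1 case of an eigenspace).

    Solves the projected Newton equation
        P (A - rho I) P delta = -P A x,   with delta orthogonal to x,
    where P = I - x x^T, then retracts by normalization.
    """
    n = len(x)
    rho = x @ A @ x                       # Rayleigh quotient (cost value)
    P = np.eye(n) - np.outer(x, x)        # projector onto the tangent space at x
    grad = P @ (A @ x)                    # Riemannian gradient
    H = P @ (A - rho * np.eye(n)) @ P     # projected Hessian (singular along x)
    delta = -np.linalg.pinv(H) @ grad     # least-norm solve keeps delta in the tangent space
    x_new = x + delta
    return x_new / np.linalg.norm(x_new)  # retraction back to the sphere

# Usage sketch: warm-start with power iterations toward the dominant
# direction, then refine with Newton steps (locally fast convergence).
rng = np.random.default_rng(0)
B = rng.standard_normal((8, 8))
A = (B + B.T) / 2                         # random real symmetric test matrix
x = rng.standard_normal(8)
x /= np.linalg.norm(x)
for _ in range(10):                       # crude power-iteration warm start
    x = A @ x
    x /= np.linalg.norm(x)
for _ in range(5):                        # Newton refinement
    x = newton_eig_step(A, x)
rho = x @ A @ x
residual = np.linalg.norm(A @ x - rho * x)
```

At convergence the iterate is an eigenpair, so the residual ||Ax - rho x|| drops to machine precision; the warm start is only there to steer the locally convergent Newton iteration toward the dominant eigenvector.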
| Original language | English |
|---|---|
| Pages (from-to) | 136-139 |
| Number of pages | 4 |
| Journal | Conference Record of the Asilomar Conference on Signals, Systems and Computers |
| Volume | 1 |
| Publication status | Published - 2004 |
| Event | Thirty-Eighth Asilomar Conference on Signals, Systems and Computers, Pacific Grove, CA, United States, 7 Nov 2004 → 10 Nov 2004 |