TY - JOUR
T1 - The geometry of weighted low-rank approximations
AU - Manton, Jonathan H.
AU - Mahony, Robert
AU - Hua, Yingbo
PY - 2003/2
Y1 - 2003/2
N2 - The low-rank approximation problem is to approximate optimally, with respect to some norm, a matrix by one of the same dimension but smaller rank. It is known that under the Frobenius norm, the best low-rank approximation can be found by using the singular value decomposition (SVD). Although this is no longer true under weighted norms in general, it is demonstrated here that the weighted low-rank approximation problem can be solved by finding the subspace that minimizes a particular cost function. A number of advantages of this parameterization over the traditional parameterization are elucidated. Finding the minimizing subspace is equivalent to minimizing a cost function on the Grassmann manifold. A general framework for constructing optimization algorithms on manifolds is presented and it is shown that existing algorithms in the literature are special cases of this framework. Within this framework, two novel algorithms (a steepest descent algorithm and a Newton-like algorithm) are derived for solving the weighted low-rank approximation problem. They are compared with other algorithms for low-rank approximation as well as with other algorithms for minimizing a cost function on a Grassmann manifold.
KW - Grassmann manifold
KW - Low-rank approximations
KW - Optimization on manifolds
KW - Reduced-rank signal processing
UR - http://www.scopus.com/inward/record.url?scp=0037304841&partnerID=8YFLogxK
U2 - 10.1109/TSP.2002.807002
DO - 10.1109/TSP.2002.807002
M3 - Article
SN - 1053-587X
VL - 51
SP - 500
EP - 514
JO - IEEE Transactions on Signal Processing
JF - IEEE Transactions on Signal Processing
IS - 2
ER -
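Note on the result described above: a small numerical illustration may help readers. The sketch below is not the authors' code; the function names (svd_rank_k, weighted_rank_k) are hypothetical, and the weighted solver is a generic alternating least-squares heuristic included only to contrast with the SVD, not the Grassmann-manifold algorithms derived in the paper. Truncated SVD gives the best rank-k approximation in the Frobenius norm (Eckart-Young); under an elementwise-weighted norm it is generally no longer optimal, which is the gap the paper addresses.

    import numpy as np

    def svd_rank_k(A, k):
        """Best rank-k approximation of A in the Frobenius norm (Eckart-Young)."""
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

    def weighted_rank_k(A, W, k, iters=200):
        """Rank-k approximation of A under the weighted norm sum(W * (A - X)**2).

        Generic alternating least-squares heuristic (hypothetical helper),
        NOT the Grassmann-manifold algorithm derived in the paper.
        """
        m, n = A.shape
        rng = np.random.default_rng(0)
        L = rng.standard_normal((m, k))
        R = rng.standard_normal((n, k))
        for _ in range(iters):
            for i in range(m):  # re-fit each row factor by weighted least squares
                Wi = np.diag(W[i])
                L[i] = np.linalg.solve(R.T @ Wi @ R, R.T @ Wi @ A[i])
            for j in range(n):  # then each column factor
                Wj = np.diag(W[:, j])
                R[j] = np.linalg.solve(L.T @ Wj @ L, L.T @ Wj @ A[:, j])
        return L @ R.T

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        A = rng.standard_normal((8, 6))
        W = rng.uniform(0.1, 1.0, A.shape)  # elementwise positive weights
        werr = lambda X: np.sum(W * (A - X) ** 2)
        # The truncated SVD is optimal when all weights are equal,
        # but generally not under a nonuniform weighting:
        print(f"weighted error, truncated SVD: {werr(svd_rank_k(A, 2)):.4f}")
        print(f"weighted error, weighted ALS:  {werr(weighted_rank_k(A, W, 2)):.4f}")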