TY - JOUR
T1 - Sparse Coding on Symmetric Positive Definite Manifolds Using Bregman Divergences
AU - Harandi, Mehrtash T.
AU - Hartley, Richard
AU - Lovell, Brian
AU - Sanderson, Conrad
N1 - Publisher Copyright:
© 2012 IEEE.
PY - 2016/6
Y1 - 2016/6
AB - This paper introduces sparse coding and dictionary learning for symmetric positive definite (SPD) matrices, which are often used in machine learning, computer vision, and related areas. Unlike traditional sparse coding schemes that work in vector spaces, we discuss how SPD matrices can be described by a sparse combination of dictionary atoms, where the atoms are themselves SPD matrices. We propose to perform sparse coding by embedding the space of SPD matrices into Hilbert spaces through two types of Bregman matrix divergences. This not only leads to an efficient way of performing sparse coding, but also to an online and iterative scheme for dictionary learning. We apply the proposed methods to several computer vision tasks where images are represented by region covariance matrices. Our proposed algorithms outperform state-of-the-art methods on a wide range of classification tasks, including face recognition, action recognition, material classification, and texture categorization.
KW - Bregman divergences
KW - Riemannian geometry
KW - dictionary learning
KW - kernel methods
KW - sparse coding
UR - http://www.scopus.com/inward/record.url?scp=84921971224&partnerID=8YFLogxK
U2 - 10.1109/TNNLS.2014.2387383
DO - 10.1109/TNNLS.2014.2387383
M3 - Article
SN - 2162-237X
VL - 27
SP - 1294
EP - 1306
JO - IEEE Transactions on Neural Networks and Learning Systems
JF - IEEE Transactions on Neural Networks and Learning Systems
IS - 6
M1 - 7024121
ER -