Kernel methods on the Riemannian manifold of symmetric positive definite matrices

Sadeep Jayasumana, Richard Hartley, Mathieu Salzmann, Hongdong Li, Mehrtash Harandi

    Research output: Contribution to journal › Conference article › peer-review

    235 Citations (Scopus)

    Abstract

    Symmetric Positive Definite (SPD) matrices have become a popular representation for encoding image information. Accounting for the geometry of the Riemannian manifold of SPD matrices has proven key to the success of many algorithms. However, most existing methods only approximate the true shape of the manifold locally by its tangent plane. In this paper, inspired by kernel methods, we propose to map SPD matrices to a high-dimensional Hilbert space where Euclidean geometry applies. To encode the geometry of the manifold in the mapping, we introduce a family of provably positive definite kernels on the Riemannian manifold of SPD matrices. These kernels are derived from the Gaussian kernel, but exploit different metrics on the manifold. This lets us extend kernel-based algorithms developed for Euclidean spaces, such as SVM and kernel PCA, to the Riemannian manifold of SPD matrices. We demonstrate the benefits of our approach on the problems of pedestrian detection, object categorization, texture analysis, 2D motion segmentation and Diffusion Tensor Imaging (DTI) segmentation.
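
    To illustrate the kind of kernel the abstract describes, the sketch below shows a Gaussian kernel built from the log-Euclidean metric, one of the manifold metrics considered in the paper: k(X, Y) = exp(-γ ‖log(X) − log(Y)‖²_F), which is positive definite for all γ > 0. This is an illustrative sketch, not the authors' implementation; the function names, the choice of γ, and the use of scikit-learn's precomputed-kernel SVC are assumptions made for the example.

    ```python
    import numpy as np
    from scipy.linalg import logm

    def log_euclidean_gaussian_kernel(X, Y, gamma=0.5):
        """Gaussian kernel between two SPD matrices using the log-Euclidean
        distance d(X, Y) = ||log(X) - log(Y)||_F (gamma chosen arbitrarily)."""
        d = np.linalg.norm(logm(X) - logm(Y), ord="fro")
        return np.exp(-gamma * d ** 2)

    def gram_matrix(spd_list, gamma=0.5):
        """Kernel (Gram) matrix for a list of SPD matrices; usable with any
        kernel machine that accepts a precomputed kernel."""
        # Cache the matrix logarithms so each is computed only once.
        logs = [logm(S) for S in spd_list]
        n = len(logs)
        K = np.empty((n, n))
        for i in range(n):
            for j in range(i, n):
                d = np.linalg.norm(logs[i] - logs[j], ord="fro")
                K[i, j] = K[j, i] = np.exp(-gamma * d ** 2)
        return K
    ```

    Such a Gram matrix can be passed, for example, to `sklearn.svm.SVC(kernel="precomputed")` to train an SVM directly on SPD-valued descriptors, which is the kind of extension of Euclidean kernel machines to the manifold that the paper advocates.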

    Original language: English
    Article number: 6618861
    Pages (from-to): 73-80
    Number of pages: 8
    Journal: Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition
    DOIs
    Publication status: Published - 2013
    Event: 26th IEEE Conference on Computer Vision and Pattern Recognition, CVPR 2013 - Portland, OR, United States
    Duration: 23 Jun 2013 – 28 Jun 2013
