Abstract
We discuss an approach to exploiting kernel methods with manifold-valued data. In many computer vision problems, the data can be naturally represented as points on a Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds, standard Euclidean computer vision and machine learning algorithms yield inferior results on such data. We define positive definite kernels on manifolds that permit us to embed a given manifold, equipped with a corresponding metric, in a reproducing kernel Hilbert space. These kernels make it possible to apply algorithms developed for linear spaces to nonlinear, manifold-valued data. We primarily work with Gaussian radial basis function (RBF)-type kernels. Since the Gaussian RBF defined with an arbitrary metric is not always positive definite, we present a unified framework for analyzing the positive definiteness of the Gaussian RBF on a generic metric space. We then use the proposed framework to identify positive definite kernels on three specific manifolds commonly encountered in computer vision: the Riemannian manifold of symmetric positive definite matrices, the Grassmann manifold, and Kendall's manifold of 2D shapes. We show that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis, and principal component analysis, can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels.
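For the symmetric positive definite (SPD) case, a Gaussian RBF built on the log-Euclidean metric, k(X, Y) = exp(−γ ‖log(X) − log(Y)‖²_F), is known to be positive definite and can therefore be passed directly to kernel algorithms. Below is a minimal sketch assuming NumPy and scikit-learn; the helper names (`spd_log`, `gaussian_kernel_matrix`) and the toy data are illustrative, not from the chapter.

```python
import numpy as np
from sklearn.svm import SVC

def spd_log(M):
    """Matrix logarithm of an SPD matrix via its eigendecomposition
    (guaranteed real, since all eigenvalues are positive)."""
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def gaussian_kernel_matrix(mats, gamma=1.0):
    """Gram matrix of k(X, Y) = exp(-gamma * d(X, Y)^2), where d is the
    log-Euclidean metric d(X, Y) = ||log(X) - log(Y)||_F."""
    logs = [spd_log(M) for M in mats]  # map each SPD matrix to its log
    n = len(mats)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(i, n):
            d2 = np.linalg.norm(logs[i] - logs[j], ord="fro") ** 2
            K[i, j] = K[j, i] = np.exp(-gamma * d2)
    return K

# Toy data (hypothetical): random SPD matrices A @ A.T + eps*I with labels.
rng = np.random.default_rng(0)
def random_spd(d=3):
    A = rng.standard_normal((d, d))
    return A @ A.T + 1e-3 * np.eye(d)

mats = [random_spd() for _ in range(20)]
labels = np.repeat([0, 1], 10)

K = gaussian_kernel_matrix(mats, gamma=0.5)
clf = SVC(kernel="precomputed").fit(K, labels)
print(clf.predict(K[:5]))  # kernel rows of the first five points vs. training set
```

Because the resulting Gram matrix is positive definite, it works with any learner that accepts a precomputed kernel; the same pattern extends to the Grassmann and 2D shape manifolds once a suitable metric of negative type is chosen.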
| Original language | English |
| --- | --- |
| Title of host publication | Riemannian Computing in Computer Vision |
| Publisher | Springer International Publishing Switzerland |
| Pages | 45-67 |
| Number of pages | 23 |
| ISBN (Electronic) | 9783319229577 |
| ISBN (Print) | 9783319229560 |
| DOIs | |
| Publication status | Published - 1 Jan 2015 |