Abstract
We discuss an approach to exploiting kernel methods with manifold-valued data. In many computer vision problems, the data can be naturally represented as points on a Riemannian manifold. Due to the non-Euclidean geometry of Riemannian manifolds, standard Euclidean computer vision and machine learning algorithms yield inferior results on such data. We define positive definite kernels on manifolds that permit us to embed a given manifold with a corresponding metric in a reproducing kernel Hilbert space. These kernels make it possible to utilize algorithms developed for linear spaces on nonlinear manifold-valued data.
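The requirement sketched above can be made precise via Schoenberg's classical criterion; the following is a standard formulation (notation not taken from this abstract). Given a metric $d$ on a set $M$, the Gaussian RBF kernel is

```latex
% Gaussian RBF kernel induced by a metric d on a set M
k(x, y) = \exp\!\left(-\gamma\, d^2(x, y)\right), \qquad x, y \in M,\ \gamma > 0.

% Schoenberg's criterion: k is positive definite for all \gamma > 0
% if and only if d^2 is a negative definite kernel, i.e., for all
% x_1, \dots, x_n \in M and c_1, \dots, c_n \in \mathbb{R} with \sum_i c_i = 0,
\sum_{i=1}^{n} \sum_{j=1}^{n} c_i c_j\, d^2(x_i, x_j) \le 0.
```

When this condition holds, the map $x \mapsto k(x, \cdot)$ embeds $M$ into a reproducing kernel Hilbert space, which is what allows linear-space algorithms to operate on manifold-valued data.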
We primarily work with Gaussian radial basis function (RBF)-type kernels. Since the Gaussian RBF defined with an arbitrary metric is not always positive definite, we present a unified framework for analyzing the positive definiteness of the Gaussian RBF on a generic metric space. We then use this framework to identify positive definite kernels on three manifolds commonly encountered in computer vision: the Riemannian manifold of symmetric positive definite matrices, the Grassmann manifold, and Kendall's manifold of 2D shapes. We show that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis, and principal component analysis, can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels.
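As a concrete illustration for the SPD-matrix case, the sketch below builds a Gaussian RBF Gram matrix from the log-Euclidean metric, $d(X, Y) = \|\log X - \log Y\|_F$, under which the Gaussian kernel is positive definite for every bandwidth. This is a minimal NumPy/SciPy sketch; the function name and parameters are illustrative, not taken from the chapter.

```python
import numpy as np
from scipy.linalg import logm

def log_euclidean_gaussian_kernel(mats, gamma=1.0):
    """Gaussian RBF Gram matrix on SPD matrices with the log-Euclidean metric.

    d(X, Y) = ||log(X) - log(Y)||_F and k(X, Y) = exp(-gamma * d(X, Y)^2).
    Because the matrix logarithm flattens SPD matrices into a vector space,
    this kernel is positive definite for every gamma > 0.
    """
    logs = [logm(m).real for m in mats]  # matrix logarithm of each SPD input
    n = len(logs)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d2 = np.linalg.norm(logs[i] - logs[j], "fro") ** 2
            K[i, j] = np.exp(-gamma * d2)
    return K

# Random SPD test matrices: A @ A.T + eps * I is symmetric positive definite.
rng = np.random.default_rng(0)
mats = [a @ a.T + 0.1 * np.eye(3) for a in rng.standard_normal((5, 3, 3))]
K = log_euclidean_gaussian_kernel(mats)
# K is symmetric with unit diagonal and numerically positive semi-definite.
```

A Gram matrix like `K` can then be handed to any kernel algorithm that accepts precomputed kernels, e.g. a kernel SVM or kernel PCA, which is the generalization route the abstract describes.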
Notes
1. We limit the discussion to real Hilbert spaces and real-valued kernels, since they are the most useful kind in learning algorithms. However, the theory holds for complex Hilbert spaces and complex-valued kernels as well.
Copyright information
© 2016 Springer International Publishing Switzerland
About this chapter
Cite this chapter
Jayasumana, S., Hartley, R., Salzmann, M. (2016). Kernels on Riemannian Manifolds. In: Turaga, P., Srivastava, A. (eds) Riemannian Computing in Computer Vision. Springer, Cham. https://doi.org/10.1007/978-3-319-22957-7_3
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-22956-0
Online ISBN: 978-3-319-22957-7
eBook Packages: Engineering, Engineering (R0)