
Abstract

We discuss an approach to applying kernel methods to manifold-valued data. In many computer vision problems, the data are naturally represented as points on a Riemannian manifold. Because of the non-Euclidean geometry of Riemannian manifolds, standard Euclidean computer vision and machine learning algorithms yield inferior results on such data. We define positive definite kernels on manifolds that allow us to embed a given manifold, together with a corresponding metric, in a reproducing kernel Hilbert space. These kernels make it possible to apply algorithms developed for linear spaces to nonlinear manifold-valued data.

We primarily work with Gaussian radial basis function (RBF)-type kernels. Since the Gaussian RBF defined with an arbitrary metric is not always positive definite, we present a unified framework for analyzing the positive definiteness of the Gaussian RBF on a generic metric space. We then use the proposed framework to identify positive definite kernels on three specific manifolds commonly encountered in computer vision: the Riemannian manifold of symmetric positive definite matrices, the Grassmann manifold, and Kendall’s manifold of 2D shapes. We show that many popular algorithms designed for Euclidean spaces, such as support vector machines, discriminant analysis, and principal component analysis, can be generalized to Riemannian manifolds with the help of such positive definite Gaussian kernels.
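As an illustration of the idea above, the following sketch (our own example, not code from the chapter) builds a Gaussian RBF Gram matrix on symmetric positive definite (SPD) matrices using the log-Euclidean distance d(X, Y) = ‖log X − log Y‖_F; with this metric the Gaussian kernel k(X, Y) = exp(−γ d(X, Y)²) is positive definite for all γ > 0. Function names are ours.

```python
import numpy as np

def spd_log(M):
    """Matrix logarithm of an SPD matrix via eigendecomposition."""
    w, V = np.linalg.eigh(M)          # real eigenvalues, orthonormal eigenvectors
    return (V * np.log(w)) @ V.T      # V diag(log w) V^T

def gaussian_kernel_matrix(mats, gamma=1.0):
    """Gram matrix of k(X, Y) = exp(-gamma * ||log X - log Y||_F^2)."""
    logs = [spd_log(M) for M in mats]  # precompute matrix logarithms
    n = len(mats)
    K = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            d = np.linalg.norm(logs[i] - logs[j], ord="fro")
            K[i, j] = np.exp(-gamma * d * d)
    return K
```

The resulting Gram matrix can then be passed directly to a kernel machine that accepts precomputed kernels, for example scikit-learn's `SVC(kernel="precomputed")`, which is how Euclidean algorithms such as SVMs are applied to manifold-valued data.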


Notes

  1.

    We limit the discussion to real Hilbert spaces and real-valued kernels, since they are the most useful kind in learning algorithms. However, the theory holds for complex Hilbert spaces and complex-valued kernels as well.


Corresponding author

Correspondence to Sadeep Jayasumana.

Copyright information

© 2016 Springer International Publishing Switzerland

About this chapter

Cite this chapter

Jayasumana, S., Hartley, R., Salzmann, M. (2016). Kernels on Riemannian Manifolds. In: Turaga, P., Srivastava, A. (eds) Riemannian Computing in Computer Vision. Springer, Cham. https://doi.org/10.1007/978-3-319-22957-7_3


  • DOI: https://doi.org/10.1007/978-3-319-22957-7_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-22956-0

  • Online ISBN: 978-3-319-22957-7

  • eBook Packages: Engineering (R0)
