Abstract
This paper develops the theory of geodesic regression and least-squares estimation on Riemannian manifolds. Geodesic regression is a method for finding the relationship between a real-valued independent variable and a manifold-valued dependent random variable, where this relationship is modeled as a geodesic curve on the manifold. Least-squares estimation is formulated intrinsically as a minimization of the sum-of-squared geodesic distances of the data to the estimated model. Geodesic regression is a direct generalization of linear regression to the manifold setting, and it provides a simple parameterization of the estimated relationship as an initial point and velocity, analogous to the intercept and slope. A nonparametric permutation test for determining the significance of the trend is also given. For the case of symmetric spaces, two main theoretical results are established. First, conditions for the existence and uniqueness of the solution to the least-squares problem are provided. Second, a maximum likelihood criterion is developed for a suitable definition of Gaussian errors on the manifold. While the method can be applied to data on any manifold, specific examples are given for synthetically generated rotation data and for an application to analyzing shape changes in the corpus callosum due to age.
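To make the least-squares criterion concrete, the following sketch (in generic notation, not the paper's implementation; the function names and the choice of the unit sphere \(S^2\) are purely illustrative) evaluates the sum-of-squared geodesic distance energy for a geodesic model \(\mathrm{Exp}(p, x v),\) using the closed-form exponential and log maps of the sphere.

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere S^2 embedded in R^3."""
    r = np.linalg.norm(v)
    if r < 1e-12:
        return p.copy()
    return np.cos(r) * p + np.sin(r) * (v / r)

def sphere_log(p, q):
    """Log map on S^2: the tangent vector at p pointing toward q, with length d(p, q)."""
    u = q - np.dot(p, q) * p              # component of q orthogonal to p
    nu = np.linalg.norm(u)
    if nu < 1e-12:
        return np.zeros(3)
    return np.arccos(np.clip(np.dot(p, q), -1.0, 1.0)) * (u / nu)

def geodesic_regression_energy(p, v, x, Y):
    """Sum-of-squared geodesic distances of the data to the geodesic Exp(p, x_i * v).

    p : base point ("intercept"), a unit vector in R^3
    v : initial velocity ("slope"), a tangent vector at p
    x : 1-D array of scalar independent variables
    Y : (n, 3) array of unit vectors, the manifold-valued dependent data
    """
    E = 0.0
    for xi, yi in zip(x, Y):
        pred = sphere_exp(p, xi * v)                    # model prediction at x_i
        E += np.linalg.norm(sphere_log(pred, yi)) ** 2  # squared geodesic distance
    return E
```

Minimizing this energy over \((p, v)\) yields the least-squares geodesic; in practice the gradient is expressed through derivatives of the exponential map of the kind derived in the appendix.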
Acknowledgments
This work was supported by NSF CAREER Grant 1054057.
Appendix A: Proof of Theorem 1
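Before going case by case, it may help to recall the standard closed form of Jacobi fields in constant curvature (a well-known fact, stated here in generic notation for orientation; the frame-specific expressions used in the proof are the ones referenced by the equation numbers below): along a unit-speed geodesic in a space of constant sectional curvature \(\kappa,\) the tangential part of a Jacobi field is linear in \(t,\) while the normal part, written with respect to a parallel frame, solves \(J'' + \kappa J = 0,\) so that
\[
J^{\perp}(t) =
\begin{cases}
\cos(\sqrt{\kappa}\,t)\, J^{\perp}(0) + \dfrac{\sin(\sqrt{\kappa}\,t)}{\sqrt{\kappa}}\, \dfrac{DJ^{\perp}}{dt}(0), & \kappa > 0,\\[1.5ex]
\cosh(\sqrt{-\kappa}\,t)\, J^{\perp}(0) + \dfrac{\sinh(\sqrt{-\kappa}\,t)}{\sqrt{-\kappa}}\, \dfrac{DJ^{\perp}}{dt}(0), & \kappa < 0,
\end{cases}
\]
with the initial data understood as parallel translated along the geodesic. The positive- and negative-curvature computations below are the corresponding trigonometric and hyperbolic instances of this form.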
When \(\kappa = 0,\) it is clear that the vector field \(J\) changes linearly in \(t,\) giving the desired result \(DJ/dt = 0.\) Therefore, it suffices to consider only the cases where \(\kappa \ne 0.\) First consider the case where \(\kappa > 0.\) Then we can write
Our goal is to compute the normal and tangential components of \(DJ/dt.\) We will use the identities
Tangential Component: We start by noting that
and we can compute, using \(k = \sqrt{\kappa } L,\)
This gives the second term in (19),
The first term in (19) is given by
We now compute each of the derivatives in this equation. Starting with \(dL/dt,\) and using the previous result in (21), we get
Using the fact that \(T\) is a unit vector field, we get
Evaluating this at \(s = 0\) gives
Denoting \(\tau _t\) as parallel translation along \(p(t),\) we can write
Using this and the fact that \(Z, W\) are constant in \(s,\) we can compute
We put these together to get
Finally, the tangential component of \(DJ/dt\) is given by
Normal Component: Similar to the computation above for \(Z, W,\) but now for the normal components \(X,Y,\) we get
Using the fact that \(E\) is a unit vector field, we get
Evaluating this at \(s = 0\) gives
Again using the fact that \(X, Y\) are constant in \(s,\) we get
The first term in (20) is calculated as
Again, using the fact that \(E\) is a unit vector field, we have
The second term in (20) is now given by
Putting this together, we get the normal component of \(DJ/dt\) to be
Negative Sectional Curvature: Now consider the case when the sectional curvature is negative, i.e., \(\kappa < 0.\) The Jacobi field is given by
The derivation of \(DJ/dt\) in this case proceeds almost identically to the positive curvature case, taking care to handle the sign difference when differentiating \(\cosh .\) The result is
The final formulas for the second derivative of the exponential map are given by evaluation at \((s, t) = (1, 0)\) in Eqs. (22)–(23).\(\square \)
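Formulas of this kind lend themselves to a quick numerical sanity check. The following sketch (assuming the unit sphere \(S^2 \subset \mathbb{R}^3\) with \(\kappa = 1;\) the function names are illustrative and not from the paper) compares a finite-difference derivative of the exponential map with the first-order Jacobi-field prediction, under which the tangential component of the perturbation is preserved and the normal component is scaled by \(\sin(r)/r\) at radius \(r.\)

```python
import numpy as np

def sphere_exp(p, v):
    """Exponential map on the unit sphere S^2 embedded in R^3."""
    r = np.linalg.norm(v)
    if r < 1e-12:
        return p.copy()
    return np.cos(r) * p + np.sin(r) * (v / r)

def random_tangent(rng, p):
    """A random vector in the tangent space at p (orthogonal to p)."""
    x = rng.normal(size=3)
    return x - np.dot(x, p) * p

rng = np.random.default_rng(0)
p = np.array([0.0, 0.0, 1.0])
v = random_tangent(rng, p)   # base velocity
w = random_tangent(rng, p)   # perturbation direction

# Central finite difference of h -> Exp_p(v + h w) at h = 0.
h = 1e-5
fd = (sphere_exp(p, v + h * w) - sphere_exp(p, v - h * w)) / (2 * h)

# Jacobi-field prediction for the norm of d(Exp_p)_v[w] when kappa = 1:
# the component of w along v is preserved, the normal component is scaled by sin(r)/r.
r = np.linalg.norm(v)
vhat = v / r
a = np.dot(w, vhat)
w_perp = w - a * vhat
predicted = np.sqrt(a**2 + (np.sin(r) / r) ** 2 * np.dot(w_perp, w_perp))

print(np.linalg.norm(fd), predicted)   # the two values agree to roughly 1e-9
```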