Group Testing for Longitudinal Data

Conference paper, in: Information Processing in Medical Imaging (IPMI 2015).

Part of the book series: Lecture Notes in Computer Science (LNIP, volume 9123).

Abstract

We consider how to test for group differences of shapes given longitudinal data. In particular, we are interested in differences between the longitudinal models of each group’s subjects. We introduce a generalization of principal geodesic analysis to the tangent bundle of a shape space. This allows the estimation of the variance and principal directions of the distribution of trajectories that summarize shape variations within the longitudinal data. Each trajectory is parameterized as a point in the tangent bundle. To study statistical differences between two distributions of trajectories, we generalize the Bhattacharyya distance in Euclidean space to the tangent bundle. This not only allows us to take second-order statistics into account, but also serves as our test statistic during permutation testing. Our method is validated on both synthetic and real data, and the experimental results indicate improved statistical power in identifying group differences. In fact, our study sheds new light on group differences in longitudinal corpus callosum shapes of subjects with dementia versus normal controls.
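The permutation-testing procedure mentioned in the abstract can be sketched in Euclidean space, with a simple mean-difference statistic standing in for the paper's generalized Bhattacharyya distance (which lives on the tangent bundle of shape space). All names and data below are illustrative, not from the paper.

```python
import numpy as np

def permutation_test(group1, group2, statistic, n_perm=1000, seed=0):
    """Two-sample permutation test: p-value for the observed statistic."""
    rng = np.random.default_rng(seed)
    observed = statistic(group1, group2)
    pooled = np.concatenate([group1, group2])
    n1 = len(group1)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)     # randomly reassign group labels
        if statistic(perm[:n1], perm[n1:]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)      # permutation p-value

# Illustrative statistic and synthetic groups with a true mean shift.
stat = lambda a, b: abs(a.mean() - b.mean())
rng = np.random.default_rng(42)
g1 = rng.normal(0.0, 1.0, 30)
g2 = rng.normal(1.0, 1.0, 30)
p = permutation_test(g1, g2, stat)
```

In the paper, `stat` would instead be the generalized Bhattacharyya distance between the two groups' trajectory distributions, and the permuted labels would reassign whole trajectories.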


Notes

  1. A better estimate of the covariance matrix may be obtained, e.g., by using [8] or [2].
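A minimal sketch of the linear-shrinkage idea behind [8]: pull the sample covariance toward a scaled identity so the estimate stays well-conditioned when samples are few. The fixed shrinkage weight `alpha` and the data here are illustrative; [8] derives a data-driven optimal weight.

```python
import numpy as np

def shrink_covariance(X, alpha=0.2):
    """Linear shrinkage of the sample covariance toward mu * I."""
    S = np.cov(X, rowvar=False)            # sample covariance (p x p)
    p = S.shape[0]
    mu = np.trace(S) / p                   # average eigenvalue of S
    return (1.0 - alpha) * S + alpha * mu * np.eye(p)

rng = np.random.default_rng(1)
X = rng.standard_normal((10, 8))           # few samples relative to dimension
S_shrunk = shrink_covariance(X)            # well-conditioned, positive definite
```

Shrinkage guarantees strictly positive eigenvalues, which matters here because the generalized Mahalanobis distance requires inverting the covariance.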

  2. The threshold \(\epsilon \) varies with the application. In our experiments, we set it to \(10^{-6}\). Usually, the eigenvalues larger than \(\epsilon \) account for almost \(99\,\%\) of the variance.
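The role of \(\epsilon \) can be illustrated with a Euclidean sketch: retain only eigenvalues above the threshold, check what fraction of the total variance the retained modes carry, and invert the covariance on the retained eigenspace only. The data and variable names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((20, 5))
cov = np.cov(X, rowvar=False)              # estimated covariance (5 x 5)

eps = 1e-6
eigvals, eigvecs = np.linalg.eigh(cov)
keep = eigvals > eps                       # retained principal directions
frac = eigvals[keep].sum() / eigvals.sum() # variance covered by kept modes

# Pseudo-inverse restricted to the retained eigenspace: divide each kept
# eigenvector by its eigenvalue, then recombine.
cov_pinv = (eigvecs[:, keep] / eigvals[keep]) @ eigvecs[:, keep].T
```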

  3. We connect the three given shapes by two geodesics and uniformly sample points on each. Connecting opposing sample points yields new geodesics that lie within the triangular region; we sample a population of shapes along these geodesics.
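A hedged sketch of this sampling scheme, using the unit sphere as a stand-in for the shape space (the paper's actual shape space and its geodesics differ). The helpers `slerp` and `sample_triangle` are hypothetical names for illustration.

```python
import numpy as np

def slerp(p, q, t):
    """Point at fraction t along the sphere geodesic from p to q."""
    theta = np.arccos(np.clip(p @ q, -1.0, 1.0))
    if theta < 1e-12:                      # coincident points: geodesic is a point
        return p
    return (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)

def sample_triangle(a, b, c, n_edge=5, n_cross=5):
    """Sample the geodesic triangle (a, b, c) via cross-connecting geodesics."""
    samples = []
    for i in range(n_edge):
        t = i / (n_edge - 1)
        u = slerp(a, b, t)                 # sample on geodesic a -> b
        v = slerp(a, c, t)                 # opposing sample on a -> c
        for j in range(n_cross):
            samples.append(slerp(u, v, j / (n_cross - 1)))
    return np.array(samples)

a = np.array([1.0, 0.0, 0.0])
b = np.array([0.0, 1.0, 0.0])
c = np.array([0.0, 0.0, 1.0])
pts = sample_triangle(a, b, c)             # population inside the triangle
```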

  4. The average of two generalized squared-Mahalanobis distances is related to the first term of the generalized Bhattacharyya distance in Eq. (4).
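For reference, the Euclidean Bhattacharyya distance between two Gaussians [1, 10] is

```latex
D_B = \frac{1}{8} (\mu_1-\mu_2)^{\top} \Sigma^{-1} (\mu_1-\mu_2)
      + \frac{1}{2} \ln \frac{|\Sigma|}{\sqrt{|\Sigma_1|\,|\Sigma_2|}},
\qquad \Sigma = \frac{\Sigma_1+\Sigma_2}{2}.
```

On the tangent bundle, a pooled covariance \(\Sigma\) is not directly available, since the two covariances are defined at different base points; a plausible reading of this note is that averaging the two squared Mahalanobis distances, each taken with one group's covariance, plays the corresponding role in the first term of Eq. (4).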

References

  1. Bhattacharyya, A.: On a measure of divergence between two multinomial populations. Sankhyā Indian J. Stat. 7(4), 401–406 (1946)

  2. Bickel, P.J., Levina, E.: Covariance regularization by thresholding. Ann. Stat. 36(6), 2577–2604 (2008)

  3. Durrleman, S., Pennec, X., Trouvé, A., Braga, J., Gerig, G., Ayache, N.: Toward a comprehensive framework for the spatiotemporal statistical analysis of longitudinal shape data. IJCV 103(1), 22–59 (2013)

  4. Fletcher, P.T.: Geodesic regression and the theory of least squares on Riemannian manifolds. IJCV 105(2), 171–185 (2013)

  5. Fletcher, P.T., Lu, C., Pizer, S.M., Joshi, S.: Principal geodesic analysis for the study of nonlinear statistics of shape. IEEE TMI 23(8), 995–1005 (2004)

  6. Gilmore, J.H., Shi, F., Woolson, S.L., Knickmeyer, R.C., Short, S.J., Lin, W., Zhu, H., Hamer, R.M., Styner, M., Shen, D.: Longitudinal development of cortical and subcortical gray matter from birth to 2 years. Cereb. Cortex 22(11), 2478–2485 (2012)

  7. Hong, Y., Joshi, S., Sanchez, M., Styner, M., Niethammer, M.: Metamorphic geodesic regression. In: Ayache, N., Delingette, H., Golland, P., Mori, K. (eds.) MICCAI 2012, Part III. LNCS, vol. 7512, pp. 197–205. Springer, Heidelberg (2012)

  8. Ledoit, O., Wolf, M.: A well-conditioned estimator for large-dimensional covariance matrices. J. Multivar. Anal. 88(2), 365–411 (2004)

  9. Lee, J.: Introduction to Smooth Manifolds. Springer, New York (2012)

  10. Mahalanobis, P.C.: On the generalized distance in statistics. Proc. Natl. Inst. Sci. (Calcutta) 2, 49–55 (1936)

  11. Marcus, D.S., Fotenos, A.F., Csernansky, J.G., Morris, J.C., Buckner, R.L.: Open access series of imaging studies: longitudinal MRI data in nondemented and demented older adults. J. Cogn. Neurosci. 22(12), 2677–2684 (2010)

  12. Muralidharan, P., Fletcher, P.T.: Sasaki metrics for analysis of longitudinal data on manifolds. In: CVPR, pp. 1027–1034 (2012)

  13. Niethammer, M., Huang, Y., Vialard, F.-X.: Geodesic regression for image time-series. In: Fichtinger, G., Martel, A., Peters, T. (eds.) MICCAI 2011, Part II. LNCS, vol. 6892, pp. 655–662. Springer, Heidelberg (2011)

  14. Oliver, D.S.: Calculation of the inverse of the covariance. Math. Geol. 30(7), 911–933 (1998)

  15. Sasaki, S.: On the differential geometry of tangent bundles of Riemannian manifolds. TMJ 10(3), 338–354 (1958)

  16. Singh, N., Hinkle, J., Joshi, S., Fletcher, P.T.: A hierarchical geodesic model for diffeomorphic longitudinal shape analysis. In: Gee, J.C., Joshi, S., Pohl, K.M., Wells, W.M., Zöllei, L. (eds.) IPMI 2013. LNCS, vol. 7917, pp. 560–571. Springer, Heidelberg (2013)

  17. Su, J., Kurtek, S., Klassen, E., Srivastava, A.: Statistical analysis of trajectories on Riemannian manifolds: bird migration, hurricane tracking and video surveillance. Ann. Appl. Stat. 8(1), 530–552 (2014)

  18. Su, J., Srivastava, A., de Souza, F., Sarkar, S.: Rate-invariant analysis of trajectories on Riemannian manifolds with application in visual speech recognition. In: CVPR, pp. 620–627 (2014)

Acknowledgements

This work was supported by NSF EECS-1148870 and NSF EECS-0925875.

Author information

Corresponding author: Yi Hong.

Appendix

A Properties of the Generalized Bhattacharyya Distance

Non-negativity. In the first term of Eq. (4), \(D_M^{\mathcal {TM}}\) is the generalized squared-Mahalanobis distance, which is non-negative; consequently, the first term of Eq. (4) is non-negative. The determinant of a covariance matrix in the second term is also non-negative, since it is the product of the matrix's non-negative eigenvalues. Moreover, \((|\varSigma _1|+|\varSigma _2|) / (2\sqrt{|\varSigma _1||\varSigma _2|}) \ge 1\) by the arithmetic-geometric mean inequality, so the second term is non-negative. Hence, \(D_B^{\mathcal {TM}}(D_1, D_2) \ge 0\).
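The inequality used for the second term follows from expanding a square:

```latex
\left(\sqrt{|\Sigma_1|}-\sqrt{|\Sigma_2|}\right)^2 \ge 0
\;\Longrightarrow\;
|\Sigma_1|+|\Sigma_2| \ge 2\sqrt{|\Sigma_1|\,|\Sigma_2|}
\;\Longrightarrow\;
\frac{|\Sigma_1|+|\Sigma_2|}{2\sqrt{|\Sigma_1|\,|\Sigma_2|}} \ge 1 .
```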

Identity of Indiscernibles. If \(D_1 = D_2\), i.e., \(\mu _1 = \mu _2\) and \(\varSigma _1 = \varSigma _2\), then (1) \({{\mathrm{Log}}}_{\mu _1} \mu _2\) and \({{\mathrm{Log}}}_{\mu _2} \mu _1\) are zero tangent vectors, and (2) \(|\varSigma _1| = |\varSigma _2|\). Hence, \(D_M^{\mathcal {TM}}(\mu _1, D_2) = D_M^{\mathcal {TM}}(\mu _2, D_1) = 0\), so the first term of Eq. (4) vanishes; the second term vanishes as well. Thus \(D_1 = D_2\) implies \(D_B^{\mathcal {TM}}(D_1, D_2) = 0\). Conversely, assuming \(D_B^{\mathcal {TM}}(D_1, D_2) = 0\), the non-negativity of both terms of Eq. (4) only yields \(\mu _1 = \mu _2\) and \(|\varSigma _1| = |\varSigma _2|\); we cannot conclude that the two covariance matrices are equal. Therefore, \(D_B^{\mathcal {TM}}(D_1, D_2) = 0\) may hold for some \(D_1 \ne D_2\), namely whenever \(\mu _1 = \mu _2\) and \(|\varSigma _1| = |\varSigma _2|\).
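Since Eq. (4) is not reproduced in this excerpt, the following Euclidean sketch is hedged: it assumes a first term built from the average of the two squared Mahalanobis distances (with the classical \(1/8\) scaling) plus the second term \(\tfrac{1}{2}\ln \big ((|\varSigma _1|+|\varSigma _2|)/(2\sqrt{|\varSigma _1||\varSigma _2|})\big )\). With equal means and covariances of equal determinant, the distance vanishes even though \(\varSigma _1 \ne \varSigma _2\):

```python
import numpy as np

def d_b(mu1, S1, mu2, S2):
    """Euclidean sketch of the generalized Bhattacharyya distance (assumed form)."""
    d = mu1 - mu2
    m12 = d @ np.linalg.inv(S2) @ d        # squared Mahalanobis of mu1 w.r.t. D2
    m21 = d @ np.linalg.inv(S1) @ d        # squared Mahalanobis of mu2 w.r.t. D1
    first = (m12 + m21) / 16.0             # 1/8 times the average of the two
    det1, det2 = np.linalg.det(S1), np.linalg.det(S2)
    second = 0.5 * np.log((det1 + det2) / (2.0 * np.sqrt(det1 * det2)))
    return first + second

# Counterexample: same mean, different covariances with equal determinants.
mu = np.zeros(2)
S1 = np.diag([2.0, 0.5])                   # |S1| = 1
S2 = np.eye(2)                             # |S2| = 1
zero_dist = d_b(mu, S1, mu, S2)            # vanishes although S1 != S2
```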

Symmetry. Both terms of Eq. (4) are symmetric in \(D_1\) and \(D_2\), so their sum is as well, i.e., \(D_B^{\mathcal {TM}}(D_1, D_2) = D_B^{\mathcal {TM}}(D_2, D_1)\).

Triangle Inequality. Since Eq. (1) in \(\mathbb {R}^n\) does not satisfy the triangle inequality, our generalized variant does not satisfy it either.

Rights and permissions

Reprints and permissions

Copyright information

© 2015 Springer International Publishing Switzerland

About this paper

Cite this paper

Hong, Y., Singh, N., Kwitt, R., Niethammer, M. (2015). Group Testing for Longitudinal Data. In: Ourselin, S., Alexander, D., Westin, CF., Cardoso, M. (eds) Information Processing in Medical Imaging. IPMI 2015. Lecture Notes in Computer Science(), vol 9123. Springer, Cham. https://doi.org/10.1007/978-3-319-19992-4_11

  • DOI: https://doi.org/10.1007/978-3-319-19992-4_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-19991-7

  • Online ISBN: 978-3-319-19992-4
