
High-dimensional MRI data analysis using a large-scale manifold learning approach

  • Original Paper
  • Published in Machine Vision and Applications

Abstract

A novel manifold learning approach is presented to efficiently identify low-dimensional structures embedded in high-dimensional MRI data sets. These low-dimensional structures, known as manifolds, are used in this study for predicting brain tumor progression. The data sets consist of a series of high-dimensional MRI scans for four patients with tumor and progressed regions identified. We attempt to classify tumor, progressed and normal tissues in low-dimensional space. We also attempt to verify if a progression manifold exists—the bridge between tumor and normal manifolds. By identifying and mapping the bridge manifold back to MRI image space, this method has the potential to predict tumor progression. This could be greatly beneficial for patient management. Preliminary results have supported our hypothesis: normal and tumor manifolds are well separated in a low-dimensional space. Also, the progressed manifold is found to lie roughly between the normal and tumor manifolds.
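
To make the workflow concrete, the following is a minimal sketch, not the authors' implementation: it substitutes a standard Isomap embedding on a random landmark subset for the paper's large-scale manifold learning method, and the feature files, label encoding, and parameter values are assumptions for illustration only.

```python
# Minimal sketch (not the authors' pipeline): embed per-voxel MRI feature
# vectors into a low-dimensional space with a standard manifold learner and
# check how the tissue classes separate there. File names, the label encoding
# and all parameter values are hypothetical.
import numpy as np
from sklearn.manifold import Isomap

# X: one row per voxel, one column per MRI channel/time point (assumed layout).
# y: 0 = normal, 1 = progressed, 2 = tumor (assumed labels from expert masks).
X = np.load("voxel_features.npy")   # hypothetical file
y = np.load("voxel_labels.npy")     # hypothetical file

# Subsample landmark voxels so the neighborhood graph stays tractable; the
# paper's large-scale method handles this differently, this is only a stand-in.
rng = np.random.default_rng(0)
landmarks = rng.choice(len(X), size=min(5000, len(X)), replace=False)

# Nonlinear embedding into three dimensions.
Z = Isomap(n_neighbors=10, n_components=3).fit_transform(X[landmarks])

# Report where the centroid of each tissue class lies in the embedded space.
for label, name in [(0, "normal"), (1, "progressed"), (2, "tumor")]:
    print(name, Z[y[landmarks] == label].mean(axis=0))
```

In such a toy setting, a progressed-tissue centroid lying between the normal and tumor centroids would correspond to the bridge manifold described above.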


References

  1. Pauleit, D., Langen, K.-J., Floeth, F., Hautzel, H., Riemenschneider, M.J., Reifenberger, G., Shah, N.J., Muller, H.-W.: Can the apparent diffusion coefficient be used as a noninvasive parameter to distinguish tumor tissue from peritumoral tissue in cerebral gliomas? J. Magn. Reson. Imag. 20(5), 758–764 (2004)

  2. Bode, M.K., Ruohonen, J., Nieminen, M.T., Pyhtinen, J.: Potential of diffusion imaging in brain tumors: a review. Acta Radiol. 47(6), 585–594 (2006)

  3. Sinha, S., Bastin, M.E., Whittle, I.R., Wardlaw, J.M.: Diffusion tensor MR imaging of high-grade cerebral gliomas. Am. J. Neuroradiol. 23(4), 520–527 (2002)

  4. Castillo, M., Smith, J.K., Kwock, L., Wilber, K.: Apparent diffusion coefficients in the evaluation of high-grade cerebral gliomas. Am. J. Neuroradiol. 22(1), 60–64 (2001)

  5. Kono, K., Inoue, Y., Nakayama, K., Shakudo, M., Morino, M., Ohata, K., Wakasa, K., Yamada, R.: The role of diffusion-weighted imaging in patients with brain tumors. Am. J. Neuroradiol. 22(6), 1081–1088 (2001)

  6. Bulakbasi, N., Kocaoglu, M., Ors, F., Tayfun, C., Ucoz, T.: Combination of single-voxel proton MR spectroscopy and apparent diffusion coefficient calculation in the evaluation of common brain tumors. Am. J. Neuroradiol. 24(2), 225–233 (2003)

  7. Lam, W., Poon, W., Metreweli, C.: Diffusion MR imaging in glioma: does it have any role in the pre-operation determination of grading of glioma? Clin. Radiol. 57(3), 219–225 (2002)

  8. Yang, D., Korogi, Y., Sugahara, T., Kitajima, M., Shigematsu, Y., Liang, L., Ushio, Y., Takahashi, M.: Cerebral gliomas: prospective comparison of multivoxel 2D chemical-shift imaging proton MR spectroscopy, echo-planar perfusion and diffusion-weighted MRI. Neuroradiology 44, 656–666 (2002)

  9. Sadeghi, N., Camby, I., Goldman, S., Gabius, H.-J., Baleriaux, D., Salmon, I., Decaesteckere, C., Kiss, R., Metens, T.: Effect of hydrophilic components of the extracellular matrix on quantifiable diffusion-weighted imaging of human gliomas: preliminary results of correlating apparent diffusion coefficient values and hyaluronan expression level. Am. J. Roentgenol. 181(1), 235–241 (2003)

  10. Bastin, M.E., Sinha, S., Whittle, I.R., Wardlaw, J.M.: Measurements of water diffusion and T1 values in peritumoural oedematous brain. Neuroreport 13(10), 1335–1340 (2002)

  11. Lu, S., Ahn, D., Johnson, G., Law, M., Zagzag, D., Grossman, R.I.: Diffusion-tensor MR imaging of intracranial neoplasia and associated peritumoral edema: introduction of the tumor infiltration index. Radiology 232(1), 221–228 (2004)

  12. Provenzale, J.M., McGraw, P., Mhatre, P., Guo, A.C., Delong, D.: Peritumoral brain regions in gliomas and meningiomas: investigation with isotropic diffusion-weighted MR imaging and diffusion-tensor MR imaging. Radiology 232(2), 451–460 (2004)

  13. Li, J., Wang, J., Shen, Y., Shen, Y., McKenzie, R., Guha-Thakurta, N.: Brain tumor progression assessment using multiple MRI volumes. In: Radiological Society of North America (RSNA) 95th Scientific Assembly and Annual Meeting, oral presentation, Chicago, IL

  14. Shen, Y., Banerjee, D., Li, J., Chandler, A., Shen, Y., McKenzie, F., Wang, J.: Prediction of brain tumor progression using a machine learning technique. In: Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 7624 (2010)

  15. Banerjee, D., Tran, L., Li, J., Shen, Y., McKenzie, F., Wang, J.: Prediction of brain tumor progression using multiple histogram matched MRI scans. In: Society of Photo-Optical Instrumentation Engineers (SPIE) Conference Series, vol. 7963 (2011). doi:10.1117/12.878208. http://adsabs.harvard.edu/abs/2011SPIE.7963E..96B

  16. Deshpande, A., Rademacher, L., Vempala, S., Wang, G.: Matrix approximation and projective clustering via volume sampling. In: Proceedings of the seventeenth annual ACM-SIAM symposium on Discrete algorithm, SODA ’06, pp. 1117–1126. ACM, New York, NY, USA (2006)

  17. Drineas, P., Mahoney, M.W.: On the Nystrom method for approximating a Gram matrix for improved kernel-based learning. J. Mach. Learn. Res. 6, 2153–2175 (2005)

  18. Wang, S., Yao, J., Summers, R.M.: Improved classifier for computer-aided polyp detection in CT colonography by nonlinear dimensionality reduction. Med. Phys. 35(4), 1377–1386 (2008)

  19. de Silva, V., Tenenbaum, J.B.: Global versus local methods in nonlinear dimensionality reduction. In: Becker, S., Thrun, S., Obermayer, K. (eds.) Advances in Neural Information Processing Systems, vol. 15, pp. 705–712. MIT Press, Cambridge (2003)

  20. Zhang, Z., Zha, H.: Principal manifolds and nonlinear dimensionality reduction via tangent space alignment. SIAM J. Sci. Comput. 26, 313–338 (2005)

  21. Roweis, S.T., Saul, L.K.: Nonlinear dimensionality reduction by locally linear embedding. Science 290(5500), 2323–2326 (2000)

  22. Hartkens, T., Rueckert, D., Schnabel, J.A., Hawkes, D.J., Hill, D.L.G.: VTK CISG registration toolkit: an open source software package for affine and nonrigid registration of single- and multimodal 3D images. In: Meiler, M., Saupe, D., Kruggel, F., Handels, H., Lehmann, T.M. (eds.) Bildverarbeitung für die Medizin, vol. 56, CEUR Workshop Proceedings, pp. 409–412. Springer, Berlin (2002)

  23. Madabhushi, A., Udupa, J.K., Souza, A.: Generalized scale: theory, algorithms, and application to image inhomogeneity correction. Comput. Vis. Image Underst. 101, 100–121 (2006)

  24. Nyul, L.G., Udupa, J.K.: On standardizing the MR image intensity scale. Magn. Reson. Med. 42(6), 1072–1081 (1999)

  25. Wang, J., Zhang, Z., Zha, H.: Adaptive manifold learning. In: Advances in Neural Information Processing Systems, vol. 17, pp. 1473–1480. MIT Press, Cambridge, MA (2005)

  26. Niskanen, M., Silven, O.: Comparison of dimensionality reduction methods for wood surface inspection. In: Proceedings of the 6th International Conference on Quality Control by Artificial Vision, pp. 178–188. Gatlinburg, TN, USA (2003)

  27. Lim, I.S., de Heras Ciechomski, P., Sarni, S., Thalmann, D.: Planar arrangement of high-dimensional biomedical data sets by Isomap coordinates. In: Proceedings of the 16th IEEE Symposium on Computer-Based Medical Systems, pp. 50–55 (2003)

  28. Raytchev, B., Yoda, I., Sakaue, K.: Head pose estimation by nonlinear manifold learning. In: Proceedings of the 17th International Conference on Pattern Recognition (ICPR 2004), vol. 4, pp. 462–466 (2004)

  29. van der Maaten, L., Postma, E.O., van den Herik, H.J.: Dimensionality reduction: a comparative review (2008)

  30. Duda, R., Hart, P., Stork, D.: Pattern classification and scene analysis: pattern classification. Wiley, New York (2001)

  31. Bishop, C.: Pattern Recognition and Machine Learning. Information Science and Statistics. Springer, Berlin (2006)

Author information

Correspondence to Jiang Li.

Appendix

GMM Classifier:

Gaussian mixture models (GMMs) are commonly used for classification. A benefit of using a GMM is that a posterior probability mapping can be generated. Here, the landmarks chosen in the landmark selection step were used to train the parameters of the GMM with the Expectation–Maximization (EM) algorithm [31], which proceeds as follows (a code sketch of the complete iteration is given after the steps):

  1. Step 1:

    Initialize the mean \(\mu_{k}\), covariance \(\sigma_{k}\), and mixing coefficient \(\pi_{k}\) of each component \(k\) in the GMM.

  2. Step 2:

    Expectation step. Evaluate the responsibilities \(\gamma(z_{nk})\) using the current parameters

    $$\begin{aligned} \gamma(z_{nk})=\frac{\pi_{k}\,\mathcal{N}(x_{n}\mid \mu_{k},\sigma_{k})}{\sum_{j=1}^{K}\pi_{j}\,\mathcal{N}(x_{n}\mid \mu_{j},\sigma_{j})} \end{aligned}$$

    where \(K\) is the total number of components in the GMM. The responsibility \(\gamma(z_{nk})\) can be interpreted as the posterior probability that sample \(x_{n}\) was generated by the \(k\)-th Gaussian component.

  3. Step 3:

    Maximization step. Compute new values for the means \(\mu_{k}^{new}\), covariances \(\sigma_{k}^{new}\), and mixing coefficients \(\pi_{k}^{new}\) using the responsibilities obtained in Step 2:

    $$\begin{aligned} \mu_{k}^{new}&= \frac{1}{N_{k}}\sum_{n=1}^{N}\gamma(z_{nk})\,x_{n}\\ \sigma_{k}^{new}&= \frac{1}{N_{k}}\sum_{n=1}^{N}\gamma(z_{nk})\left(x_{n}-\mu_{k}^{new}\right)\left(x_{n}-\mu_{k}^{new}\right)^{T}\\ \pi_{k}^{new}&= \frac{N_{k}}{N} \end{aligned}$$

    where

    $$\begin{aligned} N_{k}=\sum_{n=1}^{N}\gamma(z_{nk}) \end{aligned}$$

    and \(N\) is the total number of data samples used for training.

  4. Step 4:

    Evaluate the log likelihood

    $$\begin{aligned} \ln p\left(X\mid \mu,\sigma,\pi\right) = \sum_{n=1}^{N}\ln \left\{ \sum_{k=1}^{K}\pi_{k}\,\mathcal{N}\left(x_{n}\mid \mu_{k},\sigma_{k}\right) \right\} \end{aligned}$$  (4)

    If the log likelihood has not converged, the process is repeated from Step 2. Each Expectation–Maximization iteration never decreases the log likelihood in Eq. (4), so the algorithm is guaranteed to converge to a local optimum.
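
The following is a compact sketch of the EM iteration in Steps 1–4, written directly against the equations above. It is not the authors' code: the random initialization, the convergence tolerance, and the small ridge added to the covariances for numerical stability are implementation choices that the text does not specify.

```python
# Compact sketch of the EM iteration in Steps 1-4 above. The random
# initialization, convergence tolerance and the small ridge added to the
# covariances for numerical stability are assumptions, not taken from the text.
import numpy as np
from scipy.stats import multivariate_normal

def fit_gmm(X, K, n_iter=200, tol=1e-6, seed=0):
    N, d = X.shape
    rng = np.random.default_rng(seed)

    # Step 1: initialize means mu_k, covariances sigma_k and mixing weights pi_k.
    mu = X[rng.choice(N, size=K, replace=False)].copy()
    sigma = np.stack([np.cov(X.T) + 1e-6 * np.eye(d) for _ in range(K)])
    pi = np.full(K, 1.0 / K)

    prev_ll = -np.inf
    for _ in range(n_iter):
        # Step 2 (E-step): responsibilities gamma(z_nk) under the current parameters.
        dens = np.column_stack([
            pi[k] * multivariate_normal.pdf(X, mean=mu[k], cov=sigma[k])
            for k in range(K)
        ])                                      # shape (N, K)
        gamma = dens / dens.sum(axis=1, keepdims=True)

        # Step 3 (M-step): re-estimate mu_k, sigma_k, pi_k from the responsibilities.
        Nk = gamma.sum(axis=0)
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(d)
        pi = Nk / N

        # Step 4: log likelihood of Eq. (4), evaluated with the parameters used
        # in this iteration's E-step; stop once it no longer increases noticeably.
        ll = np.log(dens.sum(axis=1)).sum()
        if ll - prev_ll < tol:
            break
        prev_ll = ll

    return mu, sigma, pi
```

One straightforward way to obtain the posterior probability mapping mentioned at the start of this appendix would be to fit one such GMM per tissue class on the landmark samples and compare the class-conditional likelihoods, weighted by class priors, at every voxel.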


Cite this article

Tran, L., Banerjee, D., Wang, J. et al. High-dimensional MRI data analysis using a large-scale manifold learning approach. Machine Vision and Applications 24, 995–1014 (2013). https://doi.org/10.1007/s00138-013-0499-8
