Learning deep similarity metric for 3D MR–TRUS image registration

Abstract

Purpose

The fusion of transrectal ultrasound (TRUS) and magnetic resonance (MR) images for guiding targeted prostate biopsy has significantly improved the biopsy yield of aggressive cancers. A key component of MR–TRUS fusion is image registration. However, it is very challenging to obtain a robust automatic MR–TRUS registration due to the large appearance difference between the two imaging modalities. The work presented in this paper aims to tackle this problem by addressing two challenges: (i) the definition of a suitable similarity metric and (ii) the determination of a suitable optimization strategy.

Methods

This work proposes the use of a deep convolutional neural network to learn a similarity metric for MR–TRUS registration. We also use a composite optimization strategy that explores the solution space in order to search for a suitable initialization for the second-order optimization of the learned metric. Further, a multi-pass approach is used in order to smooth the metric for optimization.
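The composite strategy described above — global exploration to find a good initialization, followed by second-order refinement — can be sketched with SciPy. Here a smooth synthetic function of the six rigid-transform parameters stands in for the CNN-learned metric; `TRUE_POSE`, `learned_metric`, and the parameter bounds are illustrative assumptions, not values from the paper.

```python
import numpy as np
from scipy.optimize import differential_evolution, minimize

# Hypothetical ground-truth pose: 3 rotations (rad) and 3 translations (mm).
TRUE_POSE = np.array([0.1, -0.2, 0.05, 3.0, -1.5, 2.0])

def learned_metric(params):
    """Stand-in for the learned dissimilarity (lower is better).

    In the paper this would be a deep CNN scoring the alignment of the
    transformed MR volume against the TRUS volume; here a quadratic bowl
    with small sinusoidal ripples mimics a non-convex metric surface.
    """
    return np.sum((params - TRUE_POSE) ** 2) + 0.1 * np.sum(np.sin(5 * params) ** 2)

# Stage 1: explore the solution space globally (differential evolution)
# to obtain an initialization within a large capture range.
bounds = [(-0.5, 0.5)] * 3 + [(-10.0, 10.0)] * 3
coarse = differential_evolution(learned_metric, bounds, seed=0, maxiter=50, tol=1e-6)

# Stage 2: second-order local refinement (BFGS) from the coarse solution.
refined = minimize(learned_metric, coarse.x, method="BFGS")
```

The refinement stage can only keep or improve the metric value found by the global search, which is the point of the two-stage design: the learned metric is too non-convex for gradient-based optimization alone when the initialization is poor.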

Results

The learned similarity metric outperforms both classical mutual information and the state-of-the-art MIND feature-based method. The results indicate that the overall registration framework has a large capture range. For this challenging problem, the proposed deep similarity metric-based approach obtained a mean target registration error (TRE) of 3.86 mm from an initial TRE of 16 mm.

Conclusion

A similarity metric that is learned using a deep neural network can be used to assess the quality of any given image registration and can be used in conjunction with the aforementioned optimization framework to perform automatic registration that is robust to poor initialization.


Acknowledgements

We would like to thank NVIDIA Corporation for the donation of the Titan Xp GPU used for this research (PY).

Author information

Correspondence to Pingkun Yan.

Ethics declarations

Conflict of interest

NIH and Philips/In Vivo have a Cooperative Research and Development Agreement. NIH and Philips share intellectual property in the field, and one author receives royalties for licensed patents (BW). PY was a salaried employee of Philips Research at the time some of the research was performed.

Ethical approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee where the studies were conducted.

Informed consent

Informed consent was obtained from all individual participants included in the study.

Additional information

This work was supported by the Intramural Research Program of the National Institutes of Health, the National Institutes of Health Center for Interventional Oncology, and NIH Grants 1ZIDBC011242 and 1ZIDCL040015.

About this article

Cite this article

Haskins, G., Kruecker, J., Kruger, U. et al. Learning deep similarity metric for 3D MR–TRUS image registration. Int J CARS 14, 417–425 (2019). https://doi.org/10.1007/s11548-018-1875-7

Keywords

  • Image registration
  • Convolutional neural networks
  • Multimodal image fusion
  • Prostate cancer
  • Image-guided interventions