A Mixed Reality Guidance System for Robot Assisted Laparoscopic Radical Prostatectomy

  • Abhishek Kolagunda
  • Scott Sorensen
  • Sherif Mehralivand
  • Philip Saponaro
  • Wayne Treible
  • Baris Turkbey
  • Peter Pinto
  • Peter Choyke
  • Chandra Kambhamettu
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11041)

Abstract

Robotic surgery guided by preoperative imaging has become increasingly common in the surgical treatment of patients. For surgeons operating robotic surgical platforms, maintaining spatial awareness of the anatomical structures in the surgical field is key to good outcomes. We propose a Mixed Reality system that allows surgeons to visualize and interact with anatomical models extracted from preoperative imagery, aligned with the in vivo imagery from the stereo laparoscope. To build this system, we employ techniques to reconstruct 3D surfaces from stereo laparoscope images, model the 3D shape of anatomical structures from the preoperative MRI stack, and align the two 3D surfaces. The resulting application lets surgeons visualize occluded and obscured organ boundaries, as well as other important anatomy that is not visible through the laparoscope alone, facilitating better spatial awareness during surgery. The system was deployed in 9 robot-assisted laparoscopic prostatectomy procedures as part of a feasibility study.
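
As a rough illustration of the processing chain summarized above, the sketch below reconstructs a point cloud from a rectified stereo laparoscope pair and rigidly registers an MRI-derived organ surface to it. It is a minimal sketch assuming OpenCV for semi-global stereo matching and Open3D for point-to-plane ICP; the library choices, function names, and parameter values are illustrative assumptions and not the authors' implementation.

```python
# Hypothetical sketch of the pipeline: (1) dense stereo reconstruction of the
# laparoscopic view, (2) loading a surface model segmented from preoperative MRI,
# (3) rigid alignment of the two surfaces with ICP. All parameters are placeholders.
import cv2
import numpy as np
import open3d as o3d


def stereo_to_point_cloud(left_bgr, right_bgr, Q):
    """Reconstruct a 3D point cloud from a rectified stereo pair.

    `Q` is the 4x4 disparity-to-depth matrix from stereo calibration
    (e.g. cv2.stereoRectify). Semi-global block matching stands in for
    whatever dense matcher the actual system uses.
    """
    left = cv2.cvtColor(left_bgr, cv2.COLOR_BGR2GRAY)
    right = cv2.cvtColor(right_bgr, cv2.COLOR_BGR2GRAY)

    matcher = cv2.StereoSGBM_create(
        minDisparity=0, numDisparities=128, blockSize=5,
        P1=8 * 5 * 5, P2=32 * 5 * 5, uniquenessRatio=10)
    disparity = matcher.compute(left, right).astype(np.float32) / 16.0

    points = cv2.reprojectImageTo3D(disparity, Q)   # HxWx3, camera frame
    mask = disparity > 0                            # keep valid disparities only
    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(points[mask])
    return pcd


def align_mri_model_to_scope(mri_mesh_path, scope_pcd, init=np.eye(4)):
    """Rigidly register the MRI-derived organ surface to the laparoscopic
    reconstruction with point-to-plane ICP (a standard choice; the paper's
    own registration method may differ)."""
    mri_mesh = o3d.io.read_triangle_mesh(mri_mesh_path)
    mri_pcd = mri_mesh.sample_points_uniformly(number_of_points=20000)
    scope_pcd.estimate_normals()                    # ICP target needs normals

    result = o3d.pipelines.registration.registration_icp(
        mri_pcd, scope_pcd,
        max_correspondence_distance=5.0,            # units depend on calibration
        init=init,
        estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane())
    return result.transformation                    # 4x4 pose of the MRI model in the camera frame
```

The returned 4x4 transformation can then be applied to the preoperative model so that it overlays the laparoscopic view in the mixed reality display.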

Keywords

Mixed reality · AR · VR · Robot assisted prostatectomy

Acknowledgments

We would like to thank Kai Hammerich, Kaitlin Cobb, Vladimir Valera Romero, Jonathan Bloom, Gustavo Pena Lagrave, Vikram Sabarwal, Samuel Gold, Graham Hale, Kareem Rayn, Stephanie Harmon, Clayton Smith, Marcin Czarniecki, and Bradford J. Wood of the National Institutes of Health, Bethesda, MD, USA, who provided valuable input and assistance during the development and testing of our system.


Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Abhishek Kolagunda (1)
  • Scott Sorensen (1)
  • Sherif Mehralivand (2)
  • Philip Saponaro (1)
  • Wayne Treible (1)
  • Baris Turkbey (2)
  • Peter Pinto (2)
  • Peter Choyke (2)
  • Chandra Kambhamettu (1)

  1. University of Delaware, Newark, USA
  2. National Institutes of Health, Bethesda, USA
