
A markerless automatic deformable registration framework for augmented reality navigation of laparoscopy partial nephrectomy

  • Original Article
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

Video see-through augmented reality (VST-AR) navigation for laparoscopic partial nephrectomy (LPN) can enhance surgeons' intraoperative perception by visualizing surgical targets and critical structures of the kidney. Image registration is the main challenge in such a system. Existing registration methods for laparoscopic navigation suffer from limitations such as manual alignment, invasive fixation of external markers, reliance on external tracking devices with bulky sensors, and a lack of deformation compensation. To address these issues, we present a markerless automatic deformable registration framework for VST-AR navigation of LPN.

Method

Dense stereo matching, 3D reconstruction, automatic segmentation, and surface stitching are combined to obtain a large, dense intraoperative point cloud of the renal surface. A coarse-to-fine deformable registration then aligns the intraoperative point cloud with the preoperative model automatically and precisely, using the iterative closest point (ICP) algorithm followed by the coherent point drift (CPD) algorithm. Kidney phantom experiments and in vivo experiments were performed to evaluate the accuracy and effectiveness of our approach.
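The coarse stage of such a pipeline can be sketched as a minimal point-to-point ICP. The following is an illustrative NumPy/SciPy implementation under simplified assumptions (nearest-neighbor correspondences, point-to-point error), not the authors' code; the deformable CPD refinement stage that follows it in the paper is omitted here.

```python
import numpy as np
from scipy.spatial import cKDTree

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch)."""
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    H = (src - cs).T @ (dst - cd)          # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T)) # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    return R, cd - R @ cs

def icp(src, dst, iters=50, tol=1e-8):
    """Coarse rigid alignment of point set src (N x 3) onto dst (M x 3)."""
    tree = cKDTree(dst)
    cur = src.copy()
    prev_err = np.inf
    for _ in range(iters):
        dist, idx = tree.query(cur)        # closest-point correspondences
        R, t = best_rigid_transform(cur, dst[idx])
        cur = cur @ R.T + t                # apply incremental transform
        err = np.mean(dist ** 2)
        if abs(prev_err - err) < tol:      # stop when error stops improving
            break
        prev_err = err
    return cur
```

In the paper's setting, `src` would be the stitched intraoperative surface point cloud and `dst` the preoperative model surface; a non-rigid CPD step would then compensate for tissue deformation.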

Results

The automatic segmentation achieved an average accuracy of 94.9%. The target registration error in the phantom experiments was 1.28 ± 0.68 mm (root mean square error). In vivo experiments showed that the tumor location was identified successfully by superimposing the tumor model on the laparoscopic view.
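The reported target registration error (TRE) is a root-mean-square distance over corresponding landmark pairs. As a hedged illustration of the metric (not the authors' evaluation code), it could be computed as:

```python
import numpy as np

def tre_rmse(registered, ground_truth):
    """RMS target registration error between corresponding 3D landmark sets (N x 3)."""
    d = np.linalg.norm(np.asarray(registered) - np.asarray(ground_truth), axis=1)
    return float(np.sqrt(np.mean(d ** 2)))
```

Here `registered` would hold target landmarks after applying the estimated registration, and `ground_truth` their true positions measured on the phantom.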

Conclusion

Experimental results demonstrate that the proposed framework can automatically and accurately overlay comprehensive preoperative models on deformable soft organs in a VST-AR manner, without extra intraoperative imaging modalities or external tracking devices, and indicate its potential for clinical use.




Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 61701014).

Author information

Corresponding author

Correspondence to Xuebin Zhang.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Informed consent

Informed consent was obtained from all individual participants included in the study.


About this article

Cite this article

Zhang, X., Wang, J., Wang, T. et al. A markerless automatic deformable registration framework for augmented reality navigation of laparoscopy partial nephrectomy. Int J CARS 14, 1285–1294 (2019). https://doi.org/10.1007/s11548-019-01974-6


Keywords

Navigation