
Deformable three-dimensional model architecture for interactive augmented reality in minimally invasive surgery

Published in Surgical Endoscopy

Abstract

Background

Surgical procedures have advanced considerably over the last few decades. More recently, the intraoperative availability of imaging methods has added a new dimension to minimally invasive techniques. Augmented reality in surgery has been a topic of intense interest and research.

Methods

Augmented reality applies computer vision algorithms to video from endoscopic cameras or cameras mounted in the operating room to provide the surgeon with additional information that he or she otherwise would have to recognize intuitively. One technique combines a virtual preoperative model of the patient with the endoscope camera view, using natural or artificial landmarks, to provide an augmented reality view in the operating room. The authors' approach is to provide this with the fewest possible changes to the operating room. A software architecture is presented that provides interactive adjustment of the registration between a three-dimensional (3D) model and the endoscope video.
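Landmark-based registration of a preoperative model to the camera frame can be sketched as a least-squares rigid alignment over paired landmark points. The sketch below is illustrative only and is not the authors' implementation: `register_landmarks` and its inputs are hypothetical names, and the method shown is the standard Kabsch/Umeyama closed-form solution.

```python
import numpy as np

def register_landmarks(model_pts, camera_pts):
    """Estimate the rigid transform (R, t) that best maps model-space
    landmarks onto their observed camera-space positions, in the
    least-squares sense (Kabsch/Umeyama algorithm).

    model_pts, camera_pts: (N, 3) arrays of paired landmarks.
    Returns (R, t) with camera ~= model @ R.T + t.
    """
    mc = model_pts.mean(axis=0)          # centroid of model landmarks
    cc = camera_pts.mean(axis=0)         # centroid of observed landmarks
    # Cross-covariance of the centered point sets
    H = (model_pts - mc).T @ (camera_pts - cc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection in the SVD solution
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cc - R @ mc
    return R, t
```

In practice at least three non-collinear landmarks are required, and the surgeon-driven interactive adjustment described above can be modeled as refining this initial estimate.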

Results

The augmented reality system was used to perform 12 surgeries, including adrenalectomy, repair of ureteropelvic junction obstruction and retrocaval ureter, and pancreatic procedures. The general feedback from the surgeons has been very positive, not only for deciding the positions of insertion points but also for recognizing even small changes in anatomy.

Conclusions

The approach provides a deformable 3D model architecture and applies it in the operating room. A 3D model with a deformable structure is needed to show the shape change of soft tissue during surgery. A software architecture is presented that provides interactive adjustment of the registration between the 3D model and the endoscope video, with every 3D model individually adjustable.
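Soft-tissue shape change of the kind described above can be approximated, for illustration, by a mass-spring relaxation in which a few vertices are pinned to positions observed in the video and the rest of the mesh relaxes toward its spring rest lengths. This is a minimal sketch under those assumptions, not the deformable model used in the paper; `deform_step` and its parameters are hypothetical.

```python
import numpy as np

def deform_step(positions, springs, rest_lengths, anchors, k=0.5, iters=20):
    """Relax a simple mass-spring model toward its rest configuration.

    positions: (N, 3) float array of vertex positions.
    springs: list of (i, j) vertex-index pairs connected by springs.
    rest_lengths: rest length of each spring, same order as `springs`.
    anchors: dict {vertex index: fixed position}, e.g. landmarks pinned
             to their positions observed in the endoscope video.
    k: spring stiffness in (0, 1]; iters: relaxation passes.
    """
    pos = positions.astype(float).copy()
    for _ in range(iters):
        for (i, j), L0 in zip(springs, rest_lengths):
            d = pos[j] - pos[i]
            dist = np.linalg.norm(d)
            if dist < 1e-12:
                continue
            # Displacement that moves the pair toward the rest length
            corr = k * (dist - L0) / dist * d
            if i not in anchors:
                pos[i] += 0.5 * corr
            if j not in anchors:
                pos[j] -= 0.5 * corr
        for a, p in anchors.items():   # re-pin anchored vertices
            pos[a] = p
    return pos
```

Pinning the anchored vertices each pass is what lets an observed tissue displacement propagate through the rest of the model, which is the qualitative behavior an interactive deformable overlay needs.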



Disclosures

Anant S. Vemuri, Jungle Chi-Hsiang Wu, Kai-Che Liu, and Hurng-Sheng Wu have no conflicts of interest or financial ties to disclose.

Author information

Authors and Affiliations

Authors

Corresponding author

Correspondence to Hurng-Sheng Wu.


About this article

Cite this article

Vemuri, A.S., Wu, J.CH., Liu, KC. et al. Deformable three-dimensional model architecture for interactive augmented reality in minimally invasive surgery. Surg Endosc 26, 3655–3662 (2012). https://doi.org/10.1007/s00464-012-2395-0


Keywords

Navigation