A system for context-aware intraoperative augmented reality in dental implant surgery

  • Darko Katić
  • Patrick Spengler
  • Sebastian Bodenstedt
  • Gregor Castrillon-Oberndorfer
  • Robin Seeberger
  • Juergen Hoffmann
  • Ruediger Dillmann
  • Stefanie Speidel
Original Article

Abstract

Purpose   

Surgeons ignore large volumes of information in the operating room (OR) when the amount outpaces human mental processing capacity. We developed an augmented reality (AR) system for dental implant surgery that acts as an automatic information filter, selectively displaying only currently relevant information. The goal is to reduce information overload and to offer intuitive image guidance. The system was evaluated in a pig cadaver experiment.

Methods   

Information filtering is implemented via rule-based situation interpretation with description logics. The interpretation is based on intraoperative distance measurements between anatomical structures and the dental drill, obtained with optical tracking. For AR visualization, a head-mounted display is used, which was calibrated with a novel method based on SPAAM. To adapt to surgeon-specific preferences, we offer two alternative display formats: one with static AR and one with contact-analog AR.
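To illustrate the idea of distance-based situation interpretation, the following is a minimal sketch only: the paper itself uses description logics for its rules, whereas this sketch mimics the concept with simple threshold rules on tracked drill-to-anatomy distances. All names, phase labels, and thresholds here are invented for illustration and do not come from the paper.

```python
# Hedged sketch: rule-based mapping from intraoperative distance
# measurements (as delivered by optical tracking) to a situation label
# that an AR display could use to filter information.
# Thresholds and labels are illustrative assumptions, not the paper's values.

from dataclasses import dataclass


@dataclass
class TrackingSample:
    """One intraoperative measurement from the tracking system."""
    dist_to_nerve_mm: float        # drill tip to a risk structure (e.g. nerve)
    dist_to_implant_site_mm: float # drill tip to the planned implant axis


def interpret_situation(sample: TrackingSample) -> str:
    """Apply threshold rules in priority order and return a phase label."""
    if sample.dist_to_nerve_mm < 2.0:
        return "critical-proximity"  # highest priority: warn the surgeon
    if sample.dist_to_implant_site_mm < 1.0:
        return "drilling"            # drill engaged at the planned site
    if sample.dist_to_implant_site_mm < 10.0:
        return "approach"            # drill moving toward the site
    return "idle"                    # no guidance overlay needed


print(interpret_situation(TrackingSample(1.5, 0.5)))  # critical-proximity
print(interpret_situation(TrackingSample(8.0, 0.4)))  # drilling
```

In the actual system, the rules live in a description-logic knowledge base and are evaluated by a reasoner, which makes them declarative and extensible rather than hard-coded as above.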

Results   

The system made the surgery easier and showed ergonomic benefits, as assessed by a questionnaire. All relevant phases were recognized reliably. The new calibration method showed significant improvements, and the deviation of the realized implants was <2.5 mm.

Conclusion   

The system allowed the surgeon to concentrate fully on the surgery itself. It offered greater flexibility, since the surgeon received all relevant information but remained free to deviate from it. The accuracy of the realized implants remains an open issue and part of future work.

Keywords

Dental implant surgery · Augmented reality · Cognitive surgery · Context-awareness

Notes

Acknowledgments

The present research is supported by the German Research Foundation (Research Grant DI 330/23-1) and is part of the “SFB TRR 125 Cognition-Guided Surgery” project funded by the German Research Foundation. It is furthermore sponsored by the European Social Fund of the State of Baden-Wuerttemberg.

Conflict of interest

Darko Katić, Patrick Spengler, Sebastian Bodenstedt, Gregor Castrillon-Oberndorfer, Robin Seeberger, Juergen Hoffmann, Ruediger Dillmann and Stefanie Speidel declare that they have no conflict of interest.


Copyright information

© CARS 2014

Authors and Affiliations

  • Darko Katić (1)
  • Patrick Spengler (1)
  • Sebastian Bodenstedt (1)
  • Gregor Castrillon-Oberndorfer (2)
  • Robin Seeberger (2)
  • Juergen Hoffmann (2)
  • Ruediger Dillmann (1)
  • Stefanie Speidel (1)
  1. Department of Informatics, Institute for Anthropomatics, Karlsruhe Institute of Technology (KIT), Karlsruhe, Germany
  2. Department of Cranio-Maxillofacial Surgery, University of Heidelberg, Heidelberg, Germany