A system for context-aware intraoperative augmented reality in dental implant surgery
Surgeons tend to ignore information presented in the operating room (OR) when its volume outpaces human cognitive capacity. We developed an augmented reality (AR) system for dental implant surgery that acts as an automatic information filter, selectively displaying only situation-relevant information. The purpose is to reduce information overload and offer intuitive image guidance. The system was evaluated in a pig cadaver experiment.
Information filtering is implemented via rule-based situation interpretation with description logics. The interpretation is based on intraoperative measurement of distances between anatomical structures and the dental drill, obtained by optical tracking. For AR visualization, a head-mounted display is used, which was calibrated with a novel method based on SPAAM. To adapt to surgeon-specific preferences, we offer two alternative display formats: one with static AR overlays and one with contact-analog AR.
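To illustrate the filtering principle, the following is a minimal sketch of rule-based situation interpretation driven by tracked distances. It is not the authors' implementation: the phase names, distance thresholds, overlay names, and the safety rule are all hypothetical, and the actual system uses description logics rather than plain conditionals.

```python
# Hypothetical sketch of distance-based phase recognition and
# information filtering; thresholds and names are illustrative only.
from dataclasses import dataclass

@dataclass
class TrackingSample:
    dist_drill_to_jaw_mm: float    # optical-tracking distance drill <-> jawbone
    dist_drill_to_nerve_mm: float  # distance drill <-> mandibular nerve canal

def interpret_phase(s: TrackingSample) -> str:
    """Map intraoperative distance measurements to a surgical phase."""
    if s.dist_drill_to_jaw_mm > 50.0:
        return "preparation"   # drill far from the patient
    if s.dist_drill_to_jaw_mm > 2.0:
        return "approach"      # drill moving toward the planned site
    return "drilling"          # drill at or inside the bone

# Each phase whitelists the overlays shown in the HMD, so only
# situation-relevant information reaches the surgeon.
RELEVANT_OVERLAYS = {
    "preparation": ["implant plan overview"],
    "approach":    ["planned entry point", "drill axis"],
    "drilling":    ["drill depth"],
}

def overlays_for(s: TrackingSample) -> list:
    phase = interpret_phase(s)
    shown = list(RELEVANT_OVERLAYS[phase])
    # Illustrative safety rule: always warn near the nerve canal.
    if s.dist_drill_to_nerve_mm < 3.0:
        shown.append("nerve warning")
    return shown

print(interpret_phase(TrackingSample(60.0, 80.0)))  # preparation
print(overlays_for(TrackingSample(1.0, 2.5)))
```

The point of the sketch is the mapping itself: tracking data determines the phase, and the phase determines what the surgeon sees, so irrelevant overlays never compete for attention.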
The system made the surgery easier and showed ergonomic benefits, as assessed by a questionnaire. All relevant surgical phases were recognized reliably. The new calibration method showed significant improvements, while the deviation of the realized implants remained below 2.5 mm.
The system allowed the surgeon to fully concentrate on the surgery itself. It offered greater flexibility, since the surgeon received all relevant information but was free to deviate from it. The accuracy of the realized implants remains an open issue and is part of future work.
Keywords: Dental implant surgery · Augmented reality · Cognitive surgery · Context-awareness
The present research is supported by the German Research Foundation (Research Grant DI 330/23-1) and is part of the “SFB TRR 125 Cognition-Guided Surgery” funded by the German Research Foundation. It is furthermore sponsored by the European Social Fund of the State of Baden-Wuerttemberg.
Conflict of interest
Darko Katić, Patrick Spengler, Sebastian Bodenstedt, Gregor Castrillon-Oberndorfer, Robin Seeberger, Juergen Hoffmann, Ruediger Dillmann and Stefanie Speidel declare that they have no conflict of interest.