Analysis of disruptive events and precarious situations caused by interaction with neurosurgical microscope
Developments in micro-neurosurgical microscopes have improved operating precision and the quality of surgical outcomes. Using the stereoscopic magnified view, however, requires frequent manual adjustments to the microscope during an operation.
This article reports on an investigation of the interaction details concerning a state-of-the-art micro-neurosurgical microscope. The video data from detailed observations of neurosurgeons’ interaction patterns with the microscope were analysed to examine disruptive events caused by adjusting the microscope.
The primary findings show that interruptions caused by adjusting the microscope via its handgrips and mouth switch prolong the surgery time by up to 10 %. We observed that surgeons avoid interacting with the microscope’s controls, settings, and configurations by working at the edge of the view, operating on an unfocused view, and assuming unergonomic body postures.
The lack of an automatic method for adjusting the microscope is a major problem that causes interruptions during micro-neurosurgery. From this understanding of disruptive events, we discuss the opportunities and limitations of interactive technologies that aim to reduce the frequency or shorten the duration of interruptions caused by microscope adjustment.
Keywords: Medical practice · Microscope use in the OR · Interruption · Interaction with microscope · Micro-neurosurgery
Conflicts of interest
All authors certify that they have no affiliations with or involvement in any organization or entity with any financial interest (such as honoraria; educational grants; participation in speakers’ bureaus; membership, employment, consultancies, stock ownership, or other equity interest; and expert testimony or patent-licensing arrangements) or non-financial interest (such as personal or professional relationships, affiliations, knowledge, or beliefs) in the subject matter or materials discussed in this manuscript.