Providing intuitive, easy-to-operate interaction for medical augmented reality is essential for use in the operating room. Commonly, intra-operative navigation information is displayed on an installed monitor, requiring the operating surgeon to shift focus between the monitor and the surgical site during navigation. Projector-based augmented reality has the potential to alleviate this problem. The aim of our work is to use a projector for visualization and to provide intuitive means for direct interaction with the projected information.
A consumer-grade projector is used to visualize preoperatively defined surgical planning data. The virtual information can be projected onto any deformable surface, and the surgeon can interact with it directly. A Microsoft Kinect camera captures both the surgeon's interactions and the deformations of the surface over time. After calibration of the projector and the Kinect camera, the fingertips are localized automatically. A point cloud surface representation is used to determine the surgeon's interaction with the projected virtual information. Interaction is detected by estimating the proximity of the surgeon's fingertips to the interaction zone, using the projector–Kinect calibration information. Interaction is performed using multi-touch gestures.
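The proximity-based touch test described above can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' implementation: the fingertip position is assumed to be already transformed into the Kinect camera frame via the projector–Kinect calibration, and the function name and the 1.5 cm contact threshold are hypothetical.

```python
import numpy as np

TOUCH_THRESHOLD_M = 0.015  # assumed 1.5 cm contact threshold

def detect_touch(fingertip, surface_points, threshold=TOUCH_THRESHOLD_M):
    """Return (is_touch, closest_surface_point).

    fingertip      -- (3,) position in Kinect camera coordinates (meters)
    surface_points -- (N, 3) point cloud of the deformable interaction zone
    """
    # Distance from the fingertip to every point of the captured surface
    dists = np.linalg.norm(surface_points - fingertip, axis=1)
    idx = int(dists.argmin())  # nearest surface point
    return bool(dists[idx] <= threshold), surface_points[idx]

# Synthetic interaction zone: a regular grid on a plane at z = 0.8 m
xs = np.linspace(-0.2, 0.2, 41)
surface = np.array([(x, y, 0.8) for x in xs for y in xs])

touching, _ = detect_touch(np.array([0.0, 0.0, 0.81]), surface)  # 1 cm away
hovering, _ = detect_touch(np.array([0.0, 0.0, 0.90]), surface)  # 10 cm away
# touching is True, hovering is False
```

In practice the nearest-neighbor query would run against the live Kinect point cloud (e.g., via a spatial index such as the k-d trees provided by PCL) rather than a brute-force distance computation.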
In our experimental surgical scenario, the surgeon stands in front of the Microsoft Kinect camera, while relevant medical information is projected onto the interaction zone. A hand-wave gesture initiates the tracking of the hand. The user can then interact with the projected virtual information using the defined multi-touch gestures. Thus, all relevant information, such as preoperative planning data, is available to the surgeon and his/her team intra-operatively in a familiar context.
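As one concrete example of such a multi-touch gesture, a pinch-to-zoom handler can be sketched as below. This is an illustrative sketch under our own assumptions (the function name and the use of 2D pixel coordinates are hypothetical), not the paper's implementation: the zoom factor applied to the projected planning data is taken as the ratio of the current to the previous distance between two tracked fingertips.

```python
import math

def pinch_scale(prev_tips, curr_tips):
    """Zoom factor for the projected data from one pinch-gesture frame.

    prev_tips, curr_tips -- ((x, y), (x, y)) fingertip pixel positions
    in the previous and the current frame.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    d_prev = dist(*prev_tips)
    d_curr = dist(*curr_tips)
    if d_prev == 0:
        return 1.0  # degenerate case: fingertips coincide, no zoom
    return d_curr / d_prev

# Fingertips move from 100 px apart to 200 px apart -> zoom in by 2x
scale = pinch_scale(((100, 100), (200, 100)), ((50, 100), (250, 100)))
# scale == 2.0
```

Other gestures (pan, rotate) follow the same pattern: a per-frame update computed from the tracked fingertip trajectories.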
We enabled the projection of the virtual information onto an arbitrarily shaped surface and used a Microsoft Kinect camera to capture the interaction zone and the surgeon's actions. The system eliminates the need for the surgeon to alternate focus between the surgical site and the monitor, removes unnecessary distractions, and may enhance the surgeon's performance.
Kocev, B., Ritter, F. & Linsen, L. Projector-based surgeon–computer interaction on deformable surfaces. Int J CARS 9, 301–312 (2014). doi:10.1007/s11548-013-0928-1
Keywords: Surgeon–computer interaction · Multi-touch gestures · Projector-based medical data visualization · Image processing