Projector-based surgeon–computer interaction on deformable surfaces




Purpose

Providing intuitive, easy-to-operate interaction is essential for medical augmented reality in the operating room. Intra-operative navigation information is commonly displayed on a separate monitor, forcing the operating surgeon to repeatedly shift focus between the monitor and the surgical site during navigation. Projector-based augmented reality has the potential to alleviate this problem. The aim of our work is to use a projector for visualization and to provide intuitive means for direct interaction with the projected information.


Methods

A consumer-grade projector visualizes preoperatively defined surgical planning data. The virtual information can be projected onto any deformable surface, and the surgeon can interact with it directly. A Microsoft Kinect camera captures both the surgeon's interactions and the deformations of the surface over time. After calibrating the projector and the Kinect camera, the fingertips are localized automatically. A point-cloud representation of the surface is used to determine the surgeon's interaction with the projected virtual information: a touch is detected by estimating the proximity of the surgeon's fingertips to the interaction zone, using the projector–Kinect calibration. Interaction is performed with multi-touch gestures.
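The core of the touch test can be sketched as a distance check between a tracked fingertip and the point-cloud surface captured by the Kinect. The sketch below uses a brute-force nearest-neighbor distance and an assumed threshold; the function names, the threshold value, and the flat synthetic surface are all illustrative, not taken from the paper (which projects points onto the point cloud more carefully):

```python
import numpy as np

# Assumed contact distance in mm; the paper does not state its value.
TOUCH_THRESHOLD_MM = 15.0

def nearest_surface_distance(fingertip, surface_points):
    """Distance (mm) from one fingertip to the closest point of the
    point-cloud surface representation (brute-force nearest neighbor)."""
    diffs = surface_points - fingertip              # shape (N, 3)
    return float(np.sqrt((diffs ** 2).sum(axis=1)).min())

def detect_touches(fingertips, surface_points, threshold=TOUCH_THRESHOLD_MM):
    """One bool per fingertip: True if it is close enough to the
    projected interaction zone to count as a touch."""
    return [nearest_surface_distance(f, surface_points) <= threshold
            for f in fingertips]

# Synthetic stand-in for the Kinect depth data: a flat 100 mm x 100 mm
# patch sampled on a grid at z = 0.
xs, ys = np.meshgrid(np.linspace(0, 100, 21), np.linspace(0, 100, 21))
surface = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(xs.size)])

fingers = np.array([[50.0, 50.0, 5.0],    # 5 mm above the surface: touch
                    [50.0, 50.0, 40.0]])  # 40 mm above: hover, no touch
print(detect_touches(fingers, surface))   # -> [True, False]
```

In the real system the surface points come from the live depth stream, so the same check works unchanged when the surface deforms between frames.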


Results

In our experimental surgical scenario, the surgeon stands in front of the Microsoft Kinect camera while the relevant medical information is projected onto the interaction zone. A hand-wave gesture initiates hand tracking; the user can then manipulate the projected virtual information with the defined multi-touch gestures. All information, such as preoperative planning data, is thus available to the surgeon and his or her team intra-operatively in a familiar context.
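The interaction flow just described (a wave gesture arms hand tracking, after which touches drive the projected interface) can be sketched as a small state machine. The state and event names here are hypothetical, chosen only to illustrate the sequencing:

```python
class InteractionSession:
    """Minimal sketch of the interaction flow: a hand-wave gesture arms
    hand tracking; only then do touch events affect the projected UI."""

    def __init__(self):
        self.state = "idle"

    def on_event(self, event):
        if self.state == "idle" and event == "wave":
            self.state = "tracking"       # wave gesture starts hand tracking
        elif self.state == "tracking" and event == "touch":
            self.state = "interacting"    # fingertip reached the surface
        elif self.state == "interacting" and event == "release":
            self.state = "tracking"       # fingertip lifted off again
        return self.state                 # unrecognized events are ignored

session = InteractionSession()
for event in ["touch", "wave", "touch", "release"]:
    print(event, "->", session.on_event(event))
# A touch before the wave is ignored; after the wave, touches interact.
```

Gating all touch handling behind the explicit wave gesture avoids spurious interactions while the surgeon's hands move over the site for other reasons.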


Conclusion

We enabled the projection of virtual information onto arbitrarily shaped surfaces and used a Microsoft Kinect camera to capture the interaction zone and the surgeon's actions. The system eliminates the need to alternate between viewing the surgical site and the monitor, removing an unnecessary distraction and potentially enhancing the surgeon's performance.







Author information

Correspondence to Bojan Kocev.

Electronic supplementary material

Below are the links to the electronic supplementary material.

Supplementary material 1 (mpg 33078 KB)

Supplementary material 2 (mpg 12208 KB)




Cite this article

Kocev, B., Ritter, F. & Linsen, L. Projector-based surgeon–computer interaction on deformable surfaces. Int J CARS 9, 301–312 (2014). doi:10.1007/s11548-013-0928-1



Keywords

  • Surgeon–computer interaction
  • Multi-touch gestures
  • Projector-based medical data visualization
  • Image processing