Computer Supported Cooperative Work (CSCW), Volume 23, Issue 3, pp 299–337

Interactional Order and Constructed Ways of Seeing with Touchless Imaging Systems in Surgery

  • Kenton O’Hara
  • Gerardo Gonzalez
  • Graeme Penney
  • Abigail Sellen
  • Robert Corish
  • Helena Mentis
  • Andreas Varnavas
  • Antonio Criminisi
  • Mark Rouncefield
  • Neville Dastur
  • Tom Carrell

Abstract

While surgical practices are increasingly reliant on a range of digital imaging technologies, the ability for clinicians to interact with and manipulate these digital representations in the operating theatre using traditional touch-based interaction devices is constrained by the need to maintain sterility. To overcome these concerns with sterility, a number of researchers have been developing ways of enabling interaction in the operating theatre using touchless interaction techniques, such as gesture and voice, that allow clinicians to control these systems. While there have been important technical strides in the area, there has been little in the way of understanding the use of these touchless systems in practice. With this in mind, we present a touchless system developed for use during vascular surgery. We deployed the system in the endovascular suite of a large hospital for use in the context of real procedures. We present findings from a study of the system in use, focusing on how, with touchless interaction, the visual resources were embedded and made meaningful in the collaborative practices of surgery. In particular, we discuss the importance of direct and dynamic control of the images by the clinicians in the context of talk and of other artefact use, as well as the work performed by members of the clinical team to make themselves sensable by the system. We discuss the broader implications of these findings for how we think about the design, evaluation and use of these systems.

Keywords

touchless interaction; operating theatre; sterility; collaborative practices of surgery; gestural interaction; work practice

Copyright information

© Springer Science+Business Media Dordrecht 2014

Authors and Affiliations

  • Kenton O’Hara (1)
  • Gerardo Gonzalez (2)
  • Graeme Penney (2)
  • Abigail Sellen (1)
  • Robert Corish (1)
  • Helena Mentis (4)
  • Andreas Varnavas (2)
  • Antonio Criminisi (1)
  • Mark Rouncefield (3)
  • Neville Dastur (2)
  • Tom Carrell (2)

  1. Microsoft Research, Cambridge, UK
  2. King's College London, London, UK
  3. Lancaster University, Lancaster, UK
  4. University of Maryland, College Park, USA