Vision-Based User Interfaces for Health Applications: A Survey

  • Alexandra Branzan Albu
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4291)


This paper proposes a survey of vision-based human-computer interfaces for several key fields in health care: data visualization for image-guided diagnosis, image-guided therapy planning and surgery, the operating room, assistance to motor-impaired patients, and monitoring and support of the elderly. The emphasis is on the contribution of the underlying computer vision techniques to the usability and usefulness of interfaces for each specific domain. It is also shown that end-user requirements have a significant impact on the algorithmic design of the computer vision techniques embedded in the interfaces.


Keywords: Minimally Invasive Surgery · Hand Gesture · Medical Image Analysis · Medical Image Segmentation · Computer Vision Technique




  1. Gosbee, J., Ritchie, E.: Human-computer interaction and medical software development. Interactions 4(4), 13–18 (1997)
  2. Pham, D., Xu, C., Prince, J.: Current methods in medical image segmentation. Annual Review of Biomedical Engineering 2, 315–337 (2000)
  3. Olabarriaga, S.D., Smeulders, A.W.M.: Interaction in the segmentation of medical images: A survey. Medical Image Analysis 5, 127–142 (2001)
  4. Gerig, G., Jomier, M., Chakos, M.: Valmet: A new validation tool for assessing and improving 3D object segmentation. In: Proc. MICCAI Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 516–523 (2001)
  5. Warfield, S., Zou, K.H., Wells, W.M.: Simultaneous Truth and Performance Level Estimation (STAPLE): An algorithm for the validation of image segmentation. IEEE Trans. on Medical Imaging 23(7), 903–921 (2004)
  6. O’Donnell, L., Westin, C.-F., Grimson, W.E.L., et al.: Phase-based user-steered image segmentation. In: Proc. MICCAI Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 1022–1030 (2001)
  7. Barrett, W.A., Mortensen, E.N.: Interactive live-wire boundary extraction. Medical Image Analysis 1(4), 331–341 (1997)
  8. Elliott, P.J., Diedrichsen, J., Goodson, K.J., Riste-Smith, R., Sivewright, G.J.: An object-oriented system for 3D medical image analysis. IBM Systems Journal 35(1), 4–24 (1996)
  9. Elliott, P.J., Knapman, J.M., Schlegel, W.: Interactive segmentation for radiation treatment planning. IBM Systems Journal 31(4), 620–634 (1992)
  10. Meissner, M., Zuiderveld, K. (organizers), Harris, G., Lesser, J.R., Persson, A., Vannier, M. (panelists): End users’ perspectives on volume rendering in medical imaging: A job well done or not over yet? Panel, IEEE Visualization (Vis 2005), Minneapolis, USA (October 2005)
  11. Shahidi, R., Clarke, L., Bucholz, R.D., Fuchs, H., Kikinis, R., Robb, R.A., Vannier, M.: White paper: challenges and opportunities in computer-assisted intervention. Computer Aided Surgery 6(3), 176–181 (2001)
  12. Meissner, M., Lorensen, B., Zuiderveld, K., Simha, V., Wegenkittl, R.: Volume rendering in medical applications: We’ve got pretty images, what’s left to do? Panel, IEEE Visualization (Vis 2002), Boston, USA, October 27–November 1 (2002)
  13. Gering, D.T., Nabavi, A., Kikinis, R., et al.: An integrated visualization system for surgical planning and guidance using image fusion and interventional imaging. In: Proc. MICCAI Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 809–819 (1999)
  14. Robb, R.: Biomedical Imaging, Visualization and Analysis. John Wiley and Sons, New York (1999)
  15. Ibanez, L., Schroeder, W., Ng, L., Cates, J.: The ITK Software Guide (2003)
  16. Hanson, D., Robb, R., et al.: New software toolkits for comprehensive visualization and analysis of 3D multimodal biomedical images. Journal of Digital Imaging 10(2), 1–2 (1997)
  17. Augustine, K., Holmes, D., Robb, R.: ITK and Analyze: A synergistic integration. In: Proc. SPIE Medical Imaging, pp. 6–15 (2004)
  18. Rexilius, J., Spindler, W., Jomier, J., et al.: A framework for algorithm evaluation and clinical application prototyping using ITK. In: MICCAI Workshop on Open-Source Software (2005)
  19. Hanssen, N., von Rymon-Lipinski, B., Jansen, T., et al.: Integrating the Insight Toolkit ITK into a medical software framework. In: Proc. of CARS Computer Assisted Radiology and Surgery (2002)
  20. Alberola-Lopez, C., Cardenes, R., Martin, M., et al.: diSNei: A collaborative environment for medical image analysis and visualization. In: Proc. MICCAI Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 814–823 (2000)
  21. Simmross-Wattenberg, F., Carranza-Herrezuelo, N., Palacios-Camarero, C., et al.: Group-Slicer: a collaborative extension of the 3D-Slicer. Journal of Biomedical Informatics 38, 431–442 (2005)
  22. McInerney, J., Roberts, D.W.: Frameless stereotaxy of the brain. The Mount Sinai Journal of Medicine 67(4), 300–310 (2000)
  23. Heilbrun, M.P., McDonald, P., Wicker, C., et al.: Stereotactic localization and guidance using a machine vision technique. Stereotactic and Functional Neurosurgery 58, 94–98 (1992)
  24. Pelizzari, C.A., Chen, G.T.Y., Spelbring, D.R., et al.: Accurate three-dimensional registration of CT, PET, and MR images of the brain. Journal of Computer Assisted Tomography 13, 20–26 (1989)
  25. Wang, Y., Peterson, B.S., Staib, L.H.: 3D brain surface matching based on geodesics and local geometry. Computer Vision and Image Understanding 89, 252–271 (2003)
  26. Gering, D.T., Nabavi, A., Kikinis, R., et al.: An integrated visualization system for surgical planning and guidance using image fusion and an open MR. Journal of Magnetic Resonance Imaging 13, 967–975 (2001)
  27. Warfield, S., Nabavi, A., Butz, T., et al.: Intraoperative segmentation and non-rigid registration for image-guided therapy. In: Proc. MICCAI Conf. Medical Image Computing and Computer-Assisted Intervention, pp. 176–185 (October 2000)
  28. Pavlovic, V.I., Sharma, R., Huang, T.S.: Visual interpretation of hand gestures for human-computer interaction: A review. IEEE Trans. on Pattern Analysis and Machine Intelligence 19(7) (1997)
  29. Erol, A., Bebis, G., Nicolescu, M., Boyle, R., Twombly, X.: A review on vision-based full DOF hand motion estimation. In: Proc. of the IEEE Workshop on Vision for Human-Computer Interaction (V4HCI), San Diego (June 2005)
  30. Graetzel, C., Fong, T., Grange, S., Baur, C.: A non-contact mouse for surgeon-computer interaction. Technology and Health Care 12(3) (2004)
  31. Nishikawa, A., Hosoi, T., Koara, K., et al.: Face mouse: a novel human-machine interface for controlling the position of a laparoscope. IEEE Trans. on Robotics and Automation 19(5), 825–844 (2003)
  32. Grange, S., Fong, T., Baur, C.: M/ORIS: A medical/operating room interaction system. In: Proc. of the ACM Int. Conf. on Multimodal Interfaces, State College, PA (2004)
  33. McKenna, S.J., Nait-Charif, H., Frank, T.: Towards video understanding of laparoscopic surgery: Instrument tracking. In: Proc. of Image and Vision Computing New Zealand (2005)
  34. Nishikawa, A., Asano, S., Fujita, R., et al.: Robust visual tracking of multiple surgical instruments for laparoscopic surgery. In: Proc. of Computer Assisted Radiology and Surgery, London (2003)
  35. Chen, J., Yeasin, M., Sharma, R.: Visual modeling and evaluation of surgical skill. Pattern Analysis and Applications 6, 1–11 (2003)
  36. Rosen, J., Solazzo, M., Hannaford, B., Sinanan, M.: Objective evaluation of laparoscopic skills based on haptic information and tool/tissue interactions. Computer Aided Surgery 7(1), 49–61 (2002)
  37. Card, S.K., Moran, T.P., Newell, A.: The Psychology of Human-Computer Interaction. Lawrence Erlbaum Associates, Hillsdale (1983)
  38. Keates, S., Clarkson, P.J., Robinson, P.: Developing a methodology for the design of accessible interfaces. In: Proc. of the 4th Workshop on User Interfaces for All, Stockholm, Sweden (1998)
  39. Morrison, H., McKenna, S.J.: Contact-free recognition of user-defined gestures as a means of computer access for the physically disabled. In: Proc. 1st Workshop on Universal Access and Assistive Technology, Cambridge, UK, pp. 99–103 (March 2002)
  40. Betke, M., Gips, J., Fleming, P.: The Camera Mouse: Visual tracking of body features to provide computer access for people with severe disabilities. IEEE Trans. on Neural Systems and Rehabilitation Engineering 10(1), 1–9 (2002)
  41. Grauman, K., Betke, M., Lombardi, J., Gips, J., Bradski, G.R.: Communication via eye-blinks and eye-brow raises: video-based human-computer interfaces. Universal Access in the Information Society 2, 359–373 (2003)
  42. Yanco, H.A., Gips, J.: Preliminary investigation of a semi-autonomous robotic wheelchair directed through electrodes. In: Proc. Rehabilitation Engineering Society of North America Annual Conf., pp. 414–416 (1997)
  43. Kuno, Y., Shimada, N., Shirai, Y.: Look where you’re going: A robotic wheelchair based on the integration of human and environmental observations. IEEE Robotics and Automation Magazine, 27–34 (March 2003)
  44. McKenna, S.J., Marquis-Faulkes, F., Gregor, P., Newell, A.F.: Scenario-based drama as a tool for investigating user requirements with application to home monitoring for elderly people. In: Proc. of HCI International, Crete, Greece (June 2003)
  45. Katz, S.: Assessing self-maintenance: Activities of daily living, mobility, and instrumental activities of daily living. Journal of the American Geriatrics Society 31(12), 721–726 (1983)
  46. Mihailidis, A., Carmichael, B., Boger, J.: The use of computer vision to support aging-in-place, safety, and independence in the home. IEEE Trans. on Information Technology in Biomedicine 8(3), 238–247 (2004)
  47. Abowd, G., Bobick, A., Essa, I., Mynatt, E., Rogers, W.: The Aware Home: developing technologies for successful aging. AAAI Technical Report
  48. Philipose, M., Fishkin, K., Perkowitz, M., et al.: Inferring activities from interactions with objects. IEEE Pervasive Computing 3(4), 50–57 (2004)
  49. Nait-Charif, H., McKenna, S.J.: Activity summarization and fall detection in a supportive home environment. In: Proc. IEEE Int. Conf. on Pattern Recognition, pp. 323–326 (2004)
  50. Sixsmith, A., Johnson, N.: A smart sensor to detect the falls of the elderly. IEEE Pervasive Computing 3(2), 42–47 (2004)
  51. Cucchiara, R., Grana, C., Prati, A., Vezzani, R.: Computer vision techniques for PDA accessibility of in-house video surveillance. In: Proc. of ACM Int. Workshop on Visual Surveillance (IWVS), Berkeley, CA, pp. 87–97 (2003)

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Alexandra Branzan Albu
  1. Dept. of Electrical and Computer Engineering, University of Victoria, BC, Canada
