A Real-Time Hand Gesture Interface for Medical Visualization Applications

  • Conference paper
Applications of Soft Computing

Part of the book series: Advances in Intelligent and Soft Computing ((AINSC,volume 36))

Abstract

In this paper, we consider a vision-based system that interprets a user's gestures in real time to manipulate objects within a medical data visualization environment. Dynamic navigation gestures are translated to commands based on their relative positions on the screen, while static gesture poses are identified to execute non-directional commands. Poses are represented by Haar-like features describing the shape of the hand; these features are then input to a fuzzy c-means (FCM) clustering algorithm for pose classification. A probabilistic neighborhood search algorithm is employed to automatically select a small number of Haar features and to tune the FCM classifier. The gesture recognition system was implemented in a sterile medical data-browser environment. Test results on four interface tasks showed that the use of a few Haar features with the supervised FCM yielded successful performance rates of 95–100%. In addition, a small exploratory test of the AdaBoost-Haar system was conducted to detect a single hand gesture and to assess its suitability for hand gesture recognition.
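The classification stage described above (feature vectors assigned soft memberships across pose clusters) can be sketched with a plain fuzzy c-means implementation. This is an illustrative sketch only: the toy feature vectors, cluster count, and fuzzifier value below are assumptions, not the authors' actual Haar features or tuned configuration.

```python
import numpy as np

def fuzzy_c_means(X, c=2, m=2.0, n_iter=50, seed=0):
    """Plain fuzzy c-means: returns cluster centers and the membership matrix U.

    X : (n_samples, n_features) array of feature vectors
    c : number of clusters (here, gesture poses)
    m : fuzzifier (> 1); m = 2 is the common default
    """
    rng = np.random.default_rng(seed)
    n = X.shape[0]
    # random initial memberships, each row normalised to sum to 1
    U = rng.random((n, c))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        Um = U ** m
        # cluster centers as membership-weighted means of the samples
        centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
        # squared distance from every sample to every center
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        d2 = np.maximum(d2, 1e-12)  # guard against division by zero
        # membership update: u_ik proportional to d_ik^(-1/(m-1))
        inv = d2 ** (-1.0 / (m - 1.0))
        U = inv / inv.sum(axis=1, keepdims=True)
    return centers, U

# toy "Haar feature" vectors for two well-separated poses (synthetic data)
X = np.vstack([np.random.default_rng(1).normal(0.0, 0.1, (20, 4)),
               np.random.default_rng(2).normal(1.0, 0.1, (20, 4))])
centers, U = fuzzy_c_means(X, c=2)
labels = U.argmax(axis=1)  # hard assignment = pose classification
```

In the paper's supervised setting the clusters would additionally be labeled with known poses; here `argmax` over memberships stands in for that final hard decision.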


Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Wachs, J. et al. (2006). A Real-Time Hand Gesture Interface for Medical Visualization Applications. In: Tiwari, A., Roy, R., Knowles, J., Avineri, E., Dahal, K. (eds) Applications of Soft Computing. Advances in Intelligent and Soft Computing, vol 36. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-36266-1_15

  • DOI: https://doi.org/10.1007/978-3-540-36266-1_15

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-29123-7

  • Online ISBN: 978-3-540-36266-1

  • eBook Packages: Engineering (R0)
