Real Time Hand Gesture Recognition Including Hand Segmentation and Tracking

  • Thomas Coogan
  • George Awad
  • Junwei Han
  • Alistair Sutherland
Part of the Lecture Notes in Computer Science book series (LNCS, volume 4291)


In this paper we present a system for automatic gesture recognition. The system consists of two main components: (i) a unified technique for segmenting and tracking the face and hands using a skin detection algorithm, which handles occlusion between skin objects so that the status of the occluded parts is maintained; this is achieved by combining three features, namely color, motion, and position; and (ii) a static and dynamic gesture recognition system. Static gesture recognition is achieved by a robust hand shape classifier, based on PCA subspaces, that is invariant to scale and to small translation and rotation transformations. Combining hand shape classification with position information and discrete hidden Markov models (DHMMs) enables dynamic gesture recognition.
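To make the pipeline above concrete, the following Python sketch illustrates two of its stages under stated assumptions: a generic skin-colour segmentation step (the paper does not specify its colour model or thresholds, so the YCrCb bounds below are illustrative only) and a nearest-PCA-subspace hand-shape classifier in the spirit of the static recognition stage. The motion and position cues, the occlusion handling, and the DHMM-based dynamic recognition are omitted; none of the names or parameters here come from the paper.

```python
# Illustrative sketch only -- not the authors' implementation. All thresholds,
# image sizes, and labels are assumptions made for demonstration.
import cv2
import numpy as np

# --- Skin segmentation (colour cue only; the paper also combines motion and position) ---
SKIN_LO = np.array([0, 133, 77], dtype=np.uint8)     # assumed YCrCb lower bound
SKIN_HI = np.array([255, 173, 127], dtype=np.uint8)  # assumed YCrCb upper bound

def skin_mask(frame_bgr):
    """Binary mask of skin-coloured pixels, cleaned with a morphological opening."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LO, SKIN_HI)
    kernel = np.ones((5, 5), np.uint8)
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

# --- Static hand-shape classification via per-class PCA subspaces ---
class SubspaceClassifier:
    def __init__(self, n_components=10):
        self.n_components = n_components
        self.models = {}  # label -> (mean vector, PCA basis)

    def fit(self, images_by_label):
        """images_by_label: {label: array of shape (n_samples, n_pixels)}."""
        for label, X in images_by_label.items():
            mean = X.mean(axis=0)
            # Principal directions of the centred training images via SVD.
            _, _, vt = np.linalg.svd(X - mean, full_matrices=False)
            self.models[label] = (mean, vt[: self.n_components])

    def predict(self, x):
        """Assign a flattened hand image to the class whose subspace
        reconstructs it with the smallest error."""
        best_label, best_err = None, np.inf
        for label, (mean, basis) in self.models.items():
            centred = x - mean
            recon = basis.T @ (basis @ centred)
            err = np.linalg.norm(centred - recon)
            if err < best_err:
                best_label, best_err = label, err
        return best_label
```

In such a scheme, the dynamic stage would quantise the per-frame shape labels and hand positions into a discrete observation sequence and score it against one DHMM per gesture; that stage is not shown here.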


Keywords: Gesture Recognition, Search Window, Hand Shape, Hand Gesture Recognition, Occlusion Status



Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Thomas Coogan
  • George Awad
  • Junwei Han
  • Alistair Sutherland

  1. Dublin City University, Ireland
