
Hand posture and gesture recognition technology


Hand gestures, performed with one or both hands, can be categorized by application into conversational, controlling, manipulative, and communicative gestures. In general, hand gesture recognition aims to identify specific human gestures and use them to convey information. The recognition process comprises four main stages: collecting hand gesture images; preprocessing the images with techniques such as edge detection, filtering, and normalization; extracting the main characteristics (features) of the gesture images; and the evaluation (classification) stage, in which each image is assigned to its corresponding gesture class. Many methods have been applied in the classification stage, including artificial neural networks, template matching, hidden Markov models, and dynamic time warping. This exploratory survey aims to provide a progress report on hand posture and gesture recognition technology.
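The four-stage pipeline described above can be illustrated with a minimal sketch in Python. This is not the method of any particular system surveyed here: the Sobel kernels, the profile-based features, and the two synthetic "gesture" images are illustrative assumptions, with template matching standing in for the classification stage.

```python
import numpy as np

def sobel_edges(img):
    """Stage 2 (preprocessing): Sobel edge detection on a grayscale image."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2]
            out[y, x] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return out

def normalize(a):
    """Stage 2 (preprocessing): scale values into [0, 1]."""
    m = a.max()
    return a / m if m > 0 else a

def extract_features(edges):
    """Stage 3 (feature extraction): row and column edge-energy profiles."""
    return np.concatenate([edges.sum(axis=0), edges.sum(axis=1)])

def classify(feature, templates):
    """Stage 4 (classification): template matching by nearest Euclidean distance."""
    return min(templates, key=lambda label: np.linalg.norm(feature - templates[label]))

# Stage 1 (image collection), simulated: a vertical and a horizontal stroke.
img_v = np.zeros((16, 16)); img_v[:, 8] = 1.0
img_h = np.zeros((16, 16)); img_h[8, :] = 1.0

pipeline = lambda img: extract_features(normalize(sobel_edges(img)))
templates = {"vertical": pipeline(img_v), "horizontal": pipeline(img_h)}
print(classify(pipeline(img_v), templates))  # vertical
```

In a real system, the template-matching step would be replaced by one of the stronger classifiers named above (neural networks, hidden Markov models, or dynamic time warping for temporal gesture sequences), but the stage boundaries remain the same.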






Author information

Corresponding author

Correspondence to Haitham Sabah Badi.


Cite this article

Badi, H.S., Hussein, S. Hand posture and gesture recognition technology. Neural Comput & Applic 25, 871–878 (2014).




Keywords

  • Gesture recognition
  • Human–computer interaction
  • Representations
  • Natural interfaces
  • Recognition