Hand gesture recognition and animation for local hand motions

  • M. K. Bhuyan
  • V. Venkata Ramaraju
  • Yuji Iwahori
Original Article

Abstract

Hand gestures are a universally adopted means of communication, conveying messages in the form of sign language. To communicate with a deaf and mute person, a hearing person therefore needs some knowledge of sign language and must be able to produce its gestures. By recognizing and animating hand gestures, a computer can help facilitate communication between machines and the hearing- and speech-impaired. In this paper, we present a method for synthesizing hand gestures with the help of a computer, which may enable a hearing person to convey a message to a mute person more easily, without any knowledge of sign language. The proposed technique requires the system to be trained prior to operation. Gesture animation is computationally complex, as it involves replicating the hand with its 27 degrees of freedom, and it also requires gesture recognition. Hence, in this paper, we implement a gesture animation framework that first recognizes hand gestures. Computational complexity is significantly reduced by summarizing a long gesture sequence as a set of key frames. The animation process computes hand parameters for every pose in a gesture sequence using information such as the positions of the fingers, the locations of the metacarpophalangeal joints, and the bend angles of the fingers. From these parameters, the hand pose is estimated by imposing constraints of the hand. A gesture sequence is then animated using these models, with the hand models for the frames between key frames obtained by interpolation. In our experiments, we demonstrate gesture animation with hand poses exactly matching the real gestures.
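The key-frame-based animation described above amounts to interpolating per-finger joint angles between successive key poses while respecting hand constraints. The following is a minimal sketch, not the authors' implementation: the joint layout, the linear interpolation scheme, and the DIP = (2/3) PIP coupling (a commonly used static hand constraint) are assumptions for illustration only.

```python
# Sketch: interpolate finger joint angles between two key frames of a gesture,
# producing the in-between hand models used for animation.
import numpy as np

def interpolate_keyframes(theta_a, theta_b, n_frames):
    """Linearly interpolate bend angles (radians) between two key frames.

    theta_a, theta_b : arrays of shape (5, 2) -> [MCP, PIP] angles per finger.
    Returns a list of (5, 3) arrays [MCP, PIP, DIP], one per in-between frame.
    """
    theta_a = np.asarray(theta_a, dtype=float)
    theta_b = np.asarray(theta_b, dtype=float)
    frames = []
    for t in np.linspace(0.0, 1.0, n_frames):
        mcp_pip = (1.0 - t) * theta_a + t * theta_b   # interpolated MCP, PIP angles
        dip = (2.0 / 3.0) * mcp_pip[:, 1:2]           # assumed DIP-PIP coupling constraint
        frames.append(np.hstack([mcp_pip, dip]))      # per-finger [MCP, PIP, DIP]
    return frames

# Example: animate from an open hand to a half-closed pose over 10 frames.
open_pose   = np.zeros((5, 2))
half_closed = np.full((5, 2), np.pi / 4)
poses = interpolate_keyframes(open_pose, half_closed, n_frames=10)
```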

Keywords

Hand gesture · Gesture animation · Hand model · Finger pose estimation


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • M. K. Bhuyan (1)
  • V. Venkata Ramaraju (1)
  • Yuji Iwahori (2)
  1. Department of Electronics and Electrical Engineering, Indian Institute of Technology Guwahati, Guwahati, India
  2. Department of Computer Science, Chubu University, Kasugai, Japan
