Multimedia Tools and Applications, Volume 74, Issue 8, pp 2687–2715

Hand tracking and gesture recognition system for human-computer interaction using low-cost hardware

Abstract

Human-Computer Interaction (HCI) is ubiquitous in our daily lives. It is usually achieved with a physical controller such as a mouse, keyboard, or touch screen. Such controllers hinder a Natural User Interface (NUI) because they impose a strong barrier between the user and the computer. Various hand tracking systems are available on the market, but they are complex and expensive. In this paper, we present the design and development of a robust, marker-less hand/finger tracking and gesture recognition system built from low-cost hardware. We propose a simple but efficient method that allows robust and fast hand tracking despite complex backgrounds and motion blur. Our system translates the detected hands or gestures into different functional inputs and interfaces with other applications via several methods, enabling intuitive HCI and interactive motion gaming. We also developed sample applications that utilize the inputs from the hand tracking system. Our results show that intuitive HCI and motion gaming can be achieved with minimal hardware requirements.
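The abstract does not spell out the tracking pipeline itself (the full text does), so the following is only a hedged illustration of the kind of processing such a low-cost depth-sensor system typically performs: threshold a Kinect-style depth frame around the hand and count extended fingers from the contour's convexity defects using OpenCV (Python, OpenCV 4). The function name count_fingers, the 400–800 mm depth band, and the defect thresholds are hypothetical choices for illustration, not values from the paper.

# Illustrative sketch only, not the authors' published method.
import cv2
import numpy as np

def count_fingers(depth_mm, near=400, far=800):
    """Return an estimated finger count for the closest blob in a depth frame.

    depth_mm : 2-D uint16 array of depths in millimetres (e.g. from a Kinect).
    near/far : assumed depth band (mm) containing the user's hand.
    """
    # 1. Segment pixels inside the assumed hand depth band and remove speckle noise.
    mask = cv2.inRange(depth_mm, near, far)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    # 2. Take the largest external contour as the hand candidate.
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return 0
    hand = max(contours, key=cv2.contourArea)

    # 3. Convexity defects approximate the valleys between extended fingers.
    hull = cv2.convexHull(hand, returnPoints=False)
    defects = cv2.convexityDefects(hand, hull)
    if defects is None:
        return 0

    fingers = 0
    for start_i, end_i, far_i, fixpt_depth in defects[:, 0]:
        p1, p2, valley = hand[start_i][0], hand[end_i][0], hand[far_i][0]
        a = np.linalg.norm(p2 - p1)
        b = np.linalg.norm(valley - p1)
        c = np.linalg.norm(valley - p2)
        angle = np.arccos((b ** 2 + c ** 2 - a ** 2) / (2 * b * c + 1e-6))
        # A deep, narrow defect usually corresponds to the gap between two fingers;
        # fixpt_depth is the defect depth times 256, so 10000 is an empirical cutoff.
        if angle < np.pi / 2 and fixpt_depth > 10000:
            fingers += 1
    # Rough heuristic: extended fingers are approximately inter-finger gaps plus one.
    return min(fingers + 1, 5)

Working in the depth image rather than the colour image sidesteps the skin-colour ambiguities that make tracking against complex backgrounds difficult, which is one reason depth sensors such as the Kinect are attractive as low-cost tracking hardware.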

Keywords

Gesture recognition · Hand/Finger tracking · HCI · Kinect · Motion game · NUI

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  1. Department of Ubiquitous IT, Dongseo University, Busan, Korea
  2. Department of Visual Contents, Dongseo University, Busan, Korea
  3. Division of Computer and Information Engineering, Dongseo University, Busan, Korea
