
Multimedia Tools and Applications, Volume 72, Issue 1, pp 843–864

A high-performance training-free approach for hand gesture recognition with accelerometer

  • Liang Yin
  • Mingzhi Dong
  • Ying Duan
  • Weihong Deng
  • Kaili Zhao
  • Jun Guo

Abstract

In previous research on human-machine interaction, gesture parameters or templates are first learnt from training samples, and some form of matching is then conducted. For these training-required methods, a small number of training samples typically results in poor user-independent performance, while a large number of training samples leads to a time-consuming and laborious sample collection process. In this paper, a high-performance training-free approach for hand gesture recognition with an accelerometer is proposed. First, we determine the underlying space for gesture generation based on the physical meaning of the acceleration direction. Then, the template of each gesture in the underlying space is generated from the gesture trails that are commonly provided in the instructions of gesture recognition devices. The template generation process therefore no longer requires training samples, which makes the gesture recognition training-free. After that, a feature extraction method, which transforms the original acceleration sequence into a sequence of more user-invariant features in the underlying space, and a more robust template matching method based on dynamic programming are presented to complete the recognition process and enhance system performance. Our algorithm is tested in a 28-user experiment with 2,240 gesture samples, and this training-free algorithm outperforms the traditional training-required algorithms based on the Hidden Markov Model (HMM) and Dynamic Time Warping (DTW).
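
The abstract outlines a two-step pipeline: map raw acceleration samples to direction-based, more user-invariant features in the underlying space, then match the resulting feature sequence against a gesture template with a dynamic-programming alignment. The paper's exact feature definition, template format, and cost function are not given in the abstract, so the following is only a minimal illustrative sketch of that general idea; the quantization scheme, cost values, and all function names are assumptions.

```python
# Illustrative sketch only: (1) quantize raw acceleration samples into
# direction-based symbols, (2) match the symbol sequence against a gesture
# template with a DTW-style dynamic program. The paper's actual feature
# space and cost function are not specified in the abstract; everything
# below is assumed for illustration.
import numpy as np

def direction_features(accel, eps=0.2):
    """Map each 3-axis acceleration sample to the sign of its dominant axis.

    accel: (N, 3) array of accelerometer readings (gravity removed).
    Returns an (N,) array of integer symbols in {0..5} (+x, -x, +y, -y, +z, -z).
    """
    feats = []
    for a in accel:
        axis = int(np.argmax(np.abs(a)))      # dominant axis of this sample
        sign = 0 if a[axis] >= 0 else 1
        feats.append(axis * 2 + sign)         # one of 6 direction symbols
    return np.array(feats)

def dp_match(seq, template):
    """DTW-style dynamic-programming distance between two symbol sequences."""
    n, m = len(seq), len(template)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq[i - 1] == template[j - 1] else 1.0
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m] / (n + m)                  # length-normalized distance

def recognize(accel, templates):
    """Return the label of the template with the smallest matching distance."""
    feats = direction_features(accel)
    return min(templates, key=lambda k: dp_match(feats, templates[k]))
```

As a usage sketch, recognize(window, {"circle": circle_template, "shake": shake_template}) would return the label whose template yields the smallest normalized alignment distance; in the spirit of the paper, such hypothetical templates would be derived from the gesture trails printed in the device instructions rather than learnt from training samples.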

Keywords

Training-free · Gesture recognition · Accelerometer

References

  1. 1.
    Adxl330 datasheet (2006)Google Scholar
  2. 2.
    Akl A, Valaee S (2010) Accelerometer-based gesture recognition via dynamic-time warping, affinity propagation, & compressive sensing. In: 2010 IEEE international conference on acoustics speech and signal processing (ICASSP), IEEE, pp 2270–2273Google Scholar
  3. 3.
    Brezmes T, Gorricho JL, Cotrina J (2009) Activity recognition from accelerometer data on a mobile phone. In: Distributed computing, artificial intelligence, bioinformatics, soft computing, and ambient assisted living, pp 796–799Google Scholar
  4. 4.
    Brindza J, Szweda J, Liao Q, Jiang Y, Striegel A (2009) Wiilab: bringing together the nintendo wiimote and matlab. In: Frontiers in education conference, 2009. FIE’09. 39th IEEE. IEEE, pp 1–6Google Scholar
  5. 5.
    Byrne D, Doherty AR, Snoek CGM, Jones GJF, Smeaton AF (2010) Everyday concept detection in visual lifelogs: validation, relationships and trends. Multimed Tools Appl 49(1):119–144CrossRefGoogle Scholar
  6. 6.
    Chen Y, Liu M, Liu J, Shen Z, Pan W (2011) Slideshow: Gesture-aware ppt presentation. In: 2011 IEEE international conference on multimedia and expo (ICME), IEEE, pp 1–4Google Scholar
  7. 7.
    Cho SJ, Oh JK, Bang WC, Chang W, Choi E, Jing Y, Cho J, Kim DY (2004) Magic wand: a hand-drawn gesture input device in 3-d space with inertial sensorsGoogle Scholar
  8. 8.
    Choi ES, Bang WC, Cho SJ, Yang J, Kim DY, Kim SR (2005) Beatbox music phone: gesture-based interactive mobile phone using a tri-axis accelerometer. In: IEEE international conference on industrial technology, 2005. ICIT 2005. IEEE, pp 97–102Google Scholar
  9. 9.
    Farella E, Pieracci A, Benini L, Rocchi L, Acquaviva A (2008) Interfacing human and computer with wireless body area sensor networks: the wimoca solution. Multimed Tools Appl 38(3):337–363CrossRefGoogle Scholar
  10. 10.
    Flórez F, García JM, García J, Hernández A (2002) Hand gesture recognition following the dynamics of a topology-preserving network. In: Fifth IEEE international conference on automatic face and gesture recognition, 2002. Proceedings. IEEE, pp 318–323Google Scholar
  11. 11.
    Holzinger A, Nischelwitzer AK, Kickmeier-Rust MD (2006) Pervasive e-education supports life long learning: some examples of x-media learning objects (xlo). Digital Media, pp 20–26Google Scholar
  12. 12.
    Holzinger A, Softic S, Stickel C, Ebner M, Debevc M (2009) Intuitive e-teaching by using combined hci devices: experiences with wiimote applications. In: Universal access in human-computer interaction. applications and services, pp 44–52Google Scholar
  13. 13.
    Holzinger A, Softic S, Stickel C, Ebner M, Debevc M, Hu B (2012) Nintendo wii remote controller in higher education: development and evaluation of a demonstrator kit for e-teaching. Comput Inform 29(4):601–615Google Scholar
  14. 14.
    Hürst W, van Wezel C (2012) Gesture-based interaction via finger tracking for mobile augmented reality. Multimed Tools Appl 62(1):233–258CrossRefGoogle Scholar
  15. 15.
    Kettebekov S, Sharma R (2000) Understanding gestures in multimodal human computer interaction. Int J Artif Intell Tools 9(2):205–223CrossRefGoogle Scholar
  16. 16.
    Lee HK, Kim JH (1999) An hmm-based threshold model approach for gesture recognition. IEEE Trans Pattern Anal Mach Intell 21(10):961–973CrossRefGoogle Scholar
  17. 17.
    Liu J, Zhong L, Wickramasuriya J, Vasudevan V (2009) uwave: accelerometer-based personalized gesture recognition and its applications. Pervasive Mob Comput 5(6):657–675CrossRefGoogle Scholar
  18. 18.
    Mäntyjärvi J, Kela J, Korpipää P, Kallio S (2004) Enabling fast and effortless customisation in accelerometer based gesture interaction. In: ACM international conference proceeding seriesGoogle Scholar
  19. 19.
    Mantyla VM, Mantyjarvi J, Seppanen T, Tuulari E (2000) Hand gesture recognition of a mobile device user. In: 2011 IEEE international conference on multimedia and expo (ICME), vol 1. IEEE, pp 281–284Google Scholar
  20. 20.
    Montoliu R, Blom J, Gatica-Perez D (2013) Discovering places of interest in everyday life from smartphone data. Multimed Tools Appl 62(1):179–307CrossRefGoogle Scholar
  21. 21.
    Montoliu R, Gatica-Perez D (2010) Discovering human places of interest from multimodal mobile phone data. In: Proceedings of the 9th international conference on mobile and ubiquitous multimedia. ACM, p 12Google Scholar
  22. 22.
    Müller M (2007) Ltd MyiLibrary. Information retrieval for music and motion, vol 6. Springer BerlinGoogle Scholar
  23. 23.
    Park CB, Roh MC, Lee SW (2008) Real-time 3d pointing gesture recognition in mobile space. In: 8th IEEE international conference on automatic face & gesture recognition, 2008. FG’08. IEEE, pp 1–6Google Scholar
  24. 24.
    Pavlovic VI, Sharma R, Huang TS (1997) Visual interpretation of hand gestures for human-computer interaction: a review. IEEE Trans Pattern Anal Mach Intell 19(7):677–695CrossRefGoogle Scholar
  25. 25.
    Pei M, Jia Y, Zhu SC (2011) Parsing video events with goal inference and intent prediction. In: 2011 IEEE international conference on computer vision (ICCV), IEEE, pp 487–494Google Scholar
  26. 26.
    Peng X, Bennamoun M, Mian AS (2011) A training-free nose tip detection method from face range images. Pattern Recogn 44(3):544–558CrossRefzbMATHGoogle Scholar
  27. 27.
    Quintana GE, Sucar LE, Azcárate G, Leder R (2008) Qualification of arm gestures using hidden markov models. In: 8th IEEE international conference on automatic face & gesture recognition, 2008. FG’08. IEEE, pp 1–6Google Scholar
  28. 28.
    Rabiner LR (1989) A tutorial on hidden markov models and selected applications in speech recognition. Proc IEEE 77(2):257–286CrossRefGoogle Scholar
  29. 29.
    Rabiner LR, Juang BH (1993) Fundamentals of speech recognitionGoogle Scholar
  30. 30.
    Rajko S, Qian G (2008) Hmm parameter reduction for practical gesture recognition. In: 8th IEEE international conference on automatic face & gesture recognition, 2008. FG’08. IEEE, pp 1–6Google Scholar
  31. 31.
    Seo HJ, Milanfar P (2010) Training-free, generic object detection using locally adaptive regression kernels. IEEE Trans Pattern Anal Mach Intell 32(9):1688–1704CrossRefGoogle Scholar
  32. 32.
    Sminchisescu C, Kanaujia A, Li Z, Metaxas D (2005) Conditional models for contextual human motion recognition. In: Tenth IEEE international conference on computer vision, 2005. ICCV 2005, vol 2. IEEE, pp 1808–1815Google Scholar
  33. 33.
    Song Y, Demirdjian D, Davis R (2011) Multi-signal gesture recognition using temporal smoothing hidden conditional random fields. In: 2011 IEEE international conference on automatic face & gesture recognition and workshops (FG 2011), IEEE, pp 388–393Google Scholar
  34. 34.
    Suk HI, Sin BK, Lee SW (2008) Recognizing hand gestures using dynamic bayesian network. In: 8th IEEE international conference on automatic face & gesture recognition, 2008. FG’08. IEEE, pp 1–6Google Scholar
  35. 35.
    Takahashi M, Fujii M, Naemura M, Satoh S (2013) Human gesture recognition system for tv viewing using time-of-flight camera. Multimed Tools Appl 62(3):761–783CrossRefGoogle Scholar
  36. 36.
    Tsukada K, Yasamura M (2002) Ubi-finger: gesture input device for mobile use. In: Asia-Pacific computer and human interactionGoogle Scholar
  37. 37.
    Wang D, Xiong Z, Zhang M (2012) An application oriented and shape feature based multi-touch gesture description and recognition method. Multimed Tools Appl 58(3):497–519CrossRefMathSciNetGoogle Scholar
  38. 38.
    Wilson A, Shafer S (2003) Xwand: Ui for intelligent spaces. In: Computer human interaction, pp 545–552Google Scholar
  39. 39.
    Wilson D, Wilson A (2004) Gesture recognition using the xwandGoogle Scholar
  40. 40.
    Wu J, Pan G, Zhang D, Qi G, Li S (2009) Gesture recognition with a 3-d accelerometer. In: Ubiquitous intelligence and computing, pp 25–38Google Scholar
  41. 41.
    Zhu Y, Xu G, Kriegman DJ (2002) A real-time approach to the spotting, representation, and recognition of hand gestures for human–computer interaction. Comput Vis Image Underst 85(3):189–208CrossRefzbMATHGoogle Scholar

Copyright information

© Springer Science+Business Media New York 2013

Authors and Affiliations

  • Liang Yin (1)
  • Mingzhi Dong (1)
  • Ying Duan (2)
  • Weihong Deng (1)
  • Kaili Zhao (1)
  • Jun Guo (1)

  1. Beijing University of Posts and Telecommunications, Beijing, China
  2. Beijing Normal University, Beijing, China
