
Hand Gesture Recognition Using Infrared Imagery Provided by Leap Motion Controller

  • Tomás Mantecón
  • Carlos R. del-Blanco
  • Fernando Jaureguizar
  • Narciso García
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10016)

Abstract

Hand gestures are one of the main alternatives for Human-Computer Interaction. For this reason, a hand gesture recognition system using near-infrared imagery acquired by a Leap Motion sensor is proposed. The recognition system directly characterizes the hand gesture by computing a global image descriptor, called Depth Spatiograms of Quantized Patterns, without any hand segmentation stage. To deal with the high dimensionality of this image descriptor, a Compressive Sensing framework is applied, obtaining a manageable image feature vector that largely preserves the original information. Finally, the resulting reduced image descriptors are analyzed by a set of Support Vector Machines to identify the performed gesture independently of the precise hand location in the image. Promising results have been achieved using a new hand-based near-infrared database.
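
The pipeline described above (high-dimensional global descriptor, random-projection dimensionality reduction, linear SVM classification) can be illustrated with a minimal sketch. The snippet below is not the authors' code: it replaces the DSQP descriptor with placeholder vectors, uses scikit-learn's SparseRandomProjection as a stand-in for the Compressive Sensing reduction, and picks the reduced dimension (512) arbitrarily.

```python
# Minimal sketch (not the authors' implementation): reduce a high-dimensional
# image descriptor with a random projection and classify it with linear SVMs.
import numpy as np
from sklearn.random_projection import SparseRandomProjection  # random-projection reduction
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)

# Hypothetical training data: one high-dimensional descriptor per image with
# an integer gesture label. Real descriptors would be computed from the
# near-infrared images; random vectors keep this example self-contained.
n_samples, descriptor_dim, n_gestures = 200, 20000, 10
X_train = rng.standard_normal((n_samples, descriptor_dim))
y_train = rng.integers(0, n_gestures, size=n_samples)

# Project descriptors to a much smaller space, then train one-vs-rest linear SVMs.
clf = make_pipeline(
    SparseRandomProjection(n_components=512, random_state=0),
    LinearSVC(),
)
clf.fit(X_train, y_train)

# Classify a new (placeholder) descriptor.
X_test = rng.standard_normal((1, descriptor_dim))
print(clf.predict(X_test))
```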

Keywords

Feature extraction · Gesture recognition · Random projections · Image classification · Near-infrared imaging


Acknowledgements

This work has been partially supported by the Ministerio de Economía y Competitividad of the Spanish Government under project TEC2013-48453 (MR-UHDTV), and by AIRBUS Defense and Space under project SAVIER.


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  • Tomás Mantecón 1
  • Carlos R. del-Blanco 1
  • Fernando Jaureguizar 1
  • Narciso García 1

  1. Grupo de Tratamiento de Imágenes, E.T.S.I. de Telecomunicación, Universidad Politécnica de Madrid, Madrid, Spain
