
Development of a Hand Gesture Based Control Interface Using Deep Learning


Part of the Communications in Computer and Information Science book series (CCIS, volume 1070)

Abstract

This paper describes the implementation of a control system based on ten different hand gestures, offering a practical approach to building more user-friendly human-machine interfaces. Hand detection is achieved using fast detection and tracking algorithms, and classification is performed by a lightweight convolutional neural network. The experimental results show a real-time response with an accuracy of 95.09% at low power consumption. These results demonstrate that the proposed system could be applied in a wide range of applications, such as virtual reality, robotics, autonomous driving systems, human-machine interfaces, and augmented reality, among others.
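The pipeline summarized in the abstract (fast hand detection and tracking, followed by classification with a lightweight CNN) can be caricatured per frame as: segment candidate hand pixels, crop a bounding box, and feed the crop to a classifier. The sketch below illustrates only the first two steps with a crude skin-color threshold in normalized chromaticity space; the function names and threshold values are illustrative assumptions, not the paper's actual detector, tracker, or network.

```python
import numpy as np


def skin_mask(frame: np.ndarray) -> np.ndarray:
    """Very rough skin detector: threshold normalized red/green chromaticity.

    frame: H x W x 3 uint8 RGB image. Returns an H x W boolean mask.
    The bounds below are loose first-pass values, not tuned parameters.
    """
    rgb = frame.astype(np.float64) + 1e-6  # avoid division by zero
    s = rgb.sum(axis=2)
    r = rgb[..., 0] / s
    g = rgb[..., 1] / s
    return (r > 0.36) & (r < 0.47) & (g > 0.28) & (g < 0.36)


def hand_bbox(mask: np.ndarray):
    """Bounding box (top, left, bottom, right) of mask pixels, or None."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return None
    return int(ys.min()), int(xs.min()), int(ys.max()) + 1, int(xs.max()) + 1
```

In a real system the crop returned by `hand_bbox` would be resized and passed to the CNN each frame, with a correlation-filter tracker smoothing the box between detections.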

Keywords

  • Gesture recognition
  • Human-machine interface
  • Deep learning
  • Real-time
  • Hand poses



Author information


Corresponding author

Correspondence to Dennis Núñez-Fernández.


Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Núñez-Fernández, D. (2020). Development of a Hand Gesture Based Control Interface Using Deep Learning. In: Lossio-Ventura, J.A., Condori-Fernandez, N., Valverde-Rebaza, J.C. (eds) Information Management and Big Data. SIMBig 2019. Communications in Computer and Information Science, vol 1070. Springer, Cham. https://doi.org/10.1007/978-3-030-46140-9_14


  • DOI: https://doi.org/10.1007/978-3-030-46140-9_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-46139-3

  • Online ISBN: 978-3-030-46140-9

  • eBook Packages: Computer Science; Computer Science (R0)