
Hand Gesture Detection and Its Application to Virtual Reality Systems

  • M. Fikret Ercan
  • Allen Qiankun Liu
Conference paper
Part of the Lecture Notes in Electrical Engineering book series (LNEE, volume 547)

Abstract

Detecting hand gestures provides a useful non-contact means of interacting with machines and systems, and it has been employed in a wide range of applications. Recently, smart glasses and Virtual Reality (VR) headsets have become viable platforms for training applications ranging from surgical training in medicine to operator training for heavy equipment. A major challenge in these systems is interacting with the training platform, since the user's view of the real world is blocked by the headset. In this paper, we present hand gesture detection based on deep learning as a means of interaction with the VR system. Real-world images are streamed from a camera mounted on the VR headset, and the user's hand gestures are detected and blended into the virtual images, providing a more immersive and interactive user experience.
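The detection-and-blending pipeline outlined in the abstract can be illustrated with a minimal sketch in Python using OpenCV's DNN module. The model file names, confidence threshold, and the source of the virtual scene below are assumptions made for illustration only (a pre-trained single-shot hand-gesture detector, e.g. a MobileNet-SSD exported to Caffe format); the authors' actual model, VR engine integration, and gesture-to-command mapping are not reproduced here.

import cv2
import numpy as np

# Hypothetical file names for a pre-trained single-shot hand-gesture
# detector (e.g. a MobileNet-SSD in Caffe format); placeholders, not
# files released with the paper.
PROTOTXT = "hand_gesture_ssd.prototxt"
WEIGHTS = "hand_gesture_ssd.caffemodel"
CONF_THRESHOLD = 0.5  # assumed detection confidence cut-off

net = cv2.dnn.readNetFromCaffe(PROTOTXT, WEIGHTS)
cap = cv2.VideoCapture(0)  # camera mounted on the VR headset

while True:
    ok, frame = cap.read()
    if not ok:
        break
    h, w = frame.shape[:2]

    # SSD-style detectors typically take a 300x300, mean-subtracted input.
    blob = cv2.dnn.blobFromImage(frame, 1.0 / 127.5, (300, 300), 127.5)
    net.setInput(blob)
    detections = net.forward()  # shape: (1, 1, N, 7)

    # Stand-in for the rendered virtual scene; in the real system this
    # frame would come from the VR engine, not a blank image.
    virtual_frame = np.zeros_like(frame)

    for i in range(detections.shape[2]):
        confidence = float(detections[0, 0, i, 2])
        if confidence < CONF_THRESHOLD:
            continue
        box = detections[0, 0, i, 3:7] * np.array([w, h, w, h])
        x1, y1, x2, y2 = [max(int(v), 0) for v in box]

        # Copy the detected hand region into the virtual image so the
        # user sees their own gesture inside the headset view.
        virtual_frame[y1:y2, x1:x2] = frame[y1:y2, x1:x2]
        cv2.rectangle(virtual_frame, (x1, y1), (x2, y2), (0, 255, 0), 2)

    cv2.imshow("VR view with hand overlay", virtual_frame)
    if cv2.waitKey(1) & 0xFF == 27:  # press Esc to quit
        break

cap.release()
cv2.destroyAllWindows()

In the full system, the detected gesture class would additionally be mapped to an interaction command, and the overlay would be composited into the stereo rendering rather than a single 2D frame.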

Keywords

Computer vision · Hand gesture detection · Virtual reality · Deep learning


Acknowledgements

This study is sponsored by the Singapore Ministry of Education under grant number MOE2015-TIF-2-T-039.


Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. School of Electrical and Electronic Engineering, Singapore Polytechnic, Singapore, Singapore
