Wireless Personal Communications, Volume 91, Issue 4, pp. 1779–1797


Gesture Recognition Method Using Sensing Blocks

  • Yulong Xi
  • Seoungjae Cho
  • Simon Fong
  • Yong Woon Park
  • Kyungeun Cho

Abstract

Recently, posture and gesture recognition has been widely used in fields such as medical treatment and human–computer interaction. Previous research on posture and gesture recognition has mainly relied on human skeleton data captured by an RGB-D camera, and the resulting recognition methods use skeleton models with different numbers of joints. Processing the large amount of feature data required to recognize a gesture delays recognition. To overcome this issue, we designed and developed a system for learning and recognizing postures and gestures. This paper proposes a gesture recognition method with enhanced generality and processing speed. The proposed method consists of a feature collection part, a feature optimization part, and a posture and gesture recognition part. We verified the proposed solution through the learning and subsequent recognition of 29 postures and 8 gestures.

Keywords

Posture recognition · Gesture recognition · Natural user interface · Hidden Markov model · Support vector machine
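
The abstract describes a three-part pipeline (feature collection, feature optimization, and posture/gesture recognition), and the keywords point to a support vector machine and a hidden Markov model as the underlying classifiers. The sketch below is not the authors' implementation; it is a minimal illustration of how such a pipeline could be wired together, assuming an SVM for per-frame posture classification and one Gaussian HMM per gesture class. All function and variable names are hypothetical.

```python
# Minimal sketch of a posture/gesture recognition pipeline:
# feature collection -> feature optimization -> recognition.
# Assumes scikit-learn (SVC) and hmmlearn (GaussianHMM); not the paper's code.
import numpy as np
from sklearn.svm import SVC
from hmmlearn.hmm import GaussianHMM

def collect_features(skeleton_frame):
    """Hypothetical feature collection: flatten the per-frame joint coordinates."""
    return np.asarray(skeleton_frame, dtype=float).ravel()

def optimize_features(features, keep_indices):
    """Hypothetical feature optimization: keep a reduced subset of the features."""
    return features[keep_indices]

# Posture recognition (per frame) with an SVM.
posture_clf = SVC(kernel="rbf")
# posture_clf.fit(X_postures, y_postures)  # X: (n_frames, n_features), y: posture labels
# posture_clf.predict(X_new)               # predicted posture per frame

def train_gesture_hmms(sequences_by_gesture, n_states=5):
    """Fit one GaussianHMM per gesture class on its training sequences."""
    models = {}
    for gesture, sequences in sequences_by_gesture.items():
        lengths = [len(seq) for seq in sequences]  # frames per training sequence
        stacked = np.vstack(sequences)             # (sum(lengths), n_features)
        model = GaussianHMM(n_components=n_states, covariance_type="diag")
        model.fit(stacked, lengths)
        models[gesture] = model
    return models

def recognize_gesture(models, observed_sequence):
    """Return the gesture whose HMM assigns the highest log-likelihood."""
    return max(models, key=lambda g: models[g].score(observed_sequence))
```

In this kind of setup, postures are classified frame by frame, while gestures (sequences of frames) are scored against each class-specific HMM and assigned to the most likely class; the feature optimization step shown here is only a placeholder for whatever dimensionality reduction the method actually uses.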

Acknowledgments

This research was supported by the MSIP (Ministry of Science, ICT and Future Planning), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2016-H8501-16-1014) supervised by the IITP (Institute for Information & Communications Technology Promotion) and by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Science, ICT and Future Planning (NRF-2015R1A2A2A01003779).

Copyright information

© Springer Science+Business Media New York 2016

Authors and Affiliations

  1. Dongguk University, Seoul, Republic of Korea
  2. University of Macau, Macau, China
  3. Agency for Defense Development, Daejeon, Republic of Korea
