Gesture recognition based on HMM-FNN model using a Kinect
Addressing the problem of complex dynamic gesture recognition, this paper acquires body depth images through the Kinect motion-sensing device; a threshold segmentation method, based on the typical distance between the hand and the body, is used to segment the hand region from the depth image. The HMM-FNN model, which combines a hidden Markov model (HMM) with a fuzzy neural network (FNN), is then applied to dynamic gesture recognition. This paper focuses on trainees' common equipment operations in a virtual substation to define a custom set of interaction gestures. Based on the characteristics of complex dynamic gestures, each gesture image sequence is decomposed into three feature sequences for feature extraction: hand-shape change, hand-position change in the two-dimensional plane, and movement along the Z-axis. An HMM is built for each of the three subsequences, and the FNN is connected to judge the semantics of the gesture by fuzzy reasoning. Experimental verification shows that the HMM-FNN model can quickly and effectively identify complicated dynamic hand gestures and is strongly robust; its recognition performance is superior to that of a plain HMM model.
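The pipeline described in the abstract can be sketched in two parts: a depth-threshold segmentation step and a per-channel HMM scoring step whose three log-likelihoods are fused into a class decision. The sketch below is illustrative only; the depth band (`near`/`far`), the discrete observation alphabets, the fusion weights, and the simple weighted-sum combination (standing in for the paper's FNN fuzzy reasoning) are all assumptions, not the authors' actual parameters.

```python
import numpy as np

def segment_hand(depth_mm, near=500, far=900):
    """Threshold-segment a depth frame (sketch).

    Pixels whose depth lies inside [near, far] mm are kept as 'hand'.
    The fixed band is a hypothetical stand-in for the paper's
    hand-to-body distance criterion.
    """
    return (depth_mm >= near) & (depth_mm <= far)

def hmm_log_likelihood(obs, start_p, trans_p, emit_p):
    """Log-domain forward algorithm for a discrete HMM.

    obs     : 1-D int array of observation symbols
    start_p : (n_states,) initial state probabilities
    trans_p : (n_states, n_states) transition matrix
    emit_p  : (n_states, n_symbols) emission matrix
    Returns log P(obs | model).
    """
    log_alpha = np.log(start_p) + np.log(emit_p[:, obs[0]])
    for o in obs[1:]:
        # alpha_j(t) = sum_i alpha_i(t-1) * trans[i, j], then emit
        log_alpha = np.logaddexp.reduce(
            log_alpha[:, None] + np.log(trans_p), axis=0
        ) + np.log(emit_p[:, o])
    return np.logaddexp.reduce(log_alpha)

def classify_gesture(channel_obs, models, weights=(0.4, 0.3, 0.3)):
    """Score each gesture class by a weighted sum of the three
    per-channel HMM log-likelihoods (shape, XY motion, Z motion).

    The weighted sum is a simplified placeholder for the FNN's
    fuzzy-reasoning fusion described in the paper.
    `models` maps gesture name -> list of three (start, trans, emit)
    parameter tuples, one per feature channel.
    """
    scores = {
        name: sum(w * hmm_log_likelihood(o, *m)
                  for w, o, m in zip(weights, channel_obs, chans))
        for name, chans in models.items()
    }
    return max(scores, key=scores.get)
```

For example, `segment_hand(frame)` yields a boolean hand mask from which per-frame shape and position features can be quantized into the three symbol sequences passed to `classify_gesture`.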
Keywords: Kinect; Threshold segmentation method; Complex dynamic gesture; HMM-FNN; Gesture recognition
Compliance with ethical standards
This study was funded by the key transformation project of the provincial science and technology plan (No. 20140307008GX) and the “Double Ten” cultivation project of the Jilin provincial education department (No. 109).
Conflict of interest
The authors declare that they have no conflict of interest.
Research involving human participants and/or animals
In this research, ten people were chosen to participate in the gesture recognition experiment: Tingting Yang, Yanli Wen, Liqing Sun, Chunlei Shi, Xudong Ma, Jun Qi, Qing Li, Yang Yu, Jiajia Zhang, and Ning Zhou.
All participants voluntarily agreed to participate in this study and all gave written informed consent.