
Human Motion Attitude Tracking Method Based on Vicon Motion Capture Under Big Data

  • Ze-guo Liu
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 301)

Abstract

Aiming at the problem that conventional methods cannot mark human motion postures quickly and correctly, a human motion attitude tracking method based on Vicon motion capture under big data is proposed and designed. Within a motion-capture filtering algorithm, a human-body weight measurement function is constructed by combining color, edge and motion features; different images are selected according to the occlusion between limbs to establish a constrained human motion model; and tracking computation based on Vicon motion capture then realizes the tracking of the human motion posture. The effectiveness of the method is verified through experimental analysis. The results show that the method tracks human motion posture quickly and accurately with good robustness, and its tracking accuracy is 13.87% higher than that of the conventional method.
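The abstract describes a weight measurement function that fuses color, edge and motion cues within a motion-capture filtering framework. The sketch below illustrates one plausible reading of that idea as a particle-filter-style weight: three simple cue likelihoods multiplied with tunable exponents. The feature extractors, the function names (color_likelihood, edge_likelihood, motion_likelihood, particle_weight) and the fusion weights alphas are hypothetical placeholders, not the paper's actual formulation.

```python
# Minimal sketch (assumption): a particle-filter-style weight that fuses
# color, edge and motion cues, loosely mirroring the abstract's
# "combination of color, edge and motion features". All details are illustrative.
import numpy as np

def color_likelihood(patch, reference_hist, bins=16):
    """Bhattacharyya similarity between the patch's gray-level histogram
    and a reference histogram of the tracked limb."""
    hist, _ = np.histogram(patch, bins=bins, range=(0.0, 1.0), density=True)
    hist = hist / (hist.sum() + 1e-8)
    return float(np.sum(np.sqrt(hist * reference_hist)))

def edge_likelihood(patch):
    """Edge-energy cue: mean gradient magnitude inside the patch."""
    gy, gx = np.gradient(patch)
    return float(np.mean(np.hypot(gx, gy)))

def motion_likelihood(patch, prev_patch):
    """Frame-difference cue: mean absolute temporal change inside the patch."""
    return float(np.mean(np.abs(patch - prev_patch)))

def particle_weight(patch, prev_patch, reference_hist, alphas=(0.5, 0.25, 0.25)):
    """Fuse the three cues into one particle weight via a weighted product."""
    a_c, a_e, a_m = alphas
    return (color_likelihood(patch, reference_hist) ** a_c
            * (edge_likelihood(patch) + 1e-8) ** a_e
            * (motion_likelihood(patch, prev_patch) + 1e-8) ** a_m)

if __name__ == "__main__":
    # Toy usage on synthetic image patches (normalized gray values in [0, 1]).
    rng = np.random.default_rng(0)
    frame_prev = rng.random((32, 32))
    frame_curr = np.clip(frame_prev + 0.05 * rng.standard_normal((32, 32)), 0.0, 1.0)
    ref_hist, _ = np.histogram(frame_prev, bins=16, range=(0.0, 1.0), density=True)
    ref_hist = ref_hist / (ref_hist.sum() + 1e-8)
    print("particle weight:", particle_weight(frame_curr, frame_prev, ref_hist))
```

A multiplicative fusion of this kind lets any single cue (for example, motion during occlusion between limbs) suppress implausible particles; the exponents control how much each cue contributes.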

Keywords

Big data technology · Vicon motion capture · Human body movement posture · Tracking method

Acknowledgements

This work was supported by the project "Digitalized 3D Model of Tibetan Xianziwu Based on 3D Motion Capture" (Innovation-Supportive Project for Young Teachers in Colleges and Universities in Tibet Autonomous Region, No. QCZ2016-33).

Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2019

Authors and Affiliations

  1. College of Information Engineering, Xizang Minzu University, Xianyang, China