Human Motion Attitude Tracking Method Based on Vicon Motion Capture Under Big Data

  • Conference paper
  • In: Advanced Hybrid Information Processing (ADHIP 2019)

Abstract

To address the problem that conventional methods cannot label human motion posture quickly and accurately, a human motion attitude tracking method based on Vicon motion capture under big data is proposed and designed. Within a motion capture filtering algorithm, a weight measurement function for the human body is constructed from a combination of color, edge, and motion features; image cues are selected according to the degree of occlusion between limbs to build a constrained human motion model; and tracking computation on the Vicon motion capture data then realizes tracking of the human motion posture. The effectiveness of the method is verified by experimental analysis. The results show that the method tracks human motion posture quickly and accurately with good robustness, and its tracking accuracy is 13.87% higher than that of the conventional method.
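The abstract describes the weight measurement step only at a high level, so the following Python sketch illustrates the general idea rather than the paper's actual implementation: candidate poses (particles) are scored by fusing color, edge, and motion cues, the cue mix shifts when limbs occlude one another, and the normalized weights drive a simple filter-style update. The likelihood functions, fusion coefficients, and occlusion threshold below are placeholder assumptions, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def color_likelihood(pose, frame):
    """Placeholder color-cue score for a candidate pose (assumed, not from the paper)."""
    return np.exp(-0.5 * np.sum((pose - frame["color_ref"]) ** 2))

def edge_likelihood(pose, frame):
    """Placeholder edge/silhouette-cue score (assumed)."""
    return np.exp(-np.sum(np.abs(pose - frame["edge_ref"])))

def motion_likelihood(pose, frame):
    """Placeholder motion-continuity score against the previous pose estimate (assumed)."""
    return np.exp(-0.5 * np.sum((pose - frame["prev_pose"]) ** 2))

def combined_weight(pose, frame, occlusion, alpha=0.4, beta=0.3, gamma=0.3):
    """Fuse color, edge, and motion cues into a single particle weight.

    When limb occlusion is high, the edge cue is down-weighted in favor of
    motion continuity, mimicking the cue selection the abstract mentions.
    The 0.5 threshold and the coefficients are illustrative only.
    """
    if occlusion > 0.5:
        beta, gamma = 0.1, 0.5
    return (alpha * color_likelihood(pose, frame)
            + beta * edge_likelihood(pose, frame)
            + gamma * motion_likelihood(pose, frame))

# One toy filtering step on synthetic data: propagate particles, weight, estimate, resample.
n_particles, dim = 200, 3  # dim stands in for a reduced pose vector
particles = rng.normal(size=(n_particles, dim))
frame = {"color_ref": np.zeros(dim), "edge_ref": np.zeros(dim), "prev_pose": np.zeros(dim)}

particles += 0.05 * rng.normal(size=particles.shape)                    # simple motion model
weights = np.array([combined_weight(p, frame, occlusion=0.2) for p in particles])
weights /= weights.sum()                                                # normalize to a distribution
pose_estimate = weights @ particles                                     # weighted mean pose
particles = particles[rng.choice(n_particles, n_particles, p=weights)]  # resample

print("estimated pose vector:", pose_estimate)
```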

Acknowledgements

This work was supported by the project "On the Digitalized 3D Model of Tibetan Xianziwu Based on 3D Motion Capture" (Innovation-Supportive Project for Young Teachers in Colleges and Universities in Tibet Autonomous Region, QCZ2016-33).

Author information

Corresponding author

Correspondence to Ze-guo Liu.

Copyright information

© 2019 ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering

About this paper

Cite this paper

Liu, Zg. (2019). Human Motion Attitude Tracking Method Based on Vicon Motion Capture Under Big Data. In: Gui, G., Yun, L. (eds) Advanced Hybrid Information Processing. ADHIP 2019. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 301. Springer, Cham. https://doi.org/10.1007/978-3-030-36402-1_41

  • DOI: https://doi.org/10.1007/978-3-030-36402-1_41

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-36401-4

  • Online ISBN: 978-3-030-36402-1

  • eBook Packages: Computer Science (R0)
