
Multimedia Tools and Applications, Volume 77, Issue 6, pp 7779–7794

A novel key frames matching approach for human locomotion interpolation

  • Minghua Zhao
  • Yongqin Yuan
  • Xin Zhang
  • Zhenghao Shi
  • Yinghui Wang
Article

Abstract

A novel key frames matching approach for human locomotion interpolation is proposed in this paper. First, the locomotion data is transformed from the body coordinate system to the world coordinate system with a recursive algorithm; second, features describing the two feet and the step distance are extracted; third, key frames are extracted from the feature curve and the locomotion sequence is segmented; fourth, a new key frames matching approach is applied; finally, transition frames are interpolated between the matched key frames using the quaternion Slerp (spherical linear interpolation) algorithm and linear interpolation. The main contributions of this paper are: i) different locomotion sequences can be synthesized in a controllable manner with natural-looking results; ii) to make feature extraction convenient, locomotion data is transformed from the body coordinate system to the world coordinate system; iii) to analyze locomotion sequences intuitively and accurately, key frames are extracted and the sequences are segmented based on feature curve analysis; iv) to avoid the illogical jumps produced by traditional interpolation between key frames, a new key frames matching strategy is introduced in which only key frames from two consecutive segments can be chosen as matching frames. Experimental results on the CMU MoCap database show that the proposed method preserves logical correctness and naturalness at the junction of two locomotion sequences.
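To make the final interpolation step concrete, the sketch below generates transition frames between two matched key frames by applying quaternion Slerp to each joint rotation and linear interpolation to the root position. This is a minimal Python sketch under assumed conventions, not the authors' implementation: the frame layout (a dictionary holding a "root" position and a list of per-joint unit quaternions) and the names slerp and transition_frames are hypothetical.

import numpy as np

def slerp(q0, q1, t):
    # Spherical linear interpolation between unit quaternions q0 and q1.
    q0 = q0 / np.linalg.norm(q0)
    q1 = q1 / np.linalg.norm(q1)
    dot = np.dot(q0, q1)
    if dot < 0.0:                 # take the shorter arc
        q1, dot = -q1, -dot
    if dot > 0.9995:              # nearly parallel: fall back to normalized lerp
        q = q0 + t * (q1 - q0)
        return q / np.linalg.norm(q)
    theta = np.arccos(np.clip(dot, -1.0, 1.0))
    return (np.sin((1.0 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

def transition_frames(key_a, key_b, n):
    # Interpolate n transition frames between two matched key frames.
    # Each key frame is assumed to be {"root": 3-vector, "joints": [unit quaternions]}.
    frames = []
    for i in range(1, n + 1):
        t = i / (n + 1)           # t excludes 0 and 1, so the key frames themselves are not duplicated
        root = (1.0 - t) * key_a["root"] + t * key_b["root"]             # linear interpolation of root position
        joints = [slerp(qa, qb, t)
                  for qa, qb in zip(key_a["joints"], key_b["joints"])]   # Slerp per joint rotation
        frames.append({"root": root, "joints": joints})
    return frames

Inserting frames produced this way between the matched key frames of two consecutive segments gives a smooth transition of the kind the abstract describes, avoiding abrupt jumps at the junction.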

Keywords

Optical motion capture · Locomotion interpolation · Feature extraction · Key frames matching · Locomotion segmentation

Notes

Acknowledgement

This work was partially supported by grants from the National Natural Science Foundation of China (Nos. 61401355, 61472319, and 61502382), the Key Laboratory Foundation of the Shaanxi Education Department, China (No. 14JS072), the Science and Technology Project Foundation of Beilin District, Xi'an, China (No. GX1621), and the Fok Ying Tung Education Foundation (No. 141119). The authors also thank the anonymous reviewers for their valuable comments.

Compliance with ethical standards

Conflict of interest

The authors declare that there is no conflict of interest regarding the publication of this paper.


Copyright information

© Springer Science+Business Media New York 2017

Authors and Affiliations

  • Minghua Zhao 1
  • Yongqin Yuan 1
  • Xin Zhang 1
  • Zhenghao Shi 1
  • Yinghui Wang 1
  1. School of Computer Science and Engineering, Xi'an University of Technology, Xi'an, China
