A novel key frames matching approach for human locomotion interpolation
A novel key-frame matching approach for human locomotion interpolation is proposed in this paper. First, locomotion data are transformed from the body coordinate system to the world coordinate system using a recursive algorithm; second, features of the two feet and the step distance are extracted; third, key frames are extracted from the feature curve and the locomotion sequence is segmented; fourth, a new approach for key-frame matching is put forward; finally, transition frames are interpolated between the matched key frames using quaternion Slerp (spherical linear interpolation) and linear interpolation. The main contributions of this paper are: i) different locomotion sequences can be synthesized in a controllable manner with good results; ii) to extract locomotion features conveniently, locomotion data in the body coordinate system are transformed to the world coordinate system; iii) to analyze locomotion sequences intuitively and accurately, key frames are extracted and locomotion sequences are segmented based on feature-curve analysis; iv) to avoid the illogical jumps produced by traditional interpolation between key frames, a new key-frame matching strategy is put forward in which only the key frames between two consecutive segments can be chosen as matching frames. Experimental results on the CMU MoCap database show that the proposed method ensures logical correctness and naturalness at the junction of two locomotion sequences.
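The final interpolation step described above blends matched key frames: joint rotations are interpolated with quaternion Slerp, and positional data with linear interpolation. The paper does not include code, so the sketch below is a minimal, generic illustration of those two standard operations; the function names `slerp` and `lerp` and the (w, x, y, z) quaternion layout are our own assumptions, not the authors' implementation.

```python
import math

def slerp(q0, q1, t):
    """Spherical linear interpolation (Slerp) between unit quaternions.

    q0, q1: 4-tuples (w, x, y, z), assumed already normalized.
    t: interpolation parameter in [0, 1].
    """
    dot = sum(a * b for a, b in zip(q0, q1))
    # q and -q encode the same rotation; flip one to take the shorter arc.
    if dot < 0.0:
        q1 = tuple(-c for c in q1)
        dot = -dot
    if dot > 0.9995:
        # Nearly identical orientations: linear blend, then renormalize.
        q = tuple((1.0 - t) * a + t * b for a, b in zip(q0, q1))
        n = math.sqrt(sum(c * c for c in q))
        return tuple(c / n for c in q)
    theta = math.acos(dot)                 # angle between the two quaternions
    s = math.sin(theta)
    w0 = math.sin((1.0 - t) * theta) / s   # weight of q0
    w1 = math.sin(t * theta) / s           # weight of q1
    return tuple(w0 * a + w1 * b for a, b in zip(q0, q1))

def lerp(p0, p1, t):
    """Linear interpolation for positional data such as the root translation."""
    return tuple((1.0 - t) * a + t * b for a, b in zip(p0, p1))
```

Sweeping `t` from 0 to 1 over the transition frames would move each joint smoothly from the orientation in one matched key frame to the orientation in the next, which is why Slerp (constant angular velocity along the great arc) is preferred over naive component-wise blending of rotations.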
Keywords: Optical motion capture · Locomotion interpolation · Feature extraction · Key frames matching · Locomotion segmentation
This work was partially supported by grants from the National Natural Science Foundation of China (Nos. 61401355, 61472319 and 61502382), a grant from the Key Laboratory Foundation of Shaanxi Education Department, China (No. 14JS072), a grant from the Science and Technology Project Foundation of Beilin District, Xi'an City, China (No. GX1621) and a grant from the Fok Ying Tung Education Foundation (No. 141119). The authors also thank the anonymous reviewers for their valuable comments.
Compliance with ethical standards
Conflict of interest
The authors declare that there is no conflict of interest regarding the publication of this paper.