Using Bayesian Networks to Synthesize Human Walking Animation
A human walking animation synthesis method is proposed that learns motion regularities from motion capture data and produces new walking animations using Bayesian networks. Raw motion capture data is segmented into walking units, from which footprint parameters and keyframes are extracted to train the Bayesian networks. The trained networks can then infer appropriate keyframes from a given footprint sequence. The presented method combines the non-deterministic inference ability of Bayesian networks with footprint-parameter control of the end effector. Experiments demonstrate that motions synthesized by our method strictly satisfy space-time constraints and appear natural.
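The inference step described above can be illustrated with a minimal sketch: a discrete conditional probability table mapping footprint parameters to keyframe clusters, estimated by frequency counting. The variable names, the discretization (step length, stride width), and the training pairs are all hypothetical stand-ins for the paper's actual network structure and motion-capture data.

```python
from collections import Counter, defaultdict

# Hypothetical training pairs: (discretized step length, stride width) -> keyframe cluster id.
# In the paper's pipeline these would come from segmented motion-capture walking units.
training = [
    (("short", "narrow"), 0),
    (("short", "narrow"), 0),
    (("short", "wide"), 1),
    (("long", "narrow"), 2),
    (("long", "narrow"), 2),
    (("long", "wide"), 1),
]

# "Train" the network: estimate P(keyframe | footprint) by frequency counting,
# a simplified stand-in for full Bayesian-network parameter learning.
cpt = defaultdict(Counter)
for footprint, keyframe in training:
    cpt[footprint][keyframe] += 1

def infer_keyframe(footprint):
    """Return the most probable keyframe cluster for a footprint observation."""
    counts = cpt[footprint]
    total = sum(counts.values())
    posterior = {k: c / total for k, c in counts.items()}
    return max(posterior, key=posterior.get)

# Infer keyframes for a given footprint sequence.
sequence = [("short", "narrow"), ("long", "wide")]
print([infer_keyframe(fp) for fp in sequence])  # prints [0, 1]
```

A full implementation would replace the frequency counts with learned network parameters and the argmax lookup with probabilistic inference over the network, but the control flow — footprint sequence in, keyframe sequence out — mirrors the method's synthesis stage.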