Capturing Close Interactions with Objects Using a Magnetic Motion Capture System and a RGBD Sensor

  • Peter Sandilands
  • Myung Geol Choi
  • Taku Komura
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7660)

Abstract

Games and interactive virtual worlds increasingly rely on interactions with the environment, and require animations to display them. Manually synthesizing such animations is a daunting task due to the difficulty of handling the close interactions between a character’s body and the object. Capturing such movements with optical motion capture systems, the most prevalent motion capture devices, is also not straightforward because of occlusions between the body markers and the object or the body itself. In this paper, we describe a scheme to capture such movements using a magnetic motion capture system. The advantage of a magnetic motion capture system is that it obtains global position and orientation data without suffering from the occlusion problem. This allows us to digitally recreate captured close interactions without significant artist work in recreating the scene after capture. We show examples of captured movements including opening a bottle, drawing on paper, putting on and taking off pen caps, carrying objects and interacting with furniture. The captured data has been published as a publicly available database.
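
To make the stated advantage concrete, the sketch below shows how a single magnetic sensor reading (a global position and orientation in the transmitter frame) can place a rigidly attached object mesh in the scene, with no line-of-sight or occlusion handling. This is a minimal illustration under assumed conventions, not the authors' actual pipeline: the quaternion order (w, x, y, z), the calibrated sensor-to-object offset, and all numeric values and names are hypothetical.

```python
# Minimal sketch (not the authors' pipeline): recreating an object's world pose
# from one magnetic motion-capture reading. Assumes each reading supplies the
# sensor's global position and orientation, plus a fixed sensor-to-object
# offset measured at calibration time.
import numpy as np

def quat_to_matrix(q):
    """Convert a unit quaternion (w, x, y, z) to a 3x3 rotation matrix."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])

def pose_to_transform(position, quaternion):
    """Build a 4x4 homogeneous transform from a position and unit quaternion."""
    T = np.eye(4)
    T[:3, :3] = quat_to_matrix(quaternion)
    T[:3, 3] = position
    return T

# Hypothetical sensor reading in the magnetic transmitter's frame.
sensor_pos = np.array([0.42, 0.10, 0.95])           # metres
sensor_quat = np.array([0.92, 0.0, 0.38, 0.0])      # (w, x, y, z)
sensor_quat = sensor_quat / np.linalg.norm(sensor_quat)

# Fixed offset from the sensor mount to the object's local frame (calibration).
T_sensor_to_object = pose_to_transform([0.0, 0.02, -0.01], [1.0, 0.0, 0.0, 0.0])

# The object's world transform is the sensor pose composed with the offset;
# because the reading is global, no occlusion handling is needed to place
# the scanned object mesh back into the captured scene.
T_world_sensor = pose_to_transform(sensor_pos, sensor_quat)
T_world_object = T_world_sensor @ T_sensor_to_object

# Apply the transform to the object's mesh vertices (N x 3) to rebuild the scene.
vertices = np.array([[0.0, 0.0, 0.0], [0.1, 0.0, 0.0], [0.0, 0.1, 0.0]])
world_vertices = (T_world_object[:3, :3] @ vertices.T).T + T_world_object[:3, 3]
print(world_vertices)
```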

Keywords

character animation, motion capture, environment interactions

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Peter Sandilands 1
  • Myung Geol Choi 1
  • Taku Komura 1

  1. Institute of Perception, Action and Behaviour, School of Informatics, University of Edinburgh, United Kingdom
