
MoCap-Based Adaptive Human-Like Walking Simulation in Laser-Scanned Large-Scale as-Built Environments

  • Tsubasa Maruyama
  • Satoshi Kanai
  • Hiroaki Date
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9185)

Abstract

Accessibility evaluation, which aims to enhance the accessibility and safety of built environments for the elderly and the disabled, is increasing in importance. Accessibility must be assessed not only against general standards but also in terms of physical and cognitive friendliness for users of different ages, genders, and abilities. Meanwhile, human behavior simulation has been progressing in crowd behavior analysis and emergency evacuation planning. This research aims to develop a virtual accessibility evaluation by combining realistic human behavior simulation using a digital human model (DHM) with as-built environmental models. To achieve this goal, we developed a new algorithm for generating human-like DHM walking motions, adapting strides and turning angles to laser-scanned as-built environments using motion-capture (MoCap) data of flat walking. Our implementation quickly constructed as-built three-dimensional environmental models and achieved a walking simulation speed sufficient for real-time applications. The difference in joint angles between the DHM and the MoCap data was sufficiently small. Demonstrations of our environmental modeling and walking simulation in an indoor environment are illustrated.
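
The central idea of the abstract, adapting strides and turning angles taken from flat-walking MoCap data to a path through an as-built environment model, can be illustrated with a toy sketch. The sketch below is not the authors' algorithm: the path waypoints, the nominal stride length, and the turning-angle threshold are hypothetical stand-ins for quantities the paper derives from laser-scanned point clouds and MoCap recordings.

```python
# Toy illustration (not the paper's method): follow a planned polyline path
# through an environment model, shortening the nominal MoCap stride where the
# DHM must turn sharply. All parameter values here are hypothetical.
import math

def adapt_steps(path, nominal_stride=0.65, max_turn_deg=30.0):
    """Return (x, y, turn_deg) step placements along a polyline path.

    path            -- list of (x, y) waypoints from the environment model
    nominal_stride  -- flat-walking stride length taken from MoCap data [m]
    max_turn_deg    -- turning angle per step beyond which the stride shrinks
    """
    steps = []
    heading = None
    for (x0, y0), (x1, y1) in zip(path, path[1:]):
        seg_heading = math.atan2(y1 - y0, x1 - x0)
        # Turning angle relative to the previous segment (0 for the first one).
        turn = 0.0 if heading is None else math.degrees(
            abs((seg_heading - heading + math.pi) % (2 * math.pi) - math.pi))
        heading = seg_heading
        # Shorten the stride in proportion to how sharply the DHM must turn.
        scale = max(0.3, 1.0 - turn / max_turn_deg) if turn > 0 else 1.0
        stride = nominal_stride * scale
        length = math.hypot(x1 - x0, y1 - y0)
        n_steps = max(1, round(length / stride))
        for i in range(1, n_steps + 1):
            t = i / n_steps
            steps.append((x0 + t * (x1 - x0),
                          y0 + t * (y1 - y0),
                          turn if i == 1 else 0.0))
    return steps

# Example: an L-shaped corridor path.
print(adapt_steps([(0.0, 0.0), (3.0, 0.0), (3.0, 2.0)]))
```

In the paper, the corresponding inputs would come from the laser-scanned environmental model and the MoCap gait data rather than the hard-coded values used in this sketch.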

Keywords

Walking simulation · Laser-scanning · Accessibility evaluation · Motion capture


Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  1. Graduate School of Information Science and Technology, Hokkaido University, Sapporo, Japan
