The Visual Computer, Volume 29, Issue 10, pp 1077–1091

Easy-to-use authoring system for Noh (Japanese traditional) dance animation and its evaluation

  • Masaki Oshita
  • Takeshi Seki
  • Reiko Yamanaka
  • Yukiko Nakatsuka
  • Masami Iwatsuki
Original Article

Abstract

Noh is a genre of Japanese traditional theater, a kind of musical drama. Like other dance forms, Noh dance (shimai) can be divided into small, discrete units of motion (shosa). Therefore, given a set of motion clips corresponding to these motion units, a Noh dance animation can be synthesized by composing them in sequence according to the Noh dance notation (katatsuke). However, it is difficult for researchers and learners of Noh dance to create such animations with existing animation systems. The purpose of this research is to develop an easy-to-use authoring system for Noh dance animation. In this paper, we present the design, implementation, and evaluation of our system. To address the limitations of existing animation systems, we employ our smart motion synthesis technique to compose motion units automatically. We improved this motion synthesis method by enhancing the algorithms for detecting body orientation and foot-ground constraints so that Noh dance motions are handled correctly. We classify motion units as either pattern units, which are specific forms of motion represented as short motion clips, or locomotion units, which are generated on the fly to represent movement toward a specific position or direction. To handle locomotion units, we implemented a module that generates walking motion along a given path. We created several Noh dance animations with this system and evaluated it through a series of experiments. We also conducted a user test to assess the usefulness of our system for learners of Noh dance.
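
The abstract describes composing an animation from two kinds of motion units: pattern units backed by prerecorded clips, and locomotion units synthesized on the fly along a path. The following Python sketch is only a loose illustration of that notation-driven composition idea, not the authors' implementation; all names (PatternUnit, LocomotionUnit, compose, generate_walk) and the textual "frames" are hypothetical placeholders for the actual motion synthesis.

```python
# Illustrative sketch only: composing a dance sequence from a katatsuke-like
# script, distinguishing prerecorded pattern units from generated locomotion.
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class PatternUnit:
    """A fixed form (shosa) represented by a short, prerecorded motion clip."""
    clip_name: str


@dataclass
class LocomotionUnit:
    """Movement toward a target stage position; motion is generated on the fly."""
    target: Tuple[float, float]


def generate_walk(start: Tuple[float, float], unit: LocomotionUnit) -> List[str]:
    # Stand-in for a path-based walking-motion generator: emits a description
    # of the synthesized movement instead of actual motion data.
    return [f"walk from {start} to {unit.target}"]


def compose(sequence, start=(0.0, 0.0)) -> List[str]:
    """Concatenate units in notation order; blending would be handled elsewhere."""
    result: List[str] = []
    position = start
    for unit in sequence:
        if isinstance(unit, PatternUnit):
            result.append(f"play clip '{unit.clip_name}'")
        else:  # LocomotionUnit
            result.extend(generate_walk(position, unit))
            position = unit.target
    return result


if __name__ == "__main__":
    # A toy sequence: a pattern unit, a move across the stage, another pattern.
    score = [PatternUnit("shikake"), LocomotionUnit((2.0, 1.0)), PatternUnit("hiraki")]
    for step in compose(score):
        print(step)
```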

Keywords

Animation system · Traditional dance · Motion synthesis · Motion composition · Motion capture


Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Masaki Oshita (1)
  • Takeshi Seki (2)
  • Reiko Yamanaka (2)
  • Yukiko Nakatsuka (2)
  • Masami Iwatsuki (2)

  1. Kyushu Institute of Technology, Iizuka, Japan
  2. Hosei University, Tokyo, Japan
