Easy-to-use authoring system for Noh (Japanese traditional) dance animation and its evaluation

  • Original Article

Abstract

Noh is a genre of Japanese traditional theater, a kind of musical drama. Like other dance forms, Noh dance (shimai) can be divided into small, discrete units of motion (shosa). Therefore, given a set of motion clips for these units, Noh dance animation can be synthesized by composing them in sequence according to the Noh dance notation (katatsuke). However, it is difficult for researchers and learners of Noh dance to create such animations with existing animation systems. The purpose of this research is to develop an easy-to-use authoring system for Noh dance animation. In this paper, we introduce the design, implementation, and evaluation of our system. To overcome the limitations of existing animation systems, we employ our smart motion synthesis technique to compose motion units automatically. We improved this motion synthesis method by enhancing the algorithms for detecting body orientation and foot-ground constraints so that Noh dance motions are handled correctly. We classify motion units as either pattern units, which are specific forms of motion represented as short motion clips, or locomotion units, which are generated on the fly to represent movement toward a specific position or direction. To handle locomotion-type motion units, we implemented a module that generates walking motion along a given path. We created several Noh dance animations with this system and evaluated it through a series of experiments. We also conducted a user test to determine the usefulness of our system for learners of Noh dance.
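To make the composition idea concrete, the following Python sketch illustrates, under our own assumptions rather than the authors' implementation, how a notation-driven composer might distinguish pattern units (pre-recorded clips) from locomotion units (walking generated toward a target position) and concatenate them with blended transitions. All class names, stub functions, and labels (PatternUnit, LocomotionUnit, load_clip, generate_walk, blend_transition) are hypothetical placeholders for the system's actual components.

    # Hypothetical sketch of notation-driven motion-unit composition.
    from dataclasses import dataclass
    from typing import List, Tuple, Union


    @dataclass
    class PatternUnit:
        """A specific form (shosa) stored as a short, pre-recorded motion clip."""
        name: str      # illustrative label for the form
        clip_id: str   # identifier of the captured clip


    @dataclass
    class LocomotionUnit:
        """Movement toward a given stage position and facing direction."""
        target_position: Tuple[float, float]
        target_facing_deg: float


    MotionUnit = Union[PatternUnit, LocomotionUnit]


    def load_clip(clip_id: str) -> List[str]:
        # Stub: would return the frames of a captured motion clip.
        return [f"{clip_id}:frames"]


    def generate_walk(position: Tuple[float, float], facing_deg: float) -> List[str]:
        # Stub: would synthesize walking motion along a path to `position`.
        return [f"walk_to{position}@{facing_deg}deg"]


    def blend_transition(prev: List[str], nxt: List[str]) -> List[str]:
        # Stub: would blend consecutive segments while keeping the supporting
        # foot planted (the foot-ground constraint mentioned in the abstract).
        return [f"blend({prev[-1]} -> {nxt[0]})"]


    def compose(sequence: List[MotionUnit]) -> List[str]:
        """Concatenate motion units in the order given by the notation."""
        timeline: List[str] = []
        prev_segment = None
        for unit in sequence:
            if isinstance(unit, PatternUnit):
                segment = load_clip(unit.clip_id)
            else:
                segment = generate_walk(unit.target_position, unit.target_facing_deg)
            if prev_segment is not None:
                timeline += blend_transition(prev_segment, segment)
            timeline += segment
            prev_segment = segment
        return timeline


    if __name__ == "__main__":
        dance = [PatternUnit("pattern_a", "clip_01"),
                 LocomotionUnit((2.0, 1.0), 90.0),
                 PatternUnit("pattern_b", "clip_02")]
        print(compose(dance))

In the actual system, the blending step corresponds to the smart motion synthesis technique and the walking generator to the path-based locomotion module described in the abstract.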



Acknowledgements

This work was supported in part by the Program for Promoting Methodological Innovation in Humanities and Social Sciences by Cross-Disciplinary Fusing as well as Grants-in-Aid for Scientific Research (24500238) from the Japan Society for the Promotion of Science (JSPS). We would like to thank Masaki Umano, a professional Noh performer, for his invaluable help in acquiring motion capture data and evaluating our prototype system. We would also like to thank Steven G. Nelson for his insightful comments during the writing of this paper.

Author information

Corresponding author: Masaki Oshita.


About this article

Cite this article

Oshita, M., Seki, T., Yamanaka, R. et al. Easy-to-use authoring system for Noh (Japanese traditional) dance animation and its evaluation. Vis Comput 29, 1077–1091 (2013). https://doi.org/10.1007/s00371-013-0839-8
