
The Possibility of Personality Extraction Using Skeletal Information in Hip-Hop Dance by Human or Machine

  • Saeka Furuichi
  • Kazuki Abe
  • Satoshi Nakamura
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11749)

Abstract

The same dance can give different impressions depending on how the dancers convey their own emotions and personality through their interpretation of it. Beginner dancers who teach themselves often search for dance videos online that match their own personality so that they can practice by mimicking them, but it is not easy to find a dance that suits both their personality and their skill level. In this work, we used hip-hop dance to examine whether dancers can identify their own dance from skeletal information acquired by a Kinect sensor and whether information representing the individuality of a dance can be extracted automatically. Experimental results showed that highly experienced dancers could distinguish their own dances from skeletal information alone, and that they could also do so from averaged skeletal information. Furthermore, we generated features from the skeletal information of the dances and showed that individual dancers can be distinguished accurately by machine learning.
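As a rough illustration of the pipeline the abstract describes (features generated from Kinect skeletal information, then a random forest classifier), the sketch below derives simple per-joint statistics from skeletal sequences and trains a random forest to distinguish dancers. The feature choices (means and standard deviations of joint positions and frame-to-frame velocities), the data layout, and the function names are assumptions for illustration only, not the features or code used in the paper.

# Minimal sketch, NOT the paper's actual feature set: assumes each dance is a
# (frames x joints x 3) array of Kinect joint positions, derives simple
# per-joint statistics, and classifies the performing dancer with a random forest.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def skeleton_features(seq: np.ndarray) -> np.ndarray:
    """seq: (frames, joints, 3) joint positions from one Kinect recording."""
    velocity = np.diff(seq, axis=0)              # frame-to-frame joint movement
    feats = [
        seq.mean(axis=0).ravel(),                # average joint positions
        seq.std(axis=0).ravel(),                 # how widely each joint moves
        np.abs(velocity).mean(axis=0).ravel(),   # average joint speed
        np.abs(velocity).std(axis=0).ravel(),    # variability of joint speed
    ]
    return np.concatenate(feats)

def evaluate(sequences, dancer_ids):
    """sequences: list of (frames, joints, 3) arrays; dancer_ids: label per recording.
    Both are hypothetical inputs standing in for the paper's dataset."""
    X = np.stack([skeleton_features(s) for s in sequences])
    y = np.asarray(dancer_ids)
    clf = RandomForestClassifier(n_estimators=100, random_state=0)
    return cross_val_score(clf, X, y, cv=5).mean()

This only conveys the general shape of "skeletal features into a random forest"; the paper's own feature definitions and evaluation protocol may differ.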

Keywords

Dance · Personality · Kinect · Skeleton · Random forest

Acknowledgments

This work was supported in part by JST ACCEL Grant Number JPMJAC1602, Japan.

Supplementary material

Supplementary material 1 (MP4 47131 kb)


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  1. Meiji University, Nakano-ku, Japan
