Journal on Multimodal User Interfaces

Volume 4, Issue 1, pp 37–46

Browsing a dance video collection: dance analysis and interface design

  • Damien Tardieu
  • Xavier Siebert
  • Barbara Mazzarino
  • Ricardo Chessini
  • Julien Dubois
  • Stéphane Dupont
  • Giovanna Varni
  • Alexandra Visentin
Original Paper


In this article we present a system for content-based browsing of a dance video database. We propose a set of features that describe dance, quantifying both the dancer's local gestures and the global use of the stage. These features are used to compute similarities between recorded dance improvisations, which in turn guide the visual exploration in the browsing methods presented here. The software integrating all these components is part of an interactive touch-screen installation and is also accessible online in association with an artistic project. This paper presents the different components of the browsing system.
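The pipeline summarised above — one feature vector per recorded improvisation, pairwise similarities computed from those vectors, and the similarities then driving the browsing view — can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the example feature names and the Euclidean-distance-based similarity are assumptions made for the sketch.

```python
import math

def similarity_matrix(feature_vectors):
    """Pairwise similarity between recordings, derived from Euclidean
    distance between feature vectors and normalised to [0, 1]."""
    n = len(feature_vectors)
    dist = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):
            d = math.dist(feature_vectors[i], feature_vectors[j])
            dist[i][j] = dist[j][i] = d
    # Normalise so the most dissimilar pair has similarity 0
    # and identical feature vectors have similarity 1.
    max_d = max((d for row in dist for d in row), default=0.0) or 1.0
    return [[1.0 - d / max_d for d in row] for row in dist]

# Hypothetical per-recording descriptors, e.g.
# (mean gesture energy, gesture smoothness, fraction of stage covered).
recordings = [
    (0.8, 0.3, 0.6),
    (0.7, 0.4, 0.5),
    (0.1, 0.9, 0.2),
]
sim = similarity_matrix(recordings)
```

In a browsing interface of this kind, such a similarity matrix would typically feed a 2-D layout (e.g. multidimensional scaling) so that recordings judged similar by the features appear near each other on screen.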


Keywords: Dance · Database · Video · Motion capture · Gesture analysis · Data visualization · Interactive installation · Browsing interface · Touchscreen





Copyright information

© OpenInterface Association 2010

Authors and Affiliations

  • Damien Tardieu (1)
  • Xavier Siebert (2)
  • Barbara Mazzarino (3)
  • Ricardo Chessini (4)
  • Julien Dubois (1)
  • Stéphane Dupont (1)
  • Giovanna Varni (3)
  • Alexandra Visentin (3)

  1. TCTS Lab., University of Mons, Mons, Belgium
  2. MathRO Lab., University of Mons, Mons, Belgium
  3. InfoMus Lab—DIST, University of Genova, Genova, Italy
  4. SEMI Lab., University of Mons, Mons, Belgium
