
Methods for Studying Music-Related Body Motion

  • Alexander Refsum Jensenius
Part of the Springer Handbooks book series (SHB)

Abstract

This chapter presents an overview of some methodological approaches and technologies that can be used in the study of music-related body motion. The aim is not to cover all possible approaches, but rather to highlight those that are most relevant from a musicological point of view. This includes methods for video-based and sensor-based motion analysis, both qualitative and quantitative. It also includes discussions of the strengths and weaknesses of the different methods, and reflections on how they can be used in connection with other types of data, such as physiological or neurological data, symbolic notation, sound recordings, and contextual data.
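Quantitative video analysis of the kind surveyed in the chapter often starts from simple frame differencing, from which features such as the quantity of motion (QoM, see the abbreviations below) can be derived. The following is a minimal illustrative sketch, assuming OpenCV and NumPy and a hypothetical input file performance.mp4; it is not the chapter's own implementation, just one common way to compute such a feature.

```python
# Illustrative sketch: quantity of motion (QoM) from frame differencing.
# Assumes OpenCV (cv2) and NumPy; "performance.mp4" is a hypothetical file.
import cv2
import numpy as np

cap = cv2.VideoCapture("performance.mp4")
ok, frame = cap.read()
if not ok:
    raise SystemExit("could not open video")
prev = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

qom = []  # one QoM value per frame pair
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # Absolute inter-frame difference, thresholded to suppress sensor noise
    diff = cv2.absdiff(gray, prev)
    _, motion = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
    # QoM here is the fraction of pixels that changed between frames
    qom.append(np.count_nonzero(motion) / motion.size)
    prev = gray

cap.release()
```

Plotting such a QoM series alongside the audio waveform is a typical first step when relating motion features to sound recordings.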

Abbreviations

  • ADC: analog-to-digital converter
  • AoM: area of motion
  • CoM: centroid of motion
  • CV: computer vision
  • ECG: electrocardiogram
  • EEG: electroencephalogram/electroencephalography
  • EMG: electromyogram
  • GDIF: gesture description interchange file format
  • GMS: gesture motion signal
  • GSR: galvanic skin response
  • IFF: interchange file format
  • LMA: Laban movement analysis
  • MCML: motion capture markup language
  • MPML: multimodal presentation markup language
  • OSC: open sound control
  • PML: performance markup language
  • QoM: quantity of motion
  • SDIF: sound description interchange format
  • XML: extensible markup language
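Several of the abbreviations above name formats for streaming and storing music-related motion data. Open Sound Control (OSC) is widely used as a transport for streaming sensor and motion features; as an illustration only, the sketch below sends a motion feature as OSC messages using the third-party python-osc package, with a hypothetical address pattern, host, and port.

```python
# Illustrative sketch: streaming a motion feature over OSC.
# Uses the python-osc package (an assumption; the chapter does not
# prescribe a specific library).
from pythonosc.udp_client import SimpleUDPClient

client = SimpleUDPClient("127.0.0.1", 9000)  # hypothetical host and port

# Send one quantity-of-motion value per analysis frame; the address
# pattern "/motion/qom" is an illustrative choice, not a standard.
for value in [0.01, 0.08, 0.12]:
    client.send_message("/motion/qom", value)
```

A receiver (e.g., a Max/MSP or Pure Data patch listening on the same port) could then map the incoming values to sound synthesis or analysis parameters.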


Copyright information

© Springer-Verlag Berlin Heidelberg 2018

Authors and Affiliations

  • Alexander Refsum Jensenius, Dept. of Musicology, University of Oslo, Oslo, Norway
