Towards Automated Analysis of Joint Music Performance in the Orchestra

  • Giorgio Gnecco
  • Leonardo Badino
  • Antonio Camurri
  • Alessandro D’Ausilio
  • Luciano Fadiga
  • Donald Glowinski
  • Marcello Sanguineti
  • Giovanna Varni
  • Gualtiero Volpe
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 116)

Abstract

Preliminary results are presented from a study of expressivity and non-verbal social signals in small groups of users. Music is selected as the experimental test-bed since it is a clear example of an interactive and social activity in which affective non-verbal communication plays a fundamental role. In this experiment the orchestra is adopted as a social group characterized by a clear leader (the conductor) and two groups of musicians (the first and second violin sections). It is shown that a reduced set of simple movement features (head movements) can be sufficient to explain the difference in the behavior of the first violin section between two performance conditions, which differ in the eye contact allowed between the two violin sections and between the first section and the conductor.
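The abstract does not detail the analysis pipeline. As a minimal illustration of the kind of "simple movement feature" it mentions, the following hypothetical Python sketch computes a mean head-movement speed from tracked head positions and compares it across two performance conditions. The function name, frame rate, and data are assumptions for illustration, not the authors' method:

```python
import numpy as np

def head_speed(positions, fps=25.0):
    """Mean head speed from a (T, 2) array of tracked head positions.

    `positions` holds per-frame (x, y) coordinates of a musician's head;
    speed is the frame-to-frame displacement scaled by the frame rate.
    (Illustrative feature only; not the feature set used in the paper.)
    """
    positions = np.asarray(positions, dtype=float)
    # Euclidean displacement between consecutive frames.
    displacements = np.linalg.norm(np.diff(positions, axis=0), axis=1)
    return displacements.mean() * fps

# Hypothetical data: a head tracing a wider oscillation in condition A
# than in condition B, at the same tempo.
t = np.linspace(0, 2 * np.pi, 50)
condition_a = np.column_stack([2.0 * np.cos(t), 2.0 * np.sin(t)])
condition_b = np.column_stack([np.cos(t), np.sin(t)])

# Wider motion at the same rate yields a higher mean speed.
print(head_speed(condition_a) > head_speed(condition_b))
```

A feature of this kind, aggregated per section and per condition, is one simple way such behavioral differences could be quantified.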

Keywords

automated analysis of non-verbal behavior, expressive gesture analysis, computational models of joint music action



Copyright information

© ICST Institute for Computer Science, Social Informatics and Telecommunications Engineering 2013

Authors and Affiliations

  • Giorgio Gnecco (1)
  • Leonardo Badino (2)
  • Antonio Camurri (1)
  • Alessandro D’Ausilio (2)
  • Luciano Fadiga (2)
  • Donald Glowinski (1)
  • Marcello Sanguineti (1)
  • Giovanna Varni (1)
  • Gualtiero Volpe (1)
  1. DIBRIS Department, University of Genoa, Genoa, Italy
  2. IIT - Italian Institute of Technology, Genoa, Italy
