Towards Automated Analysis of Joint Music Performance in the Orchestra
Cite this paper as: Gnecco G. et al. (2013) Towards Automated Analysis of Joint Music Performance in the Orchestra. In: De Michelis G., Tisato F., Bene A., Bernini D. (eds) Arts and Technology. ArtsIT 2013. Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering, vol 116. Springer, Berlin, Heidelberg
Preliminary results from a study of expressivity and non-verbal social signals in small groups of users are presented. Music is selected as the experimental test-bed since it is a clear example of an interactive, social activity in which affective non-verbal communication plays a fundamental role. In this experiment the orchestra is adopted as a social group characterized by a clear leader (the conductor) of two groups of musicians (the first and second violin sections). It is shown how a reduced set of simple movement features - head movements - can be sufficient to explain the difference in the behavior of the first violin section between two performance conditions, characterized by different degrees of eye contact between the two violin sections and between the first section and the conductor.
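The abstract does not specify how the head-movement features are computed; a common choice in expressive gesture analysis is a quantity-of-motion measure over tracked head positions, compared across conditions. The sketch below is purely illustrative and assumes hypothetical 2D head-tracking data; the feature definition and the toy comparison are not taken from the paper itself.

```python
import numpy as np

def quantity_of_motion(positions):
    """Mean frame-to-frame displacement of a tracked head marker.

    positions: (T, 2) array of (x, y) head coordinates over T frames.
    This feature definition is an assumption, not the paper's method.
    """
    steps = np.diff(positions, axis=0)               # per-frame displacement vectors
    return float(np.linalg.norm(steps, axis=1).mean())

# Toy random-walk data standing in for the two performance conditions;
# larger step sizes simulate more pronounced head movement.
rng = np.random.default_rng(0)
cond_eye_contact = np.cumsum(rng.normal(0, 1.0, size=(100, 2)), axis=0)
cond_no_eye_contact = np.cumsum(rng.normal(0, 0.3, size=(100, 2)), axis=0)

qom_a = quantity_of_motion(cond_eye_contact)
qom_b = quantity_of_motion(cond_no_eye_contact)
print(qom_a, qom_b)
```

A per-player feature of this kind, averaged over a section, could then be compared between the two eye-contact conditions with a standard significance test.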
Keywords: automated analysis of non-verbal behavior; expressive gesture analysis; computational models of joint music action