
Expressive non-verbal interaction in a string quartet: an analysis through head movements

  • Original Paper
  • Journal on Multimodal User Interfaces

Abstract

The present study investigates expressive non-verbal interaction in a musical context, starting from behavioral features extracted at the individual and group levels. Four groups of features are defined, related to head movement and head direction, which may help to gain insight into the expressivity and cohesion of the performance and to discriminate between different performance conditions. The features are then evaluated both at a global scale and at a local scale. The findings obtained from the analysis of a string quartet recorded in an ecological setting show that these features, used alone or in combination, may help in distinguishing between two types of performance: (a) a concert-like condition, in which all musicians aim at performing at their best, and (b) a perturbed condition, in which the first violinist devises alternative interpretations of the music score without discussing them with the other musicians. In the global data analysis, the discriminative power of the features is investigated through statistical tests. In the local data analysis, a larger amount of data makes it possible to exploit more sophisticated machine learning techniques to select suitable subsets of the features, which are then used to train an SVM classifier to perform binary classification. Interestingly, the features whose discriminative power is evaluated as large (respectively, small) in the global analysis are evaluated similarly in the local analysis. When used together, the 22 features defined in the paper prove effective for classification, correctly classifying about 90 % of the examples not used in the training phase. Similar results are obtained with a subset of only 15 features.
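As a concrete illustration of the local-analysis pipeline summarized above, the sketch below shows how per-window feature vectors could be used to train an SVM for binary classification of the performance condition, with a linear-SVM weight ranking (in the spirit of [37, 38]) to select a reduced subset of features. This is a minimal sketch, not the authors' implementation: the data are synthetic placeholders, and the number of windows, kernel choice, and hyperparameters are assumptions.

```python
# Minimal sketch (not the authors' code): binary classification of the performance
# condition (concert-like vs. perturbed) from per-window head-movement features,
# with a linear-SVM weight ranking used for feature selection.
# All data below are synthetic placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC, LinearSVC

rng = np.random.default_rng(0)
n_windows, n_features = 1000, 22           # 22 features per time window (assumed window count)
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 2, size=n_windows)     # 0 = concert-like, 1 = perturbed (placeholder labels)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Rank features by the magnitude of the weights of a linear SVM trained on all features.
ranker = make_pipeline(StandardScaler(), LinearSVC(C=1.0, max_iter=10000))
ranker.fit(X_train, y_train)
weights = np.abs(ranker.named_steps["linearsvc"].coef_).ravel()
top15 = np.argsort(weights)[::-1][:15]     # keep the 15 highest-ranked features

# Train an SVM classifier on the selected subset and evaluate on held-out windows.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
clf.fit(X_train[:, top15], y_train)
print("held-out accuracy:", clf.score(X_test[:, top15], y_test))
```

The paper reports roughly 90 % correctly classified held-out examples with all 22 features and similar results with 15 selected features; with the random placeholder data above, accuracy will of course hover around chance level.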


Notes

  1. http://www.siempre.infomus.org.

  2. http://www.infomus.org/index_eng.php.

  3. http://www.qualisys.com.

  4. http://www.infomus.org/eyesweb_eng.php.

  5. Concerning the definitions of some features, we have corrected a typo present in [5], which reported an old definition of the features \(F_1\) and \(\mathbf{F}_3\) (given in terms of means over the frames instead of medians over the frames, as in the present manuscript), although its numerical results for these features were actually obtained according to the same definitions as in the present manuscript.

  6. That condition was mentioned in Sect. 3.1 only to describe how the procedure could be modified in the unlikely case that it occurred.

  7. Note that there is no contradiction between the use of the median in the definition of the feature \(F_1\) in Sect. 3.1 and the use of the mean in the analysis described in this subsection. Indeed, the median over the frames of each recording was used in the definition of the feature \(F_1\) in Sect. 3.1, whereas the mean in this subsection was computed at another level of the analysis, i.e., by averaging the obtained values of the feature \(F_1\) over the recordings associated with the same performance condition. A similar remark holds for the feature \(\mathbf{F}_2\).

  8. The interonset interval (\(IoI\)) is the lapse of time between the beginnings of two consecutive time windows. Since in this work it was chosen to be smaller than the length of the time windows, consecutive time windows always overlapped; a minimal sketch of this windowing scheme is given after these notes. Although the overlap introduced an additional correlation between the features computed on different time windows, this correlation was limited to consecutive time windows.
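The following sketch illustrates the windowing scheme referred to in notes 5 and 8: overlapping time windows whose onsets are \(IoI\) seconds apart, with a per-window feature computed as the median over frames. It is an illustrative assumption, not the authors' code; the sampling rate, window length, and \(IoI\) values are placeholders.

```python
# Minimal sketch (assumed parameters): overlapping windows with hop = IoI < window length,
# and a per-window feature taken as the median over the frames of the window.
import numpy as np

fs = 100.0                       # frames per second (assumed)
window_len_s, ioi_s = 4.0, 1.0   # window length and interonset interval, in seconds (assumed)
win, hop = int(window_len_s * fs), int(ioi_s * fs)

# Placeholder signal: one head-movement quantity per frame.
signal = np.random.default_rng(1).normal(size=6000)

features = []
for start in range(0, len(signal) - win + 1, hop):
    frames = signal[start:start + win]
    features.append(np.median(frames))   # median over the frames of the window
features = np.array(features)

# Since hop < win, consecutive windows share frames, so consecutive feature values
# are correlated; non-consecutive windows do not overlap.
print(features.shape)
```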

References

  1. Eerola T, Vuoskoski JK (2013) A review of music and emotion studies: approaches, emotion models, and stimuli. Music Percept Interdiscip J 30(3):307–340

  2. Vinciarelli A, Pantic M, Heylen D, Pelachaud C, Poggi I, D’Errico F, Schroder M (2012) Bridging the gap between social animal and unsocial machine: a survey of social signal processing. IEEE TAC 3(1):69–87

  3. Seddon F, Biasutti M (2009) A comparison of modes of communication between members of a string quartet and a jazz sextet. Psychol Music 37(4):395

  4. Gilboa A, Tal-Shmotkin M (2010) String quartets as self-managed teams: an interdisciplinary perspective. Psychol Music 40(1):19–41

  5. Glowinski D, Gnecco G, Camurri A, Piana S (2013) Expressive non-verbal interaction in string quartet. In: Proc. of the fifth IEEE conference on affective computing and intelligent interaction (IEEE ACII 2013), pp 233–238

  6. Cadoz C, Wanderley MM (2000) Gesture: music. In: Trends in gestural control of music. Ircam/Centre Pompidou, Paris, pp 71–94

  7. Chadefaux D, Wanderley M, Le Carrou J-L, Fabre B, Daudet L (2012) Experimental study of the musician/instrument interaction in the case of the concert harp. In: Proceedings of acoustics

  8. Wanderley MM (2002) Quantitative analysis of non-obvious performer gestures. Proc Gest Workshop 2001(2):241–253

  9. Davidson JW (1993) Visual perception of performance manner in the movements of solo musicians. Psychol Music 21:103–113

  10. Davidson JW (1994) What type of information is conveyed in the body movements of solo musician performers? J Hum Mov Stud 6:279–301

  11. Castellano G, Mortillaro M, Camurri A, Volpe G, Scherer K (2008) Automated analysis of body movement in emotionally expressive piano performances. Music Percept 26:103–120

  12. Palmer C, Koopmans E, Carter C, Loehr JD, Wanderley M (2009) Synchronization of motion and timing in clarinet performance. In: Proceedings of the second international symposium on performance science

  13. Varni G, Volpe G, Camurri A (2010) A system for real-time multimodal analysis of nonverbal affective social interaction in user-centric media. IEEE Trans Multimed 12:576–590

  14. D’Ausilio A, Badino L, Li Y, Tokay S, Craighero L, Canto R, Aloimonos Y, Fadiga L (2012) Leadership in orchestra emerges from the causal relationships of movement kinematics. PLoS ONE 7. doi:10.1371/journal.pone.0035757

  15. Gnecco G, Badino L, Camurri A, D’Ausilio A, Fadiga L, Glowinski D, Sanguineti M, Varni G, Volpe G (2013) Towards automated analysis of joint music performance in the orchestra. In: Arts and technology, third international conference, ArtsIT 2013, Milan, Italy, March 21–23, 2013, revised selected papers. Lecture Notes of the institute for computer sciences, social informatics and telecommunications engineering (LNICST) series, vol 116. Springer, Berlin, pp 120–127

  16. Clayton M, Sager R, Will U (2004) In time with the music: the concept of entrainment and its significance for ethnomusicology. ESEM Counterpoint 1:1–82

  17. Davidson JW, Good JMM (2002) Social and musical co-ordination between members of a string quartet: an exploratory study. Psychol Music 30(2):186

  18. Glowinski D, Coletta P, Volpe G, Camurri A, Chiorri C, Schenone A (2010) Multi-scale entropy analysis of dominance in social creative activities. In: ACM Multimedia MM10. ACM, Firenze, pp 1035–1038

  19. Keller PE, Appel M (2010) Individual differences, auditory imagery, and the coordination of body movements and sounds in musical ensembles. Music Percept 28(1):27–46

  20. Poggi I (2006) Body and mind in the pianist’s performance. In: Proceedings of the 9th international conference on music perception and cognition, pp. 1044–1051

  21. Glowinski D, Badino L, Ausilio A, Camurri A, Fadiga L (2012) Analysis of leadership in a string quartet. In: Third international workshop on social behaviour in music at ACM ICMI 2012

  22. D’Ausilio A, Badino L, Li Y, Tokay S, Craighero L, Canto R, Aloimonos Y, Fadiga L (2011) Communication in orchestra playing as measured with Granger causality. In: Proceedings of the INTETAIN conference (intelligent technologies for interactive entertainment), Genova, May 2011

  23. Luck G, Toiviainen P (2006) Ensemble musicians’ synchronization with conductors’ gestures: an automated feature-extraction analysis. Music Percept 24(2):189–200

  24. Poggi I (2011) Music and leadership: the Choir Conductor’s multimodal communication. John Benjamins Pub Co, Amsterdam, pp 341–353

  25. Glowinski D, Camurri A, Volpe G, Noera C, Cowie R, McMahon E, Knapp B, Jaimovich J (2008) Using induction and multimodal assessment to understand the role of emotion in musical performance. In: Proceedings of the 2008 conference on emotion in human–computer interaction. Liverpool John Moores University, Liverpool

  26. Schubert E (2001) Continuous measurement of self-report emotional response to music. In: Juslin PN, Sloboda JA (eds) Music and emotion: theory and research. Series in affective science. Oxford University Press, New York, NY, pp 393–414

  27. Glowinski D, Torres-Eliard K, Chiorri C, Camurri A, Grandjean D (2012) Can naive observers distinguish a violinist’s solo from an ensemble performance? A pilot study. In: Third international workshop on social behaviour in music at ACM ICMI 2012

  28. Glowinski D, Dael N, Camurri A, Volpe G, Mortillaro M, Scherer K (2011) Toward a minimal representation of affective gestures. IEEE Trans Affect Comput 2:106–118

  29. Dahl S, Bevilacqua F, Bresin R, Clayton M, Leante L, Poggi I, Rasamimanana N (2009) Gestures in performance. In: Musical gestures: sound, movement, and meaning

  30. Gnecco G, Glowinski D, Camurri A, Sanguineti M (2013) On the detection of the level of attention in an orchestra through head movements. Int J Arts Technol (to appear)

  31. Stiefelhagen R (2002) Tracking focus of attention in meetings. In: Proceedings of the fourth IEEE international conference on multimodal interfaces. IEEE, New York, pp 273–280

  32. Stiefelhagen R, Zhu J (2002) Head orientation and gaze direction in meetings. In: CHI ’02 extended abstracts on human factors in computing systems, CHI EA ’02. ACM, New York, pp 858–859

  33. Stiefelhagen R, Yang J, Waibel A (2002) Modeling focus of attention for meeting indexing based on multiple cues. IEEE Trans Neural Netw 13(4):928–938

  34. Camurri A, Dardard F, Ghisio S, Glowinski D, Gnecco G, Sanguineti M (2014) Exploiting the Shapley value in the estimation of the position of a point of interest for a group of individuals. Procedia Soc Behav Sci 108:249–259

  35. Ba S, Odobez J-M (2006) A study on visual focus of attention recognition from head pose in a meeting room. In: Renals S, Bengio S, Fiscus JG (eds) Machine learning for multimodal interaction. Lecture notes in computer science, vol 4299. Springer, Berlin, pp 75–87

  36. Hung H, Gatica-Perez D (2010) Estimating cohesion in small groups using audio–visual nonverbal behavior. IEEE Trans Multimed 12(6):563–575

  37. Chang Y-W, Lin C-J (2008) Feature ranking using linear SVM. In: JMLR workshop and conference proceedings: causation and prediction challenge at WCCI 2008, vol 3, pp 53–64

  38. Chen Y-W, Lin C-J (2006) Combining SVMs with various feature selection strategies. In: Feature extraction: foundations and applications. Studies in fuzziness and soft computing, vol 207. Springer, Berlin, pp 315–324

  39. Glowinski D, Mancini M, Cowie D, Camurri A (2013) How action adapts to social context: the movements of musicians in solo and ensemble conditions. In: Proc. of the fifth IEEE conference on affective computing and intelligent interaction (IEEE ACII 2013), pp 294–299

Acknowledgments

The project SIEMPRE acknowledges the financial support of the Future and Emerging Technologies (FET) programme within the Seventh Framework Programme for Research of the European Commission, under FET-Open Grant Number: 250026-2.

Author information

Corresponding author

Correspondence to Donald Glowinski.

About this article

Cite this article

Glowinski, D., Dardard, F., Gnecco, G. et al. Expressive non-verbal interaction in a string quartet: an analysis through head movements. J Multimodal User Interfaces 9, 55–68 (2015). https://doi.org/10.1007/s12193-014-0154-3

