INTERACT 2015: Human-Computer Interaction, pp. 47–54

The LuminUs: Providing Musicians with Visual Feedback on the Gaze and Body Motion of Their Co-performers

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9297)

Abstract

This paper describes the LuminUs, a device that we designed to explore how new technologies could influence the inter-personal aspects of co-present musical collaborations. The LuminUs uses eye-tracking headsets and small wireless accelerometers to measure the gaze and body motion of each musician. A small light display then provides visual feedback to each musician, based either on the gaze or on the body motion of their co-performer. We carried out an experiment with 15 pairs of music students to investigate how the LuminUs would influence their musical interactions. Preliminary results suggest that gaze-based feedback from the LuminUs led to significantly increased glancing between the two musicians, whilst motion-based feedback appeared to lead to a decrease in body motion for both participants.
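The motion-feedback pathway the abstract describes (wireless accelerometer in, light display out) could be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the function names, the 1 g rest baseline, and the linear gain are all assumptions.

```python
import math

def motion_energy(samples):
    """Mean deviation of accelerometer magnitude from 1 g (assumed rest level),
    over a window of (x, y, z) readings in g units."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return sum(abs(m - 1.0) for m in mags) / len(mags)

def to_brightness(energy, gain=255.0):
    """Map motion energy to an 8-bit brightness for the co-performer's
    light display, clamped to 0..255 (linear mapping is an assumption)."""
    return max(0, min(255, int(energy * gain)))

# A co-performer at rest (gravity only) produces a dark display;
# larger deviations from rest drive the light brighter.
rest = [(0.0, 0.0, 1.0)] * 10
moving = [(0.3, 0.1, 1.2), (0.5, -0.2, 0.8), (0.1, 0.4, 1.3)]
print(to_brightness(motion_energy(rest)))    # 0
print(to_brightness(motion_energy(moving)))
```

A gaze-based variant would follow the same shape, with the eye tracker's "looking at partner" signal switching the display on rather than a continuous energy value.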

Keywords

Musical interaction · Computer-supported cooperative work · Groupware · Eye-tracking · Social signals · Non-verbal communication

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  1. School of Electronic Engineering and Computer Science, Queen Mary University of London, London, UK