Retroaction Between Music and Physiology: An Approach from the Point of View of Emotions

  • Pierre-Henri Vulliard
  • Joseph Larralde
  • Myriam Desainte-Catherine


It is well known that listening to music produces particular physiological reactions in the listener, yet the study of these relationships remains a largely unexplored field. When analyzing physiological signals measured on a person listening to music, one must first define models specifying what information can be observed in these signals. Conversely, generating music from physiological data amounts to attempting to construct the inverse of the relationship that occurs naturally; to do so, one must also define models enabling coherent control of all the parameters of a generative music system from the few physiological measurements available. The notion of emotion, besides being particularly appropriate in this context, proves to be a central concept articulating the musical and physiological models. In this article we propose an experimental real-time system, based on the paradigm of emotions, for studying the interactions and retroactions between music and physiology.
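The inverse mapping sketched above (from a few physiological measurements to the many parameters of a generative music system) can be illustrated with a minimal example, assuming a two-dimensional valence/arousal representation of emotional state. Every function name, parameter and numeric range below is a hypothetical illustration, not the authors' actual system.

```python
# Hypothetical sketch: map an estimated emotional state (valence, arousal),
# each in [-1, 1], to coarse control parameters of a generative music system.
# All mapping ranges are illustrative assumptions.

def emotion_to_music_params(valence: float, arousal: float) -> dict:
    """Map a point in the valence/arousal plane to musical parameters."""
    # Clamp inputs to the expected range.
    valence = max(-1.0, min(1.0, valence))
    arousal = max(-1.0, min(1.0, arousal))
    return {
        # Higher arousal -> faster tempo (60-180 BPM around a 120 BPM center).
        "tempo_bpm": 120 + 60 * arousal,
        # Positive valence -> major mode, negative -> minor.
        "mode": "major" if valence >= 0 else "minor",
        # Arousal also drives loudness (MIDI-like 0-127 scale).
        "velocity": int(64 + 63 * arousal),
    }

# A calm, pleasant state yields a slow major-mode, soft setting;
# a tense state yields a fast minor-mode, loud one.
print(emotion_to_music_params(0.6, -0.5))
print(emotion_to_music_params(-0.7, 0.9))
```

In a real-time setting these parameters would be streamed continuously to the synthesis engine (for instance over a messaging protocol such as Open Sound Control) as the emotion estimate is updated.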


Keywords: Linear Discriminant Analysis · Emotion Recognition · Rhythmic Pattern · Vigilance State · Musical Excerpt



This research was carried out in the context of SCRIME (Studio de Création et de Recherche en Informatique et Musique Électroacoustique), a project funded by the DGCA of the French Ministry of Culture and the Aquitaine Regional Council. SCRIME is the result of a cooperation agreement between the Conservatoire of Bordeaux, ENSEIRB-Matmeca (school of electronic and computer science engineering) and the University of Sciences of Bordeaux. It brings together electroacoustic music composers and scientific researchers, and is managed by the LaBRI (the computer science research laboratory of the University of Bordeaux). Its main missions are research, creation, diffusion and pedagogy.

We would like to thank Pierre Héricourt, system engineer at LaBRI, for developing the drivers for the EEG headsets, which allowed us to interface them with all the software components and thus to actually run our experiments.



Copyright information

© Springer-Verlag London 2014

Authors and Affiliations

  • Pierre-Henri Vulliard (1)
  • Joseph Larralde (1)
  • Myriam Desainte-Catherine (1, 2)

  1. University of Bordeaux, LaBRI, UMR 5800, Bordeaux, France
  2. CNRS, LaBRI, UMR 5800, Bordeaux, France
