Retroaction Between Music and Physiology: An Approach from the Point of View of Emotions
It is well known that listening to music produces particular physiological reactions in the listener, yet the study of these relationships remains a largely unexplored field. When analyzing physiological signals measured on a person listening to music, one must first define models specifying what information can be observed in these signals. Conversely, generating music from physiological data amounts to creating the inverse of the relationship that occurs naturally; to do so, one must also define models that coherently control all the parameters of a generative music system from the limited physiological information available. The notion of emotion, besides seeming particularly appropriate in this context, turns out to be a central concept articulating musical and physiological models. In this article we propose an experimental real-time system, based on the paradigm of emotions, for studying the interactions and retroactions between music and physiology.
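To illustrate how a low-dimensional emotional state could articulate physiological input and musical output, the following sketch maps a point in the arousal–valence plane to coarse musical control parameters. This is a hypothetical mapping for illustration only, not the system described in this article; the parameter names and ranges are assumptions.

```python
def emotion_to_music_params(arousal, valence):
    """Map a 2-D emotion point (arousal and valence, each in [-1, 1])
    to coarse musical control parameters (hypothetical mapping)."""
    # Higher arousal -> faster tempo, spanning 60-120 BPM.
    tempo_bpm = 60 + 60 * (arousal + 1) / 2
    # Positive valence -> major mode, negative -> minor (a common heuristic).
    mode = "major" if valence >= 0 else "minor"
    # Higher arousal -> louder dynamics, normalized to [0.3, 1.0].
    loudness = 0.3 + 0.7 * (arousal + 1) / 2
    return {"tempo_bpm": tempo_bpm, "mode": mode, "loudness": loudness}
```

In a real-time setting, the (arousal, valence) point would itself be estimated from physiological features, and the returned parameters would drive the generative music engine.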
Keywords: Linear Discriminant Analysis · Emotion Recognition · Rhythmic Pattern · Vigilance State · Musical Excerpt
This research was carried out in the context of the SCRIME (Studio de Création et de Recherche en Informatique et Musique Électroacoustique, scrime.labri.fr) project, which is funded by the DGCA of the French Culture Ministry and the Aquitaine Regional Council. The SCRIME project is the result of a cooperation agreement between the Conservatoire of Bordeaux, ENSEIRB-Matmeca (a school of electronics and computer science engineering) and the University of Sciences of Bordeaux. It brings together electroacoustic music composers and scientific researchers, and is managed by the LaBRI (the computer science research laboratory of the University of Bordeaux, www.labri.fr). Its main missions are research and creation, as well as diffusion and pedagogy.
We would like to thank Pierre Héricourt, system engineer at LaBRI, for developing the drivers for the EEG headsets, which allowed us to interface the headsets with all the software components and thus to actually run our experiments.