
Evaluating BCI for Musical Expression: Historical Approaches, Challenges and Benefits

Chapter in Brain Art

Abstract

A recurring challenge in the use of BCI (and, more generally, HCI) for musical expression lies in designing and conducting appropriate evaluation strategies for BCI systems intended for music composition or performance. Assessing the value of computationally assisted creativity is difficult in most artistic domains, and the assessment of computer-assisted (or entirely computer-generated) music is no exception. BCI offers two possibilities that traditional evaluation strategies do not: first, evaluations that require no conscious input from the listener (and therefore do not detract from the immersive experience of performing, creating, or listening to music); and second, neurofeedback loops that actively guide the creator or listener through an expressive musical experience. Music also presents some unusual challenges compared with other artistic interfaces: it is often made in ensemble, for example, and there is evidence of neurophysiological differences between ensemble and solo performance (see, e.g., Babiloni et al., Cortex 47:1082–1090, 2011). Moreover, a central purpose of music is often to incite movement (swaying, head-nodding, dancing) in both performer and audience, which poses further challenges for BCI/HCI design. This chapter reviews historical approaches and proposes borrowing solutions from the world of auditory display (also referred to as sonification) and from psychoacoustic evaluation techniques, combining them into a hybrid paradigm for evaluating expression in BCI music applications.
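To make the neurofeedback-loop idea above concrete, the following is a minimal Python sketch (not taken from the chapter) of an affectively driven feedback loop: an implicit EEG-derived arousal estimate nudges a musical parameter (tempo) toward a target state without asking the listener for any conscious input. The acquisition call `read_eeg_window()`, the beta/alpha bandpower ratio used as an arousal proxy, and the gain and tempo bounds are all illustrative assumptions; a real system would use a calibrated affect classifier (cf. Daly et al. 2016) and proper signal conditioning.

```python
import numpy as np

FS = 256          # sampling rate in Hz (assumed)
WINDOW_S = 2.0    # analysis window length in seconds (assumed)


def read_eeg_window() -> np.ndarray:
    """Placeholder for a real acquisition call; returns synthetic single-channel data."""
    return np.random.randn(int(FS * WINDOW_S))


def bandpower(signal: np.ndarray, lo: float, hi: float) -> float:
    """Mean FFT power of `signal` between lo and hi Hz."""
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / FS)
    psd = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= lo) & (freqs < hi)
    return float(psd[band].mean())


def arousal_estimate(signal: np.ndarray) -> float:
    """Crude arousal proxy: beta (13-30 Hz) over alpha (8-13 Hz) bandpower ratio."""
    return bandpower(signal, 13.0, 30.0) / (bandpower(signal, 8.0, 13.0) + 1e-12)


def feedback_step(tempo_bpm: float, arousal: float, target: float, gain: float = 4.0) -> float:
    """Nudge tempo toward the target arousal level.

    Assumes (as an illustrative simplification) that slower music lowers arousal,
    so the tempo is reduced when the listener is over-aroused and raised otherwise.
    """
    return float(np.clip(tempo_bpm + gain * (target - arousal), 60.0, 180.0))


if __name__ == "__main__":
    tempo = 120.0          # starting tempo in BPM
    target_arousal = 1.0   # desired arousal level (arbitrary units)
    for step in range(10):
        arousal = arousal_estimate(read_eeg_window())
        tempo = feedback_step(tempo, arousal, target_arousal)
        print(f"step {step}: arousal={arousal:.2f} -> tempo={tempo:.1f} BPM")
```

The same loop structure could drive any musical mapping (pitch, timbre, dynamics); the evaluation question the chapter raises is whether the listener's trajectory through a valence-arousal space (Russell 1980) can itself serve as an implicit measure of expressive success.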


References

  • Aldridge D (2005) Music therapy and neurological rehabilitation: performing health. Jessica Kingsley Publishers
  • AlZoubi O, Calvo RA, Stevens RH (2009) Classification of EEG for affect recognition: an adaptive approach. In: Australasian joint conference on artificial intelligence. Springer, pp 52–61
  • AlZoubi O, Koprinska I, Calvo RA (2008) Classification of brain-computer interface data. In: Proceedings of the 7th Australasian data mining conference, vol 87. Australian Computer Society, Inc, pp 123–131
  • Babiloni C, Vecchio F, Infarinato F, Buffo P, Marzano N, Spada D, Rossi S, Bruni I, Rossini PM, Perani D (2011) Simultaneous recording of electroencephalographic data in musicians playing in ensemble. Cortex 47:1082–1090
  • Baier G, Hermann T, Stephani U (2007a) Event-based sonification of EEG rhythms in real time. Clin Neurophysiol 118:1377–1386
  • Baier G, Hermann T, Stephani U (2007b) Multi-channel sonification of human EEG. In: Proceedings of the 13th international conference on auditory display
  • Bailes F, Dean RT (2009) Listeners discern affective variation in computer-generated musical sounds. Perception 38:1386–1404. https://doi.org/10.1068/p6063
  • Berndt A (2009) Musical nonlinearity in interactive narrative environments. MPublishing, University of Michigan Library, Ann Arbor, MI
  • Bigand E, Poulin-Charronnat B (2006) Are we “experienced listeners”? A review of the musical capacities that do not depend on formal musical training. Cognition 100:100–130
  • Brouwer A-M, van Erp J (2010) A tactile P300 brain-computer interface. Front Neurosci. https://doi.org/10.3389/fnins.2010.00019
  • Casey K, Smith D (2013) Global mind field-a cybernetic perspective
  • Chanel G, Ansari-Asl K, Pun T (2007) Valence-arousal evaluation using physiological signals in an emotion recall paradigm. In: IEEE international conference on systems, man and cybernetics, 2007. ISIC, pp 2662–2667. https://doi.org/10.1109/icsmc.2007.4413638
  • Chanel G, Kronegg J, Grandjean D, Pun T (2006) Emotion assessment: arousal evaluation using EEG’s and peripheral physiological signals. In: Multimedia content representation, classification and security, pp 530–537
  • Chew YCD, Caspary E (2011) MusEEGk: a brain computer musical interface. In: Proceedings of the 2011 annual conference extended abstracts on human factors in computing systems. ACM Press, New York, NY, pp 1417–1422. https://doi.org/10.1145/1979742.1979784
  • Clair AA, Memmott J (2008) Therapeutic uses of music with older adults. ERIC
  • Daly I, Hallowell J, Hwang F, Kirke A, Malik A, Roesch E, Weaver J, Williams D, Miranda E, Nasuto SJ (2014a) Changes in music tempo entrain movement related brain activity. In: 2014 36th annual international conference of the IEEE engineering in medicine and biology society. IEEE, pp 4595–4598
  • Daly I, Malik A, Hwang F, Roesch E, Weaver J, Kirke A, Williams D, Miranda E, Nasuto SJ (2014b) Neural correlates of emotional responses to music: an EEG study. Neurosci Lett 573:52–57
  • Daly I, Williams D, Hwang F, Kirke A, Malik A, Roesch E, Weaver J, Miranda E, Nasuto SJ (2014c) Brain-computer music interfacing for continuous control of musical tempo
  • Daly I, Malik A, Weaver J, Hwang F, Nasuto SJ, Williams D, Kirke A, Miranda E (2015) Towards human-computer music interaction: evaluation of an affectively-driven music generator via galvanic skin response measures. IEEE, pp 87–92. https://doi.org/10.1109/ceec.2015.7332705
  • Daly I, Williams D, Kirke A, Weaver J, Malik A, Hwang F, Miranda E, Nasuto SJ (2016) Affective brain–computer music interfacing. J Neural Eng 13:46022–46035
  • De Smedt T, Menschaert L (2012) VALENCE: affective visualisation using EEG. Digit Creat 23:272–277
  • Eaton ML (1971) Bio-music: biological feedback experimental music systems. Orcus
  • Fagen TS (1982) Music therapy in the treatment of anxiety and fear in terminal pediatric patients. Music Ther 2:13–23
  • Franco F, Swaine JS, Israni S, Zaborowska KA, Kaloko F, Kesavarajan I, Majek JA (2014) Affect-matching music improves cognitive performance in adults and young children for both positive and negative emotions. Psychol Music 42:869–887
  • Goudeseune C (2002) Interpolated mappings for musical instruments. Organised Sound 7:85–96
  • Grierson M (2008) Composing with brainwaves: minimal trial P300b recognition as an indication of subjective preference for the control of a musical instrument. In: Proceedings of the international computer music conference (ICMC’08)
  • Grierson M, Kiefer C (2011) Better brain interfacing for the masses. ACM Press, p 1681. https://doi.org/10.1145/1979742.1979828
  • Gürkök H, Nijholt A (2013) Affective brain-computer interfaces for arts. In: 2013 Humaine association conference on affective computing and intelligent interaction (ACII). IEEE, pp 827–831
  • Hanser SB (1985) Music therapy and stress reduction research. J Music Ther 22:193–206
  • Hinterberger T, Baier G (2005) POSER: parametric orchestral sonification of EEG in real-time for the self-regulation of brain states. IEEE Trans Multimed 12:70
  • Hunt A, Kirk R (2000) Mapping strategies for musical performance. Trends Gestural Control Music 21:231–258
  • Hunter PG, Schellenberg EG, Schimmack U (2010) Feelings and perceptions of happiness and sadness induced by music: similarities, differences, and mixed emotions. Psychol Aesthet Creat Arts 4:47
  • Huron D (2011) Why is sad music pleasurable? A possible role for prolactin. Music Sci 15:146–158
  • Juslin PN, Laukka P (2004) Expression, perception, and induction of musical emotions: a review and a questionnaire study of everyday listening. J New Music Res 33:217–238
  • Knapp RB, Jaimovich J, Coghlan N (2009) Measurement of motion and emotion during musical performance
  • Knapp RB, Lusted HS (1990) A bioelectric controller for computer music applications. Comput Music J 14:42–47
  • Le Groux S, Verschure P (2009) Neuromuse: training your brain through musical interaction. In: Proceedings of the international conference on auditory display, Copenhagen, Denmark
  • Lee HY, Lee WH (2014) A study on interactive media art to apply emotion recognition. Int J Multimed Ubiquitous Eng 9:12
  • Leslie G, Mullen T (2012) MoodMixer: EEG-based collaborative sonification. In: Proceedings of the international conference on new interfaces for musical expression, pp 296–299. http://www.nime.org/proceedings/2011/nime2011_296.pdf. Accessed 19 Nov
  • Lin C-Y, Cheng S (2012) Multi-theme analysis of music emotion similarity for jukebox application. In: 2012 International conference on audio, language and image processing (ICALIP). IEEE, pp 241–246
  • Lin Y-P, Wang C-H, Jung T-P, Wu T-L, Jeng S-K, Duann J-R, Chen J-H (2010) EEG-based emotion recognition in music listening. IEEE Trans Biomed Eng 57:1798–1806. https://doi.org/10.1109/TBME.2010.2048568
  • Lucier A (1976) Statement on: music for solo performer. In: Biofeedback and the arts, results of early experiments. Aesthetic Research Center of Canada Publications, Vancouver, pp 60–61
  • Lyon E, Knapp RB, Ouzounian G (2014) Compositional and performance mapping in computer chamber music: a case study. Comput Music J 38:64–75
  • Manuel P (2005) Does sad music make one sad? An ethnographic perspective. Contemp Aesthet 3
  • Manzolli J, Verschure PFMJ (2005) Roboser: a real-world composition system. Comput Music J 29:55–74
  • Merchel S, Altinsoy E, Stamm M (2010) Tactile music instrument recognition for audio mixers. In: Audio engineering society convention 128
  • Merchel S, Altinsoy ME, Stamm M (2012) Touch the sound: audio-driven tactile feedback for audio mixing applications. J Audio Eng Soc 60:47–53
  • Middendorf M, McMillan G, Calhoun G, Jones KS et al (2000) Brain-computer interfaces based on the steady-state visual-evoked response. IEEE Trans Rehabil Eng 8:211–214
  • Miranda ER (2010) Plymouth brain-computer music interfacing project: from EEG audio mixers to composition informed by cognitive neuroscience. Int J Arts Technol 3:154–176
  • Miranda ER, Castet J (eds) (2014) Guide to brain-computer music interfacing. Springer, London
  • Mühl C, Heylen D, Nijholt A (2015) Affective brain-computer interfaces: neuroscientific approaches to affect detection. In: Oxford handbook of affective computing. Oxford University Press, Oxford, pp 217–232
  • Mullen T, Khalil A, Ward T, Iversen J, Leslie G, Warp R, Whitman M et al (2015) MindMusic: playful and social installations at the interface between music and the brain. In: More playful user interfaces. Springer, pp 197–229
  • Nirjon S, Dickerson RF, Li Q, Asare P, Stankovic JA, Hong D, Zhang B, Jiang X, Shen G, Zhao F (2012) MusicalHeart: a hearty way of listening to music. In: Proceedings of the 10th ACM conference on embedded network sensor systems. ACM, pp 43–56
  • Pérez MAO, Knapp RB (2008) BioTools: a biosignal toolbox for composers and performers. In: Computer music modeling and retrieval. Sense of sounds. Springer, pp 441–452
  • Picinali L, Katz BF (2010) Spectral discrimination thresholds comparing audio and haptics. In: Proceedings of haptic and auditory interaction design workshop, Copenhagen, pp 1–2
  • Ramirez R, Vamvakousis Z (2012) Detecting emotion from EEG signals using the Emotiv EPOC device. In: Zanzotto FM, Tsumoto S, Taatgen N, Yao Y (eds) Brain informatics. Lecture notes in computer science, vol 7670. Springer, Berlin, Heidelberg, pp 175–184
  • Rosenboom D (1990) The performing brain. Comput Music J 14:48–66
  • Russell JA (1980) A circumplex model of affect. J Pers Soc Psychol 39:1161
  • Scherer KR (2004) Which emotions can be induced by music? What are the underlying mechanisms? And how can we measure them? J New Music Res 33:239–251
  • Snyder JS, Large EW (2005) Gamma-band activity reflects the metric structure of rhythmic tone sequences. Cogn Brain Res 24:117–126. https://doi.org/10.1016/j.cogbrainres.2004.12.014
  • Teitelbaum R (1976) In tune: some early experiments in biofeedback music (1966–1974). In: Biofeedback and the arts, results of early experiments. Aesthetic Research Center of Canada Publications, Vancouver
  • Toharia P, Morales J, Juan O, Fernaud I, Rodríguez A, DeFelipe J (2014) Musical representation of dendritic spine distribution: a new exploratory tool. Neuroinformatics 1–13. https://doi.org/10.1007/s12021-013-9195-0
  • Väljamäe A, Steffert T, Holland S, Marimon X, Benitez R, Mealla S, Oliveira A, Jordà S (2013) A review of real-time EEG sonification research
  • Vuoskoski JK, Eerola T (2012) Can sad music really make you sad? Indirect measures of affective states induced by music and autobiographical memories. Psychol Aesthet Creat Arts 6:204
  • Vuoskoski JK, Thompson WF, McIlwain D, Eerola T (2012) Who enjoys listening to sad music and why? Music Percept 29:311–317
  • Williams D (2016) Utility versus creativity in biomedical musification. J Creat Music Syst 1
  • Wu D, Li C, Yin Y, Zhou C, Yao D (2010) Music composition from the brain signal: representing the mental state by music. Comput Intell Neurosci 2010:14


Author information

Correspondence to Duncan A. H. Williams.


Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter


Cite this chapter

Williams, D.A.H. (2019). Evaluating BCI for Musical Expression: Historical Approaches, Challenges and Benefits. In: Nijholt, A. (ed) Brain Art. Springer, Cham. https://doi.org/10.1007/978-3-030-14323-7_5


  • DOI: https://doi.org/10.1007/978-3-030-14323-7_5


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-14322-0

  • Online ISBN: 978-3-030-14323-7

  • eBook Packages: Computer Science; Computer Science (R0)
