Embodied Musical Interaction

Body Physiology, Cross Modality, and Sonic Experience

Part of the Springer Series on Cultural Computing book series (SSCC)

Abstract

Music is a natural partner to human-computer interaction (HCI), offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument in expressive musical performance can provide valuable insight for HCI researchers interested in applying these forms of deep interaction to other fields. Despite the longstanding connection between music and HCI, the relationship is not an automatic one, and its history arguably points to as many differences as overlaps. Music research and HCI research both encompass broad issues and draw on a wide range of methods. In this chapter I discuss how the concept of embodied interaction offers one way to think about music interaction. I propose that the three “paradigms” of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI. I use this conceptual framework to discuss three musical projects: Haptic Wave, Form Follows Sound, and BioMuse.

Fig. 9.1, Fig. 9.2, Fig. 9.3 (credit ZKM ONUK)

Notes

  1. An exemplar of the emergent complexity afforded by analogue synthesizer patching is heard in Douglas Leedy’s “Entropical Paradise” (Strange 1983).

  2. https://www.gold.ac.uk/interaction/.

  3. See also Chap. 11 in this volume for an educational application of brainwave biosignals (Yuksel 2019).

  4. http://bitalino.com.

References

  • Akkermans V, Font F, Funollet J, De Jong B, Roma G, Togias S, Serra X (2011) Freesound 2: an improved platform for sharing audio clips. In: Proceedings of the international society for music information retrieval conference (ISMIR)
  • Bannon LJ (1995) From human factors to human actors: the role of psychology and human-computer interaction studies in system design. In: Readings in human–computer interaction. Elsevier, pp 205–214
  • Bødker S (2006) When second wave HCI meets third wave challenges. In: Proceedings of the 4th Nordic conference on human-computer interaction: changing roles. ACM, pp 1–8
  • Bødker S (2015) Third-wave HCI, 10 years later—participation and sharing. Interactions 22:24–31
  • Caramiaux B, Altavilla A, Pobiner SG, Tanaka A (2015a) Form follows sound: designing interactions from sonic memories. In: Proceedings of the 33rd annual ACM conference on human factors in computing systems. ACM, pp 3943–3952
  • Caramiaux B, Donnarumma M, Tanaka A (2015b) Understanding gesture expressivity through muscle sensing. ACM Trans Comput Hum Interact (TOCHI) 21:31
  • Chowning JM (1973) The synthesis of complex audio spectra by means of frequency modulation. J Audio Eng Soc 21:526–534
  • da Silva HP, Fred A, Martins R (2014) Biosignals for everyone. IEEE Pervasive Comput 13:64–71
  • Donnarumma M (2016) Corpus nil
  • Dourish P (2004) Where the action is: the foundations of embodied interaction. MIT Press
  • Fallman D (2003) Design-oriented human-computer interaction. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’03. ACM, New York, NY, USA, pp 225–232. https://doi.org/10.1145/642611.642652
  • Flanagan JC (1954) The critical incident technique. Psychol Bull 51:327
  • Franinović K, Serafin S (2013) Sonic interaction design. MIT Press
  • Gaver WW (1989) The SonicFinder: an interface that uses auditory icons. Hum Comput Interact 4:67–94
  • Gaver B (2014) Third wave HCI: methods, domains and concepts. http://cordis.europa.eu/result/rcn/178889_en.html. Accessed 17 May 2017
  • Gaver B, Dunne T, Pacenti E (1999) Design: cultural probes. Interactions 6:21–29
  • Gibson JJ (1986) The ecological approach to visual perception. Psychology Press
  • Harrison S, Tatar D, Sengers P (2007) The three paradigms of HCI. In: alt.chi session at the SIGCHI conference on human factors in computing systems, San Jose, California, USA, pp 1–18
  • Hutchins EL, Hollan JD, Norman DA (1985) Direct manipulation interfaces. Hum Comput Interact 1:311–338
  • Kaufmann P, Englehart K, Platzner M (2010) Fluctuating EMG signals: investigating long-term effects of pattern matching algorithms. In: Proceedings of the 2010 annual international conference of the IEEE engineering in medicine and biology society, pp 6357–6360. https://doi.org/10.1109/IEMBS.2010.5627288
  • Knapp RB, Jaimovich J, Coghlan N (2009) Measurement of motion and emotion during musical performance. In: 3rd international conference on affective computing and intelligent interaction and workshops (ACII 2009). IEEE, pp 1–5
  • Krischkowsky A, Maurer B, Tscheligi M (2016) Captology and technology appropriation: unintended use as a source for designing persuasive technologies. In: International conference on persuasive technology. Springer, pp 78–83
  • Lopes P, Baudisch P (2013) Muscle-propelled force feedback: bringing force feedback to mobile devices. In: Proceedings of the SIGCHI conference on human factors in computing systems. ACM, pp 2577–2580
  • Lucier A (1982) Music for solo performer. Lovely Music
  • Lyon E, Knapp RB, Ouzounian G (2014) Compositional and performance mapping in computer chamber music: a case study. Comput Music J 38:64–75
  • Mackay WE (2004) The interactive thread: exploring methods for multi-disciplinary design. In: Proceedings of the 5th conference on designing interactive systems: processes, practices, methods, and techniques. ACM, pp 103–112
  • Mathews MV (1963) The digital computer as a musical instrument. Science 142:553–557. https://doi.org/10.1126/science.142.3592.553
  • Maturana HR, Varela FJ (1987) The tree of knowledge: the biological roots of human understanding. New Science Library/Shambhala Publications
  • McCartney J (2002) Rethinking the computer music language: SuperCollider. Comput Music J 26:61–68
  • Metatla O, Martin F, Parkinson A, Bryan-Kinns N, Stockman T, Tanaka A (2016) Audio-haptic interfaces for digital audio workstations. J Multimodal User Interfaces 10:247–258
  • Milner-Brown HS, Stein RB (1975) The relation between the surface electromyogram and muscular force. J Physiol 246:549
  • Parkinson A, Cameron D, Tanaka A (2015) Haptic Wave: presenting the multiple voices, artefacts and materials of a design research project. In: Proceedings of the 2nd biennial research through design conference
  • Phinyomark A, Phukpattaranont P, Limsakul C (2012) Feature reduction and selection for EMG signal classification. Expert Syst Appl 39:7420–7431
  • Poupyrev I, Lyons MJ, Fels S, Blaine T (2001) New interfaces for musical expression. In: CHI ’01 extended abstracts on human factors in computing systems, CHI EA ’01. ACM, New York, NY, USA, pp 491–492. https://doi.org/10.1145/634067.634348
  • Puckette MS (1997) Pure Data. In: Proceedings of the international computer music conference. International Computer Music Association, San Francisco, pp 224–227
  • Risset J-C, Mathews MV (1969) Analysis of musical-instrument tones. Phys Today 22:23–30
  • Roedl D, Bardzell S, Bardzell J (2015) Sustainable making? Balancing optimism and criticism in HCI discourse. ACM Trans Comput Hum Interact (TOCHI) 22:15
  • Saponas TS, Tan DS, Morris D, Turner J, Landay JA (2010) Making muscle-computer interfaces more practical. In: Proceedings of the SIGCHI conference on human factors in computing systems, CHI ’10. ACM, New York, NY, USA, pp 851–854. https://doi.org/10.1145/1753326.1753451
  • Shneiderman B (1982) The future of interactive systems and the emergence of direct manipulation. Behav Inf Technol 1:237–256
  • Small C (2011) Musicking: the meanings of performing and listening. Wesleyan University Press
  • Smith JO (1992) Physical modeling using digital waveguides. Comput Music J 16:74–91
  • Smith M (2005) Stelarc: the monograph. MIT Press
  • Strange A (1983) Electronic music: systems, techniques, and controls. William C Brown Pub
  • Strawn J (1988) Implementing table lookup oscillators for music with the Motorola DSP56000 family. In: 85th audio engineering society convention. Audio Engineering Society
  • Suchman LA (1987) Plans and situated actions: the problem of human-machine communication. Cambridge University Press
  • Tanaka A (1993) Musical technical issues in using interactive instrument technology with application to the BioMuse. In: Proceedings of the international computer music conference. International Computer Music Association, p 124
  • Tanaka A (2012) The use of electromyogram signals (EMG) in musical performance. eContact! 14
  • Tanaka A (2017) Myogram. MetaGesture Music CD. Goldsmiths Press/NX Records
  • Tanaka A, Donnarumma M (2018) The body as musical instrument. In: Kim Y, Gilman S (eds) The Oxford handbook on music and the body. Oxford University Press
  • Tanaka A, Parkinson A (2016) Haptic Wave: a cross-modal interface for visually impaired audio producers. In: Proceedings of the 2016 CHI conference on human factors in computing systems. ACM, pp 2150–2161. https://doi.org/10.1145/2858036.2858304
  • Vercoe B (1996) Extended Csound. In: Proceedings of the international computer music conference. International Computer Music Association, pp 141–142
  • Wang G et al (2003) ChucK: a concurrent, on-the-fly, audio programming language. In: Proceedings of the international computer music conference (ICMC)
  • Weiser M (1991) The computer for the 21st century. Sci Am 265:94–104
  • Wenger E (1998) Communities of practice: learning, meaning, and identity. Cambridge University Press
  • Yuksel BF, Oleson KB, Chang R, Jacob RJK (2019) Detecting and adapting to users’ cognitive and affective state to develop intelligent musical interfaces. In: Holland S, Mudd T, Wilkie-McKenna K, McPherson A, Wanderley MM (eds) New directions in music and human-computer interaction. Springer, London. ISBN 978-3-319-92069-6
  • Zicarelli D (2002) How I learned to love a program that does nothing. Comput Music J 26:44–51

Acknowledgements

The research reported here has received generous public funding. The MetaGesture Music project was supported by the European Research Council under the European Union’s Seventh Framework Programme (FP/2007-2013), ERC Grant Agreement no. FP7-283771. The Design Patterns for Inclusive Collaboration (DePIC) project was supported by the UK Engineering and Physical Sciences Research Council (EP/J018120/1). These projects were team efforts representing personal and institutional collaboration, resulting in multi-authored publications reporting their results. I would like to thank my collaborators and previous co-authors for the original work that led up to the synthesis reported here.

Author information

Corresponding author

Correspondence to Atau Tanaka.

Copyright information

© 2019 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Tanaka, A. (2019). Embodied Musical Interaction. In: Holland, S., Mudd, T., Wilkie-McKenna, K., McPherson, A., Wanderley, M. (eds) New Directions in Music and Human-Computer Interaction. Springer Series on Cultural Computing. Springer, Cham. https://doi.org/10.1007/978-3-319-92069-6_9

  • DOI: https://doi.org/10.1007/978-3-319-92069-6_9

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-92068-9

  • Online ISBN: 978-3-319-92069-6

  • eBook Packages: Computer Science (R0)