
Embodied Musical Interaction

Body Physiology, Cross Modality, and Sonic Experience
  • Atau Tanaka
Chapter
Part of the Springer Series on Cultural Computing book series (SSCC)

Abstract

Music is a natural partner to human-computer interaction (HCI), offering tasks and use cases for novel forms of interaction. The richness of the relationship between a performer and their instrument in expressive musical performance can provide valuable insight to HCI researchers interested in applying such deep interaction to other fields. Yet despite the longstanding connection between music and HCI, the relationship is not an automatic one, and its history arguably points to as many differences as overlaps. Music research and HCI research each encompass broad issues and draw on a wide range of methods. In this chapter I discuss how the concept of embodied interaction can be one way to think about music interaction. I propose that the three "paradigms" of HCI and three design accounts from the interaction design literature can serve as a lens through which to consider types of music HCI. I use this conceptual framework to discuss three different musical projects: Haptic Wave, Form Follows Sound, and BioMuse.

Notes

Acknowledgements

The research reported here has received generous public funding. The MetaGesture Music project was supported by the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013), ERC Grant Agreement no. FP7-283771. The Design Patterns for Inclusive Collaboration (DePIC) project was supported by the UK Engineering and Physical Sciences Research Council, grant EP/J018120/1. These projects were team efforts representing personal and institutional collaboration, and their results were reported in multi-authored publications. I would like to thank my collaborators and previous co-authors for the original work that led to the synthesis reported here.


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Computing, Goldsmiths, University of London, London, UK
