
Mediated Interactions and Musical Expression—A Survey

Chapter in: Digital Da Vinci (Springer, 2014)

Abstract

This chapter surveys the field of technologically mediated musical interaction and technologically enhanced musical expression. We examine several new technologies that enable new forms of musical expression and interaction, explore the micro-coordination that occurs in collaborative musical performance, and consider the preconditions for human-agent interaction through co-creative agents. The survey collects a number of insights that can help us create better technological artifacts for musical expression and collaboration.



Author information

Correspondence to Dennis Reidsma.


Copyright information

© 2014 Springer Science+Business Media New York

About this chapter

Cite this chapter

Reidsma, D., Radha, M., Nijholt, A. (2014). Mediated Interactions and Musical Expression—A Survey. In: Lee, N. (ed) Digital Da Vinci. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-0536-2_4


  • DOI: https://doi.org/10.1007/978-1-4939-0536-2_4

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4939-0535-5

  • Online ISBN: 978-1-4939-0536-2

  • eBook Packages: Computer Science, Computer Science (R0)
