Abstract
This chapter poses, and proposes some answers to, questions about the origins and nature of creativity when digital media take an active role in the music-making process. The discussion centers on François’ Mimi (Multimodal Interaction for Musical Improvisation) system, which enables a musician to seed the computer with musical ideas and then improvise atop recombinations of those ideas; the system provides the musician with visual foreknowledge of the machine’s intent and a review of the interaction. The discussion extends to the different instantiations of, and extensions to, the Mimi system, each designed with particular interaction nuances in mind, and each engendering new forms of creativity. We review each Mimi version, from the original blue-and-white silhouette display, to the Scriabin-inspired varicolored panels, to the multi-paneled, user-directed Mimi4x. In each scenario, we consider the impact of Mimi on the creative process and the resulting performance; specifically, we describe the interaction between a performer, the composer (when different from the performer), and the system, analyzing the techniques used to successfully negotiate a performance with Mimi and the formal musical structures that result from this interaction.
Keywords
- Improvisation System
- Visual Interface
- Musical Structure
- Musical Instrument Digital Interface
- Musical Content
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.
This chapter incorporates, in part and in modified form, material that has previously appeared in “Mimi4x: An Interactive Audio-visual Installation for High-level Structural Improvisation” (Alexandre R. J. François, Isaac Schankler and Elaine Chew, International Journal of Arts and Technology, vol. 6, no. 2, 2013), “Performer Centered Visual Feedback for Human-Machine Improvisation” (Alexandre R. J. François, Elaine Chew and Dennis Thurmond, ACM Computers in Entertainment, vol. 9, no. 3, November 2011, 13 pages), “Preparing for the Unpredictable: Identifying Successful Performance Strategies in Human-Machine Improvisation” (Isaac Schankler, Alexandre R. J. François and Elaine Chew, Proceedings of the International Symposium on Performance Science, Toronto, Canada, 24–27 August 2011), and “Emergent Formal Structures of Factor Oracle-Driven Musical Improvisations” (Isaac Schankler, Jordan L.B. Smith, Alexandre R. J. François and Elaine Chew, Proceedings of the International Conference on Mathematics and Computation in Music, Paris, France, 15–17 June 2011).
Acknowledgement
This material is based in part on work supported by the National Science Foundation under Grant No. 0347988. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.
Copyright information
© 2014 Springer Science+Business Media New York
Cite this chapter
Schankler, I., Chew, E., François, A. (2014). Improvising with Digital Auto-Scaffolding: How Mimi Changes and Enhances the Creative Process. In: Lee, N. (eds) Digital Da Vinci. Springer, New York, NY. https://doi.org/10.1007/978-1-4939-0536-2_5
DOI: https://doi.org/10.1007/978-1-4939-0536-2_5
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4939-0535-5
Online ISBN: 978-1-4939-0536-2
eBook Packages: Computer Science (R0)