Improvising with Digital Auto-Scaffolding: How Mimi Changes and Enhances the Creative Process



This chapter poses, and proposes some answers to, questions about the origins and nature of creativity when digital media take an active role in the music-making process. The discussion centers on François' Mimi (Multimodal Interaction for Musical Improvisation) system, which enables a musician to seed the computer with musical ideas and then improvise atop recombinations of those ideas; the system gives the musician visual foreknowledge of the machine's intent and a review of the interaction. The discussion extends to the different instantiations of, and extensions to, the Mimi system, which are designed with various interaction nuances in mind and engender new forms of creativity. We review each Mimi version, from the original blue-and-white silhouette display, to the Scriabin-inspired varicolored panels, to the multi-paneled, user-directed Mimi4x. In each scenario, we consider the impact of Mimi on the creative process and the resulting performance; specifically, we describe the interaction between a performer, the composer (when different from the performer), and the system, analyzing the techniques used to successfully negotiate a performance with Mimi and the formal musical structures that result from this interaction.





This material is based in part on work supported by the National Science Foundation under Grant No. 0347988. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.



Copyright information

© Springer Science+Business Media New York 2014

Authors and Affiliations

  1. Process Pool Music, Los Angeles, USA
  2. Queen Mary University of London, London, UK
  3. Interactions Intelligence, London, UK