Dynamic Mapping Strategies for Expressive Synthesis Performance and Improvisation

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5493)


Real-time musical expression through synthesis is notoriously difficult: the full potential of a sound engine is traditionally available only at design time. This paper presents two mapping strategies that address this problem. Based on dynamic and random many-to-many mappings between control space and synthesis space, they erase the line between sound editing and synthesizer performance, with an emphasis on free improvisation. One strategy adds random vectors in synthesis space, weighted by control parameters. The other is based on a gravity analogy, interpolating between random points in synthesis space, with the gravities controlled by player interaction. Vectors and point sets can be scaled and shifted during performance, allowing dynamic exploration of vast sound spaces or minute timbral control. The mappings have been adapted to a wide range of musical interfaces. Together with suitable sound engines, they have yielded surprisingly expressive performance instruments, which have been used in regular rehearsals, concerts, and recording sessions over the last two years.
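The two strategies can be sketched in a few lines of NumPy. This is an illustrative reconstruction from the abstract's description only, not the authors' implementation: all names, dimensions, and parameter ranges below are assumptions. The vector model sums fixed random directions weighted by the current control values; the gravity model forms a weighted average of fixed random points, with the weights ("gravities") driven by player interaction.

```python
import numpy as np

rng = np.random.default_rng(0)

N_CONTROLS = 4   # dimensionality of the control space (assumed, e.g. fader values)
N_SYNTH = 16     # dimensionality of the synthesis parameter space (assumed)

# --- Vector model: each control weights a fixed random vector in synthesis space ---
origin = rng.uniform(0.0, 1.0, N_SYNTH)                  # the starting sound
vectors = rng.uniform(-1.0, 1.0, (N_CONTROLS, N_SYNTH))  # one random vector per control

def vector_map(controls, scale=1.0):
    """Sum the random vectors weighted by control values; `scale` lets the
    player zoom between vast exploration and minute timbral control."""
    p = origin + scale * (np.asarray(controls, dtype=float) @ vectors)
    return np.clip(p, 0.0, 1.0)  # keep synthesis parameters in a valid range

# --- Gravity model: interpolate between fixed random points in synthesis space ---
points = rng.uniform(0.0, 1.0, (N_CONTROLS, N_SYNTH))    # one random point per control

def gravity_map(gravities, eps=1e-9):
    """Weighted average of the random points; each gravity pulls the sound
    toward its point. `eps` avoids division by zero when all gravities are 0."""
    w = np.asarray(gravities, dtype=float) + eps
    return (w @ points) / w.sum()
```

With all controls at zero, `vector_map` returns the origin sound; setting a single gravity to 1 pulls `gravity_map` onto the corresponding point, and intermediate gravities interpolate between points, which matches the many-to-many behaviour described above.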







Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  1. Dept. of Applied Information Technology, University of Gothenburg and Chalmers University of Technology, SE-41296 Göteborg, Sweden
  2. Academy of Music and Drama, University of Gothenburg, Sweden
