Using Autonomous Agents to Improvise Music Compositions in Real-Time

Conference paper

DOI: 10.1007/978-3-319-55750-2_8

Part of the Lecture Notes in Computer Science book series (LNCS, volume 10198)
Cite this paper as:
Hutchings P., McCormack J. (2017) Using Autonomous Agents to Improvise Music Compositions in Real-Time. In: Correia J., Ciesielski V., Liapis A. (eds) Computational Intelligence in Music, Sound, Art and Design. EvoMUSART 2017. Lecture Notes in Computer Science, vol 10198. Springer, Cham


This paper outlines an approach to real-time music generation using melody- and harmony-focused agents in a process inspired by jazz improvisation. A harmony agent employs a Long Short-Term Memory (LSTM) artificial neural network trained on the chord progressions of 2986 jazz ‘standard’ compositions, using a network structure novel to chord sequence analysis. The melody agent uses a rule-based system that manipulates provided, pre-composed melodies to improvise new themes and variations. The agents take turns leading the direction of the composition based on a rating system that rewards harmonic consistency and melodic flow. In developing the multi-agent system, it was found that using embedding spaces in the LSTM encoding process significantly improved chord sequence learning.
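The harmony agent's core idea of feeding dense chord embeddings (rather than sparse one-hot vectors) into an LSTM can be sketched as follows. This is a minimal illustration, not the paper's implementation: the chord vocabulary, dimensions, and single-cell NumPy forward pass are all assumptions made for clarity.

```python
import numpy as np

# Hypothetical chord vocabulary; the paper's actual chord encoding is not
# specified in the abstract.
CHORDS = ["Cmaj7", "Dm7", "G7", "Am7"]
VOCAB = len(CHORDS)
EMBED_DIM = 8   # size of the learned embedding space (assumed)
HIDDEN = 16     # LSTM hidden units (assumed)

rng = np.random.default_rng(0)

# Embedding table: each chord symbol maps to a dense vector, replacing a
# sparse one-hot input to the LSTM -- the improvement noted in the abstract.
E = rng.normal(scale=0.1, size=(VOCAB, EMBED_DIM))

# Single LSTM cell: input, forget, cell, and output gate weights stacked.
W = rng.normal(scale=0.1, size=(4 * HIDDEN, EMBED_DIM + HIDDEN))
b = np.zeros(4 * HIDDEN)

# Output projection from hidden state back to next-chord logits.
W_out = rng.normal(scale=0.1, size=(VOCAB, HIDDEN))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c):
    """One LSTM step on an embedded chord vector x."""
    z = W @ np.concatenate([x, h]) + b
    i, f, g, o = np.split(z, 4)
    c = sigmoid(f) * c + sigmoid(i) * np.tanh(g)
    h = sigmoid(o) * np.tanh(c)
    return h, c

def next_chord_logits(progression):
    """Run a chord progression through embedding lookup + LSTM; return
    unnormalised scores over the chord vocabulary for the next chord."""
    h = np.zeros(HIDDEN)
    c = np.zeros(HIDDEN)
    for chord in progression:
        x = E[CHORDS.index(chord)]  # dense embedding lookup
        h, c = lstm_step(x, h, c)
    return W_out @ h

logits = next_chord_logits(["Dm7", "G7"])
```

In a trained system, `E`, `W`, `b`, and `W_out` would be learned jointly by backpropagation over the corpus of standards; here they are random, so the logits only demonstrate the data flow.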


Keywords: Multi-agent systems · Music composition · Artificial neural networks

Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  1. sensiLab, Faculty of Information Technology, Monash University, Caulfield East, Australia
