Real Time Tracking and Visualisation of Musical Expression

  • Conference paper
Music and Artificial Intelligence (ICMAI 2002)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2445)

Abstract

Skilled musicians are able to shape a given piece of music (by continuously modulating aspects like tempo, loudness, etc.) to communicate high-level information such as musical structure and emotion. This activity is commonly referred to as expressive music performance. This paper presents another step towards the automatic high-level analysis of this elusive phenomenon with AI methods. A system is presented that is able to measure the tempo and dynamics of a musical performance and to track their development over time. The system accepts raw audio input, tracks tempo and dynamics changes in real time, and displays the development of these expressive parameters in an intuitive and aesthetically appealing graphical format which provides insight into the expressive patterns applied by skilled artists. The paper describes the tempo tracking algorithm (based on a new clustering method) in detail, and then presents an application of the system to the analysis of performances by different pianists.
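
As a rough illustration of the idea named in the abstract, the sketch below derives tempo hypotheses by clustering the inter-onset intervals (IOIs) of detected note onsets. It is a minimal sketch under assumed details: the function name cluster_iois, the greedy agglomeration scheme, and the tolerance value width are illustrative choices only, not the paper's actual clustering method, onset detector, or parameter values.

    def cluster_iois(onset_times, width=0.04):
        """Cluster inter-onset intervals (IOIs) into tempo hypotheses.

        Minimal sketch under assumed parameters; not the paper's algorithm.
        onset_times are detected note onsets in seconds, width is an
        assumed clustering tolerance in seconds.
        """
        # IOIs between successive detected onsets.
        iois = [t2 - t1 for t1, t2 in zip(onset_times, onset_times[1:])]

        # Greedy agglomeration: each IOI joins the cluster whose running mean
        # lies within `width`, otherwise it starts a new cluster.
        clusters = []  # each cluster is [sum_of_intervals, count]
        for ioi in sorted(iois):
            for c in clusters:
                if abs(ioi - c[0] / c[1]) < width:
                    c[0] += ioi
                    c[1] += 1
                    break
            else:
                clusters.append([ioi, 1])

        # Rank hypotheses by the number of supporting intervals; report as BPM.
        clusters.sort(key=lambda c: -c[1])
        return [(60.0 * c[1] / c[0], c[1]) for c in clusters]


    # Onsets spaced roughly 0.5 s apart yield a top hypothesis near 120 BPM.
    print(cluster_iois([0.0, 0.49, 1.01, 1.50, 2.02, 2.51]))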

References

  1. A.T. Cemgil, B. Kappen, P. Desain, and H. Honing, ‘On tempo tracking: Tempogram representation and Kalman filtering’, in Proceedings of the 2000 International Computer Music Conference, pp. 352–355, San Francisco CA, (2000). International Computer Music Association.

    Google Scholar 

  2. P. Desain and H. Honing, ‘Quantization of musical time: A connectionist approach’, Computer Music Journal, 13(3), 56–66, (1989).

    Article  Google Scholar 

  3. S. Dixon, ‘Automatic extraction of tempo and beat from expressive performances’, Journal of New Music Research, 30(1), 39–58, (2001).

    Article  Google Scholar 

  4. S. Dixon and E. Cambouropoulos, ‘Beat tracking with musical knowledge’, in ECAI 2000: Proceedings of the 14th European Conference on Artificial Intelligence, pp. 626–630, Amsterdam, (2000). IOS Press.

    Google Scholar 

  5. M. Goto and Y. Muraoka, ‘Areal-time beat tracking system for audio signals’, in Proceedings of the International Computer Music Conference, pp. 171–174, San Francisco CA, (1995). International Computer Music Association.

    Google Scholar 

  6. M. Goto and Y. Muraoka, ‘Real-time beat tracking for drumless audio signals’, Speech Communication, 27(3–4), 331–335, (1999).

    Article  Google Scholar 

  7. J. Langner and W. Goebl, ‘Representing expressive performance in tempo-loudness space’, in Proceedings of the ESCOM 10th Anniversary Conference on Musical Creativity, Liège, Belgium, (2002).

    Google Scholar 

  8. E.W. Large and J.F. Kolen, ‘Resonance and the perception of musical meter’, Connection Science, 6, 177–208, (1994).

    Article  Google Scholar 

  9. C.S. Lee, ‘The perception of metrical structure: Experimental evidence and a model’, in Representing Musical Structure, eds., P. Howell, R. West, and I. Cross, 59–127, Academic Press, San Diego CA, (1991).

    Google Scholar 

  10. H.C. Longuet-Higgins, Mental Processes, MIT Press, Cambridge MA, 1987.

    Google Scholar 

  11. D. Rosenthal, ‘Emulation of human rhythm perception’, Computer Music Journal, 16(1), 64–76, (1992).

    Article  Google Scholar 

  12. E.D. Scheirer, ‘Tempo and beat analysis of acoustic musical signals’, Journal of the Acoustical Society of America, 103(1), 588–601, (1998).

    Article  Google Scholar 

  13. G. Widmer, ‘What is it that makes it a Horowitz? Empirical musicology via machine learning’, in Proceedings of the 12th European Conference on Artificial Intelligence (ECAI-96), Chichester, UK, (1996).Wiley & Sons.

    Google Scholar 

Download references

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Dixon, S., Goebl, W., Widmer, G. (2002). Real Time Tracking and Visualisation of Musical Expression. In: Anagnostopoulou, C., Ferrand, M., Smaill, A. (eds) Music and Artificial Intelligence. ICMAI 2002. Lecture Notes in Computer Science, vol 2445. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45722-4_7

  • DOI: https://doi.org/10.1007/3-540-45722-4_7

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-44145-8

  • Online ISBN: 978-3-540-45722-0

  • eBook Packages: Springer Book Archive
