Key Estimation in Electronic Dance Music

  • Ángel Faraldo
  • Emilia Gómez
  • Sergi Jordà
  • Perfecto Herrera
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9626)


In this paper we study key estimation in electronic dance music, an umbrella term referring to a variety of electronic music subgenres intended for dancing at nightclubs and raves. We start by defining notions of tonality and key before outlining the basic architecture of a template-based key estimation method. Then, we report on the tonal characteristics of electronic dance music, in order to infer possible modifications of the method described. We create new key profiles combining these observations with corpus analysis, and add two pre-processing stages to the basic algorithm. We conclude by comparing our profiles to existing ones, and testing our modifications on independent datasets of pop and electronic dance music, observing interesting improvements in the performance of our algorithms, and suggesting paths for future research.


Keywords: Music information retrieval · Computational key estimation · Key profiles · Electronic dance music · Tonality · Music theory
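The template-based architecture the abstract refers to can be sketched briefly. The snippet below is a minimal illustration, not the paper's algorithm: it assumes a precomputed 12-bin chroma (pitch-class) vector as input, and it uses the classic Krumhansl-Kessler key profiles as a stand-in for the EDM-specific profiles the paper derives. Each profile is rotated to every possible tonic, correlated with the chroma vector, and the best-matching rotation gives the key estimate.

```python
import numpy as np

# Krumhansl-Kessler key profiles (illustrative stand-ins; the paper
# derives its own profiles from an EDM corpus).
MAJOR = np.array([6.35, 2.23, 3.48, 2.33, 4.38, 4.09,
                  2.52, 5.19, 2.39, 3.66, 2.29, 2.88])
MINOR = np.array([6.33, 2.68, 3.52, 5.38, 2.60, 3.53,
                  2.54, 4.75, 3.98, 2.69, 3.34, 3.17])

NOTE_NAMES = ['C', 'C#', 'D', 'D#', 'E', 'F',
              'F#', 'G', 'G#', 'A', 'A#', 'B']

def estimate_key(chroma):
    """Correlate a 12-bin chroma vector with all 24 rotated key
    profiles and return the best match as a (tonic, mode) pair."""
    best_r, best_key = -2.0, None
    for profile, mode in ((MAJOR, 'major'), (MINOR, 'minor')):
        for tonic in range(12):
            # Rotate so that profile index 0 lands on this tonic.
            rotated = np.roll(profile, tonic)
            r = np.corrcoef(chroma, rotated)[0, 1]
            if r > best_r:
                best_r, best_key = r, (NOTE_NAMES[tonic], mode)
    return best_key

# Toy chroma vector emphasising the C major scale degrees:
chroma = np.array([1.0, 0.1, 0.5, 0.1, 0.6, 0.6,
                   0.1, 0.9, 0.1, 0.5, 0.1, 0.4])
print(estimate_key(chroma))  # → ('C', 'major')
```

In a full system this step would be preceded by audio analysis (tuning-frequency estimation, spectral-peak chroma computation) and, per the paper, by additional pre-processing stages before template matching.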



Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Ángel Faraldo
  • Emilia Gómez
  • Sergi Jordà
  • Perfecto Herrera
  1. Music Technology Group, Universitat Pompeu Fabra, Barcelona, Spain
