Automatic Classification of Guitar Playing Modes

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8905)


When improvising, musicians typically alternate between several playing modes on their instruments. Guitarists in particular alternate between modes such as octave playing, mixed chords and bass, chord comping, solo melodies, and walking bass. Robust interactive musical systems call for precise real-time detection of these playing modes. In this context, the accuracy of mode classification is critical because it underlies the design of the whole interaction taking place. In this paper, we present an accurate and robust playing-mode classifier for guitar audio signals. Our classifier distinguishes between three modes routinely used in jazz improvisation: bass, solo melodic improvisation, and chords. Our method uses a supervised classification technique applied to a large corpus of training data, recorded with different guitars (electric, jazz, nylon-string, electro-acoustic). We detail our method and experimental results over various data sets. We show in particular that the performance of our classifier is comparable to that of a MIDI-based classifier. We describe the application of the classifier to live interactive musical systems and discuss the limitations and possible extensions of this approach.


Keywords: Audio classification · Playing mode · Guitar · Interactive musical systems
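The supervised approach sketched in the abstract can be illustrated with a toy nearest-centroid classifier over hand-made feature vectors. The two features used here (a spectral-centroid-like value in Hz and a polyphony score) and all numeric values are illustrative assumptions for this sketch, not the paper's actual feature set or classifier:

```python
# Toy sketch of supervised playing-mode classification (bass / melody / chords).
# Feature vectors and the nearest-centroid rule are illustrative assumptions.
import math

MODES = ["bass", "melody", "chords"]

def train_centroids(samples):
    """samples: list of (feature_vector, mode_label) pairs.
    Returns the mean feature vector (centroid) of each mode."""
    centroids = {}
    for mode in MODES:
        vecs = [x for x, label in samples if label == mode]
        centroids[mode] = [sum(col) / len(vecs) for col in zip(*vecs)]
    return centroids

def classify(centroids, x):
    """Assign a feature vector to the mode with the nearest centroid."""
    return min(MODES, key=lambda mode: math.dist(x, centroids[mode]))

# Toy training corpus: bass frames are low-pitched and monophonic, melody
# frames are high-pitched and monophonic, chord frames are polyphonic.
training = [
    ([100.0, 0.1], "bass"),   ([120.0, 0.2], "bass"),
    ([800.0, 0.1], "melody"), ([900.0, 0.2], "melody"),
    ([400.0, 0.9], "chords"), ([450.0, 0.8], "chords"),
]

centroids = train_centroids(training)
print(classify(centroids, [110.0, 0.15]))  # → bass
```

In the actual system, the feature vectors would come from frame-level audio analysis (pitch, onset, and spectral descriptors) and the decision rule would be a classifier trained on the large recorded corpus described in the paper, rather than hand-made centroids.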



This research was conducted within the Flow Machines project, which received funding from the European Research Council under the European Union's Seventh Framework Programme (FP/2007-2013)/ERC Grant Agreement no. 291156.



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Sony Computer Science Laboratory, Paris, France