Abstract

The meter of a musical excerpt provides high-level rhythmic information and is valuable in many music information retrieval tasks. We investigate a computationally efficient approach to metrical analysis based on a psycho-acoustically motivated decomposition of the audio signal. A two-stage comb filter-based approach, originally proposed for double/triple meter estimation, is extended to septuple meter (e.g., the 7/8 time signature), and its performance is evaluated on a sizable Indian music database. We find that the system works well for Indian music and that the distribution of musical stress/accents across a temporal grid can be exploited to obtain the metrical structure of audio automatically.
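The core idea of comb filter-based meter estimation can be illustrated with a minimal sketch. This is not the paper's two-stage system; it is a single-stage toy example, assuming a beat-synchronous accent signal is already available (one sample per beat), with candidate measure periods chosen for illustration. A feedback comb filter tuned to the true measure length resonates with the recurring stress pattern, so its steady-state output energy exceeds that of the other candidates.

```python
import numpy as np

def comb_meter_scores(accent, periods, alpha=0.8):
    """Score each candidate measure period (in beats) with a feedback
    comb filter; higher steady-state output energy indicates that the
    accent pattern repeats at that period."""
    scores = {}
    for T in periods:
        y = np.zeros(len(accent))
        for n in range(len(accent)):
            fb = y[n - T] if n >= T else 0.0
            # leaky resonator: reinforce input recurring every T beats
            y[n] = (1.0 - alpha) * accent[n] + alpha * fb
        # discard the initial transient, score steady-state energy
        scores[T] = float(np.mean(y[len(y) // 2:] ** 2))
    return scores

# Toy accent signal: every beat carries a weak accent, and every 7th
# beat a strong one -- a 7/8-like stress pattern (hypothetical data).
n_beats = 7 * 60
accent = np.ones(n_beats)
accent[::7] += 1.0

scores = comb_meter_scores(accent, periods=[2, 3, 4, 5, 7])
estimated = max(scores, key=scores.get)
print(estimated)  # the 7-beat comb resonates most strongly
```

Because the comb tuned to 7 beats aligns the strong accents with its own delayed feedback, it preserves the full energy of the pattern, while combs at non-divisor periods only smooth it out. A real system would first derive the accent signal from a sub-band decomposition of the audio, as the paper describes.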

Keywords

Meter detection · Indian music · complex meter · comb filtering

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Sankalp Gulati (1)
  • Vishweshwara Rao (1)
  • Preeti Rao (1)

  1. Department of Electrical Engineering, Indian Institute of Technology Bombay, Mumbai, India