Digital Sensing of Musical Instruments

  • Peter Driessen
  • George Tzanetakis
Part of the Springer Handbooks book series (SPRINGERHAND)

Abstract

Acoustic musical instruments enable very rich and subtle control when used by experienced musicians. Musicology has traditionally focused on the analysis of scores and, more recently, audio recordings. However, most music from around the world is not notated, and many nuances of music performance are hard to recover from audio recordings. In this chapter, we describe hyperinstruments, i.e., acoustic instruments that are augmented with digital sensors for capturing performance information and, in some cases, offering additional playing possibilities. Direct sensors are integrated onto the physical instrument, possibly requiring modifications to it. Indirect sensors such as cameras and microphones can be used to analyze performer gestures without requiring any modification of the instrument. We describe representative case studies of hyperinstruments from our own research, as well as examples of the types of musicological analysis this approach enables, such as performer identification, microtiming analysis, and transcription. Until recently, hyperinstruments were mostly used for electroacoustic music creation, but we believe they hold significant potential for systematic musicological applications involving music performance analysis.

Abbreviations

  • 3-D: three-dimensional
  • ADC: analog-to-digital converter
  • EROSS: easily removable, wireless optical sensor system
  • FSR: force-sensing resistor
  • MIDI: musical instrument digital interface
  • MLP: multilayer perceptron
  • NB: naive Bayes
  • NIME: new interfaces for musical expression
  • OSC: open sound control
  • SMO: sequential minimal optimization
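Sensor readings captured from a hyperinstrument are typically transmitted to analysis or synthesis software over OSC (listed above). As an illustration only, here is a minimal Python-standard-library sketch of how one float sensor reading might be packed into a binary OSC 1.0 message; the address `/fsr/pressure` is a hypothetical example, not an address used in the chapter:

```python
import struct

def osc_message(address: str, value: float) -> bytes:
    """Pack a single float argument into a binary OSC 1.0 message."""
    def pad(b: bytes) -> bytes:
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        b += b"\x00"
        return b + b"\x00" * (-len(b) % 4)

    return (pad(address.encode("ascii"))
            + pad(b",f")                 # type tag string: one 32-bit float
            + struct.pack(">f", value))  # argument as big-endian IEEE 754

# e.g., a normalized pressure reading from an FSR on a drum pad (hypothetical)
msg = osc_message("/fsr/pressure", 0.5)
```

The resulting bytes would then be sent as a UDP datagram to the receiving application; in practice a library such as python-osc handles this encoding.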


Copyright information

© Springer-Verlag Berlin Heidelberg 2018

Authors and Affiliations

  1. Peter Driessen, Dept. of Electrical and Computer Engineering, University of Victoria, Victoria, Canada
  2. George Tzanetakis, Dept. of Computer Science, University of Victoria, Victoria, Canada
