
New Interfaces for Classifying Performance Gestures in Music

  • Chris Rhodes
  • Richard Allmendinger
  • Ricardo Climent
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11872)

Abstract

Interactive machine learning (ML) allows a music performer to digitally represent musical actions (via gestural interfaces) and affect their musical output in real time. Processing musical actions (termed performance gestures) with ML is useful because it can predict and map often-complex biometric data. ML models can therefore be used to create novel interactions with musical systems, game engines and networked analogue devices. Wekinator is free, open-source ML software (based on the Waikato Environment for Knowledge Analysis, or WEKA, framework) that has been widely used since 2009 to build supervised predictive models when developing real-time interactive systems, owing to its accessible format (a graphical user interface, or GUI) and its simplified approach to ML. Significantly, it allows models to be trained by demonstration via gestural interfaces. However, Wekinator offers the user several ML models with which to build predictive systems. This paper explores which of Wekinator's ML models are the most useful for predicting an output in the context of interactive music composition. We use two performance gestures for piano, with contrasting datasets, to train the available ML models, investigate compositional outcomes and frame the investigation. Our results show that ML model choice matters when mapping performance gestures, because mapping accuracy and behaviour differ markedly across Wekinator's ML models.
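To make the pipeline described above concrete, the following is a minimal sketch (not the authors' code) of how a gestural interface such as the Myo armband might be wired to Wekinator over OSC in Python, using the python-osc library and Wekinator's documented defaults: feature vectors are sent to /wek/inputs on port 6448, and the trained model's predictions are received from /wek/outputs on port 12000. The read_myo_features() helper is hypothetical and stands in for real sensor acquisition.

```python
# Minimal sketch of an interactive-ML loop with Wekinator over OSC.
# Assumes Wekinator's default ports and a hypothetical sensor reader.
import random
import threading
import time

from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
from pythonosc.udp_client import SimpleUDPClient


def read_myo_features():
    """Hypothetical stand-in for sensor acquisition: eight EMG channel values."""
    return [random.random() for _ in range(8)]


def handle_output(address, *values):
    """Receive the trained model's continuous outputs; in a real system these
    would be mapped to synthesis or game-engine parameters."""
    print(address, values)


def send_inputs():
    # Wekinator listens for feature vectors on port 6448 at /wek/inputs.
    client = SimpleUDPClient("127.0.0.1", 6448)
    while True:
        client.send_message("/wek/inputs", read_myo_features())
        time.sleep(0.02)  # roughly a 50 Hz frame rate


# Stream features in the background while listening for predictions.
threading.Thread(target=send_inputs, daemon=True).start()

dispatcher = Dispatcher()
dispatcher.map("/wek/outputs", handle_output)
BlockingOSCUDPServer(("127.0.0.1", 12000), dispatcher).serve_forever()
```

In Wekinator itself, training by demonstration then amounts to recording examples that pair these incoming feature vectors with desired output values, training the chosen model, and switching to run mode so predictions stream back to the listener above.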

Keywords

Interactive machine learning · Wekinator · Myo · HCI · Performance gestures · Interactive music · Gestural interfaces

Acknowledgements

This work was supported by the Engineering and Physical Sciences Research Council [2063473].


Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. NOVARS Research Centre, University of Manchester, Manchester, UK
  2. Alliance Manchester Business School (AMBS), University of Manchester, Manchester, UK
