Adaptive Interface for Mapping Body Movements to Sounds

Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10783)

Abstract

Contemporary digital musical instruments offer an abundance of means to generate sound. Although they surpass traditional instruments in producing a unique audio-visual act, there is still an unmet need for digital instruments that let performers generate sounds through movement in an intuitive manner. A key requirement for an authentic digital music act is low latency between a movement (the user command) and the corresponding sound. Here we present such a low-latency interface that maps the user’s kinematic actions onto sound samples. The interface relies on wireless sensor nodes equipped with inertial measurement units (IMUs) and on a real-time algorithm for the early detection and classification of a variety of movements/gestures performed by the user. The core algorithm performs approximate inference in a hierarchical generative model with piecewise-linear dynamical components; importantly, the model’s structure is derived from a set of motion gestures. The performance of this Bayesian algorithm was compared against the k-nearest neighbors (k-NN) algorithm, which, in a pre-testing phase, had shown the highest classification accuracy among several existing state-of-the-art algorithms. On almost all evaluation metrics, the proposed probabilistic algorithm outperformed k-NN.
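
The abstract describes the approach only at a high level; the full model and inference procedure are given in the paper itself. As a rough illustrative sketch, not the authors' implementation, the Python snippet below shows two strongly simplified ingredients of such a gesture-classification pipeline: a class-conditional first-order linear dynamical (autoregressive) model used to score IMU trials by likelihood, standing in for the much richer hierarchical piecewise-linear generative model mentioned above, and a distance-weighted k-NN baseline of the kind the Bayesian algorithm was compared against. All names (fit_ar_model, ar_log_likelihood, classify_dynamical, knn_baseline), the first-order dynamics, and the hyperparameters are assumptions made for illustration.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier


def fit_ar_model(trials, reg=1e-3):
    """Fit a first-order linear model x_t = A x_{t-1} + b for one gesture class.

    trials: list of (T_i, d) arrays of IMU samples recorded for that gesture.
    Returns (A, b, Sigma) estimated by ridge-regularised least squares,
    where Sigma is the covariance of the one-step prediction residuals.
    """
    X_prev = np.vstack([tr[:-1] for tr in trials])          # (N, d)
    X_next = np.vstack([tr[1:] for tr in trials])           # (N, d)
    Z = np.hstack([X_prev, np.ones((len(X_prev), 1))])      # bias column
    W = np.linalg.solve(Z.T @ Z + reg * np.eye(Z.shape[1]), Z.T @ X_next)
    A, b = W[:-1].T, W[-1]
    resid = X_next - Z @ W
    Sigma = np.cov(resid.T) + reg * np.eye(resid.shape[1])
    return A, b, Sigma


def ar_log_likelihood(trial, model):
    """Gaussian innovation log-likelihood of one trial under one class model."""
    A, b, Sigma = model
    resid = trial[1:] - (trial[:-1] @ A.T + b)
    d = resid.shape[1]
    inv = np.linalg.inv(Sigma)
    _, logdet = np.linalg.slogdet(Sigma)
    quad = np.einsum('ti,ij,tj->t', resid, inv, resid)       # Mahalanobis terms
    return -0.5 * np.sum(quad + logdet + d * np.log(2 * np.pi))


def classify_dynamical(trial, models):
    """Assign the trial to the gesture class whose dynamics explain it best."""
    return max(models, key=lambda c: ar_log_likelihood(trial, models[c]))


def knn_baseline(train_windows, train_labels, k=5):
    """Distance-weighted k-NN baseline on flattened fixed-length IMU windows."""
    clf = KNeighborsClassifier(n_neighbors=k, weights='distance')
    clf.fit(train_windows.reshape(len(train_windows), -1), train_labels)
    return clf
```

Under these assumptions, each gesture class c is summarised by its fitted dynamics (A_c, b_c, Sigma_c), and an incoming trial is assigned to the class with the highest innovation log-likelihood; the k-NN baseline instead matches fixed-length feature windows directly in sample space.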

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  1. Technische Universität Dresden, Dresden, Germany
  2. Lund University, Lund, Sweden
  3. Faculty of Electrical Engineering, University of Belgrade, Belgrade, Serbia