Learning Movement Kinematics with a Targeted Sound

  • Eric O. Boyer
  • Quentin Pyanet
  • Sylvain Hanneton
  • Frédéric Bevilacqua
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8905)

Abstract

This study introduces an experiment designed to analyze sensorimotor adaptation to a motion-based sound synthesis system. We investigated a sound-oriented learning task, namely reproducing a targeted sound. The motion of a small handheld object was used to control a sound synthesizer. The object's angular velocity was measured by a gyroscope and transmitted wirelessly in real time to the sound system. The targeted sound was reached when the motion matched a given reference angular velocity profile with a given accuracy. An incorrect velocity profile produced either a noisier sound or a sound with a louder high harmonic, depending on the sign of the velocity error. The results showed that the participants were generally able to learn to reproduce sounds very close to the targeted sound. A corresponding motor adaptation was also found to occur, to varying degrees, in most of the participants when the reference profile was altered.
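
The abstract describes the error-to-sound mapping only in words. The following minimal sketch (Python, not from the paper) illustrates one way such a mapping could work: the angular-velocity error is compared against an accuracy window and, outside it, drives either a noise component or a high-harmonic gain depending on its sign. The function name, the tolerance value, the scaling, and which sign of the error produces noise versus a louder high harmonic are all assumptions made for illustration.

```python
def feedback_params(angular_velocity, reference_velocity, tolerance=5.0):
    """Illustrative sound-control parameters for one velocity sample (deg/s).

    Not the authors' synthesizer: parameter names, tolerance, scaling and the
    sign convention are assumptions for the sake of the example.
    """
    error = angular_velocity - reference_velocity
    if abs(error) <= tolerance:
        # Motion matches the reference profile within the accuracy window:
        # the clean, targeted sound.
        return {"noise_level": 0.0, "high_harmonic_gain": 0.0}
    # Scale the excess error into [0, 1] for the synthesizer control.
    amount = min((abs(error) - tolerance) / (4.0 * tolerance), 1.0)
    if error > 0:
        # Assumed convention: moving too fast adds noise.
        return {"noise_level": amount, "high_harmonic_gain": 0.0}
    # Assumed convention: moving too slowly boosts a high harmonic.
    return {"noise_level": 0.0, "high_harmonic_gain": amount}


# Example: a sample 12 deg/s above a 40 deg/s reference yields a noisy sound.
print(feedback_params(52.0, 40.0))
```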

Keywords

Gesture · Sound · Sensorimotor learning · Adaptation · Interactive systems · Auditory feedback · Sound-oriented task

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Eric O. Boyer (1, 2)
  • Quentin Pyanet (1)
  • Sylvain Hanneton (2)
  • Frédéric Bevilacqua (1)

  1. IRCAM, STMS-CNRS-UPMC, Paris, France
  2. Laboratoire de Psychologie de la Perception, UMR CNRS 8242, UFR STAPS, Université Paris Descartes, Paris, France