Learning Movement Kinematics with a Targeted Sound

Part of the Lecture Notes in Computer Science book series (LNISA, volume 8905)

Abstract

This study introduces an experiment designed to analyze sensorimotor adaptation to a motion-based sound synthesis system. We investigated a sound-oriented learning task, namely reproducing a targeted sound. The motion of a small handheld object was used to control a sound synthesizer. The object's angular velocity was measured by a gyroscope and transmitted wirelessly in real time to the sound system. The targeted sound was reached when the motion matched a given reference angular velocity profile with a given accuracy. An incorrect velocity profile produced either a noisier sound or a sound with a louder high harmonic, depending on the sign of the velocity error. The results showed that the participants were generally able to learn to reproduce sounds very close to the targeted sound. A corresponding motor adaptation was also found to occur, to varying degrees, in most participants when the profile was altered.
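The error-to-sound mapping described above can be sketched as a simple classifier over the velocity error. This is an illustrative sketch only: the function name, the tolerance parameter, and the sign convention (which sign of the error yields the noisier sound versus the louder high harmonic) are assumptions, not details taken from the authors' implementation.

```python
def feedback_mode(measured_velocity, reference_velocity, tolerance=0.1):
    """Classify one velocity sample against the reference profile.

    Hypothetical sketch of the feedback regime described in the abstract;
    the tolerance value and sign convention are assumptions.
    """
    error = measured_velocity - reference_velocity
    # Within the accuracy band: the targeted sound is produced.
    if abs(error) <= tolerance * abs(reference_velocity):
        return "target"
    # Outside the band the sound degrades; the mapping of error sign to
    # degradation type is assumed here, as the abstract does not specify it.
    return "noisier" if error < 0 else "high-harmonic"
```

In this sketch, moving too slowly relative to the reference profile would degrade the sound one way and moving too fast the other, giving the participant a directional cue for correcting the movement.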

Keywords

  • Gesture
  • Sound
  • Sensorimotor
  • Learning
  • Adaptation
  • Interactive systems
  • Auditory feedback
  • Sound-oriented task


Notes

  1. Legos project, http://legos.ircam.fr.

  2. Modalys (Ircam), http://www.forumnet.ircam.fr/product/modalys. The object used is “MONO-STRING”; see the documentation for details.


Acknowledgements

We acknowledge support by ANR French National Research Agency, under the ANR-Blanc program 2011 (Legos project ANR- 11-BS02-012) and additional support from Cap Digital. We thank all the participants of Legos project for very fruitful discussions.

Author information


Correspondence to Eric O. Boyer.

Copyright information

© 2014 Springer International Publishing Switzerland

About this paper

Cite this paper

Boyer, E.O., Pyanet, Q., Hanneton, S., Bevilacqua, F. (2014). Learning Movement Kinematics with a Targeted Sound. In: Aramaki, M., Derrien, O., Kronland-Martinet, R., Ystad, S. (eds) Sound, Music, and Motion. CMMR 2013. Lecture Notes in Computer Science, vol. 8905. Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_14

  • DOI: https://doi.org/10.1007/978-3-319-12976-1_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-12975-4

  • Online ISBN: 978-3-319-12976-1

  • eBook Packages: Computer Science (R0)