Sound Feedback Assessment for Upper Limb Rehabilitation Using a Multimodal Guidance System

  • Mario Covarrubias Rodriguez
  • Mauro Rossini
  • Giandomenico Caruso
  • Gianluca Samali
  • Chiara Giovanzana
  • Franco Molteni
  • Monica Bordegoni
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9759)

Abstract

This paper describes the implementation of a Multimodal Guidance System (MGS) for upper limb rehabilitation through visual, haptic, and sound feedback. The system consists of a haptic device that physically renders the virtual path of 2D shapes through a point-based approach, while sound technology provides audio feedback about the patient’s actions during a manual task, for example when starting and/or finishing a sketch, or through different sounds related to the hand’s velocity while sketching. The goal of this sonification approach is to strengthen the patient’s understanding of the virtual shape used in the rehabilitation process and to inform the patient about attributes that could otherwise remain unnoticed. Our results provide conclusive evidence that using sound as additional feedback increases accuracy in the task operations.
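
The velocity-dependent sonification described above can be pictured with a small sketch. The following Python example is not the authors' implementation: the pitch range, the speed threshold, the start/finish cue tones, and the offline WAV rendering are illustrative assumptions. It simply maps hand speed to tone pitch and brackets the sketch with fixed cues.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100          # audio sample rate in Hz
F_MIN, F_MAX = 220.0, 880.0  # pitch range mapped onto hand speed (assumed)
V_MAX = 0.5                  # hand speed (m/s) that reaches the highest pitch (assumed)

def speed_to_frequency(speed):
    """Map hand speed linearly onto a pitch between F_MIN and F_MAX."""
    ratio = max(0.0, min(speed / V_MAX, 1.0))
    return F_MIN + ratio * (F_MAX - F_MIN)

def tone(frequency, duration=0.05, amplitude=0.4):
    """Generate one short sine-wave block at the given frequency."""
    n = int(SAMPLE_RATE * duration)
    return [amplitude * math.sin(2.0 * math.pi * frequency * i / SAMPLE_RATE)
            for i in range(n)]

def sonify_trajectory(speeds, path="sonification_demo.wav"):
    """Render a sequence of hand-speed samples as a WAV file.

    A fixed beep marks the start and the end of the sketch; in between,
    pitch follows the instantaneous hand speed.
    """
    samples = tone(440.0, 0.2)                      # start cue
    for v in speeds:
        samples += tone(speed_to_frequency(v))      # velocity-dependent pitch
    samples += tone(880.0, 0.2)                     # finish cue

    with wave.open(path, "w") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)                         # 16-bit PCM
        wav.setframerate(SAMPLE_RATE)
        frames = b"".join(struct.pack("<h", int(s * 32767)) for s in samples)
        wav.writeframes(frames)

if __name__ == "__main__":
    # Simulated hand-speed profile: slow start, faster middle, slow finish.
    profile = [0.05 + 0.4 * math.sin(math.pi * t / 100) for t in range(100)]
    sonify_trajectory(profile)
```

In a real-time system the same mapping would drive a synthesizer continuously from the haptic device's tracked velocity rather than writing a file offline.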

Keywords

Haptic guidance · Upper-limb rehabilitation · Sound interaction

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Mario Covarrubias Rodriguez (1)
  • Mauro Rossini (2)
  • Giandomenico Caruso (1)
  • Gianluca Samali (2)
  • Chiara Giovanzana (2)
  • Franco Molteni (2)
  • Monica Bordegoni (1)
  1. Dipartimento di Meccanica, Politecnico di Milano, Milan, Italy
  2. Valduce Hospital, Villa Beretta Rehabilitation Centre, Costa Masnaga, Italy
