GyGSLA: A Portable Glove System for Learning Sign Language Alphabet

  • Luís Sousa
  • João M. F. Rodrigues
  • Jânio Monteiro
  • Pedro J. S. Cardoso
  • Roberto Lam
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9739)

Abstract

Communication between people with normal hearing and those with hearing or speech impairments is difficult. Learning a new alphabet is not always easy, especially a sign language alphabet, which requires both manual dexterity and practice. This paper presents the GyGSLA system, a completely portable setup created to help inexperienced people learn a new sign language alphabet. To achieve this, a computer/mobile game interface and a hardware device, a wearable glove, were developed. When interacting with the computer or mobile device through the wearable glove, the user is asked to represent alphabet letters and digits by replicating the hand and finger positions shown on the screen. The glove sends the hand and finger positions to the computer/mobile device over a wireless interface, which interprets the letter or digit being made by the user and assigns it a corresponding score. The system was tested with three subjects with no sign language experience, achieving a 76% average recognition rate for the Portuguese sign language alphabet.
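
As a rough illustration of the interaction loop described above, the following sketch (in Python, using hypothetical packet fields, reference templates, and thresholds; the paper does not disclose its actual matching method) decodes one glove reading received over the wireless link and scores it against a stored template for the target letter or digit.

    # Minimal sketch of the glove-to-device loop described in the abstract.
    # All field names, templates, weights, and thresholds are assumptions
    # made for illustration; they are not taken from the paper.
    import json
    from dataclasses import dataclass

    @dataclass
    class GloveReading:
        """One wireless packet: five finger-flexion values plus hand orientation."""
        flex: list[float]   # normalized flexion per finger, 0.0 (open) .. 1.0 (closed)
        roll: float         # hand orientation, degrees
        pitch: float

    # Hypothetical reference templates for two letters of the manual alphabet.
    TEMPLATES = {
        "A": GloveReading(flex=[0.1, 0.9, 0.9, 0.9, 0.9], roll=0.0, pitch=80.0),
        "B": GloveReading(flex=[0.8, 0.1, 0.1, 0.1, 0.1], roll=0.0, pitch=85.0),
    }

    def parse_packet(raw: bytes) -> GloveReading:
        """Decode one JSON packet sent by the glove over the wireless link."""
        msg = json.loads(raw.decode("utf-8"))
        return GloveReading(flex=msg["flex"], roll=msg["roll"], pitch=msg["pitch"])

    def score(reading: GloveReading, target: str) -> float:
        """Return a 0-100 score for how closely the reading matches the target sign."""
        ref = TEMPLATES[target]
        flex_err = sum(abs(a - b) for a, b in zip(reading.flex, ref.flex)) / len(ref.flex)
        orient_err = (abs(reading.roll - ref.roll) + abs(reading.pitch - ref.pitch)) / 360.0
        # Weighted distance mapped to a percentage; the weights here are arbitrary.
        return max(0.0, 100.0 * (1.0 - (0.7 * flex_err + 0.3 * orient_err)))

    def recognize(reading: GloveReading, threshold: float = 75.0) -> str | None:
        """Pick the best-matching sign, or None if no template scores above threshold."""
        best = max(TEMPLATES, key=lambda letter: score(reading, letter))
        return best if score(reading, best) >= threshold else None

    if __name__ == "__main__":
        # Simulated packet, as if received from the glove's wireless interface.
        packet = b'{"flex": [0.15, 0.88, 0.92, 0.85, 0.9], "roll": 2.0, "pitch": 78.0}'
        reading = parse_packet(packet)
        print("Recognized:", recognize(reading), "score:", round(score(reading, "A"), 1))

In the actual system the template set would cover the full Portuguese manual alphabet and digits, and the resulting score would be fed back to the game interface.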

Keywords

HCI · Gesture recognition · Sign language · Assistive technologies

Notes

Acknowledgements

This work was partially supported by the Portuguese Foundation for Science and Technology (FCT) project PEst-OE/EEI/LA0009/2013.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Luís Sousa ¹
  • João M. F. Rodrigues ¹
  • Jânio Monteiro ²
  • Pedro J. S. Cardoso ¹
  • Roberto Lam ¹

  1. LARSyS and ISE, University of the Algarve, Faro, Portugal
  2. INESC-ID (Lisbon) and ISE, University of the Algarve, Faro, Portugal