Sign Language Support – Adding a Gesture Library to the Leap Motion SDK

  • Tiago Lopes
  • Tiago Cardoso
  • José Barata
Conference paper
Part of the Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering book series (LNICST, volume 176)

Abstract

Several research initiatives tackle gesture recognition. Nevertheless, the interaction between input devices and the application level is still a hard task that has to be repeated each time a new system is developed. The objective of this research work is to ease that endeavor by introducing a new generic software layer between the gesture capture device and the application level. This layer provides a gesture library together with a set of functionalities both to feed this library and to perform gesture recognition afterwards. The goal is to hide lower-level software/hardware details from the developer, letting him or her focus directly on the Application Level. This article presents the architecture created for this new layer. Validation was carried out using the Leap Motion at the Sensor Level and a Serious Game devoted to Sign Language practice at the Application Level.
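The abstract only outlines the layer's role. As a minimal sketch, assuming a sensor-agnostic design in which recorded gestures are stored as sequences of feature vectors and new samples are matched by nearest-template distance, such a middleware interface could look as follows in Python. The names GestureLibrary, add_gesture and recognize, as well as the distance-based matching, are illustrative assumptions and not the API described in the paper.

import math

class GestureLibrary:
    """Sensor-agnostic store of labelled gesture templates.

    A gesture template is a sequence of feature vectors (e.g. normalised
    fingertip positions per frame), independent of the capture device.
    """

    def __init__(self):
        self._templates = {}  # gesture name -> list of frames (tuples of floats)

    def add_gesture(self, name, frames):
        """Feed the library with a recorded gesture sample."""
        self._templates[name] = [tuple(f) for f in frames]

    def recognize(self, frames, threshold=0.5):
        """Return the best-matching gesture name, or None if nothing is close enough."""
        best_name, best_dist = None, float("inf")
        for name, template in self._templates.items():
            dist = self._distance(template, frames)
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= threshold else None

    @staticmethod
    def _distance(a, b):
        """Mean Euclidean distance between two sequences, truncated to equal length."""
        n = min(len(a), len(b))
        if n == 0:
            return float("inf")
        total = 0.0
        for fa, fb in zip(a[:n], b[:n]):
            total += math.sqrt(sum((x - y) ** 2 for x, y in zip(fa, fb)))
        return total / n

if __name__ == "__main__":
    lib = GestureLibrary()
    # Toy 2-D "gestures"; a real device adapter would convert sensor frames
    # into the same feature-vector format before calling the library.
    lib.add_gesture("swipe_right", [(0.0, 0.0), (0.5, 0.0), (1.0, 0.0)])
    lib.add_gesture("swipe_up", [(0.0, 0.0), (0.0, 0.5), (0.0, 1.0)])
    print(lib.recognize([(0.0, 0.05), (0.52, 0.0), (0.98, 0.02)]))  # -> swipe_right

Under this assumption, a device adapter (for example, one wrapping Leap Motion frames) would only need to convert sensor output into the shared feature-vector format, keeping the Application Level unaware of the underlying hardware.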

Keywords

Natural user interface · Gesture recognition · Serious games · Leap Motion · Middleware


Copyright information

© ICST Institute for Computer Sciences, Social Informatics and Telecommunications Engineering 2017

Authors and Affiliations

  1. Faculdade de Ciências e Tecnologia, Universidade Nova de Lisboa, Caparica, Portugal
