Analysis of Expressiveness of Portuguese Sign Language Speakers

  • Inês V. Rodrigues
  • Eduardo M. Pereira (corresponding author)
  • Luis F. Teixeira
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9117)


Abstract

Nowadays, several communication gaps isolate deaf people from many social activities. This work studies the expressiveness of gestures in Portuguese Sign Language (PSL) speakers and the differences between deaf and hearing people. It is a first effort towards the ultimate goal of understanding emotional and behavioural patterns in these populations. In particular, our work designs solutions for the following problems: (i) differentiation between deaf and hearing people, (ii) identification of different conversational topics based on body expressiveness, and (iii) identification of different levels of mastery of PSL speakers through feature analysis. To this end, we built a complete and novel dataset that captures duo-interactions between deaf and hearing people across several conversational topics. Results show high recognition and classification rates.


Keywords: Support Vector Machine · Emotion Recognition · Deaf People · Body Expression · Motion History Image
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Acknowledgements

The authors would like to thank the PSL experts, Ana and Paula, who helped find the volunteer population; the socio-psychologist team from the Faculdade de Psicologia e Ciências da Educação da Universidade do Porto, who helped define the sociological constraints of the database; the Agrupamento de Escolas Eugenio de Andrade, Escola EB2/3 de Paranhos, for providing the venue for the acquisition of the database videos; and, finally, Stephano Piana for his help with the EyesWeb platform.



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Inês V. Rodrigues (1)
  • Eduardo M. Pereira (1, 2; corresponding author)
  • Luis F. Teixeira (1)
  1. Faculty of Engineering of the University of Porto, Porto, Portugal
  2. INESC TEC, Porto, Portugal
