
Experimental Analysis Using Action Units as Feature Descriptor for Emotion in People with Down Syndrome

  • Conference paper
  • First Online:
Recent Advances in Electrical Engineering, Electronics and Energy (CIT 2020)

Part of the book series: Lecture Notes in Electrical Engineering (LNEE, volume 762)

Abstract

This article presents an analysis, based on Action Units, of the facial expressions of four basic emotions in people with Down syndrome: happiness, sadness, anger, and surprise. The reference framework is the Facial Action Coding System proposed by Paul Ekman, which catalogs the universal movements of the facial muscles and has been used in research on people with typical development. In this study, the Action Units constitute the feature-extraction phase applied to a dataset of images of people with Down syndrome collected from open-access websites. The features of the microexpressions corresponding to each Action Unit were obtained with OpenFace 2.0, an open-source facial behavior analysis toolkit. In addition, the statistical distributions that best approximated the activation of each Action Unit were identified, together with their respective equations, by fitting probability density functions and evaluating the fits with the Kullback-Leibler divergence. The resulting statistics made it possible to identify activations of Action Units for some emotions that are not reported in the literature, such as AU20 in happiness, sadness, and anger, and AU15 and AU9 in anger; these findings will serve as a basis for new research on algorithms for face-based emotion recognition in people with Down syndrome.
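As a rough illustration of the evaluation step described in the abstract, the sketch below computes a discrete Kullback-Leibler divergence D(P || Q) between an empirical Action Unit activation histogram and two hypothetical candidate models. All numbers here are invented for illustration; the paper itself fits continuous probability density functions to OpenFace 2.0 intensity outputs, but the criterion is the same: the candidate with the smaller divergence from the data is the better approximation.

```python
import math

def kl_divergence(p, q, eps=1e-12):
    """Discrete Kullback-Leibler divergence D(P || Q) in nats.

    p and q are unnormalized histograms over the same bins; a small
    epsilon is added so empty bins do not produce log(0).
    """
    p = [pi + eps for pi in p]
    q = [qi + eps for qi in q]
    zp, zq = sum(p), sum(q)
    return sum((pi / zp) * math.log((pi / zp) / (qi / zq))
               for pi, qi in zip(p, q))

# Hypothetical binned activation frequencies of one Action Unit (e.g. AU20)
observed = [5, 22, 41, 20, 8, 4]    # empirical histogram
model_a  = [6, 20, 40, 22, 8, 4]    # close fit -> small divergence
model_b  = [30, 30, 15, 10, 10, 5]  # poor fit  -> larger divergence

print(kl_divergence(observed, model_a) < kl_divergence(observed, model_b))  # True
```

The divergence is asymmetric and zero only when the two distributions coincide, which is why it serves as a goodness-of-fit criterion for selecting among candidate density functions.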


References

  1. Baltrusaitis, T., Zadeh, A., Lim, Y.C., Morency, L.P.: OpenFace 2.0: facial behavior analysis toolkit. In: 2018 13th IEEE International Conference on Automatic Face Gesture Recognition (FG 2018), pp. 59–66 (2018). https://doi.org/10.1109/FG.2018.00019

  2. Bettadapura, V.: Face expression recognition and analysis: the state of the art. College of Computing, Georgia Institute of Technology (2012). http://arxiv.org/abs/1203.6722

  3. Ambady, N., Rosenthal, R.: Thin slices of expressive behavior as predictors of interpersonal consequences: a meta-analysis. Psychol. Bull. 111(2), 256–274 (1992)

  4. Ekman, P., Friesen, W.V., Ellsworth, P.: Emotion in the Human Face, 2nd edn. Cambridge University Press, Cambridge (1982)


  5. Wu, T., Du, S., Yang, G.: Survey of the facial expression recognition research. In: Advances in Brain Inspired Cognitive Systems, pp. 392–402 (2012)

  6. Schneiderman, H., Kanade, T.: Object detection using the statistics of parts. Int. J. Comput. Vis. 56, 151–177 (2004)

  7. Viola, P., Jones, M.: Rapid object detection using a boosted cascade of simple features. In: Proceedings of the 2001 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, vol. 1 (2001). https://doi.org/10.1109/CVPR.2001.990517

  8. Lucey, P., Cohn, J.F., Kanade, T., Saragih, J., Ambadar, Z., Matthews, I.: The Extended Cohn-Kanade Dataset (CK+): a complete dataset for action unit and emotion-specified expression. In: IEEE Computer Society Conference on Computer Vision and Pattern Recognition Workshops, pp. 94–101 (2010)

  9. Khan, N.U.: A comparative analysis of facial expression recognition techniques. In: 3rd IEEE International Advance Computing Conference (IACC), pp. 1262–1268 (2013)

  10. Li, Y., Wang, S., Zhao, Y., Ji, Q.: Simultaneous facial feature tracking and facial expression recognition. IEEE Trans. Image Process. 22(7), 2559–2573 (2013). https://doi.org/10.1109/TIP.2013.2253477


  11. Capino, D.: Facial action coding system inspired tutorial (2009). http://es.scribd.com/doc/18649644/Facial-Action-Coding-System_Khappucino-s-Tutorial

  12. Almudema, G.: Integration of Action Unit-based algorithms into an emotion recognition analysis tool (in Spanish). Universidad Politécnica de Madrid (2016)

  13. Kullback, S., Leibler, R.A.: On information and sufficiency. Ann. Math. Statist. 22(1), 79–86 (1951). https://doi.org/10.1214/aoms/1177729694


  14. Antonarakis, S.E., Lyle, R., Dermitzakis, E.T., Deutsch, S.: Chromosome 21 and Down syndrome: from genomics to pathophysiology. Nat. Rev. Genet. 5, 725 (2004)

  15. Cornejo, J.Y.R., Pedrini, H., Machado-Lima, A., dos Santos Nunes, F.D.L.: Down syndrome detection based on facial features using a geometric descriptor. J. Med. Imaging 4, 044008 (2017)


  16. Hippolyte, L., Barisnikov, K., Van der Linden, M., Detraux, J.J.: From facial emotional recognition abilities to emotional attribution: a study in Down syndrome. Res. Dev. Disabil. 30(5), 1007–1022 (2009)

  17. Agbolade, O.A., Nazri, A., Yaakob, R., Azim Ghani, A., Kqueen Cheah, Y.: Down syndrome face recognition: a review. Symmetry 12, 1182 (2020)


  18. Ekman, P., Friesen, W.V., Hager, J.C.: Facial Action Coding System: Manual and Investigator's Guide. Research Nexus, Salt Lake City, UT, USA (2002)

  19. Gavrilescu, M., Vizireanu, N.: Predicting depression, anxiety, and stress levels from videos using the facial action coding system. Sensors 19, 3693 (2019)


  20. Ekman, P., Friesen, W.V.: Measuring facial movement. Environ. Psychol. Nonverbal Behav. 1, 56–75 (1976). https://doi.org/10.1007/BF01115465


  21. Ekman, P., Friesen, W.V.: Facial Action Coding System: A Technique for the Measurement of Facial Movement. Consulting Psychologists Press, Palo Alto, California (1978)


  22. Clark, E., Kessinger, J., Duncan, S., Bell, M., Lahne, J., Gallagher, D., O'Keefe, S.: The facial action coding system for characterization of human affective response to consumer product-based stimuli: a systematic review. Front. Psychol. (2020)


Author information

Correspondence to Nancy Paredes.


Copyright information

© 2021 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Paredes, N., Caicedo Bravo, E., Bacca Cortes, B. (2021). Experimental Analysis Using Action Units as Feature Descriptor for Emotion in People with Down Syndrome. In: Botto Tobar, M., Cruz, H., Díaz Cadena, A. (eds) Recent Advances in Electrical Engineering, Electronics and Energy. CIT 2020. Lecture Notes in Electrical Engineering, vol 762. Springer, Cham. https://doi.org/10.1007/978-3-030-72208-1_19

