
3D Corpus of Spontaneous Complex Mental States

  • Marwa Mahmoud
  • Tadas Baltrušaitis
  • Peter Robinson
  • Laurel D. Riek
Part of the Lecture Notes in Computer Science book series (LNCS, volume 6974)

Abstract

Hand-over-face gestures, a subset of emotional body language, are overlooked by automatic affect inference systems. We propose the use of hand-over-face gestures as a novel affect cue for automatic inference of cognitive mental states. Moreover, affect recognition systems rely on the existence of publicly available datasets; an approach is often only as good as the data it learns from. We present the collection and annotation methodology of a 3D multimodal corpus of 108 audio/video segments of natural complex mental states. The corpus includes spontaneous facial expressions and hand gestures, labelled using crowd-sourcing, and is publicly available.
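
The annotation methodology relies on crowd-sourced labels, whose reliability is typically summarised with an inter-rater agreement statistic such as Fleiss' kappa. The Python sketch below illustrates how per-segment labels from several annotators might be aggregated by majority vote and checked for agreement. It is an illustrative reconstruction, not the authors' pipeline; the label set, data layout, and function names are assumptions.

    # Minimal sketch (not the corpus's actual annotation code): aggregate
    # crowd-sourced labels per segment by majority vote and measure
    # inter-rater agreement with Fleiss' kappa.
    from collections import Counter

    def fleiss_kappa(ratings, categories):
        """ratings: one label list per segment, each with the same number
        of annotators; categories: the full label set."""
        n_segments = len(ratings)
        n_raters = len(ratings[0])

        # n_ij: how many raters assigned segment i to category j
        counts = [Counter(r) for r in ratings]

        # p_j: per-category share of all label assignments
        p = {c: sum(cnt[c] for cnt in counts) / (n_segments * n_raters)
             for c in categories}

        # P_i: observed agreement within each segment
        P_i = [(sum(cnt[c] ** 2 for c in categories) - n_raters)
               / (n_raters * (n_raters - 1)) for cnt in counts]

        P_bar = sum(P_i) / n_segments          # mean observed agreement
        P_e = sum(v ** 2 for v in p.values())  # agreement expected by chance
        return (P_bar - P_e) / (1 - P_e)

    # Toy example: 4 segments, 3 crowd annotators each (labels are made up)
    labels = [["thinking", "thinking", "confused"],
              ["bored", "bored", "bored"],
              ["confused", "thinking", "confused"],
              ["thinking", "thinking", "thinking"]]
    majority = [Counter(r).most_common(1)[0][0] for r in labels]
    print(majority)  # ['thinking', 'bored', 'confused', 'thinking']
    print(round(fleiss_kappa(labels, {"thinking", "confused", "bored"}), 3))

Majority voting is the simplest aggregation rule; Fleiss' kappa then indicates how far the observed agreement exceeds chance, with values near zero suggesting the crowd labels are too noisy to serve as ground truth.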

Keywords

Facial Expression · Hand Gesture · Video Segment · Facial Expression Recognition · Dyadic Interaction

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Marwa Mahmoud¹
  • Tadas Baltrušaitis¹
  • Peter Robinson¹
  • Laurel D. Riek²
  1. University of Cambridge, United Kingdom
  2. University of Notre Dame, USA
