Journal of Nonverbal Behavior, Volume 35, Issue 1, pp 1–16

FACSGen: A Tool to Synthesize Emotional Facial Expressions Through Systematic Manipulation of Facial Action Units

  • Etienne B. Roesch
  • Lucas Tamarit
  • Lionel Reveret
  • Didier Grandjean
  • David Sander
  • Klaus R. Scherer
Original Paper

Abstract

To investigate the perception of emotional facial expressions, researchers rely on shared sets of photographs or videos, most often generated from actor portrayals. The drawback of such standardized material is its lack of flexibility and controllability: it allows neither systematic parametric manipulation of specific features of facial expressions on the one hand, nor of more general properties of facial identity (age, ethnicity, gender) on the other. To remedy this problem, we developed FACSGen, a novel tool for creating realistic synthetic 3D facial stimuli, both static and dynamic, based on the Facial Action Coding System (FACS). FACSGen gives researchers full control over facial action units and the corresponding informational cues in 3D synthetic faces. We present four studies validating both the software and the general methodology of systematically generating controlled facial expression patterns for stimulus presentation.
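The abstract describes parametric control over individual facial action units (AUs), including dynamic stimuli. As a rough illustration of what such a parameterization involves, the sketch below encodes an AU pattern and ramps each AU's intensity over time. FACSGen's actual interface is not described here; every name in this sketch (`ActionUnit`, `linear_ramp`, the frame count) is a hypothetical illustration, not the tool's API. The AU numbers themselves are standard FACS codes (AU6 = cheek raiser, AU12 = lip corner puller, jointly typical of smiling).

```python
from dataclasses import dataclass

@dataclass
class ActionUnit:
    au: int           # FACS Action Unit number, e.g. 12 = lip corner puller
    intensity: float  # target intensity: 0.0 (absent) .. 1.0 (maximum)

def linear_ramp(peak: float, n_frames: int) -> list:
    """Intensity trajectory rising linearly from 0 to peak (onset to apex)."""
    return [peak * i / (n_frames - 1) for i in range(n_frames)]

# A schematic smile pattern: AU6 (cheek raiser) + AU12 (lip corner puller).
smile = [ActionUnit(6, 0.7), ActionUnit(12, 1.0)]

# Dynamic stimulus: each AU's intensity ramped over 10 frames.
animation = {u.au: linear_ramp(u.intensity, 10) for u in smile}
```

Parameterizing stimuli this way, rather than photographing actors, is what enables the systematic manipulation the abstract emphasizes: each AU's presence, peak intensity, and temporal unfolding can be varied independently while identity attributes are held constant.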

Keywords

Emotion · Facial expression · Software · Research material · Facial Action Coding System · FACS


Copyright information

© Springer Science+Business Media, LLC 2010

Authors and Affiliations

  • Etienne B. Roesch (1, 2, 4)
  • Lucas Tamarit (2)
  • Lionel Reveret (3)
  • Didier Grandjean (1, 2)
  • David Sander (1, 2)
  • Klaus R. Scherer (1, 2)

  1. Department of Psychology, University of Geneva, Geneva, Switzerland
  2. Swiss Centre for Affective Sciences (CISA), University of Geneva, Geneva, Switzerland
  3. INRIA, Rhône-Alpes, France
  4. Department of Computing, Imperial College London, London, UK
