
A System for Non-intrusive Affective Assessment in the Circumplex Model from Pupil Diameter and Facial Expression Monitoring

  • Sudarat Tangnimitchok
  • Nonnarit O-larnnithipong
  • Neeranut Ratchatanantakit
  • Armando Barreto
  • Francisco R. Ortega
  • Naphtali D. Rishe
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10901)

Abstract

This paper outlines a system for non-intrusive estimation of a user’s affective state in the Circumplex Model from monitoring the user’s pupil diameter and facial expression, obtained from an EyeTech TM3 eye gaze tracker (EGT) and an RGB-D camera (Kinect), respectively. According to previous studies, the pupillary response can be used to recognize “sympathetic activation” and simultaneous “parasympathetic deactivation”, which correspond to affective arousal. Additionally, tracking the user’s facial muscle movements as he or she displays characteristic facial gestures yields indicators for estimating affective valence. We propose to combine both types of information to map the user’s affective state to a region of the Circumplex Model. This paper describes our initial implementation of such a combined system.
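
To make the proposed mapping concrete, the sketch below illustrates how an arousal estimate (e.g., derived from processed pupil diameter) and a valence estimate (e.g., derived from facial expression analysis) could be combined into a region of Russell’s Circumplex Model. This is a minimal illustration under stated assumptions, not the paper’s implementation: the normalization of both inputs to [-1, 1], the function name, and the placement of Russell’s eight category labels at 45-degree intervals are assumptions introduced here.

```python
import math

def circumplex_region(valence: float, arousal: float) -> str:
    """Map a (valence, arousal) pair, each assumed normalized to
    [-1, 1], to a named region of Russell's Circumplex Model."""
    # Angle on the circumplex plane: 0 deg = pleasant (valence axis),
    # 90 deg = activated (arousal axis); result folded into [0, 360).
    angle = math.degrees(math.atan2(arousal, valence)) % 360.0
    # Russell's (1980) eight categories at 45-degree intervals,
    # each covering a 45-degree sector centered on its nominal angle.
    regions = [
        (22.5, "pleasure"), (67.5, "excitement"), (112.5, "arousal"),
        (157.5, "distress"), (202.5, "misery"), (247.5, "depression"),
        (292.5, "sleepiness"), (337.5, "contentment"),
        (360.0, "pleasure"),  # wrap-around back to the valence axis
    ]
    for upper_bound, label in regions:
        if angle < upper_bound:
            return label
    return "pleasure"

# Example: high arousal (pupil dilation) combined with negative
# valence (a negative facial expression) falls in the "distress" sector.
print(circumplex_region(valence=-0.6, arousal=0.7))  # -> distress
```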

Keywords

Affective computing · Facial expression recognition · RGB-D camera · Eye-gaze tracking

Acknowledgements

This research was supported by National Science Foundation grants HRD-0833093 and CNS-1532061.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Sudarat Tangnimitchok (1), corresponding author
  • Nonnarit O-larnnithipong (1)
  • Neeranut Ratchatanantakit (1)
  • Armando Barreto (1)
  • Francisco R. Ortega (2)
  • Naphtali D. Rishe (2)
  1. Electrical and Computer Engineering Department, Florida International University, Miami, USA
  2. School of Computer and Information Sciences, Florida International University, Miami, USA
