Our Emotions as Seen through a Webcam

  • Natalie Sommer
  • Leanne Hirshfield
  • Senem Velipasalar
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8534)

Abstract

Humanity’s desire to enable machines to “understand” us drives research into human beings and their reactions: a computer that can correctly classify our emotions can offer its user a markedly better experience. Through the computer’s eye, the webcam, we can acquire human reaction data by capturing facial images in response to stimuli. The data of interest in this research are changes in pupil size and gaze patterns, in conjunction with classification of facial expression. Fusion of these measurements has been considered before, by Xiang and Kankanhalli [14] and by Valverde et al. [15], but their approach differs substantially from ours: both groups used a multimodal set-up with an eye tracker alongside a webcam, and their stimuli were visual. Our novel approach avoids costly eye trackers and relies on images acquired by a standard webcam alone to measure changes in pupil size, gaze patterns, and facial expression in response to auditory stimuli. The auditory mode is often preferred because, unlike visual stimulation from a monitor, it does not require accounting for luminance. The fused information from these features is then used to distinguish between negative, neutral, and positive emotional states. In this paper we discuss an experiment (n = 15) in which stimuli from the International Affective Digitized Sounds (IADS) [19], the auditory counterpart of the International Affective Picture System (IAPS), are used to elicit these three emotions in participants. Webcam data are recorded during the experiments, and advanced signal processing and feature extraction techniques are applied to the resulting image files to build a model capable of predicting neutral, positive, and negative emotional states.
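
As a concrete illustration of the pipeline the abstract describes, the sketch below shows one way pupil size can be estimated from ordinary webcam frames with OpenCV, in the spirit of the methods cited in [3] and [13]: Viola–Jones cascade classifiers locate the face and eyes, and intensity thresholding isolates the dark pupil within each eye crop. This is a minimal sketch, not the authors’ implementation; the stock cascade files, the fixed threshold of 40, and the `pupil_radii` helper are illustrative assumptions.

```python
import cv2

# Stock Viola-Jones cascade models shipped with opencv-python.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

def pupil_radii(frame):
    """Estimate pupil radii (in pixels) for the eyes found in one BGR frame."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    radii = []
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        face = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face):
            eye = face[ey:ey + eh, ex:ex + ew]
            # The pupil is the darkest region of the eye; a fixed threshold
            # (40 here) is a crude stand-in for the adaptive thresholding
            # used in serious pupillometry work.
            _, mask = cv2.threshold(eye, 40, 255, cv2.THRESH_BINARY_INV)
            contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                           cv2.CHAIN_APPROX_SIMPLE)
            if contours:
                largest = max(contours, key=cv2.contourArea)
                _, radius = cv2.minEnclosingCircle(largest)
                radii.append(radius)
    return radii

cap = cv2.VideoCapture(0)          # default webcam
ok, frame = cap.read()
if ok:
    print("Pupil radius estimates (px):", pupil_radii(frame))
cap.release()
```

In the paper itself, per-frame measurements of this kind are only one input: the authors fuse pupil-size changes with gaze patterns and facial-expression features before classifying frames into negative, neutral, and positive states.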

Keywords

Facial Expression · Emotion Recognition · Pupil Size · International Affective Picture System · Left Pupil

References

  1. Klingner, J., Tversky, B., Hanrahan, P.: Effects of visual and verbal presentation on cognitive load in vigilance, memory and arithmetic tasks. Psychophysiology 48(3), 323–332 (2011)
  2. Petridis, S., Giannakopoulos, T., Spyropoulos, C.: Unobtrusive Low Cost Pupil Size Measurements using Web Cameras. In: Proceedings of the 2nd International Workshop on Artificial Intelligence and NetMedicine, pp. 9–20 (2013)
  3. Schwarz, L., Gamba, H., Pacheco, F., Ramos, R., Sovierzoski, M.: Pupil and Iris Detection in Dynamic Pupillometry using the OpenCV Library. In: 2012 5th International Congress on Image and Signal Processing, pp. 211–215 (2012)
  4. Partala, T., Surakka, V.: Pupil Size Variation as an Indication of Affective Processing. International Journal of Human-Computer Studies 59(1-2), 185–198 (2003)
  5. Canento, F., Fred, A., Gamboa, H., Lourenco, A.: Multimodal Biosignal Sensor Data Handling for Emotion Recognition. In: Proceedings of the IEEE Sensors Conference (2011)
  6. Xu, G., Wang, Y., Li, J., Zhou, X.: Real Time Detection of Eye Corners and Iris Center from Images Acquired by Usual Camera. In: Proceedings of the 2009 Second International Conference on Intelligent Networks and Intelligent Systems, pp. 401–404 (2009)
  7. Zhai, J., Barreto, A.: Stress Detection in Computer Users Based on Digital Signal Processing of Noninvasive Physiological Variables. In: Proceedings of the 28th IEEE EMBS International Conference, pp. 1355–1358 (2006)
  8. Baltaci, S., Gokcay, D.: Negative Sentiment in Scenarios Elicit Pupil Dilation Response: An Auditory Study. In: 2012 International Conference on Multimodal Interaction, pp. 529–532. ACM (2012)
  9. Wang, W., Li, Z., Wang, Y., Chen, F.: Indexing Cognitive Workload Based on Pupillary Response under Luminance and Emotional Changes. In: 2013 International Conference on Intelligent User Interfaces, pp. 247–256. ACM (2013)
  10. Partala, T., Jokiniemi, M., Surakka, V.: Pupillary Responses to Emotionally Provocative Stimuli. In: 2000 Eye Tracking Research & Applications Symposium, pp. 123–129. ACM (2000)
  11. Babiker, A., Faye, I., Malik, A.: Non-conscious Behavior in Emotion Recognition. In: 2013 IEEE 9th International Colloquium on Signal Processing and its Applications, pp. 258–262 (2013)
  12. Delibasis, K.K., Asvestas, P., Matsopoulos, G.K., Economopoulos, T., Assimakis, N.: A Real Time Eye-Motion Monitoring System. In: 16th International Conference on Systems, Signals and Image Processing, pp. 1–5 (2009)
  13. Viola, P., Jones, M.: Robust Real-Time Face Detection. International Journal of Computer Vision 57(2), 137–154 (2004)
  14. Xiang, X., Kankanhalli, M.S.: A Multimodal Approach for Online Estimation of Subtle Facial Expression. In: Lin, W., Xu, D., Ho, A., Wu, J., He, Y., Cai, J., Kankanhalli, M., Sun, M.-T. (eds.) PCM 2012. LNCS, vol. 7674, pp. 402–413. Springer, Heidelberg (2012)
  15. Valverde, L., DeLera, E., Fernandez, C.: Inferencing Emotions Through the Triangulation of Pupil Size Data, Facial Heuristics and Self-Assessment Techniques. In: 2010 Second International Conference on Mobile, Hybrid, and On-Line Learning, pp. 147–150. IEEE Computer Society (2010)
  16. Bousefsaf, F., Maaoui, C., Pruski, A.: Remote Assessment of the Heart Rate Variability to Detect Mental Stress. In: 2013 7th International Conference on Pervasive Computing Technologies for Healthcare and Workshops, pp. 348–351. IEEE (2013)
  17. Terzis, V., Moridis, C., Economides, A.: Measuring Instant Emotions Based on Facial Expressions During Computer-Based Assessment. Personal and Ubiquitous Computing 17(1), 43–54 (2013)
  18. Adams, R., Kleck, R.: Effects of Direct and Averted Gaze on the Perception of Facially Communicated Emotion. Emotion 5(1), 3–11 (2005)
  19. Bradley, M., Lang, P.: The International Affective Digitized Sounds (2nd edn., IADS-2): Affective Ratings of Sounds and Instruction Manual. Technical Report B-3, University of Florida, Gainesville, FL (2007)

Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Natalie Sommer (1)
  • Leanne Hirshfield (2)
  • Senem Velipasalar (1)

  1. L.C. Smith College of Engineering and Computer Science, Syracuse University, Syracuse, USA
  2. S.I. Newhouse School of Public Communications, Syracuse University, Syracuse, USA
