Measuring instant emotions based on facial expressions during computer-based assessment

Abstract

Emotions play an important role in learning and assessment procedures. However, measuring emotions is a demanding task, and several tools have been developed for this purpose. In this paper, the efficiency of FaceReader during a computer-based assessment (CBA) was evaluated. FaceReader's instant measurements were compared with the researchers' estimations of students' emotions. The observations took place in real time in a properly designed room. Statistical analysis showed some differences between FaceReader's and the researchers' estimations for the Disgusted and Angry emotions. The results showed that FaceReader is capable of measuring emotions with an efficacy of over 87% during a CBA and that it could be successfully integrated into a computer-aided learning system for emotion recognition. Moreover, this study provides useful findings on the emotional states of students during CBA and learning procedures. This is the first time that students' instant emotions have been measured during a CBA based on their facial expressions. The results showed that most of the time students experienced Neutral, Angry, and Sad emotions. Furthermore, a gender analysis highlights differences between male and female students' instant emotions.
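The abstract reports over 87% agreement between FaceReader's instant measurements and the researchers' observations. The minimal sketch below illustrates one way such an agreement figure could be computed from per-observation dominant-emotion labels; the file names, column names, and the simple matching measure are assumptions for illustration only, not the authors' actual analysis or FaceReader's API.

```python
# Hypothetical sketch: comparing FaceReader's instant emotion labels with
# researchers' observations. File and column names are assumptions.
from collections import Counter
import csv

EMOTIONS = ["Neutral", "Happy", "Sad", "Angry", "Surprised", "Scared", "Disgusted"]

def load_labels(path, column):
    """Read one emotion label per observation (e.g., per sampled time point) from a CSV."""
    with open(path, newline="") as f:
        return [row[column] for row in csv.DictReader(f)]

def agreement(facereader_labels, observer_labels):
    """Overall and per-emotion agreement between the two label sequences."""
    assert len(facereader_labels) == len(observer_labels)
    matches = sum(a == b for a, b in zip(facereader_labels, observer_labels))
    overall = matches / len(observer_labels)
    per_emotion = {}
    for emo in EMOTIONS:
        idx = [i for i, obs in enumerate(observer_labels) if obs == emo]
        if idx:
            per_emotion[emo] = sum(facereader_labels[i] == emo for i in idx) / len(idx)
    return overall, per_emotion

if __name__ == "__main__":
    fr = load_labels("facereader_log.csv", "dominant_emotion")    # hypothetical export
    obs = load_labels("observer_log.csv", "observed_emotion")     # hypothetical coding sheet
    overall, per_emotion = agreement(fr, obs)
    print(f"Overall agreement: {overall:.1%}")   # the paper reports over 87%
    for emo, acc in per_emotion.items():
        print(f"  {emo}: {acc:.1%}")
    print("Observed emotion distribution:", Counter(obs))
```

A simple percentage match is only one possible agreement measure; a chance-corrected statistic such as Cohen's kappa could be used instead if that better matches the study design.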

Author information

Correspondence to Vasileios Terzis.

About this article

Cite this article

Terzis, V., Moridis, C.N. & Economides, A.A. Measuring instant emotions based on facial expressions during computer-based assessment. Pers Ubiquit Comput 17, 43–52 (2013). https://doi.org/10.1007/s00779-011-0477-y

Keywords

  • FaceReader
  • e-Learning
  • Computer-based assessment
  • Emotion recognition