Emotion Elicitation with Stimuli Datasets in Automatic Affect Recognition Studies – Umbrella Review

  • Conference paper
  • Human-Computer Interaction – INTERACT 2021 (INTERACT 2021)

Abstract

Affect Recognition has become a relevant research field in Artificial Intelligence development. Nevertheless, its progress is impeded by poor methodological conduct in psychology, computer science and, consequently, affective computing. We address this issue by providing a rigorous overview of Emotion Elicitation utilising stimuli datasets in Affect Recognition studies. We identified relevant trials by searching five electronic databases and other sources. Eligible studies were reviews, identified through title, abstract and full-text screening, that aimed to include subjects who underwent Emotion Elicitation in laboratory conditions with passive stimulus presentation for Automatic Affect Recognition. Two independent reviewers were involved in each step of the identification of eligible studies, and any discrepancies were resolved through discussion. Of 1308 references, 16 met the inclusion criteria. These 16 papers reviewed 271 primary studies, in which 3515 participants were examined. We found that datasets containing video, music and picture stimuli are the most widely explored, while datasets incorporating audio excerpts deserve more attention from researchers. The five most frequently analysed emotions are sadness, anger, happiness, fear and joyfulness. Elicitation effectiveness and techniques for emotion assessment are not reported by the review authors. We also draw conclusions about the lack of studies concerning Deep Learning methods. All of the included studies were of critically low quality. Much critical information is missing from the reviewed papers, making a comprehensive view of this research area disturbingly hard to attain.



Author information


Contributions

Paweł Jemioło (PJ) – all listed stages. Barbara Giżycka (BG) – conceptualization, investigation, validation, writing. Dawid Storman (DS) – conceptualization, formal analysis, investigation, methodology, supervision, writing. Antoni Ligęza (AL) – supervision.

Corresponding author

Correspondence to Paweł Jemioło.


Ethics declarations

Conflict of Interest

The authors declare that they have no conflict of interest.

Funding Sources

This work is supported by AGH UST grants.


Copyright information

© 2021 IFIP International Federation for Information Processing

About this paper


Cite this paper

Jemioło, P., Storman, D., Giżycka, B., Ligęza, A. (2021). Emotion Elicitation with Stimuli Datasets in Automatic Affect Recognition Studies – Umbrella Review. In: Ardito, C., et al. Human-Computer Interaction – INTERACT 2021. INTERACT 2021. Lecture Notes in Computer Science(), vol 12934. Springer, Cham. https://doi.org/10.1007/978-3-030-85613-7_18


  • DOI: https://doi.org/10.1007/978-3-030-85613-7_18


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-85612-0

  • Online ISBN: 978-3-030-85613-7

  • eBook Packages: Computer Science (R0)
