Evaluation Criteria for Affect-Annotated Databases

  • Agata Kolakowska
  • Agnieszka Landowska
  • Mariusz Szwoch
  • Wioleta Szwoch
  • Michal R. Wrobel
Part of the Communications in Computer and Information Science book series (CCIS, volume 521)

Abstract

In this paper, a comprehensive set of evaluation criteria for affect-annotated databases is proposed. These criteria can be used to assess the quality of a database at the stage of its creation, as well as to evaluate and compare existing databases. The usefulness of the criteria is demonstrated on several databases selected from the affective computing domain. The databases contain different kinds of data: video and still images presenting facial expressions, speech recordings, and affect-annotated words.
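
As an illustration only (not taken from the paper), the sketch below shows one way per-criterion quality scores for affect-annotated databases could be recorded and used to rank candidate databases. The criterion names, weights, and database names are hypothetical placeholders; the paper defines its own criteria set.

```python
# Illustrative sketch only: record per-criterion quality scores for
# affect-annotated databases and rank them by a weighted overall score.
# Criterion and database names below are hypothetical placeholders.

from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical criteria; substitute the paper's actual criteria set.
CRITERIA = ("annotation_reliability", "naturalness", "size", "documentation")


@dataclass
class DatabaseEvaluation:
    name: str
    # criterion name -> score normalised to [0, 1]
    scores: Dict[str, float] = field(default_factory=dict)

    def overall(self, weights: Dict[str, float]) -> float:
        """Weighted mean of this database's scores over the given criteria."""
        total_weight = sum(weights.values())
        return sum(self.scores.get(c, 0.0) * w for c, w in weights.items()) / total_weight


def rank(evaluations: List[DatabaseEvaluation],
         weights: Dict[str, float]) -> List[DatabaseEvaluation]:
    """Order databases by weighted overall score, best first."""
    return sorted(evaluations, key=lambda e: e.overall(weights), reverse=True)


if __name__ == "__main__":
    weights = {c: 1.0 for c in CRITERIA}  # equal weights for the example
    facial_db = DatabaseEvaluation("FacialExprDB", {
        "annotation_reliability": 0.8, "naturalness": 0.4,
        "size": 0.6, "documentation": 0.9})
    speech_db = DatabaseEvaluation("SpeechEmoDB", {
        "annotation_reliability": 0.6, "naturalness": 0.7,
        "size": 0.5, "documentation": 0.7})
    print([e.name for e in rank([facial_db, speech_db], weights)])
    # -> ['FacialExprDB', 'SpeechEmoDB']
```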

Keywords

Affective computing · Database quality · Affect-annotated databases

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Agata Kolakowska (1)
  • Agnieszka Landowska (1)
  • Mariusz Szwoch (1)
  • Wioleta Szwoch (1)
  • Michal R. Wrobel (1)
  1. Faculty of Electronics, Telecommunications and Informatics, Gdansk University of Technology, Gdansk, Poland
