Natural Affect Data: Collection and Annotation

Chapter
Part of the Explorations in the Learning Sciences, Instructional Systems and Performance Technologies book series (LSIS, volume 3)

Abstract

Emotions are crucial for healthy cognitive functioning and have direct relevance to learning and achievement. Not surprisingly, then, affective diagnoses constitute a significant aspect of expert human mentoring. Consequently, computer-based learning environments seek to incorporate the social dynamics of such human teacher–learner interactions in order to make learning with computers more engaging and effective. Advances in the field of affective computing have opened the possibility of studying emotions from their nonverbal manifestations and have motivated several efforts towards realising automatic affect inference. However, the development and validation of affect recognition systems require representative data to serve as ground truth. For viable applications of affect-sensitive technology, and to ensure ecological validity, the use of context-relevant, naturalistic data is preferred. This chapter reports results from the collection and subsequent annotation of affect data obtained in a learning scenario. The conceptual and methodological issues encountered during data collection are discussed, and problems with labelling and annotation are identified. The chapter thus provides an integrated account of the complexity and challenges associated with emotion assessment in naturalistic situations.
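One recurring issue in annotating naturalistic affect data is establishing how consistently independent coders apply the same labels. As a minimal illustration of that step, the sketch below computes Cohen's kappa for two hypothetical coders; the label set, the coder judgements and the function name are illustrative assumptions, not data or methods from the study.

```python
# Minimal sketch: inter-annotator agreement on affect labels via Cohen's kappa.
# All labels and values below are hypothetical illustrations.
from collections import Counter

def cohen_kappa(labels_a, labels_b):
    """Cohen's kappa for two annotators labelling the same set of items."""
    assert len(labels_a) == len(labels_b)
    n = len(labels_a)
    # Observed agreement: proportion of items given identical labels.
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Expected chance agreement from each annotator's label frequencies.
    freq_a, freq_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)  # undefined if p_e == 1 (perfect chance agreement)

# Hypothetical affect labels assigned by two coders to ten video segments.
coder1 = ["confused", "engaged", "bored", "engaged", "confused",
          "engaged", "frustrated", "engaged", "bored", "engaged"]
coder2 = ["confused", "engaged", "engaged", "engaged", "frustrated",
          "engaged", "frustrated", "engaged", "bored", "engaged"]

print(f"Cohen's kappa = {cohen_kappa(coder1, coder2):.2f}")  # ~0.69 here
```

Kappa corrects raw percentage agreement for agreement expected by chance; for more than two coders or for ordinal/interval labels, measures such as Fleiss' kappa or Krippendorff's alpha are commonly used instead.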



Copyright information

© Springer Science+Business Media, LLC 2011

Authors and Affiliations

  1. Computer Laboratory, University of Cambridge, Cambridge, UK
