User Modeling and User-Adapted Interaction, Volume 18, Issue 1–2, pp. 45–80

Automatic detection of learner’s affect from conversational cues

  • Sidney K. D’Mello
  • Scotty D. Craig
  • Amy Witherspoon
  • Bethany McDaniel
  • Arthur Graesser
Original Paper

Abstract

We explored the reliability of detecting a learner’s affect from conversational features extracted from interactions with AutoTutor, an intelligent tutoring system (ITS) that helps students learn by holding a conversation in natural language. Training data were collected in a learning session with AutoTutor, after which the affective states of the learner were rated by the learner, a peer, and two trained judges. Inter-rater reliability scores indicated that the classifications of the trained judges were more reliable than those of the novice judges. Seven data sets that temporally integrated the affective judgments with the dialogue features of each learner were constructed. The first four data sets corresponded to the judgments of the learner, the peer, and the two trained judges, while the remaining three combined the judgments of two or more raters. Multiple regression analyses confirmed the hypothesis that dialogue features could significantly predict the affective states of boredom, confusion, flow, and frustration. Machine learning experiments indicated that standard classifiers were moderately successful in discriminating the affective states of boredom, confusion, flow, frustration, and neutral, yielding a peak accuracy of 42% with neutral (chance = 20%) and 54% without neutral (chance = 25%). Individual detections of boredom, confusion, flow, and frustration, when contrasted with neutral affect, had maximum accuracies of 69%, 68%, 71%, and 78%, respectively (chance = 50%). The classifiers that operated on the emotion judgments of the trained judges and on the combined models outperformed those based on the judgments of the novices (i.e., the self and the peer). Follow-up classification analyses assessing the degree to which machine-generated affect labels correlated with affect judgments provided by humans revealed that human-machine agreement was on par with that of the novice judges (self and peer) but lower than that of the trained judges. We discuss the prospects of extending AutoTutor into an affect-sensing ITS.
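
The abstract outlines a pipeline that the paper develops in full: extract dialogue features from AutoTutor’s logs, collect affect labels from human judges, train standard classifiers to discriminate affective states, and then compare accuracy against chance and machine labels against human judgments. The sketch below illustrates that evaluation loop under stated assumptions: the scikit-learn classifier, the four feature names, and the synthetic data are hypothetical stand-ins for illustration, not the authors’ actual feature set or toolkit.

    # A minimal sketch of the classify-then-compare-to-chance evaluation the
    # abstract describes. All features and data here are synthetic placeholders.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import accuracy_score, cohen_kappa_score
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(0)
    STATES = ["boredom", "confusion", "flow", "frustration", "neutral"]

    # Hypothetical per-turn dialogue features (stand-ins for the conversational
    # cues extracted from the tutoring-session logs).
    n_turns = 500
    X = np.column_stack([
        rng.normal(size=n_turns),     # e.g., answer quality score
        rng.normal(size=n_turns),     # e.g., response time
        rng.normal(size=n_turns),     # e.g., turn length in words
        rng.integers(0, 2, n_turns),  # e.g., tutor feedback polarity
    ])
    y = rng.choice(STATES, size=n_turns)  # affect labels from one human judge

    # A standard classifier, evaluated with cross-validated predictions.
    clf = LogisticRegression(max_iter=1000)
    y_pred = cross_val_predict(clf, X, y, cv=10)

    # Accuracy against the chance baseline (20% for five states, matching the
    # abstract's "with neutral" condition; 25% and 50% apply to the others).
    acc = accuracy_score(y, y_pred)
    print(f"accuracy = {acc:.2f} (chance = {1 / len(STATES):.2f})")

    # Human-machine agreement, analogous to the follow-up analysis comparing
    # machine-generated affect labels with each human judge's labels.
    print(f"kappa = {cohen_kappa_score(y, y_pred):.2f}")

On purely random labels this yields roughly chance-level accuracy and near-zero kappa; the paper’s finding is that real dialogue features lift these figures well above chance, with the trained judges’ labels supporting the strongest models.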

Keywords

Affect detection · Human-computer interaction · Human-computer dialogue · Dialogue features · Discourse markers · Conversational cues · Intelligent Tutoring Systems · AutoTutor


Copyright information

© Springer Science+Business Media B.V. 2007

Authors and Affiliations

  • Sidney K. D’Mello (1)
  • Scotty D. Craig (2)
  • Amy Witherspoon (3)
  • Bethany McDaniel (3)
  • Arthur Graesser (3)

  1. Department of Computer Science, The University of Memphis, Memphis, USA
  2. Learning Research and Development Center, University of Pittsburgh, Pittsburgh, USA
  3. Department of Psychology, The University of Memphis, Memphis, USA
