Computationally Augmented Ethnography: Emotion Tracking and Learning in Museum Games

  • Kit Martin
  • Emily Q. Wang
  • Connor Bain
  • Marcelo Worsley
Conference paper
Part of the Communications in Computer and Information Science book series (CCIS, volume 1112)


In this paper, we describe a way of using multimodal learning analytics to augment qualitative data. We extract facial expressions that may indicate particular emotions from videos of dyads playing an interactive tabletop game built for a museum. From these data, we explore the correlation between students' understanding of the biological and complex systems concepts showcased in the learning environment and their facial expressions. First, we show how information retrieval techniques can be applied to facial expression features to investigate emotional variation during key moments of the interaction. Second, we connect these features to moments of learning identified by traditional qualitative methods. Finally, we present an initial pilot using these methods in concert to identify key moments across multiple modalities. We end with a discussion of our preliminary findings on interweaving machine and human analytical approaches.
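The workflow sketched above — extracting per-frame facial-expression intensities and flagging key emotional moments for qualitative review — can be illustrated with a minimal example. This is a hypothetical sketch, not the authors' implementation: it assumes a per-second intensity series for one expression (as exported by a typical expression-analysis tool, values in [0, 1]) and flags moments whose intensity is unusually high relative to the session baseline.

```python
from statistics import mean, stdev

# Hypothetical per-second "joy" intensities for one dyad's session,
# as a facial-expression analysis tool might export them (values in [0, 1]).
joy = [0.05, 0.10, 0.08, 0.60, 0.72, 0.15, 0.09, 0.11, 0.65, 0.07]

def peak_moments(series, z=1.0):
    """Return time indices (seconds) where intensity exceeds
    the session mean by more than z standard deviations."""
    m, s = mean(series), stdev(series)
    return [t for t, v in enumerate(series) if v > m + z * s]

# Flag seconds 3-4 and 8 as candidate key moments for qualitative review.
print(peak_moments(joy))  # -> [3, 4, 8]
```

In a full pipeline, flagged timestamps like these would be cross-referenced against the video and field notes to check whether the emotional peaks align with moments of learning identified by human coders.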


Keywords: Multimodal learning analytics · Affect tracking · Game-based learning · Physical traces



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Northwestern University, Evanston, USA
