Human-Computer Interaction – INTERACT 2015, pp. 263–280

LEGO Pictorial Scales for Assessing Affective Response

  • Mohammad Obaid
  • Andreas Dünser
  • Elena Moltchanova
  • Danielle Cummings
  • Johannes Wagner
  • Christoph Bartneck
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9296)

Abstract

This article presents the design and evaluation of novel pictorial scales, based on LEGO Minifigures, for assessing emotional response. We describe the semi-automatic creation of two pictorial scales (the LEGO Face Scale and the Stylized LEGO Face Scale) and report on two studies conducted to assess their validity. The first study evaluated the rating of emotions expressed by other humans; the second focused on rating one’s own emotional state when viewing expressive stimuli. In both studies, we assess the validity of the two pictorial scales by comparing them against ratings given on a conventional Likert scale. Results show that ratings of expressive faces made with the proposed pictorial scales can differ from those made on a Likert scale, whereas ratings of one’s own emotional state show no such difference. Finally, we present a physical version of the LEGO Face Scale and discuss future work.

Keywords

LEGO minifigures · Evaluation · Pictorial Emotion Scale

Copyright information

© IFIP International Federation for Information Processing 2015

Authors and Affiliations

  • Mohammad Obaid, t2i Lab, Chalmers University of Technology, Gothenburg, Sweden
  • Andreas Dünser, Digital Productivity, CSIRO, Hobart, Australia
  • Elena Moltchanova, Mathematics and Statistics Department, University of Canterbury, Christchurch, New Zealand
  • Danielle Cummings, Texas A&M University, College Station, USA
  • Johannes Wagner, Human Centered Multimedia, Augsburg University, Augsburg, Germany
  • Christoph Bartneck, Human Interface Technology Lab New Zealand, University of Canterbury, Christchurch, New Zealand
