Towards a Conceptual Framework for the Objective Evaluation of User Experience

  • Carolina Rico-Olarte (corresponding author)
  • Diego M. López
  • Sara Kepplinger
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10918)


Background: The concept of user experience (UX) encompasses many aspects and perspectives, which makes it difficult to evaluate everything that UX represents. Despite existing standards, a clear definition of UX evaluation is still missing: one that identifies the aspects to be evaluated for each of the perspectives forming the UX (user and system), taking a given context of use into account. Objective: To propose a conceptual framework for distinguishing the UX evaluation perspectives and their measurable aspects. Methods: We followed a qualitative method for building conceptual frameworks. Results: The proposed conceptual framework identifies and associates the main UX concepts from the user and system perspectives. The resulting plane of concepts provides a better overview of the phenomenon under study. The framework led to the definition of an objective UX evaluation method: physiological signals are the convergence point between the physical state of the user and the measurement of emotions. Conclusion: UX evaluation is particularly important in ICT solutions for health, since users/patients must remain motivated to keep using the technology in order to guarantee adherence to their treatments or interventions. The framework and method obtained are a first step towards suitable, context-appropriate UX evaluation processes that enable an improved interaction between user and system.


Keywords: User experience · Conceptual framework · Evaluation methods · Objective evaluation



The General Royalty System (SGR) in Colombia financed this research work under the “InnovAcción Cauca” program and the HapHop-Fisio project (VRI ID 4441).



Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Carolina Rico-Olarte¹ (corresponding author)
  • Diego M. López¹
  • Sara Kepplinger²

  1. Universidad del Cauca, Popayán, Colombia
  2. Fraunhofer Institute for Digital Media Technology, Ilmenau, Germany
