Evaluation of UX Methods: Lessons Learned When Evaluating a Multi-user Mobile Application

  • Bruna Moraes Ferreira (email author)
  • Luís Rivero
  • Natasha M. Costa Valentim
  • Renata Zilse
  • Andrew Koster
  • Tayana Conte
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9731)

Abstract

The User Experience (UX) of a software product is influenced by both pragmatic and hedonic aspects, so it is necessary to choose a UX evaluation method that takes both of these aspects into account. In this paper, we report the lessons learned from applying different UX evaluation methods (3E, 3E*, SAM, MAX, EM, Think Aloud, and Observation) during the prototyping of a multi-user mobile application. We analyzed the methods in terms of: (i) the type of problems they identified; (ii) their contribution to improving the prototype in each development phase; (iii) the difficulties encountered when applying each method; and (iv) the difficulties encountered when analyzing its results. We found that SAM and MAX were the easiest methods both to apply and to analyze. Like EM, they are best suited to identifying hedonic problems, whereas Think Aloud, EM, and 3E* best identify pragmatic ones.

Keywords

User Experience · Usability · Evaluation method · Lessons learned

Notes

Acknowledgment

We thank Professors Rafael Bordini, Felipe Meneguzzi, Renata Vieira, and all their team. We also thank all the participants in the evaluations. We would like to acknowledge the financial support granted by the “Large Scale Qualification PROgram on MOBILE Technologies”, which is supported by Samsung Eletrônica da Amazônia Ltda under the terms of Informatics Law number 8387/91; by CAPES; and by FAPEAM through process numbers 062.00600/2014 and 062.00578/2014.

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • Bruna Moraes Ferreira (1, email author)
  • Luís Rivero (1)
  • Natasha M. Costa Valentim (1)
  • Renata Zilse (2)
  • Andrew Koster (2)
  • Tayana Conte (1)
  1. USES Research Group, Federal University of Amazonas, Manaus, Brazil
  2. Samsung Research Institute Brazil, Campinas, Brazil