Heuristic Evaluation for Novice Evaluators

  • André de Lima Salgado
  • Renata Pontin de Mattos Fortes
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 9746)


Adapting the Heuristic Evaluation method for novice evaluators can empower organizations with limited budgets that usually cannot afford to hire usability experts. In one of the courses given by the authors, 12 of 15 novice evaluators (80 %) said they had difficulty distinguishing among the traditional usability heuristics. The aim of this study was to explore this claim and to develop adaptations that could mitigate the problem. Surveys with 13 usability experts and 15 novice evaluators showed that the 3rd and 7th heuristics from the traditional set of Nielsen and Molich are probably the most difficult for novices to understand and to distinguish from each other. In a third survey, with 7 usability experts, we discussed new descriptions for heuristics 3 and 7, intended to make them easier for novice evaluators to understand and distinguish. Future studies can validate the adaptations proposed here.


Keywords: Heuristic Evaluation · Usability heuristics · Novice evaluator



We thank all volunteers who took part in the interviews, and CAPES and FAPESP for their great support. We also thank the research groups Intermídia, from USP São Carlos, and ALCANCE, from the Federal University of Lavras, for their kind help. Finally, we thank Professor Rudinei Goularte for his valuable advice on this study.

This study was supported by the grant #2015/09493-5, São Paulo Research Foundation (FAPESP).


  1. Aljohani, M., Blustein, J.: Heuristic evaluation of university institutional repositories based on DSpace. In: Marcus, A. (ed.) DUXU 2015, Part III. LNCS, vol. 9188, pp. 119–130. Springer, Heidelberg (2015)
  2. Borys, M., Laskowski, M.: Expert vs. novice evaluators: comparison of heuristic evaluation assessment. In: Proceedings of the 16th International Conference on Enterprise Information Systems, ICEIS 2014, vol. 3, pp. 144–149. SciTePress (2014)
  3. Botella, F., Alarcon, E., Peñalver, A.: A new proposal for improving heuristic evaluation reports performed by novice evaluators. In: Proceedings of the 2013 Chilean Conference on Human-Computer Interaction, ChileCHI 2013, pp. 72–75. ACM, New York (2013)
  4. Botella, F., Alarcon, E., Peñalver, A.: How to classify to experts in usability evaluation. In: Proceedings of the XV International Conference on Human Computer Interaction, Interacción 2014, pp. 25:1–25:4. ACM, New York (2014)
  5. Følstad, A., Law, E., Hornbæk, K.: Analysis in practical usability evaluation: a survey study. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2012, pp. 2127–2136. ACM, New York (2012)
  6. Hertzum, M., Jacobsen, N.E.: The evaluator effect: a chilling fact about usability evaluation methods. Int. J. Hum. Comput. Interact. 13(4), 421–443 (2001)
  7. Johannessen, G.H.J., Hornbæk, K.: Must evaluation methods be about usability? Devising and assessing the utility inspection method. Behav. Inf. Technol. 33(2), 195–206 (2014)
  8. de Lima Salgado, A., Freire, A.P.: Heuristic evaluation of mobile usability: a mapping study. In: Kurosu, M. (ed.) HCII 2014, Part III. LNCS, vol. 8512, pp. 178–188. Springer, Heidelberg (2014)
  9. Lowry, P.B., Roberts, T.L., Romano Jr., N.C.: What signal is your inspection team sending to each other? Using a shared collaborative interface to improve shared cognition and implicit coordination in error-detection teams. Int. J. Hum. Comput. Stud. 71(4), 455–474 (2013)
  10. MacFarlane, S., Pasiali, A.: Adapting the heuristic evaluation method for use with children. In: Workshop on Child Computer Interaction: Methodological Research, Interact, pp. 28–31 (2005)
  11. MacFarlane, S., Sim, G., Horton, M.: Assessing usability and fun in educational software. In: Proceedings of the 2005 Conference on Interaction Design and Children, pp. 103–109. ACM (2005)
  12. Mankoff, J., Dey, A.K., Hsieh, G., Kientz, J., Lederer, S., Ames, M.: Heuristic evaluation of ambient displays. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI 2003, pp. 169–176. ACM, New York (2003)
  13. Martins, A.I., Queirós, A., Silva, A.G., Rocha, N.P.: Usability evaluation methods: a systematic review. In: Human Factors in Software Development and Design, p. 250 (2014)
  14. Nielsen, J.: Finding usability problems through heuristic evaluation. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 373–380. ACM (1992)
  15. Nielsen, J.: Heuristic evaluation. In: Usability Inspection Methods, vol. 17, pp. 25–62. John Wiley & Sons, New York (1994)
  16. Nielsen, J.: 10 usability heuristics for user interface design (1995)
  17. Nielsen, J., Molich, R.: Heuristic evaluation of user interfaces. In: Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 249–256. ACM (1990)
  18. Paz, F., Paz, F.A., Pow-Sang, J.A.: Experimental case study of new usability heuristics. In: Marcus, A. (ed.) DUXU 2015, Part I. LNCS, vol. 9186, pp. 212–223. Springer, Heidelberg (2015)
  19. Read, J.: Children as participants in design and evaluation. Interactions 22(2), 64–66 (2015)
  20. Preece, J., Sharp, H., Rogers, Y.: Interaction Design: Beyond Human-Computer Interaction, 4th edn. John Wiley & Sons Ltd., Chichester (2015)
  21. Salian, K., Sim, G.: Simplifying heuristic evaluation for older children. In: Proceedings of the India HCI 2014 Conference on Human Computer Interaction, IndiaHCI 2014, pp. 26:26–26:34. ACM, New York (2014)
  22. Salian, K., Sim, G., Read, J.C.: Can children perform a heuristic evaluation? In: Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction, APCHI 2013, pp. 137–141. ACM, New York (2013)
  23. Scheller, T., Kühn, E.: Automated measurement of API usability: the API concepts framework. Inf. Softw. Technol. 61, 145–162 (2015)
  24. Slavkovic, A., Cross, K.: Novice heuristic evaluations of a complex interface. In: CHI 1999 Extended Abstracts on Human Factors in Computing Systems, CHI EA 1999, pp. 304–305. ACM, New York (1999)
  25. Wodike, O.A., Sim, G., Horton, M.: Empowering teenagers to perform a heuristic evaluation of a game. In: Proceedings of the 28th International BCS Human Computer Interaction Conference on HCI 2014 - Sand, Sea and Sky - Holiday HCI, pp. 353–358. BCS (2014)

Copyright information

© Springer International Publishing Switzerland 2016

Authors and Affiliations

  • André de Lima Salgado
  • Renata Pontin de Mattos Fortes

  ICMC, University of São Paulo, São Carlos, Brazil
