
Guiding Usability Newcomers to Understand the Context of Use: Towards Models of Collaborative Heuristic Evaluation

  • Chapter
  • First Online:
Behavior Engineering and Applications

Abstract

Usability inspection methods are susceptible to the expertise effect: their results vary with the expertise of individual evaluators. Hence, understanding the difficulties faced by evaluators of low expertise (novices and newcomers) is a prerequisite for moving the field forward. However, the following question remains: which of the terms that compose usability (User, Goal, Effectiveness, Efficiency, Satisfaction, Context of Use, and Task) is the most difficult for newcomers to understand? This exploratory study compares usability newcomers’ difficulties in understanding the different terms that compose usability, based on the definitions presented in ISO/IEC 25066. To this end, we conducted a survey with 38 usability newcomers. Observations from our survey suggest that Context of Use may be the most difficult term for newcomers to understand. We therefore suggest the adoption of scenarios, storyboards, and domain-specific principles as a basis for newcomers in Heuristic Evaluations (HEs) when practitioners cannot count on experts for the inspection. In addition, we propose three different models of Collaborative Heuristic Evaluation intended to provide newcomers with important insights about the Context of Use. Finally, we suggest studying the validity of these models as future work.



Notes

  1. https://www.iso.org/standard/63831.html

  2. www.nngroup.com/articles/ten-usability-heuristics/

References

  1. Boone, H. N., & Boone, D. A. (2012). Analyzing Likert data. Journal of Extension, 50(2), 1–5.

  2. Brajnik, G., Yesilada, Y., & Harper, S. (2011). The expertise effect on web accessibility evaluation methods. Human–Computer Interaction, 26(3), 246–283. https://doi.org/10.1080/07370024.2011.601670

  3. Bruun, A., & Stage, J. (2014). Barefoot usability evaluations. Behaviour & Information Technology, 33(11), 1148–1167. https://doi.org/10.1080/0144929X.2014.883552

  4. Bruun, A., & Stage, J. (2015). New approaches to usability evaluation in software development: Barefoot and crowdsourcing. Journal of Systems and Software, 105, 40–53. https://doi.org/10.1016/j.jss.2015.03.043

  5. Bryman, A. (2015). Social research methods. Oxford University Press. ISBN 978-0-19-968945-3.

  6. Buykx, L. (2009). Improving heuristic evaluation through collaborative working. Master's dissertation, Department of Computer Science, University of York.

  7. Chambel, T. (2016, November). Interactive and immersive media experiences. In Proceedings of the 22nd Brazilian Symposium on Multimedia and the Web (pp. 1–1). ACM. https://doi.org/10.1145/2976796.2984746

  8. de Lima Salgado, A., & Fortes, R. P. M. (2016). Heuristic evaluation for novice evaluators. In International Conference of Design, User Experience, and Usability (pp. 387–398). Springer International Publishing. https://doi.org/10.1007/978-3-319-40409-7_37

  9. DeLone, W. H., & McLean, E. R. (1992). Information systems success: The quest for the dependent variable. Information Systems Research, 3(1), 60–95. https://doi.org/10.1287/isre.3.1.60

  10. DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9–30. https://doi.org/10.1080/07421222.2003.11045748

  11. Demers, R. A. (1981). System design for usability. Communications of the ACM, 24(8), 494–501. https://doi.org/10.1145/358722.358730

  12. Ebling, M. R., & Want, R. (2017). Pervasive computing revisited. IEEE Pervasive Computing, 16(3), 17–19. https://doi.org/10.1109/MPRV.2017.2940959

  13. Ferreira, D. F. (2008). Estatística multivariada [Multivariate statistics]. Editora UFLA. ISBN 978-85-87692-52-8.

  14. Georgsson, M., Weir, C., & Staggers, N. (2014). Revisiting heuristic evaluation methods to improve the reliability of findings. In MIE2014. IOS Press. https://doi.org/10.3233/978-1-61499-432-9-930

  15. Hair, J. F., Anderson, R. E., Babin, B. J., & Black, W. C. (2010). Multivariate data analysis: A global perspective. Pearson, Upper Saddle River, NJ. ISBN 0133792684.

  16. Hermawati, S., & Lawson, G. (2016). Establishing usability heuristics for heuristics evaluation in a specific domain: Is there a consensus? Applied Ergonomics, 56, 34–51. https://doi.org/10.1016/j.apergo.2015.11.016

  17. Hertzum, M. (2017). Commentary: Usability—A sensitizing concept. Human–Computer Interaction, 1–4. https://doi.org/10.1080/07370024.2017.1302800

  18. Hertzum, M., & Jacobsen, N. E. (2001). The evaluator effect: A chilling fact about usability evaluation methods. International Journal of Human-Computer Interaction, 13(4), 421–443. https://doi.org/10.1207/S15327590IJHC1304_05

  19. Hornbæk, K. (2010). Dogmas in the assessment of usability evaluation methods. Behaviour & Information Technology, 29(1), 97–111. https://doi.org/10.1080/01449290801939400

  20. Huang, B. (2012). A comparison of remote collaborative heuristic evaluation by novices and experts with user-based evaluation. Master's dissertation, Department of Computer Science, University of York.

  21. Hung, P. C. K., Tang, J. K. T., & Kanev, K. (2017). Introduction. In J. Tang & P. Hung (Eds.), Computing in Smart Toys (pp. 1–5). International Series on Computer Entertainment and Media Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-62072-5_1

  22. ISO/IEC 25066 (2016). Systems and software engineering - Systems and software Quality Requirements and Evaluation (SQuaRE) - Common Industry Format (CIF) for Usability - Evaluation Report.

  23. ISO 9241-210 (2010). Ergonomics of human-system interaction - Part 210: Human-centred design for interactive systems.

  24. Johannessen, G. H. J., & Hornbæk, K. (2014). Must evaluation methods be about usability? Devising and assessing the utility inspection method. Behaviour & Information Technology, 33(2), 195–206. https://doi.org/10.1080/0144929X.2012.751708

  25. Kangas, K., Seitamaa-Hakkarainen, P., & Hakkarainen, K. (2013). Design expert’s participation in elementary students’ collaborative design process. International Journal of Technology and Design Education, 23(2), 161–178. https://doi.org/10.1007/s10798-011-9172-6

  26. Lewis, J. R. (2014). Usability: Lessons learned… and yet to be learned. International Journal of Human-Computer Interaction, 30(9), 663–684. https://doi.org/10.1080/10447318.2014.930311

  27. MacDonald, C. M., & Atwood, M. E. (2013, April). Changing perspectives on evaluation in HCI: Past, present, and future. In CHI’13 Extended Abstracts on Human Factors in Computing Systems (pp. 1969–1978). ACM. https://doi.org/10.1145/2468356.2468714

  28. MacFarlane, S., & Pasiali, A. (2005). Adapting the heuristic evaluation method for use with children. In Workshop on Child Computer Interaction: Methodological Research, Interact (pp. 28–31).

  29. MacFarlane, S., Sim, G., & Horton, M. (2005, June). Assessing usability and fun in educational software. In Proceedings of the 2005 Conference on Interaction Design and Children (pp. 103–109). ACM.

  30. Martins, A. I., Queirós, A., Silva, A. G., & Rocha, N. P. (2014). Usability evaluation methods: A systematic review. In Human Factors in Software Development and Design (p. 250). https://doi.org/10.4018/978-1-4666-6485-2.ch013

  31. Molich, R., & Nielsen, J. (1990). Improving a human-computer dialogue. Communications of the ACM, 33(3), 338–348. https://doi.org/10.1145/77481.77486

  32. Nielsen, J., & Molich, R. (1989). Teaching user interface design based on usability engineering. ACM SIGCHI Bulletin, 21(1), 45–48. https://doi.org/10.1145/67880.67885

  33. Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 249–256). ACM. https://doi.org/10.1145/97243.97281

  34. Nielsen, J. (1995). How to conduct a heuristic evaluation. Nielsen Norman Group. https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/

  35. Nielsen, J. (1992). Finding usability problems through heuristic evaluation. In Proceedings of the ACM CHI’92 Conference (Monterey, CA, May 3–7) (pp. 373–380). https://doi.org/10.1145/142750.142834

  36. Nielsen, J. (1994, April). Enhancing the explanatory power of usability heuristics. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 152–158). ACM. https://doi.org/10.1145/191666.191729

  37. Nielsen, J. Heuristic evaluation: How-to. Nielsen Norman Group. https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/ (Accessed May 2, 2018)

  38. Nielsen, J. (2002). Becoming a usability professional. Nielsen Norman Group. https://www.nngroup.com/articles/becoming-a-usability-professional/

  39. Nielsen, J. (2012). Usability 101: Introduction to usability. Nielsen Norman Group. https://www.nngroup.com/articles/usability-101-introduction-to-usability

  40. Othman, M. K., Mahudin, F., Ahaguk, C. H., & Rahman, M. F. A. (2014, September). Mobile guide technologies (smartphone apps): Collaborative Heuristic Evaluation (CHE) with expert and novice users. In 2014 3rd International Conference on User Science and Engineering (i-USEr) (pp. 232–236). IEEE. https://doi.org/10.1109/IUSER.2014.7002708

  41. Paz, F., & Pow-Sang, J. A. (2016). A systematic mapping review of usability evaluation methods for software development process. International Journal of Software Engineering and Its Applications, 10(1), 165–178.

  42. Petrie, H., & Buykx, L. (2010). Collaborative Heuristic Evaluation: Improving the effectiveness of heuristic evaluation. In Proceedings of the UPA 2010 International Conference.

  43. Petrie, H., & Power, C. (2012, May). What do users really care about? A comparison of usability problems found by users and experts on highly interactive websites. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 2107–2116). ACM. https://doi.org/10.1145/2207676.2208363

  44. Petter, S., DeLone, W., & McLean, E. R. (2013). Information systems success: The quest for the independent variables. Journal of Management Information Systems, 29(4), 7–62. https://doi.org/10.2753/MIS0742-1222290401

  45. Popovic, V. (2004). Expertise development in product design—strategic and domain-specific knowledge connections. Design Studies, 25(5), 527–545. https://doi.org/10.1016/j.destud.2004.05.006

  46. Read, J. (2015). Children as participants in design and evaluation. Interactions, 22(2), 64–66. https://doi.org/10.1145/2735710

  47. Renzi, A. B., Chammas, A., Agner, L., & Greenshpan, J. (2015, August). Startup Rio: User experience and startups. In International Conference of Design, User Experience, and Usability (pp. 339–347). Springer, Cham. https://doi.org/10.1007/978-3-319-20886-2_32

  48. Rusu, C., Rusu, V., Roncagliolo, S., Apablaza, J., & Rusu, V. Z. (2015, August). User experience evaluations: Challenges for newcomers. In International Conference of Design, User Experience, and Usability (pp. 237–246). Springer, Cham. https://doi.org/10.1007/978-3-319-20886-2_23

  49. Salian, K., & Sim, G. (2014, December). Simplifying heuristic evaluation for older children. In Proceedings of the India HCI 2014 Conference on Human Computer Interaction (p. 26). ACM. https://doi.org/10.1145/2676702.2676704

  50. Salian, K., Sim, G., & Read, J. C. (2013, September). Can children perform a heuristic evaluation? In Proceedings of the 11th Asia Pacific Conference on Computer Human Interaction (pp. 137–141). ACM. https://doi.org/10.1145/2525194.2525200

  51. Wei, T., & Simko, V. (2016). corrplot: Visualization of a correlation matrix. R package version 0.77. CRAN, Vienna, Austria.

  52. Wodike, O. A., Sim, G., & Horton, M. (2014, September). Empowering teenagers to perform a heuristic evaluation of a game. In Proceedings of the 28th International BCS Human Computer Interaction Conference (HCI 2014 - Sand, Sea and Sky - Holiday HCI) (pp. 353–358). BCS.


Acknowledgments

This study was supported by grants 2017/15239-0 and 2015/24525-0, São Paulo Research Foundation (FAPESP). We also thank CAPES and the University of São Paulo for their important support.

Author information


Corresponding author

Correspondence to André de Lima Salgado.


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

de Lima Salgado, A., de Souza Santos, F., de Mattos Fortes, R.P., Hung, P.C.K. (2018). Guiding Usability Newcomers to Understand the Context of Use: Towards Models of Collaborative Heuristic Evaluation. In: Wong, R., Chi, CH., Hung, P. (eds) Behavior Engineering and Applications. International Series on Computer Entertainment and Media Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-76430-6_7

Download citation

  • DOI: https://doi.org/10.1007/978-3-319-76430-6_7

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-76429-0

  • Online ISBN: 978-3-319-76430-6

  • eBook Packages: Computer Science; Computer Science (R0)
