Active Collaborative Learning: Supporting Software Developers in Creating Redesign Proposals

  • Anders Bruun
  • Janne Juul Jensen
  • Mikael B. Skov
  • Jan Stage
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8742)

Abstract

Redesign proposals have been suggested as a means to improve the feedback from usability evaluation to software development. Yet redesign proposals are usually created by usability specialists without involving the software developers who will implement them. This paper reports on an exploratory study in which redesign proposals were created in an active and collaborative learning process involving both software developers and usability specialists. The focus was on the support the developers needed in order to contribute constructively to improving the usability of the system. The findings show that this process had a considerable impact on the developers’ understanding of the usability problems, especially the weaknesses of the system. The developers were able to contribute constructively to creating redesign proposals, and they found the workshop very useful for their future efforts to eliminate the usability problems that had been identified.

Keywords

Usability evaluation · Usability problem · Redesign proposal · Developer involvement · Active collaborative learning · Exploratory study



Copyright information

© IFIP International Federation for Information Processing 2014

Authors and Affiliations

  • Anders Bruun (1)
  • Janne Juul Jensen (1, 2)
  • Mikael B. Skov (1)
  • Jan Stage (1)

  1. Department of Computer Science, Aalborg University, Aalborg, Denmark
  2. Trifork A/S, Århus C, Denmark
