Empirical Software Engineering, Volume 4, Issue 1, pp 71–104

The Usability Problem Taxonomy: A Framework for Classification and Analysis

  • Susan L. Keenan
  • H. Rex Hartson
  • Dennis G. Kafura
  • Robert S. Schulman

Abstract

Although much can be gained by analyzing usability problems, there is no overall framework in which large sets of usability problems can be easily classified, compared, and analyzed. Current approaches to problem analysis that focus on identifying specific problem characteristics (such as severity or cost-to-fix) do provide additional information to the developer; however, they do not adequately support high-level (global) analysis. High-level approaches to problem analysis depend on the developer/evaluator's ability to group problems, yet commonly used techniques for organizing usability problems are incomplete and/or provide inadequate information for problem correction. This paper presents the Usability Problem Taxonomy (UPT), a taxonomic model in which usability problems detected in graphical user interfaces with textual components are classified from both an artifact and a task perspective. The UPT was built empirically using over 400 usability problem descriptions collected on real-world development projects. The UPT has two components and contains 28 categories: 19 are in the artifact component and nine are in the task component. A study was conducted showing that problems can be classified reliably using the UPT. Techniques for high-level problem analysis are explored using UPT classification of a set of usability problems detected during an evaluation of a CASE tool. In addition, ways to augment or complement existing problem analysis strategies using UPT analysis are suggested. A summary of reports from two developers who have used the UPT in the workplace provides anecdotal evidence indicating that UPT classification has improved problem identification, reporting, analysis, and prioritization prior to correction.

Keywords: usability problem classification, usability problem analysis, problem prioritization
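To make the dual-perspective classification concrete, the following is a minimal sketch (not taken from the paper) of how problems classified from both an artifact and a task perspective might be recorded and tallied to support high-level analysis. The category names used here are hypothetical placeholders; the UPT itself defines 19 artifact categories and nine task categories in the full text.

```python
# Illustrative sketch only: records a usability problem with both a UPT-style
# artifact classification and a task classification, then tallies problems by
# category to support high-level (global) analysis and prioritization.
# The category strings below are placeholders, not the UPT's actual category names.

from collections import Counter
from dataclasses import dataclass


@dataclass
class UsabilityProblem:
    description: str
    artifact_category: str  # one of the UPT's 19 artifact categories (placeholder here)
    task_category: str      # one of the UPT's 9 task categories (placeholder here)


def category_profile(problems):
    """Count problems per artifact category and per task category."""
    by_artifact = Counter(p.artifact_category for p in problems)
    by_task = Counter(p.task_category for p in problems)
    return by_artifact, by_task


if __name__ == "__main__":
    problems = [
        UsabilityProblem("Icon meaning unclear",
                         "visual presentation (placeholder)",
                         "task-mapping (placeholder)"),
        UsabilityProblem("Confusing error message wording",
                         "wording/language (placeholder)",
                         "task-facilitation (placeholder)"),
        UsabilityProblem("Related controls scattered across dialogs",
                         "visual presentation (placeholder)",
                         "task-facilitation (placeholder)"),
    ]
    by_artifact, by_task = category_profile(problems)
    print("Problems by artifact category:", dict(by_artifact))
    print("Problems by task category:", dict(by_task))
```

Clusters in either tally point the developer toward categories that may deserve attention first, which is the kind of global analysis the paper explores with the CASE-tool evaluation data.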



Copyright information

© Kluwer Academic Publishers 1999

Authors and Affiliations

  • Susan L. Keenan (1)
  • H. Rex Hartson (2)
  • Dennis G. Kafura (2)
  • Robert S. Schulman (2)

  1. Shrewsbury
  2. Department of Computer Science, Virginia Tech, Blacksburg
