Usability measurement and metrics: A consolidated model

Abstract

Usability is increasingly recognized as an important quality factor for interactive software systems, including traditional GUI-style applications, Web sites, and the large variety of mobile and PDA interactive services. Unusable user interfaces are probably the single largest reason why encompassing interactive systems (computers plus people) fail in actual use. Designing this diversity of applications so that they actually achieve their intended purposes in terms of ease of use is not an easy task. Although there are many individual methods for evaluating usability, they are not well integrated into a single conceptual framework that facilitates their use by developers who are not trained in the field of HCI. This is true in part because there are now several different standards (e.g., ISO 9241, ISO/IEC 9126, IEEE Std. 610.12) and conceptual models (e.g., Metrics for Usability Standards in Computing [MUSiC]) for usability, and not all of these standards and models describe the same operational definitions and measures. This paper first reviews existing usability standards and models, highlighting the limitations and complementarities of the various standards. It then explains how these various models can be unified into a single consolidated, hierarchical model of usability measurement, called Quality in Use Integrated Measurement (QUIM). The QUIM model comprises 10 factors, each of which corresponds to a specific facet of usability identified in an existing standard or model. These 10 factors are decomposed into a total of 26 sub-factors, or measurable criteria, which are further decomposed into 127 specific metrics. The paper also explains how a consolidated model such as QUIM can help in developing a usability measurement theory.
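
To make the hierarchical decomposition concrete (factors break down into measurable criteria, which are in turn quantified by metrics), the following is a minimal sketch in Python of how such a consolidated model could be represented. The specific factor, criterion, and metric names, the [0, 1] normalization, and the unweighted averaging are illustrative assumptions, not definitions taken from QUIM or from any of the standards reviewed in the paper.

```python
# Sketch of a factors -> criteria -> metrics hierarchy; names and weights are illustrative.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Metric:
    name: str
    value: float  # assumed to be normalized to [0, 1] so scores can be averaged

@dataclass
class Criterion:
    name: str
    metrics: List[Metric] = field(default_factory=list)

    def score(self) -> float:
        # Unweighted mean of the metric values; a real model would likely weight them.
        return sum(m.value for m in self.metrics) / len(self.metrics)

@dataclass
class Factor:
    name: str
    criteria: List[Criterion] = field(default_factory=list)

    def score(self) -> float:
        return sum(c.score() for c in self.criteria) / len(self.criteria)

# Hypothetical fragment of the hierarchy: one factor, two criteria, three metrics.
efficiency = Factor("Efficiency", [
    Criterion("Time behaviour", [
        Metric("normalized task completion time", 0.72),
    ]),
    Criterion("Resource utilization", [
        Metric("normalized keystrokes per task", 0.65),
        Metric("normalized error recovery time", 0.80),
    ]),
])

print(f"{efficiency.name}: {efficiency.score():.2f}")  # prints "Efficiency: 0.72"
```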

References

  • Ahuja, V. 2000. Building trust in electronic commerce, IT Professional 2: 61–63.

  • Atif, Y. 2002. Building trust in e-commerce, IEEE Internet Computing 6: 18–24.

  • Bevan, N. 1995. Measuring usability as quality of use, Software Quality Journal 4: 115–130.

  • Bevan, N. and Azuma, M. 1997. Quality in Use: Incorporating human factors into the software engineering lifecycle, Proceedings of the Third IEEE International Symposium and Forum on Software Engineering Standards, Walnut Creek, CA, pp. 169–179.

  • Bevan, N. and Macleod, M. 1994. Usability measurement in context, Behaviour and Information Technology 13: 132–145.

  • Bevan, N. and Schoeffel, R. 2001. A proposed standard for consumer product usability, Proceedings of 1st International Conference on Universal Access in Human Computer Interaction, New Orleans, LA, pp. 557–561.

  • Boehm, B.W., Abts, C., Brown, A.W., Chulani, S., Clark, B.F., and Steece, B. 2000. Software Cost Estimation with COCOMO II, New York: Prentice-Hall.

  • Boehm, B.W., Brown, J.R., Kaspar, H., Lipow, M., Macleod, G., and Merritt, M. J. 1978. Characteristics of Software Quality, New York: North Holland.

  • Caldwell, B., Chisholm, W., Vanderheiden, G., and White, J. (Eds.), 2004. Web Content Accessibility Guidelines 2.0, W3C Working Draft 30 July 2004, World Wide Web Consortium. Retrieved July 3, 2005 from http://www.w3.org/TR/2004/WD-WCAG20-20040730/.

  • Cheskin Research and Studio Archetype/Sapient 1999. e-Commerce trust study. Retrieved June 30, 2005 from http://www.cheskin.com/docs/sites/1/report-eComm%20Trust1999.pdf.

  • Constantine, L.L. and Lockwood, L.A.D. 1999. Software for Use: A Practical Guide to the Models and Methods of Usage-Centred Design, New York: Addison-Wesley.

  • Council of the European Union, 1990. Council Directive 90/270/EEC on the Minimum Safety and Health Requirements for Work with Display Screen Equipment, Official Journal of the European Communities L 156: 14–18.

  • Curtis, B., 1980. Measurement and experimentation in software engineering, Proceedings of the IEEE 68: 1144–1157.

  • Fenton, N. E., and Whitty, R., 1995. Software Quality Assurance and Measurement: A Worldwide Perspective, London: International Thomson Computer Press.

  • Friedman, B., Kahn, P.H., Jr., and Howe, D.C. 2000. Trust online, Communications of the ACM 43: 34–40.

  • Hyatt, L.E. and Rosenberg, L.H. 1996. A software quality model and metrics for identifying project risks and assessing software quality. Retrieved July 3, 2005 from http://satc.gsfc.nasa.gov/support/STC_APR96/qualtiy/stc_qual.PDF.

  • Institute of Electrical and Electronics Engineers, 1990. 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology, Los Alamitos, CA: Author.

  • Institute of Electrical and Electronics Engineers, 1998. 1061-1998, Standard for a Software Quality Metrics Methodology, Los Alamitos, CA: Author.

  • International Electrotechnical Commission, 2004. IEC 60300-3-9, Ed. 2.0, Dependability Management, Part 3-9: Application Guide, Risk Analysis of Technological Systems, Geneva: Author.

  • International Organization for Standardization, 1998. ISO 9241-11, Ergonomic Requirements for Office Work with Visual Display Terminals (VDTs), Part 11: Guidance on Usability, Geneva: Author.

  • International Organization for Standardization, 1999. ISO 13407:1999, Human-Centered Design Processes for Interactive Systems, Geneva: Author.

  • International Organization for Standardization/International Electrotechnical Commission, 1991. ISO/IEC 9126, Information Technology, Software Product Evaluation, Quality Characteristics and Guidelines for their Use, Geneva: Author.

  • International Organization for Standardization/International Electrotechnical Commission, 1995. ISO/IEC 12207, Information Technology, Software Life Cycle Processes Geneva: Author.

  • International Organization for Standardization/International Electrotechnical Commission, 1999. ISO/IEC 14598-1, Information Technology, Software Product Evaluation, Part 1: General Overview, Geneva: Author.

  • International Organization for Standardization/International Electrotechnical Commission, 2001. ISO/IEC 9126-1 Standard, Software Engineering, Product Quality, Part 1: Quality Model, Geneva: Author.

  • International Organization for Standardization/International Electrotechnical Commission, 2001. ISO/IEC 9126-4, Software Engineering, Product Quality, Part 4: Quality in Use Metrics, Geneva: Author.

  • Ivory, M.Y. and Hearst, M.A. 2001. The state of the art in automating usability evaluation of user interfaces, ACM Computing Surveys 33: 470–516.

  • Jarrar, M., Demey, J., and Meersman, R. 2003. On using conceptual data modeling for ontology engineering, Journal on Data Semantics 2800: 185–207.

  • John, B.E. and Kieras, D. E. 1996. Using GOMS for user interface design and evaluation: Which technique? ACM Transactions on Computer-Human Interaction 3: 287–319.

  • Kirakowski, J. and Corbett, M., 1993. SUMI: The Software Usability Measurement Inventory, British Journal of Educational Technology 24: 210–212.

  • Lin, H. X., Choong, Y.-Y., and Salvendy, G., 1997. A proposed index of usability: A method for comparing the relative usability of different software systems, Behaviour and Information Technology 16: 267–277.

  • Macleod, M., 1994. Usability: Practical Methods for Testing and Improvement, Proceedings of the Norwegian Computer Society Software Conference, Sandvika, Norway. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/mm-us94.pdf.

  • Macleod, M., and Rengger, R., 1993. The development of DRUM: A software tool for video-assisted usability evaluation. Retrieved July 3, 2005 from http://www.usability.serco.com/papers/drum93.pdf

  • Macleod, M., Bowden, R., Bevan, N., and Curson, I., 1997. The MUSiC performance method, Behaviour and Information Technology 16: 279–293.

  • McCall, J. A., Richards, P. K., and Walters, G. F., 1977. Factors in Software Quality, Springfield, VA: National Technical Information Service.

  • Nielsen, J., 1993. Usability Engineering, London, UK: Academic Press.

  • Nunnally, J. C., and Bernstein, I. H., 1994. Psychometric theory (3rd ed.), New York: McGraw-Hill.

  • Olsina, L., Lafuente, G., and Rossi, G., 2001. Specifying quality characteristics and attributes for websites, in S. Murugesan and Y. Deshpande (Eds.), Web Engineering, Software Engineering and Web Application Development, London: Springer-Verlag, pp. 266–278.

  • Preece, J., Rogers, Y., Sharp, H., Benyon, D., Holland, S., and Carey, T. 1994. Human Computer Interaction, Wokingham, UK: Addison-Wesley.

  • Rubin, J., 1994. Handbook of Usability Testing, New York: John Wiley.

  • Shneiderman, B., 1992. Designing the User Interface: Strategies for Effective Human-Computer Interaction (2nd ed.), Reading, MA: Addison-Wesley.

  • Schneidewind, N. F., 1992. Methodology for validating software metrics, IEEE Transactions on Software Engineering 18: 410–422.

  • Scholtz, J. and Laskowski, S., 1998. Developing usability tools and techniques for designing and testing web sites, Proceedings of the Fourth Conference on Human Factors & the Web, Basking Ridge, NJ. Retrieved July 3, 2005 from http://www.research.att.com/conf/hfweb/proceedings/scholtz/index.html.

  • Sears, A., 1995. AIDE: A step toward metric-based interface development tools, Proceedings of the ACM Symposium on User Interface Software and Technology, New York: ACM Press, pp. 101–110.

  • Shackel, B., 1991. Usability—Context, framework, definition, design and evaluation, in B. Shackel and S. Richardson (Eds.), Human Factors for Informatics Usability, Cambridge: Cambridge University Press, pp. 21–38.

  • Stevens, S. S., 1959. Measurement, psychophysics, and utility, in C. W. Churchman and P. Ratoosh (Eds.), Measurement: Definitions and Theories, New York: John Wiley, pp.18–63.

  • Tilson, R., Dong, J., Martin, S., and Kieke, E., 1998. Factors and principles affecting the usability of four e-commerce sites, Proceedings of the 4th Conference on Human Factors & the Web, Basking Ridge, New Jersey. Retrieved July 3, 2005 from http://www.research.att.com/conf/hfweb/proceedings/tilson/index.html.

  • Yamada, S., Hong, J.-K., and Sugita, S. 1995. Development and evaluation of hypermedia for museum education: Validation of metrics, ACM Transactions on Computer-Human Interaction 12: 410–422.

  • Landauer, T.K. 1995. The Trouble with Computers: Usefulness, Usability and Productivity, Cambridge, MA: MIT Press.

  • Mayhew, D.J. 1999. The Usability Engineering Lifecycle: A Practitioner's Handbook for User Interface Design, San Francisco: Morgan Kaufmann.

Author information

Corresponding author

Correspondence to Ahmed Seffah.

Additional information

Ahmed Seffah's research interests lie at the intersection of human-computer interaction and software engineering, with an emphasis on human-centered software engineering, empirical studies, theoretical models for quality-in-use measurement, and patterns as a vehicle for capturing and incorporating empirically valid design practices into software engineering. He is a co-founder of the Usability and Empirical Studies Lab and the founder and chair of the Human-Centered Software Engineering Research Group at Concordia University.

Harkirat K. Padda has been a Ph.D. candidate at Concordia University (Montreal) since 2003, where she completed her master's degree in computer science. Her research interests are software quality measurement, metrics, and empirical software evaluation. As a member of the Human-Centered Software Engineering Group, she is exploring the comprehension and usability of software systems, in particular visualization systems. In her master's work, she defined a repository of metrics to measure the ‘quality in use’ factors of software products in general. Currently, she is proposing a pattern-oriented measurement framework to measure the comprehension of visualization systems.

About this article

Cite this article

Seffah, A., Donyaee, M., Kline, R.B. et al. Usability measurement and metrics: A consolidated model. Software Qual J 14, 159–178 (2006). https://doi.org/10.1007/s11219-006-7600-8
