An Outline of Techniques for Evaluating the Human-Computer Interface

  • Stephen Howard
  • Dianne M. Murray


Although evaluation is seen as a crucial stage in the development of computer systems, a number of problems beset the evaluator of a human-computer interface. Firstly, there is a poor understanding of the theoretical underpinnings of many techniques developed by the behavioral sciences. Secondly, the way in which the utility of each possible technique will vary with differing evaluation environments is not well understood. Thirdly, there is little advice available to aid the evaluator in the selection of an evaluation package.

A recent wide-ranging literature review is summarized, from which five different forms of evaluation are identified and a variety of currently used techniques categorized. Two possible taxonomies are discussed: one based upon criteria such as validity and reliability, and one based upon a proposed evaluation environment. Neither taxonomy alone is adequate to aid the selection of an evaluation package; however, aspects of both may prove useful in the future.


Keywords: Evaluation Environment, Evaluation Package, Repertory Grid, Informal Evaluation, Critical Incident Technique





Copyright information

© Plenum Press, New York 1987

Authors and Affiliations

  • Stephen Howard (1)
  • Dianne M. Murray (2)

  1. National Physical Laboratory, Teddington, Middlesex, UK
  2. Ergonomics Unit, University College London, UK
