Subjectivist Approaches to Evaluation

  • Charles P. Friedman
  • Jeremy C. Wyatt
Part of the Computers and Medicine book series (C+M)


With this chapter we turn a corner. The previous four chapters have dealt almost exclusively with objectivist approaches to evaluation. These approaches are useful for answering some, but by no means all, of the interesting and important questions that challenge investigators in medical informatics. The subjectivist approaches, introduced here and in Chapter 9, address the problem of evaluation from a different set of premises, first discussed in Chapter 2. These premises derive from philosophical views that may be unfamiliar, and perhaps even discomforting, to some readers. They challenge some fundamental beliefs about scientific method and the validity of the understanding of the world that develops from objectivist research. They argue that, particularly within the realm of evaluation of information resources, the kind of “knowing” that develops from subjectivist studies may be as useful as that which derives from objectivist studies. While reading what follows, readers may be tempted to dismiss subjectivist methods as informal, imprecise, or merely “subjective.” When carried out well, however, these studies are none of the above. They are equally objective, but in a different way. Professionals in informatics, even those who choose not to conduct subjectivist studies, can come to appreciate the rigor, validity, and value of this work.


Keywords: Information Resource, Subjectivist Study, Medical Informatics, Subjectivist Method, Subjectivist Approach



Copyright information

© Springer Science+Business Media New York 1997

Authors and Affiliations

  • Charles P. Friedman (1, 2)
  • Jeremy C. Wyatt (3)
  1. University of North Carolina, Pittsburgh, USA
  2. Center for Biomedical Informatics, University of Pittsburgh, Pittsburgh, USA
  3. Imperial Cancer Research Fund, London, UK