Abstract
In the discussion about “enlightenment” versus “utilization” in program evaluation, it is increasingly clear that the discussants implicitly refer to different professional contexts. Weiss (in Alkin, 1990), taking the “scientist” approach, appears to reflect the academic context of the traditional university, where, beyond the land-grant institutions, “enlightenment” is the honored objective. Patton (in Alkin, 1990), in contrast, speaks from the clinical perspective of the organizational consultant, for whom “utilization” is the essential element in the evaluator-client relationship. Those contextual differences notwithstanding, each party defends its case on the implicit assumption of one methodological procedure of program evaluation, equally shared by both. To correct this assumption, this article articulates the major principles and methods of the “clinical” approach to program evaluation. The method has been field-tested in western Europe and the United States. Its characteristic difference from the academic tradition is that in clinical evaluation, improving the program is part of the method.
References
Alkin, M. C. (1990). Debates on evaluation. With contributions by M. Q. Patton & C. H. Weiss. Newbury Park, CA: Sage.
Baier, V. E., March, J. G., & Saetren, H. (1990). Implementation and ambiguity. In J. G. March (Ed.), Decisions and organizations (pp. 150–164). Oxford: Basil Blackwell.
Bartee, E. M., & Cheyunski, F. (1977). A methodology for process-oriented organizational diagnosis. The Journal of Applied Behavioral Science, 13, 153–168.
Behling, J. H., & Merves, E. S. (1984). The practice of clinical research. New York: University Press of America.
Berk, R. A., et al. (1985). Social policy experimentation: A position paper. Evaluation Review, 9(4), 387–429.
Brinkerhoff, R. O., & Dressler, D. E. (1990). Productivity measurement. Beverly Hills, CA: Sage.
Campbell, D. T. (1986). Relabeling internal and external validity for applied social scientists. In W. M. K. Trochim (Ed.), Advances in quasi-experimental design and analysis (p. 69). San Francisco, CA: Jossey-Bass.
Checkland, P., & Scholes, J. (1990). Soft systems methodology in action. New York: John Wiley & Sons.
Deutscher, I., & Beattie, M. (1988). Success and failure: Static concepts in a dynamic society. Evaluation Review, 12(6), 607–624.
Elkin, R., & Vorwaller, D. (1975). Evaluating the effectiveness of social services. In L. Seidler & E. Seidler (Eds.), Social accounting: Theory, issues and cases (pp. 410–426). Los Angeles, CA: Academic Press.
Fischer, F. (1980). Politics, values and public policy: The problem of methodology. Boulder, CO: Westview Press.
Fischhoff, B. (1986). Clinical policy analysis. In W. N. Dunn (Ed.), Policy analysis: Perspectives, concepts, and methods (pp. 111–130). Greenwich, CT: JAI Press.
Gageldonk, A. V., Leeuw, F., & Dekker, P. (1987). Analyse van Voorlichtingsveronderstellingen. Massacommunicatie, 15(2), 175–191.
Hage, J. (1972). Techniques and problems of theory construction in sociology. New York: Wiley.
Ham, C., & Hill, M. (1986). The policy process in the modern capitalist state. Brighton: Wheatsheaf Books.
Harrell Allen, T. (1978). New methods in the social sciences. New York: Praeger.
Hersen, M., & Barlow, D. H. (1976). Single case experimental designs: Strategies for studying behavior change. New York: Pergamon Press.
Hoaglin, D., Light, R. J., McPeek, B., Mosteller, F., & Stoto, M. A. (1982). Data for decisions. Cambridge, MA: Abt.
Hoeben, W. T. J. G. (1989). Dansen om de Regenboom. Nederlands Tijdschrift v. Opvoeding, Vorming en Onderwijs, 5, 28–40.
Kash, D. E., & Ballard, S. (1987). Academic and applied policy studies. The American Behavioral Scientist, 30(6), 597–611.
Leeuw, F. L. (1988). Overheidsvoorlichting en de wijziging van het sociale zekerheidsstelsel. Beleid en Maatschappij, 1, 20–44.
Leeuw, F. L. (1990). Analyzing policy theories and the systematic use of knowledge for public policy. In B. G. Peters & T. A. Barker (Eds.), Advice to governments. London: Sage.
Leeuw, F., & van de Vall, M. (1984). National population policies in industrial countries: Praxis or paradox? In R. F. Tomasson (Ed.), Comparative social research, Vol. 7 (pp. 351–368).
Mante-Meijer, E. (1989). Conflicten in organisaties: Individuele klachten en hun behandeling. Rotterdam: Risbo Press.
Mason, R. O., & Mitroff, I. I. (1981). Challenging strategic planning assumptions. New York: Wiley.
Mayer, R., & Greenwood, E. (1980). The design of social policy research. Englewood Cliffs, NJ: Prentice Hall.
McGuire, W. J. (1981). Theoretical foundations of campaigns. In R. E. Rice & W. J. Paisley (Eds.), Public communication campaigns. London: Sage.
Mitroff, I. I. (1983). Beyond experimentation: New methods for a new age. In E. Seidman (Ed.), Handbook of social intervention (pp. 163–177). London: Sage.
Mohr, L. B. (1985). The reliability of the case study as a source of information. In R. F. Coulam & R. A. Smith (Eds.), Advances in information processing in organizations. Greenwich, CT: JAI Press.
O’Shaughnessy, J. (1972). Inquiry and decision. London: Allen and Unwin.
Patton, M. Q. (1990). In M. C. Alkin (Ed.), Debates on evaluation. Newbury Park, CA: Sage.
Rossi, P. H., & Freeman, H. (1989). Evaluation: A systematic approach (4th ed.). Beverly Hills, CA: Sage.
Sabatier, P., & Mazmanian, D. (1980). The implementation of public policy: A framework of analysis. Policy Studies, special issue: Symposium on successful policy implementation, 538–559.
Schein, E. H. (1987). Process consultation. Reading, MA: Addison-Wesley.
Scheirer, M. A. (1981). Program implementation: A framework of analysis. Beverly Hills, CA: Sage.
Scriven, M. (1974). Maximizing the power of causal investigations: The modus operandi method. In W. J. Popham (Ed.), Evaluation in education—Current applications (pp. 68–84). Berkeley, CA: McCutchan.
Thompson, M. M. (1978). A study of interorganizational coordination: The organization-set example. Unpublished doctoral dissertation, SUNY/Buffalo.
van de Vall, M., & Schoemaker, N. (1983). The social context of information transfer: An analysis of facilitating and inhibiting factors. In P. J. Taylor & B. Cronin (Eds.), Information management research in Europe (pp. 140–151). London: Aslib.
van de Vall, M. (1970). Labor organizations: A macro- and micro-sociological analysis on a comparative basis. Cambridge: Cambridge University Press.
van de Vall, M. (1987). Data-based sociological practice: A professional paradigm. American Behavioral Scientist, 30(6), 4–66.
van de Vall, M., Bolas, C., & Kang, T. S. (1976). Applied social research in industrial organizations. The Journal of Applied Behavioral Science, 12, 158–177.
Vosburgh, M. (1986). Implementation analysis: A case of accident compensation in New Zealand. Evaluation and Program Planning, 9, 49–59.
Wageman, A. (1990). Report about the GEB research “take home” project. Personal correspondence from the author.
Weiss, C. H. (1990). In M. C. Alkin (Ed.), Debates on evaluation. Newbury Park, CA: Sage.
Yin, R. K. (1984). Case study research: Design and methods. Beverly Hills, CA: Sage.
Zaltman, G., LeMasters, K., & Heffring, M. (1982). Theory construction in marketing: Some thoughts on thinking. New York: Wiley.
Author information
Mark van de Vall is professor of Sociology at Erasmus University Rotterdam and adjunct professor at SUNY/Buffalo.
Cite this article
van de Vall, M. The clinical approach: Triangulated program evaluation and adjustment. Knowledge and Policy 4, 41–57 (1991). https://doi.org/10.1007/BF02693087