
Kirkpatrick plus: Evaluation and continuous improvement with a community focus


Abstract

For almost 40 years, Donald Kirkpatrick's framework for evaluation has served as a basic model for identifying and targeting training-specific interventions in business, government, the military, and industry alike. By approaching evaluation from four perspectives—reaction, learning, behavior, and results—the model has provided a solid basis for examining training's impact on the organization. Yet despite the current practice of measuring one's own success by the success of one's clients, proposed changes to the model have rarely been adopted. It may therefore be time for professionals to reevaluate the utility and responsiveness of the Kirkpatrick framework against the value-added requirements of today's organizations. This article identifies tools and concepts, not originally addressed by the Kirkpatrick model, for responding to these new organizational realities.


References

  • American Society for Training and Development (ASTD). (n.d.). Percent of courses evaluated at Kirkpatrick levels [online]. Available: http://www.astd.org/who/research/benchmar/96stats/graph15.gif [March 30, 1998].

  • Dick, W., & Carey, L. (1989). The systematic design of instruction (3rd ed.). Glenview, IL: Scott Foresman & Co.

  • Drucker, P.F. (1993). Post-capitalist society. New York: HarperBusiness.

  • Foshay, R. (in press). An examination of evaluation research.

  • Gilbert, T., & Gilbert, M. (1989, January). Performance engineering: Making human productivity a science. Performance Improvement.

  • Hall, J., Sprague, D., & Watkins, R. (1995, August). Florida's job training programs: What is the return on taxpayers' investment? Tallahassee, FL: Florida TaxWatch Incorporated.

  • Kaufman, R. (1992). Strategic planning plus: An organizational guide (rev. ed.). Newbury Park, CA: Sage.

  • Kaufman, R. (1997, May–June). Avoiding the "dumbing down" of human performance improvement. Performance Improvement.

  • Kaufman, R. (1998). Strategic thinking: A guide to identifying and solving problems (rev. ed.). Arlington, VA, & Washington, DC: Jointly published by the American Society for Training and Development and the International Society for Performance Improvement.

  • Kaufman, R., & Keller, J. (1994, Winter). Levels of evaluation: Beyond Kirkpatrick. Human Resources Quarterly, 5(4).

  • Kaufman, R., Keller, J., & Watkins, R. (1995). What works and what doesn't: Evaluation beyond Kirkpatrick. Performance and Instruction, 35(2), 8–12.

  • Kaufman, R., & Watkins, R. (1996, Spring). Costs-consequences analysis. HRD Quarterly.

  • Kaufman, R., Watkins, R., & Sims, L. (1997). Cost-consequences analysis: A case study. Performance Improvement Quarterly, 10(2).

  • Kaufman, R., Watkins, R., Triner, D., & Stith, M. (1998, Summer). The changing corporate mind: Organizations, vision, missions, purposes, and indicators on the move toward societal payoffs. Performance Improvement Quarterly.

  • Kirkpatrick, D.L. (1959a). Techniques for evaluating training programs. Journal of ASTD, 13(11), 3–9.

  • Kirkpatrick, D.L. (1959b). Techniques for evaluating training programs: Part 2—Learning. Journal of ASTD, 13(12), 21–26.

  • Kirkpatrick, D.L. (1960a). Techniques for evaluating training programs: Part 3—Behavior. Journal of ASTD, 14(1), 13–18.

  • Kirkpatrick, D.L. (1960b). Techniques for evaluating training programs: Part 4—Results. Journal of ASTD, 14(2), 28–32.

  • Kirkpatrick, D.L. (1994). Evaluating training programs: The four levels. San Francisco: Berrett-Koehler.

  • Kirkpatrick, D.L. (1996). Evaluation. In R.L. Craig & L.R. Bittel (Eds.), Training & development handbook. American Society for Training and Development. New York: McGraw-Hill Book Co.

  • Kirkpatrick, D.L. (1996, Spring). Invited reaction: Reaction to Holton article. Performance Improvement Quarterly, 7(1), 23–29.

  • Muellner, A. (Reporter). (1998, March 18). Morning Edition. Upper Marlboro, MD: National Public Radio.

  • Muir, M., Watkins, R., Kaufman, R., & Leigh, D. (1998, June). Costs-consequences analysis: A primer. Performance Improvement.

  • Pava, M.N., & Krausz, J. (1995). Corporate responsibility and financial performance: The paradox of social cost. Westport, CT: Quorum Books.

  • Phillips, J. (1996). Measuring the results of training. In R.L. Craig & L.R. Bittel (Eds.), Training & development handbook. American Society for Training and Development. New York: McGraw-Hill Book Co.

  • Popcorn, F. (1991). The Popcorn report. New York: Doubleday.

  • Rothwell, W.J., & Kazanas, H.C. (1992). Mastering the instructional design process: A systematic approach. San Francisco: Jossey-Bass Publishers.

  • Toffler, A. (1990). Powershift: Knowledge, wealth and violence at the edge of the 21st century. New York: Bantam Books.



About this article

Cite this article

Watkins, R., Leigh, D., Foshay, R. et al. Kirkpatrick plus: Evaluation and continuous improvement with a community focus. ETR&D 46, 90–96 (1998). https://doi.org/10.1007/BF02299676
