Volume 32, Issue 10, pp 967–979

Avoiding and Identifying Errors and Other Threats to the Credibility of Health Economic Models

  • Paul Tappenden
  • James B. Chilcott
Practical Application


Health economic models have become the primary vehicle for undertaking economic evaluation and are used in various healthcare jurisdictions across the world to inform decisions about the use of new and existing health technologies. Models are required because a single source of evidence, such as a randomised controlled trial, is rarely sufficient to provide all relevant information about the expected costs and health consequences of all competing decision alternatives. Whilst models are used to synthesise all relevant evidence, they also contain assumptions, abstractions and simplifications. By their very nature, all models are therefore ‘wrong’. As such, the interpretation of estimates of the cost effectiveness of health technologies requires careful judgements about the degree of confidence that can be placed in the models from which they are drawn. The presence of a single error or inappropriate judgement within a model may lead to inappropriate decisions, an inefficient allocation of healthcare resources and ultimately suboptimal outcomes for patients. This paper sets out a taxonomy of threats to the credibility of health economic models. The taxonomy segregates threats to model credibility into three broad categories: (i) unequivocal errors, (ii) violations, and (iii) matters of judgement; and maps these across the main elements of the model development process. These three categories are defined according to the existence of criteria for judging correctness, the degree of force with which such criteria can be applied, and the means by which these credibility threats can be handled. A range of suggested processes and techniques for avoiding and identifying these threats is put forward with the intention of prospectively improving the credibility of models.





This work was not supported by research funding. Neither author has any conflicts of interest to declare. Both PT and JC developed the taxonomy of model errors and other threats to model credibility and contributed to the writing of this manuscript.



Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  1. Health Economics and Decision Science, School of Health and Related Research, University of Sheffield, Sheffield, UK
