Volume 37, Issue 1, pp 1–6

Trusting the Results of Model-Based Economic Analyses: Is there a Pragmatic Validation Solution?

  • Salah Ghabri
  • Matt Stevenson
  • Jörgen Möller
  • J. Jaime Caro
Current Opinion


Abstract

Models have become a nearly essential component of health technology assessment because the efficacy and safety data available from clinical trials are insufficient to provide the required estimates of the impact of new interventions over long time horizons and in other populations and subgroups. Despite more than five decades of use of these decision-analytic models, decision makers are still often presented with poorly validated models, and trust in their results is therefore impaired. Among the reasons for this vexing situation are the artificial nature of the models, which impairs their validation against observable data; the complexity of their formulation and implementation; the lack of data against which to validate model results; and the constraints of short timelines and insufficient resources. This article addresses the crucial problem of achieving models whose results can be trusted and the resulting requirements for validation and transparency, areas where our field is currently deficient. Based on their differing perspectives and experiences, the authors characterize the situation and outline requirements for improvement and pragmatic solutions to the problem of inadequate validation.



Acknowledgements

The authors thank Isaac Corro Ramos, Pepijn Vemer, George A. K. van Voorn, Maiwenn J. Al, Talitha L. Feenstra, and Chloé Herpin for their useful comments and suggestions.

Author contributions

SG drafted Sects. 2 and 3.1; MS drafted Sect. 3.2; JM drafted Sect. 3.3; and JJC reviewed these materials and integrated them into the paper. All authors participated in writing and editing Sects. 1, 2, and 4.

Compliance with Ethical Standards


No funding was received for the preparation of this article.

Conflict of interest

Salah Ghabri and Matt Stevenson have no conflicts of interest that are directly relevant to the contents of this article. Jörgen Möller and J. Jaime Caro are employed by Evidera, a company that provides consulting and other research services to pharmaceutical, device, government, and non-government organizations. The opinions expressed in this article are those of the authors and do not necessarily represent the views of their institutions.



Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Salah Ghabri (1)
  • Matt Stevenson (2)
  • Jörgen Möller (3)
  • J. Jaime Caro (3, 4, 5, 6)

  1. French National Authority for Health (HAS), Saint-Denis, France
  2. School of Health and Related Research, University of Sheffield, Sheffield, UK
  3. Evidera, London, UK
  4. McGill University, Montreal, Canada
  5. London School of Economics, London, UK
  6. Waltham, USA
