Volume 37, Issue 11, pp 1329–1339

A Need for Change! A Coding Framework for Improving Transparency in Decision Modeling

  • Fernando Alarid-Escudero
  • Eline M. Krijkamp
  • Petros Pechlivanoglou
  • Hawre Jalal
  • Szu-Yu Zoe Kao
  • Alan Yang
  • Eva A. Enns
Practical Application


The use of open-source programming languages, such as R, in health decision sciences is growing and has the potential to facilitate model transparency, reproducibility, and shareability. However, realizing this potential can be challenging. Models are complex and primarily built to answer a research question, with model sharing and transparency relegated to secondary goals. Consequently, code is often neither well documented nor systematically organized in a comprehensible and shareable manner. Moreover, many decision modelers are not formally trained in computer programming and may lack good coding practices, further compounding the problem of model transparency. To address these challenges, we propose a high-level framework for model-based decision and cost-effectiveness analyses (CEA) in R. The proposed framework consists of a conceptual, modular structure and coding recommendations for the implementation of model-based decision analyses in R. This framework defines a set of common decision model elements divided into five components: (1) model inputs, (2) decision model implementation, (3) model calibration, (4) model validation, and (5) analysis. The first four components form the model development phase. The analysis component is the application of the fully developed decision model to answer the policy or research question of interest, assess decision uncertainty, and/or determine the value of future research through value of information (VOI) analysis. In this framework, we also make recommendations for good coding practices specific to decision modeling, such as file organization and variable naming conventions. We showcase the framework through a fully functional, testbed decision model, which is hosted on GitHub for free download and easy adaptation to other applications. The use of this framework in decision modeling will improve code readability and model sharing, paving the way to an ideal, open-source world.
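The file-organization and variable-naming recommendations mentioned above can be illustrated with a short R sketch. The script names and type prefixes below are illustrative assumptions for the purpose of this example, not a verbatim copy of the framework's conventions: one numbered script per component, and a short prefix on each variable signaling its data structure.

```r
# Illustrative file organization: one numbered R script per framework
# component (these file names are hypothetical, not the exact darthpack files):
#   01_model_inputs.R, 02_decision_model.R, 03_calibration.R,
#   04_validation.R, 05_analysis.R

# Illustrative variable-naming convention: a type prefix marks the data
# structure, so readers can tell scalars, vectors, matrices, and data
# frames apart at a glance.
n_cycles    <- 75                                        # n_:  scalar (number)
v_names_str <- c("Standard of care", "New treatment")    # v_:  vector
m_P <- matrix(0, nrow = 3, ncol = 3,                     # m_:  matrix, e.g.
              dimnames = list(c("H", "S", "D"),          #      transition
                              c("H", "S", "D")))         #      probabilities
df_params   <- data.frame(n_cycles = n_cycles)           # df_: data frame
```

Prefixes of this kind make downstream code self-documenting: a reader encountering `m_P` in the analysis component knows its structure without tracing it back to the model-inputs script.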



We thank Mr Caleb Easterly for his helpful comments and suggestions on the code developed for this framework and Dr Myriam Hunink for her overall contribution to the DARTH workgroup.

Author contributions

FAE, EK, PP, HJ, SYK, AY, and EE: study design and analysis. All authors participated in the interpretation of the data, drafting of the manuscript, critical revision of the manuscript, and approval of the final manuscript.

Compliance with Ethical Standards

Data availability statement

Data and statistical code are provided in the GitHub repository and on the darthpack website, where the version of darthpack released with this article is also available.


Dr Alarid-Escudero was supported by a grant from the National Cancer Institute (U01-CA-199335) as part of the Cancer Intervention and Surveillance Modeling Network (CISNET). Dr Enns was supported by a grant from the National Institute of Allergy and Infectious Diseases of the National Institutes of Health under award no. K25AI118476. Dr Jalal was supported by a grant from the National Institutes of Health (KL2 TR0001856). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. The funding agencies had no role in the design of the study, interpretation of results, or writing of the manuscript. The funding agreement ensured the authors’ independence in designing the study, interpreting the data, writing, and publishing the report.

Conflict of interest

FAE, EK, PP, HJ, SYK, AY, and EE report no conflicts of interest.

Supplementary material

Supplementary material 1 (PDF 1059 kb)



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Drug Policy Program, Center for Research and Teaching in Economics (CIDE)-CONACyT, Aguascalientes, Mexico
  2. Department of Epidemiology, Erasmus MC, Rotterdam, The Netherlands
  3. The Hospital for Sick Children and University of Toronto, Toronto, Canada
  4. Department of Health Policy and Management, Graduate School of Public Health, University of Pittsburgh, Pittsburgh, USA
  5. Division of Health Policy and Management, University of Minnesota School of Public Health, Minneapolis, USA
  6. The Hospital for Sick Children, Toronto, Canada
