
A Need for Change! A Coding Framework for Improving Transparency in Decision Modeling


The use of open-source programming languages, such as R, in health decision sciences is growing and has the potential to facilitate model transparency, reproducibility, and shareability. However, realizing this potential can be challenging. Models are complex and primarily built to answer a research question, with model sharing and transparency relegated to secondary goals. Consequently, code is often neither well documented nor systematically organized in a comprehensible and shareable manner. Moreover, many decision modelers are not formally trained in computer programming and may lack good coding practices, further compounding the problem of model transparency. To address these challenges, we propose a high-level framework for model-based decision and cost-effectiveness analyses (CEA) in R. The proposed framework consists of a conceptual, modular structure and coding recommendations for the implementation of model-based decision analyses in R. This framework defines a set of common decision model elements divided into five components: (1) model inputs, (2) decision model implementation, (3) model calibration, (4) model validation, and (5) analysis. The first four components form the model development phase. The analysis component is the application of the fully developed decision model to answer the policy or research question of interest, assess decision uncertainty, and/or determine the value of future research through value of information (VOI) analysis. In this framework, we also make recommendations for good coding practices specific to decision modeling, such as file organization and variable naming conventions. We showcase the framework through a fully functional, testbed decision model, which is hosted on GitHub for free download and easy adaptation to other applications. The use of this framework in decision modeling will improve code readability and model sharing, paving the way to an ideal, open-source world.
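The variable naming conventions mentioned above can be illustrated with a short R fragment. The data-type prefixes (n_ for scalars, v_ for vectors, m_ for matrices) follow the style of convention the framework recommends, but the toy three-state cohort model and all specific variable names below are invented for illustration only:

```r
# Illustrative sketch: type-prefixed variable names in a toy cohort model.
# Prefixes: n_ = scalar, v_ = vector, m_ = matrix (per the framework's style);
# states, probabilities, and names are hypothetical.

n_cycles       <- 75                                 # number of model cycles
v_names_states <- c("Healthy", "Sick", "Dead")       # health-state names
n_states       <- length(v_names_states)             # number of health states

# Transition probability matrix; each row sums to 1
m_P <- matrix(c(0.85, 0.10, 0.05,
                0.00, 0.70, 0.30,
                0.00, 0.00, 1.00),
              nrow = n_states, byrow = TRUE,
              dimnames = list(v_names_states, v_names_states))

# Cohort trace: distribution of the cohort across states at each cycle
m_trace <- matrix(NA_real_, nrow = n_cycles + 1, ncol = n_states,
                  dimnames = list(NULL, v_names_states))
m_trace[1, ] <- c(1, 0, 0)                           # cohort starts Healthy
for (t in 1:n_cycles) {
  m_trace[t + 1, ] <- m_trace[t, ] %*% m_P           # Markov update
}
```

Because the prefix encodes the data structure, a reader can tell at a glance that `m_P %*% v_x` is a matrix-vector product without hunting for the variables' definitions elsewhere in the code base.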







Acknowledgements

We thank Mr Caleb Easterly for his helpful comments and suggestions on the code developed for this framework and Dr Myriam Hunink for her overall contributions to the DARTH workgroup.

Author information




FAE, EK, PP, HJ, SYK, AY, and EE: study design and analysis. All authors participated in the interpretation of the data, drafting of the manuscript, critical revision of the manuscript, and approval of the final manuscript.

Corresponding author

Correspondence to Fernando Alarid-Escudero.

Ethics declarations

Data availability statement

Data and statistical code are provided in the GitHub repository and on the darthpack website.


Funding

Dr Alarid-Escudero was supported by a grant from the National Cancer Institute (U01-CA-199335) as part of the Cancer Intervention and Surveillance Modeling Network (CISNET). Dr Enns was supported by a grant from the National Institute of Allergy and Infectious Diseases of the National Institutes of Health under award no. K25AI118476. Dr Jalal was supported by a grant from the National Institutes of Health (KL2 TR0001856). The content is solely the responsibility of the authors and does not necessarily represent the official views of the National Institutes of Health. The funding agencies had no role in the design of the study, interpretation of results, or writing of the manuscript. The funding agreement ensured the authors’ independence in designing the study, interpreting the data, writing, and publishing the report.

Conflict of interest

FAE, EK, PP, HJ, SYK, AY, and EE report no conflicts of interest.

Electronic supplementary material

Below is the link to the electronic supplementary material.

Supplementary material 1 (PDF 1059 kb)


About this article


Cite this article

Alarid-Escudero, F., Krijkamp, E.M., Pechlivanoglou, P. et al. A Need for Change! A Coding Framework for Improving Transparency in Decision Modeling. PharmacoEconomics 37, 1329–1339 (2019).
