Abstract
Background
Concerns have been raised about the use of clinical data in cost-effectiveness models. The aim of this analysis was to evaluate whether data on clinical effectiveness were used appropriately in cost-effectiveness modeling studies published between 2001 and 2015.
Methods
Assessors rated 72 modeling studies obtained from three therapeutic areas by applying criteria defined by the Grading of Recommendations Assessment, Development and Evaluation group for assessing the quality of clinical evidence: selection of clinical data (publication bias), imprecision, indirectness, inconsistency (i.e., heterogeneity), and study limitations (risk of bias). For all parameters included in the analyses, potential changes over time were assessed.
Results
Although three out of four modeling studies relied on randomized controlled trials, more than 60% of the modeling studies were based on clinical data with a high or unclear risk of bias; in more than 80%, a risk of publication bias was found; and in about 30%, the evidence was indirect, a proportion that increased significantly over the years. Study limitations were inadequately described in more than one third of the studies. By contrast, less than 10% of the clinical studies showed inconsistency or imprecision in their results.
Conclusion
Although the majority of economic evaluations are based on precise and consistent randomized controlled trials, their results are often affected by limitations arising from methodological shortcomings in the underlying data on clinical efficacy. Modelers and assessors should be more aware of the aspects of the quality of clinical evidence considered by the Grading of Recommendations Assessment, Development and Evaluation group.
Contributions
AM, TB, CK, and NL conducted the selection, extraction, and rating of data. AM, AG, and DM conducted the analyses and prepared the manuscript. DC performed the literature search, selected the studies, and handed them over to the assessors. DM and SS revised the manuscript and monitored the analyses.
Ethics declarations
Funding
No funding was received for the preparation of this study.
Conflict of interest
Alexander Mensch, Tanja Beck, Daniele Civello, Christopher Kunigkeit, Nicole Lachmann, Stephanie Stock, Afschin Gandjour, and Dirk Müller have no conflicts of interest directly relevant to the content of this study.
Data availability statement
A full list of all enrolled studies and detailed results for each individual study can be found in the Electronic Supplementary Material.
Cite this article
Mensch, A., Beck, T., Civello, D. et al. Applying GRADE Criteria to Clinical Inputs to Cost-Effectiveness Modeling Studies. PharmacoEconomics 36, 987–994 (2018). https://doi.org/10.1007/s40273-018-0651-4