Use of Indirect and Mixed Treatment Comparisons for Technology Assessment
Indirect and mixed treatment comparison (MTC) approaches to synthesis are logical extensions of more established meta-analysis methods. They have great potential for estimating the comparative effectiveness of multiple treatments using an evidence base of trials that individually do not compare all treatment options. Connected networks of evidence can be synthesized simultaneously to provide estimates of the comparative effectiveness of all included treatments and a ranking of their effectiveness with associated probability statements.
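The simplest case of the synthesis described above is the adjusted indirect comparison: when trials compare B with A and C with A, but no trial compares C with B, an indirect estimate of C versus B can be formed by differencing the two direct effects, with the variances adding. The sketch below illustrates this (the Bucher adjusted indirect comparison); the function name and the numerical inputs are purely illustrative and are not taken from the paper.

```python
import math

def bucher_indirect(d_ab, se_ab, d_ac, se_ac):
    """Adjusted indirect comparison (Bucher method) - illustrative sketch.

    d_ab, d_ac: direct relative effects (e.g. log odds ratios) of
    B vs A and C vs A, estimated from separate sets of trials.
    Returns the indirect estimate of C vs B, its standard error,
    and a 95% confidence interval.
    """
    d_cb = d_ac - d_ab                      # indirect effect of C vs B
    se_cb = math.sqrt(se_ab**2 + se_ac**2)  # variances of the two direct estimates add
    ci = (d_cb - 1.96 * se_cb, d_cb + 1.96 * se_cb)
    return d_cb, se_cb, ci

# Made-up log odds ratios and standard errors, for illustration only
d_cb, se_cb, ci = bucher_indirect(d_ab=-0.5, se_ab=0.2, d_ac=-0.8, se_ac=0.25)
print(f"indirect C vs B: {d_cb:.3f} (SE {se_cb:.3f})")
```

Because the variances add, the indirect estimate is always less precise than either direct comparison; MTC models generalize this logic to whole connected networks of trials, combining direct and indirect evidence where both exist.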
Indirect and MTC methods have considerable potential in technology assessment, and would allow for a more consistent assessment than simpler alternative approaches. Although such models can be viewed as a logical and coherent extension of standard pair-wise meta-analysis, their increased complexity raises some unique issues with far-reaching implications for how we use data in technology assessment, while simultaneously raising searching questions about standard pair-wise meta-analysis itself. This article reviews pair-wise meta-analysis and indirect and MTC approaches to synthesis, clearly outlining the assumptions involved in each approach. It also raises the issues that the National Institute for Health and Clinical Excellence (NICE) needed to consider in updating their 2004 Guide to the Methods of Technology Appraisal, if such methods are to be used in their technology appraisals.
Keywords: Technology Assessment · Indirect Comparison · Decision Context · Technology Appraisal · Mixed Treatment Comparison
This paper was initially prepared as a briefing paper for NICE as part of the process of updating the Institute’s 2004 Guide to the Methods of Technology Appraisal. The work was funded by NICE through its Decision Support Unit, which is based at the universities of Sheffield, Leicester, York and Leeds, and at the London School of Hygiene and Tropical Medicine.
K.R. Abrams, A.E. Ades, N.J. Cooper and A.J. Sutton have all delivered fee-paying courses on indirect and mixed treatment comparisons (MTC), and have had research projects developing and using MTC methods funded by the Medical Research Council (MRC), NHS Health Technology Appraisal and the healthcare industry. A.J. Sutton is an applicant on an MRC grant investigating the validity of indirect comparisons. K.R. Abrams, A.E. Ades, N.J. Cooper and A.J. Sutton are all applicants on grant applications using MTC methods. K.R. Abrams and A.E. Ades have also acted as paid consultants to consultancy companies in the healthcare industry, specifically on MTC methods. K.R. Abrams also receives royalties for Bayesian Approaches to Clinical Trials and Health-care Evaluation.
The authors would like to thank Louise Longworth (NICE) for input into the original briefing and for insightful comments on an earlier draft of this paper; Karl Claxton for his insightful comments on an earlier draft of the paper; and Deborah Caldwell for allowing us to reproduce tables I, III and IV for the paper. Although this paper has its roots in the document prepared for a workshop on MTCs hosted by NICE as part of the process of updating their 2004 guidance, it has evolved considerably since then and includes numerous substantive changes.