
Target Trial Emulation for Transparent and Robust Estimation of Treatment Effects for Health Technology Assessment Using Real-World Data: Opportunities and Challenges

  • Current Opinion
  • Published in: PharmacoEconomics

Abstract

Evidence about the relative effects of new treatments is typically collected in randomised controlled trials (RCTs). In many instances, evidence from RCTs falls short of the needs of health technology assessment (HTA). For example, RCTs may not be able to capture longer-term treatment effects, or include all relevant comparators and outcomes required for HTA purposes. Information routinely collected about patients and the care they receive has increasingly been used to complement RCT evidence on treatment effects. However, such routine (or real-world) data are not collected for research purposes, so investigators have little control over the way patients are selected into the study or allocated to the different treatment groups, introducing biases, for example through selection or confounding. A promising approach to minimise common biases in non-randomised studies that use real-world data (RWD) is to apply design principles from RCTs. This approach, known as ‘target trial emulation’ (TTE), involves (1) developing the protocol with respect to core study design and analysis components of the hypothetical RCT that would answer the question of interest, and (2) applying this protocol to the RWD so that it mimics the data that would have been gathered for the RCT. By making the ‘target trial’ explicit, TTE helps avoid common design flaws and methodological pitfalls in the analysis of non-randomised studies, keeping each step transparent and accessible. It provides a coherent framework that embeds existing analytical methods to minimise confounding, and it helps identify potential limitations of RWD and the extent to which these affect the HTA decision. This paper provides a broad overview of TTE and discusses the opportunities and challenges of using this approach in HTA. We describe the basic principles of trial emulation, outline some areas where TTE using RWD can help complement RCT evidence in HTA, identify potential barriers to its adoption in the HTA setting and highlight some priorities for future work.
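To make the two-step idea concrete, the sketch below (not taken from the paper) emulates a hypothetical target trial on simulated routine data: step 1 expresses protocol elements such as eligibility criteria as explicit filters, and step 2 applies them to the data and adjusts for measured confounding with inverse probability of treatment weighting. All variable names, effect sizes and the dataset itself are illustrative assumptions; a real emulation would also specify treatment strategies, time zero, follow-up and the estimand, and could use other adjustment methods.

```python
# Minimal illustrative sketch of target trial emulation on simulated data.
# Everything here (covariates, treatment model, outcome model) is hypothetical.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 5_000

# A simulated routine-data extract standing in for real-world data.
rwd = pd.DataFrame({
    "age": rng.normal(65, 10, n),
    "biomarker": rng.normal(0, 1, n),
    "prior_treatment": rng.integers(0, 2, n),
})
# Treatment uptake depends on covariates (confounding by indication).
p_treat = 1 / (1 + np.exp(-(-0.5 + 0.03 * (rwd["age"] - 65) + 0.8 * rwd["biomarker"])))
rwd["treated"] = rng.binomial(1, p_treat)
# Outcome depends on covariates and (beneficially) on treatment.
p_event = 1 / (1 + np.exp(-(-1.0 + 0.04 * (rwd["age"] - 65) + 0.5 * rwd["biomarker"] - 0.4 * rwd["treated"])))
rwd["event_1yr"] = rng.binomial(1, p_event)

# Step 1: protocol of the hypothetical target trial, here only the
# eligibility criteria, written as explicit filters on the data.
eligible = rwd[rwd["age"].between(50, 80) & (rwd["prior_treatment"] == 0)].copy()

# Step 2: apply the protocol to the routine data and adjust for measured
# confounders using inverse probability of treatment weights.
X = eligible[["age", "biomarker"]]
ps = LogisticRegression().fit(X, eligible["treated"]).predict_proba(X)[:, 1]
w = np.where(eligible["treated"] == 1, 1 / ps, 1 / (1 - ps))

risk_treated = np.average(eligible.loc[eligible["treated"] == 1, "event_1yr"],
                          weights=w[eligible["treated"] == 1])
risk_control = np.average(eligible.loc[eligible["treated"] == 0, "event_1yr"],
                          weights=w[eligible["treated"] == 0])
print(f"Weighted 1-year risk difference: {risk_treated - risk_control:.3f}")
```

The point of the sketch is only that the emulated trial's design choices are written down explicitly before the analysis, which is what makes potential flaws (such as immortal time or prevalent-user bias) visible and auditable.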



Author information


Corresponding author

Correspondence to Manuel Gomes.

Ethics declarations

Funding

NL is supported by Yorkshire Cancer Research (Award S406NL).

Conflicts of interest

The authors have no conflicts or competing interests to declare.

Ethics approval

Not applicable.

Consent to participate

Not applicable.

Consent for publication

Not applicable.

Availability of data and material

Not applicable.

Code availability

Not applicable.


Cite this article

Gomes, M., Latimer, N., Soares, M. et al. Target Trial Emulation for Transparent and Robust Estimation of Treatment Effects for Health Technology Assessment Using Real-World Data: Opportunities and Challenges. PharmacoEconomics 40, 577–586 (2022). https://doi.org/10.1007/s40273-022-01141-x
