
Sensitivity Analyses for Unmeasured Confounding: This Is the Way

Chapter in: Real-World Evidence in Medical Product Development

Abstract

One important assumption for valid causal inference when comparing outcomes across two or more groups using observational (non-randomized) data is the assumption of ‘no unmeasured confounders’. While researchers have long been aware of the potential bias from unmeasured confounders, many new approaches have recently been proposed to quantitatively assess the robustness of findings to potential unmeasured confounding. These include a growing body of literature proposing methods that apply in a broad range of settings, including when researchers cannot identify any specific potential unmeasured confounder. In this chapter, we review existing sensitivity analysis approaches and argue for a growing consensus on best-practice principles. Broadly applicable methods, such as the E-value, the simulation framework, or the omitted-variables approach, are well suited as foundational steps in any quantitative assessment of robustness to unmeasured confounding. We also argue that, given these broadly applicable methods and published R packages that remove much of the implementation complexity, consistent application of quantitative sensitivity analyses is now practical.
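
As a concrete illustration of the broadly applicable methods noted above, the sketch below computes the E-value of VanderWeele and Ding for an observed risk ratio: the minimum strength of association, on the risk-ratio scale, that an unmeasured confounder would need to have with both treatment and outcome to fully explain away the observed effect. This is a minimal R sketch using the closed-form formula RR + sqrt(RR × (RR − 1)); the risk ratio of 1.50 and lower confidence limit of 1.20 are hypothetical values chosen for illustration, not results from this chapter, and in practice the published EValue R package provides the same calculation.

    # Minimal R sketch of the E-value (VanderWeele & Ding).
    # The inputs below are hypothetical, illustrative values.
    e_value <- function(rr) {
      # For protective estimates (RR < 1), invert to the harmful scale first.
      rr <- ifelse(rr < 1, 1 / rr, rr)
      rr + sqrt(rr * (rr - 1))
    }

    est <- 1.50  # hypothetical observed risk ratio
    lo  <- 1.20  # hypothetical 95% confidence limit closest to the null

    # E-value for the point estimate (about 2.37 here).
    print(e_value(est))
    # E-value for the confidence limit closest to the null (about 1.69 here);
    # by convention it is 1 if the interval crosses the null.
    print(if (lo <= 1) 1 else e_value(lo))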



Author information

Correspondence to Douglas Faries.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter


Cite this chapter

Faries, D. (2023). Sensitivity Analyses for Unmeasured Confounding: This Is the Way. In: He, W., Fang, Y., Wang, H. (eds) Real-World Evidence in Medical Product Development. Springer, Cham. https://doi.org/10.1007/978-3-031-26328-6_14
