PharmacoEconomics, Volume 28, Issue 10, pp 813–830

International Comparison of Comparative Effectiveness Research in Five Jurisdictions

Insights for the US
  • Adrian R. Levy
  • Craig Mitton
  • Karissa M. Johnston
  • Brian Harrigan
  • Andrew H. Briggs

Abstract

Spurred by a desire to improve the quality of care and to understand the relative value of medical treatments, the US has recently seen a surge of interest in publicly funded comparative effectiveness research (CER). Because health technology assessment (HTA) shares some of the same goals as CER, and publicly funded HTA has been a feature of other industrialized countries for many years, a review of HTA activities in some of these countries can be a helpful source of information for the US debate.

Informed by a literature review, and in two cases augmented by informant interviews, we reviewed the organization of HTA activities in five jurisdictions: Canada, Sweden, Scotland, the Netherlands and Australia. For each jurisdiction, we provide a summary description of the healthcare system and of the key features of its HTA bodies, with a particular focus on the processes of HTA for listing medications on public formularies.

Four of the committees evaluating medications for formulary inclusion are funded by, but remain at arm’s length from, the government (Canada, Australia, Sweden and Scotland), while the process is fully embedded within the government in the Netherlands. Each of these jurisdictions has a stated preference for comparative outcomes evidence from randomized controlled trials but will, under certain circumstances, accept randomized evidence based on surrogate markers or on comparators that are not directly relevant, or non-randomized evidence. Health technology evaluation committees largely comprise health professionals, with public representatives included on the Canadian, Australian and Scottish committees. Scotland is the only one of the five jurisdictions reviewed to have industry representation on its evaluation committee.
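Accepting evidence against comparators that are not directly relevant generally means relying on some form of indirect comparison. The sketch below is a minimal Python illustration of the standard adjusted indirect comparison: when drugs A and B have each been trialled only against a common comparator C, their relative effect is estimated as the difference of the trial-level log odds ratios. All numbers and the function name are hypothetical and are not taken from the article.

```python
# Minimal sketch of an adjusted indirect comparison (illustrative only;
# the trial summaries below are hypothetical).
import math


def indirect_comparison(log_or_ac, se_ac, log_or_bc, se_bc):
    """Estimate the odds ratio of A vs B from A-vs-C and B-vs-C trial summaries."""
    log_or_ab = log_or_ac - log_or_bc            # difference of log odds ratios
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)   # variances add under independence
    lo, hi = log_or_ab - 1.96 * se_ab, log_or_ab + 1.96 * se_ab
    return math.exp(log_or_ab), (math.exp(lo), math.exp(hi))


# Hypothetical summaries: A vs C gave OR 0.70 (SE of log OR 0.15),
# B vs C gave OR 0.85 (SE of log OR 0.18).
or_ab, ci_ab = indirect_comparison(math.log(0.70), 0.15, math.log(0.85), 0.18)
print(f"Indirect OR, A vs B: {or_ab:.2f} (95% CI {ci_ab[0]:.2f} to {ci_ab[1]:.2f})")
```

The widened confidence interval relative to either head-to-head trial is the price of the indirect route, which is one reason the committees reviewed state a preference for directly randomized comparisons.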

We identified seven characteristics that are shared across the jurisdictions reviewed and that potentially serve as insights for the development of CER in the US: (i) the process must be responsive to stakeholders’ interests, in that the turnaround time for assessments must be minimized, transparency must be maximized, the process must be considered fair using universally agreed standards and the process must be modifiable based on stakeholders’ requirements; (ii) the assessment of medical technologies other than drugs may present different challenges and is managed separately in other HTA organizations; (iii) because of the link between HTA and reimbursement decisions, completion of the HTA process following regulatory approval can delay market access to new technologies, and closer integration between regulatory approval and HTA processes is therefore being explored internationally; (iv) there is a direct or indirect link to reimbursement in the jurisdictions explored, and without this link the role of CER in the US will remain advisory; (v) each jurisdiction reviewed benefits from a single payer that is informed by the process, whereas given the diverse multipayer environment in the US, CER in the US may usefully focus on generating comparative effectiveness evidence; (vi) a common metric for assessing the intended and unintended effects of treatment allows comparison across different technologies; and (vii) one stated focus of CER is on therapeutic benefit among ‘high-priority populations’, including specific demographic groups (the elderly and children, racial and ethnic minorities) and individuals with disabilities, multiple chronic conditions and specific genomic factors. This last aim will be difficult to achieve because differences in therapeutic benefit among subgroups are detected through effect modification or, more specifically, through statistical evidence of effect measure modification, typically on relative measures of effect. Few randomized trials have enough power to detect effect modification, and reports of it remain uncommon in the scientific literature. As consideration is given to the development of a publicly funded CER body in the US, much can be learned from the international experience. Nevertheless, some distinctive features of the US healthcare system must be taken into account when assessing the transferability of these insights.
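On the final point, effect measure modification on a relative scale is conventionally tested by adding a treatment-by-subgroup interaction term to a regression model and examining its coefficient, and detecting such an interaction typically requires a far larger sample than detecting a main treatment effect. The sketch below illustrates this with simulated data; the variable names, effect sizes and sample size are hypothetical and are not drawn from the article.

```python
# Minimal sketch: testing for effect measure modification (on the odds ratio
# scale) via a treatment-by-subgroup interaction term. Simulated data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(seed=1)
n = 20_000                               # interaction tests need large samples
treatment = rng.integers(0, 2, n)        # 1 = new drug, 0 = comparator
elderly = rng.integers(0, 2, n)          # hypothetical subgroup indicator

# Simulated truth: treatment lowers the odds of the outcome overall,
# but less so in the elderly subgroup (a positive interaction on the log scale).
log_odds = -2.0 - 0.7 * treatment + 0.3 * elderly + 0.4 * treatment * elderly
outcome = rng.binomial(1, 1.0 / (1.0 + np.exp(-log_odds)))

df = pd.DataFrame({"outcome": outcome, "treatment": treatment, "elderly": elderly})
fit = smf.logit("outcome ~ treatment * elderly", data=df).fit(disp=False)

# The interaction coefficient is the log of the ratio of subgroup-specific
# odds ratios; its p-value is the usual test for effect measure modification.
print(fit.params["treatment:elderly"], fit.pvalues["treatment:elderly"])
```

Re-running the same simulation with a sample size typical of a single phase III trial will often leave the interaction term non-significant even though it is truly present, which is the power problem noted above.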

Supplementary material

40273_2012_28100813_MOESM1_ESM.pdf (126 kb)
40273_2012_28100813_MOESM2_ESM.pdf (114 kb)


Copyright information

© Adis Data Information BV 2010

Authors and Affiliations

  • Adrian R. Levy 1, 2, 5
  • Craig Mitton 3
  • Karissa M. Johnston 2, 5
  • Brian Harrigan 2, 5
  • Andrew H. Briggs 2, 4

  1. Department of Community Health and Epidemiology, Dalhousie University, Halifax, Canada
  2. Oxford Outcomes, Oxford, UK
  3. School of Population and Public Health, University of British Columbia, Vancouver, Canada
  4. Public Health and Health Policy, University of Glasgow, Glasgow, UK
  5. Vancouver, Canada
