Towards a New Framework for Addressing Structural Uncertainty in Health Technology Assessment Guidelines
Providing scientific advice and recommendations for public decision making entails identifying, selecting and weighing evidence derived from multiple sources of information through a systematic approach, while taking into account ethical, cultural and societal factors. Integrated in the evaluation process are exchanges between regulatory agencies, private firms, scientific experts and government representatives.
In the case of drugs and medical devices, health technology assessment (HTA) agencies are increasingly commissioned to evaluate innovations in order to provide governments with recommendations and advice on reimbursement and/or pricing. To undertake this task, HTA agencies [1, 2, 3, 4, 5, 6] in Europe and elsewhere have developed methodological guidelines on the economic evaluation of health technologies. One component of these guidelines deals with ways for both manufacturers (pharmaceutical and medical device firms) and HTA agency evaluators (modelers, economists and public health experts) to address uncertainty. Several types of uncertainty have indeed been identified in HTA: methodological, parameter and structural uncertainty. Most guidelines describe quite well how to deal with the first two categories, although there is still room for improvement. However, recommendations about how to tackle structural uncertainty remain largely elusive. HTA agencies and decision makers may thus be exposed to oversimplified assessments and recommendations that put aside complex forms of uncertainty such as structural ‘deep’ uncertainty.
This editorial is not intended to promote new approaches to exploring structural uncertainty, but rather to emphasize concerns related to the topic, such as its definition and analysis. Our aim is therefore to highlight the need to renew the analytical framework guidance for HTA.
2 From Knightian to Bayesian Uncertainty
Uncertainty and its exploration have been investigated across many disciplines, including statistics, operations research, philosophy and economics. Defining uncertainty is a complex and challenging task, both theoretically and practically.
The theoretical debate frequently focuses on the quantification of ‘immeasurable’ or ‘deep’ uncertainty. As early as 1921, Frank Knight paved the way for an essential debate on uncertainty, how it can be defined, and the extent to which it should be taken into account at either the microeconomic or macroeconomic level of economic analysis. Knight placed particular emphasis on the distinction between ‘known risk’, which may be measured, and ‘true uncertainty’, which cannot: “Uncertainty must be taken in a sense radically distinct from the familiar notion of risk, from which it has never been properly separated. The essential fact is that ‘risk’ means in some cases a quantity susceptible of measurement”.
Today, arguments against the Knightian view and in favor of Bayesian decision theory are increasingly advanced. According to the latter, all forms of uncertainty are, in principle, quantifiable. It is admittedly difficult and uncomfortable to quantify uncertainty in many contexts because “people find it difficult to express their knowledge and beliefs in probabilistic form, so that elicitation of probability distribution is a far from perfect process”. However, if ultimately we are to make a decision under uncertainty, our optimal action will be that which maximizes our utility given our subjective probability specification over all relevant unknowns.
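As a minimal sketch of this Bayesian logic, the optimal action is simply the one with the highest expected net benefit under the analyst's subjective probability specification. All distributions, costs and the willingness-to-pay threshold below are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical example: a new technology vs. a comparator, with a
# subjective probability specification over the unknowns.
n_draws = 10_000
effect = rng.normal(0.5, 0.2, n_draws)     # belief about incremental QALYs (assumed)
cost = rng.normal(10_000, 2_000, n_draws)  # belief about incremental cost (assumed)
wtp = 30_000                               # willingness-to-pay per QALY (assumed)

# Net monetary benefit (NMB) of each option; comparator normalized to zero
nmb_new = wtp * effect - cost
nmb_comparator = np.zeros(n_draws)

# The Bayesian-optimal action maximizes expected (monetary) utility
expected = {"new": nmb_new.mean(), "comparator": nmb_comparator.mean()}
best = max(expected, key=expected.get)
print(best, round(expected[best]))
```

Here the expected incremental NMB of the new technology is roughly 30,000 × 0.5 − 10,000 = 5000, so under these assumed beliefs the optimal action is to adopt it, even though many individual draws are unfavorable.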
The Bayesian approach has progressively been incorporated into the formal assessment of uncertainty [12, 13, 14, 15] and applied in the value-of-information framework. Nevertheless, its handling in HTA guidelines remains tricky and challenging.
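The value-of-information idea can be illustrated with a small Monte Carlo sketch of the expected value of perfect information (EVPI), i.e. the gap between deciding with current information and deciding with perfect information. The distribution and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical incremental net monetary benefit (NMB) of a new technology
# vs. a comparator whose NMB is normalized to zero.
n = 100_000
nmb_new = rng.normal(500, 2_000, n)  # mean 500, sd 2000 (assumed)
nmb_comp = np.zeros(n)

# With current information we pick the option with the highest expected NMB;
# with perfect information we could pick the best option in every draw.
value_current_info = max(nmb_new.mean(), nmb_comp.mean())
value_perfect_info = np.maximum(nmb_new, nmb_comp).mean()
evpi = value_perfect_info - value_current_info
print(f"EVPI per patient: {evpi:.0f}")
```

A strictly positive EVPI, as here, quantifies in monetary terms how much further research resolving the uncertainty could at most be worth per patient.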
3 Taking into Account Uncertainty in the Economic Evaluation of Health Technologies
Handling uncertainty is a concern shared by national HTA guidelines (e.g. Australia [Pharmaceutical Benefits Advisory Committee; PBAC], Belgium [Health Care Knowledge Centre; KCE 2015], Canada [Canadian Agency for Drugs and Technologies in Health; CADTH 2017], England and Wales [National Institute for Health and Care Excellence; NICE 2015], France [Haute Autorité de Santé; HAS 2012] and the Netherlands), academic research, and international associations (e.g. the International Society for Pharmacoeconomics and Outcomes Research–Society for Medical Decision Making [ISPOR-SMDM] Modeling Good Research Practices Task Force-6 2012 and the European Network for Health Technology Assessment [EUnetHTA] 2015). In particular, the ISPOR-SMDM Modeling Good Research Practices Task Force-6 distinguished stochastic, parameter and structural uncertainty. Broadly, stochastic uncertainty refers to ‘random variability in outcomes between identical patients’. Parameter uncertainty relates to the uncertainty around the ‘true’ costs and effects for each decision option because we are uncertain about the ‘true’ values of the inputs of the model. Structural uncertainty is the most difficult type of uncertainty to define and to grasp. According to Briggs et al., structural uncertainty is associated with ‘the assumptions inherent in the decision model’. Today, although academic research suggests several analytical frameworks allowing the exploration of structural uncertainty [29, 30, 31, 32, 33, 34, 35], there are as yet no explicit HTA recommendations about how to operationally address structural uncertainty in national or international guidelines related to health economic evaluation [20, 32].
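To make the distinction between parameter and stochastic uncertainty concrete, here is a deliberately simple two-layer Monte Carlo sketch; the model, the Beta distribution and the costs are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical one-parameter model: p is the probability that a treatment
# avoids a costly clinical event.

# Parameter uncertainty: we are unsure of the 'true' p, so we draw it from
# a (subjective or estimated) distribution, e.g. Beta(80, 20).
p_draws = rng.beta(80, 20, size=5_000)

# Stochastic uncertainty: even for a fixed 'true' p, identical patients have
# variable outcomes (event avoided or not) — one simulated patient per draw.
outcomes = rng.binomial(1, p_draws)

cost_event_avoided, cost_event = 1_000.0, 9_000.0
costs = np.where(outcomes == 1, cost_event_avoided, cost_event)

print(f"mean p: {p_draws.mean():.3f}")   # parameter-level summary
print(f"mean cost: {costs.mean():.0f}")  # reflects both layers of uncertainty
```

Structural uncertainty is a third layer on top of both: it concerns whether this one-parameter model is an adequate representation of the decision problem at all.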
4 Managing Structural Uncertainty in Health Economic Evaluation: A Further Need for a Straightforward Definition and Practical Health Technology Assessment (HTA) Guidance
A decision context is always characterized by limited knowledge on either the condition under study or the effect of the technology under evaluation. For example, drug outcomes and health gains based on surrogate endpoints [21, 22], incomplete data generated for the outcomes of medical devices [23, 24] or partial understanding of disease progression put decision makers in a situation in which they must make decisions based on incomplete information. In limited knowledge environments, the lack of in-depth exploration of structural uncertainty may prevent both manufacturers and HTA agencies from optimally assessing the magnitude of prediction errors related to health economic outputs and, consequently, the confidence associated with economic evaluation (e.g. cost-effectiveness analysis).
To our knowledge, there is as yet no standard definition of structural uncertainty. Admittedly, the concept was identified in the late 2000s [26, 27], but no ‘operational’ definition of structural uncertainty itself has yet been proposed. For example, the report of the ISPOR-SMDM Modeling Good Research Practices Task Force-6 and Ghabri et al. associated it with the uncertainty related to ‘the assumptions inherent in the decision model’ or ‘model uncertainty’ and mentioned some analogies with concerns identified in the analysis of standard multivariate regression.
Exploring structural uncertainty primarily requires a renewed framework. Such a framework should first address and overcome the difficulty in defining exactly what structural uncertainty means. It cannot be defined as ‘uncertainty about the model’ because the model is built, so it is not uncertain. It cannot be ‘uncertainty about the assumptions or the structure of the model’, because that would imply there is a ‘true’ model about which we are uncertain. However, by definition, models are simplified constructs of reality based on the best available knowledge. A frequently used quote in relation to models is that “all models are wrong, but some are useful”. But if they are wrong, what does model error mean? We are uncertain about ‘true’ costs and effects for each of our decision options because our model produces a prediction that we know is doubtful, since the model is not perfect (it cannot perfectly reflect the actual future outcomes of the assessed intervention). Consequently, following Strong and Oakley, structural uncertainty might be defined as the uncertainty that arises as a result of uncertain model error in relation to the target quantities, or more generally the outcomes, that the model is trying to predict.
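One approach explored in the academic literature on structural uncertainty is model averaging, in which the predictions of rival model structures are weighted rather than a single structure being assumed correct. A minimal sketch, in which both candidate structures, their output distributions and the structural weights are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(2)

# Two candidate model structures predict incremental QALYs;
# neither is assumed to be the 'true' model.
def model_a(rng, n):  # e.g. assumes the treatment effect persists
    return rng.normal(0.40, 0.10, n)

def model_b(rng, n):  # e.g. assumes the treatment effect wanes
    return rng.normal(0.25, 0.10, n)

n = 10_000
weights = {"A": 0.6, "B": 0.4}  # subjective structural weights (assumed)

# Model-averaged prediction: a mixture over the candidate structures
pick_a = rng.random(n) < weights["A"]
qalys = np.where(pick_a, model_a(rng, n), model_b(rng, n))

print(f"model-averaged incremental QALYs: {qalys.mean():.3f}")
```

The averaged output is wider than either structure's alone, so the extra spread carries the structural (model error) component of uncertainty into the decision analysis instead of hiding it behind one assumed structure.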
Once the issue of defining structural uncertainty has been taken up, the question arises of how to recommend addressing it in HTA agencies. Considering the genuine need to establish a renewed HTA framework, we suggest the following questions. Should HTA agencies go as far as proposing ‘analytical’ tools to assess structural uncertainty, or at least explicitly recommend the use of approaches to structural uncertainty already explored in academic research? Should health economic journals and peer reviewers stress the lack of exploration of structural uncertainty? Much like with the validation of health economic decision models and its related issues, is it not time to put forward a dedicated tool for structural uncertainty that would propose a commonly accepted definition and good practices on undisputed ways to include structural uncertainty in HTA?
This editorial partially draws on previous work presented at the 2016 Health Technology Assessment International (HTAi) Congress in Tokyo, Japan and at the 2016 meeting of the European Public Choice Society in Freiburg, Germany. The authors especially thank Mark Strong (University of Sheffield) and Jaime Caro (McGill University) for their suggestions and insightful comments on an early draft of the manuscript. We also thank Nathalie Merle (HAS), Françoise Hamers (Santé Publique France) and all conference participants for their helpful comments on this work. The opinions expressed in this editorial are those of the authors and do not necessarily represent the views of the French National Authority for Health, the Belgian Health Care Knowledge Centre or the University of Rennes 1.
Compliance with Ethical Standards
No funding was received for the preparation of this manuscript.
Conflict of interest
The authors have no conflicts of interest. Salah Ghabri is employed by the French National Authority for Health, Irina Cleemput is employed by the Belgian Health Care Knowledge Centre and Jean-Michel Josselin is employed by the University of Rennes 1.
- 1.Pharmaceutical Benefits Advisory Committee. Guidelines for preparing submissions to the Pharmaceutical Benefits Advisory Committee (version 5.0). Australian Government Department of Health. 2016. http://www.pbac.pbs.gov.au/content/information/printable-files/pbacg-book.pdf. Accessed 15 Oct 2017.
- 2.Health Care Knowledge Centre (KCE). Belgian guidelines for economic evaluations and budget impact analysis. 2015. https://kce.fgov.be/sites/default/files/page_documents/KCE_183_economic_evaluations_second_edition_Report.pdf. Accessed 15 Oct 2017.
- 3.Canadian Agency for Drugs and Technologies in Health. Guidelines for the economic evaluation of health technologies: Canada. 4th ed. Ottawa: Canadian Agency for Drugs and Technologies in Health; 2017. https://www.cadth.ca/dv/guidelines-economic-evaluation-health-technologies-canada-4th-edition. Accessed 15 Oct 2017.
- 4.National Institute for Health and Care Excellence (NICE). Single technology appraisal: user guide for company evidence submission template. 2015. https://www.nice.org.uk/process/pmg24/chapter/cost-effectiveness. Accessed 15 Oct 2017.
- 5.Haute Autorité de Santé (HAS). Choices in methods for economic evaluation. 2012. https://www.has-sante.fr/portail/upload/docs/application/pdf/2012-10/choices_in_methods_for_economic_evaluation.pdf. Accessed 15 Oct 2017.
- 6.Guideline for the Conduct of Economic Evaluations in Health Care (Dutch Version February 2016). https://www.ispor.org/PEguidelines/source/Netherlands_Guideline_for_economic_evaluations_in_healthcare.pdf. Accessed 15 Oct 2017.
- 10.Knight F. Risk, uncertainty and profit. Boston: Houghton Mifflin; 1921.
- 17.Drummond M, Sculpher M, Torrance G, O’Brien B, Stoddart G. Methods for the economic evaluation of health care programmes. 4th ed. New York: Oxford University Press; 2015.
- 19.EUnetHTA. Methods for health economic evaluations: a guideline based on current practices in Europe. 2015. http://www.eunethta.eu/outputs/eunethta-methodological-guideline-methods-health-economic-evaluations. Accessed 15 Oct 2017.
- 26.Bojke L, Claxton K, Sculpher M, Palmer S. Characterizing structural uncertainty in decision evaluations. Pharmacoeconomics. 2009;33:435–43.
- 27.Strong M, Pilgrim H, Oakley J, Chilcott J. Structural uncertainty in health economic decision models. ScHARR Occasional Paper. 2009.
- 28.Box G, Draper N. Empirical model-building and response surfaces. New Jersey: Wiley; 1987.