Including Opt-Out Options in Discrete Choice Experiments: Issues to Consider
Providing an opt-out alternative in discrete choice experiments (DCEs) is often important for representing real-life choice situations in different contexts, including health. However, insufficient attention has been given to how best to address choice behaviours relating to this opt-out alternative when modelling DCEs, particularly in health studies. The objective of this paper is to demonstrate how to account for different opt-out effects in choice models. We aim to contribute to a better understanding of how to model opt-out choices and to show the consequences of addressing these effects incorrectly. We present our code, written in the R statistical language, so that others can explore these issues in their own data. In this practical guideline, we generate synthetic data on medication choice and use Monte Carlo simulation. We consider three different definitions of the opt-out alternative and four candidate models for each definition. We apply a frequentist-based multimodel inference approach and use performance indicators to assess the relative suitability of each candidate model in a range of settings. We show that misspecifying the opt-out effect has repercussions for marginal willingness-to-pay estimation and the forecasting of market shares. Our findings also suggest a number of key recommendations for DCE practitioners interested in exploring these issues. There is no unique best way to analyse data collected from DCEs. Researchers should consider several models so that the relative support for different hypotheses about opt-out effects can be explored.
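To make the simulation-and-estimation workflow described above concrete, the following is a minimal illustrative sketch, not the authors' own code: it simulates choices among two hypothetical medication alternatives plus an opt-out (modelled here, as one of several possible definitions, with an alternative-specific constant and zero attribute levels), then recovers the preference parameters with a conditional logit estimated by maximum likelihood. All attribute names and parameter values are assumptions for illustration only.

```r
# Minimal sketch (illustrative only): one Monte Carlo draw of a DCE with
# two medication alternatives and an opt-out, followed by conditional
# logit estimation via optim().
set.seed(42)

n <- 2000                        # number of choice observations
b <- c(-1.0,                     # opt-out alternative-specific constant
        1.5,                     # effectiveness (0/1), assumed attribute
       -0.8)                     # cost (standardised), assumed attribute

# Attribute matrices: rows = observations. Under this opt-out definition,
# the opt-out carries only its constant and zero attribute levels.
X1 <- cbind(asc = 0, eff = rbinom(n, 1, 0.5), cost = runif(n))
X2 <- cbind(asc = 0, eff = rbinom(n, 1, 0.5), cost = runif(n))
X0 <- cbind(asc = 1, eff = 0, cost = 0)[rep(1, n), ]

# Utilities with type-I extreme value (Gumbel) errors; chosen alternative
# is the utility-maximising one.
gumbel <- function(n) -log(-log(runif(n)))
U <- cbind(X1 %*% b + gumbel(n), X2 %*% b + gumbel(n), X0 %*% b + gumbel(n))
choice <- max.col(U)

# Conditional logit log-likelihood.
ll <- function(beta) {
  V <- cbind(X1 %*% beta, X2 %*% beta, X0 %*% beta)
  p <- exp(V) / rowSums(exp(V))
  sum(log(p[cbind(seq_len(n), choice)]))
}
fit <- optim(c(0, 0, 0), ll, control = list(fnscale = -1), hessian = TRUE)

round(fit$par, 2)            # estimates should lie near c(-1.0, 1.5, -0.8)
-fit$par[2] / fit$par[3]     # marginal WTP for effectiveness (ratio of coefficients)
```

In a full Monte Carlo exercise this draw-and-estimate step would be repeated many times, and, as the paper argues, several candidate specifications of the opt-out (e.g. with or without the constant, or with a separate error structure) would be estimated on each dataset and compared with performance indicators.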
We thank the editor Christopher Carswell for his invitation to write this paper. We also thank four anonymous reviewers for their helpful comments and suggestions on previous versions of this paper. Any remaining errors or misinterpretations are solely the authors’ responsibility.
DC and SE contributed equally to all aspects of this paper, including the conceptualisation, data generation, analysis and drafting of the manuscript.
Compliance with ethical standards
This study received no external funding.
The study did not involve the collection of primary data or the use of secondary data sources; therefore, ethical approval was not required.
Participants were artificially generated as part of the Monte Carlo simulation, so informed consent is not applicable.
Conflict of interest
Danny Campbell and Seda Erdem declare no conflicts of interest relevant to the content of this manuscript.