Are Efficient Designs Used in Discrete Choice Experiments Too Difficult for Some Respondents? A Case Study Eliciting Preferences for End-of-Life Care

  • Original Research Article, PharmacoEconomics

Abstract

Background

Although efficient designs have sample size advantages for discrete choice experiments (DCEs), it has been hypothesised that they may result in biased estimates owing to some respondents using simplistic heuristics.
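
For readers less familiar with what "efficient" means here, the sketch below illustrates the usual D-error calculation for a paired-choice multinomial logit design. It is our illustration rather than anything taken from the study; the zero-prior coefficients, ±1 attribute coding and toy three-attribute fold-over design are assumptions made purely for the example.

```python
import numpy as np
from itertools import product

def d_error(design_a, design_b, beta=None):
    """D-error of a paired-choice design under a multinomial logit model.

    design_a, design_b : (S, K) attribute codes for the two alternatives
        in each of S choice sets.
    beta : (K,) prior coefficients; zeros ("utility-neutral") if None.
    A lower D-error means a more statistically efficient design.
    """
    K = design_a.shape[1]
    if beta is None:
        beta = np.zeros(K)  # null priors: both options equally likely
    info = np.zeros((K, K))
    for a, b in zip(design_a, design_b):
        X = np.vstack([a, b])            # 2 x K attribute matrix for this pair
        p = np.exp(X @ beta)
        p /= p.sum()                     # MNL choice probabilities
        xbar = p @ X                     # probability-weighted mean attributes
        # Fisher information contribution of this choice set
        info += (X * p[:, None]).T @ X - np.outer(xbar, xbar)
    # D-error: determinant of the inverse information, scaled by dimension
    return np.linalg.det(np.linalg.inv(info)) ** (1.0 / K)

# Toy design: all 2^3 combinations of three two-level attributes coded +/-1,
# each paired with its fold-over (mirror image), loosely mirroring the
# "top and bottom levels of all attributes" construction described for DCE1.
A = np.array(list(product([-1.0, 1.0], repeat=3)))
B = -A
print(d_error(A, B))  # 0.125 under zero priors for this orthogonal fold-over
```

A design like DCE1 is constructed so that this D-error is as small as the attribute structure allows, which is what the "100% efficient" label refers to; the hypothesis examined here is that such statistically optimal pairs, in which the two options differ on every attribute, may be cognitively harder and so invite simplistic heuristics.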

Objectives

The main objective was to provide a case study documenting that many respondents chose on the basis of a single attribute when exposed to a highly efficient DCE design but switched to a conventional multi-attribute decision rule when the design efficiency was lowered (reducing the need to trade across all attributes). Additional objectives included comparing the sizes of the estimated coefficients and characterising heterogeneity, thus providing evidence of the magnitude of the bias likely to be present in highly efficient designs.

Methods

Five hundred and twenty-five respondents participating in a wider end-of-life survey each answered two DCEs that varied in their design efficiency. The first (DCE1) was a Street and Burgess 100% efficient Orthogonal Main Effects Plan design (2⁷ in 8 pairs), using the top and bottom levels of all attributes. The second (DCE2) comprised one eighth of the full Orthogonal Main Effects Plan (a 2 × 4⁶) in 32 pairs. Linear probability models estimated every respondent’s complete utility function in DCE1. The number of respondents answering on the basis of one attribute level was noted, as was the proportion of these who then violated this rule in DCE2, the less efficient DCE. Latent class analyses were used to identify heterogeneity.
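
The per-respondent analysis can be pictured with a short sketch. This is not the authors' code; the ±1 difference coding, the array names and the working definition of a "single-attribute" responder are our assumptions. It shows how an ordinary-least-squares linear probability model can recover an individual utility function from a handful of binary choices, and how a one-attribute decision rule such as "always choose the cheapest plan" can be detected.

```python
import numpy as np

def fit_lpm(diff_X, choices):
    """Linear probability model for one respondent, fitted by OLS.

    diff_X  : (T, K) attribute differences (option A minus option B) for
              the T choice tasks, attributes coded +/-1.
    choices : (T,) array, 1 if option A was chosen and 0 otherwise.
    Returns the K estimated attribute weights (intercept discarded).
    """
    X = np.column_stack([np.ones(len(choices)), diff_X])
    coef, *_ = np.linalg.lstsq(X, choices, rcond=None)
    return coef[1:]

def single_attribute_responder(diff_X, choices):
    """Indices of attributes that alone reproduce every observed choice.

    A respondent is flagged on attribute k if, in every task, the chosen
    option sits on the same side of that attribute (always the higher
    level, or always the lower level, e.g. always the cheaper plan).
    Assumes the two options always differ on every attribute, as in a
    fold-over design like DCE1.
    """
    signs = np.where(choices == 1, 1.0, -1.0)   # +1 where option A was chosen
    aligned = diff_X * signs[:, None]           # chosen-minus-rejected levels
    return [k for k in range(diff_X.shape[1])
            if np.all(aligned[:, k] > 0) or np.all(aligned[:, k] < 0)]
```

Respondents flagged by a rule such as "always chose the lowest-cost option" in DCE1 are exactly those whose choices are then re-examined in DCE2 to see whether the rule is ever violated.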

Results

Sixty per cent of respondents answered all eight tasks comprising DCE1 using a single attribute; most used the rule “choose cheapest end-of-life care plan”. However, when answering the four tasks from the less efficient DCE2, one third of these respondents (20% overall) traded across attributes at least once. Among those whose decision rule could not be described qualitatively, latent class models identified two classes; compared with class one, class two was more concerned with the quality of care than its cost and wished to die in an institution rather than at home. Higher efficiency was also associated with smaller regression coefficients, suggesting either weaker preferences or lower choice consistency (larger errors).
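
The latent class step can also be sketched. Again, this is illustrative code rather than the study's; the EM formulation, two-class random starting values and array shapes are our assumptions. It fits a latent class conditional logit in which each class has its own coefficient vector and respondents are probabilistically assigned to classes.

```python
import numpy as np
from scipy.optimize import minimize

def respondent_loglik(beta, X, y):
    """Conditional logit log-likelihood per respondent.

    X : (N, T, J, K) attributes; y : (N, T) integer index of the chosen
    alternative. Returns an (N,) vector of log-likelihoods under beta.
    """
    v = X @ beta                                   # (N, T, J) utilities
    v = v - v.max(axis=-1, keepdims=True)          # numerical stability
    logp = v - np.log(np.exp(v).sum(axis=-1, keepdims=True))
    chosen = np.take_along_axis(logp, y[..., None], axis=-1).squeeze(-1)
    return chosen.sum(axis=1)

def latent_class_logit(X, y, n_classes=2, n_iter=50, seed=0):
    """Latent class logit fitted by a plain EM loop (illustrative only)."""
    rng = np.random.default_rng(seed)
    N, T, J, K = X.shape
    betas = rng.normal(scale=0.1, size=(n_classes, K))   # random starts
    shares = np.full(n_classes, 1.0 / n_classes)
    for _ in range(n_iter):
        # E-step: posterior probability of each respondent belonging to each class
        ll = np.stack([respondent_loglik(b, X, y) for b in betas], axis=1)
        logw = np.log(shares) + ll
        logw -= logw.max(axis=1, keepdims=True)
        w = np.exp(logw)
        w /= w.sum(axis=1, keepdims=True)
        # M-step: update class shares and class-specific coefficients
        shares = w.mean(axis=0)
        for c in range(n_classes):
            neg_ll = lambda b, c=c: -(w[:, c] * respondent_loglik(b, X, y)).sum()
            betas[c] = minimize(neg_ll, betas[c], method="BFGS").x
    return shares, betas, w
```

Applied to the respondents whose decision rule could not be described qualitatively, a model of this general form is what separates the cost-focused class from the quality-focused, institution-preferring class reported above; the posterior weights play the role of class membership probabilities.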

Conclusion

This is the first within-subject study to investigate the association between DCE design efficiency and utility estimates. It found that a majority of people did not trade across attributes in the more efficient design but that one third of these then did trade in the less efficient design. More within-subject studies are required to establish how common this is. It may be that future DCEs should attempt to maximise a joint function of statistical and cognitive efficiency, thereby improving overall efficiency and minimising bias.



Acknowledgments

We acknowledge participants at the International Choice Modelling Conference in Sydney 2013 for helpful comments and Prof. Jordan Louviere for providing the conceptual basis for the two designs used and for comments on the results. This research was supported by the Singapore Ministry of Health’s National Medical Research Council (NMRC/NIG/1036/2010) and Lien Centre for Palliative Care (Duke-NUS-LCPC(I)/2011/0001). The funders had no role in study design, data collection and analysis, decision to publish or preparation of the manuscript.

Author information

Correspondence to Terry N. Flynn.

Ethics declarations

TNF designed the discrete choice experiments, conducted principal analyses, drafted the paper and agreed edits to the manuscript. He has no conflict of interest.

MB co-designed the overall survey, suggested secondary analyses and edited the manuscript. He has no conflict of interest.

CM co-designed the overall survey, led administration of the survey and edited the manuscript. She has no conflict of interest.

EAF co-designed the overall survey, suggested secondary analyses, led the overall project and edited the manuscript. He has no conflict of interest.

Electronic supplementary material


Supplementary material 1 (DOCX 84 kb)


About this article


Cite this article

Flynn, T.N., Bilger, M., Malhotra, C. et al. Are Efficient Designs Used in Discrete Choice Experiments Too Difficult for Some Respondents? A Case Study Eliciting Preferences for End-of-Life Care. PharmacoEconomics 34, 273–284 (2016). https://doi.org/10.1007/s40273-015-0338-z
