Volume 34, Issue 3, pp 273–284

Are Efficient Designs Used in Discrete Choice Experiments Too Difficult for Some Respondents? A Case Study Eliciting Preferences for End-of-Life Care

  • Terry N. Flynn
  • Marcel Bilger
  • Chetna Malhotra
  • Eric A. Finkelstein
Original Research Article



Background

Although efficient designs have sample size advantages for discrete choice experiments (DCEs), it has been hypothesised that they may result in biased estimates owing to some respondents using simplistic heuristics.


Objectives

The main objective was to provide a case study documenting that many respondents choose on the basis of a single attribute when exposed to highly efficient DCE designs but switch to a conventional multi-attribute decision rule when design efficiency is lowered (which reduces the need to trade across all attributes). Additional objectives were to compare the sizes of the estimated coefficients and to characterise heterogeneity, thus providing evidence of the magnitude of bias likely present in highly efficient designs.


Methods

Five hundred and twenty-five respondents participating in a wider end-of-life survey each answered two DCEs that differed in design efficiency. The first (DCE1) was a Street and Burgess 100% efficient design based on an Orthogonal Main Effects Plan (2^7 in 8 pairs), using the top and bottom levels of all attributes. The second (DCE2) comprised one eighth of the full Orthogonal Main Effects Plan for a 2 × 4^6 attribute structure, presented in 32 pairs. Linear probability models estimated every respondent's complete utility function in DCE1. The number of respondents answering on the basis of a single attribute level was noted, as was the proportion of these who then violated this rule in DCE2, the less efficient DCE. Latent class analyses were used to identify heterogeneity.


Results

Sixty per cent of respondents answered all eight tasks in DCE1 using a single attribute; most used the rule "choose the cheapest end-of-life care plan". However, when answering the four less efficient tasks in DCE2, one third of these (20% overall) traded across attributes at least once. Among those whose decision rule could not be described qualitatively, latent class models identified two classes: compared with class one, class two was more concerned with quality than cost of care and preferred to die in an institution rather than at home. Higher efficiency was also associated with smaller regression coefficients, suggesting either weaker preferences or lower choice consistency (larger error variance).


Conclusions

This is the first within-subject study to investigate the association between DCE design efficiency and utility estimates. A majority of respondents did not trade across attributes in the more efficient design, but one third of these did trade in the less efficient design. More within-subject studies are required to establish how common this behaviour is. Future DCEs may need to maximise a joint function of statistical and cognitive efficiency in order to maximise overall efficiency and minimise bias.





Acknowledgements

We acknowledge participants at the International Choice Modelling Conference in Sydney 2013 for helpful comments, and Prof. Jordan Louviere for providing the conceptual basis for the two designs used and for comments on the results. This research was supported by the Singapore Ministry of Health's National Medical Research Council (NMRC/NIG/1036/2010) and the Lien Centre for Palliative Care (Duke-NUS-LCPC(I)/2011/0001). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Compliance with Ethical Standards

TNF designed the discrete choice experiments, conducted principal analyses, drafted the paper and agreed edits to the manuscript. He has no conflict of interest.

MB co-designed the overall survey, suggested secondary analyses and edited the manuscript. He has no conflict of interest.

CM co-designed the overall survey, led administration of the survey and edited the manuscript. She has no conflict of interest.

EAF co-designed the overall survey, suggested secondary analyses, led the overall project and edited the manuscript. He has no conflict of interest.

Supplementary material

Supplementary material 1 (DOCX 84 kb)



Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Terry N. Flynn (1)
  • Marcel Bilger (2)
  • Chetna Malhotra (2, 3)
  • Eric A. Finkelstein (2, 3, 4)

  1. TF Choices Ltd, Nottingham, UK
  2. Program in Health Services and Systems Research, Duke-NUS Graduate Medical School, Singapore, Singapore
  3. Lien Centre for Palliative Care, Duke-NUS Graduate Medical School, Singapore, Singapore
  4. Duke-Global Health Institute, Duke University, Durham, USA
