Using a discrete choice experiment to value the QLU-C10D: feasibility and sensitivity to presentation format
To assess the feasibility of using a discrete choice experiment (DCE) to value health states within the QLU-C10D, a utility instrument derived from the QLQ-C30, and to assess clarity, difficulty, and respondent preference between two presentation formats.
We ran a DCE valuation task in an online panel (N = 430). Respondents answered 16 choice pairs; in half of these, differences between dimensions were highlighted, and in the remainder, common dimensions were described in text and differing attributes were tabulated. To simplify the cognitive task, only four of the QLU-C10D’s ten dimensions differed per choice set. We assessed difficulty and clarity of the valuation task with Likert-type scales, and respondents were asked which format they preferred. We analysed the DCE data by format with a conditional logit model and used Chi-squared tests to compare other responses by format. Semi-structured telephone interviews (N = 8) explored respondents’ cognitive approaches to the valuation task.
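The conditional logit analysis described above can be illustrated on simulated data. For two-alternative choice sets, the conditional logit model reduces to a binary logit on the differences in attribute levels between the alternatives, with no intercept. The sketch below, which uses hypothetical simulated data rather than the study's actual dataset, fits such a model by maximum likelihood; the design (400 respondents, 16 pairs, four differing attributes per set) mirrors the description above, but the coefficient values are illustrative only.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulated DCE data: 400 respondents x 16 choice pairs;
# four attributes differ within each choice set (illustrative values).
n_obs, n_attr = 400 * 16, 4
true_beta = np.array([-0.8, -0.5, -0.3, -0.1])  # hypothetical disutilities

# Differences in attribute levels between alternatives A and B, coded -1/0/1.
x_diff = rng.integers(-1, 2, size=(n_obs, n_attr)).astype(float)

# For a two-alternative set, conditional logit is equivalent to a binary
# logit on attribute differences: P(choose A) = 1 / (1 + exp(-x_diff @ beta)).
utility_diff = x_diff @ true_beta
p_choose_a = 1.0 / (1.0 + np.exp(-utility_diff))
y = (rng.random(n_obs) < p_choose_a).astype(float)

def neg_loglik(beta):
    z = x_diff @ beta
    # Binary logit log-likelihood: sum(y*z - log(1 + exp(z))), negated.
    return -np.sum(y * z - np.logaddexp(0.0, z))

fit = minimize(neg_loglik, np.zeros(n_attr), method="BFGS")
print(fit.x)  # estimates recover true_beta up to sampling error
```

Monotonicity of the estimated coefficients within a dimension (worse levels receiving larger disutilities) is the check used in the Results to judge whether respondents engaged with the task sensibly.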
Four hundred and forty-nine individuals were recruited; 430 completed at least one choice set, and 422/449 (94%) completed all 16 choice sets. Interviews revealed that respondents found ten dimensions difficult but manageable, with many adopting simplifying heuristics. Ratings of clarity and difficulty were identical between formats, but the "highlight" format was preferred by 68% of respondents. Conditional logit parameter estimates were monotonic within dimensions, suggesting that respondents were able to complete the DCE sensibly, yielding valid results.
A DCE valuation task in which only four of the QLU-C10D’s ten dimensions differed in any choice set is feasible for deriving utility weights for the QLU-C10D.
Keywords: Quality of life · Utility · QLQ-C30 · Discrete choice experiment · Cancer
The MAUCa Consortium, in addition to those named as authors, consists of the following members, all of whom made some contribution to the research reported in this paper: Helen McTaggart-Cowan, Peter Grimison, Monika Janda, and Julie Pallant. This research was supported by a National Health and Medical Research Council (Australia) Project Grant (632662). Associate Professor Janda was supported by a NHMRC Career Development Award 1045247. Dr. Norman was supported by a NHMRC early career research fellowship (1069732). Professor King was supported by the Australian Government through Cancer Australia.
Compliance with ethical standards
Conflict of interest
The authors declare that they have no conflicts of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki Declaration and its later amendments or comparable ethical standards. The study was approved by the University of Sydney Human Research Ethics Committee, Approval Number 2012/2444.
Informed consent was obtained from all individual participants included in the study.