
Applied Health Economics and Health Policy, Volume 9, Issue 1, pp 15–27

Valuing child health utility 9D health states with a young adolescent sample

A feasibility study to compare best-worst scaling discrete-choice experiment, standard gamble and time trade-off methods
  • Julie Ratcliffe (corresponding author)
  • Leah Couzner
  • Terry Flynn
  • Michael Sawyer
  • Katherine Stevens
  • John Brazier
  • Leonie Burgess
Practical Application

Abstract

QALYs are increasingly being used as a health outcome measure to calculate the benefits of new treatments and interventions within cost-utility analyses for economic evaluation. Cost-utility analyses of adolescent-specific treatment programmes are scant in comparison with those reported for adults, and they tend to incorporate the views of clinicians or adults as the main source of preferences. However, it is not clear that the views of adults accord with those of adolescents on this issue; hence, the treatments and interventions most highly valued by adults may not correspond with those most highly valued by adolescents. Ordinal methods for health state valuation may be more easily understood and interpreted by young adolescent samples than conventional approaches. The availability of young adolescent-specific health state values for the estimation of QALYs will provide new insights into the types of treatment programmes and health services that are most highly valued by young adolescents.
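To make the role of QALYs in a cost-utility analysis concrete, the sketch below shows the standard calculation of an incremental QALY gain and a cost-per-QALY ratio. All utility weights, durations and costs are invented for illustration and are not taken from this study.

    def qalys(utility, years):
        """QALYs = utility weight (0 = dead, 1 = full health) multiplied by duration in years."""
        return utility * years

    # Hypothetical intervention vs. usual care over 5 years (all numbers invented)
    incremental_qalys = qalys(0.85, 5) - qalys(0.70, 5)   # 0.75 QALYs gained
    incremental_cost = 6000.0                              # invented incremental cost
    icer = incremental_cost / incremental_qalys            # cost per QALY gained
    print(round(incremental_qalys, 2), round(icer))        # 0.75 8000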

The first objective of this study was to assess the feasibility of applying best-worst scaling (BWS) discrete-choice experiment (DCE) methods in a young adolescent sample to value health states defined by the Child Health Utility 9D (CHU9D) instrument, a new generic preference-based measure of health-related quality of life developed specifically for application in young people. The second objective was to compare BWS DCE questions (where respondents are asked to indicate the best and worst attribute for each of a number of health states, presented one at a time) with conventional time trade-off (TTO) and standard gamble (SG) questions in terms of ease of understanding and completeness.
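For readers unfamiliar with the conventional techniques, the following minimal sketch encodes the textbook scoring rules for TTO and SG responses for health states regarded as better than dead. It illustrates the general methods only; it is not the authors' elicitation or scoring protocol.

    def tto_utility(time_in_full_health, time_in_state):
        """Time trade-off: indifference between x years in full health and
        t years in the health state implies a utility of x / t."""
        return time_in_full_health / time_in_state

    def sg_utility(p_full_health):
        """Standard gamble: indifference between the certain health state and a
        gamble giving full health with probability p (death otherwise) implies
        a utility of p."""
        return p_full_health

    # Hypothetical responses
    print(tto_utility(8, 10))   # 0.8
    print(sg_utility(0.9))      # 0.9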

A feasibility study sample of consenting young adolescent school children (n = 16) aged 11–13 years participated in a face-to-face interview in which they were asked to indicate the best and worst attribute levels from a series of health states defined by the CHU9D, presented one at a time. Participants were also randomly allocated to receive additional conventional TTO or SG questions and prompted to indicate how difficult they found them to complete.
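In a profile-case BWS task of this kind, each question presents a single multi-attribute health state and asks the respondent to pick one attribute level as best and one as worst. The sketch below shows one way such a task could be represented; the dimension names and level wording are invented for illustration and do not reproduce the CHU9D descriptive system.

    # One presented health state: a mapping of dimensions to the level shown
    # (dimension names and level wording invented, not the CHU9D descriptive system)
    health_state = {
        "worried": "a little bit worried",
        "sad": "not sad",
        "pain": "a lot of pain",
        "tired": "a bit tired",
        "schoolwork": "can do most schoolwork",
    }

    def record_response(state, best_dimension, worst_dimension):
        """Store which attribute level the respondent picked as best and which as worst."""
        assert best_dimension in state and worst_dimension in state
        assert best_dimension != worst_dimension
        return {"best": state[best_dimension], "worst": state[worst_dimension]}

    # Hypothetical respondent choice for this state
    print(record_response(health_state, best_dimension="sad", worst_dimension="pain"))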

The results indicate that participants were able to readily choose ‘best’ and ‘worst’ dimension levels in each of the CHU9D health states presented to them and provide justification for their choices. Furthermore, when presented with TTO or SG questions and prompted to make comparisons, participants found the BWS DCE task easier to understand and complete.
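A common first-pass summary of attribute-level BWS data is a best-minus-worst count for each attribute level: how often it was chosen as best minus how often it was chosen as worst. The sketch below illustrates this with invented responses in the format shown above; it is not necessarily the analysis used in this study.

    from collections import Counter

    # Hypothetical responses, each recording the attribute level chosen as best
    # and as worst within one presented health state (labels invented)
    responses = [
        {"best": "not sad", "worst": "a lot of pain"},
        {"best": "no pain", "worst": "very tired"},
        {"best": "not sad", "worst": "very tired"},
    ]

    best_counts = Counter(r["best"] for r in responses)
    worst_counts = Counter(r["worst"] for r in responses)
    levels = set(best_counts) | set(worst_counts)

    # Best-minus-worst score: higher scores indicate more highly valued levels
    bw_scores = {level: best_counts[level] - worst_counts[level] for level in levels}
    for level, score in sorted(bw_scores.items(), key=lambda kv: -kv[1]):
        print(level, score)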

The results of this feasibility study suggest that BWS DCE methods are potentially more readily understood and interpreted by vulnerable populations (e.g. young adolescents) than conventional valuation techniques. These findings lend support to the potential application of BWS DCE methods in large-scale health state valuation studies conducted directly with young adolescent population samples.

Keywords

Standard Gamble · Health State Valuation · Standard Gamble Technique · Incremental QALY Gain · Standard Gamble Task

Notes

Acknowledgements

The authors are particularly grateful to Mr Darren McLachlan, Westminster School, for his support and help with study administration; we are indebted to the staff, parents and children from Westminster School, Adelaide, who consented to and participated in this study. The authors would also like to thank Brita Pekarsky for her helpful comments on a previous version of this paper.

No sources of funding were used to conduct this study or prepare this manuscript. The authors have no conflicts of interest that are directly relevant to the content of this article.

Supplementary material

40258_2012_90100151_MOESM1_ESM.pdf (PDF, 203 kb): supplementary material.


Copyright information

© Adis Data Information BV 2011

Authors and Affiliations

  • Julie Ratcliffe (1) (corresponding author)
  • Leah Couzner (1)
  • Terry Flynn (2)
  • Michael Sawyer (3)
  • Katherine Stevens (4)
  • John Brazier (4)
  • Leonie Burgess (5)

  1. Centre for Clinical Change, Flinders University, Adelaide, Australia
  2. Centre for the Study of Choice, University of Technology Sydney, Sydney, Australia
  3. Discipline of Paediatrics, University of Adelaide, Adelaide, Australia
  4. Health Economics and Decision Science, ScHARR, University of Sheffield, Sheffield, UK
  5. Department of Mathematical Sciences, University of Technology Sydney, Sydney, Australia
