Applied Health Economics and Health Policy, Volume 9, Issue 4, pp 225–241

Ordering errors, objections and invariance in utility survey responses: a framework for understanding who, why and what to do

Practical Application

DOI: 10.2165/11590480-000000000-00000

Cite this article as:
Wittenberg, E. & Prosser, L.A. Appl Health Econ Health Policy (2011) 9: 225. doi:10.2165/11590480-000000000-00000


Background: Utilities quantify the perceived quality of life associated with a given health state. They are used to calculate quality-adjusted life-years (QALYs), the outcome measure in cost-utility analysis. Utilities are generally measured through surveys of individuals, and responses often contain apparent or unapparent errors that can bias the resulting values and the QALYs calculated from them.
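The QALY calculation referred to above weights time spent in each health state by that state's utility (conventionally 0 = dead, 1 = full health). A minimal sketch, with hypothetical utilities and durations chosen purely for illustration:

```python
# Illustrative QALY arithmetic (values are hypothetical, not from this study).
# A utility weight multiplies the years lived in that health state.

def qalys(states):
    """states: list of (utility, years) pairs; returns total QALYs."""
    return sum(utility * years for utility, years in states)

# e.g. 2 years at utility 0.8 followed by 3 years at utility 0.6
example = [(0.8, 2), (0.6, 3)]
print(round(qalys(example), 2))  # 3.4
```

A biased utility estimate propagates directly into the QALY total, which is why the error and objection responses examined in this study matter for cost-utility analysis.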

Objective: The aim of this study was to improve direct health utility elicitation methodology through the identification of the types of survey responses that indicate errors and objections, and the reasons underlying them.

Methods: We conducted a systematic review of the medical (PubMed), economics (EconLit) and psychology (PsycINFO) literature from 1975 through June 2010 for articles describing the types and frequency of errors and objections in directly elicited utility survey responses, and strategies to address these responses. Primary data were collected through an internet-based utility survey (standard gamble) of community members to identify responses that indicate error or objections. A qualitative telephone survey was conducted among a subset of respondents with these types of responses using an open-ended protocol to elicit rationales for them.

Results: A total of 11 papers specifically devoted to errors, objections and invariance in utility responses have been published since the mid-1990s. Error/objection responses can be broadly categorized into ordering errors (which include illogical and inconsistent responses) and objections/invariance (which include missing data, protest responses and refusals to trade time or risk in utility questions). Reported frequencies of respondents making ordering errors ranged from 5% to 100%, and up to 35% of respondents have been reported as objecting to the survey or task in some manner. Changes in the design, administration and analysis of surveys can address these potentially problematic responses.

Survey data (n = 398) showed that individuals who provided invariant responses (n = 26) reported the lowest level of difficulty with the survey and often identified as religious (23% of invariant responders found the survey difficult vs 63% of all responders, and 77% of invariant responders identified as religious vs 56% of the entire sample; p < 0.05 for both). Respondents who provided illogical responses (n = 50) were less likely to be college educated (56% of illogical responders vs 73% of the entire sample; p < 0.05) and less likely to be confident in their responses (62% vs 75% of the entire sample; p < 0.05). Qualitative interviews (n = 42) following the survey revealed that the majority of ordering errors resulted from confusion, lack of attention or difficulty in responding to the survey, while invariant responses were often considered and thoughtful reactions to the premise of valuing health using the standard gamble task.
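Proportion comparisons like those above are commonly assessed with a two-proportion z-test. A minimal sketch using the published figures (77% religious among the 26 invariant responders vs 56% of the full sample of 398); note this is only illustrative: the abstract does not state which test the authors used, and comparing a subgroup against the entire sample (which contains that subgroup) makes the independence assumption approximate.

```python
import math

def two_proportion_z(p1, n1, p2, n2):
    """Two-sided two-proportion z-test using the pooled normal approximation."""
    x1, x2 = p1 * n1, p2 * n2                      # implied counts
    p_pool = (x1 + x2) / (n1 + n2)                 # pooled proportion
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF (via erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# 77% of 26 invariant responders vs 56% of the full sample (n = 398)
z, p = two_proportion_z(0.77, 26, 0.56, 398)
print(f"z = {z:.2f}, p = {p:.3f}")  # z ≈ 2.10, p ≈ 0.036 (< 0.05)
```

Even this rough recomputation lands below the 0.05 threshold, consistent with the abstract's reported significance.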

Conclusions: Rationales for error/objection responses include difficulty articulating preferences or misunderstanding of a complex survey task, as well as thoughtful and considered protestations against the task itself. Mechanisms to correct unintentional errors may be useful, but cannot address intentional responses to elements of the measurement task. Identifying and analysing the prevalence of errors and objections in utility data sets is essential to understanding the accuracy and precision of utility estimates and the analyses that depend on them.

Supplementary material

40258_2012_9040225_MOESM1_ESM.pdf (PDF, 131 kb)

Copyright information

© Adis Data Information BV 2011

Authors and Affiliations

  1. Schneider Institutes for Health Policy, MS035, Heller School for Social Policy and Management, Brandeis University, Waltham, USA
  2. Child Health Evaluation and Research Unit, Division of General Pediatrics, University of Michigan Health System, Ann Arbor, USA
  3. Department of Health Management and Policy, University of Michigan School of Public Health, Ann Arbor, USA
