Abstract
Information from health-related discrete choice experiments (DCEs) can be used to inform decision making on the development, authorisation, reimbursement and marketing of drugs and devices, as well as treatments in clinical practice. The discrete choice experiment is a stated preference method based on random utility theory (RUT), which imposes strong assumptions on respondent choice behaviour. However, respondents may use choice processes that do not adhere to the normative rationality assumptions implied by RUT, applying simplifying decision rules that are more selective in the amount and type of information processed (i.e., simplifying heuristics). An overview of commonly detected simplifying heuristics in health-related DCEs is lacking, making it unclear how to identify and deal with these heuristics; more specifically, how researchers might alter DCE design and modelling strategies to accommodate the effects of heuristics. Therefore, the aim of this paper is three-fold: (1) to provide an overview of common simplifying heuristics in health-related DCEs, (2) to describe how choice task design and context, as well as target population selection, might impact the use of heuristics, and (3) to outline DCE design strategies that recognise the use of simplifying heuristics and modelling strategies that demonstrate the detection and impact of simplifying heuristics in DCE study outcomes.
Key Points

When completing DCEs, respondents may use choice processes that do not adhere to random utility theory assumptions by applying simplifying decision rules (i.e., simplifying heuristics).

Ignoring the possibility of heuristics in choice models will likely lead to biased parameter estimates, incorrect claims of preference heterogeneity and, thereby, bias in DCE outcome measures.

Researchers can (1) plan and design their study to mitigate the induced use of heuristics, (2) detect the use of heuristics by means of a priori qualitative work and a posteriori data analyses and (3) estimate the effects of the use of heuristics on their study outcomes.
1 Introduction
The discrete choice experiment (DCE) is globally recognised as a valuable instrument to measure preferences [1,2,3,4]. Health-related DCEs can be designed to inform decision making on the development, authorisation, reimbursement and marketing of drugs and devices, as well as treatments in clinical practice [5,6,7,8,9,10,11]. This stated preference (SP) method allows researchers to quantify the importance of certain characteristics (attributes) of particular alternatives, calculate willingness to accept risks or willingness to pay for benefits, and forecast potential participation rates/market shares. Discrete choice experiments are based on random utility theory (RUT) [12,13,14], which imposes strong assumptions on respondent choice behaviour. Respondents are expected to use complex and rational decision-making processes when completing choice tasks [4, 15]. It is assumed that respondents evaluate all alternatives and their corresponding attributes and attribute levels in each choice task and choose the alternative that maximises their utility. These behavioural premises directly or implicitly presume that respondents have limitless cognitive resources to gather and process information, to correctly understand and interpret the choice situation, and to make choices in line with several axioms used to describe normative rationality (i.e., monotonicity, continuity, completeness, transitivity) [4, 16, 17].
However, decades of research in behavioural economics and psychology have shown that the assumptions which underlie RUT are often violated, particularly when respondents are faced with complex choice problems [18, 19]. Specifically, respondents may not always maximise utility, but may instead use choice processes that do not adhere to the normative rationality assumptions, applying simplifying decision rules that are more selective in the amount and type of information processed (i.e., simplifying heuristics) [18, 20,21,22]. This loosely corresponds to the concept of 'bounded rationality', which states that respondents' cognitive capacity limits the degree to which they can maximise utility and that some choices require the use of simplifying strategies [23, 24]. In the face of limits on time, attention, resources, information and/or mental processing, respondents defer to simplifying heuristics, thereby partially or fully violating RUT assumptions [20].
Discrete choice experiments in health still largely ignore the possibility of simplifying heuristics and are generally designed and analysed under strict RUT assumptions. In health applications, DCEs are particularly vulnerable to the use of simplifying heuristics, as they generally include complex choices and (multiple) benefit and risk attributes. Health risks in particular are hard to evaluate [25, 26], since these often-small risks can have a significant downside for patients when they eventuate, and their evaluation is loaded with emotional respondent reactions and fraught decisional contexts. To support health policy development based on DCEs, the choice model should emulate what happens in practice [27, 28]. Hence, using choice models in health care under strict RUT assumptions likely compromises the realistic representation of decision makers' choices, thereby negatively impacting the external validity of DCE outcomes [27]. As a consequence, ignoring the possibility of heuristics in choice models, or even simply deleting 'irrational' responses from the DCE dataset, will likely lead to biased preference estimates [22, 29,30,31], incorrect claims about preference heterogeneity [32] and, ultimately, to bias in DCE outcome measures. Therefore, it is important that researchers be aware of factors that can induce heuristics during the design phase of a DCE and model their DCE data in a way that allows for heuristic choice processes. However, a search of the literature shows that an overview of commonly applied simplifying heuristics in (health-related) DCEs is lacking. It is therefore unclear how to identify and deal with these heuristics in practice; more specifically, how researchers might alter DCE design and modelling strategies to accommodate the effects of heuristics.
Hence, the aim of this paper is three-fold: (1) to provide an overview of common simplifying heuristics in health-related DCEs, (2) to describe how choice task design and context, as well as target population selection, might impact the use of heuristics, and (3) to specify DCE design strategies that recognise the use of simplifying heuristics and modelling strategies that demonstrate the use and impact of simplifying heuristics in DCE study outcomes. To this end, we conducted an extensive scoping review drawing on literature (i.e., published papers and books) from various fields, including health sciences, psychology, econometrics, management sciences, marketing research and transportation science.
2 Simplifying Heuristics in the Context of DCEs
Completing a DCE requires cognitive effort on the part of respondents. We should therefore consider what might occur when the cognitive demand imposed via the DCE exceeds respondents' limited cognitive capacity, since DCE respondents may employ methods or strategies that reduce the effort expended to choose an alternative among several substitutes [33]. These methods or strategies, which allow people to make choices more quickly and/or more frugally than classical utility-maximising strategies, are termed 'simplifying heuristics' [18, 20,21,22, 34]. The use of heuristics can be considered adaptive (i.e., responsive to decision context), or even rational, depending on the context in which decisions must be made. It has been shown that, even when ignoring information, certain heuristics allow respondents to arrive at an optimal decision without allocating full effort [20]. Yet, although heuristics help individuals solve complex decision problems in real life, their use may result in choices made in partial or total conflict with RUT assumptions. Hence, the focus of this manuscript is to describe a subset of the heuristics known and studied in psychology that is relevant to DCEs [18].
Since respondents' use of simplifying heuristics can be triggered by the degree to which DCEs are cognitively or emotionally demanding, it can be useful to minimise factors that impose such demands when planning a DCE study (Sect. 3). For this, it is important to be aware of DCE characteristics that are likely to induce heuristics. These characteristics can generally be classified into factors relating to task complexity and context effects [20]. Researchers should be aware that choice task simplifications can threaten the DCE's external validity by abstracting choice tasks in such a way that they no longer represent decision problems realistically [28]. Hence, DCE choice tasks should be presented in a manner compatible with the real-world level of complexity, when applicable. Additionally, researchers should in general expect that all respondents may be inclined to use simplifying heuristics in their decision making, regardless of their sociodemographic characteristics. However, certain social subgroups tend to experience choice tasks as cognitively demanding more often, or more quickly, and are more likely to resort to heuristic decision making [35,36,37] (Sect. 4). Notably, respondents' literacy and numeracy, as well as their age and education, can influence the perceived cognitive burden of choice tasks.
3 DCE Characteristics That May Induce the Use of Simplifying Heuristics
Task complexity increases as the hypotheticality and the number of choice tasks, attributes and alternatives increase [38, 39]. The concept of hypothetical bias is well known in DCEs; the absence of constraints (i.e., real-life consequences) likely impacts study outcomes [40, 41] and may lower respondents' engagement in the choice process. Without constraints, the importance of choosing the option with the highest utility may diminish, which in turn might make respondents inclined to decrease their cognitive effort and adopt heuristic strategies to make a choice [42, 43]. Additionally, the more hypothetical a DCE is for a given patient population, the more difficult it may be for respondents to anticipate a certain state or scenario and, consequently, to form a preference and make a choice (e.g., eliciting preferences of the general population for cancer care) [44, 45]. Lengthy choice experiments add to task complexity and can cause respondents to increasingly rely on simplifying heuristics as boredom and fatigue set in [46, 47]. In addition to the complexity of each individual choice task, the researcher should consider the cumulative cognitive burden over the course of the DCE [46]. Increasing the number of attributes increases the amount of information that needs to be processed; this in turn increases the risk of cognitive overload and can thereby induce the use of heuristics [20]. While a higher number of attributes has been found to increase respondents' confidence in their choices [20], and omitting relevant attributes may cause respondents to 'fill in the blanks' or infer missing information [48,49,50,51], the DCE design should include alternatives that are sufficiently characterised without cognitively overloading the respondent. Task complexity can further increase as the number of alternatives per choice task increases. Discrete choice experiments in health generally include two to three alternatives per choice task [4, 10], and it has been found that respondents are most likely to trade attributes across alternatives when presented with two alternatives per choice task (in unlabelled experiments) [20]. Balancing choice complexity while providing respondents with a sufficient range of alternatives, accurately capturing real-life choice contexts [52], enables respondents to make considered choices that capture trade-off behaviour [53].
Context effects pertain to information and values in the choice task that influence respondents' decision-making processes [20]. These include the similarity of alternatives, reference points, framing effects and unfamiliarity with the topic of choice. If respondents are faced with similar alternatives with approximately equal overall utility, choice complexity increases, and so may respondents' reliance on heuristics [20]. Since this is more likely to occur if the DCE entails restricted attribute level ranges [22, 54], attribute level ranges that are sufficiently wide and reflective of the population's preference range should be considered [55]. Respondents' perceptions of the adequacy of attribute level ranges may also be related to their unobserved reference points. Given that respondents may pick and set reference points based on any information or experience prior to the experiment [56], reference points may vary greatly between respondents and cause the use of heuristics [57]. Providing respondents with information on a reference point can keep the influence of individual reference points somewhat constant [58]. In addition to pre-set reference points, individuals can also apply individual risk perceptions to attributes, determining how likely they consider themselves to fall within a certain risk category, which might be driven by cognitive biases such as overconfidence, availability, or illusion of control [59]. Similarly, positive versus negative framing of attributes (e.g., survival vs mortality or effectiveness vs failure rate) can induce loss aversion despite conveying the same information, and has been shown to induce the use of heuristics as well as different study outcomes [49, 60]. Finally, empirical evidence shows that cognitive burden tends to increase in DCEs with unfamiliar choice contexts [38], attributes [61] and attribute levels [22]. This effect can be amplified in emotionally laden [62] and high-risk choice contexts [38], which provoke affective reactions and respondents' refusal to imagine and trade certain attributes [63]. In such situations, respondents might instinctively exclude certain alternatives by attaching negative emotions to them based on the choice context and not necessarily the attribute (levels) (the affect heuristic). This could happen, for instance, when evaluating the chance of hospitalisation after suffering severe side effects of treatment, which evokes the emotions and feelings associated with hospitalisation.
Overall, we recommend designing DCE choice tasks that are compatible with the real-world level of complexity (i.e., also mirroring the potential use of heuristics in real life) and avoiding promoting the use of heuristics solely as a result of choice task complexity. Researchers should identify and pre-test, upfront, elements in their DCE that might promote the use of heuristics, and plan a priori how to avoid this in their design (see Sect. 7.1) or how to accommodate the likely impact in their modelling strategy (see Sect. 7.2).
4 Respondent Characteristics That May Induce the Use of Simplifying Heuristics
Older age and lower educational attainment have been associated with, on average, lower health literacy skills (i.e., a person's ability to understand and apply health information in health-related decisions [64]) in The Netherlands [65]. Respondents with low education and low health literacy have indicated that they consider fewer DCE attributes in their decision making [49]. In addition, respondents' numeracy has been linked to their understanding of numeric attributes (e.g., risk attributes) [66, 67]. Previous studies suggest that high numeracy skills improve choice consistency [53, 68], decrease respondents' susceptibility to framing effects [69] and are associated with elaborate processing of numeric information in choice tasks, such as considering and combining numeric information [70]. In addition to conventional numeracy and health literacy, the related concept of risk literacy affects respondents' ability to evaluate and understand risk attributes [71]. For age, a bell-shaped distribution was found, indicating that, on average, younger and older respondents showed the most inconsistent choice behaviour in a DCE conducted in Scotland [48], although younger adults seem to resort to different types of heuristics compared with older adults [72]. In older adults, cognitive ageing increases the likelihood of applying simplifying strategies [73, 74]. In addition, the digital competencies of the elderly (i.e., e-literacy [75]) may also impact the perceived complexity of the DCE and, hence, induce heuristic strategies. Inconsistent choice behaviour in young adults, in contrast, may be driven by self-imposed time pressure, impatience and rushing through the DCE [76, 77]. The use of heuristics may be further exacerbated when recruiting participants through online panels that incentivise rapid survey completion (respondents 'speeding' through the survey) [77]. Finally, language barriers for non-native speakers can challenge respondents' understanding of the attributes, consequently increasing perceptions of complexity and leading to heuristic decision making.
For the reasons mentioned above, researchers should always (a) collect respondents' socio-demographic information (including literacy and numeracy) along with their choices, (b) be particularly alert if the aforementioned social groups are (over)represented in their sample, and (c) plan to accommodate the likely impact of the sample recruitment in their modelling strategy (see Sect. 7.2).
5 Characterising Heuristics Common in DCEs
Although heuristics can be characterised in many ways, in this paper we describe them by the amount of information they require respondents to use and the decision process that respondents apply [20, 33]. Some heuristics do not require any of the information presented in the DCE tasks, with respondents deferring to a random-choice strategy or making position-based decisions. Other heuristics enable participants to ignore part of the information presented in choice tasks (either alternatives and/or attributes). Finally, some heuristics may require processing all information presented but impact the decision process that respondents use to make a choice.
The data generation process respondents use to evaluate the information is described by five characteristics in this paper. First, heuristics differ in the way that respondents structure the evaluation of the information presented, which may be alternative-based, attribute-based or a combination. Second, although the usual empirical representation associated with RUT applications adopts a linear-in-parameters utility function, which implies compensatory evaluation, respondents may employ simplifying heuristics based on non- or semi-compensatory behaviour (e.g., excellent levels of some attributes are not [completely] compensated for by poor levels of other attributes [20]). Third, the use of certain heuristics may imply explicit exclusion of one or more attributes from the evaluation process. Fourth, the attribute weighting strategy employed by respondents may differ across heuristics. While in some cases respondents may use attribute weights to make a choice, with other heuristics this may not be the case: e.g., a simpler strategy such as equal weighting of attributes, ordinally prioritising the attributes from ‘most important’ to ‘least important’, or no strategy at all may be used. Fifth, the strategy applied to evaluate attribute (levels) differs across heuristics and can be primarily qualitative or quantitative in nature. Respondents might convert attribute levels into unobserved (latent) categories of levels, which might be qualitative or quantitative irrespective of whether the attribute levels are metric or categorical. For instance, when respondents evaluate a risk of 5%, and they only evaluate this as being lower than a threshold of 10%, the evaluation process is actually qualitative in nature, as no value is associated with the numbers per se.
6 Defining Heuristics Common in DCEs
Table 1 provides an overview of the heuristics most common in (health-related) DCEs. As noted earlier, this overview is a subset of the heuristics known and studied in psychology [18]. The exclusion of a specific heuristic in no way implies that it does not occur empirically, only that in our judgement it is less likely to do so in health applications. All heuristics are characterised and described using the specifications set out in Sect. 5.
Four heuristics have been identified in which respondents use the full information provided in the choice task but are assumed to employ decision processes that are not in line with RUT. First, reference point effects, also known in the health preference literature as anchoring and adjustment, refer to a choice process where attribute values in the first alternative encountered influence preferences in later choice tasks [78, 79]. In other words, the levels of some or all attributes of the first alternative respondents encounter serve as reference points for the evaluation of the attribute levels in all following alternatives. Although respondents evaluate numerical information under this heuristic, they merely compare whether a numeric level is under or over a reference point, making the evaluation strategy qualitative in nature. Second, ordinal recoding (also known in the literature as categorisation) refers to the process by which respondents construct categories and attach qualitative labels to metric attribute levels [8, 80, 81]. This means that respondents no longer interpret the actual numerical information of attributes (e.g., risk of side effects expressed as 5, 10 or 20%) provided in the choice task, but instead defer to a categorisation of the numerical values (e.g., low, medium or high risk of side effects). Third, when using the conjunctive or disjunctive heuristic, respondents establish a minimal acceptable cut-off point for all or key attributes, respectively [82]. If an alternative in a choice task falls below the cut-off (or above, depending on the cut-off definition) on any attribute (i.e., a qualitative evaluation strategy) or on particular key attributes, it is excluded as a possible option. Fourth, tallying implies that respondents compare all the attributes across the available alternatives without appreciating their relative importance [83]. Respondents simply select the alternative with the most 'best' and/or the fewest 'worst' attribute levels. This heuristic has several well-known variations: (1) the majority of confirming dimensions [20, 84], (2) equal weights [85,86,87] and (3) the frequency of good and bad features [88]. These variations of tallying differ slightly in the process respondents use to compare attribute levels across alternatives. For the frequency of good and bad features, respondents can decide to focus only on the number of good features within an alternative, but in contrast to the majority of confirming dimensions, they can also decide based only on the number of bad features, or on both. Both of these heuristics differ from the equal weights heuristic in that they are based on categorising attribute levels into 'good/best' and 'bad' but do not require respondents to place any value on these attribute levels.
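To make the tallying process concrete, the minimal sketch below (illustrative only; the good/bad coding of attribute levels and the random tie-break are our own assumptions, not taken from any cited study) selects the alternative with the highest net count of 'good' minus 'bad' features, deliberately ignoring attribute importance weights:

```python
import numpy as np

def tallying_choice(coded_task, rng=None):
    """Tallying: choose the alternative with the most 'good' minus 'bad'
    features, deliberately ignoring attribute importance weights.

    coded_task: (n_alternatives, n_attributes) with +1 for a 'good' level,
    -1 for a 'bad' level and 0 for a neutral one. Ties are broken at random.
    """
    rng = rng or np.random.default_rng()
    tallies = coded_task.sum(axis=1)               # net good-minus-bad count
    best = np.flatnonzero(tallies == tallies.max())
    return int(rng.choice(best))

task = np.array([[ 1, -1,  1,  0],                 # alternative A: net +1
                 [ 1,  1, -1, -1]])                # alternative B: net  0
print(tallying_choice(task))                       # 0 (alternative A)
```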
We consider seven heuristics that enable participants to ignore part of the information presented in a choice task. First, attribute non-attendance is a heuristic that results in respondents completely ignoring one or more attributes when evaluating the alternatives in a choice task [89,90,91], while making trade-offs across the other attributes between alternatives. Second, attribute level non-attendance is an elaborated specification of the attribute non-attendance heuristic, which suggests that respondents ignore one or more attributes whenever their level falls within a predefined range [92]. For instance, a respondent might disregard the risk of side effects attribute unless it shows the highest level of side effects. Third, whenever respondents employ choice set formation (i.e., alternative screening), they eliminate entire choice alternatives based on certain threshold levels of particular attributes (often benefit, cost and risk attributes) [29, 93, 94]. To illustrate, if the risk of side effects in one alternative is considered unacceptably high, the full alternative may not be considered at all within its choice set, and hence cannot be chosen. After such a screening stage, respondents trade across the other attributes between the surviving alternatives. Fourth, elimination-by-aspect refers to a choice process where respondents exclude an alternative based on predefined cut-off values of selected attributes [95], which in essence makes this a combined screening and selection process. This process starts with the most important attribute and continues for as many other attributes as needed to arrive at a single alternative. This differs from the choice set formation heuristic, where only a particular (set of) attribute(s) violating the respondent's cut-off values results in the full alternative being eliminated. Fifth, when respondents show lexicographic preferences (also referred to in the literature as take-the-best [96]), they select alternatives based entirely on their superiority on the one most important attribute [97, 98]. If the levels of this attribute overlap in the choice task, the second most important attribute is then evaluated (i.e., a qualitative evaluation of attributes on an ordinal scale). This iterative process is repeated until only one alternative remains. Sixth, dominant decision-making behaviour refers to the choice process where a respondent selects one attribute as the most important [99, 100]. Based on the level of only this one attribute, an alternative is selected. If the levels of this attribute are equal across alternatives, a random choice is made. This behaviour is nothing more than a truncated lexicographic decision rule employing only one attribute (which may vary from one respondent to another). Seventh, the final heuristic in this group is satisficing, under which aspiration levels are defined for the attributes, and the first alternative encountered that surpasses these aspiration levels is chosen [23]. For instance, if the left-most alternative shows satisfactory levels for all attributes, this alternative would be chosen without considering the other alternative(s) within the same choice task (although the other alternative(s) might have more favourable attribute levels).
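As an illustration of this group of heuristics, the sketch below implements the lexicographic ('take-the-best') rule; the attribute names, importance order and preference directions are hypothetical assumptions, not drawn from the cited studies:

```python
import numpy as np

def lexicographic_choice(levels, importance_order, higher_is_better):
    """Lexicographic ('take-the-best') rule applied to one choice task.

    levels: (n_alternatives, n_attributes) attribute levels.
    importance_order: attribute indices from most to least important.
    higher_is_better: dict mapping attribute index -> True if larger is better.
    """
    candidates = np.arange(levels.shape[0])
    for k in importance_order:
        col = levels[candidates, k]
        target = col.max() if higher_is_better[k] else col.min()
        candidates = candidates[col == target]   # keep alternatives best on k
        if len(candidates) == 1:                 # stop once one survives
            break
    return int(candidates[0])

# effectiveness (%), risk of side effects (%), waiting time (weeks) -- assumed
task = np.array([[80.0, 10.0, 4.0],
                 [80.0,  5.0, 8.0]])
# risk assumed most important: alternative 1 has the lower risk -> returns 1
print(lexicographic_choice(task, [1, 0, 2], {0: True, 1: False, 2: False}))
```

Note that the rule never trades the higher waiting time of alternative 1 against its lower risk; once the most important attribute discriminates, all remaining information is ignored.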
Finally, two heuristics are identified that do not require respondents to use any information from the choice tasks. First, task non-attendance refers to choice processes in which the attributes exert no influence on the respondent's choices. This behaviour may arise if respondents do not care about their responses or do not pay attention to the DCE instructions and choice tasks. Two variations exist within this heuristic: (1) complete ignorance, wherein respondents make choices randomly (i.e., all alternatives are equally likely) [101], and (2) flatlining, wherein respondents consistently choose the alternative placed in a certain position of the choice task (e.g., only the alternative on the left or right side of the presentation format), regardless of the attribute levels shown in this alternative [102, 103]. Second, when employing the reading order heuristic, also known as left-right bias, respondents display a preference for alternatives placed in a certain position, dependent on where people start reading the presentation format (e.g., the left side for respondents familiar with Latin-based writing and the right side for respondents familiar with Arabic-based writing), regardless of the attribute levels [102, 103]. However, in contrast to flatlining, respondents do not consistently choose this option; instead, they are merely more inclined to choose this particular alternative.
7 Design and Modelling Strategies for Dealing with Heuristics
7.1 Design Strategies to Reduce the Use and Impact of Heuristics
Designing a DCE requires researchers to balance statistical efficiency and the cognitive capacity of respondents, also referred to as response efficiency in the design literature [104, 105]. The experimental design needs to enable researchers to generate a set of attribute level parameters that, jointly, are as precise as possible, while at the same time avoiding overly complex choice tasks, which could induce the use of heuristics. Several strategies can be employed in the design of DCE studies that may mitigate or reduce the use of heuristics potentially induced by a DCE. However, all design strategies come at a price, as they generally result in less efficient experimental designs, thus requiring larger sample sizes [104, 105]. Additionally, strategies that might reduce the use of one particular heuristic might induce the use of others. Therefore, a carefully balanced approach needs to be determined on a case-by-case basis.
In general, DCEs should be designed according to good research practice guidelines [105, 106]. In doing so, researchers are advised to include proper educational materials (including comprehension question(s)) to educate respondents on the topic of their study as well as the attributes and attribute levels included. This explanatory section preferably includes one or several warm-up tasks in which respondents are taught how to complete a choice task [103]. Researchers should also provide realistic expectations of the amount of time it is likely to take to complete the survey, as well as consider the inclusion of a progress bar to indicate remaining time investment required throughout the survey.
Logical attribute ordering increases choice consistency and thereby likely reduces the use of heuristics [107,108,109]. This implies ordering attributes in alignment with the ways in which respondents logically process information. For example, one should group the mode, dose and frequency of administration attributes that describe a particular medical treatment. To avoid order bias arising from attribute order, the order of groups of attributes can be randomised across respondents.

Randomisation of the position of alternatives in a choice task across respondents, a test-retest or a dominance test might aid the identification, and potentially reduce the impact, of task non-attendance and left-right bias. A test-retest, which entails presenting respondents with an identical choice task twice in the survey (in which the order of alternatives is [preferably] shuffled), could be helpful [103]. In a dominance test, respondents are presented with a choice task in which one alternative clearly dominates the other [102, 103]. While such measures might assist in the identification, and potentially the assessment of the impact, of some heuristics, their ability to represent the validity of the data in general has been questioned [110].

Information display strategies can reduce task complexity, especially if DCEs include common-metric attributes (e.g., multiple risks). Comprehensive information display techniques can be applied to enhance respondents' understanding of attributes [25, 111]. Although the empirical literature on the comparative efficiency of different information display strategies (e.g., visual, numeric or literal) in health-related preference elicitation remains inconclusive, combining natural frequencies with images shows promising results [25, 112, 113]. For other attributes, the use of graphics might induce the use of heuristics, as previous studies show that graphics are associated with increased categorisation [80] and likely attribute non-attendance [114].

Partial profile design, also known in the health literature as level overlap, is a design strategy in which the levels of one or more attributes are identical across the alternatives in a given choice task [115,116,117,118]. This reduces task complexity, as fewer attributes need to be compared across the alternatives. Specifically, this strategy reduces heuristics related to non-attendance and dominant decision making [115,116,117].

Colour coding is a technique in which researchers use a separate colour to mark the categorisation of attribute levels [115, 116]. For example, different shades of the same colour can be used for the different levels of side effects if those are categorical (e.g., mild, moderate, severe). Highlighting is a variation in which researchers simply highlight the attributes with different levels across the presented alternatives [115, 116]. Both colour coding and highlighting reduce heuristics related to non-attendance and dominant decision making [115, 116]. When applying colour coding or highlighting schemes, researchers should be mindful not to induce the categorisation heuristic. On the other hand, it is possible that such techniques impact preference inferences by inducing attention shifts that are not preference aligned.
7.2 Modelling Strategies to Identify the Use and Impact of Heuristics on DCE Study Outcomes
The usual approach to modelling DCE data in health involves the application of a simple Multinomial Logit (MNL) model for data exploration, potentially with systematic preference heterogeneity through interactions of attributes with respondents' socio-demographic characteristics and/or stochastic error heterogeneity (also known as scale heterogeneity) in a scaled MNL. This is usually followed by (1) a Latent Class model, adding discrete mass stochastic preference heterogeneity to the simple MNL, perhaps with the addition of scale heterogeneity across classes (Scaled Latent Class analysis [119]); and/or (2) a Mixed Logit model, adding parametric stochastic preference heterogeneity to the simple MNL [4, 120]. All these models reflect the same underlying assumptions about rational behaviour as assumed under RUT. In econometric data analyses, this is referred to as the data generation process (dgp) for the rational (utility-maximising) decision maker.
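For reference, a minimal sketch of this utility-maximising dgp is given below: it simulates RUT-consistent choices (systematic utility plus Gumbel error) and recovers the preference parameters by maximising the MNL log-likelihood. In practice, dedicated estimation software would be used; the toy data layout, dimensions and seed are our own assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def mnl_neg_loglik(beta, X, y):
    """Negative log-likelihood of the standard MNL (utility-maximising dgp).

    X: (n_tasks, n_alternatives, n_attributes) attribute levels.
    y: (n_tasks,) index of the chosen alternative in each task.
    """
    v = X @ beta                                 # systematic utilities
    v -= v.max(axis=1, keepdims=True)            # stabilise the exponentials
    logp = v - np.log(np.exp(v).sum(axis=1, keepdims=True))
    return -logp[np.arange(len(y)), y].sum()

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2, 3))                 # 200 tasks, 2 alts, 3 attrs
true_beta = np.array([1.0, -0.5, 0.8])
utility = X @ true_beta + rng.gumbel(size=(200, 2))  # RUT: V + Gumbel error
y = utility.argmax(axis=1)

fit = minimize(mnl_neg_loglik, np.zeros(3), args=(X, y))
print(fit.x)                                     # close to true_beta, up to noise
```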
Some heuristics can be investigated by means of basic data exploration methods; for example, task non-attendance can be identified by simply counting the number of respondents who always select either the left, middle or right alternative [102]. The inclusion of an alternative-specific constant can help to identify a (tendency towards the) reading order heuristic, while the inclusion of a spline function and dummy-coded attributes can help to identify ordinal recoding (especially in cases where attribute levels have not been evenly distributed over the full attribute level range) [80]. A suitably extended MNL is also the simplest way to handle the satisficing heuristic and reference point effects, by collecting augmented data concerning attribute cut-offs or reference points at the respondent level [31]. After eliciting cut-offs or reference points, they can be included in a model that defines a penalised utility function, i.e., penalties are given by the degree of violation of self-imposed cut-offs/reference points, which is then the basis for maximisation-based choice behaviour. This approach has the advantage that it results in a standard MNL specification and can be estimated with any existing estimation software. It does, however, require researchers to assume that these quantities are exogenous and not changed by the information encountered in the DCE. Using such data exploration models enables researchers to identify the proportion of respondents who likely employed certain heuristics, and with that comes an opportunity to (1) run robustness analyses comparing outcomes using the full dataset versus a dataset without respondents who showed the use of heuristics, (2) discuss whether certain DCE-related or respondent characteristics might have induced the use of heuristics beyond what might be expected in real life, and (3) discuss the possible impact of the use of heuristics on the conclusions drawn from the DCE.
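Such a basic exploration requires no special software. The sketch below (the long-format layout and column names are illustrative assumptions) flags candidate flatliners by counting, per respondent, how many distinct positions were ever chosen:

```python
import pandas as pd

# Toy long-format DCE data; column names are illustrative assumptions
df = pd.DataFrame({
    "respondent": [1, 1, 1, 2, 2, 2],
    "task":       [1, 2, 3, 1, 2, 3],
    "chosen_pos": ["left", "left", "left", "left", "right", "left"],
})

# Candidate flatliners: respondents who always chose the same position
n_positions = df.groupby("respondent")["chosen_pos"].nunique()
flatliners = n_positions[n_positions == 1].index.tolist()
print(f"{len(flatliners)} of {df['respondent'].nunique()} respondents flatlined")
```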
The investigation of the presence and exact impact of other heuristics is more complex, for several reasons. First, in many cases such explorations will lead to more complex econometric specifications than the standard utility-maximising representations mentioned above. Second, the identification and modelling of the occurrence of most heuristics essentially requires the generalisation of the utility-maximising dgp to allow for their co-existence in predicting choice responses; such extensions are unlikely to be built into standard estimation software, and thus require researchers to invest in the development and testing of customised software. Third, some heuristics present especially daunting modelling challenges. For several heuristics, few straightforward paths have been identified to represent their dgp [121]. This is mainly the case for heuristics whose representation requires enumeration of the attribute consideration order, which is usually not known a priori and requires the development of an auxiliary model of attribute processing (e.g., elimination-by-aspect [122] and lexicographic preferences [123]). These models are good examples of what can bedevil the translation of decision process work in psychology into practical econometric specifications: what is assumed to be known in the elaboration of the decision rule (e.g., attribute order) may be beyond the possibility (or practicality) of being expressed in probabilistic terms. This has proven daunting and calls for future development efforts. Eliciting the order of attribute importance from respondents might provide an opportunity to allow inferences on the use of these heuristics. However, computational issues are likely to arise, as such an effort requires more complex econometric models at the individual level. Pending further developments, researchers have to rely on qualitative research to infer whether or not such heuristics are likely to have impacted DCE study outcomes.
Other heuristic-specific modelling approaches do exist. There are two important dimensions to these modelling approaches, which concern the assumption the researcher makes about the process by which heuristics are adopted: (1) adoption is both respondent- and task-specific, and hence can change within a person over the course of the DCE, or (2) adoption is respondent-specific, and hence constant for the respondent across all encountered tasks in the DCE. Thus, the decision process description embodied in the likelihood function needs to specify whether heuristics depend on the evaluated choice tasks or whether adoption remains constant over tasks (thereby differing only across respondents).
Attribute (level) non-attendance (ANA) has received relatively substantial attention in health-related DCEs [30, 91, 92, 124, 125]. When modelling this heuristic, the most commonly applied model assumes that the only variation to the utility-maximising dgp is that attributes may or may not be employed in calculating a compensatory utility measure, and that ANA does not vary across tasks [30, 91, 124, 126]. This model allows for the possibility/likelihood that an attribute is used in constructing the utility measure by including an attribute-specific 'attention' parameter. If this parameter is unity for all attributes, all attributes are fully used and the model reduces to the standard MNL; if, however, one or more of the attributes has an 'attention' parameter value below one, the impact of the attribute is scaled downwards relative to the full attendance case.
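A minimal sketch of this utility specification is given below; the logistic mapping of an unconstrained parameter into a (0, 1) attention weight is one possible parameterisation chosen for illustration, not necessarily the one used in the cited studies, and the attribute set is hypothetical:

```python
import numpy as np

def ana_utility(X, beta, gamma):
    """Systematic utilities under the simple ANA specification above.

    X: (n_alternatives, n_attributes); beta: full-attendance preferences;
    gamma: unconstrained parameters mapped into (0, 1) 'attention' weights.
    A weight near 1 recovers the standard MNL; weights below 1 scale the
    attribute's impact downwards relative to the full attendance case.
    """
    attention = 1.0 / (1.0 + np.exp(-gamma))     # logistic map into (0, 1)
    return X @ (attention * beta)

X = np.array([[80.0, 10.0],                      # effectiveness (%), risk (%)
              [60.0,  5.0]])                     # attribute set is illustrative
beta = np.array([0.05, -0.20])
print(ana_utility(X, beta, gamma=np.array([10.0, -10.0])))  # risk nearly ignored
```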
Choice set formation has recently received attention in health DCEs [29]. In modelling this heuristic, researchers should account for the fact that all alternatives excluded due to screening necessarily have choice probabilities of exactly zero (so-called structural zeros) [93, 94, 127]. In other words, if a specific set of alternatives is presented to the respondent in the choice task (call this set D), but the respondent eliminates one or more of them (e.g., because the risks are not acceptable), the resulting reduced set of alternatives evaluated (call this C) has fewer alternatives than set D. The choice set formation heuristic dgp is developed recognising that, in general, the screened set C is unobserved, and thus latent.
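In formula form (notation ours, in the spirit of this two-stage literature), the unconditional choice probability marginalises the conditional MNL probability over all feasible latent screened sets:

```latex
% Unconditional choice probability with latent screened set C within offered set D
P(i \mid D) = \sum_{\substack{C \subseteq D \\ i \in C}} Q(C \mid D)\, P(i \mid C),
\qquad
P(i \mid C) = \frac{\exp(V_i)}{\sum_{j \in C} \exp(V_j)},
```

where \(Q(C \mid D)\) is the probability that the respondent screens the offered set \(D\) down to \(C\); any alternative excluded from every retained \(C\) receives a structural-zero choice probability.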
Modelling of the conjunctive or disjunctive heuristics has been proposed using a two-stage decision process, akin to the choice set formation model, wherein a subset of alternatives is first selected from the choice task, and an alternative is then selected from that reduced set [128]. The alternatives included in the choice set are identified with an 'indicator function'; if an alternative satisfies the conjunctive rule, this function equals one, otherwise it equals zero [128]. The conjunctive decision rule dictates that an alternative is acceptable only if the pre-set threshold (i.e., the smallest level of the attribute a respondent requires to include the alternative for further consideration) is met for all attributes [128]. Since attribute thresholds are latent, the modelling procedure further mimics that of the choice set formation heuristic. This process is the deterministic way of describing these heuristics; the probabilistic (and superior) procedure would involve formulating choice set probabilities as a function of the heuristics.
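A compact way to write the conjunctive screening stage (notation ours) is as a product of per-attribute threshold indicators:

```latex
% Alternative j survives conjunctive screening only if every attribute
% level x_{jk} meets its (latent) threshold \tau_k
I_j = \prod_{k=1}^{K} \mathbf{1}\!\left[\, x_{jk} \succeq \tau_k \,\right],
```

with \(I_j = 1\) admitting alternative \(j\) to the reduced set; under the disjunctive rule, only the key attributes would need to meet their thresholds, so the product would be taken over that subset (or replaced by a maximum across key attributes).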
Latent Class analysis allows different dgps to co-exist, permitting the representation and investigation of several heuristics (for an example of a health application, see [129]; recently, this has also been tested in a Mixed Logit specification [130]). Note that the only extra requirement for the estimation software is the ability to fix preference parameters to given constants within a class. Several examples using this approach are discussed next. First, to investigate task non-attendance (complete ignorance), a Latent Class model with two classes can be constructed, where Class 1 represents the utility-maximising dgp and Class 2 predicts random choice by restricting all preferences to zero. Second, tallying can be identified by a Latent Class model with two classes, where Class 1 represents the utility-maximising dgp and Class 2 predicts tallying by restricting preferences to a certain constant, with additions or subtractions depending upon the 'good' or 'bad' categorisations of the attribute. Note that this approach assumes that the attribute values have been appropriately scaled to be in the same range. Third, this general approach can be used to represent dominant decision-making behaviour: again, a 2-class Latent Class model can be used, where Class 1 represents the utility-maximising dgp and Class 2 predicts dominant decision-making behaviour by restricting all attribute parameters to zero except for the attribute investigated. Selecting attributes to be investigated for dominant decision-making behaviour can be done based on evidence from the literature or the outcomes of the qualitative work preceding the attribute and level selection of the DCE.
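The first example can be sketched as a likelihood function as follows (a minimal, illustrative sketch under the respondent-specific adoption assumption; real estimation would add numerical safeguards and typically use dedicated latent class software):

```python
import numpy as np

def two_class_loglik(params, X, y):
    """Log-likelihood of a 2-class model: Class 1 is utility-maximising
    (standard MNL), Class 2 is task non-attendance (complete ignorance),
    i.e. random choice obtained by fixing all preferences to zero.

    X: (n_resp, n_tasks, n_alts, n_attrs); y: (n_resp, n_tasks) choices.
    params: attribute preferences followed by the Class-1 membership logit.
    """
    beta, pi1 = params[:-1], 1.0 / (1.0 + np.exp(-params[-1]))
    v = X @ beta
    v -= v.max(axis=-1, keepdims=True)                     # stabilise exp()
    p = np.exp(v) / np.exp(v).sum(axis=-1, keepdims=True)  # MNL probabilities
    chosen = np.take_along_axis(p, y[..., None], axis=-1)[..., 0]
    lik_class1 = chosen.prod(axis=1)                # per-respondent sequence
    lik_class2 = (1.0 / X.shape[2]) ** X.shape[1]   # betas fixed at 0: random
    return np.log(pi1 * lik_class1 + (1 - pi1) * lik_class2).sum()
```

Because Class 2 has all preference parameters fixed at zero, its sequence likelihood collapses to a constant, which is exactly the parameter restriction described in the text.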
This approach of defining heuristic-based latent classes is quite flexible and can be used to simultaneously represent multiple heuristics in a given dataset. Instead of modelling one heuristic at a time alongside the utility-maximising class, it is possible to generalise the classes in a single model to represent multiple heuristics at once (e.g., utility maximising, plus reading order, plus dominant decision-making behaviour). Essentially, if the original MNL model can represent the individual heuristic via a restriction on its parameters, a latent class can be defined to capture that hypothesis. The proliferation of too many classes may be problematic, however: if one or more of the sought heuristics does not actually exist in the data, this will lead to numerical instability during model estimation, or even non-convergence. Removal of the offending class should stabilise the estimation process. A focus on the heuristics most likely in the context should guide researchers' specification of classes. This approach, while quite useful, is not a 'one size fits all' solution, since it is not always possible to represent a heuristic as a restricted/constrained form of the MNL model.
Adding preference heterogeneity continues to be an important consideration when modelling DCE data that might contain heuristic-driven choices, but modelling priorities for the analyst should be heuristics first, and preference heterogeneity second. This prioritisation will likely result in strong impacts on preference heterogeneity inferences. This is not to say that preference heterogeneity will somehow disappear, but rather that the impact of heuristics on choices might in part already explain differences that would otherwise have been attributed to preference heterogeneity. Future research efforts are needed to understand the general impact of this conditioning of preference heterogeneity by heuristics modelling.
8 Concluding Remarks
For DCEs to generate useful and valid outcomes, they should mimic real-life choice situations as much as possible [2, 106]. Making choices about health (care) is generally considered complex, both in real life and in DCEs. Therefore, preference elicitation techniques, including DCEs, are susceptible to the use of heuristics. This is likely not problematic unless researchers design their DCE in a way that induces heuristics in measurement that differ from the heuristics used in real-life choices, and/or ignore the use of heuristics in modelling procedures and assume choices were made under RUT assumptions. To date, the potential use and impact of heuristics in applied health-related DCEs is generally only alluded to in papers when discussing results and potential limitations. However, this manuscript shows that researchers are not as helpless as they might think regarding the impact heuristics might have on their study and study outcomes. Researchers can (1) plan and design their study to reduce the use of task-induced heuristics, (2) detect the use of heuristics by means of a priori qualitative work and a posteriori data analyses, and (3) estimate the effects of the use of heuristics on their study outcomes. Future DCE studies should report on the likelihood of heuristics being employed in their experiment and the potential impact of this on their study outcomes. Additionally, methodological research is needed to (1) generate further insights on how best to design a DCE and train respondents to prevent increased use of heuristics beyond their use in real-life decision making, and (2) define appropriate and accessible modelling strategies to estimate the impact of the use of heuristics on study outcomes. In conclusion, it seems that a lot (more) can be done regarding heuristics in DCEs; researchers should never take shortcuts in modelling choice behaviour.
Data Availability
Not applicable to this article as no datasets were generated or analyzed during the current study.
References
Wittink DR, Cattin P. Commercial use of conjoint analysis: an update. J Mark. 1989;53:91–6. https://doi.org/10.2307/1251345.
Lancsar E, Louviere J. Conducting discrete choice experiments to inform healthcare decision making. Pharmacoeconomics. 2008;26:661–77. https://doi.org/10.2165/00019053-200826080-00004.
Luce RD, Tukey JW. Simultaneous conjoint measurement: a new type of fundamental measurement. J Math Psychol. 1964;1:1–27. https://doi.org/10.1016/0022-2496(64)90015-X.
Hensher D, Rose JM, Greene WH. Applied choice analysis. 2nd ed. Cambridge: Cambridge University Press; 2015.
Bouvy JC, Cowie L, Lovett R, Morrison D, Livingstone H, Crabb N. Use of patient preference studies in HTA decision making: a NICE perspective. Patient. 2020;13:145–9. https://doi.org/10.1007/s40271-019-00408-4.
Cowie L, Bouvy JC. Measuring patient preferences: an exploratory study to determine how patient preferences data could be used in health technology assessment (HTA). Project report. Edinburgh: MyelomaUK; 2019.
Ho M, Saha A, McCleary KK, Levitan B, Christopher S, Zandlo K, et al. A framework for incorporating patient preferences regarding benefits and risks into regulatory assessment of Medical Technologies. Value Health. 2016;19:746–50.
Ho MP, Gonzalez JM, Lerner HP, Neuland CY, Whang JM, McMurry-Heath M, et al. Incorporating patient-preference evidence into regulatory decision making. Surg Endosc. 2015;29:2984–93.
Chow RD, Wankhedkar KP, Mete M. Patients’ preferences for selection of endpoints in cardiovascular clinical trials. J Community Hosp Intern Med Perspect. 2014;4:22643. https://doi.org/10.3402/jchimp.v4.22643.
Soekhai V, de Bekker-Grob EW, Ellis AR, Vass CM. Discrete choice experiments in health economics: past, present and future. Pharmacoeconomics. 2019;37:201–26. https://doi.org/10.1007/s40273-018-0734-2.
de Bekker-Grob EW, Berlin C, Levitan B, Raza K, Christoforidi K, Cleemput I, et al. Giving patients’ preferences a voice in medical treatment life cycle: the PREFER Public-Private Project. Patient. 2017;10:263–6. https://doi.org/10.1007/s40271-017-0222-3.
McFadden D. Conditional logit analysis of qualitative choice behavior. In: Zarembka P, editor. Frontiers in econometrics. New York: Academic Press; 1974. p. 105–42.
McFadden D. The choice theory approach to market research. Mark Sci. 1986;5:275–97.
Thurstone LL. The method of paired comparisons for social values. J Abnorm Soc Psychol. 1927;21:384–400. https://doi.org/10.1037/h0065439.
Lloyd AJ. Threats to the estimation of benefit: are preference elicitation methods accurate? Health Econ. 2003;12:393–402. https://doi.org/10.1002/hec.772.
Ryan M, Gerard K, Amaya-Amaya M. Using discrete choice experiments to value health and health care. Dordrecht: Springer; 2008.
Lancsar E, Louviere J. Deleting ‘irrational’ responses from discrete choice experiments: a case of investigating or imposing preferences? Health Econ. 2006;15:797–811. https://doi.org/10.1002/HEC.1104.
Kahneman D. Thinking fast and slow. New York: Farrar, Straus & Giroux; 2011.
Starmer CF. Developments in non-expected utility theory: the hunt for a descriptive theory of choice under risk. J Econ Lit. 2000;38:332–82.
Payne JW, Bettman JR, Johnson EJ. The adaptive decision maker. Cambridge: Cambridge University Press; 1993.
Hensher DA. Attribute processing, heuristics and preference construction in choice analysis. In: Hess S, Daly A, editors. Choice modelling: the state-of-the-art and the state-of-practice. Bingley: Emerald Press; 2010. p. 35–69.
Hensher DA. How do respondents process stated choice experiments? Attribute consideration under varying information load. J Appl Econ. 2006;21:861–78. https://doi.org/10.1002/JAE.877.
Simon HA. A behavioral model of rational choice. Q J Econ. 1955;69:99–118.
Simon HA. Rationality as process and as product of thought. In: Bell DE, Raiffa H, Tversky A, editors. Decision making. Cambridge: Cambridge University Press; 1988. p. 58–77.
Lipkus IM. Numeric, verbal, and visual formats of conveying health risks: suggested best practices and future recommendations. Med Decis Mak. 2007;27:696–713. https://doi.org/10.1177/0272989X07307271.
Waters EA, Weinstein ND, Colditz GA, Emmons K. Formats for improving risk communication in medical tradeoff decisions. J Health Commun. 2006;11:167–82. https://doi.org/10.1080/10810730500526695.
de Bekker-Grob EW, Donkers B, Bliemer M, Coast J, Swait J. Towards accurate prediction of healthcare choices: the INTERSOCIAL project. Patient. 2022;15:509–12. https://doi.org/10.1007/s40271-022-00593-9.
Lancsar E, Swait J. Reconceptualising the external validity of discrete choice experiments. Pharmacoeconomics. 2014;32:951–65. https://doi.org/10.1007/s40273-014-0181-7.
Veldwijk J, Swait JD. The role of attribute screening and choice set formation in health discrete choice experiments: modeling the impact of benefit and risk attributes. Value Health. 2022;25:1416–27.
Lagarde M. Investigating attribute non-attendance and its consequences in choice experiments with latent class models. Health Econ. 2013;22:554–67. https://doi.org/10.1002/hec.2824.
Swait J. A non-compensatory choice model incorporating attribute cutoffs. Transp Res Part B Methodol. 2001;35:903–28. https://doi.org/10.1016/S0191-2615(00)00030-8.
Swait J, Popa M, Wang L. Capturing context-sensitive information usage in choice models via mixtures of information archetypes. J Mark Res. 2016;53:646–64. https://doi.org/10.1509/jmr.12.0518.
Shah AK, Oppenheimer DM. Heuristics made easy: an effort-reduction framework. Psychol Bull. 2008;134:207–22. https://doi.org/10.1037/0033-2909.134.2.207.
Gigerenzer G, Gaissmaier W. Heuristic decision making. Annu Rev Psychol. 2011;62:451–82. https://doi.org/10.1146/annurev-psych-120709-145346.
West RF, Toplak ME, Stanovich KE. Heuristics and biases as measures of critical thinking: associations with cognitive ability and thinking dispositions. J Educ Psychol. 2008;100:930–41. https://doi.org/10.1037/a0012842.
Kokis JV, Macpherson R, Toplak ME, West RF, Stanovich KE. Heuristic and analytic processing: age trends and associations with cognitive ability and cognitive styles. J Exp Child Psychol. 2002;83:26–52. https://doi.org/10.1016/S0022-0965(02)00121-2.
Stanovich KE, West RF. Individual differences in reasoning: implications for the rationality debate? Behav Brain Sci. 2000;23:645–65. https://doi.org/10.1017/S0140525X00003435.
Bessette DL, Wilson RS, Arvai JL. Do people disagree with themselves? Exploring the internal consistency of complex, unfamiliar, and risky decisions. J Risk Res. 2021;24:593–605. https://doi.org/10.1080/13669877.2019.1569107.
Rieskamp J, Hoffrage U. When do people use simple heuristics, and how can we tell? In: Gigerenzer G, Todd PM, the ABC Research Group, editors. Simple heuristics that make us smart. Oxford: Oxford University Press; 1999. p. 141–67.
Loomis J. What's to know about hypothetical bias in stated preference valuation studies? J Econ Surv. 2011;25:363–70.
Özdemir S, Johnson FR, Hauber AB. Hypothetical bias, cheap talk, and stated willingness to pay for health care. J Health Econ. 2009;28:894–901. https://doi.org/10.1016/j.jhealeco.2009.04.004.
Kang MJ, Rangel A, Camus M, Camerer CF. Hypothetical and real choice differentially activate common valuation areas. J Neurosci. 2011;31:461–8. https://doi.org/10.1523/JNEUROSCI.1583-10.2011.
Quaife M, Terris-Prestholt F, di Tanna GL, Vickerman P. How well do discrete choice experiments predict health choices? A systematic review and meta-analysis of external validity. Eur J Health Econ. 2018;19:1053–66. https://doi.org/10.1007/s10198-018-0954-6.
Pearce A, Harrison M, Watson V, Street DJ, Howard K, Bansback N, et al. Respondent understanding in discrete choice experiments: a scoping review. Patient. 2020;14:17–53. https://doi.org/10.1007/S40271-020-00467-Y.
Hol L, de Bekker-Grob EW, van Dam L, Donkers B, Kuipers EJ, Habbema JDF, et al. Preferences for colorectal cancer screening strategies: a discrete choice experiment. Br J Cancer. 2010;102:972–80. https://doi.org/10.1038/sj.bjc.6605566.
Swait J, Adamowicz W. The influence of task complexity on consumer choice: a latent class model of decision strategy switching. J Consum Res. 2001;28:135–48. https://doi.org/10.1086/321952.
Bech M, Kjaer T, Lauridsen J. Does the number of choice sets matter? Results from a web survey applying a discrete choice experiment. Health Econ. 2011;20:273–86. https://doi.org/10.1002/hec.1587.
San Miguel F, Ryan M, Amaya-Amaya M. “Irrational” stated preferences: a quantitative and qualitative investigation. Health Econ. 2005;14:307–22. https://doi.org/10.1002/hec.912.
Veldwijk J, Determann D, Lambooij MS, van Til JA, Korfage IJ, de Bekker-Grob EW, et al. Exploring how individuals complete the choice tasks in a discrete choice experiment: an interview study. BMC Med Res Methodol. 2016. https://doi.org/10.1186/s12874-016-0140-4.
Johnson RD. Making judgements when information is missing: inferences, biases, and framing effects. Acta Psychol. 1987;66:69–82. https://doi.org/10.1016/0001-6918(87)90018-7.
Johnson RD, Levin IP. More than meets the eye: the effect of missing information on purchase evaluations. J Consum Res. 1985;12:169. https://doi.org/10.1086/208505.
Determann D, Gyrd-Hansen D, de Wit GA, de Bekker-Grob EW, Steyerberg EW, Lambooij MS, et al. Designing unforced choice experiments to inform health care decision making: implications of using opt-out, neither, or status quo alternatives in discrete choice experiments. Med Decis Mak. 2019;39:681–92. https://doi.org/10.1177/0272989X19862275/FORMAT/EPUB.
de Bekker-Grob EW, Swait JD, Kassahun HT, Bliemer MCJ, Jonker MF, Veldwijk J, et al. Are healthcare choices predictable? The impact of discrete choice experiment designs and models. Value Health. 2019;22:1050–62. https://doi.org/10.1016/j.jval.2019.04.1924.
Cairns J, van der Pol M. Repeated follow-up as a method for reducing non-trading behaviour in discrete choice experiments. Soc Sci Med. 2004;58:2211–8. https://doi.org/10.1016/j.socscimed.2003.08.021.
Ohler T, Le A, Louviere J, Swait J. Attribute range effects in binary response tasks. Mark Lett. 2000;11:249–60. https://doi.org/10.1023/A:1008139226934.
Lipman SA, Brouwer WBF, Attema AE. Living up to expectations: experimental tests of subjective life expectancy as reference point in time trade-off and standard gamble. J Health Econ. 2020. https://doi.org/10.1016/j.jhealeco.2020.102318.
Mattmann M. Testing choice theory using discrete choice experiments in Swiss energy policy. Amsterdam: Vrije Universiteit; 2017.
Lipman SA, Brouwer WBF, Attema AE. A QALY loss is a QALY loss is a QALY loss: a note on independence of loss aversion from health states. Eur J Health Econ. 2019;20:419–26. https://doi.org/10.1007/s10198-018-1008-9.
Simon M, Houghton SM, Aquino K. Cognitive biases, risk perception, and venture formation. J Bus Ventur. 2000;15:113–34. https://doi.org/10.1016/S0883-9026(98)00003-2.
Smith IP, DiSantostefano RL, de Bekker-Grob EW, Levitan B, Berlin C, Veldwijk J, et al. Methodological priorities for patient preferences research: stakeholder input to the PREFER Public-Private Project. Patient. 2021;14:449–53. https://doi.org/10.1007/s40271-021-00502-6.
Tervonen T, Gelhorn H, Sri Bhashyam S, Poon JL, Gries KS, Rentz A, et al. MCDA swing weighting and discrete choice experiments for elicitation of patient benefit-risk preferences: a critical assessment. Pharmacoepidemiol Drug Saf. 2017;26:1483–91. https://doi.org/10.1002/pds.4255.
Luce MF, Payne JW, Bettman JR. Emotional trade-off difficulty and choice. J Mark Res. 1999;36:143. https://doi.org/10.2307/3152089.
Slovic P, Finucane ML, Peters E, MacGregor DG. The affect heuristic. Eur J Oper Res. 2007;177:1333–52. https://doi.org/10.1016/j.ejor.2005.04.006.
Sørensen K, van den Broucke S, Fullam J, Doyle G, Pelikan J, Slonska Z, et al. Health literacy and public health: a systematic review and integration of definitions and models. BMC Public Health. 2012;12:1–13. https://doi.org/10.1186/1471-2458-12-80.
van der Heide I, Uiters E, Sørensen K, Röthlin F, Pelikan J, Rademakers J, et al. Health literacy in Europe: the development and validation of health literacy prediction models. Eur J Public Health. 2016;26:906–11. https://doi.org/10.1093/eurpub/ckw078.
Bodemer N, Meder B, Gigerenzer G. Communicating relative risk changes with baseline risk. Med Decis Mak. 2014;34:615–26. https://doi.org/10.1177/0272989X14526305.
Garcia-Retamero R, Sobkow A, Petrova D, Garrido D, Traczyk J. Numeracy and risk literacy: what have we learned so far? Span J Psychol. 2019;22:1–11. https://doi.org/10.1017/SJP.2019.16.
Ashby NJS. Numeracy predicts preference consistency: deliberative search heuristics increase choice consistency for choices from description and experience. Judgm Decis Mak. 2017;12:128–39.
Peters E, Levin IP. Dissecting the risky-choice framing effect: numeracy as an individual-difference factor in weighting risky and riskless options. Judgm Decis Mak. 2008;3:435–48.
Cokely ET, Kelley CM. Cognitive abilities and superior decision making under risk: a protocol analysis and process model evaluation. Judgm Decis Mak. 2009;4:20–33.
Cokely ET, Feltz A, Ghazal S, Allan JN, Petrova D, Garcia-Retamero R. Skilled decision theory: from intelligence to numeracy and expertise. In: Ericsson KA, Hoffman RR, Kozbelt A, Williams AM, editors. The Cambridge handbook of expertise and expert performance. 2nd ed. Cambridge: Cambridge University Press; 2018. p. 476–505.
Besedeš T, Deck C, Sarangi S, Shor M. Age effects and heuristics in decision making. Rev Econ Stat. 2012;94:580–95. https://doi.org/10.1162/rest_a_00174.
Mata R, Schooler LJ, Rieskamp J. The aging decision maker: cognitive aging and the adaptive selection of decision strategies. Psychol Aging. 2007;22:796–810. https://doi.org/10.1037/0882-7974.22.4.796.
Pachur T, Mata R. Cognitive aging and the adaptive use of recognition in decision making. Psychol Aging. 2009. https://doi.org/10.1037/a0017211.
Morris A, Brading H. E-literacy and the grey digital divide: a review with recommendations. J Inf Lit. 2007;1:13. https://doi.org/10.11645/1.3.14.
Zhang C, Conrad FG. Speeding in web surveys: the tendency to answer very fast and its association with straightlining. Surv Res Methods. 2014;8:127–35. https://doi.org/10.18148/srm/2014.v8i2.5453.
Conrad FG, Tourangeau R, Couper MP, Zhang C. Reducing speeding in web surveys by providing immediate feedback. Surv Res Methods. 2017;11:45–61. https://doi.org/10.18148/srm/2017.v11i1.6304.
Kahneman D. Reference points, anchors, norms, and mixed feelings. Organ Behav Hum Decis Process. 1992;51:296–312. https://doi.org/10.1016/0749-5978(92)90015-Y.
Tversky A, Kahneman D. Judgment under uncertainty: heuristics and biases. Science. 1974;185:1124–31. https://doi.org/10.1126/science.185.4157.1124.
Veldwijk J, Lambooij MS, van Til JA, Groothuis-Oudshoorn CGM, Smit HA, de Wit GA. Words or graphics to present a discrete choice experiment: does it matter? Patient Educ Couns. 2015;98:1376–84. https://doi.org/10.1016/j.pec.2015.06.002.
Johnson FR, Mohamed AF, Özdemir S, Marshall DA, Phillips KA. How does cost matter in health-care discrete-choice experiments? Health Econ. 2011;20:323–30. https://doi.org/10.1002/hec.1591.
Einhorn HJ. The use of nonlinear, noncompensatory models in decision making. Psychol Bull. 1970;73:221–30. https://doi.org/10.1037/h0028695.
Gigerenzer G, Goldstein DG. Reasoning the fast and frugal way: models of bounded rationality. Psychol Rev. 1996;103:650–69. https://doi.org/10.1037/0033-295X.103.4.650.
Russo JE, Dosher BA. Strategies for multiattribute binary choice. J Exp Psychol Learn Mem Cogn. 1983;9:676–96. https://doi.org/10.1037/0278-7393.9.4.676.
Einhorn HJ, Hogarth RM. Unit weighting schemes for decision making. Organ Behav Hum Perform. 1975;13:171–92. https://doi.org/10.1016/0030-5073(75)90044-6.
Dawes RM. The robust beauty of improper linear models in decision making. Am Psychol. 1979;34:571–82. https://doi.org/10.1037/0003-066X.34.7.571.
Dawes RM, Corrigan B. Linear models in decision making. Psychol Bull. 1974;81:95–106. https://doi.org/10.1037/h0037613.
Alba JW, Marmorstein H. The effects of frequency knowledge on consumer decision making. J Consum Res. 1987;14:14. https://doi.org/10.1086/209089.
Hensher DA, Rose J, Greene WH. The implications on willingness to pay of respondents ignoring specific attributes. Transportation (Amst). 2005;32:203–22. https://doi.org/10.1007/s11116-004-7613-8.
Campbell D, Hutchinson WG, Scarpa R. Incorporating discontinuous preferences into the analysis of discrete choice experiments. Environ Resour Econ. 2008;41:401–17. https://doi.org/10.1007/s10640-008-9198-8.
Hensher DA, Greene WH. Non-attendance and dual processing of common-metric attributes in choice analysis: a latent class specification. Empir Econ. 2010;39:413–26. https://doi.org/10.1007/s00181-009-0310-x.
Erdem S, Campbell D, Hole AR. Accounting for attribute-level non-attendance in a health choice experiment: does it matter? Health Econ. 2015;24:773–89. https://doi.org/10.1002/hec.3059.
Swait J, Ben-Akiva M. Incorporating random constraints in discrete models of choice set generation. Transp Res Part B Methodol. 1987;21:91–102. https://doi.org/10.1016/0191-2615(87)90009-9.
Swait J. Choice set generation within the generalized extreme value family of discrete choice models. Transp Res Part B Methodol. 2001;35:643–66. https://doi.org/10.1016/S0191-2615(00)00029-1.
Tversky A. Elimination by aspects: a theory of choice. Psychol Rev. 1972;79:281–99.
Gigerenzer G, Goldstein D. Betting on one good reason: the take the best heuristic. In: Gigerenzer G, Todd PM, the ABC Research Group, editors. Simple heuristics that make us smart. Oxford: Oxford University Press; 1999. p. 75–95.
Fishburn PC. Axioms for lexicographic preferences. Rev Econ Stud. 1975;42:415. https://doi.org/10.2307/2296854.
Tversky A. Intransitivity of preferences. Psychol Rev. 1969;76:31–48. https://doi.org/10.1037/h0026750.
Scott A. Identifying and analysing dominant preferences in discrete choice experiments: an application in health care. J Econ Psychol. 2002;23:383–98. https://doi.org/10.1016/S0167-4870(02)00082-X.
Lancaster K. Operationally relevant characteristics in the theory of consumer behavior. In: Peston M, Corry B, editors. Essays in honour of Lord Robbins. London: Weidenfeld and Nicolson; 1972. p. 43–62.
Cohen M, Jaffray J-Y. Rational behavior under complete ignorance. Econometrica. 1980;48:1281. https://doi.org/10.2307/1912184.
Johnson FR, Yang J-C, Reed SD. The internal validity of discrete choice experiment data: a testing tool for quantitative assessments. Value Health. 2019;22:157–60. https://doi.org/10.1016/j.jval.2018.07.876.
Janssen EM, Marshall DA, Hauber AB, Bridges JFP. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability? Expert Rev Pharmacoecon Outcomes Res. 2017;17:531–42. https://doi.org/10.1080/14737167.2017.1389648.
Louviere JJ, Islam T, Wasi N, Street D, Burgess L. Designing discrete choice experiments: do optimal designs come at a price? J Consum Res. 2008;35:360–75.
Johnson FR, Lancsar E, Marshall D, Kilambi V, Mühlbacher A, Regier DA, et al. Constructing experimental designs for discrete-choice experiments: report of the ISPOR conjoint analysis experimental design good research practices task force. Value Health. 2013;16:3–13. https://doi.org/10.1016/j.jval.2012.08.2223.
Bridges JFP, Hauber AB, Marshall D, Lloyd A, Prosser LA, Regier DA, et al. Conjoint analysis applications in health—a checklist: a report of the ISPOR good research practices for conjoint analysis task force. Value Health. 2011;14:403–13. https://doi.org/10.1016/j.jval.2010.11.013.
Kjær T, Bech M, Gyrd-Hansen D, Hart-Hansen K. Ordering effect and price sensitivity in discrete choice experiments: need we worry? Health Econ. 2006;15:1217–28. https://doi.org/10.1002/hec.1117.
Heidenreich S, Phillips-Beyer A, Flamion B, Ross M, Seo J, Marsh K. Benefit-risk or risk-benefit trade-offs? Another look at attribute ordering effects in a pilot choice experiment. Patient. 2021;14:65–74. https://doi.org/10.1007/s40271-020-00475-y.
Logar I, Brouwer R, Campbell D. Does attribute order influence attribute-information processing in discrete choice experiments? Resour Energy Econ. 2020;60:101164. https://doi.org/10.1016/j.reseneeco.2020.101164.
Jonker MF, Roudijk B, Maas M. The sensitivity and specificity of repeated and dominant choice tasks in discrete choice experiments. Value Health. 2022;25:1381–9. https://doi.org/10.1016/j.jval.2022.01.015.
Harrison M, Rigby D, Vass C, Flynn T, Louviere J, Payne K. Risk as an attribute in discrete choice experiments: a systematic review of the literature. Patient. 2014;7:151–70. https://doi.org/10.1007/s40271-014-0048-1.
Galesic M, Garcia-Retamero R, Gigerenzer G. Using icon arrays to communicate medical risks: overcoming low numeracy. Health Psychol. 2009;28:210–6. https://doi.org/10.1037/a0014474.
Galesic M. Statistical numeracy for health. Arch Intern Med. 2010;170:462. https://doi.org/10.1001/archinternmed.2009.481.
DeLong KL, Syrengelas KG, Grebitus C, Nayga RM. Visual versus text attribute representation in choice experiments. J Behav Exp Econ. 2021;94:101729. https://doi.org/10.1016/j.socec.2021.101729.
Jonker MF, Donkers B, de Bekker-Grob EW, Stolk EA. Effect of level overlap and color coding on attribute non-attendance in discrete choice experiments. Value Health. 2018;21:767–71. https://doi.org/10.1016/j.jval.2017.10.002.
Jonker MF, Donkers B, de Bekker-Grob E, Stolk EA. Attribute level overlap (and color coding) can reduce task complexity, improve choice consistency, and decrease the dropout rate in discrete choice experiments. Health Econ. 2019;28:350–63. https://doi.org/10.1002/hec.3846.
Maddala T, Phillips KA, Johnson FR. An experiment on simplifying conjoint analysis designs for measuring preferences. Health Econ. 2003;12:1035–47. https://doi.org/10.1002/hec.798.
Kessels R, Jones B, Goos P. An improved two-stage variance balance approach for constructing partial profile designs for discrete choice experiments. Appl Stoch Models Bus Ind. 2015;31:626–48. https://doi.org/10.1002/asmb.2065.
Vass CM, Wright S, Burton M, Payne K. Scale heterogeneity in healthcare discrete choice experiments: a primer. Patient. 2018;11:167–73. https://doi.org/10.1007/s40271-017-0282-4.
Hauber AB, González JM, Groothuis-Oudshoorn CGM, Prior T, Marshall DA, Cunningham C, et al. Statistical methods for the analysis of discrete choice experiments: a report of the ISPOR conjoint analysis good research practices task force. Value Health. 2016;19:300–15. https://doi.org/10.1016/j.jval.2016.04.004.
Swait J, de Bekker-Grob EW. A discrete choice model implementing gist-based categorization of alternatives, with applications to patient preferences for cancer screening and treatment. J Health Econ. 2022;85:102674. https://doi.org/10.1016/j.jhealeco.2022.102674.
Fader PS, McAlister L. An elimination by aspects model of consumer response to promotion calibrated on UPC scanner data. J Mark Res. 1990;27:322. https://doi.org/10.2307/3172589.
Kohli R, Jedidi K. Representation and inference of lexicographic preference models and their variants. Mark Sci. 2007;26:380–99.
Heidenreich S, Watson V, Ryan M, Phimister E. Decision heuristic or preference? Attribute non-attendance in discrete choice problems. Health Econ. 2018;27:157–71. https://doi.org/10.1002/hec.3524.
Sever I, Verbič M, Sever EK. Estimating willingness-to-pay for health care: a discrete choice experiment accounting for non-attendance to the cost attribute. J Eval Clin Pract. 2019;25:843–9. https://doi.org/10.1111/jep.13095.
Scarpa R, Gilbride TJ, Campbell D, Hensher DA. Modelling attribute non-attendance in choice experiments for rural landscape valuation. Eur Rev Agric Econ. 2009;36:151–74. https://doi.org/10.1093/erae/jbp012.
Swait J, Erdem T. Brand effects on choice and choice set formation under uncertainty. Mark Sci. 2007;26:679–97. https://doi.org/10.1287/mksc.1060.0260.
Gilbride TJ, Allenby GM. A choice model with conjunctive, disjunctive, and compensatory screening rules. Mark Sci. 2004;23:391–406. https://doi.org/10.1287/mksc.1030.0032.
Karim S, Craig BM, Groothuis-Oudshoorn CGM. Exploring the importance of controlling heteroskedasticity and heterogeneity in health valuation: a case study on Dutch EQ-5D-5L. Health Qual Life Outcomes. 2022;20:85. https://doi.org/10.1186/s12955-022-01989-9.
Jonker MF. The garbage class mixed logit model: accounting for low-quality response patterns in discrete choice experiments. Value Health. 2022;25:1871–7. https://doi.org/10.1016/j.jval.2022.07.013.
Acknowledgements
J. Veldwijk, S. M. Marceta, J. D. Swait, and S. A. Lipman were supported by the Erasmus Initiative ‘Smarter Choices for Better Health’. E. W. de Bekker-Grob was funded via the Dutch Research Council (NWO-Talent-Scheme-Vidi-Grant no. 09150171910002).
Ethics declarations
Author contributions
JV, SM, and JDS drafted the manuscript; all authors critically reviewed and approved the final manuscript.
Data sharing
Not applicable to this article as no datasets were generated or analyzed during the current study.
Conflict of interest
The authors declare that they have no conflict of interest.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, which permits any non-commercial use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by-nc/4.0/.
Cite this article
Veldwijk, J., Marceta, S.M., Swait, J.D. et al. Taking the Shortcut: Simplifying Heuristics in Discrete Choice Experiments. Patient 16, 301–315 (2023). https://doi.org/10.1007/s40271-023-00625-y