Abstract
We use a simple version of the Psychological Expected Utility Model (Caplin and Leahy, Q J Econ 116:55–80, 2001) to analyze the optimal choice of information accuracy by an individual who is concerned with anticipatory feeling. The individual faces the following trade-off: on the one hand information may lead to emotional costs, on the other the higher the information accuracy, the higher the efficiency of decision-making. We completely and explicitly characterize how anticipatory utility depends on information accuracy, and study the optimal amount of information acquisition. We obtain simple and explicit conditions under which the individual prefers no-information or partial information gathering. We show that anomalous attitudes towards information can be more frequent and articulated than previously thought.
Similar content being viewed by others
Notes
For example, Lerman et al. (1998) demonstrated that 46% of subjects whose blood was tested for genetic mutations refused to receive the test results despite the fact that the test results indicated whether or not these subjects were susceptible to breast cancer later in life. By knowing more precise information on their risk of cancer patients would have been able to take better prevention measures.
For example, patients could consider some partially informative facts such as their family medical history, or they could look for some information concerning the probability of being high-risk conditional on their specific lifestyle. Whereas a medical test would correspond to a perfect signal. Investors could read financial newspapers to obtain information on financial market performance. Consulting their financial expert, instead, would be equivalent to solve all the uncertainty on their portfolio return. Distrustful partners could look for circumstantial evidence such as ambiguous messages on mobile phones. Otherwise, by employing a personal investigator to spy on their partner they could obtain irrefutable evidence.
In our motivating examples the patient’s action is represented by all preventative behavior adopted or treatment undertaken, whereas in the case of the investor, the action can be interpreted as every possible change in his portfolio composition. The action for the suspicious partner could be the appropriate attitude to take with his/her partner.
This approach is also used in Koszegi, so that our results will be directly comparable with those of Koszegi (2003).
Note that discomfort from information acquisition is always positive and increasing in information accuracy when no action is available. Whereas, when information has decision-making value, it occurs only if unfavorable updated beliefs due to bad news have a stronger impact on anticipatory utility than better choices. Moreover, in that case, discomfort from information is not necessarily increasing in information accuracy.
When the function u(·) is convex or linear, expression 3.7 is always positive. This means that anticipatory utility is monotonically increasing in q and exhibits a global minimum for \(q=\frac{1 }{2}\) : as we expected, an information-loving or information-neutral DM will always choose full information gathering.
Note that, when \(q=\tfrac{1}{2}\) is a maximum, whatever the distance between the two outcomes, the preferred signal is either the fully informative or the uninformative one (see Table 1). However, when w 1 − w 2 ≥ 1, anticipatory utility is always first decreasing and then increasing in information accuracy. On the contrary, when w 1 − w 2 < 1, anticipatory utility can be either first decreasing and then increasing or monotonically decreasing in information accuracy. The latter sub-case occurs when the emotional costs of information always dominate its physical benefits.
In addition to being continuously differentiable, a quadratic loss function differs from a linear one (i.e. − k|w i − a|) in that the cost of ignorance due to imprecise decisions becomes very small and negligible when uncertainty tends to vanish. We believe that both loss functions may adequately represent different real world environments and are equally interesting.
In particular, in both the PEU and the Kreps and Porteus model, information avoidance behaviors depend on a concavity condition (concavity of the anticipatory utility function u(·) in the PEU model, and of the payoff function associated to the “outcome-belief” lotteries in Kreps and Porteus’ model).
Moreover, as Caplin and Leahy (2004) show, different motivations (surprise, anxiety) for anomalous attitudes towards information can have different implications in a principal-agent setting where a benevolent agent must decide about information transmission to a principal who exhibits non-linear preference over temporal lotteries. They conclude that “the classical revealed preference approach to the theory of choice is insufficient to answer a potential important class of policy questions in which the policy maker must decide how much information to share with private agents” (p. 504).
For a survey on the important change in the analysis of the patient–physician relationship recently introduced by considering patient emotions, see Barigozzi and Levaggi (2008).
Information acquisition by a (partially or totally) uninformed agent has been considered in the literature on information gathering in standard agency. Within a standard principal-agent model, Lewis and Sappington (1991) study a principal who chooses the probability p that the agent receives perfect private information on the state of nature. Increasing p increases the chances that the agent’s activity can be tailored to the realized state of nature but also increases the expected rent of the agent. The authors prove that the optimal solution for the principal always occurs at an extreme value of p (either p = 0 or p = 1). This is somehow related to our paper, although the optimality of “all-or-nothing” information comes from a trade-off with rent extraction, which is absent in our analysis.
To the best of our knowledge, no empirical parameter estimates of individual relative information aversion are available. The estimates of relative risk aversion vary considerably, but values in the 0.5–3 interval are often referred to.
Our simulations show that w 2 must be sufficiently close to zero: when this is the case the power function is concave enough for inequality 5 to be satisfied.
Note that the optimal action conditional to the signal observed \(a_{i}^{\ast }\) does not change with respect to the case where anticipatory utility only is maximized. In fact, here, the DM’s FOC is \(\left[ u_{i}^{\prime }\left( \cdot \right) +1\right] \frac{\partial E\left[ w-\left( w-a\right) ^{2}|s_{i} \right] }{\partial a_{i}}=0.\)
References
Barigozzi, F., & Levaggi, R. (2008). Emotions in physician agency. Health Policy, 88, 1–14.
Caplin, A., & Eliaz, K. (2003). AIDS and psychology: A mechanism-design approach. RAND Journal of Economics, 34, 631–646.
Caplin, A., & Leahy, J. (2001). Psychological expected utility theory and anticipatory feelings. Quarterly Journal of Economics, 116, 55–80.
Caplin, A., & Leahy, J. (2004). The supply of information by a concerned expert. Economic Journal, 114, 487–505.
Eliaz, K., & Spiegler, R. (2006). Can anticipatory feelings explain anomalous choices of information sources? Games and Economic Behaviors, 56, 87–104.
Jacobsen, P., Valdimarsdottir, H., Brown, K., & Oct, K. (1997). Decision-making about genetic testing among women at familial risk for breast cancer. Psychosomatic Medicine, 59, 459–466.
Karlsson, N., Loewenstein, G., & Seppi, D. (2009). The ‘Ostrich Effect’: Selective attention to information. Journal of Risk and Uncertainty 38, 95–115.
Koszegi, B. (2003). Health anxiety and patient behavior. Journal of Health Economics, 22, 1073–1084.
Koszegi, B. (2006). Emotional agency. Quarterly Journal of Economics, 121(1), 121–155.
Kreps, D., & Porteus, E. (1978). Temporal resolution of uncertainty and dynamic choice theory. Econometrica, 46, 185–200.
Lerman, C., Daly, M., Masny, M., & Balsheim, A. (1994). Attitudes about genetic testing for breast-ovarian cancer susceptibility. Journal of Clinical Oncology, 12, 843–850.
Lerman, C., Hughes, C., Lemon, S. J., Main, D., Snyder, C., Durham, C., et al. (1998). What you don’t know can hurt you: Adverse psychological effects in members of BRCA1-linked and BRCA2-linked families who decline genetic testing. Journal of Clinical Oncology, 16, 1650–1654.
Lewis, T. R., & Sappington, D. (1991). All-or-nothing information control. Economics Letters, 37, 111–113.
Quaid, K., & Morris, M. (1993). Reluctance to undergo predictive testing: The case of Huntington’s Disease. American Journal of Medical Genetics, 45, 41–45.
Yariv, L. (2005). I’ll see it when I believe it—A simple model of cognitive consistency. Mimeo, UCLA.
Author information
Authors and Affiliations
Corresponding author
Additional information
A previous, different version of the paper circulated with the title “Emotional Decision-Makers and Information Gathering”. The authors are grateful to Giacomo Calzolari, Louis Eeckhoudt, Claudia Scarani, Peter Sivey, Rani Spiegler, Claudio Zoli and seminar participants in Bologna, Konstanz, Marseille, Padova, Roma, Siena and Toulouse for helpful comments and discussions. An anonymous referee of this journal provided valuable comments and suggestions.
Appendix
Appendix
1.1 Proof of Remark 4
-
(i)
From Eq. Eq. 3.6 it is easy to check that, when w 1 − w 2 < 1, \(f_{2}^{\prime }(\cdot )\) is negative for \(q\in \left[ \tfrac{1}{2},1\right] \).
- (ii)
-
(iii)
When w 1 − w 2 ≥ 1, \( f_{2}^{\prime }(\cdot )<0\) for \(q=\tfrac{1}{2}\) and \(f_{2}^{\prime }(\cdot )>0\) for q = 1. Thus, starting from \(q=\tfrac{1}{2},\) f 2(·) is first decreasing and then increasing in q. From Eq. 3.6 it is easy to see that the higher w 1 − w 2, the shorter the subinterval of \(\left[ \tfrac{1}{2},1\right] \) where f 2(·) is decreasing in q.
1.2 Proof of Lemma 1
-
(i)
When \(q=\frac{1}{2}\), it is easy to see that:
$$ \left. \frac{\partial ^{2}U\big(q;w_{1},w_{2}\big)}{\partial q^{2}}\right\vert _{q= \frac{1}{2}}=2u^{\prime }\left( f\left(\frac{1}{2};w_{1},w_{2}\right)\right) +u^{\prime \prime }\left( f\left(\frac{1}{2};w_{1},w_{2}\right)\right) $$where \(u_{1}^{\prime }\left( \cdot \right) =u_{2}^{\prime }\left( \cdot \right) =u^{\prime }\left( \cdot \right) ,\) thus:
$$ \left. \frac{\partial ^{2}U\big(q;w_{1},w_{2}\big)}{\partial q^{2}}\right\vert _{q= \frac{1}{2}}>0\Leftrightarrow R_{u}(f)<2f\left(\frac{1}{2};w_{1},w_{2}\right). \label{conditionq12minimum} $$(4) -
(ii)
This corresponds to the case where \(\left. \frac{\partial ^{2}U(q;w_{1},w_{2})}{\partial q^{2}}\right\vert _{q=\frac{1}{2}}<0.\)
1.3 Anticipatory utility decreasing in the signal precision for q = 1.
Remark 6
Anticipatory utility is decreasing in the signal precision for q = 1 if and only if:
Two necessary conditions are w 1 − w 2 < 1 and u(·) concave enough.
Proof
When q = 1, it is easy to see that:
Thus, for q = 1, anticipatory utility is decreasing in the signal precision if:
From Eq. 6 condition 5 can be immediately derived. Moreover Eq. 6 can be verified only if w 1 − w 2 < 1 and it implies that u(·) must be concave enough. In fact, inequality 5 is verified only if \(u_{1}^{\prime }\left( w_{1}\right) \) is sufficiently lower than \(u_{2}^{\prime }\left( w_{2}\right).\)
1.4 Simulations
Let’s consider the power function \(u\left( x\right) =\frac{ x^{1-\gamma }}{1-\gamma },\) with γ re-interpreted as the parameter of constant relative aversion to information.Footnote 14 In this paragraph we provide examples of all the possible results described in Table 2.
-
Case (a) Let’s start with the more interesting sub-case: a preference for the partial informative signal. According to Remark 5, let’s take \(\gamma <2f(\tfrac{1}{2};w_{1},w_{2}).\) The uninformative signal is a minimum for w 1 ∈ \([w_{2}+1-\sqrt{ 4w_{2}+1-2\gamma },\) \(w_{2}+1+\sqrt{4w_{2}+1-2\gamma }]\) and \(\gamma <\frac{1 }{2}+2w_{2}.\) Moreover, for w 1 − w 2 < 1 the derivative of anticipatory utility in q = 1 is negative.Footnote 15 Figures 1 and 2 show anticipatory utility in the interval \(q\in \left[ \tfrac{1}{2},1\right] \) when u(·) is the power function and w 1 = 0.55, w 2 = 0.023. It can be seen that for γ = 0.42 (Fig. 1) the uninformative signal dominates the fully informative one, whereas for γ = 0.4 (Fig. 2) the opposite holds. According to intuition, all else being equal, the lower the relative aversion to information, the closer to 1 the optimal precision of the signal. In “The case of total utility” below we present the case of total utility (anticipatory utility plus expected physical utility): an interior solution still exists and it is compatible with higher values of information aversion.
The second sub-case arises when \(q=\tfrac{1}{2}\) corresponds to a global minimum and full information is the optimal choice. This occurs, for example, in the case γ = 0.3, ω 1 = 0.5 and ω 2 = 0.2.
-
Case (b) The uninformative signal corresponds to a global minimum and full information gathering is the optimal choice. This occurs, for example, when γ = 0.5, ω 1 = 2.5 and ω 2 = 0.5 .
-
Case (c) In the first sub-case \(q=\tfrac{1}{2}\) is a global maximum and no information is the optimal choice. This occurs, for example, when γ = 0.7, ω 1 = 0.45 and ω 2 = 0.2. In the second sub-case anticipatory utility is first decreasing and then increasing in information accuracy and either full information or no information are the preferred choices. When γ = 1.2, ω 1 = 1, ω 2 = 0.3 full information dominates no information whereas when γ = 1.2, ω 1 = 0.99, ω 2 = 0.2, the opposite occurs.
-
Case (d) Anticipatory utility exhibits a global minimum for partial information gathering. The preferred choice can be either the fully informative signal or the uninformative one. When γ = 1.1, ω 1 = 1.3, ω 2 = 0.05 no information dominates full information whereas when γ = 0.8, ω 1 = 1.3, ω 2 = 0.05 the opposite occurs.
1.5 The case of total utility
As we mentioned before, using anticipatory utility only instead of total utility (consisting of anticipatory utility plus expected physical utility) as the DM’s objective involves no loss of generality. To show that, in this paragraph we obtain sufficient conditions such that partial information is the preferred choice (as in Remark 5) when the DM’s well-being is measured by total utility. Total utility isFootnote 16:
Since a convex term has been added to the DM’s objective function, Lemma 1 and Remark 6 (the latter in the Appendix) must be slightly modified.
Corollary 1
The uninformative signal is a local minimum of total utility if
where ε ≡ \(\frac{2f\big(\tfrac{1}{2};w_{1},w_{2}\big)}{ u^{\prime }\left( f\big(\tfrac{1}{2};w_{1},w_{2}\big)\right) }.\)
Note that \(f(\tfrac{1}{2};w_{1},w_{2})\) and ε are positive for w 1 ∈ (\(w_{2}+1-\sqrt{4w_{2}+1},\) \(w_{2}+1+\sqrt{4w_{2}+1}\)). The following corollary is the equivalent of Remark 6 in the case of total utility, that is, it derives the condition such that total utility is decreasing in the signal precision for the fully informative signal.
Corollary 2
Total utility is decreasing in the signal precision for q = 1 if and only if:
Finally, the following corollary states sufficient conditions for an internal solution in the case of total utility:
Corollary 3
With total utility, if both the uninformative signal is a minimum of anticipatory utility and anticipatory utility is decreasing in the signal precision for the fully informative signal, that is if conditions 7 and 8 simultaneously hold, a partially informative signal is the DM’s optimal choice.
Note that, with total utility, it is less likely that the objective function is decreasing in q for the fully informative signal, however the condition such that the uninformative signal is a local minimum is less stringent than before.
To summarize, when the DM maximizes total utility and an internal solution exists, the latter is compatible with levels of information aversion higher than the values we observed when anticipatory utility was considered alone. Our simulations show that an internal solution with total utility can be found, for example when γ = 0.56, w 1 = 0.53 and w 2 = 0.023.
Rights and permissions
About this article
Cite this article
Barigozzi, F., Levaggi, R. Emotional decision-makers and anomalous attitudes towards information. J Risk Uncertain 40, 255–280 (2010). https://doi.org/10.1007/s11166-010-9092-y
Published:
Issue Date:
DOI: https://doi.org/10.1007/s11166-010-9092-y