Not So Hypocritical After All: Belief Revision Is Adaptive and Often Unnoticed

Empirically Engaged Evolutionary Ethics

Part of the book series: Synthese Library (SYLI, volume 437)

Abstract

We are all apt to alter our beliefs and even our principles to suit the prevailing winds. Examples abound in public life (think of the politician who bases an election campaign on the need to address the budget emergency represented by a deficit, only to be indifferent to an even larger deficit once in office), but we are all subject to similar reversals. We often accuse one another of hypocrisy when these kinds of reversals occur. Sometimes the accusation is justified. In this paper, however, I will argue that in many such cases, we don’t manifest hypocrisy, even if our change of mind is not in response to new evidence. Marshalling evidence from psychology and evolutionary theory, I will suggest that we are designed to update our beliefs in response to social signals: as these signals change, we change our minds, often without even noticing.

Notes

  1.

    It is moral hypocrisy that is the focus here. Formally, hypocrisy can occur in any domain that is governed by norms: the domain of aesthetics, knowledge, prudence, and so on. But hypocrisy seems to attract opprobrium only in the moral and prudential domains, and even in the latter perhaps only insofar as it is seen as a kind of moral hypocrisy to offer advice one does not abide by oneself.

  2.

    A caveat: these studies, like almost all the experimental (though not the correlational) work in social psychology at the time, used small samples and almost certainly report effect sizes inflated by the failure to report unsuccessful trials. That said, Dong, van Prooijen, and van Lange (2019) report three much larger studies (one preregistered) that replicate these results.

  3.

    Graham et al. define moral duplicity as “claiming moral motives to others, falsely”. It is not clear whether they require that the agent recognize the falseness. Moral duplicity is also more encompassing than strong hypocrisy, because it includes agents who actually live up to their principles, so long as they do so because they are motivated by impression management rather than moral concern.

  4.

    Sie (2015) also argues that the participants in these experiments do not act wrongly. Rather, she argues, experimental demands lead them to wrongly think they are acting contrary to what morality requires of them. I find it more plausible that while the experimental manipulations succeed in altering their conception of where exactly their own action falls on the continuum from forbidden to supererogatory, in all variants they see their action as morally permissible.

  5.

    Lönnqvist, Irlenbusch, and Walkowitz (2014) explicitly address a question that seems to bear on this interpretation: perhaps participants see themselves as already having ‘won’ the lottery by being given the opportunity to assign tasks to themselves and others, and therefore see themselves as having no obligation not to take the positive task for themselves. To test this possibility, they had independent raters assess their slightly different paradigm, a dictator game in which the participant could choose a 5/5 distribution, an 8/2 distribution, or a coin flip. On a 9-point scale that collapsed across measures for “unfair–fair,” “immoral–moral,” and “bad–virtuous,” with the midpoint marked “neutral,” the mean ratings were 8.09 (5/5), 3.56 (8/2), and 6.27 (coin flip). The experimenters claim that these data demonstrate that participants indeed see themselves as acting contrary to their own moral judgments. I disagree: I don’t think this is evidence that 8/2 is seen by participants as selfish in the sense required to establish hypocrisy. The extreme end of these scales does not correspond to the required option; it corresponds to the best option. The midpoint is therefore halfway between the immoral (unfair; vicious) and the morally best.

  6.

    Again, one should treat this result with caution in the light of the replication crisis and the small number of participants. A number of studies have reported that images of eyes increase prosocial behavior (e.g. Rigdon, Ishii, Watabe, & Kitayama, 2009), but recent meta-analyses suggest that the effect size is not significantly different from zero (Northover, Pedersen, Cohen, & Andrews, 2017). Perhaps, however, the mirror manipulation succeeds where images of eyes do not.

  7.

    Admittedly, there is a good deal of controversy over the High Gods theory, especially with regard to the claim that moralizing gods are needed for the emergence of complex societies (e.g. Whitehouse et al., 2019). But even its harshest critics accept that even if moralizing gods do not explain the emergence of large-scale settlements, they probably help to explain the stability of such settlements over time (see Gray & Watts, 2017).

  8.

    It would be a mistake to place too much weight on these experiments, because most were conducted prior to widespread awareness of problems of replicability, and often had too few participants for us to be confident that the effect is real, even setting aside the possibility of inflated effect sizes due to the file drawer effect. There has not yet been a preregistered multi-lab replication attempt of this work, though one is currently in the planning stages. That said, there are preregistered replications of the basic finding (e.g. Forstmann & Sagioglou, 2020), which makes me somewhat optimistic that the effect will replicate. The researchers behind the multi-lab replication attempt report that they believe the effect is real, albeit inflated in the published literature (Vaidis & Sleegers, 2018).

  9.

    Interestingly, there is evidence that we represent our beliefs at a finer level of granularity than the beliefs of others (Thornton, Weaverdyck, Mildner, & Tamir, 2019). The evidence I am marshalling here suggests, however, that we do not thereby represent stable and detailed mental states at all. There are three reasons why the data from Thornton et al. are compatible with my picture. First, their claim is comparative: people represent their own states more distinctly than those of others (there is a gradient in distinctness, such that the more socially distant someone is, the less distinct our representations tend to be). More distinct is compatible with very indistinct, of course. Second, as Thornton et al. note, “rich representations are not necessarily accurate ones”; indeed the richness of the representations may help give rise to an illusion that we know our own mental states (6). Third, any temptation to regard this as evidence about the distinctness of one’s mental states should be heavily tempered by the method, which did not ask participants to introspect but instead asked them to consider images paired with state descriptors.

  10.

    Note that constructing beliefs rather than recalling them is in fact very common, especially with regard to dispositional beliefs. As Gareth Evans (1982) influentially noted, we answer questions like “do you believe there will be another World War?” not by looking inward, to a repository of our beliefs, but by considering the world. This kind of case is somewhat different from those under discussion, since in it we (apparently) attempt to consider the facts that make the claim true or false, rather than looking to indirect cues for whether we should accept it (in fact, I am confident that we also look to cues to belief in even the best of cases). Nevertheless, given the similarities, it is unsurprising that there is no phenomenological difference between the different ways of constructing our beliefs on the spot.

  11.

    It might be weaker on one side than another if group identification is weaker on that side, or if identification is not with party or partisan groupings. The rise of identity politics might entail a weakening of broad-based cues for belief revision, in favour of a fragmentation of such cues.

  12.

    It is important to note that many epistemologists who accept that disagreements with epistemic peers constitute higher-order evidence might reject my claim that outsourcing is rational, because they advance extremely restrictive notions of who counts as an epistemic peer. As Lackey (2010) notes, the idealized notion of peerhood these accounts work with threatens to cut the debate off from the real-world cases that give it its point. In any case, we can set this issue aside: when I disagree with many others, I can be very confident that among the dissenters are many people who are at least my epistemic peers (no doubt some are my epistemic superiors – at least with regard to the proposition at issue – and I ought to give their dissent especially heavy weight).

References

  • Alicke, M. D., Gordon, E., & Rose, D. (2013). Hypocrisy: What counts? Philosophical Psychology, 26(5), 673–701.
  • Barrett, J. L. (1999). Theological correctness: Cognitive constraint and the study of religion. Method & Theory in the Study of Religion, 11(4), 325–339.
  • Batson, C. D., & Thompson, E. R. (2001). Why don’t moral people act morally? Motivational considerations. Current Directions in Psychological Science, 10(2), 54–57.
  • Batson, C. D., Thompson, E. R., & Chen, H. J. (2002). Moral hypocrisy: Addressing some alternatives. Journal of Personality and Social Psychology, 83(2), 330–339.
  • Batson, C. D., Thompson, E. R., Seuferling, G., Whitney, H., & Strongman, J. A. (1999). Moral hypocrisy: Appearing moral to oneself without being so. Journal of Personality and Social Psychology, 77(3), 525–537.
  • Batson, C. D., Tsang, J., & Thompson, E. R. (2000). Weakness of will: Counting the cost of being moral. Unpublished manuscript.
  • Boyd, R., Richerson, P. J., & Henrich, J. (2011). The cultural niche: Why social learning is essential for human adaptation. Proceedings of the National Academy of Sciences, 108(supplement 2), 10918–10925.
  • Brooks, R. A. (1990). Elephants don’t play chess. Robotics and Autonomous Systems, 6(1–2), 3–15.
  • Browning, L. E., Patrick, S. C., Rollins, L. A., Griffith, S. C., & Russell, A. F. (2012). Kin selection, not group augmentation, predicts helping in an obligate cooperatively breeding bird. Proceedings of the Royal Society B: Biological Sciences, 279(1743), 3861–3869.
  • Carruthers, P. (2013). The opacity of mind. Oxford: Oxford University Press.
  • Chater, N. (2018). The mind is flat: The remarkable shallowness of the improvising brain. New Haven, CT: Yale University Press.
  • Christensen, D. (2007). Epistemology of disagreement: The good news. Philosophical Review, 116(2), 187–218.
  • Chudek, M., Heller, S., Birch, S., & Henrich, J. (2012). Prestige-biased cultural learning: Bystander’s differential attention to potential models influences children’s learning. Evolution and Human Behavior, 33(1), 46–56.
  • Clark, A. (1997). Being there: Putting brain, body, and world together again. Cambridge, MA: MIT Press.
  • Clement, S. (2017, April 10). Poll: Narrow support for Trump’s strike in Syria. Washington Post. https://www.washingtonpost.com/world/national-security/poll-narrow-support-for-trumps-strike-in-syria/2017/04/10/15dab5f6-1e02-11e7-a0a7-8b2a45e3dc84_story.html?utm_term=.d786f6570982
  • Cohen, G. L. (2003). Party over policy: The dominating impact of group influence on political beliefs. Journal of Personality and Social Psychology, 85(5), 808–822.
  • Cooper, J. (2007). Cognitive dissonance: Fifty years of a classic theory. London: Sage.
  • Dong, M., van Prooijen, J.-W., & van Lange, P. A. M. (2019). Self-enhancement in moral hypocrisy: Moral superiority and moral identity are about better appearances. PLoS One, 14(7), e0219382. https://doi.org/10.1371/journal.pone.0219382
  • Duhaime, E. P. (2015). Is the call to prayer a call to cooperate? A field experiment on the impact of religious salience on prosocial behavior. Judgment and Decision Making, 10(6), 593–596.
  • Edelman, B. (2009). Red light states: Who buys online adult entertainment? Journal of Economic Perspectives, 23(1), 209–220.
  • Evans, G. (1982). The varieties of reference. New York: Oxford University Press.
  • Forstmann, M., & Sagioglou, C. (2020). Religious concept activation attenuates cognitive dissonance reduction in free-choice and induced compliance paradigms. Journal of Social Psychology, 160(1), 75–91.
  • Graham, J., Meindl, P., Koleva, S., Iyer, R., & Johnson, K. M. (2015). When values and behavior conflict: Moral pluralism and intrapersonal moral hypocrisy. Social and Personality Psychology Compass, 9(3), 158–170.
  • Gray, R. D., & Watts, J. (2017). Cultural macroevolution matters. Proceedings of the National Academy of Sciences, 114(3), 7846–7852.
  • Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS One, 7(9), e45457.
  • Hall, L., Strandberg, T., Pärnamets, P., Lind, A., Tärning, B., & Johansson, P. (2013). How the polls can be both spot on and dead wrong: Using choice blindness to shift political attitudes and voter intentions. PLoS One, 8(4), e60554.
  • Hamilton, W. D. (1964). The genetical evolution of social behaviour: I. Journal of Theoretical Biology, 7(1), 1–16.
  • Harris, P. (2012). Trusting what you’re told: How children learn from others. Cambridge, MA: Harvard University Press.
  • Henrich, J., & Boyd, R. (1998). The evolution of conformist transmission and between-group differences. Evolution and Human Behavior, 19(4), 215–242.
  • Henrich, J., & Gil-White, F. (2001). The evolution of prestige: Freely conferred deference as a mechanism for enhancing the benefits of cultural transmission. Evolution and Human Behavior, 22(3), 165–196.
  • Henrich, N., & Henrich, J. (2007). Why humans cooperate: A cultural and evolutionary explanation. New York: Oxford University Press.
  • Johansson, P., Hall, L., Sikström, S., & Olsson, A. (2005). Failure to detect mismatches between intention and outcome in a simple decision task. Science, 310(5745), 116–119.
  • Kurtzleben, D. (2016, October 23). Poll: White evangelicals have warmed to politicians who commit ‘immoral’ acts. National Public Radio. https://www.npr.org/2016/10/23/498890836/poll-white-evangelicals-have-warmed-to-politicians-who-commit-immoral-acts
  • Lackey, J. (2010). A justificationist view of disagreement’s epistemic significance. In A. Haddock, A. Millar, & D. Pritchard (Eds.), Social epistemology (pp. 298–325). New York: Oxford University Press.
  • Laland, K. N. (2017). Darwin’s unfinished symphony: How culture made the human mind. Princeton, NJ: Princeton University Press.
  • Levy, N. (2018). In praise of outsourcing. Contemporary Pragmatism, 15(3), 244–265.
  • Levy, N. (2019). Due deference to denialism: Explaining ordinary people’s rejection of established scientific findings. Synthese, 196(1), 313–327.
  • Levy, N., & Alfano, M. (2020). Knowledge from vice. Mind, 129(515), 887–915.
  • Lönnqvist, J. E., Irlenbusch, B., & Walkowitz, G. (2014). Moral hypocrisy: Impression management or self-deception? Journal of Experimental Social Psychology, 5, 53–62.
  • Lopez, G. (2017, November 30). Lindsey Graham, 2017: I’m tired of media portraying Trump as a kook. Graham, 2016: Trump is a kook. Vox. https://www.vox.com/policy-and-politics/2017/11/30/16720814/lindsey-graham-trump-kook
  • Malhotra, D. K. (2008). (When) are religious people nicer? Religious salience and the ‘Sunday effect’ on pro-social behavior (NOM Working Paper No. 09-066). Boston: Harvard Business School. https://doi.org/10.2139/ssrn.1297275
  • Maoz, I., Ward, A., Katz, M., & Ross, L. (2002). Reactive devaluation of an “Israeli” vs. “Palestinian” peace proposal. Journal of Conflict Resolution, 46(4), 515–546.
  • Mascaro, O., & Sperber, D. (2009). The moral, epistemic, and mindreading components of children’s vigilance towards deception. Cognition, 112(3), 367–380.
  • McCauley, R. N. (2011). Why religion is natural and science is not. New York: Oxford University Press.
  • Mercier, H., Deguchi, M., Van der Henst, J.-B., & Yama, H. (2015). The benefits of argumentation are cross-culturally robust: The case of Japan. Thinking & Reasoning, 22(1), 1–15.
  • Mercier, H., & Sperber, D. (2017). The enigma of reason. Cambridge, MA: Harvard University Press.
  • Mercier, H., Trouche, E., Yama, H., Heintz, C., & Girotto, V. (2015). Experts and laymen grossly underestimate the benefits of argumentation for reasoning. Thinking & Reasoning, 21(3), 341–355.
  • Miller, D. D. (2019). The mystery of evangelical Trump support? Constellations, 26(1), 43–58.
  • Monin, B., & Merritt, A. (2012). Moral hypocrisy, moral inconsistency, and the struggle for moral integrity. In M. Mikulincer & P. Shaver (Eds.), The social psychology of morality: Exploring the causes of good and evil (pp. 167–184). New York: American Psychological Association.
  • Norenzayan, A. (2013). Big gods: How religion transformed cooperation and conflict. Princeton, NJ: Princeton University Press.
  • Norenzayan, A., Shariff, A. F., Gervais, W. M., Willard, A. K., McNamara, R., Slingerland, E., et al. (2016). The cultural evolution of prosocial religions. Behavioral and Brain Sciences, 39(1), 1–19.
  • Northover, S. B., Pedersen, W. C., Cohen, A. B., & Andrews, P. W. (2017). Artificial surveillance cues do not increase generosity: Two meta-analyses. Evolution and Human Behavior, 38(1), 144–153.
  • Nowak, M. A., Tarnita, C. E., & Wilson, E. O. (2010). The evolution of eusociality. Nature, 466(7310), 1057–1062.
  • Pew Research Center. (2017, June 26). Changing attitudes on gay marriage. http://www.pewforum.org/fact-sheet/changing-attitudes-on-gay-marriage/
  • Rayner, K. (1998). Eye movements in reading and information processing: 20 years of research. Psychological Bulletin, 124(3), 372–422.
  • Richerson, P. J., & Boyd, R. (2005). Not by genes alone. Chicago: University of Chicago Press.
  • Rigdon, M., Ishii, K., Watabe, M., & Kitayama, S. (2009). Minimal social cues in the dictator game. Journal of Economic Psychology, 30(3), 358–367.
  • Sie, M. (2015). Moral hypocrisy and acting for reasons: How moralizing can invite self-deception. Ethical Theory and Moral Practice, 18(2), 223–235.
  • Simons, D. J., & Levin, D. T. (1997). Change blindness. Trends in Cognitive Sciences, 1(7), 261–267.
  • Simons, D. J., & Levin, D. T. (1998). Failure to detect changes to people during a real-world interaction. Psychonomic Bulletin & Review, 5(4), 644–649.
  • Sperber, D., Clément, F., et al. (2010). Epistemic vigilance. Mind & Language, 25(4), 359–393.
  • Sterelny, K. (2014). A paleolithic reciprocation crisis: Symbols, signals, and norms. Biological Theory, 9(1), 65–77.
  • Sterelny, K. (2016). Cooperation, culture, and conflict. British Journal for the Philosophy of Science, 67(1), 31–58.
  • Swanson, G. E. (1966). The birth of the gods. Ann Arbor, MI: University of Michigan.
  • Thornton, M. A., Weaverdyck, M. E., Mildner, J. N., & Tamir, D. I. (2019). People represent their own mental states more distinctly than those of others. Nature Communications, 10, 2117.
  • Trivers, R. L. (1971). The evolution of reciprocal altruism. Quarterly Review of Biology, 46(1), 35–57.
  • Vaidis, D., & Sleegers, W. (2018). Large scale registered replication project – Cognitive dissonance: Induced compliance paradigm with counterattitudinal essay. Open Science Foundation. https://osf.io/9xsmj/
  • Whitehouse, H., François, P., Savage, P. E., Currie, T. E., Feeney, K. C., Cioni, E., et al. (2019). Complex societies precede moralizing gods throughout world history. Nature, 568, 226–229.
  • Xygalatas, D. (2013). Effects of religious setting on cooperative behaviour: A case study from Mauritius. Religion, Brain and Behavior, 3(2), 91–102.

Acknowledgements

I am grateful to an audience at “Evolutionary ethics: The nuts and bolts approach,” held at Oxford Brookes in July 2018, for helpful comments. I am especially grateful to Helen De Cruz, Johan De Smedt and Mark Alfano for extremely helpful comments on the written version, in light of which the paper has been revised extensively.

Author information

Correspondence to Neil Levy.

Copyright information

© 2021 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Levy, N. (2021). Not So Hypocritical After All: Belief Revision Is Adaptive and Often Unnoticed. In: De Smedt, J., De Cruz, H. (eds) Empirically Engaged Evolutionary Ethics. Synthese Library, vol 437. Springer, Cham. https://doi.org/10.1007/978-3-030-68802-8_3
