
The Biases of Thinking Fast and Thinking Slow

  • Dirk Streeb
  • Min Chen
  • Daniel A. Keim

Abstract

Visualization is a human-centric process, which is inevitably associated with potential biases in human judgment and decision-making. While the discussion of human biases has been heavily influenced by the work of Daniel Kahneman, as summarized in his book “Thinking, Fast and Slow”, there are also viewpoints in psychology in favor of heuristics, such as those of Gigerenzer. In this chapter, we present a balanced discourse on human heuristics and biases as two sides of the same coin. In particular, we examine these two aspects from a probabilistic perspective, and relate them to the notions of global and local sampling. We use three case studies from Kahneman’s book to illustrate the potential biases of human- and machine-centric decision processes. Our discourse leads to the concrete conclusion that visual analytics, where interactive visualization is integrated with statistics and algorithms, offers an effective and efficient means to overcome biases in data intelligence.
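
To make the probabilistic perspective concrete, the following minimal Python sketch works through the classic taxicab base-rate problem analysed by Birnbaum (1983) and discussed by Tversky and Kahneman (1974). The specific numbers (85% Green cabs, 15% Blue cabs, a witness who is correct 80% of the time) are the standard textbook values and are assumed here purely for illustration; they are not taken from the chapter itself.

```python
# Bayesian treatment of the classic taxicab base-rate problem
# (Tversky & Kahneman 1974; Birnbaum 1983). The numbers below are the
# standard textbook values, assumed here purely for illustration.

prior_blue = 0.15    # base rate: 15% of the city's cabs are Blue
prior_green = 0.85   # 85% are Green
hit_rate = 0.80      # the witness identifies a cab's colour correctly 80% of the time
false_alarm = 0.20   # and misidentifies it 20% of the time

# Total probability that the witness reports "Blue"
p_says_blue = hit_rate * prior_blue + false_alarm * prior_green

# Bayes' rule: probability the cab is actually Blue given the report "Blue"
posterior_blue = (hit_rate * prior_blue) / p_says_blue

print(f"P(cab is Blue | witness says Blue) = {posterior_blue:.2f}")  # prints 0.41
```

The posterior of roughly 0.41 contrasts with the intuitive answer of 0.80, which neglects the base rate; whether such answers reflect a bias or a reasonable heuristic under local sampling is the kind of question examined in the chapter.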


Acknowledgements

Part of this work was funded by the German federal and state governments’ Excellence Initiative via the Graduate School of Decision Sciences at the University of Konstanz.

References

  1. Bar-Hillel M (1980) The base-rate fallacy in probability judgments. Acta Psychologica 44(3):211–233
  2. Bickel PJ, Hammel EA, O’Connell JW (1975) Sex bias in graduate admissions: data from Berkeley. Science 187(4175):398–404
  3. Birnbaum MH (1983) Base rates in Bayesian inference: signal detection analysis of the cab problem. Am J Psychol 96(1):85–94
  4. Fiedler K (2008) The ultimate sampling dilemma in experience-based decision making. J Exp Psychol: Learning, Memory, Cognition 34(1):186–203
  5. Fiedler K, von Sydow M (2015) Heuristics and biases: beyond Tversky and Kahneman’s (1974) judgment under uncertainty. In: Eysenck MW, Groome D (eds) Cognitive psychology: revisiting the classic studies. Sage Publications, chap 12, pp 146–161. https://www.worldcat.org/isbn/9781446294475
  6. Gigerenzer G, Hertwig R, Pachur T (eds) (2011) Heuristics: the foundations of adaptive behavior. Oxford University Press, New York. https://www.worldcat.org/isbn/9780190494629
  7. Green B, Zwiebel J (2017) The hot-hand fallacy: cognitive mistakes or equilibrium adjustments? Evidence from major league baseball. Manage Sci. https://doi.org/10.2139/ssrn.2358747
  8. Kahneman D (2011) Thinking, fast and slow. Penguin Books Ltd., London. https://www.worldcat.org/isbn/9781846140556
  9.
  10. Keren G, Schul Y (2009) Two is not always better than one. Perspect Psychol Sci 4(6):533–550
  11. Krynski TR, Tenenbaum JB (2003) The role of causal models in reasoning under uncertainty. In: Proceedings of the 25th annual conference of the Cognitive Science Society. Erlbaum, Mahwah, NJ. https://www.worldcat.org/isbn/9780805849912
  12. Krynski TR, Tenenbaum JB (2007) The role of causality in judgment under uncertainty. J Exp Psychol: Gen 136(3):430–450
  13. Miller JB, Sanjurjo A (2016) Surprised by the gambler’s and hot hand fallacies? A truth in the law of small numbers. IGIER Working Paper No 552. https://doi.org/10.2139/ssrn.2627354
  14. Open Science Collaboration (2015) Estimating the reproducibility of psychological science. Science 349(6251). https://doi.org/10.1126/science.aac4716
  15. Pearl J (2009) Causality: models, reasoning and inference, 2nd edn. Cambridge University Press. https://doi.org/10.1017/CBO9780511803161
  16. Pearl J (2013) Understanding Simpson’s paradox. Tech. rep., http://ftp.cs.ucla.edu/pub/stat_ser/r414.pdf
  17. Schimmack U, Heene M, Kesavan K (2017) Reconstruction of a train wreck: how priming research went off the rails. Online, https://replicationindex.wordpress.com/2017/02/02/reconstruction-of-a-train-wreck-how-priming-research-went-of-the-rails/. Accessed June 2017
  18. Simon HA (1990) Invariants of human behavior. Annu Rev Psychol 41:1–20
  19. Simpson EH (1951) The interpretation of interaction in contingency tables. J R Stat Soc Ser B (Methodological) 13(2):238–241
  20. Sloman SA (1996) The empirical case for two systems of reasoning. Psychol Bull 119(1):3–22
  21. Tversky A, Kahneman D (1974) Judgment under uncertainty: heuristics and biases. Science 185(4157):1124–1131
  22. Wagenmakers EJ, Zwaan RA (2016) Registered Replication Report: Strack, Martin, & Stepper (1988). Perspect Psychol Sci 11(6):917–928
  23. Wagner CH (1982) Simpson’s paradox in real life. Am Stat 36(1):46–48
  24. Yong E (2012) Nobel laureate challenges psychologists to clean up their act: social-priming research needs “daisy chain” of replication. https://doi.org/10.1038/nature.2012.11535, including a public letter by Daniel Kahneman

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  1. Data Analysis and Visualization Group, University of Konstanz, Konstanz, Germany
  2. Oxford e-Research Centre, University of Oxford, Oxford, UK
