
A Formative Study of Interactive Bias Metrics in Visual Analytics Using Anchoring Bias

  • Emily Wall
  • Leslie Blaha
  • Celeste Paul
  • Alex Endert
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11747)

Abstract

Interaction is the cornerstone of how people perform tasks and gain insight in visual analytics. However, people’s inherent cognitive biases impact their behavior and decision making during their interactive visual analytic process. Understanding how bias impacts the visual analytic process, how it can be measured, and how its negative effects can be mitigated is a complex problem space. Nonetheless, recent work has begun to approach this problem by proposing theoretical computational metrics that are applied to user interaction sequences to measure bias in real-time. In this paper, we implement and apply these computational metrics in the context of anchoring bias. We present the results of a formative study examining how the metrics can capture anchoring bias in real-time during a visual analytic task. We present lessons learned in the form of considerations for applying the metrics in a visual analytic tool. Our findings suggest that these computational metrics are a promising approach for characterizing bias in users’ interactive behaviors.
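To make the idea of real-time computational bias metrics concrete, the sketch below shows one way such metrics could be computed over a stream of user interactions. This is a minimal illustration only, assuming simple coverage- and distribution-style measures over interacted data points; the function names, thresholds, and formulas are hypothetical and do not reproduce the metrics implemented in the paper.

```python
# Illustrative sketch: coverage- and distribution-style interaction metrics.
# These are hypothetical examples, not the authors' implementation.
from collections import Counter

def coverage_metric(interacted_ids, all_ids):
    """Fraction of the dataset the user has interacted with so far.

    A persistently low value as interactions accumulate may suggest that
    attention is anchored on a small subset of the data.
    """
    return len(set(interacted_ids)) / len(all_ids)

def distribution_metric(interacted_ids, all_ids):
    """Deviation of interaction frequencies from a uniform distribution.

    Returns a value in [0, 1]; higher values mean interactions are
    concentrated on a few items (a possible sign of anchoring).
    """
    counts = Counter(interacted_ids)
    total = sum(counts.values())
    if total == 0:
        return 0.0
    uniform = 1.0 / len(all_ids)
    observed = [counts.get(i, 0) / total for i in all_ids]
    # Total variation distance between observed and uniform distributions.
    return 0.5 * sum(abs(p - uniform) for p in observed)

# Example: a stream of clicks on data points, re-evaluated after each interaction
# to emulate real-time monitoring.
all_points = list(range(20))
clicks = [3, 3, 4, 3, 5, 3, 3, 4]
for step in range(1, len(clicks) + 1):
    window = clicks[:step]
    print(step,
          round(coverage_metric(window, all_points), 2),
          round(distribution_metric(window, all_points), 2))
```

In a visual analytic tool, such values would be recomputed after every logged interaction, so the system can surface a warning or visual cue while the analysis is still in progress rather than after the fact.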

Keywords

Cognitive bias · Anchoring bias · Visual analytics

Notes

Acknowledgements

This research is sponsored in part by the U.S. Department of Defense through the Pacific Northwest National Laboratory, the Siemens FutureMaker Fellowship, and NSF IIS-1813281. The views and conclusions contained in this document are those of the authors and should not be interpreted as representing the official policies, either expressed or implied, of the U.S. Government.

Supplementary material

Supplementary material 1 (mp4, 4512 KB)
Supplementary material 2 (png, 50 KB)
Supplementary material 3 (png, 31 KB)


Copyright information

© IFIP International Federation for Information Processing 2019

Authors and Affiliations

  • Emily Wall (1)
  • Leslie Blaha (2)
  • Celeste Paul (3)
  • Alex Endert (1)
  1. Georgia Tech, Atlanta, USA
  2. Air Force Research Laboratory, Pittsburgh, USA
  3. U.S. Department of Defense, Washington, D.C., USA
