How Do People Process Information from Automated Decision Aids: an Application of Systems Factorial Technology

Original Paper
Computational Brain & Behavior

Abstract

While many researchers have investigated the performance consequences of automated recommender systems, little research has measured how these recommendations affect the user's decision-making process. In the present work, we used the Systems Factorial Technology (SFT) framework to measure how people process information when provided with an automated recommender system. The research comprises two experiments that explore the circumstances in which people use one or all available sources of information (Experiment 1) and process information serially or in parallel (Experiment 2). In each experiment, participants completed a speeded length-judgment task with a reliable but imperfect aid. Across all conditions, participants processed information serially and likely relied on only one source of information when making decisions. Integrating information on the display and providing accurate training led to more efficient information processing. Display characteristics, performance incentives, and training all shape how people use information from automated aids, which can slow down or speed up information processing. This research sheds light on how people gather and process information when working with an automated aid and suggests how systems might be designed to improve decision performance.


Notes

  1. Note that this design does not follow the standard double-factorial paradigm template. In the standard paradigm, the salience manipulation is applied to the target stimuli, following from the original application in which the responses were target-present and target-absent (and hence salience could not apply to an absent stimulus). In this and other applications of the double-factorial paradigm to discrimination tasks, in which both choices can be closer to or further from the category boundary, the salience manipulation can apply to both choices. In principle, we have sufficient information to run separate SIC/MIC analyses on the "short" and the "long" responses, but because it is unreasonable to suppose that people would apply different architectures to each, we collapsed across the two choices. Thus, clear "short" information from both the aid and the bar length was treated the same as clear "long" information from both the aid and the bar length for the purposes of the SIC/MIC calculations, and similarly for the low-discriminability signals (see the illustrative sketch following these notes).

  2. All data and materials are available on Open Science Framework at https://osf.io/zm2hy/

  3. We would like to thank our reviewers for this suggestion.
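To make the SIC/MIC logic in Note 1 concrete, the following is a minimal Python sketch of how the two contrasts can be computed from correct-response RTs pooled over the "short" and "long" responses. This is an illustration only, not the authors' analysis pipeline (see the OSF link in Note 2 for the actual data and materials); the simulated reaction times, condition labels, and function names are assumptions for demonstration.

    import numpy as np

    def survivor(rts, t_grid):
        # Empirical survivor function S(t) = P(RT > t), evaluated on t_grid.
        rts = np.asarray(rts)
        return np.array([(rts > t).mean() for t in t_grid])

    def sic_and_mic(rt_hh, rt_hl, rt_lh, rt_ll, n_points=200):
        # Survivor interaction contrast and mean interaction contrast for the
        # four double-factorial salience conditions (H = high, L = low salience
        # of the two sources of information):
        #   SIC(t) = [S_LL(t) - S_LH(t)] - [S_HL(t) - S_HH(t)]
        #   MIC    = (M_LL - M_LH) - (M_HL - M_HH)
        all_rts = np.concatenate([rt_hh, rt_hl, rt_lh, rt_ll])
        t_grid = np.linspace(all_rts.min(), all_rts.max(), n_points)
        s_hh, s_hl = survivor(rt_hh, t_grid), survivor(rt_hl, t_grid)
        s_lh, s_ll = survivor(rt_lh, t_grid), survivor(rt_ll, t_grid)
        sic = (s_ll - s_lh) - (s_hl - s_hh)
        mic = (np.mean(rt_ll) - np.mean(rt_lh)) - (np.mean(rt_hl) - np.mean(rt_hh))
        return t_grid, sic, mic

    # Hypothetical usage with simulated correct-response RTs (in seconds),
    # pooled over "short" and "long" trials as described in Note 1. The
    # additive shifts below yield MIC near 0, the signature of serial processing.
    rng = np.random.default_rng(0)
    rt = {"HH": rng.gamma(4, 0.08, 300) + 0.3,
          "HL": rng.gamma(4, 0.10, 300) + 0.3,
          "LH": rng.gamma(4, 0.10, 300) + 0.3,
          "LL": rng.gamma(4, 0.12, 300) + 0.3}
    t_grid, sic, mic = sic_and_mic(rt["HH"], rt["HL"], rt["LH"], rt["LL"])
    print(f"MIC = {mic:.3f}")  # ~0 additive (serial); >0 over-additive; <0 under-additive

The SIC(t) curve is then inspected against the standard architecture signatures (e.g., an SIC near zero with an additive MIC is consistent with serial processing).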


Funding

This work was supported by a grant from the National Science Foundation (Grant # 2042074). Author C.M.K. has received research support from the Human Factors and Ergonomics Society's Perception and Performance Technical Group, as well as from Wright State University's Graduate Student Assembly, for the current work as part of her doctoral dissertation.

Author information

Contributions

All authors contributed to the study conception and design. Material preparation, data collection, and analysis were performed by C.M.K. as part of her doctoral dissertation work. The first draft of the manuscript was written by C.M.K., and all authors commented on previous versions of the manuscript. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Cara M. Kneeland.

Ethics declarations

Ethics Approval

This research was approved by the Institutional Review Board at Wright State University (approval number 06267).

Consent to Participate

Informed consent was obtained from all individual participants included in the study.

Competing Interests

The authors declare no competing interests.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

The majority of this research was completed while C.M.K. was attending Wright State University.

Appendix

Please see Table 4.

Table 4 Individual participants’ logistic regression results by on-time and delayed aid

About this article

Cite this article

Kneeland, C.M., Houpt, J.W. & Juvina, I. How Do People Process Information from Automated Decision Aids: an Application of Systems Factorial Technology. Comput Brain Behav 7, 106–128 (2024). https://doi.org/10.1007/s42113-023-00188-z
