Behavior Research Methods, Volume 50, Issue 2, pp. 589–603

Bayes factors for the linear ballistic accumulator model of decision-making

Abstract

Evidence accumulation models of decision-making have led to advances in several different areas of psychology. These models provide a way to integrate response time and accuracy data, and to describe performance in terms of latent cognitive processes. Testing important psychological hypotheses with cognitive models requires a method for making inferences between different versions of a model, each of which assumes that different parameters cause the observed effects. Model-based inference from noisy data is difficult, and has proven especially problematic for current model selection methods based on parameter estimation. We provide a method for computing Bayes factors through Monte Carlo integration for the linear ballistic accumulator (LBA; Brown and Heathcote, 2008), a widely used evidence accumulation model. Bayes factors are used frequently for inference with simpler statistical models, and they do not require parameter estimation. To overcome the computational burden of estimating Bayes factors by brute-force integration, we exploit general-purpose graphical processing units, and we provide free code for this approach. This makes it possible to estimate Bayes factors via Monte Carlo integration within a practical time frame. We demonstrate the method using both simulated and real data, and in simulation studies we investigate the stability of the Monte Carlo approximation and the LBA's inferential properties.
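
The core computation the abstract describes is brute-force Monte Carlo estimation of each model's marginal likelihood, with the Bayes factor given by their ratio. The sketch below illustrates that idea only; it is not the authors' GPU implementation. A toy Gaussian likelihood stands in for the LBA choice/response-time density, and the priors, sample sizes, and variable names are illustrative assumptions.

```python
# Minimal sketch (not the authors' GPU code): estimate marginal likelihoods by
# averaging the likelihood over draws from the prior, then form the Bayes factor.
import numpy as np
from scipy import stats
from scipy.special import logsumexp

rng = np.random.default_rng(1)
data = rng.normal(loc=0.3, scale=1.0, size=100)   # simulated "observed" data

def log_lik(mu):
    # Log-likelihood of the data for each parameter draw; in the paper this is
    # where the LBA likelihood is evaluated (on the GPU), not a Gaussian.
    return stats.norm.logpdf(data[:, None], loc=mu, scale=1.0).sum(axis=0)

# Model 1: mu is a free parameter with a N(0, 1) prior.
n_draws = 50_000
mu_draws = rng.normal(loc=0.0, scale=1.0, size=n_draws)     # draws from the prior
log_ml_1 = logsumexp(log_lik(mu_draws)) - np.log(n_draws)   # log of the mean likelihood

# Model 0: mu is fixed at 0, so no integration over parameters is needed.
log_ml_0 = log_lik(np.array([0.0]))[0]

bf_10 = np.exp(log_ml_1 - log_ml_0)   # Bayes factor favouring Model 1 over Model 0
print(f"log marginal likelihoods: M1 = {log_ml_1:.2f}, M0 = {log_ml_0:.2f}")
print(f"Bayes factor BF10 = {bf_10:.2f}")
```

The accuracy of this estimator depends on the number of prior draws, which is why the paper evaluates the likelihood for very many draws in parallel on graphics hardware.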

Keywords

GPU · Bayes factor · Decision-making

References

  1. Brown, S. D., & Heathcote, A. (2008). The simplest complete model of choice response time: Linear ballistic accumulation. Cognitive Psychology, 57(3), 153–178.
  2. Brown, S. D., Marley, A., Donkin, C., Heathcote, A., et al. (2008). An integrated model of choices and response times in absolute identification. NOVA: The University of Newcastle's Digital Repository.
  3. Burnham, K. P., & Anderson, D. R. (2002). Model selection and multimodel inference: A practical information-theoretic approach. New York: Springer-Verlag.
  4. Donkin, C., Averell, L., Brown, S., & Heathcote, A. (2009). Getting more from accuracy and response time data: Methods for fitting the linear ballistic accumulator. Behavior Research Methods, 41(4), 1095–1110.
  5. Donkin, C., Brown, S. D., & Heathcote, A. (2009). The overconstraint of response time models: Rethinking the scaling problem. Psychonomic Bulletin & Review, 16(6), 1129–1135.
  6. Forstmann, B. U., Anwander, A., Schäfer, A., Neumann, J., Brown, S., Wagenmakers, E.-J., & Turner, R. (2010). Cortico-striatal connections predict control over speed and accuracy in perceptual decision making. Proceedings of the National Academy of Sciences, 107, 15916–15920.
  7. Forstmann, B. U., Dutilh, G., Brown, S., Neumann, J., Von Cramon, D. Y., Ridderinkhof, K. R., & Wagenmakers, E.-J. (2008). Striatum and pre-SMA facilitate decision-making under time pressure. Proceedings of the National Academy of Sciences, 105, 17538–17542.
  8. Forstmann, B. U., Tittgemeyer, M., Wagenmakers, E.-J., Derrfuss, J., Imperati, D., & Brown, S. (2011). The speed-accuracy tradeoff in the elderly brain: A structural model-based approach. The Journal of Neuroscience, 31(47), 17242–17249.
  9. Gelman, A., Hwang, J., & Vehtari, A. (2014). Understanding predictive information criteria for Bayesian models. Statistics and Computing, 24(6), 997–1016.
  10. Gomez, P., Ratcliff, R., & Perea, M. (2007). A model of the go/no-go task. Journal of Experimental Psychology: General, 136(3), 389.
  11. Hawkins, G. E., Marley, A., Heathcote, A., Flynn, T. N., Louviere, J. J., & Brown, S. D. (2014). Integrating cognitive process and descriptive models of attitudes and preferences. Cognitive Science, 38(4), 701–735.
  12. Ho, T. C., Yang, G., Wu, J., Cassey, P., Brown, S. D., Hoang, N., et al. (2014). Functional connectivity of negative emotional processing in adolescent depression. Journal of Affective Disorders, 155, 65–74.
  13. Kass, R. E., & Raftery, A. E. (1995). Bayes factors. Journal of the American Statistical Association, 90(430), 773–795.
  14. Lee, M. D. (in press). Bayesian methods in cognitive modeling. In The Stevens' Handbook of Experimental Psychology and Cognitive Neuroscience (4th ed.).
  15. Matzke, D., Dolan, C. V., Logan, G. D., Brown, S. D., & Wagenmakers, E.-J. (2013). Bayesian parametric estimation of stop-signal reaction time distributions. Journal of Experimental Psychology: General, 142(4), 1047.
  16. Rae, B., Heathcote, A., Donkin, C., Averell, L., & Brown, S. (2014). The hare and the tortoise: Emphasizing speed can change the evidence used to make decisions. Journal of Experimental Psychology: Learning, Memory, and Cognition, 40(5), 1226.
  17. Ratcliff, R. (1978). A theory of memory retrieval. Psychological Review, 85(2), 59.
  18. Ratcliff, R., Gomez, P., & McKoon, G. (2004). A diffusion model account of the lexical decision task. Psychological Review, 111(1), 159.
  19. Ratcliff, R., & Rouder, J. N. (1998). Modeling response times for two-choice decisions. Psychological Science, 9(5), 347–356.
  20. Ratcliff, R., Thapar, A., & McKoon, G. (2007). Application of the diffusion model to two-choice tasks for adults 75–90 years old. Psychology and Aging, 22(1), 56.
  21. Ratcliff, R., Thapar, A., & McKoon, G. (2010). Individual differences, aging, and IQ in two-choice tasks. Cognitive Psychology, 60(3), 127–157.
  22. Ratcliff, R., Thapar, A., & McKoon, G. (2011). Effects of aging and IQ on item and associative memory. Journal of Experimental Psychology: General, 140(3), 464.
  23. Schwarz, G. (1978). Estimating the dimension of a model. The Annals of Statistics, 6(2), 461–464.
  24. Spiegelhalter, D. J., Best, N. G., Carlin, B. P., & Van der Linde, A. (1998). Bayesian deviance, the effective number of parameters, and the comparison of arbitrarily complex models (Research Report 98-009).
  25. Turner, B. M., Sederberg, P. B., Brown, S. D., & Steyvers, M. (2013). A method for efficiently sampling from distributions with correlated dimensions. Psychological Methods, 18(3), 368.
  26. Vehtari, A., & Gelman, A. (2014). WAIC and cross-validation in Stan. Unpublished manuscript. http://www.stat.columbia.edu/~gelman/research/unpublished/waic_stan.pdf
  27. Wagenmakers, E.-J., Lodewyckx, T., Kuriyal, H., & Grasman, R. (2010). Bayesian hypothesis testing for psychologists: A tutorial on the Savage–Dickey method. Cognitive Psychology, 60(3), 158–189.
  28. Wagenmakers, E.-J., Van der Maas, H. L., & Grasman, R. P. (2007). An EZ-diffusion model for response time and accuracy. Psychonomic Bulletin & Review, 14(1), 3–22.
  29. Wetzels, R., Grasman, R. P., & Wagenmakers, E.-J. (2010). An encompassing prior generalization of the Savage–Dickey density ratio. Computational Statistics & Data Analysis, 54(9), 2094–2102.

Copyright information

© Psychonomic Society, Inc. 2017

Authors and Affiliations

  1. School of Psychology, University of Newcastle, Callaghan, Australia