YouTube has been implicated in the transformation of users into extremists and conspiracy theorists. The alleged mechanism for this radicalizing process is YouTube’s recommender system, which is optimized to amplify and promote clips that users are likely to watch through to the end. YouTube optimizes for watch-through for economic reasons: people who watch a video through to the end are likely to then watch the next recommended video as well, which means that more advertisements can be served to them. This is a seemingly innocuous design choice, but it has a troubling side-effect. Critics of YouTube have alleged that the recommender system tends to recommend extremist content and conspiracy theories, as such videos are especially likely to capture and keep users’ attention. To date, the problem of radicalization via the YouTube recommender system has been a matter of speculation. The current study represents the first systematic, pre-registered attempt to establish whether and to what extent the recommender system tends to promote such content. We begin by contextualizing our study in the framework of technological seduction. Next, we explain our methodology. After that, we present our results, which are consistent with the radicalization hypothesis. Finally, we discuss our findings, as well as directions for future research and recommendations for users, industry, and policy-makers.
Though see Levy (2019) for an argument that, from the inside, conspiracy theorizing is not irrational. Even if this is true in general, Wolfe’s case clearly represents some sort of normative failing. And even when conspiracy theorizing is subjectively reasonable, there is often something objectively irrational about it. For discussion of this latter point, see Simion et al. (2016), among others.
The pre-registration and other details can be found at https://osf.io/cjp96/?view_only=e56fc77588194336810d41aeef04f822.
We should note that we do not mean to treat Buckey Wolfe as a representative sample of all YouTube viewers. We doubt that all viewers are equally susceptible to self-radicalization online. Rather, it seems likely that people who are already psychologically vulnerable would be most liable to radicalization and accepting conspiracy theories. It could be that such vulnerable individuals would end up radicalized even without interacting with the YouTube recommender system; we have no way of addressing the plausibility of that counterfactual. Thanks to an anonymous referee for raising this point.
In contrast with bottom-up technological seduction, top-down technological seduction occurs via the manipulative framing of online choice architecture that structures the user’s perception of the relevant option space in a way that guides them in certain prescribed directions. See, along with Alfano et al. (2018), also Weinmann et al. (2016) for discussion of digital nudging.
See King (2019) for a similar account that is framed in terms of the presumptuousness of recommender systems.
It is worth registering, though it is beyond the scope of what we can do here to explore in detail, some points of connection between (1) severe conspiracies in the sense of Klein et al. (2018) and (2) epistemic echo chambers as they have been discussed in recent work in social epistemology by Nguyen (2018; see also Jiang et al. 2019). To be in an epistemic echo chamber (for example, a religious or political cult) is to be in a social-epistemic environment in which viewpoints that run contrary to the accepted viewpoint, viz., voices from outside the echo chamber, are met with systematic distrust. Those who subscribe to severe conspiracy theories are likewise inclined to a systematic distrust of views that run contrary to the conspiracy theory, since part of what it is to accept a severe conspiracy theory is to accept that there are conspirators attempting to distort evidence against it. Accordingly, someone who subscribes to a severe conspiracy theory in the sense of Klein et al. (2018) will de facto find themselves in a specific kind of echo chamber. The converse, however, does not hold: not all echo chambers involve belief in conspiracy theories.
These prohibit nudity, incitement to violence or self-harm, hate speech, violent or graphic content, harassment, spam, misleading metadata, scams, threats, copyright violations, doxing, impersonation, and harm to minors. For more, see: https://www.youtube.com/about/policies/#community-guidelines.
This matrix is created using the same method as the previous one. The only difference is the unit of analysis, which is more fine-grained in Fig. 4: instead of calculating the similarity between every pair of topics, we perform all the calculations at the search-term level.
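The construction described above can be sketched as follows. This is an illustrative sketch only, not the authors' actual pipeline: the toy vectors stand in for whatever numerical representation each search term receives in the real analysis, and the term names are hypothetical.

```python
# Illustrative sketch (not the study's actual code): building a pairwise
# cosine-similarity matrix at the search-term level. Each row of `vectors`
# is a hypothetical feature vector for one search term.
import numpy as np

def similarity_matrix(vectors):
    """Return the matrix of pairwise cosine similarities between rows."""
    V = np.asarray(vectors, dtype=float)
    norms = np.linalg.norm(V, axis=1, keepdims=True)
    unit = V / norms            # normalise each row to unit length
    return unit @ unit.T        # dot products of unit rows = cosine similarity

# Hypothetical feature vectors for three hypothetical search terms
terms = ["term A", "term B", "term C"]
sims = similarity_matrix([[1.0, 0.2, 0.0],
                          [0.9, 0.3, 0.1],
                          [0.0, 0.1, 1.0]])
# sims[i][j] is the similarity between terms[i] and terms[j];
# the matrix is symmetric and its diagonal is 1.0 by construction.
```

Computing the matrix at the topic level would be identical, except that each row vector would aggregate over all search terms belonging to a topic.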
For instance, they tease a revelation without giving enough details to form reasonable expectations. Which three common mistakes are made in street fights? What is the secret to mastering a handgun? What strange secret, Earl Nightingale? YouTube content creators share YouTube’s interest in selling advertisements, so it is unsurprising that some of them are desperate to draw attention and curiosity with their video titles.
Does the recommender system promote more conspiracy theories than some sort of neutral baseline? We are unable to address this question in the current study because we have no way of ascertaining what a neutral baseline might be. It might be possible to compare one recommender system to another, or to compare this recommender system to an older version of the same recommender system. However, we lack access to these comparators. What we have established is that the YouTube recommender system does in fact push conspiracy theories, not that it pushes them harder than they would be pushed by an alternative. Thanks to an anonymous reviewer for raising this point.
A qualified note of optimism here owes to recent efforts by YouTube’s parent company Alphabet to limit the demographics of election ad targeting to the three general demographics of age, gender, and location. This new implementation is scheduled to go into effect in January 2020, and it reflects an (albeit minimal) effort on the part of Google to disrupt the way potentially misleading information is disseminated. See https://www.blog.google/technology/ads/update-our-political-ads-policy/. However, there is perhaps more cause for pessimism about what to expect from further tweaks to the recommender system. Insofar as the algorithm’s design function continues to aim at maximising watch-time within the system, ‘improvements’ to this design in the form of further tweaks are (provided they are effective) going to be tweaks that only further serve to recommend the very kinds of conspiratorial content that is likely to hold attention. With this in mind, the fact that YouTube’s recommender system is a ‘moving target’ not only makes it difficult to study, but also gives it the potential to transform (absent further regulation) into an even more epistemically pernicious mechanism of technological seduction than it is presently.
For additional recent discussion about the relationship between conspiracy theories and political ideology, see Cassam (2019).
The idea that such shifts might be epistemically important is key to Paul’s (2014) influential work on the epistemology of transformative experience. According to Paul, the adoption of certain kinds of perspectives requires a significant experience, one that won’t necessarily be secured by an incremental exposure to a certain kind of evidence (or apparent evidence) in favour of that perspective. This gloss of Paul’s view is of course compatible with there being some possible cases where incremental change can elicit a transformative experience; her position does not foreclose that possibility.
The monetization of conspiracy theories is, of course, not limited to YouTube, which has been our focus, and the fact that conspiracy theories are monetizable is well established. Sunstein (2014) refers to those who engage in the wider practice of profiting off of conspiracy theories as 'conspiracy entrepreneurs' (2014: p. 12); a classic example he offers is Alex Jones of InfoWars. It is worth noting the important gap between conspiracy theorising and conspiracy entrepreneurship. Though Jones (like YouTube) profits from the production of conspiratorial content, publicly available court documents cast doubt on whether he himself believes the content he profits from. See https://www.theguardian.com/us-news/2019/mar/30/alex-jones-sandy-hook-claims-psychosis. It is also worth highlighting the important gap, vis-a-vis YouTube, between the epistemic badness of (1) believing conspiracy theories and (2) facilitating the belief in such theories in others. We have demonstrated how YouTube's recommender system can easily bring about the second kind of epistemic bad, though YouTube itself, being a large corporation, presumably does not have intentional states such as beliefs.
Abalakina-Paap, M., Stephan, W. G., Craig, T., & Gregory, W. L. (1999). Beliefs in conspiracies. Political Psychology,20(3), 637–647.
Abraham, F. D., Abraham, R. H., & Shaw, C. D. (1990). A visual introduction to dynamical systems theory for psychology. Aerial Press. Retrieved from http://psycnet.apa.org/psycinfo/1991-97299-000.
Alfano, M., Carter, J. A., & Cheong, M. (2018). Technological seduction and self-radicalization. Journal of the American Philosophical Association,4(3), 298–322.
Alfano, M., Iurino, K., Stey, P., Robinson, B., Christen, M., Yu, F., et al. (2017). Development and validation of a multi-dimensional measure of intellectual humility. PLoS ONE,12(8), e0182950.
Alfano, M., & Klein, C. (2019). Trust in a social and digital world. Social Epistemology Review and Reply Collective,8(10), 1–8.
Alfano, M., & Skorburg, J. A. (2017). The embedded and extended character hypotheses. In J. Kiverstein (Ed.), Handbook of philosophy of the social mind (pp. 465–478). London: Routledge.
Alfano, M., & Skorburg, J. A. (2018). Extended knowledge, the recognition heuristic, and epistemic injustice. In D. Pritchard, J. Kallestrup, O. Palermos, & J. A. Carter (Eds.), Extended knowledge. Oxford: Oxford University Press.
Bale, J. M. (2007). Political paranoia v. Political realism: On distinguishing between bogus conspiracy theories and genuine conspiratorial politics. Patterns of Prejudice,41, 45–60.
Beer, R. D. (1995). A dynamical systems perspective on agent-environment interaction. Artificial Intelligence,72(1), 173–215.
Bird, S., Loper, E., & Klein, E. (2009). Natural language processing with Python. Newton: O’Reilly Media Inc.
Bogart, L. M., & Thorburn, S. (2005). Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? Journal of Acquired Immune Deficiency Syndromes,38(2), 213–218.
Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Computer Networks and ISDN Systems,30, 107–117.
Cassam, Q. (2019). Conspiracy theories. London: Wiley.
Chaslot, G. (2016). Exploring YouTube recommendations. Available at https://github.com/pnbt/youtube-explore.
Coady, D. (2007). Are conspiracy theories irrational? Episteme,4(2), 193–204.
Cook, J., & Lewandowsky, S. (2016). Rational irrationality: Modeling climate change belief polarization using Bayesian networks. Topics in Cognitive Science,8, 160–179.
Dentith, M. (2014). The philosophy of conspiracy theories. London: Palgrave.
Dunn, A. G., Leask, J., Zhou, X., Mandl, K. D., & Coiera, E. (2015). Associations between exposure to and expression of negative opinions about human papillomavirus vaccines on social media: An observational study. Journal of Medical Internet Research,17(6), e144.
Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review,18, 570–578.
Fleiss, J. L. (1971). Measuring nominal scale agreement among many raters. Psychological Bulletin,76(5), 378–382.
Foot, P. (1997). Virtues and vices. In R. Crisp & M. Slote (Eds.), Virtue ethics (pp. 163–177). Oxford: Oxford University Press.
Forrester, J. (1990). The seductions of psychoanalysis: Freud, Lacan, and Derrida. Cambridge: Cambridge University Press.
Gigerenzer, G. (2008). Rationality for mortals: How people cope with uncertainty. Oxford: Oxford University Press.
Gigerenzer, G., & Goldstein, D. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review,103(4), 650.
Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology.,15, 731–742.
Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS ONE,7(9), e45457.
Hall, L., Strandberg, T., Pärnamets, P., Lind, A., Tärning, B., & Johansson, P. (2013). How the polls can be both spot on and dead wrong: Using choice blindness to shift political attitudes and voter intentions. PLoS ONE,8(4), e60554.
Heersmink, R. (2017). A virtue epistemology of the internet: Search engines, intellectual virtues and education. Social Epistemology,32(1), 1–12.
Hofstadter, R. (1964). The paranoid style in American politics. Harper’s Magazine,229(1374), 77–86.
Icke, D. (1999). The biggest secret: The book that will change the world. London: Bridge of Love Publications.
Jern, A., Chang, K.-M. K., & Kemp, C. (2009). Bayesian belief polarization. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, & A. Culotta (Eds.), Advances in neural information processing systems (vol. 22, pp. 853–861).
Jiang, R., Chiappa, S., Lattimore, T., Gyorgy, A., & Kohli, P. (2019). Degenerate feedback loops in recommender systems. In Proceedings of AAAI/ACM conference on AI, ethics, and society, Honolulu, HI.
Johansson, P., Hall, L., Sikström, S., & Olsson, A. (2005). Failure to detect mismatches between intention and outcome in a simple decision task. Science,310, 116–119.
Jolley, D., & Douglas, K. M. (2014a). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS ONE,9(2), e89177.
Jolley, D., & Douglas, K. M. (2014b). The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one’s carbon footprint. British Journal of Psychology,105(1), 35–56.
Keeley, B. L. (1999). Of conspiracy theories. The Journal of Philosophy,96, 109–126.
King, O. (2019). Presumptuous aim attribution, conformity, and the ethics of artificial social cognition. Ethics and Information Technology.
Klein, C., Clutton, P., & Dunn, A. (2019). Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit’s conspiracy theory forum. PLoS ONE,14(11), 1–23.
Klein, C., Clutton, P., & Polito, V. (2018). Topic modeling reveals distinct interests within an online conspiracy forum. Frontiers in Psychology,9, 189.
Levy, N. (2017). The bad news about fake news. Social Epistemology Review and Reply Collective,6(8), 20–36.
Levy, N. (2019). Is conspiracy theorising irrational? Social Epistemology Review and Reply Collective,8(10), 65–76.
Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE,8(10), e75637.
Lewandowsky, S., Kozyreva, A., & Ladyman, J. (2020). What rationality? Comment on Levy. Social Epistemology Review and Reply Collective, 8(10).
Miller, S. (2002). Conspiracy theories: Public arguments as coded social critiques: A rhetorical analysis of the TWA flight 800 conspiracy theories. Argumentation and Advocacy,39(1), 40–56.
Meyer, M. (2019). Fake news, conspiracy, and intellectual vice. Social Epistemology Review and Reply Collective,8(10), 9–19. https://wp.me/p1Bfg0-4tp.
Nguyen, C. T. (2018). Echo chambers and epistemic bubbles. Episteme, 1–21.
O’Connor, C. (2019). The origins of unfairness: Social categories and cultural evolution. Oxford: Oxford University Press.
Oliver, J. E., & Wood, T. (2014). Medical conspiracy theories and health behaviors in the United States. JAMA Internal Medicine,174(5), 817–818.
Oreskes, N., & Conway, E. (2010). Merchants of doubt. New York: Bloomsbury.
Palermos, S. O. (2016). The dynamics of group cognition. Minds and Machines,26(4), 409–440.
Paul, L. (2014). Transformative experience. Oxford: Oxford University Press.
Pigden, C. (1995). Popper revisited, or what is wrong with conspiracy theories? Philosophy of the Social Sciences,25(1), 3–34.
Pigden, C. (2015). Conspiracy theories and the conventional wisdom revisited. In O. Loukola (Ed.), Secrets and conspiracies. Amsterdam: Rodopi.
Prentice, D. A., & Gerrig, R. J. (1999). Exploring the boundary between fiction and reality. In Shelly Chaiken & Yaacov Trope (Eds.), Dual process theories in social psychology (pp. 529–546). New York: Guilford Press.
Resnik, D. (1991). How-possibly explanations in biology. Acta Biotheoretica,39, 141–149.
Ribeiro, M. H., Ottoni, R., West, R., Almeida, V., & Meira Jr., W. (2020). Auditing radicalization pathways on YouTube. In Proceedings of the 2020 ACM conference on fairness, accountability, and transparency (FAT* '20).
Simion, M., Kelp, C., & Ghijsen, H. (2016). Norms of belief. Philosophical Issues,26(1), 374–392.
Simmons, W. P., & Parsons, S. (2005). Beliefs in conspiracy theories among African Americans: A comparison of elites and masses. Social Science Quarterly,86(3), 582–598.
Sperber, D., Clement, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic vigilance. Mind and Language,25(4), 359–393.
Sunstein, C. (2013). Deciding by default. University of Pennsylvania Law Review,162(1), 1–57.
Sunstein, C. (2014). Conspiracy theories and other dangerous ideas. New York: Simon and Schuster.
Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy,17, 202–227.
Swami, V., Chamorro-Premuzic, T., & Furnham, A. (2010). Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs. Applied Cognitive Psychology,24(6), 749–761.
Thomas, S. B., & Quinn, S. C. (1991). The Tuskegee syphilis study, 1932 to 1972: Implications for HIV education and AIDS risk education programs in the black community. American Journal of Public Health,81(11), 1498–1505.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to future worth wanting. Oxford: Oxford University Press.
Weinmann, M., Schneider, C., & vom Brocke, J. (2016). Digital nudging. Business and Information Systems Engineering,58(6), 433–436.
Wheeler, C., Green, M. C., & Brock, T. C. (1999). Fictional narratives change beliefs: Replications of Prentice, Gerrig, and Bailis (1997) with mixed corroboration. Psychonomic Bulletin & Review,6, 136–141.
Whitson, J. A., & Galinsky, A. D. (2008). Lacking control increases illusory pattern perception. Science,322(5898), 115–117.
Zajonc, R. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology,9(2), 1–27.
Work partly supported by Australian Research Council Grant DP190101507 (to M.A. and C.K.) and Templeton Foundation Grant 61387 (to M.A.).
Alfano, M., Fard, A.E., Carter, J.A. et al. Technologically scaffolded atypical cognition: the case of YouTube’s recommender system. Synthese (2020). https://doi.org/10.1007/s11229-020-02724-x
- Technological seduction
- Transformative experience
- Recommender systems
- Conspiracy theory