Technologically scaffolded atypical cognition: the case of YouTube’s recommender system

Abstract

YouTube has been implicated in the transformation of users into extremists and conspiracy theorists. The alleged mechanism for this radicalizing process is YouTube’s recommender system, which is optimized to amplify and promote clips that users are likely to watch through to the end. YouTube optimizes for watch-through for economic reasons: people who watch a video through to the end are likely to then watch the next recommended video as well, which means that more advertisements can be served to them. This is a seemingly innocuous design choice, but it has a troubling side-effect. Critics of YouTube have alleged that the recommender system tends to recommend extremist content and conspiracy theories, as such videos are especially likely to capture and keep users’ attention. To date, the problem of radicalization via the YouTube recommender system has been a matter of speculation. The current study represents the first systematic, pre-registered attempt to establish whether and to what extent the recommender system tends to promote such content. We begin by contextualizing our study in the framework of technological seduction. Next, we explain our methodology. After that, we present our results, which are consistent with the radicalization hypothesis. Finally, we discuss our findings, as well as directions for future research and recommendations for users, industry, and policy-makers.

Notes

  1.

    For news coverage see https://www.seattletimes.com/seattle-news/crime/god-told-me-he-was-a-lizard-seattle-man-accused-of-killing-his-brother-with-a-sword/ and https://www.huffpost.com/entry/proud-boy-allegedly-murders-brother-with-a-sword-thinking-hes-a-lizard_n_5c36042ee4b05b16bcfcb3d5.

  2.

    See https://www.huffpost.com/entry/proud-boy-allegedly-murders-brother-with-a-sword-thinking-hes-a-lizard_n_5c36042ee4b05b16bcfcb3d5.

  3.

    See https://threadreaderapp.com/thread/1083437810634248193.html.

  4.

    In fact, as has been argued in recent work on conspiracy theories by Cassam (2019), the 'built-in' implausibility of paradigmatic conspiracy theories means that believing them will, generally speaking, require a subversion of rational norms. See also Lewandowsky et al. (2020).

  5.

    Though see Levy (2019) for an argument that, from the inside, conspiracy theorizing is not irrational. Even if this is true in general, Wolfe’s case clearly represents some sort of normative failing. And even when conspiracy theorizing is subjectively reasonable, there is often something objectively irrational about it. For discussion of this latter point, see Simion et al. (2016), among others.

  6.

    See https://www.nytimes.com/2018/03/10/opinion/sunday/youtube-politics-radical.html and https://www.theguardian.com/technology/2018/feb/02/how-youtubes-algorithm-distorts-truth.

  7.

    See https://www.nytimes.com/2019/06/03/world/americas/youtube-pedophiles.html.

  8.

    The pre-registration and other details can be found at https://osf.io/cjp96/?view_only=e56fc77588194336810d41aeef04f822.

  9.

    For more on how-possibly explanations, see Resnik (1991). For another recent attempt to offer how-possibly explanations of troubling social phenomena, see O’Connor (2019).

  10.

    We should note that we do not mean to treat Buckey Wolfe as a representative sample of all YouTube viewers. We doubt that all viewers are equally susceptible to self-radicalization online. Rather, it seems likely that people who are already psychologically vulnerable would be most liable to radicalization and to accepting conspiracy theories. It could be that such vulnerable individuals would end up radicalized even without interacting with the YouTube recommender system; we have no way of addressing the plausibility of that counterfactual. Thanks to an anonymous referee for raising this point.

  11.

    See https://www.cnet.com/news/youtube-ces-2018-neal-mohan/.

  12.

    In contrast with bottom-up technological seduction, top-down technological seduction occurs via the manipulative framing of online choice architecture, which structures the user's perception of the relevant option space in a way that guides them in certain prescribed directions. In addition to Alfano et al. (2018), see Weinmann et al. (2016) for discussion of digital nudging.

  13.

    See King (2019) for a similar account that is framed in terms of the presumptuousness of recommender systems.

  14.

    See https://www.businessinsider.com/youtube-watch-time-vs-views-2015-7?international=true&r=US&IR=T.

  15.

    Dynamical systems, generally, are systems with significant feedback loops. For discussion see Alfano and Skorburg (2017), Palermos (2016), Abraham et al. (1990) and Beer (1995).

  16.

    It is worth registering (though exploring this in detail is beyond the scope of what we can do here) some points of connection between (1) severe conspiracies in the sense of Klein et al. (2018) and (2) epistemic echo chambers as they have been discussed in recent work in social epistemology by Nguyen (2018; see also Jiang et al. 2019). To be in an epistemic echo chamber (for example, a religious or political cult) is to be in a social-epistemic environment in which viewpoints that run contrary to the accepted viewpoint, viz., voices from outside the echo chamber, are met with systematic distrust. Those who subscribe to severe conspiracy theories are likewise inclined to a systematic distrust of views that run contrary to the conspiracy theory, because part of what it is to accept a severe conspiracy theory is to accept that there are conspirators attempting to distort evidence against the theory. Accordingly, someone who subscribes to a severe conspiracy theory in the sense of Klein et al. (2018) will de facto find themselves in a specific kind of echo chamber. The converse does not hold: not all echo chambers involve belief in conspiracy theories.

  17.

    These prohibit nudity, incitement to violence or self-harm, hate speech, violent or graphic content, harassment, spam, misleading metadata, scams, threats, copyright violations, doxing, impersonation, and harm to minors. For more, see: https://www.youtube.com/about/policies/#community-guidelines.

  18.

    See https://www.theatlantic.com/health/archive/2018/08/the-peterson-family-meat-cleanse/567613/.

  19.

    This matrix is created using the same method as the previous one. The only difference is the unit of analysis, which is more fine-grained in Fig. 4: instead of calculating the similarity between every pair of topics, we perform all the calculations at the level of individual search terms. (A minimal sketch of this kind of pairwise-similarity computation is given at the end of these notes.)

  20.

    For instance, they tease a revelation without giving enough details to form reasonable expectations. Which three common mistakes are made in street fights? What is the secret to mastering a handgun? What strange secret, Earl Nightingale? YouTube content creators share YouTube’s interest in selling advertisements, so it is unsurprising that some of them are desperate to draw attention and curiosity with their video titles.

  21.

    Does the recommender system promote more conspiracy theories than some sort of neutral baseline? We are unable to address this question in the current study because we have no way of ascertaining what a neutral baseline might be. It might be possible to compare one recommender system to another, or to compare this recommender system to an older version of the same recommender system. However, we lack access to these comparators. What we have established is that the YouTube recommender system does in fact push conspiracy theories, not that it pushes them harder than they would be pushed by an alternative. Thanks to an anonymous reviewer for raising this point.

  22.

    See https://www.youtube.com/about/press/.

  23.

    A qualified note of optimism here owes to recent efforts by YouTube's parent company Alphabet to limit election ad targeting to three general demographic categories: age, gender, and location. This change is scheduled to go into effect in January 2020, and it reflects an (albeit minimal) effort on the part of Google to disrupt the way potentially misleading information is disseminated. See https://www.blog.google/technology/ads/update-our-political-ads-policy/. However, there is perhaps more cause for pessimism about what to expect from further tweaks to the recommender system. Insofar as the algorithm continues to aim at maximising watch-time within the system, effective 'improvements' to this design will only further serve to recommend the very kinds of conspiratorial content that are likely to hold attention. With this in mind, the fact that YouTube's recommender system is a 'moving target' not only makes it difficult to study, but also gives it the potential to transform (absent further regulation) into an even more epistemically pernicious mechanism of technological seduction than it is presently.

  24.

    For additional recent discussion about the relationship between conspiracy theories and political ideology, see Cassam (2019).

  25.

    The idea that such shifts might be epistemically important is key to Paul’s (2014) influential work on the epistemology of transformative experience. According to Paul, the adoption of certain kinds of perspectives requires a significant experience, one that won’t necessarily be secured by an incremental exposure to a certain kind of evidence (or apparent evidence) in favour of that perspective. This gloss of Paul’s view is of course compatible with there being some possible cases where incremental change can elicit a transformative experience; her position does not foreclose that possibility.

  26.

    The monetization of conspiracy theories is, of course, not limited to YouTube, which has been our focus, and the fact that conspiracy theories are monetizable is well established. Sunstein (2014) refers to those who engage in the wider practice of profiting off of conspiracy theories as 'conspiracy entrepreneurs' (2014: p. 12), a classic example of whom, on his telling, is Alex Jones of InfoWars. It is worth noting the important gap between conspiracy theorising and conspiracy entrepreneurship. Though Jones (like YouTube) profits from the production of conspiratorial content, publicly available court documents cast doubt on whether he himself believes the content he profits from. See https://www.theguardian.com/us-news/2019/mar/30/alex-jones-sandy-hook-claims-psychosis. YouTube, being a large corporation, presumably does not have intentional states such as beliefs. It is also worth highlighting the important gap, vis-a-vis YouTube, between the epistemic badness of (1) believing conspiracy theories and (2) facilitating belief in such theories in others. We've demonstrated how YouTube's recommender system can easily bring about the second kind of epistemic bad, even though YouTube itself lacks beliefs.
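The following sketch illustrates the kind of term-by-term similarity matrix described in note 19. The text here does not specify the feature representation or the similarity measure used for Fig. 4, so everything below is an assumption made for illustration only: each search term is represented by a hypothetical numeric feature vector (for instance, a distribution of labels over the videos retrieved for that term), and cosine similarity is used as the pairwise measure. The term names and numbers are invented and are not the study's data.

```python
import numpy as np

# Hypothetical feature vectors for a handful of search terms. Each vector might
# stand for, e.g., the share of coder-assigned labels among the videos
# recommended for that term. Names and numbers are invented for illustration.
term_vectors = {
    "term A": np.array([0.70, 0.20, 0.10]),
    "term B": np.array([0.10, 0.30, 0.60]),
    "term C": np.array([0.05, 0.25, 0.70]),
    "term D": np.array([0.80, 0.15, 0.05]),
}

def cosine_similarity(u, v):
    """Cosine of the angle between two feature vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

terms = list(term_vectors)
n = len(terms)
similarity = np.zeros((n, n))

# Fill the symmetric term-by-term similarity matrix (the search-term-level
# analogue of a coarser topic-by-topic matrix).
for i in range(n):
    for j in range(n):
        similarity[i, j] = cosine_similarity(term_vectors[terms[i]],
                                             term_vectors[terms[j]])

# Print the labelled matrix.
print("\t" + "\t".join(terms))
for i, term in enumerate(terms):
    row = "\t".join(f"{similarity[i, j]:.2f}" for j in range(n))
    print(f"{term}\t{row}")
```

Running this prints a small labelled matrix analogous in form, though not in content, to the search-term-level heatmap in Fig. 4.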

References

  1. Abalakina-Paap, M., Stephan, W. G., Craig, T., & Gregory, W. L. (1999). Beliefs in conspiracies. Political Psychology, 20(3), 637–647.

  2. Abraham, F. D., Abraham, R. H., & Shaw, C. D. (1990). A visual introduction to dynamical systems theory for psychology. Aerial Press. Retrieved from http://psycnet.apa.org/psycinfo/1991-97299-000.

  3. Alfano, M., Carter, J. A., & Cheong, M. (2018). Technological seduction and self-radicalization. Journal of the American Philosophical Association, 4(3), 298–322.

  4. Alfano, M., Iurino, K., Stey, P., Robinson, B., Christen, M., Yu, F., et al. (2017). Development and validation of a multi-dimensional measure of intellectual humility. PLoS ONE, 12(8), e0182950.

  5. Alfano, M., & Klein, C. (2019). Trust in a social and digital world. Social Epistemology Review and Reply Collective, 8(10), 1–8.

  6. Alfano, M., & Skorburg, J. A. (2017). The embedded and extended character hypotheses. In J. Kiverstein (Ed.), Handbook of philosophy of the social mind (pp. 465–478). London: Routledge.

  7. Alfano, M., & Skorburg, J. A. (2018). Extended knowledge, the recognition heuristic, and epistemic injustice. In D. Pritchard, J. Kallestrup, O. Palermos, & J. A. Carter (Eds.), Extended knowledge. Oxford: Oxford University Press.

  8. Bale, J. M. (2007). Political paranoia v. political realism: On distinguishing between bogus conspiracy theories and genuine conspiratorial politics. Patterns of Prejudice, 41, 45–60.

  9. Beer, R. D. (1995). A dynamical systems perspective on agent-environment interaction. Artificial Intelligence, 72(1), 173–215.

  10. Bird, S., Loper, E., & Klein, E. (2009). Natural language processing with Python. Newton: O'Reilly Media Inc.

  11. Bogart, L. M., & Thorburn, S. (2005). Are HIV/AIDS conspiracy beliefs a barrier to HIV prevention among African Americans? Journal of Acquired Immune Deficiency Syndromes, 38(2), 213–218.

  12. Brin, S., & Page, L. (1998). The anatomy of a large-scale hypertextual Web search engine. Computer Networks and ISDN Systems, 30, 107–117.

  13. Cassam, Q. (2019). Conspiracy theories. London: Wiley.

  14. Chaslot, G. (2016). Exploring YouTube recommendations. Available at https://github.com/pnbt/youtube-explore.

  15. Coady, D. (2007). Are conspiracy theories irrational? Episteme, 4(2), 193–204.

  16. Cook, J., & Lewandowsky, S. (2016). Rational irrationality: Modeling climate change belief polarization using Bayesian networks. Topics in Cognitive Science, 8, 160–179.

  17. Dentith, M. (2014). The philosophy of conspiracy theories. London: Palgrave.

  18. Dunn, A. G., Leask, J., Zhou, X., Mandl, K. D., & Coiera, E. (2015). Associations between exposure to and expression of negative opinions about human papillomavirus vaccines on social media: An observational study. Journal of Medical Internet Research, 17(6), e144.

  19. Ecker, U. K. H., Lewandowsky, S., Swire, B., & Chang, D. (2011). Correcting false information in memory: Manipulating the strength of misinformation encoding and its retraction. Psychonomic Bulletin & Review, 18, 570–578.

  20. Fleiss, J. L. (1971). Measuring nominal scale agreement among many raters. Psychological Bulletin, 76(5), 378–382.

  21. Foot, P. (1997). Virtues and vices. In R. Crisp & M. Slote (Eds.), Virtue ethics (pp. 163–177). Oxford: Oxford University Press.

  22. Forrester, J. (1990). The seductions of psychoanalysis: Freud, Lacan, and Derrida. Cambridge: Cambridge University Press.

  23. Gigerenzer, G. (2008). Rationality for mortals: How people cope with uncertainty. Oxford: Oxford University Press.

  24. Gigerenzer, G., & Goldstein, D. (1996). Reasoning the fast and frugal way: Models of bounded rationality. Psychological Review, 103(4), 650.

  25. Goertzel, T. (1994). Belief in conspiracy theories. Political Psychology, 15, 731–742.

  26. Hall, L., Johansson, P., & Strandberg, T. (2012). Lifting the veil of morality: Choice blindness and attitude reversals on a self-transforming survey. PLoS ONE, 7(9), e45457.

  27. Hall, L., Strandberg, T., Pärnamets, P., Lind, A., Tärning, B., & Johansson, P. (2013). How the polls can be both spot on and dead wrong: Using choice blindness to shift political attitudes and voter intentions. PLoS ONE, 8(4), e60554.

  28. Heersmink, R. (2017). A virtue epistemology of the internet: Search engines, intellectual virtues and education. Social Epistemology, 32(1), 1–12.

  29. Hofstadter, R. (1964). The paranoid style in American politics. Harper's Magazine, 229(1374), 77–86.

  30. Icke, D. (1999). The biggest secret: The book that will change the world. London: Bridge of Love Publications.

  31. Jern, A., Chang, K.-M. K., & Kemp, C. (2009). Bayesian belief polarization. In Y. Bengio, D. Schuurmans, J. Lafferty, C. K. I. Williams, & A. Culotta (Eds.), Advances in neural information processing systems (Vol. 22, pp. 853–861).

  32. Jiang, R., Chiappa, S., Lattimore, T., Gyorgy, A., & Kohli, P. (2019). Degenerate feedback loops in recommender systems. In Proceedings of the AAAI/ACM conference on AI, ethics, and society, Honolulu, HI.

  33. Johansson, P., Hall, L., Sikström, S., & Olsson, A. (2005). Failure to detect mismatches between intention and outcome in a simple decision task. Science, 310, 116–119.

  34. Jolley, D., & Douglas, K. M. (2014a). The effects of anti-vaccine conspiracy theories on vaccination intentions. PLoS ONE, 9(2), e89177.

  35. Jolley, D., & Douglas, K. M. (2014b). The social consequences of conspiracism: Exposure to conspiracy theories decreases intentions to engage in politics and to reduce one's carbon footprint. British Journal of Psychology, 105(1), 35–56.

  36. Keeley, B. L. (1999). Of conspiracy theories. The Journal of Philosophy, 96, 109–126.

  37. King, O. (2019). Presumptuous aim attribution, conformity, and the ethics of artificial social cognition. Ethics and Information Technology.

  38. Klein, C., Clutton, P., & Dunn, A. (2019). Pathways to conspiracy: The social and linguistic precursors of involvement in Reddit's conspiracy theory forum. PLoS ONE, 14(11), 1–23.

  39. Klein, C., Clutton, P., & Polito, V. (2018). Topic modeling reveals distinct interests within an online conspiracy forum. Frontiers in Psychology, 9, 189.

  40. Levy, N. (2017). The bad news about fake news. Social Epistemology Review and Reply Collective, 6(8), 20–36.

  41. Levy, N. (2019). Is conspiracy theorising irrational? Social Epistemology Review and Reply Collective, 8(10), 65–76.

  42. Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLoS ONE, 8(10), e75637.

  43. Lewandowsky, S., Kozyreva, A., & Ladyman, J. (2020). What rationality? Comment on Levy. Social Epistemology Review and Reply Collective, 8(10).

  44. Miller, S. (2002). Conspiracy theories: Public arguments as coded social critiques: A rhetorical analysis of the TWA flight 800 conspiracy theories. Argumentation and Advocacy, 39(1), 40–56.

  45. Meyer, M. (2019). Fake news, conspiracy, and intellectual vice. Social Epistemology Review and Reply Collective, 8(10), 9–19. https://wp.me/p1Bfg0-4tp.

  46. Nguyen, C. T. (2018). Echo chambers and epistemic bubbles. Episteme, 1–21.

  47. O'Connor, C. (2019). The origins of unfairness: Social categories and cultural evolution. Oxford: Oxford University Press.

  48. Oliver, J. E., & Wood, T. (2014). Medical conspiracy theories and health behaviors in the United States. JAMA Internal Medicine, 174(5), 817–818.

  49. Oreskes, N., & Conway, E. (2010). Merchants of doubt. New York: Bloomsbury.

  50. Palermos, S. O. (2016). The dynamics of group cognition. Minds and Machines, 26(4), 409–440.

  51. Paul, L. (2014). Transformative experience. Oxford: Oxford University Press.

  52. Pigden, C. (1995). Popper revisited, or what is wrong with conspiracy theories? Philosophy of the Social Sciences, 25(1), 3–34.

  53. Pigden, C. (2015). Conspiracy theories and the conventional wisdom revisited. In O. Loukola (Ed.), Secrets and conspiracies. Amsterdam: Rodopi.

  54. Prentice, D. A., & Gerrig, R. J. (1999). Exploring the boundary between fiction and reality. In S. Chaiken & Y. Trope (Eds.), Dual process theories in social psychology (pp. 529–546). New York: Guilford Press.

  55. Resnik, D. (1991). How-possibly explanations in biology. Acta Biotheoretica, 39, 141–149.

  56. Ribeiro, M. H., Ottoni, R., West, R., Almeida, V., & Meira Jr., W. (2018). Auditing radicalization pathways on YouTube. In Woodstock'18: ACM symposium on neural gaze detection, June 03–05, 2018, Woodstock, New York. https://doi.org/10.1145/1122445.1122456.

  57. Simion, M., Kelp, C., & Ghijsen, H. (2016). Norms of belief. Philosophical Issues, 26(1), 374–392.

  58. Simmons, W. P., & Parsons, S. (2005). Beliefs in conspiracy theories among African Americans: A comparison of elites and masses. Social Science Quarterly, 86(3), 582–598.

  59. Sperber, D., Clement, F., Heintz, C., Mascaro, O., Mercier, H., Origgi, G., et al. (2010). Epistemic vigilance. Mind and Language, 25(4), 359–393.

  60. Sunstein, C. (2013). Deciding by default. University of Pennsylvania Law Review, 162(1), 1–57.

  61. Sunstein, C. (2014). Conspiracy theories and other dangerous ideas. New York: Simon and Schuster.

  62. Sunstein, C. R., & Vermeule, A. (2009). Conspiracy theories: Causes and cures. Journal of Political Philosophy, 17, 202–227.

  63. Swami, V., Chamorro-Premuzic, T., & Furnham, A. (2010). Unanswered questions: A preliminary investigation of personality and individual difference predictors of 9/11 conspiracist beliefs. Applied Cognitive Psychology, 24(6), 749–761.

  64. Thomas, S. B., & Quinn, S. C. (1991). The Tuskegee syphilis study, 1932 to 1972: Implications for HIV education and AIDS risk education programs in the black community. American Journal of Public Health, 81(11), 1498–1505.

  65. Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford: Oxford University Press.

  66. Weinmann, M., Schneider, C., & vom Brocke, J. (2016). Digital nudging. Business and Information Systems Engineering, 58(6), 433–436.

  67. Wheeler, C., Green, M. C., & Brock, T. C. (1999). Fictional narratives change beliefs: Replications of Prentice, Gerrig, and Bailis (1997) with mixed corroboration. Psychonomic Bulletin & Review, 6, 136–141.

  68. Whitson, J. A., & Galinsky, A. D. (2008). Lacking control increases illusory pattern perception. Science, 322(5898), 115–117.

  69. Zajonc, R. (1968). Attitudinal effects of mere exposure. Journal of Personality and Social Psychology, 9(2), 1–27.

Funding

Work partly supported by Australian Research Council Grant DP190101507 (to M.A. and C.K.) and Templeton Foundation Grant 61387 (to M.A.).

Author information

Contributions

Alfano designed the study, coded data, and wrote the majority of the manuscript. Fard wrote the code to collect data from YouTube and made the figures. Carter coded data and wrote and edited much of the manuscript (especially the discussion). Clutton coded data and provided literature review. Klein ran the topic models and wrote and edited much of the manuscript.

Corresponding author

Correspondence to Mark Alfano.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

About this article

Cite this article

Alfano, M., Fard, A.E., Carter, J.A. et al. Technologically scaffolded atypical cognition: the case of YouTube’s recommender system. Synthese (2020). https://doi.org/10.1007/s11229-020-02724-x

Keywords

  • Technological seduction
  • Transformative experience
  • Radicalization
  • YouTube
  • Recommender systems
  • Conspiracy theory