Indirect Causal Influence of a Single Bot on Opinion Dynamics Through a Simple Recommendation Algorithm

  • Conference paper
Complex Networks & Their Applications X (COMPLEX NETWORKS 2021)

Part of the book series: Studies in Computational Intelligence (SCI, volume 1073)


Abstract

The ability of social and political bots to influence public opinion is often difficult to estimate. Recent studies found that hyper-partisan accounts often directly interact with already highly polarised users on Twitter and are unlikely to influence the general population’s average opinion. In this study, we suggest that social bots, trolls and zealots may influence people’s views not only via direct interactions (e.g. retweets, at-mentions and likes) but also via indirect causal pathways mediated by platforms’ content recommendation systems. Using a simple agent-based opinion-dynamics simulation, we isolate the effect of a single bot – representing only 1% of the population – on the average opinion of Bayesian agents when we remove all direct connections between the bot and human agents. We compare this experimental condition with an identical baseline condition where such a bot is absent. We use the same random seed in both simulations so that all other conditions remain identical. Results show that, even in the absence of direct connections, the mere presence of the bot is sufficient to shift the average population opinion. Furthermore, we observe that the presence of the bot significantly affects the opinion of almost all agents in the population. Overall, these findings offer a proof of concept that bots and hyper-partisan accounts can influence average population opinions not only by directly interacting with human accounts but also by shifting platforms’ recommendation engines’ internal representations.
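The paired-seed design described in the abstract can be sketched in a few lines. Everything below (agent count, step size, the mean-tracking recommender, the fixed bot opinion) is an illustrative assumption, not the paper's actual model:

```python
import random

def run_simulation(seed, include_bot, n_agents=100, n_steps=200):
    """Toy sketch of the paired-seed design: two runs identical except for the bot.

    Agents hold opinions in [-1, 1]. A crude recommender keeps an internal
    estimate (the pool mean) and recommends the *human* agent closest to that
    estimate, so the bot never interacts with anyone directly -- it can only
    shift the recommender's internal state. All dynamics here are illustrative.
    """
    rng = random.Random(seed)  # identical seed in both conditions
    opinions = [rng.uniform(-1, 1) for _ in range(n_agents)]
    bot_opinion = 1.0          # zealot: fixed extreme opinion, never updates
    for _ in range(n_steps):
        pool = opinions + ([bot_opinion] if include_bot else [])
        estimate = sum(pool) / len(pool)  # the bot shifts this internal estimate
        source = min(range(n_agents), key=lambda j: abs(opinions[j] - estimate))
        for i in range(n_agents):
            if i != source:
                # small step toward the recommended human's opinion
                opinions[i] += 0.01 * (opinions[source] - opinions[i])
    return sum(opinions) / n_agents

baseline = run_simulation(seed=42, include_bot=False)
treatment = run_simulation(seed=42, include_bot=True)
indirect_effect = treatment - baseline  # influence without any direct contact
```

Because both runs share the random seed, any difference between `treatment` and `baseline` is attributable solely to the bot's presence in the recommender's candidate pool.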


References

  1. Bessi, A., Ferrara, E.: Social bots distort the 2016 US Presidential Election online discussion. SSRN 21, 14 (2016)

  2. Lerman, K., Yan, X., Wu, X.-Z.: The ‘Majority Illusion’ in social networks. PLoS ONE 11, e0147617 (2016)

  3. Broniatowski, D.A., et al.: Weaponized health communication: Twitter bots and Russian trolls amplify the vaccine debate. Am. J. Public Health 108, 1378–1384 (2018)

  4. Stewart, A.J., et al.: Information gerrymandering and undemocratic decisions. Nature 573, 117–121 (2019)

  5. Paul, C., Matthews, M.: The Russian ‘firehose of falsehood’ propaganda model. RAND Corporation, pp. 2–7 (2016)

  6. Shao, C., et al.: The spread of low-credibility content by social bots. Nat. Commun. 9, 4787 (2018)

  7. Stella, M., Ferrara, E., De Domenico, M.: Bots increase exposure to negative and inflammatory content in online social systems. Proc. Natl. Acad. Sci. U.S.A. 115, 12435–12440 (2018)

  8. Howard, P.: How political campaigns weaponize social media bots. IEEE Spectrum (2018)

  9. Ferrara, E., Varol, O., Davis, C., Menczer, F., Flammini, A.: The rise of social bots. Commun. ACM 59, 96–104 (2016)

  10. Ledford, H.: Social scientists battle bots to glean insights from online chatter. Nature 578, 17 (2020)

  11. Hurtado, S., Ray, P., Marculescu, R.: Bot detection in Reddit political discussion. In: Proceedings of the Fourth International Workshop on Social Sensing, pp. 30–35. Association for Computing Machinery (2019)

  12. Linvill, D.L., Warren, P.L.: Troll factories: the internet research agency and state-sponsored agenda building. Resource Centre on Media Freedom in Europe (2018)

  13. Aral, S., Eckles, D.: Protecting elections from social media manipulation. Science 365, 858–861 (2019)

  14. Tucker, J.A., et al.: Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature (2018). https://doi.org/10.2139/ssrn.3144139

  15. Vosoughi, S., Roy, D., Aral, S.: The spread of true and false news online. Science 359, 1146–1151 (2018)

  16. Guess, A., Nagler, J., Tucker, J.: Less than you think: prevalence and predictors of fake news dissemination on Facebook. Sci. Adv. 5, eaau4586 (2019)

  17. Allen, J., Howland, B., Mobius, M., Rothschild, D., Watts, D.J.: Evaluating the fake news problem at the scale of the information ecosystem. Sci. Adv. 6, eaay3539 (2020)

  18. Bail, C.A., et al.: Assessing the Russian Internet Research Agency’s impact on the political attitudes and behaviors of American Twitter users in late 2017. Proc. Natl. Acad. Sci. 117, 243–250 (2020)

  19. Zaller, J.R.: The Nature and Origins of Mass Opinion. Cambridge University Press, Cambridge (1992)

  20. Endres, K., Panagopoulos, C.: Cross-pressure and voting behavior: evidence from randomized experiments. J. Polit. 81, 1090–1095 (2019)

  21. Kalla, J.L., Broockman, D.E.: The minimal persuasive effects of campaign contact in general elections: evidence from 49 field experiments. Am. Polit. Sci. Rev. 112, 148–166 (2018)

  22. Bail, C.A., et al.: Exposure to opposing views on social media can increase political polarization. Proc. Natl. Acad. Sci. 115, 9216–9221 (2018)

  23. Pescetelli, N., Yeung, N.: The effects of recursive communication dynamics on belief updating. Proc. Roy. Soc. B: Biol. Sci. 287, 20200025 (2020)

  24. González-Bailón, S., De Domenico, M.: Bots are less central than verified accounts during contentious political events. Proc. Natl. Acad. Sci. U.S.A. 118, 1–8 (2021)

  25. Flache, A., et al.: Models of social influence: towards the next frontiers. J. Artif. Soc. Soc. Simul. 20, 1–31 (2017)

  26. Deffuant, G., Neau, D., Amblard, F., Weisbuch, G.: Mixing beliefs among interacting agents. Adv. Complex Syst. 3, 87–98 (2000)

  27. DeGroot, M.H.: Reaching a consensus. J. Am. Stat. Assoc. 69, 118 (1974)

  28. Friedkin, N.E., Johnsen, E.C.: Social influence and opinions. J. Math. Sociol. 15, 193–206 (1990)

  29. Bakshy, E., Messing, S., Adamic, L.A.: Exposure to ideologically diverse news and opinion on Facebook. Science 348, 1–4 (2015)

  30. Das, A., Datar, M., Garg, A., Rajaram, S.: Google news personalization: scalable online collaborative filtering. In: Proceedings of the 16th International Conference on World Wide Web, pp. 271–280 (2007)

  31. Pipergias Analytis, P., Barkoczi, D., Lorenz-Spreen, P., Herzog, S.: The structure of social influence in recommender networks. In: Proceedings of The Web Conference 2020, pp. 2655–2661. Association for Computing Machinery (2020)

  32. Lazer, D.: Studying human attention on the Internet. Proc. Natl. Acad. Sci. U.S.A. 117, 21–22 (2020)

  33. Pescetelli, N., Yeung, N.: The role of decision confidence in advice-taking and trust formation. J. Exp. Psychol. Gen. (2020). https://doi.org/10.1037/xge0000960

  34. Harris, A.J.L., Hahn, U., Madsen, J.K., Hsu, A.S.: The appeal to expert opinion: quantitative support for a Bayesian network approach. Cogn. Sci. 40, 1496–1533 (2016)

  35. Pescetelli, N., Rees, G., Bahrami, B.: The perceptual and social components of metacognition. J. Exp. Psychol. Gen. 145, 949–965 (2016)

  36. Lazer, D.M.J., et al.: The science of fake news. Science 359, 1094–1096 (2018)

  37. Karan, N., Salimi, F., Chakraborty, S.: Effect of zealots on the opinion dynamics of rational agents with bounded confidence. Acta Phys. Pol. B 49, 73 (2018)

  38. Yildiz, E., Acemoglu, D., Ozdaglar, A.E., Saberi, A., Scaglione, A.: Discrete opinion dynamics with stubborn agents. SSRN Electron. J. https://doi.org/10.2139/ssrn.1744113

  39. Ali, M., et al.: Discrimination through optimization: how Facebook’s ad delivery can lead to biased outcomes. Proc. ACM Hum.-Comput. Interact. 3, 1–30 (2019)

  40. Hannak, A., et al.: Measuring personalization of web search. In: Proceedings of the 22nd International Conference on World Wide Web, pp. 527–538. Association for Computing Machinery (2013)

  41. Robertson, R.E., Lazer, D., Wilson, C.: Auditing the personalization and composition of politically-related search engine results pages. In: Proceedings of the 2018 World Wide Web Conference on World Wide Web – WWW 2018, pp. 955–965. ACM Press (2018)

  42. Ricci, F., Rokach, L., Shapira, B.: Introduction to recommender systems handbook. In: Ricci, F., Rokach, L., Shapira, B., Kantor, P.B. (eds.) Recommender Systems Handbook, pp. 1–35. Springer, Boston, MA (2011). https://doi.org/10.1007/978-0-387-85820-3_1

  43. Das, A.S., Datar, M., Garg, A., Rajaram, S.: Google news personalization: scalable online collaborative filtering. In: Proceedings of the 16th International Conference on World Wide Web, pp. 271–280. Association for Computing Machinery (2007)

  44. Koren, Y., Bell, R.: Advances in collaborative filtering. In: Ricci, F., Rokach, L., Shapira, B. (eds.) Recommender Systems Handbook, pp. 77–118. Springer, Boston (2015). https://doi.org/10.1007/978-1-4899-7637-6_3

  45. Analytis, P.P., Barkoczi, D., Herzog, S.M.: Social learning strategies for matters of taste. Nat. Hum. Behav. 2, 415–424 (2018)

Author information

Correspondence to Niccolo Pescetelli.

Supplementary Materials

Different Engagement Functions

In the main text, we presented results assuming that agents are more likely to engage with content when the distance between their own opinion and the other agent’s opinion is large. Here we examine the sensitivity of our results to other engagement functions. Figure S1 reports the same analyses as Fig. 2 in the main text, but assumes instead that agents are more likely to engage when the observed content is similar to their own opinion. In Fig. S2, we study a bimodal engagement function, in which agents are more likely to engage with very similar or very dissimilar content and less likely to engage with content that is neither too similar nor too dissimilar.
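The three regimes (dissimilarity-seeking in the main text, homophilous in Fig. S1, bimodal in Fig. S2) can be sketched as maps from opinion distance to engagement probability. The specific functional forms below are our own illustrative choices, not the paper's definitions:

```python
import math

# Opinion distance d is assumed to lie in [0, 2] for opinions in [-1, 1].

def engagement_dissimilar(d):
    """Main-text assumption: engagement rises with opinion distance."""
    return min(1.0, d / 2.0)

def engagement_homophilous(d):
    """Fig. S1: engagement is highest for content close to one's own opinion."""
    return math.exp(-d)

def engagement_bimodal(d, midpoint=1.0):
    """Fig. S2: high engagement for very similar or very dissimilar content,
    low engagement in between (U-shaped around the midpoint)."""
    return min(1.0, abs(d - midpoint) / midpoint)
```

Any monotone increasing, monotone decreasing, or U-shaped map would serve the same roles; only the qualitative shape matters for the sensitivity analysis.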

Fig. S1.

Homophilous engagement function. Agents are more likely to engage with content in their feed that is closer to their own opinions. (a) Population’s mean opinion; (b) population’s mean engagement; (c) number of agents influenced by the bot; (d) agents’ mean opinion shift. The analysis shows that the results reported in the main text may be sensitive to the specific engagement function agents use to choose which content items they engage with.

Fig. S2.

Bimodal engagement function. Agents’ engagement probability is bimodal: it is high both for content close to the target agent’s private opinion and for content distant from it, while content falling between these two extremes is less likely to generate engagement. (a–d) Population’s mean opinion, population’s mean engagement, number of agents influenced by the bot and mean opinion change, as a function of time. Notice that, contrary to the main-text results (Fig. 2), no difference emerges between conditions here. This is likely due to the bimodal engagement function: agents using it were more likely to engage with content similar to their own opinion, and according to the Bayesian update rule (Eq. 4) this preference for similar content generates escalation dynamics that lead agents to reinforce their own opinions [23] and thus remain deaf to the bot’s different opinion.
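The "deafness" argument can be illustrated with a toy update loop. This is a hedged sketch: the threshold engagement rule and convex update below stand in for the paper's Eq. 4, which is not reproduced in this chunk:

```python
import random

def final_opinion(engages, seed=1, steps=500):
    """An agent that updates only on content it engages with.

    `engages(d)` maps opinion distance d to an engagement probability.
    Content opinions are drawn uniformly from [-1, 1]; the convex update
    step is an illustrative stand-in for the paper's Bayesian rule.
    """
    rng = random.Random(seed)
    opinion = 0.5
    for _ in range(steps):
        content = rng.uniform(-1, 1)
        if rng.random() < engages(abs(opinion - content)):
            opinion += 0.1 * (content - opinion)
    return opinion

# A homophilous agent only ever updates on nearby content, so a distant bot
# opinion (say +1.0 when the agent sits at -0.5) rarely enters its update
# stream, whereas an agent engaging with everything remains reachable.
homophilous = final_opinion(lambda d: 1.0 if d < 0.3 else 0.0)
open_minded = final_opinion(lambda d: 1.0)
```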


Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Pescetelli, N., Barkoczi, D., Cebrian, M. (2022). Indirect Causal Influence of a Single Bot on Opinion Dynamics Through a Simple Recommendation Algorithm. In: Benito, R.M., Cherifi, C., Cherifi, H., Moro, E., Rocha, L.M., Sales-Pardo, M. (eds) Complex Networks & Their Applications X. COMPLEX NETWORKS 2021. Studies in Computational Intelligence, vol 1073. Springer, Cham. https://doi.org/10.1007/978-3-030-93413-2_3


  • DOI: https://doi.org/10.1007/978-3-030-93413-2_3

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-93412-5

  • Online ISBN: 978-3-030-93413-2

  • eBook Packages: Engineering (R0)
