The Specter of Echo Chambers—Public Diplomacy in the Age of Disinformation

  • Ilan Manor
Chapter
Part of the Palgrave Macmillan Series in Global Public Diplomacy (GPD)

Abstract

Recent years have seen growing concern over the use of propaganda and disinformation by nations such as Russia. Fueled by the phenomena of echo chambers and filter bubbles, these concerns have made diplomats increasingly wary of using digital technologies in public diplomacy activities. This chapter reviews the latest studies of algorithmic filtering on social media sites. Building on these, it explores how some nations attempt to weaponize filter bubbles to spread propaganda and disinformation. Because the flow of disinformation is not limited to social media, the chapter also identifies the tools through which disinformation and propaganda are spread across multiple digital platforms. It concludes with two case studies demonstrating how British and Israeli diplomats are attempting to fracture echo chambers and burst filter bubbles of disinformation.

Copyright information

© The Author(s) 2019

Authors and Affiliations

  1. Department of International Development, University of Oxford, Oxford, UK