
Introduction

  • Susannah B. F. Paletz
  • Brooke E. Auxier
  • Ewa M. Golonka
Chapter
Part of the SpringerBriefs in Complexity book series (BRIEFSCOMPLEXITY)

Abstract

As the use of social media platforms has increased, these platforms have become a new domain in information warfare. Before tackling the roots or spread of misinformation and disinformation, it is important to understand why people share any information on social media at all. This book presents a broad, multidisciplinary review of the factors that have been shown to, or might, influence information sharing on social media, regardless of the information's veracity, and organizes those factors into a theoretical framework. The act of sharing information online involves several categories of factors: sources of messages, reactions to the original message and messenger, the motivation to share, the ability to share (and the perception of that ability), and, finally, actual sharing behavior. In addition, while genuine actors may react to the original message and messenger, non-genuine actors also exist whose reactions are pre-programmed or pre-planned. We also qualitatively examined 20 fake news stories in two different languages as they appeared on social media, both to illustrate the factors affecting information propagation and to identify potential gaps in the literature.

Keywords

Social media · Fake news · Misinformation · Disinformation · Multidisciplinary · Model · Qualitative research · Russia · Information warfare · Social media sharing

Copyright information

© The Author(s), under exclusive license to Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Susannah B. F. Paletz (1)
  • Brooke E. Auxier (2)
  • Ewa M. Golonka (1)
  1. Center for Advanced Study of Language, University of Maryland, College Park, USA
  2. Philip Merrill College of Journalism, University of Maryland, College Park, USA