Bots, Trolls, Elves, and the Information War in Lithuania: Theoretical Considerations and Practical Problems

Chapter in Information Wars in the Baltic States

Abstract

The media sphere is now regularly flooded with misinformation and disinformation; some scholars refer to such outbreaks as infodemics (Bruns, A., Harrington, S., & Hurcombe, E., Media International Australia, 2020). We can no longer assume that online spaces embody democratic ideals in which participation necessarily fosters healthy debate in an authentic marketplace of ideas. Prominent in this subverted media arena are bots, inauthentic participants in the exchange of information, and trolls, the “ghostwriters” and flamethrowers of the internet, whether automated or not. This problem is felt acutely in Lithuania, where there are fears that a Russian information war could escalate into real war or other political disruption. It is therefore imperative to understand the process of this media manipulation. This chapter argues that researchers will have to adapt theories and methodologies to do so by linking the theoretical groundings of mass media influence to the concept of information warfare (Cronin, B., & Crawford, H., Information Society 15:257–263, 1999), where social media chaos can warp civic participation (Zelenkauskaite, A., Creating Chaos Online: Disinformation and Subverted Post-Publics. University of Michigan Press, 2022).

Notes

  1.

    Misinformation refers to information that is factually inaccurate, i.e. “misleading information created or disseminated without manipulative or malicious intent” (UNESCO, 2018). Misinformation is often accompanied by disinformation—where individuals, groups, and organizations deliberately aim to create confusion and discord. Disinformation is defined by “deliberate (often orchestrated) attempts to confuse or manipulate” (UNESCO, 2018).

References

  • Allport, F. H. (1920). The influence of the group upon association and thought. Journal of Experimental Psychology, 3, 159–182.

  • Anagnostopoulos, A., Kumar, R., & Mahdian, M. (2008). Influence and correlation in social networks. In Proceedings of the 14th ACM SIGKDD International Conference on knowledge discovery and data mining (pp. 7–15).

  • Bakshy, E., Hofman, J. M., Mason, W. A., & Watts, D. J. (2011). Everyone’s an influencer: Quantifying influence on Twitter. In Proceedings of the fourth ACM International Conference on web search and data mining (pp. 65–74).

  • Baños, R. A., Borge-Holthoefer, J., & Moreno, Y. (2013). The role of hidden influentials in the diffusion of online information cascades. EPJ Data Science, 2, 6.

  • Bedford, S., & Vinatier, L. (2018). Resisting the irresistible: ‘Failed opposition’ in Azerbaijan and Belarus revisited. Government and Opposition, 54(4), 686–714.

  • Bessi, A., & Ferrara, E. (2016). Social bots distort the 2016 U.S. presidential election online discussion. First Monday, 21(11). https://firstmonday.org/article/view/7090/5653

  • Botometer. (n.d.). Botometer. Accessed from https://botometer.iuni.iu.edu/#!/

  • Bruns, A., Harrington, S., & Hurcombe, E. (2020). ‘Corona? 5G? Or both?’: The dynamics of COVID-19/5G conspiracy theories on Facebook. Media International Australia. https://journals.sagepub.com/doi/full/10.1177/1329878X20946113

  • Burt, R. S. (1976). Positions in networks. Social Forces, 55(1), 93–122.

  • Canavan, J. (2005, October). The evolution of malicious IRC bots. Virus Bulletin Conference. https://www.semanticscholar.org/paper/The-evolution-of-malicious-IRC-bots-Canavan/4fb473e4741a5d9d157d075c6747a924eb22fa72

  • Chen, A. (2015, June 7). The agency. The New York Times Magazine. https://www.nytimes.com/2015/06/07/magazine/the-agency.html

  • Cohen, A. R. (1959). Some implications of self-esteem for social influence. In C. Hovland & I. L. Janis (Eds.), Personality and persuasibility (pp. 102–120). Yale University Press.

  • Cronin, B., & Crawford, H. (1999). Information warfare: Its application in military and civilian contexts. Information Society, 15(4), 257–263.

  • Daniel, F., & Millimaggi, A. (2020). On Twitter bots behaving badly: A manual and automated analysis of Python code patterns on GitHub. Journal of Web Engineering, 18(8), 1–36.

  • DebunkEU. (2020). About elves. https://debunk.eu/about-elves/

  • Deutsch, M., & Gerard, H. B. (1955). A study of normative and informational social influences upon individual judgment. The Journal of Abnormal and Social Psychology, 51(3), 629–636.

  • Dion, K. K., & Stein, S. (1978). Physical attractiveness and interpersonal influence. Journal of Experimental Social Psychology, 14(1), 97–108.

  • Dukalskis, A. (2017). The authoritarian public sphere: Legitimation and autocratic power in North Korea, Burma, and China. Routledge.

  • Edelstein, S., & Edwards, J. (2002). If you build it, they will come: Building learning communities through threaded discussions. eLearn Magazine, 4, 3.

  • Ferrara, E., Varol, O., Davis, C., Menczer, F., & Flammini, A. (2016). The rise of social bots. Communications of the ACM, 59(7), 96–104.

  • Fisher, S., & Lubin, A. (1958). Distance as a determinant of influence in a two-person serial interaction situation. The Journal of Abnormal and Social Psychology, 56(2), 230–238.

  • Friedkin, N. E. (2006). A structural theory of social influence (Vol. 13). Cambridge University Press.

  • Garbačiauskaitė-Budrienė, M. (2016, June 30). Atpažink Kremliaus trolį [Recognize a Kremlin troll]. Delfi.lt. https://www.delfi.lt/news/ringas/lit/m-garbaciauskaite-budriene-atpazink-kremliaus-troli.d?id=71642580

  • Geiger, R. S. (2018). The lives of bots. arXiv preprint arXiv:1810.09590.

  • Golovchenko, Y., Hartmann, M., & Adler-Nissen, R. (2018). State, media and civil society in the information warfare over Ukraine: Citizen curators of digital disinformation. International Affairs, 94(5), 975–994.

  • González-Bailón, S., Borge-Holthoefer, J., & Moreno, Y. (2013). Broadcasters and hidden influentials in online protest diffusion. American Behavioral Scientist, 57(7), 943–965.

  • Gorwa, R., & Guilbeault, D. (2018). Unpacking the social media bot: A typology to guide research and policy. Policy & Internet, 12(2), 225–248.

  • Herring, S. C. (1999). Interactional coherence in CMC. Journal of Computer-Mediated Communication, 4(4).

  • Herring, S. C. (2003). Gender and power in on-line communication. In J. Holmes & M. Meyerhoff (Eds.), The handbook of language and gender (pp. 202–228). Blackwell.

  • Hjorth, F., & Adler-Nissen, R. (2019). Ideological asymmetry in the reach of pro-Russian digital disinformation to United States audiences. Journal of Communication, 69(2), 168–192.

  • Hovland, C. I., & Weiss, W. (1951). The influence of source credibility on communication effectiveness. Public Opinion Quarterly, 15(4), 635–650.

  • Huhtinen, A. M., Kotilainen, N., Särmä, S., & Streng, M. (2021). Information influence in hybrid environment: Reflexive control as an analytical tool for understanding warfare in social media. In Research anthology on fake news, political warfare, and combatting the spread of misinformation (pp. 243–259). IGI Global.

  • Jamieson, K. H. (2018). Cyberwar: How Russian hackers and trolls helped elect a president, what we don’t, can’t, and do know. Oxford University Press.

  • Kalmar, P. (2010). Bootstrapping websites for classification of organization names on Twitter. In CLEF Notebook Papers/LABs/Workshops, 2(6). http://clef2010.clef-initiative.eu/resources/proceedings/clef2010labs_submission_78.pdf

  • Katz, E., & Lazarsfeld, P. (1955). Personal influence: The part played by people in the flow of mass communications. The Free Press.

  • Khan, Z., & Jarvenpaa, S. L. (2010). Exploring temporal coordination of events with Facebook.com. Journal of Information Technology, 25(2), 137–151.

  • Kuk, G. (2006). Strategic interaction and knowledge sharing in the KDE developer mailing list. Management Science, 52(7), 1031–1042.

  • LRT. (2019, September 27). More fake news target NATO’s presence in Lithuania. https://www.lrt.lt/en/news-in-english/19/1101632/more-fake-news-target-nato-s-presence-in-lithuania

  • Mandernach, B. J., Gonzales, R. M., & Garrett, A. L. (2006). An examination of online instructor presence via threaded discussion participation. Journal of Online Learning and Teaching, 2(4), 248–260.

  • Messias, J., Schmidt, L., Oliveira, R., & Benevenuto, F. (2013). You followed my bot! Transforming robots into influential users in Twitter. First Monday, 18(7). https://doi.org/10.5210/fm.v18i7.4217

  • Meyer, K. A. (2003). Face-to-face versus threaded discussions: The role of time and higher-order thinking. Journal of Asynchronous Learning Networks, 7(3), 55–65.

  • Miller, R. L., & Benz, J. J. (2008). Techniques for encouraging peer collaboration: Online threaded discussion or fishbowl interaction. Journal of Instructional Psychology, 35(1), 87–94.

  • Mislove, A., Lehmann, S., Ahn, Y. Y., Onnela, J. P., & Rosenquist, J. N. (2011). Understanding the demographics of Twitter users. Proceedings of the International AAAI Conference on Web and Social Media, 5(1). https://ojs.aaai.org/index.php/ICWSM/article/view/14168

  • Orenstein, M. A. (2019). The lands in between: Russia vs. the West and the new politics of hybrid war. Oxford University Press.

  • Pearce, K. E., Vitak, J., & Barta, K. (2018). Privacy at the margins| socially mediated visibility: Friendship and dissent in authoritarian Azerbaijan. International Journal of Communication, 12(22). https://ijoc.org/index.php/ijoc/article/view/7039

  • Pelz, D. C. (1952). Influence: A key to effective leadership in the first-line supervisor. Personnel, 29, 209–217.

  • Raven, B. H. (1958). Legitimate power, coercive power, and observability in social influence. Sociometry, 21, 83–97.

  • Raven, B. H. (1965). Social influence and power. In I. D. Steiner & M. Fishbein (Eds.), Current studies in social psychology (pp. 371–382). Holt, Rinehart & Winston.

  • Romero, D. M., Meeder, B., & Kleinberg, J. (2011, March). Differences in the mechanics of information diffusion across topics: Idioms, political hashtags, and complex contagion on Twitter. In Proceedings of the 20th international conference on world wide web (pp. 695–704). ACM.

  • Sabbagh, D. (2020, July 30). Russia-aligned hackers since 2017 have been running anti-NATO fake news campaign. The Guardian. https://www.theguardian.com/technology/2020/jul/30/russia-aligned-hackers-running-anti-nato-fake-news-campaign-report-poland-lithuania

  • Schein, E. H. (1960). Interpersonal communication, group solidarity, and social influence. Sociometry, 23(2), 148–161.

  • Schmid, P., & Betsch, C. (2019). Effective strategies for rebutting science denialism in public discussions. Nature Human Behaviour, 3(9), 931–939.

  • Sengupta, K. (2019, July). Meet the elves, Lithuania’s digital citizen army confronting Russian trolls. The Independent. https://www.independent.co.uk/news/world/europe/lithuania-elves-russia-election-tampering-online-cyber-crime-hackerskremlin-a9008931.html

  • Simons, G. (2015). Perception of Russia’s soft power and influence in the Baltic States. Public Relations Review, 41(1), 1–13.

  • Smith, M., Cadiz, J. J., & Burkhalter, B. (2000, December). Conversation trees and threaded chats. In Proceedings of the 2000 ACM conference on computer supported cooperative work (pp. 97–105). ACM.

  • Stukal, D., Sanovich, S., Tucker, J. A., & Bonneau, R. (2019). For whom the bot tolls: A neural networks approach to measuring political orientation of Twitter bots in Russia. SAGE Open, 9(2). https://journals.sagepub.com/doi/full/10.1177/2158244019827715

  • Sun, B., & Ng, V. T. (2013). Identifying influential users by their postings in social networks. Springer Berlin Heidelberg.

  • Toepfl, F. (2018). Innovating consultative authoritarianism: Internet votes as a novel digital tool to stabilize non-democratic rule in Russia. New Media & Society, 20(3), 956–972.

  • Toepfl, F., & Litvinenko, A. (2018). Transferring control from the backend to the frontend: A comparison of the discourse architectures of comment sections on news websites across the post-Soviet world. New Media & Society, 20(8), 2844–2861.

  • UNESCO. (2018). Journalism, ‘Fake News’ & disinformation: Handbook for journalism education and training. https://unesdoc.unesco.org/ark:/48223/pf0000265552/PDF/265552eng.pdf.multi

  • Valeriano, B., Jensen, B. M., & Maness, R. C. (2018). Cyber strategy: The evolving character of power and coercion. Oxford University Press.

  • Van Dijk, T. A. (1998). Ideology: A multidisciplinary approach. Sage.

  • Vasiliauskaitė, N. (2021, January 11). Laisvės propaganda [The propaganda of freedom]. Delfi.lt. https://www.delfi.lt/news/ringas/lit/nida-vasiliauskaite-laisves-propaganda.d?id=86197159

  • VDU. (2021). UNESCO-UNITWIN Medijų ir informacinio raštingumo tyrimų centras [UNESCO-UNITWIN Media and Information Literacy Research Centre]. https://pmdf.vdu.lt/mokslas/mokslo-centrai/unesco-unitwin-mediju-ir-informacinio-rastingumo-tyrimu-centras/

  • Williams, R. S., & Humphrey, R. (2007). Understanding and fostering interaction in threaded discussion. Journal of Asynchronous Learning Networks, 11(2), 129–143.

  • Woolley, S. C., & Howard, P. N. (Eds.). (2018). Computational propaganda: Political parties, politicians, and political manipulation on social media. Oxford University Press.

  • Wright, M. (1943). The influence of frustration upon the social relations of young children. Journal of Personality, 12(2), 111–122.

  • Zelenkauskaite, A. (2022). Creating chaos online: Disinformation and subverted post-publics. University of Michigan Press.

  • Zelenkauskaite, A., & Balduccini, M. (2017). “Information warfare” and online news commenting: Analyzing forces of social influence through location-based commenting user typology. Social Media + Society, 3(3). https://journals.sagepub.com/doi/full/10.1177/2056305117718468

  • Zelenkauskaite, A., & Niezgoda, B. (2017). “Stop Kremlin trolls:” Ideological trolling as calling out, rebuttal, and reactions on online news portal commenting. First Monday, 22(5). https://doi.org/10.5210/fm.v22i5.7795

Acknowledgments

The author expresses gratitude to research assistant Brandon Niezgoda, who helped systematize part of the literature review presented in this chapter.

Author information

Correspondence to Asta Zelenkauskaite.

Copyright information

© 2022 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this chapter

Cite this chapter

Zelenkauskaite, A. (2022). Bots, Trolls, Elves, and the Information War in Lithuania: Theoretical Considerations and Practical Problems. In: Chakars, J., Ekmanis, I. (eds) Information Wars in the Baltic States. The Palgrave Macmillan Series in International Political Communication. Palgrave Macmillan, Cham. https://doi.org/10.1007/978-3-030-99987-2_7
