Introduction

The Council of the European Union announced a ban on the Russian media outlets RT and Sputnik in March 2022, just weeks after Russia’s full-scale invasion of Ukraine. The invasion has cost many thousands of lives, created millions of Ukrainian refugees, and caused enormous material damage, but it has also intensified Russia’s disinformation activities in Ukraine and the rest of Europe. The two Russian state international media organizations, Sputnik and RT, have been pinpointed as key actors. In a press release announcing the ban, the Council noted that:

the EU will urgently suspend the broadcasting activities of Sputnik and RT/Russia Today… in the EU or directed at the EU until the aggression to Ukraine is put to an end, and until the Russian Federation and its associated outlets cease to conduct disinformation and information manipulation actions against the EU and its member states. (Council of the European Union, 2022)

The EU High Representative for Foreign Affairs and Security Policy, Josep Borrell, added that:

Systematic information manipulation and disinformation by the Kremlin is applied as an operational tool in its assault on Ukraine. It is also a significant and direct threat to the Union’s public order and security. Today, we are taking an important step against Putin’s manipulation operation and turning off the tap for Russian state-controlled media in the EU. We have already earlier put sanctions on leadership of RT, including the editor-in-chief Simonyan, and it is only logical to also target the activities the organisations have been conducting within our Union. (Council of the European Union, 2022)

Whereas most scholars of propaganda and disinformation would assert that the international broadcasters are used to further Russian strategic interests, many also argue that, to achieve these aims, Sputnik and RT use their news coverage to spread disinformation in order to inflict harm on Western and European societies. In its justification for the ban, the European Union states that Russia has undertaken what it refers to as “propaganda actions” through various state media outlets, and that “Such actions constitute a significant and direct threat to the Union’s public order and security” and “are essential and instrumental in bringing forward and supporting the aggression against Ukraine, and for the destabilization of its neighboring countries”. This book argues that such security threats are promoted through everyday disinformation by way of constant and continuous Russian news coverage intended to cause harm. Disinformation should therefore not be conceived as separate campaigns or a sudden crisis limited in time and spurred by critical, dramatic situations. It is an everyday practice that feeds on current political and social events, be they normal or extraordinary, and poses long-term rather than short-term security threats. Everyday disinformation describes a practice in which the stability of a society is shaken, its constitutional pillars are torn down and its legitimacy weakened. Among the tools used to accomplish this are international news media, which are constructed for practices of the everyday and have the capacity to produce harmful narratives as part of their strategic communications.

The reason for the EU ban is the use of disinformation by Russia in its attacks on Ukraine. Within that context, however, the EU includes the threat that RT and Sputnik pose to European states (Council of the European Union, 2022). The EU has since been joined in the blocking of RT and Sputnik by Google, YouTube, Twitter, and Facebook (France24, 2022; RFE/RL, 2022).

Although censoring Russian state media outlets is an acute and dramatic measure that cuts to the core of EU norms and principles, scholars of propaganda and disinformation have for many years observed the manipulative and biased coverage of such media in providing international or foreign audiences with denigrating messages about their own countries (Wagnsson & Barzanje, 2021; Hoyle et al., 2021; Chatterje-Doody & Crilley, 2019; Bennett & Livingstone, 2018; Yablokov, 2015). These media are seen as active participants in information warfare, connected to Russia’s military and security forces with the aim of destabilizing foreign (especially Western) societies by creating mistrust and hostility between the population, the government, and state institutions. Moreover, this is an enduring and long-term threat. Official Russian security documents state that information is part of modern conflict (Russian Government, 2014), and recent National Security Strategies, which focus on information wars on a global level, are explicit in their views on information as a tool for national security (Russian Government, 2015; Russian Government, 2016; Russian Government, 2021).

This introductory chapter elaborates further on how disinformation poses a security threat by way of harmful narratives and describes the kinds of security problems disinformation might pose. It also discusses what is meant by disinformation and why the concept provides a useful avenue for exploring Russian activity.

Disinformation as a Security Threat

This book is situated at the interface between international relations, security studies, and journalism, media, and communication studies. It takes as its point of departure the notion that disinformation is a security threat to liberal democracies. As information and news reporting have taken on increasingly pertinent roles in security and defense politics, and as soft and sharp power strategies (Glazunova et al., 2022) become an increasingly integral part of national security strategies, use of the media as a weapon to weaken an adversary has come into sharper focus.

Disinformation does not target geographical borders or national sovereignty, but liberal democratic principles such as governance and the rule of law, as well as institutions, elections, and public trust in the government. The damage caused by disinformation therefore results in a weakening of democratic societies from within. It aims to slowly and gradually wear down social and political stability and resilience. In this sense, it is an existential threat to the democratic nation state, but without militaristic features and without the element of surprise that characterizes a military intervention in a foreign country.

This disinformation is carried out by the use of multi-language international news channels that are controlled by the state. These media organizations thus serve as tools for extending Russian power and influence over other states. In Russia, disinformation is widely perceived as an expression of a zero-sum game where the “insecurity of others makes Russia itself more secure” (Giles, 2019, p. 23, see also Hoyle et al., 2021, p. 2; Szostek, 2020, p. 2729). An information strategy that can achieve this involves constructing dominant discourses or master narratives about the West, and this is done through news journalism (Szostek, 2020, p. 2729).

Different terminology has been used to conceptualize and comprehend this type of threat. Wagnsson (2020) argues that while malign information influencing could be seen as a type of soft power in the sense that it is not using military capabilities and aims to “win hearts and minds”, it is different from soft power because it is centered on negative views of the target state and aims to denigrate and cause harm. She prefers the concept of sharp power, a term coined by Walker to refer to the ability to weaken an enemy by asserting control over its media, academic, publishing, and cultural institutions to inflict damage on them and their production of meaning and knowledge (Walker, 2018, p. 13). Pomerantsev argues that it is about confusing rather than convincing (Pomerantsev, 2015). In other words, it is not about imposing an ideological or political package of ideas on the enemy state, but encouraging doubt and mistrust in existing values and ideas, and in the institutions that maintain them, while also offering critical and skeptical audiences a platform where they can grow and amplify their discontent. In their study, Orttung and Nelson (2019, pp. 77–8) show how Russian disinformation makes use of the audience’s dissatisfaction with domestic media and offers an alternative, while Bennett and Livingstone (2018) highlight how Russian disinformation has gained inroads into target states by way of national media. Others refer to this as weaponized information, information warfare, or information-psychological warfare (Ramsay & Robertshaw, 2019, p. 12 with reference also to Giles, 2016). The latter term stresses the aim of steering not only information flows, but also people’s cognition and mindsets.

The threat therefore concerns people’s exposure to the misrepresentation of facts and to story constructions about the world intended to cause harm to society by twisting the truth, sowing mistrust in public institutions and exposing the negative consequences of societies run by “naïve liberal governments” or “incompetent elites”.

However, disinformation also aims to target deeper democratic competencies and alter people’s relationship with information and facts in general, as well as their ability to interpret information and their capacity for critical thinking. It calls on citizens in the target country to mistrust the state, the government, and its institutions, and to be skeptical about all information. It is an attempt to break down faith and trust not just in the nation or the government, but in the very basis of rational and critical thinking, and to make people doubt their own ability to make sense of social reality or play a part in it (see Bjola & Papadakis, 2020, p. 2). Most scholars agree that the security threat that disinformation poses is long term and targets not so much a particular political outcome as a state’s capacity for democratic governance. These operations, write Lemke and Habegger (2022), “penetrate the existing networks and erode relations of trust and authority over time…. Contemporary [Russian] disinformation does not aim for a powerful political knockout blow. Rather, it is designed to gradually weaken an opponent’s social and political mobilization capacity”.

The role and significance assigned to international broadcasting in matters of security have undergone various changes in the past decade, especially with regard to Russia. However, the continuity in the use of information for security purposes going back to the Cold War and the Soviet era is worth noting. At that time, it was the Soviet intelligence agencies that were engaged in information operations to weaken the West. This involved leaking false information, spreading false rumors, and creating forgeries with the objective of furthering the Soviet Union’s foreign policy goals. According to Rid (2020), present-day Russian disinformation is linked historically to what were called active measures during the Cold War: a strategy of causing harm to foreign states by way of disinformation, which sought to erode the political system slowly and gradually in a way that made it difficult to identify or blame external actors. Then, as now, Rid notes, the Russian strategy was to destabilize and delegitimize foreign political systems.

Just as scholars of propaganda see disinformation as a tool, so Rid argues that disinformation is the way in which active measures continue to be practiced, albeit spurred on by the internet. The internet has brought about changes to active measures by giving new tools to what Rid calls “old-school disinformation professionals” (Rid, 2020, p. 13). According to Rid, the consequences of active measures during the Cold War resemble those of today: the replacement of fact-based understanding with emotion, and the facilitation of a dichotomization between us and them (Rid, 2020, p. 11).

A further strategy of Soviet propaganda with some bearing on the disinformation of today is reflexive control. Doroshenko and Lukito (2021) explain that:

Reflexive control happens when the controlling actor presents an enemy with information that leads the enemy to a desired decision (Leonenko, 1995).…The chief task of reflexive control is to find and exploit weak links in information assessment during decision making. Russian disinformation strategies are not meant to just present falsehoods and confuse adversaries. Rather, the goal is to spread disinformation that would lead adversaries to make erroneous decisions favoring Russia, the controlling agent. (Doroshenko & Lukito, 2021, p. 4665)

This was therefore a method of influencing the target actors’ perceptions so that they became aligned with those of the controlling actor, resulting in actions beneficial to Russia (see Ramsay & Robertshaw, 2019, p. 112, also citing Thomas, 2004). This strategy is similar to the “strategy of direction” discussed by Wagnsson and Barzanje (2021), which they see as “a strategy of guiding the other away from an undesired posture, policy or behaviour, towards a preferred one through ‘carrots’ rather than ‘sticks’” (2021, p. 251). Russia makes direction efforts to influence the Other to take a course of action that is advantageous to Russia by means of “tacit inducement” (Wagnsson & Barzanje, 2021, p. 251). A comparative study of the Nordic states particularly noted the direction of Sputnik coverage of Finland, in which Finland was depicted as a global player as a result of its special relationship with Russia (Deverell et al., 2021, p. 24).

However, there are also alternative views on how to assess Russia’s state media and journalism, the extent to which they deviate from liberal media cultures, and whether their journalistic style should be acknowledged as critical, investigative journalism that seeks to scrutinize power structures. The Russian state broadcaster, RT, is a good example of an international media institution that is talked of by some as a public diplomacy channel—or an expression of media globalization similar to the US CNN International—and by others as a weapon of disinformation used in Russia’s information war against the West (see Szostek, 2020). Media scholars have until recently regarded RT as a channel that aims to defend national interests, territories, traditions, and identities (Widholm, 2016, p. 196). In this way, it has been argued, RT is similar to other international television channels. It also aims to be a “Russian voice in a global media landscape”, providing an alternative view of the world to that of Western broadcasters. However, a major critique of the channel is that it serves Putin’s interests through its propagandistic content (Widholm, 2016, p. 196). Widholm writes about this duality as a propaganda paradox: what is critical journalism to one is propaganda to another. He refers to the slogan “Question more”, launched by the channel in 2010, to stress its aim of challenging Western media (Widholm, 2016, p. 197).

At this time, however, there is little doubt about the antagonistic intentions of Russian state media such as RT and Sputnik. It was the Russian military invasion of Ukraine that led to the drastic censorship measures, but the decision to ban the media outlets also highlights the significance the EU and other institutions, including global media platforms, now attribute to the role of disinformation. Media manipulation and disinformation strategies are now labelled “actions [that] constitute a significant and direct threat to the Union’s public order and security” and “essential and instrumental in bringing forward and supporting the aggression against Ukraine, and for the destabilization of its neighboring countries” (European Council statement cited by Cabrera Blazquez, 2022).

How Disinformation Has Denigrated Sweden

This study asks what disinformation about Sweden from Sputnik and RT looks like. It adopts a narrative approach, which means that I have sought to trace what stories were told and how. The study thus analyzes the narrative logic of the propaganda and disinformation narratives promulgated by Russian state-sponsored media platforms and aims to show how these media have sought to denigrate Sweden. The study builds on previous work by Wagnsson and Barzanje (2021), which analyzes Sputnik news coverage about Sweden in 2014–2018. The current study employs the same method and analyzes coverage from July 2019 to January 2021. The news material analyzed also includes a small sample of RT television news and talk shows. Moreover, where the previous study included all news about Sweden during the chosen period, I focus attention on news about climate change, public health, gender, (anti-)liberal values, and culture. This enables me to undertake a more in-depth analysis and demonstrate in greater detail how the stories are constructed and the type of reporting that is being done. In so doing, special attention is paid to a number of different storytelling techniques, notably: (a) the narrative perspective from within (see also Yablokov & Chatterje-Doody, 2022); (b) the instigation of polarization; (c) the overlap of narratives/topics; (d) the misuse of key concepts and word choices that name phenomena, confuse their meanings, apply concepts inaccurately, or repeat concepts with slight variations; (e) mockery by way of “citations” and “the so-called”; and (f) the use of experts. These storytelling techniques are presented in Chap. 3 and discussed in Chap. 10 in connection with the harmful narratives identified in the analysis.

The similarities between the findings of the earlier Sputnik study and the present one are striking. Wagnsson and Barzanje (2021) found that Sputnik depicted Sweden as a nation in decline with severe domestic problems, most notably regarding immigration and crime, along with increasing polarization between traditionalists and radicals. They identified six subplots that developed over time, starting with what they termed “The Conflict Torn Space” in 2014, which turned out to dominate the coverage over the four years studied. This plot described Sweden as “a polarized society and a state in continuous dispute with the outside world”. It is a plot which might also be used to describe the RT and Sputnik coverage between July 2019 and January 2021. In 2015, plots were added about “the invaded space” and “the unsafe space”, which are also reflected in the findings of this current study, along with somewhat updated versions of “The (un)sexy space”, “The decadent space”, and “The ultra-modern space” (Wagnsson & Barzanje, 2021, p. 244). Wagnsson and Barzanje wrote how:

Sputnik reported in a strictly thematic way, narrating singular events, with no follow-ups, and every piece fitted nicely into one or more of the subplots….This makes Sputnik’s narrative of 2015–2018 appear to be not like traditional news media coverage driven by day-to-day events, but like a calculated campaign that included selective reporting on a number of particular pre-set themes. (2021, pp. 243–244)

The same can be said about the Sputnik and RT coverage in 2019 and 2020. The continuity in the news narratives between the two periods is striking.

A further similarity worth highlighting is the intersection between the narratives. Interconnections between narratives reinforce each individual narrative and add to the overall message that Sweden is in decline, experiencing liberal-extremist chaos, and has irresponsible leaders unable to navigate among Islamists, migrants, politicians, and radical feminists, all of whom at one time or another have threatened to take over the country. Other studies of Russian state media coverage of Sweden have also found that migration and cultural tensions are defined as the cause of Sweden’s decline (Ramsay & Robertshaw, 2019). Islamization and sexual crime have been added to this mix (Colliver et al., 2018, p. 14).

By analyzing the news coverage of Sputnik and RT, I show that the narratives about Sweden disseminated to Swedish and English-speaking audiences between July 2019 and January 2021 are the same as those found in the earlier period, and set out to demonstrate in greater detail how they are constructed. I suggest storytelling techniques that might help to explain how these narratives were constructed. My analyses focus on news about climate change and the environment, public health, gender, culture (tradition and national heritage), and (anti-)liberalism. These are all areas where Sweden has long been known, both domestically and internationally, as united and strong, but also areas where I anticipated that the Russian state media would seek to find fault and weakness, and to target issues sensitive to Swedish national identity and social stability, possibly giving rise to heated contestation.

The question asked was: What is the Russian state media’s narrative about Sweden and how is it constructed? This question is answered by drawing on narrative theory and method. Russian news coverage is explored by identifying and analyzing news narratives for each of the themes. As noted above, there is a striking continuity over time in the narratives of the Sputnik coverage. Where Wagnsson and Barzanje used their six subplots to identify three antagonistic strategies for disinformation, this study instead deconstructs the narratives in depth and seeks to reveal the structures behind them at the microlevel. Doing so should enhance our understanding of how harmful narratives are constructed.

I will now proceed to define the key concept of disinformation, explain why it is suitable for this study, and show how it relates to similar concepts in the field.

Choice of Concept for Defining the Construction and Dissemination of Harmful Narratives

Harmful narratives serve as the object of study around which this book revolves. Analyzing harmful narratives reveals the processes used by a hostile foreign power to destabilize another state from within. As the previous section shows, however, the terminology for describing these types of processes is not clear cut. The fields of communication studies and security and defense studies use different concepts, some of which stem from the Cold War while others have gained prominence with the development of social media. What they all have in common, however, is the intention to capture the production and use of news and information for malign or harmful purposes to enhance state interests.

This book uses the term disinformation as its key concept, as defined by Bennett and Livingstone, who write that disinformation:

involves the production and dissemination of intentionally distorted information for the purpose of deceiving an audience. Distortion might involve deliberate factual inaccuracies or amplified attention to persons, issues, events, or both. Some disinformation campaigns seek to exacerbate existing social and political fissures by mimicking social protest movements and radicalizing and amplifying their narratives. (Bennett & Livingstone, 2021, p. 35)

This is in line with how the European Commission has defined the types of threat aimed at European states. A 2018 report defines disinformation as including “all forms of false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit” (European Commission, 2018, p. 3; de Cock Buning, 2018, p. 3). It also resonates with how various other scholars have talked about the distortion of information (see below).

We define it [disinformation] as false, inaccurate, or misleading information designed, presented and promoted to intentionally cause public harm or for profit. The risk of harm includes threats to democratic political processes and values, which can specifically target a variety of sectors, such as health, science, education, finance and more. It is driven by the production and promotion of disinformation for economic gains or for political or ideological goals, but can be exacerbated by how different audiences and communities receive, engage, and amplify disinformation. (European Commission, 2018, p. 10)

The field is rich not just in concepts, but also in the meanings ascribed to these different concepts. Scholars of disinformation tend to agree with Freelon and Wells’s (2020, p. 145) criteria, according to which disinformation is deceptive, has the potential to harm, and carries the intent to harm. Disinformation is thus considered a type of information influence. Information influencing, however, can also have benign intent and aim to persuade and convince a political opponent or an electorate of a certain view, ideology, or opinion. The type of information influencing dealt with in this study represents processes with a hostile intent that aim to weaken and undermine democratic systems. Wagnsson (2023a) talks about “malign information influence” to clarify this distinction.

Disinformation is thus a normative concept used to define a threat to democratic societies by way of communication and information flows. Most scholars use it to highlight lies and inaccurate statements, but there is a divide over the understanding of “inaccuracy”, and therefore also over what disinformation actually entails. For Wardle and Derakhshan (2017), disinformation is understood as messages intentionally constructed from false accounts with the aim of deceiving and misleading, or “information that is false and deliberately created to harm a person, social group, organization or country” (2017, p. 21; see also Shu et al., 2020).

This, I argue, is too narrow an understanding of disinformation, one that creates difficulties in determining what is to be considered false. In addition, news stories that present accurate facts and figures, as well as true statements and accounts, can also be constructed to distort the message and create incorrect interpretations. This is part of the deception strategies on which harmful news narratives are based. Disinformation is thus used in this study more broadly than merely referring to inaccurate information intended to mislead. The way in which the European Commission defines the concept is in line with such an interpretation, as are Bennett and Livingstone (2021, p. 35) when they write that disinformation also includes how information is structured and how it is framed (see also Crilley & Chatterje-Doody, 2021).

Disinformation in Relation to Other Concepts

Stringent use of concepts aims to bring clarity to the phenomena they depict. Nonetheless, disinformation must be discussed in relation to other concepts in the field, since disinformation as a concept and phenomenon shares features that overlap with other terms, such as “computational propaganda” (Woolley & Howard, 2018, p. 4), “digital propaganda” (Bjola & Papadakis, 2020), “rewired propaganda” (Oates, 2016) and “malign information influence” (Wagnsson, 2023a).

Fake News

There are also concepts that appear in the literature about information influencing that are unsuitable for a study about Russian disinformation by way of state news media. The concept of fake news is one such term. It has been used in both popular language and academia to denote verifiable falsehoods or fabrications. However, fake news is an inappropriate term for use in the context of this book, since it indicates that the information produced and disseminated in the format of news is false or inaccurate, which is rarely the case. Studies have shown (see Hellman, 2021; Wagnsson & Barzanje, 2021) that it is more common to find accurate events reported in a biased way that distorts the true meaning of a story or message. If the intended meaning is to refer to intentional inaccuracies or untrue statements, the term misinformation is more appropriate (see below).

Moreover, fake news is a problematic concept because it covers too broad a spectrum of incorrect messaging, from honest mistakes by professional journalists to the fabrication of facts with an intention to influence election outcomes and political opinions (see Wardle & Derakhshan, 2017, p. 16; see also Tandoc et al., 2017). At times, fake news can also refer to “automated amplification techniques” (European Commission, 2018, p. 10), or the use of bots and other computer-generated data to disseminate massive amounts of information in order to dominate the media flow and influence populations in a certain direction. Fake news has also been used by politicians to accuse journalists of lying or reporting falsely, in order to undermine trust in the news media and to deflect legitimate criticism. It has become a term used by various politicians to interfere with the “circulation of information and attack and undermine independent news media” (European Commission, 2018, p. 10; Wardle & Derakhshan, 2017, p. 16; Haigh et al., 2018), and is understood and used differently by scholars and politicians. Research has shown that people associate fake news with poor journalism or with “partisan political debate” (Nielsen & Graves, 2017), topics that are outside the scope of this study.

There might, however, be exceptional circumstances where the term is appropriate or even preferable. In a study of the information warfare that preceded the Russian military invasion of Ukraine, Khaldarova and Pantti (2016) found what they refer to as narratives of allegedly fake news on the Russian state media outlet Channel One. They describe the news coverage as fabricated and “a proxy for Russian strategic narratives”. The Channel One news coverage between 1 December 2012 and 1 February 2015 is compared with the counternarratives produced by the organization StopFake in the same period. Given the appearance in the study of an organization named StopFake, and the fact that the researchers found 30 untrue stories debunked by StopFake in a sample of around 300 stories, use of the term fake news is unsurprising and perhaps called for. The authors write that “Fake news often takes the form of propaganda entertainment (kompromat), which is a combination of scandalous material, blame and denunciations, dramatic music and misleading images taken out of context” (Khaldarova & Pantti, 2016, p. 893, with reference to Oates, 2014). Nonetheless, the term is not appropriate for use here in the broader sense of the word.

Mal-information and Misinformation

Because they are only infrequently used in the scholarly debate, and rarely occur in political discourse in a way distinct from disinformation, the terms mal-information and misinformation are also excluded from this study. Mal-information is “genuine information that is shared with an intent to cause harm” (Shu et al., 2020, pp. 2–3), which Wardle and Derakhshan exemplify as harassment, leaks, and hate speech (2017, p. 5). There is nothing hidden in the exercise of mal-information; nor is it driven by an urge to deceive by tampering with a message’s content. One example of mal-information is the leak of the emails of the then French presidential candidate Emmanuel Macron just before the election day media blackout in France in 2017 (Wardle & Derakhshan, 2017, p. 21). Mal-information thus has more in common with sabotage than with disinformation.

In contrast to disinformation, where there is intent to cause harm, misinformation involves unintentional falsehoods, “deceptive messages that may cause harm without the disseminator’s knowledge” (Freelon & Wells, 2020; Shu et al., 2020, pp. 2–3). Bjola and Papadakis (2020, p. 5) make a clear distinction between misinformation and disinformation based on intent. “Disinformation”, they say, “is used for deliberately propagated false information, in opposition to ‘misinformation’ which is unintentionally propagated false information…”. Instances of misinformation might be situations where public figures pass on rumors that are later found to be false. Because the questions explored in this book proceed from an understanding of disinformation as a security threat, such unfortunate misunderstandings are beyond the scope of the study.

Propaganda

In contrast to fake news and misinformation, the term propaganda cannot be so easily dismissed. Considering and updating the term has value in that it draws attention to the historical continuity of information influencing as a strategy in security politics, as well as in war and conflict. Digital technology and global security dynamics have brought about major changes to the role of information, but the characteristics of propaganda are not all new. Propaganda connects to disinformation and, despite being a concept mainly associated with conflicts of the past, such as the First and Second World Wars and the Cold War, our understanding of it has been continuously updated and revised over time (see, e.g., Sorrels, 1983 on Soviet Cold War propaganda). The most recent attempts to reshape the concept are linked to the development of digital technology and social media. Even if propaganda is not the most appropriate term for the phenomenon of information influence studied in this book, for reasons explained below, relevant and significant contributions to the field have been made using variations of the concept, and these therefore deserve attention.

The works of Lasswell in the 1920s and 1930s are usually taken as the starting point for the history of propaganda as a scholarly subject. In light of the increasing interest in mass society, mass migration, mass communication, mass media, and so on, Lasswell asked how communication might play a part in the exercise of control over populations (Benkler et al., 2018, p. 24, with reference to Lasswell, 1927). According to Benkler et al. (2018, p. 25), “Propaganda as a field was an application of the modernist commitment to expertise and scientific management, applied to the problem of managing a mass population in time of crisis”. In the same vein, Lippmann (1922) spoke of the force of persuasion residing in mass communication as the “manufacture of consent”. He argued that it would become a tool for democratic governments and change the nature of governance: “None of us begins to understand the consequences, but it is no daring prophecy to say that the knowledge of how to create consent will alter every political calculation and modify every political premise” (Lippmann, 1922/1997, p. 158).

Much has changed since Lippmann wrote his book, with regard to governance and political calculation as well as the tools available for creating consent. Nonetheless, his approach to propaganda remains useful in that it addresses the issue of persuasion from a relatively neutral position, acknowledging that persuasion might be used within the confines of democratic principles as well as outside them, for undemocratic or malign purposes, to control an emerging mass society. This view has also been advanced by more recent scholars such as Philip Taylor (1992) and Jowett and O’Donnell (2019), who argue that the term propaganda does not in itself denote any malicious or undermining strategy, and that convincing others of the strength of one’s arguments and opinions is in fact a precondition for democracy itself. Taylor (1992) states that “propaganda is a practical process of persuasion and, as a practical process, it is an inherently neutral concept”.

Jowett and O’Donnell (2019) argue that there might be instances where controlling or manipulating a group of people could be beneficial both for the group forced into a belief and the wider public that reaps the consequences of this belief. One such example is the setting up of media institutions to promote liberal democratic values where such media is forbidden; another is when Voice of America sought to manipulate the understanding of US allies and the enemy during World War II. The intention was “to spread the contagion of fear among our enemies, but also to spread the contagion of hope, confidence and determination among our friends” (Shulman, 1997, p. 97 quoted in Jowett & O’Donnell, 2019, p. 11).

The term manufacture of consent was later picked up by Edward Herman and Noam Chomsky (1988) in their now-classic book Manufacturing Consent: The Political Economy of the Mass Media. They, however, presented a decidedly critical perspective on the role the media plays, serving as a propaganda tool for powerful economic and political interests in society. Their use of the term propaganda is critical of commercial mass media, the concentration of media ownership, the links between elites and the media, and so on, which they saw as negative consequences of the neoliberalism they strongly opposed. Democracy, they argued, is not supported by the mass media but rather threatened by it (Herman & Chomsky, 1988).

The classic propaganda concept differs from Herman and Chomsky’s in that it focuses neither on neoliberal trends in society as threats to democracy, nor on other critical perspectives on how mass media institutions in liberal democratic states are run or interact with political actors. Instead, its key component is persuasion, or even mass persuasion. Whereas Herman and Chomsky were critical of the concentration of media ownership and of large media conglomerates becoming major powerholders in the news media sector, and as a consequence gaining considerable political leverage, they were less concerned about the use of media by authoritarian regimes to weaken foreign states.

In their oft-cited book, Jowett and O’Donnell (2019, p. 6) define propaganda as “a deliberate, systematic attempt to shape perceptions, manipulate cognitions and direct behaviour to achieve a response that furthers the desired intent of the propagandist”. This definition supports what this study refers to as disinformation in several respects. First, it is deliberate and carefully planned and, as is argued further below, consists of strategic narratives. Jowett and O’Donnell also argue that the strategy involves promoting an ideology, but this might be less relevant today (a point discussed further below). The definition further states that propaganda is systematic, precise, and methodical, and therefore different from strategic communication. Later works on propaganda also make this distinction between strategic communication and disinformation, referring to these as tools used for persuasive purposes, whereas propaganda connotes the practice writ large (Bjola & Papadakis, 2020). The formulations on shaping perceptions and manipulating cognitions represent perhaps the most important features of not just propaganda but also disinformation—including the fact that these practices are engaged in with specific intent. However, whereas the desired intent of the propagandist has tended to be viewed as ideologically driven, aimed at strengthening the propagandist’s relative position of power, this too might be less relevant today. Even if a great power such as Russia is striving to increase its weight internationally, this will not necessarily be done by impregnating target countries with the national conservative ideology of Putinism, and the propagandist nation is not necessarily hailed in these communications (Wagnsson, 2023b). Instead, the propagandist appears set on weakening and harming “the Other”.

Like Jowett and O’Donnell (1992), Briant talks about propaganda as “the deliberate manipulation of representations…producing an effect on the audience…that is desired by the propagandist” (Briant, 2014, p. 9). Some authors group the terms disinformation and propaganda together and define them both as “manipulating and misleading people intentionally to achieve political ends” (Benkler et al., 2018, p. 24); others refer to disinformation as a tool used in propaganda (Bjola & Papadakis, 2020). Most often the propagandist is a government or a regime and the propaganda is targeted at a foreign state. In recent years, however, the concept has been opened up to include actors other than governments as propagandists with the capacity to disseminate their messages to target audiences. This also means that messages are not necessarily constructed by government agencies, such as information bureaus or intelligence services, but by journalists in state media institutions, while public diplomacy activities have become difficult to distinguish from propaganda and psychological operations (see Szostek, 2020).

Staying with the classic definition of propaganda, scholars have added various caveats to take account of the major media developments of recent decades, in particular digital ones. Bjola and Papadakis (2020, p. 4) thus talk about “digital propaganda”, an umbrella label that includes terms such as fake news, disinformation, and post-truth. Digital propaganda is defined as “the use of digital technologies with the intention to deceive the public through the generation and dissemination of verifiably false or misleading information” (Bjola & Papadakis, 2020, p. 5). Digital propaganda appears almost identical to how Bennett and Livingstone (2021) define disinformation, except that the latter omit the term “digital technologies”. This difference in terminology, however, signals that Bjola and Papadakis (2020) are focused on the impact of media technologies as well as on the information itself. They make a point of distinguishing the “computational dimension”, by which they mean trolls and automated messaging, from the “content dimension”, and argue that each requires different means of resilience. Woolley and Howard (2018, p. 4) stress the technological aspects more strongly, using the term “computational propaganda” to refer to “the use of algorithms, automation, and human curation to purposefully manage and distribute misleading information over social media networks”. By including “human curation”, a term indicating that the production of propaganda is not entirely automated but managed by human beings, they too think of information influencing as a combination of content and technology.

The role played by technological advances should not be understated. The manipulation of information using digital technology in combination with developments in AI is highly problematic, not only because of its capacity to disseminate large amounts of information whose sources are impossible to verify or trace, but also because deceptive messages are difficult to distinguish from true ones, and fabricated persons making statements or producing stories are difficult to distinguish from the accounts of real people (see Vaccari & Chadwick, 2020 on deepfakes). Although there is general agreement in the field that the importance of propaganda and disinformation as security measures depends on developments in digital technology, the “computational enhancements” that Walker and Ludwig (2017) talk about are not analyzed in this study. Instead, the analysis is centered on the “content dimension”, to use the vocabulary of Bjola and Papadakis (2020).

A further concept is “network propaganda” (Benkler et al., 2018). This concept neither dismisses technology nor centers on technology in itself, but instead focuses on the networks and media ecosystems that the technology makes possible, understood as “network architecture” (Benkler et al., 2018, pp. 33–34). Their study shows how propagandists make use of the same networks that once served as promoters of pluralism and facilitators of democratic participation, turning them to disinformation and propaganda. Their understanding of the type of information involved, used, and disseminated in these networks is similar to the definitions of Bennett and Livingstone (2021).

Oates (2016) also revises propaganda terminology for the digital age with the term rewired propaganda. She argues that the internet has opened up new possibilities for autocratic regimes to strengthen, and rewire, their propaganda: “a commitment to disinformation and manipulation, when coupled with the affordances of the new digital age, give particular advantages to a repressive regime that can proactively shape the media narrative”. At the same time, however, efforts to control information flows have become more difficult (Oates, 2016, p. 399).

Oates talks about rewired propaganda as a “more dynamic conception of how information communication technology changes the media ecology in non-free states” (2016, p. 400). Technology has a bearing on content, she argues, and this in turn has consequences for the media system at large. Although her definition of propaganda is in line with my understanding of disinformation, and also shares a focus on international news coverage as its outlet, her study object is different: she studies how the domestic media system of the propagandist, in this case the Russian system, is affected by the turn to rewired propaganda. The term stresses the opportunities that new media technology offers autocratic states. Even if regime control over information has become far more problematic with developments in digital technology, Oates’ point is that the Russian regime has adapted to the new media environment, integrating propaganda strategies with the internet and social media to maintain dominance over citizens and preserve legitimacy (Oates, 2016, p. 399; Oates, 2021).

In Conclusion

The various propaganda-related concepts, updated and revised to capture the dynamics of the current digital world, each have a contribution to make to the field. However, the term propaganda still carries connotations from the Cold War era and earlier that differ from how Russia and other states engage in disinformation today. The twenty-first-century propagandist might still be a state or regime, but whereas propaganda was previously produced and disseminated from a ministry or state department, such as the Reich Ministry of Public Enlightenment and Propaganda led by Joseph Goebbels in Nazi Germany, the information influencing of today is built into the news media ecology. As Wagnsson writes: “actors use new channels and normal media consumption patterns to reach citizens in other societies” (2023a, p. 1850). Propaganda is not the product of a ministry of information or some state bureaucracy, but of media staff working for the state—and at times Western journalists and others employed by the news organization. In other words, “international news outlets can be instrumentalized for geopolitical ends” (Moore & Colley, 2022, p. 3).

Nor is today’s information influence enacted through a nationalist discourse glorifying and idealizing the propagandist home nation and political system in contrast to other state systems. The messaging of the Russian regime’s international broadcasting contains few depictions of Russia as popular or special compared with the target country. Instead, the strategy is to weaken and denigrate the target country with depictions of, for example, institutional deficiencies, government incompetence, and domestic conflict. The fact that the information is produced and disseminated as news media by journalists in newsrooms makes it more difficult to distinguish this type of information from liberal journalism or public diplomacy. Like journalism in liberal democracies, the communication is interactive rather than linear. Propaganda does not have to be one-way, a message distributed from a major power holder to a mass audience, but this is often what is associated with the term. One-way communication implies a sender that produces information and messages with the intention to weaken the target and strengthen the self, spreading the message to an audience that receives it in accordance with the sender’s intention. This is far from how mediated messages move in today’s fragmented media system, where audiences interact with content of their choosing, publish comments, and spread the news in their own networks (Szostek, 2020, p. 2730).

As noted above, Taylor (1992) is critical of assigning normative connotations to the concept of propaganda. He rejects the notion that propaganda is equivalent to disinformation and must therefore be seen as a threat to security and liberal democratic systems. The term, he argues, could just as well describe the propagation of a strengthened democracy, freedom of opinion, or resistance against anti-democratic forces. This is also what Wagnsson argues in adding the qualifier malign to information influencing: any well-functioning democracy must have political actors and citizens engaged in information influencing; it is when this turns malign that it becomes a threat.

Yet another reason for using the term disinformation rather than propaganda is that the former signifies information used not necessarily to spread a denigrating message, but to distort and destroy the distinction between what is true and what is false (Wagnsson, 2023b, p. 651; Hellman, 2021), thereby dissolving trust in any source of information. Whereas propaganda is intended to “manipulate the views and attitudes of the target group in a pre-determined direction” (Bjola & Papadakis, 2020; see also Cull et al., 2003, p. xix; Lasswell, 1927), that direction is more difficult to discern today, if it exists at all. A Kremlin insider quoted in Pomerantsev and Weiss (2014, p. 9) says that when the Soviets lied “they took care to prove [that] what they were doing was the ‘truth’”. This is different today: “now no one even tries proving the ‘truth’. You can say anything. Create realities”. Some would argue that direction or ideology has been replaced by the aim of instilling disbelief and skepticism in people, to incapacitate them and leave them unable to make sense of the world, imposing on the target audience a sense that their society is lost to chaos (Bjola & Papadakis, 2020, p. 2).

This could be seen as a novel propaganda strategy, but it is better described as disinformation. The term indicates that the influence is neither primarily about imposing a positive image of the sender country as superior, nor simply about denigrating the target country, but about disseminating messages and information in a way that prevents people from making sound interpretations, thinking critically, or trusting public institutions and one another. This is also the type of threat that the European Commission identifies as disinformation in its report.

In contrast to Shu et al. (2020) and Wardle and Derakhshan (2017), disinformation in this study does not refer merely to inaccurate information and falsehoods, but also to information that is distorted and intentionally constructed to deceive. Messages that are intentionally deceptive, distorted, or misleading are considered disinformation even when the separate pieces of information within them are accurate. It might be the structure of the messages, or the associations made between pieces of information within a message, that gives rise to the deceptive meanings. Such messages can be constructed by way of narratives, where the format and structure are as important for the meaning-making as the separate pieces of information. (Harmful narratives are discussed further in Chap. 3.) Disinformation is also often multilayered and contains statements with a variety of truth claims. Culloty and Suiter (2021, p. 6) argue that this means that making distinctions between true and false is of no importance. Bennett and Livingstone (2018, p. 124) make a similar distinction between disinformation and falsehood, arguing that disinformation consists of “strategic deceptions that may appear very credible to those consuming them”. Simple fact-checking is not sufficient, since disinformation cuts deeper into political institutions and democratic values in complex ways.

This study treats disinformation as emanating from interacting nodes in a horizontal network, enabled by the internet and social media, rather than as a hierarchical, vertical, top-down structure in which messages are produced and disseminated from a single center to a mass audience. Disinformation that is digitally disseminated and dressed up as news coverage supports such an understanding. It is a type of strategic communication that emerges from foreign state establishments, governments, regimes, or elites, targeted at a broad or mass audience. Disinformation flows therefore quickly become part of the news media system, and inform and interact with other nodes and actors in that system. The argument is that agents of disinformation such as the Russian state media contaminate the global news networks with a manipulated and fabricated news format intended for malign purposes.

Disinformation can thus be depicted as flows between nodes that interact more or less intensely and not always predictably, similar to what Archetti (2018) describes with regard to narratives with malign intent. Disinformation is relationally constituted in social space, reflecting, amplifying, or weakening the links between the nodes and taking off in different or similar directions (see Archetti, 2018). She refers to these dynamics as “overlapping reflections in a hall of mirrors”, an image that resembles the flows of disinformation and reveals the difficulties of preventing these dynamics from doing harm. Disinformation flows know no borders, and domestic groups might intentionally or accidentally amplify a foreign message intended to harm another society, causing it to bounce against the side of a prism and disseminate its reflection to another node in the media system, and so on.

To sum up, this book uses the concept of disinformation to explore how the Russian state media has sought to weaken and denigrate a European state by way of international news reporting. This means that the study is limited to one dimension of the disinformation process: the construction of the news narratives that make up the disinformation. Nonetheless, it is essential to state the premises of the study: that disinformation is undertaken by news media organizations controlled by the Russian state; that it involves distortions of statements and accounts as well as misleading information, which may or may not include inaccuracies, all of which pose threats to national security; and that, contrary to most understandings of propaganda, the objective is to cause domestic unrest and tension in the target state by instilling doubts about the authenticity of all information and sowing mistrust between citizens and the state.