This article argues that political economy can explain how and why information distribution and opinion formation are changing. Even more importantly, political economy offers us ways to fight the negative effects of misinformation. It is argued that asymmetry of information can explain recent developments and why poor-quality information is capable of competing with good-quality information. Subsequently, the article explains how poor-quality information influences people's decisions through internal decision-making processes.
This article will argue that opinion formation and information distribution are undergoing a radical transformation. As a consequence, mainstream media and other traditional authorities that used to have a monopoly over information distribution and the influencing of opinions have been challenged by new actors whose existence is made possible by the Internet. I will argue that this shift and its consequences can be explained by applying political economy theories, such as asymmetry of information, to the Internet-age media landscape. This article considers centrist political parties to be among the traditional authorities that used to have a strong influence over opinion formation. Therefore, after presenting the central thesis, I will make recommendations, derived from political economy, on how political parties and other traditional authorities can survive in the new media marketplace where information is distributed and opinions are subsequently formed.
This article touches upon the hot contemporary topic of the ‘post-factual society’. ‘Influencing opinions’ is understood to be a public act, performed by taking part in public discussion or spreading (false or accurate) information in a way that has the potential to reach people outside of one’s own social circle. Traditional authorities are understood to comprise the people and institutions that prior to the Internet had access to mass media channels, for example political parties, broadsheet newspaper columnists, think tanks, mainstream professors and politicians. The ‘media marketplace where opinions are formed’ includes all current media channels that expose people to the opinions of those outside of their own social circles, for example, social media channels, newspaper comment sections, newspaper articles and blogs, and so on. Information distribution and opinion formation are understood to have a causal relationship, with information distribution influencing opinion formation.
The basis for the fragmentation and decentralisation of information distribution was the creation of the Internet and the subsequent digitisation of the media. This matters enormously because it has drastically reduced the cost of information distribution and hence the cost of participating in opinion formation. Nowadays, social media and other applications of the Internet make it possible for everyone to reach, at least in theory, everyone who has access to the Internet and speaks the same language, virtually for free. Previously, only certain press and television media could be described as 'mass' media. Furthermore, access to the masses through the mass media was, and remains, expensive and limited in other ways as well. Libel laws and the requirement for balanced and truthful reporting, including journalists' own quasi-peer-review processes, limit the spread of certain kinds of 'information', which is readily available online through channels other than the traditional mass media.
Not all opinions are equal
The ability to influence opinions used to be the privilege of persons who were authorities in their fields. Their dominance is being challenged by laymen and the ever-increasing volume of information.
Not all opinions are equal. This does not mean that people should be denied the right to express and distribute their opinions, but it would be ludicrously naïve to treat an expert's and a layman's opinion on any particular matter as equivalent. Before the arrival of online media channels and social media platforms, only top experts in their fields were given the opportunity to express their opinions on a matter through the print and television media. Pre-Internet, any new controversial claim that a newspaper was considering sharing with the masses would have to be verified by other experts, or, as a minimum, the opposing view would have to be presented alongside the controversial new claim. Access to the masses through the mass media happens through a de facto quasi-peer-review process in which only verifiable information is published; when mistakes are made, the publication of corrections is mandatory and required by law. Pre-Internet, laymen were confined to preaching to their own social circle of close acquaintances. Scholars hoped, and to some extent they have been right, that this democratisation of public opinion formation as a result of the Internet would have a positive influence on civic participation and the production of information (Maximino 2014).
However, there is a clear asymmetry of information [1] that is highly problematic and so far has not been endogenously mitigated. The person exposed to information is expected to make a judgement call, occasionally even on highly technical issues, without knowing whether or not the information or opinion spread by a person is accurate. Take the subject of vaccinations as an example. Within this particular field, the opponents of vaccinations are often not medical professionals themselves. However, if the opponent of vaccinations claims to be a medical professional, the problem of asymmetry is further exacerbated. A person without expertise in the given field would find it difficult and time consuming not only to first verify the claim that the anti-vaccination person is actually a medical professional, but also to find out what the scientifically established consensus is, in this case on vaccinations. In contrast, the person spreading this information would be fully informed of the merits or otherwise of the information he or she is spreading. This asymmetry of information on the quality of the information being spread through channels other than the traditional mass media has the potential to reduce trust in all information and opinions.
Is asymmetric information driving good-quality information from the market?
In 1970 the Nobel Prize-winning economist George Akerlof (1970) published a seminal paper titled 'The Market for Lemons [2]: Quality Uncertainty and the Market Mechanism'. The paper explains how asymmetric information leads to adverse selection [3] in the second-hand-car market. Essentially, his argument comes down to uncertainty over quality, which leads to a fall in trust among the market actors. One key aspect of this process is that buyers, unable to identify the qualitative differences between the goods on the market, are only willing to pay a price reflecting the average quality of good and bad cars. This drives out high-quality producers and leaves the market dominated by low-quality producers. In this way, Akerlof illustrated the mechanism through which prices can determine the quality of goods in a market where there is a problem with asymmetric information.
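Akerlof's unravelling mechanism can be made concrete with a toy simulation (the quality values below are illustrative assumptions, not figures from his paper): buyers offer the average value of the cars still on sale, the owners of above-average cars withdraw, the average falls, and the process repeats until only 'lemons' remain.

```python
# Toy model of Akerlof's 'market for lemons' (illustrative numbers only).
# Sellers know their car's quality; buyers only know the distribution,
# so they offer the expected (average) value of the cars still on sale.

def simulate_unravelling(qualities, rounds=10):
    market = sorted(qualities)
    for _ in range(rounds):
        if not market:
            break
        offer = sum(market) / len(market)   # buyers pay the average quality
        # Sellers whose car is worth more than the offer withdraw it.
        remaining = [q for q in market if q <= offer]
        if remaining == market:             # stable: no one else exits
            break
        market = remaining
    return market

# Cars worth 1000..9000 to their owners; each round the best cars leave.
surviving = simulate_unravelling(list(range(1000, 10000, 1000)))
print(surviving)  # → [1000]: only the lowest-quality car remains
```

Each round the withdrawal of good cars lowers the average offer, which pushes out the next tier of quality; the market converges on the worst goods, exactly the dynamic the article applies to free online information.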
It is possible to apply Akerlof’s theory to the modern information creation and dissemination ‘market’. Creating and spreading information has a price. High-quality information has a higher price than lower-quality information or outright lies due to the time and costs related to the fact-checking processes. It is difficult to find an accurate estimate of the price for a journalistic piece for an average newspaper, let alone for a piece produced for television. The estimates range from a few dozen euros to thousands, depending on the size of the newspaper, all the way up to hundreds of thousands for high-quality investigative journalism pieces. One calculation estimates that on average each article costs The New York Times around $1750 and that this price only includes the cost of actually writing the article, excluding back office overheads. Some of their investigative cover stories have cost up to $400,000 to produce (Langeveld 2010). Naturally, it also takes longer to produce and distribute good-quality and accurate information.
The problem of quality uncertainty is most potent among information spread by obscure newspapers, independent self-described 'experts', and celebrities and laymen who have a large following and seem like trustworthy sources. Unlike in the pre-Internet age, people with no authority over a subject can have a huge following, and have the ability to spread information and subsequently influence opinion formation without any quasi-peer review or similar processes to control whether the information that is being spread is accurate. Essentially, the problem with asymmetric information is prevalent among all people or actors who do not publish peer-reviewed or quasi-peer-reviewed material. The main issue here is that articles that are quasi-peer-reviewed by journalists and subject to editorial scrutiny are competing for the same visibility as information created and spread by actors with considerably lower production costs, such as laymen and independent experts. However, it is difficult for consumers in this information market to distinguish between high-quality and low-quality, and between accurate and false, information. Carrying out the research required to establish the validity and accuracy of the information one encounters from a source other than a mainstream media outlet would take a considerable amount of time. What this can lead to is a real 'market for lemons', in which the price and production costs determine the quality of the information available, as (most) information is cost-free online and people remain hesitant to pay for it. In other words, the average price (zero) paid, combined with the lower production costs for poor-quality information compared to high-quality information and the quality uncertainty exacerbated by asymmetric information, is driving good-quality information from the market.
In a normal state of affairs, The New York Times, or similar globally respected publications, would not be affected by the problem of the asymmetry of information. This is due to its strong branding and reputation for accuracy. Creating brands and quality standards are among the main ways to mitigate asymmetry of information problems. However, recent polls show that in the US, for example, public trust in the media is at its lowest point in history. Currently, only 32% of Americans have a great deal of trust in the media, compared to the highest rating of 72% in 1976. Since 2003, when trust was at 54%, it has fallen almost continuously to its current low (Swift 2016). We do not have data that shows the exact reasons for this. However, the fragmentation of the media, polarisation and systematic efforts to discredit mainstream media are all factors (The Economist 2016). Essentially this data shows that for many people, information spread by The New York Times is not automatically considered trustworthy. Hence, the newspaper is in competition with low-quality information producers, while we can reasonably expect that it has considerably higher production costs.
Worryingly, it is also possible that the information market may turn into a dualistic market with a high-price–high-quality information segment that is not available to all and a low-quality–low-price information segment, which functions according to the rules of the market for lemons. This has the potential to decouple citizens from each other and create parallel realities, based on different sets of ‘facts’. However, as I will highlight later, such different realities might already exist.
The ways in which misinformation influences people's decision-making
We have to react to these developments, since the spread of inaccurate or false information influences our decision-making abilities. According to research, people instinctively accept the information to which they are exposed and must actively work to reject falsehoods; they tend to think familiar information is true, and they cherry-pick data to support their view (The Economist 2016). The Nobel Prize-winning economist Daniel Kahneman calls this 'cognitive ease'. This process happens both consciously and subconsciously due to biases that are common in the heuristic [4] models we employ to make estimates and decisions when uncertain.
In 1974 Tversky and Kahneman (1974) published a paper on the decision-making biases that influence us all. In their paper they explain the most common heuristics employed by people when making a judgement when uncertain: availability, representativeness, and adjustment and anchoring. The key outcome was that, for most issues, decision-making is based on incomplete data of limited validity (Kahneman 2011, 410–11). Even if we had perfectly valid data, we would still employ these heuristics and commit the same errors. If the data (that is, the information) at our disposal is of lower quality and less trustworthy because the information market is increasingly becoming a market for lemons, our decisions will be affected in a negative way.
Kahneman and Tversky argue that there are situations in which people assess the frequency of a class or the probability of an event by the ease with which instances or occurrences can be brought to mind (Kahneman 2011, 418). Essentially, this means that the ease with which we remember an occurrence has an influence on our decision-making. For example, immediately after seeing a house fire, people consider their own house burning down to be more likely than before, even though nothing has changed concerning their own house. This is called the availability heuristic. This is a bias we all suffer from, especially when making decisions without properly analysing the situation and based on intuition, or what Kahneman (2011) in his later works describes as 'thinking fast' instead of 'thinking slow'. Being exposed to false stories or misrepresentations therefore clearly has an influence on our decision-making, since it is easier to remember more recent stories or occurrences.
Representativeness is a heuristic employed by people when faced with one of the following types of problem: what is the probability that object A belongs to class B? What is the probability that event A originates from process B? What is the probability that process B will generate event A? When answering such questions, people typically rely on the representativeness heuristic, in which probabilities are essentially evaluated by the degree to which A is representative of B. Tversky and Kahneman argue that when A is considered highly representative of B, the probability that A originates from B is judged to be high, and vice versa (Kahneman 2011, 410–11). Applying this heuristic can lead to many errors, especially if one is constantly exposed to false and inaccurate information on an issue or about a group of people, particularly since, according to Kahneman and Tversky, people tend to be poor at understanding statistics (Kahneman 2011, 410–18).
Adjustment and anchoring is the third powerful heuristic used to solve problems and estimate probabilities. ‘In many situations people make estimates by starting from an initial value that is adjusted to yield the final answer. The initial value, or starting point, may be suggested by the formulation of the problem, or it may be the result of a partial computation [calculation]’ (Kahneman 2011, 421). It is a powerful heuristic because when estimating probabilities, one can have a statistically significant effect on people’s answers by manipulating the information available onto which they anchor their thinking and estimates (Kahneman 2011, 421–2). Hence, by spreading false information, for example, on the proportion of migrants in society or on the likelihood of a Muslim being a terrorist, one can have a considerable effect on people’s estimates and hence influence their opinions. The anchoring heuristic, which we use as an in-built mechanism to bring order to our thinking, is a powerful tool when it is used to manipulate people. Because of these heuristics, we should recognise the tremendous power that false information and those spreading it have over people’s opinions.
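As a stylised illustration of insufficient adjustment from an anchor, the sketch below assumes people move only part of the way from the anchor towards the true value; the 50% adjustment rate and the example figures are assumptions for illustration, not values from Tversky and Kahneman's paper.

```python
# Stylised model of anchoring with insufficient adjustment: the final
# estimate starts from the anchor and moves only a fraction of the way
# towards the true value. The adjustment rate is an assumed parameter.

def anchored_estimate(anchor, true_value, adjustment=0.5):
    return anchor + adjustment * (true_value - anchor)

# Suppose the true share of some group is 10%, but a false claim
# anchors the audience at 40%.
print(anchored_estimate(anchor=40, true_value=10))  # → 25.0: still far too high
print(anchored_estimate(anchor=10, true_value=10))  # → 10.0: an accurate anchor
```

The point of the sketch is that whoever sets the anchor shifts the final estimate even when people consciously adjust away from it, which is why manipulated starting figures are so effective.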
One can also argue that the most dangerous aspect of false information is that, through decision-making heuristics, we internalise it even if we disagree with it or recognise that it is inaccurate. We are victims of our own success in developing these heuristics to help us deliver probability estimates efficiently.
The sad fact is that we are already fighting an uphill battle, as polls point towards huge discrepancies between public opinion and facts. In 2013, King’s College London and Ipsos Mori polled the UK population on issues including what percentage of the welfare budget is fraudulently claimed, how common teenage pregnancy is and whether crime is falling or increasing. Citizens were grossly wide of the mark. For example, in terms of benefit fraud, the population’s estimate was that 24% of benefits are fraudulently claimed. The real figure is 0.7%: public opinion is off by a factor of 34. In terms of teenage pregnancy, the popular estimate was that 15% of girls under the age of 16 become pregnant every year. The real figure is 0.6%. Interestingly, 26% of people thought that foreign aid was among the top three items the UK government spends money on. In reality only 1.1% of the UK’s budget goes on foreign aid. The proportion of Muslims in the UK was grossly overestimated at an average estimate of 24%; in reality 5% of the population are Muslims. The proportion of Christians was grossly underestimated at 34%, whereas in reality 59% of the UK population are Christians (King’s College London and Ipsos Mori 2013). The miserable results of this survey are probably a combination of misinformation and biases in heuristics. These polling results can help to explain many political developments of late, since most policymakers and the average population seem to live in different realities.
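The scale of these discrepancies reduces to simple ratios. A short sketch, using the poll figures cited above, computes the overestimation factors:

```python
# Perceived vs. actual figures from the 2013 King's College London /
# Ipsos Mori poll cited above (percentages of the relevant population).
poll = {
    'benefit fraud':     (24.0, 0.7),
    'teenage pregnancy': (15.0, 0.6),
    'Muslim population': (24.0, 5.0),
}

for issue, (perceived, actual) in poll.items():
    factor = perceived / actual
    print(f'{issue}: overestimated by a factor of {factor:.0f}')
```

Even the smallest of these gaps, the share of Muslims in the UK population, is a roughly fivefold overestimate; benefit fraud is overestimated by a factor of about 34, as noted above.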
What lessons can we draw from political economy concerning communications from political parties?
Political economy offers ways to deal with asymmetry of information and we can also draw lessons from Kahneman and Tversky’s work on the heuristics people employ when estimating probabilities and frequencies. First of all, traditional authorities must understand that they are in competition for visibility and attention with actors that are not bound by the same rules.
Brand validation is key (asymmetry of information)
Branding has been around since the Ancient Greeks, but the branding of mass consumer goods started in the late nineteenth century (Saarikoski 2012, 6). It is one of the key ways to counter asymmetry of information. Branding gives the less-informed party information about, and almost a guarantee of, the quality to expect. For traditional authorities, building or strengthening their brands is immensely important because it builds trust in them. Applying this to political parties at the member state level, parties should avoid having completely different communication styles and strategies when in opposition compared to when in government. If the party's brand is that it is the fiscally responsible party that values non-hyperbolic policy-focused work, then this must be communicated accordingly when in opposition as well. If not, the party's brand will be damaged to the point that it does not have one, its credibility will be in tatters and people will not know what to expect from it. Brands only solve the problem of asymmetric information when they are credible and consistent. You would not trust a hotel chain's brand if every second hotel was not at the level expected. Why would you trust a political party that does not act consistently, in accordance with its own principles?
Volume and repetition are kings (availability plus anchoring)
If Tversky and Kahneman's theses are correct, we must understand that the volume of information, and hence the volume of communication from political parties, is hugely important. When people make estimates and form opinions based on the ease with which they can recall issues, it becomes essential to increase the volume and repetition of our messages. If we want to be known as the party that wants to go to the moon, this must be part of everything we communicate. It must be ingrained in the party's brand: in the logo, in the letterheads, in the notes to editors, and repeated in different forms on different channels in a disciplined manner until the data shows us that this is what people think of us. This is what political economy and decision-making heuristics teach us about the current fragmented media landscape. The traditional authorities must also engage their supporters to multiply the message and reach out to others, in the same manner as those who are distributing false information do. Our supporters, and centrists in general, should be encouraged to become 'fact-trolls' to counter the effects of misinformation. However, this will only work if our information is completely truthful and consistent. As volume affects decision-making through the availability and anchoring heuristics, one must increase the volume of information and communication.
Actively challenging falsehoods (availability plus representativeness plus anchoring)
In order to counter the effects of all three heuristics, traditional authorities, including political parties, must actively challenge falsehoods. In practice, this means that traditional authorities must engage online in discussions with people and show why the allegations made in the new media are wrong. They should actively work to shape people's opinions. The other alternative is to wait on the sidelines and formulate one's policies based on people's opinions, but this can hardly be described as values-based politics. Furthermore, the counterargument to this approach is that the debate has to happen on the topic chosen by the person on the attack, and hence one is bound to repeat the original false claim. Yet at the same time, not challenging false claims gives us no chance whatsoever of mitigating their effect on people's decision-making. By challenging, and actively engaging in debate on Facebook and Twitter, in the comment sections of websites and on similar forums, one can counter the anchoring, representativeness and availability heuristics and spread one's own message at the same time. Furthermore, challenging false claims builds up the brand of the party in question and underlines its truthfulness. However, this naturally requires a highly skilled and agile communications team, and humility as an organisation in order to accept that mistakes are made, since not all criticism is unfounded.
Challenging and using first-mover advantage
Traditional authorities should not only be on the defensive. They need to actively challenge populist movements and myths, even if people are biased against changing their opinions. The first-mover advantage can be applied to the modern media landscape. Essentially, the first-mover has the advantage that not all the people who see the first message will see the responses to and rejections of that information by other actors. The power of the first-mover advantage should be utilised by centrist forces to counter populist movements and those distributing false information, since these groups are most certainly taking full advantage of this factor.
Endogenous peer-review systems and gate-keepers (asymmetry of information)
In the long term, it is possible that technology will solve part of the problem of asymmetrical information, yet it remains unlikely that it can remove our heuristic biases. As in e-commerce, peer review may come to solve part of the asymmetry of information problem by allowing people to rate the information to which they are exposed. Naturally, there is no guarantee that such a system would not be exploited, but it would be an endogenous solution to the problem so prevalent right now. An alternative to a fully diffused peer-review system would be for certain individuals or organisations that are universally accepted as neutral fact-checkers to be promoted over time to gate-keeper status. Such a role will not be suitable for political parties, but they can support the development of different peer-review systems and elevate institutions to the status of gate-keepers.
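One way such an endogenous peer-review system could work is to weight each rating of a piece of information by the rater's own reputation, so that established fact-checkers count for more than anonymous accounts. The sketch below is a hypothetical design under that assumption; the names and weights are illustrative, not drawn from any existing system.

```python
# Hypothetical reputation-weighted rating of a story: each rater's score
# (in [0, 1], where 1 = fully accurate) is weighted by that rater's
# reputation. Unknown raters get a small default weight, limiting the
# influence of anonymous or throwaway accounts.

def story_score(ratings, reputations, default_rep=0.1):
    """ratings: {rater: accuracy score in [0, 1]}; reputations: {rater: weight}."""
    weighted = total = 0.0
    for rater, score in ratings.items():
        weight = reputations.get(rater, default_rep)  # unknown raters count for little
        weighted += weight * score
        total += weight
    return weighted / total if total else None

reputations = {'established_factchecker': 1.0}
ratings = {'established_factchecker': 0.9, 'anon_1': 0.1, 'anon_2': 0.1}
print(round(story_score(ratings, reputations), 2))  # → 0.77
```

In this toy example the fact-checker's positive rating dominates two hostile anonymous ratings, which is the gate-keeper effect described above; the obvious open problem, as the article notes, is that reputations themselves can be gamed.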
Information distribution is undergoing a radical transformation. Research makes it clear that we are all victims of biases that are built into our decision-making heuristics. These biases make us vulnerable to false and inaccurate information that influences our decision-making. We should explore options for countering these negative effects by applying political economy theories to the current information distribution and opinion formation 'market'. We have to understand that the biases in our heuristic models mean that we must actively counter misinformation and lies if we want to mitigate their effects on people's decision-making.
[1] Asymmetric information, sometimes referred to as information failure, is present whenever one party to an economic transaction possesses greater material knowledge than the other party. This normally manifests itself when the seller of a good or service has greater knowledge than the buyer, although the opposite is possible. Almost all economic transactions involve information asymmetries (Investopedia.com n. d. b).
[2] In the US the term 'lemons' refers to poor-quality second-hand cars.
[3] Adverse selection occurs when one party in a negotiation has relevant information that the other party lacks. The asymmetry of information often leads to bad decisions being made, such as to do more business with less-profitable or riskier market segments (Investopedia.com n. d. a).
[4] A heuristic is a mental shortcut that allows people to solve problems and make judgements quickly and efficiently. These rule-of-thumb strategies shorten decision-making time and allow people to function without constantly stopping to think about their next course of action. Heuristics are helpful in many situations, but they can also lead to cognitive biases.
Akerlof, G. A. (1970). The market for 'lemons': Quality uncertainty and the market mechanism. The Quarterly Journal of Economics, 84(3), 488–500.
Investopedia.com. (n. d. a). Adverse selection. http://www.investopedia.com/terms/a/adverseselection.asp?ad=dirN&qo=investopediaSiteSearch&qsrc=0&o=40186. Accessed 26 October 2016.
Investopedia.com. (n. d. b). Asymmetry of information. http://www.investopedia.com/terms/a/asymmetricinformation.asp. Accessed 26 October 2016.
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar Straus and Giroux.
King’s College London, & Ipsos Mori. (2013). Perceptions are not reality. The top 10 we get wrong. Ipsos Mori, 9 July. https://www.ipsos-mori.com/researchpublications/researcharchive/3188/Perceptions-are-not-reality-the-top-10-we-get-wrong.aspx. Accessed 27 September 2016.
Langeveld, M. (2010). How much does it cost the NYT on average to produce one news story? Quora.com, 10 December. https://www.quora.com/How-much-does-it-cost-the-NYT-on-average-to-produce-one-news-story. Accessed 27 September 2016.
Maximino, M. (2014). Does media fragmentation contribute to polarization? Evidence from lab experiments. Journalist’s Resource, 22 August. http://journalistsresource.org/studies/society/news-media/media-fragmentation-political-polarization-lab-experiments. Accessed 27 September 2016.
Saarikoski, S. (2012). Brands, stars and regular hacks—A changing relationship between news institutions and journalists. Oxford: Reuters Institute for the Study of Journalism, University of Oxford.
Swift, A. (2016). Americans’ trust in mass media sinks to new low. Gallup.com, 14 September. http://www.gallup.com/poll/195542/americans-trust-mass-media-sinks-new-low.aspx. Accessed 27 September 2016.
The Economist. (2016). Briefing: The post-truth society. 10 September. http://www.economist.com/news/briefing/21706498-dishonesty-politics-nothing-new-manner-which-some-politicians-now-lie-and. Accessed 26 October 2016.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–31.
Nurvala, JP. Do not trust people: lessons from political economy on how to counter misinformation and lies. European View 15, 253–263 (2016). https://doi.org/10.1007/s12290-016-0420-8