Concern about information disorders and the harm to trust and truth has raised questions about how to control or limit free speech and forced us to better understand how digital literacy emerges in the lives of everyday users of technologies (Bhatt and MacKenzie 2019). But while online free speech, a free press, and near-unfettered access to open and transparent information are fraught with the dangers of deceit, information disorders and [mis]information overload, silencing these channels is not the answer. John Stuart Mill is often invoked by those claiming that free speech or opinion is an unconditional good that should in no way be curtailed. Mill was adamant that all silencing of an opinion was an ‘assumption of infallibility’:
the peculiar evil of silencing the expression of an opinion is, that it is robbing the human race; posterity as well as the existing generation; those who dissent from the opinion, still more than those who hold it. If the opinion is right, they are deprived of the opportunity of exchanging error for truth: if wrong, they lose, what is almost as great a benefit, the clearer perception and livelier impression of truth, produced by its collision with error. (Mill 2006: 32)
The ‘dissentient worlds’ of other people should be seen as a standing invitation to be ‘corrected before the whole world’. If we are to cultivate our understanding, then we need to learn the grounds of our opinions and be able to defend them against common objections. The only rational means by which we can justify our beliefs and assure ourselves that we are right is by having the freedom to have them contradicted or disproven by others with contrary views or experiences. Errors, Mill argued, ‘are corrigible’ (27): that is, mistakes can be rectified by discussion and experience, ‘despite there being few who may be capable of judging truth’. The habit of subjecting one’s opinions to scrutiny is the only stable foundation for relying on their truth, and only then has a person a right to think her ‘judgment better than that of any person, or any multitude, who have not gone through a similar process’ (27). The further value in subjecting one’s views to fearless scrutiny and critique is that they would be held as ‘a living truth’ rather than ‘dead dogma’ or yet another superstition (42),
assuming that the true opinion abides in the mind, but abides as a prejudice, a belief independent of, and proof against, argument—this is not the way in which truth ought to be held by a rational being. This is not knowing the truth. Truth, thus held, is but one superstition the more, accidentally clinging to the words which enunciate a truth. (42–43)
Mill’s confidence may be misplaced or naïve given the force and reach of modern digital technology. Nevertheless, sustained and principled exposure to diversity of opinion offers us our best chance of obtaining truth (an exposure that is not, perhaps, suited to online environments). It is always possible that a dissenter from established opinion could have something valuable to say and so obstructing free speech would be a loss to truth.
Arendt (1971) asserted that facts need testimony to be remembered, and trustworthy witnesses, in order to be established and secure. Factual statements, however true or credible, however often they are tested against public opinion, can rarely be beyond doubt, and certainly not secure against attack. The deceiver knows that reality can be questioned, that a version of the truth may be more appealing than the reality itself, and that it may appeal to sentiment rather than to reason (MacKenzie and Bhatt 2020a). One would think that the easy availability of information should work to contradict and so undermine the ‘truth’ of false claims, but we have a tendency to be passive before overwhelming information or when informants we trust, respect or admire speak; we often grant too much credibility to the speaker or the source.
All freedom of opinion becomes a ‘cruel hoax’ (Arendt 1971) if access to unmanipulated factual information is impeded. Digital technologies, and social media platforms in particular, create new norms for language and discourse and enable new forms of power and inequality, casting doubt on technologically deterministic accounts of technology’s relationship with society. To what extent are the design and infrastructure of digital platforms enablers of these current problems? If dupery is not always ‘by design’, in that humans can and often do mistakenly spread misinformation (see Meserole 2018), then is the very design of a social media platform to be implicated in the current problems we face? Information disorders are highly complex phenomena that algorithmic tweaks are unlikely to solve. Legislation that fines platforms for hosting unlawful content could work, along with fact-checking by the platforms or third parties, quality trending topics and news feeds, high-quality engineering resources, and strong editors. These are some basic if complex technological solutions. Non-technological solutions include credibility scoring, whitelisting and blacklisting.
However, as Martin Moore, Director of the Centre for the Study of Media, Communication and Power, argued in his submission of evidence to the UK Parliamentary Select Committee on Fake News, ‘the long history of fake news, the political, social and economic motivations for producing it … mean that technology will only ever partly address the problem’ (2017: 11). Engineers and platform owners make value-driven choices to determine which news to promote and which to suppress, assuming they can solve, or even alleviate, the problem:
The technology platforms on which this news travels are reliant on advertising that prioritises popular and engaging content that is shared widely. The content is not distinguished by its trustworthiness, authority or public interest, since these are not criteria that drive likes and shares. (Moore 2017: 11–12)
News and digital literacy programmes, critical research skills, and education on the power of images to skew opinion and feeling are essential. The UK House of Commons Digital, Culture, Media and Sport Committee has stated that digital literacy should be the ‘fourth pillar’ of education along with reading, writing and maths (2019: 350). The primary features of such an education could include: (i) social media verification skills; (ii) an understanding of how algorithms prioritise information and sites, along with curation functions; (iii) techniques for developing emotional scepticism to override our tendency to be less critical of content that provokes an emotional response; and (iv) statistical numeracy.