Introduction

The global trends of economic insecurity, educational inequality, democratic deficits and socio-economic exclusion have intensified the epistemic tensions between authorities and citizens in recent years. The global economic recession that followed the 2008 financial crisis and the rise of social media have greatly contributed to the formation of a post-truth political climate, including the emergence of trolls, who seek to deny or simply ignore common sense knowledge and professional testimony. In these circumstances, many political scientists (e.g. Moore 2017; Somin 2016) and philosophers (e.g. Brennan 2016) have questioned the value of democracy for epistemic reasons and suggested that political participation and democratic deliberation can render people more irrational and biased. The traditional sources of authoritative information are increasingly being questioned, since citizens now have access to new sources of information and, therefore, doubt established knowledge and the testimony of experts (Moore 2017). Online dis/misinformation produced by various agents, for example, peer groups, activists and fake news sites, places increasing pressure on the work of professional organisations and authorities. Responding to the spread of misinformation and manufactured ignorance requires critical philosophical conceptualisations that draw attention to the intersection of ignorance and expertise.

This article addresses philosophical questions regarding the epistemic virtues of expertise when experts face pressure from peer groups’ activism in democratic societies. Modern societies cannot function without the division of labour and the associated institutions of expertise (Moore 2017), which means that not all viewpoints and opinions can be equally valid in decision-making processes. How can expert knowledge be justified without destabilising the equality between citizens and experts? How can the risks of elite domination and the power of professional cabals be avoided without slipping into a situation in which each voice is equally important? In a complex world that is riddled with uncertainties and wicked problems such as climate change, it is crucial that experts recognise and acknowledge the limits of their expertise. Yet, such acknowledgements, although virtuous, can backfire in conditions of manufactured uncertainty. The aim of this article is to clarify the notion of the epistemic humility of expertise when experts have to fight against claims made by distributors of false information and, at the same time, negotiate with individuals who are misinformed and so resist expert knowledge.

In the field of professional education, reflective practices (Schön 1983) serve as a key conceptualisation for theorising the limitations of one’s own knowledge and skills in relation to expert work. In this context, the notion of negative knowledge refers to a professional’s ability to recognise the limits of her own knowledge and to avoid errors in work tasks (e.g. Gartmeier et al. 2008). The philosophical approaches to professional knowledge have typically concentrated on propositional knowledge (known knowns), as well as on tacit knowledge and intuition (unknown knowns), although more recently metacognitive issues related to ‘known unknowns’ and ‘unknown unknowns’ have received increasing attention. According to Ann Kerwin’s (1993) taxonomy of ignorance, the term ‘known unknowns’ refers to a situation in which a person knows that she does not know something, while the concept of ‘unknown unknowns’ refers to a situation in which a person does not know that she does not know whether the subject matter in question is knowable or unknowable. As Smithson (2008) proposes, every discipline and profession involves often-implicit assumptions and beliefs regarding ‘unknown’ issues.

A speech by Donald Rumsfeld (2002) brought much fame and public attention to four epistemological concepts, namely known knowns, unknown knowns, known unknowns and, most importantly, unknown unknowns. Ever since, the characterisation of unknown unknowns has been used to capture unidentified risks that have traditionally been considered outside the scope of project risk management, such as the 2004 Boxing Day tsunami and the Fukushima nuclear accident in 2011. The majority of unknown unknowns are considered impossible to examine, since no one can imagine them in advance. While discussions concerning unknown unknowns have relied strongly on risk analysis and theories of risk management (e.g. Kim 2017), within the epistemology of ignorance, scholars have raised questions regarding unknown unknowns using terms such as ‘inevitable ignorance’ (Rescher 2009), ‘cognitive closure’ (Stoljar 2009) and ‘insolubilia’ (Rescher 2009). The term ‘insolubilia’ is derived from the field of theology and is used to refer to a domain that cannot be considered a proper object of epistemological study.

Over the last decade, epistemological discussions concerning unknowns, ignorance and insecurity have developed into a dynamic research field, known as the epistemology of ignorance or agnotology (e.g. Alcoff 2007; Beck 2009; Proctor and Schiebinger 2008; Tuana and Sullivan 2006). In traditional philosophical epistemology, common ways of talking about ignorance have focused on the absence of knowledge as a bad epistemic practice (El Kassar 2018). In fact, the traditional epistemological approach has tended to neglect ignorance as a relevant research topic and has failed to recognise its status as an epistemic state. El Kassar (2018, p. 301) refers to the traditional understanding of ignorance as a ‘propositional conception of ignorance’, whereby ignorance is understood as a situation in which P is true but S does not know that P is true. Propositional ignorance is complicated, since there are many ways in which S can fail to know P, for example, simply through not knowing or through disbelief (El Kassar 2018).

Smithson (1989) was one of the first scholars to develop a taxonomy of ignorance, splitting ignorance into two categories: the state of ignorance (error) and the act of ignorance (irrelevance). The latter corresponds to deliberately ignoring something relevant to the problem-solving situation, whereas the former is a state of ignorance resulting from different causes, including distorted or incomplete knowledge. More recent epistemological approaches to ignorance have sought to study deliberate acts of ignorance and their social and political consequences. As Alcoff (2007) argues, the epistemology of ignorance aims to study the relational and structural reasons behind substantive practices of ignorance.

Recent epistemological discussions concerning ignorance have revealed that, both theoretically and socially, ignorance is a far more complex and far more ambiguous phenomenon than philosophers and scholars have previously thought. A wide range of scholars working across many disciplines have begun to consider, for instance, how social and economic issues are involved in the strategic use of disinformation at the political level. Rather than approaching ignorance as a liability that can and will be overcome once individuals have sufficient information regarding the implications of their actions, a growing number of scholars have used the lens of wilful or strategic ignorance to investigate how Internet trolls and social media groups seek to deny, justify or simply ignore common sense knowledge and facts (Gross and McGoey 2015).

Our approach to the epistemic humility of expertise reconsiders the notion of negative expertise in order to capture the ‘external’ conditions of reflective practice and to discuss the role played by ignorance, insecurity and disinformation in the context of professional knowledge. Thus far, the discussion concerning negative expertise has focused on the individualistic and internal perspectives of professionals that arise from practical experience and dilemmas within concrete work contexts. Emphasising individuals’ responsibility for their professional competence, these approaches fail to analyse the role played by both digital disinformation and constructed ignorance in undermining the institutional status of professional knowledge. While uncertainty concerns not knowing something for sure (e.g. incomplete information), the phenomenon of manufactured uncertainty frequently stems from strategic operations. Proctor and Schiebinger’s (2008) notion of manufactured uncertainty is a fruitful conceptualisation for understanding the epistemic pressures on expertise caused by the spreading of digital disinformation. Recognising the role played by ignorance can serve to reframe the theories of professional knowledge, broadening them to include not just the acquisition of truth, but also the development of reliable and significant knowledge on the part of people in different locations and different situations. Although epistemic humility (e.g. Schwab 2012), negative knowledge (e.g. Ariso 2017) and different kinds of wilful and manufactured ignorance have individually received a considerable amount of attention in the burgeoning field of ignorance studies, they are rarely discussed together.

In short, by focusing on the external pressures on both the formation of expert knowledge and the conditions of expert testimony, our study develops a novel epistemological approach to the epistemic humility of expertise, thereby identifying the role played by disinformation in the knowledge-intensive work of professionals. In the following section, we introduce the conceptualisation of negative expertise by drawing on the theory of reflective practice and the notion of negative capability. Then, relying on social and feminist epistemological considerations of knowledge, we move on to discuss the asymmetrical epistemic relations between experts and laypeople so as to understand the micro- and macro-level relationality inherent in manufactured ignorance. We discuss various epistemic strategies, but rather than endorsing any particular strategy or set of virtues, we argue that expertise requires negative expertise, which is better understood as a metacognitive and social skill than as a fixed virtue or clearly delineated strategy. As a social skill, negative expertise is concerned not only with the epistemic limits of the expert, but also with the limits of others, as well as with the social and cultural conditions of those limits. As a metacognitive skill, negative expertise is used to reflect on the pros and cons of different strategies in different situations. We draw on the sociology of knowledge (Knorr Cetina 1999), feminist epistemologies (Townley 2011) and virtue epistemology (Levy 2019), and we suggest that a focus on negative expertise can supplement them all, although negative expertise as such need not be limited to any individual epistemological approach.

Negative expertise as reflective practice

According to humility theories (e.g. Lehrer et al. 1996), the epistemic subject is seen to possess self-confidence, to have the capability for self-reflection and yet to lack epistemic arrogance. This type of epistemic subject is reminiscent of reflective practitioners who know what they do not know, thereby understanding their epistemic boundaries and potentials. Over the past 20 years, reflective practice has become one of the most popular theories of professional knowledge, having been widely adopted by the education (Benade 2018; Galea 2012; Russell 2013; Valli 1992), health care (Bulman and Schutz 2013; Kinsella 2010; Mamede et al. 2007) and social care (Gould and Taylor 1996; Moe et al. 2014) professions. The term was coined by Donald Schön (1983) in his influential book The Reflective Practitioner, and it has gone on to garner unprecedented attention in the theorisation of expertise and to be integrated into continuing education programmes (Kinsella 2010). Regardless of nuances, the central idea behind Schön’s (1983) notion of reflection-in-action is that, through reflection, experts better understand their professional activity, while reflecting on problems will lead to new insights for their practice. However, despite many scholars having promoted or justified the benefits of this process in terms of enriching and constructing professional knowledge (e.g. Edwards et al. 2002), a key problem raised in the literature concerns the lack of analysis of the institutional stakeholders and power structures in society that promote the authority of highly educated professionals. It remains open for discussion whether the theory of reflective practice can respond to the question of power by offering an analysis of negative expertise, as revealed through a critical examination of the philosophical epistemologies concerning ignorance.

The conceptualisation of negative knowledge is assumed to promote in-depth reflective processes, since reflection is seen to be an essential component of the development of professional competence. An early approach to negative knowledge can be found within Marvin Minsky’s (1994) conception of negative expertise. He argues that experts have a substantial amount of knowledge regarding what can go wrong in their domain as well as what kinds of procedures will lead to suboptimal solutions to problems and should, therefore, be avoided. Minsky (1994) supposes that this knowledge supports experts’ effective action and prevents them from committing errors. Describing the nature of negative knowledge as an element of reflective epistemic processes, Parviainen and Eriksson (2006) and Gartmeier et al. (2008) suggest that negative knowledge refers to an epistemic situation in which one ‘knows what not to do’. In this sense, the domain-specific aspect of negative knowledge is focused on what is ‘wrong, but relevant to know’. Thus, it can be seen to concern ‘known knowns’ rather than ‘known unknowns’. In the context of expertise, negative knowledge is defined as a reflective activity involved in a professional’s ability to avoid errors in work tasks and to recognise the limits of her own expertise. Negative knowing includes learning from mistakes (Straehler-Pohl and Pais 2014) and it also refers to unlearning capabilities when applicable knowledge becomes obsolete (McWilliam 2005). It is assumed that elaborating negative knowledge helps to increase certainty through raising professionals’ awareness of the possible positive and negative outcomes of their own actions and through promoting the capability to judge their respective probabilities in given circumstances.

Learning from errors is considered to be crucial to the development of negative knowledge (Oser and Spychiger 2005). Errors are typically defined as an unintended discrepancy between a current state and a desired state, most likely in the form of a deviation from a given norm. Closely connected to errors, failure experiences, understood as failing to achieve a given goal, involve individuals’ affective reactions to the consequences of making errors (Zhao 2011). Importantly, not every error is necessarily interpreted as a failure. Whether an error is evaluated as a failure or not depends on both situational aspects (e.g. social norms) and personal characteristics. In short, the errors of the epistemic subject can be seen as actions that endanger the attainment of desired goals with or without feelings of failure. Furthermore, the labelling of an action as an error involves the judgment of skilled members of the community (Gartmeier et al. 2008). One underlying problem concerns the term ‘error’ being a general label for different phenomena, which can be very diversely perceived and interpreted in different cultural and social contexts. Returning to the question of negative knowledge, learning from errors concerns the metacognitive situation defined as ‘known knowns’, while making errors due to ignorance or a lack of knowledge is related to ‘known unknowns’. When making a mistake, the epistemic subject is aware of the correct procedures, although making an error reveals that she does not know something.

According to Knorr Cetina’s (1999) interpretation of negative knowledge, making errors inherently indicates the epistemic subject’s ignorance or non-knowledge concerning the issue at hand. Knorr Cetina’s (1999, p. 64) notion of negative knowledge refers to ‘…knowledge of the limits of knowing, of the mistakes we make in trying to know, of the things that interfere with our knowing, of what we are not interested in and do not really want to know’. Hence, rather than ‘known knowns’, this characterisation of negative knowledge includes the metacognitive epistemic attitude whereby the epistemic subject knows that she does not know something. The importance of Knorr Cetina’s (1999) approach lies in the fact that errors indicate the limits of knowing. Sometimes, an important piece of information, long having been neglected, can suddenly be taken seriously and may even be considered fundamental. For instance, when applicable knowledge becomes obsolete, the epistemic subject only recognises the changing norms through a conflict with the new standards. By applying her reflective capability, the epistemic subject may unlearn the existing procedure and acquire the relevant skills to cope with the new situation (McWilliam 2005). Parviainen and Eriksson (2006) propose that negative knowledge has a heuristic value in the sense that it provides clues to the unknown and enables people to evolve alternative strategies to solve their problems.

In formulating her idea of negative knowledge, Knorr Cetina (1999) uses the notion of liminality, thereby underlining the notion that ambiguity and uncertainty are inherent aspects of negative knowledge. As Knorr Cetina (1999, p. 63) suggests, limen means ‘threshold’, ‘doorstep’ and ‘entrance’ in Latin. By exploring this type of unknown, people can perceive things indirectly, similar to viewing photos through their negatives. When applying a negative approach and without ever confronting them directly, people outline phenomena based on half-knowledge (Adlam 2014) or by simply following hints and traces, guessing at what the phenomena might look like. Typically, archaeologists rely on negative tools when trying to perceive, based on fragments of pots or visible marks on human bones, how people might have lived during a certain historical period. Knorr Cetina’s (1999) conceptualisation of negative knowledge could potentially be interesting with regards to the discussion of unknown unknowns. While most unknown unknowns are believed to be impossible to examine because no one can even imagine them, Knorr Cetina (1999) emphasises that it is quite remarkable how much of the unknown we can mobilise through negative perception.

The idea of liminality has played a central role in various formulations of negative capability. The term was first introduced by the poet John Keats in 1817 to characterise the capacity that can lead people into intellectual confusion and uncertainty so as to feed their creativity and problem-solving skills rather than to search for epistemic certainty. Living in complex and rapidly changing environments, professionals already have to deal with uncertainties as well as a complex body of knowledge in their field, so why would they need extra confusion to feed their creativity? The answer lies in professionals’ tendency towards overconfidence. Recent research findings (Cassam 2017) show that physicians’ overconfidence is a major factor in diagnostic errors in the field of medicine. Overconfidence is considered a form of cognitive bias in the sense that it can prevent people from seeing their own weaknesses and mistakes. Our interpretation characterises negative capability as the capacity that can lead people not just into but through intellectual confusion and uncertainty, so as to develop their resilience in handling insecurity and avoiding overconfidence.

An obvious candidate for a corrective virtue to combat overconfidence would be epistemic humility. Cassam (2017, p. 5) suggests that ‘[e]ffective critical reflection on one’s own epistemic and other vices requires the possession and exercise of epistemic virtues such as open-mindedness, sensitivity to evidence, and the humility to acknowledge one’s vices’. Epistemic humility is also crucial to the ability to learn from one’s mistakes. An epistemically arrogant subject is less likely to recognise having made a mistake and thus misses an opportunity to learn and become a better knower. The metacognitive skills of knowing what one knows and how one knows it, as well as what one does not know, require that one does not have an inflated opinion of oneself. This is true for an ‘agential conception’ of knowledge and ignorance (El Kassar 2018). What matters for the agential conception is that knowers are humble and honest with themselves regarding their own capabilities. However, a relational account of ignorance must grapple with situations in which different people with different vices, backgrounds and motives interact. As we discuss below, being humble can prove risky in a culture of ignorance or where manufactured uncertainty exists.

Thus far, the discussion of negative knowledge has focused on the individual perspective of professionals that arises from practical experience and dilemmas within organisational contexts. The connection with practical experience and concrete situations represents a crucial aspect of negative expertise. In concrete situations, most experts operate in messy epistemic situations, solving problems that have several possible solutions depending on the context. Professionals must rely on the knowledge of other experts and scientific institutions to provide diagnoses, analyses, considered opinions and policy proposals to non-experts, including politicians and government officials. As Townley (2006, p. 40) points out, ‘We need each other for full epistemic agency […] I need membership in a community of epistemic agents who will advise and correct me as I cultivate, refine, and maintain skills of reasoning and inquiry’. In both their epistemic communities and their roles as experts who interact with members of other epistemic communities, experts must be mindful of not only their own ignorance and epistemic vices, but also those of others. In the next section, we consider the epistemic potential of ignorance to develop new insights into critical points of professional knowledge.

Ignorance as a relational epistemic concept

According to feminist epistemologies, ‘relations’ form the heart of epistemological theory, drawing attention to the structural and relational aspects of knowing, as opposed to the individual approaches, which have long dominated Eurocentric epistemological theories (e.g. Anderson 2011; Code 1991; Fricker 2007; Potter 1993). Feminist epistemologists seek to understand not only how the social relations of gender, class and race shape knowledge practices, but also whether and how those relations should play a role in good knowing. Emphasising testimonial approaches, feminist epistemologists consider that knowers are able to contribute to the construction of knowledge due to the relationships that they have with others. Obviously, there exists a great deal of variation in the theories and approaches that constitute feminist epistemology, so few generalisations can be made across the field. However, one of the key interests of the approach concerns how power relations play out epistemically, especially systemic and structural relations of power (Alcoff and Potter 1993). Feminist arguments regarding the importance of trust in terms of knowing (Code 1991), coupled with their analyses of the moral and affective dimensions of trust (Jones 1996), also suggest that affective relations are important aspects of epistemic analyses (Grasswick 2018). In addition, feminist epistemologists have played a prominent role in the development of the epistemologies of ignorance, addressing how philosophers must attend to the role of ignorance if they are to adequately understand the influence of power relations on epistemic matters (e.g. Alcoff 2007; Sullivan and Tuana 2007; Townley 2011). El Kassar’s (2018) division of ignorance studies into three conceptions—propositional, agential and structural—shows that the relational aspects of ignorance remain undertheorised.
It is not just structures in the abstract, but rather the actual relations between persons, groups and institutions that we need to look at if we are to understand failures of testimony, strategic ignorance or epistemic injustice. This does not necessitate the abandonment of the structural explanations of ignorance, but instead requires attention to be paid to what mediates structure and agency.

Social epistemologists (Longino 2002) have argued that there are deep connections between democracy and the success of epistemic practices, addressing the effects of both the social relations internal to epistemic communities and the social relations external to those communities. Modern societies cannot function without individuals admitting the limits of their own knowledge and trusting in other people’s expertise and, therefore, without a social division of labour and a reliance on experts, professionals and intellectuals. Similarly, in democratic societies, expertise and government rely on each other. Although professions vary in terms of the legitimacy of their claims to expertise and in their status, the most important capital of expert organisations remains individual professionals’ knowledge, skills and competence (Suddaby et al. 2008).

Expert organisations such as hospitals and law firms rely on individual professionals’ knowledge, skills and competence (Suddaby et al. 2008) in the sense that knowledge formation is frequently developed through professionals’ collaboration with colleagues and customers alike. In many professions, ranging from medicine to education, social work to law, citizens, customers, patients, students and parents all play an active part in the knowledge production processes, since professionals are reliant on personal information received from such people. Thus, knowledge is no longer mere capital, possessed by an individual or a company, but is rather inherently relational in nature. To develop epistemic relationality with their clients, experts must accentuate reflective capabilities and negotiation skills in their work. However, the paradox here is that the epistemic relation between experts and laypeople never constitutes symmetrical dialogue due to their different positions as epistemic agents. In addition, for strategic or opportunistic reasons, customers may be excluded from the sources of knowledge in decision-making processes when dangerous, confidential or sensitive knowledge cannot be allowed to fall into the wrong hands (Knudsen 2011). There is also the possibility of information being actively hidden so that customers and citizens are kept from hearing the justifications involved in strategic decision making. Information flows can be limited for legal or public relations (PR) purposes or for reasons of national security. Connelly et al. (2011, p. 65) define the hiding of knowledge as ‘an intentional attempt by an individual to withhold or conceal knowledge that has been requested by another person’. In law firms, hospitals and administration organisations, knowledge represents not just power but also risk, meaning that professionals will often keep certain ideas secret or limit their circulation so as to avoid their improper use.
In the era of the Internet, the circulation of information following publication is far more difficult to control.

Previously, most epistemic discourses concerning professions emphasised the dominance of experts over non-experts, but the power of customers has changed form with the advent of the Internet and the rise of social media. In the online environment, customers and citizens have new access to various types of knowledge sources as well as to their peers around the world. They can organise peer groups related to their interests, which gives them stronger backing and new tools to doubt or dismiss the judgement and testimony of professionals (e.g. Del Vicario et al. 2016; Moorhead et al. 2013; Ventola 2014). Recent empirical findings from various expert fields (e.g. van der Linden et al. 2017; Moorhead et al. 2013) show that professionals struggle with the pressures associated with the online (dis)information produced by various agents (peer groups, activists, think tanks, etc.). Climate change denial is a paradigmatic case in which there have been both concerted and organised efforts to produce disinformation and to manufacture uncertainty on the part of powerful interest groups as well as less centralised activities by citizens with various motives to foster distrust regarding climate science and policy.

In propositional and agential approaches, ignorance is mainly discussed as a failure of individual cognition, either as a non-accessible epistemic position or as a selective choice. For instance, the notion of excusable ignorance captures ‘…circumstances where there is a plausible excuse of the individual’s being ignorant’ (Rescher 2009, p. 11). The notion of culpable ignorance refers to a situation in which the requisite information is available, but an individual has made insufficient, incompetent or inadequate efforts to obtain it (Rescher 2009, p. 11). Relational and structural approaches frequently understand ignorance as a resource that is distributed unequally or as a deliberately constructed strategic ploy (Proctor and Schiebinger 2008). Manufactured ignorance is defined as a strategy used by different interest groups, including global corporations, to advance their agendas and to defend their social positions and power (Michaels 2008; Oreskes and Conway 2010; Proctor and Schiebinger 2008). Ignorance is manufactured when certain epistemic agents have an interest in maintaining uncertainty; thus, they actively work to organise doubt or spread mis/disinformation so as to keep people ignorant. In this sense, disinformation represents an actively engineered aspect of a deliberate plan involving trolling on social media, advertising, delivering press releases, funding decoy research and publishing articles in predatory scientific journals.

The repetition of (dis/mis)information in various media is known to make an argument appear more reliable in the eyes of receivers, regardless of whether the argument is true or not. In the field of psychology, the repetition of false information is defined as the ‘misinformation effect’ (Ayers and Reder 1998) and, more recently, it has been noticed that the mimetic dynamics of ‘memes’ can carry disinformation on social media (Del Vicario et al. 2016). People can be influenced by mis/disinformation even when they understand, believe and later remember the corrected information. Even in those circumstances, people’s reasoning and decision making will, on average, be measurably affected by the misinformation that was later corrected (Ecker et al. 2010). In the case of manufactured ignorance, being ignorant is not necessarily a failure of individual cognition, since people can easily become trapped by ‘confirmation bias’, even if they have access to critical sources of knowledge (Lewandowsky et al. 2017). Confirmation bias refers to the tendency to only look for information that confirms what a person already believes, to accept facts that only strengthen her preferred explanations and to dismiss data that challenge what she already accepts to be true (e.g. Dunning and Kruger 1999).

Dismissing relevant evidence, or what is referred to as the suppressed evidence fallacy in the study of informal logic, represents the other side of confirmation bias. Much of the evidence that, of necessity, we use in relation to complex issues such as climate change or the safety of medical procedures is expert testimony. Levy (2019) suggests that, when filtering expert testimony, we pay attention to the ‘cues of benevolence and competence’ and that, in politicised situations, we interpret those cues from a partisan perspective, which leads us to devalue expertise that we consider to support the side opposing our own. For example, climate change denial has been found by many researchers to be connected to a commitment to laissez-faire economics and opposition to regulation (e.g. Collomb 2014; Oreskes and Conway 2010), while party identification in general is a predictor of attitudes towards climate science (McCright et al. 2014).

Experts from different fields encounter the boundaries of their own knowledge and their own unknowns, but they also need to negotiate with ‘ignorant’ customers on a daily basis. What kinds of capabilities and strategies do experts have at hand when ‘managing ignorance’ (Denicola 2017)? How do they deal with customers’ and citizens’ ignorance when it is not a form of culpable or excusable ignorance, but rather manufactured ignorance? If the claims concerning strategic ignorance being produced systematically by certain corporations and organisations are true, then experts face a new problem that cannot be resolved simply through ‘good interaction skills’. The change in the global context means that the domain-specific negative expertise that professionals have acquired may not be sufficient for them to recognise new modes of ignorance.

From the perspective of expertise, we identify two levels of managing ignorance, namely (1) micro-level interaction, in which an expert deals with an individual customer’s culpable or excusable ignorance either face-to-face or online, and (2) macro-level interaction, in which an expert faces manufactured ignorance through the actions and rhetoric of individual agents or groups of agents. In the first case, experts, relying on their negative capability as a work skill, are expected to be mindful and open regarding the limits of customers’ knowledge. Experts should recognise customers’ finitude; try to map customers’ known unknowns; be aware that unknown unknowns always exist, some of them knowable, some not; make informed decisions about what to know; and also recognise and acknowledge when customers are not capable of forming relevant knowledge. In virtue theoretical terms, among the relevant traits that ought to be cultivated in relation to expertise are curiosity or love of knowledge, honesty, humility and tolerance (Cassam 2016; de Bruin 2013; Denicola 2017). Many of these are ‘other-regarding’ virtues, since experts qua experts are members of communities (Kawall 2002). In a situation in which true beliefs and proposals have value, experts’ negative knowledge provides a precondition for coping with the epistemic asymmetry between experts and laypeople, provided that both parties mutually recognise their own epistemic limits and are humble and honest with regard to their interests in knowing.

In the second case (i.e. manufactured ignorance), relying on the value of true beliefs and the good faith of interlocutors appears idealistic and naive. Turning back to the case of climate denialism, Oreskes and Conway (2010) show how purposefully creating doubt and uncertainty has been a common strategy among spokespeople for the tobacco industry and climate denial-related think tanks (e.g. the Heartland Institute). When the link between lung cancer and tobacco became clear to researchers and they made this finding public, the tobacco industry mounted a counter-campaign that attempted to put forward the message that the science was not certain. Similarly, and sometimes run by the same people, there has been a powerful campaign of disinformation disputing or creating doubt with regard to the scientific consensus concerning human-induced climate change. Tobacco companies and fossil fuel companies may have, in their public announcements at least, accepted the mainstream science, but they have continued to fund the lobbyists and think tanks producing and spreading disinformation.

Following Proctor and Schiebinger’s (2008) formulation, Oreskes and Conway (2010) refer to this strategy as ‘manufacturing doubt’. The point of this strategy is not to argue against scientific findings straightforwardly, but rather to allege that they are uncertain. The hearer of the doubtful message is supposed to ask themselves, in the case of smoking, whether they would be willing to give up an activity they enjoy and that is part of their identity, while in the case of climate change, they are supposed to ask whether they should give up cheap energy and cars and endanger jobs due to uncertain theories. The difficulty for a medical doctor advising a patient to quit smoking or for a climate scientist giving an interview concerning the effects of climate change on crops in Kansas is that it would be hubristic or dishonest for them to claim complete and absolute certainty. The causal links between smoking and cancer are well understood, but they are not straightforward enough to state that a particular patient will die exactly 20 years earlier if they smoke. While it is absolutely clear to scientists that human-induced climate change is taking place, predicting the local effects is less certain. A virtuous expert would, in such situations, give probabilistic answers and confess if their expertise is limited in terms of a particular question. However, if the manufacturing of doubt has been successful, hearers may consider such admissions of uncertainty as proof of more general uncertainty surrounding the issues and hence become even more stubborn in relation to their smoking or climate change denial.

Epistemic strategies for encountering manufactured ignorance

We now address three strategies for managing macro-level ignorance in non-ideal epistemic settings in which manufactured uncertainty and ignorance exist. To describe these different strategies in a somewhat truncated fashion, we refer to them as ‘Kantian’, ‘Machiavellian’ and ‘Marxist’ strategies. They are all ideal types of epistemic strategies, not interpretations of philosophical history. Rather than setting goals and determining actions, they can be understood either as deliberately chosen strategies or as character traits, that is, as fixed sets of epistemic virtues and vices.

A Kantian would be honest and humble in all situations and with all interlocutors. Kant’s (1797/1993) famous and much discussed example concerns whether it is permissible to lie to an axe-murderer in order to save one’s friend, with Kant, infamously, advocating honesty even then. A Kantian strategy for managing ignorance would face the same criticism as Kant’s moral theory: it is too rigid for a world in which most people are not very good Kantians. An epistemic Kantian is especially vulnerable to manufactured ignorance and situations in which some interlocutors act in bad faith. These criticisms notwithstanding, it is not impossible that being consistently humble and honest in even the most non-ideal of worlds might pay off, for example, if enough others see one as an example to be emulated. Additionally, the Kantian impulse to see interlocutors not as a means, but rather as an end, remains as relevant to experts as ever.

In direct opposition to the Kantian strategy is the Machiavellian strategy, which advocates strategically lying or being otherwise dishonest so as to produce the epistemic greater good. While this certainly sounds bad, epistemic Machiavellianism is a commonly accepted practice in many contexts. Consider, for example, the physics teacher who teaches Newtonian physics as ‘the truth’ for beginners, but problematises Newton with general relativity and quantum field theory for more advanced students. There are many other examples of instrumental and transitional ignorance, where sometimes the ignorance serves the learners themselves and sometimes it serves a more general good (Townley 2006). The problem with more generalised Machiavellianism is that dishonesty can become second nature, that is, the seemingly obvious solution to any communication problem. Moreover, Machiavellianism means putting oneself above others as someone who knows both what the greater good is and how to achieve it. This may in turn lead to different forms of epistemic injustice if the expert treats others unjustly as knowers (Fricker 2007).

A Marxist strategy would involve arguing that if the world makes epistemic virtues difficult, then experts must strive to change the world. Rather than debating whether or not they ought to be honest in this or that situation, members of epistemic communities ought to eradicate the culture of ignorance and combat the conditions of manufactured uncertainty head on. The problem with this strategy is that it is very difficult to implement. It is also unhelpful when a doctor needs to provide a diagnosis today or a climate scientist needs to give an interview tomorrow. In any case, changing the social context requires changing the opinions of others; thus, the original problem of dealing with the ignorance of others is reintroduced. This does not mean that experts should cease trying to change the context of their epistemic work, only that they should not be paralysed in concrete situations, even if the social epistemic context is problematic. They can pursue several different epistemic goals with several different timeframes, although to do so, they need more than one strategy and they need to know which strategy is most appropriate in a given situation.

A reformulation of negative expertise: the phronetic skill to manage ignorance in complex social settings

In a complex and rapidly changing world, a particular epistemic strategy will sometimes work and sometimes not. Similarly, a person with a certain set of virtues may flourish in one context and fail in another. For an expert, situated and embedded in a specific social context, there can be no set solutions for managing ignorance. Rather than arguing for a specific strategy or a specific set of virtues, we propose reformulating the concept of negative expertise so as to accommodate the advances of social epistemology and feminist epistemology in terms of understanding the social and political contexts of ignorance. In earlier formulations, negative expertise was understood as a metacognitive skill concerning self-reflective recognition and the management of the knower’s own epistemic limits. In our formulation, it is also a social skill. To be more precise, the concept of negative expertise should be reformulated to include the ability to

  (a) understand and cope with the epistemic limits of others and the possibility that they may act in bad faith;

  (b) understand how things can go wrong in interpersonal and group-level contexts;

  (c) understand the institutional, structural and cultural constraints and possibilities of both knowledge and ignorance; and

  (d) understand and manage one’s own epistemic limits in terms of understanding the limits of others.

Thus reformulated, the concept of negative expertise implies the use of practical reason: it is a phronetic skill. Recognising one’s own epistemic limitations and being able to act in the resultant state of uncertainty and confusion requires the ability to recognise and act for particular reasons, as well as to understand the relations between means and ends, even if (or, perhaps, especially if) only intuitively and tacitly. Recognising the non-ideal conditions of manufactured ignorance reinforces the need for prudence.

Consider again the examples of climate change and smoking. They are both time-sensitive issues with high stakes. The stakes are not just epistemic, but also matters of well-being, which are ultimately matters of life and death. There is an ongoing debate in the field of philosophy of science concerning how moral and other non-epistemic reasons and values are related to epistemic reasons and values (e.g. Douglas 2009; Rolin 2015), although for our purposes, it is sufficient to say that part of the expertise of doctors, climate scientists and other experts should be the capability to investigate how morality, politics and epistemology are specifically related in their particular situations. The negative aspect of this expertise concerns being able to act in cases in which there exists uncertainty over this relation. All possible cases involve a degree of uncertainty. This uncertainty has many sources, one of them being the opacity of the political, economic and moral motives of our interlocutors.

In relation to both climate-change- and smoking-related denialism, economic interests, positions of power and epistemically and morally vicious activities have become entangled in a striking fashion. This is not to say that all denialists are on the payroll of the relevant industries. The social production of ignorance may be a complex process involving participants from different social strata, with different motives and with different degrees of intentionality. For an expert operating in a field in which manufactured ignorance exists, a certain degree of Machiavellian context sensitivity is required: ‘[the ruler] should imitate both the fox and the lion, for the lion is liable to be trapped, whereas the fox cannot ward off wolves. One needs, then, to be a fox to recognise traps, and a lion to frighten away wolves.’ (Machiavelli 1988, p. 61). Besides ‘trappers’ and ‘wolves’, there are people who act in good faith, but who are mistaken; people who believe they are acting in good faith, but who are biased in one way or another; people who are otherwise epistemically virtuous, but who are in the grip of some ideology or a conspiracy theory; and people who know something important that we do not, but whose knowledge we do not immediately recognise or understand, perhaps due to our own biases and epistemic injustice. Experts may have to provide testimony or take part in discussions in which all these kinds of people are present at the same time, all listening to and trying to convince each other. Navigating such a situation is not easy, but if negative expertise is a skill that can be learned, then it is possible.

When they encounter manufactured ignorance, experts with negative expertise can recognise it and consider different strategies for dealing with the situation in a responsible manner. They can understand that the reasons for epistemic adversity may be non-epistemic, instead stemming from deeply held political, moral or religious beliefs. If, for example, experts notice that their interlocutors dismiss them and the evidence they present using arguments that suggest political animosity, they can then try to reformulate their arguments so that they take into account the core values of their interlocutors. In such situations, the epistemic humility of the experts may acquire new significance in the sense that, while they may be confident that they know their own field better than the interlocutors, they can recognise that they may know much less about the general background and specific reasons for disbelief that the interlocutors hold. This could in turn lead the experts to assess whether they ought to learn more about the beliefs and commitments of their interlocutors in order to be able to serve more effectively as experts in the situation in question.

Feminist epistemologists have made important strides in theorising epistemic injustice and the interrelations between power and ignorance (Code 2004; Fricker 2007). The recognition of the feedback loop between a position of social power and the production of ignorance represents one such advance. Experts, since they still tend to occupy privileged social positions, may be particularly prone to committing the epistemic injustice of misrecognising the intellectual capacities and knowledge of those who are less privileged (Medina 2013, pp. 30–40). The possibility, and even the probability, of some degree of epistemic injustice and other bias is hence another social source of uncertainty for experts. Insofar as experts operate in a social context in which they mostly discuss problems with other privileged people as equals or as authority figures they look up to, the risk of epistemic injustice is worsened (Medina 2013, pp. 30–40). Managing these risks represents an aspect of negative expertise.

Two possible interrelated objections to our reformulation of negative expertise can be offered, namely that it is too demanding and that it is too broad. Negative expertise is demanding because we expect experts to be aware of not only their own epistemic capacities, but also those of others. Further, we expect that they should have the social skills necessary to recognise when some interlocutors are acting in bad faith, while we also suggest that they ought to understand complex sociological entities such as social structures, institutions and cultural conditions. Such a broad formulation also seems counter-intuitive to the understanding of expertise as being limited to a specific domain. Indeed, why should doctors or climate scientists understand social structures when we have sociologists for that?

One response to the demandingness objection is that we are still operating within a virtue theoretical framework. Having and developing the skills that we suggest is certainly virtuous, although being able to use them successfully all the time is not an absolute deontic demand. In fact, developing one’s skills is part of what being an expert is all about. If negative expertise forms part of expertise, then learning the metacognitive and social skills needed to cope with one’s own ignorance and the ignorance of others is part of an expert’s work in the same way that attending professional conferences and courses and reading professional literature are. Additionally, the demandingness is not an abstract postulation; rather, it arises from the concrete historical situations that we have discussed above. It is possible that, in less complex times, negative expertise was less important.

The broadness objection can also be partly answered by reference to the relevant social conditions. It may be that, in order to play their part in the epistemic division of labour, experts from different domains are required to be amateur sociologists. For example, a doctor who understands how the different social contexts of different patients may affect their understanding of clinical matters will be better able to offer them recommendations than someone who does not. A climate scientist who is asked to provide expert testimony to right-wing politicians will be better able to motivate them to listen and understand if she takes into account how their political commitments may influence their hermeneutic and conceptual frameworks. In addition, in an uncertain, rapidly changing and increasingly complex world, the boundaries between domains of expertise may shift in unexpected ways. Purely focusing on one’s own area of expertise may hence lead to failure in even that specific domain.

We do not propose that experts learn negative expertise on their own. It could form part of their training, and there should be room to develop it within the epistemic communities in which experts work. There could also be sub-types of negative expertise and, therefore, an epistemic division of labour. For example, some experts may be knowledgeable about biases and social conditions, while others may have good ‘people skills’. Learning and practising negative expertise in concert with others could thus alleviate the concerns about both demandingness and broadness.

Conclusion

In this article, we have discussed the external pressures that experts face as well as how such pressures lead to the need to re-evaluate the epistemic virtues and strategies of experts. The case of manufactured ignorance is especially troubling, but it is also illuminating in terms of highlighting the different types of epistemic pressures at play. To understand manufactured ignorance, it is not sufficient to simply understand epistemic vices and to then cultivate the right virtues to counter them. An awareness of the social and cultural conditions of the production and take-up of manufactured ignorance is also required, as is an understanding of the meso-level, that is, the level of groups and institutions. We have suggested that, in a complex and non-ideal world, experts need a specific skill to cope with their own and others’ ignorance: negative expertise. Where negative expertise has previously been understood as a metacognitive skill in an individualistic manner, we have proposed a reformulation in which it is also a social skill. As a skill, negative expertise does not represent an alternative to a virtue epistemological understanding of expertise. Rather, having certain virtues may render learning negative expertise easier, while learning negative expertise may in turn serve to develop the expert’s character in such a way as to cultivate virtues.

We have discussed negative expertise as being primarily a defensive skill. The positive aspect of negative expertise, however, lies in the recognition of uncertainty as being a source of innovation, errors as being integral to learning and unlearning as being an important skill in relation to knowledge formation. This becomes more complicated, both epistemically and morally, when the uncertainty and errors do not solely belong to oneself. When and how is it either morally acceptable or epistemically virtuous to use the ignorance of others for innovation, for example? The positive aspects of ignorance in social settings require further study and we expect the concept of negative expertise to play a role in such inquiries.