Introduction

Misinformation is rampant in our society today. Misinformation affects public health, such as when citizens reject scientific advice on how to control and end the pandemic. Climate change is one of the top areas for misinformation and disinformation on the web. We even see disinformation campaigns designed to influence public opinion concerning war, such as the falsely labelled video of Ukrainian President Zelenskyy gunning down citizens with a machine gun. That violent scene was actually staged for a television series, Servant of the People, in which Zelenskyy played the lead role before being elected president.

Information falls into distinct categories. Figure 1 displays three labelled quadrants defined by two axes. The vertical axis is whether or not there is an intent to deceive. The horizontal axis is whether the information is misleading or not. Trustworthy information falls in the upper right-hand quadrant of this plot, because it is truthful and there is no intent to deceive.

Fig. 1

Types of information. The quadrants map information, misinformation, and disinformation depending on the intent to deceive and the accuracy of the message

Information from the upper right-hand quadrant is what we want to distribute to the public, decision makers, and other scientists. To the left is misinformation, which is misleading but carries no intent to deceive. Very often those who spread misinformation genuinely believe that the information they are sharing and advocating to others is correct. In contrast, disinformation in the lower left quadrant is not only misleading; there is also a conscious intent to deceive. The purveyors of disinformation usually have a motive for sowing confusion, and the result is that people often take actions that are not in their own best interest. The WHO (World Health Organization) encompassed both misinformation and disinformation when it declared an infodemic occurring in tandem with the current pandemic.

Disinformation in climate change

Disinformation is serious, especially in the case of climate change, because it erodes the willingness of society to take the mitigating measures required to avoid cascading consequences on our warming planet. One very specific example of a disinformation campaign targeted NASA. The US space agency posted a blog on the web in September of 2019 that reported correct information on the Sun’s role in climate change. It points out that the Sun keeps the planet warm enough for us to survive. It also mentions natural influences on climate, including the subtle changes in Earth’s orbit around the Sun that were responsible for the waxing and waning of glaciers during the Ice Ages. However, the anthropogenic warming we have experienced over the last few decades is too rapid to be linked to changes in Earth’s orbit and too large to be caused by solar activity. The blog explains that, while climate changes in the past were caused by orbital and solar variability, the current rapid warming cannot be attributed to natural causes.

Unfortunately, the information in that NASA blog was cherry-picked by a group called Natural News, a known purveyor of disinformation. Incorrect postings from this group claimed that NASA “admits” that climate change occurs because of changes in Earth’s orbit around the Sun, an effect that has long been known but operates on time scales of tens to hundreds of thousands of years. The disinformation went further to contend that climate change is not caused by fossil fuel combustion, even though the NASA post made no such claim.

Sadly, such disinformation became the top-performing climate content on the web in 2019, outperforming all of the correct information on climate change. It ended up with 4.2 million impressions, the sum of tweets, Facebook shares, Reddit posts, and other social media sharing. This is a classic case of twisting correct information into a highly publicized piece of disinformation. Numerous factors contributed to its wide uptake. To begin with, it started with a nugget of reputable information: the widely accepted Milankovitch theory that subtle variations in Earth's orbit (obliquity, eccentricity, and the precession of the equinoxes) can cause long-term changes in climate. These factors change far too slowly to be responsible for the current warming. Next, the many attempts to post corrections did not get the same engagement from the public. Anyone who read the disinformation and was sceptical of the claims could easily have found corrections, had they tried. But for most, it was far more attractive to read that society might not be responsible for climate change. The disinformation played on readers’ bias against making changes in their lifestyle by limiting fossil fuel consumption. Finally, the post was easily and widely shared among like-minded social groups that did not include those in a position to debunk the claim. By the time the post came to the attention of people who could point out its flaws, it had already gone viral. As a result of this widely seen disinformation, search engines now prioritize the fact-check debunking the Natural News piece. But unfortunately, the original disinformation is still available on the web. The problem persists.

Misinformation on covid-19

The climate change example above clearly involved disinformation, because Natural News intended to deceive its readers. In contrast, there was no intent to deceive with much of the misinformation regarding COVID treatment, even though it led to excess deaths by discounting the importance of vaccines, face masks, and FDA-approved treatments, and by promoting therapies that had not been scientifically validated. The promotion of ivermectin to prevent or treat COVID was basically the perfect storm of misinformation.

The problem started with a science study that did not rise to the quality level expected in clinical medicine. The data sample was small, and there was no untreated “control” group against which to measure the efficacy of the treatment. Nevertheless, the study hinted at the possible benefit from ivermectin as a treatment for COVID. The paper was published in a peer-reviewed journal, making it difficult for anyone but experts to distinguish this poorly controlled study from much higher quality research. Adding to the confusion, ivermectin was already FDA-approved for human use for treating parasitic intestinal worms with a doctor’s prescription, and thus the argument could not be made that the treatment is dangerous for humans. The final problem was that ivermectin is widely available at feed stores for deworming horses and cows, although in formulations that can be lethal to humans. The combination of the availability of the drug, the knowledge that humans can take it, and the weak scientific study made it difficult to dissuade people from taking the drug. What resulted was a surge in overdoses from use of the horse version of the drug and a 24-fold increase in off-label use of the drug supplied by doctors’ prescriptions. An additional problem is that proponents of ivermectin diverted people from proven interventions, such as vaccines and masks, which have much stronger efficacy.

Predictably, failure to vaccinate prompted by misinformation causes hospitalizations and deaths. Consider the case of Madison, Wisconsin (Fig. 2). Out of every 100,000 fully vaccinated people, just 125 were expected to get COVID, of whom about 5 would be hospitalized and only one out of one million would die. Among people who were not fully vaccinated, three times as many would catch the disease, four times as many would have to be hospitalized, and more than 10 times as many would die from COVID. While this is only one example, there is good evidence that vaccination is very effective protection.

Fig. 2

Rates of COVID-19 infection, hospitalizations, and deaths for vaccinated versus unvaccinated people as of July 2021. Data from the Wisconsin Department of Health Services as reported by Statista
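
As a back-of-envelope check, the short sketch below (a minimal Python illustration, not a reanalysis of the Wisconsin data) converts the vaccinated baseline quoted above and the stated multipliers into side-by-side rates per 100,000 people; all figures are the approximate values from the text.

```python
# Illustrative arithmetic only; baseline rates and multipliers are the rough
# Madison, WI figures quoted in the text, not an independent analysis.
vaccinated = {            # per 100,000 fully vaccinated people
    "cases": 125,
    "hospitalizations": 5,
    "deaths": 0.1,        # "one out of one million" = 0.1 per 100,000
}
multipliers = {           # approximate multipliers for people not fully vaccinated
    "cases": 3,
    "hospitalizations": 4,
    "deaths": 10,         # "more than 10 times" is treated here as exactly 10
}

for outcome, base_rate in vaccinated.items():
    print(f"{outcome}: {base_rate:g} vaccinated vs "
          f"{base_rate * multipliers[outcome]:g} unvaccinated per 100,000")
```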

In the case of ivermectin, it was nearly impossible to sort out the confusion. There was too much nuance (e.g., the relative strength of the original scientific study) for the public to correctly understand what to do. The topic pitted doctors who understood the danger of ivermectin against other doctors who felt that it could not hurt, and might actually help, patients who were opposed to vaccination. The combination of FDA approval for ivermectin use in the US and the ease of acquiring the drug without a prescription made off-label use appear to be scientifically defensible, even though the study design was weak. Compounding the problem was that some patients given ivermectin actually did improve. Statistically, about 40 percent of all patients improve just by chance, with no intervention.
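
The role of chance improvement is easy to demonstrate with a short simulation. The sketch below (illustrative Python, assuming the 40 percent spontaneous-improvement rate cited above and a hypothetical cohort size) shows that a treatment with zero effect will still appear to "work" for a large fraction of the people who take it.

```python
import random

random.seed(0)

SPONTANEOUS_IMPROVEMENT = 0.40   # assumed base rate of improvement without any treatment
N_PATIENTS = 10_000              # hypothetical cohort size

# Patients take an ineffective drug: any improvement is driven entirely by the
# spontaneous recovery rate, not by the drug itself.
improved = sum(random.random() < SPONTANEOUS_IMPROVEMENT for _ in range(N_PATIENTS))

print(f"{improved} of {N_PATIENTS} patients improved ({improved / N_PATIENTS:.0%}) "
      "even though the drug did nothing.")
```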

While it is tempting to place blame on the authors of the poorly controlled study or the journal that published the work, in the end, no one wins when a legitimate debate is prevented. It is essential that scientists with different views be able to talk about the evidence. One can also have a legitimate debate about when the claim that ivermectin is an effective therapy for COVID transitioned from “information” to “misinformation.”

An additional factor is the large number of effective channels for spreading misinformation. For broadcast and print media, gatekeepers determine which voices are reputable enough to distribute information through those channels. In contrast, anyone can post information on the web and via social media, spreading misinformation much more quickly. The business model of these platform providers indirectly encourages posting misinformation, because such content is more likely to be surprising or to support viewer bias, prompting more reader engagement.

Countering misinformation and disinformation

Social science research shows that the more salacious the content, the more likely it is to spread. Bad actors know how to prey on readers’ preconceived biases to influence their opinions, harden their positions, and create polarization through disinformation. A number of interventions could reduce the influence of misinformation and disinformation, but few are actually within the purview of science. For example, anyone can create content on the web or on social media, and scientists cannot control what is posted. Scientists cannot control the business model for platform providers that encourages and amplifies inaccurate content. Scientists cannot influence how exciting the misinformation is likely to be, and they have virtually no control over bad actors who want to prey on readers’ preconceived biases. Intervening in these areas is the role of policy makers and the business community. Yet there are areas where scientists can be effective.

Anticipate distortion and amplification

Scientists can start by anticipating how information will be distorted into misinformation or disinformation, and how pre-existing phobias will be amplified. A forward-looking communication plan can thwart misinformation before it gains a toehold.

As scientists, we can also educate students of all ages in critical thinking and in how to apply healthy scepticism when assessing something that seems too good to be true. Providing students with resources for checking the validity of information is part of this intervention. We as scientists also need to diversify our own information channels, so that we know when misinformation is spreading and can intervene early. Focusing only on scientific journals and mainstream media allows misinformation to spread unchecked.

Communicate how science builds trust

Scientists should also be explaining how science builds trust. Too many members of the public assume that science is a list of facts. Instead, science is constantly evolving, with new ideas continually being proposed. Scientists have proven ways of verifying those new ideas and building trust in them. There also needs to be a focus on communicating concepts such as risk and exponential growth.

As one example of an intervention at the National Academy of Sciences to anticipate future challenges from the pandemic, we conducted a science scenario-building exercise. The purpose was to anticipate possible futures dictated by forces beyond our control and to determine no-regrets actions that make sense regardless of which future materializes. Figure 3 shows an example. When we first conducted this scenario building, it was early in the pandemic. We now know that the virus is not predictable and that a good proportion of US society has not been proactive. Nevertheless, the exercise was useful in identifying no-regrets actions regardless of which of the four possible futures materialized, such as addressing supply chain problems for personal protective equipment and limiting opportunities for exposure.

Fig. 3

Example of four possible futures for the COVID pandemic, depending on whether the virus is predictable or constantly mutating (vertical axis), and whether society responds proactively to complex challenges (e.g., by getting vaccinated) or reactively (horizontal axis)

Build critical thinking skills

Scientists can also help students develop critical thinking skills and healthy scepticism, and teach them how to find resources for verifying claims. The National Academy launched a video game for all ages, freely available on the web. The game, Cat Colony Crisis, has the player shepherd a spaceship of cats headed to a new planet. On the way, a pandemic breaks out on the spaceship. The player must determine how to get the maximum number of cats to the new planet safely and in good health. The spaceship is analogous to Planet Earth, the cats are humankind, and the person playing the game is the decision maker who must care for the survival of the cats. The game has been downloaded many tens of thousands of times.

Broaden information channels

Broadening information channels is a challenge for everyone, not just scientists. Social media algorithms are set to direct individuals to feeds that best match the messages they have sought in the past. The net effect is reinforcement of what we already believe and a lack of easy access to viewpoints that might help us understand how other people think. To counter this, the National Academy of Sciences launched a new program to address misinformation called Based on Science. Platform providers alerted us to topics frequently sought through search engines for which there was no authoritative content to which users could be directed, so only misinformation was surfacing. The Academy then tasked scientists with generating content to fill the gaps, and professional communicators with presenting that material in a way that the general public would understand.
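
The reinforcement loop described above can be caricatured in a few lines. The sketch below is purely illustrative (real platform ranking systems are proprietary and far more complex, and all names and topics here are hypothetical): posts are scored by how often the user has engaged with their topic before, so past preferences dominate what is shown next.

```python
from collections import Counter

def rank_feed(posts, engagement_history, top_k=3):
    """Score each post by how often the user previously engaged with its topic,
    so content matching past behaviour is pushed to the top of the feed."""
    topic_counts = Counter(engagement_history)
    ranked = sorted(posts, key=lambda post: topic_counts[post["topic"]], reverse=True)
    return ranked[:top_k]

# Hypothetical user who has mostly clicked on climate-sceptic content.
history = ["climate-sceptic", "climate-sceptic", "sports"]
posts = [
    {"id": 1, "topic": "climate-sceptic"},
    {"id": 2, "topic": "climate-science"},
    {"id": 3, "topic": "sports"},
    {"id": 4, "topic": "climate-sceptic"},
]

print(rank_feed(posts, history))   # sceptic posts dominate; corrective content is buried
```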

The Based on Science project was particularly effective during the early days of the COVID pandemic in discounting unsupported rumours and clarifying effective interventions for the general public. It has been one of the most successful public information campaigns in the history of the Academy. As of September 2021, as many as 2 million people per month were accessing this information.

Explain how science builds trust

A more difficult challenge is communicating how scientists themselves determine what to trust. The analogy I developed to communicate trust in science is that of a giant game of Jenga (Fig. 4), in which players try to remove 2 × 4 wooden blocks from a tower without the tower toppling. In my analogy, the scientific method is the base of the Jenga tower. Scientists trust the scientific method as the most proven self-correcting approach for revealing the laws governing the natural world. At the top of the giant Jenga tower is the scientific consensus. That consensus rests on the foundation of all the 2 × 4s, the individual studies. Scientists are always sceptical of the validity of any one study, but to the extent that the studies together are consistent with the consensus, they are viewed as valid.

Fig. 4

“Tower of Trust.” Scientists most trust the scientific method, the foundation of the tower. They also trust the scientific consensus, built on top of a series of individual studies. If too many individual studies need revision, and are thus pulled from the tower, the consensus collapses. A new consensus must be built based on revised science. This is called a paradigm shift

However, if scientists determine that some individual studies are not trustworthy, for whatever reason (additional data, improved instrumentation, etc.), it is analogous to pulling one or more of these 2 × 4s out of the stack. If too many individual studies are shown to be untrustworthy and are pulled out of the stack, then the scientific consensus falls. We call that a paradigm shift, and a different, better scientific consensus is then built on a tower of new, more trustworthy studies. This sort of analogy helps communicate that there is a hierarchy of trust. Scientists always trust the scientific method, and they trust the scientific consensus unless too many studies supporting it are found to be unreliable. An individual study’s result is just a suggestion of what might be true, until it is supported by independent work and a consensus.

The National Academy developed a program called The Science Behind It to help communicate science as a way of knowing rather than a list of facts. The purpose is to educate non-scientists not only on what scientists trust, but also on how they came to that understanding and what the process was. Examples of topics used to illustrate science as a way of knowing include PTSD (post-traumatic stress disorder) in soldiers; climate change; human flight to Mars; dementia; representation of women in science; immigration; forensic science; cybersecurity; the origin of the universe; space weather; 3D printing; the cloud; climate and weather; and the cost of drugs, among others.

Communicate risk

Often decisions must be made with incomplete information, or based on probabilities of events that are at present unpredictable, or perhaps even fundamentally so. Examples are approaching hurricanes, climate change, and rapidly evolving pandemics. In such cases, it is critical that scientists communicate the relative risks of action versus inaction.

There are some good examples of risk communication that can be used as exemplars. Figure 5 shows a risk matrix developed by NOAA. The matrix plots the likelihood of an event against the consequence if it occurs.

Fig. 5

A NOAA risk matrix. Red boxes are very high risk because the probability of the event is high and, when it occurs, the consequences are disastrous. Green boxes are low risk because the event is unlikely and inconsequential. (Color figure online)
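
The logic of such a matrix is simple enough to write down explicitly. The sketch below is a generic illustration in Python (not NOAA's operational product): likelihood and consequence are each scored on a 1-to-5 scale, and their combination is mapped to a risk level.

```python
def risk_level(likelihood, consequence):
    """Map 1-5 likelihood and consequence scores to a risk category.
    Generic illustration of a risk matrix, not NOAA's actual thresholds."""
    score = likelihood * consequence
    if score >= 15:
        return "very high"   # likely event with disastrous consequences (red boxes)
    if score >= 8:
        return "high"
    if score >= 4:
        return "moderate"
    return "low"             # unlikely and inconsequential (green boxes)

print(risk_level(5, 5))  # "very high" -- needs a dedicated communication plan
print(risk_level(1, 2))  # "low"
```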

The high-risk areas need special approaches to communication. NOAA suggests making an informed communication plan and addressing the audience’s interests, not those of the scientists making the projections or the policy makers urging a response. The audience is only going to be motivated by what could happen to them. Explain the risk using stories and visuals to make the case clear. Such communication can be very difficult when an event has no precedent, such as the current pandemic. Excellent communication offers options to reduce the risk and selects trusted messengers, using multiple communication channels to get the word out. The messenger is as important as the message. NOAA also recommends testing your message on the intended audience before releasing it, to avoid unintended consequences.

As an example, when Hurricane Harvey stalled out on the Gulf Coast of Texas in 2017, the National Weather Service used language that motivated action: “Locations may be uninhabitable for weeks or months, and all impacts are unknown and will be beyond anything experienced.” And indeed, that statement proved true. I calculated that the rainfall on Houston was equivalent to the flow of 13 Mississippi Rivers raining down over several days onto a city with topographic relief essentially as flat as a table top.
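
That comparison can be reproduced with rough, publicly reported round numbers; the sketch below is only an order-of-magnitude illustration, not the author's original calculation, and every input is an approximation.

```python
# Assumed round numbers: Harvey dropped on the order of 27 trillion gallons of
# rain on Texas and Louisiana over roughly six days, and the Mississippi's mean
# discharge at its mouth is roughly 17,000 cubic metres per second.
GALLONS_TO_M3 = 3.785e-3                # cubic metres per US gallon
total_rain_m3 = 27e12 * GALLONS_TO_M3   # ~1.0e11 m^3 of rain
duration_s = 6 * 24 * 3600              # about six days, in seconds
mississippi_m3_per_s = 1.7e4            # assumed mean discharge

equivalent_rivers = (total_rain_m3 / duration_s) / mississippi_m3_per_s
print(f"Roughly {equivalent_rivers:.0f} Mississippi Rivers' worth of flow")
```

Depending on the rainfall total, area, duration, and discharge one assumes, the result lands in the range of roughly ten to fifteen Mississippis, consistent with the figure quoted above.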

Understand the two dimensions of public trust

Finally, scientists must earn the public trust by being not just competent, but also by demonstrating that they are working in the public interest. Scientists who study trust argue that there are two dimensions of trust. One dimension is “competence,” the perception that the professional is knowledgeable in his or her specialty. The second dimension is warmth, the degree to which the public feels that the professional is working in their best interest. As shown in Fig. 6, engineers, scientists, and doctors tend to max out the scale on competence. The public believes that we are expert at what we do. But in terms of how much they believe we are working in their best interest, scientists fall in the middle of the pack. Childcare workers, teachers, and nurses are much more likely to be viewed as caring about what happens to average citizens.

Fig. 6

Dimensions of trust, and how various professions fare in the estimation of the public. Figure courtesy of Susan Fiske, Princeton University

Scientists would benefit from interacting regularly and meaningfully with non-scientists to demonstrate that we understand their challenges and want to make their lives better.

Bridging the two worlds

This understanding brings up the need to bridge the two worlds—science as practiced versus what matters to the public. Scientists are taught to communicate their findings only if they produce a major advancement and, even better, overturn accepted scientific views. Emphasis is on originality and finding flaws in existing paradigms. By always emphasizing how our results differ from anything that came before, we can confuse non-scientists who seek consistent messaging and consensus.

Scientists are also taught to remain dispassionate: if a result is to be regarded as a truth of nature and be reproducible, it should not depend on which scientist did the analysis. The public, in contrast, wants to hear the human side of the story: what motivated a scientist to do the study and what challenges were overcome.

Finally, proper science practice is to emphasize all caveats and uncertainties. Why might the result not stand the test of time? What issues might confound the proposed interpretation of the data? What further work is required to confirm the result? However, uncertainty often makes the public think that the science is not sound, the study was flawed, and the result should be ignored.

Concluding thoughts

The National Academy of Sciences is stepping up to many of these challenges by creating a Strategic Council for Research Excellence, Integrity, and Trust. The council is charged with advancing the overall health, quality, and effectiveness of the research enterprise across all the domains that fund, execute, disseminate, and apply scientific work in the public interest. The Strategic Council includes leaders from federal agencies that fund research, research institutions that actually do the work, journals that disseminate the results, and policy makers who ultimately use that science. It has identified challenges to excellence and trust in science, some of which arise from misaligned incentives across the research enterprise. By conducting pilot projects and articulating principles and best practices, the council is coordinating collaborative action across the entire enterprise to remove barriers and accelerate solutions.

Examples of some initial issues the Strategic Council is addressing include:

  • Conflict of interest—reducing the burden on researchers of providing complete and transparent disclosure of outside interests;

  • Measuring impact—determining what should matter and how to assess what matters;

  • Correcting the record—encouraging authors, institutions, and journals to be transparent, fast, and fair so that flawed science does not continue to mislead people.

The overall goal for all scientists is to earn the public trust by making science more trustworthy, not by demanding that scientists be trusted.