How Extreme Is the Precautionary Principle?


The precautionary principle has often been described as an extreme principle that neglects science and stifles innovation. However, such an interpretation has no support in the official definitions of the principle that have been adopted by the European Union and by the signatories of international treaties on environmental protection. In these documents, the precautionary principle is a guideline specifying how to deal with certain types of scientific uncertainty. In this contribution, this approach to the precautionary principle is explicated with the help of concepts from the philosophy of science and comparisons with general notions of practical rationality. Three major problems in its application are discussed, and it is concluded that to serve its purpose, the precautionary principle has to (1) be combined with other decision principles in cases with competing top priorities, (2) be based on the current state of science, which requires procedures for scientific updates, and (3) exclude potential dangers whose plausibility is too low to trigger meaningful precautionary action.


No other safety principle has been so vehemently contested as the precautionary principle. It has repeatedly been accused of being both irrational and unscientific [1, 2], and numerous authors have claimed that it stifles innovation by imposing unreasonable demands on the safety of new technologies [3,4,5]. Judging by these accounts, the precautionary principle would seem to be rather extreme. But are these descriptions accurate? In order to answer that question, we need to pay close attention to how the precautionary principle is defined and conceived by those who have the legislative power to apply it. “The Precautionary Principle in Official Documents” delineates how the precautionary principle is defined in official documents. The picture that emerges differs radically from the negative descriptions just referred to. “A Science-based Principle” is devoted to a philosophical explication of the principle, as it is presented in these documents. In “The First Problem: Competing Top Priorities,” “The Second Problem: The Need for Scientific Updates,” and “The Third Problem: Excluding Too Implausible Dangers,” three major problems in the application of the precautionary principle are discussed, and in the final “Conclusion” some conclusions are offered on the effects of applying the principle and on the limitations on its use.

The Precautionary Principle in Official Documents

The precautionary principle is often taken to be a general instruction to be cautious, much like the maxim “better safe than sorry,” with which it has often been equated.Footnote 1 However, that is not how the precautionary principle is presented in official documents. There, it is described as a principle with a much more limited scope, namely a principle for the evaluation of uncertain or incomplete scientific evidence.Footnote 2

Precautionary thinking can be traced back many centuries ([8], p. 26), but the idea of a specific precautionary principle grew out of national and international discussions on environmental policies in the 1980s. A decisive step towards its acceptance was taken when the “precautionary concept found its way into international law and policy as a result of German proposals made to the International North Sea Ministerial Conferences” ([9], p. 4). The declaration from the Second International Conference on Protection of the North Sea in 1987 was the first major international document in which a “principle” of precaution was promulgated. It was called “the principle of precautionary action,” and meant that “in order to protect the North Sea from possibly damaging effects of the most dangerous substances, a precautionary approach is necessary which may require action to control inputs of such substances even before a causal link has been established by absolutely clear scientific evidence” [10]. Another early statement can be found in the British Government’s White Paper This Common Inheritance from 1990, which promoted “precautionary action to limit the use of potentially dangerous materials or the spread of potentially dangerous pollutants, even where scientific knowledge is not conclusive, if the balance of likely costs and benefits justifies it” (cit. [11], p. 197).

Perhaps the most influential international proclamation of the principle can be found in the Rio Declaration on Environment and Development, which was adopted at the Rio de Janeiro Earth Summit (Rio Conference) in June 1992:

Principle 15. Precautionary principle

In order to protect the environment, the precautionary approach shall be widely applied by States according to their capabilities. Where there are threats of serious or irreversible damage, lack of full scientific certainty shall not be used as a reason for postponing cost-effective measures to prevent environmental degradation. [12]

In the same year, the European Union incorporated the precautionary principle into its legislative framework. This was done in the 1992 Maastricht Amendments to the European Treaty (Treaty of Rome, now known as the Treaty on the Functioning of the European Union) ([13], p. 206):

Union policy on the environment shall aim at a high level of protection taking into account the diversity of situations in the various regions of the Union. It shall be based on the precautionary principle and on the principles that preventive action should be taken, that environmental damage should as a priority be rectified at source and that the polluter should pay [14].

A Communication on the Precautionary Principle was published by the European Commission in February 2000. While acknowledging that the treaty only mentions the precautionary principle in connection with environmental protection, the communication asserts that “in practice, its scope is much wider, and specifically where preliminary objective scientific evaluation, indicates that there are reasonable grounds for concern that the potentially dangerous effects on the environment, human, animal or plant health may be inconsistent with the high level of protection chosen for the Community” [15]. A court decision in 2002 confirmed this interpretation and made it clear that the precautionary principle should be considered to be a general principle of European law ([16], pp. 110–111 and 549–550).

The Communication from 2000 described the principle as an approach to risk management, which “should not be confused with the element of caution that scientists apply in their assessment of scientific data.” The use of scientific information in risk management is strongly emphasized, and the use of the precautionary principle is essentially restricted to decisions under scientific uncertainty. When applying the principle, one should “start with a scientific evaluation, as complete as possible, and where possible, identifying at each stage the degree of scientific uncertainty.” Furthermore, the following six requirements are imposed on applications of the principle:

Where action is deemed necessary, measures based on the precautionary principle should be, inter alia:

  • proportional to the chosen level of protection,

  • non-discriminatory in their application,

  • consistent with similar measures already taken,

  • based on an examination of the potential benefits and costs of action or lack of action (including, where appropriate and feasible, an economic cost/benefit analysis),

  • subject to review, in the light of new scientific data, and

  • capable of assigning responsibility for producing the scientific evidence necessary for a more comprehensive risk assessment. [15]

The European Commission’s White Paper for a future chemicals policy, published in February 2001, provides additional clarifications on the precautionary principle. The planned legislation was intended to be “in line with the overriding goal of sustainable development and seek to make the chemical industry accept more responsibility by respecting the precautionary principle and safeguarding the Single Market and the competitiveness of European industry.” The precautionary principle demands that “action must be taken even if there is still scientific uncertainty as to the precise nature of the risks” [17]. The legislation referred to here was adopted in 2006 under the name Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH). It requires precautionary measures not only against chemicals that are known to be dangerous but also against substances for which there is insufficient but non-negligible scientific evidence of a danger to human health or the environment [18].

Three important conclusions can be drawn from these official definitions and explanations of the precautionary principle. First, the principle refers specifically to the evaluation of uncertain evidence for decision-making purposes, which is only one of several types of cautious reasoning that we may apply when making decisions. Thus, it is not a general principle of cautiousness or “better safe than sorry.” Secondly, it is an injunction to take preventive action not only against known dangers but also against potential dangers for which there is only insufficient evidence. Thirdly, the indications of danger that it enjoins us to take seriously are those that are provided by science. Thus, it does not recommend actions based on suppositions or fears that lack support in science.Footnote 3 The official documents leave no doubt that the precautionary principle is intended to be science-based in this sense.

In public debates, the term “precautionary principle” is often used with a different meaning. In particular, environmental groups often use the term for rules of cautiousness that are more far-reaching than the legally defined precautionary principle, and support more extensive environmental measures. These rules of precaution are stronger than the legally defined principle, in the informal and somewhat vague sense of “stronger” as “demanding or supporting more far-reaching counter-measures.” (In the literature, reference is often made to “strong” and “weak” precautionary principles, but these terms are defined in many different ways.Footnote 4) Some of these stronger rules of precaution are also interesting objects of scholarly analysis.Footnote 5 However, the official and legal use of important terms is always of special interest. For instance, the legal notion of theft is subject to focused studies, in which other usages, such as those according to which “property is theft” or “taxation is theft,” are left out of consideration. Mainly for similar reasons, the rest of this article is devoted to an analysis of the precautionary principle as it is defined and expounded in international treaties and in European law. An additional reason for this approach is that, as we will see, a coherent and epistemically interesting conception of precaution can be extracted from these documents.Footnote 6

The focus will be on how the precautionary principle, as defined and conceived in official documents, can be explicated and clarified. No attempt will be made at a systematic exploration of how it is applied in practical decision-making.Footnote 7

A Science-based Principle

To clarify the meaning of such science-based precaution, we can use a simple model from the philosophy of science, which shows how scientific data is (ideally) processed and used, both for the purpose of scientific judgments and for that of practical decisions.Footnote 8

Science has a long tradition of giving much priority to the avoidance of error. A new hypothesis or idea is only accepted if it is supported by convincing evidence. Consequently, the burden of producing such evidence has to be carried by those who put forward the new proposal. This means, for instance, that those who claim to have identified a new hazard or risk have to provide sufficient scientific evidence to convince their colleagues that they are right. For the workings of science, this is an appropriate assignment of the burden of proof. When a new claim is accepted as a scientific fact, future investigations will be based on it. If we accept false statements as scientific facts, then they can hamper scientific progress and lead research into a cul-de-sac. To avoid this, strict criteria of proof have to be applied to new scientific claims.

In science, nothing is accepted once and for all. There are of course scientific standpoints that we currently have no reason whatsoever to doubt, but this does not mean that it is impossible for such reasons to emerge in the future. Therefore, scientific statements should not be treated as definitely and irreversibly accepted. Instead, they should be seen as accepted provisionally, i.e., until reasons to doubt them become known. This provisionality combines with the continuous search for new information to make science self-correcting. This is a crucial mechanism for scientific progress.

The statements that are taken to be scientific facts — provisionally, until we have reasons to doubt and perhaps revise or reject them — form the scientific corpus, i.e., the body of all scientific claims that the scientific community currently holds to be true. See Fig. 1 ([39], pp. 15–17). Scientific knowledge derives from data that we obtain in experiments and other observations. Based on these data, we construct and critically assess more general scientific statements, including theories, which — if deemed tenable enough — are included in the scientific corpus. The corpus can be defined as the collection of all scientific standpoints that we presently have no reason to doubt.

Fig. 1

The scientific corpus is based on scientific data

Since the entry requirements for the corpus are rather strict, the information that it contains is usually reliable enough for the purpose of practical decisions. However, in some practical decisions, we have strong reasons to apply standards of proof that differ from those of science. In particular, there are cases when plausible suppositions that do not satisfy the criteria for inclusion in the corpus are nevertheless relevant for a practical decision.Footnote 9 The following hypothetical example illustrates the typical structure of such cases:

The Baby Cream Example

An experiment has just been reported in which a product containing nanoparticles was mixed into fodder for pigs in order to increase their uptake of certain nutrients. It turned out that some of the pigs contracted liver cancer, and therefore this feed additive was not introduced for general use. The same type of nanoparticles is also a component of some moisturizing baby creams. A group of toxicologists was tasked with determining if the use of these nanoparticles in skin products has any negative health effects. In their report, they concluded that it was not known whether this use of the nanoparticles posed any danger. In the pigs that ingested them, the particles were absorbed into the bloodstream in the small intestine, and then transported in the blood to the liver and other organs. There were no indications that the particles could be absorbed through the skin. However, data had only been obtained for intact skin, and no information was available for skin affected by infections or other diseases. Upon receiving this report, the agency responsible for the safety of hygienic products had to make a decision based on uncertain information about a possible danger that might not exist.

In this example, a claim that the nanoparticles cause cancer in humans cannot be treated as a (provisional) scientific fact. Such a claim would be an uncertain supposition that is not part of the scientific corpus. But nevertheless, a reasonable case can be made that these nanoparticles should be removed from skin products, at least until more information about their properties has been obtained. In this and many other situations, we tend to act as if a danger exists, even though the scientific information does not amount to full scientific evidence that there is such a danger. Such measures are very much in line with the Rio declaration’s statement that “lack of full scientific certainty” should not be used as a reason not to act against “threats of serious or irreversible damage.” They also conform with the proclamation in the European Commission’s White Paper that “action must be taken even if there is still scientific uncertainty as to the precise nature of the risks.”

Figure 2 illustrates how scientific information can be used for policy-making purposes. Most commonly, information from the corpus is used (arrow 2). However, in order to avoid plausible but uncertain dangers, this may not be enough. In such cases, we need a direct path, a bypass route, to take us from data to policy (arrow 3). This is the crux of the precautionary principle; it allows us to use the bypass route in order to avoid possible dangers.

Fig. 2

The use of science in risk management

It is important to note that this bypass route has its starting-point in scientific data that give rise to a suspicion of danger. It cannot be accessed from an entry-point consisting of scientifically unsubstantiated fears or the whims of uninformed opinion. The following three principles have been proposed for science-based decisions employing the bypass route:

  1. The same type of evidence should be taken into account in the policy process as in the formation of the scientific corpus. Policy decisions are not well served by the use of irrelevant data or the exclusion of relevant data.

  2. The assessment of how strong the evidence is should be the same in the two processes.

  3. The two processes may differ in the required level of evidence. It is a policy issue how much evidence is needed for various practical decisions. [36]
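The division of labor expressed by these three principles can be sketched in a few lines of code. The following is a hypothetical illustration only: the scoring function, the threshold values, and all names are invented for the sketch and are not part of the official documents or the cited literature. The point it makes is structural: one shared evidence assessment (principles 1 and 2) feeds two decisions that apply different required levels of evidence (principle 3).

```python
# A toy model of science-based precaution: one shared assessment of the
# evidence, two different thresholds for acting on it. All numbers and
# names here are hypothetical illustrations.

def assess_evidence(data_points):
    """Toy stand-in for a scientific strength-of-evidence assessment,
    returning a score in [0, 1]. Both processes use this same function
    on the same data (principles 1 and 2)."""
    return min(1.0, sum(data_points) / len(data_points))

CORPUS_THRESHOLD = 0.95  # strict: accept as a (provisional) scientific fact
POLICY_THRESHOLD = 0.30  # looser: enough to justify precautionary measures
                         # (principle 3: the required level may differ)

def evaluate(data_points):
    strength = assess_evidence(data_points)  # shared assessment
    return {
        "in_corpus": strength >= CORPUS_THRESHOLD,
        "precautionary_action": strength >= POLICY_THRESHOLD,
    }

# A baby-cream-style case: suggestive but inconclusive evidence of harm.
print(evaluate([0.4, 0.5, 0.3]))
# → {'in_corpus': False, 'precautionary_action': True}
```

The bypass route of Fig. 2 corresponds to the second entry: the evidence is too weak to enter the corpus, yet strong enough to warrant precautionary action.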

Practical decision-making that complies with these principles can be said to follow the tenets of science-based precaution. It can be seen from the official documents referred to above that this model corresponds closely to what is officially meant by the precautionary principle. In particular, public authorities endorsing the precautionary principle emphasize that it can only be triggered by scientifically valid indications of danger. For instance, the Swedish Chemicals Legislation from 1985 required a “reasonable scientific foundation” in order to trigger precautionary measures ([40], p. 23). According to the Communication from the European Commission in 2000, the principle is intended for cases with “reasonable grounds for concern.” The same document emphasizes that risk assessments have to be based on scientific information. We can conclude from this that the “reasonable grounds” referred to will have to be scientifically supportable [15].

Nanotechnology provides us with excellent examples both of potential dangers that have sufficient support to trigger the precautionary principle and of potential dangers that lack such support and therefore do not trigger itFootnote 10:

The hypothesis that nanoparticles cause harm to humans is reasonable given what is known about asbestos and deserves further testing (it must be noted of course that we regularly breathe in nanoparticles without any apparent harm). It is plausible to believe that they might be harmful even though there is not enough evidence to even say that this is probable. It is less clear that gray goo presents a credible threat. For reasons mentioned earlier, there are serious doubts about whether self-replication of the type required is possible. If this is so, then an hypothesis such as ‘the development of self-replicating robots will lead to the gray goo problem,’ while perhaps true, is practically pointless given that the development of these robots is such a remote possibility. ([43], pp. 143–144)

The requirement that precaution should be science-based must be clearly distinguished from the preposterous but in some circles still popular idea that no action should be taken against a suspected danger unless there is full scientific evidence of its existence. The latter standpoint, which would exclude the bypass route described above, was introduced in 1993 by the tobacco company Philip Morris. They initiated and funded an ostensibly independent organization called The Advancement of Sound Science Coalition (TASSC), whose major purpose was to sow doubt about the scientifically well-documented negative health effects of passive smoking. One of their means for doing so was to argue that no action should be taken against a potential danger until full scientific proof of its existence has been obtained [44]. Needless to say, this is an irresponsible approach that is sure to invite disaster.

The precautionary principle has usually been put forward as a principle for decision-making in environmental and health-related issues. However, it is an expression of a much more general pattern of thought, namely that protective measures against a potential danger can be justified even if it is not known for sure that the danger exists. This way of thinking can be found in all areas of decision-making, also concerning other types of risk than those that affect human health and the environment. Military commanders do not passively wait for full evidence of a suspected enemy attack before taking counter-measures. Governments and central banks are expected to act against a potential financial crisis without knowing for sure that it will occur.Footnote 11 A safety engineer will close an elevator for maintenance based on rather weak indications that its cables have been damaged, rather than wait for incontrovertible evidence that this is the case.

Why is the term “precautionary principle” only seldom used in these other areas where essentially the same pattern of thought is applied as in the protection of human health and the environment? The difference seems to be that no special principle is needed in most other areas, since the rationality of taking action in the absence of full evidence is seldom if ever contested. In contrast, preventive measures against potential harms to human health and the environment are often called into question on both ideological and interest-based grounds [46,47,48]. This has created a need for a principled defense of such measures. In this perspective, the precautionary principle can be seen as the application of a common pattern of rational reasoning in certain areas where it is often contested.

Since the precautionary principle does not seem to be needed in other areas, one may well ask whether it can have any impact in the areas where it is mainly used, namely public health and environmental policy. As we have now seen, the “official” version of the principle does not prescribe what actions should be taken against the plausible but unproven dangers that it refers to. Instead, it just allows and approves counter-measures against such potential dangers. It has sometimes been maintained that since this (“weak”) version of the precautionary principle does not require any particular action against the potential dangers it refers to, it is forceless and only provides “somewhat innocuous or feeble additions to the regulatory landscape” ([49], p. 315). This argument, however, views the principle in isolation, not as part of a legal system that also contains other rules and regulations. As was noted in an authoritative commentary on the European legislation, the precautionary principle has to be interpreted in combination with provisions in that legislation stipulating that “the Community institutions are responsible, in all their spheres of activity, for the protection of public health, safety and the environment” ([16], p. 550). Against this background, it should be no surprise that the General Court of the European Union wrote in 2018:

The precautionary principle is a general principle of EU law requiring the authorities in question, in the particular context of the exercise of the powers conferred on them by the relevant rules, to take appropriate measures to prevent specific potential risks to public health, safety and the environment, by giving precedence to the requirements related to the protection of those interests over economic interests. (emphasis added) ([50], §109)

The quotation is taken from a decision in which the court upheld the Commission’s restrictions on several potentially harmful pesticides. Parts of these restrictions would have lacked legal support without the precautionary principle. A large number of historical examples have been documented in which scientifically plausible indications of danger have been excluded from consideration, often with disastrous results for human health and the environment [51, 52]. In many of these cases, timely application of the legal principles expounded in the 2018 decision by the General Court could have made a large difference.

In the next three sections, we are going to consider some of the potential problems that tend to accompany the application of the precautionary principle, in the version that can be found in international treaties and European legislation.

The First Problem: Competing Top Priorities

One of the most common criticisms of the precautionary principle is that it assigns an inflexible top priority to the protection of health and the environment. As we have seen, the principle as such does not have such implications. However, if combined with strict legislation requiring protection of human health and the environment, the effect may be a next-to-absolute requirement to take action against possible dangers. But this is not unique. Strong priorities for action against possible dangers are also set in other areas. For instance, safety engineering gives the highest priority to the avoidance of accidents, and military thinking to the avoidance of a successful enemy attack. The situation becomes much more complex when there is competition for the top priority. There are two major types of such priority conflicts, both of which can be accentuated by applications of the precautionary principle.

First, conflicts can arise if equally severe effects on health or the environment can plausibly be expected both if we take a particular course of action and if we refrain from taking it. This is often exemplified by reductions of pesticide use in the Global South. Pesticides, in particular insecticides, give rise to considerable environmental damage and to severe cases of occupational disease [53]. However, a decision not to use these substances can lead to crop failure or to increased prevalence of diseases such as malaria. Several discussants have maintained that if the precautionary principle is applied to the risk of a famine or a malaria epidemic, then it will support a revocation of some pesticide bans [54,55,56].

Another example of the same type of conflict is the weighing of the risks (side effects) of a medical treatment against the risks of refraining from the treatment. For instance, it is not uncommon for treatments of cancer to incur a risk of treatment-induced (“secondary”) cancer [57]. Obviously, a treatment decision will have to be based on some sort of weighing of the risks of side effects against those of refraining from treatment. However, if there are risks of death on both sides of the balance, then the precautionary principle is not of much help. This is a major reason why the precautionary principle does not have a prominent role in clinical medicine. (It can be more useful in other medical contexts, for instance in decisions on preventive measures and on enhancement.)

The second type of problem arises when different types of effects compete for (or all have) the top priority. For instance, suppose that in a severe pandemic, a government assigns as high a priority to minimizing the fatalities due to the disease as to avoiding the negative economic effects of measures, such as travel bans and social distancing, that can reduce the death toll. When these two goals run into conflict, the precautionary principle will not be of much help in deciding what to do (unless it is combined with a decision to give one of these two objectives higher priority than the other).

Generally speaking, the precautionary principle loses its bite when risks of the same high priority are in conflict. Other decision-aiding principles may have to be applied in order to achieve adjudication or balance in such cases [58].

The Second Problem: The Need for Scientific Updates

As we saw in “The Precautionary Principle in Official Documents,” one of the six requirements on applications of the precautionary principle that were proclaimed in the European Commission’s Communication from 2000 is that such measures should be “subject to review, in the light of new scientific data.” Precautionary measures have to be “periodically reviewed in the light of scientific progress, and amended as necessary.” The precautionary principle should only be applied “so long as scientific information is incomplete or inconclusive” [15]. This approach would seem to be rather uncontroversial, given that scientific knowledge is always subject to corrections, additions, and improvements. In practice, however, institutional inertia often makes it difficult to revoke or change a decision. This applies to decisions based on the precautionary principle as well as other types of decisions.

The regulation of genetically modified organisms provides a clear illustration of these difficulties.Footnote 12 In July 1974, when this technology was at its very beginnings, 11 of the researchers working with it published a letter in Science, proposing that scientists should “voluntarily defer” two types of experiments with biologically active recombinant DNA molecules. The reason was “serious concern that some of these artificial recombinant DNA molecules could prove biologically hazardous” [59]. A moratorium was in fact put into effect, and it was used by the researchers to perform a careful evaluation of the potential dangers of the new technology. At the Asilomar Conference on Recombinant DNA in February 1975, they concluded that these dangers were manageable. The moratorium was lifted, and experiments were resumed, applying safeguards that had been agreed upon.

Twenty years later, Paul Berg, one of the initiators of the moratorium, co-authored a retrospective paper on genetic modification. He and his co-author Maxine Singer observed that in the preceding two decades, the new technology had revolutionized biological science, and that it had done so without giving rise to any of the harmful effects that the pioneers had feared 20 years earlier:

Literally millions of experiments, many even inconceivable in 1975, have been carried out in the last 20 years without incident. No documented hazard to public health has been attributable to the applications of recombinant DNA technology. Moreover, the concern of some that moving DNA among species would breach customary breeding barriers and have profound effects on natural evolutionary processes has substantially disappeared as the science revealed that such exchanges occur in nature. [60]

In the quarter century that has passed since this article was published, our knowledge in genetics, plant biology, and ecology has increased dramatically. The uncertainties that justified the 1974 moratorium have been replaced by in-depth understanding of the technology, its mechanisms, and consequences [32]. But nevertheless, the European Union and many other jurisdictions still have legislation on biotechnology whose fundamental principles are based on the level of scientific knowledge in the 1970s, and in particular on the uncertainties that then prevailed. This state of affairs has often been described as a consequence of the precautionary principle. However, as we have just seen, the European Commission’s own position paper makes it clear that decisions based on the precautionary principle should be “periodically reviewed in the light of scientific progress, and amended as necessary.” Therefore, the discrepancy between this legislation and the current status of scientific knowledge should not be seen as a consequence of the European Union applying its precautionary principle, but rather as a consequence of its failure to apply the principle in the way that is prescribed in its own official documents.Footnote 13

A more general lesson can be learned from this. We need to take precautionary measures in cases of scientific uncertainty, but we also have to adjust these measures when uncertainty gives way to new scientific knowledge. Such adjustments can of course go in either direction: They can lead to more or to less stringent protective measures, depending on the nature of the new information. But experience shows that it is often difficult to keep laws updated in pace with scientific and technological developments. A common solution is to provide laws with built-in mechanisms for adjustment to new knowledge, so that no new decisions by the legislative body are required ([63], pp. 601–603). Obviously, this need for adaptability concerns not only the precautionary principle but also other legal rules whose application has to be sensitive to future developments in science and technology.

The Third Problem: Excluding Too Implausible Dangers

Some authors have claimed that the precautionary principle requires that even highly implausible suspicions of danger should lead to costly precautionary measures. For instance, Whelan [64] claimed that the principle requires that “we act on all the remote possibilities in identifying causes of human disease,” which lets “the distraction of purely hypothetical threats cause us to lose sight of the known or highly probable ones.” (For similar views, see [65,66,67] and Footnote 14.) This might seem to be a plausible interpretation. If we want to be on the safe side, what reason could there be not to consider all possible dangers?

In fact, there is such a reason, and indeed quite a compelling one: arguments of this kind can be constructed both for and against almost anything. For instance, think of some foodstuff that you eat. It is possible that it has some serious long-term health effect that scientists have not yet discovered. But the same applies to everything else that you eat. Therefore, the mere possibility that your favorite food can have negative health effects is not reason enough to refrain from consuming it. To be worth considering, an argument for refraining will have to establish a degree of plausibility higher than mere possibility (Footnote 15).

As this example shows, decisions based on the precautionary principle have to be triggered by considerations that have a higher degree of scientific credibility than mere possibilities. Consequently, such decisions cannot be triggered by any contention that someone chooses to make without supporting scientific evidence. Such contentions are a real problem, since they are a common modus operandi of science denialists and other pseudoscientists. One typical example is the completely unfounded claim that a common children’s vaccine, the MMR vaccine (against measles, mumps, and rubella), gives rise to autism. The source of this claim is a retracted paper, which has been shown to be fraudulent [70,71,72]. Competently performed epidemiological studies have shown no connection whatsoever between vaccination and autism [73,74,75]. Nevertheless, anti-vaccination activists have persisted in claiming that there is a causal connection between vaccination and autism, or at least scientific uncertainty in the matter [76, 77]. According to their argument, since scientists cannot prove the absence of a connection with absolute certainty, it must be considered a real risk. On the face of it, this might look like a reasonable application of the precautionary principle.

Obviously, science cannot prove with absolute certainty that this vaccine will never, in any person, causally contribute to autism. However, the claims of the anti-vaccinationists can nevertheless be efficiently refuted. The crux of the matter is that in the same sense that the vaccine might contribute to autism, so might anything else that happens in a young person’s life: riding the merry-go-round, playing with a skipping rope, or perhaps eating ice cream and strawberries. From a scientific point of view, all of these are at least as strong candidates as MMR vaccination for being causal factors in autism. (Arguably, they are stronger candidates, since there is no negative epidemiological evidence for any of them, as there is for the vaccine.) Furthermore, the alternative supposition that the vaccine protects against autism is no less plausible than the supposition that it gives rise to autism (Footnote 16). For all these reasons, the claim that the MMR vaccine causes autism lacks the plausibility required to trigger the precautionary principle.

This example shows that in order to serve its purpose, the precautionary principle has to be applied solely to potential risks supported by specific, scientifically tenable evidence. It cannot be applied to any unsubstantiated possibility of a risk that someone chooses to focus on. To trigger the precautionary principle, a potential risk should, at the very minimum, have a scientific plausibility that is specific to this particular risk and that rises above the plausibility level of “alternative” postulations that would lead us to act differently.


Conclusion

The common claims that the precautionary principle is irrational, goes against science, stifles innovation, etc. are based on interpretations of the principle that deviate drastically from the official interpretations in international treaties and in legislation and other binding documents adopted by the European Union. In its official versions, the precautionary principle is a guideline for the use of certain types of scientific evidence when making decisions. Importantly, it assumes that policy decisions should be based on science, and it does not leave room for decisions based on suppositions or fears that have no scientific backing. The basic message of the precautionary principle is that preventive measures can be justified by scientific evidence indicating a danger, even if that evidence is not sufficient to prove conclusively that the danger exists. This approach to uncertainty conforms with general principles of practical reasoning, and it can be explicated in detail with the help of a model of the scientific corpus and the science–policy interface. However, like other decision-making principles, the precautionary principle has its limitations. Based on an analysis of some problems arising in its use, we have identified three conditions that should be satisfied for an application of the precautionary principle to serve its purpose:

  1. The precautionary principle cannot adjudicate between competing top priorities. In cases with such a priority structure, it may therefore have to be supplemented with decision principles suitable for weighing different potential outcomes against each other.

  2. All precautionary actions should be based on the current state of science. Therefore, procedures for the scientific update of background information must be in place.

  3. Potential dangers whose plausibility does not rise sufficiently above the level of “mere possibility” must be excluded from serious consideration.


Notes

  1. For instance, Daniel Sarewitz and Roger Pielke ([6], p. 59) described the precautionary principle as “a dandified version of ‘better safe than sorry’”.

  2. The historical material in this section is largely based on the more extensive account of the origin and development of the precautionary principle in Hansson [7].

  3. It should be mentioned, though, that one of the international documents invoking the precautionary principle is somewhat unclear about the role of science. According to the preamble of the Ministerial Declaration from the third North Sea conference in 1990, the precautionary principle justifies “action to avoid potentially damaging impacts of substances that are persistent, toxic, and liable to bioaccumulate even where there is no scientific evidence to prove a causal link between emissions and effects” [19]. The phrase “no scientific evidence” might be interpreted as allowing the precautionary principle to be triggered by suspicions of harmfulness that lack scientific support. However, this phrase only refers to the emission–effect connection for a substance that is already known to be persistent, toxic, and bioaccumulative. The quoted statement is compatible with the standpoint that these three properties have to be determined by scientific means.

  4. See Sandin et al. [20], p. 289n; Conko [21]; Som et al. [22], p. 497; Sachs [23], p. 1288n.

  5. See Hansson [24] for a discussion and critique of the notion of a “reversed burden of proof”, which has a prominent role in many of these stronger principles of precaution.

  6. It has sometimes been claimed that the Cartagena Protocol on Biosafety from 2000 represents a stronger version of the precautionary principle than that found in other international treaties ([25], p. 6; [26], p. 20). However, there is no support for that contention in the actual references to the precautionary principle in the document. The Cartagena Protocol explicitly reaffirms the definition of the precautionary principle from the Rio Convention, and does not go beyond it ([27], articles 1, 10.6 and 11.8 and Annex III:4). In the negotiations leading up to the Protocol, there were heated debates on how the precautionary principle should be interpreted, but in the end, no new explanation or interpretation of the principle was included in the document [28, 29].

  7. There is evidence that its application in the European Union lacks consistency and does not always conform fully with the documents that define and explain it. See Hansson and Rudén [30], Klika [31], Hansson [32], Garnett and Parsons [33], and Ingre-Khans et al. [34].

  8. This model is based on the traditional concept of a scientific corpus (cf. [35]). It was presented in Hansson [36] and further developed in Hansson [37] and Hansson [38]. It is a normative model, delineating some aspects of how science should be performed in order to best fulfill its purpose. Needless to say, a reasonably well-defined normative ideal of central aspects of science is needed in order to identify and rectify deviations that render the outputs of science less reliable or otherwise less useful. In the present context, a normative perspective on science is appropriate, since the precautionary principle concerns how science should be applied to decision-making.

  9. On situations in which it is the other way around, see Hansson [37].

  10. For analyses of the precautionary principle that emphasize the notion of a trigger, see also Sandin [41] and Ahteensuu [42].

  11. But see Schefczyk [45].

  12. The description of this case is based on a more extensive account in Hansson [32].

  13. For other examples of failures to adjust the application of the precautionary principle to new scientific information, see Kramer et al. [61] and Young et al. [62].

  14. It is often difficult to determine which version of the precautionary principle various criticisms are directed at. Not uncommonly, the legally defined principle is accused of having implications which it does not have, but which some other precautionary rules may have. For instance, McKinney [66] introduces the precautionary principle with a definition that is fully consonant with the definitions that are now part of European law (p. 430). This definition does not require a reverse burden of proof, but most of McKinney’s argumentation is directed against a reverse burden of proof. Whelan [64] introduces her topic as “the so-called ‘precautionary principle,’ which has become enshrined in many international environmental treaties and regulations.” She then goes on to make statements about the precautionary principle that are clearly not true of its formulation in treaties and regulations, for instance that “the precautionary principle assumes that no detriment to health or the environment will result from the proposed new banning or chemical regulation.”

  15. For a more thorough discussion of mere possibility arguments and how they can be defeated, see Hansson [68, 69].

  16. What is known, however, is that the vaccine protects against complications of measles that have severe negative effects on cognitive development and are ultimately lethal [78, 79].


References

  1. Durodié B (2003) The true cost of precautionary chemicals regulation. Risk Anal 23:389–398
  2. Nilsson R (2004) Control of chemicals in Sweden: an example of misuse of the ‘precautionary principle’. Ecotoxicol Environ Saf 57:107–117
  3. Castro D, McLaughlin M (2019) Ten ways the precautionary principle undermines progress in artificial intelligence. Working paper, Information Technology and Innovation Foundation
  4. Graham JD (2004) The perils of the precautionary principle: lessons from the American and European experience. Heritage Letters #818, January 15, 2004. Heritage Foundation
  5. Wainwright D (1998) Disenchantment, ambivalence, and the precautionary principle: the becalming of British health policy. Int J Health Serv 28(3):407–426
  6. Sarewitz D, Pielke R Jr (2000) Breaking the global-warming gridlock. The Atlantic Monthly 286(1):55–64
  7. Hansson SO (2018) The precautionary principle. In: Möller N, Hansson SO, Holmberg J-E, Rollenhagen C (eds) Handbook of safety principles. Wiley, Hoboken, pp 258–283
  8. Charles JA (1967) Early arsenical bronzes – a metallurgical view. Am J Archaeol 71:21–26
  9. Freestone D, Hey E (1996) Origins and development of the precautionary principle. In: Freestone D, Hey E (eds) The precautionary principle and international law: the challenge of implementation. International Environmental Law and Policy Series, vol 31. Kluwer Law International, The Hague, pp 3–15
  10. (1987) Ministerial declaration on the protection of the North Sea. Environ Conserv 14:357–361
  11. O'Riordan T, Jordan A (1995) The precautionary principle in contemporary environmental politics. Environmental Values 4:191–212
  12. Report of the United Nations Conference on Environment and Development, Rio de Janeiro, 3–14 June 1992
  13. Pyhälä M, Brusendorff AC, Paulomäki H (2010) The precautionary principle. In: Fitzmaurice M, Ong DM, Merkouris P (eds) Research handbook on international environmental law. Edward Elgar, Cheltenham, pp 203–226
  14. Consolidated version of the treaty on the functioning of the European Union, 26.10.2012
  15. Communication from the Commission of 2 February 2000 on the precautionary principle
  16. Craig P, de Búrca G (2011) EU law. Text, cases, and materials, 5th edn. Oxford University Press, Oxford
  17. Commission White Paper of 27 February 2001 on the strategy for a future chemicals policy
  18. Rudén C, Hansson SO (2010) REACH is but the first step – how far will it take us? Six further steps to improve the European chemicals legislation. Environ Health Perspect 118(1):6–10
  19. Ministerial Declaration of the Third International Conference on the Protection of the North Sea, The Hague, March 8, 1990. Downloaded March 28, 2020
  20. Sandin P, Peterson M, Hansson SO, Rudén C, Juthe A (2002) Five charges against the precautionary principle. Journal of Risk Research 5:287–299
  21. Conko G (2003) Safety, risk and the precautionary principle: rethinking precautionary approaches to the regulation of transgenic plants. Transgenic Res 12(6):639–647
  22. Som C, Hilty LM, Köhler AR (2009) The precautionary principle as a framework for a sustainable information society. J Bus Ethics 85(3):493–505
  23. Sachs NM (2011) Rescuing the strong precautionary principle from its critics. Univ Ill Law Rev 2011:1285–1338
  24. Hansson SO (1997) Can we reverse the burden of proof? Toxicol Lett 90:223–228
  25. Goklany IM (2001) The precautionary principle: a critical appraisal of environmental risk assessment. Cato Institute, Washington, D.C.
  26. Sunstein CR (2005) Laws of fear: beyond the precautionary principle. Cambridge University Press, Cambridge
  27. Cartagena Protocol (2000) Cartagena protocol on biosafety to the convention on biological diversity. Text and annexes. Montreal: Secretariat of the Convention on Biological Diversity. Downloaded May 23, 2020
  28. Egziabher TBG (2007) The Cartagena protocol on biosafety: history, content and implementation from a developing country perspective. In: Traavik T, Ching LL (eds) Biosafety first: holistic approaches to risk and uncertainty in genetic engineering and genetically modified organisms. Tapir Academic Press, Trondheim, pp 389–405
  29. Newell P, Mackenzie R (2000) The 2000 Cartagena protocol on biosafety: legal and political dimensions. Glob Environ Chang 10(4):313–317
  30. Hansson SO, Rudén C (2010) REACH: What has been achieved and what needs to be done? In: Eriksson J, Gilek M, Rudén C (eds) Regulating chemical risks. European and global challenges. Springer, Berlin, pp 71–83
  31. Klika C (2015) Risk and the precautionary principle in the implementation of REACH. European Journal of Risk Regulation 6(1):111–120
  32. Hansson SO (2016) How to be cautious but open to learning: time to update biotechnology and GMO legislation. Risk Anal 36(8):1513–1517
  33. Garnett K, Parsons DJ (2017) Multi-case review of the application of the precautionary principle in European Union law and case law. Risk Anal 37(3):502–516
  34. Ingre-Khans E, Ågerstrand M, Beronius A, Rudén C (2019) Reliability and relevance evaluations of REACH data. Toxicol Res 8(1):46–56
  35. Becker K (2002) Kuhn’s vindication of Quine and Carnap. Hist Philos Q 19(2):217–235
  36. Hansson SO (2008) Regulating BFRs – from science to policy. Chemosphere 73:144–147
  37. Hansson SO (2017) How values can influence science without threatening its integrity. In: Leitgeb H, Niiniluoto I, Seppälä P, Sober E (eds) Logic, methodology and philosophy of science – proceedings of the 15th International Congress. College Publications, pp 207–221
  38. Hansson SO (2018) Politique du risque et l'intégrité de la science [Risk policy and the integrity of science]. In: de Guay A (ed) Risque et Expertise. Les Conférences Pierre Duhem. Presses universitaires de Franche-Comté, pp 57–112
  39. Hansson SO (2004) Philosophical perspectives on risk. Techne 8(1):10–35
  40. Jordbruksutskottets betänkande om kemikaliekontroll [Report of the Parliamentary Committee on Agriculture on the control of chemicals] (JoU 1984/85:30). Stockholm: Sveriges Riksdag
  41. Sandin P (1999) Dimensions of the precautionary principle. Hum Ecol Risk Assess 5:889–907
  42. Ahteensuu M (2008) In dubio pro natura? A philosophical analysis of the precautionary principle in environmental and health risk governance. University of Turku, Turku
  43. Weckert J, Moor J (2007) The precautionary principle in nanotechnology. In: Allhoff F, Lin P, Moor J, Weckert J (eds) Nanoethics: the ethical and social implications of nanotechnology. Wiley-Interscience, Hoboken, pp 133–146
  44. Ong EK, Glantz SA (2001) Constructing ‘sound science’ and ‘good epidemiology’: tobacco, lawyers, and public relations firms. Am J Public Health 91:1749–1757
  45. Schefczyk M (2016) Financial markets: applying argument analysis to the stabilisation task. In: Hansson SO, Hadorn GH (eds) The argumentative turn in policy analysis: reasoning about uncertainty. Springer, Berlin, pp 265–290
  46. Jones MM, Bayer R (2007) Paternalism & its discontents: motorcycle helmet laws, libertarian values, and public health. Am J Public Health 97(2):208–217
  47. Oreskes N, Conway EM (2010) Merchants of doubt: how a handful of scientists obscured the truth on issues from tobacco smoke to global warming. Bloomsbury Press, New York
  48. Wiley LF, Berman ML, Blanke D (2013) Who’s your nanny? Choice, paternalism and public health in the age of personal responsibility. Journal of Law, Medicine and Ethics 41:88–91
  49. Soule E (2000) Assessing the precautionary principle. Public Aff Q 14(4):309–328
  50. General Court of the European Union (2018) Judgment of 17 May 2018 in cases T-429/13 and T-451/13 (Bayer CropScience AG and Syngenta Crop Protection AG). Downloaded March 28, 2020
  51. Harremoës P, Gee D, MacGarvin M, Stirling A, Keys J, Wynne B, Guedes Vaz S (2001) Late lessons from early warnings: the precautionary principle 1896–2000. European Environment Agency, Environmental Issue Report No 21. Luxembourg: Office for Official Publications of the European Communities
  52. Late lessons from early warnings: science, precaution, innovation (2013). European Environment Agency, Report 1. Luxembourg: Office for Official Publications of the European Communities. Downloaded March 28, 2020
  53. Hansson SO (2014) Occupational risks in agriculture. In: Thompson PB, Kaplan DM (eds) Encyclopedia of food and agricultural ethics. Springer, Dordrecht, pp 1461–1467
  54. Attaran A, Maharaj R (2000) Doctoring malaria, badly: the global campaign to ban DDT. BMJ 321:1403–1404
  55. Curtis CF, Lines JD (2000) Should DDT be banned by international treaty? Parasitol Today 16(3):119–121
  56. Gray GM, Hammitt JK (2000) Risk/risk trade-offs in pesticide regulation: an exploratory analysis of the public health effects of a ban on organophosphate and carbamate pesticides. Risk Anal 20:665–680
  57. Hansson SO (2011) Radiation protection – sorting out the arguments. Philosophy and Technology 24:363–368
  58. Hansson SO (2016) The ethics of economic decision rules. In: DeMartino GF, McCloskey DN (eds) The Oxford handbook of professional economic ethics. Oxford University Press, New York, pp 29–54
  59. Berg P, Baltimore D, Boyer HW, Cohen SN, Davis RW, Hogness DS, Nathans D, Roblin R, Watson JD, Weissman S, Zinder ND (1974) Potential biohazards of recombinant DNA molecules. Science 185(4148):303
  60. Berg P, Singer MF (1995) The recombinant DNA controversy: twenty years later. Proc Natl Acad Sci 92(20):9011–9013
  61. Kramer K, Zaaijer HL, Verweij MF (2017) The precautionary principle and the tolerability of blood transfusion risks. Am J Bioeth 17(3):32–43
  62. Young K, Doernberg SB, Snedecor RF, Mallin E (2019) Things we do for no reason: contact precautions for MRSA and VRE. J Hosp Med 14(3):178–180
  63. Hansson SO (2018) ALARA, BAT, and the substitution principle. In: Möller N, Hansson SO, Holmberg J-E, Rollenhagen C (eds) Handbook of safety principles. Wiley, Hoboken, pp 593–624
  64. Whelan EM (2000) Can too much safety be hazardous? A critical look at the ‘precautionary principle’. Downloaded October 2015
  65. Manson N (1999) The precautionary principle, the catastrophe argument, and Pascal’s wager. Ends and Means 4(1):12–16
  66. McKinney WJ (1996) Prediction and Rolston’s environmental ethics: lessons from the philosophy of science. Sci Eng Ethics 2:429–440
  67. Nollkaemper A (1996) ‘What you risk reveals what you value’, and other dilemmas encountered in the legal assaults on risks. In: Freestone D, Hey E (eds) The precautionary principle and international law. Kluwer Law International, Dordrecht, pp 73–94
  68. Hansson SO (2004) Great uncertainty about small things. Techne 8(2):26–35
  69. Hansson SO (2011) Coping with the unpredictable effects of future technologies. Philosophy and Technology 24:137–149
  70. Deer B (2011) How the case against the MMR vaccine was fixed. BMJ 342(7788):77–82
  71. Deer B (2011) How the vaccine crisis was meant to make money. BMJ 342(7789):136–142
  72. Deer B (2011) The Lancet’s two days to bury bad news. BMJ 342(7790):200–204
  73. Bloch AB, Orenstein WA, Stetler HC, Wassilak SG, Amler RW, Bart KJ, Kirby CD, Hinman AR (1985) Health impact of measles vaccination in the United States. Pediatrics 76(4):524–532
  74. Maglione MA, Das L, Raaen L, Smith A, Chari R, Newberry S, Shanman R, Perry T, Goetz MB, Gidengil C (2014) Safety of vaccines used for routine immunization of US children: a systematic review. Pediatrics 134:325–337
  75. Pasquale D, Alberta PB, Garçon N, Stanberry LR, El-Hodhod M, Da Silva FT (2016) Vaccine safety evaluation: practical aspects in assessing benefits and risks. Vaccine 34(52):6672–6680
  76. Dixon GN, Clarke CE (2013) Heightening uncertainty around certain science: media coverage, false balance, and the autism-vaccine controversy. Sci Commun 35(3):358–382
  77. Hobson-West P (2007) ‘Trusting blindly can be the biggest risk of all’: organised resistance to childhood vaccination in the UK. Sociology of Health & Illness 29(2):198–215
  78. Rota PA, Rota JS, Goodson JL (2017) Subacute sclerosing panencephalitis. Clin Infect Dis 65(2):233–234
  79. Wendorf KA, Winter K, Zipprich J, Schechter R, Hacker JK, Preas C, Cherry JD, Glaser C, Harriman K (2017) Subacute sclerosing panencephalitis: the devastating measles complication that might be more common than previously estimated. Clin Infect Dis 65(2):226–232



Acknowledgements

I would like to thank Marko Ahteensuu, Per Sandin, Steffen Foss Hansen, and the editor-in-chief and referees of NanoEthics for valuable comments on an earlier version of this article.


Open access funding provided by Royal Institute of Technology.

Author information



Corresponding author

Correspondence to Sven Ove Hansson.


Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit


About this article


Cite this article

Hansson, S.O. How Extreme Is the Precautionary Principle? NanoEthics 14, 245–257 (2020).



Keywords

  • Precautionary principle
  • Risk
  • Scientific corpus
  • Science-policy interface
  • Mere possibilities
  • Sound science