
The most important source for evaluating the dangers of radiation to large population groups was the study of survivors of the atomic bombings of Hiroshima and Nagasaki. It supplied the best available epidemiological data on the effects of radiation on humans, and scientific knowledge about radiation hazards drew in significant measure from the work of the Atomic Bomb Casualty Commission.... (Walker, 2000)

– J. Samuel Walker, historian of the U.S. Nuclear Regulatory Commission.

The A-bomb studies have set standards that are patently false. (Greene, 2003)

– Dr. Alice Stewart, pioneering British epidemiologist, pictured in Fig. 7.1

Fig. 7.1
[Photograph of Dr. Alice Stewart writing on a blackboard]

Dr. Alice Stewart, the epidemiologist who demonstrated that x-rays in utero cause leukemia in young children. The author fervently recommends the brilliant biography of Dr. Stewart written by Gayle Greene, The Woman Who Knew Too Much (Greene, 2003)

The Postal Analogy

Imagine a vast postal sorting warehouse. Under one roof, extending to the horizon, there sits an enormous collection of conveyor belts, each filled with corrugated postal bins. Perhaps each conveyor belt corresponds to a different location, while every postal bin is directed to a certain truck leaving at a certain time. On each belt, the bins move past continuously, independently, and very quickly. The scale of the entire operation is immense. For the purposes of illustration, consider just a small portion of the entire warehouse, shown in Fig. 7.2. Within the field of view, ten conveyor belts are arranged. On each conveyor belt, at one instant in time, there sit ten bins. A total of one hundred bins are visible.

Fig. 7.2
[Three 10 × 10 grids of postal bins, representing high, intermediate, and low dose rate exposures]

The postal sorting warehouse contains an ensemble of ten conveyor belts, each holding ten postal bins. Ensembles representing high (left), intermediate (middle), and low (right) dose rate exposures are illustrated. The numbers of pieces of mail distributed are 3000, 30, and 3, respectively. From left to right, the physical picture changes from describing the number of events contained in each bin to describing how often a bin contains a single event. The rightmost column states the sum of the number of pieces of mail across each row. Because the postal bins on one conveyor belt represent divisions of time, the illustrated ensembles transition from “temporally homogeneous” on the left to “temporally inhomogeneous” on the right

A mail sorter has 3000 pieces of mail to distribute among these 100 bins. The mail is all junk mail; it does not matter where it goes, and the sorter distributes the individual pieces of mail at random. How many pieces of mail wind up in each bin? It is obvious that, on average, each bin will contain 30 letters. Because the process is random, however, some bins will contain more than the average number, and some less. A valid distributionFootnote 1 is shown in the leftmost grid of Fig. 7.2. Despite the random distribution between individual bins, on average the ten postal bins in one row (on one conveyor belt, corresponding to one location) together contain about 300 pieces of mail.
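The sorting process is easy to simulate. The sketch below is a minimal illustration, assuming a uniform multinomial assignment of letters to bins; the layout (ten belts of ten bins) follows the figure, while the seed and variable names are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(7)

# Distribute 3000 letters uniformly at random among 100 bins
# (ten conveyor belts of ten bins each), as in the leftmost grid of Fig. 7.2.
letters, bins = 3000, 100
counts = rng.multinomial(letters, np.ones(bins) / bins)

print("average per bin:", counts.mean())                # exactly 30 by construction
print("min/max per bin:", counts.min(), counts.max())   # random scatter about 30
print("letters per belt:", counts.reshape(10, 10).sum(axis=1))  # each close to 300
```

Each row sum lands near 300, just as stated above, even though individual bins fluctuate noticeably.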

It is interesting next to extrapolate downward. What if there are only thirty letters to distribute among these one hundred bins? Or three? Because a letter may not be cut into pieces, many – or most – bins are now empty. (It may be junk mail, but it may not be delivered pre-shredded!) These situations are illustrated in the middle and rightmost grids of Fig. 7.2.

The alert reader may ask, what about an extrapolation below one letter? Is there a lower limit? Because the number of conveyor belts is very large but (in theory) not fixed, the answer is that no lower limit exists. To extrapolate further downward, it is necessary simply to expand the field of vision to include more conveyor belts.Footnote 2 In statistical physics, this conjectured collection of postal bins on an infinite number of conveyor belts is called an “ensemble”. If the ensemble is infinite, it is never necessary to cut apart a piece of mail to represent any desired average number of pieces of mail on one conveyor belt (300, 3, 0.3, 0.0003, etc.).

Moving from left to right in Fig. 7.2, one observes that a meaningful description of the three different distributions transitions from describing “how many” pieces of mail are contained in each bin, to “how often” each bin contains a single event. In technical terms, the distinction is that between the amplitude of events and the frequency with which they occur. Human beings, who tend to discount the occurrence of improbable events, reason most comfortably in the realm of “how many”. Nature, however, incorporates both viewpoints simultaneously. The correct mathematical construct is known as the “ensemble average”. Human common sense unfortunately misleads in this situation.

The numerical distinction between the average and the ensemble average, illustrated with reference to the examples shown in Fig. 7.2, is presented in Table 7.1. When the postal bins are very full, there is little difference between the average value and the ensemble average value. There is nevertheless a very important difference between them: while the average value extrapolates linearly downward, the ensemble average value does not.Footnote 3 (Two quantities are linearly related if separated by a constant factor; for instance, the “Number of Letters” and “Average” columns in Table 7.1.) Instead, the ensemble average exceeds the average value by a large factor that increases as the grid becomes filled more sparsely.

Table 7.1 Illustration of the distinction between average and ensemble average values using the ensembles presented in Fig. 7.2

The reader may have a difficult time becoming accustomed to the concept of the ensemble average, which after all invokes some complex ideas. The fundamental issue, however, is concrete: the Post Office is not allowed to shred a piece of mail with the goal of placing the same weight of mail in every postal bin. The distribution of letters is not uniform. (“Inhomogeneous” is a useful description.) The random process of distributing mail bumps up against a discrete lower limit, as may be seen in the bottom two rows of Table 7.1. The ensemble average represents Nature’s compromise between the decreasing average value (for instance, 0.0003 letters, a felony act according to the U.S. Postal Code) and the lower limit imposed by the discrete nature of a letter (1 piece of mail, indivisible).
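The compromise described above can be given a rough numerical form. The sketch below is a hedged reconstruction rather than the author’s published calculation: it takes the ensemble average per bin to be the Poisson mean plus its root-mean-square fluctuation, in the spirit of the shot noise expression p_SN = E₀√(nB) that appears later in Fig. 7.3.

```python
import math

# Hedged reconstruction: ensemble average per bin taken as the Poisson mean
# plus its RMS fluctuation, mean + sqrt(mean). This is an assumption made
# for illustration, not a formula stated in the text.
for mean in (30.0, 0.3, 0.03, 0.0003):
    ensemble = mean + math.sqrt(mean)
    print(f"mean={mean:<8} ensemble~{ensemble:.4f} ratio~{ensemble / mean:,.0f}")
```

When the bins are full (a mean of 30), the two quantities nearly coincide; as the grid empties, the ratio grows without bound, which is exactly the behavior described above.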

The foregoing discussion has been an attempt to construct by analogy a simple description of a complex phenomenon. That phenomenon is biological injury due to exposure to ionizing radiation. Each piece of mail represents a single ionization event – that is, the liberation of a single energetic electron in living tissue. (The liberation of an electron is the reason the phenomenon is referred to as “ionizing” radiation.) Every ionization event has the possibility to be followed by biological injury. The distribution of events is random, both in time and in space. The assignment of events to bins represents an understanding that events occur both in a certain location (on one conveyor belt, rather than another) and at a certain time (in a certain postal bin). The rate at which the postal bins on a conveyor belt whisk by represents the rate of the chemical reaction responsible for biological injury. The total number of bins on one conveyor belt represents the duration of exposure.

Common sense holds that adding up, across the rows, the packets of energy associated with individual ionization events to arrive at a total “dose” indicates the severity of exposure. Since both the amplitude and the frequency of events must be accounted for, however, Nature judges the situation with more subtlety, using the ensemble average. It follows that a protracted, low-level exposure to ionizing radiation may be far more damaging to health than an acute exposure depositing the same absorbed dose in tissue.

The Gray (Gy) is a physical measure of the absorbed dose, indicating the quantity of energy dissipated by ionizing radiation in a volume of tissue. The unit for the equivalent or reference dose, which is a biological measure of risk, is the Sievert (Sv). It is believed that an acute dose of 1 Sv increases the risk of developing a fatal cancer by 5.5%. Referring to the postal bin analogy, the author contends that the average value corresponds to the absorbed dose in Gray, while the ensemble average value corresponds to the equivalent dose in Sieverts. The hypothesis goes by the name of “shot noise in radiobiological systems”.

Challenging the Linear Model

On the question of radiation protection, and of the impacts to human health of low-level exposures to ionizing radiation, expert opinion endorses what is known as the “Linear No Threshold”, or LNT, model. The National Academy of Sciences most recently supported this outlook in its 2006 report, titled “Health Risks from Exposure to Low Levels of Ionizing Radiation” (BEIR VII). The LNT model makes two essential assertions. First, according to the LNT model there is no such thing as a “safe” dose. Any single interaction may be biologically damaging, and there is furthermore no evidence supporting any threshold of exposure below which damage cannot occur. The second assertion made by the LNT model is linearity: it is believed acceptable to extrapolate impacts linearly downward from high doses to low doses. The extrapolation is necessary because it is difficult to directly evaluate health impacts resulting from exposures of less than about 0.1 Gy.

The BEIR VII report justifies the linear extrapolation on the following basis:

[A]ny single track of ionizing radiation has the potential to cause cellular damage. However, if only one ionizing particle passes through a cell’s DNA, the chances of damage to the cell’s DNA are proportionately lower than if there are 10, 100, or 1000 such ionizing particles passing through it. There is no reason to expect a greater effect at lower doses from the physical interaction of the radiation with the cell’s DNA. (National Research Council, 2006)

The postal bin analogy from the previous section has prepared the reader to consider the argument offered by the BEIR VII committee. While intuitively sensible and therefore very appealing, the argument for linearity is in fact valid only under very special circumstances. The difficulty arises because there exist proper scales of volume and time describing physical/chemical/biological action. In short, what size are the postal bins? Yes, ten, one hundred, or even one thousand or more events that may cause injury are posited to occur, but within what interval of time? Within what volume of tissue? Nature has very specific answers to these questions, which, however, expert opinion does not identify.Footnote 4

Because the authorities have never adequately conceptualized the implicit question about natural scales of time and volume, in the author’s view they have also failed to recognize the necessary role of the ensemble calculation. The language chosen by the BEIR VII committee, which adopts the frame of “how many?” events while failing to address “how often?” events occur, supports this interpretation. While the LNT model permits outcomes to extrapolate downward, it encompasses no mechanism to extrapolate exposures downward below the level of one track per cell. The difficulty is a serious one, but the contradiction can be resolved as it was in the postal analogy.

Nature utilizes the ensemble average value, which reflects both the amplitude and the frequency of events. Only the amplitude of events, represented by the average value, extrapolates linearly downward. When every bin is full (a temporally homogeneous exposure), the average value approximately coincides with the ensemble average value. For this reason, the LNT model agrees with the ensemble average value in this situation. In sparse (temporally inhomogeneous) ensembles in which many bins are empty, however, the LNT model and the ensemble average value diverge. The LNT model is therefore incorrect for protracted, low dose rate exposures.

To recap: in a single interaction volume during a single interval of time, either an ionization event occurs (possibly several), or it does not. Each event is discrete, occurring in a single bin representing a single interaction volume during a very brief interval of time. The mathematical representation of the process never treats an event as though it can be divided. The scientific heritage of this idea is more than a century old, having first been developed to describe the failure of vacuum tubes to operate optimally at low levels of amplification. The theory as applied to radiation injury is known as “shot noise in radiobiological systems” (Datesman, 2016).

From the localized perspective of affected tissue, one should conclude from this description that there is no such thing as a “low dose” or a “low-level exposure”. Every individual ionization event creates, in a limited volume, a high dose rate exposure of very short duration. Because an overall exposure of finite duration is built up from a series of discrete events as shown in Fig. 7.3, all exposures should therefore be viewed as high dose rate exposures. Exposures characterized as “low dose” are built up from individual high dose rate ionization events, spread out over increasing intervals of time. As a consequence, the LNT model vastly understates the chemical, biological, and medical impact of dilute, protracted exposures to ionizing radiation. The threshold dose rate at which the LNT model begins to fail – because it attempts to extrapolate linearly downward beyond the level of a unit event per bin – lies in the regime below about 100 Gy/h.
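An order-of-magnitude sketch indicates how full the “postal bins” are at various dose rates. It assumes the commonly cited radiobiological figure that roughly 1 mGy of low-LET absorbed dose corresponds to about one electron track through a cell nucleus; that per-track dose is an illustrative assumption, while the two lower dose rates are taken from the examples discussed later in this chapter.

```python
# Order-of-magnitude sketch: how often does a track cross one cell nucleus?
# Assumption (a commonly cited low-LET figure, not taken from this text):
# ~1 mGy of absorbed dose corresponds to ~1 electron track per nucleus.
DOSE_PER_TRACK_GY = 1e-3

def tracks_per_second(dose_rate_gy_per_h: float) -> float:
    return dose_rate_gy_per_h / DOSE_PER_TRACK_GY / 3600.0

for label, rate in [("threshold, 100 Gy/h", 100.0),
                    ("chest x-ray, ~5.4 Gy/h", 5.4),
                    ("K-40 background, 2e-8 Gy/h", 2e-8)]:
    r = tracks_per_second(rate)
    print(f"{label}: one track every {1.0 / r:.3g} s")
```

At the threshold, a given nucleus is struck every few hundredths of a second; at the potassium-40 background rate, the interval between tracks stretches to several years, and nearly every bin is empty.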

Fig. 7.3
[Plot of p(t) versus time, showing a train of narrow pulses in which the area of one pulse equals E_k. The indicated relations are p(t) = Σ_k E_k δ(t − t_k), the shot noise level p_SN = E₀√(nB), and the average level p_av = n·E_av]

A low dose exposure consists of a series of very brief, high dose rate exposures, schematically illustrated as orange “pulses”, spread out in time. In electrical engineering, a waveform of this kind is known as a “pulse train”

No argument has been made that lower-dose or lower dose rate exposures are more damaging in absolute terms; it does follow from the shot noise hypothesis, however, that protracted exposures are more damaging per unit of absorbed dose. The speed of DNA repair (a double strand break takes about 2 h to repair) also has complex consequences for radiation injury in the case of prolonged exposure.

If the shot noise phenomenon is indeed firmly established, so that the LNT model contradicts physical law, how is it that the important knowledge described in this section has been overlooked for many decades? The interesting answer is that it has not. The excerpt below is taken from a prominent lecture delivered by Harald Rossi, the principal founder of the field of microdosimetry:

Nearly all physical quantities are nonstochastic, although in many instances the discreteness of matter and of radiation causes statistical fluctuations. However, in most cases these are small enough to be ignored, and no attempt is made to consider the underlying stochastic quantity or to give it a special name. (Rossi, 1986)

Rossi goes on to provide a very clear description of the nature of shot noise in electrical circuits, which it is not necessary to reproduce here. The important point is that, although Rossi possessed a clear understanding of the existence, nature, and relevance of shot noise, his conclusion was incorrect: the statistical fluctuations (meaning whether adjacent bins are full or empty) are by no means negligible. The consequences of this error are significant and concerning.

Conventionally, the absorbed dose in Grays and the equivalent dose in Sieverts are related by a “radiation weighting factor,” derived using microdosimetry. The weighting factors permit radiations of different “quality” – meaning x- and gamma rays, beta particles, alpha particles, and neutrons – to be compared on the basis of biological injury. It is the author's opinion that the hypothesis of shot noise in radiobiological systems may invalidate the concept of the radiation weighting factor.

Experimental Evidence for the Hypothesis of Shot Noise in Radiobiological Systems

In 1972, Dr. Abram Petkau of the Whiteshell Nuclear Research Establishment in Canada published the results of an intriguing experiment in the mainstream scientific journal Health Physics. Dr. Petkau was the head of the Medical Biophysics branch at his institution, affiliated with the Canadian atomic energy establishment. While investigating topics related to radiation chemistry and biological injury, Petkau had previously devised a method to create artificial biological membranes in a small apparatus, in which the membrane spanned an aperture separating two compartments filled with water. With this arrangement it was possible to irradiate the water with x-rays, while simultaneously observing the membrane under a microscope. It was found that the membranes reliably ruptured (an observable biological outcome) after an absorbed x-ray dose of 35 Gray.

This rupture dose was not replicated when external x-ray irradiation of the membrane was replaced with irradiation from a beta-emitting radionuclide dissolved in the water contained within the apparatus. It was found instead that membrane rupture occurred at far smaller absorbed doses, with the rupture dose increasing as the dose rate increased (Petkau, 1972). The result is intriguing because the Petkau experiment is a direct interrogation of the concept of different qualities of ionizing radiation. Far from confirming the accepted belief that x-rays and beta particles are of identical quality, irrespective of dose rate, the Petkau experiment indicated that beta particle irradiation was as much as 3000 times more effective for membrane rupture.

In the context of the discussion in this chapter, it is noteworthy that Petkau’s experiment investigated low dose rates, in all cases less than 0.6 Gy/h. This value lies far below the threshold of approximately 100 Gy/h at which the author asserts the linear extrapolation begins to break down. For this reason, the Petkau experiment directly interrogates not only the concept of radiation quality (which it appears to invalidate), but also the hypothesis of shot noise in radiobiological systems. As shown in Fig. 7.4, the Petkau result agrees with the prediction of the shot noise hypothesis reasonably well. The finding that the membrane rupture result is consistent with the hypothesis of shot noise in radiobiological systems was published by the author in an article that appeared, also in Health Physics, in 2019.

Fig. 7.4
[Log-log plot of membrane rupture time T versus radionuclide concentration c: Petkau’s data points and the theoretical prediction both trend downward together; an inset shows the conventional expectation with W_R = 1]

The Petkau Effect explained by the hypothesis of shot noise in radiobiological systems. The membrane rupture time is plotted on the vertical axis, while the concentration of the beta-emitting radionuclide sodium-22 is plotted on the horizontal axis. From the author’s own work (Datesman, 2019)

Background Radiation May Not Be Properly Understood

Environmental and medical exposures comprise the so-called “background radiation”. Environmental exposures especially are generally protracted in character. At the current time, in the United States the average annual exposure to natural (non-medical) sources of ionizing radiation totals about 3 mSv/yr.Footnote 5 The contribution to background due to medical radiation (mostly CT scans and x-rays) is of similar magnitude. There are four categories of exposure to non-medical background radiation: inhalation, ingestion of food and water, terrestrial radiation, and cosmic radiation. The inhalation route (principally radon) accounts for 2.3 mSv/yr, while the other contributions all lie in the range of 0.2–0.3 mSv/yr. The stated values are merely averages, as there are large local variations due to geology, building construction, altitude, and personal factors.

A particularly interesting contributor to the background dose is potassium-40 (K-40), listed in the first row of Table 7.2. Living tissue and blood are universally contaminated with this long-lived radioisotope, which is an energetic mixed emitter (that is, it emits both gamma rays and beta particles). About 4000 decays of K-40 occur in a human body of 60 kg every second. The absorbed dose due to this contaminant comes to about 0.15 mGy/year, corresponding to an absorbed dose rate of 2 × 10⁻⁸ Gy (twenty billionths of a Gray) per hour. Invoking the postal warehouse analogy, this low environmental dose rate corresponds approximately to one postal bin out of every 56 billion containing a single event. The ensemble in this case is indeed filled very sparsely. The ensemble average dose rate, reflecting the degree of biological injury listed in the fifth column of Table 7.2, exceeds the absorbed or average dose rate by a factor of about 100,000.
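The quoted dose rate can be checked with a few lines of arithmetic. The sketch below assumes a mean absorbed energy of roughly 0.5 MeV per K-40 decay, close to the average beta energy; that figure is an illustrative assumption rather than a value given in the text.

```python
# Rough check of the K-40 background dose rate quoted above.
MEV_TO_J = 1.602e-13

decays_per_s = 4000.0     # K-40 decays per second in a 60 kg body (from the text)
mean_energy_mev = 0.5     # assumed mean absorbed energy per decay (illustrative)
body_mass_kg = 60.0

dose_rate_gy_per_s = decays_per_s * mean_energy_mev * MEV_TO_J / body_mass_kg
print(f"{dose_rate_gy_per_s * 3600:.1e} Gy/h")            # ~1.9e-08, vs 2e-8 quoted
print(f"{dose_rate_gy_per_s * 3.156e7 * 1e3:.2f} mGy/yr") # ~0.17, vs 0.15 quoted
```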

Table 7.2 Comparison of low dose exposures using the hypothesis of shot noise in radiobiological systems

Medical exposures for diagnostic purposes are generally of reasonably short duration. A notional chest x-ray delivering a dose of 0.15 mGy with a duration of 0.1 s is considered in the second row of Table 7.2. In this case, again because the dose rate is lower than the threshold value, the ensemble average dose rate (representing the biological impact of the exposure) is also larger than the absorbed dose rate, although only by a factor of about four.
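Both enhancement factors quoted in this section, about four for the chest x-ray and about 100,000 for potassium-40, are consistent with a square-root scaling. The sketch below assumes the enhancement goes as √(D₀/D), with D₀ ≈ 100 Gy/h the threshold named earlier; the scaling is a hedged inference from the ratio of p_SN = E₀√(nB) to p_av = n·E_av in Fig. 7.3, not a formula stated explicitly in the text.

```python
import math

D0 = 100.0  # Gy/h, the threshold dose rate named earlier in the chapter

def enhancement(dose_rate_gy_per_h: float) -> float:
    # Inferred scaling: ensemble average / average ~ sqrt(D0 / D).
    return math.sqrt(D0 / dose_rate_gy_per_h)

xray_rate = 0.15e-3 / 0.1 * 3600   # 0.15 mGy in 0.1 s -> 5.4 Gy/h (from Table 7.2)
k40_rate = 2e-8                    # Gy/h, from the text

print(f"chest x-ray: ~{enhancement(xray_rate):.1f}x")  # ~4.3, vs "about four"
print(f"K-40: ~{enhancement(k40_rate):.1e}x")          # ~7e4, vs "about 100,000"
```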

As explained by the official historian of the Nuclear Regulatory Commission in the quotation opening this chapter, the Life Span Study (LSS) of survivors of the atomic bombings of Hiroshima and Nagasaki is the principal foundation upon which rests our knowledge regarding the harms of low dose exposure to ionizing radiation. Findings from the LSS essentially anchor the scale society uses to understand the hazards of exposure to ionizing radiation. The third line of Table 7.2 displays the average absorbed dose represented in the LSS due to “delayed gamma radiation” for individuals 2000 meters from the hypocenter at Hiroshima.

Table 7.2 reveals an interesting finding.Footnote 6 Between the chest x-ray and the LSS exposures, the absorbed doses differ by a factor of 264. The equivalent (biological) doses, meanwhile, differ by a similar factor of 172. The similarity indicates that, although the exposure durations differ by a large factor, the scale established by the LSS describes diagnostic medical radiation exposures reasonably well.

One should not mistake coincidence for wisdom, however. If the hypothesis of shot noise in radiobiological systems is correct, the framework based upon the LSS data is fundamentally unsuitable as regards environmental exposures. As shown in the first row of Table 7.2, the equivalent dose due to potassium-40 amounts to about 13,000 mSv per annum. For comparison, an acute dose (that is, one occurring in a short duration) in the range of 5000 mSv will be fatal within 30 days for a majority of those exposed.

What could this surprising result mean? Contamination with potassium-40, because it is both universal and essentially unavoidable, is widely considered to be benign. That judgment may not be totally correct. Instead, a reasonable hypothesis is that the mechanisms of cellular repair have evolved to act at a rate that approximately compensates for the injury to genetic material caused by K-40. (According to Table 7.2, the rate at which that damage accrues is approximately 1.5 mSv/hour.) Biological damage caused by exogenous exposures (whether from an atomic bomb, a medical x-ray, or radon in the basement) occurs only on top of the baseline activity of endogenous damage due to potassium-40, and its repair. The conventional view embodied in the LNT model unfortunately elides this complication.

It is the author’s opinion that no model of radiation injury describing the biological impact of exposures at environmental dose rates can possibly be valid without a comprehensive temporal description of both damage and repair. The scale built upon the LSS is valid for the simple reason that the exposures against which it is tested are much shorter in duration (of the order of seconds) than the processes responsible for the repair of double strand breaks (which require hours). The dynamic nature of the repair process is therefore not a necessary component of the description of, for instance, medical x-rays.

When the exposure is continuous and protracted, however, a more thorough analysis incorporating the process of repair must be undertaken. Because the LNT model does not incorporate such an analysis, exposure to “background” levels of ionizing radiation in the environment may be a phenomenon that is incompletely, or incorrectly, understood.

Environmental Releases from Operating Nuclear Power Stations

Emissions of radioactive pollution from nuclear power stations occur both on a continuing basis and in “batch” releases of short duration. For example, for each of the years 1999–2003, operating nuclear power plants worldwide released about 30,000 Curies (Ci) of radioxenon on a continuous basis, and 6000 Ci in batch releases (Kalinowski & Tuma, 2009).

In the United States, utilities are required to report emissions to the Nuclear Regulatory Commission. A portion of a table from one such report, from the Oyster Creek Generating Station in New Jersey, is shown in Fig. 7.5. In each of the first three quarters of 2018, the utility that owns the facility reported that the 636 MW boiling water reactor at Oyster Creek released between about 20 and 40 Ci of “Fission & Activation Gases”, principally the noble gases krypton and xenon. The facility reported no emissions for the fourth quarter of 2018, since the reactor was shut down in September of that year.Footnote 7

Fig. 7.5
[Table of quarterly releases of fission and activation gases: total release, activation release, gamma air dose, beta air dose, and percent of ODCM limit, with an estimated total error of 24.64%]

Emissions from the Oyster Creek Generating Station, as reported to the Nuclear Regulatory Commission by Exelon (Exelon Generation, 2018). The reported doses (in millirads) are of the order of 1–10 μGy (microGray)

According to the Code of Federal Regulations, releases of radioactive effluents from operating nuclear power plants are permitted so long as doses to individuals in unrestricted areas do not exceed 0.02 mSv in any hour and 0.5 mSv in a year.Footnote 8 Consider therefore a batch release consisting solely of krypton-85, a beta-emitting noble gas. At a highly dilute but constant ambient concentration of 1 μCi per liter, an individual exposed to this release will receive a whole-body gamma dose of approximately 0.02 mSv in 1 h.

While the noble gases are unreactive, they do represent an inhalation hazard. Moreover, the lung epithelial tissue is permeable to krypton, which binds to hemoglobin and is distributed throughout the body via the circulatory system. (The same is also true of xenon, which is important as it relates to the 1979 accident at the Three Mile Island nuclear power station in Pennsylvania.) The activity of krypton-85 in the bloodstream of the individual exposed to this batch release would be 1.7 Bq per milliliter. By way of comparison, the overall K-40 activity in the human body is much less: only 0.067 Bq per milliliter.

For this notional exposure, the absorbed dose rate to the blood would be 245 nGy/h. If the hypothesis of shot noise in radiobiological systems is correct, however, this permitted exposure to Kr-85 would have a biological impact greater than 5 mSv. A statement of risk places the result into context. Extrapolating the calculated individual dose up to the population level, and furthermore employing the guidance promulgated by the BEIR VII committee for a whole-body exposure, it follows that one fatal cancer would be expected to result from the exposure if 3600 people inhaled radioactive Kr-85 in the manner described. In sum, it appears that the regulation may fail – by a factor of hundreds – to meet the protective standard it intends. The LNT model asserted by the authorities is not representative of the physical reality of low-level exposures.
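The chain of numbers in this example can be verified step by step. The sketch below assumes a mean beta energy of about 0.25 MeV per Kr-85 decay and blood of unit density, both illustrative assumptions rather than values given in the text, and reuses the square-root enhancement inferred earlier from Fig. 7.3.

```python
import math

MEV_TO_J = 1.602e-13

# Step 1: absorbed dose rate to blood at 1.7 Bq/mL (activity from the text).
activity_bq_per_kg = 1.7 * 1000   # 1.7 Bq/mL, assuming blood density ~1 kg/L
mean_beta_mev = 0.25              # assumed mean Kr-85 beta energy (illustrative)
dose_rate_gy_per_h = activity_bq_per_kg * mean_beta_mev * MEV_TO_J * 3600
print(f"absorbed dose rate: {dose_rate_gy_per_h * 1e9:.0f} nGy/h")   # ~245

# Step 2: inferred shot-noise enhancement, sqrt(D0/D) with D0 = 100 Gy/h.
equivalent_sv = dose_rate_gy_per_h * math.sqrt(100.0 / dose_rate_gy_per_h)  # 1 h
print(f"equivalent dose in 1 h: ~{equivalent_sv * 1000:.1f} mSv")    # ~5 mSv

# Step 3: population risk at the BEIR VII coefficient of ~5.5% per Sv.
people_per_cancer = 1.0 / (equivalent_sv * 0.055)
print(f"one expected fatal cancer per ~{people_per_cancer:.0f} people")  # ~3600
```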

Summary Points

  1. Nature judges the probability of biological injury caused by exposure to ionizing radiation on an ensemble-averaged basis. Common sense does not address the question of “how often” events occur very well.

  2. There is no such thing as a low-level exposure. All exposures are composed of discrete, high dose rate events of very brief duration.

  3. Our scientific understanding of protracted exposures, including background radiation, may be incomplete, or even incorrect. A hypothesis known as “shot noise in radiobiological systems” has been proposed.