How Old is Earth, and How Do We Know?
Earth scientists have devised many complementary and consistent techniques to estimate the ages of geologic events. Annually deposited layers of sediments or ice document hundreds of thousands of years of continuous Earth history. Gradual rates of mountain building, erosion of mountains, and the motions of tectonic plates imply hundreds of millions of years of change. Radiometric dating, which relies on the predictable decay of radioactive isotopes of carbon, uranium, potassium, and other elements, provides accurate age estimates for events back to the formation of Earth more than 4.5 billion years ago. These and other dating techniques are mutually consistent and underscore the reality of “deep time” in Earth history.
Keywords: Geochronology · Dendrochronology · Varves · Radiometric dating · Carbon-14 dating · Uranium-lead dating · Prechronism · Created antiquity · Deep time
Historians love to quote the dates of famous events in human history. They celebrate great accomplishments and discoveries, such as the Wright Brothers’ first flight of December 17, 1903, and the first manned moon landing on July 20, 1969. They recount days of national loss and tragedy like December 7, 1941 and September 11, 2001. And they remember birthdays: July 4, 1776 and, of course, February 12, 1809 (the coincident birthdays of Charles Darwin and Abraham Lincoln). We trust the validity of these historic moments because of the unbroken written and oral record that links us to the not-so-distant past.
Geologists also love to quote historic age estimates: about 12,500 years ago, when the last great glaciation ended and humans began to settle North America; 65 million years ago, when the dinosaurs and many other creatures became extinct; the Cambrian boundary at 542 million years ago, when diverse animals with hard shells suddenly appeared; 4.56 billion years ago, when the Sun and Earth formed from a vast cloud of dust and gas. But how can we be sure of those age estimates? There’s no written record past a few thousand years, nor is there any oral tradition that can inform estimates of Earth’s ancient chronology.
Earth scientists have developed numerous independent yet consistent lines of evidence that point to an incredibly old Earth. I describe just three of these many approaches—annual layerings, geologic rates, and our most accurate and reliable method, isotopic age determinations or “radiometric dating.” When properly applied, all three approaches yield consistent age estimates for the same geologic events.
But first, a warning: it is difficult for anyone to conceive of such an immense time span as 4.5 billion years. The oldest humans (the current record, according to Guinness, is held by a French woman who lived to celebrate her 122nd birthday) fall far short of living for 4.5 billion seconds (about 143 years). All of recorded human history is much less than 4.5 billion minutes. Yet geologists claim that Earth formed half a million times longer ago than that. No one can easily fathom the meaning of “deep time.” So how can we be sure such age estimates are correct? The answer lies in the testimony of the rocks.
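As a quick check (taking one year to be about $3.16 \times 10^{7}$ seconds, or $5.26 \times 10^{5}$ minutes):

$$\frac{4.5 \times 10^{9}\ \text{s}}{3.16 \times 10^{7}\ \text{s/yr}} \approx 143\ \text{years}, \qquad \frac{4.5 \times 10^{9}\ \text{min}}{5.26 \times 10^{5}\ \text{min/yr}} \approx 8{,}600\ \text{years}.$$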
Annual Rock Clocks
Rocks reveal their ages of formation in several ways and provide Earth scientists with their most reliable clocks. The most straightforward geologic timekeepers are rock formations with annual layers. Annual tree rings provide a familiar analog (Fig. 1). Each year of a tree’s life is marked by a distinctive ring, as growth increases in spring and slows the following winter. The oldest trees on Earth are a few thousand years old, but tree ring dating (or “dendrochronology”) has been pushed back 26,000 years by comparing living trees with buried logs of increasing age (Friedrich et al. 2004; Stuiver et al. 1986).
Sedimentary rocks, too, can display annual layerings, or “varves,” that result from seasonal differences in sediment deposition (Fig. 2) (Kemp 1996). The most dramatic varve deposits, such as a meticulously documented 13,527-year sequence in glacial lakes in Sweden, occur as thin alternating light and dark layers, representing coarser-grained spring sediments and finer winter sediments, respectively. Ancient varved deposits sometimes preserve much longer time spans: the finely laminated Green River shale in Wyoming features continuous vertical sections with more than a million such layers (Fig. 3).
The oldest annual layers are extracted from ice cores, whose laminae arise from seasonal variations in snowfall (Fig. 4). A 2,000-meter ice core from East Antarctica reveals 160,000 annual layers of accumulation, year by year, snow layer by snow layer. And those annual layers rest atop another 2,000 meters of ice, which sit on vastly older rocks. Ice cores of similar age have been recovered from Greenland’s thick glacial deposits. The obvious conclusion is that at least a million years is needed to account for many surficial deposits of sediment and ice. Earth must be much older than that, but how old?
Geologic Rates
Slow, inexorable changes of Earth’s dynamic surface provide a vivid, if approximate, measure of deep time. Consider three simple “back-of-the-envelope” calculations. First, how old is the big island of Hawaii? The massive Hawaiian Islands rose from the Pacific as volcanoes periodically added layers of lava (Fig. 5). From modern-day eruptions, we know that active volcanoes grow by perhaps a meter every century. The highest point on the big island of Hawaii is Mauna Kea at 4,205 meters above sea level. However, the volcano rises approximately 10,200 meters above the ocean floor, so a rough calculation gives its age:
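$$\text{age} \approx \frac{10{,}200\ \text{m}}{1\ \text{m per century}} = 10{,}200\ \text{centuries} \approx 1\ \text{million years}$$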
This is a rough estimate, to be sure, but it jibes well with other methods that date the big island of Hawaii at about a million years old. The other islands that string out to the northwest, each with now-dormant volcanoes, are progressively older (and a new island, dubbed Loihi, is already forming as volcanoes erupt on the ocean floor southeast of the big island).
You can do a similar calculation to date the Atlantic Ocean, which is about 3,700 kilometers wide and grows wider every year. The near perfect fit of the East Coast of South America with the West Coast of Africa provided key evidence for plate tectonics, the idea that Earth’s crust is broken into about a dozen thin, brittle plates that shift positions in response to convection in Earth’s deep interior. These continents were once joined into the supercontinent Pangaea; the Atlantic Ocean formed when Pangaea split down the middle and formed a divergent boundary, now marked by the Mid-Atlantic Ridge (Fig. 6). New crust forms along the Ridge, as Europe and Africa move away from the Americas. Exacting satellite measurements over the past two decades reveal an average spreading rate of 2.5 centimeters per year, so the approximate age of the Atlantic is easily estimated as:
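$$\text{age} \approx \frac{3{,}700\ \text{km}}{2.5\ \text{cm/yr}} = \frac{3.7 \times 10^{8}\ \text{cm}}{2.5\ \text{cm/yr}} \approx 1.5 \times 10^{8}\ \text{years} = 150\ \text{million years}$$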
This rough estimate of about 150 million years is close to other measurements of the age of the Atlantic. It is remarkable to imagine that a great ocean, a seemingly permanent feature of our home planet, is so transient in the context of Earth history.
A third simple calculation reveals even longer time spans. The Appalachian Mountains are now gently rounded and relatively low—mostly below 3,000 meters high (Fig. 7). But geological evidence reveals that they once were the grandest mountain chain on Earth, rivaling the Himalayas in ruggedness and height (with some peaks at more than 10,000 meters). Ever so gradually, erosion has worn the Appalachians down to their present state, but how long might that process take?
For the sake of a rough estimate, consider a mountain as a rectangular block about five kilometers high, four kilometers long, and four kilometers wide (that’s about 2.5 miles on a side). The volume of this impressive mountain is thus:
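$$V = 5\ \text{km} \times 4\ \text{km} \times 4\ \text{km} = 80\ \text{km}^{3} = 8 \times 10^{10}\ \text{m}^{3}$$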
Now, imagine a stream that flows down the side of this mountain. Mountain streams carry silt and sand downwards—a key factor in erosion. You’ll probably also find coarse gravel and even a few boulders in the stream, the result of occasional flash floods that follow heavy rains. All of these sediments came from higher up the mountain, which is constantly being eroded away.
To estimate how long a mountain might survive against erosion, consider a mountain with six principal streams. A typical stream might carry an average of one-tenth of a cubic meter of rock and soil (a few shovels full) per day off the mountain, though the actual amount would vary considerably from day to day. Over a period of a year, the six streams might thus remove:
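$$6\ \text{streams} \times 0.1\ \frac{\text{m}^{3}}{\text{stream}\cdot\text{day}} \times 365\ \frac{\text{days}}{\text{year}} \approx 220\ \frac{\text{m}^{3}}{\text{year}}$$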
That means every year on the order of 220 cubic meters of material, or about 20 dump trucks full of rock and soil, might be removed from a mountain by normal stream erosion. If the mountain streams remove about 220 cubic meters per year, then the lifetime of the mountain can be estimated as the total volume of the mountain divided by the volume lost each year:
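$$\text{lifetime} \approx \frac{8 \times 10^{10}\ \text{m}^{3}}{220\ \text{m}^{3}/\text{yr}} \approx 3.6 \times 10^{8}\ \text{years} = 360\ \text{million years}$$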
This estimate is certainly rough and not directly applicable to any specific mountain. But the lesson is clear: even the grandest mountains can’t last more than a few hundred million years. Nevertheless, a few hundred million years is but a small fraction of a few billion years. How can we say Earth is 4.56 billion years old?
Radiometric Dating
The physical process of radioactive decay has provided Earth scientists, anthropologists, and evolutionary biologists with their most important method for determining the absolute age of rocks and other materials (Dalrymple 1991; Dickin 2005). This remarkable technique, which depends on measurements of the distinctive properties of radioactive materials, is called radioisotope geochronology, or simply “radiometric dating.”
Trace amounts of isotopes of radioactive elements, including carbon-14, uranium-238, and dozens of others, are all around us—in rocks, in water, and in the air (Table 1). These isotopes are unstable, so they gradually break apart or “decay.” Radiometric dating works because radioactive elements decay in a predictable fashion, like the regular ticking of a clock. Here’s how it works. If you have a collection of one million atoms of a radioactive isotope, half of them will decay over a span of time called the “half-life.” Uranium-238, for example, has a half-life of 4.468 billion years, so if you start with a million atoms and come back in 4.468 billion years, you’ll find only about 500,000 atoms of uranium-238 remaining. The rest of the uranium will have decayed to 500,000 atoms of other elements, ultimately to stable (i.e., nonradioactive) atoms of lead-206. Wait another 4.468 billion years and only about 250,000 atoms of uranium will remain (Fig. 8).
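In equation form: if a sample starts with $N_0$ atoms of a radioactive isotope whose half-life is $t_{1/2}$, the number of atoms remaining after a time $t$ is

$$N(t) = N_0 \left(\frac{1}{2}\right)^{t/t_{1/2}}.$$

Setting $t = t_{1/2}$ gives $N = N_0/2$, and setting $t = 2\,t_{1/2}$ gives $N = N_0/4$, just as in the uranium-238 example above.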
Table 1  Common radioactive elements and their half-lives

Isotope        Half-life
Radon-222      3.8 days
Carbon-14      5,730 years
Potassium-40   1.248 billion years
Uranium-238    4.468 billion years
Rubidium-87    47 billion years
The best-known radiometric dating method involves the isotope carbon-14, with a half-life of 5,730 years. Every living organism takes in carbon during its lifetime. At this moment, your body is taking the carbon in your food and converting it to tissue, and the same is true of all other animals. Plants are taking in carbon dioxide from the air and turning it into roots, stems, and leaves. Most of this carbon (about 99%) is in the form of stable (non-radioactive) carbon-12, while perhaps 1% is the slightly heavier stable carbon-13. But a certain small percentage of the carbon in your body and every other living thing—no more than one carbon atom in every trillion—is in the form of radioactive carbon-14.
As long as an organism is alive, the carbon-14 in its tissues is constantly renewed in the same small, part-per-trillion proportion that is found in the general environment. All of the isotopes of carbon behave the same way chemically, so the proportions of carbon isotopes in living tissue will be nearly the same everywhere, for all living things. When an organism dies, however, it stops taking in carbon of any form. From the time of death, therefore, the carbon-14 in the tissues is no longer replenished. Like a ticking clock, carbon-14 atoms transmute by radioactive decay to nitrogen-14, atom by atom, to form an ever-smaller percentage of the total carbon. Scientists can thus determine the approximate age of a piece of wood, hair, bone, or other object by carefully measuring the fraction of carbon-14 that remains and comparing it to the amount of carbon-14 that we assume was in that material when it was alive. If the material happens to be a piece of wood taken out of an Egyptian tomb, for example, we have a pretty good estimate of how old the artifact is and, by inference, when the tomb was built. What’s more, scientists have conducted meticulous year-by-year comparisons of carbon-14 dates with those of tree ring chronologies (Reimer et al. 2004). The result: the two independent techniques yield consistent dates for ancient fossil wood.
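Running the decay equation backward turns a measured fraction into an age: if $f_0$ is the carbon-14 fraction in living tissue and $f$ is the fraction measured in the sample today, then

$$t = t_{1/2}\,\log_{2}\!\left(\frac{f_0}{f}\right) = 5{,}730\ \text{years} \times \log_{2}\!\left(\frac{f_0}{f}\right).$$

A sample retaining half its original carbon-14 is thus one half-life (5,730 years) old; a sample retaining one quarter is two half-lives (11,460 years) old.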
Carbon-14 dating often appears in the news in reports of ancient human artifacts. In a highly publicized discovery in 1991, an ancient hunter was found frozen in the ice pack of the Italian Alps (Fig. 9). “Ötzi the iceman,” as he was called, was shown by carbon-14 techniques to date from about 5,300 years ago. The technique provided similar age determinations for the tissues of the iceman, his clothing, and his implements (Fowler 2000).
Carbon-14 dating has been instrumental in mapping human history over the last several tens of thousands of years. When an object is more than about 50,000 years old, however, the amount of carbon-14 left in it is so small that this dating method cannot be used. To date rocks and minerals that are millions of years old, scientists must rely on similar techniques that use radioactive isotopes of much greater half-life (Table 1). Among the most widely used radiometric clocks in geology are those based on the decay of potassium-40 (half-life of 1.248 billion years), uranium-238 (half-life of 4.468 billion years), and rubidium-87 (half-life of 47 billion years). In these cases, geologists measure the total number of atoms of the radioactive parent and stable daughter elements to determine how many radioactive nuclei were present at the beginning. Thus, for example, if a rock originally formed a long time ago with a small amount of uranium atoms but no lead atoms, then the ratio of uranium-to-lead atoms today can provide an accurate geologic stop watch.
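To make the parent-daughter arithmetic concrete, here is a minimal sketch in Python; it assumes an idealized sample that formed with no daughter atoms and has remained a closed system ever since, and the function name and atom counts are illustrative, not measured data:

```python
import math

def radiometric_age(parent_atoms, daughter_atoms, half_life_years):
    """Age implied by a parent/daughter ratio, assuming the sample
    formed with no daughter atoms and stayed a closed system."""
    # Every daughter atom was once a parent atom, so the original
    # parent count is the sum of the two.
    original_parent = parent_atoms + daughter_atoms
    # Invert N = N0 * (1/2)**(t / half_life) to solve for t.
    return half_life_years * math.log2(original_parent / parent_atoms)

# A sample in which slightly less than half the original uranium-238
# remains (49 parent atoms for every 51 daughter atoms of lead-206):
age = radiometric_age(49, 51, 4.468e9)
print(f"{age / 1e9:.2f} billion years")  # ~4.60 billion years
```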
When you see geologic age estimates reported in scientific publications or in the news, chances are those values are derived from radiometric dating techniques. In the case of the early settlement of North America, for example, carbon-rich campfire remains and associated artifacts point to a human presence by about 13,000 years ago. Much older events in the history of life, some stretching back billions of years, are often based on potassium-40 dating. This technique works well because fossils are almost always preserved in layers of sediments, which also record periodic volcanic ash falls as thin horizons. Volcanic ash is rich in potassium-bearing minerals, so each ash fall provides a unique time marker in a sedimentary sequence. The rise of humans about 2.5 million years ago, the extinction of the dinosaurs 65 million years ago, the appearance of animals with hard shells starting about 540 million years ago, and other key transitions in life on Earth are usually dated in this way (Fig. 10).
The oldest known rocks, including basalt and other igneous formations, solidified from incandescent red-hot melts. These durable samples from the moon and meteorites are typically poor in potassium, but fortunately, they incorporate small amounts of uranium-238 and other radioactive isotopes. As soon as these molten rocks cool and harden, their radioactive elements are locked into place and begin to decay. The most ancient of these samples are several types of meteorites, in which slightly more than half of the original uranium has decayed to lead. These primordial space rocks, the leftovers from the formation of Earth and other planets, yield an age of about 4.56 billion years for the nascent solar system. The oldest known moon rocks, at about 4.46 billion years, also record these earliest formative events (Norman et al. 2003).
Earth must have formed at about the same time, but our restless planet’s original surface has now eroded away. Only a few uranium-rich, sand-sized grains of the hardy mineral zircon, some as old as 4.4 billion years, survive (Wilde et al. 2001). Nevertheless, uranium-bearing rocks on every continent provide a detailed chronology of the early Earth (Hazen et al. 2008, 2009). The oldest Earth rocks, at about four billion years, point to the early origins of continents. Rocks from almost 3.5 billion years ago host the oldest unambiguous fossils—primitive microbes and dome-like structures called stromatolites, which formed their rocky homes (Fig. 11). Distinctive uranium-rich sedimentary formations and layered deposits of iron oxides from about 2.5 to 2.0 billion years ago document the gradual rise of atmospheric oxygen through photosynthesis (Hazen et al. 2008, 2009). Indeed, every stage of Earth history has been dated with exquisite accuracy and precision thanks to radiometric techniques.
Overwhelming observational evidence confirms that Earth history is the story of co-evolving geosphere and biosphere: life has changed continuously over the course of Earth history. As the work of Eugenie Scott has so forcefully argued, Earth must be billions of years old (Scott 2009). However, such a conclusion is at odds with the doctrine of many Christian fundamentalists, who believe in a literal Biblical chronology of a universe no more than about 10,000 years old. How can science respond to such adamant claims?
The testimony of the rocks is unambiguous: an enormous body of observational evidence points to the reality of deep time. Annual ice and rock layerings reveal a million years of Earth history. Geologic rates of mountain building, erosion and plate tectonics demand hundreds of millions of years. Radiometric dating pushes the history back billions of years. And when these techniques overlap, their independent estimates of the timing of ancient events are internally consistent. Any claim that Earth’s age is 10,000 years or less defies the overwhelming and unambiguous observational evidence, not to mention the laws of physics and chemistry. Such a “young-Earth” chronology is based entirely on a rigid, some would say idiosyncratic, reading of certain translations of the Bible. There is no science in “scientific creationism,” nor intelligence in “intelligent design.”
The only alternative for a person who believes in a young Earth is that God has falsified Earth’s record to test our faith—a conclusion first expounded by the exacting English naturalist and devout believer Philip Gosse in 1857 (Gosse 1998). In his treatise Omphalos (named for the Greek “navel,” because motherless Adam was created with a navel so as to look as if born of woman), Gosse catalogues hundreds of pages of unambiguous evidence for an ancient Earth. And then, remarkably, he proceeds to describe how God created everything 10,000 years ago to look much older!
Some readers may find comfort in this unfalsifiable creationist loophole of “created antiquity” or “prechronism.” We observe stars and galaxies that are billions of light-years away, so one might conclude that light has been traveling through space for all those billions of years. But no, according to the doctrine of created antiquity, the universe was created with light from those stars and galaxies already on its way to Earth. We observe rocks with characteristic ancient ratios of radioactive parent and daughter isotopes. Presumably, the rocks are ancient. But no, those rocks were created with just the right mixtures of uranium, lead, potassium, and carbon to make them appear much older than they really are.
Here, scientists are stymied. It is difficult to imagine any experiment or observation that could disprove the doctrine of created antiquity. Any result of any measurement that reveals evidence for deep time, no matter what it is, can be dismissed—explained away as misleading and false just by saying “God created the universe that way to look much older.” But the implications of such a contention are devastating to rational thought. I refuse to accept the idea that any God would bestow such precious gifts as our senses and reason, seemingly to understand His creation, and then try to fool us.
This inflexible characteristic of young-Earth creationist arguments proves to be their Achilles’ heel in scientific debates. Every scientific idea must be testable by observations or experiments that can be independently confirmed. In principle, it must be possible to imagine outcomes that would prove the proposition wrong. Without such independent confirmation, a hypothesis cannot be considered scientific. Created antiquity is not falsifiable. Consequently, the teaching of young-Earth creationism, as well as any other doctrine based on a miraculous creation of life, has been repeatedly prohibited in public schools not because the doctrine was proved wrong, but because it simply is not science. As the US Supreme Court ruled in Edwards v. Aguillard (1987), creationism is a religious belief that is inherently untestable by the techniques of science (Working Group on Teaching Evolution 1998).
Many lines of evidence point to the unfathomable antiquity of Earth. As this article has discussed, geologists employ annual layerings of rock, gradual changes of Earth’s surface, and the inexorable decay of radioactive elements to confirm the vastness of geologic time. More than a dozen other techniques also provide reliable age determinations: fission-track dating based on gradual accumulation of radiation damage, thermochronology based on the slow diffusion of atoms through rocks, methods that rely on surface weathering rates or even on the slow growth of lichens. These and other measures of deep time are independent yet yield the same unassailable results. Geologic data are complemented by insights from astrophysics (see Krauss this issue) and biology (see Padian this issue).
The lessons of the rocks, stars, and life are equally clear. If you would choose to understand Earth, then you must divorce yourself from the inconsequential temporal or spatial scale of a human life. We live on a single tiny world in a cosmos of a hundred billion galaxies, each with a hundred billion stars. Similarly, we live day by day in a cosmos aged trillions of days. If you would seek meaning and purpose in the cosmos, you will not find it in any privileged status in space or time. Rather, Earth and the heavens declare the glory of a cosmos bounded by natural laws that lead inevitably, inexorably to a universe that is learning to know itself.