The “Anthropocene” Problem

Joe Romm at ThinkProgress has a good post entitled, “Words Matter When Talking About Global Warming: The ‘Good Anthropocene’ Debate”.


Taking a cue from Orwell, who more than any other writer demonstrated how language can be a weapon and a tool of social control, Romm explored how the euphemism “good Anthropocene” is not only meaningless for communication with the public, but also probably harmful as a concept. “Good Anthropocene” becomes jargon and an obfuscation of critical, urgent problems.

But a trickier issue is the term “Anthropocene” itself. This word is not yet formally recognized as a geologic time division, but rather is a suggestion for the name of a new one, to follow the current Holocene epoch. The root “anthropo”—as in “anthropology”—means human; the “Anthropocene” would be the epoch marked by humans.

Geologists divide time periods using several methods. Sometimes the division is marked by the first appearance of a distinctive fossil. For example, the beginning of the Cambrian, 541 million years ago, is defined by the appearance of a complex trace fossil called Treptichnus pedum, which looks rather like the aftermath of a curious worm burrowing through mud. Geologists have agreed that a cliff at Fortune Head, Newfoundland, will serve as the worldwide reference point for how the first appearance of T. pedum presents itself in rock strata. (Geologists call such reference points “Global Boundary Stratotype Sections and Points,” with the rather unhelpful acronym GSSP.)

Geologists define GSSPs in other ways, too: the geologic moment when non-avian dinosaurs became extinct is marked in part by a spike in the element iridium. This iridium anomaly was famously linked to a meteorite impact in what is now the Yucatán Peninsula, a cosmic strike that rang in what was truly one of the worst days this planet has ever seen (researchers suspect it must have happened on a Monday).

In each of these cases, there is a permanent, distinctive feature. That feature is worldwide and detectable. A place is designated as a type locality where future scientists can make comparisons. Thousands or millions of years from now, our new overlords (whom I, for one, welcome) should be able to go to the same place and observe the same distinguishing feature. Formal geologic time divisions require these sorts of features.

Here is where the “Anthropocene” concept runs into trouble. If we define it by a certain extinction, which organism is the best candidate? Humans certainly wiped out passenger pigeons, with the last survivor dying in 1914—but passenger pigeons were limited to North America. We know that humans played a big role in the extinction of the thylacines—but, again, they were geographically limited, in this case to Australia, Tasmania, and New Guinea. And though we know that passenger pigeons and thylacines existed, what sort of fossil record have they left, what trace of them might be detectable in ages to come?

Perhaps we can distinguish the “Anthropocene” using radioactive isotopes produced by atomic bombs. These surely would be widespread and unique to this time period. But many of these isotopes are short-lived, and won’t last over geologic time. Even samples of trinitite, the greenish glass deposited on the desert floor during the very first atomic explosion, are less radioactive today than they were on July 16, 1945, the day of their formation (which was a Monday).

In the June issue of GSA Today, a publication of the Geological Society of America, a paper entitled “An anthropogenic marker horizon in the future rock record” proposed using a substance called “plastiglomerate” to distinguish the human mark. Researchers identified a new “‘stone’ formed through intermingling of melted plastic, beach sediment, [and] basaltic lava fragments.”

This seems like a great candidate—a substance created from plastics unique to humans, one that will become part of the rock record and is (tragically) ubiquitous and worldwide. But the authors also noted the uncertainty about how long plastic may persist, within “a range of hundreds to thousands of years.” Perhaps at lower temperatures plastics might last longer, but there is certainly no conclusive evidence of long-term plastic persistence compared to, say, clay minerals or quartz. If we choose this as the marker of the “Anthropocene,” in a few hundred years we may not be able to find very much of it.

Part of the problem, then, is the nature of our ephemeral perception of time. We think of time using the clipped scale of our brief lives, rather than nature’s stubborn, plodding pace.

If tomorrow our civilization abruptly collapses and humanity goes extinct—which, when one thinks of celebrities such as Kim Kardashian, Justin Bieber, and Chris Brown, might be well-deserved—what will be left in the long run? Will there be any trace of us? In his book The World Without Us, Alan Weisman explores this question and gives numerous examples of nature’s rapid assimilation of our artifacts. Certainly we are leaving our mark in the form of a warmer climate, and this will persist for at least several centuries, but that’s a rounding error as geologists measure time. The upshot is that our perception of humanity’s permanent mark on Earth may be tinged with hubris.

We don’t need to “rebrand” our geologic period to show we’re serious about the problems we are creating. Rebranding is a vacuous trick, substituting superficiality for meaningful change. We might name a time period after ourselves, and “Anthropocene” is a name that works, but I would like to suggest another: we are living in the Ozymandian.

Short Bio

Steve Newton is a former Programs and Policy Director at NCSE.