It was
February 2000 and the Nobel laureate Paul Crutzen was sitting in a meeting room
in Cuernavaca, Mexico, stewing quietly. Five years earlier, Crutzen and two
colleagues had been awarded the Nobel prize in chemistry for proving that the
ozone layer, which shields the planet from ultraviolet light, was thinning at
the poles because of rising concentrations of industrial gases. Now he was
attending a meeting of scientists who studied the planet’s oceans, land
surfaces and atmosphere. As the scientists presented their findings, most of
which described dramatic planetary changes, Crutzen shifted in his seat. “You
could see he was getting agitated. He wasn’t happy,” Will Steffen, a chemist
who organised the meeting, told me recently.
What
finally tipped Crutzen over the edge was a presentation by a group of
scientists that focused on the Holocene, the geological epoch that began around
11,700 years ago and continues to the present day. After Crutzen heard the word
Holocene for the umpteenth time, he lost it. “He stopped everybody and said:
‘Stop saying the Holocene! We’re not in the Holocene any more,’” Steffen
recalled. But then Crutzen stalled. The outburst had not been premeditated, but
now all eyes were on him. So he blurted out a name for a new epoch. A combination
of anthropos, the Greek for “human”, and “-cene”, the suffix used in names of
geological epochs, “Anthropocene” at least sounded academic. Steffen made a
note.
A few
months after the meeting, Crutzen and an American biologist, Eugene Stoermer, expanded
on the idea in an article on the “Anthropocene”. We were entering an entirely
new phase of planetary history, they argued, in which human beings had become
the driving force. And without a major catastrophe, such as an asteroid impact
or nuclear war, humankind would remain a major geological force for many
millennia. The article appeared on page 17 of the International
Geosphere-Biosphere Programme’s newsletter.
At this
point it did not seem likely the term would ever travel beyond the abstruse literature
produced by institutions preoccupied with things like the nitrogen cycle. But
the concept took flight. Environmental scientists latched on to what they saw
as a useful catch-all term for the changes to the natural world – retreating
sea ice, accelerating species extinction, bleached coral reefs – that they were
already attributing to human activity. Academic articles began to appear with
“Anthropocene” in the title, followed by entire journals dedicated to the
topic. Soon the idea jumped to the humanities, then newspapers and magazines,
and then to the arts, becoming a subject of photography, poetry, opera and a
song by Nick Cave. “The proliferation of this concept can mainly be traced back
to the fact that, under the guise of scientific neutrality, it conveys a
message of almost unparalleled moral-political urgency,” wrote the German
philosopher Peter Sloterdijk.
There
was just one place where the Anthropocene seemed not to be catching on: among
the geologists who actually define these terms. Geologists are the guardians of
the Earth’s timeline. By studying the Earth’s crust, they have carved up the
planet’s 4.6bn years of history into phases and placed them in chronological
order on a timescale called the International Chronostratigraphic Chart. That
timescale is the backbone of geology. Modifying it is a slow and tortuous
process, overseen by an official body, the International Commission on
Stratigraphy (ICS). You can’t just make up a new epoch and give it a convincing
name; the care taken over the timescale’s construction is precisely what gives
it authority.
But as
the idea of the Anthropocene spread, it became harder for geologists to ignore.
At a meeting of the Geological Society of London, in 2006, a stratigrapher
named Jan Zalasiewicz argued that it was time to look at the concept seriously.
Stratigraphy is the branch of geology that studies rock layers, or strata, and
it is stratigraphers who work on the timescale directly.
To
Zalasiewicz’s surprise, his colleagues agreed. In 2008, the Cambridge geologist Phil Gibbard asked if
Zalasiewicz would be prepared to assemble and lead a team of experts to
investigate the matter more deeply. If the group found evidence that the
Anthropocene was “stratigraphically real”, they would need to submit a proposal
to the ICS. If the proposal was approved, the result would be literally
epoch-changing. A new chapter of Earth’s history would need to be written.
With a
mounting sense of apprehension, Zalasiewicz agreed to take on the task. He knew
the undertaking would not only be difficult but divisive, risking the ire of
colleagues who felt that all the chatter around the Anthropocene had more to do
with politics and media hype than actual science. “All the things the
Anthropocene implies that are beyond geology, particularly the social-political
stuff, is new terrain for many geologists,” Zalasiewicz told me. “To have this
word used by climate commissions and environmental organisations is unfamiliar
and may feel dangerous.”
What’s
more, he had no funding, which meant he would have to find dozens of experts
for the working group who would be willing to help him for free. Having spent
much of his career absorbed in the classification of 400m-year-old fossils
called graptolites, Zalasiewicz did not consider himself a natural people
manager. “I found myself landed in this position,” he said. “My reaction was:
goodness me, where do we go from here?”
Before geology emerged as a science, the Earth’s age was commonly reckoned from scripture at a few thousand years. Then, in
the late 18th century, a different theory emerged, one based on the close
observation of the natural world. By studying the near-imperceptibly slow process
of the weathering and forming of rocks, thinkers such as the Scottish landowner
James Hutton argued that the Earth must be far, far older than previously
thought.
The
invention of geology would go on to transform our sense of our place in
existence, a revolution in self-perception similar to the discovery that the
Earth is not at the centre of the universe. Human beings were suddenly an
astonishingly recent phenomenon, a “parenthesis of infinitesimal brevity”, as
James Joyce once wrote. During the almost inconceivable expanse of pre-human
time, successive worlds had risen and collapsed. Each world had its own
peculiar history, which was written in rock and waiting to be discovered.
In the
early 19th century, geologists began naming and organising different rock
formations in a bid to impose some order on the endless discoveries they were
making. They used clues within the rock layers, such as fossils, minerals,
texture and colour, to tell when formations in different locations dated to the
same time period. For instance, if two bands of limestone contained the same
type of fossilised mollusc, alongside a certain quartz, it was likely they had
been laid down at the same point in time, even if they were discovered miles
apart.
Geologists
called the spans of time that the rock formations represented “units”. On the
timescale today, units vary in size, from eons, which last for billions of
years, to ages, which last for mere thousands. Units nestle inside each other,
like Russian dolls. Officially, we live in the Meghalayan age (which began
4,200 years ago) of the Holocene epoch. The Holocene falls in the Quaternary
period (2.6m years ago) of the Cenozoic era (66m) in the Phanerozoic eon
(541m). Certain units attract more fanfare than others. Most people recognise
the Jurassic.
The task
of interpreting and classifying 4.6bn years of Earth history continues today.
Geologists have barely begun to describe the Precambrian, which spans
Earth’s first 4bn years. Meanwhile, well-studied units are revised as new
evidence unsettles old assumptions. In 2004, the Quaternary period was
unceremoniously jettisoned and the preceding period, the Neogene, extended to
cover its 1.8m years. The move came as a surprise to many Quaternary
geologists, who mounted an aggressive campaign to redeem their period.
Eventually, in 2009, the ICS brought the Quaternary back and moved its boundary
down by 800,000 years to the beginning of an ice age, a point considered more
geologically significant. Having now “lost” millions of years, Neogene
scientists were incandescent. “You might ask: who wasn’t upset by it?” Gibbard
told me.
Modifying
the geological timescale is a bit like trying to pass a constitutional
amendment, with rounds of proposal and scrutiny overseen by the ICS. “We have
to be relatively conservative,” said Gibbard, “because anything we do is going
to have a longer-term implication in terms of the science and literature.”
First, a working group drafts a proposal which is submitted to an expert
subcommission for review and vote. From the subcommission, the proposal
advances to the voting members of the ICS (composed of the chairs of the
subcommissions, plus the chair, vice-chair and general-secretary of the ICS).
Once the ICS has voted in its favour, it passes to the International Union of
Geological Sciences (IUGS), geology’s highest body, to be ratified.
Whether
or not a new proposal successfully passes through all these rounds comes down
to the quality of evidence that the working group can amass, as well as the
individual predilections of the 50-or-so seasoned geologists who constitute the
senior committees.
This did
not bode well for Zalasiewicz as he began to put together the Anthropocene
working group. In fundamental ways, the idea of the Anthropocene is unlike
anything geologists have considered before. The planet’s timekeepers have built
their timescale from the physical records laid down in rocks long ago. Without
due time to form, the “rocks” of the Anthropocene were little more than “two
centimetres of unconsolidated organic matter”, as one geologist put it to me.
“If we think about the Anthropocene in purely geological terms – and that’s the
trouble, because we’re looking at it with that perspective – it’s an instant,”
said Gibbard.
Zalasiewicz
grew up in the foothills of the Pennines in a house that contained his parents,
sister and a growing collection of rocks. When he was 12, his sister brought
home a nestful of starlings, which his mother, who loved animals, nursed to
health. Soon neighbours started calling round with all manner of injured birds,
and for several years Zalasiewicz shared his bedroom with a little owl and a
kestrel. (Kestrels, he came to know, are “rather thick creatures”.) He started
volunteering at the local museum in Ludlow in the summer, where he met people
who were expert in the things he cared most about, such as where to find
trilobites. By his mid-teens, he told me, “geology was it”.
Now 64,
Zalasiewicz is small and slight, with silver hair that sticks out like a
scarecrow’s. He has worked in Leicester University’s geology department for 20
years, and presents himself as a quintessential geologist, a wearer of leather
elbow patches and lover of graptolites. Yet among geologists, he is a known
provocateur. His reputation stems from one of his papers, published in 2004, in
which he argued that stratigraphy should throw out some of the terminology that
has been in use since the discipline’s earliest days in favour of more modern
terms. It was, to some, an audacious suggestion. When I emailed David
Fastovsky, the former editor of the journal Geology, who had published the
paper 15 years earlier, he remembered it well. “The general feeling at the time,”
he wrote, “was that it might be possible, but who would dare to take the first
shot?”
Over the
years, Zalasiewicz has indulged in thought experiments that are, among
geologists, peculiar. In 1998, he wrote an article for New Scientist in which
he imagined what mark humans might leave on the Earth long after we are
extinct. His ideas became a book, published 10 years later, called The Earth
After Us. Geologists tend to have their minds trained on the deep past, and
Zalasiewicz’s forward-thinking approach marked him out. When, in 2006, Zalasiewicz
broached the subject of the Anthropocene at the Geological Society meeting,
Gibbard recalled thinking: “Well, these two go together very well.”
After he
was appointed chair of the Anthropocene working group, Zalasiewicz needed to
assemble his team. “At the time, it was simply a hypothetical and interesting
question: can this thing be for real geologically?” Zalasiewicz told me when I
visited him in Leicester last year. “It was arm-waving with very little
specific detail. The diagrams were back-of-the-beer-mat things.”
Stratigraphic
working groups are, not surprisingly, usually composed of stratigraphers. But
Zalasiewicz took a different approach. Alongside traditional geologists, he
brought in Earth systems scientists, who study planet-wide processes such as
the carbon cycle, as well as an archaeologist and an environmental historian.
Soon the group numbered 35. It was international in character, if
overwhelmingly male and white, and included experts with specialisms in
paleoecology, radiocarbon isotopes and the law of the sea.
If the
Anthropocene was, in fact, already upon us, the group would need to prove that
the Holocene – an unusually stable epoch in which temperature, sea level and
carbon dioxide levels have stayed relatively constant for nearly 12 millennia –
had come to an end. They began by looking at the atmosphere. During the
Holocene, the amount of CO2 in the air, measured in parts per million (ppm),
was between 260 and 280. Data from 2005, the most recent year recorded when the
working group started out, showed levels had climbed to 379 ppm. Since then, the
concentration has risen to 405 ppm. The group calculated that the last time there was this
much CO2 in the air was during the Pliocene epoch 3m years ago. (Because the
burning of fossil fuels in pursuit of the accumulation of capital in the west
has been the predominant source of these emissions, some suggest “Capitalocene”
is the more appropriate name.)
Next
they looked at what had happened to animals and plants. Past shifts in
geological time have often been accompanied by mass extinctions, as species
struggle to adapt to new environments. In 2011, research by Anthony Barnosky, a
member of the group, suggested something similar was underway once again.
Others investigated the ways humans have scrambled the biosphere, removing
species from their natural habitat and releasing them into new ones. As humans
have multiplied, we have also made the natural world more homogenous. The
world’s most numerous bird, the broiler chicken, of which there are 23bn
alive at any one time, was created by humans to be eaten by humans.
Then
there was also the matter of all our stuff. Not only have humans modified the
Earth’s surface by building mines, roads, towns and cities, we have created
increasingly sophisticated materials and tools, from smartphones to ballpoint
pens, fragments of which will become buried in sediment, forming part of the
rocks of the future. One estimate puts the weight of everything humans have
ever built and manufactured at 30tn tonnes. The working group argued that the
remnants of our stuff, which they called “technofossils”, will survive in the
rock record for millions of years, distinguishing our time from what came
before.
By 2016,
most of the group was persuaded that what they were seeing amounted to more
than a simple fluctuation. “All these changes are either complete novelties or
they are just off the scale when it comes to anything Holocene,” Zalasiewicz
told me. That year, 24 working group members co-authored an article, published
in the journal Science, announcing that the Anthropocene was “functionally and
stratigraphically distinct” from the Holocene.
But the
details were far from settled. The group needed to agree a start-date for the
Anthropocene, yet there was nothing as clean as a colossal volcanic eruption or
an asteroid strike to mark the point where it began. “From a geological point
of view, that makes life very difficult,” said Gibbard, who is also a member of
the working group.
The
group was split into opposed camps, largely according to their academic
specialisation. When he first proposed the notion of the Anthropocene, Paul
Crutzen, an atmospheric chemist, had suggested the industrial revolution as the
start-date, because that was when concentrations of CO2 and methane began
accumulating significantly in the air. More recently, the Earth
system scientists had come to prefer the start of the so-called “great
acceleration”, the years following the second world war when the collective
actions of humans suddenly began to put much more strain on the natural world
than ever before. Most stratigraphers were now siding with them – they believe
that the activity of the 1950s will leave a sharper indentation on the
geological record. This concerned the archaeologists, who felt that privileging
a 1950 start-date dismissed the thousands of years of human impact that they
study, from our early use of fire to the emergence of agriculture. “There is a
feeling among the archaeologists that because the word ‘anthropo’ is in there,
their science should be central,” one geologist complained to me privately.
Agreeing the start-date, Gibbard warned, could be the Anthropocene’s “stumbling
block”.
At the
tail end of last summer, members of the working group boarded flights to
Frankfurt and then took a 45-minute train west, to Mainz. Over two days, they
gathered at the Max Planck Institute for Chemistry for the group’s annual
meeting. Crutzen, now in his mid-80s, spent much of his career at the
institute, and he was present both as a spectator and in the form of a bronze
bust in the foyer. I asked him what he made of the progress of his idea. “It
started with a few people and then it exploded,” he said.
Under
the glow of a projector in a darkened classroom, two dozen researchers shared
their latest findings on topics such as organic isotope geochemistry and peat
deposits. Things proceeded without a wrinkle until the second day, when a
debate broke out about the start date, which then turned into a debate about
whether it was OK for different intellectual communities to use the term
“Anthropocene” to mean different things. Someone at the back suggested adding
the word “epoch” for the strictly geological definition, so “Anthropocene” by
itself could be used generally.
“It’s
just a personal view, but I think it would be confusing to have the same term
having different meanings,” said a stratigrapher.
“I don’t
think it would be that confusing,” an environmental scientist countered.
In the
front row, Zalasiewicz watched with the air of an adjudicator. Eventually, he
chimed in. “Certainly, in terms of our remit, we can only work from the
geological term. We can’t police the word ‘Anthropocene’ beyond that,” he said.
Throughout the meeting, Zalasiewicz seemed at pains to emphasise the
Anthropocene’s geological legitimacy. He was aware that a number of influential
geologists had taken against the idea, and he was worried about what might
happen if the working group was seen to be straying too far from the
discipline’s norms.
One of
the loudest critics of the Anthropocene is Stanley Finney, who as the
secretary-general of the IUGS, the body that ratifies changes to the timescale,
is perhaps the most powerful stratigrapher in the world. During the meeting in
Mainz, I was told that Finney was both a “big beast of the discipline” and
“really vehemently anti-Anthropocene”.
Zalasiewicz
told me that Finney was an accomplished geologist, but one of a different
temperament. “He sees me as someone who tries to bring in these crazy ideas by
the backdoor,” he said. “I guess if you’re a geologist who spends your time in
the past where you have these enormous vistas of time – the human-free zone, if
you like – then to have something as fast, busy, crowded, as
science-fiction-like, come into the steady, formalised, bureaucratised array of
geological time, I can see it as something you might naturally take against.”
When
Finney first came across the term “Anthropocene”, in a paper written by
Zalasiewicz in 2008, he thought little of it. To him, it just seemed like a big
fuss over the human junk on the surface of the planet. Finney, who is 71 and a
professor of geological sciences at California State University, Long Beach,
has spent much of his career trying to picture what the planet was like 450m
years ago, during the Ordovician period, when the continents were bunched
together in the southern hemisphere and plants first colonised land. Over the
years, he has worked his way up through stratigraphy’s hierarchy. By the time
Zalasiewicz was appointed chair of the working group, Finney was chair of the
ICS. The two scientists knew each other professionally. Zalasiewicz’s favourite
fossils, graptolites, are found in Ordovician strata.
But for
some time the pair had not seen eye to eye. When Zalasiewicz published his 2004
paper arguing that stratigraphers should cast off their long-established
terminology, Finney was affronted by this lack of respect for the discipline’s
traditions. In an attempt to find a middle ground, the pair worked on a
“compromise paper”. As the writing got underway, things turned sour. Finney
began to feel that Zalasiewicz was not treating his suggested revisions
seriously. “He would take my comments and he would make tiny little changes but
still keep the whole thing,” Finney told me. “When I saw the final draft that
was ready to be accepted [by a journal], I said: ‘Take my name off, I’m not
happy with this. Just take my name off.’” From then on, their relations assumed
a cool distance.
Finney
only decided to look at the Anthropocene in detail after he began getting
comments from people who thought it was now an official part of the geological
timescale. The more he looked, the less he liked the idea. “You can make the
‘big global changes’ issue out of it if you want, but as geologists we work
with rocks, you know?” he told me. To Finney, a negligible amount of
“stratigraphic content” has amassed since the 1950s. Geologists are used to
working with strata several inches deep, and Finney thought it was excessively
speculative to presume that humans’ impact will one day be legible in rock. As
the Anthropocene working group gained momentum, he grew concerned that the ICS
was being pressured into issuing a statement that at its heart had little to do
with advancing stratigraphy, and more to do with politics.
Academics
both inside and outside geology have noted the Anthropocene’s political
implications. In After Nature, the law professor Jedediah Purdy writes that
using the term “Anthropocene” to describe a wide array of human-caused
geological and ecological change is “an effort to meld them into a single
situation, gathered under a single name”. To Purdy, the Anthropocene is an
attempt to do what the concept of “the environment” did in the 1960s and 70s.
It is pragmatic, a way to name the problem – and thus begin the process of
solving it.
Yet if a
term becomes too broad, its meaning can become unhelpfully vague. “There is an
impulse to want to put things in capital letters, in formal definitions, just
to make them look like they’re nicely organised so you can put them on a shelf
and they’ll behave,” said Bill Ruddiman, professor emeritus at the University
of Virginia. A seasoned geologist, Ruddiman has written papers arguing against
the stratigraphic definition of the Anthropocene on the grounds that any single
start-date would be meaningless since humans have been gradually shaping the
planet for at least 50,000 years. “What the working group is trying to say is
everything pre-1950 is pre-Anthropocene, and that’s just absurd,” he told me.
Ruddiman’s
arguments have found wide support, even from a handful of members of the
working group. Gibbard told me he had started out “agnostic” about the
Anthropocene but lately he had decided it was too soon to tell whether or not
it really was a new epoch. “As geologists, we’re used to looking backwards,” he
said. “Things that we’re living through at the moment – we don’t know how
significant they are. [The Anthropocene] appears significant but it would be
far easier if we were 200 to 300, possibly 2,000 to 3,000, years in the future
and then we could look back and say: yes, that was the right thing to do.”
Yet for
the majority of the working group, the stratigraphic evidence for the
Anthropocene is compelling. “We realise the Anthropocene goes against the grain
of geology in one sense, and other kinds of science, archaeology and
anthropology, in another sense,” Zalasiewicz told me. “We try and deal honestly
with their arguments. If they were to put out something that we couldn’t jump
over, then we’d hold up our hands and say: OK, that’s a killer blow for the
Anthropocene. But we haven’t seen one yet.”
The day
after the Mainz conference came to a close, a small number of working group
members met at the central station and took a train to Frankfurt airport. As
the train left the city it crossed the Rhine, a wide river the colour of tepid
tea. Buildings became sparse, giving way to flat fields crossed by pylons and
wires.
For all
the years of discussion, research and debate, after the meeting it was obvious
that the Anthropocene working group was still a long way off submitting its
proposal to the ICS. Zalasiewicz’s favourite joke, that geologists “work in
geological time”, was starting to wear thin. Proposals to amend the timescale
require evidence in the form of cores of sediment that have been extracted from
the ground. Within the core there must be a clear sign of major environmental
change marked by a chemical or biological trace in the strata, which acts as
the physical evidence of where one unit stops and another begins. (This marker
is often called the “golden spike” after the ceremonial gold spike that was
used to join two railway tracks when they met in the middle of the US in 1869,
forming the transcontinental railroad.)
The core
extraction and analysis process takes years and costs hundreds of thousands of
pounds – money that, at that point, and despite grant applications, the group
did not have. They discussed the problem on the train. “Beg, borrow and steal.
That is the working group motto,” Zalasiewicz said, a little bitterly.
But in
the months that followed the meeting, their fortunes changed. First, they
received €800,000 in funding from an unexpected source, the Haus der Kulturen
der Welt, a state-funded cultural institute in Berlin that has been holding
exhibitions about the Anthropocene for several years. The money would finally
allow the group to begin the core-extraction work, moving the proposal beyond
theoretical discussion and into a more hands-on, evidence-gathering stage.
Then, in
late April, the group decided to hold a vote that would settle, once and for
all, the matter of the start-date. Working group members had one month to cast
their votes; a supermajority of at least 60% would be needed for the vote to be
binding. The results, announced on 21 May, were unequivocal. Twenty-nine
members of the group, representing 88%, voted for the start of the Anthropocene
to be in the mid-20th century. For Zalasiewicz, it was a step forward. “What
we’ll do now is the technical work. We’ve now moved beyond the general, almost
existential question of ‘is the Anthropocene geological?’” he said, when I
called him. The important votes at the ICS were still to come, but he felt
optimistic.
After
the train from Mainz pulled into the airport, the group made for the
departure zone. Among the chaos of wheelie suitcases and people hurrying about,
suddenly a voice cried out: “Fossils!” Zalasiewicz was off to one side, eyes
fixed on the polished limestone floor. “That’s a fossil, these are fossil
shells,” he said, pointing to what looked like dark scratches. One was the
shape of a horseshoe, and another looked like a wishbone. Zalasiewicz
identified them as rudists, a type of mollusc that had thrived during the
Cretaceous, the last period of the dinosaurs. Rudists were a hardy species, the
main reef-builders of their time. One rudist reef ran the length of the North
American coast from Mexico to Canada.
Staring
at the rudists encased in limestone slabs that had been dug out of the ground
and transported many miles across land, it was strange to think of the
unlikeliness of their arrival in the airport floor. The rudists beneath our
feet had died out 66m years ago, in the same mass extinction event that wiped
out the dinosaurs. Scientists generally believe that the impact of an asteroid
in Yucatan, Mexico, plunged the planet into a new phase of climatic instability
in which many species perished. Geologists can see the moment of the impact in
rocks as a thin layer of iridium, a metal that occurs in very low
concentrations on Earth and was likely expelled by the asteroid and dispersed
across the world in a cloud of pulverised rock that blotted out the sun. To
stratigraphers, the iridium forms the “golden spike” between the Cretaceous and
Paleogene periods.
Now that
the working group has decided roughly when the Anthropocene began, their main
task is picking the golden spike of our time. They are keeping their options
open, assessing candidates from microplastics and heavy metals to fly ash. Even
so, a favourite has emerged. From the pragmatic stratigraphic perspective, no
marker is as distinct, or as globally synchronous, as the radioactive
fallout from the use of nuclear weapons that began with the US army’s Trinity
test in 1945. Since the early 1950s, this memento of humankind’s darkest self-destructive
impulses has settled on the Earth’s surface like icing sugar on a sponge cake.
Plotted on a graph, the radioactive fallout leaps up like an explosion.
Zalasiewicz has taken to calling it the “bomb spike”.
The Anthropocene epoch: have we entered a new phase of planetary history? By Nicola Davison. The Guardian, May 30, 2019.
Also of interest: Have we entered the “Anthropocene”? International Geosphere-Biosphere Programme, October 31, 2010.