Blog Archive

Showing posts with label Anthropocene. Show all posts

Sunday, April 5, 2015

Stephen Mulkey: Sleepwalking toward a new ecology

by Stephen Mulkey, The Environmental Century, March 29, 2015

The pace of ecological change is quickening, and I see little sense of urgency to address the negative consequences that are unfolding. The increasing speed of change is a direct consequence of two interacting drivers: resource use and climate change. The long-standing processes of human use of natural resources, and the resulting habitat degradation, have increased in scale and impact as our population has continued to explode. Adding to this, as defense analysts have argued, climate change is both a primary driver and an amplifier of change. Collectively, these factors are driving worldwide ecosystem change at a pace and scale far exceeding any previous period of change in the history of our planet.
Will Steffen and colleagues recently published updates of the famous “Great Acceleration” graphs, which showed major socio-economic trends in resource use from 1750 to 2000. It is no surprise that none of these crucial trends shows any evidence of slowing over the last decade (Steffen et al., 2015, Anthropocene Review 1-18). Although the starting point remains an issue for academic debate (Zalasiewicz et al., 2014, Quaternary International 1-8), there is little doubt that we have entered a new geological epoch whose hallmark characteristic is the impact of humans. The scientific community has declared this to be the Anthropocene epoch (my personal preference was for the term Homogeocene, but this never gained traction).
Steffen et al. 2015.
All ecologists and natural historians who have lived more than a few decades are painfully aware of numerous local habitats that have degraded beyond recognition in our lifetimes. Quite independent of the effects of climate change, we are watching these changes unfold so quickly that destruction can occur essentially overnight. The legendary botanist Alwyn Gentry witnessed the now classic example of the loss of dozens of endemic species as a consequence of a single episode of logging at Centinela Ridge in Ecuador in 1978. This scenario is being played out with increasing speed on land and in marine habitats all over the globe as the Sixth Extinction ensues. Climate change significantly amplifies this steadily increasing loss.
Although ecologists and conservationists have long understood these trends, it is disturbing to me that our institutions and government agencies seem to be clueless about how to manage such change. Simply put, we should be vigorously engaged in proactive adaptation. It seems logically axiomatic that proactive adaptation is far less disruptive and costly than is reactive adaptation. The water crisis in California is a case in point. For well over a decade, general circulation climate models have projected prolonged decadal drought for California and the American Southwest. For many years, the booming agriculture and population growth of the state have been on a collision course with dwindling water resources. Recently, NASA data have shown that California has one year of water reserves above ground. The implications of this for human and natural systems are extreme. Had proactive adaptation been implemented a decade ago, this situation would be much more manageable and much of the pain of conservation and likely rationing could have been avoided.
Most alarming to many ecologists is the speed at which climate change is impacting species and habitats. Evidence has been accumulating for over two decades that species are responding to climate change and to shifts in growing zones (e.g., Walther et al., 2002, Nature 389-395). The implications of these studies are profound for the stability and predictability of ecosystems. Proactive management of forest, prairie, freshwater, wildlife, and marine habitats must begin now if we hope to have viable resources in the second half of this century. Forests are one example: climate change is influencing the survival and reproduction of tree species within the lifetime of individual trees.
The management challenges posed by this speed and degree of change are manifold. When I was a wildlife student at a big university in the Midwest, I was taught that construction of nature preserves was the best approach to ensure the longevity of species. Now we see that the growing zones are on the move and are increasingly decoupled from the species at the base of the food chain. As this decoupling progresses we can expect many preserves to be located in climate regions that are inappropriate for their various conservation functions. Without aggressive intervention, we can expect widespread transformation and even outright failure of species interactions and functional processes within ecosystems. Clearly, we need a much more dynamic and proactive approach to conservation than was typically assumed in the 20th century.
The specific tools and approaches for such dynamic conservation remain nascent in their development at our major research institutions. To respond to the need for proactive management we must refocus public funding to provide strong incentive for research and implementation of adaptive management specific to this new paradigm of change. For example, habitat corridors connecting protected areas will become an increasingly important conservation tool as climate change ensues. The theory and practice of habitat corridor establishment and development are only superficially understood. Similarly, we must more fully understand the complex process of assisted colonization so that we can move species as their preferred habitat changes or degrades. Indeed, it would seem that we must re-examine our understanding of invasive species in this brave new ecology. Unless we make the development of such tools for proactive adaptation a top priority, we will certainly lose much of our natural heritage. Indeed, our children and their descendants will live on a depauperate and diminished planet.
My recommendations and concerns are only relevant if we can mitigate the causes of climate change before midcentury. Although management of human use of natural resources has always been an urgent need, climate change has increased the stakes and made the costs of inaction staggeringly high. An increase of more than 3–4 °C above the preindustrial global average is unthinkable. If we don’t manage and effectively halt emissions of greenhouse gases, there is little that sophisticated management can do to stem the impending losses.
I am struck that many of those working hard to move governments and corporations to address climate change are simply unfamiliar with the concept of ecological change. It has not been part of their training or thinking, and thus they focus on the impacts of climate change on human-engineered systems. This is extremely short-sighted given that all human systems are ultimately tied to our planetary ecology. As pointed out by Sir Nicholas Stern, climate change is the greatest market failure in history because there is no legitimate way to externalize this cost.
We stand at a crossroads for human civilization. It remains to be seen if we can accept the reality that we are of, and not separate from, the Earth. As Bill McKibben pointed out in his landmark The End of Nature, there remains no place on our planet that has not been touched by the hand of humanity, often with devastating consequences for living systems. Despite this widely understood reality, we continue to act as if we can, with impunity, irrevocably alter the fundamental life support systems of our home. David Orr pointed out in his indictment of politics in the US, The Last Refuge, that contrary to conservative claims, most of the major predictions of ecologists have turned out to be fundamentally correct. Today the warning signs could not be more clear, nor the outcome more crucial. We can continue our zombie walk into the future, ignoring the data from well-established science, but this time the consequences will be irrevocable on a millennial timescale.
http://environmentalcentury.net/2015/03/29/sleepwalking-toward-a-new-ecology/

Thursday, June 19, 2014

Joe Romm: Words Matter When Talking Global Warming: The ‘Good’ Anthropocene Debate

by Joe Romm, Climate Progress, June 19, 2014


We spend more of our waking hours communicating than perhaps any other single activity. And while the principles of effective writing and speaking have been understood for centuries if not millennia, they are largely ignored today — sometimes intentionally, as Orwell pointed out nearly seven decades ago.
“In our time, political speech and writing are largely the defense of the indefensible,” George Orwell wrote in “Politics And The English Language” in 1946. “Thus political language has to consist largely of euphemism, question-begging and sheer cloudy vagueness.”
Nowhere is that clearer than in the arena of climate politics and journalism — which often seems driven by the unproductive extremes of “Don’t Worry, Be Happy” and “STAND BACK AND WATCH THE WORLD BURN.” Ultimately, they are both equally pessimistic, since they both push the premise that there is no chance the human race could actually embrace the kind of aggressive action needed to have a realistic chance of avoiding multiple catastrophes.
I am more optimistic, as I explained in my reply to Ezra Klein’s pessimism. I suppose if I had a motto, it might be: Do Worry, Take Action, THEN Be Happy.
I’ve been thinking about all this because I was on two recent science communications panels: a “Science & Policy Communications Workshop” this week for the American Geophysical Union (AGU) and a Communications Workshop at the American Meteorological Society (AMS) Summer Policy Colloquium last week. Everything I know on the subject can be found in my 2012 book, “Language Intelligence: Lessons on Persuasion from Jesus, Shakespeare, Lincoln and Lady Gaga.”
For those who want the pithy version, start with the great 20th Century essayist, Orwell, in his greatest essay, “Politics And The English Language” — and the great 20th Century orator, Winston Churchill, in his essay metaphorically titled, “The Scaffolding of Rhetoric.”
Orwell offers six simple rules for writing with clarity, “rules that one can rely on when instinct fails,” when you are “in doubt about the effect of a word or a phrase”:

  • (i) Never use a metaphor, simile, or other figure of speech which you are used to seeing in print.
  • (ii) Never use a long word where a short one will do.
  • (iii) If it is possible to cut a word out, always cut it out.
  • (iv) Never use the passive where you can use the active.
  • (v) Never use a foreign phrase, a scientific word, or a jargon word if you can think of an everyday English equivalent.
  • (vi) Break any of these rules sooner than say anything outright barbarous.
What’s interesting is that in his essay, Churchill says some very similar things even though he is focused on oratory. “There is no more important element in the technique of rhetoric than the continual employment of the best possible word,” he argues. “Whatever part of speech it is it must in each case absolutely express the full meaning of the speaker. It will leave no room for alternatives.”

So clarity is king, just as it is for Orwell. Churchill then takes on a very common myth about rhetoric:

The unreflecting often imagine that the effects of oratory are produced by the use of long words…. The shorter words of a language are usually the more ancient. Their meaning is more ingrained in the national character and they appeal with greater force to simple understandings than words recently introduced from the Latin and the Greek. All the speeches of great English rhetoricians … display an uniform preference for short, homely words of common usage….

Short words win. Jargon loses.
In preparing for my AMS and AGU talks, I asked a senior legislative aide with over two decades of Hill experience for some advice. He told me that if scientists speak to a Legislative Assistant (L.A.) for a member on the climate issue, they “can’t assume the L.A. knows anything.” It would be a mistake, he said, to even use a phrase like “statistically significant”!

Susan Joy Hassol, an expert in climate communication, made the same point in a 2010 post here — avoid jargon: “Words that seem perfectly common to scientists are still jargon to the wider world and always have simpler substitutes. Rather than anthropogenic, you could say human caused.”
And this brings us to the latest dust-up over jargon and euphemism. The New York Times climate blog published a piece titled, “Exploring Academia’s Role in Charting Paths to a ‘Good’ Anthropocene.”

I think it’s safe to say that both Orwell and Churchill would have gagged at “Anthropocene,” which, as perhaps 1% (0.1%?) of the U.S. population knows, means “an informal geologic chronological term that marks the evidence and extent of human activities that have had a significant global impact on the Earth’s ecosystems.”

Inside the tiny community of people who actually understand the term, there was widespread objection to the entire phrase, “Good Anthropocene.” Australian author, climate expert and Professor of Public Ethics Clive Hamilton wrote, “those who argue for the ‘good Anthropocene’ are unscientific and live in a fantasy world of their own construction.”
I very much agree. Elizabeth Kolbert, one of the most thoughtful climate journalists, tweeted as much.

The NY Times blogger (Andy Revkin) criticized Kolbert for having tweeted that without having watched the hour-long talk he gave, “Paths to a ‘Good’ Anthropocene” (video here).

After she watched it, Kolbert emailed me:
I don’t see the value in the “good Anthropocene” as a rhetorical construct, even if it’s well-intentioned. What we are doing to the planet, which is of course the reason geologists are considering renaming the epoch in which we live, is in no way good. A few years ago, Paul Crutzen told me that he hoped the word Anthropocene would serve as “a warning to the world.” I think part of the power of the term is that it resists modification.

I’ve watched the video. In its own way, it is just as much a pessimistic, self-fulfilling prophecy as Ezra Klein’s “7 reasons America will fail on climate change.” We know what we need to do to avoid catastrophic warming — quickly embrace a series of policies (at a national and global level), including a carbon price, that drive emissions down sharply decade after decade. The good news is that the world’s leading governments and scientists and energy experts have explained that this strategy is cheap (far cheaper than inaction), and that we have the technology to start ASAP. Oh, and deployment-driven innovation will keep providing new and better and cheaper technology.

It is certainly a legitimate view to argue that the nation (and the world) aren’t up to that task, as Klein and Revkin do. But it is Orwellian to claim that making such an argument is optimistic and not self-fulfilling. That’s especially true if your recommended alternative is to basically give up (Klein) or to abandon quantitative targets and embrace personal growth and some R&D (Revkin).
As Hamilton writes:

The advocates of the “good Anthropocene” do not attempt to repudiate the mass of scientific evidence; instead they choose to reframe it. As you declare so disarmingly in your talk: “You can look at it and go ‘Oh my God’, or you can look at it and go ‘Wow, what an amazing time to be alive!’ I kind of choose the latter overall.”

Talking of a “good Anthropocene” while proposing strategies that can’t possibly achieve it — and while repeatedly attacking those (including the National Academy of Sciences) who propose strategies that could — is the road to a very, very bad Anthropocene. In jargon-free terms, it is the road to Hell and High Water.

The phrase “good Anthropocene,” as some are using it, is a euphemism as Orwellian as “enhanced interrogation.” As Hamilton puts it:

… the “good Anthropocene” is a story about the world that could have been written by the powerful interests that have got us into this mess and who are fighting so effectively to prevent us from getting out of it. In the long term this kind of thinking will prove more insidious than climate science denial.

The eco-pragmatists, as Hamilton calls them, never offer any set of specific proposals that any credible group of independent experts has said could possibly keep us far from 7 °F of warming (the end of modern civilization as we know it) — let alone an unimaginable 10+ °F. All they offer is the euphemism, hand-waving, and sheer cloudy vagueness Orwell warned about.
Revkin tweets: “I trust those bridling at vision of a “good” #Anthropocene aren’t hoping for bad one. http://nyti.ms/1qdcd0F @CliveCHamilton @ElizKolbert.”

Seriously. Hamilton and Kolbert have dedicated themselves to informing the public about the worst impacts — and how to avoid them. Kolbert’s terrific 2006 book, Field Notes from a Catastrophe, famously ends, “It may seem impossible to imagine that a technologically advanced society could choose, in essence, to destroy itself, but that is what we are now in the process of doing.”

That kind of clarity is what we are missing from the current discussion. The climate debate isn’t about what people are “hoping for” — it is about the kind of future that we are choosing through our climate policy. So far we have chosen poorly.

http://thinkprogress.org/climate/2014/06/19/3450400/orwell-language-good-anthropocene/

Monday, May 16, 2011

Nature News: Human influence comes of age. Geologists debate epoch ("Anthropocene") to mark effects of Homo sapiens.

Published online 11 May 2011 | Nature 473, 133 (2011) | doi:10.1038/473133a

News

Human influence comes of age

Geologists debate epoch to mark effects of Homo sapiens.


Humanity's profound impact on this planet is hard to deny, but is it big enough to merit its own geological epoch? This is the question facing geoscientists gathered in London this week to debate the validity and definition of the "Anthropocene," a proposed new epoch characterized by human effects on the geological record.

"We are in the process of formalizing it," says Michael Ellis, head of the climate-change programme of the British Geological Survey in Nottingham, who coordinated the 11 May meeting. He and others hope that adopting the term will shift the thinking of policy-makers. "It should remind them of the global and significant impact that humans have," says Ellis.

But not everyone is behind the idea. "Some think it premature, perhaps hubristic, perhaps nonsensical," says Jan Zalasiewicz, a stratigrapher at the University of Leicester, UK, and a co-convener of the meeting. Zalasiewicz, who declares himself "officially very firmly sitting on the fence", also chairs a working group investigating the proposal for the International Commission on Stratigraphy (ICS) — the body that oversees designations of geological time.

The term Anthropocene was first coined in 2000 by Nobel laureate Paul Crutzen, now at the Max Planck Institute for Chemistry in Mainz, Germany, and his colleagues. It then began appearing in peer-reviewed papers as if it were a technical term rather than scientific slang.



The "evidence for the prosecution," as Zalasiewicz puts it, is compelling. Through food production and urbanization, humans have altered more than half of the planet's ice-free land mass1 (see 'Transformation of the biosphere'), and are moving as much as an order of magnitude more rock and soil around than are natural processes2. Rising carbon dioxide levels in the atmosphere are expected to make the ocean 0.3–0.4 pH points more acidic by the end of this century. That will dissolve light-coloured carbonate shells and sea-floor rocks for about 1,000 years, leaving a dark band in the sea-floor sediment that will be obvious to future geologists. A similar dark stripe identifies the Palaeocene–Eocene Thermal Maximum about 55 million years ago, when global temperatures rose by some 6 °C in 20,000 years. A similar temperature jump could happen by 2100, according to some high-emissions scenarios3.

The fossil record will show upheavals too. Some 20% of species living in large areas are now invasive, says Zalasiewicz. "Globally that's a completely novel change." And a review published in Nature in March [4] concluded that the disappearance of the species now listed as "critically endangered" would qualify as a mass extinction on a level seen only five times in the past 540 million years — and all of those mark transitions between geological time periods.
Some at the ICS are wary of formalizing a new epoch. "My main concern is that those who promote it have not given it the careful scientific consideration and evaluation it needs," says Stan Finney, chair of the ICS and a geologist at California State University in Long Beach. He eschews the notion of focusing on the term simply to "generate publicity."

Others point out that an epoch typically lasts tens of millions of years. Our current epoch, the Holocene, began only 11,700 years ago. Declaring the start of a new epoch would compress the geological timeline to what some say is a ridiculous extent. Advocates of the Anthropocene, however, say that it is natural to divide recent history into smaller, more detailed chunks. A less controversial alternative would be to declare the Anthropocene a new "age": a subdivision of an epoch.

If scientists can agree in principle that a new time division is justified, they will have to settle on a geological marker for its start. Some suggest the pollen of cultivated plants, arguing that mankind's fingerprint can be seen 5,000–10,000 years ago with the beginnings of agriculture. Others support the rise in the levels of greenhouse gases and air pollution in the latter part of the eighteenth century, as industrialization began. A third group would start with the flicker of radioactive isotopes in 1945, marking the invention of nuclear weapons.
Should the working group decide that the Anthropocene epoch has merit, it will go to an ICS vote. But the whole process will take time — defining other geological periods has sometimes taken decades. In the meantime, Zalasiewicz says, "the formalization is the excuse to try to do some very interesting science," comparing Earth's current changes to those of the past.

Saturday, February 19, 2011

Elizabeth Kolbert: Enter the Anthropocene—Age of Man

Enter the Anthropocene—Age of Man
It’s a new name for a new geologic epoch—one defined by our own massive impact on the planet. That mark will endure in the geologic record long after our cities have crumbled.
by Elizabeth Kolbert, National Geographic, March 2011
The path leads up a hill, across a fast-moving stream, back across the stream, and then past the carcass of a sheep. In my view it's raining, but here in the Southern Uplands of Scotland, I'm told, this counts as only a light drizzle, or smirr. Just beyond the final switchback, there's a waterfall, half shrouded in mist, and an outcropping of jagged rock. The rock has bands that run vertically, like a layer cake that's been tipped on its side. My guide, Jan Zalasiewicz, a British stratigrapher, points to a wide stripe of gray. "Bad things happened in here," he says.
The stripe was laid down some 445 million years ago, as sediments slowly piled up on the bottom of an ancient ocean. In those days life was still confined mostly to the water, and it was undergoing a crisis. Between one edge of the three-foot-thick gray band and the other, some 80 percent of marine species died out, many of them the sorts of creatures, like graptolites, that no longer exist. The extinction event, known as the end-Ordovician, was one of the five biggest of the past half billion years. It coincided with extreme changes in climate, in global sea levels, and in ocean chemistry—all caused, perhaps, by a supercontinent drifting over the South Pole.
Stratigraphers like Zalasiewicz are, as a rule, hard to impress. Their job is to piece together Earth's history from clues that can be coaxed out of layers of rock millions of years after the fact. They take the long view—the extremely long view—of events, only the most violent of which are likely to leave behind clear, lasting signals. It's those events that mark the crucial episodes in the planet's 4.5-billion-year story, the turning points that divide it into comprehensible chapters.
So it's disconcerting to learn that many stratigraphers have come to believe that we are such an event—that human beings have so altered the planet in just the past century or two that we've ushered in a new epoch: the Anthropocene. Standing in the smirr, I ask Zalasiewicz what he thinks this epoch will look like to the geologists of the distant future, whoever or whatever they may be. Will the transition be a moderate one, like dozens of others that appear in the record, or will it show up as a sharp band in which very bad things happened—like the mass extinction at the end of the Ordovician?
That, Zalasiewicz says, is what we are in the process of determining.
The word "Anthropocene" was coined by Dutch chemist Paul Crutzen about a decade ago. One day Crutzen, who shared a Nobel Prize for discovering the effects of ozone-depleting compounds, was sitting at a scientific conference. The conference chairman kept referring to the Holocene, the epoch that began at the end of the last ice age, 11,500 years ago, and that—officially, at least—continues to this day.
"'Let's stop it,'" Crutzen recalls blurting out. "'We are no longer in the Holocene. We are in the Anthropocene.' Well, it was quiet in the room for a while." When the group took a coffee break, the Anthropocene was the main topic of conversation. Someone suggested that Crutzen copyright the word.
Way back in the 1870s, an Italian geologist named Antonio Stoppani proposed that people had introduced a new era, which he labeled the anthropozoic. Stoppani's proposal was ignored; other scientists found it unscientific. The Anthropocene, by contrast, struck a chord. Human impacts on the world have become a lot more obvious since Stoppani's day, in part because the size of the population has roughly quadrupled, to nearly seven billion. "The pattern of human population growth in the twentieth century was more bacterial than primate," biologist E. O. Wilson has written. Wilson calculates that human biomass is already a hundred times larger than that of any other large animal species that has ever walked the Earth.
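The quadrupling Kolbert mentions can be turned into an average growth rate. A back-of-the-envelope sketch (the 140-year span is my assumption, since the article dates only "Stoppani's day" to the 1870s):

```python
# A quadrupling over roughly 140 years (the 1870s to the article's
# "nearly seven billion") implies an average annual growth rate r
# solving (1 + r) ** years == 4.
years = 140   # assumed span; the article gives no exact interval
factor = 4    # population "roughly quadrupled"
r = factor ** (1 / years) - 1
print(f"average annual growth: {r * 100:.2f}%")  # roughly 1% per year
```

Sustained for fourteen decades, even a rate of about one percent a year compounds into the "bacterial" growth pattern Wilson describes.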
In 2002, when Crutzen wrote up the Anthropocene idea in the journal Nature, the concept was immediately picked up by researchers working in a wide range of disciplines. Soon it began to appear regularly in the scientific press. "Global Analysis of River Systems: From Earth System Controls to Anthropocene Syndromes" ran the title of one 2003 paper. "Soils and Sediments in the Anthropocene" was the headline of another, published in 2004.
At first most of the scientists using the new geologic term were not geologists. Zalasiewicz, who is one, found the discussions intriguing. "I noticed that Crutzen's term was appearing in the serious literature, without quotation marks and without a sense of irony," he says. In 2007 Zalasiewicz was serving as chairman of the Geological Society of London's Stratigraphy Commission. At a meeting he decided to ask his fellow stratigraphers what they thought of the Anthropocene. Twenty-one of 22 thought the concept had merit.
The group agreed to look at it as a formal problem in geology. Would the Anthropocene satisfy the criteria used for naming a new epoch? In geologic parlance, epochs are relatively short time spans, though they can extend for tens of millions of years. (Periods, such as the Ordovician and the Cretaceous, last much longer, and eras, like the Mesozoic, longer still.) The boundaries between epochs are defined by changes preserved in sedimentary rocks—the emergence of one type of commonly fossilized organism, say, or the disappearance of another.
The rock record of the present doesn't exist yet, of course. So the question was: When it does, will human impacts show up as "stratigraphically significant"? The answer, Zalasiewicz's group decided, is yes—though not necessarily for the reasons you'd expect.
Probably the most obvious way humans are altering the planet is by building cities, which are essentially vast stretches of man-made materials—steel, glass, concrete, and brick. But it turns out most cities are not good candidates for long-term preservation, for the simple reason that they're built on land, and on land the forces of erosion tend to win out over those of sedimentation. From a geologic perspective, the most plainly visible human effects on the landscape today "may in some ways be the most transient," Zalasiewicz has observed.
Humans have also transformed the world through farming; something like 38 percent of the planet's ice-free land is now devoted to agriculture. Here again, some of the effects that seem most significant today will leave behind only subtle traces at best.
Fertilizer factories, for example, now fix more nitrogen from the air, converting it to a biologically usable form, than all the plants and microbes on land; the runoff from fertilized fields is triggering life-throttling blooms of algae at river mouths all over the world. But this global perturbation of the nitrogen cycle will be hard to detect, because synthesized nitrogen is just like its natural equivalent. Future geologists are more likely to grasp the scale of 21st-century industrial agriculture from the pollen record—from the monochrome stretches of corn, wheat, and soy pollen that will have replaced the varied record left behind by rain forests or prairies.
The leveling of the world's forests will send at least two coded signals to future stratigraphers, though deciphering the first may be tricky. Massive amounts of soil eroding off denuded land are increasing sedimentation in some parts of the world—but at the same time the dams we've built on most of the world's major rivers are holding back sediment that would otherwise be washed to sea. The second signal of deforestation should come through clearer. Loss of forest habitat is a major cause of extinctions, which are now happening at a rate hundreds or even thousands of times higher than during most of the past half billion years. If current trends continue, the rate may soon be tens of thousands of times higher.
Probably the most significant change, from a geologic perspective, is one that's invisible to us—the change in the composition of the atmosphere. Carbon dioxide emissions are colorless, odorless, and in an immediate sense, harmless. But their warming effects could easily push global temperatures to levels that have not been seen for millions of years. Some plants and animals are already shifting their ranges toward the Poles, and those shifts will leave traces in the fossil record. Some species will not survive the warming at all. Meanwhile rising temperatures could eventually raise sea levels 20 feet or more.
Long after our cars, cities, and factories have turned to dust, the consequences of burning billions of tons' worth of coal and oil are likely to be clearly discernible. As carbon dioxide warms the planet, it also seeps into the oceans and acidifies them. Sometime this century they may become acidified to the point that corals can no longer construct reefs, which would register in the geologic record as a "reef gap." Reef gaps have marked each of the past five major mass extinctions. The most recent one, which is believed to have been caused by the impact of an asteroid, took place 65 million years ago, at the end of the Cretaceous period; it eliminated not just the dinosaurs, but also the plesiosaurs, pterosaurs, and ammonites. The scale of what's happening now to the oceans is, by many accounts, unmatched since then. To future geologists, Zalasiewicz says, our impact may look as sudden and profound as that of an asteroid.
If we have indeed entered a new epoch, then when exactly did it begin? When did human impacts rise to the level of geologic significance?
William Ruddiman, a paleoclimatologist at the University of Virginia, has proposed that the invention of agriculture some 8,000 years ago, and the deforestation that resulted, led to an increase in atmospheric CO2 just large enough to stave off what otherwise would have been the start of a new ice age; in his view, humans have been the dominant force on the planet practically since the start of the Holocene. Crutzen has suggested that the Anthropocene began in the late 18th century, when, ice cores show, carbon dioxide levels began what has since proved to be an uninterrupted rise. Other scientists put the beginning of the new epoch in the middle of the 20th century, when the rates of both population growth and consumption accelerated rapidly.
Zalasiewicz now heads a working group of the International Commission on Stratigraphy (ICS) that is tasked with officially determining whether the Anthropocene deserves to be incorporated into the geologic timescale. A final decision will require votes by both the ICS and its parent organization, the International Union of Geological Sciences. The process is likely to take years. As it drags on, the decision may well become easier. Some scientists argue that we've not yet reached the start of the Anthropocene—not because we haven't had a dramatic impact on the planet, but because the next several decades are likely to prove even more stratigraphically significant than the past few centuries. "Do we decide the Anthropocene's here, or do we wait 20 years and things will be even worse?" says Mark Williams, a geologist and colleague of Zalasiewicz's at the University of Leicester in England.
Crutzen, who started the debate, thinks its real value won't lie in revisions to geology textbooks. His purpose is broader: He wants to focus our attention on the consequences of our collective action—and on how we might still avert the worst. "What I hope," he says, "is that the term 'Anthropocene' will be a warning to the world."

Thursday, July 8, 2010

Extinction of mammoths by early humans contributed to global warming, initiating the "Anthropocene" about 15,000 years ago

Man-made global warming started with ancient hunters

environmentalresearchweb.org, July 5, 2010

Even before the dawn of agriculture, people may have caused the planet to warm up, a new study suggests.
Mammoths used to roam modern-day Russia and North America, but are now extinct – and there's evidence that around 15,000 years ago, early hunters had a hand in wiping them out. A new study, accepted for publication in Geophysical Research Letters, a journal of the American Geophysical Union (AGU), argues that this die-off had the side effect of heating up the planet.

"A lot of people still think that people are unable to affect the climate even now, even when there are more than 6 billion people," says the lead author of the study, Chris Doughty of the Carnegie Institution for Science in Stanford, California. The new results, however, "show that even when we had populations orders of magnitude smaller than we do now, we still had a big impact."

In the new study, Doughty, Adam Wolf, and Chris Field – all at Carnegie Institution for Science – propose a scenario to explain how hunters could have triggered global warming.

First, mammoth populations began to drop – both because of natural climate change as the planet emerged from the last ice age, and because of human hunting. Normally, mammoths would have grazed down any birch that grew, so the area stayed a grassland. But if the mammoths vanished, the birch could spread. In the cold of the far north, these trees would be dwarfs, only about 2 meters (6 feet) tall. Nonetheless, they would dominate the grasses.

The trees would change the color of the landscape, making it much darker so it would absorb more of the Sun's heat, in turn heating up the air. This process would have added to natural climate change, making it harder for mammoths to cope, and helping the birch spread further.
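
The albedo mechanism described here lends itself to a back-of-envelope check. The sketch below is not from the study: the albedo values, the affected-area fraction, and the climate sensitivity parameter are illustrative assumptions, chosen only to show that darkening a small fraction of the surface can plausibly yield warming on the order of a tenth of a degree.

```python
# Back-of-envelope check of the albedo mechanism (illustrative
# numbers, not taken from the study).
SOLAR_CONSTANT = 1361.0  # W/m^2 at the top of the atmosphere

# Global-mean insolation is S/4: Earth intercepts sunlight over a
# disc (pi * r^2) but has a surface area of 4 * pi * r^2.
MEAN_INSOLATION = SOLAR_CONSTANT / 4.0  # ~340 W/m^2

def forcing_from_darkening(delta_albedo: float, area_fraction: float) -> float:
    """Radiative forcing (W/m^2) from lowering surface albedo by
    delta_albedo over a given fraction of Earth's surface."""
    return MEAN_INSOLATION * delta_albedo * area_fraction

# Assumed values: snow-free grassland (albedo ~0.20) replaced by
# darker birch shrubland (~0.15) over ~1% of the globe -- roughly
# a quarter of Siberia and Beringia.
forcing = forcing_from_darkening(delta_albedo=0.05, area_fraction=0.01)

# Convert forcing to equilibrium warming with a mid-range climate
# sensitivity parameter of ~0.8 K per W/m^2.
warming = 0.8 * forcing
print(f"forcing ~ {forcing:.2f} W/m^2, warming ~ {warming:.2f} K")
```

With these assumed inputs the sketch gives a forcing of about 0.17 W/m² and warming of roughly 0.14 K, the same order of magnitude as the study's estimate of a bit more than 0.1 °C.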

To test how big an effect this would have on climate, Field's team looked at ancient records of pollen, preserved in lake sediments from Alaska, Siberia, and the Yukon Territory, built up over thousands of years. They looked at pollen from birch trees (the genus Betula), since this is "a pioneer species that can rapidly colonize open ground following disturbance," the study says. The researchers found that around 15,000 years ago – the same time that mammoth populations dropped, and that hunters arrived in the area – the amount of birch pollen started to rise quickly.

To estimate how much additional area the birch might have covered, they started with the way modern-day elephants affect their environment by eating plants and uprooting trees. If mammoths had effects on vegetation similar to those of modern elephants, then the fall of mammoths would have allowed birch trees to spread over several centuries, expanding from very few trees to covering about one-quarter of Siberia and Beringia – the land bridge between Asia and Alaska. In those places where there was dense vegetation to start with and where mammoths had lived, the main reason for the spread of birch trees was the demise of mammoths, the model suggests.

Another study, published last year, shows that "the mammoths went extinct, and that was followed by a drastic change in the vegetation," rather than the other way around, Doughty says. "With the extinction of this keystone species, it would have some impact on the ecology and vegetation – and vegetation has a large impact on climate."

Doughty and colleagues then used a climate simulation to estimate that this spread of birch trees would have warmed the whole planet more than 0.1 °C (0.18 °F) over the course of several centuries. (In comparison, the planet has warmed about six times more during the past 150 years, largely because of people's greenhouse-gas emissions.)

Only some portion – about one-quarter – of the spread of the birch trees would have been due to the mammoth extinctions, the researchers estimate. Natural climate change would have been responsible for the rest of the expansion of birch trees. Nonetheless, this suggests that when hunters helped finish off the mammoth, they could have caused some global warming.

In Siberia, Doughty says, "about 0.2 °C (0.36 °F) of regional warming is the part that is likely due to humans."

Earlier research indicated that prehistoric farmers changed the climate by slashing and burning forests starting about 8,000 years ago, and when they introduced rice paddy farming about 5,000 years ago. This would suggest that the start of the so-called "Anthropocene" – a term used by some scientists to refer to the geological age when mankind began shaping the entire planet – should be dated to several thousand years ago.

However, Field and colleagues argue, the evidence of an even earlier man-made global climate impact suggests the Anthropocene could have started much earlier. Their results, they write, "suggest the human influence on climate began even earlier than previously believed, and that the onset of the Anthropocene should be extended back many thousands of years."

This work was funded by the Carnegie Institution for Science and NASA.

Link: http://environmentalresearchweb.org/cws/article/yournews/43104