Wildfire: Study questions U.S. policy of forest ‘restoration’

Please consider this article from E&E a companion to this February 10th post on this blog. – mk

WILDFIRE: Study questions U.S. policy of forest ‘restoration’
By Phil Taylor, E&E reporter, 2/14/14

Western forests today experience fewer high-severity wildfires than they did more than a century ago, depriving some fire-dependent species and stifling biodiversity, according to a new study.

The study challenges conventional wisdom held by politicians and the Forest Service that the West is experiencing an unnatural burst in uncharacteristic wildfires as a result of a century of wildfire suppression.

In fact, some Western forests are experiencing a “deficit” in high-intensity blazes and in some cases should be encouraged to burn, said the study published this month in the journal PLOS ONE.

It questioned the government’s policy of mechanically thinning, or “restoring,” backcountry areas to ensure fires stay low to the ground and create “park-like” conditions. Thinning to reduce high-severity wildfire can reduce habitat for the imperiled black-backed woodpecker, often requires new roads and can introduce invasive species into the forest, the study authors said.

The study is bound to spark controversy, considering that high-severity wildfires threaten lives and property and drain billions of dollars in taxpayer money each year. Moreover, aggressive forest thinning and restoration policies are politically popular because they create rural jobs and seek to mitigate wildfire threats to communities.

“Given societal aversion to wildfires, the threat to human assets from wildfires, and anticipated effects of climate change on future wildfires, many will question the wisdom of incorporating historical mixed-severity fire into management goals,” the study said. “However, a major challenge lies with the transfer of information needed to move the public and decision-makers from the current perspective that the effects of contemporary mixed-severity fire events are unnatural, harmful, inappropriate and more extensive due to fire exclusion — to embrace a different paradigm.”

The study was funded by Environment Now, a nonprofit foundation in California whose goals include “preserving and restoring coastal, freshwater and forest ecosystems.”

It was independently conducted by 11 scientists from several Western universities, the Canadian Forest Service, the Earth Island Institute and Geos Institute.

The study used U.S. Forest Service data and other published sources to explore the historical prevalence of “mixed-severity fire regimes” in ponderosa pine and mixed-conifer forests in western North America and to try to determine whether mixed-severity fire patterns in those forests had changed as a result of the past century of fire suppression.

On the latter question, it concluded that the past century of fire suppression has not “greatly increased the prevalence of severe fire,” even though fuel levels in Western forests are believed to be much higher.

Because young forests typically establish after stand-replacing fire, the authors used rates of young forest establishment, derived from Forest Inventory and Analysis stand ages, as a proxy for high-severity burning. Since 1930, that rate fell by a factor of four in the Sierra Nevada and Southwest, by a factor of three in the Klamath, and by half in the eastern Cascades and central and northern Rockies, the study said.

The study recommends the government focus its fire mitigation work adjacent to homes in the wildland-urban interface instead of in the backcountry, where managed wildland fires could promote ecological benefits.

“The need for forest ‘restoration’ designed to reduce variation in fire behavior may be much less extensive than implied by many current forest management plans or promoted by recent legislation,” the study said. “Incorporating mixed-severity fire into management goals, and adapting human communities to fire by focusing fire risk reduction activities adjacent to homes, may help maintain characteristic biodiversity, expand opportunities to manage fire for ecological benefits, reduce management costs, and protect human communities.”

But other scientists and policymakers have argued that the societal benefits of taming mega-fires often trump whatever ecological benefits they may produce. In addition, climate change is expected to intensify droughts that create dangerously dry fuel conditions. Fire seasons are more than two months longer than in the 1970s, the Forest Service has said. (Editor’s note: using the 1970s as the starting point is disingenuous; the 1950s through the 1980s were cooler and moister.)

A study commissioned by the Interior Department and led by Northern Arizona University last spring found that although hazardous fuels treatments near communities can reduce wildfire risks to homes and people, backcountry fuels treatments are important to prevent mega-fires that can scorch watersheds and drain federal wildfire budgets (E&ENews PM, May 28, 2013).

While fire in the backcountry can be beneficial if it stays low to the ground, landscape-scale “crown” fires can damage watersheds, tarnish viewsheds and threaten communities, NAU’s Diane Vosick, one of that study’s authors, said last spring.

“In order to get ahead of the cost of large and severe fire, more treatments will be needed outside the wildland-urban interface,” Vosick told the Senate Energy and Natural Resources Committee last June.

The Forest Service and Interior Department also face political pressure to restore backcountry areas.

In December 2011, Congress inserted language in its appropriations report ordering both agencies to halt policies that direct most hazardous fuels funding to the wildland-urban interface, spending it instead on the “highest priority projects in the highest priority areas.”

For now, “restoration” across the National Forest System remains popular policy in Congress and in Western states.

According to the Forest Service, the states of Florida, Georgia, Utah, California, Texas, Arizona, New Mexico and Colorado have all experienced their largest or most destructive wildfires within just the last several years. Wildfires burn twice as many acres annually as they did in the 1970s, and the number of wildfires each year that cover more than 10,000 acres has increased sevenfold.

Forest Service Chief Tom Tidwell last week told Congress there are nearly two dozen landscape-scale collaborative forest restoration projects underway that seek to “re-establish natural fire regimes and reduce the risk of uncharacteristic wildfire.”

“Our findings are sure to be controversial as each year federal agencies spend billions of dollars in fuel reduction costs in the backcountry based on the assumption that we have more high-severity fire now than we did historically,” said Dominick DellaSala, chief scientist at the Geos Institute, one of the study’s authors. “Fuel treatments are best targeted immediately adjacent to where people live, given that the increasing costs of suppressing fires is not ecologically justifiable and may, in fact, produce artificially manipulated landscapes that need more fire to remain healthy and productive.”

Examining Historical and Current Mixed-Severity Fire Regimes in Ponderosa Pine and Mixed-Conifer Forests of Western North America

The other day I got the following note and link to some new relevant research from Douglas Bevington, author of The Rebirth of Environmentalism: Grassroots Activism and the New Conservation Movement, 1989-2004. Bevington’s note is shared below with his permission, along with a link to the new study and article. – mk

——————

I wanted to let you know about an important new study that was just published by the high-profile science journal PLOS ONE. The article, titled “Examining Historical and Current Mixed-Severity Fire Regimes in Ponderosa Pine and Mixed-Conifer Forests of Western North America,” was co-authored by 11 scientists from various regions of the western US and Canada.

Their study found that there is extensive evidence from multiple data sources that big, intense forest fires were a natural part of ponderosa pine and mixed-conifer ecosystems prior to modern fire suppression. These findings refute the claims frequently made by logging and biomass advocates that modern mixed-severity forest fires (erroneously called “catastrophic” fires) are an unnatural aberration that should be prevented through more logging (“thinning”) and that more biomass facilities should be built to take the resulting material from the forest.

In contrast to these claims, logging done ostensibly to reduce fire severity now appears to be not only unnecessary, but also potentially detrimental when it is based on erroneous notions about historic forest conditions and fire regimes. These findings have big implications for biomass and forest policy, so I encourage you to take a look at this article.

The full article in PLOS ONE is available here.

Here are a few key points from the abstract and conclusion:

Abstract, p. 1

“There is widespread concern that fire exclusion has led to an unprecedented threat of uncharacteristically severe fires in ponderosa pine and mixed-conifer forests of western North America. These extensive montane forests are considered to be adapted to a low/moderate-severity fire regime that maintained stands of relatively old trees. However, there is increasing recognition from landscape-scale assessments that, prior to any significant effects of fire exclusion, fires and forest structure were more variable in these forests…. We compiled landscape-scale evidence of historical fire severity patterns in the ponderosa pine and mixed-conifer forests from published literature sources and stand ages available from the Forest Inventory and Analysis program in the USA. The consensus from this evidence is that the traditional reference conditions of low-severity fire regimes are inaccurate for most forests of western North America…. Our findings suggest that ecological management goals that incorporate successional diversity created by fire may support characteristic biodiversity, whereas current attempts to ‘restore’ forests to open, low-severity fire conditions may not align with historical reference conditions in most ponderosa pine and mixed-conifer forests of western North America.”

Conclusion, p. 12

“Our findings suggest a need to recognize mixed-severity fire regimes as the predominant fire regime for most of the ponderosa pine and mixed-conifer forests of western North America…. For management, perhaps the most profound implication of this study is that the need for forest ‘restoration’ designed to reduce variation in fire behavior may be much less extensive than implied by many current forest management plans or promoted by recent legislation. Incorporating mixed-severity fire into management goals, and adapting human communities to fire by focusing fire risk reduction activities adjacent to homes, may help maintain characteristic biodiversity, expand opportunities to manage fire for ecological benefits, reduce management costs, and protect human communities.”

Coquelle Trails: Scientific Transparency & Public Lands Management

"Volunteers On the March" (Glisan 1874: 293)
“Volunteers On the March” (Glisan 1874: 293)

Earlier this week I gave a 60-minute talk to a meeting of the Alsea Watershed Council, my “home group,” where I have been giving presentations every few years since they first formed in the 1980s. The audience was a little smaller than usual, but all of the old-timers were there and Elmer Ostling’s wife had baked delicious cinnamon rolls for everyone.

The theme of my talk was to discuss scientific and political “transparency” in this age of Internet communications – and to use the recently completed website report, Oregon Websites and Watershed Project’s (ORWW) “Coquelle Trails,” as a model and framework for the discussion. The Coquelle Trails project covered more than 1,400,000 acres in southwest Oregon, including sizable portions of BLM and USFS lands and hundreds of thousands of acres of marbled murrelet, spotted owl, coho, California condor, wolf, and elk habitat. PowerPoint and PDF versions of the presentation have been put online here:

www.NWMapsCo.com/ZybachB/Presentations/2010-2013/index.html#20130221

The original 2-page Press Release for Coquelle Trails was used as a handout. The online version of the handout can be found here:

www.ORWW.org/Coquelle_Trails/Press_Release_20130107.html

The discussion was arranged in four parts:

1. A proposed definition of “scientific and political transparency” — at least as it should apply to taxpayer-funded research — for the 21st century;

2. A demonstration of how inexpensive and easy it is to produce baseline data in modern digital formats, using the Coquelle Trails’ predictive map construction and field verification methodology as an illustration;

3. A brief overview of how the Coquelle Trails’ historical datasets and current findings were formatted for Internet access, using the same standards developed by ORWW with Siletz School 2nd-grade students 15 years ago; and

4. Basic conclusions regarding current opportunities and needs to create better trust and transparency between federal land management agencies and local communities via enhanced research methods and Internet communications.

After a brief introduction and background regarding the focus of my talk and the reference materials we would be using, we began with the proposed definition for “Scientific (& Political) Transparency: 2013,” which was also outlined in four parts:

1. Plain English

Acronyms + Jargon + Latin + Metrics x Statistics = Total Obfuscation

Doug Fir vs. Doug-fir vs. PsMe

TMDL vs. turbidity vs. muddy water

2. Research Methodology

A. All taxpayer-funded work is documented.

B. All documentation is made readily available via public websites.

C. Most work is subject to Independent Peer Review.

D. All peer reviews and resulting discussions are made publicly available.

3. Direct Access to all taxpayer-funded research, meetings, reports, correspondence, political decisions, etc.

4. Stable, well-designed (dependable, comprehensive & “easy to use”) Websites: ORWW Coquelle Trails as a model.

The opening discussion of Plain English took a philosophical approach: Latin was once used to create distance between the Messengers of God and the illiterate masses of the Middle Ages, and that same process is still being used today – via government acronyms, professional jargon, metrics, and obscure statistics (and Latin) – to create distance between government agencies and the public; between the agencies themselves; and even between different generations of scientists within the same disciplines.

I used personal examples of the “evolution” of Douglas Fir (Pseudotsuga taxifolia) to Douglas-fir (Pseudotsuga menziesii) to PsMe (“Piz-Me”) in the agencies and classrooms during the past 60 years – while everyone in town and at the sawmills continued to call it “Doug Fir.” The similar history of TMDL – and why that acronym is not a good fit for discussions with current grade school and high school students – was another example. Same with metrics: the USFS and BLM are U.S. agencies, and our standard of measure, used by all taxpayers, is the English system (chains, links, feet, miles, and acres) — why, then, do agency personnel try to talk and write in terms of hectares and kilometers in official reports and public presentations (rhetorical question)?
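For readers who have never worked with surveyor’s units, the conversions at issue are simple arithmetic. Here is a minimal sketch in Python, using the standard constants (1 chain = 66 feet = 100 links, 80 chains = 1 mile, 10 square chains = 1 acre, 1 foot = 0.3048 meters); the function names and example figures are illustrative only.

# English surveyor's units converted to metric, using standard constants.
FEET_PER_CHAIN = 66.0            # 1 chain = 66 feet = 100 links
CHAINS_PER_MILE = 80.0           # 80 chains = 1 mile = 5,280 feet
SQUARE_CHAINS_PER_ACRE = 10.0    # 10 square chains = 1 acre = 43,560 square feet
METERS_PER_FOOT = 0.3048         # exact by definition

def chains_to_meters(chains):
    """Length in chains expressed in meters."""
    return chains * FEET_PER_CHAIN * METERS_PER_FOOT

def acres_to_hectares(acres):
    """Area in acres expressed in hectares (10,000 square meters each)."""
    square_meters = acres * SQUARE_CHAINS_PER_ACRE * (FEET_PER_CHAIN * METERS_PER_FOOT) ** 2
    return square_meters / 10_000.0

# Example: the 1,400,000-acre Coquelle Trails study area, and one mile of survey line.
print(f"1,400,000 acres is about {acres_to_hectares(1_400_000):,.0f} hectares")
print(f"80 chains (1 mile) is about {chains_to_meters(80):,.0f} meters")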

The second part of the discussion involved a series of slides showing how traditional archival research methods and modern technology were used during the Coquelle Trails project to achieve desired results. This was, essentially, a summary of the methodology as described and illustrated by the online report:

www.ORWW.org/Coquelle_Trails/Methodology/

Part three of the discussion used a series of slides showing how ORWW has continued to use the same methods and formats developed with Siletz 2nd-Graders in 1998 to present Coquelle Trails research datasets, findings, and conclusions to the present day:

www.ORWW.org/PEAS/SZDay/SalmonCycle/

www.ORWW.org/Coquelle_Trails/Maps_1856-2012

The point was made – pointedly – that government websites remain far less stable, far less comprehensive, and much more difficult to navigate than the formats those grade-schoolers developed in 1998, during the very infancy of the Internet. Also, that this more accessible and reliable design was developed, expanded, and maintained by a tiny non-profit in Philomath, Oregon, funded entirely by local residents, businesses, and organizations – and no federal dollars. And that those works have been continuously available online for more than 16 years (compare that to the life of an average government link or URL).

Which brought us to the Conclusions, also listed in four parts:

Conclusions: How Transparency Saves Money & Improves Decision Making

1. The 1980 Paperwork Reduction Act and the 2010 Plain Writing Act already require the use of Plain English by federal agencies. These acts simply need to be enforced.

2. Modern technology makes automated scanning of documents and GPS-referenced digital photography increasingly cheap and easy (see the short sketch following this list). Citizens should insist on such documentation and direct access to all taxpayer-funded research, meetings, etc., affecting local regulations.

3. High-speed Internet communications and the recent proliferation of iPads and smartphones have made universal access to technical information possible, with few limitations of time or location.

4. Increased access to better information is believed to result in improved research, discussion, and decision-making. Stable, well-designed websites make such access possible for almost all citizens, including students, teachers, scientists, politicians, and public resource managers.
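As one small illustration of the second conclusion above, reading the coordinates back out of a geotagged field photo takes only a few lines of code. This is a minimal sketch, assuming the Pillow imaging library and a JPEG whose camera or phone embedded the standard EXIF GPS tags; the filename is hypothetical.

# Read the embedded GPS coordinates from a geotagged JPEG (requires Pillow).
from PIL import Image

GPS_IFD = 0x8825                            # EXIF pointer to the GPS sub-directory
LAT_REF, LAT, LON_REF, LON = 1, 2, 3, 4     # standard EXIF GPS tag numbers

def to_decimal(dms, ref):
    """Convert (degrees, minutes, seconds) rationals to signed decimal degrees."""
    degrees, minutes, seconds = (float(v) for v in dms)
    value = degrees + minutes / 60.0 + seconds / 3600.0
    return -value if ref in ("S", "W") else value

def photo_coordinates(path):
    """Return (latitude, longitude) in decimal degrees, or None if untagged."""
    gps = Image.open(path).getexif().get_ifd(GPS_IFD)
    if not gps:
        return None
    return (to_decimal(gps[LAT], gps[LAT_REF]),
            to_decimal(gps[LON], gps[LON_REF]))

print(photo_coordinates("field_photo.jpg"))  # hypothetical example file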

So that was my presentation. I would be very interested in other thoughts on this. I think the current lack of transparency in government and in science (and maybe particularly in government-funded science) is doing a great disservice to taxpaying citizens, our voters, and our students and teachers, all of whom deserve clear and complete answers to their questions and requests.

Modern technology and Internet communications have made sharing information cheaper and easier than at any other time in history – so why does the government (and its scientists) continue to hide behind secret meetings, foreign languages and measurements, unavailable “findings,” clunky and outdated communications, never-ending acronyms, and other forms of deliberate obfuscation? That’s a rhetorical question with lots of answers, but the bottom line is that there is really no excuse for allowing this type of behavior to continue. It’s way too expensive, totally unnecessary, probably unethical, and counterproductive to most legitimate workings of government and of science. In my opinion. I’m interested in the thoughts of others.