Jasanoff on Public Truths: An Approach to the Fire Retardant Debate?

Sheila Jasanoff
Pforzheimer Professor of Science and Technology Studies, Harvard Kennedy School

Andy Stahl’s comment here reminded me of a recent piece I’d read by Sheila Jasanoff. Andy made the claim:

“While anecdotal information (“evidence based on hearsay“) can be helpful, scientific research and hard facts are a better basis for policy decisions.”

Sheila Jasanoff, Pforzheimer Professor of Science and Technology Studies at the Harvard Kennedy School, is an expert on the intersection of science, policy and law. This interesting piece by her recently came across my desk, and while it is fairly long, I’d like to draw your attention to her thoughts about how democracy, information and science come together (my italics).

To address the current retreat from reason—and indeed to restore confidence that “facts” and “truth” can be reclaimed in the public sphere—we need a discourse less crude than the stark binaries of good/bad, true/false, or science/antiscience. That oversimplification, we have seen, only augments political polarization and possibly yields unfair advantage to those in possession of the political megaphones of the moment. We need a discourse more attuned to findings from the history, sociology, and politics of knowledge that truth in the public domain is not simply out there, ready to be pulled into service like the magician’s rabbit from a hat. On the contrary, in democratic societies, public truths are precious collective achievements, arrived at just as good laws are, through slow sifting of alternative interpretations based on careful observation and argument and painstaking deliberation among trustworthy experts.

In good processes of public fact-making, judgment cannot be set aside, nor facts wholly disentangled from values. The durability of public facts, accepted by citizens as “self-evident” truths, depends not on nature alone but on the procedural values of fairness, transparency, criticism, and appeal in the fact-finding process. These virtues, as the sociologist Robert K. Merton noted as long ago as 1942, are built into the ethos of science. How else, after all, did modern Western societies repudiate earlier structures of class, race, gender, religious, or ethnic inequality than by letting in the skeptical voices of the underrepresented? It is when ruling institutions bypass the virtues of openness and critique that public truthfulness suffers, yielding to what the comedian Stephen Colbert called “truthiness,” the shallow pretense of truth, or what the Israeli political scientist Yaron Ezrahi calls “out-formations,” baseless claims replacing reliable, institutionally certified information. That short-circuiting of democratic process is what happened when the governments of Tony Blair and George W. Bush disastrously claimed to have evidence of weapons of mass destruction in Iraq. A cavalier disregard for process, over and above the blatancy of lying, may similarly deal the harshest blows to the credibility of the Trump administration.

Public truths cannot be dictated—neither by a pure, all-knowing science nor unilaterally from the throne of power. Science and democracy, at their best, are modest enterprises because both are mistrustful of their own authority. Each gains by making its doubts explicit. This does not mean that the search for closure in either science or politics must be dismissed as unattainable. It does mean that we must ask and insist on good answers to questions about the procedures and practices that undergird both kinds of authority claims. For assertions of public knowledge, the following questions then seem indispensable:

Who claims to know?
In answer to whose questions?
On what authority?
With what evidence?
Subject to what oversight or opportunity for criticism?
With what openings for countervailing views to express themselves?
And with what mechanisms of closure in cases of disagreement?

If those questions can be raised and discussed, even if not resolved to everyone’s satisfaction, then factual disagreements retreat into the background and confidence builds that ours is indeed a government of reason. For those who are not satisfied, the possibility remains open that one can return some other day, with more persuasive data, and hope the wheel of knowledge will turn in synchrony with the arc of justice. In the end, what assures a polity that knowledge is justly coupled to power is not the assertion that science knows best, but the conviction that science itself has been subjected to norms of good government.

It might be fun to work through Jasanoff’s list of questions with the fire retardant issue. To someone outside this issue like me, it seems that we have practitioner experience vs. other knowledge approaches. It seems to me that any “serviceable truth” around fire retardant would have to incorporate practitioner observations and also account for the information (or lack thereof, where studies are missing) presented by Andy. Note that it doesn’t seem to be anyone’s paid position to arrive at “serviceable truths,” and it requires hard work to dig into the details. And of course, we could also use Sheila’s questions for our fuel treatment discussion after we have rounded up the relevant information and looked at it from a variety of angles.

The response of the forest to drought

This post provides some on-the-ground research, along with consistent but independent modeling results, that demonstrates the importance of stand density in coping with climate change and therefore the importance of sustainable forest management. Hopefully this will change some minds about the value of strategically managing density.

A) “The response of the forest to drought: the role of stand density and species diversity.” This article is an attempt to quantify previously established science.

1) “Droughts affect wood formation through the reduction in photosynthetic rates due to stomatal closure, reducing the amount of carbohydrates available for building new cells.”

2) “used tree-ring data from long-term forest plots of two pine species, ponderosa pine (Pinus ponderosa) and red pine (Pinus resinosa). The experiments were distributed in different geographical areas in the USA and they covered a large aridity gradient. They quantified growth responses at the population level to express both resistance and resilience to drought in relation to the relative tree population density, finding out that reducing densities would enhance both growth responses to drought. Trees growing in denser populations were more negatively impacted by drought and this has been shown in all three biogeographical areas.”
NOTE from “Climate Change Research Focuses on Great Lakes Forests”: “ASCC is monitoring the growth, health and survival rates of the trees in these forests, and focusing on three key qualities: resistance, resilience and transition. Resistance measures a species’ ability to remain stable and productive in a drought situation, resilience is a tree’s ability to return to normal productivity after experiencing an environmental change and transition refers to circumstances that encourage ecosystems to adapt to changing conditions.” (A sketch of how such growth-based indices are commonly computed follows this list.)

3) “This study confirms once more that the vulnerability of monospecific coniferous forests to increasing drought can be reduced through thinning interventions, which represent a viable adaptation strategy under climate change.”

4) “investigated the drought response of 16 individual tree species in different regions of Europe and evaluated if this was related to species diversity and stand density. Based on previous findings indicating that combining species with complementary characteristics is more important than simply increasing species diversity to cope with drought, their results indicate that species growing in a mixture are not always less water stressed than those growing in monoculture.”
See also: a) “Species composition determines resistance to drought in dry forests of the Great Lakes – St. Lawrence forest region of central Ontario” b) “Species richness and stand stability in conifer forests of the Sierra Nevada” c) “Functional diversity enhances silver fir growth resilience to an extreme drought”

5) “Investigating these effects at the level of species identity (i.e., different combinations of species) is more advisable than doing it at the level of species richness (i.e., abundance of species), because different mixtures respond differently depending on the region. If we consider that different provenances of the same species can show different adaptation strategies to cope with drought, the situation may be even more complex.”
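For readers unfamiliar with how “resistance” and “resilience” are quantified from tree rings (item 2 and the NOTE above), a common approach is the set of growth indices popularized by Lloret et al. (2011). The sketch below illustrates that general approach only; it is not necessarily the exact method of the study quoted here, and the window length and ring-width values are assumptions.

```python
# Minimal sketch of the growth indices popularized by Lloret et al. (2011),
# a common way to quantify "resistance" and "resilience" to drought from
# tree-ring data. Illustration only: window length, years, and ring widths
# below are assumptions, not data from the study quoted above.

def drought_indices(ring_widths, drought_year, window=3):
    """ring_widths: dict mapping calendar year -> ring width (growth)."""
    pre = [ring_widths[y] for y in range(drought_year - window, drought_year)]
    post = [ring_widths[y] for y in range(drought_year + 1, drought_year + 1 + window)]
    pre_mean = sum(pre) / len(pre)
    post_mean = sum(post) / len(post)
    during = ring_widths[drought_year]
    return {
        "resistance": during / pre_mean,     # growth maintained during drought
        "recovery":   post_mean / during,    # rebound after the drought
        "resilience": post_mean / pre_mean,  # return to pre-drought growth
    }

# Hypothetical example: a tree in a dense stand vs. one in a thinned stand.
dense   = {2010: 1.0, 2011: 1.1, 2012: 0.9, 2013: 0.4, 2014: 0.5, 2015: 0.6, 2016: 0.7}
thinned = {2010: 1.0, 2011: 1.1, 2012: 0.9, 2013: 0.7, 2014: 0.9, 2015: 1.0, 2016: 1.0}
print(drought_indices(dense, 2013))    # lower resistance and resilience
print(drought_indices(thinned, 2013))  # closer to 1.0 on both
```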

B) Ecosystem services, mountain forests and climate change
Note: This modeling effort passes the #1 smell test in that it agrees with already established scientific principles while adding quantitative measures that support the previously known trend; those measures shouldn’t be taken as absolutes.

1) “it is estimated that about half of the global human population depends – directly or indirectly – on services delivered by mountain forests. It is therefore essential to assess whether multiple ecosystem services can be provided to human societies in the future. Given that climate is changing fast, the consideration of climate change in scientific assessments is a must! Let’s not forget that European forests are managed since centuries (check out this nice book about the history of European forests). Thus, changes in management regimes must be considered as well.”

2) “in the Iberian Mountains their simulation results indicate that forest management, rather than climate change, is responsible for a reduction in carbon storage and biodiversity. On the contrary, in Western Alps changes in climatic regimes could induces large alterations in the supply of several ecosystem services, particularly under the most pessimistic future climate scenarios. In other areas (e.g., in the Slovenian Dinaric Mountains) climate change would strongly affect ecosystem services, albeit differently depending on elevation and stand conditions.”

3) “This confirms that management is a strong driver of forest dynamics in European mountains, and it can highly modify the future provision of ecosystem services (i.e., more than the direct effects of climate change!).”

Inside the Firestorm

This is for those who insist that we don’t need to use forest management to reduce the risk of catastrophic loss to wildfire. Several people have expressed unscientific views on this site to the effect that ‘wildfire is climate driven, and no amount of controlled burns and/or thinning can have any impact on total acreage burned, since it is all due to global warming (drought and high winds).’ Hopefully this will bring them to their senses and open their minds to the possibility that they are flat-out wrong.

Several of us have tried to explain that global warming only makes the need for managing stand density even more important. We have also tried to explain that what many see as climate-driven firestorms are instead driven by the microclimate created by the fire itself. Hence the need to use the appropriate forest management tools to reduce the risk of an ignition spreading at a rate that creates its own weather, and to provide opportunities for crown fires to return to the ground so that the fire can be controlled as appropriate for the specific situation.

I have rightly or wrongly gotten the impression that some here don’t really respect the research done on wildfire for at least the last 80 years. My reason for saying this is the applause they afford to people who come up with conclusions contrary to the science but don’t bother to reconcile their suppositions/theories, based on cherry-picked incidents, with the established and well-replicated science.

So here is an article that should give you a better understanding of, and respect for, the work behind the real science and how it corroborates what some of us have repeatedly stated here and elsewhere. The principles described here are not new; they are just getting a whole lot more attention now that technology has advanced to the point where tools are available to make precise measurements of what had previously only been repeatedly observed. This is a pretty intense read and well worth reading in its entirety.

1) ‘“It looked,” says Kingsmill, “like a nuclear bomb.”
Undaunted, Kingsmill and the pilot decided to do what no research aircraft had done: Fly directly through the plume.’

2) ‘For decades, scientists have focused on the ways that topography and fuels, such as the trees, grass or houses consumed by flames, shape fire behavior, in part because these things can be studied even when a fire isn’t burning. But this line of inquiry has offered only partial answers to why certain blazes, like the Pioneer Fire, lash out in dangerous and unexpected ways — a problem magnified by severe drought, heat and decades of fire suppression.’

3) ‘“The plume is orders of magnitude harder to study than the stuff on the ground,” says Brian Potter, a meteorologist with the Pacific Wildland Fire Sciences Laboratory in Seattle who sometimes works with Clements. Indeed, it took a global conflagration much darker than any forest fire to even begin laying the foundations of this work. Kingsmill’s observation about the bomb, it turns out, isn’t far off.’
–> Here the article turns to the beginnings of fire research in the early 1940s, in preparation for the British bombing of Hamburg, Germany, on July 27, 1943, when ‘42,000 people died, and another 37,000 were injured.’

4) ‘these old experiments, finished by 1970, are still a key source of knowledge about extreme fire behaviors. Until recently, technology was simply too limited to reveal much more about the specific mechanisms by which a fire plume might feed a firestorm, let alone how beasts like fire tornadoes and fireballs form.’

5) ‘His instrument towers, deployed in carefully controlled fires, provided yet more unprecedented and precise measurements: how winds accelerate and draft into an advancing flame front, the heat and turbulence above the flames, and the speed of the rising hot air.’

6) ‘Clements wanted to capture the whole phenomenon — to look inside the opaque mass of an entire fire plume from a distance, and see all of its parts swirling at once. In 2011, he found his lens: a technology called Doppler lidar.’

7) ‘The team’s insight about the Bald and Eiler fires has implications for predicting smoke and air quality — a constant concern for communities near large fires. It also impacted the fires themselves. Even though both fires existed in the same atmospheric environment of pressures and winds, and burned across similar terrain, they were spreading in opposite directions that day — Bald to the south, and Eiler to the north. This denser current of cold air and smoke was actually pulling the Bald Fire in the opposite direction of what was predicted based on wind alone.’

8) ‘Coen works at the National Center for Atmospheric Research in Boulder, Colorado, where she studies fire’s inner workings. In September 1998, she spent several hours aboard a Hercules C-130 aircraft as it circled over Glacier National Park. The McDonald Creek Fire was marching up a steep slope at roughly three feet per second. Its smoke obscured the advancing flames, but infrared video cameras mounted outside the plane recorded what was happening underneath. It was only later, as Coen looked through individual frames of that video, that she noticed something strange: At one point, a jet of flame seemed to shoot ahead of the fire. It lasted only a second or two, but left a trail of newly ignited vegetation in front of the fire. Not until Coen calculated the size of the pixels and the time between frames could she appreciate its true significance.
The jet had surged 100 yards ahead of the fire’s front, advancing 100 mph — “like a flamethrower,” she says. It was 10 times faster than the local wind — generated, somehow, by the fire’s own internal tumult.
Coen called it the “finger of death,” and for her it brought to mind the unconfirmed reports of fireballs that occasionally circulated among firefighters.
She had never seen such a thing, but as she examined footage of other fires, she was surprised to find fire jets again and again.’

9) ‘Finney’s slow-motion videos show that these rolling eddies exist in pairs within the fire. They roll in opposite directions, coupled like interlocking gears. Their combined motion periodically pushes down on the advancing front of the fire, causing flames to lick downward and forward, ahead of the fire.
Finney believes that these forward flame-licks are scaled-down versions of the “fingers of death” that Coen has seen in wildfires — possibly even related to the fireballs said to have shot out of buildings during the 1943 Hamburg firestorm.
Coen has actually documented similar flame-rollers in real wildfires using infrared video. But she believes that the finger of death also requires another factor. As bushes and trees are heated by an approaching fire, their decomposing cellulose releases hydrogen, methane, carbon monoxide and other flammable gases in a process called pyrolysis.
Coen and Shankar Mahalingam, a fluid-dynamics engineer at the University of Alabama in Huntsville, believe that rolling currents can mix these flammable gases with oxygen-rich air. “The dangerous situation is when the fire is going up on a hill,” says Mahalingam. “Maybe there are pyrolysis products that have accumulated” in front of the fire and mixed with fire-boosting oxygen. As the flame licks forward into this invisible tinderbox, it ignites a blowtorch. … These same buoyant gases also supply the momentum that drives a fire whirl to spin once it is triggered. And on a much larger scale, they are what pushes a fire plume ever higher in the sky, powering the in-drafts that keep the fire burning below.’

10) ‘what drew Potter’s interest was the water. Concentrations of water vapor rose 10 to 20 times higher than the surrounding air.
Water is a major product of combustion, second only to carbon dioxide. It forms as oxygen binds to the hydrogen atoms in wood, gasoline or just about any other fuel — creating hydrogen oxide, otherwise known as H2O. Burning four pounds of perfectly dry wood releases a pound or two of water. …
And yet water vapor fuels the strongest updrafts in nature, says Potter, from thunderstorms to tornadoes to hurricanes. As moist air rises during these storms, the water vapor condenses into cloud droplets, releasing a small amount of heat that keeps the air slightly warmer than its surroundings, so it continues to rise. “Water,” he says, “is the difference between a weak updraft and a really powerful updraft.”’
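A quick back-of-the-envelope check of the “pound or two of water” claim above. The hydrogen content of dry wood is my assumption (a typical textbook figure of roughly 6 percent by mass), not a number from the article.

```python
# Back-of-the-envelope check of "burning four pounds of perfectly dry wood
# releases a pound or two of water." Assumes dry wood is about 6% hydrogen
# by mass (a typical figure; the article does not give one).

wood_lb = 4.0
hydrogen_fraction = 0.06            # assumed hydrogen content of dry wood
h_lb = wood_lb * hydrogen_fraction  # pounds of hydrogen in the wood

# Every 2 g of hydrogen combines with 16 g of oxygen to make 18 g of water,
# so the water produced weighs 9 times the hydrogen burned.
water_lb = h_lb * (18.0 / 2.0)

print(f"~{water_lb:.1f} lb of water from {wood_lb:.0f} lb of dry wood")
# roughly 2 lb, consistent with the quoted "a pound or two"
```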

11) ‘He believes that water was pivotal in fueling the firestorm that swept through the suburbs of Canberra, the Australian capital, on Jan. 18, 2003.
The fire consumed 200,000 acres of drought-stricken territory that day, isolating the city under a glowing haze of Halloween orange. Remote infrared scans suggest that during a single 10-minute period, it released heat equivalent to 22,000 tons of TNT — 50 percent more than the energy unleashed by the atomic bomb dropped on Hiroshima.’
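The Hiroshima comparison is easy to sanity-check if one assumes the commonly cited yield of about 15 kilotons of TNT for that bomb (my figure, not the article’s):

```python
# Sanity check of the claim that 22,000 tons of TNT is "50 percent more than
# the energy unleashed by the atomic bomb dropped on Hiroshima." Assumes the
# commonly cited Hiroshima yield of about 15 kilotons of TNT (not from the article).

canberra_kt = 22.0   # 22,000 tons of TNT released in 10 minutes (article)
hiroshima_kt = 15.0  # commonly cited yield (assumption)

excess = canberra_kt / hiroshima_kt - 1.0
print(f"about {excess:.0%} more energy than Hiroshima")  # ~47%, i.e. roughly 50% more
```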

12) ‘When N2UW flew through the plume of the Pioneer Fire in 2016, its instruments registered updrafts of 80 to 100 miles per hour. Yet at that elevation, 8,000 feet above the flames, the interior of the plume was only 3 to 6 degrees Fahrenheit warmer than the surrounding air, meaning that its buoyant stampede through the atmosphere was powered by a density difference of just about 1 percent.
In other words, given the right atmospheric conditions, a few degrees of warmth and extra buoyancy could spell the difference between a plume that pushes 40,000 feet up, into the stratosphere, powering a vicious blaze on the ground — as Pioneer did — and one whose smoke never escapes the top of the boundary layer at 3,000 feet, leaving the fire stunted, like a weather-beaten dwarf tree gasping for life at timberline.’
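The “density difference of just about 1 percent” follows from the quoted temperature excess: for air at constant pressure, the fractional density deficit of the warm plume is roughly its temperature excess divided by the absolute ambient temperature. The ambient temperature aloft used below is my assumption.

```python
# Rough check of the "density difference of just about 1 percent" claim.
# For air at constant pressure, the fractional density deficit of the warm
# plume is approximately its temperature excess divided by the absolute
# ambient temperature. The ambient temperature aloft is an assumption.

ambient_K = 270.0                  # assumed ambient temperature aloft (~26 F)
for delta_F in (3.0, 6.0):         # plume excess warmth quoted in the article
    delta_K = delta_F * 5.0 / 9.0  # convert Fahrenheit degrees to kelvins
    print(f"{delta_F:.0f} F warmer -> density deficit ~{delta_K / ambient_K:.1%}")
# prints roughly 0.6% and 1.2%, i.e. "just about 1 percent"
```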

13) ‘Clements’s trained eye began to pick out some basic structures: a 40 mph downdraft next to a 60 mph updraft signified a turbulent eddy on the edge of the plume. Hot air pushing up past cooler, stationary air had set in motion a tumbling, horizontal vortex — the sort of thing that could easily have accounted for the plane’s brief freefall. Those blotchy radar pictures may finally allow us to see through wildfire’s impulsive, chaotic veneer’

Yes, professional wildfire researchers, the in-air observations of the pilots of spotter and retardant-dropping planes, and the on-the-ground observations of fire crews that point researchers in various directions all deserve our respect. They actually put their lives on the line, as opposed to those who disdain their commitment and repeatedly validated science.

Contact the author if you want references, or check some previous postings on this site for related references. I post this without references because it jibes with the known and validated science that I have critically studied since I first started my forestry education in 1963.

A conservation plan puts science ahead of politics

This story about the Pima County Arizona conservation planning effort isn’t directly about national forests, though there should have been (and probably was) coordination with the Coronado National Forest.  And my point here isn’t about the success of a conservation plan driven by the need to protect at-risk species (arguably an ESA success story).  It’s about the role of scientists in the process (Sharon).

“County leaders stated from the outset that their primary goal was to conserve biological diversity through a scientifically defendable process, not to come up with a plan that everybody could agree on,” wrote the late urban planning specialist Judith Layzer in her 2008 book Natural Experiments, which analyzed more than a half-dozen regional land-conservation efforts.

The scientists and county staff discussed the plan in public sessions, but county officials made it clear that their work would not be derailed by complaints from developers and other critics. The scientists established standards for identifying biologically valuable lands and used computer models, observation records and the judgment of local naturalists and recognized experts to come up with a biological preserve map.

In contrast, in other multi-species plans, scientists, politicians, agency staffers, developers and moderate conservationists collectively determined which lands to save, thus bringing political and economic considerations into the science.

Looking back this spring, Huckelberry, a former county transportation chief, says he was simply applying the best practices from his previous job, highway planning, to land conservation. Typically, both a technical committee and a citizens’ committee review big road projects, he says: “The whole purpose of a technical advisory committee is not to play with the numbers, not to slant the analysis. We felt the political side could potentially be used to manipulate the scientific side, and felt that would bias the entire process.”

After the science team created a map of the proposed preserve system, a separate steering committee of 84 people, including developers, environmentalists and neighborhood leaders, haggled over its details. By then, though, the plan’s broad vision was already solidly in place.

Bringing this back to the Forest Service, this is similar to how a team of biologists developed the Lynx Conservation Assessment and Strategy, which was then followed by forest plan amendments that “haggled over the details.”  The Forest Service doesn’t like some of the things it can’t do, but there haven’t been challenges to the science.  The grizzly bear conservation strategies seem to be more like the alternative process, where what the land managers want is infused into the discussions of the science.  (The Yellowstone strategy was already voided by a court once because of scientific issues.)

The Impact of Sound Forest Management Practices on Wildfire Smoke and Human Health

– Some would have us turn our forests back to a time before any of mankind inhabited North America.
– Some suggest that we should limit our management of forests to that practiced by Native Americans in pre-European times.
– Some of us see a problem with limiting ourselves to these past practices because of the current population level.
– Some of us even see that properly validated forest science, carried out in environmentally sound ways, can improve the sustainability of our forest ecosystems and of all the species that depend on them for habitat, store carbon, and reduce our dependence on non-renewable, environmentally unfriendly resources extracted from their long-term, safe, natural storage underground.

This article, “Aligning Smoke Management with Ecological and Public Health Goals” (J. For. 115(●):000–000, http://dx.doi.org/10.5849/jof.16-042, Copyright © 2017 Society of American Foresters), seems to me to be a good starting point for a much-neglected discussion of why mankind has to manage our federal forests better, if only from the standpoint of protecting human health.

A) Motivation for the study comes from:
1) “mismatches between the scale of benefits and risks make it difficult to proactively manage wildland fires to promote both ecological and public health.”
2) “A recent update to wildfire smoke policy proposed by the US Environmental Protection Agency (US EPA) recognized the need to restore and maintain more frequent fire regimes through intentional use of fire, while asserting that protecting human health remained the agency’s “highest priority” (Office of the Federal Register 2015). Therefore, addressing both forest restoration and air quality objectives remains a central challenge.”
3) “Hurteau et al. (2014) found that under a business-as-usual climate scenario, this escalation in fire potential is likely to increase wildfire emissions in California by 50% by the end of this century unless agencies take a more proactive approach to fire use.”
4) “… current policies have permitted regulators to curtail fires intentionally managed for resource objectives in response to nuisance complaints by a few individuals, despite the potential for such fires to have long-term collective benefits (Engel 2013). Because the impact and likelihood of smoke increase the longer that fire is kept out of the system, extensive fire suppression can result in a vicious cycle that becomes more and more costly to escape until the system fails, as represented by extreme wildfires (Calkin et al. 2015).”
5) “Smoke and wildfires can impact public health in ways other than particulate pollution, including ozone pollution, increased stress during and after wildfires, and strains on medical services and communication systems (Fowler 2003, Kumagai et al. 2004, Finlay et al. 2012). Despite these broader considerations, public health regulations for smoke typically focus on a 24-hour average of PM2.5. Values that exceed 35 µg/m3 are considered unhealthy for sensitive groups, which include pregnant women, young children, elderly individuals, smokers, and people with chronic respiratory problems such as asthma (Delfino et al. 2009, Kochi et al. 2010, Moeltner et al. 2013).”
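Since the regulatory yardstick described above is a 24-hour PM2.5 average compared against thresholds, here is a minimal sketch of that screening, using the 35 µg/m3 level quoted in this item and the 55.5 µg/m3 “unhealthy for all populations” level quoted under Known Facts below. The hourly readings are made up.

```python
# Minimal sketch of the 24-hour PM2.5 screening described above: average the
# hourly concentrations for a day and compare with the thresholds quoted in
# this post (35 ug/m3 "unhealthy for sensitive groups", 55.5 ug/m3 "unhealthy
# for all populations"). The hourly readings are hypothetical.

def classify_daily_pm25(hourly_ug_m3):
    """hourly_ug_m3: 24 hourly PM2.5 readings for one day, in ug/m3."""
    daily_avg = sum(hourly_ug_m3) / len(hourly_ug_m3)
    if daily_avg > 55.5:
        category = "unhealthy for all populations"
    elif daily_avg > 35.0:
        category = "unhealthy for sensitive groups"
    else:
        category = "below the quoted thresholds"
    return daily_avg, category

# Hypothetical smoky day: moderate overnight, heavy smoke in the afternoon.
readings = [20] * 12 + [80] * 8 + [40] * 4
avg, label = classify_daily_pm25(readings)
print(f"24-hour average {avg:.1f} ug/m3: {label}")
```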

Please note that this study was not offered as the be-all and end-all. In my opinion, the main objective was achieved: to provide order-of-magnitude numbers that justify further research and further stimulate the rethinking of current regulations and forest management policies.

B) Known Facts:
1) California: “The wildfire emissions in 2008 represented 68% of all PM2.5 emissions in the state, and they caused notable public health impacts (Wegesser et al. 2009, Preisler et al. 2015)”
2) “An important spatial mismatch results from the fact that large wildfires can create smoke impacts on distant urban populations. The risk to urban populations from regional-scale smoke impacts has increased as California became the most urbanized state in the United States, with 90% of its population residing within cities that have more than 50,000 people and another 5% living in smaller urban clusters (US Census Bureau 2015). Many of those urban areas are situated in valleys or basins that have poor air quality due to human activities as well as natural conditions that often trap pollutants (Ngo et al. 2010, Nakayama Wong et al. 2011). For example, the four metropolitan areas in the United States with the highest levels of particle pollution are all located in California’s Central Valley (American Lung Association 2015). Because many urban populations already experience poor air quality during the summer, they are particularly vulnerable to health impacts from wildfires (Delfino et al. 2009, Cisneros et al. 2014)”
3) “Within the study area, daily emissions from both prescribed burns and resource objective wildfires remained well below 500 tons PM2.5, whereas the Rim Fire had 20 days exceeding that threshold (nearly half of its entire period of active fire growth) and peaked at nearly 11,000 tons PM2.5/day on Aug. 26, 2013 (Figure 2). During the late summer, air quality is already problematic in downwind areas such as the Lake Tahoe Basin and San Joaquin Valley”
4) “Ground-level monitoring indicated that these large smoke plumes coincided with highly polluted days in Reno, which occurred on August 23–25 and again on August 28–29, when PM2.5 values exceeded the “unhealthy for all populations” standard (55.5 µg/m3) (Figure 4F). Such high levels are such a serious health concern that people are advised to avoid going outdoors. Navarro et al. (2016) reported that very unhealthy and unhealthy days occurred at 10 air monitoring sites in the central Sierras, northern Sierras, and Nevada during the Rim Fire.”

C) Data: Smoke plume data were used to “compare differences in smoke impacts between resource objective wildfires and full-suppression wildfires within the San Joaquin River watershed in California’s Sierra Nevada” for fires that burned between 2002 and 2013, including 10 resource objective wildfires (totaling 20,494 acres), 17 prescribed fires (totaling 6,636 acres), 4 small wildfires (totaling 12,025 acres), and the exceptionally large Rim Fire (257,314 acres). “… the limited availability of smoke monitoring data, particularly before 2007, requires a focus on modeled emissions.”

D) Findings: reasonable expectations from the use of increased forest management to reduce the impact of catastrophic wildfires on human health include:
1) “Our results indicate that the 257,314-acre Rim Fire of 2013 probably resulted in 7 million person-days of smoke impact across California and Nevada, which was greater than 5 times the impact per burned unit area than two earlier wildfires, Grouse and Harden of 2009, that were intentionally managed for resource objectives within the same airshed.”
2) “The combination of a warming climate and accumulation of forest fuels ensures a future with more large fires and smoke in dry western US forests. We have outlined framework to more directly account for regional-scale smoke impacts from these events using surface monitoring and satellite observations of smoke. Managing large fires for resource objectives can shift the release of inevitable emissions to conditions that minimize large-scale smoke impacts, by controlling fire spread based on available dispersion and monitored impacts and creating anchors for containing future hazardous fires. When well supported by firefighting, air quality monitoring and modeling, and public communications resources, this approach can overcome existing disincentives for achieving ecological and public health goals.”
3) “August 31 … Altogether, medium- and high-density HMS smoke from the Rim Fire on that day covered a large area (251,691 mi2) with a population of 2.8 million people, more than 2 million of whom resided below high-density smoke … In contrast, the Grouse and Harden Fires burned slowly over the early summer of 2009, with very modest emissions until the last week of June … Our analysis of HMS maps indicated that there were only 2 days when medium-density plumes overlaid substantial populations in California and Nevada, amounting to 25,000 person-days”
4) “the Rim Fire burned 55 times more area (257,213 acres) than the combined footprint of the Grouse and Harden Fires (4,695 acres), but our analysis suggests that it had at least 275 times greater impact in terms of person-days, or 5.5 times greater impact relative to area burned.” (See the quick arithmetic check after this list.)
5) “Our analyses help to illustrate and begin to quantify many of the potential benefits of resource objective wildfires compared with those of extreme fires:
– 1. Reduced fuels and reduced consumption. … We accounted for this effect within the 10,385 acres of the Rim Fire’s footprint that had experienced prescribed fires or resource objective wildfires since 2002 by changing “typical” fuel loads to “light,” which reduced estimated emissions in those areas by 53%.
– 2. More favorable dispersion and potential for less ozone. As maintenance burns reduce fuel levels over time, managers may be able to burn more safely earlier in the summer and or later in the fall, when dispersion is often more favorable and ozone concentrations are lower (Jaffe et al. 2013). Fires managed for resource objectives are less likely to result in the greater lofting and concentrations of smoke reported from extreme fires, which often deliver pollution to distant, large urban populations in lower-elevation valleys (Colarco et al. 2004, Peterson et al. 2015).
– 3. Greater ability to regulate fire spread. Because wildfires would be managed for resource objectives when weather and fire behavior conditions are more moderate than under extreme wildfires, their slower fire spread can curb daily emissions. In addition, managers can employ the push-pull burn tactics described for the Grouse Fire to regulate daily emissions based on monitored concentrations … fire will become increasingly important for reducing the likelihood and extent of large-scale, extreme fires like the Rim Fire (Westerling et al. 2015).”
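The ratios quoted in finding 4 can be roughly reproduced from the round numbers given above; the article’s own person-day estimates are more precise, so this only lands in the same ballpark.

```python
# Approximate reproduction of the comparison in finding 4 from the round
# numbers quoted in this post.

rim_person_days = 7_000_000   # Rim Fire smoke impact (quoted estimate)
rim_acres = 257_213           # Rim Fire burned area

gh_person_days = 25_000       # Grouse + Harden Fires smoke impact
gh_acres = 4_695              # Grouse + Harden combined footprint

area_ratio = rim_acres / gh_acres                # ~55x
impact_ratio = rim_person_days / gh_person_days  # ~280x ("at least 275 times")
per_acre_ratio = impact_ratio / area_ratio       # ~5x ("5.5 times ... relative to area")

print(f"area ratio {area_ratio:.0f}x, impact ratio {impact_ratio:.0f}x, "
      f"impact per burned acre {per_acre_ratio:.1f}x")
```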

Humans sparked 84 percent of US wildfires, increased fire season over two decades

How should we deal with the new math on forest fires?

If this article published in the February Proceedings of the National Academy of Sciences is not a fluke, then it would seem to me that our expanding population dictates the need for more forest management, not less. The less desirable alternative would be to severely restrict access to our federal forests. The main conclusion of the article is that humans sparked 84 percent of US wildfires and caused nearly half of the acreage lost to wildfire. This number excludes intentionally set controlled burns.

From the above, I would deduce that human-initiated fires caused proportionally less acreage loss because they were closer to civilization and to forest access points, and therefore closer to and more easily reached by suppression resources. The fact that nearly half of the wildfire acreage lost occurs in these areas suggests that we would get more bang for our tax dollars if we increased and focused federal sustainable forest management around high-traffic areas easily accessible to humans.
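One way to see the “proportionally less acreage” point is to compare the average area burned per human-caused ignition with the average per lightning ignition. The sketch below uses the 84 percent ignition share from the article and takes “nearly half” of the burned area to be roughly 44 percent; treat that second number as approximate.

```python
# Illustrating the deduction that human-started fires burn proportionally
# less area per ignition. Uses the 84% ignition share from the article and
# takes "nearly half" of the burned area as roughly 44% (approximate).

human_ignition_share = 0.84
human_area_share = 0.44          # "nearly half", taken here as ~44%

lightning_ignition_share = 1.0 - human_ignition_share
lightning_area_share = 1.0 - human_area_share

# Average burned area per fire, in arbitrary units.
human_per_fire = human_area_share / human_ignition_share
lightning_per_fire = lightning_area_share / lightning_ignition_share

print(f"An average human-caused fire burns ~{human_per_fire / lightning_per_fire:.2f}x "
      "the area of an average lightning-caused fire")
# roughly 0.15x, consistent with the access-and-suppression argument above
```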

Knowing that humans who cause wildfires are, by definition, either careless or malicious, we might deduce that they are generally not inclined to put great effort into reaching their ignition points. This suggests that human-caused fires tend to occur in less difficult terrain with high human traffic, with fires like the Rim Fire being the exception. If true, that would mean forest management for risk reduction on these sites could be done at a lower cost per acre than on less accessible forest acreage. Focusing forest management efforts on these high benefit-to-cost areas would deliver the biggest bang per tax dollar expended toward lowering the total cost of federal wildfire control. If my thinking is correct, this should play a large part in setting the priorities as to where we should: 1) apply controlled burns to reduce ground and other low fuels, 2) utilize commercial thinnings to reduce ladder and proximity fuels, or 3) use commercial regeneration harvests to create greater variation in tree heights between stands in order to provide fire breaks for crown fires, where appropriate for the site and species. The net effect would be positive for all species, including endangered and threatened species. There would still be plenty of lightning-caused wildfire, controlled-burn hotspots and breakouts, and a significantly reduced acreage of human-caused fires to satisfy those who don’t mind national ashtrays. Reducing the number and size of human-caused fires would also free resources to attack lightning fires earlier and harder when allowing a fire to burn is not an option.
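The prioritization argument above boils down to ranking candidate treatment areas by expected risk reduction per dollar, where accessible, high-traffic areas tend to combine higher human-ignition risk with lower treatment cost per acre. A toy ranking with made-up numbers:

```python
# Toy ranking for the prioritization argument above: order candidate
# treatment areas by ignition-risk reduction per dollar. All numbers are
# made up for illustration only.

candidates = [
    # (name, relative human-ignition risk weight, treatment cost per acre $)
    ("Roadside / high-traffic corridor", 0.9,   400),
    ("Mid-slope, moderate access",       0.5,   700),
    ("Remote backcountry",               0.2, 1_200),
]

# Rank by risk weight per dollar spent treating an acre.
ranked = sorted(candidates, key=lambda c: c[1] / c[2], reverse=True)
for name, risk, cost in ranked:
    print(f"{name}: {risk / cost:.5f} risk-weighted units per dollar")
```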

Pertinent Quotes:

  1. “After analyzing two decades’ worth of U.S. government agency wildfire records spanning 1992-2012, the researchers found that human-ignited wildfires accounted for 84 percent of all wildfires, tripling the length of the average fire season and accounting for nearly half of the total acreage burned.” Italics added
  2. “”These findings do not discount the ongoing role of climate change, but instead suggest we should be most concerned about where it overlaps with human impact,” said Balch. “Climate change is making our fields, forests and grasslands drier and hotter for longer periods, creating a greater window of opportunity for human-related ignitions to start wildfires.”” Italics added
  3. “”Not all fire is bad, but humans are intentionally and unintentionally adding ignitions to the landscape in areas and seasons when natural ignitions are sparse,” … “We can’t easily control how dry fuels get, or lightning, but we do have some control over human started ignitions.””

IN SEARCH OF COMMON GROUND

It seems like an exercise in futility for the “New Century of Forest Planning” group to be discussing and cussing forest planning and/or policy when we haven’t even agreed on the scientific fundamentals that serve as the cornerstone and foundation for any such discussion.

Below, I have developed a tentative outline of the high level fundamentals which any Forest Plan or Policy must incorporate in order to have a reasonable chance of meeting the desired goals. Until we can come up with a version of these “Forestry Fundamentals” that we generally agree to, we are pushing on a rope and wasting each other’s time unless our objective here is simply to snap our suspenders and vent on each other.

In your comments, please note the outline item that you are responding to. Maybe we can revise my initial effort and come to some common ground. In doing so we would perform a service and take a step forward that would be useful outside of this circle instead of just chasing our tails. Coming to such an agreement would be a step toward developing a priority hierarchy and eliminating the internal conflicts that make current federal forest policy and law ambiguous and self-contradictory. Until we reach common ground, the current obviously unworkable policies will continue to doom our forests to poor health and consequently increase the risk of catastrophic loss of those forests and the species that depend on them for survival.

– FORESTRY FUNDAMENTALS – 1st Draft 12/15/16

ESTABLISHED SCIENCE WHICH MUST BE INCORPORATED IN PLANNING FOR THE SUSTAINABILITY OF FOREST-DEPENDENT SPECIES

I) The Fundamental Laws of Forest Science which have been repeatedly validated over time, location, and species. They include:
— A) plant physiology, which dictates the impact of competition on plant health,
— B) fire science, which dictates the physics of ignition and the spread of fire, and
— C) insects and pathogens, whose propensity to attack depends on proximity and whose probability of success is inversely proportional to the health of the target.

— D) Species suitability for a specific site is based on the interaction among the items listed above, others not mentioned, and the following:

— — 1) hydrology, the underlying geology and availability of nutrients in the soil.

— — 2) latitude, longitude, elevation, aspect and adjacent geography.

— — 3) weather, including local and/or global pattern changes.

 

II) The Fundamental Laws controlling the success of endangered, threatened and other species dependent on niche forest types (ecosystems):

— A) Nesting habitat availability.

— B) Foraging habitat availability.

— C) Competition management.

— D) Sustainability depends on maintaining a fairly uniform continuum of the necessary niches, which in turn requires a balanced mix of age classes within each forest type to avoid species-extinguishing gaps (a small illustration follows this section).

— E) Risk of catastrophic loss must be reduced where possible in order to minimize the chance of creating species-extinguishing gaps in the stages of succession.
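As a toy illustration of the age-class continuum idea in II-D: given acres by age class for a single forest type, flag any class whose share falls below a floor, since a missing age class today becomes a missing habitat niche decades from now. The age classes, acreages, and the 5 percent floor are all hypothetical.

```python
# Toy check of the age-class continuum idea in II-D. All numbers are
# hypothetical, including the 5% floor for a "fairly uniform continuum".

age_class_acres = {
    "0-20":   1_500,
    "21-40":  4_000,
    "41-60":  6_000,
    "61-80":    200,   # under-represented class
    "81-100": 3_000,
    "100+":   5_300,
}

total = sum(age_class_acres.values())
minimum_share = 0.05  # assumed floor for each age class

for age_class, acres in age_class_acres.items():
    share = acres / total
    flag = "  <-- potential gap" if share < minimum_share else ""
    print(f"{age_class:>7}: {share:5.1%}{flag}")
```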

 

III) The role of Economics:

— A) Growing existing markets and developing new markets in order to provide revenue to more efficiently maintain healthy forests and thence their dependent species.

— B) Wise investment in the resources necessary to accomplish the goals.

— C) Efficient allocation of existing resources.

 

IV) The role of Forest Management:

— A) Convert the desires/goals of the controlling parties into objectives and thence into the actionable plans necessary to achieve the desired objectives.

— B) Properly execute the plans in accordance with the intent of governing laws/regulations and best management practices, considering any economies.

— C) Acquire independent third-party audits and adjust management practices where indicated, in order to provide continuous improvement in the means used to achieve goals.

— D) Adjust plans as required by changes in goals, by the forces of nature, and by on-the-ground results.

— E) Use GIS software to maintain the spatial and associated temporal data necessary for scheduling software to find and project feasible alternatives and recommend the “best” alternative to meet the goals set by the controlling parties (a minimal sketch of this idea follows the outline).
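Here is a minimal sketch of the scheduling idea in IV-E: given stand records of the kind a GIS would supply, pick a feasible set of treatments under a budget and report the chosen alternative. A real scheduler (for example, a multi-period linear program) is far more elaborate; this greedy pass is only meant to show the shape of the problem, and all of the numbers are hypothetical.

```python
# Minimal sketch of the scheduling idea in IV-E. Greedy selection under a
# budget; a real scheduler would be far more elaborate. All stand data and
# scores are hypothetical.

from dataclasses import dataclass

@dataclass
class Stand:
    name: str
    acres: float
    treatment_cost: float  # dollars to treat the whole stand
    goal_score: float      # contribution toward the controlling parties' goals

def greedy_schedule(stands, budget):
    """Pick stands in order of goal score per dollar until the budget runs out."""
    chosen, spent = [], 0.0
    for s in sorted(stands, key=lambda s: s.goal_score / s.treatment_cost,
                    reverse=True):
        if spent + s.treatment_cost <= budget:
            chosen.append(s)
            spent += s.treatment_cost
    return chosen, spent

stands = [
    Stand("Unit 12 (overstocked pine)", 300,  90_000, 8.0),
    Stand("Unit 7 (WUI boundary)",      150,  60_000, 9.0),
    Stand("Unit 3 (remote, low risk)",  500, 200_000, 4.0),
]

plan, cost = greedy_schedule(stands, budget=160_000)
print([s.name for s in plan], f"total cost ${cost:,.0f}")
```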

What did I miss, what is wrong, what is right, what would improve this list of Forest Fundamentals?

County job description for biologist: “help us combat the radical environmental influence”

This job interview of a former Forest Service employee by Tuolumne County Supervisors didn’t go well.

Supervisor Evan Royce noted he wanted to be explicit with Boroski, trying to make sure they are on the same page, by saying, “I think we have experienced a lot of extreme environmental influence on public lands policy and in Tuolumne County 78% of our county is publically owned and that has a huge effect on our communities, and we fight very hard on this board to try to protect our communities and represent them in a way that will preserve our quality of life and prosperity…looking into the future, as we are about to adopt a general plan and we’re dealing with new forest plans, it’s very critical to us that if we are going to do business with you that you represent us in that way and you help us combat the radical environmental influence that you see from groups like Center for Biological Diversity. That’s what we want.

A revealing look at their approach to forest plan collaboration.

Do elk need trees?

For many years, it has been pretty much common knowledge, supported by science, that as the number of roads open during hunting season increases, elk need more cover to hide.  The Helena National Forest plan (and others) have incorporated this relationship into standards for elk security.  (Full disclosure – I had something to do with this on the Helena 30 years ago.)   When the Helena National Forest developed its Divide travel plan, it found that it couldn’t meet its requirements for elk habitat because there were too many roads and not enough trees to provide security (trees in the area have been killed in large numbers by mountain pine beetles in recent years).  So it amended its forest plan elk standard to eliminate the role of tree cover in determining elk security (distance from roads replaces road density as a factor).

The rationale provided in the Record of Decision emphasizes the fact that elk have been doing well despite the fact that the existing forest plan standards have not been met in many places.

I have taken into account the fact that Montana Fish, Wildlife and Parks data indicate that elk populations in the Divide landscape are either at or near population objectives of the 2005 Montana Elk Plan and that elk management challenges are only partially related to access management according to that Plan. I have also taken into account the fact that, despite several miles of road closures, only one herd unit comes into compliance with standard 4a in the Travel Plan Decision. Given this, I have concluded that the existing standard 4a is not an accurate indicator of elk security and is insensitive to changing road densities. The methodology utilized for the new standard (based on the percentage of an elk herd unit occupied by elk security areas and/or intermittent refuge areas) indicates that overall elk security in the Divide landscape is adequate. This measure of security is sensitive to changes in open road configuration and will provide a way to determine where proposed management actions are effective or where management needs to improve to ensure adequate big game security. I believe the new standard will provide a more realistic means of guiding travel management and other future management activities in the Divide Travel Planning Area.
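The new standard is described as “based on the percentage of an elk herd unit occupied by elk security areas and/or intermittent refuge areas.” As a toy illustration only, the sketch below computes such a percentage assuming a Hillis-style definition of security (blocks of at least a minimum size lying more than a set distance from open roads). The 0.5-mile distance, 250-acre minimum block, and 30 percent target follow the widely cited Hillis et al. approach and are my assumptions, not the Helena plan’s actual numbers.

```python
# Toy illustration of a percentage-based elk security standard: what share of
# a herd unit lies in qualifying security blocks? The 0.5-mile distance,
# 250-acre minimum block size, and 30% target follow the widely cited Hillis
# et al. approach and are assumptions here, not the Helena plan's numbers.

def security_percentage(herd_unit_acres, security_block_acres, min_block_acres=250):
    """security_block_acres: acreages of contiguous areas far enough
    (e.g., more than 0.5 mile) from roads open during hunting season."""
    qualifying = sum(a for a in security_block_acres if a >= min_block_acres)
    return 100.0 * qualifying / herd_unit_acres

# Hypothetical herd unit: 20,000 acres with three candidate security blocks.
pct = security_percentage(20_000, [4_000, 1_200, 150])
print(f"Security areas cover {pct:.0f}% of the herd unit")
print("Meets a 30% target" if pct >= 30 else "Below a 30% target")
```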

In essence, the Forest is using anecdotal evidence in place of long-established science (which the Forest now asserts is not relevant to this kind of forest).  Has the science just not caught up with reality, or is it possible that the high elk numbers are a result of unknown factors that, when they change, will render excessive road densities fatal to meeting elk harvest goals?  When the plan is revised under the 2012 planning rule (revision is ongoing), it will have to meet the requirement for using best available scientific information for its elk habitat management decisions.  (The amendment is using the 1982 planning process, but scientific integrity is still required.)

A court has been asked to weigh in on the amendment.

Interestingly, the lawsuit is by participants in a collaborative process.

Science consistency review on the southern Sierra national forests

The draft revised Sierra, Sequoia and Inyo national forest plans include aggressive restoration programs across these forests, including logging in areas of existing old forest structure to protect old forests and associated wildlife species.  The Forest Service has asked (unidentified) reviewers to look at the draft forest plans and draft EIS and address these questions in the first science consistency review conducted under the 2012 planning rule (it is an optional process under associated agency policy):

1. Has applicable and available scientific information been considered?

2. Is the scientific information interpreted reasonably and accurately?

3. Are the uncertainties associated with the scientific information acknowledged and documented?

4. Have the relevant management consequences, including risks and uncertainties, been identified and documented?

Here are some of the topics being addressed:

• Vegetation: Forest Resilience, Seral stage distribution, Effects of post-disturbance harvest, and Impacts on native vegetation.

• Fire and Fuels: Fuels management and community protection, Current fuel loading, Current and future wildfire trends, Effectiveness of treatments for fuel reduction.

• Wildlife and Habitat: Impacts to wildlife and their habitats, terrestrial and aquatic, Protection of old forest and associated species, Threatened and endangered species habitat requirements and availability, Species of Conservation Concern habitat requirements and availability.

• Climate Change: Current and projected trends, Effects on wildlife habitats and populations, Effects on carbon sequestration and carrying capacity

Given the debate on this blog surrounding these issues, the results should be interesting.  However there is no commitment here to any public release or discussion of the results.  The comment period on the draft EIS closes August 25th.  The results of this review were scheduled to be available in August.  “The technical experts (on the planning team) will review the report, consult and address any concerns from the review team, and incorporate any recommendations that would benefit the final EIS.” 

Here is the revision website.