“Proforestation”: It Ain’t What It Claims To Be

‘Proforestation’ separates people from forests

AKA: Ignorance and Arrogance Still Reign Supreme at the Sierra Club.

I picked this up from Nick Smith’s Newsletter (sign up here)
Emphasis added by me as follows:
1) Brown Text for items NOT SUPPORTED by science with long-term and geographically extensive validation.
2) Bold Green Text for items SUPPORTED by science with long-term and geographically extensive validation.
3) >>>Bracketed Italics for my added thoughts, based on 59 years of experience and review of a vast range of literature going back to well before the internet.<<<

“Proforestation” is a relatively new term in the environmental community. The Sierra Club defines it as: “extending protections so as to allow areas of previously-logged forest to mature, removing vast amounts of atmospheric carbon and recovering their ecological and carbon storage potential.” >>>Apparently, after 130 years of existence, the Sierra Club still doesn’t know much about plant physiology, the carbon cycle, or the increased risk of calamitous wildfire spread caused by the close proximity of stems and competition-driven mortality in unmanaged stands (i.e., the science of plant physiology regarding competition, limited resources, and fire-spread physics). Nor have they thought out the real risk of permanent destruction of the desired ecosystems or the resulting impact on climate change.<<<

Not only must we preserve untouched forests, proponents argue, but we must also walk away from previously-managed forests too. People should be entirely separate from forest ecology and succession. >>>More abject ignorance and arrogant woke policy based only on vacuous wishful thinking.<<<

Except humans have managed forests for millennia. In North America, Indigenous communities managed forests and sustained their resources for at least 8,000 years prior to European settlement. It is true people have not always managed forests sustainably. Forest practices of the late 19th century are a good example. >>>Yes, and the political solution pushed on us by the Sierra Club and other faux conservationists, beginning with false assumptions about the Northern Spotted Owl, was to throw out the continuously improving science (i.e., Continuous Process Improvement [CPI]). The concept of using the science to create sustainable practices and laws that regulated the bad practices driven by greed and arrogance wasn’t even considered seriously. As always, the politicians listened to the well-heeled squeaky voters. Now, their arrogant ignorance has given us National Ashtrays, destruction of soils, and an ever-increasing probability that great acreages of forest ecosystems will be lost to the generations that follow, who will also have to cope with exacerbated climate change. So here we are: in 30 +/- years the faux conservationists have made things worse than the greedy timber barons ever could have. And the willfully blind can’t seem to see what they have done. Talk about arrogance.<<<

Forest management provides tools to correct past mistakes and restore ecosystems. But Proforestation seems to reject even forest restoration that helps return a forest to a healthy state, including controlling invasive species, maintaining tree diversity, and returning forest composition and structure to a more natural state.

Proforestation is not just a philosophical exercise. The goal is to ban active forest management on public lands. It has real policy implications for the future management (or non-management) of forests and how we deal with wildfires, climate change and other disturbances.

We’ve written before about how this concept applies to so-called “carbon reserves.” Now, powerful and well-funded anti-forestry groups are pressuring the Biden Administration to set aside national forests and other federally-owned lands under the guise of “protecting mature and old-growth” trees.

In its recent white paper on Proforestation (read more here), the Society of American Foresters writes that “preservation can be appropriate for unique protected areas, but it has not been demonstrated as a solution for carbon storage or climate change across all forested landscapes.”

Proforestation doesn’t work when forests convert from carbon sinks into carbon sources. A United Nations report pointed out that at least 10 World Heritage sites – the places with the highest formal environmental protections on the planet – are net sources of carbon pollution. This includes the iconic Yosemite National Park.

The Intergovernmental Panel on Climate Change (IPCC) recognizes active forest management will yield the highest carbon benefits over the long term because of its ability to mitigate carbon emitting disturbance events and store carbon in harvested wood products. Beyond carbon, forest management ensures forests continue to provide assets like clean water, wildlife habitat, recreation, and economic activity.
>>>(i.e. TRUE SUSTAINABILITY)<<<

Forest management offers strategies to manage forests for carbon sequestration and long-term storage. Proforestation rejects active stewardship that can not only help cool the planet, but also help meet the needs of people, wildlife and ecosystems. You can expect to see this debate intensify in 2023.

Some Thoughts on Questions for the Prescribed Fire Review: Re Research and Models

The release of the Forest Service report on prescribed fire appears to be on the horizon. On September 2, Source New Mexico reported that the review is in its latter stages; somewhere else I read that the Chief is reviewing the report. According to the news story:

Questions the Forest Service review hopes to answer, according to Chief Moore:

  • Does our prescribed fire program incorporate the most current research on climate change?
  • Do we use our climate models to add to the expertise of decision-makers on the ground?
  • What in our burn plans might need to change?
  • Do we have access to accurate weather forecasts?
  • Do we have enough personnel for the scale of prescribed fire needed to match the scale of wildfire risk across the landscape?
  • Do our existing policies and authorities affect our ability to make sound decisions on the ground?

***********************************************

It won’t surprise any TSW readers that I would have added some questions about research and models…

(1) To what extent have decision-makers on the ground, and fire behavior analysts, specifically, been involved in developing and ground-truthing models that incorporate climate change?

(2) Is there an institutional forum for linking the modelling improvements and research requested by fire practitioners with the work developed by universities and government research entities? Or are those entities simply funding “research that sounds useful to the fire community” without practitioners’ direct involvement?

(3) Is JFSP the only program specifically targeting practitioner needs? How well are they doing at this, and are any improvements needed? Do they need more funding? How could it be taken from the “sounds plausible” research panels and redirected to research prioritized by fire managers and practitioners?

*********************************

I know the Chief is quite knowledgeable about all this, and how it works, from his many years interacting with FS and other researchers.

But the first two questions seem to assume that existing research is 1) relevant and 2) correct for that problem/area, and that it’s only a question of managers adopting it: what I call the “briefcase left under the bridge” view of research links to management. “Pick it up and use it, whether you think it’s useful or not, because someone you don’t know and have never spoken with determined that you need it.” Soon to be followed by accusations of “not using the science” if it is determined not to be relevant or correct. Which of course brings up issues of power and privilege between the studiers and the doers. And of course there are researchers who work closely with practitioners to produce research. But now that wildfire is a cool subject to study in the eyes of the world, various disciplinary crows are circling the funding carcass, and not all of them know how to, or will, involve the practitioner community.

I think research should be considered relevant if, and only if, fire practitioners have asked for it and have given input into how and where it is designed, carried out and interpreted.

I think research should be considered correct if, and only if, it has been ground-truthed by fire practitioners.

These two are not hard targets to achieve. It only appears difficult, in my view, because our research institutions are not (for the most part) currently set up with these goals in mind. So here’s an idea…

Any study that states that it has utility for practitioner communities should be reviewed by representatives of those communities.  That information would be available via a link to each journal article.  Or better yet, put practitioners on review committees for funding proposals.  We tried that when I worked at CSREES, now NIFA, and it resulted in a very different landscape of approaches and designs.

 

Science Friday: Law et al. Paper on Prioritizing Forest Areas for Protection in 30 x 30

We’ve looked at two scientific papers in the last week: last Friday, Siirila-Woodburn et al. on a “low- to no-snow future and water resources,” as we discussed here. Then yesterday we took a look at Ager et al. 2021 as part of a discussion about the Forest Service 10-year wildfire risk reduction plan. Today I’d like to look at a recent paper by Law et al. that Steve brought up in a comment yesterday. It’s interesting for many reasons, not the least of which is that the journal, Communications Earth and Environment, publishes the review comments and responses, and is open access. Apologies for the length of this post, but there’s lots of interesting stuff around this paper.

The first question is “what is the point of the paper?”  In the discussion, the authors say

“We developed and applied a geospatial framework to explicitly identify forestlands that could be strategically preserved to help meet these targets. We propose that Strategic Forest Reserves could be established on federal and state public lands where much of the high priority forests occur, while private entities and tribal nations could be incentivized to preserve other high priority forests. We further find that preserving high priority forests would help protect (1) ecosystem carbon stocks and accumulation for climate mitigation, (2) animal and tree species’ habitat to stem further biodiversity loss, and (3) surface drinking water for water security. Progress has been made, but much work needs to be done to reach the 30 × 30 or 50 × 50 targets in the western US.”

Basically, to put words in their mouths, they used geospatial data from various sources to help figure out how to meet 30×30 and 50×50 goals. It seems to me that they equate “preservation” with “conditions that are good for carbon stocks, biodiversity, and drinking water.” This is perhaps fine in a non-fire environment (and we can all make assumptions about future fires on the West Side, but if we were perfectly honest we’d admit that “fires may well occur on the West Side as well and possibly increase,” but “no one knows for sure”).

Now if we were to raise our sights from the details of the geospatial framework, we might see that 30×30 is a current policy discussion about how much conservation versus protection, and about what practices count. So they might have taken the same tack as Siirila-Woodburn’s and Ager’s coauthors: “Let’s ask the people who know about these practices and are working in the area what they would like to know that would help them, keeping in mind that these systems are so complex we can’t really predict, and we need to be open about uncertainties.” There’s also a substantial literature about these national or international priority-setting analyses and their tendency to disempower local people. As far as I could tell, none of the reviewers of this paper were social scientists.

Nevertheless, it seems like they ran some numbers, and then had a long discussion in the mode of an op-ed with citations.

Differences in fire regimes among ecoregions are important parts of the decision-making process. For example, forests in parts of Montana and Idaho are projected to be highly vulnerable to future wildfire but not drought, thus fire-adapted forests climatically buffered from drought may be good candidates for preservation. Moist carbon rich forests in the Pacific Coast Range and West Cascades ecoregions are projected to be the least vulnerable to either drought or fire in the future25, though extreme hot, dry, and windy conditions led to fires in the West Cascades in 2020. It is important to recognize that forest thinning to reduce fire risk has a low probability of success in the western US73, results in greater carbon losses than fire itself, and is generally not needed in moist forests79,80,81,82.

Biodiversity-wise, though, you don’t need a PhD in wildlife ecology to think: protecting more west-side Doug-fir isn’t as good for biodiversity as protecting some of that and some of Montana or New Mexico. So really, carbon and biodiversity don’t always lead us to the same places. It’s interesting that the reviewers didn’t catch the claim that “forest thinning has a low probability of success.” What is paper 73, you might ask? It’s a perspective piece in PNAS (so another op-ed with citations) by our geography friends at the University of Colorado. And “results in greater losses than fire itself”? See our California versus Oregon wildfire carbon post here.

Forests help ensure surface drinking water quality63,64 and thus meeting the preservation targets would provide co-benefits for water security in an era of growing need.

This was an interesting claim for “protected” forests, as our hydrology colleagues (who perhaps are more expert in this area?) wrote in their review…

Changes in wildfire frequency, severity and timing are particularly catastrophic consequences of a low-to-no-snow future. Indeed, alongside continued warming, a shift towards a no-snow future is anticipated to exacerbate wildfire activity, as observed169,170. However, in the longer term, drier conditions can also slow post-fire vegetation regrowth, even reducing fire size and severity by reducing fuels. The hydrologic (and broader) impacts of fire are substantial, and include: shifts in snowpack accumulation, snowpack ablation and snowmelt timing171; increased probability of flash flooding and debris flows172,173; enhanced overland flow; deleterious impacts on water quality174,175; and increased sediment fluxes176,177. Notably, even small increases in turbidity can directly impact water supply infrastructure178,179. Vegetation recovery within the first few years following fire rapidly diminishes these effects, but some longer term effects do occur, as evidenced with stream chemistry180 and above and below ground water partitioning both within and outside of burn scars181.

There’s even a drive-by (so to speak) on our OHV friends…

Recreation can be compatible with permanent protection so long as it does not include use of off-highway vehicles that have done considerable damage to ecosystems, fragmented habitat, and severely impacted animals including threatened and endangered species37

Here’s a link to the review comments. The authors did not include fragmentation in their analysis, as one reviewer pointed out, so they added:

Nevertheless, our current analysis did not incorporate metrics of forest connectivity39 or fragmentation48, thus isolated forest “patches” (i.e., one or several grid cells) were not ranked lower for preservation priority than forests that were part of large continuous corridors.

To circle back to handling uncertainty and where the discussion of these uncertainties takes place (with practitioners and inhabitants or not), another review comment on uncertainty and the reply:

The underlying datasets that we used in this analysis did not include uncertainty estimates and thus it is not readily possible for us to characterize cumulative uncertainty by propagating uncertainty and error through our analysis. We recognize the importance of characterizing uncertainty in geospatial analyses and acknowledge this is an inherent limitation in our current study. To better acknowledge this limitation and the need for future refinements, we added the following text to the end of the Discussion (lines 445-447): Next steps are to apply this framework across countries, include non-forest ecosystems, and account for how preservation prioritization is affected by uncertainty in underlying geospatial datasets.

It makes me hanker for old-timey economists, who put uncertainties front and center. Remember sensitivity analysis?
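For readers who haven’t run one lately, here is a minimal sketch of what I mean. This is not the Law et al. method; the cell scores, weights and noise level are all invented for illustration. The idea is simply to jitter the input layers of a weighted priority score and see which “high priority” rankings survive the noise:

```python
# A toy sensitivity analysis: perturb the inputs of a weighted priority
# score and see which rankings survive. Everything here (cell scores,
# weights, noise level) is invented for illustration.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 0-1 scores for five forest grid cells.
carbon       = np.array([0.9, 0.7, 0.5, 0.8, 0.4])
biodiversity = np.array([0.3, 0.8, 0.6, 0.5, 0.9])
water        = np.array([0.6, 0.4, 0.9, 0.7, 0.5])
weights      = np.array([0.4, 0.4, 0.2])  # assumed policy weights

def ranks(c, b, w):
    """Rank cells by weighted score: 0 = highest preservation priority."""
    score = weights[0] * c + weights[1] * b + weights[2] * w
    return np.argsort(np.argsort(-score))

baseline = ranks(carbon, biodiversity, water)

# Monte Carlo: jitter each layer with 10% noise and count how often
# each cell keeps its baseline rank.
n_draws = 10_000
kept = np.zeros(len(carbon))
for _ in range(n_draws):
    r = ranks(carbon + rng.normal(0, 0.1, 5),
              biodiversity + rng.normal(0, 0.1, 5),
              water + rng.normal(0, 0.1, 5))
    kept += (r == baseline)

print("baseline ranks:", baseline)
print("share of draws keeping baseline rank:", kept / n_draws)
```

If a cell keeps its rank in 95% of draws, calling it high priority seems defensible; if it keeps it in only half of them, the “priority” label is doing work the data can’t support.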

But the reviewers never addressed the gaps that I perceive between what the authors claim in their discussion and what the data show. I suspect that’s because “generating studies using geospatial data” is a subfield, and the reviewers are experts in that, but not so much in the points in the discussion (what’s an IRA, what’s the state of the art on fuel treatments). I think that’s an inevitable part of peer review being hard, unpaid work: at some point reviewers will fall back on the “sounds plausible from here” criterion. And so it goes…

Practice of Science Friday: The Structure of Scientific Disagreement- Fish Behavior and Ocean Acidification

There’s a fascinating article in Science that delves into a scientific controversy around fish behavior and ocean acidification. Martin Enserink is a journalist with Science who wrote the article and sent me a link, so hopefully you can access it. It raises some questions about the science biz, how it works, and how it might work better.

The fight, between two groups united by their passion for fish, isn’t just about data and the future of the oceans. It highlights issues in the sociology, psychology, and politics of science, including pressure on researchers to publish in top-tier journals, the journals’ thirst for eye-catching and alarming findings, and the risks involved in whistleblowing.


Should extraordinary claims get closer scrutiny?

The paper has proved so polarizing in the field, “It’s like Republicans and Democrats,” says co-author Dominique Roche of Carleton University in Ottawa, Canada. Some scientists hailed it as a stellar example of research replication that cast doubt on extraordinary claims that should have received closer scrutiny from the start.

and

Not long after that paper was published, Science received a “technical comment” from JCU reef ecologist Andrew Baird, who noted several problems, including the fact that the water flow quoted for Dixson’s flume was much faster than any coral larvae have been reported to swim, meaning larvae would be washed out of the back of the flume. Science’s review process deemed the comment “as low priority for publication,” says Deputy Editor for Research Sacha Vignieri. (Science’s news and editorial departments operate independently of each other.) Baird published it as a preprint instead, but it drew little attention.

Maybe there should be a screen for “extraordinary claims” and/or “high political and policy implications” that would invoke an open peer-review process. Would this story have been different if that had been the case? I think much of the interpersonal drama might have been avoided, and of course, better research accomplished.

When Scholarly Discussion is Not Enough: Give Up or Blow Whistle or ???

The group learned some painful lessons. After a preliminary inquiry, a UU panel dismissed the request for an investigation in a terse report and berated the team for failing to discuss its concerns with Lönnstedt and Eklöv in a “normal scholarly discussion.” Lönnstedt said the group was simply jealous. The accusers spent many months gathering additional documentation, at the expense of their own research. In April 2017, Sweden’s Central Ethical Review Board concluded there had indeed been “scientific dishonesty” in the research, and Science retracted the paper; 8 months later, a full UU investigation concluded the data had been fabricated. (Eklöv blamed Lönnstedt; Lönnstedt maintained her innocence.)

The brazenness of the apparent deception shocked Jutfelt. “It really triggered my skepticism about science massively,” he says. “Before that paper, I could not understand how anyone could fabricate data. It was inconceivable to me.” Now, he began to wonder how many other papers might be a total fantasy. The experience also taught the group that, if they were ever to blow the whistle again, they would have to bring a stronger case right from the start, Clark says.

On the other side of the globe, the group’s accusations had Munday’s attention. “It seems that Clark and Jutfelt are trying to make a career out of criticizing other people’s work. I can only assume they don’t have enough good ideas of their own to fill in their time,” he wrote to Lönnstedt in a June 2016 email that she used in her defense to the ethics board. “Recently, I found out they have been ‘secretly’ doing work on the behavioural effects of high CO2 on coral reef fishes, presumably because they want to be critical of some aspects of our work.”

Others defended Munday and Dixson more diplomatically. Four “grandfathers in the field,” as Bruno calls them, criticized the replication in a paper in Biogeosciences. One author, Hans-Otto Pörtner of the Alfred Wegener Institute in Bremerhaven, Germany, says his own work had been on the receiving end of criticism from the “youngish group” in the past. “Building a career on judging what other people did is not right,” says Pörtner, who co-chairs one of IPCC’s three working groups. “If such a controversy gets outside of the community, it’s harmful because the whole community loses credibility.”

So, according to Pörtner, criticism should stay within the community because otherwise the community would lose credibility. So people shouldn’t criticize each other’s work in public? Only via blind peer review, or by publishing papers that disagree (assuming a journal would publish them, because “not finding something” isn’t all that exciting)? Leaving the rest of us to ponder, “why did they come to different conclusions?” I would actually trust the community more if their disagreements were in the open. I wonder whether that might be a role for the relevant scientific society.

Practice of Science Friday: The Lack of Coolness of Applied Science- Some History

I think Experimental Forests and Ranges are scientifically way cool.

Everyone has heard of basic and applied sciences. Generally, basic is thought to be better by scientists, and applied, more useful by others. But why is this, and how does it affect the practice of science today? First, let’s return to Peter Medawar.

The hard and fast distinction between pure and applied science is a quaint relic of the days when it was widely and authoritatively believed that axioms and generative ideas of some privileged sciences (the ‘Pure’ Sciences strictly so called) were known with certainty by intuition or revelation, while the Applied Sciences grew out of merely empirical observations concerning ‘matters of fact or existence’. The distinction between pure and applied science persisted in Victorian and Edwardian times as the basis of a class distinction between activities that did or did not become a gentleman (‘Pure Science’ being a genteel occupation and ‘Applied Science’ having disreputable associations with manual work or with trade). This class distinction is now widely believed to have been rather damaging to this country.

Medawar also argued strongly in the ’70s that basic and applied funding decisions needed to be made by scientists, and not by users of science. Here’s a link to a history paper on this. It did not go unnoticed by others that the position that scientists must make all funding decisions about science is a bit self-serving for something funded by tax dollars. But what does this have to do with us today?

A few months ago, I attended a Forest Service Partners shindig and ran into a person in Forest Service R&D. I asked him how things were going, and he said Congress was on his back about not funding useful research. Of course, this has nothing to do with FS R&D as currently constituted, but has been an ongoing tension for as long as there have been serious levels of public funding for scientific research, and not just in the US. (This is my version of history, and others are invited to add their own perspectives and experiences.)

The Fund for Rural America was an attempt by Congress to convince researchers to do useful things for rural Americans. You can check out the provisions of the 1996 Farm Bill here (note that the same conversation was happening close to 25 years ago):

(C) USE OF GRANT.—
(i) IN GENERAL.—A grant made under this paragraph may be used by a grantee for 1 or more of the following uses:
(I) Outcome-oriented research at the discovery end of the spectrum to provide breakthrough results.
(II) Exploratory and advanced development and technology with well-identified outcomes.
(III) A national, regional, or multi-State program oriented primarily toward extension programs and education programs demonstrating and supporting the competitiveness of United States agriculture.

Notice the “outcome-oriented.”

Yet, let’s look at the other forces pushing toward basic research. Here’s a 2014 report from the National Academy of Sciences.

USDA has played a key role in supporting extramural research for agriculture since the passage of the Hatch Act in 1887, but its use of competitive funding as a mechanism to support extramural research began more recently (see Figure 3-1). A peer-review competitive grants program was proposed as a means of moving a publicly funded agricultural research portfolio toward the more basic end of the R&D spectrum.2 A 1989 National Research Council report stated that “there is ample justification for increased allocations for the [competitive] grants program to a level that would approximate 20 percent of the USDA’s research budget, at least one half of which would be for basic research related to agriculture” (NRC, 1989, pp. 49–50).

The National Research Council study (part of the National Academies, what I call the Temple of Science) pushed toward more basic research. Why earmark at least half of the competitive program for basic research? I’m sure they had a rationale, but I’m not sure Congress would agree. So again, we see the Science Establishment going for more basic (and potentially less utility and accountability) and Congress later pushing back, asking for public funds to have more accountability (“outcome-oriented”).

I also had a ringside seat for part of a transition. At one time USDA had (more) formula funds that were given to land grant schools to figure out useful things. For forest science, at best, the Dean would get together users and others to help prioritize research via discussions at the state level. Or, at worst, perhaps give it to his buddies or use it as a trade for something. But then USDA-CSREES (now NIFA) hired a bunch of folks from the National Science Foundation, who came with the idea that only scientists can judge whether research is worth doing, and that the best way to do so is to bring them together for panels in DC. The FRA staff, including me, even got in trouble with the Powers that Were for allowing a user on a panel.

Meanwhile, back at the Forest Service, Forest Service scientists were less well funded and told to look for funding beyond the Forest Service, which necessarily led them to focus on what other scientists who run grant programs think is cool.

If we go back to the history paper above, we can see that politicians noticed the self-interest of scientists in this, but it has nevertheless been difficult for politicians (even appropriators!) and the public to get a grip on it. Of course, agriculture, health, nutrition, engineering, and different technologies have different communities, funding sources and approaches, so the applied sciences, like science in general, are not one thing.

If something is to be useful, then framing the question (which, as we’ve seen, is extremely important in figuring out which disciplines and approaches are helpful) should definitely be done with users included (in our case, practitioners, land managers and so on). I’m not saying that researchers don’t do this through their own personal commitment; I’m saying that the systems in place do not necessarily support it, and could be changed to support it.

Another tendency is for departments to centralize scientists (e.g. USGS), which can also drive them farther away from research that helps their agency colleagues. IMHO the Forest Service was wise, politically astute, and/or lucky to retain their own research scientists, even if it leads sometimes to intramural drama. A small price to pay for a modicum of independence from the Science Establishment.

Validated Science versus Unproven Scientific Hypothesis – Which One Should We Choose?

In a 6/13/18 article, David Atkins provides a critique of the assumptions behind the Law et al. article titled “Land use strategies to mitigate climate change in carbon dense temperate forests.” He shows how a hypothetical finding can be, and has been, used without any caveat to provide some groups with slogans that meet their messaging needs, instead of waiting for validation of the hypothesis and thereby considering the holistic needs of the world.

I) BACKGROUND

The noble goal of Law et al. is to determine the “effectiveness of forest strategies to mitigate climate change”. They state that their methodology “should integrate observations and mechanistic ecosystem process models with future climate, CO2, disturbances from fire, and management.”

A) The generally UNCONTESTED points (ignoring any debate over the size of the percentage increase) regarding locking up more carbon in the Law et al. article are as follows:
1) Reforestation on appropriate sites – ‘Potential 5% improvement in carbon storage by 2100’
2) Afforestation on appropriate sites – ‘Potential 1.4% improvement in carbon storage by 2100’

B) The CONTESTED points regarding locking up 17% more carbon by 2100 in the Law et al. article are as follows:
1) Lengthened harvest cycles on private lands
2) Restricting harvest on public lands

C) Atkins, at the 2018 International Mass Timber Conference (protested by Oregon Wild), notes that “Oregon Wild (OW) is advocating that storing more carbon in forests is better than using wood in buildings as a strategy to mitigate climate change.” OW’s first reference from Law et al. states: “Increasing forest carbon on public lands reduced emissions compared with storage in wood products” (see the Law et al. abstract). Another reference quoted by OW from Law et al. goes so far as to claim: “Recent analysis suggests substitution benefits of using wood versus more fossil fuel-intensive materials have been overestimated by at least an order of magnitude.”

II) Law et al. CAVEATS ignored by OW

A) They clearly acknowledge that their conclusions are based on computer simulations (modeling various scenarios using a specific set of assumptions subject to debate by other scientists).

B) In some instances, they use words like “probably”, “likely” and “appears” when describing assumptions and outcomes, rather than blindly declaring certainty.

III) Atkins’ CRITIQUE

Knowing that the modeling used in the Law et al. study involves significant assumptions about each of the extremely complex components and their interactions, Atkins investigates the assumptions used to integrate those models with the limited variables mentioned. He shows how they overestimate the carbon cost of using wood, underestimate the carbon cost of storing carbon on the stump, and underestimate the carbon cost of substituting non-renewable resources for wood. This allows Oregon Wild to tout the unproven statements quoted in item I-C above, treating them as fact and justification for policy changes instead of as an interesting but unproven hypothesis that needs to be validated in order to complete the scientific process.

Quotes from Atkins Critique:

A) Wood Life Cycle Analysis (LCA) Versus Non-renewable substitutes.
1) “The calculation used to justify doubling forest rotations assumes no leakage. Leakage is a carbon accounting term referring to the potential that if you delay cutting trees in one area, others might be cut somewhere else to replace the gap in wood production, reducing the supposed carbon benefit.”
2) “It assumes a 50-year half-life for buildings instead of the minimum 75 years the ASTM standard calls for, which reduces the researchers’ estimate of the carbon stored in buildings.” (A short worked example after this list shows how much the half-life assumption alone matters.)
3) “It assumes a decline of substitution benefits, which other LCA scientists consider as permanent.”
4) The “analysis chooses to account for a form of fossil fuel leakage, but chooses not to model any wood harvest leakage.”
5) “A report published by the Athena Institute in 2004, looked at actual building demolition over a three-plus-year period in St. Paul, Minn. It indicated 51 percent of the buildings were older than 75 years. Only 2 percent were demolished in the first 25 years and only 12 percent in the first 50 years.”
6) “The Law paper assumes that the life of buildings will get shorter in the future rather than longer. In reality, architects and engineers are advocating the principle of designing and building for longer time spans – with eventual deconstruction and reuse of materials rather than disposal. Mass timber buildings substantially enhance this capacity. There are Chinese Pagoda temples made from wood that are 800 to 1,300 years old. Norwegian churches are over 800 years old. I visited a cathedral in Scotland with a roof truss system from the 1400s. Buildings made of wood can last for many centuries. If we follow the principle of designing and building for the long run, the carbon can be stored for hundreds of years.”
7) “The OSU scientists assumed wood energy production is for electricity production only. However, the most common energy systems in the wood products manufacturing sector are combined heat and power (CHP) or straight heat energy production (drying lumber or heat for processing energy) where the efficiency is often two to three times as great and thus provides much larger fossil fuel offsets than the modeling allows.”
8) “The peer reviewers did not include an LCA expert.”
9) The Dean of the OSU College of Forestry was asked how he reconciles the differences between two doctorate faculty members, one of whom is the LCA specialist who directs CORRIM (a non-profit that conducts and manages research on the environmental impacts of production, use, and disposal of forest products). The Dean’s answer was: “It isn’t the role of the dean to resolve these differences, … Researchers often explore extremes of a subject on purpose, to help define the edges of our understanding … It is important to look at the whole array of research results around a subject rather than using those of a single study or publication as a conclusion to a field of study.”
10) Alan Organschi, a practicing architect and professor at Yale, described his thought process: “There is a huge net carbon benefit [from using wood] and enormous variability in the specific calculations of substitution benefits … a ton of wood (which is half carbon) goes a lot farther than a ton of concrete, which releases significant amounts of carbon during a building’s construction”. He then paraphrased a NASA climate scientist from the late 1980s who said, ‘Quit using high fossil fuel materials and start using materials that sink carbon; that should be the principle for our decisions.’
11) The European Union, in 2017, based on “current literature”, called “for changes to almost double the mitigation effects by EU forests through Climate Smart Forestry (CSF). … It is derived from a more holistic and effective approach than one based solely on the goals of storing carbon in forest ecosystems”
12) Various CORRIM members stated:
a) “Law et al. does not meet the minimum elements of a Life Cycle Assessment: system boundary, inventory analysis, impact assessment and interpretation. All four are required by the international standards (ISO 14040 and 14044); therefore, Law et al. does not qualify as an LCA.”
b) “What little is shared in the article regarding inputs to the simulation model ignores the latest developments in wood life cycle assessment and sustainable building design, rendering the results at best inaccurate and most likely incorrect.”
c) “The PNAS paper, which asserts that growing our PNW forests indefinitely would reduce the global carbon footprint, ignores that at best there would be 100 percent leakage to other areas with lower productivity … which will result in 2 to 3.5 times more acres harvested for the same amount of building materials. Alternatively, all those buildings will be built from materials with a higher carbon footprint, so the substitution impact of using fossil-intensive products in place of renewable low carbon would result in >100 percent leakage.”
d) More on leakage: “In 2001, seven years after implementation, Jack Ward Thomas, one of the architects of the plan and former chief of the U.S. Forest Service, said: “The drop in the cut in the Pacific Northwest was essentially replaced by imports from Canada, Scandinavia and Chile … but we haven’t reduced our per-capita consumption of wood. We have only shifted the source.”
e) “Bruce Lippke, professor emeritus at the University of Washington and former executive director of CORRIM said, “The substitution benefits of wood in place of steel or concrete are immediate, permanent and cumulative.””
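To make point A-2 above concrete, here is the worked example promised earlier: a back-of-the-envelope sketch (mine, not Atkins’ or Law et al.’s calculation) of how much the assumed building half-life alone moves the answer, treating the wood-products carbon pool as simple first-order decay:

```python
# Back-of-the-envelope check of point A-2 (my own sketch, not the Law
# et al. or Atkins calculation): with first-order decay, the share of a
# wood-products carbon pool still in service after t years is
# 0.5 ** (t / h), where h is the assumed building half-life in years.
def fraction_remaining(t_years: float, half_life: float) -> float:
    """Share of the original pool remaining after t_years."""
    return 0.5 ** (t_years / half_life)

for half_life in (50, 75):
    left = fraction_remaining(100, half_life)
    print(f"half-life {half_life} yr: {left:.0%} of the pool left at year 100")
```

Roughly 25% of the pool remains at year 100 under a 50-year half-life versus about 40% under 75 years, so this one assumption shifts the estimated building carbon pool by more than half. And against the Athena demolition data in point 5 (only 12 percent of buildings gone within 50 years, where a 50-year half-life would predict 50 percent), even the 75-year assumption looks conservative.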

B) Risks Resulting from High Densities of Standing Timber
1) “The paper underestimates the amount of wildfire in the past and chose not to model increases in the amount of fire in the future driven by climate change.”
2) “The authors chose to treat the largest fire in their 25-year calibration period, the Biscuit Fire (2003), as an anomaly. Yet 2017 provided a similar number of acres burned. … the model also significantly underestimated five of the six other larger fire years.”
3) “The paper also assumed no increase in fires in the future.”
4) Atkins’ comments and quotes support what some of us here on the NCFP blog have been saying for years about storing more timber on the stump. It is certain that storing more carbon on the stump on federal lands will significantly increase stand densities, and with them carbon losses to fire, insects and disease. Well-documented, validated and fundamental plant physiology and fire science can only lead us to that conclusion. Increases in drought caused by global warming will only add stress to already stressed, overly dense forests, decreasing their viability and health by reducing each tree’s access to already limited resources such as minerals, moisture and sunlight, while the closer proximity between trees eases the spread of fire, insects and disease.

Footnote:
In their conclusion, Law et al. state that “GHG reduction must happen quickly to avoid surpassing a 2°C increase in temperature since preindustrial times.” This emphasis leads them to focus on strategies which, IMHO, will only exacerbate the long-term problem.
→ For perspective, consider the “Failed Prognostications of Climate Alarm”

Is the Science Biz Letting Women Down? I. Harassment: How Can We Still Not Know This?

So here we are in 2018, and sexual harassment is still a thing. I agree with Dr. Judith Curry, the atmospheric scientist, that part of the problem might be that different actions are lumped together under “sexual harassment,” and that this makes the problem more difficult to address. As she says here…

In the 1990’s there was growing awareness of sexual harassment in the universities. In the early 1990’s, I was on a university committee to evaluate new training materials on sexual harassment. I was astonished when I saw that ‘winking’ and ‘elevator eyes’ were on the same list as rape and quid pro quo behavior. There was simply no hierarchy of sexual harassment sins — a problem that continues to concern me as we hear the latest litany of accusations.

The most vexing issue was ‘hostile environment’ and the subsequent ‘backlash’ if you reported anything. This issue became very real to me when a female faculty member in my department complained about lewd and crude cartoons being posted on the walls of the Center administrative offices. She complained to the Center Director – he wouldn’t take them down. She complained to the Department Chair; essentially no response. But then the backlash began, with attempts to harm her career. She lawyered up based on the backlash, and after several agonizing years she apparently won her case (details were never made public) and managed to salvage her career at the same university and go on to have a very successful career. What was exceptional about this case is that her job and career were salvaged in the outcome — other successful litigants in such cases usually ended up leaving their university because the situation was too hostile and unsalvageable. I suspect that having a female Associate Dean helped this to happen.

The failure to discriminate among the hierarchy of sexual harassment behaviors is evident in the current round of accusations. Behaviors in the ‘hostile environment’ category are particularly vexing, as individual women have very different sensitivities and desires in context of their casual social interactions with men. However, assault, quid pro quo and backlash situations are very unambiguous, and we need to make sure that ambiguous hostile environment issues don’t detract from the most serious transgressions.

We have excellent thinkers and scientists of all persuasions, from social to physical. So here’s a simple question: are sexual harassment efforts more successful if they break down the different categories of harassment and address them separately?

We spend oceans of research money on all kinds of future problems (like climate change). How can we know so little about this problem and how to fix it? Are some institutions (universities and fire research funders) so run by men that this doesn’t seem worthy of attention? Do the answers to these problems involve the relatively uncool social sciences instead of the always cool physical sciences? Who determines that figuring out fixes of current real world problems is less important than running models to attempt to predict things in 2100?

What would a random mix of women scientists come up with as key areas of research, if their own research were not included in the possibilities (to avoid conflicts of interest)? What if they could think outside the box: not “what can I get funded?” but “what does the world most need done?”

Inspiring Science: Co-design and Co-production

Professor Sir Peter Gluckman

We’ve been having a discussion about how to design a study to understand owl/fire relationships, with the results being accepted by people on both sides. A while back I wrote about what science related to policy should have, in my Eight Steps to Vet Scientific Information for Policy Fitness post here.

Let me tell a story. One day back in the ’80s, a researcher told me that management should listen to her findings. I ran a lab for the National Forests that did the same kind of lab work, with QA/QC protocols. I asked about her QA/QC protocols. She said, “researchers can’t afford to do that on our grants.” I said National Forests can’t afford to change management based on research without QA/QC.

But I’m not an outlier on this. Here’s the text of a speech delivered by Sir Peter Gluckman, the Science Advisor to the Prime Minister of New Zealand, on January 18, 2018, titled “Science to Inspire Humanity”:

Second, science must be embedded in society at every stage. This means the science community must be open to discussion on what science is undertaken (before it is undertaken), how it is done, and what practical implications might be drawn from it. Concepts like co-design and co-production and extended peer review need to be more than slogans. These concepts need to be implemented in ways that enhance scientific rigor while ensuring valid public values about the research agenda and the application of science and technology.

Some science policy literature suggests you should give up on science once it becomes a weapon in policy science-slinging. But there is another way. Co-design, co-production, and extended peer review (including of proposals) would be a great deal of work, but the owl/fire question might be the size of topic on which to try it out. Maybe USGS, NSF, and the FS could all fund it jointly, and the state universities could participate?

I also think his “science roadmaps” to facilitate prioritization and coordination are interesting ideas. See his blog post here.

Minority Report: “EPA’s Playbook,” “Fraud,” and “Secret Science”

About a month ago I had a discussion with Dr. Bob Ferris on this blog, initially concerning the general quality of “government science.” Dr. Ferris is a “real scientist” who is “serious” and currently works for Cascadia Wildlands, an environmental activist group based in Eugene, Oregon. He has a number of publications to his credit and was instrumental in the efforts to reintroduce gray wolves into Yellowstone National Park in 1996, through his position as a biologist with Defenders of Wildlife. Here is a current sample of his work: http://www.mercurynews.com/opinion/ci_25387360/wheres-science-fish-and-wildlife-service-must-rewrite

When Dr. Ferris brought up the topic of “best available science,” I responded by providing a link to an Evergreen Magazine interview with Dr. Alan Moghissi, a published and widely recognized expert on the topic. For some reason Dr. Ferris was able to use this link as an opportunity to veer oddly and sharply off-topic and to begin leveling ad hominem attacks on ESIPRI’s website; one of ESIPRI’s founders (and occasional contributor to this blog), Norman MacLeod; Evergreen Magazine; Jim Petersen (Dr. Moghissi’s interviewer and publisher of Evergreen); Dr. Moghissi’s “right wing credentials”; Dave Skinner, a regular contributor to this blog; and the Boards of both ESIPRI and Evergreen Foundation: https://forestpolicypub.com/2014/02/17/of-wolves-and-wilderness/comment-page-1/#comment-38197

The link that seemed to cause Dr. Ferris so much vexation and total disregard for the topic at hand (“best available science”) was to a recent issue of Evergreen Magazine with a picture of Dr. Moghissi on the front cover, the Capitol Building in the background, and featuring the headline: “Fresh Air! Alan Moghissi: Rocking Capitol Hill and the EPA!”: http://www.esipri.org/Library/Evergreen_2012.pdf

Earlier today Karla Davenport, producer of Salem, Oregon’s iSpy Radio show, sent me a copy of this amazing report, EPA’s Playbook Unveiled: A Story of Fraud, Deceit, and Secret Science: http://www.esipri.org/Library/Bolar-Steel_20140319.pdf

This may be the first time I have ever referred to a government report as “amazing” without meaning to be disrespectful. If this is only 50% accurate, it should be made required reading for our public land legislators and their staffs immediately. In my opinion. The report was just released on Wednesday, so it has only been available for 72 hours. I have reproduced its Executive Summary here for discussion purposes. I’m curious as to how it will be received.

EXECUTIVE SUMMARY

The greatness of our unique nation hinges on the fundamental purpose of the government to serve at the will of the people and to carry out public policy that is in the public interest. When it comes to the executive branch, the Courts have extended deference to agency policy decisions under the theory that our agencies are composed of neutral, non-biased, highly specialized public servants with particular knowledge about policy matters. This report will reveal that within the Environmental Protection Agency (EPA), some officials making critically important policy decisions were not remotely qualified, anything but neutral, and in at least one case — EPA decision making was delegated to a now convicted felon and con artist, John Beale.

John Beale is the character from the bizarre tale of the fake CIA agent who used his perch at the EPA to bilk the American taxpayer out of more than a million dollars. Even Jon Stewart, host of the popular Daily Show, featured Beale’s bizarre tale as “Charlatan’s Web” on his program in December 2013. Before his best friend Robert Brenner hired him to work at EPA, Beale had no legislative or environmental policy experience and wandered between jobs at a small-town law firm, a political campaign, and an apple farm. Yet at the time he was recruited to EPA, Brenner arranged to place him in the highest pay scale for general service employees, a post that typically is earned by those with significant experience.

What most Americans do not know is that Beale and Brenner were not obscure no-name bureaucrats housed in the bowels of the Agency. Through his position as head of the Office of Policy, Analysis, and Review, Brenner built a “fiefdom” that allowed him to insert himself into a number of important policy issues and to influence the direction of the Agency. Beale was one of Brenner’s acolytes — who owed his career and hefty salary to his best friend.

During the Clinton Administration, Beale and Brenner were very powerful members of EPA’s senior leadership team within the Office of Air and Radiation, the office responsible for issuing the most expensive and onerous federal regulations. Beale himself was the lead EPA official for one of the most controversial and far reaching regulations ever issued by the Agency, the 1997 National Ambient Air Quality Standards (NAAQS) for Ozone and Particulate Matter (PM). These standards marked a turning point for EPA air regulations and set the stage for the exponential growth of the Agency’s power over the American economy. Delegating the NAAQS to Beale was the result of Brenner’s facilitating the confidence of EPA elites, making Beale the gatekeeper for critical information throughout the process.

Beale accomplished this coup based on his charisma and steadfast application of the belief that the ends justify the means. Concerned about this connection, the Senate Committee on Environment and Public Works (EPW) staff have learned that the same mind that concocted a myriad of ways to abuse the trust of his EPA supervisors while committing fraud is the same mind that abused the deference afforded to public servants when he led EPA’s effort on the 1997 NAAQS. Brenner was known to have an objective on NAAQS, and would have done whatever was necessary to accomplish his desired outcome. Together, Brenner and Beale implemented a plan, which this report refers to as “EPA’s Playbook.”

The Playbook includes several tools first employed in the 1997 process, including sue-and-settle arrangements with a friendly outside group, manipulation of science, incomplete cost-benefit analysis reviews, heavy-handed management of interagency review processes, and capitalizing on information asymmetry, reinforced by resistance to transparency. Ultimately, the guiding principle behind the Playbook is the Machiavellian principle that the ends will justify the means. In the case of the 1997 NAAQS, the Playbook started with a sue-and-settle agreement with the American Lung Association, which established a compressed timeline to draft and issue PM standards. This timeline was further compressed when EPA made the unprecedented decision to simultaneously issue new standards for both PM and Ozone. Issuing these standards in tandem and under the pressure of the sue-and-settle deadline, Beale had the mechanism he needed to ignore opposition to the standards — EPA simply did not have the time to consider dissenting opinions.

The techniques of the Playbook were on full display in the “Beale Memo,” a confidential document that was leaked to Congress during the controversy, which revealed how he pressured the Office of Information and Regulatory Affairs to back off its criticism of the NAAQS and forced them to alter their response to Congress in 1997. EPA also brushed aside objections raised by Congress, the Office of Management and Budget, the Department of Energy, the White House Council of Economic Advisors, the White House Office of Science and Technology Policy, the National Academy of Sciences, and EPA’s own scientific advisers — the Clean Air Science Advisory Committee.

These circumstances were compounded by EPA’s “policy call” to regulate PM2.5 for the first time in 1997. PM2.5 are ubiquitous tiny particles, the reduction of which EPA used to support both the PM and Ozone NAAQS. In doing so, the Playbook also addressed Beale’s approach to EPA’s economic analysis: overstate the benefits and underrepresent the costs of federal regulations. This technique has been applied over the years and burdens the American people today, as up to 80% of the benefits associated with all federal regulations are attributed to supposed PM2.5 reductions.

EPA has also manipulated the use of PM2.5 through the NAAQS process as the proffered health effects attributable to PM2.5 have never been independently verified. In the 1997 PM NAAQS, EPA justified the critical standards on only two data sets, the Harvard “Six Cities” and American Cancer Society (ACS II) studies. At the time, the underlying data for the studies were over a decade old and were vulnerable to even the most basic scrutiny. Yet the use of such weak studies reveals another lesson from EPA’s Playbook: shield the underlying data from scrutiny.

Since the 1997 standards were issued, EPA has steadfastly refused to facilitate independent analysis of the studies upon which the benefits claimed were based. While this is alarming in and of itself, this report also reveals that the EPA has continued to rely upon the secret science within the same two studies to justify the vast majority of all Clean Air Act regulations issued to this day. In manipulating the scientific process, Beale effectively closed the door to open scientific enquiry, a practice the Agency has followed ever since. Even after the passage in 1999 of the Shelby Amendment, a legislative response to EPA’s secret science that requires access to federal scientific data, and President Obama’s Executive Orders on Transparency and Data Access, the EPA continues to withhold the underlying data that originally supported Beale’s efforts.

After President Clinton endorsed the 1997 NAAQS and the Agency celebrated their finalization, Beale became immune to scrutiny or the obligation to be productive for the remainder of his time at the Agency. Similarly, the product of his labors have remained intact and have been shielded from any meaningful scrutiny, much the same way Beale was protected by an inner circle of career staff who unwittingly aided in his fraud. Accordingly, it appears that the Agency is content to let the American people pay the price for Beale and EPA’s scientific insularity, a price EPA is still trying to hide almost twenty years later.

After reaching the pinnacle of his career at the Agency in 1997, and facing no accountability thereafter, Beale put matters on cruise control and enjoyed the lavish lifestyle that the highest paid EPA employee could afford, producing virtually no substantive work product thereafter. For Beale’s successes in the 1997 NAAQS process, Beale was idolized as a hero at the Agency. According to current EPA Administrator, Gina McCarthy, “John Beale walked on water at EPA.”

This unusual culture of idolatry has led EPA officials to blind themselves to Beale’s wrongdoing and caused them to neglect their duty to act as public servants. As such, to this day EPA continues to protect Beale’s work product and the secret science behind the Agency’s NAAQS and PM claims.

The Red-Cockaded Woodpecker Story and a Shared Research Platform

Forest Service photo: safe approach to nest cavity
Once, a long, long time ago, I was at a Society of American Foresters Convention. I can’t remember where or when, but perhaps someone else does. There was a presentation at a technical session of a study on Red-Cockaded Woodpeckers. At the end of the session, a field forester asked a question that remains with me to this day. He said something along the lines of, “You say they don’t do this in these places, but I’ve seen them doing it in these places.” The answer from the presenter was simply, “What you say is not documented in the literature.” End of story, end of discussion. You know how it goes: if you are working on something, you go to meetings where you get lectured to, but you never get to frame the question or really discuss why things look different to you. Transmission of information is basically one-way.

At the time I thought “how powerful would it be if there were conversations between people in the woods observing first-hand and the scientific community?” And “how can SAF help make that happen?” I’m still working on this.

Through my career in research administration, I saw the gradual erosion of the involvement of field people in research prioritization and funding. At one time, extension folks were out there learning and teaching, hearing from field folks what they were interested in and what they knew. At its best, the land grant system enabled them to interact with researchers and students, and enabled all of us to learn from, and teach, each other. The land grant system provided a research, education and extension platform on which those conversations could take place.

But as I worked at USDA, I found that the research folks there went from a land grant, mission-centered model to a model where funding was decided by a random bunch of scientists from wherever, whom we hauled in to the national office (the NSF model). Ideologically, it came about as “formula is bad, competitive is good.” People in the science establishment really believe this. At least, I personally have been unable to convey my belief that there are self-interest and power issues highly tied to their preference for the NSF model.

In fact, when I worked for the Fund for Rural America, an attempt by Congress to fund research that helps people (Congress had its doubts about the utility of what it was getting for the bucks), we (the Forestry Program) got in trouble for allowing (gasp!) stakeholders to sit on research panels.

Unfortunately, in my view, the locus of control also shifted from the state universities to the Science Establishment. Maybe that works for areas other than natural resources and agriculture, which are profoundly local.

The reason I bring this up today is because of Bob’s owl piece here, and Mike’s comments here.

If we wanted to develop a new Shared Research Platform for practitioners, local people and academics to share what they have observed, say, on the spotted owl, and the potential further research projects those observations could lead to, where would we do it? I think research studies should be suggested by the sum of observations made by everyone, and the observers and stakeholders should have a role in prioritization and design. Until we do that, we can use “the best available science,” but the emphasis will be on “available” and in no way on “best.”

I would propose that OSU and the relevant professional societies take the lead. Perhaps a group could develop a set of questions that could be answered online. Other ideas on where or how to do it?

Have you observed anyone doing this successfully on other topics? I think the Fire Learning Network might be an example. Others?