Why Some Things Some People Say Sometimes About Wildfires Are Wrong: Michael Shellenberger in Forbes

In a small gesture for accuracy in media, I took the liberty of retitling Shellenberger’s November 4 Forbes piece for our TSW post. Here’s the link.

Let’s look at this article in light of what we discussed yesterday. Climate and wildfires both have their own sets of experts and a variety of disciplines, each of which has different approaches, values and priorities. I’m not saying that studying the results of climate modelling is entirely a Science Fad, but listen to Keeley here:

I asked Keeley if the media’s focus on climate change frustrated him.

“Oh, yes, very much,” he said, laughing. “Climate captures attention. I can even see it in the scientific literature. Some of our most high-profile journals will publish papers that I think are marginal. But because they find climate to be an important driver of some change, they give preference to them. It captures attention.”

And the marginal ones are funded because… climate change is cool (at least to “high-profile” journals, which have their own ideas of coolness). And who funds it? The US Government. Me, I would give bucks to folks for, say, physical models of fire behavior (something that will help suppression folks today) rather than attempting to parse out the unknowable interlinkages of the past. But that’s just me, and the way the Science Biz currently operates, neither I nor you get a vote.

Note that the scientists quoted in this piece (Keeley, North, and Safford) are all government scientists. Keeley and North are research scientists, while Safford appears to be a Regional Ecologist (funded by the National Forests), so if we went by the paycheck method (are they funded by R&D?), Safford would have to get permission from public affairs, while Keeley and North would not. On the other hand, Safford has published peer-reviewed papers, so perhaps he should not have to ask permission? Even with permission requirements, though, they seem able to provide their expertise to the press.

Keeley talks about the fact that shrubland and forest fires are different beasts, and sometimes get lumped together in coverage. Below is an interesting excerpt on the climate question.

Keeley published a paper last year that found that all ignition sources of fires had declined except for powerlines.
“Since the year 2000 there’ve been a half-million acres burned due to powerline-ignited fires, which is five times more than we saw in the previous 20 years,” he said.
“Some people would say, ‘Well, that’s associated with climate change.’ But there’s no relationship between climate and these big fire events.”
What then is driving the increase in fires?
“If you recognize that 100% of these [shrubland] fires are started by people, and you add 6 million people [since 2000], that’s a good explanation for why we’re getting more and more of these fires,” said Keeley.
What about the Sierras?
“If you look at the period from 1910 – 1960,” said Keeley, “precipitation is the climate parameter most tied to fires. But since 1960, precipitation has been replaced by temperature, so in the last 50 years, spring and summer temperatures will explain 50% of the variation from one year to the next. So temperature is important.”
Isn’t that also during the period when the wood fuel was allowed to build due to suppression of forest fires?
“Exactly,” said Keeley. “Fuel is one of the confounding factors. It’s the problem in some of the reports done by climatologists who understand climate but don’t necessarily understand the subtleties related to fires.”
So, would we have such hot fires in the Sierras had we not allowed fuel to build up over the last century?
“That’s a very good question,” said Keeley. “Maybe you wouldn’t.”
He said it was something he might look at. “We have some selected watersheds in the Sierra Nevadas where there have been regular fires. Maybe the next paper we’ll pull out the watersheds that have not had fuel accumulation and look at the climate fire relationship and see if it changes.”
I asked Keeley what he thought of the Twitter spat between Gov. Newsom and President Trump.

Sharon’s note: I don’t know Keeley, but I thought he handled this very well, considering it’s not really a question about his research. The above italics are mine.

“I don’t think the president is wrong about the need to better manage,” said Keeley. “I don’t know if you want to call it ‘mismanaged’ but they’ve been managed in a way that has allowed the fire problem to get worse.”
What’s true of California fires appears true for fires in the rest of the US. In 2017, Keeley and a team of scientists modeled 37 different regions across the US and found “humans may not only influence fire regimes but their presence can actually override, or swamp out, the effects of climate.” Of the 10 variables the scientists explored, “none were as significant… as the anthropogenic variables.”

It’s encouraging to think that research shows that fire suppression has a big impact on fires. Otherwise, we’d have to give it up because fire suppression isn’t “based on science” 😉

Practice of Science Friday: Proposed Vectors of Scientific Coolness

The US Temple of Science: Home of Coolness

Previously we’ve talked about how scientific research is funded and how researchers may not think that they will be rewarded for studying something that will help out practitioners, because it is neither funded nor cool. In some fields, there is not even a direct linkage between practitioners and researchers.

Penalized for Utility
At one time I was asked to be the representative of research users on a panel for the research grade evaluation of a Forest Service scientist. He had been doing useful work, funded by people interested in what he was doing, and seemed to be doing a good job. Now remember that Forest Service research is supposed to study useful, mission-oriented things. One of the researchers on the panel said that the scientist being evaluated didn’t deserve an upgrade because he was doing “spray and count” research (he was an entomologist). The implicit assumption, even in our applied work, is that excellence is not determined by the user. I asked, “but you guys funded him to do this, how can you turn around and say he should be working on something else?” In my own line of work, what was cool was to apply some new theoretical approach from a cooler field and see if it worked. I’d also like to point out that I’ve heard this same kind of story from university engineering departments, e.g. “the State (it was a state university) folks asked Jane to investigate how they could solve a problem, but Jane was denied tenure because the kind of research she did wasn’t theoretical enough.”

Beware of Science Tool Fads
Ecologists did this with systems theory, which came from a higher field (math). Sometime in the ’90s, I was on a team reviewing one of the Forest Service Stations. There were some fish geneticists at the Station who had done excellent ground-breaking work learning where trout go in streams. Very important to know for a variety of reasons. The Assistant Director told me that that kind of research was passé, and that in the future, scientists would only study “systems”. But, I asked, how can you understand a system without understanding anything in it? To be fair, GE trees for forest applications were also a fad.

Big Science and Big Money
In 1988, I was at the Society of American Foresters Convention in Rochester, New York, and had dinner with a Station Director. I was a geneticist with National Forests at the time. He said that his station wouldn’t be doing much work related to the National Forests any more, as now their scientists could compete for climate change bucks and do real science. (thank goodness this wasn’t a western or southern Station Director).

I’m not being hard on the Forest Service here, it happens to be where I worked (and FS scientists themselves are generally quite helpful).

Right now I’m thinking of different vectors of coolness. There is the abstraction vector, which may go back to the class-based distinctions from the 19th century that we talked about last time. The more abstract the better. But what you study has a coolness vector as well. Elementary particles are better than rocks, rocks are better than organisms, and people (social sciences) are at the bottom. Tools have a coolness vector of their own. Superconducting super colliders, great. Satellite data and computer models, lidar and so on, good. Counting or measuring plants, not so cool. Interviewing people, well, barely science. The feedback from users is another vector. Highest is “what users? we are adding to human knowledge”; somewhere along the line is “national governments should be listening to us, but we’re not asking them what’s important”; and finally, “yes, we determine our research agenda by speaking to the people with problems.”

There is also some kind of feeling that non-interventional work (“studying distributions of plants”) is better than intervening in the lives of plants. Perhaps that is the old class-based vector of coolness transformed into “environmental” vs. “exploitative.”

Given that both of these are improving the environment, which do you think is cooler and why?

Methane Detectives

Collaborative Adaptive Rangeland Management

Another Side of the Colville Revision Story: Comment from Russ Vaagen

Many thanks to Russ Vaagen for giving another perspective. It helps us understand the frustrations of the different parties, and also (as is often the case) how the FS is between a rock and a hard place. Here I’d like to especially highlight one point:

I just hope we don’t have to go through this process of Forest Planning again. It has no place in this modern era of collaboration and public involvement. If those efforts result in a need to alter the Forest Plan, the USFS should recognize it and alter the plan a step at a time rather than the whole thing.

This reminds me of the idea of some planners that plans should be more like a loose-leaf notebook of decisions. It also reminds me of the R-2 forests that didn’t want to get on the list for plan revisions because “it’s like opening up all the disagreements that we had settled, to what end?” Anyway, here’s his whole comment, originally posted here.

As the President of NEWFC I would like to say that not everyone feels the same way that Tim does. I completely understand where he’s coming from and support his ability to speak for himself and his organization.

Tim and Kettle Range Conservation Group have been great members of our collaborative. The fact is, these forest planning processes are divisive. He’s right, we did collaborate for a long time to reach a consensus on the Forest Plan. However, due to the broad nature of the community involvement, it attracted members of the public who weren’t as adept at the collaborative process and never got to the point where they dropped their positions and started talking about interests.

I don’t have the same opinion of Colville National Forest Supervisor Rodney Smolden. Because of this planning process he was put in a very difficult position. If he were to agree to press forward with Wilderness levels supported by NEWFC members, all 9 county commissioners in the three counties affected and a number of groups would have all been adamantly opposed. The fact is, most were opposed to any additional Wilderness. I’m not making a judgment call, I’m just saying when you hear somewhere between 120,000 and 200,000 acres from a local collaborative and then zero from other community leaders including elected officials, 60,000 was an attempt at a compromise. Unfortunately that’s not popular, and it doesn’t work for anyone.

If we are to be giving collaboration its due, we need to adjust this Forest Planning Process. It goes against collaboration and the collaborative process. It’s a disaster.

I know we can agree to more Wilderness on the Colville, but more importantly we can agree to other designations that achieve the interests of conservation and other interested parties. The forest industry participants have already sent a signed letter to the Colville National Forest urging the agency NOT to pursue any projects that would involve logging in Inventoried Roadless Areas. This is a big win for Conservation. Since the letter, no projects have taken place in an IRA on the CNF. When the previous Forest Supervisor suggested some management of an IRA, it was immediately met with opposition from the Forest Industry participants.

I’d also like to address the clear cutting. Tim is right, no one in NEWFC has been asking for Clear Cutting. It’s my opinion that the Forest Service has taken some liberties with some openings by making them too large and subsequently unsightly. I think it was a mistake on their part and these issues have been addressed. I’m hopeful that any future openings will be smaller and mimic natural disturbances. The VAST MAJORITY of the treatments on the CNF are restorative. That means that it’s dominated by thinning.

The Forest Service leadership on the Colville NF and others need to continually revisit expectations of the members of the collaborative groups to ensure that they don’t take things too far. The social license to manage these forests can easily be revoked if the projects don’t consistently match the expectations of the collaborative groups.

I’d also like to address the 25,000 acres of annual treatment. These acres are restorative in design. Over 20 years, that’s 500,000 acres of the 1.1 million acre Colville NF. That almost directly matches up with our collaborative plan to restore and manage about 491,000 acres (if memory serves) that have roads and have been managed in the past. There’s another layer of land between the front country or actively managed lands and potential Wilderness that may or may not need treatment. Therefore, the acreage of treatment isn’t surprising, so long as it’s completed in a way that meets public acceptance.

Tim is a member in good standing with NEWFC and his disappointment is palpable. We’ve all done incredible work on the CNF. The fact that there’s a new Forest Plan now, signed by the Regional Forester, won’t change the fact that we will continue to collaborate. Collaboration has shaped the way we manage the forest and will continue to. That same collaboration will lead to solutions which I believe will include a completed Wilderness Bill and further solutions that will enhance conservation, recreation, and the economics of the forest. I just hope we don’t have to go through this process of Forest Planning again. It has no place in this modern era of collaboration and public involvement. If those efforts result in a need to alter the Forest Plan, the USFS should recognize it and alter the plan a step at a time rather than the whole thing.

For Want of A Document?: More on the Bear Creek “Trail’n’Trout” Litigation

Cyclists take a break at an overlook along Trail 667 in North Cheyenne Canon Park. Seth Boster, The Gazette

Seth Boster of the Colorado Springs Gazette wrote this piece on another Cutthroat Trout/CBD lawsuit story. What is really interesting to me is that he interviewed the District Ranger, the USFWS biologist, the plaintiff’s attorney, and the FS fish biologist, who strike me as the main actors in the story thus far. I’m glad that the biologists and the Ranger were allowed to speak to the press and give their side of the story.

Bottom line: the case currently seems to rest on FWS documentation. Similar to the Rio Grande cutthroat case we’ve been discussing, but definitely at the micro scale.

Also: “there have been indications that maybe it wasn’t built up to the standard of some groups’ interest”; hence the lawsuit. This seems to be an example of disagreements about the design and location of a trail while protecting fish (originally introduced to that place by humans, but now having legal status). I like how Townsend says “we need to make sure the trail is situated in a safe way.” This makes clear that the lawsuit is a tool to achieve a specific on-the-ground outcome.

They raised concerns with how Trail 667 came to be on the north side of Kineo Mountain. The plan previously was to build on the south side, before machines encountered blocks, as Pikes Peak District Ranger Oscar Martinez explained to stakeholders in an August 2016 meeting.

The center claims documentation is missing to show how the Forest Service coordinated the late change with the U.S. Fish and Wildlife Service. In reading the notice, “I was thinking that perhaps the Center for Biological Diversity may not have been aware that the reinitiation on the consultation had already occurred,” Leslie Ellwood, the Fish and Wildlife biologist assigned to Bear Creek, told The Gazette in a phone call. “It seemed like a reinitiation should happen, when in fact one already had happened quite a while ago.”

A letter would confirm this, she said — a response approving the Forest Service’s direction. A Gazette request for that letter had yet to be met on Thursday.

“The way I would characterize is, I think there’s a schism,” Martinez said late Wednesday at the roundtable meeting. “Did we consult (Fish and Wildlife)? The answer is yes.”

To attendees’ questions of how the Forest Service would respond to the center, Martinez hesitated.

“I’m not gonna lay out all my legal cards here,” he said. “Do I think we followed the process to do the right thing relative to those concerns? The answer is yes. Will we continue to work with them to address those concerns? The answer is yes.”

He said he expected the center would hear back from his office in two weeks.

“If we do see evidence of consultation, it may change the immediacy of us filing a lawsuit,” center attorney Margaret Townsend said. “But we need to make sure the trail is situated in a safe way, and there have been indications that maybe it wasn’t built up to the standard of some groups’ interest.”

She added: “There may be new information that the Forest Service and Fish and Wildlife need to consider.”

Observers have criticized the sustainability of Trail 667, which they say is showing signs of wear and tear. One of them is Allyn Kratz, president of the local chapter of Trout Unlimited.

On Wednesday, he asked about “ravines” he said he had noticed along the trail, threatening to cause washouts in the stream.

“We did not observe any locations of concern along the trail,” said Janelle Valladaras, the Forest Service’s fisheries biologist. She said the team took “a critical eye” to 667 in early October after the center’s notice.

Townsend, based in Portland, Ore., said she was in the Springs this week to scout the site, but snow got in the way of those plans.

“We’re doing our due diligence to make sure we’re getting all the information and evidence necessary,” she said. “If all the evidence suggests the trail is safe and situated in such a way that the fish will thrive and folks can keep enjoying themselves as they have been, then there’s no reason to file a lawsuit.”

Modeling for Decisions IV. In Practice – Climate Change and the Rio Grande Cutthroat Trout (and Forest Planning)

It’s fortuitous that we have this recent example of how a court viewed a population model for an at-risk wildlife species that addresses climate change. The court included the usual caveat that deference to the agency “is especially strong where the challenged decisions involve technical or scientific matters within the agency’s area of expertise.”

It is undisputed that the Service attempted to estimate the effects of climate change by using both “moderate” and “severe” predictions of expected effects, and that for the severe model, it “increased the risk function over time by 20 percent for the 2040 forecast and 40 percent for the 2080 forecast.” 79 Fed. Reg. at 59,147–48. The Plaintiffs take issue with the Service’s observation that the differences in results from the moderate and severe climate change models were “not particularly large.” Disbelieving that this could be a correct conclusion, the Plaintiffs thus suggest that the models “are driven by the Service’s assumption that climate change will have relatively little influence on the threats to individual Trout populations.” (# 76 at 26.)

But the Plaintiffs’ argument begs its own question, assuming that the Service’s models are infected by false preexisting assumptions that climate change effects will be minimal. It is essential to note that the Plaintiffs have not gotten “under the hood” of the Service’s models and pointed out any methodological, programming, or data entry flaws with them. Rather, the Plaintiffs simply argue that the models must be flawed because they produced results with which the Plaintiffs disagree. It may be that the models are flawed, but it may also be that the Plaintiffs’ (and the Service’s as of 2008) expectations about climate change effects are misplaced. Ultimately, it is the Plaintiffs’ burden to demonstrate an error in the Service’s actions, and simply pointing out that two different methodological approaches to calculating the effects of climate change in the far future produced two different results, one of which the Plaintiffs disagree with, does not suffice to carry that burden.

The threatened inquiry takes a longer-term view, asking whether the species might become endangered in a more distant future. But the threatened inquiry is necessarily closed-ended; once the Court has reached the endpoint of the “foreseeable future” (a term found in ESA, and defined recently by regulation) — which the parties here agree is 2080 — the Court’s ability to prognosticate must also come to an end. After 2080, nothing can be foreseen, all is simply speculation. So it is meaningless to ask whether a species will be threatened as of 2080, because it is impossible in 2080 to engage in the long-term future examination that the threatened analysis requires. By 2080, a species must have either reached the level of endangered and be at immediate risk of extinction, or it never will.

As the Plaintiffs observe, it appears that the Trout is on a “slide towards extinction.” (# 76 at 35.) But if the Service’s models are correct — and in the absence of a challenge, the Court must assume that they are — that slide will not be completed as of or immediately following 2080. At that time, there will still be 50 populations of Trout remaining, a number that the Service believes (and the Plaintiffs have not disputed) is enough to ensure the species’ survival through some indeterminate point in the future. What might become of those 50 populations after 2080 is beyond our ability to foresee; the curtain has come down and the movie has ended. We could attempt to speculate about what might happen thereafter — the 50 populations could persist, they could perish, new populations could be discovered, old habitats could become viable again — but speculation is all it would be. Our ability to predict what might happen has come to an end.

This analysis and decision actually has some important implications for forest planning (from the 2013 Rio Grande Cutthroat Trout Conservation Strategy).

Of the total 1,110 km (690 mi) of occupied habitat, 698 km (434 mi) (63 percent) are under Federal jurisdiction, with the majority (59 percent) occurring within National Forests (Alves et al. 2008).

Range-wide, a large proportion of the watershed conditions within the forests that have Rio Grande cutthroat trout are rated as “functioning at risk,” which means that they exhibit moderate geomorphic, hydrologic, and biotic integrity relative to their natural potential condition (USFS 2011).

Land management activities are currently practiced according to the Carson, Santa Fe, and Rio Grande National Forest Land and Resource Management Plans, and BLM Resource Management Plans. During scheduled revisions, the forests and BLM field offices will evaluate the current Land and Resource Management Plans and update as necessary to provide adequate protection for Rio Grande cutthroat trout with current best management practices. Land management activities that would result in the loss of habitat or cause a reduction in long-term habitat quality will be avoided.

 

If the trout is warranted for listing (even if precluded by higher priorities), it is a “candidate” species under ESA. The Planning Rule requires that forest plan components conserve candidate species (where “conserve” under ESA means the same thing as “recover”). Since the decision that listing is no longer warranted was reversed, that should mean the species is again a candidate species.

Of course, the national forests where the trout is found have been revising their forest plans since 2014, when listing was no longer considered warranted. Consequently, the Rio Grande cutthroat was considered for inclusion as a species of conservation concern (SCC). The Rio Grande National Forest has identified the species as an SCC in its final plan (currently in the objection period). The requirement for forest plans with respect to SCC is for plan components to maintain a viable population.

Logically, a species that is warranted for listing should warrant greater protection than one that is not. So it’s possible that the Rio Grande will need to reconsider plan components in areas that are important to this species, or to at least document why this change doesn’t make a difference.

Which could bring us back to the modeling question – how does the Forest Service show that it is meeting the NFMA requirement to provide ecological conditions necessary for this species?   If there is a working population model for a species, then those factors that may be influenced by national forest management should be examined to determine how they could change as a result of forest plan decisions, and whether or how that could affect the model results.
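As a minimal sketch of what that examination might look like (my illustration, not the Service’s model; the toy population model, the habitat-quality multiplier, and every number in it are made-up assumptions), one could vary only the parameters a forest plan could plausibly influence and watch how the projected number of persisting populations responds:

```python
# A toy check of how plan-influenced factors propagate into model results.
# Nothing here is the Service's model; all values are illustrative.
import numpy as np

rng = np.random.default_rng(42)

def persisting_populations(habitat_quality, n_pops=120, years=60, runs=1000):
    """Average number of populations still above a quasi-extinction threshold."""
    survivors = []
    for _ in range(runs):
        n = np.full(n_pops, 500.0)                  # starting abundance per population
        for _ in range(years):
            growth = rng.normal(1.0, 0.15, n_pops)  # year-to-year environmental variability
            n = n * growth * habitat_quality        # habitat quality = plan-influenced factor
        survivors.append(np.sum(n > 50))            # quasi-extinction threshold
    return np.mean(survivors)

print("baseline habitat quality :", round(persisting_populations(1.00)))
print("slightly degraded habitat:", round(persisting_populations(0.99)))
```

Even a toy exercise like this makes the logic transparent: if plan components change the plan-influenced parameters, the model results change too, and that linkage is what the planning record would need to display.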

 

Environmental investors fund fuel reduction projects

Here’s how it works: Investors buy into the bond, and the money is drawn as needed for forest restoration work. This includes thinning, strategic backfires and other rehabilitation. In this first case, it was a $4 million bond with money from CSAA Insurance, Maryland-based investment firm Calvert Impact Capital, The Rockefeller Foundation, and the Gordon and Betty Moore Foundation.

The investors are paid back over five years, with 4% interest, by those who benefit from the work and have contracted with Blue Forest, like the U.S. Forest Service and state agencies. In this case, payments will come from the Yuba Water Agency, whose reservoirs receive water from the forest, and from the California Department of Forestry and Fire Protection.

And the bond couldn’t come at a better time in the investor community, as an increasingly popular trend of socially conscious investing is taking off. It’s called ESG, which stands for environmental, social and corporate governance. It focuses on investing for the greater good; in this case, buying into the health of the forest but still making money.

It is exactly the kind of investment Jennifer Pryce, CEO of Calvert Impact Capital, says her clients want.

“Our investors are looking for an impact and a financial return, and this is off the charts when you look at what it’s giving back,” said Pryce, who polls investors each year to see how they want to align their capital with their values. “Fighting climate change is No. 1.”

She admits this one was a difficult sell because it is designed to prevent fires, rather than fight them. Still, once the possibilities and savings were made clear, the investors were in.

It’s not exactly “fighting climate change” either. I wonder what else might get lost in translation, whether the “environmental” investors would necessarily like the “other rehabilitation” that the Forest Service decides to fund with their money, and whether there are any restrictions on what an agency could use the funds for. An interesting concept, though…
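For a rough sense of the repayment math in the excerpt, here’s a minimal sketch, assuming (my assumption; the article doesn’t spell out the bond terms) equal annual principal payments with 4% simple interest on the remaining balance over five years:

```python
# Back-of-the-envelope repayment schedule for the $4 million bond.
# Terms are assumed for illustration, not taken from the article.
principal = 4_000_000
rate = 0.04
years = 5

balance = principal
total_paid = 0.0
for year in range(1, years + 1):
    interest = balance * rate
    payment = principal / years + interest   # equal principal slice plus interest due
    total_paid += payment
    balance -= principal / years
    print(f"year {year}: payment = ${payment:,.0f}")

print(f"total repaid: ${total_paid:,.0f}")   # about $4.48 million under these assumptions
```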

Colville National Forest Plan is a Public Disgrace: 17 Years of Collaboration is Ignored

The following press release was issued by the Republic, Washington-based Kettle Range Conservation Group on October 23, 2019. It can be viewed here.

For Immediate Release: October 23, 2019

Colville National Forest Plan is a Public Disgrace: 17 Years of Collaboration is Ignored

On October 21, U.S. Forest Service Regional Forester Glen Casamassa signed the Record of Decision finalizing the revised Forest Plan for Colville National Forest. The plan signals a significant increase in logging – including large areas of clearcutting – as much as 25,000 acres per year across the 1.1 million acre Colville Forest and above prior recommendations. The Colville’s 5-year logging levels are now projected at 100 to 150 million board feet per year or about two to three thousand log trucks filled each year. [MODERATOR NOTE: I believe 100 to 150 million board feet would require 20,000 to 30,000 log trucks. – mk]
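[A quick check of the moderator’s arithmetic, assuming a typical load of roughly 5,000 board feet per log truck (an assumption, not a figure from the press release):]

```python
# Log-truck arithmetic behind the moderator's note; 5,000 board feet per
# truckload is an assumed typical figure.
annual_volume_low, annual_volume_high = 100_000_000, 150_000_000  # board feet per year
board_feet_per_truck = 5_000                                      # assumed average load

print(annual_volume_low // board_feet_per_truck)    # 20,000 trucks per year
print(annual_volume_high // board_feet_per_truck)   # 30,000 trucks per year
```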

“The visual impacts are going to be very, very significant and increasing each year from now for the next 20 years as hundreds of thousands of acres are logged,” said Timothy Coleman, executive director of Republic-based Kettle Range Conservation Group. “The visual landscape-scale degradation is already in full swing and visible now on Sherman and Boulder Pass, but it’s just the beginning.”

The Forest Plan Revision Process has dragged on for over 15 years and countless hours have been volunteered by the public to inform the Forest Service’s process. Two earlier draft Plan proposals recommended far less logging and far more wilderness acres be protected. The Final Plan is a complete reversal of those early plans.

“I have participated in countless Forest Service-led collaborative processes and I never once heard the public ask for more clearcut logging,” said Coleman. “One forest supervisor after another told us the Forest Service would use our comments to craft a plan we could feel good about. Well, in the final hour it was Colville Supervisor Rodney Smolden, who supervised plan creation, who pulled the rug out from under all of us.”

Coleman is a 17-year veteran of the Northeast Washington Forestry Coalition (NEWFC), considered to be the most successful forest collaborative in the country; its work with the Colville National Forest has made it famous. The Colville has the highest timber volume production in the entire National Forest System.

“Seventeen years of my life has been wasted getting the public to believe the U.S. Forest Service had changed its history of clearcut logging and could be trusted to care for our public lands. NEWFC even had a collaborative agreement between the Timber Industry and environmental community to support 200,000 acres of wilderness management in the Forest – but Supervisor Smolden stabbed collaboration in the back. It’s almost uncanny that release of the final plan was timed to coincide with Halloween – the trick really is on us,” Coleman said.

“In the face of mass extinctions of wildlife caused by climate change and loss of forest and other critical habitat, it is absolutely unconscionable what Supervisor Smolden is proposing to do,” said Coleman. “I believe the Colville National Forest and the U.S. Forest Service will rue the day it went back on its word to support collaborative agreements and restore ‘healthy’ forests. It’s just another ruse by the U.S. Forest Service adding to a long history of disservice to wildlife and public interests.”

NOTE: Tim Coleman has been active in the conservation of forest and water resources since 1971, nearly 50 years. Tim is a Vietnam-era Navy veteran. Tim and his wife built and live in a hand-crafted solar-powered log cabin north of Republic, Washington. Since 1993, Tim has served as director of the Kettle Range Conservation Group. Tim cofounded the Wild Washington Campaign that led to the passage of the Wild Sky Wilderness, and he is a cofounder of the Northeast Washington Forestry Coalition and the Columbia Highlands Initiative. In 1998 Tim received the Environmental Hero Award from the Washington Environmental Council.

“Coloradans know what’s best for our state – not Washington”: The CORE Act and the Role of State Politicians in Federal Land Management

Sometimes our usual public lands disagreements get wrapped up in partisan politics, which always generate more heat than light. I thought this article was interesting, given that we were just discussing the roles of State politicians in deciding what happens on federal lands in Alaska. Here in Colorado, we have local people who disagree among themselves about what is best. We also have federal elected officials who disagree with each other. So we have an opportunity to ask “if politicians themselves disagree within the state, who can best be said to be making a claim of legitimacy based on geography?”

Senator Bennet seems to be arguing for Colorado (state) as a legitimate level for decisionmaking on federal lands.

“Coloradans know what’s best for our state – not Washington,” Bennet said in a written statement on Tuesday. “The CORE Act was drafted by Coloradans, for Coloradans – engaging with stakeholders across the state for nearly a decade to hammer out a reasonable public lands bills with broad support.”

Of course, not all Coloradans agree about this bill, as we’ve previously discussed, including the local Representative, Scott Tipton. So Congressional Districts are not the right level, but States are. Which is OK, but then (1) would that make Alaska state influences equally legitimate, or not? Note: Colorado did its own Roadless Rule (ultimately signed off by Obama/Hickenlooper). Or perhaps Utah?

(2) What is it that makes Front Range congressfolk so interested in public lands not in their district?

“The White House also said that not enough local input has been addressed when it comes to the legislation, which is expected to get a vote this week in the U.S. House.” And of course, again, is the question that there wasn’t enough input, or that it wasn’t listened to? I don’t believe that folks working on legislation do a “response to comments” that would let the rest of us figure this out. (3) What exactly is “local input,” how local is it, and would we know it if we saw it? And how could the rest of us judge it without a “response to comments”?

Modelling For Decisions III. Energy Modelers and Black and Dying Swans

Photo by Tom Waugh

Yesterday’s piece was by economists; today’s is about energy systems modeling, from some energy policy and analysis experts. Of course, the energy sector is key to reducing carbon emissions, so perhaps their experience and perspective is valuable.

My favorite quote is the last line.

“But perhaps a start is for decision-makers to adapt to an increasingly uncertain and dynamic world by creating a more imaginative discourse, one that welcomes nuance and doubt as spaces for opportunity and transformative change, and sees forecasts as the beginning of a policy or investment discussion rather than the end, and forecasters not as Delphic oracles of outcome, but as the people who know best why attempts at prediction must fall short.”

I would argue that we stakeholders and members of the public, and of course practitioners and scientists outside the modelling community, should participate in that imaginative discourse. Nuance and doubt are our friends, IMHO, and yield more robust paths forward.

“We explore two challenges to forecasting complex systems which can lead to large forecast errors. Though not an exhaustive list, these two challenges lead to a significant fraction of large forecast errors and are of central importance to energy system modeling. The first challenge is that in complex systems, there are more variables than can be considered. Often described as epistemic uncertainty, these un-modeled variables—the unknown unknowns—can lead to reality diverging dramatically from forecasts. The second challenge in forecasting complex systems is from the inherently nonlinear nature of many such systems. This results in a compounding of stochastic uncertainties—the known unknowns—which in turn can result in real-world outcomes that deviate significantly from forecasts.”

Remember from yesterday: Idea 1 was “weather-like vs. climate-like” forecasting, with weather-like tasks having more chances to check in with the real world. Idea 2 was the concept of Big Surprises, which seem unpredictable. Today’s authors’ experience is that “the future is often directed by unlikely events.” They also relate epistemic uncertainty to black swans and stochastic uncertainties to dying swans. Perhaps the longer the projection, the greater the probability that unlikely events will overwhelm likely events? For that reason, some have suggested focusing on the short to medium term for projections.
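To make the second challenge concrete, here is a minimal sketch (my illustration, not the paper’s; the growth model, the noise level, and every parameter value are made-up assumptions) of how small stochastic errors compound in a nonlinear system, so that the spread among forecast runs widens the further out you project:

```python
# Many runs of the same simple nonlinear growth model, each nudged by tiny
# random errors every step; the relative spread among runs grows with time.
import numpy as np

rng = np.random.default_rng(0)

def forecast_spread(years=30, runs=500, r=0.08, capacity=100.0, start=10.0, noise=0.02):
    paths = np.full(runs, start)
    spread = []
    for _ in range(years):
        growth = r * paths * (1 - paths / capacity)       # nonlinear (logistic-style) term
        paths = paths + growth
        paths *= 1 + rng.normal(0, noise, size=runs)      # small stochastic "known unknowns"
        spread.append(paths.std() / paths.mean())         # relative spread across runs
    return spread

spread = forecast_spread()
print(f"relative spread after  5 steps: {spread[4]:.1%}")
print(f"relative spread after 30 steps: {spread[-1]:.1%}")
```

The particular numbers don’t matter; the shape of the problem does: the same tiny per-step uncertainty yields a much wider range of outcomes at longer horizons, before any epistemic “unknown unknowns” even enter the picture.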

In many cases, modeling apologetics are insightful and accurate, and meaningfully contribute to improved future forecasts. But there are reasons to question the universality of this narrative. Explaining away modeling errors as due to one-off unlikely events misses the prevalence of errors caused by such events, and may lure us (especially those of us who are non-modelers but rely upon model outputs) toward a heuristic of naturalistic equilibrium: a belief that “now things are normal,” or that they will soon be. The history of the energy system teaches us that the future is often directed by unlikely events, and that there is value in questioning whether naturalistic analogies of equilibrium are appropriate in many cases. Energy systems may experience multiple years, or even decades, of disequilibrium due to complex and shifting market rules, uncertainties of technological or economic feasibility at nonlinear scales of deployment, and extraordinary diversity in market structure, composition, and actors. The enormity of such extraneous uncertainties places any forecaster in very deep water.

The questions raised here, and the types of forecast errors described, should be expanded to other sectors. The rapid pace of advancement and interconnectedness of the world means that epistemic uncertainty is larger than ever [50]. For many newer technologies, the degree of uncertainty regarding future generation has increased in recent years. Cost declines in technologies such as wind, solar, and energy storage place them on competitive terms with conventional generation technologies. These technologies have shown even more stark learning-by-doing effects than shale gas production. Markets for electric cars and demand response, to name a few, similarly pose the possibility for dramatic shifts. Even moderate changes in cost and policies can lead to large changes in the future adoption of these technologies.

Decision-makers and investors would benefit from learning more about why they were caught unawares by the shale revolution, and how they can be better prepared the next time such a surprise occurs. The answer to that second question is not immediately apparent. Tautologically, if we were prepared for them, surprises would cease to be surprises. But perhaps a start is for decision-makers to adapt to an increasingly uncertain and dynamic world by creating a more imaginative discourse, one that welcomes nuance and doubt as spaces for opportunity and transformative change, and sees forecasts as the beginning of a policy or investment discussion rather than the end, and forecasters not as Delphic oracles of outcome, but as the people who know best why attempts at prediction must fall short.

Modelling For Decisions II: Escape From Model-Land: A Guide To Temptations and Pitfalls- Thompson and Smith

Figure 1: A map of Model-land. The black hole in the middle is a way out.

Here’s a link to a paper called “Escape From Model-Land”. It’s written by an economist/modeler at the London School of Economics and a mathematician/statistician at Pembroke College, Oxford and LSE.

From the abstract:

The authors present a short guide to some of the temptations and pitfalls of model-land, some directions towards the exit, and two ways to escape. Their aim is to improve decision support by providing relevant, adequate information regarding the real-world target of interest, or making it clear why today’s models are not up to that task for the particular target of interest.

I like how the authors (Idea 1) distinguish between “weather-like” tasks and “climate-like” tasks. In our world, a “weather-like” task might be a growth and yield or fire behavior model, for which more real-world data is always available to improve the model (that is, if there is a feedback process and someone whose job it is to care for the model). Model upkeep, however, is not as scientifically cool as model development, so there’s that.

Our image of model land is intended to illustrate Whitehead’s (1925) “Fallacy of Misplaced Concreteness”. Whitehead (1925) reminds us that “it is of the utmost importance to be vigilant in critically revising your modes of abstraction”. Since obviously the “disadvantage of exclusive attention to a group of abstractions, however well-founded, is that, by the nature of the case, you have abstracted from the remainder of things”. Model-land encompasses the group of abstractions that our model is made of, the real-world includes the remainder of things.
Big Surprises, for example, arise when something our simulation models cannot mimic turns out to have important implications for us. Big Surprises invalidate (not update) model-based probability forecasts: the conditions I in any conditional probability P(x|I) changes. In “weather-like” tasks, where there are many opportunities to test the outcome of our model against a real observed outcome, we can see when/how our models become silly (though this does not eliminate every possibility of a Big Surprise). In “climate-like” tasks, where the forecasts are made truly out-of-sample, there is no such opportunity and we rely on judgements about the quality of the model given the degree to which it performs well under different conditions.

In economics, forecasting the closing value of an exchange rate or of Brent Crude is a weather-like task: the same mathematical forecasting system can be used for hundreds or thousands of forecasts, and thus a large forecast-outcome archive can be obtained. Weather forecasts fall into this category; a “weather model” forecast system produces forecasts every 6 hours for, say, 5 years. In climate-like tasks there may be only one forecast: will the explosion of a nuclear bomb ignite and burn off the Earth’s atmosphere (this calculation was actually made)? How will the euro respond if Greece leaves the Eurozone? The pound? Or the system may change so much before we again address the same question that the relevant models are very different, as in year-ahead GDP forecasting, or forecasting inflation, or the hottest (or wettest) day of the year 2099 in the Old Quad of Pembroke College, Oxford.

……
(Idea 2) The unpredictable, or Big Surprise. People working with “weather-like” tasks may develop a high level of humility regarding the many unknowns, small and Big, that can happen in the real world. It’s possible that people working with “climate-like” tasks have fewer opportunities to develop that appreciation, or perhaps that they must trudge forward knowing about known unknowns and unknown unknowns. The authors’ discussion of the difference between how economists use model outputs compared to climate modelers (intervening expert judgement) is also interesting.
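Here is a minimal sketch of why the weather-like case allows that humility to develop (my illustration, not the authors’; the persistence “model” and the synthetic data are assumptions): forecasts repeat, outcomes arrive, and a forecast-outcome archive accumulates, so skill can actually be measured.

```python
# Weather-like verification: forecast, observe, record the error, repeat.
import numpy as np

rng = np.random.default_rng(1)

observations = 50 + np.cumsum(rng.normal(0, 1, 200))  # synthetic observed series

archive = []                                          # forecast-outcome archive
for t in range(1, len(observations)):
    forecast = observations[t - 1]                    # naive persistence "model"
    outcome = observations[t]
    archive.append(abs(forecast - outcome))           # record how wrong we were

print(f"forecasts verified : {len(archive)}")
print(f"mean absolute error: {np.mean(archive):.2f}")
# A climate-like task offers no such archive: one truly out-of-sample forecast,
# and no repeated chances to learn how wrong it was.
```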

And the questions we asked in the last post:

“It is helpful to recognise a few critical distinctions regarding pathways out of model-land and back to reality. Is the model used simply the “best available” at the present time, or is it arguably adequate for the specific purpose of interest? How would adequacy for purpose be assessed, and what would it look like? Are you working with a weather-like task, where adequacy for purpose can more or less be quantified, or a climate-like task, where relevant forecasts cannot be evaluated fully? Mayo (1996) argues that severe testing is required to build confidence in a model (or theory); we agree this is an excellent method for developing confidence in weather-like tasks, but is it possible to construct severe tests for extrapolation (climate-like) tasks? Is the system reflexive; does it respond to the forecasts themselves? How do we evaluate models: against real-world variables, or against a contrived index, or against other models? Or are they primarily evaluated by means of their epistemic or physical foundations? Or, one step further, are they primarily explanatory models for insight and under-standing rather than quantitative forecast machines? Does the model in fact assist with human understanding of the system, or is it so complex that it becomes a prosthesis of understanding in itself?”

“A prosthesis of understanding” (for prosthesis, I think “substitute”) reminds me of linear programming models (most notably, in our case, Forplan).