Practice of Science Friday: Research Design Depends On Where You Drop Your Pin

Let’s talk about framing issues, as described in the Wikipedia entry here.

In social theory, framing is a schema of interpretation, a collection of anecdotes and stereotypes, that individuals rely on to understand and respond to events.[2] In other words, people build a series of mental “filters” through biological and cultural influences. They then use these filters to make sense of the world. The choices they then make are influenced by their creation of a frame.

Framing is also a key component of sociology, the study of social interaction among humans. Framing is an integral part of conveying and processing data on a daily basis. Successful framing techniques can be used to reduce the ambiguity of intangible topics by contextualizing the information in such a way that recipients can connect to what they already know.

Framing involves social construction of a social phenomenon – by mass media sources, political or social movements, political leaders, or other actors and organizations. Participation in a language community necessarily influences an individual’s perception of the meanings attributed to words or phrases. Politically, the language communities of advertising, religion, and mass media are highly contested, whereas framing in less-sharply defended language communities might evolve imperceptibly and organically over cultural time frames, with fewer overt modes of disputation.

Two additions to the Wikipedia entry. Scientific communities and disciplines also do framing (they are among the “other actors”). But what the entry misses is that individuals are each entitled to her or his own framing of any issue. We can each drop our pin and define the problem in our own way. But only some of us have access to research funding to investigate and publish on a particular topic. So when folks say “the science says, so we should…,” it can be another case of “authority by privilege,” as people without access to research funding to study questions within their framing remain unheard. Different framings most commonly lead us into talking past each other. Let’s look at an example.

Since the ’80s (at least), folks have been talking about and studying the pros and cons of harvesting trees in Western (and, to a lesser degree, Central and Eastern) Oregon. If you’re in Western Oregon, the framing may be “should timber harvest occur? If so, what practices will sustain the owls and fish, minimize carbon loss, be resilient to climate change, and so on?”

But what if the pin is dropped instead somewhere in the Front Range of Colorado, which is a wood-products sink, not a source, with expanding residential and commercial development? If we were to look for environmentally protective things to do (to use less wood), we could try (1) not allowing people to move here, (2) changing building codes to allow only multi-family housing (from the lumber and heating-and-cooling-efficiency perspective), plus a variety of other planning and development regulation options.

Let’s imagine our fully funded Front Range Sustainable Development Research Center. First we’d gather stakeholders, including representatives of state and county governments, who would decide which questions should be addressed. Perhaps we’d do a study comparing wood and other construction materials in terms of their environmental costs and benefits (or at least try to figure out why existing studies disagree). Environmental impacts would include emissions from transporting the materials to us; the carbon consequences if trees weren’t cut, including the social likelihood that forested areas (say, in the Southern US) would be put to other uses or would simply burn (with more fires due to climate change); and all the other considerations.
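
To make that kind of comparison a bit more concrete, here is a minimal, purely illustrative sketch in Python. Every number in it is a hypothetical placeholder, not a value from any actual life-cycle study, and the Center itself is imaginary; the point is only to show how the "environmental cost" answer turns on assumptions like transport distance and material intensity.

```python
# Toy life-cycle comparison of framing materials for the hypothetical
# Front Range Sustainable Development Research Center.
# All figures are invented placeholders, NOT real life-cycle-assessment values.

# kg CO2e per kg of material, production only (hypothetical)
PRODUCTION_EMISSIONS = {
    "lumber": 0.2,
    "steel": 1.9,
    "concrete": 0.15,
}

# kg of material needed per square meter of floor area (hypothetical)
MATERIAL_INTENSITY = {
    "lumber": 60,
    "steel": 45,
    "concrete": 400,
}

# kg CO2e per kg of material per 1,000 km of truck transport (hypothetical)
TRANSPORT_FACTOR = 0.06


def footprint(material: str, floor_area_m2: float, transport_km: float) -> float:
    """Return a toy carbon footprint (kg CO2e) for framing one building."""
    mass = MATERIAL_INTENSITY[material] * floor_area_m2
    production = PRODUCTION_EMISSIONS[material] * mass
    transport = TRANSPORT_FACTOR * mass * (transport_km / 1000.0)
    return production + transport


if __name__ == "__main__":
    # Compare a 200 m^2 house framed with each material, hauled 1,500 km.
    for material in PRODUCTION_EMISSIONS:
        total = footprint(material, floor_area_m2=200, transport_km=1500)
        print(f"{material:>8}: {total:,.0f} kg CO2e (toy numbers)")
```

Even this toy version hints at why existing studies disagree: the answer depends heavily on what you assume about transport distances, material intensities, and what counts as an impact in the first place, which is exactly what a stakeholder group would have to hash out before any modeling starts.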

But all that is from the environmental perspective, which is not the only one. There are also possible social costs, due to (maybe) higher prices from Canada (cue the various incarnations of the Softwood Lumber DisAgreements). Plus, there are tax revenues and family-supporting jobs that come from producing products in our own country. Our Center would do research on all these aspects of wood product sources and uses, as guided by the stakeholder group.

And we can’t leave out the conditional nature of any scientific knowledge. So the folks working on putting CO2 into concrete might change all these calculations if/when they scale up.

Holy Pseudotsuga! There are a great many things that could be studied, from a variety of perspectives and across a wide range of disciplines! And whether your state or county is a source or a sink might well affect the design of research. And while all these studies will help inform policy, none should determine policy. After all, perhaps the most environmentally preferable option would be to keep people out. But then where would they go, and what would be the impacts there?

So where do you drop your pin, or how do you frame the issue? And what would your research program look like?

One more thing you may have noticed: it’s fairly easy to calculate some impacts, but at some point you are making guesses about what people and technology are going to do or not do. Is it better for decision makers and the public to openly discuss those guesses (say, through scenario planning), or to put the guesses into models, where the information about their uncertainty is lost?
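
As a toy illustration of that trade-off (every number below is invented, including the guess about multi-family housing), compare a model that bakes one guess into a single output with a scenario approach that keeps the spread of guesses in front of decision makers:

```python
# Toy illustration: a point estimate hides the uncertainty in our guesses,
# while explicit scenarios keep it visible. All numbers are invented.

# Guess: fraction of new Front Range housing that will be multi-family by 2040
GUESSES = {"low": 0.2, "middle": 0.4, "high": 0.7}

LUMBER_PER_SINGLE_FAMILY_UNIT = 15.0   # hypothetical units of lumber per home
LUMBER_PER_MULTI_FAMILY_UNIT = 9.0     # hypothetical units of lumber per home
NEW_UNITS = 100_000                    # hypothetical number of new homes


def lumber_demand(multi_family_share: float) -> float:
    """Toy projection of total lumber demand for a given multi-family share."""
    single = (1 - multi_family_share) * NEW_UNITS * LUMBER_PER_SINGLE_FAMILY_UNIT
    multi = multi_family_share * NEW_UNITS * LUMBER_PER_MULTI_FAMILY_UNIT
    return single + multi


# "Model" style: one guess goes in, one number comes out, the uncertainty is gone.
print(f"Point estimate: {lumber_demand(GUESSES['middle']):,.0f} units of lumber")

# "Scenario planning" style: the range of guesses stays visible.
for name, share in GUESSES.items():
    print(f"{name:>6} scenario: {lumber_demand(share):,.0f} units of lumber")
```

The arithmetic is trivial either way; the difference is whether the guess stays visible as a guess or disappears into a single authoritative-looking number.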

Study: Warming Climate, More-Severe Wildfires in the Blue Mountains

Press release about a Portland State University-led study. Open access here, and worth a look. The Management Implications section offers a concise look at the problems.

“…the team’s findings suggest that forest managers should consider projected climate changes and increasing wildfire size, frequency and severity on future forest composition when planning long-term forest management strategies.”

Well, yes, of course! That implies removing timber — commercial harvests.

The team also suggests that in light of the projected expansion of grand fir, managers should continue to reduce fuel continuity through accelerated rates of thinning and prescribed burning to help reduce the extent and severity of future fires.

Reduce fuel continuity — yes, but that’s not enough. Fuel loading needs to be reduced.

^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
News Release

Study: Wildfires in Oregon’s Blue Mountains to become more frequent, severe due to climate change

Portland State University

Under a warming climate, wildfires in Oregon’s southern Blue Mountains will become more frequent, more extensive and more severe, according to a new Portland State University-led study.

Researchers from PSU, North Carolina State University, University of New Mexico and the U.S. Forest Service looked at how climate-driven changes in forest dynamics and wildfire activity will affect the landscape through the year 2100. They used a forest landscape model, LANDIS-II, to simulate forest and fire dynamics under current management practices and two projected climate scenarios.

Among the study’s findings:

  • Even if the climate stopped warming now, high-elevation species such as whitebark pine, Engelmann spruce and sub-alpine fir will be largely replaced by more climate- and fire-resilient species like ponderosa pine and Douglas fir by the end of the century.
  • A growing population of shade-loving grand fir that has been expanding in the understory of the forest was also projected to increase, even under hotter and drier future climate conditions, which provided fuels that helped spread wildfires and made fires even more severe.

Brooke Cassell, the study’s lead author and a recent Ph.D. graduate from PSU’s Earth, Environment and Society program, said that if these forests become increasingly dominated by only a few conifer species, the landscape may become less resilient to disturbances, such as wildfire, insects and diseases, and would provide less variety of habitat for plants and animals.

Cassell said that the team’s findings suggest that forest managers should consider projected climate changes and increasing wildfire size, frequency and severity on future forest composition when planning long-term forest management strategies.

The team also suggests that in light of the projected expansion of grand fir, managers should continue to reduce fuel continuity through accelerated rates of thinning and prescribed burning to help reduce the extent and severity of future fires.

###

The study’s findings were published in the journal Ecosphere. The research team also included Melissa Lucash, a research assistant professor of geography in PSU’s College of Liberal Arts and Sciences; Robert Scheller from North Carolina State University; Matthew Hurteau from the University of New Mexico; and E. Louise Loudermilk from the U.S. Forest Service.

USFS Timber Targets 2019 and Beyond

Interesting item from the American Forest Resource Council‘s Nov. 2019 newsletter. This line drew my attention:

The Forest Service is developing a “market based” approach to timber sale appraisals that will aim to improve alignment between local market conditions and appraisal metrics.

I assumed that they’d been doing this all along. Anyone have insights?

Federal Timber Purchasers Committee Meeting
Last month the Federal Timber Purchasers Committee (FTPC) met in Alexandria, Louisiana with Forest Service and Bureau of Land Management leadership from around the country. The committee meets with agency personnel twice a year to discuss issues pertinent to the timber sale program. The meeting covered topics such as timber sale appraisals, updates to Forest Service manuals and handbooks, and product utilization standards.

The Forest Service had initially established a timber target of 3.7 billion board feet (BBF) for Fiscal Year 2019. That level was assigned to the Regions and then reduced by the Chief of the Forest Service in May to 3.3 BBF for reasons including, but not limited to, the government shutdown, the impact of not receiving the 2018 fire repayment, and delays in hiring new staff. However, the agency target, assigned by the Department of Agriculture, remained at 3.7 BBF. The Regions sold 3.26 BBF, attaining 99% of the adjusted target. The agency as a whole attained 88% of its assigned target. Forest Service leadership emphasized the need to grow in 2020 and anticipates establishing a timber target of 3.7 BBF with a goal of hitting 4.0 BBF in 2021.

These ambitions for growth will likely be augmented by ongoing Forest Service efforts including Forest Products Modernization (FPM), Environmental Analysis and Decision Making (EADM), and Shared Stewardship. Through FPM the Forest Service is developing a “market based” approach to timber sale appraisals that will aim to improve alignment between local market conditions and appraisal metrics. There was general agreement and recognition among all participants that demand for federal timber products remains high and that improvements to the agency’s appraisal practices will help ensure that all economical sales with useful products will sell. Coupled with this effort were recommendations that Regions and Forests improve their access to up-to-date information on product utilization specifications to ensure alignment with local industry standards. Revisions and updates to Forest Service Manuals and Handbooks are ongoing and solicitation for public comment is anticipated to begin this calendar year. Updates on items ranging from timber cruising to stewardship contracting will be rolled out in batches over a six-month period.

Efforts to supplement the agency’s capacity for growth through outside partnerships, generally referred to as Shared Stewardship, were recognized as an integral component of expanding active management. There are currently 10 Shared Stewardship Agreements signed across the country and an additional 10 in progress. Partnering with entities such as State governments and Tribes continues to be a national priority. Forest Service leadership emphasized not just the importance of establishing these agreements but also developing clear metrics that can be used to gauge their effectiveness to further the agency’s mission.

Forest Service tries again on Blue Mountains plan revision

The revision of the three national forest plans encompassing the Blue Mountains of eastern Oregon and Washington is becoming a poster child for failing to finish forest planning.

Northwest Regional Forester Glenn Casamassa announced in March 2019 that the Forest Service was scrapping the proposed Blue Mountains Forest Plan Revision, which includes the Umatilla, Wallowa-Whitman and Malheur. A final draft of the plans had been released in June 2018. It was not the first time the Blue Mountain Forest Plan had been paused. A draft version of the plans was completed in 2014, and received so much backlash that local forest supervisors decided to develop new plan alternatives.

So they’re trying something new:

The Blues Intergovernmental Council has been formed to help frame the process of developing a new methodology for forest planning for the Wallowa-Whitman, Malheur and Umatilla national forests. A series of meetings between county commissioners and key Forest Service personnel have been held across the Blue Mountain region over the past year to help kickstart a framework for cross-jurisdiction work.

“The underlying intent is to ensure that we can develop plans for the three national forests that would provide the opportunity for durable relationships with our communities and to make an important difference on the landscape for the long term,” said Eric Watrud, the forest supervisor on the Umatilla National Forest.  Watrud said the council includes state and county representatives in Oregon and Washington, four treaty tribes and regulatory agencies, in addition to the Forest Service.

“The attempt here is to create just a more open, inclusive approach where the Forest Service is working closely with our communities in order to make sure that we are developing a plan that is gonna stand the test of time,” he said. “We have the responsibility of stewarding the management of these three national forests, which are a national and local treasure. And so there’s a tremendous amount of interest, and our intent is to make sure that we’re incorporating that feedback, incorporating those ideas and local suggestions in order to make sure that we accomplish that goal.”

A better process for local input – that’s ok. But this is obviously “inclusive” of only “local” “communities.”  I assume this is only part of the story (as suggested by the forest supervisor’s careful reference to “a national and local treasure”), but I hope they aren’t (maybe again) setting up expectations that won’t be met.

Weekly litigation settlement summary

It’s been a while since we’ve gotten a Forest Service litigation summary, but it was a busy week for settlements, with three cases involving the Forest Service.

Idaho salmon

A U.S. District Court judge signed off on an agreement between the Forest Service and Idaho Conservation League involving water diversions in the Sawtooth Valley.  The Forest Service has agreed to complete biological assessments of 20 water diversions in central Idaho that the conservation group says could be harming protected salmon. The Forest Service has three years to complete the reviews of the diversions that mostly supply water to homes in the area.  The judge in June ordered the Forest Service to complete the reviews, and the documents filed Thursday spell out a timeline as agreed to by the Forest Service and Idaho Conservation League.

In this case the settlement narrowly concerns the timetable, because the court had already fully resolved the case and told the Forest Service what it needed to do.

Montana bull trout

The Alliance for the Wild Rockies sent a 60-day notice of intent to sue in September over management of bull trout in the East Fork of Rock Creek and in the St. Mary River drainage, alleging the federal agencies managing bull trout in those drainages violated the Endangered Species Act by failing to complete formal consultation to protect the trout, which are listed as a “threatened” population. The notice specified the Bureau of Reclamation in connection with the St. Mary River and the Beaverhead/Deer Lodge National Forest with regard to Rock Creek. The issues here also include water diversions, and additional concerns about flow management and degraded stream channels in Rock Creek. Earlier this month, the agencies sent letters to lawyers for the Alliance, agreeing to undertake the formal consultation.

This is exactly what a NOI to sue under ESA is intended to do.  Technically it’s not a settlement since a lawsuit hadn’t been filed yet, but it keeps the agencies out of the position that the Sawtooth is in (above).  There is no timetable specified; they just need to do the process.

Mountain Valley Pipeline

Before a judge decides whether to approve a $2.15 million settlement of a lawsuit alleging environmental damage caused by building the Mountain Valley Pipeline, state regulators will consider public comments on the proposal. The state agencies are the plaintiffs that sued Mountain Valley, saying the company violated state regulations meant to limit erosion and sedimentation more than 300 times while building the largest natural gas pipeline ever to cross Southwest Virginia.

This settlement involves some substantive elements.  In October, Attorney General Mark Herring announced a settlement that provides a framework for court-ordered enforcement going forward, with the possibility that the financial penalty will exceed the $2.15 million agreement if additional violations occur.  Mountain Valley agreed to conditions — which include hiring independent monitors to make inspections beyond what had previously been required by the state — as part of a consent decree that will soon go to a county circuit judge.

This is just one of the lawsuits against the pipeline; others seek to restore permits struck down last year by the 4th Circuit, one to allow the pipeline to pass through the Jefferson National Forest (described here) and the second for it to cross more than 1,000 streams and wetlands.

 

Carbon Capture and Forests: Might We Co-Design and Co-Produce Research for Realistic Options?

I’ve been following climate science since the mid-’90s, with the inception of the US Global Change Research Program. I remember one of my co-workers, Elvia Niebla, coming back from a meeting and telling us about this source of funding, and our saying “hey, the climate affects everything; you could justify anything under climate.” With the passage of time, researchers who, say, had formerly studied loblolly pine physiology in greenhouses started writing proposals called “effects of climate change on loblolly pine,” based on the same greenhouse work. I don’t think we fully understand how that shading of funding for the last 20 years or so has affected not only what work has gotten done, but how we actually think about climate change, especially relative to less-studied and less-funded problems.

Needless to say, the US and other governments have spent beaucoup bucks attempting to describe future impacts, and not quite so much on fixes to the problem. This could be because it may make sense for fix-oriented funding to all go to DOE for practical fixes, rather than being broadly distributed across agencies and disciplines. That’s why I think that former Energy Secretary Moniz’s effort is so interesting (and is in some current bipartisan bills), as it’s all about fixes. As DOE and its Labs understand so well (yes, because they are “separate,” the Labs are allowed to lobby Congress, unlike the other agencies), telling Reps that some funding will go to their district is a good way to get funding. You can tell this from Moniz’s presentation at this House Appropriations Hearing.

He noted there are three main approaches to carbon removal: “natural techniques,” such as afforestation; “technologically enhanced natural processes,” such as the uptake of carbon in rocks through accelerated mineralization; and purely technological approaches, such as direct air capture, which uses chemical processes to absorb carbon from ambient air. Some removal techniques also require associated storage solutions, such as incorporating the carbon into new products or sequestering it underground. Justifying a broad approach, the EFI report argues it is “too soon to declare a ‘winner’” among the techniques.

In proposing a detailed funding profile for the initiative, Moniz noted EFI drew from a 2018 study by the National Academies that similarly charted a 10 year interagency research agenda for negative emission technologies. He said EFI’s plan is not “jarringly different” from the Academies study, though he said there are some divergences.

Moniz said the initiative would require an annual investment that would begin at about $300 million and peak at $1.4 billion. Of the $11 billion total, $2 billion would go toward large-scale demonstration projects in the latter phase of the initiative. Across agencies, DOE would spend about $5 billion in total, and the National Science Foundation, Department of Agriculture, and the National Oceanic and Atmospheric Administration would spend about $900 million each…..

Moniz acknowledged the price tag for the initiative could be a tough sell in Congress and observed its inclusion of agencies spanning multiple appropriations subcommittees further increased the complexity of gaining the necessary support. He remarked, “The fact that we have 10 agencies involved and six appropriations bills does not make it easier from the process point of view. In other words, the program is not well matched to the silos in Congress.”

However, he noted the federal government has committed to other interagency initiatives costing more than $1 billion annually, such as the National Nanotechnology Initiative, the U.S. Global Change Research Program, and the National Networking and Information Technology R&D Initiative.

IMHO, forests and trees are mostly going to do what they’re going to do: live, die, burn up, regenerate and so on. Our ability to change that on a mega scale (enough to affect large amounts of carbon, with a high degree of certainty over time) tends to be both questionable and expensive. In general, drastically changing land use practices (afforestation where no trees are now) for carbon sequestration is (1) difficult to do and (2) runs into all kinds of social and environmental obstacles. Plus there are no guarantees that our carefully tended carbon uptakers won’t just die due to climate change and/or invasives, or their interaction, in all kinds of unpredictable combos. I suspect what will happen is that, with more funding, some people will dream up possible forest interventions and other people will critique them, which is pretty much the cycle that research is already on: self-sustaining but not necessarily productive of useful policy options. I wonder whether there’s a way to design ourselves out of that cycle, perhaps by co-design and co-development of proposals with people who disagree about the existing proposed forest interventions?

Light my fire?

“You know that it would be untrue
You know that I would be a liar
If I was to say to you
Girl, we couldn’t get much higher
Come on baby, light my fire
Come on baby, light my fire
Try to set the night on fire.”
– The Doors (1967)

The following was posted by the California Chaparral Institute (https://www.californiachaparral.org/), and I thought it was worth sharing:

“Next time you hear someone claim California, the Sierra Nevada (pick your spot) is supposed to burn every 5-10 years or so, or when someone uses the pine forests in the Southeast or Arizona/New Mexico as a model of what is supposed to happen in California fire wise, show them this map. The natural fire return interval in California is pretty low… from nothing to pretty darn infrequent when compared to the rest of the country.”

Practice of Science Friday: The Turn Toward Investigator-Initiated Research in the US, is it Too Late to Turn Back?

Years ago, I worked on the Research Committee of the 7th American Forest Congress. There was a “communities” member on the committee (I believe it was Carolyn Daly) who said, from the perspective of forest communities, “Why are scientists always telling us what we can’t do? Why don’t they help us figure out what we can do?”

For many of us, our scientist heroes were folks like George Washington Carver, who used science to help make people’s lives better. Today there are many scientists who carry on that tradition just fine in all disciplines. However, even within my lifetime, the reins of the science budget have drifted farther away from people who want problems solved. You’d think that Congress could fix that, and they have tried, but the inertia of investigator-initiated research and science-community control is very strong.

Here’s a study by Kyle Myers of the Harvard Business School on how much money it takes to get a scientist to change their research topic.

But, how should we decide what types of research to fund? What diseases, what populations, what methodologies should we focus on? We could leave this up to the scientists themselves. And in the United States, this has become a popular choice. The “investigator-initiated” grant, where scientists propose their own ideas to be evaluated by their peers, is commonplace. This is especially true at the National Institutes of Health (NIH), the single largest funder of biomedical research in the world.

This investigator-initiated approach makes sense. Scientists are, after all, the people who should know the most about what ideas are the most promising. But the incentive structure of science, with its emphasis on priority and prestige, may not lead scientists to prefer the same things as society would like. And, like the rest of us, scientists may be influenced by certain preferences, biases, or other constraints that could prove misaligned with social goals. It is not surprising then that many countries, and notably the EU’s Horizon Europe research framework, rely on more “top-down” or “mission-oriented” styles where policymakers directly allocate funds to specific topics.

The NIH balances this tradeoff by using a combination of both investigator-initiated grants, as well as a number of “targeted” grant mechanisms that solicit proposals for particular types of research. These targeted mechanisms request ideas that focus on a particular disease, methodology, or population, and have become increasingly popular (see fig.1). But the NIH, and most other scientific funding agencies, have long assumed that scientists will be willing to adjust their research trajectories in response to these sorts of targeted grants or mission-oriented policies. However, whether these adjustments actually occur in practice – do scientists do what policymakers ask them to? — and just how costly they are to induce, has been unclear.

It’s interesting that the EU is apparently more open to these “mission” ideas. An op-ed here from Marianna Mazzucato of University College London.

The good news is that we don’t have to look very far for tangible lessons. Most of the smart products we have in our bags and pockets came from investments that were more far reaching than a simple “science-push” explanation provides. They came from the ability to connect science to solving concrete problems – that is, through “missions”…

Today we have the opportunity to direct innovation in similar mission-led ways, which will be as bold as the moonshot programme was, but will instead be aimed at the multiple social and technological challenges we have. These will be inspired not by Cold War challenges, but by what one could call the war on poverty, the war on climate change and the urgent need to create societies that are more just and sustainable.

Today’s political leaders are not short of societal challenges that they can turn into concrete missions: climate change, ageing populations and rising inequality. I have been advocating a mission-led approach in the European Union and setting out potential missions for a plastic-free ocean, carbon-neutral cities and decreasing the burden of dementia. These are all significant challenges of our time that need bold and inspirational leadership.

Missions are set at the top without being prescriptive on what the innovation required to solve the problem must be. They then facilitate bottom-up innovation to achieve the goal. We need to use the full power of government instruments – from prize schemes to procurement – to crowd in multiple bottom-up solutions. The moonshot, for example, required innovations across different sectors to be successful, including nutrition, computing and clothing as well as spaceflight.
It will also be important not to ignore the humanities and social sciences in missions.

I’d only argue that the social sciences are more than “not to be ignored.” Many new technologies have foundered on the shoals of cost or public acceptance. Maybe Mazzucato, as an economist, was unwilling to toot her own discipline’s horn.

Of Woodpeckers and Salvage Harvests

New Rocky Mountain Research Station publication of interest, “Of Woodpeckers and Harvests: Finding Compatibility Between Habitat and Salvage Logging.” Not online yet, but linked here in our library.

The line from one of the researchers sums it up: “[T]he logging treatments essentially accelerated the habitat conditions some woodpecker species prefer while not compromising the habitat needs of others.”

 

150,000 acre “project” on the Bitterroot

Well, not exactly, maybe. This could be a good example of how to get the public involved early enough in the process that the locations for timber harvest have not yet been determined. But consider that the decision-maker is the same one who applied “condition-based” NEPA analysis to the Prince of Wales area of the Tongass, which has ended up in court.

Bitterroot National Forest Supervisor Matt Anderson has added a new “pre-pre-scoping” stage to the process, not part of the traditional process in which a set of options is presented to the public for review and analysis.

The new approach is meant to get the public involved prior to coming up with any specific actions being planned for any specific location.

That much I like the sound of.

“There is confusion,” said Anderson. “It’s hard for the public to get involved. We are asking ‘What do you want to see? What’s your vision?’” He said the agency was “starting at the foundational level, not any particular location.” He said it was important to get to those particulars but the way there was to first describe the “desired future condition that we want and then look at the various ways we can achieve it.”

Asked about the fact that the current Forest Plan describes a desired future condition for the Bitterroot Front that involves returning it to primarily a Ponderosa pine habitat with little understory, Anderson said that is in the current plan, but that the plan is about 30 years old. He said a lot has changed in that time on the ground. There have been lots of fires and areas where no fires have occurred, and the fuel load has gotten extremely high. He said current conditions need to be assessed and they were currently compiling all the maps and other information they need to get an accurate picture of what is on the ground today in the project area.

This should raise a concern about how this process relates to forest planning, since forest plans are where decisions about desired conditions are made.  However, old forest plans typically didn’t provide desired conditions that are specific enough for projects, so that step has occurred at the project level.  Under the 2012 planning rule, specific desired conditions are a requirement for forest plans, but the Bitterroot National Forest is not yet revising its plan. Whatever desired conditions they come up with should be intended as part of the forest plan, and the public should be made aware of this.  If the new decision is not consistent with “Ponderosa pine habitat with little understory,” they’ll need an amendment to be consistent with the current plan.  (I’d add that changes in the on-the-ground conditions over the last 30 years shouldn’t necessarily influence the long-term desired condition.)

“The Tongass is so different than the Bitterroot,” said Anderson. “There is not much similarity. I’m not trying to replicate that process here. It was a conditioned-based process up there. It’s like comparing apples to oranges.” In reference to conditioned-based projects, he said, “One difference with this project is that some of that will be pre-decision and some of that will be in implementation. We are trying to shift some of the workload to the implementation stage.”

He said they have a slew of options, from traditional NEPA, to programmatic NEPA to condition-based NEPA “and we are trying to figure it out.”

He insists that the NEPA process will be followed with the same chance for public comment and involvement on every specific project that is proposed in the area.

There are some ambiguous and possibly inconsistent statements there. Condition-based NEPA seeks to avoid a NEPA process “on every specific project.” I could also interpret shifting workload to “pre-decision” and “the implementation stage” as a way to take things out of the NEPA realm.

And then there’s this:

In response to the notion that the huge project is being driven by timber targets and not health prescriptions, Anderson said that the Regional Office had set some timber targets for different areas of the region, but that those targets were not driving the analysis. “This project has nothing to do with meeting any target,” said Anderson.

This feels a little like “There was no quid pro quo.” Would timber harvested from this project not count towards the targets? (I’d like to see targets for achieving desired conditions.) All in all, this project would be worth keeping an eye on.

(By the way, here’s the latest on Prince of Wales.)