Secretary of Agriculture Remarks at Science Forum

Secretary of Agriculture Tom Vilsack opened the second day of the science forum this morning, emphasizing his attention to the Forest Service planning rulemaking effort. He said that the Obama administration believes in science. He noted that forests are a source of enormous economic opportunity, and a source of water for economic livelihoods and drinking. This rule is an important blueprint. He said that past rulemaking processes haven't been collaborative, resulting in litigation and misguided policies while forests have suffered.

Vilsack said that people are surprised that he even knows about the Forest Service.  But he called the Department “an every way every day agency” which is not just about farming.  He said many people don’t even know the Forest Service is in Agriculture.  But he said that this is a personal matter – when he first got the job, his son in Colorado (an outdoor enthusiast) told him that he’s got responsibility for the Forest Service.  Vilsack later talked about his interest in forestry after family experiences in Iowa planting pine trees and watching them grow.

He told the audience that they were all important in this process. Vilsack said that the President is focused on this as part of the conversation around his America's Great Outdoors initiative and an upcoming White House summit. The country faces obesity issues, and part of the problem is that people don't spend enough time in the great outdoors. The planning process is part of this larger dialogue.

 

K.I.S.S. in Rule Form, Part 5

In drafting these K.I.S.S. model rules, I look first at what the original 1979 and subsequent 1982 rules have to say on each subject. I use the 1979 rules because I have a ragged paper copy of that day's Federal Register with the rules in it. This heirloom was given to me when I was hired as an assistant to teach NFMA planning to Forest Service interdisciplinary teams. My boss told me to read the rules, which were hot off the press, and be prepared to "teach" them the following week. I look to the 1982 rules because they are the rules under which all forest plans were promulgated.

It was with some amusement that I noticed, for the first time, that the 1982 rules fail to faithfully implement NFMA’s nominal public participation requirement (see the link’s paragraph (d)). I have fixed that problem below:

36 CFR 219.6: Public Participation

(a) The revised land management plan shall be made available to the public electronically and at convenient locations in the vicinity of the unit for a period of at least three months before the revised plan is adopted. During this three-month period, public meetings at locations that foster public participation in the plan revision shall be held.

(b) If the land management plan revisions, singly or in combination with the vegetation management and timber harvest program, require an environmental impact statement pursuant to Section 102(2)(C) of the National Environmental Policy Act (“NEPA”), 42 U.S.C. § 4321 et seq., the public participation process set forth in Council on Environmental Quality regulations, 40 CFR Part 1500 et seq., shall be followed.

(c) In addition to the requirements of (a) and (b) above, other processes suited to the decisions made in the plan revision, including the vegetation management and timber harvest program, may be used to involve the public.

Science Forum Panel 2: Landscape Models and Monitoring

This afternoon, the second panel at the science forum addressed the technical questions of landscape scale modeling and monitoring, and the related question of adaptive management.

Eric Gustafson of the Forest Service Northern Research Station started the panel by describing the merits of landscape models, which he called the "computational formalism of state-of-the-art knowledge." (slides pdf) He said that well-verified models are as good as the science they reflect. They are a synthesis of the state of the art. They can be generalizations, but taken as general expectations they are useful for research and planning. Models can account for spatial processes and spatial dynamics, consider long temporal scales and large spatial scales, and make predictions about the expected range of future states (composition, pattern, biomass). These models don't accurately predict individual events.

Gustafson mentioned the "pathway based succession models" often used in the West (VDDT/TELSA, LANDSUM, SIMPPLLE, RMLANDS, Fire-BGC, FETM, HARVEST). In these models, the disturbance simulation may be process-based, or mechanistic. He has direct experience with a different type of model, the "process based successional model" (LANDIS, LANDIS-II) for eastern forests with less predictable successional trajectories. He said that climate change, or novel conditions, may require a process-based approach. For these models, the landscape scales and long time frames mean that observable validation is not possible, so modelers validate independent model components that are as simple and discrete as possible, then verify component interactions, and compare model behavior with known ecosystem behavior. Modelers also conduct sensitivity and uncertainty analysis. Open source models are preferable because their code is available for many eyes to spot a problem.
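The component-wise validation and sensitivity analysis Gustafson describes can be illustrated with a toy one-at-a-time (OAT) sensitivity sweep. Everything below is hypothetical: the succession model, parameter names, and values are invented for this sketch and come from none of the models listed above.

```python
# Illustrative one-at-a-time (OAT) sensitivity sweep on a toy
# succession model. The model form, parameter names, and values
# are hypothetical, invented for this sketch.

def biomass_at_100yr(growth_rate, disturbance_prob, carrying_capacity):
    """Toy stand model: logistic growth with an expected annual disturbance loss."""
    biomass = 10.0  # starting biomass (arbitrary units)
    for _ in range(100):
        biomass += growth_rate * biomass * (1 - biomass / carrying_capacity)
        biomass *= 1 - disturbance_prob  # expected fraction lost to disturbance
    return biomass

baseline = {"growth_rate": 0.05, "disturbance_prob": 0.01, "carrying_capacity": 300.0}
base_out = biomass_at_100yr(**baseline)

# Perturb each parameter by +10%, one at a time, holding the rest fixed.
sensitivity = {}
for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.10})
    sensitivity[name] = (biomass_at_100yr(**perturbed) - base_out) / base_out

# Rank parameters by how strongly the output responds to each.
for name, rel in sorted(sensitivity.items(), key=lambda kv: -abs(kv[1])):
    print(f"{name}: {rel:+.1%} output change per +10% input change")
```

If one parameter dominates the ranking, that is where validation effort and monitoring precision matter most; Gustafson's point about validating simple, discrete components maps naturally onto testing such pieces in isolation.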

Gustafson said that models have been used to compare outcomes of management alternatives, looking at plans once developed, or to compute the effects of proposed management (species and age class, biomass, spatial pattern, habitat for a species of interest).  Gustafson prefers looking at comparisons of alternatives rather than absolute projections. 

When projecting the impacts of alternatives, he suggests a focus on appropriately large spatial and temporal scales for evaluating ecosystem drivers and responses;  accounting for any important spatial dynamics of forest regenerative and degenerative processes; and accounting for interactions among the drivers of ecosystem dynamics and conditions (disturbance, global changes).

Steve McNulty of the Forest Service Southern Research Station talked about climate change and water. (slides pdf) He began with some observations about addressing water resources. First, if we only focus on climate change and water resources, the forest plan will fail. We need to look at the big picture. Second, if we consider water as a stand-alone ecosystem service, the forest plan will fail. If we just stress water, then we'll miss the impact on forest growth, carbon, or biodiversity. Also, if we only look at the effects of climate change on water, we'll miss more important factors affecting water quantity: population change, changes in the seasonal timing of precipitation, vegetation change, and all the sectors of water demand.

McNulty used as an example the results from a model called WaSSI (Water Supply Stress Index) which looks at environmental water use and human water use.  The model shows that the vast majority of water is used by the ecosystem (only a sliver of 2 to 5 percent is human use).  Human water use in the West is mainly used for irrigation.  The WaSSI model looks at both climate change and population change.

McNulty also gave other examples of the breadth of the analysis: Groundwater loss can be a factor, and it doesn't matter what climate change does to surface flows if we've lost our groundwater. Another example is the interaction between land use change and climate change: reducing irrigation by 20% has as big an impact as climate change. McNulty also described the relationships between water availability, carbon sequestration, and biodiversity. The more you evapotranspire, the more productivity you have; that's good for carbon storage but bad for water yield.

McNulty also mentioned the TACCIMO project (Template for Assessing Climate Change Impacts and Management Options), a joint effort of the Southern Research Station and the Forest Service Southern Region. This project compiles current forest plans and climate change science, and you can search by geographic location to obtain a planning assessment report.

Ken Williams, the chief of the USGS Cooperative Research Units, spoke about adaptive management in the Department of the Interior. (slides pdf) Ken was one of the authors of the DOI adaptive management technical guide.   The DOI has worked for many years to develop a systematic approach to resource systems, recognizing the importance of reducing uncertainty.  The DOI framework can be useful to forest planning.

The management situation for the Forest Service involves complex forest systems operating at multiple scales; fluctuating environmental conditions and management actions; decision making required in the near term; uncertainty about long term consequences; and lots of stakeholders with different viewpoints.  There are four factors of uncertainty: environmental variation; partial controllability; partial observability; and structural uncertainty (lack of understanding about the processes).  All of this uncertainty limits the ability to manage effectively and efficiently.

The DOI approach emphasizes learning through management and adjusting management strategy based on what is learned (not just trial and error); focusing on reducing uncertainty about the influence of management actions; and improving management as a result of improved understanding. Williams said we need to understand that this is adaptive MANAGEMENT and not adaptive SCIENCE. Science takes its value from the contribution it makes to management.

Here are the common features of an adaptive management approach: a management framework; uncertainty about management consequences; iterative decision making; the potential for learning through the process of management itself; and potential improvement of management based on learning.  Adaptive management implementation features the iterative process of decisionmaking; followed by monitoring; followed by analysis and assessment; followed by learning; followed by future decisionmaking. At each point in time, decisions are guided by management objectives; assessments provide new learning, and the process rolls through time. 
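The iterative cycle Williams describes (decision, then monitoring, then assessment, then learning, then future decisions) can be sketched in miniature. The two competing "models," their predicted success rates, the actions, and the canned monitoring record below are all hypothetical illustrations, not the DOI guide's formalism:

```python
# Minimal sketch of the iterative phase of adaptive management:
# each cycle, a decision is made from current beliefs, monitoring
# yields an observation, and assessment updates the beliefs.
# The models, actions, and probabilities here are hypothetical.

# Structural uncertainty: two competing models of system response.
# Each maps a management action to the probability that monitoring
# will show improvement.
models = {
    "model_A": {"thin": 0.7, "burn": 0.4},
    "model_B": {"thin": 0.3, "burn": 0.6},
}
belief = {"model_A": 0.5, "model_B": 0.5}  # initial model weights

def choose_action(belief):
    """Pick the action with the highest belief-weighted predicted success."""
    return max(["thin", "burn"],
               key=lambda a: sum(belief[m] * models[m][a] for m in models))

def update_belief(belief, action, observed_success):
    """Bayes update of model weights from one monitoring observation."""
    posterior = {}
    for m, prior in belief.items():
        p = models[m][action]
        posterior[m] = prior * (p if observed_success else 1 - p)
    total = sum(posterior.values())
    return {m: w / total for m, w in posterior.items()}

# Iterate: decide -> monitor -> assess -> learn -> decide again.
# A canned monitoring record stands in for field data.
observations = [True, True, False, True, True, False, True, True, True, False]
for observed in observations:
    action = choose_action(belief)
    belief = update_belief(belief, action, observed)

print(belief)  # weight shifts toward the model the data supports
```

Because the monitoring record here happens to favor "model_A"'s predictions, its weight grows with each cycle; in Williams's terms, that is the "improved understanding over time" that in turn improves future decisions.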

There are two key outcomes: improved understanding over time, and improved management over time. Williams described two stages of management and learning: a deliberative phase (phase I) with stakeholder involvement, objectives, potential management alternatives, predictive models, and monitoring protocols and plans; then an iterative phase (phase II) with decision making and post-decision feedback on the system being managed. Sometimes you have to break out of the iterative phase to get back to the deliberative phase.

What about the fit to forest planning? Williams noted that forests are dynamic systems; that planning is mandated to be science-based and stakeholder driven; and that planning acknowledges the importance of transparency, acknowledges uncertainty, recognizes the value of management adaptation, and builds upon experience and understanding. Every one of these planning attributes is a hallmark of adaptive decisionmaking.

The last panelist was Sam Cushman from the Forest Service Rocky Mountain Research Station. (slides pdf) Cushman cited several of his papers on the problems with generalizations and the need for specific, detailed desired conditions and specific monitoring. He said that his research shows the problem with the indicator species concept: no single species can explain more than 5% of the variability of the bird community, and even select poolings of species fail to explain more than 10% of the variability among species. He said that ecosystem diversity cannot be used as a surrogate for species diversity. In practice there is a conundrum: there is a limit to the specificity with which you can describe diversity; because we have limited data, we can only define diversity very coarsely. His research shows that habitat explains less than half of species abundance. Cover types are inconsistent surrogates for habitat. Vegetation cover type maps also fail to predict the occurrence and dominance of forest trees: his analysis found they explain only about 20% at best, and in some cases only 12%. Classified cover type maps are surprisingly poor predictors of forest vegetation: 80% of variability in tree species importance among plots was not explained even by a combination of three maps.

To be useful, desired conditions statements should be detailed, specific, quantitative, and appropriately scaled. Research shows that the detailed composition and structure of vegetation and seral stages, and their pattern across the landscape, have strong relationships with species. Cushman described an example looking at grizzly bear habitat, looking at detailed pixels rather than patches. Cushman also discussed his research on species gene flow in complex landscapes, and his research on connectivity models.

Cushman added that monitoring is the foundation of adaptive management. If desired condition statements are specified in detail with quantitative benchmarks at appropriate scales, we then need good monitoring data: representative data that is carefully thought out. We need large samples for precision, recent data, appropriate spatial scale, standardized protocols, statistical power, and high precision. We need to look at the long term. Adaptive management needs to be supported by monitoring, and it must be funded.

Science Forum Panel 1: Landscape Ecology

The first panel of today’s science forum emphasized landscape scale planning across multiple ownerships.

Tom Sisk from Northern Arizona University began with the word: "practical". (slides pdf) The next 20 years of the science about landscape scale management will focus on transparency, inclusion, and public deliberation. We are learning to "scale up" our management because of our experience with uncharacteristically large fires and changes in fire regimes. Sisk showed a map of the large footprint of the Rodeo-Chediski fire in the state of Arizona. The fire spread until it ran out of forest. Also, wide-ranging species operate at landscape scales, especially when land uses have the potential to fragment habitat. The protection of diversity will require a continued focus on viability, but with advances in monitoring and assessment, we can estimate species distribution from presence/absence data (which was done for goshawks on over 2 million acres in Northern Arizona). Finally, exotic species are changing ecosystem dynamics across the continent. Sisk pointed out the predicted occurrence of cheatgrass using preliminary maps of areas most vulnerable to invasion. He added that the covariates driving these models involve climate, which makes them useful in predicting effects of climate change.

Later during the panel discussion period, Sisk said that an agreement in Northern Arizona brought people together. Those efforts filtered into a state-based process. The State office took the next steps, which led to a statewide strategy. Sisk said that different people who participated in the overall process had different levers they could use to shape the decisions. He added that there was a pulling of people together through the various jurisdictions. For scientists, these processes can be a challenge since they provide the analytical framework, matching analysis to questions. He said that data in Landfire and other programs help provide capacity.

Sisk also pointed out the collaborative work being done on 3 million acres in northern New Mexico.  He said that one way to address the historic controversies is to empower people with information.  This new science can reside in everybody’s minds.   Land ownership can fragment our thinking rather than looking at integrated landscape properties.  Sisk described working with 50 people in breakout groups to identify priority areas for management attention, and recommending types of management in those priority areas.   Next, they looked at the predicted effects and what the landscape would look like. 

Sisk said that science must be transparent, but not dumbed down. It must be rigorous, repeatable, and defensible so that it will inspire confident action. The public must own the science if they are to trust and accept decisions based on it. The planning process should provide a predictive capacity and allow exploration of alternative scenarios. Scientists should inform and not attempt to dictate decisions. He said that landscape ecology is a mature field, but it will require a bold attempt to embrace multiple scales.

Jim Vose from the Forest Service Southern Research Station talked about watershed science. (slides pdf) Much of what is known is at small scales, because watersheds are so complex. The challenge becomes taking the small scale knowledge and moving it to the watershed and landscape scale. He added that watersheds don't operate in isolation; they are connected. He said that disturbances and habitats go beyond watersheds. Movement of materials is the primary way that watersheds interact, and hydrologic flowpaths are the primary way that small watersheds interact with larger watersheds.

Water is an excellent integrator of watershed health: the cleanest water comes off certain landscape types. As water flows through land uses, water quality declines. Landscape design and configuration have a big effect on water quality. Any type of land management alters water processes and impacts resilience. Most best management practices (BMPs) deal with the physical aspects of land disturbance.

Vose described innovations in linking “bottom-up” (streamflow gauging) and “top-down” (satellite imagery) technology.  There is more data from the last 20 years than the previous 80 years.  There is a variety of sensing technology, which can be used to develop monitoring networks.  Networks can go to satellites, to monitor what’s going on with landscapes.  We can quantify watershed processes, as data comes together from the bottom to the top, all the way to landscapes.  We can address the relationship of large scale carbon and water cycling.  Vose also mentioned the importance of large networks of experimental forests and the new megamonitoring network of the National Ecological Observatory Network (NEON).  Regarding climate change, Vose mentioned that we can look at long-term data in new ways. 

Vose said that we need to focus on the Where? How? What? and Why? of restoration. The "where" is really important (it's not random acts of restoration). We need to look at high risk areas and "avoid the unmanageable and manage the unavoidable." Finally, the really big challenge is the "how": looking across ownerships requires partnerships and approaches, and recognition of connectivity.

Connie Millar from the Forest Service Pacific Southwest Station said that we need to confront all the dimensions of climate change, not just greenhouse gases. (slides pdf) We need to understand that climate is a driver of change. Climate often changes in cycles, nested at different scales from annual changes to millennial changes. Different physical mechanisms drive changes at different cycles, like an orchestra where the music is the combined forces of instruments: individual cycles, gradual or directional, extreme or abrupt. However, species and ecosystems respond to the orchestra with "a cacophony of responses."

Forests we manage are still responding to the long term cycles of history that they have encountered.  That makes a difference in how we can address climate in the future. For instance, when looking at glacial-interglacial cycles, we have evidence of over 40 cycles of greenhouse gas levels.  Over time, species have moved up and down in elevation to respond to climate (north and south in the east).  There are enduring changes in genetic diversity which we’re still seeing today.   There have been a number of ecosystem “flip flops”: some transitions had rapid change, analogous to what we have now.  Some warming has occurred in less than 75 years, with replacement of whole forest types.

We've had 1,000-year cycles due to changes in solar activity, and we've just come out of a cold phase: the Little Ice Age ended around 1920. Before that, there was the Medieval Climate Anomaly, with some warming and century-long droughts. Tall fast growing trees are now above tree line due to a change over 650 years, the lifespan of a single tree. Their size and survival now depend upon their area of origination, not where they currently reside.

Millar said that the shortest scale changes are the decadal or annual scale changes. The 20-50 year cycles affect parts of the world differently through internal ocean circulation and ocean temperature. The shortest is the El Niño/La Niña 2-8 year cycle. She said that anthropogenic forcing is important at these shorter time scales. We could see type conversions, invasions into barren lands, massive forest dying, higher elevation mortality as insects move upslope, and fires occurring in winter. There are also important aspects of climate change at local scales, driven either by natural forces or greenhouse gases. Millar showed examples of ice cover in the Great Lakes affecting shipping, and noted that 70% of wildfires in the West occur during La Niña events within decadal cycles. For local management at small scales, local climate responses can be decoupled from effects at regional scales. For instance, the American pika (the temperate zone's polar bear) can escape warm temperatures in the air-conditioned ground.

Millar said that climate is a fundamental architect of ecosystem change in all its flux and dynamics. Historical conditions are an increasingly poor reference, and we need to look forward. Also, human land use and land cover create new challenges.

Later during the panel discussion period, Millar said that the planning rule should address climate as a macro-disturbance process.  The rule should provide for adaptive capacity of species (species ranges and species diversity) by understanding climate’s role.

Millar also answered a follow-up question from the audience about definitions. She advised using terms precisely. The term "resistance" describes ways to armor species in the short run against climate effects. The term "resilience" is the most widely interpreted term, ranging from the meaning used in the field of engineering to the idea of returning something to its previous state. It can be thought of as the capacity of a system to absorb change without changing its state. Millar said that resistance and resilience are mid-term strategies. Meanwhile, "restoration" can be a range of options from small scale to large scale. It will be important to define what we mean.

Max Moritz from UC Berkeley concluded the first panel with a discussion of fire ecology. (slides pdf) He  said that the conventional wisdom is that climate change has resulted in increased fire activity because of warmer spring temperatures and earlier snowmelt.  But for shrublands, we haven’t seen the same effects.  Fire regimes respond to changes to multiple climate variables (temperature, precipitation quantity, and timing).

We’ve been worried about the problem of fire exclusion.  But scales of climate change and fire are quite broad.  Fires cross administrative boundaries, and different ecosystems may be adapted to very different fire regimes. 

We also only think about one kind of fire, a low severity fire.  Yet a lot of ecosystems have different fire sensitivities even to these types of fire.

Moritz said he liked the ecosystem service framework, which can give us common terms and currencies.  For fire, we need some common definition and understanding of both beneficial and detrimental roles.  Healthy?  Resilience?  A lot of the definitions and a lot of the goals can actually contradict each other.

Moritz described a case study of range shifts to show the benefits and detriments of fire. The rate and intensity of climate change is important relative to the plasticity and dispersal ability of species. Many species will have to move 1 km per year to keep up with their suitable climate envelope. There is a huge extinction potential. Moritz also mentioned that many habitat models are missing the effects of fire. He said that we might see large scale invasions and retreats of fire. As far as species movement, fire can hasten "trailing edge" losses and the benefits of habitat change.

Regarding how fire releases carbon, he said we tend to emphasize stocks instead of fluxes. Ecosystems have a carrying capacity, and forests will regain a standing carbon stock. Fire is a temporary blip; if you factor in regrowth it's not quite as drastic. Black carbon or "biochar" charcoal is a relatively stable form of carbon: with every fire some carbon is fixed relatively permanently.

Moritz said that the ongoing fragmentation of land ownership is a problem, especially when landowners have a decreasing interest in ecosystems and the resources needed to reduce fire related consequences.

Finally, Moritz said that there are lots of uncertainties. Fire regimes are fuzzy concepts to begin with. There is often almost no knowledge of some areas: for instance, we have very little info on wind patterns. However, the broad scope of the planning rule is a plus.

Discussion

After the presentations, there was a panel discussion and question and answer period.  The discussion centered on the topics of uncertainty, scale of analysis, integration of science with local knowledge, and the value of historical data in light of climate change.

Uncertainty

Sisk observed that uncertainty is one of the two or three key pieces of landscape planning. Scientists try to reduce uncertainty, which is a foreign concept to nonscientists. A scientist must take responsibility for being clear about that. People aren't ready to receive uncertainty: they are ready to receive the data, but combining data sources, each with its own level of uncertainty, can blow up the whole process. Vose added that there is a tremendous amount of pressure on scientists to tie practice to science despite uncertainty. Sisk observed that scientists can relate to a scenario-based exploration with a sensitivity component.

Sisk emphasized that adaptive management and management flexibility aren’t the same.  Adaptive management is a structured process where you collect monitoring data and feed it into a learning structure established in the plan a priori.  The results of data collection constrain the decision space.  So adaptive management is less flexible.  

There was a followup question from the audience about the uncertainty of science.  Sisk said that the literature has shown for a long time that technical experts can be wrong.  He added that science informs the citizens and leaders in a democracy.  How to interpret uncertainty is best left to deliberation in a democratic society. Millar added that having panels when there are known differences in the science is important.  It’s important to display the differences in a transparent process.

One audience participant expressed frustration at Forest Service project NEPA documents that fail to address climate change because the issue is uncertain.  Millar said that there always is insufficient knowledge, so that is a cop out.   When addressing climate change we need to think about nesting a local analysis within larger spatial scales, and work with the knowledge we have.  Vose said that some body of information can be synthesized.

Sisk said that when we lack information we still need to make tough decisions. Sisk said that the "Precautionary Principle" has been used in Europe and was mentioned at the Rio conference: in order to protect the environment, a precautionary approach should be applied according to capabilities, and a lack of full scientific certainty should not be used as a reason to postpone environmental protections. Later, there was a follow-up question from the audience about whether we should be using adaptive management instead of the Precautionary Principle. Sisk reminded everyone that the Precautionary Principle cuts both ways: both taking and not taking action. We should not let uncertainty stop us in our tracks.

Regarding adaptive management, Vose gave examples of testing that was done, and talked about the importance of the network of experimental forests.  He said we should go slow, but test, monitor, and adapt.  He said that models are becoming increasingly sophisticated for making projected assessments.  Moritz said that in the absence of knowledge, if we can identify thresholds that we can seek and test for, we can provide some boundaries.

Scale of Analysis

At the beginning of the discussion period, Sisk noted that one thread throughout all of the talks was the need to be able to work at different scales to do intelligent planning.  Vose said that the scale of operation is the larger scale.  Sisk said that for the West, THE issue is water and the future of water resources.  Water influences the level of organization and how we scale planning.  John Wesley Powell had this figured out.  In some areas watersheds make sense, other areas they don’t, as an organizing tool.  Millar added that we should also crosswalk our analysis with nested temporal scales.  Moritz added that although “firesheds” is a term that is not well defined, it brings up this idea of nested scales.  But we first need a common language and common definitions.

There was a question from the audience about NFMA planning at a forest scale.  Vose said that watersheds are all relative, and you need to look at all scales of watersheds in planning, but a common focus is the 8th level HUC.  The real challenge is when you mix National Forest System and private land with different land uses, with no jurisdiction or a means to work in a coordinated way. 

Integrating Science with Other Sources of Knowledge

One audience participant was concerned about science being compromised in local collaborative processes. The question is "how can the public own the science while keeping science sound?" Sisk said that science has high social relevance, and science efforts are supported by society. Training is important. Scientists have a responsibility to get their information back into the social side of decisionmaking. We need to make a concerted effort to improve the integrity of the knowledge. It's the responsibility of scientists to participate in the discussions and talk about the valid use of their science. In local collaborative processes, when conducted in a respectful and inclusive manner, there is a receptivity to science. This isn't the case when decisions "based on science" are laid out for people. Sisk mentioned the idea of "braiding" knowledge, referring to an example of Hopi collaboration. It's not watering down the knowledge, but braiding the knowledge together, which is stronger. Sisk said this is the most promising vision to address these huge challenges. Millar added that there is a long history of observation among tribes.

Moritz said that most scientists working at the landscape scale are not as much reductionists as other scientists.  At this scale, the problems are all connected.  Vose added that the most successes are when landscape and smaller scales come together.  The NEON program is an attempt to bridge that gap.

There was a later followup question from the audience about communications with the public. Moritz said that you can't expect the public to dig through the journals, but the public has to have some level of literacy. It has to go back and forth. Millar talked about the role of a "boundary spanner", someone who can communicate from the formal science to the applied world. We might not have answers at the scale of the operational level. Sisk added that a collaborative process can bridge communication. Of course some collaborative processes are imperfect, but communication in collaborative processes is not engineered; it takes place more organically. Moritz added that sometimes the science is out there, and there is a responsibility for scientific literacy.

Role of the Historic Range of Variability (HRV) Assessment Given Climate Change

There was a question from the audience about relying on an HRV assessment in a landscape analysis when climate change may be creating conditions not previously experienced.  Millar said that we can still be informed by historic processes.  The HRV concept is still important, but we need a different pair of glasses.  We need a realignment that uses knowledge from the past about where we’ve been, looking for earlier patterns that are more similar to where we are going.

Moritz said we need to be educated by HRV, but not be blinded by it.  There are two pieces of info: where we came from and where we are going.

Climate Change Strategies

Moritz said that the idea of assisted migration of species is relatively new and scary.  After a fire, if a species has taken a big hit, assisted migration might be a tool.  Millar said that extreme events have historically produced major changes in ecological trajectories.  Large-scale disturbances are opportunities for quickly catching up.  We can enable succession rather than force it.  Systems can reequilibrate or catch up.  Sisk said that our first line of defense should be to maintain connectivity so that organisms can move on their own.  Some animals and long-lived plants can do that better than we think.  Regarding extreme events, Vose said that we can understand hydrologic responses to extreme events.  There are tools and models available.

Science Forum Introductory Remarks

The two-day science forum began this morning with introductory remarks from Harris Sherman,  Department of Agriculture Undersecretary for Natural Resources and Environment.  Sherman will be the responsible official for the planning rule.   Sherman began his remarks by saying that the planning rule must be rooted in integrity – it must be based on sound science. 

He mentioned how the Secretary of Agriculture has focused on restoration – 115 million acres of NFS lands nationwide are in need of restoration.  For the remainder of the lands, we need to maintain forest health.  We need to address these challenges through landscape approaches.  We also sometimes overlook the importance of forests for water.  Over 60 million people get their water from forests.   We need to think about climate change, and we might mitigate the effects, or adapt to the changes.  We need to be concerned about the diversity of species.  Also, job creation is an important part of this – jobs in restoration or biofuels. 

Sherman said that collaboration is important, and it’s exciting to see the way that people are coming together.  He said that there is common recognition that we have serious problems.  He said that good science underpins each of these areas.  Without good science we’ll have a flawed rule that people will not buy into.  Sherman concluded by emphasizing vision, integrity, and support: developing a rule that will survive legal challenges.

Other introductory speakers included Ann Bartuska, FS deputy chief of Research and Development.   Bartuska pointed to the concept of best available science, and the challenge to answer what it is, when it is, and how we should use it.  She said that science should be the underpinning of the discussions.  She said that the Forest Service tries to make science-based decisions, and has built a science organization to support that, and many scientists from the organizations are present at this conference.  She pointed out challenges such as uncertainty, variability, scale, and complexity.  She also asked what happens if science changes – is it still best science?

Hank Kashdan, associate chief, said that the science forum is an indicator of the collaborative rulemaking effort.  He said there is a clear intent to involve people across the country.  He also said that we can’t overstate the importance of science in this process. 

The director of Ecosystem Management Coordination, Tony Tooke, finished the introductory remarks by mentioning the target to complete the rule by November 2011.  He defined success as having a rule that is grounded in science and robust collaboration, with plans that can respond to current needs and improve management.

Climate Resilience- Let’s Not Overthink This

Those of you who read this blog know that I am a big fan of TU’s approach to climate change: Protect, Reconnect, Restore, Sustain. It’s on the front page of their website here. It seems to me that these are the basics of leaving the land in the best condition to respond to climate change as well as other stressors. We basically have a chance to refocus on what we should have been doing all along: good land management and adaptive management with adaptive governance.

For the discussions in the next few weeks, I think it is important that we have an idea of what we mean by climate resilience or ecosystem resilience, which may be two different things. For one thing, climate change will require resilient social systems as well as resilient ecosystems. There is an opportunity for us to become entangled in a verbal jungle from which we may never emerge. So we asked University of Montana graduate student Matt Ehrman to take a look at the different definitions and see how much definitional diversity there is, and whether that diversity lies in the definitions of resilience themselves or in the activities designed to promote climate resilience compared to ecosystem resilience.

Here is a link to the paper, Climate Resilience. He summarizes:

The second principle in the forest planning rule’s NOI asks the public to consider whether plans could proactively address climate change through a series of measures, such as “[management] will need to restore ecosystem resiliency, and also factor adaptation and mitigation strategies into planning and project development.” The new planning rule should go a step further and offer a clear definition of resilience. It should also seek to identify the USFS value or resource that should be managed for resilience, in addition to what it is to be resilient to. The inherent ambiguity in the term “climate resilience” could pose problems for the USFS as it drafts forest plans under a planning rule that does not explicitly define climate resilience as ecosystem resilience to climate change.

Thoughts? How far along the simplicity spectrum do you want to be?

K.I.S.S. in Rule Form, Part 3

In an agency beset with feelings of process predicament and analysis paralysis, it would be cruel punishment indeed to suggest NFMA rules that add more analysis and process to the mix. The new rules should also be durable; that is, not chase after every cause du jour (e.g., climate change) or impose inflexible, one-size-fits-all analysis processes.

The following “Assessment of New Information and Changed Circumstances” is based on the fact that forest plans exist now that cover every acre of the National Forest System. Congress directed that these plans “be revised from time to time” when “conditions in a unit have significantly changed, but at least every fifteen years.” It makes sense that only those parts of a forest plan affected by changed conditions require revision.

It is with these principles in mind that I put forward Part 3 of the keep-it-simple-sweet NFMA rules:

36 CFR 219.3: Assessment of New Information and Changed Circumstances

The revision shall assess new information and changed circumstances and conditions in the unit that are relevant to the decisions made in the land management plan. If the new information or changed circumstances and conditions warrant amendments to the land management plan, the land management plan amendments shall be assessed as a part of the vegetation management and timber harvest program’s NEPA document. If the land management plan amendments, singly or in combination with the vegetation management and timber harvest program, require an environmental impact statement pursuant to Section 102(2)(C) of the National Environmental Policy Act (“NEPA”), 42 U.S.C. § 4321 et seq., an environmental impact statement shall be prepared.

Imagining A Changing Forest

 

A desired condition is not a picture.  It’s a movie.

This is a map of four seral stages for the Pagosa Springs district of the San Juan National Forest.  Young stands of trees (class 1) are very rare.  So are the purple areas representing the oldest stands of trees (class 4).  Most of the map shows middle-aged stands (red and green).  Think about how this information might be used in forest planning.  For instance, the purple areas might be important habitat for late-seral stage wildlife species, they might be mapped as ecological reserves, or they might have some unique social values we want to protect.

Here is a simulation of what could happen to these stands of trees over time due to fire, insects, and disease.  Each interval in the movie is a 10-year increment.  It is based on work by Kevin McGarigal of the University of Massachusetts and Bill Romme, now at CSU, for the San Juan Forest Plan revision, using a GIS-based simulator called RMLANDS.  It informed the understanding of the historical range of variability of vegetation for the DEIS.

Stand size and distribution are most dependent upon fire interval and fire size, randomly simulated based on historical data.  Over time, the tree conditions seem to float across the landscape like shifting sand.  There are some places where topography seems to influence disturbances enough to allow persistence of older trees, but even these areas are eventually affected by the random events.

The smaller the scale, the larger the variation.  If you look at a particular place, there is more change over time in the color of that place.  The larger the scale, the more likely it is that you’ll find the color you are looking for somewhere.
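This scale effect can be illustrated with a toy simulation.  The sketch below is hypothetical Python, not RMLANDS (a far more sophisticated GIS-based model); it simply resets random square patches of a stand-age grid to age zero each decade and tracks the “old-growth” fraction in one small window versus the whole landscape.  All parameters (fire probability, patch radius, age threshold) are invented for illustration only.

```python
import random

# Toy stand-age landscape -- a hypothetical sketch, not RMLANDS.
# Each cell holds a stand age in decades; random "fires" reset square
# patches to age 0, and surviving stands age one decade per step.

def step(grid, n, fire_prob=0.002, fire_radius=3, rng=random):
    """Advance the landscape one decade."""
    for r in range(n):              # every surviving stand gets older
        for c in range(n):
            grid[r][c] += 1
    for r in range(n):              # random ignitions burn square patches
        for c in range(n):
            if rng.random() < fire_prob:
                for dr in range(-fire_radius, fire_radius + 1):
                    for dc in range(-fire_radius, fire_radius + 1):
                        rr, cc = r + dr, c + dc
                        if 0 <= rr < n and 0 <= cc < n:
                            grid[rr][cc] = 0

def old_growth_fraction(grid, r0, c0, size, threshold=10):
    """Fraction of cells in a square window at least `threshold` decades old."""
    cells = [grid[r][c]
             for r in range(r0, r0 + size)
             for c in range(c0, c0 + size)]
    return sum(1 for age in cells if age >= threshold) / len(cells)

def run(n=60, decades=100, seed=42):
    """Return (small-window range, whole-landscape range) of old-growth fraction."""
    rng = random.Random(seed)
    grid = [[rng.randrange(15) for _ in range(n)] for _ in range(n)]
    small, large = [], []
    for _ in range(decades):
        step(grid, n, rng=rng)
        small.append(old_growth_fraction(grid, 0, 0, 6))   # one "place"
        large.append(old_growth_fraction(grid, 0, 0, n))   # whole landscape
    spread = lambda series: max(series) - min(series)
    return spread(small), spread(large)

if __name__ == "__main__":
    small_range, large_range = run()
    # The small window swings far more over the century than the landscape total.
    print(small_range, large_range)
```

Running it shows the point of the preceding paragraph: the old-growth fraction in the 6×6 window swings wildly (one fire can wipe it out), while the landscape-wide fraction hovers near its long-run average.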

When planning for forests influenced by disturbance, landscape ecologists advise us that it’s important to think in terms of both time and space.  This calls for a discussion beyond static desired conditions.  Instead, a discussion is needed about the disturbance processes, whether anything should or can be done to shape those processes, and what we should do with the conditions that might result.  This is a very different type of forest plan than we have done in the past.

Project-based Forest Planning and Collaboration

People collaborate best when they are in the woods talking about real forests and what to do with them.  On the other hand, put them in a conference room to debate the merits of hypothetical silvicultural standards and you end up with the sort of nonsense we are seeing in northern Arizona.  There, several prominent, litigious environmental groups have made peace with local timber mills and workers regarding which trees to log on several national forests.  The Forest Service, however, doesn’t want to play ball.  Instead, its regional staff in Albuquerque is busily re-writing northern Arizona NFMA plans to include silvicultural standards that are inimical to agreements reached in the woods between the green groups and industry.

This bureaucratic passion play could be avoided altogether if NFMA plans were based on projects, not standards.  Recall that NFMA requires only one thing of forest plans:  “the planned timber sale program and the proportion of probable methods of timber harvest within the unit necessary to fulfill the plan.”  Recall also that NFMA does not mandate one forest plan for each national forest.  The Forest Service has broad discretion to decide the geographic scope of each plan, i.e., a single national forest can be divided into several NFMA plans.

Under the current two-tier planning regime, the Forest Service and its antagonists get to fight twice over what to do with national forests.  The forest plan fight is all about the adequacy of standards, the aspirational zoning of land, and the magnitude of largely irrelevant allowable sale quantities.  The second fight, at the project level, often repeats all of the above (because forest plan standards become ripe for legal challenge only when implemented in a project), with plan-consistency arguments thrown in for good measure.

Let’s just cut out the middle man altogether.  A forest plan should be no more than the logging projects the Forest Service proposes for the next several years.  The plan’s NEPA document (probably an EIS, but an EA is not inconceivable if the logging projects are environmentally modest) would evaluate alternatives, disclose effects, and form the basis for any required inter-agency consultation.  The plan’s Record of Decision would set forth the site-specific projects to be undertaken,  eliminating separate project-based planning and decision-making.  Forest planning collaboration, if pursued, would consist of people talking in the woods about each of the projects.

Landscape NEPA Comment in Wise Blog on Tester’s and Wyden’s Bills

Thanks to John for finding this blog.

Can’t really link to the comment directly, but Tim B. comments here on the WISE blog:

Just my opinion, but as a field-based NEPA practitioner with 29 years of experience, I would say that anybody seriously considering an environmental analysis covering all the various types of restoration activities that might need to happen on a landscape 50,000 acres or larger has their head very far up their ***.

Much too wide a scope: a broad purpose and need statement will generate large numbers of proposed actions, which will generate a huge number of environmental issues to address. All this would create a number of very complicated alternatives, and it would be difficult to determine what an adequate range of alternatives might be.

From an appeals and litigation standpoint, if a large number of activities are included in a landscape level NEPA analysis and decision, said decision and everything in it would all be vulnerable to challenge by any crank who did not like just one of the proposed actions. Given the size and complexity of such an analysis, that person could likely convince a Regional Appeals Reviewing Officer or a judge that an inadequate range of alternatives was addressed or some pet issue was missed. And the document would be so cumbersome a judge could be likely to throw the whole thing out because he couldn’t understand it – and he’d be right.

It would be much better to do a programmatic overview of such a watershed, like a watershed analysis, to develop the rationale for implementing whatever restoration actions are needed, then do a set of nice, neat, narrowly focused, short environmental assessments that would be much more bulletproof appeal- and litigation-wise.

Our congresspersons could help us out a lot if they would quit writing these complicated, prescriptive (why employ foresters and all other manner of specialists if some dudes in D.C. are deciding what can and can’t happen?) Region-specific bills and instead work on simplifying and clarifying the ones on the books that have been problematic for years. I also find it interesting that as far as I know, these guys never talk with the folks on the ground who really know what may and may not work.