Science Forum Panel 5: Bringing it All Together


The last panel at the Science Forum discussed the latest thinking on how to approach planning, bringing together information from the first four panels.

Clare Ryan from the University of Washington spoke about policy design and implementation, especially the role of best science. (slides pdf)  When thinking about policy, we think about the policy goals contained in the policy itself, the tools available to implement the policy, and the organizations and resources involved.  Ryan described a three-legged stool for policy implementation.  The first leg is a sound theory, one that is relevant and relates to management actions.  This leg seems pretty firm, but the sound theory needs to focus on process principles and on what more can be done to implement them.  The next leg is managerial and political skill.  Leadership needs to be committed to the statutory goals.  Ryan said to pay attention to this leg of implementation: leadership should invest in training, mentoring, and career development to build capacity, and provide resources and incentives to recognize and reward good practice.  The last leg is support.  Ryan thinks the Forest Service has struggled with this leg.  A policy should be supported by constituent groups, legislatures, and the courts.  To build support, focus on communication, engage people early in the process, and make materials easily available and understandable.  Use partnerships, collaboration, and other tools.  Decisionmaking needs to be consistent with NEPA, the APA, and other policies, which will itself build support.  Ryan said that this three-legged stool has both statutory elements and non-statutory elements (managerial skill and support).

Ryan talked about the consideration of best science.  In policy design, she encouraged approaches that define the characteristics of best science, encourage joint fact-finding (collaborative research), practice adaptive governance, and make process principles and decisionmaking processes a high priority.  Ryan described the critical areas policy in the State of Washington, which requires counties and cities to include best available science in policies for critical areas.  The counties and cities are given assistance in defining best science, including peer review, methods, inference, analysis, context, references, and sources, as well as criteria for obtaining best science, incorporating best science, and identifying inadequate science.

Ryan advocated joint fact-finding and collaborative research as a potential alternative to adversarial science.  People can work together to frame issues, develop data and analysis, and apply the information to reach decisions.  This approach is more useful in some situations than in others.

Ryan also mentioned the practice of adaptive governance (see related post on this blog), where science doesn’t drive the policy.  Often there isn’t a single target policy.  Policies are also resilient (for instance, the 1872 mining law is still around).  The policies drive your decisionmaking.  The process explores what goals we are managing for.  Decisionmaking involves local nongovernmental organizations and communities in determining the goals through a collaborative process.  Science helps monitor progress toward the goals, while you consider uncertainty and new information.

Process principles must be a high priority.  Many challenges to agency decisions are about the process.  The process must be supported internally, through training, resources, and rewards.  Ryan urged the development of patterns and communities of practice.  She mentioned that the NEPA for the 21st Century initiative has good information on best practices.

Ryan suggested several ways to address uncertainty and the need for flexibility.  Where it’s difficult to hold people to agreements, consider contingent agreements (if-then agreements).  She said that planning processes should not be viewed as an end, but rather as the beginning of continued long-term interactions.  She also mentioned the need for new institutions, including partnerships and coordinating councils.

Martin Nie from the University of Montana, and one of the administrators of this blog, spoke from his prepared remarks.  He said that the right questions are being asked in the NOI: questions about institutions, process, scale, use of science, and accountability. He said that we must appreciate the limitations of planning.  We need a broader analysis of the system’s statutory and regulatory framework, and we need to revisit the accumulation of laws.  A National Forest Law Review Commission should be convened in the near future.  The planning rule needs to implement the spirit and letter of our laws.  These should be viewed as goals, not constraints.  Then we should turn to the question of doing it in a more effective way.

Nie said that the planning rule needs to be informed from the bottom up.  We need to look at how planning is being done at the local level.  There is little value in plans if no decisions are made in them.  Legally binding and enforceable standards should be included.  These standards will help adaptive management by giving the process its purpose and boundaries.  Without standards, adaptive management invites dodging.  We need legal sideboards for collaboration.  There is a tendency to try to maximize discretion, as in the 2005 rule, using the Supreme Court’s Ohio Forestry and SUWA cases as justifications.  It appears that the courts will grant discretion, but the agency should think carefully.  Place-based groups want more certainty and less agency discretion.  Plans have been viewed as contingent wish lists.  But the agency has overpromised and underdelivered.  This mismatch creates distrust.  Plans should be realistic.

Plans should be collaborative, but we must define the purpose of the collaboration.  There have been mixed signals sent in the past and incompatible expectations.  We need to formalize collaboration, using things like advisory boards and advisory committees.  We need to encourage the development of collaboratively written alternatives in NEPA processes. Nie said that the Forest Service needs a national advisory committee for outside advice – not another Committee of Scientists, but one that can deal with a more inclusive set of problems beyond science.  There should also be technical science committees on subjects like new approaches to viability.

Forest plans should not be bloated.  The previous rational planning framework treated wicked problems as scientific/technical problems.  Planning approaches that are more collaborative make sense, but we need to describe how to practice collaboration while ensuring accountability and transparency.  The 2005 planning rule had an ill-defined adaptive management framework while dropping NEPA requirements and species viability.  This got an important dialogue off on the wrong foot.  A fully funded and working monitoring program should be implemented.  Monitoring is subject to political influence and bias over what will be monitored.  Nie advocates the use of prenegotiated commitments if monitoring shows x or y.  Predetermined courses of action should be built into the adaptive framework from the beginning.  This might alleviate concerns about the amount of discretion.  NEPA presents challenges to adaptive management, since it requires a forward-looking approach.  Once the general course is set, adaptive management is a means to an end.  It needs a purpose, and hopefully NEPA is a way to define that purpose.

Nie said that we should pause before dumping more analytical requirements into the planning rule beyond NEPA, to address things like climate change or the quantification of ecosystem benefits.  Some of the NOI issues are better dealt with at higher or lower levels.  There should be serious thought about how plans and assessments are tiered and integrated, and how we avoid the planning shell game.

He talked about increasing interest in place-based approaches and formalized MOUs.  There is considerable attention on the Tester and Wyden bills.  There are informative initiatives across the nation, all searching for durable bottom-up solutions.  For the planning rule, we need to learn lessons from these initiatives.  Nie said we need to find out what role forest plans played in these initiatives.  We need to visit these places, so we can be informed as much as possible by the bottom-up approaches.  There are common themes worth considering: certainty, landscape-scale restoration, and formalized decisionmaking.  These initiatives appear to be pushing the Forest Service in directions it wants to go.  Where problems are being identified by these groups, we can learn lessons about conflict resolution and problem solving from the grass roots.

Tony Cheng from Colorado State University began with three premises. (slides pdf)

First, National Forest planning is made up of value judgments (related to choices) about linked social-ecological systems, made in the face of risk, uncertainty, and constrained budgets and time.  The planning rule is really about governance, the process of making choices.  Cheng referred to the phrase “coming to judgment” from the title of Daniel Yankelovich’s book.  NFMA specifies some of the types of choices/judgments to be made.

Second, National Forest planning assumes that there is sufficient institutional infrastructure to support it.  The original motivation of RPA was that budgets would follow planning.

Third, National Forest plan development and implementation is a shared burden.  The Forest Service has insufficient capacity to address wicked problems – a description applied to forest planning as far back as 1983.  Cheng said that formal partnerships are needed.

Cheng said that NFMA has the philosophical underpinning of rational comprehensive planning.  One assumption is that good information will lead to good decisions, so more good science will lead to better decisions.  It’s not inherently wrong to make those assumptions, but subsequent work in behavioral decisionmaking and institutional analysis has found that good decisions are a function of a well-structured process as well as the institutions to sustain those processes.

Planning is making decisions under risk and uncertainty.  Risk is the probability of an occurrence times the magnitude of the consequences of the event.  One of the challenges is that we don’t know whose consequences count – there are many consequences.  The planning rule can’t just do risk assessment; we also have to look at risk decisionmaking, which is a function of risk preference, the values placed on the consequences, and cognitive and institutional biases.  As we think about collaborative processes, participants come in with a whole portfolio of risk preferences, not just about ecological or resource conditions, but sometimes about identity, reputations, and professional careers.  The decision about which science to use will already bias the process.
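
Cheng’s definition lends itself to a simple expected-loss calculation.  The minimal sketch below (all numbers hypothetical, not anything Cheng presented) also shows why “whose consequences” matters: the same event probability produces very different risks for different stakeholders.

```python
# A minimal sketch of the risk framing Cheng described (hypothetical numbers).
# Risk = probability of an event x magnitude of its consequences.

def risk(probability: float, magnitude: float) -> float:
    """Expected-loss view of risk: chance of the event times its consequence."""
    return probability * magnitude

# The same event carries different consequence magnitudes for different
# stakeholders, so "whose consequences" changes the answer.
event_probability = 0.10  # assumed 10% chance of a severe wildfire season
consequences = {
    "timber operator": 500_000,      # hypothetical dollar losses
    "recreation outfitter": 80_000,
    "watershed users": 1_200_000,
}

for stakeholder, magnitude in consequences.items():
    print(f"{stakeholder}: risk = {risk(event_probability, magnitude):,.0f}")
```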

Cheng said we know how to do a good decision process – we just need to do it.  It includes defining the decision space, making the process fair and inclusive, and focusing on systems thinking.  We shouldn’t think about a single desired condition but a range of desired states.  We should encourage deliberation and social learning.  Accountability is a fuzzy term like resilience, with lots of meanings in political science, so it must be defined.  The Forest Service has its own resources – like the National Collaboration Cadre, which uses a peer learning approach.  Cheng also referenced the work of the U.S. Institute for Environmental Conflict Resolution.  He added that the features of a good decision process include standards to avoid being arbitrary and capricious, analysis sufficient for a reasoned choice, and ways to deal with a lack of scientific information.

Cheng cited work from the Uncompahgre Plateau in Colorado, involving multiple agencies and public groups.  It had systems views and assessments, including an RMLANDS model.  The communities in the public lands partnership also developed their own assessments.  They worked on goals collaboratively, and issues were scaled down to specifics.  The Colorado Forest Restoration Institute put together collaborative joint fact-finding.  This was similar to the work that Tom Sisk described in Science Forum Panel 1, but at a smaller scale.  They looked at fire regimes, and data was collected by stakeholders.  Social, economic, and ecological monitoring was formalized in an NOI.  The NEPA decision after the NOI was not appealed or litigated.

Cheng described the features of supportive institutions, emphasizing that good process is not enough.  He quoted Elinor Ostrom, who says that institutions matter (see related post on this blog).  Budgets, performance measures, and consequences need to be tied to plan outcomes.  What is measured is what will get done.  Other features of supportive institutions include formal and informal networks, statewide strategies, nested enterprises, boundary-spanning and bridging organizations like the Southwest Ecological Restoration Institute, and open-access information systems like database sharing.  Lastly, there should be formalized structures and mechanisms for learning loops: single-loop learning asks whether the intervention worked; double-loop learning questions the validity of the assumptions about how the system works.

In followup questions, Cheng also talked about statewide assessments and strategies.  He explained the Forest Service State and Private Forestry redesign after the Farm Bill, which looked at restructuring how federal funds are distributed to states for forest stewardship needs.  It requires state assessments and strategies.  For states like Colorado, with large interface acreage and problems like insects, wildfire, and inherently transboundary issues, there is no mechanism for integrating the statewide assessment into forest plans.  Plus there is a pool of funds for cross-boundary objectives.  The Forest Landscape Restoration Act has a requirement for a non-federal match.  Several things are moving toward an all-lands, cross-boundary approach.  The road has been laid out, but it seems we’re trying to catch up.

Cheng had a parting question for the planning rule team: is the institutional infrastructure for this rulemaking adequate, and is the planning rule the place for addressing the issues in the NOI?

Chris Liggett, the planning director in the Southern Region, spoke about some of the planning tools his Region has been using. (slides pdf) He began by briefly mentioning the iterative alternative development process used for the Southern Appalachian plans: this “rolling alternative” NEPA approach is now endorsed by CEQ.  He also talked about consistent standards across all forest plans.

Liggett listed several requirements for planning tools: openness, collaboration, practicality (ease of use), stakeholder support, efficiency, and durability (despite changes in governing regulations).  Planning tools, he said, have shed the “black box” image of previous planning models.  Plans need to be intentional and describe a trajectory to achieve desired outcomes, and tools are available to move us toward openness, transparency, and consistency while maintaining scientific rigor.

He talked about the Ecological Sustainability Evaluation (ESE) tool developed by the Southern Region, based on The Nature Conservancy’s Conservation Action Planning (CAP) workbook.  It allows for conservation targets, and links ecosystem and species targets.  Liggett said that it’s easy to develop ecosystem and species targets.  Within the ESE process, you select conservation targets, identify key attributes, identify indicators (including species), set indicator rating criteria, assess current conditions, and develop conservation strategies (converted in ESE to forest planning language).  These steps are done collaboratively.  The Ozark/Ouachita forests used this process with assistance from NatureServe and the Arkansas Game and Fish Commission.  The ESE has now formed the backbone of their planning process.  The ESE planning steps reside in an Access database, with tabs for each step in the process.
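
Since the ESE steps form a simple stepwise pipeline, here is a hypothetical sketch of how such a workbook’s steps could be represented in code – an illustration only, not the actual ESE database design, and the target, attributes, and criteria shown are made up.

```python
# Hypothetical sketch of the ESE-style stepwise workflow Liggett described;
# NOT the actual ESE schema, just an illustration of the steps.
from dataclasses import dataclass, field

@dataclass
class ConservationTarget:
    name: str                                           # an ecosystem or species target
    key_attributes: list = field(default_factory=list)
    rating_criteria: dict = field(default_factory=dict)  # indicator -> rating criteria
    current_condition: str = "unassessed"
    strategies: list = field(default_factory=list)       # converted to plan language

# Each step fills in another field, mirroring the tabs in the workbook
# (all values below are invented for illustration):
target = ConservationTarget(name="shortleaf pine-bluestem woodland")
target.key_attributes.append("fire return interval")                     # identify key attributes
target.rating_criteria["fire return interval"] = "good: 3-7 yr; poor: >15 yr"  # set rating criteria
target.current_condition = "fair"                                        # assess current conditions
target.strategies.append("restore frequent low-intensity surface fire")  # develop strategies

print(target)
```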

Liggett also talked about TACCIMO, mentioned by Steve McNulty in Science Forum Panel 2.  It allows managers to review current climate change forecasts and threats, match them with management options, and determine how they may affect forest management and planning.  It is a relational database sitting on the web, which gives you reports and maps and has a feedback loop.  Statements from forest plans are clipped into the database.

He also mentioned the Forest Service Human Dimensions Toolkit, used to display economic and social science assessment information.  It’s a web-based product which produces reports and secondary social and economic information from the BEA.  It’s a first cut at an expert system.

The last panelist was Mike Harper, a member of the National Association of County Planners, recently retired from Washoe County in Reno, Nevada. (slides pdf) Harper said that citizens are often more familiar with local planning processes than with the federal process, so it’s good to think about using similar approaches.  As a professional planner, Harper offered some general observations.  First, planning is not a science – rather, it is process-oriented, an integration of vision, goals, policy, and process.  Second, plan making is not regulation – the primary focus is vision, goals, and policy.  A vision is aspirational – it is the touchstone of a plan.  Goals define the vision’s parameters.  Third, Harper said that “end state” plans are rarely viable.  Planning involves politics, adaptive management is limited, and plans are out of date on the day of adoption.  Finally, Harper said that non-implementable plans are not worth the effort – no one will stay engaged.

Regarding ways to “foster collaborative efforts,” Harper suggested identifying and mapping where the affected audience lives and what their issues are.  He also suggested identifying state and local planners who know their audience and can provide information and perspective.  He encouraged interactive, web-based programs to collect, organize, and respond to comments.  He said it’s important to establish an initial timetable and keep to it, focusing on key issues.

Regarding information sources, Harper said to: engage state and local organizations; encourage web links; encourage unified organizational responses; directly engage the public on issues not normally managed at state or local levels; and directly engage state and local government on issues.  He said KISS – keep the modeling simple – since complex modeling can be a way to obfuscate the issues.  Also, if you’re going to use planning language, then define it for your audience.

Regarding the question of accounting for the relationship between National Forest System lands and neighboring lands, Harper observed that state enabling legislation requires elements on conservation, economics, recreation, and public facilities and services in master plans.  He suggested partnering with local and regional organizations to help sponsor the process.  The feds are not always seen as close to the locals.  Local organizations can sponsor planning activities and provide information both inside and outside forest boundaries.

Harper also commented on the idea of creating a shared vision.  The planning rule should consider a national vision for forests and grasslands, and local units should tier off the national vision.  He also recommended looking at the visions in the local plans of the governments you are consulting with.  Area plans typically have vision statements, and they usually include National Forest System lands.

He said that the planning process should define subjects, public participation procedures, and compliance with statutory requirements.  Ideally it should allow flexibility at the local level.

Plans typically contain both the strategic (vision and goals) and the tactical (policies).  Plan making involves choices.  Don’t approve projects or activities in the plan, because participants will focus on those projects and forget about the vision and goal part of your planning process.  The capital improvement projects should follow.  The planning process moves from the general to the specific.  There needs to be an aspirational nature to planning, but you need to get to the guidance portion (where the meat and bones are).  Plans identify ideals defined by the present, and circumstances change.  Empirical data is crucial, but no amount of data collection and modeling will substitute for good planning, represented by the goals and the strategies.

Discussion

The Need for Certainty

In followup questions, the panel was asked about the need for certainty in forest plans.  Nie had said that standards were necessary, while Liggett preferred a focus on desired conditions.  Nie said the zoning approach resonates with people – not hyper-zoning like some of the management area allocations in existing plans, but a sense of what can be done.  Harper said to be careful about using the term “zoning,” which in the county context implies regulation.  The general public might define Forest Service terms differently.  Nie thought that people are used to forest planning, and they want certainty.

Harper observed that in the planning world, master plans and zoning are different things.  If you want to involve the public, then define zoning.  Liggett said that implementing NFMA has been a struggle, because it requires us to have all the pieces of planning mashed together.  Even if we can just separate the pieces temporarily into phases as we prepare the overall plan, we’d have greater acceptance and understanding. 

Cheng characterized plans as a snapshot of today.  The 1982 rule provided certainty with timber suitability.  But there are other ways to obtain certainty than drawing lines on a map.  Nie said that’s why we need a new planning rule.  Looking at Tester’s bill, locals have expressed a sense of frustration.  They wanted a clear commitment from the Forest Service about a stable land base, and they can’t get that from a strategic and aspirational document.  The alternative course of action is Congress, and that’s not the best either.

Ryan also mentioned how the NEPA work associated with plans has been seen as a problem for the Forest Service.  She said that NEPA has not been the problem; rather, the Forest Service has been losing on procedural grounds for not following its plans.

Liggett was asked to explain the distinctions he was making between desired conditions, guidelines, and standards.  He said that desired conditions are word pictures – descriptions of what we want the forest to look like in an ideal state – not necessarily a future state – they could be a current state. There is some disagreement about their level of detail, but they are really there to describe the vision.  NFMA refers to standards and guidelines as part of the management prescription.  Over the years some space opened up between the two terms: guidelines evolved to be a more flexible variant of a standard, a little softer.  During project planning they are reviewed on the ground.  Standards and guidelines are statements that apply to projects – they have no function if you aren’t doing any projects.

Liggett was asked why implementation of the Southern Appalachian plans has been spotty.  At the project level, guidelines that produce outputs seem to be followed, while guidelines for things like old growth survey or designation might not be.  Liggett said that plans can be subject to misapplication.  He also added that in a lot of plans under the 1982 rule, you find a lot of spillage between the different sections of the plan – objectives look like desired conditions, guidelines read like monitoring requirements.  The last rule attempted to provide definitions for those components so they would actually work together.  Getting the number of standards pared down is the first step.  There is still a lot of variation in how plans are being used.  Some line officers understand planning.

Harper said that when deciding whether you have a good plan or a bad plan, a good plan has citizens who actually show up and support it.  When people try to change it, you have people defending it, or at the very least asking for the justification for the change.  A mediocre plan gets initial attention and is then put on the shelf – the people who abuse it the most are the people who run it.  A poor plan sits on the shelf and gathers dust.

An audience member observed that standards and guidelines come from the expertise of the interdisciplinary team that has worked out best practices, representing the best knowledge of the agency.  If they don’t belong in forest plans, then where?  Liggett said that we do need standards and guidelines, as long as they are clearly identified as such.  They represent guidance for project and activity decisionmaking.  The Fish and Wildlife Service uses the term “standards” to mean something different than the Forest Service does, so you can get wrapped up in nomenclature.  Planning practice has evolved – standards may not have been effective.  We have a paradigm that standards are the only measure of accountability, but there are other ways of ensuring accountability.  As for standards embodying working knowledge, Liggett said that such standards really seem to be describing a desired condition or an objective.  If you write it that way instead of as a standard, it will be stronger and better.

Cheng said that a collaborative review of standards can help everyone understand the operational constraints that the Forest Service is working under.  He described the Wyoming cooperating agency structure for dealing with local governments.  The Bighorn planning team worked with the cooperating agency mechanisms in a three-to-four-week process.  The process was a working battleground, but in the end the participants were able to understand the constraints.  That event turned the local cooperators into champions of the plan.

How much should the public be engaged?

The panel was asked how to keep the public engaged without burning them out.  Ryan suggested looking at planning as building partnerships.  When the planning process ends, all the material disappears from the web – which is a problem.  The engagement needs visible updates.  Think about workshops for updates on monitoring – report cards are opportunities for further engagement.  Harper said that annual reports are very helpful.  It’s good to reengage the public, and independent organizations can see if the agency is telling the truth.

Cheng said that the planning rule needs to identify what it expects planning teams to do as they participate in and co-convene collaborative processes.  The planning rule has antecedents in the Administrative Procedure Act, requiring a clear basis of choice.  If there is not a clear basis of choice, the agency bears the burden of gathering the evidence it needs for one.  Collaborative processes require a certain level of capacity and competency, which is lumpy throughout the country.  The rule could say that the agency should use the resources it has available.

Ryan agreed that the rule should define the characteristics of the process.  The agency is already held accountable for collaboration through litigation.

Cheng said that forest planning is not the place to start collaboration.  It’s like taking a beginning skier to a double black diamond run.  You can’t do that in a forest plan with so many moving parts.  There is a tremendous amount of capacity building in project-level planning – you build capacity at the project level and then draw on those networks.  He described the complexities of the GMUG forest planning process.  The pre-NEPA process began in December 2000, with the last collaborative process in the fall of 2004, even before they got to scoping.  There was significant adaptation, and a lot of bumps along the road.  Unfortunately, the GMUG plan has never seen the light of day, to the great disappointment of participants and stakeholders.

Liggett said that sometimes collaboration is not the right thing to do, citing the work of Ron Heifetz at the Kennedy School of Government.  Planning ranges from simple problems with known existing solutions, to situations where you don’t know the problems or the solutions (the wicked problems).  Collaboration is one way to approach wicked problems, but it can be inadequate to the task if you don’t frame the problem correctly – if you try to address things left unanswered due to inaction or malfeasance, which should be sorted out by Congress or some other institution.  You have people thrashing things out on the ground.  We need to ask whether we have framed the problem poorly, bitten off more than we can chew, or lack the capacity.  Forest planning falls all over this spectrum.

Levels of Planning

Responding to a question about what should be in the rule and what should be in a plan, Ryan said that forest plans must have some details.  The rule may have broader principles, but the details should be in the plan.  Ryan also said that we can’t get too specific at the national level about what is valid science.  Part of the collaborative process is making a collective judgment about what is good science – that could be incorporated into the rule.

Liggett said that we need a multi-scale framework for planning.  In the past, the Forest Service has tried to characterize a two-stage process (forest plans and projects/activities).  But there really is a planning framework that is multi-scale.  There may be five levels – national, broad scale (at least for assessment work), forest, a slice of a forest like a watershed or group of watersheds, then the project.  We get very confused about the levels.

Cheng commented that if plans are just aspirational and there is no intent to actually allocate resources, you’re not going to get people to participate.  Liggett said that plans also need more nuts-and-bolts content, but NFMA creates plans that look like they were done by a committee.  Maybe we need to separate the parts into steps.  Harper said that in local planning, it’s clear where the steps are.  If projects are included in plans, people will lock in on specific projects and the whole planning process will get skewed.  Nie said that his participation on this New Century blog (see Andy Stahl’s KISS posts) has changed his view on this.  As long as plans contain a strategy that is clearly coordinated, and the parts are integrated, he is open to the idea of having oil and gas leasing, travel management, and other decisions made outside of planning.  Liggett agreed, but observed that these separate activity decisions will splinter the planning approach.  We need to resist this, but not put it all on the back of forest plans.  We need to find a balance.

One audience participant commented that back in the 70s he was on the Society of American Foresters’ committee to address the adverse Monongahela decision on clearcutting.  The committee’s recommendations were incorporated into NFMA.  The road was paved with good intentions.  He joked that he had thought the theory would work, but he encouraged everyone not to give up hope.

National Audience Concerns about Control by Local Participants

Questions were asked about how to involve interested parties not living in the local area.  Harper reiterated that mapping the location of interested citizens is important, so you don’t miss the non-locals.  Liggett said there are inherent dangers, but line officers are part of a national organization, so they are well schooled in this.  Cheng said that there are still standards for avoiding arbitrary and capricious decisions.  There is accountability not to rely overly on any one source of information.  Cheng takes a more empirical perspective on this philosophical concern – how frequently do collaborative processes actually lead to agency capture?  Was the Tester bill collaborative?  Some people will disagree because of the multiple points of entry.  There are legitimate claims against anything we come up with, but democracy is better than any other alternative.

Nie said that this is a tension that needs to be balanced.  There is a larger statutory framework with safeguards and sideboards (NEPA, NFMA).  For the rule development, perhaps the most constructive way to think about this is to marry the safeguards of the top down with the good ideas of the bottom up.  Bottom up: pragmatism, innovation, community.  Top down: standards.

Ryan said that there are many examples of politics just as ugly at the national level as at the local level.  We see good examples of innovation at the state and local level and don’t want to miss those – climate, energy, green building – things which often happen sooner at the local level.

Collaboration for the Planning Rule

The panel was asked about the 11/11/11 target for completing the planning rulemaking process, given the desire to collaborate.

Liggett mentioned that the Land Between the Lakes plan was done in 17 months and 27 days.  He suggested that the rule can be done quickly; difficult problems don’t have to take a long time to solve.  He said that the LBL had a dedicated planning team.  It’s important to frame the questions correctly.  If we do that around this rulemaking, we can meet the timeframe. Harper said the rule can’t be everything to everyone.  There should be some preparatory work and a timeframe to create a sense of urgency.  Make a decision – planning is a decisionmaking process.

Cheng asked how many prescriptive, substantive goals for all forest plans the rule should contain.  There are fuzzy things still out there.  How prescriptive does the rule need to be as a framework for guiding choice?  The more certainty you think people want, the more prescriptive it must be.  On the other hand, the more prescriptive you are at the national level, the more chance for errors.

Science Forum Panel 4: Social, Cultural and Economic Sustainability


The fourth panel at the science forum talked about the social, cultural and economic dimensions of planning.

Mike Dockry, a Forest Service liaison with the College of the Menominee Nation in Wisconsin, began by describing the difficulties in defining sustainability. (slides pdf) After decades of discussion, there is still disagreement across disciplines, perceptions and experience, cultural understandings, and time and spatial scales.  Dockry quoted the 1987 Brundtland report’s definition: sustainability is meeting the needs of the present without compromising the ability of future generations to meet their needs.  He said that definitions of sustainability come in two forms: “weak sustainability,” which assumes natural and manufactured capital are interchangeable, and “strong sustainability,” which assumes manufactured capital cannot replace some natural capital.  Dockry said that all definitions of sustainability bridge humans and the environment, integrate the social and the scientific, describe what people collectively want and what is ecologically possible, and integrate scientific and societal information in an iterative rather than a linear process.

A key component of sustainability is adaptation and learning.  Sustainability relies on successful adaptation to changing conditions across time, location, and context.  It empowers participation in decisionmaking.  We need to collaborate among different perspectives, and the Forest Service even needs to collaborate within the agency.  Dockry said that we need to understand different perceptions about sustainability and its values.  Uncertainty is inherent in complex ecological and social systems, which requires an informed social dialogue for decisions.

Dockry showed some sustainability models.  One is a three-legged stool with the natural environment, the economic environment, and healthy communities.  This type of model leans toward weak sustainability.  A strong sustainability model shows a nested approach, with the economy nested within society, nested within the environment.

He then turned to the model he has used at the College of the Menominee Nation in Wisconsin.  This is a reservation whose dense vegetation traces its linear boundaries on satellite imagery.  He said that the Nation has been cutting timber since 1854, and they have their own sawmill. In the mid-90s the Sustainable Development Institute came together to look at the Menominee, bringing academics, tribal leaders, and tribal members together to understand sustainability.  The model they developed depends on six interactive and dynamic dimensions, with human communities in every one of them: (1) land tenure and sovereignty (how decisions are made); (2) the natural environment; (3) institutions, including informal clan structures or decisionmaking institutions, and formal institutions like colleges; (4) technology, from traditional harvest technologies to modern technologies such as GIS; (5) the economy, at all scales from local to regional, national, and international; and (6) human perception, activity, and behavior – how the group perceives and behaves.

Dockry said that sustainable development is a process of recognizing these elements and balancing the tensions within and among them.  As tensions are relieved, new tensions arise, or old tensions come back.  Dockry said that sustainable development is a continual process, and the Menominee have balanced the tensions through the idea of “Menominee Autochthony” – a profound sense of place.  The model has been used to develop a narrative, both quantitative and qualitative, to understand change over time and the intersections among model elements.  It has been used to set future goals and develop solutions, while gathering stories to understand the sense of place, and data to understand the environment, economy, demographics, and trends.  Dockry said that the act of creating a narrative actually fosters collaboration.  The model is based on Menominee experience; it can be used to assess past and present situations and develop future solutions; it can be used by researchers, planners, and communities; and it can incorporate scale, time, and complexity.

Dockry said that the planning rule should mention the need for scientific information and models, and data from local and national scales (e.g. the National Report on Sustainable Forests, and inventories and maps).  He said that indicator projects are a good source of information.  The rule should include qualitative and quantitative social science and the humanities in the process, and make social science an explicit part of the planning rule.  We need to recognize the social nature of science and decisionmaking – what variables we use, how we approach our own disciplines (the social elements).  Decisionmaking is fundamentally social.  Collaboration requires learning and flexibility because of diverse public stakeholders, the need for internal agency collaboration, and tribal collaboration (government to government).  The rule needs to allow each forest plan to define what sustainability means for that forest.  There are different definitions, but we will often be talking about the same things.

Roger Sedjo, a senior fellow at Resources for the Future, titled his presentation “The Forest Service planning rule – please, no, not again.” (slides pdf) Sedjo reflected on his experience in previous rulemaking processes, including his participation on the 1999 Committee of Scientists.  He said that once we get into the planning rule and planning process, it’s very rarely science – it’s usually values.  The only exception is viability, which the 1999 Committee chewed on for a long time, but which was subsequently disregarded.

Sedjo described the frustration that the Forest Service has experienced with forest planning.  He mentioned the backlog of plan revisions not getting done.  He referenced the Tongass planning effort, which has been revised, litigated, and appealed for 25 years and is still not settled.  He said that even when we get planning rules done, we rarely get through planning processes.  Sedjo pointed out early criticisms of RPA and NFMA, which were legislation to plan, but not to implement the plan.  RPA and NFMA identified multiple-use outputs to be produced, but gave little guidance on how much of each output to produce and the tradeoffs among them.

Sedjo said that there has been little ability to reach consensus.  Often groups did not participate in the process but challenged or litigated at the end, and the whole process bogged down.  Sedjo said that there has always been wide agreement that the planning process needs to be made simpler, less costly, more user-friendly, and understandable to the public.  He said that former Forest Service Chief Max Peterson wanted to insulate the process from both political and legal review.  But Sedjo thought the 1999 COS said to do even more planning, essentially throwing the Forest Service “an anvil instead of a life raft.”

Over time, Sedjo said, the Forest Service was able to balance the timber industry, environmentalists, and recreationists against each other.  He cited Paul Culhane’s 1981 book, which explained this balance-of-power notion and how the agency did a masterful job of balancing these groups; for a long time the Forest Service was able to maintain the balance.  But Sedjo said this balance of power collapsed in the 1980s, in what are often called the timber wars, which the environmentalists won.  Timber interests were pushed out of the National Forest System, and as their stake diminished, the overall conflict lessened.  Although the forest industry has stayed around, Sedjo said that there is now a stronger interest in this rulemaking by environmentalists.  He pointed out that 35% of the audience represented environmental groups.  Because of the narrower constituency, Sedjo thought there might be a basis for consensus among the remaining players.

Sedjo mentioned the problems in identifying a mission for the Forest Service.  He quoted previous Chief Max Peterson, who said the Forest Service needs a new mission, and previous Chief Jack Ward Thomas, who said the Forest Service needs a new mission from Congress.  Sedjo said that clearly a planning process needs some objectives and consensus goals – if you don’t know where you’re going, any road will get you there.  But Sedjo said that perhaps we do have the basis for a new mission or a new consensus.  Since around 1990 the Forest Service has been mostly about custodial management, with specific attention to wildfire control.  Regarding a new consensus, we might find one, not available in the past, around carbon sequestration benefits for climate change, wildfire control, water, biodiversity, or maybe a few others.  Later in the discussion period, Sedjo said that the tradeoffs are a lot smaller than before.  Those old tradeoffs are now more manageable, which makes things easier for the various interests.

Sedjo cited Bowes and Krutilla’s 1989 book, which said that National Forests could be viewed as a “forest factory” capable of generating a variety of services, as called for in legislation, with the objective of generating the set that would maximize social income.  Since many of the values were non-market, you could use contingent valuation techniques. Subsequently, we went into the era of ecosystem management, and the emphasis shifted from the “outputs” to the “factory.”  The old system focused on the eggs; ecosystem management shifted the focus to the goose laying the eggs.  It was the “body beautiful” goose.  Today we have the basis for describing some sort of system of ecosystem services.  We now have markets for carbon, although the Copenhagen Climate Conference went nowhere on this.  Other values can be determined by markets (timber, grazing, perhaps recreation and water), but questions remain about wildlife and biodiversity.

Randall Wilson from Gettysburg College spoke about the social-economic context of planning decisions. (slides pdf) He said that the social-economic dimensions of rural communities are often linked to local natural environments.  The concept of “place” is important in collaborative processes.  Wilson said he studies what to make of these contexts and how they can be identified, measured, and incorporated into planning.

Wilson described regional and global processes at a larger scale.  He said that the literature on the New West looks at declining traditional resource extraction industries, risks to service sector economies (tourism), and an influx of ex-urban amenity migrants leading to habitat fragmentation, wildfire risk in the urban interface, rising tensions, and land use conflict.  These processes are also seen elsewhere across the U.S.  Wilson said that there is an uneven geography of the New West.  The most advanced New West communities receive the most in-migration and economic growth, with an emerging concentration of wealth and urbanization.  This contrasts with the majority of rural communities, which remain dependent on resource extraction.  Some communities are advanced New West, some have one foot in the Old West and the other in the New West, and others are still dependent on resource extraction.

Wilson looked at how these diverse place-based contexts link up with collaborative planning efforts.  His metrics included the social-environmental context, the management priorities developed in the collaborative processes, and the form and structure of the process.  He gave the example of the Greater Flagstaff Forest Partnership, which he characterized as a New West context: rapid urbanization, a diversified economy, and an emphasis on recreation and tourism.  The management priorities from the process were ecological restoration and wildfire risk.  The form and structure of the collaboration was formalized, with lots of participants with institutional representation.  Wilson then talked about the Ponderosa Pine Forest Partnership in southwest Colorado, which he characterized as a transitional context.  Ecological restoration was coupled with economic development for viable local industries.  The collaboration had a less formal structure, with informal decisions and individual relationships being important.  Finally, Wilson talked about the Catron County Citizens Group in New Mexico, with a focus on extractive industries and intense community conflicts.  Participants looked to improve community capacity through open participation and a moderator.

Wilson described a “three-legged stool”: economic development, ecological health, and social equity/community capacity/cultural heritage.  The place-based social-environmental context informs the planning process, including public participation and the facilitation and scoping process, as well as the definition of forest planning priorities, guidelines, and standards.  Wilson said that forest planning assists rural communities through the process itself, the development of social capital and cohesion, and collaborative learning.  Finally, plan implementation can make a difference to communities.

Wilson urged attention to the terminology of forest planning: how public participation and collaboration are defined, how the definition of sustainability stretches to political, economic, and social factors, and how monitoring is defined so that local communities can take a role in that process.

Spencer Phillips, an economist with The Wilderness Society, emphasized that recreation is but one ecosystem service among many, and that recreation is connected to the others. (slides pdf) The attention should be on both recreational carrying capacity and (not or) ecosystem sustainability.  They need to go together hand in glove – we need both.  The tools and concepts need to evolve – things have changed since 1982, and communities have changed.  Phillips talked about the Society’s principles for connecting people to nature, referencing its comment letter on the planning rule NOI.  He said that recreation is about connecting people to nature – people don’t care about the specifics of nutrient cycling, but about what the forest looks like when they are there.  People want to conserve and respect our natural, historic, and cultural heritage while experiencing enjoyable and safe recreation.  He noted that fewer people today grew up in the forest or on the farm with an outdoor cultural tradition, so it’s important to make a connection.

He encouraged public participation, attention to net economic benefits (is the money staying in the economy?), agency budgets (recreation is the main economic output of public lands), monitoring, transportation systems (more than roads – getting to the right place with the right experience), and using infrastructure to enhance the connection with nature.  He said that recreation infrastructure is not just the built infrastructure, but also the information infrastructure.

Phillips said the planning rule must make recreation a “substantive principle,” and suggested a full accounting of impacts both on and of recreation.  He encouraged the use of recreational planning concepts, tools, and models, and systems thinking beyond simple benefit/cost analysis.  He urged forests to focus on their niche, which may be quiet recreation; private lands offer opportunities in different ways.  He said this is already suggested in the 1982 planning rule, which describes identifying the uses, values, products, and services that National Forest System lands are uniquely suited to provide.  Once the niche is identified, the planning process should determine how to add value to that niche.  Phillips used the example of planning around Roslyn, Washington.  A partnership looked at what ecosystem services were provided on the parcel, then at how to manage use to account for those services.  You may have to separate uses and zone for motorized, mechanized, and non-motorized use.

Phillips concluded by emphasizing the meaning of recreation, with his parting word: “play.”  Recreation is “re: creation.”  People make a connection when they see where their water and game come from, and it’s easier to understand when you make that connection.  Also, “re-creation” is to make something new.

Science Forum Panel 3: Species Diversity

This morning, the third panel of the science forum discussed the latest approaches to species diversity.

Kevin McKelvey of the Forest Service Rocky Mountain Research Station began by emphasizing the monitoring of species, where quantification is essential. (slides pdf) In the overall framework of plan, then monitor, then new plan, there is a very different data standard for planning than for monitoring.  In planning, things really can’t be quantified – you ask what will fit the mission statement.  The plan is the broad aspirational thing; monitoring is the reality.  He used a business analogy: planning is launching a new product line, and monitoring is looking at the quarterly profits.

If monitoring is so important, why haven’t we done more?  McKelvey gave two reasons.  First, science has not provided the appropriate tools and direction.  Second, monitoring has been too expensive and difficult.  But there are new monitoring methods that weren’t available even the last time we wrote a planning rule.

In the old way of monitoring, the gold standard was collecting population trend and size data over time.  But you can’t get population data across large areas: you’d have to repeatedly capture a large portion of the population, which isn’t really possible across large geographic domains.  Plus, even if you can get the data, it’s not that useful.  McKelvey used the example of the dynamics of voles, whose population numbers jump all over the place, so you can’t tell how they are doing.  He also explained that indices aren’t useful, because we often don’t know the relationship between the index number and the population.  One index often used is habitat, but more grass doesn’t necessarily mean more elk: maybe the elk got shot out, or the grass was in the wrong place.  Another index often used is surrogate species, but consider elephants, whose presence doesn’t indicate what other species are present.

Over the past three years a new idea has developed: looking at presence or absence.  Presence/absence is not a surrogate for species abundance; it is a separate idea.  And it has to be both presence and absence, because you can then estimate the false absence rate.  Today, nearly an entire branch of statistics has developed around this idea.  You can look at the area occupied over time, and whether or not the range is contracting.  McKelvey said that species range is a better metric than population size for assessing long-term viability.  It’s also important for monitoring range shifts due to climate change.  Plus, the spatial part of the analysis is a nice hook for looking at the management being done on the land.  Presence/absence data is also easier to collect than abundance data.
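
To make the false-absence idea concrete, here is a minimal sketch of the standard occupancy-modeling logic (an illustration under stated assumptions, not McKelvey’s actual analysis): if a species is detected with probability p on each of k independent visits to an occupied site, the chance of a false absence is (1 − p)^k, and a naive occupancy estimate can be corrected accordingly.

```python
# Minimal sketch of the occupancy/false-absence logic behind presence/absence
# monitoring (illustrative only; all numbers hypothetical).

def false_absence_prob(p_detect: float, n_visits: int) -> float:
    """Chance an occupied site shows no detections in n independent visits."""
    return (1.0 - p_detect) ** n_visits

def corrected_occupancy(naive_occupancy: float, p_detect: float, n_visits: int) -> float:
    """Inflate the naive (observed) occupancy to account for missed detections."""
    p_detected_at_least_once = 1.0 - false_absence_prob(p_detect, n_visits)
    return naive_occupancy / p_detected_at_least_once

# Example: species detected at 30% of surveyed sites, with a 40% per-visit
# detection probability and 3 visits per site.
print(false_absence_prob(0.4, 3))          # ~0.216: occupied sites that look empty
print(corrected_occupancy(0.30, 0.4, 3))   # ~0.383: corrected fraction occupied
```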

There is an exciting new suite of tools: forensic DNA work – collecting hairs, scats, and so on.  Just by sampling water, you can detect the presence of fish DNA.  This radically decreases the cost, and the level of skill required of the person collecting the data – it doesn’t take the skills needed for something like a breeding bird survey.

Presence/absence has some problems because it is insensitive to things like habitat fragmentation.  It tells you the “wheres” of it all, but not how the population is put together.  So McKelvey urged a second monitoring method: landscape genetics, which is really good at filling the gaps in knowledge. In most cases, if you’re handling hair, scat, or other detritus, you have enough material for genetic sampling.  Historically, lab costs have been a big deal.  In 1990 it cost $10 for one base pair.  Those costs dropped to $1 for up to 1 million base pairs, and now it’s even a third of that.  This stuff will be cheap and fast.  McKelvey gave an example of a study on wolverines, where they identified corridors based on empirical data.
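
The scale of that cost decline is easy to miss in prose; a quick back-of-the-envelope calculation using the figures McKelvey cited:

```python
# Back-of-the-envelope comparison of the sequencing costs McKelvey cited.
cost_per_bp_1990 = 10.0                   # $10 for one base pair in 1990
cost_per_bp_later = 1.0 / 1_000_000       # $1 for up to 1 million base pairs
cost_per_bp_now = cost_per_bp_later / 3   # "now it's even a third of that"

print(cost_per_bp_1990 / cost_per_bp_later)  # 10,000,000x cheaper
print(cost_per_bp_1990 / cost_per_bp_now)    # 30,000,000x cheaper
```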

Marilyn Stoll, a biologist with the Fish and Wildlife Service in Florida, discussed the integration of science, policy, and stakeholder involvement in her recovery work in the Everglades. (slides pdf) She said that there are many legal authorities for an ecosystem approach, including the affirmative obligations of section 7(a)(1) of the Endangered Species Act.

Stoll said that recovery plans contain important information that should be used in land management planning.  For instance, the South Florida recovery plan covers 68 species, and it contains new science and recovery tasks.  It addresses 23 natural ecosystems and lists restoration tasks.  She explained the need to communicate with others in order to implement the actions.  She mentioned that recovery plans are helpful when you look at the reasons the species were listed and what the threats were.

Stoll described a conceptual model under ESA section 7(a)(2) in which species status can decline from healthy populations to species of concern, candidate, proposed, threatened, endangered, and lastly in jeopardy.  It might be helpful to think about where in the model the species currently is, and which way it’s headed.  You look at the environmental baseline, the status of the species, the effects of the proposed action, and cumulative effects (state and private actions) – those are the components of the jeopardy analysis.

In the Everglades, she is looking at water quality, timing, and distribution.  There is an engineered system they are trying to restore.  They are developing an integrated science strategy, with adaptive management at different scales.  More importantly, they are communicating the science to all the different audiences at all the different times.  This varies from easy-to-read material, like maps colored green or red, to more complex ecological models focused on hypothesis clusters.  Stoll gave a few examples, including restoring the channelized basin around the Kissimmee River and addressing the barrier created by a road and canal.

Gary Morishima, a private consultant in Washington state, emphasized a holistic approach. (slides pdf) In explaining how we must look beyond borders, he quoted Chief Seattle:   “All things are connected.  Whatever befalls the earth, befalls the sons of the earth.” Morishima said that tribes have traditionally managed lands according to this principle.

Morishima said there is no firm definition of biodiversity.  The U.N. Convention on Biological Diversity calls it the variety of life, but it is valued by different people and cultures for different reasons, ranging from aesthetic to economic.  Over history most species have become extinct; nature is indifferent.  Ecology and evolution are intertwined.  The environment is inherently unstable: species adapt or they die.  Human societies have evolved during the Holocene, a period of relative stability.  Although some human influence is not bad – sometimes species richness is improved by human involvement – some have talked about a sixth great mass extinction being caused by humans.  We care because of our ethics and values.  The U.N. declared 2010 the International Year of Biodiversity, and no international reports since 1992 have shown any improvements.  2011 will be the U.N. International Year of Forests.

Morishima said that our society's emphasis on individualism, concept of property, and drive to accumulate private capital lead us to become "pixelized" at the smallest possible unit.  This leads to isolation, fragmentation, compartmentalization, and costs being transferred to others.  This pixelized view of the world is hard to put back together again.  He gave an example from King County, Washington, where the land ownership pattern is highly pixelized: missing landscape components, disconnected properties, externalities, and divergent management goals.  Land ownership is not coincident with ecosystem function, and even the National Forest has been pixelized by management allocation schemes of allowed/restricted/prohibited.  Nationwide, forests are disappearing as landownership fragments into smaller parcels.

Morishima said that it's time to step back and think things over: to determine the right goals, and whether those goals are effective for managing for a suite of benefits.  We need to ask what society wants from our forests, both economic and ecological.  Often we can't get there from here – many of the things we want we can't get when we look only within administrative boundaries.  The big question is how we coordinate and integrate.  He said that species goals can't be met on Forest Service lands alone.  We need a systems view.  Not all forest lands are equal, and we can't expect everything for everyone, everywhere, all the time.

Morishima gave some examples of “greenprinting” in King County, where ecological values are assigned to each parcel.  Another concept is to use “anchor forests” on the landscape to support transportation, manufacturing, forest health, and ecosystem function. 

He talked about the obstacles of communication and distrust; distrust arises from perceptions of risk.  He cited Ezrahi's work on pragmatic rationalism, describing the relationship between politicians and scientific experts.  If the two agree, there is an efficient means to the end; if they disagree, you can get "biostitution."  There is a need to search for a "serviceable truth" that does not sacrifice social interests for scientific certainty.  There is no free lunch.

Morishima explained a new paradigm of panarchy (see Dave Iverson's related post), resiliency, and consideration of social and ecological systems.  Later, in a follow-up question, Morishima said that panarchy is based on the recognition that we can't know, much less control, everything in the system.  So we need to develop systems adapted to change and disturbance – both socially and ecologically.  He said that integration work has to be supported.  What the public wants is not input – they want influence.

He quoted the Secretary of Agriculture's all-lands/all-hands approach: work collaboratively to effectuate cooperation.  The Forest Service must change, overcoming institutional barriers to collaboration and a reluctance to devolve decision making.  We must support collaboration at the local level, stakeholder involvement, independent facilitation, and multidisciplinary communication.  We need to replace our pixelized window on the world with a landscape view of social and economic processes and realities.  Morishima concluded by saying that this new approach is not so new after all: it reflects how Native Americans have managed for generations.  All things are connected; we are part of the earth, merely a strand in the web of the world – we did not create it.

Bill Zielinski of the Forest Service Pacific Southwest Research Station concluded the panel presentations with a discussion of the conceptual thinking behind conservation planning. (slides pdf) He talked about the two common components, sometimes called the "coarse filter" and the "fine filter."  A coarse filter approach assumes a representation of ecological types and ecological processes.  A fine filter approach is a complement that looks at specialized elements.  Most scientists seem to agree that a combination is a decent compromise, as used by the Nature Conservancy and in the forest restoration approach described by Tom Sisk on the first day of the science forum.  Zielinski said the coarse filter is cost effective and easy to implement (Schulte et al. 2006), but it assumes you have information on vegetation composition and structure.  It takes more than cover types and seral stages, which are not good predictors of populations; the literature (for instance Noon et al. 2009; Schlossberg and King 2008; Flather et al. 1997) commonly shows a wide distribution of population sizes within a vegetation type.  The other shortcoming of a coarse filter approach is rare local types.  The fine filter approach assumes you have indices – now, increasingly, presence/absence monitoring.

Zielinski said that we need to respect our ignorance – we don't understand the complexity of nature well enough to develop a protocol for sustaining ecosystems.  We often don't know system dynamics; we don't know what to protect, what to restore, or what to connect.  But we can't afford to delay action until we understand the full extent of diversity on public lands.  He argued for a spatially extensive and economical method to collect information on species.  Zielinski said that we can exploit existing platforms, like plots from the Forest Inventory and Analysis (FIA) program, and models like the Forest Vegetation Simulator (FVS).  He used this existing information for small, high-density species to build a habitat model that can predict occurrence at all FIA plots.  For instance, he was involved in a study that looked at 300 2.5-acre plots in Northern California for terrestrial mollusks.  For larger animals with larger home ranges, he uses the same approach but looks at an important habitat element like resting places.  For instance, he was involved in a study that used FIA plots on four southern Sierra forests to predict resting habitat for fishers, for two time steps of the FIA series.  He used FVS to predict future conditions of stands and plots, and linked FVS to the FIA-based models he built, so he could answer questions like the effects of thinning on fishers.  He added that the approach can be expanded to multiple species with simple detection surveys.  Later in the session, Zielinski responded to a question from the audience about how sparse the FIA samples are, noting that forests and districts will sometimes supplement the data.
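
A hedged sketch of this style of plot-based habitat modeling (my illustration, not Zielinski's actual model or data): fit an occurrence model on surveyed inventory plots, then score every plot – including FVS-projected future stand conditions – the same way. The predictor names and data below are hypothetical.

```python
# Sketch (not Zielinski's actual model): predict species occurrence at
# inventory plots from stand-structure variables, in the spirit of the
# FIA-based habitat models described above. Variable names are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Hypothetical training data: surveyed plots (1 = detected, 0 = not).
# Columns: basal area (m2/ha), canopy cover (%), large-tree density (per ha).
X_surveyed = rng.uniform([10, 30, 0], [80, 95, 40], size=(300, 3))
y_surveyed = (X_surveyed[:, 2] + rng.normal(0, 5, 300) > 15).astype(int)

model = LogisticRegression().fit(X_surveyed, y_surveyed)

# Predict occurrence probability at every plot, including unsurveyed ones;
# FVS-projected future stand conditions could be scored the same way.
X_all_plots = rng.uniform([10, 30, 0], [80, 95, 40], size=(1000, 3))
p_occurrence = model.predict_proba(X_all_plots)[:, 1]
print(p_occurrence.mean())
```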

Discussion

There were a number of follow-up questions for the panel.

Regarding the question of monitoring systems, Morishima said that they don't take advantage of the information that tribes have.  Tribes have a permanence of place: they will observe things, but won't always express them in the terms that we use.

Responding to a question about conflict between ecosystem goals and species goals, Zielinski talked about ecosystem goals for thinning, noting that not all species benefit from those broader goals.  Stoll said we need to manage for the at-risk species, because they are probably endangered precisely because they are niche specialists.

Morishima said that the rule needs to avoid "rigidity traps."  We should focus on flexibility for species and social systems to adapt.  Under panarchy and social/ecological integration, we need to let things self-organize in order to provide for resilience.

Zielinski referred to the previous day's examples in Connie Millar's talk about pikas in high-elevation microclimates.  Forestry in the past has created homogenized environments.  We need to take our guide from ecological processes and look at heterogeneous landscapes.  Many species in a bind will have more local refugia – like the pika below the talus.  Stoll added that climate change will also affect human populations, and she encouraged landscape conservation cooperatives to look at human movement in response to climate change.

The panel also discussed how species should be selected for monitoring.  McKelvey said there are a number of criteria, some social, some functional.  In Region 1 they ranked species, emphasizing things like primary excavators or pollinators.  Zielinski said that just when you come to believe certain species are key, the science shifts to other species: the suite of species changes with science and society's interests.  McKelvey added that once you start monitoring, you're in for a long run; if five years into the game you want to do something else, you won't produce anything coherent.  The same goes for models – everything is changing, computer capacities, remote sensing capacities, but the monitoring is what you put in place 25 years ago.  Regarding reductions in monitoring costs, McKelvey said you just go down the priority list until you run out of money.  Zielinski takes a more practical approach, looking at "gravy data" or "incidental data."

Somewhat surprisingly to the panel, a question about the 1982 rule viability provision didn't come up until the end.  Zielinski said that viability has had an interesting history.  In its most formal definition it requires difficult data collection: you can't compute the probability of persistence.  Viability may be more nuanced these days, such as maintenance of geographic range.  Zielinski said that viability analysis is not his area of expertise, but he thought the concept is evolving.  McKelvey said the formal population viability analysis (PVA) process can't be done: the determination of the probability of surviving can't be made.  Think of Lewis and Clark predicting what the area they were exploring would look like in 200 years.  McKelvey said that if you back up and look at the concepts, you think about what is well distributed and connected.  If a population is well distributed and connected, it's probably viable, and if we monitor those attributes, we can contribute to viability.  Morishima added that for many species, you can't guarantee viability looking at Forest Service lands alone.  The whole issue of diversity in the 1982 rule could set the Forest Service up for an impossible situation when you expand the concept to nonvertebrates.  You could set up a "rigidity trap."  Stoll acknowledged that you still need PVAs for listed species.

Secretary of Agriculture Remarks at Science Forum

Secretary of Agriculture Tom Vilsack started the second day of the science forum this morning to emphasize his attention to the Forest Service planning rulemaking effort.  He said that the Obama administration believes in science.  He noted that forests are a source of enormous economic opportunity, and a source of water for economic livelihoods and for drinking.  This rule is an important blueprint.  He said that past rulemaking processes haven't been collaborative, resulting in litigation and misguided policies while forests have suffered.

Vilsack said that people are surprised that he even knows about the Forest Service.  But he called the Department "an every way every day agency" that is not just about farming.  He said many people don't even know the Forest Service is in Agriculture.  For him this is a personal matter: when he first got the job, his son in Colorado (an outdoor enthusiast) told him that he now had responsibility for the Forest Service.  Vilsack later talked about his interest in forestry after family experiences in Iowa planting pine trees and watching them grow.

He told the audience that they were all important in this process.  Vilsack said that the President is focused on this as part of the conversation around his America's Great Outdoors initiative and an upcoming White House summit.  The country is faced with obesity issues, and part of the problem is that people don't spend enough time in the great outdoors.  The planning process is part of this larger dialogue.


K.I.S.S. in Rule Form, Part 5

In drafting these K.I.S.S. model rules, I look first at what the original 1979 and subsequent 1982 rules have to say on each subject. I use the 1979 rules because I have a ragged paper copy of that day's federal register with the rules in it. This heirloom was given to me when I was hired as an assistant to teach NFMA planning to Forest Service interdisciplinary teams. My boss told me to read the rules, which were hot off the press, and be prepared to "teach" them the following week. I look to the 1982 rules because they are the rules under which all forest plans were promulgated.

It was with some amusement that I noticed, for the first time, that the 1982 rules fail to faithfully implement NFMA’s nominal public participation requirement (see the link’s paragraph (d)). I have fixed that problem below:

36 CFR 219.6: Public Participation

(a) The revised land management plan shall be made available to the public electronically and at convenient locations in the vicinity of the unit for a period of at least three months before the revised plan is adopted. During this three-month period, public meetings at locations that foster public participation in the plan revision shall be held.

(b) If the land management plan revisions, singly or in combination with the vegetation management and timber harvest program, require an environmental impact statement pursuant to Section 102(2)(C) of the National Environmental Policy Act (“NEPA”), 42 U.S.C. § 4321 et seq., the public participation process set forth in Council on Environmental Quality regulations, 40 CFR Part 1500 et seq., shall be followed.

(c) In addition to the requirements of (a) and (b) above, other processes suited to the decisions made in the plan revision, including the vegetation management and timber harvest program, may be used to involve the public.

Science Forum Panel 2: Landscape Models and Monitoring

This afternoon, the second panel at the science forum addressed the technical questions of landscape scale modeling and monitoring, and the related question of adaptive management.

Eric Gustafson of the Forest Service Northern Research Station started the panel by describing the merits of landscape models, which he called the "computational formalism of state-of-the-art knowledge." (slides pdf) He said that well-verified models are as good as the science they reflect.  They are a synthesis of the state of the art.  They can be generalizations, but taken as general expectations they are useful for research and planning.  Models can account for spatial processes and spatial dynamics, consider long temporal scales and large spatial scales, and make predictions about the expected range of future states (composition, pattern, biomass).  These models don't accurately predict individual events.

Gustafson mentioned the "pathway-based succession models" often used in the West (VDDT/TELSA, LANDSUM, SIMPPLLE, RMLANDS, Fire-BGC, FETM, HARVEST).  In these models, the disturbance simulation may be process-based, or mechanistic.  He has direct experience with a different type, the "process-based succession model" (LANDIS, LANDIS-II), for eastern forests with less predictable successional trajectories.  He said that climate change, or novel conditions, may require a process-based approach.  For these models, the landscape scales and long time frames mean that observational validation is not possible, so modelers validate independent model components that are as simple and discrete as possible, then verify component interactions and compare model behavior with known ecosystem behavior.  Modelers also conduct sensitivity and uncertainty analysis.  Open-source models are preferable because their code is available for many eyes to spot a problem.
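
To make the sensitivity-analysis step concrete, here is a minimal one-at-a-time sketch (my illustration, not code from LANDIS or any other named model): perturb each parameter of a toy stand-growth component and record how strongly the output responds.

```python
# One-at-a-time sensitivity analysis on a toy stand-biomass model.
# This illustrates the validation practice described above; it is not
# drawn from LANDIS or any other named model.

def biomass_at_t(growth_rate: float, mortality: float, years: int = 100,
                 b0: float = 10.0, carrying_capacity: float = 300.0) -> float:
    """Toy logistic growth with background mortality."""
    b = b0
    for _ in range(years):
        b += growth_rate * b * (1 - b / carrying_capacity) - mortality * b
    return b

baseline = {"growth_rate": 0.08, "mortality": 0.01}
base_out = biomass_at_t(**baseline)

for name, value in baseline.items():
    perturbed = dict(baseline, **{name: value * 1.10})  # +10% perturbation
    delta = biomass_at_t(**perturbed) - base_out
    print(f"{name}: +10% input -> {100 * delta / base_out:+.1f}% output")
```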

Gustafson said that models have been used to compare outcomes of management alternatives, looking at plans once developed, or to compute the effects of proposed management (species and age class, biomass, spatial pattern, habitat for a species of interest).  Gustafson prefers looking at comparisons of alternatives rather than absolute projections. 

When projecting the impacts of alternatives, he suggests a focus on appropriately large spatial and temporal scales for evaluating ecosystem drivers and responses;  accounting for any important spatial dynamics of forest regenerative and degenerative processes; and accounting for interactions among the drivers of ecosystem dynamics and conditions (disturbance, global changes).

Steve McNulty of the Forest Service Southern Research Station talked about climate change and water. (slides pdf) He began with some observations about addressing water resources.  First, if we focus only on climate change and water resources, the forest plan will fail; we need to look at the big picture.  Second, if we consider water as a stand-alone ecosystem service, the forest plan will fail; if we just stress water, we'll miss the impacts on forest growth, carbon, and biodiversity.  Also, if we only look at the effects of climate change on water, we'll miss more important factors affecting water quantity: population change, changes in the seasonal timing of precipitation, vegetation change, and all the sectors of water demand.

McNulty used as an example results from a model called WaSSI (Water Supply Stress Index), which looks at environmental water use and human water use.  The model shows that the vast majority of water is used by the ecosystem – only a sliver of 2 to 5 percent is human use.  Human water use in the West is mainly for irrigation.  The WaSSI model looks at both climate change and population change.
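
WaSSI is commonly described as a demand-to-supply ratio for a watershed; the sketch below illustrates that idea with made-up numbers. It is not the actual model, which simulates environmental and sectoral water use in far more detail.

```python
# Minimal sketch of a water supply stress index as a demand/supply ratio.
# Illustrative numbers only; the real WaSSI model simulates environmental
# and sectoral water use across watersheds.
def wassi(total_demand_mm: float, total_supply_mm: float) -> float:
    """Values near or above 1 indicate a stressed water supply."""
    return total_demand_mm / total_supply_mm

ecosystem_use = 950.0   # evapotranspiration, environmental flows (mm/yr)
human_use = 30.0        # the 2-5% human sliver mentioned above (mm/yr)
supply = 1050.0         # precipitation-driven supply (mm/yr)

print(wassi(ecosystem_use + human_use, supply))  # ~0.93
```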

McNulty also gave other examples of the breadth of the analysis.  Groundwater loss can be a factor: it doesn't matter what climate change does to surface flows if we've lost our groundwater.  Another example is the interaction between land use change and climate change; if we reduce irrigation by 20 percent, it has as big an impact as climate change.  McNulty also described the relationships among water availability, carbon sequestration, and biodiversity: the more you evapotranspire, the more productivity you have – good for carbon storage but bad for water yield.

McNulty also mentioned the TACCIMO project (Template for Assessing Climate Change Impacts and Management Options), a joint effort of the Research Station and the Forest Service Southern Region.  The project compiles current forest plans and climate change science, and you can search by geographic location to obtain a planning assessment report.

Ken Williams, the chief of the USGS Cooperative Research Units, spoke about adaptive management in the Department of the Interior. (slides pdf) Ken was one of the authors of the DOI adaptive management technical guide.   The DOI has worked for many years to develop a systematic approach to resource systems, recognizing the importance of reducing uncertainty.  The DOI framework can be useful to forest planning.

The management situation for the Forest Service involves complex forest systems operating at multiple scales; fluctuating environmental conditions and management actions; decision making required in the near term; uncertainty about long term consequences; and lots of stakeholders with different viewpoints.  There are four factors of uncertainty: environmental variation; partial controllability; partial observability; and structural uncertainty (lack of understanding about the processes).  All of this uncertainty limits the ability to manage effectively and efficiently.

The DOI approach emphasizes learning through management and adjusting management strategy based on what is learned (not just trial and error); focusing on reducing uncertainty about the influence of management actions; and improving management as a result of improved understanding.  Williams said we need to understand that this is adaptive MANAGEMENT and not adaptive SCIENCE: science takes its value from the contribution it makes.

Here are the common features of an adaptive management approach: a management framework; uncertainty about management consequences; iterative decision making; the potential for learning through the process of management itself; and potential improvement of management based on learning.  Adaptive management implementation features the iterative process of decisionmaking; followed by monitoring; followed by analysis and assessment; followed by learning; followed by future decisionmaking. At each point in time, decisions are guided by management objectives; assessments provide new learning, and the process rolls through time. 
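
A skeletal sketch of that iterative loop, under my own simplifying assumptions (two competing system models, with a Bayesian weight update as monitoring data arrive); it illustrates "learning through management" rather than reproducing anything in the DOI technical guide.

```python
# Skeletal adaptive-management loop: decide -> monitor -> assess -> learn.
# My simplified illustration of the iterative phase, not the DOI guide's code.
import random

# Two competing models of how the system responds to treatment.
models = {"model_A": 0.7, "model_B": 0.3}         # prior weights (sum to 1)
response_prob = {"model_A": 0.8, "model_B": 0.4}  # P(good outcome | treat)

random.seed(1)
true_model = "model_A"  # unknown in reality; used here to simulate monitoring

for year in range(10):
    # Decision guided by current belief: treat if expected benefit is high.
    expected = sum(w * response_prob[m] for m, w in models.items())
    action = "treat" if expected > 0.5 else "defer"

    # Monitoring: observe the outcome (simulated from the "true" model).
    outcome = random.random() < (response_prob[true_model] if action == "treat" else 0.2)

    # Assessment and learning: Bayes update of model weights.
    if action == "treat":
        likelihood = {m: (p if outcome else 1 - p) for m, p in response_prob.items()}
        total = sum(models[m] * likelihood[m] for m in models)
        models = {m: models[m] * likelihood[m] / total for m in models}

    print(year, action, outcome, {m: round(w, 2) for m, w in models.items()})
```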

There are two key outcomes: improved understanding over time, and improved management over time.  Williams described two stages of management and learning: a deliberative phase (phase I) with stakeholder involvement, objectives, potential management alternatives, predictive models, and monitoring protocols and plans; then an iterative phase (phase II) with decision making and post-decision feedback on the system being managed.  Sometimes you have to break out of the iterative phase and return to the deliberative phase.

What about the fit to forest planning?  Williams noted that forests are dynamic systems, and that planning is mandated to be science-based and stakeholder driven, acknowledging the importance of transparency, acknowledging uncertainty, recognizing the value of management adaptation, and building upon experience and understanding.  Every one of these planning attributes is a hallmark of adaptive decisionmaking.

The last panelist was Sam Cushman from the Forest Service Rocky Mountain Research Station. (slides pdf) Cushman cited several of his papers on the problems with generalizations and the need for specific, detailed desired conditions and specific monitoring.  He said that his research shows the problem with the indicator species concept: no species can explain more than 5% of the variability of the bird community, and even select poolings of species fail to explain more than 10% of the variability among species.  He said that ecosystem diversity cannot be used as a surrogate for species diversity.  In practice there is a conundrum: there is a limit to the specificity with which you can describe diversity – we have limited data, so we can only define diversity very coarsely.  His research shows that habitat explains less than half of species abundance, and cover types are inconsistent surrogates for habitat.  Vegetation cover type maps also don't predict the occurrence and dominance of forest trees: his analysis found you can explain only about 12 to 20 percent at best.  Classified cover type maps are surprisingly poor predictors of forest vegetation; 80% of the variability in tree species importance among plots was not explained even by a combination of three maps.
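
To make "percent of variability explained" concrete, here is a toy illustration (mine, not Cushman's analysis or data): regress each other species' abundance on a candidate indicator and average the R² values; with species varying independently, the indicator explains almost nothing.

```python
# Toy illustration of how little community variability a single "indicator"
# species can explain; not Cushman's actual analysis or data.
import numpy as np

rng = np.random.default_rng(42)
n_sites, n_species = 200, 30
community = rng.poisson(5, size=(n_sites, n_species)).astype(float)
indicator = community[:, 0]

r2 = []
for j in range(1, n_species):
    r = np.corrcoef(indicator, community[:, j])[0, 1]
    r2.append(r ** 2)

# With independent species, mean R^2 hovers near zero, echoing the
# finding that no species explained more than ~5% of community variability.
print(f"mean R^2 = {np.mean(r2):.3f}")
```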

To be useful, desired condition statements should be detailed, specific, quantitative, and appropriately scaled.  Research shows that the detailed composition and structure of vegetation and seral stages, and their pattern across the landscape, have a strong relationship with species occurrence.  Cushman described an example looking at grizzly bear habitat using detailed pixels rather than patches.  Cushman also discussed his research on species gene flow in complex landscapes and his research on connectivity models.

Cushman added that monitoring is the foundation of adaptive management.  If desired condition statements are specified in detail, with quantitative benchmarks at appropriate scales, we then need good monitoring data – representative data that is carefully thought out.  We need large samples for precision, recent data, appropriate spatial scales, standardized protocols, statistical power, and high precision.  We need to look at the long term.  Adaptive management needs to be supported by monitoring, and it must be funded.
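
As a concrete reading of the statistical-power point, a simulation sketch under assumed parameters (a 2%-per-year decline, lognormal observation noise): estimate how often monitoring of different durations detects the trend. All numbers are illustrative.

```python
# Simulation sketch of monitoring power: how often do we detect a real
# 2%-per-year decline at alpha = 0.05? Illustrative assumptions only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)

def power(years: int, decline: float = 0.02, cv: float = 0.25,
          n_sims: int = 2000) -> float:
    t = np.arange(years)
    detected = 0
    for _ in range(n_sims):
        index = 100 * (1 - decline) ** t * rng.lognormal(0, cv, years)
        slope, _, _, p_value, _ = stats.linregress(t, np.log(index))
        detected += (p_value < 0.05) and (slope < 0)
    return detected / n_sims

# Short monitoring runs have little chance of detecting the decline,
# echoing the "you're in for a long run" point made earlier.
for years in (5, 10, 20):
    print(years, "years:", power(years))
```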

Science Forum Panel 1: Landscape Ecology

The first panel of today’s science forum emphasized landscape scale planning across multiple ownerships.

Tom Sisk from Northern Arizona University began with the word "practical." (slides pdf) The next 20 years of the science of landscape-scale management will focus on transparency, inclusion, and public deliberation.  We are learning to "scale up" our management because of our experience with uncharacteristically large fires and changes in fire regimes.  Sisk showed a map of the large footprint of the Rodeo-Chediski fire in the state of Arizona; the fire spread until it ran out of forest.  Also, wide-ranging species operate at landscape scales, especially when land uses have the potential to fragment habitat.  The protection of diversity will require a continued focus on viability, but with advances in monitoring and assessment, we can estimate species distributions from presence/absence data (which was done for goshawks on over 2 million acres in Northern Arizona).  Finally, exotic species are changing ecosystem dynamics across the continent.  Sisk pointed to the predicted occurrence of cheatgrass using preliminary maps of the areas most vulnerable to invasion, adding that the covariates driving these models involve climate, which makes them useful for predicting effects of climate change.

Later during the panel discussion period, Sisk said that an agreement in Northern Arizona brought people together.  Those efforts filtered into a state-based process: the State office took the next steps, which led to a statewide strategy.  Sisk said that different people who participated in the overall process had different levers they could use to shape the decisions, and there was a pulling of people together across the various jurisdictions.  For scientists, these processes can be a challenge, since they provide the analytical framework, matching analysis to questions.  He said that data in LANDFIRE and other programs help provide capacity.

Sisk also pointed out the collaborative work being done on 3 million acres in northern New Mexico.  He said that one way to address the historic controversies is to empower people with information; this new science can reside in everybody's minds.  Land ownership can fragment our thinking rather than letting us look at integrated landscape properties.  Sisk described working with 50 people in breakout groups to identify priority areas for management attention and to recommend types of management in those priority areas.  Next, they looked at the predicted effects and what the landscape would look like.

Sisk said that science must be transparent, but not dumbed down.  It must be rigorous, repeatable, and defensible so that it will inspire confident action.  The public must own the science if they are to trust and accept decisions based on it.  The planning process should provide a predictive capacity and allow exploration of alternative scenarios.  Scientists should inform decisions, not attempt to dictate them.  He said that landscape ecology is a mature field, but it will take a bold attempt to embrace multiple scales.

Jim Vose from the Forest Service Southern Research Station talked about watershed science. (slides pdf) Much of what is known is at small scales, because watersheds are so complex.  The challenge is taking small-scale knowledge and moving it to the watershed and landscape scales.  He added that watersheds don't operate in isolation – they are connected – and disturbances and habitats extend beyond watersheds.  Movement of materials is the primary way that watersheds interact, and hydrologic flowpaths are the primary way that small watersheds interact with larger ones.

Water is an excellent integrator of watershed health: the cleanest water comes off certain landscape types, and as water flows through land uses, water quality declines.  Landscape design and configuration have a big effect on water quality.  Any type of land management alters water processes and affects resilience.  Most best management practices (BMPs) deal with the physical aspects of land disturbance.

Vose described innovations in linking "bottom-up" (streamflow gauging) and "top-down" (satellite imagery) technology.  There is more data from the last 20 years than from the previous 80.  A variety of sensing technologies can be used to develop monitoring networks, and networks can link to satellites to monitor what's going on across landscapes.  We can quantify watershed processes as data comes together from the bottom to the top, all the way up to landscapes, and address the relationship of large-scale carbon and water cycling.  Vose also mentioned the importance of large networks of experimental forests and the new mega-monitoring network of the National Ecological Observatory Network (NEON).  Regarding climate change, Vose mentioned that we can look at long-term data in new ways.

Vose said that we need to focus on the where, how, what, and why of restoration.  The "where" is really important (it's not random acts of restoration).  We need to look at high-risk areas and "avoid the unmanageable and manage the unavoidable."  Finally, the really big challenge is the "how": looking across ownerships requires partnerships and approaches, and a recognition of connectivity.

Connie Millar from the Forest Service Pacific Southwest Research Station said that we need to confront all the dimensions of climate change, not just greenhouse gases. (slides pdf) We need to understand that climate is a driver of change.  Climate often changes in cycles, nested at different scales from annual to millennial.  Different physical mechanisms drive changes at different cycles, like an orchestra where the music is the combined force of the instruments: individual cycles, gradual or directional, extreme or abrupt.  Species and ecosystems, however, respond to the orchestra with "a cacophony of responses."

The forests we manage are still responding to the long-term cycles of history they have encountered, and that makes a difference in how we can address climate in the future.  For instance, looking at glacial-interglacial cycles, we have evidence of over 40 cycles of greenhouse gas levels.  Over time, species have moved up and down in elevation to respond to climate (north and south in the East).  There are enduring changes in genetic diversity that we're still seeing today.  There have been a number of ecosystem "flip-flops": some transitions involved rapid change, analogous to what we have now.  Some warming has occurred in less than 75 years, with replacement of whole forest types.

We've had cycles with 1000-year periods due to changes in solar activity – we've just come out of a cold phase: the Little Ice Age ended around 1920.  Before that, there was the Medieval Climate Anomaly, with some warming and century-long droughts.  Tall, fast-growing trees now stand above tree line due to a change over 650 years, the lifespan of a single tree; their size and survival depend upon their area of origination, not where they currently reside.

Millar said that the shortest-scale changes are the decadal and annual ones.  The 20-50 year cycles – internal ocean circulation and ocean temperature – affect parts of the world differently.  The shortest is the El Niño/La Niña cycle of 2-8 years.  She said that anthropogenic forcing is important at these shorter time scales.  We could see type conversions, invasions into barren lands, massive forest die-offs, higher-elevation mortality as insects move upslope, and fires occurring in winter.  There are also important aspects of climate change at local scales, whether from natural forces or greenhouse gases.  Millar showed examples of ice cover in the Great Lakes affecting shipping, and noted that 70% of wildfires in the West occur during La Niña events within decadal cycles.  For local management at small scales, local climate responses can be decoupled from effects at regional scales.  For instance, the American pika (the temperate "polar bear") can escape the heat in the air-conditioned ground below the talus.

Millar said that climate is a fundamental architect of ecosystem change in all its flux and dynamics.  Historical conditions are an increasingly poor reference, and we need to look forward.  Human land use and land cover also create new challenges.

Later during the panel discussion period, Millar said that the planning rule should address climate as a macro-disturbance process.  The rule should provide for adaptive capacity of species (species ranges and species diversity) by understanding climate’s role.

Millar also answered a follow-up question from the audience about definitions.  She advised using terms precisely.  "Resistance" describes ways to armor species in the short run against climate effects.  "Resilience" is the most widely interpreted term, ranging from the meaning used in the field of engineering to the idea of returning something to its previous state; it can be thought of as the capacity of a system to absorb change without changing its state.  Millar said that resistance and resilience are mid-term strategies.  Meanwhile, "restoration" can cover a range of options from small scale to large scale.  It will be important to define what we mean.

Max Moritz from UC Berkeley concluded the first panel with a discussion of fire ecology. (slides pdf) He said that the conventional wisdom is that climate change has increased fire activity because of warmer spring temperatures and earlier snowmelt.  But in shrublands, we haven't seen the same effects.  Fire regimes respond to changes in multiple climate variables (temperature, precipitation quantity, and timing).

We've been worried about the problem of fire exclusion.  But the scales of climate change and fire are quite broad.  Fires cross administrative boundaries, and different ecosystems may be adapted to very different fire regimes.

We also tend to think about only one kind of fire – a low-severity fire.  Yet many ecosystems have different sensitivities even to these types of fire.

Moritz said he liked the ecosystem services framework, which can give us common terms and currencies.  For fire, we need some common definition and understanding of both its beneficial and detrimental roles.  Healthy?  Resilient?  A lot of the definitions and a lot of the goals can actually contradict each other.

Moritz described a case study of range shifts to show the benefits and detriments of fire.  The rate and intensity of climate change matter relative to the plasticity and dispersal ability of species.  Many species will have to move 1 km per year to keep up with their suitable climate envelope; there is huge extinction potential.  Moritz also mentioned that many habitat models are missing the effects of fire.  He said that we might see large-scale invasions and retreats of fire.  As for species movement, fire can hasten "trailing edge" losses as well as the benefits of habitat change.
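
The 1-km-per-year figure is consistent with a standard climate-velocity calculation (temporal warming rate divided by the spatial temperature gradient, in the spirit of Loarie et al. 2009). A back-of-the-envelope version, with assumed rates:

```python
# Back-of-the-envelope climate velocity: how fast must a species move to
# stay in its climate envelope? Rates below are assumptions for illustration.
warming_rate_c_per_yr = 0.03     # ~3 C per century
gradient_flat_c_per_km = 0.03    # weak horizontal gradient in flat terrain
gradient_mountain_c_per_km = 5.0 # steep gradient moving upslope, expressed
                                 # per km of ground distance

print(warming_rate_c_per_yr / gradient_flat_c_per_km, "km/yr on flat terrain")
print(warming_rate_c_per_yr / gradient_mountain_c_per_km, "km/yr in mountains")
```

The same warming demands roughly 1 km per year of movement on flat terrain but far less in mountains, which is one reason topographic refugia matter.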

Regarding how fire releases carbon, he said we tend to emphasize stocks instead of fluxes.  Ecosystems have a carrying capacity – forests will regain a standing carbon stock.  Fire is a temporary blip; if you factor in regrowth, it's not quite as drastic.  Black carbon, or "biochar" charcoal, is a relatively stable form of carbon – with every fire, some carbon is fixed relatively permanently.

Moritz said that the ongoing fragmentation of land ownership is a problem, especially when landowners have a decreasing interest in ecosystems and the resources needed to reduce fire related consequences.

Finally, Moritz said that there are lots of uncertainties.  Fire regimes are fuzzy concepts to begin with, and there is often almost no knowledge of some areas: for instance, we have very little information on wind patterns.  However, the broad scope of the planning rule is a plus.

Discussion

After the presentations, there was a panel discussion and question and answer period.  The discussion centered on the topics of uncertainty, scale of analysis, integration of science with local knowledge, and the value of historical data in light of climate change.

Uncertainty

Sisk observed that uncertainty is one of the two or three key pieces of landscape planning.  Scientists try to reduce uncertainty, but uncertainty is a foreign concept to nonscientists, and a scientist must take responsibility for being clear about it.  People aren't ready to receive uncertainty – they are ready to receive the data – but combining data sources, each with its own level of uncertainty, can blow up the whole process.  Vose added that there is tremendous pressure on scientists to tie science to practice despite that uncertainty.  Sisk observed that scientists can relate to scenario-based exploration with a sensitivity component.

Sisk emphasized that adaptive management and management flexibility aren’t the same.  Adaptive management is a structured process where you collect monitoring data and feed it into a learning structure established in the plan a priori.  The results of data collection constrain the decision space.  So adaptive management is less flexible.  

There was a followup question from the audience about the uncertainty of science.  Sisk said that the literature has shown for a long time that technical experts can be wrong.  He added that science informs the citizens and leaders in a democracy.  How to interpret uncertainty is best left to deliberation in a democratic society. Millar added that having panels when there are known differences in the science is important.  It’s important to display the differences in a transparent process.

One audience participant expressed frustration at Forest Service project NEPA documents that fail to address climate change because the issue is uncertain.  Millar said that there is always insufficient knowledge, so that is a cop-out.  When addressing climate change we need to nest local analyses within larger spatial scales and work with the knowledge we have.  Vose said that some body of information can be synthesized.

Sisk said that when we lack information we still need to make tough decisions.  He noted that the "Precautionary Principle" has been used in Europe and was mentioned at the Rio conference: in order to protect the environment, a precautionary approach should be applied according to capabilities, and a lack of full scientific certainty should not be used as a reason to postpone environmental protections.  Later, there was a follow-up question from the audience about whether we should be using adaptive management instead of the Precautionary Principle.  Sisk reminded everyone that the Precautionary Principle cuts both ways: toward both taking and not taking action.  We should not let uncertainty stop us in our tracks.

Regarding adaptive management, Vose gave examples of testing that was done, and talked about the importance of the network of experimental forests.  He said we should go slow, but test, monitor, and adapt.  He said that models are becoming increasingly sophisticated for making projected assessments.  Moritz said that in the absence of knowledge, if we can identify thresholds that we can seek and test for, we can provide some boundaries.

Scale of Analysis

At the beginning of the discussion period, Sisk noted that one thread throughout all of the talks was the need to work at different scales to do intelligent planning.  Vose said that the scale of operation is the larger scale.  Sisk said that for the West, THE issue is water and the future of water resources; water influences the level of organization and how we scale planning.  John Wesley Powell had this figured out.  In some areas watersheds make sense as an organizing tool; in other areas they don't.  Millar added that we should also crosswalk our analysis with nested temporal scales.  Moritz added that although "firesheds" is not a well-defined term, it brings up this idea of nested scales.  But we first need a common language and common definitions.

There was a question from the audience about NFMA planning at a forest scale.  Vose said that watersheds are all relative, and you need to look at all scales of watersheds in planning, but a common focus is the 8th-level HUC.  The real challenge is when you mix National Forest System and private land with different land uses, with no jurisdiction or means to work in a coordinated way.

Integrating Science with Other Sources of Knowledge

One audience participant was concerned about science being compromised in local collaborative processes.  The question is: "how can the public own the science while keeping the science sound?"  Sisk said that science has high social relevance, and science efforts are supported by society.  Training is important.  Scientists have a responsibility to get their information back into the social side of decisionmaking.  We need to make a concerted effort to improve the integrity of the knowledge.  It's the responsibility of scientists to participate in the discussions about the valid use of their science.  In local collaborative processes conducted in a respectful and inclusive manner, there is a receptivity to science.  This isn't the case when decisions "based on science" are simply laid out for people.  Sisk mentioned the idea of "braiding" knowledge, referring to an example of Hopi collaboration.  It's not watering down the knowledge, but braiding the knowledge together, which is stronger.  Sisk said this is the most promising vision for addressing these huge challenges.  Millar added that there is a long history of observation among tribes.

Moritz said that most scientists working at the landscape scale are not as reductionist as other scientists; at this scale, the problems are all connected.  Vose added that the greatest successes come when landscape and smaller scales come together.  The NEON program is an attempt to bridge that gap.

There was a later follow-up question from the audience about communications with the public.  Moritz said that you can't expect the public to dig through the journals, but the public has to have some level of literacy – it has to go back and forth.  Millar talked about the role of a "boundary spanner," someone who can communicate from formal science to the applied world.  We might not have answers at the scale of the operational level.  Sisk added that a collaborative process can bridge communication: some collaborative processes are imperfect, but communications in collaboration are not engineered – they take place more organically.  Moritz added that sometimes the science is already out there, and there is a responsibility for scientific literacy.

Role of the Historic Range of Variability (HRV) Assessment Given Climate Change

There was a question from the audience on the reliance on an HRV assessment in a landscape analysis when climate change may be creating conditions not previously experienced.  Millar said that we can still be informed by historic processes.  The HRV concept is still important, but we need a different pair of glasses: a realignment that uses knowledge from the past about where we've been, looking at earlier patterns that are more similar to where we are going.

Moritz said we need to be educated by HRV, but not be blinded by it.  There are two pieces of info: where we came from and where we are going.

Climate Change Strategies

Moritz said that the idea of assisted migration of species is relatively new and scary; after a fire, if a species has taken a big hit, assisted migration might be a tool.  Millar said that extreme events have historically driven major changes in ecological trajectories.  Large-scale disturbances are opportunities for quickly catching up: we can enable succession rather than force it, and systems can re-equilibrate.  Sisk said that our first line of defense should be to maintain connectivity so that organisms can move on their own; some animals and long-lived plants can do that better than we might think.  Regarding extreme events, Vose said that we can understand hydrologic responses to extreme events.  There are tools and models available.

Science Forum Introductory Remarks

The two-day science forum began this morning with introductory remarks from Harris Sherman, Department of Agriculture Under Secretary for Natural Resources and Environment.  Sherman will be the responsible official for the planning rule.  Sherman began his remarks by saying that the planning rule must be rooted in integrity – it must be based on sound science.

He mentioned how the Secretary of Agriculture has focused on restoration – 115 million acres of NFS lands nationwide are in need of restoration.  For the remainder of the lands, we need to maintain forest health.  We need to address these challenges through landscape approaches.  We also sometimes overlook the importance of forests for water.  Over 60 million people get their water from forests.   We need to think about climate change, and we might mitigate the effects, or adapt to the changes.  We need to be concerned about the diversity of species.  Also, job creation is an important part of this – jobs in restoration or biofuels. 

Sherman said that collaboration is important, and it’s exciting to see the way that people are coming together.  He said that there is common recognition that we have serious problems.  He said that good science underpins each of these areas.  Without good science we’ll have a flawed rule that people will not buy into.  Sherman concluded by emphasizing vision, integrity, and support: developing a rule that will survive legal challenges.

Other introductory speakers included Ann Bartuska, FS deputy chief of Research and Development.   Bartuska pointed to the concept of best available science, and the challenge to answer what it is, when it is, and how we should use it.  She said that science should be the underpinning of the discussions.  She said that the Forest Service tries to make science-based decisions, and has built a science organization to support that, and many scientists from the organizations are present at this conference.  She pointed out challenges such as uncertainty, variability, scale, and complexity.  She also asked what happens if science changes – is it still best science?

Hank Kashdan, associate chief, said that the science forum is an indicator of the collaborative rulemaking effort.  He said there is a clear intent to involve people across the country.  He also said that we can’t overstate the importance of science in this process. 

The director of Ecosystem Management Coordination, Tony Tooke, finished the introductory remarks by mentioning the target of completing the rule by November 2011.  He defined success as having a rule that is grounded in science and robust collaboration, with plans that can respond to current needs and improve management.

Climate Resilience- Let’s Not Overthink This

Those of you who read this blog know that I am a big fan of TU's approach to climate change: Protect, Reconnect, Restore, Sustain. It's on the front page of their website here. These seem to me like the basics of leaving the land in the best condition to respond to climate change as well as other stressors. We basically have a chance to refocus on what we should have been doing all along: good land management and adaptive management with adaptive governance.

For the discussions in the next few weeks, I think it is important that we have an idea of what we mean by climate resilience or ecosystem resilience, which may be two different things. For one thing, climate change will require resilient social systems as well as resilient ecosystems. There is an opportunity for us to become entangled in a verbal jungle from which we may never emerge. So we asked University of Montana graduate student Matt Ehrman to take a look at the different definitions and see how much definitional diversity there is, and whether the diversity lies in the definitions of resilience themselves or in the activities designed to promote climate resilience as opposed to ecosystem resilience.

Here is a link to the paper Climate Resilience. He summarizes:

The second principle in the forest planning rule’s NOI asks the public to consider whether plans could proactively address climate change through a series of measures, such as “[management] will need to restore ecosystem resiliency, and also factor adaptation and mitigation strategies into planning and project development.” The new planning rule should go a step further and offer a clear definition of resilience. It should also seek to identify the USFS value or resource that should be managed for resilience, in addition to what it is to be resilient to. The inherent ambiguity in the term “climate resilience” could pose problems for the USFS as it drafts forest plans under a planning rule that does not explicitly define climate resilience as ecosystem resilience to climate change.

Thoughts? How far along the simplicity spectrum do you want to be?

K.I.S.S. in Rule Form, Part 3

In an agency beset with feelings of process predicament and analysis paralysis, it would be cruel punishment indeed to suggest NFMA rules that add more analysis and process to the mix. The new rules should also be durable; that is, they should not chase after every cause du jour (e.g., climate change) or impose inflexible, one-size-fits-all analysis processes.

The following “Assessment of New Information and Changed Circumstances” is based on the fact that forest plans exist now that cover every acre of the National Forest System. Congress directed that these plans “be revised from time to time” when “conditions in a unit have significantly changed, but at least every fifteen years.” It makes sense that only those parts of a forest plan affected by changed conditions require revision.

It is with these principles in mind that I put forward Part 3 of the keep-it-simple-sweet NFMA rules:

36 CFR 219.3: Assessment of New Information and Changed Circumstances

The revision shall assess new information and changed circumstances and conditions in the unit that are relevant to the decisions made in the land management plan. If the new information or changed circumstances and conditions warrant amendments to the land management plan, the land management plan amendments shall be assessed as a part of the vegetation management and timber harvest program’s NEPA document. If the land management plan amendments, singly or in combination with the vegetation management and timber harvest program, require an environmental impact statement pursuant to Section 102(2)(C) of the National Environmental Policy Act (“NEPA”), 42 U.S.C. § 4321 et seq., an environmental impact statement shall be prepared.