PALS Database Hits the Permitting Reform Big Time

So things were heating up with the Manchin regulatory reform for energy projects, or perhaps “quid pro maybe… not.” It’s kind of fun to watch (more important) groups discuss the same kinds of things we always discuss. But to my amazement, the Center for Western Priorities mentioned this paper, which is actually about the Forest Service and uses PALS. So I guess the FS has hit the permitting “big time.” Here’s what CWP said.

Senator Joe Manchin released the text of his proposed changes to the country’s process for permitting energy projects. The legislation proposes two-year time limits on environmental reviews, prioritization of transmission projects, and significant permitting changes under the Clean Water Act. It would also authorize the completion of the Mountain Valley natural gas pipeline in Manchin’s state of West Virginia.

Recent research from the University of Utah found that the median time for completing an environmental impact statement was 2.8 years, while environmental assessments were completed in a median time of 1.2 years. The study found the main cause of permitting delays was a lack of expertise or staffing, suggesting that increasing funding for federal agencies may be the best way to improve efficiency.

Here’s the conclusion of the law journal paper:

Reviewing over 41,000 NEPA decisions made by the Forest Service over a 16-year period, we observed that reports on average decision-making times across agencies are skewed by outlying decisions with extended timeframes. Focusing on the median decision-making times reveals that the majority of decisions adhere to a more predictable timeframe that is shorter than reported averages. Moreover, level of analysis does not dictate decision-making times. The fastest 25% of EISs are completed more quickly than the slowest 25% of EAs, and the fastest 25% of EAs are completed more quickly than the slowest 25% of CEs. This overlap demonstrates that efficiencies can be achieved at each level of analysis without foregoing the “hard look” required by NEPA. Focusing on activities associated with delay revealed that many sources of delay attributed to NEPA are caused by external factors. Some of these delay factors, like inadequate staffing, insufficient funding, time spent on inter-agency coordination, and litigation aversion, can be addressed through fiscal and cultural reforms. Other sources of delay, like delays obtaining information from permittees, are not caused by NEPA and should not drive NEPA reforms. Finally, when used properly, NEPA functions as an umbrella statute and can mitigate or avoid delays caused by compliance with other statutory and regulatory requirements. We hope that our work, focusing on real-world problems causing delay within NEPA implementation, will provide a springboard to reforms that improve NEPA efficacy and advance the twin goals of public engagement and informed decision-making.
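A quick aside for the quantitatively inclined: here is a minimal sketch in Python, using made-up decision times rather than actual PALS records, of the point about medians, averages, and overlapping quartiles. With a few very slow outliers, the average lands well above the median, and the quartiles of the different analysis levels can overlap.

```python
import statistics

# Illustrative decision times in days; invented numbers, not PALS data.
ea_days  = [120, 200, 250, 310, 365, 400, 450, 600, 900, 2400]
eis_days = [300, 450, 600, 950, 1006, 1100, 1300, 1800, 3200, 5600]

for label, days in [("EA", ea_days), ("EIS", eis_days)]:
    mean = statistics.mean(days)
    median = statistics.median(days)
    q1, _, q3 = statistics.quantiles(days, n=4)  # quartile cut points
    print(f"{label}: mean={mean:.0f}d, median={median:.0f}d, "
          f"fastest quartile under {q1:.0f}d, slowest quartile over {q3:.0f}d")

# With skewed data like this, each mean sits well above its median, and the
# fastest quarter of the EISs finishes before the slowest quarter of the EAs.
```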

It’s nice to know that scientific researchers aren’t the only ones who don’t spend enough time on framing. When people provide “evidence-based” answers in the literature and basically tell people who work in an area “your observations are wrong and off-base,” to me it’s a “Knowledge Production Situation That Shouts Watch Out!” If I had one improvement to make, I would require that, before any journal publishes an article about the practice of something, practitioners review it. Does that sound crazy? I hope not.

Nevertheless, there are many interesting findings in this paper that are worthy of discussion. So for those who are interested, please consider reading the whole thing and commenting.

1. The Forest Service, for historical reasons, has never sited much wind or solar energy; that’s mostly BLM. So if you were going to ask the question about wind and solar and transmission NEPA, why would analyzing FS decisions even be relevant? Perhaps because PALS exists, and BLM doesn’t have an equivalent (perhaps this is an example of the streetlight effect).

2. Many of us think it’s litigating claims under NEPA, NFMA, and ESA that takes time, not NEPA docs per se.  This paper doesn’t discuss litigation at all.

3. The FS has, jointly with BLM, analyzed fossil fuel decisions, some of which I’ve been involved with. They are usually in litigation for many years. Interestingly, though, it’s hard to find a database of these timeframes. It seems like both the FS and the BLM should be tracking litigation timeframes. In fact, the FS has/had an Appeals and Litigation database similar to PALS, at least at one time. We would be able to tell something relevant from that, I think.

4. I wonder what “litigation aversion can be addressed through fiscal and cultural reforms” means in the absence of changes in litigation. It seems to me that litigation aversion may be an altogether rational response… to… er… litigation.

Litigation aversion leads to unwieldy, bulky, time-consuming documents. The EADM Roundtables National Synthesis Report summarized the problem as follows: “Minimal litigation or objection is viewed as a positive outcome in terms of a project moving to implementation, but the negative costs of defensive over-analysis, unwieldy documentation, and narrowing the scope of projects in order to ‘fly under the radar’ of litigants are not usually considered.” The concern resurfaced later in the report when discussing lengthy documents as a barrier to efficient decision-making: “Risk aversion and a history of legal challenges to USFS decisions have led to the ‘bullet-proofing’ of environmental analysis documents and specialist reports.” The report continued, noting that “the complexity and size of analysis is often inconsistent with the complexity and size of the project.” The report explicitly distinguished between this dynamic, which it identified as a cultural barrier within the Forest Service, and the NEPA process itself: “NEPA is often blamed for these problems, when really it is not the law itself but the Agency’s process that is the cause [of lengthy documents].” This observation is consistent with external research on Forest Service NEPA practice. In 2010, Mortimer et al. found that the threat of litigation had more influence than the degree of environmental impacts on Forest Service decisions whether to prepare an EA or an EIS for recreation …

5. Interesting about the focus on APDs (Applications for Permit to Drill) and operators not providing information, which slows down the use of CEs. My view would be that it’s leasing decisions that draw the most fire (because APDs have other analysis completed), and I think they should have been looked at separately. But all of this is probably moot because it’s the litigation/litigation proofing cycle that takes time.

6. Most puzzling to me was this:

The regression database identified 86 projects involving Forest Plan Revisions. The fastest took 45 days and the longest took 5,695 days.  Only 16 (19%) took less than a year. Fifty-two of the 84 projects (60%) were analyzed in an EIS, 21 (24%) were analyzed in EAs, and 13 (15%) were analyzed in CEs. Eighty-four percent of the EISs took longer than the median time for EISs (44 out of 52 took longer than 1,006 days).

We could certainly learn something from the Forest Plan revisions that took 45 days, or less than a year… or maybe they were mislabeled. This certainly points to the utility of practitioner review, IMHO.
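For what it’s worth, here’s a quick back-of-the-envelope check of the reported breakdown, a sketch that takes 86 as the project total (the EIS/EA/CE counts sum to 86) and uses the other counts exactly as quoted:

```python
# Consistency check on the Forest Plan Revision figures quoted above.
# Counts are taken as reported; 86 is assumed to be the project total.
total = 86
under_one_year = 16
eis, ea, ce = 52, 21, 13
eis_over_median = 44

print("EIS + EA + CE =", eis + ea + ce)                                    # 86
print(f"under a year: {under_one_year / total:.0%}")                       # 19%
print(f"EIS {eis / total:.0%}, EA {ea / total:.0%}, CE {ce / total:.0%}")  # 60%, 24%, 15%
print(f"EISs slower than the EIS median: {eis_over_median / eis:.1%}")     # 84.6%
```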

Again, there are many other interesting observations in this paper, and as long as this post is (sorry!), I only highlighted some. So please post the ones you find intriguing in the comments.

13 thoughts on “PALS Database Hits the Permitting Reform Big Time”

  1. Sharon, I’m puzzled by this comment of yours:

    “It’s nice to know that scientific researchers aren’t the only ones who don’t spend enough time on framing. When people provide “evidence-based” answers in the literature and basically tell people who work in an area “your observations are wrong and off-base,” to me it’s a “Knowledge Production Situation That Shouts Watch Out!” If I had one improvement to make, I would require that, before any journal publishes an article about the practice of something, practitioners review it. Does that sound crazy? I hope not.”

    Are you saying that environmental lawyers aren’t practitioners? If, as you say, litigation is the real problem and not NEPA, then aren’t environmental lawyers perhaps the most relevant practitioners? It seems from the rest of your commentary that you at least think the paper was interesting & relevant, so I’m not sure what you are objecting to here. I do know that I was one of several people who reviewed this paper prior to publication. I’m certainly not a practitioner, but I don’t think either of us knows who else might have reviewed it, and I don’t know how much time the authors spend as academics and how much time they might spend actually practicing environmental law.

    • Forrest, thanks for this comment. Let’s parse it out a little. This is the class of research that says “if only agencies would do it correctly/efficiently there would be no problems” and/or “problems, what problems?” I have heard this point of view from CEQ folks when faced with a room full of agency NEPA reps (of which I was the FS one at the time). So yes, there is a difference between being an environmental lawyer and being in the trenches of writing environmental documents, doing public involvement, and all that. Note that the authors don’t make recommendations for how the work of “environmental attorneying” might be done better; rather, the paper is about how the FS can do NEPA better. Or, even more oddly, it suggests that the perceptions of people working on these project documents are simply wrong.

      The only rational way to deal with this is to interview people who think there’s a problem (and there are many such people) and investigate exactly what they think and why. And that is more social science than legal journal material.

      • I didn’t read the article as saying that “perceptions of people working on these project documents are wrong.” I read it as presenting an external view of probable causes – and I’ll note that issues such as staffing and agency culture have been noted in the agency’s own analyses. Perhaps you know people who have perceptions that are inconsistent with the claims in the article. I know people working in NEPA in agencies who thought this paper was spot on. Without doing systematic interviews (which I agree would be useful and I would personally like to do some day) I don’t think we know what “people working on these project documents think.” We have a pretty clear view that people who comment on this blog have widely divergent views based on their own experiences.

  2. As far as the litigation piece, we’ve discussed this before. To outside observers, it’s difficult to understand why rare events so dominate thinking. PALS shows about 1% of decisions are litigated, so it’s difficult to understand why that 1% is the most important. I think if practitioners think that’s the main problem, we should have a discussion about why they think that 1% is so important, and what would successfully address that. Do we think anything in the Manchin package would have actually made a substantial difference?

    • Hmm. 1%. That’s the same argument for prescribed fires that escape. Why do you care, New Mexico, if it’s only rarely that escaped prescribed fires cause damage? Is 1% an argument for the FS not checking on its management of the escapes and trying to improve?

      I can’t remember what bill it was in, but TRCP had an idea about timing of litigation (timeframes) and that claims should go to arbitration (I think it was). I can’t find it right now, but I think there’s something there.

      Note: I think litigation and defense against it are PART of the problem of getting projects going and can’t be ignored, at least for projects on FS land. Yes, there are a variety of other problems, including budgets, people, burn windows, and availability of markets for removed material. But as with renewable energy projects, you can’t stomp on the gas and expect to get far when the brake is still engaged. And the interest groups holding the brake seem unwilling to give up that power or even discuss it. But like I said, for energy, this is all way above our pay grade, and larger forces than our community will have to work it out.

  3. it may be worth a deep dive on the whole of the paper, as a suggested post. at a first read, the use of medians versus averages is commendable, as that’s basic statistical literacy when it comes to outliers and skewed distributions. the recommendations do not exactly seem to follow from the data, despite the claim of an evidentiary base. certainly, the evidentiary base marshaled by the paper is uneven, and that bugs me. their review of evidence for nepa as a source of delay may indeed point to overstatement coming from some corners (manchin, lee, perhaps?)

    …but the evidentiary base supporting claims that the main need is more funding and culture change seems less robust than the statistical datasets, and there’s a problem when you present them alongside such data as if they’re similarly robust conclusions.

    one problem is that the quite robust findings in model results sections iv.a, b, and c only partially appear to inform the conclusions/recommendations that follow in recommendations section v.a.1 through 4. Rather, the much less robust, much more qualitative findings of section iv.d appear (again, on a first read) to be the primary basis of recommendations v.a.2, 3, and 4. The only recommendation that appears driven by any really quantitative data is v.a.1 and part of 2.

    note that reviewers include fleischmann, who’s well established as a proponent of the “NEPA is fine, NEPA cannot be changed” mindset in the research, and that mindset seems to have been ingrained at the outset here. if the paper is reviewed by researchers with a known and previously indicated bias, it’s not realistic to expect unbiased results, sorry. One indicator that such a mindset is assumed at the outset is the fact that the authors appear to believe that the production of information in the NEPA process is itself a worthy outcome in land management, as opposed to a necessary but less valuable step that prefaces actual outcomes on the land. That is something reasonable people can disagree about (for instance, what level of information actually meets the need for transparency and deliberation; and if courts have stated that that is the purpose of NEPA, they have also stated that it is a procedural statute).

    A more detailed review could try to link the arguments and conclusions more carefully than I’ve done here, but i am skeptical that the full suite of recommendations actually follows from the model results presented, at least not without smuggling in a number of assumptions that could themselves be questioned or at least subject to reasonable disagreement.

    another potentially worthy topic in a more detailed review could be practitioner review of the claims made about FS nepa processes, culture, and the like.

    a final aside, and bone to pick: culture arguments and culture recommendations are, more often than not, just audacious fudges, an authoritative-looking vessel to house whatever opinion you already held. in a paper with “evidence-based” in its title, that’s probably inappropriate. after all, this usage of “evidence-based” nomenclature reminds one of the use of that terminology in medical parlance and journals, subject to massive meta-analyses (like a Cochrane review). The “culture” parts of this are not even in the same galaxy as that kind of standard. maybe a bit harsh there, but in a paper that does contain legitimate statistical work (and is in large part premised on that as a corrective to other research or opinion), this kind of thing sticks out as a weak, overreaching part of the analysis.

    • To be clear, Anonymous, this paper was published in a Law Review. Law Reviews are not subject to peer review; they have their own rather idiosyncratic editing and reviewing process, which those outside of law don’t really understand, but the primary editors and reviewers are students at the law school. The fact that the authors of the paper asked me for my comments informally prior to publishing is an indication that they were seeking input from people who had published related research. I had nothing to do with the publication process, so I don’t see how their asking for my feedback in an informal context would constitute a “bias.”

      I would be interested in what recommendations others think are evidence-based. I think the larger problem – at least from an outsider perspective – is a lack of evidence about what works. I agree with anonymous that some of the evidence-based claims have less clear evidence behind them than others (although I’d request that you refrain from confusing math with evidence).

        • I think it’s fairly unusual to have a law review article with a statistical model in it (my recollection is that the authors of the paper sent it to me because they wanted to check that they were doing their modeling correctly). Yet I think law reviews are full of evidence – the evidence is, in my understanding, typically logical, historical, and analogical (I’ve never written a law review). In the social sciences we talk about evidence coming in many forms, of which statistical analysis of quantified phenomena is only one. The strength of evidence is not so much tied to quantification as to the research design (i.e., experimental evidence is valued more than observational evidence, even if the observational data is more quantified, which it often is), as well as to the fit within a broader understanding of a field. As far as I can tell this is similar to biological fields, in which many influential pieces of evidence do not appear to involve quantification – for example https://www.nature.com/articles/171737a0. If the kind of evidence we want about NEPA is quantitative evidence that can be subjected to meta-analysis, we’re never going to have any evidence.

  4. as an additional note, i think the manchin piece probably would not fix the issues it claims to, in spite of my criticism of the paper’s conclusory overreach or weakness on items where it strays beyond the implications of the quantitative data.

  5. The link in the following text does not point to the Utah research but to the NY Times article.

    “Recent research from the University of Utah found that the median time…”

  6. Sharon: “But all of this is probably moot because it’s the litigation/litigation proofing cycle that takes time.”

    I’m not sure if this was limited to APDs or more general, but I think it’s more of a scapegoat. If litigation is a real problem, it’s because of a law, and that should be the focus of criticism. If litigation proofing is a real problem, I suggest that it is because the people doing the work don’t really understand the law. In the formal agency training they learn what the book says, but less about why, so they may not make appropriate judgments about how much effort to put where. And then there are turf/funding kinds of issues where everyone wants to show how important the work they do is. I’ve seen plenty of environmental documents that have as much excess on less important things as they have deficiencies on the critical issues.

    Overall, I would say litigation aversion by decision-makers is a good thing. It is one way that public opinion can actually make a “political” difference. You might find this discussion of the role of NEPA interesting (though I’m not sure who can read the Missoulian; the headline captures it): https://missoulian.com/news/local/um-law-prof-public-input-environmental-impacts-might-not-stop-holland-lake-proposal/article_dcbbdb2d-f545-5ab5-9f60-273de1c326c7.html

    “Zellmer said that it’s up to the discretion of the Forest Service as to how much weight they give public comment when deciding what type of review to conduct or whether to approve the project.”

