Should the State of Colorado Fund Fire Modeling Research?

Apparently NCAR (Boulder, CO) folks visited the Denver Post editorial board, which produced this editorial last Sunday.

Colorado has been ravaged by large and unpredictable wildfires and floods in recent years that have left death and destruction in their wake.

If we could rewind time and know 12 hours in advance what some of these monsters were going to do, could some of that damage have been prevented? Could lives have been saved?

The answer is undoubtedly yes. And that is why forecasting tools developed by the National Center for Atmospheric Research in Boulder are so exciting. They can make such predictions with surprising accuracy, based upon numerous tests retracing actual events. Now, a bill in the legislature would spend $10 million over five years to put these systems to work in Colorado.

It’s important to know that these systems aren’t theoretical.

Scientists have been working on them for decades and the National Science Foundation and other funders have invested more than $20 million in research time and dollars to create the technology.

The Colorado contribution would finish the job and create tools tailored to the unique topography and weather of the state.

The fire modeling system marries newly available satellite imagery with detailed weather data to predict important characteristics of a wildfire and how it will likely move and change in the coming hours. It even considers the available fuel and the hydration status of vegetation.
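For readers curious what "fuel plus moisture plus wind" prediction looks like in even its crudest form, here is a toy cellular-automaton sketch of my own. It is purely illustrative and bears no relation to NCAR's actual coupled weather-fire system, which resolves real physics; the grid, moisture values, and wind weighting below are all invented for the example:

```python
# Toy fire-spread sketch (illustrative only; NOT the NCAR system).
# Each cell has a fuel-moisture value in [0, 1]; fire spreads to a
# neighboring cell with a probability that falls as moisture rises
# and rises when the neighbor lies downwind.
import random

def step(burning, moisture, wind=(1, 0), p_base=0.6, seed=None):
    """Advance the fire one time step.

    burning  : set of (row, col) cells currently on fire
    moisture : dict mapping (row, col) -> fuel moisture in [0, 1]
    wind     : (drow, dcol) direction; downwind spread is favored
    """
    rng = random.Random(seed)
    new = set(burning)  # cells already burning keep burning
    for (r, c) in burning:
        for dr, dc in [(-1, 0), (1, 0), (0, -1), (0, 1)]:
            nbr = (r + dr, c + dc)
            if nbr not in moisture or nbr in burning:
                continue
            # Wind alignment: +1 directly downwind, -1 directly upwind.
            align = dr * wind[0] + dc * wind[1]
            p = p_base * (1.0 - moisture[nbr]) * (1.0 + 0.5 * align)
            if rng.random() < max(0.0, min(1.0, p)):
                new.add(nbr)
    return new
```

Even this cartoon shows why the inputs the editorial mentions matter: saturated fuel (moisture near 1) never ignites in the model, and dry downwind cells ignite almost every step, so the forecast hinges on knowing fuel condition and wind in advance.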

The money that the sponsor, Rep. Tracy Kraft-Tharp, D-Arvada, is looking for in House Bill 1129 would bring this tool from the demonstration phase to a point where it can be used by firefighters while a fire is unfolding.

The same goes for the flood prediction tool, which uses many of the same types of data to forecast where flooding will occur and how severe it will be.

The price tag for disasters in recent years in Colorado has been huge — in the billions of dollars.

These tools have the potential to reduce costs and suffering at a relatively modest price.

I wonder why NSF couldn’t bring this project home after investing $20 million, or why JFSP or other funders wouldn’t want to do this beyond Colorado. If it’s useful, it seems like it should be applied more broadly than just Colorado (and maybe the $10 million would go farther).

Evolutionary Theory and the Practice of Policy (3): Disciplinary Fragmentation

Here’s me giving my talk at the Festschrift.
In this section, I focus on how producing droves of scientists has led them to form smaller groups, or disciplinary splintering. This has, in turn, led to “reinventing the wheel” plus micro-disciplinary territorial disputes, all of which might be invisible to the naked policy eye. Policy makers, to my mind, must be aware of the sociology of these disputes to really understand what scientific information can contribute, or in some cases, cannot contribute. And I would argue that 15 years after I wrote this paper, we still know little about the environmental effects of genetically engineered organisms, compared to how widely they are dispersed into the environment. Follow the money, as they say, back to the groups who outline “priority research” and the panels who decide how the funding is spent. I worked on one research program where bad vibes were given from above for actually including a stakeholder on research panels, because doing so did not hold to the sacred NSF model that USDA was trying to emulate at the time, despite Congress’s clear direction for the program (the Fund for Rural America).

When a person in policy or management tries to arrange research developed by the current system into a package relevant to a real world problem, this can leave a somewhat awkward jumble of technologies and strategies as a pool of scientific information. An analogy might be designing a car by contacting a couple of hundred groups and asking them to work on car design. Some might work on door design, others engines. But perhaps no one works on steering or air conditioning. And there’s probably a few who decide that what’s really needed is a train or an airplane and work on that. And it’s no one’s job to ensure that the pieces are all there or that the pieces fit together. Within the research system, there is little to encourage interdisciplinary cooperation as journals, funding agencies and other power structures of research communities tend to be either disciplinary, or from a restrictive subset of disciplines (e.g., ecological economics). This makes the work of taking the discrete nuggets of scientific information and arraying them into a meaningful policy analysis another mix of science, art and intuition.

DIVERGENT SELECTION, INDIRECT SELECTION AND DISCIPLINARY DRIFT

Administrators in research have selected for certain traits in scientists. Certainly it is desirable to measure accomplishments, and this has been done by counting papers published and grants awarded. However, two questions arise. First, are there undesirable indirect effects from this selection? Second, how divergent is this selection from selection in the policy arena, and how would such a divergence influence scientists working with policy makers?

No doubt there are other forces that cause disciplinary fragmentation, but one visible symptom has been the proliferation of journals, symposia, associations and subdisciplinary communities. This is good for publication records, but difficult for individuals who want to keep up with or synthesize science findings. In terms of worldview, there tends to be some “disciplinary drift” as well.

Each scientific discipline contains the paradox that the more the circumstances are controlled to get accurate data, the less relevant the answer is to the real world. Science used to depend for its legitimacy on designed experiments, which could be replicated and tested. As issues like global climate change come under scrutiny, however, or even evolution, it is recognized that in most cases rerunning the clock is not possible, and even if it were, stochastic forces might lead to a variety of possible outcomes. Therefore, as problems get more complex, science grows less “scientific.” We depend more and more on a given scientific community, rather than reproducibility, to determine what is good science and bad science. But as disciplines splinter and recombine, the scientific communities may be mixing values and science in varying proportions with unquestioned approaches and unstated assumptions and paradigms. Thus in today’s complex world, there may ultimately be no quality control on this science.

In addition, focusing on the production of publications as an organizational target, combined with disciplinary drift, can lead scientists to amplify minor discoveries or to reinvent what is commonly known in another discipline. There is also a tendency to make the simple arcane and esoteric so that a discovery appears important. In policy, citizens and their predilections are the key. In research, both citizens and practitioners are often left out of decisions and are not the target of communication. This is a major difference between the two worlds.

Scientists can become advocates for technologies they develop, or amplify threats (from someone else’s technology). In this environment, it is difficult for a policy maker to get around the self-serving nature of these debates and get information that is balanced. For example, Jasanoff (1990) cites the Ecological Society of America adopting an influential public position on assessing the risk of releasing genetically engineered organisms into the environment. According to Jasanoff, this action was prompted in large part by a desire to enhance the organization’s professional standing; significantly, it postdates a report from the National Research Council in which the institutionally more powerful community of molecular biologists and biochemists had articulated somewhat different principles, downplaying ecological consequences. Today, almost nine years later, with substantial sums of research funds invested in the interim, we are no closer to understanding the true environmental risks of GMOs than we were then. This is clearly an indictment of the scientific establishment’s ability or desire to look beyond its interest in increased research funding by discipline and develop information useful to the citizens of the U.S.

Evolutionary Theory and the Practice of Policy For the 21st Century (2)

Dr. Bill Libby and I at Unifying Perspectives of Evolution, Conservation and Breeding: A symposium in honour of Dr. Gene Namkoong

In this section, I address different framings of the larger issue around forests, and explore how different framings privilege different scientific and other disciplines. This can occur whether a framing is helpful or not to decision makers, or accepted or not by stakeholders, simply due to how research is prioritized and funded (most often, by groups of scientists).

Below is a sample of some ways of looking at the important issues for forests, intended to show the breadth and diversity of approaches that different disciplines can take. As stated above, this tends to be done implicitly, with relatively little debate or discussion among disciplines. It seems likely that discussion across disciplines and with stakeholders and the public about the important issues might serve to focus research to provide better information for future decisions.

Is the basic problem in forestry how to allocate forests to different uses in a wise, just, and environmentally sensitive manner: “how should we manage a given area of land or landscape?” This is traditionally the area of planners, local governments, and landowners. These groups then use scientific information as available to help make decisions. Other valuable information they use in decision-making includes their own experience, the experience of practitioners and indigenous people, history, the law, and the mixture of their own preferences and values.

The role of place is specifically linked to this framing of the question. In an era of globalization, land is the ultimate thing that cannot be moved or shipped. Carey (1998) describes the process of becoming located in a place in these words: “over time our perceptions, thoughts and feelings undergo a process of integration with that place. We achieve an intimacy with climate and landforms. Adaptation gives way to coevolution. The place changes us even as we change the place.” He goes on to compare “space” and “place”: “space is defined by numerical coordinates, squares on a grid, longitude and latitude. By contrast place is defined by human experience, by stories and a sense of connection that is born of years spent observing and interacting with a particular ecosystem.” The poet Gary Snyder (1995) calls bioregionalism and watershed consciousness “a move toward resolving both nature and society with the practice of profound citizenship in both the natural and social worlds. If the ground can be our common ground, we can begin to talk to each other (human and nonhuman) again.” One of the key tensions in land allocation is between local, regional, national and international rights and responsibilities, and between academic, scientific and local knowledge. Yet framing the issue as helping local decision makers allocate land leaves a different center and clientele for research than some of the other ways of asking the question.

From the forestry or landscape architecture point of view, is the question “how should we design landscapes?” McQuillan (1993) mentions both the aesthetic of architecture and the aesthetic of participation in the urban or rural environment. McQuillan also points out that forestry is an art as well as a science, as acknowledged by the Society of American Foresters since 1971. Once again, in forests there is science, but what is its proper role: like that of engineering in architecture? Perhaps forestry and conservation biology have this in common, as Soulé (1985) states in describing conservation biology: “in crisis disciplines, one must act before knowing all the facts; crisis disciplines are thus a mixture of science and art, and their pursuit requires intuition as well as information.” Once again, real world applications require some mix of intuition, art, and science. Still, looking at the issue as one of design rather than allocation brings art to bear, and a different lens than simply allocation. Framing the question this way leaves science, art, and intuition as partners in design, and on real landscapes, citizens and their political structures.

Is the key question for forests “what practices and technologies should be used?” Certain practices, such as home and road development, logging, automobile use, and using or not using fire, have effects on water, soil, air, and organisms of all kinds. Can humans decrease our needs, substitute other products, or decrease the negative impacts of these practices? The difficulty with framing the question this way is that in many cases the critical question about a practice for a policy maker is not whether it impacts the environment, but how its impacts compare to those of available alternatives. Often in scientific research, alternatives are not part of the study, which leaves the policy maker to potentially compare apples and oranges, or, more often, kiwifruit and ball bearings. This can lead to the plaintive cry of the scientist to the policy maker along the lines of “if you don’t use my ball bearing in your cake, you are not using the best science.”

Is the question “how can we change human culture to be more sensitive to the environment?” Again, this would be the province of the humanities. Is the problem too many people? Is the problem how to conserve biodiversity? If the question is seen as “how best to protect the environment from people?” or “how can we develop new technologies that provide for people’s needs?”, the first framing is not realistic because it does not consider that people have needs, while the second does not consider the social or environmental impacts of its technology. While disciplines may feel smug and comfortable, reducing the problem into these discrete units is ultimately a political and not a scientific action, and leaves us bereft of potential solutions to the problem of harmonizing the needs of humans and other organisms.

Each way of describing the problem has several parameters: different mixes of humanities and sciences as purveyors of information, different systems of stakeholders and decision makers, different paradigms of preservation and use, and a different focus on the local compared to general concepts and principles. The paradox is that, to some extent, each of these is the problem, and yet the whole problem is larger than any of them; the answer is a function of the question, and who picks the question is a function of values, not science. We need to base our design of the future on what we know about people and the environment, but also on what resonates with the human soul: a world designed to be a place we would all want to live.

Evolutionary Theory and the Practice of Policy For the 21st Century (1)

Unifying Perspectives of Evolution, Conservation and Breeding: A symposium in honour of Dr. Gene Namkoong in commemoration of his retirement after four decades of pioneering research in forest genetics July 22-24, 1999

This summer I am celebrating my 40th year in the forestry profession. I am also going through my files and preparing for a transition to a past-forester, current theologian. I will be posting pieces that strike me as still relevant to the forest science/policy world.

The first is a piece I wrote for Dr. Gene Namkoong’s Festschrift in 1999 (15 years ago) at University of British Columbia. A Festschrift is a symposium in honor of a scientist’s work, where his or her students present papers. Here is a link, and a photo of the group above.

The rest of the Festschrift was published, as you can see from the link, but my paper was too odd to fit. It’s kind of a reflective look (“from the balcony”, as it were, thank you Richard Stem for sticking that expression in my brain!) at our business. I was honored to be counted among Gene’s students, even though I was only a post-doc. Even though the science I did with him at State had a short half-life, the practice of doing it and learning from him is still a rich and valued part of my life today. I could say a lot more about him and his work, but that deserves a separate post.

Anyway, I will post this in 500 or so word chunks. The paper is “about” making the science biz and policy more harmonious. At the time, I had spent 25 years or so working at the interface, both in forest policy and on biotechnology. You can see those experiences reflected in this piece. So here goes:

IF SCIENCE IS THE ANSWER, WHAT WAS THE QUESTION?

Policy issues reflect competing views about not only potential courses of action, but also about the nature of the problems themselves (Dunn, 1981). In the development of scientific research that is ultimately used in policy formulation, ideas of the nature of environmental problems tend to be implicit rather than explicit, and vary by discipline. In policy analysis, problem structuring is one of the most important steps in the process. Without careful thought, discussion, and analysis, it is common to address what turns out to be the wrong problem. In the U.S., research direction is determined at different levels by different groups, most frequently without substantive levels of involvement by policy makers, landowners, practitioners, citizens or other potential users of the research.

For example, let’s imagine the research budget that might be developed by Rosemary Radford Ruether, the ecofeminist theologian (1998). She feels that the ultimate environmental challenge is “to harmonize our needs with those of the rest of the earth-community.” She thinks the solution is “new social systems that relate men and women, races and ethnic groups, in a global community where all enjoy adequate means of life and where the land, air, soil, forests, oceans and rivers can be freed of toxic poisons and provide the sustaining basis for all earth creatures… this will involve new sustainable technologies.” One could imagine research directed toward identifying the most environmentally destructive first world consumption and looking for ways to encourage people to consume less, toward finding less polluting technologies, and toward helping poor rural people find ways of caring for the land that are not destructive but do provide for their needs. Somewhere in this research effort would be concern for the environment, for social justice, and for a culture that speaks to the human soul.

As mentioned in a previous paper at this symposium (Burley, 1999), it is estimated that 50% of the wood used in the world is used for cooking. Perhaps the best focus for research is developing cheap, environmentally benign energy technologies. Or perhaps further research on electronic paper might help decrease the use of trees or other fiber for pulp. The dean of the college of agriculture might decide that with burgeoning populations, the only way to provide for needs is to use genetically engineered crops. But as we have seen, the benefits, costs and risks are not necessarily addressed by the sociology, economics, botany and zoology disciplines until the technology is already developed and in use. This allows the wasteful development of technologies that are not acceptable to consumers, with the opportunity cost of the lost improvements that could have been made in people’s well-being and the environment if research had been more carefully integrated from the beginning.

Nevertheless, these are difficult questions. If someone had funding for one molecular biologist, should they be developing systems for turning hog waste into environmentally benign useful products or developing genetically engineered crops that provide nutrients or flavors like meat but are less consumptive of resources than meat production? Studying a production system, even a potential production system, raises questions about who would benefit from this production system and who would lose, if the technology could be successfully developed. Often these questions, considered in policy analysis, are not considered in research programs, nor are they the subject of public discussion, even when the research is publicly funded.

Minority Report: “EPA’s Playbook,” “Fraud,” and “Secret Science”

About a month ago I had a discussion with Dr. Bob Ferris on this blog, initially concerning the general quality of “government science.” Dr. Ferris is a “real scientist” who is “serious” and currently works for Cascadia Wildlands, an environmental activist group based in Eugene, Oregon. He has a number of publications to his credit and was instrumental in the efforts to reintroduce gray wolves into Yellowstone National Park in 1996, through his position as a biologist with Defenders of Wildlife. Here is a current sample of his work: http://www.mercurynews.com/opinion/ci_25387360/wheres-science-fish-and-wildlife-service-must-rewrite

When Dr. Ferris brought up the topic of “best available science,” I responded by providing a link to an Evergreen Magazine interview with Dr. Alan Moghissi, a published and widely recognized expert on the topic. For some reason Dr. Ferris was able to use this link as an opportunity to veer oddly and sharply off-topic and to begin leveling ad hominem attacks on ESIPRI’s website; one of ESIPRI’s founders (and occasional contributor to this blog), Norman MacLeod; Evergreen Magazine; Jim Petersen (Dr. Moghissi’s interviewer and publisher of Evergreen); Dr. Moghissi’s “right wing credentials”; Dave Skinner, a regular contributor to this blog; and the Boards of both ESIPRI and Evergreen Foundation: https://forestpolicypub.com/2014/02/17/of-wolves-and-wilderness/comment-page-1/#comment-38197

The link that seemed to cause Dr. Ferris so much vexation and total disregard for the topic at hand (“best available science”) was to a recent issue of Evergreen Magazine with a picture of Dr. Moghissi on the front cover, the Capitol Building in the background, and featuring the headline: “Fresh Air! Alan Moghissi: Rocking Capitol Hill and the EPA!”: http://www.esipri.org/Library/Evergreen_2012.pdf

Earlier today Karla Davenport, producer of Salem, Oregon’s iSpy Radio show, sent me a copy of this amazing report, EPA’s Playbook Unveiled: A Story of Fraud, Deceit, and Secret Science: http://www.esipri.org/Library/Bolar-Steel_20140319.pdf

This may be the first time I have ever referred to a government report as “amazing” without meaning to be disrespectful. If this is only 50% accurate, it should be made required reading for our public land legislators and their staffs immediately. In my opinion. The report was just released on Wednesday, so it has only been available for 72 hours. I have reproduced its Executive Summary here for discussion purposes. I’m curious as to how it is going to be received.

EXECUTIVE SUMMARY

The greatness of our unique nation hinges on the fundamental purpose of the government to serve at the will of the people and to carry out public policy that is in the public interest. When it comes to the executive branch, the Courts have extended deference to agency policy decisions under the theory that our agencies are composed of neutral, non-biased, highly specialized public servants with particular knowledge about policy matters. This report will reveal that within the Environmental Protection Agency (EPA), some officials making critically important policy decisions were not remotely qualified, anything but neutral, and in at least one case — EPA decision making was delegated to a now convicted felon and con artist, John Beale.

John Beale is the character from the bizarre tale of the fake CIA agent who used his perch at the EPA to bilk the American taxpayer out of more than a million dollars. Even Jon Stewart, host of the popular Daily Show, featured Beale’s bizarre tale as “Charlatan’s Web” on his program in December 2013. Before his best friend Robert Brenner hired him to work at EPA, Beale had no legislative or environmental policy experience and wandered between jobs at a small-town law firm, a political campaign, and an apple farm. Yet at the time he was recruited to EPA, Brenner arranged to place him in the highest pay scale for general service employees, a post that typically is earned by those with significant experience.

What most Americans do not know is that Beale and Brenner were not obscure no-name bureaucrats housed in the bowels of the Agency. Through his position as head of the Office of Policy, Analysis, and Review, Brenner built a “fiefdom” that allowed him to insert himself into a number of important policy issues and to influence the direction of the Agency. Beale was one of Brenner’s acolytes — who owed his career and hefty salary to his best friend.

During the Clinton Administration, Beale and Brenner were very powerful members of EPA’s senior leadership team within the Office of Air and Radiation, the office responsible for issuing the most expensive and onerous federal regulations. Beale himself was the lead EPA official for one of the most controversial and far reaching regulations ever issued by the Agency, the 1997 National Ambient Air Quality Standards (NAAQS) for Ozone and Particulate Matter (PM). These standards marked a turning point for EPA air regulations and set the stage for the exponential growth of the Agency’s power over the American economy. Delegating the NAAQS to Beale was the result of Brenner’s facilitating the confidence of EPA elites, making Beale the gatekeeper for critical information throughout the process.

Beale accomplished this coup based on his charisma and steadfast application of the belief that the ends justify the means. Concerned about this connection, the Senate Committee on Environment and Public Works (EPW) staff have learned that the same mind that concocted a myriad of ways to abuse the trust of his EPA supervisors while committing fraud is the same mind that abused the deference afforded to public servants when he led EPA’s effort on the 1997 NAAQS. Brenner was known to have an objective on NAAQS, and would have done whatever was necessary to accomplish his desired outcome. Together, Brenner and Beale implemented a plan, which this report refers to as “EPA’s Playbook.”

The Playbook includes several tools first employed in the 1997 process, including sue-and-settle arrangements with a friendly outside group, manipulation of science, incomplete cost-benefit analysis reviews, heavy-handed management of interagency review processes, and capitalizing on information asymmetry, reinforced by resistance to transparency. Ultimately, the guiding principle behind the Playbook is the Machiavellian principle that the ends will justify the means. In the case of the 1997 NAAQS, the Playbook started with a sue-and-settle agreement with the American Lung Association, which established a compressed timeline to draft and issue PM standards. This timeline was further compressed when EPA made the unprecedented decision to simultaneously issue new standards for both PM and Ozone. Issuing these standards in tandem and under the pressure of the sue-and-settle deadline, Beale had the mechanism he needed to ignore opposition to the standards — EPA simply did not have the time to consider dissenting opinions.

The techniques of the Playbook were on full display in the “Beale Memo,” a confidential document that was leaked to Congress during the controversy, which revealed how he pressured the Office of Information and Regulatory Affairs to back off its criticism of the NAAQS and forced them to alter their response to Congress in 1997. EPA also brushed aside objections raised by Congress, the Office of Management and Budget, the Department of Energy, the White House Council of Economic Advisors, the White House Office of Science and Technology Policy, the National Academy of Sciences, and EPA’s own scientific advisers — the Clean Air Science Advisory Committee.

These circumstances were compounded by EPA’s “policy call” to regulate PM2.5 for the first time in 1997. PM2.5 are ubiquitous tiny particles, the reduction of which EPA used to support both the PM and Ozone NAAQS. In doing so, the Playbook also addressed Beale’s approach to EPA’s economic analysis: overstate the benefits and underrepresent the costs of federal regulations. This technique has been applied over the years and burdens the American people today, as up to 80% of the benefits associated with all federal regulations are attributed to supposed PM2.5 reductions.

EPA has also manipulated the use of PM2.5 through the NAAQS process as the proffered health effects attributable to PM2.5 have never been independently verified. In the 1997 PM NAAQS, EPA justified the critical standards on only two data sets, the Harvard “Six Cities” and American Cancer Society (ACS II) studies. At the time, the underlying data for the studies were over a decade old and were vulnerable to even the most basic scrutiny. Yet the use of such weak studies reveals another lesson from EPA’s Playbook: shield the underlying data from scrutiny.

Since the 1997 standards were issued, EPA has steadfastly refused to facilitate independent analysis of the studies upon which the benefits claimed were based. While this is alarming in and of itself, this report also reveals that the EPA has continued to rely upon the secret science within the same two studies to justify the vast majority of all Clean Air Act regulations issued to this day. In manipulating the scientific process, Beale effectively closed the door to open scientific enquiry, a practice the Agency has followed ever since. Even after the passage in 1999 of the Shelby Amendment, a legislative response to EPA’s secret science that requires access to federal scientific data, and President Obama’s Executive Orders on Transparency and Data Access, the EPA continues to withhold the underlying data that originally supported Beale’s efforts.

After President Clinton endorsed the 1997 NAAQS and the Agency celebrated their finalization, Beale became immune to scrutiny or the obligation to be productive for the remainder of his time at the Agency. Similarly, the product of his labors has remained intact and has been shielded from any meaningful scrutiny, much the same way Beale was protected by an inner circle of career staff who unwittingly aided in his fraud. Accordingly, it appears that the Agency is content to let the American people pay the price for Beale and EPA’s scientific insularity, a price EPA is still trying to hide almost twenty years later.

After reaching the pinnacle of his career at the Agency in 1997, and facing no accountability thereafter, Beale put matters on cruise control and enjoyed the lavish lifestyle that the highest paid EPA employee could afford, producing virtually no substantive work product thereafter. For Beale’s successes in the 1997 NAAQS process, Beale was idolized as a hero at the Agency. According to current EPA Administrator, Gina McCarthy, “John Beale walked on water at EPA.”

This unusual culture of idolatry has led EPA officials to blind themselves to Beale’s wrongdoing and caused them to neglect their duty to act as public servants. As such, to this day EPA continues to protect Beale’s work product and the secret science behind the Agency’s NAAQS and PM claims.

Science, Law, and the Press: Idealized vs. Real

I’ve been thinking about how people use the terms “science,” as in “policies are better if they’re based on science,” and “law,” as in “environmental laws are great because Congress made them, but if Congress messes with any of the case-law-derived interpretations, that would be bad.”

It’s almost like there’s an idealized institution that people appeal to in some arguments, while sometimes ignoring or downplaying the realities of the institution. I think it will be helpful in future discussions to talk about how that plays out. For example, does Franklin and Johnson’s involvement with prescriptions on O&C lands make it “science”? What if two other scientists had developed a different prescription: would that still be “science”? It’s not hard to imagine other ecologist/economist pairs who could come up with other prescriptions.

Now, Congress’s messiness is laid out for the whole world to see through the press. But in my experience dealing with Forest Service projects wending their way through the system, I saw the “real” side of “science” (which I already knew about), of the courts, and of the press. I am not saying that any of them is worse than any other, but they are all human and imperfect institutions. Human behavior in groups tends to be fairly similar and is not always perfect. When we talk about institutions, then, it seems to me we should generally be talking about the institution as it really is, not as idealized.

Now, people who are in the trenches on projects and see this firsthand do not really have a voice. As agency folks, you are not allowed to question (in public) some of the issues or problems you see; for one thing, that might make powerful folks angry at the FS. For example, on one case, one of our attorneys said, “We think the judge has the law wrong on this, but we won’t tell him, because he is a young judge and we don’t want him biased against the FS for the rest of his career.” The fact that others critique the FS, but the FS can’t (usually) engage in meaningful public back-and-forth, means that only one side is represented to the public, as we’ve discussed before.

This also brings up the fact that none of the feedback loops in the table allow for public discussion of claims and counterclaims, as we have on this blog. Perhaps that is too time-consuming, but not having a place for it to occur seems to me to be a problem as well. And we have to look at who is involved in the discussion, and how members of the public get involved or not.

[Table: institutional feedback]

I am interested in your thoughts on this table. One thing I thought we might be able to do here that might be helpful would be to keep tabs on some of the journals and post relevant information on this blog, so that these critiques are more available in the public sphere.

What do you think about the table? What would you change or add? What ideas does the table generate in your mind?

Spruce Beetle: Beetle Without Drama and With FS Research

One thing I noticed when panels of scientists (from CU particularly) came to talk to us about our bark beetle response is that they kept talking about our “going into the backcountry and doing fuel treatments” and why this was a bad idea. We would tell them we weren’t actually doing that, but I don’t think they believed us. I have found, in general, that people at universities tend to think that a great deal more management is possible on the landscape than ever actually happens. The fact that Colorado has few sawmills means we aren’t cutting many trees for wood.

Anyway, in my efforts to convey that “there isn’t much we really can do, or can afford to do, in these places,” I ran across this story. It just seems so commonsensical and drama-free. Perhaps that is the culture of the San Luis Valley, reflected in its press coverage.

Note: Dan Dallas, the Forest Supervisor of the Rio Grande National Forest (and former Manager of the San Luis Valley Public Lands Center, a joint FS/BLM operation that ended for unclear reasons), is a fire guy, so he has practitioner knowledge of fires, fire behavior, and suppression.

A beetle epidemic in the forest will have ramifications for generations to come.

Addressing the Rio Grande Roundtable on Tuesday, Rio Grande National Forest staff including Forest Supervisor Dan Dallas talked about how the current spruce beetle epidemic is affecting the forest presently and how it could potentially affect the landscape and watershed in the future. They also talked about what the Forest Service and other agencies are doing about the problem.

“We’ve got a large scale outbreak that we haven’t seen at this scale ever,” Dallas said.

SLV Interagency Fire Management Officer Jim Jaminet added that the infestation and disease outbreak in the entire forest is pretty significant, with at least 70 percent of the spruce either dead or dying: “just oceans of dead standing naked canopy, just skeletons standing out there.”

Dallas said unless something changes, and he and his staff do not think it will, all the spruce will be dead in a few years.

As far as effects on wildlife, Dallas said the elk and deer would probably do fine, but this would have a huge impact on the lynx habitat.

He also expected impacts on the Rio Grande watershed all the way down to the New Mexico line. For example, the snowpack runoff would peak earlier.

However, Dallas added, “All that said, it is a natural event.”

He said the beetle epidemic destroying the Rio Grande National Forest spread significantly in just a few years. He attributed the epidemic to a combination of factors, including “blow down” of trees, where the beetles concentrated on the downed timber, as well as drought stressing the trees so they were more susceptible to the bugs, which are always present in the forest but, because of triggering factors like drought, have really taken over in recent years.

“There’s places up there now where every tree across the board is gone, dead,” Dallas said. “It’s gone clear up to timberline.”

He said the beetle infestation could be seen all the way up the Rocky Mountain range into Canada.

Safety first

To date, the U.S. Forest Service’s response has focused on health and safety both of the public and staff, Dallas explained. Trees have been taken out of areas like Big Meadows and Trujillo Meadows campgrounds where they could pose a danger to visitors, for example.

“Everybody hiking or whatever needs to be aware of this. All your old habitats, camping out underneath dead trees, that’s bad business,” Dallas said.

He said trail crews can hardly keep up with the debris, and by the time they have cleaned up a trail, they have to clear it again on their way back out.

Another way the Forest Service is responding to the beetle epidemic is through large-scale planning, Dallas added.

For example, the Forest Service has 10 years worth of timber sales ready to go at any point in time, which was unheard of a few years ago.

……….

Forest research

Dallas said a group of researchers from the Forest Service will be looking at different scenarios for the forest such as what might happen if the Forest Service does nothing and lets nature take its course or what might happen if some intervention occurs like starting a fire in the heat of summer on purpose.

The researchers are expected to visit the upper Rio Grande on June 17. They are compiling a synthesis before their trip. They will then undertake some modeling exercises to look at what might happen in the forest and what it will look like under different scenarios.

“We have the opportunity now to do some things to change the trajectory of the forest that comes back,” Dallas said. “We want to understand that, not to say that’s something we really want to do.”

He added, “We would have to involve the public, because we are talking about what the forest is going to look like when we are long dead and gone and our kids are long dead and gone.”

If the Forest Service is going to do something, however, now is the time, he added.

Fire risks

Jaminet talked with the roundtable members about fire risks in the forest.

Fire danger depends on the weather and the environment, he said.

If the conditions were such that the weather was hot, dry and windy, “We could have a large fire event in the San Luis Valley,” Jaminet said.

He added that fortunately the Valley does not have many human-caused fires in the forests. The Valley is also fortunate not to have many lightning-caused fires, he added.

“Will there be an increase in fires?” he asked. “Probably not. Will there be an increase in severity? Probably not now but probably later. The fire events are going to be largely weather driven.”

He said some fire could be good for an ecosystem as long as it does not threaten structures and people.

One has to wonder whether the reviewers of the NSF studies (in this post) knew that the FS was doing work that appears to address the same problem, only with different tools. It seems to me that some folks who study the past assume that the past is somehow relevant to the best way forward today. I am not against the study of history, but, to use a farming analogy, we don’t need to review the history of the Great Plains before every planting season.

Maybe there should be financial incentives for those who find duplicative research, with a percentage of the savings targeted for National Forest and BLM recreation programs ;)?

High Quality Research Act, And Research Duplication

Here’s a post from David Bruggeman about a proposed bill.

The High Quality Research Act is a draft bill from Representative Lamar Smith, Chair of the House Science, Space and Technology Committee. Still not officially introduced, it has prompted a fair amount of teeth gnashing and garment rending over what it might mean. The bill would require the Director of the National Science Foundation (NSF) to certify that the research it funds would serve the national interests, be of the highest quality, and not be duplicative of other research projects being funded by the federal government. The bill would also prompt a study to see how such requirements could be implemented in other federal science agencies.

There’s a lot there to explore, including how the bill fits into recent inquiries about specific research grants made by the National Institutes of Health (NIH) and the NSF. (One nice place to check on this is the AmericanScience team blog.)

But what this bill has brought to my mind is that it brings the alleged tradeoff between research autonomy and research accountability into sharper relief (at least for those of us who research and analyze these things; the advocates are in combat mode). The goals of the bill – certifying that the research serves the national interests – could be interpreted as being contrary to the notions of blue-sky or basic research. If the research must be linked to a national interest, how can it be done without concern for eventual applications?

My opinion: the non-duplication requirement alone would be powerful. Maybe there could be a small incentive for those who identify duplication? Because right now the only check seems to be the research panels, whose members often have not read the literature relevant to a specific proposal, and there is no mechanism for them to be aware of other government-funded research in the area.

Just one example: you can check the NSF database here; just type in the topic you are interested in. I typed in “spruce beetle” and got a list.
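For those who would rather keep tabs on a topic programmatically than re-run the web search by hand, here is a minimal sketch. It assumes NSF’s public Award Search Web API (the `api.nsf.gov` endpoint and the `keyword`/`printFields` parameters are my assumptions about that service, not something from the post itself):

```python
# Sketch: searching the NSF award database for a topic such as "spruce beetle".
# Assumption: NSF's public Award Search Web API at api.nsf.gov, which accepts
# a "keyword" query and returns JSON wrapped in a "response"/"award" envelope.
import urllib.parse

NSF_API = "https://api.nsf.gov/services/v1/awards.json"


def build_query_url(topic, fields="id,title,fundsObligatedAmt"):
    """Build a search URL for a topic keyword, e.g. 'spruce beetle'."""
    params = urllib.parse.urlencode({"keyword": topic, "printFields": fields})
    return f"{NSF_API}?{params}"


def extract_awards(payload):
    """Pull the list of award records out of the API's JSON envelope."""
    return payload.get("response", {}).get("award", [])


# Fetching the URL (e.g. with urllib.request.urlopen) would return JSON
# shaped roughly like this hypothetical sample:
sample = {"response": {"award": [
    {"id": "0000000", "title": "Hypothetical spruce beetle study"},
]}}
for award in extract_awards(sample):
    print(award["id"], "-", award["title"])
```

Run periodically against a few topics of interest, something like this could flag newly funded awards that overlap with ongoing Forest Service or USGS work.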

Here is the abstract for one of the studies:

This collaborative research project will address the following questions about interactions between wildfire and spruce beetle outbreaks under varying climate and their consequences for ecosystem services: (1) How does climatic variation affect the initiation and spread of spruce beetle outbreaks across complex landscapes? (2) How does prior disturbance by windstorm, logging, and fire affect the subsequent occurrence and severity of spruce beetle outbreak? (3) In the context of a recently warmed climate, how do spruce beetle outbreaks affect forest structure and composition? (4) How do spruce beetle outbreaks affect fuels and potential wildfire activity under varying climatic conditions? (5) How will climate change and the climate-sensitive disturbances of wildfire and spruce beetle activity affect future ecosystem services in the subalpine zone of the southern Rocky Mountains under varying scenarios of adaptive forest management? The first four questions will be addressed through empirical research, including extensive tree-ring reconstructions of past disturbances, re-measurement of permanent forest plots, field measurements of effects of spruce beetle outbreaks on fuels, fire behavior modeling, and spatiotemporal analyses of the spread of recent spruce beetle outbreaks. The fifth question will be examined through simulation modeling of future forest conditions and their consequences for key selected ecosystem services, including biodiversity, wildlife habitat, and resilience to environmental change.

Not to pick on Kulakowski at Worcester, or even on NSF (which studies everything, regardless of what other agencies are studying, except perhaps NIH’s territory), but it makes me think that folks at the Forest Service and USGS around here are probably also studying some of these same topics.

It would be interesting to FOIA the peer review documents and see what the reviewers had to say about how this research fits into ongoing federal research on the topic and how useful it will be. Because, after all, there are not a lot of management choices…

University of Calgary Study on Human Impacts on Ecosystems

An article from Bob Berwyn’s blog here.

‘Even in protected areas, the influence of humans might be greater than we previously thought … ‘

FRISCO — As much as we’d like to believe in nature unbound, a new Canadian study suggests that human impacts are more widespread than we realize, even extending well into protected areas.

The five-year study by University of Calgary ecologists included monitoring wolves, elk, cattle, and humans. The researchers concluded that human activities dominate all other factors, even in protected areas.

“Our results contrast with research conducted in protected areas that suggested food chains are primarily regulated by predators. Rather, we found that humans influenced other species in the food chain in a number of direct and indirect ways, thus overshadowing top-down and bottom-up effects,” said lead author Dr. Tyler Muhly.

In one sense, the findings are a “well, duh.” As many have stated previously, air pollution, climate change, the effects of neighbors’ fire policies, and invasive species aren’t limited by “protected area” designation. Here’s a link to the university press release.

I wish there were a rule that every time a press release says “different from what was previously thought” they refer to at least one publication that asserts what they are refuting.

When I went to a website to find this, I also found the study “New research challenges assumptions about effects of global warming on mountain tree line,” here. Here’s my irrefutable science about treelines: we don’t know what the H. will happen, because it’s too complex to predict. We can study it until the cows come home, but no one knows. It makes me wonder if there mightn’t be something more useful to study.

But I guess utility is a bad word to a fellow named Phil Plait from Boulder, who wrote an op-ed here in the Denver Post critiquing Canada’s R&D policy (you and I both wonder if there isn’t something more relevant for the Post to publish than a critique of a neighboring country’s science policy). The title of the piece was simply “Canada Sells Out Science.”

And that’s OK, because it’s not like the money is wasted when invested in science. For one thing, the amount of money we’re talking about here is tiny compared to a national budget. For another, investment in science always pays off. Always, and at a very high rate. If you want to boost your economy in the middle and long run, one of the best ways to do it is invest in science.

But the Canadian government is doing the precise opposite. If proposed and immediate economic benefits are the prime factors in choosing what science to fund, then the freedom of this human endeavor will be critically curtailed. It’s draining the passion and heart out of one of the best things we humans do.

By doing this, the Canadian government and the NRC have literally sold out science.

I don’t mean to pick on the “tree line modelers” here; it’s a gig, and we all need one. But it’s good to know that this and future investments in modeling will somehow “pay off,” because Phil Plait says so.

Here’s another quote:

John MacDougal, president of the NRC, said, “Scientific discovery is not valuable unless it has commercial value.” Gary Goodyear, the Canadian Minister of State for Science and Technology, also stated “There is [sic] only two reasons why we do science and technology. First is to create knowledge … second is to use that knowledge for social and economic benefit. Unfortunately, all too often the knowledge gained is opportunity lost.”

I had to read the Toronto Sun article two or three times to make sure I wasn’t missing something, because I was thinking that no one could possibly utter such colossally ignorant statements. These two men — leaders in the Canadian scientific research community — were saying, out loud and clearly, that the only science worth doing is what lines the pocket of business.

Really? Doing something to improve our society is doing something for “business”? I guess I am “colossally ignorant” too. I know there is a school of scientists who believe that research should be “of the scientists, by the scientists, and for the scientists,” but they’re usually not that vitriolic.

Whoops… I guess I wandered off the original topic.

Still Looking for Equality: Happy International Women’s Day

1914: First Woman
The first woman employed by the Forest Service as a lookout was Hallie M. Daggett, who started work at Eddy’s Gulch Lookout Station atop Klamath Peak (Klamath NF) in the summer of 1913 (she worked as a lookout for 14 years).

I have been saving this October article from PNAS for an appropriate day:

Science faculty’s subtle gender biases favor male students.
Moss-Racusin CA, Dovidio JF, Brescoll VL, Graham MJ, Handelsman J.
Source
Department of Molecular, Cellular and Developmental Biology, Yale University, New Haven, CT 06520, USA.
Abstract
Despite efforts to recruit and retain more women, a stark gender disparity persists within academic science. Abundant research has demonstrated gender bias in many demographic groups, but has yet to experimentally investigate whether science faculty exhibit a bias against female students that could contribute to the gender disparity in academic science. In a randomized double-blind study (n = 127), science faculty from research-intensive universities rated the application materials of a student (who was randomly assigned either a male or female name) for a laboratory manager position. Faculty participants rated the male applicant as significantly more competent and hireable than the (identical) female applicant. These participants also selected a higher starting salary and offered more career mentoring to the male applicant. The gender of the faculty participants did not affect responses, such that female and male faculty were equally likely to exhibit bias against the female student. Mediation analyses indicated that the female student was less likely to be hired because she was viewed as less competent. We also assessed faculty participants’ preexisting subtle bias against women using a standard instrument and found that preexisting subtle bias against women played a moderating role, such that subtle bias against women was associated with less support for the female student, but was unrelated to reactions to the male student. These results suggest that interventions addressing faculty gender bias might advance the goal of increasing the participation of women in science.

My point being that when we give special preference to “science,” we need to understand who decides what problems to fund, what disciplines to fund, and so on, and how broadly that group represents different kinds of people and their interests. That’s why I think the discipline of the sociology of science is so important to track, for all of us who work with scientific products.

Today might be a good day to give a shout-out to the women who are working with you in a business that is not always easy. Mine is to Chief Gail Kimbell, who was the first woman Chief. I wish she would write her story like Chief Thomas did.