PCAST, Science Laundering and Wildfire Presentations

The above slides are from Deborah Sivas’ presentation to PCAST at its March 24, 2022 meeting.

At one time I was the science lead for an interagency task force on the regulation of genetically engineered organisms in the environment.  It was quite a mouthful then, and remains so today.  I worked at the Office of Science and Technology Policy (and my co-lead from the legal side worked at CEQ, the Council on Environmental Quality).  These groups are considered part of the White House, and working there, even on a detail, has many desirable perks.  My office, in the Old Executive Office Building, was shared with other “agency reps,” as we were known.  Well, the Los Alamos fires happened.  Shortly thereafter, folks showed up from Los Alamos to visit the DOE agency rep at the next desk.

Sidenote: my understanding is that DOE gives money to the national labs, which are run by a contractor; the contractor, conveniently, is not a government agency and so can lobby freely for funding.  It was a brilliant stroke.

They said something like, “Wildfires are important, we need more money to do research.”  Since I was at the next desk, I innocently inquired (after all, I worked in Forest Service R&D at the time, with cubicle neighbors Dave Cleaves and Sue Conard, who ran the fire science program), “Doesn’t the Forest Service already do research on wildfires?”  They sort of snickered, since FS research was thought to be “bush league” science compared to physics, and went on talking, apparently making sure that funding for this would go to Los Alamos.

Later I was having lunch with one of the OSTP folks higher on the food chain, a physicist from Stanford, and he said something like, “The problem with USDA research is that they have an indirect cost cap.” Of course, indirect costs at Stanford were being used for yachts and antiques for the president, so...  This person told me, “If USDA wants to get good science, say from MIT, they need to raise their indirect cost rate.”  I said, “I don’t think researchers at MIT are really interested in, say, wheat breeding research for Kansas farmers.”  Well, this was 20 years ago now, so perhaps it’s not such a thing anymore.  Or is it?

Now there are people who believe that OSTP and PCAST (the President’s Council of Advisors on Science and Technology) fill a valuable role in giving science advice to the Administration.  However, as a person who worked there, I have to question the need for “science laundering.”  There are scientists in agencies who know stuff.  Sometimes (often) these scientists disagree with each other and with other agencies.  If OSTP ran some kind of process to develop “the USG story on the science at this point in time,” that would be useful.  However, that’s not exactly what they do.  And there have been improvements with PCAST: it used to be all math and information technology; now it has one agricultural scientist on it (yay!) and a few medical science types.

Epistemically, though, there’s a problem.  If we give scientists authority because they know a lot about a specific topic, but we have a group of scientists who don’t know about a topic, why are they more legitimate information sources than, say, a group of journalists who could also find out from the real experts?  Does getting a Ph.D. in some field of science (say, atmospheric physics) help you understand medical biotechnology better than... whom, exactly? That’s why I call it “science laundering”: you have, say, fire scientists talk about fire, but then launder their science through a panel of other scientists who don’t know about it.

This PCAST meeting included some presentations about wildfire. There’s also a video, which I didn’t watch. Interestingly, from the epistemic point of view, the presentation with the slides above was given by a Stanford environmental litigator.  So in addition to science laundering, policy issues are seen through the lens of legal folks before they are transmitted to the PCAST scientists. You might also wonder what value this is adding, given that there is already an interagency Wildfire Commission working right now on identifying scientific and technical needs. Oh well, it’s only time and money...

From the meeting notes:

Deborah Sivas is an environmental litigator and law professor at Stanford who used to work for Earthjustice. This choice didn’t seem like “the science” to me...

Sivas offered three suggestions to PCAST: First, one size does not fit all needs. Different solutions will be needed for urban, wildland–urban interface, and back country areas. Second, as increased funding from federal- and state-level sources becomes available and decisions are made about how to allocate that funding, such as to firefighting resources and hardening homes, thought should also be given to allocating funding to protect vulnerable populations, such as children and underserved communities, from smoke inhalation. Third, regulatory reform should be reviewed to include consideration of prescribed burning and guidance on environmental review procedures.

I think it’s interesting that Sivas recommends regulatory reform, but only for prescribed fire, not for thinning. In the slides she seems to conflate thinning with commercial harvest, which it is not in many places. That leaves one wondering: if a tree is cut in the woods and, say, burned but not sold, would that be OK for regulatory reform? This would be an interesting discussion to have, but I don’t see it as being about science at all.

Gardner, from a Southern California fire department, suggested that PCAST could help by addressing three main needs:
1) Early detection of fires: Most fires that occur in the West begin during weather conditions that are hot, dry, and windy, and most fire devastation, death, and destruction occurs in the first 12 hours of a firefight. Thus, early detection is critical. A number of parties need information early, including firefighters, police, emergency medical personnel, emergency managers, policy makers, elected officials, and the public. In Southern California, the 10 million residents with cell phones help speed detection, but detection is more challenging in remote areas, and science and technology may have a role to play in these areas.

2) Tracking resources: Despite the ability to easily order items on a cell phone and have them arrive on one’s doorstep the next day, Gardner’s fire department is using a resource ordering system that was designed in the 1940s mainly to move military equipment. It can take two days just to process the department’s order for equipment. Improved tracking of firefighters is a serious need as well. This applies to tracking one’s own team of firefighters and tracking firefighters in nearby teams. Gardner said that not knowing where one’s firefighters are while fighting a fire can be unbearable, and this kind of tracking should be possible given the ubiquity of GPS.

3) Monitoring fires: Having the ability to monitor fires, forecast their growth, and then share that information with fire ground commanders, emergency responders, emergency managers, and the public would improve awareness and decision making and help move people out of harm’s way. Gardner noted the challenges of monitoring fast-moving fires, mentioning one fire that forced over 100,000 people to flee as it traveled 14 miles in one night. Tracking that fire was hampered by high winds that grounded helicopters.

Sean Triplett of NIFC (National Interagency Fire Center):

Triplett closed by saying that while a mature and robust fire detection and tracking system has been developed that can gather a lot of information that is important to share with decision makers, the information-sharing process needs to be improved through the development of standard data practices, better intelligence tools, and more usable forecasting techniques. Information must be formatted in digestible pieces that can be received and used by firefighters with mobile devices in the field or by emergency managers working in operations centers.

And circling back to Rod Linn at Los Alamos:

Linn closed with a list of issues demonstrating that wildland fire modeling is a multidisciplinary exercise involving various aspects of natural science, computer science, AI, machine learning, and smoke modeling. Advanced modeling will require advanced data sets, including an understanding of the three-dimensional structure of the vegetation. And finally, wildland fire science advancement must be done in cooperation with the wildland fire management community.

4 thoughts on “PCAST, Science Laundering and Wildfire Presentations”

  1. This is a really interesting post. The issue of “whose science reigns supreme?” (to borrow from a certain competitive cooking show) is important, especially because the problems we face today are often unbounded by discipline.

    If that’s the case, then we might think about this again: “Epistemically, though, there’s a problem. If we give scientists authority because they know a lot about a specific topic, but we have a group of scientists who don’t know about a topic, why are they more legitimate information sources than say, a group of journalists who could also find out from the real experts?”

    The point is spot on, but maybe a different frame might help.

    What if we were to recognize that this problem is actually tractable and largely about risk, making it more of a social science problem, a problem of what the National Research Council described as “getting the right people focused on the right problem in the right way”?

    That’s a paraphrase, but it’s the idea. The main point, however, is that only what they call an analytic-deliberative process is equipped to handle the type of problem we’re talking about, because there is no single best answer, only answers that are appropriate for the moment and flexible enough to change in response to different moments down the road. Arguing about finding a single best answer through a single best disciplinary lens is a recipe for disaster (see how I got that cooking show in here again…).

    For more, check out “Understanding Risk: Informing Decisions in a Democratic Society” from the National Research Council of The National Academies.

  2. When I want to know what weather is forecast for my home, I have two choices. I can go to a secondary source, like AccuWeather, that interprets NOAA’s data and presents it with pretty graphics (and advertisements).

    Or I can go straight to the source and access NOAA’s data directly.
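    For example, a rough sketch of that “straight to the source” route, using NOAA’s public api.weather.gov service, might look something like this (the coordinates, contact address, and printed fields are just illustrative):

    ```python
    # Minimal sketch (not an official client): pull a point forecast straight
    # from NOAA's public api.weather.gov service. No API key is needed, but
    # NWS asks callers to identify themselves via a User-Agent header.
    import requests

    LAT, LON = 39.74, -105.00  # hypothetical example location
    HEADERS = {"User-Agent": "weather-example (contact@example.com)"}

    # Step 1: look up the forecast endpoint for this point.
    point = requests.get(f"https://api.weather.gov/points/{LAT},{LON}",
                         headers=HEADERS, timeout=30).json()
    forecast_url = point["properties"]["forecast"]

    # Step 2: fetch the forecast and print the next few periods.
    forecast = requests.get(forecast_url, headers=HEADERS, timeout=30).json()
    for period in forecast["properties"]["periods"][:4]:
        print(f'{period["name"]}: {period["temperature"]}°{period["temperatureUnit"]}, '
              f'{period["shortForecast"]}')
    ```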

    When it comes to wildfire data, Forest Service GIS analyst Triplett says that “a mature and robust fire detection and tracking system has been developed that can gather a lot of information that is important to share with decision makers, the information-sharing process needs to be improved through the development of standard data practices, better intelligence tools, and more usable forecasting techniques.”

    Why can’t the public access this wildfire data in real-time, as it can NOAA’s weather data?

    • Excellent question! A lot of that information gets “laundered” through Public Information Officers and Incident Teams. I routinely send inquiries to the contacts listed for individual fires on Inciweb asking them to provide different maps/data that I KNOW are available – but the information that is publicly available on sites like Inciweb and others is pretty paltry. Newspapers and other media are routinely going to other sources for fire data to show how fires moved rapidly via satellite/remote sensing information that is publicly available. Even if you work for the FS, it is difficult to get complete and timely fire information to use for post-fire planning while the fire is burning.

