A Look Behind the Scenes in Climate Science Publishing: Patrick Brown Explains Why Our Wildfire-Related Sciences Are Left Out

Patrick Brown of The Breakthrough Institute posted this today at The Free Press. It’s about wildfires and climate, and IMHO it also does a great job of explaining the climate science system and what gets published. Warning: this is an extraordinarily long post, because I believe the original is paywalled.

This may be surprising to some of you, but to others, not so much. Here are some excerpts. I bolded sentences that deal with the theme of “why are the usual sciences that deal with, say, forests or wildfire, often overlooked in climate papers (as is adaptation)?”

I am a climate scientist. And while climate change is an important factor affecting wildfires over many parts of the world, it isn’t close to the only factor that deserves our sole focus.

So why does the press focus so intently on climate change as the root cause? Perhaps for the same reasons I just did in an academic paper about wildfires in Nature, one of the world’s most prestigious journals: it fits a simple storyline that rewards the person telling it.

The paper I just published—“Climate warming increases extreme daily wildfire growth risk in California”—focuses exclusively on how climate change has affected extreme wildfire behavior. I knew not to try to quantify key aspects other than climate change in my research because it would dilute the story that prestigious journals like Nature and its rival, Science, want to tell.

This matters because it is critically important for scientists to be published in high-profile journals; in many ways, they are the gatekeepers for career success in academia. And the editors of these journals have made it abundantly clear, both by what they publish and what they reject, that they want climate papers that support certain preapproved narratives—even when those narratives come at the expense of broader knowledge for society.

To put it bluntly, climate science has become less about understanding the complexities of the world and more about serving as a kind of Cassandra, urgently warning the public about the dangers of climate change. However understandable this instinct may be, it distorts a great deal of climate science research, misinforms the public, and most importantly, makes practical solutions more difficult to achieve.

……………

Here’s how it works.

The first thing the astute climate researcher knows is that his or her work should support the mainstream narrative—namely, that the effects of climate change are both pervasive and catastrophic and that the primary way to deal with them is not by employing practical adaptation measures like stronger, more resilient infrastructure, better zoning and building codes, more air conditioning—or in the case of wildfires, better forest management or undergrounding power lines—but through policies like the Inflation Reduction Act, aimed at reducing greenhouse gas emissions. 

So in my recent Nature paper, which I authored with seven others, I focused narrowly on the influence of climate change on extreme wildfire behavior. Make no mistake: that influence is very real. But there are also other factors that can be just as or more important, such as poor forest management and the increasing number of people who start wildfires either accidentally or purposely. (A startling fact: over 80 percent of wildfires in the US are ignited by humans.)

In my paper, we didn’t bother to study the influence of these other obviously relevant factors. Did I know that including them would make for a more realistic and useful analysis? I did. But I also knew that it would detract from the clean narrative centered on the negative impact of climate change and thus decrease the odds that the paper would pass muster with Nature’s editors and reviewers.

This type of framing, with the influence of climate change unrealistically considered in isolation, is the norm for high-profile research papers. For example, in another recent influential Nature paper, scientists calculated that the two largest climate change impacts on society are deaths related to extreme heat and damage to agriculture. However, the authors never mention that climate change is not the dominant driver for either one of these impacts: heat-related deaths have been declining, and crop yields have been increasing for decades despite climate change. To acknowledge this would imply that the world has succeeded in some areas despite climate change—which, the thinking goes, would undermine the motivation for emissions reductions.

This leads to a second unspoken rule in writing a successful climate paper. The authors should ignore—or at least downplay—practical actions that can counter the impact of climate change. If deaths due to extreme heat are decreasing and crop yields are increasing, then it stands to reason that we can overcome some major negative effects of climate change. Shouldn’t we then study how we have been able to achieve success so that we can facilitate more of it? Of course we should. But studying solutions rather than focusing on problems is simply not going to rouse the public—or the press. Besides, many mainstream climate scientists tend to view the whole prospect of, say, using technology to adapt to climate change as wrongheaded; addressing emissions is the right approach. So the savvy researcher knows to stay away from practical solutions.

Here’s a third trick: be sure to focus on metrics that will generate the most eye-popping numbers. Our paper, for instance, could have focused on a simple, intuitive metric like the number of additional acres that burned or the increase in intensity of wildfires because of climate change. Instead, we followed the common practice of looking at the change in risk of an extreme event—in our case, the increased risk of wildfires burning more than 10,000 acres in a single day.

This is a far less intuitive metric that is more difficult to translate into actionable information. So why is this more complicated and less useful kind of metric so common? Because it generally produces larger factors of increase than other calculations. To wit: you get bigger numbers that justify the importance of your work, its rightful place in Nature or Science, and widespread media coverage. *
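To see why risk-of-extreme-event metrics reliably deliver large factors of increase, here is a toy calculation (my illustration, not from Brown’s paper, with invented numbers): shift a normal distribution’s mean by half a standard deviation and compare that modest change to the relative change in the probability of exceeding a rare threshold.

```python
import math

def exceedance(mean, sd, threshold):
    """P(X > threshold) for a normal distribution, via the complementary error function."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2))

# Hypothetical fire-weather index (illustrative numbers only):
# baseline climate vs. a climate whose mean has shifted by 0.5 standard deviations.
baseline = exceedance(mean=0.0, sd=1.0, threshold=3.0)
warmed = exceedance(mean=0.5, sd=1.0, threshold=3.0)

risk_ratio = warmed / baseline
# A half-sigma shift in the mean yields roughly a 4.6-fold increase in the
# probability of exceeding the 3-sigma threshold -- a far more eye-popping
# number than the modest shift in the average itself.
print(f"baseline P = {baseline:.5f}, warmed P = {warmed:.5f}, ratio = {risk_ratio:.1f}")
```

The effect grows as the chosen threshold gets more extreme, which is one reason tail-risk framings tend to produce bigger headline numbers than changes in means or totals.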

Another way to get the kind of big numbers that will justify the importance of your research—and impress editors, reviewers, and the media—is to always assess the magnitude of climate change over centuries, even if that timescale is irrelevant to the impact you are studying.

For example, it is standard practice to assess impacts on society using the amount of climate change since the industrial revolution, but to ignore technological and societal changes over that time. This makes little sense from a practical standpoint since societal changes in population distribution, infrastructure, behavior, disaster preparedness, etc., have had far more influence on our sensitivity to weather extremes than climate change has since the 1800s. This can be seen, for example, in the precipitous decline in deaths from weather and climate disasters over the last century. Similarly, it is standard practice to calculate impacts for scary hypothetical future warming scenarios that strain credibility while ignoring potential changes in technology and resilience that would lessen the impact. Those scenarios always make for good headlines.

A much more useful analysis would focus on changes in climate from the recent past that living people have actually experienced and then forecasting the foreseeable future—the next several decades—while accounting for changes in technology and resilience. 

In the case of my recent Nature paper, this would mean considering the impact of climate change in conjunction with anticipated reforms to forest management practices over the next several decades. In fact, our current research indicates that these changes in forest management practices could completely negate the detrimental impacts of climate change on wildfires. 


* When I first saw Patrick’s paper on TwitX, I thought “why on earth did they pick that bizarre variable when we know total acres haven’t gone up?” I even wondered, skeptic that I am, if they had looked at a lot of other variables that didn’t work out for the Preferred Narrative.

**************

I think the article might be paywalled. So here’s more. I think I’ve tried to say this in the past, but much less articulately.

**************


This more practical kind of analysis is discouraged, however, because looking at changes in impacts over shorter time periods and including other relevant factors reduces the calculated magnitude of the impact of climate change, and thus it weakens the case for greenhouse gas emissions reductions.

You might be wondering at this point if I’m disowning my own paper. I’m not. On the contrary, I think it advances our understanding of climate change’s role in day-to-day wildfire behavior. It’s just that the process of customizing the research for an eminent journal caused it to be less useful than it could have been.

……………

This means conducting the version of the research on wildfires that I believe adds much more practical value for real-world decisions: studying the impacts of climate change over relevant time frames and in the context of other important changes, like the number of fires started by people and the effects of forest management. The research may not generate the same clean story and desired headlines, but it will be more useful in devising climate change strategies.

But climate scientists shouldn’t have to exile themselves from academia to publish the most useful versions of their research. We need a culture change across academia and elite media that allows for a much broader conversation on societal resilience to climate.

The media, for instance, should stop accepting these papers at face value and do some digging on what’s been left out. The editors of the prominent journals need to expand beyond a narrow focus that pushes the reduction of greenhouse gas emissions. And the researchers themselves need to start standing up to editors, or find other places to publish.

What really should matter isn’t citations for the journals, clicks for the media, or career status for the academics—but research that actually helps society.

Why would we do that? (Just joking, I am a proud graduate of land-grant institutions with that explicit mission).

*******************************

Here’s the beginning of the piece:

If you’ve been reading any news about wildfires this summer—from Canada to Europe to Maui—you will surely get the impression that they are mostly the result of climate change.

Here’s the AP: Climate change keeps making wildfires and smoke worse. Scientists call it the “new abnormal.”

And PBS NewsHour: Wildfires driven by climate change are on the rise—Spain must do more to prepare, experts say.

And The New York Times: How Climate Change Turned Lush Hawaii Into a Tinderbox.

And Bloomberg: Maui Fires Show Climate Change’s Ugly Reach.


Why is this happening?

It starts with the fact that a researcher’s career depends on his or her work being cited widely and perceived as important. This triggers the self-reinforcing feedback loops of name recognition, funding, quality applications from aspiring PhD students and postdocs, and of course, accolades.

But as the number of researchers has skyrocketed in recent years—there are close to six times more PhDs earned in the U.S. each year than there were in the early 1960s—it has become more difficult than ever to stand out from the crowd. So while there has always been a tremendous premium placed on publishing in journals like Nature and Science, it’s also become extraordinarily more competitive.

In theory, scientific research should prize curiosity, dispassionate objectivity, and a commitment to uncovering the truth. Surely those are the qualities that editors of scientific journals should value.

In reality, though, the biases of the editors (and the reviewers they call upon to evaluate submissions) exert a major influence on the collective output of entire fields. They select what gets published from a large pool of entries, and in doing so, they also shape how research is conducted more broadly. Savvy researchers tailor their studies to maximize the likelihood that their work is accepted. I know this because I am one of them.

Disclaimer: I’ve had convos with various folks at The Breakthrough Institute and will be moderating a panel at an upcoming conference of theirs, but have never spoken to Patrick.

“Proforestation”: It Ain’t What It Claims To Be

‘Proforestation’ separates people from forests

AKA: Ignorance and Arrogance Still Reign Supreme at the Sierra Club.

I picked this up from Nick Smith’s Newsletter (sign up here)
Emphasis added by me as follows:
1) Brown Text for items NOT SUPPORTED by science with long term and geographically extensive validation.
2) Bold Green Text for items SUPPORTED by science with long term and geographically extensive validation.
3) >>>Bracketed Italics for my added thoughts based on 59 years of experience and review of a vast range of literature going back to way before the internet.<<<

“Proforestation” is a relatively new term in the environmental community. The Sierra Club defines it as: “extending protections so as to allow areas of previously-logged forest to mature, removing vast amounts of atmospheric carbon and recovering their ecological and carbon storage potential.” >>>Apparently, after 130 years of existence, the Sierra Club still doesn’t know much about plant physiology, the carbon cycle, or the increased risk of calamitous wildfire spread caused by the close proximity of stems and competition-driven mortality in unmanaged stands (i.e. the science of plant physiology regarding competition, limited resources and fire spread physics). Nor have they thought out the real risk of permanent destruction of the desired ecosystems nor the resulting impact on climate change.<<<

Not only must we preserve untouched forests, proponents argue, but we must also walk away from previously-managed forests too. People should be entirely separate from forest ecology and succession. >>>More abject ignorance and arrogant woke policy based only on vacuous wishful thinking.<<<

Except humans have managed forests for millennia. In North America, Indigenous communities managed forests and sustained their resources for at least 8,000 years prior to European settlement. It is true people have not always managed forests sustainably. Forest practices of the late 19th century are a good example. >>>Yes, and the political solution pushed on us by the Sierra Club and other faux conservationists, beginning with false assumptions about the Northern Spotted Owl, was to throw out the continuously improving science (i.e. Continuous Process Improvement [CPI]). The concept of using the science to create sustainable practices and laws that regulated the bad practices driven by greed and arrogance wasn’t even considered seriously. As always, the politicians listened to the well-heeled squeaky voters. Now, their arrogant ignorance has given us National Ashtrays, destruction of soils, and an ever-increasing probability that great acreages of forest ecosystems will be lost to the generations that follow, who will also have to cope with the exacerbated climate change. So here we are: in 30+/- years the Faux Conservationists have made things worse than the greedy timber barons ever could have. And the willfully blind can’t seem to see what they have done. Talk about arrogance.<<<

Forest management provides tools to correct past mistakes and restore ecosystems. But Proforestation even seems to reject forest restoration that helps return a forest to a healthy state, including controlling invasive species, maintaining tree diversity, and returning forest composition and structure to a more natural state.

Proforestation is not just a philosophical exercise. The goal is to ban active forest management on public lands. It has real policy implications for the future management (or non-management) of forests and how we deal with wildfires, climate change and other disturbances.

We’ve written before about how this concept applies to so-called “carbon reserves.” Now, powerful and well-funded anti-forestry groups are pressuring the Biden Administration to set aside national forests and other federally-owned lands under the guise of “protecting mature and old-growth” trees.

In its recent white paper on Proforestation (read more here), the Society of American Foresters writes that “preservation can be appropriate for unique protected areas, but it has not been demonstrated as a solution for carbon storage or climate change across all forested landscapes.”

Proforestation doesn’t work when forests convert from carbon sinks into carbon sources. A United Nations report pointed out that at least 10 World Heritage sites – the places with the highest formal environmental protections on the planet – are net sources of carbon pollution. This includes the iconic Yosemite National Park.

The Intergovernmental Panel on Climate Change (IPCC) recognizes active forest management will yield the highest carbon benefits over the long term because of its ability to mitigate carbon emitting disturbance events and store carbon in harvested wood products. Beyond carbon, forest management ensures forests continue to provide assets like clean water, wildlife habitat, recreation, and economic activity.
>>>(i.e. TRUE SUSTAINABILITY)<<<

Forest management offers strategies to manage forests for carbon sequestration and long-term storage. Proforestation rejects active stewardship that can not only help cool the planet, but help meet the needs of people, wildlife and ecosystems. You can expect to see this debate intensify in 2023.

Science Friday: From Caveats to Certitude in Wildfire-Climate Relationships

Proportion of variance explained by top-ranking multiple regression models of seasonal climate influence on annual fire activity for (A) federal lands and (B) NEON domains in the continental United States. Fig. 4 in Syphard et al. 2017 paper.

A while back I thought I had linked to this WaPo story “Massive wildfires helped fuel global forest losses in 2021” but I got totally sidetracked and so am posting it now. I’m sure it’s an interesting story. But this is how I got sidetracked.

The article said:

In a recent assessment, the Intergovernmental Panel on Climate Change found that human-caused emissions have significantly increased the area burned by wildfires in the American West and British Columbia.
So I wrote the reporter, because the link wasn’t to the IPCC, and the reporter replied that the link is the study that the IPCC referenced. So let’s take a look at that study.

First, the study was of BC, not BC and the American West. The authors don’t make that claim, as far as I can tell. It’s about weather (affected by climate and AGW) and how that impacts fires. And I think it’s worth looking at the authors’ own set of caveats.
The result is dependent on the regression model being realistic. Our regression model assumes that nonclimatic variability in the natural log of area burned is stationary in time and does not account for the possible influence of human factors such as changes in forest management or human ignition sources. Humans have long had a direct influence on fire activity (Bowman et al., 2011), and trends in some regions have been strongly impacted by human intervention (Fréjaville & Curt, 2017; Parisien et al., 2016; Turco et al., 2014). Syphard et al. (2017) demonstrated that climate influence on fire activity becomes less important with a strong human presence. We also do not consider directly the impacts of repeated suppression over time, which could result in larger fires, nor do we consider the pine beetle infestation that has affected BC (Kurz et al., 2008), although such a disturbance did not impact large-scale area burned in the United States (Hart et al., 2015). Nonetheless, consistency of our finding with attribution of an increase in fire risk and previous studies demonstrating that climate change is an important driver of changes in fire behavior in many regions of North America (Gillett et al., 2004; Littell et al., 2009; Morton et al., 2013) supports the finding of a substantial contribution of anthropogenic climate change to the risk of a burned area as large as that in 2017.
My bold. It appears the links to the cited papers work in the paper itself, but not from this excerpt. So if you want to go to a cited paper, you have to go back to the original paper. This from the conclusions was also interesting:
 While we find no evidence that anthropogenic influence contributed to the risk of extremely dry conditions, we find that it has substantially increased the risk of warm conditions, elevated wildfire risk, and large area burned comparable to those observed.
I hadn’t seen the Syphard et al. (2017) paper before (or have I forgotten it?), and it was interesting. Here’s an excerpt from the results:
Of the 10 different variables we explored to explain the variation in the strength of fire-climate relationships, none were statistically significant at P ≤ 0.05 except for the anthropogenic variables (Table 1 and Figs. S1 and S2). In the federal data, regions in close proximity to either roads or developed areas had weaker fire-climate relationships; and in the FPA FOD data, regions with a higher mean human population and proportion of developed land had weaker fire-climate relationships.
More folks around, less direct relationship between climate and fire.  It raises many interesting questions, as the authors say in the discussion. Hopefully the authors are continuing to explore them.
Humans can affect wildfire patterns in a number of ways, from starting fires to managing fires (e.g., prescribed fire or fire suppression) and via changes in the abundance and continuity of fuel through land use decisions. For example, humans alter native vegetation through agriculture, urbanization, and forestry management practices. Although their geographical subdivisions were coarser than those used here, regions where lightning-started fires dominated in a recent nationwide analysis (27) show some alignment with areas here where fire-climate relationships were stronger, largely in the interior, northwestern part of the country. Nevertheless, although human-caused ignitions predominate across most of the country, there are also regions like the interior Southeast where fire-climate relationships were relatively strong but the cause of ignitions was nevertheless dominated by humans.
This suggests that human influence goes beyond just starting fires, and there is some combination of factors that leads to a dampening of the effect of climate on fire activity. This may be due to effective lengthening of the fire season (27), or starting fires in areas where naturally occurring fires are rare. On the other hand, fragmentation of fuels via land use and urban development may interrupt the spread of fires that would otherwise occur in a less human-dominated landscape. In this case, the climatological factors that might otherwise lead to fire spread are overridden by human-created landscape patterns. This dual effect of humans either increasing fire where it would not otherwise occur, or decreasing it where it would occur, may be why the overall amount of fire in a region was not significantly related to the importance of climate. A couple of other studies performed at smaller extents also suggest that human influence [i.e., suppression policy (40) or land use (33)] in addition to fuel quantity or quality (31–33) can potentially mediate or dampen fire-climate relationships across different temporal scales.
One of the most consistently important variables for explaining strong fire-climate relationships was prior-year precipitation, which is similar to results in other studies (e.g., refs. 28–30 and 35). This relationship is often found in grasslands and savannahs where fire activity is fuel-limited. High precipitation appears to have a dampening effect on current-year fires but leads to high fuel loads in subsequent years, and the production of fine-fuel biomass that dries by the following year is conducive to fire spread (41). In forested ecosystems, this relationship has been shown to be present in forest types with herbaceous understories and absent in ones with understory fuels comprising litter and other downed material (42).
An important caveat to this study is the fact that the variance explained for the differences in strength of fire-climate relationships was not particularly high, and there was substantial variability in the data (Figs. S1 and S2). Therefore, despite evaluating the role of climatic or topographic variability, or variation in vegetation or forest biomass, differences in strength of fire-climate relationships may be due to additional factors. For example, temperature or precipitation patterns may be less variable in some regions than in others, meaning there is less annual variability in fire activity due to these variables. Another reason may be that the fire-climate models do not include essential factors such as localized fire-weather events, long-term drought, or lightning density, nor do they account for variable interactions or more complex variable combinations.
Maybe “fire activity” and “acres burned” are different measures?
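For readers who want a concrete sense of the “proportion of variance explained” metric behind Syphard et al.’s Fig. 4, here is a minimal ordinary-least-squares sketch on synthetic data. All numbers are invented for illustration; the real analysis fit multiple regressions to actual fire and climate records.

```python
import random

random.seed(0)

# Synthetic stand-in for the kind of model the excerpts describe:
# natural log of annual area burned regressed on a seasonal climate index.
n = 40
climate = [random.gauss(0, 1) for _ in range(n)]                 # e.g. summer dryness anomaly
log_area = [1.5 + 0.8 * c + random.gauss(0, 0.6) for c in climate]

# Ordinary least squares for a single predictor.
mx = sum(climate) / n
my = sum(log_area) / n
sxy = sum((x - mx) * (y - my) for x, y in zip(climate, log_area))
sxx = sum((x - mx) ** 2 for x in climate)
slope = sxy / sxx
intercept = my - slope * mx

# Proportion of variance explained (R^2) -- the quantity mapped in Fig. 4.
ss_res = sum((y - (intercept + slope * x)) ** 2 for x, y in zip(climate, log_area))
ss_tot = sum((y - my) ** 2 for y in log_area)
r2 = 1 - ss_res / ss_tot
print(f"slope = {slope:.2f}, R^2 = {r2:.2f}")
```

A low R² in a region means year-to-year fire activity there tracks climate only weakly, which is exactly the pattern Syphard et al. found near roads, development, and higher human populations.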
****************
If for some reason you are still interested in this topic: I went to the 3,500-or-so-page IPCC report and searched on Kirchmeier-Young, the first author of the paper. And much to my surprise and delight, I found these paragraphs (page 2-53), which summarize how sure the IPCC is and which sources they cited. I’m fascinated by how the authors got from the caveats in the Kirchmeier-Young et al. paper to “robust evidence, high agreement.” Plus the differences in dryness as a factor. I suppose it’s mostly a function of different conditions in different places.
Since the IPCC Fifth Assessment Report and the IPCC Special Report on Land, published research has detected increases in the area burned by wildfire, analysed relative contributions of climate and non-climate factors, and attributed burned area increases above natural levels to anthropogenic climate change in one part of the world – western North America (robust evidence, high agreement) (Abatzoglou and Williams, 2016; Partain et al., 2016; Kirchmeier-Young et al., 2019; Mansuy et al., 2019; Bowman et al., 2020b). Across the western United States, increases in vegetation aridity due to higher temperatures from anthropogenic climate change doubled burned area from 1984 to 2015 over what would have burned due to non-climate factors, including unnatural fuel accumulation from fire suppression, with the burned area attributed to climate change accounting for 49% (32-76%, 95% confidence interval) of cumulative burned area (Abatzoglou and Williams, 2016). Anthropogenic climate change has doubled the severity of a southwest North American drought from 2000 to 2020 that has reduced soil moisture to its lowest levels since the 1500s (Williams et al., 2020), driving half of the increase in burned area (Abatzoglou and Williams, 2016; Holden et al., 2018; Williams et al., 2019). In British Columbia, Canada, the increased maximum temperatures due to anthropogenic climate change increased burned area in 2017 to its highest extent in the 1950-2017 record, seven to eleven times the area that would have burned without climate change (Kirchmeier-Young et al., 2019). In Alaska, USA, the high maximum temperatures and extremely low relative humidity due to anthropogenic climate change accounted for 33-60% of the probability of wildfire in 2015, when the area burned was the second highest in the 1940-2015 record (Partain et al., 2016). In protected areas of Canada and the United States, climate factors (temperature, precipitation, relative humidity, evapotranspiration) accounted for 60% of burned area from local human and natural ignitions from 1984 to 2014, outweighing local human factors (population density, roads, and built area) (Mansuy et al., 2019).

In summary, field evidence shows that anthropogenic climate change has increased the area burned by wildfire above natural levels across western North America in the period 1984-2017, at global mean surface temperature increases of 0.6ºC-0.9ºC, increasing burned area up to 11 times in one extreme year and doubling burned area over natural levels in a 32-year period (high confidence).
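A quick back-of-the-envelope check (my own arithmetic, not from the IPCC text) shows how the “doubled burned area” and “49% of cumulative burned area” figures fit together: if climate change multiplies burned area by some factor relative to the non-climate counterfactual, the climate-attributed share of the total follows directly.

```python
# Back-of-envelope check (my arithmetic, not the IPCC's): if climate change
# multiplied burned area by some factor relative to the non-climate-factors-only
# counterfactual, what share of the total is attributable to climate change?
def attributed_fraction(multiplier):
    """Climate-attributed share of total burned area, given the multiplier."""
    return (multiplier - 1.0) / multiplier

print(attributed_fraction(2.0))   # doubling -> 0.5, consistent with the 49% central estimate
print(attributed_fraction(11.0))  # an 11-fold extreme year -> about 0.91
```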

Practice of Science Friday: Who Is Holding the Climate Research Funding Flashlight and How it Affects Our Thinking

One memorable summer or fall day, probably more than ten but less than twenty years ago, when I worked for the Forest Service, I attended some kind of conference (most likely planning) in Missoula. Missoula was very smoky from wildfires. Several colleagues and I sought out a brewpub after the conference, but it was too smoky to sit outside, so I remember us getting a keg and retiring to a colleague’s home.

Now we know, via many scientific studies, that wildfire smoke is bad for you. Perhaps we knew it then, but at the time it just seemed like part of life in wildfire-prone country. What has changed? Scientists have studied smoke extensively and found it to be bad for health. The same particles were always bad, but now scientific studies exist to tell us so. I think of scientific interest as a flashlight. The world goes on, but the part that gets highlighted as “science” depends on who’s holding the flashlight. This is much discussed in the history and sociology of science literature, but not so much questioned in day-to-day information sharing and reporting.

And who holds the flashlight is determined by a complex process of many actors and institutions that is not well understood. If we look at the relative funding for sociology of science, we can see why.

What made smoke suddenly more interesting and worthy of research?  Was it the fact that highly populated areas were getting more smoke?  Was it the fact that suddenly wildfires were “due to” climate change and there’s plenty of money in climate change?

I remember clearly one day when I worked in the WO in Vegetation Management and Protection Research: Elvia Niebla, our USGCRP (US Global Change Research Program) staff person, came and said, “there’s going to be huge amounts of money for climate change.” So I asked her, “everything we deal with is affected by climate, so couldn’t anything be funded?” This seemed like a great deal for almost any researcher. But the program couldn’t fund everything, so people had to decide which topics and approaches were more worthy.

I attended an SAF meeting somewhere in New York (perhaps in the 80’s?) and a Station Director (FS research equivalent of Regional Forester) told me that his Station wouldn’t be doing much work with the National Forests anymore as they had tapped into climate change bucks and they were going to do “real science.”

And most recently, just a few years before I retired, I received a call from a nice woman in Missoula who had been tasked with asking research users what our social science needs were. After talking about it, it became clear that this was because they had received climate change funding, so they couldn’t actually study today’s problems and issues - only those due to climate change. Understanding how to reduce recreation impacts? Uh, no. Housing in resort communities? Well, perhaps if they were climate refugees. Peoples’ attitudes toward prescribed fire? Yes, if due to climate change. It’s perhaps a slippery slope from rationalizing useful research by highlighting the climate aspects to publishing studies that unintentionally encourage people to overlook other sources of problems.

I don’t think that we have adequately thought about how this infusion of money (and chasing after it) may have changed the way we view issues, the disciplines we hire (or don’t) and how that affects what’s illuminated by the flashlight, the way we approach issues about forests, and so on.  And there is definitely a tendency for the volume control holders (media, foundations and interest groups) to highlight certain results.   Certainly “climate change is not a big factor in this” is not as attractive as “climate change is going to have really bad effects.”

For whatever reason, there was a push for many years to value mitigation over adaptation. And the sciences involved in mitigation (atmospheric physics and chemistry) became way cooler and more important than, say, hydrology or wheat breeding (adaptation sciences). If we look back at the history of science, we can just barely see the fingerprints of the physical sciences being way cooler than others (aka the well-known “physics envy”). Conceivably much of the funding could have been sent instead, once the climate problem was identified, to coalitions of engineering schools to figure out cost-effective ways to decarbonize. What factors of the scientific enterprise kept modelling so high in importance, and figuring out ways to fix the problem so low? Does this age-old historical bias against applied science and engineering unconsciously play out in what is covered in the media, leading to climate despair? And how does this play into the availability of satellite data (another flashlight), journals favoring global conclusions, and studies getting further and further from real people (social sciences) and real places (the scale at which actions take place)?

An example of the mitigation/adaptation disconnect is this interview with Katharine Hayhoe (et tu, TNC Chief Scientist?) on On Being with Krista Tippett. She says that Texans need to give up barbecue and pickups for a better world... meanwhile, auto manufacturers are producing electric pickups.

We can look at other examples as we encounter them.

Science Friday: More Climate Model Innards and Critiques

This is an interesting article in Science Magazine leading up to the upcoming IPCC report.  This one looks at some of the reasons and the “poorly understoods” that are causing models to run hot.

The models were also out of step with records of past climate. For example, scientists used the new model from NCAR to simulate the coldest point of the most recent ice age, 20,000 years ago. Extensive paleoclimate records suggest Earth cooled nearly 6°C compared with preindustrial times, but the model, fed with low ice age CO2 levels, had temperatures plummeting by nearly twice that much, suggesting it was far too sensitive to the ups and downs of CO2. “That is clearly outside the range of what the geological data indicate,” says Jessica Tierney, a paleoclimatologist at the University of Arizona and a co-author of the work, which appeared in Geophysical Research Letters. “It’s totally out there.”

To find out why, modelers probed the guts of the simulations, focusing on their representation of clouds, long the wild card of climate change. The models can’t simulate clouds directly, so they rely on known physics and observations to estimate cloud properties and behavior. In previous models ice crystals made up more of the low clouds in the midlatitudes of the southern Pacific Ocean and elsewhere than satellite observations seemed to justify. Ice crystals reflect less sunlight than water droplets, so as these clouds heated and the ice melted, they became more reflective and caused cooling. The new models start with more realistic clouds containing more supercooled water, which allows other dynamics driven by warming—the penetration of dry air from above and a subduing of turbulence—to thin the clouds.

But that fix has allowed scientists to spy another bias previously countered by the faulty cooling trend. In both the old and new climate models, the patchy cumulus clouds that form in the tropics thin out in response to warming, allowing in more heat than satellite observations suggest, according to a study by Timothy Myers, a cloud scientist at Lawrence Livermore National Laboratory. “Even though one feature of the climate is now more realistic, another that’s persistently biased has been revealed,” Myers says.

But this part brought back a memory from the 70’s…

So the IPCC team will probably use reality—the actual warming of the world over the past few decades—to constrain the CMIP projections. Several papers have shown how doing so can reduce the uncertainty of the model projections by half, and lower their most extreme projections. For 2100, in a worst-case scenario, that would reduce a raw 5°C of projected warming over preindustrial levels to 4.2°C.

Our silviculture professor at Yale, Dave Smith, had a skeptical view of forest vegetation models. We were also, as graduate students, learning FORTRAN as a programming language (history: this was actually a graduate-level course!). Dave told us something like, “when trees grow too tall, they just put in a card that says ‘if the modeled tree height value is greater than 90, tree height equals 90.’” I don’t know if that was actually true of the JABOWA model, but this climate modeling intervention sounds a bit like the same thing.
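I don’t know what JABOWA’s code actually looked like, but the kind of hard cap Dave described is a one-liner in any language (purely illustrative; the ceiling of 90 is just the number from his story):

```python
# Illustrative only: the kind of ad hoc "card" Dave Smith described --
# if the modeled tree height exceeds 90, just set it to 90.
MAX_HEIGHT = 90.0  # hypothetical ceiling from the anecdote

def capped_height(modeled_height):
    """Return the modeled height, clamped to the anecdote's ceiling."""
    return min(modeled_height, MAX_HEIGHT)

print(capped_height(85.3))   # below the cap: passes through unchanged
print(capped_height(104.7))  # above the cap: clamped to 90.0
```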

The modelers hope to do better next time around. Lamarque says they may test new simulations against recent paleoclimates, not just historical warming, while building them. He also suggests that the development process could benefit from more time, with updates every decade or so rather than the current report interval of every 7 years. And it could be helpful to divide the modeling process in two, with one track focused on scientific experimentation—when a large range of climate sensitivities is helpful—and the other on providing a best estimate to policymakers. “It’s not easy to reconcile these two approaches under a single entity,” Lamarque says.

A cadre of researchers dedicated to the task of translating the models into useful projections could also help, says Angeline Pendergrass, a climate scientist at Cornell University who helped develop one technique for weighting the model results by their accuracy and independence. “It’s an actual job to go between the basic science and the tools I’m messing around with,” she says.
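As a rough sketch of the general idea (my own toy example, not the actual technique Pendergrass helped develop): weight each model’s projection by how well it reproduced observed warming, so hot models with poor historical performance count for less.

```python
# Toy sketch of performance weighting (my own illustration, not the actual
# weighting scheme): models that matched observed warming poorly get less
# say in the ensemble projection.
def weighted_projection(projections, errors):
    """Weighted mean of model projections (degC), with weights shrinking
    as each model's error against observed warming (degC) grows."""
    weights = [1.0 / (1.0 + e) for e in errors]  # smaller error -> larger weight
    return sum(w * p for w, p in zip(weights, projections)) / sum(weights)

# Hypothetical ensemble: the hottest model also has the worst historical error.
projections = [3.0, 4.0, 5.0]
errors = [0.1, 0.3, 0.9]
print(round(weighted_projection(projections, errors), 2))  # below the unweighted mean of 4.0
```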

For now, policymakers and other researchers need to avoid putting too much stock in the unconstrained extreme warming the latest models predict, says Claudia Tebaldi, a climate scientist at Pacific Northwest National Laboratory and one of the leaders of CMIP’s climate projections. Getting that message out will be a challenge. “These issues don’t translate very well in practice,” she says. “It’s going to be hard for people looking to make some projection of a water basin in the West to make sense of it.”

Maybe people who manage resources like water just acknowledge “we don’t know, but things could be somewhat, very, or terribly much worse in a variety of ways due to climate change and a variety of other factors that may interact” and then see how the responses would play out, and what would be key decision points. It seems like that’s actually what they are doing, at least, as I recall Denver Water, ten or more years ago. Here are some examples.

Practice of Science Friday: Getting to Co-Design and Co-Production of Climate Models

From Meadow et al.
https://journals.ametsoc.org/view/journals/wcas/7/2/wcas-d-14-00050_1.xml

We’ve been discussing the RCP 8.5 issue in climate modeling, this week and earlier, which reminded me of this story. A few years ago, I attended a reunion (my class’s 40th) at the Yale School of Forestry and Environmental Studies (now School of the Environment). Jerry Melillo, of Woods Hole, gave a talk about climate modelling.

Jerry showed a slide of “protected areas,” and of course, having worked on the Colorado Roadless Rule for years, I could see from the map that I might not agree with the boundaries. At least not in terms of physically meaningful differences for input into climate models. Imagine how hard it would be to take the non-Wilderness parts of your neighboring forest and make assumptions about what they would or would not contribute to climate change over the next fifty years or more. My question to Jerry was: “forest management is a sliding scale, from planting trees after fires and leaving them alone thereafter, to intensive forest management as practiced in the southeastern US with loblolly pine. It’s a dial, not a toggle, so how do models of future land use reflect that?” His answer was that the IUCN had made the determination of what was protected, and he was using their numbers. Of course, treating something as a toggle when it is a dial is a value judgment, not a scientific finding. And folks might choose the toggle due to computational convenience, not proximity to validity.
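The dial-versus-toggle distinction is easy to make concrete (illustrative types only; these are not any climate model’s actual land-use inputs):

```python
# "Dial vs. toggle" made concrete (illustrative types only; not any climate
# model's actual land-use inputs). A binary protected/unprotected flag throws
# away the gradations a continuous management-intensity value retains.
from dataclasses import dataclass

@dataclass
class ToggleParcel:
    protected: bool  # IUCN-style binary classification

@dataclass
class DialParcel:
    management_intensity: float  # 0.0 = plant-and-leave ... 1.0 = intensive loblolly

# Four parcels along the sliding scale, then the same parcels forced
# through a 0.5 cutoff into the binary encoding.
dial_parcels = [DialParcel(x) for x in (0.0, 0.2, 0.5, 0.9)]
toggle_parcels = [ToggleParcel(protected=(p.management_intensity < 0.5)) for p in dial_parcels]
print([p.protected for p in toggle_parcels])  # [True, True, False, False]
```

Everything between the extremes collapses into one of two bins, which is exactly the information the toggle discards.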

Roger Pielke, Jr. and Justin Ritchie wrote a comprehensive historical account of the development and use of RCPs, including definitions of the relevant jargon, from a history of science/sociology of science perspective. It has been the best guide I’ve found for trying to understand “is our work (land use) an input, an output, or both?” There are even flowcharts! Roger and Justin have a few suggestions (these are only some):

*Despite the presence of thousands of IAM scenarios in the community, and the motivation to proceed with ‘one model one vote’ dynamics where all models are assessed equally with no explicit probability statements, more regular attention needs to be given to a much simplified set of near-term, policy relevant scenarios, similar to how IEA issues three scenarios on an annual basis: a Current Policies Scenario (high), a Stated Policies Scenario (baseline) and a Sustainable Development (policy) scenario.

(I see this as fewer but more realistic options, done more frequently to reflect changes and course-correct assumptions, some of the same preferences we would have for any planning effort.)

*More work is needed to reconcile long-term narrative pathways based on an idealized year 2100 end-point with what policy makers need to know about the next few years and decades. While there are an increasing number of scenarios focused on the role of Paris Agreement NDCs through 2030, there is a significant gap in the literature for scenarios that address developments before 2050 in the context of today’s policy environment. This gap is created by an excessive focus on long-run, full century scenarios, driven in large part by the needs of the physical science modeling community.

There seems to be a need for policy folks to say “nope, that solution isn’t working for us, how about trying …?” I’m not sure that there is a direct connection among groups for this discussion to take place. Remember the idea of “co-designed, co-produced research” with stakeholders and policy makers?

* Climate research and assessment would benefit from a more ecumenical and expansive view on relevant knowledge. The IPCC scenario process has been led by a small group of academics for more than a decade, and decisions made by this small community have profoundly shaped the scientific literature and correspondingly, how the media and policy communities interpret the issue of climate change. The dominant role of this small community might be challenged in order to legitimize a broader perspective of views, approaches and methods.

It would be handy IMHO if that were to be a role of (at least some of) the new climate $ in the President’s budget proposal. I can imagine a multi-stakeholder group at the regional level asking questions like “what do we want from climate models to help us plan mitigation and adaptation strategies?” How can we use our local and regional knowledge as input into the process? What have scientists learned that can be helpful to us, and what else do we need to know?

As it turns out there is quite a bit of literature on co-design and co-production of climate science. You can go into Google Scholar and search on “climate science co-production.” That’s how I found the Alison et al. paper that yielded the Table 2 above.

Climate Science Voyage of Discovery: Bioenergy with Carbon Capture and Storage - From Pulp Mill to IAMs to Saving the Planet

A Kraft pulp mill in Sweden.

I’ve always been curious about how forests are handled in climate modeling (to predict the future, the models have to assume many things, including land use). I wonder who the “they” is who decides what goes in, and which groups are consulted on these numbers. Here’s a history of the idea of BECCS from Carbon Brief that touches on one aspect of this: where BECCS came from and how it got included in the IAMs (integrated assessment models). The article also has sidebars for some of the technical terms, which is handy.

Bioenergy with carbon capture and storage – better known by the acronym “BECCS” – has come to be seen as one of the most viable and cost-effective negative emissions technologies.

Even though they have yet to be demonstrated at a commercial scale, negative emissions technologies – typically BECCS – are now included by climate scientists in the majority of modelled “pathways” showing how the world can avoid the internationally agreed limit of staying “well below” 2C of global warming since the pre-industrial era.

Put simply, without deploying BECCS at a global scale from mid-century onwards, most modellers think we will likely breach this limit by the end of this century.

But where did the idea for this “saviour” technology come from? Who came up with it? Who then developed and promoted the concept?

Möllersten says the first spark for the idea of BECCS came to him in 2000 when he was preparing to give a presentation at the 5th biannual Greenhouse Gas Control Technologies (GHGT) conference in Cairns, Australia. Working the idea through with Jinyue Yan, his PhD supervisor, Möllersten claims today that he “cannot remember the exact moment when we thought about this”, but he can recall the background:

“The way it started for me was when I started doing the work for my PhD. My focus was on looking at the pulp and paper industry as a very important industrial branch in the Swedish energy system. What measures could be taken to achieve cost effective emission reductions or CO2 emission reductions? Having worked on this topic for a while, looking at the most conventional measures, my professor and I noticed that there was a lot of work going on in this rather new and exciting area that was called “carbon capture and storage”. We also noticed that, as far as we could see, all that work was focused on emissions from fossil use. We simply decided to investigate what CCS could mean in the context of pulp and paper mills. When we did this work, we were looking at energy systems with a negative CO2 balance. For me, personally, it felt exciting to see that.”

And from the pulp mill study we get to..

But a key tipping point in the story of BECCS came when climate scientists started to increasingly include it in their modelling for sub-2C emissions pathway scenarios, often to the point that they grew reliant on it.

..

In little more than a decade, BECCS had gone from being a highly theoretical proposal for Sweden’s paper mills to earn carbon credits to being a key negative emissions technology underpinning the modelling, promoted by the IPCC, showing how the world could avoid dangerous climate change this century.

It’s interesting to me that an idea could become so important among modelers (whom we look to as experts on climate change) without ever stopping for a reality check with the folks who would have to carry it out.

Climate Science Voyage of Discovery: Climate Attribution For Wildfires and the Science-Journalism Translation

We’ve heard in various stories and op-eds that wildfires are “caused by,” “exacerbated by,” or driven “primarily” or “significantly” by climate change. So let’s trace these claims back to the studies.

The Society of Environmental Journalism sent me a link to these Science Facts put out by our friends at AAAS. Scientists reading this.. please register with Sciline here.

Top Line
Human-caused climate change is a significant contributor to the increasing size and intensity of, and damage from, western U.S. wildfires.

The Essentials
In the western United States, human-caused climate change caused more than half the increase in forest fuel aridity (how dry and flammable vegetation is) since the 1970s and has approximately doubled the cumulative area burned in forest fires since 1984.

Now many of us might ask: how could you know that “across the western US”? Could there have been more vegetation due to fire suppression, leading to less water for each plant and hence drier vegetation? Haven’t suppression tactics changed over that period? If logging is bad for fires, as some claim, then there’s been much logging since 1984, so how does that factor in? How could these estimates possibly be considered “facts”?

But first let’s take an aside to point out two concepts that may get lost in stories.

1. Not everything about climate is AGW or anthropogenic climate change. As we’ve seen via Matthew’s graphs of the PDO, there is much climate variability that occurs naturally. E.g., in my own part of the country, in the 1930s there was a serious drought that led to the Dust Bowl. So just because I’m experiencing heat, it’s not necessarily AGW.

2. And not all climate change is about carbon, nor even all greenhouse gases (GHGs). Land use changes such as albedo, irrigation, urban heat islands and so on can also affect climate, in complex and interactive ways we don’t yet understand. So we can’t go directly to “reducing carbon (alone) will solve the problem.”

To find the contribution of AGW, we have to look at attribution studies. These studies run climate models and try to find the fingerprint of AGW by, essentially, comparing model results to real-world observations of some kind. I recommend taking the time to read this one. Sure, I don’t get all the details of CMIPs and so on, but we can all understand what data goes in, what comes out, and we can read the caveats and the conclusions. We can also examine the almost magical transitions between the paper (careful), the interviews (not quite as careful), and the headlines, e.g., “caused by.”
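In stripped-down form, the fingerprint comparison amounts to regressing observations on a model-simulated forced signal (a sketch of the general idea only; real attribution studies use optimal fingerprinting, with noise estimates from unforced control runs):

```python
# Bare-bones sketch of the "fingerprint" idea (general concept only; real
# attribution studies use optimal fingerprinting, with noise covariance
# estimated from unforced control runs). Regress observations on a
# model-simulated forced signal: y = beta * x + noise.
def scaling_factor(signal, observations):
    """Ordinary least squares slope (no intercept) of observations on signal."""
    numerator = sum(x * y for x, y in zip(signal, observations))
    denominator = sum(x * x for x in signal)
    return numerator / denominator

# Hypothetical decadal series: a forced warming signal and observations
# that roughly track it.
signal = [0.0, 0.1, 0.2, 0.4, 0.6, 0.8]
obs = [0.05, 0.08, 0.22, 0.35, 0.62, 0.83]
print(round(scaling_factor(signal, obs), 2))  # near 1.0: fingerprint consistent with observations
```

A scaling factor near 1 (with uncertainty excluding 0) is what papers mean when they report a human fingerprint being “detected.”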

Here’s a typical exchange of scientists:

Swain with UCLA and other scientists earlier this year published a study that said climate change has doubled the number of extreme-risk days for California wildfires.

It said temperatures statewide rose 1.8 degrees Fahrenheit since 1980, while precipitation dropped 30%. That doubled the number of autumn days that offer extreme conditions for the ignition of wildfires (Climatewire, April 3).

The heat is expected to get worse with time. Climate models estimate that average state temperatures will climb 3 degrees Fahrenheit by 2050 unless the world makes sharp cuts in greenhouse gas emissions, said Michael Wehner, a senior scientist at Lawrence Berkeley National Laboratory.

Even with emissions cuts, average temperatures would rise 2 degrees by midcentury, he said.

Jon Keeley, a senior scientist at the U.S. Geological Survey Western Ecological Research Center, argued that the study from Swain and others failed to show that hotter temperatures are driving wildfires.

“Show us data that shows that level of temperature increase is actually associated with increased fire activity,” Keeley said. “They don’t show that.”

Keeley added, “We ought to be much more concerned with ignition sources than a 1- to 2-degree change in temperature.”

A big contributor to large California fires is that the state has focused on extinguishing blazes for about a century rather than allowing for controlled burns, he said. That has caused dead vegetation to accumulate.

Trump has accused California of failing to “sweep” its forests, which he has linked to fires in the state.

Keeley said that “we don’t sweep forests here in the U.S., but what we do is prescription burning. … It’s potentially the same thing. It’s modifying the fuels prior to a fire.”

Swain, the UCLA climate scientist, said global warming is affecting how big fires get and how fast they move.

“What happens when they start burning, what is the character of those fires, and is it changing?” Swain asked. “The answer is yes.”

If we look at the statements in the study, though, we get a “fingerprint for meteorological preconditions.” Not exactly “fires caused by climate change.”

Collectively, this analysis offers strong evidence for a human fingerprint on the observed increase in meteorological preconditions necessary for extreme wildfires in California.

In the present study, we do not quantify the relative role of increased urban and suburban incursion into the high-risk wildland-urban interface, nor the contribution of historical land/vegetation management practices to increasing wildfire risk or possible future climate-fire feedbacks

And sounding rather like Keeley..or any of the rest of us..

In the long-term, reduction of global greenhouse gas emissions is the most direct path to reducing this risk, though the near-term impacts of these reductions may be limited given the many sources of inertia in the climate system [78]. Fortunately, a broad portfolio of options already exists, including the use of prescribed burning to reduce fuel loads and improve ecosystem health [79], upgrades to emergency communications and response systems, community-level development of protective fire breaks and defensible space, and the adoption of new zoning rules and building codes to promote fire-resilient construction

I recommend reading the conclusions for yourself.

Science Friday: When is Research Useful and Who Decides? Uncertainty and Current Decisions

A pretty view of picnic grounds on Homestake Road (1890) Library of Congress https://www.loc.gov/pictures/item/2004665560/

Jon made a comment yesterday that I think is worth exploring in detail. He said:

If this is just a compilation of existing data, that’s one thing. If this will be their basis for future planning, and they are saying they are going to ignore future climate change, it might be hard to argue that’s the best available science.

I’m sure that no one wants to “ignore climate science.” On the other hand, how exactly should specific pieces of climate science be applied to a decision? Who makes that call? In the past, when the topic was less adversarial (say reforestation practices), the National Forests hired experts who would make that determination and decide what Forests should do. Now, I’m sure it won’t surprise you that even with these less-charged kinds of decisions, there was sometimes disagreement between practitioners and researchers (as well as within each community). But most of the time these did not boil over into public spats as it was taken for granted that the authority to decide lay with the local technical expert. Researchers were content to publish, and practitioners were content to pick the best approaches based on experience. Many times FS researchers and National Forests worked closely together in what we might call today “co-design and coproduction”. Today, though, we have broader questions, with more disciplines involved, so that there may not be one “expert,” and the public wants to understand the scientific questions where they have policy relevance. Both of those changes present challenges.

As I’ve argued before, we don’t have a clue as to how the microclimates perceived by trees will change due to climate change, and we also don’t know how those changes might affect living trees, nor how they might affect their offspring. Remember, climate models as used for projecting future conditions have economics as an input. I think, reading the views of experts right now, they have no clue how much the Coronavirus may set back worldwide economies and emissions. If you run out this string of cumulative cluelessness about the future, it becomes a decision question for stakeholders and decisionmakers: we don’t know, so how should that uncertainty affect our decisions? We also don’t have a clue as to whether trade policies will let invasive diseases or insects into the US that will decimate the ponderosa pines on the BH. The future is, indeed, unknown and uncertain. In fact, there are decision sciences that research how best to decide under conditions of uncertainty.

So, what to do about our cluelessness about future tree growth? I belong to what I might call the Pete Theisen school. He was the R-6 Regional Geneticist who said that additional growth due to tree improvement would come out in future measurements, so we shouldn’t try to model it in growth and yield models. Applying that to climate change, we would measure tree growth every ten years or so (or whatever the cycle is today) and incorporate that into future decisions. What we might call “monitoring the forest plan.” The same would go for instances of insects, diseases, and fires.

But as Jon points out, we also have the question of what is the “best available science.” We’d have to ask, “who decides what is best? Based on what criteria?” Peter Williams has spoken of the concept of “research utility.” I like that approach because it involves practitioners and stakeholders in determining whether a study is useful or relevant to the decisions that, at least on public lands, are essentially public decisions.

For me, the “best science” of tree growth in the Black Hills is what people have recently measured, knowing that it could change due to climate change or a variety of other factors, unknown, uncertain or unknowable.

Climate Science Voyage of Discovery. V. The Satellite Gaze and Grazing Animals

We’ve seen before that “where you drop the pin” (where you start) or at what scale you choose to examine a problem can lead to fundamentally different conclusions. In some cases, there seems to be no formal institution for the type of communication between people on the land, and people modeling the land. We might call this the “satellite gaze” after the idea of the “imperial gaze.”

The imperial gaze, in which the observed find themselves defined in terms of the privileged observer’s own set of value-preferences. From the perspective of the colonised, the imperial gaze infantilizes and trivializes what it falls upon, asserting its command and ordering function as it does so.

So let’s look at beef production, and the study that the chart above was based on. The authors are from Oxford, and the study was published in Science. According to Our World in Data, the authors looked at data across more than 38,000 commercial farms in 119 countries. If you look at the study itself (and the errata), the modeling is mind-bogglingly complex.

Science has a section for e-letters. Ilse Kohler-Rollefson of the League for Pastoral Peoples and Endogenous Livestock Development wrote:

This article, both in its database and its conclusions, totally ignores the vital role of livestock for the life and livelihoods of the people living in the non-arable parts of the world. Without livestock large parts of the world would become uninhabitable. Please see http://www.ilse-koehler-rollefson.com/?p=1160
In addition, one wonders about the usefulness of “land use” as indicator for judging the environmental impact of a food production strategy. Applying this indicator to livestock production would give preference to intensive and factory animal farming over extensive herding.

When this graph was posted on Twitter, a fellow from Saskatchewan, a woman from Ireland, and I pointed out that some people raise animals because (1) people can’t raise anything else, or (2) in the case of Saskatchewan, the prairie was plowed for wheat, canola and lentil monocrops, which are arguably worse for biodiversity. I pointed out that there is no land conversion associated with beef in the prairies, and the answer was that “if you look overall at beef, you have to account for the env cost of land converted to corn/beef.” My point being that consumers don’t have to look overall at beef; they only have to look at the beef that they’re buying. Who decided that we needed to make a purchasing judgment call based on a global average, and call it “science”? You can follow the Twitter feed here.

A variant of the same conversation occurred in the letters section of New Scientist this month:

You might think that as a livestock farmer I would resent vegans claiming that my way of life is unethical, and you would be right. You quote Michael Clark saying that eating animals fed on plants must be less efficient than people eating plants. This ignores the fact that many herbivores can, and do, get much more feed value out of plants than people. What’s more, much of the north and west of the UK and Ireland can only be farmed practically using grazing animals.

The editor writes:

Much of the world’s beef is now produced in intensive feedlots – pens without pasture. This may have a lower carbon footprint than pasture-fed beef. We suspect that most of that is done on fertile land, not rough grazing. It is even harder to find figures for other meats. We note that in the UK, upland sheep production is only marginally viable even with EU subsidies.

There are a few more ideas here: that feedlots (feeding animals lots of grain) produce less carbon than feeding animals grass; that the editor only “suspects” that most of it is done on “fertile land”; and that upland sheep production is only “marginally viable” even with subsidies.

At the same time in the US, grass-fed beef is growing in popularity for environmental reasons, and as of 12/27/2019, USDA FSIS published new guidelines clarifying grass-fed labeling.

As this HuffPost piece says:

Our best bet is to educate ourselves about where our meat (and all our food) is coming from and the practices used to raise it.

“Don’t be afraid to ask questions,” Williams told HuffPost, adding that consumers have an incredible effect on the market. “Go to the grocery store and ask for products that are regeneratively produced. Go to restaurants and demand grass-fed meat and dairy. You’ll be amazed at how quickly those businesses will respond.”

A complex question indeed. What does “the science” say? Who produced it, and how relevant is it to our lives and choices?