Here’s a link to a short article (and video) about the new study, “Hot fire, cool soil,” with a brief excerpt below. The American Geophysical Union demanded that we remove from our website a copy of the actual study, which they had provided to me earlier in the day…so I’ve done that. Sorry, folks.
When scientists torched an entire 22-acre watershed in Portugal in a recent experiment, their research yielded a counterintuitive result: Large, hot fires do not necessarily beget hot, scorched soil.
It’s well known that wildfires can leave surface soil burned and barren, which increases the risk of erosion and hinders a landscape’s ability to recover. But the scientists’ fiery test found that the hotter the fire—and the denser the vegetation feeding the flames—the less the underlying soil heated up, an inverse effect that runs contrary to previous studies and conventional wisdom.
Rather, the soil temperature was most affected by the fire’s speed, the direction of heat travel, and the landscape’s initial moisture content.
And here’s the abstract:
Wildfires greatly increase a landscape’s vulnerability to flooding and erosion events by removing vegetation and changing soils. Fire damage to soil increases with increasing soil temperature and, for fires where smoldering combustion is absent, the current understanding is that soil temperatures increase as fuel load and fire intensity increase. Here, however, we show that this understanding, which is based on experiments under homogeneous conditions, does not necessarily apply at the more relevant larger scale, where soils, vegetation, and fire characteristics are heterogeneous. In a catchment-scale fire experiment, soils were surprisingly cool where fuel load was high and the fire was hot and, conversely, hot where they were expected to be cooler. This indicates that the greatest fire damage to soil can occur where fuel load and fire intensity are low rather than high, which has important implications for the management of fire-prone areas before, during, and after fire events.
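As a reader’s gloss (not from the paper): one way to square a hotter fire with cooler soil is a back-of-envelope heat-dose estimate. In the sketch below, every symbol (E_soil, q, τ, d, R) is an assumption introduced here for illustration, not notation from the study.

```latex
% Hedged sketch, not the authors' model: treat the heat "dose" reaching
% the soil surface as the flame front's downward heat flux q times the
% residence time \tau of the front over a given point.
\[
  E_{\text{soil}} \;\sim\; q\,\tau,
  \qquad
  \tau \;=\; \frac{d}{R},
\]
% where d is the depth (width) of the flaming front and R is its rate of
% spread. A dense-fuel fire can have a larger q (hotter flames) yet a
% much larger R (it races through the fuel), so \tau -- and with it the
% product q\tau -- can shrink. Initial soil moisture adds to the effect:
% evaporating water holds the soil near 100 °C until it has boiled off.
```

Under those assumptions, the rough scaling lines up with the excerpt above: spread rate, heat-travel direction, and moisture, rather than peak fire intensity alone, set how hot the soil gets.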