
Human carelessness is more responsible than climate change for worsening wildfires

Year after year, July 4 is the worst day for wildfire as Americans celebrate independence by igniting their environment with fireworks and barbecue grills.

Human carelessness is doing far more than climate change to worsen wildfire damage — and raise the budget-straining costs of suppression — across the United States, new research concludes.

While the shift to warmer and drier climate regimes seems to have lengthened the season for forest and grassland fire “by a couple of weeks over the past three decades,” scientists found, fires directly traceable to human ignition form a fire season in the lower 48 that is now three times longer than the period of natural, lightning-caused fires (154 days versus 46).

And while human-caused fires accounted for less than half of the acreage that burned during a study period from 1992 through 2012, they now make up the great bulk of all wildfires — 84 percent — and have extended the geography where fire is common into a lot of new territory, from the Southeast to the northern Great Plains, including northern Minnesota.

I should acknowledge quickly that “carelessness” is my word; the paper published last week in the Proceedings of the National Academy of Sciences chose “human-caused” to categorize blazes that begin from burn piles, fireworks, campfires, cigarette butts, power equipment, and children fooling with fire.

The category also includes arson, which of course is not carelessness, but criminal ignition accounts for only one human-caused fire in five. And the grouping excludes prescribed burns set intentionally for forest and grassland restoration purposes, as well as “managed agricultural fires.”

Since four out of five of these fires happened while somebody wasn’t paying sufficient attention, I’m comfortable with carelessness as a fair characterization. It underscores a factor of avoidability worth keeping in mind as we consider how these findings ought to inform human efforts to do better.

For example, by being just a bit more mindful on July 4. Year after year, the paper finds, that’s the worst day for wildfire as people across the country celebrate independence by igniting their environment with Roman candles and barbecue grills.

Over the 21-year study period, the National Interagency Fire Center (NIFC) recorded 7,762 human-started fires on Independence Day, which averages to about 370 per year.

A new look at impacts

The fact that so many wildfires are human-caused isn’t in itself a revelation; that’s long been a part of fire-policy discussions in this space and elsewhere and is a staple of public-service messaging by the firefighting and natural resources agencies. (Although it did appear to startle an editor at the Washington Times, which headlined its report, “Smokey Bear is right: 84 percent of wildfires are caused by humans, new study finds.”)

What is new, and interesting, about this paper is how it quantifies the impact of human causation, relative to other factors, in driving a pattern of wildfire that has worsened markedly since the 1990s, with budget-busting consequences and a disturbing number of fatalities.

Also new is the sheer size of the NIFC data set the paper draws upon: more than 1.5 million fires large enough to require responses from state or federal agencies during the 1992-2012 time frame.

Because those agencies report a wealth of wildfire info, including attributions of cause, to the NIFC, assembling data wasn’t a major challenge; indeed, as I read the paper I found myself wondering why nobody had addressed its obvious, important questions before now.

Lead author on the paper is Jennifer K. Balch of the University of Colorado’s Earth Lab in Boulder; she was joined by researchers from universities in Massachusetts and Idaho. Some key excerpts from the findings, with footnotes omitted:

Humans have vastly expanded the spatial and seasonal “fire niche” in the coterminous United States, accounting for 84% of all wildfires and 44% of total area burned. During the 21-year time period, the human-caused fire season was three times longer than the lightning-caused fire season and added an average of 40,000 wildfires per year across the United States.

Human-started wildfires disproportionally occurred where fuel moisture was higher than lightning-started fires, thereby helping expand the geographic and seasonal niche of wildfire. Human-started wildfires were dominant (more than 80% of ignitions) in over 5.1 million square kilometers, the vast majority of the United States, whereas lightning-started fires were dominant in only 0.7 million square kilometers, primarily in sparsely populated areas of the mountainous western United States.

Overall, humans expand the spatial and temporal “fire niche” by introducing ignitions into landscapes when fuels are sufficiently dry enough to ignite and carry fire, but when lightning is rare.

Thus it’s not so much that we’re being less cautious with ignition sources; it’s that more of us are doing the same stupid stuff in more places, often in areas at the edge of expanding development.

Regional differences examined

Breaking down the 1,272,076 human-caused and 245,446 lightning-started wildfires in the study period, the paper reaches these findings:

  • 78% of lightning fires occurred in the summer (June-August), 9% in the spring (March-May), and 12% in the fall (September-November). Human fires were distributed more evenly: 24% in summer, 38% in spring, 19% in fall, 19% in winter. (This likely reflects that burn piles are the No. 1 source of human-caused ignition, and that they are especially prone to trouble in winter and spring, when vegetation remains bone-dry even though the ground seems safely snow-covered or mucky.)
  • There are regional differences, though, with human-caused fires in the eastern U.S. occurring primarily in the spring, while they’re more common in the fall and winter in Texas and the Gulf states.
  • While lightning-caused fires have increased most dramatically in the mountains of the northwestern U.S., the surge in human-caused wildfires was largest in the Great Plains — especially in the spring.

I have not seen much reaction to these findings, but they certainly impressed Tom Jeffery, a hazard analyst for CoreLogic, which is known for its work on wildfire risk assessments. Speaking to Insurance Journal, he said,

It certainly takes you back to see the numbers on a page. I never realized that those numbers were so high.

“If there’s one thing that we have some control over, it’s trying to reduce the ignition of human-caused fires,” he continued, pointing out that homeowners and homebuilders can take steps to reduce the risks of accidentally starting a fire in some of the 100-plus ways he claims to have counted.

Prevention was also on the minds of Balch and her colleagues, who pointed out the need for policies that focus on “reducing human expansion of the fire niche.” That sounds on its face like a very difficult challenge, though an easier avenue than rapidly restoring cooler, wetter climate regimes.

On the other hand, as my son remarked when I told him about this study:

Probably easier to stop lightning strikes than to keep people from being stupid.

* * *

The full paper, “Human-started wildfires expand the fire niche across the United States,” can be read and downloaded here without charge.