In the decade leading up to 2011, twice as many acres burned in the U.S. compared to the decade before it. The average fire size rose and so did the time to control it.
And the sheer number of large fires -- in this case, fires larger than 1,000 acres -- rose steadily from 257 in 1992 to 857 in 2011.
That’s what you'll see in this interactive map from EarthFix. It's based on new data from the Forest Service that offers one of the most complete looks yet at wildfires in the U.S.
What caused that apparent rise? Name a potential reason -- drought, rural development, suppression techniques, lightning patterns, climate change, increased incident reporting -- and you’ve got part of a very complex answer.
Historically, drawing conclusions about trends in wildfires has in many ways been a fool’s errand.
The scientists who analyze and make predictions about wildfire sift through millions of data points on the factors listed above, all of which complicate each other.
And since the 1970s, the best data they had on the topic was riddled with redundancies, inaccuracies and inconsistencies that made their way into studies, predictions and media reports. Five federal agencies, along with state and local agencies, report information on wildfires into a mess of different databases. Two or three different databases might include the same fire, but with no way to tell the records apart except by poring over them one by one. Some states spent decades never reporting information on wildfires, leaving massive gaps.
Chuck Maxwell, a meteorologist who does fire prediction for the U.S. Forest Service’s Southwest Coordination Center, recalled taking nine months to patch holes and scrub redundancies out of the data before he could even begin an analysis.
“You line up a bunch of stuff and it should have a direct relationship between the environment and wildland fire, and you have missing or erroneous data over a long period of time and you have bad relationships or no relationships that steer you in the wrong direction,” Maxwell said.
Unless the research came from someone with intimate knowledge of how the data came to be -- and a lot of it did -- he said, “chances are it will not be of great value.”
Wildfire experts have long known about the problem, but only recently have efforts begun to fix it. The most advanced of those is a project by Karen Short at the Fire Lab in Missoula, Mont. She spent three years compiling what’s now the single most complete database of wildfires in the U.S. spanning the years 1992-2011, published this year by the Forest Service.
"What this dataset allows you to do is to try to look for patterns in space an times in terms of wildfire numbers and area burned," Short said. “You have a dataset that allows you to essentially, maybe, tease apart the most influential factors that contribute to those apparent changes."
Short is the first to acknowledge the new database isn’t without limitations. But according to those who use it, it has far fewer than its predecessors. Maxwell has begun using it for his analyses and said that for every one problem in the new database, there are 12 in the old ones.