Some 41 million Americans are at risk of seeing their homes flooded in so-called 100-year events, an exposure level perhaps three times higher than the official estimates of the Federal Emergency Management Agency and other government bodies.
This is the marquee finding, but hardly the only surprise, in a groundbreaking study by researchers in Britain and the United States, including two scientists for The Nature Conservancy who work out of the group’s Minneapolis office.
The results are derived from modeling built on extraordinary advances in high-resolution mapping and supercomputing, using techniques developed at England’s University of Bristol and a nearby research institute called Fathom.
The new modeling has been applied globally for a number of Fathom’s public and private clients, and in this instance sought to make improvements over “past attempts to estimate rainfall-driven flood risk across the U.S. [that] either have incomplete coverage, coarse resolution or use overly simplified models of the flooding process.”
Other conclusions of the paper published last week in Environmental Research Letters:
- About $1.2 trillion in assets, such as buildings and other infrastructure, is also at risk. This is actually about two-thirds smaller than what the respected World Resources Institute has estimated; again, the difference is attributed to the new model’s greater accuracy, in this case about the built landscape.
- Population growth and continued development will push the at-risk figures higher in years to come. The present-day population at risk from 100-year events is about 13.3 percent of all Americans, but by 2050 it is projected at more than 15.6 percent, and by 2100, it exceeds 16.4 percent. (All figures are for the lower 48 only; Alaska and Hawaii are ignored in this study.)
- There are indications that development will actually be more intensive in areas of flood risk than elsewhere, especially in the Great Lakes region, the Great Plains, Florida, Texas and the Northeast.
- Although the likely contributions of climate change are not specifically assessed, the assumption is that they will raise flood risk still further in many parts of the country. (Just think for a moment of how often in recent years you’ve seen a news report about a place that has had multiple 100-year floods within a period of a few years, or 10, or 20.)

At first look, this work is a gee-whiz demonstration of what’s possible with new technology. But in a conversation last Friday with TNC’s Kris Johnson, excerpted below, I also experienced a holy-crap moment, when I heard him describe how little effort had gone into getting an accurate grip on this problem before:
Nearly half of the U.S. doesn’t have any flood mapping at all. Even where there are FEMA maps, it’s done by local consultants or municipalities, and then approved by FEMA, which doesn’t have the actual staff to create these maps. That happens, as you might imagine, in a really slow, gradual process — in some cases there are cities that are managing flood risk using maps that literally may be decades old.
Now, if you look at where FEMA maps are and aren’t, it makes some intuitive sense — all of the major cities have some flood maps. But where there are smaller towns, or farms, or towns that have grown more recently in size, they lack any mapping at all. If you look at the whole state of Minnesota, there are gaps everywhere — the Twin Cities metro area is covered, St. Cloud, Fargo-Moorhead. But most of the state is not covered.
According to the paper, whose lead author is Bristol’s Oliver E.J. Wing, large-scale assessments of flood risk had previously been limited to rivers draining catchments on the order of 10,000 square kilometers or more. (“Catchment,” Johnson helpfully explained, is British for “watershed.”) The new tool resolves catchments as small as 50 square kilometers (km²).

For those who don’t speak metric, or visualize large areas with ease, I got out a calculator and gazetteer and came up with this parochial illustration: 10,000 km2 is equivalent to 3,861 square miles, an area slightly larger than the seven-county metro area plus Chisago and Sherburne counties; 50 km2 is about 19 square miles in extent, equal to half a Bloomington or two Mendota Heights.
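(For anyone who wants to check that arithmetic, the conversion is just multiplication by the standard factor of roughly 0.3861 square miles per square kilometer. The short Python sketch below is purely illustrative; the function name and formatting are mine, not anything from the study.)

```python
# Sanity check of the square-kilometer-to-square-mile conversions quoted above.
SQ_MILES_PER_SQ_KM = 0.386102  # standard conversion factor: 1 km^2 = ~0.3861 sq mi

def km2_to_sq_miles(area_km2: float) -> float:
    """Convert an area in square kilometers to square miles."""
    return area_km2 * SQ_MILES_PER_SQ_KM

for area_km2 in (10_000, 50):
    print(f"{area_km2:>6,} km^2 is about {km2_to_sq_miles(area_km2):,.1f} square miles")
# 10,000 km^2 is about 3,861.0 square miles; 50 km^2 is about 19.3 square miles
```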

Johnson, who has been studying rivers and floodplains for many years, explained the problems of times past:
One of the things that always flummoxed us is that people study and manage flood risks from an engineering point of view — especially in the U.S. We’ve moved kind of piece by piece, reach by reach, place by place, to build up complicated engineering models to tell us what might a flood be like at this particular location, and how do we build a levee or a flood defense, and give communities the information they need to protect themselves.
And that’s gotten us in the mess we’re in, because it means we haven’t thought holistically about levees and changes to landscape that increase the risk of flooding downstream. We’re moving water across our landscape much more quickly than we used to, and climate change is probably making that worse.
These researchers in Britain were some of the first to develop the ability to model flooding and floodplains at a much, much bigger scale, using supercomputers and some really smart algorithms and techniques. They use the same physics — the same basic understanding of the movement of water — that is in our models, but they put it in this tool that can be used to look at a much bigger scale.
Their initial model was global, and I learned about that at a conference of the American Geophysical Union and said, hey, we have higher-resolution topographic data in the U.S., and we have information about where levees exist — could you rerun your model for the U.S., using this better data than is available in the rest of the world?
Because the focus of the research is large-area mapping and a whole-country assessment of flood risk, there are limits to how precisely its findings can be applied at small scales (although Fathom’s product offerings include that kind of modeling, too):
We’re not claiming necessarily that this model for any given spot on the map is better than the previous local models that might have been developed for FEMA, for example.
If you were to go to a stretch of a river today and develop a local model, you’d build that out with really fine-scale data. You’d probably have somebody in a boat taking depth measurements, and surveying the width of channel, and providing detail about bridge supports or the size of culverts, things like that.
The physics would be 2D: you’d be capturing the movement of water downstream, which is one dimension, but then you’d also capture what they call sheet flow, the lateral movement of water. A lot of the typical models that FEMA has used in the past to generate flood-insurance rate maps have been only 1D … and the Army Corps of Engineers’ gold standard model is, for the most part, 1D, capturing the downriver flow. But that was the best data available 20 years ago.
The Bristol model is 2D, but because it’s done at such a huge scale we’re not capturing as much detail about depth or channel width or bridge structures or things like that. A brand-new, top-of-the-line, 2D local-scale model would do a better job of predicting what flooding might look like at a particular site.
It’s interesting to note, though, the paper’s claim that when data from the new model are compared with the results of detailed local mapping, there’s a nearly 90 percent match.
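(The paper expresses that agreement as a hit rate: roughly, the share of area the detailed local maps flag as flooded in a 100-year event that the large-scale model also flags. The Python sketch below is a minimal illustration of that idea on made-up grids; the toy arrays, and the assumption that both maps share the same raster grid, are mine, not the study’s.)

```python
import numpy as np

def hit_rate(benchmark: np.ndarray, model: np.ndarray) -> float:
    """Fraction of cells flooded in the benchmark map that the model also floods."""
    wet_in_benchmark = benchmark.astype(bool)
    agreed = wet_in_benchmark & model.astype(bool)
    return agreed.sum() / wet_in_benchmark.sum()

# Toy flood-extent rasters: 1 = flooded in the 100-year event, 0 = dry (illustrative only).
local_map = np.array([[1, 1, 0, 0],
                      [1, 1, 1, 0],
                      [0, 1, 1, 0]])
large_scale_map = np.array([[1, 1, 0, 0],
                            [1, 1, 1, 1],
                            [0, 0, 1, 0]])

print(f"hit rate: {hit_rate(local_map, large_scale_map):.0%}")  # hit rate: 86%
```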
Johnson stressed that his focus has been on risk factors across the nation as a whole, but he couldn’t help noting some particulars about his home state:
It’s kind of surprising to look at a map of the 100-year flood risk for the Twin Cities. It’s just more water than we’re used to seeing. The other thing that is noteworthy is the population-at-risk estimate — over half a million people in Minnesota currently exposed to a 100-year flood, which is more than I would have expected. Not only that, but considering only projected development patterns — and NOT considering climate change — we expect that to double by 2050. Those were pretty eye-opening numbers to me, because we’re not a super-high-growth population area — but it’s around a million people.
Given the general inattention to mapping flood risk in the past, and the patterns of continued growth and development in areas already known to be in harm’s way, I wondered if Johnson has faith that research findings like these have a prayer of shaping policy.
Well, that IS the $64,000 question and it’s what drove our interest in this work. My hope, and the whole push behind this effort, is that this information can bring a growing awareness of this counterproductive, completely non-proactive approach to planning — which is costing us lots of money, costing us harm to people and property.
Not only do we need to worry about the assets already in place — the farms, the towns, the buildings already in harm’s way — we should not dig the hole any deeper, you know? And so if we look around the country at the places we know are exposed to flood risk right now, even without any impact of climate change or changes in the pattern of intensity of rain events, let’s do what we can to change our local zoning, our county planning, our approach to insuring properties, as best we can to deeply discourage additional development in these areas.
Development puts new assets at risk, and once they’re at risk we’re obligated to protect them, and that effort will exacerbate flooding elsewhere in the system, and further degrade these really important places that provide benefits for people, for wildlife, for water quality.
We’re not calling for millions of tons of more concrete and miles and miles of levees. That may seem to be the way to address this, but that’s not the savvy long-term approach. There’s an opportunity to protect people, save taxpayers’ money and help the environment, too.
It’s a big country. We have other places we can develop.
* * *
The full paper, “Estimates of present and future flood risk in the conterminous United States,” can be read online without charge.