Navigating a collaborative approach to flood risk management

David Johnson, of Purdue University, outlines the work of research groups in Louisiana seeking to improve flood risk estimation strategies.

Hurricane Katrina had a devastating impact on the city of New Orleans and the surrounding areas in 2005, claiming the lives of over 1,800 people and causing $150bn of damage. Further disruption came to the State of Louisiana in 2016, when prolonged rainfall led to severe flooding that resulted in 60 fatalities and around $15bn of damage.

Following these devastating events, Louisiana introduced several initiatives to propel its disaster response efforts. Shortly after Hurricane Katrina, a new state agency was established to create and maintain a Coastal Master Plan (CMP), incorporating a 50-year, approximately $50bn collection of recommended flood protection and coastal restoration projects. Prior to such initiatives, approaches to disaster response and flood prevention had been relatively fragmented, with a lengthy validation and approval process required for mitigation and restoration projects. This resulted in competition over scarce funding, as well as political gridlock due to a lack of state-wide coordination. Since the implementation of the CMP, however, Louisiana has witnessed a highly successful transformation: updates have passed through the state legislature unanimously every five to six years, and billions of dollars in projects have been completed or are underway.

The Louisiana Watershed Initiative (LWI) is an attempt to build on the success of the CMP and extend the spirit of interagency coordination to the entire state. LWI, which was prompted by the mass flooding of Baton Rouge and other inland communities during the historic rainfall of August 2016, is coordinated by the Louisiana Council on Watershed Management and supports floodplain management efforts across eight watershed regions spanning the state. Following the launch of LWI, Louisiana was awarded a $1.2bn flood mitigation grant to support state-wide planning, watershed modelling, data collection, and projects to reduce flood risk.

In an example of work funded by the Louisiana Watershed Initiative, a team of researchers from Purdue University, The Water Institute of the Gulf, the University of Iowa, and Louisiana State University carried out a study aimed at improving flood policy design by devising a multi-model joint probability method for estimating compound flood risk. Explaining the project in further detail, David Johnson, Assistant Professor of Industrial Engineering at Purdue University, spoke to Innovation News Network.

What is the background behind your work and why was a better effort required to coordinate flood risk management strategies?

A major motivation for the modelling approach and statistical framework is that state-level planning efforts are computationally constrained, so multi-fidelity modelling and optimal sampling lower the cost of obtaining accurate estimates of compound flood hazard. When incorporating predicted rainfall into the design of protection systems, the development of flood risk maps, or early-warning systems, it is critical to acknowledge and quantify uncertainty.

Our work extended the joint probability method with optimal sampling (JPM-OS) to characterise the joint probability not only of tropical cyclone parameters (central pressure deficit, radius of maximum windspeed, etc.), but also of the spatiotemporal distribution of rainfall (using a novel probabilistic generator) and of antecedent conditions in the affected watershed. Our ‘compound JPM-OS’ procedure runs a large number of equiprobable rain fields for each storm through the computationally ‘cheap’ HEC-HMS model of river discharge under different antecedent baseflow conditions. We then apply principal component analysis and k-means clustering to select a reduced set of events to run through a high-resolution HEC-RAS model of inundation, with storm surge forced by the ADCIRC model included as a boundary condition, resulting in estimates of the compound flooding annual exceedance probability (AEP) distribution.
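
As a rough illustration of this event-selection step, the sketch below, assuming Python with scikit-learn and a synthetic feature matrix in place of real model outputs, compresses coarse-model responses with principal component analysis, clusters them with k-means, and keeps one representative event per cluster along with its probability weight. All names and dimensions are illustrative, not the study's code.

```python
# A minimal sketch of the event-reduction step, assuming Python with
# scikit-learn. The feature matrix is synthetic and stands in for
# coarse-model outputs (rows = simulated events, columns = responses such
# as peak discharge or surge elevation at gauge locations).
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
features = rng.lognormal(size=(20_000, 50))  # placeholder for HEC-HMS/surge outputs

# Standardise, then compress correlated gauge responses into leading PCs.
scores = PCA(n_components=10).fit_transform(StandardScaler().fit_transform(features))

# Cluster events in PC space; keep the event nearest each centroid as the
# representative to run through the high-resolution model.
k = 300
km = KMeans(n_clusters=k, n_init=5, random_state=0).fit(scores)
rep_idx = np.array([
    np.argmin(np.linalg.norm(scores - c, axis=1)) for c in km.cluster_centers_
])

# Each representative inherits the probability mass of its cluster, so the
# reduced set can stand in for the full ensemble.
weights = np.bincount(km.labels_, minlength=k) / len(features)
print(rep_idx[:5], weights[:5].round(5))
```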

How was the research carried out?

Our approach involved running tens of thousands of storm simulations through a comparatively fast, coarse model. The simulations varied not only in the storm characteristics but also in the local conditions, such as soil moisture and stream flows, at the time the storm arrives.
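
As a purely hypothetical sketch of how such an ensemble might be assembled, the snippet below draws a Latin hypercube sample over storm parameters and antecedent conditions. The parameter names, bounds, and uniform marginals are invented for illustration; the study's joint probability method works with fitted probability distributions rather than uniform sampling.

```python
# Hypothetical sketch of assembling such an ensemble: a Latin hypercube
# sample over storm parameters and antecedent conditions. Parameter names,
# bounds, and the uniform marginals are illustrative assumptions; a real
# JPM assigns probabilities from fitted joint distributions instead.
from scipy.stats import qmc

dims = ["central_pressure_deficit_mb", "radius_max_winds_km",
        "antecedent_baseflow_frac", "soil_moisture_frac"]
lower = [20.0, 15.0, 0.0, 0.1]
upper = [120.0, 90.0, 1.0, 0.9]

sampler = qmc.LatinHypercube(d=len(dims), seed=0)
ensemble = qmc.scale(sampler.random(n=20_000), lower, upper)

# Each row parameterises one run of the fast, coarse model.
for row in ensemble[:3]:
    print(dict(zip(dims, row.round(2))))
```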

Using statistical analysis, we could identify a much smaller set of several hundred events that have very similar statistical properties to the entire set of simulations, such as a similar distribution of storm surge elevation at multiple locations and similar rates of increase in river flows in tributaries within the basin.

Following this, we ran the reduced set through a much more detailed model to better understand the potential flood risk that these events and conditions would cause in combination. The final risk estimates are based on the outcomes from the more detailed model, but we need the lower-resolution model to tell us which events to run in order to approximate the true underlying risk.
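
A minimal sketch, with invented numbers, of how final risk estimates can be assembled from the detailed runs: each representative event carries the annual rate of the event class it stands in for, and the annual exceedance probability (AEP) of a given flood depth follows from summing the rates of events that exceed it.

```python
# A minimal sketch of that final step. Each representative event carries the
# annual rate of the event class it stands in for; the AEP of a given depth
# follows from summing the rates of the events that exceed it. Depths and
# rates below are invented for illustration.
import numpy as np

depths = np.array([0.4, 1.1, 2.3, 3.0, 4.2])       # modelled depth (m) at one location
rates = np.array([0.2, 0.05, 0.01, 0.004, 0.001])  # annual occurrence rate per event

def aep(threshold_m: float) -> float:
    """AEP of flooding deeper than threshold_m, treating event arrivals
    as Poisson with the summed rate of exceeding events."""
    lam = rates[depths > threshold_m].sum()
    return 1.0 - np.exp(-lam)

print(f"AEP of >1 m flooding: {aep(1.0):.3f}")  # ~0.063 for these toy numbers
```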

Why is it a challenge to simultaneously model various sources of flooding within flood risk studies?

Estimating the risk of compound flooding from multiple simultaneous sources (such as storm surge, rainfall, overflowing rivers and streams) is difficult because analysts typically estimate the risk from each source using different statistical techniques, datasets, and models. For example, in many locations, we have daily or hourly observations of rainfall and stream flows spanning decades, so we can fit those values to a probability distribution to estimate extreme values relatively successfully. In contrast, tropical cyclones may produce storm surge in an area only once every few years. Due to that much smaller dataset, estimates of storm surge hazard typically rely more on simulation than observation.
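
For the observation-rich case, a common approach, sketched below with SciPy and synthetic data standing in for the decades of real gauge records described above, is to fit a generalised extreme value distribution to annual maxima and read off design levels such as the 100-year flow.

```python
# A sketch of the observation-rich case: fit a generalised extreme value
# (GEV) distribution to annual maximum flows and read off the level with a
# 1% annual chance of exceedance. Synthetic data stand in for decades of
# real gauge records.
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(1)
annual_max_flow = genextreme.rvs(c=-0.1, loc=500, scale=150,
                                 size=60, random_state=rng)  # 60 years of annual maxima

shape, loc, scale = genextreme.fit(annual_max_flow)
q100 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)  # exceeded with 1% annual chance
print(f"Estimated 100-year flow: {q100:.0f}")
```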

That said, it is easy to imagine that flooding from a tropical storm could be influenced by whether or not a thunderstorm passed through the week before, saturating soils and raising water levels in drainage features, retention ponds, and other green stormwater infrastructure. Correctly identifying the risk of flooding from multiple sources like this is complex from a modelling and statistical perspective, because it requires the use of multiple models and, potentially, a large number of computationally intensive simulations.
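
To see in miniature why this dependence matters, consider the chance that two drivers are simultaneously at their 1% level. Under independence it is one in 10,000, but correlation between the drivers can inflate it many times over. The sketch below, not taken from the study, illustrates this with a Gaussian copula; the correlation values are arbitrary.

```python
# Illustrative sketch (not the study's method): the chance that two flood
# drivers are both at their 1% level in the same event, under independence
# versus under a Gaussian copula with correlation rho. Correlations are
# arbitrary example values.
from scipy.stats import multivariate_normal, norm

p = 0.01             # each driver individually has a 1% exceedance chance
z = norm.ppf(1 - p)  # that threshold on the latent standard-normal scale

for rho in (0.0, 0.5, 0.8):
    # P(X > z and Y > z) equals the bivariate normal CDF at (-z, -z) by symmetry.
    joint = multivariate_normal(mean=[0, 0], cov=[[1, rho], [rho, 1]]).cdf([-z, -z])
    print(f"rho={rho}: joint exceedance {joint:.5f} vs {p * p:.5f} if independent")
```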

What were the key findings of the study and what implications do they have on current policy and flood management strategies?

Flood insurance regulations in the US are primarily based on estimates of the ‘100-year floodplain’: the extent and depth of flooding that has a 1% chance of occurring or being exceeded in any given year.
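
A 1% annual chance compounds over time: the probability of experiencing at least one 100-year flood over the life of a 30-year mortgage is 1 − 0.99^30, or roughly 26%.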

In our study, we found that, in some areas, the 100-year flood depths estimated when considering interactions between multiple flood mechanisms are a foot or more higher than what you would estimate from a traditional approach that assumes those mechanisms are independent of each other. Consequently, this could mean that tens of thousands of people in regions subject to the threat of compound flooding have a mistaken impression of their risk and are choosing not to carry flood insurance. This, in turn, has an impact on the solvency of the National Flood Insurance Program and the accuracy of information local and state governments (in Louisiana and elsewhere) use to inform decisions on flood protection investments.

In conclusion, our findings suggest that better procedures for estimating compound risk are needed to inform flood insurance policies, building codes, and other flood risk management mechanisms.

David Johnson
Assistant Professor of Industrial Engineering
Purdue University
www.purdue.edu
https://www.linkedin.com/company/purdue-ie
https://www.facebook.com/PurdueIndEng
@PurdueIndEng

Please note that this article will also appear in the ninth edition of our quarterly publication.

