The Problem
Flooding and other natural disasters continue to impact human life, the built environment, and natural ecosystems. The causes are complex and continually shifting under natural, anthropogenic, and climate change influences, while the consequences vary in spread, extent, nature, and magnitude across the planet. At the core of the problem is understanding, preparing for, mitigating, predicting, and responding to current and future flood risk.
The Process
Understanding flooding, quantifying its impacts, and designing resilience projects is analogous to a circuit board. Engineers, scientists, meteorologists, statisticians, data analysts, programmers, geospatial professionals, benefit-cost analysts, planners, policy experts, environmental permitting experts, and data visualization professionals are each a crucial piece of intricate and complex circuitry. Working together, these pieces support a successful project outcome, much as each electrical component on a circuit board is crucial to the computing, data transfer, and communication of an electronic device.
For resilience projects, diverse and complex data sets are collected, analyzed, processed, and passed from one team to another in a seamless assembly line of specialized technical expertise. Success depends on each team adding accurate value at every step toward the intended goals of end users and stakeholders, which range from smart predictive capabilities to effective mitigation strategies at various scales.
The Roles and Tools
- Water resources engineers use hydrologic, hydraulic, hydrodynamic, coastal, and groundwater models and tools, such as HEC-HMS, HEC-RAS, PCSWMM, XPSWMM, DHI MIKE, and MODFLOW, to study the physical processes that cause flooding.
- Meteorologists analyze nonstationarity and variability in climatological records (a minimal trend-test sketch follows this list).
- Statisticians identify the appropriate statistical methods, best practices, and tools, such as logistic regression, random forests, and Monte Carlo simulation, to elucidate the cause-effect relationships between individual parameters and their combined, accumulated influence (see the Monte Carlo sketch after this list).
- Data analysts, programmers, and geospatial analysts develop scripts in tools such as Python, R, and Jupyter notebooks to implement process steps efficiently at different scales, drawing on recent advances in computing (a raster-summary sketch follows this list).
- Benefit-cost analysts evaluate the financial impacts and cost-benefit of proposed mitigation strategies (a benefit-cost sketch follows this list).
- Data visualization professionals use tools such as Tableau and Power BI to aggregate, analyze, and present results so that end-user groups can interpret them easily and make better-informed decisions.
- Mitigation planning and implementation involves a multidisciplinary team of civil, structural, and geotechnical engineers, environmental and permitting specialists, and construction professionals.
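
To illustrate the meteorologists' nonstationarity screening, the sketch below runs a Mann-Kendall trend test on an annual-maximum series. It is a minimal sketch: the `mann_kendall` helper and the `annual_peaks` values are illustrative, and the no-ties variance formula is used for brevity.

```python
import numpy as np
from scipy.stats import norm

def mann_kendall(series):
    """Return the Mann-Kendall S statistic, Z score, and two-sided p-value."""
    x = np.asarray(series, dtype=float)
    n = len(x)
    # S counts concordant minus discordant pairs in time order.
    s = sum(np.sign(x[j] - x[i]) for i in range(n - 1) for j in range(i + 1, n))
    # Variance of S under the no-trend null hypothesis (no ties assumed).
    var_s = n * (n - 1) * (2 * n + 5) / 18.0
    # Continuity-corrected normal approximation.
    z = (s - np.sign(s)) / np.sqrt(var_s) if s != 0 else 0.0
    p = 2 * (1 - norm.cdf(abs(z)))
    return s, z, p

annual_peaks = [210, 198, 225, 240, 231, 255, 248, 262, 270, 281]  # illustrative
s, z, p = mann_kendall(annual_peaks)
print(f"S={s:.0f}, Z={z:.2f}, p={p:.3f}")  # a small p-value suggests a trend
```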
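
In the same spirit as the statisticians' toolkit, this minimal Monte Carlo sketch propagates uncertainty in peak flow and channel capacity to an annual flooding probability. Both distributions and their parameters are assumptions chosen for illustration, not calibrated values.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Annual peak flow (cfs), lognormal by assumption; channel capacity (cfs),
# normal by assumption to reflect uncertainty in conveyance estimates.
peak_flow = rng.lognormal(mean=np.log(8000), sigma=0.35, size=n)
capacity = rng.normal(loc=10_000, scale=800, size=n)

# The channel floods in any realization where peak flow exceeds capacity.
overtops = peak_flow > capacity
print(f"Estimated annual flooding probability: {overtops.mean():.3f}")
```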
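
For the scripting and geospatial step, the sketch below summarizes a flood-depth grid into an inundated area and a mean wet depth. A real workflow would read a georeferenced depth raster (for example, with rasterio); a synthetic NumPy grid keeps the example self-contained, and the 10 m cell size and 0.15 m dry threshold are assumptions.

```python
import numpy as np

cell_size_m = 10.0                       # assumed raster resolution (m)
rng = np.random.default_rng(0)
depth_m = rng.gamma(shape=1.5, scale=0.3, size=(500, 500))  # synthetic depths
depth_m[depth_m < 0.15] = 0.0            # treat shallow cells as dry

# Summarize the grid: wet cell count to area, and average depth over wet cells.
wet = depth_m > 0
area_km2 = wet.sum() * cell_size_m**2 / 1e6
print(f"Inundated area: {area_km2:.2f} km^2, "
      f"mean wet depth: {depth_m[wet].mean():.2f} m")
```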
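
And for the benefit-cost step, this sketch discounts an assumed level stream of avoided annual flood losses against an upfront project cost. The cost, benefit, 7 percent discount rate, and 30-year horizon are all illustrative figures.

```python
def benefit_cost_ratio(capital_cost, annual_benefit, rate, years):
    """Present value of level annual benefits divided by upfront cost."""
    pv_benefits = sum(annual_benefit / (1 + rate) ** t
                      for t in range(1, years + 1))
    return pv_benefits / capital_cost

bcr = benefit_cost_ratio(capital_cost=12_000_000, annual_benefit=1_100_000,
                         rate=0.07, years=30)
print(f"Benefit-cost ratio: {bcr:.2f}")  # > 1 suggests benefits exceed costs
```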
Statistically informed engineering methods incorporate the underlying physics and account for the randomness of the causal variables so that modeled impacts replicate what occurs in nature. Training statistical models with machine learning techniques on historical and real-time data is an essential step for improving the reliability of results and prediction capabilities (a minimal training sketch follows below). Strategies include model calibration through adjustment of physical parameters, refinement of statistical methods and assumptions, and robust technology implementation. Cloud computing services, such as Amazon Web Services and Microsoft Azure, and open-source data have proven to be very useful tools for the disciplines involved in resilience projects.
The Outcome
In the era of smart apps that enable citizens to mark themselves safe during disasters, our mission continues to focus on minimizing the need for such tools. We use crowd-sourced data to carefully engineer flood risk evaluation, disaster response, and mitigation planning. I am thrilled to have had the opportunity to be a part of the following projects, through which our firm has helped establish resilient communities across the nation: