Developing Daily Quantitative Damage Estimates From Geospatial Layers To Support Post Event Recovery

Tuesday, 16 December 2014
Bryan K Woods1, Lisa H Wei1 and Thomas Cameron Connor2, (1)Atmospheric and Environmental Research, Lexington, MA, United States, (2)Verisk Climate, Lexington, MA, United States
With the growth of natural hazard data available in near real-time, it is increasingly feasible to deliver estimates of the damage caused by natural disasters. These estimates can be used in disaster management settings or by commercial entities to optimize the deployment of resources and/or the routing of goods and materials. This work outlines an end-to-end, modular process for generating estimates of damage caused by severe weather. The processing stream consists of five generic components:

1) Hazard modules that provide quantitative data layers for each peril.

2) Standardized methods to map the hazard data to an exposure layer based on atomic geospatial blocks.

3) Peril-specific damage functions that compute damage metrics at the atomic geospatial block level.

4) Standardized data aggregators, which map damage to user-specific geometries.

5) Data dissemination modules, which provide resulting damage estimates in a variety of output forms.
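The five components above can be sketched as a chain of small Python functions. All names, the damage curve, and the sample numbers below are illustrative placeholders, not the operational system's API:

```python
# Toy sketch of the five-stage modular pipeline; every function name,
# the wind-damage curve, and the block/region IDs are hypothetical.

def hazard_module(forecast):
    """1) Hazard module: quantitative data layer for one peril (wind, m/s)."""
    return dict(forecast)

def map_to_exposure(hazard, exposure):
    """2) Join the hazard layer to exposure on atomic geospatial blocks."""
    return {b: (hazard[b], exposure[b]) for b in hazard if b in exposure}

def wind_damage(speed, value, threshold=25.0):
    """3) Peril-specific damage function: a purely illustrative linear
    damage ratio above a threshold wind speed, capped at total loss."""
    ratio = max(0.0, min(1.0, (speed - threshold) / 50.0))
    return ratio * value

def aggregate(block_damage, block_to_region):
    """4) Aggregate block-level damage to user-specific geometries."""
    totals = {}
    for block, dmg in block_damage.items():
        region = block_to_region[block]
        totals[region] = totals.get(region, 0.0) + dmg
    return totals

def disseminate(totals):
    """5) Disseminate the damage estimates in a simple tabular form."""
    return "\n".join(f"{region},{dmg:.1f}" for region, dmg in sorted(totals.items()))

# Toy run: two blocks in one county, one in another.
hazard = hazard_module({101: 45.0, 102: 30.0, 201: 20.0})
joined = map_to_exposure(hazard, {101: 1e6, 102: 5e5, 201: 2e6})
damage = {b: wind_damage(s, v) for b, (s, v) in joined.items()}
print(disseminate(aggregate(damage, {101: "CountyA", 102: "CountyA", 201: "CountyB"})))
```

The value of the modular layout is that each stage can be swapped independently: a new peril only requires a new hazard module and damage function, while the exposure mapping, aggregation, and dissemination stages are reused unchanged.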

This presentation describes this generic tool set and walks through an example using HWRF-based hazard data for Hurricane Arthur (2014). In this example, the Python-based real-time processing ingests GRIB2 output from the HWRF numerical model and dynamically downscales it in conjunction with a land cover database, using a multiprocessing pool and a just-in-time (JIT) compiler. The resulting wind fields are contoured and ingested into a PostGIS database using OGR. Finally, damage estimates are calculated at the atomic block level and aggregated to user-defined regions with PostgreSQL queries to produce application-specific tabular and graphical output.
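The downscaling step can be illustrated as follows: model wind values are adjusted block-by-block with a land-cover roughness factor, and the grid is split into tiles dispatched to a worker pool. The roughness table and the scaling formula here are placeholders; the operational code reads GRIB2 via a GRIB library and JIT-compiles the kernel before mapping it over a process pool, whereas this demo uses a thread pool to stay portable:

```python
# Hedged sketch of parallel, land-cover-aware downscaling. ROUGHNESS
# values and the per-cell scaling are illustrative assumptions, not the
# operational coefficients.

from multiprocessing.dummy import Pool  # thread-based pool for portability

# Hypothetical land-cover classes mapped to wind adjustment factors.
ROUGHNESS = {"water": 1.10, "grass": 1.00, "forest": 0.85, "urban": 0.75}

def downscale_tile(tile):
    """Scale each model wind value in a tile by its land-cover factor."""
    winds, covers = tile
    return [w * ROUGHNESS[c] for w, c in zip(winds, covers)]

def downscale(winds, covers, tile_size=2, workers=4):
    """Split the flattened grid into tiles and downscale them in parallel."""
    tiles = [
        (winds[i:i + tile_size], covers[i:i + tile_size])
        for i in range(0, len(winds), tile_size)
    ]
    with Pool(workers) as pool:
        results = pool.map(downscale_tile, tiles)
    # Re-flatten the tiles into a single grid in the original order.
    return [w for tile in results for w in tile]

# Toy 2x2 grid of 10 m wind speeds (m/s) with matching land-cover classes.
winds = [40.0, 38.0, 35.0, 42.0]
covers = ["water", "grass", "forest", "urban"]
print(downscale(winds, covers))
```

In the production pipeline the same pattern applies at scale: because each tile is independent, the pool parallelizes cleanly across cores, and compiling the per-cell kernel with a JIT removes the Python interpreter overhead from the inner loop.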