Publication Date: 2020-04-27

Approval Date: 2019-11-22

Posted Date: 2019-09-09

Reference number of this document: OGC 19-051

Reference URL for this document: http://www.opengis.net/doc/IP/userguide/19-051

Category: User Guide

Editor: Guy Schumann, Albert Kettner

Title: OGC Disasters Resilience Pilot User Guide: Interoperability to Optimize Resource Allocation Across Flood Disaster Timeline


COPYRIGHT

Copyright © 2020 Open Geospatial Consortium. To obtain additional rights of use, visit http://www.opengeospatial.org/

Important

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights. Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.

Note

This document is a user guide created as a deliverable in an OGC Interoperability Initiative as a user guide to the work of that initiative and is not an official position of the OGC membership. There may be additional valid approaches beyond what is described in this user guide.


POINTS OF CONTACT

Name

Organization

Guy Schumann

RSS-Hydro

Albert Kettner

Dartmouth Flood Observatory, University of Colorado, Boulder (RSS-Hydro consultant)


1. Introduction

In recent years, hurricanes and tropical storms have caused particularly devastating flood disasters globally, with rainfall and inundation footprints far exceeding historical records as well as national response capabilities. The National Aeronautics and Space Administration (NASA) has increasingly stepped up to these new challenges in flood monitoring and response by harnessing the full scope of its remote sensing and modeling resources during events, either through ongoing activities at NASA centers or through NASA support of directly related projects. As these capabilities are designed and progressively improved, there is a coupled need for sustainable mechanisms to deliver and coordinate them.

Several international initiatives and organizations in which NASA participates also provide relevant services and geospatial data. This information ‘firehose’ makes it difficult to coordinate all of the relevant systems during a single event, and it becomes impossible with multiple simultaneous events. For example, response activities to Hurricane Harvey (August 23-September 25, 2017) were still underway as Hurricane Irma (September 4-October 18, 2017) caused flood damage. Although each of the response systems provides a “unique” capability, there is to date no global decision support system for flood disasters that ingests all the data from existing systems and provides real-time critical information to guide operational reactions on the ground. Because these capabilities evolve over time, any such “interoperable” system must be able to easily incorporate changes and improvements, and it must be flexible, robust, and maintainable into the future.

The Open Geospatial Consortium (OGC) Disaster Resilience Pilot and the GEOSS Architecture Implementation Pilot (GEOSS AIP) have been merged to form the combined OGC Disaster Resilience Pilot (DRP-2019) & GEOSS Architecture Implementation Pilot (AIP-10). OGC is an international not-for-profit organization committed to making quality open standards for the global geospatial community. GEOSS aims to achieve comprehensive, coordinated, and sustained observations of the Earth system in order to improve monitoring of the state of the Earth, increase understanding of Earth processes, and enhance prediction of the behavior of the Earth system. The two initiatives have been merged because they share the same ambitions, technical principles, interoperability challenges, and goals: to develop best practices in sharing and accessing data through Spatial Data Infrastructures in specific contexts, namely Disaster Resilience on the one hand, and general Earth Observation data-driven contexts such as ecology, energy, or public health on the other.

The goal of the Disaster Resilience Pilot is to develop and demonstrate user guides for building reliable and powerful data infrastructures that make all data required for decision making, analysis, and response in a flooding, hurricane, or wildfire situation available in a cost-effective way. The initiative brings data and infrastructure experts together to exercise specific scenarios. The focus is on disaster resilience, exercised together with additional scenarios that have similar interoperability challenges.

In this initiative, this user guide proposes to use enhanced satellite- and model-based flood information readily available from the DFO (http://floodobservatory.colorado.edu/) and elsewhere (NASA, National Oceanic and Atmospheric Administration (NOAA), European Centre for Medium-Range Weather Forecasts (ECMWF), University of Maryland, etc.) to contribute to the successful implementation of Scenario 1, Flood. All key DFO flood data layers are already provided as OGC-interoperable web map services as well as through a newly launched mobile app. The system is highly appreciated and widely used by disaster response organizations worldwide (e.g. the United Nations World Food Programme (UN WFP), the Federal Emergency Management Agency (FEMA), the Red Cross, the World Bank, and the Development Bank of Latin America (CAF)), as it provides a) historical flood events, b) near real-time flood maps, and c) additional relevant flood disaster information (for example, satellite-gauged water discharge) globally. The system provided by DFO fits this OGC initiative perfectly; furthermore, it is continuously maintained and augmented under, and plays a major role in, several ongoing NASA and other projects.

1.1. Flood Scenario: Hurricane Harvey Example

This user guide presents a flood scenario based on Hurricane Harvey, which hit Texas and parts of Louisiana in August 2017. The goal is to illustrate how different geospatial datasets can be served to the right people or person at the right time to optimize operations in the field. This is demonstrated using interoperable OGC web services standards and protocols across the Hurricane Harvey flood disaster timeline in order to optimize resource allocation. The sections below briefly outline the use case and illustrate the workflow and the general datasets used across the flood timeline.

1.2. Use Case

A rescue team on the ground requests information on which homes are flooded and closest to them. These requests occur across the hurricane and flood event timeline. An operations analyst at headquarters serves the relevant event datasets (predicted potential, imminent, and actual flooding) as WMS layers via GeoPlatform and a mobile app as the event unfolds. Receiving this data stream at the right moments allows the rescue team in the field to optimize the allocation of resources, for example by determining which homes closest to them are flooded and to what water depth, and what the population density is in the respective neighborhood(s).
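To make the client side of this use case concrete, the following is a minimal sketch of how an analyst's script or a thin client could pull one of the event WMS layers for the rescue team's area of interest using OWSLib. The service URL and layer name are hypothetical placeholders standing in for the actual DFO/partner endpoints, and the bounding box is an illustrative area around Houston, TX.

```python
# Minimal sketch (hypothetical endpoint and layer name): fetch a flood-extent WMS
# layer for a rescue team's area of interest and save it as a PNG overlay.
from owslib.wms import WebMapService

WMS_URL = "https://example.org/geoserver/flood/wms"  # placeholder, not a real DFO endpoint
wms = WebMapService(WMS_URL, version="1.1.1")

img = wms.getmap(
    layers=["flood_extent_nrt"],          # hypothetical near real-time flood layer
    srs="EPSG:4326",
    bbox=(-95.8, 29.5, -95.0, 30.1),      # lon/lat box roughly covering Houston, TX
    size=(1024, 768),
    format="image/png",
    transparent=True,
)

with open("flood_extent_houston.png", "wb") as out:
    out.write(img.read())
```

The same request pattern works for any WMS layer listed in Chapter 2 and can be embedded in a lightweight field-facing client.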

The figures below provide an overview of the workflow and illustrate the general flood timeline data layers, respectively.

workflow
Figure 1. Overview workflow



timeline_data
Figure 2. General workflow datasets for the flood timeline illustrated: Predicted (source: NOAA); Imminent (source: UMD); Onset (source: Fathom; input to the 2-D flood model: NOAA rainfall forecast to run it up, then hindcast mode); Unfolding (source: DFO, UCONN, Copernicus, LIST, and others)

1.3. General Structure of this User Guide

Chapter 2 introduces the simple architecture in terms of the data and formats, web clients, and interfaces used.

Chapter 3 describes the general use case activity in terms of data and resources.

Chapter 4 briefly outlines how the right data are passed to the right people or person.

Chapter 5 presents the solution developed in this use case. A clear mapping of requirements to solutions is provided.

The final Chapter 6 lists any issues encountered and makes future recommendations.

2. Simple Architecture

This section contains an overview of the data providers, catalogs, and data users for this disaster pilot use case scenario.

2.1. Overview

The figure below illustrates the simple architecture flow adopted in this use case.

architecture
Figure 3. Simple architecture



2.1.1. Notes

Some of the datasets used were only available in GeoTIFF format - which is still often the case with geospatial datasets - and therefore not in an OGC-interoperable service format. These data first needed to be translated into a WMS.
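As a hedged illustration of this translation step, the sketch below publishes a GeoTIFF through a GeoServer instance so that it becomes available as a WMS layer. The GeoServer host, workspace, store name, file name, and credentials are all placeholder assumptions; any other map server capable of serving OGC WMS would work equally well.

```python
# Minimal sketch (placeholder host, workspace, store and credentials): upload a
# GeoTIFF to GeoServer's REST API so it is exposed as an OGC WMS layer.
import requests

GEOSERVER = "http://localhost:8080/geoserver"   # placeholder GeoServer instance
AUTH = ("admin", "geoserver")                    # placeholder credentials
WORKSPACE = "disasters"                          # assumed to exist already
STORE = "harvey_flood_depth"

with open("harvey_flood_depth.tif", "rb") as tif:
    resp = requests.put(
        f"{GEOSERVER}/rest/workspaces/{WORKSPACE}/coveragestores/{STORE}/file.geotiff",
        data=tif,
        headers={"Content-Type": "image/tiff"},
        auth=AUTH,
    )
resp.raise_for_status()

# The raster can now be requested through GeoServer's WMS endpoint, e.g.
# {GEOSERVER}/{WORKSPACE}/wms?service=WMS&request=GetMap&layers=disasters:harvey_flood_depth&...
```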

The ECMWF rainfall data are obtained from a WMS-T service. By default, this service provides the current rainfall unless the date of interest is explicitly added to the request URL.
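The behavior described above can be seen in a plain GetMap request. The sketch below uses a placeholder WMS-T endpoint and layer name; omitting the TIME parameter returns the service's default (current) time step, while adding TIME selects a date during Hurricane Harvey.

```python
# Minimal sketch (placeholder endpoint and layer): a WMS-T GetMap request. Without
# the "time" parameter the service returns its default (current) rainfall; with it,
# a historical date such as one during Hurricane Harvey is returned instead.
import requests

WMS_T_URL = "https://example.org/wms"      # placeholder WMS-T endpoint
params = {
    "service": "WMS",
    "version": "1.3.0",
    "request": "GetMap",
    "layers": "total_precipitation",        # hypothetical rainfall layer name
    "crs": "EPSG:4326",
    "bbox": "25,-100,35,-90",               # lat/lon axis order for EPSG:4326 in WMS 1.3.0
    "width": 800,
    "height": 600,
    "format": "image/png",
    # "time": "2017-08-27T00:00:00Z",       # uncomment to request the Harvey date explicitly
}

resp = requests.get(WMS_T_URL, params=params)
resp.raise_for_status()
with open("rainfall.png", "wb") as out:
    out.write(resp.content)
```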

2.2. Data Providers

  1. DFO Flood Observatory (http://floodobservatory.colorado.edu/), University of Colorado, Institute of Arctic and Alpine Research, Boulder, CO.

  2. Luxembourg Institute of Science and Technology (LIST; https://www.list.lu/), Luxembourg.

  3. Fathom (https://www.fathom.global/), United Kingdom.

  4. Global Flood Monitoring System (GFMS; http://flood.umd.edu/), University of Maryland (UMD), College Park, MD.

  5. European Centre for Medium-Range Weather Forecasts (ECMWF; https://www.ecmwf.int/en/forecasts/datasets), United Kingdom. (Used as model input)

  6. WorldPop, University of Southampton, UK (https://www.worldpop.org/).

  7. Esri, geographic information system company, Redlands, CA.

URLs of Near Real Time WMS flood extent maps of providers:

Static Base map:

Near Real Time Flood depth simulations:

2.3. Catalog Providers

2.4. Data Consumers

Flood data are used by a wide variety of governmental and non-governmental institutes and organizations. From the perspective of a flood data provider, most data requests are made by:

This pilot scenario is built around the data user 'local rescue and relief agencies', mentioned above in the 'Data Consumers' section.

3. General Use Cases by User Activity

This section will provide details on the use case and end user.

3.1. Publication of data

The final data products used for this use case are available for viewing on GeoPlatform.gov at: https://viewer.geoplatform.gov/?id=0f9db0530c2b0de05f31d07f8d3e2436. The same data are also viewable on smartphones by downloading the 'DFO Floods' app, available for iOS and Android through the Apple App and Google Play stores, respectively. All data are publicly available through OGC WMS or WMS-T.

Chapter 5 shows screenshots of the data on GeoPlatform and on the DFO mobile application (DFO Floods), available for free in the Apple App and Google Play online stores.

3.2. Discovering of data

The datasets used during flood disasters are numerous and, as outlined in the introduction (Chapter 1), there is a "firehose" of data made available. The presented scenario uses datasets that would typically be made available during a disaster. In addition, the data used are all in operational mode, so the data and use case presented here can essentially be repeated for any other flood disaster at the global scale - this is why data limited to a particular event or region (country) have not been used to demonstrate the flood scenario.

3.3. Downloading of data

All of the data are freely available and can be downloaded through the URLs provided in Chapter 2.

3.4. Data Integration

URLs of Near Real Time WMS flood extent maps of providers:

Static Base map:

Near Real Time Flood depth simulations:

3.5. Displaying of the data with proper symbology

All WMS data sources used have proper symbology and comply with OGC standards.

4. Special Topics

4.1. Right data for the right user at the right time

This case study is built around the work of a first responder (a fire fighter) and how the first responder gets the right information at any given moment of a disaster, in this case the flood event due to Hurricane Harvey. The information timeline of interest runs from before the flood event (pre-flooding) to during the flood event.

4.1.1. Event time step 1: Preparedness phase I

The hurricane track and rainfall forecasts show that a potential disaster is about to happen in the coming days. ECMWF rainfall data show how much rain is forecast over a period of time, giving a good idea of potential flooding. Fire fighters prepare by making sure their own location is not impacted by the amount of expected rain and by ensuring their resources are operational under the forecast circumstances. The fire fighter checks the hurricane track and rainfall data available from ECMWF: https://apps.ecmwf.int/wms/?token=public&request=GetMap&layers=composition_aod550,grid,foreground&width=600&bbox=-180,-90,180,90. For ECMWF GloFAS forecast flood modeling, the data are described at http://www.globalfloods.eu/static/downloads/GloFAS-WMS-T_usermanual.pdf, a PDF that contains the WMS and WMS-T links. This page may also be useful: http://www.globalfloods.eu/general-information/data-and-services/

4.1.2. Event time step 2: Preparedness phase II

With less than a day to go, the ECMWF rainfall data (link above) become more accurate and the fire fighters become more aware of the significance of the disaster. They have made sure all available resources (boats, trailers to transport boats to locations, etc.) are ready to deploy, and with less than one day before the hurricane makes landfall the fire fighters check the hydrological models to better understand flood depth per area and where the need for first responders is greatest. Data from two hydrological models are available: the Global Flood Monitoring System (GFMS) of the University of Maryland (http://eagle2.umd.edu/flood/download); these data describe flood detection/intensity at 1/8th degree resolution as output by the GFMS hydrological prediction model. The other simulation data available are generated from LISFLOOD-FP and made available by Fathom (see Chapter 2 for a description of both hydrological models). The fire fighters use the population data (Chapter 2) to get a better idea of how many people are at risk in which areas, as sketched below. Given this information, the fire fighters advise local people to evacuate while they can and broadcast the areas they think (given the information they received) might be most severely impacted by Hurricane Harvey.
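As a rough illustration of how the population data can be combined with the modeled flood depths, the sketch below counts the people living in cells where the forecast depth exceeds a nominal threshold. It assumes the flood-depth and WorldPop grids have already been exported as GeoTIFFs and resampled onto the same grid; the file names and threshold are hypothetical.

```python
# Minimal sketch (hypothetical file names, co-registered grids assumed): estimate the
# population at risk by overlaying forecast flood depth with WorldPop population counts.
import numpy as np
import rasterio

with rasterio.open("flood_depth_forecast.tif") as src:
    depth = src.read(1, masked=True)        # forecast flood depth in metres

with rasterio.open("worldpop_counts.tif") as src:
    population = src.read(1, masked=True)   # people per grid cell

FLOOD_THRESHOLD_M = 0.3                      # nominal depth considered hazardous
at_risk = np.ma.where(depth > FLOOD_THRESHOLD_M, population, 0)

print(f"Estimated population at risk: {float(at_risk.sum()):,.0f}")
```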

4.1.3. Event time step 3: First response, rescuing people

Fire fighters receive calls from people in flooded areas but realize they are getting only part of the picture, as many neighborhoods are without power and parts of the phone network are down. Therefore, the fire fighters use the near real-time flood information provided by the DFO Flood Observatory, which is derived from satellites, together with the population maps to identify critical areas that need help.

5. Scenarios and Tools Demonstration

This section provides a detailed description of the scenario and the description of the tools used in the demonstration.

5.1. Question this User Guide is Trying to Answer

5.1.1. Context Setting

During (flood) disasters, there is a "firehose" of datasets and products being created for a variety of end users and decision makers. Disaster response managers and field operations analysts are often faced with either too much or too little information and with hardly any guidance on how to make decisions based on the data provided. End users may receive dozens of flood maps during the cycle of a flood disaster, and each single one will be missing information on accuracy and uncertainty. Perhaps even more importantly, there is no general guidance on how to use the information on the map, which may oftentimes not be the information required at all.

Furthermore, many flood maps, particularly those from satellite imagery, arrive too late to act upon. The maps will have different spatial scales and display noncommensurable information, or there is little spatial consensus between them. This situation is, of course, frustrating to both the decision maker and the product developer, whose flood maps may not be used as intended or indeed may not be used at all. Schumann et al. (2016) outlined several reasons for this apparent underutilization by decision makers: (a) limited time and capacity to understand, process, and handle the datasets; (b) limited near real-time data accessibility, bandwidth, and sharing capacity; (c) incompatibility between user platforms and geospatial data formats; (d) data availability may be simply unknown and/or data latency (lag from acquisition to delivery) may be inadequate; and (e) limited understanding by scientists and engineers about end-user product and timing needs.

5.1.2. Problem Statement

Hence, the question this demonstration poses is: How can we pass the "firehose" of data to the right person at the right time during a flood disaster?

5.2. Proposed Solution

In order to answer the question posed above, we propose a scenario using the Hurricane Harvey disaster of August 2017 and demonstrate the use of interoperability to optimize resource allocation across the flood disaster timeline.

The solution is demonstrated using the GeoPlatform (https://www.geoplatform.gov/), an open-access platform developed by the member agencies of the Federal Geographic Data Committee (FGDC) through collaboration with partners and stakeholders. The target audience for the GeoPlatform includes Federal agencies; State, local, and Tribal governments; the private sector; academia; and the general public.

In parallel, as part of a NASA-funded SBIR (Small Business Innovative Research) Phase II project (https://sbir.nasa.gov/SBIR/abstracts/17/sbir/phase2/SBIR-17-2-S5.02-8498.html), Remote Sensing Solutions Inc., in collaboration with the DFO at the University of Colorado Boulder (led by Guy Schumann and Albert Kettner), developed a mobile app that hosts all the DFO WMS layers and allows other public WMS data to be pulled in. A screenshot of the app is shown in the next section. The app ("DFO-Floods") is available for free on both Android and Apple iOS. Feedback for improvements and additions can be provided through the app itself.

General tasks and guidelines worked on during this pilot to reach the scenario demonstration goal:

  1. Collaborate with a variety of users as observers. The goal is to put the correct type of information in the hands of the right people.

  2. Different OGC WMS layers are ported to GeoPlatform.gov (https://www.geoplatform.gov/) to showcase the Hurricane Harvey event timeline and, in the long run, to support the long-term five-year effort. The technical requirements and feasibility are examined first.

  3. The scenario design and demonstration are kept flexible in case sponsors or others want to upload or provide further data and services.

  4. Elements of spatial data infrastructure are included in the scenario demonstration.

  5. Including address data (already available on Geoplatform.gov) is an option.

  6. The scenario supports a scaled demonstration covering metropolitan, urban, and rural communities.

Note that the objective is also that this use case on the GeoPlatform can be used as a demo exercise to train federal agency personnel. Furthermore, all data used are globally available, so this demonstration can be done for any flood disaster worldwide.

5.3. Data Layers on GeoPlatform.gov and the DFO Mobile App

The following three figures illustrate the GeoPlatform Map Viewer with the WorldPop data and the different flood-related layers across the Hurricane Harvey timeline. The ECMWF rainfall data are loaded as a WMS-T layer, which shows the current rainfall event, not the rainfall during Harvey; the Harvey-era data can be reached by specifying the date timestamp when loading the WMS-T layer.


geo_ecmwf
Figure 4. Screenshots of the GeoPlatform Map Viewer base map, the WorldPop population density layer and the ECMWF rainfall forecast WMS-T layer (note that the dates on the WMS-T layers are not the event dates but always the current ones; when the layers are loaded during an ongoing event, the dates will be correct).



geo_fathom
Figure 5. Screenshots of the GeoPlatform Map Viewer base map, Fathom’s (UK) in-house 2-D inundation depths at high-resolution based on NOAA forecast rainfall, and the inundation depths intersecting the World Pop layer.



geo_dfo
Figure 6. Screenshots of the larger region satellite-derived DFO flood event map on top of GeoPlatform Map Viewer base map, a zoomed-in version, and the DFO flood map intersecting the World Pop layer.



The figure below shows a screenshot of the recently released free DFO mobile app ("DFO-Floods"), available on Android and Apple iOS. This app was supported by a NASA SBIR Phase II project led by Remote Sensing Solutions Inc. (Guy Schumann) and the Dartmouth Flood Observatory (Albert Kettner). For more details, see the previous section. The app can be obtained here:


mob_app
Figure 7. Screenshot of the DFO mobile app showing a satellite image base map, the DFO flood maps of Hurricane Harvey and the ECMWF accumulated precipitation layer.

6. Conclusion and Way Forward

The OGC Disaster Pilot has been a great success in that it provided a unique opportunity to identify data resources and generate information that supports end users in making decisions. In this data-intensive period, most data are available in some form, but it can be difficult to find the data that are needed. Also, data are not always directly accessible, interoperable, or re-usable (according to the F.A.I.R. principle: Findable, Accessible, Interoperable and Re-usable).

This OGC Disaster Pilot enabled an evaluation of the availability of flood disaster data in the lead-up to a hurricane event, Hurricane Harvey, as well as while the disaster was evolving. While a disaster is unfolding, not much time is available to search for the right data sources on which to base decisions. And although there is an abundance of disaster-related data, this case study identified only a few key datasets that, when combined, turn into useful information for first responders.

Only global datasets and services that are publicly and freely available were purposely used. So, although Hurricane Harvey was chosen for this case study, any other flood disaster would have resulted in similar findings.

6.1. Overview of lessons learned during the Disasters Resilience Pilot

  • It would be great to agree on one (or several) particular disaster event(s) that can be worked on and for which data can be collected, so that these can be reused in future disaster pilots; this way, participants can spend more time working on their software workflows rather than collecting data first. That would also make it possible to determine what data are available (or missing) during all stages of a disaster, which datasets are useful for (specific) users and when (and which datasets are not), etc.

  • It would be useful to have all data providers serve their data in some form of interoperable standard right at the source. Oftentimes, data are served in an interoperable format by some "go-between" provider rather than by the source producer. This makes interoperable data discovery very difficult;

  • Flood observations derived from satellite data are nowadays mostly provided as a merged flood extent product that is updated over time to capture the maximum flooded area. This is useful for first responders, but it would be even better if first responders could see a flood evolving over time, i.e. how it slowly propagates through an urban area over the days. The same holds for the flood depth simulations, which were also provided as a time-merged product (maximum flood depth over time). When presented as time-dependent data, first responders can better decide whom to rescue first: those who were just flooded or those who have experienced flooding for several days. A WMS-T interface, for example, would make this possible (see the sketch after this list).

  • The presented case study focused on providing the best information for an adequate first response, using the flooding due to Hurricane Harvey as a case study. However, as evidenced by its framing in the Sendai Framework for Disaster Risk Reduction (https://www.unisdr.org/we/coordinate/sendai-framework), flood risk should receive more attention. What actions are needed to address the main challenges in flood risk estimation, and how can those actions be aligned with the goals of GEO Global Flood Risk Monitoring (GEO 2017-2019 Work Programme; https://www.earthobservations.org/geoss_wp.php) and the Sendai Framework for Disaster Risk Reduction? What data and tools are needed to mitigate flood risk, and how can data availability and interoperability play a key role in this? It would be great if a follow-up phase of the disaster pilot would place more emphasis on what information is needed to mitigate flood risk.
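To illustrate the WMS-T recommendation made above, the following sketch requests a short sequence of time steps from a time-enabled flood-extent layer, producing one image per date so that the evolution of the flooding can be reviewed or animated. The endpoint and layer name are placeholders, not existing services.

```python
# Minimal sketch (placeholder endpoint and layer): request several time steps from a
# time-enabled (WMS-T) flood-extent layer instead of a single time-merged maximum.
import requests

WMS_T_URL = "https://example.org/wms"        # placeholder time-enabled WMS endpoint
DATES = ["2017-08-26", "2017-08-28", "2017-08-30", "2017-09-01"]

for date in DATES:
    resp = requests.get(
        WMS_T_URL,
        params={
            "service": "WMS",
            "version": "1.3.0",
            "request": "GetMap",
            "layers": "flood_extent",         # hypothetical time-enabled layer name
            "crs": "EPSG:4326",
            "bbox": "29.5,-95.8,30.1,-95.0",  # lat/lon axis order for WMS 1.3.0
            "width": 800,
            "height": 600,
            "format": "image/png",
            "time": date,
        },
    )
    resp.raise_for_status()
    with open(f"flood_extent_{date}.png", "wb") as out:
        out.write(resp.content)
```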

Appendix A: Abbreviations

  • CAF: Development Bank of Latin America

  • DFO: Dartmouth Flood Observatory

  • DRP: Disaster Resilience Pilot

  • ECMWF: European Centre for Medium-Range Weather Forecasts

  • F.A.I.R: Findable, Accessible, Interoperable and Re-usable.

  • FEMA: Federal Emergency Management Agency

  • FGDC: Federal Geographic Data Committee

  • GEO: Group on Earth Observations

  • GEOSS AIP: Global Earth Observation System of Systems - Architecture Implementation Pilot

  • GFMS: Global Flood Monitoring System

  • GPM: Global Precipitation Measurement

  • IMERG: Integrated Multi-Satellite Retrievals for GPM

  • IOS: iPhone Operating System

  • LIST: Luxembourg Institute of Science and Technology

  • MODIS: Moderate Resolution Imaging Spectroradiometer

  • NASA: National Aeronautics and Space Administration

  • NOAA: National Oceanic and Atmospheric Administration

  • PDF: Portable Document Format

  • SAR: Synthetic Aperture Radar

  • SBIR: Small Business Innovative Research

  • TRMM: Tropical Rainfall Measuring Mission

  • UCONN: University of Connecticut

  • UMD: University of Maryland

  • UN WFP: United Nations World Food Programme

  • URL: Uniform Resource Locator

  • WMS: Web Map Service

  • WMS-T: Web Map Service with support to temporal requests