Published

OGC Engineering Report

Engineering Report for the OGC Climate Resilience Pilot
Editors: Guy Schumann, Albert Kettner, Nils Hempelmann

Document number: 23-020r2
Document type: OGC Engineering Report
Document subtype:
Document stage: Published
Document language: English

License Agreement

Use of this document is subject to the license agreement at https://www.ogc.org/license



I.  Executive Summary

The OGC Climate Resilience Pilot marked the beginning of a series of enduring climate initiatives whose primary goal was to evaluate the value chain from raw data to climate information within Climate Resilience Information Systems. This includes the transformation of geospatial data into meaningful knowledge for various stakeholders, including decision-makers, scientists, policymakers, data providers, software developers, service providers, and emergency managers. The results of the pilot support the location community in developing more powerful visualization and communication tools to accurately address ongoing climate threats such as heat, drought, floods, and wildfires, as well as supporting governments in meeting the commitments of their climate strategies. This will be accomplished by evolving geospatial data, technologies, and other capabilities into valuable information for decision-makers, scientists, policymakers, data providers, software developers, and service providers so they can make informed decisions to improve climate action. One of the most significant challenges so far has been converting the outputs of global and regional climate models into specific impacts and risks at the local level. The climate science community has adopted standards, and numerous climate resilience information systems are now available online, allowing experts to exchange and compare data effectively. However, professionals outside the weather and climate domain, such as planners and GIS analysts working for agencies dealing with climate change impacts, have limited familiarity with and capacity to utilize climate data.

Stakeholders depend on meaningful information to make decisions or advance their science. In the context of climate change, this meaningful information is delivered through climate services as a combination of technical applications and human consultation. The technical infrastructures underpinning climate services, named here Climate Resilience Information Systems, require the processing of vast amounts of data from diverse providers across various scientific ecosystems.

This report assesses the value chain from raw data to climate information and the onward delivery to stakeholders. It explains good practices on how to design climate resilience information systems, identifies gaps, and gives recommendations on future work.

The OGC pilot demonstrated the capability of creating data pipelines that convert vast amounts of raw data, through various steps, into decision-ready information and 3D visualizations, while embedding good-practice approaches for communicating this knowledge to non-specialists. In other words, to obtain decision-ready information, the data must first be collected from multiple sources and organized, then transformed into analysis-ready formats.

To address the value chain from raw data to decision-ready indicators, one focus of this pilot was to explore methods for extracting climate variables from climate model output scenarios and delivering them in formats that are more easily usable for post-processing experts, alongside being applicable to local situations and specific use-cases. Climate variable Data Cubes were extracted or aggregated into temporal and spatial ranges specific to the use cases. Then, the data structure was transformed from multidimensional gridded cubes into forms that can be readily utilized by geospatial applications. These pilot data flows serve as excellent examples of how climate data records can be translated into estimates of impacts and risk at the local level in a way that seamlessly integrates into existing planning workflows and is made available to a broad user community via open standards.
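As a minimal sketch of that last step, the following Python example (with invented coordinates and values; not the pilot's actual tooling) subsets a small gridded temperature cube in time and flattens the result into per-cell records of the kind a geospatial application can ingest directly:

```python
# Toy "data cube": daily temperature on a 3x3 grid over 4 time steps.
# All axes and values are invented purely for illustration.
times = ["2021-07-01", "2021-07-02", "2021-07-03", "2021-07-04"]
lats = [45.0, 45.5, 46.0]
lons = [7.0, 7.5, 8.0]
# cube[t][i][j] = temperature (deg C) at times[t], lats[i], lons[j]
cube = [
    [[28, 29, 30], [27, 28, 29], [26, 27, 28]],
    [[30, 31, 32], [29, 30, 31], [28, 29, 30]],
    [[31, 32, 33], [30, 31, 32], [29, 30, 31]],
    [[29, 30, 31], [28, 29, 30], [27, 28, 29]],
]

def aggregate(cube, times, lats, lons, t_start, t_end):
    """Temporal mean over [t_start, t_end], flattened to GIS-style records."""
    # ISO date strings compare correctly as plain strings
    t_idx = [t for t, ts in enumerate(times) if t_start <= ts <= t_end]
    records = []
    for i, lat in enumerate(lats):
        for j, lon in enumerate(lons):
            mean = sum(cube[t][i][j] for t in t_idx) / len(t_idx)
            records.append({"lat": lat, "lon": lon, "t_mean": round(mean, 2)})
    return records

records = aggregate(cube, times, lats, lons, "2021-07-02", "2021-07-04")
print(records[0])  # {'lat': 45.0, 'lon': 7.0, 't_mean': 30.0}
```

In practice this transformation is performed on NetCDF or Zarr cubes with libraries such as xarray, but the pattern is the same: select a temporal and spatial range, aggregate, then flatten the grid into features.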

In addition, the pilot examined various parts of the processing pipelines using climate-impact case studies related to heat, drought, floods, and wildfires, highlighting assessment tools and the complexities of climate indices. It also recognized the existence of solar radiation databases and web map services, emphasizing the need to enhance their accessibility and applicability at a national level to combat the effects of climate change by utilizing solar energy resources more efficiently. Ultimately, this Climate Resilience Pilot serves as a crucial asset for making well-informed decisions that bolster climate action. It particularly aids the location community in developing enhanced 3D visualization, simulation, and communication tools to effectively address prevalent climate change impacts and hazards caused by meteorological extreme events.

This report also demonstrates the workflow from data to 3D visualization, specifically for non-technical individuals. A chapter is dedicated to the options and challenges of applying artificial intelligence to establish a climate scenario digital twin where various scenarios of efficiencies of climate action can be simulated. These simulations can encompass the reduction of disaster risks through technical engineering. The concept of climate resilience is explored, not only considering the shift of meteorological phenomena but also accounting for land degradation and biodiversity loss. More specifically, the scenarios focus on understanding the effects of climate change on vegetation in the Los Angeles area. 3D landscape vegetation simulations are presented, demonstrating how different tree species adapt under changing climate conditions represented by a range of climate and policy scenarios over time.

The pilot acknowledges the significant challenges of effectively conveying information to decision-makers. This necessitates a thorough examination of communication methods. Consequently, a dedicated chapter emphasizes unique approaches to facilitate effective communication with non-technical individuals, who frequently hold responsibility for local climate resilience action strategies. The development and implementation of a stakeholder survey provides insight into the strengths and weaknesses of past adaptation processes and allows for the derivation of opportunities for improvement. By prioritizing communication, the pilot aims to bridge the gap between technical and non-technical stakeholders, ensuring accurate and comprehensive information transmission for the benefit of both sides. The addition of this chapter demonstrates the pilot’s aim to enhance communication strategies to foster improved decision-making in the realm of climate resilience.

Overall, this engineering report presents various workflow processes which illustrate the seamless exchange of data, models, and components, such as climate application packages, that emphasize the potential for optimization using OGC Standards.

In the context of climate and disaster resilience, this document contributes substantially to a comprehensive understanding of flood, drought, heat, and wildfire assessments, offering insights into decision-making for climate action and specifically addressing the enhancement of Climate Resilience Information Systems in line with FAIR Climate Services principles.

II.  Keywords

The following are keywords to be used by search engines and document catalogues.

Climate Resilience, data, ARD, component, use case, FAIR, Drought, Heat, Fire, Floods, Data cubes, Climate scenario, Impact, Risk, Hazard, DRI, Indicator

III.  Submitters

The various organizations and institutes that contribute to the Climate Resilience Pilot are described below.

Table — Contributors of this Climate Resilience Pilot

Name | Organization | Role or Summary of contribution
Guy Schumann | RSS-Hydro | Lead ER Editor
Albert Kettner | RSS-Hydro/DFO | Lead ER Editor
Sacha Lepretre | CAE, Presagis (CAE Subsidiary) | Use of AI Digital Twin and simulation for climate (5D Meta World demo with Laubwerk)
Timm Dapper | Laubwerk GmbH |
Peng Yue | Wuhan University | Datacube component
Zhe Fang | Wuhan University | Climate ARD component
Hanwen Xu | Wuhan University | Drought impact use cases
Dean Hintz | Safe Software, Inc. | Climate Analysis Ready Data and Drought Indicator
Kailin Opaleychuk | Safe Software, Inc. | Climate Analysis Ready Data and Drought Indicator
Samantha Lavender | Pixalytics Ltd | Development of drought indicator
Andrew Lavender | Pixalytics Ltd | Development of drought indicator
Jenny Cocks | Pixalytics Ltd | Development of drought indicator
Jakub P. Walawender | Freelance climate scientist and EO/GIS expert | Climate ARD and solar radiation use case
Daniela Hohenwallner-Ries | alpS GmbH | Communication with stakeholders
Hanna Krimm | alpS GmbH | Communication with stakeholders
Hinnerk Ries | alpS GmbH | Communication with stakeholders
Paul Schattan | alpS GmbH | Communication with stakeholders
Jérôme Jacovella-St-Louis | Ecere Corporation | Datacube API client and server
Patrick Dion | Ecere Corporation | Datacube API client and server
Eugene Yu | GMU |
Gil Heo | GMU |
Glenn Laughlin | Pelagis Data Solutions | Coastal Resilience & Climate Adaptation
Tom Landry | Intact Financial Corporation |
Steve Kopp | Esri | Climate services & web interface
Lain Graham | Esri | Climate services & web interface
Nils Hempelmann | OGC | Climate Resilience Pilot Coordinator

III.A.  About alpS

alpS GmbH is an international engineering and consulting firm that supports companies, municipalities, and governments in sustainable development and in dealing with the consequences, opportunities, and risks of climate change. Over the past 20 years, alpS has worked with more than 250 municipalities and industrial partners on climate-related projects. alpS accompanied a large number of adaptation cycles from risk assessments to the implementation and evaluation of adaptation measures.

III.B.  CAE

CAE is a high-tech company with a mission and vision focused on safety, efficiency, and readiness. As a technology company, CAE digitalizes the physical world, deploying simulation training and critical operations support solutions. Above all else, CAE empowers pilots, airlines, defense and security forces, and healthcare practitioners to perform at their best every day, especially when the stakes are highest. CAE represents 75 years of industry firsts: the highest-fidelity flight, future mission, and medical simulators, and personalized training programs powered by artificial intelligence. CAE invests time and resources into building the next generation of cutting-edge, digitally immersive training and critical operations solutions while keeping positive environmental, social, and governance (ESG) impact at the core of its mission. Presagis, part of CAE, specializes in developing 3D modeling and simulation software and has developed VELOCITY 5D (V5D), a next-generation 3D digital twin creation and simulation geospatial platform leveraging artificial intelligence.

III.C.  About Ecere

Ecere is a small software company located in Gatineau, Québec, Canada. Ecere develops the GNOSIS cross-platform suite of geospatial software, including a map server, a Software Development Kit and a 3D visualization client. Ecere also develops the Free and Open Source Ecere cross-platform Software Development Kit, including a 2D/3D graphics engine, a GUI toolkit, an Integrated Development Environment and a compiler for the eC programming language. As a member of OGC, Ecere is an active contributor in several Standard Working Groups as co-chair and editor, and participated in several testbeds, pilots and code sprints. In particular, Ecere has been a regular contributor and an early implementer for several OGC API standards in its GNOSIS Map Server and GNOSIS Cartographer client, and is also active in the efforts to modernize the OGC CDB data store and OGC Styles & Symbology standard.

III.D.  About Esri

Esri is a leading provider of geographic information system (GIS) software, location intelligence, and mapping. Since 1969, Esri has supported customers (more than a half million organizations in over 200 countries) with geographic science and geospatial analytics, taking a geographic approach to problem-solving, brought to life by modern GIS technology. The ArcGIS platform includes an integrated system of desktop, web, and mobile software products and data committed to open science.

Within the context of this OGC engagement, Esri provides the full range of capabilities from CMIP climate data processing and publishing, spatial analysis for risk assessment, climate adaption and resilience, to web application development and science communication tools.

III.E.  About George Mason University (GMU)

George Mason University (GMU) is a public research university that conducts research and provides training to postdoctoral fellows, PhD candidates, and master’s students in Geospatial information science, remote sensing, satellite image analysis, geospatial data processing, Earth system science, geospatial interoperability and standards, geographic information systems, and other related subjects. GMU will contribute an ARD use-case.

III.F.  About Intact

Intact Financial Corporation (IFC) is the largest provider of Property & Casualty (P&C) insurance in Canada. IFC’s purpose is to help people, businesses, and society prosper in good times and be resilient in bad times. The company has been on the front lines of climate change for more than a decade, getting its customers back on track and adapted to change. As extreme weather is predicted to get worse over the next decade, Intact intends to double down on adjusting to this changing environment to become better prepared for floods, wildfire, and extreme heat.

With close to 500 experts in data, artificial intelligence, machine learning, and pricing, the Intact Data Lab has deployed almost 300 AI models in production to date, focussing on improving risk selection and making operations as efficient as possible while creating outstanding interactions with customers. Within Intact’s Data Lab, the Centre for Climate and Geospatial Analytics (CCGA) uses weather, climate, and geospatial data along with machine learning models and claims data to develop risk maps and other specialized products.

III.G.  About Laubwerk

Laubwerk is a software development company whose mission is to combine accurate, broadly applicable visualizations of vegetation with deeper information and utility that goes far beyond their visual appearance. Laubwerk achieves this through building a database that combines ultra-realistic 3D representations of plants with extensive metadata that represents plant properties. This unique combination makes Laubwerk a prime partner to bridge the gap from data-driven simulation to eye-catching visualizations.

III.H.  About Pixalytics Ltd

Pixalytics Ltd is an independent consultancy company specializing in Earth Observation (EO), combining cutting-edge scientific knowledge with satellite and airborne data to provide answers to questions about Earth’s resources and behavior. The underlying work includes developing algorithms and software, with activities focusing on EO quality control and end-user focused applications.

III.I.  About Pelagis

Pelagis is an OceanTech venture located in Nova Scotia, Canada focusing on the application of open geospatial technology and standards designed to promote the sustainable use of ocean resources. As a member of the Open Geospatial Consortium, Pelagis co-chairs the Marine Domain Working Group responsible for developing a spatially-aware federated service model of marine and coastal ecosystems.

III.J.  About RSS-Hydro

RSS-Hydro is a geospatial solutions and service company focusing its R&D and commercial products on water risks, with a particular emphasis on the SDGs. RSS-Hydro has been part of several successful OGC initiatives, including the DP 21 to which this pilot is linked, not only in terms of ARD and IRD but also in terms of use cases. In this pilot, RSS-Hydro’s main contribution is leading this Engineering Report. In terms of technical contributions to various other OGC testbeds and pilots, RSS-Hydro creates digestible OGC data types and formats for specific partner use cases, in particular producing ARD from publicly available EO and model data, including hydrological model output as well as climate projections. These ARD feed into the use cases of all participants, especially the use cases proposed for floods, heat, drought, and health impacts by other participants in the pilot. The ARD, created in various OGC interoperable formats, will provide digestible dataflows for the proposed OGC use cases.

Specifically, RSS-Hydro can provide access to the following satellite and climate projection data.

  • Wildfire: Fire Radiative Power (FRP) product from Sentinel-3 (NetCDF), Sentinel-5P, MODIS products (fire detection), VIIRS (NOAA); possibly biomass availability (fire fuel)

  • Land Surface Temp: Sentinel 3

  • Pollution: Sentinel 5p

  • Climate Projection data (NetCDF, etc., daily downscaled possible): air temp (10 m above ground) with rainfall and possibly wind direction as well

  • Satellite-derived Discharge Data to look at droughts/floods etc. by basin or other scale

  • Hydrological model simulation outputs at (sub)basin scale

III.K.  About Safe Software

Safe Software has been a leader in geospatial interoperability and automation for more than 25 years as the creator of the FME platform. FME was created to promote FAIR principles, including data sharing across barriers and silos, with unparalleled support for a wide array of both vendor-specific formats and open standards. Within this platform, Safe Software provides a range of tools to support interoperability workflows. FME Form is a graphical authoring environment that allows users to rapidly prototype transformation workflows in a no-code environment. FME Flow then allows users to publish data transforms to enterprise-oriented service architectures. FME Hosted offers a low-cost, easy-to-deploy, and scalable environment for deploying transformation and integration services to the cloud.

Open standards have always been a core strategy for Safe Software to better support data sharing. The FME platform can be seen as a bridge between the many supported vendor protocols and open standards such as XML, JSON, and OGC standards such as GML, KML, WMS, WFS, and OGC APIs. Safe Software has collaborated extensively over the years with the open standards community. Safe Software actively participates in the CityGML and INSPIRE communities in Europe and is also active within the OGC community and participated in many initiatives including test beds, pilots such as Maritime Limits and Boundaries and IndoorGML, and most recently the 2021 Disaster Pilot and 2023 Climate Resilience Pilot. Safe Software also actively participates in a number of Domain and Standards working groups.

III.L.  About Jakub P. Walawender

Jakub P. Walawender is a freelance climate scientist and EO/GIS expert carrying out his PhD research on the solar radiation climatology of Poland at the Laboratory for Climatology and Remote Sensing (LCRS), Faculty of Geography, Philipps University in Marburg, Germany. Jakub specializes in the application of satellite remote sensing, GIS, and geostatistics in the monitoring and analysis of climate variability and extremes and supports users in the application of different climate data records to tackle the effects of climate change.

III.M.  About Wuhan University (WHU)

Wuhan University (WHU) is a university that plays a significant role in researching and teaching all aspects of surveying and mapping, remote sensing, photogrammetry, and geospatial information sciences in China. In this Climate Resilience Pilot, WHU will contribute three components (ARD, Drought Indicator, and Data Cube) and one use-case (Drought Impact Use-cases).

1.  Terms, definitions and abbreviated terms

For the purposes of this document, the following terms and definitions apply.

Carrying Capacity

an area both suitable and available for human activity based on the state of the ecosystem and competitive pressures for shared resources

CityGML

an open standardized data model and exchange format to store digital 3D models of cities and landscapes

Data Cube

In computer programming contexts, a data cube (or datacube) is a multi-dimensional (“n-D”) array of values. Typically, the term data cube is applied in contexts where these arrays are massively larger than the hosting computer’s main memory; examples include multi-terabyte/petabyte data warehouses and time series of image data.

FAIR Climate Service

a climate resilience information system where the entire architecture follows FAIR principles

FAIR principles

the concept of making digital assets Findable, Accessible, Interoperable, and Reusable

Resilience

the ability of a system to compensate for impacts

Sentinel (satellite mission)

a series of next-generation Earth observation missions developed by the European Space Agency (ESA) on behalf of the joint ESA/European Commission initiative Copernicus

1.1.  Abbreviated terms

ACDC

Atmospheric Composition Data Cube

ACDD

Attribute Convention for Data Discovery

ACIS

Applied Climate Information System

ADES

Application Deployment and Execution Service

ADS

Atmosphere Data Store

AP

Application Package

API

Application Programming Interface

AR

Assessment Report

ARD

Analysis Ready Data

ARDC

Analysis Ready Data Cube

AWS

Amazon Web Services

BCSD

Bias Corrected Spatially Downscaled

BRDF

Bidirectional Reflectance Distribution Function

C3S

Copernicus Climate Change Service

CCI

Climate Change Initiative

CDI

Combined Drought Indicator

CDR

Climate Data Record

CDS

Climate Data Store

CEOS

Committee on Earth Observation Satellites

CF

Climate and Forecast

CGMS

Coordination Group for Meteorological Satellites

CIOOS

Canadian Integrated Ocean Observing System

CMIP

Coupled Model Intercomparison Project

CMR

Common Metadata Repository

CMRA

Climate Mapping for Resilience and Adaptation

COG

Cloud Optimized Geotiff

CRIS

Climate Resilience Information System

CSV

Comma-Separated Values

CWIC

CEOS WGISS Integrated Catalog

DEM

Digital Elevation Model

DRI

Decision Ready Indicator

DSW

Drought Severity Workflow

DWG

Domain Working Group

ECMWF

European Centre for Medium-Range Weather Forecasts

ECV

Essential Climate Variable

EDR

Environmental Data Retrieval

EFFIS

European Forest Fire Information System

EMS

Exploitation Platform Management Service

EO

Earth Observation

ER

Engineering Report

ERA5

fifth generation ECMWF atmospheric reanalysis of the global climate

ESA

European Space Agency

ESDC

Earth System Data Cube

ESDL

Earth System Data Laboratory

ESIP

Earth Science Information Partners

EUMETSAT

European Organisation for the Exploitation of Meteorological Satellites

FAIR

Findable, Accessible, Interoperable, Reusable

FAPAR

Fraction of Absorbed Photosynthetically Active Radiation

FME

Feature Manipulation Engine

FOSS4G

Free and Open Source Software for Geospatial

FRP

Fire Radiative Power

FWI

Fire Weather Index

GCM

General Circulation Model

GCOS

Global Climate Observing System

GDO

Global Drought Observatory

GDP

Gross Domestic Product

GHG

Greenhouse Gases

GML

Geography Markup Language

GMU

George Mason University

GOOS

Global Ocean Observing System

GRACE

Gravity Recovery and Climate Experiment

HDF

Hierarchical Data Format

IFC

Intact Financial Corporation

IHO

International Hydrographic Organization

IMGW

Institute of Meteorology and Water Management

IOOS

Integrated Ocean Observing System

IoT

Internet of Things

IPCC

Intergovernmental Panel on Climate Change

JRC

Joint Research Centre

JSON

JavaScript Object Notation

KML

Keyhole Markup Language

LCRS

Laboratory for Climatology and Remote Sensing

LDN

Land Degradation Neutrality

LOCA

Localized Constructed Analogs

MERRA

Modern Era Retrospective-Analysis for Research and Applications

ML/AI

Machine Learning / Artificial Intelligence

MODIS

Moderate Resolution Imaging Spectroradiometer

MSDI

Marine Spatial Data Infrastructures

NASA

National Aeronautics and Space Administration

NCA4

National Climate Assessment 4

NCAR

National Center for Atmospheric Research

NDVI

Normalized Difference Vegetation Index

NDWI

Normalized Difference Water Index

NetCDF

Network Common Data Form

NOAA

National Oceanic and Atmospheric Administration

NRCan

Natural Resources Canada

OGC

Open Geospatial Consortium

OGE

Open Geospatial Engine

OMSv3

OGC Observations & Measurements 3.0

OPeNDAP

Open-source Project for a Network Data Access Protocol

OSM

OpenStreetMap

QGIS

Quantum Geographic Information System

RCI

Regional Climate Indicator

RCM

Regional Climate Model

RCP

Representative Concentration Pathway

REST

Representational State Transfer

S3

Simple Storage Service

SDG

Sustainable Development Goal

SMA

Soil Moisture Anomaly

SPEI

Standardized Precipitation Evapotranspiration Index

SPI

Standardized Precipitation Index

SQL

Structured Query Language

SR

Surface Reflectance

SSL

Secure Sockets Layer

STAC

SpatioTemporal Asset Catalog

THREDDS

Thematic Real-time Environmental Distributed Data Services

TIE

Technical Interoperability Experiments

UNFCCC

United Nations Framework Convention on Climate Change

URL

Uniform Resource Locator

USGS

United States Geological Survey

VIIRS

Visible Infrared Imaging Radiometer Suite

WCS

Web Coverage Service

WFV

Wide Field View

WG Climate

Joint Working Group on Climate

WGISS

Working Group on Information Systems and Services

WHI

Wildland-Human Interface

WHU

Wuhan University

WMS

Web Map Service

WPS

Web Processing Service

WUI

Wildland-Urban Interface

XML

Extensible Markup Language

2.  Introduction

The OGC Climate Resilience Pilot represents the first phase of multiple long term climate activities aiming to combine geospatial data, technologies, and other capabilities into valuable information for decision makers, scientists, policy makers, data providers, software developers, and service providers to assist in making valuable, informed decisions to improve climate action.

2.1.  The goal of the pilot

The goal of this pilot was to enable decision makers (scientists, city managers, politicians, etc.) to take relevant actions to address climate change and make well-informed decisions for climate change adaptation. Since no single organization has all the data needed to understand the consequences of climate change, this pilot shows how to use data from multiple organizations, available at different scales for large and small areas, in scientific processes, analytical models, and simulation environments. The aim was to demonstrate visualization and communication tools used to craft the message in the best way for any client. Many challenges can be met through resources that adhere to FAIR (Findable, Accessible, Interoperable, and Reusable) principles. The OGC Climate Resilience Pilot identifies, discusses, and develops these resources.

The goal was to help the location community develop more powerful visualization and communication tools to accurately address ongoing climate threats such as heat, drought, floods, and fires as well as supporting nationally determined targets for greenhouse gas emission reduction. Climate resilience is often considered the use case of our lifetime; the OGC community is uniquely positioned to accelerate solutions through collective problem solving with this initiative.


Figure 1 — Value chain from raw data to climate information

As illustrated, large sets of raw data from multiple sources require further processing before they can be used for analysis and climate change impact assessments. Applying data enhancement steps, such as bias adjustment, re-gridding, or the calculation of climate indicators and essential variables, creates “Decision Ready Indicators.” The spatial data infrastructures required for this integration should be designed with interoperable application packages following FAIR data principles. Heterogeneous data from multiple sources can be enhanced, adjusted, refined, or quality controlled to provide Science Services data products for Climate Resilience. The OGC Climate Resilience Pilot also illustrates the graphical exploration of the Decision Ready Indicators and effectively demonstrates how to design FAIR climate resilience information systems underpinning FAIR Climate Services. The pilot participants illustrate the necessary tools and visualizations to address climate actions moving towards climate resilience.
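As an illustration of the indicator-calculation step, a “Decision Ready Indicator” can be as simple as a count of threshold exceedances per location. The Python sketch below uses invented values and an invented 30 °C threshold; real indicators follow agreed definitions, such as the ETCCDI climate indices.

```python
# Illustrative indicator: number of hot days (daily maximum temperature
# above a threshold) per location. Data and threshold are invented.
daily_tmax = {
    "site_a": [24.1, 31.5, 33.0, 29.9, 30.2],
    "site_b": [22.0, 25.4, 28.1, 27.3, 26.0],
}

def hot_days(series, threshold=30.0):
    """Count the days whose maximum temperature exceeds the threshold."""
    return sum(1 for t in series if t > threshold)

indicator = {site: hot_days(series) for site, series in daily_tmax.items()}
print(indicator)  # {'site_a': 3, 'site_b': 0}
```

Published on a standard interface such as OGC API — Features or EDR, a table like this becomes directly usable by planners without any climate-model expertise.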

The vision of the OGC Climate Resilience Community is to support efforts on climate actions, enable international partnerships (SDG 17), and move towards global interoperable open digital infrastructures providing climate resilience information on demand by users. This pilot contributes to establishing an OGC climate resilience concept store for the community where all appropriate climate information to build climate resilience information systems as open infrastructures can be found in one place, be it information about data services, tools, software, or handbooks, or a place to discuss experiences and needs. It covers all phases of Climate Resilience from initial hazards identification and mapping, vulnerability and risk analysis, options assessments, prioritization, and planning, to implementation planning and monitoring capabilities. These major challenges can only be met through the combined efforts of many OGC members across government, industry, and academia.

2.2.  Objectives

This pilot set the stage for a series of follow-up activities and focused on use-case development, implementation, and exploration. It also answers the following questions.

  • What use-cases can be realized with the data, services, analytical functions, and visualization capabilities currently available? Current data services include, for example, the Copernicus Services, including Climate Data Store (CDS) https://cds.climate.copernicus.eu/ and Atmosphere Data Store (ADS) https://ads.atmosphere.copernicus.eu/.

  • How much effort is required to realize these use-cases?

  • What is missing, or needs to be improved, in order to transfer the use-cases developed in the pilot to other areas?

The pilot had three objectives:

  • to better understand what is currently possible with the available data and technology;

  • to determine what additional data and technology need to be developed in the future to better meet the needs of the Climate Resilience Community; and

  • to capture Best Practices and allow the Climate Community to copy and transform as many use-cases as possible to other locations or framework conditions.

2.3.  Background

With growing local communities, an increase in climate-driven disasters, and an increasing risk of future natural hazards, the demand for National Resilience Frameworks and Climate Resilience Information Systems (CRISs) cannot be overstated. CRISs enable data search, fetching, fusion, processing, and visualization; provide access to, understanding of, and use of federal data; facilitate the integration of federal and state data with local data; and serve as local information hubs for climate resilience knowledge sharing.

CRISs already exist and are operational, such as the Copernicus Climate Change Service with the Climate Data Store. CRIS architectures can be further enhanced by providing scientific methods and visualization capabilities as climate application packages. Based on FAIR principles, these application packages enable the reusability of CRIS features and capabilities. Reusability is an essential component when goals, expertise, and resources are aligned from the national to the local level. Framework conditions differ across nations, but application packages enable as much reuse of existing Best Practices, tools, data, and services as possible.

The goals and objectives of decision makers vary at different scales. At the municipal level, leaders and citizens directly face climate-related hazards. Aspects such as reducing vulnerability and risk, building resilience through local measures, or enhancing emergency response thus come into focus. At the state level, municipal efforts can be coordinated and supported by providing funding and enacting relevant policies. The national, federal, and international levels provide funding, data, and international coordination to enable the best analyses and decisions at the lower scales.

image

Figure 2 — Schematic synergies within different climate and science services FAIR and open infrastructures

Productivity and decision making are enhanced when climate application packages are exchangeable across countries, organizations, or administrative levels (see Figure 2). This OGC Climate Resilience Pilot is a contribution towards an open, multi-level infrastructure that integrates data spaces, open science, and local-to-international requirements and objectives. It contributes to the technology and governance stack that enables the integration of data including historical observations, real time sensing data, reanalyses, forecasts, and future projections. It addresses data-to-decision pipelines, data analyses, and representation, and bundles everything in climate resilience application packages. These application packages are complemented by Best Practices, guidelines, and cook-books that enable multi–stakeholder decision making for the good of society in a changing natural environment.

The OGC Innovation Program brings all of the various groups together: members of the stakeholder groups define use cases and requirements; the technologists and data providers experiment with new tools and data products in an agile development process; and the scientific community provides results in appropriate formats and enables open science by providing applications that can be parameterized and executed on demand.

Figure 3 — The OGC Climate Resilience DWG and Pilot bring the climate resilience community together with infrastructure providers, policy makers, commercial companies, and the scientific community

This OGC Climate Resilience Pilot is part of the OGC Climate Community Collaborative Solution and Innovation process, an open community process that uses OGC as the governing body for collaborative activities among all members. A spiral approach is applied to connect technology enhancements, new data products, and scientific research with community needs and framework conditions at different scales. The spiral approach defines real world use cases, identifies gaps, produces new technology and data, and tests these against the real world use cases before entering the next iteration. Evaluation and validation cycles alternate and continuously define new work tasks. These tasks include documentation and toolbox descriptions on the consumer side, and data and service offerings, interoperability, and system architecture developments on the producer side. It is emphasized that research and development is not constrained to the data provider or infrastructure side. Many tasks need to be executed on the data consumer side in parallel and then merged with advancements on the provider side in regular intervals.

Good results have been achieved using OGC API Standards in the past. For example, the remote operations on climate simulations (roocs) framework uses OGC API — Processes for subsetting data sets to reduce the data volume being transported. Other systems use the SpatioTemporal Asset Catalog (STAC) specification for metadata and data handling, or the OGC Earth Observation Exploitation Platform Best Practices for the deployment of climate application packages into CRIS architectures. Still, data handling for more complex climate impact assessments within FAIR and open infrastructures needs to be enhanced. There is no international recommendation or best practice on the usage of existing API standards within individual CRISs. It is a goal of this pilot to contribute to the development of such a recommendation, respecting existing operational CRISs already in service.

Figure 4 — Schematic architecture of a Climate Resilience Information System. By respecting FAIR principles for the climate application packages, the architecture enables open infrastructures to produce and deliver information on demand according to user needs

2.4.  Technical Challenges

Realizing the delivery of Decision Ready Data on demand to achieve Climate Resilience involves a number of technical challenges that have already been identified by the community. A subset will be selected and embedded in use-cases that will be defined jointly by Pilot Sponsors and the OGC team. The goal is to ensure a clear value-enhancement pipeline as illustrated in Figure 1, above. This includes, among other elements, a baseline of standardized operators for data reduction and analytics. These need to fit into an overall workflow that provides translation services between upstream model data and downstream output — basically from raw data to analysis-ready data to decision-ready data.

The following technical challenges have been identified and will be treated in the focus areas of the pilot.

  • Big Data Challenge: Multiple obstacles still create barriers to seamless information delivery, starting with Data Discovery. The emergence of new data platforms, processing functionalities, and products means that data discovery remains a challenge. In addition to existing solutions based on established metadata profiles and catalog services, new technologies such as the SpatioTemporal Asset Catalog (STAC) specification and open Web APIs such as OGC API — Records will be explored. Furthermore, aspects of Data Access need to be solved; here, the new OGC API suite of Web APIs for data access, subsetting, and processing is currently being used very successfully in several domains. Several code sprints have shown that server-side solutions can be realized within days and that clients can interact very quickly with these server endpoints, radically reducing development time. A promising specialized candidate for the integration of climate and non-climate data has recently been published in the form of OGC API — Environmental Data Retrieval (EDR). But which additional APIs are needed for climate data? Is the current set of OGC APIs sufficiently qualified to support the data enhancement pipeline illustrated in Figure 1? If not, what modifications and extensions need to be made available? How do OGC APIs cooperate with existing technologies such as THREDDS and OPeNDAP? Regarding the challenges of data spaces, Data Cubes were recently explored in the OGC Data Cube workshop. Ad hoc creation and embedded processing functions have been identified as essential ingredients for efficient data exploration and exchange. Is it possible to transfer these concepts to all stages of the processing pipeline? How can users scale both ways, from local, ad hoc cubes to pan-continental cubes and vice versa? How can cubes be extended as part of data fusion and data integration processes?

  • Cross-Discipline Data Integration: Different disciplines, such as Earth Observation, the various social sciences, or climate modeling, use different conceptual models in their data collection, production, and analytical processes. How can these different models be mapped? What patterns have been used to transform conceptual models into logical models and, eventually, physical models? The production of modern decision-ready information requires the integration of several data sets, including census and demographics, further social science data, transportation infrastructure, hydrography, land use, topography, and other data sets. This pilot cycle uses ‘location’ as the common denominator between these diverse data sets, which works across several data providers and scientific disciplines. In terms of Data Exchange Formats, the challenge is to know which data formats need to be supported at the various interfaces of the processing pipeline. What is the minimum constellation of required formats to cover the majority of use cases? What role do container formats play? Data Provenance is also challenging on the technical level. Many archives include data from several production cycles, such as IPCC AR5 and AR6 models. In this context, long-term support needs to be realized, along with full traceability from high-level data products back to the original raw data. Especially in the context of reliable data-based policy, clear audit trails and accountability for the data-to-information evolution must be ensured.

  • Application packages for processing pipelines: Machine Learning and Artificial Intelligence play an increasing role in the context of data science and data integration. This focus area evaluates the applicability of machine learning models in the context of the value-enhancing processing pipeline. What information needs to be provided to describe machine learning models and corresponding training data sufficiently to ensure proper usage at the various steps of the pipeline? Upcoming options to deploy ML/AI within processing APIs to enhance climate services raise new challenges, e.g., how to initiate or ingest training models and the appropriate learning extensions for the production phase of ML/AI. Heterogeneity in data spaces can be bridged with Linked Data and Data Semantics. Proper and common use of shared semantics is essential to guarantee solid value-enhancement processes. At the same time, resolvable links to procedures, sampling and data processing protocols, and the applications used will ensure transparency and traceability of decisions and actions based on data products. What level is currently supported? What infrastructure is required to support shared semantics? What governance mechanisms need to be put in place?

2.5.  Relevance to the Climate Resilience Domain Working Group

The Climate Resilience DWG will concern itself with technology and technology policy issues, focusing on geospatial information and technology interests as related to climate mitigation and adaptation, as well as the means by which those issues can be appropriately factored into the OGC standards development process.

The mission of the Climate Resilience DWG is to identify geospatial interoperability issues and challenges that impede climate action, then examine ways in which those challenges can be met through the application of existing OGC Standards, or through development of new geospatial interoperability standards under the auspices of OGC.

Activities to be undertaken by the Climate Resilience DWG include, but are not limited to:

  • identify the OGC interface standards and encodings useful to apply FAIR concepts to climate change services platforms;

  • liaise with other OGC Working Groups (WGs) to drive standards evolution;

  • promote the use of the aforementioned standards with climate change service providers and policy makers addressing international, regional, and local needs;

  • liaise with external groups working on technologies relevant to establishing ecosystems of EO Exploitation Platforms;

  • liaise with external groups working on relevant technologies;

  • publish OGC Technical Papers, Discussion Papers, or Best Practices on interoperable interfaces for climate change services; and

  • provide software tool kits to facilitate the deployment of climate change services platforms.

2.6.  Value Chain from raw data to Information

During this pilot, participants worked on a number of workflows and architectures focusing on several use cases covering floods, droughts, heatwaves, and fires. This work required Climate Resilience Information Systems in which interoperability played a vital role, enabling the seamless integration and exchange of information between data, models, and other components to produce climate information.

The value chain from raw data to climate information (Figure 1) can be clustered into sections according to value quality. This value chain, often compared to a conveyor belt, can be designed with different component workflows which are developed, analyzed, and described in this pilot. The order of the chapters of this document reflects the organization and processing of the value chain, starting from Raw Data to Datacubes (Chapter 3). Chapter 4 then describes the data refinement from raw data and datacubes to Analysis Ready Data (ARD). Various data pipelines are considered and evaluated on how best to move raw data, first to data cubes for efficient handling and then on to ARD, or to derive the ARD directly from the raw data. This guides the discussion on the standardization of Data Cubes and ARD. Subsequently, Chapter 5 illustrates how to transform ARD into Decision Ready Indicators (DRI), including an example set of climate indices. The pilot also demonstrates the added value of high-end 3D visualization combined with artificial-intelligence-enriched simulations for increasing climate resilience and facilitating the decision-making process. The use-case-driven value chain from data to visualization is described in Chapter 6. To close an important gap, a strong emphasis has been placed on Climate Information and Communication with Stakeholders in Chapter 7, outlining the importance of consultation with non-technical users to identify their requirements and optimize use-case-specific information delivery on demand. Some of the value chain elements from raw data to visualization are illustrated by use cases in Chapter 8, and the Lessons Learned (Chapter 9) showcase the pilot’s work, including challenges along the value chain from raw data to climate information. The final chapter, Chapter 10, Recommendations for future climate resilience pilots, describes future work.

3.  Raw data to Datacubes

Raw data and Datacubes are two different forms for organizing and structuring data in the context of data analysis and data warehousing.

  1. Raw Data refers to the unprocessed, unorganized, and unstructured data that is collected or generated directly from various sources. It can include a variety of forms such as text, numbers, (geo) images, audio, video, or any other form of data. Raw data often lacks formatting or context and requires further processing or manipulation before it can be effectively analyzed or used for decision-making purposes. Raw data is typically stored in databases or data storage systems.

  2. Datacubes, also known as multidimensional cubes, are a structured form of data representation that organizes and aggregates raw data into a multi-dimensional format. Datacubes are designed to facilitate efficient and fast analysis of data from different dimensions or perspectives. They are commonly used in data warehousing.

Datacubes organize data into a multi-dimensional structure typically comprising dimensions, hierarchies, and cells. Dimensions represent various attributes or factors that define the data, such as time, geography, or products. Hierarchies represent the levels of detail within each dimension. Cells typically store the aggregated data values at the intersection of dimensions.

Datacubes enable users to perform complex analytical operations like slicing, dicing, drilling down, or rolling up data across different dimensions. They provide a summarized and pre-aggregated view of data that can significantly speed query processing and analysis compared to working directly with raw data, which is very valuable for the climate resilience community. Therefore, Datacubes are often used to support decision-making processes. The example below highlights a climate resilience related example of how to create and make available Datacubes for wildfire risk analysis.
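The slicing, dicing, and roll-up operations described above can be illustrated on a small in-memory cube. The following is a minimal sketch using NumPy; the dimension names, sizes, and values are invented for illustration and do not correspond to any of the datasets discussed in this report:

```python
import numpy as np

# A tiny 3-D data cube: dimensions are time (4 steps), latitude (3), longitude (5).
# Cell values stand in for, e.g., daily mean temperature in Kelvin.
rng = np.random.default_rng(42)
cube = rng.uniform(250.0, 310.0, size=(4, 3, 5))

# "Slicing": fix one dimension to obtain a lower-dimensional view (here: one time step).
day0 = cube[0, :, :]            # shape (3, 5)

# "Dicing": select a sub-range along several dimensions at once.
subcube = cube[1:3, 0:2, 2:5]   # shape (2, 2, 3)

# "Rolling up": aggregate along a dimension (here: the time-mean per grid cell).
time_mean = cube.mean(axis=0)   # shape (3, 5)

print(day0.shape, subcube.shape, time_mean.shape)
```

The same operations generalize to higher-dimensional cubes (e.g., adding a pressure-level or band dimension); only the number of indices changes.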

3.1.  Analysis Ready Data Cubes — user-friendly sharing of climate data records

A Climate Data Record (CDR) is a time series of measurements of sufficient length, consistency, and continuity to determine potential climate variability and change (US National Research Council). These measurements can be obtained from ground-based stations or derived from long time series of satellite data.

Data Cube (different approaches): Datacubes organize data into a multi-dimensional structure. They typically contain:

  • multidimensional arrays of data (Kopp et al., 2019);

  • 4-dimensional arrays with dimensions x (longitude or easting), y (latitude or northing), time, and bands sharing the same data properties (Appel and Pebesma, 2019); and

  • the term “cube” can be a metaphor to help illustrate a data structure that can in fact be 1-dimensional, 2-dimensional, 3-dimensional, or higher-dimensional. The dimensions may be coordinates or enumerations, e.g., categories (OGC, 2021).

The common (technical) definition of the Data Cube focuses on the data structure aspects exclusively!

Analysis Ready Data (ARD) are data sets that have been processed to a minimum set of requirements and organized into a form that allows immediate analysis with a minimum of additional user effort, and that are interoperable both through time and with other datasets. ARD often represent satellite data (CEOS website).

The idea behind ARD is that data providers, such as EUMETSAT or ESA, are better suited than users to perform data pre-processing, e.g., atmospheric correction, cloud masking, and re-gridding.

Analysis Ready Data Cubes (ARDCs)
ARDCs are often made available in the form of data cubes which focus on one specific region (e.g., Swiss Data Cube) or thematic application (e.g., EUMETSAT Drought & Vegetation Data Cube (D&V Data Cube)). Data cubes can also be defined by the type of data they include (e.g., Atmospheric Composition Data Cube (ACDC), Earth System Data Cube (ESDC) + Data Analytics Toolkit (Earth System Data Lab)).
A data cube which contains various climate data records can be generally referred to as Climate Data Cube.

Table 1 — Example Climate Data Cubes

Data Cube name: EUMETSAT Drought & Vegetation Data Cube
  Climate Data Records: Solar radiation: Global Radiation, Direct Normal Solar Radiation, Sunshine Duration; other: Land Surface Temperature, Reference Evapotranspiration, NDVI, Fractional Vegetation Cover, Leaf Area Index, Fraction of Absorbed Photosynthetically Active Radiation, Soil Wetness Index (root zone), Precipitation, Air Temperature at 2 m
  Provider: EUMETSAT
  Year of release: 2021
  Data source: CMSAF SARAH2 (for solar radiation); other: LSA SAF, H SAF, GPCC, ECMWF
  Accessibility: Free after enrollment (EUMETSAT Prototype Satellite Data Cube)
  Data format: CF-compliant netCDF4 via a THREDDS server
  Temporal coverage: Solar radiation: 1983-2020; other: 2004-2020; SWI: 1992-2020; Precipitation: 1982-2020; T2m: 1979-2020

Data Cube name: mesogeos — a Daily Datacube for the Modeling and Analysis of Wildfires in the Mediterranean
  Climate Data Records: Solar radiation: mean daily surface solar radiation downwards from ERA5-Land; dynamic variables: previous-day Leaf Area Index, evapotranspiration, Land Surface Temperature, meteorological data, fire variables, and Fire Weather Index; static variables: roads density, population density, and topography layers
  Provider: one of many data cubes created within DeepCube (Horizon 2020 project “Explainable AI pipelines for big Copernicus data”)
  Year of release: 2022
  Data source: MODIS, ERA5, JRC European Drought Observatory, worldpop.org, Copernicus C3S, Copernicus EU-DEM, EFFIS
  Accessibility: Free, open code on GitHub
  Data format: .zarr (file storage format for chunked, compressed, N-dimensional arrays based on an open-source specification), Jupyter Notebooks (Python)
  Temporal coverage: 2002-2022

Data Cube name: The Earth System Data Cube (ESDC)
  Climate Data Records: Solar radiation: Surface Net Solar Radiation; other: the cube includes all important meteorological variables (the list is too long to include in this table)
  Provider: DeepESDL Team (ESA-funded project Earth System Data Lab)
  Year of release: 2022
  Data source: ERA5 (for solar radiation)
  Accessibility: Free
  Data format: directory of NetCDF files based on xcube; can also be accessed via a dedicated ESDL THREDDS server which supports OPeNDAP and WCS
  Temporal coverage: 1979-2021

Data Cube name: MADIA — Meteorological variables for agriculture (Italy)
  Climate Data Records: Solar radiation: mean of daily surface solar radiation downwards (shortwave radiation); other 10-day gridded agro-meteorological data: air temperature and humidity, precipitation, wind speed, evapotranspiration
  Provider: Council for Agricultural Research and Economics – Research Centre for Agriculture and Environment
  Year of release: 2022
  Data source: ERA5 hourly data accessed through the Climate Data Store
  Accessibility: Free
  Data format: NetCDF, CSV, and vector files (Shapefile) for administrative regions (NUTS 2 and 3)
  Temporal coverage: 1981-2021

Data Cube name: Open Environmental Data Cube
  Climate Data Records: Climate: air temperature (min, mean, max), land surface temperature (min, mean, max), precipitation (daily sum); other: natural disasters, air quality, land cover, terrain, soil, forest, and vegetation
  Provider: OpenGeoHub, CVUT Prague, mundialis, Terrasigna, MultiOne (Horizon 2020 project “Geo-harmonizer: EU-wide automated mapping system for harmonization of Open Data based on FOSS4G and Machine Learning”)
  Year of release: 2022
  Data source: ERA5 (for climate variables)
  Accessibility: Free
  Data format: WFS for vector data; Cloud Optimized GeoTIFFs for raster datasets (allowing import, subset, crop, and overlay of parts of the data for a local area)
  Temporal coverage: 2000-2020, plus predictions based on ensemble machine learning

Analysis Ready Data Cubes (ARDCs) play an important role in handling large volumes of data (such as satellite-based CDRs). They are often deployed at different spatial scales and consist of datasets dedicated to particular applications. This makes them more accessible, easier to use, and less costly for users.

3.2.  Data cubes to support wildfire risk analysis

To support the pilot activities, Ecere provided, as an in-kind contribution, a deployment of its GNOSIS Map Server implementing several OGC API standards enabling efficient access to data cubes. The API and backend functionality for these data cubes, improved throughout this pilot, also support a Wildland Fire Fuel indicator workflow for the OGC Disaster Pilot taking place until the end of September 2023. As an end goal of that Disaster Pilot, the data cube API should support machine learning predictions for classifying wildland fire fuel vegetation type from Earth Observation imagery. A number of climate datasets and wildland fire danger indices were also made accessible through that same data cube API. Additional machine learning prediction experiments may be performed based on those datasets as well.

The API and datasets were provided in the hope that they would prove useful to other participants and could be part of Technology Integration Experiments (TIEs) for the pilot and other related OGC initiatives. Mainly due to the exploratory nature of this first phase of the pilot, no successful TIEs with these resources were completed with other participants during its execution. However, these resources will remain operational, and successful TIEs involving them are expected as part of the Disaster Pilot, the OGC Testbed 19 Geo Data Cube task, and future phases of the Climate Resilience Pilot.

3.2.1.  Climate resilience data cubes

During the course of the pilot, the following datasets relevant to climate resilience were optimized and deployed at a data cube API demonstration end-point using the GNOSIS Map Server.

Table 2 — Datasets provided through GNOSIS Map Server data cube API

Data collection | Fields | Temporal interval | Temporal resolution | Spatial extent | Spatial resolution | Additional dimension | Source
ESA Sentinel-2 Level-2A | B01..B12, B8A, AOT, WVP, SCL | November 2016 to October 2022 | 5 days | Global (land only) | 10 meters | N/A | COGs and STAC catalogs on AWS
CMIP5 projections (wind speed) | Eastward and Northward wind velocity | 2016 to 2025 | daily | Global | 2.5° longitude x 2° latitude | 8 pressure levels | Copernicus Climate Data Store
CMIP5 projections (air temperature) | Air temperature | 2016 to 2025 | daily | Global | 2.5° longitude x 2° latitude | 8 pressure levels | Copernicus Climate Data Store
CMIP5 projections (geopotential height) | Geopotential height | 2016 to 2025 | daily | Global | 2.5° longitude x 2° latitude | 8 pressure levels | Copernicus Climate Data Store
CMIP5 projections on single level | Near-surface specific humidity; Precipitation; Snowfall flux; Sea level pressure; Surface downwelling shortwave radiation; Daily-mean near-surface wind speed; Average, minimum, and maximum near-surface air temperature | 2016 to 2025 | daily | Global | 2.5° longitude x 2° latitude | N/A | Copernicus Climate Data Store
ERA5 reanalysis (relative humidity) | Relative humidity | April 1 to 6, 2023 | hourly | Global | 0.25° longitude x 0.25° latitude | 37 pressure levels | Copernicus Climate Data Store
ECMWF CEMS Fire Danger indices | Burning index; Build-up index; Danger risk; Drought code; Duff moisture code; Fire danger severity rating; Energy release component; Fire danger index; Fine fuel moisture code; Forest fire weather index; Ignition component; Initial spread index; Keetch-Byram drought index; Spread component | January 2021 to July 2022 | daily | Global (except Antarctica) | 0.25° longitude x 0.25° latitude | N/A | Copernicus Climate Data Store
Fuel Vegetation Types for Continental United States | Fuel vegetation type | 2022 (no time axis) | N/A | Continental U.S. | ~20 meters | N/A | landfire.gov

Figure 5 — ESA sentinel-2 Level-2A from COGs and STAC catalogs on AWS

Figure 6 — CMIP5 projections (air temperature) from Copernicus Climate Data Store

Figure 7 — ECMWF CEMS Fire Danger indices from Copernicus Climate Data Store

Figure 8 — Fuel Vegetation Types for Continental United States from landfire.gov

3.2.2.  Overview of supported OGC API standards to access the data

The GNOSIS Map Server implements several published and candidate OGC API standards and is a certified implementation of OGC API — Features as well as OGC API — Processes. This section describes some of these supported standards and illustrates their use with requests for the climate data collections listed above.

3.2.2.1.  OGC API — Common

The OGC API standards form a complementary set of functionality for efficiently accessing data and processing resources, combined through the OGC API — Common framework. Whereas OGC API — Common — Part 1 standardizes how the API can present a landing page, describe itself, and declare conformance to specific standards, Part 2 provides a consistent mechanism to list and describe collections of geospatial data. The following Common resources are available from the GNOSIS Map Server demonstration end-point:

Table 3 — Common resources that are available from the GNOSIS Map Server

Resource | Common Part | URL
Landing page | Part 1 | https://maps.gnosis.earth/ogcapi
OpenAPI description | Part 1 | https://maps.gnosis.earth/ogcapi/api
Conformance declaration | Part 1 | https://maps.gnosis.earth/ogcapi/conformance
List of collections | Part 2 | https://maps.gnosis.earth/ogcapi/collections
Collection description | Part 2 | https://maps.gnosis.earth/ogcapi/collections/{collectionId}
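For programmatic access, the collection description resource follows the Part 2 URL template shown above. The sketch below resolves that template for one of the collections deployed on the demonstration end-point; the helper function is our own convenience, not part of any OGC API:

```python
# Base URL of the GNOSIS Map Server demonstration end-point (from the table above).
BASE = "https://maps.gnosis.earth/ogcapi"

def collection_url(collection_id: str) -> str:
    """Resolve the OGC API - Common Part 2 collection description template."""
    return f"{BASE}/collections/{collection_id}"

print(collection_url("climate:era5:relativeHumidity"))
# -> https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity
```

A client would typically GET this URL with an `Accept: application/json` header (or `?f=json`) to retrieve the collection's extent, links, and supported formats.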

In addition to the common resources standardized by Part 1 and Part 2, several API building blocks are consistently re-used across the different OGC API standards. The following table summarizes common query parameters supported by several of the data access APIs:

Table 4 — Common query parameters

Query parameter | Description | APIs
subset | For subsetting (trimming or slicing) on an arbitrary dimension | Coverages, Maps, Tiles (except for spatial dimensions), DGGS (zone query; for data retrieval: except for DGGS dimensions)
bbox | For subsetting on spatial dimensions (Features: spatial intersection) | Coverages, Maps, DGGS (zone query), Features
datetime | For subsetting on the temporal dimension (Features: temporal intersection) | Coverages, Maps, Tiles, DGGS (data retrieval: except for temporal DGGS), Features
properties | For selecting specific properties to return (range subsetting); deriving new fields (properties) using a CQL2 expression | Coverages, Tiles, DGGS, Features
filter | For filtering using a CQL2 expression | Coverages, Maps, Tiles, DGGS, Features
crs | For selecting an output coordinate reference system | Coverages, Maps, Features
bbox-crs | For specifying the coordinate reference system of the bbox parameter | Coverages, Maps, Features, DGGS
subset-crs | For specifying the coordinate reference system of the subset parameter | Coverages, Maps, DGGS
width | For specifying the width of the output (resampling) | Coverages, Maps
height | For specifying the height of the output (resampling) | Coverages, Maps

With Coverages and Maps, a spatial area of interest can be specified using either, e.g., bbox=10,20,30,40 or subset=Lat(20:40),Lon(10:30).

For temporal datasets, a specific time can be requested using, e.g., datetime=2022-03-01 or subset=time("2022-03-01").

For the data cubes with multiple pressure levels, the pressure dimension is defined and can be used with the subset query parameter with all of the data access OGC API standards (Coverages, Tiles, DGGS and Maps), e.g., subset=pressure(500).
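The query parameters above combine into a single request URL. The following sketch builds (but does not send) such URLs using only the Python standard library; the helper function and the particular parameter combinations are illustrative assumptions based on the examples above:

```python
from urllib.parse import urlencode

# Base URL of the GNOSIS Map Server demonstration end-point.
BASE = "https://maps.gnosis.earth/ogcapi"

def coverage_request(collection_id: str, **params: str) -> str:
    """Build (but do not send) a Coverages data request URL for a collection."""
    return f"{BASE}/collections/{collection_id}/coverage?" + urlencode(params)

# Spatial area of interest: the two forms below select the same area.
url_bbox = coverage_request("climate:cmip5:singlePressure", bbox="10,20,30,40")
url_subset = coverage_request("climate:cmip5:singlePressure",
                              subset="Lat(20:40),Lon(10:30)")

# Temporal slice plus a pressure-level slice for a multi-level data cube.
url_slice = coverage_request("climate:era5:relativeHumidity",
                             datetime="2023-04-01",
                             subset="pressure(500)")
print(url_bbox)
print(url_slice)
```

Note that `urlencode` percent-encodes the commas, colons, and parentheses in the parameter values, which the server accepts equivalently to the literal forms shown in the text.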

3.2.2.2.  OGC API — Coverages

The OGC API — Coverages candidate Standard is a simple API defining fundamental functionality to retrieve data for arbitrary fields, areas, times, and resolutions of interest from a data cube.

The main resource to retrieve data using the Coverages API is located at /collections/{collectionId}/coverage for each data collection. This resource supports a number of query parameters defined by optional requirements classes and extensions supported by the GNOSIS Map Server.

Table 5 — Supported query parameters defined by optional requirements classes and extensions supported by the GNOSIS Map Server

Query parameter | Description | Requirements class
subset | For subsetting (trimming or slicing) on an arbitrary dimension | Subsetting
bbox | For subsetting on spatial dimensions | Subsetting
datetime | For subsetting on the temporal dimension | Subsetting
scale-factor | For resampling using the same factor for all dimensions (1: no resampling, 2: 2x downsampling) | Scaling (resampling)
scale-axes | For resampling using a specific factor for individual dimensions | Scaling (resampling)
scale-size | For resampling by specifying the expected number of cells for each dimension | Scaling (resampling)
width | For specifying the width of the output (resampling) | Scaling (resampling)
height | For specifying the height of the output (resampling) | Scaling (resampling)
properties | For selecting specific properties to return (range subsetting); deriving new fields using a CQL2 expression | Range subsetting; Derived fields extension
filter | For filtering using a CQL2 expression | Range filtering extension
crs | For selecting an output coordinate reference system | CRS extension
bbox-crs | For specifying the coordinate reference system of the bbox parameter | CRS extension
subset-crs | For specifying the coordinate reference system of the subset parameter | CRS extension

The Coverages draft currently also specifies a DomainSet JSON object, linked from the collection description using the [ogc-rel:coverage-domainset] link relation, which may be included either within the collection description itself or at a dedicated resource (/collections/{collectionId}/coverage/domainset). The schema for this DomainSet object describes the domain of the coverage (the extent and resolution of its dimensions / axes) and follows the Coverages Implementation Schema (CIS) 1.1.1. An example of such a domain set resource can be found at https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:byPressureLevel:windSpeed/coverage/domainset?f=json.

At the time of writing this report, discussions are underway to potentially simplify the API by fully describing the domain directly within the collection description resource, using uniform additional dimensions as well as the grid property inside the extent property, which can describe both regular and irregular grids, removing the need for this extra resource. For example, see the collection description for the CMIP5 single pressure level data and its corresponding CIS domain set resource.
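A client working with such a domain set can read the axis definitions out of the CIS JSON. The sketch below operates on an invented, heavily simplified DomainSet fragment that only loosely follows the CIS JSON encoding; the real resource linked above carries additional detail, so the structure and values here are illustrative assumptions:

```python
def axis_summary(domainset: dict) -> dict:
    """Map each axis label of a (simplified) CIS DomainSet to its bounds."""
    grid = domainset["generalGrid"]
    return {a["axisLabel"]: (a["lowerBound"], a["upperBound"])
            for a in grid["axis"]}

# Simplified example fragment with invented values (not fetched from the server).
example = {
    "type": "DomainSet",
    "generalGrid": {
        "axisLabels": ["Lat", "Lon", "time"],
        "axis": [
            {"type": "RegularAxis", "axisLabel": "Lat",
             "lowerBound": -90, "upperBound": 90, "resolution": 2.0},
            {"type": "RegularAxis", "axisLabel": "Lon",
             "lowerBound": -180, "upperBound": 180, "resolution": 2.5},
            {"type": "RegularAxis", "axisLabel": "time",
             "lowerBound": "2016-01-01", "upperBound": "2025-12-31",
             "resolution": "P1D"},
        ],
    },
}

print(axis_summary(example))
```

The same pattern extends to additional dimensions such as pressure levels; a client can use the recovered bounds to validate subset parameters before issuing a request.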

The Coverages draft currently also specifies a RangeType JSON object, linked from the collection description using the [ogc-rel:coverage-rangetype] link relation, which may be included either within the collection description itself or at a dedicated resource (/collections/{collectionId}/coverage/rangetype). The schema for this RangeType object describes the range type of the coverage (the structure, semantics, and data types of its fields) and follows the Coverages Implementation Schema (CIS) 1.1.1. An example of such a range type resource can be found at https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:byPressureLevel:windSpeed/coverage/rangetype?f=json. It might also be possible to describe the range type in a common way across the different OGC APIs using a JSON schema with semantic annotations, as per the work undertaken for OGC API — Features — Part 5: Schemas.

A Coverage Tiles requirements class is defined in OGC API — Coverages, leveraging the OGC API — Tiles standard while clarifying requirements for coverage tile responses. Examples of coverage tile requests are described below in the OGC API — Tiles section.

At the moment, the GNOSIS Map Server implementation of Coverages is limited to the following 2D (spatial dimensions) output formats:

  • GeoTIFF (multiple fields, two-dimensional); and

  • PNG (single field, 16-bit output, currently using fixed scale (2.98) and offset (16384) modifiers).

Support for n-dimensional output formats, including netCDF, CIS JSON, and eventually CoverageJSON, is planned. In the meantime, for coverages with more than two dimensions, a specific time and/or pressure slice must be selected, so retrieving a range of times or pressure levels currently requires separate API requests.
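Because each slice currently requires its own request, a client can simply iterate over the desired values. The following is a minimal sketch (Python standard library only; the endpoint and the datetime parameter come from the examples in this section, while the helper function itself is illustrative):

```python
from urllib.parse import urlencode

BASE = "https://maps.gnosis.earth/ogcapi"

def coverage_url(collection, fmt="geotiff", **params):
    """Build an OGC API - Coverages request URL for a single slice."""
    query = urlencode({"f": fmt, **params})
    return f"{BASE}/collections/{collection}/coverage?{query}"

# One GeoTIFF request per daily time slice (n-dimensional output
# formats are not yet supported, so each slice is fetched separately).
urls = [
    coverage_url("climate:cmip5:singlePressure", datetime=day)
    for day in ("2020-05-20", "2020-05-21", "2020-05-22")
]
# Each URL could then be fetched, e.g., with urllib.request.urlopen(url).
```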

Some example coverage requests:

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/coverage?f=geotiff&properties=tas,tasmax,tasmin,pr,psl&subset=Lat(-90:90),Lon(0:180)&width=400&datetime=2020-05-20 (GeoTIFF coverage with one band for each of the five fields)

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/coverage?f=geotiff&subset=pressure(750) (GeoTIFF Coverage)

Figure 9 — Coverage request for CMIP5 maximum daily temperature

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/coverage?f=png&properties=(tasmax-250)*400

3.2.2.3.  OGC API — Maps

The OGC API — Maps candidate Standard defines the ability to retrieve a visual representation of geospatial data. The main resource to retrieve data using the Maps API is located at /collections/{collectionId}/map for each data collection. This resource supports a number of query parameters defined by optional requirements classes and extensions supported by the GNOSIS Map Server.

Table 6 — Supported query parameters defined by optional requirements classes and extensions supported by the GNOSIS Map Server

Query parameter | Description | Requirements class
bbox | For subsetting on spatial dimensions | Spatial Subsetting
bbox-crs | For specifying the coordinate reference system of the bbox parameter | Spatial Subsetting
subset | For subsetting (trimming or slicing) on an arbitrary dimension | Spatial/Temporal/General Subsetting
subset-crs | For specifying the coordinate reference system of the subset parameter | Spatial/Temporal/General Subsetting
datetime | For subsetting on the temporal dimension | Temporal Subsetting
width | For specifying the width of the output (resampling) | Scaling (resampling)
height | For specifying the height of the output (resampling) | Scaling (resampling)
crs | For selecting an output coordinate reference system | CRS
bgcolor | For specifying the color of the background | Background
transparent | For specifying whether the background should be transparent | Background
filter | For filtering using a CQL2 expression | Filtering extension

Some example map requests:

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/map?width=2048&subset=pressure(750)&bgcolor=0x002040

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:byPressureLevel:windSpeed/map?subset=pressure(850)&width=1024

NOTE:    Proper symbolization for this wind velocity map (above request) would require support for wind barbs. In the meantime, the Eastward and Northward velocities are assigned to the green and blue color channels.

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:byPressureLevel:temperature/map?subset=pressure(850)

Figure 10 — Sentinel-2 map (natural color)

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/map?subset=Lat(-16.259765625:-16.2158203125),Lon(124.4091796875:124.453125)&datetime=2022-06-28

Some example map requests for a specific style, in conjunction with OGC API — Styles:

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/styles/precipitation/map?datetime=2022-09-04

Figure 11 — Sentinel-2 map for NDVI style

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/styles/ndvi/map?subset=Lat(-16.259765625:-16.2158203125),Lon(124.4091796875:124.453125)&datetime=2022-04-28

Figure 12 — Sentinel-2 map for Scene Classification Map style

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/styles/scl/map?subset=Lat(-16.259765625:-16.2158203125),Lon(124.4091796875:124.453125)&datetime=2022-06-28

A Map Tilesets requirements class is defined in OGC API — Maps, leveraging the OGC API — Tiles Standard while clarifying requirements for map tile responses. Examples of map tile requests are described below in the OGC API — Tiles section.

3.2.2.4.  OGC API — Tiles

The OGC API — Tiles Standard defines the ability to retrieve geospatial data as tiles based on the OGC 2D Tile Matrix Set and Tileset Metadata Standard, originally defined as part of the Web Map Tile Service (WMTS) Standard. Unlike WMTS, which focused strictly on pre-rendered or server-side rendered Map tiles, the Tiles API was designed to also enable the use of data tiles such as Coverages Tiles and Vector Tiles which can be styled, rendered, and used for data analytics performed on the client side. Using pre-determined partitioning schemes facilitates caching for both servers and clients, resulting in more responsive dynamic maps.

The following Tiles API resources are defined:

Table 7 — Tiles API resources

Resource | Requirements Class | Description
…​/tiles | Tilesets list | List of available tilesets
…​/tiles/{tileMatrixSetId} | Tileset | Description of the tileset and link to the 2D Tile Matrix Set definition
…​/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol} | Core | Tiles for a given 2D Tile Matrix Set and tile matrix/row/column

The GNOSIS Map Server supports a number of 2D Tile Matrix Sets for all of the collections it hosts, including the GNOSISGlobalGrid, ISEA9Diamonds, WebMercatorQuad, and WorldCRS84Quad tile matrix sets used in the example requests below.
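For quadtree tile matrix sets such as WebMercatorQuad, a client can compute which tile covers a given coordinate using the standard Web Mercator tile arithmetic. The following sketch (Python; the formula is the well-known slippy-map computation and is not specific to the GNOSIS Map Server) returns the {tileRow}/{tileCol} pair used in the …​/tiles/{tileMatrixSetId}/{tileMatrix}/{tileRow}/{tileCol} resource:

```python
import math

def web_mercator_tile(lat, lon, level):
    """Return (tileRow, tileCol) of the WebMercatorQuad tile
    containing (lat, lon) at the given tile matrix (zoom) level."""
    n = 2 ** level
    col = int((lon + 180.0) / 360.0 * n)
    lat_rad = math.radians(lat)
    row = int((1.0 - math.asinh(math.tan(lat_rad)) / math.pi) / 2.0 * n)
    # Clamp to the valid range to handle the latitude/longitude extremes.
    return (min(max(row, 0), n - 1), min(max(col, 0), n - 1))

# Tile containing (45N, 90W) at level 1, usable in a request such as
# .../coverage/tiles/WebMercatorQuad/1/{row}/{col}
row, col = web_mercator_tile(45.0, -90.0, 1)
```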

3.2.2.4.1.  Coverage Tiles

The GNOSIS Map Server currently supports the following coverage tile formats:

  • GNOSIS Map Tiles (multiple fields, n-dimensional);

  • GeoTIFF (multiple fields, two-dimensional); and

  • PNG (single field, 16-bit value using fixed scale (2.98) and offset (16384) modifiers).

Support is planned for netCDF, CIS JSON, and eventually CoverageJSON, as well as additional formats.

Example coverage tile queries:

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/GNOSISGlobalGrid/3/4/17

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/ISEA9Diamonds/4/373/288

To request a different Sentinel-2 band than the default RGB bands (B04, B03, B02):

Figure 13 — Sentinel-2 PNG coverage tile for band 08 (near infra-red)

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/GNOSISGlobalGrid/3/4/17?properties=B08&f=png

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/ISEA9Diamonds/4/373/288?properties=B08

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/coverage/tiles/WebMercatorQuad/1/1/0?f=geotiff&datetime=2022-09-04 (GeoTIFF coverage tile)

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/coverage/tiles/WorldCRS84Quad/0/0/0?f=geotiff&subset=pressure(750) (GeoTIFF coverage tile)

3.2.2.5.  OGC Common Query Language (CQL2)

The OGC Common Query Language, abbreviated CQL2, allows the user to define query expressions. Although introduced as a language to specify a Boolean predicate for OGC API — Features — Part 3: Filtering, the language is easily extended to additional use cases, such as filtering the range set of a coverage request, or deriving new fields using expressions (which can return non-Boolean values), including performing coverage band arithmetic such as calculating vegetation indices.

Support for CQL2 in the filter parameter is implemented in the GNOSIS Map Server for Coverages, Features, Maps, Tiles, and DGGS. For example, filter=tasmax>300 requests all data from the CMIP5 single pressure level collection where the maximum daily temperature is greater than 300 kelvins (unmatched cells are replaced by NODATA values).

Support for CQL2 in the properties parameter is currently implemented for Coverages, Tiles and DGGS. For example, the pr precipitation property can be multiplied by a factor of one thousand using properties=pr*1000.
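Because CQL2 expressions contain characters such as > and * that are not safe in a raw query string, clients normally percent-encode them (as in the r%3E20 example further below). A minimal sketch (Python standard library; the collection name and the filter/properties values mirror the examples in this subsection, while the helper is illustrative):

```python
from urllib.parse import urlencode

BASE = "https://maps.gnosis.earth/ogcapi"

def collection_query_url(collection, resource="coverage", **params):
    """Build a request URL, percent-encoding CQL2 expressions so that
    characters such as '>' and '*' are safe in the query string."""
    query = urlencode(params)
    return f"{BASE}/collections/{collection}/{resource}?{query}"

# filter: keep only cells where the maximum daily temperature > 300 K
url = collection_query_url("climate:cmip5:singlePressure",
                           **{"filter": "tasmax>300"})

# properties: scale the precipitation field by one thousand
url2 = collection_query_url("climate:cmip5:singlePressure",
                            properties="pr*1000")
```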

Using a CQL2 expression to filter out the clouds in a map tile:

Figure 17 — Sentinel-2 map tile filtered by Scene Classification Layer to remove clouds (a longer time interval with fewer clouds would be necessary to complete the mosaic)

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/map/tiles/GNOSISGlobalGrid/3/4/17?filter=SCL<8 or SCL>10

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/map/tiles/ISEA9Diamonds/4/373/288?filter=SCL<8 or SCL>10

Using a CQL2 expression in coverage tile requests to perform band arithmetic computing NDVI:

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/GNOSISGlobalGrid/3/4/17?properties=(B08/10000-B04/10000)/(B08/10000+B04/10000)

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/ISEA9Diamonds/4/373/288?properties=(B08/10000-B04/10000)/(B08/10000+B04/10000)

Figure 18 — Coverage tile request from sentinel-2 computing NDVI

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/coverage/tiles/GNOSISGlobalGrid/3/4/17?properties=(B08/10000-B04/10000)/(B08/10000+B04/10000)*10000=png
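The NDVI expression used in the requests above can also be assembled programmatically. A minimal sketch (Python standard library; the band names B08/B04, the 10000 reflectance scale, and the tile coordinates come from the requests above, while the helper function is illustrative):

```python
from urllib.parse import quote

BASE = "https://maps.gnosis.earth/ogcapi"

def ndvi_tile_url(collection, tms, level, row, col, scale=10000):
    """Build a coverage-tile URL whose 'properties' parameter computes
    NDVI = (NIR - Red) / (NIR + Red) from reflectance bands B08 and B04."""
    expr = f"(B08/{scale}-B04/{scale})/(B08/{scale}+B04/{scale})"
    path = (f"{BASE}/collections/{collection}"
            f"/coverage/tiles/{tms}/{level}/{row}/{col}")
    # Percent-encode the expression; '/' is left intact by default.
    return f"{path}?properties={quote(expr)}"

url = ndvi_tile_url("sentinel2-l2a", "GNOSISGlobalGrid", 3, 4, 17)
```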

Using CQL2 expressions in a coverage request to scale the relative humidity and return only values above a threshold (r > 20):

Figure 19 — Coverage request from relative humidity coverage multiplying r by 200 and returning only values where r > 20

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/coverage?f=png&subset=pressure(750),Lat(-90:90),Lon(0:180),time(%222023-04-03%22)&properties=r*200&filter=r%3E20

3.2.2.6.  OGC API — Discrete Global Grid Systems

The OGC API — DGGS candidate Standard allows retrieving data and performing spatial queries based on hierarchical multi-resolution discrete grids covering the entirety of the Earth. There are three main requirements classes for this candidate Standard:

  • Core (DGGS definition and zone information resource);

  • Zone Data Retrieval (What is here?); and

  • Zones Query (Where is it?).

The following DGGS API resources are defined:

Table 8 — DGGS API resources

Resource | Requirements Class | Description
…​/dggs | Core | List of available DGGSs
…​/dggs/{dggsId} | Core | Description and link to definition of a specific DGGS
…​/dggs/{dggsId}/zones | Zone Query | For retrieving the list of zones matching a collection and/or query
…​/dggs/{dggsId}/zones/{zoneId} | Core | For retrieving information about a specific zone
…​/dggs/{dggsId}/zones/{zoneId}/data | Data Retrieval | For retrieving data for a specific zone

DGGS API requests imply the use of a particular grid, understood by both the client and the server, associated with the {dggsId} of the resource on which the request is performed. Several different discrete global grids have been defined. The GNOSIS Map Server currently supports two discrete global grids:

  • the GNOSIS Global Grid, based on the 2D Tile Matrix Set of the same name defined in the EPSG:4326 geographic CRS, axis-aligned with latitude and longitude, and using variable width tile matrices to approach equal area (maximum variation is ~48% up to a very detailed zoom level); and

  • the ISEA9R (Icosahedral Snyder Equal Area aperture 9 Rhombus) grid, a dual DGGS of ISEA3H (aperture 3 hexagonal) for its even levels, using rhombuses/diamonds which, compared to hexagons, are much simpler to index and for which it is much easier to encode data in rectilinear formats such as GeoTIFF. The area values of ISEA3H hexagons can be transported as points on the rhombus vertices for those ISEA3H even levels. The ISEA9R grid is also axis-aligned to a CRS defined by rotating and skewing the ISEA projection, also allowing the definition of a 2D Tile Matrix Set for it.

A client will normally opt to use OGC API — DGGS when it shares with the server an understanding and internal use of the same grid. However, for an axis-aligned DGGS that can be represented as a 2D Tile Matrix Set, OGC API — Tiles can also be used to retrieve data for specific zones. The DGGS API enables zone data retrieval for other DGGSs which are not axis-aligned or whose zone geometry makes that impossible (e.g., hexagons). Another important use of the DGGS API is the ability to efficiently retrieve the results of a spatial query (e.g., using CQL2) in the form of a compacted list of zone IDs.

3.2.2.6.1.  Core

The core requirements class defines requirements for listing available DGGS, describing each of them, and providing information for individual zones.

In the GNOSIS Map Server implementation of the zone information resource, since both supported DGGS also correspond to a 2D Tile Matrix Set, the Level, Row, and Column for the equivalent OGC API — Tiles request is displayed on the information page, as can be seen below.

For the DGGS {zoneId}, the level, row, and column are encoded differently in a compact hexadecimal identifier.
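The exact encoding of the identifier is implementation-defined. Based on the GNOSIS Global Grid identifiers in the examples below (e.g., zone 3-4-11 corresponding to the level/row/column tile request 3/4/17), one plausible reading is a decimal level followed by hexadecimal row and column; the ISEA9Diamonds identifiers such as A7-0 follow a different pattern. A sketch under that assumption (Python; the format is inferred from the examples, not taken from a specification):

```python
def parse_zone_id(zone_id):
    """Split a GNOSIS Global Grid zone identifier into (level, row, col),
    assuming a decimal level and hexadecimal row/column (an inference
    from the examples in this report, not a documented format)."""
    level, row, col = zone_id.split("-")
    return int(level), int(row, 16), int(col, 16)

def format_zone_id(level, row, col):
    """Inverse of parse_zone_id, using uppercase hex without padding."""
    return f"{level}-{row:X}-{col:X}"

# Zone 3-4-11 would then match the tile request .../GNOSISGlobalGrid/3/4/17
level, row, col = parse_zone_id("3-4-11")
```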

Some example zone information requests:

Figure 20 — GNOSIS Map Server information resource for GNOSIS Global Grid zone 5-24-6E

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/dggs/GNOSISGlobalGrid/zones/5-24-6E

Figure 21 — GNOSIS Map Server information resource for ISEA9Diamonds zone A7-0

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/dggs/ISEA9Diamonds/zones/A7-0

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/dggs/ISEA9Diamonds/zones/E7-FAE

3.2.2.6.2.  Zone Data Retrieval: What is here?

The Zone Data Retrieval requirements class allows the retrieval of data for a specific DGGS zone. For axis-aligned DGGSs whose zone geometry can be described by a 2D Tile Matrix Set, such as the GNOSISGlobalGrid, ISEA9R, or rHealPix, this capability is equivalent to Coverage Tiles requests for the corresponding TileMatrixSets. This requirements class supports returning data for zones whose geometries are of an arbitrary shape, e.g., hexagonal or triangular.

The zone data retrieval resource is …​/dggs/{dggsId}/zones/{zoneId}/data, for which the GNOSIS Map Server supports a number of query parameters:

Table 9 — Query parameters supported by the GNOSIS Map Server

Query parameter | Description
filter | For filtering data within the response using a CQL2 expression
properties | For selecting specific properties to return (range subsetting) or deriving new fields using a CQL2 expression
datetime | For subsetting on the temporal dimension
subset | For subsetting (trimming or slicing) on an arbitrary dimension (besides the DGGS dimensions)
subset-crs | For specifying the coordinate reference system of the subset parameter
zone-depth | For specifying zone depths to return relative to the requested zone (0 corresponding to a single set of values for the zone itself)

Some example data retrieval queries:

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/dggs/GNOSISGlobalGrid/zones/3-4-11/data

https://maps.gnosis.earth/ogcapi/collections/sentinel2-l2a/dggs/ISEA9Diamonds/zones/E7-FAE/data

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/dggs/GNOSISGlobalGrid/zones/0-0-3/data?f=geotiff&datetime=2023-04-03

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/dggs/GNOSISGlobalGrid/zones/0-0-3/data?f=geotiff&datetime=2022-09-04

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/dggs/ISEA9Diamonds/zones/A7-0/data?f=geotiff&datetime=2023-04-03

https://maps.gnosis.earth/ogcapi/collections/climate:cmip5:singlePressure/dggs/ISEA9Diamonds/zones/A7-0/data?f=geotiff&datetime=2022-09-04

3.2.2.6.3.  Zone Queries: Where is it?

The Zone Query requirements class allows the efficient retrieval of the results of a spatial query in the form of a compact list of zone IDs. The list can be compacted (the default) by replacing child zones with their parent when all children of that parent are part of the result set. The zone query resource is …​/dggs/{dggsId}/zones, for which the GNOSIS Map Server supports a number of query parameters:

Table 10 — Zone query parameters supported by the GNOSIS Map Server

Query parameter | Description
zone-level | For specifying the desired zone hierarchy level for the resulting list of zone IDs
compact-zones | For specifying whether to return a compact list of zones (defaults to true)
filter | For filtering using a CQL2 expression
datetime | For subsetting on the temporal dimension
bbox | For subsetting on spatial dimensions
bbox-crs | For specifying the coordinate reference system of the bbox parameter
subset | For subsetting (trimming or slicing) on an arbitrary dimension
subset-crs | For specifying the coordinate reference system of the subset parameter

By acting as a kind of mask at a specifically requested resolution level, DGGS zone queries can greatly help parallelize and orchestrate spatial queries combining multiple datasets across multiple services, enabling early optimizations with lazy evaluation.

NOTE:    There are currently some limitations to the GNOSIS Map Server implementation of the Zones Query requirements class.

Examples of zone queries:

Where is the relative humidity at 850 hPa greater than 80% on April 3, 2023? (at the precision level of GNOSIS Global Grid level 6)

(using the default compact-zones=true, where child zones are replaced by their parent zone when all of its children are included)

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/dggs/GNOSISGlobalGrid/zones?subset=pressure(850)&datetime=2023-04-03&filter=r%3E80&zone-level=6&f=json (Plain Zone ID list output)

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/dggs/GNOSISGlobalGrid/zones?subset=pressure(850)&datetime=2023-04-03&filter=r%3E80&zone-level=6&f=uint64 (Binary 64-bit integer Zone IDs)

https://maps.gnosis.earth/ogcapi/collections/climate:era5:relativeHumidity/dggs/GNOSISGlobalGrid/zones?subset=pressure(850)&datetime=2023-04-03&filter=r%3E80&zone-level=6&f=geotiff (GeoTIFF output)
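The compaction applied in these responses (replacing all children of a parent with the parent itself when every child is in the result set) can be sketched for a hypothetical uniform quadtree hierarchy. The actual grids are more involved (the GNOSIS Global Grid uses variable-width tile matrices and ISEA9R is aperture 9), so the parent/child relation below is purely illustrative:

```python
def compact_zones(zones):
    """Compact a set of (level, row, col) zones by repeatedly replacing
    all four children of a parent with the parent itself.
    Assumes a uniform aperture-4 (quadtree) hierarchy, a simplification
    of the actual DGGS grids."""
    zones = set(zones)
    changed = True
    while changed:
        changed = False
        for level, row, col in sorted(zones, reverse=True):
            if level == 0:
                continue  # top-level zones have no parent
            parent = (level - 1, row // 2, col // 2)
            children = {(level, 2 * parent[1] + dr, 2 * parent[2] + dc)
                        for dr in (0, 1) for dc in (0, 1)}
            if children <= zones:
                zones -= children
                zones.add(parent)
                changed = True
                break  # restart iteration over the mutated set
    return sorted(zones)

# All four children of zone (0, 0, 0) collapse into their parent:
result = compact_zones([(1, 0, 0), (1, 0, 1), (1, 1, 0), (1, 1, 1)])
```

Note that compaction can cascade upward: once a full set of children collapses into a parent, that parent may in turn complete its own sibling set.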