I. Abstract
This OGC Testbed-18 (TB-18) Engineering Report (ER) builds on previous OGC Moving Features and Sensor Integration (MFSI) activities. The OGC TB-18 MFSI task addressed interoperability between sensors and between sensing systems, as well as the exchange of moving objects detected by multiple sources into one common analytic client. This ER describes the architecture framework for multi-source moving object detection into the client supported by OGC MFSI Standards and describes the challenges of multi-sensor integration in the context of Moving Features data.
II. Executive Summary
The OGC Testbed-18 initiative aimed to explore six tasks, advancing interoperability for: Building Energy; Secure Asynchronous Catalogs; Identifiers for Reproducible Science; Moving Features and Sensor Integration (MFSI); 3D+ Data Standards and Streaming; and Machine Learning (ML) Training Data (TD).
The goal of the MFSI task was to define a powerful Application Programming Interface (API) for discovery, access, and exchange of moving features and their corresponding tracks and to exercise this API in a near real-time scenario. The MFSI task considered these advancements by addressing the following two application scenarios.
- An extension of the previous work on the Federated Marine Spatial Data Infrastructure (FMSDI) project.
- The monitoring of hurricanes and Saildrones.
This Engineering Report (ER) represents deliverable D020 of the OGC Testbed-18 Moving Features and Sensor Integration task. The ER explores the architecture for collaborative distributed object detection and analysis of multi-source motion imagery.
The ER begins with an overview of Testbed-17 work. The overview is followed by an explanation of Moving Features and Sensor Integration in the context of the task requirements and use case scenarios. Then, the individual Testbed participant deliverables are reviewed in terms of data, architecture, functions, and results. Finally, the Technical Integration Experiment (TIE) results, future work, and lessons learned are summarized.
III. Keywords
The following are keywords to be used by search engines and document catalogues.
Moving Features, Sensor hub, ingestion service, hurricane tracking, Actuator, Trajectory
IV. Security considerations
No security considerations have been made for this document.
V. Submitting Organizations
The following organizations submitted this Document to the Open Geospatial Consortium (OGC):
- Blue Monocle, Inc.
VI. Submitters
All questions regarding this document should be directed to the editor or the contributors:
Name | Organization | Role |
---|---|---|
Brittany Eaton | Blue Monocle, Inc. | Editor |
Logan Stark | Blue Monocle, Inc. | Contributor |
Gianpiero Maiello | Superelectric | Contributor |
Alex Robin | Botts Innovative Research | Contributor |
Zhining Gu | Arizona State University | Contributor |
Glenn Laughlin | Pelagis | Contributor |
Rob Smith | Away Team | Contributor |
Sara Saeedi | OGC | Contributor |
Testbed-18: Moving Features Engineering Report
1. Scope
This Testbed-18 Engineering Report (ER) begins with an introduction to previous Testbed-17 work on Moving Features, followed by an explanation of Moving Features and Sensor Integration in the context of the task requirements and the hurricane and vessel tracking use case scenario. Then, the individual Testbed participant deliverables are reviewed in terms of data, architecture, functions, and results. The moving features ingestion services from two different participants are explored, as is the sensor hub, and then the client is discussed in the context of three participants. Additionally, a chapter discussing orientation analysis of geotagged video for road network use case scenarios from a TB-18 observer participant is included. Finally, the summary reviews the Technology Integration Experiments (TIEs), future work, and lessons learned.
2. Normative references
The following documents are referred to in the text in such a way that some or all of their content constitutes requirements of this document. For dated references, only the edition cited applies. For undated references, the latest edition of the referenced document (including any amendments) applies.
Kyoung-Sook KIM, Nobuhiro ISHIMARU: OGC 19-045r3, Moving Features Encoding Extension — JSON, 2019 https://docs.ogc.org/is/19-045r3/19-045r3.html
Open API Initiative: OpenAPI Specification 3.0.2, 2018 https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.2.md
van den Brink, L., Portele, C., Vretanos, P.: OGC 10-100r3, Geography Markup Language (GML) Simple Features Profile, 2012 http://portal.opengeospatial.org/files/?artifact_id=42729
W3C: HTML5, W3C Recommendation, 2019 http://www.w3.org/TR/html5/
Schema.org: http://schema.org/docs/schemas.html
R. Fielding, J. Gettys, J. Mogul, H. Frystyk, L. Masinter, P. Leach, T. Berners-Lee: IETF RFC 2616, Hypertext Transfer Protocol — HTTP/1.1. RFC Publisher (1999). https://www.rfc-editor.org/info/rfc2616.
E. Rescorla: IETF RFC 2818, HTTP Over TLS. RFC Publisher (2000). https://www.rfc-editor.org/info/rfc2818.
G. Klyne, C. Newman: IETF RFC 3339, Date and Time on the Internet: Timestamps. RFC Publisher (2002). https://www.rfc-editor.org/info/rfc3339.
M. Nottingham: IETF RFC 8288, Web Linking. RFC Publisher (2017). https://www.rfc-editor.org/info/rfc8288.
H. Butler, M. Daly, A. Doyle, S. Gillies, S. Hagen, T. Schaub: IETF RFC 7946, The GeoJSON Format. RFC Publisher (2016). https://www.rfc-editor.org/info/rfc7946.
3. Foreword
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights.
Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.
4. Introduction
Testbed 17 (TB-17) successfully tracked moving objects by combining data from multiple sensors mounted on a moving platform. Matching object trajectories synchronized to motion imagery data enabled visual identification of the tracked object for more accurate discrimination of observations. Calculating movement attributes, such as speed, which can be computed even from short tracks of one-second duration, increased confidence in correct identification by confirming that values were in the expected range for that object type. This work was demonstrated using two real-time situational awareness scenarios: the detection and tracking of moving buses in front of a school; and an autonomous vehicle use case analyzed with WebVMT to detect and track people and vehicles moving nearby. WebVMT is an enabling technology whose main use is marking up external map track resources in connection with the HTML <track> element. WebVMT files provide map presentation and annotation synchronized to video content, including interpolation, and more generally any form of geolocation data that is time-aligned with audio or video content.
In the OGC Moving Features Standards, a ‘feature’ is defined as an abstraction of real-world phenomena [ISO 19109:2015], whereas a ‘moving feature’ is defined as a representation, using a local origin and local ordinate vectors, of a geometric object at a given reference time [adapted from ISO 19141:2008]. The goal of this ER is to demonstrate the business value of moving features, which play an essential role in hurricane and ship tracking scenarios as well as other use cases.
Moving Features are a vital part of the future of the Internet of Things (IoT). The fast-paced growth of digital motion imagery and advancements in machine learning technology continue to accelerate widespread use of moving feature detection and analysis systems. The overall Testbed-18 MF goal was to define a powerful Application Programming Interface (API) for discovery, access, and exchange of moving features and their corresponding tracks and to exercise this API in a near real-time scenario. The OGC Testbed-18 MF task considered these advancements by addressing three application scenarios.
The first scenario is an extension of the previous work completed as part of the Marine Data Working Group (DWG) Federated Marine Spatial Data Infrastructure (FMSDI) project. In this scenario, daily aggregated vessel traffic from Denmark published to the Automatic Identification System (AIS) Vessel Traffic system was used. This use case focused on using the vessel traffic to understand the positional relationships of ships to Marine Protected Areas (MPAs) as defined by the IHO S-122 standard. Of interest is identifying those vessels that altered their path away from an MPA when not required, based on vessel class type and the MPA restrictions. A related use case is ‘stop detection’ of ocean vessels to identify those vessels that slowed their course through an MPA, possibly indicating illegal fishing in the area. More recently, this work could have helped identify the vessels in the area of the Nord Stream pipelines immediately prior to the damage to the pipelines.
The second scenario involved hurricanes and Saildrones deployed for the National Oceanic and Atmospheric Administration (NOAA) Hurricane Monitoring programs: for 2021, data for Hurricane Sam, and for 2022, data for Hurricane Fiona. The use cases in this scenario relate to the environmental conditions of the ocean surface in the vicinity of the hurricane track. The Testbed-18 MF task worked to identify anomalies in the ocean surface to indicate the influence of these properties on the trajectories of these hurricanes.
The third scenario extended the Testbed-17 autonomous vehicle use case to address the real-world road network issues of identifying wrong-way drivers and monitoring roadside litter accumulation. These use cases aggregate geotagged video footage from roadside traffic cameras and fleet dashcams, respectively. The former correlates moving vehicle observations from multiple locations to track vehicle movements over time so that dangerous drivers can be quickly intercepted. The latter identifies locations of roadside objects from fleet dashcam footage and aggregates these over time to calculate litter accumulation rates so that collection teams can be optimally deployed. The Testbed-18 MF task focused on the orientation issues associated with the video capture devices required for such use cases and highlighted that the draft OGC GeoPose Standard can play a key role in aggregating orientation data captured by diverse devices in commercial markets often dominated by proprietary formats.
Additionally, this OGC MF ER explores the architecture for collaborative distributed object detection and analysis of multi-source motion imagery. The ER represents deliverable D020 of the OGC Testbed 18 performed under the OGC Innovation Program.
The additional deliverables under this Testbed 18 Moving Features and Sensor Integration task include the following.
D140 & D141 Moving Features Collection Ingestion Service: This component is a collection of Moving Features deployed for the following two experiments.
- To link moving features to a shared collection of features in order to develop a Best Practice for extending an existing Feature dataset with Moving Features data.
- To serve as an ingestion system that ingests moving feature detections into the Sensor Hub (D142).
D142 Sensor Hub: This component is a software (SW) system that can enable a sensor or system to be discovered, accessed, and controlled through OGC standard services and APIs. The sensor hub receives moving feature detections from components D140 & D141 and provides API-Moving Features to the client (D143).
D143 Client: This component uses AI technology to improve and enhance the moving feature data ingested from D140 & D141 and refined and stored in D142. These deliverables will be discussed in detail in later sections.
Figure 1 — Testbed 18 Moving Features and Sensor Integration Task Deliverables
5. Abbreviations & Definitions
This section includes abbreviations and definitions needed for this ER.
5.1. Actuator
A device that is used by, or implements, an (Actuation) Procedure that changes the state of the world. Actuator is a subclass of System.
5.2. Application Programming Interface
An Application Programming Interface (API) is a standard set of documented and supported functions and procedures that expose the capabilities or data of an operating system, application or service to other applications (adapted from ISO/IEC TR 13066-2:2016).
5.3. Deployment
Describes the Deployment of one or more Systems for a particular purpose. A Deployment may be done on a Platform.
5.4. Feature Of Interest
The thing whose property is being estimated or calculated in the course of an Observation to arrive at a Result, or whose property is being manipulated by an Actuator, or which is being sampled or transformed in an act of Sampling.
5.5. Observation
Act of carrying out an (Observation) Procedure to estimate or calculate a value of a property of a FeatureOfInterest. Links to a Sensor to describe what made the Observation and how; links to an Observable Property to describe what the result is an estimate of, and to a Feature Of Interest to detail what that property was associated with.
5.6. Observation Collection (SOSA/SSN extension)
Collection of one or more observations, whose members share a common value for one or more properties.
5.7. OGC APIs
Family of OGC standards developed to make it easy for anyone to provide geospatial data to the web.
5.8. Ontology Design Pattern
Reusable solutions intended to simplify ontology development and support the use of semantic technologies by ontology engineers that document and package good modelling practices for reuse, ideally enabling inexperienced ontologists to construct high-quality ontologies.
5.9. Procedure
A workflow, protocol, plan, algorithm, or computational method specifying how to make an Observation, create a Sample, or make a change to the state of the world (via an Actuator). A Procedure is re-usable, and might be involved in many Observations, Samplings, or Actuations. It explains the steps to be carried out to arrive at reproducible Results.
5.10. Platform
A Platform is an entity that hosts other entities, particularly Sensors, Actuators, Samplers, and other Platforms.
In general, a SOSA Platform can host any other Systems, but also other Platforms. We thus model Platform as a particular type of System in the Connected System API.
5.11. Sensor
Device, agent (including humans), or software (simulation) involved in, or implementing, a Procedure. Sensors respond to a Stimulus, e.g., a change in the environment, or Input data composed from the Results of prior Observations, and generate a Result. Sensors can be hosted by Platforms. Sensor is a subclass of System.
5.12. System
System is a unit of abstraction for pieces of infrastructure that implement Procedures. A System may have components, its subsystems, which are other Systems.
5.13. Abbreviated terms
ASV Autonomous Surface Vessel
ER Engineering Report
FMSDI Federated Marine Spatial Data Infrastructure
IoT Internet of Things
MPA Marine Protected Area
MF Moving Features
MFSI Moving Features and Sensor Integration
OGC Open Geospatial Consortium
OMS Observations, Measurements, and Samples
SSN Semantic Sensor Network
SWE Sensor Web Enablement
6. Ingestion Service: D140 & D141 Arizona State University
6.1. Introduction
The Moving Feature collection (D140) obtains hurricane tracking data from the National Oceanic and Atmospheric Administration (NOAA) and converts hurricanes and their corresponding tracks into Features of Interest (FOIs) and Observations associated with moving features and a collection of static (non-moving) features. The service then posts FOIs and Observations to the Sensor Hub, where multi-source datasets (e.g., hurricane tracks from D140, Automatic Identification System (AIS) vessel traffic data from D141, etc.) are stored and interpreted as moving features associated with static features.
6.2. Data
The ingestion service obtained information for 1,338 hurricanes from NOAA. As an example, the figure below shows the track for Hurricane Lili (2002). For Hurricane Lili, a collection of moving features (observations) indicates the movement of the hurricane. Additionally, each observation is associated with static features including the segment ID, occurrence time, wind speed, pressure, and category. The service processed the collected data into JSON files, with each hurricane's information saved in its own file.
Figure 2 — Example: Hurricane (Lili 2002) track and its static features
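The per-hurricane JSON records described above might be structured as in the following sketch. The field names are illustrative assumptions based on the static features listed (segment ID, time, wind speed, pressure, category), not NOAA's actual schema.

```python
import json

# Illustrative sketch only: field names are assumptions, not NOAA's schema.
def track_point(storm_id, segment_id, timestamp, lat, lon, wind_speed, pressure, category):
    """Bundle one track observation with its associated static features."""
    return {
        "stormId": storm_id,
        "segmentId": segment_id,
        "time": timestamp,
        "location": {"lat": lat, "lon": lon},
        "windSpeed": wind_speed,
        "minPressure": pressure,
        "category": category,
    }

# One hypothetical point from a Hurricane Lili-style track, serialized as it
# would be written to that hurricane's JSON file.
point = track_point("2002LILI", 1, "2002-10-01T12:00:00Z", 21.5, -81.8, 90, 967.0, "H2")
print(json.dumps(point))
```

Each file then holds the chronological list of such points for one storm, which is what the integration step of the ingestion service consumes.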
6.3. Architecture
Ingestion process architecture
The following figure displays the entire architecture of the ingestion service for the hurricane use case. The D140 ingestion service deliverable is composed of three parts: data loader, integration, and publication.
Figure 3 — Ingestion architecture for hurricane use case
The data loader reads the raw hurricane track data. Raw data includes the moving feature collection (hurricane tracks) and the static feature collection (i.e., wind speed, pressure, precipitation, timestamp, etc.). The data loader feeds the raw data into the ingestion service so the data can be further processed. The ingestion service then integrates the different types of hurricane information into Features of Interest (FOIs). For each hurricane, the corresponding track points are chronologically integrated according to the “storm ID” attribute for each FOI, following the data model standards. At different timestamps, the location of the hurricane has different static characteristics, for example, different wind speeds and pressures at different times. After integration, the ingestion service publishes features and observations to the Sensor Hub via HTTP POST. The Sensor Hub accepts data in the format output by the integration component of the ingestion service. Publication includes FOI ingestion and Observation ingestion.
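The three-stage flow (data loader, integration, publication) can be sketched as below. The record fields and the `publish()` stub are illustrative assumptions, not the actual D140 implementation.

```python
from collections import defaultdict

def load_raw(rows):
    """Data loader: yield raw track rows (here, plain dicts) for processing."""
    yield from rows

def integrate(rows):
    """Integration: group rows by storm ID and order each track chronologically,
    producing one FOI per storm."""
    fois = defaultdict(list)
    for row in rows:
        fois[row["stormId"]].append(row)
    for storm_id, track in fois.items():
        track.sort(key=lambda r: r["time"])  # chronological integration
        yield {"uid": f"urn:osh:foi:storm:{storm_id}", "track": track}

def publish(foi):
    """Publication: stand-in for the HTTP POST to the Sensor Hub."""
    return f"posted {foi['uid']} ({len(foi['track'])} observations)"

# Two out-of-order rows for one hypothetical storm "A".
rows = [
    {"stormId": "A", "time": "2020-08-14T18:00:00Z", "windSpeed": 40},
    {"stormId": "A", "time": "2020-08-14T12:00:00Z", "windSpeed": 35},
]
for foi in integrate(load_raw(rows)):
    print(publish(foi))  # → posted urn:osh:foi:storm:A (2 observations)
```

The real service replaces the `publish()` stub with the HTTP POST requests shown in the tables that follow.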
In this way, the D143 Client deliverable can obtain the customized information via the Sensor Hub for the integrated visualization use case. The following figure shows the interaction among the ingestion service, Sensor Hub, and Client.
Figure 4 — Interaction with other team components
Data Ingestion
To publish a FOI corresponding to a hurricane, the ingestion service sends an HTTP POST request with the following payload to the endpoint shown below:
Table 1
endpoint | https://api.georobotix.io/ogc/t18/api/systems/7iob6agpcril4/featuresOfInterest |
payload | { "type": "Feature", "properties": { "uid": "urn:osh:foi:storm:2020228N37286", "name": "Tropical Storm Kyle", "validTime": [ "2020-08-14T12:00:00Z", "2020-08-16T00:00:00Z" ] } } |
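In Python, the FOI publication step could be sketched as follows. The payload mirrors Table 1, while the helper names and the use of `urllib` are assumptions; the POST itself is not executed here, since it requires the live Sensor Hub.

```python
import json
from urllib import request

# Endpoint base from Table 1 (the system ID segment is appended per system).
HUB = "https://api.georobotix.io/ogc/t18/api"

def foi_payload(uid, name, valid_from, valid_to):
    """Build the FeatureOfInterest payload shown in Table 1."""
    return {
        "type": "Feature",
        "properties": {"uid": uid, "name": name, "validTime": [valid_from, valid_to]},
    }

def post_foi(system_id, payload):
    """Send the HTTP POST (requires hub access; not called in this sketch)."""
    req = request.Request(
        f"{HUB}/systems/{system_id}/featuresOfInterest",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with request.urlopen(req) as resp:
        return resp.status

payload = foi_payload("urn:osh:foi:storm:2020228N37286", "Tropical Storm Kyle",
                      "2020-08-14T12:00:00Z", "2020-08-16T00:00:00Z")
print(json.dumps(payload))
```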
After a FOI is published, a FOI ID (attribute: id) is created automatically in the Sensor Hub. The ingestion service then sends an HTTP GET request to the following endpoint to obtain the generated FOI id, so that the Observations associated with the corresponding FOI can be published correctly.
Table 2
endpoint | https://api.georobotix.io/ogc/t18/mfapi/collections/storms/items |
After obtaining the FOI id (attribute: foi@id), the ingestion service integrates the corresponding observations to publish moving features and static features for the FOI. The service then sends an HTTP POST request for publication. The endpoint and payload are shown below:
Table 3
endpoint | https://api.georobotix.io/ogc/t18/api/datastreams/tm3kijpkaoei6/observations |
payload | { "foi@id": "0hh79ki1f29l8", "phenomenonTime": "2020-08-14T12:00:00Z", "resultTime": "2020-08-14T12:00:00Z", "result": { "location": { "lat": 36.6, "lon": -74.2 }, "windSpeed": 35, "minPressure": 1008.0, "category": "TS" } } |
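The observation payload from Table 3 can be assembled as in this sketch; the helper function name is an illustrative assumption, and the foi@id value is the one returned by the earlier GET request.

```python
import json

def observation_payload(foi_id, when, lat, lon, wind_speed, pressure, category):
    """Build the observation payload shown in Table 3, linking the result
    back to its FOI via the foi@id attribute."""
    return {
        "foi@id": foi_id,
        "phenomenonTime": when,
        "resultTime": when,
        "result": {
            "location": {"lat": lat, "lon": lon},
            "windSpeed": wind_speed,
            "minPressure": pressure,
            "category": category,
        },
    }

# The same observation as Table 3's example payload.
obs = observation_payload("0hh79ki1f29l8", "2020-08-14T12:00:00Z",
                          36.6, -74.2, 35, 1008.0, "TS")
print(json.dumps(obs))
```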
6.4. Results
The Sensor Hub automatically integrates each set of ingested hurricane data with attributes including the FOI id, validity period, temporal geometries, and temporal properties. The validity period indicates the occurrence duration of the hurricane with the given FOI id. Temporal geometries represent the movement of the hurricane chronologically. Temporal properties, including wind speed, pressure, and category, represent the non-spatial attributes over time. As an example, the figure below shows some ingested hurricane information in the Sensor Hub. On the interface, each block contains the information for one hurricane. Temporal geometries and temporal properties are clickable for more details in JSON format.