1 Geospatial Computing across the Edge-Fog-Cloud Continuum
“The cloud is dead – long live the cloud!” so begins an IEC White Paper on Edge Intelligence.[1] The IEC White Paper continues that “Driven by the internet of things (IoT), a new computing model – edge-cloud computing – is currently evolving, which involves extending data processing to the edge of a network in addition to computing in a cloud or a central data centre. Edge-Fog-Cloud computing models operate both on premise and in public and private clouds, including via devices, base stations, edge servers, micro data centres and networks.”
Location, space and time play a fundamental role across the Edge-Fog-Cloud Computing continuum. This OGC White Paper considers geospatial information’s role through a technology roadmapping approach.
The OpenFog Consortium identifies Fog computing as the missing link in the cloud-to-thing continuum[2]. Fog architectures selectively move compute, storage, communication, control, and decision making closer to the network edge where data is being generated, in order to solve the limitations in current infrastructure and enable mission-critical, data-dense use cases.
In most fog deployments, there are usually several tiers of nodes. In general, each level of the N-tier environment would be sifting and extracting meaningful data to create more intelligence at each level. Tiers are created in order to deal efficiently with the amount of data that needs to be processed and provide better operational and system intelligence. At the highest level this can be represented by Figure 1.
The NIST Definition of Fog Computing[3] clarifies the distinction between Edge and Fog. The Edge is the network layer encompassing the smart end-devices and their users; it provides, for example, local computing capability on a sensor, meter, or other network-accessible device. Fog computing is hierarchical, whereas edge computing tends to be limited to a small number of peripheral layers. Moreover, in addition to computation, fog also addresses networking, storage, control, and data-processing acceleration.
The Open Geospatial Consortium (OGC) brings a missing element to the architectures defined in the JTC 1 White Paper and the OpenFog Architecture. In order to achieve the “intelligence from data” the computing architecture needs to include elements based on open standards for semantically enriching unstructured Edge data using the concepts of space and time. For more than twenty years OGC has developed standards that provide this element. Adding the OGC standards to the JTC 1 and OpenFog architecture provides a robust approach to achieve the desired, shared outcomes.
The OGC is an international not-for-profit organization committed to making quality open standards for the global geospatial community. These standards are made through a consensus process and are freely available for anyone to use to improve sharing of the world’s geospatial data.
OGC’s role is to define how IT creates a digital model of the world considering space and time as the primary organizing methods. A key principle of OGC, consistent with Figure 2, is the increasing semantic richness across the Edge-Fog-Cloud computing continuum. At any point in the continuum, spatial-temporal computing is used to add semantic richness to the available information. This occurs by adding structure and labels to less-structured data as well as by fusing multiple data observations.
Key themes across the sections in this white paper include how location and spatial-temporal computing, when combined with Edge and Fog computing, enable new functions.
· OGC’s Sensor Web Enablement suite of standards including the recent SensorThings API standard are well suited to provide the semantics, data structures and functional interfaces for sensors on the Edge and for computing in the Fog.
· Geospatial Fusion is at once both a mature and emerging technology. Established capabilities for fusion processing with spatial and temporal techniques are being extended to new sensor types and processing capabilities. Layering Geospatial Fusion across the Edge-Fog-Cloud computing continuum will bring new capabilities.
· Geospatial processing extended by the concepts in JTC 1’s Edge Intelligence White Paper introduces a paradigm shift with regard to acquiring, storing, and processing data: the data processing is placed at the edge between the data source (e.g. a sensor) and the IoT core and storage services located in the cloud.
· Computing in the Fog using geospatial concepts will enable decision making without a human in the loop, based on the spatial-temporal semantics of the data. Self-organization, self-configuration, and self-discovery can be based on spatial concepts.
This White Paper describes the role of location and spatial-temporal computing across Edge-Fog-Cloud computing in three main sections:
· Section 2 – Where we want to be: Scenarios for Edge-Fog-Cloud
· Section 3 – Current State of Play of Location across EFC
· Section 4 – What needs to be done
· A Technology Roadmap is presented in Annex A.
· A summary of Relevant OGC Standards is listed in Annex B.
· Engineering Reports from OGC innovation initiatives are listed in Annex C.
2 Where we want to be: Scenarios for Edge-Fog-Cloud future
2.1 Smart Cities
Imagine knowing practically every detail about a city: The state of the infrastructure, inhabitants, and the environment are all known to you, at high resolutions in time and in space. You are able to fuse physical data streams with socio-economic data. Transport data tells you where people are going. Sales and transaction data tells you what they are going to see or do or buy. Social media tells you how groups feel about events. And of course, high-quality weather data is already built into your system. Suppose that practically every movement and action within the city’s systems and infrastructure created location enabled data that could be used to enhance livability, provision of services, and more. Think of the data streams that would exist or could be created; the rates at which those data streams would flow; the technology and skills that would be necessary to acquire, store, and analyze such massive data. Think also of the theories and models that social scientists could generate and test, the problems that system operators and policy-makers could solve if they had access to those models and applications; and of the speed at which those problems could be addressed. Therein is the potential of Smart Cities.[4]
The OpenFog Reference Architecture identifies that Fog computing will play a key role in addressing public safety and security issues for smart cities. In particular, Fog computing can provide secure data and distributed analytics in Smart Cities. Adequately addressing privacy concerns is a must for Smart City technology to be effective.
A Smart City provides effective integration of physical, digital and human systems in the built environment to deliver a sustainable, prosperous and inclusive future for its citizens[5]. International Standards organizations are working to advance open standards to meet the needs of the widespread deployment of information technology to cities. To provide a spatial perspective on smart cities, the OGC contributed to the initiation of JTC 1/WG 11. A key basis for the contributions to JTC 1 was the OGC Smart Cities Spatial Information Framework white paper, which provides a spatial-temporal perspective on smart cities.
The ESPRESSO (systEmic standardisation apPRoach to Empower Smart citieS and cOmmunities) project developed a conceptual Smart City Information Framework based on open standards. The ESPRESSO Smart Cities Reference Architecture[6] is shown in Figure 3.
2.2 Resource Management: Precision Agriculture
Due to growing world population and changing climatic norms, sustaining crop quality as well as quantity from existing agricultural land is an important challenge today in reducing global food insecurity. The goal of precision agriculture research is to define a decision support system (DSS) for whole farm management with the goal of optimizing returns on inputs while preserving resources[7]. Precision agriculture can only improve decision-making and farm management if farmers have access to the necessary small-scale, detailed information to make informed choices. The creation of field- or even plant-level information can support farmers to improve their crop production and attract long-term investment. The more comprehensive and up-to-date picture that farmers have about their crops (e.g., through remote sensing and GPS technologies), the better decisions they can make as to where and when to apply seed, how much to fertilize, when to irrigate and so forth. Longer-term records of agricultural processes from precision farming data allow farmers to use their cropland more efficiently, increase crop size and quality, and respond more effectively to climatic challenges such as drought. Precision farming has the potential to make a worthwhile difference in farmers’ income, crop yields, and resilience while mitigating the negative environmental impacts of farming.[8]
The LEO Horizon2020 (H2020) Project developed software tools that support the whole lifecycle of reuse of EO data and related linked geospatial data. To demonstrate the benefits of linked open EO data and its combination with linked geospatial data to the European economy, a precision farming application was developed (Figure 4 - Source: Manolis Koubarakis).
2.3 Mobile Location Services
Location-enabled mobile devices are a major source of Big Data. Location data coming from mobile devices and their associated networks enables many Big Data applications. “The Ways Big Geospatial Data Is Driving Analytics In the Real World” begins with this observation:
“Amid the flood of data we collect and contend with on a daily basis, geospatial data occupies a unique place. Thanks to the networks of GPS satellites and cell towers and the emerging Internet of Things, we’re able to track and correlate the location of people and objects in very precise ways that were not possible until recently”.
Recent studies of mobile devices identified the predictability of human mobility. A study reported in Science found that “by measuring entropy of individual’s trajectory, we find 93% potential predictability in user mobility,” as determined from a study of ~10 million anonymous mobile phone users. The power of location data was highlighted by Sir Martin Sorrell, CEO of WPP, during his speech at Mobile World Congress in his comment that “Location targeting is holy grail for marketers.”
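To make the entropy measure concrete, the Python sketch below computes the Shannon entropy of a toy location history. It is illustrative only: the visit sequence is invented, and the Science study’s “true entropy” also accounts for the order of visits, not just the distribution used here.

```python
from collections import Counter
from math import log2

# A toy sequence of visited locations for one user (assumption: real studies
# use months of anonymized call-detail records, not eight samples).
visits = ["home", "work", "work", "cafe", "home", "work", "home", "work"]

counts = Counter(visits)
n = len(visits)

# Shannon entropy of the location distribution; lower entropy means the
# user's whereabouts are more predictable.
entropy = -sum((c / n) * log2(c / n) for c in counts.values())
print(f"{entropy:.2f} bits over {len(counts)} distinct locations")
```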
Location based contextual awareness is relevant to location-based marketing, first responders, urban planners and many other applications. Data from many sources including IoT devices, sensor webs, social media and crowd-sourcing are combined with semantically rich urban and indoor spatial data. The resulting context information is delivered to and shared by mobile devices in connected and disconnected operations.
2.4 Transportation and Moving Objects
Management and optimization of transportation systems benefit from Big Data platforms to monitor, visualize and perform predictive analytics on objects moving in space and time. Traffic congestion is reduced as trip demand data collected using transportation surveys is integrated with real-time or projected traffic data. Combined, optimal schedules and routes can be calculated. Thanks to the availability of real-time reports from location-enabled devices, these schedules and routes can even be optimized at runtime. Automobiles will continue to grow as generators of location-based big data: Intel has predicted that autonomous vehicles will generate 4 TB of observation and measurement data per car per day.
The more real-time information is made available, the better the optimization algorithms work; this defines requirements on Big Data handling and processing that standards can leverage. The OGC Moving Features standard[9] allows seamless integration of moving objects, and of predictions based on moving objects, across systems.
3 Current State of Play of Location across EFC
Standards from OGC and other organizations allow for edge and fog computing strategies to be applied using time and space data models. This section presents examples of how existing OGC standards have been used in Edge-Fog-Cloud style computing environments.
· A summary of Relevant OGC Standards is listed in Annex B.
· Engineering Reports from OGC innovation initiatives are listed in Annex C.
3.1 OGC Sensor Web Enablement and SensorThings – Sensors on the Edge
OGC developed the Sensor Web Enablement (SWE) standards to address the availability of sensors connected to the open internet. SWE is a framework of standards and best practices that make linking of diverse sensor related technologies fast and practical.
Sensor technology, computer technology and network technology are advancing together while demand grows for ways to connect information systems with the real world. Linking diverse technologies in this fertile market environment, integrators are offering new solutions for plant security, industrial controls, meteorology, geophysical survey, flood monitoring, risk assessment, tracking, environmental monitoring, defense, logistics and many other applications.
The primary OGC Standards in the SWE framework include:
- Observations & Measurements (O&M) – General models and XML encodings for observations and measurements.
- Sensor Model Language (SensorML) – Standard models and XML Schema for describing processes within sensor and observation processing systems.
- PUCK Protocol Standard – Protocol to retrieve a SensorML description, sensor “driver” code, and other information from the device itself, enabling automatic sensor installation, configuration and operation.
- Sensor Observation Service (SOS) – Open interface for a web service to obtain observations and sensor and platform descriptions from one or more sensors.
- Sensor Planning Service (SPS) – Open interface for a web service to 1) determine the feasibility of collecting data from one or more sensors or models and 2) submit collection requests.
The OGC SWE Implementation Maturity Engineering Report presents an assessment of the maturity of implementations based on SWE standards. SWE-based implementations are operating at the maximum level of technology readiness, i.e., SWE is used in multiple operational systems.
SensorThings is the newest standard in the SWE framework. SensorThings applies the SWE concepts to an IoT communications environment. The OGC SensorThings API provides an open, geospatial-enabled and unified way to interconnect Internet of Things (IoT) devices, data, and applications over the Web. At a high level the OGC SensorThings API provides two main functionalities, each handled by a part: 1) Sensing and 2) Tasking. The Sensing part provides a standard way to manage and retrieve observations and metadata from heterogeneous IoT sensor systems. The Tasking part provides a standard way of tasking IoT devices, such as sensors, actuators, and drones. SensorThings is being considered for standardization through ITU-T SG20.
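As a minimal sketch of the Sensing part, the Python fragment below retrieves recent observations from a SensorThings service using the standard’s OData-style query options. The endpoint URL and Datastream id are hypothetical placeholders, not a real deployment.

```python
import requests

# Hypothetical SensorThings service endpoint (assumption: replace with a
# real deployment's base URL).
BASE = "https://example.org/SensorThings/v1.0"

# Retrieve the 10 most recent observations from Datastream 1, using the
# query options ($orderby, $top, $select) defined by the standard.
resp = requests.get(
    f"{BASE}/Datastreams(1)/Observations",
    params={
        "$orderby": "phenomenonTime desc",
        "$top": "10",
        "$select": "phenomenonTime,result",
    },
    timeout=30,
)
resp.raise_for_status()

for obs in resp.json()["value"]:
    print(obs["phenomenonTime"], obs["result"])
```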
3.2 IMIS Internet of Things (IoT) Pilot – Sensor Hubs in the Fog
The Incident Management Information Sharing (IMIS) Internet of Things (IoT) Pilot developed profiles and extensions of SWE and other distributed computing standards to provide a basis for future IMIS sensor and observation interoperability. The pilot demonstrated the SWE-based architecture in a realistic incident management scenario. The results of the pilot were taken up in the US Department of Homeland Security (DHS) Next Generation First Responder program. Results are documented in the IMIS IoT Pilot Extension Engineering Report.
The IMIS IoT Pilot was based on interoperable standards as a central requirement for enabling ad-hoc integration of sensor resources (e.g., catalogs, data access services, portrayal services). The architecture as shown in Figure 8 uses OGC standards and other standards in a layered architecture approach. The Sensor Hub was a key component in the IMIS IoT Pilot architecture that can be viewed as a form of Fog computing.
Sensor-Hubs (S-Hubs) are gateways between one or more local sensor devices on one side and Internet users of the sensors on the other. S-Hubs provide standard, predictable and interoperable access to minimally connected, often proprietary sensor devices and are vital to both SWE and IoT. Within the SWE-IoT architecture, S-Hubs are components implemented as software stacks that conform to OGC SWE standards as well as other industry standards, in order to fill this mediation role. S-Hubs provide standard protocols and encodings for accessing real-time or archived observations, as well as for tasking sensor or actuator systems.
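A minimal sketch of this mediation role is shown below: a proprietary device reading is translated into a SensorThings Observation and pushed to an upstream service. The endpoint, Datastream id, and the stubbed driver call are assumptions for illustration only.

```python
import time
from datetime import datetime, timezone

import requests

BASE = "https://example.org/SensorThings/v1.0"  # hypothetical upstream service


def read_proprietary_sensor() -> float:
    """Stand-in for a vendor-specific driver call (assumption: a real S-Hub
    wraps a serial, Bluetooth, or other proprietary protocol here)."""
    return 21.7  # e.g., degrees Celsius


def publish_observation(datastream_id: int, result: float) -> None:
    # Map the raw reading onto a SensorThings Observation and push it
    # upstream, linking it to an existing Datastream by id.
    body = {
        "phenomenonTime": datetime.now(timezone.utc).isoformat(),
        "result": result,
        "Datastream": {"@iot.id": datastream_id},
    }
    requests.post(f"{BASE}/Observations", json=body, timeout=30).raise_for_status()


while True:
    publish_observation(datastream_id=1, result=read_proprietary_sensor())
    time.sleep(60)  # one reading per minute
```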
3.3 Debris flow warning system – Decisions in the Fog
Use cases of edge sensing and in-network processing based on spatial-temporal processing are demonstrated in a debris flow warning system deployed in Taiwan using OGC standards. The debris flow system was tested in the OGC OWS-6 Testbed[10] and deployed for operations[11].
Debris flows are deadly and destructive events that occur all too often in Taiwan due to steep terrain and frequent typhoons. A debris flow is a fast-moving mass of unconsolidated, saturated debris that looks like flowing concrete. Flows can carry rocks ranging in size from clay particles to boulders, and often contain a large amount of woody debris. Debris flows triggered by large amounts of rainfall can move at speeds from 1 mph up to 35 mph in extreme conditions. Debris flows are extremely destructive to life and property.
In Taiwan, the Soil and Water Conservation Bureau (SWCB)[12] has cooperated with the GIS Research Center[13] at Feng Chia University (FCU)[14] to establish debris flow monitoring stations across Taiwan. The warning model for the debris flow alert uses accumulated precipitation as a warning index to determine whether rainfall has reached a specific threshold. When debris flows occur, geophones detect the ground vibration generated by the collision between boulders and the channel bed.
The Taiwan Debris Flow Monitoring system depends on the following OGC standards:
• Sensor Observation Service (SOS)
• Sensor Planning Service (SPS)
• Sensor Model Language (SensorML)
• Web Processing Service (WPS)
The Taiwan debris flow system shows how sensors at the edge, combined with workflows in the network, lead to decision support and warnings. As shown in Figure 10, the workflow in the network includes a series of intermediate decisions, e.g., “is this a debris flow event?”, that lead to an alert.
A key question posed by this white paper is: Can the concepts of Fog Computing be used to distribute the geospatial decision making in a sensor and processing workflow as shown in Figure 10?
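As a thought experiment toward that question, the sketch below shows how one such intermediate decision might run on a fog node close to the gauges, so that only warnings, rather than raw observation streams, travel further up the continuum. The thresholds and window length are invented for illustration; the operational warning model uses calibrated, site-specific values.

```python
from collections import deque

# Illustrative thresholds (assumptions: the operational SWCB model uses
# calibrated, site-specific values).
RAIN_WINDOW_HOURS = 24
RAIN_THRESHOLD_MM = 250.0
VIBRATION_THRESHOLD = 0.8  # normalized geophone amplitude

rain_window = deque(maxlen=RAIN_WINDOW_HOURS)  # rolling hourly accumulations


def fog_node_decision(hourly_rain_mm: float, geophone_amplitude: float) -> str:
    """Intermediate decision made near the sensors: only an alert, not the
    raw observation stream, needs to travel onward."""
    rain_window.append(hourly_rain_mm)
    accumulated = sum(rain_window)
    if accumulated >= RAIN_THRESHOLD_MM and geophone_amplitude >= VIBRATION_THRESHOLD:
        return "ALERT: probable debris flow event"
    if accumulated >= RAIN_THRESHOLD_MM:
        return "WARNING: rainfall threshold reached"
    return "normal"


print(fog_node_decision(30.0, 0.1))  # -> "normal"
```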
3.4 Future City Pilot CityGML Dynamizer – merging dynamic and static
The OGC Future City Pilot Phase 1 (FCP1) demonstrated the ability of spatial data infrastructures to support quality of life, civic initiatives, and urban resilience. During the pilot, scenarios focused on real-time sensor readings and other time-dependent properties within semantic 3D city models as described in an OGC Engineering Report (ER).[15] The ER highlights the ‘Dynamizer’ concept, which allows representation of highly dynamic data in different and generic ways and provides a method for injecting dynamic variations of city object properties into otherwise static representations. It also establishes explicit links between sensor/observation data and the respective properties of city model objects that are measured by them[16].
Some of the changes may also represent high-frequency or dynamic variations of object properties, e.g., variations of (i) thematic attributes such as changes of physical quantities (energy demands, temperature, solar irradiation levels); (ii) spatial properties such as change of a feature’s geometry, with respect to shape and location (moving objects); and (iii) real-time sensor observations. In this case, only some of the properties of otherwise static objects need to represent such time-varying values. Dynamizer is a new concept supporting the latter types of changes, allowing the city model to be enriched with data from dynamic data feeds.
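A minimal sketch of the Dynamizer idea, using plain Python dictionaries rather than the CityGML schema: a static city-object attribute is linked to a timeseries feed, and the most recent observation overwrites the modelled value. All names and values are illustrative.

```python
# A static city object as it might be held in a 3D city model (property
# names are illustrative, not the CityGML schema).
building = {
    "id": "BLDG_0815",
    "function": "office",
    "energy_demand_kwh": 420.0,  # static, modelled value
}

# The Dynamizer-style link from one attribute to a live data feed.
dynamizer = {
    "target_object": "BLDG_0815",
    "target_attribute": "energy_demand_kwh",
    "feed": [("2018-05-14T10:00Z", 395.2), ("2018-05-14T11:00Z", 441.8)],
}


def apply_dynamizer(obj: dict, dyn: dict) -> None:
    # Overwrite the static value with the most recent observation.
    assert obj["id"] == dyn["target_object"]
    obj[dyn["target_attribute"]] = dyn["feed"][-1][1]


apply_dynamizer(building, dynamizer)
print(building["energy_demand_kwh"])  # 441.8
```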
3.5 Geospatial Fusion – Semantic enrichment in the fog
Making new connections in existing data is a powerful method to gain understanding of the world. Such fusion of data is not a new topic, but new technology provides opportunities to enhance this ubiquitous process. Data fusion in distributed information environments with interoperability based on open standards is radically changing the classical domains of data fusion while inventing entirely new ways to discern relationships in data with little structure. Associations based on locations and times are of the most primary type. To that end, OGC conducted a series of studies[17] on Geospatial Fusion, which can be organized into three categories:
· Observation Fusion: merging of multiple sensor measurements of the same phenomena (i.e., events or features of interest) into a combined observation.
· Object/Feature Fusion: processing of observations into higher order semantic features and feature processing. Object/feature fusion processes include generalization and conflation of features.
· Decision Fusion: supporting a human decision for situation assessment, impact assessment and decision support, using information from multiple sensors together with processed information.
Among the most sophisticated fusion tasks is the combination of strongly heterogeneous space-borne, airborne, and terrestrial data. Figure 12 shows the multiple computing processes in multi-level geospatial fusion[18].
A key question posed by this white paper is: how can fusion processing be applied across the Edge-Fog-Cloud continuum as shown in Figure 2? This will involve computing at the edge based on processing and conceptual models held nearest the sensors and sensor hubs. Computing in the Fog will enable feature-level fusion based on observations and fusion results from multiple edge nodes. Decision-level fusion can happen at all levels across the continuum, including autonomous decision making in the Fog.
Several OGC initiatives have shown the value of the Web Processing Service (WPS) standard to provide open interfaces to processing in distributed computing. The recent OGC Testbed 12 conducted experiments on conflation – a type of feature fusion[19]. A conflation tool such as the Hootenanny software was deployed using the WPS interface. The WPS interface is used to wrap a complex software processing algorithm in a simpler set of RESTful interactions. It is often said that the WPS defines simple “knobs” to control a complex process. The WPS knobs allow for easier integration into an Edge-Fog-Cloud continuum.
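The “knobs” idea can be sketched with a WPS 1.0.0 KVP Execute request issued from Python. The service URL, process identifier, and input names below are hypothetical; a real conflation deployment defines its own process description.

```python
import requests

# Hypothetical WPS endpoint exposing a conflation process (assumption:
# the identifier and DataInputs keys vary by deployment).
WPS = "https://example.org/wps"

params = {
    "service": "WPS",
    "version": "1.0.0",
    "request": "Execute",
    "identifier": "Conflate",
    # The "knobs": a handful of simple key-value inputs that hide the
    # complexity of the underlying conflation algorithm.
    "DataInputs": "referenceLayer=roads_osm;inputLayer=roads_survey;matchThreshold=0.7",
}

resp = requests.get(WPS, params=params, timeout=300)
resp.raise_for_status()
print(resp.text)  # ExecuteResponse document with a reference to the result
```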
3.6 Geospatial Computing in the Cloud
As this report began by quoting the IEC’s statement “The cloud is dead – long live the cloud!”, we complete this section with a brief summary of how geospatial processing is achieved in the cloud using open standards.
OGC Testbed 13 advanced the state of the art of cloud computing for geospatial data, in particular for processing of Earth observation data. An OGC Engineering Report[20] addresses issues in the lack of interoperability and portability of cloud computing architectures, which cause difficulty in managing the efficient use of virtual infrastructure, such as in cloud migration, storage transference, quantifying resource metrics, and unified billing and invoicing. Cloud computing is paving the way for future scalable computing infrastructures and is being used for processing digital earth observation data. In this EOC thread effort, data is stored in various storage resources in the cloud and accessed by an OGC WPS. The methods by which these processes are deployed and managed must be made interoperable to mitigate or avoid the complexities of administrative effort for the scientific community.
A foundational aspect of cloud computing is the use of containers. Also part of OGC Testbed 13, the Application Package OGC Engineering Report (ER) defines a data model and serialization for Application Packages. This ER is part of the Testbed-13 work exercising workflows[21] for data integration, processing, and analytics based on algorithms developed by users that are deployed in multiple clouds. The wide usage of virtualization and the possibility to start virtual environments within Cloud services significantly simplifies the creation of environments and provisioning of resources. The application packaging specifies the elements that ensure:
· Scientific reproducibility;
· Dependencies identification and management;
· Maintainability from an operational perspective, avoiding version piling; and
· Portability across different Cloud providers.
4 What needs to be done
OGC’s role is to define how IT creates a digital model of the world considering space and time as the primary organizing methods. This section describes ideas for meeting the objectives for open geospatial information across Edge-Fog-Cloud computing.
4.1 Proliferation of sensor types - Semantic interoperability
An original driver for the OGC SWE standards was the diversity of sensor types and the requirement to be able to access a sensor discovered on the web and process its observations based on dynamic configuration. SensorML and the Observations and Measurements standards provided a level of interoperability for these use cases. More recently, semantic technologies have made the SensorML and O&M concepts extensible at a higher level of generality.
The OGC/W3C Semantic Sensor Network (SSN) ontology describes sensors and their observations, the involved procedures, the studied features of interest, the samples used to do so, and the observed properties. SSN follows a horizontal and vertical modularization architecture by including a lightweight but self-contained core ontology called SOSA (Sensor, Observation, Sample, and Actuator) for its elementary classes and properties. With their different scope and different degrees of axiomatization, SSN and SOSA are able to support a wide range of applications and use cases, including satellite imagery, large-scale scientific monitoring, industrial and household infrastructures, social sensing, citizen science, observation-driven ontology engineering, and the Web of Things.
Use cases for dynamic sensor configuration and processing built on SSN are available to be deployed using the Edge Intelligence and OpenFog approaches.
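A minimal sketch of describing one observation with the SOSA core terms, using the rdflib library; the data namespace and resource names are invented for illustration.

```python
from rdflib import Graph, Literal, Namespace, RDF
from rdflib.namespace import XSD

SOSA = Namespace("http://www.w3.org/ns/sosa/")
EX = Namespace("http://example.org/")  # hypothetical data namespace

g = Graph()
g.bind("sosa", SOSA)

obs = EX["obs/1"]
# Describe one observation with the core SOSA classes and properties.
g.add((obs, RDF.type, SOSA.Observation))
g.add((obs, SOSA.madeBySensor, EX["sensor/thermometer-42"]))
g.add((obs, SOSA.observedProperty, EX["property/airTemperature"]))
g.add((obs, SOSA.hasFeatureOfInterest, EX["feature/rooftop-station"]))
g.add((obs, SOSA.hasSimpleResult, Literal(21.7, datatype=XSD.double)))
g.add((obs, SOSA.resultTime, Literal("2018-05-14T10:00:00Z", datatype=XSD.dateTime)))

print(g.serialize(format="turtle"))
```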
4.2 Proliferation of sensor platforms – communications network needs
Not only is there increasing variety in sensors; the platforms on which sensors are mounted are also becoming more varied, e.g., UAVs and connected automobiles. The communications methods for reaching these platforms need to be advanced to enable reliable communications.
Methods for handling communications over disadvantaged, degraded or intermittent links are needed for sensor data. For delivery of data offered by OGC services over (very) low bandwidth, two options have been analyzed[22]: on the one hand, the geospatial features remain the same, but compression techniques are used to reduce the size of the data that needs to be transferred; on the other hand, generalization techniques may be used to reduce the detail of geometries and/or attributes in order to reduce the amount of data.
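The two options can be sketched in a few lines of Python: gzip compression of a GeoJSON geometry versus Douglas-Peucker generalization with the shapely library. The geometry and tolerance below are synthetic, chosen only to show the size trade-off.

```python
import gzip
import json

from shapely.geometry import LineString, mapping

# A synthetic, highly detailed geometry standing in for a feature served
# over a very low-bandwidth link.
detailed = LineString([(x / 1000.0, (x % 7) / 1000.0) for x in range(5000)])
payload = json.dumps(mapping(detailed)).encode()

# Option 1: keep the geometry, compress the transfer.
compressed = gzip.compress(payload)

# Option 2: generalize (Douglas-Peucker) before transfer, trading detail
# for size.
generalized = detailed.simplify(tolerance=0.001)
generalized_payload = json.dumps(mapping(generalized)).encode()

print(len(payload), len(compressed), len(generalized_payload))
```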
Sensors continue to increase data volumes and the associated demands on bandwidth and communication link functionality. LiDAR is a good example of this challenge; it has been studied in a recent OGC Testbed[23] to support streaming of LiDAR data with SWE technologies.
The trustworthiness of communication networks for sensor data needs to continually increase. This includes addressing attacks that target the sensor at the edge,[24] computing at the edge to perform quality checks on sensor measurements, and protecting the data in transit, e.g., using blockchain and other methods to capture and protect the provenance of in-network processing.
It is anticipated that JTC 1 Edge Intelligence discussions along with the OpenFog Consortium concepts can support these communication needs for geospatial sensors.
4.3 Feature detection and tracking at the edge and in the fog
Using spatial-temporal concepts applied to observations made by multiple sensors on the edge, features can be identified and tracked both in edge nodes and in the fog. This can be achieved by applying the concepts of Geospatial Fusion to the Edge Intelligence and OpenFog approaches. Two examples help to motivate this concept: 1) Moving Features and 2) Video Analysis.
The OGC Moving Features Access Standard[25] defines methods for retrieving feature attributes, information on a relation between a trajectory object and one or more geometry objects, and information on a relation between two trajectory objects from a database storing trajectory data of moving features. The OGC standard is based on the abstract methods for accessing moving features data defined in ISO 19141:2008 (Geographic information – Schema for moving features). Identifying the trajectories of moving features and comparing them with Edge and Fog computing could support multiple use cases.
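As a toy illustration of a “relation between two trajectory objects” query, the sketch below computes the closest approach of two moving features sampled at shared instants. The trajectories are invented and stand in for the database-backed access methods the standard defines.

```python
from math import hypot

# Two trajectories as (time, x, y) samples (assumption: a toy stand-in for
# trajectory data retrieved from a moving-features database).
truck = [(0, 0.0, 0.0), (60, 0.5, 0.1), (120, 1.0, 0.3)]
drone = [(0, 0.2, 0.0), (60, 0.6, 0.2), (120, 0.9, 0.4)]


def min_separation(traj_a, traj_b):
    """Closest approach of two moving features sampled at the same instants,
    in the spirit of a relation query between two trajectory objects."""
    return min(
        hypot(xa - xb, ya - yb)
        for (ta, xa, ya), (tb, xb, yb) in zip(traj_a, traj_b)
        if ta == tb
    )


print(min_separation(truck, drone))  # ~0.14
```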
Video content analysis (VCA) at the edge is an emerging capability. Adding geospatial and temporal aspects to VCA will provide greater capabilities. The ability of deep learning technology to enable adaptive analysis of video while requiring less calibration of algorithms will drive a big leap in the future use, accuracy and range of applications for VCA technology[26]. This is just one example of the need to support machine learning algorithms in Edge and Fog computing.
4.4 Machine learning at the edge and in the fog
We are in the middle of the third wave of interest in artificial neural networks as the leading paradigm for machine learning. The first wave dates back to the 1950s, the second to the 1980s, and the third to the 2010s. The current wave has been called “deep learning” because of the emphasis on having multiple layers of neurons between the input and the output of the neural network.
OGC has fostered the progress of machine learning for geospatial data through several workshops, initiatives and working groups. The OGC Big Geo Data White Paper[27] summarizes much of the work. The current OGC Testbed 14[28] will create additional results through a work item to develop a holistic understanding and derive best practices for integrating Machine Learning, Deep Learning, and Artificial Intelligence tools and principles into OGC Web service contexts.
OGC anticipates that Machine Learning algorithms will migrate from cloud computing to fog and edge computing with the deployment of processing hardware, e.g., GPUs, and the refinement of ML algorithms suited to the processing capabilities of the Edge and Fog. This will enable ML for geospatial data at the Edge and in the Fog.
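A sketch of that migration pattern, under the assumption of a scikit-learn model and synthetic training data: a compact classifier is trained “in the cloud”, serialized, and loaded on an edge or fog node to score local observations.

```python
import joblib
from sklearn.tree import DecisionTreeClassifier

# --- Cloud side: train on labelled (lon, lat, intensity) samples ----------
# (Synthetic data; a real deployment would train on curated observations.)
X = [[121.5, 24.1, 0.9], [121.6, 24.2, 0.1], [120.9, 23.8, 0.8], [121.0, 23.9, 0.2]]
y = [1, 0, 1, 0]  # 1 = event of interest, 0 = background
model = DecisionTreeClassifier(max_depth=3).fit(X, y)
joblib.dump(model, "event_classifier.joblib")  # ship this artifact to edge nodes

# --- Edge/Fog side: load the compact model and score local observations ---
edge_model = joblib.load("event_classifier.joblib")
print(edge_model.predict([[121.55, 24.15, 0.85]]))  # e.g., [1]
```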
4.5 Dynamic resource control - Autonomy in the fog
During the OGC Technical Committee meeting in March 2018, the Future Directions session focused on “Autonomy.” The motivation for the session was multi-fold: unprecedented amounts of geospatial data are being collected, beyond the processing capacity of most organizations; increased streaming of geospatial data from collection platforms requires continuous processing to avoid missing significant events; and geospatial data is increasingly being used by home users and for non-professional purposes. The session included several presentations, including one by the OpenFog Consortium.
In order to achieve the autonomy objectives using edge and fog computing, network bandwidth, latency and computing-in-the-network will be issues. Real-time challenges and sensor-control-actuator round-trip latency will need to be explored as key attributes supported by fog.
For example, the Dynamizer concept introduced earlier (Section 3.4), in combination with real-time sensors, predictive real-time models and other computing, can be the basis for control of dynamic resources at urban scale in Smart Cities.
4.6 Linked Geodata across the Edge-Fog-Cloud continuum
Developments in the Semantic Web make it possible to link data based on geographic information in a way that provides more insight. The joint OGC/W3C Spatial Data on the Web Working Group has defined a set of Best Practices[29] for spatial data that are driving significant advancements in linked geodata on the web. The Location Powers: Big Linked Geodata workshop investigated scaling effective exploitation of linked geodata by using big data approaches.
As presented during the Location Powers workshop, “From Linked Datasets to Linked Data Streams” describes a platform to publish and interlink datasets on the web of data, with input data coming from multiple heterogeneous formats. The output data produced are Linked Data, also described as semantic, interconnected data. These capabilities are being further developed in the ATOS WAVES platform for real-time semantic stream management (Figure 15). This real-time streaming and structuring of data using spatial concepts for linked data can be layered on the Edge and Fog computing continuum as shown in Figure 2.
4.7 Indoor space: positioning, spatial modeling, navigation
The JTC 1 Edge Intelligence article identifies the need for several new functionalities related to indoor location. Some of these items are already solved and standardized, including by the following OGC standards:
· OGC CityGML – This standard provides for the geometry of indoor spaces; and
· OGC IndoorGML – This standard provides for the navigation of indoor spaces.
OGC looks forward to working with those in JTC 1 interested in indoor space and navigation standards.
4.8 Placing OGC standards in Open Architectures for Edge/Fog
The preceding sections have identified multiple opportunities for geospatial information technology advances considering a future based on Edge/Fog computing. The new capabilities will evolve more quickly with some stable points interlaced with the rapidly emergent technologies. The stable points typically are established standards mapped to new architectures.
A next step in OGC’s consideration of Edge/Fog Computing will be to identify opportunities to use OGC standards at key points of open architectures for Edge/Fog Computing, e.g., EdgeX Foundry. EdgeX Foundry is a vendor-neutral open source project hosted by The Linux Foundation[30]. EdgeX Foundry proposes a loosely-coupled, tiered IoT architecture (Figure 16), allowing customers to deploy a mix of plug-and-play microservices on compute nodes at the edge depending on the capability of the host devices, where they sit in the solution stack, and the use case.
Annex A – Technology Roadmap
The Open Geospatial Consortium (OGC) tracks Geospatial Technology trends as part of the OGC Technology Strategy. This process is similar to the IEC’s Market Strategy Board methods to identify the principal technological trends and market needs[31]. The OGC approach includes these elements:
· Survey: Identify and characterize emerging technology
· Assess: Evaluate trends to identify priorities
· Focus: Take action to advance priority trends
For high priority trends, OGC uses Technology Roadmapping[32],[33] to improve the ability to characterize, assess and plan for technology development. An initial roadmap for The Role of Geospatial in Edge-Fog-Cloud Computing is provided in Figure 17.
Annex B – Relevant OGC Standards
This annex lists OGC Standards relevant to Edge-Fog-Cloud Computing. A full listing of OGC Standards is here: http://www.opengeospatial.org/standards
Sensor Web Enablement standards
Name | Version | Document | Description
---|---|---|---
OGC SensorThings API Part 1: Sensing | 1.0 | 15-078r6 | The OGC SensorThings API provides an open, geospatial-enabled and unified way to interconnect the Internet of Things (IoT) devices, data, and applications over the Web. At a high level the OGC SensorThings API provides two main functionalities and each function is handled by a part. The two parts are the Sensing part and the Tasking part. The Sensing part provides a standard way to manage and retrieve observations and metadata from heterogeneous IoT sensor systems. The Tasking part is planned as a future work activity and will be defined in a separate document as the Part II of the SensorThings API.
OGC Sensor Observation Service Interface Standard | 2.0 | 12-006 | The SOS standard is applicable to use cases in which sensor data needs to be managed in an interoperable way. This standard defines a Web service interface which allows querying observations, sensor metadata, as well as representations of observed features. Further, this standard defines means to register new sensors and to remove existing ones. Also, it defines operations to insert new sensor observations. This standard defines this functionality in a binding independent way; two bindings are specified in this document: a KVP binding and a SOAP binding.
OGC Sensor Planning Service Implementation Standard | 2.0 | 09-000 | The SPS defines interfaces for queries that provide information about the capabilities of a sensor and how to task the sensor. The standard is designed to support queries that have the following purposes: to determine the feasibility of a sensor planning request; to submit and reserve/commit such a request; to inquire about the status of such a request; to update or cancel such a request; and to request information about other OGC Web services that provide access to the data collected by the requested task.
OpenGIS Sensor Model Language (SensorML) Encoding Standard | 1.0.0 | 07-000 | The OpenGIS® Sensor Model Language Encoding Standard (SensorML) specifies models and XML encoding that provide a framework within which the geometric, dynamic, and observational characteristics of sensors and sensor systems can be defined. There are many different sensor types, from simple visual thermometers to complex electron microscopes and earth observing satellites.
OGC Timeseries Profile of Observations and Measurements | 1.0 | 15-043r3 | The OGC Timeseries Profile of Observations and Measurements is a conceptual model for the representation of observations data as timeseries, with the intent of enabling the exchange of such data sets across information systems. This standard does not define an encoding for the conceptual model; however there is an accompanying OGC Standard which defines an XML encoding (OGC TimeseriesML 1.0 - XML Encoding of the Timeseries Profile of Observations and Measurements). Other encodings may be developed in future.
Observations and Measurements – XML Implementation | 2.0 | 10-025r1 | This standard specifies an XML implementation for the OGC and ISO Observations and Measurements (O&M) conceptual model (OGC Observations and Measurements v2.0 also published as ISO/DIS 19156), including a schema for Sampling Features. This encoding is an essential dependency for the OGC Sensor Observation Service (SOS) Interface Standard. More specifically, this standard defines XML schemas for observations, and for features involved in sampling when making observations. These provide document models for the exchange of information describing observation acts and their results, both within and between different scientific and technical communities.
OGC Web Services Standards
href="http://portal.opengeospatial.org/files/?artifact_id=14416 OpenGIS Web Map Service (WMS) Implementation Specification |
1.3.0 |
06-042 |
The OpenGIS® Web Map Service Interface Standard (WMS) provides a simple HTTP interface for requesting geo-registered map images from one or more distributed geospatial databases. A WMS request defines the geographic layer(s) and area of interest to be processed. The response to the request is one or more geo-registered map images (returned as JPEG, PNG, etc.) that can be displayed in a browser application. The interface also supports the ability to specify whether the returned images should be transparent so that layers from multiple servers can be combined or not. NOTE: WMS 1.3 and ISO 19128 are the same documents. |
||
1.0.0 |
07-057r7 |
|
This Web Map Tile Service (WMTS) Implementation Standard provides a standard based solution to serve digital maps using predefined image tiles. The service advertises the tiles it has available through a standardized declaration in the Service Metadata document common to all OGC web services. This declaration defines the tiles available in each layer (i.e. each type of content), in each graphical representation style, in each format, in each coordinate reference system, at each scale, and over each geographic fragment of the total covered area. The Service Metadata document also declares the communication protocols and encodings through which clients can interact with the server. Clients can interpret the Service Metadata document to request specific tiles. |
||
2.0.1 |
09-110r4 |
|
This document specifies how a Web Coverage Service (WCS) offers multi-dimensional coverage data for access over the Internet. This document specifies a core set of requirements that a WCS implementation must fulfil. WCS extension standards add further functionality to this core; some of these are required in addition to the core to obtain a complete implementation. This document indicates which extensions, at a minimum, need to be considered in addition to this core to allow for a complete WCS implementation. This core does not prescribe support for any particular coverage encoding format. This also holds for GML as a coverage delivery format: while GML constitutes the canonical format for the definition of WCS, it is not required by this core that a concrete instance of a WCS service implements the GML coverage format. WCS extensions specifying use of data encoding formats in the context of WCS are designed in a way that the GML coverage information contents specified in this core is consistent with the contents of an encoded coverage. |
||
OGC® Web Feature Service 2.0 Interface Standard - With Corrigendum |
2.0.2 |
09-025r2 |
The Web Feature Service (WFS) represents a change in the way geographic information is created, modified and exchanged on the Internet. Rather than sharing geographic information at the file level using File Transfer Protocol (FTP), for example, the WFS offers direct fine-grained access to geographic information at the feature and feature property level. This International Standard specifies discovery operations, query operations, locking operations, transaction operations and operations to manage stored, parameterized query expressions. Discovery operations allow the service to be interrogated to determine its capabilities and to retrieve the application schema that defines the feature types that the service offers. Query operations allow features or values of feature properties to be retrieved from the underlying data store based upon constraints, defined by the client, on feature properties. Locking operations allow exclusive access to features for the purpose of modifying or deleting features. Transaction operations allow features to be created, changed, replaced and deleted from the underlying data store. Stored query operations allow clients to create, drop, list and described parameterized query expressions that are stored by the server and can be repeatedly invoked using different parameter values. |
||
2.0 |
14-065 |
|
The OpenGIS® Web Processing Service (WPS) Interface Standard provides rules for standardizing how inputs and outputs (requests and responses) for geospatial processing services, such as polygon overlay. The standard also defines how a client can request the execution of a process, and how the output from the process is handled. It defines an interface that facilitates the publishing of geospatial processes and clients’ discovery of and binding to those processes. The data required by the WPS can be delivered across a network or they can be available at the server. |
||
OpenGIS Web Coverage Processing Service (WCPS) Language Interface Standard |
1.0.0 |
08-068r2 |
The OGC® Web Coverage Processing Service (WCPS) defines a protocol-independent language for the extraction, processing, and analysis of multi-dimensional coverages representing sensor, image, or statistics data. |
||
1.1 |
05-005 |
|
This document is a companion specification to the OGC Web Map Service Interface Implementation Specification version 1.1.1 [4], hereinafter "WMS 1.1.1." WMS 1.1.1 specifies how individual map servers describe and provide their map content. The present Context specification states how a specific grouping of one or more maps from one or more map servers can be described in a portable, platform-independent format for storage in a repository or for transmission between clients. This description is known as a "Web Map Context Document," or simply a "Context." Presently, context documents are primarily designed for WMS bindings. However, extensibility is envisioned for binding to other services. A Context document includes information about the server(s) providing layer(s) in the overall map, the bounding box and map projection shared by all the maps, sufficient operational metadata for Client software to reproduce the map, and ancillary metadata used to annotate or describe the maps and their provenance for the benefit of human viewers. A Context document is structured using eXtensible Markup Language (XML). Annex A of this specification contains the XML Schema against which Context XML can be validated. |
Other OGC Standards
Name | Version | Document | Description
---|---|---|---
Geographic information – Well-known text representation of coordinate reference systems | 1.0 | 12-063r5 | This Standard provides an updated version of WKT representation of coordinate reference systems that follows the provisions of ISO 19111:2007 and ISO 19111-2:2009. It extends the earlier WKT to allow for the description of coordinate operations. This International Standard defines the structure and content of well-known text strings. It does not prescribe how implementations should read or write these strings. The jointly developed draft has also been submitted by ISO TC211 for publication as an International Standard document. The version incorporates comments made during both the OGC Public Comment Period as well as the ISO ballot for DIS (ISO TC211 document N3750).
OGC Moving Features Access | 1.0 | 16-120r3 | This document defines Moving Features Access, i.e., access methods to moving feature data for retrieving feature attributes, information on a relation between a trajectory object and one or more geometry objects, and information on a relation between two trajectory objects from a database storing trajectory data of moving features. Abstract methods of accessing moving features data are defined in ISO 19141:2008 (Geographic information – Schema for moving features).
OGC Open GeoSMS Standard – Core | 1.0 | 11-030r1 | The OpenGIS® Open GeoSMS standard defines an encoding for location enabling a text message to be communicated using a Short Messages System (SMS).
OGC GeoPackage Encoding Standard | 1.2 | 12-128r14 | This OGC® Encoding Standard defines GeoPackages for exchange and GeoPackage SQLite Extensions for direct use of vector geospatial features and / or tile matrix sets of earth images and raster maps at various scales. Direct use means the ability to access and update data in a "native" storage format without intermediate format translations in an environment (e.g. through an API) that guarantees data model and data set integrity and identical access and update results in response to identical requests from different client applications. GeoPackages are interoperable across all enterprise and personal computing environments and are particularly useful on mobile devices like cell phones and tablets in communications environments with limited connectivity and bandwidth.
OGC IndoorGML | 1.0 | 14-005r3 | This OGC® IndoorGML standard specifies an open data model and XML schema for indoor spatial information. IndoorGML is an application schema of OGC® GML 3.2.1. While there are several 3D building modelling standards such as CityGML, KML, and IFC, which deal with interior space of buildings from geometric, cartographic, and semantic viewpoints, IndoorGML intentionally focuses on modelling indoor spaces for navigation purposes.
OGC Publish/Subscribe Interface Standard – Core | 1.0 | 13-131r1 | Publish/Subscribe 1.0 is an interface specification that supports the core components and concepts of the Publish/Subscribe message exchange pattern with OGC Web Services. The Publish/Subscribe pattern complements the Request/Reply pattern specified by many existing OGC Web Services. This specification may be used either in concert with, or independently of, existing OGC Web Services to publish data of interest to interested Subscribers. Publish/Subscribe 1.0 primarily addresses subscription management capabilities such as creating a subscription, renewing a subscription, and unsubscribing. However, this standard also allows Publish/Subscribe services to advertise and describe the supported message delivery protocols such as SOAP messaging, ATOM, and AMQP.
Annex C – Relevant OGC Engineering Reports
This Annex lists OGC Engineering Reports relevant to Edge-Fog-Cloud computing. Engineering Reports (ERs) are a primary output of OGC Innovation Program Initiatives (testbeds, pilot projects and interoperability experiments).
Name | Document | Description
---|---|---
Incident Management Information Sharing (IMIS) Internet of Things (IoT) Extension Engineering Report | 16-092r2 | The Incident Management Information Sharing (IMIS) Internet of Things (IoT) Pilot established the following objectives: · Apply Open Geospatial Consortium (OGC) principles and practices for collaborative development to existing standards and technology in order to prototype an IoT approach to sensor use for incident management. · Employ an agile methodology for collaborative development of system designs, specifications, software and hardware components of an IoT-inspired IMIS sensor capability. · Develop profiles and extensions of existing Sensor Web Enablement (SWE) and other distributed computing standards to provide a basis for future IMIS sensor and observation interoperability. · Prototype capabilities documented in engineering reports and demonstrated in a realistic incident management scenario. These principles continued through the IoT Pilot Extension, with additional objectives of: · Integration into the existing Next Generation First Responder (NGFR) Apex development program process as part of Spiral 1; · Defining steps to begin the integration of existing incident management infrastructure, e.g., pulling in National Institute of Emergency Management (NIEM) message feeds; and · Demonstration and experimentation in a ‘realistic’ incident environment using two physically separate sites – an incident site within an active first responder training facility (Fairfax County Lorton site), and a command center (DHS S&T Vermont Avenue facility). The initial Pilot activity has been documented in three OGC public engineering reports. The present report describes and documents the additional activities and innovations undertaken in the Extension.
OGC Testbed-13: Cloud Engineering Report | 17-035 | This OGC Engineering Report (ER) describes the use of OGC Web Processing Service (WPS) for cloud architecture in the OGC Testbed 13 Earth Observation Cloud (EOC) Thread. This report is intended to address issues in lack of interoperability and portability of cloud computing architectures which cause difficulty in managing the efficient use of virtual infrastructure such as in cloud migration, storage transference, quantifying resource metrics, and unified billing and invoicing. This engineering report describes the current state of affairs in cloud computing architectures and describes the participant architectures based on use case scenarios from sponsor organizations. Cloud computing is paving the way for future scalable computing infrastructures and is being used for processing digital earth observation data. In this EOC thread effort, data is stored in various storage resources in the cloud and accessed by an OGC Web Processing Service. The methods in which these processes are deployed and managed must be made interoperable to mitigate or avoid the complexities of administrative effort for the scientific community. In other words, the intent of this effort is to develop a way for scientists to acquire, process, and consume earth observation data without needing to administer computing cloud resources.
OGC Testbed-13: Workflows Engineering Report | 17-029r1 | This Engineering Report (ER) addresses the development of a consistent, flexible, adaptable workflow that will run behind the scenes. A user should be able to discover existing workflows via a catalog and execute them using their own datasets. An expert should be able to create workflows and to publish them. Previous OGC Testbed initiatives investigated workflows in the geospatial domain: OWS-3 Imagery Workflow Experiments; OWS-4 WPS IPR Workflow descriptions and lessons learned; OWS-4 Topology Quality Assessment Interoperability Program Report; OWS-5 Data View Architecture Engineering Report; and OWS-6 Geoprocessing Workflow Architecture Engineering Report. These initiatives mostly favored Business Process Execution Language (BPEL) as the workflow execution language. More recent studies ([6], [7]) were performed using BPMN as a means for describing and executing workflows comprised of OGC Web services. This ER gives an overview of existing approaches to compose and execute geospatial workflows and describes the approach taken in Testbed-13, taking into account security aspects.
OGC Testbed-13: Application Package Engineering Report | 17-023 | The Application Package OGC Engineering Report (ER) defines a data model and serialization for Thematic Exploitation Platforms (TEP) Application Packages. A TEP refers to a computing platform that follows a given set of scenarios for users, data and ICT provision aggregated around an Earth Science thematic area. This ER is part of the Testbed-13 Earth Observation Clouds (EOC) effort to support the development by the European Space Agency (ESA) of the TEP by exercising envisioned workflows for data integration, processing, and analytics based on algorithms developed by users that are deployed in multiple clouds. The wide usage of virtualization and the possibility to start virtual environments within Cloud services significantly simplifies the creation of environments and provisioning of resources. However, it still leaves a problem of portability between infrastructures. This ER identifies a strategy for packaging an application in a Cloud environment that will be able to run in a predictable manner in different computing production environments. The application packaging specifies the elements that will ensure: scientific reproducibility; dependencies identification and management; maintainability from an operational perspective, avoiding version piling; and portability across different Cloud providers. The ER proposes the use of containers, defining everything required to make a piece of software run packaged into isolated containers. Unlike a Virtual Machine (VM), a container does not bundle a full Operating System (OS) – only libraries and settings required to make the software work are needed. This makes for efficient, lightweight, self-contained systems and guarantees that software will always run the same, regardless of where it’s deployed. A discussion on application deployment and execution is presented in the separate OGC Testbed-13 Application Deployment and Execution Service ER [1].
Testbed-12 WPS Conflation Service Profile Engineering Report | 16-022 | One practical purpose of this ER is to describe how a conflation tool such as the Hootenanny software can be used for conflation tasks using the Web Processing Service interface. The developed WPS REST (conflation) Service is described in detail. Special focus is laid on more complex conflation tasks that include user interaction. During earlier testbeds, different conflation tools were connected to the WPS and different conflation tasks were performed (see [1] and [2]). The experiences gathered there, together with those gathered in Testbed 12, are captured in the ER. As the WPS REST (Conflation) Service is RESTful, this ER could be the basis for a REST binding extension for WPS 2.0. Service profiles are an important aspect of the WPS 2.0 standard. The ER investigates what a WPS 2.0 Conflation Profile could look like in the hierarchical profiling approach of WPS 2.0.
Testbed-12 PubSub / Catalog Engineering Report | 16-137r2 | This document describes how the OGC PubSub standard can be used as a mechanism to automatically notify analysts of data availability for CSW and other OGC Web Services (e.g. WFS, WCS). In particular, this document proposes the following: specific PubSub 1.0 extensions for CSW 2.0.2 and 3.0, leveraging standard functionalities, data models, and semantics to enable sending notifications based on user-specified area of interest and/or keywords; and a general, basic mechanism for enabling PubSub for the generic OGC Web Service over the existing request/reply OWS’s, i.e. usual requests as filters, usual responses as appropriate updates/data pushes, existing semantics and syntax expressiveness. This document is the result of activity performed within the Large-Scale Analytics (LSA) Thread of the OGC Testbed 12 Interoperability initiative, identified as document deliverable "A074 PubSub / Catalog Engineering Report". This document also captures lessons learnt from the implementation of component deliverable "A016 CSW 2.0.2 with PubSub Core Support Server".
Testbed-12 Low Bandwidth & Generalization Engineering Report | 16-021r1 | For delivering of data that is offered by OGC services over (very) low bandwidth, two options may be considered: on the one hand, the geospatial features remain the same, but compression techniques are used to reduce the size of the data that needs to be transferred; on the other hand, generalization techniques may be used by reducing the details of geometries and/or attributes in order to reduce the amount of data. The aim of this ER is to summarize the results of implementing sample services using compression techniques for DGIWG WFS (U002) and providing generalization processes using WPS (U003). The ER compares the results of the different approaches and infers recommendations and best practices for supporting data delivery of standard data and complex 3D data from OGC services over low and very low bandwidth.
Testbed-12 LiDAR Streaming Engineering Report | 16-034 | This Engineering Report describes how developments of the Community Sensor Model Working Group (CSMW) can be harmonized with the latest SWE specifications and developments in order to support streaming of LiDAR data with SWE technologies. The report provides an overview of both initiatives and then describes different options for integrating LiDAR data streams and SWE technologies. In particular, the ER considers the results of the activities SOS Compression (LiDAR) Server (A012) and LiDAR Streaming Client (A010) and infers recommendations for future developments.
OGC SWE Implementation Maturity Engineering Report | 13-032 | This report summarizes the outcomes of a process to assess the maturity of implementations based on SWE standards. This report covers the following areas: SWE standards overview; implementations of SWE in major systems; SWE software implementations and compliance; SWE implementations in IP; and recommendations and observations. A main outcome is the summary assessment of the SWE Implementation Maturity as presented in the Preface based on the body of the report.
OGC Fusion Standards Study, Phase 2 Engineering Report | 10-184 | This Engineering Report summarizes two phases of the Open Geospatial Consortium (OGC®) Fusion Standards study and of the fusion prototypes developed during the OWS-7 Testbed which occurred between the two study phases. Recommendations from the first phase of the study were implemented in OWS-7. Based upon the results of OWS-7, responses to two Requests for Information and a multi-day workshop, this report provides a cumulative set of recommendations for advancing fusion based on open standards.
Document History
Document contributor contact points
All questions regarding this document should be directed to the editor or the contributors:
Name | Organization
---|---
George Percivall | OGC
Revision history
Date | Version | Editor | Description - Primary clauses modified
---|---|---|---
2018-02-05 | Initial | G. Percivall | First version of document - Prepared for JTC 1 Joint Advisory Group Meeting, March 2018
2018-05-14 | 18-004r1 | G. Percivall | Updated based on comments from OpenFog Consortium and after presentation to OGC Members TC in Future Directions session, March 2018 - Motion passed to release as a public OGC White Paper.
[1] http://www.iec.ch/whitepaper/pdf/IEC_WP_Edge_Intelligence.pdf
[2] Figure 11 - OpenFog Reference Architecture for Fog Computing, OpenFog Consortium. February 2017. Document 1 OPFRA001.020817
[3] https://csrc.nist.gov/CSRC/media//Publications/sp/800-191/draft/documents/sp800-191-draft.pdf
[4] The entire paragraph is an edited version from: Ann Keller, S., Koonin, S. E. and Shipp, S. (2012), Big data and city living – what can it do for us? Significance, 9: 4–7. doi:10.1111/j.1740-9713.2012.00583.x
[5] Smart city definition from BSI PAS 180
[6] http://espresso.espresso-project.eu/wp-content/uploads/2017/03/D4-17579.2-Smart-City-reference-architecture-report.pdf
[7] McBratney, A., Whelan, B., Ancev, T., 2005. Future Directions of Precision Agriculture. Precision Agriculture, 6, 7-23
[8] http://www.linkedeodata.eu/Precision_Farming
[9] http://www.opengeospatial.org/standards/movingfeatures
[10] http://portal.opengeospatial.org/files/?artifact_id=34977
[11] http://portal.opengeospatial.org/files/?artifact_id=34126
[12] http://en.swcb.gov.tw/
[13] http://www.gis.fcu.edu.tw/
[14] http://en.fcu.edu.tw/
[15] http://docs.opengeospatial.org/per/16-098.html
[16] https://youtu.be/aSQFIPwf2oM
[17] OGC Fusion Standards Study, Phase 2 Engineering Report
[18] http://ieeexplore.ieee.org/document/7740215/
[19] Testbed-12 WPS Conflation Service Profile Engineering Report
[21] OGC Testbed-13: Workflows ER
[22] Testbed-12 Low Bandwidth & Generalization Engineering Report
[23] Testbed-12 LiDAR Streaming Engineering Report
[24] https://cacm.acm.org/magazines/2018/2/224627-risks-of-trusting-the-physics-of-sensors/
[25] OGC Moving Features Access
[26] https://cdn.ihs.com/www/pdf/TEC-Video-Surveillance-Trends.pdf
[27] http://docs.opengeospatial.org/wp/16-131r2/16-131r2.html
[28] http://www.opengeospatial.org/projects/initiatives/testbed14
[29] https://www.w3.org/TR/sdw-bp/
[30] https://www.edgexfoundry.org/about/
[31] http://www.iec.ch/whitepaper/
[32] http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.203.3161
[33] https://www.sciencedirect.com/science/article/pii/S0040162510001393