I. Abstract
This OGC Discussion Paper proposes the development of Open Geospatial Consortium (OGC) standards that define a framework of location-based service metrics characterizing the spatial, spectral, and temporal errors associated with various data sources. The paper reviews current industry practices for spatial errors, spectral errors, and error propagation, and presents a proposed framework and a recommended study effort.
II. Keywords
The following are keywords to be used by search engines and document catalogues.
ogcdoc, OGC document, OGC, remote sensing, spectral error, imaging
III. Preface
Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights.
Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.
IV. Security Considerations
No security considerations have been made for this document.
V. Submitting Organizations
The following organizations submitted this Document to the Open Geospatial Consortium (OGC):
- Maxar Technologies, Westminster, CO
- Exquisite Geolocation Systems, Alexandria, VA
VI. Submitters
All questions regarding this submission should be directed to the editor or the submitters:
| Name | Affiliation |
| --- | --- |
| Navulur, K. | Maxar Technologies, Westminster, CO |
| Abrams, M. C. | Exquisite Geolocation Systems, Alexandria, VA |
Standardizing a Framework for Spatial and Spectral Error Propagation
1. Introduction
Increasingly, location-based services bring together information products and services into a common data ecosystem in which all of the data is expected to be synergistic and interoperable, so that GNSS-based location services, navigation databases, and satellite-derived image-maps are current, accurate, and precise at the scale of a human being. Emerging technologies such as autonomous vehicles and military robots will require location information to be current, reliable, and actionable, as will every smart phone and Internet of Things (IoT) device. All of these devices need consistent, current, accurate, and precise coordinates in order to perform their functions effectively. The current state of practice for describing the spatial accuracy of a location is insufficient to capture the error sources introduced during data capture at the sensor level, in the ancillary data used for processing the location data, and in the data transformations (projections, resampling, warping, etc.) necessary to register, fuse, extract, and identify the feature content needed for location-based services. Consequently, the data and derived services are unreliable for applications that require high precision and accuracy.
Three examples illustrate the latent complexity: two aspects of satellite photogrammetry (the magic behind the various Earth skins that provide the visual context for applications such as Google Earth, Bing Maps, Baidu, and OpenStreetMap) and GNSS-based navigation.
Overhead photogrammetry combines multiple images to create a digital surface model (DSM), such as that of the Shuttle Radar Topography Mission (SRTM), which produced a 30 m spatial resolution topographic map of Earth from 56° S to 60° N with a vertical precision of 9.8 m. High resolution images are ortho-rectified against this type of DSM to produce the basemaps commonly found in most navigation applications. The spatial resolution of the imagery (often 0.25-1.0 m) provides a very precise impression of the planet’s surface, but with a spatial accuracy that is fundamentally limited by the underlying topography. Higher resolution 3D topography and orthoimages can be produced by correlating multiple images, with a corresponding degradation due to the time interval required to collect sufficient imagery with the necessary diversity of viewing geometries to create an effective 3D representation of the scene. The results are often remarkable, achieving ~1 m spatial resolution (precision due to resampling) and accuracies of less than 3 m relative to DGNSS location determination.
Adding complexity to the satellite photogrammetry problem (simultaneous sampling of an object from multiple geometries) is the desire to collect simultaneous multi-spectral data to facilitate material identification, which is technically difficult (and practically impossible). Instead, remote sensing systems collect data under different viewing geometries, at different times of day, with different weather and atmospheric conditions, and often using multiple sensors each of which has a different calibration schema. The downstream processing algorithm must digest all of this data and put it into a common reference frame (spatially, temporally, and spectro-radiometrically) in order to produce a high fidelity representation of the target scene.
This becomes particularly relevant as AI/ML technologies mature and there is a need to ensure the spectral integrity of the data for automated information extraction that can be relied upon in the field. With sensors capturing data under varying collection geometries, collection times of day, and atmospheric conditions (including BRDF effects), as well as with varying processing techniques (QUAC, FLAASH, etc.), end users need to understand the fidelity of the data for spectral analysis. The provenance and curation of AI/ML training sets will become a discriminating feature of location-based information systems and will depend uniquely on the calibration and the spectral and radiometric integrity of the data.
GNSS-based navigation determines a user’s position through a process of quad-lateration (‘triangulation’ against four known objects to determine time, latitude, longitude, and elevation) using RF time-of-flight measurements with an accuracy of ~5 m for smartphone-class devices and ~3-5 cm per axis for differential GNSS devices. Recently, multi-GNSS smart phone devices have demonstrated < 2 m geolocation, offering the near-term potential for human-scale location services with commodity smartphones.
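To make the quad-lateration step concrete, the following is a minimal sketch of an iterative least-squares position solve from four pseudoranges, assuming idealized measurements with a single receiver clock bias and no atmospheric or ephemeris errors; the satellite coordinates and bias value are illustrative, not taken from any real constellation.

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def solve_position(sat_pos, pseudoranges, iters=20):
    """Gauss-Newton solve for (x, y, z, clock bias) from >= 4 pseudoranges.

    sat_pos: (N, 3) ECEF satellite positions, meters.
    pseudoranges: (N,) measured ranges = geometric range + c * receiver bias.
    """
    x = np.zeros(4)  # coarse start at Earth's center with zero clock bias
    for _ in range(iters):
        diff = x[:3] - sat_pos                  # satellite-to-receiver vectors
        rho = np.linalg.norm(diff, axis=1)      # geometric ranges
        residual = pseudoranges - (rho + x[3])  # bias carried in meters
        # Jacobian: unit line-of-sight vectors plus a clock-bias column
        J = np.hstack([diff / rho[:, None], np.ones((len(rho), 1))])
        dx, *_ = np.linalg.lstsq(J, residual, rcond=None)
        x += dx
        if np.linalg.norm(dx) < 1e-4:           # sub-mm update: converged
            break
    return x[:3], x[3] / C                      # position (m), clock bias (s)

# Illustrative geometry: receiver on the surface, satellites at GPS-like radii
truth = np.array([6_371_000.0, 0.0, 0.0])
sats = np.array([[26_600e3, 0.0, 0.0],
                 [0.0, 26_600e3, 0.0],
                 [0.0, 0.0, 26_600e3],
                 [15_357e3, 15_357e3, 15_357e3]])
bias_m = 3.0e-5 * C  # a 30 microsecond receiver clock error
pr = np.linalg.norm(sats - truth, axis=1) + bias_m
pos, dt = solve_position(sats, pr)
print(np.round(pos - truth, 4), dt)  # position error ~ 0, dt ~ 3.0e-5 s
```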
Remarkably, as is demonstrated on smartphones every day, these essentially independent location services locate the device with remarkable consistency, most of the time (often GNSS location services will place you within your house footprint while you are inside your house and shielded from a direct line of sight to the GNSS satellites!). Unfortunately, when they fail, individually or collectively, the results for the location-service enabled individual (or device) can have negative consequences. In its benign form, a navigation device incorrectly reports a location from an image basemap that is low resolution or obsolete. In a more complicated form, certain countries view location information as a security issue, intentionally remap location information using a confidential non-linear algorithm (GCJ-02, for example), and regulate the usage of GPS or GNSS services. In its worst form, incorrect or incompatible coordinates can have devastatingly negative consequences, as the inadvertent bombings of an embassy in Belgrade (1999) and a hospital in Afghanistan (2015) demonstrated. Accurate coordinates matter in all aspects of navigation, and few individuals are able to validate the accuracy and provenance of a coordinate or address at human scales.
This proposal recommends the development of Open Geospatial Consortium (OGC) standards that define a framework for location-based service metrics that inform the spatial, spectral, and temporal errors associated with various data sources. The geomatics and geodesy community of practice has had nearly 400 years to develop methodologies for location determination, and the photogrammetry community has been making overhead maps for more than 100 years. Consequently, as with any well-developed discipline, there is a diversity (and divergence) of methods for error propagation that would benefit from an international standards organization supporting the research, development, testing, and evaluation of metrics to enable interoperable location-based services and to encourage convergence where possible and technically appropriate. Further, this project can leverage current standards at OGC, as well as military standards, to create a comprehensive framework for error budgets.
2. Current Industry Practices
Mirroring the introduction above, current practices are divided into three sections that address spatial errors, spectral errors, and error propagation, including accuracy estimates. Industry and government practices and standards are identified where appropriate.
2.1. Spatial errors
Digital maps from Google, Bing, Apple, and OSM have become de facto mapping standards and are used by the majority of consumers for navigation and location-based services. Various government-provided digital mapping products are available and are included in some of these mapping services. Each location service provider uses different data sources and processing techniques to create, update, and publish its maps. None of the service providers qualify their methodology or product, other than with some version of the ‘standard disclaimer’ that the operator is responsible for proper navigation. A further concern, with the advent of autonomous vehicles, is the curation of a navigation database to reflect the current usability of a recommended trajectory. Occasional academic studies compare digital maps with local DGNSS measurements and provide some insight into local precision and accuracy, but no global studies have been published to date1.
Inherent in a commercial digital map service will be a set of technical decisions regarding resolution, accuracy, and currency that optimize the return on investment for location-based services. Search for a place like Mocorón, Honduras, or Linden, Guyana, and you will immediately recognize that these are not locations with significant ROI for location-based services. In contrast, one might expect that urban areas would be consistently and accurately mapped and updated frequently. Figure 1 illustrates some of the discrepancies between commonly used maps over the same region in Beijing, China. Google2 and Bing maps have noticeable misalignments between the road vectors and the underlying imagery, while Apple and Baidu road vectors align closely with the imagery. Among the four, it is impossible to ascertain which data sets are accurate, although a comparison with GPS data provided to OSM could be used as an independent source of location information (granting that the collection and provision/use of such data violates the surveying and mapping law of the People’s Republic of China (2002)3).
Figure 1 — Comparison of various mapping portals and the spatial errors associated with them.
In contrast, government organizations have been very explicit about data quality, data qualification, and error propagation in the development of their mapping and imagery products. In 1991, the US government published MIL STD 6000014 as a standard for mapping, charting, and geodesy that has continued to be accepted as a standard and best practice nearly 30 years later. It characterizes coordinate precision and accuracy in terms of 90% circular error (CE90) and 90% linear error (LE90) for horizontal and vertical error estimation and references these coordinates to a horizontal datum (World Geodetic System (WGS) 84) and a time-varying vertical datum (Earth Gravitational Model (EGM) 2008 is the current revision). These accuracy standards have been used for the assessment of the accuracy and precision of commercial satellite imaging systems such as those of DigitalGlobe (JACIE reference). These results have been extended to the assessment of high resolution 3D terrain models and orthoimages5.
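As a worked illustration of these metrics, the sketch below computes empirical CE90 and LE90 values from checkpoint residuals; the residual arrays are synthetic placeholders, and MIL STD 600001 also defines parametric (Gaussian-based) estimators that this percentile version does not reproduce.

```python
import numpy as np

def ce90_le90(dx, dy, dz):
    """Empirical CE90/LE90 from checkpoint residuals (meters).

    CE90: radius containing 90% of the horizontal errors.
    LE90: magnitude bounding 90% of the vertical errors.
    """
    horizontal = np.hypot(dx, dy)  # radial horizontal error per checkpoint
    ce90 = np.percentile(horizontal, 90)
    le90 = np.percentile(np.abs(dz), 90)
    return ce90, le90

# Illustrative residuals: image-minus-survey offsets at 200 checkpoints
rng = np.random.default_rng(0)
dx = rng.normal(0.4, 1.0, 200)  # 0.4 m easting bias, 1 m noise
dy = rng.normal(0.0, 1.0, 200)
dz = rng.normal(0.0, 1.5, 200)
ce90, le90 = ce90_le90(dx, dy, dz)
print(f"CE90 = {ce90:.2f} m, LE90 = {le90:.2f} m")
```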
Conveniently, the USG published the WGS 84 datum and continues to use it as the reference frame for all GPS-based location services, with regular updates to the geoid estimate to better model the surface of the Earth relative to the gravitational equipotential surface (the underlying vertical ellipsoid reference frame). In contrast to the photogrammetry community, the GPS/GNSS community has chosen to utilize different accuracy metrics (CE95/LE95) and 3D accuracy metrics such as 3DRMS. Consequently, when a GNSS position error estimate is displayed on an image map, the relevant questions are: what are the corresponding error estimates for the underlying image, is the joint position determination consistent with the individual error sources, and how do these errors change with the improving positional accuracies of smart phones (Figure 2)?
Figure 2 — The location accuracy of smart phones has improved steadily over the last few years.
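Because the photogrammetry and GNSS communities report at different probability levels, conversions between their metrics are often needed. Under the common assumption of zero-mean, circular Gaussian horizontal error, the percentile scalings follow the Rayleigh distribution, as in the sketch below; real error distributions are rarely exactly circular, so the assumption should be treated as part of the reported metric.

```python
import math

# For a circular Gaussian, the radial error is Rayleigh distributed:
# percentile p occurs at sigma * sqrt(-2 ln(1 - p)).
def circular_scale(p):
    return math.sqrt(-2.0 * math.log(1.0 - p))

CE90_K = circular_scale(0.90)  # ~2.146 sigma
CE95_K = circular_scale(0.95)  # ~2.448 sigma
DRMS_K = math.sqrt(2.0)        # ~1.414 sigma, radial RMS for equal axes

def ce90_to_ce95(ce90):
    """Convert a photogrammetric CE90 to the GNSS community's CE95."""
    return ce90 * CE95_K / CE90_K

def ce90_to_2drms(ce90):
    """Convert CE90 to 2DRMS (twice the radial RMS), a common GNSS metric."""
    return ce90 * 2.0 * DRMS_K / CE90_K

print(ce90_to_ce95(3.0))   # a 3.0 m CE90 is ~3.42 m CE95
print(ce90_to_2drms(3.0))  # ...and ~3.95 m 2DRMS
```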
Today, with five competing satellite navigation systems (GPS, GLONASS, Galileo, BeiDou, and QZSS) and the regional IRNSS/NavIC (India) system, it is expected that each system will pursue its own independent world geodetic system framework (much as nationally-based mapping organizations have maintained independent map projections for the last 400 years). Many current generation smartphones are beginning to provide estimates of location accuracy, within the constraints of a real-time (once per second) location service capability with limited battery power. Each vendor makes different choices regarding GNSS hardware and software in an attempt to provide sufficient location and navigation services within the available power budget for its device. Additionally, many vendors utilize a hybrid location strategy that combines GNSS, cellular tower location, and WiFi geolocation to enable optimal interior geolocation at a manageable power budget. A majority of 5G devices (and likely IoT devices) will be inherently multi-GNSS capable, creating the opportunity for location-based service providers to benefit from high-density RF geolocation with a corresponding improvement in consistency, precision, accuracy, and timeliness, but with a commensurate power impact. Unfortunately, no vendor has published its error propagation algorithm or provided a demonstration of its devices’ performance against geodetic benchmarks. Occasional published results6 indicate the statistical accuracy of smartphone-class devices relative to DGNSS survey and have utilized USG standards and recommended best practices for the characterization of the precision and accuracy of location-based service devices.
Logistics companies such as FedEx and UPS are also interested in understanding spatial positioning error in the X, Y, and Z dimensions to optimize their delivery routes, yet error metrics for Z are rarely included in these maps. Traditional cartographic concepts such as map scale are no longer relevant in the digital map age and make gross assumptions about Z accuracy.
2.2. Spectral errors
In order to provide current, high resolution color basemaps worldwide, satellite and airborne companies collect imagery continuously, with the result that imagery of an area is typically collected at varying times of day over weeks and months (or years) and may extend across different seasons. A ‘typical’ example is provided in Figure 3, illustrating the Denver metropolitan area as collected by the satellite constellation of Maxar (which is headquartered in the Denver area). As illustrated, this browse imagery is not color balanced across images, shows varying off-nadir angles and differing atmospheric conditions, was collected at different times of the year, and spans multiple sensor modalities (visible, near-infrared, and shortwave infrared). Individual image scenes contain pixels with varying spectro-radiometric intensities as a result of a combination of different viewing angles and the varying spectral behavior of objects on the ground (the inherent bi-directional reflectance distribution function, BRDF). Each of these effects needs to be accounted for in the data processing to properly calibrate the imagery and enable accurate change detection, feature detection and identification, and reliable automated information extraction (especially with the advent of AI/ML-enabled information extraction techniques). Needless to say, Figure 3 illustrates how these differences will limit the automated feature extraction opportunity space in the absence of an error propagation methodology that accounts for the distortion of each pixel across the collection, processing, and exploitation cycle.
Figure 3 — A ‘typical’ browse image collection (Denver metropolitan area) illustrating the limitations inherent from multi-look, multi-day, multi-spectral, multi-season data without sufficient calibration and error propagation.
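One prerequisite for comparing such heterogeneous collects is a common radiometric reference. A minimal sketch of a digital-number to top-of-atmosphere reflectance conversion is shown below; the gain, offset, and solar irradiance values are illustrative placeholders, not calibration constants for any particular Maxar sensor, and this step does not remove BRDF or atmospheric effects.

```python
import math
import numpy as np

def dn_to_toa_reflectance(dn, gain, offset, esun, sun_elev_deg, earth_sun_au):
    """Convert raw digital numbers to top-of-atmosphere reflectance.

    radiance = gain * DN + offset                    (W / m^2 / sr / um)
    reflectance = pi * L * d^2 / (Esun * cos(solar zenith))
    """
    radiance = gain * dn.astype(np.float64) + offset
    cos_zenith = math.cos(math.radians(90.0 - sun_elev_deg))
    return math.pi * radiance * earth_sun_au**2 / (esun * cos_zenith)

# Illustrative single-band conversion for an 11-bit image tile
dn = np.array([[310, 512], [740, 1290]], dtype=np.uint16)
rho = dn_to_toa_reflectance(dn, gain=0.02, offset=-1.2, esun=1550.0,
                            sun_elev_deg=35.0, earth_sun_au=1.014)
print(rho)  # unitless reflectance, nominally 0..1
```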
2.3. Error propagation
Orthorectified image mosaics introduce a unique opportunity for error propagation analysis. Utilizing the methodology of MIL STD 600001, an assessment of the geometric precision and accuracy of such a mosaic can be performed, with a summary result of an absolute accuracy of <3 m (CE90) and a relative accuracy of <1 m (CE90). As with most orthomosaics, there are no measured elevations and consequently no measured vertical uncertainties (LE90). Extending the methodology into 3D, three-dimensional error analyses with an absolute accuracy of 3.1 m (CE90) and 2.4 m (LE90) and a relative accuracy of 0.46 m (CE90) and 0.03 m (LE90)7 are now possible from satellite imagery. The USG has developed a Generalized Positioning Model to provide a predictive model for error estimation of 2D and 3D imaging systems and their derived data products.
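Summary accuracies of this kind are typically built up from an error budget. Below is a minimal sketch of the root-sum-square accumulation used when the contributing error sources can be treated as independent; the source names and 1-sigma magnitudes are illustrative, not values taken from the GPM or from reference 7.

```python
import math

def rss_budget(sources):
    """Combine independent 1-sigma error contributions by root-sum-square."""
    return math.sqrt(sum(sigma**2 for sigma in sources.values()))

# Illustrative horizontal budget for an orthomosaic (1-sigma, meters)
horizontal = {
    "sensor pointing": 1.0,
    "ephemeris": 0.4,
    "DSM-induced displacement": 0.8,
    "resampling / tie-point fit": 0.3,
}
sigma_h = rss_budget(horizontal)
print(f"combined 1-sigma horizontal error: {sigma_h:.2f} m")
print(f"approx CE90 (circular Gaussian): {2.146 * sigma_h:.2f} m")
```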
The hazard of orthomosaic generation is the spatial, temporal, and spectral averaging that is necessary to accumulate a ‘complete’ mosaic with a finite number of parallax (or lay-over) artifacts, along with the residual issue of adequate sampling (and resolution) on vertical and partially obscured surfaces. The remote sensing community8 has developed a General Image Quality Equation (GIQE) that addresses the image sampling, radiometry, and image construction/reconstruction problem9 and has established a theoretical basis for quantitative error propagation.
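For reference, the sketch below implements the widely published GIQE version 4 regression (coefficients per Leachtenauer et al., 1997); the example inputs are illustrative, and the newer GIQE 5 revision and any spectral (SIQE) extension are not reproduced here.

```python
import math

def giqe4_niirs(gsd_inches, rer, overshoot, noise_gain, snr):
    """General Image Quality Equation v4 (Leachtenauer et al., 1997).

    gsd_inches: geometric-mean ground sample distance, in inches.
    rer:        geometric-mean relative edge response.
    overshoot:  geometric-mean edge overshoot (H).
    noise_gain: post-processing noise gain (G).
    snr:        signal-to-noise ratio.
    """
    if rer >= 0.9:
        a, b = 3.32, 1.559
    else:
        a, b = 3.16, 2.817
    return (10.251 - a * math.log10(gsd_inches) + b * math.log10(rer)
            - 0.656 * overshoot - 0.344 * noise_gain / snr)

# Illustrative: a 0.5 m GSD system with moderate sharpening
print(giqe4_niirs(gsd_inches=0.5 / 0.0254, rer=0.92, overshoot=1.0,
                  noise_gain=10.0, snr=50.0))  # NIIRS ~ 5.2
```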
Model-based error analysis and propagation that includes spectral and temporal (BRDF) effects has been demonstrated10 and provides the basis for developing the recommended strategy for the OGC error propagation framework and associated metrics. As illustrated in Figure 3, the necessary spatial, temporal, and spectral information will require reporting on a per-pixel basis, which places a significant burden on the exploitation algorithm to minimize the required data volume. An alternative schema would be the aggregation of ‘regions of similarity’, which would permit the minimization of the data volume based on a fidelity/accuracy specification on the part of the location-service provider.
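One plausible form of such a ‘regions of similarity’ aggregation is a quadtree that subdivides a per-pixel error raster until each region’s spread fits a caller-specified tolerance, as in the hypothetical sketch below; the tolerance, raster, and splitting criterion are illustrative assumptions rather than a proposed standard.

```python
import numpy as np

def aggregate(err, x0, y0, size, tol, out):
    """Quadtree-aggregate a per-pixel error raster into uniform regions.

    A region is emitted when the error spread within it is <= tol;
    otherwise it is split into four quadrants and recursed.
    """
    block = err[y0:y0 + size, x0:x0 + size]
    if size == 1 or block.max() - block.min() <= tol:
        out.append((x0, y0, size, float(block.mean())))
        return
    half = size // 2
    for dy in (0, half):
        for dx in (0, half):
            aggregate(err, x0 + dx, y0 + dy, half, tol, out)

# Illustrative 64x64 error surface: smooth gradient plus one noisy patch
rng = np.random.default_rng(1)
err = np.fromfunction(lambda y, x: 1.0 + 0.01 * x, (64, 64))
err[40:56, 8:24] += rng.normal(0.0, 0.5, (16, 16))
regions = []
aggregate(err, 0, 0, 64, 0.1, regions)
print(f"{len(regions)} regions instead of {64 * 64} per-pixel reports")
```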
3. Proposed Framework
Existing error propagation models range from scene-based to pixel-based, with the logical observation that the range of complexity carries with it a corresponding range of fidelity and accuracy.
Presently, error propagation and accuracy frameworks exist that address the spatial precision and accuracy of a mosaic-type production chain without addressing the temporal averaging artifacts implicit in multi-look overhead imaging. The extension of these frameworks to 3D has been accomplished with the Generalized Positioning Model (GPM), again without addressing the temporal averaging artifact11. Independently, the GNSS RF geodesy community has a similar schema for geolocation accuracy, with the caveat that its units of reporting are not common with those of the photogrammetry community.
General and Spectral Image Quality Equations (GIQE/SIQE) exist and permit the demonstration of model-based error propagation for overhead remote sensing systems, with the caveat that most of these models are per-pixel based and consequently largely impractical for operational data processing.
Consequently, the majority of the required elements exist in isolated disciplines, and the challenge of this proposed study will be to determine the complexity of building an integrated error propagation model and to find a mechanism that permits fidelity and accuracy to be curated at an affordable level of additional complexity.
3.1. Recommended Study Effort
- A technical evaluation of the implementation of a generalized positioning model (GPM-like) should be performed on 2D and 3D data for common sites against DGNSS ground control. The extension of this construct to include parallax, lay-over, and temporal averaging would define the additional level of complexity necessary to account for a completeness metric (accounting for lay-over in 2D and 3D obscuration).
- A technical evaluation of the implementation of a GIQE12/SIQE error propagation model that accounts for time of day, sun angle, and bi-directional reflectance distribution functions (BRDF) should be performed, ideally on a site with the capability of calibrated spectro-radiometric and BRDF ground truth measurements.
- A technical evaluation of the feasibility of a ‘less than per-pixel’ aggregation model for curating accuracy and fidelity as a function of data volume and cost should be performed.
- Community technical interface meetings should be held to develop a common, standards-based, international framework for accuracy assessment and error propagation in the location-based services community of practice. A specific, desirable outcome of these meetings would be convergence on a common set of definitions and metrics for error propagation and accuracy assessments.
Annex A
(normative)
Revision History
Table A.1
| Date | Release | Editor | Primary clauses modified | Description |
| --- | --- | --- | --- | --- |
| 2021-01-29 | 1.0 | K. Navulur | all | initial version |