Publication Date: 2019-02-15

Approval Date: 2018-12-13

Submission Date: 2018-11-28

Reference number of this document: 18-028r2

Reference URL for this document: http://www.opengis.net/doc/PER/t14-D011

Category: Public Engineering Report

Editor: Guy Schumann (RSS)

Title: WMS Quality of Service & Experience


OGC Engineering Report

COPYRIGHT

Copyright (c) 2019 Open Geospatial Consortium. To obtain additional rights of use, visit http://www.opengeospatial.org/

WARNING

This document is not an OGC Standard. This document is an OGC Public Engineering Report created as a deliverable in an OGC Interoperability Initiative and is not an official position of the OGC membership. It is distributed for review and comment. It is subject to change without notice and may not be referred to as an OGC Standard. Further, any OGC Engineering Report should not be referenced as required or mandatory technology in procurements. However, the discussions in this document could very well lead to the definition of an OGC Standard.

LICENSE AGREEMENT

Permission is hereby granted by the Open Geospatial Consortium, ("Licensor"), free of charge and subject to the terms set forth below, to any person obtaining a copy of this Intellectual Property and any associated documentation, to deal in the Intellectual Property without restriction (except as set forth below), including without limitation the rights to implement, use, copy, modify, merge, publish, distribute, and/or sublicense copies of the Intellectual Property, and to permit persons to whom the Intellectual Property is furnished to do so, provided that all copyright notices on the intellectual property are retained intact and that each person to whom the Intellectual Property is furnished agrees to the terms of this Agreement.

If you modify the Intellectual Property, all copies of the modified Intellectual Property must include, in addition to the above copyright notice, a notice that the Intellectual Property includes modifications that have not been approved or adopted by LICENSOR.

THIS LICENSE IS A COPYRIGHT LICENSE ONLY, AND DOES NOT CONVEY ANY RIGHTS UNDER ANY PATENTS THAT MAY BE IN FORCE ANYWHERE IN THE WORLD. THE INTELLECTUAL PROPERTY IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. THE COPYRIGHT HOLDER OR HOLDERS INCLUDED IN THIS NOTICE DO NOT WARRANT THAT THE FUNCTIONS CONTAINED IN THE INTELLECTUAL PROPERTY WILL MEET YOUR REQUIREMENTS OR THAT THE OPERATION OF THE INTELLECTUAL PROPERTY WILL BE UNINTERRUPTED OR ERROR FREE. ANY USE OF THE INTELLECTUAL PROPERTY SHALL BE MADE ENTIRELY AT THE USER’S OWN RISK. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR ANY CONTRIBUTOR OF INTELLECTUAL PROPERTY RIGHTS TO THE INTELLECTUAL PROPERTY BE LIABLE FOR ANY CLAIM, OR ANY DIRECT, SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES, OR ANY DAMAGES WHATSOEVER RESULTING FROM ANY ALLEGED INFRINGEMENT OR ANY LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR UNDER ANY OTHER LEGAL THEORY, ARISING OUT OF OR IN CONNECTION WITH THE IMPLEMENTATION, USE, COMMERCIALIZATION OR PERFORMANCE OF THIS INTELLECTUAL PROPERTY.

This license is effective until terminated. You may terminate it at any time by destroying the Intellectual Property together with all copies in any form. The license will also terminate if you fail to comply with any term or condition of this Agreement. Except as provided in the following sentence, no such termination of this license shall require the termination of any third party end-user sublicense to the Intellectual Property which is in force as of the date of notice of such termination. In addition, should the Intellectual Property, or the operation of the Intellectual Property, infringe, or in LICENSOR’s sole opinion be likely to infringe, any patent, copyright, trademark or other right of a third party, you agree that LICENSOR, in its sole discretion, may terminate this license without any compensation or liability to you, your licensees or any other party. You agree upon termination of any kind to destroy or cause to be destroyed the Intellectual Property together with all copies in any form, whether held by you or by any third party.

Except as contained in this notice, the name of LICENSOR or of any other holder of a copyright in all or part of the Intellectual Property shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Intellectual Property without prior written authorization of LICENSOR or such copyright holder. LICENSOR is and shall at all times be the sole entity that may authorize you or any third party to use certification marks, trademarks or other special designations to indicate compliance with any LICENSOR standards or specifications.

This Agreement is governed by the laws of the Commonwealth of Massachusetts. The application to this Agreement of the United Nations Convention on Contracts for the International Sale of Goods is hereby expressly excluded. In the event any provision of this Agreement shall be deemed unenforceable, void or invalid, such provision shall be modified so as to make it valid and enforceable, and as so modified the entire Agreement shall remain in full force and effect. No decision, action or inaction by LICENSOR shall be construed to be a waiver of any rights or remedies available to it.

None of the Intellectual Property or underlying information or technology may be downloaded or otherwise exported or reexported in violation of U.S. export laws and regulations. In addition, you are responsible for complying with any local laws in your jurisdiction which may impact your right to import, export or use the Intellectual Property, and you represent that you have complied with any regulations or registration procedures required by applicable law to make this license enforceable.

1. Summary

1.1. Rationale

Quality of Service (QoS) and Quality of Experience (QoE), as they are intended and described at OGC, are two related concepts which require very specific treatment and characterization. Citing the definitions provided by the Domain Working Group (DWG) charter document:

  • Quality of Service: Technical reliability and performance of a network service, typically measured using metrics like error rates, throughput, availability and delay or request response time. This Engineering Report (ER) addresses QoS aspects such as service availability, scalability and speed.

  • Quality of (User) Experience: A holistic, qualitative measure of the customers' experience of the application or service. It encompasses both the user experience and the customer support experience of the evaluated applications and/or services.

QoE focuses on the usability of the information conveyed by OGC services to end users or other client applications, and is therefore concerned more with qualitative aspects of such services, like the presence of metadata, proper and descriptive naming, appropriate styling and so on (a more thorough treatment can be found in the QoE discussion paper OGC 17-049, entitled "Ensuring Quality of User Experience with OGC Web Mapping Services", available at https://portal.opengeospatial.org/files/?artifact_id=74403&version=1).

QoS focuses on providing reliable (i.e. quantitative) measures of spatial data service metrics which can be used to characterize how a service (i.e. one or more specific datasets exposed by a certain service) is performing, both in near real time and historically. It touches on concepts like availability, scalability (also known as capacity) and absolute performance (i.e. speed), and can also be used to assess the performance perceived by final clients. As mentioned above, QoS is typically measured using metrics like error rates, throughput, availability and delay or request response time.
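
As an illustration only (this sketch is not a Testbed deliverable), the following Python snippet shows how the metrics listed above could be aggregated from raw request samples; the Sample structure and the fixed measurement window are assumptions made for the example:

    # Illustrative only: aggregate recorded WMS requests into the QoS metrics
    # named above (availability, error rate, throughput, response time).
    from dataclasses import dataclass
    from statistics import mean, quantiles

    @dataclass
    class Sample:
        status: int        # HTTP status code returned by the service
        elapsed_s: float   # request/response round-trip time in seconds

    def qos_metrics(samples: list[Sample], window_s: float) -> dict:
        """Summarize raw request samples taken over a window of window_s seconds."""
        ok = [s for s in samples if s.status == 200]
        latencies = sorted(s.elapsed_s for s in ok)
        return {
            "availability_pct": 100.0 * len(ok) / len(samples),
            "error_rate_pct": 100.0 * (len(samples) - len(ok)) / len(samples),
            "throughput_rps": len(samples) / window_s,
            "mean_latency_s": mean(latencies),
            "p95_latency_s": quantiles(latencies, n=20)[18],  # 95th percentile
        }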

Quite often, the QoS and QoE aspects of spatial data services are underestimated, if not simply ignored, due to a lack of resources as well as a lack of awareness, resulting in services which are difficult to exploit (i.e. very low QoE) and/or unstable or very slow (i.e. very low QoS). The result is that few users keep using them after the initial launch. This is especially true for services targeting end users who are used to interacting with services à la Google Maps, which deliver extreme performance and scalability as well as bullet-proof usability.

1.2. Context

The ability to combine and visualize location-based data using a Web Map Service (WMS) is a key value proposition of the Federal Geospatial Platform (FGP). The FGP is a collaborative online environment where a collection of the government of Canada’s most relevant geospatial information can be found easily and viewed on maps to support evidence-based decision-making, foster innovation, and provide better service for Canadians. The FGP includes the capability of selecting datasets with WMS to view individually or combined with other services in the FGP viewer.

This functionality, as is the general case with a WMS, is provided to allow users to immediately visualize and analyze geospatial data. Unfortunately, user feedback has shown that these geospatial web services are not always easy or intuitive to navigate, combine or understand. Because the FGP's primary end users are government policy analysts who are not always highly familiar with mapping and web mapping technologies, it is important to make WMS, and the content they make available, as user-friendly as possible.

In 2016, to help alleviate this issue, the FGP developed a web service quality assessment methodology that supported WMS, ran an assessment, and developed recommendations and best practices to support a user-friendly experience for all employees and citizens using WMS. Assessments to date have shown that key considerations are often very simple, yet have a large impact on QoE. The results of this study were used as the primary input into an OGC Discussion Paper created by the Quality of Service and Experience (QoSE) DWG. The OGC QoSE DWG has developed a discussion paper (OGC 17-049), entitled "Ensuring Quality of User Experience with OGC Web Mapping Services" [https://portal.opengeospatial.org/files/17-049], that identifies and describes issues often encountered by users of OGC WMS that affect the quality of their experience. It also provides an assessment framework for identifying issues and measuring quality, along with potential solutions and guidance to improve the usability of services.

The assessment framework for measuring QoE and the associated recommendations for improving service quality are intended to benefit human end-users who need to rapidly assimilate and use web mapping visualizations to answer questions or input into analysis. In other words, they need to be able to make sense of the information an OGC WMS provides them.

In addition, Testbed-13 addressed QoS aspects in the aviation domain. Though specific to a particular domain, the OGC Testbed-13: Data Quality Specification Engineering Report [http://docs.opengeospatial.org/per/17-018.html] nevertheless addressed a number of general aspects that apply to this task.

1.3. Requirements & Research Motivation

Testbed-14 has addressed the following WMS usability aspects:

1. Develop a revision of OGC Discussion Paper 17-049: Review and assess the current OGC Discussion Paper 17-049 and develop a revision of it. This revision shall be performed in coordination with the OGC QoSE DWG to address potential new requirements, such as QoS metadata (Capabilities extensions), server implementations and monitoring client implementations that allow improved service quality and measurement. The revision is part of this ER.

2. Quality of Service Experience Assessment Framework: The fourteen assessment criteria described in Discussion Paper 17-049 and summarized in Figure 1 are all aimed at assessing the quality of a web service in terms of the degree to which it conveys clearly understood information to the user. The user is assumed to be a non-expert in geospatial web services, but in most cases the criteria are equally valid for all classes of users. A selected number of services in the Testbed will be assessed against the QoSE assessment criteria, and retested once recommendations have been applied. Comparison of the results will help to validate the effectiveness of the assessment criteria and the corresponding recommendations to improve usability, and allow for feedback, improvement or correction. Services are provided as Testbed-14 deliverable D115, and a dedicated client as D116.

WMS QoSE criteria chart
Figure 1. The fourteen assessment criteria for WMS QoSE

3. Quality of Service Experience Practices to Alleviate Usability Issues: The fourteen assessment criteria all have corresponding recommendations that describe practices which, once implemented, should help to alleviate user confusion and improve the usability of WMS as a means of visualization and of simple visual or query-based analysis of geospatial data. Once the selected services have been assessed against the assessment criteria, all (or as many as possible) of the QoSE recommendations will be applied, and the services will then be retested.

4. Quality of Service Experience Implementing Best Practices: Man or Machine?: This item analyzes how a geospatial WMS is put together and the opportunities for a human operator or a programmatic response to make decisions that impact the service's quality of experience. It assesses which usability issues are determined or caused by human input, and which are determined or caused by default via the programmatically generated aspects of the service (i.e. not due to direct human input or decisions) through the implementation of the standard specification. Results are captured in this ER.

5. Quality of Service Experience Test Suite: Develop a test suite to programmatically test and validate that best practices have been successfully implemented. The test suite will automatically assess the quality of a service according to the assessment criteria and validate services, or flag those that do not comply with the best practices/recommendations. The results of the test suite (D117) are also captured in this ER; a minimal sketch of this kind of automated check is shown below.
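
To make the idea of a programmatic QoSE check concrete, the following Python sketch tests a small, easily automated subset of the criteria (service title present, layer titles not merely repeating machine-readable names, abstracts present) against a WMS 1.3.0 capabilities document. It illustrates the approach only and is not the actual D117 test suite:

    # Illustrative QoSE check against a WMS 1.3.0 capabilities document.
    # Only a small, easily automated subset of the fourteen criteria is covered.
    import requests
    import xml.etree.ElementTree as ET

    WMS_NS = {"wms": "http://www.opengis.net/wms"}  # WMS 1.3.0 namespace

    def check_wms(url: str) -> list:
        resp = requests.get(url, params={"service": "WMS",
                                         "request": "GetCapabilities",
                                         "version": "1.3.0"})
        resp.raise_for_status()
        root = ET.fromstring(resp.content)
        issues = []
        title = root.findtext("wms:Service/wms:Title", default="", namespaces=WMS_NS)
        if not title.strip():
            issues.append("Service title is missing or empty")
        for layer in root.iterfind(".//wms:Layer", WMS_NS):
            name = layer.findtext("wms:Name", default="", namespaces=WMS_NS)
            ltitle = layer.findtext("wms:Title", default="", namespaces=WMS_NS)
            if name and ltitle.strip().lower() == name.strip().lower():
                issues.append(f"Layer '{name}': title merely repeats the machine-readable name")
            if name and layer.find("wms:Abstract", WMS_NS) is None:
                issues.append(f"Layer '{name}': no abstract provided")
        return issues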

Figure 2 illustrates the QoSE work items as reflected by the OGC Testbed-14 thread.

workitem chart
Figure 2. Quality of Service & Experience (QoSE) work items

Note: The purpose of D122 is to provide a client that meets the requirements for QoSE and Portrayal, and that measures statistics such as response time, number of layers, number of features, error codes generated, etc. within the client. Since subjective measurement criteria, focusing for example on legibility, intuitiveness or consistency of symbology, may be more difficult to automate, participants will gather QoSE statistics manually for those criteria or elements, and then propose recommendations for an automated service for those assessment criteria. A sketch of the objective, client-side measurements follows.
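
The objective statistics mentioned in the note are straightforward to gather programmatically. The sketch below is illustrative only (the endpoint is a placeholder, and the actual D122 implementation may differ); it times a GetCapabilities request, records its status code and counts the named layers:

    # Illustrative client-side statistics gathering (placeholder endpoint).
    import time
    import requests
    import xml.etree.ElementTree as ET

    WMS_NS = {"wms": "http://www.opengis.net/wms"}

    def gather_stats(endpoint: str) -> dict:
        t0 = time.perf_counter()
        caps = requests.get(endpoint, params={"service": "WMS",
                                              "request": "GetCapabilities",
                                              "version": "1.3.0"})
        elapsed = time.perf_counter() - t0
        layers = ET.fromstring(caps.content).iterfind(".//wms:Layer", WMS_NS)
        named = [lyr.findtext("wms:Name", namespaces=WMS_NS) for lyr in layers]
        return {
            "capabilities_status": caps.status_code,   # error code, if any
            "capabilities_time_s": round(elapsed, 3),  # response time
            "layer_count": sum(1 for n in named if n), # number of named layers
        }

    print(gather_stats("https://example.org/wms"))  # hypothetical endpoint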

1.4. What does this ER mean for the QoSE DWG and OGC in general

The OGC QoSE DWG provides a forum for discussing issues related to QoS and QoE of spatial data services and applications relying on these services for delivering timely and accurate spatial information to the end-users. Key business goals of the Working Group include the following:

  • Share implementation experience and ideas in evaluating and improving the QoS and QoE of spatial data services.

  • Collect best practices, and create and promote guidance on evaluating and self-assessing the QoE of spatial data services, as well as practical means for improving the user experience of these services.

  • Identify gaps in the existing standards and guidance related to the QoS and QoE of spatial data services and, as appropriate, suggest new standardization activities within OGC to fill those gaps.

1.4.1. QoE specific problem statements from the DWG Charter

  • Metadata, such as titles and keywords, is not always written in language that is clear and understandable to end users. Further, metadata does not always describe the provided services and datasets in the necessary detail. This makes it more difficult for users to take full advantage of the provided service and its datasets. There should be better guidance and a checklist to assist data and service providers in recording human-readable metadata.

  • The ancillary information, such as legends for WMS, is not always clear and human-readable. Missing or ambiguous legend information may easily lead to misinterpretation of the presented data. There should be better guidance for data providers on specifying good and readable legends.

  • There is a lack of methods and best practices for evaluating and improving the user experience and human interaction in processes involving discovery, initial evaluation of the fitness-for-purpose of spatial data services, as well as access to change records during a service's lifetime.

  • There is a need for more accurate understanding of how spatial data services are used and perceived in the larger web service end-user community.

  • Evaluating, comparing and improving the QoE of spatial data services is difficult without commonly agreed and well-defined metrics for measuring the QoE.

1.5. Prior-After Comparison

The task described in this ER addressed the QoE assessment criteria laid out in the DWG Discussion paper referred to earlier and also implemented a QoS performance assessment. Specifically, an interactive user-friendly Graphical User Interface (GUI) was developed based on the QoE assessment criteria defined by the DWG. This GUI also lists performance statistics from a WMS server.

1.6. Recommendations for Future Work

A goal of this task and its analysis was also to suggest potential future activities in which these results could be investigated further through new tasks in future testbeds.

As per the OGC Policies and Procedures, Discussion Papers are not versioned documents. The option discussed in the QoSE DWG has been that a new document (perhaps a Best Practice) would be produced based on the original Discussion Paper and the results of Testbed-14.

There should be a discussion on how QoS/QoE will fit into the next generation of OGC service interface specifications (e.g. WFS 3.0, CAT 4.0), given the clean break and the move to REST/JSON/etc.

Quality of Experience (QoE) Items
  • Extend QoSE evaluation work to test other geospatial services

Although the current QoSE indicators are specifically designed for WMS, some of them could be used directly to measure the quality of other OGC Web Services (OWS). For example, nearly all OWS types include a title or similar elements, so the title meaningfulness and related criteria could be mapped to those elements. In addition, the feature-related QoSE indicators could be used to evaluate the quality of WFS instances, and most of the indicators could be used directly to measure WMTS instances.

  • Improve current QoSE indicators

The fifth QoSE criterion is named Feature Attribution, but its description indicates that it is used to measure the number and relevance of the attributes provided for each feature. In WMS, however, attribution is used to identify the source of the geospatial information. As a result, it might be worth considering renaming Feature Attribution to Feature Attributes.

The fourteen (14) QoSE criteria aim only at assessing the quality of a web service in terms of the degree to which it conveys clearly understood information to the user. However, QoSE means identifying the usability issues associated with the use of OGC web services in general. Hence, additional indicators beyond information understandability should be developed; for example, service performance also affects the user experience.

In the QoSE discussion paper, the user is assumed to be a non-expert. However, some indicators are difficult for non-experts to apply. For example, the evaluation of feature attributes requires some expert knowledge, since it measures the number and relevance of the attributes provided for each feature.

Not every item in the QoSE recommendations has a matching item in the QoSE criteria; examples include fees and access constraints, attribution, bounding box and service interoperability. This mismatch needs to be addressed.

D117 recommendations chart
Figure 3. Recommendations for future items related to D117
Quality of Service (QoS) Items
  • Investigate further the concept of a status page to report information about the availability and status of OGC service endpoints;

  • Investigate further, together with the QoSE DWG, the QoS extensions to GetCapabilities operations for WMS and WFS in order to streamline them and propose them as profiles for the respective OGC services;

  • Better link the "E" (Experience) of QoSE with QoS, since bad performance and/or availability means a bad experience for the end user. OGC should try to draw a link between checking the famous "14 rules", which are geared more towards naming and metadata, and checking for decent performance and decent stability when assigning scores to layers. Ideally, the score should become dynamic at this stage, since performance and availability should be monitored over time (an illustrative weighting is sketched after this list).

  • Advance performance testing of WMS services by executing load tests through containerized architectures: Load testing is crucial for WMS instances that have to serve many users in real time. Load testing involves simulating similar actions, where computer-generated virtual users mimic real users (see the sketch following this list). In most cases, this kind of load cannot be generated by a single server due to limited hardware resources. Additionally, physically distributed servers are a must if user behavior from different regions of the world is desired. For these large-scale load tests, containerized cloud services offer a good alternative. Cloud services that make use of containerized architectures, such as Kubernetes, rkt, etc., offer an alternative to standalone machines running full virtual machine instances for applications, and are well suited to scaling the simulated clients;

  • Extend the performance tests to other geospatial services: A variety of open geospatial standards provide online access to geospatial data, for example the Web Coverage Service (WCS), Web Map Service (WMS), Web Feature Service (WFS) and Web Map Tile Service (WMTS). OGC Testbed-14 focused on performance tests for WMS, but WCS and WFS services are more sensitive to data resolution. For WFS, due to its verbose nature, transferring large amounts of data might be problematic and lead to higher latencies and lower performance. Extending the performance tests to other geospatial services offering data from a sponsor would provide a broader view of how well those services perform.
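
On the dynamic score mentioned above, the following sketch shows one purely illustrative way to discount the fourteen-criteria QoE score by monitored availability and speed; the weighting, the two-second latency target and the function name are assumptions of this example, not an OGC-endorsed formula:

    # Purely illustrative: degrade a static QoE score when monitoring shows
    # poor availability or slow responses. Weights are example assumptions.
    def dynamic_score(qoe_score: float, availability_pct: float,
                      p95_latency_s: float, target_latency_s: float = 2.0) -> float:
        """qoe_score is the 0-14 result of the fourteen-criteria assessment."""
        availability_factor = availability_pct / 100.0
        speed_factor = min(1.0, target_latency_s / max(p95_latency_s, 1e-6))
        return qoe_score * availability_factor * speed_factor

    # A layer scoring 12/14 on metadata, 90% available and answering in 4 s at
    # the 95th percentile would publish 12 * 0.9 * 0.5 = 5.4.
    print(dynamic_score(12, 90.0, 4.0))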
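
For the load testing item, the following Python sketch simulates concurrent virtual users issuing GetMap requests and reports the mean latency. The endpoint, layer name and user counts are placeholders; the Testbed itself used JMeter for this purpose (see Section 5):

    # Minimal load-test sketch: 50 virtual users, 20 GetMap requests each.
    # The endpoint and layer name are placeholders.
    import time
    from concurrent.futures import ThreadPoolExecutor
    import requests

    GETMAP = ("https://example.org/wms?service=WMS&version=1.3.0&request=GetMap"
              "&layers=test&styles=&crs=EPSG:4326&bbox=-90,-180,90,180"
              "&width=256&height=256&format=image/png")

    def virtual_user(n_requests: int) -> list:
        """One simulated user issuing GetMap requests in a loop."""
        latencies = []
        for _ in range(n_requests):
            t0 = time.perf_counter()
            requests.get(GETMAP, timeout=30)
            latencies.append(time.perf_counter() - t0)
        return latencies

    with ThreadPoolExecutor(max_workers=50) as pool:
        per_user = list(pool.map(virtual_user, [20] * 50))
    latencies = [t for user in per_user for t in user]
    print(f"{len(latencies)} requests, mean {sum(latencies) / len(latencies):.3f} s")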

1.7. Document contributor contact points

All questions regarding this document should be directed to the editor or the contributors:

Contacts

Name                  Organization

Guy Schumann          Remote Sensing Solutions Inc.

Albert Kettner        Consultant for Remote Sensing Solutions Inc./INSTAAR, DFO, CU Boulder

Simone Giannecchini   GeoSolutions SAS

Zelong Yang           School of Geographical Sciences and Urban Planning, Arizona State University

Keith Pomakis         CubeWerx Inc.

1.8. Foreword

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights.

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.

2. References

The following normative documents are referenced in this document:

  • OGC: OGC 06-121r9, OGC Web Services Common Standard, 2010

3. Terms and definitions

For the purposes of this report, the definitions specified in Clause 4 of the OWS Common Implementation Standard OGC 06-121r9 shall apply. In addition, the following terms and definitions apply.

  • Quality

    degree to which a set of inherent characteristics fulfills requirements [SOURCE: ISO 9000:2005, 3.1.1, modified - Original Notes have been removed.]
  • Usability

    The degree to which something is able or fit to be used

3.1. Abbreviated terms

NOTE: The abbreviated terms clause gives a list of the abbreviated terms and the symbols necessary for understanding this document.
  • DWG Domain Working Group

  • ER Engineering Report

  • FGP Federal Geospatial Platform

  • GUI Graphical User Interface

  • OGC Open Geospatial Consortium

  • OWS OGC Web Services

  • QoE Quality of Experience

  • QoS Quality of Service

  • QoSE Quality of Service and Experience

  • WCS Web Coverage Service

  • WFS Web Feature Service

  • WMS Web Map Service

  • WMTS Web Map Tile Service

4. Overview

This ER describes all the different components that were implemented to advance WMS QoSE. Section 5 describes the following components in detail:

  • GUI for WMS Service Quality Assessment

  • Test Suite for WMS Service Quality Assessment

  • WMS Stress Testing with JMeter

  • GeoServer Extension for QoS

  • OGC Web Service Landing Pages

  • TIE and Scenario for Demonstration

5. Components and Component Scenario

5.1. Component Overview Chart

Figure 4 illustrates the connections and interoperability between the various component parts discussed in this ER.

components
Figure 4. Component Overview Chart

5.2. Quality of Service & Experience (QoSE)

The discussion paper entitled "Ensuring Quality of User Experience with OGC Web Mapping Services" (OGC 17-049) provides an assessment framework for identifying issues and measuring quality; it can be downloaded at https://portal.opengeospatial.org/files/17-049. It outlines fourteen criteria for measuring QoSE, and recommendations are given for each of them; these are shown as rectangular components in Figure 5.

qose indicators metadata elements relationship
Figure 5. Relationships among QoSE Indicators and WMS Metadata Elements

5.3. Client with QoSE Support

D116 requires the implementation of a client dedicated to testing the assessment criteria against WMS instances. Therefore, a Graphical User Interface (GUI) was designed; it is shown in Figure 6. It consists of three basic components: a WMS layer selector, a metadata information viewer and a QoSE assessment panel. Users can select a layer of interest from the WMS layer selector. Once a layer is selected, its metadata is loaded and shown in the metadata information viewer, located at the bottom left. Finally, users can give their evaluation for each QoSE indicator based on the layer's metadata contents. At the same time, users can also leave feedback about the QoSE framework or the client using the feedback panel.