Publication Date: 2020-05-04

Approval Date: 2020-04-29

Submission Date: 2020-03-01

Reference number of this document: OGC 20-011

Reference URL for this document: http://www.opengis.net/doc/PER/scira-pilot

Category: OGC Public Engineering Report

Editor: Sara Saeedi

Title: OGC SCIRA Pilot Engineering Report


OGC Public Engineering Report

COPYRIGHT

Copyright © 2020 Open Geospatial Consortium. To obtain additional rights of use, visit http://www.opengeospatial.org/

WARNING

This document is not an OGC Standard. This document is an OGC Public Engineering Report created as a deliverable in an OGC Interoperability Initiative and is not an official position of the OGC membership. It is distributed for review and comment. It is subject to change without notice and may not be referred to as an OGC Standard. Further, any OGC Public Engineering Report should not be referenced as required or mandatory technology in procurements. However, the discussions in this document could very well lead to the definition of an OGC Standard.

LICENSE AGREEMENT

Permission is hereby granted by the Open Geospatial Consortium, ("Licensor"), free of charge and subject to the terms set forth below, to any person obtaining a copy of this Intellectual Property and any associated documentation, to deal in the Intellectual Property without restriction (except as set forth below), including without limitation the rights to implement, use, copy, modify, merge, publish, distribute, and/or sublicense copies of the Intellectual Property, and to permit persons to whom the Intellectual Property is furnished to do so, provided that all copyright notices on the intellectual property are retained intact and that each person to whom the Intellectual Property is furnished agrees to the terms of this Agreement.

If you modify the Intellectual Property, all copies of the modified Intellectual Property must include, in addition to the above copyright notice, a notice that the Intellectual Property includes modifications that have not been approved or adopted by LICENSOR.

THIS LICENSE IS A COPYRIGHT LICENSE ONLY, AND DOES NOT CONVEY ANY RIGHTS UNDER ANY PATENTS THAT MAY BE IN FORCE ANYWHERE IN THE WORLD. THE INTELLECTUAL PROPERTY IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS FOR A PARTICULAR PURPOSE, AND NONINFRINGEMENT OF THIRD PARTY RIGHTS. THE COPYRIGHT HOLDER OR HOLDERS INCLUDED IN THIS NOTICE DO NOT WARRANT THAT THE FUNCTIONS CONTAINED IN THE INTELLECTUAL PROPERTY WILL MEET YOUR REQUIREMENTS OR THAT THE OPERATION OF THE INTELLECTUAL PROPERTY WILL BE UNINTERRUPTED OR ERROR FREE. ANY USE OF THE INTELLECTUAL PROPERTY SHALL BE MADE ENTIRELY AT THE USER’S OWN RISK. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR ANY CONTRIBUTOR OF INTELLECTUAL PROPERTY RIGHTS TO THE INTELLECTUAL PROPERTY BE LIABLE FOR ANY CLAIM, OR ANY DIRECT, SPECIAL, INDIRECT OR CONSEQUENTIAL DAMAGES, OR ANY DAMAGES WHATSOEVER RESULTING FROM ANY ALLEGED INFRINGEMENT OR ANY LOSS OF USE, DATA OR PROFITS, WHETHER IN AN ACTION OF CONTRACT, NEGLIGENCE OR UNDER ANY OTHER LEGAL THEORY, ARISING OUT OF OR IN CONNECTION WITH THE IMPLEMENTATION, USE, COMMERCIALIZATION OR PERFORMANCE OF THIS INTELLECTUAL PROPERTY.

This license is effective until terminated. You may terminate it at any time by destroying the Intellectual Property together with all copies in any form. The license will also terminate if you fail to comply with any term or condition of this Agreement. Except as provided in the following sentence, no such termination of this license shall require the termination of any third party end-user sublicense to the Intellectual Property which is in force as of the date of notice of such termination. In addition, should the Intellectual Property, or the operation of the Intellectual Property, infringe, or in LICENSOR’s sole opinion be likely to infringe, any patent, copyright, trademark or other right of a third party, you agree that LICENSOR, in its sole discretion, may terminate this license without any compensation or liability to you, your licensees or any other party. You agree upon termination of any kind to destroy or cause to be destroyed the Intellectual Property together with all copies in any form, whether held by you or by any third party.

Except as contained in this notice, the name of LICENSOR or of any other holder of a copyright in all or part of the Intellectual Property shall not be used in advertising or otherwise to promote the sale, use or other dealings in this Intellectual Property without prior written authorization of LICENSOR or such copyright holder. LICENSOR is and shall at all times be the sole entity that may authorize you or any third party to use certification marks, trademarks or other special designations to indicate compliance with any LICENSOR standards or specifications.

This Agreement is governed by the laws of the Commonwealth of Massachusetts. The application to this Agreement of the United Nations Convention on Contracts for the International Sale of Goods is hereby expressly excluded. In the event any provision of this Agreement shall be deemed unenforceable, void or invalid, such provision shall be modified so as to make it valid and enforceable, and as so modified the entire Agreement shall remain in full force and effect. No decision, action or inaction by LICENSOR shall be construed to be a waiver of any rights or remedies available to it.

None of the Intellectual Property or underlying information or technology may be downloaded or otherwise exported or reexported in violation of U.S. export laws and regulations. In addition, you are responsible for complying with any local laws in your jurisdiction which may impact your right to import, export or use the Intellectual Property, and you represent that you have complied with any regulations or registration procedures required by applicable law to make this license enforceable.


1. Subject

This engineering report (ER) captures Smart City Interoperability Reference Architecture (SCIRA) Pilot implementation outcomes and findings to demonstrate the risk mitigation and safety capability of the SCIRA interoperable and standard-based architecture. SCIRA Pilot is an OGC (Open Geospatial Consortium) Innovation Program project sponsored by the US Department of Homeland Security (DHS) Science & Technology (S&T) in collaboration with the city of St. Louis, Missouri. The purpose of this project is to advance standards for smart and safe cities and develop open, interoperable design patterns for incorporating the Internet of Things (IoT) sensors into city services.

2. Executive Summary

A lack of consensus on common terminology and smart city architectural principles can result in divergent, proprietary, and even contradictory specifications. Such specifications often fail to provide the interoperability and scalability of the underlying IoT and Cyber-Physical Systems (CPS) technologies needed to form a suitable foundation for many smart city applications.

SCIRA provides free deployment guides, a reusable design toolkit, and other resources to plan, acquire, and implement standards-based, cost-effective, vendor-agnostic, and future-proof smart city Information Technology (IT) systems and networks using technologies such as IoT, Sensor webs, and Geospatial frameworks. The objective of this Pilot is to research, design, and test the SCIRA as a reusable design toolkit that shows how to integrate proprietary IoT sensors into a common framework for public safety applications at the community level.

This Pilot focused on the City of St. Louis, Missouri to refine elements of interoperable smart city architecture through implementation and testing in functional, ‘real world’ public safety scenarios. The SCIRA Pilot concluded with on-site exercises and then a demonstration intended to establish a reusable example of SCIRA-based deployment. The operational application of the SCIRA design toolkit showed how cities can reap the benefits of standards-based interoperability.

2.1. Document contributor contact points

All questions regarding this document should be directed to the editor or the contributors:

Table 1. Contacts
Name Organization Role

Sara Saeedi, PhD

University of Calgary

Editor

Josh Lieberman, PhD

OGC

Contributor

Steve Liang, PhD

University of Calgary

Contributor

Brian Hawkins

Coolfire

Contributor

Charles Chen

Skymantics, LLC

Contributor

Ignacio Correas

Skymantics

Contributor

Igor Starkov

Ecodomus

Contributor

Jason MacDonald

Compusult

Contributor

Marcus Alzona

keys / keys net LLC

Contributor

Mike Botts, PhD

Botts Innovative Research Inc.

Contributor

Mahnoush Mohammadi Jahromi

University of Calgary

Contributor

Sepehr Honarparvar

University of Calgary

Contributor

Sushmapriya Maddala

CYIENT Ltd.

Contributor

2.2. Foreword

Attention is drawn to the possibility that some of the elements of this document may be the subject of patent rights. The Open Geospatial Consortium shall not be held responsible for identifying any or all such patent rights.

Recipients of this document are requested to submit, with their comments, notification of any relevant patent claims or other intellectual property rights of which they may be aware that might be infringed by any implementation of the standard set forth in this document, and to provide supporting documentation.

3. References

4. Terms and Definitions

For the purposes of this report, the definitions specified in Clause 4 of the OWS Common Implementation Standard OGC 06-121r9 shall apply. In addition, the following terms and definitions apply.

  • 3DTiles

    An OGC Community Standard for streaming large heterogeneous 3D geospatial datasets.
  • Application programming interface

    A standard set of documented and supported functions and procedures that expose the capabilities or data of an operating system, application or service to other applications (adapted from ISO/IEC TR 13066-2:2016)
  • Deep Learning

    A subset of data-driven Machine Learning algorithms trained on large quantities of data.
  • Feature

    Abstraction of real-world phenomena (source: ISO 19101-1:2014)
  • GeoJSON

    "GeoJSON is a geospatial data interchange format based on JavaScript Object Notation (JSON). It defines several types of JSON objects and the manner in which they are combined to represent data about geographic features, their properties, and their spatial extents. GeoJSON uses a geographic coordinate reference system, World Geodetic System 1984, and units of decimal degrees." (The GeoJSON Format, https://tools.ietf.org/html/rfc7946)
  • GeoPackage

    An open, non-proprietary, platform-independent and standards-based data format for geographic information implemented as a SQLite database container.
  • glTF

    (GL Transmission Format) is a royalty-free specification developed and maintained by Khronos for the efficient transmission and loading of 3D scenes and models by applications.
  • MQTT

    Message Queuing Telemetry Transport is an open, lightweight network protocol that transports messages between devices.
  • REST

    The Representational State Transfer (REST) style is an abstraction of the architectural elements within a distributed hypermedia system.
    It encompasses the fundamental constraints upon components, connectors, and data that define the basis of the Web architecture, and thus the essence of its behavior as a network-based application.
    An API that conforms to the REST architectural principles/constraints is called a RESTful API.
  • Sensor

    An entity capable of observing a phenomenon and returning an observed value. Type of observation procedure that provides the estimated value of an observed property at its output. [OGC 12-000]
  • SensorThings

    OGC Standard providing an open and unified framework to interconnect IoT sensing devices, data, and applications over the Web.
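Several of the formats defined above, notably GeoJSON and its WGS84 decimal-degree coordinates, can be illustrated with a short example. The snippet below is a sketch only; the feature name and property values are invented:

```python
import json

# A minimal GeoJSON Feature per RFC 7946: coordinates are WGS84
# decimal degrees, ordered longitude first, then latitude.
feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [-90.1994, 38.6270],  # hypothetical St. Louis sensor site
    },
    "properties": {"name": "stream-gauge-01", "waterLevel_m": 2.4},
}

print(json.dumps(feature, indent=2))
```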

4.1. Abbreviated Terms

  • 3D 3 Dimensional

  • AI Artificial Intelligence

  • API Application Programming Interface

  • APP Applications

  • AVL Automated Vehicle Location

  • AWS Amazon Web Services

  • BLE Bluetooth Low Energy

  • BYOD Bring Your Own Device

  • CCTV Closed Circuit Television

  • CFP Call for Participation

  • CityGML City Geography Markup Language

  • COP Common Operating Picture

  • CPS Cyber-Physical Systems

  • CPU Central Processing Unit

  • CR Change Request

  • CSI Camera Serial Interface

  • CSW Catalogue Services for the Web

  • CUDA Compute Unified Device Architecture

  • CUDNN CUDA Deep Neural Network library

  • DEM Digital Elevation Model

  • DHS Department of Homeland Security

  • DL Deep Learning

  • DTM Digital Terrain Model

  • DWG Domain Working Group

  • EC2 Elastic Compute Cloud

  • EMS Emergency Medical Service

  • ER Engineering Report

  • FCU Feng Chia University

  • FLIR Forward Looking Infrared

  • FME Feature Manipulation Engine

  • FP False Positive

  • fps frames per second

  • GeoJSON Geospatial JavaScript Object Notation

  • glTF GL Transmission Format

  • GPS Global Positioning System

  • GPU Graphics Processing Unit

  • HD High Definition

  • HTTP HyperText Transfer Protocol

  • I3S Indexed 3D Scene layers

  • ID/Id Identification

  • IoT Internet of Things

  • IoU Intersection over Union

  • ISO International Organization for Standardization

  • IT Information Technology

  • JSON JavaScript Object Notation

  • LTE Long Term Evolution

  • mAP Mean Average Precision

  • MIPI-CSI Mobile Industry Processor Interface Camera Serial Interface

  • MicroSD Micro Secure Digital

  • ML Machine Learning

  • MO Missouri

  • MQTT Message Queuing Telemetry Transport

  • MSEL Master Scenario Events List

  • NN Neural Network

  • NGFR Next Generation First Responder

  • NYS New York State

  • OGC Open Geospatial Consortium

  • ORM OGC Reference Model

  • OSH OpenSensorHub

  • OSM OpenStreetMap

  • OWS OGC Web Services

  • REST Representational state transfer

  • RFID Radio-Frequency IDentification

  • S&T Science & Technology

  • SCIRA Smart City Interoperability Reference Architecture

  • SMS Short Message Service

  • SOS Sensor Observation Service

  • STA SensorThings API

  • SWE Sensor Web Enablement

  • SWG Standards Working Group

  • TIE Technology Integration Experiment

  • TP True Positive

  • UML Unified Modeling Language

  • URL Uniform Resource Locator

  • US United States

  • USB Universal Serial Bus

  • USGS United States Geological Survey

  • UWB Ultra-WideBand

  • WES Web Enterprise Suite

  • WFS Web Feature Service

  • WFS-T Web Feature Service - Transactional

  • WG Working Group (SWG or DWG)

  • WGS84 World Geodetic System 1984

  • WiFi Wireless Fidelity

  • WPS Web Processing Service

  • watchOS Apple Watch Operating System

5. Overview

The SCIRA Pilot includes various stakeholders, applications (App), data clouds and operational field sensors. A schematic overview of the SCIRA Pilot is shown in the following figure (Figure 1):

Figure 1. SCIRA Pilot Systems Overview

As shown in the above figure, the SCIRA Pilot systems are roughly divided into four segments:

  • Stakeholders: Those who set up, oversee, provide data to, and/or make use of applications provided by the prototype systems;

  • App Zone: Applications, whether desktop, mobile, or browser-based, that are provisioned by system data and services;

  • Data Cloud: Storage and processing services that maintain an array of data "pools" accessible by the "Cloud Hub" ensemble of OGC Web services;

  • Field Zone: Mobile, in-situ, and incident nodes that form a data and communications constellation reporting sensor measurements from various sensors and sensor platforms up towards the Data Cloud and provisioning local applications for first responders, commanders, and other field personnel.

The overall prototype system has been divided into six individual systems (listed in Table 2) that, while sharing some components, deliver distinct sets of functionality. The second column lists the corresponding deliverables from the SCIRA Pilot Call for Participation.

Table 2. SCIRA Pilot Systems
System Deliverables

SmartHub System

  • D1: First Responder Tracking Sensor

  • D2: First Responder Health Sensor

  • D3: First Responder Environment Sensor

  • D4: First Responder SmartHub

  • D5: First Responder Incident SensorHub

  • D6: Cloud-based SensorHub and Catalog

  • Related Component:

    • D13: Command and Communication System

Command Communication System

  • D5: First Responder Incident SensorHub

  • D12: 3D Dashboard App

  • D13: Command and Communication System

  • D17: Community Mobility Navigation, Alert and Sensing App

  • Related Components:

    • D6: Cloud-Based SensorHub and Catalog

    • D16: Legacy Public Safety Information System Adapter

    • D20: 3D City Model

Indoor Navigation System

  • D8: First Responder Indoor Navigation App

  • D18: RFID (Radio-Frequency IDentification)/BLE (Bluetooth Low Energy) Indoor Navigation Beacons

  • Related Component:

    • D19: 3D Interior Building Model

Street Navigation System

  • D15: Traffic Management and Routing Model

  • D17-Road: Community Mobility Navigation, Alert and Sensing App

  • Related Components:

    • D16: Legacy Public Safety Information System Adapter

    • D20: 3D City Model

Flood Sensing System

  • D11/D16: Legacy Stream Sensors

  • D14: Flood and Inundation Model

  • D17-Flood: Community Mobility Navigation and Alert and Sensing App

  • Related Component:

    • D20: 3D City Model

Road Sensing System

  • D10: Smart Traffic/Road Sensor

  • Related Components:

    • D11: Smart Weather Sensor

    • D12: 3D Dashboard

This ER is organized as follows:

  • Section 6 (Solutions Architecture) describes the use cases and scenarios, as well as the components developed and implemented for the SCIRA Pilot.

  • Section 7 (SmartHub) describes SmartHub systems - supporting, accessing, and connecting wearable sensors with personal applications, as well as incident scene and Emergency Operations Center information resources.

  • Section 8 (Command Communication) presents Command Communication systems for a Common Operating Picture (COP), as well as tasking between first responders, incident commanders, city personnel, and city leaders.

  • Section 9 (Indoor Navigation) details an Indoor Navigation system for enabling first responders to navigate and be tracked inside a building in the absence of GPS (Global Positioning System) reception.

  • Section 10 (Street Navigation) shows solutions developed for advanced Street Navigation.

  • Section 11 (Flood Sensing) discusses both predicted and actual flood inundation regions and street blockages.

  • Section 12 (Road Sensing) reviews road condition and traffic awareness for traffic monitoring and as input to response or evacuation routing.

  • Section 13 (Findings) briefly discusses the key issues experienced during the SCIRA Pilot implementation. This section also provides a summary of the main findings and recommendations on preferred strategies and change requests.

6. Solutions Architecture

The Solutions Architecture describes the use cases, scenarios, and interfaces for the components developed and implemented for the SCIRA Pilot. This section provides a reference architecture to guide future implementations of Smart Cities and demonstrate the interoperability achievable using OGC standards. The St. Louis Pilot solution architecture, while specific to the St. Louis, MO municipality, involves many stakeholders and human interactions, with redundant service implementations providing sufficient confidence in the standards applied.

6.1. Scenarios

The five scenarios for the SCIRA Pilot are listed in the following table (Table 3):

Table 3. SCIRA Pilot Scenarios
Scenario Description

Scenario 1: River Monitoring

The City needs to monitor the river, anticipate the additional threat of flooding, and continue to take preparatory action while also dealing with potential responses related to the pending storm. The focus is on the River Des Peres and the Riverview Dr. & Spring Gardens area.

Scenario 2: Flash Flooding

Flash flooding has led to a high amount of standing water on streets in and around downtown. These streets are no longer passable and need to be blocked off. The City needs to detect where flash flooding is occurring, the severity, and respond accordingly (i.e. block off flooded streets).

Scenario 3: Assistance to Vulnerable Populations

The flash flooding has created a life and safety risk for vulnerable populations in the areas where the flash flooding is occurring. The health department needs to detect and verify the existence of vulnerable individuals and conduct outreach to get them to safety.

Scenario 4: Building Fire

Unknown to anyone, trenching in the area of T-Rex has damaged the basement and created a crack in the basement wall. Flash flooding and the flooding of trenches has led to flooding in the basement at T-Rex. The flooding in the mechanical room causes a short in the electrical system and starts a fire. Fire, Police, and EMS need to combat the fire and ensure safe and total evacuation of the building.

Scenario 5: Vehicle Accident

A speeding vehicle hits a patch of standing water, hydroplanes and collides with a fire hydrant and as a result knocks over the hydrant. Police, Water, Streets, and EMS need to respond to the incident, render assistance, and deal with the damaged fire hydrant and clear the accident.

The SCIRA Pilot follows a scenario flow as shown in Figure 2.

Figure 2. SCIRA Pilot Scenario Flow

The solution architecture is derived from a Master Scenario Events List (MSEL), which is a compilation of timelines and locations for all expected exercise events and actions that describe a scenario. The MSEL contains the following items for each scenario:

  • Chronological listing that supplements exercise scenario

  • Actor

  • Location

  • Event Description

    • Real or Simulated Event

    • Push or Pull data

    • Originating Device

    • Destination Device

    • Hardware

  • Capability Set

  • Required Training

  • Test Result

  • Notes
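The MSEL items above can be captured in a simple record type. The sketch below uses field names chosen here for illustration, not taken from the Pilot's actual MSEL spreadsheet:

```python
from dataclasses import dataclass

@dataclass
class MselEvent:
    """One row of a Master Scenario Events List (MSEL), mirroring the
    items enumerated above. Field names are illustrative, not a schema
    defined by the Pilot."""
    sequence: int              # chronological listing supplementing the scenario
    actor: str
    location: str
    description: str
    simulated: bool = True     # real or simulated event
    push: bool = True          # push or pull data
    originating_device: str = ""
    destination_device: str = ""
    hardware: str = ""
    capability_set: str = ""
    required_training: str = ""
    test_result: str = ""
    notes: str = ""

evt = MselEvent(1, "Water Division", "River Des Peres",
                "Stream gauge reports rising water level")
print(evt.actor, evt.location)
```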

6.2. Use Case Diagrams

Each MSEL scenario is used to derive a Use Case diagram in Unified Modeling Language (UML). The following UML diagrams (Figure 3 to Figure 7) describe the interactions between the stakeholders and the activities performed using the architecture.

Figure 3. Scenario 1: River Monitoring
Figure 4. Scenario 2: Flash Flooding
Figure 5. Scenario 3: Assist Vulnerable Population
Figure 6. Scenario 4: Building Fire
Figure 7. Scenario 5: Vehicle Accident

6.3. Class Diagrams

A Class Diagram is derived from each of the scenarios, as shown below (Figure 8, Figure 9, Figure 10, Figure 11, and Figure 12).

Figure 8. Scenario 1 Class Diagram
Figure 9. Scenario 2 Class Diagram
Figure 10. Scenario 3 Class Diagram
Figure 11. Scenario 4 Class Diagram
Figure 12. Scenario 5 Class Diagram

6.4. Reference Architecture

The final reference architecture is a generic view of a singular platform of services to perform a set of communications between applications for use in the Smart Cities architecture. For the scope of this Pilot, the reference architecture depicts the minimum subset of components and interfaces needed to fulfill the needs of the St. Louis SCIRA Pilot, which particularly addresses the emergency management activities of the city (Figure 13). These components can serve as a reference for the implementation of future emergency management components in a Smart City infrastructure for the appropriate use cases described herein.

Note
Future work could include expanding use cases to include Smart City communications for transportation services, utilities, and other aspects of Smart City architecture.
Figure 13. Smart Cities Generic Solution Architecture

7. SmartHub System

SmartHub systems center around supporting, accessing, and connecting body-worn sensors with personal applications, as well as incident scene and Emergency Operations Center information resources. Further background on SmartHub systems is found in the Next Generation First Responder (NGFR) Integration Handbook. The wiring of a SmartHub system is shown in the figure below (Figure 14):

Figure 14. SmartHub System Overview

The OGC® Sensor Web Enablement (SWE) and OGC® SensorThings API (Application Programming Interface) standards were both used in the SCIRA Pilot in support of the SensorHub. These standards provide an interoperable approach in support of virtually any sensor system. In essence, the suite of OGC® SWE and OGC® SensorThings API (STA) standards provide standard means for the discovery of sensors, tasking sensors and actuators, requesting robust descriptions of sensors and sensor observations, and for accessing real-time or archived observations and messages. Those standards can support sensor systems in a Cloud environment, within a field hub, or as part of the sensor itself or on the sensor platform ("on-the-edge").
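The discovery and access operations described above can be illustrated with the kinds of read requests an STA client might issue. The sketch below assumes a hypothetical hub URL and an invented Datastream id; the resource paths and the $expand/$orderby/$top query options are defined by the SensorThings API standard:

```python
# Sketch of typical SensorThings API (STA) v1.0 read requests.
# The host name and the Datastream id (42) are hypothetical.
BASE = "https://example-hub.city.gov/sta/v1.0"

# 1) Discover sensing devices (Things) together with their Datastreams.
things_url = f"{BASE}/Things?$expand=Datastreams"

# 2) Latest observation of one Datastream: newest first, single result.
latest_url = (f"{BASE}/Datastreams(42)/Observations"
              "?$orderby=phenomenonTime%20desc&$top=1")

print(things_url)
print(latest_url)
```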

OGC® SWE and OGC® STA standards can support the insertion of sensors and sensor observations through transactional mechanisms, thereby allowing small apps on the edge to push observations to sensor hubs residing in the Cloud or on a field node. Some implementations of the OGC® SWE and OGC® STA standards enable the complete deployment of a SensorHub within any environment. One such implementation is OpenSensorHub (OSH), an open source software platform supporting the OGC® SWE and OGC® SensorThings API standards. As depicted in the figure below (Figure 15), OSH can be fully deployed at all levels of the sensor environment, from the sensor or sensor platform itself, to Android phones and tablets, to local workstations, and to the Cloud. This means that all functionality of the OSH platform, from storage and processing to real-time streaming of observations and tasking of sensors, can be supported on any of these devices.

Figure 15. OpenSensorHub (OSH) Cross-Platform Functionality

Within the DHS domain, SensorHub deployments are often divided into three main components:

1) First Responder SmartHub: the personal mobile sensor hub, sometimes referred to as a SmartHub;

2) First Responder Incident SensorHub: the incident field hub, supported perhaps by a laptop or desktop within a local network;

3) Cloud-based SensorHub & Catalog: the enterprise-level hub deployed in the Cloud.

Often there are restrictions in the network bandwidth within these three domains, such that Cloud deployments support the widest breadth of connectivity and available resources. The incident field hubs support a local network shared with local devices (sensors, databases, etc.) but may have limited connectivity to Cloud resources. Finally, the personal SmartHub may be body-worn, support sensors that are personal to the wearer (e.g., location, heartbeat, body-cam), and may be limited to connectivity with a local team or squad. In some cases, the amount of data exchanged in real-time between these three domains may be limited, and the hubs may be configured to send only the most relevant data up to the Cloud or incident hub. Other times, connectivity may not be a challenge at all, allowing for complete exchange of observations between any parties who desire it.
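The bandwidth-aware exchange described above can be sketched as a simple thinning filter that a field hub might apply before forwarding observations upward over a constrained link. The observation field names ("time", "alert") and the interval are illustrative assumptions, not part of the Pilot's design:

```python
def thin_observations(observations, min_interval_s=60):
    """Keep the first observation, any observation flagged as an alert,
    and otherwise at most one observation per min_interval_s seconds -
    a sketch of relevance filtering on a constrained uplink."""
    kept, last_t = [], None
    for obs in sorted(observations, key=lambda o: o["time"]):
        if obs.get("alert") or last_t is None or obs["time"] - last_t >= min_interval_s:
            kept.append(obs)
            last_t = obs["time"]
    return kept

sample = [
    {"time": 0, "value": 2.1},
    {"time": 30, "value": 2.2},                 # dropped: too soon, no alert
    {"time": 40, "value": 9.9, "alert": True},  # kept: alert always passes
    {"time": 65, "value": 2.3},                 # dropped: within 60 s of the alert
]
print([o["time"] for o in thin_observations(sample)])  # [0, 40]
```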

7.1. D1: First Responder Tracking Sensor

A basic but important capability for monitoring First Responders and other assets is being able to access and display their locations in real-time. A SensorHub treats location as simply another observation, derived from GPS, beacon, or laser rangefinder sensors, or other positioning devices. First Responder tracking involves tracking the vehicle carrying the First Responders or tracking the location and movement of the First Responders themselves. In either case, tracking locations outdoors typically involves GPS sensors. When a First Responder moves inside a building, GPS often fails or becomes highly inaccurate.

An ideal device for reporting First Responder location is a smartphone (Figure 16 and Figure 17). The smartphone not only provides GPS-based location but also provides geospatial orientation using a combination of a geomagnetic compass and accelerometer. In addition, the smartphone provides a ready means to push locations to a remote server through its LTE (Long-Term Evolution) or WiFi (Wireless Fidelity) connection. Several First Responder Tracking approaches were used for this exercise, including pushing locations to the CoolFire Ronin platform, as well as pushing locations from an Apple Watch and an OSH SmartHub to the Compusult SensorHub in the Cloud.

Figure 16. Police Car Tracking using GPS with Android Smartphone Running OSH
Figure 17. Tracking Vehicles and Personnel with Android Smartphone Running OSH

The SCIRA Pilot incorporated three sensor systems for tracking the location of First Responders:

  • Botts Innovative Research Indoor Navigation: Botts Innovative Research's solution focused on indoor navigation, utilizing indoor beacons detected by SONIM phones issued to the First Responders to derive and communicate location. Please see the Indoor Navigation section for more details;

  • keys App (Apple Watch): keys' solution reported position data utilizing the GPS information provided by Apple Watches worn by the First Responders - a mixture of both watches issued to the First Responders, as well as watches already owned by First Responders (representing a Bring-Your-Own-Device (BYOD) deployment model). Please see the First Responder Health Sensor section for more details;

  • Coolfire Mobile Response: Coolfire’s solution reported position data utilizing the GPS information provided by SONIM phones issued to the First Responders. See the Command & Communication section for details.

7.2. D2: First Responder Health Sensor

In Public Safety incident scenarios, knowing the status of your most important asset - the First Responder - is vital for implementing an effective, efficient, and safe response. In addition to location, awareness of the current health of First Responders while performing their duties could save lives, helping identify a First Responder that is in distress, that has fallen, or even that has had a heart attack.

7.2.1. keys App (Apple Watch)

keys' solution reported heart-rate data and health incident alerts utilizing the heart-rate information provided by Apple Watches (Figure 18). These watches, worn by the First Responders in the SCIRA Pilot exercise, were a mixture of watches issued to the First Responders and watches already owned by them (representing a BYOD deployment model).

Figure 18. keys App on watchOS (Apple Watch Operating System) with a High Heart Rate Alert

The best camera, sensor, or platform is the one you always have with you. Using the keys App, First Responders and Incident Commanders can leverage the devices they already carry to provide location, heart rate, and other sensor data to their current operational systems, as well as receive incident alerts (visual/audio/vibration), selected sensor data, and an incident/team map, all from their smartwatch via FirstNet-compatible phones, LTE, or Incident Area Networks.

keys App Platform - Consumer Hardware & Frameworks

The keys App is a software-based solution that utilizes mass-market consumer wearable computing and biometric platforms (such as the Apple Watch) to both send and receive location, sensor data, and alert notifications.

keys App Deployment

The keys App allows for both organization-provided and user-provided (BYOD) implementations.

keys App Communication

The keys App leverages the network capabilities of its hardware platform (Apple Watch) to both send and receive sensor data and messaging/alerts (Figure 19). This includes BLE (connecting via its paired iPhone), WiFi (Incident Area Network, Vehicle or First Responder Hotspot, etc.), LTE (Apple Watches with LTE service), and FirstNet (Apple Watch with service or via paired iPhone). The communication channel is chosen automatically based on both network availability and power consumption cost (BLE is much less "expensive" than WiFi, which in turn is less expensive than LTE).
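The "cheapest available channel" selection described above can be sketched as follows. The relative cost ordering (BLE < WiFi < LTE) comes from the text; the numeric weights and the availability list are illustrative assumptions:

```python
# Relative power-consumption costs; the ordering BLE < WiFi < LTE is
# from the description above, the weights themselves are invented.
CHANNEL_COST = {"BLE": 1, "WiFi": 3, "LTE": 10}

def pick_channel(available):
    """Return the lowest-power-cost channel among those currently available."""
    usable = [c for c in available if c in CHANNEL_COST]
    if not usable:
        raise RuntimeError("no usable communication channel")
    return min(usable, key=CHANNEL_COST.get)

print(pick_channel(["WiFi", "LTE"]))  # WiFi: cheaper than LTE when both are up
```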