1. INTRODUCTION

1.1 Problem Statement

Broadband coverage is becoming, and for some already is, the gold standard for emergency responder technology. Broadband connectivity allows more complex and useful information to be conveyed, so that emergency responders can carry out their daily duties more effectively. The problem is how to provide this technology everywhere and at all times. The solution is a deployable system: in simple terms, a portable box that can provide broadband services to everyone on scene.

USE CASE 1: WILDFIRES

Let’s start with a scenario: picture yourself as a firefighter in a National Forest trying to contain a burn area of several hundred acres. You are in the woods with no access to a live map of your location, no voice or text communications from your cell phone to others in the area, and no access to services like video feeds covering the fire. A deployable system can provide all of these services and more, depending on the needs of each agency.

USE CASE 2: NATURAL DISASTERS

This is not limited to spot cases like wildfires, either. Imagine being a law enforcement officer on the Gulf Coast after a hurricane has knocked down multiple cell towers in the area. Your only link to others is your trusty old push-to-talk radio, but that device has limitations: it cannot tell you where your fellow officers or medical units are in the area, nor can it carry video or other data-intensive information. It is still a trustworthy and valuable tool, but we can provide more to our emergency responders.

Solutions to both cases, and many more, exist today in the form of deployable systems, but they face many challenges, both in policy and in technology, before public safety can use them. One roadblock public safety faces is the lack of operational understanding of what these systems can do. This contest asks participants to build a solution that helps emergency responders understand what these systems can do for them in terms of coverage and service usability.

1.2 Objectives

The objective of this contest is for participants to create prototype network diagnostic tools that help emergency responders understand what coverage a broadband deployable system can provide. Participants can leverage their knowledge of computer systems, RF, user interface design, common sense, and other relevant skills to accomplish this.

The end goal is an app that tells emergency responders:

  • What’s the expected coverage area?
  • What services can I expect at specific locations?
  • What’s the real, measured coverage?
  • What’s the real service quality at specific locations?

1.3 Resources

  1. Participants will have at their disposal a few example datasets from a real deployment of a deployable system. These datasets will serve as a reference to help tune participant solutions.
  2. A range of what the deployable system characteristics could be, for example a range of the possible output powers and antenna types, will be provided to participants.
  3. Participants will receive an estimated translation table for coverage quality to usability for text, voice, and video.
  4. Participants can also use open-source RF propagation software such as SPLAT! and Qrap, as well as the simple open-source propagation models linked below; a minimal sketch of one such model follows the links. Participants are not required to use these in their submission.

https://www.qsl.net/kd2bd/splat.html

http://www.qrap.org.za/

https://www.itu.int/dms_pubrec/itu-r/rec/p/R-REC-P.529-3-199910-W!!PDF-E.pdf

https://www.its.bldrdoc.gov/resources/radio-propagation-software/itm/itm.aspx

http://qradiopredict.sourceforge.net/
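
To make the resource concrete, below is a minimal sketch of the kind of simple propagation model these links describe, using a log-distance path-loss formula in Python. The path-loss exponent and the example transmit power and frequency are illustrative assumptions, not contest-supplied values; a real submission would calibrate such a model against the example datasets.

```python
import math

def predicted_rsrp_dbm(tx_power_dbm, freq_mhz, distance_m, path_loss_exp=3.0):
    """Rough RSRP estimate from a log-distance path-loss model.

    Free-space loss is computed at a 1 m reference distance, then extended
    with an environment-dependent exponent (2 = free space, 3-4 = urban).
    This simplification treats the full transmit power as reference-signal
    power; a fuller model would account for the per-resource-element split.
    """
    distance_m = max(distance_m, 1.0)
    # Free-space path loss in dB: 20*log10(d_km) + 20*log10(f_MHz) + 32.44
    fspl_1m_db = 20 * math.log10(0.001) + 20 * math.log10(freq_mhz) + 32.44
    path_loss_db = fspl_1m_db + 10 * path_loss_exp * math.log10(distance_m)
    return tx_power_dbm - path_loss_db

# Example: a 30 dBm deployable transmitting at 763 MHz (LTE Band 14
# downlink), evaluated 500 m away; prints roughly -81 dBm.
print(predicted_rsrp_dbm(30.0, 763.0, 500.0))
```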

2. EVALUATION CRITERIA

Participants must adhere to the basic application requirements listed below. Failure to do so may result in the application not being graded.

Criteria #0: Basic Requirements

Rating: Pass/Fail

  • The solution must provide an expected coverage area for a deployed LTE system. Specifically, it must provide a heat map, overlaid on a real map, that reflects LTE Reference Signal Received Power (RSRP) values.
  • This map must be interactive, meaning the deployable system’s characteristics can be changed and the outputs are automatically updated.
  • The solution must provide the expected services at any location covered by the deployable system. Specifically, participants are required to provide some indication of the usability of text, voice, and video services; how this information is conveyed is up to the participant (a minimal sketch of one approach appears after this list).
  • The solution must update the expected coverage area with measurements from user equipment in the field. This provides emergency responders with the “current real coverage” in the areas where measurements are taken. Specifically, it must take in GPS coordinate/RSRP pairs and update its coverage map accordingly.
  • The solution must update the expected services at locations with measurements from user equipment. This provides emergency responders with the “current real services” that are available in the areas where measurements are taken.
  • The solution must identify dead spots, or areas without any connectivity: anywhere a phone cannot connect to the network, or anywhere the predicted RSRP is below -140 dBm.
  • The solution cannot rely on any cloud services and must run locally at all times. This is to simulate a real deployment.
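
The service-usability requirement above can be prototyped as a simple threshold lookup against the translation table described in section 1.3. The sketch below is a minimal illustration: the -140 dBm dead-spot cutoff comes from the requirements above, but every other threshold is a hypothetical placeholder for the contest-provided table.

```python
# Hypothetical RSRP-to-service thresholds; substitute the contest-provided
# translation table. Ordered from strongest signal to weakest.
SERVICE_THRESHOLDS_DBM = [
    (-100.0, {"text": True, "voice": True, "video": True}),
    (-110.0, {"text": True, "voice": True, "video": False}),
    (-120.0, {"text": True, "voice": False, "video": False}),
]

def usable_services(rsrp_dbm):
    """Map a predicted or measured RSRP value to usable services.

    Anything at or below -140 dBm counts as a dead spot per the basic
    requirements; the other thresholds here are illustrative only.
    """
    if rsrp_dbm <= -140.0:
        return {"text": False, "voice": False, "video": False, "dead_spot": True}
    for threshold, services in SERVICE_THRESHOLDS_DBM:
        if rsrp_dbm >= threshold:
            return {**services, "dead_spot": False}
    return {"text": False, "voice": False, "video": False, "dead_spot": False}
```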

Criteria #1: Expected Coverage and Services

Rating: 20/100

  • Expected coverage (15/100)
    • This will be objectively scored based on a CSV file that the participant will upload near the end of the competition. We will compare your predicted RSRP values, for a specified deployment scenario, with what we measured from a real deployable system.
  • Expected services (5/100)

This will be objectively scored based on the same submitted CSV file. We will check whether you implemented the RSRP-to-service translation: even if your RSRP predictions are not consistent with our data, did you at least include a service translation?

Criteria #2: Current Coverage and Services

Rating: 20/100

  • Current coverage (20/100)

This will be objectively scored based on another CSV file submitted at the end of the competition. We will provide you with a partially filled-in CSV file containing only some data points from the same deployment scenario. From there, your solution should be able to improve on its initial prediction; think of this as incorporating an interpolation function (see the sketch below). You will then submit a completely filled-out CSV file that we will grade in the same way as before. This will take place an hour before the competition ends.
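
As one illustration of such an interpolation step, the sketch below blends field measurements into a model prediction using inverse-distance weighting. The weighting exponent and influence radius are illustrative choices, not contest requirements.

```python
import math

def ground_distance_m(a, b):
    """Approximate distance in meters between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
    y = lat2 - lat1
    return math.hypot(x, y) * 6371000.0  # mean Earth radius in meters

def corrected_rsrp(point, predicted_dbm, measurements, power=2.0, radius_m=2000.0):
    """Pull a model prediction toward nearby field measurements.

    `measurements` is a list of ((lat, lon), rsrp_dbm) pairs reported by
    user equipment. Measurements outside `radius_m` are ignored, so the
    pure model prediction survives wherever nothing has been measured.
    """
    num = den = 0.0
    for loc, rsrp in measurements:
        d = ground_distance_m(point, loc)
        if d < 1.0:
            return rsrp  # standing on a measurement: trust it outright
        if d <= radius_m:
            w = 1.0 / d ** power
            num += w * rsrp
            den += w
    return num / den if den else predicted_dbm
```

Applied over every cell of the coverage grid, this yields the updated heat map; the same RSRP-to-service translation then refreshes the expected services at each point.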

Criteria #3: Wild Card and Features

Rating: 20/100 

Participants may include added functionality that enhances the value of the solution in the context of the use cases provided. These points will be awarded by our judges at their discretion. They will be looking for creative, unique ideas that make your app stand out from the rest. Think about what an emergency responder might need to know, or what another developer could do with your solution. You could build an API framework into your solution, or add features like Wi-Fi coverage prediction and measurement. Consider some of the other information a smartphone can provide, like throughput and latency between points, and see if you can use it in a clever way. You could also develop an entire architecture around features that you may not have time to implement.

Your submission will include a short Word document (300 words at most) that describes the added features you think qualify for this category. You can reference screenshots, submitted alongside it, to further explain what each feature is. You will submit this document and the screenshots at the close of the competition.

Criteria #4: User Interface/User Experience

Rating: 40/100

This score will be based on subjective judgments made by a panel of public safety and technical experts evaluating the User Interface and User Experience (UI/UX) of the solution. Not all submissions will make it to this section, as only qualifying submissions from the previous two scoring sections will be graded. In essence, participants are pushed to make a solution that conveys information in an easy-to-understand format without any substantial user training. For additional information on UI/UX, see [1]. Below is a breakdown of what would be considered a good UI/UX:

A good UI/UX conveys information in a usable manner after training. (0-10 points, awarded by the judges)

A great UI/UX conveys information in an easy way after minimal training. (10-25 points, awarded by the judges)

An excellent UI/UX conveys information in an easy way without any training. (25-40 points, awarded by the judges)

[1] https://iso25000.com/index.php/en/iso-25000-standards/iso-25010?limit=3&limitstart=0

3. EXPECTED DELIVERABLES FROM PARTICIPANTS

3.1 CONTEST SPECIFIC DELIVERABLES 

Review the How to Participate instructions in section 3 of this document to ensure that you are prepared for the in-person Regional Codeathon or Online Contest. The following deliverables will need to be included with the submission:

Part 1

Output:

  • Specific to this contest, participants will be provided an example deployment configuration and a set of GPS coordinates. Participants will run their solution with this configuration information and fill out the CSV file with the RSRP and service usability at the GPS points provided. This output represents the expected coverage and service usability from the participant’s solution.

Process:

  • Four hours before the end of the contest, a CSV file will be made available to all participants at techtoprotectchallenge.org; it will contain specific deployable characteristics and GPS coordinates around the deployable system. You will submit an example screenshot of your code and the same CSV file, filled out with the RSRP and service expectation at each point (a minimal sketch of the filling mechanics appears below). You will only have 2 hours to complete this task, so make sure your solution is working by the time the CSV file is released to you.
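
As a sketch of the mechanics only: the helper below reads the released coordinates, runs a caller-supplied prediction function, and writes the filled-out copy. The column names `lat`, `lon`, `rsrp_dbm`, and `services` are hypothetical placeholders; use whatever headers the actual file contains when it is released.

```python
import csv

def fill_submission(in_path, out_path, predict_rsrp, translate_services):
    """Fill a contest CSV: add predicted RSRP and service usability per point.

    `predict_rsrp(lat, lon)` and `translate_services(rsrp)` come from the
    participant's solution; `translate_services` is assumed to return a
    short usability string. Column names here are assumed, not official.
    """
    with open(in_path, newline="") as fin, open(out_path, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames + ["rsrp_dbm", "services"])
        writer.writeheader()
        for row in reader:
            rsrp = predict_rsrp(float(row["lat"]), float(row["lon"]))
            row["rsrp_dbm"] = f"{rsrp:.1f}"
            row["services"] = translate_services(rsrp)
            writer.writerow(row)
```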

Part 2

Output:

  • To verify that your solution takes in real measurements, we will repeat the procedure above, but with new points, some of which come with real measured values. Given those real points, your solution should now produce a better coverage map. This output represents the current real coverage and service usability from your solution.

Process:

  • Approximately 2 hours before the end of the competition, we will provide another CSV file at techtoprotectchallenge.org. This file will contain the same deployable characteristics as before, but with completely different GPS points; some of those points will include real measured RSRP values. You are to input this new CSV file, with its real measurement points, into your solution and fill in the rest of the blank points. You will only have 2 hours to complete this task and upload your solution’s completed CSV output, so make sure your solution is working by the time this CSV file is released to you.

3.2 ADDITIONAL DELIVERABLES  

  • A completed submission form through techtoprotectchallenge.org
  • A 3-minute narrated PowerPoint file or 3-minute narrated video with the following sections:
    1. A summary of the submission.
    2. Basic information about the participant or participant’s team.
    3. Specific requirements of the contest that are being addressed by the participants.
    4. Two to four screenshots of the solution or prototype.
    5. Overview of key functions included in the submission.
  • Any available files or links to the developed solution or prototype.
  • Any additional files based on the contest description.