AGENCY:
Federal Communications Commission.
ACTION:
Final rule.
SUMMARY:
In this document, the Wireless Telecommunications Bureau (WTB), the Office of Economics and Analytics (OEA), and the Office of Engineering and Technology (OET) (collectively, the Bureau and Offices) adopt technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act. The Bureau and Offices adopt the proposed processes and methodology set forth in the Broadband Data Collection (BDC) Mobile Technical Requirements Proposed Rules for collecting challenge process data and for determining when the threshold to create a cognizable challenge has been met. Additionally, the Bureau and Offices adopt detailed processes for mobile providers to respond to challenges, for the Federal Communications Commission (Commission or FCC) to initiate a verification request to a service provider, and for providers to respond to verification requests to confirm broadband coverage in areas they claim have service. The Bureau and Offices adopt the parameters and metrics that must be collected both for on-the-ground test data to support challenge submissions, rebuttals to cognizable challenges, and responses to verification requests, and for infrastructure information to support challenge rebuttals and responses to verification requests. Government entities and third parties are required to submit verified broadband data using the same data specifications required of mobile service providers. Finally, the Bureau and Offices find the Commission's speed test app to be a reliable and efficient method for entities to use in submitting crowdsourced mobile coverage data to the Commission and describe the methodology staff will use in determining when a “critical mass” of crowdsourced filings suggests that a provider has submitted inaccurate or incomplete data. The measures adopted in this document to implement the mobile challenge, verification, and crowdsourcing processes will enable the Commission, Congress, other Federal and state policy makers, Tribal entities, consumers, and other third parties to verify and supplement the data collected by the Commission on the status of mobile broadband availability throughout the United States.
DATES:
Effective May 11, 2022.
FOR FURTHER INFORMATION CONTACT:
William Holloway at William.Holloway@fcc.gov, Competition & Infrastructure Policy Division, (WTB), (202) 418-2334, Jonathan McCormack at Jonathan.McCormack@fcc.gov (OEA), (202) 418-1065, or Martin Doczkat at Martin.Doczkat@fcc.gov (OET), (202) 418-2435.
SUPPLEMENTARY INFORMATION:
This is a summary of the Commission's Order, DA 22-241, in WC Docket No. 19-195, adopted and released on March 9, 2022. The full text of this document, including the technical appendix, is available for public inspection and can be downloaded at https://www.fcc.gov/document/fcc-releases-bdc-mobile-technical-requirements-order.
People With Disabilities. To request materials in accessible formats for people with disabilities (braille, large print, electronic files, audio format), send an email to fcc504@fcc.gov or call the Consumer & Governmental Affairs Bureau at 202-418-0530 (voice), 202-418-0432 (TTY).
Paperwork Reduction Act. This document does not contain new or modified information collection(s) subject to the Paperwork Reduction Act of 1995 (PRA), Public Law 104-13, as the requirements adopted in this document are statutorily exempted from the requirements of the PRA. As a result, the Order will not be submitted to the Office of Management and Budget (OMB) for review under Section 3507(d) of the PRA.
Congressional Review Act. The Commission has determined, and the Administrator of the Office of Information and Regulatory Affairs, Office of Management and Budget, concurs, that these rules are “non-major” under the Congressional Review Act, 5 U.S.C. 804(2). The Commission will send a copy of the Order to Congress and the Government Accountability Office pursuant to 5 U.S.C. 801(a)(1)(A).
Synopsis
1. In this document, the Bureau and Offices adopt the technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act as part of the FCC's ongoing BDC effort to improve the Commission's broadband availability data.
I. Discussion
A. Mobile Service Challenge Process
2. In this document, the Bureau and Offices adopt the proposals for the mobile challenge process set forth in the BDC Mobile Technical Requirements Proposed Rules (86 FR 40398, July 28, 2021), with certain modifications described below.
3. The Broadband DATA Act requires that the Commission “establish a user-friendly challenge process through which consumers, [s]tate, local, and Tribal governmental entities, and other entities or individuals may submit coverage data to the Commission to challenge the accuracy of—(i) the coverage maps; (ii) any information submitted by a provider regarding the availability of broadband internet access service; or (iii) the information included in the [Broadband Serviceable Location] Fabric.” The general requirements and framework for the mobile challenge process predate the BDC Mobile Technical Requirements Proposed Rules and were set forth in either the Broadband DATA Act or prior Commission orders. We note that, to the extent commenters ask the Bureau and Offices to eliminate, modify, or otherwise revisit particular requirements established in either the Broadband DATA Act or prior Commission-level orders, we lack the legal authority to do so. In the Third Further Notice of Proposed Rulemaking (Third Further NPRM) (85 FR 50911, Aug. 18, 2020), the Commission proposed a challenge process that “encourages participation to maximize the accuracy of the maps, while also accounting for the variable nature of wireless service.” In the Third Order (86 FR 18124, Apr. 7, 2021), the Commission adopted its proposals from the Second Order (85 FR 50886, Aug. 18, 2020) and Third Further NPRM, and established a framework for consumers, state, local, and Tribal governments, and other entities to submit data to challenge the mobile broadband coverage maps.
4. The Commission determined that it should enable stakeholders to challenge mobile coverage data based on both a lack of service and poor service quality (such as slow delivered user speeds). Challenges must be based upon on-the-ground speed test data taken outdoors ( i.e., from an in-vehicle mobile or outdoor stationary environment). The Commission adopted a requirement that consumers use a speed test application (either developed by the FCC or a third-party app approved by OET for use in the challenge process) that automatically collects information and metrics associated with each speed test and allows for submission of information directly to the Commission from a mobile device. Consumers will be required to submit certain identifying information to deter frivolous filings. Government and other third-party entity challengers (including competing mobile service providers) may use their own software or hardware to collect data for the challenge process so long as the data contain metrics that are substantially the same as those collected by approved speed test applications. Moreover, government and other entity challengers are required to conduct on-the-ground tests using a device advertised by the challenged provider as compatible with its network.
5. The Commission adopted a requirement for providers to either submit a rebuttal to the challenge or concede the challenge within 60 days of being notified of the challenge. Rebuttals must consist of either on-the-ground test data or infrastructure data. A challenge respondent may also submit supplemental data in support of its rebuttal, either voluntarily or in response to a request for additional information from OEA. The Commission directed OEA to develop a methodology and mechanism to determine if the data submitted by a provider constitute a successful rebuttal to all or some of the challenged service area and to establish procedures to notify challengers and providers of the results of a challenge. Further, the Commission adopted a requirement that providers that concede or lose a challenge file new coverage data within 30 days depicting the challenged area that has been shown to lack service.
6. The requirements that we adopt in this document will enable the Commission to collect sufficient measurements to ensure that the challenge process is statistically valid while remaining “user-friendly.” In particular, we establish a methodology for determining a threshold number of mobile speed tests and the geographic boundaries within a specified area. Based on this methodology, a challenge is created by associating the locations of validated speed tests within geographical hexagons defined by the accessible, open-source H3 geospatial indexing system and analyzing those speed tests. We also adopt the parameters and metrics that speed tests must meet to be validated and used to meet the challenge thresholds. Importantly, as the Commission specified in the Third Order, the challenge process will remain user-friendly because all of the information a consumer needs to create a challenge will be collected and submitted by the FCC Speed Test app and any third-party mobile speed test apps approved by OET. Governmental and other entity challengers may use these apps or their own software or hardware to collect data for the challenge process. Additionally, we implement the Commission's decision to aggregate speed tests to resolve challenges “in an efficient manner, mitigate the time and expense involved, and ensure that the mobile coverage maps are as reliable and useful as possible,” by adopting our proposal to combine speed tests conducted by consumers, governmental agencies, and other entities to determine whether the thresholds for a cognizable challenge have been met. These requirements strike the appropriate balance between ensuring that consumers, state, local, and Tribal governments, and other entities can participate in the challenge process, on the one hand, and protecting providers from being burdened by having to respond to challenges that do not meet the cognizable challenge standard, on the other hand.
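To illustrate the hexagon-association step described above, the following Python sketch groups validated speed tests into resolution-8 cells of the open-source H3 geospatial indexing system. It assumes the v4 API of the open-source `h3` Python bindings; the test record fields are illustrative and are not the BDC data specification.

```python
from collections import defaultdict

import h3  # open-source H3 geospatial indexing bindings (v4 API assumed)


def group_tests_by_hexagon(tests, resolution=8):
    """Group validated speed tests by the H3 cell containing each test.

    Each test is a dict with illustrative 'lat' and 'lon' keys.
    """
    cells = defaultdict(list)
    for test in tests:
        cell = h3.latlng_to_cell(test["lat"], test["lon"], resolution)
        cells[cell].append(test)
    return cells


# Two nearby tests typically land in the same resolution-8 hexagon.
tests = [
    {"lat": 38.8899, "lon": -77.0091, "download_mbps": 4.2},
    {"lat": 38.8901, "lon": -77.0093, "download_mbps": 6.8},
]
for cell, cell_tests in group_tests_by_hexagon(tests).items():
    print(cell, len(cell_tests))
```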
1. Creating a Challenge/Cognizable Challenges
7. On-the-Ground Speed Test Data Parameters and Metrics. Challenges must be supported by on-the-ground test data. We have therefore established the required testing parameters and data metrics for speed test submissions. At the outset, we will require the FCC Speed Test app and approved third-party apps to collect the name and email address of the end user and mobile phone number of the device on which the speed test was conducted, to the extent technically feasible. As discussed in further detail below, Apple iOS devices will not automatically transmit the mobile phone number associated with the device that runs a speed test. We will therefore require testers submitting tests for use in the challenge process to manually submit, through the speed test app, the phone number associated with the device on which the speed test was conducted. The Commission's rules state that consumer challengers must include “name and contact information ( e.g., address, phone number, and/or email address) in their data submissions.” We amend these rules to require that app users also submit their email address so that the Commission can notify testers of the status of their speed test(s) and any resulting challenge(s), and we also amend the rules to require app users to submit the mobile phone number of the device on which the speed test was conducted so that we may, if necessary, share this information with mobile broadband providers for use when responding to challenges. We anticipate we will only share the phone number of the device on which the speed test was conducted with mobile broadband providers in situations where a challenged provider is unable to identify a subscriber by using the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server, which we will also require to be included in challenge data submitted by the app, as discussed below. We will not collect the address of an end user for use in the mobile challenge process at this time in order to minimize the amount of personally identifiable information we require from end users, and because a mobile user's physical address is not currently helpful either to the Bureau and Offices when considering challenges or to providers when responding to challenges. In addition to the testing metrics adopted by the Commission in the Third Order, we adopt the testing parameters and updated metrics for challenge speed test data proposed in the BDC Mobile Technical Requirements Proposed Rules, with the modifications described below. With the exception of different considerations pertaining to the submission of speed test data taken on iOS devices and the submission of IP address, source port, and timestamp measured by an app developer's servers by government entities and service providers in some scenarios, these parameters and metrics will apply across all testing mechanisms, not only in the challenge process but also for on-the-ground data submitted in response to verification inquiries.
The information we will use in the challenge process that can be collected from Android devices, but not iOS devices, includes the signal strength, signal quality, unique identifier, and other radiofrequency (RF) metrics of each serving cell, as well as the spectrum bands used for the test and other network characteristics ( e.g., whether the device was roaming, as well as the identity of the provider for the connected network). As discussed in greater detail below, we will allow government and other third-party entities to alternatively submit the International Mobile Equipment Identity (IMEI) of the device used to conduct a speed test for use in the challenge process rather than provide the source IP address, source port, and timestamp measured by an app developer's servers. We will also not require a service provider to submit either the device IMEI or the combination of source IP address, source port, and timestamp measured by an app developer's servers when submitting speed tests either in response to a challenge or in response to a verification inquiry. Individual consumer challengers must collect on-the-ground speed test data using mobile devices running either a Commission-developed app ( e.g., the FCC Speed Test app) or another speed test app approved by OET for the submission of challenges. The Bureau and Offices will announce the process and procedures for third-party app providers to seek approval for a speed test app to be used in submitting data for use in the challenge process. Third-party and governmental entities may, as specified in the Third Order, collect data using either one of these speed test apps or their own software and hardware that collects broadband availability data, consistent with the parameters and metrics set forth herein. We include “hardware” to capture professional tools, such as laptops, hard drives, or other hardware devices, used to collect on-the-ground data. The Third Order provided that government and other entity challengers submit a complete description of the methodologies used to collect the data. The Bureau and Offices will issue a public notice announcing the process and procedures for such parties to submit the necessary documentation.
8. In the Third Order, the Commission required consumer challengers to use a speed test app approved by OET for use in the challenge process and provided the metrics that approved apps must collect for each speed test. The Commission directed OET, in consultation with OEA and WTB, to update the FCC Speed Test app as necessary or develop a new speed test app to collect the designated metrics, so that challengers may use it in the challenge process. For government and third-party entity challengers, the Commission did not require the use of a Commission-approved speed test app but instead set forth the information that all submitted government and third-party challenger speed test data must contain and directed OEA, WTB, and OET to adopt additional testing requirements if they determine it is necessary to do so. Our BDC Mobile Technical Requirements Proposed Rules proposed certain testing parameters and metrics to standardize the on-the-ground test data submitted in the challenge process and to assure more reliable challenges; a number of parties agree that such consistency among the apps used for challenges and rebuttals is important. This set of standardized parameters and metrics will also ensure that we can make a meaningful comparison of tests run by different entities using different methods ( e.g., tests run on a speed test app versus a government's own hardware and software), and will enable us to easily combine and evaluate speed test data used in the challenge process. Accordingly, we will require that such data meet the following testing parameters set forth in the BDC Mobile Technical Requirements Proposed Rules: (1) A minimum test length of 5 seconds and a maximum test length of 30 seconds; (2) test measurement results that have been averaged over the duration of the test ( i.e., total bits received divided by total test time); and (3) a restriction that tests must be conducted between the hours of 6:00 a.m. and 10:00 p.m. local time. To avoid requiring excessive data usage for tests on particularly fast networks ( e.g., 5G-NR (New Radio) using high-band spectrum), we will relax the minimum test duration requirement once a download or upload test measurement has transferred at least 1,000 megabytes of data. Specifically, when a speed test transfers at least 1,000 megabytes of data, we will validate the test if it has a duration value of greater than 0 seconds and less than or equal to 30 seconds. Otherwise, a speed test must have a duration value of greater than or equal to 5 seconds and less than or equal to 30 seconds to be valid.
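A minimal sketch of these validation parameters, in Python, may help make the rules concrete. It assumes decimal megabytes and illustrative field names; neither is drawn from the published data specification.

```python
from datetime import time


def duration_is_valid(duration_s: float, megabytes_transferred: float) -> bool:
    """Minimum/maximum test-length rule: 5-30 seconds ordinarily, relaxed to
    anything over 0 seconds once a test has transferred at least 1,000 MB."""
    if megabytes_transferred >= 1_000:
        return 0 < duration_s <= 30
    return 5 <= duration_s <= 30


def time_of_day_is_valid(local: time) -> bool:
    """Tests must be conducted between 6:00 a.m. and 10:00 p.m. local time."""
    return time(6, 0) <= local <= time(22, 0)


assert duration_is_valid(3.0, 1_200)        # fast-network exception applies
assert not duration_is_valid(3.0, 50)       # otherwise too short
assert duration_is_valid(12.0, 50)
assert time_of_day_is_valid(time(9, 30))
assert not time_of_day_is_valid(time(23, 15))
```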
9. We clarify that the minimum and maximum test length parameters will apply individually to download speed, upload speed, and round-trip latency measurements, and will not include ramp-up time. We disagree with the Competitive Carriers Association (CCA), Public Knowledge/New America, and Vermont Department of Public Service (Vermont DPS) that imposing a maximum test limit places an arbitrary or inferior limitation on testing. These timing requirements balance representative measurement over a stable Transmission Control Protocol (TCP) connection, on the one hand, against data usage considerations, on the other hand—especially for consumers who may have limited data plans. The FCC Speed Test app, for example, first initiates a test server selection process, which typically takes two seconds (and a maximum of 10 seconds if servers fail to respond), and then individually runs download and upload tests of a maximum of eight seconds each, including warm-up time, by establishing three concurrent TCP connections and summing the three resulting data rates for each test. In addition, the round-trip latency testing runs for a fixed five seconds to transmit up to 200 UDP (User Datagram Protocol) packets ( i.e., datagrams) and calculate the average latency of those datagrams. Hence, a typical test cycle takes approximately 23 seconds to complete, and at most 31 seconds.
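The cycle times cited above follow directly from the component durations; a quick arithmetic check (values taken from the paragraph above):

```python
# Component durations of an FCC Speed Test app cycle, per the description above.
server_selection_typical_s = 2    # up to 10 s if servers fail to respond
server_selection_max_s = 10
download_s = upload_s = 8         # maximum per throughput test, with warm-up
latency_s = 5                     # fixed round-trip latency test

typical_cycle = server_selection_typical_s + download_s + upload_s + latency_s
maximum_cycle = server_selection_max_s + download_s + upload_s + latency_s
print(typical_cycle, maximum_cycle)  # 23 31
```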
10. We also decline to adopt CCA's request to exempt continuous network monitoring from the maximum test length. Continuous network monitoring software can monitor active users' speeds at the cell sites and other network parameters over extended periods of time. We are not persuaded that deviating from the uniform 30-second per-test-component maximum testing standard to accommodate continuous network monitoring will yield equal or more accurate test results. We found in the Mobility Fund Phase II challenge process that continuous network monitoring speed tests recorded significant variability within the same area and across a short time span, in some cases recording strong network performance well exceeding the minimum requirement, interspersed with short, seconds-long drops in performance that may have been the result of normal network conditions ( e.g., sector handover or network scheduling). The overall performance in these areas indicated that coverage was adequate ( i.e., with the average of tests in the same area over 15-20 seconds exceeding the minimum requirement), but because the test results were so variable, we are concerned that allowing the reporting of continuous speed tests could produce inaccurate results that do not reflect the typical on-the-ground customer experience, which, as the results showed, may be adequate when averaged but may not deliver consistent speeds to consumers. To the extent challengers choose to use continuous network monitoring to record challenge data, results of the speed tests should report the average speeds over a uniform time period consistent with the minimum and maximum test lengths we adopt above ( i.e., a minimum of 5 seconds and a maximum of 30 seconds).
11. We share Ookla's concern that averaging the number of bits received over the entire duration of a throughput test may negatively affect the accuracy of any calculation, as that may not exclude an internet connection's known and expected “ramp-up time.” To account for this, we will apply the following formula: [(total bits received−ramp-up bits) divided by (total test time−ramp-up time)]. We consider “ramp-up bits” to be the bits received during the initial warm-up time. We find that this approach will sufficiently account for ramp-up time and fully satisfy Ookla's concern, especially in light of the clarification above that the test time limits apply individually to tests' upload and download measurements.
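Expressed as a small Python function (the function and field names are illustrative; the formula itself is the one adopted above):

```python
def ramp_adjusted_throughput_bps(total_bits: float, total_time_s: float,
                                 ramp_up_bits: float, ramp_up_time_s: float) -> float:
    """(total bits received - ramp-up bits) / (total test time - ramp-up time)."""
    return (total_bits - ramp_up_bits) / (total_time_s - ramp_up_time_s)


# A 10-second test moving 100 megabits, 5 of which arrived during a 1-second
# warm-up, measures roughly 10.56 Mbps rather than a naive 10 Mbps.
print(ramp_adjusted_throughput_bps(100e6, 10, 5e6, 1) / 1e6)
```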
12. We require on-the-ground speed test data to include a standardized set of metrics. Each on-the-ground speed test must include the following metrics that were previously adopted by the Commission, as modified by the updates proposed in the BDC Mobile Technical Requirements Proposed Rules: (1) The timestamp and duration of each test metric; (2) geographic coordinates ( i.e., latitude/longitude) measured at the start and end of each test metric with typical Global Positioning System (GPS) Standard Positioning Service accuracy or better, along with the location accuracy (“location accuracy” refers to a metric that GPS-enabled smartphones report describing the horizontal accuracy of the reported geographic coordinates); (3) the consumer-grade device type(s), brand/model, and operating system used for the test; (4) the name and identity of the service provider being tested; (5) location ( e.g., hostname or IP address) of the test server; (6) signal strength, signal quality, unique identifier, and other RF metrics of each serving cell, where available; (7) download speed; (8) upload speed; (9) round-trip latency; and (10) for an in-vehicle test, the speed the vehicle was traveling when the test was taken, where available. All on-the-ground speed tests must also include the following metrics previously adopted by the Commission: (11) Whether the test was taken in an in-vehicle mobile or outdoor pedestrian stationary environment (government and other third-party entities must also indicate whether an in-vehicle mobile test was conducted with the antenna outside of the vehicle); (12) an indication of whether the test failed to establish a connection with a mobile network at the time and location it was initiated; and (13) the network technology ( e.g., 4G LTE (Long Term Evolution), 5G-NR) and spectrum bands used for the test. We adopt an additional metric that was proposed in the BDC Mobile Technical Requirements Proposed Rules: (14) The app name and version. We will also require all speed tests to include: (15) The timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server. Given concerns that challengers may conduct tests after exceeding data limits, we will collect the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server, so that a service provider may determine if a challenger's device is subject to reduced speeds or otherwise lacks full network performance. The source port of the device is an available network port over which the device communicates with the server and is unique to a particular network connection or transmission. The IP address and source port associated with the device used in testing are attainable from both iOS and Android devices. For the same reasons, we will allow government and other third-party entities to alternatively submit the IMEI of the device used to conduct the test rather than provide the source IP address, source port, and timestamp measured by an app developer's servers, since such entities are allowed to use their own hardware or software to conduct speed tests.
The purpose of collecting either type of data is to allow the challenged provider to identify characteristics of the device or service plan used to conduct the test, such as whether the device was roaming or was subjected to slower service due to the subscriber's data plan. Accordingly, we will not require a service provider to submit either the device IMEI or the combination of source IP address, source port, and timestamp when submitting speed tests (either in response to a challenge or in response to a verification inquiry), as these fields are relevant only for data submitted by challengers.
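The fifteen metrics enumerated above can be pictured as one record per test. The following Python dataclass is a sketch only; the authoritative field names and formats are those in the data specification published on the Commission's website, not the names used here.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class OnTheGroundSpeedTest:
    """Illustrative record mirroring metrics (1)-(15) listed above."""
    timestamp: str                       # (1) timestamp and duration of each metric
    duration_s: float
    start_lat: float                     # (2) start/end coordinates and accuracy
    start_lon: float
    end_lat: float
    end_lon: float
    location_accuracy_m: float
    device_model: str                    # (3) device type, brand/model, OS
    operating_system: str
    provider_name: str                   # (4) service provider being tested
    test_server: str                     # (5) hostname or IP address of test server
    rf_metrics: Optional[dict]           # (6) per-cell signal strength/quality, IDs
    download_mbps: Optional[float]       # (7)
    upload_mbps: Optional[float]         # (8)
    latency_ms: Optional[float]          # (9)
    vehicle_speed_mph: Optional[float]   # (10) in-vehicle tests, where available
    environment: str                     # (11) in-vehicle mobile or outdoor stationary
    connection_failed: bool              # (12) no connection at time/location initiated
    technology: str                      # (13) e.g., "4G LTE", "5G-NR"
    spectrum_bands: Optional[str]        #      and spectrum bands used
    app_name_version: str                # (14) app name and version
    server_timestamp: str                # (15) server-measured transmit timestamp,
    source_ip: Optional[str]             #      source IP address and port (or, for
    source_port: Optional[int]           #      government/third parties, the IMEI)
    imei: Optional[str] = None
```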
13. Finally, we require on-the-ground challenge test data to include all other metrics required per the most recent specification for mobile test data adopted by OEA and WTB in accordance with 5 U.S.C. 553. Concurrent with release of this document, we are publishing the full technical and data specifications for mobile speed test data on the Commission's website at https://www.fcc.gov/BroadbandData/resources. The specification for speed test data includes additional fields derived from the high-level metrics defined herein, as well as other identifiers to facilitate management of the submission of such data. These fields include: a unique device installation ID; a unique test ID; the device Type Allocation Code (TAC); the Mobile Country Code (MCC) and Mobile Network Code (MNC) values measured from the network and from the device's SIM card; flags indicating whether the network is connected, is available, and/or is roaming; total bytes transferred and calculated bytes per second for download and upload tests; jitter and packets sent and received for latency tests; for each connected cell, the measured cell ID, Physical Cell Identity (PCI), cell connection status, Received Signal Strength Indication (RSSI), Reference Signal Received Power (RSRP), Reference Signal Received Quality (RSRQ), Signal to Interference and Noise Ratio (SINR), Channel Quality Indicator (CQI), spectrum band and bandwidth, and Absolute Radio-Frequency Channel Number (ARFCN); and the horizontal accuracy of GPS coordinates and speed accuracy of measured velocity for each location measurement. Third-party app developers and government or other third parties that use their own hardware or software to conduct speed tests will be required to update their processes in accordance with such updates, including, as stated in the BDC Mobile Technical Requirements Proposed Rules, revised specifications for mobile test data adopted by the Bureau and Offices in accordance with 5 U.S.C. 553. The modified set of parameters and metrics we adopt aligns more closely with those already required of government and third-party challengers. The Commission delegated authority to the Bureau and Offices to adopt additional testing requirements for government and third-party challengers. We therefore add certain metrics to those listed in paragraph 117 of the Third Order and § 1.7006(f) of the Commission's rules and make clear that all challengers must collect these metrics, with the exception that consumers need not indicate whether an in-vehicle mobile test was conducted with the antenna outside of the vehicle.
14. We recognize the concerns raised by Vermont DPS, Enablers, and Public Knowledge/New America about excessive data demands and the burdens on consumers, governments, and other third-party challengers of ensuring that their data align with these standards, but we believe that such parameters and metrics are necessary to provide the Commission with complete and reliable challenge data that accurately reflect on-the-ground conditions in the challenged area and provide the additional context necessary to efficiently and fully adjudicate challenges, thereby assuring that more accurate and reliable coverage maps are made available. These data metrics are also substantially similar to those adopted by the Commission in the Third Order, and therefore we do not anticipate that they will create any new burdens on consumers, governmental entities, or third parties beyond those already in place under the previously adopted requirements. Further, the challenge process will remain user-friendly because any challenger can use a readily downloadable mobile app to collect and submit data (including the FCC Speed Test app, which the FCC makes available for download at no cost), and government and third-party entities also have the flexibility to use their own software or hardware. Therefore, government and other third parties will only need to modify their software once, to the extent necessary to conform to the required testing parameters and metrics we discuss above (and subject to our adopting any new metrics in the future). The Commission will also provide technical assistance to consumers and state, local, and Tribal governmental entities with respect to the challenge process, which will be a resource for government entities that have questions about our data collection requirements. The Bureau and Offices will ensure that the FCC Speed Test app and other apps approved for use in the challenge process collect this information, and government and other third-party challengers will be able to submit challenge data to the Commission through such apps under the procedures adopted for consumer challenges.
15. We understand that certain technical network information and RF metrics that we would otherwise require are not currently available on Apple iOS devices. The information we will use in the challenge process that can be collected from Android devices, but not iOS devices, includes the signal strength, signal quality, unique identifier, and other RF metrics of each serving cell, as well as the spectrum bands used for the test and other network characteristics ( e.g., whether the device was roaming, as well as the identity of the provider for the connected network). Therefore, until such time as such information and metrics are available on iOS devices, and the Bureau and Offices indicate they will collect such information from iOS devices, government and third-party entity challengers must use a device that is able to interface with drive test software and/or runs the Android operating system. The iOS operating system, which supports iPhone and iPad hardware devices, does not disclose certain technical network information and RF metrics that are essential to the Commission's challenge and crowdsource processes. This limits the conclusions that we can draw from on-the-ground tests conducted using such devices. OET will update its guidance if future iOS software versions are released that disclose this technical network information and/or RF metrics. To ensure that the challenge process remains user-friendly and to encourage public participation, including by consumers who use a device running the iOS operating system, however, we will not extend this restriction to challenges submitted by consumers, and we will still consider speed test data submitted using an iOS device towards challenges. Although iOS software does not report the complete metrics we require in this document ( e.g., certain technical network information and RF metrics), the Bureau and Offices will nevertheless use the remaining on-the-ground data we receive from consumers using iOS software in the challenge process. Although we may receive limited data from tests run on iOS devices, we do not anticipate that such tests will significantly impede the creation of challenges because, as mentioned, the Commission will aggregate speed tests to create cognizable challenges. iOS speed tests will be considered in combination with other speed tests that fall within the same resolution 8 hexagon. We therefore anticipate that data submitted by government and other entities, as well as consumer tests run on Android devices, will help fill in any gaps in information about the on-the-ground quality and availability of broadband coverage that may result from the limited nature of the data we receive from speed tests run on iOS devices. Our approach preserves balance and flexibility for both types of challengers, while also ensuring that the Commission gathers adequate data to adjudicate challenges. On the one hand, government and other third-party entities, which can be expected to submit large amounts of speed test data, may not use iOS devices but have the flexibility to use their own hardware and software. On the other hand, consumers who use iOS devices, and who would face a prohibitive burden if required to use a non-iOS device to submit a challenge, may submit speed tests conducted using an iOS device but do not have the same flexibility as government and other entities to use non-approved software.
16. Third-party app developers and government or other entities that use their own hardware or software to conduct speed tests will be required to update their processes in accordance with updates to the full technical and data specifications for mobile speed test data, including, as stated in the BDC Mobile Technical Requirements Proposed Rules, revised specifications for mobile test data adopted by the Bureau and Offices. The Rural Wireless Association (RWA) asserts that adopting the proposed data metrics and parameters, including “all other metrics required per the most-recent specification for mobile test data released by OEA and WTB,” would be an improper incorporation by reference that violates the Office of the Federal Register (OFR) regulations and the Administrative Procedure Act (APA). We disagree. The metrics we require are substantially similar to those already adopted by the Commission in the Third Order and have been adopted after notice and comment in accordance with the APA's rulemaking requirements. Furthermore, we note that certain changes to the specifications that apply to the submission of on-the-ground test data, including, for example, changing the file type to be submitted, are not substantive changes, and may be adopted without notice and comment. The Bureau and Offices have been delegated authority to adopt such procedural changes pursuant to § 1.7010 of the Commission's rules. To the extent that we may wish to make any substantive changes to testing parameters or metrics, we clarify that we would make such changes in accordance with 5 U.S.C. 553. Any future changes we make to the testing parameters or metrics will also be consistent with the Commission's Orders implementing the Broadband DATA Act. Finally, the adoption of these rules will not result in an improper incorporation by reference because we will comply with the requirements of any applicable Federal statutes and regulations governing the publication of these test parameters and metrics in the Federal Register and the Code of Federal Regulations.
17. Speed Test Applications. Pursuant to the Commission's directive in the Third Order, OET is currently developing updates to the FCC Speed Test app to incorporate additional functionalities that will allow for its use in submitting speed test data as part of the BDC mobile challenge and crowdsource processes. OET recently released a technical description of the metrics and methodologies used in the current version of the FCC Speed Test app. The revised technical description document includes updated technical standards and additional modifications. While this document does not illustrate future user experience design changes to the FCC Speed Test app that will be made to implement the challenge and crowdsource functionalities, we anticipate that the fundamental measurement methodologies reflected in the recently updated technical description document will not be affected by these design updates. We note that the description includes the following about the test system architecture: “The measurement servers, each supporting a 100 [gigabit per second] Gbps capacity, used for mobile broadband measurement are hosted by StackPath and are distributed nationally to enable a measurement client to select the host server with the least latency.” The technical description includes data dictionaries for both Android and iOS versions of the app, but these dictionaries define data fields and formats for the current version of the app (and not the updated version of the app). To provide third-party app developers and other stakeholders with information and guidance as early in the process as possible, the Bureau and Offices have made available, contemporaneous with the release of this document, a current draft of the data specification the FCC Speed Test app will use once updated to include challenge and crowdsource data functionalities. The updated data specification aligns with the test metrics adopted in this document. The updated FCC Speed Test app with those functionalities will be available on the FCC's website and in iOS and Android app stores prior to the opening of the challenge and crowdsource process.
18. We decline to provide a further opportunity for comment on the FCC Speed Test app. Although some parties request an opportunity for public comment on both the FCC Speed Test app and third-party apps before we allow them to be used in the challenge process, we note that the Commission already sought comment on the use of the FCC Speed Test app in the challenge process as part of this rulemaking proceeding. The Commission also provided other opportunities to comment on the FCC Speed Test app because (1) the app was initially developed in coordination with the major wireless providers and trade associations several years ago; and (2) information on the data collected by the app has been publicly available on the Commission's website and has been available for comment in a rulemaking docket for several years. Additionally, the Commission specified in the Third Order that the challenge process use an FCC app, and, unlike some newer third-party speed test apps, the FCC Speed Test app has been in use for several years and the updates that are underway will merely implement the data specifications and requirements proposed in the BDC Mobile Technical Requirements Proposed Rules and adopted by this document. For these reasons, we do not believe it is necessary to seek comment on the use of the FCC Speed Test app for challenge and verification purposes.
19. CCA and RWA assert that it is unclear how the FCC Speed Test app will operate when there is inadequate connectivity to upload data or record a test. The FCC Speed Test app is designed to record and store measurements conducted in areas without internet connectivity and then to automatically transmit such failed tests once the app is opened when the device next has broadband connectivity. Moreover, third-party apps will be required to function in a similar way to be granted approval for use in the challenge process. Several commenters likewise misunderstand how the FCC Speed Test app reports “failed” tests or tests where mobile service is unavailable. As set forth in the 2021 technical description of the FCC Speed Test app, “test[ ] results are transferred depending on the available connectivity at the conclusion of the test and can be stored and forwarded when connectivity is immediately unavailable.” Failed test results are therefore uploaded to the server and included in the relevant dataset when the app user reestablishes a broadband connection. The upload and download components of a failed test will be recorded as negative if they fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred. For example, if a failed test records speeds of 0 megabits per second (Mbps) upload and 0 Mbps download, both components of the test will be recorded as negative.
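A minimal sketch of how a test's components might be classified against the speeds the provider reports, under the rule described above (thresholds and field names are illustrative, not the BDC system's implementation):

```python
def classify_components(measured_down_mbps: float, measured_up_mbps: float,
                        reported_down_mbps: float, reported_up_mbps: float) -> dict:
    """Mark each component negative if it falls short of the minimum speeds
    the provider reports as available where the test occurred."""
    return {
        "download_negative": measured_down_mbps < reported_down_mbps,
        "upload_negative": measured_up_mbps < reported_up_mbps,
    }


# A failed test recorded as 0/0 Mbps is negative on both components.
print(classify_components(0.0, 0.0, 5.0, 1.0))
# A passing test exceeds both reported minimums.
print(classify_components(7.2, 1.4, 5.0, 1.0))
```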
20. At a later date, OET will release a public notice outlining the process for collecting, reviewing, and approving applications for third-party speed test apps. In their applications, app developers will be required to describe their performance-centric speed test methodologies and how their apps comply with the data collection requirements set forth in this document. Applicants will not be required to disclose to the Commission any proprietary and/or confidential information that is sensitive to public inspection, such as source code, and we therefore decline to adopt T-Mobile's request that we require developers to submit their source code for public review. The OET public notice also will describe procedures for interested parties to submit comments and replies in response to the proposals, and OET will publish on the Commission's website a list of approved third-party apps and any available data specifications for third-party apps.
21. We agree with commenters who recommend holding the FCC Speed Test app and third-party apps to the same technical standards. Both the FCC Speed Test app and third-party apps, as well as software used by state and local governments and other third parties, must comply with the data collection requirements set forth in this document. We also agree with commenters who recommend requiring speed test apps to use multiple servers that are geographically diverse. As to this point, CCA asserts that Ookla's speed test app is more accurate than the FCC Speed Test app due in major part to its many geographically distributed servers (with 41 servers in the U.S. and 15,019 testing servers globally), which allow users to run a test against a server that is located physically close to them, reducing the likelihood of inaccurate latency measurements or artificial increases in latency distorting the download and upload speeds. As described in the most recent technical description for the FCC Speed Test app, the app currently carries out measurements against 13 servers spread across ten locations throughout the United States and initiates a test sequence by selecting a measurement server using a latency test to identify the optimal server with the lowest round-trip latency for performing subsequent tests. We believe that the current distribution of FCC Speed Test app servers, combined with this measurement server selection process, provides sufficient diversity to meet this geographic-diversity criterion. We also note that the number of servers used by a speed test is of less concern than the ratio of concurrent consumers conducting tests to the total capacity of the test server hosting those tests ( i.e., the server utilization rate). The FCC Speed Test app's test servers are overprovisioned based upon statistics of the utilization rate and usage pattern, which are automatically monitored for the highest system availability, to maintain the optimal connectivity rate. A utilization rate of 80% or more is classified as a critical state and triggers the provisioning of new servers to stabilize load across the platform. Accordingly, although the FCC Speed Test app is not as geographically diverse as Ookla's speed test app, we believe its geographic diversity in the United States is sufficient to support its user base and to meet the needs of the test system architecture. That architecture uses multiple redundant, meshed servers to target maximum availability of the test platform and employs load balancing so that traffic fails over to other servers, each of which provides 100 Gbps of connectivity capacity. We also observe that latency is the principal concern raised by commenters. In this regard, we note that Commission rules require measurement of round-trip latency.
As implemented in the FCC Speed Test app, the variability of latency is not solely a function of geographic distance to the test server; it is also a function of network congestion. At a minimum, therefore, servers should be distributed nationally in light of the user base, population density, and server utilization rate, with multiple servers examined before test server selection, and should be located reasonably close to internet eXchange Points (IXPs) to accurately reflect real-world conditions. The FCC Speed Test app accounts for these effects to help reduce round-trip latency.
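In rough Python terms, the selection and provisioning behavior described above reduces to picking the lowest-latency server and flagging critical utilization. The hostnames below are illustrative, and the sketch models the described behavior rather than the app's actual implementation.

```python
def select_test_server(servers: list[tuple[str, float]]) -> tuple[str, float]:
    """Choose the measurement server with the lowest measured round-trip
    latency, mirroring the test-server selection step described above."""
    return min(servers, key=lambda server: server[1])


def utilization_is_critical(utilization: float) -> bool:
    """A utilization rate of 80% or more is classified as a critical state
    and triggers provisioning of new servers."""
    return utilization >= 0.80


candidates = [("east.example.net", 42.0), ("central.example.net", 18.5),
              ("west.example.net", 65.3)]
print(select_test_server(candidates))    # ('central.example.net', 18.5)
print(utilization_is_critical(0.82))     # True
```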
22. Validating Speed Tests. As proposed in the BDC Mobile Technical Requirements Proposed Rules, we will validate submitted speed tests and exclude those that: (i) Are outside the scope of the challenge process, (ii) do not conform to the data specifications, or (iii) do not otherwise present reliable evidence. We will accept as valid speed tests only those tests conducted between the hours of 6:00 a.m. and 10:00 p.m. local time. Commenters do not raise concerns with our adopting a window for purposes of validating speed tests. We will compare speed tests for a particular network technology ( e.g., 3G, 4G LTE, or 5G-NR) to the coverage maps for the corresponding technology or a higher-generation technology, to the extent the service provider claims coverage for more than one technology in the tested location. We implement these changes so that testers are able to submit tests to be used to challenge a higher-generation technology map in situations when a mobile service provider claims multiple technologies at a location but the tester's device only connects to a lower-generation technology. We agree with Vermont DPS that our original proposal did not adequately address those situations where a device that is unable to connect to a network using a particular technology “falls back” to a lower-generation technology ( e.g., 4G LTE to 3G), which could make it impossible to challenge the higher-generation technology. We will therefore allow a speed test conducted using a device capable of connecting to a higher-generation technology, but that only connects to a lower-generation technology, to count as a test for the higher-generation technology. To be a valid test for the higher-generation technology, the consumer submitting the challenge must also subscribe to a service plan that is capable of connecting to the provider's network using the higher-generation technology. To prevent gaming, and as discussed further below, we will allow challenged providers to invalidate challenger speed tests with specific evidence that the challenger's device was not capable of connecting using a higher-generation technology or that the service plan to which the challenger subscribes does not allow use of the higher-generation technology. For example, a test conducted with a 4G LTE-capable device in a location where the service provider claims 4G LTE but where the challenger can only connect via the 3G network could count both as a 3G test when compared to the provider's 3G coverage map and as a negative 4G LTE test when compared to its 4G LTE coverage map if the test did not meet the 5/1 Mbps minimum speeds; alternatively, it could count as a positive 4G LTE test if the test met or surpassed the 5/1 Mbps minimum speeds reported for the 4G LTE map. Note that, under this approach, the 3G test may count towards the 4G LTE coverage map regardless of whether the provider claims 3G coverage at the location. This modified approach resolves Vermont DPS's hypothetical concern that, under the proposal set forth in the BDC Mobile Technical Requirements Proposed Rules, a test result that “fell back” to a lower-generation technology would not be “preserved.” As discussed, such tests will be preserved and used to challenge a higher-generation technology's maps if a service provider offers a higher-generation service in that area and the tester subscribes to a service plan that is capable of connecting to the provider's network using the higher-generation technology.
23. Similarly, if a challenger conducts a test but fails to connect to any network, we will treat that as a failed test against the provider's coverage maps for each technology to which the device is capable of connecting. These small changes to our original proposal will help prevent the scenario raised by Vermont DPS and enable more meaningful challenges in areas with marginal coverage where a device “falls back” to a lower-generation technology. Our updated approach also accounts for situations in which a device could alternate between, or utilize both, 4G LTE and 5G-NR over the course of a single test. Verizon agrees with the Bureau and Offices' initial proposal to compare each speed test against the relevant coverage map, and argues that “only speed tests conducted on 3G networks should be used to challenge 3G coverage, only speed tests conducted on 4G LTE networks should be used to challenge 4G LTE coverage, and only speed tests conducted on 5G-NR networks should be used to challenge 5G-NR coverage.” However, we are persuaded that the proposal we sought comment on in the BDC Mobile Technical Requirements Proposed Rules could allow for a scenario in which a tester seeking to support a challenge to a provider's 5G coverage would be prevented from submitting evidence because their phone fell back to the 4G network. Under our original proposal, in areas where a provider claims coverage for multiple technologies, a lower-generation technology could have prevented the higher-generation technology from being challenged, which in turn could isolate higher-generation technologies from legitimate challenges.
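The fallback rule adopted above can be sketched as follows: a test counts against its connected technology's map and, where the device and service plan support a claimed higher generation, against that higher-generation map as well. The logic is illustrative only, not the BDC system's implementation.

```python
GENERATION_RANK = {"3G": 0, "4G LTE": 1, "5G-NR": 2}


def maps_test_counts_against(connected_tech: str, device_supports: set,
                             plan_supports: set, claimed_techs: set) -> set:
    """Coverage maps a single speed test may be evaluated against."""
    eligible = set()
    for tech in claimed_techs:
        if GENERATION_RANK[tech] == GENERATION_RANK[connected_tech]:
            eligible.add(tech)
        elif (GENERATION_RANK[tech] > GENERATION_RANK[connected_tech]
              and tech in device_supports and tech in plan_supports):
            eligible.add(tech)  # "fallback" test also counts against higher map
    return eligible


# A 4G LTE-capable device and plan that only connected via 3G counts against
# both maps where the provider claims both technologies.
print(maps_test_counts_against("3G", {"3G", "4G LTE"}, {"3G", "4G LTE"},
                               {"3G", "4G LTE"}))
```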
24. We will also compare speed tests conducted in a particular environment—outdoor stationary or in-vehicle mobile—to where the provider's maps report coverage for the corresponding environment ( i.e., outdoor stationary or in-vehicle mobile), as discussed in greater detail below. Additionally, we will treat as invalid and exclude from the challenge process any speed tests that fall outside the boundaries of the provider's most recent coverage data for all claimed technologies and environments. This differs from our original proposal in the BDC Mobile Technical Requirements Proposed Rules in that the system will preserve all tests in a geographic area where a provider claims coverage by any technology. We believe our modified approach will result in more reliable evidence for challenges because tests that may otherwise have been excluded for falling outside a provider's coverage for a specific technology under the proposed methodology in the BDC Mobile Technical Requirements Proposed Rules may now be counted as challenge data. This change will allow for the scenarios discussed above, in which a test conducted using a lower-generation technology could be used to challenge a provider's map for a higher-generation technology if the provider claims both types of coverage ( e.g., 4G LTE and 5G-NR), but a challenger's device is not connected to the higher-generation technology.
25. In response to Verizon's concerns that tests may be throttled, we will not validate, for purposes of the challenge process, speed tests conducted by customers of mobile virtual network operators (MVNOs) or tests conducted while roaming on another carrier's network, so as to avoid biasing the challenge process with speed tests that may not reflect typical network performance. MVNOs do not own any network facilities. Instead, they purchase mobile wireless service wholesale from facilities-based service providers and resell these services to consumers. Because the agreements between a facilities-based provider and MVNOs or roaming partners often include limitations on the technology and speed available to, or the network prioritization of, devices used by consumers of the MVNO or roaming partner, we conclude that speed tests from such devices are not reliable evidence about the performance of the facilities-based provider's network. While we anticipate that the majority of tests conducted by an MVNO subscriber or while roaming will fail our automated validations, there may be circumstances where the BDC system is unable to automatically identify these tests ( e.g., identifying whether an iOS device is roaming is not currently possible). We anticipate that a provider may identify whether a specific device used in the testing was roaming at the time, was an MVNO customer, or was subject to deprioritized or otherwise limited service because, as discussed, on-the-ground speed tests submitted in the challenge process will include the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server. We therefore do not agree with Vermont DPS's assertion that pre-paid tests in rural areas will be less accurate than speed tests run by subscribers of a typical service provider because pre-paid services exclude roaming in rural areas; we will not validate any tests conducted while a subscriber is roaming. While we will allow a service provider's pre-paid customers to submit speed tests for use in the challenge process, a service provider will be able to use the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server, to determine if a specific speed test was run by a pre-paid subscriber that experienced limited service, and use that information when responding to a challenge. Given that these consumers may likely be subject to de-prioritization or otherwise limited service, and that the BDC system will be unable to detect whether or not a limitation in mobile service exists, we are unable to establish a reliable method for validating MVNO or roaming tests, and thus these tests will be excluded from the challenge process. As discussed later, however, we may consider speed tests conducted by consumers of MVNOs and consumers roaming on other providers' networks when evaluating crowdsourced data.
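Where the BDC system can detect them, MVNO and roaming tests are screened out before aggregation; a one-line predicate captures the rule (field names illustrative):

```python
def eligible_for_challenge(test: dict) -> bool:
    """Exclude tests by MVNO subscribers or tests taken while roaming, which
    may be throttled or deprioritized and so may not reflect the facilities-
    based provider's typical network performance."""
    return not (test.get("is_mvno", False) or test.get("is_roaming", False))


tests = [{"id": 1, "is_roaming": True},
         {"id": 2, "is_mvno": True},
         {"id": 3}]
print([t["id"] for t in tests if eligible_for_challenge(t)])  # [3]
```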
26. Aggregating Valid Speed Tests. The Bureau and Offices will combine and collectively evaluate—according to the testing environment ( i.e., outdoor stationary or in-vehicle mobile) and technology type—valid speed tests submitted by consumer, governmental, and third-party challengers. Speed tests, including those collected through an approved speed test app and the data collected by government and other third-party entities through their own software and hardware, will be combined and collectively evaluated according to their tested environment and technology type. For example, as discussed in greater detail below, in-vehicle tests will generally be evaluated against a carrier's in-vehicle maps, and stationary tests will generally be compared against a carrier's stationary maps. We expect in-vehicle and stationary tests to have substantially different results, such that they would not provide an equal comparison; aggregating these tests would be problematic because fundamental characteristics of the two environments are expected to cause noticeable signal losses in the in-vehicle mobile environment. As noted above, we do not expect iOS and Android devices to pose a similar problem. While we will receive a more complete set of datapoints from Android tests than from iOS tests, we do not expect the two to have substantially different results when, for example, tests using both types of devices are conducted in a pedestrian stationary environment; the fact that iOS provides fewer datapoints than Android does not render a test run on iOS any less accurate than a test run on the Android operating system. Similarly, tests conducted with an external antenna will be considered in-vehicle, and while subtle differences between test results from different antenna placements may occur, overall those differences are considerably less significant than the differences between the stationary and in-vehicle mobile environments more broadly.
27. We will combine such speed test evidence and apply a single methodology to determine whether the thresholds for a cognizable challenge (described in greater detail below) have been met and the boundaries of the challenged area. Several commenters express support for aggregating speed tests from multiple challengers, and we find that doing so will result in more accurate challenges and will further the Commission's goals of resolving challenges in an efficient manner, mitigating time and expense, and ensuring that maps are as reliable and useful as possible. We disagree with the California Public Utilities Commission's (CPUC) assertion that combining speed test data will not reduce costs or complexity in the challenge process. In fact, combining speed tests could ease the burden on an individual challenger of conducting multiple speed tests to meet the challenge thresholds. Our approach ensures that a smaller number of speed tests by one person or entity may nevertheless contribute to a challenge because the tests will be combined with other validated speed tests to meet the testing, temporal, and geographic thresholds. As a result, in many cases, no single challenger—whether a consumer, a government agency, or another entity—will be required to individually shoulder the burden of creating a challenge. While in places with low population density an individual challenger may be the only entity to submit speed tests to create a cognizable challenge, in many other cases, challengers will be able to combine efforts to submit speed tests in an area. Speed tests will be combined and used collectively—according to testing environment (i.e., outdoor stationary or in-vehicle mobile) and technology type—to meet the thresholds set forth below.
28. We will evaluate tests for a given technology against each provider map independently (one reporting stationary and one reporting in-vehicle mobile coverage) when determining whether to establish a cognizable challenge. Pursuant to the Third Order, tests taken on bicycles and motorcycles will be considered tests from in-vehicle mobile environments. We will consider in-motion tests taken in similar environments, such as on a snowmobile or all-terrain vehicle, to be tests from in-vehicle mobile environments. By contrast, consistent with the Third Order, tests taken from stationary positions and tests taken at pedestrian walking speeds (such as on horseback) will be considered tests taken in outdoor pedestrian environments. We decline to exclude tests taken on other vehicles, as T-Mobile requests. The Commission did not give the Bureau and Offices authority to change this accommodation, and we anticipate that challengers may take speed tests on vehicles other than cars in areas with difficult or hard-to-reach terrain. Additionally, we will exclude stationary tests that occur outside a provider's stationary coverage map and in-vehicle mobile tests that occur outside a provider's in-vehicle mobile coverage map. Our approach differs from that proposed in the BDC Mobile Technical Requirements Proposed Rules in that we will no longer aggregate in-vehicle and stationary maps together. We find that the approach we adopt will result in more accurate challenges. To ensure that the challenge process also remains user-friendly, and because we expect performance to be better for stationary tests than for in-vehicle tests, stationary speed test results that create a cognizable challenge to an area on the stationary map will also create a cognizable challenge to the same area on the in-vehicle map if the area has overlapping coverage on both maps. The reverse will not be permitted: we will not permit a challenge to an area on the in-vehicle map to automatically create a challenge to the same area on the stationary map if the area has coverage on both maps. If, however, in an area that has coverage on both maps we find that large portions of a provider's in-vehicle mobile map have been successfully challenged, but there are very few speed tests conducted in a stationary environment, then we may use this as evidence upon which to form a credible basis for initiating a verification inquiry of a provider's stationary coverage in that area. Similarly, a provider refuting a challenge to a geographic area on the in-vehicle map would also refute a challenge to the same area on the stationary map if such a challenge exists.
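As a concrete illustration, the one-way propagation rules just described could be sketched as follows; the function names, map labels, and dict-based inputs are hypothetical illustrations, not part of the BDC system.

```python
# A hedged sketch of the one-way propagation rules described above.
# Only the propagation logic tracks the text; the data model is assumed.

def maps_challenged(hex_id, challenged_map, has_coverage):
    """A cognizable stationary challenge also challenges the in-vehicle
    map where the area has overlapping coverage on both maps; an
    in-vehicle challenge does not propagate to the stationary map."""
    result = {challenged_map}
    if challenged_map == "stationary" and has_coverage(hex_id, "in_vehicle"):
        result.add("in_vehicle")
    return result

def maps_rebutted(hex_id, rebutted_map, open_challenges):
    """A successful in-vehicle rebuttal also refutes a pending challenge
    to the same area on the stationary map, if one exists."""
    result = {rebutted_map}
    if rebutted_map == "in_vehicle" and "stationary" in open_challenges.get(hex_id, set()):
        result.add("stationary")
    return result
```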
29. Several providers express concern about the proposal to aggregate in-vehicle mobile and outdoor stationary tests and compare them collectively against both coverage maps. As described above, we will not aggregate all stationary and in-vehicle mobile tests for comparison against both maps, but will evaluate stationary tests against the stationary map and in-vehicle mobile tests against the in-vehicle map. Rather than aggregating all tests, we will allow cognizable challenges to the stationary map to also create a challenge for the same area on the in-vehicle map, and successful provider responses to the in-vehicle map to also refute a cognizable challenge of the same area on the stationary map. We find that this approach adequately addresses providers' concerns about comparing tests from different modeled environments and promotes consistency between the maps. We thus decline to adopt the Vermont DPS's recommendation to allow challengers to submit in-motion tests to challenge stationary coverage, because we do not expect in-vehicle tests to achieve the same performance as tests conducted in a stationary environment. If we did not allow for challenge or response comparison to both maps in the limited circumstances we adopt above, it would be easier for one map in an area to show a lack of coverage while the other map shows robust coverage, solely because of a lack of testing.
30. Data from speed tests taken after the "as-of" date of the initial BDC data collection will be considered as part of the challenge process upon confirmation that they meet the validation criteria set forth herein. Accordingly, once the Commission has generated maps of the data collected from providers, the BDC system will analyze all previously submitted tests to determine whether they were taken after the "as-of" date of the maps and to perform the data validations discussed further below, including whether they were taken within the published coverage area claimed by the applicable provider. Speed tests submitted as part of the challenge process that do not meet these qualifications will be considered crowdsourced data. Validated speed test results will be reconsidered on a monthly basis, in conjunction with any newly validated speed test filings, to determine whether the data meet the geographic, temporal, and testing thresholds to create a cognizable challenge to an area. Such speed tests will be considered for up to one year to determine whether the data for a location subsequently meet the thresholds to be considered a cognizable challenge, and if so, the tests will be used collectively to challenge the maps that are published at that time.
31. Once the maps have been published, the BDC system will analyze all submitted tests to determine whether speed tests fall within the geographic area depicted in a provider's published coverage area. Speed tests submitted after the "as of" date but prior to publication of the maps, as well as those submitted after the publication of the maps, will be used to challenge the maps that are published at that time, subject to the restriction that speed tests are considered valid evidence for one year from the date the test was taken. During the one-year period that they remain valid evidence, speed tests may initially be excluded from consideration in the challenge process because they fell outside of the provider's reported coverage maps, but be included when the system reconsiders the challenge data every month, due to subsequent publication of maps reporting coverage in the areas where such tests are located. For example, if a challenger submits otherwise valid speed tests that were conducted in July in an area reported by the provider as lacking coverage in its maps that are "as of" the previous December 31, such tests would be initially excluded. If, however, the coverage maps submitted by the provider "as of" June 30 and published in September of that year do report such areas as covered, the tests taken in July would be considered as valid evidence in favor of a challenge to the June 30 maps. Parties submitting speed tests to be used in the challenge process will be notified when their test has been submitted and that the test may be used to create a challenge if the data meet the validation requirements. Thereafter, parties that have submitted speed tests to be used in the challenge process will be notified of the status of their submitted speed tests, including whether their speed test is used in the creation of a cognizable challenge.
32. Maps That Can Be Challenged. We clarify that speed test data will only be used to create challenges in areas where a provider reports that it has broadband service availability. We will, however, permit challenges to 3G, 4G LTE, and 5G-NR coverage maps. Some commenters suggest that we defer consideration of challenges to 3G maps, but the Commission has classified 3G as a mobile broadband technology in previous BDC orders and has determined to allow challenges to the accuracy of mobile broadband coverage maps. Since the Commission did not delegate to the Bureau and Offices the authority to limit challenges to certain technologies, we lack the discretion to limit challenges to only 4G LTE and 5G-NR maps. Moreover, doing so could exclude certain consumers from the challenge process. For example, consumers rely on 3G in areas where 4G LTE and 5G-NR are not offered by the provider or are otherwise unreliable, and subscribers in rural areas continue to use 3G at higher concentrations than in other parts of the country. We note that, when a provider retires a given mobile broadband technology such as 3G, that service would not be included on its updated coverage maps and therefore would not be available for challenges. However, until providers retire a particular broadband network technology, they will be obligated to respond to challenges to their claims of coverage for that technology.
33. Based on the record and the goals underlying the Broadband DATA Act, we adopt our proposal to exclude voice maps from the challenge process. The Broadband DATA Act requires the Commission to establish a process for challenging the accuracy of broadband coverage data, which, for mobile services, is defined as "the coverage maps" (i.e., the broadband maps discussed in 47 U.S.C. 642(c)(1)) and "any information submitted by a provider regarding the availability of broadband internet access service." Additionally, the Commission has decided that the mobile challenge process applies only to broadband (and not voice) coverage maps. We also note that commenters raise concerns with using "speed test" data to verify voice coverage maps. Vermont DPS disagrees, proposing that the Bureau and Offices should set parameters for voice maps, including defining a threshold signal level of upload and download speeds that would indicate voice service is available in an area. We reject the Vermont DPS proposal. Vermont DPS was the only commenter to proffer minimum throughput parameters (i.e., download and upload speeds) or signal strength values necessary to support a voice call, but these values did not receive any additional record support. Although Vermont DPS recommends that the Bureau and Offices determine threshold parameters that "would be indicative of no mobile service," it does not propose specific parameters, noting only that zero would be indicative of no service and that 256 kilobits per second (kbps) download, 64 kbps upload, or a signal strength of less than −105 decibel-milliwatts (dBm) would indicate that service is likely insufficient. We therefore decline to include voice maps as part of the mobile challenge process at this time.
34. Additionally, we reject commenters' requests to allow challenges only to outdoor stationary coverage maps. CTIA—The Wireless Association, Verizon, T-Mobile, and AT&T argue that the Commission should focus first on challenges to outdoor stationary maps, and defer consideration of any challenges to in-vehicle maps until after it has ruled on CTIA's petition for reconsideration to eliminate in-vehicle coverage maps. The Commission's Third Order clearly directed that we collect both sets of maps, and we will not eliminate or delay the challenge process for in-vehicle maps given the importance of making the challenge process available for consumers and other entities that use mobile services in vehicles, unless the Commission determines that such maps are not necessary. CTIA, Verizon, T-Mobile, and AT&T also argue that in-vehicle maps should be excluded from the challenge process because the Commission has not established parameters for mapping in-vehicle coverage or evaluating in-vehicle challenges. Limiting the challenge process to outdoor stationary tests and maps could reduce the utility and accuracy of the challenge process, given that many consumers use mobile services in vehicles and in motion. It also would ignore a significant number of speed tests, especially on highways and in areas where it is not safe or convenient to conduct stationary speed tests. We recognize that many states ban handset use while driving and that many vehicle operators do not have passengers; we do not intend to contravene state bans on handset use while driving, nor do we advocate that consumers run speed tests on a personal handset while operating a vehicle. Moreover, the Commission has established sufficient parameters for mapping in-vehicle coverage and evaluating in-vehicle challenges. The Commission has allowed consumers to conduct speed tests in an in-vehicle mobile environment, but declined to adopt detailed testing requirements for in-vehicle consumer tests, whereas it required government and third-party challengers to submit more detailed information on tests run in in-vehicle mobile environments. We reiterate that all challengers must report whether the test was taken in an in-vehicle mobile or outdoor pedestrian environment; for in-vehicle tests, the speed the vehicle was traveling when the tests were taken (where available); and, for government and other third-party challengers conducting in-vehicle tests, whether the test was conducted with an antenna located outside of the vehicle.
35. Finally, we decline to adopt Vermont DPS's request to change the thresholds for in-vehicle tests "to account for the slight difference in performance of stationary and mobile tests" because, as discussed, we will not use in-vehicle test data to form the basis of a challenge to stationary maps. Moreover, Vermont DPS has not given us any objective metric by which to adjust tests upward or downward for purposes of meeting the threshold when comparing a test against the other environment: Vermont does not suggest any formula to accurately estimate actual performance based upon, for example, signal strength, and thus there is no way we could translate signal strength into actual speeds.
36. We also reject suggestions that we permit challenges only in rural areas. The Broadband DATA Act envisions a broad challenge process, and there is nothing in the Act that authorizes the Commission, or by extension, the Bureau and Offices, to limit the challenge process to rural areas.
37. Grouping Valid Speed Tests by Location. After excluding speed tests that fail our validations, we will associate the location of each valid speed test with a particular underlying hexagonal cell geography based on the H3 geospatial indexing system. The H3 system is designed with a nested structure wherein a lower resolution cell (the "parent" hexagon) contains approximately seven hexagonal cells at the next higher resolution (its "children," each a "child" hexagon), which approximately fit within the "parent" hexagon. The lower the resolution, the larger the area of the hexagonal cell. Because of this nested structure, using the H3 system to group speed tests allows for challenges at multiple levels of granularity which, as discussed below, enables challengers in rural areas where broadband coverage may be more sporadic to contest larger areas if aggregated speed test data demonstrate a lack of coverage within a sufficient number of child hexagons. As proposed, the smallest cognizable challenge will be to a single resolution 8 hexagonal cell, which has an area of approximately 0.7 square kilometers.
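As a concrete illustration of this nested structure, the sketch below uses the open-source h3 Python bindings (v4 API names are assumed, and the coordinates are hypothetical) to map a test location to its resolution 8 cell, its resolution 7 parent, and its resolution 9 children, the "point-hexes" discussed below.

```python
# A minimal sketch of H3 cell assignment and the parent/child structure
# described above; h3-py v4 API names and the coordinates are assumptions.
import h3

lat, lng = 44.26, -72.58  # hypothetical speed-test location

cell_res8 = h3.latlng_to_cell(lat, lng, 8)       # smallest challengeable unit
parent_res7 = h3.cell_to_parent(cell_res8, 7)    # "parent" hexagon
point_hexes = h3.cell_to_children(cell_res8, 9)  # the "point-hexes" within it

print(len(point_hexes))                          # 7
print(h3.average_hexagon_area(8, unit="km^2"))   # roughly 0.7 km^2 on average
```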
38. Some commenters support the use of hexagons to evaluate challenges but recommend basing challenges on a different hexagonal cell size. While Vermont DPS generally supports the proposed use of H3 indexing, it argues that the system is not intuitive to use and asks the Commission to create and share geographic information system (GIS) layers for the H3 hexagons at all resolutions it intends to employ in the coverage analysis, which we have already done. CTIA, T-Mobile, and AT&T urge us to use smaller resolution 10 hexagons instead of resolution 8, contending that hexagons at resolution 10 better match the 100-meter resolution providers must use when submitting their coverage maps. RWA and Vermont DPS, meanwhile, recommend allowing challenges to resolution 6 and 7 hexagons in rural areas, which RWA notes are often difficult to test because of a lack of accessible roads.
39. We find that resolution 8 strikes an appropriate balance as the smallest resolution for a cognizable challenge. Smaller areas (e.g., resolution 9 or 10 hexagons) could result in many disparate challenges that may require excessive testing by providers and, in the case of resolution 10 hexagons, may exceed the granularity of propagation maps that were not designed to provide such precision. Coverage maps must be submitted at a resolution of 100 meters (i.e., 0.1 km) or better. Therefore, allowing for challenges to an area as small as a resolution 10 hexagonal cell, which is smaller than the 100-meter map resolution, may instead reflect inaccuracies due to the resolution at which the provider generated its maps. Larger areas (e.g., resolution 6 or 7 hexagons), on the other hand, would require significantly more testing by challengers and make it difficult to verify coverage in distinct local areas. For example, a resolution 7 hexagon would require four to seven times as many tests as a resolution 8 hexagon to create a successful challenge. The Commission directed staff to determine the requisite number of tests and define the geographic boundaries of cognizable challenges while satisfying the goals of both "encourag[ing] consumers to participate in the challenge process and assuring that providers are not subject to the undue cost of responding to a large number of challenges to very small areas." We are not persuaded that allowing challenges to areas smaller than the 100-meter resolution requirement (i.e., to a resolution 10 hexagon) would adequately meet these goals. Using areas smaller than a resolution 8 hexagon would additionally make it difficult for consumers to meet the thresholds for cognizable challenges: a challenger would need to take many more tests in the smaller hexagons to achieve the statistical significance required. Use of particularly small areas also would likely make in-motion testing for both challengers and providers impossible. In the future, we may consider using hexagonal cells at a higher resolution if it becomes necessary to correct coverage errors at a more granular level.
40. RWA and Verizon assert that the use of the H3 geospatial indexing system would present implementation issues. RWA cautions that third-party network maps, which providers may use to supplement the data used to rebut challenges, may not be compatible with the H3 geospatial indexing system. Verizon also raises concerns that providers would need to develop new tools and systems for managing speed tests and evaluating data in an H3-based environment and notes that tracking and evaluation may be complicated because child cells will not nest precisely into their parent cell. These concerns do not warrant deviations from our proposal since parties seeking to rebut challenges do not need to conform their tools or data to the H3 indexing system. The BDC system itself will overlay submitted speed test points with the H3 hexagons; providers need only submit their speed test data and the BDC system will appropriately index them (so long as the data otherwise meet the specifications and test requirements to qualify as valid on-the-ground speed test data). Moreover, H3 is an open-source indexing system, and therefore we do not anticipate it being overly expensive or burdensome for providers to access. Finally, in response to Verizon's argument that the tracking and evaluation of speed test data will be complicated because child cells will not nest precisely into their parent cell, we note that speed tests will be evaluated based on the resolution 8 hexagon within which a test falls.
41. CPUC and Public Knowledge/New America assert that submitting speed test data under the H3 system using resolution 8 hexagons would be more burdensome and expensive, and would result in fewer challenges, because challengers would need to gather statewide measurements in each resolution 8 hexagon. We disagree. First, challengers will not need to submit speed tests in every resolution 8 hexagon in a state because challenge data cannot form the basis of a cognizable challenge in areas where a provider does not claim coverage. Challengers will be aware of the areas in which a provider does not claim coverage from the publicly available mobile broadband map and can avoid the burden and expense of conducting speed tests in those areas. Second, as discussed, we will combine, according to the tested environment, valid speed tests conducted by consumers; state, local, and Tribal governments; and other entities. This likely will reduce the number of speed tests that any one challenger needs to submit to create a challenge. The number of tests required to meet the thresholds reflects the total number of speed tests needed to create a cognizable challenge, not necessarily the number of speed tests that must be submitted by an individual consumer or entity. Third, CPUC's concerns ignore our decision to allow testers to challenge larger geographic areas, such as resolution 7 or resolution 6 hexagons, when at least four of the seven child hexagons of the parent hexagon are challenged. Testers will be able to see which areas have been challenged and if, for example, four or more of the seven child resolution 8 hexagons in a resolution 7 hexagon are challenged, then the entire resolution 7 hexagon will be considered challenged. Finally, H3 indexing will not burden testers because it will serve as an "under the hood" way for the Commission to group and analyze speed tests submitted by testers at various times and places.
42. We will evaluate all valid challenger speed tests that present evidence about the service of a given technology and environment within each hexagon to determine whether to create a cognizable challenge to the coverage in that area. We did not receive any comments on this proposal. We also adopt the alternative approach proposed in the BDC Mobile Technical Requirements Proposed Rules to evaluate the download and upload components of each speed test individually rather than evaluating them jointly. Under this approach, each component will be categorized as either "positive" or "negative" based on whether the component is consistent with the provider's modeled coverage (i.e., the coverage assumptions in providers' BDC propagation maps). A positive component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred. A negative component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred. For each speed test, the download component will be either positive or negative, and the upload component will be either positive or negative. The coverage map will then be evaluated for all download tests and separately for all upload tests. If a resolution 8 hexagon meets the thresholds for either upload or download tests, a challenge will be triggered. To rebut a challenge, a provider must meet the thresholds for both the upload components and the download components. Speed test apps typically measure download and upload components sequentially rather than simultaneously, so evaluating these components independently will better account for geographic and/or temporal variability.
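The per-component evaluation just described can be summarized in a short sketch; the data structure and the example speed values are illustrative assumptions, with the modeled minimum speeds standing in for the values in a provider's BDC propagation maps.

```python
# A sketch of classifying each test component as "positive" or "negative"
# against the provider's modeled minimum speeds, then evaluating download
# and upload components separately. Values are illustrative.
from dataclasses import dataclass

@dataclass
class TestComponent:
    kind: str                # "download" or "upload"
    measured_mbps: float     # speed recorded by the test
    modeled_min_mbps: float  # minimum speed the provider reports here

def is_negative(c: TestComponent) -> bool:
    return c.measured_mbps < c.modeled_min_mbps

components = [
    TestComponent("download", 3.1, 5.0),  # negative: below modeled minimum
    TestComponent("upload", 1.2, 1.0),    # positive: meets modeled minimum
]
downloads = [c for c in components if c.kind == "download"]
uploads = [c for c in components if c.kind == "upload"]
# Each list is then tested separately against the thresholds described
# below; a hexagon is challenged if either the download or upload set qualifies.
```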
43. Where the starting and ending locations of a test are in different hexagons (e.g., because the testing device was in motion), we will associate the test with the hexagon containing the midpoint of the reported start and end coordinates for each test component. We also will use the midpoint to determine whether the test component falls within the applicable provider's coverage map. The point-hex association is made separately for each test component; therefore, a download test could be associated with a different point-hex than an upload test, and in such cases, the two tests would be treated independently. We disagree with Ookla that we should use the start location as the single point value of a test rather than associating two locations with each data point. We also disagree with Vermont DPS that we should use a single set of geographic coordinates at the start of each on-the-ground test sequence, but we do agree with its alternative recommendation and will capture the timestamp and duration of each test component, as well as the geographic coordinates measured at the start and end of each test component with typical GPS Standard Positioning Service accuracy or better. Having start and end coordinates for each test will facilitate our verification of stationary maps versus mobile maps because it will enable us to capture the precise locations of drive tests.
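A short sketch of the midpoint rule follows; averaging raw coordinates is a simplification that is reasonable at these short distances, and the h3-py v4 API name is an assumption.

```python
# A sketch of associating an in-motion test component with the H3 cell
# containing the midpoint of its start and end coordinates.
import h3

def component_cell(start, end, resolution=8):
    """start and end are (lat, lng) pairs recorded for one component."""
    mid_lat = (start[0] + end[0]) / 2.0
    mid_lng = (start[1] + end[1]) / 2.0
    return h3.latlng_to_cell(mid_lat, mid_lng, resolution)

# The download and upload components of one test are located separately,
# so they may fall in different cells and are then treated independently.
```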
44. We decline Verizon's request to adopt additional device- and plan-specific requirements. We recognize that some devices have limitations (e.g., an older device may not connect to all spectrum bands), but find that restricting the types of devices that can be used to conduct speed tests would make the challenge process less user-friendly and less accessible to consumers and non-consumers alike. At the same time, a challenger must disclose the manufacturer and model of its device so that providers will have this information when rebutting challenges and can seek to invalidate tests from devices that are not compatible with a specific network or band. We will also allow mobile service providers to respond to a challenge with infrastructure information in situations where a mobile device used in the testing accessed the network over a data plan that could result in slower service. Finally, the methodology we adopt for aggregating speed tests and requiring challenges to meet the thresholds described below will ensure that challenges are temporally and geographically diverse and therefore reflect a robust and representative sample of user experience, regardless of device type or subscriber plan.
45. Challenges to Larger, Lower-Resolution Hexagons. We adopt our proposal for a "parent" or "grandparent" hexagon (i.e., a hexagon at resolution 7 or 6) to be considered challenged if at least four of its child hexagons are challenged. CCA supports this proposal, while T-Mobile and Verizon argue that it could allow for challenges to very large areas even though significant portions of them have not been tested. We disagree with T-Mobile and Verizon and find that this approach will allow for the effective challenge of larger areas where an abundance of geographically diverse tests indicates a pervasive problem. Under this approach, a resolution 7 or 6 "parent" hexagon will be considered challenged only if more than half (i.e., at least four of seven) of its "child" hexagons are challenged. The threshold can therefore be met without testing each resolution 8 hexagon, including ones that may be practically inaccessible. But each "child" hexagon must still meet the geographic threshold described below, which means that any challenge to a larger "parent" hexagon will reflect negative tests that are persistent throughout the geographic area. While we decline to set the minimum size of a cognizable challenge at either resolution 7 or resolution 6 hexagons as requested by RWA, we believe that the approach we adopt herein will allow for challenges covering a significant portion of otherwise inaccessible resolution 8 hexagons. So long as challengers submit tests meeting the thresholds in at least four of the seven resolution 8 hexagons within a "parent" resolution 7 hexagon, the remaining hexagons will be effectively covered by the challenge to the "parent," even if those resolution 8 hexagons are inaccessible. We conclude that this strikes an appropriate balance between reducing the burden on challengers and ensuring that robust evidence of a problem exists before requiring a provider to respond.
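The sketch below expresses this parent/grandparent rule; h3-py v4 API names are assumed, and `challenged_cells` is a hypothetical set of already-challenged child cells.

```python
# A sketch of the rule that a resolution 7 (or 6) hexagon is considered
# challenged when at least four of its seven children are challenged.
import h3

def is_parent_challenged(parent_cell, challenged_cells):
    child_res = h3.get_resolution(parent_cell) + 1
    children = h3.cell_to_children(parent_cell, child_res)
    return sum(1 for c in children if c in challenged_cells) >= 4

# Applied recursively, a resolution 6 "grandparent" is challenged when at
# least four of its resolution 7 children are themselves challenged.
```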
46. Required Thresholds. A resolution 8 hexagon will, as proposed, be challenged when tests submitted within the hexagon meet three thresholds: geographic, temporal, and testing. We adopt the proposed geographic threshold, modified to account for our approach of evaluating each test component (i.e., download and upload) separately. If the tests for a given technology in a resolution 8 hexagon meet all three thresholds, we will consider that map's coverage to be challenged in that area. To satisfy the geographic threshold for a challenge, in general, at least four child hexagons (i.e., "point-hexes") within the resolution 8 hexagon must each contain two test components of the same type (download or upload), at least one of which is negative. The threshold must be met entirely by one component type, meaning that a challenge may contain either two upload components per point-hex, one of which is negative, or two download components per point-hex, one of which is negative. Requiring at least four out of seven point-hexes to include two components of the same type, at least one of them negative, will ensure that more than half of the point-hexes within a resolution 8 hexagon show inadequate coverage. Requiring at least one negative test in multiple locations within the geographic area of a resolution 8 hexagon will demonstrate that negative tests are persistent throughout the hexagon.
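A hedged sketch of the geographic threshold for a fully accessible hexagon follows; the dict-based component records are hypothetical stand-ins for validated speed-test data, and the adjustable point-hex count anticipates the relaxation for less accessible hexagons discussed next.

```python
# A sketch of the geographic threshold: for one component type, at least
# four point-hexes must each contain two components of that type, at
# least one of which is negative.
from collections import defaultdict

def meets_geographic_threshold(components, kind, required_point_hexes=4):
    by_point_hex = defaultdict(list)
    for c in components:
        if c["kind"] == kind:
            by_point_hex[c["point_hex"]].append(c)
    qualifying = sum(
        1 for group in by_point_hex.values()
        if len(group) >= 2 and any(c["negative"] for c in group)
    )
    return qualifying >= required_point_hexes

# The hexagon satisfies the threshold if either component type does:
# meets_geographic_threshold(components, "download") or
# meets_geographic_threshold(components, "upload")
```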
47. Consistent with the Commission's direction to consider (among other factors) "whether the tests were conducted in urban or rural areas" when setting the methodology for aggregating speed test results, we will adjust the geographic threshold to account for differences among areas. Specifically, we adopt a different geographic threshold depending on the road density of each resolution 8 hexagon. We will relax the geographic threshold to require tests in fewer than four point-hexes when fewer than four of the point-hexes of a resolution 8 hexagon are "accessible." We define an "accessible" point-hex as one in which the provider reports coverage for at least 50% of the area of the point-hex in its reported coverage data and through which at least one road traverses. Using the most recent U.S. Census Bureau roadway data, a point-hex contains a road if it overlaps any primary, secondary, or local road, defined as Master Address File/Topologically Integrated Geocoding and Referencing (MAF/TIGER) Feature Class Codes S1100, S1200, and S1400, respectively. In order to account for road width, we will apply a small buffer around the U.S. Census Bureau road line data. No entities commented on this definition. We choose 50% of the area of the point-hex to be within the provider's reported coverage because we want challengers to have a high likelihood of being within the coverage map when they test. We note that challengers can still test within a point-hex that is not "accessible" so long as the test falls within the provider's reported coverage. We settle on this definition of "accessible" because, without a road, it becomes significantly more difficult for parties to run speed tests in a point-hex. We find that the existence of at least one road gives parties a way to access a hexagon and run speed tests. We anticipate that this approach will make it easier for challengers to establish a challenge in less densely populated areas because challengers will be permitted to show less geographic diversity among tests if there are fewer accessible point-hexes in a resolution 8 hexagon.
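The accessibility test could be sketched as follows; the use of shapely for the geometry, the degree-space area computation, and the buffer width are all simplifying assumptions rather than rule values.

```python
# A sketch of the "accessible" point-hex test: at least 50% of the
# point-hex area inside reported coverage, and at least one buffered
# S1100/S1200/S1400 road line crossing it. Geometry is handled in raw
# degree space for brevity, which is a simplification.
import h3
from shapely.geometry import Polygon

def is_accessible(point_hex, coverage_poly, road_lines, buffer_deg=1e-4):
    boundary = [(lng, lat) for lat, lng in h3.cell_to_boundary(point_hex)]
    hex_poly = Polygon(boundary)
    covered = hex_poly.intersection(coverage_poly).area / hex_poly.area
    has_road = any(r.buffer(buffer_deg).intersects(hex_poly) for r in road_lines)
    return covered >= 0.5 and has_road
```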
48. We decline to adopt Vermont DPS's proposal to eliminate the requirement that four of the seven point-hexes within a resolution 8 hexagon meet the geographic threshold. Requiring a challenge to meet the geographic threshold in four of seven point-hexes ensures geographic diversity of tests and will help identify potential coverage gaps over a sufficiently wide area. Vermont DPS does not propose any alternative geographic threshold, and the record supports our conclusion that the geographic threshold is necessary to minimize the chance of anomalous results. We also reject RWA's proposal to reduce the geographic threshold for inaccessible resolution 7 hexagons or to allow a resolution 7 hexagon with low road density to automatically trigger a challenge. We believe the two approaches we adopt—(1) reducing the geographic threshold for resolution 8 hexagons with low road density, and (2) allowing a "parent" or "grandparent" hexagon (i.e., a hexagon at resolution 7 or 6) to be challenged if at least four of its child hexagons are considered challenged—adequately address RWA's concerns. For example, a resolution 7 hexagon that does not contain any roads comprises seven resolution 8 hexagons that also do not contain roads. A challenger therefore would not need to meet the geographic threshold in any of the resolution 8 hexagons if none of the point-hexes contain roads. Moreover, if a challenger runs tests meeting the temporal and testing thresholds in four of those resolution 8 hexagons and such tests show inadequate coverage sufficient to create a challenge, then the entire resolution 7 hexagon will be considered challenged. Thus, while our approach does require challengers to meet the temporal and testing thresholds in a resolution 8 hexagon that has no accessible point-hexes, the tests do not need to be geographically diverse within each resolution 8 hexagon. We believe such a trade-off is reasonable to challenge a large geographic area.
49. We also adopt a modified version of our proposed temporal threshold. To meet the temporal threshold under the approach we adopt, each resolution 8 hexagon cell must include a set of two negative components of the same type (upload or download) with a time-of-day at least four hours different from two other negative components of the same type, regardless of the date of the tests. In other words, if the negative tests within the hexagon were ordered chronologically by time of day, regardless of the day of the tests, the difference in time between the first two tests and the last two tests must be at least four hours. The temporal threshold is evaluated across all tests within the resolution 8 hexagon and need not be met for each point-hex within the hexagon. That is, the earliest two negative tests and the latest two negative tests can be recorded in different point-hexes and still meet the temporal threshold so long as the difference in time between the two pairs of tests is at least four hours. Accordingly, because the geographic threshold for a fully accessible resolution 8 hexagon requires at least eight negative tests (i.e., two each in four of the hexagon's point-hexes) whereas the temporal threshold could be met using only four of those tests (located in any of the point-hexes), the temporal threshold would not necessarily require the challenger(s) to conduct additional testing. This threshold differs from that proposed in the BDC Mobile Technical Requirements Proposed Rules in that we now require two sets of negative tests to be temporally diverse, rather than one negative test being temporally diverse from one other test. T-Mobile supports the adoption of the temporal threshold proposed in the BDC Mobile Technical Requirements Proposed Rules, and we believe our modified approach is consistent with the concepts for which T-Mobile expresses support. Verizon and AT&T generally support a temporal threshold and agree with our determination that temporal diversity is important, but we decline to adopt their proposal to categorize tests into specific four-hour ranges. We disagree that categorizing tests into specific time ranges would ensure temporal diversity. For example, Verizon and AT&T's proposal could allow a challenger to satisfy the temporal threshold with tests conducted within a very short timeframe. However, in light of Verizon's concerns with our initial proposal, we find that multiple tests separated by four hours, rather than one test at each end of a minimum four-hour period, are needed to show temporal diversity, and we thus modify our approach to ensure temporal diversity across several tests.
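The following sketch captures the temporal test as described; times are expressed as hours since midnight, and midnight wrap-around is ignored as a simplifying assumption.

```python
# A sketch of the temporal threshold: sorting the negative components of
# one type by time of day (date ignored), the second-earliest and the
# second-latest must be at least four hours apart, placing two tests at
# each end of the window.
def meets_temporal_threshold(negative_times_hours, min_gap_hours=4.0):
    t = sorted(negative_times_hours)
    return len(t) >= 4 and (t[-2] - t[1]) >= min_gap_hours

meets_temporal_threshold([8.0, 8.25, 13.5, 14.0])  # True: 13.5 - 8.25 > 4
meets_temporal_threshold([8.0, 8.25, 11.0, 14.0])  # False: 11.0 - 8.25 < 4
```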
50. We are also unpersuaded by Vermont DPS's argument that we should not adopt the temporal threshold because it would require a challenger to drive test a road twice, and by CPUC's argument that the temporal threshold would significantly increase costs for challengers. We believe that the effort required to meet the temporal threshold is outweighed by the need to collect a representative sample of a mobile service provider's coverage, particularly since our decision to combine challenge data from consumers, governments, and other entities in a given area will help minimize burdens on challengers and limit the number of drive tests any one challenger will need to conduct. We conclude that our approach is a reasonable solution that will ensure challengers demonstrate persistent inadequate coverage while accounting for the temporal variability of mobile networks, such as variability due to cell loading.
51. Finally, we adopt a modified version of the proposed testing threshold to require that there be at least five negative test components of the same type (upload or download) within the resolution 8 hexagon when 20 or fewer total challenge test components of that type have been submitted. Consistent with the approach originally proposed, when challengers have submitted more than 20 test components of the same type in a hexagon, we will require that a certain minimum percentage of the total number of test components of that type in the hexagon be negative, ranging from at least 24% negative when challengers have submitted between 21 and 29 total tests, to at least 16% negative when challengers have submitted 100 or more tests. Once the percentage of negative test components of the same type meets the minimum negative percentage required (for example, for a sample of fewer than 21 tests, once there are at least five negative tests submitted), we will not require additional tests so long as both the geographic and temporal thresholds for the resolution 8 hexagon have been met. The failure rates we adopt were chosen to demonstrate that coverage does not reach a 90% probability threshold. We find this 90% threshold reasonable because most speed tests will be taken within the provider's cell (rather than solely at the edge of the cell), where the cell area probability should be greater than the modeled cell edge probability of 90%; to simplify the process, we will use the 90% threshold for tests conducted anywhere in the cell. To avoid the risk that the testing threshold would be skewed by a disproportionate number of tests occurring in one location within a resolution 8 hexagon, however, we adopt a modified approach: if the test components of the same type in a single point-hex represent more than 50% of the total test components in the resolution 8 hexagon (where there are four or more accessible point-hexes in the hexagon), the test components in that point-hex will count only toward meeting 50% of the testing threshold. In a resolution 8 hexagon where there are only three accessible point-hexes, if the test components in one point-hex represent more than 75% of the total test components in the hexagon where the geographic threshold is otherwise satisfied, the test components in that point-hex will count only toward 75% of the testing threshold. If fewer than three point-hexes are accessible, we will not apply a maximum percentage of total test components for a single point-hex, as the risk that testing would be skewed by a disproportionate number of tests occurring in a single location is reduced. We believe that these changes mitigate the potential bias resulting from a disproportionate number of tests occurring in one point-hex, and that this revised testing threshold will result in a greater variety of tests within each resolution 8 hexagon.
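A hedged sketch of the testing threshold follows. The 24% and 16% endpoints come from the text above; the intermediate schedule and the point-hex down-weighting arithmetic are set out in the Order's technical appendix, so the middle value here is a labeled placeholder and down-weighting is omitted.

```python
# A sketch of the testing threshold for one component type within a
# resolution 8 hexagon. The 30-99 sample value is a placeholder
# assumption; the published schedule is in the technical appendix.
def required_negative_fraction(n_total):
    if n_total <= 29:
        return 0.24
    if n_total >= 100:
        return 0.16
    return 0.20  # placeholder for the published 30-99 schedule

def meets_testing_threshold(n_total, n_negative):
    if n_total <= 20:
        return n_negative >= 5        # absolute minimum for small samples
    return n_negative / n_total >= required_negative_fraction(n_total)
```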
52. Verizon, CTIA, and T-Mobile generally support the adoption of a testing threshold. Verizon supports evaluating challenges based on the percentage of tests in a cell that are below the relevant speed threshold, but expresses concern that the Commission's geographic threshold "would allow cognizable challenges even if substantially all of the negative tests are in a single point-hex." The modified approach we adopt mitigates the potential problems Verizon raises because the Commission will adjust the testing threshold when a disproportionate number of tests occur in the same point-hex. T-Mobile contends that staff should adjudicate challenges based on a threshold number and percentage of "negative" tests, with a minimum of five tests for each resolution 10 hexagonal cell and at least 50% of those negative. We decline to adopt T-Mobile's alternative proposal because, as discussed above, we believe resolution 10 hexagons are too small for the challenge process. We also find that T-Mobile's proposal to require that 50% of tests be negative, regardless of the number of tests run, would place a high burden on challengers and could diminish legitimate indications that coverage is unavailable in particular areas. In contrast, the thresholds for the percentage of negative tests we adopt are based on the statistical significance necessary to demonstrate lack of coverage. We also decline to adopt Vermont DPS's proposal to allow a single test, or a maximum of two tests, to be used to show inadequate coverage at multiple locations within a resolution 8 hexagon. Vermont DPS's argument that the geographic and testing thresholds effectively prevent drive testing assumes that a challenger should be able to run all of the tests necessary to meet each threshold on a single drive through a resolution 8 hexagon; if challengers find that they must drive at a slow pace to run an in-vehicle test within a resolution 8 hexagon, they may periodically stop to run tests in a stationary manner before moving on to the next resolution 8 hexagon. We anticipate that government and other third-party testers can use software that overlays the H3 indexing system and/or providers' published maps on a drive test map and may therefore know whether they are staying within a hexagon or moving into another one while conducting a test. We note, however, that this may not be necessary since we will be combining challenges from consumers, governments, and other entities in a given area, which will lessen the number of drive tests any one challenger will need to conduct. For this same reason, we disagree with the CPUC that the testing threshold will be extremely expensive and require complicated coordination of efforts. As discussed, we will aggregate challenges from multiple sources, and no one entity will be required to conduct all tests needed to challenge a particular geographic area.
53. User-Friendly Challenge Process. AT&T concurs with our assessment that the challenge process we proposed is reasonable and user-friendly and supports the overall framework, including the use of the H3 geospatial indexing system. In addition, CTIA, T-Mobile, and AT&T agree that the proposal to combine test data from consumers, governments, and other entities is user-friendly and reduces burdens on challengers, who will not be required to collect and submit every drive test needed to sustain a challenge on their own. Although Public Knowledge/New America raise concerns about whether the challenge process is sufficiently user-friendly, they share our belief that the challenge process should be as streamlined and burden-free as possible for consumers and other entities; we note that our implementation of the consumer challenge process is consistent with the Third Order's determination that challengers will collect and submit all speed test data needed to support a challenge, including the new speed test metrics and parameters we adopt, through the FCC Speed Test app or another app approved by OET to collect and submit challenge data to the Commission.
54. We disagree with commenters that argue that our challenge process is not "user-friendly." RWA argues that the testing process is not "user-friendly" because consumers can test only the networks their handsets are authorized to use. It recommends requiring providers to allow tests by other networks' subscribers. The Commission has already determined that consumer challengers must submit certain identifying information, including that they are a subscriber or authorized user of the provider being challenged, to deter frivolous filings, and the Bureau and Offices were not delegated authority to change this requirement. Similarly, Vermont DPS recommends requiring providers to temporarily provide approved devices with post-paid service at no or reduced cost to governmental entities wishing to engage in a challenge. We decline to adopt Vermont DPS's request because we lack the authority to subsidize government challenges and believe it would be too burdensome to require providers to establish and bear the costs of such programs. Enablers argues (and Public Knowledge/New America agree) that "testing parameters that amount to an exceedingly high burden of proof for consumers and other parties" run "contrary to the Broadband DATA Act and [the Commission's] own policy goals." Public Knowledge/New America accordingly encourage the Bureau and Offices to consider "allow[ing] the option to use other trusted sources to challenge providers' claims." The Precision Ag Connectivity & Accuracy Stakeholder Alliance (PAgCASA) similarly claims that the proposed challenge process "delineates a series of technical and non-technical steps [m]obile customers must initiate and successfully navigate when conducting their [c]hallenge process that . . . falls well short of being easy to use from a customer's perspective." These commenters raise many issues that were already decided in the Third Order (e.g., subscriber certifications and testing methodology and metrics) and were not delegated to the Bureau and Offices, or urge the Bureau and Offices to ignore the instructions given by the Commission. We reject these arguments as untimely because, to the extent that they raise issues already decided by the full Commission, they should have been filed as petitions for reconsideration. Under section 405(a) of the Communications Act of 1934, as amended, any party in a proceeding may file a petition for reconsideration within thirty days of public notice of the decision. The Third Order was published in the Federal Register on April 7, 2021, which means that the deadline for filing a petition for reconsideration was May 7, 2021. Because these commenters did not file their comments until September 2021, the Bureau and Offices find their arguments untimely.
55. In conclusion, while the challenge processes and methodologies we adopt are by necessity detailed and technical, so as to ensure that accurate and rigorous measurements are supplied to challenge providers' claimed broadband coverage, the Commission and the Bureau and Offices have minimized the burdens placed on challengers by providing a user-friendly means for challengers to run speed tests using their mobile devices and submit all data via either the FCC Speed Test app or another OET-approved third-party app. As discussed, the Bureau and Offices were instructed to implement a number of complex tasks, among them developing thresholds for determining when a cognizable challenge has been established, establishing a procedure for resolving challenges, and adopting additional testing requirements if necessary. These obligations were delegated by the Commission within the context of the Broadband DATA Act, which requires the Commission to consider user-friendly challenge submission formats and ways to reduce the time and expense burdens on consumers submitting challenges and providers responding to them, while at the same time considering lessons learned from the challenge process established under Mobility Fund Phase II and the costs to consumers and providers resulting from a misallocation of funds because of reliance on outdated and inaccurate maps. Indeed, financial assistance for underserved areas may, in the future, be based on updated Commission maps. Therefore, we find that the processes we adopt strike an appropriate balance, within the authority delegated to us by the Commission, to ensure that the challenge process is easy to use and accessible for consumers, governments, and other entities, and also results in high-quality challenges that will accurately correct any errors in providers' reported coverage maps.
2. Challenge Responses
56. Notification of Challenges. We adopt the BDC Mobile Technical Requirements Proposed Rules' proposed procedures for notifying service providers of cognizable challenges filed against them and for notifying challengers and providers of the results of challenges. The BDC Mobile Technical Requirements Proposed Rules proposed that challenged mobile service providers be notified via the online portal at the end of each calendar month of the hexagons that are subject to cognizable challenges. CTIA and T-Mobile express support for our proposal. We find this approach will help create a manageable process for providers by giving them a standard set of deadlines rather than an erratic and potentially unpredictable set of innumerable deadlines for rebuttals that begin as soon as any given discrete area becomes challenged. We also adopt our proposal for mobile service providers and challengers to be notified monthly of the status of challenged areas. Parties will be able to see a map of the challenged area and a notification indicating whether a challenge has been successfully rebutted, whether a challenge was successful, and whether a challenged area was restored based on insufficient evidence to sustain the challenge. In the Third Order, the Commission directed that challenge and crowdsource data other than the location that is the subject of the challenge, the name of the provider, and details concerning the basis for the challenge must be kept private to protect challengers' privacy interests. Accordingly, before a service provider receives access to crowdsourced or challenge data, it will be required, within the BDC system, to acknowledge that it will use personally identifiable information that it receives for the sole purpose of responding to the challenge and that it will protect and keep private all such personally identifiable information. Such personally identifiable information may include challenger contact information, device information, and network information, as well as any other personally identifiable information included with the evidence that a challenger submits.
57. Timeframe for Responding to Challenges. In the Third Order, the Commission determined that providers must either submit a rebuttal to a challenge or concede a challenge within 60 days of being notified of the challenge. Consistent with the Third Order, if the challenged provider concedes or fails to submit data sufficient to overturn the challenge within 60 days of notification, it must revise its coverage maps to reflect the lack of coverage in the successfully challenged areas.
58. In comments on the BDC Mobile Technical Requirements Proposed Rules, CCA argues that the Bureau and Offices should allow providers to seek a waiver of the 60-day deadline if the provider needs additional time to submit on-the-ground data due to unforeseen events or weather. Verizon contends that providers should be able to choose to seek either: (1) A waiver of rules that limit the permitted uses of infrastructure data or transmitter monitoring software in lieu of speed tests; or (2) a waiver of the 60-day deadline if the provider will rebut with speed test data. The Commission adopted the requirement that providers submit a rebuttal or concede a challenge in the Third Order based on its determination that permitting 60 days to respond to a challenge would make the challenge process more manageable for providers, while also providing for speedy resolution of challenges consistent with the requirements of the Broadband DATA Act. The Bureau and Offices do not have authority to change the required timeframe for provider responses. To the extent that a provider may wish to seek a waiver of the 60-day deadline for responding to a challenge in any individual case, it may do so under the Commission's generally applicable waiver rules.
59. Future Challenges in Successfully Rebutted Areas. We adopt our proposal to make any areas where a provider has demonstrated sufficient coverage in a challenged area ineligible for subsequent challenge until the next biannual broadband availability data filing at least six months after the later of either the end of the 60-day response period or the resolution of the challenge. This ineligibility applies only with respect to the particular network technology and modeled environment for which the provider has demonstrated sufficient coverage. We deny Verizon and AT&T's request that the Bureau and Offices make successfully rebutted areas exempt from future challenges for a period of three years. We find that preventing subsequent challenges for a period as long as three years could result in less accurate maps due to changes over time in technology and coverage. We find instead that limiting subsequent challenges for at least six months after the resolution of the challenge strikes an appropriate balance between avoiding a requirement that providers repeatedly confirm the same areas and ensuring that challengers have the opportunity to submit data regarding changed conditions. Although commenters assert that it is unlikely that coverage will be reduced in an area that was subject to challenge, an area that is subject to repeated cognizable challenges may highlight that significant technical issues continue to affect the availability of broadband service in that area. Permitting subsequent challenges in these areas will help ensure that the Commission receives the most accurate and up-to-date coverage data reflecting consumers' on-the-ground experience. In any area in which a provider does not overturn the challenge but which is otherwise no longer considered challenged (e.g., where, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that area but the provider's data fall short of confirming coverage in the area), the coverage area will be restored to its pre-challenge status and will be eligible for future challenges.
a. Rebutting Challenges With On-the-Ground Data
60. We adopt our proposal from the BDC Mobile Technical Requirements Proposed Rules that, when a challenged mobile service provider submits on-the-ground speed test data to rebut a challenge, the provider will be required to meet thresholds analogous to those required of challengers, adjusted to reflect the burden on providers to demonstrate that sufficient coverage exists at least 90% of the time in the challenged hexagon(s). Consistent with our proposal, the on-the-ground test data that providers submit must meet the same three thresholds required of challenger tests for both the upload and download components: (1) A geographic threshold; (2) a temporal threshold; and (3) a testing threshold, albeit with different values (i.e., the number of tests and percentages) for each threshold.
61. For the geographic threshold, the provider will need to meet the same geographic threshold required of challengers, but with positive test components rather than negative test components. To demonstrate that adequate coverage occurs at multiple locations within the resolution 8 hexagon, at least four point-hexes of the hexagon must include two download test components taken within them, at least one of which must be positive, and at least four point-hexes must include two upload test components taken within them, at least one of which must be positive. We adopt a modified version of our proposed temporal threshold. To meet the temporal threshold under the approach we adopt, each resolution 8 hexagon will need to include a set of five positive download components with a time-of-day difference of at least four hours from another set of five positive download components, regardless of the date of the test, and a set of five positive upload components with a time-of-day difference of at least four hours from another set of five positive upload components, regardless of the date of the test. We modify the threshold proposed in the BDC Mobile Technical Requirements Proposed Rules because we find that requiring more tests to be separated in time will help ensure that there is more consistent temporal diversity across several tests. For the testing threshold, we adopt our proposal that challenged providers must demonstrate statistically significant evidence that coverage is adequate to overturn a challenge using on-the-ground speed tests, based on the same statistical significance analysis used for determining challenges for both upload and download components. Specifically, in order for the testing threshold for a resolution 8 hexagon to be met, we require that at least 17 positive test components of the same type have been taken in the hexagon when the provider has submitted 20 or fewer test components of that type. When the provider has submitted more than 20 test components of the same type, we require that a certain minimum percentage of the total number of test components of that type in the hexagon be positive, ranging from at least 82% positive, when providers have submitted between 21 and 34 total test components of the same type, to at least 88% positive, when providers have submitted 100 or more test components of the same type. The positive test rates we adopt were chosen to demonstrate that coverage does reach a 90% probability threshold, as opposed to the requirement that challengers demonstrate coverage does not reach a 90% probability threshold. Additionally, in line with the modification we adopt for challengers, if more than 50% of the test components of the same type are within a single point-hex where four or more point-hexes in the resolution 8 hexagon are accessible, the test components in that point-hex will be down-weighted to account for only 50% of the total test components when evaluating the testing threshold. If more than 75% of the tests are within one point-hex where three point-hexes in the resolution 8 hexagon are accessible, the tests in that point-hex will be down-weighted to account for only 75% of the total tests when evaluating the testing threshold. By limiting the percentage of test components within any one point-hex that may contribute to a challenge response, this requirement will help ensure that there is sufficient diversity in the test data that a challenged provider submits.
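For illustration only, the following Python sketch applies the testing threshold and point-hex down-weighting just described to a single resolution 8 hexagon. The percentage schedule between 35 and 99 test components and the exact down-weighting formula are specified in the Technical Appendix, so the linear ramp and the proportional re-scaling used here are placeholder assumptions, not the adopted values.

    from collections import Counter

    def required_positives(n):
        # Endpoints fixed by this Order: at least 17 positives when n <= 20;
        # at least 82% when 21 <= n <= 34; at least 88% when n >= 100. The
        # ramp between 35 and 99 is a placeholder; the full schedule is in
        # the Technical Appendix.
        if n <= 20:
            return 17.0
        if n <= 34:
            rate = 0.82
        elif n >= 100:
            rate = 0.88
        else:
            rate = 0.82 + 0.06 * (n - 34) / 66.0
        return rate * n

    def component_weights(components, accessible_point_hexes):
        # components: list of (point_hex_id, is_positive) pairs for one test
        # type (upload or download). One plausible reading of the cap: a
        # point-hex holding more than the cap share of components is scaled
        # so that it contributes exactly cap * total component-equivalents.
        if accessible_point_hexes >= 4:
            cap = 0.50
        elif accessible_point_hexes == 3:
            cap = 0.75
        else:
            cap = None
        total = len(components)
        counts = Counter(ph for ph, _ in components)
        scale = {ph: (cap * total / n if cap is not None and n / total > cap else 1.0)
                 for ph, n in counts.items()}
        return [scale[ph] for ph, _ in components]

    def meets_testing_threshold(components, accessible_point_hexes):
        weights = component_weights(components, accessible_point_hexes)
        positives = sum(w for (_, pos), w in zip(components, weights) if pos)
        return positives >= required_positives(len(components))

    # 25 download components across four point-hexes, 21 of them positive:
    # 21/25 = 84%, which clears the 82% floor for 21-34 components.
    tests = ([("a", True)] * 8 + [("b", True)] * 7 +
             [("c", True)] * 6 + [("d", False)] * 4)
    print(meets_testing_threshold(tests, accessible_point_hexes=4))  # True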
A provider may also demonstrate sufficient coverage in a resolution 8 hexagon that was not challenged in order to rebut a challenge to a lower-resolution hexagon containing the non-challenged resolution 8 hexagon (i.e., the “parent” resolution 7 hexagon or “grandparent” resolution 6 hexagon). As discussed more fully in Section 3.2.4 of Appendix A—Technical Appendix (available at https://www.fcc.gov/document/fcc-releases-bdc-mobile-technical-requirements-order), for challenged hexagons at resolution 7 or 6, if the provider submits response data sufficient to demonstrate coverage in the hexagon's child hexagons such that fewer than four child hexagons would still be challenged, then the resolution 7 or 6 hexagon would no longer be challenged even if sufficient data were not submitted to rebut a challenge for the remaining child hexagons. In analyzing challenges, staff may consider other relevant data submitted by providers, request additional information from the challenged provider, and take other actions as may be necessary to ensure the reliability and accuracy of rebuttal data. These actions may include rejecting speed tests or requiring additional testing.
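The child-hexagon rule lends itself to a short illustration. The sketch below (with hypothetical hexagon IDs; a production system would enumerate children using the H3 library) applies the fewer-than-four-challenged-children test after rebuttal data are processed.

    def parent_remains_challenged(child_is_challenged):
        # child_is_challenged maps each resolution 8 child hexagon ID to
        # True if it is still challenged once the provider's rebuttal data
        # are applied. Under the rule above, the parent resolution 7 (or,
        # one level up, resolution 6) hexagon stays challenged only while
        # four or more of its children remain challenged.
        return sum(child_is_challenged.values()) >= 4

    # A resolution 7 hexagon has seven resolution 8 children; rebuttal data
    # clear four of them, so only three remain challenged.
    children = {"hex-%d" % i: (i < 3) for i in range(7)}
    print(parent_remains_challenged(children))  # False: challenge dissolves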
62. In the BDC Mobile Technical Requirements Proposed Rules, we proposed to require providers to collect on-the-ground test data using mobile devices running either a Commission-developed app (e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software if approved by staff. T-Mobile urges the Bureau and Offices to allow providers to use their own software tools to rebut challenges without seeking prior staff approval. If approval is needed, T-Mobile argues, then OET should commit to approving or rejecting such tools within 90 days of submission. Our proposal to require approval of testing software used by providers was based on the Third Order's direction to the Bureau and Offices to approve the equipment that providers may use to conduct on-the-ground testing to respond to verification inquiries, combined with the Commission's determination that providers rebutting challenges with on-the-ground test data would be subject to the same requirements and specifications that apply to providers submitting data in response to a Commission verification request. T-Mobile also asks the Commission to “ensure the process for submitting and responding to challengers is user friendly” by making the challenge portal “compatible with widely used database software like Salesforce.” We decline to adopt a requirement that the portal be compatible with specific types of software. However, we take other steps to provide flexibility for providers in responding to challenges, including, as described in more detail below, allowing them to use their own software tools to gather on-the-ground test data. We also anticipate that service providers and other entities will be able to build their own tools and integrate their own software and databases with the BDC system using a modern web-based Application Programming Interface (API).
63. While we continue to read these provisions as requiring the Bureau and Offices to approve any software tools providers may use to gather on-the-ground test data, we clarify that, to the extent that a provider chooses to use software other than the FCC Speed Test app or another speed test app approved by OET for use in the challenge process, we will consider such software approved for use in rebutting challenges provided that the software incorporates the test methodology and collects the metrics that approved apps must gather for consumer challenges and that government and third-party entity challenger speed test data must contain. We understand that certain technical network information and RF metrics that we would otherwise require are not currently available on Apple iOS devices. Therefore, until such information and metrics are available on iOS devices, and the Bureau and Offices indicate that they will collect such information from iOS devices, providers must collect all of the required technical network information and RF metrics using a device that is able to interface with drive test software and/or runs the Android operating system. We also require providers conducting in-vehicle mobile tests (i.e., drive tests) to conduct such tests with the antenna located inside the vehicle. We disagree with Verizon that providers should be able to choose whether or not to use an external antenna when conducting speed tests. Because most consumers will take in-vehicle tests using an antenna inside the vehicle, adopting this requirement for providers will help minimize discrepancies and ensure more consistent comparisons between on-the-ground test data supplied by challengers and data supplied by providers.
64. In order to inform our approval process, and consistent with the requirement that applies to government and other entity challengers who choose to use their own software when submitting challenges, we require providers who choose to use their own software to submit a complete description of the methodologies used to collect their data and to substantiate their data through the certification of a qualified engineer or official. Permitting providers to use their own tools is consistent with the approach the Commission adopted for government and other entity challengers in collecting challenge data, and it is preferable to requiring prior approval of specific software tools because it will help streamline the challenge process by reducing the potential for delays that such prior review might cause. It also will provide greater flexibility and reduce burdens on providers by allowing them to more easily use the software tools they may already be using in the ordinary course of their business.
65. We recognize that this approach differs from the approach we have adopted for third-party speed test apps, for which we require OET approval before such apps may be used in the challenge process. We find, however, that the difference in treatment is justified and warranted. Mobile broadband service providers routinely test and monitor network performance as they develop their networks, and their software has been engineered specifically to obtain detailed speed test measurement data. Providers' software is unlikely to be constrained by limitations in the categories of data that can be collected; in contrast, and as discussed above, consumer-facing third-party apps (particularly apps run over iOS) cannot provide certain categories of information. We require approval for third-party speed test apps because we want to ensure that the apps measure coverage as accurately as possible and report information into the BDC system with the required certifications and in a useable format. In addition, requiring approval is necessary to hold third-party app developers accountable for the accuracy and reliability of their tools and to allow us to inform consumers of the available third-party apps that meet our requirements and are approved for use in the challenge and crowdsource processes. In contrast, the Commission has greater jurisdiction over service providers, as providers are required under the Broadband DATA Act to ensure the accuracy of the coverage information they submit to the Commission. Permitting providers to use these existing performance measurement tools without individualized review and approval will help increase efficiency while continuing to ensure that the Commission receives high-quality data that allow an apples-to-apples comparison between challenge data submitted by consumers and other entities and data supplied by providers using their own software. While we expect that this approach will benefit our administration of the challenge process, we retain the discretion to require prior approval of providers' software or to make changes to the required metrics via notice and comment at a later time. We also retain discretion to revoke the automatic grant of approval in instances where a provider's software is found to be unreliable or otherwise inconsistent with our objective of ensuring accurate mapping data.
66. We decline T-Mobile's request that we “adopt a 90-day 'expiration' date for challenge data” and instead adopt our proposal to make on-the-ground test data valid for one year from the test date. The process we adopt for submission of challenges ensures that providers have sufficient details to respond to challenges, including the dates and times of speed tests. Moreover, to the extent a provider improves its network coverage in an area, it can either remove the area from its current data and add it back with its next biannual submission or rebut a challenge by submitting on-the-ground test data demonstrating network performance in the recently deployed area. We find that these alternatives strike a better balance in facilitating robust participation in the challenge process and ensuring high-quality data than requests to curtail the lifespan of valid challenge data.
b. Rebutting Challenges With Infrastructure Data
67. Under the rules adopted in the Third Order, providers may respond to challenges with infrastructure data rather than (or in addition to) on-the-ground speed test data. In cases where a challenged mobile service provider chooses to submit infrastructure data to respond to a challenge, we adopt our proposal to require the provider to submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request, including information on the cell sites and antennas used to provide service in the challenged area. In the Third Order, the Commission directed OEA and WTB to provide guidance on the types of data that will likely be more probative in validating broadband availability data submitted by mobile service providers in different circumstances. In the BDC Mobile Technical Requirements Proposed Rules, we proposed to use infrastructure data, on their own, to adjudicate challenges in a limited set of circumstances. Specifically, we proposed that a challenged provider may use infrastructure data to identify tests within challenger speed test data that the provider claims are invalid or non-representative of network performance, and we proposed four circumstances under which a provider could claim a speed test was invalid or non-representative. In response, CCA argues that providers should not be permitted to respond to a challenge with only infrastructure data because such data are predictive and are not as reliable as on-the-ground test data. CTIA and Verizon, by contrast, argue that the Bureau and Offices lack delegated authority to impose any limitation on providers' ability to submit infrastructure data to respond to challenges.
68. We find that our proposed approach strikes the best balance between providing flexibility for providers and ensuring that they respond to challenges with probative data. We continue to view data that reflect actual on-the-ground tests, as opposed to infrastructure data, as generally reflecting user experience more accurately and therefore as having more probative value in most—but not all—circumstances. We disagree with CTIA and Verizon's argument that the Commission's decision to permit providers to respond with infrastructure data precludes us from adopting rules governing the circumstances under which such data can be used, on their own, to respond to challenges. While the Commission directed providers to “submit to the Commission either on-the-ground test data or infrastructure data, so that Commission staff can examine the provider's coverage in the challenged area and resolve the challenge,” it also “directed OEA and WTB to develop the specific requirements and methodologies that providers must use in conducting on-the-ground testing and in providing infrastructure data” and “direct[ed] OEA and WTB to provide guidance about what types of data will likely be more probative in different circumstances.” The Commission also found that “if needed to ensure adequate review, OEA may also require that the provider submit other data in addition to the data initially submitted, including but not limited to, either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider).” Defining the circumstances under which infrastructure data, on their own, may be used to rebut a challenge is consistent with these delegations of authority and offers guidance to providers about when the Commission will find infrastructure data to be as probative as on-the-ground test data, as well as when such data are likely to be sufficient to resolve a challenge.
69. We also disagree with Verizon that requiring a challenged provider to submit infrastructure data in cases where there may be other forms of evidence that can rebut a challenge is “unnecessarily burdensome.” In the Third Order, the Commission determined that providers may rebut a challenge by submitting to the Commission on-the-ground test data and/or infrastructure data, so that Commission staff can examine the provider's coverage in the challenged area and resolve the challenge, and may optionally include additional data or information in support of a response. The Bureau and Offices do not have the authority to change the Commission's decision or permit challenge responses that do not include either on-the-ground test data and/or infrastructure data.
70. While we adopt our proposal to use infrastructure data, on their own, to resolve challenges in a limited set of circumstances, we agree with commenters that providing additional flexibility will help providers submit responses efficiently. Therefore, we add to the list of circumstances in which we will accept infrastructure data, on their own, to respond to a challenge. In the circumstances listed below, we find that infrastructure information will likely be as probative as on-the-ground test data, and a provider may therefore submit infrastructure data, on their own, in response to a challenge to invalidate speed tests submitted by challengers. We disagree with CCA that the circumstances for submitting infrastructure data are not defined sufficiently and risk increasing burdens on challengers. We expect the circumstances outlined below to occur rarely, and providers, not challengers, must demonstrate that one of these circumstances exists when responding to a challenge solely with infrastructure data.
71. First, we find that infrastructure information will likely be of comparable probative value when extenuating circumstances at the time and location of a given test (e.g., maintenance or a temporary outage at the cell site) caused service to be abnormal. In such cases, we adopt our proposal for providers to submit coverage or footprint data for the site or sectors that were affected and information about the outage, such as the bands affected, its duration, and whether the outage was reported to the FCC's Network Outage Reporting System (NORS), along with a certification about the submission's accuracy. We will then remove measurements in the reported footprint in the relevant band(s) made during the outage and, as appropriate, recalculate the statistics.
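As a rough sketch of how such an outage claim could be applied mechanically, tests on an affected band, inside the reported footprint, and during the certified outage window would be removed before the statistics are recalculated. Field names are hypothetical, and the point-in-footprint determination is assumed to be computed elsewhere.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class SpeedTest:
        band: str
        timestamp: datetime
        in_reported_footprint: bool  # assumed precomputed point-in-polygon result

    @dataclass
    class Outage:
        bands: set
        start: datetime
        end: datetime

    def drop_outage_tests(tests, outage):
        # Keep only tests unaffected by the certified outage; the survivors
        # feed the recalculation of the challenge statistics.
        return [t for t in tests
                if not (t.in_reported_footprint
                        and t.band in outage.bands
                        and outage.start <= t.timestamp <= outage.end)]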
72. Second, we find that infrastructure data will likely be of comparable probative value when the mobile device(s) with which the challenger(s) conducted their speed tests are not capable of using or connecting to the radio technology or spectrum band(s) that the provider models as required for service in the challenged area. In such cases, we adopt our proposal for providers to submit band-specific coverage footprints and information about which specific challengers' device(s) lack the band or technology. We will then remove measurements from the listed devices in the relevant coverage footprint and recalculate the statistics.
73. Third, we find that infrastructure data will likely be of comparable probative value when speed tests were taken during an uncommon special event (e.g., a professional sporting event or concert) that increased traffic on the network. As we previously stated, we recognize that in such cases mobile service providers would not have the same throughput they would in normal circumstances, given the high volume of traffic on networks during these types of uncommon special events, so demonstrating the existence of coverage in the area by submitting infrastructure information would persuasively explain why speed tests were negative in such a scenario.
74. Fourth, we find that infrastructure data will likely be of comparable probative value when speed tests were taken during a period in which cell loading was abnormally higher than the modeled cell loading factor. Speed tests taken during a period when cell loading is higher than usual can result in negative speed tests, and we thus anticipate that infrastructure information will be useful to remove the tests and recalculate the statistics for challenges in this situation. In such cases, we adopt our proposal to require providers to corroborate their claims by submitting cell loading data, and we clarify that these data must both (a) establish that the cell loading for the primary cell(s) at the time of the tests was abnormally higher than modeled, and (b) include cell loading data for a one-week period before and/or after the provider was notified of the challenge showing, as a baseline, that the median cell loading for the primary cell(s) was not greater than the modeled value (e.g., 50%). To meet this threshold, infrastructure data reporting cell loading at the time of the test would need to show that actual loading was both higher than the modeled cell loading factor (e.g., 50%) and higher than the 75th percentile of the 15-minute interval weekly cell loading data submitted as the baseline. Adopting the 75th percentile requirement ensures that loading at the time of the tests is abnormally high, because it would exceed the loading during the four busiest hours of each day, within the 6:00 a.m. to 10:00 p.m. daily window for submitting challenge speed tests, over the baseline period. These clarifications should help address concerns about the utility of infrastructure data by ensuring that we receive robust evidence, based upon actual cell loading measurements, that higher-than-modeled cell loading at the time of the test is an abnormal occurrence. We also adopt our proposal that, if a high number of challenges show persistent over-loading, staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area.
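A minimal sketch of this two-part loading test, assuming loading is expressed as a fraction of capacity and the baseline is the provider's week of 15-minute samples (672 values for a full week):

    import statistics

    def loading_abnormally_high(loading_at_test, modeled_factor, baseline_15min):
        # Condition (a): actual loading at the time of the test exceeds the
        # modeled cell loading factor (e.g., 0.50).
        # Condition (b): it also exceeds the 75th percentile of the weekly
        # 15-minute baseline samples.
        p75 = statistics.quantiles(baseline_15min, n=4)[2]
        return loading_at_test > modeled_factor and loading_at_test > p75

    # A baseline week hovering near 40% loading; a 78% reading at test time
    # against a 50% modeled factor satisfies both conditions.
    baseline = [0.35 + 0.10 * ((i % 96) / 95.0) for i in range(672)]
    print(loading_abnormally_high(0.78, 0.50, baseline))  # True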
75. Fifth, in response to the record, we find that infrastructure data will likely be of comparable probative value when a mobile device used in testing was on a data plan that could result in slower service. In such cases, providers must submit information about which specific device(s) used in the testing were using a data plan that would have resulted in slower service and information showing that the provider's network did, in fact, slow the device at the time of the test.
76. Sixth, and also in response to the record, we find that infrastructure data will likely be of comparable probative value when a mobile device used in the testing was either roaming or was used by the customer of an MVNO. As adopted above, we will not permit speed tests submitted by customers of an MVNO, or from devices roaming on another provider's network, to be counted as valid tests against the facilities-based provider's network on which the speed test was conducted. As stated above, because the agreements between a facilities-based provider and MVNOs or roaming partners often include limitations on the technology and speed available to, or the network prioritization of, devices used by consumers of the MVNO or roaming partner, we conclude that speed tests from such devices are not reliable evidence about the performance of the facilities-based provider's network. While we anticipate that the majority of such tests will fail our automated validations, there may be circumstances where the BDC system is unable to automatically identify these tests (e.g., identifying whether an iOS device is roaming is not currently possible). In such circumstances, providers must identify, based upon their records, which specific device(s) used in the testing were either roaming at the time or used by the customer of an MVNO.
77. After the provider identifies the speed tests it seeks to invalidate pursuant to one of the six circumstances we adopt above and submits all required infrastructure data in support of this contention, we will remove any invalidated speed tests and recalculate the challenged hexagons. Any challenged hexagons that no longer meet the thresholds required for a challenge would be restored to their status before the cognizable challenge was created. We note that where a provider rebuts a challenge using this process, the challenged hexagons that have been restored to their status before the cognizable challenge was created would continue to be eligible for subsequent challenges.
78. Where a challenged provider does not claim that a challenger's speed tests were invalid based upon one of the six circumstances listed above, Commission staff will consider any additional information submitted by the challenged provider or request additional information from the challenged provider. Such information must include on-the-ground speed test data and may also include other types of data, as specified in the Third Order. Staff will use this information to complete its adjudication of the challenge. Although we adopt the foregoing approach for considering infrastructure information in response to challenges, we note that we may make changes to this approach over time as we gain experience with administering the challenge process.
c. Other Data
79. In the Third Order, the Commission determined that providers may rebut a challenge by submitting to the Commission on-the-ground test data and/or infrastructure data, and may optionally include additional data or information in support of a response, including drive testing data collected in the ordinary course of business, third-party testing data (such as speed test data from Ookla or another speed test app), and/or tower transmitter data collected from transmitter monitoring software. Consistent with the Commission's direction in the Third Order, OEA staff will review such data when voluntarily submitted by providers in response to challenges, and, if any of the data sources are found to be sufficiently reliable, staff will specify appropriate standards and specifications for each type of data and issue a public notice adding the data source to the alternatives available to providers to rebut a consumer challenge.
80. In the BDC Mobile Technical Requirements Proposed Rules, the Bureau and Offices sought comment regarding the conditions under which a provider's transmitter monitoring software can be relied upon by staff in resolving challenges. Commenters did not discuss specific conditions under which transmitter monitoring software should be relied upon, instead expressing general support for the use of such data and encouraging the Commission to develop standards for when such data would be sufficient to rebut a challenge. Based on the record, we find that there is insufficient evidence to determine, at this time, the conditions under which we may rely on transmitter monitoring software data to resolve challenges. Accordingly, we will review such data when voluntarily submitted by providers in response to challenges and, in doing so, we will consider, among other things, the extent to which the transmitter monitoring software data augment or reinforce the probative value of infrastructure or other data to rebut challenger speed test data, how such systems measure the geographic coordinates (longitude and latitude) of the end-user devices, how the data compare to the information collected from on-the-ground testing, and whether such software records instances of end-user devices not being able to connect to the network at all.
81. Several providers filed comments requesting additional flexibility in responding to challenges. They argue that, rather than only being permitted to voluntarily submit other types of data, such as data from field tests conducted in the ordinary course of business or third-party data, in addition to either on-the-ground test data or infrastructure data, providers should be able to submit such data on their own as a response to challenges. The Commission has already addressed requests for additional flexibility in responding to challenges, and the Bureau and Offices do not have authority to change the Commission's determinations. In the Third Order, the Commission considered arguments that providers should have additional flexibility to submit other types of data in responding to challenges, including, among others, drive testing data collected in the ordinary course of business. The Commission recognized the need for flexibility in provider responses, determining that providers may voluntarily submit other types of data beyond on-the-ground testing data or infrastructure data they are required to submit to rebut a challenge, but found that the record did not support a finding that such data were sufficient to serve as a complete substitute for either on-the-ground testing or infrastructure data. The Bureau and Offices do not have the discretion to change the Commission's decision. Although OEA has the delegated authority to adopt new alternatives as a substitute for on-the-ground data or infrastructure data, it can exercise such authority only after reviewing such data submissions, determining that they are sufficiently reliable, and specifying the appropriate standards and specifications for each type of data.
B. Collecting Verification Information From Mobile Providers
82. The Broadband DATA Act requires the Commission to “verify the accuracy and reliability” of the broadband internet access service data providers submit in their biannual BDC filings in accordance with measures established by the Commission. The Commission determined in the Third Order that OEA and WTB “may request and collect verification data from a provider on a case-by-case basis where staff have a credible basis for verifying the provider's coverage data.” In response to such an inquiry, the provider must submit either on-the-ground test data or infrastructure information for the specified area(s). The provider may also submit additional data, including but not limited to, on-the-ground test data or infrastructure data (to the extent such data are not the primary option chosen by the provider), or other types of data that the provider believes support its reported coverage. A mobile service provider has 60 days from the time of the request by OEA and WTB to submit, at the provider's option, on-the-ground or infrastructure data, as well as any additional data that the provider chooses to submit to support its coverage. OEA and WTB may require submission of additional data if such data are needed to complete the verification inquiry. The Commission directed OEA and WTB “to implement this data collection and to adopt the methodologies, data specifications, and formatting requirements that providers shall follow when collecting and reporting [these] data.” The BDC Mobile Technical Requirements Proposed Rules sought comment on processes and methodologies for determining areas subject to verification (i.e., areas where Commission staff have a credible basis for verifying a mobile provider's coverage data) and for the collection of on-the-ground test data and infrastructure information, as well as information from transmitter monitoring systems and other data. Below we discuss and expand on when a credible basis exists for initiating a verification inquiry. Additionally, we adopt approaches for submitting data in response to a verification request and discuss our efforts to balance the needs of this proceeding with the burdens placed on providers in verifying coverage.
1. Area Subject to Verification
83. To identify the portion(s) of a mobile provider's coverage map for which we will require verification data—referred to as the targeted area(s)—we will rely upon all available evidence, including submitted speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). We find this approach allows for needed flexibility while accounting for the relevant data at hand when selecting a targeted area. The adopted approach to the mobile verification process differs from the challenge process and the verification process proposed in the BDC Mobile Technical Requirements Proposed Rules by removing the testing and geographic threshold requirements of the challenge process. This reduces the burden on providers while still allowing for an accurate verification process and is discussed further below.
84. A Credible Basis to Verify a Provider's Coverage Data. We will conduct verification inquiries in areas where we find there is a “credible basis” for such an inquiry, and we will use an evidence-based analysis to determine whether a credible basis exists. The factors we will consider in this analysis include, but are not limited to, the geographic size of the area, the number of tests taken, the reliability of the tests, the parameters of the RF link budgets, infrastructure data accuracy, backhaul, and cell loading factor requirements. As discussed below, staff may also adjust the fade margins of the RF link budgets to calculate new “core coverage” areas using a standard propagation model, which would have a higher probability of coverage. For example, if testing data in an area exhibit an aberration compared to nearby areas and make that area appear as an outlier, this could constitute a credible basis to initiate a verification inquiry for that area. Suppose, for instance, that an area is within a provider's 3G and 4G LTE coverage maps and there are many speed tests in the area on 3G but no tests recorded using 4G LTE from devices that are technologically capable of connecting to a 4G LTE network. This absence of tests on a superior technology would be considered an aberration in an area with many tests. Similarly, if speed tests submitted as challenges are sufficient to create many small, disparate challenges across a much larger area, these may be indicative of a pervasive problem, which could give staff a credible basis for conducting a verification inquiry. Another example where a credible basis could exist is an area where a significant number of speed tests have been submitted as challenges but do not meet the thresholds to create cognizable challenges. A credible basis could also be established for an area without cognizable challenge data but where other available data, such as the results of staff's statistical analysis of crowdsourced data (including, e.g., Kriging spatial-interpolation analysis), indicate that coverage data may be incorrect. Additionally, as discussed further below, once we determine that a “critical mass” of crowdsourced filings indicates a provider's coverage map may be inaccurate, Commission staff have a credible basis for verifying the provider's coverage data in that area. Notwithstanding any of the foregoing, we note that the Commission also retains the right to perform audits of provider submissions at random, even without the existence of a credible basis necessary to trigger a verification inquiry.
85. We believe that the aforementioned examples of the information we will consider, as well as the standards and types of analysis we intend to apply when deciding where to initiate a verification inquiry, provide sufficient guidance on this topic, and we therefore find it unnecessary to adopt additional restraints, as advocated by T-Mobile. Because the Broadband DATA Act gives the Commission the responsibility to “verify the accuracy and reliability of [service providers' biannual coverage data],” it is important that staff have enough discretion to consider whether coverage data are accurate based on a range of factors, including geographic size, on-the-ground tests taken, and the reliability of those tests, according to the particular circumstances of the data presented to us. At the same time, the case-by-case nature of the data received from providers, the challenge process, and the crowdsourced data is sufficient to limit verification requests to areas where there is reason to view coverage as problematic. We believe the approach described here is the most reasonable and effective way to pursue the goals of this proceeding and the Broadband DATA Act. We do not seek to require superfluous information from providers, but if circumstances indicate that additional data or other information are necessary to verify coverage in an area where evidence suggests the coverage is problematic, we have an obligation to verify the data, and, in many cases, additional information will be necessary to verify the area's coverage and carry out the Commission's obligations under the Broadband DATA Act.
86. Multiple commenters express a strong general desire to reduce or minimize the burden placed on providers as a result of the verification process. For instance, Verizon claims that the methods proposed for determining an area subject to verification would create verification areas that are too large. It recommends initially testing the verification process on a smaller scale, such as in rural areas. It also recommends that the Bureau and Offices limit verification requests to one per map submission (and up to two per year) and limit the areas to be sampled in the verification process to three contiguous resolution 6 hexagons. T-Mobile supports focusing verification requests in rural areas and similarly asks that the Bureau and Offices limit such requests, recommending that they cover an area of no more than 10,000 square miles in a given year.
87. We decline to adopt any specific limitations on the basis for initiating verification inquiries or the areas subject to verification, including in instances where a provider is already required to conduct drive testing for other reasons. We likewise decline to adopt a limit on the number of verification inquiries that we initiate for a particular provider within a given timeframe. We also decline to limit the verification process to a smaller scale initially, or to focus verification requests in rural areas. The Broadband DATA Act envisions that the Commission will assess the accuracy and reliability of broadband availability data, and we find it inappropriate to limit staff's ability to carry out its tasks in furtherance of the goals of both the Act and this proceeding. Although we decline to set a maximum size for the target area, we consider any target area smaller than 50 resolution 8 hexagons to be de minimis and more appropriate for the mobile challenge process than the mobile verification process.
88. However, we are mindful of the burden that a large area subject to verification can pose for providers. For this reason, we will rely on a sampling method for verification inquiries. The sampling method we adopt, described more fully in the Technical Appendix, is a somewhat modified version of the proposed approach. It relaxes the burden on providers in nearly all cases and is generally more streamlined, but still falls well within the bounds of accepted statistical methodologies.
89. In its comments, Verizon requests that the Bureau and Offices allow providers at least 15 days to review and respond to a verification request before the request is officially made and starts the 60-day clock. We decline to adopt Verizon's request. We view this request as tantamount to requesting an amendment of the 60-day term stipulated in the Third Order, and such an amendment would be beyond the Bureau and Offices' delegated authority. Further, we find that allowing a pre-review period could cause delays in the verification process that would adversely affect the provision of accurate broadband coverage information to the public. Additionally, because verification requests are triggered only where a credible basis exists, there is already reason to view the relevant area with concern, and we do not believe the benefit of such a delay would outweigh the need to verify the data promptly.
2. Sampling Methodology
90. Gathering Statistically Valid Samples of Verification Data. As proposed in the BDC Mobile Technical Requirements Proposed Rules, we require a mobile service provider subject to a verification inquiry to provide data for a statistically valid sample of areas within the targeted area. We will determine the statistically valid sample size by dividing the targeted area into hexagonal units based on the H3 indexing system at resolution 8; the aggregation of these hexagonal units comprises “the frame.” We will then categorize the hexagonal units that comprise the frame into non-overlapping, mutually exclusive groups (one “stratum” or multiple “strata”). Each stratum will be based upon one or more variables that are correlated with a particular mobile broadband availability characteristic. These variables could include core/non-core coverage area (if available, and as explained further below), signal strength (from a provider's reported “heat map” or staff-performed propagation modeling), population, urban/rural status, road miles, clutter, and/or variation in terrain. For example, terrain variation is correlated with broadband availability due to the characteristics of radiofrequency propagation. Hexagons that are not accessible by roads will be excluded from all strata. We will then select a random sample of hexagons within each stratum for which service providers must conduct on-the-ground testing. As an alternative to on-the-ground testing, a provider can respond with infrastructure information covering the targeted area. To the extent mobile service providers receive personally identifiable information through the verification process by way of receiving crowdsource data, providers may use such information only for the purpose of responding to a verification inquiry, and must protect and keep private all such personally identifiable information.
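The following sketch illustrates the frame-and-strata mechanics. The hexagon IDs, stratum labels, and per-stratum allocation are placeholders; the actual stratification variables and sample-size formulas are set out in the Technical Appendix.

    import random
    from collections import defaultdict

    def stratified_sample(frame, stratum_of, draws_per_stratum, seed=0):
        # frame: iterable of (hex_id, road_accessible) pairs covering the
        # targeted area at H3 resolution 8. Road-inaccessible hexagons are
        # excluded from every stratum, per the rule above.
        rng = random.Random(seed)
        strata = defaultdict(list)
        for hex_id, accessible in frame:
            if accessible:
                strata[stratum_of[hex_id]].append(hex_id)
        return {label: rng.sample(members, min(draws_per_stratum[label], len(members)))
                for label, members in strata.items()}

    # Toy frame: 200 hexagons split between an urban and a rural stratum;
    # every 10th hexagon lacks road access and drops out of the frame.
    frame = [("hex-%d" % i, i % 10 != 0) for i in range(200)]
    stratum_of = {"hex-%d" % i: ("urban" if i < 80 else "rural") for i in range(200)}
    sample = stratified_sample(frame, stratum_of, {"urban": 5, "rural": 8})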
91. We find this sampling approach minimizes the cost and burden placed on service providers while ensuring that staff have sufficient data to verify coverage in a reliable way. Without such sampling, providers would need to submit substantially more data to verify their broadband availability; requiring providers to submit speed test results for only a stratified random sample of units within a targeted area will minimize the time and resources associated with responding to verification requests. It is also more efficient and less burdensome than having providers perform annual drive tests, regularly submit infrastructure information, or submit data for their entire network coverage area. The stratification methodology will also ensure that variation in broadband availability is as small as possible among hexagons in the same stratum. We anticipate this methodology will reduce the sample size and the cost of data collection.
92. Failing to Verify Coverage in a Targeted Area. If the provider fails to verify its coverage data, it will be required, within 30 days, to submit revised coverage maps that reflect the lack of coverage in the targeted areas that failed verification. When a provider submits such revised coverage data, we will re-evaluate the data submitted by the provider during the verification process by comparing it with the revised coverage data for the targeted area using the same methodology. If the targeted area still cannot be successfully verified, we will require that the provider submit additional verification data, such as additional on-the-ground tests, or that it further revise its coverage maps until the targeted area is successfully verified. We note, however, that at any point after the initial 30-day deadline has elapsed, we may treat any targeted areas that still fail verification as a failure to file required data in a timely manner, and the Commission may make modifications to the data presented on the broadband map (i.e., by removing some or all of the targeted area from the provider's coverage maps). Cases where a provider fails to respond in a timely manner may also lead to enforcement action.
3. On-the-Ground Test Data
93. The approach we adopt for providers to respond to verification requests using on-the-ground test data is a modified version of what was proposed in the BDC Mobile Technical Requirements Proposed Rules. As requested by providers in the record, our modified approach is intended to lessen the burden on providers. These modified thresholds will still provide the Commission with sufficient data to evaluate a provider's coverage while reducing the testing burden on providers. First, rather than requiring tests to meet a geographic threshold, we adopt a revised requirement wherein staff will randomly select a single point-hex (i.e., a child resolution 9 hexagon) within the resolution 8 hexagon selected for the sample where the provider must conduct its tests. Unlike in the challenge process, geographic variation in the on-the-ground test data submitted for the verification process is guaranteed by the spatial random sampling approach; thus, the geographic threshold used in the challenge process is unnecessary here. Second, the specific testing threshold requirements that apply to challenges are not as relevant to verifications. Accordingly, the temporal threshold is the only threshold from the challenge process necessary to ensure statistically valid results when submitting on-the-ground test data for the verification process. Third, we adopt the proposed temporal threshold with a slight modification in certain circumstances. The temporal threshold proposed in the BDC Mobile Technical Requirements Proposed Rules requires the provider to record at least two tests within each of the randomly selected hexagons where the times of the tests are at least four hours apart, irrespective of date. We relax this threshold by requiring only a single test in a sampled hexagon if the provider establishes that any significant variance in performance was unlikely to be due to cell loading. The provider can establish this by submitting, with its speed test data, actual cell loading data for the cell(s) covering the hexagon sufficient to establish that median loading, measured in 15-minute intervals, did not exceed the modeled loading factor (e.g., 50%) for the one-week period prior to the verification inquiry. We find that this modification will reduce the burden on providers without sacrificing statistical robustness, because the temporal threshold exists to mitigate the likelihood that the speed measured in test data is unrepresentative of the speed at different times of day, with different cell loading that may exceed the provider's modeled loading assumptions.
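Two small helpers capture the temporal threshold and the single-test exemption just described, assuming loading is a fraction of capacity and test times are timestamps; wrap-around in time-of-day differences is ignored, which we believe matches the plain reading of the rule.

    import statistics
    from datetime import datetime

    def meets_temporal_threshold(test_times):
        # At least two tests whose times of day differ by four or more
        # hours, irrespective of date.
        minutes = sorted(t.hour * 60 + t.minute for t in test_times)
        return len(minutes) >= 2 and minutes[-1] - minutes[0] >= 240

    def single_test_allowed(baseline_15min, modeled_factor=0.50):
        # One test suffices in a sampled hexagon if the median of the prior
        # week's 15-minute loading samples did not exceed the modeled
        # loading factor.
        return statistics.median(baseline_15min) <= modeled_factor

    times = [datetime(2022, 5, 11, 8, 0), datetime(2022, 5, 12, 14, 30)]
    print(meets_temporal_threshold(times))  # True: 6.5 hours apart by time of day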
94. We will evaluate the entire set of speed test results to determine the probability that the targeted area has been successfully verified. The upload and download components of a test will be evaluated jointly in the verification process (rather than separately, as in the challenge process). We will treat any resolution 8 hexagons in the sample where the provider fails to submit the required speed tests in the randomly selected point-hex as containing negative tests in place of the missing tests when performing this calculation. Providers must verify coverage of a sampled area using the H3 geospatial indexing system at resolution 8. The tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher. If the provider can show sufficient coverage in the selected resolution 8 hexagons, the provider will have successfully demonstrated coverage to satisfy the verification request in the targeted area. Sampling allows us to identify where to test and to draw statistically meaningful conclusions about performance in areas that are not sampled. We believe the specific thresholds and confidence interval that we adopt balance the costs to providers of verifying maps with the Commission's need to acquire a sample sufficient to accurately verify mobile broadband availability.
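The Order does not prescribe a particular interval construction; as one standard choice, the sketch below uses a one-sided Wilson score lower bound (z = 1.645 at the one-sided 95% level) to test whether the data support 90% coverage, after any missing required tests have been counted as negatives.

    import math

    def wilson_lower_bound(successes, n, z=1.645):
        # One-sided 95% Wilson score lower bound for a binomial proportion.
        if n == 0:
            return 0.0
        p = successes / n
        centre = p + z * z / (2 * n)
        margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
        return (centre - margin) / (1 + z * z / n)

    def coverage_verified(positive_tests, total_tests, target=0.90):
        # Missing tests in sampled point-hexes are assumed already counted
        # as negatives in total_tests before this check is applied.
        return wilson_lower_bound(positive_tests, total_tests) >= target

    print(coverage_verified(97, 100))  # True: lower bound ~0.93
    print(coverage_verified(92, 100))  # False: lower bound ~0.86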
95. As proposed in the BDC Mobile Technical Requirements Proposed Rules, we require that mobile providers conduct on-the-ground tests consistent with the testing parameters and test metrics that we require for provider-submitted test data in the challenge process. As in the challenge process, providers conducting in-vehicle mobile tests for the verification process must do so with the antenna located inside the vehicle. As noted above, because most consumers will take in-vehicle tests using an antenna inside the vehicle, adopting that requirement for providers will help minimize discrepancies and ensure more consistent comparisons between on-the-ground test data supplied by consumers and data supplied by providers.
96. We decline Enablers' proposal to require on-the-ground test data from mobile providers on a continuous or quarterly basis as part of the verification process. As noted above, we are mindful of the burden placed on provider resources and find a continuous or quarterly rolling submission requirement unnecessarily burdensome.
97. Commission staff may also leverage spatial interpolation techniques, such as Kriging, to evaluate and verify the accuracy of coverage maps based on on-the-ground data. Spatial interpolation can be an alternative or complementary approach to specifying an exact testing threshold, since it requires fewer data points for comparison with propagation-model predictions.
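As a hedged illustration of the spatial-interpolation idea, the sketch below applies Gaussian process regression (the statistical machinery underlying Kriging) to synthetic speed tests using scikit-learn; nothing here reflects the Commission's actual model choices or data.

    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic on-the-ground data: download speeds (Mbps) at scattered
    # test coordinates, with speed decaying along one axis.
    rng = np.random.default_rng(0)
    coords = rng.uniform(0.0, 10.0, size=(40, 2))
    speeds = 12.0 - 0.8 * coords[:, 0] + rng.normal(0.0, 0.5, 40)

    gp = GaussianProcessRegressor(kernel=RBF(length_scale=2.0) + WhiteKernel(0.25),
                                  normalize_y=True)
    gp.fit(coords, speeds)

    # Interpolate onto a grid and flag cells where even an optimistic bound
    # on predicted speed falls below a claimed 5 Mbps service level.
    grid = np.array([[x, y] for x in range(11) for y in range(11)], dtype=float)
    pred, std = gp.predict(grid, return_std=True)
    suspect_cells = grid[pred + 1.645 * std < 5.0]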
4. Infrastructure Information
98. In the BDC Mobile Technical Requirements Proposed Rules, we noted that the Commission found that infrastructure information can provide an important means of fulfilling its obligation to independently verify the accuracy of provider coverage maps. We also reiterated the Commission's conclusion that collecting infrastructure data from mobile service providers will enable the Commission to verify the accuracy and reliability of submitted coverage data as required under the Broadband DATA Act.
99. In determining how best to utilize infrastructure data to verify a provider's coverage, the Bureau and Offices proposed that Commission staff evaluate whether a provider has demonstrated sufficient coverage for each selected hexagon using standardized propagation modeling. Under that proposed approach, staff engineers would generate their own predicted coverage maps using the infrastructure data submitted by the provider (including link budget parameters, cell-site infrastructure data, and the information provided by service providers about the details of the propagation models they used). Using those staff-generated maps, the proposed approach anticipated that Commission staff would evaluate whether each selected hexagon has predicted coverage with speeds at or above the minimum values reported in the provider's submitted coverage data. The Bureau and Offices sought comment on this proposed approach to verifying coverage using standardized propagation modeling, as well as on other ways more generally that infrastructure data could be used to evaluate the sufficiency of coverage in the proposed verification process. In the BDC Mobile Technical Requirements Proposed Rules, we noted staff may also consider other relevant data submitted by providers during the verification process, may request additional information from the provider (including on-the-ground speed test data, if necessary), and may take steps to ensure the accuracy of the verification process. Alternatively, we sought comment on other ways to use the submitted infrastructure and link budget data to perform initial verification of the claimed coverage within the selected hexagons using standard propagation models as well as appropriate terrain and clutter data. We stated we could evaluate the provider's link budgets and infrastructure data for accuracy against other available data, such as Antenna Structure Registration and spectrum licensing data. This alternative approach would include using a staff projection of speeds, available crowdsourced data at the challenged locations, and any other information submitted by or requested from a provider in order to verify coverage. The Bureau and Offices further discussed leveraging spatial interpolation techniques to evaluate and verify the accuracy of coverage maps based on available crowdsourcing and on-the-ground data. We sought comment on both the original and alternative approaches and invited comment on any other ways that infrastructure data and staff propagation modeling could be used to verify a provider's coverage in a targeted area.
100. We adopt the BDC Mobile Technical Requirements Proposed Rules' proposal that, if a provider chooses to submit infrastructure information in response to a verification request, it must provide such data for all cell sites and antennas that serve or affect coverage in the targeted area. As set forth in that notice, staff may use these infrastructure data—in conjunction with link-budget data from the provider, standard sets of clutter and terrain data, other factors, and standardized propagation modeling—to inform our decision about whether the provider has verified its claimed coverage. However, we agree with several commenters that it would be difficult for staff to account for the intricacies of a provider's dynamic network configuration and to replicate provider models with staff's own propagation models, and that the proposed approach is not necessary to accomplish the Commission's goals with respect to the verification process. Rather than attempt to replicate the results of providers' modeling, we expect staff will rely on a more flexible approach to its analysis. For example, in appropriate cases staff may choose to estimate a “core coverage area,” in which coverage at the modeled throughput is highly likely to exist, and would focus its verification efforts instead on areas outside of that “core coverage area”—but within the service provider's claimed coverage area (i.e., close to the cell edge)—and may consider other data that could be relevant (e.g., cell loading or signal strength measurements) to determine whether to seek additional information in furtherance of a verification inquiry for areas within the core coverage area.
101. While each analysis will turn on the relevant facts and circumstances, we offer one possible example in an effort to provide guidance about how the staff's analysis might work. In this scenario, Commission engineers would first confirm that the backhaul, technology, and other network resources reported for the base station(s) that serve(s) the targeted area are sufficient to meet or exceed the required speed thresholds. Second, staff could use propagation modeling to estimate the provider's core coverage area within the targeted area using more conservative parameters (including a higher cell edge probability) than required of the propagation modeling the provider used to generate its coverage data. Third, staff could analyze downlink and uplink cell loading data submitted by the provider as part of its infrastructure data to confirm that the median cell loading values are less than or equal to the cell loading factor modeled by the provider (e.g., 50%). Fourth, staff could then evaluate the signal strength information from all available speed test measurements—including those submitted as challenges, crowdsourced data, or on-the-ground data in response to a verification inquiry. For a verification inquiry, the system would evaluate whether the relevant portion of the targeted area falls outside of the staff-determined core coverage area. If the targeted area falls within the core coverage area, then we would consider other relevant evidence (if any) to determine whether further inquiry is necessary or appropriate.
102. In cases where staff's analysis indicates that infrastructure data alone are insufficient to resolve the verification inquiry, staff may sample a new set of areas and, in appropriate cases, take into account additional infrastructure data and information on the core coverage areas where adequate coverage is highly likely. Staff could then request additional information, such as on-the-ground data, to complete the verification process. Staff may also review infrastructure data independently for anomalies.
103. Several commenters argue that Commission staff should not generate propagation models with the submitted infrastructure information or should do so only in limited cases. For example, Verizon urges Commission staff to limit predictive studies to localized examinations of the reasonableness of a service provider's map and to clarify that successful speed test data would preclude staff propagation modeling or outweigh countervailing staff propagation modeling results. We clarify that where a provider submits valid speed test data in sample-selected areas, staff propagation studies based on infrastructure data should not be necessary. We also clarify that while staff has the option to create predictive maps based on providers' infrastructure data, we are not required to do so. However, retaining the option to conduct staff propagation studies is necessary for analyzing collected infrastructure data and fulfilling our obligations under the Broadband DATA Act.
104. Initial Verification of Claimed Coverage. We adopt our proposal to perform initial verification of claimed coverage as an alternative way to use infrastructure data to assess providers' coverage data. We will compare the provider's link budget and infrastructure data for accuracy against other available data, such as Antenna Structure Registration and spectrum licensing data. If staff believe, after making these comparisons, that there is a technical flaw in a provider's maps (e.g., a model was run with the wrong parameters), we will then determine whether this flaw would result in a significant difference in coverage. If staff's estimate of speeds (e.g., resulting from staff-performed propagation modeling or other related calculations), along with the available crowdsourced data at the challenged locations, does not predict speeds at or above the minimum values reported in the provider's submitted coverage data, Commission staff will consider any additional information submitted by the provider or request other data from the provider, such as on-the-ground data. No commenters addressed this alternative approach to performing initial verification of claimed coverage.
105. Additional required infrastructure information. We adopt the proposal to expand the categories of infrastructure information that providers must submit. As anticipated, we find that such information is necessary to analyze verification inquiries adequately. In addition to the types of infrastructure information listed as examples in the Third Order, providers must submit the following parameters: (1) Geographic coordinates of each transmitter measured with typical GPS Standard Positioning Service accuracy or better; (2) per-site classification (e.g., urban, suburban, or rural); (3) elevation above ground level for each base station antenna and other transmit antenna specifications (i.e., the make and model, beamwidth (in degrees), radiation pattern, and orientation (azimuth and any electrical and/or mechanical down-tilt in degrees) at each cell site); (4) operating transmit power of the radio equipment at each cell site; (5) throughput and associated required signal strength and signal-to-noise ratio; (6) cell loading distribution (we will require providers to submit information on the actual loading for each cell site that serves the targeted area, including, for example, the average number of active radio resource control channel users and average bandwidth carrying user traffic for both the downlink and uplink carriers, measured in 15-minute intervals for the one-week period before the provider received the verification inquiry); (7) areas enabled with carrier aggregation and a list of band combinations; and (8) any additional parameters and fields listed in the most recent specifications for wireless infrastructure data adopted by OEA and WTB in accordance with 5 U.S.C. 553.
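For illustration only, the sketch below arranges the eight categories above into a single per-cell-site record. The field names and types are informal shorthand we have chosen; the authoritative field definitions are those in § 1.7006(c)(2) of the final rules and the most recent OEA/WTB data specifications.

```python
# Hypothetical record structure for the infrastructure parameters listed
# above; field names and types are illustrative, not the official BDC schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class CellSiteRecord:
    latitude: float                    # (1) GPS-accuracy coordinates
    longitude: float
    site_classification: str           # (2) "urban", "suburban", or "rural"
    antenna_height_m: float            # (3) elevation above ground level
    antenna_make_model: str
    beamwidth_deg: float
    azimuth_deg: float
    downtilt_deg: float                # electrical plus mechanical
    transmit_power_dbm: float          # (4) operating transmit power
    throughput_mbps: float             # (5) with required signal strength
    required_rsrp_dbm: float           #     and signal-to-noise ratio
    required_snr_db: float
    loading_samples: List[float] = field(default_factory=list)   # (6) 15-min
    carrier_aggregation_bands: List[str] = field(default_factory=list)  # (7)
    # (8) further fields per the most recent OEA/WTB data specification
```

In practice such a record would likely exist per antenna or sector rather than per site, and loading samples would be reported separately for downlink and uplink carriers; the flat layout here is simply for readability.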
106. Some commenters argue that the Commission should not require infrastructure data fields beyond what was required in the Third Order. Verizon advocates deleting proposed fields that it says are unnecessary, unclear, or cannot readily be provided. CTIA says the “Bureaus should not second-guess a provider's cell-loading factor if the data indicates higher than average cell loading in a given area at a given time.” CTIA also urges the Commission not to collect additional infrastructure information due to its sensitive and confidential nature and the burdens this collection would impose; CTIA contends this collection would be inconsistent with the Broadband DATA Act, and that staff should instead tailor its requests to specific issues after discussion with the provider.
107. The data fields we adopt here are necessary to help predict users' speeds more precisely, and the potential burdens of providing these data are outweighed by the necessity of the information. To elaborate, required signal strength and signal-to-noise ratio (SNR) data are critical factors that enable or impede the speed at which users may connect and are thus required to estimate users' speeds. Cell loading distribution comprises the measured cell loading observed for each cell over time (e.g., every 15 minutes or less for each cell on the day of interest). Cell loading distribution is also necessary to calculate final user speeds and analyze challenges, as evidenced by the inclusion of a minimum 50% cell loading specification in the Broadband DATA Act. A provider's measured cell loading factor is the best way to verify actual cell loading; the cell loading factor is not being second-guessed. In areas with carrier aggregation, a list of spectrum band combinations used for carrier aggregation is necessary to analyze the capacity of the cell and will be used in conjunction with cell loading data to evaluate the disputed areas of the coverage map more precisely. More detailed infrastructure data specifications are listed in § 1.7006(c)(2) of the final rules.
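As a crude illustration of why loading data matter when estimating user speeds, the sketch below scales an unloaded throughput estimate by the fraction of radio resources not in use and compares the result at the measured median loading against the modeled 50% loading. The proportional-share assumption and every number here are ours, for exposition only; they are not a BDC-prescribed calculation.

```python
# Crude, illustrative link between cell loading and user speed: assume the
# speed available to a user scales with the fraction of unused resources.
# This proportional-share assumption is ours, not a BDC-specified model.
from statistics import median

def estimated_user_speed(unloaded_mbps: float, loading: float) -> float:
    return unloaded_mbps * (1.0 - loading)

# One week of 15-minute downlink loading samples would be 7 * 24 * 4 = 672
# values; a short made-up series stands in here.
samples = [0.35, 0.42, 0.55, 0.61, 0.48, 0.39, 0.44]
med = median(samples)
print(f"median loading: {med:.2f}")      # 0.44
print(estimated_user_speed(40.0, med))   # speed at the measured median loading
print(estimated_user_speed(40.0, 0.50))  # speed at the modeled 50% loading
```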
108. While we do not prioritize one information source over another, we noted above that where providers' responses to verification inquiries include valid speed test data for each sampled area, staff propagation studies based on infrastructure data should not be necessary. As previously noted, we are sensitive to confidentiality and security concerns in the collection of mobile infrastructure information, and infrastructure information submitted by providers at the request of staff will be treated as presumptively confidential. We are also sensitive to not imposing undue burden on providers and have therefore not mandated the submission of infrastructure data in response to every verification inquiry. We may engage in discussions with a provider when necessary, after which we can identify specific areas in which to collect the data. When staff find that infrastructure data are necessary to verify coverage consistent with the Broadband DATA Act, the infrastructure data fields enumerated herein enable staff to carry out that obligation.
5. Transmitter Monitoring Information
109. The Commission directed OEA and WTB to review transmitter monitoring information submitted voluntarily by providers in addition to on-the-ground and infrastructure information. T-Mobile asserts that providers should be allowed to submit data from alternative sources, including transmitter monitoring information, to satisfy verification requests. Verizon states that transmitter monitoring information “provides a comprehensive picture of network performance.” We agree that these data could be helpful to the extent that they substantiate potential reasons for service disruptions during the time interval in which measurements were performed. Therefore, we will consider transmitter monitoring information voluntarily submitted by a provider in addition to on-the-ground testing or infrastructure data in response to a verification inquiry. We do not believe, however, that the record supports a finding that such data constitute a sufficient substitute for the on-the-ground testing or infrastructure data required by the Third Order to respond to a verification inquiry.
C. Collecting Verified Broadband Data From Government Entities and Third Parties
110. We adopt our proposal for governmental entities and third parties to submit verified on-the-ground test data using the same metrics and testing parameters that mobile providers must use when submitting on-the-ground test data in response to a verification request. We also note, as set forth in the Third Order, that government and other third-party entities that submit verified broadband availability data must file their data in the same portal and under the same parameters as providers. This includes a certification by a certified professional engineer that he or she is employed by the government or other third-party entity submitting the data and has direct knowledge of, or responsibility for, the generation of the entity's Broadband Data Collection coverage maps. We find that consistent, standardized procedures for governmental entities and third parties to submit on-the-ground data are necessary to ensure that the Commission receives consistent, reliable data and that the broadband availability maps are as accurate and precise as possible. The record supports this approach. Next Century Cities advocates that the Commission develop outreach and explanatory materials to encourage participation from state and local leaders, and we will make such materials available to state, local, and Tribal government entities that file verified data. We are mindful of PAgCASA's concern that imposing these standards will not result in the submission of verified data from governmental entities and third parties. We believe, however, that this approach is the most efficient and effective way for providers and staff to review verified data from governmental entities and third parties: it minimizes variables between different datasets and thus helps ensure that staff and other parties may more efficiently and effectively evaluate competing data (e.g., verified on-the-ground tests submitted by a governmental entity versus on-the-ground tests conducted by the provider) with an apples-to-apples comparison to determine the source of any data discrepancies.
111. We also adopt our proposal that, to the extent the Commission is in receipt of verified on-the-ground data submitted by governmental entities and third parties, such data may be used when the Commission conducts analyses as part of the verification processes and will be treated as crowdsourced data. Governmental entities and third parties may also choose to use these data to submit a challenge, provided they meet the requirements for submission of a challenge under the Commission's rules.
112. Enablers advocates that the Commission create a “strong active testing-based verification layer with sampling of nationwide coverage” and revisit the decision to require propagation maps instead of continuous drive testing. To that end, Enablers notes that its solution allows for cost-effective, continuous active testing by third parties to better produce statistically valid samples and advocates that its approach be adopted. To the extent that government entities and third parties choose to submit verified data, we note that the Commission requires them to submit their data under the same parameters as providers. The Bureau and Offices lack the authority to override decisions by the full Commission. We note, however, that if Enablers or other parties submit crowdsourced data consistent with the specifications outlined below, we will treat those data as such.
D. Crowdsourced Data
113. The Broadband DATA Act requires the Commission to “develop a process through which entities or individuals . . . may submit specific information about the deployment and availability of broadband internet access service . . . on an ongoing basis . . . to verify and supplement information provided by providers.” In the Second Order, the Commission adopted a crowdsourcing process to allow individuals and entities to submit such information. The Commission required that crowdsourced data filings contain: The contact information of the filer, the location that is the subject of the filing (including the street address and/or GPS coordinates of the location), the name of the provider, and any relevant details about the deployment and availability of broadband internet access service at the location. The Commission also required that crowdsourced data filers certify that, “to the best of the filer's actual knowledge, information, and belief, all statements in the filing are true and correct.” As the Commission has clarified, the Bureau and Offices, together with the Wireline Competition Bureau (WCB), will use crowdsourced data to “identify[ ] trends,” and “individual instances or patterns of potentially inaccurate or incomplete deployment or availability data that warrant further investigation or review.” Crowdsourced information is intended to “verify and supplement information submitted by providers for potential inclusion in the coverage maps.” Notably, the Commission also expressly reserved the right to investigate provider filings in instances that warrant further investigation based on the specific circumstances presented by crowdsourced data.
114. We provide further guidance and adopt rules regarding the crowdsourced data process as described below. We provide additional information about updates we are making to the FCC Speed Test app's technical standards and requirements to configure the app for submission of mobile challenge and crowdsourced data. We also outline the procedures OET will follow for approving third-party speed test apps for these purposes. We establish requirements for consumers and other entities to submit any crowdsourced data to the online portal using the same parameters and metrics providers would use when submitting on-the-ground data in response to a Commission verification request, with some simplifications, as described above. Finally, we provide guidance on our methodology for evaluating mobile crowdsourced data through an automated process—a process that will assist us in establishing when crowdsourced data filings reach a “critical mass” sufficient to merit further inquiry. Once the automated process identifies areas where verification may be warranted, Commission staff will conduct an evaluation based upon available evidence such as speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff's review of submitted coverage data (including maps, link budget parameters, and other credible information) to determine whether a credible basis for conducting a verification inquiry has been established using the standards outlined in greater detail below.
1. Tools To Submit Crowdsourced Data
115. In the BDC Mobile Technical Requirements Proposed Rules, the Bureau and Offices proposed a process for consideration of crowdsourced data submitted through data collection apps used by consumers and other entities, including methods to prioritize the consideration of crowdsourced data submitted through apps that are determined to be “highly reliable” and that “have proven methodologies for determining network coverage and network performance.” We noted that the Commission directed the Bureau and Offices (along with WCB) to consider “(1) whether the application uses metrics and methods that comply with current Bureau and Office requirements for submitting network coverage and speed test data in the ordinary course; (2) whether the speed test app used has enough users that it produces a dataset to provide statistically significant results for a particular provider in a given area; and (3) whether the application is designed so as not to introduce bias into test results.” The Bureau and Offices noted that “data submitted by consumers and other entities that do not follow any specific metrics and methodologies may be less likely to yield effective analysis and review . . . of providers' mobile broadband availability.” Commenters did not provide any suggestions or recommendations on how to prioritize consideration of crowdsourced data.
116. We find that the FCC Speed Test app is a reliable and efficient tool for users to submit crowdsourced mobile coverage data to the Commission. The FCC Speed Test app allows users to submit specific information about the availability of mobile broadband service and its performance and meets the requirements outlined in the Commission's Second Order. We also make clear that we will include both stationary and mobile in-vehicle speed test results in crowdsourced data. Specifically, we find the FCC Speed Test app sufficiently meets the considerations that the Commission set forth. First, we find the FCC Speed Test app uses metrics and methods that comply with current requirements for submitting network coverage and speed test data in the ordinary course. These include upload speed, download speed, latency, and other network performance metrics, consistent with the network performance metrics required to be collected by the Commission under the 2020 Broadband DATA Act and the 2008 Broadband Data Improvement Act. Next, we find that the FCC Speed Test app is designed to minimize bias in test results. The app's test system architecture implements dedicated off-net servers hosted by a Content Delivery Network (CDN) to provide robust and reproducible test results that effectively represent network performance, and the test servers are deployed at major Tier 1 peering/transit locations, a practical approach to measuring network performance that minimizes bias. With regard to whether the FCC Speed Test app produces a dataset sufficient to provide statistically significant results for a particular provider in a given area as it pertains to crowdsourced data, we note that we will not analyze speed test results from the FCC Speed Test app in isolation. Rather, we will aggregate and/or cluster all speed tests conducted with the FCC Speed Test app—along with those conducted with an authorized third-party speed test app and those conducted by government or other entities using their own hardware or software—for a particular provider in a particular area during our analysis, as described further below. We anticipate that this aggregation and/or clustering process will lead to statistically valid results by provider and geographic area. We therefore find that the FCC Speed Test app meets the required criteria and is a reliable, efficient method for those interested to use when submitting crowdsourced mobile coverage data to the Commission.
117. As discussed, OET maintains a technical description that describes the metrics and methodologies used in the existing FCC Speed Test app. We note that RWA requests that the FCC Speed Test app display whether users are roaming and, if so, identify the roaming network. The FCC Speed Test app currently has the ability to provide network roaming information via the app's local data export feature for download and upload speed tests and latency tests; however, this capability is not available for Apple iOS devices as certain technical network information and RF metrics are currently not available on those devices. In order to ensure ample public participation in the crowdsourcing process, we clarify that consumers wishing to submit crowdsourced data may use a device running either the iOS or Android operating system to collect speed test data and submit it as crowdsourced information; for the same reasons discussed above, however, we require government, other third-party, and provider entities to collect all of the required technical network information and RF metrics using a device that can interface with drive test software and/or runs the Android operating system. We also clarify, as discussed earlier, that speed tests conducted by a customer of an MVNO will be considered and evaluated as crowdsourced data.
118. Regarding third-party speed test apps used to collect challenge and crowdsourced data on mobile wireless broadband availability, the BDC system will accept challenge and crowdsourced data from third-party applications approved by OET that collect the required data set forth in the relevant data specification for mobile challenge and crowdsourced data ( e.g., contact information, geographic coordinates, and required certifications) and in a format that comports with the application programming interface (API) for the backend of the BDC system. To the extent that consumers and other entities choose to submit on-the-ground crowdsourced mobile speed test data, such data will be collected using a similar measurement methodology as the FCC Speed Test app and submitted in a similar format to that which challengers and providers will use when submitting speed tests. We will thus only find third-party apps to be “highly reliable” and to “have proven methodologies for determining network coverage and network performance” if OET has approved them based upon the processes and procedures we will adopt for review of third-party apps for use in the mobile challenge process, and we will only allow for submission of crowdsourced data from such approved apps. As noted above, OET will release a public notice announcing the process for approving third-party apps for use in the mobile challenge process, inviting third-party app proposals, and seeking comment on third-party apps being evaluated. As previously mentioned, OET will announce and publish a web page to maintain a list of approved third-party apps and any available data specifications for third-party apps. We also will consider as crowdsourced data speed tests taken with an authorized app that do not meet the criteria needed to create a cognizable challenge or are otherwise not intended to be used to challenge the accuracy of a mobile service provider's map.
119. Finally, we recognize that changes in technology and other considerations may require us periodically to reevaluate these initial determinations in order to satisfy the Act's provisions for submitting crowdsourced data. The Bureau and Offices will modify the process for collecting mobile crowdsourced data over time, as experience dictates may be necessary and appropriate, to improve our procedures and ensure that the maps we make are as reliable and accurate as possible.
2. Crowdsourced Data Submitted in the Online Portal
120. We will use crowdsourced data to “identify individual instances, or patterns of potentially inaccurate or incomplete deployment or availability data that warrant further investigation or review.” In light of this purpose, we believe it is reasonable to give those collecting crowdsourced data increased flexibility to make the process more user-friendly. Specifically, on-the-ground crowdsourced data must include the same parameters and metrics as required for on-the-ground speed test data submitted through the mobile service challenge process, except that we will allow on-the-ground crowdsourced data to include any combination of download speed and upload speed (rather than both). Crowdsourced data must include valid on-the-ground speed tests, which will be categorized and evaluated as “positive” or “negative” tests based on the reported upload and download speeds, similar to speed tests in the challenge process. In the BDC Mobile Technical Requirements Proposed Rules, the Bureau and Offices noted that the Commission directed them, together with WCB, to establish and use an online portal for crowdsourced data filings and to use the same portal for challenge filings. The Bureau and Offices will release additional guidance on how consumers and other entities can use the online portal to submit crowdsourced data once the portal is available.
121. Staff will validate submitted crowdsourced speed test data, excluding tests that are anomalous, that do not conform to the data specifications, or that do not otherwise present reliable evidence, and will then evaluate the remaining crowdsourced data, as described further below, to determine whether a critical mass of crowdsourced filings suggests that a provider has submitted inaccurate or incomplete information. This approach helps ensure that the crowdsourced data staff analyze are valid and reliable while affording consumers added flexibility by allowing on-the-ground crowdsourced data to include any combination of download and/or upload speed rather than both. Similarly, mobile providers will be notified of a crowdsource filing but will not be required to respond to crowdsource filings unless and until Commission staff request that they do so, based on the procedures outlined below. We believe this process is an efficient and effective way for staff to analyze and review a provider's mobile broadband availability using crowdsourced data. A minimal sketch of the culling and categorization logic follows.
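The sketch below assumes simple dictionary-shaped test records and shows how culling and positive/negative categorization might be automated. The validity checks are placeholders of our own; the real comparison thresholds are the minimum speeds reported in the provider's coverage data for the tested location.

```python
# Minimal sketch of crowdsourced test culling and "positive"/"negative"
# categorization. The validity checks and thresholds are illustrative;
# real thresholds are the minimum speeds in the provider's coverage data.
from typing import Optional

def categorize(test: dict, min_down: float, min_up: float) -> Optional[str]:
    """Return "positive", "negative", or None if the test is culled."""
    # Cull anomalous or non-conforming tests (illustrative checks only).
    down = test.get("download_mbps")
    up = test.get("upload_mbps")
    if down is None and up is None:
        return None  # filers may report either metric, but not neither
    if any(v is not None and v < 0 for v in (down, up)):
        return None  # anomalous reading

    # A test is "positive" only if every reported metric meets the
    # corresponding reported minimum; otherwise it is "negative".
    down_ok = down is None or down >= min_down
    up_ok = up is None or up >= min_up
    return "positive" if down_ok and up_ok else "negative"

print(categorize({"download_mbps": 3.2, "upload_mbps": None}, 5.0, 1.0))
# -> "negative": the reported download speed falls below the 5 Mbps minimum
```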
122. T-Mobile supports making certain speed test metrics optional for crowdsourced data and not requiring providers to respond automatically to crowdsourced data filings, stating that these choices are appropriately tailored and will limit burdens on providers without compromising the Commission's need to receive verified and reliable data. We agree that making certain test metrics optional for crowdsourced data filings and not requiring providers to respond to such filings (absent a Commission inquiry) limits the burdens on filers and providers without compromising the reliability of the crowdsourced data, with the goal of obtaining crowdsourced data that are as broad and robust as possible.
3. When Crowdsourced Filings Reach a “Critical Mass”
123. In the Second Order, the Commission directed staff to initiate inquiries when a “critical mass” of crowdsourced filings suggest that a provider has submitted inaccurate or incomplete information and directed us to provide guidance on when crowdsourced filings reach such a critical mass. We sought comment in the BDC Mobile Technical Requirements Proposed Rules on when inquiries based on a critical mass of crowdsourced filings could be initiated. Specifically, we proposed to evaluate crowdsourced data in the first instance with an automated process to identify areas that would trigger further review.
124. Establishing Critical Mass. We adopt our proposal and will evaluate mobile crowdsourced data through a combination of automated processing and further review by Commission staff. As described in more detail below, the automated process will identify areas for further review by first excluding or “culling” any anomalous or otherwise unusable speed test information and then using data clustering to identify groupings of potential targeted areas in which a provider's coverage map may be inaccurate; these groupings would trigger further review. Staff will then review the identified potential targeted areas and any other relevant data to confirm whether a cluster presents a credible basis to warrant verification. Under this approach, areas identified from crowdsourced data using this methodology would be subject to a verification inquiry consistent with the mobile verification process adopted herein.
125. We note that commenters generally support our proposals regarding when crowdsourced data should trigger an inquiry about the accuracy of a provider's broadband mapping information. Verizon, for example, finds reasonable our proposals regarding which crowdsourced information to consider. Specifically, Verizon states that the Commission's proposal is reasonable to accept as crowdsourced information speed tests taken with an authorized app that do not meet the criteria needed to create a cognizable challenge or are otherwise not intended to be used to challenge the accuracy of a mobile service provider's map. Additionally, Verizon states the Commission should adopt the proposal to permit consumers and other entities to submit crowdsourced data collected using either the FCC Speed Test app or other speed test apps approved by OET. Furthermore, T-Mobile supports our proposal to initiate an inquiry when crowdsourced data suggest that a provider has submitted inaccurate or incomplete coverage data. Ookla agrees, pointing out that “crowdsourcing allows for the rapid, cost-effective collection of actionable, accurate broadband data.”
126. We expect that the minimum data standards and structured vetting process we adopt for evaluating crowdsourced data, described below, address concerns about any bias in, and the reliability of, the crowdsourced data collected. For example, because the automated process we describe below will filter out anomalies and other unusable speed test information, we believe this filtering sufficiently addresses Verizon's concerns about inaccurate speed test information entering the crowdsourced dataset due to varying test conditions. Further, because the process will also employ a clustering methodology to identify trends or patterns suggesting persistent coverage issues over time, we believe the crowdsourced data will efficiently and effectively inform, but not by themselves decide, our assessment of a provider's claimed deployment and availability of broadband internet access service, and will thereby be an important part of the Commission's available data verification options.
127. Other commenters offer different views regarding our proposal to evaluate crowdsourced data. RWA requests more clarity, suggesting that we define the “critical mass” that triggers an inquiry in rural and urban areas. Public Knowledge/New America, seeking to bolster the usefulness and value of crowdsourced information, opposes our proposal to initiate a verification inquiry only when there is a “critical mass” of crowdsourced data. Instead, they argue that staff should make it easier for crowdsourced data to inform our verification inquiries. We find that the requirement we adopt, to initiate an inquiry in response to crowdsourced data when a critical mass of these data suggests that a provider has submitted incomplete or inaccurate information, strikes the best balance. This approach allows the crowdsourcing process to highlight problems with the accuracy of a provider's mobile broadband coverage maps and is an important tool in the Commission's verification process. As Ookla observes, “crowdsourcing uses large numbers of samples to identify useful conclusions.” The crowdsourcing process we adopt provides a user-friendly, cost-effective way for interested filers to provide crowdsourced data to the Commission without requiring providers to respond automatically to such filings. Because the process is user-friendly, we also believe it will encourage greater participation in the crowdsourced data gathering process and thereby help ensure more reliable mobile broadband coverage data.
128. Automated Process. We will evaluate mobile crowdsourced data first through an automated process to identify potential areas that warrant further review and evaluation by Commission staff. Specifically, we adopt a modified version of our proposal in the BDC Mobile Technical Requirements Proposed Rules regarding the automated process and will evaluate crowdsourced filings using a two-step process by first excluding any anomalous or otherwise unusable tests submitted as crowdsourced data and then by using data clustering (an industry standard tool for clustering GIS data) to identify potential targeted areas where crowdsourced tests indicate a provider's coverage map is inaccurate. Areas identified by the automated process then would be subject to further review and evaluation by Commission staff of available evidence, such as speed test data, infrastructure data, crowdsourced and other third-party data, and the staff's review of submitted coverage data, including maps, link budget parameters, and other credible information to make a determination as to whether a credible basis for conducting a verification inquiry has been established and whether a verification request is appropriate.
129. More particularly, the automated process will involve an analysis at the end of each month that will include aggregating the crowdsourced data into H3 hexagons at resolution 8 and categorizing each hexagon for purposes of further analysis. Next, we will apply a clustering algorithm to spatially cluster these hexagons. We will track the growth of the clusters of hexagons over time and, if the requisite level of negative speed tests is observed for three consecutive months, will determine whether the crowdsourced data have reached a “critical mass” warranting verification. The details of this process are described in the Technical Appendix. We note that the Density Based Spatial Clustering of Applications with Noise (DBSCAN) algorithm we will employ is one of the 10 default tools for clustering GIS data in the industry-standard Esri ArcGIS software and is among the most commonly used methods for this type of data clustering analysis.
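To illustrate the monthly pipeline, here is a minimal sketch assuming the h3-py (v4) and scikit-learn packages. The negative-test rule, the eps radius, and min_samples are placeholder parameters of our own; the parameters actually used are those in the Technical Appendix, and the three-consecutive-month persistence tracking is omitted for brevity.

```python
# Illustrative monthly pass: aggregate tests into resolution-8 H3 hexagons,
# flag hexagons dominated by negative tests, and cluster them with DBSCAN.
# All thresholds are placeholders; see the Technical Appendix for the
# parameters actually adopted.
import h3                      # h3-py v4 API assumed
import numpy as np
from sklearn.cluster import DBSCAN

def flag_hexagons(tests, negative_share=0.5):
    """Aggregate tests into resolution-8 hexagons; return those whose share
    of negative tests exceeds the (assumed) threshold."""
    hexes = {}
    for t in tests:            # each test: {"lat", "lon", "negative"}
        cell = h3.latlng_to_cell(t["lat"], t["lon"], 8)
        total, neg = hexes.get(cell, (0, 0))
        hexes[cell] = (total + 1, neg + int(t["negative"]))
    return [c for c, (total, neg) in hexes.items()
            if neg / total > negative_share]

def cluster_hexagons(cells, eps_m=1000.0, min_samples=4):
    """Spatially cluster flagged hexagon centroids with DBSCAN; a label of
    -1 marks noise points that belong to no cluster."""
    if not cells:
        return {}
    # Haversine metric expects (lat, lon) in radians; eps converted from
    # meters using the Earth's mean radius.
    latlngs = np.radians([h3.cell_to_latlng(c) for c in cells])
    labels = DBSCAN(eps=eps_m / 6_371_000.0, min_samples=min_samples,
                    metric="haversine").fit_predict(latlngs)
    clusters = {}
    for cell, label in zip(cells, labels):
        if label != -1:
            clusters.setdefault(int(label), []).append(cell)
    return clusters            # cluster id -> list of hexagon ids
```

Clusters that persist across three consecutive monthly runs would then go to staff for the evaluation described below.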
130. Verizon opposes the use of an automated process to analyze crowdsourced data as well as the use of data clustering to identify potential targeted areas where crowdsourced tests indicate that a provider's coverage map is inaccurate, and asks that, should we adopt these proposals, we provide more detail about their mechanics and seek further comment on the proposed algorithm, data sources, and criteria the processes will use for identifying potential targeted areas for further review and evaluation. We proposed to use an automated process to identify potential areas that would trigger further review using a methodology similar to the mobile verification process, with certain simplifications. More specifically, we proposed to use data clustering to identify potential targeted areas where crowdsourced tests suggest that a provider's coverage map is inaccurate and also sought comment on any alternative methods for determining when a critical mass of crowdsourced filings suggest a provider may have submitted inaccurate or incomplete information. We did not receive any comments suggesting any alternative methods for the critical mass determination. We adopt a modified version of our proposal as described above. Employing the modified automated process we adopt is a reasonable approach to analyze crowdsourced data because of the anticipated volumes of data. Using data clustering to identify potential targeted areas for further Commission staff review and evaluation is also a reasonable way to group crowdsourced data together for a particular area within a coverage map. In this regard, we note that a data clustering approach for the identification of clusters of concern will reduce the amount of staff work and assure that an unbiased analysis has provided evidence that specific areas warrant further review by Commission staff. We believe the modified version of the automated process we adopt, including the use of data clustering, is sufficiently detailed and, taken together with the added safeguard of subsequent staff evaluation, addresses Verizon's request for more information about the automated process itself and the data clustering and other criteria the process will use as described below to identify potential areas for further review and evaluation.
131. Staff Evaluation. As noted above, the data identified in this process will inform, but not decide, our assessment of a provider's claimed deployment and availability of broadband internet access service and thereby be an important part of the Commission's available verification options. If the automated process suggests that an area has persistent coverage issues, Commission staff will evaluate the data and make a final determination as to whether clusters of hexagons identified in this manner for three consecutive months have, indeed, reached “critical mass.” Staff may consider other relevant data submitted by providers, consumers, and/or third parties; may request additional information; and may take other actions as may be necessary to ensure the reliability and accuracy of the provider's coverage data and any applicable crowdsourced data. Should automated processing establish a “critical mass” of crowdsourced filings and staff evaluation suggest that a provider's coverage map is inaccurate, Commission staff will have a “credible basis” for verifying the provider's coverage data. Under this approach, areas identified from crowdsourced data using this methodology would be subject to a verification inquiry consistent with the mobile verification process adopted herein. Finally, we reiterate that we may initiate an inquiry, in the absence of a critical mass of crowdsourced filings, to collect and request verification data from a provider where there is a credible basis for doing so based upon a holistic review of all data available to staff (including crowdsourced data, data associated with challenges, verified data from government or third-party entities, or broadband availability data included in the provider's initial filing). On a case-by-case basis, staff may thus have a credible basis for initiating a verification inquiry if warranted by the specific circumstances of a crowdsourced data filing in the context of all other data available to staff.
4. Public Availability of Crowdsourced Data
132. The Commission determined in the Second Order that all information submitted as part of the crowdsourcing process will be made public, except for personally identifiable information (PII) and data required to be confidential under § 0.457 of its rules. The Commission also directed OEA to make crowdsourced data publicly available as soon as practicable after submission and to establish an appropriate method for doing so. No commenters addressed, or provided any alternatives to, our proposal in the BDC Mobile Technical Requirements Proposed Rules to make crowdsourced data filings available to the public or offered any suggestions about any specific ways to protect PII or other sensitive information.
133. We therefore adopt our proposal to make crowdsourced data available via the Commission's public-facing website, including data collected via designated third-party apps. This publicly available information will depict coverage data and other associated information but will not include any PII or other data required to be confidential under § 0.457. Since designated third-party apps will be collecting data on behalf of the Commission, we expect similar handling of PII and other confidential information by third-party designees. We also adopt a modified version of our proposal and will update the public crowdsourced data at least biannually in order to make available the most up-to-date data. This is consistent with the Commission's requirement to update the Fabric every six months to ensure the most up-to-date information is available for all of the locations identified in the common dataset, and it will ensure the published crowdsourced data are likewise current, reliable, and robust.
E. Other Matters
134. Additional Mapping Information. We reject calls to require providers at this time to submit additional information with their maps. Next Century Cities and Public Knowledge/New America recommend that providers be required to include other performance and affordability information, such as the throughput speeds experienced by broadband consumers, signal strength, and pricing information. The Commission declined to adopt pricing and throughput data filing requirements for fixed services in the Third Order and did not delegate authority to the Bureau and Offices to add such requirements for mobile services. The Broadband DATA Act defines standardized propagation modeling at defined throughput speeds for 4G-LTE coverage. The Commission followed Congress's approach and required mobile broadband providers to model broadband coverage, including 3G and 5G-NR services, based on standardized propagation modeling. We thus decline to require providers to model actual mobile throughput. Even if we had the delegated authority to adopt a rule requiring the modeling of mobile throughput, such modeling would be a computationally difficult, if not impossible, task for mobile broadband providers. Instead, we will use on-the-ground data collected through the challenge and crowdsource processes to improve the accuracy of the coverage maps. The Commission specifically considered whether to standardize signal strength for mobile propagation maps and instead adopted a requirement for providers to submit “heat maps.” Mobile providers are therefore already required to submit maps showing Reference Signal Received Power (RSRP) or Received Signal Strength Indicator (RSSI) signal levels for each technology. Additionally, in adopting rules to implement the Broadband DATA Act, the Commission focused on ensuring that the public has access to more precise coverage maps but did not delegate to the Bureau and Offices the authority to adopt new mapping requirements, such as requiring providers to include affordability or pricing data for their broadband services. We also find it would be inconsistent with the Commission's reasoning to adopt these types of pricing requirements for mobile maps but not fixed maps.
135. Expanding the Types of Data That Can Be Used to Challenge Maps. CPUC, Public Knowledge/New America, and Vermont DPS recommend allowing interpolation techniques to be used for challenging provider-submitted maps. The Commission explicitly adopted a requirement that consumers and government and other entities submit speed test data to support their mobile coverage challenges, and did not grant the Bureau and Offices authority to accept data other than on-the-ground speed tests to challenge coverage. We therefore lack delegated authority to accept interpolations or statistical sampling as challenge data in lieu of actual, valid speed tests.
136. Expanding the Types of Data That Can Be Used for Verified Data. CPUC and Vermont DPS likewise recommend allowing interpolations of speed test results by government entities to identify areas requiring validation. Such spatial interpolation techniques could include the Kriging technique discussed in the BDC Mobile Technical Requirements Proposed Rules. In contrast, T-Mobile states that the Commission must reject any proposal premised on interpolation. To the extent governments or other entities submit on-the-ground speed test data through our crowdsource process, we agree with CPUC and Vermont DPS that the results of spatial interpolation analyses would be useful additional information in determining whether there is a credible basis for verifying a provider's coverage data. However, the Commission directed that verified mobile on-the-ground data be submitted “through a process similar to the one established for providers making their semiannual [BDC] filings,” and the Bureau and Offices do not have discretion to change that approach. Because interpolation is a projection, it does not meet the requirements established for “verified” broadband availability data under the Broadband DATA Act. Therefore, while we may use interpolation in our analysis of on-the-ground data submitted either as part of the challenge process or as crowdsourced data when conducting a holistic review to ensure the accuracy of coverage data (e.g., when evaluating whether there is a credible basis for conducting a verification inquiry), we are unconvinced that accepting interpolated data on their own would give us the necessary understanding of on-the-ground performance consistent with our obligations under the Broadband DATA Act and Commission Orders.
137. Declining to Require Providers to Offer Challenge Incentives. We will not, as urged by some commenters, require that providers offer subscribers incentives to conduct speed tests or submit voluntary challenges. Once we implement the challenge process, we believe that consumers and third parties will be motivated to provide us with data where they believe providers' coverage maps are inaccurate or incomplete. Relatedly, the Commission noted in the Third Order that speed test results submitted by consumer challengers that do not reach the threshold of a cognizable challenge will nevertheless be incorporated in the analysis of crowdsourced data, and similarly that on-the-ground test data submitted by governmental and third-party entities that do not reach that threshold also will be considered in the analysis of crowdsourced data. We believe that combining these speed test results with other available data, including other available crowdsourced data, will provide us with a robust and accurate dataset, thereby obviating the need for provider-offered incentives to spur consumers and third parties to submit challenges or collect crowdsourced data. The user-friendly challenge process we implement should make it easy for consumers and other entities alike to submit challenges and crowdsourced mobile coverage data. As one commenter observes, “[d]ue to known shortcomings in mobile coverage maps[,] . . . the Commission needs a good challenge process” and should “allow the use of crowd-sourced data to challenge providers' claims.” We agree, and believe that we have put efficient and effective challenge and crowdsource processes and procedures in place.
138. Pre-Publication Commission Review of Maps. We decline to establish an additional period of review for the Commission to perform a “quick look” at the data that service providers submit before publishing maps rendering the data. CCA suggests an “initial review and sampling process,” which “could be automated, although there is likely no complete substitute for some degree of manual review and sampling,” to identify “significant and overt errors”; CCA cites the Commission's initial review of spectrum license transfer applications prior to placing them on public notice as a potential framework for a similar initial review process. It also recommends that staff conduct random sampling or statistical analysis and comparison of the data provided by each provider to detect clear errors, and then quickly review maps for errors such as failure to account for terrain and clutter, excessive signal propagation at co-located sites, failure to use the required resolution, understated or overstated service in populated areas, depicted service ceasing at artificial boundaries, and failure to match the coverage maps on providers' websites. CTIA and Public Knowledge/New America agree that such a process could be helpful, reasoning that a Commission-led initial review would eliminate a costly and open-ended burden on challengers who, they argue, would otherwise expend time and energy identifying overt errors that carriers never should have submitted.
139. While we recognize the theoretical benefits of a “quick look” at provider-submitted maps before they are made available to the public to challenge, we find that these benefits are outweighed by the significant delay such review would introduce into the challenge process. Requiring the Commission to independently analyze provider submissions or conduct field surveys would significantly delay when this information is made available for the public to challenge. It also would be difficult to operationalize meaningful and practical standards for a “quick look.” The Commission will be collecting data and rendering multiple maps for scores of mobile and fixed providers, and it would be wholly impracticable for staff to review every map of every provider before making them available to the public and to other federal, state, and local government agencies, Tribal entities, and other third parties. To build a process for this type of review, we would need to decide, for example, which maps to review, how much time to spend reviewing them, and what kinds of “significant and overt” errors to look for. Commenters who support this pre-screening of provider data offer virtually no input on these fundamental implementation challenges, and we note that adopting CCA's suggested “quick look” approach in the absence of a more complete record on issues like these would likely require additional notice and comment. Additionally, the Broadband DATA Act created a framework whereby mobile service providers submit propagation maps based on a standardized set of propagation model details; in turn, the Commission is required to publish the data mobile service providers submit, and outside stakeholders are permitted to challenge mobile service providers' broadband coverage assumptions or submit crowdsourced information to help us further refine and validate mobile service providers' propagation maps. Creating a “quick look” process could interfere with Congress's intent that we leverage public input to improve the maps over time.
140. That is not to say that we have not already planned certain data validations as part of the BDC submission process to preempt or remediate overt errors. The BDC system will perform dozens of data validations and automatic processing steps on uploaded data and will alert the provider when any of the data fail one of these steps. These validations and processing steps will—for the first time—allow the Commission's systems to automatically detect many of the GIS data and mapping issues that have historically been found in data submitted by providers only after a time-consuming and largely manual staff review of each Form 477 filing round. The new validations and automatic processing will flag a number of factors that would undermine the accuracy of a provider's data, including geometric errors in maps and overt errors in providers' assumptions. Moreover—and also for the first time—the BDC system will require providers to review and correct maps rendered from their data and to confirm that they uploaded the correct data and that any changes made as a result of data validations (e.g., automatic repairs of invalid geometries and incorrect map projections) are correct, all prior to certifying their submissions. We anticipate that these additional validations and processing steps will significantly improve the submission process and, by preventing a provider from completing its submission until it has successfully undergone these data validations, will prevent the lengthy back-and-forth between filers and FCC staff that has typically occurred after the submission of Form 477 data. We believe that the new validations and automatic processing will help correct many, if not all, of the problems CCA discusses. The Bureau and Offices will maintain discretion to develop additional tools in the future to provide automatic feedback to carriers as we receive more data.
141. Use of BDC Data. RWA requests that the Bureau and Offices clarify when the data collection, Fabric, and coverage maps will be “complete” for the purposes of awarding broadband deployment funds. We note that decisions regarding specific programs and how to use BDC data to determine areas of eligibility are outside the scope of this proceeding.
142. Non-substantive Changes. Finally, we make two non-substantive changes. First, we correct the numbering of 47 CFR 1.7006(e)(1). In particular, we redesignate the first paragraph (e)(1)(iv) as paragraph (e)(1)(iii). Second, in the second sentence of 47 CFR 1.7006(f) introductory text, we change the first instance of the word “or” to “of”.
II. Supplemental Final Regulatory Flexibility Analysis
143. As required by the Regulatory Flexibility Act of 1980, as amended (RFA), a Supplemental Initial Regulatory Flexibility Analysis (Supplemental IRFA) was incorporated in the BDC Mobile Technical Requirements Proposed Rules released in July 2021 in this proceeding. The Commission prepared Initial and Final Regulatory Flexibility Analyses in connection with the Digital Opportunity Data Collection Report and Order (73 FR 37869, July 2, 2008) and Further Notice of Proposed Rulemaking (82 FR 40119, Aug. 24, 2017), Second Order and Third Further NPRM, and Third Order (collectively, Broadband Data Act Proceedings). Written public comments were requested on the IRFAs prepared for the Further Notices of Proposed Rulemaking that are part of the Broadband Data Act Proceedings. Additionally, the Commission sought written public comment on the proposals, including comments on the Supplemental IRFA, in the BDC Mobile Technical Requirements Proposed Rules. No comments were filed addressing the Supplemental IRFA or the IRFAs incorporated in the Broadband Data Act Proceedings. This Supplemental Final Regulatory Flexibility Analysis (Supplemental FRFA) supplements the Final Regulatory Flexibility Analyses (FRFAs) in the Broadband Data Act Proceedings to reflect actions taken in this document and conforms to the RFA.
A. Need for, and Objectives of, the Order
144. The Broadband DATA Act requires the Commission to collect granular data from providers on the availability and quality of broadband internet access service and to verify the accuracy and reliability of the broadband coverage data submitted by providers. In its Second Order and Third Further NPRM, and Third Order, the Commission adopted some of the Broadband DATA Act's requirements, developed the framework for the BDC, established processes for verifying providers' broadband data submissions, and established a data challenge process. The Commission delegated authority to the Bureau and Offices to design and construct the new mapping system, which includes setting forth the specifications and requirements for the challenge, verification, and crowdsourcing processes. Following the December 27, 2020, Congressional appropriation of funding for the implementation of the Broadband DATA Act, the Commission began to implement challenge, verification, and crowdsourcing processes involving broadband data coverage submissions.
145. In this document, pursuant to their delegated authority, the Bureau and Offices take the next steps toward obtaining better coverage data and implementing the requirements of the Broadband DATA Act. More specifically, the Bureau and Offices take action to carry out their responsibility to develop technical requirements for verifying service providers' coverage data, a challenge process that will enable consumers and other third parties to dispute service providers' coverage data, and a process for consumers and other entities to submit crowdsourced data on mobile broadband availability. These measures will help the Commission, Congress, other federal and state policy makers, Tribal entities, consumers, and other third parties better evaluate the status of broadband deployment throughout the United States.
146. This document discusses the technical requirements to implement the mobile challenge, verification, and crowdsourcing processes required by the Broadband DATA Act, such as parameters and metrics for on-the-ground test data and a methodology for determining the threshold for what constitutes a cognizable challenge requiring a provider response. It also provides guidance on what types of data will likely be more probative in different circumstances. Additionally, this document discusses detailed processes and metrics for providers to follow when responding to a Commission verification request, for government entities and other third parties to follow when submitting verified broadband coverage data, and for challengers to follow when contesting providers' broadband coverage availability. We believe this level of detail is necessary to formulate the processes and procedures to enable better evaluation of the status of broadband deployment throughout the United States and to meet the Commission's obligations under the Broadband DATA Act.
B. Summary of Significant Issues Raised by Public Comments in Response to the IRFA
147. There were no comments filed that specifically addressed the proposed rules and policies presented in the Supplemental IRFA.
C. Response to Comments by the Chief Counsel for Advocacy of the Small Business Administration
148. Pursuant to the Small Business Jobs Act of 2010, which amended the RFA, the Commission is required to respond to any comments filed by the Chief Counsel for Advocacy of the Small Business Administration (SBA) and to provide a detailed statement of any change made to the proposed rules as a result of those comments. The Chief Counsel did not file comments in response to the proposed rules in this proceeding.
D. Description and Estimate of the Number of Small Entities to Which the Rules Will Apply
149. The RFA directs agencies to provide a description of and, where feasible, an estimate of the number of small entities that may be affected by the rules adopted herein. The RFA generally defines the term “small entity” as having the same meaning as the terms “small business,” “small organization,” and “small governmental jurisdiction.” In addition, the term “small business” has the same meaning as the term “small-business concern” under the Small Business Act. A “small-business concern” is one which: (1) Is independently owned and operated; (2) is not dominant in its field of operation; and (3) satisfies any additional criteria established by the SBA.
150. As noted above, Regulatory Flexibility Analyses were incorporated into the Broadband DATA Act Proceedings and the BDC Mobile Technical Requirements Proposed Rules. More specifically, the FRFAs incorporated in the Broadband DATA Act Proceedings described in detail the small entities that might be significantly affected in those proceedings. Accordingly, in this Supplemental FRFA, we hereby incorporate by reference from the FRFAs in the Broadband DATA Act Proceedings the descriptions and estimates of the number of small entities that might be significantly affected, as well as the associated analyses, set forth therein.
E. Description of Projected Reporting, Recordkeeping, and Other Compliance Requirements for Small Entities
151. We expect that the granular data collection for the challenge and verification processes in this document will impose some new reporting, recordkeeping, or other compliance requirements on some small entities. Specifically, as part of the challenge process, challenged mobile service providers are notified via the online portal of the challenged hexagons at the end of each calendar month. Mobile providers of broadband internet access service must submit a rebuttal (consisting of either on-the-ground test data or infrastructure data) or concede the challenge within 60 days of being notified of the challenge. A challenge respondent may submit supplemental data in support of its rebuttal, either voluntarily or, in some cases, in response to a request from OEA. When rebutting a challenge with on-the-ground data, the provider must meet thresholds (geographic, temporal, and testing) analogous to those required of challengers, adjusted to reflect the provider's burden of demonstrating that sufficient coverage exists at least 90% of the time in the challenged hexagons. When a provider submits only infrastructure data to rebut a challenge, the provider must submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request.
152. As part of the verification process, mobile providers of broadband internet access service must submit coverage data in the form of on-the-ground test data or infrastructure information on a case-by-case basis in response to a Commission request to verify mobile broadband providers' biannual BDC data submissions in a targeted area. For on-the-ground test data, we adopted an approach under which providers responding to verification requests with such data must submit it using the H3 geospatial indexing system at resolution 8. The tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher. Providers must also meet a temporal threshold in verification inquiry submissions, although it may be relaxed from the threshold required in the challenge process. Additionally, consistent with our proposal in the BDC Mobile Technical Requirements Proposed Rules, state, local, and Tribal government entities as well as other third parties who voluntarily submit on-the-ground test data as verified data must use the same metrics and testing parameters that mobile providers must use when submitting on-the-ground test data, to ensure the consistency and accuracy of the broadband availability maps.
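To make the statistical evaluation concrete, the following Python sketch is offered for illustration only and is not part of the Order: it computes a one-sided 95% lower confidence bound on the share of passing on-the-ground tests and compares that bound to the 90% coverage target. The use of the Wilson score interval and all function names here are assumptions of this sketch; the Order's technical appendix, not this code, governs the actual evaluation.

```python
import math

def wilson_lower_bound(positives: int, n: int, z: float = 1.645) -> float:
    """One-sided lower confidence bound on a proportion via the Wilson
    score interval; z = 1.645 gives 95% one-sided confidence."""
    if n == 0:
        return 0.0
    p = positives / n
    center = p + z * z / (2 * n)
    margin = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (center - margin) / (1 + z * z / n)

def coverage_verified(positives: int, n: int, target: float = 0.90) -> bool:
    """Treat coverage as demonstrated when the one-sided 95% lower bound
    on the passing-test proportion reaches the 90% target."""
    return wilson_lower_bound(positives, n) >= target

print(coverage_verified(98, 100))  # True: lower bound is about 0.94
print(coverage_verified(46, 48))   # False: lower bound is about 0.88
```

As the second example shows, a high raw pass rate (46 of 48, about 96%) can still fail the check because the small sample leaves the lower confidence bound below 90%.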
153. This document allows providers to submit infrastructure information in response to a verification request, as proposed in the BDC Mobile Technical Requirements Proposed Rules. If a provider chooses to submit infrastructure information in response to a verification request, it must provide such data for all cell sites and antennas that serve or affect coverage in the targeted area. To the extent that the infrastructure information submitted by a provider in response to a verification request is not, standing alone, sufficient to demonstrate adequate coverage, the Commission may request that the provider submit additional information to complete the verification process. This document expands the categories of infrastructure information that providers must submit when collecting and reporting mobile infrastructure data by adopting the eight additional data categories proposed in the BDC Mobile Technical Requirements Proposed Rules, which will enable a more precise evaluation of the challenged area of a provider's coverage map. Further, recognizing the need to allow flexibility for responding providers, this document also allows providers to submit other types of data to supplement on-the-ground or infrastructure information, such as transmitter monitoring information, data from their own field tests conducted in the ordinary course of business, and data collected using their own software tools.
154. With regard to the reporting or submission of crowdsourced data, the Bureau and Offices were directed by the Commission to establish and use an online portal for crowdsourced data filings and to use the same portal for challenge filings. As proposed in the BDC Mobile Technical Requirements Proposed Rules, to the extent state, local, and Tribal government entities, other entities, or consumers choose to submit on-the-ground crowdsourced mobile speed test data in the online portal, the data submission must use a measurement methodology similar to that of the FCC's speed test app and be submitted in a format similar to that which challengers and providers are required to use when submitting speed tests. Likewise, if state, local, and Tribal government entities, other entities, or consumers choose to submit preliminary on-the-ground crowdsourced mobile speed test information prior to availability of the online portal, the data collection requirements require use of a measurement methodology similar to that of the FCC's speed test app and submission in a format similar to the one used for speed tests.
155. The requirements we adopt in this document continue the Commission's actions to implement the Broadband DATA Act and develop more accurate, more useful, and more granular broadband availability data to advance our statutory obligations and continue our efforts to close the digital divide. We conclude that these rules are necessary to produce broadband deployment maps that will allow the Commission to precisely target scarce universal service dollars to where broadband service is lacking. We are cognizant of the need to ensure that the benefits resulting from use of the data outweigh the reporting burdens imposed on small entities. The Commission believes, however, that any additional burdens imposed by our revised reporting approach for providers and state, local, and Tribal government entities are outweighed by the significant benefit to be gained from producing more accurate broadband deployment data and maps. We are likewise cognizant that small entities will incur costs and may have to hire attorneys, engineers, consultants, or other professionals to comply with this document. Moreover, although the Commission cannot quantify the cost of compliance with the requirements in this document, we believe that the reporting and other requirements we have adopted are necessary to comply with the Broadband DATA Act and ensure the Commission obtains complete and accurate broadband coverage maps.
F. Steps Taken To Minimize the Significant Economic Impact on Small Entities, and Significant Alternatives Considered
156. The RFA requires an agency to describe any significant, specifically small business, alternatives that it has considered in reaching its approach, which may include the following four alternatives (among others): “(1) the establishment of differing compliance or reporting requirements or timetables that take into account the resources available to small entities; (2) the clarification, consolidation, or simplification of compliance and reporting requirements under the rule for such small entities; (3) the use of performance rather than design standards; and (4) an exemption from coverage of the rule, or any part thereof, for such small entities.”
157. The requirements adopted in this document balance the need for the Commission to generate more precise and granular mobile broadband availability maps with the associated costs and burdens on mobile broadband providers and other entities participating in the BDC process. The Commission has considered the comments in the record and is mindful that some small entities will have to expend resources and will incur costs to comply with the requirements in this document. In reaching the requirements adopted in this document, the Commission considered but declined to adopt various approaches and alternatives, discussed below; declining those alternatives prevents small entities from incurring additional burdens and minimizes the economic impact of compliance.
158. The mobile challenge process requirements adopted by the Commission will facilitate the collection of sufficient measurement information to ensure the mobile challenge process is statistically valid while, at the same time, meeting the Commission's statutory obligation to keep the challenge process “user-friendly.” The adopted requirements strike a balance between ensuring that small entities, including but not limited to state, local, and Tribal governments, as well as consumers and other third-party challengers, can use the challenge process, and ensuring that providers, including small providers, are not unreasonably burdened by responding to every speed test that shows a lack of coverage. The mobile challenge process we have adopted includes a process to determine whether there is a cognizable challenge to which a provider is required to respond rather than requiring a provider to respond to any and all submitted challenges. This will minimize the economic impact for small providers to the extent they are subject to challenges. For challengers, the mobile challenge process allows drive test data meeting specific testing parameters to be submitted via a mobile app—the data must be collected using mobile devices running either a Commission-developed app ( i.e., the FCC Speed Test app) or another speed test app approved by OET—and allows governmental entities and other third-party challengers to use their own software and hardware, which contributes to the “user-friendly” nature of the challenge process. Additionally, the speed test data from state, local, and Tribal governments, consumers and other third-party challengers will be aggregated as part of the mobile challenge process to ensure that one challenger is not required to submit all of the speed test data needed to create a challenge, thereby lessening the load as well as the costs and resources required for small entities and others who participate in the mobile challenge process to create a cognizable challenge.
159. Under the notification process adopted in this document, service providers will be informed of cognizable challenges filed against them, and challengers and service providers will be informed of the status and results of challenges, on a monthly basis via the online portal. This approach should be more manageable, more administratively efficient, and thereby less costly for small entities and other providers, because it provides a standard set of deadlines rather than a rolling set of multiple deadlines, while also ensuring that challengers have the opportunity to submit additional evidence in support of their challenge submissions if desired. Providers and challengers will have access to all relevant information through the online portal, including a map of the challenged area(s), notification of whether or not a challenge has been successfully rebutted, whether a challenge was successful, and whether a challenged area was restored based on insufficient evidence to sustain a challenge.
160. The requirements for the mobile challenge process and for mobile providers responding to a Commission verification request seek to balance the need for the Commission to establish reliable methods for verifying coverage data with the need to reduce the costs and burdens associated with requiring mobile providers to submit on-the-ground test data and infrastructure information. For example, to ensure the challenge process is user-friendly for challengers and workable for mobile providers to respond to and rebut challenges, challenged mobile service providers who choose to submit on-the-ground speed test data are required to meet thresholds analogous to the challengers' to demonstrate that the challenged areas have sufficient coverage. Providers are required to submit on-the-ground data to demonstrate that sufficient coverage exists at least 90% of the time and to meet the same three threshold tests as challengers. We considered but declined to adopt a proposal to define a challenge area based on the test data submitted by challengers, based on our belief that the approach we adopted is both user-friendly and supported by sufficient data, while also targeting a more precise geographic area where broadband coverage is disputed and limiting the burden on providers in responding to challenges.
161. We also declined to adopt several recommendations from commenters that would have expanded the scope of the challenge process requirements and increased costs for small and other providers. More specifically, we declined to include voice maps in the challenge process, noting that the Broadband DATA Act makes no mention of allowing challenges to voice maps and that the Commission decided the mobile challenge process applies only to broadband ( i.e., not voice) coverage maps. Further, we declined to require providers to submit with their maps additional performance and affordability information, such as throughput speeds experienced by consumers, signal strength, and pricing. In the Third Order, the Commission specifically declined to collect pricing and throughput data on fixed services, and we do not believe the Bureau and Offices have discretion to add such requirements in this document.
162. For small entities and other providers who use on-the-ground test data to rebut challenges, we provide greater flexibility in the collection of on-the-ground test data and reduce burdens on providers by allowing them to use the software tools they may already be using. To the extent that a provider chooses to use software other than the FCC Speed Test app or another speed test app approved by OET for use in the challenge process, we will consider such software approved for use in rebutting challenges provided that it collects the metrics that approved apps must collect for consumer challenges and that governmental and third-party challengers' speed test data must contain. This approach will help minimize costs for small and other providers and increase efficiency, while continuing to ensure that the Commission receives high-quality data that will allow an equivalent comparison between challenge data submitted by consumers and other entities and data created by providers using their own software. We note, however, that we retain the discretion to require prior approval of providers' software tools or to make changes to the required metrics via notice and comment at a later time. Similarly, we provide small and other providers flexibility to rebut challenges by allowing the use of infrastructure data on its own to adjudicate challenges in a limited set of circumstances.
163. In adopting parameters for the collection of verification information, we recognize that it may be more costly for small providers to obtain on-the-ground test data. We take steps to address this issue by adopting a targeted and more inclusive approach. Specifically, we identify the portion of a provider's coverage map (the targeted area) that may require verification data and will base that determination upon all available evidence, including speed test data, infrastructure data, crowdsourced and other third-party data, as well as staff evaluation and knowledge of submitted coverage data (including maps, link budget parameters, and other credible information). Thus, rather than imposing a one-size-fits-all requirement, this approach will allow Commission staff to evaluate whether a verification request is warranted and providers to submit the type of data in response that most cost-effectively supports their coverage calculations. To further minimize the costs and burden placed on small and other service providers, while ensuring Commission staff have access to sufficient data to demonstrate coverage, we will sample the targeted area and require service providers to submit verification data covering a statistically valid sample of the areas for which sufficient coverage must be demonstrated to satisfy the verification request. By using a sampling plan to demonstrate broadband availability, we decrease the data submission requirements, allowing small and other providers to avoid the costs that would have been associated with submitting considerably more data. Additionally, we declined a request to require providers to submit actual on-the-ground test data on a continuous or quarterly basis, as such a requirement would be unnecessarily burdensome.
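As a rough illustration of the sampling idea only, a statistically valid sample of resolution-8 cells might be drawn as in the sketch below. The sample-size formula is a generic textbook proportion-estimate calculation with a finite population correction, not the Commission's prescribed plan, and the function names and parameters are invented for this sketch.

```python
import math
import random

def illustrative_sample_size(n_cells: int, margin: float = 0.05,
                             z: float = 1.645, p: float = 0.90) -> int:
    """Textbook sample size for estimating a proportion, with a finite
    population correction; a stand-in, not the FCC's prescribed formula."""
    n0 = z * z * p * (1 - p) / (margin * margin)
    return math.ceil(n0 / (1 + (n0 - 1) / n_cells))

def sample_targeted_area(hex8_cells: list[str], seed: int = 42) -> list[str]:
    """Draw a simple random sample of hex-8 cells from the targeted area
    in which the provider must demonstrate coverage."""
    k = min(len(hex8_cells), illustrative_sample_size(len(hex8_cells)))
    return random.Random(seed).sample(hex8_cells, k)
```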
164. To ensure consistency, reliability, comparability, and verifiability of the data the Commission receives, in this document we require state, local, and Tribal government entities and other third parties, including small entities that fall within these categories, to comply with the challenge process applicable to providers. Consistent with our approach for providers, which does not carve out different or lower standards for smaller providers, requiring state, local, and Tribal government entities and third parties to submit on-the-ground test data using the analogous thresholds we adopted for mobile providers will ensure that the Commission implements a standardized process resulting in broadband availability maps that are as accurate and precise as possible. We are cognizant, however, that on-the-ground test data can be more costly to obtain and can impose burdens on small entities. Therefore, our consideration of appropriate verification data sources took into account both the usefulness and the costs of on-the-ground test data, and the fact that this type of data may not be necessary in every situation, particularly where infrastructure information is available, which, based on our analysis, will likely be of comparable probative value to on-the-ground test data in certain situations.
165. Finally, in the Second Order, the Commission adopted a crowdsourcing process to allow individuals and entities to submit information about the deployment and availability of broadband internet access service. Consistent with the data collection and submission requirements adopted in this document for the mobile challenge and verification processes, governmental entities and other third parties, including small entities that fall within these categories, can submit on-the-ground crowdsourced mobile speed test data using the online portal that will be used by providers for the challenge and verification processes. As mentioned above in Section E, crowdsourced data will be collected using a similar measurement methodology and submitted in a format similar to the format challengers and providers use to submit speed test data. This consistent approach to crowdsourced data will minimize the cost and administrative burdens for small entities and further ensure the uniformity, dependability, comparability, and verifiability of the data received by the Commission in the mobile challenge, verification, and crowdsourcing processes.
G. Report to Congress
166. The Commission will send a copy of the Order, including the Supplemental FRFA, in a report to be sent to Congress pursuant to the Congressional Review Act. In addition, the Commission will send a copy of the Order, including the Supplemental FRFA, to the Chief Counsel for Advocacy of the SBA. A copy of the Order and Supplemental FRFA (or summaries thereof) will also be published in the Federal Register.
III. Ordering Clauses
167. Accordingly, it is ordered that, pursuant to sections 1-4, 7, 201, 254, 301, 303, 319, 332, and 641-646 of the Communications Act of 1934, as amended, 47 U.S.C. 151-154, 157, 201, 254, 301, 303, 319, 332, 641-646, the Order is adopted.
168. It is further ordered that part 1 of the Commission's rules is amended as set forth in Appendix B of the Order.
169. It is further ordered that the Order shall be effective 30 days after publication in the Federal Register.
170. It is further ordered that the Office of the Managing Director, Performance Evaluation and Records Management, shall send a copy of the Order in a report to be sent to Congress and the Government Accountability Office pursuant to the Congressional Review Act, 5 U.S.C. 801(a)(1)(A).
List of Subjects in 47 CFR Part 1
- Administrative practice and procedure
- Broadband
- Broadband mapping
- Communications
- Internet
- Reporting and recordkeeping requirements
- Telecommunications
Federal Communications Commission.
Amy Brett,
Chief of Staff, Wireless Telecommunications Bureau.
Final Rules
For the reasons discussed in the preamble, the Federal Communications Commission amends 47 CFR part 1 as follows:
PART 1—PRACTICE AND PROCEDURE
1. The authority citation for part 1 continues to read as follows:
2. Amend § 1.7001 by adding paragraph (a)(20) to read as follows:
§ 1.7001 Scope and content of filed reports.
(a) * * *
(20) H3 standardized geospatial indexing system. A system developed by Uber Technologies, Inc., that overlays the Earth with hexagonal cells of different sizes at various resolutions. The smallest hexagonal cells are at resolution 15, in which the average hexagonal cell has an area of approximately 0.9 square meters, and the largest are at resolution 0, in which the average hexagonal cell has an area of approximately 4.25 million square kilometers. Hexagonal cells across different resolutions are referred to as a “hex-n” cell, where n is the resolution ( e.g., “hex-15” for the smallest size hexagonal cell). The H3 standardized geospatial indexing system employs a nested cell structure wherein a lower resolution hexagonal cell (the “parent”) contains approximately seven hexagonal cells at the next highest resolution (its “children”). That is, a hex-1 cell is the “parent” of seven hex-2 cells, each hex-2 cell is the parent of seven hex-3 cells, and so on.
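The nested structure this definition describes can be explored with Uber's open-source H3 library. The sketch below is illustrative only; it assumes version 4 of the Python bindings (version 3 uses different names such as geo_to_h3 and h3_to_parent) and arbitrary example coordinates.

```python
import h3  # Uber's open-source H3 bindings (pip install h3); v4 API shown

# Index a location at resolution 8, the cell size used for mobile challenges.
cell8 = h3.latlng_to_cell(38.8977, -77.0365, 8)

parent7 = h3.cell_to_parent(cell8, 7)       # the hex-7 "parent" cell
children9 = h3.cell_to_children(cell8, 9)   # its seven hex-9 "children"

print(len(children9))                               # 7
print(h3.cell_to_parent(children9[0], 8) == cell8)  # True: nested structure
```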
* * * * *
3. Amend § 1.7006 by:
a. Redesignating paragraphs (b)(2) through (4) as paragraphs (b)(3) through (5);
b. Adding new paragraph (b)(2);
c. Revising newly redesignated paragraphs (b)(4) and (5) and paragraphs (c) and (e)(1)(i);
d. Removing paragraph (e)(1)(ii);
e. Redesignating paragraph (e)(1)(iii) and the first paragraph (e)(1)(iv) as paragraphs (e)(1)(ii) and (iii);
f. Revising newly redesignated paragraph (e)(1)(ii) and paragraphs (e)(2), (4), and (6);
g. Adding paragraph (e)(7);
h. Revising paragraphs (f) introductory text and (f)(1)(i);
i. Removing “and” from the end of paragraph (f)(1)(ii);
j. Removing the period at the end of paragraph (f)(1)(iii) and adding “; and” in its place;
k. Adding paragraph (f)(1)(iv); and
l. Revising paragraphs (f)(2), (3), and (5).
The additions and revisions read as follows:
§ 1.7006 Data verification.
* * * * *
(b) * * *
(2) On-the-ground crowdsourced data must include the metrics and meet the testing parameters described in paragraphs (c)(1)(i) and (ii) of this section, except that the data may include any combination of download speed and upload speed rather than both.
* * * * *(4) If, as a result of crowdsourced data and/or other available data, the Commission determines that a provider's coverage information is likely not accurate, then the provider shall be subject to a verification inquiry consistent with the mobile verification process described in paragraph (c) of this section.
(5) All information submitted as part of the crowdsourcing process shall be made public via the Commission's website, with the exception of personally identifiable information and any data required to be confidential under § 0.457 of this chapter.
(c) Mobile service verification process for mobile providers. Mobile service providers must submit either infrastructure information or on-the-ground test data in response to a request by Commission staff as part of its inquiry to independently verify the accuracy of the mobile provider's coverage propagation models and maps. In addition to submitting either on-the-ground data or infrastructure data, a provider may also submit data collected from transmitter monitoring software. The Office of Economics and Analytics and the Wireless Telecommunications Bureau may require the submission of additional data when necessary to complete a verification inquiry. A provider must submit its data, in the case of both infrastructure information and on-the-ground data, within 60 days of receiving a Commission staff request. Regarding on-the-ground data, a provider must submit evidence of network performance based on a sample of on-the-ground tests that is statistically appropriate for the area tested. A provider must verify coverage of a sampled area using the H3 geospatial indexing system at resolution 8. The on-the-ground tests will be evaluated to confirm, using a one-sided 95% statistical confidence interval, that the cell coverage is 90% or higher. In submitting data in response to a verification request, a provider must record at least two tests within each of the randomly selected hexagons where the times of the tests are at least four hours apart, irrespective of date, unless, for any sampled hexagon, the provider has and submits alongside its speed tests actual cell loading data for the cell(s) covering the hexagon sufficient to establish that median loading, measured in 15-minute intervals, did not exceed the modeled loading factor for the one-week period prior to the verification inquiry, in which case the provider is required to submit only a single test for the sampled hexagon. We will treat any tests within the sampled accessible point-hex that are outside the coverage area as valid in the case where tests were not recorded within the coverage area. If the required sampled point-hexes continue to have missing tests, we will also consider tests that fall slightly outside the required point-hex but within the typical Global Positioning System (GPS) average user range error as valid when no tests are recorded within the point-hex. If a sampled point-hex still has missing tests, we will set those missing required speed tests as negative tests when performing the final adjudication. For in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle.
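For illustration only, and not as part of the codified rule, the two-tests-per-sampled-hexagon timing requirement described above might be checked as in the following Python sketch. It omits the single-test exception for hexagons with qualifying cell loading data, and treating the time-of-day difference as wrapping around midnight is an interpretive assumption of this sketch.

```python
from datetime import datetime

def meets_two_test_rule(test_times: list[datetime]) -> bool:
    """At least two tests whose times of day differ by four or more hours,
    irrespective of date; midnight wrap-around is an interpretive choice."""
    minutes = [t.hour * 60 + t.minute for t in test_times]
    for i in range(len(minutes)):
        for j in range(i + 1, len(minutes)):
            diff = abs(minutes[i] - minutes[j])
            if min(diff, 1440 - diff) >= 240:  # 240 minutes = 4 hours
                return True
    return False

tests = [datetime(2022, 5, 2, 8, 15), datetime(2022, 5, 9, 14, 30)]
print(meets_two_test_rule(tests))  # True: 8:15 and 14:30 differ by >4 hours
```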
(1) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting on-the-ground data, the mobile service provider must provide valid on-the-ground tests within a Commission-identified statistically valid and unbiased sample of its network.
(i) On-the-ground test data must meet the following testing parameters:
(A) A minimum test length of 5 seconds and a maximum test length of 30 seconds. These test length parameters apply individually to download speed, upload speed, and round-trip latency measurements, and do not include ramp up time. The minimum test duration requirement will be relaxed once a download or upload test measurement has transferred at least 1,000 megabytes of data;
(B) Reporting test measurement results that have been averaged over the duration of the test ( i.e., total bits received divided by total test time); and
(C) Conducted outdoors between the hours of 6:00 a.m. and 10:00 p.m. local time; and
(ii) On-the-ground test data must include the following metrics for each test:
(A) Testing app name and version;
(B) Timestamp and duration of each test metric;
(C) Geographic coordinates ( i.e., latitude/longitude) measured at the start and end of each test metric measured with typical GPS Standard Positioning Service accuracy or better, along with location accuracy;
(D) Consumer-grade device type(s), brand/model, and operating system used for the test;
(E) Name and identity of the service provider being tested;
(F) Location of test server ( e.g., hostname or IP address);
(G) Signal strength, signal quality, unique identifier, and radiofrequency metrics of each serving cell, where available;
(H) Download speed;
(I) Upload speed;
(J) Round-trip latency;
(K) Whether the test was taken in an in-vehicle mobile or outdoor, pedestrian stationary environment;
(L) For an in-vehicle test, the speed the vehicle was traveling when the test was taken, where available;
(M) An indication of whether the test failed to establish a connection with a mobile network at the time and place it was initiated;
(N) The network technology ( e.g., 4G LTE (Long Term Evolution), 5G-NR (New Radio)) and spectrum bands used for the test; and
(O) All other metrics required per the most recent specification for mobile test data adopted by the Office of Economics and Analytics and the Wireless Telecommunications Bureau in accordance with 5 U.S.C. 553.
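The per-test metrics in paragraphs (c)(1)(ii)(A) through (O) map naturally onto a structured record. The field names in this sketch are illustrative assumptions, not the Commission's official BDC data specification:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class SpeedTestRecord:
    """Illustrative container for the metrics in (c)(1)(ii); the field
    names are assumptions, not the official BDC specification."""
    app_name: str                       # (A) testing app name...
    app_version: str                    # ...and version
    timestamp: str                      # (B) ISO 8601 start of the metric
    duration_s: float                   # (B) duration of the test metric
    start_latlon: tuple[float, float]   # (C) coordinates at start of test
    end_latlon: tuple[float, float]     # (C) coordinates at end of test
    location_accuracy_m: float          # (C) reported location accuracy
    device: str                         # (D) device type, brand/model
    operating_system: str               # (D)
    provider_name: str                  # (E) provider being tested
    server: str                         # (F) test server hostname or IP
    cell_metrics: dict                  # (G) signal strength/quality, IDs
    download_mbps: Optional[float]      # (H)
    upload_mbps: Optional[float]        # (I)
    latency_ms: Optional[float]         # (J) round-trip latency
    environment: str                    # (K) "in-vehicle" or "stationary"
    vehicle_speed_kmh: Optional[float]  # (L) for in-vehicle tests
    connection_failed: bool             # (M) no connection at test start
    technology: str                     # (N) e.g., "4G LTE", "5G-NR"
    spectrum_bands: list[str] = field(default_factory=list)  # (N)
```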
(2) When a mobile service provider chooses to demonstrate mobile broadband coverage availability by submitting infrastructure data, the mobile service provider must submit such data for all cell sites and antennas that serve or interfere with the targeted area.
(i) Infrastructure data must include the following information for each cell site that the provider uses to provide service for the area subject to the verification inquiry:
(A) The latitude and longitude of the cell site measured with typical GPS Standard Positioning Service accuracy or better;
(B) The cell and site ID number for each cell site;
(C) The ground elevation above mean sea level (AMSL) of the site (in meters);
(D) Frequency band(s) used to provide service for each site being mapped including channel bandwidth (in megahertz);
(E) Radio technologies used on each band for each site;
(F) Capacity (megabits per second (Mbps)) and type of backhaul used at each cell site;
(G) Number of sectors at each cell site;
(H) Effective Isotropic Radiated Power (EIRP, in decibel-milliwatts (dBm)) of the sector at the time the mobile provider creates its map of the coverage data;
(I) Geographic coordinates of each transmitter site measured with typical GPS Standard Positioning Service accuracy or better;
(J) Per site classification ( e.g., urban, suburban, or rural);
(K) Elevation above ground level for each base station antenna and other transmit antenna specifications ( i.e., the make and model, beamwidth (in degrees), radiation pattern, and orientation (azimuth and any electrical and/or mechanical down-tilt in degrees) at each cell site);
(L) Operating transmit power of the radio equipment at each cell site;
(M) Throughput and associated required signal strength and signal-to-noise ratio;
(N) Cell loading distribution;
(O) Areas enabled with carrier aggregation and a list of band combinations; and
(P) Any additional parameters and fields that are listed in the most-recent specifications for wireless infrastructure data released by the Office of Economics and Analytics and the Wireless Telecommunications Bureau in accordance with 5 U.S.C. 553.
(ii) [Reserved]
* * * * *
(e) * * *
(1) * * *
(i) Name, email address, and mobile phone number of the device on which the speed test was conducted;
(ii) Speed test data. Consumers must use a speed test app that has been designated by the Office of Engineering and Technology, in consultation with the Office of Economics and Analytics and the Wireless Telecommunications Bureau, for use in the challenge process. Consumer challenges must include on-the-ground test data that meets the requirements in paragraphs (c)(1)(i) and (ii) of this section, and must also report the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server;
* * * * *
(2) Consumer speed tests will be used to create a cognizable challenge based on the following criteria:
(i) The smallest challengeable hexagonal cell is a hexagon at resolution 8 from the H3 standardized geospatial indexing system.
(ii) The download and upload components of a speed test will be evaluated separately.
(iii) A “positive” component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred ( e.g., a positive download component would show speeds of at least 5 Mbps for 4G LTE, and a positive upload component would show speeds of at least 1 Mbps for 4G LTE). A “negative” component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred.
(iv) A point-hex shall be defined as one of the seven hex-9s from the H3 standardized geospatial indexing system nested within a hex-8.
(v) A point-hex shall be defined as accessible where at least 50% of the area of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road in the U.S. Census Bureau's TIGER/Line Shapefiles.
(vi) A hex-8 from the H3 standardized geospatial indexing system shall be classified as challenged if the following three thresholds are met in the hex-8 for either the download or upload components.
(A) Geographic threshold. When there are at least four accessible point-hexes within the hex-8, each must contain two of the same test components (download or upload), one of which is a negative test. The threshold must be met for one component entirely, meaning that a challenge may contain either two upload components per point-hex, one of which is negative, or two download components per point-hex, one of which is negative. The minimum number of point-hexes in which tests must be recorded must be equal to the number of accessible point-hexes or four, whichever number is lower. If there are no accessible point-hexes within a hex-8, the geographic threshold shall not need to be met;
(B) Temporal threshold. A hex-8 cell must include a set of two negative test components of the same type with a time-of-day difference of at least four hours from another set of two negative test components of the same type, regardless of the date of the tests; and
(C) Testing threshold. At least five speed test components of the same type within a hex-8 cell are negative when a challenger has submitted 20 or fewer test components of that type.
( 1 ) When challengers have submitted more than 20 test components of the same type, the following minimum percentage of the total number of test components of that type in the cell must be negative:
( i ) When challengers have submitted 21-29 test components, at least 24% must be negative;
( ii ) When challengers have submitted 30-45 test components, at least 22% must be negative;
( iii ) When challengers have submitted 46-60 test components, at least 20% must be negative;
( iv ) When challengers have submitted 61-70 test components, at least 18% must be negative;
( v ) When challengers have submitted 71-99 test components, at least 17% must be negative; and
( vi ) When challengers have submitted 100 or more test components, at least 16% must be negative.
( 2 ) In a hex-8 with four or more accessible point-hexes, if the number of test components of the same type in one point-hex represent more than 50% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only towards 50% of the threshold. In a hex-8 where there are only three accessible point-hexes, if the number of test components of the same type in one point-hex represent more than 75% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only towards 75% of the threshold.
( 3 ) Once the percentage of negative components of the same type recorded meets the minimum negative percentage required (or, for a sample of fewer than 21 components, once there are at least five negative components submitted), no additional tests are required so long as both the geographic and temporal thresholds for a hex-8 have been met.
(vii) A larger, “parent” hexagon (at resolutions 7 or 6) shall be considered challenged if at least four of the child hexagons within such a “parent” hexagon are considered challenged.
(viii) Mobile service providers shall be notified of all cognizable challenges to their mobile broadband coverage maps at the end of each month. Challengers shall be notified when a mobile provider responds to the challenge. Mobile service providers and challengers both shall be notified monthly of the status of challenged areas and parties will be able to see a map of the challenged area and a notification about whether or not a challenge has been successfully rebutted, whether a challenge was successful, and if a challenged area was restored based on insufficient evidence to sustain a challenge.
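For illustration only, the testing threshold in paragraph (e)(2)(vi)(C) can be expressed compactly in code. This sketch covers just the five-negative floor and the sliding percentage scale; the geographic and temporal thresholds and the 50%/75% point-hex weighting in paragraph (C)( 2 ) are omitted, and the function names are invented for the sketch.

```python
def required_negative_fraction(n_components: int) -> float:
    """Minimum fraction of negative components for a cognizable challenge
    when more than 20 components of a type were submitted (paragraph (C))."""
    brackets = [(29, 0.24), (45, 0.22), (60, 0.20), (70, 0.18), (99, 0.17)]
    for upper, fraction in brackets:
        if n_components <= upper:
            return fraction
    return 0.16  # 100 or more components

def meets_testing_threshold(negatives: int, total: int) -> bool:
    """Five negatives suffice for samples of 20 or fewer components;
    larger samples use the sliding percentage scale above."""
    if total <= 20:
        return negatives >= 5
    return negatives / total >= required_negative_fraction(total)

print(meets_testing_threshold(8, 40))   # False: 20% is below the 22% required
print(meets_testing_threshold(9, 40))   # True: 22.5% meets the 22% required
```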
* * * * *
(4) To dispute a challenge, a mobile service provider must submit on-the-ground test data that meets the requirements in paragraphs (c)(1)(i) and (ii) of this section (for in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle), or infrastructure data that meets the requirements in paragraph (c)(2)(i) of this section to verify its coverage map(s) in the challenged area. To the extent that a mobile service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to either infrastructure or on-the-ground testing (to the extent such data are not the primary option chosen by the provider) or other types of data such as data collected from network transmitter monitoring systems or software, or spectrum band-specific coverage maps. Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network). If a mobile provider is not able to demonstrate sufficient coverage in a challenged hexagon, the mobile provider must revise its coverage maps to reflect the lack of coverage in such areas.
(i) A “positive” component is one that records speeds meeting or exceeding the minimum speeds that the mobile service provider reports as available where the test occurred ( e.g., a positive download component would show speeds of at least 5 Mbps for 4G LTE, and a positive upload component would show speeds of at least 1 Mbps for 4G LTE). A “negative” component is one that records speeds that fail to meet the minimum speeds that the mobile service provider reports as available where the test occurred.
(ii) A point-hex shall be defined as one of the seven nested hexagons at resolution 9 from the H3 standardized geospatial indexing system of a resolution 8 hexagon.
(iii) A point-hex shall be defined as accessible where at least 50% of the area of the point-hex overlaps with the provider's reported coverage data and the point-hex overlaps with any primary, secondary, or local road in the U.S. Census Bureau's TIGER/Line Shapefiles.
(iv) A mobile service provider that chooses to rebut a challenge to their mobile broadband coverage maps with on-the-ground speed test data must confirm that a challenged area has sufficient coverage using speed tests that were conducted during the 12 months prior to submitting a rebuttal. A provider may confirm coverage in any hex-8 cell within the challenged area. This includes any hex-8 cell that is challenged, and also any non-challenged hex-8 cell that is a child of a challenged hex-7 or hex-6 cell. Confirming non-challenged hex-8 cells can be used to confirm the challenged hex-7 or hex-6 cell. To confirm a hex-8 cell, a provider must submit on-the-ground speed test data that meets the following criteria for both upload and download components:
(A) Geographic threshold. Two download components, at least one of which is a positive test, and two upload components, at least one of which is a positive test, are recorded within a minimum number of point-hexes within the challenged area, where the minimum number of point-hexes in which tests must be recorded must be equal to the number of accessible point-hexes or four, whichever number is lower. If there are no accessible point-hexes within a hex-8, the geographic threshold shall not need to be met.
(B) Temporal threshold. A hex-8 cell will need to include a set of five positive test components of the same type with a time-of-day difference of at least four hours from another set of five positive test components of the same type, regardless of the date of the test.
(C) Testing threshold. At least 17 positive test components of the same type within a hex-8 cell in the challenged area when the provider has submitted 20 or fewer test components of that type. When the provider has submitted more than 20 test components of the same type, a certain minimum percentage of the total number of test components of that type in the cell must be positive:
( 1 ) When a provider has submitted 21-34 test components, at least 82% must be positive;
( 2 ) When a provider has submitted 35-49 test components, at least 84% must be positive;
( 3 ) When a provider has submitted 50-70 test components, at least 86% must be positive;
( 4 ) When a provider has submitted 71-99 test components, at least 87% must be positive;
( 5 ) When a provider has submitted 100 or more test components, at least 88% must be positive; and
( 6 ) In a hex-8 with four or more accessible point-hexes, if the number of test components of the same type in one point-hex represent more than 50% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 50% of the threshold. In a hex-8 where there are only three accessible point-hexes, if the number of test components of the same type in one point-hex represent more than 75% of the total test components of that type in the hex-8 but still satisfies the geographic threshold, the components in that point-hex will count only toward 75% of the threshold.
(D) Use of FCC Speed Test App or other software. Using a mobile device running either a Commission-developed app ( e.g., the FCC Speed Test app), another speed test app approved by OET to submit challenges, or other software, provided that the software adopts the test methodology and collects the metrics that approved apps must collect for consumer challenges and that government and third-party entity challenger speed test data must contain (for in-vehicle mobile tests, providers must conduct tests with the antenna located inside the vehicle):
( 1 ) Providers must submit a complete description of the methodologies used to collect their data; and
( 2 ) Providers must substantiate their data through the certification of a qualified engineer or official.
(E) Use of an appropriate device. Using a device that is able to interface with drive test software and/or runs on the Android operating system.
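Mirroring the challenge-side sketch above, the provider's rebuttal testing threshold in paragraph (e)(4)(iv)(C) might be checked as follows, again for illustration only and again omitting the geographic and temporal thresholds and the point-hex weighting:

```python
def required_positive_fraction(n_components: int) -> float:
    """Minimum fraction of positive components to confirm a hex-8 cell
    when more than 20 components of a type were submitted (paragraph (C))."""
    brackets = [(34, 0.82), (49, 0.84), (70, 0.86), (99, 0.87)]
    for upper, fraction in brackets:
        if n_components <= upper:
            return fraction
    return 0.88  # 100 or more components

def provider_confirms_cell(positives: int, total: int) -> bool:
    """Seventeen positives suffice for samples of 20 or fewer components;
    larger samples use the sliding percentage scale above."""
    if total <= 20:
        return positives >= 17
    return positives / total >= required_positive_fraction(total)

print(provider_confirms_cell(18, 20))  # True: meets the 17-positive floor
print(provider_confirms_cell(27, 34))  # False: about 79.4%, below 82%
```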
(v) A mobile service provider that chooses to rebut a challenge to their mobile broadband coverage maps with infrastructure data on their own may only do so in order to identify invalid, or non-representative, speed tests within the challenger speed test data. The mobile service provider must submit the same data as required when a mobile provider submits infrastructure information in response to a Commission verification request, including information on the cell sites and antennas used to provide service in the challenged area. A provider may submit only infrastructure data to rebut a challenge if:
(A) Extenuating circumstances at the time and location of a given test ( e.g., maintenance or temporary outage at the cell site) caused service to be abnormal. In such cases, a provider must submit coverage or footprint data for the site or sectors that were affected and information about the outage, such as bands affected, duration, and whether the outage was reported to the FCC's Network Outage Reporting System (NORS), along with a certification about the submission's accuracy;
(B) The mobile device(s) with which the challenger(s) conducted their speed tests are not capable of using or connecting to the radio technology or spectrum band(s) that the provider models for service in the challenged area. In such cases, a provider must submit band-specific coverage footprints and information about which specific device(s) lack the technology or band;
(C) The challenge speed tests were taken during an uncommon special event ( e.g., professional sporting event) that increased traffic on the network;
(D)( 1 ) The challenge speed tests were taken during a period where cell loading was abnormally higher than the modeled cell loading factor. In such cases, providers must submit cell loading data that both:
( i ) Establish that the cell loading for the primary cell(s) at the time of the test was abnormally higher than modeled; and
( ii ) Include cell loading data for a one-week period before and/or after the provider was notified of the challenge showing as a baseline that the median loading for the primary cell(s) was not greater than the modeled value.
( 2 ) If a high number of challenges show persistent over-loading, staff may initiate a verification inquiry to investigate whether mobile providers have submitted coverage maps based on an accurate assumption of cell loading in a particular area;
(E) The mobile device(s) with which the challenger(s) conducted their speed tests used a data plan that could result in slower service. In such cases, a provider must submit information about which specific device(s) used in the testing were using such a data plan and information showing that the provider's network did, in fact, slow the device at the time of the test; or
(F) The mobile device(s) with which the challenger(s) conducted their speed tests was either roaming or was used by the customer of a mobile virtual network operator. In such circumstances, providers must identify which specific device(s) used in the testing were either roaming at the time or used by the customer of a mobile virtual network operator based upon their records.
(vi) If the Commission determines, based on the infrastructure data submitted by providers, that challenge speed tests are invalid, such challenge speed tests shall be ruled void, and the Commission shall recalculate the challenged hexagons after removing any invalidated challenger speed tests and consider any challenged hexagons that no longer meet the challenge creation threshold to be restored to their status before the challenge was submitted.
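Continuing these sketches, the recalculation described in paragraph (vi) amounts to dropping the voided tests and re-running the same adjudication. Here meets_testing_threshold() is the illustrative function from the sketch following paragraph (e)(2), and the record fields are likewise assumptions, not the Commission's schema.

```python
def recalculate_after_invalidation(components: list[dict],
                                   invalid_ids: set[str]) -> bool:
    """Remove invalidated speed test components, then re-check whether the
    hexagon still meets the challenge testing threshold."""
    kept = [c for c in components if c["test_id"] not in invalid_ids]
    negatives = sum(1 for c in kept if c["negative"])
    return meets_testing_threshold(negatives, len(kept))
```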
* * * * *
(6) After a challenged provider submits all responses and Commission staff have determined the result of the challenge and any subsequent rebuttal:
(i) In such cases where a mobile service provider successfully rebuts a challenge, the area confirmed to have coverage shall be ineligible for challenge until the next biannual broadband availability data filing six months after the later of either the end of the 60-day response period or the resolution of the challenge.
(ii) A challenged area may be restored to an unchallenged state, if, as a result of data submitted by the provider, there is no longer sufficient evidence to sustain the challenge to that area, but the provider's data fall short of confirming the area. A restored hexagon would be subject to challenge at any time in the future as challengers submit new speed test data.
(iii) In cases where a mobile service provider concedes or loses a challenge, the provider must file, within 30 days, geospatial data depicting the challenged area that has been shown to lack sufficient service. Such data will constitute a correction layer to the provider's original propagation model-based coverage map, and Commission staff will use this layer to update the broadband coverage map. In addition, to the extent that a provider does not later improve coverage for the relevant technology in an area where it conceded or lost a challenge, it must include this correction layer in its subsequent filings to indicate the areas shown to lack service.
(7) Commission staff are permitted to consider other relevant data to support a mobile service provider's rebuttal of challenges, including on-the-ground data or infrastructure data (to the extent such data are not the primary rebuttal option submitted by the mobile service provider). The Office of Economics and Analytics will review such data when voluntarily submitted by providers in response to challenges, and if it concludes that any of the data sources are sufficiently reliable, it will specify appropriate standards and specifications for each type of data and will issue a public notice adding the data source to the alternatives available to providers to rebut a consumer challenge.
(f) Mobile service challenge process for State, local, and Tribal governmental entities; and other entities or individuals. State, local, and Tribal governmental entities and other entities or individuals may submit data to challenge the accuracy of mobile broadband coverage maps. They may challenge mobile coverage data based on lack of service or poor service quality, such as slow delivered user speed.
(1) * * *
(i) Government and other entity challengers may use their own software and hardware to collect data for the challenge process. When they submit their data, the data must meet the requirements in paragraphs (c)(1)(i) and (ii) of this section, except that government and other entity challengers may submit the International Mobile Equipment Identity (IMEI) of the device used to conduct a speed test for use in the challenge process instead of the timestamp that test measurement data were transmitted to the app developer's servers, as well as the source IP address and port of the device, as measured by the server;
* * * * *(iv) If the test was taken in an in-vehicle mobile environment, whether the test was conducted with the antenna outside of the vehicle.
(2) Challengers must conduct speed tests using a device advertised by the challenged service provider as compatible with its network and must take all speed tests outdoors. Challengers must also use a device that is able to interface with drive test software and/or runs on the Android operating system.
(3) For a challenge to be considered a cognizable challenge, thus requiring a mobile service provider response, the challenge must meet the same thresholds specified in paragraph (e)(2) of this section.
(5) To dispute a challenge, a mobile service provider must submit on-the-ground test data or infrastructure data to verify its coverage map(s) in the challenged area based on the methodology set forth in paragraph (e)(4) of this section. To the extent that a service provider believes it would be helpful to the Commission in resolving a challenge, it may choose to submit other data in addition to the data initially required, including but not limited to either infrastructure or on-the-ground testing (to the extent such data are not the primary option chosen by the provider) or other types of data such as data collected from network transmitter monitoring systems or software or spectrum band-specific coverage maps. Such other data must be submitted at the same time as the primary on-the-ground testing or infrastructure rebuttal data submitted by the provider. If needed to ensure an adequate review, the Office of Economics and Analytics may also require that the provider submit other data in addition to the data initially submitted, including but not limited to either infrastructure or on-the-ground testing data (to the extent not the option initially chosen by the provider) or data collected from network transmitter monitoring systems or software (to the extent available in the provider's network).
* * * * *
4. Amend § 1.7008 by revising paragraph (d)(2) to read as follows:
§ 1.7008 Creation of broadband internet access service coverage maps.
* * * * *
(d) * * *
(2) To the extent government entities or third parties choose to file verified data, they must follow the same filing process as providers submitting their broadband internet access service data in the data portal. Government entities and third parties that file on-the-ground test data must submit such data using the same metrics and testing parameters the Commission requires of mobile service providers when responding to a Commission request to verify mobile providers' broadband network coverage with on-the-ground data ( see § 1.7006(c)(1)).
* * * * *[FR Doc. 2022-06826 Filed 4-8-22; 8:45 am]
BILLING CODE 6712-01-P