[Federal Register Volume 63, Number 94 (Friday, May 15, 1998)]
[Proposed Rules]
[Pages 27021-27035]
From the Federal Register Online via the Government Publishing Office [www.gpo.gov]
[FR Doc No: 98-12971]
=======================================================================
-----------------------------------------------------------------------
FEDERAL COMMUNICATIONS COMMISSION
47 CFR Chapter I
[CC Docket No. 98-56, RM-9101, FCC 98-72]
Performance Measurements and Reporting Requirements for
Operations Support Systems, Interconnection, and Operator Services and
Directory Assistance
AGENCY: Federal Communications Commission.
ACTION: Proposed rule.
-----------------------------------------------------------------------
SUMMARY: The Commission is issuing this Notice of Proposed Rulemaking
seeking comment on various proposed performance measurements and
reporting requirements relating to incumbent carriers' operations
support systems (OSS). The performance measurements and reporting
requirements proposed in the NPRM will complement existing state
proceedings and efforts by carriers, independent of regulatory
requirements, to incorporate performance measurements into their
interconnection agreements.
DATES: Comments are due on or before June 1, 1998 and Reply Comments
are due on or before June 22, 1998. Written comments by the public on
the proposed information collections are due June 1, 1998. Written
comments must be submitted by the Office of Management and Budget (OMB)
on the proposed information collections on or before July 14, 1998.
ADDRESSES: Comments and reply comments should be sent to the Office of the
Secretary, Federal Communications Commission, 1919 M Street, N.W., Room
222, Washington, D.C. 20554, with a copy to Janice Myles of the Common
Carrier Bureau, 1919 M Street, N.W., Room 544, Washington, D.C. 20554.
Parties should also file one copy of any documents filed in this docket
with the Commission's copy contractor, International Transcription
Services, Inc., 1231 20th St., N.W., Washington, D.C. 20036. In
addition to filing comments with the Secretary, a copy of any comments
on the information collections contained herein should be submitted to
Judy Boley, Federal Communications Commission, Room 234, 1919 M Street,
N.W., Washington, D.C. 20554, or via the Internet to jboley@fcc.gov,
and to Timothy Fain, OMB Desk Officer, 10236 NEOB, 725 17th Street,
N.W., Washington, D.C. 20503 or via the Internet to fain_t@al.eop.gov.
FOR FURTHER INFORMATION CONTACT: Radhika Karmarkar, Attorney, Common
Carrier Bureau, Policy and Program Planning Division, (202) 418-1580.
For additional information concerning the information collections
contained in this NPRM contact Judy Boley at (202) 418-0214, or via the
Internet at jboley@fcc.gov.
SUPPLEMENTARY INFORMATION: This is a summary of the Commission's Notice
of Proposed Rulemaking adopted April 16, 1998 and released April 17,
1998 (FCC 98-72). This NPRM contains proposed information collections
subject to the Paperwork Reduction Act of 1995 (PRA). It has been
submitted to the OMB for review under the PRA. The OMB, the general
public, and other Federal agencies are invited to comment on the
proposed information collections contained in this proceeding. The full
text of this Notice of Proposed Rulemaking is available for inspection
and copying during normal business hours in the FCC Reference Center,
1919 M St., N.W., Room 239, Washington, D.C. The complete text also may
be obtained through the World Wide Web, at http://www.fcc.gov/Bureaus/
Common_Carrier/Orders/fcc9872.wp, or may be purchased from the
Commission's copy contractor, International Transcription Service,
Inc., (202) 857-3800, 1231 20th St., N.W., Washington, D.C. 20036.
Paperwork Reduction Act
This NPRM contains a proposed information collection. The
Commission, as part of its continuing effort to reduce paperwork
burdens,
invites the general public and OMB to comment on the information
collections contained in this NPRM, as required by the Paperwork
Reduction Act of 1995, Public Law 104-13. Public and agency comments
are due at the same time as other comments on this NPRM; OMB
notification of action is due July 14, 1998. Comments should address:
(a) whether the proposed collection of information is necessary for the
proper performance of the functions of the Commission, including
whether the information shall have practical utility; (b) the accuracy
of the Commission's burden estimates; (c) ways to enhance the quality,
utility, and clarity of the information collected; and (d) ways to
minimize the burden of the collection of information on the
respondents, including the use of automated collection techniques or
other forms of information technology.
OMB Approval Number: None.
Title: Performance Measurements and Reporting Requirements for
Operations Support Systems, Interconnection, and Operator Services and
Directory Assistance.
Form No.: N/A.
Type of Review: New collection.
----------------------------------------------------------------------------------------------------------------
                                                                        Number of       Estimated        Total
                     Information collection                           respondents       time per         annual
                                                                      (approximately)   response         burden
                                                                         (annual)        (hours)         (hours)
----------------------------------------------------------------------------------------------------------------
Pre-Ordering: Average Response Time.................................. 11 240 2,640
Ordering/Provisioning: Order Completion Measurements................. 11 480 5,280
Ordering/Provisioning: Coordinated Customer Conversions.............. 11 240 2,640
Ordering/Provisioning: Order Status Measurements..................... 11 1,200 13,200
Ordering/Provisioning: Held Order Measurement........................ 11 240 2,640
Ordering/Provisioning: Installation Troubles Measurement............. 11 240 2,640
Ordering/Provisioning: Order Quality Measurements.................... 11 480 5,280
Ordering/Provisioning: 911 Database Update and Accuracy.............. 11 480 5,280
Repair and Maintenance Measurements.................................. 11 960 10,560
Billing Measurements................................................. 11 480 5,280
General Measurements: Systems Availability........................... 11 240 2,640
General Measurements: Center Responsiveness.......................... 11 240 2,640
General Measurements: OS/DA.......................................... 11 240 2,640
Interconnection: Trunk Blockage Measurements......................... 11 480 5,280
Interconnection: Collocation Measurements............................ 11 720 7,920
----------------------------------------------------------------------------------------------------------------
Frequency of Response: Monthly; On occasion.
Total Annual Burden: 76,560 hours.
Respondents: Business or other for-profit.
Estimated costs per respondent: $800,000.
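For reference, the Total Annual Burden figure above is the sum, across
the fifteen proposed collections, of each collection's estimated hours
per response multiplied by the approximately 11 annual respondents. A
minimal arithmetic check (in Python; the variable names are
illustrative only, and the figures are taken from the table above):

    # Estimated hours per response for each of the fifteen proposed
    # collections, in the order listed in the table above.
    hours_per_response = [240, 480, 240, 1200, 240, 240, 480, 480,
                          960, 480, 240, 240, 240, 480, 720]
    respondents = 11  # approximate number of annual respondents per collection

    total_annual_burden = sum(hours * respondents for hours in hours_per_response)
    print(total_annual_burden)  # 76,560 hours, matching the Total Annual Burden above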
Needs and Uses: The NPRM seeks comment on certain performance
measurements and reporting requirements to implement the
interconnection requirements of the 1996 Act. The proposed measurements
are intended to permit a direct assessment of whether an incumbent
local exchange carrier is complying with its obligations under section
251 of the Communications Act of 1934, as amended.
Synopsis of Notice of Proposed Rulemaking
I. Introduction
1. In this proceeding, we explore ways to advance a fundamental
goal of the Telecommunications Act of 1996--to increase consumer choice
by fostering competition in the provision of local telephone service.
The 1996 Act requires incumbent local telephone service providers to
open their markets to competition.
2. Congress required incumbents to make available to new entrants
in a nondiscriminatory, and just and reasonable manner the services and
facilities the incumbents use to provide retail services to their own
customers. In order to take advantage of the service and facility
offerings that Congress requires incumbents to provide, new entrants
need access to the support functions that incumbents use to process
orders from their own customers.
3. In this proceeding, we propose a methodology by which to analyze
whether new providers of local telephone service are able to access,
among other things, the support functions (that is, the functions
provided by computer systems, databases, and personnel) of incumbent
local telephone companies in a manner consistent with the 1996 Act's
nondiscrimination requirement. We seek comment, as explained below, on
certain proposed measurements and reports designed to illuminate the
performance of incumbent local telephone companies in providing access
to these vital support functions. Such performance measurements will
assist incumbents, new entrants, and regulators in evaluating an
incumbent's performance in meeting its statutory obligations. We do
not, however, propose specific performance standards or technical
standards. We also seek comment on ways to achieve the statutory goals,
while also minimizing the burden on all incumbent carriers, especially
small, rural, and midsized incumbent local telephone companies.
4. We recognize that some state commissions have undertaken efforts
to develop performance measurements and reporting requirements for
these support functions. Other states have yet to begin such efforts,
but plan to do so. States have sought this Commission's help in
developing these measurements. The primary goal of this NPRM,
therefore, is to provide guidance, in the most efficient and
expeditious manner possible, to the states and the industry on a set of
performance measurements and reporting requirements that will help spur
the development of local competition. Accordingly, we propose, in the
first instance, to adopt model performance measures and reporting
requirements, as described in detail herein, that are not legally
binding. This approach will allow those states that have commenced
proceedings to incorporate the model performance measurements and
reporting requirements as they deem beneficial and aid those states
that have not begun work in this area. We expect to develop such model
performance measurements and reporting requirements as expeditiously as
possible once the record closes in this proceeding. The experience we
gain from the
development of these model performance measurements and reporting
requirements and their application by the states will, we believe,
provide a more informed and comprehensive record upon which to decide
whether to adopt national, legally binding rules. The adoption of
national rules may, however, prove to be unnecessary in light of the
states' and carriers' application of the model performance measurements
and reporting requirements that we intend to adopt in the first
instance. We emphasize our belief that the adoption of model
performance measurements and reporting requirements to serve as
guidelines for state commissions constitutes the most efficient and
effective role for the Commission in this area at this time.
II. Background
A. Procedural History
5. On May 30, 1997, LCI International Telecom Corp. (LCI) and the
Competitive Telecommunications Association (CompTel) jointly filed a
petition asking the Commission to initiate a rulemaking proceeding
(``LCI/CompTel Petition'') concerning the requirements governing OSS,
interconnection, and other related activities established by the
Commission in its Local Competition First Report and Order, 61 FR
45476, August 29, 1996. On June 10, 1997, the Commission issued a
Public Notice seeking comment on the LCI/CompTel petition. A number of
parties, including both incumbent LECs and competing carriers, filed
comments and reply comments in response to this Public Notice.
6. Among other things, petitioners ask the Commission to establish:
(1) performance measurements and reporting requirements for the
provision of operations support systems (OSS) functions; (2) default
performance standards or benchmarks that would apply when an incumbent
LEC fails, or refuses, to report on its performance; (3) technical
standards for OSS interfaces; and (4) remedial provisions that would
apply to non-compliant incumbent LECs. In their petition, LCI/CompTel
propose that the Commission rely on the Service Quality Measurements
adopted by the Local Competition Users Group (LCUG) as the basis for
establishing performance measurements, reporting requirements, and
default performance standards. On October 8, 1997, LCUG filed a revised
proposal that described in detail its proposed performance measurements
and default standards. A number of parties filed additional ex parte
comments, offering their own proposed measurements and addressing the
specific recommendations made by LCUG in its revised proposal.
B. Summary of Proposals
7. In this NPRM, we tentatively conclude that we should propose
model performance measurements and reporting requirements for OSS
functions, interconnection, and access to operator services and
directory assistance. In Part III, we discuss the respective roles of
the Commission and the states with regard to the development and
implementation of model rules, as well as with respect to the
establishment of legally binding rules. In Part IV, we set forth
proposed performance measurements. In Part V, we discuss reporting
procedures, and in Part VI we propose methods to evaluate performance
measurements. As explained in Part VII, we conclude that we will not
address at this time several points raised in the LCI/CompTel petition,
such as the establishment of national performance standards, technical
standards, and enforcement mechanisms. In addition, we recognize that
the proposals set forth in this NPRM may disproportionately impact
small, rural, and midsized incumbent LECs. Consequently, in Part VIII
we also seek comment on the potential burdens that our proposed model
rules could impose on these incumbent LECs and we seek comment on
possible remedies.
III. Role of Commission and States
8. LCI and CompTel petitioned the Commission to initiate a
rulemaking to promulgate performance measurements and reporting
requirements. States as well have urged us to assist them in developing
these measurements. Indeed, NARUC passed a resolution seeking such
assistance. It states in pertinent part:
Resolved: That the FCC be urged to move promptly to advance the
establishment of performance guidelines that can be used to evaluate
the provision of access to the components of OSS functions * * *.
Individual states have also begun work in this area. For example,
California and New York have initiated proceedings to develop OSS
requirements, including performance measurements and reporting
requirements.
9. The primary goal of this NPRM is to provide the requested
guidance to the states in the most efficient and expeditious manner
possible. Accordingly, we intend, in the first instance, to adopt a set
of model performance measurements and reporting requirements, based on
the detailed descriptions provided herein and subject to whatever
modifications we deem appropriate in light of comments received. These
model performance measurements and reporting requirements would not be
legally binding.
10. We recognize that parties in this proceeding have offered
differing opinions concerning our jurisdiction to issue OSS rules. Some
have argued that the Eighth Circuit's decision in Iowa Utilities Board v. FCC
would preclude our authority to establish rules relating to OSS, while
others have argued, to the contrary, that portions of that decision
would validate our authority to issue such rules. We invite parties to
comment on this issue. Given that our primary goal is to provide
guidance to states through the adoption of model rules in the first
instance, however, we strongly encourage parties to focus on the
substance of the proposed performance measurements and reporting
requirements, rather than focusing exclusively on issues of
jurisdiction.
IV. Proposed Performance Measurements and Reporting Requirements
A. General Issues
11. In this section, we propose performance measurements for each
of the five OSS functions, as well as for interconnection and OS/DA.
These measurements are intended to permit a direct assessment of
whether an incumbent LEC is complying with its obligations under
section 251.
12. In the Local Competition First Report and Order, the Commission
determined that, because OSS includes the information necessary to
obtain other network elements or resold services, providing access to
OSS functions falls squarely within an incumbent LEC's duty under
section 251(c)(3) to provide unbundled network elements under terms and
conditions that are nondiscriminatory, just and reasonable, and its
duty under section 251(c)(4) to offer resale services without imposing
any limitations or conditions that are discriminatory or unreasonable.
Additionally, the Commission identified OSS itself as a network element
and stated that it consisted of five functions: (1) pre-ordering; (2)
ordering; (3) provisioning; (4) maintenance and repair; and (5)
billing. The Commission concluded that, as with all unbundled network
elements, an incumbent LEC must provide access to these five OSS
functions that is equivalent to what it provides itself, its own end-
user customers, or other carriers.
13. As a practical matter, for those OSS functions provided to
competing carriers that are analogous to OSS functions that an
incumbent LEC provides itself in connection with retail service
offerings, the incumbent LEC must provide access to competing carriers
that is equivalent to the level of access that the incumbent LEC
provides itself in terms of quality, accuracy, and timeliness. Thus,
for example, for those functions that an incumbent LEC itself accesses
electronically, the incumbent LEC must provide electronic access for
competing carriers. In addition, competing carriers must have access to
OSS functions that allows them to make use of such functions in
``substantially the same time and manner'' as the incumbent LEC. For
those OSS functions that have no direct retail analog, such as the
ordering and provisioning of unbundled network elements, an incumbent
LEC must provide access sufficient to allow an efficient competitor a
meaningful opportunity to compete.
14. With respect to interconnection, the Commission concluded that
``section 251(c)(2)(C) requires an incumbent LEC to provide
interconnection between its network and that of a requesting carrier at
a level of quality that is at least indistinguishable from that which
the incumbent provides itself, a subsidiary, an affiliate, or any other
party.'' Finally, incumbent LECs are obligated under section 251(c)(3)
to provide nondiscriminatory access to operator services and directory
assistance because they are network elements.
15. The measurements we propose in this NPRM are designed to assist
in assessing an incumbent LEC's performance in providing OSS,
interconnection, and OS/DA to competing carriers. Various parties
presented proposals for performance measurements in this proceeding. We
conclude, however, that no single proposal optimally balances our goals
of detecting possible instances of discrimination while minimizing, to
the extent possible, burdens imposed on incumbent LECs. We therefore
propose a set of measurements that we believe provides an appropriate
balance of these goals.
16. We recognize that reporting averages of performance
measurements alone, without further analysis, may not reveal whether
there are underlying differences in the way incumbent LECs treat their
own retail operations in relation to the way they treat competing
carriers. Consequently, we propose, as part of the model rules proposed
herein, the use of statistical tests to determine whether measured
differences in the average performance of incumbent LECs toward their
retail customers and toward competing carriers represent true
differences in behavior rather than random chance. Further, we
recognize that reporting on averages alone may mask potential forms of
discrimination. For example, an incumbent LEC may have the same average
completion interval in providing service to competing carriers as it
has in providing service to its retail customers, but the variation in
completion intervals in providing the service may differ greatly. It
may be the case, for instance, that the average completion interval is
four days for both competing carriers and retail customers, but half of
competing carriers' orders are completed in one day and half in seven
days, while all of retail customers' orders are completed in exactly
four days. For this reason, we seek comment below on the possible use
of statistical tests that capture differences in variances between two
samples as well as tests of differences in averages. We also seek
comment below on whether, as part of the model rules proposed herein,
the data underlying the performance measurement results should be made
available to competing carriers so that they can evaluate the incumbent
LECs' performance in other ways if they choose to do so.
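To illustrate the kind of statistical comparison contemplated above,
the following sketch (in Python, using the scipy library) applies a
two-sample test of means and a test of equality of variances to
hypothetical completion-interval data patterned on the example in the
preceding paragraph. The data and the particular choice of tests are
illustrative assumptions only, not a proposed methodology:

    from scipy import stats

    # Hypothetical completion intervals, in days.  Both groups average exactly
    # four days, but the competing carriers' orders vary far more widely.
    clec_intervals = [1, 1, 2, 6, 7, 7] * 20   # competing carriers' orders
    ilec_intervals = [3, 4, 4, 4, 4, 5] * 20   # incumbent's own retail orders

    # Welch's t-test compares the two averages; with identical means it
    # detects no difference.
    t_stat, t_pvalue = stats.ttest_ind(clec_intervals, ilec_intervals,
                                       equal_var=False)

    # Levene's test compares the variances and flags the disparity in the
    # spread of completion intervals that a comparison of averages would miss.
    w_stat, w_pvalue = stats.levene(clec_intervals, ilec_intervals)

    print(f"means p-value: {t_pvalue:.3f}, variances p-value: {w_pvalue:.3g}")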
17. Before describing the individual performance measurements,
however, we seek comment on a number of general issues that pertain to
all performance measurements. These general issues concern: (1) the
appropriate balance between the burdens and benefits associated with
performance measurements and reporting requirements; (2) the
appropriate geographic level for reporting; (3) the scope of activities
that incumbent LECs should report; and (4) the relevant electronic
interfaces for purposes of reporting the measurements described below.
1. Balance Between Burdens and Benefits
18. Our goal in developing performance measurements, and the
associated level of detail, is to isolate the activities in which an
incumbent could discriminate when providing services and facilities to
competing carriers. We believe that persistent discrimination by an
incumbent LEC in any of the activities for which we have proposed
performance measurements potentially would undermine a competing
carrier's prospects for success in the local market. At the same time,
as we have noted previously, although we believe that performance
measurements and reporting requirements will help foster competition in
the local exchange market, compliance with performance measurements and
reporting requirements imposes certain burdens on incumbent LECs. In
developing our proposed performance measurements and reporting
requirements, we have sought to balance our goal of detecting possible
instances of discrimination with our goal of minimizing, to the extent
possible, burdens imposed on incumbent LECs. As a general matter, we
seek comment on whether our proposed measurements appropriately balance
these twin goals.
19. Additionally, we ask parties to comment generally on the level
of detail contained in the proposed performance measurements.
Specifically, we seek comment on whether the performance measurements
we propose in this NPRM are sufficiently detailed to ensure the
collection of meaningful data, or whether greater detail or
disaggregation is necessary or whether lesser detail or disaggregation
would be sufficient.
2. Geographic Level for Reporting
20. We seek comment on the appropriate geographic level of
reporting. In particular, we seek comment on whether carriers should
report data for each performance measurement based on state boundaries,
LATAs, metropolitan statistical areas (MSAs), or some other relevant
geographic area. We also seek comment on whether a uniform geographic
level of reporting should apply to all performance measurements, or
whether it would be appropriate to require different levels of
reporting for separate measurements.
3. Scope of Reporting
21. We believe that, when an incumbent LEC reports the results of
the performance measurements, it must do so in a manner that permits a
competing carrier to compare the access the incumbent LEC provides to
the carrier and other competing carriers with the access the incumbent
LEC provides to itself or its affiliates. Accordingly, we tentatively
conclude that an incumbent LEC should report separately on its
performance as provided to: (1) its own retail customers; (2) any of
its affiliates that provide local exchange service; (3) competing
carriers in the aggregate; and (4) individual competing carriers. We
seek comment on these proposed levels of disaggregation and whether
they will permit competing carriers to detect discrimination.
4. Relevant Electronic Interfaces
22. As the Commission has previously noted, an incumbent LEC must
provide competing carriers the same electronic access to its OSS
functions as it provides itself in accessing its own internal systems
and databases. Because incumbent LECs access their systems
electronically for retail purposes, we tentatively conclude that
incumbent LECs need measure only the access they provide electronically
to competing carriers. Therefore, our proposals would only require
incumbent LECs to measure the performance of the electronic interfaces
that incumbent LECs offer to competing carriers for access to OSS.
23. We recognize that most incumbent LECs provide several types of
electronic interfaces, such as a GUI-based interface and an EDI-based
interface. We seek comment on whether these incumbent LECs must provide
performance measurements for each type of electronic interface. We seek
comment on whether an incumbent LEC should measure performance for each
of its electronic interfaces or only some subset of the interfaces it
offers. To the extent that incumbent LECs report on performance for all
electronic interfaces, we tentatively conclude that they should
disaggregate the data by interface type when reporting each performance
measurement.
24. As noted above, we have sought to balance our goal of detecting
possible instances of discrimination with our goal of minimizing, to
the extent possible, burdens imposed on incumbent LECs. Because we
intend to limit our proposed measurements to the performance of an
incumbent LEC's electronic interfaces, we expect that most of the
measurements proposed in this NPRM can be collected through electronic
coding or some other automatic logging procedure. We seek comment on
which, if any, of our proposed measurements may require more labor-
intensive collection methods and whether, as a result, they would be
unduly burdensome.
B. Proposed Measurements
1. Pre-Ordering Measurements
25. The pre-ordering function allows a competing carrier to gather
and confirm information necessary to place an accurate order for its
end user. We tentatively conclude that an incumbent LEC must measure
the average interval for providing access to pre-ordering information
to competing carriers, as well as to itself. The Average Response Time
measurement could, however, be based on all queries sent to the pre-
ordering interface or some subset of these queries. We seek comment on
whether a sampling approach, such as the one adopted in the Bell
Atlantic/NYNEX Merger Order, would be a sufficient method for assessing
an incumbent LEC's nondiscriminatory provision of pre-ordering
information. In addition, we propose that an incumbent LEC disaggregate
the results for this measurement according to the pre-ordering sub-
functions.
26. We recognize that there may be instances where an incumbent LEC
does not provide access to certain pre-ordering sub-functions on a real
time basis, but rather via batch files (e.g., street address
verification). We seek comment on whether incumbent LECs should exclude
those pre-ordering sub-functions that are not provided on a real time
basis from this measurement, or whether there are alternative methods
to detect possible discriminatory access in such instances.
27. In certain instances a competing carrier may be unable to
retrieve pre-ordering information for each query attempt. Instead, it
may receive a rejected query notice (also known as a failed attempt
notice). We seek comment on whether an incumbent LEC should measure the
speed by which it provides rejected query notices to competing carriers
as well as to itself. In addition, we seek comment on whether a
rejected query notice measurement must be provided as a separate
category for the pre-ordering function in general or, alternatively,
disaggregated separately for each pre-ordering sub-function. Finally,
we seek comment on whether incumbent LECs should measure the number of
rejected query notices as a percentage of the total number of pre-
ordering queries.
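As a purely illustrative sketch of how the pre-ordering measurements
discussed above might be computed from interface logs, the following
Python fragment calculates an average response time disaggregated by
sub-function and the percentage of queries that return a rejected
query notice. The record layout and sample values are hypothetical
assumptions, not part of the proposal:

    from collections import defaultdict
    from datetime import datetime

    # Hypothetical log entries: (sub-function, query sent, response returned,
    # rejected?).
    queries = [
        ("address validation", datetime(1998, 5, 1, 9, 0, 0),
         datetime(1998, 5, 1, 9, 0, 4), False),
        ("due date lookup",    datetime(1998, 5, 1, 9, 1, 0),
         datetime(1998, 5, 1, 9, 1, 9), False),
        ("address validation", datetime(1998, 5, 1, 9, 2, 0),
         datetime(1998, 5, 1, 9, 2, 2), True),
    ]

    response_times = defaultdict(list)
    rejected_count = 0
    for sub_function, sent, returned, rejected in queries:
        response_times[sub_function].append((returned - sent).total_seconds())
        rejected_count += rejected

    # Average Response Time, disaggregated by pre-ordering sub-function (seconds).
    for sub_function, seconds in response_times.items():
        print(sub_function, sum(seconds) / len(seconds))

    # Rejected query notices as a percentage of all pre-ordering queries.
    print("percent rejected:", 100 * rejected_count / len(queries))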
2. Ordering and Provisioning Measurements
a. Disaggregation of data. 28. Before describing the proposed
ordering and provisioning measurements, this section discusses the
levels of disaggregation that we believe should apply to these
measurements, as well as to the repair and maintenance measurements
discussed in Part IV.B.3. We believe that some level of disaggregation
is necessary to ensure the collection of meaningful results. We note
that a number of parties have proposed various levels of
disaggregation. Although we make no tentative conclusions regarding the
appropriate levels of disaggregation for ordering and provisioning
measurements and repair and maintenance measurements, we seek comment
on the thirteen measurement categories. In order for competing carriers
to track more easily the treatment accorded to certain types of orders
throughout the ordering and provisioning process, we propose to use
these thirteen measurement categories for the order completion
measurements, the order status measurements, the held orders
measurement, and the installation troubles measurement. Similarly, in
order for competing carriers to observe more easily correlations
between the types of services or elements ordered and any subsequent
need for repair and maintenance, we propose to use the same thirteen
measurement categories for the various repair and maintenance
measurements, the Average Time to Restore measurement, the Frequency of
Troubles in a Thirty Day Period measurement, the Frequency of Repeat
Troubles in a Thirty Day Period measurement and the Percentage of
Customer Troubles Resolved within Estimated Time measurement.
29. We seek comment on whether the thirteen proposed measurement
categories are appropriate. In particular, we seek comment on whether
these categories would disaggregate the data sufficiently to allow the
detection of discrimination. We also seek comment on whether fewer
levels of disaggregation would sufficiently detect instances of
discrimination, but would impose less reporting burden on incumbent
LECs.
30. We propose that incumbent LECs first break down the orders by
separating resold services, unbundled network elements, and
interconnection trunks.
For resold services, we propose to disaggregate the measurements
further according to the three broad categories of resold
telecommunications services: (1) Residential POTS; (2) business POTS;
and (3) special services. We believe that each particular service that
is available for resale can be categorized under one of these broader
service umbrellas. We propose, however, that each group should be
broken down by orders that require the dispatch of a service technician
and those that do not. We believe that this breakdown is important
because the need for field work has a significant impact on the amount
of time necessary to provision a resale order placed by a competing
carrier. We seek comment on the proposed levels of disaggregation for
resold services.
31. For unbundled network elements, we propose that incumbent LECs
report separately the measurement results associated with ordering and
provisioning different types of network elements (i.e., unbundled
loops,
unbundled switching, and unbundled local transport). We believe that
disaggregation by type of network element is necessary because there
are varying degrees of order complexity and inter-carrier coordination
involved with different types of network elements, including
combinations of network elements, and that these variations will affect
the time required to provision a network element order. In addition, we
propose that orders for unbundled loops should be broken down by
whether the loops are provisioned with interim number portability. We
believe that the provisioning time for loops with interim number
portability may differ from those without. We seek comment on our
proposed levels of disaggregation for network element orders. We also
seek comment on whether the unbundled loop category should be further
disaggregated, as suggested by LCUG, between 2-wire unbundled loops,
which are generally used for POTS-type services, and all other loop
types, such as 4-wire unbundled loops and unbundled DS1 loops, which
may be more complex to provision.
32. Finally, we propose to include interconnection trunks as a
separate measurement category. Although interconnection trunks are
physically indistinguishable from transport links, interconnection
trunks are unique because they are used for the transmission of traffic
between two networks, whereas transport links are used for the
transmission of traffic within the incumbent's network. As a result,
the process for ordering interconnection trunks, as well as the
mechanisms for provisioning those trunks, is likely to involve a higher
degree of order complexity, as well as greater inter-carrier
coordination, and, therefore, may require a separate reporting
category. We seek comment on the inclusion of interconnection trunks as
a separate measurement category.
b. Order Completion Measurements.
33. We tentatively conclude that incumbent LECs must measure the
Average Completion Interval and the Percentage of Due Dates Missed for
orders placed by their own retail customers and for orders placed by
competing carriers.
34. The measurement for the Average Completion Interval seeks to
compare the average length of time it takes an incumbent LEC to
complete orders for competing carriers with the average length of time
it takes to complete comparable incumbent LEC retail orders. For
competing carriers' orders, we tentatively conclude that an incumbent
LEC must measure the interval from its receipt of a valid order
(``Order Submission Date and Time'') at its OSS interface until the
time it returns a completion notification to the competing carrier
(``Date and Time of Notice of Completion''). For its own orders, we
propose that an incumbent LEC measure the interval from when its
service representative enters an end user customer's order into its
order processing system (``Order Submission Date and Time'') to the
time it completes the order (``Completion Date and Time''). We seek
comment on whether our proposed measurement for the Average Completion
Interval is sufficient or whether greater or lesser detail is
necessary.
35. The Percentage of Due Dates Missed measurement seeks to
determine whether the agreed-upon due dates for order completion are
equally reliable for orders placed by competing carriers and orders
placed by an incumbent LEC's end user customers. We tentatively
conclude that an incumbent LEC must calculate this percentage by
comparing the total number of orders not completed by the committed due
date and time during the specified reporting period to the total number
of orders scheduled to be completed during that reporting period. This
same measurement would apply to orders for an incumbent LEC's customers
and for orders submitted by competing carriers. We seek comment on
whether our proposed measurement for Percentage of Due Dates Missed is
appropriate or whether additional detail is necessary.
36. With respect to both the Average Completion Interval and
Percentage of Due Dates Missed measurements, we tentatively conclude
that certain exclusions should apply. We tentatively conclude that
incumbent LECs should exclude orders canceled or supplemented by
competing carriers from these measurements. We seek comment on whether
additional exclusions are needed.
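The following sketch (Python; the order data, field names, and the
simplified treatment of the reporting period are illustrative
assumptions) shows one way the Average Completion Interval and the
Percentage of Due Dates Missed described above could be computed, with
cancelled or supplemented orders excluded:

    from datetime import datetime

    # Hypothetical competing-carrier orders for one reporting period.
    orders = [
        {"submitted": datetime(1998, 5, 1, 10, 0), "due": datetime(1998, 5, 5, 17, 0),
         "completed": datetime(1998, 5, 4, 15, 0), "excluded": False},
        {"submitted": datetime(1998, 5, 1, 11, 0), "due": datetime(1998, 5, 5, 17, 0),
         "completed": datetime(1998, 5, 7, 9, 0),  "excluded": False},
        {"submitted": datetime(1998, 5, 2, 9, 0),  "due": datetime(1998, 5, 6, 17, 0),
         "completed": None,                        "excluded": True},  # cancelled order
    ]

    included = [o for o in orders if not o["excluded"] and o["completed"] is not None]

    # Average Completion Interval: order submission to notice of completion, in days.
    days = [(o["completed"] - o["submitted"]).total_seconds() / 86400 for o in included]
    average_completion_interval = sum(days) / len(days)

    # Percentage of Due Dates Missed (simplified here to the completed,
    # non-excluded orders): orders not completed by the committed due date and time.
    missed = sum(1 for o in included if o["completed"] > o["due"])
    percentage_due_dates_missed = 100 * missed / len(included)

    print(round(average_completion_interval, 2), percentage_due_dates_missed)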
c. Average time for coordinated customer conversions. 37. We
tentatively conclude that the incumbent LECs should measure the Average
Time for Coordinated Customer Conversions. Specifically, incumbent LECs
must measure the average time it takes to disconnect an unbundled loop
from the incumbent LEC's switch and cross connect it to a competing
carrier's equipment with and without number portability. This
performance measurement will assist in determining how long a customer
switching to a competing carrier is without local exchange service when
the competing carrier utilizes the incumbent LEC's unbundled loop, in
conjunction with its own switching equipment, to provide such service.
We believe that this measurement will assist in evaluating the
incumbent LEC's provisioning of unbundled loops and the impact on
competing carriers' customers.
d. Order status measurements. 38. We have previously stated that a
competing carrier must receive information on the status of its orders
on the same basis as an incumbent LEC provides such notices to itself.
39. We tentatively conclude that incumbent LECs must provide the
following order status measurements: (1) the Average Reject Notice
Interval; (2) the Average Firm Order Confirmation (FOC) Notice
Interval; (3) the Average Jeopardy Notice Interval; (4) the Percentage
of Orders in Jeopardy; and (5) the Average Completion Notice Interval.
We tentatively conclude that all incumbent LECs must also measure these
intervals for themselves, whether or not they have done so previously,
in order to provide a basis for comparison with the average intervals
for competing carriers. A comparison of these times can provide
information on whether the incumbent is providing nondiscriminatory
access to competing carriers. We seek comment on these tentative
conclusions. If an incumbent LEC does not currently provide itself with
a certain form of notice (e.g., a FOC), we seek comment on the
appropriate retail analog that should be measured. We also seek comment
on whether all of these order status measurements are necessary to
ensure that an incumbent LEC is providing nondiscriminatory access.
40. The Average Reject Notice Interval seeks to measure the amount
of time it takes an incumbent LEC to notify the competing carrier that
an order has been rejected. An incumbent LEC typically sends an order
rejection notice for invalid orders, such as those that have syntax or
formatting errors in the order form. The Commission has previously
explained that ``[t]imely delivery of order rejection notices has a
direct impact on a new entrant's ability to service its customers,
because new entrants cannot correct errors and resubmit orders until
they are notified of their rejection * * *.'' We tentatively conclude
that an incumbent LEC must measure the time it takes to deliver such
notices by using the measurement. We propose that an incumbent LEC
measure this interval from the time it receives an order at its OSS
interface to the time the rejection notice leaves its gateway. We seek
comment on these tentative conclusions.
41. The Average FOC Notice Interval seeks to measure the amount of
time it takes an incumbent LEC to send a
competing carrier a notice confirming the order. Competing carriers
rely on FOC notices to apprise their customers of due dates. We
tentatively conclude that an incumbent LEC must measure the time it
takes to deliver a FOC notice by using the Average FOC Notice Interval
measurement. We also
tentatively conclude that the incumbent LEC must measure this interval
from the time it received a valid order at its OSS interface from the
competing carrier to the time the FOC leaves its OSS interface and is
transmitted to the competing carrier. Because this interval measures
only valid orders, we tentatively conclude that incumbent LECs must
exclude rejected orders from this measurement. We seek comment on these
tentative conclusions.
42. The Average Jeopardy Notice Interval attempts to determine how
far in advance a competing carrier receives notice that its customer's
order is in jeopardy of not being completed as scheduled, compared to
how far in advance an incumbent LEC's service representative receives
such notice. The Commission has previously explained that competing
carriers need timely order jeopardy notices to inform their customers
of the potential need to reschedule the time for service installation.
We tentatively conclude that incumbent LECs must measure the amount of
time between the originally scheduled order completion date and time
(as stated on the FOC) and the date and time a notice leaves the
incumbent LEC's interface informing the carrier that the order is in
jeopardy of missing the originally scheduled date. We seek comment on
this tentative conclusion.
43. We also tentatively conclude that incumbent LECs must measure
the Percentage of Orders in Jeopardy. This measurement determines the
percentage of orders that the incumbent LEC identifies as being in
jeopardy of not being completed on time for any reason. This
information will enable a competing carrier to determine whether a
significantly higher percentage of its orders are placed in jeopardy
than an incumbent LEC's retail orders. Additionally, a competing
carrier should receive a jeopardy notification for each of its orders
that the incumbent LEC fails to complete on time. A competing carrier
can determine whether it is receiving this requisite advance notice by
comparing the Percentage of Orders in Jeopardy to the Percentage of
Due Dates Missed measurement.
44. Finally, the Average Completion Notice Interval measures the
amount of time it takes an incumbent LEC to send a competing carrier
notice that work on an order has been completed. We tentatively
conclude that an incumbent LEC must measure this interval by
subtracting the date and time that it completed
the work from the date and time a valid completion notice leaves its
OSS interface. We seek comment on these tentative conclusions.
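As an illustration of the comparison described in paragraph 43 above,
the following sketch (Python; the counts are hypothetical assumptions)
computes the Percentage of Orders in Jeopardy alongside the Percentage
of Due Dates Missed for a single reporting category:

    # Hypothetical monthly counts for one reporting category.
    orders_scheduled        = 1000   # orders due to be completed during the period
    due_dates_missed        = 80     # orders not completed by the committed due date
    jeopardy_notices_issued = 55     # orders for which a jeopardy notice was sent

    percentage_due_dates_missed   = 100 * due_dates_missed / orders_scheduled         # 8.0
    percentage_orders_in_jeopardy = 100 * jeopardy_notices_issued / orders_scheduled  # 5.5

    # If due dates are missed more often than jeopardy notices are issued, the
    # competing carrier is not receiving the requisite advance notice.
    shortfall = percentage_due_dates_missed - percentage_orders_in_jeopardy           # 2.5
    print(percentage_due_dates_missed, percentage_orders_in_jeopardy, shortfall)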
e. Average interval for held orders. 45. We tentatively conclude
that incumbent LECs must measure the Average Interval for Held Orders.
This measurement seeks to capture the time required to complete held
orders, i.e., those orders pending at the end of the reporting period
whose committed due dates have passed. For example, if incumbent LECs
report on a monthly basis, a held order would be any order that is
overdue at the end of the month. By measuring those orders whose due
dates have passed, the Average Held Order measurement will capture
those orders not covered by the Average Completion Interval
measurement, which measures orders that are completed by the committed
due date. We believe that the Average Interval for Held Orders
measurement will enable a requesting carrier to determine whether the
average period that its orders are pending after the committed due date
is no longer than the average period for similar incumbent LEC pending
orders. We seek comment on the utility of measuring the average
interval for held orders and whether the measurement described below
accurately captures the necessary information.
46. To arrive at the Average Interval for Held Orders, we
tentatively conclude that the incumbent LEC should first identify all
orders with a FOC listing a due date prior to the end of the reporting
period in question for which a valid completion notice has not yet been
issued. The held order interval for a particular order is the number of
calendar days between the completion date listed on that order's FOC
and the close of the reporting period. The Average Interval for Held
Orders is then calculated by dividing the total number of days since
the due date up to the reporting period close date by the number of
held orders. Incumbent LECs should measure the Average Interval for
Held Orders for both competing carrier orders and their own retail
customer orders. We propose that incumbent LECs exclude from this
measurement those orders cancelled by a competing carrier. We seek
comment on whether these exclusions will assist in producing meaningful
results and on whether additional exclusions are needed.
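A minimal sketch of the held order calculation described above
(Python; the dates are hypothetical assumptions):

    from datetime import date

    # Hypothetical orders whose FOC due dates precede the close of the reporting
    # period and for which no valid completion notice has been issued.
    reporting_period_close = date(1998, 5, 31)
    held_order_due_dates = [date(1998, 5, 10), date(1998, 5, 20), date(1998, 5, 28)]

    # Held order interval for each order: calendar days from the FOC due date to
    # the close of the reporting period.
    intervals = [(reporting_period_close - due).days for due in held_order_due_dates]

    # Average Interval for Held Orders: total days pending past the due dates,
    # divided by the number of held orders.
    average_interval_for_held_orders = sum(intervals) / len(intervals)
    print(average_interval_for_held_orders)   # (21 + 11 + 3) / 3, roughly 11.7 days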
f. Installation troubles. 47. We tentatively conclude that an
incumbent LEC must measure Percentage Troubles in Thirty Days for New
Orders. We believe that incumbent LECs must calculate the percentage of
new orders for which a competing carrier, or incumbent LEC customer
service representative, receives complaints that there is a problem
with the service within the first thirty days after completion of the
order. Trouble reports often indicate that a customer has not received
the exact service ordered, either because the carrier provided the
wrong type of service or a lower quality of service than expected. We
believe, therefore, that this measurement will provide information
about whether the incumbent LEC processed the order accurately.
Accordingly, we propose that incumbent LECs measure Percentage
Troubles in Thirty Days for New Orders as a substitute for LCUG's
proposed measurement of Percentage Orders Processed Accurately. We
believe that Percentage Troubles in Thirty Days for New Orders will
provide the information sought by LCUG, but will be a less burdensome
measurement than measuring order accuracy, which requires an incumbent
LEC to compare the original account profile and order sent by the
competing carrier to the account profile following completion of the
order. Nevertheless, we seek comment on using this measurement as a
substitute for order accuracy. We also seek comment on whether thirty
days is an appropriate cut-off for measuring trouble reports for new
orders.
48. Although we make no tentative conclusions regarding the
specific measurement needed to measure Percentage Troubles in Thirty
Days for New Orders, we seek comment on the measurement. Specifically,
we seek comment on whether this measurement should be disaggregated in
the same way as the other ordering and provisioning measurements. It
may not be appropriate, for example, to include interconnection trunks
because any problems relating to such trunks will likely affect many
customers on the competing carrier's network, rather than one specific
customer. We seek comment on whether interconnection trunks, or any
other categories of disaggregation, should be eliminated for this
measurement.
49. Finally, we seek comment on whether it is appropriate to
measure percentage troubles on a ``per order'' basis. We seek comment
on whether tracking troubles on a per order basis might mask a higher
number of troubles for larger orders. For example, an order of forty
new lines may have several problems and yet would be reported as having
only one trouble report. We therefore seek comment on whether a
``per circuit'' basis for resale orders and ``per element'' basis for
unbundled network element orders might be more useful than a ``per
order'' basis.
g. Ordering quality measurements.
1. Order Flow Through
50. An incumbent LEC's internal ordering system permits its retail
service representatives to submit retail customer orders
electronically, directly into the ordering system. This is known as
``flow through.'' Similarly, a competing carrier's orders ``flow
through'' if they are transmitted electronically (i.e., with no manual
intervention) through the gateway into the incumbent LEC's ordering
systems. Order Flow Through applies solely to the OSS ordering
function, not the OSS provisioning function. In other words, Order Flow
Through measures only how the competing carrier's order is transmitted
to the incumbent's back office ordering system, not how the incumbent
ultimately completes that order. Electronically processed service
orders are more likely to be completed and less prone to human error
than orders that require some degree of human intervention.
51. We tentatively conclude that incumbent LECs should measure the
percentage of competing carriers' orders that flow through
electronically to the incumbent LEC's ordering systems. The Percentage
Order Flow Through measurement seeks to calculate the percentage of
orders that an incumbent LEC processes electronically through its
gateway and accepts into its back office systems without manual
intervention (i.e., without additional human intervention once the
order is submitted into the system). This measurement only applies to
valid orders, that is, orders that have not been rejected for some
reason. A separate measurement for rejected orders is described in
paragraph 53 below.
52. We tentatively conclude that the Order Flow Through measurement
must be disaggregated by the following categories: (1) resale POTS; (2)
resale specials; (3) network elements; and (4) combinations of network
elements. We note that the proposed categories for the Order Flow
Through measurement are less detailed than the categories proposed for
the other measurements relating to the ordering process (e.g., order
completion and order status measurements). We believe this distinction
is justified because the Order Flow Through measurement focuses solely
on the OSS ordering function, whereas the other proposed measurements
(i.e., those regarding order completion and order status) also focus on
the OSS provisioning function. In the provisioning context, there may
be substantial differences in the time required to provide various
types of unbundled network elements and services. For example, the time
required to complete certain orders may vary based on whether an order
requires a dispatch, or merely a billing change. In the order flow
through context, such issues are irrelevant. The method of ordering
resold services and network elements is not likely to vary between
residential and business customers. We seek comment on the proposed
levels of disaggregation for the Order Flow Through measurement and
whether further disaggregation is necessary.
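The sketch below (Python; the order tags and sample data are
hypothetical assumptions) illustrates the Percentage Order Flow
Through calculation, disaggregated by the four categories proposed
above:

    from collections import Counter

    # Hypothetical valid orders: (proposed reporting category, flowed through
    # without manual intervention?).
    orders = [
        ("resale POTS", True), ("resale POTS", True), ("resale POTS", False),
        ("resale specials", True), ("network elements", False),
        ("combinations of network elements", True),
    ]

    totals       = Counter(category for category, _ in orders)
    flow_through = Counter(category for category, flowed in orders if flowed)

    # Percentage Order Flow Through for each proposed category.
    for category, total in totals.items():
        print(category, 100 * flow_through[category] / total)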
2. Order Rejections
53. We tentatively conclude that incumbent LECs must report on the
Percentage of Rejected Orders. We also tentatively conclude that this
measurement must be reported to the same level of disaggregation as the
Order Flow Through measurement. The Percentage of Rejected Orders
measurement would determine the percentage of total orders received
electronically that are rejected.
54. In addition to the above measurement, we seek comment on
whether incumbent LECs should report on the average number of times an
order must be resubmitted before it is finally accepted as a valid
order. The Average Submissions per Order measurement would require
incumbent LECs to measure the number of orders accepted for
provisioning and the number of orders rejected during the reporting
period in order to calculate the total number of order submissions in
the reporting period. The total number of order submissions would then
be divided by the total number of orders accepted for provisioning in
the reporting period.
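A brief illustration of the two ordering quality calculations
described above (Python; the counts are hypothetical assumptions):

    # Hypothetical counts for one reporting period.
    orders_accepted_for_provisioning = 900
    orders_rejected                  = 300
    total_order_submissions = orders_accepted_for_provisioning + orders_rejected

    # Percentage of Rejected Orders: rejected orders as a share of all orders
    # received electronically during the reporting period.
    percentage_rejected_orders = 100 * orders_rejected / total_order_submissions

    # Average Submissions per Order: total submissions divided by the number of
    # orders ultimately accepted for provisioning.
    average_submissions_per_order = total_order_submissions / orders_accepted_for_provisioning

    print(percentage_rejected_orders, round(average_submissions_per_order, 2))   # 25.0 1.33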
h. 911 Database update and accuracy. 55. One of the OSS databases
used in ordering and provisioning services and facilities to competing
carriers is the 911/E911 database. We seek comment on whether incumbent
LECs should measure the provision of 911 and E911 emergency services to
competing carriers. The accuracy of 911 and E911 database updates was
identified as an important issue in the Ameritech Michigan 271 Order,
62 FR 44969, August 25, 1997. We seek comment on whether federal
reporting requirements are necessary to monitor possible
discrimination, or whether the states' existing oversight functions of
911 and E911 database services adequately monitor carrier-to-carrier
discrimination.
56. We also seek comment on what particular measurements would be
useful if we were to adopt reporting requirements in this area. In
particular, we seek comment on the utility of measuring the percentage
of accurate updates for incumbent LEC and competing carrier customers.
Such a measurement might assist a competing carrier in determining
whether there is discriminatory treatment in updating these databases.
57. We also seek comment on the utility of measuring the timeliness
of updates to the 911 and E911 databases. We seek comment on whether
incumbent LECs should measure the percentage of missed due dates by
establishing due dates, or specific time frames, for updating
databases. Alternatively, we seek comment on whether incumbent LECs
should measure the mean time to update the 911 and E911 databases.
3. Repair and Maintenance Measurements
58. We tentatively conclude that incumbent LECs must provide the
following repair and maintenance measurements: (1) Average Time to
Restore; (2) Frequency of Repeat Troubles in Thirty Days; (3) Frequency
of Troubles in a Thirty Day Period; and (4) Percentage of Customer
Troubles Resolved within the Estimated Time. Incumbent LECs must
calculate these measurements for themselves and for competing carriers.
We seek comment on whether these four measurements are sufficient to
assess whether incumbent LECs provide repair and maintenance in a
nondiscriminatory manner, or whether this assessment could be done with
fewer measurements. In addition, we seek comment on whether incumbent
LECs should disaggregate the repair and maintenance measurements in the
manner described with respect to the ordering and provisioning
measurements.
59. The Average Time to Restore measurement allows a competing
carrier to gauge whether its customers' services are repaired in the
same time frame as that of the incumbent LEC's customers. The Average
Time to Restore measures the time from when a service problem is
reported to the incumbent LEC (i.e., when a ``trouble ticket'' is
logged) to the time when the incumbent LEC returns a trouble ticket
resolution notification to the competing carrier.
60. The Frequency of Troubles in a Thirty Day Period measurement
reports the percentage of access lines that receive trouble tickets in
a thirty day period. This measurement permits a competing carrier to
determine on an
ongoing basis whether its customers experience more frequent incidents
of trouble than the incumbent LEC's end users. Disparity in this
measurement may indicate differences in the underlying quality of the
network components supplied by the incumbent LEC. We seek comment on
whether thirty days is an appropriate time frame.
61. The Frequency of Repeat Troubles in a Thirty Day Period
measurement calculates the percentage of trouble tickets that are
repeat trouble tickets. Any differences in this measurement may
indicate that the incumbent LEC provides inferior maintenance support
in the initial resolution of troubles or, in the alternative, that the
incumbent LEC supplies network components of an inferior quality. The
Frequency of Repeat Troubles in a Thirty Day Period measurement is
calculated by dividing the number of repeat troubles generated in a
thirty day period by the total number of trouble tickets received in
the same thirty day period. Again, we seek comment on whether thirty
days is an appropriate time frame.
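The two trouble-frequency calculations described above can be
illustrated as follows (Python; the thirty-day counts are hypothetical
assumptions):

    # Hypothetical thirty-day counts for one reporting category.
    access_lines_in_service  = 10000   # lines provided to the carrier being measured
    lines_reporting_trouble  = 340     # distinct lines with at least one trouble ticket
    trouble_tickets_received = 400     # all trouble tickets logged in the period
    repeat_trouble_tickets   = 60      # tickets on troubles previously reported and closed

    # Frequency of Troubles in a Thirty Day Period: percentage of access lines
    # that receive trouble tickets.
    frequency_of_troubles = 100 * lines_reporting_trouble / access_lines_in_service

    # Frequency of Repeat Troubles in a Thirty Day Period: repeat tickets divided
    # by all trouble tickets received in the same thirty days.
    frequency_of_repeat_troubles = 100 * repeat_trouble_tickets / trouble_tickets_received

    print(frequency_of_troubles, frequency_of_repeat_troubles)   # 3.4 and 15.0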
62. The Percentage of Customer Troubles Resolved Within the
Estimated Time measures whether the estimated times for repairs the
incumbent LEC reports to competing carriers are as reliable as the
estimated times the incumbent LEC provides to its end user customers.
Recognizing that troubles on interconnection trunks may not be customer
specific, we seek comment on the utility of requiring incumbent LECs to
report on the Percentage of Customer Troubles Resolved Within the
Estimated Time with respect to interconnection trunks.
63. We note that LCUG has proposed measurement categories for the
Average Time to Restore measurement based on the disposition and cause
of the trouble. We seek comment on whether most carriers use the
disposition and cause categories proposed by LCUG, and whether such a
breakdown would be useful for the repair and maintenance measurements.
We also seek comment on whether such a breakdown would place undue
burdens on incumbent LECs.
64. We tentatively conclude that incumbent LECs should exclude the
following types of trouble reports from the measurements described
above: (1) trouble tickets that are cancelled by the competing carrier;
(2) incumbent LEC trouble reports associated with the internal or
administrative use of local service; and (3) instances where the
customer requests a ticket be ``held open'' for monitoring. With
respect to the Frequency of Repeat Troubles measurement, we tentatively
conclude that incumbent LECs should exclude subsequent trouble reports
on maintenance tickets that have not been reported as resolved or
closed. We seek comment on whether these exclusions will assist in
producing meaningful results and whether additional exclusions are
needed.
4. Billing Measurements
65. As noted above, an incumbent LEC must provide nondiscriminatory
access to billing, as one of the five OSS functions identified by the
Commission in the Local Competition First Report and Order. A competing
carrier is dependent on an incumbent LEC to obtain billing information,
regardless of whether it uses unbundled network elements or resold
services. Two types of billing information a competing carrier must
obtain from an incumbent LEC are: (1) customer usage records (i.e.,
those records detailing each end user's use of the incumbent's
services); and (2) billing invoices, which establish the amount the
competing carrier owes the incumbent LEC for use of its services or
facilities.
66. We tentatively conclude that a competing carrier can determine
whether it is obtaining nondiscriminatory access to these two sets of
billing records by obtaining performance measurements on the Average
Time to Provide Usage Records and the Average Time to Deliver Invoices.
The first measurement (Average Time to Provide Usage Records) seeks to
capture the average time it takes an incumbent LEC to provide customer
usage records. We tentatively conclude that incumbent LECs should
calculate the Average Time to Provide Usage Records both for competing
carriers and for their own retail use. For competing carriers, an
incumbent LEC must compare the date and time it records usage data with
the date and time it transmits the records from its OSS gateway to the
competing carrier. For its own retail use, we propose that an incumbent
LEC measure the elapsed time from the date and time it records the
usage record to the date and time it reformats the record into an
Electronic Message Record (EMR), or equivalent, format. We seek comment
on these measurements.
Additionally, we understand that files and billing for local usage,
exchange access usage, and alternately billed usage are separated in
the actual billing process, and we seek comment on whether incumbent
LECs should disaggregate the Average Time to Provide Usage Records into
these three groups.
67. The second measurement (Average Time to Deliver Invoices) seeks
to measure the average time it takes an incumbent LEC to transmit a
billing invoice to a competing carrier for charges related to resale
and/or network elements. We tentatively conclude that incumbent LECs
should calculate the Average Time to Deliver Invoices. For competing
carriers, an incumbent LEC must compare the date and time it transmits
the invoices to the competing carrier to the date and time the billing
cycle closes. For an incumbent LEC's own retail use, LCUG has proposed
that an incumbent LEC compare the date and time the customer's bills
are produced in electronic format (whether or not they are distributed)
to the date and time the billing cycle closes. We seek comment on this
proposal for retail use and on our tentative conclusion regarding the
appropriate measurement for competing carriers. We also seek comment on
whether incumbent LECs should report separately for wholesale bill
invoices and unbundled element bill invoices for competing carriers.
Finally, we seek comment on whether any other measurements for billing
are appropriate.
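For illustration only, both billing measurements are averages of
elapsed time between two recorded events, as in the sketch below; the
timestamps and record structures are hypothetical.

```python
# Hypothetical sketch of the two billing measurements.
from datetime import datetime

def average_interval_hours(pairs):
    """Average elapsed hours between paired (start, end) timestamps."""
    spans = [(end - start).total_seconds() / 3600.0 for start, end in pairs]
    return sum(spans) / len(spans) if spans else 0.0

# Average Time to Provide Usage Records (wholesale): time of recording
# the usage data versus time the record leaves the OSS gateway.
usage_records = [
    (datetime(1998, 5, 1, 0, 0), datetime(1998, 5, 2, 0, 0)),   # 24 hours
    (datetime(1998, 5, 1, 0, 0), datetime(1998, 5, 2, 12, 0)),  # 36 hours
]
print(average_interval_hours(usage_records))  # 30.0

# Average Time to Deliver Invoices: close of the billing cycle versus
# time the invoice is transmitted to the competing carrier.
invoices = [
    (datetime(1998, 5, 31, 0, 0), datetime(1998, 6, 3, 0, 0)),  # 72 hours
]
print(average_interval_hours(invoices))  # 72.0
```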
5. General Measurements
a. Systems Availability. 68. We tentatively conclude that an
incumbent LEC must measure the percentage of time its electronic
interfaces for each OSS function are actually operational as compared
to the scheduled availability. We propose that an incumbent LEC
calculate this measurement by comparing the total time it provides
access to a particular interface during the reporting period to the
total time the interface was scheduled to be available during the
reporting period. We also propose that an incumbent LEC compare the
total time its own systems are available to its service representatives
to the amount of time that those systems should have been available
during the reporting period. We believe that this measurement will
assist in determining whether the incumbent LEC provides
nondiscriminatory access to its electronic interfaces. We believe that
both prolonged outages and frequent unavailability of electronic access
to an incumbent LEC's OSS interfaces may significantly and adversely
affect a competing carrier's ability to provide service to end users.
We tentatively conclude that this measurement must be disaggregated by
interface type, such as EDI and GUI, as well as by each separate OSS
function provided by the incumbent LEC to competing carriers (e.g.,
pre-ordering, ordering,
provisioning, repair and maintenance, and billing). We seek comment on
our tentative conclusions regarding systems availability measurements.
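For illustration only, the systems availability measurement compares
actual to scheduled availability, as in the following sketch; the
interface name and hours shown are hypothetical.

```python
# Hypothetical sketch of the systems availability measurement.

def systems_availability_pct(scheduled_hours, outage_hours):
    """Percentage of scheduled time an OSS interface was actually
    available during the reporting period."""
    return 100.0 * (scheduled_hours - outage_hours) / scheduled_hours

# Example: an EDI ordering interface scheduled for 660 hours in the
# reporting period with 6.6 hours of outages is 99.0 percent available.
print(systems_availability_pct(scheduled_hours=660.0, outage_hours=6.6))  # 99.0
```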
b. Center Responsiveness. 69. We tentatively conclude that an
incumbent LEC must measure the average time to answer calls from
competing carriers to an incumbent LEC's wholesale service center. We
propose that an incumbent LEC calculate this measurement by tracking
the time elapsed from when the service center's call management system
is prompted by an incoming call from a competing carrier until the call
is answered by an incumbent LEC's service representative. We seek
comment on our tentative conclusion to require a measurement for center
responsiveness.
c. Operator Services and Directory Assistance. 70. We tentatively
conclude that an incumbent LEC must measure the average time it takes
its own end user customers and those of competing carriers to access
the incumbent LEC's operator services and directory assistance
databases or operators. We seek comment on this specific measurement.
71. Incumbent LECs appear to be able to provide separate
measurement results for competing carriers that use dedicated trunks to
access the incumbent LEC's OS/DA database or operators. Therefore, we
tentatively conclude that incumbent LECs must provide separate
measurement results in such instances. We seek comment, however, on
whether, for purposes of disaggregation, an incumbent LEC is able to
differentiate between OS/DA calls from its own end user customers and
customers of competing carriers if all such calls are carried over the
same OS/DA trunk groups.
6. Interconnection Measurements
72. As previously noted, section 251(c)(2) of the Act requires
incumbent LECs to provide interconnection to competing carriers at the
same level of quality as used in their own networks. We tentatively
conclude that incumbent LECs must measure the quality of
interconnection through three different means. As discussed above, we
tentatively conclude that incumbent LECs must report separately for
interconnection trunks when disaggregating the ordering and
provisioning measurements, as well as the repair and maintenance
measurements. We also tentatively conclude, as discussed below, that
incumbent LECs must report on two sets of interconnection measurements,
one for trunk blockage and one for collocation. These two sets of
measurements are intended to reveal the quality of interconnection
provided to competing carriers.
a. Trunk Blockage. 73. We tentatively conclude that incumbent LECs
must measure trunk blockage, i.e., blockage on final trunk groups
within their networks. Blockage on these final trunk groups prevents
end user calls from reaching their final destination. The inability of
a competing carrier's end users to complete or receive calls has a
direct impact on the customer's perception of the competing carrier's
quality of service.
74. We believe that competing carriers' traffic can be blocked at
two critical points: (1) interconnection trunk groups (e.g., those
trunk groups connecting the incumbent LEC's end offices, access
tandems, or local tandems with a competing carrier's network); or (2)
common trunk groups located within the incumbent LEC's network behind
the point of interconnection (e.g., trunks connecting the incumbent's
tandem switch with other points in the incumbent LEC's network). We
therefore tentatively conclude that an incumbent LEC must measure
blockage on both sets of trunk groups. We seek comment on these
tentative conclusions.
75. We seek comment on certain general issues associated with
measuring trunk blockage. We recognize that inferior service is
generally indicated by repeated blockage on the same final trunk
groups. We therefore seek comment on whether incumbent LECs should
measure whether there is repeated blockage over the same trunk groups
for an ongoing period, such as three consecutive months. We also seek
comment on whether incumbent LECs should report on blockage exceeding a
certain blocking standard for both interconnection and common trunk
group measurements. In the Bell Atlantic/NYNEX Merger Order, for
example, the Commission required Bell Atlantic to report on blockage
exceeding a blocking standard of B.01 for interconnection trunks and
B.005 for common trunks. We seek comment on whether incumbent LECs
should measure blockage exceeding these standards.
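For illustration only, blockage on a final trunk group can be expressed
as the fraction of call attempts blocked and compared against the
blocking standards noted above; the traffic counts below are
hypothetical.

```python
# Hypothetical sketch comparing measured blockage to the blocking
# standards discussed in paragraph 75 (B.01 and B.005).
BLOCKING_STANDARDS = {
    "interconnection": 0.01,   # B.01 for interconnection trunk groups
    "common": 0.005,           # B.005 for common trunk groups
}

def blocking_ratio(blocked_attempts, total_attempts):
    """Fraction of call attempts blocked on a final trunk group."""
    return blocked_attempts / total_attempts

def exceeds_standard(trunk_type, blocked_attempts, total_attempts):
    return blocking_ratio(blocked_attempts, total_attempts) > BLOCKING_STANDARDS[trunk_type]

# Example: 150 blocked attempts out of 10,000 on an interconnection
# trunk group is a 1.5 percent blocking ratio, which exceeds B.01.
print(exceeds_standard("interconnection", 150, 10_000))  # True
```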
76. We also seek comment on methods by which parties may evaluate
whether incumbent LECs are providing interconnection in compliance with
their statutory obligations under section 251(c)(2). With respect to
interconnection trunks, we seek comment on the utility of comparing
blockage on interconnection trunks and blockage on the incumbent LEC's
interoffice trunk groups carrying its retail customers' traffic. In the
Ameritech Michigan 271 proceeding, Ameritech provided data on trunk
blockage rates for both groups. The Commission determined that a higher
percentage of interconnection trunking groups experienced blockage than
did Ameritech's interoffice trunking groups serving its retail
customers, suggesting that Ameritech's interconnection facilities did
not meet the same service standards as those used within its own
network. We seek comment on the value of using a comparison similar to
that used in the Ameritech Michigan 271 Order for gauging whether
interconnection trunks are provided in a nondiscriminatory manner. We
also seek comment on which set of interoffice trunk groups incumbent
LECs should monitor.
77. A competing carrier's ability to provide service to its
customers may also be affected by blockage on common trunks located
within the incumbent LEC's network behind the point of interconnection.
We tentatively conclude that it is necessary to measure common trunk
blockage and seek comment on appropriate methods to make such
measurements. Specifically, we seek comment on whether incumbent LECs
should use the common trunk data report established in BellCore Special
Report SR STS-000317, ``Common Trunk Transport Group Performance
Data,'' Issue 2, September 1990. While we recognize that this report
was intended to provide information about common trunk blockage to
interexchange carriers (IXCs), we seek comment on whether this report
can provide useful information for competing carriers as well. We also
seek comment on whether incumbent LECs generally use this common trunk
data report and whether all the measurements in the report are
applicable to competing carriers. Additionally, we seek comment on the
utility of requiring incumbent LECs to report on blockage on common
trunks within their networks that connect to a point of
interconnection, as well as on interoffice common trunks that are not
connected to a point of interconnection. We seek comment on an
incumbent LEC's ability to separately measure and report on blockage
over these two types of common trunks (i.e., those trunk groups that
connect to a point of interconnection and those that do not) and
whether information about these two types of trunk groups will assist a
competing carrier in determining whether it is receiving
nondiscriminatory interconnection.
78. Finally, we seek comment on whether an incumbent LEC must
measure call completion rates to demonstrate that it is satisfying the
statutory requirements of section 251(c)(2). In measuring call
completion rates, an incumbent LEC would compare the percentage of
calls completed by incumbent LEC customers to competing carrier
customers, relative to the percentage of calls completed by incumbent
LEC customers to other incumbent LEC customers. In the Ameritech
Michigan 271 Order, the Commission noted that data regarding the rate
of call completion would be useful in assessing the quality of
interconnection. We seek comment on the utility of using this
measurement to gauge the quality of interconnection provided by an
incumbent LEC and on the benefits of using the call completion
measurement in addition to, or instead of, the trunk blockage
measurement. We also seek comment on the additional costs or burdens
that such a measurement would impose on incumbent LECs.
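For illustration only, the call completion comparison described above
reduces to two percentages, as in the sketch below; the call counts are
hypothetical.

```python
# Hypothetical sketch of the call completion comparison.

def completion_rate(completed, attempted):
    """Percentage of call attempts that are completed."""
    return 100.0 * completed / attempted

ilec_to_clec = completion_rate(completed=9_700, attempted=10_000)  # 97.0
ilec_to_ilec = completion_rate(completed=9_900, attempted=10_000)  # 99.0
print(ilec_to_ilec - ilec_to_clec)  # 2.0 percentage point difference
```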
b. Collocation. 79. We tentatively conclude that incumbent LECs
must measure certain aspects of providing collocation arrangements.
Section 251(c)(6) and our rules require incumbent LECs to provide
physical and virtual collocation as a means of interconnection or
access to unbundled network elements. Consequently, we tentatively
conclude that incumbent LECs must provide measurements concerning their
provision of collocation facilities to competing carriers, including
the response time for initial requests for collocation. We also
tentatively conclude that this measurement must be disaggregated
between virtual and physical collocation arrangements. The provision of
collocation arrangements involves several steps: (1) the initial query
by a competing carrier regarding space for collocation, and the
incumbent LEC's response to that query; (2) the actual ordering of the
collocation arrangement by the competing carrier; and (3) the
completion of that arrangement by the incumbent LEC. We tentatively
conclude that incumbent LECs must provide the following measurements:
(1) Average Time to Respond to a Collocation Request; (2) Average Time
to Provide a Collocation Arrangement; and (3) Percentage of Due Dates
Missed with respect to the provision of collocation arrangements. We
seek comment on the utility of these proposed measurements.
80. We tentatively conclude that the Average Time to Respond to a
Collocation Request must be determined by computing the elapsed time
from the incumbent LEC's receipt of a request for collocation by a
competing carrier to the time the incumbent LEC responds to such a
request. The Average Time to Provide a Collocation Arrangement must be
calculated from the time that the competing carrier submits an order
for a collocation arrangement to the time that the arrangement is made
available to the competing carrier. Finally, an incumbent LEC must
calculate the Percentage of Due Dates Missed by comparing the number of
times it missed a committed date for providing collocation facilities
to the total number of confirmed due dates for collocation arrangements
during the reporting period. We also tentatively conclude that
incumbent LECs must disaggregate these measurements by virtual and
physical collocation arrangements. We seek comment on these tentative
conclusions.
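For illustration only, the three collocation measurements could be
computed as in the following sketch; the dates and counts shown are
hypothetical.

```python
# Hypothetical sketch of the collocation measurements in paragraph 80.
from datetime import date

def average_days(pairs):
    """Average elapsed days between paired (start, end) dates."""
    spans = [(end - start).days for start, end in pairs]
    return sum(spans) / len(spans) if spans else 0.0

def pct_due_dates_missed(missed, confirmed):
    """Missed committed due dates as a percentage of all confirmed due
    dates during the reporting period."""
    return 100.0 * missed / confirmed

# Average Time to Respond to a Collocation Request: receipt of the
# request to the incumbent LEC's response.
responses = [(date(1998, 5, 1), date(1998, 5, 9)),
             (date(1998, 5, 3), date(1998, 5, 15))]
print(average_days(responses))         # 10.0

# Percentage of Due Dates Missed: 2 missed of 40 confirmed due dates.
print(pct_due_dates_missed(2, 40))     # 5.0
```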
V. Reporting Procedures
81. We also propose model procedures to assist states considering
how performance measurements should be reported. These model reporting
procedures are intended to facilitate access by competing carriers and
states to the measurements produced by the incumbent LECs so that
carriers and states can determine whether incumbent LECs are satisfying
their statutory obligations pursuant to section 251. This section
discusses proposals regarding: (1) who should receive the reports; (2)
the frequency of reports; and (3) auditing procedures.
A. Receipt of Reports
82. We seek comment on who should receive these reports from the
incumbent LECs on a regular basis. We believe that the main purpose of
these performance reports is to permit competing carriers to determine
whether they are obtaining access consistent with the requirements of
section 251. We tentatively conclude, therefore, that only those
carriers that already obtain services or facilities from the incumbent
LEC through an interconnection agreement, or under a statement of
generally available terms, should have the opportunity to receive
reports. Commenters that believe that other groups of carriers, such as
those considering whether to enter the market, should also receive
reports should explain why the benefits of their receiving reports
outweigh the costs to incumbent LECs.
83. In order to minimize unnecessary costs or burdens for incumbent
LECs, we further conclude that an incumbent LEC should provide reports
to an individual competing carrier only after receiving a request from
the competing carrier for such reports.
84. States may also have an interest in reviewing performance
reports. With respect to whether state officials should receive a copy
of the reports that we propose in this NPRM, we tentatively conclude
that individual states can best assess whether they wish to receive the
reports. While this Commission may not need to review reports on a
regular basis, we note that the Commission could obtain the reports
upon request.
85. Finally, we seek comment on whether reports should be filed
with a central clearinghouse so that state commissions, other competing
carriers, or the general public can review an incumbent LEC's
performance in different states. We seek comment on the benefits and
costs involved in developing such a clearinghouse. We also seek comment
on what entity should act as a clearinghouse, e.g., a coalition of
regulators (such as NARUC) or another organization.
86. We recognize that parties may be concerned about disclosing
confidential measurement results if results particular to an incumbent
LEC or to an individual competing carrier are reported broadly. We seek
comment on the need to keep individual competing carrier information
confidential and on whether only aggregate measurement results should
be made available to other competing carriers or to the general public.
87. With respect to incumbent LEC measurement results, we believe
that individual competing carriers must have access to incumbent LEC
results so that they can make a meaningful comparison with their own
data. We seek comment, however, on whether incumbent LEC measurement
results should be protected from disclosure to non-requesting competing
carriers or to the general public. If regulatory agencies request
incumbent LEC and competing carrier measurement results, we ask parties
to comment on whether protective measures are necessary and to propose
appropriate mechanisms to keep those results confidential. Similarly,
we ask parties to comment on whether competing carriers that receive
incumbent LEC measurement results should be required to limit their use
and disclosure of those results and to propose appropriate mechanisms
for guarding against improper use.
B. Frequency of Reports
88. We also seek comment on how frequently incumbent LECs should
file performance reports with competing carriers once requested by
those carriers. Specifically, we seek comment
on the costs and benefits of requiring monthly reporting, as opposed to
reporting on a less frequent basis, such as quarterly. We also seek
comment on how quickly an incumbent LEC should provide a performance
report after it is requested.
C. Auditing Requirements
89. As part of a performance monitoring mechanism, several
competing carriers proposed that competing carriers be given a
reasonable opportunity to conduct audits of performance reports. These
commenters have stated that periodic auditing of the performance
reports is necessary to ensure that incumbent LECs are using
appropriate methodologies and are accurately reporting the required
measurements. We believe, however, that some audits may be unnecessary
or unduly burdensome for the incumbent LEC. We therefore seek comment
on the need to conduct such audits as part of a model performance
monitoring scheme. We also seek comment on the types of audits that
might impose undue burdens. Finally, we seek comment on mechanisms that
will permit competing carriers to conduct audits, when necessary, while
protecting incumbent LECs from unduly burdensome or unnecessary audits.
In addressing this issue, we ask parties to comment on who should pay
for the costs of the audit.
90. In addition to audits, LCUG also proposed that an incumbent LEC
should make available, at a competing carrier's request, the raw data
underlying a report at the same time it provides the performance report
to that competing carrier. The raw data is the data captured by the
incumbent LEC, such as the individual stop and start times, that is
used to produce the measurement results. The competing carrier could
use this data to
validate the incumbent LEC's performance measurements or to perform
additional statistical tests to determine whether there is a
statistically significant difference in the way in which an incumbent
LEC provisions itself compared with the way in which it provisions
competing carriers. We seek comment on whether model reporting
procedures should include providing access to raw data at this initial
stage, rather than in the context of an audit. We recognize that there
may be additional burdens or costs to the incumbent LEC in providing
the raw data to a competing carrier and that incumbent LECs may wish to
keep data regarding services and facilities they provide to themselves
confidential. We seek comment on the types and magnitudes of these
burdens or costs. To the extent that commenters support regular
provision of the raw data, they should explain why the advantages of
obtaining such data outweigh these costs.
91. Finally, we seek comment on how long the incumbent LEC should
retain the underlying data. One party proposed that an incumbent LEC
retain the data for two years. We seek comment on whether this is an
appropriate period for retention, or whether such a requirement is
excessive if a competing carrier is also permitted to obtain the raw
data on a regular basis along with the report.
VI. Evaluation of Performance Measurements
92. We believe that performance measurements and reporting
requirements are necessary to ensure that incumbent LECs provide
interconnection and access to OSS functions and OS/DA in compliance
with the statutory requirements of section 251 of the Communications
Act. As a practical matter, we expect that various parties will use the
information contained in performance measurements as bases for
determining whether an incumbent LEC is in compliance with the
applicable statutory standards. For example, competing carriers may
review the measurements to determine whether the incumbent LEC is
providing access in a nondiscriminatory manner. In making this
determination, parties will inevitably evaluate the results of these
measurements using some preestablished set of criteria in order to
determine whether the statutory requirements have been satisfied.
93. Although few parties raised the issue in the initial round of
comments, several carriers have recently raised questions about how
regulators and competing carriers can use the data generated by
performance measurements to evaluate whether an incumbent LEC has
adhered to its statutory obligations. We seek comment on whether we
should recommend use of a uniform evaluation process that relies on
objective criteria. We seek comment on whether such an approach will
inject more consistency and predictability into determining whether an
incumbent is meeting its statutory obligations. We believe that
bringing more consistency and predictability to the evaluation process
is supported by the pro-competitive goals of the 1996 Act and would
benefit both incumbent LECs and competing carriers.
94. Incumbent LECs must comply with various statutory requirements
in their provision of interconnection and access to OSS functions and
operator services and directory assistance. We believe that a number of
methods for evaluating performance measurements could be used to make
an objective determination as to whether an incumbent LEC is meeting
these statutory requirements. In particular, the few parties that have
addressed this issue have proposed using statistical analysis or
performance benchmarks as evaluation methodologies.
95. Statistical analysis can help reveal the likelihood that
reported differences in an incumbent LEC's performance toward its
retail customers and competing carriers are due to underlying
differences in behavior
rather than random chance. We seek comment on whether specifying a
preferred statistical methodology would assist in evaluating an
incumbent LEC's performance, and on whether a uniform statistical
methodology would assist in comparing the performance of incumbent LECs
across regions. We seek comment on which statistical tests, if any, the
Commission should recommend. We believe that simple statistical tests
that are widely understood and generally accepted would most likely be
perceived as fair and would lead to the least disagreement concerning
the interpretation of the statistical results. We seek comment on the
use of conventional statistical tests of the equality of means to
determine whether observed differences in various performance
measurements between an incumbent LEC's own retail customers and
competing carriers are likely to reflect actual differences in
performance. We also seek comment on whether tests of the equality of
variances or of the equality of the proportions of each sample that
exceed a given value would be useful. We seek comment on whether any
assumptions associated with the statistical methods described above
might not be met by the performance measurement data, and on what the
appropriate statistical methodology would be in such instances. We
request comment on the desirability of using other, more complex forms
of statistical analysis, and on whether additional data collection
would be necessary to allow use of these techniques.
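For illustration only, a conventional two-sample test of the equality
of means could be applied to interval data reported for the incumbent
LEC's retail operations and for a competing carrier, as in the sketch
below. The samples are hypothetical, and the choice of Welch's t-test
(which does not assume equal variances) is one of several conventional
options rather than a methodology endorsed here.

```python
# Hypothetical sketch of a test of the equality of means.
from scipy import stats

# Installation intervals in days for the incumbent LEC's own retail
# orders and for orders submitted by a competing carrier.
ilec_retail = [3, 4, 3, 5, 4, 3, 4, 4, 3, 5]
clec_orders = [5, 6, 4, 7, 5, 6, 5, 6, 7, 5]

# Welch's two-sample t-test of the equality of means.
t_stat, p_value = stats.ttest_ind(ilec_retail, clec_orders, equal_var=False)
print(t_stat, p_value)

# A small p-value suggests that the observed difference in average
# intervals is unlikely to be the result of random chance alone.
```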
96. In an ex parte submission, AT&T proposed using three criteria to
determine incumbent LEC compliance with nondiscrimination obligations:
a limit on the number of comparisons failing the statistical test for
nondiscrimination, a limit on the number of measurements repeatedly
failing that test, and a requirement that no extreme
differences occur between the results for the incumbent LEC and those
for the competing carrier. BellSouth in another proceeding has argued
that the appropriate standard is that monthly results for the competing
carrier should lie within three standard deviations of the average of
the incumbent LEC's monthly performance, and that the results for one
of the entities should not be higher than those for the other for three
consecutive months. We request comment on AT&T's and BellSouth's
proposed approaches to the use of statistical tests in evaluating
performance data. We note that, even if statistically significant
differences appear between results for the incumbent LEC and the
competing carrier, these differences may be too small to have any
practical competitive consequence and may not justify a legal
conclusion that the incumbent LEC has discriminated against the
competing carrier. Consequently, we seek comment on whether threshold
values of the absolute difference, or the percentage difference, in
averages of performance measures should be used in addition to measures
of statistical significance. We request comment on whether the form in
which an incumbent LEC makes the data available to other parties and to
regulators, for instance whether the data should be continuous or in
intervals, should be specified, and on whether the data should be
provided in a computer file rather than on paper.
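For illustration only, one reading of the BellSouth proposal described
above is sketched below: the competing carrier's monthly result is
checked against a band of three standard deviations around the average
of the incumbent LEC's monthly performance. The monthly values are
hypothetical.

```python
# Hypothetical sketch of a three-standard-deviation screen.
from statistics import mean, stdev

ilec_monthly = [4.0, 4.2, 3.9, 4.1, 4.0, 4.3]  # e.g., average days to install
clec_result = 5.5                               # competing carrier, current month

mu, sigma = mean(ilec_monthly), stdev(ilec_monthly)
within_band = abs(clec_result - mu) <= 3 * sigma
print(mu, sigma, within_band)
```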
VII. Other Issues Raised by Petitioners
97. In developing model rules, we tentatively conclude that it is
not appropriate at this time to undertake certain additional actions
requested by petitioners. These additional actions include establishing
performance standards, technical standards for OSS interfaces, and
remedial measures for non-compliant incumbent LECs.
VIII. Small and Midsized LECs
98. We seek comment on whether the proposed model performance
measurements and reporting requirements will impose particular costs or
burdens on small, rural, or midsized incumbent LECs. We also seek
comment on how the proposed model rules should be modified to take into
account any particular concerns of these LECs. For example, certain
incumbent LECs may believe that the proposed guidelines should be
tailored to meet circumstances relating to the areas in which small,
rural or midsized LECs are located.
IX. Procedural Matters
A. Ex Parte Presentations
99. This matter shall be treated as a ``permit-but-disclose''
proceeding in accordance with the Commission's ex parte rules. Persons
making oral ex parte presentations are reminded that memoranda
summarizing the presentations must contain summaries of the substance
of the presentations and not merely a listing of the subjects
discussed. More than a one or two sentence description of the views and
arguments presented is generally required. Other rules pertaining to
oral and written presentations are set forth in section 1.1206(b) as
well.
B. Initial Paperwork Reduction Act Analysis
100. This Notice contains a proposed information collection.
As part of its continuing effort to reduce paperwork burdens, we invite
the general public and the Office of Management and Budget (OMB) to
take this opportunity to comment on the information collections
contained in this Notice, as required by the Paperwork Reduction Act of
1995, Public Law 104-13. Public and agency comments are due at the same
time as other comments on this Notice; OMB comments are due 60 days
from date of publication of this Notice in the Federal Register.
Comments should address: (a) whether the proposed collection of
information is necessary for the proper performance of the functions of
the Commission, including whether the information shall have practical
utility; (b) the accuracy of the Commission's burden estimates; (c)
ways to enhance the quality, utility, and clarity of the information
collected; and (d) ways to minimize the burden of the collection of
information on the respondents, including the use of automated
collection techniques or other forms of information technology.
C. Initial Regulatory Flexibility Certification
101. As required by the Regulatory Flexibility Act (RFA), the
Commission has prepared the present Initial Regulatory Flexibility
Analysis (IRFA) of the possible significant economic impact on small
entities by the policies and rules proposed in the Notice of Proposed
Rulemaking (NPRM) on Performance Measurements and Reporting
Requirements for Operations Support Systems, Interconnection, and
Operator Services and Directory Assistance. Written public comments are
requested on the IRFA. Comments must be identified as responses to the
IRFA and must be filed by the deadlines for comments on the NPRM
provided below in Part IX.D. The Commission will send a copy of the
NPRM, including the IRFA, to the Chief Counsel for Advocacy of the
Small Business Administration. In addition, the NPRM on Performance
Measurements and Reporting Requirements for Operations Support Systems,
Interconnection, and Operator Services and Directory Assistance and
IRFA (or summaries thereof) will be provided in the Federal Register.
102. Need for and Objectives of the Proposed Rule. We are issuing
the NPRM specifically seeking comment on and presenting tentative
conclusions on proposed performance measurements and reporting
requirements intended to measure whether an incumbent LEC is providing
nondiscriminatory access to operations support services (OSS),
interconnection, and operator services and directory assistance (OS/
DA). We also seek comment on the use of performance standards and other
methods to evaluate whether an incumbent LEC is complying with its
statutory obligations under section 251. Finally, although we do not
set forth proposals in this area, we seek comment on issues related to
OSS interface standards and remedial provisions. Based on the comments
received in the NPRM, we may issue new rules.
103. Legal Basis. The legal basis for any action that may be taken
pursuant to the NPRM is contained in sections 1, 2, 4, 201, 202, 222,
251, and 303(r) of the Communications Act of 1934, as amended, 47
U.S.C. 151, 152, 154, 201, 202, 222, 251, and 303(r).
104. Description and Estimates of the Number of Small Entities
Affected by the Notice of Proposed Rulemaking. The RFA directs agencies
to provide a description of and, where feasible, an estimate of the
number of small entities that will be affected by our rules. The RFA
generally defines the term ``small entity'' as having the same meaning
as the terms ``small business,'' ``small organization,'' and ``small
governmental jurisdiction.'' For the purposes of this order, the RFA
defines a ``small business'' to be the same as a ``small business
concern'' under the Small Business Act, 15 U.S.C. 632, unless the
Commission has developed one or more definitions that are appropriate
to its activities. Under the Small Business Act, a ``small business
concern'' is one that: (1) is independently owned and operated; (2) is
not dominant in its field of operation; and (3) meets any additional
criteria established by the Small Business Administration (SBA). The
SBA has defined a small business
for Standard Industrial Classification (SIC) category 4813 (Telephone
Communications, Except Radiotelephone) to be an entity that has no more
than 1,500 employees.
105. Although affected incumbent local exchange carriers (ILECs)
may have no more than 1,500 employees, we do not believe that such
entities should be considered small entities within the meaning of the
RFA because they either are dominant in their field of operations or
are not independently owned and operated, and are therefore by
definition not ``small entities'' or ``small business concerns'' under
the RFA. Accordingly, our use of the terms ``small entities'' and
``small businesses'' does not encompass small incumbent LECs. Out of an
abundance of caution, however, for regulatory flexibility analysis
purposes, we will separately consider small ILECs within this analysis
and use the term ``small incumbent LECs'' to refer to any incumbent
LECs that arguably might be defined by SBA as ``small business
concerns.''
106. Total Number of Telephone Companies Affected. The United
States Bureau of the Census (the Census Bureau) reports that at the end
of 1992, there were 3,497 firms engaged in providing telephone
services, as defined therein, for at least one year. This number
contains a variety of different categories of carriers, including local
exchange carriers, interexchange carriers, competitive access
providers, cellular carriers, mobile service carriers, operator service
providers, pay telephone operators, PCS providers, covered SMR
providers, and resellers. It seems certain that some of those 3,497
telephone service firms do not qualify as small entities because they
are not ``independently owned and operated.'' For example, a PCS
provider that is affiliated with an interexchange carrier having more
than 1,500 employees would not meet the definition of a small business.
It seems reasonable to conclude, therefore, that fewer than 3,497
telephone service firms are either small entities or small incumbent
LECs that may be affected by this order.
107. Local Exchange Carriers. Neither the Commission nor the SBA
has developed a definition of small providers of local exchange
services. The closest applicable definition under the SBA's rules is
for telephone communications companies other than radiotelephone
(wireless) companies. The most reliable source of information regarding
the number of LECs nationwide of which we are aware appears to be the
data that we collect annually in connection with the Telecommunications
Relay Service (TRS). According to our most recent data, 1,371 companies
reported that they were engaged in the provision of local exchange
services. Although it seems certain that some of these carriers are not
independently owned and operated, or have more than 1,500 employees, or
are dominant, we are unable at this time to estimate with greater
precision the number of LECs that would qualify as small business
concerns under the SBA's definition. Consequently, we estimate that
fewer than 1,371 small providers of local exchange service are small
entities or small ILECs that may be affected by this order.
108. Description of Projected Reporting, Recordkeeping and Other
Compliance Requirements. We are seeking comment on requiring all
incumbent LECs to report on all the measurements. These proposed
measurements seek to measure access provided by an incumbent LEC to all
five OSS functions, as well as to interconnection and OS/DA. We also
seek comment on how often incumbent LECs should provide these
measurements, whether and for how long they should retain the
measurement data, and whether the incumbent LEC should perform any
statistical analysis of the measurement data. Finally, we seek comment
on reporting procedures, including: (1) whether an incumbent LEC must
report separately on performance to itself, any local exchange
affiliate, competing carriers in aggregate, and individual competing
carriers; (2) whether an incumbent LEC should only provide performance
monitoring reports to an individual competing carrier after receiving a
request from the competing carrier for such reports on a regular basis;
(3) how frequently an incumbent LEC should provide performance
monitoring reports; (4) whether to accord confidential treatment to
individual competing carrier information and incumbent LEC retail
information; (5) whether an incumbent LEC should make available upon
the request of a competing carrier or regulator raw data underlying a
report; and (6) whether competing carriers should be entitled to ask
for and obtain audits of the data underlying performance reports.
109. Steps Taken to Minimize Significant Economic Impact on Small
Entities and Significant Alternatives Considered. In Part VIII of the
NPRM, we seek comment on the expenses involved with the proposed
reporting requirements and the particular burdens they would impose on
small, rural, or midsized LECs, if any. In Part VIII, we also seek
comment on possible alternatives to these proposed measurements and
reporting requirements. We note that certain incumbent LECs might
propose ways in which the Commission should tailor its proposals to
meet circumstances relating to the areas in which small, rural or
midsized LECs are located.
110. Federal Rules that May Duplicate, Overlap, or Conflict with
the Proposed Rule. None.
D. Comment Filing Procedures
111. To file formally in this proceeding, you must file an original
and four copies of all comments, reply comments, and supporting
comments. Please note, however, that comments and reply comments may be
filed electronically. If you want each Commissioner to receive a
personal copy of your comments, you must file an original and nine
copies.
112. Comments and reply comments must include a short and concise
summary of the substantive arguments raised in the pleading. Comments
and reply comments must also comply with section 1.49 and all other
applicable sections of the Commission's rules. We also direct all
interested parties to include the name of the filing party and the date
of the filing on each page of their comments and reply comments. All
parties are encouraged to utilize a table of contents, regardless of
the length of their submission.
113. Parties are also asked to submit comments and reply comments
on diskette. Such diskette submissions would be in addition to and not
a substitute for the formal filing requirements addressed above.
Parties submitting diskettes should submit them to Janice Myles of the
Common Carrier Bureau, 1919 M Street, N.W., Room 544, Washington, D.C.,
20554. Such a submission should be on a 3.5 inch diskette formatted in
an IBM compatible form using MS DOS 5.0 and WordPerfect 5.1 software.
The diskette should be submitted in ``read only'' mode. The diskette
should be clearly labeled with the party's name, proceeding, type of
pleading (comment or reply comments) and date of submission. The
diskette should be accompanied by a cover letter.
114. You may also file informal comments or an exact copy of your
formal comments electronically via the Internet. To file electronic
comments in this proceeding, you may use the electronic filing
interface available on the FCC's World Wide Web site at http://
dettifoss.fcc.gov:8080/cgi-bin/ws.exe/beta/ecfs/upload.hts.
Only one copy of electronically-filed comments must be submitted.
Further information on the process of submitting comments
electronically is available at that location and at http://
www.fcc.gov/e-file/.
X. Ordering Clauses
115. Accordingly, it is ordered that, pursuant to sections 1, 2, 4,
201, 202, 222, 251, and 303(r) of the Communications Act of 1934, as
amended, 47 U.S.C. Secs. 151, 152, 154, 201, 202, 222, 251, and 303(r),
a notice of proposed rulemaking is adopted.
116. It is further ordered that the Commission's Office of Public
Affairs, Reference Operations Division, SHALL SEND a copy of this
Notice of proposed rulemaking, including the Initial Regulatory
Flexibility Certification, to the Chief Counsel for Advocacy of the
Small Business Administration, in accordance with the Regulatory
Flexibility Act, see 5 U.S.C. 605(b).
Federal Communications Commission.
Magalie Roman Salas,
Secretary.
[FR Doc. 98-12971 Filed 5-14-98; 8:45 am]
BILLING CODE 6712-01-P