Telecom Decision CRTC 2003-72

  Ottawa, 30 October 2003
 

Finalization of interim competition-related Quality of Service indicators and standards

  Reference: 8621-C12-01/00, 8638-C12-48/01 and 8660-C12-06/01
  In this decision, the Commission resolves several issues regarding the definition and implementation of certain competition-related Quality of Service (Q of S) indicators and gives final approval to ten Q of S indicators which were analyzed and reported on in report BPRE030a of the CRTC Interconnection Steering Committee (CISC) Business Process Working Group (BPWG), filed with the Commission on 2 December 2002 (the BPWG Report).
  In the BPWG Report, four Q of S indicators were submitted as consensus items. The Commission has approved three of these Q of S indicators as submitted, and one indicator has been approved with modifications. Six Q of S indicators were submitted on a non-consensus basis. The Commission has finalized these six indicators as set out in the decision below.
 

Introduction

1.

The Quality of Service (Q of S) regime established by the Commission in Quality of service indicators for use in telephone company regulation, Telecom Decision CRTC 97-16, 24 July 1997 (Decision 97-16) included three competition-related Q of S indicators (Indicator 1.6, Competitor Installation Appointments Met; Indicator 1.7, On-Time Activation of PICs for Alternate Providers of Long Distance Service (APLDS); and Indicator 2.6, Competitor Repair Appointments Met). Decision 97-16 directed those incumbent local exchange carriers (ILECs) subject to the decision to implement these indicators on an interim basis. Decision 97-16 also initiated a process to finalize these and other Q of S indicators.

2.

In Final standards for quality of service indicators for use in telephone company regulation and other related matters, Decision CRTC 2000-24, 20 January 2000 (Decision 2000-24), the Commission gave final approval to all of the interim Q of S indicators established by Decision 97-16, including the three competition-related Q of S indicators. In Decision 2000-24, the Commission also expressed the preliminary view that certain additional Q of S indicators might be appropriate and initiated a process to consider this possibility.

3.

In CRTC creates new quality of service indicators for telephone companies, Decision CRTC 2001-217, 9 April 2001 (Decision 2001-217), the Commission created six new competition-related Q of S indicators (Indicator 1.8, New Unbundled Type A and B Loop Order Service Intervals Met; Indicator 1.9, Migrated Unbundled Type A and B Loop Order Service Intervals Met; Indicator 1.10, Competitor LNP Order (Standalone) Service Interval Met; Indicator 1.11, Competitor Interconnection Trunk Order Service Interval Met; Indicator 2.7, Competitor Out-of-Service Trouble Reports Cleared within 24 Hours; and Indicator 2.8, Migrated Local Loop Completion Notices to Competitors). The Commission also directed the Network Operations Working Group (NOWG) and the Business Process Working Group (BPWG) of the CRTC Interconnection Steering Committee (CISC) to examine several outstanding issues relating to approved Q of S indicators, as well as the creation of additional competition-related Q of S indicators, and then report back to the Commission.

4.

In CISC recommended competition-related Quality of Service indicators - Follow up to Decision CRTC 2001-217, Decision CRTC 2001-366, 20 June 2001 (Decision 2001-366), the Commission considered the reports of the NOWG and BPWG that were prepared in response to Decision 2001-217. The Commission approved proposed changes to certain existing Indicators and gave interim approval to ten new competition-related Q of S Indicators (Indicator 1.12, Local Service Request Confirmed Due Dates Met; Indicator 1.13, Unbundled Type A and B Loop Order Late Completions; Indicator 1.14, Unbundled Type A and B Loop Held Orders; Indicator 1.15, Local Number Portability Order (Standalone) Late Completions; Indicator 1.16, Bill & Keep Interconnection Trunk Order Late Completions; Indicator 1.17, Local Service Request (LSR) Rejection Rate; Indicator 1.18, LSR Turnaround Time Met; Indicator 2.7A, Mean Time to Clear Competitor Out-of-Service Trouble Reports Outside the Performance Standard of Indicator 2.7; Indicator 2.8A, New Loop Status Provided to Competitors; and Indicator 2.9, Competitor Degraded Trouble Reports Cleared Within 48 Hours). The Commission also directed the BPWG to examine a full year of monitoring data - up to and including the second quarter of 2002 - and report by 1 December 2002 on the appropriateness and reasonableness of the new competition-related Q of S Indicators that had been approved on an interim basis.

5.

In order to establish the new competition-related Q of S Indicators as quickly as possible, Decision 2001-366 did not fully explain the Commission's determinations, but indicated that full reasons would be delivered later. Those reasons were provided in CISC recommended competition-related Quality of Service indicators - Follow up to Decisions CRTC 2001-217 and 2001-366, Decision CRTC 2001-636, 5 October 2001 (Decision 2001-636). Decision 2001-636 did not introduce any changes to the Q of S Indicators.

6.

On 2 December 2002, the BPWG filed report BPRE030a (the BPWG Report) with the Commission.
 

The BPWG Report

7.

The CISC BPWG is open to any member of the public who wishes to participate in its proceedings. Participants in the process which led to the BPWG Report were: Aliant Telecom Inc. (Aliant Telecom), AT&T Canada Corp. Inc. (AT&T Canada; now Allstream Inc.), Bell Canada, Bell West Inc., Call-Net Communications Inc. (Call-Net), Futureway Communications Inc. (Futureway; now FCI Broadband), GT Group Telecom Services Corp. (Group Telecom; now LondonConnect Inc.), TELUS Communications Inc. (TELUS) and Hewlett-Packard Company.

8.

The BPWG Report indicated that of the ten interim Q of S Indicators under consideration by the BPWG, consensus was reached on four of these indicators (Indicators 1.15, 1.18, 2.8A and 2.9).

9.

The BPWG was unable to reach consensus on the six remaining indicators because of differences in opinion as to the proper approach that should be adopted for certain matters. Specifically, the participants could not come to a consensus with respect to:
 

i) the proper basis of measurement for certain indicators;

 

ii) the treatment of missed deadlines attributable to a shortage in ILEC facilities; and

 

iii) the treatment of expedited orders.

10.

The BPWG Report provided the views of participants with respect to both the consensus items and the six indicators where consensus could not be reached. The BPWG Report also expressed the general view of participants that, prior to finalizing the interim competition-related Q of S Indicators, the Commission should initiate a formal public proceeding to provide interested persons with a further opportunity to submit comments on general issues relating to those Indicators. The BPWG Report noted that Bell Canada and Call-Net also believed that interested persons should be given an opportunity to comment on all competition-related Q of S Indicators, both final and interim, given that these Indicators would form the basis for the Rate Adjustment Plan (RAP) set out in Regulatory framework for second price cap period, Telecom Decision CRTC 2002-34, 30 May 2002 (Decision 2002-34). The BPWG Report stated that Commission staff who participated in the BPWG process had assured the BPWG participants that some form of public proceeding would be initiated to provide an opportunity for further comment on some or all of the Q of S Indicators.

11.

Overall, the BPWG Report gave rise to four general matters requiring consideration by the Commission:
 

a) the need for and possible scope of a further public process relating to the competition-related Q of S Indicators;

 

b) the treatment of the four Q of S Indicators on which the BPWG reached consensus;

 

c) the issues giving rise to the failure to reach consensus on the remaining six Q of S Indicators; and

 

d) the treatment of the six Q of S Indicators where consensus was not achieved.

 

a) A further public proceeding

12.

As noted above, in Decision 2001-366, the Commission directed the BPWG to report on the appropriateness and reasonableness of the interim Q of S indicators. In the Commission's view, the purpose of the BPWG process was to determine whether the interim Q of S Indicators should be modified before being finalized and, if so, to indicate what modifications might be appropriate. 

13.

The BPWG Report indicated that the general view of the BPWG participants was that a further public process would be appropriate in order to give interested parties an opportunity to provide additional comments on the interim competition-related Q of S Indicators before they are finalized.

14.

The BPWG Report also included the comments of participants regarding the objectives of a Quality of Service regime and the granularity of the measurements in the regime. In particular, Bell Canada expressed concern that there not be duplication as between Indicators since, in its view, this could distort the measurement of an ILEC's performance. Bell Canada argued that these issues should be the subject of a further public proceeding.

15.

The Commission notes that competitive local exchange carriers (CLECs) have reported difficulties at various steps of the ILECs' provisioning processes, which has resulted in the need to measure performance at these various steps. These measurements are used to monitor the efficiency of the entire provisioning process and thereby help to identify problem areas.

16.

The Commission is of the view that, at this early stage of competition, all tools that can help monitor a situation, focus on a problem area, assist in finalizing solutions and increase the efficiency and effectiveness of the competitive process, should be maintained. Collapsing some indicators into other existing indicators may dilute this capability.

17.

In the Commission's view, the primary reason for concern with regard to duplication of measurements for a single request for service is the possible subsequent use of those measurements in the RAP. However, in the Commission's view, finalizing the interim competition-related Q of S indicators at this time would not preclude a full discussion of the role of these Q of S indicators in the RAP for competitors during the follow-up proceeding on the RAP announced in Decision 2002-34.

18.

Finally, the Commission notes that competition-related Q of S indicators have been the subject of at least four public processes over the last several years. The Commission is of the view that it would not be premature to finalize the interim competition-related Q of S indicators in these circumstances.

19.

Accordingly, the Commission has concluded that a further public process is not required before the interim competition-related Q of S Indicators are finalized. The Commission notes that in Finalization of the Quality of Service rate adjustment plan for competitors, Telecom Public Notice CRTC 2003-9, 30 October 2003, the Commission has initiated a public proceeding to examine the RAP for competitors.
 

b) The treatment of the consensus items

20.

As noted above, the BPWG Report indicated that the BPWG had reached consensus on four indicators:
 
  • Indicator 1.15: Local Number Portability Order (Standalone) Late Completions;
 
  • Indicator 1.18: LSR Turnaround Time Met;
 
  • Indicator 2.8A: New Loop Status Provided to Competitors; and
 
  • Indicator 2.9: Competitor Degraded Trouble Reports Cleared Within 48 Hours.

21.

For each of these Indicators, the BPWG Report recommended newly drafted Business Rules. The BPWG Report also recommended:
 
  • changes to the definitions of the Numerator and the Denominator for Indicator 1.15;
 
  • changes to the Definition section and to the definition of the Numerator of Indicator 1.18;
 
  • changes to the Definition section and to the definitions of the Measurement Method, Numerator and Denominator for Indicator 2.8A; and
 
  • changes to the definition of the Numerator for Indicator 2.9.

22.

The Commission considers the proposed changes to Indicators 1.18, 2.8A and 2.9 to be appropriate. The Commission therefore approves on a final basis Indicators 1.18, 2.8A and 2.9 and their associated standards as submitted in the BPWG Report and set out in Appendix A to this decision.

23.

With regard to Indicator 1.15: Local Number Portability Order (Standalone) Late Completions, while this Indicator was filed as a consensus item in the BPWG Report, the Commission is not satisfied with two elements of the BPWG proposal.

24.

First, the Commission has concluded that a numbering change is required in order to maintain a consistent numbering pattern for the Q of S indicators. Specifically, where two Indicators track the same function but involve different time periods for performance of the function, the Indicators have generally been given related numbering (e.g., Indicators 2.7 and 2.7A, Indicators 2.8 and 2.8A).

25.

Applying the same approach to Indicator 1.15, the Commission is of the view that it would be logical to associate Indicator 1.15 directly with Indicator 1.10 as both indicators relate to the implementation of Local Number Portability orders by the ILEC. Indicator 1.10 tracks the ability of the ILEC to implement orders within the standard service interval. Indicator 1.15 tracks the implementation of missed orders within a further 24 hours. The Commission has therefore determined that Indicator 1.15 should be renumbered as Indicator 1.10A.

26.

The second element of concern pertains to the standards set for Indicators 1.10 and 1.10A. These standards have been defined to measure the ability of an ILEC to port a standalone number within two different time periods. Indicator 1.10 tracks the ability of an ILEC to port a standalone number within the standard service interval (e.g., 48 hours). The minimum required standard is a 90% success rate. For Indicator 1.10A, the ILEC is measured on its ability to port the remaining standalone numbers within the next 24 hours, 90% of the time. However, the Commission is of the view that the standard for the ILEC to port the remaining standalone numbers should be increased from 90% to 100% so as to measure all number porting. The Commission therefore determines that the standard for Indicator 1.10A, formerly 1.15, be set at 100%.
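
As a purely hypothetical illustration of how the two standards interact (the figures below are illustrative only and are not drawn from the record), consider a month with 200 standalone porting orders:

Indicator 1.10 (standard 90%): at least 0.90 × 200 = 180 orders must be ported within the standard service interval.

Indicator 1.10A (standard 100%): all of the remaining late orders (up to 20 in this example), rather than 90% of them, must be ported within the following 24 hours.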

27.

Accordingly, the Commission approves Indicator 1.10A and its associated standard on a final basis as described in Appendix A.
 

c) Issues preventing consensus for certain items

28.

The competition-related Q of S Indicators that were filed as non-consensus items in the BPWG Report were:
 
  • Indicator 1.12: Local Service Request Confirmed Due Date Met;
 
  • Indicator 1.13: Unbundled Type A and B Loop Order Late Completions;
 
  • Indicator 1.14: Unbundled Type A and B Loop Held Orders;
 
  • Indicator 1.16: Bill & Keep Interconnection Trunk Order Late Completions;
 
  • Indicator 1.17: Local Service Request (LSR) Rejection Rate; and
 
  • Indicator 2.7A: Competitor Out-of-Service Trouble Report Late Clearances.

29.

The BPWG Report identified the following issues as preventing the BPWG from reaching a consensus on the six Indicators identified above:
 

i) whether Indicators should be measured at the unbundled element level (e.g. loop, trunk, telephone number) or at the request level (e.g., Local Service Request (LSR), Abbreviated Access Service Request (AASR));

 

ii) how to deal with situations where the ILEC cannot fill an order because it does not have facilities in place; and

 

iii) how to deal with situations where a CLEC requests an expedited order.

 

i) The basis for the measurement of indicators

30.

The BPWG Report stated that there were inconsistencies among the ILECs as to how they collect and report the data for most of the Q of S indicators. In the record of the BPWG process it was noted that when the ILECs calculated their Q of S results, they interpreted the word "order" as meaning any one of the following:
 
  • the LSR;
 
  • the reference number of the LSR;
 
  • the AASR; or
 
  • the ILEC's internal order.

31.

According to the record of the BPWG process, most ILECs tracked and calculated quality of service results as follows:
 

a) for LSRs, the word "order" was interpreted as the LSR in its entirety;

 

b) for an AASR, the word "order" was interpreted as the AASR in its entirety for orders relating to Bill & Keep Interconnection Trunks (B & K Trunks); and

 

c) for an AASR involving repair and maintenance activities, the trouble report was the reference unit.

32.

The CLECs who participated in the BPWG claimed that the inconsistency of reporting made it impossible to verify the results filed by the ILECs. In general, the CLECs were of the view that ILECs should use identical measurement techniques so that it would be possible to track and compare quality of service performance and understand how the statistics were derived.

33.

AT&T Canada submitted that quality of service should be measured from the customer's perspective and therefore, an order should encompass all the loops and activity requested on the LSR or the AASR. Consequently, AT&T Canada submitted that the basis for measurement of most indicators should be at the LSR or at the AASR level.

34.

Bell Canada questioned the need for, and the practicability of, identical measurement techniques. Bell Canada also stated that when the Q of S Indicators were developed, there was an agreement among participants that variations due to counting based on orders or on components of orders would not affect the indicators to a significant degree.
  Commission analysis and determination

35.

The Commission is of the view that for an order to be reported as complete, all of its constituent elements should be delivered in working condition on the due date, including notification to the CLEC that placed the order. This should be the case for any order, whether it be an LSR or an AASR or a trouble report.

36.

The Commission is also of the view that partial completion of an order should not be credited as a partial or percentage completion. In the context of Competitor Services, quality of service relates primarily to the ability of a competitor to deliver a service to its end customer. If a single element of an order is not provided in working condition, including notification, then the competitor cannot provide the service to its end customer. Consequently, the entire order should be measured as incomplete.

37.

Accordingly, the Commission determines that for LSR and AASR orders and Trouble Reports to be measured as complete, all constituent elements making up the LSR and AASR orders and all constituent elements making up the Trouble Report have to be complete. The Commission also notes that, in order to avoid ambiguities in the counting of due dates for indicator measurement purposes, working days exclude the statutory and corporate (serving ILEC) holidays.
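
The following is a minimal sketch of how the all-or-nothing completion rule and the working-day convention in the preceding paragraph might be applied in an ILEC's reporting system; the record structure, field names and functions are illustrative assumptions only.

from datetime import date, timedelta

def order_complete(elements, clec_notified):
    # Per paragraph 37: an LSR, an AASR or a trouble report counts as complete
    # only if every constituent element is delivered in working condition and
    # the ordering CLEC has been notified.
    return clec_notified and all(e["in_working_condition"] for e in elements)

def working_days_late(due_date, completion_date, ilec_holidays):
    # Working days exclude weekends and the serving ILEC's statutory and
    # corporate holidays, avoiding ambiguity in due-date counting.
    days, d = 0, due_date
    while d < completion_date:
        d += timedelta(days=1)
        if d.weekday() < 5 and d not in ilec_holidays:
            days += 1
    return days

# Example: an order due on Friday, 2003-10-24 and completed on Monday,
# 2003-10-27 is one working day late, provided the Monday is not a holiday.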

38.

In keeping with this determination, the Commission concludes that:
 

a) for Indicators 1.8, 1.9, 1.10, 1.10A, 1.12, 1.13, 1.14, 1.17, 1.18, 2.8 and 2.8A the definition of an LSR should include all of its constituent elements;

 

b) for Indicators 1.11 and 1.16 the definition of an AASR should include all of its constituent elements; and

 

c) for Indicators 2.7, 2.7A and 2.9 the definition of a trouble report should include all of its constituent elements.

  ii) Treatment of the lack of ILEC facilities

39.

In the BPWG Report, consensus could not be achieved on Indicators 1.12 and 1.13 due to opposing views with respect to situations where an ILEC could not fulfil an order because of a lack of facilities.

40.

Call-Net, AT&T Canada, Group Telecom and Futureway submitted that in the measurement of Indicators 1.12 and 1.13, orders that could not be completed on the confirmed due date due to a lack of ILEC facilities should be counted as incomplete. They took the position that if an ILEC confirmed a due date, it should be held to that date.

41.

Aliant Telecom, Bell Canada and TELUS submitted that if orders could not be completed due to a lack of facilities then those orders should not be included in the measurement of Indicators 1.12 and 1.13. They argued that orders that could not be completed due to a lack of facilities were already captured by Indicator 1.14. Moreover, according to these companies, it was an established principle of the quality of service regime that installations that could not be completed due to circumstances beyond the ILEC's "reasonable control" would not be included as negative results in the quality of service measures.
  Commission analysis and determination

42.

In the Commission's view, a provisioning ILEC should confirm the availability of all the constituent elements of an order, including unbundled loops, feeder capacity, and line cards, prior to providing the CLEC with a due date for its order. If an ILEC were not to proceed on this basis, then the due date communicated to the CLEC could not be relied upon. This would defeat the purpose of providing a due date and undermine the very purpose of a quality of service regime (i.e., to ensure reliable, timely provision of services).

43.

The Commission notes that in Decision 2001-366 and in Decision 2001-636, it accepted the recommendation of the BPWG to exclude orders where due dates were missed due to a lack of facilities when it finalized Indicators 1.8 and 1.9.

44.

The Commission is of the view, however, that a confirmed due date should have a binding effect and that any order with a confirmed due date that has not been delivered on that date should be included in the measurement of the associated indicator.

45.

Accordingly, the Commission establishes as a fundamental operating principle for Q of S business rules and their corresponding indicators that, when an ILEC provides a due date to a service provider, that date becomes binding on the ILEC for all purposes, such that missing a due date for any reason, including a lack of facilities, will be counted as a miss. In accordance with this principle, the Commission modifies the definition of the business rules for Indicators 1.8, 1.9, 1.12 and 1.13 to include orders that cannot be completed on a confirmed due date due to a lack of ILEC facilities.
  iii) Treatment of expedited orders

46.

In response to a customer request, such as for an early installation date or relief of facilities congestion, CLECs occasionally ask an ILEC to expedite LSR or AASR orders so that they are completed ahead of the standard service due date set out for the particular service or facility. The CLEC is required to provide the ILEC with some justification for expediting the order, but the ILEC is not obligated to agree to an earlier date.

47.

The BPWG Report indicated that the participants were unable to agree as to the treatment of the failure of an ILEC to meet an expedited due date in respect of Indicator 1.16.

48.

AT&T Canada, Call-Net, Futureway and Group Telecom proposed that the expedited due date for an order should qualify as the effective due date and that a failure to meet this date should be included in the measurement of the indicator.

49.

Aliant Telecom, Bell Canada and TELUS proposed that the failure to meet an expedited due date should not be included in the measurement of an indicator and that the focus of measurement should be whether the order was completed within the standard service due date. In addition, they argued that any change in the Business Rules for expedited orders in the case of Indicator 1.16 would require a similar revision to final Indicator 1.11: Competitor Interconnection Trunk Order Service Interval Met and that the possibility of such a revision should be addressed in a broader proceeding.
  Commission analysis and determination

50.

The Commission recognizes that expediting a due date may require an ILEC to reassign resources and reset priorities in order to accommodate the requested date. It is therefore appropriate for an ILEC to require that a CLEC provide adequate justification to support its request, prior to the ILEC agreeing to an expedited due date.

51.

In the Commission's view, however, once an ILEC agrees to an expedited due date, it has created an expectation that that due date will be met and has implicitly accepted that the CLEC will rely on the due date when arranging its affairs. In these circumstances, the Commission considers that it would be inappropriate to treat the expedited due date as, in effect, non-binding for quality of service purposes. Consequently, the Commission is of the view that when agreed to by the ILEC and the CLEC, an expedited due date becomes the effective due date of the order, against which all results should be measured.

52.

The Commission notes that in Decisions 2001-366 and 2001-636, it determined that orders with a confirmed due date are to be measured against that due date, expedited or otherwise, for the purposes of Indicator 1.12. However, in those same decisions, orders with an expedited due date that were not completed on the expedited due date would be considered as having met the requirements of Indicators 1.8, 1.9 and 1.10, as long as the order was delivered within the standard service due date and would not be tracked in Indicator 1.16.

53.

The Commission is of the view that it would not be appropriate to maintain this different treatment of expedited orders for Indicators 1.8, 1.9, 1.10 and 1.16. Accordingly, the Commission is modifying these Indicators to reflect its determination that an expedited due date is to be treated as the effective due date against which competition-related Q of S results should be measured. The Commission is also revising Indicator 1.11 to reflect this approach. Revised versions of these Indicators are set out in Appendix A.

 

The non-consensus items

  Indicator 1.12: Local Service Request Confirmed Due Dates Met

54.

Indicator 1.12 measures the percentage of LSR orders that are delivered on the confirmed due date. This indicator measures the ILEC's ability to deliver new unbundled loops and to migrate existing unbundled loops. Orders that are not completed by their confirmed due date as a result of causes attributable to the CLECs or their customers are excluded.

55.

The BPWG Report indicated that the BPWG could not reach consensus on Indicator 1.12 because of disagreements on two issues:
 

i) the basis for the measurement of an LSR; and

 

ii) the treatment of LSRs that are not completed on their confirmed due date because of a lack of facilities.

56.

The Commission has determined above that in order for an LSR to be measured as complete, all the constituent elements making up the LSR must be complete. The Commission also determined above that if an order is not completed by its confirmed due date as a result of a lack of ILEC facilities, then it will be considered as a miss.

57.

Applying these two determinations to Indicator 1.12, the Commission approves on a final basis Indicator 1.12, its new definition and its associated standard as set out in Appendix A.
  Indicator 1.13: Unbundled Type A and B Loop Order Late Completions

58.

Indicator 1.13 measures the percentage of orders for unbundled type A and B loops that are not completed by their due date but which are completed within one working day of the due date. Orders that are not completed by the confirmed due date as a result of causes attributable to the CLECs or their customers are excluded.

59.

The BPWG Report indicated that the BPWG could not reach consensus on Indicator 1.13 because of a disagreement regarding when an order should qualify for consideration under Indicator 1.13.

60.

AT&T Canada, Call-Net, Futureway and Group Telecom submitted that an order that was not completed by its confirmed due date should count for the purposes of Indicator 1.13. They argued against measuring the lateness of orders solely by the standards of service as set out in Indicators 1.8 and 1.9 because this approach would not capture the full range of possible due dates.

61.

Bell Canada argued that the definition of Indicator 1.13 limits its scope to those orders measured under Indicators 1.8 and 1.9 that have missed their due date based on Commission-approved service intervals.

62.

The Commission notes that Indicator 1.13 was introduced in Decision 2001-366 to measure the percentage of orders for unbundled type A and B loops that are not completed on their due date but are completed within one working day of their due date. In that same decision, the Commission determined that for Indicator 1.13, the lateness of the order was to be measured against the due date.

63.

The Commission also notes that the Business Rules for Indicator 1.13, proposed in the BPWG Report, state that where the due dates for orders are missed due to a lack of facilities, they should be excluded from any measurement.

64.

The Commission is of the view that the approach adopted above in respect of Indicator 1.12 should also be applied to Indicator 1.13. In particular, the measure for the late completion of orders for unbundled type A and B loops and their sub-categories should be based on the due date confirmed by the ILEC and should not be restricted to the date based on the standard service interval plus one working day. In addition, orders where confirmed due dates are missed due to a lack of ILEC facilities should be included in the measurement of Indicator 1.13 for the same reasons as stated above for Indicator 1.12.

65.

Accordingly, the Commission approves on a final basis Indicator 1.13 as set out in Appendix A.
  Indicator 1.14: Unbundled Type A and B Loops Held Orders

66.

Indicator 1.14 measures the number of unbundled type A and B loops held orders expressed as a percentage of loop inward movement. The Commission noted in Decision 2001-636 that the setting of a performance standard for Indicator 1.14 would necessitate one year of data collection and directed the BPWG to file a report on Indicator 1.14 by the end of 2002.

67.

The BPWG Report included a revised definition of Indicator 1.14 to correct a misquote in Decision 2001-636 from Quality of Service Indicators BPWG consensus report BPRE028a (Report BPRE028a) filed on 11 May 2001. However, the BPWG Report also indicated that a consensus could not be reached on the definitions of the numerator and denominator of Indicator 1.14 because of a disagreement among the participants as to the proper treatment of orders that are not completed because of a lack of ILEC facilities. The participants were also unable to reach consensus on the standard to be applied.

68.

AT&T Canada, Call-Net and Group Telecom proposed revised definitions of the numerator and the denominator for Indicator 1.14 taking into account orders not completed due to a lack of ILEC facilities. In addition, they recommended the adoption of a 0.25% standard for this indicator. They indicated that this standard was derived from the average percentage of Call-Net held orders over a 12 month period in Bell Canada territory.

69.

Aliant Telecom and Bell Canada submitted that the definitions of the numerator and denominator for Indicator 1.14 should be addressed in the context of a general review of all indicators. Aliant Telecom, Bell Canada and TELUS proposed a standard of 3.3% for Indicator 1.14, based on the standard that applies for retail Q of S Indicator 1.3: Held Orders per 100 Main Inward Movement.

70.

In keeping with its determination above, the Commission is of the view that orders not completed due to a lack of ILEC facilities should be taken into account in the calculation of Indicator 1.14. In addition, the Commission considers the proposed standard of 0.25% appropriate, given that it is based on historical data for unbundled type A and B loop held orders relative to loop inward movement.

71.

Accordingly, the Commission approves on a final basis Indicator 1.14 and its associated standard as described in Appendix A.
  Indicator 1.16: Bill & Keep Interconnection Trunk Order Late Completions

72.

Indicator 1.16 measures the late completion, within five working days of the due date, of local exchange carrier interconnection trunks (also known as Bill & Keep Trunks). Currently, ILECs exclude from the measurement of this indicator orders where the due date is missed for reasons attributable to the CLEC, as well as orders for which an expedited due date was missed, provided that the order was completed within the applicable standard service due date.

73.

The BPWG Report indicated that consensus could not be reached on the proper treatment of orders with an expedited due date where that due date is missed.

74.

As noted above, the Commission is of the view that any order with an expedited due date should be included in the calculation of the associated indicator. Consequently, the Commission determines that an order for Bill & Keep Trunks where an expedited due date is missed shall be tracked by Indicator 1.16.

75.

The Commission notes that the BPWG Report did not comment on the standard for Indicator 1.16, which is currently set at 90%. The Commission is of the view that, in keeping with its determination regarding the standard for Indicator 1.10A, the standard for Indicator 1.16 should be increased to 100% to capture all late completions of Bill & Keep Trunks.

76.

The Commission also notes that the same issue of numbering consistency arises with respect to Indicators 1.11 and 1.16 as is discussed above in connection with Indicators 1.10 and 1.10A. In both instances, the second indicator is a continuation of the first, whereby the ILEC is given an extended time period to perform a specific function. The Commission is of the view that to be consistent with the numbering of Indicators 2.7 and 2.7A, and with the approach adopted in respect of Indicator 1.10A, it would be logical to directly associate Indicator 1.11 with Indicator 1.16.

77.

In light of the above, the Commission determines that Indicator 1.16 shall be renumbered as Indicator 1.11A and that the due date shall mean the standard service due date, unless the parties have agreed to an earlier or later due date, in which case the agreed upon date shall qualify as the due date. The Commission also sets the standard for Indicator 1.11A at 100%. The Commission approves on a final basis Indicator 1.11A and its associated standard as described in Appendix A.
  Indicator 1.17: Local Service Request Rejection Rate

78.

Indicator 1.17 measures the percentage of LSR orders rejected by the ILECs due to perceived errors. This indicator does not have an associated standard. Following a request from the Commission in Decision 2001-217, the BPWG reviewed the indicator and expressed the view in Report BPRE028a that an industry performance standard was not meaningful. The BPWG Report indicated that consensus could not be reached on a standard for this indicator.

79.

AT&T Canada submitted that the current rejection rate was high and proposed a standard of 5%. AT&T Canada argued that requiring the ILECs to report on the reasons that the standard had not been met would assist parties in determining the root causes for the rejection of orders.

80.

Call-Net and Bell Canada proposed that the indicator be reported and monitored but that no standard should be established.

81.

The Commission notes that in Decision 2001-217, Indicator 1.17 was introduced to monitor the possibility of biased treatment of a CLEC by an ILEC in respect of the rejection of LSRs because of perceived errors. The Commission is of the view that LSRs should not be rejected based on a subjective perception. Instead, an LSR should only be rejected on the basis of an error that can be objectively demonstrated and that requires some corrective action which warrants the subsequent re-issue of an order. The Commission is therefore of the view that the Definition of Indicator 1.17 should be changed to read "due to errors identified by the ILEC" instead of "due to errors perceived by the ILEC".

82.

The Commission notes that no supporting rationale was provided by AT&T Canada to justify its proposed 5% standard. The Commission also notes that no alternative standard was proposed by any other participant in the BPWG.

83.

The Commission is of the view that establishing a standard for Indicator 1.17 would help in the assessment of the quality of orders issued by CLECs. The Commission is also of the view that, in those cases where the standard is not met, an analysis of the ILEC's reasons for rejecting the CLEC's orders would assist both the ILEC and the CLEC in identifying the root problems with the orders and motivate the parties to find solutions.

84.

Accordingly, the Commission modifies the definition of Indicator 1.17 to read "the percentage of LSRs submitted by the CLECs that are returned due to errors identified by the ILECs and based on an error that can be objectively demonstrated and that requires some corrective action that warrants the re-issue of an order" and approves a 5% standard for this indicator. The Commission approves on a final basis Indicator 1.17 and its associated standard as described in Appendix A.
  Indicator 2.7A: Mean Time to Clear Competitor Out-of-Service Trouble Reports Outside the Performance Standard of Indicator 2.7

85.

Indicator 2.7 measures the percentage of initial out-of-service trouble reports cleared within 24 hours relative to the total number of initial out-of-service trouble reports received during the month.

86.

Indicator 2.7A measures the mean time to repair local loops, capturing the actual length of time the service is out beyond the time period established in the performance standard of Indicator 2.7. ILECs compile the total number of trouble reports outside the performance standard of Indicator 2.7 and the mean time to repair.

87.

The BPWG Report identified two alternative Definitions for Indicator 2.7A but indicated that consensus could not be reached on a preferred definition.

88.

The first proposed Definition is: "The mean time to repair type A and B unbundled loops and their sub-categories for trouble reports that are not cleared within 24 hours, i.e., outside the performance standard of Indicator 2.7."

89.

The second proposed Definition is: "The percentage of trouble reports for type A and B unbundled loops and their sub-categories that are not cleared within 24 hours, i.e., outside the performance standard of Indicator 2.7, but which are cleared within the subsequent 24 hours." The BPWG Report also proposed the following title for Indicator 2.7A if this second Definition were adopted: "Competitor Out-of-Service Trouble Report Late Clearances".

90.

Aliant Telecom, Bell Canada, Call-Net, Futureway, Group Telecom and TELUS supported the first alternative Definition. AT&T Canada supported the second alternative Definition.

91.

Parties could not agree on a standard for this indicator using either definition. For the first definition, Call-Net and Group Telecom proposed that the standard be 32 hours or less and the ILECs proposed a standard of 48 hours or less. For the second definition, the CLECs proposed a 100% standard while the ILECs proposed that any change in the standard be addressed in the context of a general review of all indicators.

92.

The Commission notes that a performance standard of 90% was set on an interim basis for Indicator 2.7A in Decision 2001-636. However, the definition of Indicator 2.7A did not specify a time reference against which the 90% standard should be measured. In other words, the definition uses one unit of measure (i.e., time) to define what is monitored, while the performance standard uses another unit (i.e., percentage) to assess performance. Consequently, the standard cannot be applied.

93.

The Commission is of the view that if the second definition were adopted then it would be appropriate to set the standard at 100% in order to permit the tracking of all trouble reports that are not cleared within the time period set out in the definition. The Commission is also of the view that this approach would be more effective than the first definition in promoting a focused attempt to clear competitor troubles since it would draw attention to those service outages that remain unresolved after 24 hours.

94.

Accordingly, the Commission approves the second definition of Indicator 2.7A set out above under the proposed new title "Competitor Out-of-Service Trouble Report Late Clearances". The Commission also approves a standard of 100% for this indicator to capture all trouble reports that are not completed within the time period set out in Indicator 2.7. The Commission approves on a final basis Indicator 2.7A and its associated standard as described in Appendix A.
  Secretary General
  This document is available in alternative format upon request and may also be examined at the following Internet site: www.crtc.gc.ca
 

 

APPENDIX A

  Indicator 1.8 - New Unbundled Type A and B Loop Order Service Intervals Met
  Definition: The percentage of time that the due dates for the provisioning of new unbundled type A and B local loop orders are met within the applicable standard service interval.
  Measurement Method: Completed new loop orders are compiled, and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.8 - New Unbundled Type A and B Loop Order Service Intervals Met.
  Numerator: Number of orders for new type A and B unbundled loops that have met the standard interval due date for the month.
  Denominator: Total number of orders for new type A and B unbundled loops for which a standard interval due date has been assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Business Rules:
 

· CLEC by CLEC.

 

· Loop(s) delivered in working condition and according to the loop specifications agreed to by the Industry.

 

· Include orders that cannot be completed on an agreed to expedited due date. These orders are counted as missed in the calculation of the indicator.

 

· Include in measurement those orders where confirmed due dates are missed due to a lack of facilities. These orders are counted as missed in the calculation of the indicator.

 

· Exclude from the measurement, those local service requests (LSRs) where confirmed due dates are missed due to causes attributable to CLECs or their customers per Due Dates Missed Attributable to End Customers or CLECs, CISC BPWG consensus report BPRE029a (Report BPRE029a), 21 January 2002, and approved by the Commission in CRTC Interconnection Steering Committee - Consensus Reports, Telecom Decision CRTC 2002-26, 22 April 2002 (Decision 2002-26).
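
Read together, the Numerator, Denominator and Standard above imply the following calculation, which also illustrates the pattern used for the other percentage-based indicators in this appendix (no inputs beyond the fields defined above are assumed):

Indicator 1.8 result (%) = 100 × (new type A and B unbundled loop orders completed within the standard interval due date in the month) ÷ (new type A and B unbundled loop orders assigned a standard interval due date in the month), to be compared against the 90% standard.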

  Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met
  Definition: The percentage of time that the due dates for the provisioning of migrated unbundled type A and B local loop orders are met within the applicable standard service interval.
  Measurement Method: Completed loop migration orders are compiled, and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.9 - Migrated Unbundled Type A and B Loop Order Service Intervals Met.
  Numerator: Number of orders for migrated type A and B unbundled loops that have met the standard interval due date for the month.
  Denominator: Total number of orders for migrated type A and B unbundled loops for which a standard interval due date has been assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Business Rules:
 

· CLEC by CLEC.

 

· Loop(s) delivered in working condition and according to the loop specifications agreed to by the Industry.

 

· Include orders that cannot be completed on an agreed to expedited due date. These orders are counted as missed in the calculation of the indicator.

 

· Include in measurement those orders where confirmed due dates are missed due to a lack of facilities. These orders are counted as missed in the calculation of the indicator.

 

· Exclude from the measurement, those LSRs where confirmed due dates are missed due to causes attributable to CLECs or their customers per CISC BPWG consensus report BPRE029a, and approved by the Commission in Decision 2002-26.

  Indicator 1.10 - Local Number Portability (LNP) Order (Standalone) Service Interval Met
  Definition: The percentage of time that due dates relating to orders for the standalone porting of numbers are met within the applicable standard service interval.
  Measurement Method: Completed standalone LNP orders are compiled and the percentage of those that were completed within the applicable standard service interval is reported. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.10 - Local Number Portability (LNP) Order (Standalone) Service Interval Met.
  Numerator: Number of orders for standalone porting of numbers that have met the standard interval due date for the month.
  Denominator: Total number of orders for standalone porting of numbers for which a standard interval due date was assigned for the month. Orders for which the requested due date is beyond the applicable standard service interval are excluded from this measure.
  Business Rules:
 

· CLEC by CLEC.

 

· Include orders that cannot be completed on an agreed to expedited due date. These orders are counted as missed in the calculation of the indicator.

  Indicator 1.10A (formerly 1.15) - Local Number Portability Order (Standalone) Late Completions
  Definition: The percentage of orders for standalone porting of numbers that missed the confirmed due date, which are completed within one working day of the confirmed due date.
  Measurement Method: Completed (standalone) local number portability orders that missed their confirmed due dates are compiled, and the percentage of those that were completed within one working day of their respective confirmed due dates is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 100%.
  Reporting Format: Indicator 1.10A - Local Number Portability Order (Standalone) Late Completions.
  Numerator: Total number of orders for standalone porting of numbers in the month that missed the confirmed due date, which were completed within one working day of the confirmed due date.
  Denominator: Total number of orders for standalone porting of numbers completed in the month for which a confirmed due date was missed.
  Business Rules:
 

· CLEC by CLEC.

 

· Standalone porting of numbers only.

 

· Include orders not meeting the standard in Indicator 1.10.

 

· Exclude from the measurement, those orders (ports) where confirmed due dates are missed due to causes attributable to CLECs or their customers per CISC BPWG consensus report BPRE029a, and approved by the Commission in Decision 2002-26.

 

· Orders are considered completed when the ILEC has created a Subscription Version in the Number Portability Administration Centre / Service Management System (NPAC/SMS) within the applicable interval defined in industry guidelines and placed the 10-digit unconditional trigger on the telephone number in the local switch, where this capability is available.

  Indicator 1.11 - Competitor Interconnection Trunk Order Service Interval Met
  Definition: The percentage of time that the agreed upon due dates for the turn-up of Bill & Keep Interconnection Trunks are met.
  Measurement Method: Tracking of due dates met. The due date interval is 20 business days when augments to existing trunk groups are required where facilities exist and 35 business days when new trunk groups are required where no facilities exist.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.11 - Competitor Interconnection Trunk Order Service Interval Met.
  Numerator: Number of orders for Bill & Keep Trunks that have met the standard interval (agreed upon) due date for the month.
  Denominator: Total number of orders for Bill & Keep Trunks for which a standard interval (agreed upon) due date has been assigned for the month. The due date interval is 20 business days when augments to existing trunk groups are required where facilities exist and 35 business days when new trunk groups are required where no facilities exist.
  Business Rules:
 

· CLEC by CLEC.

 

· Trunk(s) delivered in working condition and according to industry specifications.

 

· Include in measurement those orders where confirmed due dates are missed due to a lack of facilities. These orders are counted as missed in the calculation of the indicator.

 

· Include orders that cannot be completed on an agreed to expedited due date. These orders are counted as missed in the calculation of the indicator.

 

· Exclude from the measurement, those LSRs where confirmed due dates are missed due to causes attributable to CLECs.

  Indicator 1.11A (formerly 1.16) - Bill & Keep Interconnection Trunk Order Late Completions
  Definition: The percentage of orders for the turn-up of B & K Trunks for which the agreed upon due date is missed, but which are completed within five working days of the agreed upon due date.
  Measurement Method: Completed orders for B & K Trunks which were not completed on their due dates are compiled, and the percentage of these orders which were then completed within the next five working days of their respective due date is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 100%.
  Reporting Format: Indicator 1.11A - Bill & Keep Interconnection Trunk Order Late Completions.
  Numerator: Total number of orders for Bill & Keep Interconnection Trunks which were not completed on their due date, but were then completed within the next five working days of the due date.
  Denominator: Total number of completed orders for Bill & Keep Interconnection Trunks for which a due date for that month was missed.
  Business Rules:
 

· CLEC by CLEC.

 

· Include all orders captured by Indicator 1.11 that are not completed by the standard service due date set out in Indicator 1.11.

 

· The due date means the standard service due date, unless the parties have agreed to an earlier or later due date, in which case the agreed upon date qualifies as the due date.

 

· Exclude from the measurement those LSRs where confirmed due dates are missed due to causes attributable to CLECs or their customers per Report BPRE029a and approved by the Commission in Decision 2002-26.

  Indicator 1.12 - Local Service Request Confirmed Due Dates Met
  Definition: The percentage of instances that the confirmed due date is met for the provisioning of Local Service Requests (LSRs). The due date means the standard service due date, unless the parties have agreed to an earlier due date.
  Measurement Method: Completed LSRs are compiled, and the percentage of those which were completed by the due date is reported. LSRs are to be counted as complete only if all constituent elements of the LSR order are complete.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.12 - Local Service Request Confirmed Due Dates Met.
  Numerator: Total number of LSRs completed on the due date during the month.
  Denominator: Total number of LSRs completed during the month.
  Business Rules:
 

· CLEC by CLEC.

 

· Expedited orders will be included.

 

· Include in measurement those LSRs where due dates are missed due to a lack of facilities.

 

· Exclude from the measurement, those LSRs where confirmed due dates are missed due to causes attributable to CLECs or their customers per CISC BPWG consensus report BPRE029a, and approved by the Commission in Decision 2002-26.

 

· All constituent elements of an order are to be delivered in working condition.

  Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions
  Definition: The percentage of orders for unbundled type A and B loops and their sub-categories, for which the due date as measured in Indicators 1.8, 1.9 and 1.12 was missed, but which were completed within one working day of the confirmed due date. The due date means the standard service due date, unless the parties have agreed to an earlier due date.
  Measurement Method: Completed loop orders that are not completed by their due dates are compiled, and the percentage of those which were completed within one working day of their respective confirmed due dates is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.13 - Unbundled Type A and B Loop Order Late Completions.
  Numerator: Total number of orders for new and migrated type A and B unbundled loops and their sub-categories that have been completed within the month, but missed the confirmed due date by one working day.
  Denominator: Total number of orders for new and migrated type A and B unbundled loops and their sub-categories completed within the month for which a due date has been missed.
  Business Rules:
 

· CLEC by CLEC.

 

· Include in measurement those orders where due dates are missed due to a lack of facilities.

 

· Exclude from the measurement those orders for type A and B loops and their sub-categories where due dates are missed due to causes attributable to CLECs or their customers per BPRE029a, and approved by the Commission in Decision 2002-26.

  Indicator 1.14 - Unbundled Type A and B Loops Held Orders
  Definition: The number of orders for type A and B loops and their sub-categories that were not completed on the confirmed due date because of a lack of facilities, expressed as a percentage of loop inward movement.
  The confirmed due date means the date assigned by the provisioning ILEC and does not necessarily reflect the standard service interval, nor the customer requested due date.
  Inward movement means instances in which there is the provisioning of new and the migration of unbundled loops or modifications to existing unbundled loops that require loop facility changes.
  Measurement Method: Orders for unbundled loops are compiled and the percentage of these orders that were not completed on the due date as a result of the lack of facilities is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 0.25% or less.
  Reporting Format: Indicator 1.14 - Unbundled Type A and B Loops Held Orders.
  Numerator: Total number of completed orders for type A and B unbundled loops and their sub-categories (inward movement) that were not completed on their due dates that month due to a lack of facilities together with the total number of orders for the month not yet completed for which confirmed due dates cannot be met due to a lack of facilities.
  Denominator: Total number of orders for type A and B unbundled loops and their sub-categories (inward movement) completed for the month, together with the total number of orders for the month not yet completed for which due dates cannot be met due to a lack of facilities.
  Business Rules:
 

· CLEC by CLEC.
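
For convenience, the Numerator and Denominator above imply the following calculation (no inputs beyond those defined above are assumed):

Indicator 1.14 result (%) = 100 × (completed loop orders that missed their due date in the month due to a lack of facilities + orders not yet completed whose confirmed due dates cannot be met due to a lack of facilities) ÷ (loop orders completed in the month + orders not yet completed whose due dates cannot be met due to a lack of facilities), to be compared against the 0.25% standard.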

  Indicator 1.17 - Local Service Request (LSR) Rejection Rate
  Definition: The percentage of LSRs submitted by CLECs that are returned due to errors identified by the ILEC, where the error can be objectively demonstrated and requires corrective action warranting the re-issue of the order.
  Measurement Method: LSRs received and rejected are tracked and reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 5% or less.
  Reporting Format: Indicator 1.17 - Local Service Requests Rejected.
  Numerator: Total number of LSRs rejected by the ILEC during the month.
  Denominator: Total number of LSRs received by the ILEC during the month.
  Business Rules:
 

· CLEC by CLEC.
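  Since the business rule calls for CLEC-by-CLEC reporting, a sketch of the rejection-rate calculation might group submitted LSRs by CLEC, as below; the sample records and field names are invented for illustration.

    from collections import defaultdict

    # Hypothetical LSRs received during the month: (submitting CLEC, was the LSR rejected?).
    lsrs = [
        ("CLEC-A", False), ("CLEC-A", True), ("CLEC-A", False), ("CLEC-A", False),
        ("CLEC-B", False), ("CLEC-B", False),
    ]

    received = defaultdict(int)   # denominator per CLEC
    rejected = defaultdict(int)   # numerator per CLEC

    for clec, was_rejected in lsrs:
        received[clec] += 1
        if was_rejected:
            rejected[clec] += 1

    for clec in sorted(received):
        rate = 100.0 * rejected[clec] / received[clec]
        status = "meets" if rate <= 5.0 else "misses"
        print(f"{clec}: Indicator 1.17 = {rate:.1f}% rejected ({status} the 5%-or-less standard)")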

  Indicator 1.18 - Local Service Request (LSR) Turnaround Time Met
  Definition: The percentage of instances that the applicable LSR confirmation interval is met, as defined in the Canadian Local Ordering Guidelines (C-LOG), and in accordance with applicable Commission decisions.
  Measurement Method: Local Service Confirmations (LSCs) are compiled, and the percentage of those which were returned within the applicable standard interval is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 1.18 - Local Service Request (LSR) Turnaround Time Met.
  Numerator: Total number of Local Service Confirmations (LSCs) returned to the CLEC during the month within the applicable standard interval.
  Denominator: Total number of Local Service Confirmations (LSCs) issued during the month.
  Business Rules:
 

· CLEC by CLEC.
· Measurement follows the specific confirmation intervals related to the standard service as defined in the C-LOG.
· Once an LSC has been issued and a subsequent version of the LSR is issued, the service interval related to the new LSC commences.
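  To illustrate the business rule that a subsequent version of an LSR restarts the confirmation interval, the sketch below measures each LSC against the latest LSR version it confirms. The two-working-day interval, the records and the field names are assumptions for illustration only; the actual confirmation intervals are those defined in the C-LOG.

    from datetime import datetime, timedelta

    # Assumed standard confirmation interval (the actual intervals are defined in the C-LOG).
    STANDARD_INTERVAL = timedelta(hours=48)

    # Hypothetical LSCs issued during the month.  When a subsequent version of the LSR
    # has been submitted, the interval is measured from that latest version.
    lscs = [
        {"latest_lsr_version_received": datetime(2003, 9, 8, 9, 0),
         "lsc_returned": datetime(2003, 9, 9, 16, 0)},               # within the interval
        {"latest_lsr_version_received": datetime(2003, 9, 10, 9, 0),
         "lsc_returned": datetime(2003, 9, 15, 9, 0)},               # outside the interval
    ]

    numerator = sum(
        1 for c in lscs
        if c["lsc_returned"] - c["latest_lsr_version_received"] <= STANDARD_INTERVAL
    )
    denominator = len(lscs)

    rate = 100.0 * numerator / denominator if denominator else 100.0
    print(f"Indicator 1.18: {rate:.1f}% of LSCs returned on time (standard: 90% or more)")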

  Indicator 2.7 - Competitor Out-of-Service Trouble Reports Cleared within 24 Hours (Final indicator not subject to this decision. Included for the sake of comprehensiveness.)
  Definition: The total number of initial out-of-service trouble reports and the number cleared within 24 hours, with the percentage of reports cleared relative to this total.
  Measurement Method: Compilation of trouble report data gathered at each repair bureau.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 80%.
  Reporting Format: Indicator 2.7 - Competitor Out-of-Service Trouble Reports Cleared within 24 Hours.
  Numerator: Number of initial out-of-service trouble reports cleared within 24 hours of their receipt during the month.
  Denominator: Total number of initial out-of-service trouble reports received during the month.
  Business Rules:
 

· CLEC by CLEC.

  Indicator 2.7A - Competitor Out-of-Service Trouble Report Late Clearances
  Definition: The percentage of trouble reports for type A and B unbundled loops and their sub-categories that are not cleared within 24 hours (i.e. outside the performance standard of Indicator 2.7), but which are cleared within the subsequent 24 hours.
  Measurement Method: Trouble reports for type A and B unbundled loops and their sub-categories that fall outside the performance standard of Indicator 2.7 are compiled, and the percentage of these trouble reports that are cleared within a subsequent 24-hour period is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 100%.
  Reporting Format: Indicator 2.7A - Competitor Out-of-Service Trouble Report Late Clearances.
  Numerator: Total number of initial out-of-service trouble reports received during the month for type A and B unbundled loops and their sub-categories cleared within 48 hours, excluding those cleared within 24 hours of their issuance.
  Denominator: Total number of initial out-of-service trouble reports received during the month for type A and B unbundled loops and their sub-categories, excluding those cleared within 24 hours of their issuance.
  Business Rules:
 

· CLEC by CLEC.
· Includes out-of-service trouble reports for type A and B unbundled loops and their sub-categories not meeting Indicator 2.7.
· Excludes a subsequent report related to an open trouble.
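  Because Indicator 2.7A applies only to reports that fall outside Indicator 2.7's 24-hour standard, both measures can be derived from the same trouble-report data, as in the hypothetical sketch below; the clearance times (in hours from receipt) are invented for illustration.

    # Hypothetical clearance times, in hours from receipt, for initial out-of-service
    # trouble reports on type A and B unbundled loops received during the month.
    clearance_hours = [3, 10, 22, 30, 40, 55]

    total_reports = len(clearance_hours)
    cleared_within_24h = sum(1 for h in clearance_hours if h <= 24)

    # Indicator 2.7: share of all initial reports cleared within 24 hours.
    ind_2_7 = 100.0 * cleared_within_24h / total_reports

    # Indicator 2.7A: of the reports NOT cleared within 24 hours, the share cleared
    # within the subsequent 24 hours (i.e. within 48 hours of receipt overall).
    late_reports = [h for h in clearance_hours if h > 24]
    cleared_in_next_24h = sum(1 for h in late_reports if h <= 48)
    ind_2_7a = 100.0 * cleared_in_next_24h / len(late_reports) if late_reports else 100.0

    print(f"Indicator 2.7:  {ind_2_7:.1f}% (standard: 80%)")
    print(f"Indicator 2.7A: {ind_2_7a:.1f}% (standard: 100%)")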

  Indicator 2.8 - Migrated Local Loop Completion Notices to Competitors (Final indicator not subject to this decision. Included for the sake of comprehensiveness.)
  Definition: The total number of completed local loop migrations and the number of notifications given on time by the incumbent telephone company to the competitor, advising that the local loop migration is complete at the incumbent telephone company's facilities, with the percentage of notifications given on time relative to this total.
  Measurement Method: Completions of migrated local loops and the notifications given on time are sorted to determine the actual numbers and the percentage of notifications given on time.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90%.
  Reporting Format: Indicator 2.8 - Migrated Local Loop Completion Notices to Competitors.
  Numerator: Number of notifications of local loop migrations completed during the month given on time to the CLEC.
  Denominator: Total number of completions of migrations of local loops scheduled for that month.
  Business Rules:
 

· CLEC by CLEC.

  Indicator 2.8A - New Loop Status Provided to Competitors
  Definition: Percentage of order completion notices and order status reports provided to competitors for new type A and B unbundled loops and their sub-categories. Completion notices are to be provided to competitors as soon as possible following installation of an unbundled loop. Order status reports are to be provided to the competitor by 5:00 p.m. (in the ILEC serving territory) for uncompleted orders on the day for which the orders are scheduled.
  Measurement Method: New loop orders are compiled, and the percentage of those for which the required completion notices and/or order status reports were provided to the competitor is reported.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 2.8A - New Loop Status Provided to Competitors.
  Numerator: Total number of orders in the month for new unbundled type A and B loops and their sub-categories for which the required completion notices and/or order status reports were given.
  Denominator: Total number of orders for new unbundled type A and B loops and their sub-categories scheduled to be completed in the month.
  Business Rules:
 

· CLEC by CLEC.
· Status to be provided by 5:00 p.m. in the ILEC serving territory.
· Measurement includes the count of completion notifications provided on completed new loops and status provided on non-completed new type A and B unbundled loops and their sub-categories.
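  As a hedged illustration of how completed and uncompleted orders both feed the numerator, the sketch below counts a completion notice for installed loops and a 5:00 p.m. status report for loops scheduled but not completed; the order records and field names are hypothetical.

    # Hypothetical new-loop orders scheduled to be completed in the month.
    orders = [
        {"completed": True,  "completion_notice_sent": True,  "status_report_by_5pm": None},
        {"completed": True,  "completion_notice_sent": False, "status_report_by_5pm": None},
        {"completed": False, "completion_notice_sent": None,  "status_report_by_5pm": True},
        {"completed": False, "completion_notice_sent": None,  "status_report_by_5pm": False},
    ]

    def notice_or_status_given(order) -> bool:
        """Completed loops require a completion notice; uncompleted loops scheduled for
        the day require a status report by 5:00 p.m. in the ILEC serving territory."""
        if order["completed"]:
            return bool(order["completion_notice_sent"])
        return bool(order["status_report_by_5pm"])

    numerator = sum(1 for o in orders if notice_or_status_given(o))
    denominator = len(orders)

    rate = 100.0 * numerator / denominator if denominator else 100.0
    print(f"Indicator 2.8A: {rate:.1f}% (standard: 90% or more)")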

  Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 Hours
  Definition: The number of CLEC degraded trouble reports cleared by the ILEC within 48 hours of notification, expressed as a percentage of all such reports received.
  Measurement Method: Degraded trouble reports are compiled to determine the total number received and the percentage cleared within 48 hours of notification.
  Geographical Basis: Company-wide, no geographic distinction.
  Standard: 90% or more.
  Reporting Format: Indicator 2.9 - Competitor Degraded Trouble Reports Cleared Within 48 hours.
  Numerator: Total number of degraded trouble reports reported by the CLEC and cleared within 48 hours of their notification.
  Denominator: Total number of degraded trouble reports received from the CLEC during the month.
  Business Rules:
 

· CLEC by CLEC.
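  Finally, since each indicator finalized above carries an explicit standard, a reporting system might summarize a month's results against those standards along the lines of the sketch below; the reported values are invented for illustration, while the thresholds are those set out in this decision.

    # Standards set out in this decision, expressed as (threshold in percent, comparison).
    STANDARDS = {
        "1.13": (90.0, ">="), "1.14": (0.25, "<="), "1.17": (5.0, "<="),
        "1.18": (90.0, ">="), "2.7": (80.0, ">="), "2.7A": (100.0, ">="),
        "2.8": (90.0, ">="), "2.8A": (90.0, ">="), "2.9": (90.0, ">="),
    }

    # Hypothetical monthly results for one CLEC, in percent (illustrative values only).
    results = {"1.13": 92.0, "1.14": 0.20, "1.17": 4.1, "1.18": 88.5,
               "2.7": 83.0, "2.7A": 100.0, "2.8": 95.0, "2.8A": 91.0, "2.9": 94.0}

    for indicator, value in results.items():
        threshold, comparison = STANDARDS[indicator]
        met = value >= threshold if comparison == ">=" else value <= threshold
        print(f"Indicator {indicator}: {value}% -> {'standard met' if met else 'standard missed'}")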

Date Modified: 2003-10-30
