TL 9000 Quality Management System Measurements Handbook: Release 3.0


Quality Excellence for Suppliers of

Telecommunications Forum
(QuEST Forum)

TL 9000
Quality Management System

Measurements Handbook

Release 3.0
Copyright

Copyright © 2001 Quality Excellence for Suppliers of Telecommunications Forum

For further information, see the QuEST Forum Web page at:
http://www.questforum.org/

TL 9000 is a registered trademark of the
Quality Excellence for Suppliers of Telecommunications Forum.

Sections of this document contain copyrighted material from a variety of sources;


these sources are identified in the Bibliography of this handbook.

To the memory of
Terry Blok, Unisys
Hank Malec, 3Com

Approved and Adopted by the QuEST Forum
Effective March 31, 2001


Foreword
The TL 9000 Quality Management System Measurements Handbook was
prepared in a cooperative effort by the members of the Quality Excellence for
Suppliers of Telecommunications (QuEST) Forum. From the outset the QuEST
Forum’s goal has been to develop and maintain a consistent set of quality
system requirements and measurements that, when implemented, will help
provide telecommunications customers with faster, better and more cost-effective
services.

This book complements the TL 9000 Quality Management System Requirements
Handbook with measurements that reflect the performance of the industry and its
products. QuEST Forum members, including service providers and suppliers,
utilize measurements that are collected under the provisions of this handbook to
improve their processes. By improving processes, the industry becomes more
efficient and telecommunications customers, worldwide, derive the benefit of
improved services.

The QuEST Forum is pleased to present this book in a common spirit of
delighting our customers.

George C. Via, Verizon – QuEST Forum Chairman
Olga Striltschuk, Motorola – QuEST Forum Vice Chair

Steven Welch, SBC
Barry D’Amour, Nortel Networks
Don Pickens, Bell South
Jerry Cates, Corning
William Wessman, Boston Communications Group
Karl-Heinz Augenstein, Alcatel
Marty Lustig, SPRINT
Masahide Sekiguchi, Fujitsu
Isabelle Courville, Bell Canada
Monica Garcia, Complas

Signature on File


Preface
The Quality Excellence for Suppliers of Telecommunications Forum (QuEST
Forum) was founded to foster continued improvements to the quality and
reliability of telecommunications service. The founders took the critical initial step
of establishing a common set of quality management system requirements and
measurements by creating the TL 9000 Quality Management System
Requirements Handbook and the TL 9000 Quality Management System
Measurements Handbook. These handbooks are the result of a cooperative
effort among members of the telecommunications industry.
The work of the QuEST Forum yields benefits to customers, their subscribers,
and their suppliers. Membership is composed of telecommunication Service
Providers, Suppliers, and Liaisons. Members fund and participate in the QuEST
Forum, have defined voting rights, and are expected to contribute to the work of
the QuEST Forum. Members vote on adoption of the TL 9000 structure, content,
administration, and other questions coming before the QuEST Forum.
The QuEST Forum establishes and maintains a common set of quality
management system requirements and measurements built on currently used
industry standards including ISO 9001:2000. The requirements and
measurements promote consistency and efficiency, reduce redundancy and
improve customer satisfaction. They also enable suppliers to improve quality
and reliability, reduce costs, and increase competitiveness.


Acknowledgements
The strength of the QuEST Forum is the outstanding capabilities and
commitment of the members who represent their respective organizations at the
QuEST Forum and Work Group Meetings. This exceptional talent produced the
first TL 9000 Quality Management System Measurements Handbook in record
time and now has completed this major update in less than one year. Individuals
whose companies were customers, suppliers, and competitors accomplished the
update of this handbook through extraordinary teamwork.
This outstanding accomplishment was facilitated in partnership with the American
Society for Quality (ASQ) and The University of Texas at Dallas (UTD). Special
thanks for their constant support and encouragement during the last three years
of our development to Dr. Bill Osborne, Dean and Dr. Douglas E. Harris,
Associate Dean, Erik Jonsson School of Engineering and Computer Science and
to Paul Borawski, ASQ Executive Director and Brian LeHouillier, ASQ Director,
Programs & Operations.
Personally, and for the entire QuEST Forum, I would like to thank the following
QuEST Forum individuals and companies of the Measurements and Oversight
work groups for their direct contributions to this update of the TL 9000 Quality
Management System Measurements Handbook.

Jack Pompeo
QuEST Forum Project Director

Measurements Work Group


Leaders: Chair Rick Werth SBC
Vice-Chair Matt Lindsay Tellabs Operations
Secretary John Walz Lucent Technologies
Time Keeper Jeffery Rose Sumitomo Electric Lightwave
SME Richard Morrow The University of Texas at Dallas

Contributors:
ADTRAN Alcatel Antec
Charles O’Donnell Jim Ko Bob Lichkay
Advanced Fibre Peter Loew Rob Lindner
Communications June Miller Astec Advanced Power
Mark Fischer Steven Quigley Systems
Mark Hodges Tab Rabalao Roger Daunais
Rhonda Sator Tom Yohe Andre Lapointe


AT&T Japan Quality Assurance Qwest


Kathy Parker Organization - JQA Jerry Keintz
Michael Server Katsuhiko Haruta Don Wilford
Bell Canada KARLEE SBC
Jean-Normand Drouin David Briggs Jim Lankford
BellSouth John Jennings Vuong Phi
Telecommunications Liebert Corporation Scott Suko
Ed Ballington Larry Ables Rick Werth
Mort Burnett Tom Baldrick Siemens ICN
Tex Prater Dale Carpenter Ken Koffman
Boston Communications Dave Giannamore Mark Young
Group Lucent Technologies Sumitomo Electric Lightwave
Tareq Rahman David Bisone Jeffrey Rose
British Telecommunications Ari Jain Symmetricom
Steve Dickens Art Morrical Jack Riskey
Celestica Corporation John Walz Tekelec
Paul Pettinger John Wronka Angela Hall
Charles Industries Manly-Williams Michael Willis
Angelo DiMonte Linda Cue Telamon
Victor Potent John Manly William McLinn
CommWorks Marconi Telcordia Technologies
James Oates Bret Barclay Debbie Hearn
Comverse Network Systems Don Denk Tellabs Operations
Bruce Rozett Beverly McClain Duke Dahmen
Elizabeth Tracy Jennifer Vidos Denise Echols
Corning Motorola Matt Lindsay
Kevin Calhoun James Fritsch Telmar Network Technology
Steve Cooper Tama McBride Jerry Constance
CTDI Jim Osman The University of Texas at
Jim McCormick Network Access Solutions Dallas
Ericsson Laurence Rudolph Richard Morrow
Victor Sandoval Newbridge Networks Unisys
Excel Partnership Bob Cicelski Richard Geesaman
Steve Obolewicz Nokia Verizon Communications
Dave Sanicola Doug Hall Galen Aycock
Flextronics Karen Rawson Alan Beaudry
Johnny Hancock Nortel Networks Brendan Pelan
Fujitsu Jim Dumouchelle Ron Resh
Tom Garrison Paceon WorldCom
Hummingbird Jim Shields Robert Paschke
Mary Harmer Xerox Corporation
Brian Fannon

Oversight Work Group


Leaders: Chair Joe Taylor Tellabs Operations
Vice-Chair Ron Basque Complas
Secretary Kim Hauswirth American Society for Quality

Contributors:
ADC Telecommunications ADTRAN Alcatel
Randy Pezon Randal Whorton Dave Aiken
Ron Luttrull


American Society for Quality Independent Association of SBC


Kim Hauswirth Accredited Registrars Jim McDonnell
British Standards Institution Chris Shillito Siemens ICN
(BSI) Marconi Ken Koffman
Ky White Donna Reinsch Stat-A-Matrix
Complas Motorola Rich Watts
Ron Basque Greg Feldman Tekelec
Corning Nortel Networks Ben Crane
Sandy Holston Jeff Harpe Tellabs Operations
Joel Reece Perry Johnson Registrars Joe Taylor
Excel Partnership Nicole Martin
Donna Locke

The QuEST Forum benefits from the continued and dedicated service of many
individuals working towards the goals of the QuEST Forum. Without these
individuals and their companies’ support, the QuEST Forum would not be
successful in ensuring that the quality of telecommunication services to the end-
user keeps pace with changing technological opportunities in the twenty-first
century.

A Board of Directors guides the QuEST Forum activities through a strategic plan,
which is implemented by the work groups. The Measurements and Oversight
work groups are credited for producing this document and they would like to
recognize the individuals and companies that participated in the other work
groups for providing invaluable service in support of the overall QuEST Forum
Mission.

Business Excellence Acceleration Model (BEAM) Work Group


Leaders: Chair Gene Hutchison SBC
Vice-Chair Mary Hattrick Marconi
Vice-Chair Don Brown Alcatel
Secretary/SME Tom Withey The University of Texas at Dallas

Contributors:
Agilent Technologies Comverse Network Systems Infonet
John Murray Zvi Ravia Dan Russ
Alcatel Corning Lucent Technologies
Don Brown Steve Cooper Sandford Liebesman
Ian Mackie Len Young Marconi
AT&T Excel Partnership Mary M. Hattrick
Robert Gray David Middleton Motorola
BellSouth Flextronics Greg Feldman
Telecommunications Johnny Hancock Nortel Networks
Irv Briks Fujitsu Daniel Proffit
British Telecommunications Ashok Dandekar Christopher West
Alex Cuthbertson Glenayre Technologies
Mark Webster Deborah Brigham


SBC Tellabs Operations The University of Texas at


Gene Hutchison Mari Silvenius Dallas
Stephen Stroup Telmar Network Technology Tom Withey
Telkom South Africa Gary McMullin
Mike Donald

Governance Work Group


Leaders: Co-Chair Jim McDonnell SBC
Co-Chair Len Young Corning

Contributors:
Alcatel ECI Telecom SBC
Ron Luttrull Misha Ptak Jim McDonnell
Corning Motorola Tellabs Operations
Len Young Greg Feldman Joe Taylor
Nortel Networks
Jeff Harpe

Marketing and Communications (Marcom) Work Group


Leaders: Chair Jack Pompeo TeleCentric
Vice-Chair Ashok Dandekar Fujitsu

Contributors:
ADC Telecommunications Complas Perry Johnson Registrars
Jerry Lefever Ronald Basque Nicole Martin
Randy Pezon Corning QUASAR
ADTRAN Joel Reece Doug Luciani
Randal Whorton Fujitsu Tellabs Operations
Alcatel Ashok Dandekar Joe Taylor
Dave Aitken Marconi TeleCentric
Ron Luttrull Donna Reinsch Jack Pompeo
American Society for Quality
Jeff Weitzer

Requirements Work Group


Leaders: Chair Brendan Pelan Verizon
Vice-Chair(2000) Debbie Hearn Telcordia
Vice-Chair(1999) Matt Lindsay Tellabs Operations, Inc.
Secretary Tama McBride Motorola
Time Keeper Jeffery Rose Sumitomo Electric Lightwave
SME(2000) Richard Morrow The University of Texas at Dallas
SME(1999) Bob Brigham Telcordia

Contributors:
3M Telecom Systems Advanced Fibre Alcatel
Thierno Diallo Communications Chandan Banerjee
ADTRAN Mark Hodges Ian Mackie
Charles O'Donnell Rhonda Sator Mark Moore
Mike Rippe
Tom Yohe


Antec Flextronics SBC


Rob Lindner Johnny Hancock Ed Franck
Bob Lichkay Fujitsu Jim Lankford
Bill Taylor Doug McCullough Judy Przekop
Astec Advanced Power Hekimian Laboratories Steve Stroup
Systems Robin Williams Rick Werth
Andre Lapointe Liebert Corporation Siemens ICN
AT&T Larry Ables Pat Muirragui
Kathy Parker Tom Baldrick Tom West
Michael Server Dale Carpenter Sorrento Networks
Bell Canada David Giannamore Richard Lycette
Jean-Normand Drouin Lucent Technologies Sprint
Jean-Pierre Quoibion Ruth A. Harman Tim Dinneen
BellSouth Sandford Liebesman Sumitomo Electric Lightwave
Telecommunications John Wronka Gary Bishop
Tex Prater John Walz Jeffrey Rose
Joel Sullivan Marconi Symmetricom
British Telecommunications Bret Barclay Donna Schilling
Steve Dickens Don Denk Jack Riskey
Charles Industries Mary Hattrick Telcordia Technologies
Angelo DiMonte Beverly McClain Bob Brigham
Victor Potent Harold Morrison Debbie Hearn
Chatsworth Products Jennifer Vidos John Russell
Edward Gaicki Motorola Leslie Wolfe
CommWorks Mary Demmert TeleCentric
Laura Coplon Tama McBride Jack Pompeo
Jim Oates Jim Osman Tellabs Operations
David Stahl NEC America Matt Lindsay
Comverse Network Systems David Barski Telmar Network
Bruce Rozett Nokia Technology
Elizabeth Tracy Karen Rawson Jerry Constance
Corning Doug Hall Gary McMullin
Steve Cooper Nortel Networks The University of Texas at
Len Young Robert Oakley Dallas
Entela Richard Pierrie Richard Morrow
Ralph Stowe Paceon Tyco Electronics
Ericsson Jim Shields Greg Blount
Victor Sandoval QUALCOMM Verizon Communications
Excel Partnership Grace Weaver Alan Beaudry
Steve Obolewicz Qwest Brendan Pelan
Dave Sanicola Don Wilford Worldcom
Jerry Keintz Robert Paschke
John Rosenow

Supply Chain Work Group


Leaders: Chair David Briggs KARLEE
Vice-Chair Greg Lilly Nortel Networks
Contributors:
Acterna Agilent Technologies Artesyn Technologies
Andrzej Kozak Brent Wahl Scott Ireland
ADC Alcatel Michael Sullivan
Jerry Lefever Phil Dudley AT&T
Soundar Rajan Mike Server


Atlanta Cable Sales Graybar Electric Nokia


Bryan Glutting Jack Evans Douglas Hall
BellSouth IECQ/ECCB Nortel Networks
Telecommunications Charles Packard Greg Lilly
Joel Sullivan JDS Uniphase NQA
Bookham Technology David Hall Chris Mooney
Nick Whiteley KARLEE Pulsecom
Celestica David Briggs Robert Hungate
Paul Pettinger Liebert QUALCOMM
Complas Larry Ables Grace Weaver
Fred Denny Hang Tan SCI Systems
Corning Lucent Technologies Ken Crane
Joel Reece Andrea Long Stat-A-Matrix
Leonard Young Mike Musky Hal English
ECI Telecom Manly-Williams Superior Telecommunications
Misha Ptak John Manly Jules Fijux
Ericsson Marconi Eric Perry
Ron Hershberger John Wheeler Telcordia Technologies
Excel Partnership Masterwork Electronics John Russell
Donna Locke Scott Woods TeleCentric
Flextronics Motorola Jim Carmichael
Ellen Evans Jim Osman Tyco Electronics
Johnny Hancock Network & Cable Products Greg Masciana
Fujitsu Jay Chenault Verizon
Joe Bartnicki Corky Roberts Galen Aycock
Matthew Weir

Training Work Group


Leaders: Chair Rosemarie Moskow SBC
Vice-Chair Jeff Harpe Nortel Networks
Contributors:
CTDI Nortel Networks Stat-A-Matrix
Jim McCormick Jeff Harpe Paul Berman
Excel Partnership Pulsecom Jim Gerard
Joe DeCarlo Misha Ptak telcobuy.com
Steve Obolewicz SBC Shannon Kohlenberger
Donna Locke Jim McCormick Monica Eskridge
Rosemarie Moskow


Table of Contents
SECTION 1 INTRODUCTION 1-1

SECTION 2 STRUCTURE 2-1

SECTION 3 MEASUREMENTS PROCESSING, USAGE AND RESPONSIBILITIES 3-1

SECTION 4 GENERAL MEASUREMENTS REQUIREMENTS 4-1

SECTION 5 COMMON MEASUREMENTS 5-1


5.1 NUMBER OF PROBLEM REPORTS (NPR) 5-1
5.2 PROBLEM REPORT FIX RESPONSE TIME (FRT) 5-9
5.3 OVERDUE PROBLEM REPORT FIX RESPONSIVENESS (OFR) 5-17
5.4 ON-TIME DELIVERY (OTD) 5-24
SECTION 6 HARDWARE AND SOFTWARE MEASUREMENTS 6-1
6.1 SYSTEM OUTAGE MEASUREMENT (SO) 6-1
SECTION 7 HARDWARE MEASUREMENTS 7-1
7.1 RETURN RATES (RR) 7-1
SECTION 8 SOFTWARE MEASUREMENTS 8-1
8.1 SOFTWARE INSTALLATION AND MAINTENANCE 8-1
SECTION 9 SERVICES MEASUREMENTS 9-1
9.1 SERVICE QUALITY (SQ) 9-1
APPENDIX A PRODUCT CATEGORY TABLES A-1

APPENDIX B TL 9000 CUSTOMER SATISFACTION MEASUREMENTS GUIDELINES B-1

GLOSSARY ABBREVIATIONS, ACRONYMS AND DEFINITIONS 1

BIBLIOGRAPHY 1


List of Figures
FIGURE 2.1-1 THE TL 9000 MODEL 2-1

FIGURE 2.3-1 TL 9000 MEASUREMENT DATA FLOW AND USAGE 2-2

FIGURE 7.1-1 SHIPPING DATE GROUPS FOR COMPUTING RETURN RATES 7-7


List of Tables
Table 5.1-1 Number of Problem Reports (NPR)
Measurement Identifiers and Formulas 5-4
Table 5.1-2 Number of Problem Reports (IPR)
RQMS Alternative Measurements 5-4
Table 5.1-3 TL 9000 NPR Data Table 5-4
Table 5.1-4 RQMS Alternative NPR Data Table (IPR) 5-5
Table 5.1-5 Example 1 – NPR H/S Data Report 5-6
Table 5.1-6 Example 1 – NPR Source Data and
Measurement Calculation 5-7
Table 5.1-7 Example 2 – NPR Data Report (Services) 5-7
Table 5.1-8 Example 2 – NPR Source Data and Measurements (Services) 5-8
Table 5.2-1 Problem Report Fix Response Time (FRT)
Measurement Identifiers and Formulas 5-11
Table 5.2-2 Problem Report Fix Response Time (ORT)
RQMS Alternative Measurements 5-12
Table 5.2-3 TL 9000 FRT Data Table 5-12
Table 5.2-4 RQMS Alternative FRT Data Table (ORT) 5-13
Table 5.2-5 Example 1 – FRT Data Report 5-15
Table 5.2-6 Example 1 – FRT Source Data and
Measurement Calculation 5-15
Table 5.2-7 Example 2 – FRT Data Report (Services) 5-16
Table 5.2-8 Example 2 – FRT Source Data and
Measurement Calculation (Services) 5-16
Table 5.2-9 Example 3 – Effect of Customer Delay 5-16
Table 5.3-1 Overdue Problem Report Fix Responsiveness (OFR)
Measurement Identifiers and Formulas 5-19
Table 5.3-2 Overdue Problem Report Fix Responsiveness (OPR)
RQMS Alternative Measurements 5-19
Table 5.3-3 TL 9000 OFR Data Table 5-20
Table 5.3-4 RQMS Alternative OFR Data Table (OPR) 5-20
Table 5.3-5 Example 1 – OFR Data Report 5-22
Table 5.3-6 Example 1 – OFR Source Data and
Measurement Calculation 5-22
Table 5.3-7 Example 2 – OFR Data Report (Services) 5-23
Table 5.3-8 Example 2 – OFR Source Data and
Measurement Calculation (Services) 5-23
Table 5.4-1 On-Time Delivery (OTD)
Measurement Identifiers and Formulas 5-26
Table 5.4-2 TL 9000 OTD Data Table 5-27
Table 5.4-3 Example 1 – On-Time Installed System (OTIS) 5-28
Table 5.4-4 Example 2 – On-Time Service Delivery (OTS) 5-29
Table 5.4-5 Example 3 – On-Time Item Delivery (OTI) 5-30
Table 5.4-6 Example 1, 2, 3 – On-Time Delivery Data Report (OTD) 5-31


Table 6.1-1 System Outage Measurement (SO)


Measurement Identifiers and Formulas 6-4
Table 6.1-2 System Outage Measurements (SOE)
RQMS Alternative Measurements
End Office and/or Tandem Office,
Wireless Products, and NGDLC Products 6-5
Table 6.1-3 System Outage Measurements (SOG)
RQMS Alternative Measurements
General Series 6-5
Table 6.1-4 TL 9000 SO Data Table 6-8
Table 6.1-5 RQMS Alternative SO Data Table (SOE) 6-9
Table 6.1-6 RQMS Alternative SO Data Table (SOG) 6-9
Table 6.1-7 Example 2 –SO Data Report for March 2001 6-12
Table 6.1-8 Example 3 –SO Measurement
Calculation for a Transport System 6-12
Table 6.1-9 Example 3 – Normalized SO Measurement
Calculation for a Transport System 6-13
Table 6.1-10 Example 3 – Transport SO Data Report for March 2001 6-13
Table 7.1-1 Return Rates (IRR, YRR, LTR, and NYR)
Measurement Identifiers and Formulas 7-5
Table 7.1-2 TL 9000 RR Data Table 7-5
Table 7.1-3 Example Returns 7-7
Table 7.1-4 Example 2 – Return Rate Data Table 7-12
Table 8.1.5-1 Release Application Aborts (RAA)
Measurement Identifiers and Formulas 8-5
Table 8.1.5-2 Release Application Aborts (RAQ)
RQMS Alternative Measurements 8-6
Table 8.1.5-3 TL 9000 RAA Data Table 8-6
Table 8.1.5-4 RQMS Alternative RAA Data Table (RAQ) 8-7
Table 8.1.5-5 Example 1 – RAA Source Data and
Measurement Calculation 8-8
Table 8.1.5-6 Example 1 – RAA TL 9000 Data Report 8-9
Table 8.1.6-1 Patch Quality (CPQ and FPQ)
Measurement Identifiers and Formulas 8-12
Table 8.1.6-2 Patch Quality (DCP and DFP)
RQMS Alternative Measurements 8-13
Table 8.1.6-3 TL 9000 CPQ or FPQ Data Table 8-13
Table 8.1.6-4 RQMS Alternative CPQ or FPQ Data Table
(DCP or DFP) 8-14
Table 8.1.6-5 Example 1 – CPQ Source Data and
Measurement Calculation 8-15
Table 8.1.6-6 Example 1 – CPQ Data Report 8-16
Table 8.1.7-1 Software Update Quality (SWU)
Measurement Identifiers and Formulas 8-19
Table 8.1.7-2 Software Update Quality (DSU)
RQMS Alternative Measurements 8-19
Table 8.1.7-3 TL 9000 SWU Data Table 8-20
Table 8.1.7-4 RQMS Alternative SWU Data Table (DSU) 8-20


Table 8.1.7-5 Example 1 – SWU Source Data and


Measurement Calculation 8-21
Table 8.1.7-6 Example 1 – SWU Data Table Report for June 2000 8-22
Table 9.1-1 Definitions of Defects, Service Volume and
Units of Measure by Service Product Categories for
Service Quality Measurements 9-2
Table 9.1-2 Service Quality (SQ)
Measurement Identifiers and Formulas 9-3
Table 9.1-3 TL 9000 SQ Data Table 9-4
Table 9.1-4 SQ Data Sources 9-4
Table 9.1-5 Example 1 – Source Data for Installation SQ 9-5
Table 9.1-6 Example 1 – Data Report for Installation SQ 9-5
Table 9.1-7 Example 2 – Source Data for Repair SQ 9-6
Table 9.1-8 Example 3 – Source Data for Maintenance SQ 9-6
Table 9.1-9 Example 4 – Source Data for Customer Support Service SQ 9-6
Table 9.1-10 Example 5 – Source Data for Support Service SQ 9-7
Table A-1 Product Category Definitions A-3
Table A-2 Measurement Applicability Table (Normalized Units) A-23
Table A-3 Transmission Standard Designations and Conversions A-35
Table A-4 Optical and Electrical Equivalency A-36
Table A-5 Measurements Summary Listing A-37


Section 1 Introduction

The TL 9000 handbooks (the TL 9000 Quality Management System
Requirements Handbook and the TL 9000 Quality Management System
Measurements Handbook) are designed specifically for the telecommunications
industry to document industry quality management system requirements and
measurements.

The TL 9000 Quality Management System Requirements Handbook establishes
a common set of quality management system requirements for suppliers of
telecommunications products: hardware, software, and services. The
requirements build on existing industry standards, including ISO 9001. The
TL 9000 Quality Management System Measurements Handbook defines a
minimum set of performance measurements. The measurements are selected to
guide progress and evaluate results of quality management system
implementation.

1.1 Goals
The goals of TL 9000 are to:
• Foster quality management systems that effectively and efficiently protect the
integrity and use of telecommunications products: hardware, software, and
services,
• Establish and maintain a common set of quality management system
requirements,
• Reduce the number of telecommunications quality management system
standards,
• Define effective cost and performance-based measurements to guide
progress and evaluate results of quality management system
implementation,
• Drive continual improvement,
• Enhance customer-supplier relationships, and
• Leverage industry conformity assessment processes.

1.2 Purpose
The purpose of TL 9000 is to define the telecommunication quality management
system requirements for the design, development, production, delivery,
installation, and maintenance of products: hardware, software, and services.
TL 9000 also includes performance-based measurements that quantify the
reliability and quality performance of these products. Long-term goals include
both cost- and performance-based measurements.

1.3 Benefits of Implementation
Suppliers of telecommunication products, their customers, service providers, and
the end subscriber benefit from the implementation of TL 9000.
Expected benefits are:
• Continual improvement of service to subscribers,
• Enhanced relationships between the organization and its customers,
• Standardization of quality management system requirements,


• Efficient management of external audits and site visits,
• Uniform measurements,
• Overall cost reduction and increased competitiveness,
• Enhanced management and improvement of the organization’s performance,
and
• Industry benchmarks for TL 9000 measurements.

1.4 Relationship to ISO 9001 and Other Requirements
The QuEST Forum maintains compatibility with other sets of requirements and
standards. TL 9000 provides a telecommunications-specific set of requirements
built on an ISO 9001:2000 framework. See the Bibliography for the standards
and requirements that were considered during the development of TL 9000.

Characteristics of the TL 9000 relationship to other requirements are:


• TL 9000 includes ISO 9001:2000 and any future revisions will be
incorporated,
• Conformance to TL 9000 constitutes conformance to corresponding
ISO 9001 requirements, and
• It is the intent of the QuEST Forum that conformance to TL 9000 will
eliminate the need for conformance to other telecommunications quality
management standards.

1.5 Developing and Maintaining the Handbook(s)
The QuEST Forum is responsible for the development, publication, distribution
and maintenance of the TL 9000 handbooks. Change requests for the
handbooks, following initial publication, are to be submitted to the QuEST Forum
Administrator. Any user of the handbooks may submit change requests.
Change requests will be forwarded to the appropriate handbook section
chairperson by the QuEST Forum Administrator and will be considered for the
next revision. A change request/feedback form is available at the QuEST Forum
web site (http://www.questforum.org/).

Final approval of all changes to TL 9000 handbooks will be by vote of the QuEST
Forum voting members in accordance with the QuEST Forum’s bylaws.
Re-issue of the TL 9000 handbooks will be determined by the QuEST Forum, but not
to exceed five years following the last issue date. When the QuEST Forum
determines there are changes necessary in TL 9000 that could impact third party
registration, then addenda or similar communication mechanisms will be
employed to inform the industry of corrections and updates to the TL 9000
handbooks.


Section 2 Structure

2.1 Overall Structure
TL 9000 is structured in layers (see Figure 2.1-1):
• International Standard ISO 9001:2000
• Common TL 9000 Requirements
• Hardware, Software, and Services Specific Quality Management System
Requirements
• Common TL 9000 Measurements
• Hardware, Software, and Services Specific Quality Management System
Measurements

[Figure: layered model comprising International Standard ISO 9001:2000; Common TL 9000 Requirements; Hardware, Software, and Services Specific Requirements; Common TL 9000 Measurements; and Hardware, Software, and Services Specific Measurements]

Figure 2.1-1 The TL 9000 Model

The QuEST Forum retains complete control over the content except for material
that is copyrighted by others.

The word “shall” indicates mandatory requirements. The word “should”


indicates a preferred approach. Organizations choosing other approaches must
be able to show that their approach meets the intent of TL 9000. Where the
words “typical” and “examples” are used, an appropriate alternative for the
particular commodity or process should be chosen. Paragraphs marked “NOTE”
are for guidance and not subject to audit.

Endnotes denoted by [x] represent bibliography source material that is not


auditable (see “Bibliography").

2.2 Terminology
In this handbook the term supplier refers to the organization pursuing TL 9000
implementation, conformance, and/or registration.


2.3 Data Flow and Usage of Measurements
Figure 2.3-1 illustrates the data flow and usage of TL 9000 Quality Management
System Measurements as described in this handbook.

[Figure: customers and suppliers exchange measurement data through continuous improvement programs; suppliers report TL 9000 data to the Measurements Repository System (MRS), and industry statistics are published on the QuEST Forum web site]

Figure 2.3-1 TL 9000 Measurement Data Flow and Usage


The use of measurements should be designed to meet the principles of the
QuEST Forum, which are stated in subsection 3.3.
Usage Approach - Figure 2.3-1 depicts an environment where improvement
opportunities are identified by an organization and its customer through
information exchanges and from TL 9000 trend data.
a. Measurements may be used between an organization and its customer to set
mutual targets to improve products. This helps build customer and
organization relationships and establishes targets that best meet their needs.
b. Some of the TL 9000 measurements may be used as improvement
measures by individual organizations. These measurements receive careful
review to ascertain that the measures are indeed comparable.
Measurements are monitored by the Measurements Administrator to assure
that aggregation across organizations into summary statistics is valid and
meaningful. The summary statistics definitions will be revised as needed.
The definition of these measurements includes the designation “compared
data.”
c. Other measurements include the designation “research data”. Research
data shall not be used for comparison purposes. However, the
Measurements Administrator will analyze the data to reveal possible industry
trends. These analyses are reported only to the measurements work group
for study to determine future uses.
d. The product category performance is improved as each organization
compares its results against the summary statistics and improves its
performance.
e. The QuEST Forum measurements database is not intended for use as a
management tool to manage an organization supplying products, but as a
data repository. Output from the database shall consist of statistical
summary reports derived from the TL 9000 Measurements Repository
System (MRS) for each measurement by product category.


Section 3 Measurements Processing, Usage and Responsibilities

3.1 Requirements for Measurements Usage
In order to fully meet the requirements of this handbook and the companion
TL 9000 Quality Management System Requirements Handbook, the
measurements requirements defined here shall be used by the organization:
a. Internally as a part of their continual improvement programs and
management reports,
b. As appropriate, in customer-organization exchanges and continual
improvement programs, and
c. When reporting to the Measurements Administrator, where indicated.

3.2 Principles of Measurements Processing
TL 9000 registration requires the fulfillment of the TL 9000 Quality Management
System Requirements and the reporting of the TL 9000 Quality Management
System Measurements data specific to that TL 9000 registration to the
Measurements Administrator.

The following principles for processing the measurements are meant to foster an
environment where customers and organizations can work together to drive
continual improvement:
a. All applicable measurements for a product category as defined in the
Measurement Applicability Table (Normalized Units), Appendix A, Table A-2
shall be reported.
b. Valid reasons for the exclusion of specific measurements from the scope of
registration must be documented by the organization and available to the
registrar (certification/registration body) and customer on request.
c. Organizations shall provide TL 9000 measurement data to the
Measurements Administrator who will compile the data and calculate product
category statistics, such as “Industry Mean”, “Standard Deviation”, “Median”,
“Range”, “Number of Data Points”, and “Best in Industry” for each product
category, as appropriate. Results and reports produced by the
Measurements Administrator will not identify individual organizations.
d. Customers who are members of the QuEST Forum shall provide the
necessary TL 9000 field performance data to the suppliers in order to
calculate the specific measurements.
e. A customer may request organizations that directly supply products to
provide the TL 9000 measurements specific to that customer. This
information exchange occurs strictly between the organization and the
customer per mutual agreement. The QuEST Forum Administrator and
Measurements Administrator are not involved in any way.
f. There will be no ranking of organizations by the QuEST Forum Administrator.
g. The processing of measurements shall not compromise the proprietary
nature of the data.


3.3 Principles of Measurements Usage
The intended usage of TL 9000 measurements is to:
a. Provide industry performance information suitable for benchmarking,
b. Improve telecommunications processes and products,
c. Identify improvement opportunities, and
d. Standardize customer report cards or assessments.

3.4 Measurements Data Aggregation and Customer Base

3.4.1 Aggregation of Products
If an organization wishes to register multiple products in the same product
category and clearly identifies them as separate in the registration scope, the
organization may report the data for each product separately. Similarly, if an
organization registers a business unit or a location, the organization has the
option to determine which products will be registered and how the data will be
aggregated.

3.4.2 Customer Base


a. Customer base refers to the defined group of customers that the
organization’s measurement data encompasses. The customer base options
are:
(1) Forum Members: Only the organization’s customers who are members
of the QuEST Forum, or
(2) Total: All of the organization’s customers for the product(s) to which the
measurement applies.
b. The customer base shall be reported in each measurement data submission
for each measurement as indicated in the measurement profile.
c. The organization shall report measurement data from only one customer
base per individual measurement.

3.5 Responsibilities

3.5.1 QuEST Forum Administrator Responsibilities
The QuEST Forum Administrator shall:
a. Maintain complete security and confidentiality of an organization’s
information,
b. Develop, implement, publish and maintain formal operating procedures
defining the TL 9000 measurement process,
c. Receive the “Data Confirmation Report” from the Measurements
Administrator and forward the “Data Confirmation Report” to the organization,
d. Communicate to the organization missing or questionable data concerns as
reported by the Measurements Administrator,
e. Maintain a membership database that includes registration and
measurements submission history,


f. Publish and maintain industry reportable statistics by:


(1) Product category with data from a total customer base,
(2) Product category with data from only a QuEST Forum customer base,
and
(3) All available data for a product category (i.e., 1 and 2),
Note: Measurements are reported only when there is a minimum of five data
submissions from at least three companies for a given product category.
g. Provide and control access to the measurement data output through the
QuEST Forum web site,
h. Develop and implement a disaster recovery plan for QuEST Forum
Administrator related operations,
i. Support external audit or oversight of QuEST Forum Administrator activities,
and
j. Immediately notify registered organizations and affected companies when
updates to Measurement Applicability Table (Normalized Units), Appendix A,
Table A-2 are released on the QuEST Forum web site
(http://www.questforum.org/).

3.5.2 Measurements Administrator Responsibilities

The Measurements Administrator shall:


a. Maintain complete security and confidentiality of the data,
b. Develop, implement, publish, and maintain formal operating procedures
defining the TL 9000 measurement process tools and techniques,
c. Receive and acknowledge receipt of data from organizations, including
identifying missing or inaccurate data and reporting back to the QuEST
Forum Administrator,
d. Calculate the industry statistics, such as “Industry Mean”, “Standard
Deviation”, “Median”, “Range”, “Number of Data Points”, and “Best in
Industry”, as appropriate, by product category using the appropriate data
elements for each measurement that has compared data,
e. Compute industry statistics by:
(1) Product category with data from a total customer base,
(2) Product category with data from only a QuEST Forum customer base,
and,
(3) All available data for a product category (i.e., 1 and 2),
f. Post compared data output to the web site at least quarterly,
g. Develop and implement a disaster recovery plan for related operations,
h. Support external audit or oversight of activities,
i. Determine when sufficient data has been collected per measurement product
category to publish statistically valid results,
Note: Measurements are reported only when there is a minimum of five data
submissions from at least three companies for a given product category.
j. Be responsible for the accurate representation of provided data,
k. Create and maintain user manuals,


l. Propose aggregation of product categories to produce meaningful


measurements as a result of analysis of the inputs,
m. Analyze “research data” to reveal industry trends and report only to the
measurements work groups, and
n. Analyze “research data” to determine if there are conditions under which the
data could be compared and make recommendations only to the
measurements work groups to achieve comparability.
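
A minimal sketch of the summary-statistics computation described in items d, e, and i above, assuming submissions arrive as simple (company, value) pairs for one measurement and product category; the field layout and the "lower is better" reading of Best in Industry are illustrative assumptions, not TL 9000 rules.

```python
from statistics import mean, median, stdev

def industry_statistics(submissions, min_submissions=5, min_companies=3):
    """Summary statistics for one measurement within one product category.

    `submissions` is a list of (company_id, value) pairs (an assumed layout).
    Per the note above, statistics are published only when there are at least
    five data submissions from at least three companies.
    """
    values = [value for _, value in submissions]
    companies = {company for company, _ in submissions}
    if len(values) < min_submissions or len(companies) < min_companies:
        return None  # not enough data to publish statistically valid results
    return {
        "Industry Mean": mean(values),
        "Standard Deviation": stdev(values),
        "Median": median(values),
        "Range": max(values) - min(values),  # could equally be reported as (min, max)
        "Number of Data Points": len(values),
        "Best in Industry": min(values),  # assumes lower values are better
    }
```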

3.5.3 Organization Responsibilities

The organization shall:


a. Have documented processes in place to capture and validate applicable
measurement data such that source data records are available,
b. Collect, validate, and submit data per the defined measurement definitions to
the Measurements Administrator using the provided tool(s),
c. Submit data on measurements that are within its scope of registration,
d. Submit a minimum of three consecutive months of data to the Measurements
Administrator and receive a “Data Confirmation Report” acknowledging valid
submissions to obtain TL 9000 registration,
e. Continue to submit data every calendar quarter after becoming registered no
later than eight weeks after the end of each quarter,
f. Provide measurement data for new products within six months from General
Availability of the product, if it falls within the scope of registration,
g. Compare internal measurements to the industry statistics and take steps to
improve products and practices as appropriate,
h. Provide regular TL 9000 Quality Management System Measurements reports
to its responsible management,
i. Correct any data discrepancies, and
j. Re-submit corrected data for any erroneous data submitted within the
previous two years.

3.5.4 Customer Responsibilities

The customer shall:


a. Provide the necessary data to allow supplier organizations to generate the
TL 9000 measurements,
b. Have processes in place to capture and validate applicable measurement
data,
c. Use the TL 9000 measurements definitions for standardizing the supplier
organization performance review process (e.g., report cards),
d. Establish joint improvement teams and objectives based on TL 9000
measurements and other required performance objectives, and
e. Consider using TL 9000 measurements as an input when determining life
cycle costs.


3.5.5 QuEST Forum Responsibilities

The QuEST Forum shall:


a. Be responsible for the administration of the TL 9000 Quality Management
System Measurements Handbook,
b. Ensure that the TL 9000 Quality Management System Measurements
Handbook is publicly available. Publication, distribution and maintenance are
performed under the direction of the QuEST Forum, which retains its
copyright,
c. Be responsible for assuring the availability of appropriate training to help
users correctly and consistently interpret the TL 9000 requirements and
report the TL 9000 measurements,
d. Provide measurements process oversight,
e. Address all issues and concerns relating to the measurements process and
provide a summary and recommendations to the appropriate QuEST Forum
work group, and
f. Review proposed aggregations of product categories submitted by the
Measurements Administrator.

3.5.6 Registrar Responsibilities

During each audit the registrars shall verify that:


a. Processes are in place to ensure data validity and integrity in accordance
with the TL 9000 Quality Management System Measurements definitions and
requirements,
b. All supplier organization responsibilities are met, and
c. All measurement process non-conformances are corrected within the
registrar-specified timeframe.


Section 4 General Measurements Requirements

4.1 Measurements Listing

Title                                              Handbook Section

Common Measurements (C) 5
Number of Problem Reports (NPR) 5.1
Problem Report Fix Response Time (FRT) 5.2
Overdue Problem Report Fix
Responsiveness Measurements (OFR) 5.3
On-Time Delivery (OTD) 5.4

Hardware and Software Measurements (HS) 6


System Outage Measurement (SO) 6.1

Hardware Measurements (H) 7


Return Rates (RR) 7.1

Software Measurements (S) 8


Software Installation and Maintenance 8.1
Release Application Aborts (RAA) 8.1.5
Corrective Patch Quality (CPQ) and
Feature Patch Quality (FPQ) 8.1.6
Software Update Quality (SWU) 8.1.7

Services Measurements (V) 9


Service Quality (SQ) 9.1


4.2 Measurements Reporting Requirements

4.2.1 Conformance to Measurements Profile
The supplier shall generate and distribute the measurement data to the
Measurements Administrator (and to customers according to the principles of
measurements processing as detailed in Section 3) as described by the profiles
in this handbook for the applicable product categories. The measurement data
shall conform to the requirements in the corresponding profile. Changes to
reported data that are needed to comply with a new master version of
Appendix A or a new version of the measurements handbook shall be completed
within six months of their release. All data reported commencing with the second
data submission after a new release of this handbook shall be in compliance with
the new release of the handbook.

4.2.2 Applicable Product Categories

For each product the supplier shall identify product categories and applicable
measurements according to Measurement Applicability Table (Normalized Units),
Appendix A, Table A-2. Appendix A is current as of the release of this handbook.
The Measurement Applicability Table is subject to periodic updates. To
accommodate these changes, the master is available on the QuEST Forum web
site (http://www.questforum.org/). The master shall be used in conjunction with
registrations and for all data submittals to the QuEST Forum database.

4.3 Measurements Data and Reports

4.3.1 Customer Source Data
When the customer does not provide data required for a measurement, the
supplier shall not be required to report the measurement for that customer.

Organizations shall submit data for a measurement if any of their customers


provide the information. Organizations are exempt from submitting measurement
data to the Measurements Administrator if none of their customers provide the
required information.

4.3.2 Acceptable Alternative Measurements

When the measurement profile states that RQMS (GR-929-CORE, Reliability and
Quality Measurements for Telecommunications Systems (RQMS) [1]) alternative
reporting is acceptable, under the “Method of Delivery and Reporting” topic in the
profile, the following shall apply:


a. RQMS Data Acceptability

If a supplier is using the methods outlined in the latest issue of RQMS to
calculate a specific measurement, those methods and the resulting data will be
accepted in lieu of the TL 9000 definition if the following conditions are met:
(1) The data used for reporting to the QuEST Forum and its members
include all applicable data as prescribed by the TL 9000 definition of
the measurement and it is not limited to the RQMS client company
subset of data.
(2) For product categories not subject to RQMS reporting, the TL 9000
definition shall be used.

b. TL 9000 Data Preference

In all cases, the TL 9000 definition of the measurement is the preferred method.
Where none of a supplier’s customers require the supplier to generate the RQMS
reports, then the TL 9000 method shall be used. The supplier shall state which
method is used when reporting this measurement.

NOTE: The intent of RQMS alternative measurements is to minimize redundant


effort by suppliers when both RQMS and TL 9000 measurements are
contractually required. In that case, compliance audits would accept an
RQMS-based procedure coupled with meeting the conditions listed above as
valid for calculating this measurement.

4.3.3 Report Frequency and Method

Unless otherwise specified by the profile, the supplier shall collect data monthly
and report the required results quarterly to the QuEST Forum Measurements
Administrator. The supplier is free to use whatever time periods or formats are
appropriate for reporting internally and to its customers. The quarterly update
shall include the data points from the preceding three months. Except for pre-
registration data submittals, all submissions shall be by calendar quarter.

4.3.4 Use of Fiscal Periods and Calendar Days

The supplier shall report TL 9000 measurement data based on calendar months
or defined fiscal months. The supplier shall use the chosen method consistently.
The supplier shall notify customers and the Measurements Administrator prior to
changing methods. The supplier shall use calendar days for the measurements
that involve number of days.

4.3.5 Reporting of Compared Data and Research Data

The supplier shall report data for all applicable measurements defined in this
handbook to the Measurements Administrator according to the agreed rules.
This reporting requirement applies whether the supplier uses the TL 9000
method or the RQMS alternative reporting and whether the measurement


includes the designation “compared data” (CD) or “research data” (RD). See the
Measurements Summary Listing, Table A-5 in Appendix A.

NOTE: The designation “compared data” in the Method of Delivery and Reporting
section of the profile means that industry statistics may be available from the
QuEST Forum Administrator. However, the designation “research data”
indicates that no comparable industry statistics are available and the
Measurements Administrator will report analyses of industry trends only to the
QuEST Forum measurements work group.

4.3.6 Product Exclusions

The supplier may exclude data on products that are no longer supported for its
general customer base. This exclusion does not apply to individual field
replaceable units that have been made obsolete by a later version unless those
units are completely recalled from the field. Formal notification of placement of
the product on “Additions and Maintenance” (A&M) or “Manufacturing
Discontinued” (MD) status shall have been made to the customers for this
exclusion to apply.

4.3.7 Measurement Applicability

Unless otherwise stated, measurements shall only apply to products during


General Availability.

4.3.8 Calculation of Normalization Units

Where the normalization factor is traffic capacity based, such as DS1, OC-1, DSL
or Terminations, the calculation shall be based on the true useable traffic
capacity. Equipment within the system used to provide protection for the main
traffic path shall not be included, as it does not add useable capacity to the
system.
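
As an illustration of the rule above, a sketch that sums usable traffic capacity while excluding protection equipment; the record layout and the DS1-based units are assumptions for the example, and real conversions would follow Appendix A.

```python
def normalization_factor(equipment):
    """Total usable traffic capacity (e.g., in DS1s or Terminations) for a system.

    `equipment` is a list of dicts such as {"capacity": 28, "is_protection": False}
    (an assumed layout). Protection equipment adds no usable capacity and is
    therefore excluded from the normalization factor.
    """
    return sum(item["capacity"] for item in equipment if not item["is_protection"])

# Example: two working DS3s (28 DS1s each) plus one DS3 protection path.
system = [
    {"capacity": 28, "is_protection": False},
    {"capacity": 28, "is_protection": False},
    {"capacity": 28, "is_protection": True},  # protection only, excluded
]
assert normalization_factor(system) == 56
```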


Section 5 Common Measurements


Common measurements are measurements that apply to all products: hardware,
software, and services.

5.1 Number of Problem Reports (NPR)

5.1.1 General Description and Title

The Total Problem Reports (Complaints) Measurement is a measure of total


problem reports as specified in the Measurement Applicability Table (Normalized
Units), Appendix A, Table A-2. This measurement is adapted from RQMS [1] and
is applied to all products: hardware (H), software (S), and services (V).

5.1.2 Purpose

This measurement is used to evaluate the number of customer originated


problem reports (complaints) that are indicative of the quality of the product
delivered during the operating life cycle of that product. Problem reports may
have a negative impact to the supplier (such as rework), to the customer (such
as scheduling repeat site visits) and may reduce end user loyalty. This
measurement is intended to stimulate ongoing improvements in order to reduce
the number of problem reports and reduce associated costs and potential
revenue losses.

5.1.3 Applicable Product Categories

This measurement applies to product categories as shown in Measurement


Applicability Table (Normalized Units), Appendix A, Table A-2.

5.1.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for the NPR
Measurement:

• Annualization Factor (Afactor)


• General Availability
• No Trouble Found
• Official Fix
• Problem – Critical H/S
• Problem – Major H/S
• Problem – Minor H/S


• Problem Report
• Service Problem Report
• Severity Level

b. Counting Rules

The following rules shall apply in counting problem reports for the NPR
measurement.

(1) In the case of hardware or software, problem reports associated with


any and all in-service supported release versions shall be counted. A
software release or a system is “in service” when it handles end-user
traffic or transactions. This includes field trials prior to General
Availability where end customers are affected.
(2) In the case of services, any formal report of a problem (complaint) after
or during delivery of a service shall be counted. Service reports shall
include the originator’s name and a feedback mechanism for closure.
(3) Only customer-originated problem reports shall be counted.
(4) Any problem report after General Availability shall be counted unless
otherwise specified.
(5) Identical problem reports, i.e., multiple reports of the same occurrence
of the same problem at the same location at the same time, shall be
counted as one problem report.
(6) Duplicate problem reports, i.e., the same fault has occurred either at a
different customer location or at another time, shall each be counted as
separate problem reports.
(7) Multiple problems recorded on the same problem report (as in a
problem report form or screen) shall be counted separately, unless in
the customer's view these problems are all related to the same
manifestation of failure experienced by the customer.
(8) In order to obtain a comparable measure the supplier and customers
shall map the severity of hardware or software problem reports
according to the definitions contained in the glossary for critical, major
and minor H/S problem reports. Whenever a problem clearly belongs in
a given severity level per the glossary definition, then that severity level
shall be used. If it is not clear which severity level applies, the
customer’s assignment of severity level shall be used.
(9) Problem reports on hardware or software products shall be counted in
the severity classification in effect at the time the data is calculated for
reporting to the Measurements Administrator.
(10) Temporary fixes, such as temporary patches or workarounds, are
frequently used to resolve critical software or hardware problems. The
official fix is often developed under a subsequent or “follow up” major or
minor problem report that references the original critical problem report.
A critical problem report of this type shall not be reclassified and shall
be reported as a critical problem report. The subsequent major or
minor problem report shall not be counted in NPR but is included in
Problem Report Fix Response Time (FRT) and Overdue Problem
Report Fix Responsiveness (OFR) measurements.


(11) NPR problem reports are counted in the month they are received and
only in the month they are received.
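
A sketch of counting rules (5), (6), and (11) above, under an assumed report layout: identical reports (same problem, same location, same occurrence time) collapse to one, duplicates seen at other locations or times count separately, and a report is counted only in the month it is received.

```python
def count_npr(reports, month):
    """Count problem reports received in `month` (e.g., "2001-03") per the NPR rules.

    Each report is a dict with assumed fields:
    {"problem_id": ..., "location": ..., "occurred_at": ..., "received_month": "YYYY-MM"}.
    """
    received = [r for r in reports if r["received_month"] == month]
    # Identical reports of the same occurrence collapse to a single count;
    # the same fault at another location or time remains a separate report.
    distinct_occurrences = {
        (r["problem_id"], r["location"], r["occurred_at"]) for r in received
    }
    return len(distinct_occurrences)
```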

c. Counting Rule Exclusions

The following shall be excluded from the problem report count for the NPR
measurement:

(1) A problem report determined to represent an information request (IR)


or request for a feature by agreement between the supplier and
customer,
(2) A problem report related to use of the product in a manner not defined
in the specification of the product by agreement between supplier and
customer,
(3) Customer reports of routine events such as expected maintenance,
normal field replaceable unit returns, or software upgrades, or
(4) Routine reports of outages, such as Service Failure Analysis Reports
(SFAR).

d. Calculations and Formulas

(1) The measurements (see NPR1, NPR2, NPR3 and NPR4 in


Table 5.1-1) shall be calculated monthly as the total number of
incoming problem reports divided by the normalization factor listed in
the Product Category Table, Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2 multiplied by the
Annualization factor (Afactor).
(2) In the hardware and software product categories where the
normalization factor is identified as “None” in Measurement
Applicability Table (Normalized Units), Appendix A, Table A-2, the
supplier will still be required to track the number of problem reports and
their resolution (In this case enter normalization factor = “none” in Table
5.1-3).
(3) When reporting RQMS alternative measurements for hardware and/or
software, suppliers shall refer to IPR1, IPR2, and IPR3 in Table 5.1-2 to
determine reporting conventions.

Notation

NU = Normalization Unit (NU) from Measurement


Applicability Table (Normalized Units),
Appendix A, Table A-2
S = Normalization Factor; the total NU count
Afactor = The number of reporting periods in a year (see Glossary)
Np1 = Number of Critical H/S Problem Reports in the reporting period
Np2 = Number of Major H/S Problem Reports in the reporting period
Np3 = Number of Minor H/S Problem Reports in the reporting period
Np4 = Number of Service Problem Reports in the reporting period


Table 5.1-1 Number of Problem Reports (NPR)
Measurement Identifiers and Formulas

Identifier   Title                                          Formula
NPR1         H/S Critical Problem Reports per NU per year   Np1 x Afactor / S
NPR2         H/S Major Problem Reports per NU per year      Np2 x Afactor / S
NPR3         H/S Minor Problem Reports per NU per year      Np3 x Afactor / S
NPR4         Service Problem Reports per NU per year        Np4 x Afactor / S

Table 5.1-2 Number of Problem Reports (IPR)
RQMS Alternative Measurements

Identifier Title
IPR1 Incoming Critical Problem Reports per system per month
IPR2 Incoming Major Problem Reports per system per month
IPR3 Incoming Minor Problem Reports per system per month
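
A worked sketch of the formulas in Table 5.1-1 using the notation above, with the figures from Example 1 in 5.1.7 (Np1 = 0, Np2 = 3, Np3 = 45, S = 30, Afactor = 12 for monthly reporting).

```python
def npr(np_count, afactor, s):
    """Annualized problem reports per normalization unit: Np x Afactor / S."""
    return np_count * afactor / s

afactor, s = 12, 30  # monthly reporting, 30 systems in service
print(npr(0, afactor, s))   # NPR1 (critical):  0.0
print(npr(3, afactor, s))   # NPR2 (major):     1.2
print(npr(45, afactor, s))  # NPR3 (minor):    18.0
```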

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 NPR Data Table (Table 5.1-3) – The NPR measurement shall
be reported with data elements (or equivalent as defined by the QuEST
Forum Administrator) for each month and each product category as
follows:

Table 5.1-3 TL 9000 NPR Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units),
Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Normalization Factor: S or none
Annualization Factor: Afactor (see Glossary)
Measurement Identifier: NPR
NPR1 Numerator: Np1
NPR2 Numerator: Np2
NPR3 Numerator: Np3
NPR4 Numerator: Np4


(3) RQMS Alternative NPR Data Table (Table 5.1-4) – The RQMS
alternative measurements shall be reported with data elements (or
equivalent as defined by the Measurements Administrator) for each
month and each product category as follows:

Table 5.1-4 RQMS Alternative NPR Data Table (IPR)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units), Appendix A,
Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Normalization Factor: Number of systems in service
Measurement Identifier: IPR
IPR1 Numerator: IPR1n – Number of incoming critical problem
reports
IPR2 Numerator: IPR2n – Number of incoming major problem
reports
IPR3 Numerator: IPR3n – Number of incoming minor problem
reports

5.1.5 Sources of Data

Data for the NPR measurement is derived from information provided by


customers and from supplier analysis as follows:

a. Customers
• Report problems to the supplier
• Report normalizing information for hardware or software categories to
the supplier according to the Product Category Tables, Measurement
Applicability Table (Normalized Units), Appendix A, Table A-2.
b. Suppliers
• Count reported problems by product category and customer base and
convert to “number of problem reports” according to the applicable
counting rules
• For service products, track and report service normalization unit
• Calculate the normalization factor
• When customer supplied data is insufficient, suppliers may calculate the
normalizing information for hardware or software categories based on
internal shipment or billing records for products within the scope of the
applicable registration and according to the Product Category Tables,
Measurement Applicability Table (Normalized Units), Appendix A,
Table A-2.


5.1.6 Method of Delivery or Reporting

a. Compared Data (CD) or Research Data (RD):

Critical Problem Reports per NU CD


Major Problem Reports per NU CD
Minor Problem Reports per NU CD
Service Problem Reports per NU CD

b. RQMS Alternative Reporting:

Critical Problem Reports per NU YES


Major Problem Reports per NU YES
Minor Problem Reports per NU YES
Service Problem Reports per NU NO

5.1.7 Example Calculations

a. Example 1 – NPR for H/S Products

(1) Consider one month’s data for a supplier of a particular operational


support system (OSS) sold to both members and non-members of the
QuEST Forum. There are 30 systems in service during the entire
month and NU is “systems in service.”
(2) The data reported is shown in Table 5.1-5.

Table 5.1-5 Example 1 – NPR H/S Data Report

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 4.2
Measurement Methodology: TL 9000
Customer Base: Total
Normalization Factor: 30
Annualization Factor: 12
Measurement Identifier: NPR
NPR1 Numerator: Np1 0
NPR2 Numerator: Np2 3
NPR3 Numerator: Np3 45
NPR4 Numerator: Np4 NA


(3) The calculation of the measurement would be:

Table 5.1-6 Example 1 – NPR Source Data and Measurement Calculations

Problem Reports  Severity  Afactor  Normalization Factor (S)  NPR         Measurement Result
Np1 = 0          Critical  12       30                        NPR1 = 0    Critical Problem Reports per system per year
Np2 = 3          Major     12       30                        NPR2 = 1.2  Major Problem Reports per system per year
Np3 = 45         Minor     12       30                        NPR3 = 18   Minor Problem Reports per system per year
Np4 = NA                                                      NPR4 = NA   Service Problem Reports are not applicable for this product
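
For readers who script these checks, the following minimal sketch (illustrative only, not part of the handbook; the function and variable names are arbitrary) applies the NPR formula Np x Afactor / S to the Example 1 data above.

# Illustrative sketch of the NPR formula (Np x Afactor / S); not normative.
def npr(np_count, afactor, normalization_factor):
    """Annualized problem reports per normalization unit (NU)."""
    if not normalization_factor:
        # Categories whose normalization factor is "None" still track problem
        # reports but do not compute a normalized rate (rule 5.1.4.d(2)).
        return None
    return np_count * afactor / normalization_factor

# Example 1 (Table 5.1-6): 30 systems in service, Afactor = 12
afactor, s = 12, 30
print(npr(0, afactor, s))   # NPR1 = 0.0  critical reports per system per year
print(npr(3, afactor, s))   # NPR2 = 1.2  major reports per system per year
print(npr(45, afactor, s))  # NPR3 = 18.0 minor reports per system per year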

b. Example 2 – NPR for Services Products

(1) Consider one month’s data for a supplier of a particular maintenance


service sold to both members and non-members of the QuEST Forum.
There are 20 units maintained during the entire month and NU is “units
served.”
(2) Data reported is shown in Table 5.1-7.

Table 5.1-7 Example 2 – NPR Data Report (Services)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 7.3
Measurement Methodology: TL 9000
Customer Base: Total
Normalization Factor: 20
Annualization Factor: 12
Measurement Identifier: NPR
NPR1 Numerator: Np1 NA
NPR2 Numerator: Np2 NA
NPR3 Numerator: Np3 NA
NPR4 Numerator: Np4 30


(3) The calculation of the measurement is shown in Table 5.1-8.

Table 5.1-8 Example 2 – NPR Source Data and Measurements (Services)

Problem Reports  Severity                     Afactor  Normalization Factor (S)  NPR        Measurement Result
Np1 = NA         Critical                                                        NPR1 = NA  H/S Critical Problem Reports are not applicable for services
Np2 = NA         Major                                                           NPR2 = NA  H/S Major Problem Reports are not applicable for services
Np3 = NA         Minor                                                           NPR3 = NA  H/S Minor Problem Reports are not applicable for services
Np4 = 30         Not applicable for Services  12       20                        NPR4 = 18  Service Problem Reports per unit maintained per year


5.2 Problem Report Fix Response Time (FRT)


5.2.1 General Description and Title

Problem Report Fix Response Time (FRT) is the supplier’s overall


responsiveness to reported problems. The Problem Report Fix Response Time
applies to the delivery of the official fix in response to hardware/software (H/S)
problem reports and to service (V) problem reports. This measurement is
adapted from RQMS. [1]

5.2.2 Purpose

This measurement is used to quantify the responsiveness to problem reports and


facilitate prompt fixes and closures of problem reports.

5.2.3 Applicable Product Categories

These measurements apply to product categories as shown in Measurement


Applicability Table (Normalized Units), Appendix A, Table A-2.

5.2.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for the FRT
Measurements:

• Closure Criteria
• Closure Date
• Closure Interval
• Fix
• Fix Response Time
• Official Fix
• Overdue Service Problem Report
• Problem - Critical H/S
• Problem - Major H/S
• Problem - Minor H/S
• Problem Report
• Severity Level
• Temporary Fix

b. Counting Rules

(1) Only Problem Reports that are originated by a customer and meet the
criteria for Number of Problem Reports shall be included. All counting
rules and exclusions noted in section 5.1.4.b and 5.1.4.c also apply to
FRT.
(2) The start of the interval for calculating FRT shall be considered as the
receipt of that problem report by the supplier. If the severity of a
problem report is modified, the FRT shall start at the receipt of the
problem report.
(3) The end of the interval for calculating FRT shall be considered as the
date that the official fix or closure criteria is made available. Should the
problem report originator later reject the fix as incomplete or causing
side effects, the problem report shall be re-classified as open.
(4) For FRT, problem reports are counted ONLY in the month in which they
are due, not in the month in which they are fixed, if the two months differ.
(5) The total FRT shall be reported in the severity classification at the time
the fix is due to be closed.
(6) The customer has the final determination that a problem report is
resolved. For every resolution, the customer must acknowledge that
the solution provided by the supplier meets the customer’s
requirements. This is particularly relevant to the resolution of duplicate
problem reports, where the criteria may vary by individual customer.
(7) Since this measurement is intended to quantify the supplier’s fix
response time, any extraordinary delays in the closure of a problem
report caused by the customer may be deleted from the overall closure
time. The supplier shall keep records of such delays with specific start
and stop dates. Examples of this type of event are:
- Excess delay in testing of a proposed solution due to customer
staffing constraints,
- After opening a problem report and being asked by the supplier for
needed data, the customer delays supplying sufficient information
for the supplier to commence problem resolution, and
- Not being able to get access to a customer facility to resolve a
service problem report.
(8) If the deployment of the fix is delayed (or does not occur) specifically at
customer request (and not because of supplier problems), the interval
is defined as ending when the official fix is first made available for
delivery. The delay interval shall not be included in the FRT
calculation.
(9) If, with customer consent, the implementation of a fix is deferred (such
as waiting for the next software update versus a patch) then the
deferral interval shall not be included.
(10) The delivery of temporary fixes or workarounds in response to critical
problem reports shall not be counted in this measurement.
Subsequent or “follow up” major or minor problem reports opened to
track the development and delivery of the official fix shall be included.
When the official fix activity is tracked against the original critical
problem report, then those reports shall be treated as major reports for
FRT and OFR reporting.
(11) On customer approval, the time between the application of a temporary
fix and the commitment date for a permanent fix may be discounted in
the fix response time calculation. The customer must agree that the
temporary fix meets their needs. Failure to provide an acceptable
resolution with a permanent fix by the negotiated commitment date will
result in the restoration of all the discounted time.
c. Counting Rule Exclusions

All counting rule exclusions in 5.1.4.c also apply to FRT.

d. Calculations and Formulas


(1) Each of the FRT measurements (see FRT2, FRT3 and FRT4 in
Table 5.2-1) shall be calculated monthly as the percentage of the total
number of problems that were due to be closed during the month and
that were delivered on time by the due threshold time. The due
threshold time is:
− 30 calendar days for major H/S problem reports and
− 180 calendar days for minor H/S problem reports
− A closure date agreement made between the customer and the
supplier for all service problem reports. Expected closure intervals
for services may be predetermined by a contractual agreement.
(2) When reporting RQMS alternative measurements for FRT
measurements, suppliers shall refer to ORT2 and ORT3 in Table 5.2-2
to determine reporting conventions.
(3) FRT will be considered to be 100% when there are no problem reports
due during the reporting period.

Notation

Fr2 = Major H/S Fixes delivered on time


Fr3 = Minor H/S Fixes delivered on time
Fr4 = Service problem reports resolved on time
Fr2d = Number of major H/S fixes due to be closed
Fr3d = Number of minor H/S fixes due to be closed
Fr4d = Number of service problem reports due to be closed

Table 5.2-1 Problem Report Fix Response Time (FRT)
Measurement Identifiers and Formulas

Identifier  Title                                        Formula             Note
FRT2        H/S Major Problem Reports Fix Response Time  (Fr2 / Fr2d) x 100  % delivered on time
FRT3        H/S Minor Problem Reports Fix Response Time  (Fr3 / Fr3d) x 100  % delivered on time
FRT4        Service Problem Reports Fix Response Time    (Fr4 / Fr4d) x 100  % resolved on time


Table 5.2-2 Problem Report Fix Response Time (ORT)


RQMS Alternative Measurements

Identifier Title
ORT2 % Major Problems Closed On Time
ORT3 % Minor Problems Closed On Time

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 FRT Data Table – The FRT measurement shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category as shown in
Table 5.2-3.

Table 5.2-3 TL 9000 FRT Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units), Appendix A,
Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: FRT
FRT2 Numerator: Fr2
FRT3 Numerator: Fr3
FRT4 Numerator: Fr4
FRT2 Denominator: Fr2d
FRT3 Denominator: Fr3d
FRT4 Denominator: Fr4d


(3) RQMS Alternative FRT Data Table – The RQMS alternative


measurements shall be reported with data elements (or equivalent as
defined by the Measurements Administrator) for each month and each
product category as shown in Table 5.2-4.

Table 5.2-4 RQMS Alternative FRT Data Table (ORT)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Measurement Identifier: ORT
ORT2 Numerator: Ort2n – The total number of major fixes due to be
closed during the three-month window that were
delivered on time

ORT2 Denominator: Ort2d – The total number of major fixes that were
due to be delivered during the three-month window

ORT3 Numerator: Ort3n – The total number of minor fixes due to be


closed during the three-month window that were
delivered on time

ORT3 Denominator Ort3d – The total number of minor fixes that were
due to be delivered during the three-month window

5.2.5 Sources of Data

The data for the FRT measurement are derived from information provided by
customers and from supplier analysis as follows:

a. Customers

• Report problems to supplier


• Confer with supplier to establish severity classification for H/S
• Agree on service problem reports closure interval
• Agree with problem report closure decisions.


b. Suppliers

• Track problem reports, their severity (H/S), the agreed closure interval
(services), and actual closure dates
• Count due, overdue and on-time fixes and problem reports, and compute
the measurements according to the stated rules.

5.2.6 Method of Delivery or Reporting

a. Compared data (CD) or research data (RD):

Major H/S Problem Report Fix Response Time CD


Minor H/S Problem Report Fix Response Time CD
Services Problem Report Fix Response Time CD

b. RQMS Alternative Reporting:

Major H/S Problem Report Fix Response Time YES


Minor H/S Problem Report Fix Response Time YES
Services Problem Report Fix Response Time NO

5.2.7 Example Calculations

a. Example 1 – FRT for an H/S Product

(1) Consider one month’s data for a supplier of a particular OSS sold to
both members and non-members of the QuEST Forum. There are five
fixes to major problem reports due to be closed during the month and
all five were delivered on time. There are 25 fixes to minor H/S
problem reports due and 20 were delivered on time.
(2) The FRT data reported is shown in Table 5.2-5.


Table 5.2-5 Example 1 – FRT Data Report

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 4.2
Measurement Methodology: TL 9000
Customer Base: Total
Measurement Identifier: FRT
FRT2 Numerator: Fr2 5
FRT3 Numerator: Fr3 20
FRT4 Numerator: Fr4 NA
FRT2 Denominator: Fr2d 5
FRT3 Denominator: Fr3d 25
FRT4 Denominator: Fr4d NA

(3) The calculation of the FRT measurements would be:

Table 5.2-6 Example 1 – FRT Source Data and Measurement Calculation

Fixes On-Time  Severity  Fixes Due  FRT          Measurement Results
Fr2 = 5        Major     Fr2d = 5   FRT2 = 100%  Major H/S Problem Report Fixes Delivered On Time
Fr3 = 20       Minor     Fr3d = 25  FRT3 = 80%   Minor H/S Problem Report Fixes Delivered On Time
Fr4 = NA       Services  Fr4d = NA  FRT4 = NA    Services Problem Reports are not applicable for this product
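
As an informal illustration (not part of the handbook), the FRT percentages above follow directly from the counts of fixes due and fixes delivered on time; the sketch below also encodes the 100% convention of rule 5.2.4.d(3). Names are arbitrary.

# Illustrative sketch of FRT = (fixes delivered on time / fixes due) x 100.
def frt(on_time, due):
    """Percent of the fixes due this month that were delivered on time."""
    if due == 0:
        return 100.0  # rule 5.2.4.d(3): FRT is 100% when nothing was due
    return 100.0 * on_time / due

print(frt(5, 5))    # FRT2 = 100.0 (Example 1, major H/S)
print(frt(20, 25))  # FRT3 = 80.0  (Example 1, minor H/S)
print(frt(16, 20))  # FRT4 = 80.0  (Example 2 below, services)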

b. Example 2 – FRT for Services

(1) Consider one month’s data for a supplier of a particular installation


service sold to both members and non-members of the QuEST Forum.
There are 20 service problem reports due to be closed during the
month and 16 were resolved on time.


(2) FRT data reported is shown in Table 5.2-7.

Table 5.2-7 Example 2 – FRT Data Report (Services)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 7.1
Measurement Methodology: TL 9000
Customer Base: Total
Measurement Identifier: FRT
FRT2 Numerator: Fr2 NA
FRT3 Numerator: Fr3 NA
FRT4 Numerator: Fr4 16
FRT2 Denominator: Fr2d NA
FRT3 Denominator: Fr3d NA
FRT4 Denominator: Fr4d 20

(3) The calculation of the FRT measurements is shown in Table 5.2-8.

Table 5.2-8 Example 2 – FRT Source Data and


Measurement Calculation (Services)

On-Time Closures  Fixes Due  FRT Measurement Results
Fr4 = 16          Fr4d = 20  FRT4 = 80%  Service Problem Reports Resolved On Time

c. Example 3 – Effect of Customer Delay

Table 5.2-9 Example 3 – Effect of Customer Delay

Event                                              Event Date  Problem Due Date
Major Problem Report Received                      March 1     March 31
Need for site access identified                    March 10    March 31
Customer informs site not available until April 1  March 12    In suspense
Site Available                                     April 1     April 18

The effect of the site not being available is to move the due date of the problem
report from March 31 to April 18. The difference is the length of the delay. The
problem report would therefore be reported with the April data per counting rule
5.2.4 b. (4).
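
A hedged sketch of the bookkeeping behind Example 3 and counting rules 5.2.4.b(7) and (8): a recorded customer-caused delay extends the due date by the length of the delay, and the report is then counted in the month of the adjusted due date. The helper and the dates below are purely illustrative; the exact day-counting convention is left to the supplier's records.

# Illustrative only: shift a fix due date by recorded customer-caused delay
# intervals (counting rules 5.2.4.b(7) and (8)).
from datetime import date, timedelta

def adjusted_due_date(original_due, delays):
    """delays: list of (start, stop) date pairs kept in the supplier's records."""
    delay_days = sum((stop - start).days for start, stop in delays)
    return original_due + timedelta(days=delay_days)

# Hypothetical report due June 30 with a 14-day customer delay on record
new_due = adjusted_due_date(date(2001, 6, 30),
                            [(date(2001, 6, 10), date(2001, 6, 24))])
print(new_due)  # 2001-07-14: the report is counted with the July data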


5.3 Overdue Problem Report Fix Responsiveness (OFR)

5.3.1 General Description and Title

Overdue Problem Report Fix Responsiveness (OFR) is the rate of closure of


overdue major and minor H/S problem reports and all service problem reports.
This measurement is adapted from RQMS. [1]

5.3.2 Purpose

This measurement is used to quantify the responsiveness to overdue problem


reports and to facilitate prompt fixes and closures of overdue problem reports.

5.3.3 Applicable Product Categories

This measurement applies to product categories as shown in Measurement


Applicability Table (Normalized Units), Appendix A, Table A-2.

5.3.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for the OFR
Measurements:

• Closure Criteria
• Closure Date
• Closure Interval
• Fix
• Fix Response Time
• Official Fix
• Overdue Service Problem Report
• Problem - Critical H/S
• Problem - Major H/S
• Problem - Minor H/S
• Problem Report
• Severity Level
• Temporary Fix


b. Counting Rules

In addition to the rules contained in section 5.2, the following rules shall apply.

(1) Overdue problem reports are those that are open beyond the due
threshold time. The due threshold time is defined as:
− 30 calendar days for major H/S problem reports and
− 180 calendar days for minor H/S problem reports
− A closure date agreement made between the customer and the
supplier for all service problem reports. Expected closure intervals
for services may be predetermined by a contractual agreement.
(2) Open Problem Reports shall be counted as overdue in each month
during which they are open and overdue including the month they are
closed.

For example: If a problem report number 123 is open and overdue in


month m and did not close by the last day of month m, then it shall
count as overdue in month m and overdue in month m+1 even if it
closed on day one of month m+1.

(3) Penalty problem reports are counted in the OFR measurement and are
applicable only to hardware and software products. A penalty problem
report is defined as:
− For majors, all problem reports with age since opening which
exceed 180 calendar days,
− For minors, all problem reports with age since opening which
exceed 270 calendar days,
− Penalty problem reports shall also be counted as overdue problem
reports (that is, double counting constitutes the “penalty”).

c. Counting Rule Exclusions

The counting rule exclusions in Section 5.2 shall apply.

d. Calculations and Formulas

Each of the OFR measurements (see OFR in Table 5.3-1) shall be calculated as
follows:

− The sum of penalty problem reports for the month shall be added to
the sum of the overdue problem reports for the month.
− The number of overdue problem reports closed are those overdue
problem reports that were closed in the month.
− The measurement is computed as the number of overdue problem
reports closed divided by the sum of overdue problem reports and
the total number of penalty problem reports; the result shall be
expressed as a percentage.
− The measurement shall be reported as 100% in the case where
there are no overdue problem reports during the period.


Notation

Pro2 = Number of overdue major H/S problem reports


Pro3 = Number of overdue minor H/S problem reports
Pro4 = Number of overdue service problem reports
Prp2 = Number of major H/S penalty problem reports
Prp3 = Number of minor H/S penalty problem reports
Prc2 = Number of overdue major H/S problem reports closed
Prc3 = Number of overdue minor H/S problem reports closed
Prc4 = Number of overdue service problem reports closed

Table 5.3-1 Overdue Problem Report Fix Responsiveness (OFR)
Measurement Identifiers and Formulas

Identifier  Title                                                Formula                       Note
OFR2        H/S Major Overdue Problem Report Fix Responsiveness  (Prc2 / [Pro2 + Prp2]) x 100  % closed
OFR3        H/S Minor Overdue Problem Report Fix Responsiveness  (Prc3 / [Pro3 + Prp3]) x 100  % closed
OFR4        Service Overdue Problem Report Fix Responsiveness    (Prc4 / Pro4) x 100           % closed

Table 5.3-2 Overdue Problem Report Fix Responsiveness


(OPR) RQMS Alternative Measurements

Identifier Title
OPR2 % Rate of Closures of Overdue Problem Reports – Major
OPR3 % Rate of Closures of Overdue Problem Reports – Minor

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 OFR Data Table – The OFR measurements shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category as shown in
Table 5.3-3.


Table 5.3-3 TL 9000 OFR Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units), Appendix A,
Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: OFR
OFR2 Numerator: Prc2
OFR3 Numerator: Prc3
OFR4 Numerator: Prc4
OFR2 Denominator: Pro2
OFR3 Denominator: Pro3
OFR4 Denominator: Pro4
OFR2 Denominator 2nd Term: Prp2
OFR3 Denominator 2nd Term: Prp3

(3) RQMS Alternative OFR Data Table – The RQMS alternative


measurements shall be reported with data elements (or equivalent as
defined by the Measurements Administrator) for each month and each
product category as shown in Table 5.3-4.

Table 5.3-4 RQMS Alternative OFR Data Table (OPR)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Measurement Identifier: OPR
OPR2 Numerator: Opr2n – The sum of the overdue major problem
reports closed in the three-month period.
OPR2 Denominator: Opr2d – The sum of penalty major problem
reports for the three-month period added to the
sum of the overdue major problem reports for the
same period
OPR3 Numerator: Opr3n – The sum of the overdue minor problem
reports closed in the three-month period
OPR3 Denominator: Opr3d – The sum of penalty minor problem
reports for the three-month period added to the
sum of the overdue minor problem reports for the
same period


5.3.5 Sources of Data

The data for the OFR measurement are derived from information provided by
customers and from supplier analysis as follows:

a. Customers

• Report problems to supplier


• Confer with supplier to establish severity classification for H/S
• Agree on service problem reports closure interval
• Agree with problem report closure decisions.

b. Suppliers

• Track problem reports, their severity (H/S), the agreed closure interval
(services), and actual closure dates
• Count due, overdue and on-time fixes and problem reports, and compute
the measurements according to the stated rules.

5.3.6 Method of Delivery or Reporting

a. Compared data (CD) or research data (RD):

Major H/S Overdue Problem Report Fix Responsiveness RD


Minor H/S Overdue Problem Report Fix Responsiveness RD
Services Overdue Problem Report Fix Responsiveness RD

b. RQMS Alternative Reporting:

Major H/S Overdue Problem Report Fix Responsiveness YES


Minor H/S Overdue Problem Report Fix Responsiveness YES
Services Overdue Problem Report Fix Responsiveness NO

5.3.7 Examples

a. Example 1 – OFR for an H/S Product

(1) At the beginning of the month, there were six major H/S problem
reports that were overdue (age > 30 calendar days). One of these
became a penalty major H/S problem report during the month (age >
180 calendar days). Two of the six overdue reports were closed during
the month. There was no overdue minor H/S problem report at the
beginning of the month. However, by the end of the month five minor
H/S problem reports for which fixes had been due during the month
had become overdue. One of these overdue minor H/S problem
reports was closed before the end of the month.
(2) OPR data reported is shown in Table 5.3-5.


Table 5.3-5 Example 1 – OFR Data Report

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 4.2
Measurement Methodology: TL 9000
Customer Base: Total
Measurement Identifier: OFR
OFR2 Numerator: Prc2 2
OFR3 Numerator: Prc3 1
OFR4 Numerator: Prc4 NA
OFR2 Denominator: Pro2 6
OFR3 Denominator: Pro3 5
OFR4 Denominator: Pro4 NA
OFR2 Denominator – 2nd Factor: Prp2 1
OFR3 Denominator – 2nd Factor: Prp3 0

(3) The calculation of the OFR measurements for the month is shown in
Table 5.3-6.

Table 5.3-6 Example 1 – OFR Source Data and Measurement Calculation

Closed Overdue Problems  Severity  Fixes Overdue  Penalty Problem Reports  OFR Measurement Result
Prc2 = 2                 Major     Pro2 = 6       Prp2 = 1                 OFR2 = 2 / (6+1) x 100 = 28.6% (% Overdue Major Problem Reports Closed)
Prc3 = 1                 Minor     Pro3 = 5       Prp3 = 0                 OFR3 = 1 / (5+0) x 100 = 20% (% Overdue Minor Problem Reports Closed)
Prc4 = NA                Services  Pro4 = NA      not applicable           Services Problem Reports are not applicable for this product
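
The OFR arithmetic, including the penalty double count in the denominator, can be sketched as follows (illustrative only; it reproduces the Table 5.3-6 figures and, with no penalty term, the services case of Example 2 below).

# Illustrative sketch of OFR = closed overdue / (overdue + penalty) x 100.
def ofr(closed_overdue, overdue, penalty=0):
    """Percent of overdue (plus penalty) problem reports closed in the month."""
    denominator = overdue + penalty
    if denominator == 0:
        return 100.0  # reported as 100% when nothing is overdue in the period
    return 100.0 * closed_overdue / denominator

print(round(ofr(2, 6, 1), 1))  # OFR2 = 28.6 (major H/S, Table 5.3-6)
print(ofr(1, 5, 0))            # OFR3 = 20.0 (minor H/S, Table 5.3-6)
print(ofr(1, 2))               # OFR4 = 50.0 (services, Example 2 below)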

b. Example 2 – OFR for a Services Product

(1) At the beginning of the month, there were two Service problem reports
that were overdue (age greater than the agreed closure interval). One
of the two overdue reports was closed during the month.
(2) OFR data reported would be as shown in Table 5.3-7.


Table 5.3-7 Example 2 – OFR Data Report (Services)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 7.1
Measurement Methodology: TL 9000
Customer Base: Total
Measurement Identifier: OFR
OFR2 Numerator: Prc2 NA
OFR3 Numerator: Prc3 NA
OFR4 Numerator: Prc4 1
OFR2 Denominator: Pro2 NA
OFR3 Denominator: Pro3 NA
OFR4 Denominator: Pro4 2
OFR2 Denominator – 2nd Factor: Prp2 NA
OFR3 Denominator – 2nd Factor: Prp3 NA

(3) The calculation of the OFR measurements for the month is shown in
Table 5.3-8.

Table 5.3-8 Example 2 – OFR Source Data and Measurement Calculation (Services)

Closed Overdue Problems  Severity        Fixes Overdue  Penalty Problem Reports  OFR Measurement Result
Prc4 = 1                 not applicable  Pro4 = 2       not applicable           OFR4 = 1 / 2 x 100 = 50% (% Overdue Service Problem Reports Closed)


5.4 On-Time Delivery (OTD)

5.4.1 General Description and Title

On-Time Delivery (OTD) is a measure of timeliness of all product orders


delivered to customers.

5.4.2 Purpose

This measurement is used to evaluate the supplier’s on-time delivery


performance in order to meet the customer’s need for timely product delivery and
to meet end-customer expectations.

5.4.3 Applicable Product Categories

This measurement applies to product categories as shown in Measurement


Applicability Table (Normalized Units), Appendix A, Table A-2. It does not apply
to continuous services (e.g., Customer Support Service) where service is
measured by service problem reports.

5.4.4 Detailed Description

a. Terminology

A service order is an order for service having a Customer Requested Completion


Date (CRCD), but not an installed system order. An example of a service order
is when a supplier is contracted by a customer to install and/or engineer a
product that is manufactured by another supplier. Services may include
engineering and/or installation.

The Glossary includes definitions for the following terms used for the OTD
measurement:

• Installed System
• Installed System Order
• On-Time Installed System Delivery
• On-Time Item(s) Delivery

b. Counting Rules

(1) A system that includes any combination of hardware, software, and


service applications is counted as one order.
(2) Acceptance shall be defined according to purchase order and/or
contract terms and conditions unless notified otherwise by the
customer.


(3) Due dates and delivery dates are considered to be one 24-hour period
(customer’s calendar day).
(4) Early order completions or deliveries are considered to have missed
the delivery date unless authorized by the customer.
(5) Actual Completion Date (ACD) is the date when service is complete at
a job site and accepted by the customer.
(6) Customer Requested Date (CRD) is the desired delivery date of items,
systems or services as defined by the customer’s purchase order or
contract. CRD is the initial requested date or, in the case of customer
requested changes, the revised date.
(7) The monthly OTD data shall include all orders having CRD occurring
during the same month.
(8) Actual On-Job Date (AOJD) identifies the date when the shipment
actually was delivered at the ship-to address. This date is derived by
adding the transportation interval to the actual ship date.
(9) CRD is either CRCD or CROJD depending on order type. Customer
Requested Completion Date (CRCD) is the date requested by the
customer that orders are completed. Customer Requested On Job
Date (CROJD) is the date requested by the customer of shipment
delivery.
(10) Order types can be: installed system, items, or service.
(11) A service order is one having a CRCD, but not an installed system
order. Services may include installation and/or engineering.
(12) Compound orders designated by the customer for a single delivery
(“must ship complete” orders) shall be treated in aggregate. If one line
item is late, then all line items shall be counted as late.

c. Counting Rule Exclusions

(1) Late Orders Received (LOR) are those for which CRD is earlier than
Date Order Received, and are excluded from the measurement.

d. Calculations and Formulas

(1) On-Time Delivery (OTD) (see OTD in Table 5.4-1) is the percentage of
orders/items accepted on the Customer Requested Date (CRD) where
CRD is equal to either ACD or AOJD depending on order type.
(2) OTD is calculated as 100 multiplied by the number of orders/items
accepted on the CRD during the month divided by the number of
orders/items for which CRD occurred during the month.
(3) OTD comprises three measurements of order fulfillment, as follows:
− Percentage of installed system orders accepted on Customer
Requested Completion Date (CRCD),
− Percentage of line items accepted on Customer Requested On-Job
Date (CROJD), and
− Percentage of service orders accepted on Customer Requested
Completion Date (CRCD).


Notation

Cs = Number of installed systems for which CRCD occurred during the month
Ss = Number of installed systems accepted on the CRCD during the month
Ci = Number of items for which CROJD occurred during the month
Si = Number of items accepted on the CROJD during the month
Cv = Number of service orders for which CRCD occurred during the month
Sv = Number of service orders accepted on the CRCD during the month

Table 5.4-1 On-Time Delivery (OTD)
Measurement Identifiers and Formulas

Identifier  Title                              Formula          Note
OTIS        On-time Installed System Delivery  (Ss / Cs) x 100  % accepted on CRD
OTI         On-time Items Delivery             (Si / Ci) x 100  % accepted on CRD
OTS         On-time Service Delivery           (Sv / Cv) x 100  % accepted on CRD
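
A minimal, non-normative sketch of the three OTD percentages follows; the counts are assumed to already exclude Late Orders Received per rule 5.4.4.c(1), and the values shown are the March figures reported later in Table 5.4-6. Names are arbitrary.

# Illustrative sketch of OTD = (orders/items accepted on CRD / CRDs due) x 100.
def otd_percent(accepted_on_crd, crds_due_in_month):
    if crds_due_in_month == 0:
        return None  # nothing of this order type was due in the month
    return 100.0 * accepted_on_crd / crds_due_in_month

print(otd_percent(1, 4))         # OTIS = 25.0 (installed systems, Ss / Cs)
print(round(otd_percent(3, 9)))  # OTI  = 33   (line items, Si / Ci)
print(otd_percent(1, 4))         # OTS  = 25.0 (service orders, Sv / Cv)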

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 OTD Data Table – The OTD measurements shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product or product/service
category as follows (Table 5.4-2):


Table 5.4-2 TL 9000 OTD Data Table

Year: YYYY
Month MM
Reporting ID: Provided by the QuEST Forum
Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table
A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: OTD
OTIS Numerator: Ss
OTI Numerator: Si
OTS Numerator: Sv
OTIS Denominator: Cs
OTI Denominator: Ci
OTS Denominator: Cv

5.4.5 Sources of Data

OTD data is derived from one or more of the following sources:

a. Supplier’s order entry department,


b. Installation teams, and
c. Customer data.

5.4.6 Method of Delivery or Reporting

a. Compared data (CD) or research data (RD):

On-time Installed System Delivery CD


On-time Items Delivery CD
On-time Service Delivery CD

b. RQMS Alternative Reporting:

None

5.4.7 Examples

a. Table 5.4-3 illustrates computation of OTD measurement from a series of


installations of systems per purchase order (PO).


Table 5.4-3 Example 1 – On-Time Installed System (OTIS)

Columns: Purchase Order | CRD (mm/dd) | Line Item | Quantity Ordered |
Quantity Installed | Date Installed | Date Accepted | On-time Installations | Note
A 03/10 1 5 5 3/10 1
2 6 6 3/10
3 4 4 3/10 3/10
B 03/20 1 8 4 3/22 0 1
4 3/23
2 12 6 3/22
6 3/25 3/25
C 03/21 1 2 2 3/21 0
2 2 1 3/21
1 3/22 3/22
D 02/15 1 7 7 3/15 NA 2
2 1 1 3/15 3/15
E 03/25 1 1 1 3/25 4/15 0 3

TOTALS: Number of Orders = 5, Number of System CRDs Due in Month (Cs) = 4,
On-time Purchase Orders (Ss) = 1
March OTD (Ss/Cs): OTIS = 1/4 = 25.0%

NOTES:
1. Order B – 2 line items were split into 4 partial installations – each with a separate date
installed.
2. PO system D CRD was not counted in the total of 4 for March as it had a February CRD.
3. PO E Service Order while installed on time, did not meet customer acceptance until
supplier changes were completed and after the CRD and therefore was not on time.
4. The CRD installed system OTIS performance for March was 25% or 1 (CRD met) / 4
(CRDs due).
5. It should be noted the line items and associated quantities are shown for completeness.
They have no direct impact in the calculation of OTD for installed systems other than the
system installation has not been completed until the last line item has been accepted.


b. Table 5.4-4 illustrates computation of OTD measurement from a series of


services per purchase order (PO).

Table 5.4-4 Example 2 – On-Time Service Delivery (OTS)

Columns: PO | CRD (mm/dd) | Line Item | Quantity Ordered | Quantity Completed |
Completion Date (mm/dd) | Acceptance Date (mm/dd) | On-time (OTS CRD) | Note
F 3/10 1 5 5 3/10 1
2 6 6 3/10
3 4 4 3/10 3/10
G 3/20 1 8 4 3/22 0
4 3/23
2 12 6 3/22
6 3/25 3/25
H 3/21 1 2 2 3/21 0
2 2 1 3/21
1 3/22 3/22
I 2/15 1 7 7 3/15 NA 1
2 1 1 3/15 3/15
J 3/25 1 1 1 3/15 3/25 0 2
TOTAL: Number of Orders = 5, CRDs Due in March (Cv) = 4, On-time Orders (Sv) = 1
March OTD (Sv/Cv): OTS = 1/4 = 25%

NOTES:
1. PO system I CRD was not counted in the total of 4 for March as it had a
February CRD date and it was previously counted.
2. Service Order was completed but not accepted for early delivery. Thus, CRD
was not met.
3. The CRD OTD performance for March was 25% or 1(CRD met) / 4 (CRDs
due).
4. It should be noted the line items and associated quantities are shown for
completeness. The service has not been delivered until the last item has been
accepted.


c. Table 5.4-5 illustrates computation of OTD measurement from a series of


delivered line items per purchase order (PO).

Table 5.4-5 Example 3 – On-Time Item Delivery (OTI)

Columns: PO | Line Item | Qty Ordered | CRD | Split Order | Qty Delivered |
Actual Delivery Date | On-time (OTD CRD) | Note
K 1 5 3/10 5 3/10 1
2 6 3/12 6 3/13 0
3 4 3/17 4 3/18 0
L 1 8 3/20 8 3/22 0
2 12 3/22 y 6 3/22 0 1
y 6 3/25 0 1
3 2 3/29 2 ? 0 2
4 2 3/30 2 3/30 1
M 1 7 2/15 7 3/15 NA 3
2 1 2/15 1 3/15 NA 3
N 1 20 3/25 y 10 3/25 0 4
y 10 3/25 1 4
O 1 2 3/10 2 3/5 0 5
TOTAL: Number of Orders = 4, Number of Line Items = 9, CRDs Due in March (Ci) = 9,
On-time Items (Si) = 3
March OTD (Si/Ci): OTI = 3/9 = 33%

NOTES:
1. Line item L2 was not on time for CRD because only ½ of the items were
delivered to CRD.
2. “?” - OTD date could not be confirmed and therefore the line item is
assumed to have missed OTD.
3. PO line items M1 and M2 CRDs were not counted in the total of 9 for March
as they had Feb CRD dates and were previously counted.
4. Line item N1 is counted as 1 on time line item because while both portions of
the split shipments were delivered on time, it is still just 1 line item on the
order.
5. Line item O1 was delivered early. Thus, CRD was not met.
6. The CRD OTD performance for March was 33% or 3 (CRD met) / 9 (CRDs
due).


d. The data that would be reported for the above examples


are in Table 5.4-6.

Table 5.4-6 Example 1, 2, 3 – On-Time Delivery


Data Report (OTD)

Year 2000
Month: 03
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Total
Measurement Identifier: OTD
OTIS Numerator: Ss 1
OTIS Denominator: Cs 4
OTI Numerator: Si 3
OTI Denominator: Ci 9
OTS Numerator: Sv 1
OTS Denominator: Cv 4


Section 6 Hardware and Software Measurements


Hardware and Software measurements apply to all hardware and software
products. They do not apply to services.

6.1 System Outage Measurement (SO)

6.1.1 General Description and Title

System Outage (SO) applies only to hardware and software products. SO


measures the loss of primary functionality of all or part of any
telecommunications system. The SO measurement expresses the annualized
total number (outage frequency) and total duration (downtime) of outages
experienced by a system. These measures translate directly into system Mean
Time Between Failures (MTBF) and system availability, respectively. The SO
measurements are calculated for both overall (all causes) and for supplier-
attributable causes. Supplier-attributable availability / unavailability is often
specified in system reliability performance requirements.

6.1.1 Note 1: Bolded text in the definition column of the Product Category
Applicability Table A-1 indicates the primary function of the product
category. This is the function to use for outage measurements.

6.1.2 Purpose

This measurement is used to evaluate the downtime performance and outage


frequency delivered to the end user during field operation with a goal to reduce
both the frequency and duration of outages and their associated cost and
revenue impact.

6.1.3 Applicable Product Categories

This measurement applies to product categories as shown in Appendix A.


Products specifically not included are single circuit packs or non-system products
for which the term outage has no meaning.

6.1.4 Detailed Description

The supplier shall provide two sets of measurements for each product category
code: (1) overall outage frequency and downtime, and (2) supplier attributable
outage frequency and downtime.


a. Terminology

Downtime Performance Measurement (DPM):


DPM applies only to suppliers that elect RQMS equivalent reporting. DPM is the
expected long-term average sum, over one operating year, of the time duration of
events that prevent a user from requesting or receiving services. A failure that
causes service interruption contributes to the outage downtime of that service.
Outage downtime is usually expressed in terms of minutes per system per year.

Outage Frequency Measurement (OFM):


OFM applies only to suppliers that elect RQMS equivalent reporting. OFM is the
expected long-term average number of events, per unit time, that cause a loss of
services to the service provider. Outage frequency is usually expressed in terms
of incidents per system per year.

The Glossary includes definitions for the following terms used for the SO
measurement:

• Customer Base
• Scheduled Outage
• Total System Outage

b. Counting Rules

Unless an exclusion applies, all outages representing loss of functionality


shall be counted as follows:

(1) Complete loss of primary functionality of all or part of a system for a


duration greater than 30 seconds is counted. For a scheduled event, a
duration greater than 15 seconds is counted.

Examples of loss of functionality include:


− In switching systems, loss of origination or termination capability
for all or part of the office is counted.
− In a tandem office or Signaling Transfer Point (STP), any total loss
of Common Channel Signaling (CCS) is counted as total loss of
functionality.
− In a mated pair Service Control Point (SCP), only binodal outages
resulting in complete loss of processing are counted.
− A service order processing system cannot process any orders and
new orders cannot be entered.
(2) Scheduled outages are counted unless the system is allocated a
maintenance window and, during that window, the system is not
required to be in service.
(3) Outages attributed to the customer are counted as part of the overall
outage measurement.


(4) A supplier attributable outage is an outage primarily triggered by


a) the system design, hardware, software, components or other parts
of the system, or
b) scheduled events necessitated by the design of the system, or
c) supplier support activities including documentation, training,
engineering, ordering, installation, maintenance, technical
assistance, software or hardware change actions, etc.
(5) For systems that are not continuously operational (24X7), count only
outages and duration that occur during the operational window.
(6) If redundancy is available for a particular product but the customer
chooses not to purchase it, count the outage as follows:
a) For TL 9000 reporting methodology include the event in "All
Causes" category.
b) For RQMS reporting methodology include the event in "Service
Provider Attributable" category.
(7) Outages are counted in a product only when the failure is within the
product itself.
(8) Counting by Release and by Host / Remote Systems – The following
shall apply:
a) Performance of individual releases is not counted separately.
b) Performance of host systems and remote systems of a product
type is not counted separately.

c. Counting Rule Exclusions

The exclusions to counting all outages are as follows:

(1) Outages due to natural disasters are not counted.


(2) A remote system in stand-alone mode (when its functionality continues
after losing its connection with the host) is not considered out of
service, as this is not a complete loss of functionality.
(3) A CCS outage in an end office is not counted, as this is not a complete
loss of functionality.
(4) Loss of feature functionality, such as Calling Number Delivery, etc., is
not counted.
(5) Outages caused by other products in the network are excluded, e.g., a
failure within an OC192 ring is counted against the OC192 product that
caused the event and is excluded from all the attached multiplexers.


d. Calculations and Formulas


(1) The measurement is calculated monthly for overall and supplier
attributable outages.
(2) When reporting RQMS alternative measurements for product
categories where one of the RQMS designations, “end and / or tandem
office”, “wireless”, DWDM-FR, DWDM-PL, DCS, ATM Node, Trunk
Gateway, Access Gateway, SNC or “NGDLC” applies, suppliers shall
refer to Table 6.1-2 for reporting requirements.
(3) When reporting RQMS alternative measurements for product
categories where the RQMS designations in (2) above do not apply,
suppliers shall refer to Table 6.1-3 for reporting requirements.

Notation

P = Overall Weighted Outage Minutes


Ps = Supplier Attributable Weighted Outage Minutes
Q = Overall Weighted Outages
Qs = Supplier Attributable Weighted Outages
S = Normalization Factor, the total number of normalization units
that are in service during the month
Afactor = Annualization Factor (see Glossary)
NU = Normalization Unit (NU) from Measurement
Applicability Table (Normalized Units),
Appendix A, Table A-2

Table 6.1-1 System Outage Measurement (SO)
Measurement Identifiers and Formulas

Identifier  Title                                              Formula           Note
SO1         Annualized Weighted Outage Frequency               Q x Afactor / S   Weighted outages per NU per year
SO2         Annualized Weighted Downtime                       P x Afactor / S   Minutes per NU per year
SO3         Annualized Supplier Attributable Outage Frequency  Qs x Afactor / S  Weighted outages per NU per year
SO4         Annualized Supplier Attributable Downtime          Ps x Afactor / S  Minutes per NU per year


Table 6.1-2 System Outage Measurements (SOE)


RQMS Alternative Measurements
End Office and/or Tandem Office,
Wireless Products, and NGDLC Products

NOTE: Report only the measurements in this table that are applicable to the
specific product as defined by RQMS. [1]

Identifier Title
rDPMsn Supplier Attributable Total Outage Minutes per System per Year
– remote only
hDPMsn Supplier Attributable Total Outage Minutes per System per Year
– host only
rDPMcn Service Provider Attributable Total Outage Minutes per System
per Year – remote only
hDPMcn Service Provider Attributable Total Outage Minutes per System
per Year – host only
rOFMsn Supplier Attributable Total Outages per System per Year –
remotes
hOFMsn Supplier Attributable Total Outages per System per Year –
hosts
rOFMcn Service Provider Attributable Total Outages per System per Year
– remotes
hOFMcn Service Provider Attributable Total Outages per System per Year
– hosts

Table 6.1-3 System Outage Measurements (SOG)


RQMS Alternative Measurements
General Series

Identifier Title
DPMn Total Outage Minutes Per System Per Year –overall
DPMsn Total Outage Minutes Per System Per Year –supplier attributable
OFMn Total Outages Per Year – overall
OFMsn Total Outages Per Year – supplier attributable


(4) Detailed formulas for the downtime numerator quantities P, Ps, Q, and
Qs and for the normalization factor S are given in the following
analysis.

Notation

N = Number of systems in service at the end of the month
m = Number of outages in the month
Pi = Duration of the ith outage (i = 1, ..., m)
Ai = Number of units (lines, DS1s, etc.) affected in outage i
Sn = Number of units (lines, DS1s, etc.) in system n
S = Number of units (lines, DS1s, etc.) in the total population:

S = \sum_{n=1}^{N} S_n

Downtime Formulas

Systems of Uniform Size

For the special case of products of uniform system size where only total system
outages are possible, the downtime calculation is comparable to the current
RQMS calculation for end offices. Downtime is computed as follows for monthly
data:

DT = 12 \times \frac{\sum_{i=1}^{m} P_i}{N}    (6.1-1)

Examples include toll ticketing, voice messaging, SMDR, dispatch systems, etc.

All other Systems

Downtime for all other products (where systems consist of lines, ports,
terminations, or other normalization units) is computed as follows for monthly
data:

DT = 12 \times \frac{\sum_{i=1}^{m} A_i P_i}{\sum_{n=1}^{N} S_n}    (6.1-2)


Outage Frequency Formulas

Systems of Uniform Size


For the special case of products of uniform system size where only total system
outages are possible, outage frequency is comparable to the current RQMS
calculation for end offices. Outage frequency is computed as follows for monthly
data:

OF = 12 \times \frac{m}{N}    (6.1-3)

Examples include toll ticketing, voice messaging, SMDR, dispatch systems, etc.

All other Systems

Outage Frequency for all other products (where systems consist of lines, ports,
terminations, or other normalization units) is computed as follows for monthly
data:

OF = 12 \times \frac{\sum_{i=1}^{m} A_i}{\sum_{n=1}^{N} S_n}    (6.1-4)

Because A is expressed in normalization units per system outage and S is total
normalization units, the unit of this calculation is outages per year, and the
result is independent of the units chosen for normalization. This measurement is
a downtime "pain index" as viewed from the user's perspective. Performance
delivered to a termination is equivalent to system performance because each
termination appears on exactly one system.

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 SO Data Table 6.1-4 – The SO measurement shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category as follows:


Table 6.1-4 TL 9000 SO Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Normalization Factor: S
Annualization Factor: Afactor (see Glossary)
Measurement Identifier: SO
P: DT – Calculated downtime in minutes/year for all
causes
Ps: DT – Calculated downtime in minutes/year for all
supplier-attributable causes
Q: OF – Calculated outage frequency in
occurrences/year for all causes
Qs: OF – Calculated outage frequency in
occurrences/year for supplier-attributable causes

(3) RQMS Alternative SO Data Table – The RQMS alternative measurements


shall be reported with data elements (or equivalent as defined by the
Measurements Administrator) from the applicable Table 6.1-2 or
Table 6.1-3, for each month as follows:


Table 6.1-5 RQMS Alternative SO Data Table (SOE)

NOTE: If separation of host and remote systems does not apply, report all items
under the host category.

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Normalization Factor: rS – Total systems deployed per RQMS –
remote only
hS – Total systems deployed per RQMS – host
only
Measurement Identifier: SOE
rDPMsn: Annualized supplier attributable total outage
minutes – remote only
hDPMsn: Annualized supplier attributable total outage
minutes – host only
rDPMcn: Annualized service provider attributable total
outage minutes – remote only
hDPMcn: Annualized service provider attributable total
outage minutes – host only
rOFMsn: Annualized supplier attributable total outage
frequency – remote only
hOFMsn: Annualized supplier attributable total outage
frequency – host only
rOFMcn: Annualized service provider attributable total
outage frequency – remote only
hOFMcn: Annualized service provider attributable total
outage frequency – host only

Table 6.1-6 RQMS Alternative SO Data Table (SOG)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Normalization Factor: S – Total systems deployed per RQMS
Measurement Identifier: SOG
DPMn: Annualized total outage minutes for all causes
DPMsn: Annualized supplier attributable outage minutes
OFMn: Annualized total outage frequency for all causes
OFMsn: Annualized supplier attributable outage minutes


6.1.5 Sources of Data


Customers shall report all outage data and system population of their end users
to the supplier. If outage data is not supplied, then the supplier is not responsible
for reporting this measurement.

6.1.6 Method of Delivery or Reporting

a. Compared Data (CD) or Research Data (RD):

Overall System Downtime CD


Overall System Outage Frequency CD
Supplier Attributable System Downtime CD
Supplier Attributable System Outage Frequency CD

b. RQMS Alternative Reporting:

Overall System Downtime YES


Overall System Outage Frequency YES
Supplier Attributable System Downtime YES
Supplier Attributable System Outage Frequency YES

6.1.7 Example Calculations

a. Example 1 – System Outage Reporting

From a population of 200 systems, each of which can be considered either
operational or not operational (such as toll ticketing, voice messaging, SMDR,
dispatch systems, etc.), outages of 20 minutes, 40 minutes, and 60 minutes that
were attributable to the supplier occurred during the month, and one outage of
10 minutes duration was not supplier attributable. The calculations follow:

DT = 12 \times \frac{\sum_{i=1}^{m} P_i}{N}    (downtime calculation 6.1-1)

DTc = 12 (20+40+60+10) / 200 = 1560 / 200


DTc = 7.8 minutes / yr

DTs = 12 (20+40+60) / 200


DTs = 7.2 minutes / yr

OF = 12 \times \frac{m}{N}    (outage frequency calculation 6.1-3)

OFc = 12 (4/200) = 0.24 occurrence / yr


OFs = 12 (3/200) = 0.18 occurrence / yr


b. Example 2 – End Office System Outage Reporting

Consider a population of four central office systems comprising 1600


terminations distributed as 100, 200, 300, and 1000 terminations per system.
The 100- and 1000-termination systems are host systems and each experienced
one 10-minute total outage during the month. The 200- and 300-termination
switches are remote systems. The 200-termination system incurred a 20-minute
outage affecting 50 terminations. All of the outages are attributed to the supplier.

DT = 12 \times \frac{\sum_{i=1}^{m} A_i P_i}{\sum_{n=1}^{N} S_n}    (downtime calculation 6.1-2)

DTs = DTc = 12 {(10 min)(100 terms) + (10 min)(1000 terms) + (20 min)(50 terms)} ÷ 1600 terms
DTs = DTc = 12 (1000 + 10,000 + 1000) ÷ 1600
          = 12 x 12,000 ÷ 1600 = 90.0 min / yr

OF = 12 \times \frac{\sum_{i=1}^{m} A_i}{\sum_{n=1}^{N} S_n}    (outage frequency calculation 6.1-4)

OFs = OFc = 12 (100 terms + 1000 terms + 50 terms) ÷ 1600 terms


OFs = OFc = 12 (1150) ÷ 1600 = 8.625 occurrence / yr

This measurement translates directly to real performance delivered to the end user. A typical system (or a typical line on a typical system) will experience 8.625 outages totaling 90.0 minutes in a year based on performance in the current month. From equation 6.1-6, the availability for this system is {525,600 – 90} / 525,600 = 0.999829. This measurement also translates to performance delivered to the customer. The customer (service provider) will experience 9.0 outages per system per year (not weighted by duration of outage or by size of system).
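
For readers automating the termination-weighted form, the sketch below (illustrative only, not part of the handbook) reproduces the Example 2 figures using calculations 6.1-2, 6.1-4, and 6.1-6.

```python
# Termination-weighted downtime (6.1-2) and outage frequency (6.1-4), Example 2 data.

def weighted_downtime(outages, total_terminations, afactor=12):
    """DT = Afactor x sum(Ai x Pi) / sum(Sn); outages = [(Ai, Pi), ...]."""
    return afactor * sum(a * p for a, p in outages) / total_terminations

def weighted_outage_frequency(outages, total_terminations, afactor=12):
    """OF = Afactor x sum(Ai) / sum(Sn)."""
    return afactor * sum(a for a, _ in outages) / total_terminations

outages = [(100, 10), (1000, 10), (50, 20)]   # (terminations affected Ai, minutes Pi)
total_terms = 100 + 200 + 300 + 1000          # sum of Sn = 1600

dt = weighted_downtime(outages, total_terms)           # 90.0 min / yr
of = weighted_outage_frequency(outages, total_terms)   # 8.625 occurrences / yr
print(dt, of, round((525_600 - dt) / 525_600, 6))      # availability 0.999829 per 6.1-6
```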


Table 6.1-7 Example 2 – SO Data Report for March 2001

Year: 2001
Month: 3
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Forum
Normalization Factor: 4
Annualization Factor: 12
Measurement Identifier: SO
P: 90.0
Ps: 90.0
Q: 8.625
Qs: 8.625

c. Example 3 – Transport System - Digital Cross Connect

Consider a population of a given product consisting of variously sized cross connects interfacing with the network at various signal levels. From the quantity
of each type of port card and its traffic capacity in terms of DS1 equivalents,
excluding units used for protection, the total average capacity of these systems
during the month can be determined. Note that this is a “per DS1-equivalent
port” number. For this example, assume that in the systems there are 200 OC-3
port cards (16,800 DS1 equivalents), 400 DS3 / STS1 units (11,200 DS1
equivalents), and 1,000 units with 4 DS1 ports each (4,000 DS1 equivalents) for
a total capacity of 32,000 DS1 equivalent ports. The outages for the month are
all supplier attributable and are given in the following table. The table is
constructed to calculate formulas 6.1-2 and 6.1-4.

Table 6.1-8 Example 3 – SO Measurement Calculation for a Transport System

Outage Length   Signal   Signal     DS1            Number of   Weighted Frequency    Weighted Time
(minutes) (Pi)  Type     Quantity   Equivalents    Outages     (DS1 x number) (Ai)   (DS1 x minutes) (Ai x Pi)
60              DS3      1          1 x 28 = 28    1           28                    1680
3               DS1      8          8 x 1  = 8     1           8                     24
16              DS1      1          1 x 1  = 1     1           1                     16
5               OC-3     1          1 x 84 = 84    1           84                    420
Total for month (m, Q, P)                          m = 4       Q = 121*              P = 2140*
Annualized (x 12)                                  48          1452                  25680
*reported items


Dividing the annualized totals by the 32,000 DS1 capacity (S = 32,000 DS1
Equivalents), the normalized downtime numbers are:

Table 6.1-9 Example 3 – Normalized SO Measurement Calculation for a Transport System

Digital Cross-connect                              Overall    Supplier Attributable   Unweighted
Downtime (min / equivalent DS1 / yr)               0.8025     0.8025                  –
Outage Frequency (count / equivalent DS1 / yr)     0.045      0.045                   0.0015

This represents delivered annualized performance on a per-DS1 basis. Each equivalent DS1 will experience 0.045 outages totaling 0.8025 minutes in a year.
From Equation 6.1-6, as defined subsequently in Section 6.1.8 Reliability
Conversions, availability for this system is {525,600 – 0.8025} / 525,600 =
0.9999984. Due to the mix of port cards in this example, the four outages
experienced by the TSP in the month represent 1.5 outages per 1,000 equivalent
DS1s per year (unweighted).
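
The same weighting can be applied programmatically to the transport example. The sketch below (illustrative only) mirrors Table 6.1-8 and reproduces the normalized values of Table 6.1-9.

```python
# Example 3: per-outage DS1 weighting, annualization, and normalization.

outages = [          # (minutes Pi, DS1 equivalents affected Ai)
    (60, 1 * 28),    # one DS3 outage
    (3,  8 * 1),     # eight DS1s out for 3 minutes
    (16, 1 * 1),     # one DS1 outage
    (5,  1 * 84),    # one OC-3 outage
]
capacity = 32_000    # total DS1-equivalent ports deployed (S)
afactor = 12

p = afactor * sum(m * a for m, a in outages) / capacity   # weighted downtime
q = afactor * sum(a for _, a in outages) / capacity       # weighted outage frequency
unweighted = afactor * len(outages) / capacity            # outages per DS1, unweighted

print(round(p, 4), round(q, 3), unweighted)   # 0.8025  0.045  0.0015
```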

Table 6.1-10 Example 3 – Transport SO Data Report


for March 2001

Year: 2001
Month: 3
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Forum
Normalization Factor: 32000
Annualization Factor: 12
Measurement Identifier: SO
P: 0.8025
Ps: 0.8025
Q: 0.045
Qs: 0.045

6.1.8 Reliability Conversions

NOTE: The following analysis provides formulas to convert outage frequency and
downtime to other reliability measurements for reference. Equation 6.1-5
provides conversion from outage frequency to mean time between failures
(MTBF). Equation 6.1-6 provides conversion from downtime (expressed in
minutes) to system availability. System availability / unavailability and MTBF are
alternative expressions of system reliability found in some requirement
specifications.


MTBF = Mean Time Between Failures (reference only)

MTBF = {(365)(24)} ÷ OF = mean hours to failure (6.1-5)

This calculation represents the mean (average) number of hours between system
outages.

A = Availability (Reference only)

A = Probability that the system is operational when required


A = Up time ÷ Total time
A = {(365)(24)(60) – DT} ÷ {(365)(24)(60)}
A = {525,600 – DT} ÷ 525,600 (6.1-6)

NOTE: Availability is often expressed as a percentage rather than as shown


above.

U = Unavailability (reference only)

U = Probability that the system is not operational when required


U=1–A (6.1-7)
For five minutes per system per year of downtime, availability is 0.9999905, or "five nines," and unavailability is 1 – A = 9.5 x 10^-6. For 50 minutes of downtime, A = 0.999905, or "four nines," and unavailability is 1 – A = 9.5 x 10^-5.
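
A small helper like the following (illustrative only) captures the reference conversions 6.1-5 through 6.1-7.

```python
# Reference conversions: outage frequency to MTBF, downtime to (un)availability.
MINUTES_PER_YEAR = 365 * 24 * 60   # 525,600
HOURS_PER_YEAR = 365 * 24          # 8,760

def mtbf_hours(outage_frequency):
    """Equation 6.1-5: mean hours between outages from annual outage frequency."""
    return HOURS_PER_YEAR / outage_frequency

def availability(downtime_minutes_per_year):
    """Equation 6.1-6."""
    return (MINUTES_PER_YEAR - downtime_minutes_per_year) / MINUTES_PER_YEAR

def unavailability(downtime_minutes_per_year):
    """Equation 6.1-7: U = 1 - A."""
    return 1.0 - availability(downtime_minutes_per_year)

print(round(availability(5), 7))    # 0.9999905 ("five nines")
print(round(availability(50), 6))   # 0.999905  ("four nines")
```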

Customer Aggregation

A customer can determine the overall system availability delivered to its end users by aggregating the system availability from its various suppliers as follows, where Ax, Ay, or Az is the availability of system type X, Y, or Z, where Px, Py, or Pz is the probability that a termination is served by system type X, Y, or Z (determined by ratio of terminations or systems), and where Px + Py + Pz = 1.

A TSP = Ax Px + Ay Py + Az Pz        (6.1-8)
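
Equation 6.1-8 is a simple weighted average; a minimal sketch (with purely illustrative values) is shown below.

```python
# Equation 6.1-8: termination-weighted aggregation of availability across system types.

def aggregate_availability(systems):
    """systems = [(Ax, Px), ...] where Px is the share of terminations on type X; shares sum to 1."""
    assert abs(sum(p for _, p in systems) - 1.0) < 1e-9
    return sum(a * p for a, p in systems)

# Illustrative example: three system types serving 50%, 30%, and 20% of terminations.
print(aggregate_availability([(0.99999, 0.5), (0.9999, 0.3), (0.999, 0.2)]))   # 0.999765
```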


Section 7 Hardware Measurements

7.1 Return Rates (RR)


7.1.1 General Description and Title

This profile defines four return rate measurements:

• Initial Return Rate (IRR) – return rate of units during the first six months after
initial shipment (months zero through six of shipment),
• One-Year Return Rate (YRR) - return rate of units during the first year
following the Initial Return Rate period (months seven through 18 of
shipment),
• Long-Term Return Rate (LTR) - return rate of units any time following the
One-Year Return Rate period (months 19 and later after shipment), and
• Normalized One-Year Return Rate (NYR) – the normalized return rate of
units during the One-Year Return Rate period.

7.1.2 Purpose

The purpose of this measurement is to:

• Provide a measure of the quality of the product as initially received by the


customer and during subsequent in-service operation,
• Determine areas needing corrective action or most likely benefiting from
continuous improvement activity, and
• Provide input data needed to calculate equipment life cycle costs.

7.1.3 Applicable Product Categories

a. This measurement applies to product categories as shown in Measurement


Applicability Table (Normalized Units), Appendix A, Table A-2.
b. In general, these measurements apply to:
• Any system comprised of field replaceable units (FRUs)
• A system which itself is an FRU
• The individual FRUs themselves.
c. These measurements apply equally to any FRU shipped either in a system or
separately.
d. These measurements are not intended for items shipped in bulk such as:
• Cable
• Optical fiber
• Mechanical hardware, for example, metallic connectors, optical
connectors, conduit, mounting hardware, labels, etc.


NOTE: The Initial Return Rate measurement for items warehoused outside of the
supplier’s control, for an extended period before placement in service, may not
accurately reflect the actual return rate for product in service. This may also be
true of items sold through distributors.

NOTE: Long-Term Return Rates may become inaccurate for older products as
units are taken out of service.

NOTE: The return rate for low cost items after the expiration of any warranty
period is likely to be inaccurate if purchasing a new item is no more expensive
than repairing the failed one.

7.1.4 Detailed Descriptions

a. Terminology

The Glossary includes definitions for the following terms used for this
measurement:

• Annualization Factor (Afactor)


• Basis Shipping Period
• Field Replaceable Unit
• Return

b. Counting Rules

The following rules shall apply when counting returns and shipments for the
return rate measurements.
(1) All returns except as noted in “Counting Rule Exclusions” are counted
in these calculations. See “Return” in the Glossary and the rules below
for the exact definition used here.
(2) Only returns from the basis shipping period corresponding to the
specific measurement shall be counted.
(3) The supplier shall document, for the specific measurement, the method
of determining which of the returns are from the corresponding basis
shipping period. This may be determined by any of the following
methods:
• Serialized shipment records of the returned unit,
• A shipment or warranty start date code marked on the unit,
• A shipment date associated with a customer order, and
• A manufactured date associated with a lot number.
NOTE: The last method would require determining and accounting for a standard time delay between the date of manufacture and shipment.
(4) Units that fail due to the problem corrected by a recall before they can
be rotated are to be counted as returns.


(5) Units damaged during normal shipping and handling, where the container itself is not damaged due to abnormal shipping conditions, are counted as returns.
(6) No trouble found units, i.e., returned units determined by the supplier's organization to meet the supplier's specifications, are included in return totals.
NOTE: Returns and shipments should only be reported once when submitting
data to the QuEST Forum Measurements Administrator. When a unit may be
used in more than one product, it may not be practical or possible to identify with
which product a return or shipment is associated. In such cases, the supplier
should apportion the returns and shipments appropriately among all products in
which the unit is used.

c. Counting Rule Exclusions

The following may be excluded from the return and shipment counts for the
return rate measurements.

(1) Working or untested units returned as part of a formal rotation or recall


program are not considered returns for the purposes of these
measurements.
(2) Units damaged during shipping or while in service due to vehicular
accidents, water leakage, electrical spikes outside of specified limits, or
other environmental factors outside those conditions for which the
equipment was designed, may be excluded.
(3) Items that were ordered in error, purposely ordered in excess, or
consignment items that have not been found defective may also be
excluded from the measure.
(4) All returns from laboratory systems and / or First Office Application
(FOA) systems may be excluded from these measurements.

d. Calculations and Formulas

(1) The measurements shall be calculated according to the formulas shown in Table 7.1-1. The formulae for IRR, YRR, and LTR are not normalized and are expressed in percentage returns per year. The NYR formula is the normalized form of YRR, using the normalization units given in the Measurement Applicability Table (Normalized Units), Appendix A, Table A-2.
(2) The return rates are annualized.
(3) Normalization of System Numbers – The YRR shall be normalized with
units defined in the Measurement Applicability Table (Normalized Units), Appendix A, Table A-2 when reported to the Measurements Administrator.
• A general formula for this normalization of a return rate measurement takes the form:

      Normalized return rate = Returns x Afactor ÷ Normalization Factor

• Example Calculations (see 7.1.7.b) illustrate computation of normalized return rates.

(4) Initial Return Rate (IRR) – The Initial Return Rate measures the rate of
return of product during the reporting period from the population of
units shipped during the prior month through six months prior to the
reporting period. This basis shipping period is assumed to represent
the initial return rate of the product during installation, turn-up, and
testing. Returns from units shipped during the current month are also
included.
(5) One-Year Return Rate (YRR) – The One-Year Return Rate measures
the rate of return of product in its first year of service life following the
initial period included in IRR. It is based on the number of returns
during the reporting period from the population of units shipped seven
to eighteen months prior to the reporting period. This basis shipping
period is assumed to represent the operation during any early life
period.
(6) Long-Term Return Rate (LTR) – The Long-Term Return Rate
measures the rate of return of product more than eighteen months from
time of shipment. It is based on the number of returns during the
reporting period from the population of units shipped more than
eighteen months prior to the reporting period. This rate represents the
mature return rate of the product.
(7) Normalized One-Year Return Rate (NYR) – The normalization of the
One-Year Return Rate allows this circuit pack return measure to be
compared between like products with different architecture.

Notation

NU = Normalization Unit (NU) from Measurement Applicability


Table (Normalized Units), Appendix A, Table A-2
S = Normalization Factor; the total NU count shipped in the one-year
basis shipping period.
Afactor = Annualization Factor, the number of reporting periods in a year
(see glossary)
Ri = Number of returns in the IRR basis shipping period
Ry = Number of returns in the YRR basis shipping period
Rt = Number of returns in the LTR basis shipping period
Si = Number of FRUs shipped during the IRR basis shipping period
Sy = Number of FRUs shipped during the YRR basis shipping period
St = Number of FRUs shipped during the LTR basis shipping period


Table 7.1-1 Return Rates (IRR, YRR, LTR, and NYR)


Measurement Identifiers and Formulas

Identifier Title Formula Note


IRR Initial (Ri / Si) x Afactor x 100 % per year
Return Rate
YRR One-Year (Ry / Sy) x Afactor x 100 % per year
Return Rate
LTR Long-Term (Rt / St) x Afactor x 100 % per year
Return Rate
NYR Normalized One- (Ry / S) x Afactor Returns per
Year Return Rate NU
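
The four formulas translate directly into code; the sketch below (illustrative only, not part of the handbook) expresses Table 7.1-1 for monthly reporting.

```python
# Table 7.1-1 return rate formulas; Afactor = reporting periods per year (12 for monthly).

def irr(ri, si, afactor=12):
    """Initial Return Rate, % per year."""
    return ri / si * afactor * 100

def yrr(ry, sy, afactor=12):
    """One-Year Return Rate, % per year."""
    return ry / sy * afactor * 100

def ltr(rt, st, afactor=12):
    """Long-Term Return Rate, % per year."""
    return rt / st * afactor * 100

def nyr(ry, s, afactor=12):
    """Normalized One-Year Return Rate, returns per normalization unit per year."""
    return ry / s * afactor
```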

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) Data shall be reported for IRR, YRR and LTR. Compared data industry
statistics are based only on the normalized YRR measurement.
(3) TL 9000 Return Rate Data Table - The return rates shall be reported with
data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category as follows:

Table 7.1-2 TL 9000 RR Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Normalization Factor: S (value for computing NYR)
Annualization Factor: Afactor (see Glossary)
Measurement Identifier: RR
IRR Numerator: Ri
YRR Numerator: Ry
LTR Numerator: Rt
IRR Denominator: Si
YRR Denominator: Sy
LTR Denominator: St


7.1.5 Source(s) of Data

The supplier should have available, as a part of its data systems, the information listed above that is needed to calculate these measurements. This includes:
a. FRU shipping records – These are required to determine which units received for repair are "initial returns," "one-year returns," or "long-term returns" and to determine the respective populations.
b. FRU returns records – The supplier’s return records shall include the
identifier necessary to match returns with shipment records.
c. Third party returns records – Units returned to a third party repair agency by
the customer or repaired by the customer itself shall be included in the return
counts when available. To have accurate measurements, it is necessary for
the customer to make it a contractual requirement of their third party repair
agencies to supply this data to the original equipment manufacturers.

7.1.6 Method of Delivery or Reporting

At present, only NYR is considered to have compared data. The IRR, YRR, and
LTR data shall also be reported to the Measurements Administrator for future
use.

a. Compared data (CD) or research data (RD):

Initial Return Rate RD


One-Year Return Rate RD
Long-Term Return Rate RD
Normalized One-Year Return Rate CD

b. RQMS Alternative Reporting:

None

Due to the nature of the changes to the return rate measurement in release 3.0
of the handbook, the release 2.5 return rate measurements are not comparable
to return rate measurements in release 3.0 and later versions of the handbook.

7.1.7 Example Calculations

a. Example Without Normalization


In a given reporting month, all returns are divided into three groups, according to
when they were shipped. For example, for the reporting month of January 1999,
returns are divided into the following groups (as illustrated in Figure 7.1-1):
• Initial Returns: From units shipped in the period from July 1, 1998,
through January 31, 1999.
• One-Year Returns: From units shipped in the period from July 1, 1997
through June 30, 1998.
• Long-Term Returns: From units shipped prior to July 1, 1997.


Figure 7.1-1 Shipping Date Groups for Computing Return Rates
(Timeline for a January 1999 reporting month: Initial Returns are from units shipped July 1998 through January 1999, One-Year Returns from units shipped July 1997 through June 1998, and Long-Term Returns from units shipped before July 1997.)

Table 7.1-3 shows shipments for July 1997 through December 1999, plus all
shipments prior to July 1997. In addition, it shows returns for January 1999
through December 1999, broken out by month of shipment as determined by
shipping records. The highlighted first row of data in Table 7.1-3 shows the
breakdown by month of shipment for the 355 returns received during January
1999. For example, in January 1999, 22 returns were received from the 8253
units shipped in July 1997, and 11 returns were received from the 9243 units
shipped in August 1997.

Table 7.1-3 Example Returns

Return Mo. ↓ \ Ship Date →   Jun-97 & before   Jul-97   Aug-97   Sep-97   Oct-97   Nov-97   Dec-97   Jan-98   Feb-98   Mar-98   Apr-98   May-98   Jun-98
Jan-99 39 22 11 17 19 16 24 11 7 14 10 6 6
Feb-99 44 9 11 16 13 8 16 11 9 15 9 11 13
Mar-99 42 11 14 17 21 15 17 7 8 12 14 12 12
Apr-99 46 12 12 12 15 14 22 9 11 8 10 11 16
May-99 31 11 19 16 17 21 12 9 10 10 9 16 11
Jun-99 35 10 15 16 11 28 19 9 8 15 8 9 7
Jul-99 48 7 13 17 14 17 17 9 5 4 10 13 9
Aug-99 36 10 1 7 19 17 15 13 5 12 6 16 12
Sep-99 46 8 16 16 16 19 24 6 3 7 12 8 14
Oct-99 41 15 10 11 18 13 14 3 14 9 11 13 13
Nov-99 32 16 13 12 17 14 15 6 7 5 11 10 7
Dec-99 30 5 21 17 13 20 14 3 9 12 10 3 13
Shipments:   30000   8253   9243   9261   9721   10131   10140   6263   6436   7244   7275   7396   8263


Table 7.1-3 (continued) Example Returns

Return Mo. ↓ \ Ship Date →   Jul-98   Aug-98   Sep-98   Oct-98   Nov-98   Dec-98   Jan-99   Feb-99   Mar-99   Apr-99   May-99   Jun-99   Jul-99
Jan-99 14 16 20 39 36 23 5
Feb-99 12 6 18 24 26 30 33 1
Mar-99 14 14 15 18 24 20 23 31 5
Apr-99 12 17 18 7 23 22 25 23 27 2
May-99 12 14 16 15 12 25 22 27 26 33 4
Jun-99 14 14 6 15 13 15 30 24 20 28 27 1
Jul-99 14 11 12 17 6 15 18 24 29 26 27 31 1
Aug-99 15 14 19 16 13 15 15 11 38 26 28 26 35
Sep-99 11 12 12 13 9 16 14 13 17 16 31 28 25
Oct-99 10 12 6 6 9 12 19 22 14 19 18 26 32
Nov-99 11 8 16 19 20 16 14 11 19 19 13 22 35
Dec-99 8 13 11 9 12 11 19 16 12 12 24 15 16
Shipments:   8833   8954   9368   9818   9787   10528   10644   11321   11332   11674   12151   12460   13494

Table 7.1-3 (continued) Example Returns

Return Mo. ↓ \ Ship Date →   Aug-99   Sep-99   Oct-99   Nov-99   Dec-99   Total Returns
Jan-99 355
Feb-99 335
Mar-99 366
Apr-99 374
May-99 398
Jun-99 397
Jul-99 414
Aug-99 5 445
Sep-99 33 4 449
Oct-99 25 30 4 449
Nov-99 28 23 34 3 476
Dec-99 21 32 22 36 4 463
Shipments:   13670   13933   13725   14467   14905

The annualized return rates for the month of January 1999, are calculated as:

Initial Return Rate = (Returns of units shipped Jul-98 through Jan-99) x 12 x 100
                      ÷ (Total Shipments for Jul-98 through Dec-98)

                    = (14+16+20+39+36+23+5) x 12 x 100
                      ÷ (8833+8954+9368+9818+9787+10528)

                    = 3.20%


During January 1999, the number of returned units was calculated as follows:

14 returns of units shipped in July 1998,


16 returns of units shipped in August 1998,
20 returns of units shipped in September 1998, and so on, including
5 returns of units shipped in the month of January 1999,
for a total number of initial returns of 153.

The corresponding field population is determined by the sum of the shipment


quantities shown in the bottom row of Table 7.1-3 for the months of July 1998
through December 1998. Note that the returns of units shipped in January are
included in order to count all returns during the month, and to be alerted to any
developing problems. However, shipments during January are excluded
because most units will not have been placed into operation.

One-Year Return Rate = (Returns of units shipped Jul-97 through Jun-98) x 12 x 100
                       ÷ (Total Shipments for Jul-97 through Jun-98)

                     = (22+11+17+19+16+24+11+7+14+10+6+6) x 12 x 100
                       ÷ (8253+9243+9261+9721+10131+10140+6263+6436+7244+7275+7396+8263)

                     = 1.96%

Long-Term Return Rate = (Returns from shipments prior to Jul-97) x 12 x 100
                        ÷ (Total Shipments prior to Jul-97)

                      = 39 x 12 x 100 ÷ 30000

                      = 1.56%
Calculating the return rates for all months in 1999 gives:

            Initial Return Rate   One-Year Return Rate   Long-Term Return Rate
Jan-99 3.20% 1.96% 1.56%
Feb-99 2.80% 1.72% 1.66%
Mar-99 2.66% 1.96% 1.69%
Apr-99 2.44% 1.96% 1.73%
May-99 2.74% 1.86% 1.70%
Jun-99 2.57% 1.65% 1.80%
Jul-99 2.69% 1.50% 1.84%
Aug-99 2.80% 1.81% 1.52%
Sep-99 2.47% 1.55% 1.86%
Oct-99 2.39% 1.55% 1.66%
Nov-99 2.39% 1.73% 1.56%
Dec-99 2.14% 1.57% 1.55%
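
The January 1999 figures above can be reproduced with a short script; the sketch below (illustrative only) uses the return and shipment counts from Table 7.1-3.

```python
# January 1999 return rates from Example 7.1.7.a.
AFACTOR = 12

initial_returns = [14, 16, 20, 39, 36, 23, 5]                       # shipped Jul-98 .. Jan-99
one_year_returns = [22, 11, 17, 19, 16, 24, 11, 7, 14, 10, 6, 6]    # shipped Jul-97 .. Jun-98
long_term_returns = 39                                              # shipped before Jul-97

initial_ships = [8833, 8954, 9368, 9818, 9787, 10528]               # Jul-98 .. Dec-98 (Jan-99 excluded)
one_year_ships = [8253, 9243, 9261, 9721, 10131, 10140,
                  6263, 6436, 7244, 7275, 7396, 8263]               # Jul-97 .. Jun-98
long_term_ships = 30000                                             # before Jul-97

irr = sum(initial_returns) / sum(initial_ships) * AFACTOR * 100     # 3.20 %
yrr = sum(one_year_returns) / sum(one_year_ships) * AFACTOR * 100   # 1.96 %
ltr = long_term_returns / long_term_ships * AFACTOR * 100           # 1.56 %
print(round(irr, 2), round(yrr, 2), round(ltr, 2))
```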


b. Examples With Normalization

(1) Example 1 – Normalized One-Year Return Rate

A supplier makes an HDSL transmission system consisting of the


following products:

i. HDSL Central Office Transceiver Unit (HTU-C) – One HTU-C is


required per HDSL line deployed.
ii. HDSL Remote Transceiver Unit (HTU-R) – One HTU-R is required
per HDSL line deployed.
iii. HDSL Range Extender (HRE) – Zero to two HREs may be used per
HDSL line deployed.
iv. HDSL Fuse / Alarm Controller (HFAC) – One HFAC is required per
HDSL shelf, which may be used to deploy up to 13 HDSL lines.
v. HDSL Central Office Terminal Controller (HCOT-CTL) – One
HCOT-CTL can control up to 100 shelves.
vi. HDSL E220 Shelf – One shelf can accommodate up to 13 HDSL
transceiver units.
Only products i through v are field replaceable units.
To calculate the normalized YRR, returns are aggregated for the entire
HDSL product category and the normalizing factor is applied to the
category as a whole:
i. HDSL Central Office Transceiver Unit (HTU-C)
Returns in the reporting period (one [1] month): 50
Shipments in the basis period (one [1] year): 100,000
ii. HDSL Remote Transceiver Unit (HTU-R)
Returns in the reporting period: 40
Shipments in the basis period: 100,000
iii. HDSL Range Extender
Returns in the reporting period: 5
Shipments in the basis period: 50,000
iv. HDSL Fuse/Alarm Controller
Returns in the reporting period: 3
Shipments in the basis period: 10,000
v. HDSL Central Office Terminal Controller
Returns in the reporting period: 0
Shipments in the basis period: 500

The normalizing factor for xDSL products is the number of DSL lines
deployed. Since one HTU-C and one HTU-R are required to deploy a
single HDSL line, the total number of lines deployed in the basis period is
100,000.

The normalized One-Year Return Rate would be:

[(50 + 40 + 5 + 3 + 0) x 12] / 100,000 = 0.012 returns / yr. / DSL line
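
A minimal sketch of this normalized YRR calculation (names illustrative only) is:

```python
# Normalized One-Year Return Rate for the HDSL product category (Example 1).
returns_by_fru = {"HTU-C": 50, "HTU-R": 40, "HRE": 5, "HFAC": 3, "HCOT-CTL": 0}
lines_deployed = 100_000   # normalization factor S (one HTU-C and one HTU-R per line)
afactor = 12               # monthly reporting

nyr = sum(returns_by_fru.values()) * afactor / lines_deployed
print(nyr)   # 0.01176, reported as approximately 0.012 returns / yr / DSL line
```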


(2) Example 2 – Normalized One-Year Return Rate

A supplier makes a local switch consisting of the following products:


i. POTS line card – Each POTS line card has 16 POTS lines.
ii. Trunk line card – Each trunk line card has four trunk lines.
iii. ISDN line card – Each ISDN line card has eight basic rate ISDN
lines, each of which has two terminations. Each ISDN line card has
an identical card providing protection for the first card.
iv. Miscellaneous circuit packs – Besides the previous three circuit
packs there are 30 other pack codes. They do not supply
termination service, but are needed to support the office.

During the basis period for the YRR, this supplier installed one switch
with the line cards and other circuit packs listed below. To calculate the
normalized YRR, returns are aggregated for the entire switch and the
normalizing factor is applied to the category as a whole.

i. POTS Line Card


Returns in the reporting period (one [1] month): 10
Shipments in the basis period (one [1] year): 1,000

ii. Trunk Line Card


Returns in the reporting period: 5
Shipments in the basis period: 500

iii. ISDN Line Card


Returns in the reporting period: 2
Shipments in the basis period: 500
Active cards with 1:1 protection: 250

iv. Miscellaneous Circuit Packs


Returns in the reporting period: 2

The normalizing factor for Switching and Routing System Elements is 1


termination. The total of all terminations for this switch is:

(1,000 x 16) + (500 x 4) + (250 x 8 x 2) = 22,000

The normalized One-Year Return Rate per 1 termination for the switch
circuit pack shipments is:

(10 + 5 + 2 + 2) x 12 x 1/ 22,000 = 0.010364 returns / yr / termination

YRR Denominator (Sy) = 1,000 + 500 + 250 + 2 = 1,752


(3) Example 2 – Return Rate Data Table

The following table shows how data from the above example would be
reported to the Measurements Administrator. For completeness, the
report includes examples of IRR and LTR data that were not discussed
in the example.

Table 7.1-4 Example 2 – Return Rate Data Table

Year: 1999
Month 01
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: 1.1
Measurement Methodology: TL 9000
Customer Base: Total
Normalization Factor: 22000
Annualization Factor: 12
Measurement Identifier: RR
IRR Numerator: Ri 14
YRR Numerator: Ry 19
LTR Numerator: Rt 30
IRR Denominator: Si 1200
YRR Denominator: Sy 1752
LTR Denominator: St 4500


Section 8 Software Measurements

8.1 Software Installation and Maintenance

8.1.1 General Description and Title

Software Installation and Maintenance (SWIM) measurements track the


installation of new releases and the maintenance effort associated with the
software. These measurements are adapted from RQMS. [1]

8.1.2 Purpose

This measurement is used to evaluate the level of defective software installations


and defective maintenance activities with a goal of minimizing associated
customer impacts. This section defines the measurements associated with the
installation and maintenance of product software. The measurements in this
section are provided to aid the service provider and the supplier in understanding
the effort involved in the installation of new software generic/releases and the
efforts involved in the maintenance of the software generic/release. For the
purpose of these measurements, maintenance covers the activities to correct
defects and/or to add additional functionality to the generally available
generic/release.

Due to the wide assortment of products available and the various mechanisms
used to install and maintain software, three options are provided for the
measurements. The supplier, with service provider input, is to select one of the
options for a particular product based on the most applicable choice.

8.1.3 Applicable Product Categories

This measurement applies to product categories as shown in Appendix A.

8.1.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for software
update measurements:
• General Availability
• Patch
• Patch – Defective Corrective
• Patch – Defective Feature
• Patch – Official
• Release Application


NOTE: A software update is a set of changes to a release and is commonly


referred to as a “dot” or “point” release. A software update completely replaces
existing product code with a new generic/release as opposed to entering patches
into a generic/release. Software updates differ from patching in the manner in
which software changes are made to a system.

Software Updates are used to install the new generic/release and to provide
temporary fixes and new functionality (dot or point releases) between releases.

b. Measurement Options:

The measurement options outlined below group the installation and maintenance
activities together. In this way, the relationship between the installation and
maintenance efforts is apparent. Suppliers providing data shall submit their
results in accordance with the measurements contained in the selected option.

The selection criteria for the three options are described as follows:

                                 Maintenance
Insertion of New Release         Patching      S/W Update
S/W Release Application          Option 1      –
S/W Update                       –             Option 2
S/W Update                       Option 3      Option 3

Option 1 — Software Release Application and Patching:

This option groups the measurements for Software Release Application and
Patching together for those products that use Software Release Application as
the installation methodology for the new release and patching is used as the
maintenance mechanism. The guidelines for using this option are:
• Installation of a new generic/release by a Software Application
completely replaces the existing code in the product.
• Patching is used as the ONLY maintenance mechanism to provide fixes
for defects and to provide additional functionality between
generics/releases. A Patch, by definition, affects only a portion of the
software in the generic/release.
• The methodology used to install the generic/release is usually
significantly different from the processes to install a Patch.
These methodologies commonly apply to End and Tandem offices, STPs, etc.

The following software measurements apply to option 1:

• Release Application Aborts (RAA)


• Corrective Patch Quality (CPQ)
• Feature Patch Quality (FPQ)


Option 2 — Software Updates:

This option is applicable to the products that use Software Updates exclusively
for both the installation of the generic/release and the maintenance of the
software after installation. The guidelines for using this option are:
• A Software Update completely replaces the existing code in the product.
• Software Updates are used to install a new generic/release and to
perform changes to the software between releases.
• The process to install a new generic/release and to perform changes
between releases is essentially the same.
• Software updates introduced between generics/releases provide fixes for
defects and may also provide additional functionality. This software is
commonly referred to as a point or dot release.
This methodology is commonly applied to Transport products.

Software Updates (SWU) is the only software measurement applicable to


option 2.

Option 3 — Software Update and Patching:

For some products, the supplier uses the S/W Update process to install a new
generic/release and uses both S/W Updates (point or dot releases) and Patches
to perform maintenance. This approach is used by the supplier to address urgent
field affecting issues in a timely fashion while retaining the option to maintain the
software using S/W Updates where integration testing, etc., can be better
performed for large changes. The guidelines for using this option are:
• A Software Update completely replaces the existing code in the product.
• Software Updates are used to install a new generic/release and to
perform changes to the software between releases.
• Software Updates introduced between generics/releases, provide fixes to
defects and may also provide additional functionality. This software is
commonly referred to as a point or dot release.
• Patching is used as a maintenance mechanism to provide fixes to
defects and to provide additional functionality between
generics/releases. A Patch, by definition, affects only a portion of the
software in the generic/release.
• The methodology used for S/W Updates is usually significantly different
from the processes to install a Patch.

The following software measurements apply to option 3:

• Software Update (SWU)


• Corrective Patch Quality (CPQ)
• Feature Patch Quality (FPQ)


8.1.5 Release Application Aborts (RAA)

8.1.5.1 General Description and Title

The Release Application Aborts measurement (RAA) is the percentage of


release applications with aborts. This measurement is derived from RQMS. [1]

8.1.5.2 Purpose

This measurement is used to evaluate the percentage of release applications


with aborts with a goal of minimizing the service provider risk of aborts when
applying a software release.

8.1.5.3 Applicable Product Categories

This measurement applies to product categories as shown in Appendix A.

8.1.5.4 Detailed Description

a. Terminology

The Glossary contains definitions for the following term used for the RAA
measurement:

• General Availability

b. Counting Rules

(1) A release is counted on General Availability.


(2) Only supplier attributable aborts shall be counted.
(3) The application/installation interval shall start 24 hours prior to
scheduled cutover and shall end seven days after cutover.
(4) A Release Application Abort (RAA) is the regression to a previous
release within seven days of cutover or the reschedule of the release
application within 24 hours of cutover.
(5) The percentage for each month shall be calculated using the
cumulative number of release application attempts at the end of the
month for that release.
(6) The data shall include the three most dominant releases for each
product being reported. If fewer than three releases exist, the data
shall include all existing releases.

c. Counting Rule Exclusions

None


d. Calculations and Formulas

(1) The measurement shall be calculated monthly for each release as the
percentage of the cumulative number of application attempts for which
a new release has been applied or committed to be applied and for
which a release application abort has occurred.
(2) For each of the three most dominant releases, the supplier shall
provide the number of release application attempts for the month and
the number of systems that encountered any abort during the release
application/installation interval. The supplier shall report this data
monthly.
(3) When reporting RQMS alternative measurements, suppliers shall refer
to RAQ0, RAQ1, RAQ2, Rar0, Rar1, Rar2, in Table 8.1.5-2 to
determine reporting conventions.
(4) The reported data and each of the computed measurements are
totaled/aggregated to one value per registered entity per product
category per month.

Notation

Release N The most recent dominant release reported.


Release N-1 The previous release reported.
Release N-2 The release previous to N-1 that is reported.
Ra0 = Cumulative number of release application attempts for
Release N
Ar0 = Cumulative number of release application aborts for
Release N
Ra1 = Cumulative number of release application attempts for
Release N-1
Ar1 = Cumulative number of release application aborts for
Release N-1
Ra2 = Cumulative number of release application attempts for
Release N-2
Ar2 = Cumulative number of release application aborts for
Release N-2

Table 8.1.5-1 Release Application Aborts (RAA)


Measurement Identifiers and Formulas

Identifier Title Formula Note


RAA0 Release Application (Ar0 / Ra0) x 100 % of systems
Aborts – Release N with aborts
RAA1 Release Application (Ar1 / Ra1) x 100 % of systems
Aborts – Release N-1 with aborts
RAA2 Release Application (Ar2 / Ra2) x 100 % of systems
Aborts – Release N-2 with aborts


Table 8.1.5-2 Release Application Aborts (RAQ)


RQMS Alternative Measurements

Identifier Title
RAQ0 Cumulative % of Systems Experiencing an Abort during Release
Application – Release N
RAQ1 Cumulative % of Systems Experiencing an Abort during Release
Application – Release N-1
RAQ2 Cumulative % of Systems Experiencing an Abort during Release
Application – Release N-2
Rar0 Cumulative Number of Release Application Attempts –
Release N
Rar1 Cumulative Number of Release Application Attempts –
Release N-1
Rar2 Cumulative Number of Release Application Attempts –
Release N-2

e. Reported Data and Format

(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 RAA Data Table – The RAA measurements shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category combination
as in Table 8.1.5-3.

Table 8.1.5-3 TL 9000 RAA Data Table

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: RAA
RAA0 Numerator: Ar0
RAA1 Numerator: Ar1
RAA2 Numerator: Ar2
RAA0 Denominator: Ra0
RAA1 Denominator: Ra1
RAA2 Denominator: Ra2


(3) RQMS Alternative RAA Data Table – The RQMS alternative


measurements shall be reported with data elements (or equivalent as
defined by the Measurements Administrator) for each month and each
product category combination as defined in Table 8.1.5-4.

Table 8.1.5-4 RQMS Alternative RAA Data Table (RAQ)

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Measurement Identifier: RAQ
RAQ0 Numerator: Cumulative number of systems experiencing an
abort during release application for Release N
RAQ1 Numerator: Cumulative number of systems experiencing an
abort during release application for Release N-1
RAQ2 Numerator: Cumulative number of systems experiencing an
abort during release application for Release N-2
RAQ0 Denominator: Cumulative number of release applications for
Release N (Rar0)
RAQ1 Denominator: Cumulative number of release applications for
Release N-1 (Rar1)
RAQ2 Denominator: Cumulative number of release application
attempts for Release N-2 (Rar2)

8.1.5.5 Sources of Data

a. Suppliers shall capture data relative to numbers of release application aborts.

b. Customers shall provide the suppliers (via the mutually agreed procedure)
with timely feedback related to any aborts that were encountered. If
customers perform the release application, they must provide the supplier
with the planned and actual dates for each software application and identify
the applications that aborted due to supplier attributable causes.


8.1.5.6 Method of Delivery or Reporting

a. Compared Data (CD) or Research Data (RD):

Release Application Aborts CD

b. RQMS alternative reporting:

Release Application Aborts YES

8.1.5.7 Example Calculations

a. Example 1 – A supplier is upgrading three active releases (from prior


releases to release N, N-1 and N-2). Release application counts and those
encountering release application aborts are shown in Table 8.1.5-5.

Table 8.1.5-5 Example 1 – RAA Source Data and


Measurement Calculation

Month 1 2 3 4 5 6 7 8 9 10 11 12 13
Dec Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec
2000 2001 2001 2001 2001 2001 2001 2001 2001 2001 2001 2001 2001

Number of Release Applications In Month


Release N 1 4 6 14 22 39 45 52 54 50 47 36 30
Release N-1 1 3 5 12 20 39 46 51 52 45 48 33 29
Release N-2 1 2 5 10 20 40 45 46 45 43 44 30 24

Number of Release Applications that Encountered


Release Application Aborts in Month
Release N 0 0 0 0 1 0 1 0 1 0 1 0 0
Release N-1 0 0 0 1 0 1 1 1 1 1 0 0 1
Release N-2 0 0 0 1 0 1 1 1 1 1 1 1 0

Cumulative Release Application Attempts


Ra0 Release N 1 5 11 25 47 86 131 183 237 287 334 370 400
Ra1 Release N-1 1 4 9 21 41 80 126 177 229 274 322 355 384
Ra2 Release N-2 1 3 8 18 38 78 123 169 214 257 301 331 355

Cumulative Release Application Aborts


Ar0 Release N 0 0 0 0 1 1 2 2 3 3 4 4 4
Ar1 Release N-1 0 0 0 1 1 2 3 4 5 6 6 6 7
Ar2 Release N-2 0 0 0 1 1 2 3 4 5 6 7 8 8

Release Application Aborts Measurement (Cumulative %)


RAA0 Release N 0.00 0.00 0.00 0.00 2.13 1.16 1.53 1.09 1.27 1.05 1.20 1.08 1.00
RAA1 Release N-1 0.00 0.00 0.00 4.76 2.44 2.50 2.38 2.26 2.18 2.19 1.86 1.69 1.82
RAA2 Release N-2 0.00 0.00 0.00 5.56 2.63 2.56 2.44 2.37 2.34 2.33 2.32 2.41 2.25
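
A compact way to reproduce the cumulative RAA percentages (illustrative only) is to accumulate the monthly counts, as in the sketch below for the Release N row.

```python
# Cumulative Release Application Aborts for Release N (Table 8.1.5-5 data).
from itertools import accumulate

attempts = [1, 4, 6, 14, 22, 39, 45, 52, 54, 50, 47, 36, 30]   # per month
aborts   = [0, 0, 0, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0]             # per month

ra0 = list(accumulate(attempts))   # cumulative attempts (Ra0)
ar0 = list(accumulate(aborts))     # cumulative aborts (Ar0)
raa0 = [round(a / r * 100, 2) for a, r in zip(ar0, ra0)]

print(ra0[-1], ar0[-1], raa0[-1])   # 400 4 1.0 for December 2001
```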


b. For the month of December 2001, the TL 9000 data reported for the above
example is shown in Table 8.1.5-6.

Table 8.1.5-6 Example 1 – RAA TL 9000 Data Report

Year: 2001
Month: 12
Reporting ID: Provided by QuEST Forum
Administrator
Product Category Code: From Measurement Applicability
Table (Normalized Units), Appendix
A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Forum
Measurement Identifier: RAA
RAA0 Numerator: Ar0 4
RAA1 Numerator: Ar1 7
RAA2 Numerator: Ar2 8
RAA0 Denominator: Ra0 400
RAA1 Denominator: Ra1 384
RAA2 Denominator: Ra2 355


8.1.6 Corrective Patch Quality (CPQ) and


Feature Patch Quality (FPQ)

8.1.6.1 General Description and Title

The Corrective Patch and Feature Patch measurements are used to monitor the
maintenance activities associated with a generic/release. Corrective Patch
Quality is the percentage of official corrective patches that are determined to be
defective. Feature Patch Quality is the percentage of official feature patches that
are determined to be defective. These measurements are adapted from
RQMS. [1]

8.1.6.2 Purpose

This measurement is used to evaluate the percentage of defective official


patches with a goal of minimizing service provider risk of failure.

8.1.6.3 Applicable Product Categories

This measurement applies to product categories as shown in Appendix A.

8.1.6.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for these
measurements:

• General Availability
• Official Patch
• Patch
• Patch – Defective Corrective
• Patch – Defective Feature

b. Counting Rules

(1) Non-identical patches packaged together in one administrative unit


shall be counted individually even if the package can be installed
during one craftsperson task.
(2) Identical patches distributed to multiple processors (or units) in the
same system shall be counted only once provided they can be installed
during one craftsperson task.
(3) If several separate patches are provided to effect a single change
(such as covering different parts of the code) that are separately
identifiable to the customer, they shall each be counted separately.
(4) A patch is counted on General Availability of the patch. For example,
patches are counted when either (1) on-site and ready for system
installation or (2) available for downloading by the customer to the site.


(5) Patches included with a release that require additional effort to


implement shall be counted as patches.
(6) A defective patch shall be counted against the month during which the
patch was found defective.
(7) The data shall include the three most dominant releases for each
product being reported. If fewer than three releases exist, the data
shall include all existing releases.
c. Counting Rule Exclusions

(1) Patches shall not be counted when included in the release by the
supplier prior to the shipment of that release for the first Service
Provider General Availability.
d. Calculations and Formulas

(1) These measurements (see Table 8.1.6-1, Patch Quality (CPQ and FPQ)) shall be calculated monthly by release. Each measurement is calculated by multiplying 100 by the number of defective patches identified during the month and dividing by the number of patches that became available for general release during the month.
(2) For CPQ, the supplier shall provide, by release, the total monthly
number of official corrective patches delivered and the number of
official corrective patches identified as defective.
(3) For FPQ, the supplier shall provide, by release, the total monthly
number of official feature patches delivered and the number of official
feature patches identified as defective.
(4) When reporting RQMS alternative measurements, suppliers shall refer
to Table 8.1.6-2 Patch Quality (DCP and DFP) – RQMS Alternative
Measurements to determine reporting conventions.


Notation

Release N The most recent dominant release reported.


Release N-1 The previous release reported.
Release N-2 The release previous to N-1 that is reported.
DPc0 = Number of defective corrective patches for the month for
release N
DPc1 = Number of defective corrective patches for the month for
release N-1
DPc2 = Number of defective corrective patches for the month for
release N-2
Pc0 = Total number of corrective patches that became available
for general release during the month for release N
Pc1 = Total number of corrective patches that became available
for general release during the month for release N-1
Pc2 = Total number of corrective patches that became available
for general release during the month for release N-2
DPf0 = Number of defective feature patches for the month for
release N
DPf1 = Number of defective feature patches for the month for
release N-1
DPf2 = Number of defective feature patches for the month for
release N-2
Pf0 = Total number of feature patches that became available
for general release during the month for release N
Pf1 = Total number of feature patches that became available
for general release during the month for release N-1
Pf2 = Total number of feature patches that became available
for general release during the month for release N-2

Table 8.1.6-1 Patch Quality (CPQ and FPQ)


Measurement Identifiers and Formulas

Identifier Title Formula Note


CPQ0 Defective Corrective (DPc0 / Pc0) x 100 % defective
Patches – Release N per month
CPQ1 Defective Corrective (DPc1 / Pc1) x 100 % defective
Patches – Release N-1 per month
CPQ2 Defective Corrective (DPc2 / Pc2) x 100 % defective
Patches – Release N-2 per month
FPQ0 Defective Feature Patches (DPf0 / Pf0) x 100 % defective
– Release N per month
FPQ1 Defective Feature Patches (DPf1 / Pf1) x 100 % defective
– Release N-1 per month
FPQ2 Defective Feature Patches (DPf2 / Pf2) x 100 % defective
– Release N-2 per month


Table 8.1.6-2 Patch Quality (DCP and DFP)


RQMS Alternative Measurements

Identifier Title
DCP0 Monthly Number of Defective Corrective Patches Identified
– Release N
DCP1 Monthly Number of Defective Corrective Patches Identified
– Release N-1
DCP2 Monthly Number of Defective Corrective Patches Identified
– Release N-2
DFP0 Monthly Number of Defective Feature Patches Identified
– Release N
DFP1 Monthly Number of Defective Feature Patches Identified
– Release N-1
DFP2 Monthly Number of Defective Feature Patches Identified
– Release N-2
CPr0 Monthly Number of Corrective Patches Delivered – Release N
CPr1 Monthly Number of Corrective Patches Delivered – Release N-1
CPr2 Monthly Number of Corrective Patches Delivered – Release N-2
FPr0 Monthly Number of Feature Patches Delivered – Release N
FPr1 Monthly Number of Feature Patches Delivered – Release N-1
FPr2 Monthly Number of Feature Patches Delivered – Release N-2

e. Reported Data and Format


(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 Data CPQ or FPQ Table – The CPQ and FPQ measurements
shall be reported with data elements (or equivalent as defined by the
Measurements Administrator) for each month and each product
category combination as follows:

Table 8.1.6-3 TL 9000 CPQ or FPQ Data Table

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: CPQ or FPQ
CPQ0 or FPQ0 Numerator: DPc0 (for CPQ) or DPf0 (for FPQ)
CPQ1 or FPQ1 Numerator: DPc1 (for CPQ) or DPf1 (for FPQ)
CPQ2 or FPQ2 Numerator: DPc2 (for CPQ) or DPf2 (for FPQ)
CPQ0 or FPQ0 Denominator: Pc0 (for CPQ) or Pf0 (for FPQ)
CPQ1 or FPQ1 Denominator: Pc1 (for CPQ) or Pf1 (for FPQ)
CPQ2 or FPQ2 Denominator: Pc2 (for CPQ) or Pf2 (for FPQ)


(3) RQMS Alternative CPQ or FPQ Data Table – The RQMS alternative
measurements for CPQ and FPQ shall be reported with data elements
(or equivalent as defined by the Measurements Administrator) for each
month and each product category combination as follows:

Table 8.1.6-4 RQMS Alternative CPQ or FPQ Data Table


(DCP or DFP)

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Measurement Identifier: DCP or DFP
DCP0 or DFP0 Numerator: Number of defective (corrective / feature)
patches for release N
DCP1 or DFP1 Numerator: Number of defective (corrective / feature)
patches for release N-1
DCP2 or DFP2 Numerator: Number of defective (corrective / feature)
patches for release N-2
DCP0 or DFP0 Denominator: Number of (corrective / feature) patches
delivered for release N (CPr0 or FPr0)
DCP1 or DFP1 Denominator: Number of (corrective / feature) patches
delivered for release N-1 (CPr1 or FPr1)
DCP2 or DFP2 Denominator: Number of (corrective / feature) patches
delivered for release N-2 (CPr2 or FPr2)

8.1.6.5 Sources of Data

Suppliers shall collect all data necessary to support this measurement.

8.1.6.6 Method of Delivery or Reporting

a. Compared Data (CD) or Research Data (RD)

Corrective Patch Quality CD


Feature Patch Quality RD

b. RQMS Alternative Reporting:

Corrective Patch Quality YES


Feature Patch Quality YES


8.1.6.7 Example Calculations

a. The following example illustrates calculation of the corrective patch quality


measurement. Calculation of the feature patch quality measurement is
analogous.

Example 1 - Corrective Patch Quality Measurement:

A supplier has three active releases (N, N-1, and N-2). Corrective patch
distribution and bad corrective patch counts were as shown in Table 8.1.6-5.

Table 8.1.6-5 Example 1 – CPQ Source Data and


Measurement Calculation

Month: 1 2 3 4 5 6 7 8 9 10 11 12 13
Jan Feb Mar Apr May Jun Jul Aug Sep Oct Nov Dec Jan
2000 2000 2000 2000 2000 2000 2000 2000 2000 2000 2000 2000 2001
Number of Corrective
Patches Issued In Month
Pc0 Release N 52 53 48 35 34 34 32 30 28 30 25 24 22
Pc1 Release N-1 55 55 50 40 36 32 34 36 33 32 26 24 24
Pc2 Release N-2 60 55 50 47 42 35 35 31 32 30 29 27 25

Number of Defective Corrective Patches


Identified in Month
DPc0 Release N 0 1 0 0 0 1 0 0 0 0 0 1 0
DPc1 Release N-1 1 0 0 1 0 0 1 0 0 0 1 0 1
DPc2 Release N-2 1 0 0 2 0 0 0 1 0 1 1 0 0

Defective Corrective Patch Measurement -


% Defective
CPQ0 Release N 0 1.89 0 0 0 2.94 0 0 0 0 0 4.17 0
CPQ1 Release N-1 1.82 0 0 2.50 0 0 2.94 0 0 0 3.85 0 4.17
CPQ2 Release N-2 1.67 0 0 4.26 0 0 0 3.23 0 3.33 3.45 0 0
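
A short sketch (illustrative only) reproduces the Release N row of the table; FPQ is calculated identically from feature patch counts.

```python
# Monthly Corrective Patch Quality for Release N (Table 8.1.6-5 data).
issued    = [52, 53, 48, 35, 34, 34, 32, 30, 28, 30, 25, 24, 22]   # Pc0 per month
defective = [0, 1, 0, 0, 0, 1, 0, 0, 0, 0, 0, 1, 0]                # DPc0 per month

cpq0 = [round(d / p * 100, 2) for d, p in zip(defective, issued)]
print(cpq0)   # [0.0, 1.89, 0.0, 0.0, 0.0, 2.94, 0.0, 0.0, 0.0, 0.0, 0.0, 4.17, 0.0]
```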


b. For the month of November 2000, the TL 9000 CPQ data reported is shown in Table 8.1.6-6.

Table 8.1.6-6 Example 1 – CPQ Data Report

Year: 2000
Month 11
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Forum
Measurement Identifier: CPQ
CPQ0 Numerator: DPc0 0
CPQ1 Numerator: DPc1 1
CPQ2 Numerator: DPc2 1
CPQ0 Denominator: Pc0 25
CPQ1 Denominator: Pc1 26
CPQ2 Denominator: Pc2 29


8.1.7 Software Update Quality (SWU)

8.1.7.1 General Description and Title

A variety of new products have been developed that use an alternative approach to install new generic/releases and maintenance software (point or dot releases) into the product. A software update replaces the existing code with new software. The mechanisms used to install the generic/release and the point or dot releases are essentially the same. The service provider is concerned with the quality of the software and the number of changes the supplier makes during the release's lifecycle. Software Update Quality (SWU) quantifies the percentage of these updates that are defective.

A software update is used:


• To install a new generic/release into a product.
• Between generics/releases to effect a series of changes to fix problems
or to implement new features that the service provider may wish to
deploy on a timely basis rather than wait for a new generic/release.

8.1.7.2 Purpose

This measurement is used to evaluate the level of defective software updates


with a goal of minimizing associated customer risks.

8.1.7.3 Applicable Product Categories

This measurement applies to the product categories as shown in Appendix A per


the rules for software measurement option selection noted above.

8.1.7.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for these
measurements:

• General Availability

b. Counting Rules

Software Updates

The following rules shall apply to the Software Update measurements:

(1) A software update is counted on General Availability.


(2) Software updates shall be considered delivered when they are


delivered to the destination(s) designated by the customer.
For example, software updates are considered delivered when on-site,
or ready for system installation, or available for downloading by the
customer to the site.
(3) The data shall include the three most dominant releases for each
product being reported. If fewer than three releases exist, the data
shall include all existing releases.

Defective Software Update

A defective software update is a software update that:


(1) is withdrawn or has its distribution curtailed due to a supplier
attributable problem
(2) causes a critical or major supplier attributable problem within 6 months
of general availability of the software
(3) does not correct the targeted problem(s) or provide the intended
feature functionality

The following rules shall apply to counting defective software updates:

(1) A defective software update shall be counted against the month during
which the software update was found defective and the release for
which it was intended to update.
(2) The data shall include the three most dominant releases for each
product being reported. If fewer than three releases exist, the data
shall include all existing releases.
(3) For this calculation, the volume of software updates and defective
software updates shall include the software update used to install the
release and all the maintenance software updates (point or dot
releases) associated with the release.
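
The following Python fragment is an illustrative sketch only, not part of the TL 9000 requirements; all type, field, and function names are hypothetical assumptions. It shows one way the defective software update criteria and counting rules above could be applied.

    from dataclasses import dataclass
    from datetime import date
    from typing import Optional

    @dataclass
    class SoftwareUpdate:
        release: str                     # release the update was intended to update (N, N-1, ...)
        ga_date: date                    # General Availability date of the update
        withdrawn: bool                  # withdrawn or distribution curtailed, supplier attributable
        problem_severity: Optional[str]  # "critical", "major", "minor", or None
        problem_date: Optional[date]     # date a supplier-attributable problem was found
        corrected_target: bool           # corrected the targeted problem(s) / provided intended feature

    def is_defective(update):
        # Criterion (2): critical or major supplier-attributable problem within
        # 6 months of General Availability (assumed here as 183 days).
        problem_within_six_months = (
            update.problem_severity in ("critical", "major")
            and update.problem_date is not None
            and (update.problem_date - update.ga_date).days <= 183
        )
        # Criterion (1): withdrawn; criterion (3): did not correct the targeted
        # problem(s) or provide the intended feature functionality.
        return update.withdrawn or problem_within_six_months or not update.corrected_target

    # Counting rule (1): a defective update counts against the month in which it
    # was found defective and against the release it was intended to update.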

c. Counting Rule Exclusions

None

d. Calculations and Formulas

(1) The measurement (see SWU0, SWU1 and SWU2 in Table 8.1.7-1)
shall be calculated monthly as the cumulative percentage of defective
software updates by release since General Availability.
(2) The percentage for each month shall be calculated by dividing the
cumulative number of defective software updates by the cumulative
number of software updates deployed for the release.
(3) The supplier shall provide, by release, the total monthly number of
software updates delivered and the number of defective software
updates identified.


(4) When reporting RQMS alternative measurements, suppliers shall refer
to DSU0, DSU1, and DSU2 in Table 8.1.7-2 to determine reporting
conventions.
(5) The reported data and each of the computed measurements are
totaled/aggregated to one value per registered entity per product
category per month.

Notation

Release N The most recent dominant release reported.


Release N-1 The previous release reported.
Release N-2 The release previous to N-1 that is reported.
Du0 = Cumulative number of defective software updates for release N
Du1 = Cumulative number of defective software updates for release N-1
Du2 = Cumulative number of defective software updates for release N-2
Us0 = Cumulative number of software updates for release N
Us1 = Cumulative number of software updates for release N-1
Us2 = Cumulative number of software updates for release N-2

Table 8.1.7-1 Software Update Quality (SWU)
Measurement Identifiers and Formulas

Identifier  Title                                       Formula            Note
SWU0        Defective Software Updates – Release N      (Du0 / Us0) x 100  Cumulative % defective
SWU1        Defective Software Updates – Release N-1    (Du1 / Us1) x 100  Cumulative % defective
SWU2        Defective Software Updates – Release N-2    (Du2 / Us2) x 100  Cumulative % defective

Table 8.1.7-2 Software Update Quality (DSU)
RQMS Alternative Measurements

Identifier  Title
DSU0        Cumulative Number of Defective Software Updates – Release N
DSU1        Cumulative Number of Defective Software Updates – Release N-1
DSU2        Cumulative Number of Defective Software Updates – Release N-2

e. Reported Data and Format


(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) TL 9000 Data SWU Table – The SWU measurements shall be reported
with data elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category combination
as follows:

Table 8.1.7-3 TL 9000 SWU Data Table

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: TL 9000
Customer Base: Either Total or Forum
Measurement Identifier: SWU
SWU0 Numerator: Du0
SWU1 Numerator: Du1
SWU2 Numerator: Du2
SWU0 Denominator: Us0
SWU1 Denominator: Us1
SWU2 Denominator: Us2
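
As an illustration only, not part of the TL 9000 requirements, the record defined by Table 8.1.7-3 could be assembled as follows; the function name and the reporting ID are hypothetical placeholders, and the numeric values anticipate the June 2000 example in 8.1.7.7.

    def swu_record(year, month, reporting_id, product_category, du, us, customer_base="Forum"):
        # du and us are (release N, N-1, N-2) tuples of cumulative defective and
        # total software update counts for the reported month.
        return {
            "Year": year,
            "Month": month,
            "Reporting ID": reporting_id,                # provided by QuEST Forum Administrator
            "Product Category Code": product_category,   # from Appendix A, Table A-2
            "Measurement Methodology": "TL 9000",
            "Customer Base": customer_base,              # "Total" or "Forum"
            "Measurement Identifier": "SWU",
            "SWU0 Numerator (Du0)": du[0], "SWU0 Denominator (Us0)": us[0],
            "SWU1 Numerator (Du1)": du[1], "SWU1 Denominator (Us1)": us[1],
            "SWU2 Numerator (Du2)": du[2], "SWU2 Denominator (Us2)": us[2],
        }

    # June 2000 record from Example 1 in 8.1.7.7 (reporting ID is a placeholder):
    record = swu_record(2000, 6, "EXAMPLE-ID", "2.1", du=(4, 6, 4), us=(55, 50, 29))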

(3) RQMS Alternative SWU Data Table – The RQMS alternative
measurements shall be reported with data elements (or equivalent as
defined by the Measurements Administrator) for each month and each
product category combination as follows:

Table 8.1.7-4 RQMS Alternative SWU Data Table (DSU)

Year: YYYY
Month: MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Measurement Methodology: RQMS
Customer Base: Either Total or Forum
Measurement Identifier: DSU
DSU0: Cumulative number of defective software
updates for release N
DSU1: Cumulative number of defective software
updates for release N-1
DSU2: Cumulative number of defective software
updates for release N-2

8.1.7.5 Sources of Data

Customers shall provide feedback to the supplier on the results (successful or
unsuccessful) of any customer installed software update. Suppliers shall collect
all data necessary to report these measurements to the Measurements
Administrator.


8.1.7.6 Method of Delivery or Reporting

a. Compared data (CD) or research data (RD):

Software Update Quality CD

b. RQMS Alternative Reporting:

Software Update Quality YES

8.1.7.7 Example Calculations

Example 1 – A supplier of a software driven product distributes software updates
in the interim between releases. Table 8.1.7-5 shows the history of updates for a
6-month period. Table 8.1.7-6 shows the data report for June 2000.

Table 8.1.7-5 Example 1 – SWU Source Data and Measurement Calculation
(Product Category 2.1)

Release ID  Number of Updates        Jan 2000  Feb 2000  Mar 2000  Apr 2000  May 2000  Jun 2000
D4 (N)      Cumulative (Us0)             -         -         -         -        25        55
D4 (N)      Cum. Defectives (Du0)        -         -         -         -         2         4
D4 (N)      Current                      -         -         -         -        25        30
D4 (N)      Current Defectives           -         -         -         -         2         2
SWU0        Cumulative % Defective       -         -         -         -        8%       7.3%
D2 (N-1)    Cumulative (Us1)             -         -        10        15        30        50
D2 (N-1)    Cum. Defectives (Du1)        -         -         2         3         4         6
D2 (N-1)    Current                      -         -        10         5        15        20
D2 (N-1)    Current Defectives           -         -         2         1         1         2
SWU1        Cumulative % Defective       -         -       20%       20%     13.3%       12%
C25 (N-2)   Cumulative (Us2)            11        15        18        25        28        29
C25 (N-2)   Cum. Defectives (Du2)        1         1         1         2         4         4
C25 (N-2)   Current                      2         4         3         7         3         1
C25 (N-2)   Current Defectives           0         0         0         1         2         0
SWU2        Cumulative % Defective     9.1%      6.7%      5.6%      8.0%     14.3%     13.8%
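
The SWUx rows in this table follow from the formulas in Table 8.1.7-1 applied to the cumulative counts each month. The following Python fragment is an illustrative sketch only, not part of the TL 9000 requirements; the function name is hypothetical. It reproduces the SWU1 row for release D2 (N-1).

    def cumulative_swu(monthly_updates, monthly_defectives):
        # Cumulative % defective since General Availability, per Table 8.1.7-1:
        # SWUx = (Dux / Usx) x 100 computed on cumulative counts each month.
        results, us, du = [], 0, 0
        for delivered, defective in zip(monthly_updates, monthly_defectives):
            us += delivered
            du += defective
            results.append(round(100.0 * du / us, 1))
        return results

    # Release D2 (N-1) from Example 1, March through June 2000:
    print(cumulative_swu([10, 5, 15, 20], [2, 1, 1, 2]))   # -> [20.0, 20.0, 13.3, 12.0]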


Table 8.1.7-6 Example 1 – SWU Data Table
Report for June 2000

Year: 2000
Month 6
Reporting ID: Provided by QuEST Forum
Administrator
Product Category Code: 2.1
Measurement Methodology: TL 9000
Customer Base: Forum
Measurement Identifier: SWU
SWU0 Numerator: Du0 4
SWU1 Numerator: Du1 6
SWU2 Numerator: Du2 4
SWU0 Denominator: Us0 55
SWU1 Denominator: Us1 50
SWU2 Denominator: Us2 29


Section 9 Services Measurements

9.1 Service Quality (SQ)

9.1.1 Description and Title

Service Quality is a measure of conformance of a service to specified criteria.

9.1.2 Purpose

This measurement is used to provide quality measurement information for the
evaluation and continuous improvement of the service.

This section does not contain all of the service measurements. Section 5 also
contains measurements associated with services, namely Service Problem
Reports, Service Fix Response Time, Service Overdue Problem Reports, On-Time
Installed System, and On-Time Service Delivery.

9.1.3 Applicable Product Categories

This measurement applies to service categories as shown in Appendix A.

9.1.4 Detailed Description

a. Terminology

The Glossary includes definitions for the following terms used for these
measurements:

• Installation and/or Engineering Audit


• Maintenance
• Service Categories

b. Counting Rules
(1) Failure of any unit during the repair warranty period shall be counted as
a defective repair unit.
(2) Audits performed at “installation” shall include supplier caused
installation engineering defects and installation defects.
(3) Definitions for defects, service volume (Normalization Unit) and
measurement units for the applicable product categories are given in
Table 9.1-1.


c. Counting Rule Exclusions

(1) Customer Support Center activities that are turned into customer
problem reports are not to be included in this measure.
(2) Maintenance visits or callbacks shall not be counted if it is determined
that they were attributable to incorrect information supplied by the
customer as mutually agreed between parties. A maintenance visit is a
site visit to a customer’s location for the purpose of performing
maintenance. A maintenance callback is a site visit to a customer’s
location for the purpose of maintenance rework.

Table 9.1-1 Definitions of Defects, Service Volume and Units of Measure
by Service Product Categories for Service Quality Measurements

Service Category           Counted Item (defect)                         Service Volume (Abbreviation)                              Unit of Measure
Installation               Number of non-conforming audits               Total number of installation and/or engineering audits
                                                                         (number of audits)                                          % SQ
Maintenance                Number of maintenance callbacks               Total number of maintenance visits (number of visits)       % SQ
Repair                     Number of defective repair warranty units     Total number of repaired units (number of repairs)          % SQ
Customer Support Service   Number of resolutions exceeding agreed time   Total number of calls (number of calls)                     % SQ
Support Service            Number of defects                             Total number of transactions (number of transactions)       % SQ

NOTE: Service volume is a measure of the amount of service delivered.

NOTE: A nonconforming audit is one that fails to satisfy specified
acceptance requirements.


d. Calculations and Formulas

The method used to compute service quality is percentage conforming (% SQ). To
determine the percentage conforming to specified criteria, the percentage of
non-conformances shall be calculated and subtracted from 100%. The percentage
of non-conformance shall be calculated as the total number of defects divided
by the total number of opportunities for defects.

% Service Quality (% SQ) = (1 – counted item/service volume) x 100

Detailed service quality measurements formulas (SQ1, SQ2, SQ3, SQ4, and
SQ5) appear in Table 9.1-2.

Notation

(Items are counted according to above stated rules)

NU = Service volume unit (normalization unit)


S1 = Installation service audits
S2 = Maintenance service volume
S3 = Repair service volume
S4 = Customer Support Service service volume
S5 = Support service volume
Sd1 = Number of installation non-conforming audits
Sd2 = Number of maintenance callbacks
Sd3 = Number of defective repair warranty units
Sd4 = Number of Customer Support Service resolutions exceeding
specified agreed time
Sd5 = Number of support service defects

Table 9.1-2 Service Quality (SQ)
Measurement Identifiers and Formulas

Identifier  Title                                               Formula            Note
SQ1         Conforming Installation and/or Engineering Audits   (1-Sd1/S1) x 100   % audits conforming
SQ2         Successful Maintenance Visits                       (1-Sd2/S2) x 100   % visits without maintenance callbacks
SQ3         Successful Repairs                                  (1-Sd3/S3) x 100   % successful repairs
SQ4         Conforming Customer Support Service Resolutions     (1-Sd4/S4) x 100   % calls resolved within agreed time
SQ5         Conforming Support Service Transactions             (1-Sd5/S5) x 100   % transactions without defect
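
As an illustration only, not part of the TL 9000 requirements, the % SQ formula can be applied as follows; the function name is hypothetical, and the values anticipate the January computations from the examples in 9.1.7.

    def service_quality_percent(counted_items, service_volume):
        # % SQ = (1 - counted item / service volume) x 100
        return round((1 - counted_items / service_volume) * 100, 2)

    # January values from the examples in 9.1.7:
    print(service_quality_percent(5, 100))     # SQ1, installation audits        -> 95.0
    print(service_quality_percent(2, 30))      # SQ2/SQ3, maintenance and repair -> 93.33
    print(service_quality_percent(15, 2000))   # SQ4, customer support service   -> 99.25
    print(service_quality_percent(5, 1000))    # SQ5, support service            -> 99.5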


e. Reported Data and Format


(1) Data shall be reported quarterly. Each report shall include data for the
three months in the quarter.
(2) SQ Data Table – The SQ measurement shall be reported with data
elements (or equivalent as defined by the Measurements
Administrator) for each month and each product category as shown in
Table 9.1-3.

Table 9.1-3 TL 9000 SQ Data Table

(Report one value each for S1-S5, SQ1-SQ5, and Sd1-Sd5)

Year: YYYY
Month MM
Reporting ID: Provided by QuEST Forum Administrator
Product Category Code: From Measurement Applicability Table
(Normalized Units), Appendix A, Table A-2
Customer Base: Either Total or Forum
Normalization Factor: S1, S2, S3, S4, or S5
(as appropriate for the specific measurement
reported)
Measurement Identifier: SQ1, SQ2, SQ3, SQ4, or SQ5
SQ Numerator: Sd1, Sd2, Sd3, Sd4, or Sd5
(as appropriate)

9.1.5 Sources of Data

See Table 9.1-4.

Table 9.1-4 SQ Data Sources

Category                                   Source of Data
Installation and/or Engineering Audits     Supplier to count number of non-conforming supplier installation and/or engineering audits
Maintenance                                Supplier to count maintenance revisits
Repair                                     Supplier to count number of repaired units that failed within repair warranty period
Customer Support Service                   Supplier to count number of Customer Support Service resolutions exceeding specified time
Support Service                            Supplier to count number of Support Service defects


9.1.6 Method of Delivery or Reporting

a. Compared Data (CD) or Research Data (RD)

Conforming Installations and/or Engineering Audits RD
Successful Maintenance Visits CD
Successful Repairs CD
Conforming Customer Support Service Resolutions CD
Conforming Support Service Transactions RD

b. RQMS Alternative Reporting:

None

9.1.7 Example Calculations

a. Example 1 – Installation

(1) Data Collected and Results

Table 9.1-5 Example 1 – Source Data for Installation SQ

                                                                   January   February   March   April
Number of Non-conforming Installation and/or Engineering Audits    5         1          0       6
Total Number of Installation and/or Engineering Audits             100       50         75      80
Service Quality Measurement                                        95%       98%        100%    92.5%

i. Computation for the month of January:
   (1-5/100) x 100 = 95%
ii. Data Report for January 2000 is shown in Table 9.1-6.

Table 9.1-6 Example 1 – Data Report for Installation SQ

Year: 2000
Month 01
Reporting ID: Provided by QuEST Forum
Administrator
Product Category Code: 7.1
Measurement Methodology: TL 9000
Customer Base: Forum
Normalization Factor: 100
Measurement Identifier: SQ1
SQ Numerator: 5


b. Example 2 – Repair

(1) Data Collected and Results

Table 9.1-7 Example 2 – Source Data for Repair SQ

                                                             January   February   March   April
Number of Defective Repaired Units within Repair Warranty    2         0          1       4
Total Number of Repaired Units                               30        20         75      120
Service Quality Measurement                                  93.3%     100%       98.6%   96.6%

(2) Computation for the month of January:
    (1-2/30) x 100 = 93.3%
(3) Data report is analogous to the Installation example.

c. Example 3 – Maintenance
(1) Data Collected and Results

Table 9.1-8 Example 3 – Source Data for Maintenance SQ

                                    January   February   March   April
Number of Maintenance Callbacks     2         0          1       4
Number of Maintenance Visits        30        20         75      120
Service Quality Measurement         93.3%     100%       98.6%   96.6%

(2) Computation for the month of January:
    (1-2/30) x 100 = 93.3%
(3) Data report is analogous to the Installation example.

d. Example 4 – Customer Support Service


(1) Data Collected and Results

Table 9.1-9 Example 4 – Source Data for Customer Support Service SQ

                                                                          January   February   March   April
Number of Call Resolutions That Exceeded the Specified Time Allotment     15        40         10      4
Total Number of Calls Which Came Into Customer Support Service            2000      5000       2750    3000
Service Quality Measurement                                               99.25%    99.2%      99.6%   99.8%


(2) Computations for the month of January:
    (1-15/2000) x 100 = 99.25%
(3) Data report is analogous to the Installation example.

e. Example 5 – Support Service

This example references a cable locator service, with the defect defined as a
cable cut due to incorrect identification.
(1) Data Collected and Results

Table 9.1-10 Example 5 – Source Data for Support Service SQ

                                                                       January   February   March   April
Cut Cables (Number of Defects)                                         5         2          0       4
Number of Cables Identified (Number of Opportunities for Defects)     1000      500        750     300
Service Quality Conformance Measurement                                99.5%     99.6%      100%    98.6%

(2) Computation for the month of January:
    (1-5/1000) x 100 = 99.5%
(3) Data report is analogous to the Installation example.


Appendix A Product Category Tables

This Appendix is current with the release of this handbook. However, the
tables in this appendix are subject to revision. See the QuEST Forum web
site (http://www.questforum.org/) for the latest version. The latest version
shall be used in conjunction with registrations.

Suppliers shall classify their products and report measurements according to the
listed product categories. The Measurement Applicability Table (Normalized
Units), Table A-2, lists specific measurements that apply to each category as well
as the normalized units and other information necessary for compiling
measurement reports.

1. List of Tables

Table A-1. Product Category Definitions


Table A-2. Measurement Applicability Table (Normalized Units)
Table A-3. Transmission Standard Designations and Conversions
Table A-4. Optical and Electrical Equivalency
Table A-5. Measurements Summary Listing

2. Product Category Definitions

Table A-1 contains definitions of product categories to be used by suppliers in
categorizing their products.

2.1 Rules for Classification of Products

A supplier will not be required to report measurements for a given product in
multiple product categories. Therefore, any product from a given supplier must
be classified in exactly one product category.

1. General-purpose products (e.g., computers) will be classified by specific
function (e.g., signaling) when provided as a system designed for that
function. Otherwise, they will be classified in a separate category (e.g.,
Common Systems-Computers) designed for the general-purpose product.

2. A product will be classified according to its primary function. For example, a
digital transmission facility product with performance monitoring will be
classified as a transmission product instead of an operations and
maintenance product.

3. The standard for classification is the product category, not the possible uses
to which the product may be put. For example, if a product classification falls
in the Outside Plant category, all products that are consistent with that
category will be classified as such, even if the exact same product is
sometimes used in the customer premises and even if a particular supplier's
product is sold primarily into the customer premises market.

2.2 Principles for Construction of the Product Category Table

a. Product categories should fall into a clearly defined hierarchy of classification.

b. There are well-established rules for classification.

c. Product categories should not be separated artificially if they can be logically aggregated.

d. Product categories should have clear definitions, which lend themselves to unambiguous interpretation.

e. For each category, the level to which measurements may be aggregated shall be defined.

f. Each product category specification shall consist of standard elements.

g. The placement of the product in the hierarchy will reflect the dominant use of the product.


Table A-1 Product Category Definitions


Category Code    Category    Definition    Examples
1 Switching Equipment for the physical or virtual interconnection of
communication channels in response to a signaling system. The
switching category is broadly defined to include packet or circuit
switched architectures.
1.1 Circuit Switch Equipment for the termination of subscriber lines and/or trunk lines • End-office
and the dynamic interconnection of these ports or channels in a • Tandem
digital transmission facility. A circuit switch establishes a dedicated • Tandem access
circuit, as opposed to a virtual circuit, in response to a signal. Stored • Remote
Program Control (SPC) is the most common type of switching • Service Switching Point
equipment used at end offices and tandem offices. These systems use [SSP]
either analog or digital switching. The switching system used must • Mobile Switching Center
have the capability to send, receive and be actuated by signals, e.g., [MSC]
access line signals, or inter-office in-band or common-channel
signaling. This category includes all circuit switches regardless of
transmission medium, i.e., wireline, or wireless.
1.2 Packet Switch Equipment for switching or routing data on virtual, as opposed to
dedicated, circuits. The service is packet switched in that the
customer’s data is transported as a sequence of data blocks
(packets) that do not exceed a specified size. This packetization
permits data from many data conversations to share a given
transmission facility economically through statistical multiplexing.
Such data conversations are known as virtual circuits, which are
full duplex and connection-oriented.

Note 1 The information in this table may have changed. See the QuEST Forum web site, http://www.questforum.org/ for the latest
information.
Note 2 Product Categories listed in RED and italicized will be used for possible Data Aggregation only. Measurements must be
submitted per the lower Product Category listing.
Note 3 Bolded text in the product category definition indicates the primary function of the product category. This is the function to
use for outage measurements.
1.2.1 Public Packet Switched Equipment for the provision of connection-oriented, packet-switched • X.25 packet switch
Network (PPSN) communication services designed to provide economical data • Access concentrator / PAD
transport based on internationally standardized packet protocols. The
packet switch is the primary switching element of the network allowing
efficient connectivity to many customers. The access concentrator
concentrates traffic from lower-speed access lines for more efficient
packet-switch port usage and performs any necessary protocol
conversion via the Packet Assembler/Disassembler (PAD) function.
1.2.2 IP Packet Switch / Equipment that moves variable-length IP (Internet Protocol) packets
Router from source to destination. Routing generally uses software algorithms
to optimize one or a combination of data-transport “measurements”
such as delay, the use of reliable paths, “hops” between servers, etc.
Switching is generally faster than routing since the decision as to where
to send the packet is done in hardware, but switches are also limited to less
sophisticated algorithms than routers in determining which path the
packets should use. Most systems provide a combination of routing
and switching, as appropriate, to best serve the needs of the user.
1.2.3 Asynchronous Transfer Switching equipment that operates at OSI Level 2 (hardware layer) to
Mode (ATM) Switch move fixed-length (53-byte) data cells from source to destination over
virtual paths or channels. ATM is designed to support mixed data types
(voice, video, computer communications, etc.), provides selectable
Quality of Service guarantees and easily enables billing for data
switching services. Throughput of up to 622 Mbps is commonly
available in ATM Switches.
1.2.4 Frame Relay Switch Switching equipment that operates at OSI Level 2 (hardware) to move
variable-length Frame Relay Frames over virtual circuits from source
to destination. Data are moved without data integrity checks or flow
control at up to T3 rates.

2 Signaling Equipment for the provision of signaling, i.e., states applied to
operate and control the component groups of a
telecommunications circuit to cause it to perform its intended
function. Generally speaking, there are five basic categories of
"signals" commonly used in the telecommunications network.
Included are supervisory signals, information signals, address
signals, control signals, and alerting signals. This category
includes those signaling products that function within the
telecommunications network and excludes (possibly similar)
products that would normally provide enhanced services outside
the network, or on the customer premises such as ACD, IVR, or
voice messaging systems.
2.1 Service Control Point A signaling point that functions as a database to provide information • Service Control Point
(SCP) to another SCP or Service Switching Point (SSP). Transaction • Service nodes
Capabilities Application Part (TCAP) queries and responses are used to • Service resource facilities
communicate with the SCP as is done for 800 Data Base Service and
ABS. SCPs may support one or more services per SCP and SCPs
may be deployed singularly as stand-alone nodes, as mated pairs, or
as multiple replicates (more than 2) to increase their availability. SCPs,
connected to STPs, are associated with applications that consist of
service-specific software and a database of customer-related
information. This product category includes conventional SCP
equipment, plus other platforms such as service nodes, intelligent
peripherals, or service resource facilities, which may combine
capabilities of a SCP, SSP or that may be used to provide AIN
functionality or other enhanced services within the network.

2.2 Signaling Transfer A signaling point with the function of transferring signaling messages
Point (STP) from one signaling link to another and considered exclusively from the
viewpoint of the transfer. An STP is a specialized routing signaling
point (SP). It is an SS7-based packet switch that transfers SS7
messages to and from other SPs and is always deployed in mated pairs
for reliability. The STP uses the Message Transfer Part (MTP) and the
Signaling Connection Control Part (SCCP) of the SS7 protocol to
screen and route messages destined for other nodes in the SS7
network. It functions as an SS7 network routing hub, interfacing with
SPs only through SS7 links and not voice or data trunks. Within the
LEC CCS network structure, STPs are architecturally referred to as
either Local STPs (LSTPs) or Regional STPs (RSTPs).
2.3 Home Location Equipment to provide a permanent database used in wireless
Register (HLR) applications to identify a subscriber and to contain subscriber data
related to features and services. It stores information such as service
profiles, location and routing information for roamers, service
qualification, interface for moves, adds and changes. It communicates
with other HLRs and provides access to maintenance functions such as
fault information, performance data, and configuration parameters.
3 Transmission Equipment for the connection of the switched and interoffice
networks with individual customers. An integral part of the
distribution network is the loop, which connects the customer to
the local central office (CO), thus providing access to the
interoffice network.
3.1 Outside Plant The part of the telecommunications network that is physically located
outside of telephone company buildings. This includes cables,
supporting structures, and certain equipment items such as load
coils. Microwave towers, antennas, and cable system repeaters
are not considered outside plant.
3.1.1 Transmission Optical fiber, metallic cable, or other physical medium for the
Medium transmission of analog or digital communications.
3.1.1.1 Metallic Products Metallic as opposed to optical or wireless transmission media.
3.1.1.1.1 Metallic Conductor Metallic pairs of conductors housed in a protective cable • Metallic cable
Cable • Central office coaxial cable
• Hybrid coaxial/twisted pair
drop
3.1.1.1.2 Metallic Connectors Devices used to terminate a metallic cable. • Coaxial connectors
• Coaxial distribution
connectors
3.1.1.2 Optical Fiber Optical, as opposed to metallic or wireless transmission media.
Products
3.1.1.2.1 Optical Fiber and Glass fiber wherein light is propagated and any associated covering. • Loose tube cable
Cable • Single Tube Bundled
Cables
• Single Tube Ribbon Cables
• Tight Buffered Cables
• Indoor Fiber Optic Cables
• Single Mode Fiber
• Multi-mode Fiber
• Dispersion Shifted Fiber
3.1.1.2.2 Optical Connectors Device used to terminate an optical cable • Optical SC,ST, or MT
connectors
• Connectorized cable
assemblies, e.g., optical
fiber ribbon fanouts

3.1.1.3 Transmission Sub-systems embedded in the transmission medium other than
Sub-systems cable or connectors
3.1.1.3.1 Active Sub-systems Active sub-systems containing electronics • Coaxial drop amplifiers
• Fiber optic data links
3.1.1.3.2 Passive Optical Sub- Optical sub-systems containing no electronics • Wavelength Division
systems Multiplexer [WDM]
• Add drop multiplexers
• Fiber optic dispersion
compensators
• Optical isolators
• Filters
• Attenuators
3.1.1.3.3 Ancillary Sub-Systems Other transmission sub-systems not specifically covered in other • Surge protectors
transmission component categories. Typically passive. • Bonding and grounding
hardware
• Taps
3.1.2 Physical Equipment for the support of telephone transmission media.
Structure These physical structures include poles, towers, conduits, and
equipment enclosures such as huts.
3.1.2.1 Enclosures Enclosures for network equipment located in the outside plant. • Fiber optic splice
enclosures
• ONU enclosures
• Organizer assemblies
• Seal assemblies
• Controlled environment
Vaults

3.1.2.2 Support structures Products for the physical support of transmission media or enclosures. • Telephone poles
• Pedestals
• Microwave / radio towers
3.1.2.3 Conduits Channels for the containment of optical fiber or metallic cable. • Innerduct
• Multi-bore conduit
• PVC pipe
3.2 Transport Equipment located in the central office or at the customer
Equipment premises, but inside the network demarcation point, for the
transmission of digital or analog communication over
transmission media. This product category includes equipment
for terminating, interconnecting, and multiplexing
communications circuits.
3.2.1 Cross Connect Equipment to provide a physical termination point for physical
Systems cables and individual conductors. They can be manual or
automated, metallic or optical. Cross-connect systems, such as
distributing frames, Digital Signal Cross Connects (DSXs) and
Fiber Distributing Frames (FDFs) provide the following basic
functions: cross-connection of network distribution facilities and
equipment in the central office, electrical protection for conductive
media, test access, temporary disconnection, and termination
points for facilities and equipment.
3.2.1.1 Manual Cross Connect Equipment to provide a physical termination point for physical cables • Digital Signal Cross
Systems and individual conductors where changes in connections are performed Connect Panel (DSX)
manually. These can be metallic or optical systems such as distributing • Fiber Distribution Frame
frames or Fiber Distributing Frames (FDFs) provide the following basic (FDF)
functions: cross-connection of network distribution facilities and • Feeder Distribution
equipment in the central office, electrical protection for conductive Interface (FDI)
media, test access, temporary disconnection, and termination points for
facilities and equipment.
3.2.1.2 Digital Cross Connect Equipment to provide a physical termination point for physical cables • Digital Cross-connect
Systems and individual conductors where changes in connections are performed System (DCS)
electronically. These can be metallic or optical systems such as digital • Electronic DSX
cross connect systems (DCS) that provide cross-connection of network • Active Optical DSX
distribution facilities and equipment in the central office, electrical
protection for conductive media, test access, temporary disconnection,
and termination points for facilities and equipment.
3.2.2 Carrier Systems / Equipment for transmitting multiple communication channels over
Multiplexers a single transmission facility. This category includes equipment
for transmission over interoffice trunks, for example, from central
to remote offices.
3.2.2.1 Interoffice / Long Equipment for transmission between central offices, between
Haul exchanges, or between carriers, as opposed to transmission
between an end office and a remote location, typical of a loop
carrier.
3.2.2.1.1 Metallic Carrier System Carrier system that uses metallic transmission medium. • Analog carrier (N-,L- carrier)
• D4, D5 digital carrier
3.2.2.1.2 Optical Carrier Carrier system that uses optical transmission medium.
System
3.2.2.1.2.1 SONET / SDH Fully featured digital transmission system employing optical medium • OC-3, 12, 48, or 192
Transport Systems SONET equipment
configurable as linear or
ring.
• Similar for STM-x SDH
equipment

3.2.2.1.2.2 WDM / DWDM / Shelf level systems used for multiplexing, de-multiplexing, or • Wavelength Division
Optical Amplification amplification of optical signals. Lack the built in protection, electrical Multiplexer [WDM]
Products conversion and other features of a SONET Transport System. • Dense Wavelength Division
Multiplexer
3.2.2.1.3 Microwave Carrier system that employs fixed microwave transmission. • 6, 8, 11, or 18 gigahertz
microwave radio
3.2.2.2 Loop Carrier Equipment for deploying multiple voice or digital channels over fewer • Digital loop carrier (DLC)
physical channels than would be otherwise required (a “pair gain” • Universal digital loop carrier
function). Loop carriers are typically digital systems which employ time- (UDLC)
domain multiplexing (TDM) but may include analog systems as well. • SLC remote terminal
Loop carrier systems consist of a Central Office Terminal (COT) located • Integrated digital loop
near the switching system, a Remote Terminal (RT) located near the carrier
customer to be served and a transmission facility connecting the COT • Analog loop carrier
to the RT. Individual communications circuits (such as POTS and
Foreign Exchange [FX]) are accepted as separate inputs at the COT
(RT), time-division multiplexed (in a digital loop carrier) by the loop
carrier system and reproduced at the RT (COT).

There is an analog-to-digital (A/D) conversion of analog inputs to the


DLC and these signals, which are carried digitally within the DLC,
undergo a digital-to-analog (D / A) conversion when output at the COT
or RT. The transmission facility used by a loop carrier may be metallic
cable pairs, repeated metallic cable pairs, or optical fibers.

3.2.3 Line Terminating Equipment to provide the termination point for voice-grade and voice- • Tall conventional
Equipment / grade compatible facilities and equipment in a central office. It is distributing frames
Distributing Frames composed of protectors, connectors and terminal strips or blocks. • Low-Profile Conventional
Distributing frames are categorized as either conventional or modular. Distribution Frames
(LPCDFs)
• Conventional protector
frames
• Combined Main Distributing
Frame (CMDF)
• Subscriber Main Distributing
Frame (SMDF)
• Trunk Main Distributing
Frame (TMDF)
• Intermediate Distributing
Frame (IDF)
• Tie-Pair Distributing Frame
(TPDF).
• Office repeater bays
3.2.4 Digital Subscriber Line Equipment for the transport of high-speed digital data on the embedded • ISDN
(DSL) copper plant. DSL typically will operate over nonrepeatered, POTS- • HDSL
like, conditioned unloaded loops out to CSA ranges. This product • ADSL
category includes central office and remote units, regenerators or range • DDS
extenders, and supporting equipment.
3.3 Wireless Equipment for analog or digital transmission to the subscriber
Transmission unique to wireless services. This category does not include
interoffice or long haul wireless carrier systems.

3.3.1 Base Station Equipment which provides the interface between wireless systems • BSC
Equipment and the Public Switched Telephone Network (PSTN). It provides, • BSS
for example, electrical signaling isolation as well as switching, routing,
billing, and features capabilities. It provides subsystems for vocoding
and hand-off decision selection.
3.3.2 Base Transceiver Equipment that provides the radio link to the mobile subscribers. It • BTS
System (BTS) is connected to the BSC through a backhaul interface between the BSC
and BTS for both vocoded and overhead packet traffic. This includes
terminals and repeaters.
3.3.3 Pilot Beacon Unit Equipment whose primary purpose is to transmit an ANSI J-STD-008
(PBU) Pilot channel and ANSI J- STD-008 Sync channel and a partial
ANSI J-STD-008 Paging channel. The PBU is intended to notify a
mobile unit of a change in CDMA coverage and can be used to assist in
the execution of cellular CDMA-AMPS and inter-frequency CDMA-
CDMA hand-off. It is designed with the capability for extended
temperature and environmental operation ranges.
4 Operations & Maintenance Equipment, systems, and services for the management, upkeep,
diagnosis and repair of the communications network.
4.1 Test Systems Equipment to support testing of the network. This category
includes permanently installed equipment used to provide a
centralized test capability or local test access, as opposed to
portable equipment, as might be carried by a craftsperson.
4.1.1 Test Access Equipment to provide test access to transmission circuits. Test access
Equipment equipment is in series with the customer circuit at all times and
therefore directly affects the circuit reliability. This equipment is
designed with transmission equipment issues in mind. This equipment
may have analog and perhaps a variety of digital (i.e., T1, E1) types.

4.1.2 Test Equipment, Equipment to perform tests on transmission circuits. This equipment is
Embedded designed with transmission equipment issues in mind. Test equipment
is NOT generally in series with the customer circuit and may be
connected to a variety of access equipment and network elements with
integral access features. This equipment may have analog and
perhaps a variety of digital (i.e., T1, E1) types. Failure of this
equipment doesn't bring down customer circuits; however, it inhibits the
ability to maintain the network and to restore lost service.
4.1.3 Test Support Software Computer software that runs on a general purpose computer (office
environment) and perhaps the maintenance network that the computer
uses to communicate with the CO access and test equipment.
4.2 Operations Systems that provide TMN (Telecommunication Management
Support Systems Network) compliant, flexible, scaleable, and interoperable
solutions to automate service activation, service assurance, and
network capacity management processes to worldwide existing
and emerging network services and equipment providers.
4.2.1 On Line Critical Real time network management systems, demanding high • Network traffic management
availability, typically 24 hours a day and 7 days per week. • Surveillance of 911
• Fire alarms
4.2.2 On Line Non-critical Real time network management systems with lower availability • Provisioning
demands than on line critical systems. • Dispatch
• Maintenance
4.2.3 Off Line Traditional business systems that are run off line sometimes in batch • Inventory
mode, typically overnight and do not have high availability expectations. • Billing records
• Service creation platform

4.3 Ancillary Operations Tools, test equipment, and other specialized products used to • Optical splicers
and Maintenance support the operations and maintenance of the communications • Single fiber fusion splicers
Products network but not part of the permanent network • Mass fiber fusion splicers
• Mechanical splicers
• Portable test equipment
• Optical connector tools
• Cleavers
5 Common Systems Any of a variety of specialized generic, shared equipment to
support network elements. Common systems include power
systems and the Network Equipment-Building System (NEBS) that
provides space and environmental support for network elements.
These systems are located in central offices and remote building
locations.
5.1 Synchronization Equipment for operating digital systems at a common clock rate • Stratum 1,2, 3E domestic,
(frequency synchronization). This category includes primary TNC, LNC and Type 1
reference sources and other timing signal generators that produce a International
timing signal traceable to UTC. • GPS timing receivers,
cesium, loran, or CDMA RF
pilot timing reference
generators.

5.2 General Purpose A category reserved for computer complexes (one or more • Terminals
Computers interconnected machines) that perform general business functions for • PCs
a TSP but which do not provide any telephony transmission or storage • Workstations
service to subscribers or other TSP customers, or which may provide • Mini, mid, mainframes
such services, but are not sold to the service provider as part of a
system designed exclusively for that purpose. The purposes to which
such machines may be put include but are not limited to:
• Accounting systems
• Billing systems
• Legal systems
• Ordering systems
• Business Information systems
• HR functions
• Engineering and support functions
• Marketing and Sales functions
5.3 Power Systems Equipment for the provision of power to network equipment. Power • AC rectifiers/battery
systems provide two principal functions: the conversion of the chargers
commercial AC power source to DC voltages required by the network • Battery systems
equipment and the generation and distribution of emergency (reserve) • Uninterruptible Power
power when the commercial power is interrupted. This category also Supplies (UPS)
includes the ringing plant, a redundant plant which supplies the ringing • DC to AC inverters
voltage, frequency, tones, and interrupter patterns • DC to DC bulk converters
• AC and DC switch gear
• Ring generator
• Power distribution panels

6 Customer Premises Equipment installed beyond the network demarcation point.
Although commonly installed on the subscriber’s premises,
equipment with essentially identical function installed in the
service provider’s facility may also be classified as customer
premises equipment.
6.1 Enhanced Services Platforms Systems that provide an environment in which service-specific
application programs can execute and an infrastructure by which
those application programs can provide enhanced services.
Although each enhanced services platform has a corresponding
service creation environment, that creation environment is
sometimes packaged separately and may execute on a different
platform.
6.1.1 Interactive Voice Equipment used to allow menu navigation and information retrieval,
Response (IVR) often from legacy databases external to the IVR platform itself.
Platforms
6.1.2 Messaging Platforms Equipment for storage and retrieval of voice and/or fax messages Voice mail systems
6.1.3 Multi-Application Equipment which provides an environment rich in capabilities so that Unified/Universal Messaging
Platforms multiple, possible disparate services can be provided concurrently. (system providing a subscriber
the means, from a given device,
to manipulate messages
originated on like or different
devices. Such devices include,
but are not limited to,
conventional telephone
handsets, wireless handsets,
PC terminals, fax machines, and
email)

6.2 Terminal Equipment connected to the network demarcation point that
Equipment provides a service to the subscriber. Terminal equipment
includes telephone sets, whether wireline, cordless, cellular, PCS,
or other voice terminals, fax machines, answering machines,
modems, data service units (DSUs), or ISDN terminal adapters.
6.2.1 Voice Terminals Conventional, wireless, cellular, PCS, or other voice terminal
equipment.
6.2.1.1 Wireline Telephone Telephone sets connected to conventional wireline (POTS) circuits. • POTS telephone sets
Sets • Cordless telephones
6.2.1.2 Wireless Subscriber The subscriber user terminal made to transmit and receive voice and/or • Wireless single mode user
User Terminals data communication using Telecommunication Infrastructure equipment terminal
not requiring hard lines as a means of transport. User terminals may • Wireless mobile user
be of any functional technology available for public use. terminal
• Wireless stationary user
terminal
• Wireless multi-mode user
terminal
• Wireless multi-purpose user
terminal
• Wireless Global user
terminal
6.2.2 Fax equipment Equipment for sending or receiving facsimile (fax) over conventional
voice-grade lines.
6.2.3 Data Modems Equipment for digital communications over voice-grade lines
6.2.4 Digital Data Service Equipment for the interconnection of data terminal equipment (DTE) • DDS CSU / DSU
Units with a digital communications service. Such equipment typically • ISDN CSU / DSU
provides a network interface and one or more DTE interfaces and may • IDSN terminal adapter
be configurable. • T1 CSU DSU
6.3 Automatic Call Equipment for the distribution of incoming calls to any of a number
Distribution (ACD) of destinations based on some programmed logic. ACD systems are
systems typically used in Customer Support service or sales centers.
6.4 Private Branch Equipment to provide circuit switched voice and fax
Exchange (PBX) communications services, optimized for medium to large sized
customer sites. Now is evolving to utilize ATM and IP networks and
support multimedia communications.
6.5 Small Communications Equipment to provide circuit switched voice and FAX
System (Key communications services, optimized for small to medium sized
Telephone System) customer sites. Now is evolving to utilize IP networks.
7 Services Result generated by activities at the interface between the supplier
and the customer and by supplier internal activities to meet the
customer needs.
NOTES:
1. The supplier or the customer may be represented at the
interface by personnel or equipment,
2. Customer activities at the interface with the supplier may be
essential to the service delivery,
3. Delivery or use of tangible products may form part of the
service delivery, and
4. A service may be linked with the manufacture and supply of
tangible product. [4]
7.1 Installation Service Contracted service to position, configure, and/or adjust a product.
7.2 Engineering Service Contracted service to design and/or develop a product. This includes,
but is not limited to, the documentation necessary for positioning,
configuring, connecting, and/or adjusting.
7.3 Maintenance Service Contracted service to maintain customer’s equipment and/or systems.
7.4 Repair Service Contracted service to repair customer’s equipment and/or systems
7.5 Customer Support Service: Contracted service to process customer requests. This service may include call answering, response to general inquiries, information requests, and information sharing. When the customer support service center also handles product problem reports, those problem reports shall be included in the appropriate product category measurements and not in this category. Examples: Call Center, web-based support, Dispatch Centers, etc.
7.6 Procurement Services: Contracted services for the procurement of reuse and new equipment. Typically includes refurbishing/retesting.
7.7 Logistical Services: Contracted service for the distribution of equipment between the organization and customer. Typically includes strictly warehousing and transportation.
7.8 Reserved for future use.
7.9 General Support Service: Contracted service that is not included in another product category.
8 Components and Subassemblies: Individual components or assemblies provided for use in telecommunications systems excluding those already covered by a specific product category in another product family. These items would typically be used by other suppliers and not sold directly to service providers except as replacement parts.
8.1 Components: Individual self-contained devices without separable parts. Examples: crystals, ASICs, lasers, optical detectors, any individual piece part.
8.2 Subassemblies: A device made up of a number of components for use in a telecommunications system. This device is a portion of the completed system, but would not make up the entire system.
8.2.1 Simple: Less than 11 components or 49 solder connections, excluding connectors. Examples: VCXOs.

8.2.2 Medium Complexity: More than 10 components or 48 solder connections but less than 51 components or 241 solder connections, excluding connectors. Examples: multi die hybrids; optical assemblies; DC/DC converter “bricks”.
8.2.3 High Complexity: More than 50 components or 240 solder connections but less than 501 components or 2401 solder connections, excluding connectors. Examples: medium sized printed circuit assemblies.
8.2.4 Very High Complexity: More than 500 components or 2400 solder connections, excluding connectors. Examples: single board computers.

Note 1 The information in this table may have changed. See the QuEST Forum web site, http://www.questforum.org/ for the latest
information.
Note 2 Product Categories listed in RED and italicized will be used for possible Data Aggregation only. Measurements must be
submitted per the lower Product Category listing.
Note 3 Bolded text in the product category definition indicates the primary function of the product category. This is the function to
use for outage measurements.

3. Measurement Applicability Table (Normalized Units)

3.1 Measurements Without Normalization Factors

The measurements Fix Response Time (FRT), Overdue Fix Responsiveness (OFR), and On-Time Delivery (OTD) are applicable and required for ALL product categories, with the exception of OTD for Customer Support Service (category 7.5), where resolution time is the service quality measurement. These measurements (FRT, OFR, and OTD) do not require product-specific normalization. In the interest of saving space, they are not listed in the following table, but data must be submitted for each of these three measurements for all products. Use the following table to determine the normalization units and applicability of the rest of the measurements.

3.2 Other Rules and References

Where the normalization factor is traffic capacity based, such as DS1, OC-1, DSL
or Terminations, the calculation shall be based on the true useable traffic
capacity. Equipment within the system used to provide protection for the main
traffic path shall not be included, as it does not add useable capacity to the
system.

Software measurements are based on the three most dominant releases.

% = 100 x Quantity Defective / Total Quantity. “%” is applicable to "Software Only" measurements.

“NA” means the measurement is not applicable for the product category.

“None” means that no common normalization factor has been identified for the
product category; however, data shall be submitted for the measurement.
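
As an illustration only of how these rules combine, the following sketch (in Python; the function and argument names are assumptions, not part of TL 9000) shows the arithmetic for an annualized, normalized rate and for the "%" formula used by "Software Only" measurements.

    # Illustrative sketch only; names and structure are assumptions, not handbook requirements.

    def normalized_annual_rate(events_in_period, normalization_units, afactor=12.0):
        """Annualized events per normalization unit (e.g., returns/termination/year).

        events_in_period    -- count of events (e.g., returns) in the report period
        normalization_units -- useable traffic capacity or population (protection
                               equipment excluded, per the rule above)
        afactor             -- Annualization Factor (12 for calendar-month reporting)
        """
        if normalization_units <= 0:
            return None  # no common normalization factor ("None"/"NA" cases)
        return afactor * events_in_period / normalization_units

    def percent_defective(quantity_defective, total_quantity):
        """% = 100 x Quantity Defective / Total Quantity ("Software Only" measurements)."""
        return 100.0 * quantity_defective / total_quantity

    # Example: 3 returns in a calendar month against 4,000 useable terminations,
    # and 2 defective patches out of 250 delivered.
    print(normalized_annual_rate(3, 4000))   # 0.009 returns per termination per year
    print(percent_defective(2, 250))         # 0.8 (%)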

The column headings in Table A-2 are general descriptions covering several sub-measurements in some cases. For cross-references to the detailed descriptions of the measurements elsewhere in this document, find measurement/sub-measurement symbols in Table A-5, Measurement Summary Listing.

3.3 Measurement Summary Listing

Table A-5 is a listing of the measurements included in this handbook with the
symbols used in data reporting, the applicability to hardware, software, and/or
services (H, S, V), and a reference to the table in this handbook with data
reporting details. The symbols listed here are referenced by the normalization
unit and applicability table to clarify the general descriptions used as column
headings.


Table A-2 Measurement Applicability Table (Normalized Units)


Product Category Hardware and Common Software Only
Hardware Software (Per Applicable Option)
Code Description Downtime Outage Return Rate Problem Corrective Feature Software Release
Performance Frequency Reports Patch Patch Update Application
H, S H, S H H,S Quality Quality Quality Aborts
TL 9000 Measurement Symbols (see Table A-5) SO2; SO4; SO1;SO3; RR (all) NPR (all) CPQ (all) FPQ (all) SWU (all) RAA (all)
RQMS Alternative Symbols (see Table A-5) r,h,DPMs,c r,h,OFMs,c IPR (all) DPQ (all) DFP (all) DSU (all) RAQ (all)
1 Switching
1.1 Circuit Switch Minutes/system/year Outages/system/year Returns/termination/year Problem Reports/system/year % % % %
1.2 Packet Switch
1.2.1 Public Packet Switched Network (PPSN) Minutes/system/year Outages/system/year Returns/termination/year Problem Reports/system/year % % % %
1.2.2 IP Packet Switch/Router Minutes/NC/year Outages/NC/year Returns/termination/year Problem Reports/system/year % % % %
1.2.3 Asynchronous Transfer Mode (ATM) Switch Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
1.2.4 Frame Relay Switch Minutes/system/year Outages/system/year Returns/termination/year Problem Reports/system/year % % % %
2 Signaling
2.1 Service Control Point (SCP) Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %

2.2 Signaling Transfer Point (STP) Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
2.3 Home Location Register (HLR) NA NA NA Problem Reports/system/year % % % %
3 Transmission
3.1 Outside Plant
3.1.1 Transmission Medium
3.1.1.1 Metallic Cable Products
3.1.1.1.1 Metallic Conductor Cable NA NA NA None NA NA NA NA
3.1.1.1.2 Metallic Connectors NA NA NA Problem Reports/unit shipped/year NA NA NA NA
3.1.1.2 Optical Fiber Products
3.1.1.2.1 Optical Fiber and Cable NA NA NA None NA NA NA NA

3.1.1.2.2 Optical connectors NA NA NA Problem Reports/unit shipped/year NA NA NA NA
3.1.1.3 Transmission Sub-systems
3.1.1.3.1 Active Sub-systems NA NA Returns/unit/year Problem Reports/unit/year NA NA NA NA
3.1.1.3.2 Passive Optical Sub-systems NA NA Returns/unit/year Problem Reports/unit/year NA NA NA NA
3.1.1.3.3 Ancillary Sub-systems NA NA Returns/unit/year Problem Reports/unit/year NA NA NA NA
3.1.2 Physical Structure
3.1.2.1 Enclosures NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA

3.1.2.2 Support Structures NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA
3.1.2.3 Conduits NA NA Returns/unit/year Problem Reports/meter shipped/year NA NA NA NA
3.2 Transport Equipment
3.2.1 Cross Connect Systems
3.2.1.1 Manual Cross Connect Systems NA NA Returns/DS1/year Problem Reports/system/year NA NA NA NA
3.2.1.2 Digital Cross Connect Systems Minutes/DS1/year Outages/DS1/year Returns/DS1/year Problem Reports/system/year % % % %
3.2.2 Carrier Systems/Multiplexers
3.2.2.1 Interoffice/Long Haul
3.2.2.1.1 Metallic Carrier System Minutes/DS1/year Outages/DS1/year Returns/DS1/year Problem Reports/system/year % % % %
3.2.2.1.2 Optical Carrier System
3.2.2.1.2.1 SONET/SDH Transport Systems Minutes/OC-1/year Outages/OC-1/year Returns/OC-1/year Problem Reports/network element/year % % % %
3.2.2.1.2.2 WDM/DWDM/Optical Amplification Products Minutes/OC-1/year Outages/OC-1/year Returns/OC-1/year Problem Reports/network element/year % % % %
3.2.2.1.3 Microwave Minutes/DS1/year Outages/DS1/year Returns/DS1/year Problem Reports/network element/year % % % %
3.2.2.2 Loop Carrier Minutes/DS1/year Outages/DS1/year Returns/DS1/year Problem Reports/DS1/year % % % %
3.2.3 Line Terminating Equipment/Distributing Frames NA NA Returns/termination/year Problem Reports/termination/year % % % %

3.2.4 Digital Subscriber Line (DSL) NA NA Returns/DSL line/year Problem Reports/DSL line/year % % % %
3.3 Wireless Transmission
3.3.1 Base Station Controller (BSC) and Base Station System (BSS) Minutes/system/year Outages/system/year Returns/unit/year Problem Reports/system/year % % % %
3.3.2 Base Transceiver System (BTS) Minutes/system/year Outages/system/year Returns/unit/year Problem Reports/system/year % % % %
3.3.3 Pilot Beacon Unit (PBU) Minutes/system/year Outages/system/year Returns/unit/year Problem Reports/system/year % % % %
4 Operations & Maintenance
4.1 Test Systems
4.1.1 Test Access Equipment NA NA Returns/unit/year Problem Reports/system/year % % % %

4.1.2 Test Equipment, Embedded NA NA Returns/unit/year Problem Reports/system/year % % % %
4.1.3 Test Support Software Minutes/system/year Outages/system/year NA Problem Reports/system/year % % % %
4.2 Operations Support Systems
4.2.1 On Line Critical Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
4.2.2 On Line Non-Critical Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
4.2.3 Off Line Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
4.3 Ancillary Operations and Maintenance Products NA NA NA None NA NA NA NA
5 Common Systems

5.1 Synchronization Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year NA NA NA NA
5.2 General Purpose Computers Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
5.3 Power Systems Minutes/system/year Outages/system/year Returns/unit/year Problem Reports/system/year NA NA NA NA
6 Customer Premises
6.1 Enhanced Services Platforms
6.1.1 Interactive Voice Response (IVR) Platforms Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
6.1.2 Messaging Platforms Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
6.1.3 Multi-Application Platforms Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
6.2 Terminal Equipment

6.2.1 Voice Terminals
6.2.1.1 Wireline Telephone Sets NA NA Returns/unit/year Problem Reports/unit shipped/year % % % %
6.2.1.2 Wireless Subscriber User Terminals NA NA Returns/unit/year Problem Reports/unit shipped/year % % % %
6.2.2 Fax Equipment NA NA Returns/unit/year Problem Reports/unit shipped/year % % % %
6.2.3 Data Modems NA NA Returns/unit/year Problem Reports/unit shipped/year % % % %
6.2.4 Digital Data Service Units NA NA Returns/unit/year Problem Reports/unit shipped/year % % % %

6.3 Automatic Call Distribution (ACD) Systems Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
6.4 Private Branch Exchange (PBX) Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %
6.5 Small Communications System (Key Telephone System) Minutes/system/year Outages/system/year Returns/system/year Problem Reports/system/year % % % %


Table A-2 Measurement Applicability Table (Normalization Units)


Product Category Applicability and Normalization Units for Services
Code Description Service Problem Reports Service Quality Return Rate
TL 9000 Measurement Symbols (see Table A-5) NPR (all) SQ RR (all)
RQMS Alternative Symbols (see Table A-5) NA NA
7 Services
7.1 Installation Service Problem Reports/job/year % audits conforming NA
7.2 Engineering Service Problem Reports/job/year NA NA
7.3 Maintenance Service Problem Reports/unit maintained/year % visits without maintenance callbacks NA
7.4 Repair Service Problem Reports/unit repaired/year % of successful repairs NA
7.5 Customer Support Service Problem Reports/1000 requests/year % requests resolved within agreed time NA
7.6 Procurement Services Problem Reports/unit/year NA Returns/unit/year
7.7 Logistical Services Problem Reports/unit/year NA NA
7.8 Reserved for future use
7.9 General Support Service Problem Reports/unit/year % transactions without defect NA


Product Category Hardware and Common Software Only


Hardware Software
Code Description Downtime Outage Return Problem Corrective Feature Software Release
Performance Frequency Rate Reports Patch Quality Patch Update Application
H, S H, S H H,S Quality Quality Aborts
TL 9000 Measurement Symbols (see Table A-5) SO2; SO4; SO1;SO3; RR (all) NPR (all) CPQ (all) FPQ (all) SWU (all) RAA (all)
RQMS Alternative Symbols (see Table A-5) r,h,DPMs,c_ r,h,OFMs,c IPR (all) DPQ (all) DFP (all) DSU (all) RAQ (all)
8 Components and Subassemblies
8.1 Components NA NA NA Problem Reports/unit shipped/year NA NA NA NA
8.2 Subassemblies
8.2.1 Simple NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA
8.2.2 Medium Complexity NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA
8.2.3 High Complexity NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA
8.2.4 Very High Complexity NA NA Returns/unit/year Problem Reports/unit shipped/year NA NA NA NA

Note 1 The information in this table may have changed. See the QuEST Forum web site, http://www.questforum.org/ for the latest
information.
Note 2 Measurements FRT, OFR & OTD are applicable and must be reported for all product categories except for OTD for 7.5.
Note 3 Product Categories listed in RED and italicized will be used for possible Data Aggregation only. Measurements must be
submitted per the lower Product Category listing.
Note 4 If the normalization factor contains the word “shipped”, then the quantity shipped in the 12 months ending prior to the
month being reported shall be used.

4. Equivalency Tables

Tables A-3 and A-4 are included for convenience only.

Table A-3 Transmission Standard Designations and Conversions

Electrical Frequency Equivalent


NORTH AMERICAN
DS0 64 Kb 1 Trunk
DS1 1.544 Mb 24 DS0
VT 1.5 1.728 Mb 1 DS1, 24 DS0
DS1C 3.152 Mb 2 DS1, 48 DS0
DS2 6.312 Mb 4 DS1, 96 DS0
DS3 44.736 Mb 28 DS1, 672 DS0
STS-1 51.84 Mb 1 DS3, 28 DS1, 672 DS0
STS-3 155.52 Mb 3 DS3, 84 DS1, 2,016 DS0
STS-12 622.08 Mb 12 DS3, 336 DS1, 8,064 DS0
STS-48 2488.32 Mb 48 DS3, 1,344 DS1, 32,256 DS0
STS-192 9953.28 Mb 192 DS3, 5,376 DS1, 129,024 DS0
INTERNATIONAL (PDH)
E1 - 2 Mbits/sec 2.048 Mb 30 64 Kb Channels
E2 - 8 Mbits/sec 8.448 Mb 4 2 Mbit/s, 120 64 Kb Channels
E3 - 34 Mbits/sec 34.368 Mb 4 8 Mbit/s, 16 2 Mbit/s, 480 64 Kb Channels
E4 - 140 Mbits/sec 139.264 Mb 4 34 Mbit/s, 64 2 Mbit/s, 1,920 64 Kb Channels
565 Mbits/sec 636.000 Mb 4 140 Mbit/s, 16 34 Mbit/s, 64 8 Mbit/s, 256 2 Mbit/s, 7,680 64 Kb Channels
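
The conversions in Table A-3 can be used when expressing a product's useable traffic capacity in a common unit for normalization (see Section 3.2). The following sketch is illustrative only; the dictionary and function names are assumptions and cover only the North American designations shown above.

    # Illustrative sketch only: DS1 equivalents taken from Tables A-3 and A-4.
    DS1_EQUIVALENTS = {
        "DS1": 1, "DS1C": 2, "DS2": 4, "DS3": 28,
        "STS-1": 28, "OC-1": 28,
        "STS-3": 84, "OC-3": 84,
        "STS-12": 336, "OC-12": 336,
        "STS-48": 1344, "OC-48": 1344,
        "STS-192": 5376, "OC-192": 5376,
    }

    def useable_capacity_in_ds1(facilities):
        """Sum DS1 equivalents for (designation, count) pairs.

        Only true useable traffic capacity should be counted; equipment that
        protects the main traffic path is excluded before calling this.
        """
        return sum(DS1_EQUIVALENTS[name] * count for name, count in facilities)

    # Example: two working OC-12 facilities plus one DS3 equals 700 DS1s.
    print(useable_capacity_in_ds1([("OC-12", 2), ("DS3", 1)]))  # 700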


Table A-4 Optical and Electrical Equivalency

Optical Electrical Frequency Equivalent


NORTH AMERICAN (SONET)
OC-1 STS-1 51.84 Mb 1 OC-1, 1 DS3, 28 DS1, 672 DS0
OC-3 STS-3 155.52 Mb 3 OC-1, 3 DS3, 84 DS1, 2,016 DS0
OC-12 STS-12 622.08 Mb 12 OC-1, 12 DS3, 336 DS1, 8,064 DS0
OC-48 STS-48 2,488.32 Mb 48 OC-1, 48 DS3, 1,344 DS1, 32,256 DS0
OC-192 STS-192 9,953.28 Mb 192 OC-1, 192 DS3, 5,376 DS1, 129,024 DS0
OC-768 Not available 39,680 Mb Not available
OC-1536 158,720 Mb Not available
INTERNATIONAL (SDH)
STM-1o (OC-3) STM-1e 155.52 Mb 1 E4, 4 E3, 64 E1, 1,920 Channels
STM-4o (OC-12) STM-4e 622.08 Mb 4 E4, 16 E3, 256 E1, 7,680 Channels
STM-16o (OC-48) STM-16e 2,488.32 Mb 16 E4, 48 E3, 1,024 E1, 30,720 Channels
STM-64o (OC-192) STM-64e 9,953.28 Mb 64 E4, 192 E3, 4,096 E1, 122,024 Channels
Not applicable VC-11 (VT1.5) 1.644 Mb (1.544 Mb) 1 DS1
Not applicable VC-12 (E1) 2.240 Mb (2.048 Mb) 1 E1 (2 Mb)
Not applicable VC-2 (VT6) 6.784 Mb (6.312 Mb)
Not applicable VC-3 (E3) 48.960 Mb (34.368 Mb) 1 E3 (34 Mb)
Not applicable VC-4 (E4) 150.336 Mb (139.264 Mb) 1 E4 (140 Mb)


5. Measurement Summary Listing

Table A-5 is a listing of the measurements included in this handbook showing: (1)
the symbols used in data reporting, (2) the applicability to hardware, software,
and/or services (H, S, V), and (3) a reference to the table with data reporting
details. The symbols listed here are also included in Table A-2, Measurement
Applicability Table (Normalized Units), to clarify the general descriptions in the
column headings.

Table A-5 Measurements Summary Listing



Paragraph; Measurement / Sub-Measurement; Measurement Symbol; Sub-Measurement Symbol; Applicability (H/S/V); Reported Items (Table); Compared or Research Data
5.1 Number of Problem Reports NPR H,S,V 5.1-3
Formulas: Table 5.1-1
H/S Critical Problem Reports per NPR1 H,S compared
Normalization Unit
H/S Major Problem Reports per NPR2 H,S compared
Normalization Unit
H/S Minor Problem Reports per NPR3 H,S compared
Normalization Unit
Service Problem Reports per NPR4 V compared
Normalization Unit
Number of Problem Reports - RQMS IPR H,S 5.1-4
Alternative Formulas: Table 5.1-2
Incoming Critical Problem Reports per IPR1 H,S compared
System per Month
Incoming Major Problem Reports per IPR2 H,S compared
System per Month
Incoming Minor Problem Reports per IPR3 H,S compared
System per Month
5.2 Problem Report Fix Response Time FRT H,S,V 5.2-3
Formulas: Table 5.2-1
H/S Major Problem Reports Fix Response FRT2 H,S compared
Time
H/S Minor Problem Reports Fix Response FRT3 H,S compared
Time
Service Problem Reports Fix Response FRT4 V compared
Time
Problem Report Fix Response Time - RQMS ORT H,S 5.2-4
Alternative Formulas: Table 5.2-2
% Major Problems Closed On Time ORT2 H,S compared
% Minor Problems Closed On Time ORT3 H,S compared
5.3 Overdue Problem Report Fix OFR H,S,V 5.3-3
Responsiveness Formulas: Table 5.3-1

H/S Major Overdue Problem Reports Fix OFR2 H,S research
Responsiveness
H/S Minor Overdue Problem Reports Fix OFR3 H,S research
Responsiveness
H/S Service Overdue Problem Reports Fix OFR4 V research
Responsiveness
Overdue Problem Report Fix Responsiveness - OPR H,S 5.3-4
RQMS Alternative Formulas: Table 5.3-2
% Rate of Closures of Overdue Problem OPR2 H,S research
Reports – Major
% Rate of Closures of Overdue Problem OPR3 H,S research
Reports - Minor
5.4 On-Time Delivery OTD H,S,V 5.4-2
Formulas: Table 5.4-1
On-Time Installed System Delivery OTIS H,S,V compared
On-Time Items Delivery OTI H,S compared
On-Time Service Delivery OTS V compared
6.1 System Outage SO H,S 6.1-4
Formulas: Table 6.1-1
Annualized Weighted Outage Frequency SO1 H,S compared
Annualized Weighted Downtime SO2 H,S compared
Annualized Supplier Attributable Outage SO3 H,S compared
Frequency
Annualized Supplier Attributable Downtime SO4 H,S compared
System Outage - RQMS Alternative for End SOE H,S 6.1-5
Office and/or Tandem Office, Wireless
Products and NGDLC Products Formulas:
Table 6.1-2
Supplier Attributable Total Outage Minutes rDPMs H,S compared
per System per Year – Remote Only
Supplier Attributable Total Outage Minutes hDPMs H,S compared
per System per Year - Host Only
Service provider Attributable Total Outage rDPMc H,S compared
Minutes per System per Year - Remote
Only
Service provider Attributable Total Outage hDPMc H,S compared
Minutes per System per Year - Host Only
Supplier Attributable Total Outages per rOFMs H,S compared
System per Year - Remotes
Supplier Attributable Total Outages per hOFMs H,S compared
System per Year - Hosts
Service Provider Attributable Total rOFMc H,S compared
Outages per System per Year - Remotes

Service Provider Attributable Total hOFMc H,S compared
Outages per System per Year – Hosts
System Outage - RQMS Alternative - General SOG H,S 6.1-6
Series Formulas: Table 6.1-3
Total Outage Minutes per System per Year DPM H,S compared
– Overall
Total Outage Minutes per System per Year DPMs H,S compared
- Supplier Attributable
Total Outages per Year - Overall OFM H,S compared
Total Outages Per Year - Supplier OFMs H,S compared
Attributable
7.1 Return Rates RR H 7.1-2
Formulas: Table 7.1-1
Initial Return Rate IRR H research
One-Year Return Rate YRR H research
Long-Term Return Rate LTR H research
Normalized One-Year Return Rate NYR H compared
8.1.5 Release Application Aborts RAA S 8.1.5-3
Formulas: Table 8.1.5-1
Release Application Aborts - Release N RAA0 S compared
Release Application Aborts - Release N-1 RAA1 S compared
Release Application Aborts - Release N-2 RAA2 S compared
Release Application Aborts - RQMS Alternative RAQ S 8.1.5-4
Formulas: Table 8.1.5-2
Cumulative % of Systems Experiencing an RAQ0 S compared
Abort during Release Application -
Release N
Cumulative % of Systems Experiencing an RAQ1 S compared
Abort during Release Application -
Release N-1
Cumulative % of Systems Experiencing an RAQ2 S compared
Abort during Release Application -
Release N-2
Cumulative Number of Release Application Rar0 S compared
Attempts - Release N
Cumulative Number of Release Application Rar1 S compared
Attempts - Release N-1
Cumulative Number of Release Application Rar2 S compared
Attempts - Release N-2
8.1.6 Corrective Patch Quality CPQ S 8.1.6-3
Formulas: Table 8.1.6-1
Defective Corrective Patches - CPQ0 S compared
Release N

Defective Corrective Patches - CPQ1 S compared
Release N-1
Defective Corrective Patches - CPQ2 S compared
Release N-2
Corrective Patch Quality - RQMS Alternative DCP S 8.1.6-4
Formulas: Table 8.1.6-2
Monthly Number of Defective Corrective DCP0 S compared
Patches Identified - Release N
Monthly Number of Defective Corrective DCP1 S compared
Patches Identified - Release N-1
Monthly Number of Defective Corrective DCP2 S compared
Patches Identified - Release N-2
Monthly Number of Corrective Patches CPr0 S compared
Delivered - Release N
Monthly Number of Corrective Patches CPr1 S compared
Delivered - Release N-1
Monthly Number of Corrective Patches CPr2 S compared
Delivered - Release N-2
8.1.6 Feature Patch Quality FPQ S 8.1.6-3
Formulas: Table 8.1.6-1
Defective Feature Patches - Release N FPQ0 S research
Defective Feature Patches - Release N-1 FPQ1 S research
Defective Feature Patches - Release N-2 FPQ2 S research
Feature Patch Quality - RQMS Alternative DFP S 8.1.6-4
Formulas: Table 8.1.6-2
Monthly Number of Defective Feature DFP0 S research
Patches Identified - Release N
Monthly Number of Defective Feature DFP1 S research
Patches Identified - Release N-1
Monthly Number of Defective Feature DFP2 S research
Patches Identified - Release N-2
Monthly Number of Feature Patches FPr0 S research
Delivered - Release N
Monthly Number of Feature Patches FPr1 S research
Delivered - Release N-1
Monthly Number of Feature Patches FPr2 S research
Delivered - Release N-2
8.1.7 Software Update Quality SWU S 8.1.7-3
Formulas: Table 8.1.7-1
Defective Software Updates - Release N SWU0 S compared
Defective Software Updates - Release N-1 SWU1 S compared
Defective Software Updates - Release N-2 SWU2 S compared
Software Update Quality – RQMS Alternative DSU S 8.1.7-4
Formulas: Table 8.1.7-2

Cumulative Number of Defective Software DSU0 S compared
Updates – Release N
Cumulative Number of Defective Software DSU1 S compared
Updates – Release N-1
Cumulative Number of Defective Software DSU2 S compared
Updates – Release N-2
9.1 Service Quality SQ V 9.1-3
Formulas: Table 9.1-2
Conforming Installations and/or SQ1 V research
Engineering
Successful Maintenance Visits SQ2 V compared
Successful Repairs SQ3 V compared
Conforming Customer Support Service SQ4 V compared
Resolutions
Conforming Support Service Transactions SQ5 V research


Appendix B TL 9000 Customer Satisfaction Measurements Guidelines

The TL 9000 Quality Management System Requirements Handbook contains requirements for measuring customer satisfaction. The design of the mechanism for collecting data from customers will necessarily be unique to each organization. This appendix offers guidelines to assist organizations in the design or review of their own customer feedback program.

1 Profile for Customer Satisfaction Measurements

B. Measurements Profile for Customer Satisfaction Mechanism
The following measurements profile provides basic guidelines for a customer feedback mechanism and references a detailed example of a customer satisfaction survey. Results may be provided to customer organizations that have direct experience with the supplier organization’s products or performance; these organizations may include, for example, Quality, Purchasing, Operations, Engineering, Planning, Logistics, and Technical Support.

1.1 Purpose

These measurements gauge the degree of customer satisfaction with an organization and its products, from the customer’s point of view, to help the organization improve the satisfaction of its customers.

1.2 Applicable Product Categories

All products delivered through a purchase order and fulfillment process are
applicable. This should include stock items as well as items that are made-to-
order.

1.3 Detailed Description

Feedback is obtained through various mechanisms (such as satisfaction surveys and front-line customer technical support input). The surveys should determine the importance of the items surveyed as well as how satisfied customers are. Analysis should include trends and rates of improvement.
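
The handbook leaves the analysis method to each organization; as one possible approach (an assumption, not a TL 9000 requirement), the sketch below weights satisfaction ratings by the importance customers assign to each item and tracks the rate of improvement between survey periods.

    # Illustrative sketch only; the scales, names, and weighting scheme are assumptions.

    def weighted_satisfaction(responses):
        """Importance-weighted average satisfaction for one survey period.

        responses -- iterable of (importance, satisfaction) pairs, e.g. on 1-5 scales
        """
        total_weight = sum(importance for importance, _ in responses)
        if total_weight == 0:
            return None
        return sum(i * s for i, s in responses) / total_weight

    def improvement_rate(previous_score, current_score):
        """Percentage change from the previous survey period to the current one."""
        return 100.0 * (current_score - previous_score) / previous_score

    current = weighted_satisfaction([(5, 4), (3, 5), (4, 3)])
    print(round(current, 2))                         # 3.92
    print(round(improvement_rate(3.6, current), 1))  # 8.8 (% improvement over last period)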

1.4 Sources of Data

Both customers and supplier organizations collect data on satisfaction with an organization’s products.

1.5 Method of Delivery or Reporting

Both customers and supplier organizations should administer the mechanism for determining customer satisfaction. Results should be obtained at least once per year and reported according to each customer’s or supplier organization’s own formats and procedures.

1.6 Example

For an example survey, see the QuEST Forum web site (http://www.questforum.org/). The following are typical survey topics:

Quality of Delivery
• Delivers on time
• Meets due date without constant follow-up
• Lead time competitiveness
• Delivers proper items
• Delivers proper quantities
• Accurate documentation and identification
• Handles emergency deliveries

Quality of Pricing
• Competitive pricing
• Price stability
• Price accuracy
• Advance notice on price changes

Quality of Customer Service


• Compliance to contract terms
• Supplier organization representatives have sincere desire to serve
• Provides feedback from factory
• Recognizes cost effectiveness
• Market insight
• Training provided on equipment/products
• Support on professional and technical matters
• Invoicing efficiency
• Issuing credit notes
• Order acknowledgement
• Adherence to company policy

Quality of Product
• Product reliability/durability/meets specifications
• Product documentation, instructions, technology
• Product packaging, suitability, environmental aspects
• Contract service quality

Glossary Abbreviations, Acronyms and Definitions

This Glossary contains a list of abbreviations and acronyms followed by definitions of terms. Definitions of terms that appear only in the Product Category Table, Table A-1, are not provided here.

ABBREVIATIONS and ACRONYMS


A&M Additions and Maintenance
ABS Alternate Billing Service
ACD Actual Completion Date
ACD Automatic Call Distribution
Afactor Annualization Factor
AIN Advanced Intelligent Network
AOJD Actual On-Job Date
ATM Asynchronous Transfer Mode
BSC Base Station Controller
BSS Base Station System
BTS Base Transceiver System
CCS Common Channel Signaling
CO Central Office
COT Central Office Terminal
CPQ Corrective Patch Quality
CRCD Customer Requested Completion Date
CRD Customer Requested Date
CROJD Customer Requested On-Job Date
DCS Digital Cross Connect System
DIS Digital Interface System
DPM Downtime Performance Measurement
DS(x) Digital Signal Level
DSX Digital Signal Cross Connect
DWDM Dense Wavelength Division Multiplexer
E(x) European Digital Rate
FAX Facsimile (Electronic)
FDF Fiber Distribution Frame
FDI Feeder Distribution Interface
FPQ Feature Patch Quality
FRT Fix Response Time
FRU Field Replaceable Unit
GA General Availability
H Hardware
H/S Hardware Common to Software
HLR Home Location Register
IRR Initial Return Rate

IP Internet Protocol
IP Intelligent Peripheral
IR Information Request
ISDN Integrated Services Digital Network
IVR Interactive Voice Response
LEC Local Exchange Carrier
LOR Late Orders Received
LSTP Local Signaling Transfer Point
LTR Long-term Return Rate
MD Manufacturing Discontinued
MSC Mobile Switching Center
MTBF Mean Time Between Failure
NA Not Applicable
NGDLC Next Generation Digital Loop Carrier
NPR Number of Problem Reports
NTF No Trouble Found
NYR Normalized One-Year Return Rate
OC-(xxx) North American Equivalent Optical Rate
OFM Outage Frequency Measurement
ONU Optical Network Unit
OPR Overdue Problem Report
OSS Operational Support System
OTD On-Time Delivery
OTI On-Time Item Delivery
OTIS On-Time Installed System Delivery
OTS On-Time Service Delivery
PBX Private Branch Exchange
PDH Plesiochronous Digital Hierarchy
PO Purchase Order
POTS Plain Old Telephone Service
RAA Release Application Aborts
RQMS Reliability and Quality Measurements for Telecommunications
Systems
RSTP Regional Signaling Transfer Point
RT Remote Terminal
S Software
SCP Service Control Point
SDH Synchronous Digital Hierarchy
SFAR Service Failure Analysis Report
SLC Subscriber Line Concentrator
SO System Outage
SONET Synchronous Optical Network
SPC Stored Program Control
SQ Service Quality
SS7 Signaling System 7

SSP Service Switching Point
STM-(x)e Synchronous Transport Module, Electrical
STM-(x)o Synchronous Transport Module, Optical
STP Signaling Transfer Point
STS Synchronous Transport Signal
SWIM Software Insertion and Maintenance
SWU Software Update Quality
TCAP Transactional Capabilities Application Part
UDLC Universal Digital Loop Carrier
V Service
VC Virtual Container
VT Virtual Tributary
WDM Wave Division Multiplexers
xDLC Digital Loop Carrier
xDSL Digital Subscriber Line
YRR One-Year Return Rate


Note: The following terms are used in this handbook or in the companion TL 9000 Quality Management
System Requirements Handbook.

Accredited Registrars Qualified organizations certified by a national body (e.g., the Registrar Accreditation Board in the U.S.) to perform audits to TL 9000 and to register the audited company when it is shown to conform to the TL 9000 requirements.

Annualization Factor This factor is applied to annualize the return rate. It is the number of
(Afactor) reporting periods in one year.

Report Period Type Afactor


Calendar Month 12
4 Week Fiscal Month 13
5 Week Fiscal Month 10.4
6 Week Fiscal Month 8.7
28 Day Month 13.04
29 Day Month 12.59
30 Day Month 12.17
31 Day Month 11.77
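
As an illustrative sketch only (the function and argument names are assumptions), the Afactor above can be applied to a single report period's counts to produce an annualized rate, for example a return rate against the FRU population from the basis shipping period.

    AFACTOR = {
        "calendar month": 12,
        "4 week fiscal month": 13,
        "5 week fiscal month": 10.4,
        "6 week fiscal month": 8.7,
        "28 day month": 13.04,
        "29 day month": 12.59,
        "30 day month": 12.17,
        "31 day month": 11.77,
    }

    def annualized_return_rate(returns_in_period, fru_population, period="calendar month"):
        """Annualized returns per FRU in the population, expressed as a percentage."""
        return 100.0 * AFACTOR[period] * returns_in_period / fru_population

    # Example: 5 returns in a calendar month against 12,000 FRUs shipped in the
    # applicable basis shipping period gives an annualized return rate of 0.5 %.
    print(annualized_return_rate(5, 12000))  # 0.5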

Basis Shipping Period A length of time during which Field Replaceable Units (FRUs) are
shipped to the customer. Specifically the period during which the FRUs
that comprise the population for determining the return rate were
shipped.

Certification Procedure(s) by which a third party gives written assurance that a product, process or quality management system conforms to specified requirements.

Certification Mark The mark used to indicate successful assessment to and conformance to
the requirements of a quality management system.

Closure Criteria Specific results of actions that the customer agrees are sufficient to
resolve the customer’s problem report.

Closure Date The date on which a problem report is resolved, as acknowledged by the
customer.

Closure Interval The length of time from origination of a problem report to the agreed closure date.

Compared Data Measurements that are adequately consistent across organizations and
appropriately normalized such that comparisons to aggregate industry
statistics are valid. Only industry statistics based on “compared data” as
designated within each measurement profile are posted on the QuEST
Forum Web Site at http://www.questforum.org/. See also Research
Data.


Configuration Management A discipline applying technical and administrative direction and surveillance to identify and document the functional and physical characteristics of a configuration item, control changes to those characteristics, record and report change processing and implementation status, and verify conformance to specified requirements.

Customer Base The defined group of customers that the supplier organization’s
measurement data encompasses. The customer base options are as
follows:
− Forum Members: only the supplier organization’s customers that are
members of the QuEST Forum. This is the minimum customer base.
− Total: all of the supplier organization’s customers for the product(s)
to which the measurement applies.

Design Change Changes affecting form, fit, or function including ISO 9000:2000 definition
for “Design and Development”.

Deviation A departure from a plan, specified requirement, or expected result.

Disaster Recovery The response to an interruption in the ability to recreate and service the
product and service throughout its life cycle by implementing a plan to
recover an organization’s critical functions.

Engineering Complaint A mechanism used to document a problem to the supplier for resolution. Problems reported may include unsatisfactory conditions or performance of a supplier’s products or services, as defined in GR-230-CORE.

Electrostatic Discharge The transfer of charge between bodies at different electrical potential.

Field Replaceable Unit A distinctly separate part that has been designed so that it may be exchanged at its site of use for the purposes of maintenance or service adjustment.

Fix A correction to a problem that either temporarily or permanently corrects a defect.

Fix Response Time The interval from the receipt of the original problem report to the organization’s first delivery of the official fix.

General Availability The period of time when a product is available to all applicable
customers.

Installation and/or Engineering Audit An organization’s final internal audit of an installation and/or engineering project. This audit takes place prior to customer acceptance.

Installed System A system installed by the supplier of the system hardware and/or
software or by another supplier of installation services.


Installed System Order An order for a system engineered, furnished and installed by the
organization of the system hardware and/or software and having a
Customer Requested Complete Date.

Life Cycle Model The processes, activities, and tasks involved in the concept, definition,
development, production, operation, maintenance, and, if required,
disposal of products, spanning the life of products.

Maintenance Any activity intended to keep a functional hardware or software unit in satisfactory working condition. The term includes tests, measurements, replacements, adjustments, changes and repairs.

Manufacturing Discontinued A product at the end of its life cycle that is no longer generally available.

Measurement Term used to replace the term, "metrics", previously used in TL 9000
standards and requirements. Companies collect measurement data as
defined in the TL 9000 Quality Management System Measurements
Handbook.

Method A means by which an activity is accomplished which is not necessarily documented but which is demonstrated to be consistent and effective throughout the organization.

No Trouble Found A returned item that the supplier organization has tested and in which no trouble is found.

Normalization Factor The total number of normalization units in the product or product
population to which a measurement is applied. The measurement
denominator reduces measurements on different populations to
comparable per unit values.

Official Fix A fix approved by the supplier organization and made available for
general distribution.

On-Time Installed System Delivery Percentage of Installed Systems delivered on time to Customer Requested Completion Date.

On-Time Item(s) Delivery Percentage of items delivered on time to Customer Requested On-Job Date.

On-Time Service Delivery Percentage of Services completed on time to Customer Requested Completion Date.

Overdue Service Problem Report A service problem report that has not been resolved on or before a particular date as agreed by the customer and supplier.

Patch An interim software change between releases delivered or made available for delivery to the field. It consists of one or more changes to affected parts of the program. Patches may be coded in either source code or some other language.


Patch – Defective Corrective A patch that includes any of the following:
a) cannot be installed,
b) does not correct the intended problem,
c) is withdrawn because of potential or actual problems,
d) causes an additional critical or major problem.

Patch – Defective Feature A patch that:
a) cannot be installed,
b) fails to provide the intended feature capability,
c) is withdrawn because of potential or actual problems,
d) causes an additional critical or major problem.

Patch - Official A corrective or feature patch for which delivery to all affected deployed systems has begun.

Plan A scheme or method of acting, proceeding, etc., developed in advance.

Problem Escalation The process of elevating a problem to appropriate management to aid in its resolution.

Problem - Critical H/S Conditions that severely affect service, capacity/traffic, billing and
maintenance capabilities and require immediate corrective action,
regardless of time of day or day of the week as viewed by a customer on
discussion with the supplier. For example:
− A loss of service that is comparable to the total loss of effective
functional capability of an entire switching or transport system,
− A reduction in capacity or traffic handling capability such that
expected loads cannot be handled,
− Any loss of safety or emergency capability (e.g., 911 calls).

Problem - Major H/S Conditions that seriously affect system operation,
maintenance and administration, etc. and require immediate attention as
viewed by the customer on discussion with the supplier. The urgency is
less than in critical situations because of a lesser immediate or
impending effect on system performance, customers and the customer’s
operation and revenue. For example:
− reduction in any capacity/traffic measurement function,
− any loss of functional visibility and/or diagnostic capability,
− short outages equivalent to system or subsystem outages, with
accumulated duration of greater than 2 minutes in any 24 hour
period, or that continue to repeat during longer periods,
− repeated degradation of DS1 or higher rate spans or connections,
− prevention of access for routine administrative activity,
− degradation of access for maintenance or recovery operations,
− degradation of the system’s ability to provide any required critical or
major trouble notification,
− any significant increase in product related customer trouble reports,
− billing error rates that exceed specifications,
− corruption of system or billing databases.


Problem - Minor H/S Conditions that do not significantly impair the function of the system and do not significantly affect service to customers. These problems are not traffic affecting.
Note: Engineering complaints are classified as minor unless otherwise
negotiated between the customer and supplier.

Problem Report All forms of problem reports and complaints received from the customer, such as written reports, letters, and telephone calls, that are recorded manually or entered into an automated problem report tracking system.

Product Category The recognized grouping of products for reporting TL 9000 measurements.

Program A planned, coordinated group of activities, procedure(s), etc., often for a specific purpose.

QuEST Forum Quality Excellence for Suppliers of Telecommunications is a partnership of telecommunications suppliers and service providers. The QuEST Forum’s mission is to develop and maintain a common set of quality management system requirements for the telecommunications industry worldwide, including reportable cost and performance-based measurements for the industry.

Registrar Certification/Registration Body. Also see Accredited Registrar.

Release Application The process of installing a generally available release in a customer’s in-
service product.

Reliability The ability of an item to perform a required function under stated conditions for a stated time period.

Research Data Measurements that are not consistent from one organization to another
and/or are not possible to normalize and consequently cannot be
compared to aggregate industry statistics. Industry statistics from
research data are analyzed for trends and reported to the measurements
work groups. See also “compared data.”

Return Any unit returned for repair or replacement due to any suspected
mechanical or electrical defect occurring during normal installation,
testing, or in-service operation of the equipment.

Risk Management A proactive approach for enabling business continuity. A loss prevention methodology that encompasses identification and evaluation of risk, selection of risks to control, identification of preventive actions, cost benefit analysis, and implementation of mitigating plans.


Scheduled Outage An outage that results from a scheduled or planned maintenance, installation, or manual initialization. This includes such activities as parameter loads, software/firmware changes, and system growth.

Service Categories Product categories that refer to services.

Service Problem Report A formal report of dissatisfaction because a contractual service requirement was not met. Service problems may be either tangible or intangible.
− Tangible problems are those indicating dissatisfaction with the result of the service.
− Intangible problems are those indicating dissatisfaction with personnel.
Service problem reports may be reported via any media.

Service Provider A company that provides telecommunications services.

Severity Level The classification of a problem report as critical, major, or minor. See “Problem – Critical H/S,” “Problem – Major H/S,” and “Problem – Minor H/S.”

Subscriber A telecommunications services customer.

Support Service Transaction The complete cycle from a service request through the completion of the service by the supplier.

System Test Testing conducted on a complete integrated system to evaluate the system’s conformance to its specified requirements.

Temporary Fix A fix that is delivered to a limited number of systems in the field for the
purposes of verification or to solve system problems requiring immediate
attention. A temporary fix is usually followed by an official fix.

Test Plan Describes the scope, strategy, and methodology for testing.

Total System Outage A failure that results in the loss of functionality of the entire system.

Virus, Software A computer program, usually hidden within another seemingly innocuous program, that produces copies of itself and inserts them into other programs, and that usually performs a malicious action (such as destroying data).

Work Instructions A type of document that provides information about how to perform activities and processes consistently.


ISO 9000:2000 Defined Terms [7]

A
audit 3.9.1
audit client 3.9.8
audit conclusions 3.9.7
audit criteria 3.9.4
audit evidence 3.9.5
audit findings 3.9.6
audit programme 3.9.2
audit scope 3.9.3
audit team 3.9.10
auditee 3.9.9
auditor 3.9.11
auditor qualifications 3.9.13

C
capability 3.1.5
characteristic 3.5.1
concession 3.6.11
conformity 3.6.1
continual improvement 3.2.13
correction 3.6.6
corrective action 3.6.5
criteria 3.9.4
customer 3.3.5
customer satisfaction 3.1.4

D
defect 3.6.3
dependability 3.5.3
design and development 3.4.4
deviation permit 3.6.12
document 3.7.2

E
effectiveness 3.2.14
efficiency 3.2.15

G
grade 3.1.3

I
information 3.7.1
infrastructure 3.3.3
inspection 3.8.2
interested party 3.3.7

M
management 3.2.6
management system 3.2.2
measurement control system 3.10.1
measurement process 3.10.2
measuring equipment 3.10.4
metrological characteristic 3.10.5
metrological confirmation 3.10.3
metrological function 3.10.6

N
nonconformity 3.6.2

O
objective evidence 3.8.1
organization 3.3.1
organizational structure 3.3.2

P
preventive action 3.6.4
procedure 3.4.5
process 3.4.1
product 3.4.2
project 3.4.3

Q
qualification process 3.8.6
qualified auditor 3.9.14
quality 3.1.1
quality assurance 3.2.11
quality characteristic 3.5.2
quality control 3.2.10
quality improvement 3.2.12
quality management 3.2.8
quality management system 3.2.3
quality manual 3.7.4
quality objective 3.2.5
quality plan 3.7.5
quality planning 3.2.9
quality policy 3.2.4

R
record 3.7.6
regrade 3.6.8
release 3.6.13
repair 3.6.9
requirement 3.1.2
review 3.8.7
rework 3.6.7

S
scrap 3.6.10
specification 3.7.3
supplier 3.3.6
system 3.2.1

T
technical expert <audit> 3.9.12
test 3.8.3
top management 3.2.7
traceability 3.5.4

V
validation 3.8.5
verification 3.8.4

W
work environment 3.3.4


Bibliography

1. GR-929-CORE Reliability and Quality Measurements for Telecommunications Systems (RQMS), Morristown, NJ, Telcordia Technologies, Issue 4, December 1998.

2. GR-230-CORE Generic Requirements for Engineering Complaints, Morristown, NJ, Telcordia Technologies, Issue 2, December 1997.

3. GR-1323-CORE Supplier Data—Comprehensive Generic Requirements, Morristown, NJ, Telcordia Technologies, Issue 1, December 1995.

4. ISO 8402:1994 Quality Management and Quality Assurance – Vocabulary, Geneva, Switzerland, International Organization for Standardization, 1994.

5. TR-NWT-000284 Reliability and Quality Switching Systems Generic Requirements (RQSSGR), Morristown, NJ, Telcordia Technologies, Issue 2, October 1990.

6. TL 9000 TL 9000 Quality Management System Requirements Handbook, Milwaukee, WI, QuEST Forum, Release 3.0, March 2001.

7. ISO 9000:2000 ISO 9000:2000 – Quality Management Systems – Fundamentals and Vocabulary, Geneva, Switzerland, International Organization for Standardization, 2000.

