A framework to review performance measurement systems
Manoochehr Najmi
Graduate School of Management and Economics, Sharif University of
Technology, Iran
John Rigas and Ip-Shing Fan
Department of Enterprise Integration, Cranfield University, Cranfield, UK
Abstract
Purpose – This paper aims to describe a structured review framework for managing business
performance. The framework entails the review of business performance, including the strategic relevance
of the measures, as well as the efficiency and effectiveness of the performance measurement system itself.
Design/methodology/approach – A range of approaches and tools are employed in the
framework, which features a review card providing a high-level view of the review process,
showing the different types of review perspectives and their interactions.
Findings – The performance measurement system of an organisation is a mechanism to manage and
control the organisation. Maintaining the effectiveness of the organisation and the measurement
systems requires a systematic review process. The process of reviewing performance is a complex task
that spans the whole organisation. A good review process seeks the correct balance between
organisational benefits and the effort required.
Originality/value – Overall, the framework presented provides a structured path of performance
review in a methodical and effective manner. This is useful for practitioners to apply and adapt in their
situations.
Keywords Performance measurement (quality), Performance management
Paper type Research paper
Introduction
Performance measurement systems (PMS) have been at the top of the research and
business agenda over the last few years. Businesses realised the importance of a multi-
dimensional and balanced performance measurement system as a tool that would
enable them to drive the company forward. The shortcomings of traditional accounting
performance measures have been well documented in the literature, and include failing
to convey strategies and priorities effectively within an organisation (Skinner, 1974;
Maskell, 1991), encouraging short-termism (Johnson and Kaplan, 1987), and
inflexibility to change (Richardson and Gordon, 1980). It is now widely accepted
that the use of appropriately defined measures can ensure the strategic alignment of
the organisation and communication of the strategy throughout the business. Taking
the concept further, Andersson et al. (1989), Eccles (1991), Lynch and Cross (1991)
and Kaplan and Norton (1992) attributed the weaknesses of traditional measurement
systems to their uni-dimensional and backward-looking nature. This led to the
development of innovative performance measurement frameworks such as the
balanced scorecard (Kaplan and Norton, 1992) and the EFQM Excellence Model
(European Foundation for Quality Management, 2003), which view business
performance through more than one perspective. Moreover, Dixon et al. (1990) and
Maskell (1991), amongst others, identified that performance measurement had to be
coherent with low-level action taken within the business. This initiated the
development of processes to implement performance measurement systems (Neely
et al., 1996; Bititci et al., 1997; Bourne, 1999). Consequently, the need emerged for
developing a way of sustaining and maintaining these successful performance
measurement system implementations. It became obvious that there is a need to review
these systems effectively (Neely et al., 1996; Maskell, 1991; Dixon et al., 1990; Ghalayini
and Noble, 1996; Neely et al., 2000; Medori and Steeple, 2000; Bititci and Turner, 2000).
Questions arising from previous research on the maintenance stage of performance
measurement systems form the basis of this paper. This paper establishes the need to
review performance measurement systems and proposes a framework that explains how
the review process is performed, the people involved and their roles, the frequency of
the review, and further relevant considerations.
Background
Business performance is a dynamic quantity that is ever changing by nature.
Consequently, all performance interactions (internal and external) must be accounted
for when the system changes. This task alone requires a method that can accommodate
such change and can facilitate that process in a structured way.
Bititci et al. (1997) argue that a feedback mechanism is essential to enable the
correct deployment of strategic and tactical objectives, and further state that there
should be a structured framework to facilitate this flow of information to the
decision-making function of the company. Within ISO 9000:2000 (ISO, 2000), as shown
in Figure 1, developing a method for measurement, analysis and improvement is an
integral part of the system.
Figure 1. The ISO 9000:2000 model

The RADAR logic (Results, Approach, Deployment, Assessment and Review) used at
the heart of the EFQM Excellence Model (European Foundation for Quality
Management, 2003) includes the elements of assessment and review as part of its
scoring methodology. The model is shown in Figure 2.

Figure 2. The EFQM excellence model
According to the model, assessment and review address how an organisation reviews
and improves both the approach and the deployment of that approach for the main
elements of an organisation, which are known as the “enabler” criteria. These criteria are
leadership, policy and strategy, people, partnership and resources, and processes. In an
excellent organisation, the approach and deployment will be subject to regular
measurement, learning activities will be undertaken, and the output from both will be
used to identify, prioritise, plan and implement improvement. Improvement in these
criteria will be reflected in organisational results, which are customer, people, society
and key performance results. Therefore, there is a cause and effect relationship
between enablers and results. By using this approach, the organisation not only
reviews its performance but also ensures that its measures are relevant to its business. BAE,
the winner of the UK Business Excellence Award in 1999, has reported different levels
of review in relation to its policy and strategy, as depicted in Figure 3 (BAE, 1999). It
shows how low-level reviews, which tend to happen frequently, interact with
company-wide review.

Figure 3. Levels of review in BAE
Wisner and Fawcett (1991), in their nine-step approach to designing a performance
measurement system, clearly state the need for periodic re-evaluation of the
appropriateness of the established performance measurement system, but nevertheless
do not provide enough information as to its deployment. Lingle and Schiemann (1996)
suggest the inclusion of a periodical review process in a performance measurement
system, reflecting possible changes to the competitive environment. Kueng et al. (2001)
argue empirically that many of the PMSs in place have not been developed
systematically. They believe that two distinct cycles of “creation of PMS” and “use of
PMS” should be distinguished. They clearly state that any PMS should be under
review periodically, and in the following circumstances one can go back from the
second cycle to the first:
(1) the business strategy is modified;
(2) stakeholders state new requirements;
(3) the implemented performance indicators are not useful;
(4) new operational IT systems are put in place; and
(5) new IT opportunities emerge.
This necessity for the maintenance of PMSs is also discussed in the CMMI approach
(CMMI, 2002) under one of the system requirements as “[E]stablish and maintain the
organisation’s measurement repository”. A similar argument has been developed by
Wettstein and Kueng (2002) using the maturity model for performance measurement
systems. It has become evident in the literature and in business that there is a need to
review performance measurement systems in the ever-changing environment of
modern markets. The main purpose of a structured review process is to help
organisations to be prepared for any adjustments required after the design and
implementation stages of a performance measurement system. It is important not only
to develop an effective and efficient means of reviewing an organisation’s performance
and its performance measurement system, but also to ensure that this means is
sustainable and able to adapt to the “environmental” changes
mentioned previously. The review process is an essential part of the successful
development of performance measurement (Bourne et al., 2000) and is an element that
can ensure the longevity and flexibility of the system.
Application framework
The review framework was part of a wider PMS design methodology that was
developed and applied in two of the participating companies in the EU-funded
project[1].
The companies in which the review framework approach was applied had used a
generic performance measurement system (PMS) design approach consisting of three
basic elements:
(1) direction;
(2) processes; and
(3) measures.
The existence of the element of “direction” implies that the company has defined its
mission, vision and strategic objectives and that the company’s direction is clear.
“Processes” imply that the company is being managed by processes and is familiar
with process improvement practices. Finally, “measures” implies that the company has
attached measures to its processes that have been derived from the strategy and reflect
the company’s direction. Their interactions, as well as the different steps involved in
each of the elements, are depicted in Figure 4.
It is important to mention that the parties involved in the definition of direction,
top-level processes and strategic indicators are referred to as the Executive Team, and
the parties involved in the definition of the detailed processes and the operational
indicators are referred to as the Process Team. The Process Owners are in charge of the
Process Teams.
The approach can be applied to any company that is using the three elements
shown in Figure 4 in one way or another.
Figure 4. Generic PMS design approach
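To make the structure of this generic approach easier to see, the sketch below models the three elements and the teams that own them as simple data types. It is only an illustrative reading of Figure 4 and the description above; the class and field names (for example StrategicIndicator and process_owner) are our own assumptions, not part of the published methodology.

```python
from dataclasses import dataclass, field
from typing import List

# Illustrative data model of the generic PMS design approach described above.
# Names are hypothetical; they mirror the three elements (direction, processes,
# measures) and the teams mentioned in the text.

@dataclass
class StrategicIndicator:        # top-level measure, defined by the Executive Team
    name: str
    target: float

@dataclass
class OperationalIndicator:      # detailed measure, defined by a Process Team
    name: str
    target: float
    linked_strategic: str        # keeps the strategic-alignment "backbone"

@dataclass
class Process:
    name: str
    process_owner: str           # Process Owner in charge of the Process Team
    operational_indicators: List[OperationalIndicator] = field(default_factory=list)

@dataclass
class Direction:                 # defined by the Executive Team
    mission: str
    vision: str
    strategic_objectives: List[str]

@dataclass
class PMS:
    direction: Direction
    processes: List[Process]
    strategic_indicators: List[StrategicIndicator]

# Example: a measure cascaded from a strategic objective down to a process.
pms = PMS(
    direction=Direction("mission ...", "vision ...", ["improve on-time delivery"]),
    strategic_indicators=[StrategicIndicator("on-time delivery rate", 0.95)],
    processes=[Process("order fulfilment", "J. Smith",
                       [OperationalIndicator("orders shipped within 48h", 0.90,
                                             linked_strategic="on-time delivery rate")])],
)
```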
Table I. Characteristics of PMS design process and measures

Characteristics of PMS design process:
• Performance measures should be derived from the company’s strategy
• The purpose of each performance measure must be made explicit
• Data collection and methods of calculating the level of performance must be made clear
• Everyone (customers, employees and managers) should be involved in the selection of the measures
• The performance measures that are selected should take account of the organization
• The process should be easily revisitable – measures should change as circumstances change

Characteristics of measures:
• Performance measures should enable/facilitate benchmarking
• Ratio-based performance measures are preferable to absolute numbers
• Performance criteria should be directly under the control of the evaluated organizational unit
• Objective performance criteria are preferable to subjective ones
• Non-financial measures should be adopted
• Performance measures should be simple and easy to use
• Performance measures should provide fast feedback
• Performance measures should stimulate continuous improvement rather than just monitor
Figure 5. PMS review framework
The PMS review framework is depicted in Figure 5. The first review category,
“business performance”, can have an impact on PMS design and implementation. This
review category assesses the performance of the business through the PMS, and for
this purpose is divided into three levels based on their review frequency:
(1) Ongoing, which deals with reviewing the operational performance of the
business. The ongoing review has an impact on the definition of the operational
indicators (measures).
(2) Periodic, which deals with reviewing the strategic performance of the company
by reviewing the strategic indicators. The periodic review has an impact on the
definition of operational indicators as well as any additional process analysis
that the company might want to carry out (process costing, etc.) (measures and
processes).
(3) Overall, which deals with the review of the company’s overall strategic
objectives, including the mission and vision statements of the business. The
overall review has an impact on all steps of the direction, processes and
measures elements mentioned earlier (direction and measures and processes).
The second review category of the framework, “PMS performance”, deals with the
assessment of how efficient and effective the PMS is in actually measuring the
company’s performance. This also includes issues such as the accuracy of the mapping
of the business onto the PMS and the efficiency of the PMS design process. Many of the
tools explained in the next section of this paper are being used by the companies
participating in the EU research project. As a result, particular attention was paid to
ensuring the usefulness and usability of the framework within an operating business
context.
The three review sections are interrelated and provide inputs and outputs to each
other. This ensures that none of the reviewing procedures are undertaken in isolation,
hence maintaining the functionality of the strategic alignment “backbone” of the PMS.
To understand the interaction between the three different review categories, a review
card was developed illustrating the links. The card is depicted in Figure 6.
The review card identifies the inputs and outputs to each of the review stages and
how they interact together. It also brings the review process into perspective by
defining the scope of the reviews and the tools that can be used. Furthermore, it
provides a suggested frequency as well as the type of people to be involved in the
process to ensure maximum benefits. The purpose of the card is to provide
constructive guidelines to companies implementing the PMS review framework, so
that they can understand the requirements and expected outputs of the process. The
card provides a high-level view of the review process from an operational point of view.
This enables the company to understand not only the individual needs of each of the
review stages but also the positioning of the review framework within the individual
company’s PMS.

Figure 6. Review card
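As a rough illustration of how the contents of such a review card could be captured, the sketch below records, for each review level, the scope, suggested frequency, people involved, tools, and the inputs and outputs exchanged with the other levels. The field names and the sample “ongoing” entry are assumptions made for illustration, not a reproduction of Figure 6.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ReviewCardEntry:
    """One row of a review card: who reviews what, how often, with which tools."""
    level: str            # "ongoing", "periodic" or "overall"
    scope: str
    frequency: str
    people: List[str]
    tools: List[str]
    inputs: List[str]     # what this review receives from other levels
    outputs: List[str]    # what it feeds back to other levels

# Hypothetical entry for the ongoing review, based on the description in the text.
ongoing = ReviewCardEntry(
    level="ongoing",
    scope="operational indicators of individual processes",
    frequency="day-to-day / weekly",
    people=["Process Team", "Process Owner"],
    tools=["control chart", "Pareto chart", "cause and effect diagram"],
    inputs=["targets cascaded from the strategic indicators"],
    outputs=["exception reports and indicator changes fed to the periodic review"],
)
```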
The review framework and the review card comply with the findings of Bourne et al.
(2000), as stated below:
The performance measurement system requires developing and reviewing at a number of
different levels as the situation changes. For example:
(1) The performance measurement system should include an effective mechanism for
reviewing and revising targets and standards.
(2) The performance measurement system should include a process for developing
individual measures as performance and circumstances change.
(3) The performance measurement system should include a process for periodically
reviewing and revising the complete set of measures in use. This should be done to
coincide with changes in either the competitive environment or strategic direction.
(4) The performance measurement system should be used to challenge the strategic
assumptions.
Review procedure
Within the literature, many of the tools mentioned have been used for process
monitoring and control purposes, and this is more evident in the quality field
(Feigenbaum, 1983; Ishikawa, 1991; Oakland, 1986). Nevertheless, they are not the only
tools that can be used for process improvement and review purposes. Within each of
the three main levels the framework provides information regarding the purpose of the
specific review, the people involved (“who”), the tools used (“how”), and the expected
output (“what”). In the remainder of the paper the three main levels will be explained
regarding their deployment and use within a PMS.
All review levels follow a similar layout:
(1) overview – where an explanation is given regarding the review’s purpose and
expected outputs and mechanisms, and steps are explained in more detail
regarding the review’s operational and functional requirements; and
(2) support tools – where a possible set of tools that can be used to carry out such a
review is suggested.
Ongoing review
Overview. Ongoing review on a day-to-day basis is a practice that seeks to ensure
that organisational processes are under control and are achieving the expected
performance. The process team is usually responsible for this type of review and
the results are reported back to the process owner. Ongoing review deals mostly
with operational indicators. Although the results of the review are reported back to
the process owner, in order to solve possible problems or improve processes, the
framework encourages collaboration between different process owners. At this level
the process owner can amend operational indicators, ensuring their alignment to
strategic indicators. Any changes, even minor, will change the system (impact on
the PMS design) and subsequently the implementation procedure in terms of data
collection, IT considerations (if any), reporting and so on (impact on
implementation).
The approach that can be applied to ongoing and periodic review is of the following
generic nature:
(1) Define frequency, format and responsibility. The frequency of review, the format
in which data is gathered and the person(s) who is (are) responsible for this are
identified.
(2) Monitor and control the process. Once the required data is gathered, it is
monitored in a way that shows whether the process is under control and
moving towards the targets set for the indicators. If any adjustments are
required, the approaches defined in the next steps will be helpful.
(3) Diagnose. At this step, one needs to identify the most critical aspects of the
problem and to determine its root causes.
(4) Remedy. Once the root causes of the problems are identified, solutions should be
proposed and implemented. Following that, it must be ensured that the
solutions are effective and that similar problems will not be created.
Support tools. For an ongoing review a number of tools and techniques for monitoring,
controlling, investigating and problem solving are required. These can include:
(1) seven basic tools (cause and effect diagram, run chart, scatter diagram, flow
chart, Pareto chart, histogram, control chart);
(2) seven management tools (affinity diagram, interrelationship digraph, tree
diagram, prioritisation grid, matrix diagram, process decision program chart,
activity network diagram); and
(3) others.
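As a minimal sketch of how the “monitor and control the process” step could use one of the basic tools listed above, the following computes Shewhart-style control limits for an operational indicator and flags out-of-control observations. The three-sigma limits and the sample data are illustrative assumptions, not something prescribed by the framework.

```python
import statistics

def control_limits(history, sigma=3.0):
    """Centre line and upper/lower control limits from historical indicator values."""
    centre = statistics.mean(history)
    spread = statistics.stdev(history)
    return centre, centre + sigma * spread, centre - sigma * spread

def out_of_control(history, new_values, sigma=3.0):
    """Return the new observations that fall outside the control limits."""
    centre, ucl, lcl = control_limits(history, sigma)
    return [(i, v) for i, v in enumerate(new_values) if v > ucl or v < lcl]

# Illustrative use: daily on-time delivery rate for one process.
baseline = [0.91, 0.93, 0.92, 0.94, 0.90, 0.92, 0.93, 0.91]
this_week = [0.92, 0.95, 0.84, 0.93]
print(out_of_control(baseline, this_week))   # flags the 0.84 reading for diagnosis
```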
Periodic review
Overview. In today’s rapidly evolving and changing marketplace, flexibility is
considered essential to the competitiveness of any organisation. To this end, it is
important that a company possesses the ability to change its strategic orientation as
times dictate. Therefore, the periodic review of its strategic indicators is crucial. This
review will evaluate the overall performance of the organisation at the strategic level.
One of the main inputs to this review process is information from the operational
indicators level. This information is essential if adjustments are needed to the strategic
indicator level, since it can provide the platform for performing statistical and other
relevant analyses. The main considerations for this review are:
• the effectiveness of the organisation in achieving its strategic objectives;
• the validity of any hypotheses regarding organisational performance developed
during the design stages of the performance measurement system; and
• the validity of possible relationships amongst performance indicators.
The Executive Team will be in charge of this kind of review. As mentioned before, the
impact of any changes in measures should be considered in the PMS design and
implementation. Due to the presence of the Executive Team, any elements of the
generic PMS design approach (see Figure 4) can be altered. However, for most
organisations, changing direction so frequently is unlikely. Changing the
organisational direction is a fundamental change that may require re-design of the
system. Therefore, if the organisation is doing well, only some amendments in
measures and processes might be required.
The review of the strategic indicators may have a diverse impact on the PMS. In the
Executive Team meeting (the same team that defines the strategic indicators), if it
becomes obvious that the organisation fails to move effectively towards its strategic
objectives, then the causes should be analysed and might include:
• under-performance at a lower level;
• strategic objectives are not communicated effectively;
• strategic objectives have not been defined properly; and
• the objectives themselves lack validity/relevance.
It is obvious that depending on the issue identified, the course of action should be
different for each individual case. The possibility of two or more cases being valid at
the same time should not be excluded.
The validation of the hypotheses stated initially regarding possible performance
indicator relationships should be a driving force of the review process. If the study of
data and information indicates a discrepancy in these hypotheses, then the cause
should be identified:
(1) Was the method used to establish the hypotheses appropriate? This may
include a review of the quality of the data and tools used.
(2) If there is no relationship, can that be proved?
(3) Are any existing relationships relevant/contributing to the overall business?
The approach followed to address the above issues is similar to the ongoing review.
Support tools. The tools that may be used at this stage include those already
mentioned, plus more specifically the following:
• trend analysis tools – control charts;
• relationship analysis tools – correlation analysis, multivariate analysis, cause
and effect diagrams, Pareto charts; and
• checking the consistency of the PMS communication – departmental reports and
input.
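To illustrate the kind of relationship analysis suggested above for validating hypothesised links between indicators, the sketch below computes a Pearson correlation between two indicator series. The indicator names and the 0.7 threshold are illustrative assumptions; in practice the Executive Team would decide what evidence counts as supporting or refuting a hypothesis.

```python
import statistics

def pearson(xs, ys):
    """Pearson correlation coefficient between two equally long indicator series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

# Hypothesis from the PMS design stage: training hours drive first-pass yield.
training_hours   = [10, 12, 9, 15, 14, 11, 16, 13]
first_pass_yield = [0.88, 0.90, 0.87, 0.93, 0.92, 0.89, 0.94, 0.91]

r = pearson(training_hours, first_pass_yield)
print(f"r = {r:.2f}")
if abs(r) < 0.7:                      # illustrative threshold only
    print("Hypothesised relationship not supported - investigate at the periodic review")
```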
Overall review
Overview. This review tends to be on an annual basis, and the following considerations
are taken into account:
• the validity of the mission and vision statements; and
• whether the company’s strategic objectives still support the mission and vision
of the organisation.
Having this review on top of the previous reviews will provide the organisation with the
ability to be proactive in the market rather than just reactive. In other words, in
this review all external and internal factors (SWOT analysis) will be considered to
make any adjustment to the overall direction of the organisation. Also, the whole PMS
will be reviewed systematically to ensure its effectiveness and efficiency.
Therefore, the PMS can be updated as a result of changes in the organisational
direction or of improvements to the system itself.
The approach followed for the overall review is as follows:
(1) Arrange review meetings. Due to the importance of this review, it is suggested
that three review meetings be arranged as milestones, i.e. an
introductory meeting (in which the issues regarding scope, objectives and so on
will be considered), an intermediate meeting (in which decisions about the
adjustments required for the review process will be taken – moreover, the
diagnostic stage of the approach may provide input reports for this meeting), and a
final meeting (in which proposals for action based on the remedial approach will be
finalised).
(2) Define the scope, objectives and resource allocation. Based on the
information gathered through periodic reviews and on external factors such as
new competitors, legislation, customer demands, etc., the scope and objectives
of the review will be defined and appropriate resources will be allocated.
(3) Diagnostic approach. If the organisation has experienced problems at the
organisational level, the root causes should be identified. The tools and
techniques for the overall review will help the organisation to do so.
(4) Remedial approach. A proposal for actions to remove the problems and
initiate improvement will be prepared.
Support tools. The tools used for the overall review of the PMS are larger-scale review
tools that affect the overall organisational direction and orientation. To this end they
can be categorised as internal tools (i.e. tools that compare the organisational
performance to a set standard of practice), and external tools (i.e. tools that concentrate
on comparing the organisational performance to competitors and good or best-practice
organisations). As an example, internal tools may include the process of
self-assessment (European Foundation for Quality Management, 2000) and external
tools may include the process of benchmarking.
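As a small example of the external comparison mentioned above, the sketch below computes the gap between an organisation's indicator values and benchmark (best-practice) values, so that the overall review can prioritise where the direction or the measures themselves may need attention. The indicator names and benchmark figures are invented for illustration.

```python
# Hypothetical benchmarking gap analysis for the overall review.
own_results = {
    "on-time delivery rate": 0.91,
    "customer satisfaction": 0.82,
    "employee turnover": 0.12,   # lower is better
}
benchmark = {
    "on-time delivery rate": 0.97,
    "customer satisfaction": 0.90,
    "employee turnover": 0.08,
}
lower_is_better = {"employee turnover"}

def gaps(own, best, lower):
    """Positive gap = shortfall against the benchmark for that indicator."""
    out = {}
    for name, value in own.items():
        target = best[name]
        out[name] = (value - target) if name in lower else (target - value)
    return out

# Largest shortfalls first, as candidate priorities for the overall review.
for name, gap in sorted(gaps(own_results, benchmark, lower_is_better).items(),
                        key=lambda item: item[1], reverse=True):
    print(f"{name}: shortfall of {gap:.2f} against best practice")
```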
Conclusions
The performance measurement system of an organisation is the mechanism by which
to manage and control the organisation. For organisations that use PMS as the basis
for their operations and development, the health of the organisation depends on the
effectiveness of the PMS. Maintaining the effectiveness of the organisation and the
measurement systems requires a systematic review process. The process of reviewing
performance is a complex task that spans the whole organisation. Involving the
appropriate people and having them spend sufficient time reviewing the PMS is a costly exercise.
Nevertheless, it is very important to the continuous adjustment of the business and its
performance orientation in today’s markets. A good PMS review process seeks the
correct balance between organisational benefits and the efforts required.
This paper presents a process that was designed to review a PMS that has close
integration of measures and business processes. The work was completed with pilot
applications in two European enterprises. Empirical work with the case study
enterprises in the implementation of the integrated PMS identified the review
requirements. The review process proposed represents a balance between benefits and
efforts that is considered to be appropriate.
In the design of the review process, previous studies in PMS design and
implementation were used as a starting reference. In the integrated PMS, the top-level
measures (strategic) are systematically cascaded down to lower levels (operational).
Therefore it is appropriate that, in a systematic review, this link is maintained as part
of the PMS review. The review process is designed to consider both the organisational
performance through the applied PMS and the validity of the PMS itself. To this end a
set of tools and techniques that can be used is proposed, detailing the people, tools
and expected outputs of each stage of the review process. The framework includes a set
of possible questions and considerations that can be taken into account when
reviewing performance measurement systems. A review card was developed to
provide a view of the major elements at a glance.
Overall, the framework presented provides a structured method of reviewing
performance in a methodical and effective manner. This is useful for practitioners to
apply and adapt to their own situations.
Note
1. This paper was produced during the 28-month research project named “Performance
Measurement System for Total Quality Management”, sponsored by the EC and funded
under the ESPRIT initiative (TBP No. 26736). The participants were Business Integration
Technologies (UK), ESADE Business School (Spain), Cranfield University (UK), Universitat
Jaume I (Spain), Gres de Nules (Spain), and Wendel Email (Germany).
References
Andersson, P., Aronsson, H. and Storhagen, N.G. (1989), “Measuring logistics performance”,
Engineering Costs and Production Economics, Vol. 17, pp. 253-62.
BAE (1999), Submission for the UK Quality Award, BAE, London.
Bititci, U.S. and Turner, T. (2000), “Dynamics of performance measurement systems”,
International Journal of Operations & Production Management, Vol. 20 No. 6, pp. 692-704.
Bititci, U.S., Carrie, S.A. and McDevitt, L. (1997), “Integrated performance measurement systems:
a development guide”, International Journal of Operations & Production Management,
Vol. 17 No. 5, pp. 522-34.
Bourne, M., Mills, J., Wilcox, M., Neely, A. and Platts, K. (2000), “Designing, implementing and
updating performance measurement systems”, International Journal of Operations &
Production Management, Vol. 20 No. 7, pp. 754-71.
Bourne, M.C.S. (1999), “Designing and implementing a balanced performance measurement
system”, Control, July, pp. 21-3.
CMMI (2002), “Capability Maturity Model Integration”, Software Engineering Institute, Carnegie
Mellon University, Pittsburgh, PA, available at: www.sei.cmu.edu/cmmi
Dixon, J.R., Nanni, J. and Vollmann, T.E. (1990), The New Performance Challenge: Measuring
Operations for World-Class Competition, Irwin, Homewood, IL.
Eccles, R.G. (1991), “The performance measurement manifesto”, Harvard Business Review,
January/February, pp. 131-7.
European Foundation for Quality Management (2000), Self-Assessment Based on the European
Model for Total Quality, European Foundation for Quality Management, Brussels.
European Foundation for Quality Management (2003), The EFQM Excellence Model, European
Foundation for Quality Management, Brussels.
Feigenbaum, A.V. (1983), Total Quality Control, McGraw-Hill, New York, NY.
Ghalayini, A.M. and Noble, J.S. (1996), “The changing basis of performance measurement”,
International Journal of Operations & Production Management, Vol. 16 No. 8, pp. 63-80.
ISO (2000), ISO 9000:2000, International Organisation for Standardisation, Geneva.
Ishikawa, K. (1991), Introduction to Quality Control, Chapman & Hall, London.
Johnson, H.T. and Kaplan, R.S. (1987), Relevance Lost: The Rise and Fall of Management
Accounting, Harvard Business School Press, Boston, MA.
Kaplan, R.S. and Norton, D.P. (1992), “The balanced scorecard – measures that drive
performance”, Harvard Business Review, January/February, pp. 71-9.
Kueng, P., Meier, A. and Wettstein, T. (2001), “Performance measurement systems must be
engineered”, Communications of the AIS, Vol. 7, Article 3.
Lingle, J.H. and Schiemann, W.A. (1996), “From balanced scorecard to strategic gauges: is
measurement worth it?”, Management Review, Vol. 85 No. 3, pp. 56-62.
Lynch, R.L. and Cross, K.F. (1991), Measure Up – the Essential Guide to Measuring Business
Performance, Mandarin, London.
Maskell, B.H. (1991), Performance Measurement for World Class Manufacturing: A Model for
American Companies, Productivity Press, Cambridge, MA.
Medori, D. and Steeple, D. (2000), “A framework for auditing and enhancing performance
measurement systems”, International Journal of Operations & Production Management,
Vol. 20 No. 5, pp. 520-33.
Neely, A., Mills, J., Platts, K., Richards, H., Gregory, M., Bourne, M. and Kennerley, M. (2000),
“Performance measurement system design: developing and testing a process-based
approach”, International Journal of Operations & Production Management, Vol. 20 No. 10,
pp. 1119-45.
Neely, A.D., Mills, J.F., Gregory, M.J., Richards, A.H., Platts, K.W. and Bourne, M.C.S. (1996),
Getting the Measure of Your Business, Findlay, London.
Oakland, J.S. (1986), Statistical Process Control – a Practical Guide, Heinemann, London.
Richardson, P.R. and Gordon, J.R.M. (1980), “Measuring total manufacturing performance”, Sloan
Management Review, Winter, pp. 47-58.
Skinner, W. (1974), “The decline, fall and renewal of manufacturing”, Industrial Engineering,
October, pp. 32-8.
Wettstein, T. and Kueng, P.A. (2002), “A maturity model for performance measure systems”, in
Brebbia, C. and Pascola, P. (Eds), Management Information Systems, WIT Press,
Southampton.
Wisner, J.D. and Fawcett, S.E. (1991), “Linking firm strategy to operating decisions through
performance measurement”, Production and Inventory Management Journal, 3rd quarter,
pp. 5-11.
Further reading
Eccles, R.G. and Pyburn, P. (1992), “Creating a comprehensive system to measure performance”,
Management Accounting, October, pp. 41-4.
Fitzgerald, L. (1988), “Managing performance measurement in service industries”, International
Journal of Operations & Production Management, Vol. 8 No. 3, pp. 109-16.
Lynch, R.L. and Cross, K.F. (1992), Measure Up! Yardsticks for Continuous Improvement,
Blackwell, Oxford.
Moseng, B. and Bredrup, H. (1993), “A methodology for industrial studies of productivity
performance”, Production Planning and Control, Vol. 4 No. 3.
Senge, P.M. (1990), The Fifth Discipline: The Art and Practice of the Learning Organization,
Century, London.
Sink, D.S. and Tuttle, T. (1989), Planning and Measurement in Your Organisation of the Future,
Industrial Engineering and Management Press, Norcross, GA.
Stoop, P.P.M. (1996), “Performance management in manufacturing – a method for short term
evaluation and diagnosis”, PhD thesis, Eindhoven University of Technology, Eindhoven.
Stoop, P.P.M. and Bertrand, M.W. (1997), “Performance prediction and diagnosis in two
production departments”, Integrated Manufacturing Systems, Vol. 8 No. 2, pp. 103-9.