
Chapter 25

Process and Project Metrics


Slide Set to accompany
Software Engineering: A Practitioner’s Approach, 7/e
by Roger S. Pressman

Slides copyright © 1996, 2001, 2005, 2009 by Roger S. Pressman

For non-profit educational use only


May be reproduced ONLY for student use at the university level when used in conjunction
with Software Engineering: A Practitioner's Approach, 7/e. Any other reproduction or use is
prohibited without the express written permission of the author.

All copyright information MUST appear if these slides are posted on a website for student
use.

These slides are designed to accompany Software Engineering: A Practitioner’s Approach, 7/e
(McGraw-Hill 2009). Slides copyright 2009 by Roger Pressman.
A Good Manager Measures
[Figure: measurement links process and product. Measuring the process yields process metrics and project metrics; measuring the product yields product metrics.]

What do we use as a basis?
• size?
• function?
Why Do We Measure?
 assess the status of an ongoing project
 track potential risks
 uncover problem areas before they go “critical”
 adjust the work flow or tasks
 evaluate the project team’s ability to control the quality of software work products
Process Measurement
 We measure the efficacy of a software process
indirectly.
 That is, we derive a set of metrics from the outcomes of the process.
 Outcomes include
• measures of errors uncovered before release of the
software
• defects delivered to and reported by end-users
• work products delivered (productivity)
• human effort expended
• calendar time expended
• schedule conformance
• other measures.
 We also derive process metrics by measuring the
characteristics of specific software engineering tasks.
Process Metrics Guidelines
 Use common sense and organizational sensitivity when
interpreting metrics data.
 Provide regular feedback to the individuals and teams who
collect measures and metrics.
 Don’t use metrics to appraise individuals.
 Work with practitioners and teams to set clear goals and
metrics that will be used to achieve them.
 Never use metrics to threaten individuals or teams.
 Metrics data that indicate a problem area should not be
considered “negative.” These data are merely an indicator for
process improvement.
 Don’t obsess over a single metric to the exclusion of other important metrics.
Software Process Improvement
[Figure: the SPI cycle. A process model and improvement goals are inputs to software process improvement (SPI); process metrics gathered from the process guide it; the output is a set of process improvement recommendations.]
Process Metrics
 Quality-related
 focus on quality of work products and deliverables
 Productivity-related
 production of work products relative to the effort expended
 Statistical SQA data
 error categorization & analysis
 Defect removal efficiency
 propagation of errors from process activity to activity
 Reuse data
 The number of components produced and their degree of
reusability
Project Metrics
 used to minimize the development schedule by making the
adjustments necessary to avoid delays and mitigate
potential problems and risks
 used to assess product quality on an ongoing basis and,
when necessary, modify the technical approach to improve
quality.
 every project should measure:
 inputs—measures of the resources (e.g., people, tools)
required to do the work.
 outputs—measures of the deliverables or work products
created during the software engineering process.
 results—measures that indicate the effectiveness of the
deliverables.
Typical Project Metrics
 Effort/time per software engineering task
 Errors uncovered per review hour
 Scheduled vs. actual milestone dates
 Number of changes and their characteristics
 Distribution of effort on software
engineering tasks
Metrics Guidelines
 Use common sense and organizational sensitivity when
interpreting metrics data.
 Provide regular feedback to the individuals and teams who collect measures and metrics.
 Don’t use metrics to appraise individuals.
 Work with practitioners and teams to set clear goals and
metrics that will be used to achieve them.
 Never use metrics to threaten individuals or teams.
 Metrics data that indicate a problem area should not be
considered “negative.” These data are merely an indicator
for process improvement.
 Don’t obsess over a single metric to the exclusion of other important metrics.
Typical Size-Oriented Metrics
 errors per KLOC (thousand lines of code)
 defects per KLOC
 $ per LOC
 pages of documentation per KLOC
 errors per person-month
 errors per review hour
 LOC per person-month
 $ per page of documentation
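As a quick sketch of how these size-oriented ratios are computed, the snippet below normalizes raw project measures by size; every project figure in it is hypothetical, chosen only to illustrate the arithmetic.

# A minimal sketch of size-normalized metrics; all project figures
# below are hypothetical.
loc = 12_100        # lines of code delivered
errors = 134        # errors found before release
defects = 29        # defects reported after release
cost = 168_000      # total project cost in dollars
effort = 24         # person-months expended
doc_pages = 365     # pages of documentation

kloc = loc / 1000
print(f"errors per KLOC:  {errors / kloc:.1f}")
print(f"defects per KLOC: {defects / kloc:.1f}")
print(f"$ per LOC:        {cost / loc:.2f}")
print(f"pages per KLOC:   {doc_pages / kloc:.1f}")
print(f"LOC per p-month:  {loc / effort:.0f}")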
Typical Function-Oriented Metrics
 errors per FP
 defects per FP
 $ per FP
 pages of documentation per FP
 FP per person-month
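These ratios presuppose an FP count for the denominator. Below is a minimal sketch of the classic function-point computation, FP = count total × [0.65 + 0.01 × ΣFi], using the average complexity weights; all information domain counts and adjustment ratings are hypothetical.

# Sketch of the classic function-point computation; the counts and
# the 14 value-adjustment ratings below are hypothetical.
AVG_WEIGHTS = {"inputs": 4, "outputs": 5, "inquiries": 4,
               "internal_files": 10, "external_interfaces": 7}
counts = {"inputs": 24, "outputs": 16, "inquiries": 22,
          "internal_files": 4, "external_interfaces": 2}

count_total = sum(counts[k] * AVG_WEIGHTS[k] for k in counts)
fi = [3] * 14                      # 14 value-adjustment factors, each rated 0-5
fp = count_total * (0.65 + 0.01 * sum(fi))
print(f"FP = {fp:.0f}")            # errors/FP, $/FP, etc. divide by this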
Comparing LOC and FP
Programming      LOC per Function Point
Language        avg.  median   low   high
Ada              154      -    104    205
Assembler        337    315     91    694
C                162    109     33    704
C++               66     53     29    178
COBOL             77     77     14    400
Java              63     53     77      -
JavaScript        58     63     42     75
Perl              60      -      -      -
PL/1              78     67     22    263
PowerBuilder      32     31     11    105
SAS               40     41     33     49
Smalltalk         26     19     10     55
SQL               40     37      7    110
Visual Basic      47     42     16    158

Representative values developed by QSM
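One common use of these ratios is “backfiring”: converting an FP estimate into an expected LOC figure for a candidate language. A sketch, using average values from the table and a hypothetical FP count:

# Hypothetical backfiring example using avg. LOC/FP values from the table.
LOC_PER_FP = {"C": 162, "C++": 66, "Java": 63}

fp_estimate = 320                    # hypothetical FP count for a system
for lang, ratio in LOC_PER_FP.items():
    print(f"{lang:4s} ~ {fp_estimate * ratio:,} LOC")
# C    ~ 51,840 LOC
# C++  ~ 21,120 LOC
# Java ~ 20,160 LOC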
Why Opt for FP?
 Programming language independent
 Uses readily countable characteristics that are determined early in the software process
 Does not “penalize” inventive (short) implementations that use fewer LOC than other, clumsier versions
 Makes it easier to measure the impact of
reusable components
Object-Oriented Metrics
 Number of scenario scripts (use-cases)
 Number of support classes (classes required to implement the system but not immediately related to the problem domain)
 Average number of support classes per key
class (analysis class)
 Number of subsystems (an aggregation of
classes that support a function that is visible
to the end-user of a system)
WebApp Project Metrics
 Number of static Web pages (the end-user has no control over
the content displayed on the page)
 Number of dynamic Web pages (end-user actions result in
customized content displayed on the page)
 Number of internal page links (hyperlinks to other Web pages within the WebApp)
 Number of persistent data objects
 Number of external systems interfaced
 Number of static content objects
 Number of dynamic content objects
 Number of executable functions
Measuring Quality
 Correctness — the degree to which a program
operates according to specification
 Maintainability—the degree to which a program
is amenable to change
 Integrity—the degree to which a program is
impervious to outside attack
 Usability—the degree to which a program is
easy to use
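Of these four attributes, integrity is the one most directly quantified in this chapter: for each attack type, integrity = 1 − threat × (1 − security), where threat is the probability an attack of that type occurs in a given time and security is the probability the attack is repelled. A short sketch with hypothetical probabilities:

# Integrity per attack type: 1 - threat * (1 - security).
# The probability pairs below are hypothetical.
attack_types = {
    "denial of service": (0.25, 0.95),   # (threat, security)
    "data tampering":    (0.10, 0.99),
}
for name, (threat, security) in attack_types.items():
    print(f"{name}: integrity = {1 - threat * (1 - security):.4f}")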
Defect Removal Efficiency

DRE = E / (E + D)

where:
E = the number of errors found before delivery of the software to the end user
D = the number of defects found after delivery
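A quick worked example with hypothetical counts; as DRE approaches 1, fewer defects are escaping to the field.

# Hypothetical counts: 120 errors found before delivery, 8 defects after.
E, D = 120, 8
dre = E / (E + D)
print(f"DRE = {dre:.3f}")   # 0.938 -> about 94% of problems caught pre-release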
Metrics for Small Organizations
 time (hours or days) elapsed from the time a request is made until the evaluation is complete, t_queue
 effort (person-hours) to perform the evaluation, W_eval
 time (hours or days) elapsed from completion of the evaluation to assignment of the change order to personnel, t_eval
 effort (person-hours) required to make the change, W_change
 time required (hours or days) to make the change, t_change
 errors uncovered during work to make the change, E_change
 defects uncovered after the change is released to the customer base, D_change
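Even a small organization can combine these raw measures into simple derived indicators, e.g. end-to-end response time for a change and a per-change defect removal efficiency. A sketch with hypothetical values:

# Derived indicators from the raw change measures; all values hypothetical.
change = {
    "t_queue": 4,     # days: request made -> evaluation complete
    "W_eval": 6,      # person-hours to evaluate
    "t_eval": 2,      # days: evaluation complete -> change order assigned
    "W_change": 30,   # person-hours to make the change
    "t_change": 5,    # days to make the change
    "E_change": 3,    # errors uncovered while making the change
    "D_change": 1,    # defects reported after the change is released
}
total_days = change["t_queue"] + change["t_eval"] + change["t_change"]
dre = change["E_change"] / (change["E_change"] + change["D_change"])
print(f"request-to-release: {total_days} days; per-change DRE: {dre:.2f}")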
Establishing a Metrics Program
 Identify your business goals.
 Identify what you want to know or learn.
 Identify your subgoals.
 Identify the entities and attributes related to your subgoals.
 Formalize your measurement goals.
 Identify quantifiable questions and the related indicators that
you will use to help you achieve your measurement goals.
 Identify the data elements that you will collect to construct
the indicators that help answer your questions.
 Define the measures to be used, and make these definitions
operational.
 Identify the actions that you will take to implement the
measures.
 Prepare a plan for implementing the measures.
