PM6: Effort Estimation
MANAGEMENT
(CSC 4125)
Where Are Estimates Done?
Estimates are carried out at various stages of a software
project for a variety of reasons.
• Strategic Planning
• Feasibility Study
• System Specification
• Evaluation of Suppliers’ Proposals
• Project Planning
– Accuracy of estimates should improve as project proceeds
– Some speculation (assumptions) about the physical
implementation may be necessary for estimation
Problems with Over- and Under-Estimates
• Over-estimates: an over-estimate is likely to cause the project to take
longer than it otherwise would.
– Parkinson’s Law – ‘work expands to fill the time available’ (does not apply
universally and can be controlled)
– Brooks’ Law – ‘putting more people on a late job makes it later’
Sample Historical Data
Software Effort Estimation Techniques
• Algorithmic Models – use ‘effort drivers’
representing characteristics of the target system and
implementation environment to predict effort
• Expert Judgment – based on the advice of
knowledgeable staff
• Analogy – similar, completed projects are identified
and their actual effort is used as the basis of the estimate
(case-based reasoning, comparative)
• Parkinson – identifies the staff effort available to do
a project and uses that as an ‘estimate’
Top-Down Approach
• Normally associated with parametric/algorithmic models
• Effort will be related mainly to variables associated with
characteristics of the final system (and the development
environment)
• Based on past project data
• Form of parametric model will normally be
effort = (system size) x (productivity rate)
• Important to distinguish between size models and effort
models
• After calculating the overall effort, proportions of that effort
are allocated to various activities
• Combinations of top-down and bottom-up estimation may
(and should) be used
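The parametric form above can be sketched in a few lines of Python. The size, productivity rate, and activity proportions below are illustrative assumptions (the proportions match the 30/30/40 split used as an example on the next slide), not figures from any real project database.

```python
# Minimal sketch of top-down parametric estimation:
# effort = (system size) x (productivity rate), then apportioned.

def top_down_estimate(size, productivity_rate, activity_proportions):
    """Return overall effort and its breakdown across activities."""
    overall = size * productivity_rate
    breakdown = {a: overall * p for a, p in activity_proportions.items()}
    return overall, breakdown

overall, breakdown = top_down_estimate(
    size=200,               # e.g. 200 function points (assumed)
    productivity_rate=0.5,  # e.g. 0.5 person-days per FP (assumed)
    activity_proportions={"design": 0.30, "code": 0.30, "test": 0.40},
)
# overall = 100.0 person-days; design ~30, code ~30, test ~40
```

A bottom-up check on the larger components would then validate (or challenge) these apportioned figures.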
Top-Down Approach
• Produce an overall estimate for the project using effort driver(s),
e.g. 100 days overall
• Distribute proportions of the overall estimate to components, e.g.:
– design 30%, i.e. 30 days
– code 30%, i.e. 30 days
– test 40%, i.e. 40 days
Estimating by Analogy
• Also called Case-Based Reasoning
• Estimators seek out projects that have been completed
(source cases) that have similar characteristics to the new
project (target case)
• Actual effort for the source cases can be used as a base
estimate for the target
• Estimator then identifies differences between the target and
the source and adjusts the base estimate to produce an
estimate for the new project
• Historical data must include all relevant dimensions included
in the model
Estimating by Analogy (cont.)
• The problem is to identify the similarities and differences
between applications when you have a large
number of past projects to analyze.
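One simple way to mechanize the search for a similar source case is a nearest-neighbour comparison over project features, as sketched below. The feature names and effort figures are illustrative assumptions; real tools (e.g. ANGEL-style case-based reasoning) use richer distance measures.

```python
# Hedged sketch of estimating by analogy: pick the completed (source)
# project closest to the target in feature space; its actual effort
# becomes the base estimate, which the estimator then adjusts.
import math

def nearest_source(target, sources):
    """Return the source project with the smallest Euclidean
    distance to the target's feature values."""
    def dist(a, b):
        return math.sqrt(sum((a[k] - b[k]) ** 2 for k in a))
    return min(sources, key=lambda s: dist(target, s["features"]))

sources = [
    {"name": "A", "features": {"inputs": 10, "outputs": 15, "entities": 6},
     "effort": 120},  # actual person-days (assumed)
    {"name": "B", "features": {"inputs": 30, "outputs": 40, "entities": 20},
     "effort": 420},
]
target = {"inputs": 12, "outputs": 17, "entities": 7}
base = nearest_source(target, sources)  # project A is closest here
# base["effort"] is then adjusted for known target/source differences.
```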
Albrecht Function Point Analysis (cont.)
• Each instance of each external user type in the system
is identified
• Each component is then classified as having high,
average, or low complexity
• Counts of each external user type in each complexity
band are multiplied by specified weights and summed
to get Unadjusted FP (UFP) count
• Fourteen Technical Complexity Factors (TCFs) are then
applied in a formula to calculate the final FP count
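The counting procedure above can be sketched as follows. The complexity weights and the adjustment formula FP = UFP × (0.65 + 0.01 × ΣTCF) are the commonly published Albrecht/IFPUG values; treat them as assumptions standing in for the lecture's own tables.

```python
# Sketch of Albrecht FP counting: weighted counts per external user
# type give the UFP, then the fourteen TCF ratings (0-5 each) adjust it.
WEIGHTS = {
    # external user type: (low, average, high) complexity weights
    "external_input":          (3, 4, 6),
    "external_output":         (4, 5, 7),
    "external_inquiry":        (3, 4, 6),
    "internal_file":           (7, 10, 15),
    "external_interface_file": (5, 7, 10),
}

def unadjusted_fp(counts):
    """counts maps user type -> (n_low, n_avg, n_high) instances."""
    return sum(n * w
               for t, ns in counts.items()
               for n, w in zip(ns, WEIGHTS[t]))

def adjusted_fp(ufp, tcf_scores):
    """Apply the fourteen Technical Complexity Factor ratings."""
    assert len(tcf_scores) == 14
    return ufp * (0.65 + 0.01 * sum(tcf_scores))

ufp = unadjusted_fp({"external_input": (2, 1, 0), "internal_file": (1, 0, 0)})
# 2*3 + 1*4 + 1*7 = 17 UFP
fp = adjusted_fp(ufp, [3] * 14)  # 17 * (0.65 + 0.42) = 18.19 FP
```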
FP File Type Complexity
FP External Input Complexity
FP External Output Complexity
FP Complexity Multipliers
Function Points Mark II
• Developed by Charles R. Symons
• Recommended by Central Computer and Telecommunications Agency
(CCTA)
• Used by a minority of FP specialists in the UK
• UFPs = Wi x (number of input data element types) +
We x (number of entity types referenced) +
Wo x (number of output data element types)
– where Wi , We , and Wo are weightings derived from previous projects or
industry averages normalized so they add up to 2.5
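The Mark II formula is straightforward to compute. The default weightings below (Wi = 0.58, We = 1.66, Wo = 0.26, summing to 2.5) are the industry averages commonly quoted for Symons' method; a given organization would substitute weights derived from its own past projects.

```python
# Sketch of the Mark II UFP formula with commonly quoted
# industry-average weightings (an assumption, not lecture data).
def mark2_ufp(n_input_types, n_entity_refs, n_output_types,
              wi=0.58, we=1.66, wo=0.26):
    """UFPs = Wi*inputs + We*entities referenced + Wo*outputs."""
    return wi * n_input_types + we * n_entity_refs + wo * n_output_types

ufp = mark2_ufp(n_input_types=10, n_entity_refs=4, n_output_types=20)
# 0.58*10 + 1.66*4 + 0.26*20 = 17.64 UFP
```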
Model of a Transaction
Calculating Mark II FP: An Example
Object Points
• Similarities with FP approach, but takes account of more readily
identifiable features
• No direct bearing on object-oriented techniques, but can be used
for object-oriented systems as well
• Uses counts of screens, reports, and 3GL components, referred to
as Objects
• Each object has to be classified as simple, medium, or difficult
• Number of objects at each level are multiplied by appropriate
complexity weighting and summed to get an overall score
• The object point score can be adjusted to accommodate a reusability
factor
• Finally, the score is divided by a productivity rate (PROD) (derived from
historical data or industry averages) to calculate effort
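The whole object-point procedure can be sketched as below. The screen/report/3GL weights follow the COCOMO II application-composition tables; the counts, reuse percentage, and PROD value are illustrative assumptions.

```python
# Sketch of object-point estimation: weighted counts -> score,
# reuse adjustment -> new object points (NOP), then NOP / PROD -> effort.
WEIGHTS = {
    "screen": {"simple": 1, "medium": 2, "difficult": 3},
    "report": {"simple": 2, "medium": 5, "difficult": 8},
    "3gl":    {"difficult": 10},  # 3GL components always count as difficult
}

def object_point_effort(counts, percent_reuse, prod):
    """counts: {(object_type, level): n}; returns effort in person-months."""
    score = sum(n * WEIGHTS[t][lvl] for (t, lvl), n in counts.items())
    nop = score * (100 - percent_reuse) / 100  # adjust for reuse
    return nop / prod                          # PROD = NOP per person-month

effort = object_point_effort(
    {("screen", "simple"): 4, ("report", "medium"): 2, ("3gl", "difficult"): 1},
    percent_reuse=10,
    prod=13,  # assumed productivity rate
)
# score = 4 + 10 + 10 = 24; NOP = 21.6; effort ~ 1.66 person-months
```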
Object Points for Screens
Object Points for Reports
Object Point Complexity Weightings
Object Point Effort Conversion
Procedural Code-Oriented Approach
1. Envisage the number and type of programs
in the final system
2. Estimate the SLOC of each identified program
3. Estimate the work content, taking into
account complexity and technical difficulty
4. Calculate the work-days effort
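The four steps above can be sketched as follows. The program list, SLOC figures, and the SLOC-per-day rates by complexity band are illustrative assumptions.

```python
# Sketch of the procedural code-oriented approach: envisage programs,
# estimate SLOC each, weight by complexity, sum as work-days.
SLOC_PER_DAY = {"low": 60, "medium": 40, "high": 20}  # assumed rates

programs = [  # steps 1-2: envisaged programs and their estimated SLOC
    {"name": "data entry", "sloc": 1500, "complexity": "low"},
    {"name": "reporting",  "sloc": 2000, "complexity": "medium"},
    {"name": "scheduler",  "sloc": 1000, "complexity": "high"},
]

# steps 3-4: work content weighted by complexity, totalled in work-days
effort_days = sum(p["sloc"] / SLOC_PER_DAY[p["complexity"]]
                  for p in programs)
# 1500/60 + 2000/40 + 1000/20 = 25 + 50 + 50 = 125 work-days
```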
COCOMO (COnstructive COst MOdel) II :
A Parametric Productivity Model
• Based on SLOC (source lines of code), and operates according to
equations of the form:
effort = PM = C x (SLOC/1000)^P
where C is a coefficient derived from the effort factors and P is an exponent
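The equation can be sketched directly. The coefficient and exponent below are the classic basic-COCOMO "organic mode" values (C = 2.4, P = 1.05); COCOMO II instead derives them from scale factors and effort multipliers, so treat these numbers as illustrative only.

```python
# Sketch of the parametric COCOMO-style equation: PM = C * (KSLOC)^P.
def cocomo_effort(sloc, c=2.4, p=1.05):
    """Effort in person-months from SLOC, with assumed C and P."""
    return c * (sloc / 1000) ** p

pm = cocomo_effort(20_000)  # 2.4 * 20**1.05, roughly 56 person-months
```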