pm6 EffortEstimation
Software development and project management
Uploaded by likemeorkillme99

SOFTWARE DEVELOPMENT PROJECT MANAGEMENT
(CSC 4125)

Lecture 6: Software Effort Estimation


What makes a successful project?
Delivering:
• agreed functionality
• on time
• at the agreed cost
• with the required quality

Stages:
1. Set targets
2. Attempt to achieve targets

BUT what if the targets are not achievable?

A key point here is that developers may in fact be very competent, but
incorrect estimates that lead to unachievable targets will cause extreme
customer dissatisfaction.
Introduction
• A successful project is one that is delivered on time,
within budget and with the required quality.
– Targets are set which the project manager then tries to
meet. This assumes that the targets are reasonable.
– Realistic estimates are therefore crucial.
• A project manager has to produce estimates of:
– Effort (which affects costs), and
– Activity durations (which affect the delivery time)
Difficulties in Software Estimation
• Complexity & invisibility of software
• Novel applications of software
• Changing technology
• Lack of homogeneity of project experience
• Lack of standard definitions
• Political implications
• Subjective nature of estimating

Where Are Estimates Done?
Estimates are carried out at various stages of a software
project for a variety of reasons.
• Strategic Planning
• Feasibility Study
• System Specification
• Evaluation of Suppliers’ Proposals
• Project Planning
– Accuracy of estimates should improve as project proceeds
– Some speculation (assumptions) about physical
implementation may be necessary for estimation
Problems with Over- and Under-Estimates
• Over-Estimates: An over-estimate is likely to cause the project to take
longer than it would otherwise.
– Parkinson’s Law – ‘work expands to fill the time available’ (does not apply
universally, can be controlled)
– Brooks’ Law – ‘putting more people on a late job makes it later’
• Under-Estimates: Under-estimated projects might not be completed
on time or to cost. The danger with an under-estimate is its effect on
quality.
– Weinberg’s Zeroth Law of Reliability – ‘If a system does not have to be
reliable, it can meet any other objective’
– Demotivation and low productivity
– Burnout and turnover
• Having Realistic and Achievable Estimates is Critical
• Artificial Urgencies and Deadlines MUST be avoided
Basis for Software Estimating
• The need for Historical Data
– Most estimating methods need information about past
projects
– BUT, Care should be taken when applying because of
possible differences in factors (e.g., programming
languages, experience of staff)
• Measure of Work (Size, effort & time)
• Complexity

Sample Historical Data

Software Effort Estimation Techniques
• Algorithmic Models – use ‘effort drivers’
representing characteristics of the target system and
implementation environment to predict effort
• Expert Judgment – based on the advice of
knowledgeable staff
• Analogy – similar, completed projects are identified
and their actual effort is used as the basis of the estimate
(case-based reasoning, comparative)
• Parkinson – identifies the staff effort available to do
a project and uses that as an ‘estimate’
Software Effort Estimation Techniques (cont.)
• Price to Win – the ‘estimate’ is a figure that is
sufficiently low to win a contract
• Top-Down – an overall estimate is formulated for the
whole project and is then broken down into the effort
required for component tasks
• Bottom-Up – component tasks are identified & sized
and these individual estimates are aggregated
Bottom-Up Estimating
• Detailed Work Breakdown Structure (WBS) is made
• Effort for each bottom-level activity is estimated
• Estimates for bottom-level activities are added to get
estimates for upper-level activities until overall project
estimate is reached
• Identify all tasks that have to be done – so quite time-
consuming
• Appropriate at later, more detailed, stages of project
planning
• Advisable where a project is completely novel or there is
no historical data available

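The roll-up step described above can be sketched in a few lines of Python; the WBS, activity names, and effort figures here are purely illustrative:

```python
# Roll up bottom-level effort estimates (person-days) through a
# work breakdown structure represented as nested dicts.
def rollup(node):
    """Return total effort: leaves are numbers, inner nodes are dicts."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node

# Illustrative WBS: only the bottom-level activities are estimated directly.
wbs = {
    "design": {"architecture": 10, "ui_mockups": 5},
    "code":   {"backend": 20, "frontend": 15},
    "test":   {"unit": 8, "system": 12},
}

print(rollup(wbs))  # overall project estimate in person-days
```

Note that the quality of the overall figure depends entirely on the WBS being complete, which is why the approach is time-consuming.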
Top-Down Approach
• Normally associated with parametric/algorithmic models
• Effort will be related mainly to variables associated with
characteristics of the final system (and development
environment)
• Based on past project data
• Form of parametric model will normally be
effort = (system size) x (productivity rate)
• Important to distinguish between size models and effort
models
• After calculating the overall effort, proportions of that effort
are allocated to various activities
• Combinations of top-down and bottom-up estimation may
(and should) be used
Top-Down Approach
• Produce overall estimate using effort driver(s),
e.g. 100 days for the overall project
• Distribute proportions of the overall estimate to components:
– design: 30%, i.e. 30 days
– code: 30%, i.e. 30 days
– test: 40%, i.e. 40 days
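The distribution step shown above can be sketched as follows; the 100-day overall estimate and the 30/30/40 split are the figures from the slide:

```python
# Top-down: one overall estimate from an effort driver, then distributed
# to component activities by fixed proportions.
overall_days = 100
proportions = {"design": 0.30, "code": 0.30, "test": 0.40}

allocation = {activity: overall_days * share
              for activity, share in proportions.items()}
print(allocation)
```

In practice the proportions themselves come from historical data about how effort split across activities on past projects.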
Estimating by Analogy
• Also called Case-Based Reasoning
• Estimators seek out projects that have been completed
(source cases) that have similar characteristics to the new
project (target case)
• Actual effort for the source cases can be used as a base
estimate for the target
• Estimator then identifies differences between the target and
the source and adjusts the base estimate to produce an
estimate for the new project
• Historical data must include all relevant dimensions included
in the model

Estimating by Analogy (cont.)
• Problem is to identify similarities and differences
between applications where you have a large
number of past projects to analyze.
• One method is to use the shortest Euclidean distance to
identify the source case that is nearest the target:
Euclidean distance = square root of
[(target_parameter1 - source_parameter1)^2 + ... +
(target_parameterN - source_parameterN)^2]
Calculating Euclidean Distance:
Example 5.1 (page 113)
• Say that the cases are being matched on the basis of
two parameters, the number of inputs to and the
number of outputs from the application to be built.
The new project is known to require 7 inputs and 15
outputs. One of the past cases, project A, has 8
inputs and 17 outputs. The Euclidean distance
between the source and the target is therefore the
square root of ((7 - 8)^2 + (15 - 17)^2), that is 2.24
Exercise 5.6 (page 113): Calculating Euclidean Distance
& Deciding about the Best Analogy
• Project B has 5 inputs and 10 outputs. What would be the
Euclidean distance between this project and the target new
project being considered in Example 5.1?
Is Project B a better analogy with the target than Project A?
• Solution: The Euclidean distance between Project B and the
target case is the square root of ((7 - 5)^2 + (15 - 10)^2), that is 5.39.
Project A is therefore a closer analogy.
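Both calculations can be checked with a short Python sketch of the distance formula above:

```python
from math import sqrt

def euclidean_distance(target, source):
    """Distance between two projects described by matching parameter tuples."""
    return sqrt(sum((t - s) ** 2 for t, s in zip(target, source)))

target = (7, 15)       # new project: 7 inputs, 15 outputs
project_a = (8, 17)
project_b = (5, 10)

print(round(euclidean_distance(target, project_a), 2))  # 2.24
print(round(euclidean_distance(target, project_b), 2))  # 5.39
```

The source case with the smallest distance (here Project A) is taken as the closest analogy.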
Albrecht Function Point Analysis
• Top-Down method devised by Allan Albrecht and later
adopted by International Function Point Users Group (IFPUG)
• Quantifies the functional size of programs independently of
the programming language
• Based on five major components or ‘external user types’
• External Input Types
• External Output Types
• Logical Internal File Types
• External Interface File Types
• External Inquiry Types

Albrecht Function Point Analysis (cont.)
• Each instance of each external user type in the system
is identified
• Each component is then classified as having high,
average, or low complexity
• Counts of each external user type in each complexity
band are multiplied by specified weights and summed
to get Unadjusted FP (UFP) count
• Fourteen Technical Complexity Factors (TCFs) are then
applied in a formula to calculate the final FP count
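A minimal sketch of the UFP and TCF calculations. The weights used here are the standard published IFPUG values (the definitive ones are in the complexity tables on the following slides); the component counts are illustrative:

```python
# Unadjusted FP count: counts of each external user type per complexity
# band, multiplied by the published IFPUG weights and summed.
WEIGHTS = {                         # (low, average, high)
    "external_input":          (3, 4, 6),
    "external_output":         (4, 5, 7),
    "external_inquiry":        (3, 4, 6),
    "logical_internal_file":   (7, 10, 15),
    "external_interface_file": (5, 7, 10),
}

def unadjusted_fp(counts):
    """counts maps user type -> (n_low, n_average, n_high)."""
    return sum(n * w
               for utype, ns in counts.items()
               for n, w in zip(ns, WEIGHTS[utype]))

def adjusted_fp(ufp, total_degrees_of_influence):
    """Apply the 14 TCFs: FP = UFP * (0.65 + 0.01 * sum of influence ratings)."""
    return ufp * (0.65 + 0.01 * total_degrees_of_influence)

counts = {
    "external_input":          (2, 1, 0),   # 2 low + 1 average inputs, etc.
    "external_output":         (1, 0, 1),
    "external_inquiry":        (0, 2, 0),
    "logical_internal_file":   (1, 0, 0),
    "external_interface_file": (0, 1, 0),
}
ufp = unadjusted_fp(counts)
print(ufp, adjusted_fp(ufp, total_degrees_of_influence=35))
```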
FP File Type Complexity
(complexity table shown on slide)

FP External Input Complexity
(complexity table shown on slide)

FP External Output Complexity
(complexity table shown on slide)

FP Complexity Multipliers
(weightings table shown on slide)
Function Points Mark II
• Developed by Charles R. Symons
• Recommended by the Central Computer and Telecommunications Agency
(CCTA)
• Used by a minority of FP specialists in the UK
• UFPs = Wi x (number of input data element types) +
We x (number of entity types referenced) +
Wo x (number of output data element types)
– where Wi, We, and Wo are weightings derived from previous projects or
industry averages, normalized so they add up to 2.5
• It has 5 Technical Complexity Adjustment (TCA) factors in addition to
the 14 in the original Albrecht FP method
Model of a Transaction
(diagram shown on slide)

Calculating Mark II FP: An Example
(worked example shown on slide: UFP computed from the transaction counts)
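A sketch of the Mark II UFP calculation, assuming the industry-average weightings usually quoted (Wi = 0.58, We = 1.66, Wo = 0.26, which sum to 2.5); the transactions and their counts are illustrative:

```python
# Mark II UFP for a set of transactions: each transaction contributes
# weighted counts of input data element types, entity types referenced,
# and output data element types.
W_INPUT, W_ENTITY, W_OUTPUT = 0.58, 1.66, 0.26   # industry averages

def mark2_ufp(transactions):
    """transactions: iterable of (input_types, entities_referenced, output_types)."""
    return sum(W_INPUT * i + W_ENTITY * e + W_OUTPUT * o
               for i, e, o in transactions)

transactions = [
    (4, 2, 3),   # e.g. a 'record sale' transaction: 4 inputs, 2 entities, 3 outputs
    (2, 1, 5),
]
print(round(mark2_ufp(transactions), 2))
```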
Object Points
• Similarities with the FP approach, but takes account of more readily
identifiable features
• No direct bearing on object-oriented techniques, but can be used
for object-oriented systems as well
• Uses counts of screens, reports, and 3GL components, referred to
as objects
• Each object has to be classified as simple, medium, or difficult
• Numbers of objects at each level are multiplied by the appropriate
complexity weighting and summed to get an overall score
• The object point score can be adjusted to accommodate a reusability
factor
• Finally, the score is divided by a productivity rate (PROD) (derived from
historical data or industry averages) to calculate effort
Object Points for Screens
(complexity table shown on slide)

Object Points for Reports
(complexity table shown on slide)

Object Point Complexity Weightings
(weightings table shown on slide)

Object Point Effort Conversion
(PROD table shown on slide)
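The steps above can be sketched as follows. The weightings and reuse adjustment assumed here follow the commonly published COCOMO II application-point figures (check them against the tables on the preceding slides); the counts, reuse percentage, and PROD value are illustrative:

```python
# Object-point effort: weighted object counts, adjusted for reuse, then
# divided by a productivity rate PROD (object points per person-month).
WEIGHTS = {                       # (simple, medium, difficult)
    "screen":        (1, 2, 3),
    "report":        (2, 5, 8),
    "3gl_component": (10, 10, 10),   # 3GL components usually rated difficult
}

def object_points(counts):
    """counts: object type -> (n_simple, n_medium, n_difficult)."""
    return sum(n * w for t, ns in counts.items()
               for n, w in zip(ns, WEIGHTS[t]))

def effort_pm(op, percent_reuse, prod):
    nop = op * (100 - percent_reuse) / 100   # new object points after reuse
    return nop / prod

counts = {"screen": (5, 2, 0), "report": (2, 1, 0), "3gl_component": (1, 0, 0)}
op = object_points(counts)                # 5*1 + 2*2 + 2*2 + 1*5 + 1*10 = 28
print(effort_pm(op, percent_reuse=10, prod=13))   # person-months
```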
Procedural Code-Oriented Approach
1. Envisage the number and type of programs
in the final system
2. Estimate the SLOC of each identified program
3. Estimate the work content, taking into
account complexity and technical difficulty
4. Calculate the work-days effort

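The four steps can be sketched as follows; the program list, complexity multipliers, and productivity rate are all illustrative assumptions, not prescribed values:

```python
# Procedural code-oriented sketch: per-program SLOC estimates, scaled by
# a complexity/difficulty multiplier, converted to work-days of effort.
programs = [
    ("order_entry", 2000, 1.2),   # (name, estimated SLOC, complexity factor)
    ("reporting",   1500, 1.0),
]
SLOC_PER_DAY = 100                # assumed productivity rate

work_days = sum(sloc * cx for _, sloc, cx in programs) / SLOC_PER_DAY
print(work_days)
```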
COCOMO (COnstructive COst MOdel) II:
A Parametric Productivity Model
• Based on SLOC (source lines of code), and operates according to the
following equations:

Effort:             PM = Coefficient<EffortFactor> x (SLOC/1000)^P
Development Time:   DM = 2.50 x (PM)^T
Required Staffing:  ST = Effort (PM) / Development Time (DM)

where:
PM : person-months needed for project
SLOC : source lines of code
P : project complexity (1.04-1.24)
DM : duration time in months for project
T : SLOC-dependent coefficient (0.32-0.38)
ST : average staffing necessary

Software Project Type   Coefficient<EffortFactor>   P      T
Organic                 2.4                         1.05   0.38
Semi-detached           3.0                         1.12   0.35
Embedded                3.6                         1.20   0.32
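The equations and coefficient table above translate directly into code; the 32 KLOC input below is illustrative:

```python
# Basic COCOMO equations with the coefficient table from the slide.
COEFFS = {          # project type: (effort coefficient, P, T)
    "organic":       (2.4, 1.05, 0.38),
    "semi-detached": (3.0, 1.12, 0.35),
    "embedded":      (3.6, 1.20, 0.32),
}

def cocomo(sloc, project_type):
    c, p, t = COEFFS[project_type]
    pm = c * (sloc / 1000) ** p       # effort in person-months
    dm = 2.50 * pm ** t               # development time in months
    st = pm / dm                      # average staffing
    return pm, dm, st

pm, dm, st = cocomo(32_000, "organic")
print(f"{pm:.0f} person-months, {dm:.0f} months, {st:.1f} people")
```

For a 32 KLOC organic project this gives roughly 91 person-months over about 14 months, i.e. an average team of six to seven people.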
COCOMO applies to
Three classes of software projects
1) Organic projects
– "small" teams develop s/w in a highly familiar in-house
environment
– with "good" experience
– working with "less than rigid" requirements
2) Semi-detached projects
– "medium" teams
– with mixed experience
– working with a mix of rigid and less than rigid requirements
3) Embedded projects
– developed within a set of very "tight" constraints (hardware,
software, operational, ...)
– changes to system very costly
Summary
• Estimates are really management targets
• Collect as much information about previous projects as
possible
• Use more than one method of estimating
• Top-down approaches will be used at the earlier stages of
project planning while bottom-up approaches will be more
prominent later on
• Be careful about using other people’s historical productivity
data as a basis for your estimates, especially if it comes from
a different environment
• Seek a range of options
• Document your method of doing estimates and record all
your assumptions
