Notes 3 SPM


Some problems with estimating

• Subjective nature of much of estimating
  It may be difficult to produce evidence to support your precise target.
• Political pressures
  Managers may wish to reduce estimated costs to win support for acceptance of a project proposal.
• Changing technologies
  These bring uncertainties, especially in the early days when there is a 'learning curve'.
• Projects differ
  Experience on one project may not apply to another.
Where are the Estimates Done?

• Strategic planning: Project portfolio management involves estimating the costs and benefits of new applications in order to allocate priorities. Such estimates may also influence the scale of development staff recruitment.
• Feasibility study: This confirms that the benefits of the potential system will justify the costs.
• System specification: Most system development methodologies usefully distinguish between the definition of the user's requirements and the design which shows how those requirements are to be fulfilled. The effort needed to implement different design proposals will need to be estimated. Estimates at the design stage will also confirm that the feasibility study is still valid.
• Evaluation of suppliers' proposals: estimates of the likely effort help in judging whether a supplier's bid is realistic, as in the case of the IOE annual maintenance contracts subsystem.
• Project planning: As the planning and implementation of the project becomes more detailed, more estimates of smaller work components will be made. These will confirm earlier broad-brush estimates and will support more detailed planning, especially staff allocations.
The Basis for Software Estimating
 Basis for successful estimating
We need information about past projects: how big were they? How much effort and time did they need? We also need to be able to measure the amount of work involved in a new project.
 The need for historical data
 Parameters to be estimated
Size is a fundamental measure of work. Based on the estimated size, the project manager estimates two parameters:
• Effort
• Duration
Effort is measured in person-months: one person-month is the effort an individual can typically put in over a month.
 Measure of Work/Effort
The size of a project is not the number of bytes that the source code occupies, nor is it the size of the executable code. The project size is a measure of the problem's complexity in terms of the effort and time required to develop the product.
Two metrics are currently popular for measuring size:
• Source Lines of Code (SLOC)
• Function Points (FP)
Software Effort Estimation Techniques

1. Algorithmic models, which use 'effort drivers' representing characteristics of the target system and the implementation environment to predict effort.
2. Expert judgment, based on the advice of knowledgeable staff.
3. Analogy, where a similar, completed project is identified and its actual effort is used as the basis of the estimate.
4. Parkinson, where the staff effort available to do a project becomes the 'estimate'.
5. Price to win, where the 'estimate' is a figure that seems sufficiently low to win a contract.
6. Top-down, where an overall estimate for the whole project is broken down into the effort required for component tasks.
7. Bottom-up, where component tasks are identified and sized, and these individual estimates are aggregated.
Bottom-up Estimating

1. Break the project into smaller and smaller components.
2. Stop when you get to what one person can do in one or two weeks.
3. Estimate costs for the lowest-level activities.
4. At each higher level, calculate the estimate by adding the estimates for the lower levels.
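The aggregation in steps 3 and 4 can be sketched as a walk over a work breakdown structure; the task names and person-day figures below are illustrative only.

```python
# Bottom-up estimating: aggregate leaf-level activity estimates up a
# work breakdown structure (WBS). Names and figures are illustrative.

def total_effort(node):
    """Return total effort (person-days) by summing the leaf estimates."""
    if isinstance(node, dict):                 # internal WBS node
        return sum(total_effort(child) for child in node.values())
    return node                                # leaf: estimate in person-days

wbs = {
    "design":  {"data model": 5, "screen layouts": 8},
    "build":   {"database": 6, "reports": 4, "screens": 10},
    "testing": {"test plan": 3, "execution": 7},
}

print(total_effort(wbs))   # 43 person-days
```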
The Top-down Approach

Top-down estimating steps:
• Produce an overall estimate using effort driver(s).
• Distribute proportions of the overall estimate to components.

Bottom-up versus top-down

Bottom-up
• Use when you have no data about similar past projects.
• Identify all the tasks that have to be done – so quite time-consuming.
Top-down
• Produce an overall estimate based on project cost drivers and past project data.
• Divide the overall estimate between the jobs to be done.
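The top-down steps amount to apportioning one overall figure; the split used below is an assumed illustration, not a recommended distribution.

```python
# Top-down estimating: an overall estimate (from an effort driver or past
# project data) is apportioned to component tasks. Proportions are assumed
# here purely for illustration.

overall_estimate = 40.0   # person-months for the whole project

proportions = {
    "design": 0.30,
    "coding": 0.30,
    "testing": 0.40,
}

component_estimates = {task: overall_estimate * p
                       for task, p in proportions.items()}
print(component_estimates)   # {'design': 12.0, 'coding': 12.0, 'testing': 16.0}
```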
Algorithmic/Parametric models

The problem with COCOMO is that the input parameter for system size is
an estimate of lines of code. This is going to have to be an estimate at the
beginning of the project.
Function points, as will be seen, count various features of the logical design
of an information system and produce an index number that reflects the
amount of information processing it will have to carry out. This can be
crudely equated to the amount of code it will need.
Function points

 FP (Function Point) is the most widespread functional-type metric suitable for quantifying a software application. It is based on five user-identifiable logical "functions", which are divided into two data function types and three transactional function types. For a given software application, each of these elements is quantified and weighted by counting its characteristic elements, such as file references or logical fields.
 The resulting numbers (Unadjusted FP) are grouped into Added, Changed or Deleted function sets, and combined with the Value Adjustment Factor (VAF) to obtain the final number of FP.
Parametric models

Parametric estimating is a project estimation technique used to estimate cost, duration and effort on a project. It uses a set of algorithms, statistics or models to describe the project, and is one of the four primary methods that project managers use when estimating a project.
It works by correlating a parameter with a cost or time value. That correlation is scaled to the size of the project:

estimated effort = (system size) / productivity

While larger projects will likely require more complicated statistical models or algorithms, smaller projects follow a more straightforward formula:

E_parametric = a_old / p_old * p_curr

Here E_parametric is the parametric estimate, a_old is the historic amount of cost or time, p_old is the historic value of that parameter, and p_curr is the value of the parameter in your current project.
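The formula above can be applied directly; the function-point and effort figures below are made up for illustration.

```python
# Parametric estimate: scale a historic cost/time value by the ratio of
# the current parameter value to its historic value:
#   E_parametric = a_old / p_old * p_curr
# Figures are illustrative.

def parametric_estimate(a_old, p_old, p_curr):
    """a_old: historic effort/cost; p_old, p_curr: historic and current
    values of the correlated parameter (e.g. function points)."""
    return a_old / p_old * p_curr

# Past project: 2,000 function points took 100 person-months.
# Current project is sized at 2,600 function points.
print(parametric_estimate(100, 2000, 2600))   # 130.0 person-months
```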
Parametric Models

1. Albrecht/IFPUG function points
2. Symons/Mark II function points
3. COSMIC function points
4. COCOMO81 and COCOMO II
Estimating by Analogy

The source cases, in this situation, are completed projects. For each one, details of the factors that would have a bearing on effort are recorded. These might include lines of code, function points (or elements of the FP counts such as the numbers of inputs, outputs etc.), number of team members, and so on. The values for the new project are then used to find one or more instances from the past projects that match the current one. The actual effort from the past project becomes the basis of the estimate for the new project.
A problem is identifying the similarities and differences between applications when you have a large number of past projects to analyse.
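One simple way to automate the matching step is a nearest-neighbour search over the recorded factors; the factor values and efforts below are invented for illustration.

```python
# Estimating by analogy: find the completed project whose recorded factors
# are closest (by Euclidean distance) to the new project, and use its actual
# effort as the basis of the estimate. All figures are illustrative.

import math

past_projects = [  # factors = (inputs, outputs, entities accessed)
    {"factors": (10, 15, 6),  "effort": 30},   # effort in person-months
    {"factors": (25, 30, 12), "effort": 62},
    {"factors": (40, 45, 20), "effort": 110},
]

def closest_analogy(new_factors):
    """Return the past project with the smallest distance to new_factors."""
    return min(past_projects,
               key=lambda p: math.dist(p["factors"], new_factors))

print(closest_analogy((27, 28, 11))["effort"])   # 62
```

In practice more than one neighbour may be retrieved, and the analogue's effort is then adjusted for known differences between the projects.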
Albrecht/IFPUG function points

Albrecht worked at IBM and needed a way of measuring the relative productivity of different programming languages, which required some way of measuring the size of an application without counting lines of code. He identified five types of component, or functionality, in an information system, and counted the occurrences of each type of functionality in order to get an indication of the size of an information system.
Five function types/Major components

1. Logical interface file (LIF) types – equate roughly to a data store in systems analysis terms. Created and accessed by the target system.
2. External interface file (EIF) types – where data is retrieved from a data store that is actually maintained by a different application.
3. External input (EI) types – input transactions that update internal computer files.
4. External output (EO) types – transactions that extract and display data from internal computer files. Generally involve creating reports.
5. External inquiry (EQ) types – user-initiated transactions that provide information but do not update computer files. Normally the user inputs some data that guides the system to the information the user needs.
Albrecht complexity multipliers

The complexity of each instance of each 'user type' is assessed and a rating applied. Originally this assessment was largely intuitive, but later versions, developed by IFPUG (the International FP User Group), have rules governing how complexity is rated.

External user type   Low complexity   Medium complexity   High complexity
EI                          3                 4                  6
EO                          4                 5                  7
EQ                          3                 4                  6
LIF                         7                10                 15
EIF                         5                 7                 10
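A count is produced by multiplying the number of instances of each user type, at each complexity rating, by the weight in the table above, then summing. The instance counts in this sketch are illustrative.

```python
# Albrecht/IFPUG unadjusted function point count, using the complexity
# weights from the table above. The instance counts are illustrative.

WEIGHTS = {
    "EI":  {"low": 3, "medium": 4, "high": 6},
    "EO":  {"low": 4, "medium": 5, "high": 7},
    "EQ":  {"low": 3, "medium": 4, "high": 6},
    "LIF": {"low": 7, "medium": 10, "high": 15},
    "EIF": {"low": 5, "medium": 7, "high": 10},
}

def unadjusted_fp(counts):
    """counts: {(user type, complexity): number of instances}."""
    return sum(WEIGHTS[t][c] * n for (t, c), n in counts.items())

counts = {("EI", "low"): 3, ("EO", "medium"): 2, ("LIF", "high"): 1}
print(unadjusted_fp(counts))   # 3*3 + 2*5 + 1*15 = 34
```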
Symons/Function point mark II

Developed by Charles R. Symons ('Software Sizing and Estimating – Mk II FPA', Wiley & Sons, 1991), it builds on the work of Albrecht. It was designed to be compatible with SSADM and is mainly used in the UK. It has developed in parallel to IFPUG FPs and is a simpler method.
For each transaction (cf. use case), count the number of input types (not occurrences – e.g. where a table of payments is input on a screen, the account number is repeated a number of times but counts only once), the number of output types, and the number of entities accessed. Multiply each count by its weighting and sum. This produces an FP count for the transaction, which will not be very useful on its own. Sum the counts for all the transactions in an application and the resulting index value is a reasonable indicator of the amount of processing carried out. The number can be used as a measure of size rather than lines of code. See calculations of productivity.
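A sketch of the per-transaction count and the application total. The weightings 0.58 (input types), 1.66 (entities accessed) and 0.26 (output types) are the industry-standard Mark II values; the transactions themselves are invented for illustration.

```python
# Symons Mark II FP: weighted count per transaction, summed over the
# application. Weights are the industry-standard Mark II values; the
# transaction counts below are illustrative.

def mk2_fp(n_inputs, n_entities, n_outputs):
    """Mark II FP count for a single transaction."""
    return 0.58 * n_inputs + 1.66 * n_entities + 0.26 * n_outputs

transactions = [  # (input types, entities accessed, output types)
    (4, 2, 3),    # e.g. a hypothetical 'record payment' transaction
    (2, 1, 10),   # e.g. a hypothetical 'print statement' transaction
]

total = sum(mk2_fp(*t) for t in transactions)
print(round(total, 2))   # 11.84
```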
COSMIC Full Function Points

COSMIC (Common Software Measurement Consortium) counts movements of data groups, which can occur in four ways:
• Entries: movement of data into a software component from a higher layer or a peer component
• Exits: movements of data out
• Reads: data movement from persistent storage
• Writes: data movement to persistent storage
Each counts as 1 'COSMIC functional size unit' (Cfsu).
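Since every data movement counts as one Cfsu, sizing reduces to counting the four kinds of movement; the counts below are illustrative.

```python
# COSMIC sizing: each data movement (Entry, Exit, Read, Write) counts as
# 1 Cfsu, so the functional size is the total number of movements.
# Movement counts are illustrative.

movements = {"entries": 5, "exits": 4, "reads": 3, "writes": 2}
cfsu = sum(movements.values())
print(cfsu)   # 14 Cfsu
```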
COCOMO II

COCOMO II incorporates a range of sub-models, producing increasingly accurate estimates. The four sub-models are:
Application composition model. Used when software is composed from existing parts.
Early design model. Used when requirements are available but design has not yet started.
Reuse model. Used to compute the effort of integrating reusable components.
Post-architecture model. Used once the system architecture has been designed and more information about the system is available.
An updated version of COCOMO, it provides different models for estimating at the 'early design' stage and the 'post-architecture' stage when the final system is implemented. We'll look specifically at the first.
The core model is:

pm = A(size)^sf × (em1) × (em2) × (em3) …

where pm = person-months, A is 2.94, size is the number of thousands of lines of code, sf is the scale factor (applied as an exponent), and em_i is an effort multiplier.
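The core model is straightforward to evaluate once the scale factor and multipliers have been rated. In the sketch below, A = 2.94 comes from the model; the exponent and effort multiplier values are assumed purely for illustration.

```python
# COCOMO II core model: pm = A * (size)^sf * em1 * em2 * ...
# A = 2.94 and size is in KLOC. The scale-factor exponent and effort
# multipliers below are assumed values, for illustration only.

from math import prod

def cocomo2_effort(kloc, sf, effort_multipliers, A=2.94):
    """Return estimated effort in person-months."""
    return A * kloc ** sf * prod(effort_multipliers)

# 25 KLOC system, assumed exponent 1.10, two assumed effort multipliers.
pm = cocomo2_effort(25, 1.10, [1.17, 0.88])
print(round(pm, 1))   # roughly 104 person-months
```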
Cost Estimation

Project cost can be obtained by multiplying the estimated effort (in person-months, from the effort estimate) by the manpower cost per month. Implicit in this project cost computation is the assumption that the entire project cost is incurred on account of the manpower cost alone. However, in addition to manpower cost, a project would incur several other types of costs, which we shall refer to as the overhead costs. The overhead costs would include the costs of hardware and software required for the project and the company overheads for administration, office space, etc. Depending on the expected values of the overhead costs, the project manager has to suitably scale up the cost estimated using the COCOMO formula.
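The computation above can be sketched in a few lines; the rate and overhead uplift are assumed figures for illustration.

```python
# Project cost = estimated effort * manpower cost per person-month,
# scaled up for overheads (hardware, software, administration, office
# space). All figures are illustrative.

effort_pm = 104            # person-months, from the effort estimate
rate = 8000                # manpower cost per person-month (currency units)
overhead_factor = 1.25     # assumed 25% uplift for overhead costs

project_cost = effort_pm * rate * overhead_factor
print(project_cost)   # 1040000.0
```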
Capers Jones Estimating Rules of Thumb

 Formulated based on observations; they have no scientific basis.
 Because of their simplicity, these rules are handy for making off-hand estimates.
 They give an insight into many aspects of a project for which no formal methodologies exist yet.
Capers Jones’ Rules

Rule 1: SLOC–function point equivalence:
One function point = 125 SLOC for C programs.
Rule 2: Project duration estimation:
Function points raised to the power 0.4 predicts the approximate development time in calendar months.
Rule 3: Rate of requirements creep:
User requirements creep in at an average rate of 2% per month from the design through coding phases.
Rule 4: Defect removal efficiency:
Each software review, inspection, or test step will find and remove 30% of the bugs that are present.
Rule 5: Project manpower estimation:
The size of the software (in function points) divided by 150 predicts the approximate number of personnel required for developing the application.
Rule 6: Number of personnel for maintenance:
Function points divided by 500 predicts the approximate number of personnel required for regular maintenance activities.
Rule 7: Software development effort estimation:
The approximate number of staff-months of effort required to develop a software product is given by the software development time multiplied by the number of personnel required.
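Applying Rules 1, 2, 5 and 7 to a hypothetical 500-function-point C application gives a feel for how the rules chain together:

```python
# Capers Jones' rules of thumb applied to an assumed 500-function-point
# C application.

fp = 500

sloc     = fp * 125          # Rule 1: 62,500 SLOC of C
duration = fp ** 0.4         # Rule 2: ~12.0 calendar months
staff    = fp / 150          # Rule 5: ~3.3 developers
effort   = duration * staff  # Rule 7: ~40 staff-months

print(sloc, round(duration, 1), round(staff, 1), round(effort, 1))
```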
