Cost Estimates in SPM

The document discusses various methods for estimating effort for software projects, including expert judgment, analogous estimation, parametric estimation, bottom-up estimation, three-point estimation, Delphi technique, COCOMO, function point analysis, and story points in agile methodologies. It then provides more details on COCOMO, describing the basic, intermediate, and COCOMO II versions as well as the COCOMO estimation process.

Uploaded by Abhay Singh

Effort Estimations Methods

Effort estimation in software project management is crucial for planning, budgeting, and scheduling.
Various methods have been developed to tackle this challenge, each with its strengths and limitations.
Here are some of the primary methods used for effort estimation:

1. Expert Judgment

This method relies on the experience and intuition of seasoned project managers or team members.
Experts use their knowledge of similar past projects to estimate the effort required for new projects.
While subjective, it can be surprisingly accurate, especially when the experts have a deep understanding
of the domain and the specific technologies involved.

2. Analogous Estimation

Analogous estimation involves comparing the current project with past projects that are similar in size,
complexity, and functionality. This method assumes that the effort required for the current project will
be similar to that of the projects it's being compared to. It's quick and often used in the early stages of
project planning when detailed information is not yet available.

3. Parametric Estimation

Parametric estimation uses statistical models to estimate project effort. It involves identifying the
relationship between various project characteristics (parameters) and the effort required. These models
can be based on industry data or data from the organization's past projects. Examples include regression
models where effort is a function of lines of code, function points, or other quantifiable measures of
software size.

4. Bottom-Up Estimation

In bottom-up estimation, the project is broken down into smaller components or tasks, and the effort
for each is estimated individually. These individual estimates are then summed up to get the total project
effort. This method can be very accurate since it accounts for the details of the project, but it's also time-
consuming and requires a clear understanding of all the project's aspects.

5. Three-Point Estimation

The Three-Point Estimation technique, particularly when applied using the PERT (Program Evaluation
and Review Technique) formula, is a powerful method to estimate task durations or effort in project
management. It is especially useful in addressing the uncertainty and variability inherent in estimating
complex tasks. The technique requires three types of estimates for each task:

O (Optimistic): The best-case scenario where everything goes as smoothly as possible.

M (Most Likely): The most probable outcome, assuming a normal level of obstacles and opportunities.

P (Pessimistic): The worst-case scenario, considering potential challenges that could arise.

The formula for calculating the expected duration or effort (E) is:

E = (O + 4M + P) / 6

This formula not only gives a weighted average (with the most likely estimate receiving the highest
weight) but also balances the optimistic and pessimistic views, leading to a more balanced and realistic
estimation.
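As an illustration, the PERT calculation can be expressed as a small helper function (the function name and sample figures below are illustrative, not part of any standard library):

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Three-point (PERT) estimate: E = (O + 4M + P) / 6."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

# A task estimated at 4 (best case), 6 (most likely), 14 (worst case) person-days:
print(pert_estimate(4, 6, 14))  # → 7.0
```

Note that the most-likely value carries four times the weight of either extreme, which is why a pessimistic outlier shifts the estimate only modestly.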

6. Delphi Technique

The Delphi Technique is a structured communication technique, originally developed as a systematic, interactive forecasting method. It relies on a panel of experts who anonymously provide their estimates. After each round, the range of the estimates is shared, and the experts are allowed to adjust their estimates based on the information provided by the other participants. This process is repeated until a consensus is reached.

7. COCOMO (Constructive Cost Model)

COCOMO and its successors, like COCOMO II, are algorithmic models that use mathematical formulas
to estimate project effort based on project size (measured in lines of code or function points) and a set
of cost drivers that adjust the base effort estimation. These models provide a more systematic and
repeatable estimation process than some of the more qualitative methods.

8. Function Point Analysis (FPA)

Function Point Analysis is a method for measuring the functionality delivered by the project,
considering the user's external view of the system. The total count of function points is then used in
conjunction with historical productivity data (function points per person-month) to estimate the effort
required for the project.
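Once a function point count and a historical productivity rate are available, the effort calculation is a simple ratio. A minimal sketch, with hypothetical figures:

```python
def fpa_effort(function_points, fp_per_person_month):
    """Effort (person-months) = total function points / historical productivity."""
    return function_points / fp_per_person_month

# A hypothetical 500-FP project, with a historical rate of 10 FP per person-month:
print(fpa_effort(500, 10))  # → 50.0
```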

9. Story Points in Agile Methodologies

In agile methodologies, such as Scrum, effort estimation is often done using story points. This approach
estimates the effort for user stories based on their complexity, risk, and the amount of work required.
Story points allow teams to estimate the effort for tasks relative to each other without tying the estimates
to specific time durations.

Each of these methods has situations where it is most applicable, and many projects benefit from using
a combination of methods to cross-validate estimates and address different aspects of project planning
and uncertainty.

Effort Estimation Using COCOMO

In software engineering and project management, particularly within frameworks like COCOMO (Constructive Cost Model), effort refers to the amount of human labor and resources required to complete a software project, measured in person-months or person-hours. This metric is crucial for planning, budgeting, and scheduling software development projects. It helps project managers estimate how much work a project involves and how many people and how much time will be needed to complete it, which in turn drives the overall cost of the project.

Parametric Model (COCOMO Model)


COCOMO (Constructive Cost Model) is a parametric model used for estimating the cost, effort, and
schedule of software projects. Developed by Dr. Barry Boehm in 1981, COCOMO is one of the most
widely used and recognized parametric estimation models in the field of software engineering.
COCOMO estimates the effort and duration of a software project based on the size of the software
product, which is typically measured in thousands of lines of code (KLOC) or function points (FP). The
model uses these measurements as input parameters and applies them to a set of predefined equations
that also take into account various project and product attributes affecting productivity, such as software
complexity, team experience, and technology constraints.

Versions of COCOMO

COCOMO has evolved through different versions, each adding more sophistication and accuracy to the
estimation process:

1. Basic COCOMO: The original version provides a quick and rough estimate of effort based on
the size of the software and a set of three project classes (Organic, Semi-Detached, and
Embedded), each representing different levels of project difficulty.
2. Intermediate COCOMO: This version refines the basic model by introducing cost drivers that
account for various factors affecting productivity and project quality, such as developer
experience, software reliability requirements, and the use of modern tools and techniques.
3. COCOMO II: The most recent version of the model, COCOMO II, updates the model to
accommodate modern software development practices, including object-oriented approaches,
software reuse, and rapid development techniques. COCOMO II is more adaptable to a wider
range of projects and includes models for early prototyping, early design, and post-architecture
stages of project development.

COCOMO Estimation Process

The COCOMO estimation process involves the following steps:

• Size Estimation: Estimate the size of the software project in KLOC or function points.
• Model Selection: Choose the appropriate COCOMO model (Basic, Intermediate, or
COCOMO II) based on the project's needs and available information.
• Parameter Specification: For Intermediate and COCOMO II models, specify the values for
the cost drivers relevant to the project.
• Effort and Schedule Estimation: Apply the size and parameter values to the COCOMO
equations to estimate effort (in person-months) and project duration.

COCOMO's parametric nature allows for relatively quick and adaptable project estimates, making it a
valuable tool for project managers. However, like any estimation model, the accuracy of COCOMO's
predictions depends heavily on the quality of the input data and the appropriateness of the chosen model
and parameters for the specific project context.

Basic COCOMO

The Basic COCOMO model is the simplest form of the COCOMO series and estimates the software
development effort (and subsequently cost and duration) using a single equation for each type of project.
The projects are categorized into three classes to account for the varying levels of complexity and
difficulty:

1. Organic: Projects with a "small" team and relatively "simple" software solutions, typically
found in business or non-complex scientific applications where the team has a good
understanding of the application domain.
2. Semi-Detached: Projects that fall between organic and embedded, often with mixed
characteristics such as medium-sized teams or projects with varied experience levels and
moderately complex applications.
3. Embedded: Projects developed under tight constraints, with high levels of hardware,
software, operational, and interface complexity. This class is characteristic of
software that controls hardware devices, consumer electronics, or highly specialized
applications in aerospace and the military.

The Basic COCOMO formula for estimating the software development effort (E) in person-months is
given by:

E = a_b × (KLOC)^(b_b)

Where:

• E is the effort applied in person-months.
• KLOC is the estimated number of delivered lines of code for the project (in thousands).
• a_b and b_b are constants that vary depending on the project class (Organic, Semi-Detached,
Embedded).

After estimating the effort, the model can also estimate the project duration and staffing.

Advantages and Limitations

The strength of Basic COCOMO lies in its simplicity and the ease with which it can provide rough
estimates of project effort. However, its simplicity is also a limitation since it does not account for the
nuances and variations between individual projects beyond the basic classification. It assumes a fixed
relationship between project size and effort, which may not hold true for projects with significant
differences in technology, tools, and team dynamics.

For more detailed and nuanced estimation, Boehm developed extensions to the Basic COCOMO model,
namely Intermediate COCOMO and Detailed COCOMO, which consider additional factors and cost
drivers that can influence project effort and duration.

COCOMO and its variants remain a valuable tool in the arsenal of software project managers for initial
project estimation, especially when calibrated with contemporary data and adjusted for the specific
context of the projects they are managing.

Table of values for the constants:

Project Class    a_b    b_b
Organic          2.4    1.05
Semi-Detached    3.0    1.12
Embedded         3.6    1.20

'a' and 'b' are constants determined from historical data and represent the productivity and scale factors,
respectively.

The values of 'a' and 'b' can vary depending on factors such as the organization's development
environment, the type of projects being analyzed, and the characteristics of the development team.
However, they are typically derived from regression analysis of historical project data to provide a
baseline for effort estimation.

While the values of 'a' and 'b' may not change frequently, they are not fixed and may be adjusted over
time as new data becomes available or as the development environment evolves. Additionally, different
organizations or industries may use slightly different values for 'a' and 'b' based on their specific context
and experience.
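A minimal sketch of the Basic COCOMO calculation, using the constants from the table above. The duration constants (c, d) follow the commonly cited Basic COCOMO schedule equation D = c × E^d from Boehm's 1981 formulation and are included here as an assumption for illustration:

```python
# Basic COCOMO: E = a_b * KLOC ** b_b (person-months), D = c * E ** d (months)
COEFFICIENTS = {
    "organic":       {"a": 2.4, "b": 1.05, "c": 2.5, "d": 0.38},
    "semi-detached": {"a": 3.0, "b": 1.12, "c": 2.5, "d": 0.35},
    "embedded":      {"a": 3.6, "b": 1.20, "c": 2.5, "d": 0.32},
}

def basic_cocomo(kloc, project_class):
    """Return (effort in person-months, duration in months)."""
    p = COEFFICIENTS[project_class]
    effort = p["a"] * kloc ** p["b"]
    duration = p["c"] * effort ** p["d"]
    return effort, duration

effort, duration = basic_cocomo(32, "organic")
print(f"effort = {effort:.1f} PM, duration = {duration:.1f} months")
```

Average staffing then falls out as effort divided by duration.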

Intermediate COCOMO Model:


Intermediate COCOMO, an extension of the original COCOMO model, offers a more nuanced
approach to estimating software project effort and duration by considering additional factors known as
cost drivers. These cost drivers account for various aspects affecting productivity and project quality.
Here's an overview of Intermediate COCOMO and its key components:

Factors in Intermediate COCOMO:

1. Product Attributes:
• Required Software Reliability (RELY): Reflects the degree of importance placed on the
reliability of the software. It is rated on a scale from very low to very high.
• Size of the Application Database (DATA): Measures the size and complexity of the database
used by the application. Larger databases typically require more effort to develop and maintain.
• Product Complexity (CPLX): Evaluates the complexity of the product in terms of its
architecture, interfaces, and integration requirements. It is rated on a scale from very low to extra
high.

2. Hardware Attributes:
• These reflect constraints imposed by the target platform, such as execution-time and
main-storage limits, which increase effort as they tighten.

3. Personnel Attributes:
• Analyst Capability (ACAP): Reflects the skill and experience level of the analysts involved
in the project.
• Programmer Capability (PCAP): Measures the skill and experience level of the
programmers.
• Personnel Continuity (PCON): Accounts for the personnel turnover within the project team.

4. Project Attributes:
• Use of Modern Programming Practices (MODP): Reflects the extent to which modern
programming practices, tools, and techniques are employed in the project.
• Use of Software Tools (TOOL): Evaluates the sophistication and effectiveness of the software
development tools used.

Applying Intermediate COCOMO


1. Identify Cost Drivers: Project managers assess each cost driver based on their project's
characteristics and context.
2. Assign Rating Values: Rate each cost driver on a scale from very low to extra high, reflecting its
influence on project effort.
3. Calculate Adjustment Factors: Convert the rating values to adjustment factors using predefined
tables or formulas provided by COCOMO.
4. Estimate Effort and Duration: Apply the resulting adjustment factors together with the size of the
software (usually in thousands of lines of code) to estimate the effort in person-months and the
project duration.

Advantages of Intermediate COCOMO:

• Increased Precision: By considering a broader range of project attributes and cost drivers,
Intermediate COCOMO provides more accurate estimates compared to Basic COCOMO.
• Customization: The model allows project managers to tailor estimates based on their specific
project characteristics, team capabilities, and development environment.
• Reflects Industry Trends: By incorporating factors such as the use of modern tools and
techniques, Intermediate COCOMO stays relevant in the rapidly evolving software
development landscape.

Limitations:

• Complexity: Intermediate COCOMO introduces additional complexity compared to Basic


COCOMO, requiring more effort to assess and apply the various cost drivers accurately.
• Data Requirements: It relies on historical data or expert judgment to assign rating values to
cost drivers, which may not always be readily available or accurate.
• Subjectivity: Assessing the influence of cost drivers and assigning rating values involves a
degree of subjectivity and may vary between project managers.

Overall, Intermediate COCOMO offers a valuable refinement to the original COCOMO model by
incorporating a more comprehensive set of factors that influence software project effort and duration.
By carefully considering these factors, project managers can generate more precise estimates, leading
to better project planning and management.

In Intermediate COCOMO, the mathematical formula for estimating software project effort is similar
to the one used in Basic COCOMO, but it incorporates adjustment factors (also known as cost drivers)
to account for various project attributes. The formula for estimating effort (E) in person-months is given
as:

E = a_b × (KLOC)^(b_b) × EAF
Where:

• E is the effort applied in person-months.
• KLOC is the estimated number of delivered lines of code for the project (in thousands).
• a_b and b_b are constants specific to the project class (Organic, Semi-Detached, Embedded), as
in Basic COCOMO.
• EAF is the Effort Adjustment Factor, the product of all the adjustment factors
reflecting the project's attributes.

The Effort Adjustment Factor (EAF) is calculated based on the product's characteristics and the ratings
assigned to the various cost drivers. The formula for calculating EAF is:
EAF = EM_1 × EM_2 × … × EM_n
Where:

• n is the total number of cost drivers.
• EM_i is the effort multiplier (rating value) assigned to cost driver i, typically ranging from
about 0.7 (very low) to 1.5 (extra high).

Each cost driver rating (EM) is determined based on the project's attributes, such as required software
reliability, product complexity, personnel capabilities, and use of modern tools and techniques. These
ratings are then converted into adjustment factors using predefined tables provided by COCOMO.

Once all the adjustment factors are determined, they are multiplied together to calculate the overall
Effort Adjustment Factor (EAF). This EAF value is then applied to the Basic COCOMO effort
estimation formula to account for the project's specific attributes and characteristics.

In summary, the steps for mathematically calculating effort estimation in Intermediate COCOMO are
as follows:

1. Determine the project size (KLOC).
2. Determine the rating values (EM_i) for each cost driver based on project attributes.
3. Convert rating values to adjustment factors using predefined tables.
4. Calculate the Effort Adjustment Factor (EAF) by multiplying all adjustment factors.
5. Apply the EAF to the Basic COCOMO effort estimation formula to calculate the final effort
estimate.

This mathematical approach allows project managers to generate more accurate effort estimates by
considering a broader range of project attributes and their impact on software development effort.
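The five steps above can be sketched as a short function; the constants and multipliers passed in below are illustrative values, not calibrated data:

```python
from math import prod

def intermediate_cocomo(kloc, a_b, b_b, effort_multipliers):
    """E = a_b * KLOC**b_b * EAF, where EAF is the product of the
    cost-driver effort multipliers (EM_i)."""
    eaf = prod(effort_multipliers)
    return a_b * kloc ** b_b * eaf

# Hypothetical organic project, 50 KLOC, two non-nominal cost drivers:
print(round(intermediate_cocomo(50, 2.4, 1.05, [1.15, 0.85]), 1))
```

With no non-nominal drivers the product is 1.0 and the formula reduces to Basic COCOMO.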

Let's walk through a detailed case study to illustrate how Intermediate COCOMO can be applied to
estimate software project effort.

Case Study: Development of a Web Application


Project Overview:

A software development company, TechSolutions Inc., has been tasked with developing a web
application for a client in the e-commerce sector. The application will allow users to browse and
purchase products online. The project involves a team of developers, designers, and testers.

Project Attributes:

1. Size of the Application (KLOC): The estimated size of the application is 100,000 lines of code.
2. Product Complexity (CPLX): The application is moderately complex due to its requirement for
user authentication, product search functionality, shopping cart management, and payment
processing.
3. Required Software Reliability (RELY): Reliability is crucial for the application as it deals with
financial transactions. High reliability is required.
4. Analyst Capability (ACAP): The analysts in the team are experienced and skilled in gathering and
analyzing requirements.
5. Programmer Capability (PCAP): The programmers have average experience levels.
6. Personnel Continuity (PCON): The project team has a low turnover rate.
7. Use of Modern Programming Practices (MODP): Modern programming practices and tools are
moderately employed.
8. Use of Software Tools (TOOL): The team utilizes sophisticated software development tools and
integrated development environments (IDEs).

Cost Drivers and Ratings:

Based on the project attributes, let's assign rating values to each cost driver:

1. Required Software Reliability (RELY): Very High (1.15)
2. Size of the Application Database (DATA): Low (0.90)
3. Product Complexity (CPLX): Nominal (1.00)
4. Analyst Capability (ACAP): High (0.85)
5. Programmer Capability (PCAP): Nominal (1.00)
6. Personnel Continuity (PCON): Very High (1.10)
7. Use of Modern Programming Practices (MODP): Nominal (1.00)
8. Use of Software Tools (TOOL): High (0.90)

Calculation of Effort Adjustment Factor (EAF):

EAF = RELY × DATA × CPLX × ACAP × PCAP × PCON × MODP × TOOL
EAF = 1.15 × 0.90 × 1.00 × 0.85 × 1.00 × 1.10 × 1.00 × 0.90
EAF ≈ 0.8710

Calculation of Effort (E):

E = a_b × (KLOC)^(b_b) × EAF

Given that this is a semi-detached project: a_b = 3.0, b_b = 1.12, KLOC = 100.

E = 3.0 × (100)^1.12 × 0.8710
E = 3.0 × 173.78 × 0.8710
E ≈ 454.1 person-months

Conclusion

Based on the Intermediate COCOMO estimation, the effort required for the development of the web
application is approximately 454 person-months. This estimate can be used by TechSolutions Inc.
to plan resources, schedule activities, and budget for the project accordingly.

This case study demonstrates how Intermediate COCOMO can be applied to estimate software project
effort by considering various project attributes and their influence on development effort.
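The case-study arithmetic can be checked programmatically (the product of the eight multipliers listed above comes to about 0.871, and 100^1.12 ≈ 173.8):

```python
from math import prod

# Effort multipliers assigned in the case study
drivers = {"RELY": 1.15, "DATA": 0.90, "CPLX": 1.00, "ACAP": 0.85,
           "PCAP": 1.00, "PCON": 1.10, "MODP": 1.00, "TOOL": 0.90}

eaf = prod(drivers.values())          # effort adjustment factor
effort = 3.0 * 100 ** 1.12 * eaf      # semi-detached: a_b = 3.0, b_b = 1.12
print(f"EAF = {eaf:.4f}, effort = {effort:.1f} person-months")
```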

Sample effort multiplier values by cost driver rating:

Cost Driver                           Very Low  Low   Nominal  High  Very High  Extra High
Required Software Reliability           0.90    0.95   1.00    1.10    1.20       1.30
Size of Application Database            0.95    0.98   1.00    1.05    1.10       1.15
Product Complexity                      0.90    0.95   1.00    1.10    1.20       1.30
Analyst Capability                      0.85    0.90   1.00    1.10    1.20       1.25
Programmer Capability                   0.85    0.90   1.00    1.10    1.20       1.25
Personnel Continuity                    0.95    1.00   1.00    1.05    1.10       1.15
Use of Modern Programming Practices     0.95    0.98   1.00    1.05    1.10       1.15
Use of Software Tools                   0.95    0.98   1.00    1.05    1.10       1.15

COCOMO II
The COCOMO II (Constructive Cost Model II) is an advanced software cost estimation model
designed to provide an accurate projection of the efforts, time, and resources necessary for developing
a software project. It's an evolution of the original COCOMO model developed by Dr. Barry Boehm
in the 1980s, updated to accommodate the changes and advancements in software development
practices. COCOMO II offers a more flexible and detailed approach for estimating the cost of
software development projects, especially useful for modern software development that includes
concepts like rapid development, reuse, and non-sequential life cycles.

COCOMO II comprises three models, which are applied sequentially throughout the software
development process:

Models within COCOMO II:

COCOMO II comprises three distinct models designed to provide accurate cost estimations at different
stages of software development. Each model is tailored to the specific needs and available information
at its respective phase in the development cycle. Here's a detailed look at each model:

1. Application Composition Model


• When It's Used: Ideal for the early stages of development, especially when the project is more about
assembling existing components or modules. This stage is often seen in projects that rely on rapid
application development (RAD) methodologies or where software composition and integration are
key.
• How It Works: The core of the estimation process revolves around Object Points, which quantify the
size of the software based on the number of screens, reports, and third-party modules. These object
points are then adjusted based on various project and product attributes to estimate the effort required.
• Suitable For: Projects that emphasize speed and flexibility, where the main task involves integrating
existing software components or modules to create new applications.
2. Early Design Model
• When It's Used: This model comes into play once the project's requirements are defined but before
moving on to detailed design and coding. It's a critical phase for making broad estimations that help in
strategic planning, such as assessing project feasibility or deciding between different architectural
options.
• How It Works: The size of the software is estimated using Function Points or Object Points,
depending on the nature of the project. These points are then used alongside a set of cost drivers that
adjust for factors like project complexity, team capabilities, and other project-specific attributes,
converting them into an effort estimate measured in person-months.
• Suitable For: Providing early, strategic estimations that guide decision-making about the project's
direction, feasibility, and architectural approach before committing to detailed design.
3. Post-Architecture Model
• When It's Used: Activated after the software's architecture has been solidified and detailed design is
about to begin. This phase allows for the most precise and detailed cost estimation within COCOMO
II, making it highly valuable for planning and resource allocation for the remainder of the project.
• How It Works: The software size is typically measured in Lines of Code (LOC) or Function Points,
offering a clear view of the project's scope. The initial size estimate is refined using a comprehensive
set of cost drivers and scale factors, which account for the complexity of the project, the capabilities
of the development team, and the impact of the chosen architecture on development effort.
• Suitable For: Detailed planning and resource allocation. At this point, the project's architecture
provides a clear framework for estimating the effort and costs associated with development, allowing
for accurate budgeting and scheduling.

Each model within COCOMO II serves a unique purpose, tailored to the specific stage of
software development. They collectively ensure that project managers and development
teams can make informed decisions throughout the project lifecycle, from early planning and
feasibility studies to detailed design and development.

Key Features of COCOMO II

Software Reuse
• What It Means: COCOMO II understands that not every part of a new software project needs
to be built from scratch. Teams often use existing software components, either by directly
incorporating them into the project or by modifying them to fit new requirements. This reuse
can significantly reduce the effort and time needed for development.
• How It Works: The model adjusts the overall effort estimate based on the amount (how much
code) and type (completely new, slightly modified, or directly reused) of software being reused.
This means if your project can utilize a lot of existing components, the estimated effort and cost
will be lower.

Scale Factors
• What They Are: Scale factors are elements that reflect how various attributes of a project
impact its overall difficulty and, therefore, the effort required to complete it.
• Examples Include:
a. Project Size: Larger projects require more effort.
b. Complexity: More complex projects are harder to manage and develop.
c. Volatility: Projects with frequent changes (in requirements, design, etc.) are more
challenging to estimate and complete.
• Impact: These factors are used to adjust the base effort estimate to account for the size,
complexity, and other scaling impacts of the project.

Cost Drivers
• What They Are: Cost drivers are project-specific factors that influence the cost and effort
needed beyond the basic size of the software.
• Examples Include:
a. Team Experience: More experienced teams can work more efficiently.
b. Software Tools: The use of advanced development tools can speed up the development
process.
c. Documentation Quality: High-quality documentation can make the development
process smoother and faster.
• How They Work: Each cost driver adjusts the effort estimate up or down based on its presence
or quality in the project. For example, a highly experienced team might reduce the estimated
effort, while a project that requires extensive documentation might increase it.

Maintenance Model
• Purpose: Recognizes that a significant portion of software costs occur after the initial
development, during the maintenance phase.
• What It Does: Provides a way to estimate the effort required for maintaining software after its
initial release. This includes fixing bugs, updating features, and ensuring the software continues
to operate with new hardware or software environments.
• Importance: By including maintenance in the estimation, COCOMO II gives a more
comprehensive view of a software project's lifecycle costs.

COCOMO II provides a comprehensive framework that can be tailored to a wide range of software
projects, from small, simple applications to large, complex systems. It is better suited for today's
software development environment, where rapid development cycles, component-based assembly, and
software reuse are common.

Key Components of COCOMO II:

COCOMO II's methodology for estimating the cost and effort required for software
development projects is both comprehensive and adaptable, accommodating various types of
software projects from early development stages through maintenance. Here's a breakdown of
its key components, making it easier to understand for those not familiar with software
project estimation:

Size Estimation
• What It Is: The foundation of any cost estimation in COCOMO II. It quantifies the amount of work
involved in a software project.
• How It's Measured:
• KSLOC (Kilo Source Lines of Code): Estimates based on the lines of code expected in the
final product.
• Function Points (FP): A measure based on the functionality provided by the system, such as
inputs, outputs, user interactions, and the complexity of the system's internal operations.
• Object Points (OP): Similar to function points but used for projects developed with object-
oriented methods, considering the number of screens, reports, and third-party modules.
• Why It Matters: The size directly influences the effort and time estimates. A larger size typically
means more effort and longer duration.
Cost Drivers
• What They Are: Elements that can increase or decrease the effort needed beyond the basic workload
indicated by size.
• Types of Impact:
• Software Complexity: More complex algorithms or architectures can increase effort.
• Developer Capability: Highly skilled developers can reduce effort and time.
• Documentation Quality: Comprehensive and clear documentation can streamline
development.
• Software Tools: Advanced tools can accelerate development but might require additional
training.
• Impact Mechanism: Each driver adjusts the effort estimate by a factor rated from "very low" to
"extra high."
Scale Factors
• Difference From Cost Drivers: While cost drivers adjust effort linearly, scale factors impact the
effort non-linearly, meaning their effect increases more dramatically as the project scales up.
• What They Reflect: Attributes like project size, team cohesion, or the duration of the project that
globally impact the project's effort and duration.
• Use Case: Particularly important in the Post-Architecture phase of development when the project's
structure and team dynamics become clearer.
Effort Multipliers
• Function: These multipliers adjust the base effort estimate to reflect the influence of cost drivers.
• Application: Specific to various development phases and project characteristics, they are applied
directly to the size estimation to refine the effort required.
Equations
• Role: Empirical equations calculate the core metrics of a project: effort, duration, and staffing.
• Basis: These calculations consider size estimates, cost drivers, scale factors, and effort multipliers to
provide a comprehensive estimation.
• Outcome: The result is a detailed projection of how much effort (in person-months), how long
(duration), and how many people (staffing requirements) are needed to complete the project.

In essence, COCOMO II offers a structured approach to estimating the resources required for
software development, accommodating a wide range of project types and complexities. Its
adaptability lies in its ability to consider a variety of factors from the project's size and
complexity to the team's capabilities and the tools at their disposal, making it a valuable tool
for project managers and developers alike.

Effort & Duration Calculation using COCOMO II


Step 1: Determine the Size of the Software
There isn't a single formula for this step because it involves estimation techniques. The size
can be estimated in thousands of Source Lines of Code (KSLOC), Function Points (FP), or
Object Points (OP), depending on the information available and the stage of the project.

Step 2: Adjust for Complexity and Environment (Cost Drivers)


The adjustment for complexity and environment doesn't follow a single formula. Instead, each applicable cost driver contributes an effort multiplier (EM), and these are combined into a product that scales the nominal effort estimate in Step 4:

Effort Adjustment = ∏(Effort Multipliers)

Step 3: Consider Scale Factors


Scale factors (SF) are used to adjust the effort estimate to account for the size, complexity,
and other attributes of the project that impact its execution. These are applied in the effort
calculation formula directly.

Step 4: Calculate Effort Using the COCOMO II Model


The effort in person-months is calculated using the formula:

Effort = A × (Size)^E × ∏(Effort Multipliers)

• A is a constant factor, typically 2.94 for COCOMO II.
• E is an exponent derived from the scale factors, reflecting economies or diseconomies of scale: E = B + 0.01 × ∑(Scale Factors), where B is usually 0.91 for COCOMO II. Note that the scale factors enter the estimate only through this exponent, not as a separate multiplier.
• ∏(Effort Multipliers) is the product of the effort multipliers for all applicable cost drivers.
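
As a concrete sketch of this equation in Python: A = 2.94 and B = 0.91 are the standard COCOMO II constants, but the scale-factor ratings and effort multipliers below are illustrative placeholders, not values prescribed by the model.

```python
from math import prod

A, B = 2.94, 0.91  # standard COCOMO II calibration constants

def cocomo_effort(size_ksloc, scale_factors, effort_multipliers):
    """Effort in person-months: A * Size^E * product(EM),
    with E = B + 0.01 * sum(scale factors)."""
    E = B + 0.01 * sum(scale_factors)
    return A * size_ksloc ** E * prod(effort_multipliers), E

# Illustrative: five scale-factor ratings summing to 24 (so E = 1.15)
# and nominal cost drivers (every multiplier = 1.0).
effort, E = cocomo_effort(32, [3.72, 3.04, 4.24, 5.48, 7.52], [1.0] * 17)
print(round(E, 2), round(effort, 1))   # 1.15 158.2
```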

Step 5: Calculate Duration and Staffing


The duration is calculated with the formula:

Duration = 3.67 × (Effort)^F

where F = 0.28 + 0.2 × (E − B)

• E is the scale-factor exponent calculated in Step 4.
• B is the constant baseline value, usually 0.91 in COCOMO II.
• Effort is the person-month figure calculated in Step 4.

The formula takes into account the non-linear relationship between effort and time. The
number of people required (staffing) can be approximated by dividing the effort by the
duration.
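
The duration and staffing relationships can be sketched the same way; the constants 3.67, 0.28, and 0.2 are the standard COCOMO II schedule values, while the effort and exponent inputs are illustrative.

```python
B = 0.91  # COCOMO II baseline exponent constant

def cocomo_schedule(effort_pm, E):
    """Duration in months and average staffing, given effort
    (person-months) and the scale exponent E from Step 4."""
    F = 0.28 + 0.2 * (E - B)
    duration = 3.67 * effort_pm ** F
    staffing = effort_pm / duration   # average headcount
    return duration, staffing

duration, staffing = cocomo_schedule(158.2, 1.15)
print(round(duration, 1), round(staffing, 1))   # 19.3 8.2
```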

Step 6: Estimate Cost


To translate effort into cost, you would typically use a formula based on average labor rates:

Cost = Effort × Average Labor Rate


• The average labor rate will vary depending on geographic location, the specific skill sets required, and
other factors.

Example:
Assuming you have an adjusted size of 32 KSLOC (after applying cost drivers and
considering the software's complexity), and the project's scale factors suggest an E value of
1.15, the effort calculation would look something like this:

Effort = 2.94 × (32)^1.15

Effort ≈ 2.94 × 53.8 ≈ 158.2 person-months

Then, calculating the duration:

Duration = 3.67 × (158.2)^(0.28 + 0.2 × (1.15 − 0.91))
Duration ≈ 3.67 × (158.2)^0.328 ≈ 19.3 months

And if the average labor rate is $7,500 per person-month, the cost would be:
Cost = 158.2 × $7,500 ≈ $1,186,700
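
These figures can be checked by re-running Steps 4 through 6 numerically (A = 2.94 and B = 0.91 are the standard constants; the size, exponent, and labor rate are the example's inputs; note that 32^1.15 ≈ 53.8):

```python
A, B = 2.94, 0.91
size_ksloc, E, rate = 32, 1.15, 7500

effort = A * size_ksloc ** E         # Step 4: person-months
F = 0.28 + 0.2 * (E - B)
duration = 3.67 * effort ** F        # Step 5: months
cost = effort * rate                 # Step 6: dollars

print(round(effort, 1), round(duration, 1), round(cost))
```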

These steps offer a structured approach to estimating the effort, duration, and cost of software
projects using COCOMO II, providing vital insights for project planning and management.

Example
Imagine a project estimated to be 10 KSLOC with average complexity and environment
factors. Suppose after adjustments for cost drivers and scale factors, the effort calculation
equation looks like this:

• Assume A=2.94, E=1.05 (after adjusting for scale factors),


• and the product of cost drivers roughly equals 1 for simplicity.

The effort estimation would be:

Effort = 2.94 × (10)^1.05 ≈ 2.94 × 11.2 ≈ 33.0 person-months

If the average cost per person-month is $7,000, the estimated cost would be approximately:

33.0 person-months × $7,000/person-month = $231,000

This calculation provides a baseline for budgeting, but it's essential to periodically revisit and
adjust the estimate based on actual project performance and newly discovered information.

Case Study

To provide a comprehensive understanding of how cost estimation, specifically using a model like COCOMO II, applies in real-world scenarios, let's consider a detailed fictional case study of a software development project for a new customer relationship management (CRM) system.

Background
Company: XTech Innovations, a mid-sized software development firm
specializing in enterprise solutions.

Project: Developing a new CRM system tailored for small to medium-sized enterprises, focusing on ease of use, scalability, and integration with existing platforms like email marketing tools and social media analytics.

Initial Requirements and Size Estimation


Scope:

• Contact Management
• Sales Pipeline Tracking
• Customer Support & Service
• Marketing Campaign Management
• Reporting and Dashboards
• Integration with third-party services (email, social media, etc.)

Size Estimation: Given the lack of detailed requirements at the very beginning, the
project management team decides to use analogy-based estimation and expert
judgment, comparing the new CRM project to a past project of similar scope.
Based on their experience and initial requirements, they estimate the project to be
around 85,000 Source Lines of Code (SLOC).

Phase 1: Preliminary Cost Estimation


Using COCOMO II's Post-Architecture model (since some architecture decisions
have already been made), they begin with the initial size estimation:

• SLOC: 85,000

Cost Drivers (selected examples):

• Required Software Reliability (RELY): High
• Product Complexity (CPLX): High
• Personnel Experience (PREX): Nominal
• Platform Difficulty (PDIF): High
• Tool Use (TOOL): Good

Scale Factors (selected examples):

• Precedentedness (PREC): Low (new project type for XTech)
• Development Flexibility (FLEX): High (clients are flexible on requirements)
• Team Cohesion (TEAM): High

Calculation:

• Assume average values for cost drivers and scale factors not specified.
• Using the COCOMO II formula and the given parameters, the project management
team calculates an initial effort estimate.

This is simplified for illustration:


Effort = 2.94 × (85)^1.12 × ∏(Effort Multipliers)

(with the scale factors already reflected in the exponent 1.12)

Suppose this results in an estimated effort of approximately 300 person-months, considering all the adjustments for cost drivers.
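
As a sanity check on that figure, the nominal part of the calculation can be evaluated directly. The assumed 300 person-month total then implies a combined effort-multiplier product of roughly 0.70 (a net productivity gain); that multiplier is an assumption of the illustration, not a value derived from the named cost drivers.

```python
A = 2.94
nominal = A * 85 ** 1.12      # nominal effort before cost-driver adjustment
implied_em = 300 / nominal    # EM product implied by the assumed 300 PM
print(round(nominal, 1), round(implied_em, 2))
```
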
Phase 2: Detailed Estimation and Planning
After the initial estimation, XTech moves into detailed planning, breaking down
the project into components and revising the size estimation based on more
detailed design documents. They now have a better understanding of the
requirements and the technical challenges, leading to a revised SLOC of 100,000.

With this refined estimate, they recalculate the effort and duration, considering
more accurate cost drivers and scale factors. This results in a slight increase in the
estimated effort to 350 person-months.

Cost Estimation
Given the effort estimate, the next step is to calculate the cost. XTech uses an
average labor rate of $7,500 per person-month, which includes salaries, overheads,
and other expenses.

Cost = 350 person-months × $7,500 = $2,625,000

Project Execution and Adjustments


As the project progresses, XTech monitors actual effort versus estimated effort
closely. They encounter some challenges, such as higher than expected complexity
in integrating third-party services, which requires additional effort. To manage this,
they adjust their plans, increase efficiency through better tooling, and re-negotiate
some scope aspects with the client to stay within budget.

Conclusion
The CRM project is completed in 370 person-months, slightly over the initial
estimate but within a revised budget that accounted for discovered complexities.
The project's final cost is approximately $2,775,000.

This case study demonstrates the iterative nature of project estimation and the
importance of flexibility, continuous monitoring, and communication with
stakeholders. COCOMO II and similar models provide a structured approach to
estimation, but the art of project management lies in adapting to changes and
learning from experience.

Object Points

Object Points are a metric used in software development to estimate the size and
complexity of a software application. They are part of the COCOMO II model,
particularly relevant in the Application Composition model, which is used during
the early stages of project development. Object Points offer a way to measure
the functional size of a project based on the software's user interface and
underlying operations, making them particularly useful for projects where rapid
application development (RAD) techniques are employed, or for projects that use
iterative development.

Here’s a breakdown of how Object Points are calculated and used:

Components of Object Points


Object Points are derived from counting the number of screens (or reports), the
number of interfaces, and the complexity of each:

1. Screens (or User Interfaces): The total number of distinct screens or user
interfaces that the application will have. Each screen is categorized by complexity
(simple, medium, or complex) based on criteria such as the number of fields or the
amount of data processing required.
2. Reports: Similar to screens, this counts the total number of reports the software
will generate, with each report also classified as simple, medium, or complex based
on factors like layout complexity and data aggregation requirements.
3. Interfaces: The number of interfaces to other systems or applications. Interfaces
are also assessed for complexity, which might consider the amount of data
exchanged, the need for data transformation, and the communication protocols
used.
Calculating Object Points
After identifying and classifying the screens, reports, and interfaces, each
component is assigned a weight based on its complexity. For example:

• Simple Screen: 1 point
• Medium Screen: 2 points
• Complex Screen: 3 points

(These values are illustrative; actual weights might vary based on specific
methodologies or organizational standards.)

The total Object Points for a project are calculated by summing the weighted
counts of screens, reports, and interfaces. This total gives a quantifiable measure of
the application's size and complexity.

Using Object Points for Estimation


Once the total Object Points are determined, they can be used, along with historical
project data and adjustment factors (for productivity, team capability, etc.), to
estimate the effort and duration required for the project. This can help in resource
allocation, budgeting, and scheduling for the project.

Object Points provide a language- and technology-agnostic way to estimate project size, making them particularly useful for early project estimation and planning,
especially when detailed information about the project's technical implementation
is not yet available. They help bridge the gap between functional requirements and
the technical complexity of implementing those requirements, offering a structured
approach to early project estimation.

Function Points
Function Points are a standardized, technology-independent measure used to
estimate the size and complexity of software applications. Developed in the 1970s
by Allan Albrecht at IBM, the method has since been refined and adopted globally.
Function Points quantify the functionality delivered to the user, based
primarily on the logical design and user requirements, rather than on
technical complexity or the programming effort required. This makes Function
Points particularly useful for estimating projects early in the development cycle,
comparing productivity across projects and technologies, and benchmarking.

Components of Function Points


The core of the Function Point analysis involves evaluating five major components
of a software application:

1. External Inputs (EI): These are the operations where data enters the system from
external sources. An example is data entered by a user through a form on a
website.
2. External Outputs (EO): These refer to the operations where processed data is
sent back to the user or to another system. Reports, search results, and automated
email notifications are typical examples.
3. External Inquiries (EQ): These operations involve both an input and an output,
essentially a request for information where the output is directly related to the input
query without significant processing. A search feature could be classified as an
inquiry, provided it merely retrieves and displays data without substantial
manipulation.
4. Internal Logical Files (ILF): These represent the user-identifiable groups of logically related data or information maintained within the system. An example could be a user database.
5. External Interface Files (EIF): These are similar to ILFs but are used to refer to
logically related data that is used for reference purposes only and is maintained by
another system. These files are typically accessed or utilized by the software being
developed but are not directly maintained or controlled by it. An example is accessing a
third-party ZIP code database.
Calculating Function Points
The calculation process involves several steps:

1. Counting: Identify and count instances of the five components within the
application.
2. Weighting: Each of these components is then weighted according to its
complexity (simple, average, or complex). Complexity is determined by factors
like the number of data elements involved and the degree of interaction with other
components.
3. Summing: The weighted counts are summed to produce a total Function Point
count.
4. Adjusting: The total is optionally adjusted to account for various factors affecting
the project, such as the data communication needs, performance requirements, and
operational constraints. This adjustment is made using a Value Adjustment Factor
(VAF), which can increase or decrease the total Function Points.
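
The four steps above can be sketched as follows. The weights used here are the standard IFPUG values for average-complexity components (note they differ from the simplified weights used in the worked example later in this document), and the fourteen general system characteristic (GSC) ratings feeding the VAF are placeholders.

```python
# IFPUG average-complexity weights per component type.
AVG_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def function_points(counts, gsc_ratings):
    """counts: component type -> instances (all average complexity).
    gsc_ratings: 14 general system characteristics, each rated 0-5."""
    ufp = sum(AVG_WEIGHTS[c] * n for c, n in counts.items())   # steps 1-3
    vaf = 0.65 + 0.01 * sum(gsc_ratings)                       # step 4: VAF
    return ufp, ufp * vaf

ufp, adjusted = function_points(
    {"EI": 2, "EO": 2, "EQ": 1, "ILF": 2, "EIF": 1},
    [3] * 14,   # every GSC rated "average influence"
)
print(ufp, round(adjusted, 1))   # 49 52.4
```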
Using Function Points
The total Function Points provide a measure of the functional size of the software.
This measure can be used to estimate project effort, cost, and duration by
comparing it against historical data of similar projects. The beauty of Function
Points lies in their independence from programming language, development
methodology, or technology, which allows for consistent and objective
comparisons across different projects or teams.

In summary, Function Points offer a systematic approach to software size estimation that focuses on the functionality and value provided to the user,
facilitating early project estimation, productivity analysis, and benchmarking
across diverse software development environments.

Object Points Calculation Example


Imagine you're developing a small inventory management application. The
application has:

• User Interfaces (Screens):
• 2 Simple screens (e.g., login, simple listing of items): 1 point each
• 3 Medium screens (e.g., item details, inventory count): 2 points each
• 1 Complex screen (e.g., inventory analysis dashboard): 3 points
• Reports:
• 1 Simple report (e.g., low stock alert): 1 point
• 2 Medium reports (e.g., monthly sales, inventory turnover): 2 points each
• Interfaces:
• 1 Simple interface (e.g., import data from a CSV file): 1 point
• 1 Medium interface (e.g., sync data to a cloud database): 2 points

Calculating Object Points:

• Screens: (2×1) + (3×2) + (1×3) = 2 + 6 + 3 = 11 points
• Reports: (1×1) + (2×2) = 1 + 4 = 5 points
• Interfaces: (1×1) + (1×2) = 1 + 2 = 3 points
Total Object Points = 11 (screens) + 5 (reports) + 3 (interfaces) = 19 points
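
The tally above can be reproduced in a few lines, using the illustrative weights from this example (simple = 1, medium = 2, complex = 3):

```python
WEIGHTS = {"simple": 1, "medium": 2, "complex": 3}

def object_points(components):
    """components: list of (complexity, count) pairs."""
    return sum(WEIGHTS[complexity] * count for complexity, count in components)

screens = object_points([("simple", 2), ("medium", 3), ("complex", 1)])
reports = object_points([("simple", 1), ("medium", 2)])
interfaces = object_points([("simple", 1), ("medium", 1)])
total = screens + reports + interfaces
print(screens, reports, interfaces, total)   # 11 5 3 19
```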

Function Points Calculation Example


Let's consider a different scenario for a simple customer management system. To
calculate Function Points (FP), we analyze five components: External Inputs (EI),
External Outputs (EO), External Inquiries (EQ), Internal Logical Files (ILF), and
External Interface Files (EIF).

• External Inputs (EI): Entering new customer details, updating customer information - 2 EIs
• External Outputs (EO): Generating a customer report, sending email notifications
- 2 EOs
• External Inquiries (EQ): Searching for customer information - 1 EQ
• Internal Logical Files (ILF): Customer records database, user accounts database -
2 ILFs
• External Interface Files (EIF): Importing contact details from an external CRM
system - 1 EIF

Assuming each of these components has been evaluated as average complexity, they could be assigned the following complexity weights (for simplicity):

• EI, EO, EQ: 4 points each
• ILF, EIF: 7 points each

Calculating Function Points:

• EIs: 2×4 = 8 points
• EOs: 2×4 = 8 points
• EQs: 1×4 = 4 points
• ILFs: 2×7 = 14 points
• EIFs: 1×7 = 7 points

Total Function Points = 8 (EI) + 8 (EO) + 4 (EQ) + 14 (ILF) + 7 (EIF) = 41 points
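
The same tally in code, using the simplified weights assumed in this example (EI/EO/EQ = 4 points each, ILF/EIF = 7 points each):

```python
WEIGHTS = {"EI": 4, "EO": 4, "EQ": 4, "ILF": 7, "EIF": 7}
COUNTS = {"EI": 2, "EO": 2, "EQ": 1, "ILF": 2, "EIF": 1}

total = sum(WEIGHTS[c] * n for c, n in COUNTS.items())
print(total)   # 41
```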

Notes
• Object Points are particularly useful in environments emphasizing screen and
report generation, and when rapid application development (RAD) methodologies
are used. They provide a quick, early estimate of effort based on the number and
complexity of user interfaces, reports, and interfaces.
• Function Points focus more on the functionalities that the software provides from
a user's perspective, including data processing, data storage, and data inquiry
functionalities, regardless of how these functions are implemented technically.
Both these metrics serve as tools to estimate the size of a software development
project, which can then be used to estimate effort, cost, and duration. The choice
between Object Points and Function Points often depends on the project's nature,
the development stage, and the available information for estimation.

Rapid Application Development

RAD, or Rapid Application Development, is a software development methodology that emphasizes quick and iterative development cycles, aiming to produce high-quality systems with the least amount of delay. Introduced in the 1980s, RAD became popular as an alternative to traditional waterfall models, which often proved too rigid and slow for dynamic software projects. RAD focuses on involving users deeply in the development process, rapid prototyping, and the use of software development tools that allow for quick iterations.

Key Features of RAD


1. User Involvement: RAD involves users from the beginning to the end of the
development process. This continuous feedback loop helps in identifying and
incorporating user requirements accurately and promptly.
2. Prototyping: Instead of focusing on delivering a complete final product in one go,
RAD emphasizes the creation of prototypes. These are working models of the
system, developed quickly to demonstrate features and functionalities, allowing for
early user feedback and iterative refinement.
3. Iterative Development: RAD adopts an iterative approach to software
development, allowing for incremental development and refinement of the
software product. Each iteration incorporates user feedback and refines the
previous version of the prototype.
4. Timeboxing: Projects are divided into short, fixed time periods known as time
boxes, within which specific features are developed. This helps in managing
priorities and ensures that the project moves forward at a steady pace.
5. Use of Powerful Development Tools: RAD leverages software development tools
and environments that support rapid prototyping, reusability, and automatic code
generation. These tools help in speeding up the development process by
minimizing manual coding.
Advantages of RAD
• Speed: RAD significantly reduces development time, making it possible to meet
tight deadlines.
• Flexibility: The methodology allows for changes in requirements even in the later
stages of the development process.
• Increased User Satisfaction: Direct user involvement means the final product is
more likely to meet user needs and expectations.
• Risk Reduction: Early identification of potential issues and continuous refinement
help in mitigating risks effectively.
Disadvantages of RAD
• Resource Intensive: RAD often requires highly skilled developers and may
demand more resources than traditional methodologies.
• Not Suitable for Large Projects: The highly collaborative and iterative nature of
RAD can become challenging to manage in large-scale projects.
• Dependency on Strong Team and User Commitment: The success of RAD
projects heavily relies on the commitment and collaboration of both the
development team and the users.
Conclusion
RAD is an effective software development methodology for projects where speed
and user involvement are critical. While it offers significant advantages in terms of
flexibility and time to market, it's important to assess the nature of the project,
resource availability, and team capability to determine if RAD is the most suitable
approach.

Effort Estimation Methods:


Effort estimation in software project management is crucial for planning, budgeting, and
scheduling. Various methods have been developed to tackle this challenge, each with its
strengths and limitations. Here are some of the primary methods used for effort estimation:

1. Expert Judgment
This method relies on the experience and intuition of seasoned project managers or team
members. Experts use their knowledge of similar past projects to estimate the effort required
for new projects. While subjective, it can be surprisingly accurate, especially when the
experts have a deep understanding of the domain and the specific technologies involved.

2. Analogous Estimation
Analogous estimation involves comparing the current project with past projects that are
similar in size, complexity, and functionality. This method assumes that the effort required
for the current project will be similar to that of the projects it's being compared to. It's quick
and often used in the early stages of project planning when detailed information is not yet
available.

3. Parametric Estimation
Parametric estimation uses statistical models to estimate project effort. It involves identifying
the relationship between various project characteristics (parameters) and the effort required.
These models can be based on industry data or data from the organization's past projects.
Examples include regression models where effort is a function of lines of code, function
points, or other quantifiable measures of software size.

4. Bottom-Up Estimation
In bottom-up estimation, the project is broken down into smaller components or tasks, and
the effort for each is estimated individually. These individual estimates are then summed up
to get the total project effort. This method can be very accurate since it accounts for the
details of the project, but it's also time-consuming and requires a clear understanding of all
the project's aspects.

5. Three-Point Estimation
The Three-Point Estimation technique, particularly when applied using the PERT (Program
Evaluation and Review Technique) formula, is a powerful method to estimate task durations
or effort in project management. It is especially useful in addressing the uncertainty and
variability inherent in estimating complex tasks. The technique requires three types of
estimates for each task:

• O (Optimistic): The best-case scenario where everything goes as smoothly as possible.
• M (Most Likely): The most probable outcome, assuming a normal level of obstacles and opportunities.
• P (Pessimistic): The worst-case scenario, considering potential challenges that could arise.

The formula for calculating the expected duration or effort (E) is:

E = (O + 4M + P) / 6

This formula not only gives a weighted average (with the most likely estimate receiving the
highest weight) but also balances the optimistic and pessimistic views, leading to a more
balanced and realistic estimation.
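
A minimal sketch of the PERT calculation, together with the (P − O)/6 spread estimate that conventionally accompanies it; the task numbers are illustrative:

```python
def pert_estimate(optimistic, most_likely, pessimistic):
    """Expected effort: weighted mean with most-likely weighted 4x."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_std_dev(optimistic, pessimistic):
    """Conventional PERT estimate of the spread (standard deviation)."""
    return (pessimistic - optimistic) / 6

# A task with best case 2 days, most likely 4, worst case 12:
print(pert_estimate(2, 4, 12), round(pert_std_dev(2, 12), 2))   # 5.0 1.67
```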

6. Delphi Technique
The Delphi Technique is a structured communication technique, originally developed as a
systematic, interactive forecasting method. It relies on a panel of experts who anonymously
provide their estimates. After each round, the range of the estimates is shared, and the experts
are allowed to adjust their estimates based on the information provided by the other
participants. This process is repeated until a consensus is reached.

7. COCOMO (Constructive Cost Model)


COCOMO and its successors, like COCOMO II, are algorithmic models that use
mathematical formulas to estimate project effort based on project size (measured in lines of
code or function points) and a set of cost drivers that adjust the base effort estimation. These
models provide a more systematic and repeatable estimation process than some of the more
qualitative methods.

8. Function Point Analysis (FPA)


Function Point Analysis is a method for measuring the functionality delivered by the project,
considering the user's external view of the system. The total count of function points is then
used in conjunction with historical productivity data (function points per person-month) to
estimate the effort required for the project.

9. Story Points in Agile Methodologies


In agile methodologies, such as Scrum, effort estimation is often done using story points.
This approach estimates the effort for user stories based on their complexity, risk, and the
amount of work required. Story points allow teams to estimate the effort for tasks relative to
each other without tying the estimates to specific time durations.

Each of these methods has situations where it is most applicable, and many projects benefit
from using a combination of methods to cross-validate estimates and address different aspects
of project planning and uncertainty.
