
Software project management – Software economy
Dr. Siva
Parameters of software cost model
• Most software cost models can be abstracted into a function of five basic parameters:
size, process, personnel, environment, and required quality.
1. The size of the end product (in human-generated components), which is typically quantified in terms
of the number of source instructions or the number of function points required to develop the
required functionality
2. The process used to produce the end product, in particular the ability of the process to avoid non-value-adding activities (rework, bureaucratic delays, communications overhead)
3. The capabilities of software engineering personnel, and particularly their experience with the
computer science issues and the applications domain issues of the project
4. The environment, which is made up of the tools and techniques available to support efficient
software development and to automate the process
5. The required quality of the product, including its features, performance, reliability, and adaptability
The relationship among these parameters and the estimated cost can be written as follows:
Effort = (Personnel) × (Environment) × (Quality) × (Size^Process)
Effort grows exponentially with size, and the process determines the exponent. An exponent greater than 1 is a diseconomy of scale: doubling the size more than doubles the effort.
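The shape of this relationship can be sketched numerically. The coefficient values below are made up for illustration only; real models such as COCOMO calibrate them from historical project data.

```python
# Sketch of the parametric cost model:
# Effort = Personnel * Environment * Quality * Size^Process.
# All parameter values here are illustrative, not calibrated.

def effort(size_ksloc, process_exponent, personnel=1.0, environment=1.0, quality=1.0):
    """Estimated effort as a function of the five basic parameters."""
    return personnel * environment * quality * size_ksloc ** process_exponent

# Diseconomy of scale: with a process exponent > 1, doubling the size
# more than doubles the effort.
small = effort(50, 1.2)
large = effort(100, 1.2)
print(large / small)  # ~2.30, i.e. more than 2
```

The diseconomy disappears when the exponent reaches 1.0 or less, which is why process improvement (driving the exponent down) pays off most on large projects.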
Three generations of software development
IN the context of software economy,
1) Conventional: 1960s and 1970s, craftsmanship. Organizations used custom tools, custom processes,
and virtually all custom components built in primitive languages. Project performance was highly
predictable in that cost, schedule, and quality objectives were almost always underachieved.
2) Transition: 1980s and 1990s, software engineering. Organizations used more-repeatable processes and off-the-shelf tools, and mostly (>70%) custom components built in higher level languages.
Some of the components (<30%) were available as commercial products, including the operating
system, database management system, networking, and graphical user interface.
3) Modern practices: 2000 and later, software production. Modern practice is rooted in the use of managed and measured processes, integrated automation environments, and mostly (70%) off-the-shelf components. Perhaps as few as 30% of the components need to be custom built.
Point to be noted
Technologies for environment automation, size reduction, and process
improvement are not independent of one another.
In each new era, the key is complementary growth in all technologies.
For example, the process advances could not have been used successfully without new component technologies and increased tool automation.
Achieving ROI
Pragmatic software cost estimation
• Pragmatic: dealing with things sensibly and realistically, in a way that is based on practical rather than theoretical considerations.
• One critical problem in software cost estimation is a lack of well-
documented case studies of projects that used an iterative development
approach.
Software cost estimation
Software cost estimation is widely debated; three topics of the debate are of particular interest here:
1. Which cost estimation model should be used?
2. Should software size be measured in source lines of code or in function points?
3. What constitutes a good estimate?

• There are several popular cost estimation models, such as COCOMO, CHECKPOINT, ESTIMACS, KnowledgePlan, Price-S, ProQMS, SEER, SLIM, SOFTCOST, and SPQR/20.
• COCOMO is also one of the most open and well-documented cost estimation models.
• The general accuracy of conventional cost models (such as COCOMO) has been described as "within 20% of actuals, 70% of the time."
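As an illustration of how such a model is applied, here is a sketch of the basic COCOMO formulas (Boehm, 1981) with the published coefficients for the three project modes; the 32 KLOC sample project is hypothetical.

```python
# Basic COCOMO (Boehm, 1981): Effort = a * KLOC^b (staff-months),
# Schedule = c * Effort^d (calendar months).
COEFFS = {
    # mode: (a, b, c, d)
    "organic":      (2.4, 1.05, 2.5, 0.38),
    "semidetached": (3.0, 1.12, 2.5, 0.35),
    "embedded":     (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    a, b, c, d = COEFFS[mode]
    effort = a * kloc ** b        # staff-months
    schedule = c * effort ** d    # calendar months
    return effort, schedule

e, m = basic_cocomo(32, "organic")   # hypothetical 32 KLOC project
print(f"{e:.1f} staff-months over {m:.1f} months")
```

Given the "within 20% of actuals, 70% of the time" accuracy, such a number is a planning baseline, not a commitment.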
Two approaches
• Two definitions of cost:
• Target cost – the PM adjusts the size and the parameters until the target is achieved.
• 'Should' cost – the estimate follows objectively from the cost model's parameters, representing what the project should cost.
Attributes of a good software cost estimate
• A good software cost estimate has the following attributes:
 It is conceived and supported by the project manager, architecture team,
development team, and test team accountable for performing the work.
 It is accepted by all stakeholders as ambitious but realizable.
 It is based on a well-defined software cost model with a credible basis.
 It is based on a database of relevant project experience that includes similar
processes, similar technologies, similar environments, similar quality requirements,
and similar people.
 It is defined in enough detail so that its key risk areas are understood and the
probability of success is objectively assessed.
Extrapolating from a good estimate, an ideal estimate would be derived from a
mature cost model with an experience base that reflects multiple similar projects
done by the same team with the same mature processes and tools.
Improving the software economy
• The five basic parameters of the software cost model point to five ways of improving the software economy:
• 1. Reducing the size or complexity of what needs to be developed.
• 2. Improving the development process.
• 3. Using more-skilled personnel and better teams (not
necessarily the same thing).
• 4. Using better environments (tools to automate the process).
• 5. Trading off or backing off on quality thresholds.
REDUCING SOFTWARE PRODUCT SIZE
The most significant way to improve affordability and return on investment (ROI) is usually to
produce a product that achieves the design goals with the minimum amount of human-generated
source material.
Component-based development is introduced as the general term for reducing the "source" language size needed to achieve a software solution. Reuse, object-oriented technology, automatic code production, and higher order programming languages all focus on achieving a given system with fewer lines of human-specified source directives (statements).
Size reduction is the primary motivation behind improvements in higher order languages (such as C++, Ada 95, Java, Visual Basic), automatic code generators (CASE tools, visual modeling tools, GUI builders), reuse of commercial components (operating systems, windowing environments, database management systems, middleware, networks), and object-oriented technologies (Unified Modeling Language, visual modeling tools, architecture frameworks).
The reduction is defined in terms of human-generated source material: in general, when size-reducing technologies are used, they reduce the number of human-generated source lines.
Languages
• Universal function points (UFPs) are useful estimators for language-independent, early life-cycle estimates.
• The basic units of function points are external user inputs, external outputs,
internal logical data groups, external data interfaces, and external inquiries.
• SLOC metrics are useful estimators for software after a candidate solution is
formulated and an implementation language is known.
• Note: function point metrics provide a standardized method for measuring the various functions of a software application.
Language vs. SLOC per UFP (function point)
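The table of SLOC-per-UFP ratios did not survive conversion. As a stand-in, the sketch below uses approximate expansion ratios of the kind published by Capers Jones; the exact figures vary by source and should be treated as illustrative.

```python
# Approximate SLOC per unadjusted function point (illustrative values;
# published tables differ by source and language version).
SLOC_PER_UFP = {
    "Assembly": 320,
    "C": 128,
    "C++": 56,
    "Java": 55,
    "Visual Basic": 35,
}

def sloc_estimate(ufp, language):
    """Convert an early UFP estimate into a SLOC estimate once the language is known."""
    return ufp * SLOC_PER_UFP[language]

print(sloc_estimate(100, "C"))    # 12800
print(sloc_estimate(100, "C++"))  # 5600
```

This illustrates the size-reduction argument above: the same 100 function points of functionality imply far fewer human-generated source lines in a higher order language.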
OBJECT-ORIENTED METHODS AND VISUAL MODELING

Booch summarized three reasons that object-oriented technology is important:
1. An object-oriented model of the problem and its solution encourages a common vocabulary between the end users of a system and its developers, thus creating a shared understanding of the problem being solved.
2. The use of continuous integration creates opportunities to recognize risk early and make incremental
corrections without destabilizing the entire development effort.
3. An object-oriented architecture provides a clear separation of concerns among disparate elements of a
system, creating firewalls that prevent a change in one part of the system from rending the fabric of
the entire architecture.
Booch also summarized five characteristics of a successful object-oriented project.
1. A ruthless focus on the development of a system that provides a well understood collection of essential
minimal characteristics.
2. The existence of a culture that is centered on results, encourages communication, and yet is not afraid
to fail.
3. The effective use of object-oriented modeling.
4. The existence of a strong architectural vision.
5. The application of a well-managed iterative and incremental development life cycle.
Reuse characteristic
Commercial components
Process improvement – three perspectives
Process is an overloaded term. Three distinct process perspectives are:
 Metaprocess: an organization's policies, procedures, and practices for pursuing a software-intensive line of business. The focus of this process is on organizational economics, long-term strategies, and software ROI.
 Macroprocess: a project's policies, procedures, and practices for producing a complete software product within certain cost, schedule, and quality constraints. The focus of the macroprocess is on creating an adequate instance of the metaprocess for a specific set of constraints.
 Microprocess: a project team's policies, procedures, and practices for achieving an artifact of the software process. The focus of the microprocess is on achieving an intermediate product baseline with adequate quality and adequate functionality as economically and rapidly as practical.
Processes and their attributes
Team effectiveness
Boehm's five staffing principles are:
1. The principle of top talent: use better and fewer people.
2. The principle of job matching: fit the tasks to the skills and motivation of the people available.
3. The principle of career progression: an organization does best in the long run by helping its people to self-actualize.
4. The principle of team balance: select people who will complement and harmonize with one another.
5. The principle of phase-out: keeping a misfit on the team doesn't benefit anyone.
Attributes of a project manager
1. Hiring skills. Few decisions are as important as hiring decisions. Placing the right person in the right
job seems obvious but is surprisingly hard to achieve.
2. Customer-interface skill. Avoiding adversarial relationships among stakeholders is a prerequisite for
success.
3. Decision-making skill. The jillion books written about management have failed to provide a clear definition of this attribute. We all know a good leader when we run into one, and decision-making skill seems obvious despite its intangible definition.
4. Team-building skill. Teamwork requires that a manager establish trust, motivate progress, exploit eccentric prima donnas, transition average people into top performers, eliminate misfits, and consolidate diverse opinions into a team direction.
5. Selling skill. Successful project managers must sell all stakeholders (including themselves) on decisions and priorities, sell candidates on job positions, sell changes to the status quo in the face of resistance, and sell achievements against objectives. In practice, selling requires continuous negotiation, compromise, and empathy.
IMPROVING AUTOMATION THROUGH SOFTWARE ENVIRONMENTS
In conventional practice, the typical chronology of events in performance assessment was as follows:
 Project inception. The proposed design was asserted to be low risk with adequate performance
margin.
 Initial design review. Optimistic assessments of adequate design margin were based mostly on paper
analysis or rough simulation of the critical threads. In most cases, the actual application algorithms
and database sizes were fairly well understood.
 Mid-life-cycle design review. The assessments started whittling away at the margin, as early
benchmarks and initial tests began exposing the optimism inherent in earlier estimates.
 Integration and test. Serious performance problems were uncovered, necessitating fundamental
changes in the architecture. The underlying infrastructure was usually the scapegoat, but the real
culprit was immature use of the infrastructure, immature architectural solutions, or poorly understood
early design trade-offs.
Peer inspection
 Transitioning engineering information from one artifact set to another, thereby assessing
the consistency, feasibility, understandability, and technology constraints inherent in the
engineering artifacts
 Major milestone demonstrations that force the artifacts to be assessed against tangible
criteria in the context of relevant use cases
 Environment tools (compilers, debuggers, analyzers, automated test suites) that ensure
representation rigor, consistency, completeness, and change control
 Life-cycle testing for detailed insight into critical trade-offs, acceptance criteria, and
requirements compliance
 Change management metrics for objective insight into multiple-perspective change trends
and convergence or divergence from quality and progress goals
Function point method for software estimation
• Function Point Analysis (FPA) is a software estimation
technique used to measure the functional size of
the software work.
• It is a standardized method used commonly as an estimation technique in software engineering.
• FPA is used to estimate the software project, including its testing, in terms of the functionality or functional size of the software product.
• The method was first defined by Allan J. Albrecht in 1979 at IBM, and has since undergone several modifications, mainly by the International Function Point Users Group (IFPUG).
• Functional Point Analysis gives a dimensionless number, expressed in function points, which has been found to be an effective relative measure of the function value delivered to the customer.
• A systematic approach to measuring the different
functionalities of a software application is offered by
function point metrics.
Two types of functional point analysis
• Transaction type
• External Input (EI): EI processes data or control information that
comes from outside the application’s boundary. The EI is an
elementary process.
• External Output (EO): EO is an elementary process that generates
data or control information sent outside the application’s boundary.
• External Inquiries (EQ): EQ is an elementary process made up of
an input-output combination that results in data retrieval.
• Data Functional Type
• Internal Logical File (ILF): A user-identifiable group of logically
related data or control information maintained within the boundary
of the application.
• External Interface File (EIF): a group of user-recognizable, logically related data referenced by the application but maintained within the boundary of another application.
Advantages
• Functional Point helps in describing system complexity
and also shows project timelines.
• It is majorly used for business systems like information
systems.
• FP is language and technology independent, meaning it
can be applied to software systems developed using
any programming language or technology stack.
• All the factors mentioned above are given weights; these weights were determined through practical experiments and are shown in the following table.
Characteristics of Functional Point Analysis
We can calculate the function point count from the number and types of functions used in an application. These are classified into the five types listed above (EI, EO, EQ, ILF, EIF).

Weightages for attributes – An example
Example for computation

Measurement parameter                 Count   Simple  Average  Complex   Total count (using average)
Number of external inputs (EI)          32       3       4        6      32 × 4 = 128
Number of external outputs (EO)         60       4       5        7      60 × 5 = 300
Number of external inquiries (EQ)       24       3       4        6      24 × 4 = 96
Number of internal files (ILF)           8       7      10       15      8 × 10 = 80
Number of external interfaces (EIF)      2       5       7       10      2 × 7 = 14
Algorithms used                          –       –       –        –      –
Count total →                                                            618
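The unadjusted count can be reproduced in a few lines; the counts and average weights below are taken directly from the worked example.

```python
# Unadjusted function point count for the worked example,
# using the "average" weight for every parameter.
AVG_WEIGHT = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}
COUNTS     = {"EI": 32, "EO": 60, "EQ": 24, "ILF": 8, "EIF": 2}

count_total = sum(COUNTS[p] * AVG_WEIGHT[p] for p in COUNTS)
print(count_total)  # 618
```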
CAF – Complexity adjustment factor
Step 1 – The 14 questions to be answered for function point calculation:
1. Is reliable backup and recovery required?
2. Is data communication required?
3. Are there distributed processing functions?
4. Is performance critical?
5. Will the system run in an existing, heavily utilized operational environment?
6. Is on-line data entry required?
7. Does the on-line data entry require the input transaction to be built over multiple screens or operations?
8. Are the master files updated on line?
9. Are the inputs, outputs, files, or inquiries complex?
10. Is the internal processing complex?
11. Is the code designed to be reusable?
12. Are conversion and installation included in the design?
13. Is the system designed for multiple installations in different organizations?
14. Is the application designed to facilitate change and ease of use by the user?

Step 2 – Rate each question on the degree-of-influence scale and compute F = Σ(fi), the sum of the 14 ratings.
Scale: 0 – No influence, 1 – Incidental, 2 – Moderate, 3 – Average, 4 – Significant, 5 – Essential.
The typical rating is 3 (average).
From the above tables, the function point count is calculated with the following formula:
FP = Count-total × [0.65 + 0.01 × Σ(fi)]
   = Count-total × CAF
Here, the count-total is taken from the chart.
CAF = 0.65 + 0.01 × Σ(fi)
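Putting the pieces together: the sketch below applies the CAF formula to the count total of 618 from the worked example, with every one of the 14 factors rated 3 (average influence), the typical rating.

```python
# FP = count_total * CAF, where CAF = 0.65 + 0.01 * sum(f_i)
# and each f_i is a degree-of-influence rating from 0 to 5.
ratings = [3] * 14                 # illustrative: all factors rated "average"
caf = 0.65 + 0.01 * sum(ratings)   # 0.65 + 0.42 = 1.07
fp = 618 * caf                     # count total from the worked example
print(round(fp, 2))  # 661.26
```

Note that the CAF can swing the count by ±35%: from 0.65 when all ratings are 0 to 1.35 when all are 5.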
