Software Project Management – Software Economy 1.1
Dr. Siva
Parameters of software cost model
• Most software cost models can be abstracted into a function of five basic parameters:
size, process, personnel, environment, and required quality.
1. The size of the end product (in human-generated components), which is typically quantified in terms
of the number of source instructions or the number of function points required to develop the
required functionality
2. The process used to produce the end product, in particular the ability of the process to avoid non-value-adding activities (rework, bureaucratic delays, communications overhead)
3. The capabilities of software engineering personnel, and particularly their experience with the
computer science issues and the applications domain issues of the project
4. The environment, which is made up of the tools and techniques available to support efficient
software development and to automate the process
5. The required quality of the product, including its features, performance, reliability, and adaptability
The relationships among these parameters and the estimated cost can be written as follows:
Effort = (Personnel) × (Environment) × (Quality) × (Size^Process)
Because the process exponent is greater than 1.0, effort grows more than linearly with size: doubling the size more than doubles the effort. This is the diseconomy of scale of software development.
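A minimal numeric sketch of this relationship; the parameter values below are hypothetical placeholders, not calibrated to any real cost model:

```python
# Sketch of Effort = (Personnel)(Environment)(Quality)(Size^Process).
# All parameter values are hypothetical, chosen only to show the shape of the curve.
def estimated_effort(size_ksloc, personnel=2.5, environment=1.0,
                     quality=1.2, process_exponent=1.1):
    return personnel * environment * quality * size_ksloc ** process_exponent

small = estimated_effort(50)     # hypothetical 50 KSLOC project
large = estimated_effort(100)    # same project parameters, twice the size
print(f"50 KSLOC  -> effort {small:.0f}")
print(f"100 KSLOC -> effort {large:.0f}")
print(f"ratio     -> {large / small:.2f}")  # 2**1.1 ~= 2.14, not 2.0: diseconomy of scale
```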
Three generations of software development
In the context of software economics, three generations can be distinguished:
1) Conventional: 1960s and 1970s, craftsmanship. Organizations used custom tools, custom processes,
and virtually all custom components built in primitive languages. Project performance was highly
predictable in that cost, schedule, and quality objectives were almost always underachieved.
2) Transition: 1980s and 1990s, software engineering. Organizations used more-repeatable processes and off-the-shelf tools, and mostly (>70%) custom components built in higher-level languages.
Some of the components (<30%) were available as commercial products, including the operating
system, database management system, networking, and graphical user interface.
3) Modern practices: 2000 and later, software production. Modern practice is rooted in the use of managed and measured processes, integrated automation environments, and mostly (70%) off-the-shelf components. Perhaps as few as 30% of the components need to be custom built.
Point to be noted
Technologies for environment automation, size reduction, and process
improvement are not independent of one another.
In each new era, the key is complementary growth in all technologies.
For example, process advances could not have been used successfully without new component technologies and increased tool automation.
Achieving ROI
Pragmatic software cost estimation
• Pragmatic? Dealing with things sensibly and realistically, in a way that is based on practical rather than theoretical considerations.
• One critical problem in software cost estimation is a lack of well-
documented case studies of projects that used an iterative development
approach.
Software cost estimation
Three topics in the ongoing debates about cost estimation are of particular interest here (the first two are sketched after this list):
1. Which cost estimation model to use.
2. Whether to measure software size in source lines of code or in function points.
3. What constitutes a good estimate.
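For the first two topics, a small sketch using the basic COCOMO model (one widely published cost estimation model) shows how a size measure drives an effort estimate; the coefficients are the published basic-COCOMO values for an "organic" project class, and the 32 KSLOC size is hypothetical.

```python
# Basic COCOMO (Boehm, 1981), "organic" project class; coefficients are the
# published basic-COCOMO values and are used here purely for illustration.
A, B = 2.4, 1.05     # effort   = A * KSLOC**B   (person-months)
C, D = 2.5, 0.38     # schedule = C * effort**D  (elapsed months)

def basic_cocomo(ksloc):
    effort = A * ksloc ** B
    schedule = C * effort ** D
    return effort, schedule

effort, months = basic_cocomo(32)          # hypothetical 32 KSLOC project
print(f"effort   : {effort:.1f} person-months")
print(f"schedule : {months:.1f} months")
```

Function-point-based models follow the same pattern, with size expressed in function points (or converted to equivalent SLOC) rather than in lines of code.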
Booch identified three reasons that certain object-oriented projects succeed:
1. An object-oriented model of the problem and its solution encourages a common vocabulary between the end users of a system and its developers, thus creating a shared understanding of the problem being solved.
2. The use of continuous integration creates opportunities to recognize risk early and make incremental
corrections without destabilizing the entire development effort.
3. An object-oriented architecture provides a clear separation of concerns among disparate elements of a
system, creating firewalls that prevent a change in one part of the system from rending the fabric of
the entire architecture.
Booch also summarized five characteristics of a successful object-oriented project.
1. A ruthless focus on the development of a system that provides a well understood collection of essential
minimal characteristics.
2. The existence of a culture that is centered on results, encourages communication, and yet is not afraid
to fail.
3. The effective use of object-oriented modeling.
4. The existence of a strong architectural vision.
5. The application of a well-managed iterative and incremental development life cycle.
Reuse characteristic
Commercial components
Process improvement: three perspectives
Process is an overloaded term. Three distinct process perspectives are:
Metaprocess: an organization's policies, procedures, and practices for pursuing a software-intensive line of
business. The focus of this process is on organizational economics, long-term strategies, and software ROI.
Macroprocess: a project's policies, procedures, and practices for producing a complete software product within certain cost, schedule, and quality constraints. The focus of the macroprocess is on creating an adequate instance of the metaprocess for a specific set of constraints.
Microprocess: a project team's policies, procedures, and practices for achieving an artifact of the software process. The focus of the microprocess is on achieving an intermediate product baseline with adequate quality and adequate functionality as economically and rapidly as practical.
Processes and their attributes
Team effectiveness
Boehm's five staffing principles are:
1. The principle of top talent: Use better and fewer people
2. The principle of job matching: Fit the tasks to the skills and motivation of the people available.
3. The principle of career progression: An organization does best in the long run by helping its people
to self-actualize.
4. The principle of team balance: Select people who will complement and harmonize with one another.
5. The principle of phase-out: Keeping a misfit on the team doesn't benefit anyone.
Attributes of a project manager
1. Hiring skills. Few decisions are as important as hiring decisions. Placing the right person in the right
job seems obvious but is surprisingly hard to achieve.
2. Customer-interface skill. Avoiding adversarial relationships among stakeholders is a prerequisite for
success.
3. Decision-making skill. The jillion books written about management have failed to provide a clear definition of this attribute. We all know a good leader when we run into one, and decision-making skill seems obvious despite its intangible definition.
4. Team-building skill. Teamwork requires that a manager establish trust, motivate progress, exploit eccentric prima donnas, transition average people into top performers, eliminate misfits, and consolidate diverse opinions into a team direction.
5. Selling skill. Successful project managers must sell all stakeholders (including themselves) on decisions and priorities, sell candidates on job positions, sell changes to the status quo in the face of resistance, and sell achievements against objectives. In practice, selling requires continuous negotiation, compromise, and empathy.
IMPROVING AUTOMATION THROUGH SOFTWARE ENVIRONMENTS
The typical chronology of events in performance assessment has been as follows:
Project inception. The proposed design was asserted to be low risk with adequate performance
margin.
Initial design review. Optimistic assessments of adequate design margin were based mostly on paper
analysis or rough simulation of the critical threads. In most cases, the actual application algorithms
and database sizes were fairly well understood.
Mid-life-cycle design review. The assessments started whittling away at the margin, as early
benchmarks and initial tests began exposing the optimism inherent in earlier estimates.
Integration and test. Serious performance problems were uncovered, necessitating fundamental
changes in the architecture. The underlying infrastructure was usually the scapegoat, but the real
culprit was immature use of the infrastructure, immature architectural solutions, or poorly understood
early design trade-offs.
Peer inspection
Peer inspections are a useful but secondary quality mechanism; the primary mechanisms for assessing quality include:
• Transitioning engineering information from one artifact set to another, thereby assessing the consistency, feasibility, understandability, and technology constraints inherent in the engineering artifacts
• Major milestone demonstrations that force the artifacts to be assessed against tangible criteria in the context of relevant use cases
• Environment tools (compilers, debuggers, analyzers, automated test suites) that ensure representation rigor, consistency, completeness, and change control
• Life-cycle testing for detailed insight into critical trade-offs, acceptance criteria, and requirements compliance
• Change management metrics for objective insight into multiple-perspective change trends and convergence or divergence from quality and progress goals
Function point method for software estimation
• Function Point Analysis (FPA) is a software estimation technique used to measure the functional size of the software work.
• It is a standardized method commonly used as an estimation technique in software engineering.
• FPA is used to estimate a software project, including its testing, in terms of the functionality or functional size of the software product.
• The method was first defined by Allan J. Albrecht in 1979 at IBM and has since undergone several modifications, mainly by the International Function Point Users Group (IFPUG).
• Function Point Analysis gives a dimensionless number, expressed in function points, which has been found to be an effective relative measure of the function value delivered to the customer.
• Function point metrics offer a systematic approach to measuring the different functionalities of a software application.
Two types of function point components
• Transaction function types
• External Input (EI): EI processes data or control information that
comes from outside the application’s boundary. The EI is an
elementary process.
• External Output (EO): EO is an elementary process that generates
data or control information sent outside the application’s boundary.
• External Inquiries (EQ): EQ is an elementary process made up of
an input-output combination that results in data retrieval.
• Data function types
• Internal Logical File (ILF): A user-identifiable group of logically
related data or control information maintained within the boundary
of the application.
• External Interface File (EIF): A user-identifiable group of logically related data or control information referenced by the application but maintained within the boundary of another application.
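As a concrete illustration of these five component types, consider a hypothetical order-entry application; the features below are invented for illustration:

```python
# Hypothetical order-entry application: each feature mapped to its FPA component type.
components = {
    "Enter new order":             "EI",   # data crosses the boundary inward
    "Print monthly sales report":  "EO",   # derived data sent outside the boundary
    "Look up order status":        "EQ",   # input/output pair, retrieval only
    "Order master file":           "ILF",  # maintained inside the application boundary
    "Tax-rate table (external)":   "EIF",  # referenced here, maintained by another system
}
for feature, kind in components.items():
    print(f"{kind:>3}  <- {feature}")
```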
Advantages
• Function points help describe system complexity and can also be used to derive project timelines.
• FPA is mainly used for business systems such as information systems.
• FP is language and technology independent, meaning it
can be applied to software systems developed using
any programming language or technology stack.
• All the component types mentioned above are given weights; these weights were determined through practical experiment (a sketch using the standard IFPUG weights follows this list).
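The weight table referenced above is not reproduced in these notes; the sketch below uses the standard IFPUG low/average/high weights for the five component types, applied to hypothetical component counts, to produce an unadjusted function point count:

```python
# Unadjusted function point (UFP) count using the standard IFPUG weights.
# Component counts below are hypothetical, for illustration only.
WEIGHTS = {             # (low, average, high) complexity weights
    "EI":  (3, 4, 6),   # External Inputs
    "EO":  (4, 5, 7),   # External Outputs
    "EQ":  (3, 4, 6),   # External Inquiries
    "ILF": (7, 10, 15), # Internal Logical Files
    "EIF": (5, 7, 10),  # External Interface Files
}

# Hypothetical application: counts per type, all rated average complexity.
counts = {"EI": 24, "EO": 16, "EQ": 22, "ILF": 4, "EIF": 2}

ufp = sum(counts[t] * WEIGHTS[t][1] for t in counts)   # index 1 = average weight
print(f"Unadjusted function points: {ufp}")
# 24*4 + 16*5 + 22*4 + 4*10 + 2*7 = 318
```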
Characteristics of Function Point Analysis
We can calculate the function point count from the number and types of functions used in the application, adjusted by the degree of influence of 14 general system characteristics (fi).
Compute ⅀(fi) by rating each of the 14 general-system-characteristic questions on the degree-of-influence scale below; a simplified shortcut is to multiply the scale value by the number of questions answered 'Yes', with 3 (Average) as the typical value.
Scale: 0 - No Influence, 1 - Incidental, 2 - Moderate, 3 - Average, 4 - Significant, 5 - Essential
From the above, the function point count is calculated with the following formula:
FP = Count-total * [0.65 + 0.01 * ⅀(fi)]
   = Count-total * CAF
where Count-total is the unadjusted function point count taken from the weighting table, and
CAF = [0.65 + 0.01 * ⅀(fi)] is the complexity adjustment factor.
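Continuing with the hypothetical unadjusted count from the earlier sketch, the adjusted function point count would be computed as follows; the 14 general-system-characteristic ratings are invented for illustration:

```python
# Value adjustment: 14 general system characteristics, each rated 0-5.
# Ratings below are hypothetical.
ratings = [3, 4, 2, 3, 5, 3, 4, 2, 3, 3, 4, 2, 3, 1]   # sums to 42
assert len(ratings) == 14

count_total = 318                        # unadjusted FP from the previous sketch
caf = 0.65 + 0.01 * sum(ratings)         # complexity adjustment factor
fp = count_total * caf

print(f"Sum(fi) = {sum(ratings)}")       # 42
print(f"CAF     = {caf:.2f}")            # 0.65 + 0.42 = 1.07
print(f"FP      = {fp:.1f}")             # 318 * 1.07 = 340.3
```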