Unit 1 Chap 2

The document discusses the evolution of software process improvement, emphasizing the shift from conventional methods like the Waterfall Model to modern practices that focus on iterative development and component-based approaches. It highlights the importance of software economics, cost estimation, and the need for effective management and team dynamics to enhance productivity and quality. Key principles for successful software engineering are outlined, including the necessity of early customer involvement, quality assurance, and adapting to change.

UNIT - I

Introduction
Over the past ten years, a typical goal of software process improvement in many companies has been to achieve a 2x, 3x, or even 10x increase in productivity, quality, time to market, or some combination of the three, where x corresponds to how well the company does now.
The Old Way (Conventional SPM)
 The Waterfall Model
 Conventional Software Management Performance

Evolution of Software Economics
 Software Economics
 Pragmatic Software Cost Estimation

Improving Software Economics
 Reducing Software Product Size
 Improving Software Processes
 Improving Team Effectiveness
 Improving Automation through Software Environments
 Achieving Required Quality
 Peer Inspections: A Pragmatic View

The Old Way and the New
 The Principles of Conventional Software Engineering
 The Principles of Modern Software Management
 Transitioning to an Iterative Process
The Old Way
Software Crisis

 "The best thing about software is its flexibility": it can be programmed to do almost anything.

 "The worst thing about software is also its flexibility": the "almost anything" characteristic has made it difficult to plan, monitor, and control software development.
The Old Way
The Waterfall Model

Phases: system requirements → software requirements → analysis → program design → coding → testing → maintenance.

Drawbacks:
 Protracted integration and late design breakage
 Late risk resolution
 Requirements-driven functional decomposition
 Adversarial stakeholder relationships
 Focus on documents and review meetings
The Old Way
Conventional Software Management Performance

1. Finding and fixing a software problem after delivery costs 100 times more than finding
and fixing the problem in early design phases.

2. You can compress software development schedules 25% of nominal, but no more.

3. For every $1 you spend on development, you will spend $2 on maintenance.

4. Software development and maintenance costs are primarily a function of the number of
source lines of code.

5. Variations among people account for the biggest differences in software productivity.

6. The overall ratio of software to hardware costs is still growing. In 1955 it was 15:85; in
1985, 85:15.

7. Only about 15% of software development effort is devoted to programming.

8. Walkthroughs catch 60% of the errors.

9. 80% of the contribution comes from 20% of contributors.


The Old Way and the New
The Principles of Conventional Software Engineering
1. Make quality #1. Quality must be quantified and mechanisms put in place to motivate its achievement.
2. High-quality software is possible. Techniques that have been demonstrated to increase quality include
involving the customer, prototyping, simplifying design, conducting inspections, and hiring the best people.
3. Give products to customers early. No matter how hard you try to learn users’ needs during the requirements
phase, the most effective way to determine real needs is to give users a product and let them play with it.
4. Determine the problem before writing the requirements. When faced with what they believe is a problem,
most engineers rush to offer a solution. Before you try to solve a problem, be sure to explore all the
alternatives and don’t be blinded by the obvious solution.
5. Evaluate design alternatives. After the requirements are agreed upon, you must examine a variety of
architectures and algorithms. You certainly do not want to use an “architecture” simply because it was used in
the requirements specification.
6. Use an appropriate process model. Each project must select a process that makes the most sense for that
project on the basis of corporate culture, willingness to take risks, application area, volatility of requirements,
and the extent to which requirements are well understood.
7. Use different languages for different phases. Our industry’s eternal thirst for simple solutions to complex
problems has driven many to declare that the best development method is one that uses the same notation
throughout the life cycle. Why should software engineers use Ada for requirements, design, and code unless
Ada were optimal for all these phases?
8. Minimize intellectual distance. To minimize intellectual distance, the software’s structure should be as close
as possible to the real-world structure.
9. Put techniques before tools. An undisciplined software engineer with a tool becomes a dangerous,
undisciplined software engineer.
10. Get it right before you make it faster. It is far easier to make a working program run than it is to make a fast
program work. Don’t worry about optimization during initial coding.
The Old Way and the New
The Principles of Conventional Software Engineering
11. Inspect code. Inspecting the detailed design and code is a much better way to find errors than testing.

12. Good management is more important than good technology. The best technology will not compensate for poor management, and a good
manager can produce great results even with meager resources. Good management motivates people to do their best, but there are no universal
“right” styles of management.

13. People are the key to success. Highly skilled people with appropriate experience, talent, and training are key. The right people with insufficient
tools, languages, and process will succeed. The wrong people with appropriate tools, languages, and process will probably fail.

14. Follow with care. Just because everybody is doing something does not make it right for you. It may be right, but you must carefully assess its
applicability to your environment. Object orientation, measurement, reuse, process improvement, CASE, prototyping-all these might increase
quality, decrease cost, and increase user satisfaction. The potential of such techniques is often oversold, and benefits are by no means
guaranteed or universal.

15. Take responsibility. When a bridge collapses we ask, “What did the engineers do wrong?” Even when software fails, we rarely ask this. The fact
is that in any engineering discipline, the best methods can be used to produce awful designs, and the most antiquated methods can be used to
produce elegant designs.

16. Understand the customer’s priorities. It is possible the customer would tolerate 90% of the functionality delivered late if they could have 10%
of it on time.

17. The more they see, the more they need. The more functionality (or performance) you provide a user, the more functionality (or performance)
the user wants.

18. Plan to throw one away. One of the most important critical success factors is whether or not a product is entirely new. Such brand-new
applications, architectures, interfaces, or algorithms rarely work the first time.

19. Design for change. The architectures, components, and specification techniques you use must accommodate change.

20. Design without documentation is not design. I have often heard software engineers say, “I have finished the design. All that is left is the
documentation.”
The Old Way and the New
The Principles of Conventional Software Engineering

21. Use tools, but be realistic. Software tools make their users more efficient.
22. Avoid tricks. Many programmers love to create programs with tricks: constructs that
perform a function correctly, but in an obscure way. Show the world how smart you are
by avoiding tricky code.
23. Encapsulate. Information-hiding is a simple, proven concept that results in software that
is easier to test and much easier to maintain.
24. Use coupling and cohesion. Coupling and cohesion are the best ways to measure
software’s inherent maintainability and adaptability.
25. Use the McCabe complexity measure. Although there are many metrics available to
report the inherent complexity of software, none is as intuitive and easy to use as Tom
McCabe’s.
26. Don’t test your own software. Software developers should never be the primary testers
of their own software.
27. Analyze causes for errors. It is far more cost-effective to reduce the effect of an error
by preventing it than it is to find and fix it. One way to do this is to analyze the causes of
errors as they are detected.
28. Realize that software’s entropy increases. Any software system that undergoes
continuous change will grow in complexity and become more and more disorganized.
29. People and time are not interchangeable. Measuring a project solely by person-
months makes little sense.
30. Expect excellence. Your employees will do much better if you have high expectations
for them.
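Principle 25's measure can be made concrete. Below is a minimal sketch of McCabe's cyclomatic complexity, V(G) = E - N + 2P, computed over a hand-built control-flow graph; the graph below is an illustrative assumption, not output from any real analyzer.

```python
# Sketch: McCabe's cyclomatic complexity from a control-flow graph.
# V(G) = E - N + 2P, where E = edges, N = nodes, P = connected components.

def cyclomatic_complexity(edges, num_nodes, num_components=1):
    """Number of linearly independent paths through the graph."""
    return len(edges) - num_nodes + 2 * num_components

# A small hypothetical CFG for a function with one if/else and one loop:
# nodes 0..5, edges given as (from, to) pairs.
cfg_edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4), (4, 1), (4, 5)]
print(cyclomatic_complexity(cfg_edges, num_nodes=6))  # 3 linearly independent paths
```

Equivalently, for a single-entry, single-exit function this equals the number of decision points plus one, which is why the metric is so easy to apply by hand.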
The Old Way and the New
The Principles of Modern Software Management

 Architecture-first approach (the central design element): design and integration first, then production and test.
 Iterative life-cycle process (the risk management element): risk control through ever-increasing function, performance, and quality.
 Component-based development (the technology element): object-oriented methods, rigorous notations, visual modeling.
 Change management environment (the control element): metrics, trends, process instrumentation.
 Round-trip engineering (the automation element): complementary tools, integrated environments.
Chapter 2 – Evolution of Software Economics
2.1 Software Economics
Five fundamental parameters can be abstracted from software cost models:
 Size (typically, number of source instructions)
 Process
 Personnel
 Environment
 Quality (performance, reliability, adaptability, ...)

 Size: usually measured in SLOC or in the number of function points required to realize the desired capabilities.
 Function points are the better metric early in a project; LOC is the better metric later.
 These are not new metrics for measuring size, effort, personnel needs, etc.

 Process: used to guide all activities.
 Workers (roles), artifacts, activities.
 Supports heading toward the target and eliminates non-essential or less important activities.
 The process is critical in determining software economics: component-based development, the application domain, an iterative approach, use-case-driven development, and so on.
 Personnel – the capabilities of the personnel in general and in the application domain in particular.
 Get the right people; good people (you can't always do this).
 There is much specialization nowadays, and some specialists are terribly expensive.
 Emphasize the team and team responsibilities; the ability to work in a team matters.

 Environment – the tools, techniques, and automated procedures used to support the development effort.
 Integrated tools; automated tools for modeling, testing, configuration, change management, defect tracking, etc.

 Required Quality – the functionality provided; performance, reliability, maintainability, scalability, portability, user-interface utility, usability.

Effort = (personnel)(environment)(quality)(size^process)
(Note: effort is exponentially related to size, with the process determining the exponent.)
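The shape of this relationship can be sketched in Python. This is an illustrative toy, not a calibrated model: every multiplier value and the exponent below are assumptions chosen only to show that effort grows faster than size when the process exponent exceeds 1.

```python
# Sketch of the parametric form Effort = (personnel)(environment)(quality)(size^process).
# All multipliers and the exponent are illustrative assumptions, not calibrated data.

def effort_staff_months(size_ksloc, personnel=1.0, environment=1.0,
                        quality=1.0, process_exponent=1.1):
    """Effort grows exponentially with size when process_exponent > 1
    (a diseconomy of scale); better personnel, environment, and quality
    multipliers (values below 1.0) reduce effort linearly."""
    return personnel * environment * quality * (size_ksloc ** process_exponent)

# Doubling size more than doubles effort when the exponent exceeds 1:
small = effort_staff_months(50)    # a 50-KSLOC project
large = effort_staff_months(100)   # a 100-KSLOC project
print(large / small > 2)           # True: diseconomy of scale
```

Improving the process (driving the exponent toward 1.0 or below) is therefore the lever that attacks the diseconomy of scale itself, which is the argument the three-generations discussion below develops.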
Notice the process trends for three generations of software economics:

 Conventional development (1960s and 70s)
 Application: custom; Size: 100% custom
 Process: ad hoc
 In the 70s: SDLC; customization of process to the domain/mission; structured analysis, structured design, code

 Transition (1980s and 90s)
 Environment/tools: some off the shelf, but often separate (not integrated), especially in the earlier years
 Size: 30% component-based, 70% custom
 Process: repeatable

 Modern practices (2000 and later)
 Environment/tools: off-the-shelf, integrated
 Size: 70% component-based, 30% custom
 Process: managed, measured (refer to the CMM)
Evolution of Software Economics
Three generations of software economics (cost versus software size):

1960s–1970s: waterfall model; functional design; diseconomy of scale.
 Environments/tools: custom
 Size: 100% custom
 Process: ad hoc
 Typical project performance: predictably bad (always over budget, over schedule)

1980s–1990s: process improvement; encapsulation-based design; diseconomy of scale.
 Environments/tools: off-the-shelf, separate
 Size: 30% component-based, 70% custom
 Process: repeatable
 Typical project performance: unpredictable (infrequently on budget, on schedule)

2000 and on: iterative development; component-based design; return on investment.
 Environments/tools: off-the-shelf, integrated
 Size: 70% component-based, 30% custom
 Process: managed/measured
 Typical project performance: predictable (usually on budget, on schedule)
Notice the performance trends for the three generations of software economics:

 Conventional: predictably bad (60s/70s)
 Almost always over budget and over schedule, with missed requirements
 All custom components; symbolic languages (assembler); some third-generation languages (COBOL, Fortran, PL/1)
 Performance and quality almost always less than great

 Transition: unpredictable (80s/90s)
 Infrequently on budget or on schedule
 Enter software engineering: a 'repeatable process' and project management
 Some commercial products became available (databases, networking, GUIs), but with the huge growth in complexity (especially in distributed systems), existing languages and technologies were not enough for the desired business performance

 Modern practices: predictable (2000 and later)
 Usually on budget and on schedule; managed, measured process; integrated environments; 70% off-the-shelf components; component-based applications; RAD; iterative development; stakeholder emphasis


2.2 “Pragmatic” Software Cost Estimation
 Little is available on estimating cost for projects that use iterative development.
 It is difficult to hold all the controls constant: application domain, project size, criticality, etc.
 Metrics (SLOC, function points, etc.) are NOT consistently applied even within the same application domain!
 The definitions of SLOC and function points are not even consistent!
 Much of this is due to the nature of development: there is no magic date when design is 'done,' or a magic date when testing 'begins.'

Consider some of the issues:
https://www.geeksforgeeks.org/software-cost-estimation/
Three Issues in Software Cost Estimation:
 1. Which cost estimation model should be used?
 2. Should software size be measured using SLOC or function points? (There are others too.)
 3. What are the determinants of a good estimate? (How do we know our estimate is good?)

So very much is dependent upon estimates!
Cost Estimation Models
 Many are available.
 Many organization-specific models exist too, based on an organization's own histories and experiences.
 Oftentimes these work well if 'other' parameters, such as process and tools, are held constant.
 COCOMO, developed by Barry Boehm, is the most popular cost estimation model.
 Two primary approaches to sizing:
 Source lines of code (SLOC) and
 Function points (FP)
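As a sketch of how a SLOC-based model works, here is Basic COCOMO, the simplest published form of Boehm's model. The mode constants are the published 1981 basic-model values; the 32-KLOC example project is a made-up illustration, and a real estimate would use Intermediate COCOMO or COCOMO II with cost drivers.

```python
# Basic COCOMO (Boehm, 1981): effort and schedule from size in KLOC.

COCOMO_MODES = {
    # mode: (a, b, c, d) for Effort = a * KLOC**b, Time = c * Effort**d
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode="organic"):
    """Return (effort in person-months, development time in months)."""
    a, b, c, d = COCOMO_MODES[mode]
    effort = a * kloc ** b
    time = c * effort ** d
    return effort, time

# A hypothetical 32-KLOC business application in organic mode:
effort, time = basic_cocomo(32, "organic")
print(f"{effort:.1f} person-months over {time:.1f} months")
```

Note how the same 32 KLOC in "embedded" mode yields far more effort: the mode captures some of the process and environment factors that the fuller models expose as explicit cost drivers.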


Source Lines of Code (SLOC)
 Many feel comfortable with the 'notion' of a line of code.
 SLOC has great value, especially where applications are custom-built.
 It is easy to measure and instrument; tools exist.
 It works well when we have a history of development with applications, their existing lines of code, and their associated costs.
 Today, the use of components, source-code generation tools, and objects has rendered SLOC somewhat ambiguous.
 SLOC is generally a more useful and precise basis than FPs later in a project.
Function Points
 The use of function points has many proponents.
 The International Function Point Users Group, founded in 1984, "is the dominant software measurement association in the industry."
 Major advantage: measurement with function points is independent of the technology (programming language, tools, ...) used, and is thus a better basis for comparisons among projects.
Function Points
 Function points measure the numbers of:
 external user inputs,
 external outputs,
 internal data groups,
 external data interfaces,
 external inquiries, etc.

 Major disadvantage: these things are difficult to measure.
 The definitions are primitive and inconsistent.
 The metrics are difficult to assess, especially since counting is normally done early in the development effort, using more abstractions.


 Yet, no project will be started without estimates!
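The counting idea above can be sketched as follows. This is a hedged illustration using the standard IFPUG average-complexity weights; the item counts for the example system are made up, and a real count would also classify each item as simple, average, or complex and apply a value adjustment factor.

```python
# Sketch of an unadjusted function-point count using the standard
# IFPUG average-complexity weights for the five function types.

FP_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_data_groups": 10,     # internal logical files
    "external_data_interfaces": 7,  # external interface files
}

def unadjusted_function_points(counts):
    """Sum of (count x average weight) over the five function types."""
    return sum(FP_WEIGHTS[kind] * n for kind, n in counts.items())

# Hypothetical small system:
counts = {"external_inputs": 10, "external_outputs": 7,
          "external_inquiries": 5, "internal_data_groups": 4,
          "external_data_interfaces": 2}
print(unadjusted_function_points(counts))  # 10*4 + 7*5 + 5*4 + 4*10 + 2*7 = 149
```

Because the inputs are things visible in a specification rather than code, a count like this can be made at inception, which is exactly why function points are the better early-project metric.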
But:
 Cost estimation is a real necessity: it is needed to fund the project!
 All projects require estimates at the beginning (inception) and adjustments afterwards.
 These must stabilize, and they are rechecked.
 They must be reusable for additional cycles.
 An organization can create its own methods of measurement for how to 'count' these metrics.
 No project is arbitrarily started without cost, schedule, budget, manpower, and resource estimates (among other things).
 Estimates are critical to budgets, to resource allocation, and to a host of stakeholders.
So, How Good Are the Models?
 COCOMO is said to be 'within 20%' of actual costs '70% of the time.' (COCOMO has been revised over the years.)
 Cost estimating is still disconcerting when one realizes that a plethora of missed dates, poor deliverables, and significant cost overruns already characterize traditional development.
 Yet all non-trivial software development efforts require costing; it is a basic management activity.
 RFPs on contracts force contractors to estimate project costs for their survival.
 So, let's look at top-down and bottom-up estimating.


Top-Down versus Bottom-Up
Substantiating the cost:
 Most estimators perform bottom-up costing, substantiating a target cost, rather than approaching it top-down, which would yield a 'should cost.'
 Many project managers create a 'target cost' and then play with parameters and sizing until the target cost can be justified.
 They work backwards!
 This is done in attempts to win proposals, convince people, and so on.
 Any approach should force the project manager to assess risk and discuss things with stakeholders.
Top-Down versus Bottom-Up
Is bottom-up substantiation good?
 If well done, it requires considerable analysis and expertise based on much experience and knowledge; experience developing similar systems with similar technologies is a great help.
 If not well done, it drives team members crazy (this is not uncommon).
 Independent cost estimators (consultants) are not reliable.

The author suggests:
 The best cost estimate is likely one undertaken by an experienced project manager, software architect, developers, and test managers, and the process can be quite iterative.
 Previous experience is essential. Risks must be identifiable, assessed, and factored in.
 Once the estimate is created, the team must live with the cost/schedule estimate.
A Good Project Estimate:
1. It is conceived and supported by the project manager, architecture team, development team, and test team responsible for performing the work.

2. All the stakeholders accept it as ambitious but realizable.

3. It is based on a well-defined software cost model with a credible basis.

4. It is based on a database of relevant project experience that includes similar processes, similar technologies, similar environments, similar quality requirements, and similar people.

5. It is defined in enough detail that its key risks are understood and the probability of success is objectively assessed.
Evolution of Software Economics
The predominant cost estimation process: the software manager, software architecture manager, software development manager, and software assessment manager weigh the risks, options, trade-offs, and alternatives, and cost modelers turn these inputs into the cost estimate.
