Req 6
Software Processes
- Life-cycle Models (Process Models)
Build-and-Fix
Waterfall
Rapid Prototyping
Incremental
Evolutionary
Spiral
- Process Improvements: CMM
- Metrics: product and process
Handouts: 6
Prepared by: Sagar Naik
1
SE Process?
• A process defines
– Who is doing what, when and how in the development of
a software system
• S/W development activities and their ordering
» Requirement elicitation
» Specification
» Design
» Code
» Test: unit, integration, system
» Maintain
2
Process vs. Product
Process: the only thing you retain
• The asset that distinguishes you from your competitor en route to a product
• The asset that gets you to your next product
• The asset that determines key properties of your products
[Diagram: a Process (tools, automation, templates, participants) turns People and a Project into a Result]
3
Lightweight vs. Heavyweight Processes
Lightweight (e.g., eXtreme Programming: XP) vs. Heavyweight (e.g., Waterfall)
Important:
- Choose a process depending on the type of product being developed.
- For large systems, management is a big problem: choose a strictly managed process.
- For small systems, more informality is possible …
4
Build-and-Fix Model
• Properties
• No planning: build first
• Only product: the working code version
• Decreasing understandability and maintainability over time
[Diagram: Build first version → Maintenance (fix) loop]
5
Waterfall Model
• Properties
» Sequential steps
» Feedback loops
» Documentation-driven
• Advantages
» Documentation
» Easier maintenance
• Disadvantages
[Diagram: Requirements → Specification → Design → Code → Test → Integration → Retirement, with a Verify step after each phase and a feedback path for changed requirements]

Rapid Prototyping
• Advantages
• Requirements better specified and validated
• Early feasibility analysis
• Strong involvement of the customer in prototyping
• Disadvantages
7
Incremental
[Diagram: Requirements feed a series of releases; Release 1, Release 2, and Release 3 each pass through Design → Coding → Test → Deployment]
9
Evolutionary
[Diagram: successive versions, each passing through Requirements → Design → Coding → Test → Deployment, with feedback from each deployed version into the next version's requirements]
11
Spiral Model
[Diagram: spiral model, with repeated cycles of planning, risk analysis, engineering, and customer evaluation]
12
Process model risk problems
• Waterfall
• High risk for new systems because of
specification and design problems
• Low risk for well-understood developments using
familiar technology
• Prototyping
• Low risk for new applications because
specification and program stay in step
• High risk because of lack of process visibility
• Evolutionary and Spiral
• Middle ground between waterfall and prototyping
15
Hybrid process models
• Large systems are usually made up of several
sub-systems
• The same process model need not be used for
all subsystems
• Prototyping for high-risk specifications
• Waterfall model for well-understood systems
• Tailor the process to a problem
16
Rational Unified Process (RUP) Model (from IBM)
• Characteristics
– Iterative and incremental
– Use-case-driven
– Architecture-centric
– Uses UML as its modeling notation
– Provides
• Comprehensive set of document templates and process
guidelines
17
RUP is Use-Case-Driven and Architecture-Centric
• Use cases drive numerous activities
• Creation and validation of the design model
• Definition of the test cases and testing procedures
• Creation of user documentation
• Build, validate, and baseline an architecture
• Prototype the architecture to validate it
• A validated architecture serves as the baseline
• Other artifacts derive from architecture
• Product structure
• Team structure
18
Life-cycle Phases and Major Milestones
[Diagram: RUP life-cycle phases along a timeline, with a major milestone at the end of each phase]
20
XP Overview
Characteristics
• Evolutionary development
• Focus on working code that implements customer needs (rather than documents)
• Testing is a crucial element
• Focus on flexibility and efficiency of the process
• Designed for small teams (<10)
[Diagram: XP cycle: Planning → Write tests → Pair Programming + Refactoring → Test → Integration (at least daily) → Release (every 2-3 weeks)]
21
XP Practices (I)
• Planning
– Small releases
– Start with the smallest useful feature set
– Release early and often, adding a few features each time
– Stakeholders meet to plan the next iteration
– Business people decide on business value of features
– Developers identify the technical risk of features and predict effort per feature
• Simple Design
– Always use the simplest possible design to get the job done (runs the tests and
states intentions of the programmer)
• Testing
– Test-first: write the test, then implement the code
– Programmers write unit tests and customers write acceptance tests
• Refactoring
– Refactoring is done continuously; the code is always kept clean
22
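The test-first practice above can be sketched in Python. This is a minimal illustration, not from the slides; `gcd` is a hypothetical example function. The test is written first and fails until the code exists; then the simplest implementation that passes it is added.

```python
# Step 1 (test-first): write the unit test before the code exists.
# Running it now would fail with a NameError, since gcd is not defined yet.
def test_gcd():
    assert gcd(12, 18) == 6
    assert gcd(7, 13) == 1
    assert gcd(0, 5) == 5

# Step 2: write the simplest implementation that makes the test pass.
def gcd(a, b):
    """Greatest common divisor by Euclid's algorithm."""
    while b:
        a, b = b, a % b
    return a

test_gcd()  # the test now runs green; refactor freely while it stays green
```

Because the test exists first, continuous refactoring (the next practice on the slide) is safe: any change that breaks behaviour is caught immediately.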
XP Practices (II)
• Pair programming: all production code is written by two programmers
• One programmer thinks about implementing the current method,
the other thinks strategically about the whole system
• Pairs are put together dynamically
• Continuous integration
• All changes are integrated into the code-base at least daily
• The tests have to run 100% before and after the integration
• 40-hrs week
• Programmers go home on time
• Overtime is a symptom of a serious problem
• Avoids errors introduced by tired developers
• On-site customer
• Development team has continuous access to a real life customer/user
• Coding standards
• Everyone codes to the same standards
23
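The continuous-integration rule ("tests have to run 100% before and after the integration") can be sketched as a toy gate in Python. Everything here is hypothetical illustration: the code-base is modeled as a dict of named functions, and `integrate` is an invented helper, not an XP artifact.

```python
def integrate(codebase, change, tests):
    """Toy continuous-integration gate (hypothetical sketch):
    every test must pass on the current code-base and again after
    the change is applied, otherwise the integration is rejected."""
    if not all(test(codebase) for test in tests):
        raise RuntimeError("baseline broken: fix the code-base first")
    candidate = dict(codebase, **change)  # apply the change
    if not all(test(candidate) for test in tests):
        raise RuntimeError("change rejected: tests fail after integration")
    return candidate  # the change is now part of the baseline

# Hypothetical example: the code-base is a dict of named functions.
codebase = {"add": lambda a, b: a + b}
tests = [lambda cb: cb["add"](2, 3) == 5]
codebase = integrate(codebase, {"neg": lambda a: -a}, tests)
print(sorted(codebase))  # ['add', 'neg']
```

The two checks mirror the slide's rule: a broken baseline blocks everyone, and a change that breaks the tests never enters the shared code-base.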
XP Practices
• Advantages
– Simple concept
– Low management overhead (no complicated procedures to
follow, no documentation to maintain, direct communication)
– Continuous risk management (early feedback from the
customer)
– Continuous effort estimation
– Emphasis on testing
• tests help in evolution and maintenance
• Disadvantages
– Only appropriate for small teams (up to 10 developers)
– If maintainers are not the people that developed the
code, good documentation is necessary
24
Reading
• RUP
– Craig Larman, Applying UML and Patterns: An Introduction to Object-
Oriented Analysis and Design and the Unified Process, Prentice-Hall,
2002 (2nd edition)
• Agile development
– Kent Beck, Extreme Programming Explained: Embrace Change, Addison-Wesley, 1999
25
Note: An Effective Process …
• Provides guidelines for efficient development of
quality software
• Reduces risk and increases predictability
• Captures and presents best practices
• Promotes common vision
• Provides roadmap for applying tools
26
Software Process Improvement Initiatives
• Capability Maturity Model (CMM)
• http://www.sei.cmu.edu/cmm/cmms/cmms.html
• ISO 9000-series
27
©Steven Schach 2002 [modified]
SW–CMM
(SEI, 1986)
– A strategy for improving the software process
– Fundamental ideas:
• Improving the software process leads to
– Improved software quality
– Delivery on time, within budget
• Improved management leads to
– Improved techniques
– Five levels of “maturity” are defined
– Organization advances stepwise from level to level
28
©Steven Schach 2002
Levels 1 and 2
• Level 1: Initial (most organizations world-wide are at this level)
• Ad hoc approach
– Entire process is unpredictable
– Management consists of responses to crises
• Level 2: Repeatable
• Basic software management
• Management decisions should be made on the basis
of previous experience with similar products
• Measurements (“metrics”) are made
– These can be used for making cost and duration
predictions in the next project
• Problems are identified, immediate corrective action
is taken
29
Levels 3, 4 and 5
• Level 3: Defined
– The software process is fully documented
• Managerial and technical aspects are clearly defined
• Continual efforts are made to improve quality + productivity
• Reviews are performed to improve software quality
• CASE tools are applicable now (and not at levels 1 or 2)
• Level 4: Managed
– Quality and productivity goals are set for each project
• Quality, productivity are continually monitored
• Statistical quality controls are in place
• Level 5: Optimized
– Continuous process improvement
• Statistical quality and process controls
• Feedback of knowledge from each project to the next
30
SW–CMM Summary
[Diagram: summary of the five SW–CMM maturity levels: Initial, Repeatable, Defined, Managed, Optimized]
31
Software Metrics
32
Product Quality Metrics
• A quality metric should be a predictor of
product quality
• Note
• Most quality metrics are design quality metrics
(measure coupling and complexity)
• The relationship between these metrics and
quality has to be judged by a human
• There are no “magic thresholds,” rather the trend
of metrics over time needs to be monitored
33
Software Metrics
• Design Metrics
• Coupling metrics
– number of calling functions (fan-in) or called functions (fan-out)
• Cyclomatic complexity
– a measure of control structure complexity
• OO metrics
– Coupling between objects (CBO)
– Depth of inheritance tree (DIT)
• Quality Metrics
– # of failures observed
– Rate of observed failure
34
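Cyclomatic complexity (number of independent paths through the control structure) can be approximated as 1 plus the number of decision points. The sketch below does this for Python source using the standard `ast` module; it is a simplification for illustration, since real metric tools handle more constructs (comprehensions, ternaries, match statements).

```python
import ast

def cyclomatic_complexity(source):
    """Rough cyclomatic complexity: 1 + number of decision points.
    Counts if/for/while/except branches plus boolean operators;
    a teaching sketch, not a complete metric tool."""
    decisions = 0
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, (ast.If, ast.For, ast.While, ast.ExceptHandler)):
            decisions += 1
        elif isinstance(node, ast.BoolOp):
            decisions += len(node.values) - 1  # each and/or adds a branch
    return decisions + 1

code = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(cyclomatic_complexity(code))  # 3 = 1 + two branch points
```

As the slide on quality metrics notes, the absolute number matters less than its trend: a function whose complexity keeps climbing from release to release is a refactoring candidate.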
Process Metrics
• Time taken for process activities to be completed
• Ex. Calendar time to complete an activity
• Resources required for processes or activities
• Ex. Total effort in person-days
• Number of occurrences of a particular event
• Ex. Number of defects discovered
35
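The first two kinds of process metric can be computed from simple activity records. The `activity_metrics` helper and the dates below are hypothetical illustrations; the effort figure is deliberately crude (calendar days times people assigned, ignoring weekends and part-time work).

```python
from datetime import date

def activity_metrics(start, end, people):
    """Calendar time and a rough effort figure for one process activity.
    Effort = calendar days x people assigned: a crude sketch only."""
    calendar_days = (end - start).days
    return calendar_days, calendar_days * people

# Hypothetical activity records: (name, start, end, people assigned)
activities = [
    ("requirements", date(2024, 1, 8),  date(2024, 1, 26), 2),
    ("design",       date(2024, 1, 29), date(2024, 2, 16), 3),
]

for name, start, end, people in activities:
    days, effort = activity_metrics(start, end, people)
    print(f"{name}: {days} calendar days, ~{effort} person-days")
```

Collected consistently across projects, such numbers are exactly what CMM Level 2 uses for cost and duration predictions on the next project.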
Goal-Question-Metric Paradigm
• Goals
– What is the organisation trying to achieve?
– The objective of process improvement is to satisfy these goals
• Questions
– Questions about areas of uncertainty related to the goals.
– You need process knowledge to derive these
• Metrics
– Measurements to be collected to answer the questions
36
Goal-Question-Metric Paradigm: Example
• Goal
– Reduce requirements-related defects
• Questions
– How many defects have been introduced in the
requirements phase?
– Why did the defects go undetected?
• Metrics
– Classify the defects
– Count how many are requirements-related
– Answer why each defect went undetected
– Take action to prevent similar defects
37
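The metrics step of this GQM example can be made concrete with a short Python sketch. The defect log and its field names are invented for illustration; a real organization would pull these records from its defect-tracking system.

```python
from collections import Counter

# Hypothetical defect log; the field names are invented for illustration.
defects = [
    {"id": 1, "introduced_in": "requirements", "undetected_because": "no review"},
    {"id": 2, "introduced_in": "design",       "undetected_because": "missing test"},
    {"id": 3, "introduced_in": "requirements", "undetected_because": "ambiguous wording"},
]

# Metric: classify defects by the phase that introduced them ...
by_phase = Counter(d["introduced_in"] for d in defects)
print(by_phase["requirements"])  # 2 requirements-related defects

# ... and tabulate why the requirements defects went undetected,
# answering the second question and pointing at corrective actions.
reasons = Counter(d["undetected_because"] for d in defects
                  if d["introduced_in"] == "requirements")
print(reasons)
```

Each count answers one of the questions, which in turn serves the goal: if "no review" dominates the reasons, the corrective action (add requirements reviews) follows directly.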