Team Software Process (TSP): Humphrey's Intermediate Between PSP and CMM

The Team Software Process (TSP) is an iterative software development process that operates in 4-5 month cycles. It was created by Humphrey as an intermediate maturity process between the Personal Software Process (PSP) and the Capability Maturity Model (CMM). TSP uses statistical process control and involves teams of 2-20 members working together using scripts, forms, and other documentation to manage their process.

Team Software Process (TSP)

• Humphrey; intermediate between PSP and CMM
– Teams of two to twenty members
– Multiteams of up to 150 members
• Tradition of statistical process control
• Iterative (four to five month cycles)
• Scripts and forms



Process Flow



Teams (Dyer)
• A team consists of at least two people.
• The members are working toward a common goal.
• Each person has a specific assigned role.
• Completion of the mission requires some form of dependency among the group members.
Effective Teams
• The members are skilled.
• The team’s goal is important, defined, visible, and realistic.
• The team’s resources are adequate for the job.
• The members are motivated and committed to meeting the team’s goal.
• The members cooperate and support each other.
• The members are disciplined in their work.
Team Building
• The team members establish common goals and defined roles.
• The team develops an agreed-upon strategy.
• The team members define a common process for their work.
• All team members participate in producing the plan, and each member knows his or her personal role in that plan.
• The team negotiates the plan with management.
• Management reviews and accepts the negotiated plan.
• The team members do the job in the way that they have planned to do it.
• The team members communicate freely and often.
• The team forms a cohesive group: the members cooperate, and they are all committed to meeting the goal.
• The engineers know their status, get feedback on their work, and have leadership that sustains their motivation.
Launch



Strategy
• Create a conceptual design for the product
• Decide what will be produced in each cycle
• Make initial size and effort estimates
• Establish a configuration management plan



Selecting Roles
• Team Leader
• Development Manager
• Planning Manager
• Quality/Process Manager
• Support Manager
• Customer interface manager
• Design manager
• Test manager
• Safety manager
• Security manager
• Performance manager
Team Leader Responsibilities
• Motivating team members
• Handling customer issues
• Interaction with management
• Day-to-day direction of the work
• Protecting team resources
• Resolving team issues
• Conducting team meetings
• Reporting on the work status
Development Manager
• Leads and guides the team in designing and developing the product
– Lead the team in producing the development strategy and the product conceptual design
– Lead the team in producing the design specification (SDS), if there is no separate Design Manager or Software Architect
– Lead the team in implementing the product



Planning Manager
• Supports and guides the team in planning and tracking their work
– Lead the team in producing the task plan and schedule for each development cycle
– Lead the team in producing the balanced team development plan
– Track the team's progress against their plan



Quality / Process Manager
• Supports the team in defining their process needs, in making the quality plan, and in tracking process and product quality
– Lead the team in producing and tracking their quality plan
– Identify where quality performance falls short of objectives
– Lead the team in defining, documenting, and maintaining their processes and development standards
– Act as moderator and lead all team reviews and inspections
Support Manager
• Supports the team in determining, obtaining, and managing the tools needed to meet its technology and administrative support needs
– Lead the team in determining their support needs and obtaining the needed tools and facilities
– Lead the development and management of the change/configuration management system
– Handle the team's issue and risk tracking system
– Act as the team's reuse advocate



Task Planning

• Generate a default task list
• Modify the default list as appropriate and estimate sizes and times for each task
• Assign portions of tasks to individual engineers
• Determine total time required for the project cycle (see the sketch below)
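In code terms, the task plan is little more than a list of estimated tasks with owners. A minimal sketch of that bookkeeping, using hypothetical tasks, engineers, and hour estimates (TSP itself records this on TASK forms; none of the specifics below come from the slides):

```python
# Hypothetical task plan: (task, assigned engineer, estimated hours).
tasks = [
    ("Requirements inspection",  "Ann", 6.0),
    ("High-level design",        "Raj", 20.0),
    ("Module A detailed design", "Ann", 12.0),
    ("Module A code",            "Ann", 10.0),
    ("Module B detailed design", "Raj", 14.0),
    ("Module B code",            "Raj", 12.0),
    ("Integration test",         "Ann", 8.0),
]

# Total time required for the project cycle.
total_hours = sum(hours for _, _, hours in tasks)
print(f"Total estimated task hours: {total_hours:.1f}")

# Per-engineer load, to check that assignments are roughly balanced.
load = {}
for _, engineer, hours in tasks:
    load[engineer] = load.get(engineer, 0.0) + hours
for engineer, hours in sorted(load.items()):
    print(f"  {engineer}: {hours:.1f} h")
```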
Schedule Planning
• Use LOC estimates and LOC/hour rates for time estimation
• Generate TASK and SCHEDULE plans
– Estimate times based upon LOC and LOC/hour estimates
– Estimate time available for the team (see the sketch below)

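The LOC-based estimate above is simple division. A sketch with invented inputs (the LOC total, the historical LOC/hour rate, and the weekly team task hours are assumptions, not values from the slides):

```python
def estimate_schedule(estimated_loc, loc_per_hour, team_task_hours_per_week):
    """Convert a size estimate into effort (TASK plan) and duration (SCHEDULE plan)."""
    effort_hours = estimated_loc / loc_per_hour
    weeks = effort_hours / team_task_hours_per_week
    return effort_hours, weeks

# Hypothetical inputs: 6,000 new LOC, a historical rate of 12 LOC/hour,
# and a five-person team averaging 15 task hours per person per week.
hours, weeks = estimate_schedule(6000, 12, 5 * 15)
print(f"Estimated effort: {hours:.0f} task hours, roughly {weeks:.1f} weeks")
```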


Produce Size Estimates
• Produce the conceptual design
– Objects for all cycles
• Select a development strategy
– Allocate objects to cycles
– Produce a minimal working subset in the first cycle
– Development Manager leads
• Produce preliminary size estimates
– Estimate LOC in each class
– Use size estimates as a basis for allocation of tasks to cycles
– Enter estimates on the STRAT form
Quality Planning
• Enter default quality criteria into the SUMQ form
– the Percent Defect Free components in each of the defect detection phases
– the defect removal yields in various phases (Phase Yields)
– the rate at which defects are injected in various phases (Defect Injection Rates)
TSP Quality Guidelines
• Percent (of modules) Defect Free (PDF) at entrance to
– Compile > 10%
– Unit Test > 50%
– Integration Test > 70%
– System Test > 90%
• Defects/KLOC:
– Total defects injected 75-150; if not PSP trained, use 100 to 200.
– Compile < 10
– Unit Test < 5
– Integration Test < 0.5
– System Test < 0.2
• Defect Ratios
– Detailed design review defects / unit test defects > 2.0
– Code review defects/compile defects > 2.0

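Because these guidelines are plain thresholds, a quality plan or a finished cycle can be checked against them mechanically. A sketch of such a check; the argument names are illustrative, not TSP form fields:

```python
# Thresholds taken from the guidelines above.
PDF_MIN = {"compile": 10, "unit test": 50, "integration test": 70, "system test": 90}
DEFECTS_PER_KLOC_MAX = {"compile": 10, "unit test": 5,
                        "integration test": 0.5, "system test": 0.2}

def check_quality(pdf, defects_per_kloc, dld_review_vs_unit_test, code_review_vs_compile):
    """Return a list of guideline violations (an empty list means the guidelines are met)."""
    problems = []
    for phase, minimum in PDF_MIN.items():
        if pdf[phase] <= minimum:
            problems.append(f"PDF at {phase} is {pdf[phase]}% (should exceed {minimum}%)")
    for phase, maximum in DEFECTS_PER_KLOC_MAX.items():
        if defects_per_kloc[phase] >= maximum:
            problems.append(f"{phase} defects/KLOC is {defects_per_kloc[phase]} "
                            f"(should be below {maximum})")
    if dld_review_vs_unit_test <= 2.0:
        problems.append("detailed design review defects / unit test defects should exceed 2.0")
    if code_review_vs_compile <= 2.0:
        problems.append("code review defects / compile defects should exceed 2.0")
    return problems
```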


Development Time Ratios
• Requirements inspection / requirements time > 0.25 (elicitation is counted in requirements time)
• High-level design inspection / high-level design time > 0.5 (design work only, not studies)
• Detailed design / coding time > 1.00
• Detailed design review / detailed design time > 0.5
• Code review / code time > 0.5



Review and Inspection Rates
• Requirements pages/hour < 2 (single-spaced text pages)
• High-level design pages/hour < 5 (formatted design logic)
• Detailed design text lines/hour < 100 (pseudocode; one line ≈ 3 LOC)
• Code LOC/hour < 200 (logical LOC; worked example below)

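Once the sizes of the work products are known, the rate ceilings above imply minimum review and inspection times. A worked example with assumed sizes (not from the slides):

```python
# Hypothetical work-product sizes for one cycle.
requirements_pages = 30     # single-spaced text pages
hld_pages = 12              # formatted design logic
dld_lines = 1500            # pseudocode lines
code_loc = 4000             # logical LOC

# Minimum times implied by the maximum rates on the slide above.
print(f"Requirements inspection >= {requirements_pages / 2:.1f} h")  # < 2 pages/hour
print(f"High-level design inspection >= {hld_pages / 5:.1f} h")      # < 5 pages/hour
print(f"Detailed design review >= {dld_lines / 100:.1f} h")          # < 100 lines/hour
print(f"Code review >= {code_loc / 200:.1f} h")                      # < 200 LOC/hour
```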


Defect Injection and Removal Rates
• Requirements defects injected/hour 0.25
• Requirements inspection defects removed/hour 0.5
• High-level design defects injected/hour 0.25
• High-level design inspection defects removed/hour 0.5
• Detailed design defects injected/hour 0.75
• Detailed design review defects removed/hour 1.5
• Detailed design inspection defects removed/hour 0.5
• Code defects injected/hour 2.0
• Code review defects removed/hour 4.0
• Compile defects injected/hour 0.3
• Code inspection defects removed/hour 1.0
• Unit test defects injected/hour 0.067
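Multiplied by planned phase hours, these rates give a first-cut defect budget for the quality plan. A sketch using hypothetical planned hours for one component:

```python
# Rates per hour, from the slide above (a subset, for brevity).
INJECTED_PER_HOUR = {"detailed design": 0.75, "code": 2.0, "compile": 0.3}
REMOVED_PER_HOUR  = {"detailed design review": 1.5, "code review": 4.0,
                     "code inspection": 1.0}

# Hypothetical planned hours for one component.
planned_hours = {"detailed design": 20, "detailed design review": 10,
                 "code": 20, "code review": 10, "compile": 2, "code inspection": 6}

injected = sum(rate * planned_hours[phase] for phase, rate in INJECTED_PER_HOUR.items())
removed  = sum(rate * planned_hours[phase] for phase, rate in REMOVED_PER_HOUR.items())
print(f"Expected defects injected: {injected:.1f}; "
      f"removal capacity before test: {removed:.1f}")
```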
The Yield Measure
• Phase yield is the percentage of defects entering and injected in a phase that are removed in that phase
• Process yield is the yield of all phases up to that point in the process
[Figure: defect flow across development phases; each phase injects defects, each defect-removal phase has a phase yield, and the cumulative result is the process yield]

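Written as formulas, the two measures are simple ratios over defect counts. A sketch; the example counts are invented:

```python
def phase_yield(entering, injected, removed):
    """Percentage of the defects present in a phase (those entering it plus
    those injected in it) that the phase removes."""
    return 100.0 * removed / (entering + injected)

def process_yield(injected_so_far, removed_so_far):
    """Percentage of all defects injected so far that have been removed so far."""
    return 100.0 * removed_so_far / injected_so_far

# Example: 50 defects are present when code review starts (escapes from design
# plus defects injected during coding) and the review finds 35 of them.
print(f"Code review phase yield: {phase_yield(entering=50, injected=0, removed=35):.0f}%")

# Example: 250 defects injected so far, 215 of them already removed.
print(f"Process yield before unit test: {process_yield(250, 215):.0f}%")
```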


Phase Yields
• Team requirements inspections ~ 70%
• Design reviews and inspections ~ 70%
• Code reviews and inspections ~ 70%
• Compiling ~ 50% (90+% of syntax defects)
• Unit test, at 5 or fewer defects/KLOC ~ 90%
– For high defects/KLOC: 50-75%
• Integration and system test, at < 1.0 defects/KLOC ~ 80%
– For high defects/KLOC: 30-65%
• Before compile > 75%
• Before unit test > 85%
• Before integration test > 97.5%
• Before system test > 99%
Quality Measures
• Percent (modules) defect free—PDF
• Defect-removal profile
– Defects/KLOC vs. phase
• Quality profile
– Design review time, design/code ratio, code review time, compile defects/KLOC, unit test defects/KLOC
• Process quality index—PQI
– Product of quality profile factors
Produce the Quality Plan
• Estimate defect injection rates for each phase
• Estimate yield for each phase
• Generate a trial quality plan
• Compare the quality plan with team goals
– Examine product quality at each phase of the project
– Modify the time planned for defect removal if quality goals are not satisfied
• Continue generating trial plans until quality goals are satisfied (see the sketch below)
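A minimal sketch of that trial-plan loop: forward-simulate how many defects escape under assumed per-phase injection totals and yields, then keep raising the planned review and inspection yields (a stand-in for planning more defect-removal time) until the estimate meets the goal. The numbers and the simplified escape model are illustrative, not SUMQ defaults:

```python
def escaping_defects(injected_per_phase, yields):
    """Defects still present after the last phase, given per-phase injection
    counts and per-phase yields (fraction of present defects removed)."""
    present = 0.0
    for phase, injected in injected_per_phase:
        present += injected
        present *= 1.0 - yields.get(phase, 0.0)
    return present

# Illustrative trial plan for a 5 KLOC cycle.
injected = [("design", 50.0), ("code", 200.0)]
yields = {"design": 0.50, "code": 0.50}     # combined review + inspection yields
goal = 5 * 5.0                              # at most 5 defects/KLOC entering unit test

while escaping_defects(injected, yields) > goal and max(yields.values()) < 0.95:
    # Plan more review/inspection time, modeled here as a 5-point yield increase.
    yields = {phase: min(y + 0.05, 0.95) for phase, y in yields.items()}

print("Planned yields:", yields,
      "-> escaping defects:", round(escaping_defects(injected, yields), 1))
```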
Component Quality Profile
• The PSP/TSP criteria for a quality process are that
– Detailed design (DLD) time >= coding time
– Detailed design review time >= 50% of DLD time
– Code review time >= 50% of coding time
– Compile defects <= 10 per KLOC
– Unit test defects <= 5 per KLOC
• Many defect-free components do not meet these criteria
• All components that have met these criteria have been defect free
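The earlier Quality Measures slide defines the process quality index (PQI) as the product of the quality-profile factors. One reasonable reading, sketched below, normalizes each factor against the criteria above and caps it at 1.0; this normalization is an assumption for illustration, not necessarily the exact SEI formula:

```python
def pqi(dld_time, dld_review_time, code_time, code_review_time,
        compile_defects_per_kloc, unit_test_defects_per_kloc):
    """Process quality index: product of five quality-profile factors, each
    capped at 1.0 once the corresponding criterion is met."""
    factors = [
        min(1.0, dld_time / code_time),                  # DLD time >= coding time
        min(1.0, dld_review_time / (0.5 * dld_time)),    # DLD review >= 50% of DLD time
        min(1.0, code_review_time / (0.5 * code_time)),  # code review >= 50% of coding time
        min(1.0, 10.0 / max(compile_defects_per_kloc, 1e-9)),   # <= 10 compile defects/KLOC
        min(1.0, 5.0 / max(unit_test_defects_per_kloc, 1e-9)),  # <= 5 unit test defects/KLOC
    ]
    result = 1.0
    for factor in factors:
        result *= factor
    return result

# A component that just meets every criterion scores 1.0.
print(pqi(dld_time=10, dld_review_time=5, code_time=10, code_review_time=5,
          compile_defects_per_kloc=10, unit_test_defects_per_kloc=5))
```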
Project Tracking
• Earned value [Humphrey 95]
• Each task is assigned a value based on the percentage of the total project estimate that is required for that task
• If a project was planned to take 1,000 task hours, a 32-hour task would have a planned value of 100 × 32 / 1,000 = 3.2%
• Then, when the team has completed that task, the engineers would have accumulated 3.2 earned-value points, no matter how long the task actually took (see the sketch below)

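The earned-value arithmetic above in code form (the 1,000-hour plan and the 32-hour task come from the slide; the list of completed tasks is hypothetical):

```python
def planned_value(task_hours, total_planned_hours):
    """A task's planned value: its share of the total planned hours, in percent."""
    return 100.0 * task_hours / total_planned_hours

total = 1000.0                       # planned task hours for the whole project
print(planned_value(32, total))      # 3.2 (percent)

# Earned value accrues only when a task is completed, regardless of actual effort.
completed_task_hours = [32, 18, 50]
earned = sum(planned_value(hours, total) for hours in completed_task_hours)
print(f"Earned value so far: {earned:.1f}%")   # 10.0%
```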


Experiences
• Teradyne: improvement from 20 defects per KLOC to 1 defect per KLOC; the savings in defect repair costs were about 4.5 times the cost of producing the programs in the first place
• Hill Air Force Base: productivity improved 123%, and test time was reduced from 22% to 2.7% of the project schedule
• Boeing, on a large avionics project, had a 94% reduction in system test time
