5) Test Management
Foundation Level
• Module 5 of 6: Test Management
February 2017
Study Sessions
• These study packs are intended as study sessions drawn directly from the chapters of the
syllabus
• There are 6 main modules
1. Fundamentals of Testing
2. Testing Throughout the Software Life Cycle
3. Static Techniques
4. Test Design Techniques
5. Test Management
6. Tool Support for Testing
Covered in this session…
• 5.1 Test Organisation
• 5.2 Test Planning and Estimation
• 5.3 Progress Monitoring and Control
• 5.4 Configuration Management
• 5.5 Risk and Testing
• 5.6 Incident Management
5.1 Test Organisation
Organisation and Independence
Test Organisation and Independence
Increasing independence, from least to most independent:
• Developers test each other’s code
• Testers within the development teams
• Test team reporting to project management or higher
• Testers from the business organization or user community
Benefits and drawbacks of independence
Benefits:
• Independent testers see other and different defects
• They are unbiased
• They can verify assumptions people made during specification and implementation of the system
Drawbacks:
• Isolation from the development team
• May be the bottleneck as the last checkpoint
• Developers may lose a sense of responsibility for quality
Who does testing?
Testing tasks may be done by people in a specific testing role:
• Test Leader (Test Manager or Test Coordinator)
  - separate roles on large projects
• Testers
  - on a separate test team
  - within the development team
• Other roles
  - business and domain experts, quality manager, infrastructure and IT operations
Tasks of the Test Leader
• Coordinate the test strategy and plan with project managers and
others.
• Contribute the testing perspective to other project activities, such as
integration planning
• Plan testing:
• How long will it take, how many cycles, etc?
• How many people, effort, cost?
• Which test levels, approach and objectives?
• How will incidents and risk be managed?
Test Leader tasks continued…
• Establish configuration management
• Introduce suitable metrics
• Respond to changes based on progress
• adapt the test plan accordingly
• Decide approach to automation
• What tests need automating?
• How? What extent?
• What testing tool and training required?
• What needs to be set up for test environments?
• Schedule testing
• Create test summary reports based on what happens in testing
Tester tasks:
• Review and contribute to test plan
• Review requirements and specification for:
• Completeness
• Consistency
• Accuracy
• Testability
• Feasibility
Tester tasks continued…
Testing tasks may be done by:
• Testers at the acceptance test level, who are typically business experts and users
5.2 Test Planning and
Estimation
Activities and Effort
Test Planning
What influences test planning?
Test Planning Activities
• Define the test approach or strategy
• Test levels, entry and exit criteria
• Integrating and coordinating testing activities
• Acquisition, supply, dev, operation and maintenance
• Making decisions:
• What to test?
• Which roles will perform the testing activities?
• When and how should the testing activities be done?
• How will the test results be evaluated?
• When to stop testing (exit criteria)?
• Assigning resources for the different tasks defined
Test Planning Activities continued…
• Defining the amount, level of detail, structure and templates for the
test documentation
• Scheduling:
• Test analysis and design activities
• Test implementation, execution and evaluation
• Selecting metrics for:
• Monitoring and controlling test preparation and execution
• Defect resolution
• Risk issues
• Setting the level of detail for test procedures
• Enough information to reproduce test preparation and execution
Entry Criteria
Entry criteria define when to start testing, for example:
• Test environment availability and readiness
• Test tool readiness in the test environment
• Testable code availability
• Test data availability
Exit Criteria
When should we stop testing? Exit criteria define when to stop, for example at:
• the end of a test level
• the point when a set of tests has achieved a specific goal
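Exit criteria like these lend themselves to a simple automated check. The sketch below is illustrative only; the metric names and thresholds are assumptions agreed per project, not values from the syllabus:

```python
# Hypothetical exit-criteria check; metric names and thresholds are
# illustrative, agreed per project rather than fixed by any standard.

def exit_criteria_met(metrics, min_coverage=90.0, max_open_critical=0):
    """Return True when the agreed exit criteria are all satisfied."""
    return (metrics["requirements_coverage_pct"] >= min_coverage
            and metrics["open_critical_defects"] <= max_open_critical)

status = {"requirements_coverage_pct": 95.0, "open_critical_defects": 1}
print(exit_criteria_met(status))  # False: a critical defect is still open
```

Even with coverage above the threshold, the open critical defect keeps the criteria from being met, which is exactly the kind of decision the test leader reports on.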
IEEE-829 Test Plan
1) Test Plan Identifier
2) References
3) Introduction
4) Test Items
5) Software Risk Issues
6) Features to be Tested
7) Features not to be Tested
8) Approach
9) Item Pass/Fail Criteria
10) Suspension Criteria and Resumption Requirements
11) Test Deliverables
12) Remaining Test Tasks
13) Environmental Needs
14) Staffing and Training Needs
15) Responsibilities
16) Schedule
17) Planning Risks and Contingencies
18) Approvals
19) Glossary
Purpose: To identify items and features to be tested, tasks to be performed, responsibilities and schedules.
A high-level view of how testing will proceed: WHAT is to be tested, by WHOM, HOW, in what TIME frame, and to what QUALITY level.
SPACE DIRT = IEEE-829 Test Plan
S - Scope: test items, what to test, what not to test
P - People: training, responsibilities, schedule
A - Approach: the approach that will be taken to testing
C - Criteria: entry/exit criteria, suspension/resumption criteria
E - Environment: test environment needs
D - Deliverables: what is being delivered as part of the test process
I - Incidentals: introduction, identification (of the document), approval authorities
R - Risks: risks and contingencies
T - Tasks: the test tasks that are involved in the testing process
Reference: http://www.intosaiitaudit.org/intoit_articles/19_12_spacedirt.pdf
Test Estimation
Estimating methods:
1. The metrics-based approach
– measures of previous or similar projects
» if we have historical information or typical values
» e.g. using a model with past data on test preparation and execution times
2. The expert-based approach
– assessment by experts or task owner
» depends on their expertise / experience
» e.g. agile estimation via planning-poker games
• Once the test effort is estimated, resources can be identified and a schedule
can be drawn up.
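Both estimation approaches can be sketched in a few lines of code. The figures, rates and helper names below are this example's own assumptions:

```python
import statistics

def metrics_based_estimate(num_tests, historical_tests_per_hour):
    """Extrapolate execution effort from a past project's measured rate."""
    return num_tests / historical_tests_per_hour

def expert_based_estimate(estimates_hours):
    """Combine individual expert estimates, e.g. from a planning-poker round."""
    return statistics.median(estimates_hours)

print(metrics_based_estimate(120, 8))           # 15.0 hours of execution
print(expert_based_estimate([12, 16, 14, 20]))  # 15.0 hours (median vote)
```

Taking the median rather than the mean is a common way to keep one extreme estimate from skewing the result.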
Estimating methods - considerations
Test Strategy and Test Approach
When do you start the test design phase?
• Preventive approach
- tests are designed as early as possible
Test Strategy and Test Approach continued
• Reactive approach
• test design comes after software or system has been produced
Typical approaches/strategies include:
1. Analytical
• Focusing testing on the most critical functionality (risk-based)
2. Model-based
• Stochastic or Monkey testing using random or statistical information (tool).
• Operational profiles
3. Methodical approaches
• Failure-based (error guessing and fault attacks), experience-based, checklist-based and quality-characteristic-based
Typical approaches/strategies include:
4. Process- or standard-compliant approaches
• Following industry-specific standards or agile methodologies
5. Dynamic and heuristic approaches
• Exploratory testing; more reactive to events than pre-planned
6. Consultative approaches
• Test coverage driven by the advice and guidance of technology and/or business domain experts
Typical approaches/strategies continued…
7. Regression-averse approaches
• such as those that include reuse of existing test material,
extensive automation of functional regression tests, and standard
test suites
5.3 Progress Monitoring and
Control
Metrics and Decision making
Test Progress Monitoring
Common Test Metrics continued…
• Metrics on defects
• Defect density
• % defects found
• % defects fixed
• Failure rate
• Results of re-testing the defects
• Metrics on coverage
• % of requirements covered
• % of code covered
• % of critical risk functionality covered
• % of high risk functionality covered
• % of medium risk functionality covered
• % of low risk functionality covered
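A few of these metrics reduce to simple percentages. The sketch below uses illustrative figures only; real values would come from the test logs and coverage tools:

```python
def pct(part, whole):
    """Percentage helper used for the coverage and defect metrics above."""
    return round(100.0 * part / whole, 1)

# Illustrative figures only; real values would come from the test logs.
requirements_total, requirements_covered = 80, 72
defects_found, defects_fixed = 40, 30

print(pct(requirements_covered, requirements_total))  # 90.0 (% requirements covered)
print(pct(defects_fixed, defects_found))              # 75.0 (% defects fixed)
```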
Common Test Metrics continued…
• Milestones
– have they been reached?
Test Reporting
Summarizes information about testing showing:
• what happened during testing?
• were dates and exit criteria met?
Analyze metrics and make appropriate recommendations
• Assessment of defects remaining
• Is it worth carrying on with testing?
• What’s the situation with risks?
• Are we confident the application will work?
• Assess the adequacy of test objectives/approach and effectiveness
of testing
IEEE-829 Test Summary Report
1. Identifier
2. Summary
3. Variances
4. Comprehensive Assessment
5. Summary of Results
6. Summary of Activities
The Test Summary brings together all pertinent information about the testing, including the number of incidents raised
and outstanding, and, crucially, an assessment of the quality of the system. Details of what was done, and how long it
took, are also recorded for use in future project planning. This document is important in deciding whether the quality
of the system is good enough to allow it to proceed to another stage. This assessment is based upon the detailed
information documented in the Test Plan.
Test Control
Guiding or Corrective actions taken to help us meet the original or
modified plans
Feedback is vital
–this will demonstrate whether the controlling action has had
the desired effect for the project.
5.4 Configuration Management
Supports Testing
Configuration Management
Why do we need CM?
• Without configuration management how can you accurately
reproduce failure/pass?
• Which version of code?
• Which version of the test?
• Which version of the requirement?
• Which configuration of the environment or data?
• For the tester, configuration management helps to uniquely identify
(and to reproduce) the tested item, test documents, the tests and
the test harness(es).
• During test planning, the configuration management procedures and
infrastructure (tools) should be chosen, documented and
implemented.
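A minimal sketch of what configuration management gives the tester: a uniquely identifiable, reproducible record of what was tested. All version identifiers below are hypothetical:

```python
# A minimal sketch of recording the exact configuration under test so that
# a pass or failure can be reproduced later; all identifiers are hypothetical.

tested_configuration = {
    "build": "app-2.3.1",
    "test_suite": "regression-v14",
    "requirements_baseline": "REQ-2017-02",
    "environment": {"os": "Ubuntu 16.04", "db": "PostgreSQL 9.6"},
}

def reproduce_key(cfg):
    """Derive a stable identifier for the tested combination of versions."""
    return "{build}/{test_suite}/{requirements_baseline}".format(**cfg)

print(reproduce_key(tested_configuration))  # app-2.3.1/regression-v14/REQ-2017-02
```

In practice the same idea is implemented with version-control tags and CM tooling rather than an in-memory dictionary.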
5.5 Risk and Testing
Analysis and Management
What is Risk?
Risk can be defined as “the chance of an event, hazard, threat or
situation occurring and resulting in undesirable consequences or a
potential problem”
The level of risk = the likelihood of an adverse event happening × the impact (the
resulting harm from the event)
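The definition above (level of risk = likelihood × impact) can be applied directly to prioritise testing. The 1-5 scales and feature names in this sketch are illustrative assumptions:

```python
# Hypothetical risk-based prioritisation using level = likelihood x impact.
# The 1-5 scales and the feature names are illustrative assumptions.

risks = [
    {"feature": "payment", "likelihood": 4, "impact": 5},
    {"feature": "search",  "likelihood": 3, "impact": 2},
    {"feature": "export",  "likelihood": 2, "impact": 4},
]

for r in risks:
    r["level"] = r["likelihood"] * r["impact"]

# Test the highest-risk features first and most thoroughly.
ordered = sorted(risks, key=lambda r: r["level"], reverse=True)
print([r["feature"] for r in ordered])  # ['payment', 'export', 'search']
```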
Project Risks
• Are risks that surround the project’s capability to deliver its objectives,
such as:
• Organisational factors:
• Skills, training and staff shortages
• Personnel issues
• Political issues
• Improper attitude toward or expectations of testing
• Technical issues:
• Problems in defining the right requirements
• Constraints
• Test environment not ready on time
• Late data conversion
• Low quality design, code, configuration data, test data and results
• Supplier issues – failure of a third party, contractual issues
Mitigating Project Risk
Risk-based Testing
Risk-based Testing continued...
Draw on the collective knowledge and insight of the project stakeholders to
determine the risks and the levels of testing required to address those risks.
To minimise the chance of failure, risk management activities provide a
disciplined approach to:
1. Assess what can go wrong (risks)
2. Determine what risks are important to deal with
3. Implement actions to deal with those risks
Testing
supports the identification of new risks
may help to determine what risks should be reduced
5.6 Incident Management
Defects and IEEE 829
Incident Management
Definitions from the ISTQB Glossary
• failure: Deviation of the component or system from its expected delivery, service
or result.
• defect: A flaw in a component or system that can cause the component or system
to fail to perform its required function, e.g. an incorrect statement or data
definition. A defect, if encountered during execution, may cause a failure of the
component or system.
Incident Management continued...
• Incidents may be raised during development, review, testing or use
of a software product.
Incident Reporting Objectives
Incident Reports
Details of the incident report may include:
Date of issue, issuing organization, and author
Expected and actual results
Identification of the test item (configuration item) and environment
Software or system life cycle process in which the incident was observed
Description of the incident to enable reproduction and resolution, including logs,
database dumps or screenshots
Scope or degree of impact on stakeholder(s) interests
Severity of the impact on the system
Urgency/priority to fix
Status of the incident (e.g., open, deferred, duplicate, waiting to be fixed, fixed awaiting
re-test, closed)
Conclusions, recommendations and approvals
Global issues, such as other areas that may be affected by a change resulting from the
incident
Change history, such as the sequence of actions taken by project team members with
respect to the incident to isolate, repair, and confirm it as fixed
References, including the identity of the test case specification that revealed the problem
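The fields above can be captured in a simple record structure. This sketch covers only a subset of them, and the field names are this example's own, not names mandated by IEEE-829:

```python
from dataclasses import dataclass

# A sketch of an incident record covering a subset of the fields listed
# above; the field names are this example's own, not IEEE-829 mandated names.

@dataclass
class IncidentReport:
    identifier: str
    summary: str
    expected_result: str
    actual_result: str
    severity: str = "medium"
    priority: str = "normal"
    status: str = "open"

report = IncidentReport(
    identifier="INC-042",
    summary="Total not recalculated after discount applied",
    expected_result="Order total reflects the 10% discount",
    actual_result="Order total unchanged",
    severity="high",
)
print(report.status)  # open: newly raised incidents await investigation
```

Recording expected versus actual results separately, as the list above requires, is what lets a developer reproduce the failure.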
IEEE-829
Standard for Software Test Documentation
The Eight Parts of IEEE-829
1. Test Plan
2. Test Design Specification
3. Test Case Specification
4. Test Procedure Specification
5. Test Item Transmittal Report
6. Test Log
7. Test Incident Report
8. Test Summary Report
Part 1 IEEE-829 Test Plan
1) Test Plan Identifier
2) References
3) Introduction
4) Test Items
5) Software Risk Issues
6) Features to be Tested
7) Features not to be Tested
8) Approach
9) Item Pass/Fail Criteria
10) Suspension Criteria and Resumption Requirements
11) Test Deliverables
12) Remaining Test Tasks
13) Environmental Needs
14) Staffing and Training Needs
15) Responsibilities
16) Schedule
17) Planning Risks and Contingencies
18) Approvals
19) Glossary
Purpose: To identify items and features to be tested, tasks to be performed, responsibilities and schedules.
A high-level view of how testing will proceed: WHAT is to be tested, by WHOM, HOW, in what TIME frame, and to what QUALITY level.
Source: http://testingqa.com/ieee-829-standards/
Part 2 IEEE-829 Test Design Specification
1. Test Design Specification Identifier
2. Features to be tested
3. Approach refinements
4. Test identification
5. Feature pass/fail criteria
Source: http://testingqa.com/ieee-829-standards/
Part 3 IEEE-829 Test Case Specification
1. Test case specification identifier
2. Test items
3. Input specifications
4. Output specifications
5. Environmental needs
6. Special procedural requirements
7. Inter-case dependencies
Source: http://testingqa.com/ieee-829-standards/
Part 4 IEEE-829 Test Procedure Specification
1. Identifier
2. Purpose
3. Special Requirements
4. Procedural steps, including setup, proceed, evaluate, shutdown
Purpose: To specify the steps for executing a set of test cases or, more generally,
the steps used to analyze a software item in order to evaluate a set of
features.
Describes how the tester will physically run the test, including set up
procedures. The standard defines ten procedure steps that may be
applied when running a test.
Source: http://testingqa.com/ieee-829-standards/
Part 5 IEEE-829 Test Item Transmittal Report
1. Identifier
2. Transmitted items
3. Location
4. Status
5. Approvals
The recording of when individual items to be tested have been passed from
one stage of testing to another. This includes where to find such items,
what is new about them, and is in effect a warranty of 'fit for test'.
Source: http://testingqa.com/ieee-829-standards/
Part 6 IEEE-829 Test Log
1. Identifier
2. Description
3. Activity and event entries with attachments
Details of what tests were run, by whom, and whether individual tests
passed or failed.
Source: http://testingqa.com/ieee-829-standards/
Part 7 IEEE-829 Test Incident Report
1. Identifier
2. Summary
3. Incident Description
4. Impact
Purpose: To document any event that occurs during testing which requires investigation.
Source: http://testingqa.com/ieee-829-standards/
Part 8 IEEE-829 Test Summary Report
1. Identifier
2. Summary
3. Variances
4. Comprehensive Assessment
5. Summary of Results
6. Summary of Activities
Source: http://testingqa.com/ieee-829-standards/
Purpose: To summarize the results of the designated testing activities and to provide evaluations based on these results.
The Test Summary brings together all pertinent information about the testing, including the
number of incidents raised and outstanding, and, crucially, an assessment of the quality of
the system. Details of what was done, and how long it took, are also recorded for use in
future project planning. This document is important in deciding whether the quality of the
system is good enough to allow it to proceed to another stage. This assessment is based upon
the detailed information documented in the Test Plan.
End of Module 5
Questions?
End of Module 5 Learning Check... 1
1. List in order of independence from least to most...
External tester, developer testing, tester outside of development group, test
specialist
Answer:
Developer testing, tester outside of development group, test specialist, external
tester
End of Module 5 Learning Check... 2
3. Which of the following is not a part of the IEEE-829 standard for Test Plans?
Approvals, Risk, Incident Management Process, Features Not to be Tested, Test
Deliverables, Suspension and Resumption Criteria
Answer:
Incident Management Process

4. How do the dynamic and consultative test approaches handle test design?
Answer:
Dynamic: more reactive to events than pre-planned. Consultative: you can still
pre-plan test design by getting advice from experts.
End of Module 5 Learning Check... 3
5. What sort of estimation technique would you use if there was no historical data?
Answer:
Expert-based.