5) Testing Management

This document provides an overview of the test management topics covered in Module 5 of the ISTQB Certified Tester Foundation Level study sessions, including: test organization and independence; the roles of test leader and tester; test planning activities such as defining the strategy, estimating effort, and establishing entry/exit criteria; and test estimation methods. The document outlines the key elements of a test plan based on the IEEE 829 standard and provides the SPACE DIRT mnemonic for remembering those elements.

ISTQB Certified Tester Foundation Level
• Module 5 of 6: Test Management

February 2017
Study Sessions

• These study packs are intended as study sessions drawn directly from the chapters of the syllabus
• There are 6 main modules
1. Fundamentals of Testing
2. Testing Throughout the Software Life Cycle
3. Static Techniques
4. Test Design Techniques
5. Test Management
6. Tool Support for Testing

2
Covered in this session…

5.1 Test organization
5.2 Test planning and estimation
5.3 Test progress monitoring and control
5.4 Configuration management
5.5 Risk and testing
5.6 Incident management

3
5.1 Test Organisation
Organisation and Independence

Test Organisation and Independence

Options for organising testing, in order of increasing independence:
• Developers test their own code (not independent)
• Developers test each other's code
• Testers within the development teams
• Test team reporting to project management or higher
• Testers from the business organization or user community
• Test specialists, e.g. usability, security or certification specialists
• Testers outsourced or external to the organisation (3rd party)

5
Benefits and drawbacks of independence
Benefits:
• Independent testers see other and different defects
• Are unbiased
• Can verify assumptions people made during specification and implementation of the system

Drawbacks:
• Isolated from the development team
• May be the bottleneck as the last checkpoint
• Developers may lose a sense of responsibility for quality

6
Who does testing?
Testing tasks may be done by people in a specific testing role:
• Test Leader (Test Manager or Test Coordinator)
  - separate roles on large projects
• Testers
  - on a separate test team
  - within the development team

Other roles:
Business and domain experts, quality manager, infrastructure and IT operations

7
Tasks of the Test Leader
• Coordinate the test strategy and plan with project managers and others
• Contribute the testing perspective to other project activities, such as integration planning
• Plan testing:
  • How long will it take? How many cycles?
  • How many people? How much effort and cost?
  • Which test levels, approach and objectives?
  • How will incidents and risk be managed?
• Coordinate the design, preparation, implementation and execution of tests
• Monitor and control test execution

8
Test Leader tasks continued…
• Establish configuration management
• Introduce suitable metrics
• Respond to changes based on progress
  • adapt the test plan accordingly
• Decide the approach to automation
  • Which tests need automating? How, and to what extent?
  • Which testing tools and what training are required?
• Decide what needs to be set up for test environments
• Schedule testing
• Create test summary reports based on what happens during testing

9
Tester tasks:
• Review and contribute to test plans
• Review requirements and specifications for:
  • Completeness
  • Consistency
  • Accuracy
  • Testability
  • Feasibility
• Design test specifications and test cases
• Review tests developed by other testers
• Set up the test environment
• Create test data

10
Tester tasks continued…
• Run tests and log results
• Use testing tools; log faults and results
• Use automation tools
• Test non-functional characteristics, e.g. performance and security

11
Testing tasks may be done by:

• Typically testers at the component and integration level would be developers

• Testers at the acceptance test level would be business experts and users

• Testers for operational acceptance testing would be operators

12
5.2 Test Planning and
Estimation
Activities and Effort
Test Planning
What influences test planning?
• Organisation's test policy
• Scope
• Objectives
• Risks
• Constraints
• Criticality
• Testability
• Availability of resources

As we progress, more information becomes available and more detail can be added or adjusted.

14
Test Planning Activities
• Define the test approach or strategy
  • Test levels, entry and exit criteria
• Integrate and coordinate testing activities
  • Acquisition, supply, development, operation and maintenance
• Make decisions:
  • What to test?
  • What roles will perform the testing activities?
  • When and how should the testing activities be done?
  • How will the test results be evaluated?
  • When to stop testing (exit criteria)?
• Assign resources to the different tasks defined

15
Test Planning Activities continued…
• Define the amount, level of detail, structure and templates for the test documentation
• Schedule:
  • Test analysis and design activities
  • Test implementation, execution and evaluation
• Select metrics for:
  • Monitoring and controlling test preparation and execution
  • Defect resolution
  • Risk issues
• Set the level of detail for test procedures
  • Enough information to reproduce test preparation and execution

16
Entry Criteria
• Define when to start testing
  • at the beginning of a test level
  • or when a test set is ready for execution
• Typical entry criteria:
  • Test environment availability and readiness
  • Test tool readiness in the test environment
  • Testable code availability
  • Test data availability

17
Exit Criteria
When should we stop testing?
• at the end of a test level
• when a set of tests has achieved a specific goal

Typical exit criteria:
• Thoroughness measures, such as coverage of code, functionality or risk
• Estimates of defect density or reliability measures
• Cost
• Residual risks, such as defects not fixed or lack of test coverage in certain areas
• Schedules, such as those based on time to market

18
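Entry and exit criteria like these can be made operational as automated checks over collected metrics. Below is a minimal Python sketch, assuming hypothetical metric names and threshold values; the real criteria and numbers would come from the test plan, not from this example.

# Minimal sketch: evaluating exit criteria against collected metrics.
# Metric names and thresholds here are invented illustrations, not values
# prescribed by the ISTQB syllabus or IEEE 829.

def exit_criteria_met(metrics, thresholds):
    """Return (met, failures) for a set of minimum-threshold criteria."""
    failures = [
        f"{name}: {metrics.get(name, 0.0):.2f} < {minimum:.2f}"
        for name, minimum in thresholds.items()
        if metrics.get(name, 0.0) < minimum
    ]
    return len(failures) == 0, failures

# Hypothetical criteria: code coverage, requirements coverage, pass rate.
thresholds = {"code_coverage": 0.80, "req_coverage": 0.95, "pass_rate": 0.98}
metrics = {"code_coverage": 0.83, "req_coverage": 0.96, "pass_rate": 0.91}

met, failures = exit_criteria_met(metrics, thresholds)
print("Exit criteria met" if met else f"Exit criteria not met: {failures}")

The same shape of check works for entry criteria, with readiness items (environment, tools, code, data) expressed as boolean flags or percentages.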
IEEE-829 Test Plan
1) Test Plan Identifier
2) References
3) Introduction
4) Test Items
5) Software Risk Issues
6) Features to be Tested
7) Features not to be Tested
8) Approach
9) Item Pass/Fail Criteria
10) Suspension Criteria and Resumption Requirements
11) Test Deliverables
12) Remaining Test Tasks
13) Environmental Needs
14) Staffing and Training Needs
15) Responsibilities
16) Schedule
17) Planning Risks and Contingencies
18) Approvals
19) Glossary

Purpose: To identify items and features to be tested, tasks to be performed, responsibilities and schedules.

A high-level view of how testing will proceed: WHAT is to be tested, by WHOM, HOW, in what TIME frame, and to what QUALITY level.
19
SPACE DIRT = IEEE-829 Test Plan
S - Scope: test items, what to test, what not to test
P - People: training, responsibilities, schedule
A - Approach: the approach that will be taken to testing
C - Criteria: entry/exit criteria, suspension/resumption criteria
E - Environment: test environment needs
D - Deliverables: what is being delivered as part of the test process
I - Incidentals: introduction, identification (of the document), approval authorities
R - Risks: risks and contingencies
T - Tasks: the test tasks that are involved in the testing process

Reference: http://www.intosaiitaudit.org/intoit_articles/19_12_spacedirt.pdf

20
Test Estimation
Estimating methods:
1. The metrics-based approach
   - measures from previous or similar projects
   - used when we have historical information or typical values
   - e.g. using a model with past data on test preparation and execution times
2. The expert-based approach
   - assessment by experts or the task owner
   - depends on their expertise/experience
   - e.g. Agile planning poker estimation sessions
• Once the test effort is estimated, resources can be identified and a schedule can be drawn up.
21
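As a rough illustration of the metrics-based approach, effort-per-test-case figures from past projects can be projected onto the planned work. A minimal sketch, with invented historical numbers and a hypothetical 20% contingency factor:

# Minimal sketch of metrics-based test estimation: project effort for a new
# test level from historical hours per test case. All figures are invented
# for illustration; real estimates need data from your own past projects.

historical = [
    # (test cases executed, total effort in person-hours) per past project
    (120, 300),
    (200, 520),
    (80, 210),
]

# Average effort per test case across the historical projects.
hours_per_case = sum(h for _, h in historical) / sum(n for n, _ in historical)

planned_cases = 150
contingency = 1.2  # hypothetical buffer for rework and retesting

estimate = planned_cases * hours_per_case * contingency
print(f"Estimated effort: {estimate:.0f} person-hours")

An expert-based estimate for the same work (e.g. from planning poker) can then be compared against this figure as a sanity check.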
Estimating methods - considerations
• Quality of the specification
• Size and complexity of the application
• Requirements for non-functional testing
• Stability and maturity of the development process
• Tools used
• Skills of the people involved
• Time available
• Amount of rework required!
  • How do you estimate how many defects you will need to retest?

22
Test Strategy and Test Approach
When do you start the test design phase?
• Preventive approach
  - tests are designed as early as possible
  - the V-Model encourages early test design

23
Test Strategy and Test Approach continued
• Reactive approach
  - test design comes after the software or system has been produced
  - a Waterfall development life cycle leads to late test design

24
Typical approaches/strategies include:
1. Analytical approaches
   • Focusing testing on the most critical functionality (risk-based)
2. Model-based approaches
   • Stochastic or "monkey" testing using random or statistical information (tool-supported)
   • Operational profiles
3. Methodical approaches
   • Failure-based (error guessing and fault attacks), experience-based, checklist-based and quality-characteristic-based

25
Typical approaches/strategies continued…
4. Process- or standard-compliant approaches
   • Industry-specific standards (e.g. medical, aviation)
   • Various agile methodologies
5. Dynamic and heuristic approaches
   • such as exploratory testing (more reactive than pre-planned)
   • Execution and evaluation are concurrent tasks
6. Consultative approaches
   • such as those where test coverage is driven primarily by the advice and guidance of technology and/or business domain experts outside the test team
26
Typical approaches/strategies continued…
7. Regression-averse approaches
   • such as those that include reuse of existing test material, extensive automation of functional regression tests, and standard test suites

Note: Different approaches may be combined, for example a risk-based dynamic approach.
27
Test approaches/strategies - considerations
Select an approach considering the context:
• Risk of project failure, hazards to the product, and risks of product failure to humans, the environment and the company
• Skills and experience of the people in the proposed techniques, tools and methods
• The objective of the testing endeavour and the mission of the testing team
• Regulatory aspects, such as external and internal regulations for the development process
• The nature of the product and the business (custom-built vs. COTS)
28
5.3 Progress Monitoring and
Control
Metrics and Decision making
Test Progress Monitoring
Reasons for monitoring test progress:
• to provide feedback and visibility of testing activities
• to gather and supply information to stakeholders so that informed decisions can be made
• to show how we are doing against the plan
  • time and budget
  • exit criteria

What metrics do you keep?


30
Common Test Metrics
• Metrics on test preparation
  • % of test cases designed
  • % of planned test cases designed
  • % of environment set up
• Metrics on test execution
  • % of test cases run
  • % of test cases run and passed
  • % of test cases run and failed
  • % of test cases not run
31
Common Test Metrics continued…
• Metrics on defects
  • defect density
  • % of defects found
  • % of defects fixed
  • failure rate
  • results of re-testing the defects
• Metrics on coverage
  • % of requirements covered
  • % of code covered
  • % of critical-risk functionality covered
  • % of high-risk functionality covered
  • % of medium-risk functionality covered
  • % of low-risk functionality covered

32
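Most of the execution and defect metrics above are simple ratios over a test log. A minimal sketch, assuming a hypothetical record format and invented figures:

# Minimal sketch: deriving common execution and defect metrics from a test
# case log. The record format, statuses and numbers are hypothetical.

test_cases = [
    {"id": "TC-01", "status": "passed"},
    {"id": "TC-02", "status": "failed"},
    {"id": "TC-03", "status": "passed"},
    {"id": "TC-04", "status": "not_run"},
]
defects_found = 12
defects_fixed = 9
kloc = 25  # size of the test object in thousands of lines of code

total = len(test_cases)
run = sum(1 for t in test_cases if t["status"] != "not_run")
passed = sum(1 for t in test_cases if t["status"] == "passed")

print(f"% of test cases run:            {100 * run / total:.0f}%")
print(f"% of test cases run and passed: {100 * passed / total:.0f}%")
print(f"% of test cases not run:        {100 * (total - run) / total:.0f}%")
print(f"% of defects fixed:             {100 * defects_fixed / defects_found:.0f}%")
print(f"Defect density:                 {defects_found / kloc:.2f} defects/KLOC")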
Common Test Metrics continued…
• Metrics on other aspects
  • How the tester feels about the product - are they confident it is fit for purpose?
  • Milestones - have they been reached?
  • Costs - cost/benefit analyses
    - Is it worth continuing with the testing? Will running more tests be cost-beneficial to the project?
33
Test Reporting
Summarizes information about testing, showing:
• what happened during testing
• whether dates and exit criteria were met
Analyze metrics and make appropriate recommendations:
• Assessment of defects remaining
• Is it worth carrying on with testing?
• What is the situation with risks?
• Are we confident the application will work?
• Assess the adequacy of the test objectives/approach and the effectiveness of testing

IEEE 829-1998 includes the outline of a test summary report.
34
IEEE-829 Test Summary Report
1. Identifier
2. Summary
3. Variances
4. Comprehensive Assessment
5. Summary of Results
6. Summary of Activities

Purpose: To summarise what happened during testing.

The Test Summary brings together all pertinent information about the testing, including the number of incidents raised and outstanding, and crucially an assessment of the quality of the system. Details of what was done, and how long it took, are also recorded for use in future project planning. This document is important in deciding whether the quality of the system is good enough to allow it to proceed to another stage. The assessment is based upon detailed information documented in the Test Plan.

35
Test Control
Guiding or corrective actions taken to help us meet the original or modified plans.

Some examples of key controlling actions:
• more resources (people, machines, time)
• de-scope the product
• de-scope the testing
• tighten the entry criteria (better quality delivered into test)
• relax the exit criteria (a reduction in the quality delivered to customers)

Feedback is vital:
• it will demonstrate whether the controlling action has had the desired effect for the project.
36
5.4 Configuration Management
Supports Testing
Configuration Management
• The purpose of configuration management is to establish and maintain the integrity of the products (components, data and documentation)
• For testing, configuration management may involve ensuring the following:
  • All items of testware are identified, version controlled, tracked for changes, related to each other and related to development items (test objects), so that traceability can be maintained throughout the test process
  • All identified documents and software items are referenced unambiguously in test documentation
38
Why do we need CM?
• Without configuration management, how can you accurately reproduce a failure or a pass?
  • Which version of the code?
  • Which version of the test?
  • Which version of the requirement?
  • Which configuration of the environment or data?
• For the tester, configuration management helps to uniquely identify (and to reproduce) the tested item, test documents, the tests and the test harness(es).
• During test planning, the configuration management procedures and infrastructure (tools) should be chosen, documented and implemented.
39
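One lightweight way to make a test run reproducible is to record the exact versions of everything involved alongside the result. A minimal sketch, with hypothetical field names and version identifiers; a real project would pull these values from its CM and version control tools:

# Minimal sketch: capturing the configuration of a test run so that a
# failure (or pass) can be reproduced later. Field names and values are
# hypothetical examples.

from dataclasses import dataclass, asdict
import json

@dataclass
class TestRunRecord:
    test_case_id: str
    test_case_version: str
    code_version: str         # e.g. a VCS commit hash or release tag
    requirement_version: str
    environment: str          # identifier of the environment configuration
    test_data_version: str
    result: str

record = TestRunRecord(
    test_case_id="TC-042",
    test_case_version="1.3",
    code_version="release-2.7.1",
    requirement_version="REQ-108 v2",
    environment="uat-env-config-5",
    test_data_version="dataset-2017-02-01",
    result="failed",
)

# Persisting the record makes "which versions did we test?" answerable later.
print(json.dumps(asdict(record), indent=2))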
5.5 Risk and Testing
Analysis and Management

What is Risk?
Risk can be defined as "the chance of an event, hazard, threat or situation occurring and resulting in undesirable consequences or a potential problem".

The level of risk is the likelihood of an adverse event happening multiplied by the impact (the resulting harm from the event):

Risk = Likelihood x Impact

Two types of risk: project risk and product risk.
41
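The formula is easy to turn into a small scoring aid. A minimal sketch, assuming an invented 1-5 scale for likelihood and impact and illustrative banding thresholds:

# Minimal sketch: scoring risk as likelihood x impact on an invented 1-5
# scale. The scale and the band boundaries are illustrative only.

def risk_level(likelihood, impact):
    """Return the risk score and a coarse classification band."""
    score = likelihood * impact
    if score >= 15:
        band = "high"
    elif score >= 6:
        band = "medium"
    else:
        band = "low"
    return score, band

# Example: a fairly likely event (4/5) with severe impact (5/5).
score, band = risk_level(likelihood=4, impact=5)
print(f"Risk score {score} -> {band}")  # Risk score 20 -> high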
Project Risks
• Are risks that surround the project’s capability to deliver its objectives,
such as:
• Organisational factors:
• Skills, training and staff shortages
• Personnel issues
• Political issues
• Improper attitude toward or expectations of testing
• Technical issues:
• Problems in defining the right requirements
• Constraints
• Test environment not ready on time
• Late data conversion
• Low quality design, code, configuration data, test data and results
• Supplier issues – failure of a third party, contractual issues

42
Mitigating Project Risk
• Test managers use well-established project management principles when analysing, managing and mitigating risks.
• The IEEE 829 standard for test plans requires Risks and Contingencies to be stated.

• Examples of mitigating project risks:
  1. A skill shortage in service testing is mitigated by running knowledge-sharing sessions
  2. The political risk that testing is not valued is mitigated by educating stakeholders about the benefits of testing
43
Product Risks
• Are potential failure areas of the software or system - they are a risk to the quality of the product:
  • Failure-prone software
  • Potential that the software/hardware could cause harm to an individual, company or the environment
  • Poor software characteristics (like usability, reliability, performance)
  • Poor data integrity and quality
  • Software that doesn't perform its intended functions (not fit for purpose)

TIP: Product risks can usually be mitigated by testing. For example, if there is a risk that an airline ticketing site may not cope with traffic during a sale, this can be mitigated by performance testing.
44
Risk-based Testing
• Risks are used to decide where to start testing - and where to test more!
• We can measure residual risk by looking at the effectiveness of critical defect removal rates
• A risk-based testing approach provides proactive opportunities to reduce product risk:
  • Determine the test techniques to be used
  • Determine the extent of testing to be carried out
  • Prioritise testing to attempt to find critical defects earlier
  • Determine whether non-testing activities are needed to reduce risk, e.g. recommending training for inexperienced BAs
45
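Prioritising testing by risk often amounts to sorting the test items by their likelihood x impact score and working from the top. A minimal sketch, with hypothetical feature names and scores on the same invented 1-5 scales as the earlier risk example:

# Minimal sketch: ordering test items by risk score so the riskiest areas
# are tested first and most. Feature names and scores are invented.

features = [
    # (feature, likelihood 1-5, impact 1-5)
    ("payment processing", 4, 5),
    ("report export",      2, 2),
    ("login",              3, 5),
    ("profile settings",   2, 3),
]

# Highest risk first: this gives the order in which to start testing and a
# guide to how much testing effort each feature deserves.
prioritised = sorted(features, key=lambda f: f[1] * f[2], reverse=True)

for name, likelihood, impact in prioritised:
    print(f"{name:20s} risk = {likelihood * impact}")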
Risk-based Testing continued...
Draw on the collective knowledge and insight of the project stakeholders:
• to determine the risks
• and the levels of testing required to address those risks

To minimise the chance of failure, risk management activities provide a disciplined approach to:
1. Assess what can go wrong (risks)
2. Determine what risks are important to deal with
3. Implement actions to deal with those risks

Testing:
• supports the identification of new risks
• may help to determine what risks should be reduced
46
5.6 Incident Management
Defects and IEEE 829

Incident Management
• Since one of the objectives of testing is to find defects, the discrepancies between actual and expected outcomes need to be logged as incidents.
• An incident shall be investigated and may or may not turn out to be a defect.
• Incidents and defects shall be tracked from:
  1. discovery
  2. through classification
  3. to correction
  4. and confirmation of the solution
48
Definitions from the ISTQB Glossary
• anomaly: Any condition that deviates from expectation based on requirements specifications, design documents, user documents, standards, etc., or from someone's perception or experience. Anomalies may be found during, but not limited to, reviewing, testing, analysis, compilation, or use of software products or applicable documentation. [IEEE 1044] See also bug, defect, deviation, error, fault, failure, incident, problem.

• incident: Any event occurring that requires investigation.
49
Definitions from the ISTQB Glossary

• error: A human action that produces an incorrect result.

• failure: Deviation of the component or system from its expected delivery, service
or result.

• defect: A flaw in a component or system that can cause the component or system
to fail to perform its required function, e.g. an incorrect statement or data
definition. A defect, if encountered during execution, may cause a failure of the
component or system.

50
Incident Management continued...
• Incidents may be raised during development, review, testing or use
of a software product.

• They may be raised for issues in code or the working system, or in


any type of documentation including requirements, development
documents, test documents, and user information such as “Help” or
installation guides.

51
Incident Reporting Objectives
Incident reports have the following objectives:
1. Provide developers and other parties with feedback about the problem to enable identification, isolation and correction as necessary
2. Provide test leaders with a means of tracking the quality of the system under test and the progress of the testing
3. Provide ideas for test process improvement

The IEEE 829 standard includes the structure of an incident report.

52
Incident Reports
Details of the incident report may include:
 Date of issue, issuing organization, and author
 Expected and actual results
 Identification of the test item (configuration item) and environment
 Software or system life cycle process in which the incident was observed
 Description of the incident to enable reproduction and resolution, including logs,
database dumps or screenshots
 Scope or degree of impact on stakeholder(s) interests
 Severity of the impact on the system
 Urgency/priority to fix
 Status of the incident (e.g., open, deferred, duplicate, waiting to be fixed, fixed awaiting
re-test, closed)
 Conclusions, recommendations and approvals
 Global issues, such as other areas that may be affected by a change resulting from the
incident
 Change history, such as the sequence of actions taken by project team members with
respect to the incident to isolate, repair, and confirm it as fixed
 References, including the identity of the test case specification that revealed the problem

53
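Defect trackers typically capture these details as structured fields. The sketch below models a subset of them as a record; all field names and example values are hypothetical, not the IEEE 829 field definitions themselves.

# Minimal sketch: an incident report as a structured record, mirroring a
# subset of the details listed above. Fields and values are hypothetical.

from dataclasses import dataclass, field
from datetime import date

@dataclass
class IncidentReport:
    identifier: str
    author: str
    date_of_issue: date
    test_item: str            # configuration item, with version
    environment: str
    expected_result: str
    actual_result: str
    description: str          # enough detail to reproduce and resolve
    severity: str             # impact on the system, e.g. "major"
    priority: str             # urgency to fix, e.g. "high"
    status: str = "open"      # open, deferred, duplicate, fixed, closed, ...
    change_history: list = field(default_factory=list)

incident = IncidentReport(
    identifier="INC-2017-031",
    author="J. Tester",
    date_of_issue=date(2017, 2, 14),
    test_item="ticketing-service v2.7.1",
    environment="uat-env-config-5",
    expected_result="booking confirmed within 2 s",
    actual_result="timeout after 30 s",
    description="Checkout hangs under concurrent load; log attached",
    severity="major",
    priority="high",
)
incident.change_history.append("2017-02-15: assigned to development team")
print(incident.status, "-", incident.description)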
IEEE-829
Standard for Software Test Documentation
The Eight Parts of IEEE-829

1. Test Plan
2. Test Design Specification
3. Test Case Specification
4. Test Procedure Specification
5. Test Item Transmittal Report
6. Test Log
7. Test Incident Report
8. Test Summary Report
55
Part 1 IEEE-829 Test Plan
1) Test Plan Identifier
2) References
3) Introduction
4) Test Items
5) Software Risk Issues
6) Features to be Tested
7) Features not to be Tested
8) Approach
9) Item Pass/Fail Criteria
10) Suspension Criteria and Resumption Requirements
11) Test Deliverables
12) Remaining Test Tasks
13) Environmental Needs
14) Staffing and Training Needs
15) Responsibilities
16) Schedule
17) Planning Risks and Contingencies
18) Approvals
19) Glossary

Purpose: To identify items and features to be tested, tasks to be performed, responsibilities and schedules.

A high-level view of how testing will proceed: WHAT is to be tested, by WHOM, HOW, in what TIME frame, and to what QUALITY level.

Source: http://testingqa.com/ieee-829-standards/
56
SPACE DIRT = IEEE-829 Test Plan
S - Scope: test items, what to test, what not to test
P - People: training, responsibilities, schedule
A - Approach: the approach that will be taken to testing
C - Criteria: entry/exit criteria, suspension/resumption criteria
E - Environment: test environment needs
D - Deliverables: what is being delivered as part of the test process
I - Incidentals: introduction, identification (of the document), approval authorities
R - Risks: risks and contingencies
T - Tasks: the test tasks that are involved in the testing process

Reference: http://www.intosaiitaudit.org/intoit_articles/19_12_spacedirt.pdf

57
Part 2 IEEE-829 Test Design Specification
1. Test Design Specification Identifier
2. Features to be tested
3. Approach refinements
4. Test identification
5. Feature pass/fail criteria

Purpose: To refine the test approach.

Details the test conditions to be exercised, with the expected outcome (in general terms).

Source: http://testingqa.com/ieee-829-standards/
58
Part 3 IEEE-829 Test Case Specification
1. Test case specification identifier
2. Test items
3. Input specifications
4. Output specifications
5. Environmental needs
6. Special procedural requirements
7. Inter-case dependencies

Purpose: To define a test case.

Specifies the data requirements for running tests, based upon the test conditions identified.

Source: http://testingqa.com/ieee-829-standards/
59
Part 4 IEEE-829 Test Procedure Specification
1. Identifier
2. Purpose
3. Special Requirements
4. Procedural steps, including setup, proceed, evaluate, shutdown

Purpose: To specify the steps for executing a set of test cases or, more generally, the steps used to analyze a software item in order to evaluate a set of features.

Describes how the tester will physically run the test, including set-up procedures. The standard defines ten procedure steps that may be applied when running a test.

Source: http://testingqa.com/ieee-829-standards/

60
Part 5 IEEE-829 Test Item Transmittal Report
1. Identifier
2. Transmitted items
3. Location
4. Status
5. Approvals

Purpose: To identify test items being transmitted for testing.

Records when individual items to be tested have been passed from one stage of testing to another, including where to find such items and what is new about them; it is in effect a warranty of 'fit for test'.

Source: http://testingqa.com/ieee-829-standards/
61
Part 6 IEEE-829 Test Log
1. Identifier
2. Description
3. Activity and event entries with attachments

Purpose: To provide a chronological record of relevant details about the execution of tests.

Details which tests were run, by whom, and whether individual tests passed or failed.

Source: http://testingqa.com/ieee-829-standards/
62
Part 7 IEEE-829 Test Incident Report
1. Identifier
2. Summary
3. Incident Description
4. Impact

Purpose: To document any event that occurs during testing which requires investigation.

Details instances where a test 'failed' for a specific reason.

Source: http://testingqa.com/ieee-829-standards/
63
Part 8 IEEE-829 Test Summary Report
1. Identifier
2. Summary
3. Variances
4. Comprehensive Assessment
5. Summary of Results
6. Summary of Activities

Purpose: To summarise what happened during testing.

The Test Summary brings together all pertinent information about the testing, including the number of incidents raised and outstanding, and crucially an assessment of the quality of the system. Details of what was done, and how long it took, are also recorded for use in future project planning. This document is important in deciding whether the quality of the system is good enough to allow it to proceed to another stage. This assessment is based upon detailed information documented in the Test Plan.

Source: http://testingqa.com/ieee-829-standards/

64
End of Module 5

Questions?

65
End of Module 5 Learning Check... 1
1. List in order of independence, from least to most:
External tester, developer testing, tester outside of the development group, test specialist

Answer:
Developer testing, tester outside of the development group, test specialist, external tester

2. List the benefits and drawbacks of independence.

Answer:
Benefits: find different defects, unbiased, can verify assumptions
Drawbacks: isolation, developers lose a sense of responsibility for quality, seen as a bottleneck and a cause of delays
66
End of Module 5 Learning Check... 2
3. Which of the following is not a part of the IEEE829 standard for Test Plans?
Approvals, Risk, Incident Management Process, Features Not to be Tested, Test
Deliverables, Suspension and Resumption Criteria

Answer:
Incident Management Process

4. What is the difference between a dynamic and a consultative test approach?

Answer: A dynamic approach is more reactive than pre-planned; with a consultative approach you can still pre-plan test design by getting advice from experts.

67
End of Module 5 Learning Check... 3
5. What sort of estimation technique would you use if there was no historical data?

Answer:
Expert-based.

6. What is in a Test Summary Report according to IEEE-829?

Answer: Identifier, Summary, Variances, Comprehensive Assessment, Summary of Results, Summary of Activities

7. Which of the following is a product risk?
a) the response time is slow and users abandon the site
b) our best tester just resigned
c) the new payments file may cause problems with account balances

Answer: c)

68
