2.1 Introduction to Testing v3.0-1
Software Testing

© Faculty Training Institute. All rights reserved


APPLIED PROGRAM IN SOFTWARE TESTING (APST) 2021

TESTING MODULES WITHIN APST
Module 2: Introduction to Testing
Module 3: Testing Analysis
Module 4: Test Management
COURSE DELIVERY: SOFTWARE TESTING MODULES 4-6

These modules are delivered through a series of lectures or seminars, facilitated via an online training environment. They are presented over 24 weeks, consisting of a combination of mid-week evening classes and Saturday full-day sessions. The aim is to cover the theory during the week, while the Saturday sessions focus on the assignments that make up POE3.

Lecture schedule for Module 2: Introduction to Testing
1 Unit 1: Introduction to Testing
2 Unit 1: Introduction to Testing (continued)
3 Unit 2: Testing Levels
4 Unit 2: Testing Types
5 Unit 3: The People in Testing

NB:
1. Check and diarise the dates & times for the mid-week and weekend lectures.
2. Familiarise yourself with the online training platform and ensure you have received the dial-in details for each scheduled lecture.
3. Any problems? Contact the course administrator.
MODULE 2: INTRODUCTION TO TESTING

1. Introduction to Testing
2. Testing Methods
3. The People in Testing
OBJECTIVES: INTRODUCTION TO TESTING
Context: Testing within the ADLM

• Testing is a broad topic with many areas of focus. This module introduces the delegate to the concept of testing.

• Furthermore, it explains the different SDLC models, together with the different testing levels and testing types.

• "Systems make it possible, people make it happen" – lastly, we look at how people make it happen.
OBJECTIVES: INTRODUCTION TO TESTING
By the end of this session you will be able to:

• Demonstrate a good understanding of the concept of testing, its psychology and its necessity

• Understand the objectives of testing

• Apply the testing principles and explain the testing process

• Explain the different testing life-cycle models
SESSION PLAN
Content
1 What is testing
➢ Typical objectives of testing
➢ Testing and debugging
2 Why is Testing necessary
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 1:
What is testing?
WHAT IS TESTING?
• Software is now everywhere
  • banks
  • cars
  • hospitals
  • your fridge!
• Consequences if it doesn’t work can be loss of
  • money
  • time
  • business reputation
  • share price
  • and even life itself
• Software testing is a way to assess the quality of software and to reduce the risk of its failure in operation
WHAT IS TESTING? – MORE THAN RUNNING TESTS
Testing also includes:
• Policy and strategies
• Test planning
• Static testing (reviews)
• Process improvement
• Test improvement
• Test analysis
• Test control
• Design tests
• Implement tests
• Execute tests
• Evaluate results
• Check exit criteria and report
• Test closure activities
WHAT IS TESTING? – NOT ONLY RUNNING TESTS

• Dynamic testing
  • software is executed
    ▪ tests are run
  • tries to generate failures
  • manual or automated
    ▪ design usually manual
  • often based on test cases
    ▪ can be exploratory

• Static testing
  • software is not executed
    ▪ no tests run
  • tries to find defects
  • manual or automated
    ▪ static analysis by tools
      o code / models
    ▪ manual review
      o code, requirements, user stories etc.
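The distinction above can be sketched in a few lines of Python. This is an invented illustration, not part of the syllabus: the static check parses the source and reports a finding without ever executing the code, while the dynamic check runs the code against a test input and compares observed with expected behaviour.

```python
# Hypothetical sketch of static vs dynamic testing (function name invented).
import ast

source = """
def discount(price, percent):
    return price - price * percent / 100
"""

# Static check: inspect the source without running it.
# Here a simple tool flags functions that have no docstring.
tree = ast.parse(source)
undocumented = [node.name for node in ast.walk(tree)
                if isinstance(node, ast.FunctionDef)
                and ast.get_docstring(node) is None]
print(undocumented)  # ['discount'] – a finding from static analysis

# Dynamic check: execute the code with a test input and compare the
# observed result with the expected one.
namespace = {}
exec(source, namespace)
assert namespace["discount"](200, 10) == 180  # a passing dynamic test
```

The static check can find issues (here a missing docstring; real tools find much more) even in code paths that no test ever executes, while the dynamic check can only generate failures in behaviour it actually exercises.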
WHAT IS TESTING?

• V&V
• Verification
  • checking a work product conforms to standards / guidelines / rules relevant to the work product
  • did we build the thing right?
• Validation
  • comparing behaviour to user needs & expectations
  • did we build the right thing?
CLASS DISCUSSION

Question 2:
Why do we do testing?
TYPICAL OBJECTIVES OF TESTING
From the ISTQB Syllabus (not a complete list)
• Evaluate work products – requirements, user stories, design, code
  ▪ static testing
• Verify whether all specified requirements have been met
• Validate whether the test object is complete and works as users and other stakeholders expect
• Build confidence in the level of quality of whatever is being tested
• Find defects (of course!)
• Prevent defects
• Provide information for decision-making regarding the system’s quality
  ▪ good enough for the next stage of the project (which may be live operation)?
• Reduce the risk of inadequate software quality
  ▪ risk to the business of defects causing failures in live operation
• Comply with contractual / legal / regulatory requirements or standards
  ▪ verify the test object’s compliance with them
TYPICAL OBJECTIVES OF TESTING
• Objectives can vary with context
  • what is under test
  • test level – see later
  • systems development lifecycle model
    ▪ sequential / iterative / incremental
• Examples
  • component testing (early test level)
    ▪ verify the component and find as many defects as possible
    ▪ achieve a coverage (thoroughness) target
  • acceptance testing (later test level)
    ▪ validate that the system works as expected & meets requirements
    ▪ give stakeholders information about the risks of release at any given time
SESSION PLAN
Content
1 What is testing
➢ Typical objectives of testing
➢ Testing and debugging
2 Why is Testing necessary
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 3:
What is the difference between:
• Testing
• Debugging
TESTING AND DEBUGGING – THEY’RE NOT THE SAME!
• Testing
  1. generates failures (caused by defects)
  2. write an incident report about it (if required)
  6. confirmation testing (to ensure a correct fix)
  7. regression testing (to check for adverse impacts)
  • done by
    • testers (mostly higher test levels)
    • developers (at lower test levels)
• Debugging – “The process of finding, analysing and removing the causes of failures in software”
  3. identify the defect that caused the failure
  4. repair the defect in the code
  5. check that the defect is repaired correctly
  • done by
    • developers
      ▪ at whatever level of testing generated the failure

In summary: testers and developers test, developers debug. Agile testers may help with debugging.

ISO/IEC/IEEE 29119-1 has more info about software testing concepts.
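The test/debug cycle above can be sketched concretely. All function names here are invented for illustration: a test generates a failure, a developer debugs and repairs the defect, a confirmation test re-runs the failing check, and regression tests re-run previously passing checks to look for adverse impacts.

```python
# Invented example of the test -> debug -> confirm -> regression cycle.

def total_with_tax_buggy(amount):
    return amount + amount * 15        # defect: forgot to divide by 100

def total_with_tax_fixed(amount):
    return amount + amount * 15 / 100  # defect repaired during debugging

# 1-3. Testing generates a failure; debugging locates the defect above.
assert total_with_tax_buggy(100) != 115   # observed 1600, expected 115

# 6. Confirmation testing: the repaired code now passes the failing test.
assert total_with_tax_fixed(100) == 115

# 7. Regression testing: previously passing behaviour still holds.
assert total_with_tax_fixed(0) == 0
assert total_with_tax_fixed(200) == 230
```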
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
➢ Testing contribution to success
➢ Quality assurance and testing
➢ Errors, defects and failures
➢ Defects, root causes and effects
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
WHY IS TESTING NECESSARY?
• Because failures happen
  • UK Border Agency's immigration system
    ▪ up to £1 billion of wasted taxpayers’ money
    ▪ 50,000 rejected asylum seekers "lost", backlog of 29,000 applicants
  • UK government tax agency
    ▪ 5 million incorrect tax calculations, huge cost to correct them
  • New York gas utility company's internal payroll system
    ▪ change from Oracle to SAP cost almost $1 billion; incorrect payments resulted in legal action
  • US Affordable Healthcare ("Obamacare") website
    ▪ national site allowed only 6 people to register on the 1st day; took 6 weeks before performance coped with demand
    ▪ Oregon scraps state website, sues supplier for $200 million
WHY IS TESTING NECESSARY? (CONT)
• Screwfix.com home improvement supplies sales website
  ▪ everything priced at £34.99 (data validation error?); orders had to be cancelled, share price collapsed
• Microsoft's Azure cloud failure
  ▪ thousands of companies’ websites crashed when access to Microsoft’s Office 365 apps was affected

Source: The 2014 IT Disaster Hall of Shame
http://origsoft.com/news-desk/2014-disaster-hall-shame/
TESTING’S CONTRIBUTION TO SUCCESS
• Reviews of requirements & user story refinement
  ▪ detect defects there before they are used for further development
  ▪ reduce risk of incorrect or untestable functionality being developed
• Testers work with designers while the system is being designed
  ▪ increase each party’s understanding of the design and how to test it
  ▪ reduce the risk of fundamental design defects
  ▪ enable tests to be identified at an early stage
• Testers work with developers while the code is under development
  ▪ increase each party’s understanding of the code and how to test it
  ▪ reduce risk of defects within both the code and the tests
    o yes, tests can contain defects too
• Verify and validate software prior to release
  ▪ detect failures that might otherwise have been missed
  ▪ support removal of defects that caused the failures
  ▪ increase likelihood that software will meet user needs & requirements
TESTING’S CONTRIBUTION TO SUCCESS
• As long as it’s good testing
  • appropriate techniques
  • skilfully used
  • at the right levels and times

[Diagram: a 2×2 grid of test quality (low/high) against software quality (low/high). Many defects are found only where test quality is high and software quality is low; few defects are found in the other three quadrants. So when few defects are found, you think you are at high software quality – but you may simply be at low test quality.]
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
➢ Testing contribution to success
➢ Quality assurance and testing
➢ Errors, defects and failures
➢ Defects, root causes and effects
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 4:
What is the difference between:
• Quality Assurance
• Testing
TESTING AND QUALITY ASSURANCE
• Testing and QA are not the same
• They are related: quality management ties them together
  • good processes lead to good work products

Quality Management
• Quality Assurance
  • focused on processes
  • retrospectives ➔ process improvement activities
  • e.g. coding standards, process standards, prescribing test techniques and tool use
• Quality Control
  • includes testing, reviews and audits
  • root cause analysis (discussed later)
  • e.g. reviews, audits, test design, test implementation, test execution
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
➢ Testing contribution to success
➢ Quality assurance and testing
➢ Errors, defects and failures
➢ Defects, root causes and effects
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 5:
Is there a difference between:
• An error
• A defect
• A failure
ERRORS, DEFECTS AND FAILURES
• Error (mistake): a human action that produces an incorrect result
• Defect (fault / bug): an imperfection or deficiency in a work product where it does not meet its requirements or specifications
  • e.g. an incorrect statement or data definition
  • if executed, a defect in code may cause a failure
• Failure: an event in which a component or system does not perform a required function within specified limits
  • can also be caused by an environmental condition

A failure is an event; a defect is a state of the work product, caused by an error.
ERRORS, DEFECTS AND FAILURES
A failure is an event; a defect is a state of the work product, caused by an error:
• A person makes an error ...
• ... that creates a defect in the software ...
• ... that can cause a failure in operation
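The error → defect → failure chain can be made concrete with a small invented example: the programmer's mistake while typing leaves a defect in the code, and the defect only causes a failure when that code is executed.

```python
# Invented illustration of error -> defect -> failure.

def average(numbers):
    # Error: the programmer typed len(numbers) + 1 instead of len(numbers),
    # leaving a defect in the code.
    return sum(numbers) / (len(numbers) + 1)

# The defect sits dormant until the code is run ...
result = average([10, 20, 30])

# ... and then causes a failure: the observed result does not match the
# required behaviour (the average of 10, 20 and 30 is 20).
assert result != 20
print(result)  # 15.0 – a failure in operation
```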
ERRORS, DEFECTS AND FAILURES
• Errors may occur for many reasons:
  • aggressive deadlines ➔ excessive time pressure
  • inexperience or lack of skills and training
  • miscommunication between people in the project team
    ▪ e.g. about requirements and/or design
  • complexity of the problem or solution
  • new and/or unfamiliar technologies
  • simple human fallibility
• Failures may also be caused by environmental factors:
  • radiation
    ▪ e.g. solar flares
  • pollution
    ▪ e.g. electrostatic attraction of dust
  • magnetism etc.
ERRORS, DEFECTS AND FAILURES
• Not all unexpected test results are really failures:
  • false positive
    ▪ the test incorrectly failed when it should have passed
    ▪ false positives ➔ spurious defect reports
  • false negative
    ▪ the test incorrectly passed when it should have failed
    ▪ false negatives ➔ missed defects
• These may be due to
  ▪ an error in the way the test was run
  ▪ a defect in
    o test data
    o the test environment
    o other testware
  ▪ other reasons
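A short invented sketch of both cases: the implementation below is correct, so any failed check against it is a testware problem, and an over-weak check shows how a real defect could slip through.

```python
# Invented example of a false positive and a false negative.

def to_celsius(fahrenheit):
    return (fahrenheit - 32) * 5 / 9   # correct implementation

actual = round(to_celsius(98.6), 1)    # 37.0

# False positive: a defect in the test data (a wrong expected value) would
# make this correct implementation look broken – the test fails, producing
# a spurious defect report.
wrong_expected = 36.0
assert actual != wrong_expected

# False negative: a check this weak passes for many incorrect results too,
# so a real defect in to_celsius() could go unnoticed.
weak_check = actual > 0
assert weak_check
```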
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
➢ Testing contribution to success
➢ Quality assurance and testing
➢ Errors, defects and failures
➢ Defects, root causes and effects
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
DEFECTS, ROOT CAUSES AND EFFECTS
• Root cause (a process issue)
  • “The source of a defect, such that if it is removed, the occurrence of the defect type is decreased or removed”
    ▪ by understanding root causes, processes can be improved, thereby preventing future defects
  • e.g. a defect due to a misunderstanding, due to lack of training, due to budget restrictions
    ▪ so, how can we improve training within the budget?

Remember: defect prevention is an aspect of quality assurance.
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
➢ Seven testing principles
4 Test Process
5 Testing: Sequential vs Agile approaches
SEVEN TESTING PRINCIPLES
• Principle:
  • a fundamental truth
  • a rule by which conduct may be guided
• Testing principles:
  • suggested over the past 40+ years
  • offer general guidelines
  • applicable to all software testing situations
• ISTQB® defines 7 testing principles in the CTFL syllabus
CLASS DISCUSSION

Question 6:
Now that you know the definition of a principle → what do you think is a principle of testing?
PRINCIPLE 1: TESTING SHOWS PRESENCE OF DEFECTS
• If all tests pass, is the software definitely defect free? No
• If some tests fail, have we definitely found all the defects? No
• Testing reduces the probability of undiscovered defects remaining
  • but not to 0%!
  • no matter how much testing we do
    ▪ we can’t prove there are no defects
  • testing can only prove there are defects
    ▪ as soon as the first one is found

1. Testing shows the presence of defects, not their absence
PRINCIPLE 2: EXHAUSTIVE TESTING IS IMPOSSIBLE
• What is exhaustive testing?
  • when all the testers are exhausted? No
  • when all the planned tests have been executed? No
  • exercising all combinations of inputs and preconditions? Yes
• Why is exhaustive testing impossible?
  • it’s not feasible (except for trivial cases)
    ▪ there’d never be enough time
  • so, we focus testing effort using risk and priorities

2. Exhaustive testing is impossible
PRINCIPLE 2: WHY CAN’T WE "TEST EVERYTHING"?
• System inputs:
  • type of loan
    ▪ 3 types
  • amount of loan
    ▪ $100 – $10,000 in $10 increments (1,000 values)
  • duration of loan
    ▪ 1 to 120 months
  • or monthly repayment
    ▪ $100 – $1,000 in $10 increments (100 values)
• For exhaustive testing:
  • 3 × 1,000 × 120 + 3 × 1,000 × 100 = 660,000 tests
• At 1 second per test
  • 11,000 minutes (or 183 hours, or 7.6 days)
    ▪ not counting finger trouble, defects or re-testing
• At 10 seconds per test
  • 11 weeks
• At 1 minute per test
  • 15 months
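The arithmetic behind the loan example can be checked directly. The counts are the slide's own (the slide rounds the amount range to 1,000 values); the durations are straightforward conversions.

```python
# Checking the exhaustive-testing arithmetic from the loan example.

types = 3            # types of loan
amounts = 1_000      # loan amounts: $100-$10,000 in $10 increments (rounded)
durations = 120      # loan duration: 1-120 months
repayments = 100     # monthly repayments: $100-$1,000 in $10 increments

# Duration and repayment are alternative inputs, hence the sum of products.
total_tests = types * amounts * durations + types * amounts * repayments
print(total_tests)                        # 660000

seconds = total_tests * 1                 # at 1 second per test
print(seconds / 60)                       # 11000.0 minutes
print(round(seconds / 3600))              # 183 hours
print(round(seconds / 86400, 1))          # 7.6 days

print(round(total_tests * 10 / 86400 / 7))       # at 10 s per test: ~11 weeks
print(round(total_tests * 60 / 86400 / 30.4))    # at 1 min per test: ~15 months
```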
PRINCIPLE 3: EARLY TESTING
• Does it matter when the testing starts? Yes
  • defects found earlier are cheaper to fix
• Which of the following should testing focus on?
  • criticising developers? No
  • showing that the software is working? No
  • finding the showstoppers as quickly as possible? Yes

3. Early testing saves time and money
PRINCIPLE 4: DEFECT CLUSTERING
• True or false? (Usually)
  • defects are distributed evenly in the system – False
  • a small number of modules:
    ▪ contain most of the defects discovered in pre-release testing – True
    ▪ show the most operational failures – True
  • testing should be more thorough where it is most needed – True
• So, predicted defect clusters, and those actually observed in test or operation, are an important input to help focus test effort

4. Defects cluster together
PRINCIPLE 5: PESTICIDE PARADOX
• True or false?
  • tests should find defects – True
  • tests should be repeatable – True
  • run the same test again – find a new defect? Unlikely
• How can we find a stronger pesticide (set of tests)?
  • regularly review and revise tests
  • write new and different tests to exercise different parts and potentially find more defects
  • run existing tests in a different order

5. Beware of the pesticide paradox
PRINCIPLE 5: PESTICIDE PARADOX
Which is the right answer – one that passes? No! One that finds defects.
PRINCIPLE 6: TESTING IS CONTEXT DEPENDENT
• Should all software be tested equally well? No
• Must good testing be the same in different organisations? No
• Testing need not be the same in every situation
  • safety-critical applications
    ▪ more formal, more thorough
  • e-commerce
    ▪ careful balance between time, cost and quality
  • new market, innovative product
    ▪ less formal, time to market more important

6. Testing is context dependent
PRINCIPLE 7: ABSENCE-OF-ERRORS FALLACY
• Should every defect found by testing be fixed? No
• Is it OK to release software with known defects in it? Maybe
• Suppose all defects found by testing are fixed – is the system now OK? Not necessarily
  • what if performance requirements were not specified?
    ▪ users could find the system unacceptably slow
  • users may not be able to use a system even if it conforms to the requirements
• Fixing defects is not enough if the system built is unusable or does not fulfil user/customer needs and expectations

7. Absence-of-errors is a fallacy
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
➢ Test process in context
➢ Test activities and tasks
➢ Test work products
➢ Traceability between the test basis and test work products
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 6:
Is there a universal test process?

Question 7:
Are there some common activities that will always help? Like what?
TEST PROCESS IN CONTEXT
• A test process consists of common sets of activities
• The proper way to do it in any situation will optimise
  • the activities performed
  • how they are done
  • when they are done
  (6. Testing is context dependent)
• Each organisation will have its own view about this
  • its test strategy is a good place to make this view clear
• It’s always good if we can measure test coverage
  • we need an objective way to measure the extent to which testing meets its objectives
TEST PROCESS IN CONTEXT (CONTINUED)
• Factors that can influence the test process (6. Testing is context dependent):
  • product and project risks
  • SDLC model(s) and project methodology/(ies) being used
  • test levels and test types being considered
  • business domain
    ▪ the market sector in which the business operates
    ▪ the part(s) of the business for which systems are being developed
  • operational constraints, including
    ▪ budgets and resources
    ▪ timescales
    ▪ complexity
    ▪ contractual and regulatory requirements
  • organizational policies and practices
  • internal and/or external standards

ISO/IEC/IEEE 29119-2 has more info about test processes.
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
➢ Test process in context
➢ Test activities and tasks
➢ Test work products
➢ Traceability between the test basis and test work products
5 Testing: Sequential vs Agile approaches
CLASS DISCUSSION

Question 8:
Think about inviting family for a Sunday lunch →
• Name some of the activities
• What would be the first activity?
TEST ACTIVITIES AND TASKS
• The ISTQB model has these main groups of activities:
  • Test Planning
  • Test Monitoring and Control
  • Test Analysis
  • Test Design
  • Test Implementation
  • Test Execution
  • Test Completion
• Not necessarily sequential
  • monitoring & control is continuous after planning
  • other activity groups may combine, be omitted, overlap or even be concurrent
  • iterative SDLCs do them iteratively
  • sequential SDLCs may feature feedback
TEST PLANNING
• Test planning involves activities that define
  • the objectives of testing
  • the approach for meeting them, e.g.
    ▪ test activities and tasks
    ▪ test schedule
    ▪ within applicable constraints
• Feedback from monitoring and control may cause re-planning
• More in section 5.2
TEST MONITORING AND CONTROL
• Test monitoring involves
  • comparing progress to the plan
  • using metrics defined in the plan
• Test control involves
  • taking actions to meet the plan’s objectives
    ▪ if progress isn’t meeting them
• Both involve evaluation of results against exit criteria
  • Agile: Definition of Done
  ▪ e.g. checking test results and logs against coverage criteria
  ▪ assessing the level of component or system quality
    o based on test results and logs
  ▪ determining if more tests are needed
TEST ANALYSIS
• Test analysis involves
  • collecting and analysing the “test basis”
    ▪ all the information that can be found about expected behaviour
  • identifying testable features
  • defining the test conditions
    ▪ to show whether features are implemented correctly
    ▪ to provide coverage criteria
• Major activities:
  • analyse the test basis
  • evaluate the test basis
  • identify features to be tested
  • define and prioritise test conditions
  • capture bi-directional traceability from basis to conditions
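Bi-directional traceability from the test basis to test conditions can be sketched as a pair of mappings. The requirement and condition identifiers below are invented for illustration; the point is that forward traceability answers "which conditions cover requirement R?" while backward traceability answers "which requirement does condition C come from?".

```python
# Hypothetical traceability sketch (REQ/TC identifiers invented).

basis_to_conditions = {
    "REQ-01 user can log in": ["TC-01 valid credentials",
                               "TC-02 wrong password rejected"],
    "REQ-02 account locks after 3 failures": ["TC-03 lockout on 3rd failure"],
}

# Derive the backward direction from the forward one, so the two
# directions can never drift out of sync.
condition_to_basis = {cond: req
                      for req, conds in basis_to_conditions.items()
                      for cond in conds}

# Forward: which conditions cover this requirement?
print(basis_to_conditions["REQ-01 user can log in"])

# Backward: which requirement does this condition trace to?
print(condition_to_basis["TC-03 lockout on 3rd failure"])
```

In practice a test management tool maintains these links, but the idea is the same: every condition traces back to an item of the test basis, and every basis item can list the conditions that cover it.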
TEST ANALYSIS (CONTINUED)
• The test basis, depending on level, may include
  • requirement specifications, e.g.
    ▪ business / system / functional requirements
    ▪ epics and user stories
  • design and implementation information, e.g.
    ▪ architecture & design specifications
      o diagrams or documents
      o interface specifications, call flows
    ▪ any work products that specify component or system structure
      o could include UML / other diagrams, e.g. data models
  • the implementation of the component / system itself
    ▪ code, database metadata and queries, interface data
  • outputs from risk analysis
• Full list in the learner handbook
TEST ANALYSIS (CONTINUED)
• Define test conditions via the session 4 techniques
  • black-box, white-box & experience-based
  • reduce the likelihood of missing something important
  • define conditions accurately and precisely
• Test conditions can go into test charters
  • help to make exploratory testing coverage measurable
• Test analysis can find defects in the test basis
  • identifying testable features can reveal problems
    ▪ manual verification and validation
    ▪ an important potential benefit, especially if there are no reviews
  • ATDD and BDD techniques help with this in Agile
    ▪ see the Study Notes for more detail
TEST DESIGN
• Test design involves elaborating test conditions
  • turning them into test cases
  • creating other testware
• Major activities:
  • design and prioritise test cases
    ▪ and sets of test cases
  • identify test data needed to support these test cases
  • design the test environment
  • identify any required infrastructure and tools
  • extend bi-directional traceability to the test cases
• Test design can use the same techniques as test analysis
• It can find defects in the test basis
  • in the same way as test analysis can
TEST IMPLEMENTATION
• Test implementation involves finalising testware
  • turning test cases into test procedures / scripts
  • completing / creating all other testware needed
• Major activities:
  • develop and prioritise test procedures
    ▪ may include sequencing more than one test case into one test procedure
    ▪ may include creating automated test scripts
  • create test suites (from test procedures and/or automated test scripts)
  • arrange test suites in an efficient test execution schedule
  • build the test environment and verify that it is set up correctly
    ▪ test harnesses, simulators etc. if needed
  • prepare test data and load it into the test environment
  • verify / update bi-directional traceability: test basis ➔ test suites
• Test implementation is often combined with test design
  • both are done together with test execution in exploratory testing
TEST EXECUTION
• Test execution involves running the tests
  • according to the test execution schedule
• Major activities:
  • execute tests, manually or with test execution tools
  • compare actual results with expected ones
  • analyse anomalies to establish likely causes
    ▪ failure due to a defect, or a false positive?
  • report defects based on failures observed
    ▪ or suspected, if the cause is not yet clear
  • log outcomes (pass, fail, blocked)
    ▪ include identities & versions of the test items / test object & everything used
  • repeat test activities
    ▪ after a fix, a change, or a test not properly run
  • verify & update bi-directional traceability ➔ test results
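The core execution loop above — run each test, compare actual with expected results, and log an outcome of pass, fail or blocked — can be sketched in miniature. All test data and the function under test are invented for illustration.

```python
# Invented sketch of executing tests and logging outcomes.

def add(a, b):           # the (trivial) test object
    return a + b

tests = [
    {"id": "T1", "inputs": (2, 3), "expected": 5},
    {"id": "T2", "inputs": (2, 2), "expected": 5},   # will fail
    {"id": "T3", "inputs": None,   "expected": 9},   # precondition not met
]

log = {}
for test in tests:
    if test["inputs"] is None:       # e.g. test data or environment unavailable
        log[test["id"]] = "blocked"
        continue
    actual = add(*test["inputs"])    # execute the test
    # compare actual result with the expected one and log the outcome
    log[test["id"]] = "pass" if actual == test["expected"] else "fail"

print(log)  # {'T1': 'pass', 'T2': 'fail', 'T3': 'blocked'}
```

A real execution tool would additionally record versions of the test items and testware used, so that the "fail" for T2 can be analysed later — is it a defect in `add`, or a false positive caused by wrong expected data?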
TEST COMPLETION
• Test completion involves tidying up after
  • a test level is completed
  • an Agile iteration is finished
  • a test project is completed (or cancelled)
  • a new system or maintenance release goes live
• Major activities:
  • check that all defect reports are closed
    ▪ create CRs or product backlog items for any not closed
  • create a test summary report for stakeholders
  • finalise / archive testware for later reuse and/or an audit trail
    ▪ test environment / data / infrastructure / other testware, incl. test results
  • hand over testware to maintenance teams / project teams / others
  • analyse lessons learned from completed test activities
    ▪ determine changes needed for future iterations / releases / projects
    ▪ use the information gathered to improve test process maturity
FORMATIVE ASSESSMENT PART 2
• Activity 2.1.1: Test Process Activities
  • 15 minutes to do it, then
  • 5 minutes to discuss the solution.
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
➢ Test process in context
➢ Test activities and tasks
➢ Test work products
➢ Traceability between the test basis and test work products
5 Testing: Sequential vs Agile approaches
TEST WORK PRODUCTS
• Things created as part of the test process – “testware”
  • different organisations will create different ones
    ▪ and/or use different names
  • those described below correspond to the process described above
    (e.g. test plan, test condition, test case)
  • tools can help to create and store them
    ▪ especially test management tools & defect management tools

ISO/IEC/IEEE 29119-3 has more info about test work products.
TEST WORK PRODUCTS – PLANNING, MONITORING & CONTROL
• Test planning
  • test plans, including
    ▪ info about the test basis, on which to build traceability
    ▪ exit criteria, schedules etc. for monitoring and control
    ▪ more detail later in the course
• Test monitoring and control
  • test progress reports
  • test summary reports
  • more detail later in the course
TEST WORK PRODUCTS – TEST ANALYSIS AND TEST DESIGN
• Test analysis
  • prioritised test conditions
    ▪ with bi-directional traceability to the test basis
  • high-level conditions in a test charter for exploratory testing
• Test design
  • test cases and sets of test cases
    ▪ a test case’s objective: exercise one or more conditions
    ▪ with bi-directional traceability to test conditions
    ▪ may be at a high level
      o no concrete values, so re-usable across multiple test cycles
  • design and/or identification of test data
  • design of the test environment
  • identification of infrastructure and tools
  • corrected / amended / extra test conditions
TEST WORK PRODUCTS – TEST CASES

• Test case consists of
  • Preconditions
    ▪ must be true before test begins
    ▪ e.g., logged in with correct credentials
  • Inputs (values & actions)
    ▪ data that will be provided during the test
    ▪ things that will be done during the test
  • Expected results
    ▪ data values etc. that will allow you to verify correct behaviour
  • Post-conditions
    ▪ things that should be true after the test has finished
    ▪ not directly relevant to the objectives of that particular test case
    ▪ may be difficult to identify
      o may be about persistence, e.g. outcomes of test should remain stable
      o may be about housekeeping, e.g. clearing session cookies
69
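The four parts of a test case described above can be sketched as a simple data structure. This is a minimal, illustrative sketch in Python – the class and field names, and the login example, are assumptions for teaching purposes, not part of any standard or tool.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a test case record with the four parts described
# on the slide. Field names are illustrative, not from a standard.
@dataclass
class TestCase:
    case_id: str
    objective: str                                             # condition(s) it exercises
    preconditions: list[str] = field(default_factory=list)     # must be true before test begins
    inputs: list[str] = field(default_factory=list)            # values provided & actions performed
    expected_results: list[str] = field(default_factory=list)  # how to verify correct behaviour
    postconditions: list[str] = field(default_factory=list)    # should be true after test finishes

login_case = TestCase(
    case_id="TC-A1",
    objective="Valid user can log in",
    preconditions=["User account exists", "User is logged out"],
    inputs=["Enter correct credentials", "Click 'Log in'"],
    expected_results=["Home page is shown", "Username appears in header"],
    postconditions=["Session cookies cleared afterwards (housekeeping)"],
)
print(login_case.case_id, "-", login_case.objective)
```

Keeping the parts separate like this makes it explicit which checks belong to the test objective (expected results) and which are housekeeping (post-conditions).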
TEST WORK PRODUCTS – TEST IMPLEMENTATION

• test procedures &/or scripts
  • procedures for manual tests, scripts for automated ones
  • may have >1 test case sequenced within
• test suites
  • containing test procedures / scripts
• test execution schedule
• ancillary testware
  • test harnesses, simulators, service virtualisation
  • often for use by / with tools
• test data
  • data values turn high-level test cases into executable ones
• test environment
  • created, verified, data loaded
• corrected / amended / extra test conditions &/or test cases
70
TEST WORK PRODUCTS – TEST EXECUTION

• Defect reports (see later in the course)
• Status of individual test cases / procedures
  • e.g. ready to run, pass, fail, blocked, deliberately skipped etc.
• Records of what was involved in the testing
  • test item(s), test object, test tools, other testware
    ▪ including relevant version information
  • usually in a test log
71
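A test log entry like the one described above can be sketched as a small record capturing status and version information. This is an illustrative sketch only – the `Status` values mirror the slide, but the function and field names are assumptions, not any tool's API.

```python
from enum import Enum
from datetime import datetime, timezone

# Execution statuses taken from the slide; the enum itself is illustrative.
class Status(Enum):
    READY_TO_RUN = "ready to run"
    PASSED = "pass"
    FAILED = "fail"
    BLOCKED = "blocked"
    SKIPPED = "deliberately skipped"

def log_entry(case_id, status, test_object, version):
    # A real test log would also record test items, tools and other testware used.
    return {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "test_case": case_id,
        "status": status.value,
        "test_object": test_object,
        "version": version,  # relevant version information
    }

entry = log_entry("TC-B2", Status.BLOCKED, "payments-service", "2.4.1")
print(entry["test_case"], entry["status"])
```

Recording the test object's version with each run is what makes results reproducible and auditable later.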
TEST WORK PRODUCTS – TEST COMPLETION

• Test summary reports (see later in the course)
  • also, jointly, products of monitoring & control
• Action items for improvement of subsequent projects
  • or iterations, after an Agile retrospective
• Change requests or product backlog items
  • from defect reports that hadn’t been closed, so they can now be closed
• Finalized testware
  • documented
  • archived &/or handed over
72
FORMATIVE ASSESSMENT PART 2

• Activity 2.1.2: Test Process Products
• 15 minutes to do it, then
• 5 minutes to discuss solution.
73
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
➢ Test process in context
➢ Test activities and tasks
➢ Test work products
➢ Traceability between the test basis and test work products
5 Testing: Sequential vs Agile approaches
74
TRACEABILITY BETWEEN THE TEST BASIS AND TEST WORK PRODUCTS

• Two conversations ...
• (1) test results
  • stakeholder: How’s the testing going?
  • tester: Well, tests FBA345 & FDB378 have passed but FGJ129 has failed.
  • stakeholder:
• (2) change requests
  • stakeholder: We need to change requirement 27.3.
  • tester: Which tests will be affected?
  • stakeholder: Why are you asking me?
76
TRACEABILITY BETWEEN THE TEST BASIS AND TEST WORK PRODUCTS

• need to know how our testware items relate
  • to each other
  • to the test basis
• traceability throughout the test process supports
  • understanding test results
  • analyzing the impact of changes
  • relating technical aspects of testing to stakeholders in terms that they can understand
  • making testing auditable
  • meeting IT governance criteria
  • providing information to assess product quality, process capability and project progress against business goals
• must take us both ways – bi-directional
• test management tools should support this
  • buy, or build your own
77
TRACEABILITY BETWEEN THE TEST BASIS AND TEST WORK PRODUCTS

[Diagram: traceability example – Requirements Specification → Test Conditions → Test Cases → Test Procedures. Req. 1 traces to test conditions 1–3 and Req. 2 to test conditions 4–6; test cases TC A1, TC B2, TC C3 and TC D4 cover the conditions; test procedures TP 1.101, TP 1.102 and TP 1.103 execute the cases, each run recorded as not run, passed or failed.]

• Req.1: all tests run, all passed
• Req.2: all tests run, not all passed; test condition 6 not covered by a Test Case
• Req. 2 changes: look for impact on test procedures TP 1.102 and TP 1.103
78
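Forward tracing of the kind shown in the example above can be sketched as three lookup tables and a walk through them. This is a minimal sketch – the exact condition-to-case assignments are assumptions inferred from the example, and the identifiers are illustrative.

```python
# Minimal traceability sketch: requirement -> conditions -> cases -> procedures.
# Mappings are assumed for illustration; only the Req.2 outcome is taken
# directly from the slide's example.
req_to_conditions = {"Req.1": ["Cond.1", "Cond.2", "Cond.3"],
                     "Req.2": ["Cond.4", "Cond.5", "Cond.6"]}
condition_to_cases = {"Cond.1": ["TC A1"], "Cond.2": ["TC B2"], "Cond.3": ["TC B2"],
                      "Cond.4": ["TC C3"], "Cond.5": ["TC D4"],
                      "Cond.6": []}  # Cond.6: not covered by a test case
case_to_procedures = {"TC A1": ["TP 1.101"], "TC B2": ["TP 1.102"],
                      "TC C3": ["TP 1.102"], "TC D4": ["TP 1.103"]}

def impacted_procedures(req):
    """Forward trace: which test procedures to revisit if this requirement changes."""
    procs = set()
    for cond in req_to_conditions.get(req, []):
        for case in condition_to_cases.get(cond, []):
            procs.update(case_to_procedures.get(case, []))
    return sorted(procs)

def uncovered_conditions(req):
    """Coverage gap check: conditions with no test case."""
    return [c for c in req_to_conditions.get(req, []) if not condition_to_cases.get(c)]

print(impacted_procedures("Req.2"))   # procedures affected by a change to Req.2
print(uncovered_conditions("Req.2"))  # conditions lacking coverage
```

With these mappings, a change to Req.2 points at TP 1.102 and TP 1.103, matching the example, and the gap check flags the uncovered condition. Storing the reverse mappings as well gives the bi-directional traceability the slides call for.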
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
➢ Testing and development activities
➢ Project work products
79
TESTING AND DEVELOPMENT ACTIVITIES

• Testing and development activities are interconnected
  • structures defined by models that are:
    ▪ sequential, iterative and/or incremental
  • Agile is ‘iterative incremental’
• Agile models differ in:
  • how activities are integrated
  • project work products
  • names
  • test level entry/exit criteria
  • use of tools
  • use of independence in testing
• Ability to adapt is key
  • lifecycle implementations vary greatly
  • deviations from the ideals
    ▪ may be intelligent customisation and adaptation
80
COMPARING TESTING AND DEVELOPMENT ACTIVITIES 1

Agile
• Short iterations
  • usually 1 to 4 weeks
  • implements a few user stories
  • early and frequent feedback
• Working software
  • deliverable of each iteration
  • decision based on quality

Traditional
• Longer iterations
  • few months if not longer
  • major features
  • feedback later and much less frequent
• Delivered software
  • delivery deadline sometimes more important than quality
81
COMPARING TESTING AND DEVELOPMENT ACTIVITIES 2

Agile
• Testing roles
  • developers, testers and business stakeholders
    ▪ all involved in testing throughout each iteration
    ▪ most likely from unit tests to acceptance tests inclusive
• Technical debt
  • strategies to resolve early or avoid altogether

Traditional
• Testing roles
  • developers: unit test (if time permits)
  • testers: other test levels (often subject to ‘squeeze’)
  • business stakeholders often not involved in testing
• Technical debt
  • accumulates
82
COMPARING TESTING AND DEVELOPMENT ACTIVITIES 3

Agile
• Risk-based testing
  • risk analysis during release and iteration planning
  • influences sequence of development and priority and depth of testing
• Pairing
  • continuous review
  • shared ownership of quality

Traditional
• Risk-based testing
  • emphasis on risk analysis at start of project
  • on-going risk analysis with view of whole project
• Pairing
  • not traditional
83
SESSION PLAN
Content
1 What is testing
2 Why is Testing necessary
3 Testing Principles
4 Test Process
5 Testing: Sequential vs Agile approaches
➢ Testing and development activities
➢ Project work products
84
PROJECT WORK PRODUCTS (LIGHTWEIGHT)

• Business-oriented
  • describe what is needed and how to use the software
  • e.g. Requirements Spec., User stories, User documentation
• Development
  • describe how the software is built, designs that implement it, and tests that evaluate code pieces
  • e.g. DB design, Code, Unit tests
• Testing
  • describe how the system is tested, artefacts that test it, and artefacts that present results
  • e.g. Test plan, Test scripts, Dashboards
85
BUSINESS-ORIENTED WORK PRODUCTS IN AGILE

• User stories/epics
  • written on cards
    ▪ or electronic equivalent
• Include
  • acceptance criteria
  • conversation notes
  • benefits

[Diagram: Theme → Epics → Stories → Tasks. Example – theme: Frequent flyer; epics: Book flights, Cancel flights; stories under Book flights: Book using points, Rebook regular flight; stories under Cancel flights: Cancel with no charge, Emailed confirmation; each user story broken down into tasks.]
86
DEVELOPMENT WORK PRODUCTS IN AGILE

• Code
  • low technical debt
    ▪ no known defects
    ▪ easy to maintain
    ▪ testable
• Unit tests
  • manual or automated
  • form of executable low-level design specification (TDD)

Test-driven development (TDD): automated unit tests created and run before code is written to pass the tests, one test at a time
87
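The TDD rhythm described above can be sketched with a small unit-test suite. This is an illustrative sketch only – the `leap_year` function and its tests are invented for teaching purposes; in real TDD each test would be written first (and seen to fail, “red”), then just enough code added to pass it (“green”), then the code refactored.

```python
import unittest

def leap_year(year: int) -> bool:
    # Built incrementally: each clause was added only after a failing test
    # demanded it (first % 4, then the % 100 exception, then % 400).
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

class TestLeapYear(unittest.TestCase):
    # One test at a time: each test below preceded the code that passes it.
    def test_divisible_by_four(self):
        self.assertTrue(leap_year(2024))

    def test_century_is_not_leap(self):
        self.assertFalse(leap_year(1900))

    def test_four_hundredth_year_is_leap(self):
        self.assertTrue(leap_year(2000))

# Run the suite programmatically (keeps the snippet self-contained).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestLeapYear)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all green:", result.wasSuccessful())
```

The finished suite doubles as the “executable low-level design specification” the slide mentions: the tests document the intended behaviour and re-verify it on every run.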
TESTER WORK PRODUCTS IN AGILE

• Tests
  • manual and automated
• Test plans/strategies
• Defect reports
• Test result logs
• Test reports
  • including metrics
• Dashboards
88
MODULE 2: INTRODUCTION TO TESTING

1. Introduction to Testing
2. Testing Methods
3. The People in Testing
89
