Ch22 and Ch23 - Testing

This document discusses software verification, validation and testing. It covers verification and validation planning, software inspections, automated static analysis, Cleanroom software development and system testing; it explains the difference between verification and validation, contrasts static verification (inspections, static analysis) with dynamic verification (testing), and describes integration, release, stress and component testing.

Based on lecture slides ©Ian Sommerville 2004, Software Engineering, 7th edition, Chapters 22 and 23.


Software Verification, Validation and Testing


Topics covered
 Verification and validation planning
 Software inspections
 Automated static analysis
 Cleanroom software development
 System testing


Verification vs validation
 Verification: "Are we building the product right?"
• The software should conform to its specification.
 Validation: "Are we building the right product?"
• The software should do what the user really requires.


Static and dynamic verification
 Software inspections. Concerned with analysis of the static system representation to discover problems (static verification).
• May be supplemented by tool-based document and code analysis.
 Software testing. Concerned with exercising and observing product behaviour (dynamic verification).
• The system is executed with test data and its operational behaviour is observed.


Software inspections
 Software inspection involves examining the source representation with the aim of discovering anomalies and defects, without executing the system.
 Inspections may be applied to any representation of the system (requirements, design, configuration data, test data, etc.).


Inspections and testing
 Inspections and testing are complementary, not opposing, verification techniques.
 Inspections can check conformance with a specification, but not conformance with the customer's real requirements.
 Inspections cannot check non-functional characteristics such as performance, usability, etc.
 Management should not use inspections for staff appraisal, i.e. finding out who makes mistakes.


Inspection procedure
 System overview presented to inspection team.
 Code and associated documents are distributed to the inspection team in advance.
 Inspection takes place and discovered errors are noted.
 Modifications are made to repair errors.
 Re-inspection may or may not be required.
 A checklist of common errors should be used to drive the inspection. Examples: initialization, constant naming, loop termination, array bounds, etc.
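As a hedged illustration (all names hypothetical), the fragment below seeds exactly the kinds of defects these checklist items target:

```java
import java.util.Queue;

// Hypothetical fragment seeded with checklist-style faults.
public class ChecklistExamples {

    static final int maxSize = 100;      // Constant naming: convention says MAX_SIZE

    public static int sum(int[] values) {
        int total = 0;                   // Initialization: correctly set before use
        // Array bounds: <= runs one index past the last element
        for (int i = 0; i <= values.length; i++) {
            total += values[i];
        }
        return total;
    }

    public static void drain(Queue<String> queue) {
        // Loop termination: peek() does not remove elements,
        // so a non-empty queue makes this loop run forever
        while (!queue.isEmpty()) {
            System.out.println(queue.peek());
        }
    }
}
```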



Inspection roles
[Table of inspection roles from the original slide.]


Inspection checks 1
[Checklist table from the original slide.]


Inspection checks 2
[Checklist table from the original slide.]


Automated static analysis
 Static analysers are software tools for source text processing.
 They parse the program text and try to discover potentially erroneous conditions, bringing these to the attention of the V & V team.
 They are very effective as an aid to inspections: a supplement to, but not a replacement for, inspections.
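The original slides illustrate this with LINT; as a minimal sketch in Java (names hypothetical), a static analyser such as SpotBugs, or the compiler's own warnings, would typically report anomalies like these without running the program:

```java
// Illustrative fragment: the marked anomalies are the kind of
// conditions static analysers report from the source text alone.
public class AnalyserFodder {

    public static int classify(int x) {
        int unused = 42;             // Unused variable: assigned but never read
        Integer boxed = null;
        if (x > 10) {
            boxed = x;
        }
        return boxed.intValue();     // Possible null dereference when x <= 10
    }
}
```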



LINT static analysis
[LINT session transcript from the original slide: warnings produced by running lint on a small C program.]


Cleanroom software development
 The name is derived from the 'Cleanroom'
process in semiconductor fabrication. The
philosophy is defect avoidance rather than
defect removal.
 This software development process is based on:
• Incremental development;
• Formal specification;
• Static verification using correctness arguments;
• Statistical testing to determine program reliability.



Cleanroom process teams
 Specification team. Responsible for developing and maintaining the system specification.
 Development team. Responsible for developing and verifying the software. The software is NOT executed or even compiled during this process.
 Certification team. Responsible for developing a set of statistical tests to exercise the software after development. Reliability growth models are used to determine when reliability is acceptable.



System testing
 Involves integrating components to create a
system or sub-system.
 May involve testing an increment to be
delivered to the customer.
 Two phases:
• Integration testing - the test team have access
to the system source code. The system is
tested as components are integrated.
• Release testing - the test team test the
complete system to be delivered as a black-box.



Incremental integration testing
[Figure: test sequence 1 integrates components A and B and runs tests T1, T2, T3; test sequence 2 adds component C and runs T1-T4; test sequence 3 adds component D and runs T1-T5. Existing tests are re-run each time a component is added.]


Release testing
 The process of testing a release of a system
that will be distributed to customers.
 Primary goal is to increase the supplier’s
confidence that the system meets its
requirements.
 Release testing is usually black-box or
functional testing
• Based on the system specification only;
• Testers do not have knowledge of the system
implementation.



Black-box testing
[Figure: the system is treated as a black box. Input test data Ie includes the inputs causing anomalous behaviour; output test results Oe include the outputs which reveal the presence of defects.]


Stress testing
 Exercises the system beyond its maximum design load. Stressing the system often causes defects to come to light.
 Stressing the system tests its failure behaviour. Systems should not fail catastrophically; stress testing checks for unacceptable loss of service or data.
 Stress testing is particularly relevant to distributed systems, which can exhibit severe degradation as a network becomes overloaded.



Component testing
 Component or unit testing is the process of
testing individual components in isolation.
 It is a defect testing process.
 Components may be:
• Individual functions or methods within an object;
• Object classes with several attributes and
methods;
• Composite components with defined interfaces
used to access their functionality.



Object class testing
 Complete test coverage of a class involves (see the sketch after this list):
• Testing all operations associated with an object;
• Setting and interrogating all object attributes;
• Exercising the object in all possible states.
 Inheritance makes it more difficult to design object class tests, as the information to be tested is not localised.
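A minimal JUnit 4 sketch, assuming a hypothetical BoundedCounter class: the tests cover its operations (increment, reset), set and interrogate its value attribute, and exercise its initial, intermediate and saturated states.

```java
import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical class under test: a counter that saturates at a limit.
class BoundedCounter {
    private final int limit;
    private int value = 0;                 // attribute to set and interrogate

    BoundedCounter(int limit) { this.limit = limit; }

    void increment()  { if (value < limit) value++; }  // operation
    void reset()      { value = 0; }                   // operation
    int getValue()    { return value; }                // interrogate attribute
    boolean atLimit() { return value == limit; }       // state query
}

public class BoundedCounterTest {

    @Test public void coversAllOperations() {
        BoundedCounter c = new BoundedCounter(2);
        c.increment();
        assertEquals(1, c.getValue());
        c.reset();
        assertEquals(0, c.getValue());
    }

    @Test public void exercisesAllStates() {
        BoundedCounter c = new BoundedCounter(2);
        assertFalse(c.atLimit());          // initial state
        c.increment();                     // intermediate state
        c.increment();
        assertTrue(c.atLimit());           // saturated state
        c.increment();                     // operation in saturated state
        assertEquals(2, c.getValue());     // value must not exceed the limit
    }
}
```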



Interface testing
 Objectives are to detect faults due to interface errors or invalid assumptions about interfaces.
 Particularly important for object-oriented development, as objects are defined by their interfaces.



Interface types
 Parameter interfaces
• Data passed from one procedure to another (see the sketch after this list).
 Shared memory interfaces
• A block of memory is shared between procedures or functions.
 Procedural interfaces
• A sub-system encapsulates a set of procedures to be called by other sub-systems.
 Message passing interfaces
• Sub-systems request services from other sub-systems.
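A minimal sketch of a parameter-interface error, with hypothetical names: because the account identifiers and the amount are all ints, a call that passes them in the wrong order still compiles, and only interface testing exposes the invalid assumption.

```java
// Hypothetical transfer API with a parameter interface.
public class AccountService {

    /** Moves amountCents from account fromId to account toId. */
    static void transfer(int fromId, int toId, int amountCents) {
        System.out.printf("moving %d cents: %d -> %d%n",
                          amountCents, fromId, toId);
    }

    public static void main(String[] args) {
        int alice = 1001, bob = 1002;
        transfer(alice, bob, 5000);   // correct call
        transfer(5000, alice, bob);   // interface error: arguments in the
                                      // wrong order, yet it still compiles
    }
}
```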



Test case design
 Involves designing the test cases (inputs and
outputs) used to test the system.
 The goal of test case design is to create a
set of tests that are effective in validation
and defect testing.
 Design approaches:
• Requirements-based testing (i.e. trace test
cases to the requirements)
• Partition testing;
• Structural testing.



Partition testing
 Input data and output results often fall into different classes where all members of a class are related.
 Each of these classes is an equivalence partition or domain where the program behaves in an equivalent way for each class member.
 Test cases should be chosen from each partition.
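A minimal sketch, assuming a hypothetical validator that accepts integers from 4 to 10 inclusive: the equivalence partitions are "below 4", "4 to 10" and "above 10", and each test draws values from one partition, including its boundaries.

```java
import org.junit.Test;
import static org.junit.Assert.*;

// Hypothetical function under test: accepts values in the range 4..10.
class RangeValidator {
    static boolean accepts(int n) { return n >= 4 && n <= 10; }
}

public class PartitionTest {

    // Partition 1: values below the range all behave the same way.
    @Test public void rejectsBelowRange() {
        assertFalse(RangeValidator.accepts(3));   // boundary value
        assertFalse(RangeValidator.accepts(-7));  // partition interior
    }

    // Partition 2: values inside the range.
    @Test public void acceptsWithinRange() {
        assertTrue(RangeValidator.accepts(4));    // lower boundary
        assertTrue(RangeValidator.accepts(7));    // mid-partition
        assertTrue(RangeValidator.accepts(10));   // upper boundary
    }

    // Partition 3: values above the range.
    @Test public void rejectsAboveRange() {
        assertFalse(RangeValidator.accepts(11));  // boundary value
        assertFalse(RangeValidator.accepts(9999));
    }
}
```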



Structural testing
 Sometimes called white-box testing.
 Derivation of test cases according to program structure. Knowledge of the program is used to identify additional test cases.
 Objective is to exercise all program statements (not all path combinations).



Path testing
 The objective of path testing is to ensure that
the set of test cases is such that each path
through the program is executed at least
once.
 The starting point for path testing is a
program flow graph that shows nodes
representing program decisions and arcs
representing the flow of control.
 Statements with conditions are therefore
nodes in the flow graph.



Binary search flow graph
[Flow graph from the original slide: node 1 is the entry; the loop test "while bottom <= top" exits when "bottom > top"; inside the loop, "elemArray[mid] != key" branches to the cases "elemArray[mid] = key" (element found), "elemArray[mid] < key" and "elemArray[mid] > key", which adjust the search bounds and return to the loop test.]
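For reference, a sketch of a binary search routine of the shape this flow graph describes (the graph's node numbers are not reproduced in the code):

```java
// Binary search over a sorted array; the commented points correspond
// to the decisions shown in the flow graph above.
class BinarySearch {

    /** Returns the index of key in the sorted array, or -1 if absent. */
    static int search(int key, int[] elemArray) {
        int bottom = 0;
        int top = elemArray.length - 1;
        while (bottom <= top) {                   // loop test
            int mid = (bottom + top) / 2;
            if (elemArray[mid] == key) {          // found: success exit
                return mid;
            } else if (elemArray[mid] < key) {    // search upper half
                bottom = mid + 1;
            } else {                              // search lower half
                top = mid - 1;
            }
        }
        return -1;                                // bottom > top: not found
    }
}
```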



Independent paths
 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 14
 1, 2, 3, 4, 5, 14
 1, 2, 3, 4, 5, 6, 7, 11, 12, 5, …
 1, 2, 3, 4, 6, 7, 2, 11, 13, 5, …
 Test cases should be derived so that all of these paths are executed.
 A dynamic program analyser may be used to check that paths have been executed.



Test automation
 Testing is an expensive process phase. Testing workbenches provide a range of tools to reduce the time required and total testing costs.
 Systems such as JUnit support the automatic execution of tests (see the sketch after this list).
 Most testing workbenches are open systems because testing needs are organisation-specific.
 They are sometimes difficult to integrate with closed design and analysis workbenches.
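A minimal sketch of automated execution using the JUnit 4 runner API, reusing the test classes sketched earlier: a single driver runs every test and reports the results, the way a testing workbench automates execution.

```java
import org.junit.runner.JUnitCore;
import org.junit.runner.Result;
import org.junit.runner.notification.Failure;

// Runs the earlier test classes programmatically and reports results.
public class RunAllTests {
    public static void main(String[] args) {
        Result result = JUnitCore.runClasses(
                BoundedCounterTest.class, PartitionTest.class);
        for (Failure failure : result.getFailures()) {
            System.out.println(failure.toString());
        }
        System.out.printf("%d tests run, %d failed%n",
                result.getRunCount(), result.getFailureCount());
    }
}
```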

