Seminar on
SOFTWARE TESTING:
A PRACTITIONER'S
APPROACH
Date: 06-April-2002
Today's Agenda
Expectations
Testing
Testing Types & Levels
Test Assets
Test Approaches
Aspects of Automation
Metrics from Testing
Test Tools
Q & A and Wrap-up
Objectives
Discuss as a group how to fine-tune testing skills
Share experiences in testing
Provide a practical viewpoint on testing
Discuss issues/problems faced during testing and possible ways to overcome them
Explore the use of metrics to reduce testing effort
Highlight the use of varied test approaches depending on the environment
Non-Objectives
Not to teach fundamentals of testing
Not to create appreciation for testing
Not to give a ready-to-use formula for all testing issues
Why Test?
Nobody is perfect
Bugs in development tools
Certain bugs easier to find in testing
Bugs found on-site can be disastrous
Post release debugging is expensive
To deliver a quality product to the customer, meeting the specifications
What is Quality?
Producer's view
Meeting requirements
Customer's view
Fitness for purpose
ISO 9000
The totality of characteristics of an entity (product or service) that bear on its ability to satisfy stated or implied needs.
Introduction to Defects
Software Quality
QUALITY is built through:
Software engineering methods
Formal reviews
Metrics
Standards & procedures
Testing
SCM & SQA
Definitions
Defect
A deviation from specification or standard
Anything that causes customer dissatisfaction
Verification
All QC activities throughout the life cycle that ensure that interim deliverables meet their input specification
Validation
The test phase of the cycle which assures that the end product meets the user's needs
Some Facts
Testing cycle break-up (effort-wise)
Testing - 20%
Debugging - 80%
Points to Ponder
80% of all defects occur in 20% of the work
Testing
Testing is the process of exercising or evaluating a system or system component, by manual or automated means, to verify that it satisfies specified requirements.
Testing
Testing is a process of executing a program with the intent of finding errors
A good test case is one that has a high probability of finding an error
A successful test case is one that detects an as-yet-undiscovered error
Testing Involves
Plan for testing
Design test conditions, cases
Develop test data
Create required test environment
Execute tests
Compare actual results with expected results
Result: Test passed or failed!
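A minimal sketch of these steps in Python; the unit under test, the cases, and the data are hypothetical stand-ins:

    # Minimal sketch of the activities above (hypothetical unit and test data).
    def add(a, b):                       # stand-in for the item under test
        return a + b

    # Designed test cases with developed test data and expected results.
    test_cases = [
        {"name": "positive numbers", "inputs": (2, 3),   "expected": 5},
        {"name": "negative numbers", "inputs": (-2, -3), "expected": -5},
    ]

    # Execute tests, then compare actual results with expected results.
    for case in test_cases:
        actual = add(*case["inputs"])
        verdict = "passed" if actual == case["expected"] else "failed"
        print(f"{case['name']}: {verdict} (expected {case['expected']}, got {actual})")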
Tester's Maturity
Beizer:
Phase 0 - No difference between testing and debugging; test only to support debugging
Phase 1 - Purpose is to show that the software works
Phase 2 - Purpose is to show that the software does not work
Purpose of Debugging
Find the error/misconception that led to the failure, and implement program changes that correct the error
Debugging
Debugging is the act of attempting to determine the cause of the symptoms of malfunctions detected by testing or by frenzied user complaints.
Types of Testing
Static Testing
Dynamic Testing
White box
Black box
Grey box
Functional
Structural
Regression
Stress
Volume
Performance
Guerrilla/Ad-hoc
Timing
Smoke test
Types of Testing (contd..)
Static Testing
Verification performed without executing the system's code
Code inspection
Reverse engineering
Dynamic Testing
Verification or validation performed by executing the system's code
Structural Test
Tests that validate the system architecture (how the system was implemented)
Regression Test
Testing after changes have been made, to ensure that no unwanted changes were introduced
Volume Test
Tests that validate the capacity of the system to handle extremely large volumes of data
Performance Test
Testing to check that the system's behaviour is as expected in terms of its performance
Timing Test
Response time related testing
Smoke Test
Surface test; quick check
Configuration Test
To test that the software is in one piece
Progressive Test
Testing of new features after regression testing of previous features
Suspicion Test
Extra testing of components under suspicion: e.g., a new programmer, new technology, a new domain
Integration Testing
Testing a collection of units to verify the interaction among them (within a sub-system)
Acceptance Testing
Testing the complete, integrated system to evaluate fitness for use (user's viewpoint)
More Definitions
Software Item
Object code, job control code, control data, or a collection of these items; e.g. .COM file, .EXE file, executable file in UNIX, shell script.
Test Item
A software item which is the object of testing.
Definitions (contd..)
Test Case Specification
Specification of initial conditions, inputs and expected result(s) for test item(s); also known as test assertions.
Test Program
Code which is used to execute the test case specification.
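As a sketch (with a hypothetical string-reversal test item), the specification can be captured as data and the test program as the code that exercises it:

    # Test case specification: initial conditions, input, expected result
    # (the test item here, a string-reversal function, is hypothetical).
    spec = {
        "initial_conditions": "none",
        "input": "abc",
        "expected_result": "cba",
    }

    def reverse(s):                      # the test item
        return s[::-1]

    # Test program: code that performs the test case specification.
    actual = reverse(spec["input"])
    assert actual == spec["expected_result"], f"got {actual}"
    print("test case passed")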
Definitions (contd..)
Test Case
Test case specification and associated test program(s).
Test Procedure
A sequence of steps that specifies how to execute a test. It can be manual, automated, or a combination of both.
Definitions (contd..)
Test
When used as a noun, it denotes:
A set of one or more test cases
A set of one or more test procedures
A set of one or more test cases and procedures
Definitions (contd..)
Regression Test Bucket
Set of tests to be executed after software is changed, to show that the software's behaviour is unchanged except insofar as required by the change to the software itself.
The test coverage matrix can be used as the input for building the regression test bucket, as in the sketch below.
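A sketch of selecting the bucket from a coverage matrix; the test names, modules, and matrix contents are hypothetical:

    # Hypothetical test coverage matrix: test name -> modules it exercises.
    coverage_matrix = {
        "test_login":   {"auth", "session"},
        "test_report":  {"report", "db"},
        "test_billing": {"billing", "db"},
    }

    changed_modules = {"db"}             # modules touched by the change

    # Regression bucket: every test that exercises a changed module.
    bucket = [name for name, mods in coverage_matrix.items()
              if mods & changed_modules]
    print("regression bucket:", bucket)  # -> ['test_report', 'test_billing']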
Definitions (contd..)
Test Plans
Test plans specify the test case specifications that will be tested for a specific level of testing. Plans also contain other information about resources, schedules, etc.
Unit Testing
Unit testing is done to show that the unit does not satisfy the functional specification, and/or that its implemented structure does not match the intended design structure
What is a Unit?
A unit is a testable piece of software that can be built and executed under the control of a test harness or driver. A unit could be a functionality, a subsystem, a function, or as defined by the customer.
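A minimal sketch of a unit under the control of a driver, using Python's standard unittest module as the harness; the unit itself is a hypothetical example:

    import unittest

    def is_leap_year(year):              # the unit under test (hypothetical)
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    class LeapYearTests(unittest.TestCase):   # harness/driver controls execution
        def test_divisible_by_4(self):
            self.assertTrue(is_leap_year(1996))
        def test_century_not_leap(self):
            self.assertFalse(is_leap_year(1900))
        def test_divisible_by_400(self):
            self.assertTrue(is_leap_year(2000))

    if __name__ == "__main__":
        unittest.main()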
ETVX
[Diagram: the ETVX task model - Entry criteria, Tasks/Activities, Verification, and Exit criteria]
Validation
Unit Test Plan (UTP) is reviewed
Exit Criteria
UTP is reviewed and approved
Validation
Design of Unit Test Cases is reviewed
Exit Criteria
Unit Test Case design is reviewed and approved
Validation
Unit Test Cases are reviewed
Exit Criteria
Unit Test Cases are reviewed and approved
Validation
Defect Report reviewed
Exit Criteria
Test items passed the unit testing as per pass/fail
criteria in the UTP
Unit testing summary details are available in the
Defect Report form for final cycle
Integration Testing
Integration testing is done to show that, even though the units were individually tested OK, the integration is incorrect or inconsistent.
Typical Integration Problems
Activities in Integration Testing
Develop Integration Test Plan
Entry Criteria
Reviewed and approved SRS document is available
A draft Design Document is available
Validation
Integration Test Plan (ITP) is reviewed
Exit Criteria
An approved Design Document is available
ITP is reviewed and approved
Validation
Design of Integration Test Cases is reviewed
Exit Criteria
Integration Test Case design is reviewed and
approved
Validation
Integration Test Cases are reviewed
Exit Criteria
Integration Test Cases are reviewed and approved
Validation
Defect Log reviewed
Exit Criteria
All defects found in the test items are logged
Test items passed the integration testing as per
pass/fail criteria in the ITP
System Testing
System testing focuses on items that cannot be attributed to a single component; it aims to uncover inconsistencies between components or in their planned interactions.
E.g., Performance, Security, Recovery
Activities in System Testing
Validation
System Test Plan (STP) is reviewed
Exit Criteria
An approved SRS is available
STP is reviewed and approved
Validation
Design of System Test Cases is reviewed
Exit Criteria
System Test Case design is reviewed and approved
Validation
System Test Cases are reviewed
Exit Criteria
System Test Cases are reviewed and approved
System Test Cases are placed under Configuration
Control
Validation
Defect Log reviewed
Exit Criteria
All defects found in the test items are logged
Test items passed the system testing as per pass/fail criteria in the STP
Acceptance Testing
Testing for implied requirements
Evaluating fitness for use
Should not find bugs that should have been found in earlier testing phases
Alpha Testing
At the developer's site, by the customer
Developer records bugs and usage problems
Controlled environment
Beta Testing
Suspicion Testing
Test Assets
Test Repository
Test Strategy
Test Plans
Test Scripts, Test Environment and Test Data
Test Results / Test Log
Defect Log
Test Plans
Ideally, test plans should be prepared as soon as the corresponding document in the development cycle is produced.
The preparation of the test plan itself validates the document in the development cycle.
Test Cases
Describe a specific functional capability or feature which needs to be validated.
Based on system specifications / the user's production environment / other available lists.
Should also describe the pass/fail criteria
Pass/Fail Criteria
Criteria to be used to determine whether the test item has passed or failed testing
e.g.,
Testing should achieve at least 85% code coverage
No critical/serious defects found in system test case execution
!! Is it a Bug or a Feature !!
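A sketch of checking such criteria mechanically; the coverage figure and defect counts are hypothetical:

    # Pass/fail criteria from above, evaluated on hypothetical data.
    code_coverage = 0.87                          # fraction of code covered
    defects = {"critical": 0, "serious": 0, "minor": 5}

    passed = (code_coverage >= 0.85
              and defects["critical"] == 0
              and defects["serious"] == 0)
    print("test item:", "passed" if passed else "failed")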
Traceability to Requirements
Check the necessary and sufficient condition for every test case:
Every requirement must be completely addressed by one or more test cases
Every test case must address one or more requirements, fully or in part
Good to build a traceability matrix (see the sketch below)
Peer reviews check completeness
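A sketch of a traceability matrix and its completeness checks; the requirement and test case IDs are hypothetical:

    # Hypothetical traceability matrix: test case -> requirements it addresses.
    traceability = {
        "TC-01": {"REQ-1"},
        "TC-02": {"REQ-1", "REQ-2"},
        "TC-03": {"REQ-3"},
    }
    requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}

    covered = set().union(*traceability.values())
    print("requirements with no test case:", requirements - covered)  # {'REQ-4'}
    print("test cases addressing no requirement:",
          [tc for tc, reqs in traceability.items() if not reqs])      # []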
Test Data
This is the data that is required to run and verify the test. Test data includes:
Initial database contents
Data to be transacted
Expected results
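A sketch of these three parts for a hypothetical funds-transfer test:

    # Test data for a hypothetical funds-transfer test.
    initial_db  = {"acct_A": 100, "acct_B": 50}                     # initial database contents
    transaction = {"from": "acct_A", "to": "acct_B", "amount": 30}  # data to be transacted
    expected    = {"acct_A": 70, "acct_B": 80}                      # expected results

    db = dict(initial_db)                # stand-in for executing the transaction
    db[transaction["from"]] -= transaction["amount"]
    db[transaction["to"]]   += transaction["amount"]
    assert db == expected, f"expected {expected}, got {db}"
    print("expected results verified:", db)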
Test Environment
Environment under which testing takes place. Typical points under environment are:
Operating system
Start state of the system
Single user / multi user
Database state
Test Script
A test script contains the step-by-step procedure, comprising the action to be taken and the verification of the results expected.
Test scripts can be manual or automated. It is easy to automate the test scripts relating to batch/report programs.
Tools are available to automate the scripting of online programs also.
One script may test one or more test conditions
Test Script
Detailed, complete specification of all aspects of a test, including initialization, data, keystrokes, etc.
In principle, a script can be executed by an idiot or a computer
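A sketch of a script as an ordered sequence of action/verification pairs; the application under test is a hypothetical stand-in:

    # A script = ordered steps, each pairing an action with a verification
    # (the application state here is a hypothetical stand-in).
    app = {"logged_in": False, "balance": 100}

    def login():
        app["logged_in"] = True
        return app["logged_in"]

    def withdraw(amount):
        app["balance"] -= amount
        return app["balance"]

    script = [
        ("log in",      login,                lambda r: r is True),
        ("withdraw 30", lambda: withdraw(30), lambda r: r == 70),
    ]

    for name, action, verify in script:
        result = action()
        print(f"step '{name}':", "ok" if verify(result) else "FAILED")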
Defect Log
Every defect that is found during testing is logged in a defect log. The defect log can be used to track and close the defect, and also to perform statistical analysis.
Statistical analysis of defects can be used to identify the root causes of the defects and help in improving the development processes.
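A sketch of a defect log with a simple statistical roll-up by root cause; the entries are hypothetical:

    from collections import Counter

    # Hypothetical defect log entries.
    defect_log = [
        {"id": 1, "severity": "critical", "root_cause": "requirements", "status": "closed"},
        {"id": 2, "severity": "minor",    "root_cause": "coding",       "status": "open"},
        {"id": 3, "severity": "serious",  "root_cause": "coding",       "status": "closed"},
    ]

    # Counts by root cause point at the process areas needing improvement.
    print(Counter(d["root_cause"] for d in defect_log))
    print("open defects:", [d["id"] for d in defect_log if d["status"] == "open"])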
Aim at...
Not absolute proof
But a convincing demonstration, with
Qualitative measures
Judgement of 'enough'
Completion Criteria
Time Runs Out
A POOR CRITERION !!!
Methods of Testing
Manual Testing
Automated Testing
Automated Testing
First cycle takes more time than manual cycle
After initial test development, test cycles take less time
Frequent and comprehensive tests possible
Each build can be tested fully - better coverage
Detect more bugs, earlier
Miscellaneous Tools
CGI TESTER (checks the output from Perl, SP, CGI, etc.)
NULLSTONE (automated compiler performance analysis tool)
QUANTIFY (software quality improvement - performance bottleneck analysis)
Purify (run-time error & memory leak detection)
Testing Techniques
Means by which test conditions / cases are identified
Types of techniques / Approaches
coverage based
domain based
process maturity based
lifecycle based
Coverage Based
Statement coverage
Branch coverage
Condition coverage
Multiple condition coverage
Full path coverage
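A sketch of the difference between the first two levels, using a hypothetical one-decision function: a single test can execute every statement, but branch coverage also demands the decision's false arm:

    # Statement vs. branch coverage on a hypothetical one-decision function.
    def shipping_fee(member):
        fee = 10
        if member:
            fee = 5
        return fee

    # shipping_fee(True) alone executes every statement (statement coverage),
    # but branch coverage also requires the implicit 'else' arm:
    assert shipping_fee(True) == 5       # decision taken
    assert shipping_fee(False) == 10     # decision not taken
    print("both branches exercised")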
Domain Based
Banking
Manufacturing
Insurance
Engineering Projects
Process control
Avionics
Maturity Levels
Want to deliver a good quality product by testing it before delivery
Somebody should be responsible for the quality of my products
I want to produce reliable software
I want to produce reliable software every time
I want to produce reliable software every time with reduced cost/effort
Lifecycle/Model-based Approach
Lifecycle / Models
Waterfall (V-model)
RAD / Iterative model / Spiral Model
OO model
Client-Server model
Internet Applications
Waterfall (V-Model)
[Diagram: the V-model pairs each development phase with a test level, with the corresponding test plan prepared as the phase completes]
Requirements -> Acceptance Test (Acceptance Test Plan)
Analysis -> System Test (STP)
HLD -> Integration Test (ITP)
LLD -> Unit Testing (UTP)
Coding (base of the V)
Testing in V-Model
Freezing of test plans during early stages may not be practical
Test plans can remain in draft until just before the actual testing is taken up
Need to have a re-look at the model itself for testing activities
e.g., some unit-level test cases may be easier to test during integration test
OO Model
Characteristics of OOAD/OOPS:
OO functions are generally smaller
interaction among components increases
base class & derived class
Testing in OO Model
Fault-Based Testing
driven by product specifications
some faults become less likely (function level)
some faults become more likely (integration)
some new faults to be considered (inheritance)
Testing in OO Model (contd.)
Scenario based Testing
driven by user need
use of Use Cases
interaction among subsystems
Client-Server Applications
Client GUI
Target environment
Distributed Database
Non-robust target environment
Non-linear performance relationships
GUI Testing - Considerations
No sequencing of fields
Cross Platform
Mouse and Keyboard interface
Event driven
Contain custom objects
Distributed data and processes
Internet Applications
Host based
WWW (non-proprietary n/s)
Multiple client platforms
URLs
Java applets
Security
Performance
Product Development Scenario
Typically two phases of development
Feature Development Phase
Product Stabilization Phase
Product Development Scenario (contd..)
Feature Development Phase
Developers do most of the testing
Testers' role is limited
Planned Testing
Tester has prior knowledge of:
what approach to take
what a complete set of tests is
how much time is allocated
Guerrilla Testing
Opportunistically seeks to find severe bugs
less planned
depends on the experience of testers
tests are usually not documented & preserved
Regression Testing
Rerun tests to see if one that used to pass now fails
Quality Goals
Functionality
Usability
Supportability
Installability
Performance
Reliability
Maintainability
Reporting Bugs
Bug reporting is part of an evolving relationship between tester and developer
A bug report should address two issues:
provide information about the state of the product (main goal)
provide information about you and the degree to which the developer can rely on you (usefulness of testers)
Bug Report
Be clear:
what is being tested
what is expected to happen
what did happen
what was incorrect about it
if possible, an explicit sequence of steps to make the bug reproducible
if the problem does not happen on the developer's machine, quickly try to understand the configuration dependence and report it
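A sketch of those fields as a structured record; every value below is a hypothetical example:

    # The fields above as a structured bug report (hypothetical values).
    bug_report = {
        "what_is_tested":     "login screen, password field",
        "expected":           "rejects passwords shorter than 8 characters",
        "actual":             "accepts a 3-character password",
        "why_incorrect":      "violates the password rule in the spec",
        "steps_to_reproduce": ["open login screen", "enter user 'demo'",
                               "enter password 'abc'", "press OK"],
        "configuration":      "client build 1.2.3 on Windows",
    }
    for field, value in bug_report.items():
        print(f"{field}: {value}")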
Verification of the Testing Process
Like any other software engineering work product, the testing process is itself open to review and verification
a repeatable, defined, measured and managed process is auditable
as part of SQA audits, testing should get audited for conformance and adequacy
Alternatives to Testing?
Formal reviews
design processes
static analysis
language checks
development environment
Metrics of Testing
GQM (Goal-Question-Metric) - example
Goal : Better time management
Question : Are schedules met?
Metric : % delay in schedule for every milestone
Question : Is estimation of effort done well?
Metric : % difference in actual and estimated effort
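A sketch of computing the first metric; the milestone names and day counts are hypothetical:

    # % delay in schedule for every milestone (hypothetical plan/actual days).
    milestones = [
        {"name": "UTP approved",         "planned_days": 10, "actual_days": 12},
        {"name": "system test complete", "planned_days": 40, "actual_days": 50},
    ]
    for m in milestones:
        delay = 100 * (m["actual_days"] - m["planned_days"]) / m["planned_days"]
        print(f"{m['name']}: {delay:.0f}% delay in schedule")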
Collect Data
Data from projects over a period of time
Data can also be gathered through
Defect Control System
Test Plans
Test Summary Reports
Personal interviews
Possible Metrics
Defect arrival rate
Defects by severity
Defect repair rate
Test effort effectiveness
Origin of defects
Cost of defect
Defect distribution over the life cycle
Feature-wise defect distribution
Effort vs. elapsed time
# of test cases vs. defects
Defect removal efficiency by severity
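As a sketch of the last metric, defect removal efficiency is commonly computed as defects found before release divided by the total found before and after; the counts below are hypothetical:

    # Defect removal efficiency by severity (hypothetical before/after counts).
    found = {"critical": (9, 1), "serious": (18, 2), "minor": (40, 10)}
    for severity, (before_release, after_release) in found.items():
        dre = 100 * before_release / (before_release + after_release)
        print(f"{severity}: {dre:.0f}% removal efficiency")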
A Question?
What can be the effect on choosing which bugs to fix when top management tracks the number of open bugs?
Database Testing
tests are only 30% complete against the plan
70% coverage achieved
bug-finding rate is good
=> continue with testing
Stress Testing
10% complete against the plan
65 bugs found, which is quite high
=> might increase effort on this
Security Testing
40% of planned tests done
50% coverage
bugs being found as expected
=> will continue as planned
GUI Testing
55% of planned tests done
80% coverage
only a few bugs found
module looks robust
=> may not do all the tests planned
Library Module
85% of tests complete
90% coverage
only 4 bugs found; no yield!
=> should have stopped testing earlier; will do so immediately
A Test Manager
Is a frequent bearer of bad news
As the keeper of data, understands trends and special occurrences in the project
Ensures that the testing team & its work are represented well to the right people
Avoids two data traps:
unjustified faith in numbers
rejecting numbers completely because they are imperfect
Case Study
For your project, identify quality goals that are measurable
Pick any two of the goals, and define a test strategy for them
Identify test metrics
Define pass/fail criteria
Testing - Summary
Reference Material
Beizer, B. Software System Testing and Quality Assurance. Van Nostrand Reinhold, 1984.
Beizer, B. Software Testing Techniques. Van Nostrand Reinhold, 1990.
Myers, G. J. The Art of Software Testing. New York: John Wiley and Sons, 1979.
Grady, R. and Caswell, D. Software Metrics: Establishing a Company-Wide Program.
OAKSYS - What we do in testing
Take up independent testing (and other V&V) assignments
Perform traceability between work products
Design test cases
Build test strategy
Develop test suite
Develop test scripts
Perform testing (automated/manual)
Q&A
OAKSYS - How we do it
Thank You
We can be contacted at IBM