Software Test Strategy Document Example
Revision History

Date                 Author    Description of revisions    Version #
December 30, 2010              Initial Draft               1.0
                                                           1.1
Table of Contents
REVISION HISTORY
1. INTRODUCTION
1.1 PURPOSE
1.2 SOFTWARE FUNCTIONAL OVERVIEW
1.3 CRITICAL SUCCESS FACTOR
1.4 SOFTWARE TESTING SCOPE (TBD)
    Inclusions
    Exclusions
1.5 SOFTWARE TEST COMPLETION CRITERIA
2. TIMEFRAME
3. RESOURCES
3.1 SOFTWARE TESTING TEAM
3.2 HARDWARE REQUIREMENTS
3.3 SOFTWARE REQUIREMENTS
5. APPLICATION TESTING RISKS PROFILE
6. SOFTWARE TEST APPROACH
6.1 STRATEGIES
6.2 GENERAL TEST OBJECTIVES
6.3 APPLICATION FUNCTIONALITY
6.4 APPLICATION INTERFACES
6.5 SOFTWARE TESTING TYPES
6.5.1 Stability
6.5.2 System
6.5.3 Software Regression Testing
6.5.4 Installation
6.5.5 Recovery
6.5.6 Configuration
6.5.7 Security
7. BUSINESS AREAS FOR SYSTEM TEST
8. SOFTWARE TEST PREPARATION
8.1 SOFTWARE TEST CASE DEVELOPMENT
8.2 TEST DATA SETUP
8.3 TEST ENVIRONMENT
8.3.1 Database Restoration Strategies
9. SOFTWARE TEST EXECUTION
9.1 SOFTWARE TESTING EXECUTION PLANNING
9.2 SOFTWARE TEST EXECUTION DOCUMENTATION
9.3 PROBLEM REPORTING
10. STATUS REPORTING
10.1 SOFTWARE TEST EXECUTION PROCESS
10.2 PROBLEM STATUS
11. HANDOVER FOR USER ACCEPTANCE TEST TEAM
12. DELIVERABLES
13. APPROVALS
14. APPENDIXES
14.1 APPENDIX A (BUSINESS PROCESS RISK ASSESSMENT)
14.2 APPENDIX B (SOFTWARE TEST DATA SETUP)
14.3 APPENDIX C (SOFTWARE TEST CASE TEMPLATE)
14.4 APPENDIX D (PROBLEM TRACKING PROCESS)
1. Introduction
1.1 Purpose
This document describes the Software Test Strategy for the 'PRODUCT' application and is intended to support the following objectives:
- Identify the existing project information and the software components that should be tested
- Identify the types of software testing to be performed
- Recommend and describe the software testing strategy to be employed
- Identify the required resources and provide an estimate of the test effort
- List the deliverables of the test project
1.2 Software Functional Overview
With the implementation of the 'PRODUCT' system, the user community will be able to manage sales contacts, turn sales contacts into sales opportunities, assign sales opportunities to sales team members, generate reports, forecast sales, etc.
1.4 Software Testing Scope (TBD)
The following lists the specific items that are included in or excluded from the testing scope.
Inclusions
- Opportunity Contact
- Opportunities
- Opportunity Journal
- Opportunity Comments
- Sales Setup
Exclusions
- Outlook 2000 or other MS functionality
- Software testing under unsupported hardware/software configurations
1.5 Software Test Completion Criteria

Criterion: Signoff of test cases
Description: All test cases defined for the release have been reviewed by the appropriate stakeholders and signed off.

Criterion: Execution of the tests
Description: All test transactions have been executed successfully at least once.

Criterion: Closure of outstanding problems
Description: All problems found during the testing process have been reviewed and either closed or deferred by management agreement.
2. Timeframe
3. Resources
3.1 Software Testing Team

Name    Position        Start Date    End Date    Days of Effort
        Test
        Tech Support
        Sales
        DBA

3.2 Hardware Requirements
A separate database server ("box") should not be needed; the space allocated for the Test Environment, PV, and backups should be sufficient.
5. Application Testing Risks Profile

Impact Ranking

1. High
Examples: High impact related to the loss of a client
Example:

2. Medium
Typically up to $ millions (or thousands) per month (or per year)
Examples: Major inconvenience to the customer
Example:

3. Low
Typically up to $ millions (or thousands) per month (or per year)
Examples: Minor inconvenience or no visible impact to the client
Example:

Likelihood Ranking

Low       Feature set used by a particular company
Medium    Used by a particular user group
High      Core functionality used by all user groups
6. Test Approach
6.1 Strategies
Several strategies are employed in this plan in order to manage risk and get maximum value from the time available for test preparation and execution.
6.2 General Test Objectives:
- To find any bugs that were not found during the unit and integration testing performed by the development team
- To ensure that all requirements have been met
6.5 Software Testing Types
6.5.1 Stability
6.5.2 System
Test Objective:
Ensure proper application navigation, data entry, processing, and
retrieval.
Technique:
Execute each use case, use case flow, or function, using valid and invalid data, to verify the following (a short automation sketch follows the list):
- The expected results occur when valid data is used.
- The appropriate error / warning messages are displayed when invalid data is used.
- Each business rule is properly applied.
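A minimal sketch of how the valid / invalid data technique above could be automated in Python with pytest. The function create_opportunity is a hypothetical stand-in for a real 'PRODUCT' transaction, not part of the actual application; replace it with calls into the real system.

import pytest

def create_opportunity(contact_name: str, amount: float) -> dict:
    # Hypothetical stand-in for a 'PRODUCT' transaction.
    if not contact_name:
        raise ValueError("Contact name is required")
    if amount <= 0:
        raise ValueError("Amount must be positive")
    return {"contact": contact_name, "amount": amount, "status": "open"}

def test_valid_data_gives_expected_result():
    # Expected results occur when valid data is used.
    result = create_opportunity("Acme Corp", 1500.0)
    assert result["status"] == "open"

@pytest.mark.parametrize("contact, amount", [("", 1500.0), ("Acme Corp", -5.0)])
def test_invalid_data_raises_error(contact, amount):
    # An appropriate error is raised when invalid data is used.
    with pytest.raises(ValueError):
        create_opportunity(contact, amount)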
Completion Criteria:
- All planned tests have been executed.
- All identified defects have been addressed.
Special Considerations: (TBD)
[Identify / describe those items or issues (internal or external) that
impact the
implementation and execution of System test]
6.5.3 Software Regression Testing
Test Objective:
Verify that the reported problems were fixed properly and that no additional problems were introduced by the fix.
Technique:
Manually repeat, or develop automated scripts to repeat, the tests where the problems were originally discovered.
Run a few tests to verify the surrounding functionality.
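One possible way to automate the repeat-the-failing-tests technique above is to tag each automated test with the problem log number it covers and select only those tests for the regression run. A sketch, assuming pytest; the marker name, log number, and save_journal_entry function are illustrative, not part of 'PRODUCT'.

import pytest

def save_journal_entry(text: str) -> bool:
    # Hypothetical stand-in for the fixed 'PRODUCT' transaction.
    return bool(text)

@pytest.mark.problem_log  # illustrative marker; register it in pytest.ini
def test_opportunity_journal_saves_after_fix():
    # Repeats the test where the problem was originally discovered.
    assert save_journal_entry("follow-up call") is True

The tagged subset can then be run with: pytest -m problem_log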
Completion Criteria:
'PRODUCT' transactions execute successfully without failure.
Special Considerations:
To what extent should the surrounding functionality be verified?
6.5.4 Installation
Installation testing has two purposes. The first is to ensure that the software can be installed under all possible configurations, such as a new installation, an upgrade, and a complete or custom installation, and under normal and abnormal conditions. Abnormal conditions include insufficient disk space, lack of privilege to create directories, etc. The second purpose is to verify that, once installed, the software operates correctly. This usually means running a number of tests that were developed for Function testing.
Test Objective:
Verify and validate that the 'PRODUCT' client software installs properly onto each client under the following conditions:
- New installation: a machine on which 'PRODUCT' has never been installed
- Update: a machine with the same version of 'PRODUCT' already installed
- Update: a machine with an older version of 'PRODUCT' already installed
Technique:
Manually, or with automated scripts, validate the condition of the target machine (new: 'PRODUCT' never installed; or 'PRODUCT' same version or an older version already installed).
Launch / perform the installation.
Using a predetermined subset of the Integration or System test scripts, run the transactions.
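A sketch of the pre-installation check described in the technique above: classify the target machine before launching the installer. The version-file location is an assumption made for illustration; the real 'PRODUCT' installer may record its version elsewhere (e.g., in the Windows registry).

from pathlib import Path

INSTALL_VERSION = "1.1"  # version being installed (illustrative)
VERSION_FILE = Path("C:/Program Files/PRODUCT/version.txt")  # assumed location

def classify_target() -> str:
    # Determine the condition of the target machine.
    if not VERSION_FILE.exists():
        return "new installation"
    installed = VERSION_FILE.read_text().strip()
    if installed == INSTALL_VERSION:
        return "update, same version"
    return "update, older version"

if __name__ == "__main__":
    print("Target machine condition:", classify_target())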
Completion Criteria:
'PRODUCT' transactions execute successfully without failure.
Special Considerations:
Which 'PRODUCT' transactions should be selected to comprise a confidence test that the 'PRODUCT' application has been successfully installed and that no major software components are missing?
6.5.5 Recovery
Failover testing ensures that, for those systems that must be kept
running, when a failover
condition occurs, the alternate or backup systems properly "take over"
for the failed system
without loss of data or transactions.
Test Objective:
Verify that recovery processes (manual or automated) properly restore the database, applications, and system to a desired, known state. The following types of conditions are to be included in the testing:
- Power interruption to the client
- Power interruption to the server
- Communication interruption via network server(s)
- Interruption, communication, or power loss to the DASD and/or DASD controller(s)
- Incomplete cycles (data filter processes interrupted, data synchronization processes interrupted)
- Invalid database pointers / keys
- Invalid / corrupted data elements in the database
Technique:
Tests created for System testing should be used to create a series of
transactions.
Once the desired starting test point is reached, the following actions
should be performed
(or simulated) individually:
- Power interruption to the client: power the PC down.
- Power interruption to the server: simulate or initiate power-down procedures for the server.
- Interruption via network servers: simulate or initiate communication loss with the network (physically disconnect communication wires or power down network server(s) / routers).
- Interruption, communication, or power loss to the DASD and/or DASD controller(s): simulate or physically eliminate communication with one or more DASD controllers or devices.
Once the above conditions / simulated conditions are achieved, additional transactions should be executed; upon reaching this second test point state, recovery procedures should be invoked.
Completion Criteria:
In all cases above, the application, database, and system should, upon completion of the recovery procedures, return to a known, desirable state. This state includes data corruption limited to the known corrupted fields and pointers / keys, and reports indicating the processes or transactions that were not completed due to the interruptions.
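One way to make the "known, desirable state" check above concrete is to snapshot per-table row counts before the simulated interruption and compare them after the recovery procedures complete. A sketch; sqlite3 is used here only as a portable stand-in for the 'PRODUCT' database, and the table names are illustrative.

import sqlite3

TABLES = ["contacts", "opportunities", "journal"]  # illustrative table names

def snapshot(conn: sqlite3.Connection) -> dict:
    # Record the row count of each table of interest.
    return {t: conn.execute(f"SELECT COUNT(*) FROM {t}").fetchone()[0]
            for t in TABLES}

def verify_recovery(before: dict, after: dict) -> list:
    # Return the tables whose row counts changed across the recovery;
    # any difference should be explained by the interruption report.
    return [t for t in TABLES if before[t] != after[t]]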
Special Considerations:
Recovery testing is highly intrusive. Procedures to disconnect cabling (simulating power or communication loss) may not be desirable or feasible. Alternative methods, such as diagnostic software tools, may be required. Resources from the Systems (or Computer Operations), Database, and Networking groups are required.
These tests should be run after hours or on isolated machine(s); this may call for a separate test server.
6.5.6 Configuration
Test Objective:
Validate and verify that the 'PRODUCT' client application functions properly on the prescribed client workstations.
Technique:
- Use the Software Integration and System Test scripts.
- Open / close various Microsoft applications, such as Excel and Word, either as part of the test or prior to the start of the test.
- Execute selected transactions to simulate user activities into and out of 'PRODUCT' and the Microsoft applications.
- Repeat the above process, minimizing the available conventional memory on the client.
Completion Criteria:
For each combination of 'PRODUCT' and Microsoft application, 'PRODUCT'
transactions are
successfully completed without failure.
Special Considerations:
Which Microsoft applications are available and accessible on the clients? Which applications are typically used? What data are the applications running (e.g., a large spreadsheet opened in Excel, a 100-page document in Word)?
The entire environment (systems, netware, network servers, databases, etc.) should also be documented as part of this test.
6.5.7 Security
Security and Access Control Testing focuses on two key areas of security:
- Application security, including access to the Data or Business Functions, and
- System security, including logging into / remote access to the system.
System security testing ensures that only those users granted access to the system are capable of accessing the applications, and only through the appropriate gateways.
Test Objective:
Function / Data Security: Verify that users can access only those functions / data for which their user type has been granted permissions.
System Security: Verify that only those users with access to the system and application(s) are permitted to access them.
Technique:
Function / Data Security: Identify and list each user type and the functions / data each type has permissions for.
Create tests for each user type, and verify each permission by creating transactions specific to each user type.
Modify the user type and re-run the tests for the same users. In each case, verify that the additional functions / data are correctly available or denied.
System Access (see special considerations below)
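The function / data security technique above amounts to checking a permission matrix: every user type against every function, expecting access exactly where permission was granted. A sketch; the user types, function names, and check_access call are illustrative placeholders for the real 'PRODUCT' access-control interface.

PERMISSIONS = {
    "sales_rep":     {"view_contacts", "edit_opportunities"},
    "sales_manager": {"view_contacts", "edit_opportunities", "assign_opportunities"},
    "report_user":   {"view_contacts", "run_reports"},
}
ALL_FUNCTIONS = set().union(*PERMISSIONS.values())

def check_access(user_type: str, function: str) -> bool:
    # Hypothetical stand-in for the 'PRODUCT' access-control call.
    return function in PERMISSIONS[user_type]

def test_permission_matrix():
    # Each user type must be granted exactly its permitted functions
    # and denied all others.
    for user_type, allowed in PERMISSIONS.items():
        for function in ALL_FUNCTIONS:
            assert check_access(user_type, function) == (function in allowed)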
Completion Criteria:
For each known user type, the appropriate functions / data are available, and all transactions function as expected and as they ran in prior System tests.
Special Considerations:
Access to the system must be reviewed / discussed with the appropriate network or systems administrator. This testing may not be required, as it may be a function of network or systems administration. Remote access control requires special consideration.
7. Business Areas for System Test
For testing purposes, the system will be divided into the following areas:
1. Sales Setup
2. Creating Databases
3. Getting Started - User
4. Managing Contacts and Opportunities
5. Managing the database
6. Reporting
7. Views
8. Features
9. User Tips
8. Test Preparation
8.1 Software Test Case Development
Rather than developing detailed test cases to verify the appearance and mechanisms of the GUI during Unit testing, we will develop a standard checklist to be used by the developers.
If the timeframe does not permit the development of detailed test scripts, with precise inputs and outputs, for an upcoming version of 'PRODUCT', then test cases and checklists will be developed to a level of detail that allows a tester to understand the objectives of each test.
8.2 Test Data Setup
The test data setup and test data dependencies are described in Appendix B.
(To be confirmed with the DBA.) Test data setup may not be an issue; however, the data dependencies (what data, and from where) should be identified.
8.3 Test Environment
8.3.1 Database Restoration Strategies
The database will be backed up daily. Backups are to be kept for two weeks, so it should be possible to drop back to a clean version of the database if a database corruption problem occurs during testing. (This will be more work if the database definition has changed in the interim. If the database is moved from MS Access to SQL Server, a data conversion may need to be run if the test database contains a lot of data.)
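A sketch of the daily backup / drop-back procedure described above, assuming the database has been moved to SQL Server and the sqlcmd utility is available on the test server. The server name, database name, and backup directory are placeholders.

import datetime
import subprocess

SERVER = "TESTSRV01"        # placeholder server name
DATABASE = "PRODUCT_TEST"   # placeholder test database name
BACKUP_DIR = r"E:\backups"  # placeholder backup location

def backup_database() -> None:
    # Take the daily backup; backup files are kept for two weeks.
    stamp = datetime.date.today().isoformat()
    sql = (f"BACKUP DATABASE [{DATABASE}] "
           f"TO DISK = '{BACKUP_DIR}\\{DATABASE}_{stamp}.bak'")
    subprocess.run(["sqlcmd", "-S", SERVER, "-Q", sql], check=True)

def restore_database(backup_file: str) -> None:
    # Drop back to a clean version after a database corruption problem.
    sql = (f"RESTORE DATABASE [{DATABASE}] "
           f"FROM DISK = '{backup_file}' WITH REPLACE")
    subprocess.run(["sqlcmd", "-S", SERVER, "-Q", sql], check=True)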
9. Test Execution
Testers will check off each successful step on the test sheets with the execution date, then sign and date the completed sheets. Any printouts used to verify results will be annotated with the step number and attached. This documentation will be retained for inclusion in the handover package for the UAT team at the end of the testing cycle.
For test steps that find problems, testers will note the test step
number in the problem logs,
and also annotate the test sheet with the problem log numbers. Once the
problem has been fixed
and successfully retested, the tester will update the problem log to
reflect this.
The test case template is described in Appendix C.
10. Status Reporting
The following metrics, in the form of graphs and reports, will be used to provide the required information on problem status:
- Weekly problem detection rates
- Weekly problem detection rates, by week (diagram)
- Ratio of Priority 1-2 problems to total problems discovered
- Re-opened / fixed problem ratio
(TBD)
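The metrics above can be computed directly from a problem-log export. A minimal sketch; the records and field names are illustrative, not data from the actual project.

from collections import Counter

# Illustrative problem-log records.
problems = [
    {"week": "2010-W52", "priority": 1, "status": "Fixed"},
    {"week": "2010-W52", "priority": 3, "status": "Re-Open"},
    {"week": "2011-W01", "priority": 2, "status": "Fixed"},
]

# Weekly problem detection rates.
weekly_rates = Counter(p["week"] for p in problems)

# Priority 1-2 problems vs. total problems discovered.
priority_ratio = sum(p["priority"] <= 2 for p in problems) / len(problems)

# Re-opened / fixed problem ratio.
fixed = sum(p["status"] == "Fixed" for p in problems)
reopened = sum(p["status"] == "Re-Open" for p in problems)
reopen_ratio = reopened / fixed if fixed else 0.0

print(weekly_rates, priority_ratio, reopen_ratio)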
11. Handover for User Acceptance Test Team
Upon System test completion, the Test Lead will hand over the tested system and all accompanying test documentation to the (stakeholder). Specific handover criteria will be developed and agreed upon.
12. Deliverables
The following documents, tools, and reports will be created during the testing process:

Deliverable          By Whom    To Whom    When
1. Test Strategy
2. Test Plan
3. Test Results
13. Approvals
The Test Strategy document must be approved by the appropriate stakeholders.

Title    Name    Signature    Date
1.
2.
3.
14. Appendixes
14.2 Appendix B (Software Test Data Setup)

##    Function    Data Required    Data Source
1
14.3 Appendix C (Software Test Case Template)

Business Area              01.
Process Name               01.01
Test Case                  01.01.01
Test Case Prerequisites
Tester         Sign off         Date         Version

Step    Action    Date    Results    Expected Results    Pass/Log#    Retest
.01
1.1
14.4 Appendix D (Problem Tracking Process)
This appendix describes the bug tracking process for the 'PRODUCT' program.
All problems found during testing will be logged in the XXX bug tracking system, using a single database for all participating systems. Everyone who will be testing, fixing bugs, communicating with clients, or managing teams doing either activity will be given "write" access to the database.
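A sketch of the fields a logged problem would carry, matching the tracking table below. The field names and the "PL-0042" log-number format are illustrative; the actual XXX bug tracking system defines its own schema.

from dataclasses import dataclass

@dataclass
class ProblemLogEntry:
    log_number: str            # e.g. "PL-0042" (illustrative format)
    originator: str            # Tester, Technical Support, or Sales
    owner: str                 # current problem owner, e.g. the Development Leader
    status: str                # e.g. "Pending"
    problem_type: str          # e.g. "Bug"
    recommended_solution: str = ""

entry = ProblemLogEntry("PL-0042", "Tester", "Development Leader",
                        "Pending", "Bug", "Defer to the next release")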
Step    Process    Responsible Stakeholder    Action    Bug Status    Problem Type

Step 1: Log Problem
Responsible stakeholder: Problem Originator (Tester, Technical Support, or Sales)

Step 2.1: If this is a bug, but it will not be corrected at this time due to a low Priority/Severity rating or time or resource limitations:
- Escalate for decision/agreement
- Set the problem type as appropriate
- Annotate the log with the recommended solution
- Set the status to Pending
The Development Leader remains the problem owner until the problem is re-assigned for resolution, corrected, sent to training, or closed by management decision with a Pending status assigned.
Bug Status: Pending    Problem Type: Bug

Step 2.3:
Step 2.5:
Step 7.1: