Test Strategy Sample
Table of Contents
1. INTRODUCTION
1.1. PURPOSE
2. SCOPE
2.1. SCOPE ITEMS
2.2. TYPES OF TESTING
3. TEST APPROACH
3.1. TESTING PHASES AND APPROACH
3.1.1. UNIT TESTING
3.1.2. FUNCTIONAL TESTING
3.1.3. SYSTEM INTEGRATION TESTING
3.1.4. REGRESSION TESTING
3.1.5. END TO END TESTING
3.1.6. USABILITY TESTING
3.1.7. OS/BROWSER COMPATIBILITY TESTING
3.1.8. ACCESSIBILITY TESTING
3.1.9. DATA MIGRATION TESTING
3.1.10. PERFORMANCE TESTING
3.1.11. SECURITY TESTING
3.1.12. DISASTER RECOVERY / MDHA TESTING
3.1.13. CONTENT VALIDATION TESTING
3.1.14. USER ACCEPTANCE TESTING
3.1.15. CUTOVER TESTING
3.1.16. USER ASSESSMENT
3.2. TESTING TEAM MODEL IN ITERATIVE DEVELOPMENT
4. ENTRY AND EXIT CRITERIA
4.1. ENTRY AND EXIT CRITERIA FOR STORY TESTING DURING ITERATION
4.2. ENTRY AND EXIT CRITERIA FOR UAT
4.3. TEST SUSPENSION CRITERIA
4.4. TEST RESUMPTION REQUIREMENTS
5. TESTING ORGANIZATION
5.1. PROGRAM TEST MANAGEMENT
5.2. ROLES AND RESPONSIBILITIES
5.3. PROGRAM TEAMS ROLES AND RESPONSIBILITIES
6. TRAINING
6.1. ON-BOARDING SAPIENT INDIA
6.2. KNOWLEDGE TRANSFER TO PROJECT INDIA
6.3. KNOWLEDGE TRANSFER PROCESS TO PROJECT INDIA
7. TEST ENVIRONMENTS
8. TEST TOOLS
8.1. HP QUALITY CENTER
8.2. HP QUICK TEST PROFESSIONAL
8.3. HP LOADRUNNER
8.4. OTHER TOOLS
9. TESTING PROCESSES
9.1. DEFECT MANAGEMENT
9.2. TEST CASE WRITING APPROACH
9.3. TEST CASE MANAGEMENT
9.4. TRACEABILITY
10. COMMUNICATION AND ESCALATION
10.1. MEETINGS SCHEDULE
10.2. PURPOSE OF THE MEETINGS
11. TEST DELIVERABLES AND REPORTING
11.1. DELIVERABLE LIST
11.2. REPORTING METRICS
12. TEST DATA MANAGEMENT
13. TEST AUTOMATION APPROACH
14. RISKS, ASSUMPTIONS AND DEPENDENCIES
14.1. RISKS
14.2. ASSUMPTIONS
14.3. DEPENDENCIES
1. Introduction
1.1. Purpose
The approach we are following is an early, phased approach to testing in which the testing activity is embedded within the development story and starts in parallel with the development effort. No story is deemed complete until the exit criteria for the story, as defined in the Entry and Exit Criteria section below, have been met.
Completed stories will later be integrated as part of End-to-End Test stories, where testing will be focused on complete business processes.
3.1.1. Unit Testing
The purpose of Unit Testing is to validate that each distinct module of the application is implemented as per
the technical and functional specification of the story at the unit level.
Unit tests will be automated where possible using frameworks like JUnit and will be executed throughout the
development process across all iterations.
Development teams in the functional tracks will conduct unit tests as part of the development process and will publish the unit test results as part of each build release to QA.
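The xUnit pattern the document names for Java (JUnit) can be sketched briefly. The example below uses Python's standard-library unittest purely as an analogue of JUnit; the discount_price function is a hypothetical module, not part of the program under test:

```python
import unittest

def discount_price(price, percent):
    """Hypothetical unit under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class DiscountPriceTest(unittest.TestCase):
    """Unit tests validating the module against its specification."""

    def test_typical_discount(self):
        self.assertEqual(discount_price(200.0, 10), 180.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(discount_price(99.99, 0), 99.99)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            discount_price(100.0, 150)

if __name__ == "__main__":
    # exit=False keeps the runner from terminating the interpreter,
    # so the suite can also be run as part of a larger build step.
    unittest.main(argv=["discount-tests"], exit=False)
```

Tests like these are cheap to run on every build, which is what makes the per-iteration execution described above practical.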
3.1.2. Functional Testing
The purpose of Functional Testing is to validate that the functionality implemented as part of the story meets
the requirements.
3.1.4. Regression Testing
The purpose of regression testing is to validate that code or configuration changes have not broken functionality that was previously working.
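The core of regression testing is re-running established checks against a known-good baseline after every change. A minimal sketch of that idea, with a made-up function and baseline (neither is part of the program under test):

```python
def normalize_username(raw):
    """Hypothetical code under change: canonicalize a username."""
    return raw.strip().lower()

# Baseline captured while the functionality was known to be working.
BASELINE = {
    "  Alice ": "alice",
    "BOB": "bob",
    "carol": "carol",
}

def run_regression(fn, baseline):
    """Return the inputs whose current output no longer matches the baseline."""
    return [
        inp for inp, expected in baseline.items()
        if fn(inp) != expected
    ]

regressions = run_regression(normalize_username, BASELINE)
print("regressions:", regressions)  # an empty list means nothing broke
```

If a later change dropped the lower-casing step, the same suite would immediately report the inputs whose behaviour regressed.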
3.1.6. Usability Testing
Usability Testing is a qualitative methodology used to assess a solution with Project end users. The purpose is to assess the usefulness and usability of the solution, that is, how well the solution enables users to accomplish their goals and meet their needs, and how easy it is to use.
The Information Architect (IA) supports the overall track by serving as the dedicated note-taker for the sessions, generating ideas for future-state recommendations and, depending on the project, producing wireframes depicting those recommendations. The IA will be allocated to execution, to shaping the protocol, and to providing input into the analysis and recommendations.
3.1.7. OS/Browser Compatibility Testing
Separate stories will be created for OS/browser testing, and the test team within the Functional tracks will perform these tests for the web pages developed over the various iterations.
3.1.8. Accessibility Testing
The purpose is to validate that the web pages conform to the W3C Web Content Accessibility Guidelines (WCAG) at the AAA level. The test team within the Functional tracks will perform these tests for the web pages developed over the various iterations.
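A full AAA audit requires dedicated tooling and manual review, but single criteria can be checked mechanically. As an illustration only, the toy checker below flags images without a text alternative (WCAG success criterion 1.1.1) using Python's standard-library HTML parser; the sample page is invented:

```python
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Toy accessibility check: count <img> tags lacking non-empty alt text.
    This covers only one WCAG success criterion; it is not a real audit."""

    def __init__(self):
        super().__init__()
        self.violations = 0

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            alt = dict(attrs).get("alt")
            if not alt or not alt.strip():
                self.violations += 1

def count_missing_alt(html):
    checker = MissingAltChecker()
    checker.feed(html)
    return checker.violations

page = '<p><img src="logo.png" alt="Company logo"><img src="spacer.gif"></p>'
print(count_missing_alt(page))  # the spacer image has no alt text
```

Checks of this kind can run on every build, leaving the manual effort for the criteria that genuinely need human judgement.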
3.1.16. User Assessment
The Organization Change vertical will be responsible for conducting User Assessment tests.
4.1. Entry and Exit Criteria for Story Testing during Iteration
7. Test Environments
This section covers the list of environments and their usage. The following pictures reflect the environments and the code propagation flow:
8. Test Tools
This section covers the list of tools we will use during the testing process.
8.3. HP LoadRunner
HP LoadRunner will be used to run the performance tests.
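At its core, a load-test tool such as LoadRunner drives many concurrent virtual users through a business transaction and records response times. The sketch below is not LoadRunner; it is a minimal stand-in using a Python thread pool, with a stub transaction in place of a real HTTP request:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stub standing in for one business transaction; a real performance
    test would issue an HTTP request here instead of sleeping."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server processing time
    return time.perf_counter() - start

def run_load(virtual_users, iterations):
    """Drive the transaction concurrently and collect response times."""
    with ThreadPoolExecutor(max_workers=virtual_users) as pool:
        timings = list(pool.map(lambda _: transaction(), range(iterations)))
    return {
        "samples": len(timings),
        "avg_s": sum(timings) / len(timings),
        "max_s": max(timings),
    }

report = run_load(virtual_users=5, iterations=20)
print(report)
```

The averages and maxima collected this way are the raw material for the response-time targets a performance test verifies against.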
9.1. Defect Management
The defects logged during the testing process will be discussed in defect triage meetings. A daily defect triage meeting will be held for each of the Functional tracks. The participants will include the Functional Track Lead, the Testing Lead, the Architect, and optionally a Tester and a Developer.
During these meetings the group will review newly logged defects and the status of outstanding show-stopper defects.
Here are the severity definitions for defects that will be followed on the program:

Severity            Description
0 – Show Stopper    There is no workaround, and the project will be halted from making further progress until the defect is resolved.
1 – Severe          The defect prevents further use of this work product until resolved.
2 – Major           The defect will prevent further use of a portion of this work product until it is resolved.
3 – Significant     The defect should be addressed, but work on future work products may proceed according to plan.
4 – Minor           The defect has little or no impact on current or future work products.
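Encoding the severity scale as data makes it usable by triage tooling, for example a script that post-processes a Quality Center defect export. The sketch below paraphrases the table above; the blocks_progress interpretation (severity 0-2 blocks some or all work) is a simplification for illustration:

```python
# Severity scale paraphrased from the triage table, encoded as data.
SEVERITIES = {
    0: ("Show Stopper", "No workaround; the project halts until resolved."),
    1: ("Severe", "Prevents further use of this work product until resolved."),
    2: ("Major", "Prevents further use of a portion of this work product."),
    3: ("Significant", "Should be addressed, but planned work may proceed."),
    4: ("Minor", "Little or no impact on current or future work products."),
}

def blocks_progress(severity):
    """A defect at severity 0-2 blocks some or all work on the product."""
    return severity <= 2

print(SEVERITIES[0][0], "blocks:", blocks_progress(0))
print(SEVERITIES[4][0], "blocks:", blocks_progress(4))
```

A triage report generator could use such a table to sort the daily meeting agenda so blocking defects are always discussed first.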
9.2. Test Case Writing Approach
This section covers the approach we will take for writing test cases. The test team will follow these steps:
• Requirements Understanding
• Traceability: Test cases will have traceability to the requirements.
• Test Case Reviews: Test cases will be reviewed and approved by Project business users; this will be one of the entry criteria for End-to-End test execution.
9.4. Traceability
The following picture reflects how traceability will work between Requirements, Test Cases and Test Execution.
Vertical -> Functional Track -> Functionality -> Use Case -> Requirements
In the Test Plan module we will break down Functionality into Use Case(s). Each of the Dev stories consisting of requirements will be listed here. All the associated test cases for a Dev story will be listed within the Dev Story folder. These test cases will be mapped to Requirements and Use Cases in the Requirements module. The folder structure in the Test Plan module of QC will be as follows:
In the Test Lab module we will create a folder for the ADR Release no., which will have sub-folders for Iterations. Each Iteration will in turn have a Test Set representing the Dev Story. All the test cases for the scope delivered will be added to these test sets. The folder structure will be as follows:
ADR -> Iteration -> Functional Track -> Story -> Test Case
10. Communication and Escalation
14.1. Risks
14.2. Assumptions
14.3. Dependencies