QA-Project Plan Template
Revision History

| Version No | Date of revision | Description of change | Section/Pages affected | Change made by | Approved by |
| 0.8 | 11/08/06 | Initial template draft (all text in red should be removed from the QA plan) | | XYZ | XYZ |
| 0.9 | 21/12/06 | Updates | 10 | XYZ | XYZ |
Table of Contents
1. Introduction
   1.1 Purpose
   1.2 Scope of the project
2. Requirements for Test
   2.1 Features to be tested
   2.2 Features not to be tested
3. System Test Strategy
   3.1 Test Model
   3.2 Testing Types
4. Entry / Exit Criteria
5. Risks / Dependencies
6. Resources
   6.1 Team
   6.2 QA Environment(s)
   6.3 QA Test Tools
7. Project Milestones & Deliverables
   7.1 Test Plan
   7.2 Test Specifications
   7.3 Requirements Traceability Matrix
   7.4 Reports
8. Assumptions
9. Appendices
   9.1 Discoveries + QA Estimates
   9.2 Requirements and supporting documents
Appendix A: Project Tasks
1. Introduction
This section should give an explanation of the project's goals, methodologies and objectives. Include customer information, project version details and the purpose of the release. Some background information on the project may also be useful.
1.1 Purpose
The purpose of this document is to state the QA process for project ABC and detail the QA strategies. It serves as a reference document for all members of the project team when implementing and verifying all QA-related activities.
2.1.1 The following components are covered by this QA plan:
- Navigator Server 4.0
- Content Integration Manager 4.0
- Device Manager 4.0
- Menu Manager 4.0
- Update Manager 4.0
- Import Manager 4.0
- Report Manager 4.0

Test Objective: Ensure that PRODUCT 3.0 is stable enough to take on to full-scale system testing.
Technique: Execute each step of the installation as specified in the Installation Documents.
Completion Criteria: There should be no show-stoppers during the initial installations. All documents and installation notes should be in place. All bugs should be logged properly.
Special Considerations: Run a sanity test with the most common configuration (a scripted version is sketched below).
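A sanity run of this kind is straightforward to script so it can be repeated after every installation. The following is a minimal sketch in Python; the host name and per-component healthcheck URLs are hypothetical placeholders (this plan does not specify them) and must be replaced with values from the Installation Documents.

```python
import sys
import urllib.request

# Hypothetical post-install checks: host, port and healthcheck paths are
# placeholders to be replaced with values from the Installation Documents.
CHECKS = [
    ("Navigator Server", "http://qa-host:8080/navigator/healthcheck"),
    ("Device Manager", "http://qa-host:8080/devicemanager/healthcheck"),
    ("Report Manager", "http://qa-host:8080/reportmanager/healthcheck"),
]

def run_sanity_checks() -> int:
    """Hit each component's healthcheck URL; return the number of failures."""
    failures = 0
    for name, url in CHECKS:
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                code = resp.getcode()
        except Exception as exc:  # connection refused, timeout, DNS failure, ...
            print(f"FAIL {name}: {exc}")
            failures += 1
            continue
        if code == 200:
            print(f"PASS {name}")
        else:
            print(f"FAIL {name}: HTTP {code}")
            failures += 1
    return failures

if __name__ == "__main__":
    # A non-zero exit code flags a show-stopper to whoever runs the install.
    sys.exit(1 if run_sanity_checks() else 0)
```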
3.2.2 Product Testing
PRODUCT 3.0 testing focuses on any requirements that can be traced directly to the correctly derived data. The goal is to verify proper installation on the Oracle 9i platform along with the additional derivation logic.
- Verify the installation works properly.
- Verify the upgrade scripts work properly.
- All metadata is converted and populated properly.
- All objects are in place.
- All DB partitions are in place.
- All jobs are working properly.

There should be no database errors. All calculations should be validated and correct. The upgrade scripts shall be run on real but known data. Several combinations of negative and positive cases are needed to check the derivations. Part of this verification can be automated, as sketched below.
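The object and partition checks in particular can be run as queries against Oracle's data dictionary. A minimal sketch, assuming the cx_Oracle driver; the credentials, DSN and table name (CALL_DETAIL) are illustrative placeholders, not values taken from this plan.

```python
import cx_Oracle  # pip install cx_Oracle; any Oracle driver would do

# Placeholder connection details for the upgraded QA database.
conn = cx_Oracle.connect("qa_user", "qa_password", "qa-host/ORCL")
cur = conn.cursor()

# 1. No objects should be left INVALID after the upgrade scripts have run.
cur.execute(
    "SELECT object_name, object_type FROM user_objects WHERE status = 'INVALID'"
)
invalid = cur.fetchall()
assert not invalid, f"Invalid objects after upgrade: {invalid}"

# 2. Spot-check that partitions exist on a key table (CALL_DETAIL is illustrative).
cur.execute(
    "SELECT COUNT(*) FROM user_tab_partitions WHERE table_name = :t",
    t="CALL_DETAIL",
)
assert cur.fetchone()[0] > 0, "Expected partitions are missing"

print("Upgrade checks passed: no invalid objects, partitions in place")
conn.close()
```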
3.2.3
Test Objective: Mediation, MSI, TRUNK GROUP INVENTORY, LERG, Rate File Processor, Route Manager and AIN Manager are enabled and working properly.
Technique: Once the entire database is upgraded, the Mediation process will ingest data and the data will be validated through the application. Create a user that can access all the tabs.
Completion Criteria: All relevant data are displayed properly. Derivations are as per the defined expected results.
3.2.4 Performance Testing
Depending on the scope of the release and its potential impact on performance, this section may need a separate document. Identify the areas that will potentially affect performance. Outline the high-level tests that will be executed. Define the expected behaviour. Identify which applications will be involved in the performance test environment. Clarify when performance testing can begin. [Need to include references to existing performance testing documentation.]
4. Entry / Exit Criteria
The system test ends when all tests have been completed and the exit criteria for PRODUCT 3.0 testing have been met. The following items identify the exit criteria:
- All test procedures executed, with the individual test criteria successfully completed.
- All problems identified and documented in the defect-tracking tool.
- All Major and Critical problems resolved and re-tested.
- All components working properly with derivations.
- A regression performed on the complete product during end-to-end testing.
A scripted check of the defect-related criteria is sketched below.
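The defect-related exit criteria can be verified mechanically from a defect-tracker export rather than by eye. A sketch under the assumption of a CSV export with `severity` and `status` columns; the column names and status values are hypothetical and must be matched to the tool in use.

```python
import csv

# Hypothetical status/severity values; match them to the defect tracker's export.
OPEN_STATUSES = {"Open", "In Progress", "Reopened"}
BLOCKING_SEVERITIES = {"Critical", "Major"}

def blocking_defects(path):
    """Return open Major/Critical defects from a CSV export of the tracker."""
    with open(path, newline="") as f:
        return [
            row
            for row in csv.DictReader(f)
            if row["severity"] in BLOCKING_SEVERITIES
            and row["status"] in OPEN_STATUSES
        ]

defects = blocking_defects("defects_export.csv")
if defects:
    print(f"Exit criteria NOT met: {len(defects)} open Major/Critical defects")
else:
    print("Defect-related exit criteria met")
```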
5. Risks / Dependencies
Outline the potential risks that you see occurring during the project. Try to estimate the likelihood of each risk actually happening and assess its impact. A contingency plan should be put in place should the risk materialise. Samples below:

| Risk | Description | Prob | Impact | Contingency |
| 1 | Unclear requirements from client | 30% | High | Highlight the issue with the PM. |
| 2 | Availability of performance lab resources | 30% | High | 1. Book machines in advance. 2. Coordinate with the performance lab owner, Francis Cahill. 3. Run stability tests on functional test machines. 4. Carry out initial performance testing on functional test machines. |
| 3 | | 10% | High | 1. Weekly meeting should prevent this. 2. Hand-over documents to be filled in for all releases to QA. |
6. Resources
Insert details of the project team here; include assigned duties if known.
6.1 Team

| Name | Role | Responsibilities |
| AN Other | Team Lead | QA planning & co-ordination |
| AN Other | QA Engineer | Feature3, Feature4 |
| AN Other | Technical Writer | Feature1, Feature2 |
| AN Other | Software Engineer | Feature3, Feature4 |
6.2 QA Environment(s)
Add details of the QA environment to be used. Note: the test environment includes, but is not limited to, the following:
- Machines (type of machine, specification)
- Operating systems (Solaris, Linux, OS version, patch version)
- Application servers (BEA WebLogic, Tomcat, JBoss)
- Oracle database (Enterprise, version)
- Machine mounts
- JRE versions (what version is the customer using?)
- Integration with 3rd-party applications

The test environment should match the customer's environment as closely as possible. Include an architecture diagram if necessary.

| Resource | Details |
| Machine1 | Solaris 10 + outline specifications |
| Machine2 - BEA WebLogic | Solaris 10 + outline specifications. Version 8.1, will host the Device Manager. |
| Machine3 - Oracle Databases | Version 10g, OS + specification of machine. What databases will be used? Do they already have information? If so, how many profiles or profile_hits? |
7. Project Milestones
This section is completed in conjunction with the Project Plan created by the Project Manager. Sample included for reference purposes only:

| Milestone | Start Date | End Date | QA Effort |
| Pre-iteration 1: Benchmark performance testing | 11/09/06 | 18/09/06 | |
| Iteration 1: JDK v1.5 changes complete | 19/09/06 | 21/09/06 | |
| NS Performance Testing: o2 Capacity upgrade; SNMP Framework, Recommender and UM | 22/09/06 | 30/09/06 | |
| NS Performance Testing: Personalizer optimization | 22/09/06 | 30/09/06 | |
| Iteration 2: Feature 1 | 01/01/06 | 10/01/06 | 34 days |
Deliverables
Below is the list of artifacts that are process driven and should be produced during the testing lifecycle. Certain deliverables must be produced as part of test validation; you may add further deliverables that support the overall objectives and help maintain quality. This matrix should be updated routinely throughout the project development cycle in your project-specific Test Plan.

Deliverables:
- Documents: Test Plan, Test Specifications
- Test Case / Bug Write-Ups: Test Cases / Results, Test Coverage Reports
- Reports: Test results report, Test Final Report (Sign-Off)
The test specifications will define the test objectives for each of the components to be tested, as identified in the project's Test Plan. Sample entry: Test ID TC-001.
7.4 Reports
The Test Lead will be responsible for writing and disseminating the following reports to appropriate project personnel as required.
The document must contain the following metrics:
- Total test cases; number executed; number passed/failed; number yet to execute
- Number of bugs found to date; number resolved; number still open
- Breakdown of bugs by Severity / Priority matrix
- Discussion of unresolved risks
- Discussion of schedule progress (are we where we are supposed to be?)
A sketch of how these counts can be derived follows.
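To keep these numbers consistent from report to report, they can be computed from tool exports rather than tallied by hand. A sketch assuming hypothetical CSV exports: `test_cases.csv` with a `status` column (Passed/Failed/Not Run) and `defects_export.csv` with `severity` and `status` columns; none of these names come from this plan.

```python
import csv
from collections import Counter

def load(path):
    """Read a CSV export into a list of row dictionaries."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

tests = load("test_cases.csv")        # hypothetical export file names
bugs = load("defects_export.csv")

by_status = Counter(t["status"] for t in tests)
print(f"Total test cases: {len(tests)}")
print(f"Executed: {by_status['Passed'] + by_status['Failed']}")
print(f"Passed: {by_status['Passed']}, Failed: {by_status['Failed']}")
print(f"Yet to execute: {by_status['Not Run']}")

open_bugs = [b for b in bugs if b["status"] != "Resolved"]
print(f"Bugs to date: {len(bugs)}, resolved: {len(bugs) - len(open_bugs)}, "
      f"open: {len(open_bugs)}")

# Breakdown of open bugs by severity for the Severity / Priority matrix.
for severity, count in Counter(b["severity"] for b in open_bugs).most_common():
    print(f"  {severity}: {count}")
```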
7.4.3 Final Test Report
A Final Test Report will be issued by the Test Lead. It will certify the extent to which testing has actually been completed (a test case coverage report is suggested) and give an assessment of the product's readiness for release to production.
8. Assumptions
This section should contain a list of assumptions that the contents of the QA plan are based on. If assumptions change, the impact needs to be reviewed and the relevant people made aware of any consequences. Some sample assumptions:
- QA estimates are based on discoveries as of [DATE]. Updates to discoveries/features may impact estimates.
- Data from external servers will not be required for activation rules.
- The custom rules supplied will be based on HTTP headers.
- Only 3 page styles will be used:
  - Grid look and feel
  - Text-based look and feel
  - Text-based with one image
- All devices will be XHTML. All non-XHTML devices will be redirected to the current portal.
- No changes to the Report Manager parser will be required.
- Resources are not taken off the project.
9. Appendices
Any other information of use: locations of discoveries, QA estimates, requirements documents, e-mails, architecture documents, etc.