QA-Project Plan Template

This document provides a quality assurance (QA) plan for a project. It outlines the purpose, scope, requirements, test strategy, entry/exit criteria, risks, dependencies, and resources for QA testing. Key elements include testing features that are in and out of scope, a system test approach including sanity, product, and user interface testing, and criteria for starting and completing each testing phase. The team, environment, tools, milestones, and assumptions are also defined. The goal is to serve as a reference for implementing and verifying all QA activities on the project.


Project name here QA Plan

Project name QA Project Plan

Version: 0.9 Issue Date: 18/04/2012

Revision History
Version No | Date of revision | Description of change | Section/Pages affected | Change made by | Approved by
0.8 | 11/08/06 | Initial template draft (all text in red should be removed from the QA plan) | All | XYZ | XYZ
0.9 | 21/12/06 | Updates | 10 | XYZ | XYZ


Person/Stakeholder responsible for sign-off and approval

Name | Date | Version No


Table of Contents
1. Introduction
   1.1 Purpose
   1.2 Scope of the project
2. Requirements for Test
   2.1 Features to be tested
   2.2 Features not to be tested
3. System Test Strategy
   3.1 Test Model
   3.2 Testing Types
4. Entry/Exit Criteria
5. Risks/Dependencies
6. Resources
   6.1 Team
   6.2 QA Environment(s)
   6.3 QA Test Tools
7. Project Milestones
   Deliverables
   7.1 Test Plan
   7.2 Test Specifications
   7.3 Requirements Traceability Matrix
   7.4 Reports
8. Assumptions
9. Appendices
   9.1 Discoveries + QA Estimates
   9.2 Requirements and supporting documents
Appendix A: Project Tasks


1. Introduction
This section should give an explanation of goals, methodologies and objectives. Include customer information, project version details and purpose of release. Some background information on the project may also be useful.

1.1 Purpose
The purpose of this document is to state the QA process for project ABC and to detail the QA strategies. It serves as a reference document for all members of the project team in implementing and verifying all QA-related activities.

1.2 Scope of the project


QA includes testing the functionality of the integrated system against the Functional Requirements. The entire system is tested to ensure that all software requirements are met. Testing can include Regression Testing, Performance Testing, User Interface Testing, Load Testing, Volume Testing, Stress Testing and Security Testing. Tools may be used by the project to simulate a test environment as close to real-world scenarios as possible.

2. Requirements for Test


This section should list in detail the components (products and features) that will be covered as part of this release. It is split into what is included (in scope) and what is not included (out of scope).

2.1 Features to be tested


- List all discoveries being released as part of this project.
- List all areas of functionality that will be affected as part of this project.
- If performance/stability testing is required, mention it here.
- If ATS needs to be modified, mention it here.
- If the API client needs to be updated or modified, mention it here.
- Include information on existing bug fixes that will be addressed as part of this release.

The contents of this section can be broken down into sub-sections if required; see sections 2.1.1 and 2.1.2 for examples. For the discovery and QA estimate list, see Discoveries + QA Estimates.

2.1.1

Products for release


List specific products that will be included as part of the release, e.g.

Navigator Server 4.0
Content Integration Manager 4.0
Device Manager 4.0
Menu Manager 4.0
Update Manager 4.0
Import Manager 4.0
Report Manager 4.0

2.2 Features not to be tested


Include information on any specific pieces of functionality that are not covered in this release; this may include functionality initially planned but deferred to a later date. Other excluded features/products/tests should also be mentioned here. Some other examples may be:
- Bugs that are not going to be addressed
- Testing against various application servers (e.g. Tomcat, WebLogic, JBoss)
- Testing against different JRE versions

3. System Test Strategy


This section defines the overall QA approach or strategy. It should cover areas such as priorities, number of test passes and test cycles.

3.1 Test Model


The testing will be executed on sample data provided by the Development Team. Testers need to check and validate the derivation logic.

3.2 Testing Types


3.2.1 Sanity Testing
Sanity testing is an initial testing effort to determine whether the newly upgraded PRODUCT 3.0 is performing well enough to be accepted for the full-scale testing effort. This type of testing will be executed with each new release or major enhancement.

Test Objective: Ensure PRODUCT 3.0 is stable enough to take on to full-scale system testing.
Technique: Execute each step of the installation as specified in the Installation Documents.
Completion Criteria: There should not be any show-stoppers during initial installation. All documents and installation notes should be in place. All bugs should be logged properly.
Special Considerations: Run the sanity test with the most common configuration.

3.2.2 Product Testing
PRODUCT 3.0 testing focuses on any requirements that can be traced directly to the correct derived data. The goal is to verify proper installation on the Oracle 9i platform along with the additional derivation logic.

Test Objective: Verify the installation works properly. Verify the upgrade scripts work properly.
Technique: All meta-data are converted and populated properly. All objects are in place. All DB partitions are in place. All jobs are working properly.
Completion Criteria: There should not be any database errors. All calculations should be validated and correct.
Special Considerations: The upgrade scripts shall be run on real but known data. Several combinations of negative and positive cases are needed for derivation checking.

3.2.3 User Interface Testing

Through User Interface testing we verify that data is displayed in the proper places with the proper values.

Test Objective: Mediation, MSI, TRUNK GROUP INVENTORY, LERG, Rate File Processor, Route Manager and AIN Manager are enabled and working properly.
Technique: Once the entire database is upgraded, the Mediation Process will ingest data, and the data will be validated through the Application.
Completion Criteria: All relevant data are displayed properly. Derivations are as per the defined expected results.
Special Considerations: Create a user that can access all the tabs.

3.2.4 Performance Testing
Depending on the scope of the release and its potential impact on performance, this section may need a separate document.
- Identify the areas that will potentially affect performance.
- Outline the high-level tests that will be executed.
- Define expected behaviour.
- Identify which applications will be involved in the performance test environment.
- Clarify when performance testing can begin.
[Need to include references to existing performance testing documentation.]

4. Entry/ Exit Criteria


This section describes the general criteria by which testing is commenced, temporarily stopped, resumed and completed within each testing phase. Different features/components may have slight variations of their criteria; in that case, those should be mentioned in the feature test plan. The assumptions critical for beginning and completing the System Test are identified below:
- The hardware setup is ready, with an adequate database instance and the defined configuration.
- Release notes information is available.
- A product demonstration is highly required before delivery to QA for testing.
- Unit tests are successfully completed and documented for each pattern.
- All code is frozen.
- Critical problems found during unit tests are resolved, and the corrected software is placed under Configuration Management (CM) control or VSS (currently we are using Odyssey).
- Integration Testing is completed successfully and the test results are published to the QA team for all modules.
- The System Test Plan is completed and has been reviewed with the project team.
- A Test Readiness Review (TRR) has been conducted to verify these assumptions, and system components are ready for System Test by the scheduled turnover date to the test team as determined by the TRR.


The system test ends when all tests have been completed and the exit criteria for PRODUCT 3.0 testing have been met. The following items identify the exit criteria:
- All test procedures executed, with individual test criteria successfully completed.
- All problems identified and documented in the defect-tracking tool.
- All Major and Critical problems resolved and re-tested.
- All components working properly with derivations.
- A regression has been performed on the complete product while doing end-to-end testing.

5. Risks / Dependencies
Outline the potential risks that you see occurring during the project. Estimate the likelihood of each actually happening and assess its impact. A contingency plan should be put in place should the risk materialise. Samples below:

Risk | Prob | Impact | Contingency
1. Unclear requirements from client | 30% | High | Highlight issue with PM.
2. Availability of performance lab resources | 30% | High | 1. Book machines in advance. 2. Coordinate with performance lab owner, Francis Cahill. 3. Stability tests to be run on functional test machines. 4. Initial performance testing to be carried out on functional test machines.
3. Changes to functionality without QA being notified or discovery being updated | 10% | High | 1. Weekly meeting should prevent this. 2. Hand-over documents to be filled in for all releases to QA.
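As a hedged illustration only (the impact weights and the exposure formula are an assumption, not part of this template), the sample risks above can be ranked by a simple exposure score of probability multiplied by an impact weight:

```python
# Hypothetical sketch: rank risks by exposure (probability x impact weight).
# The impact weights below are an assumed 1-3 scale, not from the plan.
IMPACT = {"Low": 1, "Medium": 2, "High": 3}

risks = [("Unclear requirements from client", 0.30, "High"),
         ("Availability of performance lab resources", 0.30, "High"),
         ("Changes to functionality without QA being notified", 0.10, "High")]

# Sort highest exposure first so the riskiest items surface at the top.
ranked = sorted(risks, key=lambda r: r[1] * IMPACT[r[2]], reverse=True)
for name, prob, impact in ranked:
    print(f"{name}: exposure {prob * IMPACT[impact]:.2f}")
```

A sketch like this is only useful for ordering the risk table; the contingency plans themselves still need human judgement.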

6. Resources
Insert details of project team here, include assigned duties if known.

6.1 Team
Name | Role | Responsibilities
AN Other | Team Lead | QA planning & co-ordination, BugZilla monitoring, Feature1, Feature2
AN Other | QA Engineer | Feature3, Feature4
AN Other | Technical Writer | User Guides, Admin Guides, release notes, strings etc.
AN Other | Senior Software Engineer | Feature1, Feature2
AN Other | Software Engineer | Feature3, Feature4
AN Other | Project Manager |

6.2 QA Environment(s)
Add details of the QA environment to be used. Note: the test environment includes, but is not limited to, the following:
- Machines (type of machine, specification)
- Operating Systems (Solaris, Linux, OS version, patch version)
- Application Servers (BEA WebLogic, Tomcat, JBoss)
- Oracle database (Enterprise, version)
- Machine mounts
- JRE versions (what version is the customer using?)
- Integration with 3rd party applications

The test environment should match the customer's environment as closely as possible. Include an architecture diagram if necessary.

Resource | Details
Machine1 | Solaris 10 + outline specifications
Machine2 - BEA WebLogic | Solaris 10 + outline specifications; version 8.1, will host the Device Manager
Machine3 - Oracle Databases | Version is 10G, OS + specification of machine. What databases will be used? Do they already have information? If so, how many profiles or profile_hits?

6.3 QA Test Tools


- FireFox Web Browser plus additional extensions (Modify Header, WML Browser, XHTML Mobile Profile) or Proximatron
- RadView WebLoad
- Oracle Raptor/Squirrel
- Oracle Enterprise Manager
- Add any other required tools

7. Project Milestones
This section is completed in conjunction with the Project Plan created by the Project Manager. Sample included for reference purposes only:

Milestone | Start Date | End Date | QA Effort
Pre-iteration 1: Benchmark performance testing | 01/01/06 | 10/01/06 | 34 days
Iteration 1: JDK v1.5 changes complete | 11/09/06 | 18/09/06 |
NS Performance Testing: o2 Capacity upgrade | 19/09/06 | 21/09/06 |
SNMP Framework, Recommender and UM | 22/09/06 | 30/09/06 |
NS Performance Testing: Personalizer optimization | 22/09/06 | 30/09/06 |
Iteration 2: Feature 1 | | |
Iteration 3: Feature 2 | | |
Product test passes | | |
Complete documentation | | |
Projected Release Date: 12/12/06

Deliverables
Below is the list of artifacts that are process-driven and should be produced during the testing lifecycle. Certain deliverables should be delivered as part of test validation; you may add to the list below any deliverables that support the overall objectives and help maintain quality. This matrix should be updated routinely throughout the project development cycle in your project-specific Test Plan.

Deliverable Documents:
- Test Plan
- Test Specifications
- Test Case / Bug Write-Ups
- Test Cases / Results
- Test Coverage Reports
Reports:
- Test results report
- Test Final Report - Sign-Off

7.1 Test Plan


The Test Plan contains the Requirements, Functional Specs, and detailed Design Specs. It identifies the details of the test approach, identifying the associated test case areas within the specific product for this release cycle. The purpose of the Test Plan document is to:
- Specify the approach that Testing will use to test the product, and the deliverables (extract from the Test Approach).
- Break the product down into distinct areas and identify features of the product that are to be tested.
- Specify the procedures to be used for testing sign-off and product release.
- Indicate the tools used to test the product.
- List the resource and scheduling plans.
- Indicate the contact persons responsible for various areas of the project.
- Identify risks and contingency plans that may impact the testing of the product.
- Specify bug management procedures for the project.
- Specify criteria for acceptance of development drops to testing (of builds).

7.2 Test Specifications


A Test Specification document is derived from the Test Plan as well as the Requirements, Functional Spec, and Design Spec documents. It provides specifications for the construction of Test Cases and includes list(s) of test case areas and test objectives for each of the components to be tested, as identified in the project's Test Plan.

7.3 Requirements Traceability Matrix


The Requirements Traceability Matrix (RTM) is maintained within the test cases. The comments column of the test cases identifies the requirement paragraph number being tested at the specific test step. This mechanism provides a one-to-one mapping between requirements and test cases. There will be a one-to-one mapping between the test procedures and requirements; one test procedure will be developed for each pattern in PRODUCT 3.0. The following is a sample basic RTM which could provide a starting point for this documentation.

Req. ID | Doc | Module | Comp | Severity | Comp. Req. ID | Test ID
R-20349 | 1 | Admin Rights | High | Major | R1.1.1 | TC-001
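As a hedged sketch of the idea behind the matrix (the helper below is hypothetical and uses the sample IDs from the row above plus an invented second requirement), the requirement-to-test-case mapping can be checked mechanically for gaps:

```python
# Minimal traceability sketch: map each test case to the requirements it
# covers, then flag any requirement with no covering test case.
requirements = ["R-20349", "R-20350"]   # sample requirement IDs (R-20350 is invented)
rtm = {"TC-001": ["R-20349"]}           # test case ID -> requirement IDs covered

# Collect every requirement covered by at least one test case.
covered = {req for reqs in rtm.values() for req in reqs}
uncovered = [req for req in requirements if req not in covered]

for req in uncovered:
    print(f"Requirement {req} has no mapped test case")
```

Running this on the sample data reports `R-20350` as uncovered, which is exactly the gap an RTM review is meant to catch.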

7.4 Reports
The Test Lead will be responsible for writing and disseminating the following reports to appropriate project personnel as required.

7.4.1 Testing status reports


A weekly or bi-weekly status report will be provided by the Test Lead to project personnel. This report will summarize weekly testing activities, issues, risks, bug counts, test case coverage, and other relevant metrics.

7.4.2 Phase Completion Reports


When each phase of testing is completed, the Test Lead will distribute a Phase Completion Report to the Product Manager, Development Lead, and Program Manager for review and sign-off. The bullets below illustrate an example of what the document may include.


The document must contain the following metrics:
- Total Test Cases, Number Executed, Number of Passes / Fails, Number Yet to Execute
- Number of Bugs Found to Date, Number Resolved, and Number Still Open
- Breakdown of Bugs by Severity / Priority Matrix
- Discussion of Unresolved Risks
- Discussion of Schedule Progress (are we where we are supposed to be?)
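As a hedged illustration (the record layout and sample data below are assumptions, not a format this template prescribes), the count-based metrics above can be derived from a simple list of test results and bug records:

```python
# Sketch of the Phase Completion Report counts: executed/passed/failed tests,
# open bug count, and a severity breakdown. Sample data is hypothetical.
tests = [{"id": "TC-001", "status": "pass"},
         {"id": "TC-002", "status": "fail"},
         {"id": "TC-003", "status": "not_run"}]
bugs = [{"id": 1, "severity": "Major", "open": True},
        {"id": 2, "severity": "Minor", "open": False}]

total = len(tests)
executed = sum(t["status"] in ("pass", "fail") for t in tests)
passed = sum(t["status"] == "pass" for t in tests)
failed = sum(t["status"] == "fail" for t in tests)
still_open = sum(b["open"] for b in bugs)

# Tally bugs per severity for the Severity / Priority breakdown.
by_severity = {}
for b in bugs:
    by_severity[b["severity"]] = by_severity.get(b["severity"], 0) + 1

print(f"Total {total}, executed {executed}, passed {passed}, failed {failed}")
print(f"Bugs found {len(bugs)}, open {still_open}, by severity {by_severity}")
```

In practice these counts would come from the defect-tracking tool (e.g. BugZilla exports) rather than hand-written lists; the sketch only shows the arithmetic behind the report.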

7.4.3 Test Final Report - Sign-Off

A Final Test Report will be issued by the Test Lead. It will certify the extent to which testing has actually been completed (a test case coverage report is suggested) and give an assessment of the product's readiness for release to production.

8. Assumptions
This section should contain a list of assumptions on which the contents of the QA plan are based. If assumptions change, the impact needs to be reviewed and relevant people made aware of any consequences. Some sample assumptions may be:
- QA estimates are based on discoveries as of [DATE]. Updates to discoveries/features may impact estimates.
- Data from external servers will not be required for activation rules.
- The custom rules supplied will be based on HTTP headers.
- Only 3 page styles will be used:
  o Grid look and feel
  o Text-based look and feel
  o Text-based with one image
- All devices will be XHTML. All non-XHTML devices will be redirected to the current portal.
- No changes to the Report Manager parser will be required.
- Resources are not taken off the project.

9. Appendices
Any other information of use. Locations of discoveries, QA estimates, requirements documents, e-mails, architecture documents etc.

9.1 Discoveries + QA Estimates


All discoveries and QA estimates are in \\saturn\developers\QA\QA-Estimates\location where discoveries are

9.2 Requirements and supporting documents


All supporting documentation, status meeting minutes and updates can be found on [provide details of project website, or folder location on the servers.]


Appendix A: Project Tasks


Below are the test-related tasks:
- Plan Test
- Identify Requirements for Test
- Develop Test Strategy
- Identify Test Resources
- Create Schedule
- Generate Test Plan
- Design Test Procedure Template
- Workload Analysis
- Identify and Describe Test Cases
- Review and Assess Test Coverage
- Implement Test
- Execute Test
- Evaluate Execution of Test
- Verify the Results
- Log Defects
- Retest
- Evaluate Retest-Case Coverage
- Determine if Test Completion Criteria and Success Criteria have been achieved
