
Test Plan

Table of Contents
1. Test Plan Identifier
2. References
3. Introduction
4. Test Items
5. Software Risk Issues
6. Features to be Tested
7. Features not to be Tested
8. Approach/Strategy
9. Item pass/fail criteria
10. Suspension criteria and resumption requirements
11. Test deliverables
12. Remaining Test Tasks
13. Environmental Needs
14. Responsibilities
15. Staffing and training needs
16. Schedule
17. Risks and contingencies
18. Entry Criteria
19. Exit Criteria
20. Approvals

Test Plan Identifier
A unique, company-generated number to identify this test plan, its level, and the level of software that it is related to. Preferably the test plan level will be the same as the related software level. The number may also identify whether the test plan is a Master plan, a Level plan, an integration plan, or whichever plan level it represents. This is to assist in coordinating software and testware versions within configuration management.

References
List all documents that support this test plan. Refer to the actual version/release number of each document as stored in the configuration management system. Do not duplicate text from other documents, as this will reduce the value of this document and increase the maintenance effort. Documents that can be referenced include:

 Project Plan
 Requirements specifications
 High Level design document
 Detail design document
 Development and Test process standards
 Methodology guidelines and examples
 Corporate standards and guidelines

Introduction
State the purpose of the Plan. This is essentially the executive summary part of the plan.
Keep information brief and to the point.

Test Items
These are the things you intend to test within the scope of this test plan: a list of what is to be tested. The information includes version numbers and, where needed, configuration requirements (especially if multiple versions of the product are supported). It may also include key delivery schedule issues for critical elements.
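As a minimal sketch, such a list can also be kept in machine-readable form; the item names, version numbers and configurations below are placeholders, not part of the template.

# Illustrative test item inventory; names, versions and configurations are placeholders.
TEST_ITEMS = [
    {"item": "billing-service", "version": "2.4.1", "configuration": "Oracle 19c backend"},
    {"item": "customer-portal", "version": "2.4.1", "configuration": "Chrome/Edge, en-US locale"},
]

for entry in TEST_ITEMS:
    print(f'{entry["item"]} {entry["version"]} ({entry["configuration"]})')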

Software Risk Issues
Identify what software is to be tested and what the critical areas are, such as:

 Delivery of a third-party product.
 New version of interfacing software.
 Ability to use and understand a new package/tool, etc.
 Extremely complex functions.
 Poorly documented modules or change requests.

Features to be Tested
This is a listing of what is to be tested from the USER'S viewpoint of what the system does. This is not a technical description of the software, but a USER'S view of the functions.
Set the level of risk for each feature. Use a simple rating scale such as (H, M, L): High, Medium and Low. These types of levels are understandable to a User.
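A minimal sketch of such a risk-rated feature list, assuming a simple mapping from feature name to rating; the feature names and ratings below are placeholders, not part of the template.

# Illustrative only: feature names and ratings are placeholders.
# Each entry pairs a user-visible function with its agreed risk level.
FEATURE_RISK = {
    "User login":           "H",
    "Password reset":       "M",
    "Report export to PDF": "L",
}

# Features rated High would normally receive the deepest test coverage.
high_risk = [name for name, risk in FEATURE_RISK.items() if risk == "H"]
print("High-risk features:", high_risk)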

Features not to be Tested
This is a listing of what is NOT to be tested, from both the USER'S viewpoint of what the system does and a configuration management/version control view. This is not a technical description of the software, but a USER'S view of the functions.
Identify WHY the feature is not to be tested; there can be any number of reasons:

 Not to be included in this release of the software.
 Low risk; has been used before and is considered stable.
 Will be released but not tested or documented as a functional part of the release of this version of the software.
What will and will not be tested are directly affected by the levels of acceptable risk within the project, and what does not get tested affects the level of risk of the project.

Approach/Strategy
This is your overall test strategy for this test plan. Overall rules and processes should be identified.

 Are any special tools to be used, and what are they?
 Will the tool require special training?
 What metrics will be collected?
 At which level is each metric to be collected?
 How is Configuration Management to be handled?
 How many different configurations will be tested?
o Hardware.
o Software.
o Combinations of HW, SW and other vendor packages.
 What levels of regression testing will be done, and how much at each test level?
 Will regression testing be based on severity of defects detected?
 How will elements in the requirements and design that do not make sense or are untestable be handled?
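As one hedged illustration of the tool and regression questions above, risk level and regression scope can be encoded as test markers. The sketch below assumes pytest is the chosen tool; the marker names are illustrative, not prescribed by this template.

# conftest.py -- registers illustrative markers so they can be selected with "pytest -m <marker>".
def pytest_configure(config):
    config.addinivalue_line("markers", "high_risk: covers a feature rated High")
    config.addinivalue_line("markers", "regression: rerun after severe defects are fixed")

# test_example.py -- a placeholder test tagged by risk and regression scope.
import pytest

@pytest.mark.high_risk
@pytest.mark.regression
def test_placeholder_for_a_high_risk_feature():
    assert True  # real assertions would exercise the feature under test

The regression suite for a given level could then be run with "pytest -m regression", and its scope widened or narrowed depending on the severity of defects detected.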

Item pass/fail criteria
What are the completion criteria for this plan? This is a critical aspect of any test plan and should be appropriate to the level of the plan.

 At the Unit test level this could be items such as:
o All test cases completed.
o A specified percentage of cases completed, with a percentage containing some number of minor defects.
o Code coverage tool indicates all code covered.
 At the Master test plan level this could be items such as:
o All lower-level plans completed.
o A specified number of plans completed without errors and a percentage with minor defects.
Also consider the number and severity of defects located.
 Is it possible to compare this to the total number of defects? This may be impossible, as some defects are never detected.
o A defect is something that may cause a failure; it may be acceptable to leave it in the application.
o A failure is the result of a defect as seen by the User (e.g., the system crashes).
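As a hedged sketch of how such completion criteria might be checked mechanically, the function below evaluates example unit-level thresholds; the threshold values are illustrative, not requirements of this template.

def unit_level_pass(cases_run, cases_total, minor_defects, coverage_pct):
    """Evaluate example unit-test completion criteria (illustrative thresholds)."""
    all_cases_completed = cases_run == cases_total
    coverage_met = coverage_pct >= 100.0   # "code coverage tool indicates all code covered"
    minor_defects_ok = minor_defects <= 5  # tolerated number of minor defects (assumed)
    return all_cases_completed and coverage_met and minor_defects_ok

# Example: 120 of 120 cases run, 3 minor defects, 100% statement coverage -> True
print(unit_level_pass(120, 120, 3, 100.0))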

Suspension criteria and resumption requirements
Know when to pause in a series of tests.
If the number or type of defects reaches a point where follow-on testing has no value, it makes no sense to continue the test; you are just wasting resources.
Specify what constitutes stoppage for a test or series of tests, and what level of defects is acceptable for testing to proceed past them.
Testing after a truly fatal error will generate conditions that may be identified as defects but are in fact ghost errors caused by the earlier defects that were ignored.
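A minimal sketch of such a suspension check, assuming each open defect record carries a severity field; the field name and the rule of suspending on any blocking defect are assumptions, not part of the template.

def should_suspend(open_defects):
    """Suspend the test series if any open defect blocks further useful testing."""
    return any(d.get("severity") == "blocker" for d in open_defects)

# Example defect log entries (illustrative).
print(should_suspend([{"id": 101, "severity": "minor"},
                      {"id": 102, "severity": "blocker"}]))  # True -> pause testing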

Test deliverables
What is to be delivered as part of this plan?

 Test plan document.
 Test cases.
 Test design specifications.
 Tools and their outputs.
 Static and dynamic generators.
 Error logs and execution logs.
 Problem reports and corrective actions.
One thing that is not a test deliverable is the software itself; that is listed under Test Items and is delivered by development.

Remaining Test Tasks
If this is a multi-phase process, or if the application is to be released in increments, there may be parts of the application that this plan does not address. These areas need to be identified to avoid any confusion should defects be reported back on those future functions. This will also allow the users and testers to avoid incomplete functions and prevent wasting resources chasing non-defects.

Environmental Needs
Are there any special requirements for this test plan, such as:

 Special hardware.
 How will test data be provided? Are there special collection requirements or specific ranges of data that must be provided?
 How much testing will be done on each component of a multi-part feature?
 Specific versions of other supporting software.
 Restricted use of the system during testing.

Responsibilities
Who is in charge?
This issue includes all areas of the plan. Here are some examples:

 Setting risks.
 Selecting features to be tested and not tested.
 Setting overall strategy for this level of plan.
 Ensuring all required elements are in place for testing.
 Providing for resolution of scheduling conflicts, especially if testing is done on the production system.
 Who provides the required training?
 Who makes the critical go/no go decisions for items not covered in the test plans?

Staffing and training needs
Specify test staffing needs by skill level. Identify training options for providing necessary skills.

Schedule
Include test milestones identified in the software project schedule as well as all item transmittal events. The schedule should be based on realistic and validated estimates.

Risks and contingencies
What are the overall risks to the project with an emphasis on the testing process?

 Lack of personnel resources when testing is to begin.
 Lack of availability of required hardware, software, data or tools.
 Late delivery of the software, hardware or tools.
 Delays in training on the application and/or tools.
 Changes to the original requirements or designs.

Entry Criteria
Entry criteria are used to determine when a given test activity should start. This includes the start of a level of testing, and when test design or test execution is ready to begin.
Examples of entry criteria:

 Verify that the test environment is available and ready for use.
 Verify that the test tools installed in the environment are ready for use.
 Verify that testable code is available.
 Verify that test data is available and validated for correctness.
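Checks like those above can often be automated as a small pre-flight script. The sketch below assumes the test environment is reachable over TCP and that test data lives in a single file; the host, port and file path are placeholders, not part of the template.

import socket
from pathlib import Path

def environment_reachable(host, port, timeout=3.0):
    """True if the test environment accepts a TCP connection."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def test_data_present(path):
    """True if the prepared test data file exists and is non-empty."""
    p = Path(path)
    return p.is_file() and p.stat().st_size > 0

# Hypothetical host/port and data path; substitute the project's real values.
ready = environment_reachable("test-env.example.com", 443) and \
        test_data_present("testdata/accounts.csv")
print("Entry criteria met:", ready)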

Exit Criteria
Exit criteria are used to determine whether a given test activity has been completed or not. Exit criteria can be defined for all of the test activities, from planning through specification and execution. Exit criteria should be part of the test plan and decided on during the planning stage.
Examples of exit criteria:

 Verify that all planned tests have been run.
 Verify that the required level of requirement coverage has been met.
 Verify that no critical or high-severity defects are left outstanding.
 Verify that all high-risk areas have been completely tested.
 Verify that software development activities are completed within the projected cost.
 Verify that software development activities are completed within the projected timelines.
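As one hedged illustration, the exit checks can be expressed as a named checklist evaluated at the end of the test level; every figure below is a placeholder to be replaced by the project's own numbers.

# All figures are placeholders for illustration; a real plan supplies its own.
results = {
    "all planned tests run":         540 == 540,   # executed vs. planned
    "requirement coverage >= 95%":   0.97 >= 0.95,
    "no open critical/high defects": 0 == 0,
    "high-risk areas fully tested":  True,
}

unmet = [name for name, ok in results.items() if not ok]
print("Exit criteria met" if not unmet else f"Outstanding: {unmet}")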

Approvals
Specify the names and titles of all persons who must approve this plan. Provide space for the signatures
and dates.
