Software Test Documentation IEEE 829
Project Name
Version
Date
i. Introduction
This test plan for Mobile App xxx supports the following objectives:
1. To define the tools to be used throughout the testing process.
Related reference documents:
● Requirements Specifications
● Design Specifications
You can use the table below to list the items:
ITEMS TO BE TESTED        VERSION NUMBER
Item 1                    V1.3
Item 2
Item 3
Item 4
Item 5
v. Approach:
This is your overall test strategy for this test plan; it should be appropriate to the level of the plan
(master, acceptance, etc.) and consistent with all higher- and lower-level plans.
Overall rules and processes should be identified.
Describe the overall approach to testing and specify the following (a brief illustrative sketch follows this list):
o Testing levels
o Testing types
o Testing methods
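For illustration only, the sketch below shows one way testing levels and types could be made visible in an automated suite, assuming a Python/pytest setup; the add() function and the unit/smoke markers are hypothetical and are not mandated by this plan.

    # Illustrative sketch only (assumed Python/pytest setup, not part of this plan).
    # Shows how testing levels and types could be tagged so runs can be filtered.
    import pytest

    def add(a, b):
        """Stand-in for the unit under test (hypothetical example)."""
        return a + b

    @pytest.mark.unit    # testing level: unit
    def test_add_returns_sum():
        assert add(2, 3) == 5

    @pytest.mark.smoke   # testing type: smoke
    def test_add_handles_negatives():
        assert add(-2, -3) == -5

Running "pytest -m unit" would then execute only the unit-level tests; the custom markers would need to be registered under "markers" in pytest.ini to avoid warnings.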
CRITERIA    DESCRIPTION
Item 1      xxx
Item 2
Item 3
Item 4
Item 5
Item 6
Item 7
Item 8
· Specify what constitutes stoppage for a test or series of tests, and what level of defects is
acceptable for testing to proceed past them.
· Testing past a truly fatal error will generate conditions that may be reported as defects but are
in fact ghost errors caused by the earlier defects that were ignored.
• Once all bugs/defects reported during testing are fixed and no further bugs are found, the
report will be deployed to the client's test site by the PM.
• One round of testing will be done by QA on the client's test site if required. The report will
then be delivered, along with sample output, by email to the respective lead and the Report group.
• QA will submit the filled-in hard copy of the delivery slip to the respective developer.
• Once the lead receives the hard copy of the delivery slip filled in by QA and the developer, the
lead will send the report delivery email to the client.
This will also allow users and testers to avoid incomplete functions and prevent wasting
resources chasing non-defects.
x. Environmental Needs:
Are there any special requirements for this test plan, such as:
· Special hardware, such as simulators, static generators, etc.
· How will test data be provided? Are there special collection requirements or
specific ranges of data that must be provided? (A brief sketch follows this list.)
· How much testing will be done on each component of a multi-part feature?
· Special power requirements.
· Specific versions of other supporting software.
· Restricted use of the system during testing.
· Tools (both purchased and created).
· Communications:
o Web
o Client/Server
o Network (topology, external, internal, bridges/routers)
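Where specific data ranges must be provided (see the test data bullet above), they can be captured in a small, reviewable structure. The sketch below is hypothetical, assuming a Python setup; the field names and ranges are assumptions, not requirements of this plan.

    # Hypothetical sketch (assumed Python setup): declaring required test-data
    # ranges so data provisioning can be reviewed alongside this plan.
    import random

    TEST_DATA_RANGES = {
        "order_amount": (0.01, 9999.99),  # assumed field and range, for illustration
        "quantity": (1, 100),             # assumed field and range, for illustration
    }

    def sample_values(field, count=3, seed=0):
        """Return the boundary values plus a few random in-range values for one field."""
        low, high = TEST_DATA_RANGES[field]
        rng = random.Random(seed)
        return [low, high] + [round(rng.uniform(low, high), 2) for _ in range(count)]

    if __name__ == "__main__":
        for field in TEST_DATA_RANGES:
            print(field, sample_values(field))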
1- Testing Tools
PROCESS                 TOOL
Test case creation
Test case tracking
Test case execution
Test case management
Defect management
Test reporting
Checklist creation
Project structure
2- Test Environment
E.g.
o Support level 1 (browsers):
Windows 8: Edge, Chrome (latest), Firefox (latest)
Mac OS X: Chrome (latest), Firefox (latest)
Linux Ubuntu: Chrome (latest), Firefox (latest)
o Support level 1 (devices):
iPhone 5 / 6, iPad 3, Nokia Lumia 910, Google Nexus 7, LG G3.
o Support level 2:
Windows 7: IE 9+, Chrome (latest), Firefox (latest)
Windows XP: IE 8, Chrome (latest), Firefox (latest)
o Support level 3: anything else
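The support levels above can also be kept in a machine-readable form so that automated cross-browser runs cover the same matrix. The sketch below is only an assumed Python representation; the structure and helper are illustrative and not part of this plan.

    # Hypothetical, machine-readable version of the example support matrix above.
    # The structure and helper function are assumptions for illustration only.
    SUPPORT_MATRIX = {
        1: [
            {"os": "Windows 8",    "browsers": ["Edge", "Chrome (latest)", "Firefox (latest)"]},
            {"os": "Mac OS X",     "browsers": ["Chrome (latest)", "Firefox (latest)"]},
            {"os": "Linux Ubuntu", "browsers": ["Chrome (latest)", "Firefox (latest)"]},
        ],
        2: [
            {"os": "Windows 7",    "browsers": ["IE 9+", "Chrome (latest)", "Firefox (latest)"]},
            {"os": "Windows XP",   "browsers": ["IE 8", "Chrome (latest)", "Firefox (latest)"]},
        ],
    }

    def browsers_for_level(level):
        """Flatten the matrix into (os, browser) pairs for one support level."""
        return [(entry["os"], b)
                for entry in SUPPORT_MATRIX.get(level, [])
                for b in entry["browsers"]]

    if __name__ == "__main__":
        for os_name, browser in browsers_for_level(1):
            print(f"Level 1: {os_name} / {browser}")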
xi. Responsibilities
There should be a responsible person for each aspect of the testing and the test process.
Each test task identified should also have a responsible person assigned.
This includes all areas of the plan; some examples follow.
· Setting risks.
· Selecting features to be tested and not tested.
· Setting overall strategy for this level of plan.
· Ensuring all required elements are in place for testing.
· Providing for resolution of scheduling conflicts, especially if testing is
done on the production system.
· Who provides the required training?
· Who makes the critical go/no go decisions for items not covered in the test plans?
· Who delivers each item in the test items section?
List the responsibilities of each team / role / individual:
ROLE              STAFF MEMBER    RESPONSIBILITIES
Project Manager                   1. Acts as the primary contact for the development and QA teams.
                                  2. Responsible for the project schedule and the overall success of the project.
QA                                1. Understand requirements
                                  2. Write and execute test cases
                                  3. Prepare the RTM
                                  4. Review test cases and the RTM
                                  5. Report and track defects
                                  6. Perform retesting and regression testing
                                  7. Attend bug review meetings
                                  8. Prepare test data
                                  9. Coordinate with the QA Lead on any issues
xii. Schedule
The schedule should be based on realistic, validated estimates. If the estimates for the development
of the application are inaccurate, the entire project plan will slip, and because testing is part of
the overall project plan, the test schedule will slip with it.
Testing will take place 4 weeks prior to the launch date. The first round of testing should be
completed in 1 week.
RISK TYPE    DETAILS                                                      RISK RATING    CONTINGENCY
SCHEDULE     A slip in the schedule in one of the other phases could      L/M/H
             result in a subsequent slip in the test phases.
TECHNICAL    As this is a new system, if there is a failure the old       L/M/H
             system can be used.
SIGNATURE