Software Testing Documentation
Documentation goals:
1. QA should be involved from the very first phase of the project, so that QA and documentation work hand in hand.
2. The process defined by QA should be followed by the technical team; this helps remove most of the defects at a very early stage.
3. Just creating and maintaining software testing templates is not enough; make sure people actually use them.
4. Don't just create a document and leave it; update it as and when required.
5. Requirement changes are an important part of the project; don't forget to add them to the list.
6. Use version control for everything. This will help you manage and track your documents easily.
7. Make the defect remediation process easier by documenting all defects. Make sure to include a clear description of the defect, steps to reproduce, the affected area, and author details when documenting any defect.
8. Document what you need in order to understand your work, and what you will need to present to your stakeholders whenever required.
9. Use a standard template for documentation, such as an Excel sheet or document file template, and stick to it for all your documentation needs.
10. Keep all project-related documents in a single location, accessible to every team member both for reference and for updating whenever required.
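To make tip 7 concrete, here is a minimal sketch in Python of a defect record that enforces the four pieces of information every documented defect should carry. The class and field names are illustrative, not taken from any particular tracking tool:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class DefectRecord:
    description: str               # clear description of the defect
    steps_to_reproduce: List[str]  # numbered reproduction steps
    affected_area: str             # module or menu path where the defect appears
    author: str                    # who reported the defect

    def is_complete(self) -> bool:
        # All four pieces of information must be present before filing.
        return all([self.description.strip(),
                    self.steps_to_reproduce,
                    self.affected_area.strip(),
                    self.author.strip()])

defect = DefectRecord(
    description="Application crashes on clicking SAVE while creating a new user",
    steps_to_reproduce=["Log in", "Open USERS menu > New User",
                        "Fill in the user details", "Click SAVE"],
    affected_area="USERS menu > New User",
    author="Your Name",
)
print(defect.is_complete())  # → True
```

A completeness check like this is the programmatic equivalent of "force people to use the template": a defect missing any of the four fields is rejected before it ever reaches the tracker.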
I am not saying that by applying the above steps you will see sudden results. I know this change won't happen in a day or two, but at least we can start so that these changes begin to happen gradually.
The sample bug/defect report below will give you an exact idea of how to report a bug in a bug tracking tool.
Here is an example scenario that caused a bug:
Let's assume that in your application under test you want to create a new user with user information. For that you need to log in to the application and navigate to the USERS menu > New User, then enter all the details in the 'User form', such as First Name, Last Name, Age, Address, Phone, etc. Once you have entered all this information, you click the 'SAVE' button to save the user, and you should see a success message saying, "New User has been created successfully".
But when you logged in to your application, navigated to USERS menu > New User, entered all the required information, and clicked the SAVE button: BANG! The application crashed and an error page appeared on screen. (Capture this error message window and save it as an image file.)
Now this is the bug scenario, and you would like to report it as a BUG in your bug-tracking tool.
How will you report this bug effectively?
Here is the sample bug report for the above example:
(Note that some 'bug report' fields might differ depending on your bug tracking system.)
SAMPLE BUG REPORT:
Bug Name: Application crash on clicking the SAVE button while creating a new user.
Bug ID: (It will be automatically created by the BUG Tracking tool once you save
this bug)
Area Path: USERS menu > New Users
Build Number: Version Number 5.0.1
Severity: HIGH (High/Medium/Low) or 1
Priority: HIGH (High/Medium/Low) or 1
Assigned to: Developer-X
Reported By: Your Name
Reported On: Date
Reason: Defect
Status: New/Open/Active (Depends on the Tool you are using)
Environment: Windows 2003/SQL Server 2005
Description:
The application crashes on clicking the SAVE button while creating a new user; hence it is not possible to create a new user in the application.
Steps To Reproduce:
1) Log on to the application
2) Navigate to the Users menu > New User
3) Fill in all the user information fields
4) Click the 'Save' button
5) Observe the error page "ORA1090 Exception: Insert values Error…"
6) See the attached logs for more information (attach further logs related to the bug, if any)
7) See the attached screenshot of the error page
Expected result: On clicking the SAVE button, the success message "New User has been created successfully" should be displayed.
(Attach an 'application crash' screenshot, if any.)
Save the defect/bug in the bug tracking tool. You will get a bug ID, which you can use for further reference to this bug.
A default 'new bug' notification mail will go to the respective developer and the default module owner (team leader or manager) for further action.
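If your bug tracking tool exposes an API, the sample report above can be captured as structured data and checked for completeness before saving. This is a hypothetical sketch: the field names mirror the sample report, while real tools (JIRA, Bugzilla, etc.) each use their own schema:

```python
# Mandatory fields a bug report should carry before it is saved.
# Names are illustrative, modeled on the sample report in this article.
REQUIRED_FIELDS = ["bug_name", "area_path", "build_number", "severity",
                   "priority", "assigned_to", "reported_by", "environment",
                   "description", "steps_to_reproduce", "expected_result"]

bug_report = {
    "bug_name": "Application crash on clicking the SAVE button while creating a new user.",
    "area_path": "USERS menu > New Users",
    "build_number": "5.0.1",
    "severity": "HIGH",
    "priority": "HIGH",
    "assigned_to": "Developer-X",
    "reported_by": "Your Name",
    "environment": "Windows 2003/SQL Server 2005",
    "description": "Application crashes on clicking SAVE; unable to create a new user.",
    "steps_to_reproduce": ["Log on", "Users menu > New User",
                           "Fill in all fields", "Click Save"],
    "expected_result": "Success message: New User has been created successfully",
}

def missing_fields(report: dict) -> list:
    """Return the mandatory fields that are empty or absent."""
    return [f for f in REQUIRED_FIELDS if not report.get(f)]

print(missing_fields(bug_report))  # → []
```

An empty result means the report is complete; anything returned by `missing_fields` should be filled in before the bug is filed.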
Table of Contents:
1. Introduction
1.1. Test Plan Objectives
2. Scope
2.1. Data Entry
2.2. Reports
2.3. File Transfer
2.4. Security
3. Test Strategy
3.1. System Test
3.2. Performance Test
3.3. Security Test
3.4. Automated Test
3.5. Stress and Volume Test
3.6. Recovery Test
3.7. Documentation Test
3.8. Beta Test
3.9. User Acceptance Test
4. Environment Requirements
4.1. Data Entry workstations
4.2 Mainframe
5. Test Schedule
6. Control Procedures
6.1 Reviews
6.2 Bug Review meetings
6.3 Change Request
6.4 Defect Reporting
7. Functions To Be Tested
8. Resources and Responsibilities
8.1. Resources
8.2. Responsibilities
9. Deliverables
10. Suspension / Exit Criteria
11. Resumption Criteria
12. Dependencies
12.1 Personnel Dependencies
12.2 Software Dependencies
12.3 Hardware Dependencies
12.4 Test Data & Database
13. Risks
13.1. Schedule
13.2. Technical
13.3. Management
13.4. Personnel
13.5 Requirements
14. Tools
15. Documentation
16. Approvals
Let me know if you still want an example Test Plan!
Update: You can visit the latest updated article on the Sample Test Plan, where you can also download a PDF version of this Test Plan template.
Writing an effective status report is as important as the actual work you did! How do you write an effective status report on your weekly work at the end of each week?
Here are some tips. A weekly report is important for tracking key project issues, project accomplishments, pending work, and milestone analysis. You can even use these reports to track team performance to some extent. From this report, prepare future actionable items according to priority, and make the list of next week's actionables.
So how do you write a weekly status report? Follow the template below:
Prepared By:
Project:
Date of preparation:
Status:
A) Issues:
Issues preventing the QA team from delivering on schedule:
Project:
Issue description:
Possible solution:
Issue resolution date:
You can mark these issues in red. These are the issues that require management's help to resolve.
Issues that management should be aware of:
These issues do not prevent the QA team from delivering on time, but management should be aware of them. Mark these issues in yellow. You can use the same template as above to report them.
Project accomplishments:
Mark them in Green colour. Use below template.
Project:
Accomplishment:
Accomplishment date:
B) Next week's priorities:
List next week's actionable items in two categories:
1) Pending deliverables (mark them in blue): These are the previous week's deliverables, which should be released as early as possible this week.
Project:
Work update:
Scheduled date:
Reason for extending:
2) New tasks:
List all of next week's new tasks here. You can use black for these.
Project:
Scheduled Task:
Date of release:
C) Defect status:
Active defects:
List all active defects here with reporter, module, severity, priority, and assignee.
Closed defects:
List all closed defects with reporter, module, severity, priority, and assignee.
Test cases:
List the total number of test cases written, passed, failed, and still to be executed.
This template should give you an overall idea of the status report. Don't ignore the status report: even if your managers are not forcing you to write these reports, they are very important for your future work assessment.
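The status report sections above can be assembled into plain text automatically. A minimal sketch, with illustrative section names and placeholder content; the colour conventions (red/yellow/green/blue/black) are kept as labels, since plain text has no colour:

```python
# Each section of the weekly status report, with its colour convention as a label.
# Section names and items are placeholders for this sketch.
SECTIONS = [
    ("Issues blocking delivery (RED)", ["Issue: build server down",
                                        "Possible solution: restore from backup"]),
    ("Issues for management awareness (YELLOW)", []),
    ("Project accomplishments (GREEN)", ["Login module testing completed"]),
    ("Pending deliverables (BLUE)", ["Release test summary for build 5.0.1"]),
    ("New tasks (BLACK)", ["Start testing the reports module"]),
    ("Defect status", ["Active: 4", "Closed: 11"]),
]

def build_report(prepared_by: str, project: str, sections) -> str:
    """Assemble the weekly status report sections into plain text."""
    lines = [f"Prepared By: {prepared_by}", f"Project: {project}", ""]
    for title, items in sections:
        lines.append(title + ":")
        if items:
            lines.extend(f"  - {item}" for item in items)
        else:
            lines.append("  (none)")
        lines.append("")  # blank line between sections
    return "\n".join(lines)

report = build_report("Your Name", "Project-X", SECTIONS)
print(report)
```

Keeping the sections in a single list means the report structure stays identical week to week, which makes reports easy to compare over time.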
There are hundreds of documents used in the software development and testing life cycle. Here I am listing a few important software testing documents that we need to use and maintain regularly:
1) Test plan
2) Test design and Test case specification
3) Test Strategy
4) Test summary reports
5) Weekly Status Report
6) User Documents/ manuals
7) User Acceptance Report
8) Risk Assessment
9) Test Log
10) Bug reports
11) Test data
12) Test analysis
What types of documents are needed for software testing?
Software testing is not just limited to testing the application; it also includes documentation. Testing documentation is used for a variety of reasons:
Testing Methodology
Test Strategy
Testing Effort Estimation
Testing Glossary
QA Roles & Responsibilities
Test Plan
Testing Checklist
Requirements Traceability Matrix
Test Cases
Test Scripts (Automated/Performance)
Performance Questionnaire
Testing Progress Report
Defect Log
Testing Completion Report (aka Findings Report)
Alternatively, the IEEE 829 standard has been developed specifically with software testing in mind and is applicable to each stage of the testing life cycle, including system and acceptance testing.
Types of Document
Test specification
Test Plan: Covers how the testing will be managed, scheduled, and executed.
Test Design Specification: Defines logically what needs to be tested by examining the requirements or features; these requirements can then be converted into test conditions.
Test Case Specification: Converts the test conditions into test cases by adding real data, pre-conditions, and expected results.
Test Procedure: Describes in practical terms how the tests are run.
Test Item Transmittal Report: Specifies the items released for testing.
Test execution
Test Log: An audit trail that records the details of tests chronologically.
Test Incident Report: Records details of any unexpected events and behaviours that need to be investigated.
Test reporting
The test preparation is by far the most important part of any software testing project. During this stage you
must create your tests and define the requirements of your test environment.
The Test Plan describes how you will deliver the testing.
1. Test Plan Identifier
2. References
3. Introduction
4. Test Items
5. Software Risk Issues
6. Features to be tested
7. Features not to be tested
8. Approach
9. Item Pass/Fail Criteria
10. Suspension Criteria and Resumption Requirements
11. Test Deliverables
12. Remaining Test Tasks
13. Environmental Needs
14. Staffing and Training Needs
15. Responsibilities
16. Schedule
17. Planning Risks and Contingencies
18. Approvals
19. Glossary
The test design specification identifies the test conditions from the requirements and functional design
documentation.
Let’s use a Banking project example where the following testing requirements have been defined. In this
case Bank A is rolling out new “hole in the wall” machines all over the country and the project will include
testing the functionality of the ATMs to demonstrate the ability to:
Complete a valid withdrawal of funds.
Print a report summary of recent transactions.
Change passwords.
The test design does not record the values to be entered for a test, but describes the requirements for defining those values. This is done at a logical level and should not be cluttered with individual examples and data. The design specification provides a link between test requirements and test cases.
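The requirement-to-test-case link that the design specification provides can be sketched as a simple traceability mapping. Using the ATM example (the test case IDs here are illustrative), a quick check confirms every requirement is covered at least once:

```python
# The three ATM testing requirements from the Banking project example.
requirements = {
    "R1": "Complete a valid withdrawal of funds",
    "R2": "Print a report summary of recent transactions",
    "R3": "Change passwords",
}

# Which requirements each test case exercises; one case may cover several,
# and a requirement may appear in more than one case.
coverage = {
    "TC1": ["R1", "R2"],  # withdrawal followed by a printed summary
    "TC2": ["R3"],
}

def uncovered(requirements: dict, coverage: dict) -> set:
    """Return the requirement IDs not exercised by any test case."""
    covered = {r for reqs in coverage.values() for r in reqs}
    return set(requirements) - covered

print(uncovered(requirements, coverage))  # → set()
```

An empty result means the set of test cases tests each requirement at least once, which is exactly the coverage goal the design specification exists to make checkable.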
The test cases can be produced when the test design is completed. A test case may cover one or more test conditions derived from the test design. A test case should include:
The precise data that is required to execute the test: a combination of the input data and the application and system data that is needed.
The expected results and outputs.
Pre-conditions that will determine the starting point for each test.
A feature (requirement) from the test design may be tested in more than one test case, and a test case may test more than one feature. The aim is for the set of test cases to test each feature (requirement) in the test design at least once. Taking the ATM project example, the first two requirements could be tested using one test case:
Test case 1 could be defined to include the completion of a cash withdrawal from the ATM and then a printout request to show that this withdrawal has been correctly executed and the right amount has been debited.
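Test case 1 from the ATM example could be recorded as a test case specification carrying precise data, pre-conditions, and expected results. A sketch with illustrative values (the PIN, amounts, and field names are made up for this example):

```python
# A test case specification record for the ATM example: precise input data,
# pre-conditions, and expected results. All concrete values are illustrative.
test_case_1 = {
    "id": "TC1",
    "covers": ["Complete a valid withdrawal of funds",
               "Print a report summary of recent transactions"],
    "pre_conditions": ["Card is valid and registered",
                       "Account balance is at least 100.00"],
    "input_data": {"pin": "1234", "withdrawal_amount": 50.00},
    "expected_results": [
        "50.00 is dispensed",
        "Account is debited by exactly 50.00",
        "Printed summary lists the 50.00 withdrawal",
    ],
}

def is_ready_to_run(tc: dict) -> bool:
    """A case is runnable once it has data, pre-conditions and expected results."""
    return bool(tc["input_data"] and tc["pre_conditions"] and tc["expected_results"])

print(is_ready_to_run(test_case_1))  # → True
```

Note how the logical requirements from the test design now carry real data: the design says "a valid withdrawal", while the case pins it to 50.00 against a known balance.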
IEEE 829 - Test Procedure Specification
The test procedures are developed from both the test design and the test case specification. The procedure document describes how the tester will physically run the test, the physical set-up required, and the procedure steps that need to be followed. The standard defines ten procedure steps that may be applied when running a test.
This document is a handover document and provides details of the previous stage of testing.
Similar to a release note this provides a list of what is being delivered and shows any changes and new
items contained. It includes the person responsible for each item, its physical location and its status.
The schedule of what test cases are run and when, is defined in the Test Plan. The test results are recorded
in the test log, and in test incident reports.
The Test Log records the details of which Test Cases have been run, the order in which they were run, and the results of each test. The result is either that the test passed, meaning that the actual and expected results were identical, or that it failed, meaning there was a discrepancy. If there is a discrepancy, then one or more Test Incident Reports are raised or updated, and their identities recorded in the Test Log.
The Test Log is important as it allows progress of the testing to be checked, as well as providing valuable
information for finding out what caused an incident. If an incident is a coding fault, the fault may have
occurred not in the Test Case that failed but in one that was run previously. Thus the sequence of the tests
enables the fault to be found.
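The point about test sequence can be sketched as a chronological log: walking backwards from a failed case gives the tests run before it, which are the candidates for where a coding fault was actually introduced. The case IDs and incident number are illustrative:

```python
# A chronological test log: entries keep their run order, and a failed case
# links to the Test Incident Report raised for it. Values are illustrative.
test_log = [
    {"case": "TC1", "result": "pass", "incident": None},
    {"case": "TC2", "result": "pass", "incident": None},
    {"case": "TC3", "result": "fail", "incident": "IR-101"},
]

def cases_run_before_failure(log):
    """Return the cases executed before the first failure.

    Because the log is chronological, these earlier cases are where a fault
    may have been introduced even though a later case exposed it.
    """
    for i, entry in enumerate(log):
        if entry["result"] == "fail":
            return [e["case"] for e in log[:i]]
    return []

print(cases_run_before_failure(test_log))  # → ['TC1', 'TC2']
```

This is why the log must record order, not just outcomes: without the sequence, the investigation of IR-101 would have no trail back to TC1 and TC2.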
This report documents any event that requires subsequent investigation. An incident should be raised when there is an unexpected result or any unexpected behaviour during testing. At this point it may not always be clear whether there is a bug or fault in the software, since incidents can occur as a result of configuration errors, faults in the software, faults in the requirements, and incorrect expected results recorded in the test case.
The report consists of all details of the incident such as actual and expected results, when it failed, and any
supporting evidence that will help in its resolution. The report will also include, if possible, an assessment of
the impact upon testing of an incident.
Eventually testing will be completed according to the criteria specified in the Test Plan. This is when the
success or failure of the system is decided based on the results. The Test Summary records this
information.
The summary provides the results of the designated testing activities and an evaluation of these results. Moreover, the summary provides an overall view of the testing and the quality of the software.
http://searchsoftwarequality.bitpipe.com/tlist/Software-Testing.html
Here is another real-world sample weekly status report:
This week:
Done:
1. Did this
2. Did that
Pending:
1. Do this
2. Do that
3. Was not able to accomplish the thing I needed to do because of system problems
Software:
Enterprise Build 21-22
Build Release
somecompany.com
Hardware:
Current test machine: Windows 2000 (NTFS);
64 MB RAM, 4 GB, 2 partitions (C:, D:)
Laptop COMPAQ: Windows 2000 / '98 (32-bit FAT) dual-boot;
64 MB RAM, 4 GB, 2 partitions (C:, D:)
Types of Test:
Manual Testing (with and without test cases)
Regression Testing
Operating Systems:
Windows 2000 on Test Machine (since sept. 11)
Windows '98 Compaq Presario 1235
Description:
· Tested the new build 12 using the NT and '98. Did regression testing and bug
verification.
· Tested Enterprise Build Released Version on Windows '95
· Worked on Enterprise Build 21-22 using Win 2000 on MS Outlook and Lotus
Future Plans:
Interested in new projects.