TEST SUMMARY REPORT
VERSION 1.0
This template was created to help departments develop their project plans more easily. The Department of Technology, Consulting and Planning Division, created this template based on its experience, combining industry best practices with decades of work on California state information technology projects. It is structured so that a department can complete the information related to its project without having to write background information about the discipline. A department may use as much or as little of the template as it wishes.
Template Instructions:
- Instructions for completing this template, written for the author of the project plan, are enclosed in brackets [ ] and the text is italicized and bolded.
- Examples are provided as a guideline to the type of information presented in each section, and the text is italicized.
- Boilerplate standard language for each section is written in the document font and may be used or modified as necessary.
- A department's project-specific information goes within the angle brackets << >>.
- Informational text for the person who has to create the plan is italicized within square brackets [ ] and includes background information, explanation, rationale, etc.
DOCUMENT HISTORY

DOCUMENT APPROVAL HISTORY

Prepared By:
Reviewed By:
Approved By: << The designated responsible person(s) specified in the organization's test policy and strategies or project approves the document. Typically, the Test Manager is the designated approver. Insert name(s) here and have them sign it. >>

DOCUMENT REVISION HISTORY

DATE         DOCUMENT VERSION   REVISION DESCRIPTION   AUTHOR
02/06/2015   1.0                Initial Version        J. Fong
TABLE OF CONTENTS
1. INTRODUCTION
1. INTRODUCTION
The Test Summary Report summarizes the results of the testing activities, provides an
evaluation based on the results, advises on the product’s release readiness, and documents
any known product shortcomings. The report allows the Test Manager to summarize the testing and to identify the limitations of the software and the likelihood of failure. There should
be a Test Summary Report that corresponds to every test plan. In essence, the Test
Summary Report is an extension of the test plan and serves to “close the loop” on the plan.
This document can be used during change control to summarize testing efforts in
preparation for production deployment.
This document is organized into two components: Test Summary Report template and
Appendix (i.e., Test Measures (e.g., charts, graphs]).
TEST COMPLETION EVALUATION
[Specify the extent to which testing met the specified test completion criteria against the Master Test Plan and explain which criteria were not met.
Provide an overall assessment of the testing conducted, including the depth and breadth of the testing process, based upon the system test documents, test plan, test cases, and incidents.
Document any testing inclusions and/or exclusions and the reason for the inclusion and/or exclusion (e.g., features not covered or tested).
Document any assumptions and/or limitations encountered during testing.]
Example: Additional test cases are needed to increase test coverage due to problems encountered while executing tests for system functionality XYZ.
FACTORS THAT BLOCKED PROGRESS
[Specify the factors which delayed or impeded the progress of the testing.
If this section does not apply, enter N/A.]
Example: The scheduled Cycle 1 testing extended beyond the planned completion date due to test resource absences.
TEST MEASURES
[Specify the measures applied to track the test progress and status. Example: measurements on Test Cases, Defects, Incidents, Test Coverage, Activity Progress, Test Effectiveness, and Resource Consumption.
Document detailed test measurements using the charts in the Appendix.]
TEST DELIVERABLES & REUSABLE ASSETS
[Specify the test deliverables and artifacts produced as a result of the test effort.
Indicate the reusable assets and their storage location for easy accessibility.]
LESSONS LEARNED
[After collectively meeting with the team to reflect on the completion of the test efforts, specify testing strengths and opportunities for improvement from a project, product, and process perspective.]
FINAL RECOMMENDATION
[Summarize the final recommendation on the application's readiness for the next Test Level, taking into consideration the Test Level's entrance and exit criteria (e.g., System Test to Acceptance Test, Acceptance Test to Production).
Consider all section results when providing the final recommendation: Test Performed, Deviation (variances), Risk
APPROVALS
APPENDIX

The Appendix provides three examples for measuring test progress: Test Effort Summary, Test Coverage, and Defect Report. The following tables may be referenced as support data within the Test Measures section of the Test Summary Report and could be illustrated in the form of graphs or charts.
Examples
Test Effort Summary
The Test Effort Summary table provides the test results for a particular build or test cycle relative to the test cases that were designed and executed. For each build or test cycle, the table captures the number and percent of test cases in various states (e.g., passed, failed, stopped) and the number of defects found. For example, if a test cycle revealed a high number of failed or stopped test cases with many open defects, the Test Manager may use that information to determine the root cause of the problem and establish test remediation efforts.
The table below summarizes the overall test results for the builds that were tested for <<
software under test >> during << time period >>.
BUILD / CYCLE       DATE         PASSED (NO. / %)   FAILED (NO. / %)   NO TEST OR STOPPED (NO. / %)   DEFECTS FOUND
Build 1 / Cycle 1   << Date >>
Build 2 / Cycle 2   << Date >>
Build 3 / Cycle 3   << Date >>
Final Build         << Date >>
Note: Each new build will cause another execution of the test cases. This re-execution is called a cycle. It is also possible to have a cycle without a new build (e.g., modified test cases would cause this to occur). The Test Effort Summary should track both cycles and builds.
1 Reason for the Test Case not being Tested, or Reason why the Test Case Stopped during execution.
[The table below documents the test cases that were not tested or stopped and the
reasons why the test cases were not executed. The test case ID and Description entries
below should reflect a detailed account of the number of “No Test or Stopped Test
Cases” as specified in the Test Effort Summary table above.
Specify the Test Case ID, brief description, and the reason why the test case was not
tested or stopped.]
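Example: the per-build/cycle counts and percentages captured in the Test Effort Summary table above can be tallied from raw test case results. The following Python sketch is illustrative only; the state labels and result data are hypothetical and not part of the template.

from collections import Counter

def summarize_cycle(results):
    # Tally test case states for one build/cycle and compute the
    # percentage of test cases in each state (e.g., passed, failed,
    # stopped, no test). The state labels here are hypothetical.
    counts = Counter(results)
    total = len(results)
    return {state: (count, round(100.0 * count / total, 1))
            for state, count in counts.items()}

# Hypothetical Cycle 1 data: 10 test cases executed.
cycle_1 = ["passed"] * 7 + ["failed"] * 2 + ["stopped"]
print(summarize_cycle(cycle_1))
# -> {'passed': (7, 70.0), 'failed': (2, 20.0), 'stopped': (1, 10.0)}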
Test Coverage
The table summarizes the test coverage criteria used in determining the testing of systems, subsystems, logic, functions, or features. If the project is using a tool, use the tool's reporting capabilities to report the test coverage. Otherwise, a Test Coverage Matrix could be created using either Excel or Word. Additionally, a Requirements Test Coverage Matrix template may be used to track and report requirements coverage.
2 The Test Coverage will differ depending on the Test Level:
- Integration Test – Logic Coverage
- Function Test – Function or Feature Coverage (e.g., Requirements, Use Case, Flow within the Use Case, Change Control)
- System Test – Function or Feature Coverage (e.g., Requirements, Use Case, Flow within the Use Case, Change Control)
- Performance Test – Function or Feature Coverage (e.g., Functional and Non-Functional Requirements, Use Case, Flow within the Use Case, Change Control)
- User Acceptance Test – Function or Feature Coverage (e.g., Requirements, Use Case, Flow within the Use Case, Change Control)
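Example: a Requirements Test Coverage Matrix can be represented as a simple mapping from requirements to the test cases that exercise them. The following Python sketch is illustrative only; the requirement and test case IDs are hypothetical placeholders.

# Hypothetical requirement -> covering test cases mapping.
coverage = {
    "REQ-001": ["TC-101", "TC-102"],
    "REQ-002": ["TC-103"],
    "REQ-003": [],  # no covering test case: a coverage gap
}

covered = sum(1 for cases in coverage.values() if cases)
total = len(coverage)
print(f"Requirements covered: {covered}/{total} "
      f"({100.0 * covered / total:.0f}%)")
for req, cases in coverage.items():
    print(req, "->", ", ".join(cases) if cases else "NOT COVERED")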
Defect Report
This table provides a high-level overview of the application's quality in terms of defect quantities, types, severity levels, and final states. The defect quantities should be compared to the exit criteria stated in the Master Test Plan or level test plan to verify whether the number of defects by type and severity level is appropriate.
Note: The Severity Level of Defect values (e.g., Critical, Major, Average, Minor) are examples
and may vary depending on the project. The appropriate severity level categories for the
project should be inserted in the “Severity Level of Defect” column.
If a Defect Tracking System tool is available, generate the graphical reports and paste the
information here.
3 The open defect may have been postponed, to be fixed in a future planned release.
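Example: the comparison of open defect counts against the exit criteria can be illustrated with a short Python sketch. The defect data, severity labels, and thresholds below are hypothetical; actual values would come from the project's Defect Tracking System and the Master Test Plan.

from collections import Counter

# Hypothetical open defects by severity.
open_defects = ["Critical", "Minor", "Minor", "Average"]

# Hypothetical exit criteria: maximum open defects allowed per severity.
exit_criteria = {"Critical": 0, "Major": 0, "Average": 2, "Minor": 5}

counts = Counter(open_defects)
for severity, limit in exit_criteria.items():
    found = counts.get(severity, 0)
    status = "meets criteria" if found <= limit else "exceeds limit"
    print(f"{severity}: {found} open (limit {limit}) -> {status}")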