Sample Master Test Plan
Version: 1.0
Prepared for
1 Introduction
1.1 Purpose
1.2 Scope
1.2.1 Product Description
1.2.2 The Current Product Version
1.2.3 Updating the Product
1.2.4 High-Level Product Development Objective
1.3 Test Plan Objectives
2 Test Items (Functions)
2.1 Client Application
2.2 Quick Help Testing
2.3 License Key
2.4 Security
3 Software Risk Issues
3.1 Schedule
3.2 Technical
3.3 Management
3.4 Personnel
3.5 Requirements
4 Functions to Be Tested
5 Functions Not to Be Tested
6 Test Approach
6.5 Regression Testing
7 Test Strategy
7.1 System Test
7.2 Performance Test
7.3 Security Test
7.4 Automated Test
7.5 Stress and Volume Test
7.6 Recovery Test
7.7 Documentation Test
7.8 Beta Test
7.9 User Acceptance Test
8 Entry and Exit Criteria
8.1 Test Plan
8.1.1 Test Plan Entry Criteria
8.1.2 Test Plan Exit Criteria
8.1.3 Suspension and Resumption Criteria
8.2 Test Cycles
8.2.1 Test Cycle Entry Criteria
8.2.2 Test Cycle Exit Criteria
9 Deliverables
9.1 Test Evaluation Summaries
9.2 Incident Logs and Change Requests
10 Environment Requirements
10.1 Base System Hardware
10.2 Base Software Elements in the Test Environment
10.3 Productivity and Support Tools
11 Responsibilities, Staffing and Training Needs
11.1 People and Roles
11.2 Staffing and Training Needs
12 Test Schedule
13 Potential Risks and Contingencies
14 Control Procedures
14.1 Reviews
14.2 Bug Review Meetings
14.3 Change Request
14.4 Defect Reporting
15 Documentation
16 Iteration Milestones
17 Management Process and Procedures
17.1 Problem Reporting, Escalation, and Issue Resolution
17.2 Approval and Signoff
1 Introduction
1.1 Purpose
The purpose of this Software Quality Assurance (SQA) Plan is to establish the goals,
processes, and responsibilities required to implement effective quality assurance
functions for the [Project Name] Improvement project.
This [Project Name] Improvement Software Quality Assurance Plan provides the
framework necessary to ensure a consistent approach to software quality assurance
throughout the project life cycle. It defines the approach that will be used by the
Software Quality (SQ) personnel to monitor and assess software development processes
and products to provide objective insight into the maturity and quality of the software.
The [Project Name] products, processes, and services will be systematically
monitored and evaluated to ensure they meet requirements and comply with
[COMPANY NAME] and [Project Name] policies, standards, and procedures, as well as
applicable Institute of Electrical and Electronics Engineers (IEEE) standards.
The overall purpose of this Master Test Plan is to gather all of the information necessary
to plan and control the test effort for testing the [Project Name] application. It
describes the approach to testing the software, and will be the top-level plan used by
testers to direct the test effort.
This plan is designed to create clear and precise documentation of the test methods and
processes that [COMPANY NAME] will use throughout the course of the [Project Name]
system verification testing.
This plan covers SQA activities throughout the formulation and implementation phases
of the [Project Name] mission. SQA activities will continue through operations and
maintenance of the system.
This documentation of the test methods and processes will serve as the basis for
ensuring that all major milestones and activities required for effective verification
testing can be accomplished efficiently and successfully. This Master Test Plan will be
modified and enhanced as required throughout the verification testing engagement.
1.2 Scope
The scope of this quality assurance effort is to validate the full range of activities
related to the functionality of the flagship product of the [COMPANY NAME], the
[Project Name] Course, as it undergoes redesign and rebuilding.
This test plan describes the unit, subsystem integration, and system level tests that will
be performed on components of the [Name] application. It is assumed that, prior to
testing, each subsystem will have undergone an informal peer review and that only code
that has successfully passed peer review will be tested.
Unit tests will be performed using test driver programs that carry out boundary
checking and basic black-box testing.
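As an illustration, such a test driver can be sketched in Python; the unit under test, `score_answer`, is a hypothetical stand-in for a [Project Name] component, not part of the actual product:

```python
import unittest

def score_answer(selected, correct, max_points=1):
    """Hypothetical unit under test: award points for a selected answer."""
    if not isinstance(selected, int) or not isinstance(correct, int):
        raise TypeError("answer indices must be integers")
    return max_points if selected == correct else 0

class ScoreAnswerDriver(unittest.TestCase):
    """Test driver performing boundary checking and basic black-box tests."""

    def test_correct_answer_scores_full_points(self):
        self.assertEqual(score_answer(2, 2), 1)

    def test_wrong_answer_scores_zero(self):
        self.assertEqual(score_answer(1, 2), 0)

    def test_boundary_rejects_non_integer_input(self):
        with self.assertRaises(TypeError):
            score_answer("2", 2)
```

The driver can be run with `python -m unittest` against the module that contains it.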
The scope of this test effort outlines the quality assurance methodology, process, and
procedures used to validate system functionality and the user's ability to navigate
through the four major components of the [Project Name] Course solution:
The Prep Course
Practice Exams
The [PRODUCT NAME]Exam
Progress Tracking
The functional testing will include a Prep Course creation and maintenance tool (Admin
Tool) for use by internal [COMPANY NAME] staff to facilitate creating new Prep Course,
Practice Exam, and [PRODUCT NAME]Exam question and answer databases, as well as
testing the ability to maintain those question and answer databases by adding new
questions, correcting errors in existing questions, modifying responses, etc. This
internal tool will support the maintenance of multiple versions of the software, and
should be capable of generating a database suitable for distribution with the software
itself. Finally, the test effort will include an internal tool which allows [COMPANY NAME]
staff to review results that are uploaded by users for the Prep Course, Practice Exams,
and [PRODUCT NAME]Exam.
The [Project Name] Course and [PRODUCT NAME]Exam are offered in US,
Canadian, UK, and International editions, and a new revision of each is released
annually.
Upon successful completion of the Prep Course, the user is then provided a second
key to unlock the [PRODUCT NAME]Exam. The user may submit the results of the
Exam by exporting an encrypted text file from the software and then emailing that
file to the [COMPANY NAME] for grading.
1.2.3 Updating the Product
The software has proven to be a very stable and reliable product despite the fact
that it was developed in 1999; however, the product is showing its age and needs
to be re-written to bring it up to current standards. Since the product has proven to
be very successful and has received a great deal of positive feedback from users, it
is the intent of this project to keep most of the features of the existing version
intact. Additionally, several new features and enhancements will be added to
improve the customer experience with the product, help users work through the
Prep Course more efficiently, as well as to enhance the installation, maintenance,
and administration of the software. For this initial release, the product will continue
to be offered as an installable product on Windows-based PCs and will not require
Internet access, with the exception of a few specific tasks (such as license key
verification, transmitting exam results to the [COMPANY NAME], and receiving
updates and patches).
• Provide a feature set that helps the user proceed through the certification process
faster and more efficiently, from the time the product is purchased to the time the
exam results are submitted to the [COMPANY NAME] for evaluation.
• Outlines and defines the overall test approach that will be used.
• Defines the types of security threats and vulnerabilities against which each
exam system will be tested.
• Serves as a foundation for the development of Test Plans and Test Cases.
• Identifies the motivation for and ideas behind the test areas to be covered.
• Identifies the required resources and provides an estimate of the test efforts.
• Defines the activities required to prepare for and conduct System, Beta, and
User Acceptance testing.
2 Test Items (Functions)
The testing effort will be concentrated on the following functions of the application:
New functionality will be implemented to enhance this process. The users will be
required to complete and electronically submit a License Key request form to the
[COMPANY NAME] Certification Team. The [COMPANY NAME] Certification Team will
electronically generate and email the license key back to the requestor.
This new functionality will be included in the scope of the testing effort.
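As a hedged illustration of how such a key might be generated automatically, the sketch below derives a key from the request form fields; the secret, the key format, and the function name are assumptions for this example, not the [COMPANY NAME] Certification Team's actual scheme:

```python
import hashlib
import hmac

# Illustrative secret; the real signing key would be held by the
# [COMPANY NAME] Certification Team.
SECRET = b"certification-team-secret"

def generate_license_key(requestor_email: str, product_edition: str) -> str:
    """Derive a deterministic license key from License Key request form fields."""
    message = f"{requestor_email}|{product_edition}".encode()
    digest = hmac.new(SECRET, message, hashlib.sha256).hexdigest().upper()
    # Format as five groups of five characters: XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
    return "-".join(digest[i:i + 5] for i in range(0, 25, 5))
```

A test case can then verify that the same request always yields the same key and that different requests yield different keys.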
2.4 Security
Each user will need a UserId and password to log in to the system. The UserId and
password will be created by the user during the registration process. The system should
validate that the UserId and password both meet the correct format standard. Once the
registration form has been electronically submitted, the system will notify the user that
the requested identification information has been accepted and the request has been
granted. The system will require users to change their password every 30 days.
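These checks can be sketched as follows; the specific format rules (UserId length and characters, password strength) are assumptions for illustration and would come from the actual [Project Name] format standard:

```python
import re
from datetime import date, timedelta

# Format rules below are illustrative assumptions, not the actual
# [Project Name] standards.
USERID_RE = re.compile(r"^[A-Za-z][A-Za-z0-9]{5,19}$")   # letter first, 6-20 chars
PASSWORD_RE = re.compile(r"^(?=.*[A-Za-z])(?=.*\d).{8,}$")  # letter + digit, 8+ chars
PASSWORD_MAX_AGE = timedelta(days=30)

def validate_credentials(user_id: str, password: str) -> bool:
    """Check that the UserId and password both meet the format standard."""
    return bool(USERID_RE.match(user_id) and PASSWORD_RE.match(password))

def password_expired(last_changed: date, today: date) -> bool:
    """Enforce the 30-day password change requirement."""
    return today - last_changed > PASSWORD_MAX_AGE
```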
3.1 Schedule
The schedule for each phase is very aggressive and could affect testing. A slip in the
schedule in one of the other phases could result in a subsequent slip in the test phase.
Close project management is crucial to meeting the forecasted completion date.
3.2 Technical
Since this is a new [Name] system, the old system can be used in the event of a failure.
We will run our tests in parallel with the production system so that there is no downtime
of the current system.
3.3 Management
Management support is required so that, when the project falls behind, the test schedule
is not squeezed to make up for the delay. Management can reduce the risk of delays by
supporting the test team throughout the testing phase and by assigning people with the
required skill sets to this project.
3.4 Personnel
Due to the aggressive schedule, it is very important to have experienced testers on this
project. Unexpected turnover can impact the schedule. If attrition does happen, every
effort must be made to replace the experienced individual.
3.5 Requirements
The test plan and test schedule are based on the current Requirements Document. Any
changes to the requirements could affect the test schedule and will need to be approved
by the CCB.
4 Functions to Be Tested
A Requirements Validation Matrix will “map” the test cases back to the requirements. See
Deliverables.
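The mapping can be sketched as a simple inverted index; the requirement and test-case IDs below are invented for illustration:

```python
# Each test case lists the requirement IDs it covers; the Requirements
# Validation Matrix inverts this into requirement -> covering test cases.
test_cases = {
    "TC-001": ["REQ-LICENSE-01"],
    "TC-002": ["REQ-LICENSE-01", "REQ-SEC-01"],
    "TC-003": ["REQ-SEC-02"],
}

def build_rvm(test_cases):
    """Invert the test-case mapping into requirement -> covering tests."""
    rvm = {}
    for tc_id, req_ids in test_cases.items():
        for req_id in req_ids:
            rvm.setdefault(req_id, []).append(tc_id)
    return rvm

def uncovered(rvm, all_requirements):
    """Requirements with no mapped test case still need coverage."""
    return sorted(set(all_requirements) - set(rvm))
```

Running `uncovered` over the full requirements list flags any requirement that no test case maps back to.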
5 Functions Not to Be Tested
This is to be determined.
6 Test Approach
Functional testing will be conducted during the entire application development life cycle by
the Quality Assurance Engineer and the System Business Analyst. At the conclusion of each
iteration, formal testing will be conducted by the business unit subject matter experts in two
cycles, while overall testing of the entire system will be conducted by the Quality Assurance
Engineer and the System Business Analyst in one final cycle.
The overall testing approach of the project will address and encompass the following rules
and processes:
Visual Studio Test Professional 2010 - An integrated testing toolset that delivers
a complete plan-test-track workflow for in-context collaboration between testers and
developers.
Since Microsoft Visual Studio 2010 Ultimate is a relatively new tool on the market, some
special training will be required. Due to the current time constraints of the project, this
training will be scheduled and completed at a later date.
However, the Development Team, including the SQA Engineer, has proactively begun
self-learning the tool and has integrated the Microsoft tools into its overall testing
strategy.
Testing Information will be collected and oriented toward the level of testing. For higher
levels, application and functional data will be collected and documented. For lower
levels, program, unit, module and build data will be collected and documented.
All test results will be documented in the Master Test Case spreadsheet, reviewed by
the business unit team leader, and forwarded to the Quality Assurance Engineer and the
System Business Analyst.
The test results will be evaluated and entered into Team Foundation Server through
Visual Studio Test Manager, where each automatable test case will be recorded and run
using the Coded UI Test feature in Visual Studio 2010 Ultimate.
Reports will be generated highlighting:
• The name of the assigned tester
• The number of test cases completed
• The number of passed and failed test cases
• The number of open issues (by priority)
Eventually, performance testing will be conducted to measure the system response
time, baseline load, and system stress capacity.
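A minimal sketch of how such report counts could be derived from the Master Test Case spreadsheet, exported as CSV; the column names are assumptions about the spreadsheet layout:

```python
import csv
import io
from collections import Counter

# Sample rows in the assumed Master Test Case spreadsheet layout.
SAMPLE = """test_case,tester,status
TC-001,A. Jones,Passed
TC-002,A. Jones,Failed
TC-003,B. Lee,Passed
"""

def summarize(csv_text):
    """Count completed, passed, and failed test cases for the status report."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    statuses = Counter(row["status"] for row in rows)
    return {
        "completed": len(rows),
        "passed": statuses["Passed"],
        "failed": statuses["Failed"],
    }
```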
Test configuration management will be handled by the Lab Manager tool, included with
the Microsoft Visual Studio 2010 Ultimate package.
Lab Manager can fully provision and ready multiple environments for testing so that
build scripts can explicitly target a particular lab configuration at build time. Lab
Management stores the environments as virtual machine images in a library of pre-built
images using System Center Virtual Machine Manager (SCVMM) to ensure teams always
begin their testing from a known configuration.
The following operating systems and browsers will be used in multiple combinations for
testing the [Project Name] application:
Operating Systems
Windows XP
Windows Vista
Windows 7
Browsers
Firefox 3.0
Internet Explorer 7.0
Internet Explorer 8.0
6.5 Regression Testing
Regression testing will be conducted at the conclusion of each testing iteration by the
SQA department and any assigned business unit subject matter experts. In most cases,
the testing will be based on the severity of defects detected.
The business requirements will be elicited and managed by the Systems Business
Analyst. Any elements in the requirements and design that do not make sense or are
not testable will be immediately documented and reported to the SBA, who will in turn
address these issues with the stakeholders for further clarification.
7 Test Strategy
The test strategy consists of a series of different tests that will fully exercise the [Name]
system. The primary purpose of these tests is to uncover the system's limitations and
measure its full capabilities. A list of the various planned tests and a brief explanation
follows below.
If any defects are found that seriously impact the test progress, the QA manager
may choose to suspend testing. Criteria that will justify test suspension are:
Hardware/software is not available at the times indicated in the project schedule.
Source code contains one or more critical defects that seriously prevent or
limit testing progress.
Assigned test resources are not available when needed by the test team.
9 Deliverables
10 Environment Requirements
This section presents the non-human resources required for the Test Plan.
System Resources

Resource               Quantity   Name and Type
Application Server     1
  - CPUs
  - Memory
  - Hard Disk 1
  - Hard Disk 2
  - Server Name
  - IP Address
Test Development PCs   TBD
This section outlines the personnel necessary to successfully test the [Name] application.
Since staff size is fixed, these numbers may change.
[People and Roles table: for each role, the table lists the number of full-time roles
allocated and the role's responsibilities. Recorded responsibilities include logging
results and documenting incidents; the remaining cells are left for the project to
complete.]
Staffing is fixed for the duration of this project. It is likely most of the staff will
assume some testing role.
12 Test Schedule
The overall risks to the project with an emphasis on the testing process are:
• The test schedule and development schedule could slip by a corresponding number
of days.
14 Control Procedures
14.1 Reviews
The project team will perform reviews for each phase (e.g., Requirements Review,
Design Review, Code Review, Test Plan Review, Test Case Review, and Final Test
Summary Review). A meeting notice, with related documents, will be emailed to each
participant.
15 Documentation
The following documentation will be available at the end of the test phase:
Test Plan
Test Cases
Defect reports