Master Test Plan: Project 199 New Website Development For UNITED
DOCUMENT CONTRIBUTORS
The following individuals contributed to this Master Test Plan: XXXXXX (Project Manager) XXXXX (QA)
TABLE OF CONTENTS
INTRODUCTION
  1.1. Document Purpose
  1.2. Target Audience
  1.3. Testing Objectives
  1.4. Scope
    1.4.1. Functions To Be Tested
    1.4.2. Application Functions Not To Be Tested
  1.5. References
  1.6. Dependencies
  1.7. Glossary of Terms
TEST STRATEGY
  1.8. Outstanding Issues, Assumptions, Risks & Contingencies
    1.8.1. Testing Risks
    1.8.2. Strategy To Mitigate Testing Risks
  1.9. Requirements Analysis/Validation
  1.10. General Strategy/Approach
  1.11. Test Environment/Platform
  1.12. Test Data
  1.13. Test Documents
    1.13.1. Test Case Dependencies
  1.14. Change Control Procedures
  1.15. Build Deployment Strategy
  1.16. Defect Reporting
    1.16.1. Defect Fields
  1.17. Defect Management Process
SCHEDULE
  1.18. Milestones/Target Dates
TESTING TEAM
QUALITY ASSURANCE LIFE CYCLE (QALC)
TEST PHASES INCLUDED IN THIS RELEASE
TEST PHASES NOT INCLUDED IN THIS RELEASE
TEST PLAN APPROVAL
Project Manager: XXXX    Date Created: 10/11/05    Date Revised: 8/27/09
INTRODUCTION
1.1. Document Purpose

This document identifies the areas that will be included within the scope of testing for implementing New Website Development for UNITED. It describes the testing approach used by the Internal-Facing Applications Team to ensure a successful implementation. This plan also defines who is responsible for overseeing the testing, as well as those expected to participate in the different testing phases. Additionally, the plan addresses the environment in which the test initiative is to occur and indicates the Entrance and Exit criteria for each type of testing.

1.2. Target Audience

This plan impacts Wells Fargo Financial Advisors, IT, and Marketing. Financial Advisors currently use both Comac and ADP (through Onyx) to fulfill literature requests. The Wells Fargo Web team will play a role in converting the Comac user interface into an Onyx- and ADP-supported application and process. In addition, the Onyx team will be responsible for enhancements and updates to Onyx so literature orders can be stored and viewed in a single application. The Marketing team may be responsible for updates to literature products in the Onyx Product Administration tool.

1.3. Testing Objectives

The testing objectives of this plan are to verify that:
- The system satisfies the project's Business and User Requirements.
- Information within the fields of the application is correct, based on user inputs.
- Information within the fields of external systems is correct during and following the completion of a user transaction or other user- or system-initiated event.
- Major system defects are identified, reported, and managed.
1.4. Scope

1.4.1. Functions To Be Tested

UC #  | Function                               | Area/Type
15    | View Indicator Onyx Dashboard          | User Interface
1     | View & Display Recommendations         | User Interface
2     | Create New Recommendation              | User Interface
3     | Clone a Recommendation                 | User Interface
4     | Revise a Recommendation                | User Interface
5     | Record Recommendation                  | User Interface
6     | Record Survey Results                  | User Interface
7/8   | Calculate & Display Score & Portfolio  | User Interface
9     | Record and Save Portfolio              | User Interface
10    | Approve Recommendations                | User Interface
11    | Distribute Client Output               | Literature
19    | Select Portfolio Kit                   | Literature
12    | Resend Client Output                   | User Interface & Literature
18    | Cancel Client Output                   | User Interface & Literature
13    | Print Client Rec                       | User Interface
14    | Print Client Q & A (Pretty Print)      | User Interface
24    | Link Account to Recommendation         | User Interface
Batch Processing (to integrate into System Test):

16    | Inactivate Recommendations             | User Interface
17    | Inactivate Survey Answer Set           | User Interface
23    | Update Survey User Interface           | Stability
21    | Transfer data to Onyx Catalog          | Reporting

Integration Testing w/ADP:

20    | Select data and transmit to ADP        | Client Output
25    | Client Output documents correct        | Client Output

Outside of WFRS:

22    | BFDS Updates - Mark Trades as solicited | BFDS
1.6. Dependencies

Completion of testing is dependent on the successful delivery of the One-Time Recommendation development components and Requirements. Additionally, the QA team will require support from the following project areas:
- BAs for Requirements validation and testing
- IT for Unit Testing and assistance with setting up the testing environment(s), platform(s), and test data
- DBAs for setup of the test data and environment(s)
- Members of the Business Team to participate in testing and defect resolution
- The PM for schedule changes or change-control issues
1.7. Glossary of Terms

For a Glossary of terms, see \\FILESHARE\IT$\Projects\Onyx 2005\Wells Fargo Glossary.doc
TEST STRATEGY
1.8. Outstanding Issues, Assumptions, Risks & Contingencies

See the latest version of the One-Time Recommendation Project Definition document (located in the G:\Projects\Onyx 2005\One time Recommendation folder).

1.8.1. Testing Risks

- Lack of personnel resources when testing is to begin
- Lack of availability of required hardware, software, data, or tools
- Late delivery of the software, hardware, or tools
- Delays in training on the application and/or tools
- Changes to the original requirements or design
- Vague or unclear requirements
- Requirements that cannot be tested
1.9. Requirements Analysis/Validation

Requirements documentation can be found at G:\Projects\Onyx 2005\One time Recommendation\02 Requirements Analysis

1.10. General Strategy/Approach

Since One-Time Recommendation is a new application, the following tasks will be performed to help ensure a successful deployment:
- Extensive planning for project risk mitigation
- Careful coordination of test data, environments, test cycles, and training
- A phased approach to integration/system testing, to minimize "big bang" testing near the end of the project
- One or two dedicated QA team members will be assigned to testing and validating the system
- Since no pre-existing test cases exist, the primary BA will be asked to assist with test case identification, as well as with deciding which test cases should be included in each testing cycle (System Testing vs. Regression vs. User Acceptance, etc.)
1.11. Test Environment/Platform

The QA team will work with IT and the DBAs to determine the following:
- The testing environment/platform to be used
1.12. Test Data

The QA team will work with IT, DBAs, and the BAs to determine the following:
- Test accounts, test data, and data tables
- Any special collection requirements or specific ranges of data that must be provided

1.13. Test Documents

Relevant test documents for this project include:
1. The Master Test Plan
2. Component Test Plans
3. Use Cases
4. Test Cases
5. Testing Progress Reports
6. Error logs and execution logs
7. Problem reports and corrective actions
Project documents can be found at: G:\Projects\Onyx 2005\One time Recommendation\05 Testing

1.13.1. Test Case Dependencies

Prior to test case development, it is expected that the following exist:
- Use cases
- Screen Flow diagrams
- Functional requirements
- Technical design document
- Traceability matrices

Once these documents are signed off, changes to them will be treated as enhancements.

1.14. Change Control Procedures

The QA team will work with the PM to address and mitigate any requirement or system changes that impact testing timelines and deliverables. If a Change Request is approved, the following procedures will apply:
1. The Change Request is documented in Caliber and communicated to the QA team.
2. The BA will notify QA of the Requirement(s) to be created or changed.
1.16. Defect Reporting

Defects will be submitted and, in most cases, assigned to the developer corresponding to the feature or function where the defect was found. Reporting and tracking of defects will be done using PVCS Tracker; Mercury Quality Center may be used in conjunction with Tracker. Defects will be assigned a severity by the submitter, and defects will be reviewed as warranted by the project. Defect status reporting will be done on a regular basis at project status meetings. In general, the defect lifecycle will follow that of the following document: \\FILESHARE\IT$\transition projects\Transition Projects\001-PFPC to DST\000- Overall IT Program\004- Testing\Tracker procedures - HNW ONYX example.doc
1.16.1. Defect Fields

Field          | Values                                                      | Comments
Description    | (free text)                                                 | Include TIN, Login ID, Account, Fund, and OAO App ID as appropriate.
Severity       | Urgent, High, Medium, Low                                   | Urgent = a showstopper; must be fixed in the next build. High = can't go live with this. Medium = not High, not Low. Low = OK to go live with this.
*Status        | Open, Rejected, Ready For Retest, Reopen, Verified, Closed | Verified -> Closed: near the end of the project, the PM may need awareness of final fixes going in; also, post-go-live, we may want QA to verify again in production.
Assigned To    | <List of all project team members>                          |
Submitter      | <List of all project team members>                          | Useful for UAT or customer-reported issues.
Type           | Defect, Enhancement                                         |
Environment    | Dev, Test, Staging, Prod                                    |
Application    | <None>, OAO, OAA, VRU, SuperUser                            |
Date Submitted | <Date>                                                      | Defaults to today's date.
Module         | <None>, OAO MF, OAO 529, OAA Login/Registration, OAA MyPortfolio, OAA Trading, OAA Statements/eDocs, OAA Account Services, OAA MarketWatch, VRU Login/Registration, VRU Quotes, VRU Balances, VRU Transactions, VRU Other Information |
Reproducible   | Y, N                                                        |
Found In       | Dev, Test, Staging, Prod                                    |
Date Found     | <Date>                                                      |
1.17. Defect Management Process

- An appropriate message is to be added whenever an Assignee changes the Assigned To field or changes the SCR Status.
- Associated files shall reside in the folder G:\transition projects\Transition Projects\001-PFPC to DST\000- Overall IT Program\004- Testing\SWELL Tracker Attachments\, with the SCR # as the file name.
[Defect workflow diagram: Add a Defect -> (fix delivered in the next build) Status = Ready For Retest, Assigned To = <submitter> -> Passed Retest?]
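To make the lifecycle concrete, here is a minimal sketch of the statuses and transitions above as a small Python state machine. The transition table is an inference from the workflow diagram and the *Status field values, offered only as an illustration; it is not a PVCS Tracker or Quality Center configuration.

```python
# Illustrative defect-lifecycle sketch. Status names come from this plan;
# the transition table is inferred from the workflow diagram above and is
# not an export of the actual PVCS Tracker process.

ALLOWED_TRANSITIONS = {
    "Open": {"Rejected", "Ready For Retest"},
    "Rejected": {"Reopen"},
    "Ready For Retest": {"Verified", "Reopen"},  # Passed Retest? yes/no
    "Reopen": {"Ready For Retest"},
    "Verified": {"Closed"},  # near end of project, or post-go-live re-check
    "Closed": set(),
}

class Defect:
    def __init__(self, summary, severity, submitter):
        self.summary = summary
        self.severity = severity          # Urgent / High / Medium / Low
        self.submitter = submitter
        self.assigned_to = None
        self.status = "Open"

    def transition(self, new_status):
        if new_status not in ALLOWED_TRANSITIONS[self.status]:
            raise ValueError(f"illegal transition {self.status!r} -> {new_status!r}")
        self.status = new_status
        # Per the workflow: once a fix is in the next build, the defect is
        # marked Ready For Retest and assigned back to the submitter.
        if new_status == "Ready For Retest":
            self.assigned_to = self.submitter

if __name__ == "__main__":
    d = Defect("Cloned recommendation loses survey score", "High", "qa_analyst")
    d.transition("Ready For Retest")   # fix delivered in next build
    d.transition("Verified")           # passed retest
    d.transition("Closed")
    print(d.status, d.assigned_to)     # -> Closed qa_analyst
```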
SCHEDULE
1.18. Milestones/Target Dates

The One-Time Recommendation project plan can be found at: G:\Projects\Onyx 2005\One time Recommendation\01 Planning. As of 10/11, the testing deliverables/dates are listed below:
Task Name                                   | Duration   | Start    | Finish   | Deliverable
Test                                        | 49 days    | 11/01/05 | 01/12/06 |
  Update Master Test Plan                   | 1 day      | 11/02/05 | 11/02/05 |
  Create Test Cases/scripts                 | 21 days    | 11/10/05 | 12/12/05 |
    WFRS User Interface                     | 11 days    | 11/10/05 | 11/28/05 |
    Client Output/fulfillment               | 10 days    | 11/29/05 | 12/12/05 |
  Execute System testing                    | 41.38 days | 11/01/05 | 01/03/06 |
    WFRS User Interface                     | 11.13 days | 12/13/05 | 12/29/05 | Test Status Report
    Literature Fulfillment                  | 1.25 days  | 12/22/05 | 12/23/05 | Test Status Report
    Client Output                           | 1.38 days  | 12/30/05 | 01/03/06 | Test Status Report
    Complete Onyx System testing (milestone)| 0 days     | 01/03/06 | 01/03/06 |
    Reporting                               | 1 day      | 11/02/05 | 11/02/05 | Test Status Report
    BFDS                                    | 0.25 days  | 11/01/05 | 11/01/05 | Test Status Report
  Execute Integration (end-to-end) testing  | 12 days    | 12/22/05 | 01/10/06 | Test Status Report
  Execute User Acceptance testing           | 1 wk       | 01/03/06 | 01/10/06 | UAT Status Report
  Execute Regression testing                | 2 days     | 01/11/06 | 01/12/06 | Regression Status Report

Resources: Milena/Sanjeev
Note that any changes to this schedule could result in delays in delivery. Changes should be discussed before testing begins to ensure adequate resources and time exist to complete all testing by the milestone completion dates. Any approved changes to the testing schedule will be documented within the Project Plan rather than updating the approved test plan.
TESTING TEAM

Task                                                         | Team Members
Data Conversion/Data Creation/File loading to test databases | Developers
Unit Testing (white box)                                     | Developers and QA
Functional Testing (black box)                               | QA, BAs, and Business (where applicable)
System Testing                                               | QA, BAs, and Business (where applicable)
Integration Testing                                          | IT and QA
User Acceptance Testing                                      | QA, BAs, and Business team members
Regression Testing                                           | QA
QUALITY ASSURANCE LIFE CYCLE (QALC)

[QALC diagram: Onyx test cases are developed from the Use Case and Requirement documents; Change Requests feed test case updates, and execution results are reported as Test Status.]
TEST PHASES INCLUDED IN THIS RELEASE

1. Unit Testing

Definition: White Box testing to validate specific functions or code modules. Unit testing includes a variety of the following checks:
- Execution paths
- Error-handling paths
- Boundary conditions
- Inputs
- Outputs
- Computations
- Logical decision points

Participants: IT
Sources of Data: Code
Requirements To Be Validated: N/A
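To illustrate the kinds of checks listed above, the sketch below unit-tests a hypothetical scoring function. calculate_score and its 1-to-5 answer range are invented for this example; the real One-Time Recommendation scoring rules live in the design documents.

```python
# Hypothetical unit-test example. calculate_score is invented here purely
# to demonstrate input, output, boundary, and error-path checks.
import unittest

def calculate_score(answers):
    """Sum survey answers; each answer is expected to be an int in 1..5."""
    if not answers:
        raise ValueError("no survey answers supplied")
    for a in answers:
        if not 1 <= a <= 5:
            raise ValueError(f"answer out of range: {a}")
    return sum(answers)

class CalculateScoreTests(unittest.TestCase):
    def test_typical_inputs_and_outputs(self):
        self.assertEqual(calculate_score([3, 4, 5]), 12)

    def test_boundary_conditions(self):
        self.assertEqual(calculate_score([1]), 1)   # lower bound
        self.assertEqual(calculate_score([5]), 5)   # upper bound

    def test_error_handling_paths(self):
        with self.assertRaises(ValueError):
            calculate_score([])                     # empty input
        with self.assertRaises(ValueError):
            calculate_score([0])                    # below valid range

if __name__ == "__main__":
    unittest.main()
```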
2. Smoke Testing

Definition: Non-exhaustive software testing performed to ascertain that the most crucial functions of a program are functioning as expected.
Participants: QA
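A smoke pass can be as simple as confirming that the application's most crucial pages respond at all. The following sketch assumes a placeholder base URL and page list; the real test-environment addresses would be substituted.

```python
# Minimal smoke-test sketch. BASE_URL and CRUCIAL_PAGES are placeholders,
# not the actual addresses of the One-Time Recommendation application.
import urllib.request

BASE_URL = "http://test-server.example/onetimerec"            # placeholder
CRUCIAL_PAGES = ["/login", "/dashboard", "/recommendations"]  # placeholders

def smoke_test():
    """Hit each crucial page once; any non-200 or error fails the smoke."""
    failures = []
    for path in CRUCIAL_PAGES:
        try:
            with urllib.request.urlopen(BASE_URL + path, timeout=10) as resp:
                if resp.status != 200:
                    failures.append((path, f"HTTP {resp.status}"))
        except Exception as exc:
            failures.append((path, str(exc)))
    return failures

if __name__ == "__main__":
    problems = smoke_test()
    for path, problem in problems:
        print(f"SMOKE FAIL {path}: {problem}")
    print("Smoke test", "FAILED" if problems else "PASSED")
```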
3. Functional Testing

Definition: Testing performed at the modular or page level to validate that all requirements have been met.
Participants: QA, BAs
Entrance Criteria:
- Documentation of requirements is complete and signed off by the business.
- Test case development is complete for the first iteration of the testing cycle.
- All testing hardware must be in place.
Exit Criteria:
- Test case execution is complete for the testing iteration.
- Any defects have been documented in PVCS.
- Upon completion of the items above, functional components can be considered ready for the next phase of testing.
4. Integration Testing

Definition: Testing of combined parts of the application to determine whether they function together correctly.
Participants: QA
5. System Testing

Exit Criteria:
- Test case execution is complete for all documented requirements.
- All High and Medium priority test cases have been executed; Low priority test cases may wait until post-Live.
- All Showstopper and High priority reported defects have been resolved and retested.
- Where Medium and Low priority defects remain open, the Project Manager and Business Champion must sign off on the implementation risk as acceptable.
- Upon completion of the items listed above, system components can be considered ready for User Acceptance Testing (UAT).
6. Regression Testing

Definition: Regression testing will be performed as needed, following the defect control and code promotion process to be put in place.
Participants: QA
Entrance Criteria:
- Relevant defects have been fixed for the testing cycle and are ready for retesting
- Development for additional system functions is complete
- Build is complete and promoted to the test environment
Exit Criteria:
- Testing of all identified items is complete
- Test results have been recorded
- Defects have been closed or returned for fixes
7. User Acceptance Testing

Definition: While system testing verifies each documented requirement, User Acceptance Testing validates that the application's features and functions perform as expected by the business. UAT will begin toward the end of the system-testing phase. It is expected that the PM ...
Entrance Criteria:
- System Testing is complete on all core application components
- Most major defects have been resolved and Regression tested
- Test Lab or designated testing area is ready
Exit Criteria:
- Test case execution is complete for the testing iteration
- Executed test cases have been reviewed for completeness by QA
- Any defects have been documented in PVCS
8. Final Regression Testing

Definition: Upon completion of testing (100% passed test conditions), all projects will undergo a final round of Regression testing to make sure any code built to take care of outstanding defects did not break anything else. Upon completion of the final round of Regression testing, the application/code will be deemed ready for deployment to production. The final test status will be communicated to the PM and Business Champion, and the finalized testing documents will be submitted for sign-off.
Participants: QA
Entrance Criteria:
- UAT has been completed and signed off by the PM and Business Champion
Exit Criteria:
- Testing of all identified items is complete
- Test results have been recorded
- All final testing documents have been signed off by the PM and Business Champion
- System is determined to be ready for deployment to production
Work Products (Incl. Test Docs and Reports):
- All completed Test Plans from each test cycle
- Completed Test Cases
- Test Status Reports from each test cycle
- Defect Status Reports from each test cycle
- Final Defect Report listing any outstanding defects
TEST PHASES NOT INCLUDED IN THIS RELEASE

1. Performance Testing

Definition: At this time, no required response times or performance expectations for One-Time Rec have been documented. If needed, the following types of performance testing can be executed using tools such as Segue Silk Performer or Mercury LoadRunner:
- Load: Ensures the system functions properly beyond the expected maximum workload. Performance is expected to match that of corresponding systems currently in production.
- Stress: Identifies system defects due to low resources or competition for resources. Performance is expected to match that of corresponding systems currently in production.
Participants: IT, QA
Requirements To Be Validated: N/A
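For illustration of the load/stress distinction only, the sketch below drives a placeholder URL at two concurrency levels using a simple Python thread pool; an actual engagement would use the commercial tools named above, and the URL and user counts here are assumptions.

```python
# Generic load/stress driver sketch; not a Silk Performer or LoadRunner
# script. URL and user counts are placeholders for illustration.
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://test-server.example/onetimerec/dashboard"  # placeholder

def one_request(_):
    """Return the response time in seconds, or None on failure."""
    start = time.time()
    try:
        urllib.request.urlopen(URL, timeout=30).read()
        return time.time() - start
    except Exception:
        return None

def run(concurrent_users, total_requests):
    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(one_request, range(total_requests)))
    ok = [r for r in results if r is not None]
    if ok:
        print(f"{concurrent_users} users: {len(ok)}/{total_requests} ok, "
              f"avg response {sum(ok) / len(ok):.2f}s")
    else:
        print(f"{concurrent_users} users: all requests failed")

if __name__ == "__main__":
    run(concurrent_users=25, total_requests=100)    # load: beyond expected peak
    run(concurrent_users=200, total_requests=400)   # stress: force contention
```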
2. Automated Testing

Definition: There are no plans to automate testing prior to the implementation of One-Time Recommendation. However, once all test cases are documented and organized in Quality Center, automation should be considered soon after deployment for future Regression testing needs.
Participants: QA
Entrance Criteria:
- Relevant defects have been fixed for the testing cycle and are ready for retesting
- Development for additional system functions is complete
- Build is complete and promoted to the test environment
Exit Criteria:
- Testing of all identified items is complete
- Test results have been recorded
- Defects have been closed or returned for fixes
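If automation is pursued after deployment, one possible shape is a table mapping the documented test case IDs to scripted checks, so Regression runs report pass/fail per case. The case IDs and check functions below are placeholders invented for illustration; Quality Center integration is not shown.

```python
# Sketch of a post-deployment Regression runner. Case IDs and checks are
# placeholders; real checks would be scripted from the documented cases.

def check_login():
    return True   # placeholder for a scripted UI or API check

def check_create_recommendation():
    return True   # placeholder for a scripted UI or API check

REGRESSION_SUITE = {
    "TC-001 Login": check_login,
    "TC-002 Create New Recommendation": check_create_recommendation,
}

def run_regression():
    """Run every scripted check and collect PASS/FAIL/ERROR per case."""
    results = {}
    for case_id, check in REGRESSION_SUITE.items():
        try:
            results[case_id] = "PASS" if check() else "FAIL"
        except Exception as exc:
            results[case_id] = f"ERROR: {exc}"
    return results

if __name__ == "__main__":
    for case_id, outcome in run_regression().items():
        print(case_id, "-", outcome)
```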
TEST PLAN APPROVAL

Project Manager: ________________________  Date: ___________

Lead Developer:  ________________________  Date: ___________

QA Team:         ________________________  Date: ___________
This publication contains proprietary information not to be distributed outside of Wells Fargo & Co. This document, in whole or in part, must not be reproduced in any form without the express written permission of Wells Fargo & Co.