
Project 199 New Website Development for UNITED

Master Test Plan

Version: 4.0
Date Created: 10/11/05
Date Revised: 8/27/09
Project Manager: XXXX
Revised by: XXXXX UNITED QA

DOCUMENT CONTRIBUTORS
The following individuals contributed to this Master Test Plan:
- XXXXXX (Project Manager)
- XXXXX (QA)

DOCUMENT VERSION CONTROL


Version #   Date        Author   Change Description
1.0         10/11/05    XXXX     Initial draft
2.0         10/17/05    XXXX     Updated draft
3.0         11/10/05    XXXX     Updated draft
4.0         11/28/05    XXXX     Updated draft
5.0         12/01/05    XXXX     Updated draft


TABLE OF CONTENTS
INTRODUCTION
  1.1. Document Purpose
  1.2. Target Audience
  1.3. Testing Objectives
  1.4. Scope
    1.4.1. Functions To Be Tested
    1.4.2. Application Functions Not To Be Tested
  1.5. References
  1.6. Dependencies
  1.7. Glossary of Terms
TEST STRATEGY
  1.8. Outstanding Issues, Assumptions, Risks & Contingencies
    1.8.1. Testing Risks
    1.8.2. Strategy To Mitigate Testing Risks
  1.9. Requirements Analysis/Validation
  1.10. General Strategy/Approach
  1.11. Test Environment/Platform
  1.12. Test Data
  1.13. Test Documents
    1.13.1. Test Case Dependencies
  1.14. Change Control Procedures
  1.15. Build Deployment Strategy
  1.16. Defect Reporting
    1.16.1. Defect Fields
  1.17. Defect Management Process
SCHEDULE
  1.18. Milestones/Target Dates
TESTING TEAM
QUALITY ASSURANCE LIFE CYCLE (QALC)
TEST PHASES INCLUDED IN THIS RELEASE
TEST PHASES NOT INCLUDED IN THIS RELEASE
TEST PLAN APPROVAL
INTRODUCTION
1.1. Document Purpose
This document identifies the areas that will be included within the scope of testing for implementing New Website Development for UNITED. It describes the testing approach used by the Internal-Facing Applications Team to ensure a successful implementation. This plan also defines who is responsible for overseeing the testing, as well as those expected to participate in the different testing phases. Additionally, the plan addresses the environment in which the test initiative is to occur and indicates what the Entrance and Exit criteria will be for each type of testing.

1.2. Target Audience
This plan impacts Wells Fargo Financial Advisors, IT, and Marketing. Financial Advisors currently use both Comac and ADP (through Onyx) to fulfill literature requests. The Wells Fargo Web team will play a role in converting the Comac user interface into an Onyx- and ADP-supported application and process. In addition, the Onyx team will be responsible for enhancements and updates to Onyx so literature orders can be stored and viewed in a single application. The Marketing team may be responsible for updates to literature products in the Onyx Product Administration tool.

1.3. Testing Objectives
The testing objectives of this plan are to verify that:
- The system satisfies the project's Business and User Requirements.
- Information within the fields of the application is correct, based on user inputs.
- Information within the fields of external systems is correct during and following the completion of a user transaction or other user- or system-initiated event.
- Major system defects are identified, reported, and managed.

1.4. Scope
The scope of testing for One-Time Recommendation includes the following.

1.4.1. Functions To Be Tested

UC #   Function                                 Area/Type
15     View Indicator Onyx Dashboard            User Interface
1      View & Display Recommendations           User Interface
2      Create New Recommendation                User Interface
3      Clone a Recommendation                   User Interface
4      Revise a Recommendation                  User Interface
5      Record Recommendation                    User Interface
6      Record Survey Results                    User Interface
7/8    Calculate & Display Score & Portfolio    User Interface
9      Record and Save Portfolio                User Interface
10     Approve Recommendations                  User Interface
11     Distribute Client Output                 Literature
19     Select Portfolio Kit                     Literature
12     Resend Client Output                     User Interface & Literature
18     Cancel Client Output                     User Interface & Literature
13     Print Client Rec                         User Interface
14     Print Client Q&A (Pretty Print)          User Interface
24     Link Account to Recommendation           User Interface

Batch Processing (to integrate into System Test):
16     Inactivate Recommendations               User Interface
17     Inactivate Survey Answer Set             User Interface

User Interface Stability Update:
23     Survey                                   User Interface
21     Transfer data to Onyx Catalog            Reporting

Integration Testing w/ADP:
20     Select data and transmit to ADP          Client Output
25     Client Output documents correct          Client Output

Outside of WFRS (BFDS Updates):
22     Mark Trades as solicited                 BFDS

1.4.2. Application Functions Not To Be Tested

TBD

1.5. References
This Master Test Plan was developed with input from the following references (located in the G:\Projects\Onyx 2005\One time Recommendation folder):
- Project definition
- Project plan
- Project timeline
- User requirements
- Use cases
- Test cases

1.6. Dependencies
Completion of testing is dependent on the successful delivery of the One-Time Recommendation development components and Requirements. Additionally, the QA team will require support from the following project areas:
- BAs for Requirements validation and testing
- IT for Unit Testing and assistance with setting up the testing environment(s), platform(s), and test data
- DBAs for setup of the test data and environment(s)
- Members of the Business Team to participate in testing and defect resolution
- The PM for schedule changes or change-control issues

1.7. Glossary of Terms
For a glossary of terms, see \\FILESHARE\IT$\Projects\Onyx 2005\Wells Fargo Glossary.doc

TEST STRATEGY
1.8. Outstanding Issues, Assumptions, Risks & Contingencies
See the latest version of the One-Time Recommendation Project Definition document (located in the G:\Projects\Onyx 2005\One time Recommendation folder).

1.8.1. Testing Risks
- Lack of personnel resources when testing is to begin
- Lack of availability of required hardware, software, data, or tools
- Late delivery of the software, hardware, or tools
- Delays in training on the application and/or tools
- Changes to the original requirements or design
- Vague or unclear requirements
- Requirements that cannot be tested
1.8.2. Strategy To Mitigate Testing Risks
- The test schedule and development schedule will move out an appropriate number of days
- The number of tests performed will be reduced based on the results of Risk Analysis
- The number of acceptable defects will be increased
- Resources will be added to the test team
- The test team will work overtime
- The scope of the plan may be changed

1.9. Requirements Analysis/Validation
Requirements will be documented by the project Business Analyst using Caliber. Requirements documentation (Platform Specs) will be reviewed by the Business team to ensure each requirement is Complete, Accurate, Precise, Clear, Consistent, and Relevant. Traceability and gap analysis will be performed by the QA team to validate that each requirement is adequately addressed and covered by a test case. The traceability of requirements to test cases will be managed by TestDirector. The QA team will search the Requirements for words that signify a requirement isn't complete, such as "always", "some", "good", and "if" without a matching "else". Once the Requirements have been reviewed by the Business Analyst, IT, QA, and the Business team, the documents will be distributed to the Business team for final sign-off.

Requirements documentation can be found at G:\Projects\Onyx 2005\One time Recommendation\02 Requirements Analysis
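As an illustration, the ambiguous-word search can be scripted rather than done by hand. The following is a minimal Python sketch, assuming the requirements have been exported from Caliber as plain-text files; the folder name and word list are illustrative, not project specifics.

import re
from pathlib import Path

# Words that often signal an incomplete requirement (illustrative list).
AMBIGUOUS_WORDS = ["always", "some", "good"]

def scan_requirements(folder: str) -> None:
    """Print every line of every exported requirement file that looks incomplete."""
    for path in sorted(Path(folder).glob("*.txt")):
        for lineno, line in enumerate(path.read_text().splitlines(), start=1):
            for word in AMBIGUOUS_WORDS:
                if re.search(rf"\b{re.escape(word)}\b", line, re.IGNORECASE):
                    print(f"{path.name}:{lineno}: '{word}': {line.strip()}")
            # Flag an "if" with no matching "else" on the same line.
            if re.search(r"\bif\b", line, re.IGNORECASE) and not re.search(r"\belse\b", line, re.IGNORECASE):
                print(f"{path.name}:{lineno}: 'if' without 'else': {line.strip()}")

scan_requirements("requirements_export")

Flagged lines are candidates for review, not automatic defects; the QA team still judges each hit in context.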

1.10. General Strategy/Approach
Since One-Time Recommendation is a new application, the following tasks will be performed to help ensure a successful deployment:
- Extensive planning for project risk mitigation
- Careful coordination of test data, environments, test cycles, and training
- A phased approach to integration/system testing, to minimize "big bang" testing near the end of the project
- One or two dedicated QA team members will be assigned to testing and validating the system
- Since no pre-existing test cases exist, the primary BA will be asked to assist with test case identification, as well as with deciding which test cases should be included in each testing cycle (System Testing vs. Regression vs. User Acceptance, etc.)

1.11. Test Environment/Platform
The QA team will work with IT and the DBAs to determine the following:
- The testing environment/platform to be used
- Any special hardware, such as simulators, static generators, etc.
- Any specific versions of other supporting software that might be required
- Any restricted use of the system during testing
- Security/access for testers
- Requirements to set up a dedicated test area

1.12. Test Data
The QA team will work with IT, DBAs, and the BAs to determine the following:
- Test accounts, test data, and data tables
- Any special collection requirements or specific ranges of data that must be provided

1.13. Test Documents
Test documents relevant to this project include:
1. The Master Test Plan
2. Component Test Plans
3. Use Cases
4. Test Cases
5. Testing Progress Reports
6. Error logs and execution logs
7. Problem reports and corrective actions

Project documents can be found at: G:\Projects\Onyx 2005\One time Recommendation\05 Testing

1.13.1. Test Case Dependencies
Prior to test case development, it is expected that the following exist:
- Use cases
- Screen Flow diagrams
- Functional requirements
- Technical design document
- Traceability matrices
Once these documents are signed off, changes to them will be treated as enhancements.

1.14. Change Control Procedures
The QA team will work with the PM to address and mitigate any requirement or system changes that impact testing timelines and deliverables. If a Change Request is approved, the following procedures will apply:
1. The Change Request is documented in Caliber and communicated to the QA team.
2. The BA will notify QA of the Requirement(s) to be created or changed.
3. QA will notify the PM of any impacts to the testing timelines and deliverables.
4. After the Requirements are created or changed, QA will incorporate the changes into the affected Use Cases, Test Cases, and Test Plans.
5. Once the testing associated with the Change Request is complete, QA will notify the PM or BA.

1.15. Build Deployment Strategy
The Onyx Dev Support team owns and executes the build and migration process. QA requests deployments of code. Once a deployment is complete, QA smoke-tests it to assure stability. If a build is found to be unstable, a defined escalation process will be followed.
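As a concrete illustration, a post-deployment smoke test can be a short script that verifies the most critical pages respond before deeper testing starts. The Python sketch below is illustrative only; the URLs are placeholders, not actual project endpoints.

import urllib.request

# Critical pages that must respond after a deployment (placeholder URLs).
SMOKE_URLS = [
    "http://test-server/onyx/login",
    "http://test-server/onyx/dashboard",
]

def smoke_test() -> bool:
    """Return True only if every critical page answers with HTTP 200."""
    all_ok = True
    for url in SMOKE_URLS:
        try:
            with urllib.request.urlopen(url, timeout=10) as response:
                if response.status != 200:
                    print(f"FAIL {url}: HTTP {response.status}")
                    all_ok = False
        except OSError as error:
            print(f"FAIL {url}: {error}")
            all_ok = False
    return all_ok

if __name__ == "__main__":
    raise SystemExit(0 if smoke_test() else 1)

A nonzero exit code would signal an unstable build and trigger the escalation process.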

1.16. Defect Reporting
Defects will be submitted and, in most cases, assigned to the developer responsible for the feature or function where the defect was found. Reporting and tracking of defects will be done using PVCS Tracker; Mercury Quality Center may be used in conjunction with Tracker. Defects will be assigned a severity by the submitter, and defects will be reviewed as warranted by the project. Defect status will be reported on a regular basis at project status meetings. In general, the defect lifecycle will follow that of the following document: \\FILESHARE\IT$\transition projects\Transition Projects\001-PFPC to DST\000- Overall IT Program\004- Testing\Tracker procedures - HNW ONYX example.doc

1.16.1. Defect Fields

Tracker SCR fields (* = required):

*Summary
    Include TIN, Login ID, Account, Fund, and OAO App ID as appropriate.
*Description
*Severity
    Values: Urgent, High, Medium, Low.
    Urgent = a showstopper; must be fixed in the next build. High = can't go live with this. Medium = not High, not Low. Low = OK to go live with this.
*Status
    Values: Open, Rejected, Ready For Retest, Reopen, Verified, Closed.
    Verified -> Closed: near the end of the project, the PM may need awareness of final fixes going in; post-go-live, we may also want QA to verify again in production.
*Submitter
    Values: <list of all project team members>
*Assigned To
    Values: <list of all project team members>
Detected By
    Useful for UAT or customer-reported issues.
*Type of Defect
    Values: Defect, Enhancement.
*Environment Affected
    Values: Dev, Test, Staging, Prod.
*Project
    Values: <None>, OAO, OAA, VRU, SuperUser.
*Detected on Date
    Values: <Date>. Defaults to today's date.
Project Component
    Values: <None>, OAO MF, OAO 529, OAA Login/Registration, OAA MyPortfolio, OAA Trading, OAA Statements/eDocs, OAA Account Services, OAA MarketWatch, VRU Login/Registration, VRU Quotes, VRU Balances, VRU Transactions, VRU Other Information.
    If not one of these subcomponents, leave this <None>.
*Reproducible
    Values: Y, N.
Scheduled Implementation Date
    Values: <Date>.
Notes
    An appropriate message is to be added whenever an Assignee changes the Assigned To field or changes the SCR Status.
Attached Files
    Associated files shall reside in the folder G:\transition projects\Transition Projects\001-PFPC to DST\000- Overall IT Program\004- Testing\SWELL Tracker Attachments\ with the SCR # as the file name.
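To make the required fields concrete, the sketch below models an SCR as a small Python data structure using the severity and status values from the table. It mirrors the field list above for illustration only; it is not the actual PVCS Tracker schema.

from dataclasses import dataclass, field
from datetime import date

SEVERITIES = ("Urgent", "High", "Medium", "Low")
STATUSES = ("Open", "Rejected", "Ready For Retest", "Reopen", "Verified", "Closed")

@dataclass
class DefectRecord:
    """Required SCR fields, modeled after the table above (illustrative only)."""
    summary: str
    description: str
    severity: str
    submitter: str
    assigned_to: str
    type_of_defect: str = "Defect"        # Defect or Enhancement
    environment: str = "Test"             # Dev, Test, Staging, Prod
    status: str = "Open"                  # new defects start Open
    reproducible: bool = True             # Y/N in Tracker
    detected_on: date = field(default_factory=date.today)  # defaults to today's date

    def __post_init__(self) -> None:
        if self.severity not in SEVERITIES:
            raise ValueError(f"severity must be one of {SEVERITIES}")
        if self.status not in STATUSES:
            raise ValueError(f"status must be one of {STATUSES}")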

1.17. Defect Management Process

The defect lifecycle follows the flow below (new builds are deployed at 10am and 2pm):

1. A new 10am/2pm build is deployed, testing proceeds, and a defect is found.
2. Add the defect: Status = Open; Assigned To = <developer>.
3. The developer either fixes the defect (Status = Ready For Retest, i.e., ready for the next build; Assigned To = <submitter>) or rejects it when more information is needed, a requirement must be revisited, or there is a scope issue (Status = Rejected; Assigned To = <submitter>; resolution may include the BA or PM).
4. The submitter retests the defect. If the retest fails, reopen the defect (Status = Reopen; Assigned To = <developer>) and return to step 3.
5. If the retest passes, close the defect: Status = Verified (QA), then Status = Closed (QA/PM).
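The lifecycle above is a small state machine, and writing the allowed transitions down makes illegal status changes easy to reject. The Python sketch below uses the status values from the Tracker fields; the helper function is illustrative, not part of PVCS Tracker.

# Allowed SCR status transitions, taken from the lifecycle above.
ALLOWED_TRANSITIONS = {
    "Open": {"Rejected", "Ready For Retest"},
    "Ready For Retest": {"Reopen", "Verified"},  # set by the retest outcome
    "Reopen": {"Ready For Retest"},
    "Verified": {"Closed"},
    "Rejected": set(),
    "Closed": set(),
}

def change_status(current: str, new: str) -> str:
    """Return the new status, refusing any transition the process does not allow."""
    if new not in ALLOWED_TRANSITIONS.get(current, set()):
        raise ValueError(f"Illegal transition: {current} -> {new}")
    return new

# Example: the happy path for a fixed defect.
status = "Open"
for step in ("Ready For Retest", "Verified", "Closed"):
    status = change_status(status, step)
print(status)  # Closed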


SCHEDULE
1.18. Milestones/Target Dates
The One-Time Recommendation project plan can be found at: G:\Projects\Onyx 2005\One time Recommendation\01 Planning. As of 10/11/05, the testing deliverables/dates are listed below:

Task Name                                    Duration    Start     Finish    Resource Names   Deliverable
Test                                         49 days     11/01/05  01/12/06
  Update Master Test Plan                    1 day       11/02/05  11/02/05  Milena           Master Test Plan
  Create Test Cases/scripts                  21 days     11/10/05  12/12/05
    WFRS User Interface                      11 days     11/10/05  11/28/05  Milena/Sanjeev   Test Cases/scripts
    Client Output/fulfillment                10 days     11/29/05  12/12/05  Milena/Sanjeev   Test Cases/scripts
  Execute System testing                     41.38 days  11/01/05  01/03/06
    WFRS User Interface                      11.13 days  12/13/05  12/29/05  Milena/Sanjeev   Test Status Report
    Literature Fulfillment                   1.25 days   12/22/05  12/23/05  Milena/Sanjeev   Test Status Report
    Client Output                            1.38 days   12/30/05  01/03/06  Milena/Sanjeev   Test Status Report
    Complete Onyx System testing             0 days      01/03/06  01/03/06
    Reporting                                1 day       11/02/05  11/02/05  Georgia Tice     Test Status Report
    BFDS                                     0.25 days   11/01/05  11/01/05  Milena/Sanjeev   Test Status Report
  Execute Integration (end-to-end) testing   12 days     12/22/05  01/10/06  Milena/Sanjeev   Test Status Report
  Execute User Acceptance testing            1 wk        01/03/06  01/10/06  Milena/Sanjeev   UAT Status Report
  Execute Regression testing                 2 days      01/11/06  01/12/06  Milena/Sanjeev   Regression Status Report
Note that any changes to this schedule could result in delays in delivery. Changes should be discussed before testing begins to ensure adequate resources and time exist to complete all testing by the milestone completion dates. Any approved changes to the testing schedule will be documented within the Project Plan rather than updating the approved test plan.

TESTING TEAM
The following individuals will be responsible for testing and testing approvals:

Task                                                           Testing Role                                Name   Test Plan Approver?   Test Plan Distribution
Data Conversion/Data Creation/File loading to test databases   DBA and QA
Unit Testing (white box)                                       Developers
Functional Testing (black box)                                 QA, BAs, and Business (where applicable)
Integration Testing                                            Developers and QA
Regression Testing                                             QA, BAs, and Business (where applicable)
User Acceptance Testing                                        QA, BAs, and Business team members
Performance Testing                                            IT and QA
Automated Testing                                              QA


QUALITY ASSURANCE LIFE CYCLE (QALC)

The QALC flow is as follows:

1. Use Cases/Requirements (baseline and reopen) and Change Requests feed test case development.
2. Onyx test cases are developed by looking into the Use Case and Requirements documents.
3. Each test case is checked to confirm it is in scope for the release.
4. In-scope cases become Release Test Conditions (by iteration), supported by a Release Test Data sheet.
5. The Release/Iteration Test Conditions are executed, and results are reported in the Test Status.


TEST PHASES INCLUDED IN THIS RELEASE


This section further outlines the process and participants for each of the following test phases associated with this project:
1. Unit Testing
2. Smoke Testing
3. Functional Testing
4. Integration Testing
5. System Testing
6. Regression Testing
7. User Acceptance Testing
8. Final Acceptance Testing

1. Unit Testing

Definition: White box testing to validate specific functions or code modules. Unit testing includes a variety of the following checks: execution paths, error-handling paths, boundary conditions, inputs, outputs, computations, logical decision points, error messages, constraints, limits, loops, data flow coverage, test interfaces, and data structures.
Participants: IT
Sources of Data: Code
Entrance Criteria: Confirmation of functionality/code readiness is required.
Exit Criteria: Code checked in/added to build.
Requirements To Be Validated: N/A
Work Products (incl. test docs and reports): Unit Test Status Report (TBD)
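As an illustration of these checks, a unit test typically exercises normal inputs, a boundary condition, and an error-handling path. The Python sketch below tests a made-up stand-in function; it is not project code.

import unittest

def survey_score(answers: list[int]) -> int:
    """Stand-in for an application computation (illustrative only)."""
    if not answers:
        raise ValueError("no survey answers provided")
    return sum(answers)

class SurveyScoreTest(unittest.TestCase):
    def test_typical_inputs(self):
        # Inputs/outputs/computations.
        self.assertEqual(survey_score([1, 2, 3]), 6)

    def test_boundary_condition(self):
        # Boundary conditions: the smallest valid input.
        self.assertEqual(survey_score([0]), 0)

    def test_error_handling_path(self):
        # Error-handling paths: empty input must raise.
        with self.assertRaises(ValueError):
            survey_score([])

if __name__ == "__main__":
    unittest.main()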

2. Smoke Testing

Definition: Non-exhaustive software testing performed to ascertain that the most crucial functions of a program are functioning as expected.
Participants: QA
Sources of Data: Data from Test environment tables; user inputs
Entrance Criteria: Unit Testing is complete.
Exit Criteria: No Showstopper or High priority defects have been identified.
Requirements To Be Validated: Refer to the Smoke test plan in TestDirector (list path).
Work Products: Smoke Test Status Report (TBD)

3. Functional Testing

Definition: Testing performed at the modular or page level to validate that all requirements have been met.
Participants: QA; BAs
Sources of Data: Data from Test environment tables; user inputs
Entrance Criteria:
- Documentation of requirements is complete and signed off by the business.
- Test case development is complete for the first iteration of the testing cycle.
- All testing hardware must be in place and available to the testing team.
- A code promotion process must be in place to formally move executable code to the test environment, with documented content of each promotion of code.
- Test data for all test conditions must be identified and available to the testing team.
- Project requirements and design are frozen and under formal change control.
- Unit testing has been completed.
Exit Criteria:
- Test case execution is complete for the testing iteration.
- Any defects have been documented in PVCS.
- Upon completion of the items above, Functional components can be considered ready for User Acceptance Testing (UAT).
Requirements To Be Validated: Refer to the Functional test plan in TestDirector (list path).
Work Products: System Test Status Report (TBD)
4. Integration Testing

Definition: Testing of combined parts of the application to determine whether they function together correctly.
Participants: QA
Sources of Data: Data from Test environment tables; user inputs
Entrance Criteria: Functionality/code readiness has been confirmed by IT.
Exit Criteria: System components function together as expected.
Requirements To Be Validated: Refer to the Integration test plan in TestDirector (list path).
Work Products: Integration Test Status Report (TBD)

5. System Testing

Definition: This testing phase includes:
- End-to-end testing
- Additional testing on the various operating systems and browsers supported by the Onyx application
- Defect retesting
Participants: QA; BAs
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria:
- Personnel resources are assigned and in place. Other team members will participate in executing the tests as needed.
- Documentation of requirements is complete and signed off by the business.
- Test case development is complete for the first iteration of the testing cycle.
- All testing hardware must be in place and available to the testing team.
- A code promotion process must be in place to formally move executable code to the test environment, with documented content of each promotion of code.
- Test data for all test conditions must be identified and available to the testing team.
- Project requirements and design are frozen and under formal change control.
- Unit testing and Integration testing have been completed.
Exit Criteria:
- Test case execution is complete for all documented requirements.
- All High and Medium priority test cases have been executed. Low priority test cases may wait until post-Live.
- All Showstopper and High priority reported defects have been resolved and retested.
- If any Medium or Low priority defects remain open, the Project Manager and Business Champion must sign off on the implementation risk as acceptable.
- Upon completion of the items listed above, system components can be considered ready for User Acceptance Testing (UAT).
Requirements To Be Validated: Refer to the System test plan in TestDirector (list path).
Work Products: System Test Status Report (TBD)

6. Regression Testing

Definition: Regression testing will be performed as needed, following the defect control and code promotion processes already in place.
Participants: QA
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria:
- Relevant defects have been fixed for the testing cycle and are ready for retesting.
- Development of additional system functions is complete.
- The build is complete and promoted to the test environment.
Exit Criteria:
- Testing of all identified items is complete.
- Test results have been recorded.
- Defects have been closed or returned for fixes.
Requirements To Be Validated: Refer to the Regression test plan in TestDirector (list path).
Work Products: Regression Test Status Report; Defect Retesting Log (TBD)

7. User Acceptance Testing

Definition: While system testing verifies each documented requirement, User Acceptance Testing validates that the application's features and functions perform as expected by the business. UAT will begin toward the end of the system-testing phase. It is expected that the PM and BA will lead the coordination of User Acceptance Testing with the Business side. The QA team will provide direction and sample system tests to aid this testing.
Participants: BAs; Business Team; QA
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria:
- System Testing is complete on all core application components.
- Most major defects have been resolved and Regression tested.
- The Test Lab or designated testing area is ready.
- UAT participants have received high-level training to execute test cases.
- The UAT build is complete and promoted to the test environment.
Exit Criteria:
- Test case execution is complete for the testing iteration.
- Executed test cases have been reviewed for completeness by QA.
- Any defects have been documented in PVCS.
- Upon completion of the testing cycle, the PM and Business Champion will sign off on the test plan.
Requirements To Be Validated: Refer to the UAT test plan in TestDirector (list path).
Work Products: UAT Status Report (TBD)

8. Final Acceptance Testing

Definition: Upon completion of testing (100% passed test conditions), all projects will undergo a final round of Regression testing to make sure that code built to address outstanding defects did not break anything else. Upon completion of the final round of Regression testing, the application/code will be deemed ready for deployment to production. The final test status will be communicated to the PM and Business Champion, and the finalized testing documents will be submitted for sign-off.
Participants: QA
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria: UAT has been completed and signed off by the PM and Business Champion.
Exit Criteria:
- Testing of all identified items is complete.
- Test results have been recorded.
- All final testing documents have been signed off by the PM and Business Champion.
- The system is determined to be ready for deployment to production.
Requirements To Be Validated: Refer to the FAT test plan in TestDirector (list path).
Work Products: All completed Test Plans from each test cycle; completed Test Cases; Test Status Reports from each test cycle; Defect Status Reports from each test cycle; Final Defect Report listing any outstanding defects


TEST PHASES NOT INCLUDED IN THIS RELEASE


This section further outlines the test phases not associated with this project:
1. Performance Testing
2. Automated Testing

1. Performance Testing

Definition: At this time, no required response times or performance expectations for One-Time Recommendation have been documented. If needed, the following types of performance testing can be executed using tools such as Segue Silk Performer or Mercury LoadRunner:
- Load: Ensures the system functions properly beyond the expected maximum workload. Performance is expected to match that of corresponding systems currently in production.
- Stress: Identifies system defects due to low resources or competition for resources. Performance is expected to match that of corresponding systems currently in production.
- Volume: Tests large amounts of data to determine whether limits can be reached that will cause the system to fail. Performance is expected to match that of corresponding systems currently in production.
Participants: IT; QA
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria: Confirmation of functionality/code readiness is required.
Exit Criteria: Code checked in/added to build.
Requirements To Be Validated: N/A
Work Products: Performance Test Status Report (TBD)
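For illustration, the load scenario can be approximated outside of Silk Performer or LoadRunner with a short script that fires concurrent requests and reports response-time percentiles. The Python sketch below assumes an HTTP test endpoint; the URL and user counts are placeholders, not project figures.

import concurrent.futures
import time
import urllib.request

URL = "http://test-server/onyx/dashboard"  # placeholder endpoint
SIMULATED_USERS = 50
REQUESTS_PER_USER = 10

def one_user() -> list[float]:
    """Issue a series of requests and record each response time in seconds."""
    timings = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        with urllib.request.urlopen(URL, timeout=30):
            pass
        timings.append(time.perf_counter() - start)
    return timings

if __name__ == "__main__":
    with concurrent.futures.ThreadPoolExecutor(max_workers=SIMULATED_USERS) as pool:
        futures = [pool.submit(one_user) for _ in range(SIMULATED_USERS)]
        timings = sorted(t for f in futures for t in f.result())
    print(f"requests: {len(timings)}")
    print(f"median:   {timings[len(timings) // 2]:.3f}s")
    print(f"95th pct: {timings[int(len(timings) * 0.95)]:.3f}s")

The resulting percentiles can then be compared against the corresponding systems currently in production.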

2. Automated Testing

Definition: There are no plans to automate testing prior to the implementation of One-Time Recommendation. However, once all test cases are documented and organized in Quality Center, automation should be considered soon after deployment for future Regression testing needs.
Participants: QA
Sources of Data: Data from Test environment tables; ad-hoc data; user inputs
Entrance Criteria:
- Relevant defects have been fixed for the testing cycle and are ready for retesting.
- Development of additional system functions is complete.
- The build is complete and promoted to the test environment.
Exit Criteria:
- Testing of all identified items is complete.
- Test results have been recorded.
- Defects have been closed or returned for fixes.
Requirements To Be Validated: Refer to the Regression test plan in TestDirector (list path).
Work Products: Regression Test Status Report; Defect Retesting Log (TBD)


TEST PLAN APPROVAL


Project Champion: Brian Kubly, Business Resource Team/CRT

Project Manager: Alan Haga, Internal-Facing Applications Team

Lead Developer: David Therkildsen, Internal-Facing Applications Team

Lead Business Analyst: Linda Larson, Internal-Facing Applications Team

QA Team: Milena Vranjes, Internal-Facing Applications Team
         Sanjeev Yarlagadda, Internal-Facing Applications Team



This publication contains proprietary information not to be distributed outside of Wells Fargo & Co. This document, in whole or in part, must not be reproduced in any form without the express written permission of Wells Fargo & Co.

© 2005 Wells Fargo & Co. All rights reserved.
