Sam TestingPlaybook EVDOracleCloudSFSFWorkday v01

The Testing Playbook provides a structured approach for conducting various types of testing in Cloud implementation projects, emphasizing the need for adaptation based on client-specific requirements. It outlines key testing types, roles, responsibilities, and a blended methodology that combines Waterfall and Agile concepts to ensure effective testing and integration. The document serves as a comprehensive guide for project managers and testing leads to plan, execute, and manage testing activities efficiently.


Testing Playbook

<insert date here>


Contents

Testing Playbook Overview

Integration Testing

Release Testing

Performance Testing

Payroll Compare Testing

Copyright © 2017 Deloitte Development LLC. All rights reserved.
​Testing Playbook Overview

Objectives and Approach

The Testing Playbook:


• provides a recommended structure and process for preparing for and conducting the various types of testing needed for a Cloud implementation project
• outlines key aspects of test planning and test management, keeping the future-state implementation approach in mind
• is intended to give the PM or the Testing Lead the ability to leverage one or more chapters from this document to define their testing approach, clarify roles and responsibilities, and align on the expected outcomes

Scope of the Testing Playbook
This Playbook is intended to cover the key activities highlighted below. While the Playbook provides a good base for your
Cloud project from a method perspective, you may need to adapt it based on the specific needs of your client.

[Figure: client project lifecycle across the project duration. Key: shaded activities are in the scope of this Playbook.]

• Imagine (Personalized, Insightful, Meaningful): Launch project; Adapt playbooks; Create project plan; Personas; Journey maps; Moments that matter; User stories
• Deliver (Agile, Iterative, Responsive): Adapt solution; Plan sprints; Sprint: observe; Sprint: feedback; Sprint: iterate; Validate; Conduct mock conversion; Coordinate cloud release testing; Perform integration testing; Perform payroll compare testing; Coordinate vendor performance testing; Rehearse; Deploy; Sustainment Lab
• Run (Innovative, Efficient, Productive): Support; Sustain; Optimize
Testing in Cloud is different from Testing on-prem

Cloud projects follow a blended approach that leverages both Waterfall and Agile methodology concepts. As a consequence, the testing strategy in the Cloud needs to adjust to this new approach.
• Prototype sprints provide the Client with an approach that encourages the teams to “Fail Early, Fail Fast, Learn
Faster”
• Due to the iterative nature of prototyping, the Client team can get well versed with the solution by the time the
project moves into the Testing phase and prepare for Integration Testing
• Periodic upgrades from the vendors also drive the need to evaluate the new functionality and determine the value
of implementing the changes

The Test Strategy provides the necessary framework to incorporate these changes and encourages the Project teams
to think and plan ahead.

Testing in Cloud - Strategy Overview
Applying innovative and lean thinking to traditional test strategy to arrive at an agile method of testing in the Cloud

CLOUD REFRESH – TEST STRATEGY OVERVIEW

Phases: PLAN → ARCHITECT/PROTOTYPE → TEST → DEPLOY/SUPPORT
(Test Planning → Prototype → Test Execution → Test Transition)

Primary activities (Defect Management runs across all phases):
• PLAN: Test Strategy; Test Approach
• ARCHITECT/PROTOTYPE: Sprints
• TEST: Integration Testing; Performance Testing; Release Testing*
• DEPLOY/SUPPORT: Prod Release Planning/Execution

Secondary/support activities:
• Personas; Playbooks (PM/Tech/Sprint); Journey Maps/User Stories/Workbooks
• Data Conversions; Sprint Planning/Lean Specifications/Support Model
• Production Releases

Deliverables/Work Products:
• PLAN: Testing Playbook; Integration Test Plan; Release Test Plan*; Performance Test Plan
• ARCHITECT/PROTOTYPE: Integration Test Scenarios
• TEST: Integration Test Results; Release Test Results*; Payroll Compare Test Results (if applicable)
• DEPLOY/SUPPORT: Performance Test Results

*If applicable
(Key: Chapters within Testing Playbook)
Testing in Cloud – Testing Types (Independent of the Scope)
Irrespective of the scope and timeline of the project, every Cloud implementation needs to have certain mandatory testing
types in order to achieve the project objectives.

Testing Type Objectives/Description


Integration Testing
• Validation of end-to-end business processes leveraging user stories that are stitched together
• Validation of data flow between applications for business continuity
• Verification of system access and connectivity between systems
• Verification of user experience, with the goal of refining the solution, sharpening communications, and enhancing training

Release Testing*
• Validation of new functionality delivered as part of a vendor release
• Validation of the impact of the new release on the current solution
*If applicable

Testing in Cloud – Testing Types (Dependent on the Scope)
In certain cases, depending on the scope of the project and/or client preferences, the project team may
need to account for additional testing types.

Testing Type Objectives/Description

Performance Testing
• Validate application behavior under user load and data/volume load, and measure system response times

Smoke Testing
• Aimed at ensuring that the most important functions work for configuration or integration (connectivity, decryption, etc.)
• The results of this testing are used to decide whether the build is stable enough to proceed with the actual testing cycle

Payroll Compare Testing (formerly known as Parallel Testing)
• Validate that payroll in the new solution ties accurately to the legacy solution
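At its core, payroll compare is a per-employee reconciliation of legacy results against the new solution. The sketch below illustrates the idea only; the field names, dict shape, and tolerance are assumptions, not part of any vendor tool:

```python
def compare_payroll(legacy, new, tolerance=0.01):
    """Reconcile per-employee net pay between legacy and new payroll runs.

    legacy / new: dicts mapping employee_id -> net pay amount.
    Returns a sorted list of (employee_id, legacy_amount, new_amount)
    mismatches; an employee missing from either run shows None on that side.
    """
    mismatches = []
    for emp in sorted(set(legacy) | set(new)):
        old_amt, new_amt = legacy.get(emp), new.get(emp)
        # Flag missing employees and any difference beyond the tolerance
        if old_amt is None or new_amt is None or abs(old_amt - new_amt) > tolerance:
            mismatches.append((emp, old_amt, new_amt))
    return mismatches
```

An empty result means the two runs tie within tolerance; any entries returned become candidate defects for triage.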

Determining the nature and number of testing types/events

The technology, module, and geographical scope drive the number and nature of testing events the project team needs to plan. Other key inputs in making this determination are:
• An inventory of prioritized user stories that highlight customer experience milestones for various personas
• The prioritized set of customer journey maps, further condensed to the specific Moments that Matter for various personas, i.e., those that most impact the customer experience (current pain points, desired future experience, and sentiment)
• Other parameters such as timeline, Cloud environment, project goals, and compliance, which together help ascertain the nature and number of testing events
• Lastly, resource availability and SLAs with third-party integrations, which also factor into the choice of testing types and events

​Integration Testing

Integration Testing Overview

• Overview
‒ Anatomy of a Test Cycle
‒ Roles and Responsibilities
• Planning and Preparing
• Execution
• Support and Manage
• Close Cycle

Overview

Anatomy of a Test Cycle
Environment Setup
- Instance/patches
- Supporting apps (boundary/partner systems)

Configuration
- Basic setups for the application
- Access setup
- Setup to support customizations

Data Conversion
- Convert master and transactional data
- Reconciliation

Smoke test
- Pre-test hand-picked components of the solution
- Sample transaction processing
- Connectivity, access verification, UI verification

Execute Test
- Run the scoped test scenarios
- Log issues based on test scenario runs

Resolve defects
- Apply fixes as needed
- Confirm/re-test and validate to close

Regression test
- Re-verify existing functionality as fixes are applied

Close test cycle (Exit)
- Confirm exit criteria have been met
- Acknowledge any open exceptions and the supporting mitigation plan
- Stakeholder approval
Roles and Responsibilities (sample)

Roles Responsibilities

• Manage their assigned work-stream testers


Test Champions • Represent work stream in daily test status calls and periodic defects call
• Keep testers in the loop of any key communications
• Provide guidance to testers on test scenarios, defects and data sheets
• Escalate issues to the Test Management team
• Ensure HPE ALM is kept updated (tests & defects)
• Ensure scenarios are executed per the planned schedule

• Primary point of contact to analyze any new open defect; route the defect to the right individual/team
Triage Leads • Follow up on resolution for defect ETA and re-test planning

• Accountable for signing off on test results and exit criteria by workstream
Leads - Global Process • Understand and resolve/escalate highlighted risks and issues from test champions/team
• Review and decide on any proposed scope change during testing
• Review and approve any viable workarounds needed to support defects/scenarios

• Provide systems and process knowledge


Functional/Solution Leads • Provide primary support for test execution and defect logging
• Triage and resolve defects; coordinate defect fix migration into the instance (for functional fixes only)

• Provide escalation support


PMO • Arrive at key decisions and review risks/issues
• Approve any mitigation/remediation plans to help keep testing on track

• Point person by region to manage test execution for local requirements


Localization Coordinator • Manage defects related to localizations testing

• Execute in-country specific localization requirements


Localization Tester • Identify and re-test defects for scoped scenarios

• Point person to review and confirm test scenarios, scripts and datasheets for each work stream
Testing QA
• Document scenarios and associated test scripts OR modify existing ones as needed
• Create datasheets to support test execution

Roles and Responsibilities (sample)

Roles Responsibilities

• Plan and facilitate test cycle activities


Testing Coordinators • Manage test execution and daily test status reporting
• Provide testing tool support

• Drive and monitor defects to resolution


Defect Management Lead • Work with test champions to align on defect closures, ETAs and key issues/risks
• Ensure compliance with the defined defect severity guidelines
• Report out defect status to PMO and project teams

• Execute test scenarios approved for testing, log defects and record results
Business SMEs/Testers • Work with the R12 Functional Teams to identify retests and execute them
• Provide subject matter expertise on processes being tested
• Provide formal sign-off on test results

• Support test execution, defect resolution and re-test activities


Technical – Development • Support and perform migrations for defect fixes
• Communicate to functional leads/workstreams once fixes are migrated

• Points of contact to support test execution of interfaces


Enablers (Integrations/Job • Provide defect resolution and re-test support
scheduling)

• Provide primary support for any DBA/Infrastructure activities (includes application, database and patch activities)
Security • Provide any access requests for testers

• Manages all releases (technical, functional and related ones) during test cycles and any major release between test cycles
Release Coordinator • Ensures release objects/fixes are approved and have been verified in lower instances prior to migrations
• Confirms and communicates once release migration is complete

• Performs the actual release changes in the environment


Release Team • Mix of team members from multiple teams as applicable to the release objects

Roles and Responsibilities (sample)

Role Testing Responsibilities Resources (Names)

• Validate testing strategy, templates and tools


• Owns test plan creation (templates, tools, schedule, etc.) and overall success of Payroll Compare testing
Deloitte Payroll Lead
• Manage Deloitte-assigned functional defects
• Manage support for Workday functional configuration

• Provide primary support for Workday functional configuration


Deloitte Payroll Team Various Team Members
• Complete assigned defects

Deloitte Operating • Provide primary support for Workday Operating Model inputs/outputs
Model Lead • Provide input into test scenarios

• Provide test systems, receive test files, provide feedback


Vendors Various Team Members

• Build Test Tenant


Deloitte
Conversion Lead • Migrate Employee Data
• Manage Conversion related defects

Roles and Responsibilities (sample)

Role Testing Responsibilities Resources (Names)

• Owns test plan creation (templates, tools, schedule, etc.) and overall success of E2E
• Plan for and manage testing execution and defect resolution process
Testing Lead • Track and report on testing progress
• Evaluate entry/exit criteria for advancement to go live
• Obtain final sign-off for E2E

• Validate testing strategy, templates and tools


• Provide input into E2E test
Client Functional Lead
• Review/update Workday functional configuration defect priorities, assign defects, manage overall functional
configuration defect process and progress

• Resolve configuration defects (primary) associated to Integration


Client Functional • Provide support for functional configuration (secondary)
Various Team Members
Team • Complete assigned defects
• Provide knowledge transfer at the beginning of E2E
Client Subject Matter
• Provide clarification on requirements as needed Various Team Members
Experts

• Execute testing scenarios for Functional & INT configuration


Testers • Track status of testing and log defects as necessary Various Client Resources
• Retest scenarios to successful completion as defects are resolved

Roles and Responsibilities (sample)

Role Testing Responsibilities Resources (Names)

• Validate testing strategy, templates and tools


• Owns test plan creation (templates, tools, schedule, etc.)
Client Payroll Lead • Provide input into test scenarios
• Review/update Workday functional configuration defect priorities, assign defects, manage overall functional
configuration defect process and progress

Client Payroll Team • Support test execution according to test schedule as needed Various Team Members

Client Operating • Provide primary support for Operating Model inputs/outputs


Model Lead • Provide input into test scenarios

• Validate testing strategy, templates and tools


• Provide input into E2E and Payroll Compare tests
Client Integration
• Review/update defect priorities, assign defects
Lead
• Own coordination of integration testing efforts with vendors, serving as the vendor liaison (or manager of vendor
liaisons)

• Launch Client-owned integrations during CAT per schedule


• Review integration actual results against expected results (vendor side)
Client Integration
• Track integration testing status Various Team Members
Team
• Log defects if integration issues are identified on vendor side
• Resolve assigned defects for Client-owned integrations

Planning and Preparing

Traceability in the Cloud Method

• The RTM will no longer be used in its true sense (requirements mapped to test scenarios) for Cloud implementations

• Instead, process maps are translated into user stories, which form the test scripts/test steps

• These user stories are linked together to create test scenarios

• Test scenarios are thus linked back to the process maps

• The recommended approach to documenting scenarios is:

‒ Review the process maps
‒ Confirm the underlying user stories (each of which is unique and can stand on its own)
‒ Link these user stories to create test scenarios

An example is in the next section.
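The linkage described above (process maps → user stories → test scenarios) can be sketched as a small data model. All names here are illustrative, not from any project tool:

```python
from dataclasses import dataclass, field


@dataclass
class UserStory:
    story_id: str       # unique; each story stands on its own
    description: str
    process_map: str    # the process map the story was derived from


@dataclass
class TestScenario:
    scenario_id: str
    stories: list = field(default_factory=list)  # user stories strung together

    def traced_process_maps(self):
        """Trace this scenario back to the process maps its stories came from."""
        return sorted({s.process_map for s in self.stories})
```

A scenario built from two "Procure to Pay" stories, for instance, traces back to that single process map, which is the traceability the playbook asks for in place of an RTM.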

Planning and Preparing (Scenarios/Scripts/Traceability)
[Figure: traceability flow. Personas, Moments that Matter, and Journey Maps feed User Stories and Process Maps, which are linked into Test Scenarios (Scenarios 1–4). Moments that matter are carried through build, test, and deploy.]
Integration Testing (Scenario) Build

Test Scenario (Traditional Approach)
Test Scripts 1–4: Create/Approve Requisition → Generate/Approve PO → Receive against PO → Process Invoice and Payment

Key points:
• Move away from the traditional approach of test scripts
• Build a lean "testing" model
• User training on the system/solution is a key pre-requisite (to replace test scripts with user stories)
• User stories cater to process aspects, aided by enablers behind the scenes (like integrations)

Test Scenario (Future State Approach)
User stories strung together (Scenario: procure direct material with invoice approval and payments via electronic processing), supported by UPK and datasheets:
• User Story 1: As a Procurement Supervisor, I can create a requisition for direct materials
• User Story 2: As a Procurement Manager, I can approve a PO between $1000 and $2000
• User Story 3: As a Purchasing Clerk, I should be able to receive against an approved PO
• User Story 4: As a Payment Manager, I can process invoices and make payments, with approval from the Procurement Director for any invoices > $1M
Testing Data

Converted data
• Use existing data from conversions (master and transactional)
• This will likely cover only a certain percentage of test scenario variations

New data
• Enter new transactions
• Enter new master data
• Confirm the test scenarios align and behave as expected with the new data entry
• This will likely cover the remainder of the variations for the test scenario
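The split above can be tracked with a trivial helper, assuming (purely for illustration) that scenario variations are listed by ID:

```python
def coverage_split(all_variations, covered_by_converted):
    """Partition scenario variations into those already exercised by
    converted data and those still needing new data entry."""
    covered = set(all_variations) & set(covered_by_converted)
    needs_new_data = set(all_variations) - covered
    return sorted(covered), sorted(needs_new_data)
```

The second list is the data-entry backlog for the test cycle: the variations that converted master/transactional data alone will not exercise.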

Determining a schedule for testing types/events

A typical Cloud implementation project transitions from an agile approach to a waterfall approach at the beginning of the Test phase. The Test phase needs to be treated as a stage gate to confirm that the user stories, journey maps, and any other supporting elements of the overall solution implementation have been built and reviewed with the Client team.

The duration of each testing type and the comprising testing events is dependent on certain key parameters like:
• Overall project timeline
• Geographic and technology scope
• Resource availability
• SLAs with the 3rd party boundary systems

Additionally, specific testing types may be dependent on vendor schedule and process. For instance:
• Release Testing
‒ The schedule for Release Testing is dependent on the vendor release schedule
‒ The magnitude of changes being introduced in a given release and the impact of those changes on the project
objectives further drive the schedule
• Performance Testing
‒ Because the project team does not have access to the infrastructure of the solution, there is limited capability for conducting performance testing
‒ The project team may need to work with the vendor on determining the overall performance tuning process based on
the estimated usage of the solution

Execution

Smoke Test/Pre-testing
Smoke testing/Pre-testing is usually performed to sanity check the testing instance using a select group of core team
members before the system is opened to the wider testing community

[Timeline across Monday–Sunday: Initiation, then System Validation, then Functional Validation.]

Both activities below are performed by a limited number of identified core team members.

System Validation (Week X)
 Basic access checks (log in to the application; verify basic forms/screens)
 Confirm basic operations by navigating through screens
 Validate critical customizations/personalizations
 Verify concurrent managers are operational
 Verify connectivity between the application and identified boundary/partner systems

Functional Validation (Week Y)
 Run a selection of standard-functionality ("happy path") scenarios in identified business units to verify instance operability
 Track results in the testing tool and log defects as applicable
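The connectivity portion of system validation lends itself to a small automated check. This is only a TCP-level sketch with hypothetical endpoint names; real smoke testing would also cover login, screens, and customizations as listed above:

```python
import socket


def smoke_check_connectivity(endpoints, timeout=5):
    """TCP-level reachability check for boundary/partner systems.

    endpoints: iterable of (name, host, port) tuples.
    Returns {name: True/False} indicating whether a connection succeeded.
    """
    results = {}
    for name, host, port in endpoints:
        try:
            # create_connection raises OSError on refusal or timeout
            with socket.create_connection((host, port), timeout=timeout):
                results[name] = True
        except OSError:
            results[name] = False
    return results
```

Run against the identified boundary systems before opening the instance to the wider testing community; any False result is a blocker to investigate before functional validation starts.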

Running Scenarios
Scenario execution is typically divided into three stages: prepare to execute the scenario, execute and analyze test results, and re-test and close.

Prep/Construct
• Prepare the data sheets needed to support scenario execution
• Walk the testers through what's expected from the scenario
• Set up the scenario in the testing tool, confirm the right assignment of testers and scheduled execution dates, and verify for completeness

Run/Analyze Results
• Execute the scenarios, which consist of user stories strung together
• Analyze and record the test results observed in the system
• Update failed scenarios, and block exception cases where an outstanding issue prevents the tester from executing the scenario

Re-test/Close
• Triage/resolve logged defects
• Re-test resolved defects/scenarios to confirm no regression impacts due to the fix
• Initiate scenarios that were blocked and progress through the normal scenario structure
Defect Resolution

Defect lifecycle: Open (Re-open) → Assigned → [Pending Cross-Team Verification | Pending CR/SR Resolution] → Fixed/Ready for re-test → Verified → Closed

• Defects are opened for any issues detected while running the scoped test scenarios
• Typically defects are raised at the test-step level, and each is likely related to a user story that forms part of the scenario
• Once a defect is opened, it undergoes triage to be assigned to the right team/team members
• For the majority of defects, the assigned team/team members resolve the issue and the defect is put back for re-testing in the test instance (after it is fixed and successfully verified in the lower instance(s))
• In exception cases the defect may need a vendor SR to be logged or a change request to be submitted; the defect then depends on the resolution of that CR/SR before it can re-enter re-testing and closure
• Once the defect is re-tested, it is handed back to the tester for final verification, at which point the tester may re-run the entire scenario or a section of it based on the impact of the issue
• Once verified successfully, the defect is closed; if not, it goes back to Assigned status and the defect cycle is repeated
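The lifecycle above is effectively a state machine, and a tracking tool or script can enforce it. A minimal sketch, with the two "Pending" boxes merged into one status for brevity (status names are condensed from the diagram, not canonical):

```python
# Allowed defect status transitions, following the lifecycle above.
TRANSITIONS = {
    "Open": {"Assigned"},
    "Assigned": {"Pending", "Fixed/Ready for re-test"},
    "Pending": {"Fixed/Ready for re-test"},       # waiting on CR/SR or cross-team work
    "Fixed/Ready for re-test": {"Verified"},
    "Verified": {"Closed", "Assigned"},           # failed verification goes back to Assigned
    "Closed": {"Open"},                           # re-open if the issue recurs
}


def move(defect, new_status):
    """Apply a status change, rejecting moves the process does not allow."""
    allowed = TRANSITIONS.get(defect["status"], set())
    if new_status not in allowed:
        raise ValueError(f"illegal transition: {defect['status']} -> {new_status}")
    defect["status"] = new_status
    return defect
```

Encoding the transitions this way keeps testers from, say, closing a defect that was never re-tested, which is exactly the discipline the lifecycle is meant to impose.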

Environmental Change Control
Environment Change Control focuses on the controlled introduction of changes, primarily arising from defects or approved change requests, into the testing environment.

Recommended Release Path: Technical Development instance(s) → Functional Development instance → Testing instance(s) → GOLD instance

Development instances
• Used primarily by the development team and functional testers to verify whether applied fixes are valid
• Verification here is a pre-requisite before migrating any changes into the testing instance
• Typically there will be two (2) instances, with a level of validation performed from a functional/process perspective, amounting to a functional object test

Testing instance
• The testing environment used for performing the test cycles
• Any change introduced into this instance has to be tested and verified in lower environments first (this includes all changes: patches, code, configurations, etc.)
• A daily or periodic release management process is needed to migrate fixes tested in the lower instance into the testing instance

GOLD/PROD
• All approved configurations and solution components are migrated into a standard GOLD instance at the end of a test cycle
• PROD is built from the GOLD instance, with any incremental changes after the final GOLD snapshot applied on a one-time basis (closer to PROD deployment/cutover)
Regression Testing

Regression Testing (during and between test cycles)

1. Regression testing verifies the end-to-end process flow/scenario even when only one part of it has been fixed/resolved due to an open defect during testing
2. As defects are fixed, iterative regression testing occurs during the testing event to ensure the existing solution is not impacted
3. Between two (2) testing events, regression testing is performed, with any open defects from the exit getting resolved and released for the next test cycle

[Figure: Test Cycle 1 → Test Cycle 2 → Test Cycle 3, with regression testing occurring both during and between cycles.]

• The scope of regression testing is typically driven by the type of defect fix or change introduced
• It may be limited to re-testing the impacted scenario, or may lead to re-testing a collection of scenarios
• It needs to be performed with the intent of ensuring that no related functionality or customization is impacted by the defect fix
• If additional defects are detected during regression testing, they need to be fixed and the overall impact re-assessed
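Selecting the regression scope described above can be sketched as a mapping from fixed components to impacted scenarios. This is illustrative only; on a real project the mapping would come from the test tool's traceability data:

```python
def regression_scope(fixed_components, scenario_components):
    """Pick scenarios to re-test: any scenario that exercises a component
    touched by the defect fix or change.

    scenario_components: {scenario_id: iterable of component names}.
    Returns a sorted list of impacted scenario IDs.
    """
    fixed = set(fixed_components)
    return sorted(sid for sid, comps in scenario_components.items()
                  if fixed & set(comps))
```

A fix to a shared component (an integration, say) naturally pulls in every scenario that touches it, which is why regression scope can grow from one scenario to a collection.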
Support and Manage

Cloud is supposed to be intuitive, what is the value of testing tools on Cloud projects?

• Cloud solutions across vendors aim, among other things, to make business processes more intuitive for the end user. However, this does not remove the need for project teams to test complex solutions thoroughly.

• The testing tools are therefore intended to provide structure and rigor, and to add detail to the testing process. The specific tools mentioned in subsequent slides are commonly used on projects and, in some cases, recommended for the project.

• HPE Application Lifecycle Management (ALM), together with Agile Manager (AgM), lends itself to translating user stories defined in the earlier stages of the project into actionable test scenarios that can be executed.

Test Management Tool – HPE ALM
ALM is a web-based tool that helps manage the application lifecycle from project planning and requirements gathering through testing and deployment.

Access:
Click here to access CMT site including materials on ALM and Sprinter (add-on tool)

Once your access is setup, launch Application Lifecycle Management. Additionally, you can download and install add-ins for business views or
Excel. For the Sprinter add-in, please contact your project administrator for installation and user guide files.

Test Management Tool - HPE ALM
Below are a few Frequently Asked Questions regarding ALM

Who can use the capability? Can it be used outside the Deloitte network?
Deloitte Consulting personnel, clients and contractors can use ALM. Yes, it can be used outside the Deloitte network, and no VPN is required.

How can I determine the cost of using ALM for my project?
Projects will not incur any costs for use of the tool.

How can we train the project team on ALM and Sprinter?
The HPE Adoption Readiness Tool (ART) provides end-user training for most HPE software. It can be accessed remotely, which allows for self-paced learning for Deloitte practitioners. The training is designed to be interactive, including simulation exercises and knowledge checks, while also giving an understanding of the full product functionality.

What tools can ALM integrate with? To what extent can this tool be used?
ALM is currently integrated with Agile Manager (AgM), Performance Center (PC), Unified Functional Testing (UFT), and Sprinter. Additionally, via the ALM Synchronizer, projects can synchronize ALM defects with Atlassian Jira and Microsoft Team Foundation Server.

How complex is ALM? Is there a learning curve or high project overhead cost involved due to the complexity of using ALM?
ALM is a very intuitive tool. After initial setup, there is very low overhead for ongoing maintenance and support. Each project assigns a Deloitte ALM project administrator to perform initial project setup tasks, facilitate project training and provide support.

What happens to the ALM project and data after the Deloitte engagement is complete?
Projects and users are deactivated at the completion of the Deloitte engagement. The project will remain on the Deloitte ALM server for 7 years and can be recovered if necessary. To learn more about data handover to the client prior to engagement completion, read the ALM client project handover guide.

Testing Tools – Kainos (Workday-specific)

Automated Testing Tool - Kainos


• Kainos SMART is a cloud-based automated testing tool.

• With Kainos SMART, repeatable tests can be created and run against Workday HCM, Financials and Security.

Visit https://www.kainosworksmart.com/products for more details

File Comparison Tools


• Files can be compared manually, but that may require a lot of time and effort, and it introduces scope for human error and overlooked differences.

• Tools/applications are available that compare two files and present the differences between them. Popular ones include:

‒ Beyond Compare

‒ Notepad++

• Refer to the Appendix to learn about the file comparison options available.
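As a lightweight alternative when a dedicated comparison tool is unavailable, Python's standard-library difflib can produce the same kind of difference report. A minimal sketch:

```python
import difflib


def compare_files(path_a, path_b):
    """Return unified-diff lines between two text files.

    An empty list means the files are identical.
    """
    with open(path_a) as fa, open(path_b) as fb:
        return list(difflib.unified_diff(
            fa.readlines(), fb.readlines(),
            fromfile=path_a, tofile=path_b))
```

This is useful, for example, for comparing an outbound interface file generated by the new solution against the legacy equivalent during integration testing.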

Testing Tools - Unified Functional Testing (UFT) and UFT Pro (LeanFT)
Our tools
• Unified Functional Testing (UFT) is an advanced tool for functional and regression test automation.
• UFT Pro (formerly LeanFT) is a powerful and lightweight functional testing tool built specifically for continuous test and continuous integration.
• UFT continues as the overall market share leader and has worked to expand its coverage.

UFT capabilities
• Tests applications at the UI and API layers
• Tests a variety of applications using add-ins
• Creates repeatable processes
• Integrates with Application Lifecycle Management (ALM)
• UFT's solution supports test functionality across multiple application layers, such as the front-end GUI layer and back-end service layers. Our capability increases efficiency and speed of delivery with lower overall effort.

UFT Pro capabilities
• Supports "shift left" initiatives for earlier testing
• Helps simplify the process of building robust, stable tests
• Supports the most popular technologies and development languages
• UFT Pro is used to create test automation in developer integrated development environments (IDEs). This solution delivers a new standard in continuous delivery and test automation for Agile project management and DevOps teams.
37
Using UFT/UFT Pro (LeanFT)
​UFT and UFT Pro require specialized skillsets to deliver test automation services

Global availability: Advanced testing tools are currently available globally for Deloitte Consulting practitioners and clients.

No cost: Enterprise licensing is currently available at no additional cost for the duration of the Deloitte engagement.

VBScript experience required for UFT: Projects using UFT require an experienced Deloitte resource with VBScript programming experience staffed on the project to deliver test automation services.

Java or C# experience required for UFT Pro: Projects using UFT Pro require an experienced Deloitte resource with Java or C# programming experience staffed on the project to deliver test automation services.

To get started, take advantage of the UFT Quick Start Guide, the UFT Pro Quick Start Guide, and the
Project Ramp Up Guide. Visit the Help Center and the UFT/UFT Pro resource center for additional learning
opportunities. 38
Testing Tools – SharePoint (For Test Execution)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.
1. Go to the “Scenario Execution Tool”
2. Filter on “Assigned to”
3. Click “View Entries” to view the details of the scenario

39
Testing Tools – SharePoint (For Defect Management)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.

1. Click ‘New’ to log a defect
2. A ‘New Item’ form will appear
3. Fill in the required information and click OK

40
Management Reporting – Outlines
Below is a list of sample parameters that must be reported for each testing phase. The testing team must work with the
Client Testing Lead to define the key parameters based on the stakeholders; refer to the governance framework for an
illustrative example of the metrics to report based on the audience of the report.

Progress Reporting
• Unique number of test scenarios or user stories in each process
• % completion for unique transactions
• Gap between the number of scheduled vs. actual test scenarios executed
• Total projected scenarios to be executed

Quality Reporting
• Feedback scores of the testers
• Trends in the feedback scores

Defect Reporting
• Defect reporting by criticality
• Defect reporting by process
• Defects closure status
• Metrics – defect aging, defect trend analysis, etc.

Risks Reporting
Potential risks in the testing project due to any of the following:
a) Critical/high priority defects
b) Deviation from the test strategy, processes, etc.
c) Process/solution change requests
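Several of the progress parameters above are simple ratios over execution counts. A minimal sketch of computing them for a status report; the field names are illustrative, not part of any project tooling:

```python
def progress_metrics(scheduled, executed, passed, total_projected):
    """Compute sample progress-reporting figures for a testing status report.

    scheduled       - scenarios scheduled for execution to date
    executed        - scenarios actually executed to date
    passed          - executed scenarios that passed
    total_projected - total scenarios projected for the test cycle
    """
    pct_complete = round(100.0 * executed / total_projected, 1) if total_projected else 0.0
    schedule_gap = scheduled - executed  # gap between scheduled and actual executions
    pass_rate = round(100.0 * passed / executed, 1) if executed else 0.0
    return {"pct_complete": pct_complete,
            "schedule_gap": schedule_gap,
            "pass_rate": pass_rate}

print(progress_metrics(scheduled=120, executed=100, passed=95, total_projected=400))
```

In practice these figures would be pulled from the test management tool (ALM, SharePoint tracker, etc.) rather than entered by hand.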

41
Management Reporting – Samples
Below is an illustrative depiction of key metrics that can be used in testing status reports.

• Plan vs Actual Test Scenario Execution – trend of scheduled vs. actual executions over time, highlighting the gap (Gap = Y)
• Tester Feedback – feedback score trends for Ease of System Navigation, Overall Response Time, Overall Quality of HRDSC Comms and Overall Service Experience, on a 5-point scale (1 – Needs Improvement, 2 – Fair, 3 – Good, 4 – Very Good, 5 – Excellent), with annotations for notable dips (e.g. “Dip in feedback observed due to ABC reason”)
• % Transaction Scenario Completion – breakdown of transaction scenarios by Not Started / In Progress / Completed
• Defects % according to Severity – breakdown of defects by Critical / Urgent / Medium / Minor, with a call-out listing the critical issues


42
Defect Severity Description

Critical – Target turnaround time: x hours
 Very severe: entire application, component or function will not work
 Client, system or environment is unavailable; no work-around available
 Severe data loss or corruption: data integrity issue related to security, confidentiality, legal or regulatory non-compliance
 Intermittent defects that result in any of the above are also classified as Critical

Urgent/High – Target turnaround time: y hours
 Significant: entire application, component or function will not work, but a work-around is available
 Corruption of a critical component
 Loss of a non-critical component
 Intermittent defects that result in any of the above are also classified as High

Medium – Target turnaround time: z days
 Result is not as expected: corruption of a non-critical component; a work-around is available
 Low impact to the end user or application
 Intermittent defects that result in any of the above are also classified as Medium

Minor/Low – Target turnaround time: a days
 Minor defect
 Some of the application operations are unexpected
 Intermittent defects with low impact to the business operations or end users

43
Close Cycle

44
Suggested Entry Criteria for Integration Testing
Below are sample entry criteria for Integration Testing. In addition to solution-related criteria, it is vital that the Deloitte Test
Lead works with the client to define any applicable criteria related to the availability of the testers for the duration of the
testing event.

Category Criteria

Test cycle scope has been reviewed and approved by all stakeholders

Functional and Technical design has been completed and approved for test cycle scope

Environment has been validated to be ready for testing

• Environment build (including patches) has been completed


• Functional configurations completed and validated in testing environment
• Security access for testers has been configured

Testing tool has been configured with tester access setup and test repository compiled for test cycle execution

Entry
All enablers and boundary systems are configured and connected to the testing environment

Pre-testing has been conducted and any open defects have a plan of action for resolution

All critical and blocker defects from prior test cycles are closed out

Any open major defects have a viable workaround with plan for resolution in the current test cycle

No impact from open defects from the preceding test cycles

45
Suggested Exit Criteria for Integration Testing
Below are sample exit criteria for Integration Testing. In addition to solution-related criteria, it is vital that the Deloitte Test
Lead works with the client to define any applicable criteria driven by the scope, the project timeline vis-à-vis critical business
activities (like quarter close) and any other project-specific items.

Category Criteria

All blocker and critical defects have been resolved and closed out

All test scenarios have been executed (100% execution)

Established pass rate (95%) for scoped test scenarios has been met (including re-test scenarios)
(Green/Blue >= 95% ; Yellow 90 – 95%; Red < 90%)

Defined SLA’s have been met via test results

Exit
Any open major defects have been reviewed and approved by the business teams for a viable workaround (documented and successfully tested)

All minor defects have been reviewed and validated as rightly categorized

All User Stories identified for the test cycle have been tested

Business users have signed off on the testing results

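The pass-rate thresholds in the criteria above (Green/Blue >= 95%, Yellow 90–95%, Red < 90%) can be computed mechanically from execution counts. A sketch; the helper name is ours, not part of any tool:

```python
def rag_status(executed, passed):
    """Classify a test cycle against the sample exit thresholds:
    Green/Blue >= 95%, Yellow 90-95%, Red < 90%."""
    if executed == 0:
        raise ValueError("no scenarios executed yet")
    pass_rate = 100.0 * passed / executed
    if pass_rate >= 95.0:
        status = "Green"
    elif pass_rate >= 90.0:
        status = "Yellow"
    else:
        status = "Red"
    return pass_rate, status
```

The thresholds themselves should be agreed with the client; the 95% figure here is only the sample value from the table.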

46
​Release Testing

47
Release Testing Overview

• Planning and Preparing


• Test Execution
• Support and Manage
• Close Cycle

48
Purpose – Release Testing

Goals and Purpose

• User Interface Acceptance Testing to understand new navigation/workflow
• Build repeatable and reusable test scenarios
• Report any issues to the Cloud vendor well in advance
• Ensure the release has no impact on business processes, integrations, reports and security
• Participating test teams to complete all testing within the defined testing window

49
Planning and Preparing

50
Planning and Preparing

​As the vendor releases newer versions of the software during the course of the project, the project team should do the
necessary due diligence to assess the impact of the release. Below is a sample process flow that can be leveraged for
determining the scope and size of Release testing. Additional considerations for planning purposes are:

• Importance of new functionality for go-live/ongoing business process

• Impact on user experience

• Global impact vs. regional impact

• Timing of the release relative to the project schedule or adoption of the current solution

Sample process flow:

1. Review the changes in the new release.
2. Determine whether the release impacts the project modules, impacts the current solution, or needs a new user story. If none of these apply, no release-specific action is needed (End).
3. If any apply, identify the impacted solution components, conduct a change impact assessment, make the necessary changes, and identify the test scenarios – including regression test scenarios to validate the current solution.
4. Conduct testing.
5. If the change impacts the timeline or resources, invoke the Project Change Control Process; otherwise, End.
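The decision flow above can be sketched as a pair of simple functions; the flags and action names are illustrative, not part of any project tooling:

```python
def assess_release(impacts_modules, impacts_current_solution, needs_new_user_story):
    """Return the planning actions implied by a vendor release.

    An empty list means the release does not touch the project scope,
    so no release-specific work is needed (End).
    """
    if not (impacts_modules or impacts_current_solution or needs_new_user_story):
        return []
    return [
        "Identify impacted solution components",
        "Conduct change impact assessment",
        "Make necessary changes",
        "Identify test scenarios (incl. regression scenarios for the current solution)",
        "Conduct testing",
    ]

def next_step_after_testing(change_impacts_timeline_or_resources):
    """Final decision in the flow: route to change control if needed."""
    if change_impacts_timeline_or_resources:
        return "Project Change Control Process"
    return "End"
```

Encoding the flow like this is mainly useful as a checklist; the actual assessment remains a judgment call made with the client.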

51
Planning and Preparing (continued)
After determining that the release impacts the scope and the solution, the project team needs to do a deeper dive to assess
the impact. Below is a sample set of considerations to evaluate further.

Area Consideration

• Does the current configuration work?


Functional/Module
• What changes, if any, are needed due to the new release?
Configuration
• Any additional project specific planning considerations

• Does the current security design and configuration work?


Security • What changes, if any, are needed due to the new release?
• Any additional project specific planning considerations

• Are there any changes that impact the interface fields or mapping?
Integration • What changes, if any, are needed due to the new release?
• Any additional project specific planning considerations

• Will the current reports still work?


Reports • What changes, if any, are needed due to the new release?
• Any additional project specific planning considerations

• Will the current data conversion routines still work?


Data Conversion • What changes, if any, are needed due to the new release?
• Any additional project specific planning considerations

52
Test Execution

53
Release Test Execution

​Based on the Release impact assessment, the project team may determine to follow one of two approaches for
Release test execution:

• Solution As-Is: the Client does not take on the new functionality

• New Functionality: the Client incorporates the new functionality into the solution

54
Release Test Execution – As-Is Solution

• The Client is not going to take on the new functionality

‒ As a result, the current solution should not be impacted

‒ However, some look-and-feel changes may need to be incorporated in the solution, as these will be the new defaults

​Test Execution:

• The implementation team will need to review the changes that are coming in as default settings and disable them

• Identify the sample set of test scenarios that can be processed to validate no impact

• Stage the test data needed for processing these scenarios

• Test the solution

• Report on the results

55
Release Testing Approach – With New Functionality

Familiarize
• Familiarize with the feature release
• Identify goals and purpose
• Familiarize with environment timelines and refresh schedules

Test Planning
• Identify roles and responsibilities
• Define testing timelines
• Create test plan

Understand Prerequisites
• Run the audit reports
• Prepare test cases and test data
• Prepare tenant management strategy

Identify Test Scope
• Identify test scope – Integration, Reports, Business process, Security

Perform Test
• Execute test cases
• Triage defects
• Create cut over tracker
• Educate regions if UI changes

Track & Deploy
• Report any issues to vendor
• Deploy the defect resolutions once the release is live as per the cut over tracker
• Close out testing

56
Support and Manage

57
The Support and Manage guidance for Release Testing – the rationale for testing tools on Cloud projects, HPE ALM, Kainos, UFT/UFT Pro, SharePoint-based test execution and defect tracking, the management reporting outlines and samples, and the defect severity definitions – is identical to that described in the Integration Testing section.
Close Cycle

69
Suggested Entry Criteria for Release Testing
Below are sample entry criteria for Release Testing. In addition to solution-related criteria, it is vital that the Deloitte Test
Lead works with the client to define any applicable criteria related to the availability of the testers for the duration of the
testing event.

Category Criteria

Test cycle scope has been reviewed and approved by all stakeholders

Environment has been validated to be ready for testing

• Environment build (including patches) has been completed


• Functional configurations completed and validated in testing environment
• Security access for testers has been configured
Entry

Testing tool has been configured with tester access setup and test repository compiled for test cycle execution

All enablers and boundary systems are configured and connected to the testing environment

Pre-testing has been conducted and any open defects have a plan of action for resolution

70
Suggested Exit Criteria for Release Testing
Below are sample exit criteria for Release Testing. In addition to solution-related criteria, it is vital that the Deloitte Test Lead
works with the client to define any applicable criteria driven by the scope, the project timeline vis-à-vis critical business activities
(like quarter close) and any other project-specific items.

Category Criteria

All blocker and critical defects have been resolved and closed out

All test scenarios have been executed (100% execution)

Established pass rate (95%) for scoped test scenarios has been met (including re-test scenarios)
(Green/Blue >= 95% ; Yellow 90 – 95%; Red < 90%)

Exit Any open major defects have been reviewed and approved by the business teams for a viable workaround (documented and successfully tested)

All minor defects have been reviewed and validated as rightly categorized

All User Stories identified for the test cycle have been tested

Business users have signed off on the testing results

71
​Performance Testing

72
Performance Testing Overview

• Planning and Preparing


• Test Execution
• Support and Manage
• Close Cycle

73
Purpose – Performance Testing

Definitions, Purpose and Expected Outputs

Performance Test
• Purpose: gauge the system performance as it relates to user experience
• Expected output: the ability of the system to react to user activities in a timely manner, e.g. users are able to enter orders within the application within the acceptable SLA/reasonable time

Volume Test
• Purpose: measure the system performance as it relates to its ability to handle data volumes
• Expected output: the capability of the system to handle data volume, e.g. the application can process 1000 invoices per day with normal business operations

Stress Test
• Purpose: understand the capability of the system as it relates to user load and concurrent processing
• Expected output: the break-point of the system for a particular range of user load or system load, e.g. the application can handle 1000 concurrent users performing day-to-day activities with scheduled jobs running in the background

Key Drivers – Performance Testing

The following outputs drive each test type (Performance, Volume and Stress):
• Established SLAs
• User stories (may be used partially for performance tests; TBD for volume and stress tests)
• Testing scope
• Testing tools
• Manual vs. automated approach
• Testers
• Support teams
• Entry/exit criteria
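A real stress test would drive the application through a load-generation tool rather than hand-written scripts. Purely to illustrate the concurrency pattern behind “N concurrent users”, a toy sketch using Python threads against a stand-in transaction (the sleep is a placeholder for a real request):

```python
import threading
import time

def simulate_user(user_id, results, lock):
    """Stand-in for one user's transaction; records its response time."""
    start = time.perf_counter()
    time.sleep(0.01)  # placeholder for a real request to the application
    elapsed = time.perf_counter() - start
    with lock:
        results.append((user_id, elapsed))

def run_stress_test(concurrent_users=50):
    """Launch the simulated users concurrently and collect their timings."""
    results, lock = [], threading.Lock()
    threads = [threading.Thread(target=simulate_user, args=(i, results, lock))
               for i in range(concurrent_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return results

timings = run_stress_test(50)
print(f"{len(timings)} simulated users completed")
```

The collected response times can then be checked against the SLAs established with the business teams.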

74
Planning and Preparing

75
Planning and Preparing

Test cycle parameters:
• Duration: amount of time for the test cycle
• Scope: number of entities to be tested
• Process Scope: number of modules/processes to be tested
• Resources: core/extended/support resources

Critical Success Factors
 Test the response of the system across multiple aspects (user experience, data volume and user volume)
 Utilize the business teams to document SLAs for confirming system performance
 Verify environment readiness and connectivity prior to performance testing

Objective: verify that the identified areas of the solution, application and infrastructure are able to withstand the rigors of an operational system and usage.

Readiness Aspects:
• SLAs for performance tests have been established
• Bulk data/user volume simulation has been prepared for the tests
• The test environment is stable and ready for execution
• Any automated tools and testing tools have been set up and are ready for test runs

Key Activities:
• Core Team: conduct test event kickoff; conduct daily status calls; set up the test environment
• Extended (Business) Team: verify performance tests; provide requirements and test cases; assist in confirming performance tuning changes
• Support Teams: perform performance tuning; confirm issues are resolved/support defect resolution; conduct associated sizing and infrastructure changes

76
Test Execution

77
Performance Test Execution

• In the Cloud ERP landscape, performance tuning and sizing are addressed primarily by the vendor

• The next section illustrates the more traditional ways of performance testing used in existing ERP applications

• As Cloud projects scale across complex global implementations, there is the possibility of a broader and more open
performance testing model

78
Support and Manage

79
Cloud is supposed to be intuitive, what is the value of testing tools on Cloud projects?

• Cloud solutions across vendors are aimed at making business processes more intuitive for the end user, among other
things. However, this does not preclude the project teams for doing thorough testing of complex solutions.

• The testing tools, as a result, are intended to provide structure, rigor and add detail to the testing process. The specific
tools mentioned in subsequent slides are more commonly used on projects, and in some cases, recommended for the
project.

• HPE Application Lifecycle Management (ALM), in collaboration with Agile Manager (AgM), lends itself to translate user
stories defined in the earlier stages of the project into actionable test scenarios that can be executed.

80
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Test Management Tool – HPE ALM
ALM is a web based tool that helps to manage the application lifecycle from project planning, requirements gathering, until
testing and deployment

Access:
Click here to access CMT site including materials on ALM and Sprinter (add-on tool)

Once your access is setup, launch Application Lifecycle Management. Additionally, you can download and install add-ins for business views or
Excel. For the Sprinter add-in, please contact your project administrator for installation and user guide files.

81
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Test Management Tool - HPE ALM
Below are a few Frequently Asked Questions regarding ALM

Who can use the capability? Can it be used outside Deloitte How can I determine the cost of using ALM for my project?
Network
Projects will not incur any costs for use of the tool.
Deloitte Consulting personnel, clients and contractors can use ALM. Yes, it can
be used outside Deloitte Network and no VPN is required

How can we train the project team on ALM and Sprinter? What tools can ALM integrate with? To what extent can this tool
be used?
The HPE Adoption Readiness Tool (ART) provides end-user training for most of
HPE software. It can be accessed remotely, which allows for self-paced learning ALM is currently integrated with Agile Manager (AgM), Performance Center
for Deloitte practitioners. The training is designed to be interactive including (PC), Unified Functional Testing (UFT), and Sprinter. Additionally, via the ALM
simulation exercises and knowledge checks while also gaining an understanding Synchronizer, projects can synchronize ALM defects with Atlassian Jira and
of the full product functionality Microsoft Team Foundation Server.

How complex is ALM? Is there a learning curve or high project What happens to the ALM project and data after the Deloitte
overhead cost involved due to the complexity of using ALM? engagement is complete?
ALM is a very intuitive tool. After initial setup, there is a very low overhead for Projects and users will be deactivated at the completion of the Deloitte
ongoing maintenance and support. Each project assigns a Deloitte ALM project engagement. The project will remain on the Deloitte ALM server for 7 years and
administrator to perform initial project set up tasks, facilitate project training can be recovered, if necessary. To learn more about data handover to the client
and provide support. prior to engagement completion, read the ​ALM client project handover guide.

82
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – Kainos (Workday-specific)

Automated Testing Tool - Kainos


• Kainos SMART is a cloud-based automated testing tool.

• With Kainos SMART, repeatable tests can be created and run against Workday HCM, Financials and Security.

Visit https://www.kainosworksmart.com/products for more details

File Comparison Tools


• Files can be compared manually, but doing so may require significant time and effort.

• Manual comparison also introduces scope for human error and overlooked differences.

• Tools/applications are available to compare two files and present the differences between them. Popular options include:

‒ Beyond Compare

‒ Notepad ++

• Refer to the Appendix to learn about the file comparison options available.

83
Copyright © 2017 Deloitte Development LLC. All rights reserved.
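As a minimal sketch of what file-comparison tools such as Beyond Compare automate, two output files can also be diffed with Python's standard `difflib` module. The sample paycheck lines below are illustrative stand-ins, not real project data:

```python
import difflib

def diff_files(path_a: str, path_b: str) -> list[str]:
    """Return unified-diff lines between two text files (empty list = identical)."""
    with open(path_a) as fa, open(path_b) as fb:
        a_lines, b_lines = fa.readlines(), fb.readlines()
    return list(difflib.unified_diff(a_lines, b_lines, fromfile=path_a, tofile=path_b))

# In-memory sample content standing in for two tenant exports:
legacy = ["EMP001,GROSS,5000.00\n", "EMP001,TAX,1250.00\n"]
cloud = ["EMP001,GROSS,5000.00\n", "EMP001,TAX,1250.10\n"]
delta = list(difflib.unified_diff(legacy, cloud, fromfile="legacy", tofile="cloud"))
for line in delta:
    print(line, end="")  # shows only the TAX line as changed
```

An empty diff confirms the two extracts match; any output is a variance to investigate.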
Testing Tools - Unified Functional Testing (UFT) and UFT Pro (LeanFT)
Our tools
• Unified Functional Testing (UFT) is an advanced tool for functional and regression test automation.
• UFT Pro (formerly LeanFT) is a powerful and lightweight functional testing tool built specifically for continuous testing and continuous integration. UFT Pro is used to create test automation in developer integrated development environments (IDEs). This solution delivers a new standard in continuous delivery and test automation for Agile Project Management and DevOps teams.
• UFT continues as the overall market share leader and has worked to expand its coverage.

UFT capabilities
• Tests application UI and API layers
• Tests a variety of applications using Add-ins
• Creates repeatable processes
• Integrates with Application Lifecycle Management (ALM)
UFT's solution supports test functionality across multiple application layers, such as the front-end GUI layer and back-end service layers. Our capability increases efficiency and speed of delivery with lower overall effort.

UFT Pro capabilities
• Supports "shift left" initiatives for earlier testing
• Helps simplify the process of building robust, stable tests
• Supports the most popular technologies & development languages
84
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Using UFT/UFT Pro (LeanFT)
​UFT and UFT Pro require specialized skillsets to deliver test automation services

Global availability
Advanced testing tools are currently available globally for Deloitte Consulting practitioners and clients.

No cost
Enterprise licensing is currently available at no additional cost for the duration of the Deloitte engagement.

VBScript experience required for UFT
Projects using UFT require an experienced Deloitte resource with VBScript programming experience staffed on the project to deliver test automation services.

Java or C# experience required for UFT Pro
Projects using UFT Pro require an experienced Deloitte resource with Java or C# programming experience staffed on the project to deliver test automation services.

To get started, take advantage of the UFT Quick Start Guide, the UFT Pro Quick Start Guide, and the Project Ramp Up Guide. Visit the Help Center and the UFT/UFT Pro resource center for additional learning opportunities.

85
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – SharePoint (For Test Execution)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.
1. Go to "Scenario Execution Tool"
2. Filter on "Assigned to"
3. Click "View Entries" to view details of the scenario

86
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – SharePoint (For Defect Management)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.

1. Click 'New' to log a defect
2. A 'New Item' form will appear
3. Fill in the required information and click OK

87
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Management Reporting – Outlines
Below is a list of sample parameters that should be reported for each testing phase. The testing team must work with the Client Testing Lead to define the key parameters based on the stakeholders; refer to the governance framework for an illustrative example of the metrics to report based on the audience of the report.

Progress Reporting
• Unique number of test scenarios or user stories in each process
• % completion for unique transactions
• Gap between the number of scheduled vs. actual test scenarios executed
• Total projected scenarios to be executed

Quality Reporting
• Feedback scores of the testers
• Trends in the feedback scores

Defect Reporting
• Defect reporting by criticality
• Defect reporting by process
• Defects closure status
• Metrics – defect aging, defect trend analysis, etc.

Risks Reporting
Potential risks in the testing project due to any of the following:
a) Critical/High priority defects
b) Deviation from the test strategy, processes, etc.
c) Process/Solution change requests

88
Copyright © 2017 Deloitte Development LLC. All rights reserved.
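The defect aging metric listed under Defect Reporting can be sketched as follows. The record layout (`opened`, `closed`, `severity`) is an illustrative assumption, not a prescribed schema:

```python
from datetime import date
from typing import Optional

def defect_age_days(opened: date, closed: Optional[date], as_of: date) -> int:
    """Age of a defect in days; closed defects stop aging at closure."""
    end = closed if closed is not None else as_of
    return (end - opened).days

# Sample defect log (illustrative data only):
defects = [
    {"id": "D-1", "severity": "Critical", "opened": date(2017, 3, 1), "closed": None},
    {"id": "D-2", "severity": "Medium", "opened": date(2017, 3, 5), "closed": date(2017, 3, 8)},
]
as_of = date(2017, 3, 10)
ages = {d["id"]: defect_age_days(d["opened"], d["closed"], as_of) for d in defects}
print(ages)  # D-1 is still open and aging; D-2 stopped aging at closure
```

The same per-defect ages can then be bucketed (e.g. 0-3, 4-7, 8+ days) to produce the aging chart in the status report.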
Management Reporting – Samples
Below is a sample of key metrics that can be used in testing status reports.

[Chart: Plan vs Actual Test Scenario Execution – planned vs. actual scenarios executed over time, with the gap (Gap = Y) highlighted]

[Chart: Tester Feedback – scores for Ease of System Navigation, Overall Response Time, Overall Quality of HRDSC Comms and Overall Service Experience, on a scale of 1 (Needs Improvement), 2 (Fair), 3 (Good), 4 (Very Good), 5 (Excellent), with dips annotated with their causes]

[Chart: % Transaction Scenario Completion – Not Started / In Progress / Completed]

[Chart: Defects % according to Severity – Critical / Urgent / Medium / Minor, with a breakup of the critical issues]


89
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Defect Severity Description

Critical (target turnaround time: x hours)
• Very severe: entire application, component, or function will not work
• Client, system or environment is unavailable; no work-around available
• Severe data loss or corruption: data integrity issue related to security, confidentiality, legal, or regulatory non-compliance
• Intermittent defects that result in any of the above are also classified as Critical

Urgent/High (target turnaround time: y hours)
• Significant: entire application, component or function will not work, but a work-around is available
• Corruption of a critical component
• Loss of a non-critical component
• Intermittent defects that result in any of the above are also classified as High

Medium (target turnaround time: z days)
• Result is not as expected: corruption of a non-critical component; a work-around is available
• Low impact to the end user or application
• Intermittent defects that result in any of the above are also classified as Medium

Minor/Low (target turnaround time: a days)
• Minor defect
• Some of the application operations are unexpected
• Intermittent defects with low impact to the business operations or end users

90
Copyright © 2017 Deloitte Development LLC. All rights reserved.
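Once the project agrees on concrete values for the x/y/z/a turnaround targets above, a turnaround breach check can be sketched like this. The sample targets are purely illustrative and do not come from this playbook:

```python
# Illustrative SLA targets in hours; real values are the project's agreed
# x/y/z/a turnaround times, which this playbook leaves as placeholders.
TARGETS_HOURS = {"Critical": 4, "Urgent/High": 24, "Medium": 72, "Minor/Low": 120}

def is_breached(severity: str, open_hours: float) -> bool:
    """True if a defect has been open longer than its target turnaround time."""
    return open_hours > TARGETS_HOURS[severity]

print(is_breached("Critical", 6))   # open 6h against a 4h target
print(is_breached("Medium", 10))    # open 10h against a 72h target
```

A check like this, run over the defect log, feeds directly into the risks section of the status report (open Critical/High defects past target).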
Testing Tools – Performance (HPE PC)
Performance Center is a web-based, globally accessible tool that helps design, execute, and interpret the results of performance tests in an effort to validate performance requirements and ensure system stability.

Phases: Project Setup → Release Management → Requirement Management → Performance Test Planning → Performance Test Execution → Defect Tracking, with Monitor Progress and Quality running across all phases.

Access:
Request access to the tool by visiting the "Access the tool" section on the Performance Center website, for training/demo or production needs. Once your access is set up, launch Performance Center.

Reference Material:
Access the Getting Started Guide here to learn about what happens in each phase, review training material, and access informational videos.
91
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – Performance (HPE PC)
Below are a few Frequently Asked Questions regarding Performance Center

Who can use the capability? Can it be used outside the Deloitte Network?
Deloitte Consulting personnel and contractors can use Performance Center. Yes, it can be used outside the Deloitte Network, and no VPN is required.

How can I determine the cost of using Performance Center for my project?
Projects will not incur any costs for use of the tool; however, it is necessary to staff experienced performance testing resources, and there may also be costs associated with required infrastructure that must be set up by the project.

How can we train the project team on PC?
The HPE Adoption Readiness Tool (ART) provides end-user training for most HPE software. It can be accessed remotely, which allows for self-paced learning for Deloitte practitioners. The training is designed to be interactive, including simulation exercises and knowledge checks, while also building an understanding of the full product functionality. There are additional learning materials and videos available on Deloitte's Performance Center website which have been developed to assist projects in getting started with the tool.

What tools can PC integrate with? To what extent can this tool be used?
PC is currently integrated with Application Lifecycle Management (ALM), Network Virtualization (NV), Mobile Center (MC), and Unified Functional Testing (UFT).

How complex is PC? Is there a learning curve or high project overhead cost involved due to the complexity of using PC?
Due to the nature of scripting and performance remediation, PC does require experienced resources to effectively utilize the tool. Resources should have experience with load testing, preferably using Performance Center or LoadRunner, and experience with the underlying technology of the application being tested (web/HTTP, SAP, Oracle, etc.). CMT can facilitate initial training calls with the project team to familiarize them with the environment.

What happens to the PC project and data after the Deloitte engagement is complete?
Projects and users will be deactivated at the completion of the Deloitte engagement. The project will remain on the Deloitte ALM+PC server for 7 years and can be recovered, if necessary. To learn more about data handover options to the client prior to engagement completion, read the PC client project handover guide.

92
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Close Cycle

93
Suggested Entry Criteria for Performance Testing

Category: Testing Cycle (Entry)
All critical and blocker defects from prior test cycles are closed out

Any open major defects have a viable workaround with plan for resolution in the current test cycle

No impact from open defects from the preceding test cycles

Test cycle scope has been reviewed and approved by all stakeholders

Functional and Technical design has been completed and approved for test cycle scope
Environment has been validated to be ready for testing

• Instance build (including patches) has been completed


• Functional configurations completed and validated in testing environment
• Security access for testers has been configured

Testing tool has been configured with tester access setup and test repository compiled for test cycle execution

All enablers and boundary systems are configured and connected to the testing environment

Pre-testing has been conducted and any open defects have a plan of action for resolution

94
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Suggested Exit Criteria for Performance Testing

Category: Testing Cycle (Exit)
All blocker and critical defects have been resolved and closed out

All test scenarios have been executed (100% execution)

Established pass rate (95%) for scoped test scenarios has been met (including re-test scenarios)
(Green/Blue >= 95% ; Yellow 90 – 95%; Red < 90%)

Defined SLAs have been met per the test results

Any open major defects have been reviewed and approved by the business teams for a viable workaround (documented and successfully tested)

All minor defects have been reviewed and validated as rightly categorized

All User Stories identified for the test cycle have been tested

Business users have signed off on the testing results

95
Copyright © 2017 Deloitte Development LLC. All rights reserved.
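The pass-rate thresholds in the exit criteria above (Green/Blue >= 95%, Yellow 90-95%, Red < 90%) can be expressed as a simple classification; the function name is our own, not part of any tool:

```python
def rag_status(executed: int, passed: int) -> str:
    """Classify a test cycle against the exit-criteria pass-rate thresholds:
    Green >= 95%, Yellow 90-95%, Red < 90%."""
    if executed == 0:
        raise ValueError("no scenarios executed yet")
    rate = 100.0 * passed / executed
    if rate >= 95.0:
        return "Green"
    if rate >= 90.0:
        return "Yellow"
    return "Red"

print(rag_status(200, 192))  # 96.0% pass rate
print(rag_status(200, 185))  # 92.5% pass rate
print(rag_status(200, 170))  # 85.0% pass rate
```

Re-test scenarios count toward both `executed` and `passed`, per the criteria's note that re-tests are in scope.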
​Payroll Compare Testing

96
Payroll Compare Testing Overview

• Planning and Preparing


• Test Execution
• Support and Manage
• Close Cycle

97
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Purpose – Payroll Compare Testing

Execution of tests in the test and production environments, and comparison of the outputs after processing is complete.

Example Use Case – Payroll Compare Testing

Execution of prior payroll cycles, comparing the results to the Client's legacy payroll results, to validate that the Cloud System payroll results are acceptable.

• Compare Client legacy payroll results to Cloud System payroll results and validate that the results are acceptable.
• Validate same period processing in Cloud System and compare it against a calculated payroll in the Legacy System
• The project team will validate payroll results

98
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Planning and Preparing

99
Planning and Preparing

Goals of PCT
• Compare Client legacy payroll results to Cloud System payroll results and validate that the results are acceptable
• Validate same-period processing in the Cloud System and compare it against a calculated payroll in the Legacy System
• The project team will validate payroll results

Scope of PCT
• Verify data conversion mapping, data clean-up, cutover/ready-room procedures
• Validate systems configuration
• Simulate payroll jobs and reconciliation
• Validate the system's known differences
• Attain processing results that meet project expectations and acceptable outcomes
• Confirm readiness to move to production
PCT is …
• New System testing against a previously run payroll for comparison
• A simulation of a legacy payroll run, looking for differences
• An opportunity for performance tests on a large set of data
• Running all integrations is desired, but not a hard requirement
• Inclusive of the entire worker population

PCT is not …
• Two systems running at the same time to compare output
• An assumption that legacy is always correct
• Intended only to resolve defects
• An opportunity to re-visit the design
• Another HCM testing instance

100
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Test Execution

101
Payroll Compare Test Execution
• Must consist of at least two pay cycles
• The first cycle is essentially a "shakedown", as there may be data errors
• Two cycles generally provide enough information to determine whether the new system accurately calculates pay
• Keep the inputs and outputs of a third legacy pay cycle in reserve as a contingency

Payroll Compare Dates
• Current paycheck details (earnings, deductions and taxes) for two pay periods
• Pay Cycle One – Provide Date Range <<Examples Below/Update as Needed>>
– Hourly: 1/29/17 to 2/4/17 (Checkdate 2/10/17)
– Semi-Monthly: 2/1/17 to 2/15/17 (Checkdate 2/15/17)
• Pay Cycle Two
– Hourly: 2/5/17 to 2/11/17 (Checkdate 2/17/17)
– Semi-Monthly: 2/16/17 to 2/28/17 (Checkdate 2/28/17)

Key Considerations
• Where differences exist outside of tolerance levels, they will be explained or the configuration adjusted
• HR data is planned to be as of XX/XX/XXX
• History payments will be loaded as of the XX/XX/XXX payroll
• Any activity or inputs needed for both pay periods to be added or ruled out
• Bonus will be tested as part of Pay Cycle Two (How will this occur, since the Bonus file will not be available until early MM YYYY?)
• For Cycle Two, all transactions that can impact a worker's gross-to-net calculation must be tracked, such as:
– HCM staffing transactions – compensation changes, hires, terminations, and changes of address that will impact state, local, school or county taxes
– Absence – time off requests, time off corrections, leave of absence
– Time Entry – to capture hours worked, by period and hours type

102
Copyright © 2017 Deloitte Development LLC. All rights reserved.
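The legacy-to-cloud comparison described above can be sketched as a per-worker tolerance check. The $0.05 tolerance and the pay-component codes below are illustrative assumptions, not project standards:

```python
TOLERANCE = 0.05  # illustrative per-amount tolerance in dollars

def compare_pay(legacy: dict, cloud: dict) -> list:
    """Return variance messages for one worker's paycheck components;
    an empty list means every amount is within tolerance."""
    issues = []
    for code in sorted(set(legacy) | set(cloud)):
        old, new = legacy.get(code, 0.0), cloud.get(code, 0.0)
        if abs(old - new) > TOLERANCE:
            issues.append(f"{code}: legacy {old:.2f} vs cloud {new:.2f}")
    return issues

# Sample paycheck for one worker (illustrative amounts):
legacy = {"GROSS": 3200.00, "FED_TAX": 640.00, "NET": 2410.00}
cloud = {"GROSS": 3200.00, "FED_TAX": 640.02, "NET": 2409.88}
variances = compare_pay(legacy, cloud)
print(variances)  # only NET exceeds the 0.05 tolerance
```

Run over the full worker population for both pay cycles, variances like these become the "explained or configuration adjusted" list from the key considerations above.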
Support and Manage

103
Cloud is supposed to be intuitive, what is the value of testing tools on Cloud projects?

• Cloud solutions across vendors are aimed at making business processes more intuitive for the end user, among other things. However, this does not remove the need for project teams to thoroughly test complex solutions.

• The testing tools, as a result, are intended to provide structure, rigor and add detail to the testing process. The specific
tools mentioned in subsequent slides are more commonly used on projects, and in some cases, recommended for the
project.

• HPE Application Lifecycle Management (ALM), in collaboration with Agile Manager (AgM), lends itself to translating user stories defined in the earlier stages of the project into actionable test scenarios that can be executed.

104
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Test Management Tool – HPE ALM
ALM is a web-based tool that helps manage the application lifecycle from project planning and requirements gathering through testing and deployment.

Access:
Click here to access CMT site including materials on ALM and Sprinter (add-on tool)

Once your access is setup, launch Application Lifecycle Management. Additionally, you can download and install add-ins for business views or
Excel. For the Sprinter add-in, please contact your project administrator for installation and user guide files.

105
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Test Management Tool - HPE ALM
Below are a few Frequently Asked Questions regarding ALM

Who can use the capability? Can it be used outside the Deloitte Network?
Deloitte Consulting personnel, clients and contractors can use ALM. Yes, it can be used outside the Deloitte Network, and no VPN is required.

How can I determine the cost of using ALM for my project?
Projects will not incur any costs for use of the tool.

How can we train the project team on ALM and Sprinter?
The HPE Adoption Readiness Tool (ART) provides end-user training for most HPE software. It can be accessed remotely, which allows for self-paced learning for Deloitte practitioners. The training is designed to be interactive, including simulation exercises and knowledge checks, while also building an understanding of the full product functionality.

What tools can ALM integrate with? To what extent can this tool be used?
ALM is currently integrated with Agile Manager (AgM), Performance Center (PC), Unified Functional Testing (UFT), and Sprinter. Additionally, via the ALM Synchronizer, projects can synchronize ALM defects with Atlassian Jira and Microsoft Team Foundation Server.

How complex is ALM? Is there a learning curve or high project overhead cost involved due to the complexity of using ALM?
ALM is a very intuitive tool. After initial setup, there is very low overhead for ongoing maintenance and support. Each project assigns a Deloitte ALM project administrator to perform initial project setup tasks, facilitate project training and provide support.

What happens to the ALM project and data after the Deloitte engagement is complete?
Projects and users will be deactivated at the completion of the Deloitte engagement. The project will remain on the Deloitte ALM server for 7 years and can be recovered, if necessary. To learn more about data handover to the client prior to engagement completion, read the ALM client project handover guide.

106
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – Kainos (Workday-specific)

Automated Testing Tool - Kainos


• Kainos SMART is a cloud-based automated testing tool.

• With Kainos SMART, repeatable tests can be created and run against Workday HCM, Financials and Security.

Visit https://www.kainosworksmart.com/products for more details

File Comparison Tools


• Files can be compared manually, but doing so may require significant time and effort.

• Manual comparison also introduces scope for human error and overlooked differences.

• Tools/applications are available to compare two files and present the differences between them. Popular options include:

‒ Beyond Compare

‒ Notepad ++

• Refer to the Appendix to learn about the file comparison options available.

107
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools - Unified Functional Testing (UFT) and UFT Pro (LeanFT)
Our tools
• Unified Functional Testing (UFT) is an advanced tool for functional and regression test automation.
• UFT Pro (formerly LeanFT) is a powerful and lightweight functional testing tool built specifically for continuous testing and continuous integration. UFT Pro is used to create test automation in developer integrated development environments (IDEs). This solution delivers a new standard in continuous delivery and test automation for Agile Project Management and DevOps teams.
• UFT continues as the overall market share leader and has worked to expand its coverage.

UFT capabilities
• Tests application UI and API layers
• Tests a variety of applications using Add-ins
• Creates repeatable processes
• Integrates with Application Lifecycle Management (ALM)
UFT's solution supports test functionality across multiple application layers, such as the front-end GUI layer and back-end service layers. Our capability increases efficiency and speed of delivery with lower overall effort.

UFT Pro capabilities
• Supports "shift left" initiatives for earlier testing
• Helps simplify the process of building robust, stable tests
• Supports the most popular technologies & development languages
108
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Using UFT/UFT Pro (LeanFT)
​UFT and UFT Pro require specialized skillsets to deliver test automation services

Global availability
Advanced testing tools are currently available globally for Deloitte Consulting practitioners and clients.

No cost
Enterprise licensing is currently available at no additional cost for the duration of the Deloitte engagement.

VBScript experience required for UFT
Projects using UFT require an experienced Deloitte resource with VBScript programming experience staffed on the project to deliver test automation services.

Java or C# experience required for UFT Pro
Projects using UFT Pro require an experienced Deloitte resource with Java or C# programming experience staffed on the project to deliver test automation services.

To get started, take advantage of the UFT Quick Start Guide, the UFT Pro Quick Start Guide, and the Project Ramp Up Guide. Visit the Help Center and the UFT/UFT Pro resource center for additional learning opportunities.

109
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – SharePoint (For Test Execution)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.
1. Go to "Scenario Execution Tool"
2. Filter on "Assigned to"
3. Click "View Entries" to view details of the scenario

110
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Testing Tools – SharePoint (For Defect Management)
For clients who do not have a test management tool, MS Excel/SharePoint are used to track test scenario/user story
execution and manage defects.

1. Click 'New' to log a defect
2. A 'New Item' form will appear
3. Fill in the required information and click OK

111
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Management Reporting – Outlines
Below is a list of sample parameters that should be reported for each testing phase. The testing team must work with the Client Testing Lead to define the key parameters based on the stakeholders; refer to the governance framework for an illustrative example of the metrics to report based on the audience of the report.

Progress Reporting
• Unique number of test scenarios or user stories in each process
• % completion for unique transactions
• Gap between the number of scheduled vs. actual test scenarios executed
• Total projected scenarios to be executed

Quality Reporting
• Feedback scores of the testers
• Trends in the feedback scores

Defect Reporting
• Defect reporting by criticality
• Defect reporting by process
• Defects closure status
• Metrics – defect aging, defect trend analysis, etc.

Risks Reporting
Potential risks in the testing project due to any of the following:
a) Critical/High priority defects
b) Deviation from the test strategy, processes, etc.
c) Process/Solution change requests

112
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Management Reporting – Samples
Below is a sample of key metrics that can be used in testing status reports.

[Chart: Plan vs Actual Test Scenario Execution – planned vs. actual scenarios executed over time, with the gap (Gap = Y) highlighted]

[Chart: Tester Feedback – scores for Ease of System Navigation, Overall Response Time, Overall Quality of HRDSC Comms and Overall Service Experience, on a scale of 1 (Needs Improvement), 2 (Fair), 3 (Good), 4 (Very Good), 5 (Excellent), with dips annotated with their causes]

[Chart: % Transaction Scenario Completion – Not Started / In Progress / Completed]

[Chart: Defects % according to Severity – Critical / Urgent / Medium / Minor, with a breakup of the critical issues]


113
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Defect Severity Description

Critical (target turnaround time: x hours)
• Very severe: entire application, component, or function will not work
• Client, system or environment is unavailable; no work-around available
• Severe data loss or corruption: data integrity issue related to security, confidentiality, legal, or regulatory non-compliance
• Intermittent defects that result in any of the above are also classified as Critical

Urgent/High (target turnaround time: y hours)
• Significant: entire application, component or function will not work, but a work-around is available
• Corruption of a critical component
• Loss of a non-critical component
• Intermittent defects that result in any of the above are also classified as High

Medium (target turnaround time: z days)
• Result is not as expected: corruption of a non-critical component; a work-around is available
• Low impact to the end user or application
• Intermittent defects that result in any of the above are also classified as Medium

Minor/Low (target turnaround time: a days)
• Minor defect
• Some of the application operations are unexpected
• Intermittent defects with low impact to the business operations or end users

114
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Close Cycle

115
Suggested Entry Criteria for Payroll Compare Testing

Entry criteria and owners (Due Date and Status are tracked per project):

1. Payroll Compare Test Plan developed and approved (Owner: Test Lead/Functional Lead)
2. Payroll Compare scope, resources, approach, responsibilities and schedule are confirmed with and communicated to all relevant participants (Owner: Test Lead)
3. The Payroll Compare Test instance is built, scheduled and configured for access by the testers (Owner: Deloitte Test Lead)
4. Integrations are migrated (Owner: Deloitte Integration Lead)
5. Payroll Compare defect prioritizing, tracking and reporting procedures and tools are in place (Owner: Test Lead)
6. Payroll Compare test scenarios are developed, approved, loaded and ready for execution (Owner: Client Functional Lead)
7. Dedicated technical support available during testing (Owner: Functional/Integration Leads)
8. Payroll Compare exit criteria have been agreed upon (Owner: Test Lead)
9. PACT/Compare Edge tool attached to Payroll Compare instance (Owner: Deloitte Functional Lead)
10. History data representing balances at the start of the cycle (Owner: Deloitte Functional Lead)
11. Transactions or inputs from the Legacy System agreed upon and included (Owner: Client Functional Lead)
12. P3 Build complete and validation completed (Owner: Client and Deloitte Functional Lead)

116
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Suggested Exit Criteria for Payroll Compare Testing

Exit criteria and owners (Due Date and Status are tracked per project):

1. All identified testing scenarios/scripts complete (Owner: Functional and Technical Leads)
2. Payroll Compare test scenarios have been executed and actual results documented (Owner: Test Lead)
3. High defects have been successfully resolved, re-tested, and closed (Owner: Functional and Technical Leads)
4. Medium and Low defects have been resolved, re-tested, and closed, or an acceptable workaround has been defined and approved (Owner: Functional and Technical Leads)
5. An agreed-upon plan is in place to resolve any remaining defects and issues (including open defects) (Owner: Test Lead)
6. Results of all scenarios are acceptable or categorized as exceptions with corresponding action plans and approvals (Owner: Test Lead)
7. All variances meet accepted tolerance levels or can be explained (Owner: Functional Lead)

117
Copyright © 2017 Deloitte Development LLC. All rights reserved.
​Appendix

118
Release Test Execution –
With New Functionality

119
Integration Test Scope Identification

​With the Release, the integrations’ behaviors may get impacted.

​The changes impacting the integrations may include:

Changes to the security configurations
• Introduction/deprecation of security policies
• Updates to the security policies/domains being used by Integration System users/Integration reports

Changes to Integration Techniques
• Web service WSDL updates
• Deprecation of web services
• Introduction of new services
• Updates to the existing integration templates, etc.

Changes to the fields
• Deprecation of data sources
• Deprecation of fields
• Introduction of new data sources (indexed)

Changes to calculated fields' behavior
• Introduction of new calculated field functions
• Processing logic changes to the calculated field functions

120
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Custom Report Test Scope Identification

The changes impacting the Custom Reports may include:

• Changes in the domain configurations


• Changes to the data source (example, introduction of a new prompt, or deprecation of the Data source)
• Changes to the fields used by the Custom Report
• Changes to the calculated fields behaviors, etc.
[Diagram: Custom Reports are either Standalone (critical, e.g. Payroll/Audit/Benefits, or non-critical) or Part of an Integration]

• Integration Custom reports will be tested as part of integration testing


• Critical standalone Custom reports are identified for regression testing

121
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Business Process Test Scope Identification

• The Feature Release may introduce some changes to the Business Processes. These changes may or may not impact the
functionality, but will require testing.

• Examples – introduction of eSign by Adobe, Custom Notifications, etc.

• Critical Business Processes are identified for regression testing for every Feature release.

• Additional BPs can be tested based on bandwidth and availability.

122
Copyright © 2017 Deloitte Development LLC. All rights reserved.
Perform Test - Integration

Outbound Integration Inbound Integration Critical Integration

• Run the integration in both the testing • Use the input file from last successful run in • The critical integrations, like the Payroll
tenants (Sandbox and Preview), with the Production for testing. integrations requires mandatory additional
same launch parameters if any. testing at the vendor level.
• Run the inbound integration in both the
• Compare the output files generated from testing tenants (Sandbox and Preview) using • Send the files generated from the Payroll
the integration on both the tenants. testing SFTP. integrations in the Sandbox/Implementation
Since the tenants were locked down Preview tenants to the Payroll Vendor, so
from any further activities/transactions, • Verify the integration functionality in both the they can load the files to their test
the output files from both the tenants tenants (in UI, example, updating the Work environment and test the files.
should be match. Email address) and verify no impacts of the
feature release.

• Also, compare any logs generated.
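The output-file comparison can be scripted. The sketch below uses Python's standard `difflib` and assumes both tenants' outputs have been downloaded as plain-text files:

```python
import difflib

def compare_tenant_outputs(sandbox_path: str, preview_path: str) -> list[str]:
    """Return a unified diff of the two tenants' output files.

    An empty list means the files are identical, which is the expected
    result when both tenants were locked down before the release.
    """
    with open(sandbox_path, encoding="utf-8") as f:
        sandbox = f.readlines()
    with open(preview_path, encoding="utf-8") as f:
        preview = f.readlines()
    return list(difflib.unified_diff(
        sandbox, preview, fromfile="sandbox", tofile="preview"))
```

Any non-empty diff should be investigated as a potential release impact before being logged as a defect.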

Perform Test – Custom Report

Run
• Run the Custom Report in both testing tenants (Sandbox tenant and Sandbox/Implementation Preview tenant), with the same launch parameters, if any.

Extract
• Extract the output of the Custom Report from both tenants to Excel.

Compare
• Compare the Excel files from both tenants.

• Since the tenants were locked down from any further activities/transactions, the output files from both tenants should match.
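A minimal sketch of the compare step, assuming the two extracts have been saved as CSV (for native `.xlsx` files, a library such as openpyxl would be needed):

```python
import csv

def compare_report_extracts(sandbox_csv: str, preview_csv: str) -> list[tuple]:
    """Return the rows that differ between the two extracts.

    Each entry is (row number, sandbox row, preview row); an extra entry
    flags any mismatch in total row counts. An empty list means the
    extracts match, which is the expected result.
    """
    with open(sandbox_csv, newline="", encoding="utf-8") as f:
        sandbox_rows = list(csv.reader(f))
    with open(preview_csv, newline="", encoding="utf-8") as f:
        preview_rows = list(csv.reader(f))
    diffs = []
    for i, (s, p) in enumerate(zip(sandbox_rows, preview_rows), start=1):
        if s != p:
            diffs.append((i, s, p))
    if len(sandbox_rows) != len(preview_rows):
        diffs.append(("row count", len(sandbox_rows), len(preview_rows)))
    return diffs
```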

Perform Test – Business Process

Business Process
• Enter business process transactions in the testing tenant (Sandbox Preview tenant).

• Complete/approve all the steps using Start and Stop Proxy (or using the login credentials of the approvers) to complete each transaction and confirm there is no impact from the release.

• Customer-critical conditions, if any, need to be tested thoroughly.

• Regional testers initiate business process transactions on workers.

• Conditional rules specific to a location, if any, should be tested by regional testers.

** Regional teams may not be applicable for non-Global clients

Perform Test - Security

Security Testing

• Security changes require testing of the security policies and workers'/managers' access to various tasks.

• A few workers are selected from each Region. Tests should be performed to verify their access is retained and there are no differences in the security policies.

• A high-level round of testing should also be performed by the regional testers to ensure they have access to the everyday tasks they will be performing in the tenants.

Steps

• Randomly select a few workers from each Region and each type of security access.

• Proxy in the Sandbox Preview tenant (or log in with credentials) as the test data worker.

• Initiate/complete the task being tested in the test tenant.

• Verify that the worker has the same level of access to the task as in the Sandbox tenant. Any addition or reduction in the level of access makes the test a failure.
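The pass/fail rule above can be expressed as a comparison of recorded access: for each sampled worker, note the tasks they can reach in each tenant and flag any addition or reduction. The worker IDs and task names below are hypothetical:

```python
def compare_access(baseline: dict[str, set[str]],
                   preview: dict[str, set[str]]) -> dict[str, dict[str, set[str]]]:
    """For each sampled worker, report tasks gained or lost in the preview tenant.

    Any non-empty result marks the security test as failed, since both
    additions and reductions in access count as failures.
    """
    failures = {}
    for worker in baseline.keys() | preview.keys():
        before = baseline.get(worker, set())
        after = preview.get(worker, set())
        gained, lost = after - before, before - after
        if gained or lost:
            failures[worker] = {"gained": gained, "lost": lost}
    return failures
```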

Release Testing Timeline Details

• X weeks prior to the release date: the new features coming in the feature release are available in Preview tenants, along with translations.

• One day prior to the Feature Release: all Implementation, Production, and Sandbox tenants will be unavailable for the release window.

• Feature Release date (FR): the new release and translations are delivered to all Production, Sandbox, and Implementation tenants.
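The milestones above can be derived from the vendor's published release date; `preview_weeks` below stands in for the unspecified "X":

```python
from datetime import date, timedelta

def release_milestones(release_date: date, preview_weeks: int) -> dict[str, date]:
    """Derive the timeline milestones above from the published release date."""
    return {
        # New features and translations available in Preview tenants
        "preview_available": release_date - timedelta(weeks=preview_weeks),
        # Implementation, Production, and Sandbox tenants unavailable
        "tenant_downtime": release_date - timedelta(days=1),
        # Release delivered to all tenants
        "feature_release": release_date,
    }
```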

File Comparison Tools
The following tools are available to compare files and present the differences.

​Beyond Compare

• Beyond Compare is a focused file-comparison application that lets you quickly and easily compare files and folders. Using simple, powerful commands, you can focus on the differences you're interested in and ignore those you're not, then merge the changes, synchronize your files, and generate reports for your records.

​Notepad ++ (with Compare plug-in)

• Notepad++ is a free editor, and files can be compared using its Compare plugin, which offers the following features:

• Side-by-side visual differencing

• Comparison against SVN database

• Highlight differences inside lines

• Navigation bar shows a map of compared files

• Moved line detection

• Easy navigation between differences

• Customizable results presentation
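When a GUI tool is not available, Python's standard `difflib` can produce a comparable side-by-side HTML report; the tenant labels and file contents below are illustrative:

```python
import difflib

def html_diff_report(left_lines, right_lines, out_path="diff_report.html"):
    """Write a side-by-side HTML diff of two line lists, similar in
    spirit to the GUI comparison tools described above."""
    html = difflib.HtmlDiff(wrapcolumn=80).make_file(
        left_lines, right_lines, fromdesc="Sandbox", todesc="Preview")
    with open(out_path, "w", encoding="utf-8") as f:
        f.write(html)
    return out_path
```

Opening the generated file in a browser shows both versions side by side with differing lines highlighted.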


About Deloitte

Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee (“DTTL”), its network of member
firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as
“Deloitte Global”) does not provide services to clients. Please see www.deloitte.com/about for a detailed description of DTTL and its member
firms. Please see www.deloitte.com/us/about for a detailed description of the legal structure of Deloitte LLP and its subsidiaries. Certain services
may not be available to attest clients under the rules and regulations of public accounting.

Copyright © 2017 Deloitte Development LLC.


