Test Strategy - ESSP
Table of Contents
1 Introduction
2 QA Testing Scope
3 Roles and Responsibilities
4 Testing Types
5 Test Plans
6 Communication
7 Technology/Tools
8 Test Environment
9 Strategy for Test Automation
10 Strategy for Test Data
11 Defect Management
12 Supporting Evidence
13 QA Entry and Exit Criteria
14 QA Sign-Off Criteria
15 Prerequisites and Constraints
16 Suspension and Resumption Criteria for Test Execution
17 Test Deliverables
18 Metrics and KPIs
1 Introduction
The objective of this project is to manage and store all data related to the different business objects of Eurofins, e.g. Eurofins Legal Entity, Business Unit, etc. ESSP is envisioned to replace the current EGSS system by providing an extended set of features: it will enable authorized users to directly verify, create, modify or close business objects, and will support creation of deviations, approval of SNCA and ARPPVT change requests, eCDR document requests, and the E2E flow process.
The purpose of this document is to outline the test strategy for the ESSP program, which determines the project's approach to testing. The strategy looks at the characteristics and risks of the system to be built, as well as the project timeline and budget, and plans the breadth and depth of the testing effort. The test strategy will influence tasks related to test planning, test types, test script development and test execution.
Last modified on: 04/07/2022 Approved on (if applicable): 30/06/2022 Version: 2.0
Program Director
Development Manager
Business Analyst (Product Owner)
Architect
IT Development Team
IT QA Testing Team
Document Description
“Test Management Process” (EDR: 4-979-IS-IPR-01151628): Document that describes the Test Management Process, its activities, roles and responsibilities.
“Defect Management Guideline” (EDR: 4-979-IS-ITP-01254925): Document that helps to identify, track and manage defects identified out of IT testing activities.
3. Test case preparation: Test cases are identified based on 1) boundary value analysis and 2) positive and negative test steps for each acceptance criterion.
5. Test Execution:
All tests should be executed according to predefined and approved plans and changes to test
cases and scripts must be maintained under version control.
6. Test Reporting:
Test Reports providing the details of the Test Execution and the Defects reported should be created for every Sprint within the Release.
Test Summary Reports must be produced that summarize activities and findings and state
the conclusions. Approval of the Test Summary Report constitutes the formal release of
entities for subsequent life cycle steps.
In every sprint, the IT QA Testing team will create new Test Cases for the new user stories implemented. As user stories may not cover the system flows from an end-to-end perspective, additional exploratory testing charters will be created to address these scenarios, wherever applicable.
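The boundary value analysis used for test case preparation (step 3 above) can be sketched as follows. This is a minimal Python sketch for illustration only (the project's automation stack is SpecFlow/Selenium); the field and its range are hypothetical examples, not taken from an ESSP user story.

```python
# Sketch: deriving boundary-value test inputs for a numeric field with a
# closed valid range [minimum, maximum]. The classic technique tests the
# values just below, at, and just above each boundary.
def boundary_values(minimum, maximum):
    """Return the standard boundary-value analysis points for a closed range."""
    return [minimum - 1, minimum, minimum + 1, maximum - 1, maximum, maximum + 1]

# e.g. a hypothetical "quantity" field accepting 1..100
values = boundary_values(1, 100)
```

Each returned value then becomes the input of one positive (in-range) or negative (out-of-range) test step.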
2.2 Instances
Note: Please refer to the “Test Management Process” (EDR: 4-979-IS-IPR-01151628) for the complete
list of responsibilities of the IT TESTING roles.
4 Testing Types
The Development team performs unit testing of the code and also does a quick functional check to validate that the user story is functioning as expected, before releasing the user story for testing to the IT QA Testing team.
The testing phase for the Fetch project will comprise System Testing.
Based on the project needs, the below types of testing will be performed by IT QA Testing team:
IT QA Testing team will execute a set of Test Cases which evaluates the basic flow of the
application. Based on the result from the Sanity Tests, IT QA Testing team will determine whether
the build is stable enough to proceed with Test Execution activity.
The same Sanity Test Suite will be used in all the sprints for a release unless there is a major
change in the basic feature which requires additional Test Cases to be added to the Sanity Test
Suite.
Sanity tests will be performed only on a need basis, e.g. when milestone builds are received or whenever a DB refresh is done, and the team will not perform any other formal testing prior to the Sanity checks.
Sanity Test Cases can be obtained in AZURE DEVOPS by filtering based on the value “Sanity” in
the field “ETF Type”.
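The same filter could be expressed as a WIQL query, e.g. for scripted retrieval through the Azure DevOps REST API. A minimal sketch follows; `Custom.ETFType` is an assumed reference name for the "ETF Type" custom field, and the actual reference name must be looked up in the organization's process configuration.

```python
# Sketch: building a WIQL query that lists Sanity Test Cases.
# "Custom.ETFType" is an ASSUMED field reference name, not confirmed
# against the project's Azure DevOps process.
def sanity_test_case_query(etf_field="Custom.ETFType"):
    return (
        "SELECT [System.Id], [System.Title] "
        "FROM WorkItems "
        "WHERE [System.WorkItemType] = 'Test Case' "
        f"AND [{etf_field}] = 'Sanity'"
    )

query = sanity_test_case_query()
```

The resulting string can be posted to the `_apis/wit/wiql` endpoint, or pasted into an Azure DevOps query.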
IT QA Testing team will execute all Functional Test Cases developed to test the user stories
deployed on Test Environment in each functional sprint. In addition, IT QA Testing team will also
execute the ET Charters created to cover the Should and Could scenarios for the user stories.
Browser Compatibility Testing will be performed on the browsers in scope listed below. The main intention of this testing is to ensure that functionality works as expected and that there are no major issues in rendering the UI. Currently, all functional tests are executed on the latest Chrome browser only.
Microsoft Edge (or current versions during subsequent releases of Fetch)
Google Chrome version 84 (or current versions during subsequent releases of Fetch)
Mozilla Firefox version 48 (or current versions during subsequent releases of Fetch)
In order to minimize time needed to test on multiple browsers, IT QA Testing team will make use of
the available developer tools and plug-ins to simulate the different browsers through Microsoft Edge
or Chrome.
All Test Cases from the previous sprints qualify as regression Test Cases for the current release. To optimize the regression suite for the release, the following guidelines can be adopted:
All "Must" functional Test Cases related to user stories from the current release, and those impacted by the current release, will be planned and executed in the regression testing phase.
All the failed Test Cases in the release for which defects are fixed will be re-executed.
All critical and major defects from the current release are going to be retested in the regression
testing phase.
All defects fixed in the release are going to be verified.
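The selection guidelines above can be sketched as a simple routine over test-case records; the attribute names (`priority`, `last_result`, `defect_id`, `impacted`) are illustrative stand-ins for the corresponding Azure DevOps fields, not their real reference names.

```python
# Sketch: selecting regression Test Cases per the guidelines above:
# "Must" cases, failed cases whose defect was fixed, and impacted cases.
def select_regression(test_cases, fixed_defect_ids):
    selected = []
    for tc in test_cases:
        must_priority = tc.get("priority") == "Must"
        failed_and_fixed = (
            tc.get("last_result") == "Failed"
            and tc.get("defect_id") in fixed_defect_ids
        )
        if must_priority or failed_and_fixed or tc.get("impacted"):
            selected.append(tc["id"])
    return selected

selected = select_regression(
    [
        {"id": 101, "priority": "Must"},
        {"id": 102, "priority": "Could", "last_result": "Failed", "defect_id": 7},
        {"id": 103, "priority": "Could"},
    ],
    fixed_defect_ids={7},
)
```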
NA
The types of testing planned for each sprint are defined in the Test Plan document.
Data Validation Testing: Data in the database will be validated for correctness by using SQL
queries with appropriate table joins. SQL queries are going to be designed based on schema
information available in the functional specifications or design documents and will be included as
part of Test Case design.
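As a minimal illustration of this kind of check, the sketch below runs an orphan-record query with a table join against an in-memory SQLite database. The table and column names are invented for the example and are not the ESSP schema (the project uses SQL Server).

```python
import sqlite3

# Sketch: data validation via a SQL join. A LEFT JOIN plus an IS NULL
# filter finds child rows whose parent record is missing.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE business_unit (id INTEGER PRIMARY KEY, legal_entity_id INTEGER);
    CREATE TABLE legal_entity (id INTEGER PRIMARY KEY, name TEXT);
    INSERT INTO legal_entity VALUES (1, 'Example Entity');
    INSERT INTO business_unit VALUES (10, 1), (11, 2);  -- 11 references a missing entity
""")

# Business units whose legal entity does not exist; an empty result means the check passes.
orphans = conn.execute("""
    SELECT bu.id
    FROM business_unit bu
    LEFT JOIN legal_entity le ON le.id = bu.legal_entity_id
    WHERE le.id IS NULL
""").fetchall()
```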
NA
Automation testing will be performed based upon the criteria defined in Section 9.
On successful QA sign-off for the release, the UAT team, comprising the Product Owner and Business users, will perform testing on the UAT servers. They will run their own UAT scenarios to validate the functionalities implemented in the release. On successful sign-off of UAT, the build will be promoted to Production. The IT QA Testing team can support the UAT team where needed, based on the bandwidth available.
5 Test Plans
The test plan holds the tests conducted for the core functionalities of the respective application.
The Test Plans are updated for every release in Azure DevOps to provide details on the features that are tested. The Sprint Plan holds the details of the features that are tested for the particular Sprint, and thus for the release.
The testing activities are planned along with the development team during the Sprint ceremonies.
The High-Level Plan details are available in section 5.3.
Test Plans are created, and Test Cases are prioritized for a particular release, based on the criteria below:
The Business Analysts explain the PBIs to the Testing Team at the beginning of the Sprint.
The QA resources create the Test Cases for all Acceptance Criteria of a PBI.
The QA team develops automation scripts for all the in-sprint test cases.
The Business Analysts review the Test Cases created by the QA resources.
Test engineers coordinate the selection of Regression and Smoke Test Cases with the Development Team (IT Project Manager).
QA resources execute the reviewed Test Cases for the previous Sprint's PBIs.
QA resources execute the selected Regression Test Cases.
QA resources report defects found during Test Execution.
5.3.1 Test pyramid: As per our current test strategy, we are following the test pyramid below.
System Testing:
Sanity Testing
Functional Testing
Integration Testing
Regression Testing
Smoke Testing
Main Module/Functions Planned for R2 release
System Testing:
Sanity Testing
Functional Testing
Integration Testing
Regression Testing
Smoke Testing
6 Communication
The communications with respect to the overall program/project, such as sprint ceremonies, team meetings, project status meetings, etc., are defined in the ESSP Program Management Plan. In addition to these, the following table describes the communication carried out by the IT QA Testing team.
7 Technology/Tools
Tool Purpose
Microsoft Team Foundation Server: User Stories, Test Case definition, Tasking & Effort Estimation, Defect tracking (Sprint Link)
Microsoft Test Manager: Test plan definition, Test execution & results, Defect tracking (Test Plan)
DMS: Project Plan
Reporting Tool: Living Documentation reporting tool is used for extracting Test Case details with steps and screenshots
SQL Client: SQL Server 2019 Management Studio
Automation Tools: Selenium with SpecFlow
8 Test Environment
Sanity, Functional and Regression test cases are automated. The tool used for generating the automation scripts is Selenium. A Git repository is used for integrating the automation scripts into the version control system.
For Release 1, automation is not a must, as the development timeframe is short.
For Release 2, the manual IT QA Testing team (engineers trained in automation) will develop automation scripts, mainly focusing on the Regression suite and services.
Sanity test cases, followed by web service and regression test cases, will be considered for automation with the below priority:
1. Sanity test cases
2. Regression test cases
Only Functionally Validated Test Cases (Functional Validation helps to establish that the software does what it was originally meant to do), or Test Cases in any status above that, can be executed as part of a Test Cycle/Test Run. Sanity, functional and regression test cases are automated.
Test cases which are automated are updated with the automation status "Automated"; test cases which cannot be automated are updated with the status "Cannot be automated", e.g. database-related PBIs, email verification PBIs, and attachment verification on the UI.
NA
NA
Test cases should adhere to the test case writing guidelines; once automated, test cases shall not be modified again. Automation scripts are developed using Selenium and are integrated with the Git repository. A check-in triggers the CI/CD pipeline.
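A check-in trigger of this kind might look like the following Azure Pipelines fragment. This is a sketch only: the branch name and build step are assumptions, not the project's actual pipeline definition.

```yaml
# Sketch of a CI trigger for the automation repository (Azure Pipelines).
# Branch name and test command are ASSUMPTIONS for illustration.
trigger:
  branches:
    include:
      - main

steps:
  - script: dotnet test
    displayName: Run SpecFlow automation suite
```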
Test data creation for automation is performed by importing data from Excel. SQL scripts are used for generating test data for manual testing.
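Generating such a SQL test-data script can be sketched as below; the table and columns are hypothetical examples, and the value quoting is deliberately naive (no escaping), which is acceptable only for controlled test data.

```python
# Sketch: emitting a SQL INSERT script for manual-test data.
# Table and column names are illustrative, not the ESSP schema.
def make_insert_script(table, rows):
    lines = []
    for row in rows:  # each row is a column-name -> value mapping
        cols = ", ".join(row)
        vals = ", ".join(f"'{v}'" for v in row.values())  # naive quoting
        lines.append(f"INSERT INTO {table} ({cols}) VALUES ({vals});")
    return "\n".join(lines)

script = make_insert_script(
    "business_unit",
    [{"code": "BU001", "name": "Demo Unit"}],
)
```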
11 Defect Management
AZURE DEVOPS will be used for defect reporting and tracking in this project.
The defect severity and priority (default values) will be assigned initially by the reporter of the defect. BAs have to review and reassign the severity and priority based on the business impact during defect triage. BAs will also have to change the status to APPROVED if the defect is to be taken up for fixing in the sprint.
On accepting the APPROVED defect for fixing, the developer will change the defect status in
AZURE DEVOPS to COMMITTED. After fixing the defect and merging it, AZURE DEVOPS
status will be changed to READY FOR QA.
IT QA Testing team re-tests the defects which are in READY FOR QA status in AZURE
DEVOPS and updates the defect status accordingly (either to DONE or COMMITTED).
Once the defect is fixed and the status is "Ready for QA" in AZURE DEVOPS, the defect will be taken up for retesting on the next build, or on the same build, based on the build deployment frequency and the urgency of verification of the defect.
The failed Test Cases shall be retested if the defect is fixed in the same release.
If the resolved defect is working as expected, then the defect status will be changed to DONE in AZURE DEVOPS. If not, the defect will be reopened (AZURE DEVOPS status: COMMITTED).
As there are going to be defects raised by the non-testing members in the team (BAs, Dev, UAT team, Production), in case those defects aren't closed by the submitter, the following process shall be followed to keep track of all fixed defects in the release:
o The IT QA Testing team will verify such functional defects not raised by them and will provide screenshots as evidence of verification, based on the clarity of the steps provided in the defect. In case the steps are not clear, the IT QA Testing team will mark the defect as "Removed" and term it "Not a Defect". Hence it becomes very important for the BA to triage defects.
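The status workflow described above can be summarized as a small state machine. Note that the initial NEW status and the exact transition set are inferred from the text, not taken from the project's AZURE DEVOPS process template.

```python
# Sketch: the defect lifecycle as a state machine. Status names follow
# the document; transitions not listed here are rejected.
TRANSITIONS = {
    "NEW": {"APPROVED", "REMOVED"},          # BA triage
    "APPROVED": {"COMMITTED"},               # developer accepts the fix
    "COMMITTED": {"READY FOR QA"},           # fix merged
    "READY FOR QA": {"DONE", "COMMITTED"},   # retest passes, or defect is reopened
}

def move(status, new_status):
    if new_status not in TRANSITIONS.get(status, set()):
        raise ValueError(f"Illegal transition {status} -> {new_status}")
    return new_status
```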
Tags are used to specify the priority of the defects:
0: Product requires successful resolution of the work item before it ships, and it must be addressed soon.
1: Product requires successful resolution of the work item before it ships, but it does not need to be addressed immediately.
2: Resolution of the work item is optional, based on resources, time and risk.
Severity:
1 – Critical: Must fix. A defect that causes termination of one or more system components or the complete system, or causes extensive data corruption, and there are no acceptable alternative methods to achieve the required results.
2 – High: Consider fix. A defect that causes termination of one or more system components or the complete system, or causes extensive data corruption; however, an acceptable alternative method exists to achieve the required results.
3 – Medium: (Default) A defect that causes the system to produce incorrect, incomplete, or inconsistent
results.
4 – Low: A minor or cosmetic defect that has acceptable workarounds to achieve required results.
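The severity definitions above can be encoded as a small triage helper. The boolean inputs are an illustrative simplification of the questions a reporter answers; this is a sketch, not part of the documented process.

```python
# Sketch: assigning defect severity (1-4) per the definitions above.
def classify_severity(terminates_system, has_workaround, cosmetic=False):
    if terminates_system:
        # Termination or extensive data corruption:
        # High (2) if a workaround exists, otherwise Critical (1).
        return 2 if has_workaround else 1
    if cosmetic:
        return 4  # Low: minor/cosmetic with acceptable workarounds
    return 3      # Medium (default): incorrect, incomplete or inconsistent results
```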
12 Supporting Evidence
Steps to reproduce a defect will be provided to the BA’s/Developers in AZURE DEVOPS with
necessary screenshots or documents.
Test report at end of each Sprint. Test report - Test Plans (azure.com)
Test summary reports at the end of each Release. Test Summary Progress report - Test Plans
(azure.com)
QA Sign-off report at the end of each Release. - to be decided by Team
Criteria and Owner
Detailed requirements are provided to the IT QA Testing team, and based on these specifications Test Case development starts in parallel to coding. Owner: Business Analyst
For major functionality changes, a walkthrough/overview of the new functionality must be given to the IT QA Testing team. Owner: Business Analyst
Code review and unit testing completed. Owner: Development Team
Test Environment setup and Smoke test runs must be completed successfully. Owner: Deployment Team
Readiness of the Test Plan for the Release and of the Test Cases for a Sprint. Owner: IT QA Testing Team
The package delivered for testing at the planned duration must pass the minimal Sanity Testing. Owner: IT QA Testing Team
Criteria and Owner
All test suites identified for each of the testing cycles are executed. Owner: IT QA Testing Team
All system and requirement discrepancies are resolved, or documented and moved to a future release. Owner: IT QA Testing Team, BA and Dev Teams
All the user stories have the status "DONE" in AZURE DEVOPS, except for the ones deferred to a future release/sprint by the stakeholders. Owner: IT QA Testing Team, BA and Dev Teams
All defects have the status "Done" in AZURE DEVOPS, except for the ones deferred to a future release by the stakeholders. Owner: IT QA Testing Team, BA and Dev Teams
The Test Report and Test Summary Reports have been created and reviewed. Owner: IT QA Testing Team and IT QA Manager
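The "all DONE except deferred" criteria can be sketched as a check over work items; the dictionary shapes are illustrative and are not an Azure DevOps API payload.

```python
# Sketch: evaluating the exit criteria over user stories and defects.
# An item passes if it is "Done" or was explicitly deferred by stakeholders.
def exit_criteria_met(stories, defects):
    def ok(item):
        return item["state"] == "Done" or item.get("deferred", False)
    return all(ok(s) for s in stories) and all(ok(d) for d in defects)
```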
14 QA Sign-Off Criteria
A Fetch release will be considered as having successfully passed the testing phase if, at the end of the testing sprint, all the below criteria have been met.
Note: All defects prioritized for addressing in subsequent releases should have proper target versions or iteration paths set in AZURE DEVOPS. Also, on meeting the above QA sign-off criteria, the defects which are still open in the release at the time of QA sign-off should be triaged and moved to subsequent releases or to the backlog, unless there is a genuine need to have them fixed in the same release.
If the above criteria are not met, then the IT QA Testing team should raise the risks or issues during the
Scrum Ceremonies.
NA
A Test Run within this Sprint/Release will be suspended if any of the following occur:
Testing will be resumed if the issue that caused testing to be suspended is resolved. If a new
build/package is provided, a new Test Run is started.
17 Test Deliverables
Deliverables and When
Test Strategy: Beginning of the project; revisited and updated as and when needed
Test Environment: Validated when the Test Strategy is prepared; updated as and when there are any environment-related changes
Test Plan: For each process at the beginning of every sprint; the test plans are updated in the Azure DevOps link
Test Cases: For the user stories planned in a sprint
Test Report: End of each Sprint
Test Summary Report: During QA sign-off (maintained in Azure DevOps)
NA
Revision History
Date Version Description Author / Review
7th Sept 2022 DRAFT Created initial version Mohanraj Kannabiran / Trivedi Brijesh
17th Sept 2022 1.0 Reviewed and Approved Trivedi Brijesh
03rd Aug 2023 1.1 Reviewed QQF5 / Q3AX