Testing in System Integration Project
Wipro Technologies
Bangalore, India
Page 1 of 23
Table of Contents
1. INTRODUCTION .......... 4
2.
3.
4. IV & V METHODOLOGY .......... 5
   4.1 TESTING PROCESS .......... 6
5. TYPES OF TESTING .......... 10
   5.1 Functionality Testing .......... 10
   5.2 Out of Box Functionality Testing .......... 10
   5.3 GUI (Web Interface) .......... 10
   5.4 Multi-user .......... 11
   5.5 Recovery & Restart .......... 11
   5.6 Security .......... 11
   5.7 Interface Tests .......... 11
   5.8 Data Integrity Testing .......... 11
   5.9 Data Backup and Restore Testing .......... 11
   5.10 Compatibility Testing .......... 12
   5.11 Performance Testing .......... 12
   5.12 Reliability & Availability Tests (includes Failover tests) .......... 13
   5.13 Localization/Globalization Testing .......... 13
   5.14 Workflow Testing .......... 13
6. STAGES OF TESTING .......... 13
   6.1 Application Level Testing
   6.2 System Integration Testing
   6.3 Acceptance Testing
   6.4 Regression Testing
7. TEST TOOLS .......... 21
8.
9. TEST ARTIFACTS .......... 22
10. RESOURCE PLANNING .......... 23
11.
12. APPENDIX .......... 25
1. Introduction
An OSS/BSS or system integration project typically involves multiple applications with a mind-boggling combination of protocols, hardware and software. The very nature of these projects poses many challenges for testing. WIPRO has developed a framework, WIP-SHARP, to measure Scalability, High Availability, Reliability and Performance by carrying out a complete suite of IV & V activities throughout the project life cycle. The objective of this methodology is to provide effective IV & V services by making use of best practices. This document does not cover the process details.
The characteristics of these projects, the challenges they pose, and the corresponding types/methods of testing are summarized below.

Characteristics: Involves multiple applications, customized/extended
Challenges: Different system architecture and design aspects
Type/Method of testing: Interface testing; development of test drivers/stubs/scripts/simulators

Characteristics: Meeting performance requirements
Type/Method of testing: Performance, scalability, availability and reliability tests; use of load testing tools

Characteristics: Business scenario as requirement specification; new business process/enhancement
Challenges: Implementing the business process
Type/Method of testing: Functionality testing of these applications; rigorous integration testing; business scenario testing; exceptional scenario testing; adapter testing (custom built)

Characteristics: Upgradation of applications; decommissioning legacy systems; data migration
Challenges: Data consistency; smooth transition from the existing system/applications to the new system/applications
Type/Method of testing: Rigorous regression testing; automation

Characteristics: Localization
Type/Method of testing: Localization testing
The applications typically involved include: CRM, Billing, Service Provisioning, Service Assurance, Web Portal, Information System, Mediation and EAI.
4. IV & V Methodology
The testing of system integration implementations would be in two parts: functional verification and validation against the requirement specification (business processes), and performance evaluation against the indicated requirements.
The IV & V team is involved right from the beginning of the project, following the IV & V model given below.
[IV & V model: project planning (business requirements, solution architecture) feeds traceability and the master test strategy; test planning produces strategies for individual applications and test case design; solution mapping/development/customization is followed by system testing (application-level testing); application integration is followed by incremental integration testing; the integrated solution release undergoes performance testing, business cycle testing, and acceptance tests & certification.]
The overall testing process would be organized into seven groups of one or more processes each, as indicated below.
[Testing process groups: Test Strategy, Test Planning, Test Case Design, Test Execution, Defect Reporting, and Summary Report & Analysis, coordinated under Test Management.]
The following figure illustrates how the process groups overlap and vary within a phase.
[Figure: level of test activity over time, between test start and test end.]
Test preparation activities include:
- Development of stubs/drivers/simulators
- Test environment creation
- Test data generation

Defect handling involves the following steps:
- Record defects
- Assign severity and priority
- Report defects to the development team

Severity levels are described as follows:
- The functionality cannot be performed using the system; no workaround exists
- The functionality cannot be performed using the system; a workaround exists
- A specific case does not function for a required functionality

Priority levels are described as follows:
- Requires immediate fix
- Requires immediate attention
- Bug fix can be scheduled for the next release
- Bug fix can be delayed
- The fix is withheld
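The record-assign-report steps and the classifications above can be captured in a simple defect record. The following is a minimal sketch; the enum member names and field names are illustrative, since the document lists only the level descriptions:

```python
from dataclasses import dataclass
from enum import Enum

class Severity(Enum):
    # Member names are illustrative; the document gives only the descriptions.
    NO_WORKAROUND = "Functionality cannot be performed; no workaround exists"
    WORKAROUND_EXISTS = "Functionality cannot be performed; workaround exists"
    SPECIFIC_CASE = "A specific case does not function for a required functionality"

class Priority(Enum):
    IMMEDIATE_FIX = "Requires immediate fix"
    IMMEDIATE_ATTENTION = "Requires immediate attention"
    NEXT_RELEASE = "Bug fix can be scheduled for the next release"
    DELAYED = "Bug fix can be delayed"
    WITHHELD = "The fix is withheld"

@dataclass
class Defect:
    defect_id: str
    summary: str
    severity: Severity
    priority: Priority
    reported_to_dev: bool = False

    def report(self) -> None:
        """Mark the defect as reported to the development team."""
        self.reported_to_dev = True

# Record a defect, assign severity and priority, then report it.
d = Defect("DEF-001", "Save fails on order screen",
           Severity.NO_WORKAROUND, Priority.IMMEDIATE_FIX)
d.report()
```

In practice a defect tracking tool plays this role; the sketch only illustrates the data each defect record carries.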
During this phase the following metrics and reports will be prepared:
- Test coverage
- Error discovery rate
- Defect trend analysis
- Test metrics
- Performance metrics
- Performance graphs
- List of non-compliances
- Verification evidences
The overall summary of the testing process would be:

Task: Test Strategy
Input: Business requirements; Application details; Overall solution architecture; Testing knowledge; Testing methodology
Output: Test strategy documents

Task: Test Planning
Input: Test strategy documents; Application information; Project plan; Solution mapping; Business scenarios; Detailed design document; Testing knowledge; Templates
Output: Test plan; Test cases/scripts/stubs/drivers

Task: Test Execution
Input: Test plan; Test cases/scripts; Testing skills; Test tools; Test management tool; Testing knowledge; Test methodology; Templates; Automated tools knowledge; Support details
Output: Performance graphs; Test results; Defect reports; Summary reports
5. Types of Testing
A gray-box approach, which combines the white-box and black-box approaches, would be applied for system integration projects. The following types of tests would be carried out.
a) Navigation
The screen flow and links should be checked, including the movement and control of screens backwards and forwards through links and browser options, and the navigation aspects for the CRM users who would be operating Clarify.
b) Validations
Check field/screen-level validations of all fields on the screen when a save or submit is performed (formats like date, currency, and other mandatory fields).
Check the enabling/disabling of buttons and links based on the access control defined for the user (Add and Modify).
c) Multilingual Features
The ability of the system to handle users using different languages simultaneously needs to be checked.
d) Error Handling
The error messages will be stored in the database, and based on the type of error a suitable error message will be retrieved.
On wrong/bad field inputs, the behavior of the system and the legibility and meaningfulness of the displayed messages will be checked.
Exception handling will also be verified.
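A database-backed error-message lookup of the kind described can be sketched as follows; the error-type keys and message texts are hypothetical, and an in-memory dict stands in for the database table:

```python
# The real messages would live in a database table keyed by error type;
# a dict stands in for that table in this sketch.
ERROR_MESSAGES = {
    "INVALID_DATE": "Please enter the date in DD/MM/YYYY format.",
    "MANDATORY_FIELD": "This field is mandatory and cannot be left blank.",
    "INVALID_CURRENCY": "Please enter a valid currency amount.",
}

def message_for(error_type: str) -> str:
    """Retrieve a suitable, user-legible message for the given error type."""
    return ERROR_MESSAGES.get(
        error_type,
        "An unexpected error occurred. Please contact support.",  # fallback
    )

msg = message_for("MANDATORY_FIELD")
```

Error-handling tests would then assert both that known error types return their configured message and that unknown ones fall back to a meaningful default.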
5.4 Multi-user
The basic objective of these tests is to ensure that the system handles multiple requests in parallel without getting into deadlocks or losing the integrity of the transaction data. This set of tests would identify transactions that can happen in parallel and design tests around those conditions.
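As an illustration of the idea (not tooling prescribed by this document), a multi-user test can fire the same transaction from many threads and then verify that no update was lost. The `Account` class below is a hypothetical stand-in for the system under test:

```python
import threading

class Account:
    """Hypothetical shared resource; the real target would be an application API."""
    def __init__(self, balance=0):
        self.balance = balance
        self._lock = threading.Lock()

    def deposit(self, amount):
        # The lock models the transaction isolation the test is meant to verify.
        with self._lock:
            current = self.balance
            self.balance = current + amount

def run_parallel_deposits(account, users=50, deposits_per_user=100):
    """Simulate many users performing the same transaction in parallel."""
    def worker():
        for _ in range(deposits_per_user):
            account.deposit(1)
    threads = [threading.Thread(target=worker) for _ in range(users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

acct = Account()
run_parallel_deposits(acct)
# Integrity check: no deposit may be lost under concurrency.
assert acct.balance == 50 * 100
```

Without the lock the read-then-write in `deposit` would occasionally lose updates, which is exactly the class of defect these tests are designed to expose.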
5.5 Recovery & Restart
- Transaction recovery
- Process recovery/restart
- Browser recovery/restart
5.6 Security
These tests would verify that the system meets its security policies and requirements:
- Authentication features
- Authorisation features related to access control on resources
- Auditing and logs (all events from the web will be audited)
Security will be checked for both the internal and external users of the system. More details will be added once the security design is finalised.
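The authorisation checks above amount to verifying an access-control table from both sides. A minimal sketch, with hypothetical role and operation names (the actual access-control model is not yet finalised in this document):

```python
# Hypothetical access-control table: role -> operations permitted on a resource.
PERMISSIONS = {
    "admin":    {"add", "modify", "delete", "view"},
    "operator": {"add", "modify", "view"},
    "guest":    {"view"},
}

def is_authorized(role: str, operation: str) -> bool:
    """Authorization check: is the operation permitted for this role?"""
    return operation in PERMISSIONS.get(role, set())

# Security tests must assert both the allowed and the denied cases.
assert is_authorized("operator", "modify")
assert not is_authorized("guest", "delete")
assert not is_authorized("unknown_role", "view")
```

Testing the denied cases is as important as the allowed ones: a missing negative test is how over-broad permissions slip through.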
5.10 Compatibility Testing
Compatibility testing is necessary to ensure that localized products function properly in the local hardware and software environment, including local operating systems, peripheral devices, and networking and communications standards.
5.11 Performance Testing
The objective of the performance tests is to ensure that the system can handle the projected loads within the expected service levels on the production hardware, and that transaction and response times for provisioning and customer support do not degenerate but remain within acceptable limits during peaks. The performance tests would focus on:
1. The ability of the server to handle the predicted loads
2. The response times and performance of the screens involved in the transactions at peak hours
3. The ability to handle large volumes of data over a period of time (volume & stress)
The performance of the network is not a part of these tests; hence all these tests would be performed on a Local Area Network. In addition, the following parameters would be checked, depending on the monitoring tools available in the given environment:
- CPU loads
- Disk loads
- Network traffic involved
a) Load Testing
A load test has many concurrent users running the same program, to see whether the system handles the load without compromising functionality or performance.
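A minimal sketch of that idea in Python, with a stubbed transaction standing in for the real system and an illustrative response-time limit (real load tests would use a dedicated tool and the project's actual service levels):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def transaction():
    """Stub for the business transaction under load; replace with a real call."""
    time.sleep(0.01)  # simulated processing time
    return "ok"

def load_test(concurrent_users=20, max_response_s=1.0):
    """Run the same transaction from many concurrent users and check that
    every call succeeds and responds within the acceptable limit."""
    def timed_call(_):
        start = time.perf_counter()
        result = transaction()
        return result, time.perf_counter() - start

    with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
        results = list(pool.map(timed_call, range(concurrent_users)))

    # Functionality must not be compromised under load...
    assert all(r == "ok" for r, _ in results)
    # ...and neither may performance.
    assert all(elapsed < max_response_s for _, elapsed in results)
    return max(elapsed for _, elapsed in results)

worst_response = load_test()
```

The same pattern scales down to a quick smoke check and up to the pass/fail criterion of a full load run: assert on both correctness and response time, never on only one.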
b) Volume Testing
The purpose of Volume Testing is to find weaknesses in the system with respect
to its handling of large amounts of data during short time periods. For example,
this kind of testing ensures that the system will process data across physical and
logical boundaries such as across servers and across disk partitions on one
server.
c) Stress Testing
The purpose of stress testing is to show that the system has the capacity to handle large numbers of transactions during peak periods. In a batch environment, a similar situation would exist when numerous jobs are fired up after downtime.

5.12 Reliability & Availability Tests (includes Failover tests)
The reliability of a system is the conditional probability that it will operate during a specified period of time. A system may be considered highly reliable (that is, it may fail very infrequently), but if it is out of service for a significant period of time as a result of a failure, it will not be considered highly available.
One measure of a system's reliability is its Mean Time To Failure (MTTF), the interval during which the system or element can provide service without failure. A related measure is the Mean Time To Repair (MTTR), which represents the time it takes to resume service after a failure has been experienced.
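These two measures combine into steady-state availability, A = MTTF / (MTTF + MTTR), which makes the reliable-but-unavailable distinction above concrete. The figures used are illustrative:

```python
def availability(mttf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability from mean time to failure and mean time to repair."""
    return mttf_hours / (mttf_hours + mttr_hours)

# A system failing once every 999 hours and taking 1 hour to repair
# is 99.9% available, even though it fails infrequently.
a_fast_repair = availability(999.0, 1.0)    # 0.999

# The same MTTF with a 100-hour repair time is far less available,
# despite being equally "reliable".
a_slow_repair = availability(999.0, 100.0)  # ~0.909
```

This is why the section treats reliability and availability as separate test objectives: improving MTTR raises availability even when MTTF is unchanged.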
5.13 Localization/Globalization Testing
The goal of localization/globalization testing is to ensure that local language versions of the product perform consistently with the source language version.
Localization testing includes:
5.14 Workflow Testing
Workflow testing is used to test the business workflow process definitions and scenarios. In the integrated applications, the workflow process definitions will be tested independently.
6. Stages of Testing
The various stages of testing identified for system integration projects are described below. Regression tests would be carried out at the application and systems integration levels.

6.1 Application Level Testing
Application-level testing would be carried out on individual applications and includes the following tests.
Functional Area/Application: CRM
- Functionality: Out-of-box (Black Box); to identify the limitations in the system
- Functionality: Business Scenario (Black Box); to test the business rules
- Functionality: Customisation (Black Box); carried out only in case of customisation
- Functionality: Navigation (Black Box)
- Usability Tests
- Workflow Testing
- Localization Testing

Functional Area/Application: Billing
- Functionality: Out-of-box, Business Scenario, Customisation, Navigation (all Black Box)
- Usability Tests

Functional Area/Application: EAI
- Functionality: Message broker (Black Box). Message brokers are key components in the integrated system; various types of input messages would be fed to test these brokers.
- Functionality: Adapter (Black Box). All adapters/connectors are tested with message simulators and a set of XML files; specific scripts are developed for analyzing the log files of individual instances of adapters; monitoring tools provided by the tool vendor would be used to monitor adapters.
- Negative testing (exceptional scenarios)

Functional Area/Application: Service Provisioning (involves multiple products like network modeling, inventory management and automated service provisioning tools)
- Functionality: Adapter testing. Test drivers for different services; off-the-shelf simulators/stubs/drivers. Adapters could be custom built; network elements and simulators are required for these tests.

Functional Area/Application: Service Assurance (again involves multiple products like performance management and fault management tools)
- Functionality: Adapter testing; Business Scenario. Test drivers for different services; off-the-shelf simulators for generating faults and alarms/drivers. Adapters could be custom built; network elements/simulators are required for these tests.

Functional Area/Application: Data warehousing
- Functionality: Business Scenario, Adapter testing, Out-of-Box (Black Box). Test drivers for different services; adapters could be custom built.

Functional Area/Application: SAP
- Functionality: Out-of-Box, Business Scenario, Adapter testing (Black Box). Test drivers for different services; adapters could be custom built.

Functional Area/Application: Legacy
- Functionality: Business Scenario, Adapter testing, Out-of-Box (Black Box). Test drivers for different services; adapters could be custom built.
Summary
The overall summary of application-level testing would be:

Task: Application Level Testing
Output: Test report; Defect report; Application-level tested code; Stubs/Drivers/Scripts
Review Mechanism: Reviews; Test audit

6.2 System Integration Testing
These tests include interface tests, performance tests (refer to the Performance Testing section), and stress and reliability tests. This testing is carried out incrementally.
[Performance testing process: Performance Requirements Study, Tool Identification & Evaluation, Performance Strategy, Test Design, Test Execution, and Performance Analysis Report.]
Based on the strategy, the scenarios for the creation of virtual users would be developed. The scenarios will depend on the business criticality and frequency (usage model). The test environment would be decided based on the usage model and the number and type of virtual users.
6.2.2.4 Test Design
Test design involves the following activities:
- Script recording
- Script programming
- Script customization (delays, checkpoints, synchronisation points)
- Data generation
- Parameterization/data pooling
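The data generation and parameterization/data-pooling steps above can be sketched as follows; the record fields and the virtual-user naming are illustrative, not taken from any particular load-testing tool:

```python
import itertools

# Data generation: build a pool of test data records.
def generate_data_pool(n):
    return [{"user": f"vuser{i:03d}", "order_id": 1000 + i} for i in range(n)]

# Parameterization: each virtual-user iteration draws the next record from
# the pool instead of replaying hard-coded recorded values.
def parameterized_script(data_pool, iterations):
    pool = itertools.cycle(data_pool)  # data pooling: reuse records when exhausted
    results = []
    for _ in range(iterations):
        record = next(pool)
        # A real script would submit the transaction here; this logs the substitution.
        results.append(f"submit order {record['order_id']} as {record['user']}")
    return results

runs = parameterized_script(generate_data_pool(5), iterations=7)
```

Parameterizing recorded scripts this way prevents every virtual user from replaying identical data, which would otherwise hit caches and produce unrealistically good performance numbers.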
[Sample graphs: number of transactions per business transaction, with pass/fail counts.]
Task: Perform End-to-End System Integration Test
Input: System Integration test plan; Acceptance criteria; System-tested code
Output: Test reports; Defect report; System Integration tested code; Stubs/Scripts/drivers
Review Mechanism: Reviews; Test audit

Task: Perform Performance Testing
Output: Performance report; Test reports; Defect report; Performance-tested code; Stubs/Scripts/drivers
Review Mechanism: Reviews; Test audit

6.3 Acceptance Testing
- The final product deliverable will be tested as per the acceptance test plan.
- The test cases are designed from a user point of view and would cover all the business scenarios.
- The steps described in the Testing Procedure will be followed while testing the final deliverable.
- All the defects captured during acceptance testing will be logged into the defect tracking system.
- The result of acceptance testing will be recorded in the test report.
On completion of acceptance testing and on meeting the acceptance criteria, customer sign-off will be obtained for the project.
Summary
Task: Acceptance Testing
Output: Test report; Defect report; Acceptance-tested code
Review Mechanism: Reviews; Test audit

6.4 Regression Testing
Rigorous regression testing would be carried out to ensure that fixes/changes made to the application do not cause any new errors. This test is executed on a baselined system or product when a part of the total system or product has been modified. Regression tests would be carried out at all stages of the test cycle, after each release of a previously tested application. Automated test tools will be used to save cost and manpower. A traceability matrix would be used to find the test scripts that need to be run in regression testing.
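Selecting regression scripts from a traceability matrix reduces to a lookup from changed requirements to the scripts that cover them. A minimal sketch; the requirement IDs and script names are hypothetical:

```python
# Hypothetical traceability matrix: requirement id -> test scripts covering it.
TRACEABILITY = {
    "REQ-BILL-01": ["TS_billing_cycle", "TS_invoice_format"],
    "REQ-CRM-02":  ["TS_contact_create", "TS_contact_search"],
    "REQ-PROV-03": ["TS_service_activation"],
}

def regression_suite(changed_requirements):
    """Select the test scripts to rerun for the requirements touched by a change."""
    scripts = set()  # a set de-duplicates scripts that cover several requirements
    for req in changed_requirements:
        scripts.update(TRACEABILITY.get(req, []))
    return sorted(scripts)

suite = regression_suite(["REQ-BILL-01", "REQ-PROV-03"])
# ['TS_billing_cycle', 'TS_invoice_format', 'TS_service_activation']
```

Keeping the matrix current as requirements and scripts change is what makes this selection trustworthy; a stale matrix silently drops needed regression tests.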
7. Test Tools
Automated testing tools will be used for verifying the functionality and performance of the applications. The following test tools will be used after the tools evaluation.
Tools identification and evaluation for performance and regression testing would be carried out during the requirement or design phase. Tools would be identified based on the protocols, software and hardware used in the solution. A proof of concept (POC) would be carried out if required.
The summary of automated tools identification, evaluation and procurement would be:

Task: Tools Identification
Input: Test requirement; Test environment; Applications description; Tool vendors' information
Tools & Techniques: Proof of concept (may be required in certain cases)
Output: Tools identified
9. Test Artifacts
The following test artifacts would be generated during the test life cycle:

Sl. No. | Test Artifact | Remarks
1. Test Strategy: Master test strategy, System Integration test strategy, Performance test strategy, and application/system test strategy for individual applications
2. Test Plans: Master test plan and test plans for individual applications
3. Test drivers/stubs/scripts/simulators etc.
4. Defect Reports
5.
6.
7.
8.
9. Test Checklists
10. Resource Planning
Resource planning covers:
- Resources identification
- Developing individual and group skills
- Assigning roles and responsibilities
Task: Resource Planning
Input: Project requirements; Constraints; Estimation and schedule
Tools & Techniques: Human resources practices; Templates
Output: Organization chart; Roles and responsibility assignments; IV&V team management plan

Task: Staff Acquisition
Input: Recruitment practices; Staffing pool description
Tools & Techniques: Pre-assignment; Procurement
Output: IV&V team resources assigned; Team directory

Task: Team Development
Input: IV&V team; Project plan; Training requirements; Performance reports
Tools & Techniques: Training; Team-building activities
Output: Performance improvement; Input to performance appraisal
The composition of the test team, with roles and responsibilities, would be as shown below:

Roles | Responsibility
- Test Manager
- Test Architect
- Test Lead
- Test Engineer
- Automation Engineer
Contingency Planning
Input: Project requirement and applications description; Development and other planning output; Historical information; Sources of risk
Tools & Techniques: Checklist
Output: Sources of risk; Risk analysis; Mitigation plan; Contingency plan
Some of the major risks identified for system integration projects are given below:

# | Risk | Remarks
1. Completeness
2. Multiple Region
3. Complexity
4. Product Specification
5. Product Control
6. Usability
7. Co-operation
8. Schedule
9. Staff
10. Test Environment
11. Training
12. Customer