ITIL - Service Validation and Testing
According to ITIL, Service Validation and Testing is "The Process responsible for
Validation and Testing of a new or Changed IT Service. Service Validation and
Testing ensures that the IT Service matches its Design Specification and will
meet the needs of the Business."
Service Validation and Testing ensures Quality of Service (QoS). The
specifications or requirements for Service qualities are defined in the Service
Design phase. Testing is the deciding factor for Service quality within Service Management. The main requirements for the successful testing of a
Service are defined in Service Level Packages (SLPs).
The frameworks drawn up in the Release, Control and Validation (RCV) phase, which guide and support the implementation of quality requirements in the RCV process, are:
Risk policy
Release policy
Service Validation and Testing is a new process introduced in ITIL; earlier guidance covered some aspects of Release testing within the Release Management process, but ITIL now gives more detailed guidance. A major addition is the detail on the various testing stages during Service Transition and the descriptions of their testing approaches.
Purpose
The purpose of Service Validation and Testing is to:
Plan and implement a process that will provide objective evidence that the Service Change will fulfill the customers' and stakeholders' business needs and deliver the appropriate level of Service
Identify and address all the issues, errors and Risks in Service Transition
Goal
The goal of Service Validation and Testing is to assure that the Service will
provide value to customers and fulfill their business requirements
Objectives
The objectives of Service Validation and Testing are:
Check that a Service is "fit for purpose": it will deliver the required performance and remove constraints as intended
Ensure that a Service is "fit for use": it meets the specified requirements under the specified terms and conditions of use
Verify that the customer and stakeholder requirements for the new or
changed Service are correctly defined and rectify any errors or variances
early in the Service Lifecycle, because this is considerably cheaper than
fixing errors in production.
Ensure the quality and timely delivery of a Service that meets all the
requirements of a successful Service Release
Test the new or changed Service in the target Business Unit, Service
Unit, deployment group or environment
Usually, a Service Provider is responsible for delivering, operating and/or maintaining customer or Service Assets at specified levels of Warranty under a service agreement.
Risk policy
o Each organization, customer, business or Service Unit has a different perspective of Risk and its management. Consequently, having a Risk policy in place ensures that the validation and testing team can control and minimize all Risk types, such as availability, security, continuity and capacity Risks
Testing Policy
o The Testing policy usually reflects the requirements from Service Strategy
Test Library and Reuse Policy
o Because the nature of IT Service Management is repetitive, it benefits from reuse
Examples of Service test models, each with an objective/target deliverable and associated audit tests:
Service requirements test model - objective/target deliverable: the Service model
Deployment release test model
Deployment installation test model
Deployment verification test model - objective: to test that a deployment has completed successfully and that all Service Assets and configurations are in place as planned and meet their quality criteria; audit tests: tests and audits of the "actual service assets and configurations"
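Read as data, the examples above form a small catalogue of reusable test models. The sketch below is illustrative only; the class name, fields and the wording of the first objective are assumptions rather than ITIL definitions, and it simply shows one way such a catalogue could be held for reuse in Python:

from dataclasses import dataclass, field
from typing import List

@dataclass
class TestModel:
    # Illustrative representation of a reusable test model.
    name: str                     # e.g. "Deployment verification test model"
    objective: str                # what the model is intended to prove
    deliverable: str              # the target deliverable / audit evidence
    test_scripts: List[str] = field(default_factory=list)

# A minimal catalogue based on the examples above; the objective shown for the
# first entry is an assumed placeholder.
CATALOGUE = [
    TestModel(
        name="Service requirements test model",
        objective="Validate that the Service requirements are met",
        deliverable="Service model",
    ),
    TestModel(
        name="Deployment verification test model",
        objective=("Test that a deployment has completed successfully and that "
                   "all Service Assets and configurations are in place as "
                   "planned and meet their quality criteria"),
        deliverable='Tests and audits of "actual service assets and configurations"',
    ),
]

if __name__ == "__main__":
    for model in CATALOGUE:
        print(f"{model.name}: {model.objective}")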
The Service V-Model
This model helps you plan validation and testing from the start: the right side of the diagram represents the validation activities that you perform against the specifications defined on the left.
Each stage on the left has a corresponding activity on the right. This means
that you must start the Service Validation and acceptance test planning with
the definition of Service requirements. The customer who signs off the Service requirements will also sign off the Service Acceptance Criteria (SAC) and test
plan.
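One way to read the V-Model is as a simple mapping from each specification level on the left to the validation activity that tests it on the right. The level names below are an illustrative reading of the Service V-Model, not a definitive list:

# Minimal sketch: each left-hand specification level is paired with the
# right-hand validation activity that tests it. Names are illustrative.
V_MODEL = {
    "Define customer/business requirements": "Validate Service Packages, offerings and contracts",
    "Define Service requirements": "Service acceptance test",
    "Design Service solution": "Service operational readiness test",
    "Design Service release": "Service release package test",
    "Develop Service solution": "Component and assembly test",
}

def validation_for(specification: str) -> str:
    # Return the validation activity planned against a given specification level.
    return V_MODEL[specification]

if __name__ == "__main__":
    for spec, test in V_MODEL.items():
        print(f"{spec} -> {test}")

This reflects the point above: the test against each level of specification is planned at the same time the specification itself is signed off.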
Validation and Testing Perspectives
Successful validation and testing focuses on the following question: Will the
Service deliver as required?
The answer to this question is based on the perspective of those who will do the following activities for the Service:
Use
Deliver
Deploy
Manage
Operate
Service Acceptance Testing
This starts with verifying Service requirements, which are, in turn, based on
customer requirements. Customers, Service Providers and other stakeholders sign off the Service requirements, Acceptance Criteria and Service acceptance
test plan before you start building the Service. Stakeholders can be:
Suppliers
Ensure the quality of the Service. This plays a key role in building the Business Units' confidence in the quality, reliability and usability of the Service even before the Service goes live
Understand how the acceptance test fits into the business service or
product development testing activity
Perspective of End Users Within the Customer's Business
You must do a User Acceptance Test after building the service. This ensures
that the customer checks and verifies the Service before accepting it. You must
test the Service in an environment that closely resembles the live operational
environment. The testing helps you determine whether the Service is meeting
the customer's expectations.
You must define the testing details and scope in the user test and UAT plans,
which the stakeholders should agree to at the start of the process.
The end users within the customer's business will:
Provide the relevant staff skills, knowledge and resources to support the
Service after it goes live
All the components and assets of the Service are tested separately using a
specific Test Model. Also, each component must have an associated acceptance
test in addition to the overall acceptance criteria of the Service.
All the Service Models and associated Service deliverables are supported by
their own reusable Test Model. You can use this Test Model during deployment
and in future testing. These Test Models help you ensure quality at an early
stage instead of waiting for feedback at the end.
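The idea of component-level acceptance tests sitting alongside the overall acceptance criteria can be sketched as follows; the component names and checks are hypothetical placeholders, not part of any specific Service:

from typing import Callable, Dict

# Hypothetical component-level acceptance checks; each component of the
# Service has its own test in addition to the overall acceptance criteria.
component_tests: Dict[str, Callable[[], bool]] = {
    "database": lambda: True,
    "web_frontend": lambda: True,
    "batch_jobs": lambda: True,
}

# Hypothetical overall Service Acceptance Criteria checks.
overall_acceptance_criteria: Dict[str, Callable[[], bool]] = {
    "response_time_within_target": lambda: True,
    "backup_and_restore_verified": lambda: True,
}

def service_accepted() -> bool:
    # The Service is accepted only if every component test and every
    # overall acceptance criterion passes.
    components_ok = all(test() for test in component_tests.values())
    criteria_ok = all(check() for check in overall_acceptance_criteria.values())
    return components_ok and criteria_ok

if __name__ == "__main__":
    print("Service accepted:", service_accepted())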
Plan and design test: you plan and design in the early Service Lifecycle
phase. To plan and design, you need to:
o Plan resources such as hardware, networking and staff
o Identify business and customer requirements, such as raw material
o Plan for supporting Services and their support features, such as access and security
o Create schedules and get approval for them
o Define the timelines and place for Service Delivery
o Define financial requirements
Verify the test plan and test design: you need to verify the test plan and
test design to ensure that:
o The Test Model delivers appropriate test coverage for the risk profile of the Service
o The Test Model covers the key integration aspects and interfaces
o The test scripts are accurate and complete
Prepare the test environment: you must define the design plan of the
initial test environment. You can prepare the test environment by using
the:
o Services of the build and test environment resource
o Release and Deployment processes
Perform the tests: you must record the results of the tests. Even if a test fails, you must document the result along with the reason for the failure
o Follow the test plan and scripts for testing, whenever possible
o Resolve and document the Incident or issue when part of the test fails. The person who is testing that part of the Service should resolve the issue and test it again
Evaluate exit criteria and report: When you evaluate the exit criteria and
exit report, you need to:
o Compare the predicted results to the actual results
o Interpret the results in terms of Risk to the business, Risk to the Service Provider, or change in expected cost
o Collate the test metrics and summarize the results of the tests (a small sketch of this comparison and summary appears after this list)
o Evaluate the exit criteria based on the performance of the Service and client feedback. The Service must meet the customer's technology and quality requirements
o Ensure that Configuration Baselines have been recorded in the Configuration Management System (CMS)
Test clean-up and closure: In this, you need to:
o Clean or initialize the test environments
o Identify improvements that can be input to design, build, decision parameters or future testing policies and procedures
o Measure the test process and manage and control the testing activities
Determine the progress of testing, the earned value and the outstanding
testing. The Test Manager uses this data to estimate the duration of the
testing
Data for Problems, errors, issues, nonconformance and Risks that you
have to resolve
Sign-off document
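The recording, comparison and progress activities listed above can be illustrated with a minimal sketch. The field names and metrics below are assumptions rather than a prescribed ITIL format; the point is simply that results are recorded even when tests fail, predicted and actual outcomes are compared, and progress is summarized for the exit-criteria report:

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class TestResult:
    test_id: str
    predicted: str                        # the predicted/expected outcome
    actual: str                           # the observed outcome
    passed: bool
    failure_reason: Optional[str] = None  # recorded even when a test fails

def collate_metrics(results: List[TestResult], planned_tests: int) -> dict:
    # Summarize results and progress so the Test Manager can judge exit
    # criteria and estimate the remaining testing effort.
    executed = len(results)
    passed = sum(1 for r in results if r.passed)
    return {
        "planned": planned_tests,
        "executed": executed,
        "passed": passed,
        "failed": executed - passed,
        "progress_pct": round(100 * executed / planned_tests, 1) if planned_tests else 0.0,
        "deviations": [r.test_id for r in results if r.predicted != r.actual],
    }

if __name__ == "__main__":
    results = [
        TestResult("UAT-001", predicted="login succeeds", actual="login succeeds", passed=True),
        TestResult("UAT-002", predicted="report within 2s", actual="report in 9s",
                   passed=False, failure_reason="capacity Risk: response time exceeded"),
    ]
    print(collate_metrics(results, planned_tests=10))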
Guidelines to Prepare the Test Environment
Use Service Management best practices to actively maintain and protect test
environments. For significant Changes, consider whether the test data needs to be updated.
The inputs to the process include:
Release plan
Test plan
An SLP
o Plays a very important role in test planning and design because it provides Utility and Warranty based on the customer's requirements, assets and Patterns of Business Activity (PBAs)
Acceptance Criteria
o Consist of specific requirements for testing at all levels
The output from the process is the test report that you need to forward to the
Evaluation team. This report includes all information relevant to the testing
phase, such as details of the testing environment and test results.
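As an illustration only, the report forwarded to the Evaluation team could be packaged along these lines; the service name, field names and values are hypothetical, not a defined ITIL schema:

import json
from datetime import date

# Hypothetical test report payload handed to the Evaluation team.
test_report = {
    "service": "Example Payroll Service",
    "report_date": date.today().isoformat(),
    "test_environment": {"name": "UAT-ENV-1", "baseline": "CMS baseline reference"},
    "results_summary": {"planned": 10, "executed": 10, "passed": 9, "failed": 1},
    "known_errors_and_workarounds": [
        {"id": "KE-001", "workaround": "restart the report scheduler"},
    ],
}

if __name__ == "__main__":
    print(json.dumps(test_report, indent=2))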
Based on the output, you can evaluate the Service only after it has been live for a specific duration. At that time, you can compare the predicted
performance with the actual performance of the Service. If the Service is
functioning as expected, the evaluation is successful. You then send a report to
Change Management along with a suggestion to remove Early Life Support
(ELS) from the Service and make it part of normal operations.
Interface with the Service Lifecycle
Service Validation and Testing supports all steps of the Release and
Deployment phase in Service Transition. You align the testing strategy to work
with all other Lifecycle stages to improve the quality of the Service. Some of
these interfaces are:
Service Design
o Ensures that you can test the designs. For example, you can test hardware, software, Service elements that you reuse, third-party access rights or delivered Service elements
Service Operation
o Uses maintenance tests to ensure that Services are effective. You will need to maintain these tests to handle innovation and environmental changes
Service Strategy
o Ensures that you test the Services within the specified cost and time, using limited resources
Service Validation and Testing Interfaces with Other Processes
The relationships of Service Validation and Testing with other processes
within the RCV context are:
Service Package
IT Service Management tasks are repetitive and benefit from the reuse of
predefined models, processes and formats.
Some testing good practices are:
Tests library
o A good, practical approach is to create and maintain a library of relevant tests and update it whenever you make any Changes. The test management group can take responsibility for organizing and maintaining test scripts, test cases and test data in an organization (a small sketch of such a library appears after this list)
Check for redundancy of the test data or environment. This lets you test the Service within another, existing test environment and with another set of test data
Consider that the existing test data and test environment might only support low-level testing of the new or changed Service
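The reusable test library mentioned above could, for example, be kept as a small registry in which each Change bumps a script's version. The structure and names below are illustrative assumptions, not a prescribed tool:

from dataclasses import dataclass
from typing import Dict

@dataclass
class LibraryEntry:
    script_path: str
    version: int
    owner: str = "test management group"

class TestLibrary:
    # Minimal registry of reusable test scripts maintained across Changes.
    def __init__(self) -> None:
        self._entries: Dict[str, LibraryEntry] = {}

    def register(self, name: str, script_path: str) -> None:
        # Add a new reusable test script to the library.
        self._entries[name] = LibraryEntry(script_path, version=1)

    def update(self, name: str, script_path: str) -> None:
        # Record a Change to an existing script by bumping its version.
        entry = self._entries[name]
        self._entries[name] = LibraryEntry(script_path, entry.version + 1, entry.owner)

    def get(self, name: str) -> LibraryEntry:
        return self._entries[name]

if __name__ == "__main__":
    lib = TestLibrary()
    lib.register("deployment_verification", "tests/deploy_verify_v1.py")
    lib.update("deployment_verification", "tests/deploy_verify_v2.py")
    print(lib.get("deployment_verification"))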
When maintaining the test data, you should:
Separate the test data from any live data to ensure that the test data is not mistaken for live data and vice versa
Follow data protection regulations. For example, when you use live data
to create a test database, you should protect the data and ensure that it is
not transferable
Create a back-up copy of the test data so that you can restore the database for future testing. You can also do this for hardware tests
Use an established test database to ensure that you have a safe and
realistic training environment for a Service
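A minimal sketch of two of these practices, masking live data before it enters the test database and keeping a back-up copy that can be restored for future runs, might look like the following; the file names, table and fields are hypothetical:

import shutil
import sqlite3
from typing import Dict, List

TEST_DB = "test_data.db"
BACKUP_DB = "test_data_backup.db"

def mask(record: Dict) -> Dict:
    # Obscure personal fields so live data is never copied into the test
    # database as-is (a simple stand-in for proper anonymization).
    masked = dict(record)
    masked["customer_name"] = "TEST-" + str(hash(record["customer_name"]) % 10_000)
    return masked

def load_test_data(records: List[Dict]) -> None:
    conn = sqlite3.connect(TEST_DB)
    with conn:
        conn.execute("CREATE TABLE IF NOT EXISTS orders (customer_name TEXT, amount REAL)")
        conn.executemany(
            "INSERT INTO orders VALUES (:customer_name, :amount)",
            [mask(r) for r in records],
        )
    conn.close()

def backup() -> None:
    # Keep a back-up copy of the test data for future test runs.
    shutil.copyfile(TEST_DB, BACKUP_DB)

def restore() -> None:
    # Restore the test database from the back-up before re-testing.
    shutil.copyfile(BACKUP_DB, TEST_DB)

if __name__ == "__main__":
    load_test_data([{"customer_name": "Alice Example", "amount": 120.0}])
    backup()
    restore()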
Test Report
A test report includes:
Test results
The updated data and other information that is to be added to the SKMS,
for example, errors and Workarounds, testing techniques and analysis
methods
The primary KPIs include the indicators that you can use to judge the
effectiveness of testing in delivering Services that affect the business of
the organization
Secondary KPIs
Designs and plans testing conditions, test scripts and test data sets to
ensure appropriate and adequate coverage and control
Allocates and oversees test resources, ensuring that they adhere to the
test policies
Test analysts carry out the tests as set out in the testing plans and/or
Service Package
Service Design designs the tests as an element of the overall Service Design. For many Services, standard tests will already exist, perhaps contained within the chosen transition model and already accepted as appropriate for the type of new or changed Service under consideration
Ensures that the build delivery components are from controlled sources