
UNIT-III

CALIBRATION AND VALIDATION


WHAT IS CALIBRATION?
• Calibration is the comparison of measurement values delivered by a device
under test with those of a calibration standard of known accuracy.
• Calibration of an instrument is the process of determining its accuracy.
• The process involves obtaining a reading from the instrument and
measuring its variation from the reading obtained from a standard
instrument.
• Calibration also involves adjusting the instrument's precision and accuracy so that its
readings are in accordance with the established standard.
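A minimal sketch of such a comparison, using hypothetical readings and a hypothetical reference value (not taken from any particular standard), in Python:

# Hypothetical example: comparing readings from a device under test
# against a calibration standard of known value and reporting the bias.
reference_value = 100.0                                  # known value of the standard
device_readings = [100.4, 99.8, 100.6, 100.1, 100.3]     # readings from the device under test

mean_reading = sum(device_readings) / len(device_readings)
bias = mean_reading - reference_value                    # systematic deviation from the standard
percent_error = 100.0 * bias / reference_value

print(f"Mean reading: {mean_reading:.2f}")
print(f"Bias vs. standard: {bias:+.2f} ({percent_error:+.2f}%)")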
OBJECTIVES

• To check the accuracy of an instrument.
• To determine the traceability of the measurement.
• A well-designed and organized calibration program often leads to benefits in quality,
productivity and revenue.
QUALIFICATION
• Qualification is defined as the “action of proving that any equipment works
correctly and leads to the expected results.”
• Qualification is part of validation, often as its initial stage, but the
individual qualification steps alone do not constitute process validation.
• There are four stages of qualification:
– Design qualification (DQ)
– Installation qualification (IQ)
– Operational qualification (OQ)
– Performance qualification (PQ)
VALIDATION
• “Establishing documented evidence which provides a high degree of
assurance that a specific process will consistently produce a product
meeting its predetermined specifications and quality attributes.”
• Process validation, or simply validation, may be conducted at
different points during the life cycle of a product.
• The types of process validation are defined in terms of when they occur in
relation to product design, transfer to production and release of the product
for distribution.
i. Prospective Validation
ii. Concurrent Validation
iii. Retrospective Validation
iv. Revalidation
Prospective Validation
• Prospective validation is conducted before a new product is released for
distribution or, where the revisions may affect the product's characteristics,
before a product made under a revised manufacturing process is released
for distribution.

• Criteria for performing prospective validation:
 Facilities and equipment should meet GMP requirements.
 Personnel must be properly trained.
 Critical processing steps and processing variables should be identified,
and provisional operational control limits for each critical test parameter
should be established using pilot laboratory batches.
Concurrent Validation
• Concurrent validation is used for establishing documented evidence that a
facility and its processes do what they are expected to do, based on information
generated during actual implementation of the process.
• Concurrent validation is a subset of prospective validation and is conducted
with the intention of ultimately distributing product manufactured during the
validation study.
Retrospective Validation
• Retrospective validation is the validation of a process based on accumulated
historical production, testing, control, and other information for a product
already in production and distribution.
• This type of validation makes use of historical data and information which
may be found in batch records, production log books, lot records, control
charts, test and inspection results, customer complaints or lack of complaints,
field failure reports, service reports and audit reports.
• Historical data must contain enough information to provide an in-depth picture
of how the process has been operating and whether the product has
consistently met its specifications.
Revalidation
• Repeated validation of an approved process to ensure continued compliance with
established requirements.
• Revalidation should be performed following a change that could have an effect on
the process, procedure, quality of the product and/or the product characteristics.
Relationship between validation and qualification

• Validation and qualification are essentially components of the same concept.
• The term qualification is normally used for equipment, utilities and
systems.
• Validation is used for processes (qualification is part of validation).
ADVANTAGES OF VALIDATION
i. Process parameters and controls are determined during the validation of any
process or system.
ii. It helps to identify worst-case conditions and risks that may arise during the
manufacture of quality products.
iii. Validation helps to investigate the deviations caused during the process.
iv. Deep study and understanding of the system and equipment are made
possible due to the validation.
v. The risk of regulatory non-compliance is minimized after validation.
vi. Batch-to-batch variation is minimized by the validation of processes,
systems and equipment.

vii. Reduces the production cost of the product.
viii. Increases the output of the manufacturing facility due to minimized
rework and rejection.
ix. Decreases the chances of batch failures.


ANALYTICAL METHOD VALIDATION
• Validation of an analytical procedure is the process by which it is established, by
laboratory studies, that the performance characteristics of the procedure meet the
requirements for its intended use.
• All analytical methods intended to be used for analyzing any clinical samples
will need to be validated.
• Validation of analytical methods is an essential but time-consuming activity for
most analytical development laboratories.
• The analytical method validation activity is not a one-time study.
• The typical process that is followed in an analytical method validation is as
follows:
1. Planning and deciding on the method validation experiments.
2. Writing and approval of method validation protocol.
3. Execution of the method validation protocol.
4. Analysis of the method validation data.
5. Reporting the analytical method validation.
6. Finalizing the analytical method procedure.
Validation Parameters

• The validation parameters that will be evaluated will depend on the type of
method to be validated.
• Analytical methods that are commonly validated can be classified into
three main categories:
– Identification
– Testing for impurities
– Assay
A normal validation protocol should contain the following minimum
contents:
• Objective of the protocol
• Validation parameters that will be evaluated
• Acceptance criteria for all the validation parameters evaluated
• Details of the experiments to be performed
• Draft analytical procedure.
1. Selectivity/specificity

Selectivity of an analytical method is its ability to accurately measure an analyte in
the presence of interferences that may be expected to be present in the sample
matrix.
• Selectivity is checked by examining chromatographic blanks (from a sample that
is known to contain no analyte) in the expected time window of the analyte peak.
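A minimal illustrative sketch of the blank check; the retention-time window and the peaks listed for the blank are invented for illustration:

# Hypothetical check: does a blank injection show any peak inside the
# expected retention-time window of the analyte?
analyte_rt_window = (4.8, 5.2)               # expected retention-time window (minutes)
blank_peaks = [(2.1, 310), (7.4, 150)]       # (retention time, area) peaks found in the blank

interfering = [peak for peak in blank_peaks
               if analyte_rt_window[0] <= peak[0] <= analyte_rt_window[1]]
print("Interference in analyte window:", interfering if interfering else "none detected")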
2. Precision
Precision of a method is the degree of agreement among individual test
results when the procedure is applied repeatedly to multiple samplings.
• Precision is measured by injecting a series of standards or analyzing a series of
samples taken from a homogeneous lot. From the measured standard deviation (SD)
and mean, precision is reported as the relative standard deviation (% RSD).
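A minimal sketch of the % RSD calculation, using the Python standard library and hypothetical peak areas from six replicate injections:

import statistics

# Hypothetical peak areas from six replicate injections of one homogeneous sample
areas = [15234, 15310, 15188, 15275, 15242, 15301]

mean_area = statistics.mean(areas)
sd = statistics.stdev(areas)               # sample standard deviation
rsd_percent = 100.0 * sd / mean_area       # precision expressed as % RSD

print(f"Mean = {mean_area:.1f}, SD = {sd:.1f}, %RSD = {rsd_percent:.2f}")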
3. Accuracy
The accuracy of an analytical method is the degree of agreement of
test results generated by the method with the true value.
• Accuracy is measured by spiking the sample matrix of interest with a known
concentration of analyte standard and analyzing the sample using the
method being validated.
• The procedure and calculation for accuracy (as % recovery) vary
from matrix to matrix and are given in the respective study plan or an
amendment to the study plan.
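A minimal sketch of the % recovery calculation for spiked samples; the spiked and found concentrations below are hypothetical:

# Hypothetical spiked-sample data: spiked (known) concentration vs. the
# concentration found by the method being validated, reported as % recovery.
spiked = [80.0, 100.0, 120.0]      # spiked concentrations (e.g. ug/mL)
found = [79.1, 100.8, 118.6]       # concentrations measured by the method

for s, f in zip(spiked, found):
    recovery = 100.0 * f / s
    print(f"Spiked {s:6.1f} -> found {f:6.1f} ({recovery:.1f}% recovery)")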
4. Linearity
The linearity of an analytical method is its ability to elicit test
results that are directly, or by means of a well-defined
mathematical transformation, proportional to the concentration of analyte
within a given range.
• Linearity is determined by injecting a series of standards prepared from a stock
solution/diluted stock solution in the solvent/mobile phase, at a minimum
of five different concentrations covering 50–150% of the expected
working range.
• The linearity graph (concentration vs. peak area response) is plotted manually
or using Microsoft Excel or other software, and is attached to the
respective study files.
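A minimal sketch of the regression behind the linearity plot, assuming Python 3.10+ (for statistics.linear_regression) and five hypothetical concentration/response pairs:

import statistics

# Hypothetical five-level linearity data covering 50-150% of the working range
conc = [50.0, 75.0, 100.0, 125.0, 150.0]     # concentration levels
area = [5040, 7480, 10010, 12600, 15020]     # peak area responses

slope, intercept = statistics.linear_regression(conc, area)   # least-squares fit
r = statistics.correlation(conc, area)                        # correlation coefficient

print(f"y = {slope:.2f}x + {intercept:.1f}, r = {r:.4f}, r^2 = {r*r:.4f}")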
5. Range
The range of an analytical method is the interval between the upper and
lower concentrations that have been demonstrated to be determined with suitable
precision, accuracy and linearity using the method.
This range is normally the concentration range over which the linearity test is performed.

6. Stability
Many analytes readily decompose prior to chromatographic
investigation, for example during the preparation of the sample solutions,
during extraction, clean-up and phase transfer, and during storage of prepared
vials. Under these circumstances, method development should investigate the
stability of the analyte; the accuracy test takes stability into account.
The method should state how long a sample can be stored after extraction
before final analysis, based on the duration of the accuracy test.
7. Limit of detection(LOD) and limit of quantitation(LOQ)
The term LOD is defined as the lowest concentration at which the instrument
is able to detect, but not quantify, the analyte; the signal-to-noise ratio at the LOD should be about 3:1.
The term LOQ is defined as the lowest concentration at which the instrument is able
to both detect and quantify the analyte; the signal-to-noise ratio at the LOQ should be about 10:1.
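A minimal sketch of a signal-to-noise check against the 3:1 and 10:1 criteria; the peak height and baseline noise values are hypothetical:

# Hypothetical signal-to-noise calculation for a low-concentration injection
peak_height = 45.0     # signal: height of the analyte peak
noise = 9.0            # peak-to-peak baseline noise measured near the peak

s_n = peak_height / noise
print(f"S/N = {s_n:.1f}")
print("Meets LOD criterion (S/N >= 3): ", s_n >= 3)
print("Meets LOQ criterion (S/N >= 10):", s_n >= 10)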
8. Robustness (or ruggedness)
It is the ability of the procedure to provide analytical results
of acceptable accuracy and precision under a variety of conditions.
The results from separate samples may be influenced by changes in
the operational or environmental conditions.
Robustness should be considered during the development phase,
and should show the reliability of an analysis when deliberate
variations are made in the method parameters.
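A minimal sketch of how robustness results might be tabulated and checked against an acceptance criterion; the conditions, assay values and the ±2% limit are all hypothetical:

# Hypothetical robustness data: assay results (% of nominal) after small
# deliberate changes to method parameters, checked against a +/-2% criterion.
results = {
    "nominal conditions":     100.1,
    "flow rate +0.1 mL/min":   99.6,
    "flow rate -0.1 mL/min":  100.5,
    "column temp +5 C":        99.2,
    "mobile phase pH +0.2":   100.8,
}
for condition, assay in results.items():
    ok = abs(assay - 100.0) <= 2.0
    print(f"{condition:24s} {assay:6.1f}%  {'pass' if ok else 'fail'}")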
