SOP For Computer System Validation
1.0 Purpose:
To provide a standard operating procedure for Computer System Validation.
2.0 Scope:
This SOP (Standard Operating Procedure) applies to Computer System Validation and
outlines the life cycle of various computerized systems, including Process Control
System, SCADA and PLC based systems, Analytical Laboratory Systems, Configurable
Systems (e.g., Learning Management Systems (LMS)), and Customized Software used
at [company name].
3.0 Responsibility:
3.1 System Administrator:
3.1.1 Set system policy, assign privileges according to system-specific SOPs, and
maintain user accounts.
3.1.2 Support and maintain computerized systems in accordance with system-specific
SOPs.
3.1.3 Maintain the security of data residing on the system, its backups, and archives.
3.1.4 Ensure changes are documented and implemented according to QA-approved
documents initiated by user departments.
3.1.4.1 IT department personnel shall act as the administrator for Analytical Laboratory
Systems, Building Management Systems, Centralized Configurable systems, and
Customized Software.
3.1.4.2 The Building Management System is site-specific, critical to operational
activities, and runs in three shifts; hence, secondary administrator access is assigned
to the Site Quality Head.
3.1.4.3 The Engineering Manager/ designee shall set the password policy and have user
maintenance (User Creation/ Modification/ Deactivation) privileges for Process Control
Systems.
3.1.4.4 IT department personnel shall be responsible for maintaining electronic data,
backup and archival of the data from Process Control Systems, and checking compliance
with equipment/instrument handover checklists.
3.2 Engineering Team:
3.2.1 Plan and manage the overall project.
3.2.2 Ensure all system requirements are fulfilled as per URS in discussion with user
department(s) and vendor.
3.2.3 Communicate to user department if any requirement defined in the URS is not
fulfilled by the vendor.
3.2.4 Report discrepancies promptly during the project phase, take corrective action,
and document and trace them in the qualification documents.
3.2.5 Ensure the support service delivered is as per the requirements laid down in the
URS.
3.2.6 Ensure that maintenance and changes to the system are performed according to
documented procedures that maintain the validated status of the system.
3.3 System Owner:
3.3.1 Act as the subject matter expert to define business needs from the system.
3.3.2 Manage and plan installation of new systems or upgrade existing systems if
Project/ Engineering team is not involved.
3.3.3 Define user needs and intended use of the system.
3.3.4 Provide input for the generation of the URS and clarifications with regard to the
system functionality.
3.3.5 Provide Functional Requirement Specification for the system in URS.
3.3.6 Ensure discrepancies are correctly addressed and resolved during validation of the
system.
3.3.7 Control project activities, resources, and costs jointly with cross-functional teams
(e.g., Projects, Engineering, etc.).
3.3.8 Depute resources from the department for executing validation activities.
3.3.9 Ensure that the use and maintenance of the system comply with documented
procedures that maintain the system in a validated state.
3.3.10 Review lifecycle documentation and validation status.
3.3.11 Prepare validation documents and ensure they are traceable.
3.3.12 Ensure adequate training is provided to users, administrators, and maintenance
staff on relevant procedures.
3.3.13 Ensure adequate training is provided to users, administrators, and maintenance
staff by suppliers during installation of new systems.
3.3.14 Ensure compliance with equipment/ instrument handover checklists after
completion of CSV.
3.3.15 Review and initiate system changes if any.
3.3.16 Ensure changes to computerized systems or their environment are properly
documented and approved.
3.4 The Quality Assurance Representative has the following responsibilities:
3.4.1 Ensure that the system meets all regulatory, business, technical, and user
requirements.
3.4.2 Help prepare and review project documents to ensure they meet standards.
3.4.3 Review the implementation of the system's life cycle and ensure the system
remains in a validated state.
3.4.4 Ensure that the Validation Plan is appropriate and followed, and verify
equipment/instrument handover checklist after completion of validation.
3.4.5 Review system changes and ensure they are properly assessed and validated
to maintain the system's validated state.
3.4.6 Ensure proper documentation procedures are followed for changes to the
computerized system or its environment, and review the documentation produced along
with risk assessment.
3.5 The Quality Assurance Manager has the following responsibilities:
3.5.1 Define the overall quality standards for the system implementation.
3.5.2 Review and approve necessary project documents.
3.5.3 Review and approve changes to the computerized system or its environment.
3.5.4 Ensure proper documentation procedures are followed for changes to the
computerized system or its environment.
3.5.5 Ensure the system remains in a validated state.
3.5.6 Ensure all validation deliverables are approved, available and traceable.
3.6 The Contract Service Provider for CSV Support has the following responsibilities:
3.6.1 Follow the organization’s procedures and policies related to CSV and GDP.
3.6.2 Carry out pre-approved protocols and compile reports.
3.6.3 Submit all executed CSV deliverables to the User department.
3.7 The Responsibility Matrix for Development, Review and Approval of validation
documents is as follows:
The following list identifies the responsible parties for generating and approving each
deliverable. However, it should be noted that the responsible parties may vary
depending on the specific system being implemented. This information should be
outlined in the roles and responsibility section of the individual system validation plan or
validation summary report.
Activity | Doer | Reviewer and Approver

Concept Phase:

Project Phase:
Validation Plan (VP) | Supplier/ User | User, Engineering, IT, QA
Installation Qualification (IQ) | Supplier/ User | User, Engineering, IT, QA
Migration Qualification (MQ) | Supplier/ User | User, Engineering, IT, QA
Traceability Matrix (TM) | Supplier/ User | User, Engineering, IT, QA
Validation Summary Report (VSR) | User | User, Engineering, IT, QA
4.0 Definitions:
4.1.1 System Owner: The System Owner is the person who is responsible for managing
the entire lifecycle of the system, including procurement, development, integration,
modification, operation, maintenance, and retirement.
4.1.2 Administrator: An Administrator is responsible for maintaining the software, setting
system policies, managing user accounts, assigning privileges, and backing up and
restoring electronic data.
4.1.3 Backup: A Backup is a copy of current data, metadata, and system configuration
settings that is kept to facilitate recovery, including disaster recovery.
4.1.4 Archival: Archival is the process of safeguarding records from being altered or
deleted and storing them under the control of independent data management personnel
throughout the retention period.
4.1.5 Computer System Validation: Documented evidence that provides a high level of
confidence that a computerized system functions consistently and reproducibly
according to its intended use.
5.0 Procedure:
5.1 Validation activities may be performed by the company team or outsourced to
external service providers. Only qualified external service providers approved through
the vendor approval process shall be used.
5.2 If validation support is outsourced, ensure that the responsibilities of the
external service provider are clearly defined in the agreement.
5.3 Any supplier must perform validation activities according to the approach defined
in company documents.
5.4 This SOP does not apply to equipment or instruments that do not have an Industrial
PC (IPC), Human Machine Interface (HMI), or desktop PC attached, nor to
microprocessor-based equipment/instruments such as pH meters or weighing balances.
5.5 The Engineering and IT departments are responsible for reviewing process control
systems-related documents.
5.6 For existing (legacy) systems, a high-level risk assessment shall be performed to
evaluate and document their impact on GxP. An inventory list of GxP-impacting systems
shall be prepared, and existing computerized system validation documents shall be
considered valid.
5.7 Validation of GxP-impacting computerized systems shall be based on the following
criteria:
5.7.1 A high-level risk assessment
5.7.2 Categorization of the system
5.7.3 Implementation of a life cycle model for GxP impacting computerized systems
5.8 The High-Level Risk Assessment process evaluates the validation requirements and
follows these steps:
5.8.1 Identify the relevance to GxP
5.8.2 Assess electronic records and signatures
5.8.3 Identify the level of risk
5.8.4 Categorize the software as follows:
GAMP Category 1: Infrastructure Software
Description: (a) layered software upon which applications are built; (b) software used
to manage the operating environment. Examples: Operating Systems, Middleware,
Database engines, Network Monitoring tools, Scheduling tools, Spreadsheets.
Typical CSV Approach: Record the version number; verify correct installation by
following an approved installation procedure.
GAMP Category 2: This category is no longer used (formerly Firmware).
GAMP Category 3: Non-Configured Software
Description: Run-time parameters may be entered or stored, but the software cannot be
configured to suit business requirements. Examples: Laboratory instruments,
firmware-based applications, Commercial Off-The-Shelf (COTS) software.
Typical CSV Approach: Abbreviated life cycle approach; URS; record version number;
verify correct installation; testing against requirements; procedures in place for
maintaining compliance and fitness for intended use.
GAMP Category 4: Configured Software
Description: Software, often very complex, that can be configured by the user to meet
the user's business process; the software code is not altered. Examples: ERP, LMS,
LIMS, SCADA, HMI software.
Typical CSV Approach: Life cycle approach; risk-based approach to supplier assessment;
demonstrate the supplier has an adequate Quality Management System; record version
number; verify correct installation; risk-based testing to demonstrate the application
works as designed in the test environment; procedures in place for maintaining
compliance and fitness for purpose; procedures in place for managing data.
GAMP Category 5: Customized Software
Description: Software custom designed and coded to suit the business needs. Examples:
Internally and externally developed custom software, customized PLCs, custom firmware.
Typical CSV Approach: As for Category 4, plus a more rigorous supplier assessment;
possession of full life cycle documentation (FS, DS, and system build documents); and
design and source code review.
5.9 Based on the outcome of the high-level risk assessment, prepare a validation
strategy and minimum validation deliverables for each software category and its
identified level of risk. Based on the categorization, at least the following
validation deliverables are required:
5.9.1 For Category 1 Infrastructure software: Infrastructure software need not be
verified separately; it is verified during the installation qualification of the
application software.
5.9.2 For Category 3 Non-configured software: The operational manual or any other
relevant vendor document may be considered as the design specification or functional
specification.
Deliverables for Non-configured software:

Level of Risk | URS | HLRA | VP | DS/FS/CS | IQ | OQ | PQ | TM | VSR | OM
Low | √ | √ | | | √ | √ | | | √ |
Moderate | √ | √ | √ | √ | √ | √ | | √ | √ | √
High | √ | √ | √ | √ | √ | √ | √ | √ | √ | √
5.9.2.1 When the Operational Manual, Technical Manual, or any other relevant vendor
document is used as the Design/ Functional Specification, the required functionalities
shall be traceable in the Traceability Matrix.
5.9.3 For Category 4: Deliverables for Configured software:

Level of Risk | URS | HLRA | SVA | VP | FRA | DS/FS/CS | IQ | OQ | PQ | TM | VSR | OM
Low | √ | √ | √ | √ | √ | √ | √ | √ | | √ | √ | √
Moderate | √ | √ | √ | √ | √ | √ | √ | √ | √ | √ | √ | √
High | √ | √ | √ | √ | √ | √ | √ | √ | √ | √ | √ | √
5.9.3.1 For laboratory-based systems, software performance (i.e., PQ) may be verified
as part of the calibration activity.
5.9.4 For Category 5: Deliverables for Customized software follow the same risk-based
matrix as Category 4 (Low, Moderate, High levels of risk), with the additional
Category 5 deliverables identified in 5.8.4, such as full life cycle documentation
(FS, DS, and system build documents) and design and source code review.
5.10 Prepare the High Level Risk Assessment (HLRA) according to the respective format.
5.11 The following definitions apply to the HLRA:
5.11.1 Direct Impact on Product Quality: If a system’s function can change any
characteristic of the product, such as its physical form or chemical properties, it’s
considered to have a direct impact on the product quality. Examples include Co-mill,
Sifter, and HVAC system.
5.11.2 Indirect Impact on Product Quality: If a system’s function can’t change any
characteristic of the product, but it impacts equipment that directly affects product
quality, it’s considered to have an indirect impact on product quality. Examples include
Chillers and Hot Water systems.
5.11.3 No Impact on Product Quality: If a piece of equipment’s function doesn’t
change any characteristic of the product, it’s considered to have no impact on the
product quality. Examples include Lifting and Positioning systems.
5.12 System Implementation Life Cycle:
The computerized system life cycle is divided into five phases: Concept Phase, Design
Phase, Project Phase, Operation Phase, and Retirement Phase.
5.12.1 Concept Phase: In this phase, the overall scope of the business needs is defined,
and the type of system and overall requirements are identified.
5.12.2 User Requirements Specification (URS): A URS document is typically prepared to
indicate user requirements.
5.12.3 User Requirements Specification (URS) document: The URS document defines the
system requirements and expectations for data reliability and data security. The URS
shall include, but is not limited to:
5.12.3.1 Business Requirements
5.12.3.2 Interface Requirements
5.12.3.3 Security and Safety Requirements
5.12.3.4 Electronic Records, Reports and Electronic Signature Requirements
5.12.3.5 Audit trail Requirements
5.12.3.6 Backup, Archival and Restoration Requirements
5.12.3.7 Performance Requirements
5.12.3.8 Operational/ Maintenance Support Requirements
5.12.4 Each requirement in the User Requirement Specification (URS) should be
numbered to create a Traceability Matrix.
5.12.5 Use the User Requirement Specification template provided in this SOP.
5.12.6 The software or system must be evaluated for potential high-risk issues.
5.12.7 Identify potential suppliers and vendors for the system.
5.12.8 Evaluate the maintenance and technical support practices of vendors using the
respective format.
5.12.9 Choose a vendor based on their GxP-regulated industry experience, feedback
from their previous customers, and the quality of their product and support.
5.13 Design Phase: Specify the design elements of the system during this phase to meet
the URS requirements.
5.13.1 Provide detailed design specifications during this phase that show how the
requirements will be met. This phase primarily focuses on developing customized
software.
5.14 Project Phase: The vendor will provide a Functional Specification (FS), Design
Specification (DS), or Design Qualification (DQ) based on the User Requirement
Specification (URS). The provided documents will be reviewed and approved according to
the DQ Approval process. The Validation Plan will be prepared as per the respective
format to define the validation strategy.
5.14.1 The Validation Plan will be used to prepare for validation testing.
5.14.2 During this phase, the system hardware and software will be assembled/ installed
and configured based on the information supplied in the design specifications.
5.14.3 All system components will be integrated to complete the system architecture.
5.14.4 The Validation Team will review the functional/ design/ configuration
specifications to confirm that the system satisfies the User Requirement Specification.
5.14.5 The software operation manual or any other relevant system-specific document
may be used as functional/ design/ configuration specifications and must be approved.
5.14.6 If the vendor cannot provide the specification document, the User Department/
Engineering can prepare the functional/ design/ configuration specification for required
functionalities by referring to the operation manual or any other relevant system-specific
document. The document title will be based on the type of document.
5.14.7 If the software operation manual or any other relevant system-specific document
is used as the functional/ design specification, the required functionalities should be
traceable in the Traceability Matrix.
5.14.8 If the vendor cannot provide the configuration specification, the system
configuration (e.g. privileges, system policies, password policy, etc.) will be documented
in the SOP, or the User Department can prepare it by referring to the system-specific
document.
5.14.9 The Functional/ Design/ Configuration Specification must be prepared before the
preparation of qualification protocols.
5.14.10 A Functional Risk Assessment shall be carried out based on the functionality
of the software, and controls shall be examined to mitigate the risk. The Functional
Risk Assessment (FRA) shall be documented as per the respective format.
5.14.11 A mitigation strategy for the evaluated risks shall be developed and verified
through software design, verification of functions, SOPs, and training for the
business process.
5.14.12 The general criteria that should be considered for FRA are listed below (but not
limited to):
5.14.12.1 Unauthorized system access
5.14.12.2 Abnormal process condition at the time of process operation
5.14.12.3 Power failure condition
5.14.12.4 Communication failure of software/ hardware / network
5.14.12.5 Failure of control system / set parameters
5.14.12.6 Improper training
5.14.12.7 Improper system function
5.14.12.8 Procedures not available/ inadequate
5.14.12.9 Improper safety measures
5.14.12.10 Loss of data backup
5.14.12.11 Password policy not applied/ functional
5.14.12.12 Security policy not applied/ functional
5.14.12.13 Incorrect configuration
5.14.12.14 Audit Trail not configured/ functional
5.14.12.15 Procedures for calibration do not exist
5.14.12.16 Validation documents/ User manual not available
5.14.13 Before qualifying the system, a Functional Risk Assessment should be carried
out to identify any risks that should be addressed during the qualification process.
5.14.14 If the vendor is unable to provide the configuration specification, the user
should prepare a document according to the system-specific details and include it in
the Standard Operating Procedure (SOP).
5.14.15 Formal testing is essential to ensure that the computerized system has been
installed correctly and operates in accordance with the User Requirement and Functional
specification documents.
5.14.16 The testing will be conducted in three stages: Installation Qualification (IQ),
Operational Qualification (OQ), and Performance Qualification (PQ).
5.14.17 The testing will ensure that the system meets the GxP requirements and is
supported by documented evidence.
5.14.18 Typical tests in the Installation Qualification (IQ) protocol will include, but are
not limited to:
5.14.18.1 Software Version identification.
5.14.18.2 Verification of Hardware and Software Configuration.
5.14.18.3 Connected Instrument/ System identification.
5.14.18.4 Identification of Software Backup copy for disaster management control
5.15 An Installation Qualification Protocol shall be prepared following the respective
format. If the vendor prepares the protocol, it shall be approved using the respective
format.
5.16 If a backup copy of the software is not available, a letter of willingness from the
vendor should be obtained to support the system during a disaster condition.
Alternatively, this can be managed through an Annual Maintenance Contract (AMC) or
Service Level Agreement (SLA).
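Where a backup copy of the software is retained (5.14.18.4), its integrity can be confirmed against the source medium before it is relied upon for disaster recovery. The following is a minimal illustrative sketch in Python, assuming hypothetical file paths; it is not a tool prescribed by this SOP.

```python
# Illustrative sketch only: verifying a software backup copy against its
# source by comparing SHA-256 checksums. File paths are hypothetical.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Return the SHA-256 hex digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """The backup copy is acceptable only if its checksum matches the source."""
    return sha256_of(original) == sha256_of(backup)

if __name__ == "__main__":
    source = Path("installer/scada_v2.1.iso")   # hypothetical source medium
    backup = Path("backup/scada_v2.1.iso")      # hypothetical backup copy
    print("Backup verified:", verify_backup(source, backup))
```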
5.17 During the Operation and Performance Qualification (OPQ), system functionality
and performance need to be verified through various tests based on their relevance,
including but not limited to:
5.17.1 Audit trail verification
5.17.2 Access control verification
5.17.3 Communication Failure verification (Equipment to PC, PC to Server; as
applicable)
5.17.4 System Operation verification
5.17.5 Data backup and restoration verification
5.17.6 Time zone verification
5.17.7 Alarm/ Interlock and Reset Response verification
5.17.8 Input/ Output Verification
5.17.9 Loop/ Network Testing
5.17.10 Calculation verification
5.17.11 Power Failure and Restoration verification
5.17.12 Report verification
5.18 The equipment/ instrument’s time zone should be selected and set according to
India’s time zone.
5.19 For critical systems, both positive and negative case testing will be conducted to
test the system’s capabilities.
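As an illustration of positive- and negative-case testing, the sketch below shows pytest-style checks of an access control function. The `login` function is a hypothetical stand-in for the system under test, not an interface defined by this SOP.

```python
# Illustrative sketch only: positive- and negative-case tests for access
# control. `login` is a hypothetical stand-in for the system under test.
def login(username: str, password: str) -> bool:
    """Hypothetical authentication call of the system under test."""
    return username == "analyst01" and password == "CorrectPass!1"

def test_positive_case_valid_credentials_grant_access():
    # Positive case: the documented, expected use succeeds.
    assert login("analyst01", "CorrectPass!1") is True

def test_negative_case_invalid_password_is_rejected():
    # Negative case: the system must refuse access on a wrong password.
    assert login("analyst01", "wrong-password") is False
```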
5.20 Client-Server based systems may have separate or combined qualification protocols
for Server and Client configuration.
5.21 Screen captures should be attached to executed Qualification Protocols as evidence
for critical systems when required.
5.22 The OPQ protocol shall be prepared according to the respective format, and the
vendor's OPQ shall be approved accordingly.
5.23 Sometimes, it is suitable to combine different testing stages such as Installation
and Operational Qualification (IOQ) or Operation and Performance Qualification (OPQ).
5.24 The criteria for accepting whether a system is qualified or not will be described in
the appropriate qualification protocols.
5.25 All protocol documents will be approved beforehand.
5.26 Each test result will be signed and dated by the person who conducted the test, and
the test case will be reviewed by another person.
5.27 Any discrepancies or non-conformances where the actual results do not match the
expected results will be documented in the respective qualification protocol.
5.28 If any non-conformances/ discrepancies require further effort to resolve, the
resolution process will be documented.
5.29 After completing all tests successfully, the executed protocols shall be approved.
5.30 To ensure the accuracy of the system, all draft and final SOPs that support the
system shall be verified.
5.31 Migration Qualification:
5.31.1 The Migration Qualification (MQ) includes verifying original data, migrating data,
and verifying the migrated data.
5.31.2 During the implementation of certain systems, data migration may be required, such as during:
5.31.2.1 New system Implementation
5.31.2.2 System upgrade
5.31.2.3 System retirement
5.31.2.4 Data archival
5.31.3 Data migration involves moving the data in cases including, but not limited to:
5.31.3.1 Database or software upgrade
5.31.3.2 Originating System to a different system
5.31.3.3 One server to another server
5.31.3.4 Multiple data sources to one data source
5.31.4 In order to perform the migration qualification, several parameters need to be
verified, which may vary depending on the type of system (an illustrative sketch
follows this list):
5.31.4.1 Verify that the original data is intact just before migration.
5.31.4.2 Verify the correctness of the migrated data, including raw data and audit trail.
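A minimal sketch of such a verification, assuming hypothetical SQLite databases and a hypothetical table name (the actual systems and tools will differ): it compares record counts and per-record checksums between the source and the target of the migration.

```python
# Illustrative sketch only: comparing record counts and per-record checksums
# between the source and target of a data migration. Database and table
# names are hypothetical.
import hashlib
import sqlite3

def record_fingerprints(conn: sqlite3.Connection, table: str) -> dict:
    """Map each record's first column (key) to a SHA-256 hash of the row."""
    rows = conn.execute(f"SELECT * FROM {table} ORDER BY 1").fetchall()
    return {str(row[0]): hashlib.sha256(repr(row).encode()).hexdigest()
            for row in rows}

def verify_migration(source_db: str, target_db: str, table: str) -> bool:
    with sqlite3.connect(source_db) as src, sqlite3.connect(target_db) as tgt:
        before = record_fingerprints(src, table)
        after = record_fingerprints(tgt, table)
    if len(before) != len(after):   # record count must be unchanged
        return False
    return before == after          # every record must match exactly

print(verify_migration("legacy_lims.db", "new_lims.db", "sample_results"))
```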
5.31.5 If a new system requires migration qualification, the migration plan shall be
included in the validation plan.
5.31.6 If a vendor performs the migration qualification for a system upgrade, the
migration qualification protocol shall be approved as per the respective format.
5.32 The Traceability Matrix should be updated to connect the User Requirement
Specification with the tests designed in the IQ, OQ, PQ, OPQ, MQ protocols.
5.33 The Traceability Matrix needs to be updated before the validation summary report
as per the respective format.
5.34 Any accepted discrepancy from the URS should be documented in the ‘Comment’
section of the Traceability Matrix table.
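As an illustration of how numbered URS points (5.12.4) are linked to protocol tests, the short sketch below builds a simple Traceability Matrix and flags uncovered requirements; all identifiers are hypothetical examples.

```python
# Illustrative sketch only: linking numbered URS points to the protocol
# tests that cover them. All identifiers are hypothetical.
urs_points = ["URS-001", "URS-002", "URS-003", "URS-004"]
test_coverage = {
    "IQ-01": ["URS-001"],                # e.g., software version identification
    "OQ-03": ["URS-002", "URS-003"],     # e.g., audit trail, access control
    "PQ-02": ["URS-003"],
}

# Invert the mapping: for each URS point, list the tests that verify it.
matrix = {urs: [] for urs in urs_points}
for test_id, covered in test_coverage.items():
    for urs in covered:
        matrix[urs].append(test_id)

for urs, tests in matrix.items():
    # An accepted gap would be recorded in the 'Comment' section (5.34).
    print(urs, "->", ", ".join(tests) if tests else "NOT COVERED")
```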
5.35 The Validation Summary Report should summarize the results of the validation
activities and provide a decision to release the System for use.
5.36 The system can only be used for operations once the Validation Summary Report
for a particular system has been approved.
5.37 The Validation Summary Report should include information about:
5.37.1 Decision on System Release
5.37.2 List of approved deliverables
5.37.3 Result of tests and open deviations from expected results, if any
5.37.4 Discussion and conclusion, including any system limitations, open deviations
(if applicable) and, if required, follow-up activities
5.37.5 Confirmation that all training associated with users and technical support
personnel has been completed and logged
5.40 The Validation Summary Report needs to be prepared according to the respective
format.
5.41 If there are multiple systems associated with a single software system, an interim
summary report needs to be written and approved for individual systems before
releasing the system for official use. Once all associated computerized system validation
is completed, a final summary report can be developed.
5.42 External Validation Support:
5.42.1 Validation activities may be outsourced to external service providers.
5.42.2 External service providers must be qualified through a vendor approval process.
5.42.3 The outsourced validation support’s responsibilities must be mentioned in the
agreement.
5.42.4 The vendor must perform the validation activities according to the approach
defined in this SOP.
5.43 Vendor’s Qualification Document:
5.43.1 The vendor's qualification documents may be acceptable, subject to review for
adequacy.
5.43.2 Review the vendor qualification documents and record the evaluation outcomes
in the respective formats, which must be authorized by the QA department.
5.43.3 If the vendor document already has a document number, keep the vendor's
numbering system as it is.
5.44 The computer system validation documents should be numbered according to the
convention described below.
5.45 Each validation document should have a revision number.
5.46 The first version of the document should be marked as “00”. Subsequent revisions
shall be numbered “01”, “02”, “03”, “04”, and so on.
5.47 When an existing document is revised, it should be numbered according to the
current version of this SOP.
5.48 If there are any modifications/ changes in the existing system, the addendum
document’s numbering system should start from “Addendum 01”, followed by
“Addendum 02”, “Addendum 03”, and so on.
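A small illustrative sketch of the revision and addendum numbering convention in 5.46 and 5.48:

```python
# Illustrative sketch only: the revision and addendum numbering convention.
def next_revision(current: str) -> str:
    """Increment a two-digit revision number: "00" -> "01" -> "02" ..."""
    return f"{int(current) + 1:02d}"

def addendum_label(sequence: int) -> str:
    """Addendum numbering starts from "Addendum 01"."""
    return f"Addendum {sequence:02d}"

assert next_revision("00") == "01"
assert next_revision("09") == "10"
assert addendum_label(1) == "Addendum 01"
```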
5.49 The Equipment/ Instrument handover checklist should be completed by IT
personnel and verified by the User and QA department, preferably with the Validation
Summary Report.
5.50 Retirement Phase:
5.50.1 When the system no longer meets the company's business needs or business
operations, it should be retired or decommissioned according to QMS documents.
5.50.2 A defined retirement plan should be in place that outlines how system data will be
archived and restored in the future or how hardware and software components will be
decommissioned.
5.50.3 The data should be available or restored for audit purposes for the length of time
required by the relevant regulatory authorities.
5.51 Strategy for Maintaining the Validated State:
5.51.1 Change Control: Any modifications made to the computerized system or its
related documents that may impact the software validation status must be documented
through Quality Management System, i.e., change control.
5.51.2 Upgrades of the system must be done through the change management process,
and system data should be transferred in an orderly manner to the new application
software or alternatively archived.
5.51.3 Error Handling: Any error, incident or defect encountered with the system that
may affect data integrity, product quality, or patient safety must be investigated and
handled according to the Quality Management System, i.e., Deviation.
5.51.4 Periodic Review: Systems must be reviewed periodically during equipment re-
qualification. Periodic review helps to determine the changes that have occurred on the
system, the effects of those changes, the status of system documentation such as SOPs
and verification of active users. This review must ensure that the system and all related
documentation are current at all times.
6.0 Formats:
1. Format for High Level Risk Assessment
Table of Contents
1. Purpose
2. Scope
3. References
4. System Overview
5. Responsibilities
6. GxP Relevance
Sr. No. | Question | Yes/No
2 | Is the system used in the collection, analysis or storage of data from pre-clinical studies or clinical trials? |
3 | Is the system used to produce or process data that will be used in (drug) regulatory submissions? |
If one of the questions is answered with “Yes”, the System is classified as “GxP relevant”
and Computer System Validation is required.
7. Electronic Records, Electronic Signature
Applicable if the system is classified as “GxP Relevant”
Sr. No. | Question | Yes/No/NA
1 | Does the system support business processes that are part of the product development, registration, manufacturing or the distribution of products that are to be delivered to the US and/or European market and/or to any other market that is regulated regarding ERES? |
If 7.2 and/or 7.3 is answered with “Yes”, “Electronic records” apply to the system.
If 7.4 is answered with “Yes”, “Electronic signatures” apply to the system.
8. Level of Risk
For a GxP-relevant system, determine the ‘Level of Risk’ (severity of harm) based on
the system's impact with respect to GxP relevance, business relevance, and failure
consequence. Use the table below for this purpose.
Assessment Steps:
Assessment regarding GxP relevance (Yes/No)
Assessment regarding Business relevance (Yes/No)
Assessment regarding Failure Consequence (High/ Moderate/ Low)
If a system is classified as GxP relevant, it must automatically be classified as
business relevant, with at least a Moderate failure consequence.
If a system impacts business activities such as Research and Development, Quality
Control, Production, or Sales, it should be classified as business relevant.
If a system is classified as neither GxP nor business relevant, the failure
consequence classification should be Low.
If a system has a direct impact on product quality or patient safety, the failure
consequence shall be High.
If a system has an indirect impact on product quality or patient safety, the failure
consequence shall be Moderate.
If a system has no direct or indirect impact on product quality or patient safety,
the failure consequence shall be Low.
Scores: Yes = 10, High = 20, Moderate = 08, Low = 02

GxP Relevance | Business Relevance | Failure Consequence | Total Score
No | Yes | High | 30
No | Yes | Moderate | 18
No | Yes | Low | 12
No | No | Low | 02
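The additive scoring above can be expressed as a short sketch (illustrative only). It also enforces the rule that a GxP-relevant system is automatically business relevant with at least a Moderate failure consequence.

```python
# Illustrative sketch only: the additive HLRA scoring shown in the table
# above (Yes = 10, High = 20, Moderate = 08, Low = 02).
CONSEQUENCE_SCORE = {"High": 20, "Moderate": 8, "Low": 2}

def hlra_score(gxp_relevant: bool, business_relevant: bool,
               failure_consequence: str) -> int:
    if gxp_relevant:
        # A GxP-relevant system is automatically business relevant and
        # carries at least a Moderate failure consequence.
        business_relevant = True
        if failure_consequence == "Low":
            failure_consequence = "Moderate"
    score = (10 if gxp_relevant else 0) + (10 if business_relevant else 0)
    return score + CONSEQUENCE_SCORE[failure_consequence]

assert hlra_score(False, True, "High") == 30      # matches the table row
assert hlra_score(False, True, "Moderate") == 18
assert hlra_score(False, False, "Low") == 2
```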
9. GAMP Categorization
10. Summary
Respond to the questions according to sections 6 to 9 of this document.

Question | Response
Sr. No. | Document Name/Type | Submitted (Yes) | Submitted (No) and the reason provided
1 | Organizational chart | |
Severity (S) | Description
Direct Impact on Product Quality: A system function that can change any
characteristic of the product with respect to physical form or chemical properties is
considered to have a direct impact on product quality.
E.g. Co-mill, Sifter, HVAC system
Indirect Impact on Product Quality: A system function that cannot itself change any
characteristic of the product, but that impacts equipment with a direct effect on
product quality, is considered to have an indirect impact on product quality.
E.g. Hot water system, Chiller, etc.
No Impact on Product Quality: An equipment function that does not change any
characteristic of the product with respect to physical form or chemical properties is
considered to have no impact on product quality.
E.g. Lifting and positioning system
For each risk, its frequency of occurrence (O) will be rated, determined, and
documented (Low = 1, Medium = 2, High = 3; e.g., “Occurs occasionally” = 2).
For each risk, its likelihood of detectability (D) will be rated, determined, and
documented (Low = 3, Medium = 2, High = 1).

The resulting risk score is classified as follows:
Risk Score | Level of Risk
1 to 4 | Low (L)
6 to 8 | Medium (M)
9 to 27 | High (H)
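A sketch consistent with the rating scales above is shown below. The numeric severity mapping (Direct = 3, Indirect = 2, No impact = 1) is an assumption inferred from the 1-to-27 range of the classification table, not a value stated in this format.

```python
# Illustrative sketch only: classifying a risk from Severity x Occurrence x
# Detectability. The severity values are an assumed mapping (see lead-in).
OCCURRENCE = {"Low": 1, "Medium": 2, "High": 3}
DETECTABILITY = {"High": 1, "Medium": 2, "Low": 3}       # hard to detect = 3
SEVERITY = {"No impact": 1, "Indirect": 2, "Direct": 3}  # assumed mapping

def risk_level(severity: str, occurrence: str, detectability: str) -> str:
    score = SEVERITY[severity] * OCCURRENCE[occurrence] * DETECTABILITY[detectability]
    if score <= 4:
        return f"Low (score {score})"
    if score <= 8:
        return f"Medium (score {score})"
    return f"High (score {score})"

print(risk_level("Direct", "Medium", "Low"))   # 3 * 2 * 3 = 18 -> High
```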
Sr. No. | URS Point No. | Description of URS point | Qualification Document Number | Test Number | Pass/Fail | Status
Verified by:
Sign/ Date
8. Format for Validation Summary Report
1. Purpose
2. Scope
3. References
4. System Overview
5. Responsibilities
6. Qualification Test Result
7. Summary of the Validation Activities
8. Deviation from Validation Master Plan
9. System Release
10. Conclusion
11. Attachment
12. Revision History
9. Format for Equipment/Instrument handover checklist
The following checkpoints need to be verified before handover of the equipment/instrument:

Sr. No. | Checkpoints | Compliance Status | Verified by Sign/ Date