Considerations For Computerized System Validation in The 21st Century Life Sciences Sector
Chapter 1
CONTENTS
Introduction
Life Cycle Processes
Computerized Systems Validation
    Validation SOPs
    System and Application Evaluation
    System Register
    Validation Plan
    Qualification Protocols
        Design Qualification
        Installation Qualification
        Operational Qualification
        Performance Qualification
    Validation Report
    Periodic Review
Computerized Systems Development
    User Requirements Specification (URS)
    Supplier Audit
    Supplier Quality Management System
    Requirements Traceability
    System Development Life Cycle
    System Description
    Development Methodologies
    System Development Testing
    Data Migration
    System Documentation
Computerized Systems Operation
    System Location
    System Changes
    Error and Problem Reporting
    Backup and Archiving
    Disaster Recovery and Business Contingency Planning
    System Maintenance and Support
    System Decommissioning and Retirement
    Inspection Readiness
Electronic Records and Electronic Signatures
    FDA Guidance for Industry, Part 11, ERES – Scope and Application
        Predicate Rules and Part 11
        Narrow Interpretation of Scope
        What the FDA Still Intends to Enforce When Part 11 Applies
        The Life Sciences Industry’s Responsibilities
        Validation
        Hybrid Systems (Paper and Electronic Records)
        Electronic Signatures
        Audit Trails
        Copies of Records
        Retention of Records
        Legacy Systems
    Electronic Records Compliance
        Validation Requirements
        System Access
        Data Entry Checks
        Audit Trails
        Copies of Data
        Data Security
        Batch Release and Qualified Persons
        Documentation Control
        Open Systems
    Electronic Signatures Compliance
        Association with Electronic Records
        Accountability
        Meaning and Timestamps
        Security – Personnel
        Security – Equipment
INTRODUCTION
As the U.S. Food and Drug Administration (FDA) embarks on its systems inspection
and risk-based compliance approach for manufacturing systems, as worldwide
recognition of GAMP guidance grows, and as a new European inspection focus is
identified in the PIC/S Guide, this chapter examines some key considerations
and issues that continue to cause problems.
The prime objective of applying GxP regulations is to minimize the risk and
potential danger of producing adulterated or ineffective life sciences products and
releasing those products to the market. The GxP concept was instigated by the
World Health Organization (WHO) in 1967. The FDA released the final good
laboratory practice (GLP) rule in June 1979 with the current version of good
manufacturing practice (GMP) dating back to 1992. Both the WHO GMP and GLP
concepts require validation of critical processes and systems, and define validation
as the documented act of proving that any procedure, process, equipment, material,
activity, or system actually leads to the expected results. The major life sciences
sector development, manufacturing, and user countries have defined their own sets
of requirements, e.g., the U.S., European Community (EU), Australia, Japan,
Switzerland, and Canada.
The regulations are the legal requirements of the country in which a life sciences
sector product will be (or is already) marketed. Generally, by the nature of
governmental due process in establishing each new law, regulations do not change
significantly over time. Hence, the wording and terminology of earlier regulations
are not always easy to follow when applying more recent technology. Where
appropriate, the life sciences company must therefore consider and adopt current
good practice as well as the more traditional interpretation of the regulations. To
emphasize this responsibility and to indicate that good practice is continuously
evolving, the U.S. FDA promotes the term current good manufacturing, laboratory,
clinical practice (cGxP), which recognizes the availability and application of new
process and technology developments.
Significant improvements were made in the validation of computerized systems
in the latter part of the 20th century. The FDA Blue Book (1983) kick-started the
process by interpreting the GMP predicate rules for computerized systems, and this,
supported by industry sector guidance documents such as the STARTS (Software
Tools for Application to Large Real-Time Systems) Guide from the National
Computing Centre (1987) and the introduction of the TickIT scheme (1991), which
provides a software interpretation of the international quality standards (latterly ISO
9000:2000), led to the publication of the GAMP Guide and its predecessors.
Individual and collaborative efforts by the life sciences sector and its specialist
consultants, the regulatory authorities, and system suppliers have, over the years,
produced guidance on the activities and documentation that are needed to ensure
successful validation. Table 1.1 compares the phases of the validation life cycle as
defined by the PDA Technical Report 18 and the GAMP Guide. It is noticeable that
the industry focus on guidance and methodologies devised to achieve validation of
computerized systems has assisted validation programs in other areas of GxP
compliance.
In the years following the publication of the FDA Blue Book the regulatory
authorities issued some additions and changes in the regulations and guidance
relating to GMP, most notably in the field of medical devices. Apart from upholding
the concept of life cycle validation most of these did not have a significant effect on
the general approach adopted by the life sciences sector for ensuring the compliance
of computerized systems. However, all of this was to change with the publication of
21 CFR Part 11 in August 1997. The document introduced a step change in the level
of compliance expected by the U.S. FDA, and this has carried forward into the
current century.
Validation is the governing process for the complete life cycle of a computerized
system and must include the stages of planning, specification, programming,
testing, commissioning, operation, monitoring, modification, and retirement [8 §2].
Specific information on the validation of computerized systems which supports
regulatory requirements in the life sciences sector [1–3] can be found in the GAMP
Guide [7] and PIC/S Guidance [14].
Figure: Scope of a computerized operation, comprising computer applications and infrastructure, instrumentation/regulating devices (as applicable), and SOPs (the computer system), the controlled process, and the operating environment.
The implementation of a life cycle approach for computerized system validation
is now an established process in life sciences, and is promoted in both the software
sector (via the TickIT scheme) and in the life sciences sector guidance (via GAMP).
The V-Model became a recognized life cycle development model when used by
the U.K. National Computing Centre and Department of Trade and Industry in the
STARTS Guide in 1987. It also became a system and software standard for the
German Federal Armed Forces in 1991–1992. The model can therefore be
adapted and used for the validation of computer systems applied in the regulated
life sciences industry [7].
The V-shaped model in Figure 1.2 illustrates in chronological order the life cycle
activities for prospective validation. The user requirements and design specifications
are the source of phase-specific test plans that enable formal qualifications to be
conducted and reported. Each task on the life cycle, when successfully executed,
results in documented evidence in support of the validation program.
For IT infrastructure platforms the focus is on separate risk-based qualification
for the “standard” infrastructure hardware and software components, layers, or
platforms. For infrastructure component functionality that is used solely when
operating in conjunction with a business software application that shares the IT
infrastructure, the qualification testing will be part of the OQ and PQ stages of the
software application validation. This is depicted in the validation life cycle model
in Figure 1.3.
With this in mind, and recognizing the size and complexity of some infrastructure
platforms, and for operations where GxP (and business) critical data, parameters,
and functions are involved, there is a need to examine the type and level of
“positive” and “negative” testing. This is necessary to demonstrate critical data
integrity, security and availability, and operational reliability and repeatability.
For “retrospective validation,” emphasis is placed on the assembly of appropriate
historical records for system definition, controls, and testing. Systems that are not
well documented, do not demonstrate change control, or do not have approved
test records cannot be considered candidates for retrospective validation as
defined by the regulatory authorities [14].
Hence, for existing (legacy) systems that are in operational use and do not meet
the criteria for retrospective validation, the approach should be to establish
documented evidence that the system does what it purports to do. The validation
methodology to be followed for existing computer systems will therefore require
elements of the prospective validation life cycle to facilitate the process of
generating system specifications and qualification test records [14].
Figure: Infrastructure platform qualification life cycle, from the URS, FDS, and platform specification and design document set, through platform build and development/configuration and integration (DQ, IQ), to platform OQ and PQ and maintaining validated status.
Difficulties that can arise include a lack of up-to-date (or incomplete)
documentation for critical parameter definitions, system specifications, operator
interfaces, definition of system or data logs, reports and records, and test and
calibration records. Add to this the potential for compromise posed by the availability
of a “live” system during qualification testing, and the complications increase. In
addition, quality and engineering procedures for ensuring controlled application
and operation of the system may be nonexistent or may not focus on issues related
to the computer system or software application.
So, for legacy systems, determining the status and control of existing procedures
and records will form the basis of the validation rationale, extent of redocumenting,
degree of testing, and the validation life cycle entry level (see Figure 1.4). This
approach is sometimes referred to as retroactive validation. When supported with
an appropriate level of extended system performance monitoring and analysis, it
can provide an effective method of accomplishing validation of existing (legacy)
systems.
The issues raised are by no means insurmountable. However, with the complexity,
duration and cost of validating existing systems, it would also be prudent to study
the implications of replacing the computer system and starting afresh.
The validation process is itself a risk-limiting exercise and as such is the prime
process for streamlining computer system applications, and controlling the life
cycle phases through structured and documented procedures and records.
Figure: Legacy system validation entry, showing document evaluation and risk assessment, system specification, validation documentation update, validation report, and system retirement.
Key Considerations
• Identify who has the necessary expertise and training for each validation activity
[14]. Where nobody with the necessary experience is available, alternative
arrangements must be employed (e.g., train existing staff, hire new experienced
staff, or hire consultants). If these measures are employed there may be time or
cost implications for the project which should be taken into account.
• The initial planning activities must be well defined, and for prospective systems,
a documented gap-analysis of validation control or support procedures should
be undertaken to establish readiness for validation planning. In the case of
legacy systems, it is normal to undertake a compliance assessment to determine
the validation status of the system. These preparatory examinations will identify
any corrective actions that may be required, and allow inclusion of remediation
considerations and the resulting validation rationale in the validation plan.
• For legacy systems, it can also be beneficial to compile a “history file,”
capturing all existing documentation and records as the baseline for adjudging
the validation rationale and strategy. Further, changing regulations or their
interpretation may have rendered the system’s operational and technical control
capabilities inappropriate or inadequate. Such issues need to be addressed to
decide suitable risk-based remedial actions. Similarly, legacy systems should be
examined to verify that data residing within the system are shown to have
integrity and that data archived are secure and accessible.
• The life cycle steps align closely with the project stages for new computer
system applications where, in general, structured project management methods
are employed. These, together with good software engineering practice, can
provide the platform for successful validation. With this in mind it is recognized
that a large proportion of documentation required for validation can be
generated from the controlled implementation of a well-planned project.
The revised 21 CFR Part 11 guidance [17] has reduced the number of systems that
might be bound by the requirements of this regulation. However, even if a system no
longer falls within the scope of 21 CFR Part 11, this does not remove the need to
validate it appropriately in order to fulfill the requirements of the applicable
predicate rules, for example, 21 CFR Part 211.68.
Key Considerations
• In addition to the usual objectives of validation the key requirements for data
integrity, security, and auditability are paramount.
• To achieve and maintain validated computer systems, a life sciences company
must demonstrate commitment and should require that quality-related critical
parameters, data, and functions be the focus of each phase of the validation
life cycle, including design and development.
• During the procurement of new computer systems, validation requirements and
costs (including the cost of allocating or hiring suitable personnel) should be
considered as part of the capital approval process.
• The FDA does not formally require a validation master plan (VMP), but
inspectors will ask what your approach toward validation is and what steps you
have taken to validate a specific system. However, the EU GMP Directive
Annex 15 formally requires a VMP.
• The operational life of a system including its decommissioning or retirement
must be under a documented and ongoing evaluation. This must include periodic
reviews, performance monitoring, change control, configuration management,
and any necessary or scheduled revalidation.
• The control and support procedures prerequisite to any validation program
include document control, training, configuration management, change control,
security access, incident management, contingency planning, and
decommissioning or retirement. These need to be established (i.e., approved,
signed, current, in place, and in use) [14].
• For large or complex systems such as ERP applications, the level and type of
testing need to be considered. Generally, basic functional testing will be
conducted, but for quality-related critical data and functions the degree of
“negative or invalid” challenge testing to be carried out has to be determined.
To best attain this, the business (software) modules operating in the GxP
environment first need to be identified, along with the instances of electronic
records and signatures required by the predicate rules. Thereafter, an assessment
to determine the risk of inadvertent or deliberate action that can directly impact
critical data [12] will provide direction as to the type of testing that can
demonstrate that the system and its data are secure.
Validation SOPs
Key Considerations
• The early planning and definition of the validation life cycle processes.
• SOPs must contain clear and unambiguous instructions for each stage of the life
cycle process.
• Identify who has the necessary expertise, with supporting documentary
evidence.
• Restrict the number of persons who can sign SOPs to the absolute minimum.
Include QA and a person proficient in the subject area.
• For up-front tasks in the planning phase, procedures will be required for the
provision of approved validation guidelines and procedures, risk assessment,
documenting the validation rationale and strategy, and for determining the
quality-related critical parameters and data for each application.
• Suitable standards must be provided for accepting manufacturer-supplied
documentation for specific life cycle stages (i.e., what should be included in the
manufacturer’s documentation for it to be accepted, what documentation should
be produced to demonstrate that it has been accepted, and what to do if it does
not fully meet the life science company’s standards).
• Define the conditions under which external consultants may be used and the
process for appointing them.
System and Application Evaluation
The requirement for validation and the need for a formal assessment for electronic
records and signatures must be determined through a documented impact and
criticality assessment process that evaluates the regulatory context of the business
processes and any data supported by the computerized system. All computerized
systems used in support of regulatory requirements or regulated processes must be
assessed. This assessment may either be a stand-alone document, i.e., a validation
determination statement form (providing a high-level GxP criticality assessment) or
may form part of a more detailed study conducted under methodologies that address
applicable regulations or predicate rules, and GxP criticality.
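To make the assessment concrete, the following is a minimal sketch (in Python, which the chapter does not prescribe) of how a validation determination record might be captured. The class and field names, and the simple decision logic, are illustrative assumptions rather than anything mandated by the regulations or by the GAMP Guide.

    from dataclasses import dataclass, field
    from datetime import date
    from typing import List

    # Illustrative sketch only: field names and the decision logic below are
    # assumptions, not taken from any regulation or guidance document.

    @dataclass
    class ValidationDetermination:
        system_name: str
        business_process: str
        predicate_rules: List[str]       # e.g. ["21 CFR Part 211.68"]
        gxp_critical_data: bool          # system creates or holds GxP-critical data?
        electronic_records: bool         # maintains records required by predicate rules?
        electronic_signatures: bool      # applies signatures to those records?
        assessed_by: str = ""
        assessment_date: date = field(default_factory=date.today)

        def validation_required(self) -> bool:
            """A system used in support of a regulated process requires validation."""
            return bool(self.predicate_rules) or self.gxp_critical_data

        def part11_assessment_required(self) -> bool:
            """Electronic records or signatures trigger a separate ERES assessment."""
            return self.electronic_records or self.electronic_signatures

    lims = ValidationDetermination(
        system_name="LIMS",
        business_process="QC sample management and release testing",
        predicate_rules=["21 CFR Part 211.194"],
        gxp_critical_data=True,
        electronic_records=True,
        electronic_signatures=True,
        assessed_by="QA",
    )
    print(lims.validation_required(), lims.part11_assessment_required())  # True True

In practice the determination would be a controlled, signed document; the sketch simply shows how the same decision criteria can be recorded in a structured, auditable form.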
Key Considerations
System Register
Key Considerations
Validation Plan
A validation plan (VP) based on current industry practice [7], system criticality and
business risk, and regulatory implications, must be produced. This defines the
nature and extent of the validation life cycle activities to be performed (what will
be done, and what will not be done and the reasons why). Subordinate VPs may be
produced for large or complex systems. It is also permissible for the computerized
system VP to be incorporated into the overall validation project plan for a
manufacturing process or an item of equipment. Information on the content of VPs
is provided in the GAMP Guide [7].
Key Considerations
• The VP can be the first instance where compliance and validation training, and
system design, operation, and maintenance training, are formally identified
and scheduled for review. This will ensure the computer system is specified,
designed, implemented, and operated by individuals who are trained (on a
continuing basis) to a level commensurate with their work function.
• Identification of the validation activities following a risk-based approach.
• Preparation of an infrastructure qualification plan that takes into account
applicable “predicate rules” and addresses the following:
– Overview of business and compliance needs.
– Identify control and support procedures to be adopted throughout
implementation, operation, and training (both GxP and operational), e.g.,
change control, configuration management, backup–compare–restore, data
migration, data archival or retention, problem management, performance
monitoring, internal audits, periodic review, and service level agreements.
– Identify where supplier specifications and procedures can be adopted.
– Overview of main platform architecture and components, both hardware
and software (operating systems, utilities, drivers, and tools).
– Intended operating “boundaries” for hardware and software within the
infrastructure platform.
– Current view of all “application software” that uses (shares) the platform,
i.e., both GxP and nonGxP, and identify intended use of electronic records
and electronic signatures.
– Overview of all known intersystem, software, and hardware interfaces.
– Overview of data flow and ownership (the “responsibles” and “accountables”).
– Identify “controls” (both technical and procedural) for infrastructure
platform and application software.
– Identify risk assessments required and, with focus on “data integrity” and
“data security,” ensure an appropriate level of risk-based qualification
testing.
– Identify validation rationale and life cycle for the platform infrastructure
(prospective and existing (legacy) components).
– Outline testing strategy for platform components installation and
operational qualification.
– Define platform implementation and qualification testing and verification
of the application software implementation and validation program.
• There will be occasions when validation preparatory and support activities will
be undertaken before the respective validation plans are finalized. This can be a
common occurrence when reviewing existing system validation status. If there
are no such activities prior to compiling the validation plan this should be
clearly stated as: “No preplan activities have been identified.” If preplan
activities have been conducted these should be identified in the validation plan
as such. Include detail and record references of all validation preparatory,
support, or life cycle activities that were undertaken prior to issue of the
validation plan. With regard to the impact on the system validation these earlier
activities should be evaluated and resolved under the change control or risk
assessment procedure. The result of each evaluation should be referenced in the
validation plan (typically as an appendix). Examples of such preplan activities
include:
– Review and prepare validation policy.
– Supplier or integrator audit rationale.
– Conduct supplier or integrator audits.
– Preordering of “standard” hardware and software system components.
– Gap-analysis of existing validation life cycle and support procedures,
complete with records and documentation where relevant, including high-
level quality system documentation and record and signature predicate rule
requirements.
– Where the implementation embraces an upgrade, and thus a significant change
to software or hardware, reference the change control evaluation, any
resulting rationale for a satisfactory level of redefinition, and any
associated requalification testing or system revalidation [14].
– Examination of the validation status of any “interfaced” systems with regard
to data integrity and data control, and the rationale for any redefinition,
requalification testing, or system revalidation.
– Preparation or review of prerequisite guidelines and procedures required to
support and control the validation life cycle activities, e.g., change control,
periodic review, performance monitoring, etc.
– Preparation of an overall computer systems register (GxP, nonGxP, and new
systems under way), including the identification of the validation rationale
and status of each system (a minimal sketch of such a register follows this list).
– Conducting compliance and validation training [14].
– Conducting risk assessment training.
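As a purely illustrative sketch of the computer systems register mentioned above, the following Python fragment shows one way such an inventory might be structured and persisted. The field names, example systems, and file name are assumptions, not prescribed by any guidance.

    import csv
    from dataclasses import dataclass, asdict

    # Hypothetical register entry; the fields reflect the text (GxP status,
    # validation rationale, validation status) but are assumptions.

    @dataclass
    class RegisterEntry:
        system_id: str
        name: str
        gxp: bool                  # GxP or nonGxP
        lifecycle_state: str       # "new", "operational", "retired"
        validation_rationale: str  # e.g. "prospective", "retroactive", "not required"
        validation_status: str     # e.g. "validated", "remediation in progress"

    register = [
        RegisterEntry("SYS-001", "Chromatography data system", True,
                      "operational", "retroactive", "remediation in progress"),
        RegisterEntry("SYS-002", "Building management system", False,
                      "operational", "not required", "n/a"),
    ]

    # Persist the register so it can be maintained under document control.
    with open("system_register.csv", "w", newline="") as fh:
        writer = csv.DictWriter(fh, fieldnames=list(asdict(register[0]).keys()))
        writer.writeheader()
        for entry in register:
            writer.writerow(asdict(entry))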
Qualification Protocols
Design Qualification
In addition to the scheduled design reviews carried out by the supplier as part of the
system design and development process, the customer must also conduct and
document its own design reviews to ensure that the system design meets its
technical and quality needs as detailed in the user requirements specification
(URS). These customer reviews also verify, for the URS, functional specification,
and subsequent design documentation, that:
• All documentation has adequate revision control and an audit trail referring all
the way back to an initiating document or instruction (see section on
requirements traceability).
• All documentation has the required signatures.
• The documentation is presented in a form that will enable information to be
easily found, and assist in ongoing system maintenance and future changes or
modifications.
• The quality processes followed by the supplier are compliant with the supplier’s
quality management system for system design, testing, and documentation.
• The supplier has complied with any customer quality requirements and SOPs.
The following system design issues should be examined during each DQ review.
• The hardware and software meet the criteria defined in the URS and FS (see
section on requirements traceability).
• Where applicable, the clauses in the hardware and software development
specifications have been written in a form which will enable a suitable test to be
identified and specified.
• The test clauses in the hardware and software test specifications and system
acceptance test specifications are traceable to the appropriate design clauses in
the hardware and software development specifications. The individual tests are
risk-based and sufficiently detailed, appropriate to the item under test,
measurable and recordable, achievable and realistic.
• The hardware and software have been developed according to the predefined
procedures or standards.
• Full, accurate, and current documentation of the hardware and software exists
and is readily understandable by a suitably qualified person. Diagrams have
been used, where applicable, to assist understanding (see section on
requirements traceability).
• A risk analysis has been carried out on the computerized system to identify and
resolve any potential risks to the public, personnel, plant, and the business (see
section on risk assessment).
• All system functions that are directly related to GMP are identified in the
documentation, and the implementation requirements for these functions have
been examined and reported in the GMP risk assessment (see section on risk
assessment).
• An electronic records and electronic signatures assessment has been carried out,
where the need has been identified in the VP (see section on electronic records
and electronic signatures).
• Adequate safeguards exist to protect software against loss, theft, and malicious
intent.
• Adequate safeguards exist to protect hardware and software against loss, theft,
and damage from environmental attack.
Installation Qualification
The purpose of the installation qualification (IQ) is to ensure that the computer
platform and the application software are correctly installed at their operational site.
The IQ typically includes (but is not limited to) the following items.
Operational Qualification
The operational qualification (OQ) demonstrates that the installed system works
properly in its operational environment under both normal and abnormal
operational conditions. The OQ may typically include (but is not limited to) the
following.
Performance Qualification
The purpose of the performance qualification (PQ) phase is to verify that the
complete system behaves as intended according to the user requirements under
normal or expected operational conditions. The PQ may include (but is not limited
to) the following.
Key Considerations
• General recognition of DQ as a key tool for the life sciences company to review
the design, development, and associated testing undertaken by the supplier
during system build, with a view to utilizing that work to support
qualification testing.
• Where two or more similar qualification protocols are produced for a project it
may be beneficial and more efficient to create a high level test strategy
document that contains all of the information which is common to all
qualification protocols in the validation project. Individual protocols will then
only contain that information that is specific to that validation stage.
• The protocols must demonstrate that the computerized system meets the
requirements of the user and can cope with the stresses that will be placed upon
it (e.g., continues to function reliably when the maximum predicted number of
users are connected, collects data at sufficient data transfer rates, etc.) through
documentation and testing.
• Where the system will be rolled out to additional departments, ensuring that
sufficient testing is performed to demonstrate that it meets the requirements of
those departments while not needlessly repeating testing already conducted.
• Ensure that the testing is constructed such that any potential errors that may
impact the use of the system are exposed and addressed.
• Ensure that the testing progresses as quickly and economically as possible by
designing protocols that test more than one function simultaneously where
possible.
• The protocols should be designed such that they can evolve and cope with
changes to new versions of the software or changes in the user interface.
• The protocols must provide verifiable evidence that tests were completed and
their outcome clearly stated with regard to the predetermined acceptance
criteria.
• Each qualification must be formally reported to maintain an auditable system of
documentation and to ensure an approved and controlled, fully documented
transition to subsequent life cycle phases. The report must clearly reference its
controlling protocol and should provide an overview of the results obtained
during the testing, together with details of all problems encountered, including
how the problems were resolved. If any problems are still outstanding, the report
must provide suitable justification for progressing to the next testing stage.
Approval of the report signifies the acceptance of the results and authorizes
progress to the next stage of the validation project.
• The status of the traceability matrix should be recorded as part of each
qualification summary report and documentation kept in a validation file.
• In the case of a computer system applied to a manufacturing process, with direct
relationships to plant equipment and the process itself, the VP should specify
whether separate or integrated qualification protocols and summary
qualification reports are to be prepared for IQ, OQ, and PQ. A similar situation
applies to laboratory systems where the software is directly associated with a
specific piece of equipment that must also be validated. The situation is further
complicated where several applications are interfaced together and rely upon
one another for successful operation.
• Conduct the appropriate level of calibration and provide documented calibration
records to support the validation program. A methodical approach for
conducting the calibration of control, monitoring and laboratory instrumentation
is required. Calibration is to be undertaken with test instruments precalibrated
against recognized national or international standards.
• The calibration periodicity should take into account the instrument supplier
recommendations and the robustness, sensitivity, and duty of the instrument.
Laboratory instrument calibrations may require running system suitability
samples during each analysis to check the suitability of the system for use. Once
a periodicity is established, the instrument should be added to a calibration
program for call-off dates to be determined. Initial calibration periodicity
should be reviewed against historical records. The current calibration status of
critical instruments should be known and verifiable. It is advisable to schedule
any recalibration to take place immediately after any scheduled maintenance of
the system (a minimal scheduling sketch follows this list).
• The scope of the protocols should reflect a risk-based approach addressed under
formal risk assessments.
• Align with the ISPE Baseline Guide, Commissioning and Qualification [12],
conducting qualification-level testing only where quality-related critical
parameters, data, and functions can be “directly impacted.”
• Control quality-related critical GxP data throughout supplier design,
development, and build.
• Supplier software and hardware records should be available for or referenced by
the DQ.
• Validation of self or remote calibration and test software.
• Qualification of network or fieldbus security and diagnostics.
• Performance monitoring during operational use can be viewed as an extension
of the PQ process, because for most large systems it is not possible to operate
and test or verify under all conditions. Consequently, there is a need to
continually monitor the system performance under differing loads, and to review
system self diagnostics and any self calibrations.
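The calibration call-off scheduling described above can be illustrated with a minimal sketch. The instruments, dates, and intervals below are hypothetical, and any real tool used for this purpose would itself need to be controlled and, where appropriate, validated.

    from datetime import date, timedelta

    # Minimal sketch, not a qualified tool: instrument names and intervals are
    # illustrative; periodicity would in practice be reviewed against
    # historical calibration records, as the text describes.

    calibration_program = [
        # (instrument, last_calibrated, interval_days)
        ("TT-101 temperature transmitter", date(2024, 1, 15), 180),
        ("PH-02 pH meter",                 date(2024, 5, 2),   90),
    ]

    def next_due(last_calibrated: date, interval_days: int) -> date:
        """Call-off date: last calibration plus the established periodicity."""
        return last_calibrated + timedelta(days=interval_days)

    today = date(2024, 8, 1)
    for instrument, last_cal, interval in calibration_program:
        due = next_due(last_cal, interval)
        status = "OVERDUE" if due < today else "current"
        print(f"{instrument}: due {due.isoformat()} ({status})")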
Validation Report
A validation report must be produced which provides a review of the results of the
validation life cycle activities performed and the documentation produced, as set out
in the respective VPs, and clearly defines the overall validation status of the
computerized system. The validation report must include information on any
deviations from the life cycle processes specified in the VP (e.g., additional activities,
activities which could not be done, and the reasons why); information on any actions
that remain open, their severity, who has responsibility for closure, and the expected
date for closure (a place for the actual date of closure must also be provided). The
computerized system validation report can also be incorporated into the overall
validation summary report for a manufacturing process or an item of equipment.
Information on the content of validation reports is provided in the GAMP Guide [7].
Key Considerations
Periodic Review
The ongoing evaluation phase is usually the longest phase of the validation life
cycle, covering the operational life of the computerized system. The earlier life cycle
phases provided a comprehensive validation documentation set that is the mainstay
of ongoing evaluation and the basis for regulatory inspection.
An important objective of ongoing evaluation is to uphold an auditable set of
validation documentation and ensure a controlled, fully documented record of any
activity that may affect the validation status of the computerized system throughout
its operational use.
Periodic reviews are an important element of ongoing evaluation and may be
undertaken as a scheduled or event-driven examination, e.g., a major upgrade to
hardware or software. The frequency of scheduled periodic reviews can vary
depending on the application, but reviews are generally undertaken at least
annually. Detailed document reviews may be required more frequently and these,
together with internal audits, will support the periodic review.
A periodic review will encompass the validation life cycle documentation and
records, the associated control and support procedures and records, and ensure that
these are established, i.e., approved, signed (including signatory function), dated,
current, in place, and in use.
Periodic review meetings are held to document the review process, the
documents reviewed, comments from attendees, and the collectively agreed course
of action. The periodic review summary report records the findings of the review
meeting and includes an action schedule, itemizing any documentation that requires
updating, and those responsible for completing the work. The progress of updates
should be monitored against agreed completion dates.
In the case of GLP, those study directors in charge of projects using the system,
together with management, must be notified of any problems found, and the
potential impact of those problems. The frequency of reviews must be in accordance
with current company policy. Information on the periodic review process is
provided in the GAMP Guide [7] and PIC/S Guide [14].
Following a successful periodic review, acceptance of the evaluation should be
clearly stated in an approved periodic review report.
Key Considerations
• Obtain the necessary documentation in order to perform the review. This should
be kept or referenced by a “validation file” which includes a detailed index to
control and locate all validation documentation. Typically the file may be
subdivided to align with life cycle stages or validation milestones.
• Training records must verify that persons affected by a control or support
procedure have been satisfactorily instructed in its use [14].
• A risk assessment should form part of each periodic review in order to verify
the findings of the previous analysis, and to provide risk information for
consideration when assessing the need for revalidation.
• It is vital that the validation status of the computerized system operation is not
compromised.
• Ensure that periodic reviews are carried out according to a defined schedule
(which is kept up to date) or are event-driven, triggered by significant changes
to the system or its use, or by events impacting the integrity of system data.
• Ensure that signed records of each inspection are maintained, showing the date
of that inspection, as required by GLP.
• It is likely that a regulatory inspector will want to see evidence that the reviews
are carried out.
• Typically, a periodic review will cover:
– GxP, validation, operation training records.
– Validation life cycle documents, records, and reports.
– Operational or maintenance procedures and records.
– Control and support procedures.
– Impact of regulatory inspection findings and changes in the regulations.
– Qualified resource availability.
– Risk assessment review.
– Health, safety, and environmental issues.
– Periodic review summary reports.
Computerized Systems Development
User Requirements Specification (URS)
A URS must be produced by, or on behalf of, the customer. The URS provides a clear,
unambiguous description of what the system must do. It must be written in sufficient
detail to enable prospective suppliers to provide a detailed quotation or tender for the
work and its subsequent use for the production of system life cycle documentation.
Information on the content of URS is provided in the GAMP Guide [7].
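Although the URS is a controlled document rather than code, a short sketch can illustrate the kind of structure that makes individual requirements traceable and verifiable. The priority scheme, field names, and example requirement below are assumptions for illustration only, chosen to reflect the considerations listed after this paragraph.

    from dataclasses import dataclass
    from enum import Enum

    # Illustrative only: the mandatory/desirable scheme and verification
    # methods mirror the key considerations, but the structure is an assumption.

    class Priority(Enum):
        MANDATORY = "mandatory"    # needed to uphold GxP or to validate the system
        DESIRABLE = "desirable"

    class Verification(Enum):
        TEST = "test"
        INSPECTION = "inspection"

    @dataclass
    class Requirement:
        req_id: str
        text: str                  # one clear, unambiguous requirement per statement
        priority: Priority
        gxp_critical: bool
        verification: Verification

    urs = [
        Requirement("URS-014",
                    "The system shall record the identity of the user and a "
                    "date/time stamp for every change to batch disposition data.",
                    Priority.MANDATORY, True, Verification.TEST),
    ]

Structuring requirements this way also makes it straightforward to carry each requirement identifier forward into the design specifications and qualification tests via a traceability matrix.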
Key Considerations
• The life sciences company must ensure that sufficient and accurate information
and quality criteria are available at an early stage.
• The URS is the base document (or in some cases, set of documents) for
developing and testing the computer system, and it needs to provide clearly
defined requirements that can be verified by means of inspection or testing.
• The required system functions should identify the features that must be satisfied
and those that are desirable. Features that are necessary to uphold GxP and to
validate a system should always be considered as firm requirements, not
desirable features. The desired features should be prioritized as to their relative
importance and GxP significance.
• In addition to required functions, the specification should include nonfunctional
requirements, data requirements (e.g., access speed, data storage and retrieval),
interface requirements, environmental requirements, and system constraints
(e.g., compatibility, availability, procedural and maintenance constraints).
• Examine the feasibility of physically or technically segregating GxP data within
a system database.
• A matrix can be prepared to identify both user and supplier responsibilities for
the provision and management of documentation in the relevant validation
project phases.
• The URS structure should be such as to facilitate traceability of each requirement,
or group of related requirements, throughout the corresponding sections in the
succeeding design specification documentation. In the case of customized and
custom-built systems this helps ensure design decisions are auditable back to
source requirements. This will enable easy identification and audit of any design
changes and technical compliance issues (along with associated cost issues).
Traceability should also be carried forward to the qualification test procedures
where it will link each test and the qualification acceptance criteria directly to the
respective requirement. This can readily be achieved by employment of a
“traceability matrix” that will identify the corresponding elements of the life
cycle documents to ensure that all stated requirements are being provided and
tested or verified throughout the system life cycle.
• The URS must contain clear, concise and unambiguous statements, and must be
written in nontechnical language that can be understood by both the user and the
supplier. Each statement should, where possible, only contain one requirement.
• To aid understanding of more complex requirements it may be helpful to include
an example. Diagrams, simple flow charts, and more complex sequential flow
charts are also beneficial for clarifying requirements and processing functions.
• The URS must be formally reviewed by all interested parties prior to issue for
inquiry, and must be reviewed (and revised as required) during the project to
reflect the system being purchased.
• Capture the users’ requirements and express them in a suitable form that can be
understood by both the users and supplier.
• Details of the quality-related critical data that must be “handled” and controlled
by the computerized system are fundamental to upholding GxP compliance. The
integrity and security of the quality-critical data are the prime objective of
computer system validation.
• The URS should identify all inputs and outputs to the computerized system
(whether automatic or manual). These should be individually evaluated for their
criticality to the operating or business process.
• Ensure that the user requirements properly match current GxP stipulations and
current and future business needs.
Supplier Audit
A supplier audit is usually performed by, or on behalf of, the customer on suppliers
of GxP critical or business critical computerized systems. The need for performing
a supplier audit should be specified in the respective validation plan. Information
on conducting a supplier audit and the areas to be covered are provided in the
GAMP Guide [7].
A supplier audit can be used to provide objective evidence that a quality
management system is in place and is followed. The supplier audit may be used as
a preevaluation of a number of potential suppliers.
Depending on the criticality of the system and the information available about the
vendor, different levels of assessment methods may be applied.
Key Considerations
Supplier Quality Management System
All reasonable steps must be taken to ensure that the computerized system software
will be produced in accordance with a quality management system [8 §5, 14].
Key Considerations
• Internal audits by the supplier, and any external audits required by the life
sciences company, should be scheduled to coincide with the phase activities,
and their scope defined.
• Inclusion of a design and development activity schedule, complete with
resource allocations, will document the structured approach to be used and will
allow progress of key activities to be monitored. The schedule will also assist in
identifying problem areas and identify tasks that require input from the life
sciences company. Typically, the schedule will identify for each activity the
procedures to be used and the acceptance criteria. Provision can be made for
supplier and customer signatures against each activity to provide a record, and
thus control, of the development phase of the validation life cycle in support of
design qualification.
• Ensure that claims by suppliers that they use GAMP as their quality management
system are verified.
• Ensure that supplier audits are conducted and evaluated prior to the purchase of
the computerized system.
• Ensure that documented design and code reviews are conducted by the supplier
during system design and development.
Requirements Traceability
Key Considerations
• If the RTM is divided up and included in each individual design and test
document, then it should provide clause-level traceability from each document
to the preceding document used to generate it. This will enable design reviews
to take place (a minimal sketch of such a matrix follows this list).
• The RTM is a key support document for reference and consideration in the
validation report.
• A number of software solutions are available for maintaining traceability
information and may be particularly useful for larger, more complex systems,
where a large number of requirements must be maintained.
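A minimal sketch of a traceability matrix held as simple data follows. The document and clause identifiers are hypothetical; in practice the matrix is usually maintained in a controlled spreadsheet or a requirements-management tool.

    # Illustrative sketch only: URS, FS, design, and test identifiers are invented.

    trace = [
        # (URS clause, FS clause, design spec clause, test/qualification reference)
        ("URS-014", "FS-3.2.1", "SDS-7.4", "OQ-021"),
        ("URS-015", "FS-3.2.2", "SDS-7.5", None),     # gap: no test yet
    ]

    def untested_requirements(matrix):
        """Return URS clauses that are not yet linked to a qualification test."""
        return [urs for urs, _fs, _ds, test in matrix if test is None]

    print(untested_requirements(trace))  # ['URS-015']

Reporting the status of such a matrix at each qualification stage is what allows the validation report to confirm that every stated requirement has been provided and tested or verified.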
System Development Life Cycle
Information on the system development life cycle and support processes is provided
in the GAMP Guide [7] and International Standards and Guidelines [24–26].
Key Considerations
• Ensure that the user requirements are adequately broken down in a systematic
and documented way to the level where coding can take place, and that the
design can be verified through adequate testing at each level.
• Implement formal, documented design, code, and configuration reviews.
• Adhere to life cycle and coding standards.
• Manage documentation and code under formal change control and configuration
management processes.
• Ensure that the supplier maintains awareness of, and produces systems that
comply with regulatory requirements.
• The influence of a risk-based approach on system design.
System Description
Key Considerations
• Maintain the system description up to date for complex and evolving systems.
• For information systems in particular, address the general reluctance to produce
data flows, network infrastructure diagrams, and reference installation
specifications, even though this information is invariably available.
Development Methodologies
Although the responsibility for validation remains with the life sciences company,
the supplier can play a key role in life cycle activities, and should contribute a full
set of auditable design and development documentation to support the validation.
The entire system development process should be carried out to written and
approved procedures and all development, testing, and verification activities
documented, reviewed, and approved. The life sciences user cannot afford to let this
key and usually intense phase become invisible.
For large or complex systems, the functional design specification on its own may
not enable the prospective user to fully understand how the system will be applied
and used. One method of overcoming this problem is to develop a system prototype.
Techniques such as prototyping and rapid application development (RAD) are
acceptable for use when developing computerized systems providing that the use of
these techniques is planned, controlled, and documented. These techniques must not
be used as a means of circumventing life cycle controls and documentation, and
must be implemented in line with logical life cycle stages [14]. Information on
system development methodologies is provided in the GAMP Guide [7].
Prototyping can provide the following benefits:
Key Considerations
• Although the responsibility for validation remains with the life sciences
company, the supplier will be involved in life cycle activities and must
contribute a full set of auditable design and development documentation to
support the validation.
• Computer system development and acceptance test procedures and records,
when documented in line with life sciences validation practice, can be used to
support qualification testing, and thus must be a prime focus for DQ.
• Align terminology, design and development, and the associated development
testing (whether for manufacturing automation, laboratory, or information
systems) with a life cycle that is recognizable to the regulatory authorities. This
in turn demonstrates an understanding of the validation life cycle.
• Prototypes must only be used to determine a design approach. They must not be
released as the final product and no data produced on a prototype system can be
used in support of a GxP study or manufacturing program.
• Development processes must be well controlled and regularly baselined. Each
baselined deliverable should provide a usable solution that can be formally
released.
• Align the “extended” system definition processes (e.g., conference room pilots)
and the planning of large or complex systems with recognized life cycle controls
and deliverables.
• New methodologies may require radical changes in approach.
• Evaluate the availability, use, and impact of development tools.
System Development Testing
The design specifications for software and hardware provide a definition of the
component parts and the system integration from which corresponding tests can be
developed. Tests for each requirement should be developed on completion of the
respective specification to help to ensure all matters are addressed. For large systems
test plans are usually developed soon after a design specification is approved in
order to capture the overall test requirements and any particular issues identified at
that time. The test plan will then evolve into a detailed test procedure that will
address the more detailed criteria that become apparent as development proceeds.
The test planning process must identify the necessary scope and extent of testing,
as well as the predetermined acceptance criteria. Tests and their acceptance criteria
must be defined and documented prior to execution. Evidence of testing must be
documented (e.g., individual tests completed, signed, and checked, and screen shots,
reports or other evidence generated wherever possible and appropriate). Each
individual test must have a pass or fail result.
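As an illustration of a documented test case with predetermined acceptance criteria and a recorded outcome, consider the following sketch. The identifiers, procedure, and criteria shown are hypothetical, and real test records would be executed and approved under the applicable SOPs.

    from dataclasses import dataclass
    from typing import Optional

    # Illustrative sketch of a test case record; field names are assumptions.

    @dataclass
    class TestCase:
        test_id: str
        design_clause: str          # traceability back to the design specification
        procedure: str
        acceptance_criteria: str    # predetermined, before execution
        actual_result: Optional[str] = None
        passed: Optional[bool] = None
        executed_by: Optional[str] = None

        def record_result(self, actual: str, passed: bool, tester: str) -> None:
            """Evidence of execution: result, outcome, and who performed the test."""
            self.actual_result, self.passed, self.executed_by = actual, passed, tester

    tc = TestCase("OQ-021", "SDS-7.4",
                  "Attempt to modify batch disposition data as an unauthorized user.",
                  "Access is denied and the attempt is written to the audit trail.")
    tc.record_result("Access denied; audit trail entry created.", True, "J. Smith")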
Test reporting must conclude on the overall outcome of testing, and summarize
how outstanding issues (including corrective actions to address test failures) are
managed. If the computerized system is replacing a manual one, the two must be
run in parallel for a time as a part of this testing and validation process [8 §7].
Testing may include software module, software integration, and package
configuration testing. Where the system is based upon custom-built hardware,
hardware acceptance testing will also be necessary.
Key Considerations
• New and evolving technologies may prove difficult to test conventionally. New
approaches to testing may be required.
• Evaluate the availability, use, and impact of automatic testing tools.
• Consider rationalization of testing stage terminology, e.g., decide whether to
adopt IQ, OQ, PQ “speak.”
• Determine how much testing is enough during the system development. This
may include a combination of functional and structural testing.
• Demonstrate the reliability of the system. It may also be necessary to perform
advanced software testing using statistical testing. Using this technique, specific
areas of concern are targeted and large amounts of data generated, thereby
increasing the probability of encountering rare unanticipated operating
conditions.
Data Migration
For new systems replacing existing ones, data migration must preserve the integrity
and security of the original data. Processes and methods used to load data (manual
and automatic) must be defined and validated with supporting documentation
before they are accepted for use [14].
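One common verification technique, comparing record counts and checksums of normalized records between the source and target systems, is sketched below. This is an illustrative fragment only and does not replace the documented, validated migration process the text requires.

    import hashlib
    import json

    # Minimal sketch of one verification approach; the example records are invented.

    def record_digest(record: dict) -> str:
        """Checksum of a record normalized to a canonical JSON form."""
        return hashlib.sha256(
            json.dumps(record, sort_keys=True, default=str).encode()
        ).hexdigest()

    def verify_migration(source_records, target_records):
        """Compare record counts and per-record checksums between systems."""
        if len(source_records) != len(target_records):
            return False, "record count mismatch"
        src = sorted(record_digest(r) for r in source_records)
        tgt = sorted(record_digest(r) for r in target_records)
        return (src == tgt), "checksums compared"

    ok, detail = verify_migration(
        [{"batch": "B001", "assay": 99.2}],
        [{"batch": "B001", "assay": 99.2}],
    )
    print(ok, detail)  # True checksums compared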
Key Considerations
System Documentation
The user documentation must include, as a minimum, those items of the supplier’s
system development life cycle documentation necessary to provide evidence of
validation in order to support a regulatory inspection (e.g., system specification and
description, diagrams, test specifications or scripts, test results). The extent of
the supplier’s system development life cycle documentation to be provided to the
customer will depend on the complexity, product and business criticality of the
computerized system and will be defined in the URS and any contract agreements.
Where the supplier’s documentation will be used to support the validated state of the
system, it must have been properly assessed and accepted by the user organization.
Such review and acceptance must be formally documented. Information on system
documentation for the user is provided in the GAMP Guide [7].
Key Considerations
• Maintain the user and support documentation set up to date following changes
to the system.
Computerized Systems Operation
System Location
Key Considerations
System Changes
Key Considerations
• The extent or impact of each change must be fully investigated and documented
(a minimal change-record sketch follows this list). Where changes are routine
(e.g., the automatic update of a parameter in the software following the routine
calibration of an analytical instrument) the change control SOP may allow that
change to occur without the need for investigation, providing the change is
appropriately logged.
• Site changes by the supplier must follow the same procedures or processes as
those used by the customer.
• System documentation must be updated where applicable.
• All items impacted by the change must be examined to ensure that the change
audit trail and change history is complete.
• Software configuration management records must be updated.
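A minimal sketch of a change record that supports the points above is shown below. The field names and example entries are assumptions, and a real change control system would be governed by the applicable SOPs.

    from dataclasses import dataclass, field
    from datetime import datetime
    from typing import List

    # Hypothetical change record; fields mirror the considerations above
    # (impact assessment, documentation and configuration updates, history).

    @dataclass
    class ChangeRecord:
        change_id: str
        description: str
        impact_assessment: str
        routine: bool = False                 # e.g. automatic calibration updates
        documents_updated: List[str] = field(default_factory=list)
        config_items_updated: List[str] = field(default_factory=list)
        history: List[str] = field(default_factory=list)

        def log(self, event: str) -> None:
            """Append a timestamped entry so the change history stays complete."""
            self.history.append(
                f"{datetime.now().isoformat(timespec='seconds')} {event}")

    cr = ChangeRecord("CC-2024-017",
                      "Upgrade LIMS client from v4.1 to v4.2",
                      "GxP impact: reporting module affected; OQ subset required")
    cr.log("Change approved by QA")
    cr.documents_updated.append("System Description v3.1")
    cr.config_items_updated.append("LIMS client build 4.2.0")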
Error and Problem Reporting
Key Considerations
• The corrective action process should be linked to the change management process.
• The root cause of any errors should be established and a solution found which
will prevent a recurrence.
• Corrective action plans must encompass all documentation (specifications,
records, procedures, report updates).
Key Considerations
Key Considerations
• Alternative arrangements must be based on the risk of failure and business impact.
• Security and integrity of both quality and business critical data.
• Appropriate methodology transfer to an alternative system, equipment, or
personnel to ensure consistency and accuracy of data.
The arrangements for computerized system maintenance and support by the supplier
or by another support agency (e.g., a company, organization, group or person) which
is either internal or external to the life science company, must be formally
documented. This document must take the form of an agreement which clearly
defines the scope of the support service, response measures and the responsibilities
of the supplier or support agency [8 §18, 14]. Where support is split between
different companies, groups or people, the precise responsibilities of each must be
fully documented. If the supplier or support agency is external to the life science
company the appropriate rules concerning contract manufacture and analysis [A1:
Part 4, Chapter 7] will apply. These rules will also be used as guidance for
determining an appropriate support agreement when the supplier or support agency
is internal to the life science company.
Key Considerations
• Maintenance and support activities should link into customer procedures and
processes.
Key Considerations
Future Challenges
Inspection Readiness
Key Considerations
The FDA issued the 21 CFR Part 11 Electronic Records; Electronic Signatures,
Final Rule in March 1997 [16] and this was followed over the next few years by a
Compliance Policy Guide on Enforcement Policy (CPG 7153 – 21 CFR Part 11;
Electronic Records; Electronic Signatures, July 1999), and a series of draft Part 11
guidance documents covering Validation (August 2001), Glossary of Terms (August
2001), Time Stamps (February 2002), Maintenance of Electronic Records (July
2002), and Electronic Copies of Electronic Records (August 2002) which
represented the agency’s current thinking on each topic. However, in February 2003
the FDA issued an announcement in which it withdrew the CPG and all of the
guidance documents and announced that Part 11 was to be reexamined.
FDA Guidance for Industry, Part 11, ERES – Scope and Application
The next step is for the FDA to reexamine the Part 11 rulemaking process as part
of the cGMP initiative Pharmaceutical CGMPs for the 21st Century: A Risk-Based
Approach; A Science and Risk-Based Approach to Quality Product Regulation
Incorporating an Integrated Quality Systems Approach.*
In response to this publication, the life sciences industry must recognize, and act
on, the following key points.
* www.fda.gov/oc/guidance/gmp.html
• FDA will enforce all predicate rule record and record keeping requirements.
• Records must still be maintained or submitted in accordance with the
underlying predicate rules, and the Agency can take regulatory action for non-
compliance with such predicate rules.
• Part 11 remains in effect.
• Part 11 is intended to permit the widest possible use of electronic technology,
compatible with the FDA's responsibility to protect public health.
• The predicate rules define:
– What records must be maintained.
– The contents of the record.
– Whether signatures are required.
– How long records must be retained.
• Records and signatures that meet the criteria:
– Will be considered equivalent to paper records and signatures.
– May be used in lieu of paper, unless otherwise stated in other regulations.
• Understand the implications of the final guidance on the scope and application
of Part 11.
• Establish clear policies and procedures.
• Implement a reasonable risk-based approach.
– The risk assessment approach may vary from case to case. Methodologies
ranging from a simple impact assessment to a full failure modes and effects
analysis (FMEA) or impact and criticality analysis may be used as appropriate
[7, Appendix M3, 12 §3 and Appendix 2, 13 §9].
– Establish priorities by focusing on the predicate rule requirements, the
impact and value of specific records, and risks to those records.
– Risk is a key decider in examining the need for specific controls for specific
records, i.e., controls commensurate with documented risk.
– Within user organizations different circumstances may call for varying
levels of controls for each record, i.e., there may not be a single level of
controls across the organization and covering all records.
– It should be remembered that whatever the assessed risk, Part 11 applies to
all Part 11 records.
• Apply appropriate controls.
• Understand their regulated processes and determine and document where they
have records and signatures and, based on predicate rules, whether specific
records are Part 11 records.
• Recognize that some records are directly identified by predicate rules. Where a
record is implied rather than specifically stated, the company should consider
whether the record is required to fulfill the predicate rule requirements or to
provide supporting evidence, and should document its decision and rationale.
Validation
Electronic Signatures
Audit Trails
Copies of Records
Retention of Records
Legacy Systems
• A computer system in use before August 20, 1997, i.e., when Part 11 came into
effect, that has met, and continues to meet, all applicable predicate rule require-
ments throughout the system’s operational life, is outside the scope of Part 11.
• There is documented evidence and justification that the system is fit for its
intended use and has an acceptable level of record security and data integrity.
• Any changes to the legacy system must be controlled and the changes assessed.
If the changes prevent the system from meeting predicate rules, Part 11 controls
must be applied to Part 11 records and signatures.
Key Considerations
In the context of this section, any text, graphics, data, audio, pictorial, or other
information represented in digital form that is used for regulatory submission or to
support a regulatory submission, or that is required by local laws and relevant
regulations, and is stored electronically, is considered to be an electronic record; and
the requirements of the regulatory agencies will apply [1, 8, 16]. The information
provided here should be used for guidance in the application of the appropriate
regulations for a “closed” system, but in all cases the requirements of the FDA [16]
and the associated life sciences company corporate interpretation [28] will take
precedence.
Validation Requirements
Key Considerations
• Recent surveys have shown that lack of validation is still the most prominent
finding by the regulatory authorities.
• Instilling a validation culture in both the life science companies and their
suppliers.
• Identification of the records to be generated to satisfy the appropriate "predicate
rules" and held within the system. In the case of larger control and information
systems, e.g., ERP, MES, LIMS, the instances of records required by the
predicate rules are to be determined across the software functionality.
System Access
The computerized system must only be used and data must only be entered or
altered by authorized persons [16 §11.10(d), 16 §11.10(g)]. Suitable methods for
deterring unauthorized access to the system and its data must be used, including
keys, pass cards, and personal codes, together with restricted access to networked
computers and terminals and to interfaces to other computer systems which can
provide access to the system. A procedure must be established for the issue,
cancellation, and alteration of authorization to enter and amend data, including the
changing of personal passwords. Unsuccessful attempts to access the system by
unauthorized persons must be recorded [8 §8, 16 §11.10(d), §11.10(g), §11.300(d)].
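
A minimal sketch of the access-control behaviour described above is given below (the user store, personal codes, fixed salt, and logging destination are all simplifications for illustration; a production system would use per-user salts and a managed credential store): only authorized persons are admitted, and every unsuccessful attempt is recorded for review.

import hashlib
import logging
from datetime import datetime, timezone

logging.basicConfig(level=logging.INFO)
security_log = logging.getLogger("system.access")

def _hash(code: str, salt: str = "demo-salt") -> str:
    """Hash of a personal code; a fixed salt is used only to keep the sketch short."""
    return hashlib.sha256((salt + code).encode("utf-8")).hexdigest()

# Illustrative store of authorized users: user ID -> hash of personal code.
AUTHORIZED_USERS = {"analyst01": _hash("S3cret!"), "qp01": _hash("Rel3ase!")}

def log_in(user_id: str, personal_code: str) -> bool:
    """Grant access only to authorized persons; record unsuccessful attempts."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    expected = AUTHORIZED_USERS.get(user_id)
    if expected is not None and expected == _hash(personal_code):
        security_log.info("%s successful log-in by %s", stamp, user_id)
        return True
    # Unsuccessful attempts to access the system must be recorded.
    security_log.warning("%s unsuccessful log-in attempt for %s", stamp, user_id)
    return False

log_in("analyst01", "S3cret!")   # authorized user
log_in("intruder", "guess")      # recorded as an unsuccessful attempt
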
Key Considerations
The computerized system must have built-in checks, where appropriate, for the
correct entry and processing of data, verification of the data source, and the correct
sequencing of steps and events [8 §6, 16 §11.10(f), §11.10(h)]. Additional checks
must be carried out on the accuracy of the records generated when critical data are
entered manually (e.g., the weight and batch number of an ingredient during
dispensing). This check can be carried out by a second operator or by a validated
electronic method [8 §9].
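
For critical manual entries such as dispensing weights, the additional accuracy check can be illustrated as below (a minimal sketch; the tolerance, field names, and the rule that the verifier must differ from the original operator are assumptions for the example): the entry is accepted only once a second operator, or a validated electronic comparison, confirms it.

def confirm_critical_entry(entered_weight_kg: float,
                           verifier_weight_kg: float,
                           operator_id: str,
                           verifier_id: str,
                           tolerance_kg: float = 0.001) -> bool:
    """Accept a critical manual entry only when an independent second check
    (second operator or validated electronic method) agrees within tolerance."""
    if verifier_id == operator_id:
        raise ValueError("second check must be independent of the original operator")
    return abs(entered_weight_kg - verifier_weight_kg) <= tolerance_kg

# Example: dispensing weight entered by one operator and confirmed by another.
accepted = confirm_critical_entry(12.500, 12.500, operator_id="op01", verifier_id="op02")
print("entry accepted" if accepted else "entry rejected, investigate discrepancy")
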
Key Considerations
• Data verification.
• Transfer of electronic approval (signatures) from one system to another.
Audit Trails
The computerized system should generate secure, time-stamped audit trails which
record the identity of operators entering, confirming, altering or deleting critical
electronic data and the action taken [8 §10, 16 §11.10(e)]. Any alteration made to a
critical data entry must be authorized and recorded with the reason for the alteration
and the date and time the alteration was made [8 §10]. A complete record of all entries
and amendments made by the operator to electronic records should be generated and
retained by the system throughout the records retention time [8 §10, 16 §11.10(e)].
Where this is not technologically possible, appropriate alternative measures must be
implemented to ensure the integrity and accuracy of the electronic record. Changes
to records must not obscure previously recorded information [16 §11.10(e)].
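
A minimal sketch of a secure, time-stamped audit trail along the lines described above is given below (the storage approach and field names are illustrative only): each change to a critical entry records who acted, when, what changed, and why, and previously recorded values are appended to rather than overwritten or obscured.

from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class AuditEntry:
    """One immutable, time-stamped audit-trail entry."""
    timestamp_utc: str
    operator_id: str
    action: str           # e.g., "create", "alter", "delete"
    field: str
    old_value: str
    new_value: str
    reason: str

class AuditTrail:
    """Append-only audit trail; previously recorded information is never changed."""
    def __init__(self) -> None:
        self._entries: list[AuditEntry] = []

    def record(self, operator_id: str, action: str, field: str,
               old_value: str, new_value: str, reason: str) -> None:
        stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
        self._entries.append(AuditEntry(stamp, operator_id, action,
                                        field, old_value, new_value, reason))

    def entries(self) -> tuple[AuditEntry, ...]:
        return tuple(self._entries)   # read-only view for review and retention

trail = AuditTrail()
trail.record("analyst01", "alter", "assay_result", "99.2", "99.3",
             reason="transcription error corrected per deviation record")
for entry in trail.entries():
    print(entry)
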
Key Considerations
Copies of Data
Key Considerations
• This requirement has now been "toned down" by the FDA. However, the FDA
still requires "useful and reasonable access" to records, and recommends using
established automated conversion and export methods to produce copies in
common data formats (a minimal export sketch follows this list). The copy
should preserve the meaning and content of the record and, where technically
feasible, should provide sorting and trending facilities.
• Objectively assessing potential data formats to ensure they are appropriate for
the storage of data and are acceptable to the relevant regulatory authorities.
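
As a minimal illustration of producing copies in a common data format (the record structure and file name are assumptions for the example), the sketch below exports electronic records to CSV so that the meaning and content of each record are preserved and the copy can be sorted or trended in standard tools.

import csv

records = [
    {"batch": "B001", "test": "assay", "result": 99.2, "analyst": "analyst01"},
    {"batch": "B002", "test": "assay", "result": 98.7, "analyst": "analyst02"},
]

def export_copies(records: list[dict], path: str = "record_copies.csv") -> None:
    """Write an accurate and complete copy of the records in a common format."""
    fieldnames = list(records[0].keys())
    with open(path, "w", newline="", encoding="utf-8") as handle:
        writer = csv.DictWriter(handle, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(records)

export_copies(records)
print("copies written to record_copies.csv")
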
Data Security
Key Considerations
• A data backup, archiving, and restoration process appropriate for the
computerized system is required.
• Where possible, data should have a checksum or cyclic redundancy check
associated with it to confirm integrity.
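
The checksum consideration above can be illustrated with the Python standard library as below (the data and workflow are illustrative): the same digest is computed when the backup or archive is taken and again at restoration, and any difference indicates that the stored data have lost integrity.

import hashlib
import zlib

def sha256_digest(data: bytes) -> str:
    """Strong checksum suitable for confirming archive integrity."""
    return hashlib.sha256(data).hexdigest()

def crc32_check(data: bytes) -> int:
    """Lightweight cyclic redundancy check (CRC-32) for quick verification."""
    return zlib.crc32(data)

original = b"batch B001; assay 99.2; released by qp01"
stored_digest = sha256_digest(original)      # recorded when the backup is taken

restored = original                          # data returned by the restoration process
assert sha256_digest(restored) == stored_digest, "restored data failed integrity check"
print("integrity confirmed:", stored_digest[:16], "... CRC32:", crc32_check(restored))
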
Computerized systems used to release batches for sale or supply must only allow a
qualified person to perform the release. The qualified person performing the release
must be clearly identified and recorded by the system [8 §9, 16 §11.10(d), §11.10(j)].
Key Considerations
Documentation Control
Appropriate controls must be provided for the distribution, use, and access of
documentation for system operation and maintenance. Change control processes
must ensure that change history (audit trail) information is maintained for the
development and alteration of system documentation [16 §11.10(k1, k2)].
Key Considerations
Open Systems
In addition to the requirements described here for closed systems, open systems
must have additional measures, such as document encryption and the use of
appropriate digital signature standards, to ensure the authenticity, integrity, and
confidentiality of electronic records [16 §11.30].
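
The additional measures for open systems can be illustrated with a message-authentication sketch such as the one below (a shared secret and standard-library HMAC are used purely for illustration; a production open system would typically rely on established encryption and digital-signature standards, e.g., TLS and PKI-based signatures, for confidentiality and non-repudiation): the recipient recomputes the authentication code and rejects any record whose authenticity or integrity cannot be confirmed.

import hmac
import hashlib

SHARED_SECRET = b"illustrative-secret-managed-outside-this-sketch"

def protect(record: bytes) -> bytes:
    """Attach an HMAC so the recipient can confirm authenticity and integrity."""
    tag = hmac.new(SHARED_SECRET, record, hashlib.sha256).hexdigest().encode("ascii")
    return record + b"||" + tag

def verify(message: bytes) -> bytes:
    """Reject the record if the authentication code does not match."""
    record, sep, tag = message.rpartition(b"||")
    expected = hmac.new(SHARED_SECRET, record, hashlib.sha256).hexdigest().encode("ascii")
    if sep != b"||" or not hmac.compare_digest(tag, expected):
        raise ValueError("record failed authenticity/integrity check")
    return record

sent = protect(b"batch B001 released; signed electronically by qp01")
print(verify(sent))            # returns the original record when the check passes
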
Key Considerations
In the context of this section, any legally admissible electronic signing applied by
an individual to an electronic record that is used for regulatory submission or to
support a regulatory submission, or that is required by local laws and relevant
regulations, is considered to be an electronic signature; and the requirements of the
regulatory agencies will apply [1, 7, 16]. The information provided below should be
used for guidance in the application of the appropriate regulations, but in all cases
the requirements of the FDA [16] and the associated life sciences company
corporate interpretation [28] will take precedence.
The appropriate requirements for electronic records must also apply to electronic
signatures [16 §11.50]. Electronic signatures, and handwritten signatures applied to
electronic records, must be permanently and unambiguously linked to their
corresponding electronic records [16 §11.70].
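
One way to make the link between a signature and its record permanent and unambiguous is to bind the signature manifest to a digest of the record content, as in the minimal sketch below (the field names and record structure are illustrative): if the record is altered after signing, verification of the link fails.

import hashlib
import json
from datetime import datetime, timezone

def record_digest(record: dict) -> str:
    """Digest of the record content; any later alteration changes this value."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode("utf-8")).hexdigest()

def sign_record(record: dict, signer_id: str, meaning: str) -> dict:
    """Create a signature manifest permanently linked to this record content."""
    return {
        "signer_id": signer_id,
        "meaning": meaning,                       # e.g., review, approval, release
        "signed_at_utc": datetime.now(timezone.utc).isoformat(timespec="seconds"),
        "record_digest": record_digest(record),
    }

def link_intact(record: dict, manifest: dict) -> bool:
    """Confirm the signature still corresponds to the signed record content."""
    return manifest["record_digest"] == record_digest(record)

batch_record = {"batch": "B001", "disposition": "released"}
manifest = sign_record(batch_record, signer_id="qp01", meaning="release approval")

print(link_intact(batch_record, manifest))        # True: the link holds
batch_record["disposition"] = "rejected"          # record altered after signing
print(link_intact(batch_record, manifest))        # False: link broken, investigate
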
Key Considerations
• The linking of the user name or password combination to specific records and
events.
• Technology used for applying handwritten signatures to electronic records e.g.,
validation of pattern recognition software to ensure correct signature is applied.
Accountability
Business areas must ensure that individuals understand that they are accountable
and responsible for actions initiated under their electronic signatures, just as they
would be for handwritten signatures [16 §11.10(j), §11.100(c)]. This responsibility
and accountability have been certified by the pharmaceutical company in a letter
sent to the U.S. FDA [16 §11.100(c, c1), 20].
Key Considerations
Key Considerations
• The operator should be made aware of what is signed and the implication of that
signing (e.g., review, approval, release, etc.).
Security – Personnel
Key Considerations
Security – Equipment
The initial and periodic testing of any identification devices must be carried out and
documented [16 §11.300(e)]. Identification codes and passwords must be
periodically checked, recalled or changed, and their unauthorized use recorded and
reported to the appropriate company authorities [16 §11.300(b, d)]. Arrangements
must be made for the issue of replacement identification components which have
been lost, stolen, or otherwise compromised, and the generation of appropriate
documentation [16 §11.300(c)].
Key Considerations
Key Considerations
PERSONNEL
Internal and external personnel who specify, develop (design, implement and test),
install, validate, operate, manage, support, and decommission computerized
systems must have the education, training, and experience commensurate with the
tasks concerned. Training must be documented [8 §1, 16 §11.10(i)] [B8].
Key Considerations
• Use personnel who are competent to perform the work, with competence
supported by appropriate documentary evidence.
• Inspect education certificates for new personnel where necessary.
• Periodic reviews of personnel competence supported by the identification of
training needs and a schedule for their implementation, and any necessary re-
training.
• Multidisciplinary teams may be required for certain types of computerized
systems.
The validation of a computerized system and its assessment for compliance with the
appropriate GxP regulations and electronic records and signatures regulations must
be performed by personnel who are competent in the areas for which they are
responsible. These include the process, system or technology, application,
appropriate regulatory requirements, and appropriate life science company local and
corporate requirements [14]. This section identifies the key job roles and
responsibilities relating to the validation of computerized systems, for ensuring that
they comply with the appropriate GxP regulations, including the regulations relating
to electronic records and electronic signatures. The job roles and responsibilities
will be governed by the size and type of computerized system and will be described
in detail in the validation master plan (or VP), and if appropriate, any associated
project documentation.
Executive Sponsor
Commits the quality management group to providing the system owner with the
procedures and agreed resources to ensure all quality-related activities are
satisfactorily conducted and documented to meet the respective regulations, and to
applying all relevant quality system procedures to ensure and monitor the computer
registry, document control, change control, training programs or records, and
internal audits. Also, to support the system owner during regulatory inspections.
Senior IT Management
Commits the candidate site or facility to providing or verifying the data required for
records under “predicate rules,” enabling the system owner to approve system
functionality and any change requirements, supporting the system owner with the
procedures and agreed resources to ensure that all quality-related data and records
are derived from validated computer systems, controlling life cycle validation
programs during implementation and throughout the operational life of the GxP
systems, and conducting GxP validation and operating training programs for study
or manufacturing and support personnel.
Commits the site/facility to identifying the applicable GxP regulations and the data
required for records under the “predicate rules” applicable to the computerized
system. Also, provides timely resource for controlling and monitoring the life cycle
validation program throughout implementation and operational use of a
computerized system, and resources to fulfill ongoing training programs for GxP,
validation and system operation.
Quality Assurance Subject Matter Expert
Responsible for reviewing and approving validation life cycle documentation, high-
level computerized system life cycle documentation, and system change control
documentation to ensure that they comply with accepted industry computer systems
validation practice, the appropriate GxP regulations, including the regulations
relating to electronic records and electronic signatures, and that the activities and
document sign-offs have been carried out by trained and authorized personnel. The
quality assurance subject matter expert is also responsible for:
• Providing guidance to the life science company and associated groups on the
regulatory requirements and industry guidelines for computerized systems
validation and electronic records and signatures.
• Performing quality and technical audits on internal and external suppliers of
computerized systems.
• The production of the life science company computer systems validation
policies and SOPs.
System Owner
Responsible for ensuring that the computerized system is validated and that it meets
the appropriate regulatory requirements in terms of the technical solution provided
(e.g., equipment, hardware, software), the validation and computerized system life
cycle documentation generated, and supporting SOPs and any equipment protocols.
The system owner can also be the system administrator or the project manager.
System Administrator
Project Manager
Directly or indirectly responsible for the specification (URS), selection, design and
regulatory compliance review, acceptance testing and validation of the
computerized system, and the supply of its associated validation life cycle
documentation. The validation work or project management may be performed by
an internal life science company group or on their behalf by an external company.
System Supplier
SUMMARY
This chapter has explored the impact of the current GxP and supporting regulations
on computerized systems compliance and validation and the challenges imposed on
industry practices in order to ensure and streamline the compliance process.
As this chapter is being written the use of new technologies is again being
encouraged, with process analytical technology (PAT) recognized by regulatory
authorities as encompassing strategic new technologies that will impact the way the
industry operates. This will lead to significant changes in the regulation of product and
service quality and will demand compliance and validation enabling methodologies.
PAT encompasses technologies such as optronics, computer technology, and
methods of abstracting information from complex data matrices (chemometrics)
that will afford direct measurements within manufacturing processes (on-line and
at-line analysis) and inside chemical and physical processes (in-line analysis). PAT
will afford opportunities for the design of advanced real-time analysis and control
of manufacturing processes to assure product quality at the completion of the process.
Similarly, through the advent of combinatorial chemistry, the number of potential
drug candidates entering development is increasing. This has led to the use of high
throughput drug metabolism, pharmacokinetic and drug fate analyses and the need
for improved and “real time” toxicology and safety assessment techniques. This has
resulted in a large increase in data volume and increased employment of advanced
data processing and data deconvolution techniques.
These new technologies, in turn, provide more in-depth process knowledge from
development to manufacturing and require reliable tools and systems for non-
invasive (in some cases) and rapid measurements (e.g., spectroscopic techniques
such as liquid chromatography–mass spectrometry/Sciex, near infrared, infrared,
Raman, fluorescence, UV–Vis absorption, and advanced nuclear magnetic resonance
and magnetic resonance imaging).
The new technologies that will surface in the coming years will be well served by
the FDA's risk-based approach and evolving industry guidance and practices, so as
to ensure that the right level of controls and validation is in place to achieve and
maintain regulatory compliance, and hence safeguard product and service quality attributes.
DEFINITIONS
REFERENCES
The documents identified below aim to establish the controls that will ensure that all
appropriate computerized systems are validated and, where applicable, are compliant
with the current regulations relating to electronic records and electronic signatures.
The compliance of computerized systems will be assessed using the regulations,
directives, guidelines, and life science company documentation listed below.
Figure 1.5 Computerized Systems Validation Lifecycle Activities and Documentation
[Flowchart: the figure maps lifecycle activities and documentation — user requirements specification, identification of critical process parameters, supplier assessment and selection, tender award and design agreement, the system development phase (customer design qualification/review, C&I procurement, software module coding/configuration, C&I inspection/calibration, computer hardware testing, software module testing, software module integration testing, and the supplier's system integration testing), operational qualification, performance qualification, the validation summary report, hand-over to production, and the operational phase (change management, periodic review, maintenance, and system decommissioning/retirement) — against customer and supplier responsibilities, with brackets indicating possible involvement of the other party.]
The following SOPs are intended to highlight key subject areas. These would need
to be supplemented by lower level SOPs, in some cases, to cover specific topics.
Further Reading