
GAMP Good Practice Guide for GxP Computerized Lab Systems

Mark Newton & Paul Smith
Webinar
Jun 25, 2019
Today’s Speakers

Mark E Newton
Heartland QA

Mark is an independent consultant in laboratory informatics and data, data integrity, validation of computer systems/spreadsheets, analytical instruments, and LIMS/ELN. He has 30+ years of pharmaceutical experience in QC labs, computer systems validation, and lab informatics. Mark co-led Eli Lilly’s data integrity remediation program for QC labs worldwide in 2012, and consulted with and audited several Lilly sites preparing for data-integrity-focused inspections. He is a co-leader of the GAMP Data Integrity Special Interest Group and Chair of the ISPE Global Documents Committee.

© Connecting Pharmaceutical Knowledge ISPE.org 2
Today’s Speakers

Paul A. Smith
Global Strategic Compliance Specialist
Agilent Technologies

Paul has a passion for laboratory compliance. He started his career as an infrared spectroscopist, recording and interpreting infrared and Raman spectra and performing chemometric modeling of FT-IR, Raman, and NIR data. He did his first software validation work in 1992, before moving into broader analytical chemistry and laboratory management roles. Overall, he spent 17 years in the pharmaceutical industry and has worked in laboratory consultancy roles for the last 17 years.

He has worked in pharmaceutical R&D, new product introduction, and quality assurance, and has focused on compliance of laboratory instruments and harmonization of this compliance, writing articles and white papers and contributing to GAMP Good Practice Guides.

In his current Agilent role, he monitors laboratory compliance change and non-compliance trends, sharing this information with customers and colleagues.


Overview

• Speakers Introduction
• Intent of the Good Practice Guide
• Scope (In/Out)
• Differences between v1 and v2 of GPG
Intent of GPG

The GAMP Good Practice Guide for GxP Compliant Laboratory Computerized Systems:
• Risk-based approach to managing computerized systems
• Holistic view of equipment and software
• Consistent with ASTM E-2500 and GAMP 5
• Consistent with USP <1058>
• Intended for systems in all GxP environments
• Authored and reviewed by industry and regulators

Goal: computerized lab systems that are fit for purpose, using a “right-sized” approach to validation where effort is matched to risk.
Scope of GPG

The GPG applies to computerized analytical systems that generate data.

Non-data equipment is out of scope (see USP <1058>):
• Shaker baths
• Centrifuges
• Chromatography

Multi-user software applications are out of scope (see GAMP 5):
• ELN/LIMS
• Lab Execution Systems
• Other multi-user systems
Differences: GPG v1 and v2

For those who remember the original GPG (2005):
• The lifecycle model was modified to ASTM E-2500, which was adopted with GAMP 5 (Concept, Project, Operate, Retire).
• Moved away from seven categories based on functionality (equipment type) to three levels based on data capabilities/complexity.
• Dropped non-data equipment, as it moved to USP <1058>.
The GAMP Approach

Systems Classifications (good/bad)
Symbiosis: USP <1058>, GPG, GAMP 5
GAMP – Good Practice Guide
A Risk-Based Approach to GxP Compliant Laboratory Computerized Systems, Second Edition (2012)

Table of Contents, main body (57 pages):
1. Introduction (5 pages)
2. Key Concepts (5 pages)
3. Life Cycle Approach (3 pages)
4. Life Cycle Phases (19 pages)
5. Quality Risk Management (6 pages)
6. Regulated Organization Activities (4 pages)
7. Supplier Relationships (3 pages)

Appendices (103 pages):
• General (12 pages): 8. Categories of Software; 9. System Description; 10. Data Integrity
• Systems & Examples (49 pages): 11. Simple Systems; 12. Medium Systems; 13. Complex Systems
• Considerations (32 pages): 14. System Interfacing Considerations; 15. Robotics Systems; 16. Defining Electronic Records and Raw Data; 17. Security Management for Lab Computerized Systems; 18. Retention, Archiving, and Migration
• Supplier (5 pages)
• 19. References; 20. Glossary

Key strength of the guide: lifecycle phases (Concept, Project, Operation, Retirement) applied across system levels (Simple, Medium, Complex).
Historical Dilemma 1

Qualify the instrument vs. validate the software: you can’t qualify without software control, and you can’t validate the software without the instrument. Often, software and hardware (the instrument) were considered independently!

Example of key principles:
• Must define user requirements (begins before purchase)
• Scalable
• Risk-based approach
• Implementation life cycle (extends beyond use into retirement)
• Leverage supplier information
• Define responsibilities

These are common principles between GAMP and USP <1058>.


Historical Dilemma 2

Diverse approaches, interpreted differently:
• 1987 – FDA Process Validation Guidelines (process validation guidance)
• 2002 – FDA Guidance for Industry: General Principles of Software Validation (medical device focus)

Industry needed guidance:
• GAMP 4 (2001)
• GAMP Good Practice Guide (2005)
• USP <1058> (2008): simple approach; a General Chapter numbered above 1000, and therefore optional; a regulatory source
• GAMP 5 (2008): structured development, consensus based; not a regulatory source

Which approach? (GAMP or USP <1058>)…


Life Cycle Approach

Evolution of USP <1058> and GAMP GPG:
• 2005 – GPG 1 (Good Practice Guide): Validation of Laboratory Computerized Systems
• 2008 – USP <1058>: Analytical Instrument Qualification
• 2010 – Data integrity focus
• 2012 – GPG 2 (Good Practice Guide): A Risk-Based Approach to GxP Compliant Laboratory Computerized Systems
• 2017 – New USP <1058>: Analytical Instrument Qualification
Stronger Alignment of GAMP and USP <1058>

A common evolution, from “fixed” categories and examples toward an integrated, risk-based approach:
• GPG 1 (2005), Validation of Laboratory Computerized Systems: “fixed” approach to categories (A to G, with examples); Computer Software Validation (CSV)
• GPG 2 (2012), A Risk-Based Approach to GxP Compliant Laboratory Computerized Systems: an integrated approach
• USP <1058> (2008), Analytical Instrument Qualification: “fixed” categories (A to C, with examples)
• New USP <1058> (2017): Analytical Instrument Qualification (AIQ)


Continuum Between <1058> and GAMP

The new USP <1058> (2017, Analytical Instrument Qualification) and GPG 2 (Risk-Based Approach to GxP Compliant Laboratory Computerized Systems) sit on a continuum of instrument/system complexity, sharing key concepts, regulatory guidance, and RDI.


USP <1058> Instrument Categories: A, B and C

USP <1058> 2008: control strategy of Observe, Calibrate, Qualify; the category is found by looking at examples:
• A: Vortex mixer, stirrer
• B: pH meter, balance
• C: HPLC, GC

USP <1058> 2017: the same control strategy (Observe, Calibrate, Qualify), but no examples. You have to determine which category (A, B or C) applies, by risk assessment and intended use: “…the same type of instrument can fit into one or more categories, depending on its intended use.”
GAMP Good Practice Guide – Instrument Complexity

A key strength of the GPG, 2nd Edition.

GPG Version 1 (Validation of Laboratory Computerized Systems) used “fixed” categories, with the original instrument concept in an appendix:
• A – Sonicator
• B – pH meter
• C – Keypad HPLC
• D – PC HPLC
• E – NMR
• F – Spreadsheet
• G – Bespoke

GPG 2 instead classifies based on features of the “system” and how the system is used, within a project lifecycle (Concept, Project, Operation, Retirement: good project management):
• Simple: pH meter, pipette, balance, PCR cyclers
• Medium: HPLC/GC, FT-IR
• Complex: CDS/HPLC
Poll Question #1

What guidance do you use for your laboratory computerized systems?
A. USP <1058>
B. GAMP GPG for Lab Systems
C. GAMP 5
D. Other
Vendor Evaluation

Importance to Data Integrity
Deciding the Best Approach
Responding to What is Discovered
Total Cost of Ownership
Vendor Evaluation

• The type of evaluation should reflect the risk (and/or cost) to the organization:
  • Onsite evaluation
  • Teleconference
  • Questionnaire/survey
  • None at all
• As systems move up the complexity curve, the type of evaluation can move up as well.
• Good evaluations (regardless of type) usually require multiple specialties:
  • Business
  • QA
  • IT
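The idea that the evaluation type moves up with system complexity can be sketched as a simple lookup. The mapping below is an invented illustration, not a prescription from the guide:

```python
# Hypothetical mapping of GPG system level -> vendor evaluation type.
# The levels (simple/medium/complex) come from the slides; the pairings
# are illustrative assumptions only.

EVALUATION_BY_LEVEL = {
    "simple":  "questionnaire/survey (or none at all)",
    "medium":  "questionnaire plus teleconference",
    "complex": "onsite evaluation with business, QA and IT specialists",
}

def evaluation_for(level: str) -> str:
    """Return a suggested evaluation type for a given system level."""
    return EVALUATION_BY_LEVEL[level.lower()]

print(evaluation_for("Complex"))
```

In practice the choice also depends on cost and risk to the organization, as the slide notes, so a real decision record would capture that rationale alongside the level.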
Vendor Evaluation

Assess the vendor’s development process and quality practices:
• Is there a development methodology?
• Do they have a QA function?
• Do they conduct formal, written tests?
• Do they have a complaint management process?
• How large is the installed user base for this item?
Vendor and Data Integrity

• Appendix 11, Supplier Documentation and Services, describes services and information that vendors can supply.
• Vendor designs can eliminate – or create – data integrity gaps in computerized systems. It is imperative to evaluate data integrity gaps prior to purchase:
  • Assessing enables an estimate of the mitigations and their cost to the organization (total cost of ownership)
  • Assessing permits a cost/benefit comparison of competing models on a like-vs-like basis
  • Assessing provides a tool for price negotiations, especially if gaps are found
Responding to Discoveries

• Why perform a vendor evaluation if it does not lead to action?
• Some possible outcomes:
  • If the vendor has tested some of the requirements, then accept vendor testing on those requirements
  • If the vendor has no evidence of testing, test all requirements (or all high/medium requirements)
• Software risks are usually higher than hardware risks, because software failures have low detectability
• When vendors outsource software development, the evaluation must extend to the software firm as well
Total Cost of Ownership

Total cost = purchase + project + 10-year use + retirement

• Better data integrity compliance = lower project and use costs
• Mitigating gaps adds to project and use costs (more SOPs/training required)
• Buy based on total cost, not purchase price
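The total-cost formula above is easy to make concrete. A minimal sketch, with entirely invented figures (only the purchase + project + use + retirement breakdown comes from the slide):

```python
# Hypothetical total-cost-of-ownership comparison. All amounts are made up;
# the point is that a cheaper purchase price can still lose on total cost.

def total_cost(purchase, project, annual_use, retirement, years=10):
    """Total cost = purchase + project + N-year use + retirement."""
    return purchase + project + annual_use * years + retirement

# Vendor A: cheaper to buy, but data integrity gaps raise project/use costs
# (extra SOPs and training).
vendor_a = total_cost(purchase=40_000, project=30_000,
                      annual_use=12_000, retirement=5_000)

# Vendor B: pricier up front; better compliance lowers project and use costs.
vendor_b = total_cost(purchase=55_000, project=20_000,
                      annual_use=8_000, retirement=5_000)

print(vendor_a)  # 195000
print(vendor_b)  # 160000
```

Here Vendor B costs more to purchase but less to own, which is exactly the slide's point: buy based on total cost, not purchase price.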
Special Considerations: Project (Development)

Requirements
Design/Configuration
Testing
Traceability Matrix
Requirements (Specifications)

• Requirements (or specifications) describe the characteristics or attributes that the equipment must possess to be fit for its intended use
• Requirements form the basis for all subsequent design and testing activities
• Requirements have an important role in the selection of computerized systems, and in configuring them for use
• For computerized lab systems, it is not necessary to write separate user/functional requirements; a combined set is sufficient
• Give emphasis to security, data integrity and audit trails

NOTE: Vendor specs from manuals, etc. are not business specifications!
Design/Configuration

• Configuration (and configuration control) can impact the complete record of testing; therefore, it must be well documented
  • Time clock, audit trails, calculation types, etc.
• For complex systems, configuration can be a challenge, as it requires both business and IT expertise. You often need to work with the vendor to understand the options
• CRITICAL: document WHY (the rationale) each setting is chosen, so future change proposals can be evaluated
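The rationale-per-setting idea above can be sketched as a simple structured record. The settings, values, and rationales below are invented illustrations, not recommendations from the guide:

```python
# Hypothetical configuration record: each setting carries the WHY, so a
# future change proposal can be checked against the original reasoning.

configuration = [
    {"setting": "audit_trail", "value": "enabled",
     "rationale": "Required to reconstruct the complete record of testing"},
    {"setting": "time_source", "value": "network (NTP)",
     "rationale": "Local clocks drift; timestamps must be trustworthy"},
    {"setting": "result_rounding", "value": "3 decimal places",
     "rationale": "Matches the registered analytical method"},
]

def rationale_for(setting_name):
    """Look up the recorded rationale for a setting, or None if absent."""
    for entry in configuration:
        if entry["setting"] == setting_name:
            return entry["rationale"]
    return None

print(rationale_for("time_source"))
```

Whether this lives in a spreadsheet, a configuration specification document, or a script, the design point is the same: the rationale travels with the setting.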
Instruments and Software

[Figure: an example V model alongside an example instrument life cycle]

• The V model is a good fit for software, but a poor fit for standard (“COTS”) instruments.
• The instrument life cycle is a good fit for instruments (“COTS”), but a poor fit for software.


Instrument Complexity and Traceability

Traceability of user requirements depends on the use of the system and the complexity of the user requirements.

Instrument testing:
• New software: trace matrix, which must be maintained
• Periodic testing (e.g. HPLC OQ ~ annual; PQ?)
• Traceability depends on the complexity of the URS: URS/OQ alignment, or a formal trace matrix (as part of the software, or for a complex URS, e.g. a QTOF)

Software testing:
• Software validation
• Requirements trace matrix
• Change control
• Risk assessment / re-validation
• Periodic review
• Re-validation?
Instrument Testing Considerations

See USP <1058>: URS → DQ → IQ → OQ → PQ, with a risk assessment (RA). DQ involves an approved provider; IQ, OQ and PQ are the responsibility of the user.

URS: User Requirement Specification
DQ: Design Qualification
RA: Risk Assessment
IQ: Installation Qualification
OQ: Operational Qualification
PQ: Performance Qualification
Traceability

• Traceability from requirements to code is not possible for COTS
• Instead, trace requirements to testing scripts
• Consider the advantages of tracing requirements to configuration decisions
• For most systems, a spreadsheet-style approach works well

Requirement → Testing: Design Verification (DV)
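The "spreadsheet-style" trace matrix can be sketched as a plain mapping from requirements to the test scripts (and, optionally, configuration decisions) that verify them. All requirement and test IDs below are invented examples:

```python
# Minimal trace matrix sketch: requirement -> verifying tests/configuration.
# IDs are hypothetical; a real matrix would live in a controlled spreadsheet.

trace_matrix = {
    "URS-001: Unique user accounts":   ["TS-010"],
    "URS-002: Audit trail enabled":    ["TS-011", "CFG-audit-trail"],
    "URS-003: Results calc to 3 d.p.": ["TS-012"],
    "URS-004: Data backed up nightly": [],   # gap: not yet covered
}

# Design verification check: flag any requirement with no linked evidence.
uncovered = [req for req, tests in trace_matrix.items() if not tests]
print(uncovered)  # ['URS-004: Data backed up nightly']
```

The useful property is the coverage check: any requirement that traces to nothing is an immediate, visible gap before testing begins.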


Special Considerations: Operation

Firmware Upgrades
Backup/Restore (Disaster/Continuity)
Archiving
System Retirement
Firmware Upgrades / Considerations

• Firmware is tested by the supplier
• Supplier testing may not always be available; a CDA (Confidential Disclosure Agreement) is an option
• Firmware: review the release notes
• Compatibility: especially between vendors
• Communication: needs to be tested
• Impact assessment: depends on the release notes
• Standardise where possible
• Coordinate instrument and software testing

Updating firmware is not a “like for like” change: use change control.
Backup/Restore (Disaster/Continuity)

• Backup (and restore) are critical to the preservation of electronic records
• Loss of data due to no backup is a common data integrity finding
• If the software install can be rebuilt from external media, then the focus is on the data
• Periodically challenge the restore process to verify it
• Continuity planning is best done at the laboratory level, rather than for single installs
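One concrete way to "challenge the restore process" is to restore a backup to a scratch location and confirm the restored files match the originals by checksum. A minimal sketch (the paths and the restore step itself are placeholders for whatever your backup tool provides):

```python
# Sketch of a restore-verification check: compare an original data tree
# against a restored copy, file by file, using SHA-256 digests.

import hashlib
from pathlib import Path

def checksums(root: Path) -> dict:
    """Map each file's path (relative to root) to its SHA-256 digest."""
    return {
        str(p.relative_to(root)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def verify_restore(original: Path, restored: Path) -> bool:
    """True only if the restored tree is file-for-file identical."""
    return checksums(original) == checksums(restored)
```

A checksum comparison catches both missing files and silently corrupted ones, which a simple "did the restore finish without errors" check would not.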
Archiving

• Secure, long-term storage of data/metadata until destruction
• Often required for small computerized systems:
  • Storage limits
  • File vulnerabilities on the local system (old systems/bad design)
• Often adds protection (few have write access)
• Not the same thing as backup files!
• Like retirement: assure that the complete record is archived
System Retirement

• Retirement focuses on data retention and system removal
• Often results in “boneyards”: old systems kept for “read only” access
• The better choice is data migration, if feasible
• Acknowledge that the value of audit trails will decrease over time
• Move audit trails and metadata together to preserve the complete record!
Efficiencies in Compliance

Multiple Installs of Identical Units
Common SOPs
Hardware and Software Re-Qualification
Common Processes

• Once the software has been tested, only an IQ is needed for additional installs
• IQ/OQ is always required for hardware
• Create single system administration, system maintenance, and backup/restore SOPs for all systems at the lab level (not per individual application)
• Plan business continuity at the lab (or even entire plant site) level
Instrument Performance Qualification – No Black/White

Key PQ thought: “How different is your use of the instrument from how the instrument performance is ‘routinely’ evaluated?”

OQ + PQ + point-of-use testing (System Suitability Testing, SST)
• OQ: tests user requirements
• PQ: evaluates your use / applications
• SST: does it work on the day? (SST is not a PQ!)

Justification and implementation:
• Simple: OQ + holistic PQ + SST. More work to justify (the holistic PQ must be representative of use).
• Complex: OQ + use-specific PQ + SST. Easy to justify (method/application-based PQ).


Poll Question #2

How do you manage Performance Qualification (PQ) testing for lab computerized systems?
A. Do nothing (no PQ)
B. Have vendor perform PQ
C. Annually perform PQ
D. PQ only when needed (e.g. major repair, upgrade)
Q&A

Contact Information

Mark Newton
Consultant, Heartland QA
Email: [email protected]
Phone: 317-372-6141

Paul Smith
Global Strategic Compliance Specialist, Agilent Technologies
Email: [email protected]
Phone: +44-7900135512

Upcoming Webinars
• 21 August 2019 – Polishing an Old Gem: Commissioning & Qualification Baseline Guide Update
• 24 September 2019 – Qualification of PAT Systems

Topic Ideas or Feedback? Send to [email protected]
