Automating HPLC and GC Analytical Method Validation
Patrick Lukulay and Richard Verseput
The chromatographic analytical method validation process involves a series of activities that are currently conducted in separate "technology islands" using tools that exist for each activity. This article describes a software program that provides an overarching automation technology for analytical method validation and brings together the individual activities under one integrated-technology platform that is adapted to multiple instruments and data systems.
Patrick Lukulay, PhD, is the Analytical R&D Group Leader at Pfizer Global R&D (Ann Arbor, MI). Richard Verseput is President of S-Matrix Corporation (Eureka, CA), tel. 707.441.0405, fax 707.441.0411, [email protected].
Method validation activities encompass the planning
and experimental work involved in verifying the fit-
ness of an analytical method for its intended use.
These activities often are captured in company stan-
dard operating procedure (SOP) documents that usually incor-
porate US Food and Drug Administration (FDA) and International
Conference on Harmonization (ICH) requirements and guid-
ances (1-3). Method validation SOP documents describe all as-
pects of the method validation work for each experiment type
(e.g., accuracy and linearity) within a framework of three gen-
eral execution sequence steps: experimental plan, instrumen-
tal procedures, and analysis and reporting of results. The indi-
vidual elements within these three general steps are detailed in
the following paragraphs.
Step 1: experimental plan. An experimental plan should spec-
ify analyte concentrations, instrument parameters, and environ-
mental parameters. The plan also should include the number of
levels per variable, the number of preparation replicates per sam-
ple, and the number of injections per preparation replicate. The
integration of standards, the inclusion of system suitability injec-
tions, and the acceptance criteria also are part of the plan.
Step 2: instrumental procedures. The instrumental procedures
step involves making the required transformations of the ex-
periment plan into the native file or data formats of the instru-
ment's controlling chromatography data system (CDS) soft-
ware (construction of sample sets and method sets or sequence
and methods files). In addition, the following are specified: the
number of injections (rows), the specific type of each injection
(e.g., sample or standard), and the required modifications to
the analytical method (robustness).
Step 3: analysis and reporting of results. This step includes analy-
sis calculations, report content and format, comparisons to ac-
ceptance criteria (FDA and ICH requirements), and graphs or
plots that should accompany the analysis.
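To make the three execution steps more concrete, the following minimal Python sketch shows one way an experimental plan (step 1) could be captured as a data structure and expanded into an injection sequence ready for translation into a CDS-native sample set (step 2). All class names and fields here are illustrative assumptions, not part of the software described in this article.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class ExperimentPlan:
    """Illustrative container for a step 1 experimental plan (hypothetical fields)."""
    analyte: str
    concentration_levels: List[float]   # e.g., % of nominal
    preps_per_level: int                # preparation replicates per sample
    injections_per_prep: int            # injections per preparation replicate
    suitability_injections: int         # system suitability check standards
    acceptance_criteria: dict = field(default_factory=dict)

def expand_to_sequence(plan: ExperimentPlan) -> List[dict]:
    """Expand the plan into injection rows (step 2), each tagged with a run type."""
    rows = [{"run_type": "suitability", "injection": i + 1}
            for i in range((plan.suitability_injections))]
    for level in plan.concentration_levels:
        for prep in range(1, plan.preps_per_level + 1):
            for inj in range(1, plan.injections_per_prep + 1):
                rows.append({"run_type": "sample", "level_pct": level,
                             "prep": prep, "injection": inj})
    return rows

# Example: a small accuracy/linearity-style plan
plan = ExperimentPlan(
    analyte="API-1",
    concentration_levels=[80.0, 100.0, 120.0],
    preps_per_level=3,
    injections_per_prep=2,
    suitability_injections=5,
    acceptance_criteria={"%bias": 2.0, "%RSD": 1.0},
)
print(len(expand_to_sequence(plan)), "injection rows")  # 5 + 3*3*2 = 23
```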
Method validation technology platforms
The execution steps in method validation activities generally
involve manual operations carried out on unconnected tech-
nology platforms. The method validation chemist works in what
are essentially isolated technology islands with manual opera-
tions providing the only bridges.
To illustrate, an SOP guidance often is an electronic docu-
ment in MS Word. The experimental plan (step 1) within the
SOP guidance must be transferred to the high performance liq-
uid chromatography (HPLC) or gas chromatography (GC) in-
strument for execution (step 2) by manually rekeying the ex-
periment into the instrument's controlling CDS software. In a
few cases, the statistical analysis of results (step 3a) can be con-
ducted within the CDS, but it is most often performed within
a separate statistical analysis software package or spreadsheet
program such as MS Excel. This process also requires manually
transferring the results data from the CDS to the analysis soft-
ware package. Reporting of results (Step 3b) is usually carried
out in MS Word and therefore requires the manual transfer of
all results, tables, and graphs from the separate statistical analy-
sis software package. The manual operations within the three
general execution sequence steps are described in the follow-
ing steps, and the isolated technology islands
are illustrated in Figure 1.
Step 1: experimental plan. The validation
plan is developed in MS Word, and the ex-
perimental design protocol is developed in
off-line DOE software.
Step 2: instrumental procedures. Sequences
or sample sets are manually built in the CDS
and the raw peak (x, y) data reduction cal-
culations are performed by the CDS (e.g.,
peak area and concentration).
Step 3a: statistical analysis. Calculated re-
sults are manually transferred from the CDS
to MS Excel. Statistical analysis usually is
carried out manually in MS Excel. Some
graphs are generated manually in MS Excel,
and some are obtained from the CDS.
Step 3b: reporting results. Reports are man-
ually constructed from template documents
in MS Word. Graphs and plots are manually
integrated into a report document.
Figure 1: Method validation: isolated technology islands. (Work was done in separate "technology islands": validation planning and protocol generation in MS Word and off-line DOE software; sequence generation, data acquisition, and data reduction in the CDS; statistical analysis and graphing in MS Excel; and final report generation in MS Word, with manual operations bridging each step.)
Figure 2: Integrating the technology islands. (Execution step 1: validation planning and protocol generation with workflow management and workflow templates, 21 CFR 11 compliant. Execution step 2: sequence generation, with experiments exported to the CDS in native formats for walk-away operation, rigorous DOE practice, and CDS independence, 21 CFR 11 compliant. Execution step 3a: auto-import of CDS results data, statistical analysis, and graphing that are statistically valid and defensible, 21 CFR 11 compliant. Execution step 3b: final report generation that meets FDA and ICH guidances, 21 CFR 11 compliant. Automated, file-less design and results data exchanges maintain 21 CFR 11 compliance across data acquisition and data reduction.)
Strategic Validation Technology Initiative program technology goals
The overall goal of the Strategic Validation Technology Initia-
tive (SVTI) program was to fully automate the analytical method
validation work, which required integrating the isolated tech-
nology islands. The SVTI team clearly understood that adopt-
ing the final software program into standard use required min-
imizing the drudge work. Therefore, successful technology
transfer hinged on the automation goal. The two most critical
and challenging technical elements of the automation effort
were automating the data exchange between the off-line design
of experiments (DOE) software and the CDS and making the
data exchange technology generic and adaptable to the targeted
data systems. Important target instrument platforms were con-
trolled by various instrument data systems. Each target CDS
had a different data architecture and different functionality
within its respective software development kit (SDK, a third-party
software development interface). Without a generalized tech-
nology, data exchange would be limited to a single CDS. SOPs,
therefore, would have to be manually adapted to instruments
controlled by different data systems. Also, the SVTI software
would not be able to automatically address instrument config-
uration differences to allow for the creation and dissemination
of workflow automation templates.
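A common way to make data exchange generic across data systems with different SDKs is an adapter (driver) layer, and the sketch below illustrates that general pattern in Python. It is not the SVTI implementation: the class and method names are assumptions used only to show how a generic exchange engine could delegate CDS-specific work to per-system drivers (Empower is one of the target data systems named later in this article, but the driver body here is a placeholder).

```python
from abc import ABC, abstractmethod

class CDSDriver(ABC):
    """Hypothetical adapter interface for one chromatography data system (CDS)."""

    @abstractmethod
    def export_sequence(self, design_rows: list) -> None:
        """Write the experiment design in the CDS's native sequence/sample-set format."""

    @abstractmethod
    def import_results(self) -> list:
        """Read reduced peak results (area, concentration, ...) back out of the CDS."""

class EmpowerDriver(CDSDriver):
    # Placeholder only; a real driver would call the vendor's SDK here.
    def export_sequence(self, design_rows: list) -> None:
        print(f"Building sample set with {len(design_rows)} injections")

    def import_results(self) -> list:
        return []  # reduced results rows would be returned here

class ExchangeEngine:
    """Generic engine: all CDS-specific behavior lives in the driver."""
    def __init__(self, driver: CDSDriver):
        self.driver = driver

    def run(self, design_rows: list) -> list:
        self.driver.export_sequence(design_rows)
        return self.driver.import_results()

engine = ExchangeEngine(EmpowerDriver())
results = engine.run([{"run_type": "suitability"}, {"run_type": "sample"}])
```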
The SVTI project also was required to address the following
related analytical R&D technology development goals:
• Implement easy setup of DOE-based experiments and facilitate statistically rigorous practice.
• Establish a 21 CFR 11 compliance support toolset and help maintain compliance across integrated platforms.
• Develop method connectivity. Early methods developed manually or using other software tools should be able to be optimized and validated using the new software.
• Generate final reports that are simple to review, communicate, and defend.
• Establish standardized reporting. Report form and content should be independent of the specific instrument, CDS, and facility. Reports should meet all FDA and ICH guidelines.
Figure 3: An example of the custom software interface. (The experiment setup window includes controls for system suitability check standards and for acceptance criteria such as %RSD, resolution (Rs), limit of quantitation, and limit of detection.)
Figure 4: An automated software solution. (Fusion AE supports execution step 1, validation planning and protocol generation, with workflow management, workflow templates, and rigorous statistical practice; execution step 2, experiments built in native CDS formats for walk-away operation with CDS independence; and execution step 3, statistical analysis, graphing, and final report generation that is statistically valid, defensible, and meets FDA and ICH guidances. The software provides OLE linking and embedding of Word documents and Excel workbooks, automatically exchanges experiment designs and results data with the CDS, and maintains 21 CFR 11 compliance.)
The main technology integration goals and the related tech-
nology development goals are illustrated in Figure 2.
SVTI program: product feature requirements
The SVTI program goal was the development of a central soft-
ware environment for all analytical methods validation work.
To facilitate acceptance and widespread use, the resulting soft-
ware program was required to include four specific feature sets:
a custom user interface specific to method validation experi-
mentation, a phased approach to method validation, the FDA-
and ICH-required complement of method validation experi-
ments, and a full complement of automation support features.
Custom user interface. The custom user interface required of
method validation software is illustrated in Figure 3. An exper-
iment setup window contains controls for incorporating sys-
tem suitability check standard injections into the design and
defines acceptance criteria for evaluating suitability results such
as the capacity factor (k') and peak resolution (Rs).
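For reference, these suitability quantities are simple functions of retention data. The short Python sketch below shows the textbook forms of the capacity factor and the resolution calculation together with a pass/fail check against user-entered limits; it is a generic illustration, not the software's internal calculation, and the limit values are placeholders.

```python
def capacity_factor(t_r: float, t_0: float) -> float:
    """k' = (tR - t0) / t0, with tR the analyte retention time and t0 the void time."""
    return (t_r - t_0) / t_0

def resolution(t_r1: float, t_r2: float, w1: float, w2: float) -> float:
    """Rs = 2 (tR2 - tR1) / (w1 + w2), using baseline widths of adjacent peaks."""
    return 2.0 * (t_r2 - t_r1) / (w1 + w2)

# Hypothetical acceptance criteria of the kind entered in the setup window
k_min, rs_min = 2.0, 2.0
k = capacity_factor(t_r=6.5, t_0=1.1)
rs = resolution(t_r1=6.5, t_r2=7.4, w1=0.30, w2=0.32)
print(f"k' = {k:.2f} ({'pass' if k >= k_min else 'fail'}), "
      f"Rs = {rs:.2f} ({'pass' if rs >= rs_min else 'fail'})")
```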
Phased approach to method validation. The Pharmaceutical Re-
search and Manufacturers of
A merica's (PhRMA's) analytical
technical group recommends a
phased approach to analytical
method validation in which
early-phase validation efforts
are done upstream on a reduced
set of validation elements ap-
propriate to the stage of devel-
opment (4). This process in-
volves method performance
characterization experiments to
define the "validatability" of the
current method. The need for
this is obvious when one con-
siders that analytical methods
are being used in drug discov-
ery and development well before the point at which final
validation is usually conducted.
SVTI software development addressed the need for and
value of a phased approach in terms of both experiment
organization and experiment structure. Method validation
experiments were partitioned into early phase (character-
ization) and final phase (FDA and ICH submittal quality).
Some experiments are contained within both phases (e.g.,
accuracy and linearity). In these cases, the software default
settings in terms of the number of sample preparation repli-
cates, number of injections per preparation replicate, num-
ber of concentration levels, and so forth will result in smaller
experiments with less time and resource burden in the early
phase, while the final-phase counterpart has the defaults
set to those defined in the FDA and ICH guidances.
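As an illustration of how such phase-dependent defaults might be represented, the sketch below pairs an early-phase and a final-phase configuration for an accuracy-type experiment. The early-phase numbers are arbitrary examples, and the final-phase numbers reflect the common ICH Q2 minimums (three concentration levels with three preparations each for accuracy, five levels for linearity); neither set is quoted from the software.

```python
# Hypothetical phase-dependent experiment defaults (values are illustrative only).
PHASE_DEFAULTS = {
    "early": {   # characterization: smaller, faster experiments
        "accuracy_levels": 3,
        "preps_per_level": 1,
        "injections_per_prep": 1,
        "linearity_levels": 3,
    },
    "final": {   # submittal quality: defaults aligned with FDA/ICH guidance
        "accuracy_levels": 3,
        "preps_per_level": 3,      # 3 x 3 = 9 determinations for accuracy
        "injections_per_prep": 2,
        "linearity_levels": 5,     # ICH Q2 minimum of 5 concentration levels
    },
}

def experiment_size(phase: str) -> int:
    """Total accuracy injections implied by the defaults for a given phase."""
    d = PHASE_DEFAULTS[phase]
    return d["accuracy_levels"] * d["preps_per_level"] * d["injections_per_prep"]

print(experiment_size("early"), "vs.", experiment_size("final"))  # 3 vs. 18
```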
Required complement of validation experiments. The com-
plement of method validation experiments built into the
Fusion AE software (S-Matrix) is categorized as follows.
Early-phase method validation (characterization). Early-phase val-
idation includes system suitability; filter validation; accu-
racy; linearity and range; repeatability (intra-assay preci-
sion); and sample solution stability (stability for a given time
period under prescribed conditions). Repeatability is affected
by both sample preparation error and instrument error (injec-
tion precision). Therefore, to demonstrate the repeatability of
the method as documented, all repeatability experiments were
required to include sample preparation replicates.
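Because repeatability as defined here reflects preparation error as well as injection error, the %RSD is computed across preparation replicates rather than across repeat injections of a single preparation. The minimal Python sketch below shows that calculation on made-up numbers; it is a generic illustration, not the article's software.

```python
from statistics import mean, stdev

def percent_rsd(values: list) -> float:
    """%RSD = 100 * standard deviation / mean."""
    return 100.0 * stdev(values) / mean(values)

# Hypothetical assay results: 6 preparation replicates, 2 injections each.
# Each preparation result is the mean of its injections, so the %RSD across
# preparations captures sample preparation error plus injection error.
injections_by_prep = [
    [99.8, 100.1], [100.4, 100.2], [99.5, 99.7],
    [100.9, 100.6], [99.9, 100.3], [100.1, 99.8],
]
prep_results = [mean(pair) for pair in injections_by_prep]
print(f"Repeatability %RSD = {percent_rsd(prep_results):.2f}%")
```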
Final-phase method validation (FDA and ICH submittal quality). Final-phase
validation includes system suitability; accuracy-linearity and
range-repeatability combined design (ICH Q2A states that ac-
curacy, linearity, and repeatability can be performed together
as a single combined experiment); robustness; ruggedness (in-
termediate precision and reproducibility); and specificity.
Full complement of automation support features. The central
software environment developed under the SVTI program re-
quired many custom support features to fully enable and auto-
mate the method validation experiment design suite just de-
scribed. The required support features naturally group into five
feature sets: assay types, compounds, analysis and reporting,
acceptance criteria testing, and workflow management.
Figure 5: Automated method validation workflow. (An e-lab notebook interface enables fast and easy experiment setup with correct planning built in. The software automatically builds statistically correct experiments and exports them to the CDS as ready-to-run method sets and sample sets; the experiments run automatically on the CDS; all results are auto-imported and auto-analyzed; and final reports that meet all FDA and ICH guidelines are created in RTF, DOC, HTML, or PDF formats.)
Figure 6: Efficiency gained by automation. (Percentage of time spent per method validation activity, with and without automation, for sample preparation and sequence setup, data analysis and statistics, and research report creation.)
Assay types. The assay types feature set addressed the four main
assays routinely addressed in method validation. Component
features included potency (drug content), content uniformity,
dissolution, and determination of impurities.
Compounds. The compounds feature set allowed multiple com-
pounds (active ingredients or impurities) to be included in the
same experiment (accommodating as many as 10 active in-
gredients or impurities).
Analysis and reporting. The analysis and reporting feature set pro-
vided the statistical analysis and graphing results reports re-
quired by the FDA and ICH guidances. Component features
included automated analysis (one-button click); automated
graphics created as part of automated analysis; and automated
report construction to meet all FDA and ICH guidances.
Acceptance criteria testing (user-defined value = X). The acceptance cri-
teria testing feature set enabled the analysis and reporting fea-
ture set to automatically compare actual results with predefined
pass-fail acceptance criteria and report the results of the com-
parisons. Component features included the following comparisons (a generic sketch of such a check follows the list):
• filter validation: % bias limits (X)
• accuracy: % bias (<X)
• linearity and range: % bias (<X)
• repeatability: %RSD (≤X)
• sample solution stability: % recovery limits (X)
• robustness: % effect (<X)
• ruggedness: % effect (<X); intermediate precision %RSD (≤X); and reproducibility %RSD (≤X)
• specificity: difference of practical significance (≤X).
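The sketch below illustrates, in generic Python, how a result can be compared with a user-defined limit of this kind and reported as pass or fail. The criteria names and limit values are placeholders, not defaults taken from the software.

```python
# Hypothetical pass/fail evaluation against user-defined limits (the X values).
CRITERIA = {
    "accuracy %bias":      {"limit": 2.0, "rule": "<"},   # pass if bias < X
    "repeatability %RSD":  {"limit": 1.0, "rule": "<="},  # pass if RSD <= X
    "robustness %effect":  {"limit": 2.0, "rule": "<"},
}

def evaluate(name: str, value: float) -> str:
    """Compare one result with its predefined acceptance criterion."""
    spec = CRITERIA[name]
    passed = value < spec["limit"] if spec["rule"] == "<" else value <= spec["limit"]
    return f"{name}: {value:.2f} vs. limit {spec['limit']} -> {'pass' if passed else 'fail'}"

results = {"accuracy %bias": 1.3, "repeatability %RSD": 0.8, "robustness %effect": 2.4}
for name, value in results.items():
    print(evaluate(name, value))
```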
Workflow management. The workflow management feature set
enabled construction of work templates, software-based ad-
ministration, and control of the work. Component features in-
cluded an ability to create and distribute workflow templates;
an ability to control feature access with user permissions and
authorities settings; and an ability to control workflow with re-
view and approve e-signing control loops.
SVTI program: results
Fusion AE software development was carried out at the S-Matrix
facility in Eureka, California. Project management and bench-
marking were carried out at Pfizer's facility in Ann Arbor, Michi-
gan. Benchmarking involved using the software to conduct "live"
method validation experiments in the walk-away mode with full
instrument control and automated data exchange with the CDS.
The work culminated in the delivery of a commercial release of
the Fusion AE software program that met all SVTI project require-
ments. The Fusion AE software solution is illustrated in Figure 4.
The corresponding automated method validation workflow is il-
lustrated in Figure 5. Notable features of the software program
include the following:
• a central software environment for all analytical method validation work
• flexibility to support "method validatability" studies done as part of method development
• a transferable electronic template generator for work standardization
• management workflow control
• rigorous DOE methods and practice integration
• automated data exchange between the target technology islands
• 21 CFR 11 compliance support across all technology islands
• data exchange with the target instrument data systems: PerkinElmer "TotalChrom"; Varian "Galaxie"; and Waters "Millennium" and "Empower."
As the final proof of project, a senior analytical chemist at Pfizer used the Fusion AE system to carry out all early-phase and final-phase method validation experiments (except robustness, which was done subsequently at a different lab) in the following seven work steps:
1. Prepare a series of HPLC injection samples and standards containing two active compounds.
2. Generate all experiment designs within Fusion AE.
3. Use the automated data exchange feature to export the designs to the CDS as ready-to-run methods and sequences in the native file format of the CDS.
4. Set up the HPLC (prepare the mobile phase reservoirs and load injection samples and standards into the autosampler).
5. Run all nine experiment design sequences on the HPLC in walk-away mode.
6. Use the automated data exchange feature to import the results data sets from the CDS into Fusion AE.
7. Use the automated analysis, graphing, and reporting features to generate submittal-quality reports for all nine experiment designs that meet all FDA and ICH guidances.

The analyst began the proof-of-project work on a Thursday morning at 9:00 am. All work was completed by noon of the following day. Ruggedness testing was limited to analyst and day. The work took less than 12 hours of the analyst's time. Work records showed that, on average, the same amount of work, from SOP planning and experiment design construction to final reports, took more than two weeks of an analyst's time using manual approaches and existing tools. Thus the proof-of-project work represented an 83% reduction in time and effort (12 h versus more than two weeks). Figure 6 illustrates a minimum expectation of the efficiency gain possible with the automated software solution in a case in which only a few of the simpler experiments are required and the time spent manually carrying out the automated operations (steps 2, 3, 5, 6, and 7) is minimized. As the figure shows, under these circumstances the minimum efficiency gain is still at least 60% (a 20% gain in data analysis and a 40% gain in report generation).

Table I: Key project goals, challenges, and results.

Project goal: User interface: easy setup of DOE-based experiments.
Principal challenge: A unique complement of on-screen user settings controls is needed to generate each of the required validation experiment designs.
Final result: An intuitive, DOE-transparent interface that displays the required design settings in a logical order and layout for each experiment design type.

Project goal: Experiment design: transform DOE-software-generated designs into the file and data formats of the target data systems.
Principal challenge: Lack of a standardized nomenclature and settings structure for run type designations (such as suitability, standard, sample, or unknown) between target data systems.
Final result: Ability to set all required run types within DOE designs exported for automatic execution by each of the target data systems.

Project goal: Data exchange: flexible data exchange adaptable to several target instruments and data systems.
Principal challenge: Lack of standardized data formats and instrument control structures within and between instruments and data systems.
Final result: A dynamically updatable instrument control driver set that adapts a generic data exchange engine to a target instrument and data system.

Project goal: Regulatory compliance: maintaining 21 CFR 11 compliance support across multiple instrument data system software platforms.
Principal challenge: The instrument data system software platforms provided little or no programmatic access to their internal 21 CFR 11 support features.
Final result: Use of compliant data tracking values in all data exchanges to support 21 CFR 11 compliance (e.g., data identity and audit trail) across the different software platforms.

Conclusion
A project of this complexity presented several software development challenges in each of the four main software program elements: user interface, experiment design, data exchange, and regulatory compliance. The most critical project goal in each of the four main program elements is presented in Table I. The table also presents the principal technical challenge associated with accomplishing each goal as well as the result achieved at the conclusion of the project.
The final Fusion AE deliverable enables the transformation of written SOPs for all required analytical method validation experiments into transferable automated templates in a timely manner and with full CGMP compliance. It also allows harmonization of the work across multiple sites and can be extended to contract research organizations with full management control of all work. Moreover, the connectivity to multiple instrument data systems means that the work environment is transparent to the instrument and the CDS. This will enable greater flexibility in selecting instrument platforms as needs change and technology improves.
Acknowledgments
The authors would like to thank the following Pfizer team sponsors and colleagues whose hard work and contributions were directly responsible for the unqualified success of the SVTI program: Gerard Hokanson, PhD, John Twist, PhD, Steven Hagan, PhD, Thomas MacNeil, Steve Priebe, PhD, James Sabatowski, James Spavins, PhD, and James Morgado. S-Matrix has full ownership of and responsibility for the Fusion AE product and should be contacted for all future inquiries.
References
1. FDA, CDER (CMC 3), "Reviewer Guidance: Validation of Chromatographic Methods," Nov. 1994.
2. ICH Q2A, "Guideline for Industry: Text on Validation of Analytical Procedures," Mar. 1995.
3. ICH Q2B, "Guideline for Industry: Q2B Validation of Analytical Procedures: Methodology," Nov. 1996.
4. S.P. Boudreau et al., "Method Validation by Phase of Development: An Acceptable Analytical Practice," Pharm. Technol. 28 (11), 54-66, 2004.
5. L. Torbeck, Complying with ICH and FDA Requirements for Assay Validation (Suffield Press, Evanston, IL, 2002).