
Clinical Data Management

SEQ 0004 JOB WIL8280-000-007 PAGE-0004 PRELIMS I-X1V
REVISED 01NOV99 AT 18:21 BY TF DEPTH: 58.01 PICAS WIDTH 40 PICAS

First published 1993 (reprinted 1994; 1995, twice; 1996; 1998, twice), Second Edition
published 2000
Copyright © 1993, 2000 by John Wiley & Sons Ltd,
Baffins Lane, Chichester,
West Sussex PO19 1UD, England
National 01243 779777
International (+44) 1243 779777
e-mail (for orders and customer service enquiries):
[email protected]
Visit our Home Page on http://www.wiley.co.uk
or http://www.wiley.com
All Rights Reserved. No part of this publication may be reproduced, stored in a retrieval
system, or transmitted, in any form or by any means, electronic, mechanical, photocopying,
recording, scanning or otherwise, except under the terms of the Copyright, Designs and
Patents Act 1988 or under the terms of a licence issued by the Copyright Licensing Agency,
90 Tottenham Court Road, London, UK W1P 9HE, without the permission in writing of the
publisher.
Other Wiley Editorial Offices
John Wiley & Sons, Inc., 605 Third Avenue,
New York, NY 10158-0012, USA
WILEY-VCH Verlag GmbH, Pappelallee 3,
D-69469 Weinheim, Germany
Jacaranda Wiley Ltd, 33 Park Road, Milton,
Queensland 4064, Australia
John Wiley & Sons (Asia) Pte Ltd, 2 Clementi Loop #02-01,
Jin Xing Distripark, Singapore 129809
John Wiley & Sons (Canada) Ltd, 22 Worcester Road,
Rexdale, Ontario M9W 1L1, Canada
Library of Congress Cataloging-in-Publication Data
Clinical data management / edited by Richard K. Rondel, Sheila A. Varley,
Colin F. Webb. — 2nd ed.
p. cm.
Includes bibliographical references and index.
ISBN 0-471-98329-2 (cased)
1. Medicine—Research—Data processing. I. Rondel, R. K.
II. Varley, S. A.
[DNLM: 1. Information Systems. 2. Automatic Data Processing.
3. Data Collection. 4. Data Display. 5. Quality Control.
W 26.55.I4 C641 1999]
R853.D37C55 1999
610'.285—dc21
DNLM/DLC
for Library of Congress 99–31926
CIP
British Library Cataloguing in Publication Data
A catalogue record for this book is available from the British Library
ISBN 0-471-98329-2
Typeset in 10/12pt Cheltenham Book by Dorwyn Ltd, Rowlands Castle, Hants.
Printed and bound in Great Britain by Biddles Ltd, Guildford and King’s Lynn
This book is printed on acid-free paper responsibly manufactured from sustainable
forestry, in which at least two trees are planted for each one used for paper production.
SEQ 0005 JOB WIL8280-000-007 PAGE-0005 PRELIMS I-X1V
REVISED 01NOV99 AT 18:21 BY TF DEPTH: 58.01 PICAS WIDTH 40 PICAS

Contents

Contributors vii

Foreword xi

Preface xiii

1 Chapter Review 1
Stuart W. Cummings

2 The International Conference on Harmonisation and its Impact 21
Beverley Smith and Linda Heywood

3 Case Report Form Design 47
Moira Avey

4 Data Capture 75
Emma Waterfield

5 Planning and Implementation 89
Chris Thomas

6 Data Validation 109
Pankaj ‘Panni’ Patel

7 Quality Assurance and Clinical Data Management 123
Heather Campbell and John Sweatman

8 Performance Measures 143
Jon Wood

9 Data Presentation 159
Munish Mehra

10 Coding of Data—MedDRA and other Medical Terminologies 177
Elliot G. Brown and Louise Wood

11 Database Design Issues for Central Laboratories 207
Tom Tollenaere

12 Computer Systems 229
Louise Palma

13 Systems Software Validation Issues—Clinical Trials Database Environment 249
Steve Hutson

14 Re-engineering the Clinical Data Management Process 271
Steve Arlington, Paul Athey, John Carroll and Alistair Shearin

15 Working with Contract Research Organizations 293
Kenneth Buchholz

16 Data Management in Epidemiology and Pharmacoeconomics 307
Michael F. Ryan and Andreas M. Pleil

17 Future Revisited 325
Ruth Lane

Index 347

Contributors

Steve Arlington
PricewaterhouseCoopers, West London Office, Harman House, 1 George
Street, Uxbridge, London UB1 1QQ, UK

Paul Athey
PricewaterhouseCoopers, West London Office, Harman House, 1 George
Street, Uxbridge, London UB1 1QQ, UK

Moira Avey
2 Loughton Villas, Crowborough, East Sussex TN6 5UD, UK

Elliot G. Brown
EBC Ltd, 7 Woodfall Avenue, High Barnet, Herts EN5 2EZ, UK

Kenneth Buchholz
INC Research, Charlottesville, Virginia, USA

Heather Campbell
Covance Clinical & Periapproval Services Ltd, 7 Roxborough Way,
Maidenhead, Berkshire S16 3UD, UK

John Carroll
PricewaterhouseCoopers, West London Office, Harman House, 1 George
Street, Uxbridge, London UB1 1QQ, UK

Stuart W. Cummings
Merck Sharp & Dohme (Europe), Inc., Clos du Lynx 5, Lynx Binnenhof,
Brussel 1200, Bruxelles, Belgium

Linda Heywood
Amgen Ltd, 240 Cambridge Science Park, Milton Road, Cambridge CB4 0WD,
UK

Steve Hutson
Barnett International, Parexel, River Court, 50 Oxford Road, Denham,
Middlesex, UB9 49L, UK

Ruth Lane
MDS Therapeutic Director, Glaxo Wellcome UK, Greenford Road, Greenford,
Middlesex UB6 0HE, UK

Munish Mehra
M2 Worldwide, 1401 Rockville Pike, Suite 300, Rockville, Maryland 20852,
USA

Louise Palma
Berlex Laboratories Inc., 340 Changebridge Road, PO Box 1000, Montville,
New Jersey 07045-1000, USA

Pankaj ‘Panni’ Patel
Manager, Contract Operations and Resource Management, SmithKline
Beecham Pharmaceuticals, New Frontiers Science Park (South), Third
Avenue, Harlow, Essex CM19 5AW, UK

Andreas M. Pleil
Pharmacia & Upjohn AB, Lindhagensgatan 133, S-112 87 Stockholm, Sweden

Richard K. Rondel
HPRU Medical Research Centre, University of Surrey, Egerton Road,
Guildford, Surrey GU2 5XP, UK

Michael F. Ryan
460 Foothill Road, Bridgewater, New Jersey 08807, USA

Alistair Shearin
PricewaterhouseCoopers, West London Office, Harman House, 1 George
Street, Uxbridge, London, UB1 1QQ, UK

Beverley Smith
Amgen, Inc., One Amgen Center Drive, Thousand Oaks, California
91320-1789, USA

John Sweatman
7 Fox Covert, Lightwater, Surrey GU18 5TU, UK

Chris Thomas
Covance Clinical & Periapproval Services Ltd, 7 Roxborough Way,
Maidenhead, Berkshire S16 3UD, UK

Tom Tollenaere
T2 Data Consult BVBA, Willemstraat 28, B-3000 Leuven, Belgium

Sheila A. Varley
Customer Strategic Business Unit, Drug Development Services, Clinical
Services Europe, Quintiles Ltd, Ringside, 79 High Street, Bracknell, Berkshire
RG12 1DZ, UK

Emma Waterfield
Clinical Trials Research Ltd, 107/123 King Street, Maidenhead, Berkshire
SL6 1DP, UK

Colin F. Webb
Amgen Ltd, 240 Cambridge Science Park, Milton Road, Cambridge CB4 4WD,
UK

Jon Wood
Phoenix International UK, Mildmay House, St Edwards Court, London Road,
Romford, Essex RM7 9QD, UK

Louise Wood
Epidemiology Unit, Post Licensing Division, Medicines Control Agency,
Market Towers, 1 Nine Elms Lane, London SW8 5NQ, UK

Foreword

Clinical data management is a profession with increasing importance
within pharmaceutical research and development. The diverse lineage of
clinical data management coupled with a wide range of responsibilities
makes a clear, clean definition of ‘clinical data management’ difficult at
best. As complex and diverse as the profession is, it is a field in which the
number of substantial publications is extremely small. The first edition of
this book provided one of the very few in-depth resources for clinical data
management professionals. This second edition continues in that tradition
by expanding and updating that knowledge base.
Authored by professionals on both sides of the Atlantic, this text is
reflective of the current trends of global harmonisation of clinical research
and development. With the global consolidation of the industry, it is crit-
ical to understand, appreciate and be able to work within the framework of
global clinical development. This text should contribute to that
understanding.
As the global clinical data management discipline continues to grow we
can rightfully expect an increase in the amount of research and reference
material, such as this book, available to those working in and around the
pharmaceutical industry. This is good for all involved—authors, pub-
lishers and readers!

Paul R. Loughlin
Chairperson,
Association for Clinical Data Management (ACDM)
Dr Kenneth Buchholz
Chairman of the Board of Trustees
Society for Clinical Data Management

Preface

Clinical Data Management has come a long way in the last decade. It is now
a firmly established discipline in its own right, and is becoming an area
that people know about and within which they can progress their careers.
We feel that the next decade will see major changes with the advent
of electronic data capture. The clinical and data jobs/disciplines as we
know them today will become one as companies use more and more
sophisticated hardware and software to streamline the clinical trial
process and eliminate duplication from it. Gone will be the days of the
Investigator giving the CRF to the CRA, the CRA giving the CRF to Data
Management (DM), DM giving the CRF to Data Entry (DE), DE entering it
and giving the CRF back to DM, and so on.
The ever-increasing computerisation of the worldwide healthcare system
will mean a practically paperless environment in which study protocols
will specify which data points, at which intervals, need to be transmitted
from the clinic to the company headquarters by electronic means.
The need will then be for strict computer validation, audit trails of data
edits, electronic querying of data, and mechanisms to ensure that the host
database at the hospital site is updated correctly and not corrupted.
The industry is still consolidating, with more and more mergers and
acquisitions occurring. The world of Contract Research Organisations has
started to follow, with CROs, often of considerable size, going through
takeovers and mergers to supply the type and size of service that the new
emerging pharmaceutical companies need. We are seeing the emergence
of virtual pharmaceutical/biotechnology companies who have no inten-
tion of having their own clinical research staff, but who just buy in a drug
and then rely on the service industry to take it to the market place.
We predict that the top pharma/CRO companies today will not be the
players of tomorrow unless they now address the necessary structures
and technology to move themselves into the New Tomorrow, and the true
advent of electronic data capture.

RKR
SAV
CFW

Clinical Data Management, 2nd Edition. Richard K. Rondel, Sheila A. Varley, Colin F. Webb
Copyright © 1993, 2000 John Wiley & Sons Ltd
ISBNs: 0-471-98329-2 (Hardback); 0-470-84636-4 (Electronic)

1 Chapter Review
STUART W. CUMMINGS
Merck Sharp & Dohme (Europe), Inc., Brussels, Belgium

INTRODUCTION

The breadth of topics covered in this second edition reflects the range of
regulatory, technical and operational areas of clinical development which
are all impacted by the need for sound and effective Clinical Data
Management (CDM) practices. The many authors who have contributed to
this book are able to draw on many years of practical experience from
within the pharmaceutical industry and have themselves either initiated
or implemented many of the ideas described in the chapters that follow.

ICH AND ITS IMPACT (Smith and Heywood)

Since the first edition of this book was published in 1994, the International
Conference on Harmonisation (ICH) has had a significant impact on how
clinical trials are conducted and has set new expectations regarding how
sponsor companies and drug regulatory authorities will interact in the
next millennium. The impetus for ICH stems from a common desire on the
part of industry to reduce development costs, from a regulatory perspec-
tive to reduce approval times and from a public health viewpoint to make
more economical use of human subjects in scientific research studies. In
Chapter 2 of this new edition of Clinical Data Management, Smith and
Heywood present a concise overview of the recent history and conclu-
sions resulting from the four ICH conferences that took place between
1991 and 1997.
The authors begin their review by describing the ICH organisational
structure and defining the stepwise process whereby guidelines are
developed and approved. A chart is provided which displays the status of
each guideline. Particular consideration is given to surveying the five ICH
guidelines (E2A, E3, E6, E8 and E9) which include specific references to
clinical data management. Procedural and system changes which may be
needed to assure compliance with ICH are reviewed in depth and the need
to assure appropriate training and education is emphasised. Key areas
where pharmaceutical companies will have to devote considerable energy
include system validation, harmonisation of adverse experience terminol-
ogy and the reformatting of key tables and listings for reporting purposes.
ICH also underlines the role and contribution of data management staff
throughout the drug development process including design activities, gen-
erating CSR tables and listings and satisfying electronic submission
requirements.
Reference is also made to other guidance which complements ICH but
has been developed separately in different regions. This includes the EU
GCP Directive and a number of FDA guidance documents focusing on the
submission of electronic case record forms (CRFs) and data listings. The
authors note that the involvement of regulatory agency staff in the de-
velopment of ICH and other guidelines, particularly with regard to elec-
tronic submissions should ultimately facilitate the review process, reduce
the review period and further encourage consistency across regulatory
agencies.
ICH is already impacting how data management departments are struc-
tured, how data management tasks are executed and how data are ex-
changed between sponsors and regulatory authorities. Widespread
adoption of ICH will confirm the common framework against which re-
search working practices will be evaluated, new clinical data management
systems implemented and training and education goals will be set.
However, it will probably take several years before it will be possible to
assess whether the goals of ICH have been reached.

CRF DESIGN (Avey)

Despite the recent emphasis on electronic data capture tools, 95% of
clinical data are still captured on paper and considerable resources are
still applied to achieving efficient design, production and distribution of
CRFs. Good CRF design offers the opportunity to minimise data process-
ing delays due to poor data quality or loss of data. However, CRF design
alone cannot compensate for inadequacies which may be inherent in the
protocol. Moreover, since study objectives differ between early phase
clinical trials and confirmatory trials it is to be expected that data collec-
tion and hence how CRFs are designed will also vary. These obser-
vations, from Avey, form the basis for a detailed account of CRF design
and implementation. The author proposes a life cycle model which con-
siders how a CRF is used at each stage of its evolution, encompassing the
perspective of the designer, the user (form filler), data entry staff and
data reviewer.
Creating a time and event schedule (study flow chart) derived from the
study protocol can be helpful in designing the CRF and clarifying where
standards can be applied and is strongly recommended as a key
preparatory step. The author gives examples of modules which may gen-
erally be regarded as ‘standards’ and offers guidance as to the types of
changes to ‘standards’ that should be permitted or even mandated. The
use of standards must be balanced with a degree of flexibility to accom-
modate diverse trials since, if standards are applied slavishly, modules
become foreign to the form filler’s environment and data quality will be
jeopardised.
Advice is offered as to how to identify, construct and organise data
items onto CRF pages, noting that accuracy and legibility can be affected
by the availability and presentation of space for recording responses. CRF
‘performance’ can also be enhanced through the use of a ‘positive thinking
bias’ by presenting optional responses ranked by relevance and import-
ance. Amongst alternatives for responding to multiple choice questions
‘tick marks’ are favoured in preference to other indicators. Readers are
also cautioned that, when presenting an ordered categorical list, the posi-
tioning of response boxes relative to the question text can influence the
response. Design features that help to minimise ambiguity (e.g., ‘should’
could mean may or must; avoiding double negatives) are also
discussed. The order, format and physical characteristics of CRF pages
can all influence how they are completed. There is also some discussion
regarding various CRF production features, for example the use of dif-
ferent types and weights of paper, use of colour, margins, shading, fonts,
insertion of additional pages, and so on.
CRF design impacts all stages of the clinical trial process. CRFs designed
to facilitate data recording must also recognise how and where data will
be entered and subsequently reviewed. A life cycle analysis to evaluate
competing needs among different partners in the CRF process at different
timepoints can help to achieve a balanced solution.

DATA CAPTURE (Waterfield)

For more than a decade, there has been a drive towards using electronic
data capture tools in the belief that these technologies could be de-
veloped at reasonable cost, would reduce processing time and enhance
data quality. It has also been recognised that facilitating data capture
through technology solutions alone would not be sufficient and that
changes in work processes and in job roles would also have to occur if
such solutions were to be successful. However, the scalability of such
solutions has often been questioned not only in terms of development and
support costs but in terms of the computing architecture necessary to
sustain such solutions globally. Technology and cost constraints have
limited the widespread adoption of new approaches to data capture. Most
major and medium-sized pharmaceutical companies have experimented
with RDE solutions but few companies have embraced this approach as
their primary data capture solution. In this chapter, Waterfield first exam-
ines how attitudes towards data capture have evolved in recent years and
then reviews a number of different remote data entry (RDE) technologies.
The author stresses that data entry systems must be designed from the
perspective of the person keying the data. For example, the data entry
screen and data entry guidelines may be more or less complex depending
on the skill set and medical background of the person keying the data.
Understanding how a user will interact with the data entry screens deter-
mines the extent of edit check functionality built into the system and in
particular the extent to which autoencoding may be used. Data capture
requirements also change as one graduates from a centralised approach
using ‘heads down’ data entry staff to one where data entry is distributed
on a global scale and where the data entry would be carried out by clerical
staff, study monitors or investigators.
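The autoencoding step mentioned above can be sketched in miniature. The snippet below is illustrative only: the synonym dictionary, field names and terms are hypothetical, whereas production systems code against licensed terminologies such as MedDRA.

```python
# Minimal autoencoding sketch: map a verbatim adverse-event term to a
# dictionary preferred term, or flag it for manual coding. The synonym
# dictionary below is a hypothetical stand-in for a licensed terminology.

SAMPLE_DICTIONARY = {
    "headache": "Headache",
    "head ache": "Headache",
    "nausea": "Nausea",
    "feeling sick": "Nausea",
}

def autoencode(verbatim, dictionary=SAMPLE_DICTIONARY):
    """Return (preferred_term, matched) for a verbatim term."""
    key = " ".join(verbatim.lower().split())  # normalise case and whitespace
    if key in dictionary:
        return dictionary[key], True
    return None, False  # no match: route to the manual coding queue

term, matched = autoencode("  Head Ache ")
assert term == "Headache" and matched
```

Even this toy version shows why the extent of autoencoding depends on who keys the data: medically trained users tend to enter terms closer to the dictionary, so more records match automatically.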
Data capture is not just limited to processing of CRF data and Waterfield
next considers various design concepts and issues surrounding the cap-
ture of data from external sources as well as considering the pros and
cons of alternative data capture technologies that can be used, for ex-
ample, fax-based systems, optical character recognition (OCR) and image
character recognition (ICR) systems.
There is a good discussion on the rationale and some of the design
issues concerned with the development and implementation of RDE tech-
nology. The potential benefits of using RDE must be assessed against the
development and the support costs associated with global ambitions to
achieve early access to study data. Successful RDE systems must be flex-
ible and be based on careful protocol selection. Factors which influence
the choice of data capture tools include cost, maintenance, security and
regulatory compliance.
To date, only the larger pharmaceutical companies have been willing to
invest heavily in new electronic data capture technologies and modify
their work processes. With the advent of the next generation of data
capture tools, embracing Web-based solutions and the prospect of con-
trolled access to medical records, companies of all sizes will have to
introduce electronic data capture technologies if they are to remain com-
petitive. However, most companies may expect to live with a mixed data
entry approach for the foreseeable future and should remain vigilant to
the issues involved in scalability and user support as data entry becomes
even more decentralised.

PLANNING AND IMPLEMENTATION (Thomas)

Depending on the trial in question, up to 40% of the total resource spent in
drug development can be attributed to tasks related to data management.
This large cost can be mitigated if, in particular, project teams elect to
integrate data management early in the planning phase and involve data
managers as key members of design and implementation teams. In this
chapter, Thomas reviews the key steps associated with the planning and
execution of a clinical trial system, emphasising not only the process
involved but also the data management products delivered at each stage
of the process.
Planning starts with a basic understanding of the business needs, a
clear definition of objectives, a budget proposal, a summary of the
assumptions and constraints that may affect both development and
implementation and a notion of timeline. Only once the timeline has been
established can those tasks and key milestones which fall on the critical
path be identified. An early product resulting from the planning process is
a list of feasible solutions supported by a statement of the manpower and
materials required to support each solution together with a framework
against which alternatives can be evaluated. The evaluation should in-
clude scope (single vs multiple protocols), deliverables (study database,
statistical report, clinical study report), customer focus (internal,
external), data sources and data flow (CRFs, laboratory data), ownership
(processes and tasks) and constraints (budget, skills, time).
From the data management perspective, the project plan defines pro-
cess flows against which specific data management and study tracking
solutions can be developed. If a new technology is to be introduced then
the impact of change on existing work processes and job roles must be
taken into consideration before confirming the final solution. The plan-
ning phase must also consider such items as data validation, reconcilia-
tion of adverse experience (AE) data, SOP development and training,
dictionary management and how the liaison with CROs will be handled, if
applicable.
Throughout the project, timelines, budget, process efficiency and prod-
uct quality as stated in the project plan are under constant review. In this
regard, it is important to report on metrics representing performance,
quality and resource utilisation to determine if the study is being con-
ducted according to plan or if certain processes are not in control. Re-
sponses to processes which are found not to be in control may include
renegotiation of tasks and priorities, shifting resources, providing
incentives and possibly changing work processes.
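The notion of a process being ‘not in control’ can be made concrete with a simple Shewhart-style three-sigma check. The metric, sample values and threshold below are hypothetical; real monitoring would use whatever performance and quality measures the project plan defines.

```python
# Sketch of a three-sigma control check on a weekly data-management
# metric (e.g., queries raised per 100 CRF pages). Illustrative only.
from statistics import mean, stdev

def out_of_control(history, latest, sigmas=3.0):
    """Flag `latest` if it falls outside mean +/- sigmas * stdev of history."""
    centre = mean(history)
    spread = stdev(history)
    return abs(latest - centre) > sigmas * spread

weekly_query_rate = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]
assert not out_of_control(weekly_query_rate, 4.3)   # within normal variation
assert out_of_control(weekly_query_rate, 9.5)       # triggers a response
```

A flagged value would then prompt exactly the responses listed above: renegotiating tasks, shifting resources or changing the work process.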

DATA VALIDATION (Patel)

GCP and regulatory reporting requirements emphasise the importance of
validation of systems, processes and data. Since the quality of trials depends
on the acceptability of data and results, all trial participants have a role to
play in ensuring this success and in sharing the responsibility for contin-
uous data validation. However, data recorded on CRFs do not always
represent data held in source documents and even if data are correctly
recorded on CRFs they may not always be correctly represented on the
clinical database, and study reports may not always reflect the contents of
the database. Whereas electronic data capture solutions and automated
query and review tools have, to some extent, reduced the time and effort
required to review and correct data at different stages, data validation still
commands considerable resources to ensure success. Resources to sup-
port validation efforts can be reduced through careful definition and ex-
ecution of data review and audit plans and by encouraging a continuous
data validation process throughout the study.
In this chapter, Patel considers data validation as a stepwise process
starting at the investigator site and ending only when the final clinical
study report is published. The roles of the investigator and study monitor
in assuring that source document verification (SDV), data entry and subse-
quent data review steps are conducted in accordance with GCP are dis-
cussed. Data validation during data entry is accomplished by executing
edit checks against the data being entered. The number and complexity of
edit checks will be dependent on the underlying data management pro-
cess and the job roles of those involved.
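Entry-time edit checks of the kind described above can be sketched as simple field-level rules. The field names, ranges and query messages below are hypothetical examples, not those of any particular system.

```python
# Minimal sketch of entry-time edit checks on a CRF record.
# Field names, plausible ranges and messages are hypothetical.
from datetime import date

def edit_checks(record):
    """Return a list of query messages; an empty list means the record passes."""
    queries = []
    if not 30 <= record.get("heart_rate", 0) <= 200:
        queries.append("heart_rate outside plausible range 30-200 bpm")
    if record.get("visit_date") and record["visit_date"] > date.today():
        queries.append("visit_date is in the future")
    if record.get("sex") not in {"M", "F"}:
        queries.append("sex must be coded M or F")
    return queries

rec = {"heart_rate": 250, "visit_date": date(1999, 6, 1), "sex": "M"}
assert edit_checks(rec) == ["heart_rate outside plausible range 30-200 bpm"]
```

In a centralised ‘heads down’ entry environment such checks might be few and deferred to batch validation; where investigators key data directly, richer interactive checks of this sort carry more of the validation burden.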
Steps that can be taken to enhance data quality before the study starts
are discussed in some detail. In particular, the role of SOPs found in
regulations and in company policies can be helpful in establishing the
environment and setting expectations regarding how data will be pro-
cessed and validated. Parallel development of the protocol and CRF, de-
velopment of clear data handling guidelines, timely training and support of
investigator and field staff can all lead to increased data quality. Headquar-
ters staff too can benefit from a clear understanding of the data manage-
ment guidelines in an effort to reduce the number of review questions
raised and presumably to increase the proportion of questions raised that
are relevant and lead to database changes. There is also mention of how
validation should be carried out for data from external sources (e.g., labo-
ratory data).
In conclusion, Patel notes that the introduction of new technologies will
have a significant impact on data validation. As automated processes for
data capture and review become standard practice, it is expected that
there will be a shift from data validation late in the process towards early
validation of the systems and procedures that govern the clinical data
management process.

QUALITY ASSURANCE AND CLINICAL DATA MANAGEMENT (Campbell and Sweatman)

Definitions of quality assurance (QA) and quality control (QC) can be
found in guidance issued by the FDA and in ICH GCP. In general, this
guidance refers not just to data but also to the systems procedures and
validation steps which give assurance that data have been processed
correctly and that the CSR is a true representation of the trial that took
place. Although audit findings cannot give 100% assurance with regard to
all aspects of a trial, they should accurately reflect what has happened.
Moreover, QA should not just be viewed as a confirmation step to ensure
compliance with regulations and procedures but as an opportunity to
positively influence decision making across all phases of development
before problems arise.
In the opening sections of the chapter, Campbell and Sweatman quote
the definitions of QA and QC and related terms as defined in ICH GCP and
take care to distinguish between these two terms, which are frequently
confused. One key distinction is that QC is carried out by all staff
throughout the trial whereas QA is an independent audit activity.
There follows a review of how audit practices have evolved in recent
years, characterised by a shift away from site audits and the late involve-
ment of QA staff to an earlier and more continuous effort focusing on
processes and procedures starting at the protocol review stage and con-
tinuing throughout the lifetime of a study. The early involvement of QA
staff also positions this group to play a more proactive role as the trial
progresses, including selection of investigator sites, setting the data re-
view strategy and the training of site and other study personnel. The
benefits of early and interim audits are also described.
From a data management perspective, audit activities focus on five key
areas—study documentation, completion of CRFs, emphasis on key vari-
ables, content and format of tables and listings, and the CSR. Advice is
offered as to how groups supporting these various activities can prepare
for both internal and external regulatory audits. Guidance is also given in
terms of how compliance can be achieved and measured although some
caution is suggested against over-interpretation of error metrics unless
the structure behind their meaning is clear.
Future audit activities are expected to be greatly influenced by the im-
pact of technology changes—particularly the expansion of electronic data
capture systems and regulatory acceptance of electronic signatures, for
example, by FDA. QA will also have an expanded role to play in ensuring
harmonisation of submissions. The contribution of QA is dependent on
the quality and expertise of the staff performing this role. Criteria for
selection and recruitment of audit staff are mentioned, emphasising both
external as well as internal training opportunities. In conclusion, the au-
thors suggest that the time may be right to introduce a formal QA
qualification in response to the increasing role, contribution and size of
QA departments.

PERFORMANCE MEASURES (Wood)

Measurement of performance is necessary to demonstrate successful pro-


ject management and to identify opportunities for continuous improve-
ment. Yet measurement of clinical development processes, and in
particular data management activities, has proved to be notoriously diffi-
cult. Part of this difficulty arises because drug development timelines
tend to be driven less by an underlying process than by target dates that
are set first, with resources subsequently added or priorities adjusted to
ensure that targets are met. Nevertheless, organisations that can generate
meaningful measures against which performance can be judged and future
targets set should be more successful in achieving continuous improve-
ment than those which do not.
In his review of the challenges and opportunities that impact our ability
to measure performance, Wood highlights three key dimensions where
measurement is desirable—productivity, quality and cycle time—noting
that aggregation of data across protocols may be difficult or inappropriate
since protocols have varying characteristics and operate under different
processes.
The key to successful measurement, Wood suggests, is defining a frame-
work or process flow from which processing units (e.g., a complete CRF)
can be derived, and against which measures (e.g., CRF visit date to CRF
reviewed) can be determined. Successful measurement involves gathering
input and buy-in from all team members since only by understanding how
measures relate to each other can a true appreciation of the process flow
be gained.
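A framework of this kind can be sketched in a few lines of Python; the records, field names and dates below are illustrative assumptions rather than anything prescribed in the chapter.

```python
from datetime import date
from statistics import median

# Illustrative CRF tracking records: each processing unit is one complete
# CRF, with the dates of two milestones in the process flow.
crfs = [
    {"site": "001", "visit_date": date(1999, 3, 1), "reviewed": date(1999, 3, 20)},
    {"site": "001", "visit_date": date(1999, 3, 8), "reviewed": date(1999, 4, 2)},
    {"site": "002", "visit_date": date(1999, 3, 2), "reviewed": date(1999, 3, 12)},
]

def cycle_times(records):
    """Days elapsed from CRF visit date to CRF reviewed, one per record."""
    return [(r["reviewed"] - r["visit_date"]).days for r in records]

times = cycle_times(crfs)
print("median cycle time (days):", median(times))
```

Once the processing unit and the two milestones are agreed by the team, the same measure can be aggregated by site, by protocol or over time to set future targets.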
The use of metrics data for reporting or diagnostic purposes should
reflect the underlying work process (e.g., it should be possible to
distinguish between queries raised by field monitors and in-house staff).
Such metrics, however, may be subject to misinterpretation; the au-
thor cautions against inappropriate use and stresses the need for relevant
sub-analyses to ensure that performance is correctly evaluated. Wood also
stresses the importance of being able to capture resource data although
getting staff to record time spent per task will rarely be successful.
CDM still represents about 30–40% of total effort and expenditure in
drug development. Measurement of current and assessment of future
practice should identify improvement opportunities leading to reduced
cycle times and accelerated data clean up and reporting practices. Al-
though we can measure CDM performance, this is not always done or
applied consistently. As a consequence it is difficult to assess the real
economic and material benefits of change. Measurement at lower levels can
only succeed if project teams buy in to a methodology and to measures of
common interest. In conclusion, Wood suggests that network-based tech-
nologies will continue to significantly impact current performance and the
standards that we use. However, without metrics our assessment of the
effect of change will at best be inaccurate and at worst lead to missed
opportunities for further improvement.

DATA PRESENTATION (Mehra)

In designing a clinical trial, most attention is given to those activities
concerned with the trial set up, for example, CRF design, data capture
system specifications and data handling guidelines. Less attention is paid
to how data will be presented and reported. This is surprising when one
realises that it is the report and the way data are presented to regulatory
agencies that will determine whether approval is granted or not. Although
some effort is generally made to include reporting needs at the study
design stage, for example through the adoption of data analysis and data
management plans, most data presentation needs can only be finalised
well after the trial has started. To add to this confusion some companies
have found that the common formats defined by ICH have caused con-
siderable rework and some uncertainty in determining how data will be
presented in clinical study reports.
In his analysis of how data should be displayed to meet regulatory
requirements, Mehra considers two uses of data presentations. Firstly, to
assist in the review and screening of data leading to clean files for analysis,
and secondly, for the purpose of permitting reviewers to accept or reject a
hypothesis under test on the basis of data presented in a submission
document. He points out the difficulty in designing clinical database
systems that are optimal for data capture, data retrieval and data
presentation and notes that in most cases data capture needs are fa-
voured. This practice may become even more widespread as data entry
becomes more decentralised. Database designers have traditionally
focused on structures which are optimal for storage and can allow trans-
formations for other purposes—the so called third normal form.
However, for data presentation, non-normalised databases are
preferred.
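The trade-off can be illustrated with a toy sketch (the variable names and values are invented): data captured in a normalised 'long' structure is pivoted into the denormalised 'wide' layout preferred for presentation.

```python
# Normalised ('long') storage: one row per patient, visit and variable,
# close to third normal form and convenient for capture and update.
long_rows = [
    ("P01", 1, "SBP", 120), ("P01", 1, "DBP", 80),
    ("P01", 2, "SBP", 118), ("P01", 2, "DBP", 78),
]

def denormalise(rows):
    """Pivot long rows into one wide record per (patient, visit),
    the shape generally preferred for listings and presentation."""
    wide = {}
    for patient, visit, variable, value in rows:
        wide.setdefault((patient, visit), {})[variable] = value
    return wide

for key, values in sorted(denormalise(long_rows).items()):
    print(key, values)
```

The transformation is cheap in one direction, which is why designers can afford to optimise the stored structure for capture and derive presentation views from it.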
Another issue which designers and programmers alike have to face is
that different reviewers may require different presentations of the data. To
facilitate this debate, Mehra proposes certain rules governing data presen-
tations. He notes that displays of raw data, for example by CRF form type
or by patient across forms, may not necessarily be optimal for screening
purposes. This task is better accomplished through summary displays
(tables, graphs, figures and plots) as well as data listings. Such listings are also
used for validation purposes and frequently use distribution statistics to
highlight abnormal values or trends of interest.
The author next presents a characterisation of different data types and
reviews how presentation needs differ between continuous and categori-
cal variables and between visit datapoints and those that are patient ori-
ented. Another issue covered is that of combining data across different
protocols where, for example, different units, variable names or methods
of data collection may have been used.
In conclusion the author stresses the importance of displaying data not
just for review and validation but also for interpretation of results. Data
displays demand the skills of both data management and statistical pro-
grammers and should ideally be amenable to statistical modelling and
analysis.

CODING OF DATA (Brown and Wood)

Brown and Wood dedicate this chapter to the late Dr Sue Wood, formerly
of the MCA, whose work is widely recognised as being a driving force in
the harmonisation of the use of medical terminology now reflected in the
work of the MedDRA and ICH M1 working parties. Brown and Wood both
work at the MCA and it is therefore appropriate that they also present
the benefits that coding brings from the perspective of a regulatory
authority.
The authors begin their review by assessing the need for coding
systems, particularly those linked with medical terminology, in response
to the need to manage large volumes of text data at a time when
computing technology was not well designed to meet this need. Coding of
data not only provides an opportunity for storing data more concisely and
consistently but also greatly facilitates the summarisation and reporting
of text data. The irony that today’s computing technology solutions are
well able to handle large text databases is not lost on the authors.
Coding systems which permit the aggregation of data help sponsors to
meet legal obligations and aid in the identification of signals to detect rare
adverse experiences. In particular the use of recognised coding systems
effectively eliminates the risk that different processors could obtain dif-
ferent results by applying their own rules for aggregating verbatim terms.
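The point can be illustrated with a toy coding table (the entries below are invented; real dictionaries such as MedDRA are far larger and hierarchical): once the dictionary is fixed, every processor aggregates verbatim terms identically.

```python
from collections import Counter

# A toy coding dictionary mapping verbatim adverse-event text to a single
# preferred term. The entries are invented for illustration only.
DICTIONARY = {
    "headache": "Headache",
    "head ache": "Headache",
    "mild headache": "Headache",
    "nausea": "Nausea",
}

def code_term(verbatim):
    """Return the preferred term for a verbatim report, or None if the
    text is not in the dictionary and needs manual coding."""
    return DICTIONARY.get(verbatim.strip().lower())

reports = ["Headache", "head ache", "NAUSEA", "mild headache"]
print(Counter(code_term(r) for r in reports))
```

Because the mapping is shared, three differently worded reports aggregate to the same preferred term regardless of who runs the count, which is precisely the consistency a recognised coding system provides.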
The authors next review the history and characteristics of a number of
the commercial dictionaries which have been embraced both by industry
and by drug regulatory agencies. Particular attention is given to the origin
of MedDRA which dates back to 1993, with further work culminating in the
acceptance of MedDRA as a new international standard endorsed by ICH.
There follows a concise overview of the structure, use and maintenance of
MedDRA, highlighting the expected advantages in comparison with exist-
ing commercial and indeed some in-house coding schemes. In certain
instances, MedDRA can be regarded as being more complex than existing
systems and will require a more in-depth knowledge of dictionary struc-
tures, with a possible need to adjust MedDRA output for presentation pur-
poses. The authors note that over time MedDRA will become the gold
standard for pharmacovigilance and expedited reporting. However, its
acceptance by industry and industry support organisations is more likely
to be driven through its adoption by regulatory authorities.
In closing, the reader is reminded why the creation and acceptance of a
single medical terminology having the support of ICH will provide long-
term benefits both to sponsors and to agencies. As a consequence expec-
tations are high that data quality, electronic interchange of data and hence
speed of development, review and approval of drug applications and phar-
macovigilance will all improve for the public good.

DATABASE DESIGN ISSUES FOR CENTRAL LABORATORIES (Tollenaere)

Guidelines regarding laboratory data systems are briefly mentioned in EC
GCP requirements and again in GMP whereas GLP advice is more directed
towards animal experimentation. In this chapter, Tollenaere considers the
special case of database structures for the management of clinical labora-
tory data. He begins by pointing out that, in general, the rules for normalis-
ing data do not necessarily apply to laboratory data, in part due to
regulations and in part to particular requirements which characterise lab-
oratory data processing.
By way of example, the author explores the structures, advantages and
disadvantages of normalised datasets, pointing out the particular diffi-
culties caused by erroneous or inconsistent laboratory data. Normalised
datasets minimise storage needs, avoid duplication of data and can
easily be updated. However, they have the disadvantage that data re-
trieval and interrogation are somewhat more complex and may require
sophisticated programming techniques. This point is illustrated by ex-
amining the impact of just one erroneous laboratory value. Conversely,
storing data in a denormalised form and selecting the right key can result
in correct output. A clean database can often be normalised for delivery
to a sponsor even though the effort in creating it did not follow
normalisation rules.
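The effect of a single erroneous value can be sketched as follows (field names and values are invented): in a normalised result set, one wrong visit key makes two rows collide when the data are pivoted, and a defensive pivot should surface the collision rather than hide it.

```python
# Normalised laboratory results: (patient, visit, test, value).
# The last row should have been keyed as visit 2 but was entered as visit 1.
results = [
    ("P01", 1, "ALT", 30),
    ("P01", 2, "AST", 28),
    ("P01", 1, "ALT", 95),   # erroneous key: collides with the first row
]

def pivot(rows):
    """Pivot to one record per (patient, visit); raise on key collisions
    rather than silently overwriting one of the colliding values."""
    wide = {}
    for patient, visit, test, value in rows:
        key = (patient, visit)
        if test in wide.setdefault(key, {}):
            raise ValueError(f"duplicate result for {(patient, visit, test)}")
        wide[key][test] = value
    return wide

try:
    pivot(results)
except ValueError as exc:
    print(exc)   # the collision is surfaced rather than hidden
```

A naive pivot would keep whichever of the two ALT values happened to be processed last, which is exactly the kind of silent corruption that careful database design guards against.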
Next, a set of principles for managing laboratory data are described
including the need to be able to store inconsistent data. Again by example,
the difficulties resulting from one or more erroneous identification fields
are explored in detail.
Another issue affecting the processing of laboratory data concerns the
need to process unscheduled or unexpected laboratory results. In prac-
tice, when repeat test results occur two choices are available to resolve
this situation—either programs are written to selectively add new values
or the database is designed with placeholders in such a way that any
number of results can be accommodated. Tollenaere emphasises the need
for a ‘good correction system’ to ensure an adequate audit trail and offers
two alternatives—one where all changed values are held and another
which retains only the original and the most recent record. He argues
that the former method offers greater advantages.
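A correction system of the first kind, in which all changed values are held, can be sketched as follows; the record layout and field names are assumptions for illustration only.

```python
from datetime import datetime

class AuditedValue:
    """One data point with its full correction history: every change is
    appended, never overwritten, giving a complete audit trail."""

    def __init__(self, value, who, when):
        self.history = [(value, who, when)]

    def correct(self, value, who, when):
        self.history.append((value, who, when))

    @property
    def current(self):
        return self.history[-1][0]

    @property
    def original(self):
        return self.history[0][0]

alt = AuditedValue(300, "entry_clerk", datetime(1999, 5, 1, 9, 0))
alt.correct(30, "data_manager", datetime(1999, 5, 3, 14, 0))  # decimal slip fixed
print(alt.original, "->", alt.current, f"({len(alt.history)} versions)")
```

Retaining the full history makes the audit trail self-contained: the original value, the current value and every intermediate correction, with who made it and when, can all be reconstructed from one structure.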
In conclusion, the author reminds the reader that understanding the
prerequisites of managing laboratory data and careful consideration of
database design issues can greatly improve the speed and efficiency with
which the data can be processed and reported. This statement is all the
more striking when one appreciates the greatest processing in any trial
concerns the processing of laboratory data.

COMPUTER SYSTEMS (Palma)

Increased computer literacy, advances in technology and a desire to accel-
erate drug development and approval have encouraged both industry and
regulatory authorities to invest in new database structures and state-of-
the-art computing infrastructure. In this chapter, Palma presents an over-
view of current thinking in clinical systems design and focuses on the
growth and potential of a number of commercial clinical data management
systems available.
A necessary first step in defining a new clinical data management sys-
tem is to conduct a present state analysis and establish criteria which will
allow the organisation to put forward clearly defined objectives and to
define a process whereby the selection and evaluation of candidate prod-
ucts can be conducted. Objectives should be multidisciplinary and incorp-
orated into a requirements statement from which standards, strategy and
functionality and the potential for integration with other (internal and
external) systems can be derived. The author leads us through a stepwise
process and provides guidance for determining technical and operational
specifications and for conducting acceptance testing. User acceptance
test plans should specify the need to perform a ‘gap analysis’, how feedback
will be incorporated, a transition strategy, and how training will be given.
The features of three of the more popular commercial systems (Clintrial
4, DLB, Oracle Clinical) are discussed in detail. The author comments that
many commercial database systems have been upgraded to include docu-
ment management needs, provide links to CROs and accommodate data
from external sources.
A product review is also provided of commercial workflow systems and
associated technology solutions designed to allow the receipt and track-
ing of data from different sources. In general these systems are based on
varying combinations of fax, scanning and image applications, some of
which are able to integrate CRF data directly into a clinical database. A
useful table contrasting these different options is provided.
Palma concludes that if data management systems are to be fully effec-
tive, they must be integrated with workflow and document management
systems and with data from other sources. Such systems require the
necessary time to plan, develop and implement and require attention to
training needs.

SYSTEMS VALIDATION (Hutson)

Historically, data management systems and data managers placed greater
emphasis on data validation than on systems validation. Only within
the past five years has this emphasis been reversed, driven in part by
greater reliance on computer systems and technology to replace manual
tasks and partly by regulatory guidance which now requires systems val-
idation to be demonstrated. Companies must have in place both internal
validation steps as well as a procedure for responding to external regula-
tory inspections. Regulatory guidance is primarily contained in the EC
GCP guidelines and in ICH E6. In this chapter, Hutson reviews the current
validation environment governing clinical systems, including rationale for
developing a validation policy, and a description of what regulatory audi-
tors may request.
From a regulatory perspective, organisations should develop SOPs and
policy statements, and record meeting materials and minutes in such a way
that inspectors can easily and quickly determine how compliance has
been achieved. A useful reference in this regard is the joint PSI/ACDM
guideline on Computer Validation published in 1997. It is not, however,
sufficient to have SOPs in place for each protocol or system component.
Rather organisations need to demonstrate that they have developed and
indeed implemented a validation policy covering roles, personnel, train-
ing, and organisation and implementation of QA programs.
Retrospective validation generally applies to older systems which pre-
date the more recent guidance referenced above. In such cases, it is a
question of ‘filling the gaps’. Retrospective validation begins by establish-
ing an inventory of systems components and configurations, documenta-
tion, and a historical perspective on system use. Since not everything can
be included, a risk assessment must be conducted taking into account
resource availability and timeframe. In extreme circumstances, it may
even be necessary to take the system out of commission for a short time.
The key steps involved consist of planning, validation, reporting deficien-
cies, correcting defects and revalidating the system.
Prospective validation is an inherent part of systems development
methodology and is commonly presented within the context of a Systems
Development Life Cycle (SDLC) model. The classic SDLC approach is illus-
trated in the text and provides a useful distinction between validation and
verification. Testing provides evidence that inputs and outputs are pro-
cessed correctly and particular mention is made of the need for integra-
tion and stress testing to ensure scalability.
The chapter includes a case history concerning the steps taken by a
development team in validating a Phase II/III clinical database. The discus-
sion details the data characteristics, the use of software packages, and
highlights the complexity of validation in a multidisciplinary setting. The
author identifies tips to aid in this process, including reference to sponsor
SOPs, a clear definition of scope, and a component analysis with an assess-
ment of risk for each component.
Hutson concludes by echoing a theme repeated elsewhere in the book
that as we have moved to integrated CDM systems, so the emphasis has
shifted from data to systems validation. Such systems are complex, must
meet regulatory requirements and strike a balance with good business
practice. As a consequence, systems should be more reliable, develop-
ment costs and implementation costs will be reduced and overall product
and process quality enhanced.