Here, Large, and Not Going Away: Data Integrity
March-April 2016
Reprinted from Pharmaceutical Engineering, the official technical magazine of ISPE
www.PharmaceuticalEngineering.org
Why is data integrity so important? Karen Takahashi, a senior policy advisor at the US Food and
Drug Administration (FDA), summed it up in three key points during her presentation at the
ISPE/FDA/PQRI Quality Manufacturing Conference, June 1-3, 2015, in Washington, DC:
First, regulatory agencies, as well as industry, rely on accurate information to ensure drug
quality. If the information associated with a drug product is not accurate or reliable, there is
no way a company can ensure the safety and efficacy of their product for the patient.
Second, data integrity problems break trust between industry and regulatory agencies. The
regulatory agencies are not and cannot be responsible for ensuring the quality of our products.
They are not our quality organization. If they find compliance gaps, regaining trust can be a very
costly and time-consuming task.
Third, regulatory agencies rely largely on trusting the firm to do the right thing when the
regulatory agencies are not watching. Regulatory agencies have limited resources and they
cannot be present at every site which produces drug products. As stated earlier, they are
not our quality organizations; it is our responsibility to act as an ethical company and ensure
patient safety.
Data integrity is a global regulatory and compliance expectation, as seen by the increased
data integrity rigor by the FDA and guidance by the MHRA and the WHO. Global regulatory
agencies are becoming more aligned around these expectations. What can data integrity
problems mean for your firm? They can result in recalls of products, regulatory citations,
import alerts, injunctions, seizures, application integrity policy invocations/legal action, and
most concerning, patient harm. It is as much a compliance issue as it is a financial issue.
Key implementation considerations for a corporate data integrity program include development
of a high-level strategy, identifying and gaining executive sponsorship, focusing on management
accountability, implementing tools for knowledge sharing, and developing and providing
the appropriate levels of training. It is imperative that your data integrity program addresses
behavioral factors and drives a strategy that focuses on prevention, detection and response. And
be prepared to implement a plan for continuous improvement. This is an issue that’s here to stay.
Christopher Reid
ALCOA+: the desired state
Attributable: Who performed an action and when? If a record is changed, who did it and why? Link to the source data.
Legible: The record must be in a durable medium and be readable.

Throwing People into the Works
A perfect example of this can be found in an April 2015 US Food and Drug Administration (FDA) Warning Letter:1

[T]he analyst at your firm altered the file name in the spectrophotometer containing the sample identification information for (b)(4) API lot # (b)(4), tested on April 2, 2014, to support the release of two previously manufactured lots, # (b)(4) and (b)(4). . . . This practice is unacceptable and raises serious concerns regarding the integrity and reliability of the laboratory analyses conducted by your firm.

This statement clearly indicates an analyst deliberately falsified a result in a computerized system. (It should be recognized, however, that while some GxP data changes may not be the result of intentional falsification, they also lead to data-integrity issues.)

The importance of leadership

Management responsibilities
ISO 9001:2015 2 clearly identifies one of the key roles of management: ensuring the availability of resources. This is reaffirmed in many, if not all, GxP regulations around the world.

Applying this requirement to data integrity, management must:

¡ Provide sufficient competent people to complete the assigned tasks: Overworked people may feel pressured to maximize yield or productivity at the expense of data integrity.

¡ Provide sound, reliable equipment and instrumentation for production and quality personnel to achieve the expected throughput: Outdated equipment may neither provide the technological controls for data integrity nor produce accurate data. Frequent equipment downtime can increase pressure on the staff to seek alternative ways to keep up with their workload.

¡ Maintain the facilities and operating environment in a fit state for their intended purposes: Lack of physical security and poor IT infrastructure can themselves jeopardize data integrity by allowing unauthorized access to a server room, for example, or by losing data from a local hard drive.

These responsibilities are in addition to providing leadership in all matters of data integrity and compliance, as effective executive leadership is a critical component in maintaining a high level of data integrity. A corporation must emphasize the importance of data integrity to the organization through word and action, including embedding the quality requirements within the business process.

Executive leadership must encourage right behaviors by prioritizing data integrity when setting objectives, performance targets and incentives.

Leadership should drive a strategy that focuses on prevention, detection and response. The priority of effort for prevention should be greater than the priority of effort for detection; effort for detection should be greater than effort for response. This translates into:

¡ Select, install and configure systems that are capable of providing the technical controls essential to protecting data integrity, such as unique accounts, granular privileges and audit trails. (A more comprehensive discussion on technical controls and data integrity by design can be found in "An Ounce of Prevention.")

¡ Ensure that effective review processes are in place to detect any data-integrity issues throughout the operational life. (Detailed information on results review, audit-trail review, periodic review, data audits, etc., is covered in "Big Brother Is Watching.")

¡ On detection, ensure that the preventive actions implemented reduce or eliminate data-integrity risks by technical or design controls (preferred) and by influencing human behavior. (This is discussed in "Doing the Right Thing.")

Leadership must first accept that there have always been – and always will be – data-integrity issues on some level. Investigating and understanding the existing data-integrity issues within an organization is a strong foundation from which to begin the process of reducing such issues.

The MHRA Data Integrity Definitions and Guidance states the objective as being to "design and operate a system which provides an acceptable state of control based on the data integrity risk, and which is fully documented with supporting rationale."3 Once a system with inherent controls has been put in place, detection is the next essential safeguard against the daily threats to data integrity. The reporting process for data-integrity problems must be understood from the top level all the way down to the line operators, and it must come with immunity from management censorship or retribution.

Metrics
Poorly chosen metrics can undermine integrity by encouraging the wrong behaviors and potentially providing the "pressure" element envisaged by Donald Cressey in his hypothesis on fraud4 and pictorially represented in the "Fraud Triangle" (see Figure 1).
Figure 1: The fraud triangle (pressure, opportunity, rationalization)

Redefining the metric as the number of passing samples in a time period, however, may provide substantial motivation for the analysts to make samples pass by whatever means they can in order to return a high efficiency, especially if there is potential for a pay rise or promotion linked to this.

A carefully chosen metric may involve the number of samples analyzed in a time period, but it would also need to factor in any incorrect test results as detected by second-person review or even repeat testing as part of an investigation.

Falsification for profit is discussed in more detail in "Doing the Right Thing," as is the use of positive metrics linked to rewards.
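To make the contrast between the two metric designs concrete, here is a small sketch. It is illustrative only: the data structure, field names and numbers are assumptions for the example, not taken from the article.

```python
from dataclasses import dataclass

@dataclass
class AnalystMonth:
    """Hypothetical monthly figures for one analyst."""
    samples_analyzed: int
    samples_passing: int
    errors_found_in_review: int  # from second-person review or investigations

def throughput_only_metric(m: AnalystMonth) -> float:
    # Rewards the volume of passing samples; can pressure analysts to "make samples pass".
    return m.samples_passing

def balanced_metric(m: AnalystMonth) -> float:
    # Factors review findings into the score: samples analyzed x right-first-time rate.
    if m.samples_analyzed == 0:
        return 0.0
    right_first_time = 1 - (m.errors_found_in_review / m.samples_analyzed)
    return m.samples_analyzed * right_first_time

month = AnalystMonth(samples_analyzed=120, samples_passing=118, errors_found_in_review=6)
print(throughput_only_metric(month))  # 118
print(balanced_metric(month))         # 114.0
```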
Cultural considerations
Cultural considerations can refer to a corporate culture (that is, the paradigm within which an organization operates) or a geographic culture (the moral and behavioral norm within a particular country or region).

Corporate culture
Corporate culture can vary widely, from a family-owned private company to a publicly traded corporation with an independent board of directors that comprises leading industry figures and subject-matter experts.

From a regulatory perspective, there is no difference: The expectation for data integrity and product quality remains the same. The publicly traded corporation's directors should consider the impact of any company activity on their individual industry reputations. It should be noted, however, that a larger corporate business may suffer from:

Geographic culture
Even in today's global society, geographic culture has a significant impact on site operations. There are many published works on geographic culture available; some of the cultural classifications in this section were taken from The Culture Map, by Erin Meyer.5

Cultures based on an egalitarian style with consensus decision making – as found, for example, in Scandinavian countries – may have a natural advantage in promoting data integrity. Openness and a willingness to discuss difficult situations can support an environment where failing results are seen as a group problem to be resolved with clearly documented corrective actions that mitigate the manufacturing or other root cause. Similarly, people from cultures that tend toward direct negative feedback, such as in the Netherlands, will likely feel comfortable escalating an issue through the management structure.

In a more hierarchical society, especially one that intuitively uses indirect negative feedback, as might be found in highly traditional cultures like Japan or China, reporting an out-of-specification result could be seen as either a personal failing on the part of the analyst or even an implied criticism of the manufacturing department. Such cultures will have to invest significant effort to consciously overcome traditional thinking in order to achieve the openness around data integrity that is needed for compliance.
Human error
"Doing the Right Thing" focuses on intentionally fraudulent actions that undermine the integrity of data; it is, however, important to recognize that such actions are thankfully in the minority and that data is more often affected by genuine human error.

McAuley proposes moving from the current and pervasive mindset that human errors should be dealt with by "reprimanding, retraining, adding extra lines to SOPs, and thinking people just need to read them" to a paradigm based on openness and a real understanding of people and behaviors and ultimately to a corporate culture where "individuals who try to hide, ignore, or respond inappropriately to perceived human errors are not able to exist in the business."

The monitoring of human-error rates can be a powerful indicator of the company's error culture, with a consistently high incidence of error changing little over time showing that mistakes are accepted as inevitable with no effort made to improve working practices.

Use people only for their strengths: Humans are very effective at monitoring multiple systems simultaneously, whereas it would require a highly complex automated system to achieve the same monitoring function. The data in Table A, however, shows that humans are naturally poor at manual data entry, so this should be avoided by implementing the direct interfacing of equipment and automated transfer of data.

Limit opportunities for human error: Use drop-down lists in place of free text entry, for example, so that searching for a particular product name will not fail due to a spelling error.

Interestingly, more recent data from Potter9 seems to suggest that entering data in a more critical system – in-flight management, for example – does not lower error rates, as one might expect given the perceived importance of the situation; it can actually give a worse error rate than situations without such pressure. Alternatively, the increased error rate could be attributed to less accurate keyboard input from users accustomed to word processing and spell-checking to correct errors compared to the necessity for high accuracy among professional typists using manual typewriters in the earlier studies (although spell-checking itself can create errors when it "corrects" a word erroneously and thus changes the meaning of the statement).
Table A: Selected error rates in data entry (*error rate as detected by second-person review)

Scenario | Error rate* | Researcher, date
Expert typist | 1% | Grudin, 1983
Student performing calculator tasks | 1-2% | Melchers and Harrington, 1982
Entries in an aircraft flight management system, per keystroke; higher if heavy workload | 10% | Potter, 1995

Table B: Selected error rates in spreadsheet development (*percent of spreadsheets with detectable errors)

Summary | Error rate* | Auditor, date
50 spreadsheets audited; 0.9% of formula cells contained errors that would give an incorrect result | 86% | Powell, Baker and Lawson, 2007
7 spreadsheets audited | 86% | Butler, 2000
22 spreadsheets audited, only looking for major errors | 91% | KPMG, 1998
While it may not be feasible for companies to audit all of their data entry in such a formal and controlled fashion using an outside company, careful tracking and trending of the findings from properly conducted root cause investigations should be able to provide some measurable metric around the incidence of human error within the company. This metric can then be monitored to measure the efficacy of data-integrity activities as part of the company's ongoing commitment to quality.

When discussing the incidence of genuine human error, it's important to note that a regulator does not distinguish between human error and data falsifications when assessing the impact of a data-integrity failure.
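One way to picture the tracking and trending described above is a simple calculation of human-error findings per records created, by month. The record layout and figures below are hypothetical assumptions for illustration, not data from the article.

```python
from collections import defaultdict

# Hypothetical investigation records: (month, root-cause category)
investigations = [
    ("2016-01", "human error"), ("2016-01", "equipment"),
    ("2016-02", "human error"), ("2016-02", "human error"),
    ("2016-03", "no assignable cause"),
]

# Hypothetical count of GxP records created per month (the denominator).
records_created = {"2016-01": 5200, "2016-02": 4800, "2016-03": 5100}

def human_error_rate_by_month(invs, totals):
    """Return human-error findings per 1,000 records created, by month."""
    errors = defaultdict(int)
    for month, cause in invs:
        if cause == "human error":
            errors[month] += 1
    return {m: 1000 * errors[m] / totals[m] for m in sorted(totals)}

print(human_error_rate_by_month(investigations, records_created))
# e.g. {'2016-01': 0.19..., '2016-02': 0.41..., '2016-03': 0.0}
```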
Implementing a Corporate Data Integrity Program
A well-defined strategy is the cornerstone of any data-integrity program. To design and implement a successful program you must have a keen understanding of your current state of affairs and business process knowledge; you must also make sure that those processes support your data-integrity requirements.

The assessment activities outlined below can serve as a basis for defining and establishing your strategy. The high-level plan presented here will define the approach, timeline, resource requirements, and rationale required to execute your data-integrity program. It may also provide a means to track progress for senior management reports, as well as a documented rationale and plan to outline your program and actions during audits and inspections. Finally, it outlines a method that can help align multisite activities and provide a holistic approach to compliance.

At a minimum, a well-defined strategy demonstrates your commitment to managing data-integrity issues within your company and creates a corporate governance oversight process.

Identifying and establishing executive sponsorship is crucial to getting support for your data-integrity program. The sponsor is responsible for the program's overall success, and will be required to set direction, define priorities, provide resources and break down organizational barriers. The sponsor will also help executives be aware of the four key benefits that a data-integrity program can deliver: financial, risk reduction, regulatory, and legal product liability.

What are the critical success factors?

Management accountability
While a successful data-integrity program requires cross-functional oversight and participation, management accountability at all levels of the corporation – from the CEO to operations floor supervision – plays a key role in ensuring data integrity. Managers should "walk the talk" and personify integrity in response to a failure. They should foster an environment in which employees are encouraged to identify and report data-integrity issues on the shop floor. They should never incentivize data falsification and should always discourage the "wanting-to-please" mentality that can lead to data corruption.

Accountable managers also provide the appropriate resources to ensure data integrity – including people, capable instruments and systems, along with sound and understandable business processes. They acknowledge that data-integrity issues will occur, and that human error contributes greatly to data integrity issues. And they drive a strategy that focuses on prevention, detection and response.

Knowledge sharing and training
As you roll out your data-integrity program, sharing and addressing a number of questions will help build a good data-integrity foundation across your organization. These include, but are not limited to:

¡ What does data integrity mean and how does it apply to my day-to-day business activities?
¡ What role do equipment qualification and computerized system validation play in data integrity?
¡ How does data integrity relate to 21 CFR Part 11 and EU GMP Annex 11?
¡ What are our roles and responsibilities? What are those of the regulatory agencies?
¡ When does data integrity start and when does it end?

It's important to make information readily available to all levels of the organization. Employees from the executive suite to the shop floor should have appropriate levels of knowledge and accountability about data-integrity requirements and expectations.

Establishing a data-integrity knowledge repository or knowledge base is a great way to provide historical and current information. Consulting subject matter experts both within and outside of your organization early in the process is crucial to establishing an appropriate knowledge foundation.

Data integrity should be inherent to your processes, so that it can provide a foundation for more focused training. Data handlers should be trained to understand that they are data-integrity stewards. They should understand the business processes and the data they generate. They are responsible for identifying and escalating concerns regardless of the effect on delivery, quotas, or timelines. Those in quality and compliance roles should have advanced training to ensure that data-integrity requirements are implemented within systems and processes, and that they support the business processes and business owners.

Are your controls in place?

Quality management system
Data integrity and data governance are an integral part of your quality system. It's appropriate to start with organizational and procedural controls, therefore, when designing a data-integrity program.

Does your quality management system (QMS) adequately address the regulatory requirements associated with data integrity? An assessment will identify any procedural controls that might be lacking. Do adequate processes exist within the QMS to prevent, detect, report, and address data-integrity failures? Are the ALCOA+ requirements clearly addressed within the QMS? Are there adequately defined processes to generate and review data? And are there proper controls for the entire data life cycle? If you have a good and well-defined corporate QMS aligned with current GxPs, most of these items should be addressed and traceable to the appropriate regulation applicable to your business processes.

Organizational gaps are more likely to be identified as sites and local business areas define and execute their local procedures; however, a more detailed gap assessment may be required to truly understand the state of data-integrity controls in place at this level.

Corporate quality culture
This leads to another control you should assess and understand: corporate and quality culture.

Just as behaviors can promote appropriate actions and foster an environment that champions integrity, the opposite is equally true: Cost-saving measures may encourage password sharing due to limited user license purchases; poorly conducted investigations may blame human error or find no assignable cause. Changing a standard operating procedure (SOP) may be proposed as a preventive action, but all too often it can be ignored and not truly address the root issue.

Poorly chosen metrics can also undermine data integrity. Metrics that encourage pressure, opportunity, and rationalization can support fraudulent practice and may encourage data-integrity issues. Emphasizing speed rather than accuracy and quality, for example, can force employees to cut corners and focus on the wrong things.

Technology
As with organizational controls, you must also assess technical controls, which include your equipment and computer systems. Are these properly qualified and/or validated to ensure data integrity? All too often, systems are not qualified, designed, or configured to ensure data integrity. System access and security should be properly defined and audit trails properly utilized to review, detect, report, and address data integrity issues.

Compliance
Understanding how organizational and technical controls are executed and applied in your business processes is critical. An audit or self-assessment process should monitor compliance with your QMS and the regulatory requirements of your business. A quick measure of data-integrity compliance can be taken with a review of the self-assessment, internal audits, and third-party reports and observations associated with these activities. What types of data integrity issues exist? Are there repeat findings related to data-integrity issues? Are there systemic issues and do they stem from a corporate or quality culture issue?

Of course it is only possible to review this data if these self-assessment and audit processes are designed and able to identify data-integrity risks and gaps. They should utilize forensic audit techniques and focus on data-integrity compliance issues. This will be critical to the long-term monitoring and overall effectiveness of your program; it will also help ensure you are identifying and addressing data-integrity issues before regulatory inspections find them.

If you are fortunate enough to have received an inspection visit from a regulatory agency that has implemented forensic data-integrity inspection techniques, you will be able to use the results of that visit as yet another indication of your acceptable state of control of data-integrity risks. Otherwise, a review of regulatory observations from other companies can …
The result review should not overlook the audit trail review,1 which provides the most effective means of assessing data integrity. Unfortunately, in some cases the audit trail is not easily accessible or permanently associated with the result, making the review difficult to complete and data-integrity issues difficult to detect.

Appropriate and accessible audit trails can prevent and detect data-integrity issues, but reviewing the audit trail and metadata associated with the volume of results generated in today's business processes can present logical and resource challenges. Technology controls implemented within many systems, however, have provided a means to review by exception. This applies a risk-based methodology to data review based on alerts highlighting a subset of results that require additional detailed review; these may be results and data that are within but close to the specification limit, have been manually manipulated (i.e., integration), or have been reprocessed. These types of systems also require validation to verify and document the alert functionality.

Periodic reviews: Computer systems require periodic reviews to ensure they continue to operate in a manner consistent with their intended use and remain in a validated state consistent with that use. GAMP® 5 is a great resource that outlines the concepts of periodic review. From a data-integrity perspective, periodic review should include evaluation of any changes to system configuration that could affect data integrity. It should also focus on any data deletions, including what was deleted, why, and by whom. In addition, the review should target system administration activities and user accounts, especially accounts disabled following unsuccessful login attempts.

Other periodic review activities include SOP review to ensure that appropriate data integrity controls are addressed, system validation records are current and reflect the intended use of the system, required SOP records are maintained, the change control process is functioning properly, and system performance is not affected negatively by the intended use of the system. ¢

Michael Rutherford

An Ounce of Prevention

The administrative and technical controls needed to mitigate risks to data integrity prove Ben Franklin's maxim that "an ounce of prevention is worth a pound of cure."

Computerized systems' functionality is based on a combination of hardware, software, processes, personnel and environment. When such systems are used for the collection, storage, sharing, use and archiving of regulated data, the following guiding principles will apply:

¡ Data should be collected, stored, shared and used only for legitimate business purposes.
¡ Data should be collected, stored, shared and used in a secure manner.
¡ Any data that is to be shared externally must be transferred by secure means.
¡ Active, responsible data stewards should be assigned to all critical data.
¡ Users should only have access to the data needed to do their jobs and
should be granted access levels commensurate with the requirements
of their jobs.
¡ Data, as well as any associated metadata that provides content and
meaning to the data, must be retained for the relevant retention period.
¡ Implementing time synchronization and time-clock security controls
¡ Assuring that vendor-provided software is maintained at a release that is supported by the vendor. This ensures that the latest security patches and service packs can be applied as soon as reasonably possible to close known security risks.

Contracts and other arrangements
From contract manufacturing or laboratory services to outsourcing IT or using "as a service" options such as SaaS, PaaS or IaaS (software, platform, or infrastructure as a service), these service providers have a potential impact on data integrity that must be evaluated, controlled, and mitigated. From the providers' willingness to be audited through the completion of audits or assessments prior to supplier selection, and throughout the ongoing service engagement, data integrity should be an area of absolute focus. Both IT controls and business processes must be reviewed to ensure that appropriate controls are in place to guarantee data integrity. Establishing clear requirements related to data security and integrity in the contract and/or quality agreement provides a baseline for ongoing monitoring to ensure that expectations will be met. Record-retention periods and access requirements (including system availability) must be clearly defined and achieved.

Documentation and data management
The process used to store and manage documentation and data and the repository in which the data resides can have a significant impact on data integrity. Documents or other types of data files stored and managed in a regulated electronic document management system or other validated electronic-record computer system that leverages a relational database can be controlled on a much higher level than documents or other types of data files managed in a file share on a server using a manual process. This is explicitly supported in the MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015,1 which states:

There is an inherently greater data-integrity risk with flat files (e.g., when compared to data contained within a relational database), in that these are easier to manipulate and delete as a single file. [Emphases in the original]

Paper-based records
It should be noted that while paper-based records have been used for much longer than electronic records, they share many of the same concerns, such as how to ensure the record remains:

¡ Legible: Considerations around fading ink and the well-known issues with thermal printouts
¡ Available: How to protect the record during long-term archiving: Will the paper degrade? Is it threatened by moisture or pest species?
¡ Retrievable: How to quickly locate one record among the many thousands retained; advantages and disadvantages of onsite vs. offsite storage

Paper records, of course, have an additional issue in that they lack the independent audit trail that can accompany an electronic record; this means that it is not possible to identify record backdating, repeat testing of the sample, or whether all results have been retained.

For certain records, there is clear regulatory guidance that the electronic record must be retained, as the paper record is not sufficient. The US Food and Drug Administration (FDA), for example, makes a very clear statement about chromatographic data:

For High Performance Liquid Chromatography (HPLC) and Gas Chromatography (GC) systems (and other computerized systems involving user inputs, outputs, audit trials, etc.), the predicate rules, such as 21 CFR 211.68 and 21 CFR 211.180(d), require the electronic records themselves to be retained and maintained in accordance with those regulations. . . . [T]he printed chromatograms used in drug manufacturing and testing do not satisfy the predicate rule requirements in 21 CFR Part 211. . . . [T]he electronic record must be maintained and readily available for review by, for example, QC/QA personnel or the FDA investigator.2

Technical controls
Technical controls should be introduced to mitigate the risks associated with human actions in the design of the original system. However, because technical controls are designed, tested, and implemented by humans, it is important to recognize that the controls may themselves have design flaws.

Computerized systems are often generalized as IT systems or software solutions. In fact, a computerized system is not limited to software but should be considered as a business process supported by the use of IT solutions. The key here is that business process comes before the technical solution – never the other way around.

Data integrity by design
The integrity of regulated data should be safeguarded in three spaces: during collection and processing, when transferring between systems and in storage.3 Evaluating the risks to data integrity at each stage of the data flow in a business process can identify opportunities to improve data integrity by intelligent system design; transcription errors, for example, can be eliminated from a workflow by directly interfacing the source and target systems such that the data is transferred electronically using a validated process. Transmission security across an open network can be strengthened by using integrity controls such as a checksum and encryption processes. Highly critical data editing or deletion functions can be additionally secured and justified using transactional safeguards such as password authentication at the time of execution, and the recording of an explanation for the action via free text or (preferably) a preconfigured reason. A user interface that highlights potential data-integrity issues, such as manually integrated results or repeat samples, assists by focusing review efforts on the results with the highest risk.

Technical controls have an important advantage over human controls. The repeatable and reliable behavior of any validated IT system (whether it is a distributed clinical database or a manufacturing execution system) can be designed, tested, operated, and maintained in such a way that data integrity is ensured and well documented. However, it is also true that even the best systems – in terms of implementation, efficiency, and quality – could not ensure data integrity without qualified data stewards. Introducing humans to a validated IT system creates a more complex and unpredictable interaction.

Ensuring data integrity in a GxP system is extended, but not guaranteed, by CSV. It may be necessary to accommodate vendor solutions that have data-integrity gaps in the technical controls; these must be mitigated as part of the validation process. CSV only ensures that a system is fit for its intended purpose; it cannot absolutely prevent data-entry error or intentional misuse of the system.

Raw data: Created in real time, this is all data on which quality decisions are based.4 Raw data files can be unstructured, and are often based on a proprietary, vendor-defined format.

Metadata: This "data about the data" is used for cataloging, describing and tagging data resources. Metadata adds basic information, knowledge and meaning, and helps organize electronic resources, provide digital identification, and supports resource archiving and preservation.

Examples of falsification practices include:

¡ Multiple analyses of assay with the same sample without adequate justification
¡ Manipulation of a poorly defined analytical procedure and associated data analysis in order to obtain passing results
¡ Backdating stability test results to meet the required commitments
¡ Creating acceptable test results without performing the test
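As a minimal sketch of the checksum-style transfer control mentioned under "Data integrity by design" above, the following compares file hashes before and after an electronic transfer. The file names and the choice of SHA-256 are assumptions for illustration, not requirements stated in the article.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_transfer(source: Path, received: Path) -> bool:
    """Return True if the received copy matches the source checksum."""
    return sha256_of(source) == sha256_of(received)

# Hypothetical usage: compare an instrument export with the copy landed on the archive server.
if __name__ == "__main__":
    ok = verify_transfer(Path("hplc_run_001.raw"), Path("/archive/hplc_run_001.raw"))
    print("transfer verified" if ok else "checksum mismatch - investigate before use")
```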
Workflow complexities
Simplifying scientific processes would significantly reduce challenges in data integrity. Although our industry is trying to harmonize scientific processes, other regulated industries are ahead in this field. There are, however, signals that our industry is recognizing the need. For example, balance and titrator suppliers have increased the value of their instruments.

Integrating LIMS processes with enterprise workflows can also significantly reduce the probability of data-integrity issues. Process harmonization will initially increase the validation burden, but the effort will pay off in the long term, significantly reducing the amount of potential data-integrity failure points and boosting efficiency for laboratory staff and management.

The final example is mapping the entire laboratory workflow and related operations, from sample receipt to release of results, which can consolidate operational workflow. The net effect may significantly reduce validation effort and decrease data integrity risks.

Figure 3: Benefits of open data standards

One serious concern is the lack of data standards in the scientific community. Without standards, data integrity will remain challenging and auditing and verifying is an expensive exercise. ¢

Peter Boogaard
Big Brother Is Watching

Corporate training can be considered a "human" control for preventing data-integrity problems. Reinforce "right behavior" with ongoing training and monitor effectiveness with review processes.
The foundation for a high level of data integrity is knowing and understanding what data integrity is, the importance it has for a company and the personal role each employee has in protecting it.

Companies should also recognize regulatory authorities' increasing awareness and expectations for data integrity. While this is not new, the focus on and approach to managing and inspecting it are changing. As technologies, electronic systems and business models modernize, the industry must understand how to manage data in a changing environment.

Data-integrity training
Data-integrity problems can affect a company's reputation and profitability. To avoid the problems associated with data-integrity breaches, a "speak up/quality first" culture must be endorsed by company management, as we discussed in "Throwing People into the Works," and data-integrity training should be implemented from senior executives down to the line-operator level.

At the line-operator level, data integrity should be inherent in the process and not compromised to meet delivery timelines. Key data handlers should be formally trained to understand their roles in maintaining the integrity of the data they handle: They are data stewards, responsible for highlighting and escalating any concerns about data and quality regardless of the effect on production quota or deadlines. Training should not only ensure a common understanding of data, data integrity, falsification and data life cycles, but should also emphasize electronic good documentation practices, also referred to as good data management.

Foundational data-integrity training is only part of the bigger data-integrity picture, however. An additional, deeper understanding of technical expectations and requirements, inspection and auditing techniques and process governance are required to establish holistic data integrity for those with data steward or quality assurance (QA) responsibilities.

It is a regulatory expectation that companies understand their data life cycles and how data flows through their processes and systems. Personnel in roles that own these processes and systems (such as business-process and system owners) must understand their responsibilities in maintaining data integrity. These could include:

¡ Understanding how and where the data is used and its effect on product quality and patient safety
¡ Knowing what other review processes and data stewards are involved in each data flow, particularly those downstream of the system
¡ In-depth knowledge of system functionality with the most potential impact on data integrity and how to detect such activity

Personnel in QA and compliance roles must also have an advanced understanding of data-integrity requirements to ensure that these requirements are implemented within the systems and processes, and to support the business-process and system owners.

A corporation's data-integrity training program should be both general and specific. It should target the correct audiences and consider the specific scale of the corporation. In a large pharmaceutical company, high-level training for all employees might be at a foundational level, but the content and focus may be quite different for different functions. (The consequences of a data-integrity issue will be very different for a line operator compared to the operations director, for example.) This training approach might be ineffective for small and/or startup companies, however. In those cases it might be more effective to roll out both foundational and detailed training simultaneously.

Training on the general principles of data integrity could be complemented by more detailed, contextual training appropriate for data stewards who play a direct role in data handling. The specific training provided for such persons (including quality and compliance personnel) must extend beyond the general requirements and definitions of data integrity. This role-based training should focus on critical thinking and auditing techniques and could include specific-use cases related to the roles. Data-integrity training for laboratory auditors and process owners, for example, might include a comprehensive review of US Food and Drug Administration (FDA) warning letters that describe data-integrity observations in laboratory settings and practical exercises around examining audit trails.
Review processes
People might cause data-integrity problems, but they are also superior to machines when it comes to detecting integrity issues. Software applications can generate an audit trail, but only a human can decide "Was that integration parameter change a scientifically valid one?" For this reason, review processes remain in the human domain. Review processes can be discrete or continuous, one-off or repeated, and scheduled or unscheduled. In the sections below, different types of review processes and their timing are discussed.

Result review
Result review is defined here as a review of individual results, or sets of results, that is done prior to making the accept/reject decision about the product or data quality. To make that decision effectively, it is essential that the result review:

¡ Compares the result against specifications/limits
¡ Evaluates the completeness and correctness of the metadata supporting the result
¡ Determines the accuracy and integrity of any manually entered values
¡ Reviews any decisions or actions taken
¡ Understands any manual adjustment or alteration of the data or metadata
¡ Investigates any changes to the method versions used in the creation of the result
¡ Assesses conformity to sound scientific practice and documented procedures

Where there is a data audit trail that is easily accessible and permanently associated with the result, a review is likely to be the most effective route to assessing the integrity of the data. The MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015 1 states that:

Audit trail review should be part of the routine data review/approval process, usually performed by the operational area which has generated the data (e.g., laboratory).

Result review should involve increased rigor of focus for results that involve manual adjustment and/or "just passing" results; an application offering the ability to highlight such results automatically provides an additional level of efficiency and assurance and may allow for the review-by-exception approach to data review.

Review by exception
Review by exception applies a risk-based approach to data review. In an environment where hundreds or even thousands of results are generated daily, if an equal amount of time is devoted to reviewing each result, by simple mathematics that amount of time will be very small. For just 100 samples, even spending as little as 2 minutes per result means more than 3 hours' review time daily from each reviewer on those 100 samples – and more than one level of review may be required. Realistically, it is just not possible to review each result and its history effectively in 2 minutes.

Where the process or application permits, review by exception creates alerts to highlight a subset of results requiring additional effort, such as those:

¡ Within but close to the limit of the specification
¡ That have been manually integrated
¡ Where manually entered critical data have been changed
¡ That have been reprocessed

A detailed result review (as discussed above) is then conducted on this subset of results to understand what has been changed and why in order to decide whether to approve or reject the results. The remainder of the results, where the result is well within specification and no changes or adjustments have been made, can then be approved with a minimal level of review. A company wishing to operate review by exception has the responsibility to determine and document what that minimal level of review is, and to justify it during a regulatory inspection. Some level of validation will be required to document and verify the alert functionality.
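The alert logic behind review by exception can be pictured with a short sketch. The thresholds, field names and flagging rules below are illustrative assumptions and are not taken from the article or from any particular LIMS or chromatography data system.

```python
from dataclasses import dataclass

@dataclass
class Result:
    sample_id: str
    value: float
    spec_low: float
    spec_high: float
    manually_integrated: bool = False
    reprocessed: bool = False
    critical_fields_changed: bool = False

def needs_detailed_review(r: Result, margin: float = 0.05) -> list[str]:
    """Return the reasons a result should be routed to detailed review."""
    reasons = []
    span = r.spec_high - r.spec_low
    # Within specification but close to either limit (within 5% of the range by default).
    if r.spec_low <= r.value <= r.spec_high and (
        r.value - r.spec_low < margin * span or r.spec_high - r.value < margin * span
    ):
        reasons.append("close to specification limit")
    if r.manually_integrated:
        reasons.append("manually integrated")
    if r.critical_fields_changed:
        reasons.append("manually entered critical data changed")
    if r.reprocessed:
        reasons.append("reprocessed")
    return reasons

results = [
    Result("A-001", 99.1, 95.0, 105.0),
    Result("A-002", 95.2, 95.0, 105.0, manually_integrated=True),
]
for r in results:
    flags = needs_detailed_review(r)
    print(r.sample_id, "detailed review:" if flags else "minimal review", ", ".join(flags))
```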
Data audit
A range of data audit activities can be undertaken as part of the scheduled periodic review process, unscheduled as part of an investigation or even in preparation for a regulatory inspection or customer audit.

One effective exercise could be to conduct a mock inspection of a specific data-handling process, where the entire data flow would be explained as if it were being presented to a regulatory official. This will highlight any confusion about where the data resides and how it passes from one system to another; it may identify areas of weakness in the system(s).

Another exercise could be to pick a single result and trace it back to the raw data, including any laboratory notebook entries. Verify the data integrity and audit trail at each step and demonstrate that all raw data, paper or electronic, is readily retrievable, fully supports the final result and is consistent with any summary data filed with the regulatory agencies as part of an application for approval.

Repeating the exercise in the opposite direction – to verify that all data has been processed and reported and to confirm that there is no orphan data that could indicate trial injections or other malpractices – is equally important.

Further proactive data audit activities could be based on the regulators' own guidance; the FDA Compliance Program Guidance Manual on preapproval inspections,9 for example, suggests that inspectors should:

¡ Review data on finished product stability, dissolution, content uniformity and active pharmaceutical ingredient impurity
¡ Determine if data was not submitted to the application that should have been
¡ Look for invalidated out-of-specification results and assess whether it was correct to invalidate them
¡ Seek out inconsistencies in manufacturing documentation, such as identification of actual equipment used

Reviews of system audit trails and logbooks are a more pressing concern in laboratory environments and manufacturing sites, however, where the sophistication of the interfacing systems can limit the ease of transmission between them. Suggestions of what to look for within the system audit trail (as distinct from the data audit trail) are discussed in "Doing the Right Thing."

Review process documentation
Within regulated industries, simply completing an action is not sufficient; there must be some documented evidence of when it was completed and by whom. The MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015 requires that:

There should be evidence available to confirm that review of the relevant audit trails have taken place. When designing a system for review of audit trails, this may be limited to those with GMP relevance (e.g., relating to data creation, processing, modification, and deletion).1

Reviewing audit trail entries associated with results (i.e., data audit trail) may be governed by a Review of GxP Data SOP and documented by some statement along the lines of "By approving this report I certify that I have reviewed the data, metadata, manually entered values and the audit trail records associated with this data, in accordance with Review SOP XXX." This statement could be included in the signature process for the electronic record and be visible on the printed and displayed report.

The MHRA guidance goes on to state:

QA should also review a sample of relevant audit trails, raw data, and metadata as part of self-inspection to ensure ongoing compliance with the data governance policy/procedures.1
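The "opposite direction" data audit exercise described above, confirming that no acquired data has gone unreported, is essentially a reconciliation of two identifier lists. The sketch below is hypothetical: the run identifiers and the data sources are assumptions for illustration only.

```python
# Hypothetical identifiers pulled from the chromatography system's acquisition log
acquired_runs = {"RUN-0101", "RUN-0102", "RUN-0103", "RUN-0104"}

# Hypothetical identifiers referenced in the reported/approved results
reported_runs = {"RUN-0101", "RUN-0102", "RUN-0104"}

# Orphan data: acquired but never processed or reported (possible trial injections)
orphans = acquired_runs - reported_runs
if orphans:
    print("Investigate unreported acquisitions:", sorted(orphans))
else:
    print("All acquired runs are accounted for in reported results.")
```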
Doing the Right Thing
Outside of the pharmaceutical industry, falsification and fraud occurred in respected financial institutions such as JP Morgan (2003) and Credit Suisse Group (2007-2008). The article "Compliance Alone Won't Make Your Company Safe"1 discusses the premise that good people can still behave inappropriately and that creating a "policeman culture" of enforcing rules and procedures may discourage generally honest employees from admitting that they wandered away from the straight and narrow or inadvertently made a mistake.
Improvisation is the ability to work around a lack of people or absent or damaged equipment, and even a lack of training, to "get the job done somehow." The downside to a culture of improvisation is that SOPs or other controls will not be followed, and the integrity of any data produced by such means is therefore highly suspect. This reinforces yet again the importance of management provision for sufficient and suitable resources.

The scientific and engineering mindset of people in skilled professions can also create a culture in which any rule or impediment will be seen as a challenge to be gotten around: "Ah, but in that case I could …" and this is more difficult to mitigate. "Big Brother Is Watching" emphasized the importance of training to reinforce the "right behavior" as one defense against this puzzle-solving mentality, but the "six sources of influence" discussed later in this article may prove more effective overall compared to training alone.

Impartiality
Any person making critical product-quality decisions must be free from commercial, marketing or financial pressure that could influence his or her decision.

For example, a quality control (QC) lab supervisor who reports to the operations department may be at risk of undue pressure to pass batches even if he or she has valid concerns about the test results. Good practice would recommend reporting through the independent quality assurance department.

The extent and impact of falsification is greatly magnified if collusion is involved. A senior QC manager has the power to direct his or her staff to collude for falsification, resulting in systemic fraud within the laboratory, whereas an individual analyst can only try to persuade a coworker to try to falsify data and inherently runs the risk of being reported to management for inappropriate behavior. Geographic and corporate cultures may also influence the ease with which collusion may occur; strongly hierarchical cultures may be more susceptible to collusion instigated at a senior level as these cultures inherently discourage any disagreement with authority figures. (Cultural considerations are discussed in more detail in the first article in this series.)

Understanding effective risk controls
In formal risk methodology,12 there are the following risk treatment options:

Avoid: Stop the activity or do it in a different way that eliminates the risk

Reduce (also termed "mitigate"): Adopt measures to reduce the likelihood of occurrence or reduce the severity of harm or increase the probability of detection

Retain: Accept a low level of residual risk

Transfer: Transfer the risk-creating activity (more practical for physical risk than data-integrity risk)
Table A: Potential for falsification as a function of motivation and seniority

Greed
¡ Data integrity issues here are likely to be on an individual sample or test level, and may take the form of: test method or parameters altered to influence the result; test samples destroyed; test samples substituted to ensure a passing result
¡ Data integrity issues may now have become quite sophisticated within the lab domain: a variety of saved test methods used for a range of known scenarios to effect the desired result; a pool of "good projects" from which …
¡ Data integrity issues may constitute systemic, corporate fraud, where: all raw materials are used and all finished goods are released, irrespective of quality; the company benefits from significant savings on staff and …

Fear
¡ Falsification is occurring when samples fail, because the management culture does not promote honesty and cares only about passing results
¡ Data integrity issues may be focused around production yield, such as: pressuring the quality department to release borderline product; understating rejected batches or having them mixed with passing batches during rework
¡ Falsification is aimed at hiding poor performance from the shareholders, and is endemic throughout the production environment

Controls that rely on people to consistently perform an action the right way out of many possible ways are ineffective. Simply writing an instruction into an SOP may have little or no long-term effect on the probability that someone will do something the wrong way. Single training events may affect the probability of correct performance in the short term after the training, but will have minimal influence in the long term as people move within the organization and old habits reassert themselves.

Six sources of influence
In Influencer,10 Joseph Grenny and his colleagues propose a model for influencing behavior and attitudes. In the example below, this model has been applied to data integrity in a hypothetical QC laboratory (see Figure 1).

Figure 1: Grenny's six sources of influence

Connecting the behavior to an outcome has a powerful impact. If possible, find out whose neighbor, child or parent relies on that medication and (with permission) use that person (our "real-life patient") to make it personal for all the lab staff. Spin the story. Add a picture and some background about the real-life patient: What are his or her hobbies? Does he or she have kids? Pets? Now, finding a failing sample is not a blot on the analyst's day; it's an important victory keeping this real person safe and healthy so he or she can continue sailing/studying environmental science at college/playing with his or her grandchildren.
It is essential that the lab manager provide strong support for this kind of self-improvement by helping analysts set short-term goals to measure the improvement and providing praise for the achievement.

Strength in numbers
Research studies have proven repeatedly that groups perform significantly better than individuals. In the new culture of openness within the lab, it …

Creating approved methods for instrument control, data processing, and reporting all combine to make tasks quick and simple for the analyst – while ensuring that he or she is doing them correctly. Creating custom field calculations to eliminate calculation errors and getting sample weights read into the system electronically to eliminate transcription errors significantly strengthen data integrity by not only reducing the probability of error but also removing the simplest means for an analyst to falsify the sample weight or the concentration of active ingredient.

Increasing knowledge and confidence around data integrity will, in turn, lead to continual improvement in the overall integrity of the laboratory data.

Charlie Wakeham and Thomas Haag
A Special Interest Group (SIG) for Data Integrity

The Data Integrity GAMP SIG was launched in January 2014; its sponsor, Mike Rutherford, had signed up some 50 members before the announcement at ISPE's 2013 Annual Meeting. The group now boasts more than 100 members, a sign, says Mike, of the topic's importance in the pharmaceutical manufacturing industry. "The group is working with Board member Chris Reid to make the SIG an ISPE-centric activity that reaches beyond GAMP."

In 2014, the SIG set four overarching objectives:

1. Understand existing and future regulatory expectations, guidance and enforcement strategies.
2. Identify and propose appropriate data integrity control strategies for critical data and key quality attributes throughout the life cycle that also address data management from the operational through to the record retention phase.
3. Provide tools to align requirements with a product's life cycle.
4. Provide a pragmatic and tangible framework for managing data integrity risks across the industry.

In the two years since its formation the SIG has generated presentations on how to identify and mitigate data integrity risk; identified which global GxP regulations and guidances are linked to data integrity; and developed a prototype tool with hundreds of these references which, while available only to GAMP SIG members today, may be rolled out to the broader membership in the future.

Goals for 2016 are three-fold:

1. Develop a GAMP guide on electronic records and data integrity that will include current thinking on governance. A session will take place at the 2016 Annual Meeting in Atlanta, this September, with the guide targeted for publication by Q1 2017.
2. Develop a GPG on how to apply the GAMP guide, as well as one that focuses on pragmatic solutions.
3. Create content for ISPE conferences, such as the Europe Annual Conference just held in Frankfurt, Germany, the 2016 Annual Meeting, and the upcoming GAMP regional conference in Copenhagen. They are also supporting the development of a Data Integrity Workshop at the ISPE/FDA GMP Conference in June of this year.

As a topic that is the focus of regulatory agencies around the world, data integrity "is something you absolutely need to be thinking," says Mike. ISPE is devoting much effort to it and solutions will continue to evolve for this business problem.

"What's important for members to understand is they needn't panic." ¢
References

Throwing People into the Works
1. US Food and Drug Administration. Warning Letter 320-15-09. 6 April 2015. www.fda.gov/ICECI/EnforcementActions/WarningLetters/ucm443247.htm.
2. International Organization for Standardization. ISO 9001:2015: Quality Management Systems – Requirements. www.iso.org/iso/catalogue_detail?csnumber=62085.
3. UK Medicines and Healthcare Products Regulatory Agency. "MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015." www.gov.uk/government/uploads/system/uploads/attachment_data/file/412735/Data_integrity_definitions_and_guidance_v2.pdf.
4. Cressey, Donald R. Other People's Money: A Study in the Social Psychology of Embezzlement. Glencoe, Illinois: Free Press, 1953.
5. Meyer, Erin. The Culture Map: Breaking through the Invisible Boundaries of Global Business. Philadelphia: Public Affairs, 2014.
6. McAuley, Gerry. "Optimizing Human Performance, Part I." BioPharm International 27, no. 7. 1 July 2014. www.biopharminternational.com/optimizing-human-performance-part-i.
7. ———. "Optimizing Human Performance, Part II: A Road Worth Traveling." BioPharm International 27, no. 8. 3 September 2014. www.biopharminternational.com/optimizing-human-performance-part-ii-road-worth-traveling.
8. ———. "Optimizing Human Performance: A Road Worth Traveling, Part 3." BioPharm International 27, no. 9. 1 July 2014. www.biopharminternational.com/optimizing-human-performance-road-worth-traveling-part-3.
9. Potter, H. "Needed: A Systematic Approach for a Cockpit Automation Philosophy." Proceedings of the Workshop on Flight Crew Accident and Incident Human Factors, 21–23 June 1995. Washington, DC: US Federal Aviation Administration, Office of System Safety.
10. Panko, Raymond R. "What We Know about Spreadsheet Errors." Journal of End User Computing 10, no. 2 (Spring 1998): 15–21. Revised May 2008. http://panko.shidler.hawaii.edu/SSR/Mypapers/whatknow.htm.
11. US Food and Drug Administration. Warning Letter 320-15-06. 30 January 2015. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2015/ucm432709.htm.

Implementing a Corporate Data Integrity Program
1. UK Medicines and Healthcare Products Regulatory Agency. "MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015." www.gov.uk/government/uploads/system/uploads/attachment_data/file/412735/Data_integrity_definitions_and_guidance_v2.pdf.
2. Avellanet, John, and Eve Hitching. "Considerations for a Corporate Data Integrity Program." ISPE GAMP Community of Practice Concept Paper.
3. Wakeham, Charlie, Eve Hitching, and Thomas Haag. Special Report on Data Integrity. Pharmaceutical Engineering 36, no. 2 (March-April 2016).

An Ounce of Prevention
1. UK Medicines and Healthcare Products Regulatory Agency. "MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015." www.gov.uk/government/uploads/system/uploads/attachment_data/file/412735/Data_integrity_definitions_and_guidance_v2.pdf.
2. US Food and Drug Administration. "Questions and Answers on Current Good Manufacturing Practices, Good Guidance Practices, Level 2 Guidance – Records and Reports." Question 3. www.fda.gov/drugs/guidancecomplianceregulatoryinformation/guidances/ucm124787.htm#3.
3. Lopez, Orlando. "A Computer Data Integrity Compliance Model." Pharmaceutical Engineering 35, no. 2 (March/April 2015): 79–87.
4. Code of Federal Regulations. Title 21, Part 11: "Electronic Records; Electronic Signatures." http://www.accessdata.fda.gov/scripts/cdrh/cfdocs/cfcfr/CFRSearch.cfm?CFRPart=11.
5. European Commission. Health and Consumers Directorate-General. EudraLex, Volume 4, Annex 11: "Computerised Systems." http://ec.europa.eu/health/files/eudralex/vol-4/annex11_01-2011_en.pdf.
6. Pharmaceutical Inspection Co-Operation Scheme. "PIC/S GMP Guide." http://www.picscheme.org/publication.php?id=4.

How Good Is Your Data?
1. International Society for Pharmaceutical Engineering. "GAMP® Good Practice Guides." www.ispe.org/gamp-good-practice-guides.
2. US Food and Drug Administration. "Pharmaceutical Quality/Manufacturing Standards (CGMP)." http://www.fda.gov/Drugs/GuidanceComplianceRegulatoryInformation/Guidances/ucm064971.htm.
3. European Medicines Agency. "Reflection Paper on Expectations for Electronic Source Data and Data Transcribed to Electronic Data Collection." EMA/INS/GCP/454280/2010. 9 June 2010. http://www.ema.europa.eu/docs/en_GB/document_library/Regulatory_and_procedural_guideline/2010/08/WC500095754.pdf.
4. European Commission. Health and Consumers Directorate-General. "Documentation." Chapter 4 of EudraLex Volume 4, Good Manufacturing Practice (GMP) Guidelines. 30 June 2011. http://ec.europa.eu/health/files/eudralex/vol-4/chapter4_01-2011_en.pdf.
5. International Conference on Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use. ICH Harmonised Tripartite Guideline. "Pharmaceutical Quality System: Q10." 4 June 2008. http://www.ich.org/fileadmin/Public_Web_Site/ICH_Products/Guidelines/Quality/Q10/Step4/Q10_Guideline.pdf.

Big Brother Is Watching
1. UK Medicines and Healthcare Products Regulatory Agency. "MHRA GMP Data Integrity Definitions and Guidance for Industry March 2015." www.gov.uk/government/uploads/system/uploads/attachment_data/file/412735/Data_integrity_definitions_and_guidance_v2.pdf.
2. European Commission. Health and Consumers Directorate-General. EudraLex, Volume 4, Annex 11: "Computerised Systems." http://ec.europa.eu/health/files/eudralex/vol-4/annex11_01-2011_en.pdf.
3. US Food and Drug Administration. Warning Letter 06-atl-09. 28 September 2006. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2005/ucm076083.htm.
4. ———. Warning Letter 10-NWJ-03. 14 January 2010. www.fda.gov/ICECI/EnforcementActions/WarningLetters/ucm197966.htm.
5. ———. Warning Letter 320-12-08. 23 February 2012. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2012/ucm294321.htm.
6. Gutman, Barbara, and Edward A. Roback. "An Introduction to Computer Security: The NIST Handbook." National Institute of Standards and Technology Special Publication 800-12. October 1995. http://csrc.nist.gov/publications/nistpubs/800-12/handbook.pdf.
7. Perez, Randy, Chris Reid, and Sion Wyn. "A Risk-Based Approach to Audit Trails." Pharmaceutical Engineering 35, no. 2 (March/April 2015). www.pharmaceuticalengineering.org.
8. International Society for Pharmaceutical Engineering. GAMP® Good Practice Guide: A Risk-Based Approach to Operation of GxP Computerized Systems – A Companion Volume to GAMP® 5. January 2010. www.ISPE.org.
9. US Food and Drug Administration. Compliance Program Guidance Manual 7346.832. "Pre-Approval Inspections." 12 May 2010. http://www.fda.gov/downloads/Drugs/DevelopmentApprovalProcess/Manufacturing/QuestionsandAnswersonCurrentGoodManufacturingPracticescGMPforDrugs/ucm071871.pdf.

Doing the Right Thing
1. De Cremer, David, and Bjarne Lemmich. "Compliance Alone Won't Make Your Company Safe." Harvard Business Review, 18 May 2015. https://hbr.org/2015/05/compliance-alone-wont-make-your-company-safe.
2. US Food and Drug Administration. Warning Letter 320-14-08. 7 May 2014. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2014/ucm397054.htm.
3. ———. Warning Letter 320-14-01. 25 November 2013. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2013/ucm376913.htm.
4. ———. Warning Letter 320-14-005. 6 March 2014. www.fda.gov/iceci/enforcementactions/warningletters/2013/ucm390278.
5. UK Medicines and Healthcare Products Regulatory Agency. Statement of Non-Compliance with GMP. Report UK GMP 8913 Insp GMP 8913/378537-0004 NCR. www.pharmacompass.com/assets/pdf/edqm/A1348.pdf.
6. US Food and Drug Administration. Warning Letter 320-14-11. 16 June 2014. www.fda.gov/ICECI/EnforcementActions/WarningLetters/2014/ucm401451.htm.
7. Italian Medicines Agency. Statement of Non-Compliance with GMP. Report IT/GMP/NCR/INT/1-2014. www.pharmacompass.com/assets/pdf/news/N1.pdf.
8. US Food and Drug Administration. Warning Letter 320-15-09. 20 April 2015. www.fda.gov/ICECI/EnforcementActions/WarningLetters/ucm443247.htm.
9. ———. Warning Letter 320-15-04. 19 December 2014. www.fda.gov/ICECI/EnforcementActions/WarningLetters/ucm427976.htm.
10. Grenny, J. Influencer. McGraw-Hill, 2008.
11. US Food and Drug Administration. Warning Letter 320-15-07. 27 February 2015. http://www.fda.gov/iceci/enforcementactions/warningletters/2015/ucm436268.htm.
12. Institute of Risk Management. "A Risk Management Standard." 2002. https://www.theirm.org/media/886059/ARMS_2002_IRM.pdf.