Title: InForm Database Quality Assessment Specification
Sponsor Name: XYZ
Sponsor Project Number: XXXX
XXXX Project Number: 102294
This document has been signed electronically on the final pages by the following:

Signatory: Independent Reviewer
Name:
Role:

Signatory: Reviewer
Name:
Role:
TABLE OF CONTENTS
1. Revision History
2. Purpose
3. Scope
4. Out of Scope
5. Time Frame
6. Required Documents
7. Required Listings and Reports
8. Random Subject Sample Selection
9. Procedural Steps
10. Database Quality Assessment Reviews
11. Review Meeting / Result Summary
12. Calculation of the Error Rate
13. Sponsor Communication
1. Revision History
2. Purpose
The Database Quality Assessment Specification provides the requirements and procedures to ensure that the database meets the XXXX quality standard.
3. Scope
This Data Management Operational Guideline describes the process for a regular, sample-based assessment of the data review process. The outcome is an estimate of the overall error rate, which measures the level of quality applied in the data review process. Data selected for the quality assessment must be clean prior to commencement of the Database Quality Assessment: un-actioned discrepancies, outstanding queries, and unresolved data clarification issues are not permitted unless they pertain to the out-of-scope activities listed below.
4. Out of Scope
The following reports and queries are considered out of scope for this Database Quality Assessment and therefore should not be included in the quality assessment.

Reports out of scope:
o Non-randomized patients (screen failures)
o External Vendor Reports
o Patient Diaries Reports
o Coding Reports

Queries out of scope (determined by the XXXX Data Processing Organization and the DMS during Study Start):
o Coding queries
o SAE queries with the prefix "SAE" in the query text
o External vendor data queries
o Autoqueries that were automatically closed
5. Time Frame

The assessment(s) will be performed at pre-determined time points during the trial, as determined by the XXXX Data Operation Lead (DOL) and the DMS during the Study Set-up phase of the trial (refer to the DCS for the time points). This allows flexibility for each trial based on study design, study duration, rate of enrollment, interim analysis, and/or final database lock timeline. The Data Operation Lead (DOL) will be responsible for selecting an Independent Reviewer to initiate all associated tasks.

6. Required Documents

Reference the following documents to facilitate the completion of the Database Quality Assessment:
o Final version of the Edit Check Specification
o Approved Protocol/Protocol Amendment(s) for the trial (as necessary)
o eCRF Specification
o eCRF Completion Instructions
7. Required Listings and Reports

The following listings/reports are required to complete the Database Quality Assessment:
o Standard In-Life Report Listings and Study Specific Listings (off-line listings per the Edit Check Specification)
o All Queries Report

The following ad-hoc report will be generated to help initiate the sample selection process for the Database Quality Assessment:
o Subject Status Report
8. Random Subject Sample Selection

The Project Team will select the sample size as a random sample of the subject population, using the square root of N+1, where N is the total number of patients in the study (a sample-selection sketch follows this list).
o Subjects available for the random sample selection: 1181 (total subjects enrolled).
o Random sample size: 30, obtained by applying the square root of N+1.
o For this assessment, a sample of 18 subjects was taken from the subjects declared clean by the PCDA.
o The PCDA is responsible for generating the Subject Status Report and sending it to the Independent Reviewer; the Independent Reviewer in turn forwards the Subject Status Report to the designated Clinical Programmer.
o The DOL will be responsible for requesting read-only access for the Independent Reviewer (remember to include Ad-Hoc Reporting capabilities in the request).
o Prior to sending the Subject Status Report to the Clinical Programmer, the Independent Reviewer must meet with the Primary CDA and the Data Operation Lead to identify the clean subjects using the completed subject list.
o Once the Primary CDA and the Data Operation Lead have identified the clean subjects, the Primary CDA must sign the Subject Status Report to verify that the selected subjects are an accurate reflection of the data across the study, which will not compromise the integrity of the test.
o The criteria for selecting a clean subject are as follows: no missing data on the eCRFs; no missing eCRFs/visits; no outstanding queries (excluding automatically closed autoqueries, coding, SAE, and lab queries); and all queries have been closed properly.
o After the clean subjects have been identified, the Independent Reviewer must send the amended Subject Status Report to Clinical Programming. The Clinical Programmer will then select the random sample using the appropriate XXXX standards and tools. The Clinical Programmer must refer to the attached document for the steps involved in generating the DBQA listings.
o Once the random sample has been selected, the Programmer will send the DBQA Listing, which will include the number of selected subjects and datasets.
o The Independent Reviewer will QC the listings and send them to the primary programmer for technical QC of the listings.
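The following is a minimal, illustrative sketch of the square-root-of-N+1 sample-size rule and a reproducible random draw from the clean subject list. It is not the XXXX standard tool used by Clinical Programming; the CSV layout, the "Subject Number" column name, the fixed seed, and the ceiling rounding convention are assumptions, and the sample size actually used for the study is the one stated in Section 8.

# Illustrative sketch only: the actual sample is drawn by the Clinical Programmer
# using the appropriate XXXX standards and tools. File name, column name, seed,
# and rounding convention are assumptions.
import csv
import math
import random

def sample_size(total_enrolled: int) -> int:
    """Square root of N+1 rule, where N is the total number of patients in the study."""
    return math.ceil(math.sqrt(total_enrolled + 1))

def select_random_sample(clean_subject_ids, total_enrolled, seed=20101026):
    """Draw a reproducible random sample of clean subjects."""
    k = min(sample_size(total_enrolled), len(clean_subject_ids))
    rng = random.Random(seed)  # fixed seed so the selection can be reproduced and verified
    return sorted(rng.sample(list(clean_subject_ids), k))

if __name__ == "__main__":
    # Assumed layout: one clean subject per row, with a "Subject Number" column.
    with open("subject_status_report_clean.csv", newline="") as fh:
        clean_ids = [row["Subject Number"] for row in csv.DictReader(fh)]
    print(select_random_sample(clean_ids, total_enrolled=1181))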
9. Procedural Steps
o Each project must complete the Database Quality Assessment unless otherwise specified by the Global Program Lead or the Project Lead.
o The Data Operation Lead (DOL) is responsible for selecting an Independent Reviewer to initiate all associated tasks.
o Prior to starting the Database Quality Assessment, the Independent Reviewer must schedule a meeting with the Data Operation Lead (DOL) and the Primary CDA to obtain a full overview of the project. During the meeting, the Project Team will explain all discrepancies and outliers associated with the project and the selected patients.
o Once all reports, listings, and appropriate access rights have been provided to the Independent Reviewer, the Database Quality Assessment can start.
9.1 Instructions on How to Generate the Subject Status Report (Generated by PCDA)

The Subject Status Report is generated as follows:
1. To generate the Subject Status Report, use Ad-Hoc Reporting.
2. Click on the Reports Tab.
3. On the Reporting and Analysis screen, click on Ad-Hoc Reporting.
4. Select the Clinical link.
5. Click on Clinical Data By Form.
6. Select the Demography Form Folder.
   a. Drag and drop Subject Number from the Demography Form Folder.
   b. Drag and drop Full Baseline Number from the Demography Form Folder.
   c. Drag and drop Baseline Screen Number from the Demography Form Folder.
   d. Drag and drop Allocation Number from the Demography Form Folder.
7. Click on the Subject Disposition Folder.
   a. Drag and drop Subject Status for Subject from the Subject Disposition Folder.
   b. Drag and drop rdcDSS Subject from the Subject Disposition Folder.
   c. Delete the Screen Fail entries and then generate and print the report.
8. Then click the save icon.
12. Click on the blue arrow under the saved file (Run Options).
13. Select the format Excel 2002.
14. Click the Run icon to generate the report.
15. Remove the title from the Excel spreadsheet.
16. Select the View icon on the toolbar and then select the Header and Footer icon. Create the header and footer using the following format:
    Header: Protocol Number: MK0462-082 Subject Status Report
    Footer: XXXX Project Number: 102294
            Page: [X of Y]
            Version Number: [X.0]
            Effective Date: [DD-MON-YYYY]
17. Filter and keep the Completed, Excluded, and Discontinued subjects in the Subject Status for the Study column, and remove all blank rows (see the filtering sketch after these steps).
18. Review the sample Subject Status Report in Appendix 1.
19. Generate and print the report for review.
20. Save the report as a CSV file in PMED.
21. Forward the CSV file to the correct programmer.
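Step 17 can also be performed programmatically once the report has been exported; the sketch below assumes a CSV export with a column literally named "Subject Status for the Study" (column and file names are assumptions, and the on-screen Excel filter described above remains the documented method).

# Minimal sketch of the step-17 filter; file and column names are assumptions.
import pandas as pd

KEEP_STATUSES = {"Completed", "Excluded", "Discontinued"}

def filter_subject_status(in_path: str, out_path: str) -> pd.DataFrame:
    report = pd.read_csv(in_path)
    report = report.dropna(how="all")  # remove all blank rows
    filtered = report[report["Subject Status for the Study"].isin(KEEP_STATUSES)]
    filtered.to_csv(out_path, index=False)  # CSV copy for PMED / the programmer
    return filtered

# Example usage (file names assumed):
# filter_subject_status("subject_status_report.csv", "subject_status_report_clean.csv")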
10. Database Quality Assessment Reviews

10.1

10.2

10.2.1 Instructional Steps
The Independent Reviewer will review the listings against the database to ensure all rows have been actioned correctly for the selected patients. Secondly, the reviewer will evaluate the audit trail (on-line screen review in the database) for any related datapoints and associated queries and responses.

All instances where potential errors have been identified must be documented in the Error Log. (The Error Log is accessible via the MERCK-D Project File folder in PMED.)

If a discrepancy was noted as an error and was not queried, or was queried incorrectly, it will be considered a true error. Ensure all listings have been signed and dated by the Independent Reviewer and the verifier.

If discrepancies are identified because the In-Life Standard Listings Reports and Study Specific Listings are not available, or were not available at the time of the assessment, they are not considered errors.
10.3

10.3.1

The Independent Reviewer must manually check the Yellow Traffic Lights in the InForm database to ensure that no fields have missing values. The Independent Reviewer will find all visits with yellow traffic lights and review the form details for missing data. Secondly, the reviewer will review the audit trail (on-line screen review in the database) for any related forms and associated queries and responses to ensure that no discrepancies are present.
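The traffic-light check itself is performed on-screen in the InForm database; as a supplementary aid only, the sketch below flags apparently blank fields in an exported form dataset so the reviewer knows which subjects and fields to open first. The export file name and column names are assumptions.

# Supplementary sketch only: the Yellow Traffic Light review is performed on-screen
# in InForm. This flags apparently missing values in an exported form dataset.
# File and column names are assumptions.
import pandas as pd

def flag_missing_fields(form_export_path, key_columns=("Subject Number", "Visit")):
    form = pd.read_csv(form_export_path)
    data_columns = [c for c in form.columns if c not in key_columns]
    # Reshape to one row per (subject, visit, field) and keep only blank entries.
    long = form.melt(id_vars=list(key_columns), value_vars=data_columns,
                     var_name="Field", value_name="Value")
    blank = long["Value"].isna() | (long["Value"].astype(str).str.strip() == "")
    return long[blank].sort_values(list(key_columns) + ["Field"])

# Example usage (file name assumed):
# print(flag_missing_fields("demography_form_export.csv"))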
10.4

10.4.1

The Independent Reviewer will investigate whether all queries have been actioned appropriately. This review will include the following:
o Request from the DOL or the Primary CDA a list of the CDA staff members who worked on the study and have closed queries.
o Review all manually generated queries by the CDA staff and autoqueries closed by the CDA staff. Do not review queries closed by the system unless they were reopened manually by a CDA staff member. Also review all remaining open or answered queries. Make sure that autoqueries closed by the CRA/Site are not included in the review.
o Review the query details by investigating the audit trail, which will help to identify query details and potential query errors. Query errors include the following (a structured-log sketch follows this list):
  It was queried, but incorrectly.
  It was queried correctly, but inappropriately answered and closed.
  It was queried and/or answered, but remains open.
  It was queried correctly, but a spelling error was noted that changed the meaning of the query, which counts as an error (e.g., "Hypertension" instead of "Hypotension"). However, if a spelling error is noted that does not change the meaning of the query, it should be counted as an observation instead of an error.
o Record query error findings in the Error Log. Ensure all Query Reports have been signed and dated by the Independent Reviewer and the Verifier.
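The query-error categories above can be captured in a structured form; the sketch below is a hypothetical structure for Error Log entries, not the official Error Log template stored in PMED, and all field and category names are assumptions.

# Hypothetical structure for Error Log entries covering the query-error categories
# above; this is not the official Error Log template, and names are assumptions.
from dataclasses import dataclass
from enum import Enum

class QueryErrorType(Enum):
    QUERIED_INCORRECTLY = "Queried, but incorrectly"
    ANSWERED_AND_CLOSED_INAPPROPRIATELY = "Queried correctly, but inappropriately answered and closed"
    STILL_OPEN = "Queried and/or answered, but remains open"
    MEANING_CHANGING_SPELLING_ERROR = "Spelling error that changed the meaning of the query"
    OBSERVATION_ONLY = "Spelling error that did not change the meaning (observation, not an error)"

@dataclass
class ErrorLogEntry:
    subject_number: str
    form: str
    query_text: str
    finding: QueryErrorType
    comment: str = ""

    @property
    def counts_as_error(self) -> bool:
        # Observations are recorded but excluded from the error-rate numerator.
        return self.finding is not QueryErrorType.OBSERVATION_ONLY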
10.4.2 Instructions on How to Generate the All Queries Report (Generated by PCDA)
Generate the All Queries Report using the Ad-Hoc Reporting tool:
1. Click on the Reports Tab.
2. On the Reporting and Analysis screen, click on the Ad-Hoc Reporting Tab.
3. Select the InForm Trial Management Folder.
4. Select the Site Folder.
   a. Drag and drop Site Mnemonic from the Site Folder.
5. Select the Subject Folder.
   a. Drag and drop Subject Number from the Subject Folder.
6. Select the Visit Folder.
   a. Drag and drop Visit Mnemonic from the Visit Folder.
7. Select the Form Folder.
8. Select the Queries Folder.
   a. Drag and drop Query Text from the Queries Folder.
   b. Drag and drop Query Status from the Queries Folder.
   c. Drag and drop Originator User from the Queries Folder.
   d. Drag and drop Query Type from the Properties Folder... (see step 9)
   d. Drag and drop Query Response from the Queries Folder.
9. Select the Properties Folder.
   a. Drag and drop Query Type from the Properties Folder.
10. Then click the save icon.
11. Save the report as All Queries Report DD-MON-YYYY.
12. Return to the Reports and Analysis screen and click on the Reports Tab.
13. Locate the file that was saved on the Reports and Analysis screen.
14. Click on the blue arrow under the saved file (Run Options).
15. Select the format Excel 2002.
16. Click the Run icon to generate the report.
17. Remove the title from the Excel spreadsheet.
18. Select the View icon on the toolbar and then select the Header and Footer icon. Create the header and footer using the following format:
    Header: Protocol Number: MK0462-082 All Queries Report (added to the center section)
    Footer: XXXX Project Number: 102294 (added to the left section of the footer)
            Page: [X of Y] (added to the right section of the footer)
            Version Number: [X.0] (added to the right section of the footer)
            Effective Date: [DD-MM-YYYY] (added to the right section of the footer)
    Note: Once the header and footer are completed, use the Page Setup function to set the page orientation to Landscape.
19. Sort the Excel spreadsheet by the selected Site/Subject Number and print each output for the review (see the sorting sketch after these steps).
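If the exported report is also saved as a CSV file, the step-19 sort can be reproduced programmatically; the column names ("Site Mnemonic", "Subject Number") and file names in the sketch below are assumptions, and sorting directly in Excel as described above remains the documented method.

# Minimal sketch of the step-19 sort; column and file names are assumptions.
import pandas as pd

def sort_all_queries(in_path: str, out_path: str) -> pd.DataFrame:
    queries = pd.read_csv(in_path)
    ordered = queries.sort_values(["Site Mnemonic", "Subject Number"])
    ordered.to_csv(out_path, index=False)  # one sorted file; print per site/subject as needed
    return ordered

# Example usage (file names assumed):
# sort_all_queries("all_queries_report.csv", "all_queries_report_sorted.csv")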
11. Review Meeting / Result Summary

11.1 Feedback of Errors Identified to the Responsible Primary CDA and DOL
Once the assessment has been completed, the Independent Reviewer must review the Error Log, listings, and reports to ensure that all errors have been documented and accounted for. The Independent Reviewer must then schedule the final Review Meeting with the Study Team to review, in a timely manner, the errors that were identified during the review. During the meeting, the Independent Reviewer refers each error found to the Primary CDA and DOL responsible for the data:
o If the error is agreed, it will remain on the Error Log.
o If the error is not a valid error, this will be documented in the comment section of the Error Log, explaining why it is not a valid error.
o If the error is not agreed, the Independent Reviewer will discuss the error with the Data Operations Lead to help determine whether it is a valid error.
o If it cannot be agreed whether it is a true error, the Independent Reviewer will discuss the error with the Quality Management Specialist. The local Quality Management Specialist will confirm whether the error is valid, and the Independent Reviewer will update the Error Log (as appropriate).
o If the Independent Reviewer identifies a consistent trend in errors, they must immediately contact the appropriate Line Manager/local Quality Management Specialist.
o The Line Manager/local Quality Management Specialist will evaluate and document their findings, following the CAPA process defined in WSOP-1914. The Line Manager/Quality Management Specialist will communicate all decisions, and if retraining is needed, the appropriate parties will be notified.
11.2 Team Responsibilities
Allocate the Quality Assessment tasks to an Independent Reviewer and adjust the table accordingly. Ensure that the Independent Reviewer is not otherwise associated with the project.
Task | PCDA/Designee | DOL | Programmer | Independent Reviewer
Set up and run reports in the EDC system | | | |
Execute programs to generate the subject sample | | | |
Produce any SAS listings | | | |
Perform the Quality Assessment | | | |
Evaluation of errors with the Project Team (including PCDA and DOL) | | | |
Calculate the error rate | | | |
Independent review of the error rate calculation | | | |
Determine remediation if the error rate is unsatisfactory | | | |
Correction of Errors
Investigate whether all queries have been actioned appropriately. This review will include:
o Manually generated queries by the CDA staff and autoqueries closed by the CDA staff.
o Query details: manually check the audit trail in the database and assess for query errors.

If any errors were highlighted during the review, the DOL will investigate whether the issues result from insufficient training of the CDAs or from inappropriate instructions in the Data Cleaning Specification (DCS). If so, initiate adequate further training of the CDAs, or update the DCS, as appropriate. Initiate corrective measures, as appropriate.

If errors have been identified, investigate whether they result from inappropriate validation programming or check logic in the Edit Check Specification. Modify the program or the Edit Check Specification logic/test data as appropriate. This will need to be communicated to the Programmer/Line Manager.

The CRF Completion Instructions, the Edit Check Specification, and the validation procedures will be updated and amended as required based on the results of the review, in accordance with the InForm Data Cleaning process. If applicable, the revised Edit Check Specification will be re-sent to the sponsor for approval.
12. Calculation of the Error Rate

12.1

The overall percentage error rate for the database is calculated from:
o The total number of errors for all identified listed items (Numerator)
o The total number of datapoints reviewed (Denominator)
The percentage error rate is computed manually using the formula:
[Number of Errors (Numerator) / Number of Datapoints Reviewed (Denominator)] x 100

The percentage error rate for each Form/Dataset is calculated from:
o The total number of errors identified in the Form/Dataset (Numerator)
o The total number of datapoints reviewed in the Form/Dataset (Denominator)
The percentage error rate is computed manually using the formula:
[Number of Errors (Numerator) / Number of Datapoints Reviewed (Denominator)] x 100

To meet the quality standard set by XXXX and Merck, the error rate for the database must be equal to or less than 0.2%.
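Per this specification the error rate is computed manually; the short sketch below simply restates the formula and the 0.2% acceptance threshold for illustration, using placeholder numbers rather than study data.

# Illustrative restatement of the manual error-rate calculation; inputs are placeholders.
def percentage_error_rate(num_errors: int, num_datapoints_reviewed: int) -> float:
    """[Number of Errors (Numerator) / Number of Datapoints Reviewed (Denominator)] x 100."""
    if num_datapoints_reviewed <= 0:
        raise ValueError("Denominator (datapoints reviewed) must be positive")
    return (num_errors / num_datapoints_reviewed) * 100

def meets_quality_standard(rate_percent: float, threshold_percent: float = 0.2) -> bool:
    """True when the error rate is equal to or less than the 0.2% standard."""
    return rate_percent <= threshold_percent

# Example with placeholder numbers: 3 errors found across 2,500 reviewed datapoints.
rate = percentage_error_rate(3, 2500)  # 0.12%
print(f"{rate:.2f}% -> pass: {meets_quality_standard(rate)}")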
12.2 Definition of Numerator and Denominator
Numerator: The total number of errors identified. What counts as an error:
o It was queried, but incorrectly.
o It was queried correctly, but inappropriately answered and closed.
o It was queried correctly, but the wrong data correction was accepted, resulting in the query being closed incorrectly.
o It was queried and/or answered, but remains open.
o It was queried correctly, but a spelling error was noted that changed the meaning of the query, which counts as an error (e.g., "Hypertension" instead of "Hypotension").
o An autoquery was answered inadequately and inappropriately closed.
o Missing data (on-screen review).
o In-Life Report errors, including:
  Report output not addressed
  Report output addressed incorrectly (e.g., query issued incorrectly)

Denominator: The total number of datapoints reviewed.
12.3
Appendix 1: Subject Status Report
To be used for selecting the random samples.
Appendix 2: All Queries Report
To be used during the review of the queries.
Appendix 3: DBQA Listing
To be used during the calculation of the error rate and the review process.