Data Quality 101 Webinar
Data Quality 101
CHCANYS Webinar
April 9, 2014
www.chcanys.org
Live Meeting Guide
• Do not use your HOLD button
• Press *6 to mute your phone
• Press #6 to unmute your phone
• Participants will be muted throughout the
presentation. To ask a question at any time:
1. Click the Q&A menu. This displays the Q&A pane
2. Type the question in the text area, and then click ‘Ask’
• Please note that the webinar is being recorded
Presenters
• Lisa Perry, Senior VP Quality & Technology Initiatives
• Amy Grandov, Managing Director NYS Health Center
Controlled Network, CHCANYS
• Dr. Warria Esmond, Medical Director, Settlement
Health
• Also participating from CHCANYS:
– Kathy Alexis, Director of Quality Improvement
– Natalya Malamud, Health IT Project Manager
Presentation Overview
• Overview of Center for Primary Care
Informatics (CPCI)
• Drivers of Data Quality
• Process for Data Validation
• Data Quality Collaborative
• Questions
Why are we talking about data quality?
• Trust in the data is foundational to using the CPCI for
clinical quality improvement
• Need to understand the factors that influence data
quality to improve your data
• Webinar provides guidance for validating data and
identifying opportunities to improve data quality
Center for Primary Care Informatics
• The New York State Center for Primary Care Informatics (CPCI) is
a statewide reporting and analytics solution for NY’s FQHCs
• CPCI was a priority goal in the CHCANYS Strategic Plan. In 2011,
CHCANYS developed the CPCI to
– Support improvements in quality, patient and population health outcomes
– Help control costs
– Support growth & success in a changing environment
• Partner closely with Azara Healthcare using the DRVS (“Drives”)
platform
CPCI Pilot & Roll Out, January – March '14
[Screenshot: CPCI reporting views, including ANALYZER, MEASURE, PLANNING, REGISTRY, REPORTS, and VISIT. Data represents a fictitious environment of 4 health centers. No PHI is being revealed.]
Drivers of Data Quality
What is data quality?
• There is no single, standard definition of “data quality”. Generally
speaking, good quality data is:
– Consistent
– Correct
– Current/timely
– Complete
• "Data validation" or "data quality assurance" is the process of identifying data quality issues, analyzing the root cause, and determining an appropriate response.
• It ensures that your reports are credible and defensible.
Who is responsible for data quality?
• Collaborative partnership between health center staff,
CHCANYS staff, and the CPCI vendors Azara and Arcadia
• The health center is ultimately the expert regarding its data, and is in the best position to ensure data quality
• Data quality is an ongoing effort
How the CPCI Works
1. Data is pulled nightly from disparate EHR/EPM* systems.
2. Data is mapped according to health center‐specific business rules, and unified in an EHR‐agnostic Data Warehouse. External data is referenced, if applicable.
3. Data is available in the CPCI front‐end, a web‐based reporting platform accessible from any major browser.
Issues can surface at any step, for example: data missing in the EHR, a measure that "doesn't seem right…", or connectivity issues.
Potential Sources of Data Issues
• Measure specifications
• Workflow/documentation
• Data format (structured vs unstructured)
• Medical encounter definition
• Identification of labs
• Medications
• Patient Demographics
• Patient Populations
• Connectivity
Measure Specifications
• Instructions for how to calculate a measure, including the measure's numerator, denominator, and exclusions
• The result is typically a ratio or percentage (a calculation sketch follows below)
• Important to understand the specifications of a
measure to interpret its value
• A single measure may rely on data “building blocks”
that span more than one workflow and involve
disparate EHR screens and staff/providers
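To make the numerator, denominator, and exclusion building blocks concrete, here is a minimal sketch of how a measure value is typically derived. This is illustrative Python only, not CPCI or DRVS code, and the field names (in_denominator, in_numerator, excluded) are assumptions for the example.

# Minimal sketch: computing a measure rate from its specification's building blocks.
def measure_rate(patients):
    # Eligible patients: in the denominator and not excluded.
    eligible = [p for p in patients if p["in_denominator"] and not p["excluded"]]
    # Numerator: eligible patients who met the measure.
    met = [p for p in eligible if p["in_numerator"]]
    denominator, numerator = len(eligible), len(met)
    rate = numerator / denominator if denominator else 0.0
    return numerator, denominator, rate

# Example: 3 of 4 eligible patients screened; one patient excluded (e.g., hysterectomy).
patients = [
    {"in_denominator": True, "in_numerator": True,  "excluded": False},
    {"in_denominator": True, "in_numerator": False, "excluded": False},
    {"in_denominator": True, "in_numerator": True,  "excluded": False},
    {"in_denominator": True, "in_numerator": True,  "excluded": False},
    {"in_denominator": True, "in_numerator": False, "excluded": True},
]
print(measure_rate(patients))  # (3, 4, 0.75)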
Example: Denominator Data Sources
Cervical Cancer Screening (NQF0032): “Percentage of women 21‐64 years of age who received
one or more Pap tests to screen for cervical cancer”
DENOMINATOR
Inclusions:
– Patient Demographics: female, 21 to 64 years of age
– Visit Type or CPT coding of visits: include a medical encounter in the measurement year
Exclusions:
– Health Maintenance: Complete Hysterectomy, 2/15/13 – exclude
– Surgical History: Complete Removal of Cervix, 3/18/13 – exclude
*NOTE: For illustration purposes only. Not indicating a recommendation or best practice. Workflow
and terminology will vary by health center and EHR
Example: Numerator Data Sources
Cervical Cancer Screening (NQF0032): “Percentage of women 21‐64 years of age who received
one or more Pap tests to screen for cervical cancer”
NUMERATOR
[Diagram: Pap orders and results come from multiple sources and must be matched. Orders include a Pap Smear Thin Prep referral order and Thin Prep / Sure Path HPV Reflex orders sent to in‐house and external labs (Lab #1, #2, #3). Results return as a scanned paper referral result (Pap, 01/01/13, see path), electronic lab results (Thin Prep, 01/01/13, Normal; Thin Prep, 02/22/13, Dysplasia, HPV Positive; Sure Path, 01/01/13, Unclear ASC‐US), and manual entry in the Results Module (HPV, 01/01/13, Negative). Pap Order/Result Match.]
*NOTE: For illustration purposes only. Not indicating a recommendation or best practice. Workflow
and terminology will vary by health center and EHR
Differences in Measure Specifications
• Similarly named measures may have distinctions in their specifications that lead to legitimate differences in value:
– Cervical Cancer Screening (NQF0032) published by National Quality
Forum (http://www.qualityforum.org) ‐ Percentage of women 21‐64
years of age who received one or more Pap tests to screen for cervical
cancer
– UDS 2013 Table 6b ‐ Pap Tests published by HRSA: Female patients age
24‐64 who received one or more Pap tests during the measurement
year or during the two years prior to the measurement year OR, for
women over 30, received a Pap test accompanied with an HPV test
done during the measurement year or the four years prior
Workflow
• Workflow impacts how and where
providers and other staff are
capturing data in the EHR
• Workflows are always evolving.
Reporting should reflect your
standard workflow.
• Don’t change the standard until
it’s agreed upon within your
center. Update Azara on changes.
• CHCANYS can assist with
workflow mapping and analysis to
identify gaps that are impacting
your QI measures
Workflow Example: Pap Result
[Flowchart: Patient arrives for visit → Front Desk checks in patient → MA/LPN rooms patient and does pre‐visit planning via flowsheet → MA/LPN performs vital signs and places order via standing orders → Pap today?
– No: Care Coordinator arranges future visit / schedules future Pap.
– Yes: MA/LPN/Provider places order (always order w/ HPV for women >=30; order names include Pap LBVT‐NGRFXHPV, Liquid Pap w/ Reflex genotype, Liquid Pap w/ genotype, Pap or HPV, and variants w/ GC) → MA/LPN preps for procedure → Provider sees patient and performs Pap → MA/LPN packages specimen to be sent to Lab → Lab performs diagnostics → Result returns electronically from Lab → RN reviews, adds result to the OB GYN Hx tree in the visit note, and forwards to provider only as needed → Pap Complete.]
*NOTE: For illustration purposes only. Not a recommendation or best practice. Workflow, roles, terminology, etc will
vary by health center and EHR.
Workflow Example: Pap Exclusion
[Flowchart: Patient arrives → Front Desk checks in patient → MA/LPN rooms patient and consults pt. about LMP and past surgical history → Hysterectomy?
– No: continue with visit.
– Yes: in Medical History → Surgical Procedures, select Hysterectomy and enter only the year (possibly changing to include dd/mm/yyyy; if not known, use 1/1/YYYY); in OBGYN Hx, select Hysterectomy (same issues with date) → Pap Exclusion Complete.]
*NOTE: For illustration purposes only. Not a recommendation or best practice. Workflow, roles, terminology, etc will
vary by health center and EHR.
Structured vs Unstructured Fields
• Using a common vocabulary and methodology creates data that can be recognized,
ordered, analyzed, reported & shared. Data not captured in structured fields are not
reportable
UNSTRUCTURED DATA
– Dictation
– Transcription
– Voice recognition
– Free text
– Memo fields
STRUCTURED DATA
– Radio buttons
– Locked‐down pick‐lists
– Checkboxes
– NDC‐ID (Meds)
– ICD‐9/10, SNOMED (Dx)
– LOINC (Labs)
– CPT (Procedures)
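As a quick illustration of why this matters for reporting, here is a minimal sketch with hypothetical records (not an EHR schema): a coded result can be selected with one exact filter, while free text requires fragile matching. LOINC 4548-4 is the code for Hemoglobin A1c; everything else is made up for the example.

# Structured: one exact, unambiguous filter on the code.
structured_results = [
    {"loinc": "4548-4", "value": 7.2},   # Hemoglobin A1c
    {"loinc": "4548-4", "value": 6.4},
    {"loinc": "2093-3", "value": 180},   # a different lab
]
a1c_values = [r["value"] for r in structured_results if r["loinc"] == "4548-4"]
print(a1c_values)  # [7.2, 6.4]

# Unstructured: reporting would require text matching that misses wording
# variants ("a1c", "HgbA1C", "glycosylated hemoglobin", ...).
free_text_notes = [
    "pt's sugar reviewed, a1c about 7.2 per outside lab",
    "HgbA1C 6.4 today, continue current plan",
]
a1c_mentions = [n for n in free_text_notes if "a1c" in n.lower()]
print(len(a1c_mentions))  # 2 here, but only because the wording happens to match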
Medical Encounter Definition
• CPCI typically identifies primary care medical encounters by
either:
– CPT code
– Visit type
• Defining and validating medical encounters is done during
initial CPCI integration
• It is possible and not uncommon ‐ despite best efforts during
integration ‐ for Azara to be missing some CPT codes
• If the list of CPT codes or Visit Types changes, or Azara does
not have a complete list, mapping may need to be updated
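A minimal sketch of the idea, with example CPT codes and visit types standing in for your center's actual mapping (this is not the CPCI's code or your real list):

# Flagging which visits count as primary care medical encounters.
MEDICAL_CPT_CODES = {"99212", "99213", "99214", "99395"}   # example E/M and preventive codes
MEDICAL_VISIT_TYPES = {"Office Visit", "Physical", "Follow-Up"}

def is_medical_encounter(visit):
    return (visit.get("cpt") in MEDICAL_CPT_CODES
            or visit.get("visit_type") in MEDICAL_VISIT_TYPES)

visits = [
    {"cpt": "99213", "visit_type": "Office Visit"},    # counted
    {"cpt": "D1110", "visit_type": "Dental Hygiene"},  # dental code, not counted
    {"cpt": None,    "visit_type": "Nurse Only"},      # not counted; check mapping if it should be
]
print([is_medical_encounter(v) for v in visits])  # [True, False, False]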
Lab Names and Interface Changes
• Lab Result names can change, either at the whim of external labs, or if you
update the name of in‐house labs
• If Azara doesn’t know about the name change, the DRVS code will continue
looking for the old name, and fail to pick up the new results
Example: "Glycosylated A1c" (discontinued) → "HbA1c" (recognized)
*NOTE: For illustration purposes only. Not a recommendation or best practice. Workflow, roles, terminology, etc will vary
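A minimal sketch of why a rename breaks reporting until the mapping is updated; the alias list and names below are hypothetical, not your actual lab dictionary or Azara's mapping.

# Map all known aliases of a lab result to one concept.
A1C_ALIASES = {"glycosylated a1c", "hba1c", "hemoglobin a1c", "hgb a1c"}

def is_a1c(result_name):
    return result_name.strip().lower() in A1C_ALIASES

incoming = ["Glycosylated A1c", "HbA1c", "HGB A1C"]
print([is_a1c(n) for n in incoming])  # [True, True, True]

# If the lab starts sending a brand-new name, it will not match until the
# alias list / mapping is updated, so results silently drop out of the measure.
print(is_a1c("A1c, POC"))  # False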
Medications
• Some measures have medication requirements for the
numerator, and in some cases the denominator
• CPCI uses NDCs (National Drug Codes) to identify medications
• NDC is a unique 10‐digit, 3‐segment numeric identifier assigned
to each medication, and identifies the labeler or vendor,
product, and trade package
• Not all centers have medications coded with NDC
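For illustration, a minimal sketch of splitting a hyphenated 10-digit NDC into its three segments. The NDC below is made up; real 10-digit NDCs come in 4-4-2, 5-3-2, or 5-4-1 layouts, and billing systems often pad them to 11 digits.

def parse_ndc(ndc):
    # labeler/vendor - product - trade package
    labeler, product, package = ndc.split("-")
    return {"labeler": labeler, "product": product, "package": package}

print(parse_ndc("1234-5678-90"))
# {'labeler': '1234', 'product': '5678', 'package': '90'}  (a 4-4-2 example)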
Patient Demographics
• Data quality issues are typically found when new
items are added to the EHR for Race, Ethnicity, or
Language and not mapped
• When field in EHR is free text, staff may be entering
demographics inconsistently
• ‘Needs Update‘ in CPCI report indicates that data is
missing for some patients
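A minimal sketch of the kind of check involved: finding patients whose Race value is blank (they would surface as 'Needs Update') or entered as free text that does not match the mapped list. The values below are assumptions for the example, not your EHR's actual pick-list.

MAPPED_RACE_VALUES = {"White", "Black/African American", "Asian", "Unreported"}

patients = [
    {"id": 1, "race": "Asian"},
    {"id": 2, "race": ""},               # missing -> 'Needs Update'
    {"id": 3, "race": "Afr. American"},  # free-text variant, not mapped
]

missing = [p["id"] for p in patients if not p["race"]]
unmapped = [p["id"] for p in patients if p["race"] and p["race"] not in MAPPED_RACE_VALUES]
print(missing, unmapped)  # [2] [3]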
Patient Populations
• Some centers have a need to exclude specialty‐only
patients. Patients that come only for behavioral health
or dentistry generally shouldn’t be included in primary
care quality measures.
• In CPCI, you can filter for specific providers, e.g., exclude
all dentists
• New functionality has been added to allow centers to
define “Service Lines”. Contact Azara to set this up for
your center.
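A minimal sketch of excluding specialty-only patients, assuming hypothetical department names rather than the CPCI's actual Service Line configuration.

PRIMARY_CARE_DEPTS = {"Medical", "Pediatrics", "OB/GYN"}

patients = {
    "A": ["Medical", "Dental"],   # has a primary care visit -> include
    "B": ["Dental"],              # dental only -> exclude
    "C": ["Behavioral Health"],   # behavioral health only -> exclude
}

included = [pid for pid, depts in patients.items()
            if any(d in PRIMARY_CARE_DEPTS for d in depts)]
print(included)  # ['A']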
Connectivity
• CPCI is updated nightly
• Ability to report timely data is dependent on consistent
connectivity to your systems
• Common issues: server migrations, firewall changes, expiring access credentials, and failed replications from production can cause data gaps (see the sketch below)
• Azara needs to know about changes to server configurations,
IP addresses, security or back‐up systems that run the EHR
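A minimal sketch of a freshness check a center might run on its own feed log to notice gaps early; the feed names and dates are hypothetical, and this is not how the CPCI itself monitors connectivity.

from datetime import date, timedelta

def stale_feeds(last_load_dates, today, max_age_days=2):
    # Return feeds whose most recent successful load is older than the threshold.
    cutoff = today - timedelta(days=max_age_days)
    return [name for name, loaded in last_load_dates.items() if loaded < cutoff]

feeds = {"EHR": date(2014, 4, 8), "EPM": date(2014, 4, 1)}
print(stale_feeds(feeds, today=date(2014, 4, 9)))  # ['EPM']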
Validation is an ongoing process
• After go‐live, consider investing additional time in validating CPCI reports against trusted internal reports for critical measures
• Notify Azara when something changes in your EHR or clinical workflow that impacts how and where data is captured, including data elements or new locations
• Most EHR upgrades, if properly managed, do not create a disruption for the CPCI. Please let Azara know 6–8 weeks prior to an EHR upgrade
• If you notice an issue at any time, click the Report Issue button on the CPCI menu bar or homepage
Ongoing Change Management
• Incorporate CPCI into IT change management
processes
– Who is responsible for assessing the impact on CPCI of
EHR configuration changes/upgrades?
– Is the CPCI considered when standard workflows change?
– Who is responsible for contacting Azara support when
there is a data quality question or concern?
Data Validation Process
Data Validation Process
1. Identify issues and discrepancies
2. Analyze the root cause
3. Determine next steps
Identifying Issues
• Prioritize measures to validate (e.g., key grant
measures, UDS)
• Watch trends for unexplained changes
• Identify unexpected outliers
• Flag measures that don’t seem ‘right’
• Compare to internal reports
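A minimal sketch of watching trends for unexplained changes; the monthly rates and the 10-point threshold are assumptions for the example.

def flag_swings(monthly_rates, threshold=0.10):
    # Return (month, previous rate, current rate) wherever the month-over-month
    # change exceeds the threshold, so someone reviews it before trusting the trend.
    flags = []
    months = sorted(monthly_rates)
    for prev, curr in zip(months, months[1:]):
        if abs(monthly_rates[curr] - monthly_rates[prev]) > threshold:
            flags.append((curr, monthly_rates[prev], monthly_rates[curr]))
    return flags

rates = {"2014-01": 0.62, "2014-02": 0.61, "2014-03": 0.43}  # March drops 18 points
print(flag_swings(rates))  # [('2014-03', 0.61, 0.43)]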
Considerations when Comparing Reports
• Make sure you’re comparing ‘apples to apples’
– Compare measure specifications
– Run CPCI and comparison reports with the same
filters/parameters
• Some variation can be expected, particularly when
comparing to reports based on manual chart reviews
• Comparison reports run from another data
warehouse or other intermediate data source may
introduce additional complexities
Identifying a Measure to Validate
[Screenshot: selecting a measure to validate, with a "View specification" callout. Data represents a fictitious environment. No PHI is being revealed.]
Checking the Measure Specification
[Screenshot: measure specification, with a "Link to more details" callout.]
Verifying Parameters
• Differences between reports and CPCI can sometimes be traced to
differences in the filter settings, such as time period being reviewed
• Access filters from the menu bar
• Click Update Report to re‐run the measure with selected parameters
[Screenshot callouts: Display/hide filters; Update Results.]
Drilling Down: Measure Analyzer
• View details of any measure including parameters, measure specification,
provider‐level detail, patient detail and more
• Launch the Measure Analyzer by selecting the measure name or selecting
from the Measures menu
[Screenshot callout: select the Measure name to launch the Measure Analyzer.]
Examining Performance by Location
• If your health center
has multiple locations,
you may learn more by
comparing
performance across
locations
• Use the drop down list
to view this measure by
location
– Do the differences make sense?
[Screenshot: location‐level detail.]
Examining Performance by Provider
• To view the relative
performance of
providers on this
measure:
– View by Provider
– Change format to
“Table”
• To focus on providers
with the most patients,
sort by Denominator
[Screenshot callouts: Provider level detail option; select Table format; sort by clicking column headings.]
Drilling Into Unexplained Differences
• How are different providers performing on the measure?
– An unexpected high/low pattern may indicate a difference in workflow
or coding inconsistencies across providers
– Unexpectedly high or low numbers across all providers may indicate a
mapping issue
[Screenshot callout: drill in to further analyze the results for this provider only.]
Accessing Patient‐Level Detail
• To further understand a provider’s performance on a
particular measure, select the Detail List button from the
Measure Analyzer menu to drill into the patient level details
[Screenshot callout: display patient‐level detail for this provider.]
Patient‐Level Detail Spot‐Check
• Spot check about 10 patients against data in EHR
• 5 patients in the numerator ( ‘1’ in the Numerator column)
• 5 patients in the denominator only (‘0’ in the Numerator column)
• Consider exclusions if applicable (‘1’ in the Exclusion column)
• Export to Excel for further analysis (a spot‐check sketch follows below)
[Screenshot callouts: Export list to Excel; hover over a name to display patient details; page through results.]
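The sketch referenced above: picking a spot-check sample from an exported patient-level detail list (5 patients counted in the numerator, 5 in the denominator only) to compare against the EHR. The column names are assumptions about the export layout, not the CPCI's actual file format.

import random

def spot_check_sample(rows, n_each=5, seed=0):
    # Patients counted in the numerator (and not excluded).
    numerator_hits = [r for r in rows if r["numerator"] == 1 and r["exclusion"] == 0]
    # Patients in the denominator only (numerator = 0, not excluded).
    denominator_only = [r for r in rows if r["numerator"] == 0 and r["exclusion"] == 0]
    rng = random.Random(seed)
    return (rng.sample(numerator_hits, min(n_each, len(numerator_hits))),
            rng.sample(denominator_only, min(n_each, len(denominator_only))))

rows = [{"patient": i, "numerator": i % 2, "exclusion": 0} for i in range(20)]
in_numerator, denom_only = spot_check_sample(rows)
print(len(in_numerator), len(denom_only))  # 5 5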
Spot Check ‐ Questions
• Is this patient’s numerator & denominator reported correctly in
the CPCI report, according to the measure specification?
– Patient should be included in the numerator, but is not?
– Patient should NOT be included in the denominator, but is?
• If the data is in the EHR but not reflected in CPCI:
– Is the data in a structured field in EHR?
– Where is the data located in your EHR?
• Contact CHCANYS for assistance. An Excel tool is available to
help facilitate this analysis.
Reviewing Your Mappings
• A Data Mapping Document for your health center is available
in the Help section of the CPCI
• Explains how your EHR database is mapped to the CPCI
including your Medical encounter codes, medications, labs,
etc.
• Most appropriate for technical staff
[Screenshot callout: the Mapping Document will be linked here.]
Next Steps: Addressing Issues
Perceived Issue: Data missing or inconsistent in EHR
Possible Root Cause: Documentation/workflow issue; EHR bug/limitation; EHR configuration; unstructured data
Next Steps: Workflow analysis; provider training
Primary Support Resources: EHR vendor; CHCANYS
• Contact CHCANYS:
– Natalya Malamud, IT Project Manager
([email protected])
Data Quality Collaborative
Data Quality Collaborative
• New group forming with the purpose of assisting health
centers with data quality concerns
• Collaborative, peer‐to‐peer discussions facilitated by Dr.
Warria Esmond of Settlement Health and CHCANYS
• Conference calls held quarterly for 90 minutes
• Kickoff agenda developed by chairs, with subsequent
agendas driven by the needs and interests of the group
• Future meetings may include breakouts into EHR‐specific discussion sections
• Please email Natalya Malamud at
[email protected] to express interest
Evaluation
Please complete our brief survey
https://www.surveymonkey.com/s/3LG8YQZ