Conceptualizing and Measuring Data Use

ACKNOWLEDGMENTS

The authors thank the United States Agency for International Development (USAID) for its support of this
work. For technical review and input, we thank Heidi Reynolds and Manish Kumar, MEASURE Evaluation,
University of North Carolina, Chapel Hill; Tariq Azim, MEASURE Evaluation, John Snow, Inc.; and Shannon
Salentine, MEASURE Evaluation, ICF. We also thank the knowledge management team at MEASURE
Evaluation for editorial and production services.



CONTENTS

Acknowledgments
Contents
Figures
Tables
Abbreviations
Introduction
Data Use in the Health Information System Strengthening Model
MEASURE Evaluation Logic Model for Improving Data Use
Mapping the DDU Logic Model to the HISSM
Summary of Tools to Measure Data Use
Discussion
Conclusion
References
Appendix A. Indicators to Monitor the Implementation of Activities to Strengthen the Demand for and Use of Data



FIGURES

Figure 1. The World Health Organization (WHO) Health Systems Framework
Figure 2. MEASURE Evaluation Health Information System Strengthening Model
Figure 3. MEASURE Evaluation continuum of data use
Figure 4. MEASURE Evaluation DDU Logic Model

TABLES

Table 1. Mapping interventions in the DDU Logic Model to areas in the HISSM
Table 2. Comparison of assessment tools to measure data use



ABBREVIATIONS

DDU data demand and use


DQA data quality assurance
HIS health information system(s)
HISSM Health Information System Strengthening Model
M&E monitoring and evaluation
MECAT Monitoring and Evaluation Capacity Assessment Toolkit
MEval-PIMA MEASURE Evaluation PIMA
OBAT Organizational and Behavioral Assessment Tool
PEPFAR United States President’s Emergency Plan for AIDS Relief
PRISM Performance of Routine Information System Management
RHIS routine health information system
USAID United States Agency for International Development
WHO World Health Organization



INTRODUCTION

Health information is one of the six core functions of the health system (Figure 1) (World Health Organization
[WHO], 2007). The purpose of a health information system (HIS) is to produce high-quality information that
can be used at all levels of a health system for decision making about program monitoring and review; program
planning and improvement; advocacy; policy; and health strategy planning and implementation. Although each
core function is important for the improvement of a health system and, ultimately, for better health outcomes,
high-quality and timely data from the HIS are the foundation of the overall system. Health data inform
decision making in each of the other five core functions (i.e., service delivery, health workforce, access to
essential medicines, financing, leadership and governance) (AbouZahr & Boerma, 2005). Strengthening the
HIS is a priority on many global and national health agendas as a way to improve health outcomes.

Figure 1. The World Health Organization (WHO) Health Systems Framework

Source: WHO, 2010

To monitor and evaluate the success of HIS strengthening interventions, it is critical to measure the outputs of
data quality and data use. Definitions of and methods for the monitoring and measurement of improvements
in data quality are well developed (i.e., accuracy, reliability, precision, completeness, timeliness, integrity, and
confidentiality) (MEASURE Evaluation, n.d.). However, definitions and methods for monitoring and
measuring data use for decision making have proven more challenging. Different types of data users and
producers contribute to and employ the HIS in complex ways, and there is not always consensus about the
actions that constitute data use. For example, data sharing, visualization, dissemination, and review are often
considered cases of data use. In the literature, measures of data use have included such dimensions as
transparency, timeliness, visibility, accessibility, dissemination of information, calculation of key indicators,
preparation of information products, and presentation of the achievement of targets (Abajebel, Jira, & Beyene,
2011; Mwencha, Rosen, Spisak, Watson, Kisoka, & Mberesero, 2017). Measuring the use of data is challenging
because it is affected by diverse factors, such as decision-making processes; ongoing sector-wide HIS
strengthening activities to improve data availability and quality; actors across different levels in the health



system; and information flows. Unlike data quality, data use has no standard approach for definition and
measurement.

MEASURE Evaluation is at the forefront of developing guidance for the monitoring and measurement of data
use―a key output of HIS strengthening. This paper has the following purposes:

• Expand on the Health Information System Strengthening Model (HISSM) definition and
conceptualization of the use of data, especially for acting on and implementing decisions related to
health system performance.

• Describe activity areas to strengthen the demand for and use of data for decision making.

• Summarize indicators to measure the process and outputs of data use.

• Review tools to measure the dimensions of data use.



DATA USE IN THE HEALTH INFORMATION SYSTEM
STRENGTHENING MODEL

MEASURE Evaluation developed a model (Figure 2) for strengthening the HIS in low- and middle-income
countries: the HISSM (MEASURE Evaluation, 2017a). Its purpose is to explore ways to promote the HIS as
an essential function of a health system; define HIS strengthening; measure HIS performance; and monitor and
evaluate HIS interventions.

Figure 2. MEASURE Evaluation Health Information System Strengthening Model

As shown in the model, HIS strengthening is the implementation of one or more interventions targeting one
or more components of the HIS to improve the quality and use of data for decision making at all levels of the
health system. The output of a strengthened HIS is measured by data quality and data use, that is, the
improved availability of high-quality data that are used on a continuous basis for decision making at all levels of
the health system.

As described in the HISSM, data use involves two main stages: (1) improving the HIS; and (2) improving the
performance of health programs, with the ultimate goal to improve the functioning of the health system and
improve health outcomes (Figure 3). The first stage consists of steps to enhance the HIS: the analysis and
synthesis of data to identify data quality issues for improvement; the generation of health statistics to answer
key health questions; and the development of tailored information products to synthesize and disseminate
findings. The second stage of data use includes steps to drive data-informed decision making for health
program improvement. This conceptualization of data use requires that data are reviewed as part of a specific
decision-making process, for example, to create or revise a health program strategy or work plan; to develop or
revise a policy; to advocate for a policy or program; to allocate resources; or to monitor program performance.
Following the data review and interpretation process, a data-informed recommendation is submitted to a
higher level of management or a decision maker with a request for action, the decision to act is made, and
follow-up actions are implemented that lead to improved health outcomes.

Figure 3. MEASURE Evaluation continuum of data use

(Figure: a three-stage continuum. Improve the HIS: improve data quality; generate health statistics; develop information products. Improve health program performance: data review and interpretation; data-informed advocacy; decision made; action implemented. The continuum leads to improved health system functioning: improved health system outcomes and improved health outcomes.)
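Because the continuum is ordered, one practical way to operationalize it for measurement is to record the furthest stage that a given decision-making episode reached. The sketch below is a minimal illustration of that idea in Python; the stage names come from Figure 3 and Table 2, but the code itself is ours and not part of the HISSM.

```python
from enum import IntEnum

class DataUseStage(IntEnum):
    """Ordered stages of the data use continuum (Figure 3 / Table 2)."""
    DATA_QUALITY_IMPROVED = 1
    HEALTH_STATISTICS_GENERATED = 2
    INFORMATION_PRODUCTS_DEVELOPED = 3
    DATA_REVIEWED_AND_INTERPRETED = 4
    DATA_INFORMED_ADVOCACY = 5
    DECISION_MADE = 6
    ACTION_IMPLEMENTED = 7

def furthest_stage(observed: set[DataUseStage]) -> DataUseStage:
    """Return the furthest point on the continuum reached in one episode."""
    return max(observed)

# Hypothetical episode: data were reviewed and a decision was made,
# but no follow-up action has been implemented yet.
episode = {
    DataUseStage.INFORMATION_PRODUCTS_DEVELOPED,
    DataUseStage.DATA_REVIEWED_AND_INTERPRETED,
    DataUseStage.DECISION_MADE,
}
print(furthest_stage(episode).name)  # DECISION_MADE
```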

The HISSM does not fully expand on the second aspect of data use, that is, decisions made and acted on to
improve health programs. This aspect of decision making, which moves a data-informed recommendation to
an implemented action, often involves engaging decision makers who may have competing priorities, biases,
and values. Decisions may be more influenced by factors other than data, including the availability of funds to
implement decisions, political jockeying, donor pressure, personal interests, and competing agendas. Moreover,
decision-making authority may lie with organizations and departments outside those managing the HIS (i.e.,
decision makers from various health system functions, including service delivery, human resources,
commodities/infrastructure, financing, stewardship) and those outside the health sector (such as policy units,
finance commissions, etc.).

The development of skills in data communication, data advocacy, and leadership is needed to increase the
capacity of decision makers to influence and act on data-informed recommendations to achieve and sustain
improved outcomes in health system performance. Strong coordination and feedback loops are also necessary
to ensure the availability of relevant data that respond to information needs of multisectoral decision makers at
prime decision-making opportunities. There is often a lack of interaction and understanding of roles and
responsibilities between those working in HIS strengthening and the decision makers who are the target
audience for using data to inform program planning, policy, and service delivery decisions. The HISSM does
not focus extensively on the engagement of stakeholders outside the HIS and monitoring and evaluation
(M&E) domains. To address this, MEASURE Evaluation developed a logic model that describes the role that
data use plays in strengthening the health system.



MEASURE EVALUATION LOGIC MODEL FOR IMPROVING DATA
USE

MEASURE Evaluation developed the Data Demand and Use (DDU) Logic Model to describe the specific
activities and interventions needed to improve the use of health data for improved health programs and
policies (see Figure 4). The model maps the influence of data use intervention inputs and activities on the
outputs and outcomes of routine and sustained use of data in program review, planning, and policy. It also
outlines the theoretical assumptions under which the interventions are intended to influence data use and
health outcomes (Nutley & Reynolds, 2013). It specifies and provides a practical strategy for developing,
monitoring, and evaluating interventions to strengthen the use of data in decision making. The model
comprises eight domains of activities that have been identified in the literature and through MEASURE
Evaluation’s implementation experiences as critical to affect the technical, behavioral, and organizational
determinants of data-informed decision making. The domains are:

1. Assessing and improving the data use context

2. Engaging data users and data producers

3. Improving data quality

4. Improving data availability

5. Identifying information needs

6. Building capacity in data use core competencies

7. Strengthening the organization’s DDU infrastructure

8. Monitoring, evaluating, and communicating DDU successes

Table 1 presents examples of activities in each of the eight domains of the DDU Logic Model.



Figure 4. MEASURE Evaluation DDU Logic Model

Source: Nutley & Reynolds, 2013



Activities to strengthen the demand for and use of data are built on a foundation of inputs essential to
implementation, including resources, indicator definitions, data sources, and data management. For the purposes
of the DDU Logic Model, these inputs are informed by HIS inputs and processes defined by the Health Metrics
Network (Health Metrics Network, 2008) because efforts to improve the demand for and use of information will
only be successful if they are implemented in an HIS that is functioning effectively or is in the process of being
strengthened.

The data use strengthening activities lead to such outcomes as improved individual DDU skills and capacity,
institutionalized DDU procedures and policies, and a long-term outcome of improved and sustained DDU.
Indicators to measure the process of strengthening data use are summarized in Box 3 and detailed in Appendix A.

MAPPING THE DDU LOGIC MODEL TO THE HISSM


The HISSM and the DDU Logic Model both employ a systems-level approach to improve the HIS and data use.
However, the HISSM and the DDU Logic Model offer two different lenses through which we can conceptualize
and unpack data use.

The HISSM describes data use in the overall context of HIS strengthening and considers it to be an output of a
strengthened HIS. The DDU Logic Model is built on the assumption that efforts to improve the use of data are
successful only when implemented as part of long-term HIS strengthening activities, such as those outlined in the
HISSM model (for example, legislative, regulative, and planning frameworks; resources, such as personnel,
financing, information and communications technology; and indicators, data sources, and data management). On
the other hand, the DDU Logic Model describes a subset of HIS strengthening activities that are most likely to
catalyze improved and sustained data-informed decision making. This model builds on the HISSM by providing
specific and detailed ways to support the use of HIS data. DDU interventions are not necessarily unique to
DDU. For example, activities to improve data quality also strengthen the output and performance of the HIS.
Moreover, the DDU Logic Model includes activities to engage with multisectoral stakeholders outside the HIS
environment who are needed to advocate for and implement decisions based on HIS data. Table 1 maps the
DDU Logic Model to the areas and subareas of the HISSM.



Table 1. Mapping interventions in the DDU Logic Model to areas in the HISSM

HISSM area (enabling environment): HIS governance and leadership consist of legislation that outlines specific activities under the HIS. It also involves partnerships and coalition building to leverage resources; governance structures, policies, and standards; HIS financing; and the existence of HIS champions.

Illustrative DDU strengthening activities:

Strengthening DDU infrastructure
• Develop data-informed normative health sector guidance (e.g., strategic plans)
• Institutionalize governance structures to regularly review data and program progress (e.g., technical working groups)
• Develop organizational guidance and standardize job descriptions for data user and producer roles in M&E, program and data review, program planning, research, and policy processes
• Develop protocols and guidelines to govern data processes and clearly support data-informed decision making (e.g., data management; data quality assessment [DQA]; timely data synthesis and dissemination; data review; data use framework)
• Prioritize data-informed management, leadership, and advocacy to support data-informed recommendation development, planning, organizing, and budgeting for DDU activities (e.g., human resources)

Identify and engage data users and data producers
• Assess and identify stakeholders
• Ensure data user participation in M&E and HIS design and development processes
• Develop organizational guidance and clarify roles for data user and producer engagement in program planning, monitoring, and policy development processes
• Include data users in M&E and research training
• Convene working groups to regularly review data and program progress, and identify programmatic questions/data needs
• Jointly analyze and interpret data

Identify information needs
• Implement the Framework for Linking Data with Action tool¹
• Ensure that the HIS design responds to the information needs and presentation preferences of data users
• Identify upcoming decisions, link decisions and questions to existing data sources, and identify data gaps

Comparison: Both models emphasize the importance of the organizational context. Systems with clear guidelines, strong leadership and governance structures, and defined roles and responsibilities are better positioned to support HIS strengthening and DDU. Strong HIS leadership and governance are needed to identify and engage with key decision makers (especially those outside the HIS domain) and to institutionalize governance structures that regularly bring together data users and data producers to review and employ data during opportune decision-making moments. Organizational supports, such as organizational guidance and clarifying roles and responsibilities in program planning and monitoring, can also improve the engagement of data users and data producers.

HISSM area (enabling environment): HIS management consists of planning and organizing HIS activities and resources, financial management for HIS, information management, and infrastructure development.

Illustrative DDU strengthening activities:

Assess and improve the data use context
• Assess the organizational, technical, and behavioral factors affecting decision making

Monitor, evaluate, and communicate results of DDU interventions
• Monitor and evaluate data use interventions
• Document DDU successes
• Develop DDU advocacy materials
• Widely disseminate DDU successes to various audiences in appropriate formats

Comparison: Assessing, monitoring, and evaluating data use interventions are essential HIS activities that should be planned and budgeted under HIS management processes. An initial assessment of the data use context is important to guide the adaptation of interventions to improve data use. This is highlighted in the DDU Logic Model because many HIS assessments do not adequately evaluate the organizational and behavioral factors that most proximally affect data use. Monitoring data use outputs is a core aspect of both models. Communicating the results of data use interventions by highlighting the links among data use, advocacy, and improved service delivery also helps build the value of data use, generates data demand, and reinforces the benefits of investments in data use interventions.

HISSM area (human element): Sustained data-informed decision making requires a dedicated workforce made up of individuals in various job functions who are motivated to collect, analyze, review, and discuss data.

Illustrative DDU strengthening activities:

Build capacity in core data use competencies
• Capacity building in data analysis, interpretation, synthesis, presentation, and communication
• Training and coaching in data-informed leadership and advocacy
• Apply and implement DDU procedures, guidelines, policies, and support mechanisms
• Manage change around adopting a culture of data use

Comparison: The human element is foundational for both HIS and DDU strengthening. The DDU Logic Model emphasizes the importance of the human element to build a culture of data use through effective management and communication and collaboration between data users and data producers. It defines data users and data producers and underlines the specific core competencies that these cadres need to strengthen their ability to use information.

HISSM area (information generation): Data sources include institution-based sources, population-based surveys, and mixed data sources.

Illustrative DDU strengthening activities: In the DDU Logic Model, data sources are considered foundational elements of a functioning HIS and necessary inputs to the success of DDU interventions.

Comparison: Data sources are highlighted in the HISSM, but not in the DDU Logic Model.

HISSM area (information generation): Data management refers to data collection and storage, ensuring data quality, and data processing and compilation.

Illustrative DDU strengthening activities:

Improve data quality
• Develop and disseminate data quality protocols and tools
• Standardize data collection processes and simplify/improve the design and usability of data collection forms
• Training on data entry, data management, and DQA
• Regular data quality review meetings
• Conduct supportive supervision/mentorship
• Conduct regular data quality audits

Comparison: In the DDU Logic Model, data management is an input that serves as a prerequisite to the success of DDU interventions (e.g., data collection, cleaning, processing, and management). However, activities to improve data quality are specifically highlighted in the model as one of the interventions most proximate to improving the use of data in decision making.

HISSM area (information generation): The creation, generation, and dissemination of information products for a variety of users and purposes.

Illustrative DDU strengthening activities:

Improve data availability
• Create interoperable data systems
• Develop a data dissemination and communication plan
• Synthesize data and develop information products for different data user audiences, responding to their data needs
• Develop standard auto-generated reports
• Actively disseminate information products bidirectionally
• Develop multidirectional feedback mechanisms for data sharing

Comparison: Both models highlight the importance of targeted, summarized, and synthesized data in the form of visualizations and/or information products that are easily understood and relevant to decision makers. The DDU Logic Model further highlights the importance of strengthening access to data, for example, by linking data sources and integrating fragmented information systems.

¹ The Framework for Linking Data with Action is a management tool that brings together data users and producers to identify programmatic priorities, understand key performance indicators, identify the types of analyses needed to inform regular decisions, conduct basic data analysis and interpretation, and use their findings for decision making. It is available at https://www.measureevaluation.org/resources/publications/ms-11-46-b.


SUMMARY OF TOOLS TO MEASURE DATA USE

MEASURE Evaluation has developed and applied several tools to measure the dimensions of data use. This
section provides an overview of the assessment tools and the measures of data use that have been employed by
the project to monitor the process of strengthening data use both to improve the HIS and to improve health
programs. A summary of the purpose, framework, and the stages of the data use continuum that each tool
measures is provided in Table 2.² Each tool is then discussed in detail.

² Table 2 does not include tools that focus solely on the measurement of data quality.



Table 2. Comparison of assessment tools to measure data use

The data use continuum comprises seven stages: data quality, health statistics, information products, data review, advocacy, decision, and action.

Tool: PRISM
Purpose: Assess the performance of a RHIS
Framework: Technical, organizational, and individual barriers to data quality, data analysis practices, and use of information
Examples of assessment statements for data use:
• Management of RHIS and/or discussion about RHIS findings reviewed during routine meetings
• Have they made any decisions based on these discussions?
• Has any follow-up action taken place on the decisions made during previous meetings?
• Are there any RHIS-related issues that have been referred to the regional/national level for action?
Stages of the data use continuum measured: all seven

Tool: RHIS Rapid Assessment Tool
Purpose: Rapid assessment of local health information systems against global standards
Framework: WHO Health Facility and Community Information System Toolkit; MEASURE Evaluation Guidelines on Data Management Standards
Examples of assessment statements for data use:
• Health planners use the results of the analysis of facility data to produce analytical reports on progress and performance for health sector review
• Appropriate staff have received training in data analysis
• Periodic data summaries (e.g., bulletins) are produced and distributed
• Dashboards and summary charts are used to convey information to diverse target audiences
• There is a comprehensive data dissemination strategy
• There is demand for information from donors, policy makers, program planners, etc.
• Facility and community-based data are used in health sector planning
• Facility managers use data to improve infrastructure, equipment, and human resources
Stages of the data use continuum measured: four of seven

Tool: Assessment of Barriers to Data Use in the Health Sector Toolkit
Purpose: Monitor progress in improving the use of data
Framework: Technical, organizational, and individual barriers to data use across the eight intervention areas of the DDU Logic Model
Examples of assessment statements for data use (qualitative analysis synthesizing barriers to DDU intervention areas):
• In the past 12 months, the quality of data available has been sufficiently adequate that it can confidently be used in decision making.
• Information products are regularly sent to a wide variety of stakeholders.
• There are guidelines to support the analysis, presentation, and use of data.
• Data review meetings are held quarterly at the subnational level to discuss key program indicators.
• Can you give me some examples of times when you consulted data to inform a decision about a health service?
• How often do you think decisions in your organization are informed by data?
Stages of the data use continuum measured: five of seven

Tool: 12 Components M&E Systems Strengthening Tool
Purpose: Assess a national M&E system
Framework: Status of elements across the 12 components of a national HIV M&E system
Examples of assessment statements for data use:
• Information products are regularly disseminated to data providers.
• Information products are regularly sent to a wide variety of stakeholders, other than the data providers.
• National and subnational information products meet stakeholders’ information needs.
• There are guidelines to support the analysis, presentation, and use of data at the facility level.
Stages of the data use continuum measured: three of seven

Tool: MECAT
Purpose: Assess an organization's capacity and performance in M&E
Framework: Existence, quality, and autonomy of elements across the 12 components of a national HIV M&E system
Examples of assessment statements for data use (existence, quality, and financial/technical autonomy in the development of):
• an organizational data use plan
• information products
• data analysis and presentation guidelines
Stages of the data use continuum measured: three of seven


Performance of Routine Information System Management

The Performance of Routine Information System Management (PRISM) toolkit, developed by MEASURE
Evaluation, assesses the broad context in which routine health information systems (RHIS)³ operate. The
framework asserts that RHIS performance, defined as quality data that are continually used in decision making, is
a function of RHIS processes and their behavioral, technical, and organizational determinants. The PRISM
toolkit consists of four tools that are administered to comprehensively assess RHIS performance; identify the
technical, behavioral, and organizational factors affecting RHIS performance; aid in designing and prioritizing
multidimensional interventions to improve RHIS performance; and support ongoing efforts to monitor and
evaluate data quality and data use. PRISM can be applied to quantitatively assess data use across the data use
continuum. It employs a series of dichotomous indicators to assess whether RHIS information is discussed in
staff meetings, whether decisions evolved from these discussions, and whether these decisions were referred to
upper management for action (Box 1).

The four PRISM tools are: (1) RHIS Performance Diagnostic Tool; (2) RHIS Overview and Facility/Office
Checklist; (3) Organizational and Behavioral Assessment Tool (OBAT); and (4) RHIS Management Assessment
Tool. Depending on the implementation methodology selected, these tools can be used to understand the
existing RHIS at one point in time, identify any changes following the implementation of RHIS interventions (if
applied at two points in time), or monitor progress in data quality and data use over time (if applied routinely).
The PRISM toolkit can be used by any type of organization, such as ministries of health, health districts,
nongovernmental organizations, and private sector organizations, and across sectors. Depending on the nature of
the organization, the tool should be administered to a diverse mix of staff and at various organizational levels to
get a representative sample of the organization. For example, it can be applied at the community, facility, district,
subnational, and central levels of a health system.

Box 1: Data use measures contained in the PRISM toolkit

• Discussion of RHIS analyses
  o Were the following topics discussed during routine meetings for reviewing managerial or administrative matters: management of RHIS (data quality, reporting, or timeliness) or RHIS findings (patient utilization, disease data, service coverage, stock outs)?
• Decisions taken
  o Have they made any decisions based on the above discussions?
• Decision implemented
  o Has any follow-up action taken place on the decisions made during previous meetings?
• Decision referred to upper management for action
  o Are there any RHIS-related issues or problems referred to the regional/national level for action?

³ An HIS encompasses all health data sources required by a country to plan and implement its national health strategy. These include health facility data, surveillance data, census data, population surveys, vital event records, financial data, and logistics and supply data. RHIS comprise data collected at regular intervals at public, private, and community-level health facilities and institutions. The sources of these data are generally individual health records, records of services delivered, and records of health resources.
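Because the PRISM data use indicators are dichotomous, results from many facilities can be tabulated directly as proportions along the continuum. The sketch below is illustrative only: the field names and sample data are hypothetical and are not part of the PRISM toolkit.

```python
from dataclasses import dataclass

@dataclass
class FacilityResponse:
    """Hypothetical record of one facility's yes/no answers to the Box 1 items."""
    rhis_discussed: bool        # RHIS management or findings discussed in routine meetings
    decisions_made: bool        # decisions made based on those discussions
    actions_followed_up: bool   # follow-up action on decisions from previous meetings
    referred_upward: bool       # issues referred to the regional/national level for action

def summarize(responses: list[FacilityResponse]) -> dict[str, float]:
    """Proportion of facilities answering 'yes' to each indicator."""
    n = len(responses)
    return {
        "discussed": sum(r.rhis_discussed for r in responses) / n,
        "decided": sum(r.decisions_made for r in responses) / n,
        "acted": sum(r.actions_followed_up for r in responses) / n,
        "referred": sum(r.referred_upward for r in responses) / n,
    }

# Example with three facilities: 2/3 discussed, 1/3 decided, 0 acted, 1/3 referred
sample = [
    FacilityResponse(True, True, False, False),
    FacilityResponse(True, False, False, True),
    FacilityResponse(False, False, False, False),
]
print(summarize(sample))
```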



Overview of the Assessment Tools

The RHIS Performance Diagnostic Tool is the primary component of the PRISM toolkit; it evaluates overall
RHIS performance. This tool consists of four forms (to be administered at the facility and district or higher
levels) covering the dimensions of data quality and data use. The tools on data use deal with the production of
reports; display of information; existence of meetings to discuss RHIS information; and the use of information
for problem identification, problem solving, decision making, resource mobilization, and monitoring. There are a
series of dichotomous indicators about discussions and decisions made using RHIS information during routine
meetings.

The RHIS Overview and Facility/Office Checklist examine the technical determinants of RHIS performance,
including the structure and design of existing information systems; data collection and reporting forms;
information flows; RHIS resources; and interactions among different information systems. This tool can help
one understand how data and information flow from data collectors to users (and vice versa); inventory
information that is currently available for decision making; and identify opportunities to improve data
collection, analysis, and sharing to ensure the use of data.

The OBAT covers perceptions about the behavioral and organizational factors that affect RHIS performance. It
features rating scales and a written test to assess task competency and problem-solving skills. The tool contains
questions on data demand; the promotion of an organizational culture of information; levels of motivation and
confidence; and knowledge, competencies, and skills in RHIS tasks. It can be applied alone or in conjunction
with the RHIS Performance Diagnostic Tool to identify strengths and weaknesses in organizational processes for
promoting a culture of information and behavioral factors that affect the performance of RHIS tasks.

Last, the RHIS Management Assessment Tool looks at the management and supportive practices of the RHIS to
aid in the development of recommendations for better management of the RHIS. Although there are no specific
questions on data use, this tool assesses the larger managerial and enabling context, including such management
functions as governance, planning, training, supervision, use of performance improvement tools, quality
standards, and financial resources.

The PRISM toolkit has been applied in over 23 countries to assess RHIS performance and guide RHIS
strengthening, including in Ethiopia, Haiti, Liberia, Mozambique, Pakistan, Rwanda, South Africa, and Uganda. It
has also been employed in Côte d’Ivoire to evaluate the impact of interventions described in the DDU Logic
Model on data quality, data availability, and the use of information, using a pre- and post-test design (Nutley,
Gnassou, Traore, Bosso, & Mullen, 2014).

RHIS Rapid Assessment Tool

The RHIS Rapid Assessment Tool, developed by the WHO and MEASURE Evaluation, provides a rapid assessment of the local HIS against harmonized global standards for the data management of information systems.
This tool identifies gaps and weaknesses to facilitate planning for RHIS strengthening at any level of the health
system, including the national, subnational, district, and service delivery point levels (e.g., health facility and
community-based information systems). The RHIS Rapid Assessment Tool can be implemented in a workshop
setting with representatives from different levels of the health system, as a self-assessment involving RHIS
stakeholders, or through the deployment of assessment teams to a sample of health facilities and subnational



RHIS units. Depending on the assessment methodology selected, the RHIS Rapid Assessment Tool can be
applied as a one-off assessment prior to RHIS reform efforts or as a regular aspect of RHIS performance
assessments conducted every two to three years.

Overview of the Assessment Tool

The RHIS Rapid Assessment Tool is a checklist of standards for health facility and community information
systems that can be used for any level of the health system involved in data collection, reporting, aggregation, and
transmission of RHIS data. The checklist covers standards for the following thematic domains and subdomains:

• Management and governance, including policies, planning, and human resources

• Data and decision support needs, including data standards

• Data collection and processing, including data reporting, data quality, and information and
communication technology

• Data analysis, dissemination, and use

Standards for all domains (including data analysis, dissemination, and use) are presented as statements.
Respondents describe the extent to which the standard applies at the selected health system level using a five-
point Likert scale (0=no answer/not applicable; 1=not present, needs to be developed; 2=needs a lot of
strengthening; 3=needs some strengthening; 4=already present, no action needed). The measures of data use in
the RHIS Rapid Assessment Tool cover the use of data to improve information systems (e.g., data analysis and
the generation of health statistics; development of information products, such as data summaries and
dashboards); and the use of facility and community-based data to monitor patient care and outcomes, improve facility infrastructure, equipment, and human resources, develop service delivery strategies, and inform health sector planning. Box 2 lists examples of data use measures in the tool. Different components of the tool have
been implemented in the Gambia, Madagascar, Malawi, and Myanmar.

Box 2: Illustrative data use measures in the RHIS Rapid Assessment Tool

• Data analysis
o General principles for data cleaning/analysis of facility data are defined
(e.g., as standard operating procedures).
o Appropriate staff (i.e., facility and community information system managers,
program managers, etc.) have received training in data analysis.
• Data dissemination
o Periodic data summaries (e.g., bulletins) are produced and distributed to key
stakeholders describing key findings and interpretations.
o There is a comprehensive data dissemination strategy relevant to each level
of the health system with key products defined.
• DDU
o A culture of information use is promoted by policy leaders and decision
makers.
o There is demand for information by donors, policy makers, planners, program
managers, etc.
o Facility managers use data to improve infrastructure, equipment, and human
resources.
o Facility and community-based data are used in health sector planning (e.g., health sector reviews).
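Since every standard in the checklist is rated on the same 0-4 scale, responses can be rolled up by thematic domain to show where strengthening is most needed. A minimal sketch follows; the domain and standard strings are paraphrased from the tool, but the aggregation logic and the sample data are our own illustration.

```python
from statistics import mean

# Hypothetical (domain, standard, score) responses on the tool's 0-4 scale:
# 0 = no answer/not applicable; 1 = not present, needs to be developed;
# 2 = needs a lot of strengthening; 3 = needs some strengthening; 4 = already present.
responses = [
    ("Data analysis, dissemination, and use", "Staff trained in data analysis", 3),
    ("Data analysis, dissemination, and use", "Data dissemination strategy exists", 1),
    ("Data collection and processing", "Data quality procedures defined", 2),
]

def domain_means(rows) -> dict[str, float]:
    """Mean score per domain, excluding 0 (no answer/not applicable) responses."""
    by_domain: dict[str, list[int]] = {}
    for domain, _standard, score in rows:
        if score > 0:
            by_domain.setdefault(domain, []).append(score)
    return {d: round(mean(scores), 2) for d, scores in by_domain.items()}

def needs_development(rows) -> list[str]:
    """Standards rated 1 ('not present, needs to be developed')."""
    return [standard for _d, standard, score in rows if score == 1]

print(domain_means(responses))       # mean rating per thematic domain
print(needs_development(responses))  # ['Data dissemination strategy exists']
```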
Assessing Barriers to Data Use in the Health Sector: A Toolkit

Assessing Barriers to Data Use in the Health Sector: A Toolkit—a collection of four assessment tools—can be
used to measure the status of data use (if applied at one point in time) and progress toward improved use in an
organization (if applied at two or more points in time) (MEASURE Evaluation, 2018).⁴ These tools serve three
purposes:

• Identify existing barriers and constraints to data use

• Identify factors that facilitate data use

• Help in designing and prioritizing an action plan to address the barriers and constraints to data use

The tools are (1) In-Depth Interview Guide; (2) Self-Assessment; (3) Group Assessment; and (4) Site Visit
Checklist. Together, they identify barriers to data use across the eight intervention areas in the DDU Logic
Model (Figure 4). This suite of tools can be employed to monitor the implementation of an activity to strengthen
data use, by assessing the status of and progression in each of the data use intervention areas of the DDU Logic
Model (Box 3). It can also be applied to qualitatively assess the use of data to improve the HIS (e.g., improve data
quality, generate health statistics, develop information products); for data review and interpretation; and to
determine whether data are used to inform decision making.

The assessment tools can be used at the national, subnational, or organizational level or in some combination of
the three levels. The Site Visit Checklist is administered at the health-facility level. The four tools can be adapted
to suit the needs of the organization being examined, whether in terms of content area, type of organization,
health program area, or level of the health system.

Box 3: Indicators to monitor the implementation of activities to strengthen DDU

MEASURE Evaluation has developed a set of indicators that measure the status of and
progression in each of the data use intervention areas of the DDU Logic Model (Appendix
A). The indicators map directly to each intervention activity in the Logic Model and can be
measured using our toolkit on assessing barriers to data use in the health sector. The level of
maturity of each activity area can be assessed and scored. A score of 0 (absent) indicates
that the activity being measured is nonexistent. A score of 1 (nascent) indicates that the
initial steps of activity implementation are present. A score of 2 (emerging) indicates that the
activity is present but in an ad hoc and unsystematic way. A score of 3 (robust) indicates
that the activity is regularly and systematically implemented. Repeat measurement can
provide a qualitative assessment of improvements in the areas necessary for data use to
occur and progression toward regular and sustained data use. Monitoring the
implementation of activities to strengthen the demand for and use of data can help
determine whether the right set of interventions to support lasting, sustainable improvements
in data use is being implemented.
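To make repeat measurement on this maturity scale concrete, the sketch below scores the eight DDU intervention areas at two points in time and reports progression. The area names come from the DDU Logic Model; the scores and the comparison logic are hypothetical, not MEASURE Evaluation's.

```python
MATURITY = {0: "absent", 1: "nascent", 2: "emerging", 3: "robust"}

# Hypothetical baseline scores for the eight DDU Logic Model intervention areas
baseline = {
    "Assess and improve the data use context": 1,
    "Engage data users and data producers": 2,
    "Improve data quality": 2,
    "Improve data availability": 1,
    "Identify information needs": 0,
    "Build capacity in data use core competencies": 1,
    "Strengthen the organization's DDU infrastructure": 0,
    "Monitor, evaluate, and communicate DDU successes": 1,
}
# Stand-in follow-up data: every area moves up one level (capped at robust)
followup = {area: min(score + 1, 3) for area, score in baseline.items()}

def report_progression(before: dict[str, int], after: dict[str, int]) -> None:
    """Print the change in maturity for each intervention area."""
    for area, b in before.items():
        a = after[area]
        verdict = "improved" if a > b else "declined" if a < b else "unchanged"
        print(f"{area}: {MATURITY[b]} -> {MATURITY[a]} ({verdict})")

report_progression(baseline, followup)
```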

⁴ An organization is broadly defined as a division of a ministry of health at the national, subnational, or district level; a specific program in the ministry; or a nongovernmental organization or program.



Overview of the Assessment Tools

The In-Depth Interview Guide contains 15 open-ended questions that cover the eight data use interventions
listed in the DDU Logic Model―the specific interventions that can improve the demand for and use of data
from all HIS. The conceptual framework demonstrates how information systems improve the other health
system building blocks and outlines the underlying assumptions and activities that are necessary to achieve the
desired outcome of increased data-informed decision making (Nutley, 2012).

The Self-Assessment covers the technical and behavioral determinants of data use. It examines the perceived
skills of data users and producers in the core competencies of data use (e.g., data analysis, synthesis,
interpretation, and presentation). It then reviews these competencies using a short test that demonstrates their
actual skills. The results of the self-assessment identify concrete areas that need to be addressed to build the
technical capacity of an organization. The tool also asks questions about people’s perceived notions of
organizational capacity where they work.

The Group Assessment poses questions about the organizational determinants of data use, specifically the
existence of data use guidance documents, the regular use and communication of information in decision
making, and the existence of supportive supervision and feedback.

The Site Visit Checklist collects additional evidence to support the Group Assessment tool, by having
interviewers observe whether guidelines, procedures, and information products mentioned in the Group
Assessment are present at health facilities.

Together, these four tools provide a complete picture of the eight components of the data use conceptual
framework, and the technical, behavioral, and organizational determinants of data use to understand the data use
context of an organization, along with the barriers to and facilitators of institutionalizing a culture of using data in
the decision-making process.

The toolkit has been applied in a variety of settings by MEASURE Evaluation, including in Lesotho (MEASURE
Evaluation, 2014b), Ethiopia (MEASURE Evaluation, 2014a, revised 2015), Tanzania, and the Democratic
Republic of Congo (Brodsky & Nyanzi, 2017).

Adaptations of “Assessing Barriers to Data Use in the Health Sector: A Toolkit”

Components of the toolkit for assessing barriers to data use in the health sector have been adapted to meet
specific needs of the MEASURE Evaluation project, across activities, technical areas, and countries. Examples
are as follows:

Checklists

The MEASURE Evaluation PIMA (MEval-PIMA) project aimed to build sustainable M&E capacity of health
decision makers in Kenya to use quality health data for evidence-based decision making. A key component of the
project’s DDU strategy was to improve data availability, stakeholder engagement, and the interaction between
data users and data producers by facilitating data review meetings for national programs, county health
management teams, and referral sentinel sites. To strengthen the organizational infrastructure for DDU, MEval-
PIMA developed guidelines to help support data review meetings for Ministry of Health and Civil Registration
Services programs. These guidelines included a Data Demand and Use Checklist intended for MEval-PIMA staff
to document and track outcomes from data examined during data review meetings. The checklist tracked
whether the following events occurred: data were presented, data were reviewed, decisions were made, and an
action plan was developed.
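As an illustration of what such tracking could look like in a simple log (the four tracked events are taken from the description above; the record structure and data are hypothetical, not the MEval-PIMA checklist itself):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ReviewMeetingRecord:
    """One entry in a hypothetical data-review tracking log."""
    meeting_date: date
    forum: str
    data_presented: bool
    data_reviewed: bool
    decisions_made: bool
    action_plan_developed: bool

    def reached_action_plan(self) -> bool:
        """True only if the meeting progressed through all four tracked events."""
        return all([self.data_presented, self.data_reviewed,
                    self.decisions_made, self.action_plan_developed])

log = [
    ReviewMeetingRecord(date(2017, 3, 1), "County health management team",
                        True, True, True, False),
    ReviewMeetingRecord(date(2017, 6, 1), "County health management team",
                        True, True, True, True),
]
completed = sum(m.reached_action_plan() for m in log)
print(f"{completed}/{len(log)} meetings produced an action plan")
```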

Data use checklists have also been incorporated as part of MEASURE Evaluation’s research and evaluation
portfolio. For example, a recent study assessed the effect of the “pivot strategy” of the United States President’s
Emergency Plan for AIDS Relief (PEPFAR), a geographic reprioritization of investments in Kenya and Uganda,
on health outcomes and HIS performance areas, such as data quality and data use. The study
employed a mixed-methods approach, collecting quantitative data on health system performance using routine
health information and qualitative key informant interviews at the subnational level. It assessed data use by
determining whether data generated by the HIS were employed for programmatic or policy decisions. A checklist
was administered to illuminate the processes supporting data use at the district level. It had questions on whether
meetings were regularly held to review health data, the frequency of and participants in these meetings, and the
existence of notes or meeting agendas to document data use. Two rounds of qualitative interviews were
conducted, focusing on how PEPFAR support for data use activities has affected data use trends, and the
evolution and support of data review processes (Box 4).

Box 4: Data use measures in MEASURE Evaluation Pivot Study

• Describe if and how data are discussed during the data review meeting
o What data are presented? How were the data presented?
o Do the data meet the information needs of stakeholders?
o Are data available and accessible for all participants?
• Describe if and how the data are used for program planning and decision making
o Provide examples, if any, of action plans developed based on data
• Describe if and how the data are used for information system or data quality
improvement
o Provide examples, if any, of action plans developed based on data

Case Studies and Qualitative Interviews

Components of “Assessing Barriers to Data Use in the Health Sector: A Toolkit” (MEASURE
Evaluation, 2018) have also been adapted for specific case studies and qualitative studies looking at data use
interventions in depth. For example, the In-Depth Interview Guide was adapted for an investigation of
information products in Kenya and Tanzania—an exercise that focused on the types of information products
available in those countries and how they could be improved to facilitate their use in decision making (Geers,
Nghui, Ekirapa, Rop, Mbuyita, Patrick, & Kusekwa, 2017). Questions about the types of program decisions
made, data availability, plans, policies, procedures/guidelines for communicating data, and segmentation of
communication to different audiences were customized to focus specifically on information products using RHIS
data. For this study, qualitative group interviews were conducted with key informants in the ministries of health
at regional, district, and health facility levels covering such topics as data sources, experience with specific
information products, and support for and barriers to the use of these products.

The In-Depth Interview Guide was also adapted for a case study investigating the factors that contribute to
successful data use interventions in MEASURE Evaluation’s Associate Awards in Kenya, South Africa, and
Tanzania. These projects aimed to strengthen the national HIS and have implemented various DDU activities in
the eight intervention areas (Figure 4) as core components of the project. Key informant interviews using an
adapted interview guide were conducted with ministry of health staff with exposure to data use interventions in
one province/region in each country. Questions about the types of program decisions made, stakeholders
involved in decision-making processes, and data sources consulted were included, as were questions about the
outputs, facilitators, and barriers to specific data use intervention domains from the DDU Logic Model.

12 Components M&E Systems Strengthening Tool

The 12 Components M&E Systems Strengthening tool, developed in 2009 by the global M&E Reference Group
for HIV and AIDS, assesses a national M&E system (Joint United Nations Programme on HIV/AIDS
[UNAIDS], 2009). It was initially developed for HIV programs but can be adapted to address other diseases and
program areas. The tool provides a comprehensive assessment of the 12 components of a national HIV M&E
system. It can be used to understand the overall strengths and weaknesses of an M&E system forming the basis
for the development or revision of the national multiyear M&E plan and/or costed M&E work plan. It is
recommended that an assessment be conducted every two to three years to monitor progress in M&E
implementation.

The tool has been employed to assess data use by orphans and vulnerable children programs implemented by
Rwanda’s National Commission for Children (2013); HIV/AIDS programs with mainland Tanzania’s National
AIDS Control Program and the Zanzibar AIDS Control Program (2015-2017); and as part of the national HIV
Monitoring and Evaluation System assessment in Nigeria in 2010 (Mharadze, Ogungbemi, Boone, & Oyediran,
2010).

Overview of the 12 Components Tool

An assessment using this tool is built around the 12 components necessary for the effective functioning of a
national M&E system (UNAIDS, 2008). The components are organizational structures for M&E; human
resource capacity for M&E; M&E partnerships; M&E plan; costed M&E work plan; M&E advocacy, communications, and culture; routine program monitoring; surveys and surveillance; M&E databases; supervision and data auditing; evaluation and research; and data dissemination and use. Data use is measured by a series of
benchmarks and performance statements given in the “data dissemination and use” section of the tool. Group
consensus is employed to score performance using either a five-point scale (completely, mostly, partly, not at all,
not applicable) or a three-point scale (yes, no, not applicable). The tool can be administered to quantitatively
assess the use of data to improve the HIS, especially the development and dissemination of information products
that meet the identified information needs of relevant stakeholders. Examples of data use measures are given in
Box 5.



Box 5: Data use measures contained in the 12 Components M&E Systems Strengthening Tool

• Stakeholder information needs have been assessed.
• Information products are regularly disseminated to the data providers.
• Information products are regularly sent to a wide variety of stakeholders, other than the data providers.
• National and subnational information products meet stakeholders’ information needs.
• There are guidelines to support the analysis, presentation, and use of data (e.g., graphs on walls showing cumulative coverage) at the facility level.
• Stakeholders have access to data/information products in the public domain.
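Consensus responses on these categorical scales can be converted to numbers to compare benchmarks or track change between assessments. A minimal sketch, assuming a made-up numeric mapping (the scale labels come from the tool; the numeric values and the coverage summary do not):

```python
# Map the tool's categorical consensus responses to numbers for summary.
# The numeric values are an assumption for illustration, not part of the tool.
FIVE_POINT = {"completely": 3, "mostly": 2, "partly": 1, "not at all": 0}

benchmarks = {
    "Stakeholder information needs have been assessed": "mostly",
    "Information products are regularly disseminated to the data providers": "partly",
    "Stakeholders have access to information products in the public domain": "not applicable",
}

# 'not applicable' responses are excluded from scoring in this sketch
scored = {b: FIVE_POINT[r] for b, r in benchmarks.items() if r in FIVE_POINT}
coverage = sum(scored.values()) / (3 * len(scored))  # share of the maximum score
print(f"Scored {len(scored)} of {len(benchmarks)} benchmarks; coverage = {coverage:.0%}")
```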

Monitoring and Evaluation Capacity Assessment Toolkit

The Monitoring and Evaluation Capacity Assessment Toolkit (MECAT) was developed by MEASURE
Evaluation and its Kenya associate award, MEval-PIMA, to examine an organization’s capacity and performance
in M&E. MECAT assesses M&E across the 12 components of a well-functioning M&E system (described
above). DDU is one of the 12 capacity areas. In addition to measuring the existence of essential elements for an
M&E system (status), MECAT explores how well the M&E system functions according to established norms
(quality); internal capacity to accomplish M&E tasks (technical autonomy); and the organization’s ability to
financially support M&E tasks (financial autonomy).
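A record for one component might hold the four MECAT dimensions side by side, which makes capacity gaps easy to list. This structure is our illustration, not the toolkit's actual scoring form; the quality scale used here is assumed.

```python
from dataclasses import dataclass

@dataclass
class ComponentAssessment:
    """Illustrative MECAT-style record for one of the 12 M&E components."""
    component: str
    status: bool              # do the essential elements exist?
    quality: int              # functioning against established norms (assumed 0-3 scale)
    technical_autonomy: bool  # can the organization do the work without outside help?
    financial_autonomy: bool  # can the organization fund the work itself?

ddu = ComponentAssessment(
    component="Data dissemination and use",
    status=True,
    quality=2,
    technical_autonomy=True,
    financial_autonomy=False,
)
gaps = [name for name, ok in [("status", ddu.status),
                              ("technical autonomy", ddu.technical_autonomy),
                              ("financial autonomy", ddu.financial_autonomy)] if not ok]
print(f"{ddu.component}: quality={ddu.quality}, gaps={gaps or 'none'}")
```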

The tool can be applied to health management teams at all levels of a government―from an individual in an
M&E unit, to hospitals and district/regional health centers, to ministries of health. The purpose of the tool is
to:

• Understand, document, and clarify an organization’s M&E performance objectives.
• Determine the status of performance and capacity in M&E.
• Identify gaps in the capacity of an organization to meet M&E performance objectives.

The tool can be employed as an internal assessment to develop capacity building plans in M&E, as a baseline
M&E assessment prior to capacity building interventions, and if implemented regularly, as a routine assessment
to monitor an organization’s M&E capacity. MECAT has been applied to assess DDU at national and
subnational levels in Kenya. It has also been employed at the national level to examine the M&E capacity of
programs in different ministries (e.g., the Ministry of Health and Ministry of Immigration) and, at the subnational
level, to evaluate countywide M&E systems. For three programs (the National Malaria Control Program and the
Reproductive Health and Maternity Services) and in three counties, MECAT was used at project end line to
assess changes in M&E capacity after three years of technical support from MEval-PIMA, to respond to gaps
identified during the baseline MECAT assessment (MEASURE Evaluation, 2017b-e).



Overview of the MECAT

The four tools in MECAT are (1) group assessment; (2) individual assessment; (3) key informant interviews; and
(4) desk review.

The group assessment is a participatory organizational self-assessment targeting key M&E staff and
stakeholders and covering the 12 components of an M&E system. The DDU component defines capacity in
terms of a data use plan, the dissemination of information products, and data analysis and presentation guidelines
(Box 6).

The individual assessment is a self-evaluation by M&E staff of their competencies in leadership; data collection
and management; evaluation; data analysis, dissemination, and use; and overall management. The data analysis,
dissemination, and use section has items evaluating competencies in quantitative and qualitative analysis methods
and interpretation; knowledge about stakeholder information needs; dissemination of information products;
understanding of key program priorities; and how data from routine monitoring can be applied for decision
making.

Key informant interviews with M&E stakeholders outside the organization are conducted to understand the
larger context for M&E and stakeholder views on current M&E capacity levels and constraints. Stakeholders are
asked about the organization’s capacity to undertake M&E functions, including DDU; their knowledge of the
organization’s experience using data for planning and monitoring M&E goals; and the additional information
they require to make policy or program decisions.

Last, a desk review of key M&E documents and records related to strategic and organizational planning is
conducted to identify the background and history of M&E in the organization, the status of activities, and
documentation related to M&E capacity and gaps.

Box 6: Data use measures in the MECAT

• Existence and quality of an organizational data use plan
o An organizational data use plan exists.
o The data use plan is embedded in the organization’s strategic plan and M&E
plan.
o The data use plan conforms to best practices in collecting, recording,
collating, analyzing, and reporting data.
o The data use plan is informed by an assessment of user needs.
o The data use plan was developed with external technical
assistance/government support.
• Existence and quality of disseminated information products
o The organization disseminates information products to stakeholders, including
Ministry of Health data users and producers.
o Information products have contributed to influences on policy and practice.
o Information products are disseminated with external technical assistance/with
support from the government.
• Existence and quality of data analysis and presentation guidelines
o Data analysis and presentation guidelines exist.
o Staff know and apply these guidelines.
o Gender analysis and reporting are included as an element of the data analysis
and presentation guidelines.
DISCUSSION

This working paper presents a data use continuum that identifies the stages of data use, both for improving the
functioning of the HIS and for driving informed decision making. Each stage of the continuum may require
different considerations when identifying measurement indicators and methodologies.

All tools reviewed here measure the use of data to improve the functioning of the HIS, that is, dimensions of
data use related to improving data quality, generating health statistics, and developing information products.
“Assessing Barriers to Data Demand and Use in the Health Sector: A Toolkit,” in particular, assesses the
implementation process across the eight interventions identified as essential to strengthening the demand for and use of data.
Several tools (such as the 12 Components and MECAT) conceptualize data use as “data analysis and
dissemination,” and contain measures on the development and dissemination of information products and the
existence of guidelines and protocols for data use. These indicators mainly relate to the inputs and activities that
contribute to data-informed decision making (i.e., the process of strengthening data use).

Monitoring the use of data for improved health program performance, by tracking the translation of data-
informed recommendations into action (i.e., decisions made and follow-up actions taken to improve health
program performance), is challenging, especially using quantitative methods. The implementation of
decisions informed by HIS data to improve health programs depends on multisectoral decision-making
processes, which may be influenced by other functions inside and outside the health system, including leadership
and governance (e.g., who has the authority to make decisions?) and financing (e.g., is budget available to
implement the decision as recommended?). These decisions often lie beyond the authority and control of the
organization responsible for the HIS, and can be influenced by factors outside the health sector that inhibit data
use, such as political ideology, political will, competing priorities, personal interests, capacity of decision makers,
and commitment to transparency and accountability. Decision-making meetings are often ad hoc and
unpredictable, and may not include the individuals who generate, analyze, and synthesize the data. Moreover,
there is often a considerable gap in time between data generation, data review, data use, and eventual impact on
the health program and health system performance.

Few tools exist that measure the outcome of data use for improved health program performance. Many tools
contain an assessment item on the existence of meetings for data review and interpretation, and qualitative
assessments of whether an organization’s decisions are based on data. However, PRISM is the only
standardized tool that measures the full spectrum of the use of data to improve decision making. It measures the
extent to which data are employed in decision-making processes, conceptualized as whether RHIS information is
discussed during meetings, whether decisions evolved from these discussions, and whether decisions are referred
to upper management for action. PRISM also incorporates an overall RHIS assessment capturing measures of
data quality and data availability across multiple levels of a health system, thereby providing a comprehensive
overview of the technical, organizational, and individual barriers affecting data use. However, implementing a
full PRISM assessment is resource-intensive, requiring sampling of multiple units across facility, district, and
central levels.

MEASURE Evaluation has developed checklists to be applied during data review meetings to track whether data
presented during these meetings lead to decisions made and the development of action plans (Geers, Sagno,
Camara, & Bureau de Stratégie et Développement au sein du Ministère de la Santé de Guinée, 2017). More
experience is needed in applying the data review checklists and in capturing their outcomes for the purposes of
measuring data use. Qualitative approaches are also often employed to understand how data have
been used to improve health program performance. For example, a question in the interview guide in the toolkit
for assessing barriers to data use in the health sector asks for instances when data were consulted to inform a
decision about a health service or program. Desk reviews can be conducted to understand whether data were
consulted for planning and budgeting purposes. However, it is often difficult to gather evidence to
retrospectively determine whether recommendations, decisions, and actions were informed by data and led to
improvements in health programs and health outcomes. Documentation of recommendations and decisions is
often not kept or is not accessible, because of the time lag between the formulation of a recommendation and its
implementation, and between a decision and its subsequent outcomes at the service delivery level.
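
A low-cost way to preserve that evidence trail is to record, at each data review meeting, what data were presented, what was recommended, what was decided, and whether the decision was later acted on. The sketch below is a minimal, hypothetical log along those lines; its field names are illustrative and do not reproduce the MEASURE Evaluation checklists themselves.

```python
# Minimal, hypothetical log of decisions arising from data review
# meetings; the field names are illustrative, not the published checklist.
import csv
import os

FIELDS = ["meeting_date", "data_presented", "recommendation",
          "decision_made", "action_taken", "follow_up_date"]

def log_meeting(path, row):
    """Append one meeting record, writing a header if the file is new."""
    new_file = not os.path.exists(path) or os.path.getsize(path) == 0
    with open(path, "a", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        if new_file:
            writer.writeheader()
        writer.writerow(row)

log_meeting("data_review_log.csv", {
    "meeting_date": "2018-03-14",
    "data_presented": "quarterly ANC coverage by facility",
    "recommendation": "redeploy outreach staff to low-coverage wards",
    "decision_made": "yes",
    "action_taken": "pending",
    "follow_up_date": "2018-06-14",
})
```

Because each row ties a recommendation to a decision and a follow-up date, such a log could later be queried to count how many data-informed recommendations were actually implemented.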

Conducting targeted follow-up of data-informed decision making can be a lengthy, costly, and labor-intensive
endeavor. Better measures of the outcome of data use are needed, along with ways to easily track the health
program and health system outcomes associated with decisions that are implemented. There is a need to identify
low-cost data collection methods to routinely track data use during an organization’s regular planning, program
monitoring, and budgeting cycle. This is especially true because information systems and analytical approaches to
data use are evolving to be better able to routinely generate information for continuous learning and adaptation.
Additional guidance and criteria should also be developed to help users objectively assess whether data were
employed to inform key decisions (e.g., strategic plans, budgets, and action plans).



CONCLUSION

This document summarizes how MEASURE Evaluation has conceptualized, defined, and monitored data use
for decision making. The project has expanded the concept of data use beyond the generation of statistics and
the review of data, and has articulated the steps necessary for data-informed decision making to take place (i.e.,
data-informed recommendation for action, decision made, and decision implemented). Although the project
has developed tools that capture this definition of data use (e.g., PRISM), it recognizes that measuring data-
informed decision making, and especially the programmatic outcomes of data-informed decisions, is difficult
because of the complexity of decision-making processes and the often retrospective nature of reporting on
governance processes.

MEASURE Evaluation has contributed other approaches to assessing and measuring data use to fill a gap
in some existing HIS and M&E capacity assessments. These measures aim to capture the process of
strengthening data use to improve information systems (such as improving data quality, generating health
statistics, and developing information products), and activities to support the use of data for improved health
programs (such as the existence of meetings to review and discuss data). MEASURE Evaluation remains
committed to enhancing the standards for the measurement of data-informed decision making as new tools
and processes are developed in this area.



REFERENCES

Abajebel, S., Jira, C., & Beyene, W. (2011). Utilization of health information system at district level in Jimma
Zone Oromia Regional State, South West Ethiopia. Ethiopian Journal of Health Sciences, 21(Suppl 1), 65-76.
Retrieved from https://www.ajol.info/index.php/ejhs/article/view/74271/64918.

AbouZahr, C., & Boerma, T. (2005). Health information systems: The foundations of public health. Bulletin of the
World Health Organization, 83, 578–583. Retrieved from http://www.who.int/bulletin/volumes/83/8/578.pdf.

Brodsky, I., & Nyanzi, I. (2017). Data use in the Democratic Republic of the Congo’s Malaria Program: National and
provincial results (tr-17-165-en). Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina.
Retrieved from https://www.measureevaluation.org/resources/publications/tr-17-165-en/.

Geers, E., Nghui, P., Ekirapa, A., Rop, V., Mbuyita, S., Patrick, J., & Kusekwa, S. (2017). Information products to
drive decision making: How to promote the use of routine data throughout a health system (sr-17-145-en). Chapel Hill,
NC, USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/sr-17-145-en.

Geers, E., Sagno, J., Camara, A., & Bureau de Stratégie et Développement au sein du Ministère de la Santé de
Guinée. (2017). Directives des réunions de revue des données pour évaluer et améliorer la performance. Chapel Hill, NC, USA:
MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/tr-17-216.

Health Metrics Network. (2008). Framework and standards for country health information systems. Geneva, Switzerland:
World Health Organization. Retrieved from
http://www.who.int/healthinfo/country_monitoring_evaluation/who-hmn-framework-standards-chi.pdf.

Joint United Nations Programme on HIV/AIDS (UNAIDS). (2008). Organizing framework for a functional
national HIV monitoring and evaluation system. Geneva, Switzerland: UNAIDS. Retrieved from
www.unaids.org/sites/default/files/sub_landing/files/20080430_JC1769_Organizing_Framework_Functional_
v2_en.pdf.

Joint United Nations Programme on HIV/AIDS (UNAIDS). (2009). 12 components monitoring & evaluation system
assessment. Geneva, Switzerland: UNAIDS. Retrieved from
http://www.unaids.org/sites/default/files/sub_landing/files/1_MERG_Assessment_12_Components_ME_Sys
tem.pdf.

MEASURE Evaluation. (2014a, revised 2015). Strengthening family planning programs with data: Creating a culture of data
demand and use (fs-14-120). Chapel Hill, NC, USA: MEASURE Evaluation, University of North
Carolina. Retrieved from https://www.measureevaluation.org/resources/publications/fs-14-120.

MEASURE Evaluation. (2014b). Strengthening orphan and vulnerable children programs with data: Creating a culture of data
demand and use (fs-14-106). Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina.
Retrieved from https://www.measureevaluation.org/resources/publications/fs-14-106.



MEASURE Evaluation. (n.d.). Data quality assurance tools. Retrieved from
https://www.measureevaluation.org/resources/tools/health-information-systems/data-quality-assurance-
tools/data-quality-assurance-tools.

MEASURE Evaluation. (2017a). Strengthening health information systems in low- and middle-income countries: A model to
frame what we know and what we need to learn (tr-17-156). Chapel Hill, NC, USA: MEASURE Evaluation, University
of North Carolina. Retrieved from https://www.measureevaluation.org/resources/publications/tr-17-156.

MEASURE Evaluation. (2017b). Siaya County: End line assessment of monitoring and evaluation capacity (tr-17-205).
Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/tr-17-205_en.

MEASURE Evaluation. (2017c). Narok County: End line assessment of monitoring and evaluation capacity (tr-17-206).
Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/tr-17-206_en.

MEASURE Evaluation. (2017d). Kakamega County: End line assessment of monitoring and evaluation capacity (tr-17-214).
Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/tr-17-214_en.

MEASURE Evaluation. (2017e). National Malaria Control Programme monitoring and evaluation capacity: End line
assessment report (tr-17-196). Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina.
Retrieved from https://www.measureevaluation.org/resources/publications/tr-17-196.

MEASURE Evaluation. (2018). Assessing barriers to data demand and use in the health sector: A toolkit (ms-18-134).
Chapel Hill, NC, USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/ms-18-134.

Mharadze, T.N., Ogungbemi, K., Boone, D., & Oyediran, K. (2010). Report on the status of the Nigerian national HIV
monitoring and evaluation system: Assessment using 12 Components System Strengthening Tool (SR-10-61). Chapel Hill, NC,
USA: MEASURE Evaluation, University of North Carolina. Retrieved from
https://www.measureevaluation.org/resources/publications/sr-10-61.

Mwencha, M., Rosen, J. E., Spisak, C., Watson, N., Kisoka, N., & Mberesero, H. (2017). Upgrading supply chain
management systems to improve availability of medicines in Tanzania: Evaluation of performance and cost
effects. Global Health: Science and Practice, 5(3), 399–411. Retrieved from
http://www.ghspjournal.org/content/5/3/399.

Nutley, T., Gnassou, L., Traore, M., Bosso, A.E., & Mullen, S. (2014). Moving data off the shelf and into action:
An intervention to improve data-informed decision making in Côte d'Ivoire. Global Health Action, 7, 25035.
Retrieved from https://www.ncbi.nlm.nih.gov/pubmed/25280738.

Nutley, T., & Reynolds, H.W. (2013). Improving the use of health data for health system strengthening. Global
Health Action, 6, 20001. Retrieved from https://www.ncbi.nlm.nih.gov/pmc/articles/PMC3573178/.



World Health Organization (WHO). (2007). Everybody business: strengthening health systems to improve health outcomes:
WHO’s framework for action. Geneva, Switzerland: WHO. Retrieved from
www.who.int/healthsystems/strategy/everybodys_business.pdf.

World Health Organization (WHO). (2010). Monitoring the building blocks of health systems: a handbook of indicators and
their measurement strategies. Geneva, Switzerland: WHO. Retrieved from
http://www.who.int/healthinfo/systems/WHO_MBHSS_2010_full_web.pdf.



APPENDIX A. INDICATORS TO MONITOR THE IMPLEMENTATION OF ACTIVITIES TO
STRENGTHEN THE DEMAND FOR AND USE OF DATA

Each determinant below is listed with its indicators, followed by descriptors for four levels of implementation: Level 0, Level 1 (nascent), Level 2 (emerging), and Level 3 (robust).

Determinant: Assess and improve the data use context

Indicators:
- Assessment implemented to assess data use
- Plan for improvement developed
- DDU interventions regularly implemented

Level 0: No previous efforts to assess data use; no previous efforts to improve data use.
Level 1 (nascent): Previous efforts to assess data use; previous efforts to improve data use.
Level 2 (emerging): Formal assessment implemented with specific data use questions; action plan developed; action plan implemented.
Level 3 (robust): Data use interventions implemented regularly as part of the work plan.

Determinant: Engage data users and producers in M&E/HIS system development (or improvement); data/program review meetings; research development and implementation; policy dialogue; and planning

Indicators:
- Representation of data producers and data users in activities
- Regularity of interactions
- Discussion/interpretation of data in relation to program improvement
- Data-informed recommendation(s) made
- Tools/procedures that facilitate interaction implemented

Level 0: Limited representation of data users; limited opportunities for interaction; limited discussion/interpretation of data; no tools/procedures implemented.
Level 1 (nascent): Representation of both data users and data producers; data users and data producers meet semi-regularly/ad hoc to discuss program progress; data are presented and discussed at meetings; recommendations are made based on the data presented and discussed.
Level 2 (emerging): Representation of both data producers and data users with the ability to make decisions; meetings are regularly scheduled but not always held; relevant data users and data producers are invited but do not always attend; data are incomplete; data-informed recommendations are sometimes made.
Level 3 (robust): Data and information regularly demanded, reviewed, and used in decision making; implementation of recommendations is followed up.

Determinant: Improve data quality (accuracy, timeliness, completeness)

Indicators:
- Data quality assessment tool implemented
- Skills building in data entry and data management

Level 0: No previous efforts to assess data quality; no electronic data system; parallel systems exist for data capture.
Level 1 (nascent): Ad hoc, unsystematic assessment of data quality; ad hoc, unsystematic efforts to improve data quality.
Level 2 (emerging): Formal, organized data quality assessment (DQA) conducted, action plans developed, and implementation started.
Level 3 (robust): DQA improvements completed and evidence of improvements made; DQA audits regularly conducted (e.g., quarterly); evidence of data regularly cleaned, stored securely, and reported.

Determinant: Improve data availability (access, synthesis, communication)

Indicators:
- Databases linked/integrated/interoperable
- Clear guidelines for data sharing exist
- Data dissemination and communications plan exists (a document that lays out a strategic process of tailoring messages for specific audiences, i.e., standardized reports generated by the RHIS for identified key target groups, with feedback mechanisms and a dissemination schedule outlined by audience)
- Information products exist that synthesize information for different audiences
- Multidirectional feedback mechanisms in place, based on relevant stakeholders

Level 0: Parallel databases; few individuals can access raw data; no data sharing protocols exist; no data communication plan exists; little communication beyond donors and government; no formal feedback mechanisms in place.
Level 1 (nascent): Plans for linking data have been discussed, but no action has taken place; a data communication plan exists but is not widely distributed; few communication products exist, and they are not tailored to audiences; weak feedback mechanisms; limited consideration of audiences and/or inappropriate messaging.
Level 2 (emerging): Interoperability/integration plans underway but in the pilot phase; guidelines for data sharing developed; system for registering new research developed; communication plan exists and is partially implemented; plan to improve the feedback system developed and partially implemented; information product templates exist.
Level 3 (robust): Primary data systems integrated/interoperable; varied data users have access to data; new research regularly tracked; communication plan fully implemented; feedback improvement system functioning for internal and external stakeholders; data regularly shared with targeted audiences in appropriate formats.

Determinant: Identify information needs

Indicators:
- Tool/strategy/workshop implemented that generates questions about the program
- Key stakeholders involved in identifying information needs (producers and users)
- Data-informed strategic plan exists
- Actively reviewing and discussing data and identifying opportunities for additional data/information
- Data-informed recommendations made
- Data-informed budgets, work plans, and policies exist

Level 0: Questions are not discussed (work starts with the data and focuses on reporting); insufficient data review/little data interpretation; decision making does not involve a range of stakeholders; no strategic plan, or a strategic plan with no targets.
Level 1 (nascent): Program questions/core analyses irregularly identified; data reviewed but not regularly; irregular in-depth investigation into issues highlighted by data review; stakeholder involvement inconsistent and limited; strategic plan with unscientific targets; recommendations made based on data but not consistently.
Level 2 (emerging): Program questions/core analyses regularly identified; guidelines for data review defined and implemented, but not consistently; opportunities for additional research/data analysis identified; expanded group of stakeholders involved in data review; strategic plans based on data review.
Level 3 (robust): Core analyses identified/process for regular inquiry established; data review process fully functioning; additional research/analysis regularly conducted; regular review of strategic plans; data gaps addressed through additional analysis or new research; stakeholders regularly engaged; data-informed recommendations acted upon.

Determinant: Build capacity in data use

Indicators:
- Capacity building plan for core M&E/DDU competencies
- Individuals trained in DDU skills (analysis, interpretation, synthesis, presentation, communication)
- Individuals trained in DDU skills (concepts and tools, advocacy, leadership, managing change)
- Individuals trained in developing and implementing DDU procedures, guidelines, policies, and support mechanisms
- Individual skill level increased

Level 0: Basic M&E skills exist; no/limited capacity in M&E tasks; no DDU skills; no skills in DDU procedures/policies.
Level 1 (nascent): DDU capacity exists but is not sufficient (reach/breadth); DDU skill level is low; DDU skills exist but are insufficient.
Level 2 (emerging): DDU capacity building plan exists; DDU capacity exists in key staff; some DDU skills transfer (ability of a facilitator to replicate DDU training/workshop facilitation).
Level 3 (robust): DDU capacity and skills exist in all relevant staff (breadth and depth); DDU skills are normative; regular DDU skills transfer (core set of trainers, more replication).

Determinant: Strengthen organizational data demand and use

Indicators:
- Organizational mission, vision, and strategic plan reflect DDU
- Advocacy efforts to strengthen DDU in the organization implemented
- Existence of organizational supports (policies and procedures to support DDU, data review guidelines, guidelines for registering new research, staff DDU roles clarified, regular meetings for data user/producer interaction)
- Existence of DDU successes
- Incentives for data use exist

Level 0: M&E organizational supports exist (e.g., M&E infrastructure plan) but do not include DDU beyond reporting.
Level 1 (nascent): Advocacy efforts implemented to prioritize DDU; one to three DDU organizational supports in place.
Level 2 (emerging): Mission/vision reflect DDU; four to six organizational supports implemented; incentives exist for data use.
Level 3 (robust): Regular, annual budget line items for DDU interventions; larger-scale advocacy efforts implemented.

Determinant: Communicate data use successes

Indicators:
- Existence of DDU success stories
- Existence of data on DDU interventions
- Promotion of DDU success stories inside/outside the organization
- Recognition of DDU successes by the organization at various levels

Level 0: None.
Level 1 (nascent): Some experience with DDU M&E documented; ad hoc communication of successes.
Level 2 (emerging): Ad hoc efforts to monitor DDU interventions exist.
Level 3 (robust): Systematic M&E of DDU interventions; DDU successes widely disseminated to varied audiences in appropriate formats.


MEASURE Evaluation
University of North Carolina at Chapel Hill
123 West Franklin Street, Suite 330
Chapel Hill, NC 27516 USA
Phone: +1 919-445-9350
[email protected]

www.measureevaluation.org
