Abstract
Background: Realist reviews offer a rigorous method to analyze heterogeneous data emerging from multiple
disciplines as a means to develop new concepts, understand the relationships between them, and identify the
evidentiary base underpinning them. However, emerging synthesis methods such as the Realist Review are not well
operationalized and may be difficult for the novice researcher to grasp. The objective of this paper is to describe
the development of an analytic process to organize and synthesize data from a realist review.
Methods: Clinical practice guidelines have had an inconsistent and modest impact on clinical practice, which may
in part be due to limitations in their design. This study illustrates the development of a transparent method for
organizing and analyzing a complex data set informed by a Realist Review on guideline implementability to better
understand the characteristics of guidelines that affect their uptake in practice (e.g., clarity, format). The data
organization method consisted of 4 levels of refinement: 1) extraction and 2) organization of data; 3) creation of a
conceptual map of guideline implementability; and 4) the development of a codebook of definitions.
Results: This new method comprises four steps: data extraction, data organization, development of a
conceptual map, and operationalization vis-a-vis a codebook. Applying this method, we extracted 1736 guideline
attributes from 278 articles into a consensus-based set of categories, and collapsed them into 5 core conceptual
domains for our guideline implementability map: Language, Format, Rigor of development, Feasibility,
Decision-making.
Conclusions: This study advances analysis methods by offering a systematic approach to analyzing complex data
sets where the goals are to condense, organize and identify relationships.
Background
Complex interventions, such as those used to improve quality of health care, are informed by principles from health services research, management, psychology and engineering, in addition to medicine. Despite this, they often lack a clear theoretical basis, making it hard to summarize this disparate literature in a way that can inform intervention design or interpretation of results [1]. A realist review is a knowledge synthesis methodology pioneered by Ray Pawson [2], which seeks to better understand what works for whom, in what circumstances and why [2]. Realist reviews are an emerging method with few published examples [3-5], and are particularly relevant for complex and under-conceptualized topics with a heterogeneous evidence base where traditional systematic reviews would often conclude that there is no evidence to inform next steps [6]. The recently published publication standards for Realist Reviews (i.e., the RAMESES criteria [7]) will likely facilitate improved reporting of this method, as existing techniques to organize and synthesize such information are not well operationalized [8], and require further development to be optimized and to help novice researchers manage large datasets.
* Correspondence: [email protected]
1 St. Michael’s Hospital Li Ka Shing Knowledge Institute, 209 Victoria Street, Toronto M5B 1W8, ON, Canada
Full list of author information is available at the end of the article
© 2013 Kastner et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

To advance the science of analyzing complex and disparate data, this paper describes the development of a process for organizing and analyzing complex evidence
in the context of a Realist Review in the area of guideline implementability. We selected guideline implementability to illustrate our data analysis process because guidelines are considered an important knowledge translation tool, yet their potential to facilitate the implementation of evidence into clinical practice has largely been unrealized [9-11]. Poor guideline uptake may be due to external factors such as the complex and competing demands on providers’ time, organizational constraints, and lack of knowledge, as well as characteristics of the guidelines themselves (i.e., intrinsic factors). Approaches to improving uptake of guidelines have largely focused on complex knowledge translation interventions consisting of extrinsic strategies that target providers or practice environments. However, these strategies have yielded modest improvement with variable costs [12,13]. Intrinsic strategies (e.g., addressing the clarity, specificity and clinical applicability of recommendations) are promising because they are inexpensive, easy to implement and may be broadly applicable. Additionally, strategies that are being developed do not include disciplines outside of medicine (e.g., management and psychology), so they are not being optimized to advance knowledge in this area. We therefore conducted a realist review to better understand the concept of guideline implementability from a broad perspective of the literature, and to identify how guidelines could be optimized to increase their impact. More specifically, our goal was to identify guideline attributes that affect guideline uptake in clinical practice. The complete protocol for this review is described elsewhere [14], and the final results of this review will be published in a separate paper. Briefly, the realist review considered evidence from four disciplines (medicine, psychology, management, and human factors engineering) to determine what works for whom, in what circumstances and why in relation to guideline implementation [14]. The search strategy included expert-identified, purposive and bibliographic searching. The analytic approach drew on multiple analysis methods (i.e., Realist synthesis and other qualitative synthesis methods). Although the realist review synthesis methods were helpful for interrogating our underlying theory (i.e., why guidelines are not being implemented) [1], Realist Review methods are relatively new, and their guidance on the process for organizing and relating findings (i.e., the RAMESES criteria [7]) may be a challenge to reproduce for people who are new to the field.

To address this issue, we describe the development of a process for organizing and analyzing complex evidence derived from findings of our realist review on guideline implementability as a means to advance the science of knowledge synthesis.

Methods and results
Figure 1 shows the flow of the process that was used to make sense of the realist review data, consisting of 4 levels of refinement: 1) extraction and 2) organization of data; 3) creation of a conceptual map of guideline implementability; and 4) the operationalization of the map and its components vis-a-vis the development of a codebook of definitions that will inform the design of a framework. In this section we provide a description of the method used at each step and the results that emerged when the step was applied to our data set.

Level 1 – Extraction of data
Two groups of investigators extracted 1736 intrinsic guideline attributes (i.e., characteristics) from 278 included articles, documenting study discipline (i.e., medicine, psychology, management, human factors engineering), attribute name and definition (as documented by authors), attribute operationalization (i.e., an explanation of how the attribute functions within the context of the discipline or study), attribute relationship with uptake, and any potential tradeoffs. To ensure reliability, consistency and accuracy of the data extraction, we used an auditing process whereby secondary reviewers checked data extractions of primary reviewers. Disagreements were resolved through consensus-based group discussions involving all investigators.

Level 2 – Organization of data
The 1736 identified attributes were sorted with the same name or root (e.g., valid/validity) in an Excel database. Two groups of investigators (6 in total, 3 per group) then took the same list of sorted attributes and independently clustered them into logical categories. This involved a process of building up groups of similar or like attributes (including their synonyms and antonyms) that conceptually “fit” within a larger theme, and creating a label and description for each category. Table 1 describes the operationalization of this process. Categorizations between the two groups were compared for agreement, with the aim of identifying a common set of categories and their included attributes. This involved documenting “agreed” and “divergent” classifications, and making consensus-based decisions through group discussion. This highly systematic approach allowed for efficient filtering and consolidation of a large and complex dataset.

Level 3 – Building a conceptual map of guideline implementability
Using a consensus approach among the two groups of investigators via discussions of the attribute definitions and their similarities and relationships, the final set of 27 categories (Table 2) was further grouped into 5 broad dimensions associated with the uptake or use of guidelines: Language, Format, Rigor of Development, Feasibility, Decision-making. Based on the evidence around these domains, we developed broad and common sense definitions for each as well as their included categories, which
[Figure 1. Flow of the analysis process. Level 1 (Extraction of data): 1736 guideline attributes were extracted in duplicate from 278 included articles; primary reviewers’ data extractions were audited by a second reviewer, with disagreements resolved through consensus. The two groups’ independent categorizations (Group 1 = 33 categories; Group 2 = 28 categories) were then reconciled.]
informed a conceptual map of guideline implementability. The development of this map was guided by a web-based visualization tool, MindMeister (http://www.mindmeister.com), which was used iteratively by all investigators to determine the structure of the framework (i.e., moving back-and-forth from the map to definitions and source material), and to facilitate the decision-making process for grouping and identifying patterns in the data. Such visualization techniques have been shown to facilitate comprehension, identify inferences about the qualities of parts and the relations among them, and be useful for revealing the hierarchy of groupings and important relationships [15]. To validate and identify potential flaws in categorization, and to obtain agreement on the sensibility and fit of attributes within and across the categories, a group of 9 stakeholders with knowledge translation and guideline development expertise was surveyed. These experts were asked to review the content of the 5 domains and their sub-domains, and to rename, rearrange and condense attributes as they saw fit. The survey comprised Likert-type and open-ended questions about the operational definition of the domains, and the fit of categories and their attributes within them (see Additional file 1). Through consensus-based discussions amongst our team, findings of this survey were used to make modifications to the organization and structure of our data (e.g., collapsing and renaming some attributes, categories and domains).
Table 1 Operationalization of the categorization process using the “LANGUAGE” domain as an example

Goal: Organize, group, and appropriately label similar or “like” attributes
Steps: 1. Group attributes that are antonyms; 2. Group attributes that are synonyms; 3. Group attributes with the same root; 4. Sort database by attribute
Examples: Complex/Simple (antonyms); Unclear/Confusing (synonyms); Specific/Specificity and Validity/Valid (same root)

Goal: Categorize attributes into logical clusters
Steps: 5. Are there commonalities among attributes? 6. Is there a central theme or focus among groups of attributes?
Example: the attributes Unambiguous, Precise and Specific can be grouped into a category called “Clarity”

Goal: Go through each cluster to determine sense and fit of attributes
Steps: 7. Do the attributes belong within the same cluster? 8. Can they be collapsed? 9. Use attribute definitions to make these decisions
Example: the following categories can be collapsed: “Complexity” with “Information overload”; “Actionability” (e.g., using active voice) with “Wording”

Goal: Develop a definition for clusters
Step: 10. Based on their included attributes and definitions, define and label the cluster
Example: the LANGUAGE domain can be defined as the clarity, precision, and specificity of the context and message of the guideline
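The grouping steps in Table 1 can be sketched programmatically. The snippet below is a minimal illustration, not the study’s actual procedure: the attribute list and the canonical map that collapses antonyms, synonyms, and shared roots are hypothetical stand-ins for decisions the investigators made by consensus discussion.

```python
from collections import defaultdict

# Hypothetical raw attribute names as they might appear after extraction.
attributes = ["Valid", "Validity", "Unclear", "Confusing", "Complex",
              "Simple", "Specific", "Specificity", "Unambiguous", "Precise"]

# Steps 1-3: map each attribute to a canonical key. Antonym pairs
# (Complex/Simple), synonyms (Unclear/Confusing) and shared roots
# (Valid/Validity) collapse to one key. This map is an illustrative
# assumption; in the study such groupings were made by the reviewers.
canonical = {
    "valid": "valid", "validity": "valid",
    "unclear": "clarity", "confusing": "clarity",
    "unambiguous": "clarity", "precise": "clarity",
    "specific": "clarity", "specificity": "clarity",
    "complex": "complexity", "simple": "complexity",
}

# Step 4: sort the database by attribute, then cluster (steps 5-6).
clusters = defaultdict(list)
for attr in sorted(attributes):
    clusters[canonical[attr.lower()]].append(attr)

for label, members in sorted(clusters.items()):
    print(label, "->", members)
```

In the actual process the clustering, labeling and definitions (steps 7-10) were done by two independent groups and reconciled; a script like this could only support the mechanical sorting and bookkeeping.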
attributes (i.e., those that have the potential to be changed by guideline developers) and their operationalization (i.e., how the attribute can be used and examples of how it functions), the context and setting in which these occur, for whom, any relationship with uptake, and attribute tradeoffs if they existed (see Additional file 2 for an example codebook). The codebook was developed one domain at a time using a modified duplicate reviewing process that involved a set of primary reviewers extracting and documenting the information, and a second group of reviewers “auditing” (i.e., checking) primary reviews in small-group discussions; a third group of reviewers resolved disagreements. The main objectives of the auditing process were to verify the completion of documentation, to ensure the appropriate understanding of concepts, and to determine the best fit of attributes and information within and between categories and domains.

Discussion
Complex interventions are often atheoretical and loosely draw on a broad literature that includes different disciplines and is difficult to summarize systematically. Qualitative synthesis methods are poorly operationalized and do not describe how to organize and analyze large heterogeneous datasets. We used a systematic process of analysis to build a conceptual map of guideline implementability through the classification of 1736 attributes into a consensus-based set of categories, which were then collapsed into 5 core conceptual domains of guideline implementability: Language, Format, Rigor of development, Feasibility, Decision-making. These findings will be used to answer our Realist review question: what is it about guidelines that facilitates or impedes their uptake, for whom and in what circumstances, and how and why does this happen?

We reviewed a range of review methods to answer our research question. The details explaining the rationale for selecting a Realist Review are published in our protocol [14]. Briefly, we assessed a range of review methods (i.e., Realist Review, Meta-narrative synthesis, and Meta-ethnography) to determine which of these was the most appropriate, but we found that none was a “perfect fit” to sufficiently cover all our questions. We selected the Realist Review method because the approach provides the most systematic guidance on how to conduct a complete review (i.e., a process for a search strategy, article selection, and data analysis), allows the inclusion of diverse evidence (i.e., quantitative and qualitative), and provides an explanatory investigation of underlying theories and mechanisms of the study under investigation. In our case, ‘causation’ was determined by considering the interaction between contexts (i.e., the circumstances and settings of guideline use), mechanisms (i.e., the processes operating within guidelines that explain why they are used in some circumstances but not in others) and outcomes (whether guidelines are used or not). We theorized that unpacking these C-M-O relationships would facilitate our understanding of guideline implementability. However, one difficulty with the Realist Review method is that it lacks a comprehensive process to compare disciplinary perspectives on a given issue. We then considered Meta-narrative synthesis, which can be helpful
for analysing data across different fields or disciplines [16]. Meta-ethnography was another method that we considered, which involves translating key concepts from one study to another to reveal new insights [17], but its focus on qualitative studies presents challenges when the data set is large and comprised of mixed study designs. This lack of a “perfect fit” highlights the need to consider all factors associated with the research question when deciding which method is the most appropriate to answer it. These factors included determining the breadth of evidence needed (quantitative or qualitative or both) and balancing this need with the feasibility or resources available to perform the review, anticipating the end-users of findings, and determining to what extent the method provides strategies for rigor and transparency. In fact, these are similar to the considerations we may use for selecting the most appropriate methods for primary studies. There has been a resurgence of interest in developing new knowledge synthesis methods to address the limitations of some of the traditional synthesis strategies such as the systematic review. Like the realist review, the advantage of these methods is that they can help organize information from underconceptualized fields like knowledge translation and quality improvement to create a more cumulative knowledge base. However, methodological strategies that are more accessible are required if they are to be widely used and optimized. To this end, a scoping review by Tricco et al. is currently underway to determine which knowledge synthesis methods are available, and to develop a systematic process to help researchers select the most appropriate method(s) to address their research questions about complex evidence [18].

A limitation of our work is that the approach we used was largely interpretive. However, the quality of synthesis is dependent on reviewers’ explicitness and reflexivity.
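The context-mechanism-outcome (C-M-O) reasoning described in this Discussion can be represented as a simple record structure. The sketch below is illustrative only: the field names and the example configuration are assumptions for exposition, not findings of the review.

```python
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    """One context-mechanism-outcome configuration from a realist review."""
    context: str    # circumstances and settings of guideline use
    mechanism: str  # process within the guideline that explains its use
    outcome: bool   # whether the guideline is used or not

# Hypothetical configuration: clear, actionable wording (a Language
# attribute) triggering uptake in a time-pressured setting.
example = CMOConfiguration(
    context="time-pressured primary care consultation",
    mechanism="recommendations worded clearly and in active voice",
    outcome=True,
)
print(example)
```

Collecting many such records and comparing which mechanisms fire in which contexts is one way to make the "what works for whom, in what circumstances and why" question concrete.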
Table 3 Suggested approach to organize, synthesize, validate and make sense of complex findings

Step 1. Selection of analysis method
Points to consider: Which method is the most appropriate to answer research questions?
Example: We searched the literature for various synthesis methods of complex evidence.
Advantages: Potentially more valid if the method matches the question.
Challenges: There was no single synthesis method that best fit our questions.
How to overcome challenges: Adopt a flexible approach to match appropriate methods to research questions; consider selecting a primary analysis method supplemented by other or modified methods to address all questions.

Step 2. Organization and analysis of data
Points to consider: How will the data be organized? This will also depend on the selected analysis method.
Example: We sorted and organized our data (1736 guideline attributes) in an Excel database; the analysis process was done in duplicate.
Advantages: Sorting of concepts and themes on multiple levels (e.g., across attributes, categories, disciplines); duplicate analysis minimizes bias.
Challenges: Difficult to keep track of changes from multiple reviewers; duplicate review is time consuming and resource intensive.
How to overcome challenges: We used a modified duplicate review process that involved a group of second reviewers “auditing” the analysis of primary reviewers; ensure that document tracking is transparent and efficient (e.g., track and document changes and include detailed notes from all reviewers).

Step 3. Validity measures
Points to consider: How are you going to verify findings and minimize bias?
Example: We sought expert consensus on findings using survey methodology.
Advantages: Survey methodology is quick and efficient.
Challenges: Survey methodology has inherent biases.
How to overcome challenges: Depending on resources, other consensus methods such as the Delphi method may increase validity; maintain transparency (i.e., document what was planned, what was done and why).

Step 4. Representation of data
Points to consider: How will the results and data be used? Who are the target knowledge end users?
Example: We developed a conceptual map of guideline implementability for guideline developers and end-users.
Advantages: The conceptual map contributes to the understanding of guideline implementability; the process advances knowledge about analysis methods for complex evidence.
Challenges: There may be other factors not captured in the map that may influence guideline implementability.
How to overcome challenges: The conceptual framework needs to be refined according to the codebook of definitions, and rigorously evaluated to determine the feasibility of its use by guideline developers and its potential to influence guideline uptake by family physicians.

Step 5. Dissemination of data
Points to consider: To what extent should the data be disseminated? Will the work inform practice, system, policy?
Example: The map will inform a guideline implementability framework for guideline developers, users and policy makers.
Advantages: The framework will inform end-users about attributes that facilitate guideline uptake, and may also inform policy around guideline development.
Challenges: There may be other factors influencing guideline implementability.
How to overcome challenges: Prior to dissemination, the framework will need to undergo rigorous evaluation (including quantitative and qualitative studies) to test its potential to influence guideline uptake by family physicians, who are the primary end-users of clinical practice guidelines.
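The modified duplicate-review process in step 2 of Table 3 can be sketched as a comparison of a primary reviewer’s categorizations against an auditor’s, flagging divergent classifications for consensus discussion. The attribute assignments below are hypothetical examples, and percent agreement is shown only as a rough tracking metric the paper does not itself prescribe.

```python
# Hypothetical categorizations by a primary reviewer and an auditor.
primary = {"Unambiguous": "Clarity", "Complex": "Complexity",
           "Active voice": "Actionability"}
audit   = {"Unambiguous": "Clarity", "Complex": "Information overload",
           "Active voice": "Wording"}

# Document "agreed" and "divergent" classifications.
agreed = {a for a in primary if primary[a] == audit.get(a)}
divergent = {a: (primary[a], audit[a]) for a in primary
             if a in audit and primary[a] != audit[a]}

# Divergent items would go to consensus-based group discussion.
agreement = len(agreed) / len(primary)
print(f"{agreement:.0%} agreement; to resolve: {sorted(divergent)}")
```

Keeping the agreed/divergent log explicit addresses the tracking challenge noted in the table: every change from the audit is documented rather than silently overwritten.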
doi:10.1186/1471-2288-13-112
Cite this article as: Kastner et al.: Making sense of complex data: a
mapping process for analyzing findings of a realist review on guideline
implementability. BMC Medical Research Methodology 2013 13:112.