
Kastner et al. BMC Medical Research Methodology 2013, 13:112
http://www.biomedcentral.com/1471-2288/13/112

RESEARCH ARTICLE Open Access

Making sense of complex data: a mapping process for analyzing findings of a realist review on guideline implementability
Monika Kastner1*, Julie Makarski2, Leigh Hayden1, Lisa Durocher2, Ananda Chatterjee1, Melissa Brouwers2
and Onil Bhattacharyya1

Abstract
Background: Realist reviews offer a rigorous method to analyze heterogeneous data emerging from multiple
disciplines as a means to develop new concepts, understand the relationships between them, and identify the
evidentiary base underpinning them. However, emerging synthesis methods such as the Realist Review are not well
operationalized and may be difficult for the novice researcher to grasp. The objective of this paper is to describe
the development of an analytic process to organize and synthesize data from a realist review.
Methods: Clinical practice guidelines have had an inconsistent and modest impact on clinical practice, which may
in part be due to limitations in their design. This study illustrates the development of a transparent method for
organizing and analyzing a complex data set informed by a Realist Review on guideline implementability to better
understand the characteristics of guidelines that affect their uptake in practice (e.g., clarity, format). The data
organization method consisted of 4 levels of refinement: 1) extraction and 2) organization of data; 3) creation of a
conceptual map of guideline implementability; and 4) the development of a codebook of definitions.
Results: This new method comprises four steps: data extraction, data organization, development of a
conceptual map, and operationalization via a codebook. Applying this method, we extracted 1736 guideline
attributes from 278 articles into a consensus-based set of categories, and collapsed them into 5 core conceptual
domains for our guideline implementability map: Language, Format, Rigor of development, Feasibility,
and Decision-making.
Conclusions: This study advances analysis methods by offering a systematic approach to analyzing complex data
sets where the goals are to condense, organize and identify relationships.

Background
Complex interventions, such as those used to improve quality of health care, are informed by principles from health services research, management, psychology and engineering, in addition to medicine. Despite this, they often lack a clear theoretical basis, making it hard to summarize this disparate literature in a way that can inform intervention design or interpretation of results [1]. A realist review is a knowledge synthesis methodology pioneered by Ray Pawson [2], which seeks to better understand what works for whom, in what circumstances and why [2]. Realist reviews are an emerging method with few published examples [3-5], and are particularly relevant for complex and under-conceptualized topics with a heterogeneous evidence base where traditional systematic reviews would often conclude that there is no evidence to inform next steps [6]. The recently published publication standards for Realist Reviews (i.e., the RAMESES criteria [7]) will likely facilitate improved reporting of this method, as existing techniques to organize and synthesize such information are not well operationalized [8], and require further development to be optimized and to help novice researchers manage large datasets.

To advance the science of analyzing complex and disparate data, this paper describes the development of a process for organizing and analyzing complex evidence

* Correspondence: [email protected]
1St. Michael's Hospital Li Ka Shing Knowledge Institute, 209 Victoria Street, Toronto M5B 1W8, ON, Canada
Full list of author information is available at the end of the article

© 2013 Kastner et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

in the context of a Realist Review in the area of guideline implementability. We selected guideline implementability to illustrate our data analysis process because guidelines are considered an important knowledge translation tool, yet their potential to facilitate the implementation of evidence into clinical practice has largely been unrealized [9-11]. Poor guideline uptake may be due to external factors such as the complex and competing demands on providers' time, organizational constraints, and lack of knowledge, as well as characteristics of the guidelines themselves (i.e., intrinsic factors). Approaches to improving uptake of guidelines have largely focused on complex knowledge translation interventions consisting of extrinsic strategies that target providers or practice environments. However, these strategies have yielded modest improvement with variable costs [12,13]. Intrinsic strategies (e.g., addressing the clarity, specificity and clinical applicability of recommendations) are promising because they are inexpensive, easy to implement and may be broadly applicable. Additionally, strategies that are being developed do not include disciplines outside of medicine (e.g., management and psychology), so they are not being optimized to advance knowledge in this area. We therefore conducted a realist review to better understand the concept of guideline implementability from a broad perspective of the literature, and to identify how guidelines could be optimized to increase their impact. More specifically, our goal was to identify guideline attributes that affect guideline uptake in clinical practice. The complete protocol for this review is described elsewhere [14], and the final results of this review will be published in a separate paper. Briefly, the realist review considered evidence from four disciplines (medicine, psychology, management, and human factors engineering) to determine what works for whom, in what circumstances and why in relation to guideline implementation [14]. The search strategy included expert-identified, purposive and bibliographic searching. The analytic approach drew on multiple analysis methods (i.e., Realist synthesis and other qualitative synthesis methods). Although the realist review synthesis methods were helpful for interrogating our underlying theory (i.e., why guidelines are not being implemented) [1], Realist Review methods are relatively new, and their guidance on the process for organizing and relating findings (i.e., the RAMESES criteria [7]) may be a challenge to reproduce by people who are new to the field.

To address this issue, we describe the development of a process for organizing and analyzing complex evidence derived from findings of our realist review on guideline implementability as a means to advance the science of knowledge synthesis.

Methods and results
Figure 1 shows the flow of the process that was used to make sense of the realist review data, consisting of 4 levels of refinement: 1) extraction and 2) organization of data; 3) creation of a conceptual map of guideline implementability; and 4) the operationalization of the map and its components vis-a-vis the development of a codebook of definitions that will inform the design of a framework. In this section we provide a description of the method used at each step and the results that emerged when the step was applied to our data set.

Level 1 – Extraction of data
Two groups of investigators extracted 1736 intrinsic guideline attributes (i.e., characteristics) from 278 included articles on study discipline (i.e., medicine, psychology, management, human factors engineering), attribute name and definition (as documented by authors), attribute operationalization (i.e., an explanation of how the attribute functions within the context of the discipline or study), attribute relationship with uptake, and any potential tradeoffs. To ensure reliability, consistency and accuracy of the data extraction, we used an auditing process whereby secondary reviewers checked data extractions of primary reviewers. Disagreements were resolved through consensus-based group discussions involving all investigators.

Level 2 – Organization of data
The 1736 identified attributes were sorted so that those with the same name or root (e.g., valid/validity) were grouped together in an Excel database. Two groups of investigators (6 in total, 3 per group) then took the same list of sorted attributes and independently clustered them into logical categories. This involved a process of building up groups of similar or like attributes (including their synonyms and antonyms) that conceptually "fit" within a larger theme, and creating a label and description for each category. Table 1 describes the operationalization of this process. Categorizations between the two groups were compared for agreement, aimed at identifying a common set of categories and their included attributes. This involved documenting "agreed" and "divergent" classifications, and making consensus-based decisions through group discussion. This highly systematic approach allowed for efficient filtering and consolidation of a large and complex dataset.

Level 3 – Building a conceptual map of guideline implementability
Using a consensus approach among the two groups of investigators via discussions of the attribute definitions and their similarities and relationships, the final set of 27 categories (Table 2) was further grouped into 5 broad dimensions associated with the uptake or use of guidelines: Language, Format, Rigor of Development, Feasibility, and Decision-making. Based on the evidence around these domains, we developed broad and common-sense definitions for each as well as their included categories, which

[Figure 1 Flow of data analysis process. The flowchart shows the realist review methods feeding into the four levels of analysis: (i) Search strategy: an iterative, multiple search strategy consisting of 5 non-linear stages (Stage 1: core articles; Stage 2: expert identified; Stage 3: PubMed related articles; Stage 4: bibliography; Stage 5: other). (ii) Article selection: two sets of reviewers (6 in total) independently screened articles using inclusion criteria (abstract level: N = 2044; full-text level: N = 350). (iii) LEVEL 1, Extraction of data: 1736 guideline attributes were extracted in duplicate from 278 included articles; through an auditing process, primary reviewers' data extractions were audited by second reviewers and disagreements resolved through consensus. (iv) LEVEL 2, Organization of data: 2 sets of investigators (Group 1: 33 categories; Group 2: 28 categories) independently clustered the 1736 attributes into logical categories; "agreed" and "divergent" classifications were compared and a common set of 27 categories was derived through group discussion, then validated with 9 experts to identify flaws in categorization and check the sense and fit of categories, attributes, and their labels and definitions. (v) LEVEL 3, Building a conceptual map of guideline implementability: discussions of the content and patterns of the attributes within categories led to their further classification into 5 broad domains: Language, Format, Evidence, Feasibility, and Decision-making. (vi) LEVEL 4, Development of a codebook of definitions: determine evidence-based definitions, operationalization, context, and relationship with uptake.]

Figure 1 Flow of data analysis process.
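The comparison step in Level 2 of Figure 1 hinges on finding "agreed" and "divergent" classifications between the two groups' independent categorizations. As a rough illustration only (the study team worked in an Excel database, not in code, and the attribute and category names below are hypothetical examples), the comparison can be sketched in Python:

```python
# Illustrative sketch, not the authors' software: comparing two groups'
# independent attribute-to-category assignments (hypothetical data).
group1 = {"unambiguous": "Clarity", "precise": "Clarity", "concise": "Wording"}
group2 = {"unambiguous": "Clarity", "precise": "Specificity", "concise": "Wording"}

# "Agreed" classifications: both groups placed the attribute in the same category.
agreed = {a: c for a, c in group1.items() if group2.get(a) == c}

# "Divergent" classifications: pairs of conflicting category assignments.
divergent = {a: (c, group2.get(a)) for a, c in group1.items() if group2.get(a) != c}
```

In the paper's process, divergent classifications like these were the ones resolved through consensus-based group discussion.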

informed a conceptual map of guideline implementability. The development of this map was guided by a web-based visualization tool, MindMeister (http://www.mindmeister.com), which was used iteratively by all investigators to determine the structure of the framework (i.e., moving back-and-forth from the map to definitions and source material), and to facilitate the decision-making process for grouping and identifying patterns in the data. Such visualization techniques have been shown to facilitate comprehension, support inferences about the qualities of parts and the relations among them, and be useful for revealing the hierarchy of groupings and important relationships [15]. To validate and identify potential flaws in categorization, and to obtain agreement on the sensibility and fit of attributes within and across the categories, a group of 9 stakeholders with knowledge translation and guideline development expertise was surveyed. These experts were asked to review the content of the 5 domains and their sub-domains, and to rename, rearrange and condense attributes as they saw fit. The survey comprised Likert-type and open-ended questions about the operational definition of the domains, and the fit of categories and their attributes within them (see Additional file 1). Through consensus-based discussions amongst our team, findings of this survey were used to make modifications to the organization and structure of our data (e.g., collapsing and renaming some attributes, categories and domains).

Level 4 – Development of a codebook
The two groups collectively developed a codebook of definitions to better understand each of the 5 domains of implementability, the relationships between guideline attributes and their uptake, and potential tradeoffs. The process involved documenting definitions for modifiable

Table 1 Operationalization of the categorization process using the "LANGUAGE" domain as an example

Goal 1: Organize, group, and appropriately label similar or "like" attributes
  Step 1. Group attributes that are antonyms (e.g., Complex/Simple)
  Step 2. Group attributes that are synonyms (e.g., Unclear/Confusing)
  Step 3. Group attributes with the same root (e.g., Specific/Specificity; Validity/Valid)
  Step 4. Sort the database by attribute

Goal 2: Categorize attributes into logical clusters
  Step 5. Are there commonalities among attributes?
  Step 6. Is there a central theme or focus among groups of attributes?
  Example: the attributes Unambiguous, Precise and Specific can be grouped into a category called "Clarity"

Goal 3: Go through each cluster to determine sense and fit of attributes
  Step 7. Do the attributes belong within the same cluster?
  Step 8. Can they be collapsed?
  Step 9. Use attribute definitions to make these decisions
  Example: the following categories can be collapsed: "Complexity" with "Information overload"; "Actionability" (e.g., using active voice) with "Wording"

Goal 4: Develop a definition for clusters
  Step 10. Based on their included attributes and definitions, define and label the cluster
  Example: the LANGUAGE domain can be defined as the clarity, precision, and specificity of the context and message of the guideline
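Steps 1 through 4 of Table 1 amount to normalizing attribute variants so that forms sharing a root sort together. A minimal sketch, assuming a deliberately crude suffix-stripping rule (the actual grouping was done manually by reviewers; the `root` function here is an illustrative stand-in, not the authors' procedure):

```python
# Illustrative sketch of Table 1, steps 1-4: group attribute variants by root.
from collections import defaultdict

def root(attribute: str) -> str:
    """Crude normalization for illustration only (e.g., valid/validity -> 'valid')."""
    word = attribute.lower()
    for suffix in ("ity", "ness"):
        if word.endswith(suffix):
            return word[: -len(suffix)]
    return word

attributes = ["Valid", "Validity", "Specific", "Specificity", "Clear"]
groups = defaultdict(list)
for a in attributes:
    groups[root(a)].append(a)
# groups now clusters Valid/Validity together and Specific/Specificity together,
# ready for the reviewer-driven categorization in steps 5-10.
```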

attributes (i.e., those that have the potential to be changed by guideline developers) and their operationalization (i.e., how the attribute can be used and examples of how it functions), the context and setting in which these occur, for whom, any relationship with uptake, and attribute tradeoffs if they existed (see Additional file 2 for an example codebook). The codebook was developed one domain at a time using a modified duplicate reviewing process that involved a set of primary reviewers extracting and documenting the information, and a second group of reviewers "auditing" (i.e., checking) primary reviews in small-group discussions; a third group of reviewers resolved disagreements. The main objectives of the auditing process were to verify the completion of documentation, to ensure the appropriate understanding of concepts, and to determine the best fit of attributes and information within and between categories and domains.

Discussion
Complex interventions are often atheoretical and loosely draw on a broad literature that includes different disciplines and is difficult to summarize systematically. Qualitative synthesis methods are poorly operationalized and do not describe how to organize and analyze large heterogeneous datasets. We used a systematic process of analysis to build a conceptual map of guideline implementability through the classification of 1736 attributes into a consensus-based set of categories, which were then collapsed into 5 core conceptual domains of guideline implementability: Language, Format, Rigor of development, Feasibility, and Decision-making. These findings will be used to answer our realist review question: what is it about guidelines that facilitates or impedes their uptake, for whom and in what circumstances, and how and why does this happen?

We reviewed a range of review methods to answer our research question. The details explaining the rationale for selecting a Realist Review are published in our protocol [14]. Briefly, we assessed a range of review methods (i.e., Realist Review, Meta-narrative synthesis, and Meta-ethnography) to determine which of these was the most appropriate, but we found that none was a "perfect fit" that sufficiently covered all our questions. We selected the Realist Review method because the approach provides the most systematic guidance on how to conduct a complete review (i.e., a process for a search strategy, article selection, and data analysis), it allows the inclusion of diverse evidence (i.e., quantitative and qualitative), and it provides an explanatory investigation of underlying theories and mechanisms of the study under investigation. In our case, 'causation' was determined by considering the interaction between contexts (i.e., the circumstances and settings of guideline use), mechanisms (i.e., the processes operating within guidelines that explain why they are used in some circumstances but not in others) and outcomes (i.e., whether guidelines are used or not). We theorized that unpacking these C-M-O relationships would facilitate our understanding of guideline implementability. However, one difficulty with the Realist Review method is that it lacks a comprehensive process to compare disciplinary perspectives on a given issue. We then considered Meta-narrative synthesis, which can be helpful

Table 2 Final list of attribute categories (N = 27) across 5 domains of guideline implementability

Domain: Language
- Clarity: Ambiguity, Specificity, Vagueness
- Cognitive fluency: Congruity, Fluency, Schema
- Complexity: Complexity, Options, Difficult to understand
- Wording: Concision, Embedded propositions

Domain: Format
- Framing: Relative advantage, Gain-loss frame
- Graphical: Algorithm, Graphs, Tables
- Inclusion of specific elements in recommendation: Elements (e.g., include harms-benefits, patient information, Boolean operators)
- Mode of delivery: Accessibility, Computability
- Presentation/Layout/Design: Visual imagery, Presentation
- Structure/Organization: Arrangement

Domain: Rigor of development
- Benefits-harms: Balance of benefits/harms, Dual viewpoint
- Credibility: Credible, Authoritative
- Reliability/Reproducibility: Reliable, Reproducible, Explicitness
- Rigor of development: Evidence-based, Evidence-linked
- Strength and quality of recommendations: Quality of evidence, Strength of evidence, Evidence grading
- Validity: Validity, Up-to-date

Domain: Feasibility
- Acceptability: Acceptability, Fit with decision-making, Perceived usefulness, Visibility
- Actionability: Actionable, Executable, Operationalizable
- Adaptability: Adaptability, Context, Tailoring
- Feasibility: Feasibility, Compatibility, Costs, Resources
- Implementation considerations: Implementability factors affecting feasibility, Trialability
- Usability: Ease of use, Usefulness

Domain: Decision-making
- Clinical significance: Clinical relevance, Applicability
- Considered judgment: Appropriateness, Value judgments
- Flexibility: Flexibility, Clinical freedom
- Patient preferences: Patient involvement/communication/values
- Values: Beliefs, Compatibility, Values/Norms
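The final structure in Table 2 is a simple two-level hierarchy, which can be represented directly as a mapping from domain to categories. The sketch below, abbreviated to a subset of the 27 categories and not part of the authors' materials, shows the idea along with a reverse lookup from category to domain:

```python
# Illustrative sketch of Table 2's hierarchy (abbreviated subset of categories).
implementability_map = {
    "Language": ["Clarity", "Cognitive fluency", "Complexity", "Wording"],
    "Format": ["Framing", "Graphical", "Mode of delivery"],
    "Rigor of development": ["Credibility", "Validity"],
    "Feasibility": ["Acceptability", "Actionability", "Adaptability", "Usability"],
    "Decision-making": ["Clinical significance", "Flexibility", "Patient preferences"],
}

def domain_of(category: str) -> str:
    """Reverse lookup: which of the 5 domains a category belongs to."""
    for domain, categories in implementability_map.items():
        if category in categories:
            return domain
    raise KeyError(category)
```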

for analysing data across different fields or disciplines [16]. Meta-ethnography was another method that we considered, which involves translating key concepts from one study to another to reveal new insights [17], but its focus on qualitative studies presents challenges when the data set is large and comprised of mixed study designs. This lack of a "perfect fit" highlights the need to consider all factors associated with the research question when deciding which method is the most appropriate to answer it. These factors include determining the breadth of evidence needed (quantitative, qualitative, or both) and balancing this need with the feasibility or resources available to perform the review, anticipating the end-users of the findings, and considering to what extent the method provides strategies for rigor and transparency. In fact, these are similar to the considerations we may use for selecting the most appropriate methods for primary studies. There has been a resurgence of interest in developing new knowledge synthesis methods to address the limitations of some of the traditional synthesis strategies such as the systematic review. Like the realist review, the advantage of these methods is that they can help organize information from underconceptualized fields like knowledge translation and quality improvement to create a more cumulative knowledge base. However, methodological strategies that are more accessible are required if they are to be widely used and optimized. To this end, a scoping review by Tricco et al. is currently underway to determine which knowledge synthesis methods are available, and to develop a systematic process to help researchers select the most appropriate method(s) to address their research questions about complex evidence [18]. A limitation of our work is that the approach we used was largely interpretive. However, the quality of synthesis is dependent on reviewers' explicitness and reflexivity
Table 3 Suggested approach to organize, synthesize, validate and make sense of complex findings

Step 1: Selection of analysis method
- Points to consider: Which method is the most appropriate to answer the research questions?
- Example: We searched the literature for various synthesis methods of complex evidence
- Advantages: Potentially more valid if the method matches the question
- Challenges: There was no single synthesis method that best fit our questions
- How to overcome challenges: Adopt a flexible approach to match appropriate methods to the research questions; consider selecting a primary analysis method supplemented by other or modified methods to address all questions

Step 2: Organization and analysis of data
- Points to consider: How will the data be organized? This will also depend on the selected analysis method
- Example: We sorted and organized our data (1736 guideline attributes) in an Excel database; the analysis process was done in duplicate
- Advantages: Sorting of concepts and themes on multiple levels (e.g., across attributes, categories, disciplines); duplicate analysis minimizes bias
- Challenges: Difficult to keep track of changes from multiple reviewers; duplicate review is time consuming and resource intensive
- How to overcome challenges: We used a modified duplicate review process in which a group of second reviewers "audited" the analysis of primary reviewers; ensure that document tracking is transparent and efficient (e.g., track and document changes and include detailed notes from all reviewers)

Step 3: Validity measures
- Points to consider: How are you going to verify findings and minimize bias?
- Example: Sought expert consensus on findings using survey methodology
- Advantages: Survey methodology is quick and efficient
- Challenges: Survey methodology has inherent biases
- How to overcome challenges: Depending on resources, other consensus methods such as the Delphi method may increase validity; transparency (i.e., document what was planned, what was done and why)

Step 4: Representation of data
- Points to consider: How will the results and data be used? Who are the target knowledge end users?
- Example: We developed a conceptual map of guideline implementability for guideline developers and end-users; the process advances knowledge about analysis methods for complex evidence
- Advantages: The conceptual map contributes to the understanding of guideline implementability
- Challenges: There may be other factors not captured in the map that may influence guideline implementability
- How to overcome challenges: The conceptual framework needs to be refined according to the codebook of definitions, and rigorously evaluated to determine the feasibility of its use by guideline developers and its potential to influence guideline uptake by family physicians

Step 5: Dissemination of data
- Points to consider: To what extent should the data be disseminated? Will the work inform practice, system, policy?
- Example: The map will inform a guideline implementability framework for guideline developers, users and policy makers
- Advantages: The framework will inform end-users about attributes that facilitate guideline uptake, and may also inform policy around guideline development
- Challenges: There may be other factors influencing guideline implementability
- How to overcome challenges: Prior to dissemination, the framework will need to undergo rigorous evaluation (including quantitative and qualitative studies) to test its potential to influence guideline uptake by family physicians, who are the primary end-users of clinical practice guidelines

of the methods. In our process to make sense of the complex data that emerged from our Realist Review, we ensured transparency of the methods and included several validity measures to minimize sources of error. This was important given the interpretive nature of our process and the anticipated learning curve involved in data abstraction. The measures included an auditing process whereby primary data extractions were checked by secondary reviewers, and a process to verify these data against a codebook of definitions during Level 4 analysis. Lastly, we tested the validity of our data organization and analysis through an expert survey to verify the sense and fit of attributes and categories within the framework.

In our realist review, we considered each attribute and integrated like-attributes into common themes and domains. Further, we considered evidence of impact or effectiveness on our relevant outcome. For example, evidence indicates that a guideline recommendation is more actionable if it clearly specifies when and who should do precisely what action; if a recommendation does not specify these steps or uses passive verbs, its actionability will be diminished. Such conceptualization of the evidence can then be useful to support or refute various theories or their elements in the literature about guideline implementability. These strategies enabled us to embrace the whole of the data, with few preconceived expectations, to identify and carefully define elements that are relevant to guideline uptake. The approach described in this paper is an example of how new analytic methods can emerge and respond to the challenges related to finding the best fit between methods and research questions. Based on our experience, Table 3 highlights suggested steps to help determine the purpose and scope of poorly understood concepts under investigation such as guideline implementability. This may be particularly useful to help organize, synthesize, validate, and represent complex data resulting from qualitative reviews in a relevant and meaningful way.

Our work has the potential for wide influence. The proposed method will appeal to investigators because the process has now been operationalized, is fairly straightforward to apply, can be applied to a wide range of topics, and offers a significant return on effort. Expanding this knowledge base will become particularly important as these rapidly expanding fields most often require more sophisticated techniques to analyze data informed by complex interventions that cut across multiple disciplines and by the input of multiple stakeholders.

Conclusions
This study represents a novel contribution to advancing complex data analysis methods by offering a systematic approach to analyzing large and disparate data sets where the goals are to condense, organize and identify relationships.

Additional files
Additional file 1: Expert feedback review form on the Guideline Implementability Framework.
Additional file 2: Example of a Codebook of definitions.

Competing interests
None of the authors have any financial or non-financial competing interests to declare.

Authors' contributions
All authors contributed to the design of the study. MK, LH, AC, JM and LD executed the study, and MK, LH, JM, AC, LD, OB and MB conducted the analysis and interpreted the results. MK drafted the manuscript, and all authors read and approved the final manuscript.

Author details
1St. Michael's Hospital Li Ka Shing Knowledge Institute, 209 Victoria Street, Toronto M5B 1W8, ON, Canada. 2Department of Oncology, Juravinski Hospital and Cancer Centre, McMaster University, 711 Concession Street, Hamilton L8V 1C3, ON, Canada.

Received: 5 December 2012 Accepted: 6 September 2013
Published: 12 September 2013

References
1. Grimshaw J, Eccles MP: Is evidence-based implementation of evidence-based care possible? Med J Aust 2006, 180:S50–S51.
2. Pawson R, Greenhalgh T, Harvey G, Walshe K: Realist review - a new method of systematic review designed for complex policy interventions. J Health Serv Res Policy 2005, 19(Suppl 1):S21–S34.
3. Wong G, Greenhalgh T, Pawson R: Internet-based medical education: a realist review of what works, for whom and in what circumstances. BMC Med Educ 2010, 10:12.
4. Wong G, Pawson R, Owen L: Policy guidance on threats to legislative interventions in public health: a realist review. BMC Public Health 2011, 11:222.
5. Rycroft-Malone J, McCormack B, Hutchinson AM, DeCorby K, Bucknall TK, Kent B, Schultz A, Snelgrove-Clarke E, Stetler CB, Titler M, Wallin L, Wilson V: Realist synthesis: illustrating the method for implementation research. Implement Sci 2012, 7:33.
6. Greenhalgh T, Peacock R: Effectiveness and efficiency of search methods in systematic reviews of complex evidence: audit of primary sources. BMJ 2005, 331:1064–1065.
7. Wong G, Greenhalgh T, Westhorp G, Buckingham J, Pawson R: RAMESES publication standards: realist syntheses. BMC Med 2013, 11:21.
8. Rycroft-Malone J, Fontenla M, Bick D, Seers K: A realistic evaluation: the case of protocol-based care. Implement Sci 2010, 5:38.
9. Kendall E, Sunderland N, Muenchberger H, Armstrong K: When guidelines need guidance: considerations and strategies for improving the adoption of chronic disease evidence by general practitioners. J Eval Clin Pract 2009, 15:1082–1090.
10. Brown LC, Johnson JA, Majumdar SR, et al: Evidence of suboptimal management of cardiovascular risk in patients with type 2 diabetes mellitus and symptomatic atherosclerosis. CMAJ 2004, 171(10):1189–1192.
11. Grimshaw J, Russell I: Achieving health gain through clinical guidelines I: Developing scientifically valid guidelines. Qual Health Care 1993, 2:243–248.
12. Grimshaw J, Eccles M, Thomas R, et al: Toward evidence-based quality improvement. J Gen Intern Med 2006, 21:S14–S20.
13. Michie S, Johnston M: Changing clinical behaviour by making guidelines specific. BMJ 2004, 328:343–345.
14. Kastner M, Estey E, Perrier L, et al: Understanding the relationship between the perceived characteristics of clinical practice guidelines and their uptake: protocol for a realist review. Implement Sci 2011, 6:69.
15. Eppler MJ, Mengis J: Drawing Distinction: The Visualization of Classification in Qualitative Research. =mcm working paper No. 2/2011. St. Gallen: =mcm institute, University of St. Gallen; 2011. Available at: www.knowledge-communication.org.

16. Greenhalgh T, Robert G, Macfarlane F, et al: Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Soc Sci Med 2005, 61:417–430.
17. Noblit GW, Hare RD: Meta-ethnography: synthesizing qualitative studies. Newbury Park, California: Sage; 1988.
18. Kastner M, Tricco AC, Straus SE: How can we make sense of Cochrane reviews of complex interventions? Consideration of 3 complementary synthesis methods (realist review, meta-narrative, meta-ethnography) to better understand the "how" and "why" of findings. Vancouver, BC: Canadian Cochrane Symposium; 2011.

doi:10.1186/1471-2288-13-112
Cite this article as: Kastner et al.: Making sense of complex data: a mapping process for analyzing findings of a realist review on guideline implementability. BMC Medical Research Methodology 2013, 13:112.
