Tailoring Implementation Strategies
Author manuscript
J Behav Health Serv Res. Author manuscript; available in PMC 2018 April 01.
Abstract
Implementing behavioral health interventions is a complicated process. It has been suggested that
implementation strategies should be selected and tailored to address the contextual needs of a
given change effort; however, there is limited guidance as to how to do this. This article proposes
four methods (concept mapping, group model building, conjoint analysis, and intervention
mapping) that could be used to match implementation strategies to identified barriers and
facilitators for a particular evidence-based practice or process change being implemented in a
given setting. Each method is reviewed, examples of their use are provided, and their strengths and
weaknesses are discussed. The discussion includes suggestions for future research pertaining to
implementation strategies and highlights these methods' relevance to behavioral health services
and research.
Keywords
Implementation Strategies; Implementation Research; Tailoring
Correspondence concerning this article should be addressed to Byron J. Powell, Department of Health Policy and Management,
Gillings School of Global Public Health, University of North Carolina at Chapel Hill, 1105C McGavran-Greenberg Hall, Campus Box
7411, Chapel Hill, NC 27599-7411, [email protected], 919-843-2576.
Conflict of Interest: The authors have no conflicts of interest to declare.
Powell et al. Page 2
Introduction
There is limited evidence regarding the types of strategies that may be effective in particular circumstances. This is
particularly true in behavioral health settings, as there are far fewer randomized controlled
trials and head-to-head comparisons of implementation strategies than in other medical and
health service settings.10–12 Thus, while it is well established that simply training clinicians
to deliver EBPs is insufficient,13 it is less clear which strategies are needed at the client,
clinician, team, organizational, system, or policy levels to facilitate implementation and
sustainment.
Conceptual frameworks14 can guide research and practice by suggesting factors that
influence implementation outcomes15,16 and providing some direction for the selection and
tailoring of strategies.17,18 But researchers often fail to explicitly refer to guiding conceptual
frameworks,12,19 and when they do, they often simply borrow a subset of constructs or
outcomes without framing their study within the broader context of the framework.20 It is
not always clear how to translate frameworks into implementation strategy design,21 as
many are primarily heuristic and do not indicate the direction or nature of causal
mechanisms. Thus, while conceptual frameworks and theories can inform all aspects of
implementation research,22 they provide a necessary but not sufficient guide for selecting
and tailoring implementation strategies.
The considerable variation in EBPs and other process changes has implications for selecting
strategies. Scheirer,23 for example, has proposed a framework of six different intervention
types that vary in complexity and scope, from interventions implemented by individual
providers (e.g., cognitive behavioral therapy) to those requiring coordination across staff and
community agencies (e.g., multisystemic therapy, assertive community treatment) to those
embracing broad-scale system change (e.g., Philadelphia's recovery transformation).24 These
intervention types may require the use of unique constellations of implementation strategies
to ensure that they are integrated and sustained.25 Contextual variation also has implications
for selecting strategies,8 as settings are likely to vary with regard to patient-level (e.g., fit
between patient's cultural values and EBP,26 patient “buy in” to EBP);27 provider-level (e.g.,
attitudes toward EBPs28 and specific behavior change mechanisms7); organizational-level
(e.g. culture and climate,29 implementation climate30); and system-level characteristics (e.g.,
policies and funding structures31,32).
One example of matching a discrete strategy to a barrier would be preventing “therapist drift” by selecting fidelity monitoring, audit and
feedback, or ongoing consultation. Discrete strategies may also need to be tailored to
address a particular barrier. For example, in-person trainings may be difficult to scale-up in
community settings because they require substantial expenditures of time and money; thus,
the training may need to be delivered as a web-based module.
The need for systematic methods for selecting and tailoring implementation strategies
Selecting and tailoring strategies to address contextual needs “has considerable face validity
and is a feature of key frameworks and published guidance for implementation in health
care.”4 While the empirical evidence supporting the approach is not yet robust, a Cochrane
Review found that strategies tailored to address identified barriers to change were more
likely to improve professional practice than no strategy or guideline dissemination alone.33
As the review's authors caution, “even though tailored interventions
[implementation strategies] appear to be effective, we do not yet know the most effective
ways to identify barriers, to pick out from amongst all the barriers those that are most
important to address, or how to select interventions [implementation strategies] likely to
overcome them.”33(p20) Linking strategies to barriers, facilitators, and contextual features
remains a creative, emergent, and non-systematic process that occurs during implementation
efforts.35 It is clear that “systematic and rigorous methods are needed to enhance the linkage
between identified barriers and change strategies.”39(p169) These methods should take
relevant theory and evidence into account, elicit stakeholder feedback and participation, and
be specified clearly enough to be replicated in science and practice. To date, few studies
have advanced candidate methods capable of addressing that need.
suggesting four methods that can be used to improve the process by which strategies are
linked to the unique needs of implementation efforts: concept mapping,41,42 group model
building,43–45 conjoint analysis,46 and intervention mapping.47 These methods were selected
based upon the authors' expertise and a narrative search of the literature in multiple
disciplines (e.g., implementation science, public health, engineering, and marketing). They
were also chosen because they have been used to develop interventions in other contexts,
have substantial bodies of literature that could guide their use, and are not proprietary. The
evidence for their effectiveness in improving implementation and clinical outcomes is yet to
be determined; however, their use in other contexts suggests that they may be an important
step in a research agenda aiming to integrate the perspectives of implementation
stakeholders, make implementation planning more systematic, and improve the linkage
between implementation barriers and strategies. Below each method is reviewed and
examples of their use are provided. Table 1 contains a brief description of each method and
lists some advantages and disadvantages of using them to select and tailor implementation
strategies.
Concept mapping
Overview—Concept mapping is a mixed methods approach that organizes the ideas of a
group to form a common framework.42 Data collection involves engaging stakeholders in
brainstorming, sorting, and rating tasks. The brainstorming task is structured through a
“focus prompt” (e.g., “a potential barrier to implementing this EBP is____” or “one strategy
for implementing this EBP in our organization [service system, state, etc.] is ____”).
Stakeholders are encouraged to generate as many items as possible. These items are then
compiled and culled to a list of no more than 100 statements. Stakeholders then sort the
statements into conceptually consistent piles. Stakeholders also are asked to rate each
statement on a number of dimensions (e.g., importance, feasibility, changeability) using a
Likert scale. These data are then analyzed using multidimensional scaling and hierarchical
cluster analysis. The end product is a map that contains different shapes representing distinct
concepts that can also be depicted in a cluster rating map to reflect varying ratings on
specified dimensions such as importance and feasibility.42 Kane and Trochim's text42 provides detailed guidance on the method.
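The core analysis described above can be sketched in a few lines. The following is a minimal, illustrative example: the sorting data, the number of clusters, and the use of classical multidimensional scaling with Ward clustering are all assumptions made for demonstration; dedicated concept mapping software handles larger statement sets and overlays the rating data.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Hypothetical sorting data: each stakeholder assigns each of 6 statements
# to a pile (pile labels are arbitrary within each stakeholder's sort).
sorts = [
    [0, 0, 1, 1, 2, 2],  # stakeholder 1
    [0, 0, 0, 1, 2, 2],  # stakeholder 2
    [0, 1, 1, 1, 2, 2],  # stakeholder 3
]

n = len(sorts[0])
# Similarity: how often each pair of statements was sorted together.
sim = np.zeros((n, n))
for piles in sorts:
    for i in range(n):
        for j in range(n):
            sim[i, j] += piles[i] == piles[j]
dist = len(sorts) - sim  # co-sorting counts -> distances

# Classical multidimensional scaling: double-center the squared distances
# and take the top-2 eigenvectors as 2-D point-map coordinates.
d2 = dist ** 2
J = np.eye(n) - np.ones((n, n)) / n
B = -0.5 * J @ d2 @ J
vals, vecs = np.linalg.eigh(B)
coords = vecs[:, -2:] * np.sqrt(np.maximum(vals[-2:], 0))

# Hierarchical cluster analysis on the same distances yields the clusters
# that would be drawn as regions on the concept map.
labels = fcluster(linkage(dist[np.triu_indices(n, k=1)], method="ward"),
                  t=3, criterion="maxclust")
```

Statements that stakeholders repeatedly sorted together (here, the last two) end up in the same cluster, which is the quantitative core of the maps described above.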
How it has been used—Concept mapping has been used in strategic planning,
community building, curriculum development, the development of conceptual models, and
evaluation.49 Increasingly, it has been successfully used to address implementation-related
objectives such as identifying barriers and facilitators,15,50 assessing program fidelity,51 and
creating a conceptual model to translate cancer research into practice.52
Examples from behavioral health—Concept mapping can improve the selection and
tailoring of strategies in a number of ways. First, it can be used to identify factors that may
affect the implementation of a specific EBP in a specific setting, and to determine which
factors are most important and actionable from the perspectives of a wide range of
stakeholders.15,50,53 For example, Green and Aarons53 compared the perspectives of policy
and direct practice stakeholders regarding factors that influence the implementation of EBPs.
Figure 1 shows an example of a cluster rating map taken from their study. Each point on the
map represents a specific barrier or facilitator of EBP implementation that policy
stakeholders identified. The 14 conceptually distinct clusters were derived from the sorting
process and subsequent analysis (i.e., multidimensional scaling and hierarchical cluster
analysis). The layers on the map depict the policy stakeholders' “importance” ratings. Thus,
the cluster of barriers and facilitators labeled “funding” was seen as more important than the
cluster labeled “agency compatibility.” Green and colleagues provide additional guidance
and examples of the use of concept mapping in relation to implementation.41
Second, it can aid in comparing strategies generated in brainstorming sessions with those identified through relevant literature,5,6 and in generating
conceptually sound categories of strategies from which to select. Waltz et al.8 have recently
used concept mapping to create conceptually distinct categories of strategies and determine
their importance and feasibility according to experts in implementation science and clinical
practice.
Finally, concept mapping could be used to generate consensus about the strategies (and
categories of strategies) that can most effectively, efficiently, and feasibly address a specific
barrier or set of barriers (e.g., a focus prompt could be, “In order to more effectively engage
patients in treatment, we need to____”).
Group model building
Overview—Group Model Building comes from the field of system dynamics and involves
the “client” in identifying and implementing solutions to “messy” problems.43 System
dynamics uses stakeholder feedback to build models of causal loops (including variables,
relationships, and feedback) so that the potential consequences of a system's structure can
be better understood prior to actual implementation. Group model building has evolved to include numerous structures and
formats (see Rouwette and Vennix54 for a review), with guidelines for choosing the optimal
approach.44 The reference group approach is perhaps the most commonly used. In this
approach, a reference group of 8 to 10 “clients,” who could be directors, managers, or
other stakeholders, is convened and the model building process is initiated. This requires
careful delineation of the relevant variables by the referent group
and is termed the “conceptualization stage.” For instance, the referent group might identify
training needs, resources, clinician confidence, time constraints, and supervision as relevant
variables to be included in the model. The facilitator then generates a causal loop or stock
and flow diagram based on this information. Figure 2 provides an example of a causal loop
diagram related to implementation.56 These models can be drawn freehand or created with
the aid of model building and simulation software. Third, a mathematical formula is
generated to quantify specified parameters. This model building stage requires the expertise
of the group model building team; given the time-consuming and complicated nature of this
stage, the referent group is used only for consultation. The model is then ready for simulation of the
proposed solution. More information about group model building can be found in
Hovmand's text,57 and training and consultation opportunities can be sought through the
System Dynamics Society.58
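As an illustration of the final simulation stage, the sketch below Euler-integrates a deliberately tiny stock-and-flow model with one stock, one inflow, and one balancing outflow. The variables, rates, and time horizon are hypothetical, chosen only to show how such a model anticipates a system's behavior before implementation.

```python
# Minimal stock-and-flow simulation of a hypothetical feedback structure:
# supervision builds clinician skill, while "drift" erodes it. All variable
# names and rate constants are illustrative, not taken from the article.

def simulate(supervision_hours: float, steps: int = 52, dt: float = 1.0):
    """Euler-integrate the clinician-skill stock over `steps` weeks."""
    skill = 0.2        # stock: proportion of sessions delivered with fidelity
    gain_rate = 0.02   # skill gained per supervision hour per week
    drift_rate = 0.05  # fraction of skill lost to drift each week
    history = []
    for _ in range(steps):
        inflow = gain_rate * supervision_hours * (1.0 - skill)  # diminishing returns
        outflow = drift_rate * skill                            # balancing loop
        skill += dt * (inflow - outflow)
        history.append(skill)
    return history

with_supervision = simulate(supervision_hours=2.0)
without = simulate(supervision_hours=0.0)
```

Even this toy model exhibits the kind of insight referent groups look for: without the supervision inflow, the skill stock decays toward zero, whereas with it the system settles at a stable equilibrium well above the starting level.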
How it has been used—Group model building has been used to collaboratively solve
problems among stakeholders with the aid of system dynamics experts and implement
solutions across for-profit (e.g., oil, electronics, software, insurance, and finance industries),
non-profit (e.g., universities, defense research), and governmental (from the city to national
level) sectors, as well as between organizations.45 The Office of Behavioral and Social
Sciences Research and other leaders in the field have prioritized the development and
utilization of systems science methodologies to capture the complexity of threats to public
health and to address the science to practice gap.59,60
Example from behavioral health—Huz and colleagues61 used a group model building
approach to address gaps in the continuity of vocational rehabilitation services for
individuals with severe mental illness. The process consisted of four meetings that occurred
over a six-month period. After an initial introductory meeting, the system dynamics
simulation model was developed during a two-day conference involving 12 to 18 managers
of mental health and rehabilitation services from the involved county, state, and non-profit
agencies, as well as one or more client advocates. This model was used to guide discussion
and develop an action plan for moving toward a more integrated vocational rehabilitation
system. The third and fourth sessions involved the continued exploration of insights from the
simulation model, but moved toward the group's strategies for change and their progress
toward achieving their action plan. Huz and colleagues61 found that the group model
building process was well received by both the modeling team and the target group. More
importantly, the process resulted in significant shifts in participants' approach to integrating
care and reduced the group's reliance on “silver bullet” solutions that are not effective in
improving care. Participants also became more aligned in their perceptions of system goals
and the strategies that they would use for integrating care.61
Conjoint analysis
Overview—Conjoint analysis is a method used to measure stakeholders' preferences for
product (or service, program, implementation strategy, etc.) attributes, to learn how changes
in those attributes influence demand, and to forecast the likely acceptance of products that
are brought to market.62,63 It is particularly useful in addressing situations in which there are
inherent tradeoffs,62 for instance, when implementation strategy X has more support than
strategy Y, but strategy Y is less costly than strategy X. Though there are many different
types of conjoint analysis,62–65 all of them ask respondents to consider multiple conjoined
product features and to rank or select the products that they would be likely to purchase or
use.63 In “full-profile” techniques, different products with varying attributes are shown one
at a time.62,63 For example, respondents might be presented with a single “card” that has five
discrete strategies (e.g., educational materials, training, supervision, fidelity monitoring, and
reminders) comprising a single multifaceted strategy, and be asked to rate the strategy on a
scale from 0 to 100, where 100 would mean “definitely would use to implement” a given
EBP in a given context. In choice-based conjoint or discrete choice experiments,62,66
products are displayed in pairs or sets and respondents are asked to choose the one they are
most likely to use or purchase (see Figure 3). Respondents generally complete 12-30 of
these questions, which are designed using experimental design principles of independence
and balance of the attributes. Independently varying the attributes shown to respondents and
recording their responses to the product profiles affords the opportunity to statistically
deduce the product attributes that are most desired and have the most influence on choice.63
Orme67 provides a useful introduction to conjoint analysis for beginners, and Sawtooth
Software provides a variety of software packages and resources that users will find helpful
and accessible.68
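The logic of statistically deducing attribute influence from profile ratings can be illustrated with a toy full-profile analysis. Everything here is hypothetical: the attributes, the simulated ratings, and the use of ordinary least squares stand in for the more elaborate designs and estimators used in practice.

```python
import numpy as np

# Hypothetical full-profile conjoint task: each row is a strategy "card"
# coded by three binary attributes; ratings are 0-100 "would use" scores.
# Columns: intercept, includes_training, includes_fidelity_monitoring, web_based
X = np.array([
    [1, 0, 0, 0],
    [1, 1, 0, 0],
    [1, 0, 1, 0],
    [1, 0, 0, 1],
    [1, 1, 1, 0],
    [1, 1, 0, 1],
    [1, 0, 1, 1],
    [1, 1, 1, 1],
], dtype=float)

# Simulate respondents whose true part-worths favor training most.
true_partworths = np.array([30.0, 25.0, 15.0, 5.0])
rng = np.random.default_rng(0)
ratings = X @ true_partworths + rng.normal(0, 2.0, size=len(X))

# Ordinary least squares recovers each attribute's part-worth utility:
# the rating points it adds to a profile, holding other attributes fixed.
estimates, *_ = np.linalg.lstsq(X, ratings, rcond=None)
```

Because the attributes vary independently across the full-factorial profiles, the recovered part-worths rank the attributes by their influence on choice, which is exactly the quantity a planner would use when weighing tradeoffs between strategies.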
How it has been used—Conjoint analysis has primarily been used in marketing research
to determine the products and features that consumers value most;62 however, as Bridges et
al.66 note, it has also been applied to a range of health topics (e.g., cancer, HIV prevention,
diabetes). While it is just beginning to be applied in implementation research,8,46,69 conjoint
analysis can enhance the rigor of implementation strategy selection and tailoring processes.
The examples provided below demonstrate the utility of two types of conjoint analysis:
menu-based choice70 and discrete choice experiments.62,66
Examples from behavioral health—Menu-based choice approaches have been used to study products
and services such as automobiles, employee benefit packages, restaurant menus, and
banking options.70 Waltz and colleagues8 are using a menu-based choice approach to
develop expert recommendations for the strategies that are most appropriate for
implementing clinical practice changes within the Department of Veterans Affairs (VA).
Expert panelists were given a set of 73 discrete strategies and were tasked with building
multifaceted implementation strategies for three different clinical practice changes
(measurement-based care for depression, prolonged exposure for posttraumatic stress
disorder, and metabolic monitoring for individuals taking antipsychotics). For each practice
change, panelists were provided with three different narratives that described relatively
weak, moderate, and strong contexts. Each panelist indicated how essential (e.g., absolutely
essential, likely essential, likely inessential, absolutely inessential) each discrete strategy
would be to successfully implementing the clinical practice at three different phases of
implementation (pre-implementation, implementation, and sustainment). While the results of
this study are forthcoming, the analysis of these data will involve calculating Relative
Essentialness Estimates70,71 for each discrete strategy (where 1 represents the highest degree
of endorsement and 0 represents the lowest) and generating count-based analyses that will
be used to characterize the most commonly selected combinations of essential strategies for
each scenario.8 These results will then be provided to the expert panelists to develop final
consensus statements about the strategies most appropriate for implementing each practice
change in each of the three implementation phases and contexts.
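To make the analysis concrete, here is a toy sketch. The scoring weights and the ratings are assumptions made for illustration (the published Relative Essentialness Estimate calculation may differ in its details), but they show how four-point essentialness ratings map onto 0-1 estimates and how combinations of essential strategies can be counted.

```python
from collections import Counter

# Assumed linear weights: 1 = highest endorsement, 0 = lowest.
WEIGHTS = {
    "absolutely essential": 1.0,
    "likely essential": 2 / 3,
    "likely inessential": 1 / 3,
    "absolutely inessential": 0.0,
}

# Hypothetical ratings of three discrete strategies by three panelists.
ratings = {
    "audit and feedback": ["absolutely essential", "likely essential", "absolutely essential"],
    "ongoing consultation": ["likely essential", "likely essential", "likely inessential"],
    "reminders": ["absolutely inessential", "likely inessential", "likely inessential"],
}

# Relative Essentialness Estimate: mean weight across panelists.
ree = {s: sum(WEIGHTS[r] for r in rs) / len(rs) for s, rs in ratings.items()}

# Count-based analysis: which combinations of strategies did panelists
# rate as (likely or absolutely) essential?
essential = {"absolutely essential", "likely essential"}
combos = Counter(
    frozenset(s for s in ratings if ratings[s][p] in essential)
    for p in range(3)
)
```

The estimates order the strategies by endorsement, and the combination counts identify the multifaceted strategy (here, audit and feedback plus ongoing consultation) most often judged essential.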
Menu-based choice methods may be most appropriate for selecting strategies, but discrete
choice experiments66 may be more helpful in tailoring strategies to specific populations and
contexts. Each strategy has a number of modifiable elements; for example, the ICEBeRG
Group72 pointed out that what might be seen as a relatively straightforward implementation
strategy, audit and feedback, actually has at least five modifiable elements (content,
intensity, method of delivery, duration, and context) that may influence its effectiveness. It
may be helpful if modifications to these elements are informed by clinicians' preferences.
Cunningham and colleagues69 used a discrete choice experiment to determine the relative
influence of 16 different attributes of practice changes and implementation strategies on the
preferences of 1,010 educators (see Figure 3 for a sample survey question). The attributes
included contextual and social attributes (e.g., presenter's qualities, colleague support,
compatibility with practice), content attributes (e.g., supporting evidence, focus on
knowledge vs. skills, universal vs. targeted), and practice change process attributes
(coaching to improve skills, workshop size, training time demands, internet options). Their
findings suggested ways in which stakeholders' preferences converged, namely their desire
for small-group workshops conducted by engaging experts who would teach skills
applicable to all students. Through the use of latent class analysis, the investigators found
two different segments of educators (the change ready segment [77%] and the demand
sensitive segment [23%]) who expressed different preferences for the design of
implementation strategies. The change ready segment was willing to adopt new practice
changes and preferred that schools have autonomy in making practice change decisions, that
coaching be provided to all participants, and that participants receive post-training follow-up
sessions. The demand sensitive segment was less intent on practice change, thought that
Intervention mapping
Overview—Intervention mapping draws upon mixed-methods research, theory, community
stakeholder input, and a systematic process for intervention development.73 There are five
steps in intervention mapping.47,73,74 First, a needs assessment is conducted to identify
determinants (barriers and facilitators) and guide the selection of intervention components.
Second, proximal program objectives are specified to produce matrices with multiple cells
that contain behavioral or environmental performance objectives, and the determinants of
these objectives as identified in the needs assessment. Third, a list of intervention methods
that are matched to the proximal program objectives identified in step two is generated. To
achieve this, one must brainstorm and delineate methods, and then translate those methods
The content produced throughout the stages of intervention mapping is both the
implementation strategy and the tools needed to evaluate the effectiveness of the
implementation strategy. The evaluation incorporates mixed methods measurement of both
the processes and outcomes of the implementation strategy. There is an additional step of
intervention mapping dedicated to adoption and implementation, but this step is not directly
applicable here given that this is already a central consideration of implementation research.
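The matrix logic of steps two and three can be sketched as a simple data structure. The objectives, determinants, and methods below are hypothetical placeholders, not drawn from the article; they only illustrate how performance objectives are crossed with determinants and then matched to methods.

```python
# Step 2: matrix cells keyed by (performance objective, determinant),
# as identified in the needs assessment. All entries are illustrative.
matrix = {
    ("clinician delivers EBP with fidelity", "self-efficacy"): [],
    ("clinician delivers EBP with fidelity", "skills"): [],
    ("supervisor reviews session tapes", "time constraints"): [],
}

# Candidate methods thought to influence each determinant.
methods = {
    "self-efficacy": ["modeling", "guided practice"],
    "skills": ["behavioral rehearsal", "feedback"],
    "time constraints": ["restructuring the environment"],
}

# Step 3: match methods to each objective-determinant cell.
for (objective, determinant) in matrix:
    matrix[(objective, determinant)] = methods.get(determinant, [])
```

Filling every cell in this way makes the linkage between barriers (determinants) and implementation strategies explicit and auditable, which is the property that motivates intervention mapping in the first place.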
Bartholomew et al.47 provide a useful text on intervention mapping, and further resources are available.
How it has been used—Intervention mapping has been used to develop a number of
health-related programs, including those addressing sex education,75 obesity prevention,76
breast and cervical cancer prevention,77 and cardiovascular health.78 While intervention
mapping has been proposed as a method for tailoring implementation strategies to identified
barriers,22,39 it has not been widely used, with a few notable exceptions.79,80
Discussion
This article provides an overview of four methods that can be used to select and tailor
implementation strategies to local contextual needs, an underdeveloped but critical area in
implementation science.4 The discussion focuses on these methods' common strengths and
weaknesses, recommendations for comparative effectiveness research on the selection and
tailoring of strategies, and suggestions for reporting this type of research. This represents a
first step in a research agenda aiming to improve behavioral health services by more
systematically and rigorously selecting and tailoring strategies for implementing effective
practices.
The methods share a number of strengths. First, all are inherently participatory, engaging
diverse stakeholders and importantly, providing concrete steps or processes for facilitating
communication. All four methods, particularly concept mapping and group model building,
have the potential to galvanize stakeholders around common goals and create consensus
regarding the implementation approach. This emphasis on stakeholder participation is
critical.82 Second, these methods all comprise systematic, transparent, replicable processes,
which is a step forward for selecting and tailoring implementation strategies, an area
previously described as a black box.40 Third, all can represent and respond to the complexity
of implementation efforts,83 in that they provide the structure for considering factors at
different levels and facilitate the selection and tailoring of strategies to address them.
The methods share a few primary weaknesses. If not informed by theory and evidence, they
may be helpful in eliciting preferences, but not necessarily provide the supports needed to
implement EBPs effectively. For example, in the discrete choice experiment cited above,
there was a segment of teachers that believed practice changes should be at the discretion of
individual teachers, that coaching should be discretionary, and that no post-training follow-
up support was necessary. This conflicts with evidence that coaching and post-training
follow-up are important.84,85 Thus, it would likely be less effective to give teachers the
choice to opt out of coaching or follow-up training. Clearly there is a balance to be struck
between a paternalistic overemphasis on the research literature and a disregard of evidence.
One way of accomplishing this is to “seed” the methods with theoretically or empirically
justified strategies and attributes, and allow stakeholders to supplement them based upon
their expertise and experience. This would bolster the likelihood that selection and tailoring
methods are guided by the best available theory and evidence, while preserving the benefits
of stakeholder engagement and preference. Another weakness is that all of the methods
require specialized training or consultation. Finally, with some notable exceptions,45
there is a dearth of empirical evidence supporting their effectiveness and cost-effectiveness in
obtaining improved implementation and clinical outcomes. The latter weakness is not
surprising given that implementation science is a relatively new field that is gradually moving
toward more precise applications of implementation strategies.4,86
Each method holds promise for increasing the rigor of implementation research; however,
which method should be used is ultimately an empirical question. It is also possible that
some of these methods could be usefully combined. For example, conjoint analysis might be
particularly useful for tailoring implementation strategies after strategy selection is
J Behav Health Serv Res. Author manuscript; available in PMC 2018 April 01.
Powell et al. Page 11
completed via one of the other methods (concept mapping, group model building, or
intervention mapping). Three types of comparative effectiveness research87 are warranted.
First, there is a need for comparisons between implementation strategies that have been
selected and tailored to address contextual factors and more generic multifaceted
implementation strategies. Although preliminary studies suggest the effectiveness of
systematic selection of implementation strategies,88 the evidence is not robust.4,33 Second,
there is a need to comparatively test different methods for selecting and tailoring strategies.
It will be important to determine which methods are most acceptable and feasible for
stakeholders, whether or not they result in similar constellations of implementation
strategies, and whether some are more efficient and effective in identifying key contextual
factors and matching discrete strategies to address them. Finally, it will be essential to assess
the cost-effectiveness of these approaches.89,90 Several ongoing studies will contribute to the
evidence-base for selecting and tailoring implementation strategies to context-specific
factors, including the Tailored Implementation for Chronic Disease research program34 and
a study recently funded by the National Institute of Mental Health that will test tailored
versus standard approaches to implementing measurement-based care91 for depression
(R01MH103310; Lewis, PI).
Publications reporting efforts to select and tailor strategies to address contextual factors
should be clear about the methods used for selection and tailoring, the specific discrete
strategies selected, and the links between strategies and barriers and other contextual factors.
First, researchers should carefully report how they selected and tailored the strategies, and
provide descriptions of any systematic methodology (e.g., concept mapping, group model
building, etc.) that they may have used. Second, to move the field forward, there is a need for
researchers to clearly describe the implementation strategies used in their studies, as the
description of implementation strategies in published research has been noted to be very
poor.3,92 Proctor and colleagues3 have suggested reporting guidelines for implementation
strategies that require researchers to carefully name, conceptually define, and operationalize
strategies, and specify them according to features such as: (a) the actor; (b) the action; (c)
action target; (d) temporality; (e) dose; (f) implementation outcome affected;89 and (g) the
empirical, theoretical, or pragmatic justification for use of the strategy.3 Authors may benefit
from consulting those guidelines, or others that have recently been set forth93,94 when
reporting the results of implementation studies. Finally, in order to make the links between
strategies and the contextual factors that they are intended to address explicit, it may be
helpful if researchers use logic models, which could clarify how and why their carefully
selected and tailored implementation strategy is proposed to work.95,96 These suggestions
should lead to more consistent and transparent reporting, and improve the ability to
understand how strategies exert their effects.
The move toward systematically selecting and tailoring implementation strategies is an
exciting step forward for implementation science that holds potential for improving the
quality of behavioral health care. The methods presented in this article have the potential to
increase the rigor of the selection and tailoring process. While they may not make
implementation easier at the outset, these methods provide a step-by-step process for
selecting and tailoring implementation strategies and for engaging stakeholders that may be
very attractive to researchers and other implementation leaders. These initial investments are
also likely to pay off in the long run as implementation failures due to overlooked and
unaddressed barriers to improvement are avoided.
These methods are likely not the only ones that could be used to guide selection and
tailoring of strategies. There is a need for ongoing dialogue that might lead to the
identification of additional avenues for systematically selecting and tailoring strategies. The
move toward the more thoughtful and systematic application of implementation strategies
will increase the legitimacy of the field by enhancing the scientific rigor of proposed studies,
strengthening the evidence-base for implementation strategies, and providing support for
causal mechanisms and theory. Ultimately, there is reason to hope that the use and
evaluation of systematic methods for selecting and tailoring implementation strategies to
contextual needs will propel the field toward a greater understanding of how, when, where,
and why implementation strategies are effective in integrating evidence-based care and
improving clinical outcomes in behavioral health service settings.4
Acknowledgments
This work was supported in part by the National Institutes of Health (F31MH098478 to BJP; K23MH099179 to
RSB; R01MH103310 and R01MH106510 to CCL; R01MH092950 and R01MH072961 to GAA; R25MH080916,
P30DK092950, U54CA155496, and UL1RR024992 to EKP; and R01MH106175 to DSM) and the Doris Duke
Charitable Foundation (through a Fellowship for the Advancement of Child Well-Being to BJP). Additionally, the
preparation of this article was supported in part by the Implementation Research Institute (IRI; NIMH
R25MH080916). Drs. Aarons & Proctor are IRI faculty; Dr. Beidas was an IRI fellow from 2012-2014.
References
1. Eccles MP, Mittman BS. Welcome to Implementation Science. Implementation Science. 2006; 1(1):
1–3. DOI: 10.1186/1748-5908-1-1
2. National Institutes of Health. [Accessed January 30, 2013] Dissemination and implementation
research in health (R01). 2013. Available online at: http://grants.nih.gov/grants/guide/pa-files/
PAR-13-055.html
3. Proctor EK, Powell BJ, McMillen JC. Implementation strategies: Recommendations for specifying
and reporting. Implementation Science. 2013; 8(139):1–11. DOI: 10.1186/1748-5908-8-139
[PubMed: 23279972]
4. Mittman, BS. Implementation science in health care. In: Brownson, RC., Colditz, GA., Proctor, EK.,
7. Michie S, Richardson M, Johnston M, et al. The behavior change technique taxonomy (v1) of 93
hierarchically clustered techniques: Building an international consensus for the reporting of
behavior change interventions. Annals of Behavioral Medicine. 2013; 46(1):81–95. DOI: 10.1007/
s12160-013-9486-6 [PubMed: 23512568]
8. Waltz TJ, Powell BJ, Chinman MJ, et al. Expert recommendations for implementing change (ERIC):
Protocol for a mixed methods study. Implementation Science. 2014; 9(39):1–12. DOI:
10.1186/1748-5908-9-39 [PubMed: 24398253]
9. Cochrane Collaboration. [Accessed April 15, 2013] Cochrane Effective Practice and Organisation of
Care Group. Available online at: http://epoc.cochrane.org
10. Landsverk J, Brown CH, Rolls Reutz J, et al. Design elements in implementation research: A
structured review of child welfare and child mental health studies. Administration and Policy in
Mental Health and Mental Health Services Research. 2011; 38(1):54–63. DOI: 10.1007/
s10488-010-0315-y [PubMed: 20953974]
11. Novins DK, Green AE, Legha RK, et al. Dissemination and implementation of evidence-based
practices for child and adolescent mental health: A systematic review. Journal of the American
Academy of Child and Adolescent Psychiatry. 2013; 52(10):1009–1025. DOI: 10.1016/j.jaac.
25. Isett KR, Burnam MA, Coleman-Beattie B, et al. The state policy context of implementation issues
for evidence-based practices in mental health. Psychiatric Services. 2007; 58(7):914–921. DOI:
10.1176/appi.ps.58.7.914 [PubMed: 17602006]
26. Cabassa LJ, Baumann AA. A two-way street: Bridging implementation science and cultural
adaptations of mental health treatments. Implementation Science. 2013; 8(90):1–14. DOI:
10.1186/1748-5908-8-90 [PubMed: 23279972]
27. Damschroder LJ, Aron DC, Keith RE, et al. Fostering implementation of health services research
findings into practice: A consolidated framework for advancing implementation science.
Implementation Science. 2009; 4(50):1–15. [PubMed: 19123945]
28. Aarons GA, Cafri G, Lugo L, et al. Expanding the domains of attitudes towards evidence-based
practice: The Evidence Based Attitudes Scale-50. Administration and Policy in Mental Health and
Mental Health Services Research. 2012; 39(5):331–340. DOI: 10.1007/s10488-010-0302-3
[PubMed: 20607597]
29. Glisson C, Landsverk J, Schoenwald S, et al. Assessing the organizational social context (OSC) of
mental health services: implications for research and practice. Administration and Policy in Mental
Health and Mental Health Services Research. 2008; 35(1-2):98–113. DOI: 10.1007/
s10488-007-0148-5 [PubMed: 18085434]
30. Jacobs SR, Weiner BJ, Bunger AC. Context matters: Measuring implementation climate among
individuals and groups. Implementation Science. 2014; 9(46):1–14. DOI: 10.1186/1748-5908-9-46
[PubMed: 24398253]
31. Ganju V. Implementation of evidence-based practices in state mental health systems: Implications
for research and effectiveness studies. Schizophrenia Bulletin. 2003; 29(1):125–131. [PubMed:
12908667]
32. Raghavan R, Bright CL, Shadoin AL. Toward a policy ecology of implementation of evidence-
based practices in public mental health settings. Implementation Science. 2008; 3(26):1–9. DOI:
10.1186/1748-5908-3-26 [PubMed: 18179688]
33. Baker R, Cammosso-Stefinovic J, Gillies C, et al. Tailored interventions to overcome identified
barriers to change: Effects on professional practice and health care outcomes. Cochrane Database
of Systematic Reviews. 2010; (3) Art. No.: CD005470:1–77.
34. Wensing M, Oxman A, Baker R, et al. Tailored implementation for chronic diseases (TICD): A
project protocol. Implementation Science. 2011; 6(103):1–8. DOI: 10.1186/1748-5908-6-103
[PubMed: 21208425]
35. Wensing, M., Bosch, M., Grol, R. Selecting, tailoring, and implementing knowledge translation
interventions. In: Straus, S., Tetroe, J., Graham, ID., editors. Knowledge Translation in Health Care:
Moving from Evidence to Practice. Oxford, UK: Wiley-Blackwell; 2009. p. 94-113.
36. Grol R, Bosch MC, Hulscher MEJ, et al. Planning and studying improvement in patient care: The
use of theoretical perspectives. Milbank Quarterly. 2007; 85(1):93–138. DOI:
10.1111/j.1468-0009.2007.00478.x [PubMed: 17319808]
37. Flottorp SA, Oxman AD, Krause J, et al. A checklist for identifying determinants of practice: A
systematic review and synthesis of frameworks and taxonomies of factors that prevent or enable
improvements in healthcare professional practice. Implementation Science. 2013; 8(35):1–11.
DOI: 10.1186/1748-5908-8-35 [PubMed: 23279972]
38. Wensing, M., Grol, R. Methods to identify implementation problems. In: Grol, R., Wensing, M.,
Eccles, M., editors. Improving Patient Care: The Implementation of Change in Clinical Practice.
Edinburgh, Scotland: Elsevier; 2005. p. 109-120.
39. Grol, R., Bosch, M., Wensing, M. Development and selection of strategies for improving patient
care. In: Grol, R., Wensing, M., Eccles, M., Davis, D., editors. Improving Patient Care: The
Implementation of Change in Health Care. 2nd ed. Chichester: John Wiley & Sons, Inc.; 2013. p.
165-184.
40. Bosch M, van der Weijden T, Wensing M, et al. Tailoring quality improvement interventions to
identified barriers: A multiple case analysis. Journal of Evaluation in Clinical Practice. 2007;
13:161–168. DOI: 10.1111/j.1365-2753.2006.00660.x [PubMed: 17378860]
41. Green AE, Fettes DL, Aarons GA. A concept mapping approach to guide and understand
dissemination and implementation. Journal of Behavioral Health Services & Research. 2012;
63. Sawtooth Software. What is conjoint analysis? Sawtooth Software; 2014. Available online at:
http://www.sawtoothsoftware.com/products/conjoint-choice-analysis/conjoint-analysissoftware
81. Zwerver F, Schellart AJM, Knol DL, et al. An implementation strategy to improve the guideline
adherence of insurance physicians: An experiment in a controlled setting. Implementation Science.
Policy in Mental Health and Mental Health Services Research. 2009; 36(1):24–34. DOI: 10.1007/
s10488-008-0197-4 [PubMed: 19104929]
87. Institute of Medicine. Initial National Priorities for Comparative Effectiveness Research.
Washington, DC: The National Academies Press; 2009.
88. Waxmonsky J, Kilbourne AM, Goodrich DE, et al. Enhanced fidelity to treatment for bipolar
disorder: Results from a randomized controlled implementation trial. Psychiatric Services. 2014;
65(1):81–90. DOI: 10.1176/appi.ps.201300039
89. Proctor EK, Silmere H, Raghavan R, et al. Outcomes for implementation research: Conceptual
distinctions, measurement challenges, and research agenda. Administration and Policy in Mental
Health and Mental Health Services Research. 2011; 38(2):65–76. DOI: 10.1007/
s10488-010-0319-7 [PubMed: 20957426]
90. Raghavan, R. The role of economic evaluation in dissemination and implementation research. In:
Brownson, RC., Colditz, GA., Proctor, EK., editors. Dissemination and Implementation Research in
Health: Translating Science to Practice. New York: Oxford University Press; 2012. p. 94-113.
91. Scott K, Lewis CC. Using measurement-based care to enhance any treatment. Cognitive and
96. W. K. Kellogg Foundation. Logic Model Development Guide: Using Logic Models to Bring
Together Planning, Evaluation, and Action. Battle Creek, Michigan: W. K. Kellogg Foundation;
2004.
97. Baker-Ericzen MJ, Jenkins MM, Haine-Schlagel R. Therapist, parent, and youth perspectives of
treatment barriers to family-focused community outpatient mental health services. Journal of Child
& Family Studies. 2013; 22(6):854–868. DOI: 10.1007/s10826-012-9644-7 [PubMed: 24019737]
98. Bartholomew NG, Joe GW, Rowan-Szai GA, et al. Counselor assessments of training and adoption
barriers. Journal of Substance Abuse Treatment. 2007; 33(2):193–199. DOI: 10.1016/j.jsat.
2007.01.005 [PubMed: 17434707]
99. Brunette MF, Asher D, Whitley R, et al. Implementation of integrated dual disorders treatment: A
qualitative analysis of facilitators and barriers. Psychiatric Services. 2008; 59(9):989–995.
[PubMed: 18757591]
100. Cook JM, Biyanova T, Coyne JC. Barriers to adoption of new treatments: An internet study of
practicing community psychotherapists. Administration and Policy in Mental Health and Mental
Health Services Research. 2009; 36(2):83–90. DOI: 10.1007/s10488-008-0198-3 [PubMed:
19104928]
101. Forman SG, Olin SS, Hoagwood KE, et al. Evidence-based interventions in schools: Developers'
views of implementation barriers and facilitators. School Mental Health. 2009; 1(1):26–36. DOI:
10.1007/s12310-008-9002-5
102. Langley AK, Nadeem E, Kataoka SH, et al. Evidence-based mental health programs in schools:
Barriers and facilitators of successful implementation. School Mental Health. 2010; 2(3):105–
113. DOI: 10.1007/s12310-010-9038-1 [PubMed: 20694034]
103. Pagoto SL, Spring B, Coups EJ, et al. Barriers and facilitators of evidence-based practice
perceived by behavioral science health professionals. Journal of Clinical Psychology. 2007;
63(7):695–705. DOI: 10.1002/jclp.20376 [PubMed: 17551940]
104. Powell BJ, Hausmann-Stabile C, McMillen JC. Mental health clinicians' experiences of
implementing evidence-based treatments. Journal of Evidence-Based Social Work. 2013; 10(5):
396–409. DOI: 10.1080/15433714.2012.664062 [PubMed: 24066630]
105. Powell BJ, McMillen JC, Hawley KM, et al. Mental health clinicians' motivation to invest in
training: Results from a practice-based research network survey. Psychiatric Services. 2013;
64(8):816–818. DOI: 10.1176/appi.ps.003602012 [PubMed: 23903609]
106. Raghavan R. Administrative barriers to the adoption of high-quality mental health services for
children in foster care: A national study. Administration and Policy in Mental Health and Mental
Health Services Research. 2007; 34(3):191–201. DOI: 10.1007/s10488-006-0095-6 [PubMed:
17211714]
107. Rapp CA, Etzel-Wise D, Marty D, et al. Barriers to evidence-based practice implementation:
Results of a qualitative study. Community Mental Health Journal. 2010; 46(2):112–118. DOI:
10.1007/s10597-009-9238-z [PubMed: 19685185]
108. Shapiro CJ, Prinz RJ, Sanders MR. Facilitators and barriers to implementation of an evidence-
based parenting intervention to prevent child maltreatment: The Triple P-Positive Parenting
DOI: 10.1186/1748-5908-1-13
Table 1
Overview of methods for selecting and tailoring implementation strategies
in the collaborative development of causal loop diagrams that model complex problems to identify opportunities and strategies for improvement

Advantages:
-Inherently integrated approach that involves the identification of barriers and facilitators and strategies to overcome them
-Addresses problems that are "dynamically complex," such as implementing EBTs
-Highly participatory and has potential to galvanize stakeholder groups
-Ability to mathematically model consequences of proposed solutions prior to implementing them may reduce use of strategies that would be ineffective or too costly

Disadvantages:
-May require further training or methodological consultation
-Complexity of some models may be overwhelming to stakeholders