Implementation in action
A guide to implementing evidence-informed
programs and practices
The Australian Institute of Family Studies is committed to the creation and dissemination of research-based information
on family functioning and wellbeing. Views expressed in its publications are those of individual authors and may not
reflect those of the Australian Institute of Family Studies or the Australian Government.
Australian Institute of Family Studies
Level 4, 40 City Road, Southbank VIC 3006 Australia
Phone: (03) 9214 7888 Internet: aifs.gov.au
Cover image: © gettyimages/mixetto
About the authors
Assoc. Prof. Robyn Mildon is founding Executive Director of CEI. She is an internationally recognised leader with a long-standing career focused on implementation of evidence to achieve social impact for children, families and communities.
Dr Melinda Polimeni is a Director at CEI, and a registered psychologist and researcher specialising in program
design and implementation.
Dr Jessica Hateley-Browne is a Senior Advisor at CEI and a researcher specialising in implementation and
evaluation of evidence-informed approaches in human service delivery settings.
Dr Lauren Hodge is an Advisor at CEI and a researcher with expertise in delivering, implementing and evaluating
programs for children and families in community settings.
Acknowledgements
This document has been produced as part of the Families and Children Expert Panel Project funded by the
Australian Government through the Department of Social Services.
Contents
About the authors
Acknowledgements
List of abbreviations
Glossary
1. Introduction
1.1 What is the purpose of this guide?
1.2 Who is this guide for?
1.3 How should I use this guide?
2. What is implementation science?
2.1 Why is good implementation important?
2.2 What are the key concepts of implementation science?
3. Overview of implementation stages
Stage 1: Engage and explore
Stage 2: Plan and prepare
Stage 3: Initiate and refine
Stage 4: Sustain and scale
4. Stage 1: Engage and explore
4.1 Define what needs to change and for whom
4.2 Select and adopt an evidence-informed program or practice
4.3 Set up an implementation team
4.4 Consider likely enablers and barriers, and assess readiness
5. Stage 2: Plan and prepare
5.1 Choose implementation strategies
5.2 Develop an implementation plan
5.3 Decide how to monitor implementation quality
5.4 Build readiness to use the program or practice
6. Stage 3: Initiate and refine
6.1 Initiate the program or practice
6.2 Continuously monitor the implementation process
6.3 Make improvements based on monitoring data
6.4 Adapt the program or practice
7. Stage 4: Sustain and scale
7.1 Sustain the program or practice
7.2 Scale up the program or practice
8. A note of encouragement
References
Appendix A: Implementation stages – Deciding where to start tool
Appendix B: Implementation progress checklist
Appendix C: Implementation considerations checklist
Appendix D: Readiness Thinking Tool®
Appendix E: Implementation plan template
List of abbreviations
AIFS: Australian Institute of Family Studies
CEI: Centre for Evidence and Implementation
Glossary
Acceptability: The perception that a program or practice is agreeable, palatable or satisfactory.
Evidence-informed programs and practices: Programs and practices that integrate the best available research evidence with practice expertise, and the values and preferences of clients.
Feasibility: How successfully a program can be implemented or used within your setting.
Implementation: A process that uses active strategies to put evidence-informed approaches into practice. It is the process of understanding and overcoming barriers to adopt, plan, initiate and sustain evidence-informed programs and practices.
Implementation barriers: Factors that make the implementation process more challenging.
Implementation enablers: Factors that increase the likelihood a program or practice will be successfully implemented.
Implementation plan: A document that specifies what implementation strategies are being used, how they will be actioned, when, and by whom.
Implementation outcome: The effect of using implementation strategies to implement new programs, practices and services. It is different to a client outcome.
Implementation science: The study of how to improve the uptake of research findings and other evidence-informed practices into 'business as usual'. It aims to improve the quality and effectiveness of human services.
Implementation strategy: A technique that enhances the adoption, planning, initiation and sustainability of a program or practice. It is the 'how to' component of implementing a new program or practice.
Program outcome: A specific change expected in the target population as a result of taking part in a program.
Randomised controlled trial: A research design that allocates participants to groups at random. This design aims to reduce selection bias and produce high-quality evidence.
Reach: How well a program or practice has been integrated into an agency or service provider, including the target population.
Sustainability: The extent to which a program or practice has become incorporated into the mainstream way of working, rather than being added on.
1. Introduction
We’ve written this guide to help you implement evidence-informed programs and practices in the child and
family service sector. We encourage you to use it in conjunction with the recommended tools and resources
highlighted throughout the guide.
We developed this guide using best-practice recommendations from implementation science. It uses a staged
implementation process to guide your implementation activities (Metz & Bartley, 2012; Metz et al., 2015). The
guide outlines all stages and steps briefly, and provides links to useful online resources.
The guide has different sections explaining the different stages of implementation. Depending on where you’re
at in your implementation process, it may not make sense for you to follow the process exactly as outlined in
this guide. You can ‘dip in’ to the different sections of the guide based on what information you need. However,
if your agency or service provider is new to implementation science – or has limited experience implementing
evidence-informed programs or practices – you’ll probably benefit from reading the whole guide.
2. What is implementation science?
2.1 Why is good implementation important?
Many agencies and service providers now try to select and deliver programs and practices using the best
available research evidence (i.e. those proven to be effective, based on well-designed evaluation studies) and
best practice. Having an effective program or practice is necessary for good client outcomes. However, it is not
sufficient. Using programs and practices with a strong evidence base is important, but two common pitfalls
contribute to their potential not being realised:
• only focusing on 'what' program or practice to use, and ignoring 'how' the program or practice will be implemented
• failing to consider influencing factors (such as enablers and barriers) that impact your ability to initiate and sustain the program or practice.
You need to consider all these factors – the ‘what’, the ‘how’ and the influencing factors – to achieve the best
outcomes for children and families, as illustrated in Figure 1.
Figure 1: Barriers/enablers – factors that hinder or help implementation
Implementation frameworks
Implementation frameworks explain the different stages and activities you’ll use when implementing a program
or practice. They provide a map and shared language for the implementation process. Many are applicable
across a wide range of settings – though some are better for particular types of interventions, or focus on
different aspects of implementation. During the past 20 years, researchers have developed many implementation
frameworks (Albers, Mildon, Lyon, & Shlonsky, 2017; Moullin, Sabater-Hernández, Fernandez-Llimos, & Benrimoj,
2015; Tabak, Khoong, Chambers, & Brownson, 2012).
Our guide outlines an implementation framework that integrates relevant concepts that are common across
existing frameworks frequently used in the child and family service sector (see Chapter 3 for more information).
Implementation stages
Implementation happens in stages. It is a process that unfolds – it’s not a single event. Your implementation
model should guide you through the different steps in your implementation process. It should help you decide
when to focus on each implementation activity – though the exact order of activities should not be fixed. Ideally,
you should tailor your implementation process to your needs and context. And while implementation does happen in stages, one stage doesn't always end neatly before the next begins. The process isn't always linear. For example, timeline and funding pressures may force you to move quickly, causing stages to overlap (with activities from two different stages happening at the same time). You may also experience setbacks that mean
you need to revisit a previous stage before you can progress further. For example, staff turnover may mean you
no longer have enough practitioners trained in the program or practice, resulting in a ‘pause’ on service delivery
while you recruit and/or train additional staff.
This guide outlines four stages (see Chapter 3 for more information):
1. Engage and explore
2. Plan and prepare
3. Initiate and refine
4. Sustain and scale.
Implementation enablers and barriers
Implementation barriers make the implementation process more challenging. Barriers can include a lack of resources to deliver a program or practice, and low confidence in the program or practice among the people delivering it. However, you shouldn't give up on your implementation simply because you face barriers. It's normal to experience barriers. Your implementation is far more likely to succeed if you identify and overcome barriers early in the process. You should continually monitor the enablers and barriers, as different influencing factors will emerge
the process. You should continually monitor the enablers and barriers, as different influencing factors will emerge
during different stages of implementation. The process of assessing and identifying enablers and barriers is
described in Chapter 4.
Implementation strategies
Implementation strategies are techniques that improve the adoption, planning, initiation and sustainability of a
program or practice (Powell et al., 2019). They are the ‘how to’ components of the implementation process and
are used to overcome barriers.
So how do you decide which implementation strategies to use? Sometimes there is existing evidence showing
which implementation strategies are likely to be useful for implementing your particular program or practice.
Or, if you're implementing a manualised program (i.e. a program that has been developed with a structured, detailed manual that you usually need to buy a licence for), the program developers may require you to use
particular implementation strategies. If you don’t have any suggested implementation strategies, you can tailor
your strategies based on the barriers you’ve identified and experienced in your program or practice.
For example, in Stage 1 (engage and explore) a key barrier may be a lack of information about the needs of the
children and families who participate in your service. You can overcome this barrier by compiling agency or
service provider data, such as demographics; client goals at baseline; child and family feedback; and outcomes
for families when they exit your program. During Stage 2 (plan and prepare), when you’ve already decided what
to implement, a barrier might be that staff lack the right experience and skills to implement the new program
or practice. You can overcome this by providing staff with interactive, skills-based training and post-training
technical assistance, such as coaching.
It’s important for your agency or service provider to prioritise your implementation planning and preparation
activities. Common pitfalls include:
• not investing enough time or resources during the early stages of implementation
• attempting to implement too many new programs and practices at once
• not reprioritising resources from an 'old' to a 'new' initiative.
Investing adequate time and resources during the early stages of implementation will reduce the effort required later down the track. This short-term pain will result in long-term gain.
In summary, using relevant implementation strategies will improve the quality of your implementation (as shown
by improved implementation outcomes). This, in turn, improves client outcomes. This shows just how important it
is to monitor implementation quality, as well as client outcomes.
Consider this scenario: You evaluate a new program by assessing the outcomes for children and families, and you
find no beneficial effects. Unless you also assess the implementation outcomes, it’s unclear if the program had
no effect because it was poorly implemented (e.g. lack of program buy-in or fit; or program components were
skipped or not delivered as intended), or because it’s a truly ineffective program.
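The reasoning in this scenario can be sketched as a small decision rule. This is an illustrative sketch only: the function, the single fidelity score standing in for implementation outcomes, and the 0.8 cut-off are all assumptions for demonstration, not part of this guide's recommended tools.

```python
from typing import Optional

# Illustrative sketch: a null client outcome is ambiguous unless
# implementation outcomes (here, one fidelity score) were also measured.
# The 0.8 threshold and all names are assumptions for demonstration.

def interpret_evaluation(client_outcomes_improved: bool,
                         fidelity_score: Optional[float]) -> str:
    """Distinguish a possible implementation failure from a possible
    program failure when client outcomes show no benefit."""
    if client_outcomes_improved:
        return "program appears effective as implemented"
    if fidelity_score is None:
        return "ambiguous: implementation quality was not monitored"
    if fidelity_score < 0.8:  # assumed cut-off for 'delivered as intended'
        return "possible implementation failure: low fidelity"
    return "possible program failure: implemented well, no effect"
```

In other words, monitoring implementation quality is what lets you tell the first two failure cases apart.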
Figure 2: The relationships between implementation strategies, implementation outcomes and client outcomes
Barriers/enablers shown in Figure 2 include: the policy environment; leadership support; complexity of the program or practice; practitioner attitudes towards the program or practice; and practitioner skill in the program or practice.
Source: Adapted from Lewis (2017), Lyon and Bruns (2019), Proctor et al. (2011)
3. Overview of implementation stages
Depending on where you’re at in your implementation process, it may not make sense for you to follow every
step in every stage, as outlined here. Consider which activities you’ve already completed, what decisions have
already been made and what makes sense in your context. You may skip some steps or start at a later stage. Use
the Implementation Stages – Deciding Where to Start tool (Appendix A) to help you determine where you are
up to, and which stage and step should come next. You can also use the Implementation Progress Checklist to
monitor your progress through the stages (Appendix B).
Select and adopt a program or practice: Look for existing programs and practices that could fill your gap.
Ensure they can meet your needs, can create the desired outcomes, are a good fit for your context and are
supported by evidence.
Set up an implementation team: Consider establishing a team that’s responsible for moving the program or
practice through the stages of implementation.
Consider likely enablers and barriers, and assess readiness: Identify enablers and barriers that are likely to arise early in the process (noting that enablers and barriers will need to be continuously monitored throughout the stages). Focus particularly on the ways in which your organisation is ready – and unready – to implement the program or practice.
Develop an implementation plan: Develop an implementation plan that identifies how to put your
implementation strategies into action. Carefully plan what needs to be done; when and where it needs to happen;
how it is to happen; and who is responsible.
Decide how to monitor implementation quality: Identify the best indicators of implementation quality. Plan how
you will measure and monitor these during the implementation process.
Build readiness to use the program or practice: Ensure your organisation will be ready to start using the
program or practice. Use implementation strategies such as training, acquiring resources and adapting existing
practices.
Continuously monitor and refine: Use continuous quality improvement cycles to monitor the quality of the
implementation. Use this information to guide improvements or adaptations to your implementation.
Scale up the program or practice: If the first implementation attempts are stable, introduce the program or practice to new teams, sites or contexts. This begins a new implementation process.
4. Stage 1: Engage and explore
During this stage, your aim is to make an informed decision about which evidence-informed program or practice
to adopt. You also need to assess your organisation’s readiness to implement this program or practice. As you
move through this process, engage as many internal stakeholders as possible. Consider running workshops or
other collaborative activities to explore their insights and understand their preferences and experiences.
To make good decisions about what program or practice to implement, you need to consider:
• the needs of the target population participating in your service
• the outcomes you'd like to achieve with and for children and families
• your agency or service provider's capacity to implement the new program or practice
• the evidence for the effectiveness of the program or practice you plan to implement.
To assess the needs of the target population, use data that indicate the intensity of the problem. Involve all
relevant stakeholders who can help find solutions, including people from other service providers in your region.
This exercise helps to identify and define the problem you’re trying to solve with your new initiative.
Next, identify the outcomes you’d like the new initiative to bring about for children and families. These should be
measurable changes or benefits that are experienced as a result of your new program or practice. The outcomes
should relate to the problem you defined at the start of this step. For example, if the problem was defined as
children living in an unsafe home environment, relevant outcomes might include a decrease in child injuries and
adequate stimulation for children in the home environment. This process helps you to narrow down the possible
programs or practices under consideration. Using a program logic can help, as it draws out the relationships
between program or practice inputs (e.g. resources), outputs (e.g. key activities) and outcomes (e.g. benefits for
children and families). Refer to the Child Family Community Australia guidance for developing a program logic
for more information.1
Table 1 provides example questions that you can use when defining your target population, their needs and the desired outcomes.
1 aifs.gov.au/cfca/expert-panel-project/program-planning-evaluation-guide/plan-your-program-or-service/how-develop-program-logic-planning-and-evaluation
Table 1: Questions to help define what needs to change and for whom
Using an outcomes framework can help you identify and describe which outcomes you’d like to create.
Examples include the NSW Government Human Services Outcomes Framework and the Victorian Public
Health and Wellbeing Outcomes Framework.
You can use the Implementation Considerations Checklist tool (Appendix C) to guide decision makers through
the selection and adoption process.
The Implementation Considerations Checklist tool (Appendix C) can help you select a program that’s
appropriate and feasible for your context.
2 Several such menus exist; for example, Communities for Children Facilitating Partners Evidence-based Programme Profiles (apps.aifs.gov.au/cfca/guidebook/), the Early Intervention Foundation Guidebook (guidebook.eif.org.uk/), and the California Evidence-Based Clearinghouse for Child Welfare (www.cebc4cw.org/)
Table 2 presents an overview of the implementation team’s purpose, composition and core capabilities.
Over time, the work of the team will be refined. The tasks and composition of the team can change as the
implementation stages progress. Be prepared to change the membership of the implementation team over time,
as needed. Consider:
• What core competencies are needed to drive the implementation at each implementation stage?
• Who has the skills, knowledge and decision-making authority to effectively facilitate the necessary implementation activities?
• Which internal stakeholders need to be included?
• What organisational systems and policies are needed to support implementation?
Other considerations
When you’re establishing your implementation team, it’s essential to invest time in the initial implementation
team planning. You’ll need to:
• select team members
• ensure the team has the appropriate authority to implement changes and make decisions 'in the room' to improve implementation
• develop accountability mechanisms, including tracking actions and scheduling regular meetings.
You should continue to monitor enablers and barriers throughout the implementation process, as different
opportunities and challenges are likely to emerge as the process unfolds. However, starting now will help you
identify and address early barriers that could slow down the process and reduce momentum. It will also help you
to nurture the enablers which will help the new program or practice to flourish.
Some enablers and barriers will be obvious. For example, you may have a clear mandate from senior leadership
to use whatever resources it takes to initiate your new program. Or, conversely, you might not have enough
funding to support the program you’ve selected. Or, practitioners may need to be upskilled in the new practice
you are seeking to implement. Other enablers and barriers will be less obvious, though no less important.
You may find it helpful to take a structured approach when assessing enablers and barriers. This approach
can guide your thinking and help you to clearly see the obstacles to overcome and the existing enablers to
be maintained. One common approach for exploring enablers and barriers is the Consolidated Framework for
Implementation Research (CFIR) (Damschroder et al., 2009).
The CFIR identifies five domains that will influence your implementation process:
• characteristics of the program or practice itself (e.g. adaptability and cost)
• individuals involved in implementation (e.g. knowledge and beliefs about the program or practice)
• the inner context or setting (e.g. organisational culture and leadership engagement)
• the outer context or setting (e.g. client needs, and policy and funding priorities)
• the implementation process (e.g. planning, reflecting and evaluating).
The CFIR website3 describes the factors that influence implementation within each domain. It also contains tools
for identifying the specific enablers and barriers in each domain.
You can use the CFIR Interview Guide Tool4 to build a set of questions to guide your discussions. These
can also help you to assess the enablers and barriers that are most important in your setting. The
questions can be used for interviews with staff and they can guide the implementation team’s discussions,
as well as discussions with other decision makers during the implementation process.
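A simple way to keep track of what those discussions surface is to log enablers and barriers under the five CFIR domains. The sketch below is illustrative only: the function names and the two example entries are invented for demonstration, not findings from any real assessment or part of the CFIR tools themselves.

```python
# Illustrative sketch: an enabler/barrier log structured by the five
# CFIR domains described above. Example entries are invented.

CFIR_DOMAINS = (
    "program or practice characteristics",
    "individuals involved",
    "inner setting",
    "outer setting",
    "implementation process",
)

def new_assessment():
    """Return an empty enabler/barrier log with one slot per domain."""
    return {domain: {"enablers": [], "barriers": []} for domain in CFIR_DOMAINS}

assessment = new_assessment()
assessment["inner setting"]["enablers"].append("engaged leadership")
assessment["outer setting"]["barriers"].append("uncertain future funding")

def domains_needing_attention(log):
    """Domains where at least one barrier has been recorded."""
    return [domain for domain, entry in log.items() if entry["barriers"]]
```

Keeping the log keyed by domain makes it easy to see at a glance where the obstacles cluster as the process unfolds.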
Organisational readiness refers to the extent to which your organisation is willing and able to implement the
selected program or practice (Scaccia et al., 2015). Low organisational readiness is a common barrier at this
stage of implementation. However, it’s important to understand that ‘readiness’ is not a static condition. Your
organisation does not have to be 100% ‘ready’ at the very beginning of the implementation process. Some
aspects of readiness may not be present at first, but you can use implementation strategies to build them later.
Readiness may also decrease over time; for example, if key staff leave your organisation. You can reassess
your level of readiness at particular points during the implementation process and this can further inform your
decisions on what support or change is needed. For example, the end of Stage 2 is a good time to reassess readiness and confirm that your organisation is prepared to initiate the program or practice.
A framework or tool can help guide your assessment of organisational readiness. The Readiness = Motivation × Capacity (General) × Capacity (Specific) framework (or R = MC²; Scaccia et al., 2015) describes three
factors that influence organisational readiness for implementation:
• the motivation of agency or service provider staff to implement the program or practice
• the general capacities of an agency or service provider
• the program- and practice-specific capacities needed to implement the intervention.
You can assess these three components using the Wandersman Center’s Readiness Thinking Tool® (see
Appendix D). This tool helps you to consider whether the different components are strengths or challenges for
your organisation. The tool also provides suggested discussion questions to help you respond to your readiness
assessment findings. The goal of this process is to identify how you can improve organisational readiness and
enhance the likelihood of implementation success.
3 cfirguide.org/constructs/
4 cfirwiki.net/guide/app/index.html#/
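The multiplicative logic of the R = MC² framing can be sketched numerically. This is an illustrative sketch under stated assumptions: the 1–5 rating scale and the function are invented for demonstration; the Readiness Thinking Tool® itself is a structured discussion aid, not a calculator.

```python
# Illustrative sketch of the R = MC^2 framing (Scaccia et al., 2015):
# readiness treated as the product of motivation, general capacity and
# program-specific capacity, each rated on an assumed 1-5 scale.

def readiness_score(motivation: float,
                    general_capacity: float,
                    specific_capacity: float) -> float:
    """Multiply the three components; a very low rating on any one of
    them drags the overall score down, which is the point of the
    multiplicative framing."""
    for rating in (motivation, general_capacity, specific_capacity):
        if not 1 <= rating <= 5:
            raise ValueError("ratings are assumed to be on a 1-5 scale")
    return motivation * general_capacity * specific_capacity

# A highly motivated team with weak program-specific capacity scores
# below a team that is only moderate on every component.
assert readiness_score(5, 4, 1) < readiness_score(3, 3, 3)
```

The multiplication captures why readiness is not static: improving (or losing) any one component changes the whole picture, which is why reassessment at points such as the end of Stage 2 is worthwhile.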
5. Stage 2: Plan and prepare
5.1 Choose implementation strategies
If you’re able to choose your own implementation strategies, one useful technique is to match the strategies to
the implementation barriers you’ve identified or experienced. The Expert Recommendations for Implementing
Change (ERIC) project identified more than 70 commonly used implementation support strategies that can be
used to drive the implementation process (Powell et al., 2015; Waltz et al., 2015). See Table 3 for some examples.
These strategies have been matched with common implementation barriers (defined using the CFIR) to create a
decision aid – the CFIR-ERIC Matching Tool.
You can use the CFIR-ERIC Matching Tool5 to help you decide which implementation strategies to
use. Input the implementation barriers you’ve identified into the tool and it will generate a list of
implementation strategies that experts think will best address these barriers.
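The matching step the tool performs can be pictured as a lookup from identified barriers to candidate strategies. The sketch below is illustrative only: the barrier labels and mappings are invented for demonstration, whereas the real tool draws on expert ratings across the full ERIC list of 70+ strategies.

```python
# Illustrative sketch of barrier-to-strategy matching in the spirit of
# the CFIR-ERIC Matching Tool. The mappings below are invented examples,
# not the tool's actual expert ratings.

BARRIER_TO_STRATEGIES = {
    "low adaptability": ["promote adaptability"],
    "resistance to change": ["identify and prepare champions"],
    "staff lack skills": ["conduct ongoing training", "provide coaching"],
}

def suggest_strategies(identified_barriers):
    """Collect candidate strategies for the identified barriers,
    de-duplicated and kept in first-seen order."""
    seen, suggestions = set(), []
    for barrier in identified_barriers:
        for strategy in BARRIER_TO_STRATEGIES.get(barrier, []):
            if strategy not in seen:
                seen.add(strategy)
                suggestions.append(strategy)
    return suggestions
```

As the following paragraphs note, a generated list like this is a starting point: it still needs filtering for what is feasible in your context.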
Sometimes the CFIR-ERIC Matching Tool will generate a long list of potential strategies for addressing the
inputted barriers, and these won’t all be feasible in your context. While helpful, the Matching Tool can’t replace
careful thought and decision making based on your specific context. We’ve identified some guiding principles to
help you select the best implementation strategies for your context:
• Select implementation strategies that best target the change in behaviour required to overcome the barriers you identified in Stage 1.
• Engage stakeholders (practitioners, leadership, clients, referrers and the community) to help you select the best implementation strategies and develop actions for them. Consider asking stakeholders to rate the importance and feasibility of proposed implementation strategies to inform your decision.
• Remember that an implementation strategy can be one discrete action, or a collection of interwoven actions packaged together and aimed at addressing multiple barriers (Powell et al., 2012).
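To make the idea of matching barriers to strategies concrete, here is a minimal sketch in Python. The barrier and strategy names below are illustrative only, and the real CFIR-ERIC Matching Tool draws on expert ratings across more than 70 ERIC strategies rather than a simple lookup; treat this as a thinking aid, not a reimplementation of the tool.

```python
# Hypothetical sketch of barrier-to-strategy matching, in the spirit of the
# CFIR-ERIC Matching Tool. Barrier and strategy names are illustrative only.
from collections import Counter

# Map each CFIR-style barrier to ERIC-style strategies experts might endorse.
STRATEGY_MAP = {
    "low adaptability": ["Promote adaptability", "Conduct local needs assessment"],
    "resistance to change": ["Conduct local consensus discussions",
                             "Conduct educational meetings",
                             "Identify and prepare champions"],
    "limited resources": ["Access new funding", "Conduct local needs assessment"],
}

def suggest_strategies(barriers):
    """Return candidate strategies, ranked by how many input barriers they address."""
    counts = Counter()
    for barrier in barriers:
        for strategy in STRATEGY_MAP.get(barrier.lower(), []):
            counts[strategy] += 1
    return [strategy for strategy, _ in counts.most_common()]

print(suggest_strategies(["Resistance to change", "Limited resources"]))
```

A strategy that addresses several of your barriers at once rises to the top of the list, which mirrors the guiding principle above that one packaged strategy can target multiple barriers.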
Once you’ve chosen your implementation strategies, develop specific actions to bring them to life. Table 4
provides examples of common barriers to implementation, and relevant strategies and actions that can be used
in the child and family services context to overcome each of the barriers.
5 cfirguide.org/choosing-strategies/
Table 4: Implementation barriers, implementation strategies and actions that are relevant to the child and family services sector

Barrier: Low adaptability
A program or practice seems promising but has been developed for a different context and target population. It's not appropriate in its current form for cultural, linguistic or other reasons.
Implementation strategy: Promote adaptability
Definition: Identify how to tailor the program or practice to meet local needs. Clarify which elements of the program or practice must be maintained to preserve fidelity.
Relevant implementation stage(s): Stage 1: Engage and assess; Stage 4: Sustain and scale
Example actions:
• Ask stakeholders which adaptations would make the program or practice more appropriate for their context.
• Clarify which components of the program or practice must be maintained to preserve fidelity. Determine which program elements can be tailored to your local context (if any). You can do this by checking information in program menus or repositories, program manuals or guidelines, or by speaking directly with the developer of the program or practice.
• If permitted, introduce the adaptations once they've been approved and fidelity to the core components of the program or practice is reached (usually during Stage 4). Monitor their impact to see if the adapted version of the program or practice is more acceptable, a better fit and can be delivered with fidelity.

Barrier: Resistance to change
Practitioners aren't committed to the change because they don't believe a new program or practice is needed.
Implementation strategies: 1. Conduct local consensus discussions; 2. Conduct educational meetings; 3. Identify and prepare champions
Definitions:
1. Talk with stakeholders about (1) whether the chosen problem is important to them and (2) what program or practice is appropriate to address it.
2. Meet with stakeholder groups and tell them about the program or practice.
3. Identify and prepare people who can motivate colleagues, model effective implementation and overcome resistance to change.
Relevant implementation stage(s): (1) Stage 1: Engage and assess; (2) and (3) Stage 2: Plan and prepare
Example actions:
1. (a) Conduct workshops with practitioners and ask for their thoughts on how to define the target population, their unmet needs, and how to explore new programs or practices that might meet their needs. (b) Conduct group discussions with practitioners and leadership staff using the questions in the Implementation Considerations Checklist (see Appendix C).
2. Run group or one-on-one information sessions with staff and practitioners. Explain the program or practice, including potential benefits and the resources and commitment required. Give your staff the opportunity to ask questions and explore their concerns.
3. During consultations, identify possible implementation champions. Approach them afterwards and chat with them about taking on the role.
You can use the Implementation Plan Template (Appendix E) to help the implementation team or other
decision makers to map out their plan. For another example, see this template developed by the National
Clinical Effectiveness Committee.6
Your implementation monitoring plan should track your key implementation outcomes (i.e. is the program being
implemented and how well?). These are different to your program outcomes, which describe the desired changes
for children, parents, carers and families (i.e. is the program making a difference for people using the
service?). Implementation outcomes indicate the quality of your implementation. An evidence-informed program
or practice that’s implemented well (i.e. has good implementation outcomes) has the best chance of delivering
benefits for children and families (see Figure 2 in Chapter 2).
Your implementation team or other decision makers should select which outcomes to monitor, ideally before the
new program or practice has started. However, if the program or practice has already started, it’s not too late to
put monitoring measures in place. You can do this any time.
Table 5 includes some key implementation outcomes, alongside some simple, good-quality measurement
methods and tools. We also encourage you to consider additional outcomes and measures that are appropriate
for your context.
6 health.gov.ie/wp-content/uploads/2018/09/Tool-4-Implementation-Plan.pdf
It’s important to consider the quality of your measurement tools. This includes psychometric
considerations such as reliability, validity and sensitivity, as well as practical considerations such as length,
language and ease of use.
Your fidelity measures should be tailored to the program or practice you're implementing. The EPISCenter
website (www.episcenter.psu.edu/fidelity) provides examples of existing fidelity measures for a range
of programs in the child and family service sector. The best fidelity measures or checklists allow for
assessment of how often the program or practice is used, and how extensive its reach is. They also track
the competence and quality of program or practice use.
The Society for Implementation Research Collaboration (SIRC) is currently developing a repository of
tools (see societyforimplementationresearchcollaboration.org/sirc-instrument-project/) that measure
implementation outcomes. The repository includes information on the psychometric properties (e.g.
reliability, validity) and pragmatic qualities of each tool. The repository is a work in progress and available
to paid members of SIRC.
Table 5 includes three short, simple and freely accessible implementation outcome measures with good
psychometric properties: AIM (to measure acceptability), FIM (to measure feasibility) and IAM (to measure
appropriateness). All three have been developed and validated by Weiner and colleagues (2017) and are
available as a free download.7
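Scores on brief measures like these are commonly produced by averaging item responses into a single scale score. The sketch below illustrates that general scoring pattern only; it is an assumption for illustration, so follow each measure's own scoring instructions rather than this code.

```python
# Illustrative scoring sketch for a short implementation outcome measure.
# Assumes items are rated 1-5 and the scale score is the item mean; verify
# this against the actual scoring guidance for the measure you are using.
from statistics import mean

def scale_score(item_responses):
    """Average item responses (rated 1-5) into a single scale score."""
    if not all(1 <= r <= 5 for r in item_responses):
        raise ValueError("Responses must be on a 1-5 scale")
    return mean(item_responses)

# e.g. one practitioner's four acceptability ratings
print(scale_score([4, 5, 4, 4]))  # 4.25
```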
If your implementation plan includes post-training implementation strategies, such as follow-on coaching, they
should be actioned now.
Be curious about the information and data you’re collecting. The purpose is not to judge whether the
implementation ‘succeeded’ or ‘failed’. Rather, the purpose is to bring some of the barriers to light so you can
respond to, minimise or overcome them.
When you identify barriers, draw on the resources of the implementation team or other decision makers to
decide how to respond to them and improve your implementation process. Use your data to inform your
decisions about how to make improvements. Once you’ve decided how to revise your implementation strategies,
update your implementation plan to record the new actions you’ve committed to. Remember to note who will be
responsible for each action and when each action is due to be completed.
It’s important to keep monitoring implementation quality, enablers and barriers after introducing potential
solutions. If nothing changes, you know the ‘solution’ you introduced is not working. You’ll need to try a new
implementation strategy or revisit your understanding of the barrier you’re trying to overcome. Figure 4
illustrates this continuous quality improvement cycle.
Figure 4: The continuous quality improvement cycle – monitor, review, respond
Applying this cycle during implementation will help you to quickly determine whether you need to make changes
to the program or practice to improve the fit between your context and the new program or practice. As you
become more familiar with the improvement cycle, data-informed decision making will become easier and more
natural. What may have felt challenging at the beginning of this phase will likely become routine.
They can gather this information by reviewing the cases accepted at intake and discussing the issue
with relevant staff. Questions arise, like: Are the external referrals into the program inappropriate, but
being accepted at intake anyway? If so, they may need to ensure there’s clearer communication with
external stakeholders and undertake additional promotion of the program. Or perhaps practitioners are
self-selecting ‘easy’ children and families for the program, and putting those who reflect the true target
population on a wait list? This may suggest that practitioners aren’t confident with the new approach.
They may need additional encouragement (e.g. praising efforts) and support (e.g. reduced caseload or
administrative duties) from leadership, or more intensive coaching to build confidence in the program
elements (respond).
This example shows how implementation teams can make data-informed decisions to effectively address
barriers that can threaten high-quality implementation. Once you decide how to respond, you’ll need to
update and action your revised plan. Then the cycle starts again.
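The monitor-review-respond cycle described above can be sketched as a simple loop. This is an illustrative sketch only: every name, threshold and data structure here is hypothetical, and real review decisions are made by the implementation team, not by code.

```python
# Illustrative sketch of one pass of the continuous quality improvement
# cycle (monitor -> review -> respond). All names here are hypothetical.

def improvement_cycle(plan, monitoring_data, threshold=0.8):
    """Review monitoring data against targets, then record responses in the plan."""
    actions = []
    # Review: compare each monitored implementation outcome against its target.
    for outcome, score in monitoring_data.items():
        if score < threshold:
            # Respond: record a revised strategy and who is responsible for it.
            actions.append({
                "outcome": outcome,
                "action": f"Revise strategy for {outcome}",
                "owner": plan.get("team_lead", "implementation team"),
            })
    # The agreed actions feed back into the plan; monitoring then continues.
    plan.setdefault("agreed_actions", []).extend(actions)
    return plan

plan = {"team_lead": "Site coordinator", "agreed_actions": []}
data = {"fidelity": 0.65, "acceptability": 0.9}
updated = improvement_cycle(plan, data)
print(updated["agreed_actions"])
```

The point of the sketch is the shape of the loop: low scores trigger a recorded response with a named owner, the plan is updated, and monitoring resumes.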
There is some evidence to suggest local adaptations may be beneficial to implementation, encouraging buy-in
and ownership, and enhancing the fit between an intervention and the local setting (Lendrum & Humphrey, 2012).
However, too much flexibility can take away from a program’s effectiveness, particularly when modifications are
made to the core components of the intervention. If you find lots of adaptations are needed to fit your context,
you may want to revisit your initial decision to adopt that particular program or practice.
Practitioners can feel frustrated when they’re delivering manualised programs with many fixed, core components.
These types of programs can be perceived as inflexible and you may find program fidelity (i.e. delivering the
program exactly as it was designed and intended) pitted against a practitioner’s sense of autonomy and ‘practice
wisdom’. However, it can be more helpful to view program fidelity as a guide to understanding where to be ‘tight’
and where to be ‘loose’. Practitioners should stick tight to the core components of an intervention until they fully
understand them, and can apply and use them in daily practice. Only then should you begin to introduce local
adaptations. A good fidelity measure will enable you to actively and accurately monitor the core components
and will show you when adaptations can be introduced.
Core components may include the content and mode of delivery of a program. Flexible components
may include the program packaging and promotional material, which can be adapted to use different
languages and images that best reflect the local context.
Sustainment requires adequate and ongoing funding. It requires a good program or practice-context fit, and
sufficient capacity to train new and replacement staff. It also requires ongoing support and stable stakeholder
commitment. Constant change is a normal part of the child and family service sector, so it’s important to ensure
you’re always ready to adapt to change. This means you need to consider sustainment from the very beginning
of your implementation process.
If the answers to these questions indicate the initial implementation is stable, it may be natural to scale up or out.
Scaling is the process of implementing the same program or practice to other teams, sites, service providers
or agencies. You should try to use the lessons you learned from the initial implementation process to identify
potential enablers and barriers during expansion, as well as predict which implementation strategies you
require. It can help to revisit the implementation plan from your initial implementation to review the barriers you
identified, encountered and overcame at each stage, including any unintended consequences of implementation
that needed to be addressed along the way.
Scaling up or out can be like an entirely new implementation process: it will lead your organisation back to some
of the steps in earlier implementation stages. For example, organisational
readiness should be assessed with each new team or site, as their context and resources may differ. You’ll
probably need a separate implementation plan for each new implementing team or site.
If you plan to scale up the program or practice across a service system, ensure it’s not mandatory for all sites and
isn’t tied to compliance requirements. Implementation of the program or practice will be most successful if the
potential implementation sites have agency over the decision, and if they believe the approach will be beneficial.
Implementation teams and other implementation champions will be important resources during this stage. They
can inform and guide the scaling process. Similarly, coaches who supported local implementation efforts and
helped practitioners learn and acquire new skills can share their skills and knowledge on a broader scale.
8. A note of encouragement
Using good implementation practices can seem like a lot of work – and in many ways it is. It requires
careful planning, thoughtfulness, resourcefulness and dedication. However, even though high-quality
implementation takes time and effort, this investment of resources pays dividends later – in the form of
more sustainable and effective service delivery for children and families. For further help, you can access
implementation support from a number of specialist organisations.
When you use the principles and processes outlined in this guide, you’ll become more familiar with what’s
required to achieve high-quality implementation. Actively using this guide will help you turn your knowledge of
the concepts into practical skills. Try applying the implementation framework outlined in this guide to your next
initiative. See what fits in your context, and what activities or approaches may need to be adapted or tailored. By
using this approach step by step, you’ll build your confidence and capacity to lead implementation efforts in your
context, for the ultimate aim of improving outcomes for the children and families using your service.
References
Aarons, G. A., Ehrhart, M. G., & Farahnak, L. R. (2014). The implementation leadership scale (ILS): Development of a brief measure
of unit level implementation leadership. Implementation Science, 9(45).
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Hurlburt, M. S. (2015). Leadership and organizational change for implementation
(LOCI): A randomized mixed-method pilot study of a leadership and organization development intervention for evidence-
based practice implementation. Implementation Science, 10(11).
Aarons, G. A., Ehrhart, M. G., Farahnak, L. R., & Sklar, M. (2014). Aligning leadership across systems and organizations to develop
a strategic climate for evidence-based practice implementation. Annual Review of Public Health, 35, 255–274.
Aarons, G. A., Green, A. E., Trott, E., Willging, C. E., Torres, E. M., Ehrhart, M. G. et al. (2016). The roles of system and organizational
leadership in system-wide evidence-based intervention sustainment: A mixed-method study. Administration and Policy in
Mental Health and Mental Health Services Research, 43(6), 991–1008.
Albers, B., Mildon, R., Lyon, A. R., & Shlonsky, A. (2017). Implementation frameworks in child, youth and family services – Results
from a scoping review. Children and Youth Services Review, 81, 101–116.
Brown, L. D., Feinberg, M. E., & Greenberg, M. T. (2010). Determinants of community coalition ability to support evidence-based
programs. Prevention Science, 11(3), 287–297.
Burke, K., Morris, K., & McGarrigle, L. (2012). An Introductory Guide to Implementation. Dublin: Centre for Effective Services.
Damschroder, L. J., Aron, D. C., Keith, R. E., Kirsh, S. R., Alexander, J. A., & Lowery, J. C. (2009). Fostering implementation
of health services research findings into practice: A consolidated framework for advancing implementation science.
Implementation Science, 4(50).
Eccles, M. P., & Mittman, B. S. (2006). Welcome to implementation science. Implementation Science, 1(1).
Lendrum, A., & Humphrey, N. (2012). The importance of studying the implementation of interventions in school settings. Oxford
Review of Education, 38(5), 635–652.
Lewis, C. (2017). What are implementation mechanisms and why do they matter? Paper presented at the 2017 Society for
Implementation Research Collaboration (SIRC) Conference, Seattle, WA.
Lyon, A. R., & Bruns, E. J. (2019). From evidence to impact: Joining our best school mental health practices with our best
implementation strategies. School Mental Health, 11(1), 106–114.
Metz, A., & Bartley, L. (2012). Active implementation frameworks for program success. Zero to Three, 32(4), 11–18.
Metz, A., Bartley, L., Ball, H., Wilson, D., Naoom, S., & Redmond, P. (2015). Active implementation frameworks for successful
service delivery: Catawba County Child Wellbeing Project. Research on Social Work Practice, 25(4), 415–422.
Moullin, J. C., Sabater-Hernández, D., Fernandez-Llimos, F., & Benrimoj, S. I. (2015). A systematic review of implementation
frameworks of innovations in healthcare and resulting generic implementation framework. Health Research Policy and
Systems, 13(16).
Powell, B. J., Fernandez, M. E., Williams, N. J., Aarons, G. A., Beidas, R. S., Lewis, C. C. et al. (2019). Enhancing the impact of
implementation strategies in healthcare: A research agenda. [Perspective]. Frontiers in Public Health, 7(3).
Powell, B. J., McMillen, J. C., Proctor, E. K., Carpenter, C. R., Griffey, R. T., Bunger, A. C. et al. (2012). A compilation of strategies for
implementing clinical innovations in health and mental health. Medical Care Research and Review, 69(2), 123–157.
Powell, B. J., Waltz, T. J., Chinman, M. J., Damschroder, L. J., Smith, J. L., Matthieu, M. M. et al. (2015). A refined compilation
of implementation strategies: Results from the Expert Recommendations for Implementing Change (ERIC) project.
Implementation Science, 10(21).
Proctor, E., Silmere, H., Raghavan, R., Hovmand, P., Aarons, G., Bunger, A. et al. (2011). Outcomes for implementation research:
Conceptual distinctions, measurement challenges, and research agenda. Administration and Policy in Mental Health and
Mental Health Services Research, 38(2), 65–76.
Rabin, B., & Brownson, R. (2018). Terminology for dissemination and implementation research. In R. C. Brownson, G. A. Colditz &
E. K. Proctor (Eds.), Dissemination and implementation research in health: Translating science to practice (Vol. 2, pp. 19–45).
New York: Oxford University Press.
Scaccia, J. P., Cook, B. S., Lamont, A., Wandersman, A., Castellow, J., Katz, J. et al. (2015). A practical implementation science
heuristic for organizational readiness: R=MC2. Journal of Community Psychology, 43(4), 484–501.
Tabak, R. G., Khoong, E. C., Chambers, D. A., & Brownson, R. C. (2012). Bridging research and practice: Models for dissemination
and implementation research. American Journal of Preventive Medicine, 43(3), 337–350.
Waltz, T. J., Powell, B. J., Matthieu, M. M., Damschroder, L. J., Chinman, M. J., Smith, J. L. et al. (2015). Use of concept mapping
to characterize relationships among implementation strategies and assess their feasibility and importance: Results from the
Expert Recommendations for Implementing Change (ERIC) study. Implementation Science, 10(109).
Weiner, B. J., Lewis, C. C., Stanick, C., Powell, B. J., Dorsey, C. N., Clary, A. S. et al. (2017). Psychometric assessment of three newly
developed implementation outcome measures. Implementation Science, 12(108).
This checklist is intended to be used as a progress monitoring and planning tool to assist implementation teams
and other decision makers in keeping the implementation process on track.
When using this tool, remember that the implementation process is often non-linear, with overlapping stages and
activities. Depending on where you are in your implementation process, it may not make sense for you
to follow every step in every stage as outlined here. Depending on what activities have already been undertaken,
what decisions have already been made, and what makes sense in your context, you may decide to skip some
steps, or to start at a later stage or step. Given this, it may be useful to tailor the tool to reflect your context and
circumstances before using it.
There are no right or wrong answers to the questions. Rather, they should be used to guide your research and
your thinking about what program or practice to implement at your agency or service.
8 Several such menus/repositories exist; for example, Communities for Children Facilitating Partners Evidence-based Programme
Profiles (apps.aifs.gov.au/cfca/guidebook/), the Early Intervention Foundation Guidebook (guidebook.eif.org.uk/), and the California
Evidence-Based Clearinghouse for Child Welfare (www.cebc4cw.org/).
Appendix D: Readiness Thinking Tool®

This form can help you think about an organisation's readiness to implement a new program, policy, practice or process. For each item below, rate whether it is currently a challenge or a strength for your organisation, or mark it as unsure.

Motivation: Degree to which we want the program or practice to happen
• Relative advantage: This program or practice seems better than what we are currently doing.
• Compatibility: This program or practice fits with how we do things.
• Simplicity: This program or practice seems simple to use.
• Ability to pilot: Degree to which this program or practice can be tested and experimented with.
• Observability: Ability to see that this program or practice is leading to outcomes.
• Priority: Importance of this program or practice compared to other things we do.

Program or practice-specific capacity: What is needed to make this particular program or practice happen?
• Program or practice-specific knowledge and skills: Sufficient abilities to do the program or practice.
• Champion: A well-connected person who supports and models this program or practice.
• Supportive climate: Necessary supports, processes and resources to enable this program or practice.
• Inter-organisational relationships: Relationships between organisations that support this program or practice.
• Intra-organisational relationships: Relationships within the organisation that support this program or practice.

General capacity: Our overall functioning
• Culture: Norms and values of how we do things here.
• Climate: The feeling of being part of this organisation.
• Innovativeness: Openness to change in general.
• Resource utilisation: Ability to acquire and allocate resources, including time, money, effort and technology.

Discussion questions
• Which is currently the greatest challenge for implementation? Where would more information and data be helpful? How can you get these data?
• Which is the greatest strength? Where do you have differences with your colleagues?

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Appendix E: Implementation plan template

The Implementation Action Tracker, at the very back of the document, can be used to help structure Implementation Team meetings. It is a useful tool to help track and log the 'agreed actions' that are listed in the implementation plan.

The implementation plan records:
• Aim of project
• Scope of project
• Timeline
• Implementation Team members (roles)

The plan itself is a table with the following columns: Implementation barrier or enabler | Implementation strategy to address barrier or maintain enabler | Implementation stage | Agreed action(s) | Person responsible | Time frame to action or due date.

The following guiding questions may help you and your colleagues in developing and updating your implementation plan. Use them as questions to prompt discussion. There are no guiding questions provided for Stage 1 because the implementation plan is developed in Stage 2.

Guiding questions relevant to Stage 2: Plan and prepare
Describe how you will approach staff training and skill development over time, taking into consideration how to adapt the approach to accommodate
staff turnover.
For example, this might include:
§ how to integrate new managers and practitioners who commence employment after the initial program or practice training is provided, or
§ seeking staff feedback on training, or more generally exploring implementation enablers and barriers with staff.
How will you obtain and maintain staff buy-in and foster a supportive change climate across the agency or service?
For example, this might include:
§ how leadership will promote and communicate about the new practice or program across the agency or service
§ how staff will be supported to give the new initiative a try, and to make mistakes and learn from them
§ ensuring there is sufficient time and space to pace implementation appropriately.
What planning can you do now, and/or what safeguards can be put in place now, to promote program or practice sustainability?
For example, this may include:
§ gathering information that will be useful for developing a sustainable funding plan for program costs for the next one to four years
§ planning to provide ongoing skill development for staff in the long term
§ how to address staff turnover, particularly at the manager/supervisor level.
Guiding questions relevant to Stage 3: Initiate and refine
How will you know if your implementation strategies and actions should be improved? Who is going to make the decisions and changes, and how?
For example, this might include:
§ What information would trigger a discussion about changes to process?
§ Would decisions about changes sit with the implementation team, or other decision maker(s)?
§ Whose responsibility is it to ensure changes are made, and who needs to approve those changes?
Is the program or practice being implemented well (according to the indicators of quality implementation that you are monitoring)?
Which changes are you observing in the implementation quality monitoring data?
For example, this might include:
§ Are you achieving the implementation outcomes you planned for?
§ Which monitoring tools, systems and processes were useful and which were a challenge to use? Are revisions required to monitoring tools and
processes?
§ capacity building to support practitioner skill maintenance and continuous learning, so the program or practice can be fully sustained (e.g. training internal coaches, identifying additional skills for practitioners to develop)
§ continuing to use monitoring data to make the changes needed to improve implementation strategies and actions, using continuous improvement cycles
§ a handover plan for training and work tasks when new staff come on board (especially senior management), including ensuring new managers are provided with your implementation plan.
Who will be involved in reviewing implementation monitoring data, and when will it be reviewed and discussed?
For example, this may include:
§ hiring an external coach or training an internal coach to continue to review and provide feedback on implementation quality indicators
§ developing a timeline for reviewing implementation quality (quarterly, annually or aligned with funding cycles).
What needs to be planned for if you decide to expand and scale up?
For example:
§ if/how to expand the program or practice to other teams or sites
§ What teams or sites would benefit most? How would those decisions be made?
§ What learnings from this implementation process will inform and support an expansion/scaling up of the program or practice? How can these
learnings be summarised and communicated?
Implementation action tracker
The Implementation Action Tracker can be used to structure Implementation Team meetings.
The tracker is a table with the following columns: Implementation strategy | Agreed action | Key actions since last update | Key actions prior to next update | Barriers or enablers identified | Key learnings | Feedback (practitioners, leadership) | Proposed solution or change if the program or practice is scaled up and expanded.