TOOL D14 Monitoring and Evaluation: A Framework
The framework has three phases:
1 Pre-implementation (planning)
2 Implementation
3 Post-implementation.
The diagram on the next page outlines the framework, with detailed information provided on
pages 160-170.
[Framework diagram: Pre-implementation (planning, Steps One to Eight); Implementation (Step Nine: implement intervention and gather data; Step Ten: monitor progress); Post-implementation (Step Eleven: analyse data; Step Twelve: report and disseminate results).]
Pre-implementation (planning)
Step One: Confirm objectives/expected outcomes and outputs
Objectives are the key to every successful programme and evaluation. Every evaluation is about
measuring whether the objectives have been achieved. Before starting the evaluation, local areas
must be clear about what the objectives are.
Unless you have a clear idea about what the project is trying to achieve, you cannot
measure whether or not it has been achieved.
The National Indicators of success can guide local areas in establishing intervention outcomes.
See Tool D5 for a list of indicators relevant to obesity.
Performance indicators can use any information, from any source, that shows whether objectives
are being met. Obesity prevalence figures are quantitative PIs – they are a direct measure of the
degree of the problem in your area. Other PIs, such as those that measure parents’ perceptions of
their child’s diet, are qualitative. If an intervention’s objective is to educate parents in the target
clusters about healthy eating, qualitative PIs must be used to measure this.
When you are developing performance indicators, it is important to establish a starting baseline for the intervention against which performance will be measured. Performance indicators are a key part of any monitoring and evaluation framework, as they enable you to measure what has been achieved.
Key points
• Be clear about what you are measuring. Having a clear idea of what you are trying to achieve
will help in selecting the right indicators. Always ensure that the data required are available
and easily collected.
• Think about the context. Performance indicators may need to take account of underlying
trends, or the environment in which the intervention is operating.
• Performance indicators can never be conclusive proof that a project is successful; they can only ever be indicators. This is because external factors, which have not been measured, can have an impact on an intervention without a local area being aware of them. However, well-chosen indicators that come from a wide range of sources and illustrate different aspects of an intervention can provide good evidence of its success.
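To make the baseline idea concrete, the sketch below compares a quantitative PI, such as obesity prevalence, against its starting baseline. All figures are hypothetical and Python is used purely for illustration:

```python
# Hypothetical quantitative PI for an intervention area.
# The baseline is fixed before the intervention starts; performance
# is then measured as movement away from that baseline.

def pi_change(baseline, follow_up):
    """Absolute and percentage change of a quantitative PI against its baseline."""
    change = follow_up - baseline
    pct = 100.0 * change / baseline
    return change, pct

# Illustrative figures: obesity prevalence (%) in the target area.
change, pct = pi_change(baseline=22.0, follow_up=20.5)
print(f"Prevalence change: {change:+.1f} points ({pct:+.1f}% vs baseline)")
```

A qualitative PI (for example, parents' perceptions of their child's diet) would be tracked the same way, but against a baseline survey rather than a prevalence figure.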
Optimisation of management functions.
Interpretative (content analysis): All purpose. Used in operational evaluation (analysis of meetings etc), summative evaluation (analysis of materials or reports) and learning (deconstruction of programme reports). Strengths and weaknesses: deconstruction of 'hidden' meanings and agendas; rich interpretation of phenomena; inherent risk of ideological bias.
Critical (discourse analysis): More theoretical (usually critical theory) based than content analysis. Typically used to assess the structure, coherence and value of large-scale programmes for learning purposes. Strengths and weaknesses: as for interpretative methods, but emphasises the establishment of generalisable laws; perceived to be unscientific, especially by experimentalist practitioners.
Participatory (action research): Typically used in developmental evaluation mode. Strengths and weaknesses: encourages real engagement of the subjects of the intervention; good in highly uncertain contexts; evaluators sometimes get too involved in the intervention itself.
The table below summarises the broad types of interventions used in tackling obesity, and gives
some examples of evaluation questions and evaluation methods that would be associated with a
particular type of intervention.
Key point
Analysis requirements: Bear in mind that the selection of particular methods and techniques
also implies using the appropriate type of data analysis (which has its own resource and skills
implications). In general, large data sets (such as those derived from surveys) normally need
statistical software systems such as SPSS. Interpretative data (derived, for example, from content
analysis) can be analysed with proprietary qualitative software packages such as NVivo. In either case, a clear coding frame for analysing such data is necessary.
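As a minimal illustration of what a coding frame involves, each code is given an explicit, written definition (here a keyword list) and applied systematically to the material. The codes, keywords and excerpt below are hypothetical; in practice this would be done in a package such as NVivo:

```python
# Hypothetical coding frame for interview excerpts about healthy eating.
# Each code maps to the indicator words that trigger it.

coding_frame = {
    "barriers_cost": ["afford", "expensive", "price"],
    "barriers_time": ["time", "busy"],
    "knowledge":     ["five a day", "portion", "label"],
}

def code_excerpt(text, frame):
    """Return the set of codes whose keywords appear in the excerpt."""
    lower = text.lower()
    return {code for code, keywords in frame.items()
            if any(k in lower for k in keywords)}

excerpt = "Fruit is too expensive and I never have time to cook."
print(sorted(code_excerpt(excerpt, coding_frame)))
```

The value of writing the frame down is that two analysts coding the same excerpt should reach the same codes.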
Implementation
Step Nine: Implement intervention and gather data
The following are some important aspects to consider for the implementation step of the
evaluation framework.
• Contingency planning: As with planning an evaluation in general, anticipating adjustments
and changes to data collection is to be encouraged. It is useful to have a ‘plan B’ with
alternative arrangements for data collection should it become apparent that, for example,
time, skills or operational constraints are likely to conspire against planned activities.
• Triangulation: The evaluation should already have been designed with regard to the resource requirements of the choices specified and with the 'insurance' of contingency planning in mind. 'Insurance' also has a methodological component: triangulation. Triangulation means using different methods to cover the evaluation from different angles (for example, assessing the effectiveness of an intervention's organisational structures from the points of view of different actors).
• Operational rules: The evaluation should be able to track (and have a record of): what data
are being collected, who collects the data, and in what form and location the data are stored.
Clear rules about operational procedures should be set out and distributed to all those
involved in data collection and analysis. Similarly, it is useful to draw up ‘evaluation contracts’
with other stakeholders, especially those supplying information. These contracts should
specify the objectives of the evaluation and any guarantees that apply (for example, on
confidentiality).
Step Ten: Monitor progress
An example of monitoring the intervention would be to keep a record of the resources used in running it, eg the number of staff, who the staff are, how many hours they work, and the costs incurred by the intervention.
Once a framework is established, those running the intervention monitor the data and
feed back the relevant information to the partnership.
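The kind of input record suggested above might be sketched as follows. All staff roles, hours and costs are hypothetical:

```python
# A minimal weekly record of the inputs used to run an intervention:
# who delivered it, the hours worked and the costs incurred.
from dataclasses import dataclass

@dataclass
class InputRecord:
    week: int
    staff: list    # who delivered the sessions that week
    hours: float   # total staff hours that week
    cost: float    # costs incurred (salaries, venue, materials)

log = [
    InputRecord(week=1, staff=["dietitian", "assistant"], hours=12.0, cost=480.0),
    InputRecord(week=2, staff=["dietitian"], hours=8.0, cost=320.0),
]

total_hours = sum(r.hours for r in log)
total_cost = sum(r.cost for r in log)
print(f"{total_hours} staff hours, £{total_cost:.2f} to date")
```

Keeping the record in a structured form means the totals feed straight into the cost-effectiveness calculation at Step Eleven.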
Post-implementation
Step Eleven: Analyse data
Before analysing data, local areas need to ask the following questions:
• Are the data in the right format to apply to the performance indicators?
• Are there in-house facilities for analysing the data or do they need to be bought in?
• What methods of analysis are there?
Key point
Once the intervention has been implemented and data collected for evaluation, local areas
should:
• compare outcome data with the baseline
• calculate the costs of the intervention, including any inputs monitored during the intervention
• calculate the cost-effectiveness of the intervention
• examine comparable areas
• examine trends in the wider area and any similar comparison area to assess the impact of the intervention.
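The calculations listed above can be sketched together. One simple way to combine the baseline comparison with the comparable area identified at Step Eight is a difference-in-differences style adjustment; the approach and all figures below are illustrative assumptions, not a method prescribed by the toolkit:

```python
# Post-implementation analysis sketch, with hypothetical figures:
# compare outcomes with the baseline, subtract the trend seen in a
# comparable area, then divide total cost by the adjusted effect.

def adjusted_effect(baseline, follow_up, comp_baseline, comp_follow_up):
    """Change in the intervention area minus the change in the
    comparison area (a simple difference-in-differences)."""
    return (follow_up - baseline) - (comp_follow_up - comp_baseline)

effect = adjusted_effect(baseline=22.0, follow_up=20.5,              # intervention area
                         comp_baseline=21.8, comp_follow_up=21.6)    # comparison area
total_cost = 50_000.0  # all monitored inputs, in pounds

# Cost per percentage-point reduction in prevalence:
cost_per_point = total_cost / abs(effect)
print(f"Adjusted effect: {effect:+.1f} points; £{cost_per_point:,.0f} per point")
```

Subtracting the comparison-area change guards against crediting the intervention with a reduction that is really part of a wider trend.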
Step Twelve: Report and disseminate results
More generally, it is important to the reputation, value and impact of the evaluation to give final, formal feedback to everybody who has contributed in some way to the evaluation (for example, by sending them a copy of the report or inviting them to a final feedback event).
Dissemination should not be restricted to the circulation of a final report – especially in the case of
developmental process evaluation. Different stakeholders may require different communication
approaches. These might include:
• short summaries of the evaluation, tailored to different audiences
• journal articles for other researchers
• topical articles in the ‘trade’ press
• workshops for specific audiences
• feedback seminars for key decision makers.
The results from the evaluation should always be fed back into the future planning of
interventions.
Checklist: for each question, record Yes or No and note any action required.
Pre-implementation
Step One: Confirm objectives/expected outcomes and outputs
Have SMART objectives been developed to show what the intervention is
trying to achieve?
Are outcomes in place to show what the final achievement of the
intervention will be? (This should relate to the overall aim.)
Step Two: Establish outputs for the intervention
Have outputs been established to show what tasks are being carried out to
achieve the outcomes (eg establishing a baseline, producing quarterly
reports)?
Step Three: Establish performance indicators and starting baseline
Have performance indicators been established, taking into account data
availability, surrounding environment and underlying trends of local area?
Has a starting baseline been established?
Step Four: Identify data to be collected
Has the source of data been identified to calculate the performance
indicators?
Do the data need to be collected?
Have the data been checked for accuracy and reliability?
Is extra work required to format the data for analysis?
Step Five: Identify methods of gathering data
Have the methods of data collection been agreed?
Have appropriate analytical methods been agreed?
Have statistical specialists been employed to complete the analysis?
Step Six: Formulate a timetable for implementation
Has an implementation timetable been formulated to ensure the
intervention runs and finishes on time?
Have milestones for key activities of the intervention been established?
Have milestones for regular review of the inputs and outputs been
established?
Step Seven: Estimate the costs of planned inputs
Have the input costs been estimated, to enable the analysis of cost-
effectiveness of the intervention?
Step Eight (Optional): Identify a comparable area
Has a comparable area been identified to ensure any changes are a result
of the intervention?
Implementation
Step Nine: Implement intervention and gather data
Has a contingency plan been organised?
Have operational rules been written and sent to all partners?
Step Ten: Monitor progress
Are the inputs being monitored?
Are the output and outcome data being monitored?
Are the key milestones being monitored?
Post-implementation
Step Eleven: Analyse data
Have the outcome data been compared with the baseline?
Has the cost-effectiveness of the intervention been calculated?
Have the costs of the intervention, including any inputs monitored during
the intervention, been calculated?
Has the comparable area been examined?
Have the trends in the wider area and any similar comparison area been
examined, to assess the impact of the intervention?
Step Twelve: Report and disseminate results
Have the results been disseminated to stakeholders in an appropriate
form?
Have the results been fed back into the future planning of interventions?
Glossary
Aim: A simple statement that sets out the purpose of the intervention.
Baseline: The situation at the start of an intervention, before any preventive work has been carried out. The information that helps to define the nature and extent of the problem.
Evaluation: The process of assessing, at a particular point in time, whether or not particular interventions are achieving or have achieved their objectives. Evaluation is about measuring the outcomes of a particular intervention. An outcome is the overall result of an intervention. Evaluation can also be used to measure whether the processes used in an intervention are working properly. This is called process evaluation, and it measures the inputs and outputs of an intervention.
Input: The inputs to an intervention are the resources used to carry out the work. Resources can be financial, material or human.
Milestones: Key points during the life of an intervention. They are decided at the planning stage and can be time-based or event-based.
Monitoring: The process of continually assessing whether or not particular interventions are achieving or have achieved their objectives. Monitoring is also used to check whether the processes being used are working effectively. Monitoring is carried out throughout the life of an intervention, while evaluation is only carried out at specific points in time.
Objective: A statement that describes something you want to achieve – a desired outcome of an intervention or an evaluation study.
Outcome: The outcome of an intervention is the overall result of applying the inputs and achieving the outputs.
Output: A piece of work produced for an intervention. An output is not necessarily the final purpose of an intervention. Outputs are usually things that need to be done in order to produce the desired result. During the life of an intervention, outputs are monitored to make sure they are being achieved on time and with the resources available.
Performance indicator (PI): The means by which you know whether or not you have achieved your targets and objectives. A PI is any information that indicates whether a particular objective has been met. You can also use PIs that measure whether the inputs and outputs in an intervention are working. For example, if a project is using public meetings as one of its inputs, a PI could be used to measure the number of meetings held and the number of people who attend each meeting. These kinds of PIs are called process PIs.
Process evaluation: Process evaluation measures the inputs and outputs of a project.
Programme: A programme is a group or collection of interventions designed to achieve particular objectives. The interventions in a programme are usually linked to a particular problem or a particular area and fall under a common aim.
Qualitative PI: PIs that measure qualities, which are usually intangible things, such as the perceptions and feelings of individuals and groups.
Quantitative PI: PIs that measure tangible things, such as the number of obese children in an area.