Practical Monitoring and Evaluation: A Guide For Voluntary Organisations
4 Coldbath Square
London EC1R 5HL
+44 (0) 20 7713 5722
+44 (0) 20 7713 5692
[email protected]
www.ces-vol.org.uk
helping voluntary organisations to be more effective
Since 1990, Charities Evaluation Services
(CES) has worked with a wide variety of
voluntary organisations and their funders.
Our aim is to promote accessible monitoring
and evaluation practice, which organisations
can carry out within the resources available
to them. CES provides training, advice
and technical help, and also carries out
independent evaluations as part of its
commitment to strengthening and improving
the effectiveness of the voluntary sector.
CES produces a range of publications
including PQASSO, the practical quality
assurance system for small organisations.
Copyright
Unless otherwise indicated no part of this publication
may be stored in a retrieval system or reproduced in
any form whatsoever without prior written permission
from Charities Evaluation Services. Charities Evaluation
Services will give sympathetic consideration to requests
from small organisations for permission to reproduce
this publication in whole or in part but terms upon
which such reproduction may be permitted will remain
at Charities Evaluation Services' discretion.
© Charities Evaluation Services, 2002
Third edition, 2008
ISBN 978-0-9558849-00
Published by Charities Evaluation Services
Designed by Alexander Boxill
New edition artwork by Artloud Ltd
Edited by Wordworks, London W4 4DB
Printed by Lithosphere Print Production
acknowledgements
Readers
For reading and commenting on the first edition text
Jenny Field, Bridge House Estates Trust Fund
Ciarán McKinney, Streetwise Youth
Tom Owen, Help the Aged
Georgie Parry-Crooke, University of North London
Professor Helen Simons, University of Southampton
Chris Spragg, NCH Action for Children
James Wragg, Esmée Fairbairn Foundation
Funders
We are very grateful to the Calouste Gulbenkian
Foundation and the Wates Foundation for the
grants they provided for the first edition of this
publication.
foreword
A central part of the mission of NCVO is to help
voluntary organisations to achieve the highest
standards of practice and effectiveness. We know
from our work with organisations across the
country that the vast majority are firmly committed
to improving their work. However, while the
motivation is undoubtedly there, many
organisations lack the experience and practical
know-how to assess the effectiveness of their work
without help or guidance.
Charities Evaluation Services has an impressive track
record in developing clear and accessible material
for voluntary organisations. This stems from their
long experience of working closely with
organisations of all shapes and sizes.
CES' Practical Monitoring and Evaluation was
first published in 2002, providing much-needed
guidance on evaluation for voluntary organisations
both large and small. The guide is now in its third
edition, with a revised further reading list, and text
additions which acknowledge the recent shifts in
the policy and funding context and developments
in monitoring and evaluation software and other
resources.
This guide will be invaluable to any voluntary
organisation that is serious about measuring the
effectiveness of its work. It has been carefully
designed to be of relevance to organisations
relatively new to this sort of approach, as well as
providing more demanding material for those who
need something more sophisticated.
I would encourage any voluntary organisation that
is committed to improving its work to use this
guide. The sector is under increasing scrutiny from the public, the government and funders, particularly in our role in delivering public services.
There has never been a more important time for
the sector to demonstrate its effectiveness. I am
sure that this guide will continue to play an
important part in helping us do this.
Stuart Etherington
Chief Executive, National Council for Voluntary
Organisations
this guide
[Cover: Planning, Monitoring, Evaluation, a resource for PQASSO users]
Terminology
There is a comprehensive glossary in the
Practical toolkit. But here are some basic
definitions of terms used in this guide.
Project
For simplicity, the guide uses the term
project to imply a fairly limited set of
activities or services with a common
management and overall aim, whether
carried out independently or as part of a
larger organisation. However, this guide is
relevant to, and can be used by, the full
range of voluntary organisations, including
large organisations which may provide a
complex range of services at a number of
sites. The principles and practice also apply
to the evaluation of programmes, in which
a number of projects are funded within
the framework of a common overall aim.
Participant
This means any person, group or
organisation taking part in project activities,
where 'user' is not an appropriate term as no services are offered. The term 'participant' is also used for those who take part in evaluation activities.
Respondent/informant
A respondent is someone who provides
information directly, usually by answering
questions asked by an evaluator. Informants
might provide information both directly
and indirectly, for example through
being observed.
Information/data
Information is collected during monitoring
and evaluation. This becomes data when
it is gathered for a specific purpose and is
linked to specific evaluation questions.
Staff/volunteers
Many voluntary organisations have no paid
staff, and depend on volunteers to deliver
their services. This is recognised in the
guide, which often refers to both staff and
volunteers. However, for brevity, the term
'staff' is also used as an umbrella term to
include everyone working for a project,
whether fully employed, sessional staff,
consultants or volunteers.
User
This term is used to mean any person,
group or organisation that may use an
organisation's services, either directly or
indirectly. This can include clients, casual
callers, referral agencies and researchers.
contents

Introduction 11
Section 1 Planning 13
Basic planning 15
Further planning 27
Section 2 Monitoring 33
Basic monitoring 35
Further monitoring 45
Section 3 Evaluation 49
Basic evaluation 51
Further evaluation 65
Section 4 Utilisation 79
Basic utilisation 81
Further utilisation 84
introduction
Evaluation and the voluntary sector
Practical monitoring and evaluation is a
comprehensive introduction to monitoring
and evaluation for voluntary organisations
and their funders. It is intended to support
organisations which carry out their own
evaluation activities, as well as those that
use external consultants to help develop
their systems or to carry out an external
evaluation. The need for this guide has
become increasingly pressing. The voluntary
sector is complex and diverse, and is
affected by changing demands from funders,
from those in whose interests it works, and
from the general public.
A significant amount of voluntary and community grant income comes from government sources, both national and local. Some major voluntary organisations receive large sums from central and local government. Local authority funding is now increasingly available through contracts for service delivery, and more and more local community sector organisations are funded from local budgets. The sector is also highly dependent on grant-aid from charitable foundations, particularly for new initiatives, as well as on public donations, and is sensitive to any reduction in available funding and greater competition for dwindling funds.

This funding makes organisations more formally accountable. Voluntary organisations are more likely to have to prove that their performance is of a high quality and to adopt more business-like management. The general public, funders and grant givers, and service users themselves, expect to see evidence that projects are making a difference and that they provide value for money.
Monitoring
All organisations keep records and notes,
and discuss what they are doing. This simple
checking becomes monitoring when
information is collected routinely and
systematically against a plan. The information
might be about activities or services, your
users, or about outside factors affecting your
organisation or project.
Monitoring information is collected at specific
times: daily, monthly or quarterly. At some
point you need to bring this information
together so that it can answer questions such
as:
How well are we doing?
Approaches to monitoring
and evaluation
The following questions are key to making basic
decisions about monitoring and evaluation:
Why are you doing it?
Who is it for?
What are the key issues or questions
you wish to address?
When will you do it?
How will you do it?
Who will carry it out?
How will the information be managed and
analysed?
Evaluation
Evaluation aims to answer agreed questions
and to make a judgement against specific
criteria. As with other research, a good evaluation requires data to be collected and analysed systematically, and interpreted carefully. Assessing value or the
worth of something and then taking action
makes evaluation distinctive. The results of
an evaluation are intended to be used.
Although monitoring and evaluation are
different, certain types of evaluation may
involve a lot of monitoring activity. It can
be difficult to carry out an evaluation
unless monitoring data has already been
collected. The more resources you put into monitoring, the easier evaluation will be.
Why evaluate?
The approach we take in this guide is that
monitoring and evaluation not only measure
how well you are doing, but also help you
to be more effective.
Funders and others who sponsor an
evaluation will want to know whether a
project has spent its money appropriately,
or whether it provides value for money.
There is considerable pressure from funders
to prove success. Many projects have to
respond to this demand in order to survive.
But there is a danger that an evaluation
that is unable to prove success, or defines
success unacceptably, will be rejected, and
important learning from the evaluation will
be lost. Evaluation can help you to manage
and develop your work and this is a valid
purpose in its own right.
Stakeholders may include:
users
members
policy-makers and decision-makers
other agencies and project partners.

How you involve stakeholders in the evaluation will affect how the evaluation findings are accepted and, in turn, how they are used. People are more likely to respond and make changes if they have been involved.
[Diagram: a monitoring and evaluation system assesses the project against a varied range of agreed evaluation questions; a quality assurance system assesses the project against its own expectations and standards; other information is needed alongside both.]
Self-evaluation

[Diagram: the self-evaluation cycle - needs assessment; plan; set performance indicators to measure your progress; deliver work programme; monitor; evaluate; utilise; review project and implement recommendations.]
section 1 planning
Basic planning 15
Further planning 27
basic planning
Basic planning describes the foundation blocks
on which monitoring and evaluation are built.
This section tells you how to make clear
statements about what your organisation is
doing, with whom, and why. It shows you how
to word these statements so that your
activities, and the resulting changes, can be
measured and reported on. It looks at a simple
year plan and sets out the points you must
remember when planning a monitoring and
evaluation system.
Understanding and clarifying your project
Expressing values
Mission statements usually include some
core values that describe an organisation's
approach when carrying out its activities,
for example, the way you work with service
users. Values may also be expressed in a
separate statement.
Organisations often use words such as
these to express the way they work: flexible,
respectful, caring, empowering, providing
equal opportunity.
Various expressions are used to describe the
basis of relationships in the organisation:
involving users in service planning, working in
partnership, collaborative.
People inside an organisation often make
assumptions about organisational values,
sometimes wrongly. Be sure to discuss values
openly, so that everyone can agree them. You
can then incorporate the statement into your
promotional material and any other
descriptions of your project, for example
materials prepared for funders and other
agencies.
Values that have been agreed and clearly
stated can be integrated into everyday work,
and monitored to show how successfully you
are working within them. Funders sometimes
have a statement about their own values, and
they may expect you to demonstrate that you
are working towards them. This may be a
problem if no-one decides how these values
might affect the way the project works.
Needs assessment
The starting point for any project is to
understand the needs of the group it plans to
work with, whether they are potential users,
partner agencies or those it wants to
influence.
A needs assessment is a form of evaluation
activity. It will identify the existing problems
in the community, how bad these are, the
services available and the needs that are not
being met. Look for local data and any
relevant national statistics and reports, and
talk to potential users and other
stakeholders to get information on:
the physical and economic characteristics
of the project's environment
the nature of the target population and
where they are
existing community groups and projects,
current service provision and gaps in
provision, and whether partnerships are
possible
problems and possible solutions.
Aims and objectives often begin with verbs such as: to increase, to reduce, to expand, to organise, to conduct, to provide, to enable, to develop, to improve, to distribute, to produce, to set up.

Example: the Community Theatre Project

Overall aim
To change young people's attitudes about social issues through drama

Specific aims
To enable young people to express themselves through dance and drama
To enable teachers and community workers to use drama in teaching and play activities

Objectives
To run dance and theatre workshops for young people
To hold theatre skills courses for teachers and community workers
To work with schools on integrating drama into the school curriculum
To perform plays in schools and community venues in co-operation with other voluntary organisations
[Diagram: why we do it - the overall aim leads to impact, and specific aims lead to outcomes; what we do - objectives lead to outputs, resourced by inputs.]
Outputs
The outputs of the Community Theatre Project are the work generated by the project, and relate to the project's objectives. The following are examples: weekend workshops; one- and two-day workshops in schools; after-school courses; holiday schemes; productions; introductory talks; day workshops; week-long courses.
Outcomes
The outcomes of the Community Theatre Project
are the changes or benefits that take place as a
result of project activities. These relate to the
specific aims. It is important to assess only factors
which the project can reasonably control or
make happen.
[Table: aims and corresponding outcomes for the Community Theatre Project]
Impacts
Impacts are the broader and longer-term changes
relating to your overall aim, or mission. The
Community Theatre Project would have to
demonstrate that it is meeting the overall aim
of changing the lives of children through
the use of drama.
[Diagram: inputs, outputs, outcomes, impacts]
Performance indicators
There is one more key step in laying the
foundation for monitoring and evaluating
your project. To check how your
organisation is performing, you will need
to set performance indicators. They are
the criteria or clues on which to base
judgements about the progress and success
of the project.
Performance indicators:
let stakeholders know what they can
expect from the organisation
provide a focus for managers
help you to focus on what you need to
monitor and evaluate
help a project to judge its achievement
and effectiveness
help comparison between projects.
Who defines the performance indicators
for your project is important, as different
stakeholders will have different views about
what counts as success. If you involve them
you will have a wider and richer perspective.
Brainstorming can be a useful way to find
out how different people understand success
or failure. It may be helpful to look at
indicators that similar projects are using.
Funders and commissioners may want to
have performance indicators that are easily
measurable or that link to their own
strategies. This could mean measuring things
that are not at the heart of the project. So
discuss your key activities and intended
benefits with funders early on. Evaluation
priorities should not take over project
activities and divert you from your planned
priorities.
You may want to involve users in developing
indicators. Users have a unique and valuable
perspective on success. But be clear
yourselves, and with users in advance, what
weight will be given to their views when final
decisions are made about indicators.
Types of indicator
There are a number of different types of
indicator. The most common are output and
outcome indicators. These are often confused
with impact indicators, but there is a difference.
Output indicators: these demonstrate the work the organisation does and show progress towards meeting objectives.

Outcome indicators: these demonstrate changes which take place as a result of the organisation's work, and show progress towards meeting specific aims.

Impact indicators: these demonstrate broader, longer-term change, often relating to the overall aim or mission of the organisation.
Output indicators
Output indicators can be set for:
quantity (how many services or products)
take-up (used by how many people)
accessibility (used by what type of people)
quality (including user satisfaction)
cost.
Example output indicators for the Community Theatre Project include: meetings with agencies; publicity leaflets; performances; weekend workshops for young people; curriculum discussions.
Outcome indicators
Outcomes and their indicators are often
confused, but are quite distinct from one
another. Agree your expected outcomes by
thinking about what success would look like for
each aim, that is, what change would happen.
Develop indicators by agreeing what would
demonstrate that this change had taken place.
[Table: an example outcome and its indicators for the Community Theatre Project]
Setting targets
Often it will be useful to compare
monitoring data against targets that you
have set in advance. Targets are expected
levels of achievement in relation to inputs,
outputs, outcomes and impacts against
which you can measure performance.
Indicators on their own have no specific
value. For example, if an output indicator
for the Community Theatre Project is
number of performances, you do not know
whether the project would be successful
if it gave five performances or 50. To help
define success, you may need to qualify
the indicators by setting targets.
Targets specify the level and quality of
activities and achievements that you
expect or hope for. They may be needed
for funding applications, but there are a
number of reasons for setting targets. They
can help with operational and individual
work planning, help the project to improve
performance, and help to demonstrate and
improve value for money. They will provide
a benchmark or standard to monitor
and evaluate performance or progress
against, and to help document progress
towards achieving aims and objectives.
At the same time, projects need to be wary
of allowing targets to drive activity in a
way that might limit the time given to other
areas of work or less tangible achievements.
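As a minimal sketch of how such a comparison might be kept up to date, the Python snippet below checks monitored counts against targets set in advance; all the names and figures in it are hypothetical.

# Compare monitored output counts against pre-set targets.
# Targets and actuals here are hypothetical illustrations.
targets = {"advice sessions": 120, "performances": 40, "new users": 300}
actuals = {"advice sessions": 134, "performances": 28, "new users": 310}

for name, target in targets.items():
    actual = actuals.get(name, 0)  # treat a missing count as zero
    percent = 100 * actual / target
    flag = "" if actual >= target else " <- below target"
    print(f"{name}: {actual} of {target} ({percent:.0f}%){flag}")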
Targets are often set for outputs. The
following examples demonstrate targets
for a number of different projects.
Output indicators (examples)
Community development agency
Drop-in service
Training project: level of satisfaction with courses

Outcome indicators (examples)
Employment training programme: percentage of trainees getting employment
Alcohol advisory project: amount of weekly units drunk
Young people's counselling project: numbers of young people remaining in the family home
Milestone targets for a new project might include:
trustees appointed
workers in post
promotional activity underway
partnership agreements in place
conference held
first trainees enrolled or completing training.
Many organisations will have an
agreement with their funder to give a
certain level of service, expressed as
targets (see above). High targets can help
to provide vision and ambition, and to
obtain funding. Low targets might be easier to achieve, but may undersell the project.
In setting targets, take account of:
your resources
what you have achieved before
needs assessments

Developing a year plan

[Chart: an example year plan, April to September]
Further reading
Davey, S et al (2008) Using IT to Improve your Monitoring and Evaluation, Performance Hub, Charities Evaluation Services, London
further planning
Linking evaluation to project planning
Your first planning stage is a good needs
assessment, which will make clear
recommendations based on the needs,
problems and solutions it has identified. A
needs assessment can also provide baseline
data, against which you can measure
changes during the lifetime of your project.
[Table: a logical framework. Rows: broad aim; project purpose; outputs; activities/inputs. Columns: indicators; means of verification; assumptions and critical factors. Adapted from: Gosling, L and Edwards, M (2003) Toolkits: A Practical Guide to Monitoring, Evaluation and Impact Assessment, Save the Children, London.]

Planning for monitoring and evaluation
Level 1: evaluation strategy - evaluation priorities over a period of time
Level 2: evaluation planning - making evaluation operational
Level 3: evaluation design - approaches and methods
Example evaluation strategy, by year and type of evaluation

Year 1: needs assessment (student researcher); baseline data (student researcher); end of year report (internal)
Year 2: implementation (process) report (internal)
Year 3: internal
Year 4: internal, plus external evaluator
Year 5: outcome evaluation
Designing an evaluation
Your evaluation strategy will describe the
evaluation activities needed at different points
in your project management cycle and over
the lifetime of the project. You will then need
to begin to implement your monitoring and
evaluation strategy by planning activities over
a shorter time. Each separate evaluation will
also need designing. This will focus on the
approaches and methods to be used, decide
who the respondents will be and set out a
detailed timetable for activities.
Design matrix
It is helpful to summarise the design of your
evaluation activities in a design matrix. This
usually includes the following elements:
key evaluation questions
indicators the things you will assess
to help you to make judgements
data collection methods
data sources
data collection timetable.
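If the matrix is kept electronically, it can be held as simple structured data. The Python sketch below is one possible representation; the question, indicator, method, source and timetable entries are hypothetical examples, not prescribed fields.

# One row of a design matrix held as structured data (hypothetical entries).
design_matrix = [
    {
        "key_question": "Are we reaching the right target group?",
        "indicators": ["profile of users against target population"],
        "methods": ["analysis of registration records"],
        "sources": ["case records"],
        "timetable": "April/May",
    },
]

# Print a one-line summary of each row.
for row in design_matrix:
    print(row["key_question"], "->", "; ".join(row["methods"]), f"({row['timetable']})")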
Example design matrix extract: St Bernard's project

Aim: to enable older people living in St Bernard's accommodation to make informed choices about their housing, social, health and care needs.

Key question: to what extent did older people acquire benefits not previously received and use health and care services not previously received?

Indicators: number of older people with Housing Benefit; total amount of Housing Benefit received; numbers receiving health care; types of services received; numbers receiving care packages; types of care packages.

Objective: to provide information and advice to older people in St Bernard's accommodation on available housing benefits, personal care services, health services and life skills training.

Indicators: number of older people who have had needs assessments; level of user satisfaction; number of people contacting statutory and independent agencies; extent and type of working relationships with other agencies.

Data sources: users; staff; other agency personnel; case records; needs assessments; service participation monitoring; case reviews; meeting minutes.

Data collection methods and timetable: analysis of case records (April/May); financial analysis of records (April/May); interviews (June; May/June); questionnaire (May/June, analysis early July); desk research (April/May/June); user feedback forms (continuous).
[Diagram: needs assessment feeds project strategic plans and the evaluation strategy, which in turn shape project operational plans and evaluation plans.]
Further reading
Cracknell, BE (2000) Evaluating Development Aid: Issues, Problems and Solutions, SAGE, London
Robson, C (2000) Small-scale Evaluation: Principles and Practice, SAGE, London
section 2 monitoring
Basic monitoring 35
Outcome monitoring 41
Further monitoring 45
Process monitoring 45
Impact monitoring 47
Data management 48
basic monitoring
Basic monitoring discusses how to set up
your record keeping. It looks at the different
elements of your project that you might
monitor. These include inputs (the resources
you put into your project), and your outputs,
including the scale and pattern of use, and
financial monitoring. It discusses using an
outcomes approach and developing outcome
monitoring systems. It looks at good practice
in monitoring user profile information,
introduces feedback from users, and suggests
a framework for analysing and reporting
your monitoring information.
Setting up your record keeping
Monitoring is the routine and systematic
collection of information so that you can
check regularly on your progress. Beware
of collecting information randomly without
thinking through what information you
really need. Planning is vital in helping you
to monitor in a focused and systematic way.
It is not realistic to collect information for
all the possible performance indicators you
identified. This is why it is important to
discuss with your stakeholders, particularly
trustees, users and funders, the key
indicators for which you are going to collect
information.
Monitoring should give you enough
information to tell you what is working,
identify problems, tell you who is using
your services and how, and help you plan
to meet changing needs. It also means you
can provide relevant information to other
agencies, develop useful publicity, and tell
funders and other stakeholders about
progress. Write monitoring tasks into job
descriptions and planning cycles to make
sure that responsibilities are spelt out and
that everyone is clear about the importance
of monitoring.
Setting up record keeping, whether this is
paper based, computerised, or both, is a key
part of project start-up activities and should be planned carefully.
Further reading
Davey, S et al (2008) Using IT to Improve your Monitoring and Evaluation, Performance Hub, Charities Evaluation Services, London
Monitoring inputs
Think about the resources you put into your
services or activities. These include staff
and volunteers, management and training,
and financial resources. Don't forget the
contribution made by partner agencies.
You may want to monitor against a number
of input indicators. For example:
the amount of management or
administration time spent on certain
activities
the level and use of staff and volunteer
resources
staff and volunteer profile and skills
Monitoring outputs
When you set output indicators, focus
these on the information that will be
useful in planning activities and services.
A pressure group might want to monitor
its level of media coverage, whereas a
service provider might want to monitor
the scale and pattern of use. This could be:
the level of use at each session,
for example, at a luncheon club
the total level of service over a given
period of time, for example, the number
of advice sessions each month
the total number of users, number of
new and repeat users, and number of
users who have left the service, over
a given period of time
frequency or extent of use by individual
people
use of premises, transport or catering.
Remember that output information will be
useful for explaining to others what you do,
what is involved in providing your service,
and what resources you need. It will also be
important in helping to understand how and
why you have achieved your outcomes.
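As one illustration of how the scale and pattern of use might be tallied from simple session records, here is a minimal Python sketch; the records and field names are hypothetical.

# Tally total attendance by month from session records (hypothetical data).
from collections import Counter

sessions = [
    {"month": "April", "attendance": 14},
    {"month": "April", "attendance": 9},
    {"month": "May", "attendance": 17},
]

use_by_month = Counter()
for session in sessions:
    use_by_month[session["month"]] += session["attendance"]

for month, total in sorted(use_by_month.items()):
    print(month, total)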
The following categories are suggested
by the Equality and Human Rights
Commission. In Scotland it will be
helpful to use questions similar to the
Scottish census questions. The Equality
Commission for Northern Ireland
recommends particular ethnic
categories in its code of practice.
White:
British
Irish
any other white background
(please write in)
Mixed:
white and black Caribbean
white and black African
white and Asian
any other mixed background
(please write in)
Financial monitoring
Financial monitoring has three main
purposes:
accountability
control
evaluation.
These relate to the different needs of a wide
range of stakeholders to know what is going
on in your organisation or project. Financial
monitoring systems should be built in at
the beginning of a project. This allows
appropriate information to be collated as
an integral part of the project, and allows
reports to be produced at regular intervals.
Integrating financial information needs
Use a coding system to show the different
types of income and expenditure. This allows
summary information to be broken down
into categories, for example, to highlight
income and expenditure for a particular
project or fundraising activity. At the same
time it will allow income and expenditure
to be categorised for the statutory accounts.
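As a minimal sketch of such a coding system, the Python snippet below totals transactions by code; the codes, descriptions and amounts are hypothetical, and the GR/DN/EX prefixes are assumptions made for the example.

# Sum income and expenditure by code (hypothetical codes and figures).
# In this sketch income codes start with GR (grants) or DN (donations),
# and expenditure codes start with EX.
transactions = [
    ("GR01", "grant for youth work", 5000),
    ("DN01", "public donations", 750),
    ("EX10", "venue hire for youth work", -300),
    ("EX20", "fundraising event costs", -450),
]

totals = {}
for code, description, amount in transactions:
    totals[code] = totals.get(code, 0) + amount

for code, total in sorted(totals.items()):
    print(code, total)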
External accountability
By law, your project will have to keep
accurate accounting records of income,
expenditure and assets. As well as the
legal requirements, a range of external
stakeholders such as funders, regulatory
bodies and users will have an interest in
particular aspects of the projects finances.
For example:
a funder giving money for a particular
purpose will expect to see financial
reports showing that the money has
been spent effectively according to
their wishes
a regulatory body, such as the Inland
Revenue, will expect to see a financial
monitoring system that allows them to
check that you have kept to the tax laws
users may be interested in various
aspects of how effectively you use your
resources.
Control
Trustees, staff and volunteers need financial
information to make sure that legal
responsibilities are fulfilled, and to
understand what is going on in the project.
Regular reporting of progress against an
agreed annual plan and budget is a key
means of control. However, a budget is
only an estimate, and what actually happens
may not be what was planned.
You will normally report monthly for staff on
actual income and expenditure compared
with budgeted income and expenditure.
Trustees might see reports less often, say
quarterly, to fit in with the trustee meeting
schedule. Major differences between
actual and budgeted expenditure should
be highlighted and explained to make sure
that the project does not spend more than
it can afford. It is also helpful for staff to
produce a forecast of what the year-end
position is likely to be. This can be done
quarterly for trustees and lets them take
action, if appropriate, to make sure that
an acceptable year-end position is reached.
Monthly cash flow projections are also
useful.
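A simple variance report can highlight these differences automatically. The Python sketch below uses hypothetical budget and actual figures, and the 10 per cent threshold for highlighting a variance is an assumption for illustration, not a rule.

# Flag budget lines where actual spending differs from budget by more
# than 10 per cent (hypothetical figures).
budget = {"salaries": 4000, "premises": 900, "travel": 150}
actual = {"salaries": 4000, "premises": 1150, "travel": 90}

for line, planned in budget.items():
    spent = actual.get(line, 0)
    variance = spent - planned
    note = " <- explain to trustees" if abs(variance) > 0.1 * planned else ""
    print(f"{line}: budget {planned}, actual {spent}, variance {variance}{note}")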
Evaluation
Reviewing income and expenditure for
particular activities, and monitoring trends
over time, can help to evaluate the financial
implications of the different activities of
the project. This is important for planning
service delivery and fundraising. For
example:
the cash flow will show whether there
are enough funds at all times both to
implement and run the project
evaluating the cost of providing different
levels of service, compared with income
received, will help with decision-making
when planning services
the relative cost-effectiveness of
different fundraising methods can be
compared, and this can be used to plan
how resources should be allocated to
future fundraising.
Outcome monitoring
Focusing on outcomes
Outcomes are the changes, benefits, learning or
other effects that happen as a result of your
activities.
Desired outcomes often describe positive
change. They can also be:
about maintenance of the current situation.
This involves ensuring things stay the same
and don't get worse.
about reduction in something, for
example, criminal behaviour or vandalism.
Outcomes can be:
welcome or unwelcome
expected or unexpected.
You may be able to anticipate most of your
outcomes. However, some things may happen
that you did not plan for.
Analysing and reporting monitoring data
Regular analysis and reporting of your monitoring data can prompt corrective action before it is too late.
Keep your life simple by collating information
regularly, for example, weekly or monthly, to
avoid too much work at one time. Design
simple reporting forms so that you can easily
demonstrate progress against targets. Your
basic point of reference will be the year plan.
If you have set out milestones, or key events
or achievements for the year, it will be
useful to report progress against these.
For example, you may have set out a
timetable for sending in a major funding
application, moving premises or holding
a conference.
From the performance indicators you
identified for your outputs, select key
indicators to report on, for example, number
of sessions, meetings, publications, profile of
users or other target group. Set these out in
a way that makes it easy to compare against
the targets you set for your key indicators.
Some of these may be performance levels or
standards agreed with your funder.
You may report on satisfaction feedback data
during your quarterly reporting. Some
funders also ask for some basic outcome
monitoring data every quarter, such as
numbers of people housed.
Further reading
Parkinson, D and Wadia, A (2008) Keeping on Track: A Guide to Setting and Using Indicators, Performance Hub, Charities Evaluation Services, London
Monitoring your
monitoring
Monitoring arrangements themselves need
to be monitored. Make sure they give you
the information you need and are not too
time consuming:
Keep a note of any problems as they
arise and the main ways monitoring
has been helpful.
From time to time do a more systematic
review. To start with, this should be
every six months. Later, it could be once
a year.
Make sure that you are not collecting
information that you do not use.
This section has concentrated on how
to set up basic monitoring systems on
different aspects of your project's
activities, including monitoring against
targets. It has discussed the monitoring
implications of focusing on project and
organisational outcomes. It has also
looked at the need for a framework to
analyse and report monitoring
information. You now need to think
about how monitoring links to
evaluation activities.
further monitoring
Basic monitoring examined how to set up
basic systems to monitor project activities
and outcomes and gather data about users. It
also looked at how to report monitoring
information within an analytical framework.
Further monitoring looks at more complex
monitoring. This includes monitoring project
impacts, and how to monitor against a
quality system. It also looks at collecting
routine data about factors within your
project, or in the outside world, that may
affect how well the project is doing.
Process monitoring
To understand why your project is achieving
the changes it hopes to bring about, you need
to identify the factors that really matter in
the way you do things, that is, the project's
processes. This may include looking at the
way you deliver certain outputs, such as client
assessments. But there will be other things you
want information about, such as the way staff
and volunteers work together, recruitment
processes, public relations and relationships
with other agencies. Process monitoring is also
important to assess the quality of what you do.
It can be helpful to set specific indicators for
processes. Examples might be:
the extent of involvement of group members
the amount of team working
the extent and type of meetings with
other agencies
the level and quality of contact with
the media.
Also allow for more wide-ranging and
unexpected data to be collected, for example,
through records, minutes and diaries.
Monitoring for quality

[Diagram: the quality cycle - standard development, monitoring, quality review, quality improvements. Adapted from: Charities Evaluation Services (2000) PQASSO (Practical Quality Assurance System for Small Organisations), Second edition, London.]
Impact monitoring
Impact monitoring examines the effect of
your project over a longer timeframe and in
a wider context. It asks about the broader,
cumulative consequences of your work.
You will need to take measurements over
several years and use outside information to
assess the effect of your project from a
longer-term perspective. Examples of
statistical information relevant to a local
project might be:
the number of school exclusions
the number of arrests in a particular
neighbourhood
Data management
It is important to resolve practical data
management difficulties, as these can hold
back how effectively you can use monitoring
information to improve your work and the
quality of your reports to trustees and
funders. It may be helpful to involve an
external consultant in identifying internal
information needs as well as external
requirements. Research has shown that many
organisations struggle with IT resources that
are inadequate to meet reporting
requirements; this lack may be in IT
infrastructure and systems and in staff
training and capacity.
IT software designed specifically for third sector monitoring and evaluation is increasingly available. Investment in improved IT systems can have a number of advantages, including:
considerable time savings
increased storage capacity
easier data search and access
easier checking for errors and missing data
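As one illustration of the kind of checking involved, the Python sketch below scans a file of monitoring records for missing fields; the file name and the expected field names are hypothetical.

# Report rows with missing fields in a CSV file of monitoring records.
# The file name and expected field names are hypothetical.
import csv

expected = ["date", "user_id", "service", "outcome_rating"]

with open("monitoring_records.csv", newline="") as f:
    # Data starts on row 2; row 1 holds the column headings.
    for row_number, row in enumerate(csv.DictReader(f), start=2):
        missing = [field for field in expected if not (row.get(field) or "").strip()]
        if missing:
            print(f"row {row_number}: missing {', '.join(missing)}")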
Further reading
Davey, S et al (2008) Using IT to Improve Your Monitoring and Evaluation, Performance Hub, Charities Evaluation Services, London
The Charities Evaluation Services website has information on monitoring and evaluation software: www.ces-vol.org.uk
section 3 evaluation
Basic evaluation 51
Making judgements 51
Outcome evaluation 53
Impact evaluation 56
Further evaluation 65
Evaluation approaches 67
Managing evaluation 74
basic evaluation
Basic evaluation considers what makes
evaluation different from monitoring,
and how the two link together. It discusses
the need to take into account different
expectations and agendas and to work within
ethical standards. The section introduces the
process of data collection and looks at the
important step of interpreting evaluation
findings and at writing an evaluation report.
Making judgements
During the year you produce monthly or
quarterly monitoring reports for colleagues
and trustees, and use the data for funding
applications and publicity.
Your quarterly reports may lead you to
think about how you are delivering services,
for example, how you can best respond
to advice needs, or how you can improve the
accessibility of your services given
your user profile information. However,
an evaluation will allow you to examine
specific issues in greater depth. It gives
you the opportunity to bring together,
analyse and make judgements and
recommendations from the monitoring
data collected during the year.
Your monitoring information is likely to
contain:
profile information on your users
basic project record keeping, such as the
minutes of meetings and case records
statistical information on take-up of
services
feedback sheets from training courses
and workshops
diaries and other records of events
complaints and compliments from users.
As part of your evaluation, you may decide
you need to know more than this. Your
monitoring information will probably suggest
further questions that need an answer.
[Diagram: data analysis, interpretation, reporting.]
Understanding stakeholders' needs
Just as you consulted your stakeholders in
setting aims, objectives and indicators of
success, make sure you plan enough time
for consultation with managers, users,
funders and staff about their monitoring
and evaluation needs.
You do not always need to involve everyone,
but try to consult with representatives from
different stakeholder groups as they will have
different, sometimes contradictory, interests
and views. Although the performance
indicators you set will guide some of your
evaluation activities, there may be other
areas of your work that need exploring, and
other questions to be answered. Find out:
What are the main questions they want
monitoring and evaluation to answer?
What information do they lack about
the project?
How do they want to be involved?
How do they want the findings
presented?
Involving funders and partners at this stage
can also help to build relationships and can
lead to a greater likelihood of the evaluation
report being understood and accepted.
Consulting users can also encourage user
participation in the evaluation itself and in
developing data collection tools, such as
questionnaires.
Deciding what questions need to be
answered is the first important step. For
example, they might be:
Are we reaching the right target group?
Outcome evaluation
We have defined outcomes as the changes,
benefits, learning and other effects that you can
attribute to your organisational or project
activities. They are different from the longer-term
change or impact of the project or programme.
Impacts relate to the overall aim or mission. They
are the broader or cumulative effects that may
be seen in local or national statistics about the
area of intervention, such as health,
homelessness, school attendance and youth
unemployment.
If you set indicators for outcomes, you might
have been able to monitor some of these
outcomes regularly. During the evaluation stage
you should be able to draw this information
together. Useful outcome information can also be
obtained during the evaluation itself through
one-off data collection. The task then is to put
outcome information into context.
Understanding outcomes
You can look for outcomes at a number of levels.
When you set indicators remember to set them
at different levels, if appropriate. For example,
these may be:
individual
family
community
environment
organisational or institutional
systems and policy.
You may be focusing your monitoring at one
level, but it could provide a more complete
picture of your project if you collect information
on outcomes at other levels during your
evaluation. For example, a project developing a
green space might monitor the changes in the
environment on a regular basis, and during an
evaluation collect information on the resulting
outcomes for the community as well. A project
working in schools might collect regular data on
school attendance and test results, and as part
of an evaluation also collect data from families
and teachers on the difference the project has
made to them. Examples of these outcomes
might include fewer conflict episodes or
increased contact between school and family.
Levels of intervention and expected outcomes
Government policy: policies to prevent homelessness
Local community: increased acceptance and co-operation; change in attitude
Families: more young people reunited with their families
Individuals: increased self-esteem, confidence and change in behaviour
Be clear about the type of change you are intending, and for whom, as shown in the table below.
Change: develop, expand, maintain, modify, improve
In what: attitude, knowledge, condition, perceptions, policy
For whom: individual, family, neighbourhood, other agencies, local government
Impact evaluation
The term impact is used in a number of
ways, but is usually taken to mean the effect
of a project or programme at a higher or
broader level, cumulative effects, or changes
that affect a wider group than the original
target. Impact often relates to the overall aim
or mission of your project. Impact is
frequently longer term and therefore the
timescale needed for impact evaluation may
well be longer than that needed for outcome
evaluation.
One area of overlap between outcomes
and impact arises when an organisation is
directly involved in policy work, education
or awareness-raising at community level, or
other broader work. Then:
Policy change or community awareness
would be defined as intended outcomes of
the organisation or work taken.
Impact could be the effect on a wider target
group of any policy change or increased
awareness.
Adapted from: Worthen, BR and Sanders, JR (1987) Educational Evaluation: Alternative Approaches and Practical Guidelines, Longman

Finding the evidence
Piloting
Before starting any data collection, it is a
good investment of time to test the data-collecting instruments. For example, ask
some project participants to fill in a
questionnaire or carry out interviews with a
small number of respondents. Then look at
the results to see if the questions have been
understood consistently by respondents and
if you have captured the information you
wanted.
Piloting is particularly important, for example,
with a survey questionnaire or other self-completion instruments, where a researcher
is not on hand to clarify questions.
Training
It is important that people carrying out
evaluation research are properly briefed and
trained. Data collection should be supervised
to ensure quality, for example, by comparing
data entry records for different researchers
and reviewing work regularly.
Incentives
Evaluators sometimes offer a small token to
participants if they are asked to give up a
substantial amount of time. Ask people
working directly with your informants what
they would consider appropriate: a voucher
can be given as a token payment. You also
need to budget for travel costs, refreshments
for a focus or discussion group, and possibly
for child care costs.
Interpreting findings
So far you have gathered your monitoring
data over the year and collected some
additional data. You can find information on
analysing data in the Practical toolkit. The
next stage is possibly at the heart of your
evaluation activities. You must think about
what lies behind your data, that is, interpret it.
Interpretation means looking beyond the data
itself and asking what the results mean in
relation to your evaluation questions. Be wary
of assuming that there are links of cause and
effect between your project activities and
results. Involve other people in this level of
interpretation and, where appropriate,
acknowledge in your report the possibility of
other interpretations. Remember to put
outcome information into context. To do this,
it will be useful to ask a number of questions:
Did the level of resources, for example, the
level of money or staffing, affect the
outcome?
Drawing conclusions
You are now ready to tell your stakeholders
what you have found, and must draw
conclusions from your findings. Conclusions do
not just repeat the data, but should link clearly
to the evidence presented. Avoid generalising
from a particular finding and make clear in
your report the difference between facts,
respondents' views and the evaluator's
interpretation.
If your report is short, you might summarise
your conclusions at the end of the findings
section. In a longer report, summarise your
conclusions on one section of your findings
before reporting on the next topic.
One approach to finalising the report is to
organise discussion groups on the draft report.
Think of this as your final stage of data
gathering, as you may be given new insights
and different interpretations. You may also use
these discussions to suggest recommendations
arising from the findings.
This process will improve the report and allow
others to feel involved in the evaluation
findings. It is more likely that
recommendations developed in this way will
be acted on. Make the distinction between findings and recommendations clear.
Writing the evaluation report

Presenting findings
How you present the report will have an important bearing on your credibility, so consider what your evaluation report will look like before you start putting together your findings. Think first about the purpose of the evaluation and its audiences. The style of reporting and level of detail will vary according to your audience. Will your audience prefer evidence in the form of tables, charts and graphs, or case examples?
[Diagram: from data to findings - monitor routinely and collect additional data; assemble; analyse; interpret; report.]
[Charts: breakdown by source - drop-in centre, outreach work, schools, community centres, advice centres, social services, other.]
Case examples
Case examples describe a particular event
or set of events, activity or personal history.
They are intended to demonstrate and explain
evaluation findings. They can be very useful in
demonstrating good and bad practice,
illustrating complex services and bringing an
evaluation to life. But it is helpful to explain
the point you are illustrating in the main
narrative.
Further reading
Kumar, R (1999) Research Methodology: A Step-by-Step Guide for Beginners, SAGE, London
Robson, C (2000) Small-scale Evaluation: Principles and Practice, SAGE, London
further evaluation
Basic evaluation concentrated on
bringing together monitoring data and
supplementing it within a structured self-evaluation exercise. Further evaluation
examines evaluation approaches within
a basic theoretical context and considers
different types of evaluation activity.
These relate to the focus of the evaluation
enquiry. This section also looks at the
relationship between evaluation, quality
audits and other audits. Finally, it considers
good practice points for managing and
reporting your evaluation, whether this is
carried out internally or by a consultant.
Focusing your
evaluation
Context evaluation
Evaluation can look at how the environment,
or context, a project operates in affects it,
and this will help you understand how and
why the project works. A needs analysis
will have established some of the basic
contextual factors. Later on, you may
need further context information when
you modify or develop your project. What
other services are available from other
agencies? What is the political climate
and how has it affected the work?
Contextual information is also essential
when attempting to reproduce projects
and services elsewhere. How does an urban
or rural environment, for instance, affect
implementation?
It is also important to understand how
organisational contextual factors might
hinder or support project success.
Questions might include:
How do organisational structure
or staffing decisions influence project
development or success?
How effective are decision-making
processes?
Process evaluation
A process evaluation will help you to
assess the planning, setting up and
implementation of a project, and decide
how to improve or modify current
activities. It will focus on processes (how
the project works) and also provide
valuable information on progress. Process
evaluation is particularly important for
pilot projects so you can learn what needs
to be improved. Specific purposes might
include:
finding out what is working well
finding out where there are problems
and why
assessing how users and other
stakeholders experience the project,
and their use of project services.
Evaluation approaches
Once you have clarified your evaluation
focus, it will be helpful to consider your
evaluation approach.
While there are many variations, monitoring
and evaluation approaches that focus on
accountability needs have largely followed
the tradition of scientific investigation.
These approaches assume the power and
value of measurement and objectivity.
However, there has been a shift towards
approaches that can be used more
easily in decision-making, and an
acknowledgement that quantitative data
is itself an approximation or interpretation.
This links to the increasing emphasis on
participation by stakeholders throughout
evaluation: evaluation that can be used,
and evaluation for learning.
Such approaches are labelled naturalistic.
Naturalistic approaches assume that there
will not always be a definite answer. They
focus on qualitative data and description,
and they value subjective understandings
and different perspectives. Naturalistic
approaches, such as case study evaluation
and participatory evaluation, acknowledge
that all enquiry is value-driven and is
more likely to be sensitive to the different
values of programme participants. This
fits particularly well with a voluntary
sector ethos.
In practice, the difference between the two
approaches to evaluation is not as clear cut
as in theory. Qualitative and quantitative
methods may be used in both approaches.
Participatory evaluation
Figure: the participation continuum, from no participation to full participation:
Stakeholders know the results
Consultation: views asked about predetermined issues
Dialogue: involvement in the design and input into decision-making on some of the issues
Basic evaluation tasks: stakeholders carry out parts of the data collection
Complex evaluation tasks: stakeholders involved in processing findings and making recommendations
Collaboration: joint responsibility for the evaluation
User-led evaluation
Theory-based evaluation
Programme logic
Inputs lead on to outputs, which lead on to short-term outcomes, which lead on to outcomes, which lead on to impact.
Designing an outcome
evaluation
Comparative designs
Figure: a comparative design – change (positive or negative) is measured before and after the project, for both the project group and a comparison group.
Emergent designs
There may be situations when all your
evaluation questions, and even the methods
for collecting evidence, are not clear at the
beginning of the evaluation. In this case, you
have flexibility to allow important issues and
questions to emerge as you do the work, and
you design your data collection
in response.
Economic evaluation
Funders often want to know whether a
project provides value for money or is cost
effective. Economic evaluation recognises the
choices involved in spending limited resources
and focuses on the non-monetary and
monetary costs as well as benefits.
Costs are the various inputs, both direct and
indirect, needed to set up and run a project.
The most important cost is often the time
spent by staff and volunteers on different
activities, and this needs to be multiplied
by the value of each person's time. Other
resources to be costed include physical
facilities, equipment and supplies.
In a cost-benefit analysis you measure costs
and outcomes in monetary terms. The cash
flow can be used to work out the percentage
annual rate of return. Or, if you divide the
project costs by project benefits, this will give
the cost-benefit ratio. This approach needs
considerable economic expertise, and extensive
work that is expensive and problematic
because it is imprecise in its measurements. It
is rarely suitable for voluntary sector activity.
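The arithmetic itself is simple; it is the costing and the valuation of benefits that are hard. As a minimal illustration, using invented figures, the two calculations described above could be set out in a few lines of Python:

# Illustrative only: the figures below are invented
costs = 40_000.0     # total value of inputs for the year
benefits = 50_000.0  # monetary value placed on the project's benefits

# Percentage annual rate of return on the money spent
rate_of_return = (benefits - costs) / costs * 100
print(f"Annual rate of return: {rate_of_return:.0f}%")  # 25%

# Cost-benefit ratio as described above: project costs divided by project benefits
ratio = costs / benefits
print(f"Cost-benefit ratio: {ratio:.2f}")  # 0.80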
Evaluating sustainability
A key evaluation question is how
sustainable the project is. Will the project be
able to flourish after a particular source of
funding has ended? You need to ask:
What ongoing funding is secured?
Managing evaluation
Whether you are carrying out self-evaluation, or using an external evaluator,
or combining both approaches, evaluation
requires careful management to avoid or
overcome some potential problems.
For example:
one or two people often carry much of
the responsibility for the evaluation and
it may be difficult to involve others
the time and resources involved may
prove too much of a burden, resulting
in an incomplete or poor quality process
evaluation staff may become
demoralised
there may be a hostile response to
negative findings or to the process itself.
Self-evaluation may not always be appropriate
for your specific evaluation needs. External
evaluation can bring a fresh perspective,
specific skills and new insights. However,
make sure you build in enough consultation
time so that the evaluation, while
independent, works to relevant criteria and
meets your immediate priorities, timetable
and resources.
Contractual arrangements
Terms of reference
The terms of reference, or brief, for an
external evaluation should set out all the
principal issues that will guide evaluation
activity. These should be addressed in the
evaluator's proposal. The contract
should refer to these documents and should
spell out the responsibilities of each party.
The contract you make with any external
consultant is important and should be
signed and in force before the evaluation
begins. It will include or have attached the
terms of reference and the evaluator's
proposal, and state clearly the amount and
method of payment. It should also clarify
issues to do with ownership and copyright.
It will be useful for the contract to detail
administrative requirements, and where
these responsibilities lie. For example, you
may need to:
make databases available, or collate and
analyse existing monitoring data
make available any previous reports or
relevant literature
contact stakeholders to brief them about
the evaluation and introduce the
consultant
send out questionnaires, in order to keep
costs down.
Ownership and copyright
Be clear from the start about rights and
responsibilities. The following questions
should be answered:
Unless you have agreed who has authorship
and copyright, neither party may be free to
use the report widely; this includes website
publishing. Joint copyright means that both
the author, that is, the evaluator, and the
project, have rights to the final manuscript.
Neither party should publish (as opposed to
circulating to an immediate audience)
without the agreement of the other.
These issues of authorship, copyright and
publication should be settled early on and
precise procedures put in place.
Practical details
Think through in advance any practical
details. For example, are there any
constraints on data collection that the
researcher should be aware of? Are there any
IT compatibility issues or difficulties about
access to paper files? Are there any issues
concerning confidentiality for staff and
users?
Evaluation budget
Evaluation needs resources, including time
and money. Self-evaluation will minimise
costs, although you should still set a budget.
Further reading
Clarke, A and Dawson, R (1999) Evaluation Research: An Introduction to Principles, Methods and Practice, SAGE, London
Gosling, L and Edwards, M (2003) Toolkits: A Practical Guide to Monitoring, Evaluation and Impact Assessment, Save the Children, London
Van Der Eyken, W (1999) Managing Evaluation, Second edition, Charities Evaluation Services, London
section 4 utilisation
Basic utilisation
Further utilisation
basic utilisation
Once you have presented your report, it may
be tempting to file the evaluation away. This
will be a waste of all the time, energy and
resources spent on the evaluation and risks
losing the goodwill of everyone involved.
Importantly, it will also be a lost opportunity
to improve what you do. This section
suggests ways that you can disseminate, or
pass on, your findings, and how you can use
the findings to demonstrate your project's
progress, to make adjustments and to plan
the future direction of the project.
Disseminating evaluation findings
Within your own organisation, you might share the findings:
at staff meetings
at the annual general meeting
in the newsletter.
With other stakeholders, you might:
distribute copies of summary sheets of your findings to your partners
write brief articles in other organisations' newsletters.
Using evaluation findings
In response to your findings, you might, for example:
adjust workloads
change your monitoring systems
change promotional material
introduce new training
increase your quality control.
Further reading
Lyons-Morris, L and Taylor Fitz-Gibbon, C (1987) How to Communicate Evaluation Findings, SAGE, London
further utilisation
Basic utilisation discussed the need to
disseminate findings and to use them to
demonstrate progress and to feed back
into, and guide, the management of the
project. This section examines further how
you might use evaluation findings internally
and externally for three main purposes:
management
- reviewing targets and target groups
- incorporating quality improvements
into planning
- reviewing key resources
policy change
strategic planning and development.
Valuing the findings
Using evaluation for management
Using evaluation for policy change
Figure: using evaluation findings in strategic planning – Step 1: analyse data (strengths, weaknesses, opportunities, threats); Step 2: assess strategic options; Step 3: agree strategy.
Using evaluation for strategic planning
Figure: the planning cycle – aims and objectives lead to strategic planning, then operational planning and implementation, with process and outcome evaluation feeding back into the cycle.
Further reading
Patton, MQ (1997) Utilization-focused Evaluation, Third edition, SAGE, London
Introduction
Tool 1 Quarterly monitoring report
Tool 2 Outline evaluation report
Tool 3 Interview topic guide
Tool 4 User satisfaction form
Tool 5 Intermediate outcome monitoring form
Tool 6 Individual assessment tool
Tool 7 Outcome assessment: participant self-report
Tool 8 Telephone interview guide
Tool 9 User interview schedule
Tool 10 Participatory learning and action (PLA) tools
Further reading
Glossary
collecting and analysing data
Introduction
Establishing credibility
Bias
Bias occurs when findings are unduly
influenced by the way data is collected,
analysed or interpreted.
Some measures that can be taken include:
eliminating bias as far as possible by
ensuring questions are asked with the
same wording, and in the same manner
making sure that groups chosen
for comparative purposes share as
many characteristics as possible, except
for the variable you are studying
obtaining evidence to answer your
evaluation questions from different
types of sources. This is called
triangulation.
The term triangulation means getting
evidence or taking measurements from
different points. You can use triangulation
by looking for different sources of data
on the same area of enquiry or applying
different methods to the same data
sources. For example, you may wish to
ask a group about the benefits of working
together, obtain feedback from the group
facilitator, and observe the group yourself.
If you do this, you will have greater
confidence in your evaluation findings.
It may, of course, result in contradictory
findings, and it will be your task to
consider explanations for such differences.
Desk research
This is a review of records, databases and
statistical monitoring data. You may need
to look at these over a period of years.
For example:
records about members and users
planning documents
policy and discussion papers and reports
publication sales and distribution figures
minutes of meetings
correspondence.
Desk research is valuable because it helps
you understand how a project has developed,
and how it is managed and resourced. It is
essential for implementation evaluation.
Client-based data, such as case records, may
not only tell you who you are seeing and
when, but may also provide some outcome
data.
Collecting data directly from individuals
Face-to-face interviews
Types of interview
Structured interview
In a structured interview the interviewer
only asks questions that have already been
prepared in an interview schedule, using a
set order and exact wording. Interviewers
are given detailed instructions on how to
ask the questions.
Recording interviews
You will be able to take notes during most interviews.
This is important, because if you wait until afterwards
to write things down, you will forget much of what
you heard, and your own biases are more likely to
intervene. Type up your notes as soon as possible
after the interview.
You may find it useful to tape a lengthy interview.
This will allow a detailed analysis and you will be able
to use accurate quotations in your report as evidence.
Always get permission before taping. Transcribing a
taped interview can be time-consuming and costly.
One option is to use notes of the interview as the
basis of your analysis. You can then use your tapes
selectively to supplement the notes.
Semi-structured interview
A semi-structured interview follows a less
rigid format, with open-ended questions
designed to draw out more qualitative
information. The schedule is used as a
prompt to make sure that all the required
topics are covered. Questions do not have
to follow a predetermined sequence and
the interviewer can explore the answers in
greater depth. The interviewer is free to
vary the exact wording of the questions
and to probe and follow up answers. A
semi-structured approach can be particularly
useful for group discussions.
Unstructured interviews
These are sometimes called depth interviews,
as they encourage the interviewee to reveal
in depth any aspects they want to raise.
Unstructured interviews are flexible, can be
tailored to the individual and can lead to a
greater understanding of the respondent.
Their weakness is that information is likely
to be different for each interviewee, is less
likely to be systematic and comprehensive,
and data analysis can be difficult.
Combining interview styles
You might combine interview styles. There
may be a small set of specific questions that
you want to ask everyone, and a number of
topic areas which can be explored in a more
flexible way.
Telephone interviews
Telephone interviews are particularly useful
for getting feedback from busy people.
They can be used to obtain quite sensitive
or complex information, or where you need
to contact a large number of people in
their home or office. In some situations you
may be able to conduct lengthy telephone
interviews, but try not to make them longer
than 20 minutes.
Group interviews
Focus groups
Please assess the following elements of the course by ticking the boxes 1 to 5, as follows:
1 Very poor; 2 Poor; 3 Average; 4 Good; 5 Very good
a Course content
b Tutorial groups
c Information materials
d Course administration
e Social activities

Please show your satisfaction with the newsletter by ticking the boxes 1 to 5,
where 5 shows that you are very satisfied (1 = Very dissatisfied, 5 = Very satisfied).
a Topics covered
b Design and layout
c Clarity of language
d Length
e General interest and relevance
Data collected by an
independent observer
Observation is useful for collecting information
on group behaviour and in situations where
people find it hard to talk. It is also useful for
assessing the quality and standards of delivery,
for example of training, in a residential setting,
or of the activities of a day centre.
The use of observation can involve project
participants in the evaluation, and encourage
their involvement in other data collection
exercises. On the other hand, the technique
runs the risk of bias and subjectivity. You will
need to think clearly beforehand about what
you are analysing. Reliability of observation
depends heavily on the training and skills of
those doing the observing. You can reduce
this risk if you devise a well-thought-out
observation checklist.
Participant observation
This involves observing while participating
in activities, such as training courses and
workshops, club activities and activity
weekends. The advantage of participating
is that you will gain experience of the
project as a user and put users at their
ease. The disadvantage is that it is harder
to stay neutral.
Take into account that:
gathering, recording and analysing
observation data can be costly
there may be concern about the reliability
and validity of observational data
if there is more than one observer,
you need to check that you are using
the same criteria and applying them
consistently.
Always let people know when they are going
to be watched, and ask their permission.
However, be aware that saying this is likely
to affect behaviour.
Data collection matrix (example)
Information needed: number of older people who have had needs assessments; level of user satisfaction; number of people contacting statutory and independent agencies; extent and type of working relationships with other agencies; number of older people with Housing Benefit, and total amount of Housing Benefit; numbers receiving health care, and types of services received; numbers receiving care packages, and types of care packages.
Data sources: interviews with users; staff interviews; telephone interviews with other agencies; monitoring data; document review; case records.
Sampling
Random sample
As well as people, you might sample, for example:
project locations
activities or events.
charities
evaluation
services
106
Stratified sample
Sometimes a simple random sample is not
the best way of making sure that all the
types of people or organisations that you
work with are represented. For example, if
you only have a few clients who are disabled,
random selection may miss them entirely, just
by chance.
To avoid this, in a sample of 80 you may
decide that you want:
25 men aged 19 to 54
25 women aged 19 to 54
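If your user records are held electronically, both simple random and stratified selection are easy to automate. The sketch below is illustrative only: the user list is invented, and the quotas for the two older groups (15 each) are an assumption simply to bring the sample to 80.

import random

# Invented user records: (name, sex, age group); 40 in each of four groups
users = [(f"User {i}", sex, age)
         for i, (sex, age) in enumerate(
             [("man", "19-54"), ("woman", "19-54"),
              ("man", "55+"), ("woman", "55+")] * 40)]

# Simple random sample: every user has the same chance of selection
simple_sample = random.sample(users, 80)

# Stratified sample: fix the number drawn from each group in advance
quotas = {("man", "19-54"): 25, ("woman", "19-54"): 25,
          ("man", "55+"): 15, ("woman", "55+"): 15}  # hypothetical quotas
stratified_sample = []
for (sex, age), quota in quotas.items():
    group = [u for u in users if (u[1], u[2]) == (sex, age)]
    stratified_sample.extend(random.sample(group, quota))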
A time sample
A time sample will include every case that
happens over a set period. For example, you
may monitor the number of people coming
to a busy service over a week or a fortnight.
Purposeful sample
With a purposeful sample, you make a
deliberate choice of who or what to include,
according to specific characteristics. This
type of sample enables you to capture a wide
range of views, experiences or situations.
For example, you might choose to interview
five users who have had successful
Longitudinal studies
A longitudinal study is data collected on
a number of occasions from the same
population over a period of time. If you are
unable to collect data from the same group,
you may interview another group from the
same population, as long as the samples are
selected randomly both times to represent the
entire population. Use identical data collection
instruments, with the same wording, or you
will invalidate the findings.
Sample bias
This is mainly due to non-response or incomplete answers. For example, completed responses to a survey of older people may not reflect the true age range of project users. This may be because very old or more vulnerable users were less likely to complete the questionnaire, and their views are therefore less likely to be represented.
Sampling error
Here you run the risk that if another sample was drawn, different results might be obtained.
Data analysis
The quality of your evaluation will depend
to a large extent on the quality of your data
analysis. Data analysis is about organising your
evidence in a logical, well-argued way in order
to make sense of it. You may need advice
when you design your evaluation to make sure
that the type and amount of data you propose
to collect is capable of being analysed.
Data analysis should be done throughout
your evaluation process, not just at the end.
Quantitative data analysis
Data collected using quantitative methods
can be analysed statistically for patterns,
for example:
percentages
averages
ratios
range between lowest
and highest levels
trends
rates.
Before you collect large amounts of
quantitative data, think how it is going
to be stored and analysed. You will need
a computer database if you have a large
number of questionnaires asking for a lot
of information. Remember that monitoring
forms should have a client code or name if
you want to be able to compare two or more
forms relating to the same client. If you
cannot handle the data in your project, you
may be able to get the coding and analysis
done by a student or a research firm.
First check your data for responses that
may be out of line or unlikely, such as
percentages that are numerically incorrect.
You may need to exclude these from the data
to be analysed. The process of data analysis
can be a useful way to check the accuracy of
the data collected and can identify unusual
or unexpected results.
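Where the data is held electronically, this kind of check can be automated. A minimal sketch, with an invented field name and invented records:

# Flag values outside the possible range before analysis (field name invented)
records = [
    {"client": "A01", "satisfaction_pct": 85},
    {"client": "A02", "satisfaction_pct": 112},  # impossible: over 100
    {"client": "A03", "satisfaction_pct": -5},   # impossible: below 0
]

suspect = [r for r in records if not 0 <= r["satisfaction_pct"] <= 100]
print(f"{len(suspect)} record(s) to check before analysis: {suspect}")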
Coding
Coding means using a number, or a few key
words, to summarise a lot of textual material.
Categories are used to classify the data so they
can be processed by the computer.
There are two main approaches to coding
responses:
Pre-planned: respondents reply to fixed-choice
questions with pre-set categories.
Additional coding is not necessary.
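A minimal sketch of pre-planned coding, with invented categories and responses, might look like this:

# Fixed-choice answers mapped to numeric codes agreed before data collection
codes = {"very poor": 1, "poor": 2, "average": 3, "good": 4, "very good": 5}

responses = ["good", "very good", "average", "good"]
coded = [codes[answer] for answer in responses]
print(coded)  # [4, 5, 3, 4]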
Age group    Number of young people    Percentage of total %
Under 10     21                        15.3
10-12        16                        11.7
13-16        56                        40.9
Over 16      44                        32.1
Total        137                       100
Central tendency
The mean, or arithmetical average,
involves adding up the data and dividing
by the number of participants.
The median is the point at which half the
cases in the sample fall below and half
above.
The mode is the category with the largest
number of cases.
The mean may not provide a good illustration
of responses where these have been widely
distributed. You may need to be more
specific. For example, '65% of participants
found the course good or very good, 15%
found it average, and 20% said that it was
poor or very poor' will be more helpful than
presenting the average response. The mode
is useful for describing the most commonly
occurring non-numerical data.
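All three measures are available in Python's standard library; the ratings below are invented:

import statistics

ratings = [5, 4, 4, 3, 1, 5, 4, 2, 4, 5]  # invented 1-5 course ratings

print(statistics.mean(ratings))    # 3.7  the arithmetical average
print(statistics.median(ratings))  # 4.0  half the cases fall either side
print(statistics.mode(ratings))    # 4    the most common value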
Remember to make the response rate clear
in your analysis. Whenever you can, record
the basic characteristics of those declining to
participate in, for example, a questionnaire,
say by age or sex. Make it clear how the
profile of those responding relates to the
wider population you are researching.
Cross tabulation and sub-group analysis
Cross tabulation examines findings in greater
detail. If the Community Theatre Project
wanted to examine the length of
participation in a holiday scheme by age, the
computer could show this, as the table
below demonstrates.
Table: length of participation in the holiday scheme (2 days or under, 3-5 days, 6-10 days, 11-15 days) by age group (under 10, 10-12, 13-16, over 16), with row and column totals.
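The same kind of cross tabulation can be produced from raw records in a few lines of Python; the records below are invented:

from collections import Counter

# Invented records: (age group, length of participation) per participant
records = [("under 10", "3-5 days"), ("13-16", "6-10 days"),
           ("10-12", "3-5 days"), ("13-16", "3-5 days"),
           ("over 16", "11-15 days"), ("13-16", "6-10 days")]

# Count each age group / duration combination
table = Counter(records)
for (age, days), count in sorted(table.items()):
    print(f"{age:<10} {days:<12} {count}")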
data collection tools
tool 1
quarterly monitoring report
The following is an example of a summary monitoring
report for trustees and staff, providing information against targets.
Services delivered by Thursbury Community Theatre Project: 1 January to 30 September

                                      This quarter  Cumulative  Annual target  % of target met
Number of performances
  Schools                             1             9           15             60%
  Community centres                   7             16          20             80%
  Other venues                        1             2           5              40%
Total number of organisations                       15          25             60%
Total attendance                      102           533         1200           44.4%
Number of classroom workshops
  Half-day                            2             12          20             60%
  One-day                             1             5           10             50%
  Two-day                             -             2           5              40%
Total number of schools                             11          15             73.3%
Total attendance                      43            321         700            45.9%
Number of holiday schemes                                                      66.7%
Total attendance                      28            42          75             56%
Number of theatre skills courses                                               50%
Total attendance (teachers)           12            12          10             120%
Total attendance (community workers)                            10             0%

Key: each quarter (1st to 4th) is also rated as positive, on target or negative.
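The percentages in a report like this are simple to calculate, or to check. A minimal sketch, using the schools performances row from the example table (the split across the three quarters is invented):

quarterly_counts = [5, 3, 1]  # performances in schools, quarters 1 to 3 (invented split)
annual_target = 15            # from the example table

cumulative = sum(quarterly_counts)             # 9
pct_met = cumulative / annual_target * 100
print(f"{pct_met:.0f}% of annual target met")  # 60%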
tool 2
outline evaluation report
The report would not necessarily include all sections.
Preliminaries
Front cover
List of contents
Executive summary
Report overview
Project description
Conclusions
Recommendations
Ending sections
Notes
References
Appendices
tool 3
interview topic guide
1 Introduction:
personal introductions and explanation
of the evaluation process
length of interview and outline of topics to be covered
assurances about confidentiality and anonymity.
2 Outline of roles and responsibilities in the project.
3 Start date in post, and overview of tasks and
workload since started.
4 Management support and working relationships
with other project staff.
5 First mentor recruitment: methods, successes and difficulties.
6 Recruitment of young people and liaison with schools:
process of building relationships; types of relationships
with different schools.
7 First mentor/mentee group: most successful aspects;
least successful aspects.
8 Drop out mentors: profile and reasons for drop out.
9 Drop out young people: profile and reasons for drop out.
10 Lessons learnt from first mentor/mentee group; things
that will be done differently.
11 Most important issues: project strengths and weaknesses
emerging during the start-up of the project.
12 End of interview:
contact details for any follow-up information
thanks.
tool 4
user satisfaction form
Training course evaluation form
We would be grateful if you would fill in this form so that we can monitor, evaluate and
improve our training.
Course
Date
Trainer
Venue
Please show whether you agree or disagree with these statements about
how the training may help you (tick one box only for each statement).
Scale: Strongly agree / Agree / Neither agree nor disagree / Disagree / Strongly disagree
As a result of this course, I feel that I:
have further developed my knowledge and understanding of monitoring and evaluation
am more confident about implementing monitoring and evaluation within my organisation
will be able to help improve the quality of my organisation's service delivery
will be able to help my organisation be more effective in meeting users' needs
Please write any comments explaining your responses to the above statements
tool 5
intermediate outcome monitoring form
Peer education training course:
trainer's participant monitoring form
Trainers should complete three forms for each participant: one at the end
of the first day, one at the end of the first semester, and one at the
end of the course (before the placement).
Skills
Participant ID number
Rate the participant on the following, ticking one box only for each statement, and giving evidence to support your rating.
Scale: Satisfactory / Good / Very good
The participant's ability to:
listen to others
communicate with others
use appropriate body language
summarise what people say
discuss difficult/sensitive topics
work with different types of people
plan presentations
deliver presentations
facilitate a group
Confidence
Rate the participant on the following, ticking one box only for each statement, and giving evidence to support your rating.
Scale: Strongly agree / Agree / Neither agree nor disagree / Disagree / Strongly disagree
The participant:
is confident of their skills in working with people
is able to describe all of their skills
is aware of a range of career opportunities available to them
is self-confident
feels good about themselves/has high self-esteem
Support
Comments
tool 6
individual assessment tool
The following horticultural therapy assessment tools
are used, together with an individual person plan,
to help clients identify their needs and how best
they can be met. If a client's aim is to get a job, the
type of job and skills required should be identified.
The initial assessment provides a baseline from
which progress can be assessed periodically. The
assessment tool therefore can also serve as an
individual outcome monitoring tool. It is important
to note, however, that the effectiveness of the
project does not necessarily relate to an increase in
individual skills or work habits. Maintenance of
mobility alone may be an indicator of success for
some clients. Individual outcome information
therefore needs careful interpretation.
Table 1
Continuous assessment: work habits
Date
a Punctuality
b Appearance
c Personal hygiene
d Manners
e Attitude
f Interest
g Effort
h Aptitude
i Communication
j Co-operation
k Integration
Rating scale: Unacceptable level / Shows some competence / Below average standard for open employment / Approaches average standard for open employment / Acceptable standard for open employment
Table 2
Continuous assessment: planting
Date
a Fills in soil
Firms in soil
Judges/measures planting distance
m Waters plant
tool 7
outcome assessment: participant self-report
The 'who-are-you?' quiz
We are all better at some things than others. Tick the boxes and see how you score.
For each of the 14 numbered questions, participants rate themselves (Very good / Good / OK / Could be better / Need to work on this), with a smiley face and space for a comment against each answer.
Questions cover: communication; establishing interpersonal relationships; managing feelings; problem solving; negotiating; planning; reviewing; taking responsibility; facing up to consequences.
tool 8
telephone interview guide
Note: If the named person is not available,
find a time to call back. Do not explain the
reason for the call to a third party, or ask
people to call back.
Once the named person is on the phone,
introduce the evaluation:
'My name is Grace Kamwa. I am working
on behalf of the National Advice and
Information Service to evaluate its service.
I am calling because you kindly agreed to
allow us to contact you by telephone for
feedback on the service you received when
you contacted the advice line last month.'
126
tool 9
user interview schedule
This is an extract from an interview schedule used by
People First, an organisation run by and for people
with learning difficulties, as part of their user-led
evaluation. Cards were used with sad and happy faces,
which were helpful, but did not always work. A book
was also used with large prints of the pictures.
7 Getting on with staff/support workers
12 Did you meet the staff before they came to work here?
Yes / No / Don't know
16 Is it better?
17 Is it worse?
tool 10
participatory learning
and action (PLA) tools
Card sorting
Evaluation of adventure weekend: participants are asked to tick the things they liked.
Things we liked:
Camp leaders
Cooking
Survival training
Canoeing
Making friends
Camping
Evaluation wheel
Graffiti wall
Large sheets of paper are hung on the wall and participants are
invited to write comments on them during the activity or event.
further reading
Annabel Jackson Associates (2004) Evaluation Toolkit for Voluntary and Community Arts in Northern Ireland, Annabel Jackson Associates, Bath.
Burns, S and Cupitt, S (2003) Managing Outcomes: a Guide for Homelessness Organisations, Charities Evaluation Services, London.
Burns, S and MacKeith, J (2006) Explaining the Difference Your Project Makes: A BIG Guide to Using an Outcomes Approach, Big Lottery Fund, London.
Charities Evaluation Services: Evaluation Discussion Papers.
1 The Purpose of Evaluation (1998)
2 Different Ways of Seeing Evaluation (1998)
3 Self-evaluation (1998)
4 Involving Users in Evaluation (1998)
5 Performance Indicators: Use and Misuse (1998)
6 Using Evaluation to Explore Policy (1998)
7 Outcome Monitoring (2000)
8 Benchmarking in the Voluntary Sector (2003)
9 Assessing Impact (2005)
Charities Evaluation Services (2008) PQASSO (Practical Quality Assurance System for Small Organisations), Third edition, London.
Clarke, A and Dawson, R (1999) Evaluation Research: An Introduction to Principles, Methods and Practice, SAGE, London.
Coe, J and Mayne, R (2008) Is Your Campaign Making a Difference? NCVO, London.
Connell, JP and Kubisch, AC (1998) 'Applying a Theory of Change Approach' in Fulbright Anderson, K, Kubisch, AP and Connell, JP (eds), New Approaches to Evaluating Community Initiatives, Vol 2: Theory, Measurement and Analysis, The Aspen Institute, Washington DC.
Cracknell, BE (2000) Evaluating Development Aid: Issues, Problems and Solutions, SAGE, London.
Cupitt, S and Ellis, J (2007) Your Project and its Outcomes, Charities Evaluation Services, London.
Davey, S, Parkinson, D and Wadia, A (2008)
PLA contacts
Hull and East Yorkshire Participatory Learning and
Action Network
c/o Community Development Company
The Community Enterprise Centre
Cottingham Road
Hull HU5 2DH
01482 441002
Institute of Development Studies
University of Sussex
Brighton BN1 9RE
01273 606 261
glossary
There are some technical
terms that are difficult to
avoid because of their wide
use in voluntary sector
management today and,
more particularly, within the
context of monitoring and
evaluation. These are
explained below.
A
Accountability: how much individuals or groups are held directly responsible for something, such as spending or activities.
Accuracy: the extent to which data and an evaluation are truthful or valid.
Achievement: performance by a project or programme demonstrated by some type of assessment or testing.
Activities: this usually means the main things your organisation does, often the services it provides.
Assessment: judgements about the organisation's performance.
B
Baseline data: facts about the characteristics of a target group, population and its context, before the start of a project or programme.
Benchmark: comparison of activities, processes or results with those already achieved by your organisation or by another organisation.
Bias: undue influence causing a particular leaning towards one view.
C
Coding: translation of a given set of data or items into categories.
Criterion, criteria: standard(s) against which judgement is made.
E
Effective: having the results
or effect you want; producing
the intended benefits.
Efficient: producing the
intended results with the
minimum necessary resources.
Evaluation: involves using
monitoring and other
information to make
judgements on how an
organisation, project or
programme is doing.
Evaluation can be done
externally or internally.
Executive summary: a non-technical summary which provides a short overview of the full-length evaluation report.
F
Facilitator: someone who
brings together and focuses
a discussion, encouraging
participation by group
members.
Feedback: presenting
findings to people involved
in the subject in a way
that encourages further
discussion and use.
Focus group: interview with
a small group focusing on
a specific topic.
Formative evaluation:
evaluation designed and
used to improve a project,
especially when it is still being
developed.
I
Impact: the effect of a
service on a wider society
than its direct users. This
can include affecting policy
decisions at government level.
Impact evaluation:
evaluation of the longer-term
effects of the project, relating
to overall purpose.
Implementation
evaluation: assessment of
programme or project
delivery.
Indicators: see Performance
indicators.
Informant: person providing information directly or indirectly.
Informed consent:
agreement given, before an
evaluation, to take part or
to the use of names and/or
confidential information, in
the light of known possible
consequences.
Inputs: resources and
activities which are used in
the organisation to create
the services offered.
Instrument: tool used for
assessment or measurement.
Integrated: built into and
part of a process or system.
Intermediate outcomes:
outcomes achieved in the
short term, but linking to
longer-term outcomes.
Internal evaluation:
evaluation carried out by
the staff of the project
being evaluated.
Interpretation:
understanding what the
data means in relation to
evaluation questions.
Intervention: service or
activity intended to change
the circumstances of an
individual, group, or physical
environment or structure.
L
Longitudinal study: a study
over a substantial period of
time to discover changes.
M
Management: the people responsible for the organisation; the techniques they use to run the organisation.
Matrix: an arrangement of rows and columns used to display information.
Mean: obtained by adding all scores and dividing by the total number.
Measurement: finding out the extent or quantity of something.
Median: the point in a distribution which divides the group into two, as nearly as possible.
Methodology: details of how the evaluation is carried out.
Mission statement: a short statement of the overall aim or purpose of the organisation, usually concentrating on the difference it wants to make and defining the values that it will work by.
Mode: the value that occurs more often than any other.
N
Needs assessment: identification of the extent and types of existing problems, services available and unmet needs.
O
Objectives: the practical steps the organisation will take to accomplish its aims.
Observation: direct examination and noting of processes, events, relationships and behaviours.
Outcome evaluation: evaluation of the intended and unintended effects of a project or programme.
P
Participatory evaluation: actively involving stakeholders in the evaluation process.
Partnership: an arrangement between organisations for joint action.
Performance indicators: well-defined, easily measurable information, which shows how well the organisation is performing.
PEST analysis: analysis of the political, economic, sociological and technical issues affecting the project.
Process evaluation: evaluation of how the project works, that is, its processes and its activities or outputs.
Profile: the characteristics of a group of people or an organisation.
Q
Qualitative evaluation: evaluation or part of an evaluation that is primarily descriptive and interpretative.
Qualitative information: see Qualitative evaluation.
Quantitative evaluation: an evaluation approach involving the use and analysis of numerical data and measurement.
Quantitative information: see Quantitative evaluation.
Questionnaire: a series of questions listed in a specific order.
R
Random sampling: selection of a smaller number of items from a larger group so that each item has the same chance of being included.
Ranking: placing things in order; used to identify preferences or priorities.
Recommendations: suggestions for specific appropriate actions based on evaluation conclusions.
Reliability: likelihood of getting the same results if procedures are repeated; therefore genuinely reflecting what you are studying.
Respondent: individual
providing information directly.
S
Sample: selection for study
of a smaller number of items
from a larger group.
Sample bias: error largely
due to non-response or
incomplete responses from
selected sample subjects.
Sampling error: where the
probability is that different
results might be obtained
from another sample.
Scale: presents respondents
with a range of possible
responses to a number of
statements.
Secondary data: data
collected for a different
original purpose.
Self-evaluation: when an
organisation uses its internal
expertise to carry out its
own evaluation; evaluation is
integrated into project
management.
Service: all the goods and
information you supply, and
things you do for your users
(and indirectly for purchasers).
Stakeholders: the people
who have an interest in the
activities of an organisation.
This includes staff, volunteers,
users and their carers,
trustees, funders, purchasers,
donors, supporters and
members.
Standard: an agreed level on
which to base an assessment.
T
Target: something to aim for;
it is a countable or
measurable result.
Terms of reference: detailed
plan for an evaluation.
Treatment: particular project
or programme activities or
services.
Trends: show changes over
time; can be used to plan
future services.
Triangulation: looking for
evidence or taking
measurements from different
points to increase reliability
and validity.
U
Unanticipated outcomes:
a result of an activity, project
or programme that was
unexpected.
Users: people who use the organisation's services.
Utilisation (of evaluations):
making use of evaluation
findings and
recommendations.
V
Validity: extent to which an
instrument measures what it
intends to measure.
Values: principles and basic
beliefs about what really
matters, that guide how
things should be done.
Variable: any characteristic
that can vary.
Y
Year plan: a one-year
budgeted plan, outlining
objectives and targets for
the organisation.