
GLOBAL KNOWLEDGE FOR GLOBAL CHANGE

TRAINING TOOLKIT

The monitoring and evaluation of information literacy training initiatives in Africa: a journey approach

[Logos: our sponsor and our contributors]


Table of contents
Acknowledgments ... 4

Why a toolkit?
  A resource for life ... 6
  The central role of training ... 7
  Who is the toolkit for? ... 8
  Information literacy definitions and standards ... 9

Toolkit principles
  An African initiative ... 14
  The M&E journey ... 15
  Using the toolkit to share good practice ... 16
  An evolving resource ... 16
  Toolkit adaptations ... 16

Toolkit practices
  The value of M&E ... 20
  Choosing an approach ... 23
  The demand for evidence ... 24

M&E in action
  Stage 1: Assessing needs ... 26
    Gathering data ... 27
    The significance of self-perception using surveys ... 29
  Stage 2: Programme strategy & objectives ... 33
    Pulling it all together ... 33
    The Logical Framework approach ... 33
    Theory of Change ... 34
    The training objectives/outcomes ... 37
    Considerations when setting your objectives (and outcomes) ... 37
    Thinking through outcomes and impact ... 38
  Stage 3: Identifying challenges ... 42
    Challenges from people ... 42
    Challenges from the environment ... 44
  Stage 4: Designing M&E ... 48
    Qualitative versus quantitative methods? ... 48
    Thoughts and feelings versus behaviour? ... 50
    Oral versus written versus visual data? ... 51
    Individual versus group evaluation? ... 51
    M&E design considerations ... 51
    Working in order ... 51
    Ease of comparison ... 51
    Flexibility and inclusivity ... 51
    Resource requirements ... 51
    Triangulation ... 52
    Pilot tools ... 52
  Stage 5: Establishing a baseline ... 55
  Stage 6: M&E during (and immediately after) training ... 58
    Assessing progress during training ... 58
    Assessing progress at the end of training (Post-training Assessments) ... 61
  Stage 7: Data analysis ... 64
    Using statistics for quantitative data analysis ... 64
    Finding patterns for qualitative data analysis ... 65
  Stage 8: Learning from M&E ... 74
    Factors affecting learning ... 74
    Designing learning cycles ... 77
    Formulating and documenting lessons learned ... 77
    Methods and tools for learning from M&E ... 78
    Who should be involved? ... 78
    Logistics of learning cycles ... 78
    Next steps in the learning cycle: inspiring action ... 79
  Stage 9: Communicating findings ... 82
    Determining your key messages ... 82
    Conveying your key messages ... 82
    Visualising quantitative data ... 82
    Visualising qualitative data ... 82
    The importance of transparency ... 84

Annexes
  Appendix 1: Glossary of assessment and evaluation methods and tools ... 88
  Appendix 2: M&E tools by stage ... 92
  Appendix 3: Suggested criteria for describing good practice in monitoring and evaluation ... 99
  Appendix 4: References ... 100



Tables
  Table 1: Types of information contained in a Logframe ... 35-36
  Table 2: Questions for addressing challenges ... 43-44
  Table 3: Qualitative vs Quantitative methods ... 49
  Table 4: Changes in thinking vs Changes in behaviour ... 50
  Table 5: Skills self-assessment scores from three surveys and their averages ... 67
  Table 6: Participant scores on how well they identify the learner- or teacher-centred approaches ... 69
  Table 7: Participant scores on how much they value learner- or teacher-centred approaches ... 70
  Table 8: Individual responses and their means ... 72
  Table 9: Tips for enhancing learning ... 75-76

Figures
  Figure 1: The M&E journey ... 15
  Figure 2: A matrix of M&E approaches ... 23
  Figure 3: GDNet's Theory of Change ... 39
  Figure 4: IDS Knowledge Services Theory of Change (Downie, 2008) ... 40
  Figure 5: Stakeholder Analysis from Research Council, ZimLA and IDS Workshop ... 46
  Figure 6: Screen grab of first step in using T-Tests in MS Excel ... 72
  Figure 7: Completed T-Test box in MS Excel ... 73
  Figure 8: An example of a Concept Map ... 83

Flowcharts
  Flowchart 1: Assessing needs ... 32
  Flowchart 2: Setting objectives ... 41
  Flowchart 3: Challenges in monitoring and evaluation ... 47
  Flowchart 4: Design your M&E process ... 54
  Flowchart 5: Establishing your baseline ... 57
  Flowchart 6: M&E during and immediately after training ... 63
  Flowchart 7: Learning from M&E ... 81
  Flowchart 8: Communicating findings ... 85

Case studies
  Case study 1: Improving evaluation through practitioner participation ... 24-25
  Case study 2: Outcomes mapping ... 25
  Case study 3: An institutional approach to assessing information literacy needs in southern Africa ... 30
  Case study 4: Using stakeholder interviews to develop an information literacy national curriculum in Zimbabwe ... 31
  Case study 5: How GDNet applied Theory of Change ... 39
  Case study 6: How IDS incorporated feedback loops in its Theory of Change ... 40
  Case study 7: Carrying out a stakeholder analysis ... 45
  Case study 8: Checking for survey consistency before and after a workshop ... 53
  Case study 9: Using questionnaires as a diagnostic, to determine distance travelled and to measure progress ... 56
  Case study 10: A flexible approach to formative assessment ... 62
  Case study 11: Using distance travelled ... 67-68
  Case study 12: Using self-evaluation for team learning ... 80
  Case study 13: Using video to communicate to our donor ... 84

Boxes
  Box 1: Definitions: Information Literacy (IL) ... 6
  Box 2: Definitions: Training in Information Literacy ... 8
  Box 3: Resources / Recommended reading ... 10
  Box 4: The Centurion Workshop & IL in the African context ... 11
  Box 5: Definitions: Outputs, outcomes and impact ... 21
  Box 6: Definitions: Approaches, methods, tools & technologies ... 22
  Box 7: Definitions: Making objectives / outcomes SMART ... 37
  Box 8: Guidance: Carrying out thematic analysis ... 66
  Box 9: Guidance: The T-Test and how to apply it ... 71
  Box 10: Guidance: Using the T-Test in Microsoft (MS) Excel ... 72-73



Acknowledgments
This toolkit is based on research commissioned by the information literacy training programme
under the British Library for Development Studies (BLDS) at the Institute of Development Studies
(IDS) and funded by the Department for International Development (DFID).
This toolkit is largely the product of a consultative workshop held in Centurion, South Africa, on 15-17 February 2012. The workshop was framed around a series of interactive sessions with input from participants from the ABC Project Kenya, Aga Khan University Kenya, University of Botswana, Ethiopian AIDS Resource Centre, the Information Training and Outreach Centre for Africa (ITOCA) (South Africa), University of Johannesburg, Knowledge Leadership Associates, University of Malawi and Savana Signatures (Ghana).

The British Library for Development Studies, part of the Institute of Development Studies, has relied on a number of their partners, without whom this work would not have been realised. The list below does not do justice to the people involved or their contributions, nor is it a comprehensive list of everyone involved, but nevertheless Siobhan Duvigneau, Information Literacy Manager, and BLDS would like to draw attention to the following key contributors: John Stephen Agbenyo, Babakisi Fidzani, Blessing Chataira, Boipuso Mologanyi, Daniel Mangale, Joseph Kpetigo, Julia Paris, Lenny Rhine, Linda Mbonambi, Nevermore Sithole, Peter Gatiti, Peter Underwood, Tenege Tesfaye and Thomas Bello. We thank you all for your valuable insights and contributions to this toolkit. This toolkit could not have been written without your participation and a genuine willingness to share your experiences.

We would also like to thank Dr Mark Hepworth, Dr Orla Cronin and Stéphane Goldstein for their dedicated support during the production and refinement of the toolkit. Finally, special thanks go to Naomi Marks, Penelope Beynon, Emma Greengrass and Jagdeep Shokar for their support in Centurion and IDS.

This version is intended as a first iteration of the toolkit, for circulation to all respondents and partners to provide an opportunity for comment.



WHY A TOOLKIT?

Helping people engage effectively in their studies, their work, their communities and their personal lives.

Contents

6 A resource for life
What is information literacy and why do we need it?

7 The central role of training
A nurturing approach to lifelong learning.

8 Who is the toolkit for?
Three questions to reflect upon…

9 Information literacy definitions and standards
Why putting in the groundwork leads to better M&E.


A resource for life

Information literacy is the ability to acquire information, to interpret it and to treat it in an intelligent and critical manner. It is a vital skill for active citizens in all sectors of society and throughout all stages of life. It is a fundamental and integral part of learning, and its application goes way beyond the confines of formal education. Information literacy allows people to engage in effective decision making, problem solving and research. It enables them to take responsibility for their own continued learning, whether in their personal or professional spheres, or both.

Information skills, and familiarity with the uses of information, may be acquired during study at school, college or university, but they come to be applied, honed and adapted in later professional and/or community life. This continuous refinement of information skills is central to fostering well-rounded students, workers, business people, community leaders and others who are confident in their approach to information.

Regardless of any precise definition of information literacy (see Box 1: Definitions of Information Literacy), this toolkit is designed to help you to monitor and evaluate your information literacy training, and it is relevant to the wide range of competencies (i.e. aptitudes, attitudes and capacities) that underpin the truly information literate citizen.

Box 1: Definitions: Information Literacy

Definitions of information literacy differ in nature and scope, and vary from the straightforward to the aspirational. Here are a few recent examples of attempts to pin down exactly what is meant by the term:

• "Information literacy is the set of skills needed to find, retrieve, analyse, and use information" – Association of College and Research Libraries (ACRL) in the US.

• "Information literacy is knowing when and why you need information, where to find it, and how to evaluate, use and communicate it in an ethical manner" – Chartered Institute of Library and Information Professionals (CILIP) in the UK.

• "Information literacy lies at the core of lifelong learning. It empowers people in all walks of life to seek, evaluate, use and create information effectively to achieve their personal, social, occupational and educational goals. It is a basic human right in a digital world and promotes social inclusion" – the 2006 Alexandria Proclamation on Information Literacy and Lifelong Learning, cited by UNESCO in its 2007 publication Understanding Information Literacy: a Primer.

Hepworth and Walton consider information literacy as: "A complex set of abilities which enables individuals to engage critically with and make sense of the world, its knowledge and participate effectively in learning and to make use of the information landscape as well as contributing to it."

Participants in the consultative workshop for this toolkit (see Box 4: The Centurion Workshop and Information Literacy in the African Context) proposed a further definition: "Information literacy is the ability to recognise the need for information, knowing how to find it, manage it and integrate it into your knowledge base, and communicate/apply it."



The central role of training
Information literacy is a generic, non-specific subject,
based on a range of aptitudes, attitudes and capacities.
It is best acquired through training, in a nurturing
environment where these competencies may be tried
and tested safely. It is developed through experience,
and adapted, improved and augmented as necessary for
the individual/s being trained. The nature and quality of
information literacy training interventions are therefore
crucial (see Box 2: Training in Information Literacy).
This toolkit refers to measuring the competencies (knowledge, skills, attitudes, confidence and behaviours) of information literate individuals. It discusses the reasoning behind measuring indicators in these areas and the difficulties of doing so, and it offers tools, intended to complement those that already exist, to help with this measurement.



Who is the toolkit for?
This toolkit is for you if you design and/or run information literacy training, i.e. you are a trainer. As a trainer you may work in a school, university or community library. You may, alternatively, be a specialist working for a non-governmental organisation (NGO). Or you may work in a different type of organisation – because information literacy training takes place in a wide variety of contexts.

Wherever training takes place though, this toolkit serves as an easy-to-use, practical aid to help you monitor and evaluate your training. It is designed to help you identify and demonstrate the usefulness and value of your training, and to continuously improve it.

It does this by prompting you to reflect on three broad sets of questions:

1. What are you trying to achieve? Considering the problems you are trying to address and the needs of your target audiences.

2. How will you know you have achieved what you set out to do? Considering how you monitor and evaluate the quality, outcomes and impact of your training initiatives, and the methods you use to identify evidence of the impact of your training.

3. Why are you monitoring and evaluating your programme? Considering who will use this M&E information and in what context, and how you can use this information to inform your decisions about future training.

This toolkit is, though, also for you if you are a recipient of training, i.e. a trainee, or if you work for an organisation that stands to benefit from training. In fact, all stakeholders who stand to benefit from citizens who are increasingly information literate will find this toolkit valuable.

Box 2: Definitions: Training in Information Literacy

Training in information literacy varies widely in approach and style according to the context, sector and organisational structure in which it takes place. Training may include:

• Face-to-face courses delivered to groups of learners in a classroom setting

• One-to-one tuition, printed manuals and guidance

• E-learning, delivered through a range of media, including interactive online materials, webinars, Skype and mobile devices, as well as offline DVDs

This list is not exhaustive, and different approaches, techniques and media may often be used in combination. The style and structure of your training may influence the choice of methods you use to monitor and evaluate it.



Information literacy definitions and standards
The definition of information literacy has been a subject of debate and is recognised as having a broader meaning than that described by librarians (such as library orientation or information seeking skills)1.

Information literacy is context-specific. For example, the requirements for information literacy as an active citizen will differ from the requirements for information literacy in an education setting. It therefore demands a different set of indicators, leads to different outcomes and has a different purpose, i.e. impact. For instance, the information literacy competencies required in a household, such as identifying reliable information about childcare leading to effective parenting, will be different to those needed in a higher education setting, for example being able to conform to the norms of academic practice, such as explicitly respecting other people's intellectual property, or in a place of work, where the ability to share information may be a key attribute.

The underlying factor in all of these settings is that an information literate person is prepared and has the confidence and motivation to be a lifelong learner2. They consciously value the role of information in learning and will possess the know-how (knowledge) to "know how to learn because they know how knowledge is organised, how to find information and how to use information in such a way that others can learn from them"3.

When you design an information literacy programme it is crucial that you set up a system that is based on valid indicators for measuring the impact and successful rollout of your training programme. Information literacy is, on the one hand, a generic competency which underpins the acquisition and use of information to resolve problems and make decisions in social, occupational and educational settings. On the other hand, information literacy can vary according to the significance placed on specific forms of information literacy. Information literacy can also be viewed from an individual, organisational or social perspective. It is therefore essential that your M&E model focuses on assessing both the generic and the specific skills and competencies of an individual, in addition to their attitudes, values and behaviours4, and that these relate to the broader context within which these capabilities are applied.

Many international and national institutions have proposed models and frameworks for evaluating information literacy. They define valid indicators and levels of competence for measuring information capabilities in development, health and welfare, civil society, higher education and employability. The documents are prepared and published by international bodies such as the International Federation of Library Associations (IFLA) and UNESCO, as well as country associations such as the Association of College and Research Libraries (ACRL), the Institute for Information Literacy (IIL), the National Forum on Information Literacy (NFIL), the Information Literacy Community of Practice (ILCOPSU), the Chartered Institute of Library and Information Professionals (CILIP), the Society of College, National and University Libraries (SCONUL), the Joint Information Systems Committee (JISC), NordINFOLIT, and the Australian and New Zealand Institute for Information Literacy (ANZIIL).

Although we won't go into the detail of each of these standards, models and frameworks, we strongly recommend that you familiarise yourself with them. They have significant value in defining the indicators and competency levels of an information literate individual and will help you design your evaluation approach.

The international debate on the best system for evaluating information literacy competencies has resulted in the development of tools and instruments that facilitate the evaluation of information literacy programmes by level. Instruments include Project SAILS, the CAUL information skills survey, and the LAMP and PISA programmes. A number of references have been provided in Box 3: IL Standards, Models and Frameworks to help you locate these important documents, models and frameworks.

1. Information skills survey: its application to a medical course, Evidence Based Library and Information Practice, 2007, 2:3.
2. Information skills survey: its application to a medical course, Evidence Based Library and Information Practice, 2007, 2:3.
3. ALA, Final Report from the Presidential Committee on Information Literacy, Washington DC, 1989.
4. Evaluation of information literacy programmes in higher education: strategies and tools, Miguel Angel Marzal Garcia-Quismondo, Monograph "Information and digital competencies in higher education", July 2010.



Box 3: Resources / Recommended reading

IL Standards, Models and Frameworks

Information Literacy Standards (Various) – Association of College and Research Libraries: http://www.ala.org/acrl/standards (accessed 23 Feb 2013)

Australian and New Zealand Information Literacy Framework: principles, standards and practice, A Bundy. http://www.library.unisa.edu.au/learn/infolit/Infolit-2nd-edition.pdf (accessed 23 Feb 2013)

The SCONUL Seven Pillars of Information Literacy (Core Model for Higher Education). http://www.sconul.ac.uk/sites/default/files/documents/coremodel.pdf (accessed 23 Feb 2013)

'Towards information literacy indicators', a conceptual framework, R Catts and J Lau. http://www.ifla.org/files/assets/information-literacy/publications/towards-information-literacy_2008-en.pdf (accessed 23 Feb 2013)

'The big blue – information skills for students' – Final report, JISC. http://www.jisc.ac.uk/media/documents/programmes/jos/bigbluefinalreport.pdf (accessed 23 Feb 2013)

Project SAILS – https://www.projectsails.org/ (accessed 23 Feb 2013)

LAMP – Literacy Assessment and Monitoring Programme. http://www.uis.unesco.org/literacy/Pages/lamp-literacy-assessment.aspx (accessed 23 Feb 2013)

CAUL – International resources in information literacy. http://www.caul.edu.au/caul-programs/information-literacy/information-literacy-resources/international-resources (accessed 23 Feb 2013)

OECD's PISA (Programme for International Student Assessment). http://www.oecd.org/edu/school/programmeforinternationalstudentassessmentpisa/ (accessed 23 Feb 2013)



Box 4: The Centurion Workshop and Information Literacy in the African context

This toolkit is largely the product of a consultative workshop held in Centurion, South Africa, on 15-17 February 2012. The workshop was framed around a series of interactive sessions with input from participants from the ABC Project Kenya, Aga Khan University Kenya, University of Botswana, Ethiopian AIDS Resource Centre, the Information Training and Outreach Centre for Africa (ITOCA) (South Africa), University of Johannesburg, University of Loughborough (UK), Knowledge Leadership Associates (South Africa), University of Malawi, Researchers Information Network (UK) and Savana Signatures (Ghana).

Although participants formulated their own understanding of information literacy (see Box 1: Definitions of Information Literacy), we need not worry here about a perfect definition of the concept. Rather, we need to stress the importance of learners' abilities to make judgments about the quality of the information they use, and about how they might most appropriately use different kinds of information in different contexts. It may be helpful if we think in terms of the knowledge, skills, competencies and behaviours required in different contexts in Africa, and how they apply to learners – not only during their formal education, but throughout their lives in their varied roles as citizens and active members of society.

Below are areas of information literacy identified as important by respondents to a survey carried out before the Centurion workshop, as well as by participants at the workshop itself:

• The nature of information
• General use of library resources
• Search and discovery, including 'smart' use of the Internet
• Evaluating sources and resources
• Referencing
• Citation
• Legal issues, including copyright and plagiarism
• Writing and authorship
• Defining research topics, and conducting research
• Communication and presentation
• Presentation and packaging of information for different audiences
• Visualisation and infographics

But this list is far from exhaustive, particularly if information literacy is understood as broader than the traditional, library-centred view based on finding materials and deploying bibliographic skills.

The Centurion workshop also proposed other skills and competencies that trainers would like to see included as contributing to information literacy in Africa. For instance:

• Information literacy in the research process
• Management, stewardship and preservation of information and data
• Synthesis and integration of data and information
• Use and implications of social media in the research process
• Publishing and promoting the results of research
• Epistemology
• Forms of knowledge and its representation
• Relationships between data, information, knowledge and understanding
• Building knowledge communities
• Information literacy professional development
• Bibliometrics
• Preservation of research outputs
• Information literacy in the community
• Advocacy for community organisations and for small and medium-size enterprises (SMEs)



TOOLKIT PRINCIPLES

How a 'virtuous' journey can help clarify what you do, how you do it and how to make it better next time.

Contents

14 An African initiative
Participatory, responsive and ongoing.

15 The M&E journey
Why the learning process never ceases.

16 Using the toolkit to share good practice
Help us build a repository of good M&E.

16 An evolving resource
Looking forward to a future of adaptation and refinement.

16 Toolkit adaptations
Spreading good practice, acknowledging roots.


An African initiative
This toolkit was developed in response to specifically African needs and concerns. It is the work of European partners (Institute of Development Studies (IDS), Research Information Network, Orla Cronin Research and University of Loughborough) in close consultation with committed individuals from a range of African institutions, both academic and non-academic.

Initial consultation for the toolkit, in the form of a questionnaire-based survey, took place in November 2011. There were over 30 responses from across Africa and beyond, and these formed the basis for a consultative workshop in Centurion, South Africa (see Box 4: The Centurion Workshop and Information Literacy in the African Context). Workshop participants agreed to act as an informal network to help validate the toolkit as it was developed, and an early draft of the toolkit was circulated for comment in March 2012. A fuller version was circulated in May 2012. Feedback from those consultations has served to refine the toolkit to its current form.

Contributors to the M&E Toolkit (see Acknowledgements, page 4)



The M&E journey
This toolkit takes the form of a 'journey'. It starts before any training is designed and delivered, with an investigation of the current state of information literacy in the target group and a determination of their training needs. It moves on through the process of making decisions on training interventions, implementing practical solutions and analysing the wider impact of these interventions within organisations and beyond. As this, in turn, can lead to the emergence of a new baseline (i.e. the new capabilities of individuals or organisations), which can lead to the determination of new needs, and so on, the journey never ends – and so it starts to take the form of a continual, and virtuous, cycle.

There are nine stages for each complete cycle of the M&E journey (see Figure 1: The M&E journey), and the later part of the toolkit follows these nine consecutive stages.

Completing all nine stages will provide you with the necessary know-how to understand and/or to conduct M&E. It will also enable you to interpret and use the results of the M&E process to improve or adapt future training interventions and to raise the level of information literacy for your target audience – so making your own particular M&E journey a 'virtuous' one.

[Figure 1: The M&E journey. A circular cycle of nine stages, grouped into pre-training, during-training and post-training phases and rippling out from the individual to the organisation to wider society: 1. Assessing needs; 2. Programme strategy and objectives; 3. Identifying challenges; 4. Designing M&E; 5. Establishing a baseline; 6. M&E during training; 7. Data analysis; 8. Learning from M&E; 9. Communicating findings.]


Using the toolkit to share good practice

In the same way you are likely to use practical examples of good information literacy practice in your training interventions, so we will, in this toolkit, use practical examples of M&E in information literacy training. We hope these case studies will illustrate to you the many different ways you can monitor and evaluate information literacy training and perhaps inspire you to adopt or adapt them as you see appropriate.

You can help us build a repository of good M&E practice in information literacy training so we can continue to share and learn and develop. At Annex 3, a template is provided which you can use to describe your intervention and the way in which you monitored and evaluated it. In this way, as a community we can continue to build good practice M&E for information literacy.

An evolving resource

This toolkit has been designed to serve a range of African audiences. In its current state, it is broad and generic in nature, and it can be applied to training interventions in a wide range of settings. There is no reason, though, why it could not be developed into a more sophisticated instrument. Different versions could evolve for more specialist use by specific groups of users, e.g. in different geographical locations or working in different domains such as health. Indeed, we anticipate that the toolkit will develop in the light of suggestions received from trainers and others who use it.

Having stated this, it is important to keep in mind that the toolkit was designed thanks to the active participation of a range of stakeholders in Africa. As they are the end-users and beneficiaries, it is right and proper that the toolkit is 'owned', presented and disseminated collectively by these same stakeholders. Theirs will be a key role in the toolkit's further development and refinement, its use as a dynamic and interactive resource, and its possible adaptation for more specialist purposes.

Toolkit adaptations

Monitoring & evaluation is a generic concept and can therefore be applied to a variety of different contexts and disciplines. As such, the toolkit could well be adapted to suit your institution and individual needs.

For example, the University of Johannesburg (UJ) is planning to create a training course based on the contents of the toolkit. This programme is intended for UJ faculty and information librarians as part of their formal training academy in 2013. The adaptation and customisation of the toolkit for training purposes offers BLDS / IDS a valuable opportunity to identify how different institutions test the usefulness of the toolkit.

We encourage institutions to adopt these approaches but kindly request that you credit IDS and inform [email protected] of your intention so that we can engage with your institution.



TOOLKIT PRACTICES

From a run-through of the theory to a step-by-step guide to practising successful M&E.

Contents

20 The value of M&E
A simple story to explain these two terms.

23 Choosing an approach
Making sense of M&E approaches.

24 The demand for evidence
Demonstrating progress or success.

26 M&E in action
Stage 1: Assessing needs
Stage 2: Programme strategy & objectives
Stage 3: Identifying challenges
Stage 4: Designing M&E
Stage 5: Establishing a baseline
Stage 6: M&E during (and immediately after) training
Stage 7: Data analysis
Stage 8: Learning from M&E
Stage 9: Communicating findings



The value of M&E
There is often confusion regarding monitoring and evaluation. A simple story can help to clarify these two terms:

A group of women decide to have a community feast to celebrate the harvest. They get a grant from a community group, but that grant comes with strings attached: they must monitor and evaluate the feast.

So what does this mean?

The women's monitoring efforts will be a way of checking that the feast actually happened, as intended. They may monitor the inputs, for example which ingredients they purchased or had donated, and the amount of time it took to prepare the dishes. They may monitor the outputs, for example how many invitations were sent and how many people turned up to the feast.

However, they will evaluate the outcomes – the actual effects of the feast, for example did people enjoy themselves, did the feast bring the community together, were new relationships formed and do people want to do it again next year?

This story shows that we need to monitor for an evaluation to make sense. If the feast was a disaster, it might have been simply because no-one knew it was happening. However, we can also see that the monitoring data "we cooked seven pots of rice" doesn't tell us a great deal on its own.

We can see from this that monitoring and evaluation are related concepts, each involving different processes. Though the borders between monitoring and evaluation can become blurred, they tend to vary in terms of their purpose, timings, who carries them out, who they are 'done to', the methods used, how they are implemented, and the meaning and implications of the conclusions.

In the context of information literacy training, monitoring is generally an ongoing activity, involving the regular collection of data on the effectiveness and efficiency of a given training intervention. Data are gathered during the training itself, to check progress and to collect evidence that what was planned is in fact being delivered, and that people are attending. Monitoring tends to focus on outputs – the actual delivery of the intervention, e.g. number of trainees, number of contact hours etc.

Evaluation focuses on outcomes and impact, and is therefore more periodic. Evaluation takes place both during and after the training intervention, to examine whether it has met its objectives, i.e. whether the anticipated benefits are being or have been realised at an individual or organisational level. It may involve the evaluation of both process (how the training was or is being delivered) and outcomes and impact (how much improvement is evident, how much the improvement varies between different learners, whether there are changes evident that were not anticipated, what effect the training had on the broader environment of the trainees and so on).

The aim of evaluation is to examine the nature, scale and significance of any improvements produced by your training interventions. It is also to identify any changes that could improve your training, either during the course of the training itself, or in the future.

For more on outputs, outcomes and impact, see Box 5: Outputs, Outcomes and Impact.
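The feast story also shows what this distinction means for the data you keep. As a purely illustrative sketch (ours, not the toolkit's; all names and values are invented), the snippet below records monitoring data (inputs and outputs) separately from evaluation data (outcomes):

```python
# Monitoring: did the feast happen as planned? (inputs and outputs)
monitoring = {
    "inputs":  {"pots_of_rice_cooked": 7, "preparation_hours": 12},
    "outputs": {"invitations_sent": 120, "people_attended": 85},
}

# Evaluation: what did the feast achieve? (outcomes)
# Each surveyed guest answered "Did you enjoy yourself?" and
# "Would you come again next year?"
evaluation = [
    {"enjoyed": True,  "would_return": True},
    {"enjoyed": True,  "would_return": False},
    {"enjoyed": False, "would_return": False},
]

# Monitoring data alone ("we cooked seven pots of rice") says little;
# combined with outcome data it supports an evaluative judgment.
attended = monitoring["outputs"]["people_attended"]
enjoyed = sum(g["enjoyed"] for g in evaluation) / len(evaluation)
print(f"{attended} attended; {enjoyed:.0%} of surveyed guests enjoyed it")
# Impact (longer-term effects on the wider community) would need
# follow-up data gathered well after the event.
```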



Box 5: Definitions: Outputs, Outcomes and Impact

Outputs refer to the extent to which a training intervention was delivered. For example, outputs may include the number of contact or study hours achieved, and the number of trainees who attended (alongside some information about the trainees, e.g. their gender).

Outcomes refer to the knowledge and skills trainees have acquired that indicate information literacy, critical thinking and independent learning.

For example, in an academic environment, outcomes could include the ability to:
• Define information needs (e.g. by listing key terms or concepts, or creating mind-maps)
• Incorporate a broad range of academically authoritative sources in reports
• Provide evidence of synthesising different viewpoints
• Reflect critically on other people's contributions
• Include detailed references and bibliography

In a community setting, outcomes could include knowledge of authoritative information sources and the ability to generate information drawing on other information.

Outcomes may also be defined as what trainees can do as a result of what they have learned, notably with regards to changes in attitude, confidence, behaviour, performance and practice. For example, outcomes could include:
• The ability to use a wider range of vocabulary to describe and evaluate an information source
• The use of more sophisticated methods to search for information, such as the use of Boolean logic
• An awareness of what information is required to resolve a problem and how that information will be applied
• A positive attitude toward the opportunity to learn independently
• An ethical approach to the use and application of information

Impact indicates the actual effects training interventions have on the broader environment in which trainees operate, and the extent to which trainees are able to influence that environment as a result of the training. In information literacy training, impact relates to whether the information a person can now process is being applied or used in a way that enables people to achieve their objective/s.

An example of this might relate to trainees achieving good academic results. Has this enabled the individual to participate in a community of practice, build on existing knowledge, create new knowledge and/or tackle shared challenges? Has it enabled their organisation to take advantage of opportunities in the market place or the community to advocate for resources? Has it meant they can now share information and knowledge with others in a similar context to achieve common goals? Has it enabled the individual to make evidence-informed decisions or use evidence in policy-making processes? Has it helped the individual improve their research capacity?



Well-planned and rigorous M&E can help you to:
• Clarify the objectives of your training interventions
• Track the delivery and outputs of your training interventions
• Establish whether you have met your objectives (by examining outcomes and impact)
• Learn how to improve your interventions, generate new ones or advocate for resources.

A note on terminology: on occasions, when we are discussing methods which relate very specifically to evaluation of training, we have used the term assessment. This is not a general M&E term, but is specific to training and other educational contexts. We see assessments as being part of the contract between trainer and trainee. An assessment is unlikely to be anonymous, and the trainee will be given their results so they can see what progress they are making. Assessment data can be aggregated and used for M&E, and many assessment methods in fact lend themselves well to M&E.

In this toolkit, we distinguish between M&E approaches, M&E methods, M&E tools and M&E technologies, as these can often cause confusion (see Box 6: Approaches, Methods, Tools and Technologies). We will learn more about methods, tools and technologies later. For now, we will consider approaches.

Box 6: Definitions: Approaches, Methods, Tools and Technologies

To make it easier to compile the toolkit, we have adopted the following definitions in order to cluster similar issues together. You may find these terms used slightly differently within other M&E resources, but what is important is that you are aware of these different elements of your M&E plan.

An M&E approach is its 'philosophy', referring to the way in which the monitoring and evaluating is designed and implemented. Different approaches embody different values and judgments, with their structure and focus encompassing what they regard as important to evaluate.

Methods (or methodology) are the way in which you actually collect data. Interviewing is a method, as is a survey.

Tools are elements of methods. A particular method, e.g. a focus group, may involve the use of a particular tool within it as an exercise or activity, e.g. using a focused-discussion tool (e.g. ORID), creating a collage, or conducting a word association activity.

Technologies are the physical ways in which you collect the data. Paper and pencil is a technology, as is iClicker (an instant response device). Electronic technologies, such as email, web and mobiles, enable us to implement particular methods and tools more efficiently, creatively or cheaply.
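One way to keep these four layers from blurring is to write your M&E plan down in a structured form. The sketch below is illustrative only (the plan and all of its entries are our own invention, drawing on examples named in Box 6); it simply shows approach, methods, tools and technologies as separate, nested choices:

```python
# A hypothetical M&E plan structured around the Box 6 vocabulary.
me_plan = {
    "approach": "participatory evaluation",   # the 'philosophy'
    "methods": [                              # how data are collected
        {
            "method": "focus group",
            "tools": ["ORID focused discussion", "word association"],
            "technologies": ["paper and pencil"],
        },
        {
            "method": "survey",
            "tools": ["five-point self-assessment scale"],
            "technologies": ["email", "mobile"],
        },
    ],
}

# Listing the plan makes gaps obvious, e.g. a method with no tool chosen.
for m in me_plan["methods"]:
    print(f'{m["method"]}: tools={m["tools"]}, via {m["technologies"]}')
```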



Choosing an approach
When you choose an approach you need to consider the series of values and judgments that come with it. This is because approaches often embody assumptions about what or who the evaluation is for, and how outcomes and impacts should be framed. For example, some approaches borrow from the private sector or the discipline of economics and concentrate on 'value for money' or 'social return on investment'. These approaches concentrate heavily on outcomes, but are not particularly interested in how these have been achieved.

Recently in the development sector, greater emphasis has been placed on the process which unfolds during an intervention. This may include considering unanticipated outcomes such as the changes in relationships that occur during an intervention, or the changes in behaviour. There is also now an emphasis on a participative approach to M&E. In this, all stakeholders are actively involved in the M&E process.

To make sense of the many M&E approaches in circulation, it is useful to try to classify them according to their different values. The matrix below (see Figure 2: A matrix of M&E approaches) is an attempt to do this. It shows whether approaches are more concerned with process ('Improvement focus' on the matrix) or outcomes ('Accountability'), and whether M&E is 'done with' or 'done to' participants.

[Figure 2: A matrix of M&E approaches. The vertical axis runs from doing "with" participants (participatory evaluation: participatory action research, appreciative inquiry, rights-based evaluation, outcomes mapping) to doing "to" participants (results-based evaluation: social return on investment, cost-benefit analysis); the horizontal axis runs from an improvement focus to accountability.]

Clarifying your values before you start gathering data is a useful exercise no matter who you are. However, it becomes particularly important where a number of different people may be involved over a longer period of time, where the values may get lost. This is particularly a risk when creating organisation-wide M&E systems which are expected to last for a number of years and be applicable to a range of different interventions.


The demand for evidence
The need to use resources effectively and efficiently, especially where resources are limited, has occupied strategists, policymakers and practitioners for generations. The demand for evidence of the progress or success of a project or programme is great, and in fact it is currently increasing. The area of information literacy training is no different from other areas in experiencing this.

Donor organisations and others who stand to win or lose from information literacy training (stakeholders) seek evidence because it may, for example:
• Help them decide whether to continue funding an intervention
• Help them choose between competing demands for funding
• Establish whether their financial resources are well spent

Depending on whether they are seeking proof or understanding or learning – or a combination of the three – they may favour different approaches to M&E. For example, most donors are interested in supporting interventions that provide sustainable and long-term solutions. They will expect the M&E evidence to demonstrate how the training beneficiary will apply the information literacy competencies in their work, either through interview statements or survey results. In addition, some donors may be interested in learning and increasing awareness of the institutional, environmental or personal challenges a trainee might face in acquiring or using information. They will be particularly interested in the learning outputs (e.g. reports or publications) and expect the project team to share this information with others in the information literacy / capacity building field.

Case study 1: Improving evaluation through practitioner participation

The IDS Impact and Learning team (ILT) is responsible for supporting IDS's Knowledge Services (KS) department to undertake planning, evaluation and learning. ILT is always looking for new and creative approaches to make planning, evaluation and learning more engaging and effective and also for ways to embed these processes in the everyday activities of KS staff and their partners. In 2010, members of ILT developed a process called Facilitated Self Evaluation (FSE) with the following three objectives:
• To improve colleagues' understanding and experience undertaking evaluation
• To improve the quality of project evaluations within the department
• To improve the value that these evaluations deliver for people 'working on the ground'

FSE is a process for undertaking participatory evaluation of products delivered in partnership. The whole evaluation process is facilitated by an independent person and is carried out by a practitioner team with first-hand experience delivering the project being evaluated. It results in evidence-based evaluative judgments that can lead to quick improvements in project delivery.

Specifically, the FSE process is built on three concepts:

• Facilitated – an evaluation expert supports the project team to undertake every stage of the evaluation. The facilitator's role is to advise on appropriate tools and approaches that will increase the rigour of the evaluation design, data collection and analysis. In particular, the facilitator supports the team to use critical questioning approaches to challenge each other to remain true to the evidence (mitigating bias), to use evaluative thinking approaches to push analysis to a deeper level (maximising the depth of insights) and to remain focused on the evaluation questions (mitigating scope creep). It is helpful for the facilitator to have some understanding of the project and the particular field, but not essential.

• Self – a practitioner team with first-hand experience delivering the product or project under scrutiny has primary ownership for the evaluation and carries out all the activities in every phase of the evaluation process (design, data collection, analysis and reporting). The team has the deepest understanding of their project history, rationale, assumptions and information needs, and an evaluation designed and undertaken by its members will have greatest relevance to and impact on their work. The team usually consists of six to 10 people who can commit five to 15 days throughout the FSE. A mixed practitioner team made up of staff members from both sides of a project partnership and from varied roles and levels of authority helps bring different perspectives to the evaluation and an element of challenge that can mitigate bias and 'group think'.

• Evaluation – the FSE process is fundamentally about making evidence-based judgments. FSE assumes that project teams undertake a multitude of review activities every year (for strategic review, reporting to donors, future planning etc) but that many of these processes rely solely on one source of evidence (e.g. the practitioner team's observations), lack research rigour and are framed to serve a particular agenda (e.g. to justify continuation of funding). When undertaking FSE, teams can test and triangulate their personal experiences of delivering a project with evidence gathered through rigorous design, collection and analysis methods, so increasing the validity of their own experiential conclusions. An evaluation undertaken using an FSE approach should stand up to scrutiny by an independent reviewer.

In practice, the full FSE process takes 12-16 weeks. During that time, a facilitator will guide a practitioner team through at least two face-to-face workshops: an inception workshop, to set out the evaluation design, and an analysis workshop, to bring together all of the data that has been collected and to draw some evaluation conclusions. There will be a period of data collection, which could include surveys, interviews, gathering website statistics etc. Finally, the facilitator will set the team on the path towards reporting its findings. FSE takes courage and commitment from a practitioner team. Members need to commit the necessary time to the process and be open-minded about discovering successes and failures within their work.

• You can read more about FSE in the forthcoming IDS Practice Paper in Brief: http://www.ids.ac.uk/publications/ids-series-titles/practice-papers-in-brief
• For more on FSE, see Stage 8: Learning from M&E.

Case study 2: Outcomes mapping

There are relatively few examples where particular approaches have specifically been applied to information literacy interventions. However, in 2012, a project funded by IDS and facilitated by the Department of Information Studies at Loughborough University, involving the University of Botswana, the University of Zambia and Mzuzu University in Malawi, used an Outcomes Mapping approach to M&E to help identify current activities, future aspirations and activities, and challenges associated with developing information literate, critical thinking, independent learners.

Outcome Mapping is a participative M&E approach which places particular emphasis on the 'journey', i.e. the process, and the changes and the relationships formed during that process (see Case study 3: An institutional approach to assessing information literacy needs in southern Africa).



M&E in action

Stage 1: Assessing needs

? Key questions: Assessing your trainees' needs
• What do potential trainees know and understand now? What are they able to actually do now?
• What attitudes do trainees possess? Do they value information literacy, and understand its role in achieving their social, occupational or educational goals?
• What do potential trainees need to know, understand and be able to do to fulfil their roles and tasks?
• What organisational goals are you hoping the training will address?
• What must be achieved to meet trainees' needs, and the organisation's or wider community's goals?
• What is the most effective way of reaching potential trainees?
• What has already been done or achieved? What information literacy training interventions have already taken place?

You may feel you have a good understanding of your training cohort's information literacy knowledge, skills and attitudes. Even so, it is good practice to test your assumptions.

To do so, you need to capture current information about your training cohort's information literacy capabilities and their perceptions of their competencies and skills. As you do this, you also need to consider the competencies you would expect to see in the cohort by referring to the information literacy standards, models and frameworks relevant to them (see Box 3: IL Standards, Models and Frameworks). In determining what is known and what is unknown in your trainees, and comparing it with what you might expect at different competency levels outlined in the relevant standards, you can better design your training intervention to address the skill and competency gaps (i.e. the unknowns) and so meet your trainees' training needs.

Identifying information literacy training needs starts with a broad identification of the individuals or groups who may need such training. For example, undergraduate students in their final year may need training in referencing systems, or junior researchers may need techniques for scanning articles quickly to determine relevancy to their research.

You need to consider:

1. The roles of these potential trainees and what they need to know and understand to meet their objectives in these roles. This means you need to understand the contexts in which they operate. Understanding roles and goals helps the trainer to understand the type and level of training required. For example, the trainee may need access to health information but have little experience in accessing information or may not be digitally literate. The context may pose limitations too, such as a lack of access to the internet. This may influence the information literacy training, e.g. placing more emphasis on discussion and how information can be shared within the community or by connecting with other organisations. Understanding context helps to ensure that training is effective and enables the trainer to communicate with trainees and show how new knowledge can help them achieve their goals or complete their tasks (whether these are personal or professional or community-driven).

2. How these potential trainees currently perform and report on their performance in their roles. This may be through self-reporting, peer or line-manager appraisal or some other means.

3. The current attitudes, values and perceptions of these potential trainees to lifelong learning skills, information-seeking behaviours and critical-thinking skills. How broad is their conception of information literacy? Do they think of it as a library-orientation skill only (e.g. using e-journal databases), or do they recognise that it can improve their research and problem-solving activities, as well as help them become more active citizens and promote social inclusion?



! Tips
Always assess needs on a case-by-case basis. Even apparently similar stakeholders (for example, researchers) will operate in a specific context and so may anticipate different learning outcomes.

Gathering data
To find answers to all these questions, you need to gather data on your target individuals and groups. Data gathering tends to be done either through a traditional survey-based approach, where trainers elicit data from their potential trainees using needs-analysis surveys, interviews etc., or, alternatively or in combination, through highly participative approaches, generally via discussions and workshops in which the participants reflect on and define their needs. The latter are usually facilitated, and various techniques can be applied, such as cause-and-effect diagrams, identification of situations where information is needed, barriers and 'helps' etc. These methods are described in more detail below.

Survey-based approaches, often quantitative, are relatively quick to apply and data are easily analysed. Participative approaches tend to be more complex, requiring genuine engagement with the audience and involving discussion and negotiation. They tend to result in qualitative data that may be harder to analyse. However, simple participative techniques for gathering data, such as flip charts or stickies, are very effective and cheap, plus data can be quickly captured and stored using photographs. Participative approaches tend to require more commitment from the participants but tend to result in greater engagement.

In either participative or survey-based approaches, the data gathered can be either quantitative or qualitative. Quantitative data tend to focus on the presence of certain characteristics, the 'what' and the 'how': for example, knowledge of sources of information, the usage of sources of information, and the techniques people use to gather, manage or communicate information. Quantitative data tend to be associated with statistical analyses that imply the group is representative of a larger community. However, the term quantitative is also associated with any data that are made up of numbers. The two are not necessarily the same.

This quantitative approach tends to be effective when the trainer has a reasonable understanding of the information literacy that may be appropriate for the audience but wants to get an overview of people's information behaviour so that training can be tailored. It can also be useful for evaluating the impact of the training and the changes that have taken place as a consequence of the training.

Qualitative data are data that are relevant to a group of people but cannot be statistically generalised. They also tend to be made up of words. Qualitative data tend to be useful when one is exploring a situation and the variables are not known. For example, data may be in the form of statements by trainees about their perception of their need for information and the difficulties they encounter accessing information. Qualitative data enable a better understanding of 'why' people do what they do, and of their values or interpretations of situations.

Qualitative data are often gathered through interviews and focus groups/discussions. These can be structured using methods such as grounded theory or Outcome Mapping. The latter provides a framework that enables people to participate and define their vision, mission, learning outcomes and progress indicators.
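This distinction can be made concrete in the way responses are stored and summarised. The sketch below is illustrative only: the question names, field names and answers are invented, but it shows closed (quantitative) answers being counted and averaged while open (qualitative) answers are kept as words for later interpretation.

```python
# Illustrative sketch: keeping closed (quantitative) and open (qualitative)
# answers from the same needs-analysis survey separate. All questions and
# responses here are hypothetical examples, not a real instrument.
from statistics import mean

responses = [
    {"uses_ejournals": "yes", "search_confidence": 2,
     "comment": "I only use Google; the library databases confuse me."},
    {"uses_ejournals": "no", "search_confidence": 4,
     "comment": "I can find papers but struggle to judge their quality."},
    {"uses_ejournals": "yes", "search_confidence": 3, "comment": ""},
]

# Quantitative summary: counts for yes/no items, means for scale items.
ejournal_users = sum(1 for r in responses if r["uses_ejournals"] == "yes")
avg_confidence = mean(r["search_confidence"] for r in responses)
print(f"{ejournal_users}/{len(responses)} respondents use e-journal databases")
print(f"Mean self-rated search confidence: {avg_confidence:.1f} / 5")

# Qualitative material: keep the raw words for later thematic reading.
for comment in (r["comment"] for r in responses if r["comment"]):
    print("-", comment)
```

The numeric fields can be compared across cohorts or over time; the free-text comments are there to explain the 'why' behind the numbers.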
Methods you could consider using include:

• Needs analysis surveys/diagnostic tools. These could be simple yes/no questionnaires or more complex, graded questions in which people assess themselves on, for example, a five-point scale. The latter could include questions about their perceived competence on various tasks, such as finding and using information. Surveys that assess people's knowledge before an intervention tend to be called pre-diagnostic tests, because they can be followed by post-diagnostic tests that can indicate changes in competencies as a result of training. Such tests often include questions that assess people's knowledge and usage of sources of information, specific techniques to search for, process and communicate information, or knowledge of ethical issues associated with the use of information: in other words, what they do or don't do, and what knowledge they have or don't have. Data tend to be gathered through closed questions (yes/no, or numeric scales) that enable quantitative data to be gathered. However, open questions can be included and provide qualitative data where the respondent can explain why they have answered in the way that they have.



As well as assessing needs, programmes based on pre-assessments of trainees' current capabilities and contexts are more likely to be tailored to those needs. Pre-assessments can also help you make an explicit link between the learning concepts and trainees' real-world problems.

Importantly, a pre-training diagnostic or test can be used as a 'control' at the end of a training intervention to assess impact by measuring the distance travelled. In M&E, distance travelled is a valuable tool for measuring the outcomes of interventions. When used in a training context, the distance travelled refers to the progress a trainee makes towards achieving a learning outcome as a result of participating in a training intervention. For more on this, see Stage 5: Establishing a baseline.

There are a number of validated instruments you can use to evaluate your trainees' information literacy competence. These include Project SAILS and the iSkills information literacy tests.

! Tips
All methods used to gather data should be tested ('piloted') before they are applied, to make sure questions make sense to the respondents and the data gathered help inform the training.

• Interviews. These can be face-to-face or virtual (e.g. by telephone). Interviews can lead to both quantitative and qualitative data. Generally the data are qualitative and provide an insight into people's current situations. For example, people may be asked about their role, the tasks they perform and how information helps them. Barriers can be identified, for example the accessibility of sources, a lack of information competencies etc. One method is the critical incident technique, whereby people are asked to reflect on situations where they needed information, such as needing health information to help care for a partner, or academic research to write a policy brief. Once a situation is identified, interviewees are asked to explore what led up to it, i.e. what created the need; how they sought information, indicating their information-seeking behaviour and identifying 'helps' and 'hindrances'; and how they applied the information, helping to understand their needs better and their ability to make use of the information. Task analysis is another common interview technique, where tasks are associated with roles or a situation, such as sharing information in the workplace. This provides information about current practice and can elicit current obstacles: who they should share information with, why and how; how they should store and manage information etc.

• Focus/discussion groups. These discussions with small groups of potential trainees or other stakeholders can be informal but must be well facilitated. They can be captured on video, audio recording, flip chart and Post-it notes. This method can be used in a participative fashion where people reflect on their information needs and how they seek and use information. In this case, the facilitator enables the conversation, using various tools such as flip charts, Post-it notes or cause-and-effect diagrams. The latter can be used, for example, to highlight particular problems (their cause and effect) in a community, such as a lack of information security, and then the diagram can be used to reflect on the kind of information needed to resolve the situation and the capabilities needed to access and use that information.

Focus/discussion groups are also good for discussing and enabling reflection on current practice and attitudes. For example, a team may use this method to reflect on their information practices, such as how they manage and store their information. This can highlight different practices in the team and help identify good practice and consensus on necessary changes in behaviour and the need for training.

Group sessions are also important post-training to enable reflection on what has been learned. This leads to deeper learning and can also generate ideas for how training can be improved.



• Observations. Prior to training, if possible, observation should be conducted in the place where trainees interact with information. This helps the trainer orientate themselves to the trainees and their information environment. Common practices and issues can be identified. This enables the trainer to make sure their training relates to the needs of trainees in terms of level, and that relevant examples can be used, which helps motivate the trainees. It also indicates to the trainees that the trainer is serious about understanding their needs.

Observation during training enables the trainer to identify problems that individuals are experiencing and deal with them quickly and sensitively, and also helps to gauge whether the training is pitched at the right level. Evidence can also be rapidly gathered about the skills and knowledge of trainees as they go about undertaking a task or set of tasks.

The significance of self-perception using surveys
Capturing a trainee's perception of their own skills, knowledge or confidence in a particular information literacy skill (e.g. using e-databases) is a useful method for establishing training needs (see Stage 1: Assessing needs) and setting a baseline. However, research shows that people often over-estimate their capabilities5. One explanation for this could be a desire to present a certain image of their capabilities. Another could be that people cannot identify what is unknown until they are shown the gaps in their current knowledge, skills and even attitudes. Realising we didn't know something can have a marked effect on our confidence levels, and it is not uncommon for training beneficiaries to re-assess their knowledge, skills and attitudes once they have participated in a training intervention.

Using a survey questionnaire to establish the limits of what is known can be a difficult task. Take, for instance, someone who scores themselves three out of five (where one is low and five is high) on a question that asks them to assess their skill in using mind-maps to develop a search strategy. Initially, you might consider their self-assessment score as indicating a reasonable degree of competency. But on its own, this score can be misleading. For instance, it doesn't tell you what the trainee knows about mind-maps or how they use them. It does, though, provide a measure of their self-assessment, which is valuable when setting a baseline. In practice you can use a combination of diagnostic tests and self-assessment type questions to check trainees understand the concepts, as well as measure their confidence in being able to apply these skills.

5. Dawes, J. (2007). 'Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales'. International Journal of Market Research 50(1).
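To illustrate how self-assessment and diagnostic test scores can be combined, and the 'distance travelled' idea from the previous page, the following sketch pairs the two measures before and after a hypothetical training intervention. The trainees, scores and the flagging threshold are all invented for the example.

```python
# Illustrative sketch, not a validated instrument: pairing a self-assessment
# rating (1-5) with a marked diagnostic test (%), before and after training,
# to estimate the 'distance travelled' by each trainee.
trainees = {
    "Trainee A": {"self_pre": 3, "test_pre": 40, "self_post": 4, "test_post": 75},
    "Trainee B": {"self_pre": 4, "test_pre": 35, "self_post": 3, "test_post": 60},
}

for name, s in trainees.items():
    distance = s["test_post"] - s["test_pre"]          # change in marked score
    confidence_shift = s["self_post"] - s["self_pre"]  # change in self-rating
    print(f"{name}: distance travelled {distance:+} marks, "
          f"self-assessment shift {confidence_shift:+} points")
    # A high pre-training self-rating paired with a low test score can flag
    # over-estimation; a drop in self-rating after training is not unusual
    # once trainees discover what they didn't know.
    if s["self_pre"] >= 4 and s["test_pre"] < 50:
        print(f"  note: {name} may have over-estimated their starting skills")
```

Read together, a large positive distance travelled alongside a falling self-rating is not a contradiction: it can indicate genuine learning accompanied by a more realistic self-perception.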



Case study 3
An institutional approach to assessing information literacy needs in southern Africa

At IDS we have adopted Outcome Orientation as an approach to planning, monitoring, evaluation and learning. This focuses on outcomes in terms of the changes in behaviour we would 'expect, like or love to see'. We use it to think about the kinds of changes we are trying to achieve through our activities, as well as to identify observable changes in behaviour. Put simply, we design our activities by first asking the question, "What will be different and for whom?" before asking, "What am I going to do?"

We applied this approach in an information literacy needs assessment activity undertaken by IDS and Loughborough University with three southern African higher education institutions: one in Botswana, one in Malawi and one in Zambia. We conducted 27 lengthy, one-on-one interviews with stakeholders at the institutions in Malawi and Zambia, using an interview script to ensure a comparative and consistent approach. As we were unable to do the same in Botswana, we held a participatory workshop there, consisting of a three-day planning event with 18 key stakeholder groups, to complement the activities in Malawi and Zambia. Equivalent representatives in all three institutions were invited to take part in the study and included vice chancellors, faculty directors, and library, technical and administration staff.

The entire assessment proved a reflective and thought-provoking exercise. It enabled key stakeholders to comment on what had already been achieved (i.e. their successes), as well as the challenges (as they saw them) and ways these could be addressed (i.e. the opportunities). It also highlighted where they would like to go, whether at an individual, organisational, national, regional or international level.

The discussions resulted in a list of recommendations for activities and partnerships to help all three institutions move forward and address their information literacy capacity building needs. You can read more about this in our paper, 'An institutional strategy for developing information literate, critical-thinking independent learners within higher education in Africa', available at http://bit.ly/13kyPmO.6

6. Hepworth, M. and Duvigneau, S. (2012). Building Research Capacity: Enabling Critical Thinking Through Information Literacy in Higher Education in Africa. Available: http://opendocs.ids.ac.uk/opendocs/bitstream/handle/123456789/2301/BuildingResearchCapacityR1.pdf?sequence=1. Last accessed 23 Feb 2013.



Case study 4
Using stakeholder interviews to develop an information literacy national curriculum in Zimbabwe

In a needs analysis activity in Zimbabwe, IDS conducted one-on-one and group interviews with different stakeholder groups interested in developing a national curriculum for teaching information literacy using a learner-centred and enquiry-based approach in Zimbabwean universities. We interviewed key stakeholders at the Research Council of Zimbabwe, the Zimbabwe University Library Consortium (ZULC) and the Zimbabwe Library Association (ZimLA) to identify why they wanted to adopt a pedagogical approach. We also interviewed potential recipients of the training to find out their current knowledge, skills and attitudes, as well as those of the people they work with (e.g. the students).

During all of these discussions, we gathered evidence about what had already been achieved and what difference the training might make. In other words, we asked, "What will be different and for whom?" and, "Why is this important in your particular context?" From these discussions, we developed a needs assessment survey which the training cohort was asked to complete. The survey helped to validate the information gathered during the face-to-face discussions and enabled us to set the training intervention's learning objectives and outcomes.

In fact, the training outcomes were a combination of the outcomes elicited from the needs assessment activities and those identified by the funder of this exercise (DFID). The outcomes were expressed as changes in knowledge, skills and attitudes as a result of taking part in the learning intervention, and were grouped as follows: those we would expect to see (normally immediately after the intervention), those we would like to see in the following three months, and those we would love to see (longer-term goals that may or may not happen, being contingent on other factors).



Flowchart 1: Assessing needs

START HERE
1. Identify individuals and/or groups who need to be trained. Consider the context in which these individuals and groups operate. Considering their roles and what they need to know and understand to meet their objectives will help you determine the type and level of training required.

2. Assess:
• The competences (skills and knowledge) of the trainees
• The attitudes, values and perceptions of the trainees
• The needs, context and constraints of the organisation(s) in which trainees are located
• The needs of the sponsors or funders of these organisations
• The needs of the broader community in which these organisations operate

3. What methods will you use? Remember: a mixture of both types is possible.
• Traditional survey-style approaches, e.g. needs analysis surveys/questionnaires, interviews, observations
• Participative approaches, e.g. focus groups, focused discussions, Outcome Mapping, workshops



M&E in action
Stage 2: Programme strategy & objectives

? Key questions
Determining your strategy and objectives
• What are you trying to achieve with your training?
• How will you know if it is working?
• What changes will you see if the training has been successful?
• What outcomes are you hoping to observe in knowledge, behaviours and skills?
• Have all the stakeholders contributed to the development of the objectives/outcomes?
• Are the objectives/outcomes specific, measurable, achievable, relevant and time-bound?

If you strategically plan your project you will know what you are trying to achieve and have a rational argument for your approach to getting there. Strategic planning is necessary for delivering a good project and also for doing a good evaluation.

Clearly identifying what you are trying to achieve, from your own perspective and that of your audience, is essential. Once this has been defined, a number of techniques can be used to identify the nature of the intervention, specific objectives and outcomes, and ways to identify progress and change.

Pulling it all together
A framework in information literacy training M&E is a way of describing your intervention and showing how you expect your actions will lead to the desired outcomes and impact. It is a plan of the methods you will use to gather the monitoring data, but it can also make it easier to define your objectives. Two common M&E frameworks in international development are Logical Frameworks and Theory of Change (TOC). Both of these can help you to think about and articulate the objectives of your training (as well as its outputs, outcomes and impact). These are generic M&E frameworks and should not be confused with the conceptual frameworks for information literacy (see Towards Information Literacy Indicators, R. Catts & J. Lau).

The Logical Framework approach
A Logical Framework (also known as a logframe) is "a tool to help designers of projects think logically about what the project is trying to achieve (purpose), what things the project needs to do to bring that about (outputs) and what needs to be done to produce these outputs (activities)" (Department for International Development)7.

Logframes are usually developed in consultation with stakeholders and provide project staff, donors, the project beneficiaries and other stakeholders with a simple summary of a project strategy. A logframe can form the basis of donor-funding applications, and it is used throughout the project's lifecycle to track progress and adapt to changing situations.

In an information literacy setting, a logframe can be used to articulate how an organisation aims to meet regional or national objectives. For example, an African higher education institution could use a logframe to articulate how its information literacy programme will contribute to the Africa 2020 (http://mariemejamme.com/africa/) goals (see Table 1: Types of information contained in a logframe). Its higher-level goal would be to "invest in human capital and bring about a sustainable future for Africa". This could be achieved through developing an information literacy programme aimed at students, together with other related activities. The indicators for measuring whether the institution had achieved its goal might include an increase in female student academic achievement (as a result of attending the information literacy training programme). The sources of information it would gather would be test data and final end-year results (see Table 1 below for more examples).

7. BOND. (2003). Logical Framework Analysis. Available: http://www.gdrc.org/ngo/logical-fa.pdf. Last accessed 23 Feb 2013.



! Tips
Each donor uses different terminology in their logical frameworks. Familiarise yourself with the variations in terminology before you start to complete your logframe.

A logframe includes a narrative about:
• The broad development impact the project or intervention will contribute to society or individuals (goal)
• The expected benefits/outcomes at the end of the project or intervention (purpose/objective)
• The tangible products and services the project or intervention delivers (also known as immediate objectives, or outputs and results)
• The activities that need to be carried out to achieve the immediate objectives (activities)

Each project description will include details of how progress and change will be measured (verifiable indicators), the type of information or methods used to plan and monitor progress, and a record of the factors outside the project management's control that may impact on project feasibility (i.e. assumptions). Table 1 shows the type of information usually contained within a logframe.

Theory of Change
A logframe is a simple summary of the project strategy that helps you to plan and monitor your project's outputs and outcomes. But how do you demonstrate the long-term goals of your programme or project? Theory of Change (TOC) is another strategic tool, one that can help you clearly articulate the long-term changes you seek at an organisational, programme and project level.

TOCs are normally articulated in diagrammatic form, although there are endless variations in style, form and content. TOCs show graphically how change happens in relation to a number of themes, the pathways an organisation might take in relation to these themes, and a means by which the impact of the pathways, and the assumptions made about how change happens, can be tested8.

When you deliver your information literacy training you should link your training objectives (and outcomes) to the goals, objectives and activities highlighted in your strategic framework. You should also take careful account of the measures you need to provide to demonstrate that you have achieved your project goals. When you design your training intervention you should translate the framework or organisation's strategic goals into the learning and training objectives/outcomes.

8. Ontrac, the newsletter of INTRAC: 'Theory of Change: What's it all about?' [downloaded 01 Feb 2013] http://www.capacity.org/capacity/export/sites/capacity/documents/topic-readings/ONTRAC-51-Theory-of-Change.pdf



Table 1: Types of information contained in a logframe

GOAL – Sustainable improvements in society or well-being of people (impact)
• Indicators: how the Goal is to be measured, including quantity, quality and time
• Source/means of verification: how the information is to be collected, when and by whom

Information literacy example (Goal): To develop creative and innovative thinking skills to build a more sustainable Africa workforce
• Indicators: X number (e.g. 20) of employers stating that new student recruits have better employment skills
• Source/means of verification: interviews with employers; surveys of employers
• Risks & assumptions: the environment is stable and students can find employment; a demand exists for creative and innovative talent; employers are willing and able to communicate the changes in the creative, innovative thinking student profile

OBJECTIVE(S) – Changes in behaviour or improvements in access to or quality of resources (outcome)
• Indicators: how the Purpose is to be measured, including quantity, quality and time; how you will know that the intended change has occurred
• Source/means of verification: as above
• Risks & assumptions: if the Purpose is achieved, the assumptions (external factors) that need to be in place or must hold true to achieve the Goal

Information literacy example (Objective): To increase the use of research knowledge by students
• Indicators: X number of current and relevant citations in research papers; X increase in the use of e-journal databases
• Source/means of verification: report with analysis of examples; self-evaluation reports; regular interviews conducted with students and staff; surveys of students and staff
• Risks & assumptions: e.g. assess threats and vulnerabilities of a region, outside beneficiaries' control; the increase in research knowledge and skills will lead to critical and innovative thinking skills



Table 1 (continued): Types of information contained in a logframe

RESULTS – Tangible and immediate products or services as a consequence of completed activities
• Indicators: how the Results are to be measured, including quantity, quality and time
• Source/means of verification: as above
• Risks & assumptions: if the Results are achieved, the assumptions that must hold true to achieve the Purpose

Information literacy example (Results): A credit-bearing information literacy programme
• Indicators: X students successfully pass the programme; X students achieve x results in their test scores
• Source/means of verification: test results; assignments
• Risks & assumptions: attitudes to the value of the library will improve along with creative and innovative thinking skills; students are able to see the link between critical lifelong learning and building innovative thinking skills

ACTIVITIES – Tasks that have to be undertaken to deliver results through mobilising inputs
• Indicators: inputs and resources needed to carry out each task
• Risks & assumptions: if the Activities are completed, the assumptions that must hold true to deliver the Results

Information literacy example (Activities): In year 1, an information literacy module (on critical thinking skills) is developed
• Indicators: the module is developed in collaboration with academic staff and contextualised to different disciplines
• Risks & assumptions: library and academic staff are able to teach critical thinking skills
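If you keep your logframe in electronic form, it can help to hold each row as structured data so that indicators and assumptions can be reported against consistently. The sketch below is one possible convention, not a donor template; it simply encodes the objective-level example from Table 1.

```python
# A sketch of holding one logframe row (the outcome-level example from
# Table 1) as structured data for reporting. The field names mirror the
# table's columns; they are our own convention, not any donor's format.
from dataclasses import dataclass, field

@dataclass
class LogframeRow:
    level: str                      # Goal, Objective, Result or Activity
    description: str
    indicators: list = field(default_factory=list)
    verification: list = field(default_factory=list)  # sources/means
    assumptions: list = field(default_factory=list)   # risks & assumptions

objective = LogframeRow(
    level="Objective",
    description="To increase the use of research knowledge by students",
    indicators=["X current and relevant citations in research papers",
                "X increase in the use of e-journal databases"],
    verification=["Report with analysis of examples", "Self-evaluation reports",
                  "Regular interviews with students and staff"],
    assumptions=["Increased research knowledge and skills will lead to "
                 "critical and innovative thinking skills"],
)

print(objective.level, "-", objective.description)
for indicator in objective.indicators:
    print("  indicator:", indicator)
```

Because every row carries the same fields, progress reports can loop over rows and always pair each indicator with its means of verification and its assumptions.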



The training objectives/outcomes
An objective is what you will achieve or learn by the end of a training intervention. Objectives are often linked to the training outcomes, which are expressed as what you will be able to do with the new skills and knowledge. Effective M&E is impossible without the identification of clear objectives that relate to outputs, outcomes and impact (see Box 5: Outcomes, Outputs and Impact).

Your objectives should also be SMART (see Box 7: Making your objectives/outcomes SMART). A SMART objective for an information literacy training intervention might, for example, be: 'By the end of the training intervention (time-bound and achievable), learners will be able to critique the tools and strategies (specific) they currently use (relevant) to find scholarly information (measurable).'

Definitions Box 7
Making your objectives/outcomes SMART
Ask yourself if each of your objectives/outcomes is:
• Specific. Do you have a clear statement of what the trainers and trainees will do or learn during the training intervention?
• Measurable. How will change be detected, and what are the baselines (if any) against which it will be measured?
• Achievable. Have you ensured that trainers and trainees have the resources and the capacity to achieve the objective?
• Relevant. Is the objective relevant to the trainees' role or the role they perform in their organisations?
• Time-bound. Has the objective been designed with a view to how long the training takes place and how long the subsequent effects of the training persist?

If your objective can demonstrate each of these attributes, it is SMART. It is not SMART if any one of them is missing.

It is worth bearing in mind that these different attributes are also interdependent. For example, an objective will not be measurable if it is not relevant to the learner, or if the learner cannot achieve a specific skill in the time available to them.

Considerations when setting your objectives (and outcomes)
All training interventions occur within particular contexts: individual, organisational, and community or social. The relationship between these contexts is important, and so defining objectives should always be a joint exercise. Trainers, trainees and other stakeholders in the training intervention should all have their needs, interests and concerns considered. For example, the objectives of university-based information literacy training interventions might have to be matched against broader institutional objectives, policies and strategies. Or the objectives of trainers working for NGOs might have to reflect the priorities of funders or donors, or the ability of the community to advocate for resources. For more on consulting with stakeholders, see Stage 3: Identifying challenges.
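The SMART test in Box 7 can be run as a simple checklist whenever an objective is drafted. The following sketch is purely illustrative: ticking each attribute remains a human judgement, and the objective shown is the example from this section.

```python
# A minimal sketch of turning the SMART checklist into a reviewable record.
# The questions paraphrase Box 7; whether each box is ticked is a human
# judgement, not something code can decide.
smart_checks = {
    "Specific":   "Is there a clear statement of what trainees will do or learn?",
    "Measurable": "How will change be detected, and against what baseline?",
    "Achievable": "Do trainers and trainees have the resources and capacity?",
    "Relevant":   "Does it relate to the trainees' roles?",
    "Time-bound": "Is it framed by when the training and its effects occur?",
}

objective = ("By the end of the training intervention, learners will be able "
             "to critique the tools and strategies they currently use to find "
             "scholarly information.")
review = dict.fromkeys(smart_checks, True)  # the reviewer ticks each attribute

print("Objective:", objective)
for attribute, question in smart_checks.items():
    print(f"- {attribute}: {question}")

is_smart = all(review.values())  # not SMART if any one attribute is missing
print("SMART?", "yes" if is_smart else "no - revise before using it for M&E")
```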



! Tips
When considering what objectives the training should meet, consider individuals' personal objectives as well as their organisation's objectives. You should also consider the objectives of any other stakeholders, such as funders, and the wider community.

Thinking through outcomes and impact
Outcomes focus on how the skills and knowledge will be applied. Some possible outcomes for information literacy training include:

• Changes in knowledge and skills. This is where M&E captures precisely what has been learned through the training intervention: the knowledge, understanding and skills that have been acquired, developed or enhanced. These data should be gathered immediately after your training, and we go into more detail about this in Stage 5 of our M&E journey. For example, this could include the ability to use a mind-map to help define information needs and identify search terms.

• Changes in confidence and behaviour. M&E here seeks to capture the extent to which trainees' approach to the handling of information has evolved as a result of what they have learned. Our research in the course of developing this toolkit suggests that relatively few attempts have been made so far to gather evidence on such changes. We discuss how it has been achieved so far, however, in Stage 6 of the M&E journey.

• Changes in performance and practice. This is the extent to which trainees are doing things differently as a result of what they have learned. M&E here seeks to answer questions such as how trainees have applied their knowledge, skills and newly-acquired confidence to enable them to work more effectively, whether as students, academics, community organisers etc. For example, this could include evidence of using a range of sources, shown by relevant references in a report. The answers to such questions can be complex, particularly if the impact can be measured only in the longer term. However, they serve as a basis for considering the broader impact of information literacy training interventions on organisations and wider society. We provide a case study in Stage 5 which explores the use of diagnostic tests pre-, during and immediately after training, and three months later.

Some possible impacts for information literacy training include:

• Impact on an organisation. Training is likely (although not invariably so) to take place within organisations such as schools, universities or local libraries, or at the behest of organisations such as donors or funders which commission third parties, e.g. NGOs, to undertake the training. These organisations have a right to expect that information literacy training will have an impact by helping them meet their overarching priorities. For example, this could include acceptance of publications by refereed journals or an increase in the success of bids for funding.

• Impact on the wider community. The impact of your training intervention on the wider community may be particularly challenging to identify. The boundaries of your target sector or community may be ill-defined, your timescales for impact long, and your SMART objectives/outcomes difficult to define, let alone measure. Nevertheless, the Centurion workshop demonstrated there was significant interest in deriving community outcomes, and therefore in applying M&E to a broader, societal context.

If you need to evaluate impact on the community, you should consider the context in which your training intervention is taking place. For instance, where a specific organisation is involved in outreach to the wider community, what current training interventions does it provide and are there gaps in provision? One example of outreach to the community: what role does a university library play in preparing students for the transition to higher education? Does it work with teachers and students to prepare students for tertiary education, and has this led to a larger number of students being accepted?

Revisiting a community where an information literacy intervention has taken place could seek evidence of increased use of information technology and successful advocacy for resources, such as the provision of local health facilities, based on effective use of available information.



Case study 5
How GDNet applied Theory of Change

Figure 3 illustrates the Theory of Change developed by GDNet, part of the Global Development Network (GDN), in its five-year strategy paper (2010-2014) entitled 'Research Communications from and for the Global South'. Developing a Theory of Change can help you to explore and collect evidence about the pathway which leads from the specifics of what is taught and learned during a training intervention to what trainees do in practice. The easiest way to 'read' a Theory of Change is from left to right, though they are actually developed from right to left in a workshop setting.

Figure 3: GDNet's Theory of Change
[Diagram: boxes linked from left to right, including 'Access to funding', 'Access to data', 'Access to other research', 'Access to research collaborators and partners', 'High quality, relevant knowledge exists', 'Generation of high quality and relevant knowledge through research', 'Researchers' capabilities for communicating research to policy', 'Knowledge available in appropriate formats', 'Awareness of relevant knowledge', 'Opportunities for researchers and policy-makers to engage' and 'Willingness to use evidence in policy making', leading to 'Better informed, evidence-based policy making processes', with the caveat 'Policy is not solely evidence driven'.]
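One practical way to work with a Theory of Change like Figure 3 is to treat it as a directed graph and trace each pathway from inputs on the left to outcomes on the right. The sketch below paraphrases a few of GDNet's boxes; the edges shown are simplified assumptions for illustration, not a faithful copy of the diagram.

```python
# Illustrative sketch: a Theory of Change treated as a directed graph, where
# each edge means 'contributes to'. Node names loosely paraphrase Figure 3;
# the edges are simplified assumptions, not the exact arrows in the diagram.
contributes_to = {
    "Access to funding": ["Generation of high quality, relevant knowledge"],
    "Access to data": ["Generation of high quality, relevant knowledge"],
    "Generation of high quality, relevant knowledge":
        ["Knowledge available in appropriate formats"],
    "Knowledge available in appropriate formats":
        ["Awareness of relevant knowledge"],
    "Awareness of relevant knowledge":
        ["Better informed, evidence-based policy making"],
}

def pathways(node, path=()):
    """Print every pathway of change that starts at `node`."""
    path = path + (node,)
    next_nodes = contributes_to.get(node, [])
    if not next_nodes:            # reached a right-hand outcome
        print(" -> ".join(path))
    for n in next_nodes:
        pathways(n, path)

pathways("Access to funding")
```

Listing pathways in this way makes each link in the chain explicit, so that every arrow can be matched to an assumption you intend to test during M&E.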



Case study 6
How IDS incorporated feedback loops in its Theory of Change

Figure 4 illustrates the Theory of Change IDS created for its Knowledge Services (KS) team. This one is rather more complicated, with some feedback loops embedded.

Figure 4: IDS Knowledge Services Theory of Change (Downie, 2008). http://www.ids.ac.uk/files/From_Access_to_Action_Downie_2008.pdf
[Diagram: pathways of information flows run from IDS KS means of achieving outcomes (promoting IDS KS and offering information literacy interventions; sourcing, bringing together and (co)creating diverse and credible information; repackaging, synthesising, cataloguing and making accessible free information in different formats, mediums and languages; creating partnerships, networks and virtual and physical spaces to bring different development actors together; building a network of information and knowledge intermediaries; contributing to a better understanding of information, communication and knowledge processes through research, evaluation and teaching) through immediate outcomes (access and debate: target groups access relevant, diverse and credible information when they need it, regularly use diverse information, and engage with each other to debate and create new knowledge) and intermediate outcomes (understanding and influence: increased capacity to produce high quality research, to build the understanding of others, and to influence the behaviours and actions of other development actors; issues are understood and framed differently and new agendas are set; spaces are opened up where power relations can be negotiated and challenged) to higher level outcomes (action: information is used to design, implement and change development interventions or to support and justify a course of action; action is mobilised; there is a more enabling environment for information sharing and the work of intermediaries; wider awareness of key development issues stimulates public debate). These lead to the purpose (information contributes to more enabling development processes and interventions which are pro-poor, inclusive and equitable) and the supergoal, our vision: a world in which poverty does not exist, social justice prevails and the voices of all people are heard in national and international institutions. At each transition the diagram notes: 'If our assumptions are correct and external conditions and influences enable change.']



Flowchart 2: Setting objectives

START HERE
1. Consider the individuals and organisations the objectives should relate to:
• Individual trainees
• The organisations in which the trainees are located
• The sponsors or funders of these organisations
• The broader community

2. Objectives for individuals can relate to:
• Outputs: the knowledge and skills acquired
• Outcomes: increased knowledge, understanding and skills; changes in confidence and behaviours; changes in performance and practice

For the other stakeholders, objectives are more likely to relate to:
• Impacts: these may be longer term and harder to define

3. Ensure your objectives and outcomes are: Specific, Measurable, Achievable, Relevant and Time-bound.

The formulation of objectives will help determine a framework for your training interventions, i.e. plan your methods and show how you will reach your desired outcomes and impact. It is essential for good evaluation, and is part of strategic planning. An M&E framework will help you better define your objectives; frameworks you might consider include Logical Frameworks ('logframes') and Theories of Change.



M&E in action
Stage 3: Identifying challenges

? Key questions
Addressing your M&E challenges
• Who stands to benefit from your training, apart from the trainees? Is there anyone who may be affected negatively?
• Who can significantly influence the success of the project?
• What challenges might you encounter during your M&E activities?
• How are you going to minimise the risks of encountering these challenges?
• What risks are there and how can they be managed?

As you embark upon your M&E journey, you need to think about the challenges you may encounter. Challenges come in many forms. Considering them in advance will help you manage them, and plan to minimise the risks they present. Challenges can come from people and from the physical and emotional environment.

Challenges from people
Very often the most significant challenges to any intervention are rooted in people rather than practicalities. In particular, individuals who stand to gain or lose from your intervention, or who can influence it, i.e. your stakeholders, can determine the success or failure of your M&E journey.

Your stakeholders should have been consulted when you set your objectives (Stage 2 of the M&E journey), but it is also worth considering whether you should collect some data from them, how you should communicate with them and whether they should be involved in learning processes. Identifying and involving stakeholders in the entire M&E cycle, to at least some degree, brings a number of benefits:
• It ensures they understand and are committed to the M&E process
• It helps ensure the relevant questions are being asked
• It increases the chance that findings and recommendations are listened to and used
• It ensures that participants are willing to provide data and perceive the process as a useful one
• It may help ease the load on the trainer if stakeholders help with data collection and analysis
• It ensures that M&E is embedded in the whole organisation and will continue even if a trainer leaves

For these reasons, it is worth doing a stakeholder analysis as one of your earliest activities on your M&E journey. It will help you determine the extent to which you should consult your stakeholders at each stage of the journey.

! Tips
Communicating identified risks to stakeholders early on in your M&E journey flags up potential problems to those who need to be aware of them, and may result in help to overcome them.

Challenges in M&E may arise from a variety of other sources too, and at all stages of the M&E cycle. In the table below we have described some challenges you may face, expressed as questions, and grouped them by the stage of the M&E journey at which they might occur.

The solutions to these questions will be highly specific to your own context, but failure to address them early on in your M&E journey may store up problems for later. Consider using a problem-analysis tree to break the problem down into manageable parts. This will help to prioritise various factors, define solutions and agree actions.



Table 2: Questions for addressing challenges

1. Assessing needs
• Who defines how wide a lens you should take on needs assessment?
• Is everyone involved willing to admit to individual and/or organisational needs?
• Do stakeholders accept the necessity to do a needs assessment, with the consequent resource and time costs?

2. Programme strategy and objectives
• Are stakeholders agreed on the programme strategy?
• Have stakeholders embraced the need for evaluation, with the consequent resource and time costs?
• Are stakeholders agreed on the purpose of the evaluation and the evaluation questions?

3. Identifying challenges
• Have you done a risk assessment for your M&E process, identifying big and small issues which could derail it, and have you developed strategies to address them?

4. Designing M&E
• Do stakeholders understand the implications of the choice of your M&E methods, in terms of skills, resources, cost and the kinds of data which will be generated?
• Have individuals with appropriate skills and time been identified to implement the M&E?
• Have logistics been considered, including scheduling, transport, data collection and data entry?
• Are the participants, i.e. the trainees, sufficiently engaged with the exercise? Do all the stakeholders appreciate the value of the training? Do they consider their needs have been adequately considered? Do they see training, including the M&E, as threatening or enabling?

5. Establishing a baseline
• Have you built in baseline evaluation?
• Have you planned your M&E so that you can establish the baseline early, before the intervention has actually started? (Otherwise, your participants may have already improved, and you may not detect any further changes.)



Table 2 (continued): Questions for addressing challenges

6. M&E during and after training
• Have you designed your M&E so that it is only minimally disruptive during the training process?
• Do all the trainers understand the point of doing M&E and have they committed to it?
• Have you communicated your M&E process to participants and engaged them in it?

7. Data analysis
• Have you piloted your data analysis early so that you can identify any issues, e.g. in terms of skills, knowledge or software?

8. Learning from M&E
• Have stakeholders committed to doing something with the M&E results?
• Have they accepted that the report is just the beginning of their effort?

9. Communicating findings
• Have you identified suitable avenues for communication with stakeholders, and allocated time for this?

Challenges from the environment
The physical environment can pose a problem if you intend to gather M&E data through face-to-face interviews or focus groups. One-on-one interviews will need a quiet space for individuals to speak openly. If proximity is an issue, you can conduct one-on-one interviews by telephone instead. In focus groups, appropriate space and resources are necessary for people to exchange ideas and discuss aspects of information literacy learning. Don't forget resources for capturing discussions, such as flip-chart paper placed on walls.

The emotional environment also has an impact on the M&E. Participants will need to see the relevance and benefit of the M&E process to help them engage with it and overcome any concerns about the use of the data. It is important that you emphasise that their comments will be anonymised in summary reports, and always state how the data will be used (for instance, to demonstrate the training's success or to lobby for more resources).



Case study 7
Carrying out a stakeholder analysis

At a planning workshop in Harare, Zimbabwe, the Research Council (Zimbabwe), the Zimbabwe Library Association (ZimLA) and IDS used stakeholder mapping to explore options for integrating pedagogy skills and techniques into curricula, and information literacy pedagogy training and teaching models into library and information science (LIS) faculties. The workshop was attended by faculty staff from three higher education institutions and six librarians from public libraries and a secondary school.

Our aim was to understand those individuals, organisations and bodies with a vested interest in our work, and we started our stakeholder analysis by first identifying who these were, based on pre-existing relationships or ones we thought we could forge in the work. We asked participants to think as broadly as possible in the early stage of the workshop before asking them to group the stakeholders, using Post-it notes to make things easier.

Workshop participants identified the following groupings: direct recipients of the intervention (e.g. undergraduates), faculty staff, researchers, professional bodies (e.g. the International Federation of Library Associations and Institutions (IFLA)) and national institutions (e.g. government).

Next, we prioritised these groupings by project recipients, before transferring our stakeholder groups to a table, where we used a grid to help us undertake an analysis of the groups. (We could have used a mind-map, but we have found that the tabular approach is easier to translate into a report format later on.)

On the table, we placed the stakeholders in one column and set about answering probing questions about them. We wrote the questions on the top row of the table (and on a flip chart). These focused on: the current knowledge, skills and conceptions (including values) of the stakeholder group, the needs (articulated as opportunities) to engage with them, the challenges we might encounter as a consequence of working with them, and how we planned to communicate with them. Our questions included:
• What knowledge, skills and attitudes do students and staff currently possess?
• What institutional factors will need to be addressed to foster information literacy and pedagogical skills?
• Are students involved in research?
• What has already been achieved in building capacity to teach information literacy using pedagogical approaches?
• Who has a vested interest in outcomes?
• What challenges might we face?
• What knowledge, skills and attitudes are needed to take this work forward?

This mapping helped us to:
• Identify what we knew about our stakeholders
• Identify the gaps in our knowledge
• Focus on the best approaches for engaging with stakeholders

The analysis enabled us to pinpoint those stakeholders who could influence the outcome of our work, i.e. those it was critical we considered from the outset. This meant we could plan ways in which we could update them with our progress. It also enabled us to identify those stakeholder groups who were less likely to see the value of our work immediately, so we could plan activities that helped us to build confidence in our approaches.

Finally, the analysis helped us to identify possible risks and think through mitigation strategies. For example, the group discussed how to take forward key activities without funding, and also took account of forthcoming events that would provide a suitable platform to talk about the project.

This stakeholder analysis, like others that we carry out, was used during the planning process to understand our target audiences, but it also served as a baseline to test our observations and develop high-level project plans, a communication strategy and a risk log.



Figure 5: Stakeholder analysis from the Research Council, ZimLA and IDS workshop

Students (undergraduate, postgraduate)
• Skills: mobile technology; some computer skills; research skills; information ethics
• Knowledge: partial or minimal knowledge of digital resources; aware of information resources, both print and electronic
• Attitude: know how to find resources; learn to pass; positive learning attitude
• Challenges: too rigid and don't like making mistakes; different ICT skills/appreciation

Staff (academic, support)
• Skills: lack IL skills; young staff technically able; technophobia ('old guard')
• Knowledge: very limited knowledge of IL; aware of electronic resources
• Attitude: low value of IL to high appreciation
• Challenges: not involved in IL initiatives; little resources; abscond training

Researchers
• Skills: limited ICT skills (dependent on exposure); information research skills
• Knowledge: aware of IL; possess strong information research skills
• Attitude: keen to learn; patient
• Challenges: often remotely located

Executive / Management
• Skills: limited skills
• Knowledge: lack of knowledge
• Attitude: low interest or appreciation
• Challenges: time constraints (for training)

External readers
• Skills: limited and variable skills
• Knowledge: limited knowledge
• Attitude: impatient; low confidence
• Challenges: very little access to practical training

Taken from an information literacy (IL) strategic planning workshop in Zimbabwe, 2013.
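A grid like Figure 5 can also be captured electronically so it can be shared with stakeholders and revisited as a baseline later on. The sketch below abridges two of the workshop's rows and writes them to a CSV file; the field names and filename are our own choices, not part of any standard.

```python
# Sketch of recording a stakeholder analysis grid (as in Figure 5) so it can
# be circulated as a report appendix. The rows abridge the workshop findings;
# the column names and output filename are illustrative conventions only.
import csv

grid = [
    {"stakeholder": "Students",
     "skills": "Mobile technology; some computer skills",
     "knowledge": "Partial or minimal knowledge of digital resources",
     "attitude": "Positive learning attitude",
     "challenges": "Different ICT skills/appreciation"},
    {"stakeholder": "Researchers",
     "skills": "Limited ICT skills; information research skills",
     "knowledge": "Aware of IL",
     "attitude": "Keen to learn; patient",
     "challenges": "Often remotely located"},
]

with open("stakeholder_analysis.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=grid[0].keys())
    writer.writeheader()
    writer.writerows(grid)

print(f"Wrote {len(grid)} stakeholder rows to stakeholder_analysis.csv")
```

Keeping the grid as data rather than a drawing makes it easy to add columns later (for example, planned communication channels) without redoing the whole exercise.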



Flowchart 3: Challenges in monitoring and evaluation

START HERE
Identify challenges:
• Resource-based challenges, for example: lack of funding; lack of time; logistical obstacles
• People-based challenges, for example: lack of trainer skills and expertise; trainee resistance to interventions; lack of stakeholder buy-in
• Intervention-based challenges, for example: potential bias; perceived intrusiveness; lack of a baseline; difficulty in application and reproducibility
• Other challenges, e.g. cultural and language barriers

Then:
• Build mitigating action into your plan where possible, e.g. carry out a stakeholder analysis and a problem-tree analysis, pilot your methods, and communicate your strategy.
• Communicate risks to relevant stakeholders. Organisations, funders or sponsors might be in a position to help.
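One way to act on the flowchart is to keep a simple risk log, as Case study 7 also mentions. The sketch below uses our own field conventions; the risks, likelihood ratings and owners are invented examples.

```python
# A minimal risk-log sketch for Stage 3, under illustrative field conventions:
# each identified challenge gets a likelihood, a mitigation and an owner, so
# risks can be communicated to stakeholders early in the M&E journey.
risks = [
    {"challenge": "Trainee resistance to the intervention",
     "likelihood": "medium",
     "mitigation": "Involve trainees in needs assessment; explain the benefits",
     "owner": "Lead trainer"},
    {"challenge": "No baseline established before training starts",
     "likelihood": "high",
     "mitigation": "Schedule the pre-diagnostic test before the first session",
     "owner": "M&E officer"},
]

# List the highest-likelihood risks first when reporting to stakeholders.
priority = {"high": 0, "medium": 1, "low": 2}
for r in sorted(risks, key=lambda r: priority[r["likelihood"]]):
    print(f"[{r['likelihood'].upper()}] {r['challenge']} -> {r['mitigation']} "
          f"(owner: {r['owner']})")
```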



M&E in action Stage 4: Designing M&E
Once you have defined your objectives (and any outcomes They are often associated with feelings, thoughts or
you anticipate) then you are ready to design the detail perceptions, and can reveal responses that add a richer,
? Key questions of your M&E process, sometimes called a data collection
framework.
more compelling dimension to the data collected.
Quantitative methods help you to say how much or how
Designing your M&E The chief function of an M&E data collection framework many, and to make comparisons between two groups
• What methods will enable you to establish is to serve as a plan of the methods you will use to gather or test a hypothesis. They are often associated with
whether you are achieving your objectives and your monitoring data. There is a range of methods you can numerical data or data that can be converted into
outcomes? use and developing a data collection framework will ensure numerical form.
• What methods are appropriate to use you capture data that are consistent and comparable
Consider a field of maize ready for harvesting.
throughout your intervention? throughout your M&E journey. It will also highlight the
Quantitative methods to evaluate the quality of the maize
resources you need to capture and analyse the data.
First though, you need to answer some fundamental questions about the evaluation questions you are trying to answer.

Qualitative versus quantitative methods?
The decision as to whether to consider qualitative or quantitative methods is often at the crux of M&E design. It will determine the quality and richness of the information you gather, as well as its depth and breadth. We considered the differences between qualitative and quantitative data in some detail in Stage 1 of the M&E journey. You will recall:

Qualitative methods help you to describe, explain and understand a process, or explore a variety of positions. They are often associated with feelings, thoughts or perceptions, and can reveal responses that add a richer, more compelling dimension to the data collected.

Quantitative methods help you to say how much or how many, and to make comparisons between two groups or test a hypothesis. They are often associated with numerical data or data that can be converted into numerical form.

Consider a field of maize ready for harvesting. Quantitative methods to evaluate the quality of the maize may involve weighing the maize or calculating the average yield per square metre. Qualitative methods may involve a botanical drawing of an ear of maize, a discussion of its colour, flavour and smell, or an examination of the way in which it reacts to the weather. In short, quantitative methods enable us to describe the harvest; qualitative methods help us to understand better what maize is actually like.

In reality, quantitative and qualitative approaches are often used together. This is referred to as ‘triangulation’ or mixed methods. Triangulation offers the advantages of both, while minimising the disadvantages of using them singly (see Table 3: Qualitative vs Quantitative methods). It is important to be clear what questions you are trying to answer with each approach.



Table 3: Qualitative vs Quantitative methods

Qualitative – Advantages:
• Exploratory, and permits answers to arise that you hadn’t considered the possibility of
• May feel more respectful (asking people to tell their own stories rather than imposing a story, e.g. via a questionnaire)
• May feel more participatory for both trainers and participants
• Provides rich, ‘textured’ data which can feel more compelling

Qualitative – Disadvantages:
• May be perceived as less robust than quantitative methods (though this can be mitigated by conducting, analysing and reporting the research to the highest standards9)
• May seem deceptively easy to conduct, e.g. interviews. Care must be taken to avoid ‘demand characteristics’ (i.e. indicating to participants what the ‘right’ answer is), closed and leading questioning etc
• It can be easy to neglect consideration of how transcription analysis will be conducted. A good analysis will take approximately 3-5 times the length of the original data collection session (e.g. a 1-hour interview may take 3-5 hours to analyse)

Quantitative – Advantages:
• The only way in which ‘how much/how many/how strong’ etc. questions can be answered
• Analysis and interpretation may be easier than with qualitative methods
• May seem more compelling (but good qualitative methods can be equally compelling)
• Allows you to track subtle changes, including changes across time

Quantitative – Disadvantages:
• ‘Garbage in-garbage out’: quantitative methods are only as good as the instruments used to collect the data
• When people or events are represented by numbers, the narrative may be oversimplified
• Not useful if the samples are small
• Difficult to tell whether you are actually asking the right questions
• Participants can be anxious, e.g. about what will happen to data about them
• May feel more like an exam for participants
• Heavily dependent on literacy

9. Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003). Quality in Qualitative Evaluation: A framework for assessing research evidence. Available: http://resources.civilservice.gov.uk/wp-content/uploads/2011/09/a_quality_framework_tcm6-7314.pdf. Last accessed 23 Feb 2013



Thoughts and feelings versus behaviour?
Most trainers in information literacy training interventions want, ultimately, to change the behaviour of trainees. However, that can be a very long-term goal. In the short term, we often use a change in trainees’ self-assessment of their knowledge, skills and attitudes as an indicator that we may eventually achieve a change in their behaviour.

However, although it is easier to examine changes in thinking in the short term, we shouldn’t forget that small changes in behaviour may still be detectable. For example, an intervention aimed at increasing the use of research evidence in policy-making processes may seek a rise in the number of policies based on comprehensive and systematic evidence (such as systematic reviews) in the long term. In the short term though, we might look at whether there is a shift in attitude towards using research evidence and understanding the value it can play in formulating policies or making decisions.

Or consider another example in which an intervention is aimed at improving the training skills of information literacy trainers to use more enquiry-based, learner-centred methods rather than teacher-centred approaches. In the short term we may seek evidence of a shift in their perception of its value and some use of enquiry-based activities in their information literacy curriculum, but in the long term we might look for evidence that a trainer has developed an information literacy curriculum based on these pedagogical approaches, which also incorporates varying assessment techniques.

The advantages and disadvantages of looking at changes in thinking or changes in behaviour are summarised in Table 4.

Table 4: Changes in thinking vs Changes in behaviour

Changes in thinking/feeling – Advantages:
• Likely to be detectable early on in the lifetime of an intervention
• Often directly targeted by our interventions
• Perceived to be easier to investigate than behaviour

Changes in thinking/feeling – Disadvantages:
• Possibly no direct causal link between changes in thinking and long-term changes in behaviours
• Dependent on self-reporting, so data may be distorted by biases in design, e.g. when participants ‘know what we want to hear’

Changes in behaviour – Advantages:
• May be less conscious and more habitual, and therefore less subject to bias
• Often felt to be more substantial in terms of an outcome
• May give us ideas which further shape and focus our intervention

Changes in behaviour – Disadvantages:
• May be difficult to define (‘operationalise’) the behaviours we are looking for as indicators of impact (though this is more a challenge than a disadvantage)
• Even if we define the behaviour adequately, it may be extremely difficult to observe in real time, and therefore we may have to depend on self-reporting



Oral versus written versus visual data?
Spoken, written and visual approaches to collecting qualitative data are not mutually exclusive and can be combined if required. You might want to consider the options each offers at the outset, bearing in mind that the written word is often prioritised to the exclusion of other modes of communication.

Individual versus group evaluation?
You should consider whether you want to conduct your evaluation with individuals (e.g. using interviews or questionnaires), or with groups (e.g. via consensus workshops or by using tools such as i-clicker). When working with groups, it is also worth considering whether individuals in the group need to work together at the same time, or can respond to others’ comments and input but not at the same time, e.g. by using a wiki or bulletin board.

M&E design considerations
There are many considerations when choosing your M&E design. The most basic are:
• Will the data collected give you the information you need at the pre-, mid- and immediate post-intervention stages?
• Will the selected methods allow you to elicit evidence demonstrating that you have met your planned objectives and outcomes?
• Do you have the capacity to implement the methods you have found appropriate?

Other considerations include:

Working in order
You need to make your choices of methods, tools and technologies (see Box 6: Approaches, Methods, Tools and Technologies) in the right order. Identify the most appropriate method for capturing your monitoring data before choosing your tools or technologies – which should then facilitate the data collection. Taking the time to consider the amount of effort or cost involved in collecting and analysing the data upfront can result in significant savings in the long run. Your choice of technology will also highlight any logistical issues you have to consider when preparing your evaluation.

Ease of comparison
It is critical that you choose methods and tools that are consistent and comparable if you are to elicit results that are robust and meaningful. For example, when you design your pre-intervention survey you should plan to ask the same questions in the post-intervention survey(s), as comparing them can enable you to demonstrate the impact of the intervention as well as highlight changes in knowledge, skills and attitudes. Another useful tip for measuring the quantitative changes in knowledge, skills and attitudes is to measure the distance travelled. See Stage 5: Establishing a baseline for a description of this useful tool.

Flexibility and inclusivity
Trainees may have a variety of intellectual and physical capacities. You need to define methods which take account of these. For example, individuals who have a hearing impairment may find a focus group stressful, but may respond well to an individual interview. Incorporating non-verbal and non-literacy based methods where literacy cannot be assumed will help participants feel included and able to respond. Choose methods that will highlight information about gender, current capacities, language preferences and learning needs or disabilities at the pre-intervention stage, and monitor throughout the M&E journey.

Resource requirements
As well as the amount of effort required to undertake your M&E data gathering and analyses, you need to consider time and cost implications. For example, methods that allow a number of people to respond at the same time are generally faster in terms of data collection time. However, be careful that economising at this stage doesn’t shift the effort to the data analysis stage. This is a classic problem in conducting focus groups, where large amounts of qualitative data can be gathered very quickly, but they are much more time consuming and complicated to transcribe and analyse than data from one-to-one interviews.

Consider too that though data collection technology can be expensive, especially at the set-up stage, it can enormously reduce data collection and analysis time. It is worth asking whether participants already have access to particular technologies, e.g. mobile phones. It may also be possible to request donations or free licences from commercial providers.

However, be aware that while working within your resources, you need to collect sufficient data to subject them to powerful statistical testing when you undertake your data analysis (in Stage 7 of the M&E journey). You should bear this in mind when designing your sample size.



Triangulation
Triangulation is the technical term for applying mixed methods. Essentially, it maps data from one method to data gathered from other data sources, including those captured through a different research method. In information literacy interventions, we can collect data from a variety of information sources, including surveys, tests, assessment rubrics, reflective journals, focus group discussions and individual interviews. If we map data derived from quantitative data sources, e.g. surveys, with data captured using qualitative methods, e.g. individual reflective journals, it can help us to eliminate bias (in our data analysis) and cross-check our observations about patterns of behaviour. The ultimate goal is to increase the validity of our conclusions.

For example, one method you might use at the pre-training stage for an intervention with university students is to conduct one-on-one interviews with faculty staff to gauge perceptions of their students’ training needs. You could triangulate this information by inviting the students to complete a pre-training survey, in addition to observing their behaviours during the training itself. Comparing all the data obtained from the survey and observations will enable you to test (and possibly confirm) the assumptions made by the faculty staff.

Triangulation can corroborate or back up your findings and observations. However, on occasion, it can provide contrasting evidence and lead us to divergent conclusions. For instance, statistical data contained within two post-training surveys might contrast with observations of the training cohort’s skills and behaviours. One explanation of this may be that a lack of comparability between the survey questions in the two survey capture forms led to a misleading and biased analysis.

Triangulation can be particularly important if you want to draw comparisons between different groups. Qualitative methods on their own may give inaccurate data in these situations, as the variety across individuals in a group may mask small group differences. Including a quantitative element can overcome this.

Pilot tools
Response bias can creep into both the M&E design and data collection stages. Bias built into the design of quantitative tools (e.g. a survey or test) can be detected through careful piloting and pilot analyses. Reword any systematically biased or discriminating questions you find to remove bias from the responses.

Bias is harder to detect in qualitative research (e.g. interviews) as it can arise through the interaction between the trainer and trainee. To guard against it, carefully consider the design of your interview protocols and select tools that apply good practice in interviewing and facilitation techniques. By referring to the appropriate information literacy standards, models and frameworks (see Box 3: Information literacy standards, models and frameworks) you will be able to design your interview protocol around the attributes and competency levels defined in these documents.

It is important also to pilot your methods to check they are understood by trainers and trainees. For example, a dummy run with your questionnaire will ensure that trainees can actually do it. If, for example, all the questionnaires come back with only a few questions completed, it may be that some questions are too long or over-complicated. If, on the other hand, people fail to complete just the last page, it may be that you need to insert a simple ‘please turn over’ on the second last page.

As well as doing a dummy run of your data collection method, it is good practice to do a dummy run of the data entry and analysis you plan, to ensure that your data are ‘making sense’. You can use the dummy data collected from piloting your methods for this.

! Tips
• Be clear about your objectives (and outcomes). The better they are defined, the easier it will be to choose your methods
• Choose methods that are flexible and inclusive
• When examining the cost of a method, look across the entire method pathway, from design to data analysis
• Ensure when choosing triangulation (mixed methods) that you are clear how the different ‘stories’ will be pulled together
• Wherever possible, pilot your methods first, ideally right through to data entry and data analysis stages



Case study 8
Checking for survey consistency before and after a workshop
We used triangulation to consider the consistency of answers we received within a post-event survey we carried out with participants in a Zimbabwe Pedagogy of Trainers Workshop, as well as between this survey and one we carried out with participants before the workshop itself.

The workshop was aimed at senior librarians who provide training in information literacy in three institutions, and it aimed to build their capacity to deliver information literacy training using enquiry-based, learner-centred approaches. Following the event, we wanted to discover what participants had learned about M&E, so we started our post-workshop survey by asking an open-ended question: “What, if anything, have you learned about M&E as a result of the workshop?”

Sixteen out of 20 respondents emphasised that they had learned that M&E is a continuous process, in that it is not enough to carry out assessment merely after a workshop: there needs to be a baseline too.

Our next M&E survey question was also open-ended, asking participants what, on M&E, they would do differently in their own courses as a result of this workshop. Sixteen out of 20 said they would carry out pre-course training assessment/analysis.

We then asked the multiple-choice question: “As a result of the workshop, do you now consider M&E as being:
a. Less important than I considered it previously.
b. Only as important to me as before.
c. More important to me as a result of the workshop.”

Options ‘a’ and ‘b’ both attracted no respondents, with 100% of participants choosing ‘c’.

Our immediate – and pleasing – conclusion from these questions was that M&E was more important to our participants as a result of the workshop. However, the purpose of triangulation is to spot internal consistency in the answers given throughout a survey (or interviews or focus groups etc). In the above two questions it would have been an internal inconsistency if all respondents gave resounding answers to the first question of what they learned about M&E but then claimed, on the second question, not to consider M&E more important as a result of the workshop than before. We concluded then that our responses were indeed consistent.

This response too was entirely consistent with our first question. It seemed that the thing participants claimed most to have learned was just the thing they would like to change on their courses. A more general question later in the survey asked participants about the key highlights they took away from the training. Again, there was a huge emphasis on continuous M&E.

But we also wanted to compare responses between pre- and post-workshop surveys, so again we used triangulation. In the pre-workshop survey, we had asked: “How do you assess the training needs of the participants on your training courses?” Only 23% of respondents said they always carried out pre-diagnostic tests or assignments (i.e. assessing current levels of knowledge and/or need to establish a baseline).

Given this, any course emphasising the importance of pre-course training assessment should expect a favourable response in a post-workshop survey asking whether participants now consider M&E more important. This is exactly what we got with our response of 100% to the first question in our post-workshop survey. Triangulation helped again to assure us of the consistency of our results.
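A cross-check like the one in this case study can be automated once responses are in tabular form. The sketch below is illustrative (the column names and data are invented, and it assumes the pandas library): it tabulates the open-ended responses, coded by theme, against the multiple-choice answers, so any internal inconsistency shows up as an off-diagonal cell.

```python
import pandas as pd

# Invented example data: one row per respondent.
# 'learned_continuous_me' codes the open-ended answer (did they mention
# continuous M&E / baselines?); 'mc_importance' is the multiple-choice answer.
responses = pd.DataFrame({
    "learned_continuous_me": [True] * 16 + [False] * 4,
    "mc_importance": ["c"] * 20,  # a = less, b = same, c = more important
})

# Cross-tabulate the two questions: respondents who report having learned
# something new but answer 'a' or 'b' would be internally inconsistent.
print(pd.crosstab(responses["learned_continuous_me"], responses["mc_importance"]))
```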



Design your M&E process – Flowchart 4
START HERE → Consider the evaluation questions you want to answer → Do you want to consider changes in behaviour or in thinking? → Choose methods best suited to your objectives/outcomes, intervention, capacity/resources and trainees: quantitative methods (e.g. surveys), qualitative methods (e.g. focus groups), or mixed methods (e.g. focus groups and surveys). Be aware: choosing mixed methods (triangulation) is usual and good practice → Choose your tools → Pilot your tools → Choose your technologies


M&E in action Stage 5: Establishing a baseline
? Key questions

Establishing a baseline
• What is the current status of trainees’ knowledge and skills?
• What are their beliefs and attitudes?
• What do you know about trainees’ context now? What are the individual and organisational goals now?
• What is the current information landscape like?
• What do trainees need to do now?
• What would trainees like to do/know in the future?

To be able to evaluate the difference your training makes, you need to assess the current knowledge and understanding that trainees bring with them. That is, you need to establish a baseline.

Setting a baseline is important because it also allows you to make judgments about the content you need to cover in your learning intervention and establish the intervention’s learning objectives and outcomes. It gives you the opportunity to test participants’ perceptions of their pre-existing capabilities, their attitudes, and their knowledge of concepts that will be introduced on the training course.

Establishing a baseline may involve you assessing individuals’10:
• Skills, knowledge and attitudes
• Confidence levels
• Behaviours

For any mid- and post-training comparisons to be meaningful, you also need to establish the gaps between where trainees are before your training intervention and where they and their organisations would like them to be after it.

Establishing a baseline may also involve an investigation of the policy and practice of the organisations in which the individuals operate, as well as behaviours within the wider community.

You will already have defined your objectives, so it should be clear what you need to evaluate to establish a baseline. However, here are some possible topics you may wish to interrogate:
• Trainees’ knowledge of the information landscape, i.e. the information resources available to them, such as paper and electronic sources, as well as people and places that could help them to become informed
• Trainees’ ability to define their information needs and use the information landscape, e.g. their knowledge of how information retrieval systems work and how best to use them, and how to use the data, information or knowledge that they gather. (This would include their critical ability, their information-processing ability and their ability to manage the information, such as storing, organising and sharing the information they gather.)
• Trainees’ ability to evaluate information resources critically
• Trainees’ knowledge of the political and ethical issues surrounding access and use of information

This is not a comprehensive list, and the topics you choose will depend on the needs of your trainees, the training agenda and the surrounding context of the trainees. There are several models and frameworks that can help you identify the competencies required in a specific context or discipline (see Box 3: Information literacy standards, models and frameworks).

Effectively carried out, such assessments help you build a picture of what trainees currently know, understand and do. This serves not only to establish the baseline against which progress following the training can be charted, but also as a basis for the design of their training.

10. Garcia-Quismondo, M. (July, 2010). Evaluation of Information Literacy Programmes in Higher Education: Strategies and Tools. RUSC. 7 (2)



Case study 9
Using questionnaires as a diagnostic, to determine distance travelled and to measure progress
The IDS Information Capabilities programme uses a range of methods to gather data about information literacy skills and behaviours. In particular, we use questionnaires to test our assumptions about what is known and what is unknown by using a combination of self-assessment and diagnostic (test) questions. The information literacy questions are based on the competency levels defined in SCONUL’s 7 pillars.

Our questionnaires are invaluable. The responses allow us to capture evidence of trainees’ pre-training capabilities and so set a baseline. We then use this data to measure the distance travelled (i.e. the progress) after the learning intervention has taken place. We also follow up three months after training to establish how the trainees are applying what they have learned in their professional or educational tasks.

In our questionnaires we ask questions that introduce the concepts we plan to use in our training interventions. For example, in our questionnaire before a training intervention about pedagogical approaches, which we undertook with trainees before a workshop for senior information literacy trainers in Zimbabwe, we wanted to know participants’ prior experience or knowledge of pedagogical approaches. We invited respondents to assess how they would rate their skills (so assessing their self-perception of their skill), whether they could pick the right definition of a concept (assessing their knowledge) and whether they valued the approach we were taking (assessing their attitude).

We use a variety of question types. In the same pre-training assessment survey, we asked the prospective trainees to rate their skills in facilitation or training using a Likert scale of one to five. We used this smaller scale rather than, say, a scale of one to 10, because we find it is easier for people to measure their competencies against. We defined the numerical values to remove any ambiguity about the ranking, explaining: “Rate your response from 1 to 5 (where 1 is low and 5 is a high score).”

Our other question types in this intervention tested trainees’ prior knowledge by asking them to select the right answer from a list of answers, all of which were plausible. For example, we provided an example of a training method and asked the respondent to identify whether it was a learner-centred or teacher-centred approach: “At the beginning of each session, Precious asks students to rate their skills in a new topic.” (This is an example of a learner-centred training approach.)

This kind of multiple-choice question type also allows us to capture data about trainees’ attitudes and behaviours. For example, we used the following question to test how trainees would react in situations where a high-achiever is dominating the training session: “What strategy would you use with the high-achieving participants in a class/seminar/training workshop?” In this question we wanted to see if the trainer favoured more inclusive strategies (e.g. thanking them for their contribution and inviting others to respond), or would stimulate the high-achiever by providing them with more work, or would reprimand the student and ask them to leave.

We developed our pre-, immediate post- and three-monthly follow-up questionnaires for this intervention at the same time to ensure the data were consistent and comparative. Before using the questionnaire, we verified each question by checking it would give us the outcome we were looking for. We asked ourselves: What kinds of response are we hoping to get from this question? Is the question vague? Is it valid? We reviewed and rewrote questions for which our responses were unclear before piloting the survey questionnaire with a focus group and an M&E expert.
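A questionnaire like the one described above is straightforward to score programmatically. The sketch below is illustrative only (the ratings, answer key and question labels are invented): it separates self-assessment ratings from diagnostic test answers, which is exactly the distinction the case study relies on.

```python
# Illustrative scoring of a combined self-assessment and diagnostic questionnaire.
# The answer key and responses are invented for the example.
likert_ratings = {"facilitation": 3, "training": 4}             # self-perception, 1-5
diagnostic_key = {"q1": "learner-centred", "q2": "teacher-centred"}
diagnostic_answers = {"q1": "learner-centred", "q2": "learner-centred"}

# Self-assessment: report the mean rating.
mean_rating = sum(likert_ratings.values()) / len(likert_ratings)

# Diagnostic: report the proportion of correct answers (tested knowledge).
correct = sum(diagnostic_answers[q] == a for q, a in diagnostic_key.items())
print(f"Mean self-rating: {mean_rating:.1f}/5; "
      f"diagnostic score: {correct}/{len(diagnostic_key)}")
```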



Establishing your baseline – Flowchart 5
START HERE → From your defined objectives, determine what you want to evaluate to establish your baseline: trainees’ skills? trainees’ knowledge? trainees’ behaviours? trainees’ attitudes? anything else (e.g. from IL standards, models & frameworks)? You may also need to consider the policy and practices of organisations in which trainees operate, and behaviours in the wider community → Determine appropriate pre-training diagnostics, e.g. self-assessment questionnaires → Establish your baseline → Use evidence to: design your learning intervention; develop your learning objectives; identify approaches for achieving your learning outcomes → Use your pre-diagnostic questions at the end of training to assess impact, e.g. using ‘distance travelled’


M&E in action Stage 6: M&E during (and immediately after) training
? Key questions

M&E During and Post Training
During training:
• Is everyone learning? To what extent do your trainees understand the concepts? Do you need to elaborate further? Can they complete their activity independently or do they need help?
• How do your trainees feel about their learning experience? Do you need to adjust your training style to achieve the learning objectives/outcomes?
• Do trainees feel confident about their new skills and knowledge?
• Do you need to revisit any concepts in the next session?
Post training:
• What is the current level of knowledge and skills of trainees compared with the baseline?
• What is the immediate indication of training impact on the lives of the trainees?

M&E undertaken during information literacy training interventions offers you an immediate means of finding out about trainees’ experiences. It falls into one of two categories: assessment FOR learning (formative assessment) and assessment OF learning (summative assessment). The methods you use may be quantitative or qualitative, and may range from questioning to quizzes to hands-on activities. You need to understand the differences between formative and summative methods, and when to apply them, if they are to improve the quality of your trainees’ learning (see Box 7: Types of Assessment). You should also note that there may be some overlap between M&E undertaken towards the end of training interventions and that which takes place soon afterwards.

Definition Box 7
Types of assessment
Formative assessment occurs while training is being designed, developed and delivered. It includes methods such as pre-training surveys and in-class observations. As a form of assessment, it is regarded as more significant and informative than summative assessment (see below) as it provides both the trainer and the trainee with evidence of the breadth and depth of the learning taking place.
Summative assessment normally takes place at the end of a learning intervention, although it can be undertaken over the course of weeks or months. It focuses on targets, progress and impact, and allows judgments to be made about the extent to which the learning intervention’s objectives have been achieved. As such, it aims to set out what the trainee can actually do, focusing on the final result or outcome. This type of assessment data is usually of interest to funders and to institutional and departmental managers.

Assessing progress during training
Formative assessment – or assessment carried out during a learning intervention – brings with it many benefits. It can:
• Inform you of the extent to which your trainees have grasped new concepts, skills or attitudes
• Highlight what is not clear and what needs further elaboration, providing you with evidence about how your learning intervention should be immediately adjusted
• Offer trainees an adaptable trainer who responds to feedback and confirms their achievements …
• … so enabling them to learn to recognise what they themselves need to do to improve, thus promoting critical and independent ownership of their development

Below are some formative assessment methods you might want to consider, though the list is not exhaustive. A good trainer will use a range of methods and tools to assess the progress trainees are making:



Focused conversations or a private diary/reflective journal cover Objective, Reflective, Interpretive and Decisional information (also known as ORID). ORID is a useful structured debriefing process for trainers and trainees based on the Kolb11 experiential learning cycle. Its purpose is to enable participants to objectively observe an event or experience, reflect on it, interpret the experience and make decisions about how this will change their practice or behaviours in the future. It is a flexible process that can be used in groups and/or as a framework for building confidence.

Trainer’s tip: “ORID, or focused conversations, are useful for documenting or establishing which bits of your training need tweaking. Ideally, as a facilitator, you should be well practised in this method if you intend to use it. I’d recommend reading The Art of Focused Conversation: 100 ways to access group wisdom in the workplace by R. Brian Stanfield, and The Art of Focused Conversations for Schools: Over 100 ways to guide clear thinking and promote learning by Jo Nelson. If you don’t fully know what you’re doing it can be difficult to keep an ORID conversation on track.” Orla Cronin, Orla Cronin Research

! Tips
The reflective nature of ORID and focused conversations may seem awkward or be uncomfortable for some trainees. Overcome this by modelling the process first.
Don’t forget to supply Post-it notes and pens close by.

Mood monitors (or mood boards) are excellent visual tools for capturing instant and anonymous feedback on training. They are an effective way of gathering evidence on points that need further elaboration or clarification, as well as of gauging how trainees feel. You should explain how to use your mood monitor at the beginning of a training intervention and encourage participants to post comments throughout it, as well as to conclude the day or session with an overall comment. You can use comments posted at the end of the day or session to highlight areas that need revisiting and make immediate, essential changes to the programme. Mood monitors can be used in conjunction with the ORID focused discussion by asking participants to write their responses on sticky notes and post them on the smiley face (for high points) or the sad face (for low points).

! Tips
Before you begin your training, identify a space where participants can ‘park’ comments anonymously. A single flip-chart sheet is all that is required. It could comprise four sections:
• A smiley face to indicate ‘I feel happy about …’
• A sad face for ‘I am not happy about …’
• A question mark for points requiring clarification or further elaboration
• An exclamation mark for requests for more information

11. Stanfield, R.B. (2000). The Art of Focused Conversation. Toronto: New Society Publishers



Quizzes/questionnaires are a quick method of assessing your trainees’ ability to grasp new concepts and of identifying gaps in their knowledge. Examples include the use of coloured cards, thumb signals and instant-response devices such as i-clicker and understoodit12. However, you should consider the question and answer format carefully. To be most effective, ask questions that probe comprehension of the learning concepts. Also, ensure that all multiple-choice responses offered are plausible.

Trainer’s tip: “A really good method for formative assessment during training is the use of i-clickers. You can use these to set up questions covering anything you want to evaluate, from skills to attitudes. Participants answer the questions using the clicker and the results are displayed instantly, which helps you, the trainer, decide what to do next. I’ve found that with careful consideration of the type and structure of questions asked, the i-clicker method serves as a useful recap of trained issues, allowing me to decide whether to move the training on or not. The downside of the method is it’s costly, because you need electricity and batteries – and, of course, to buy the clickers themselves.” Blessing Chataira, ITOCA

Demonstrations/presentations Asking your trainees to summarise or demonstrate key learning concepts in a two-minute presentation will help you identify any gaps in knowledge or skills that need to be addressed or revisited. This approach is flexible and has the additional benefit of encouraging greater participation in the intervention. Encouraging trainees to express their perspectives and articulate concepts in their own words will promote deeper learning and improve their ability to recall the facts for longer.

! Tips
If you plan to ask trainees to give demonstrations or presentations, introduce this approach at the beginning of your training intervention. This sets the expectation that they will be sharing their views.

Trainer’s tip: “When I train staff, one of the first things I do is ask them whether they’d be willing to share their views and understanding of the subject we’re going to cover, and if they are we make this part of the ground rules for the training. This means that later I can ask individuals to summarise, in two minutes, the issues that have been discussed. That’s a good way for me to establish levels of understanding – which I can then use to guide my pace, as well as to identify any gaps in learning that I need to go on and address.” Babakisi Fidzani, University of Botswana

Probing questions For this, you should familiarise yourself with different questioning techniques and understand how to elicit information effectively. The type of questions that promote an informative exchange are open (probing) questions. These ask the respondent to provide reflective and detailed answers, and offer another strategy for finding out more information and checking trainees’ understanding of the learning concepts. For probing questions to be effective, you also need active listening skills.

Trainer’s tip: “Simple questions and answers are a useful, low-skill method to assess learners’ levels of understanding and ensure their performance will be improved after the session.” Daniel Mangale, ABC Project Kenya

Assessment rubrics can guide your observations and serve as a method of standardising feedback to your trainees. A simple rubric would measure your trainees’ ability to demonstrate the competencies set out in the learning objectives, rating it from one to five, where one is low and five is high. You could also use this information formally, in a summative assessment, as a record of what went well and what needs improvement.

! Tips
When monitoring the progress of an activity, link your observations to the learning objectives. If you are walking around observing, capture your observations on a record sheet using a simple rating scale from one to five (e.g. one equals ‘needs assistance’ and five equals ‘working independently’).

Trainer’s tip: “Asking students about the learning that has just taken place at various stages throughout the learning process itself guides me on whether or not I need to repeat part of the topics taught. It’s a simple, easy-to-use technique, though if you use it you should always remember to align the recap questions to your learning objectives.” Peter Gatiti, Aga Khan University

12. www.understoodit.com (see Glossary: Live polling)



! Tips
Ask What? Why? How? and Describe questions. This minimises the possibility of your receiving a simple yes/no answer. Remember to give the trainee plenty of time to respond before probing their responses, decisions or opinions further. If necessary, give them thinking time before asking them to respond to your question.

Practicals/group activities A short practical or activity is an excellent way in which you can assess knowledge and skill levels. Monitoring and recording the progress observed will help you identify and prioritise those needing assistance, as well as track group or individual performance. In larger groups, you can ask participants to make peer assessments using an assessment rubric. Feedback given during the training is formative, though if the assessment is recorded the information can feed into summative assessments at the end of the intervention.

Trainer’s tip: “I like to give participants a topic to discuss, brainstorm or present. This kind of group work gives me collective input from the group as a whole and helps me to assess whether or not participants are grasping the concepts we’ve been discussing. I can then decide whether to re-emphasise certain points or go on to the next item. You need though, with group work, to first consider the personalities of all the learners in a group. Some people, introverts, don’t like speaking and this can sometimes prove a barrier if you want to use this method for assessing impact.” Thomas Bello, University of Malawi

Trainer’s tip: “When I use quick-practice activities with my students to recap what’s been learned, I always go around the room to see how the activity is being carried out. There’s often at least one student who finds him or herself lost and doesn’t understand what to do, so being confident enough to monitor the activity is very important. You also need to ensure you allow sufficient time for the activity and that you have the resources you need – for example, you may need a computer lab if you’re asking students to search an online database. Despite these challenges, it’s a useful technique to assess understanding levels, particularly for skills-based training.” Boipuso Mologanyi, University of Botswana

Assessing progress at the end of training (post-training assessments)
Summative assessment – or assessment at the end of a training intervention – can:
• Assist in validating the efforts invested in a training intervention
• Meet any demands for accountability
• Demonstrate value for money (in terms of efficiency and effectiveness, in the case of funders)
• Help trainees practise work or job-related activities in a safe and supportive learning environment

Summative assessments usually include practical, written or multiple-choice tests and assignments. You need to be familiar with a range of summative assessment methods and tools, and think creatively, if you are to transform this assessment process from the predictable to the inspirational.

Examples include:
• Case-study assignments (rather than written assignments)
• Reflective journals
• Role-playing or hypothetical exercises
• Portfolios of evidence
• Diagnostic tests, e.g. Project SAILS or your own instrument

Summative assessment needs to relate clearly to the objectives of your training intervention by being able to demonstrate whether these objectives (whether individual, organisational or community-led) have been achieved. Assessment needs to be individual, fair, challenging and supportive. Although this is a tough challenge for busy trainers, you can use assessment rubrics and feedback sheets as a way of providing comparative and consistent feedback.

Normally, after your training intervention is finished you will re-test or survey your trainees to assess the changes in competency (knowledge, skills, attitudes and values) in addition to measuring the success of the training intervention. Use the same questions from your pre-training diagnostic or survey to assist with the comparability of results and draw conclusions about the impact of the training intervention.



Post-training assessments usually take place immediately after, or within one week of, the training intervention ending. If you ask your trainees to complete a post-training assessment after a week, you risk the chance that your trainees may forget to include all the benefits accrued during the training.

Reflective journals Reflective journals can be structured as in ORID (see above) but may also be flexible in format. They may contain a collection of thoughts, observations and decisions built up over a period of time, and include images, mind-maps and diagrams as a way of recording facts, opinions and emerging perspectives on a specific topic. You can also use reflective journals to stimulate discussions and capture changes in attitude over time.

Trainer’s logs As a trainer, you may also find it useful to build a reflective journal (see above) to reflect on your training experience and capture notes on how the training intervention can be improved or refined.

Trainer’s tip: “By keeping your own trainer’s log you can easily analyse information later for review purposes and go on to fill in the skills and confidence gaps in your trainees that you pinpoint. To keep a log you do, though, have to have very good awareness levels of doing the training itself, as well as writing. You also need to know before you start how to keep a log and how the information will be used, so that you’re recording the right information. It’s worth remembering that such logs can also help to pinpoint your own skills gaps.” Julia Paris, University of Johannesburg

Case study 10
A flexible approach to formative assessment
Participants from 11 organisations working across sub-Saharan Africa who work specifically with policy makers and influencers were invited to attend an International Network for the Availability of Scientific Publications (INASP) and IDS training workshop to build their pedagogical skills. In a session on the principles of constructivism and how this teaching and learning theory is applied to information literacy training, we used a range of formative assessment techniques to check whether participants understood the key concepts. These included a daily mood monitor, thumbs-up and thumbs-down signals, role play and questioning techniques.

However, although some of these techniques were planned and written into the lesson plan, we also introduced formative assessment methods in response to observations of the trainees’ behaviours. For example, if the trainees looked confused or unsure about something we said, we asked them probing questions to understand and diagnose their misunderstanding. In this way, together we explored the gap in their understanding by questioning the concepts and drawing links to their own working experience(s).

“Role-playing can feel embarrassing and seem silly, but if you explain the method clearly, and offer people a safe space in which to do it, it can be a good energiser for a group, for example after a lunch break. A useful role play I’ve used is to ask one trainee to pretend to be ‘Google’, another ‘Google Scholar’, another ‘my friend’ etc. and then ask other trainees in the group to pose questions to each of them. The answers returned should demonstrate the assumptions, limitations and type of answer each information source would give. As the trainer, you can then identify where the role-playing proved difficult or the answers given were inaccurate.” John Stephen Agbenyo, Savana Signatures



M&E during and immediately after training – Flowchart 6
START HERE → Formative assessment: M&E during a training intervention. For example: focused conversations; quizzes/questionnaires; demonstrations/presentations; probing questions; practicals/group activities; mood monitors. Consider: Have you captured evidence of the extent of learning? Do you need to change your teaching style? Do you need to revisit any concepts? Are individuals learning? Consider the respective weight that you give to quantitative and qualitative methods and/or tools → Summative assessment: M&E after a training intervention. For example: case study assignments; reflective journals; diagnostic tools; hypothetical exercises; portfolios of evidence; trainer’s logs; assessment rubrics


M&E in action Stage 7: Data analysis
? Key questions

Analysing your data
• Have you piloted data entry and analysis before embarking on data gathering?
• Have you ensured your sample size is appropriate for your intended analysis?
• Who will be responsible for data entry?
• Will the same person or people be responsible for the data analysis?
• Have you checked and re-checked that your data is entered correctly?
• Do you have measures in place to ensure you are transparent in your analysis of qualitative data?

Data analysis, broadly speaking, involves the detailed breakdown and examination of numbers and text. (For convenience we can include other forms of non-numeric data, e.g. photographs, under the ‘text’ banner.) You should by now have plenty of M&E data gathered before, during and immediately after your information literacy intervention, and be ready for this analysis.

Your data is likely to be in a wealth of different forms. For example, it may be in written statements, in video testimonials and diaries, or in responses to survey questionnaires and tests. Organising and managing this data can be very time consuming and labour intensive. This is one reason why you need to have properly piloted your data gathering, data entry and data analysis early on (see Stage 4: Designing M&E). Mistakes can be costly.

Data entry can be a simple matter of typing responses from a questionnaire into a spreadsheet program such as Microsoft Excel. Alternatively, it may be a more time-consuming process of transcribing recorded interviews or written diaries. Do devote some time to checking your data to ensure they have been accurately entered. Again, this could avoid costly mistakes which only become evident further down the line.

You can check your quantitative data by using basic tools in your spreadsheet program to ensure there are no ‘illegal’ entries (i.e. data transferred inaccurately from the original data source). This is called data cleaning. You can also take 10% of the data and cross-check it, re-entering it where you find substantial errors. (In commercial research, this is called ‘double entering’ and is done routinely as a way of minimising errors.)

Similarly, for qualitative data, listen once more to 10% of your recorded interviews and check the accuracy of your transcription. Judging the consistency, authority and credibility of results are other ways of evaluating qualitative data.
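Basic data cleaning of this kind can also be scripted. The sketch below is a minimal illustration, assuming the pandas library and an invented survey layout: it flags ‘illegal’ entries outside a 1-5 scale and compares a re-entered sample against the original entries, as in double entering.

```python
import pandas as pd

# Invented survey data: self-assessment scores that should all lie between 1 and 5.
data = pd.DataFrame({"respondent": [1, 2, 3, 4], "score": [4, 2, 7, 5]})

# Flag 'illegal' entries, i.e. values outside the permitted 1-5 range.
illegal = data[~data["score"].between(1, 5)]
print("Illegal entries:\n", illegal)

# Double entering: re-key a 10% sample and compare it with the original entries.
reentered = pd.DataFrame({"respondent": [3], "score": [1]})
merged = reentered.merge(data, on="respondent", suffixes=("_reentered", "_original"))
mismatches = merged[merged["score_reentered"] != merged["score_original"]]
print("Mismatched rows to re-check:\n", mismatches)
```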
Once you have checked your data, you are ready for your data analysis. This should be conducted in the light of your original questions. The more carefully you have articulated what you are trying to evaluate, the easier your analysis will be. As so often in the M&E of your information literacy training intervention, success will come down to how well you defined your objectives (see Stage 2: Programme strategy and objectives).

Using statistics for quantitative data analysis
Quantitative data can be subjected to descriptive statistics and/or inferential statistics. Descriptive statistics provide simple summaries of large amounts of information. They tend to quantify the extent of a phenomenon, for example how many people used a particular information source. They may include the use of statistical tools such as frequency tables, cross tabulations (‘crosstabs’) and measures of central tendency, e.g. means. Descriptive statistics can be informative and give you a sense of the direction in which your findings are going. Sometimes, though, the sample size isn’t big enough. To make the strongest claims, you may wish to consider using inferential statistics.

Inferential statistics enable you to make judgments about the patterns you detect. Are they just a consequence of random variation or are they significant? The term significant in this context means more than ‘important’. It refers to the likelihood (probability, in statistical terms) that the observed relationship (e.g. between variables) or difference (e.g. between means) in a sample occurred by pure chance. The lower this probability (which we call the p value), the more likely it is that we have found a real difference. The reason we need to apply significance testing is that it can uncover easy-to-make but wrong conclusions. For example, we may observe a pattern of improvement over time in trainees, but significance testing may reveal this pattern is actually due to just one person being particularly enthusiastic.
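The descriptive summaries mentioned above take only a few lines of code. This sketch is illustrative (the data is invented, and pandas is assumed): it produces a frequency table, a crosstab and group means of the kind described.

```python
import pandas as pd

# Invented post-training survey data.
df = pd.DataFrame({
    "group": ["trained", "trained", "untrained", "untrained", "trained"],
    "source_used": ["library", "Google", "Google", "Google", "library"],
    "score": [4, 5, 2, 3, 4],
})

print(df["source_used"].value_counts())             # frequency table
print(pd.crosstab(df["group"], df["source_used"]))  # crosstab of group vs source
print(df.groupby("group")["score"].mean())          # mean score per group
```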



To be able to conduct inferential statistics, the sample size (the number of people who responded to e.g. your survey) needs to be large enough to be able to detect true differences. The minimum sample size can be calculated in advance by a statistician (or you can do a power calculation yourself). Generally, running inferential statistics on a sample smaller than 30 is unlikely to yield any definite conclusions.
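If you want to attempt the power calculation yourself, libraries such as statsmodels reduce it to one line. A minimal sketch, assuming statsmodels is installed and using conventional (illustrative) settings of a medium effect size, 5% significance and 80% power:

```python
from statsmodels.stats.power import TTestIndPower

# Sample size per group needed to detect a medium effect (Cohen's d = 0.5)
# at the conventional 5% significance level with 80% power.
n = TTestIndPower().solve_power(effect_size=0.5, alpha=0.05, power=0.8)
print(f"Approximately {n:.0f} participants per group")  # roughly 64
```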
The simplest form of significance testing is the t-test (see Box 9: The T-Test). This is suitable for checking whether two groups (e.g. a trained and an untrained group) are significantly different. It can also be used for doing before and after comparisons, for example, to see how much a group improved from their starting point by comparing their scores at baseline and post-training.
group improved from their starting point by comparing very least, summarised with detailed notes) and subjected program such as Excel (usually one paragraph per cell)
their scores at baseline and post-training. to a systematic analysis. While conducting the analysis, and then ‘coding’ this data by using the next column to
qualitative researchers consciously ‘bracket’ their views categorise comments by theme. It is even possible to take
More complex questions require more complex statistics. and do not let them impose on the data. This is difficult. a similar approach using the Table function in Word.
For example, if you want to examine the relationship However, researchers are usually advised to document the
between the amount of time spent in class and test process and make notes about the decisions they take. Though there are a wide variety of qualitative data
scores, you could test whether a correlation exists between analysis approaches, e.g. conversation analysis, content
amount of time and test scores. If you are conducting an Systematic analysis generally involves coding the data. analysis, discourse analysis, narrative analysis etc, the most
exploratory analysis to examine whether any particular This can be done using pre-defined frameworks, such as straightforward, and most common, is thematic analysis.
features of your intervention predict success, you could UNESCOs information literacy indicators (a deductive Thematic analysis emphasises pinpointing, examining and
conduct a factor analysis. approach) or gradually identifying patters and evolving recording patterns within data. See Box 8: Carrying Out
codes as appropriate from the data (an inductive approach). Thematic Analysis.
It is unlikely you will need specialised statistical software This is done repeatedly until all elements are coded. In
(e.g. SPSS) for any of this. Microsoft Excel offers a number some cases coding is checked using a collaborator.
of powerful data analysis tools. However, you may need



Whether you use qualitative or quantitative data or a combination to present your M&E findings is likely to be influenced by the stakeholders. Despite organisations and funders showing an increased interest in and acceptance of qualitative data, and an emphasis on process, changes in attitude and relationships etc., they may still have a preference for quantitative data. As information literacy is a competency-based generic skill, it is important that your evaluation assesses the skills and knowledge (quantitative) acquired through the programme as well as improvements to attitudes and values (qualitative). These changes should be captured at the individual and organisational levels and incorporated into an evaluation of results13.

Guidance Box 8
Carrying out thematic analysis
The following is a description of the steps taken to undertake a deductive approach. However, it should be noted that a combination of techniques can be applied: themes may be applied to the text, but within themes new themes may be identified that were not predicted.
• Establish the themes you are interested in. This will be a combination of your original evaluation questions, plus unexpected themes that emerge from reading through your transcripts.
• Create a code book. This is a list of the themes, with their definitions. It will help you remember your own logic when you come to writing your report, and also help if the analysis is being conducted by more than one person.
• Apply the codes to the text. You can do this either by highlighting and annotating manually, or by selecting text and attaching a code if using qualitative data analysis software.
• Gather all text relating to a particular code. You can sort a Word or spreadsheet table, but if you are highlighting on paper you will need to do some rewriting/typing.
• Summarise the text. Pay particular attention to contradictions and the variety of positions.
• Revise your summary. Do this until you feel you have captured the meaning of the responses in a way that enables you to communicate it swiftly in your report.

13. Garcia-Quismondo, M. (July, 2010). Evaluation of Information Literacy Programmes in Higher Education: Strategies and Tools. RUSC. 7 (2)
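The spreadsheet-style coding described in Box 8 translates directly into a few lines of code. This sketch is illustrative (the themes, code book and quotes are invented): each transcript paragraph is tagged with themes from the code book, then all text for one code is gathered together, mirroring the steps in the box.

```python
# Illustrative thematic coding: one transcript paragraph per entry,
# tagged with themes from a code book (themes and text are invented).
code_book = {
    "baseline": "mentions pre-training assessment or baselines",
    "confidence": "mentions trainee confidence",
}

coded_text = [
    ("I now always run a pre-course diagnostic.", ["baseline"]),
    ("Students seemed more confident searching.", ["confidence"]),
    ("Without a baseline we couldn't show progress.", ["baseline"]),
]

# Gather all text relating to a particular code (here: 'baseline').
baseline_quotes = [text for text, codes in coded_text if "baseline" in codes]
for quote in baseline_quotes:
    print("-", quote)
```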



Case study 11
Using distance travelled
At IDS we use the distance travelled technique to assess the progress made in achieving our desired training outcomes. We used it, for example, in a pedagogy of senior information literacy trainers workshop held with participants who work in universities in Zimbabwe. Table 5 shows some of the key attributes covered in this training intervention. The three columns represent two different surveys: the first is the pre-training needs survey (Column A); the second and third are two questions posed in the immediate post-training survey (Columns B and C), in which participants retrospectively scored their skills prior to training (Column B) and scored their new skills after receiving the training (Column C). Note, all three columns measured the same attributes and, in each case, these were scored using the same five-point scale in which one was the lowest level of attainment and five the highest.

Table 5: Skills self-assessment scores from three surveys and their averages

Attribute | A: Pre-training needs survey (1) | B: Immediate post-training survey (2), retrospective prior-knowledge assessment | C: Immediate post-training survey (2), new competencies
Using mind-maps to develop search strategies | 3.15 | 2.35 | 4
Using internet search engines to look for information (e.g. Google, Google Scholar) | 4.76 | 4.37 | 4.7
Writing Boolean search phrases using concept tables | 3.45 | 3.15 | 4.25
Evaluating search results | 4.15 | 4 | 4.7
Narrowing/filtering search results | 4.2 | 3.85 | 4.6
Using electronic library resources to find information | 4.81 | 4.35 | 4.8
Explaining to someone else copyright and licensing laws of online journals | 3.9 | 3.5 | 4.25
Promoting use of e-resources in my organisation | 4.62 | 3.8 | 4.8
Promoting use of e-resources in other organisations | 3.3 | 3 | 4.35
Lecturing | 4.14 | 3.05 | 4.1
Facilitating | 4.05 | 2.55 | 4.2
Training | 4.15 | 2.9 | 4.37
Averages | 4.06 | 3.41 | 4.43



Case study 11 (continued)

When you analyse the results, notice the difference between the average scores in Column A (pre-training) and Column B (retrospective re-assessment of prior knowledge). The participants' average score was 4.06 out of five (81%) prior to training but dropped to 3.41 (68%) when they re-assessed their prior skills after the training. Given that both of these scores were meant to capture the participants' skills before training had taken place, what explains this disparity? The data suggests two explanations, which we first came across in Stage 1: Assessing your trainees' needs:

• Known/unknowns: after training, participants begin to realise how little they knew before training. Therefore, when asked to score themselves retrospectively, they score themselves lower.

• The two questions (i.e. the immediate retrospective re-assessment of skills prior to training and the assessment of new skills) are asked in tandem: given that participants are being asked to score themselves before and after training, they overstate their gains to emphasise how much they have learned from the training.

While the second explanation is a theoretical possibility, the evidence may be stronger for the first claim. We can also argue that individuals cannot recollect the score they gave in the first questionnaire, and that the retrospective re-assessment is therefore a more realistic assessment of their pre-training capabilities.

Moving beyond self-assessment: testing participants

The alternative to self-assessment is to provide participants with a test. This can be through survey questions testing participants' knowledge or understanding of a concept, or through assessment within the classroom. The former, however, lends itself better to quantified analysis and therefore makes comparisons easier when it comes to improvements in the distance travelled.

Consider Table 6. These are survey results from a question in which the training respondents were given seven scenarios (shown in the first column) and asked whether they considered each scenario predominantly teacher- or learner-centred. The table then shows, for the surveys before and after training, how many respondents judged each scenario teacher- or learner-centred and the percentage giving the correct answer. For your reference, the correct approach is shown in the second column.

The table shows that respondents answering after the training were better able to identify the correct approach in all but one instance (the exception being the first scenario, with Mary). Our respondents scored well on both tests, 84.2% of them being able to identify the approach before training. After training this rose to 91.4%.

If we had not done the initial study before the course, we could have measured our success with the lofty figure of 91.4% in isolation. However, our pre-course survey showed that respondents were already largely aware of the differences between teacher- and learner-centred approaches. The true distance travelled here was 7.2%.
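The arithmetic behind these figures is simple enough to script. The following is a minimal sketch (our addition, not part of the original case study) using the 'mind-maps' attribute from Table 5 and the identification-test percentages from Table 6:

```python
# A minimal sketch of the distance travelled calculations described in
# Case study 11. The figures are taken from Tables 5 and 6.

# Self-assessment route (Table 5, 'Using mind-maps to develop search strategies')
naive_pre = 3.15          # Column A: pre-training needs survey
retrospective_pre = 2.35  # Column B: retrospective re-assessment of prior skills
post_training = 4.00      # Column C: new competencies after training

print(f"Apparent gain vs original pre-score: {post_training - naive_pre:.2f}")
print(f"Distance travelled vs retrospective: {post_training - retrospective_pre:.2f}")

# Testing route (Table 6): % correctly identifying the teaching approach
pre_correct, post_correct = 84.2, 91.4
print(f"True distance travelled: {post_correct - pre_correct:.1f} percentage points")
```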



Table 6: Participant scores on how well they identify the learner- or teacher-centred approaches

| Scenario | Correct approach | Teacher-centred (before training) | Learner-centred (before training) | % giving correct answer (before) | Teacher-centred (after training) | Learner-centred (after training) | % giving correct answer (after) |
|---|---|---|---|---|---|---|---|
| Mary sets written assignments to assess her students' understanding of the lesson taught | Teacher | 14 | 5 | 73.7% | 13 | 7 | 65% |
| Joseph stands at the front of the classroom for most of the lesson | Teacher | 17 | 2 | 89.5% | 20 | 0 | 100% |
| At the beginning of each session, Precious asks students to rate their skills in a new topic | Learner | 1 | 18 | 94.7% | 0 | 20 | 100% |
| Innocent expects students to take notes throughout the class | Teacher | 15 | 4 | 78.9% | 20 | 0 | 100% |
| Chipo likes the students to be quiet when she is delivering the facts | Teacher | 18 | 1 | 94.7% | 20 | 0 | 100% |
| Anesu shows students how to use e-databases before setting them a problem to see how they apply these skills | Learner | 4 | 15 | 78.9% | 1 | 19 | 95% |
| In Jabulani's class, students often contradict his ideas | Learner | 4 | 15 | 78.9% | 4 | 16 | 80% |

Response counts: 19 in the pre-training survey, 20 in the post-training survey.



Behavioural change: attitudes and values

One of the major focuses of the IDS information literacy programme is not only to improve participants' skills, in the narrow technical sense, but also their attitudes and values. Using the same scenarios as the questions above, in which respondents were tested on their ability to identify the approach being used, Table 7 looks at how much they valued the respective approach.

Out of the seven scenarios, four are examples of the teacher-centred approach and three are learner-centred. Respondents were asked to place a value on each of them before and after their training. The table shows the percentage of respondents who rated the scenario either four or five on a five-point scale (one being the lowest value, five the highest). The results indicate that respondents placed far higher value on the learner-centred scenarios both before (77.17%) and after (85%) training.

The advantage of having the pre-training survey was that we were aware, even before providing the training, that our participants valued the learner-centred approach. Had they been a more traditional group who distrusted learner-centred approaches, it would have been useful to know beforehand, so we could tailor our workshop accordingly.

Table 7: Participant scores on how much they value learner- or teacher-centred approaches

| Scenario | Approach | Value before training | Value after training |
|---|---|---|---|
| Mary sets written assignments to assess her students' understanding of the lesson taught | Teacher | 47.40% | 35% |
| Joseph stands at the front of the classroom for most of the lesson | Teacher | 15.80% | 0% |
| At the beginning of each session, Precious asks students to rate their skills in a new topic | Learner | 78.90% | 95% |
| Innocent expects students to take notes throughout the class | Teacher | 15.80% | 0% |
| Chipo likes the students to be quiet when she is delivering the facts | Teacher | 0% | 5% |
| Anesu shows students how to use e-databases before setting them a problem to see how they apply these skills | Learner | 100% | 95% |
| In Jabulani's class, students often contradict his ideas | Learner | 52.60% | 65% |
| Averages (teacher-centred scenarios) | | 19.75% | 10% |
| Averages (learner-centred scenarios) | | 77.17% | 85.00% |



Guidance Box 9

The t-test and how to apply it

The most common method used to establish whether two groups differ statistically is a comparison of the two means (the mathematical averages). Sometimes, though, this can lead to misleading conclusions.

Consider this example: you are about to facilitate a workshop on information literacy that emphasises the importance of search strategies. Before training begins you test the participants on their search skills, and they score an average of 50%. This is the baseline against which any progress after the workshop will be measured. You give the participants another test after the course, on which they average 60%. You decide your main objective has been met: the participants' search skills are, on average, better as a result of your workshop.

Your conclusion, however, might be premature. Fluctuations in the mean often occur by chance, particularly in small groups. It might be the case, for example, that a handful of participants have improved substantially while the rest remain at roughly the same level, and that this scattered improvement artificially inflates the mean.

This is where the t-test is useful. The t-test looks at the standard deviation in your sample and considers this in the light of your sample size. (The standard deviation is the extent to which each participant's performance deviates from the class's mean.) The t-test will reveal whether your confidence in having made a difference to your participants' performance is justified, i.e. whether you have a large enough sample size and your participants' improved performance is realised across the board.

The t-test is used for sample sizes of fewer than 30, but the closer you are to 30 the more confident you can be with your data.

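The pitfall Guidance Box 9 describes is easy to demonstrate. The sketch below (our addition; all scores are invented) compares two classes whose average gain is identical, but where only the first class improves across the board:

```python
# A minimal sketch of why comparing means alone can mislead: both classes
# improve from an average of 50 to 60, but only the first improves
# consistently. All scores are invented for illustration.
from scipy import stats

pre       = [50, 52, 48, 51, 49, 50, 50, 50]
steady    = [60, 61, 59, 62, 58, 60, 61, 59]   # everyone gains roughly 10 points
scattered = [95, 50, 48, 92, 49, 50, 46, 50]   # two big gains inflate the mean

for post in (steady, scattered):
    t_stat, p_two_sided = stats.ttest_rel(post, pre)
    p = p_two_sided / 2  # one-tailed, since we expect post > pre
    mean_gain = sum(post) / len(post) - sum(pre) / len(pre)
    print(f"mean gain = {mean_gain:.1f}, one-tailed p = {p:.4f}")
```

The consistent class yields a far smaller p value than the scattered one, even though the two mean gains are the same.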


Guidance Box 10

Using t-tests in Microsoft Excel

The following is a step-by-step guide to using t-tests in Microsoft Excel 2007 (though other spreadsheet software also has the necessary functions). You may need to amend these instructions or download the Analysis ToolPak depending on the version of Excel loaded on your PC.

1. Transfer your data into a spreadsheet. For the example presented in Case study 11: Using distance travelled, we have pre- and post-workshop scores for 20 participants asked to rate from one to five their skills in 'Using mind-maps to develop search strategies', with one being the lowest score and five the highest. As you can see in Table 8, the average before training was 3.15 and after training 4. Our task is to find out whether this is a significant difference.

[Table 8: individual responses and their means]

2. Go to the 'Formulas' toolbar, then to 'More Functions', and finally to 'Statistical'. Here you will find an option for the t-test. Figure 6 shows the box that appears after this selection.

[Figure 6: screen grab of the first step in using t-tests in MS Excel]

Fill this box in as follows:

3. Array 1 – select all your scores for the first sample (i.e. the scores for the pre-workshop test).

4. Array 2 – select the scores for the sample against which you want to compare Array 1 (i.e. the post-workshop scores).

5. Tails – you must specify a one- or two-tailed hypothesis. A one-tailed hypothesis specifies direction; for instance, it claims that Array 1 will be greater than Array 2. A two-tailed hypothesis is non-directional: it merely states that there will be some kind of difference between Arrays 1 and 2 without specifying which is likely to be greater. In M&E we are generally looking for an improvement of some sort, so usually we will choose a one-tailed test, which is more powerful (because the test is only 'looking' for effects in one direction).

6. Type – choose Type 1 when your population is paired. This is the case when, for instance, you carry out two tests on one group, so that Array 1 and Array 2 comprise the same people at different points in time. Type 2 compares two different groups when there is an equal variance. Type 3 compares two different groups when the variance is unequal. (An unequal variance is when one data set is more scattered than the other, i.e. it contains figures substantially higher and substantially lower than the mean.)

Figure 7 shows the dialogue box filled in for our worked example. On clicking 'OK' you get the figure 0.001813.

[Figure 7: completed t-test box in MS Excel]

This is your p value. The p value is the likelihood that the difference between the two means is due to chance: here, the likelihood that the fluctuations in the two means are random is roughly 0.2%. The norm in social science research is to accept anything lower than 0.05,14 which indicates a less than 5% likelihood of chance fluctuation. Given that your p value is lower than p = 0.05, you can say with some confidence that "this effect is not due to chance alone".

14. http://www.medicine.ox.ac.uk/bandolier/booth/glossary/pvalue.html
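If you work outside Excel, the same paired (Type 1), one-tailed test can be run in Python. The sketch below is our addition; the individual scores are invented stand-ins chosen only so that the means match the worked example (3.15 before, 4 after), since the actual Table 8 responses are not reproduced here:

```python
# A minimal sketch of the Excel TTEST(array1, array2, 1, 1) calculation:
# a paired (Type 1), one-tailed t-test on 20 participants' scores.
# The scores are invented stand-ins with the worked example's means.
from scipy import stats

pre_workshop  = [3] * 17 + [4] * 3   # mean = 3.15
post_workshop = [4] * 20             # mean = 4.0

t_stat, p_two_sided = stats.ttest_rel(post_workshop, pre_workshop)
p_one_tailed = p_two_sided / 2       # one-tailed: we expect post > pre

print(f"one-tailed p value = {p_one_tailed:.6f}")
# By the convention described above, anything below 0.05 is treated as
# unlikely to be due to chance alone.
```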



Stage 8: Learning from M&E

M&E in action

? Key questions: Learning from your M&E
• How will you review what you have learned from your M&E activities?
• Who needs to be involved?
• What is the best way of embedding learning in the organisation?
• How will you ensure lessons turn into actions?

There is a clear progression from data analysis, to the drawing of conclusions, to decisions about the refining and improvement of training interventions, to other decisions about, for example, replacing the training or acquiring additional resources. This is the pathway you now want to go along.

However, learning is a continuous process. The learning cycle starts at the beginning of an intervention and continues through and beyond it.

Regularly considering 'lessons learned' and planning what to do to address them will help you increase the impact of your information literacy training interventions. To do this well, aim for:
• Participation and consensus from all stakeholders
• A balance of generative (creative) and adaptive (survival) learning, so it's not just about 'what do we need to fix?' but also 'what could we do better or differently?'
• A simple process which becomes familiar and embedded within the organisation
• A process where people feel ownership of the learning, so they are motivated to implement action plans

It is also worth considering what we know already about effective follow-up of M&E initiatives. A survey of 282 evaluators and evaluation managers15 found the most important strategies for facilitating the use of M&E data to be:
• Planning for learning at the beginning of an evaluation
• Identifying intended users and intended uses of the evaluation early on
• Communicating findings to stakeholders as the evaluation progresses
• Developing and implementing a learning and communication plan

Factors affecting learning

We sometimes assume that learning 'just happens'. Unfortunately, it mostly doesn't. Factors which have been identified16 as influencing effective learning include:
• Organisational culture
• Individual ability and confidence
• Power and hierarchies
• Donor support
• Personal resources (time and energy)
• Motivation
• Quality of facilitation
• Accessibility of information

Table 9 provides tips for optimising each of these factors, but you will need to examine your own organisational and individual context to work out how best to make sure that learning from your own M&E becomes a reality.

15. Preskill, H. and V. Caracelli (1997). Current and Developing Conceptions of Use: Evaluation Use TIG Survey Results. Evaluation Practice, Fall 1997, 18.3
16. A number of these are described in: Kusters, C., van Vught, S., Wigboldus, S., Williams, B. & Woodhill, J. (2011). Making evaluations matter: A practical guide for evaluators. Centre for Development Innovation, Wageningen University & Research centre, Wageningen, The Netherlands. www.cdi.wur.nl. Cites: Preskill, H. (2007) Process Use, in Encyclopaedia of Evaluation, by Mathison, S. (Ed.). Thousand Oaks, CA, Sage, pp. 327-28; and Preskill, H. and Russ-Eft, D. (2005), Building Evaluation Capacity: 72 Activities for Teaching and Training, Thousand Oaks, CA, Sage Publications



Table 9: Tips for enhancing learning

Factor Tips
Organisational culture Is learning embedded in the strategy and operational plan?
Are employee reward structures designed to reward learning?
Do senior management take learning seriously and attend learning events?
Individual ability and confidence Is some training in understanding information (e.g. qualitative and quantitative data) desirable for some staff?
Are there events designed so everyone’s voice can be heard?
Does the group feel ‘safe’ for each individual?
Are individual preferences in how to communicate (sketches, drama, poems, words, videos, photos etc.) taken
into account?
Are learning cycles designed so that different learning styles can be encompassed?
Power and hierarchies Are events designed to minimise the symbols of hierarchies (for example when considering invitees, seating plans,
roles assigned)?
Do senior staff genuinely listen to more junior staff?
Are the results of learning events valued, and are they taken seriously and acted on?
Donor support Is there a budget line for learning cycles?
Are donors involved in organisational learning?
Are results communicated to donors?
Personal resources Is learning a formal element of individuals’ roles?
Are events scheduled sensitively in terms of peak times in individuals’ roles?



Table 9 (continued): Tips for enhancing learning

Factor Tips

Motivation Are events designed with short-term and long-term benefits in mind?
Are they enjoyable?
Quality of facilitation Is there a designated facilitator(s) for learning events?
Is the event planned carefully?
Does the facilitator have a range of tools and exercises available to examine the information from different
perspectives?
Accessibility of information Is the ‘raw material’ for learning easily accessible in terms of technology, readability, language?
Is there a structure for storing the output of learning events so that it is not lost?



Designing learning cycles

Before any learning cycle event, you need to ensure that your key stakeholders and the facilitator17 of any learning event you hold are clear about what is meant by lessons learned, and why you need to identify them. That may, for example, be for reporting to primary stakeholders, partners or funders, or perhaps to deal with a crisis or as a strategy to support fundraising.

You also need to consider some specifics. What is the information you are planning to review and how will it be disseminated to the participants? Similarly, how will you share the lessons in the end? It could be verbally, in writing, by video, or through drama etc. (see Stage 9: Communicating findings).

Who should be involved in your event? On occasion, there may be arguments for subdividing the event into, for example, lessons for trainers, lessons for the organisation and lessons for programme design.

You also need to build into your cycle measures for making sure your lessons learned go forward into planning activities.

! Tips

Encourage participants to engage with your M&E material by holding an 'insight generation' session in which you ask them a series of questions:
• What did I hear which confirmed what I already know?
• What did I hear that was new?
• What surprised me?
• What excited me?
• What worried me?
• What contradicts something I thought I knew?
The answers (the 'insights') can then become the raw material for a lessons learned session.

Formulating and documenting lessons learned

When shaping a lesson learned, you should:
• Include a generalised principle. It should be more than just an observation or a description.
• Place the lesson in its context. People need to understand the situation in which the lesson learned occurred.
• Justify the lesson with evidence. Or describe how such evidence might be gathered, if it is a hypothesis.
• Consider the lesson's usefulness. Check it is neither too general nor too specific.

Depending on how you plan to use, disseminate and store your lessons learned, you may wish to come up with your own template. You may want to include the following information (a structured sketch follows below):
• The theme of the lesson
• Your original understanding or assumption
• Your revised understanding or assumption
• The evidence for, or examples of, this lesson
• The action you intend to take, with timescales and individuals responsible

17. Managing for impact in Rural Development: A guide for project M&E. www.ifad.org
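If you store lessons learned electronically, the template fields above map naturally onto a simple structured record. Here is a minimal sketch in Python (our own; the lesson itself is invented for illustration):

```python
# A sketch of a lessons-learned record using the suggested template fields.
# The example lesson is invented.
lesson = {
    "theme": "Scheduling of training sessions",
    "original_assumption": "Afternoon sessions suit university librarians",
    "revised_understanding": "Morning sessions achieve higher attendance",
    "evidence": "Attendance records from three workshops",
    "action": "Move future sessions to mornings",
    "responsible": "Training coordinator",
    "timescale": "Next quarter",
}

for field, value in lesson.items():
    print(f"{field.replace('_', ' ').capitalize()}: {value}")
```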



Methods and tools for learning from M&E

Tools for learning from M&E tend to cluster into two types: tools to support convergent thinking (bringing material together) and tools to support divergent thinking ('out of the box' thinking). Ensure that you build both of these into whatever overall method you adopt.

Examples of tools for convergent thinking include brainstorming, where you write the ideas on cards and then cluster the cards and label the clusters18. Mind-mapping19 is another method, and 'virtual' mind-mapping tools are available for working remotely.

There are many tools available for promoting divergent thinking. Most of them involve either making random connections or deliberately taking a different perspective. For example, one exercise involves four 'Rs': 're-expression' (using different words or senses), 'related worlds' (finding a 'world' where a similar problem has been solved), 'revolution' (breaking the rules) and 'random links' (forcing connections with random stimuli to create new insights).

Finally, Focused Conversation20 is a useful tool for either kind of thinking. It involves a guided conversation through objective, reflective, interpretive and decision kinds of thinking, and can embody more creative elements at the interpretive stage. See the description of ORID in Stage 6: M&E during training and immediately after training.

Who should be involved?

We recommend that everyone who has a stake in the organisation be involved in learning cycles, and that the learning takes place in groups rather than as a personal study activity. The workshop is by far the most common method used for organisational learning. Gathering the project team (or facilitation team) and trainees together is also a useful method.

The advantages of working with groups of stakeholders rather than individuals are:
• Different perspectives facilitate the emergence of new information and ideas
• Different perspectives help to limit biases
• Achieving consensus about lessons learned makes it easier to agree on and implement actions

It may, though, not always be ideal to gather stakeholders in one single group. You may need to run a series of events at different levels in the organisation, sending the lessons learned up through the various levels of decision making. Alternatively, it may be more practical to hold a single event with representatives of different stakeholder groups. If you do this, consider how you will communicate with those who cannot be present and ensure their commitment to any actions agreed. Whatever you decide will work best for your organisation, try to ensure no individuals or roles are marginalised.

Logistics of learning cycles

The logistics of learning cycles will be specific to each organisation. However, here is a checklist of things to consider:

Timing
Mini 'lessons learned' sessions can be built into regular monthly or quarterly meetings. However, a substantial session usually happens just before finalising the design of an intervention, and then either annually or at the mid- and endpoint of an intervention.

Facilitation
A skilled facilitator is critical to the success of a learning event. Internal facilitators know the organisation, and often the individuals concerned, well; they may be easier to schedule and are usually more economical to use. However, it may be hard for them to distance themselves from the event and challenge organisational assumptions. If you have relationships with other organisations, you could offer the 'loan' of a facilitator for the loan of one of theirs. In a large organisation, it may be possible to 'borrow' a facilitator from another programme or site.

Venue/space
Ideally, a learning event should happen away from the organisation's offices to help people focus. It also indicates how seriously the organisation takes the activity. A learning event could even be included as part of a larger activity such as strategic planning.

Face-to-face or 'remote' events
Where staff are geographically scattered, it may be worth considering a 'virtual' event using software such as Adobe Connect.

18. http://oqi.wisc.edu/resourcelibrary/uploads/resources/Consensus%20Workshop%20-%20Description.pdf
19. http://www.mind-mapping.co.uk/make-mind-map.htm
20. Mok, A. (2008). The focused conversation method. Available: http://fnsingapore.blogspot.co.uk/2008/08/focused-conversation-method.html. Last accessed 23 Feb 2013



Next steps in the learning cycle: inspiring action

One way to categorise organisational learning is to identify the three action 'levels' at which it occurs.21
1. Single loop learning, which involves checking whether we are 'doing things the right way'. It involves watching for and correcting deviations from the organisational norm.
2. Double loop learning, which involves changing the rules by asking the question 'are we doing the right things?' This helps us to reframe our thinking and fosters innovation and creativity.
3. Triple loop learning, which involves questioning the entire rationale and values of an organisation.

It is easy to focus on the specific, small actions of single loop learning at the expense of double and triple loop learning. The best way of ensuring that your actions are designed at the right level is to plan this into the learning event at the outset.

Kotter (2002)22 describes eight steps which help to ensure that actions lead to the desired changes in organisations:
1. Create urgency For example, make objectives real and relevant. This inspires people to act.
2. Build the guiding team Get the right people in place with the emotional commitment and mix of skills and levels to help lead the change process.
3. Get the vision right Get the team to establish a simple vision and clear strategy based on the findings and recommendations of the evaluation.
4. Communicate for buy-in Involve as many people as you possibly can and communicate your vision and strategies often and in a simple way. Have a clear message and make technology work for you.
5. Empower action Put in place a structure to facilitate change. Try to identify pockets of resistance and remove barriers quickly. Allow for constructive feedback and support from leaders. Recognise and reward those who make change happen.
6. Create short-term wins Complete current stages before starting new ones. Reward those who help you to meet your objectives.
7. Don't let up Foster and encourage determination and persistence. Analyse every achievement and ask yourself what went right and what needs improving.
8. Make change stick Tell success stories of change within your organisation. Reinforce the value of successful change via recruitment, promotion and new change leaders. Weave change into the culture of the organisation.

Achieving these eight steps helps to create a true, virtuous learning cycle, where the changes achieved through learning and then taking action help to create an atmosphere where learning becomes rewarding in its own right.

21. Ramalingam, B. (2010). Organisational learning for aid, and learning aid organisations. Available: http://www.capacity.org/capacity/opencms/en/topics/learning/organisational-learning-for-aid-and-learning-aid-organisations.html. Last accessed 23 Feb 2013
22. Kotter, J.P. and Cohen, D.S. (2002) The Heart of Change: Real-life Stories of How People Change Their Organizations. Harvard Business School Press, Boston, Massachusetts. Summarised in Kusters, C., van Vught, S., Wigboldus, S., Williams, B. & Woodhill, J. (2011). Making evaluations matter: A practical guide for evaluators. Centre for Development Innovation, Wageningen University & Research centre, Wageningen, The Netherlands. www.cdi.wur.nl



Case study 12

Using self-evaluation for team learning

In 2010, IDS's British Library for Development Studies (BLDS) and the Information Training and Outreach Centre for Africa (ITOCA) worked together to repurpose an information literacy 'training of trainers' course. This course aimed to train participants to use a suite of Research4Life (R4L) database tools and help them pass on this knowledge to others within their institutes. BLDS and ITOCA wanted to introduce into the course a more participatory method of training and more generic information literacy and training skills. The objective was to increase the breadth of skills course participants could pass on, and their capacity to train others to use R4L products.

From the outset, both IDS and ITOCA were keen to evaluate the repurposed course. The team designed the new course to capture skills, knowledge and behaviour information from participants before and after they took part in it. They also ran the new and original courses in parallel so they could compare outcomes.

Many months after the repurposed course was launched, the team employed a facilitator to guide them through the participatory evaluation method FSE (see M&E in Action). They chose this method rather than commissioning an external evaluator because it enabled them to build the evaluation skills and experience of internal staff members. There were also cost considerations.

The FSE team was made up of six senior staff members from ITOCA and IDS, and it was involved in all stages of the evaluation, from deciding the evaluation questions to collecting the data, and from data analysis to reporting the findings. The process required a considerable time commitment from all team members and some commitment of funds to commission additional data collection.

The project under scrutiny was well suited to FSE as it was tightly defined and had clear and tangible outcomes, allowing the practitioner-team to build a straightforward theory-based evaluation based on their previously articulated assumptions about how the repurposed course would effect change. The evaluation was mixed method and drew together quantitative and qualitative data, including existing participant satisfaction and survey data, a series of before and after course evaluations, administrative data and programme documentation, and newly collected data from participants and other stakeholders using an online survey, questionnaires and interviews.

The result was a comprehensive evaluation that included some clear conclusions and tangible recommendations. From the facilitator's point of view, the quality and legitimacy of the evaluation findings benefited greatly from the involvement of senior staff members in the inception and analysis workshops. From the practitioners' point of view, most team members reported that they learned about the practicalities of doing evaluation as well as the strengths and weaknesses of their own project.



Flowchart 7: Learning from M&E

START HERE
1. Consider why you want to identify lessons learned, e.g. to improve future training interventions; for reporting to primary stakeholders, partners or funders; to deal with a crisis; to support fundraising; or for another purpose.
2. Determine the information you plan to review.
3. Determine who should attend, e.g. all stakeholders or representatives of each stakeholder group.
4. Determine what sort of activity you will hold, e.g. workshops.
5. Consider your tools for learning: tools for convergent thinking (e.g. mind-mapping) and tools for divergent thinking (e.g. the 4 Rs method).
6. Disseminate, store, but above all USE your learning.


Stage 9: Communicating findings

M&E in action

? Key questions: Communicating your findings
• Who needs to know your M&E findings?
• What is the best way to reach each audience?
• Have you determined your key messages?
• Have you used appropriate visualisation tools to communicate your findings?

To learn from the M&E of your information literacy intervention you need to communicate your findings effectively. There are two main ways of doing this:
• Verbally
• In writing

People are much more likely to engage with your communications, and so learn from them, if you make them compelling. Verbal communication at its most basic may comprise a face-to-face presentation. You might also consider producing a video presentation, a webinar or a podcast. With these, you can be more creative, address people who may not be able to attend a one-off event, and provide an engaging, long-lasting resource.

A written communication at its most basic is a full report. This, in fact, tends to be the default position for M&E communication. Be wary of the chunky report in English, though. It may fulfil a funder's evaluation demands but too often it does little more than languish in a drawer. Other, more effective written communications may include short summaries, photo essays or bullet-pointed lists.

Your stakeholder analysis (see Stage 3: Identifying challenges) will have identified the different audiences you need to address in your M&E communication, as well as provided information to help you understand their needs. Consider the different needs of your audiences when deciding how you want to communicate with them. It is often worth planning more than one communication to effectively meet the needs of different audiences. For example, an evaluation of a training intervention to increase understanding of politics among Welsh young people with learning disabilities resulted in one 'standard' evaluation report and a two-page 'easy-read' document aimed at the young people who participated in the training.

Determining your key messages

The key messages of your M&E will be an expression of the learning you have gained from the patterns that emerge from your data analysis. Most are likely to relate to whether or not you have fulfilled the objectives of your training intervention. Other patterns, unrelated to your objectives but nevertheless of significance to future training interventions, may emerge too, so look out for these. This is considered in more detail in Stage 8: Learning from M&E.

In your communications, try to articulate your key messages as simply as possible at first, preferably in one sentence. You can elaborate on them later in your report, presentation etc. Try also to keep them to a manageable number by prioritising them – and when listing them in your communication, list them in order of their significance for learning.

Conveying your key messages

Regardless of whether you choose to communicate via a presentation or report or some other means, consider whether your key messages can be represented in a more compelling way using data visualisation tools.

Visualising quantitative data

There is a wide range of default graph styles available in Microsoft Excel which you can customise so that they reflect both your in-house style and the information you are trying to convey (a minimal charting sketch follows at the end of this section). If you are comfortable with basic graphs, you could also consider ways of making your graphs dynamic or interactive, either by building them up step by step within a Microsoft PowerPoint presentation or by using online interactive software, e.g. Prezi or Tableau Public.

Visualising qualitative data

The standard way of reporting qualitative data is by using quotes. This is always likely to be necessary, but other methods can be more compelling. For example, you could consider mind-maps, taxonomies/flowcharts, concept maps and concept trees to illustrate the main concepts you have identified and to communicate your story.
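As a starting point for the customised graphs mentioned under 'Visualising quantitative data' above, here is a minimal sketch (our addition) that charts before-and-after averages from Table 5 as a grouped bar chart using Python's matplotlib:

```python
# A minimal sketch of a before/after bar chart for distance travelled,
# using four attributes and their scores from Table 5.
import matplotlib.pyplot as plt

attributes = ["Mind-maps", "Boolean phrases", "Facilitating", "Training"]
before = [2.35, 3.15, 2.55, 2.90]   # Column B: retrospective pre-training scores
after  = [4.00, 4.25, 4.20, 4.37]   # Column C: post-training scores

x = range(len(attributes))
width = 0.4
plt.bar([i - width / 2 for i in x], before, width, label="Before training")
plt.bar([i + width / 2 for i in x], after, width, label="After training")
plt.xticks(list(x), attributes, rotation=20)
plt.ylabel("Self-assessed skill (1 to 5)")
plt.title("Distance travelled on selected attributes")
plt.legend()
plt.tight_layout()
plt.show()
```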



Figure 8: An example of a concept map – 'Relevance in the electronic environment'. The map links concepts such as electronic publishing (electronic journals), traditional publishing, exploration, verification of source origin and the use of electronic sources through labelled relationships (e.g. 'is characterised by', 'is influenced by', 'is supported by'), along with positive attributes (speed, recency, accessibility) and negative attributes (demandingness, time, vagueness).

Reference: http://informationr.net/ir/15-4/colis719.html
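Concept maps like Figure 8 can also be drafted programmatically. Here is a minimal sketch (our addition) using the Python graphviz package, with a few of Figure 8's concepts as example nodes:

```python
# A minimal sketch of a concept map using the graphviz package
# (pip install graphviz; the Graphviz binaries must also be installed).
# The nodes and edge labels echo a few elements of Figure 8.
import graphviz

g = graphviz.Digraph("concept_map")
g.edge("Relevance in electronic environment", "Exploration",
       label="is characterised by")
g.edge("Relevance in electronic environment", "Electronic publishing",
       label="is influenced by")
g.edge("Electronic publishing", "Traditional publishing",
       label="co-exists with")
g.edge("Relevance in electronic environment", "Verification of source origin",
       label="is supported by")

g.render("concept_map", format="png", cleanup=True)  # writes concept_map.png
```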



The importance of transparency

Finally, you should note that the presentation of your findings, from how you analysed your data to any relevant conclusions, should be as transparent as possible. Presentation is particularly important in cases where there is a line of accountability from trainers to host organisations or sponsors, as there are likely to be expectations that need to be fulfilled. Host organisations and sponsors will want some sort of demonstration that objectives are being met and their organisational aims addressed.

Case study 13

Using video to communicate to our donor

The IDS information literacy programme reports to its donor, DFID, as a major part of its work. Examples of outputs, outcomes and impact are provided in a number of ways, but a written report, supported by a face-to-face presentation, is usual. However, in October 2012, when staff were unable to attend a quarterly review meeting, an offer was made to produce a video presentation.

Information we wanted to report on included progress against project milestones, the number of people reached in the programme, and evidence of impact through statements and statistics. We employed a technique used by illustrators when they want to demonstrate concepts using images. This saw us making use of a range of hand-drawn images as a starting template and then shifting the camera to hand-drawn images, text and statistics as the report was related off-camera. Between these two elements, a narrator was only briefly introduced to camera.

This technique is particularly helpful for people who are not entirely comfortable speaking to camera, and it worked excellently for this report. A large whiteboard was used to deliver the presentation, with the camera moving physically three times during shooting, giving us the opportunity to shoot the report in several short films.

The video report was to be 10 minutes long, but more video than this was actually taken, so the final film was cut using Microsoft's Live Movie Maker. We found it an easy tool to master and it enabled us to splice (join) sections of the film together once we had decided which sections we could cut. Taking out the awkward silences also meant we could produce a more professional product.

The final version of the film was presented to DFID – and its success can be gauged by the fact that they have asked IDS to continue reporting in this way. We also showed the film at an international advisory group meeting a few months later.



Flowchart 9: Communicating findings

START HERE
1. Consider who you want to communicate your M&E evidence to: your trainees; the organisations in which the trainees are located; the sponsors or funders of these organisations; and others.
2. Consider their differing needs, referring back to your stakeholder analysis.
3. Determine appropriate key messages.
4. Consider the different ways in which you can present your evidence. Be creative, e.g. consider oral and graphic communications as well as written, and consider shorter and longer forms.
5. Determine appropriate communications.
TOOLKIT APPENDICES

This toolkit can only touch the surface of M&E practice. Find out more by exploring the following definitions, tools and resources.

Contents
Appendix 1: Glossary of assessment and evaluation methods and tools
Appendix 2: M&E tools by stage
Appendix 3: Suggested criteria for describing good practice in monitoring and evaluation
Appendix 4: References



Appendix 1: Glossary of assessment and evaluation methods and tools

Bulletin boards, online discussion forums, blogs and wikis
These enable discussion and the sharing of views as well as peer-to-peer learning. They are particularly useful when trainees are not physically in the same location, or when they are on staggered timetables.

Delphi process
The Delphi technique is a quantitative option aimed at generating consensus. It solicits opinions from groups in an iterative process of answering questions. After each round the responses are summarised and redistributed for discussion in the next round. Through a process of convergence involving the identification of common trends and inspection of outliers, a consensus is reached. In its original form, question rounds are administered in writing, for instance distributed by email. The technique has been adapted for use in face-to-face groups with the heart of the process remaining intact, allowing individuals time to reflect and an equal opportunity to contribute.

Diagnostic tests (post-diagnostics)
The form of such tests is likely to be similar to pre-diagnostic tests (indeed, the same test might even be repeated). What is being assessed is the level of knowledge, understanding and skills following a training intervention, and the amount of progress achieved by the learners as a result of the intervention. These assessments may take many forms, including questionnaires, demonstrations of ability to use particular resources and focused exercises such as scavenger hunts.

Diagnostic tests (pre-diagnostic)
These forms of assessment provide instructors with information about trainees' prior knowledge and understanding before beginning a learning activity. They indicate the strengths and weaknesses of trainees and enable training to focus on and be adapted to genuine needs. They also provide a baseline for determining what new knowledge, understanding and skills have been developed during the course of the training. They may typically take the form of questionnaires (including multiple-choice questionnaires), surveys or interviews. Increasingly these are administered electronically using web-based tools such as SurveyMonkey (http://www.surveymonkey.com).

Distance travelled
The distance travelled is a valuable tool for measuring the outcomes of interventions. It helps the trainer measure the progress a trainee has made towards achieving the learning outcome. This is achieved by asking the trainee, in a post-training questionnaire, to retrospectively reassess what their competency levels were prior to attending the training. They are then, in that same questionnaire, asked to rate their skills after training. The difference between these two figures forms their distance travelled. The distance travelled is based on the premise that we are better able to communicate what was 'unknown' when we are shown the gaps in our prior knowledge; so when we reassess our skills after training, we are better positioned to realise what we did not know.

Drama, puppetry or role-play
Learning exercises which allow participants to experience a particular situation through the presentation and/or enacting of different viewpoints and perspectives.

Drawings
Drawings tend not to be interpretable in their own right, but can be a useful tool to elicit oral stories.

Evaluation wheel (also known as Spider tool)
An evaluation wheel is a tool to measure the degree of usefulness, satisfaction, or achievement of an outcome. It can be used in a group process (such as the evaluation of a workshop) or completed individually. The wheel is divided into segments (usually six or eight), and each segment of the circle is labelled with one aspect of the service to be evaluated. Participants decide to what degree they are satisfied with each aspect being evaluated, or to what degree the outcome has been met, by drawing a line (like a 'spoke' of the wheel) in each segment from the centre towards the outer rim of the circle. The closer the line is to the rim of the circle, the more satisfied the service users are with that aspect, or the greater the degree to which the outcome has been achieved.



Focus groups
Focus group sessions and participative workshops (including 'consensus workshops') involve bringing together a small group of trainees or trainers or both for a facilitated discussion. They are used to enable reflection on the desired or completed teaching and learning. Key questions are determined in advance. The group nature encourages ideas to flow. These can be particularly useful in the context of needs assessment, as they can provide an insight into prospective trainees' perceptions and attitudes. Focus groups are highly dependent on the abilities of facilitators. Data can be captured using video, recordings or flip charts, or a combination of these.

Graphical facilitation
Graphic facilitation is a type of group facilitation that uses visuals and graphic images along with text to document a group's comments.

Instant response techniques
These are used at the end of courses, where participants are asked to address a small number of simple questions, typically to ascertain their broad view of the training received, their general understanding of what they have covered, the extent to which they might apply what they have learnt, etc. Electronic tools such as iClickr can be used to allow one-click responses to each question, which are collated and projected almost instantly on a screen to provide a graphical representation of the overall view from participants.

Interviews
The use of a series of well thought-out and structured questions (which might incorporate tests) to elicit a view of either (i) the extent of prospective trainees' knowledge and understanding prior to a training initiative, as part of the assessment of their needs, or (ii) what has been learnt through a training initiative, and the development of trainees' understanding following it. Interviews can be undertaken face to face or virtually (using telephone or chat), and they may involve speaking to key informants and stakeholders as well as trainees.

Logframes
A logframe is 'a tool to help designers of projects think logically about what the project is trying to achieve (purpose), what things the project needs to do to bring that about (outputs) and what needs to be done to produce these outputs (activities)' (DFID). A logframe provides a simple summary of the project strategy and helps to plan and monitor a project's outputs and outcomes. In information literacy, a logframe can be used to link the intervention's objectives with regional and/or organisational objectives.

Keypad technology (live polling)
A live polling system based on mobile technology. It offers the same functionality as instant response systems (such as iClickr) but uses mobile phones to poll trainees and gather live feedback. It can also be used in student assessment. Pricing plans vary, with some provision for free usage. These services offer a relatively inexpensive alternative to instant response systems. Examples include: http://www.polleverywhere.com/ and http://understoodit.com/

Mind map
A mind map is a visualisation tool used to outline information. It is usually created around a central theme, which could be a word or text, with associated ideas, words and concepts radiating from the central node. The concept is like a tree, with the central theme/word representing the trunk and branches emanating from the central hub. The branches and sub-branches represent the words, ideas or even tasks that are related to the central key word or idea. Mind maps are a flexible tool to map tasks, understand situations and map problems and solutions. They can be hand-drawn and include images to denote words and ideas instead of text. Other terms for this diagramming tool are 'spider diagrams', 'spidergrams', 'spidergraphs', 'webs', 'mind webs', 'webbing' and 'idea sun bursting'.

Triangulation (in social research; not to be confused with the mathematical concept)
The concept of triangulation encourages researchers to use more than one source to support their findings. The simple argument is that if numerous sources back your claim, then that claim is a more robust one. Triangulation is particularly effective when the broad bird's-eye approach of quantitative research is combined with more detailed qualitative research to give a more thorough account. An example of this is when a researcher compares their survey results with a case study or interview; when the views expressed from these different sources are consistent, this gives the researcher greater confidence.



Most significant change (MSC)
The most significant change (MSC) technique is a method for monitoring and evaluating complex interventions. Its main focus is identifying the improvements resulting from the training activity or service provision. The technique consists of 10 steps through which stakeholders search for significant outcomes and then deliberate on the value of these outcomes in a systematic and transparent manner. It is a highly participatory approach and has at its core the generation, analysis and use of stories. The technique is also known as 'monitoring without indicators' and 'the story approach'.

Needs assessment
Determining the current competency levels (including knowledge, skills, attitudes and values) of your trainees is known as understanding their needs (or needs assessment). Before you can design a training intervention it is important to understand what is currently known and to identify unknowns. Methods for assessing needs could include interviews, focused discussions and needs analysis surveys or diagnostic tools. It is a necessary step to understanding your training cohort and testing your assumptions about their training needs.

Outcome Mapping (OM)
Outcome mapping was developed by the International Development Research Centre (IDRC) in the late 1990s and has been championed by the Overseas Development Institute. OM is a methodology for planning, monitoring and evaluation. It focuses on outcomes rather than outputs and acknowledges the limits of the training intervention's influence. OM is people-focused and as such sees outcomes as changes in people's attitudes and behaviours as a result of engagement with the intervention. It can be used at the programme, project and organisational levels.

Outcome Orientation
Outcome Orientation is an approach to planning, monitoring, evaluation and learning. It focuses on outcomes in terms of the changes in behaviour we would 'expect, like or love to see'. This method is used to think about the kinds of changes you are trying to achieve through your information literacy activities as well as to identify observable changes in behaviour. Activities and interventions should be designed by asking 'What will be different and for whom?' before asking 'What am I going to do?'

Observations
A technique involving the observation by trainers of trainees' progress throughout the training intervention as an integral part of the teaching/learning process. This requires the systematic gathering and analysing of evidence which enables trainers to reach a well-founded view about training outcomes and needs. 'Promenading', i.e. where the trainer tours the learning environment, provides a quick way to identify where individuals are experiencing problems.

Quizzes
A variant of questionnaires, typically used in classroom situations, allowing for immediate responses to questions posed in an informal and engaging way. Quizzes tend to rely on high levels of interactivity. Quick quizzes help learners to stay focused and engaged.

Photo-stories
Photo-stories are an engaging and creative way to visualise narrative and to communicate what has been learned from, or report on, a training intervention.

Randomised control trials (RCTs)
An RCT is an example of a quantitative method in which two groups are created to detect the efficacy of a given intervention. For instance, in a training intervention two groups would be formed, one (the control group) receiving no intervention at all and the other receiving the new, or alternative, intervention. The data gathered from both groups would be compared at the end to detect whether the intervention was effective. RCTs offer a robust approach to testing the effectiveness of a training intervention but require large numbers of participants in the intervention and control groups to be able to detect a difference and to overcome challenges with 'matching' the sample.

Reflective tools, including diaries/journals
These allow learners, over a period of time, to record observations or impressions of their progress during the course of training, to self-evaluate this and to draw conclusions that will enable them to improve their performance. Reflective tools thus act as a means for individuals to assess what and how well they have learnt as a result of training initiatives and how they might apply this, following the initiatives, in their respective environments. Reflective tools capture changes in thinking and attitudes. They may include daily timelines, blogs (and micro-blogs, such as Twitter), email diaries or other forms of journaling.



Spider tool
See Evaluation wheel and Mind map.

Storytelling
This allows training recipients to provide a narrative of their experiences in applying newly acquired knowledge and skills. It is an opportunity for individuals to set out, in their own words, what it is that they do as a result of having received training. It can be a useful way of gauging their understanding and the impact of the initiative. It can also serve as a reflective tool.

Surveys/questionnaires
Surveys and questionnaires may form an integral part of diagnostic testing, but they may also be used as a means of obtaining trainees' immediate feedback and opinions on training. These may range from extensive, in-depth surveys which have been developed and validated elsewhere to something as simple as learners placing comments or ticks against a smiley face, a non-committal face or a negative face on a sheet of paper when leaving the training venue. Surveys may be conducted face to face, online (e.g. using a tool such as SurveyMonkey), via SMS, or by telephone.

Theory of Change (TOC)
A TOC is a tool that can help you clearly articulate the long-term changes sought by your training intervention at an organisational, programme or project level. TOCs are normally articulated in diagrammatic form, although they are flexible in style, format and content. They show how change happens in relation to focus themes and demonstrate the pathways the organisation proposes to take to address these themes.

T-tests
A t-test is a statistical test that allows you to measure whether data gathered is significant. It can be used to validate scores in test surveys or questionnaires where trainees have self-reported changes in their competency levels, in particular when validating pre-training and post-training assessments. You can perform a t-test using spreadsheet software such as Microsoft Excel (see Guidance Box 10).

Usage statistics and web logs
Quantitative data that demonstrate the take-up or usage by training recipients, following training initiatives, of particular resources such as online catalogues, bibliographic services and indexes. They can also provide data on the use of specific commands in retrieval systems and the material that is used or downloaded.



Appendix 2: M&E tools by stage

IL Standards, Models and Frameworks
• Australian and New Zealand Information Literacy Framework, principles, standards and practice, A Bundy http://www.library.unisa.edu.au/learn/infolit/Infolit-2nd-edition.pdf (Last accessed 23 Feb 2013)
• CAUL – International resources in information literacy http://www.caul.edu.au/caul-programs/information-literacy/information-literacy-resources/international-resources (Last accessed 23 Feb 2013)
• Information Literacy Standards (Various) – Association of College and Research Libraries: http://www.ala.org/acrl/standards (Last accessed 23 Feb 2013)
• IL Standards, Models and Frameworks – Sheila Webber's Blog, Information Literacy Weblog http://information-literacy.blogspot.co.uk/ (Last accessed 23 Feb 2013)
• LAMP – Literacy Assessment and Monitoring Programme http://www.uis.unesco.org/literacy/Pages/lamp-literacy-assessment.aspx (Last accessed 23 Feb 2013)
• OECD's PISA (Programme for International Student Assessment) http://www.oecd.org/edu/school/programmeforinternationalstudentassessmentpisa/ (Last accessed 23 Feb 2013)
• Project SAILS – https://www.projectsails.org/ (Last accessed 23 Feb 2013)
• The SCONUL Seven Pillars of Information Literacy (Core Model for Higher Education) http://www.sconul.ac.uk/sites/default/files/documents/coremodel.pdf (Last accessed 23 Feb 2013)
• 'The big blue – information skills for students' – Final report, JISC, http://www.jisc.ac.uk/media/documents/programmes/jos/bigbluefinalreport.pdf (Last accessed 23 Feb 2013)
• 'Towards information literacy indicators', a conceptual framework, R Catts and J Lau. http://www.ifla.org/files/assets/information-literacy/publications/towards-information-literacy_2008-en.pdf (Last accessed 23 Feb 2013)

M&E in Action
• Appreciative inquiry: http://blogs.ubc.ca/evaluation/files/2009/02/appreciative20inquiry.pdf (Last accessed 23 Feb 2013)
• Contribution analysis: https://communities.usaidallnet.gov/fa/system/files/Contribution+Analysis+-+A+New+Approach+to+Evaluation+in+International+Development.PDF (Last accessed 23 Feb 2013)
• Outcome harvesting: http://www.outcomemapping.ca/download.php?file=/resource/files/Outome%20Harvesting%20Brief%20FINAL%202012-05-2-1.pdf (Last accessed 23 Feb 2013)
• Outcome mapping: http://www.outcomemapping.ca/resource/resource.php?id=342 and http://vimeo.com/38146769 (Last accessed 23 Feb 2013)
• Participatory action research: http://www.web.ca/robrien/papers/arfinal.html (Last accessed 23 Feb 2013)
• Participatory evaluation: http://www.gsdrc.org/docs/open/EIRS4.pdf (Last accessed 23 Feb 2013)
• Results based evaluation: http://www.oecd.org/derec/worldbankgroup/35281194.pdf (Last accessed 26 July 2013)
• Rights-based evaluation/equity based evaluation: http://mande.co.uk/ (Last accessed 27 July 2013)
• SROI (social return on investment): http://www.thesroinetwork.org/publications (Last accessed 23 Feb 2013)



Stage 1: Assessing needs
• Developing effective questionnaires and surveys: Roddy, K. (2006). Creating effective questionnaires and surveys and analysing the data. Available: http://www2.lse.ac.uk/library/versions/Creating%20effective%20questionnaires%20and%20surveys.pdf (Last accessed 23 Feb 2013)
• Grounded theory: http://www.methods.manchester.ac.uk/events/whatis/gt.pdf and http://www.groundedtheoryonline.com/what-is-grounded-theory (Last accessed 23 Feb 2013)
• IDRC resources on Outcome mapping: http://www.idrc.ca/EN/Resources/Publications/Pages/ArticleDetails.aspx?PublicationID=1004 (Last accessed 23 Feb 2013)
• iSkills: http://www.ets.org/iskills/about (Last accessed 23 Feb 2013)
• Outcome mapping community http://www.outcomemapping.ca (Last accessed 23 Feb 2013)
• Project SAILS: https://www.projectsails.org/ (Last accessed 23 Feb 2013)
• Stakeholder Analysis: ODI. (Jan, 2009). Planning Tools: Stakeholder Analysis. Available: http://www.odi.org.uk/sites/odi.org.uk/files/odi-assets/publications-opinion-files/6459.pdf (Last accessed 05 Feb 2013)
• SurveyMonkey. A free online survey tool useful for needs analysis surveys (www.surveymonkey.com)

Stage 2: Programme strategy and objectives
Logical Frameworks
• A guide to logframes produced by DFID www.dfid.gov.uk/Documents/publications1/how-to-guid-rev-log-fmwk.pdf (Last accessed 23 Feb 2013)
• BOND. (2003). Logical Framework Analysis. http://www.gdrc.org/ngo/logical-fa.pdf (Last accessed 23 Feb 2013)

Theory of Change
• An example of a Theory of Change for information literacy training http://www.shef.ac.uk/content/1/c6/11/08/47/CILASS_ToC.pdf (Last accessed 23 Feb 2013)
• Another set of Theory of Change resources http://onthinktanks.org/2011/05/18/theories-of-change/ (Last accessed 23 Feb 2013)
• A set of resources that discuss Theories of Change http://www.researchtoaction.org/2011/05/theory-of-change-useful-resources (Last accessed 23 Feb 2013)
• Project "Superwoman" Theory of Change: http://www.theoryofchange.org/what-is-theory-of-change/how-does-theory-of-change-work/example/#3 and Theoryofchange.org (www.theoryofchange.org) (Last accessed 23 Feb 2013)
• Theory of Change, What's it all about? http://www.capacity.org/capacity/export/sites/capacity/documents/topic-readings/ONTRAC-51-Theory-of-Change.pdf (Last accessed 23 Feb 2013)

Stage 3: Identifying challenges
• Information from the UK Government national archives on carrying out stakeholder analyses, http://webarchive.nationalarchives.gov.uk/20120118164404/hcai.dh.gov.uk/files/2011/03/Presentation_Stakeholder_engagement_FINAL_071210.pdf (Last accessed 23 Feb 2013)
• Information from ODI on stakeholder analyses www.odi.org.uk/rapid/tools/toolkits/Policy_Impact/Stakeholder_analysis.html (Last accessed 23 Feb 2013)
• Problem Trees http://www.odi.org.uk/publications/5258-problem-tree-analysis (Last accessed 23 Feb 2013)
• Problem and Objective tree analysis (ODI) http://www.cpc.unc.edu/measure/training/materials/basic-me-concepts-portuguese/problem_tree.pdf (Last accessed 23 Feb 2013)



Stage 4: Designing your M&E

Qualitative methods
• Bulletin boards, online discussion forums, blogs, wikis: http://www.qualitativemind.com/using-bulletin-boards/ (Last accessed 23 Feb 2013)
• Consensus workshops: http://oqi.wisc.edu/resourcelibrary/uploads/resources/Consensus%20Workshop%20-%20Description.pdf (Last accessed 23 Feb 2013)
• Delphi process: http://www.uwex.edu/ces/pdande/resources/pdf/Tipsheet4.pdf and http://pareonline.net/pdf/v12n10.pdf and http://www.aral.com.au/resources/delphi.html (Last accessed 23 Feb 2013)
• Drama, puppetry: http://www.hapinternational.org/pool/files/wvuk-puppets-for-accountability.pdf (Last accessed 23 Feb 2013)
• Drawings: http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Focus groups: http://www.infed.org/research/focus_groups.htm or http://sru.soc.surrey.ac.uk/SRU19.html (Last accessed 23 Feb 2013)
• Graphical facilitation: http://aea365.org/blog/?tag=graphic-facilitation and http://www.blog.biggerpicture.dk/learn-graphic-facilitation-snippit-learning/
• Interviews (face to face): http://www.fao.org/docrep/w3241e/w3241e06.htm (Last accessed 23 Feb 2013)
• Interviews (key informant): http://pdf.usaid.gov/pdf_docs/PNABS541.pdf (Last accessed 23 Feb 2013)
• Interviews (virtual): http://www.qualitative-research.net/index.php/fqs/article/view/175/391 and http://blog.vision2lead.com/e-interviews-2/about-e-interviews/ (Last accessed 23 Feb 2013)
• Mind-mapping: http://portals.wi.wur.nl/msp/?page=1232 and http://ncrcrd.msu.edu/uploads/files/133/Mapping%20Impact%20of%20Youth%20on%20Com%20Dev%2012-3-10.pdf (Last accessed 23 Feb 2013)
• Most significant change (MSC): http://www.mande.co.uk/docs/MSCGuide.pdf (Last accessed 23 Feb 2013)
• Observations: http://learningstore.uwex.edu/assets/pdfs/G3658-5.PDF and http://transition.usaid.gov/policy/evalweb/documents/TIPS-UsingDirectObservationTechniques.pdf and http://www.taklin.com and http://www.noldus.com/human-behavior-research (Last accessed 23 Feb 2013)
• Participant observation (= ethnography): http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Photography: http://www.socialsciences.manchester.ac.uk/morgancentre/realities/toolkits/participatory-visual/17-toolkit-participatory-visual-methods.pdf and http://www.photovoice.org/html/ppforadvocacy/ppforadvocacy.pdf and http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Photostories: http://kdid.org/sites/kdid/files/CBAA%20PHOTOSTORY.pdf (Last accessed 23 Feb 2013)
• Qualitative research: http://resources.civilservice.gov.uk/wp-content/uploads/2011/09/a_quality_framework_tcm6-7314.pdf (Last accessed 23 Feb 2013)
• Reflective tools: Daily timelines: http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Reflective tools: Diaries/journals: http://sru.soc.surrey.ac.uk/SRU2.html and http://punchcut.com/perspectives/uncovering-context-mobile-diary-studies (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (blog): http://www.dubstudios.com/technology/using-blog-tools-for-research-and-innovation/ and http://www.dubstudios.com/mr/digital-diaries-as-an-online-qual-methodology/ (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (email): http://sru.soc.surrey.ac.uk/SRU21.html (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (virtual): https://www.scss.tcd.ie/Gavin.Doherty/mood-charting-preprint.pdf and http://www.jopm.org/columns/innovations/2011/09/26/just-text-me-using-sms-technology-for-collaborative-patient-mood-charting/ (Last accessed 23 Feb 2013)
• Spider tool (see also Evaluation Wheel): http://www.ungei.org/resources/files/SCS_Spider_Tool_Final_2.pdf and http://www.crin.org/docs/resources/publications/SCS_Spider_Tool_Facilitators_Guide_3.pdf (Last accessed 23 Feb 2013)



• Storytelling: http://www.heacademy.ac.uk/assets/documents/resources/resourcedatabase/id473_valuation_through_storytelling.pdf (Last accessed 23 Feb 2013)
• Video: http://insightshare.org/watch/video/insights-pv-short and http://www.socialsciences.manchester.ac.uk/morgancentre/realities/toolkits/participant-produced-video/2008-07-toolkit-camcorders.pdf (Last accessed 23 Feb 2013)
• Text analysis: linguistic inquiry and word count: http://aea365.org/blog/?cat=231 (Last accessed 23 Feb 2013)

Quantitative methods
• Diagnostic tests (pre & post): http://infolitglobal.net/directory/en/browse/category/products/assessment_tools (Last accessed 23 Feb 2013)
• Evaluation wheel (also known as Spider Tool): http://portals.wi.wur.nl/msp/?page=1222 (Last accessed 23 Feb 2013)
• Instant response techniques/keypad technology/live polling: http://ncdd.org/rc/item/3601 and http://www.educause.edu/ero/article/clickers-and-cats-using-learner-response-systems-formative-assessments-classroom (Last accessed 23 Feb 2013)
• Mobile data collection: http://understoodit.com/ (Last accessed 23 Feb 2013)
• Quizzes: http://infoskills.uelconnect.org.uk/ (Last accessed 26 July 2013)
• Randomised control trials: http://www.entwicklung.at/uploads/media/Designing_Impact_Evaluation_Robert_Chambers.pdf (Last accessed 23 Feb 2013)
• Surveys (CAPI): http://blogs.worldbank.org/impactevaluations/paper-or-plastic-part-ii-approaching-the-survey-revolution-with-caution (Last accessed 23 Feb 2013)
• Surveys (general): http://learningstore.uwex.edu/assets/pdfs/G3658-10.PDF (Last accessed 23 Feb 2013)
• Surveys (online): https://www.projectsails.org/ (Last accessed 23 Feb 2013)
• Usage statistics/weblogs: http://www.amicalnet.org/meetings/2012/presentations/evaluating-information-literacy-conjunction-overall-services-usage (Last accessed 23 Feb 2013)

Hybrid methods
• Outcome stars (branded – see also Spider Tool): http://www.outcomesstar.org.uk/ (Last accessed 23 Feb 2013)
• Sensemaking: http://www.globalgiving.org/story-tools/ and http://chewychunks.files.wordpress.com/2012/05/storytelling-realbook-may-23-2012.pdf and http://www.ssireview.org/articles/entry/amplifying_local_voices1/ (Last accessed 23 Feb 2013)
• Social network analysis: http://www.adb.org/Documents/Information/Knowledge-Solutions/Social-Network-Analysis.pdf and http://mande.co.uk/2008/lists/social-network-analysis-and-evaluation-a-list/ and http://www.kstoolkit.org/Social+Network+Analysis and http://netmap.wordpress.com/about/ (Last accessed 23 Feb 2013)

Additional resources
• Mixed methods: http://www.gsdrc.org/docs/open/EIRS4.pdf (Last accessed 17 July 2013)
• Journal of Multidisciplinary Evaluation (JMDE): http://journals.sfu.ca/jmde/index.php/jmde_1 (Last accessed 17 July 2013)

Stage 5: Establishing a baseline
• Likert Scale: http://www.surveymonkey.com/mp/likert-scale/ (Last accessed 23 Feb 2013)
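As an illustration of turning baseline Likert responses into numbers you can compare against later measurements (the scale labels, numeric coding and responses below are invented for the example, not part of the toolkit):

# A minimal sketch of coding 5-point Likert responses for a baseline.
# The labels, coding and responses are invented for illustration.
from statistics import mean, median

LIKERT = {"strongly disagree": 1, "disagree": 2, "neutral": 3,
          "agree": 4, "strongly agree": 5}

responses = ["agree", "neutral", "strongly agree", "agree",
             "disagree", "agree", "neutral", "strongly agree"]

scores = [LIKERT[r] for r in responses]
print(f"Baseline mean: {mean(scores):.2f}, median: {median(scores)}")

Recording the coded scores at baseline lets you repeat the same survey after training and test the difference, for example with the t-test sketched in Appendix 1.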



Stage 6: M&E during and immediately after training
• Information on questioning techniques: www.mindtools.com/pages/article/newTMC_88.htm and http://blog.readytomanage.com/examples-of-probing-interview-questions/ (Last accessed 23 Feb 2013)
• Information on understoodit, a method of real-time assessment http://understoodit.com/ (Last accessed 23 Feb 2013)
• Information on the focused-conversation method http://fnsingapore.blogspot.com/2008/08/focused-conversation-method.html (Last accessed 23 Feb 2013)
• Information about the iclicker classroom response system: http://www.iclicker.com/ (Last accessed 23 Feb 2013)
• Chambers, R. (2002). Ideas for Evaluation and Ending. In: Participatory Workshops: a sourcebook of 21 sets of ideas and activities. UK: Earthscan. 40-53
• Hogan, C. (2003). Basic Facilitation Toolkit. In: Practical Facilitation: A Toolkit of Techniques. UK: Kogan Page Publishers. 73-80
• Stanfield, R.B. (2000). The Art of Focused Conversation. Toronto: New Society Publishers

Stage 7: Data Analysis

Analysing quantitative data
• A short, beginner's guide to descriptive statistics, http://learningstore.uwex.edu/Assets/pdfs/G3658-06.pdf (Last accessed 23 Feb 2013)
• A thorough and free overview of descriptive and inferential statistics, http://www.statsoft.com/textbook/ (Last accessed 23 Feb 2013)
• Essential reading on descriptive and inferential statistics: Woolf, L.M. Introduction to Measurement and Statistics http://www.webster.edu/~woolflm/statwhatis.html (Last accessed 23 Feb 2013)
• Using Microsoft Excel to enter survey data and run some simple descriptive statistics http://learningstore.uwex.edu/Assets/pdfs/G3658-14.pdf (Last accessed 23 Feb 2013)
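If you prefer scripting to a spreadsheet, the same summary measures the guides above describe can be computed with Python's standard library (the test scores below are invented for illustration):

# A minimal sketch of descriptive statistics for a set of test scores.
# The scores are invented for illustration.
from statistics import mean, median, mode, stdev

test_scores = [55, 62, 48, 70, 66, 62, 58, 75, 62, 49]

print(f"n      = {len(test_scores)}")
print(f"mean   = {mean(test_scores):.1f}")    # average score
print(f"median = {median(test_scores)}")      # middle score
print(f"mode   = {mode(test_scores)}")        # most common score
print(f"stdev  = {stdev(test_scores):.1f}")   # spread around the mean
print(f"range  = {min(test_scores)}-{max(test_scores)}")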
Analysing qualitative data
• Analysing focus group data: www.utexas.edu/academic/ctl/assessment/iar/programs/report/focus-Analysis.php (Last accessed 23 Feb 2013)
• Excellent overview of how to maintain quality, transparency and credibility in qualitative data analysis: Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003). Quality in Qualitative Evaluation: A framework for assessing research evidence. UK Cabinet Office Strategy Unit
• Ragin, C., Nagel, J. and White, P. (2004). General Guidance for Developing Qualitative Research Projects & Recommendations for Designing, Evaluating, and Strengthening Qualitative Research in the Social Sciences. National Science Foundation Workshop on Scientific Foundations of Qualitative Research
• Thematic coding video tutorial: http://www.youtube.com/watch?v=B_YXR9kp1_o (Last accessed 23 Feb 2013)
• Workshop report on scientific foundations of qualitative research: http://www.nsf.gov/pubs/2004/nsf04219/nsf04219.pdf (Last accessed 23 Feb 2013)
• Video tutorial from Graham H Gibbs (2010) (from 2:20mins onwards) http://learningstore.uwex.edu/Assets/pdfs/G3658-12.pdf (Last accessed 23 Feb 2013)
• Collecting observational data: http://learningstore.uwex.edu/Assets/pdfs/G3658-05.pdf (Last accessed 23 Feb 2013)

Choosing qualitative data analysis software
• Lewins, A. & Silver, C. Choosing CAQDAS (Computer Assisted Qualitative Data Analysis Software) (2004) http://www.surrey.ac.uk/sociology/research/researchcentres/caqdas/ (Last accessed 26 July 2013)
• WeftQDA (free opensource software) http://www.pressure.to/qda/ (Last accessed 23 Feb 2013)
• What is qualitative data? http://onlineqda.hud.ac.uk/Intro_QDA/what_is_qda.php and http://hsc.uwe.ac.uk/dataanalysis/qualWhat.asp (Last accessed 23 Feb 2013)
• Data cleaning: http://www.uwex.edu/ces/pdande/resources/pdf/Tipsheet22.pdf (Last accessed 23 Feb 2013)
• Distance travelled: Roddy, K. (2011). Measuring Outcomes and Impact.



Sample-size calculators:
• http://www.surveysystem.com/sscalc.htm (Last accessed 23 Feb 2013)
• http://www.raosoft.com/samplesize.html (Last accessed 23 Feb 2013)
• http://stat.ubc.ca/~rollin/stats/ssize/index.html (Last accessed 23 Feb 2013)
• http://www.worldbank.org/oed/ipdet/modules/M_11-na.pdf (Last accessed 23 Feb 2013)
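The calculators above all implement essentially the same arithmetic; the sketch below reproduces it (Cochran's formula with a finite-population correction) so the inputs are transparent. The default values shown are common survey conventions, not toolkit recommendations.

# A minimal sketch of the sample-size formula behind the calculators above:
# Cochran's formula with a finite-population correction.
import math

def sample_size(population, margin_of_error=0.05, confidence_z=1.96, p=0.5):
    """Estimate respondents needed for a survey of `population` people.

    confidence_z=1.96 corresponds to 95% confidence; p=0.5 is the most
    conservative assumption about response variability.
    """
    n0 = (confidence_z ** 2) * p * (1 - p) / margin_of_error ** 2
    n = n0 / (1 + (n0 - 1) / population)   # finite-population correction
    return math.ceil(n)

# e.g. trainees drawn from a university of 2,000 students:
print(sample_size(2000))   # -> 323 at 95% confidence, +/-5% margin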
Stage 8: Learning from your M&E

General resources on learning from evaluations:
• A broader view of how to ensure that learning is built into evaluation design: Kusters, C., van Vught, S., Wigboldus, S., Williams, B. & Woodhill, J. (2011). Making evaluations matter: A practical guide for evaluators. Centre for Development Innovation, Wageningen University & Research centre, Wageningen, The Netherlands. www.cdi.wur.nl (Last accessed 23 Feb 2013)
• An excellent conceptual overview of organisational learning in NGOs: Britton, B. (2005). Organisational Learning in NGOs. INTRAC. http://www.intrac.org/data/files/resources/398/Praxis-Paper-3-Organisational-Learning-in-NGOs.pdf (Last accessed 23 Feb 2013)
• The barefoot guide to learning practices in organisations and social change: http://www.barefootguide.org/index.php/download/the-barefoot-guide-2/item/the-barefoot-guide-2-learning-practices-in-organisations-and-social-change (Last accessed 23 Feb 2013)
• Documenting learning: Lederach, J.P., Neufeldt, R., & Culbertson, H. (2007). Reflective Peacebuilding: A planning, learning and evaluation toolkit for documenting and communicating learning. The Joan B. Kroc Institute for International Peace Studies, University of Notre Dame & Catholic Relief Services. http://kroc.nd.edu/sites/default/files/reflective_peacebuilding.pdf (Last accessed 23 Feb 2013)
• Facilitation skills and tools: 'DFID Tools for Development'. http://webarchive.nationalarchives.gov.uk/+/http:/www.dfid.gov.uk/Documents/publications/toolsfordevelopment.pdf (Last accessed 23 Feb 2013)

Energisers, ice-breakers and other tips and tools for facilitated events
• Brainstorming: http://brainstorming.co.uk (Last accessed 23 Feb 2013)
• Facilitation tips: http://www.thiagi.com/tips.html (Last accessed 23 Feb 2013)
• International HIV/AIDS Alliance (2002). 100 ways to energise groups. http://www.aidsalliance.org/includes/Publication/ene0502_Energiser_guide_eng.pdf (Last accessed 23 Feb 2013)
• UNICEF. Games and Exercises: A manual for facilitators and trainers involved in participatory group events. http://www.click4it.org/images/5/55/Visualisation_in_Participatory_Programmes_-_Games_and_Exercises.pdf (Last accessed 23 Feb 2013)

Divergent thinking/creativity
• 192 creativity techniques: http://www.mycoted.com/Category:Creativity_Techniques (Last accessed 23 Feb 2013)
• Participatory events: New Economics Foundation (1998). Participation Works! 21 techniques of community participation for the 21st century. http://www.neweconomics.org/publications/participation-works (Last accessed 23 Feb 2013)
• Tools for knowledge and learning: http://www.dochas.ie/pages/resources/documents/ODI_KM_toolkit.pdf (Last accessed 23 Feb 2013)



Stage 9: Communicating your findings

Creating a communications strategy
• CRS – Guidelines on designing an evaluation reporting and communication strategy: Stetson, V. (Sept, 2008). Monitoring & Evaluation: Short Cuts, Guidelines on designing an evaluation reporting and communication strategy. Available: http://www.crsprogramquality.org/storage/pubs/me/MEshortcut_communicating.pdf (Last accessed 07 Feb 2013)
• ODI toolkit: Hovland, I. (Oct, 2005). Successful Communication, A Toolkit for Researchers and Civil Society Organisations. Available: http://www.odi.org.uk/sites/odi.org.uk/files/odi-assets/publications-opinion-files/192.pdf (Last accessed 07 Feb 2013)

Quantitative visualisation
• Cartoons and stories on research, evaluation and technology: Freshspectrum. (2013). Data Viz Resources. Available: http://freshspectrum.com/dataviz/ (Last accessed 23 Feb 2013)
• Data visualisation using Excel: VizWise. (2011). Data Visualisation for Excel.
• Duggirala, P. (2013). Excel tutorials, tips and downloads. Available: http://chandoo.org/wp/ (Last accessed 07 Feb 2013)
• Excel training tutorials online: http://www.excelcharts.com/blog/ (Last accessed 07 Feb 2013)
• OSCI. (2009). Examples, Gallery. Available: http://www.improving-visualisation.org/visuals (Last accessed 23 Feb 2013)
• Tableau Software. (2013). Homepage. Available: http://www.tableausoftware.com/public (Last accessed 07 Feb 2013)
• Using graphics to report evaluation results: Minter, E; Michaud, M. (2003). Using graphics to report evaluation results. Available: http://learningstore.uwex.edu/Assets/pdfs/G3658-13.pdf (Last accessed 07 Feb 2013)

Qualitative visualisation
• Analyse and explore data using the Many Eyes tools developed by IBM Research: IBM Research, Visual Communication Lab. (2007). Tour. Available: http://www-958.ibm.com/software/data/cognos/manyeyes/page/Tour.html (Last accessed 23 Feb 2013)
• Think with your eyes: considerations when visualising information: Mitchell, J; Rands, M; RISE. (2001). Think with your eyes. Available: http://www.rise.hs.iastate.edu/documents/VisualizingInformation.pdf (Last accessed 23 Feb 2013)
• Word clouds: www.wordle.net (Last accessed 23 Feb 2013)

Tools
• Prezi: www.prezi.com (Last accessed 23 Feb 2013)
• Tableau Public: http://www.tableausoftware.com/public/ (Last accessed 23 Feb 2013)
• Microsoft Live Movie Maker: http://windows.microsoft.com/en-GB/windows/get-movie-maker-download (Last accessed 23 Feb 2013)
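If you are comfortable with a little scripting, the kind of chart these resources discuss can also be produced programmatically. Below is a minimal sketch using Python's matplotlib library; the skill names and competency scores are invented for illustration.

# A minimal sketch of visualising quantitative findings as a grouped bar
# chart with matplotlib. The competency scores below are invented.
import matplotlib.pyplot as plt

skills = ["Searching", "Evaluating", "Citing", "Synthesising"]
pre    = [2.1, 1.8, 1.5, 1.9]   # mean self-rated score before training
post   = [3.6, 3.2, 3.0, 2.8]   # mean self-rated score after training

x = range(len(skills))
plt.bar([i - 0.2 for i in x], pre,  width=0.4, label="Before training")
plt.bar([i + 0.2 for i in x], post, width=0.4, label="After training")
plt.xticks(list(x), skills)
plt.ylabel("Mean self-rated competency (1-5)")
plt.title("Information literacy skills before and after training")
plt.legend()
plt.savefig("findings.png")   # or plt.show() for interactive use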



Appendix 3: Suggested criteria for describing
good practice in monitoring and evaluation
This document explains briefly how to contribute your experiences to the toolkit. These could be instances of good practice from your own experience of undertaking monitoring and evaluation of your information literacy intervention (or another capacity building programme). Set out below is a suggested short set of criteria which can act as a checklist and a guide towards describing your good practice. It applies to all information literacy training interventions and will be used as criteria for mapping your experiences to the nine stages identified in this toolkit. We welcome all examples of best practice and ask only that you inform us of the location where the intervention took place. The criteria are articulated around a series of practical questions that all trainers should be in a position to address without too much difficulty.

Definitions:
• Define the scope of the training intervention – also include information on where and when training will take place.
• What is the purpose of the intervention?
• What form of monitoring and evaluation are you describing (please refer to the different stages in this toolkit)?
• Who will attend the intervention?
• What support is required to run the intervention (personnel, facilities, financial)?

Description and scope: this essentially describes the monitoring and evaluation approach, method or tool.
• Describe the monitoring and evaluation approach, method or tools you will be using. Try to be specific about why you chose this particular approach.
• Tell us about any innovations or creative applications that you devised when applying this approach, method or tool.
• What did the monitoring and evaluation approach, method or tool tell you? For instance, did it help you strategically plan or determine your objectives?
• Are there any general or specific, practical or theoretical issues that need to be considered?
• What did you do with the data gathered?
• How will it help you to communicate the impact of your training intervention?

When you are ready, please forward your case study to the BLDS / IDS email inbox: [email protected], including the term "M&E toolkit" in the subject line.



Appendix 4: References
ACRL. (1989). Presidential Committee on Information Literacy: Final Report. Available: http://www.ala.org/acrl/publications/whitepapers/presidential (Last accessed 23 Feb 2013)

ALA. (2000). Information Literacy Competency Standards for Higher Education. Available: http://www.ala.org/acrl/sites/ala.org.acrl/files/content/standards/standards.pdf (Last accessed 23 Feb 2013)

Bandolier. (1994-2007). P-value. Available: http://www.medicine.ox.ac.uk/bandolier/booth/glossary/pvalue.html (Last accessed 23 Feb 2013)

BOND. (2003). Logical Framework Analysis. Available: http://www.gdrc.org/ngo/logical-fa.pdf (Last accessed 23 Feb 2013)

Catts, R and Clark, C. (2007). Information Skills Survey: Its Application to a Medical Course. Evidence Based Library and Information Practice. 2 (3)

Catts, R and Lau, J. (2008). Towards Information Literacy Indicators: Conceptual Framework paper. Available: http://www.ifla.org/files/assets/information-literacy/publications/towards-information-literacy_2008-en.pdf (Last accessed 23 Feb 2013)

Chambers, R. (2002). Ideas for Evaluation and Ending. In: Participatory Workshops: a sourcebook of 21 sets of ideas and activities. UK: Earthscan. 40-53

Dawes, J. (2007). Do data characteristics change according to the number of scale points used? An experiment using 5-point, 7-point and 10-point scales. International Journal of Market Research. 50 (1)

Downie, A. (2008). From Access to Action: Impact Pathways for the IDS Knowledge Services. Available: http://www.ids.ac.uk/files/From_Access_to_Action_Downie_2008.pdf (Last accessed 23 July 2013)

Garcia-Quismondo, M. (July, 2010). Evaluation of Information Literacy Programmes in Higher Education: Strategies and Tools. RUSC. 7 (2)

GDNet. GDNet – Research Communications from & for the Global South. Available: http://depot.gdnet.org/cms/files//GDNet_strategy_note.pdf (Last accessed 23 July 2013)

Health Information and Libraries. (2012). Learning and teaching in action. Health Information and Libraries Journal. 29, 81-86

Hepworth, M and Duvigneau, S. (2012). Building Research Capacity: Enabling Critical Thinking Through Information Literacy in Higher Education in Africa. Available: http://opendocs.ids.ac.uk/opendocs/bitstream/handle/123456789/2301/BuildingResearchCapacityR1.pdf?sequence=1 (Last accessed 23 Feb 2013)

Hogan, C. (2003). Basic Facilitation Toolkit. In: Practical Facilitation: A Toolkit of Techniques. UK: Kogan Page Publishers. 73-80

IFLA. (2004). Guidelines for Information Literacy Assessment. Available: http://www.ifla.org/files/assets/information-literacy/publications/il-guidelines-2004-en.pdf (Last accessed 23 Feb 2013)

Kotter, J.P. and Cohen, D.S. (2002). The Heart of Change: Real-life Stories of How People Change Their Organizations. Harvard Business Review School Press, Boston, Massachusetts

Kusters, C., van Vught, S., Wigboldus, S., Williams, B. & Woodhill, J. (2011). Making evaluations matter: A practical guide for evaluators. Centre for Development Innovation, Wageningen University & Research centre, Wageningen, The Netherlands. www.cdi.wur.nl

Mok, A. (2008). The focused conversation method. Available: http://fnsingapore.blogspot.co.uk/2008/08/focused-conversation-method.html (Last accessed 23 Feb 2013)

Ontrac. (2012). Theory of Change: What is it? Available: http://www.capacity.org/capacity/export/sites/capacity/documents/topic-readings/ONTRAC-51-Theory-of-Change.pdf (Last accessed 23 Feb 2013)

Ramalingam, B. (2010). Organisational learning for aid, and learning aid organisations. Available: http://www.capacity.org/capacity/opencms/en/topics/learning/organisational-learning-for-aid-and-learning-aid-organisations.html (Last accessed 23 Feb 2013)

Roddy, K. (2011). Measuring Outcomes and Impact. Available: http://www.cilip.org.uk/get-involved/special-interest-groups/international/events/training/Documents/May232011KRpres.pdf (Last accessed 23 Feb 2013)

Roddy, K. (2006). Creating effective questionnaires and surveys and analysing the data. Available: http://www.lse.ac.uk/library/versions/Creating%20effective%20questionnaires%20and%20surveys.pdf (Last accessed 23 Feb 2013)



SCONUL Working Group on Information Literacy. (2011).
The SCONUL Seven Pillars of Information Literacy: Core
Model for Higher Education. Available: http://www.sconul.
ac.uk/sites/default/files/documents/coremodel.pdf
(Last accessed 23 Feb 2013)
Secker, J. and Coonan, E. (2011). A new curriculum for information literacy: executive summary. Arcadia Programme, Cambridge University Library, Cambridge, UK

Spencer, L., Ritchie, J., Lewis, J. and Dillon, L. (2003). Quality in Qualitative Evaluation: A framework for assessing research evidence. Available: http://resources.civilservice.gov.uk/wp-content/uploads/2011/09/a_quality_framework_tcm6-7314.pdf (Last accessed 23 Feb 2013)

Steinerová, J. (2010). "Ecological dimensions of information literacy". Information Research, 15(4) paper colis719. Available: http://InformationR.net/ir/15-4/colis719.html (Last accessed 05 July 2013)

Wema, E and Hepworth, M. (2007). An evaluation of an information literacy training initiative at the University of Dar es Salaam. Journal of Information Literacy. 1 (1), 1-12

World Library and Information Congress: 77th IFLA General Conference. 2011. Attaining Information Literacy Training: Evaluating a Theory and Research Based Educational Intervention. 13-18 August. Puerto Rico: San Juan



Research conducted
with the support of:

This report was jointly produced by:


Institute of Development Studies
Brighton, BN1 9RE
Tel: +44 (0) 1273 606261
Email: [email protected]
Loughborough University
Leicestershire, UK
Tel: +44 (0) 1509 223039
Email: [email protected]
Orla Cronin Research
93 Banks Road, West Kirby
Wirral, CH48 0RB
Tel: +44 (0) 161 408 2047
Email: [email protected]
Research Information Network
Woburn House, 20-24 Tavistock Square
London, WC1H 9HF
Tel: +44 (0) 203 397 3647
Email: [email protected] http://creativecommons.org/licenses/by-sa/3.0/
