6758 IDS Toolkit Interactive
TRAINING TOOLKIT
The monitoring and evaluation for information literacy training initiatives in Africa: a journey approach
IL Standards, Models and Frameworks

Information Literacy Standards (Various) – Association of College and Research Libraries: http://www.ala.org/acrl/standards (accessed 23 Feb 2013)

Australian and New Zealand Information Literacy Framework, principles, standards and practice, A Bundy http://www.library.unisa.edu.au/learn/infolit/Infolit-2nd-edition.pdf (accessed 23 Feb 2013)

The SCONUL Seven Pillars of Information Literacy (Core Model for Higher Education) http://www.sconul.ac.uk/sites/default/files/documents/coremodel.pdf (accessed 23 Feb 2013)

‘Towards information literacy indicators’, a conceptual framework, R Catts and J Lau. http://www.ifla.org/files/assets/information-literacy/publications/towards-information-literacy_2008-en.pdf (accessed 23 Feb 2013)

‘The big blue – information skills for students’ – Final report, JISC, http://www.jisc.ac.uk/media/documents/programmes/jos/bigbluefinalreport.pdf (accessed 23 Feb 2013)

Project SAILS – https://www.projectsails.org/ (accessed 23 Feb 2013)

LAMP – Literacy Assessment and Monitoring Programme http://www.uis.unesco.org/literacy/Pages/lamp-literacy-assessment.aspx (accessed 23 Feb 2013)

CAUL – International resources in information literacy http://www.caul.edu.au/caul-programs/information-literacy/information-literacy-resources/international-resources (accessed 23 Feb 2013)

OECD’s PISA (Programme for International Student Assessment) http://www.oecd.org/edu/school/programmeforinternationalstudentassessmentpisa/ (accessed 23 Feb 2013)
This toolkit is largely the product of a consultative workshop held in Centurion, South Africa, on 15-17 February 2012. The workshop was framed around a series of interactive sessions with input from participants from the ABC Project Kenya, Aga Khan University Kenya, University of Botswana, Ethiopian AIDS Resource Centre, the Information Training and Outreach Centre for Africa (ITOCA) (South Africa), University of Johannesburg, University of Loughborough (UK), Knowledge Leadership Associates (South Africa), University of Malawi, Researchers Information Network (UK) and Savana Signatures (Ghana).

Although participants formulated their own understanding of information literacy (see Box 1: Definitions of Information Literacy), we need not worry here about a perfect definition of the concept. Rather, we need to stress the importance of learners’ abilities to make judgments about the quality of the information they use, and about how they might most appropriately use different kinds of information in different contexts. It may be helpful if we think in terms of the knowledge, skills, competencies and behaviours required in different contexts in Africa, and how they apply to learners – not only during their formal education, but throughout their lives in their varied roles as citizens and active members of society.

Below are areas of information literacy identified as important by respondents to a survey carried out before the Centurion workshop, as well as by participants at the workshop itself:

• General use of library resources
• Search and discovery, including ‘smart’ use of the Internet
• Evaluating sources and resources
• Referencing
• Citation
• Legal issues, including copyright and plagiarism
• Writing and authorship
• Defining research topics, and conducting research

But this list is far from exhaustive, particularly if information literacy is understood as broader than the traditional, library-centred view based on finding materials and deploying bibliographic skills.

The Centurion workshop also proposed other skills and competencies that trainers would like to see included as contributing to information literacy in Africa. For instance:

• Information literacy in the research process
• The nature of information
• Management, stewardship and preservation of information and data
• Synthesis and integration of data and information
• Use and implications of social media in the research process
• Publishing and promoting the results of research
• Epistemology
• Forms of knowledge and its representation
• Relationships between data, information, knowledge and understanding
• Building knowledge communities
• Information literacy professional development
• Communication and presentation
• Bibliometrics
• Presentation and packaging of information for different audiences
• Preservation of research outputs
• Visualisation and infographics
• Information literacy in the community
• Advocacy for community organisations and for small and medium-size enterprises (SMEs)
An evolving resource
Looking forward to a future of adaptation and refinement.

Toolkit adaptations
Spreading good practice, acknowledging roots.
M&E in action
Stage 1: Assessing needs
Stage 2: Programme strategy & objectives
Stage 3: Identifying challenges
Stage 4: Designing M&E
Stage 5: Establishing a baseline
Stage 6: M&E during (and immediately after) training
Stage 7: Data analysis
Stage 8: Learning from M&E
Stage 9: Communicating findings
Outputs, Outcomes and Impact

Outputs refer to the extent to which a training intervention was delivered. For example, outputs may include the number of contact or study hours achieved, the number of trainees who attended (alongside some information about the trainees, e.g. their gender).

Outcomes refer to the knowledge and skills trainees have acquired that indicate information literacy, critical thinking and independent learning. For example, in an academic environment, outcomes could include the ability to:

• Define information needs (e.g. by listing key terms or concepts, or creating mind-maps)
• Incorporate a broad range of academically authoritative sources in reports
• Provide evidence of synthesising different viewpoints
• Reflect critically on other people’s contributions
• Include detailed references and bibliography

In a community setting, outcomes could include knowledge of authoritative information sources and the ability to generate information drawing on other information.

Outcomes may also be defined as what trainees can do as a result of what they have learned, notably with regards to changes in attitude, confidence, behaviour, performance and practice. For example, outcomes could include:

• The ability to use a wider range of vocabulary to describe and evaluate an information source
• The use of more sophisticated methods to search for information, such as the use of Boolean logic
• An awareness of what information is required to resolve a problem and how that information will be applied
• A positive attitude toward the opportunity to learn independently
• An ethical approach to the use and application of information

Impact indicates the actual effects training interventions have on the broader environment in which trainees operate, and the extent to which trainees are able to influence that environment as a result of the training. In information literacy training, impact relates to whether the information a person can now process is being applied or used in a way that enables people to achieve their objective/s.

An example of this might relate to trainees achieving good academic results. Has this enabled the individual to participate in a community of practice, build on existing knowledge, create new knowledge and/or tackle shared challenges? Has it enabled their organisation to take advantage of opportunities in the market place or the community to advocate for resources? Has it meant they can now share information and knowledge with others in a similar context to achieve common goals? Has it enabled the individual to make evidence-informed decisions or use evidence in policy-making processes? Has it helped the individual improve their research capacity?
interventions

• Establish whether you have met your objectives (by examining outcomes and impact)
• Learn how to improve your interventions, generate new ones or advocate for resources.

A note on terminology: on occasions, when we are discussing methods which relate very specifically to evaluation of training, we have used the term assessment. This is not a general M&E term, but is specific to training and other educational contexts. We see assessments as being part of the contract between trainer and trainee. An assessment is unlikely to be anonymous, and the trainee will be given their results so they can see what progress they are making. Assessment data can be aggregated and used for M&E, and many assessment methods in fact lend themselves well to M&E.

In this toolkit, we distinguish between M&E approaches, M&E methods, M&E tools, and M&E technologies, as these can often cause confusion (see Box 6: Approaches, Methods, Tools and Technologies). We will learn more about methods, tools and technologies later. For now, we will consider approaches.

Approaches, Methods, Tools and Technologies

To make it easier to compile the toolkit, we have adopted the following definitions in order to cluster similar issues together. You may find these terms used slightly differently within other M&E resources, but what is important is that you are aware of these different elements of your M&E plan.

An M&E approach is its ‘philosophy’, referring to the way in which the monitoring and evaluating is designed and implemented. Different approaches embody different values and judgments, with their structure and focus encompassing what they regard as important to evaluate.

Methods (or methodology) are the way in which you actually collect data. Interviewing is a method, as is a survey.

Tools are elements of methods. A particular method, e.g. a focus group, may involve the use of a particular tool within it as an exercise or activity, e.g. using a focused-discussion tool (e.g. ORID), creating a collage, or conducting a word association activity.

Technologies are the physical ways in which you collect the data. Paper and pencil is a technology, as is iClicker (an instant response device). Electronic technologies, such as email, web, mobiles etc. enable us to implement particular methods and tools more efficiently, creatively or cheaply.
Stage 1 diagram (Assessing needs): Assess the competences (skills and knowledge) of the trainees and the attitudes, values and perceptions of the trainees; identify individuals and/or groups who need to be trained. What methods will you use? Remember: a mixture of both types is possible. Traditional survey-style approaches include needs analysis surveys/questionnaires, interviews and observations.
8. Ontrac, The newsletter of INTRAC: Theory of Change: What’s it all about [downloaded 01 Feb 2013] http://www.capacity.org/capacity/export/sites/capacity/documents/topic-readings/ONTRAC-51-Theory-of-Change.pdf
Impacts: these may be longer term and harder to define.
Decisions
1. Assessing needs
• Who defines how wide a lens you should take on needs assessment?
• Is everyone involved willing to admit to individual and/or organisational needs?
• Do stakeholders accept the necessity of doing a needs assessment, with the consequent resource and time costs?

2. Programme strategy and objectives
• Are stakeholders agreed on the programme strategy?
• Have stakeholders embraced the need for evaluation, with the consequent resource and time costs?
• Are stakeholders agreed on the purpose of the evaluation and the evaluation questions?

3. Identifying challenges
• Have you done a risk assessment for your M&E process, identifying big and small issues which could derail it, and have you developed strategies to address them?

4. Designing M&E
• Do stakeholders understand the implications of the choice of your M&E methods, in terms of skills, resources, cost and the kinds of data which will be generated?
• Have individuals with appropriate skills and time been identified to implement the M&E?
• Have logistics been considered, including scheduling, transport, data collection and data entry?
• Are the participants, i.e. the trainees, sufficiently engaged with the exercise? Do all the stakeholders appreciate the value of the training? Do they consider their needs have been adequately considered? Do they see training, including the M&E, as threatening or enabling?

5. Establishing a baseline
• Have you built in baseline evaluation?
• Have you planned your M&E so that you can establish the baseline early, before the intervention has actually started? (Otherwise, your participants may have already improved, and you may not detect any further changes.)
6. M&E during and after training
• Have you designed your M&E so that it is only minimally disruptive during the training process?
• Do all the trainers understand the point of doing M&E, and have they committed to it?
• Have you communicated your M&E process to participants and engaged them in it?

7. Data analysis
• Have you piloted your data analysis early so that you can identify any issues, e.g. in terms of skills, knowledge or software?

8. Learning from M&E
• Have stakeholders committed to doing something with the M&E results?
• Have they accepted that the report is just the beginning of their effort?

9. Communicating findings
• Have you identified suitable avenues for communication with stakeholders, and allocated time for this?
Challenges from the environment

The physical environment can pose a problem if you intend to gather M&E data through face-to-face interviews or focus groups. One-on-one interviews will need a quiet space for individuals to speak openly. If proximity is an issue you can conduct one-on-one interviews by telephone instead. In focus groups, appropriate space and resources are necessary for people to exchange ideas and discuss aspects of information literacy learning. Don’t forget resources for capturing discussions, such as flip-chart paper placed on walls.

The emotional environment also has an impact on the M&E. Participants will need to see the relevance and benefit of the M&E process to help them engage with it and overcome any concerns about the use of the data. It is important that you emphasise that their comments will be anonymised in summary reports, and always state how the data will be used (for instance, to demonstrate the training success or to lobby for more resources).
Stage 3 diagram (Identify challenges):
• Resource-based challenges, for example: lack of funding, lack of time, logistical obstacles. Build mitigating action into your plan where possible, e.g. carry out a stakeholder analysis and problem tree-analysis, pilot your methods, communicate your strategy.
• People-based challenges, for example: lack of trainer skills and expertise, trainee resistance to interventions, lack of stakeholder buy-in. Communicate risks to relevant stakeholders; organisations, funders or sponsors might be in a position to help.
• Further examples: potential bias, difficulty in application and reproducibility.
Qualitative methods

Advantages:
• Exploratory, and permits answers to arise that you hadn’t considered the possibility of
• May feel more respectful (asking people to tell their own stories rather than imposing a story, e.g. via a questionnaire)
• May feel more participatory for both trainers and participants
• Provides rich, ‘textured’ data which can feel more compelling

Disadvantages:
• May be perceived as less robust than quantitative methods (though this can be mitigated by conducting, analysing and reporting the research to the highest standards)
• May seem deceptively easy to conduct

Quantitative methods

Advantages:
• The only way in which ‘how much/how many/how strong’ etc. questions can be answered
• Analysis and interpretation may be easier than with qualitative methods
• May seem more compelling (but good qualitative methods can be equally compelling)
• Allows you to track subtle changes, including changes across time

Disadvantages:
• ‘Garbage in-garbage out’: quantitative methods are only as good as the instruments used to collect the data
• When people or events are represented by numbers, the narrative may be oversimplified
may eventually achieve a change in their behaviour. However, although it is easier to examine changes in thinking in the short term, we shouldn’t forget that small changes in behaviour may still be detectable. For example, an intervention aimed at increasing the use of research evidence in policy-making processes may seek a rise in the number of policies based on comprehensive and systematic evidence (such as systematic reviews) in the long term. In the short term, though, we might look at whether there is a shift in attitude towards using research evidence and understanding the value it can play in formulating policies or making decisions.

Or consider another example in which an intervention is aimed at improving the training skills of information literacy trainers to use more enquiry-based, learner-centred methods rather than teacher-centred approaches. In the short term we may seek evidence of a shift in their perception of value and some use of enquiry-based activities in their information literacy curriculum, but in the long term we might look for evidence that a trainer has developed an information literacy curriculum based on pedagogical approaches, which also incorporates varying assessment techniques.

The advantages and disadvantages of looking at changes in thinking or changes in behaviour are summarised in Table 4.

Changes in thinking

Advantages:
• Often directly targeted by our interventions
• Perceived to be easier to investigate than behaviour

Disadvantages:
• Possibly no direct causal link between changes in thinking and long-term changes in behaviours
• Dependent on self-reporting, so data may be distorted by biases in design, e.g. when participants ‘know what we want to hear’

Changes in behaviour

Advantages:
• Often felt to be more substantial in terms of an outcome
• May give us ideas which further shape and focus our intervention

Disadvantages:
• May be difficult to define (‘operationalise’) the behaviours we are looking for as indicators of impact (though this is more a challenge than a disadvantage)
• Even if we define the behaviour adequately, it may be extremely difficult to observe in real time, and therefore we may have to depend on self-reporting
objectives and outcomes?

• Do you have the capacity to implement the methods you have found appropriate?

Other considerations include: trainees may have a variety of intellectual and physical capacities. You need to define methods which take account of these. For example, individuals who have a hearing impairment may find a focus group stressful.

However, be aware that, while working within your resources, you need to collect sufficient data to subject them to powerful statistical testing when you undertake your data analysis (in Stage 7 of the M&E journey). You should bear this in mind when designing your sample size.
Stage 4 diagram (Designing M&E): Consider the evaluation questions you want to answer and whether you want to consider changes in behaviour or in thinking. Choose methods best suited to your objectives/outcomes, intervention, capacity/resources and trainees: quantitative methods (e.g. surveys), qualitative methods (e.g. focus groups) or mixed methods (e.g. focus groups and surveys). Be aware: choosing mixed methods (triangulation) is usual and good practice. Then choose your tools, pilot your tools and choose your technologies.
Stage 5 diagram (Establishing a baseline): From your defined objectives, determine what you want to evaluate to establish your baseline: trainees’ skills, knowledge, behaviours, attitudes, or anything else (e.g. IL standards, models and frameworks). Determine appropriate pre-training diagnostics, e.g. self-assessment questionnaires. Use the evidence to design your learning intervention, develop your learning objectives and identify approaches for achieving your learning outcomes.
Types of assessment

Formative assessment occurs while training is being designed, developed and delivered. It includes methods such as pre-training surveys and in-class observations. As a form of assessment, it is regarded as more significant and informative than summative assessment (see below), as it provides both the trainer and the trainee with evidence of the breadth and depth of the learning taking place.

Summative assessment normally takes place at the end of a learning intervention, although it can be undertaken over the course of weeks or months. It focuses on targets, progress and impact, and allows judgments to be made about the extent to which the learning intervention’s objectives have been achieved. As such, it aims to set out what the trainee can actually do, focusing on the final result or outcome. This type of assessment data is usually of interest to funders and institutional and departmental managers.

• How do your trainees feel about their learning experience? Do you need to adjust your training style to achieve the learning objectives/outcomes?
• Do trainees feel confident about their new skills and knowledge?
• Do you need to revisit any concepts in the next session?

Post training:
• What is the current level of knowledge and skills of trainees compared with the baseline?
• What is the immediate indication of training impact on the lives of the trainees?

Assessing progress during training

Formative assessment – or assessment carried out during a learning intervention – brings with it many benefits. It can:
• Inform you of the extent to which your trainees have grasped new concepts, skills or attitudes
• Highlight what is not clear and what needs further elaboration, providing you with evidence about how your learning intervention should be immediately adjusted
• Offer trainees an adaptable trainer who responds to feedback and confirms their achievements …
• … so enabling them to learn to recognise what they themselves need to do to improve, thus promoting critical and independent ownership of their development

Below are some formative assessment methods you might want to consider, though the list is not exhaustive. A good trainer will use a range of methods and tools to assess the progress trainees are making:
11. Stanfield, R.B. (2000). The Art of Focused Conversation. Toronto: New Society Publishers
Stage 6 diagram (M&E during and after a training intervention): M&E during a training intervention (formative assessment), for example: focused conversations, quizzes/questionnaires, demonstrations/presentations, probing questions, practicals/group activities, mood monitors. M&E after a training intervention (summative assessment), for example: case study assignments, reflective journals, diagnostic tools, hypothetical exercises, portfolios of evidence, trainer’s logs, assessment rubrics. Consider: have you captured evidence of the extent of learning? Do you need to change your teaching style? Do you need to revisit any concepts? Are individuals learning? Consider the respective weight that you give to quantitative and qualitative methods and/or tools.
When you analyse the results you can notice the difference between the lowest average scores in column one (pre-training) and column two (re-assessment of prior knowledge). The participants’ lowest average score was 4.05 out of five (81%) prior to training but dropped to 3.4 out of five (68%) after the training. Given that both of these scores were asking for the participants’ skills scores before training had taken place, what explains this disparity? The data suggests two explanations that we first came across in Stage 1: Assessing your trainees’ needs:

• Known/unknowns: after training participants begin to realise how little they knew before training. Therefore, when asked to score themselves retrospectively, they score themselves lower.

• The two questions (i.e. immediate retrospective re-assessment of skills prior to training and the assessment of new skills) are asked in tandem: given that participants are being asked to score themselves before and after training, they overstate their gains to emphasise how much they have learned from the training.

While the second explanation is a theoretical possibility, our evidence may be stronger for the first claim. However, we can argue that individuals cannot recollect the score they gave in the first questionnaire; therefore the second re-assessment is a more realistic assessment of their pre-training capabilities.

Moving beyond self-assessment: testing participants

The alternative to self-assessment is to provide participants with a test. This can be through survey questions testing participants’ knowledge or understanding of a concept, or through assessment within the classroom. The former, however, lends itself better to a quantified analysis and is therefore easier for comparisons when it comes to improvements in the distance travelled.

Consider Table 6. These are survey results from a question in which the training respondents were given seven scenarios (shown in the first column) and asked whether they considered the scenario predominantly teacher or learner centred. The second column is for scores for the number of respondents who thought the scenario was teacher centric before training, the third column is for after training, the fourth column is a percentage for correct answers, and so on for the learner-centred approach. For your reference, the correct answer is given in red.

The table shows that respondents answering after the training were better able to identify the correct approach in all but one instance (the exception being the first scenario with Mary). Our respondents scored well on both tests, 84.2% of them being able to identify the approach before training. After training this rose to 91.4%.

If we had not done the initial study before the course, we could have measured our success with the lofty figure of 91.4% in isolation. However, our pre-course survey showed that respondents were already largely well aware of the differences between teacher- and learner-centric approaches. The true distance travelled here was 7.2%.
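The ‘distance travelled’ arithmetic above can be sketched in a few lines. This is a minimal illustration using the percentages from the worked example; the variable names are ours, not the toolkit’s:

```python
# Distance travelled: the gain over the pre-training baseline, not the
# post-training score in isolation. Figures are from the worked example.
pre_correct = 84.2   # % identifying the correct approach before training
post_correct = 91.4  # % identifying the correct approach after training

distance_travelled = round(post_correct - pre_correct, 1)
print(distance_travelled)  # prints 7.2, not the 'lofty' 91.4 in isolation
```

Without the pre-course figure, the 91.4% alone would overstate what the training itself achieved.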
The t-test and how to apply it

The most common method used to establish whether two groups differ statistically is comparison of the two means (the mathematical averages). Sometimes, though, this can lead to misleading conclusions.

Consider this example: you are about to facilitate a workshop on information literacy that emphasises the importance of search strategies. Before training begins you test the participants on their search skills, and they score an average of 50%. This is the baseline against which any progress after the workshop will be measured. You give the participants another test after the course, on which they average 60%. You decide your main objective has been met: the participants’ search skills are, on average, better as a result of your workshop.

Your conclusion, however, might be premature. Fluctuations in the mean often occur by chance, particularly in small groups. It might be the case, for example, that a handful of participants have improved substantially while the rest remain at roughly the same level, and that this scattered improvement artificially inflates the mean.

This is where the t-test is useful. The t-test looks at the standard deviation in your sample and considers this in the light of your sample size. (The standard deviation is the extent to which each participant’s performance deviates from the class’s mean.)

The t-test will reveal whether your confidence in having made a difference to your participants’ performance is justified, i.e. if you have a large enough sample size and your participants’ improved performance is realised across the board.

The t-test is used for sample sizes of fewer than 30, but the closer you are to 30 the more confident you can be with your data.
Using t-tests in Microsoft Excel

The following is a step-by-step guide to using t-tests in Microsoft Excel 2007 (though other spreadsheet software also has the necessary functions). You may need to amend these instructions or download a TOOLPAK depending on the version of Excel loaded on your PC.

1. Transfer your data into a spreadsheet. For our example presented in Case Study 11: Using the distance travelled, we have pre- and post-workshop scores for 20 participants asked to rate from one to five their skills in ‘Using mind-maps to develop search strategies’, with one being the lowest score and five the highest. As you can see in Table 8, the average before training was 3.15 and after training 4. Our task is to find out whether this is a significant difference.

2. Go to the ‘Formulas’ toolbar, then to ‘More Functions’, and finally to ‘Statistical’. Here you will find an option for the t-test. Figure 6 shows the box that appears after this selection.

Figure 6: screen grab of first step in using t-tests in MS Excel

Fill the boxes in as follows:

3. Array 1 – select all your scores for the first sample (i.e. the scores for the pre-workshop test).

4. Array 2 – select the scores for the sample against which you want to compare Array 1 (i.e. the post-workshop scores).

5. Tails – you must specify a one- or two-tailed hypothesis. A one-tailed hypothesis specifies direction, so for instance it claims that Array 1 will be greater than Array 2. A two-tailed hypothesis is non-directional: it merely states that there will be some kind of difference between Array 1 and Array 2 without specifying which is likely to be greater. In M&E we are generally looking for an improvement of some sort, so usually we will choose a one-tailed test, which is more powerful (because the test is only ‘looking’ for effects in one direction).

6. Type – you choose Type 1 when your population is paired. This is the case when, for instance, you carry out two tests for one group, so that Array 1 and Array 2 comprise the same people at different points in time. Type 2 compares two different groups when there is an equal variance. Type 3 compares two different groups when the variance is unequal. An unequal variance is when one data set is more scattered than the other, i.e. a data set in which there are substantially higher and substantially lower figures than the mean.

Figure 7 shows the Excel box filled in for our worked example. On clicking ‘OK’ you get the figure 0.001813.

Figure 7: completed t-test box in MS Excel

This is your p value. The p value is the likelihood that the difference between the two means is due to chance. Expressed as a percentage, the likelihood that the fluctuations in the two means are random is 0.18%. The norm in social science research is to accept anything lower than 0.05, which indicates a less than 5% likelihood of chance fluctuation.

Given that your p value is lower than p = 0.05, you can say with some confidence that “this effect is not due to chance alone”.
14. http://www.medicine.ox.ac.uk/bandolier/booth/glossary/pvalue.html
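The same one-tailed, paired test (Tails = 1, Type = 1 in the Excel dialog above) can also be run outside a spreadsheet. This sketch assumes the SciPy library is available (version 1.6 or later, for the `alternative` argument) and uses made-up scores, so the p value will not match the 0.001813 from Table 8:

```python
# Reproducing Excel's one-tailed (Tails=1), paired (Type=1) t-test with
# SciPy. The scores below are illustrative stand-ins, not the Table 8 data.
from scipy import stats

array1 = [3, 2, 4, 3, 3, 2, 4, 3, 3, 4]   # pre-workshop scores
array2 = [4, 3, 4, 4, 5, 3, 5, 4, 4, 5]   # post-workshop scores

# ttest_rel pairs each participant's two scores (Type 1);
# alternative='less' asks the one-tailed question "is pre < post?".
t_stat, p_value = stats.ttest_rel(array1, array2, alternative='less')

if p_value < 0.05:
    print("The effect is unlikely to be due to chance alone (p = %.6f)" % p_value)
else:
    print("No statistically significant improvement (p = %.6f)" % p_value)
```

As with the spreadsheet version, a p value below 0.05 supports the claim that the improvement is not due to chance alone.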
Factor: Organisational culture
• Is learning embedded in the strategy and operational plan?
• Are employee reward structures designed to reward learning?
• Do senior management take learning seriously and attend learning events?

Factor: Individual ability and confidence
• Is some training in understanding information (e.g. qualitative and quantitative data) desirable for some staff?
• Are there events designed so everyone's voice can be heard?
• Does the group feel 'safe' for each individual?
• Are individual preferences in how to communicate (sketches, drama, poems, words, videos, photos etc.) taken into account?
• Are learning cycles designed so that different learning styles can be encompassed?

Factor: Power and hierarchies
• Are events designed to minimise the symbols of hierarchies (for example when considering invitees, seating plans, roles assigned)?
• Do senior staff genuinely listen to more junior staff?
• Are the results of learning events valued, and are they taken seriously and acted on?

Factor: Donor support
• Is there a budget line for learning cycles?
• Are donors involved in organisational learning?
• Are results communicated to donors?

Factor: Personal resources
• Is learning a formal element of individuals' roles?
• Are events scheduled sensitively in terms of peak times in individuals' roles?

Factor: Motivation
• Are events designed with short-term and long-term benefits in mind?
• Are they enjoyable?

Factor: Quality of facilitation
• Is there a designated facilitator(s) for learning events?
• Is the event planned carefully?
• Does the facilitator have a range of tools and exercises available to examine the information from different perspectives?

Factor: Accessibility of information
• Is the 'raw material' for learning easily accessible in terms of technology, readability, language?
• Is there a structure for storing the output of learning events so that it is not lost?
17. Managing for Impact in Rural Development: A Guide for Project M&E. www.ifad.org
21. Ramalingam, B. (2010). Organisational learning for aid, and learning aid organisations. Available: http://www.capacity.org/capacity/opencms/en/topics/learning/organisational-learning-for-aid-and-learning-aid-organisations.html. Last accessed 23 Feb 2013
22. Kotter, J.P. and Cohen, D.S. (2002) The Heart of Change: Real-life Stories of How People Change Their Organizations. Harvard Business School Press, Boston, Massachusetts. Summarised in Kusters, C., van Vught, S., Wigboldus, S., Williams, B. & Woodhill, J. (2011). Making Evaluations Matter: A Practical Guide for Evaluators. Centre for Development Innovation, Wageningen University & Research centre, Wageningen, The Netherlands. www.cdi.wur.nl
[Flowchart: the lesson-learning journey]
START HERE → Consider why you want to identify lessons learned: to improve future training interventions; for reporting to primary stakeholders, partners or funders; or to deal with a crisis → Determine the information you plan to review → Determine who should attend (e.g. all stakeholders, or representatives of each stakeholder group) → Consider your tools for learning (tools for convergent thinking, e.g. mind-mapping) → Disseminate, store, but above all USE your learning.
[Concept map: relevance in the electronic environment]
Relevance in the electronic environment:
• is influenced by electronic publishing: electronic journals co-exist with traditional publishing
• is characterized by: scatter of information; mosaic-like construction of meaning; exploration in context; flexibility; non-linearity; speed; wider accessibility
• is supported by technological features (tools): content manipulation; advanced search; inlinks in context; intelligent interfaces; multimedia; visualization; group communication
• comprises: use of electronic sources; verification; navigation
• is conditioned by: verification of source origin
• is perceived as having positive attributes (speed, recency, accessibility) and negative attributes (demandingness, time, vagueness, commercially driven content, problems of perception)
Reference: http://informationr.net/ir/15-4/colis719.html
[Flowchart: communicating your M&E evidence]
START HERE → Consider who you want to communicate your M&E evidence to: your trainees; the organisations in which the trainees are located; the sponsors or funders of these organisations; others. Refer back to your stakeholder analysis → Consider their differing needs → Determine appropriate communications.
• Drama, puppetry: http://www.hapinternational.org/pool/files/wvuk-puppets-for-accountability.pdf (Last accessed 23 Feb 2013)
• Drawings: http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Focus groups: http://www.infed.org/research/focus_groups.htm or http://sru.soc.surrey.ac.uk/SRU19.html (Last accessed 23 Feb 2013)
• Graphical facilitation: http://aea365.org/blog/?tag=graphic-facilitation and http://www.blog.biggerpicture.dk/learn-graphic-facilitation-snippit-learning/
• Interviews (face to face): http://www.fao.org/docrep/w3241e/w3241e06.htm (Last accessed 23 Feb 2013)
• Observations: http://learningstore.uwex.edu/assets/pdfs/G3658-5.PDF and http://transition.usaid.gov/policy/evalweb/documents/TIPS-UsingDirectObservationTechniques.pdf and http://www.taklin.com and http://www.noldus.com/human-behavior-research (Last accessed 23 Feb 2013)
• Participant observation (= ethnography): http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Photography: http://www.socialsciences.manchester.ac.uk/morgancentre/realities/toolkits/participatory-visual/17-toolkit-participatory-visual-methods.pdf and http://www.photovoice.org/html/ppforadvocacy/ppforadvocacy.pdf and http://www.unicef-irc.org/publications/pdf/iwp_2009_05.pdf (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (blog): http://www.dubstudios.com/technology/using-blog-tools-for-research-and-innovation/ and http://www.dubstudios.com/mr/digital-diaries-as-an-online-qual-methodology/ (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (email): http://sru.soc.surrey.ac.uk/SRU21.html (Last accessed 23 Feb 2013)
• Reflective tools: Diaries (virtual): https://www.scss.tcd.ie/Gavin.Doherty/mood-charting-preprint.pdf and http://www.jopm.org/columns/innovations/2011/09/26/just-text-me-using-sms-technology-for-collaborative-patient-mood-charting/ (Last accessed 23 Feb 2013)
• Spider tool (see also Evaluation Wheel): http://www.ungei.org/resources/files/SCS_Spider_Tool_Final_2.pdf and http://www.crin.org/docs/resources/publications/SCS_Spider_Tool_Facilitators_Guide_3.pdf (Last accessed 23 Feb 2013)
• Ragin, C., Nagel, J. and White, P. (2004). General Guidance for Developing Qualitative Research Projects & Recommendations for Designing, Evaluating, and Strengthening Qualitative Research in the Social Sciences. National Science Foundation Workshop on Scientific Foundations of Qualitative Research
• Distance travelled: Roddy, K. (2011). Measuring Outcomes and Impact.