Rainbow Framework
There are so many different options (methods, strategies and processes) in evaluation that
it can be hard to work out which ones to choose.
BetterEvaluation organises options into 34 different evaluation tasks, grouped into 7 colour-coded
clusters, to make it easier for you to choose and use appropriate methods, strategies
or processes. It also shows approaches (which combine a package of options), such as
Randomized Controlled Trials (RCTs) and Outcome Mapping (OM).
The planning tool can be used to: commission and manage an evaluation; plan an
evaluation; check the quality of an ongoing evaluation; embed participation thoughtfully in
evaluation; develop evaluation capacity.
Send suggestions for additions or revisions to us via http://betterevaluation.org
BetterEvaluation is an international collaboration to improve evaluation theory and practice by sharing information
about evaluation options (methods, strategies, processes) and approaches (collections of methods). We provide an
interactive and freely accessible website and related events and resources. Visit BetterEvaluation at
http://betterevaluation.org and register to contribute material, add comments and ask questions. We support
individual evaluators, managers of evaluation and practitioners as well as organisations across disciplinary and
organisational boundaries, sectors, languages and countries.
Founding partners: Institutional Learning and Change (ILAC) initiative of the Consultative Group on International
Agricultural Research (CGIAR), Overseas Development Institute (ODI), Pact, RMIT University (Royal Melbourne Institute of
Technology).
Financial support: Australian Government Department of Foreign Affairs and Trade (DFAT), International Fund
for Agricultural Development (IFAD), The Rockefeller Foundation, Netherlands Ministry of Foreign Affairs,
International Development Research Centre (IDRC).
You may use this document under the terms of the Creative Commons Attribution-NonCommercial 3.0 Unported licence available
at http://creativecommons.org/licenses/by-nc/3.0/.
Who needs to be involved in the evaluation? How can they be identified and engaged?
Understand stakeholders:
1. Community scoping
2. Stakeholder mapping and analysis (see the sketch after this list)
Engage stakeholders:
3. Community fairs
4. Fishbowl technique
5. Formal meeting processes
6. Informal meeting processes
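Option 2 above (stakeholder mapping and analysis) is often operationalised as an influence/interest grid. The sketch below is a minimal illustration; the stakeholder names, the 1-5 rating scale and the engagement labels are assumptions made for the example, not part of the framework.

```python
# A minimal influence/interest grid: one common stakeholder mapping
# device. Names and ratings are illustrative only.
stakeholders = {
    # name: (influence 1-5, interest 1-5)
    "funder": (5, 3),
    "programme staff": (4, 5),
    "participants": (2, 5),
    "local media": (3, 1),
}

for name, (influence, interest) in stakeholders.items():
    if influence >= 3 and interest >= 3:
        role = "engage closely"
    elif influence >= 3:
        role = "keep satisfied"
    elif interest >= 3:
        role = "keep informed"
    else:
        role = "monitor"
    print(f"{name}: {role}")
```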
Who will have the authority to make what type of decisions about the evaluation?
Who will provide advice or make recommendations about the evaluation?
What processes will be used for making decisions?
Types of structures:
1. Advisory group
2. Citizen juries
3. Steering group
Who will conduct the evaluation?
1. Community
2. Expert review
3. External consultant
4. Hybrid - internal and external
5. Internal staff
6. Learning alliances
7. Peer review
Approaches:
Horizontal evaluation
Positive deviance
Participatory evaluation
What resources (time, money, and expertise) will be needed for the evaluation and how can they be obtained?
Consider both internal (e.g. staff time) and external (e.g. previous participants' time) resources.
Determine resources needed:
1. Evaluation budget matrix (see the sketch after this list)
2. Evaluation costing
3. Resources stocktake
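Option 1 above (evaluation budget matrix) cross-tabulates evaluation tasks against cost categories so that both task totals and an overall total can be read off. A minimal sketch, assuming pandas; the task names and figures are illustrative only.

```python
# An evaluation budget matrix: rows are evaluation tasks, columns are
# cost categories. All task names and amounts are illustrative.
import pandas as pd

budget = pd.DataFrame(
    {
        "staff_time": [4000, 9000, 5000],
        "travel": [500, 2500, 0],
        "materials": [200, 800, 300],
    },
    index=["design", "data collection", "reporting"],
)

budget["task_total"] = budget.sum(axis=1)  # cost of each task
print(budget)
print("Total evaluation cost:", budget["task_total"].sum())
```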
Ethical and quality standards:
3. Evaluation Standards
4. Institutional Review Board (IRB)
Planning documents:
1. Aide memoire
2. Evaluation framework
3. Evaluation plan
4. Evaluation work plan
5. Gantt chart
6. Inception report
How will the evaluation itself be evaluated, including the plan, the process and the report?
1. Beneficiary exchange
2. Expert review for meta-evaluation
3. Group critical reflection
How can the ability of individuals, groups and organisations to conduct and use evaluations be strengthened?
1. Community of practice
2. Conferences
3. Coaching
4. Evaluation competencies
5. Evaluation library
6. Evaluation policy
7. Evaluation societies and associations
8. Learning circle
9. Mentoring
10. Organisational policies and procedures
11. Peer coaching
12. Peer review for meta-evaluation
13. Reflective practice
14. Supervised practice in teams
15. Training and formal education
How is the intervention understood to work (programme theory, theory of change, logic model)?
Ways of developing logic models:
1. Articulating mental models
2. Backcasting
3. Five whys
4. Group model building
5. Previous research and evaluation
6. SWOT analysis
Approaches:
Collaborative outcomes reporting
Outcome mapping
Participatory impact pathways approach
Realist evaluation
What are possible unintended results (both positive and negative) that will be important to address in the
evaluation?
1. Key informant interviews
2. Negative programme theory
3. Risk assessment
Decide purpose
What are the primary purposes and intended uses of the evaluation?
Using findings:
1. Contribute to broader evidence base
2. Inform decision making aimed at improvement (formative)
3. Inform decision making aimed at selection, continuation or termination (summative)
4. Lobby and advocate
Using process:
5. Build trust and legitimacy across stakeholders
6. Ensure accountability
7. Ensure diverse perspectives are included,
especially those with little voice
What are the high level questions the evaluation will seek to answer? How can these be developed?
(This task has resources only)
Approaches:
Critical system heuristics
Participatory evaluation
Sample
How will you collect and/or retrieve data about activities, results, context and other factors?
Information from individuals:
1. Deliberative opinion polls
2. Diaries
3. Goal attainment scales
4. Interviews with individuals:
- Convergent
- In-depth
- Key informant
5. Hierarchical card sorting
6. Keypad technology
7. Questionnaires (or surveys):
- Email
- Face-to-face
- Internet
- Mail
- Mobile phone (see Mobile Data Collection)
- Telephone
8. Mobile data collection
9. Photolanguage
10. Photovoice
11. Polling Booth
12. Postcards
13. Projective techniques
14. Seasonal calendars
15. Sketch mapping
16. Stories
Observation:
32. Field trips
33. Non-participant observation
34. Participant observation
35. Photography/video recording
36. Transect
Physical:
37. Biophysical
38. Geographical
Manage data
How will you organise and store data and ensure its quality?
1. Consistent data collection and recording
2. Data backup
3. Data cleaning (see the sketch below)
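As a concrete illustration of option 3 (data cleaning), the sketch below drops duplicate records, normalises a text field and flags out-of-range values. The file name, column names and validity rule are hypothetical.

```python
# A minimal data-cleaning pass over a survey file (assumed CSV layout):
# remove exact duplicates, normalise a text field, flag implausible ages.
import pandas as pd

df = pd.read_csv("survey_responses.csv")         # hypothetical file
df = df.drop_duplicates()                        # remove double entries
df["site"] = df["site"].str.strip().str.lower()  # normalise text field
valid = df["age"].between(0, 120)                # flag implausible ages
print("Rows failing range check:", (~valid).sum())
df[valid].to_csv("survey_responses_clean.csv", index=False)
```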
Analyse data
Numeric analysis:
8. Multivariate descriptive
9. Non-parametric inferential
10. Parametric inferential
11. Summary statistics (see the sketch after this list)
12. Time series analysis
Textual analysis:
13. Content analysis
14. Framework matrices
15. Thematic coding
16. Timeline and time-ordered matrices
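The sketch below contrasts option 11 (summary statistics) with option 9 (non-parametric inferential analysis), using SciPy's Mann-Whitney U test; the two groups and their scores are invented for the example.

```python
# Summary statistics versus a non-parametric inferential test.
# Group labels and scores are illustrative only.
from scipy import stats

group_a = [3, 4, 4, 5, 6, 7]  # e.g. programme participants' scores
group_b = [2, 3, 3, 4, 4, 5]  # e.g. comparison group scores

# Summary statistics (option 11)
print("Mean A:", sum(group_a) / len(group_a))
print("Mean B:", sum(group_b) / len(group_b))

# Non-parametric inferential test (option 9): Mann-Whitney U makes no
# normality assumption about the score distributions.
u_stat, p_value = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.3f}")
```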
Visualise data
Analyse text:
12. Phrase net
13. Word cloud (see the counting sketch after this list)
14. Word tree
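Options 13 and 14 both start from term frequencies. The sketch below shows that underlying counting step in plain Python; the responses and stop-word list are illustrative.

```python
# Tokenise free-text responses and rank terms by frequency: the input
# for sizing words in a word cloud. Responses and stop words are invented.
import re
from collections import Counter

responses = [
    "The training was practical and the facilitator was excellent",
    "More practical sessions would help",
]
stopwords = {"the", "and", "was", "would", "more"}

words = [
    w
    for text in responses
    for w in re.findall(r"[a-z']+", text.lower())
    if w not in stopwords
]
print(Counter(words).most_common(5))
```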
How will you assess whether the results are consistent with the theory that the intervention produced them?
Gathering additional data:
1. Key informants attribution
2. Modus operandi
3. Process tracing
Approaches:
Contribution analysis
Collaborative outcomes reporting
Analysis:
4. Check dose-response patterns
5. Check intermediate outcomes
6. Check results match a statistical model
7. Check results match expert predictions
8. Check timing of outcomes
9. Comparative case studies
10. Qualitative comparative analysis
11. Realist analysis of testable hypotheses
How will you compare the factual with the counterfactual (what would have happened without the intervention)?
Experimental:
1. Control group
Quasi-experimental:
2. Difference-in-difference (see the sketch after this list)
3. Instrumental variables
4. Judgemental matching
5. Matched comparisons
6. Propensity scores
7. Regression discontinuity
8. Sequential allocation
9. Statistically created counterfactual
Non-experimental:
10. Key informant
11. Logically constructed counterfactual
Approaches:
Randomised Controlled Trials
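Quasi-experimental option 2 (difference-in-difference) can be shown with simple arithmetic: the effect estimate is the change in the treated group minus the change in the comparison group, which nets out trends common to both. All outcome values below are invented.

```python
# Difference-in-difference: compare before/after changes across groups.
# All outcome values are illustrative.
treated_before, treated_after = 42.0, 55.0  # mean outcome, treated group
control_before, control_after = 40.0, 46.0  # mean outcome, comparison group

treated_change = treated_after - treated_before  # change with intervention
control_change = control_after - control_before  # background change

did_estimate = treated_change - control_change   # estimated net effect
print("Estimated intervention effect:", did_estimate)
```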
How will you investigate possible alternative explanations for the results?
1. Key informant
2. Force field analysis
3. General elimination methodology
4. Process tracing
Approaches:
Contribution analysis
Collaborative outcomes reporting
How will you synthesise data from a single evaluation?
Techniques:
3. Cost benefit analysis (see the sketch after this list)
4. Cost effectiveness analysis
5. Cost utility analysis
6. Lessons learnt
7. Multi-criteria analysis
8. Numeric weighting
9. Qualitative weight and sum
10. Rubrics
11. Value for money
Approaches:
Social return on investment
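Technique 3 (cost benefit analysis) is commonly summarised as a net present value: discount each year's benefits and costs back to today and compare. In the sketch below the discount rate, costs and benefit streams are illustrative assumptions.

```python
# Cost-benefit analysis as net present value (NPV): discount each year's
# benefits and costs and sum the differences. All figures illustrative.
costs = [100_000, 20_000, 20_000]  # year 0, 1, 2 outlays
benefits = [0, 70_000, 110_000]    # year 0, 1, 2 benefit estimates
rate = 0.05                        # assumed annual discount rate

npv = sum(
    (b - c) / (1 + rate) ** year
    for year, (b, c) in enumerate(zip(benefits, costs))
)
print(f"Net present value: {npv:,.0f}")  # positive NPV favours the intervention
```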
Do you need to synthesise data across evaluations? If so, how should this be done?
1. Best evidence synthesis
2. Lessons learnt
3. Meta-analysis (see the sketch after this list)
4. Meta-ethnography
5. Rapid evidence assessment
6. Realist synthesis
7. Systematic review
8. Textual narrative synthesis
9. Vote counting
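Option 3 (meta-analysis) pools effect sizes across studies. The sketch below uses fixed-effect inverse-variance weighting, one common variant; the per-study effect sizes and standard errors are invented.

```python
# Fixed-effect meta-analysis: pool study effect sizes weighted by the
# inverse of their variance. Effects and standard errors are illustrative.
studies = [
    # (effect size, standard error)
    (0.30, 0.10),
    (0.10, 0.15),
    (0.25, 0.08),
]

weights = [1 / se**2 for _, se in studies]
pooled = sum(w * e for (e, _), w in zip(studies, weights)) / sum(weights)
pooled_se = (1 / sum(weights)) ** 0.5
print(f"Pooled effect: {pooled:.3f} (SE {pooled_se:.3f})")
```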
Generalise findings
How can the findings from this evaluation be generalised to the future, to other sites and to other programmes?
1. Analytic generalisation
2. Statistical generalisation
Approaches:
Positive deviance
Horizontal evaluation
What types of reporting formats will be appropriate for the intended users?
Written:
1. Aide memoire
2. Executive summaries
3. Final reports
4. Interim reports
5. Memos and Email
6. News media communications
7. Newsletters, bulletins, briefs and brochures
8. Postcards
9. Website communications
Presentation events:
10. Conference
11. Feedback workshops
12. Teleconference
13. Verbal briefings
14. Videoconference
15. Web-conference
Presentation materials:
16. Flip charts
17. Displays and exhibits
18. Posters
19. PowerPoint
20. Video
Creative:
21. Cartoons
22. Photographic reporting
23. Poetry
24. Reporting in pictures
25. Theatre
Graphic Design:
26. Arrangement
27. Colour
28. Images
29. Type
Ensure accessibility
How can the report be easy to access and use for different users?
General accessibility:
1. Applied graphic design principles
2. Descriptive chart titles
3. Eliminate chartjunk
4. Emphasis techniques
5. Headings as summary statements
6. One-Three-Twenty-Five (1:3:25) principle
7. Plain language
Develop recommendations
1. Beneficiary exchange
2. Chat rooms
3. Electronic democracy
4. External review
Support use
In addition to engaging intended users in the evaluation process, how will you support the use of evaluation
findings?
1. Annual reviews
2. Conference co-presentations
3. Data use calendar
4. Policy briefings
5. Recommendations tracking
6. Social learning
Approaches
Appreciative Inquiry
A participatory approach that focuses on existing strengths
rather than deficiencies - evaluation users identify instances
of good practice and ways of increasing their frequency.
Beneficiary Assessment
An approach that assesses the value of an intervention as
perceived by the (intended) beneficiaries, thereby aiming to
give voice to their priorities and concerns.
Outcome Mapping
Unpacks an initiative's theory of change, provides a
framework to collect data on immediate, basic changes that
lead to longer, more transformative change, and allows for
the plausible assessment of the initiative's contribution to
results via boundary partners.
Case study
A research design that focuses on understanding a unit
(person, site or project) in its context, which can use a
combination of qualitative and quantitative data.
Collaborative Outcomes Reporting
An approach that builds on contribution analysis, adding
expert review and community review of the assembled
evidence and conclusions.
Contribution Analysis
An approach for assessing the evidence for claims that an
intervention has contributed to observed outcomes and
impacts.
Critical System Heuristics
An approach used to surface, elaborate, and critically
consider boundary judgments, that is, the ways in which
people/groups decide what is relevant to the system of
interest (any situation of concern).
Developmental Evaluation
An approach appropriate for evaluations of adaptive and
emergent interventions, such as social change initiatives or
projects operating in complex and uncertain environments.
Horizontal Evaluation
An approach that combines self-assessment by local
participants and external review by peers.
Innovation History
A way to jointly develop an agreed narrative of how an
innovation was developed, including key contributors and
processes, to inform future innovation efforts.
Institutional Histories
An approach for creating a narrative that records key points
about how institutional arrangements have evolved over
time and have created and contributed to more effective
ways to achieve project or programme goals.
Participatory Evaluation
A range of approaches that engage stakeholders (especially
intended beneficiaries) in planning, conducting, analysing the
evaluation and/or making decisions about the evaluation.
Participatory Impact Pathways Analysis
An approach in which stakeholders jointly develop the
impact pathways (programme theory) of their initiative and
use them for planning, monitoring and evaluation.
Participatory Learning for Action
Formerly known as Participatory Rural Appraisal. Enables
farmers to analyse their own situation and develop a
common perspective on natural resource management and
agriculture at village level.
Positive Deviance
Involves intended evaluation users in identifying outliers
(those with exceptionally good outcomes) and
understanding how they have achieved these.
Randomised Controlled Trials
An approach that produces an estimate of the mean net
impact of an intervention by comparing results between a
randomly assigned control group and experimental group or
groups.
Realist Evaluation
A form of theory-driven evaluation that seeks to understand
what works for whom, where and why, taking into account
how context makes a difference to programme results.
Social Return on Investment
Identifies a broad range of social outcomes, not only the
direct outcomes for the intended beneficiaries of an
intervention.
Utilisation-Focused Evaluation
Uses the intended uses of the evaluation by its primary
intended users to guide decisions about how an evaluation
should be conducted.