UN DA Evaluation Guidelines (Final)
October 2019
United Nations Development Account
Table of Contents
Abbreviations and Acronyms
1. Introduction
2. Characteristics of the DA and DA Projects
2.1 DA Characteristics
2.2 Characteristics of DA Projects
2.3 DA Terminology for Results-Based Management
2.4 DA Project Evaluations
3. Preparatory Phase of the Evaluation
3.1 Evaluation Management
3.2 Terms of Reference
4. Inception Phase of the Evaluation
4.1 Steps in the Inception Phase of the Evaluation
5. Data Gathering Phase of the Evaluation
6. Analysis and Reporting Phase of the Evaluation
6.1 Findings
6.2 Conclusions
6.3 Lessons Learned / Good Practices
6.4 Recommendations
6.5 The Executive Summary of the Evaluation
6.6 The Writing of the Evaluation Report
7. Follow-Up Phase of the Evaluation
7.1 Sharing of evaluation results
7.2 Management response
ANNEXES
Annex 1 - DA Requirements for the Terms of Reference of Project Evaluations
Annex 2 - Formats for use in DA Project Evaluation
Annex 3 - Details on Methods for Data Gathering and Analysis in DA Project Evaluations
Annex 4 - DA Requirements for Inception Reports of Project Evaluations
Annex 5 - Outline of the required elements for Evaluation Reports
Annex 6 - References
LIST OF TABLES
Table 1: Previous and new DA Terminology
Table 2: Description of Key Components of the TOR
Table 3: Description of Key Components of the Evaluation Report
LIST OF BOXES
Box 1: UNEG Definition of Evaluation Criteria
1. Introduction
1. The United Nations (UN) Development Account (DA) was established in 1997 by the UN General
Assembly as a capacity development programme of the UN Secretariat. The DA supports the
implementation of projects of five global UN Secretariat entities and the five UN Regional
Commissions, with the goal of enhancing capacities of developing countries in priority areas of the
2030 Agenda for Sustainable Development. The DA provides the ten implementing entities (IEs), which
are mostly non-resident in beneficiary countries, with the ability to operationalize their vast
knowledge and know-how and to deliver capacity development support on the ground to selected
stakeholders. In this way, the entities are able to follow up on their normative and analytical work as
well as inter-governmental processes, through concrete projects at multi-country, sub-regional,
regional and global levels.
The evaluation purpose, scope and objectives, and evaluation criteria and questions are dealt with in particular in the preparatory
phase, as all these issues need to be included in the TOR of the evaluation. Many of these issues,
however, remain relevant throughout the evaluation process. Aspects of the methodology, the
organization of the evaluation and the work plan are dealt with in the inception phase, as these issues
need to be further developed based on the preliminary setup provided in the TOR, and presented in
the inception report. The analysis and reporting phase focuses on findings, conclusions, lessons
learned and recommendations, as these are specific to this phase. The follow-up phase of the
evaluation concerns the sharing of the results of the evaluation, development of a management
response by IEs and partners targeted by the recommendations, use of the lessons and good practices
in the design of new projects and programmes and use of evaluation results in reporting of the DA to
the UN General Assembly.
8. Throughout these Guidelines, blue boxes are used to highlight DA-specific requirements, while green boxes detail more general good practices and suggestions for DA project evaluations.
2 Purposive sampling, also known as selective sampling, is random sampling with the application of certain criteria for selection of the sample. In the case of the DA projects, the IEs are the criterion: projects are organized by IE and, from the projects of each IE in a tranche, a random sample is taken of half of the projects concerned. With the selection of projects within each IE being random, there are no specific selection criteria for the DA projects to be evaluated.
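As an illustration only, the selection procedure described in this footnote can be sketched in a few lines of Python; the IE names, project identifiers and the rounding up of 'half' for an odd number of projects are assumptions made for the example, not DA rules.

import random

# Hypothetical tranche: project identifiers grouped by implementing entity (IE).
projects_by_ie = {
    "IE-A": ["P01", "P02", "P03", "P04"],
    "IE-B": ["P05", "P06", "P07", "P08", "P09", "P10"],
    "IE-C": ["P11", "P12", "P13"],
}

def sample_projects(projects_by_ie, seed=None):
    """Randomly select half of each IE's projects in the tranche for evaluation."""
    rng = random.Random(seed)
    selected = {}
    for ie, projects in projects_by_ie.items():
        half = (len(projects) + 1) // 2  # 'half', rounded up for odd counts (assumption)
        selected[ie] = sorted(rng.sample(projects, half))
    return selected

print(sample_projects(projects_by_ie, seed=2019))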
3 The term 'evaluator' is used in these Guidelines both for a single evaluator and for an evaluation team of two persons.
28. For all DA projects, project managers prepare a final project report.4 This report provides an overview of the project and its achievements, possible challenges faced, as well as good practices and lessons learned, primarily from the perspective of the project manager. The final project report forms an important input to the project evaluation. A draft of the report should be available to the evaluator during the evaluation process.
Table 2: Description of Key Components of the TOR

Evaluation Purpose: Details why the evaluation is conducted, why now, who the main anticipated users of the evaluation results are, and how the results of the evaluation are expected to be used by each of the users
Context and topic of the DA Project: Focuses on the topic that the project aims to address, and the development approaches used to deal with the issues concerned
Subject of the Evaluation: Is the DA project, of which a short description needs to be provided
Evaluation Scope: Determines the boundaries and specifies the reach of the evaluation as well as issues that will be left out of the focus of the evaluation
Evaluation Objectives: Refer to what the evaluation needs to accomplish in order to reach its purpose
Evaluation Criteria: Refer to the guiding principles that will be used to gather and analyse data
Evaluation Questions: Provide further details on each of the evaluation criteria and specify key questions that need to be answered through the evaluation, in this way guiding data collection and analysis
a) Evaluation Purpose
31. The purpose of the evaluation details why the evaluation is conducted, why now, who the main anticipated users of the evaluation results are, and how the results of the evaluation are expected to be used by each of the users. Detailing the purpose of the evaluation is of critical importance, as it drives the evaluation, informing the evaluation scope, objectives and the evaluation criteria used and questions posed (see also the phases below for further details on the evaluation scope, objectives, criteria and questions).

4 See the following link for information on the preparation of the final report: https://www.un.org/development/desa/da/static-guidance-public/
5 The role of the project manager is limited to the provision of relevant information for the development of the TOR, with the evaluation meant to be independent from the management of the project.
32. Getting the purpose of the evaluation right is of particular importance for a DA project evaluation,
since most of the projects are one-off initiatives, with usually no follow-up phase through the DA. The
purpose of the evaluation is thus NOT to inform a second or subsequent phase of the same project,
as is often the case in project evaluations of other development partners.
33. It is, therefore, important to identify from the start what the use of the results of the project evaluation is expected to be. In particular, use of the results of the project evaluation by the IEs concerned is important, as well as use by project partners. Reference needs to be made to any decision-making processes, at the level of the IEs or otherwise, that the evaluation results could feed into. It will be useful to specify the kind of recommendations that are expected to result from the evaluation process. Moreover, the UN General Assembly needs to be included as an indirect user of the DA project evaluations, with evaluation results used in DA reporting to the UN General Assembly6, in particular in terms of achievements and possibly lessons learned.

DA Requirements
Detail the purpose of the evaluation in relation to the DA project, specifying expected users and their use of the evaluation results
6 The DA Progress Reports to the General Assembly can be found on the DA website at the following link: https://www.un.org/development/desa/da/static-official-public/
7 Contextual details are typically included in the project document for DA projects and would, thus, be available in the project document.
41. Generally, DA evaluations have focused on two objectives: accountability and learning. In previous years, much focus was placed on accountability; with the introduction of the DA Evaluation Framework, more attention is being placed on learning. Rather than just making mention of learning and accountability, details on each of these need to be made explicit, as well as the balance between the two. Learning aspects can be further elaborated upon through details on the need for the evaluation to provide lessons and good practices on the topic concerned, as well as on the need for recommendations to inform programming on the topic, with the recommendations usually not aimed at informing a second phase of the project.
42. With learning being a main objective of the evaluation, there is a need to conduct the evaluation in a participatory way, in order to ensure that conclusions and learnings are shared with, agreed to and owned by the stakeholders concerned, which enhances the prospects of the use of the results of the evaluation and the implementation of the recommendations.

DA Requirements
Provide details on the boundaries of the evaluation, specifying what is included and what will be excluded from the evaluation.
Provide the learning objectives of the evaluation in addition to aspects of accountability and specify the requirements in relation to the topic of the DA project, including the rationale.
e) Evaluation Criteria
43. The evaluation criteria refer to the key principles that the evaluation will use in order to gather and
analyse data. The selection of evaluation criteria needs to relate to the evaluation objectives and the
underlying purpose of the evaluation. A set of evaluation criteria was developed by the Development
Assistance Committee (DAC) of the Organization for Economic Cooperation and Development (OECD)
in the 1990s, which have been adopted by the UN Evaluation Group (UNEG). These criteria include:
relevance, efficiency, effectiveness, impact and sustainability (see box 1 below for UNEG’s definitions).
All these criteria, with the exception of impact, need to apply to each DA project evaluation.
44. The evaluation criteria developed by the OECD DAC were geared in particular towards country level
development interventions, often in single sectors and of a technical cooperation nature. As such their
definitions do not necessarily fit with the specific characteristics of DA projects, which are projects at
multi-country, regional, sub-regional or global levels and that focus on capacity development.
Therefore, the evaluation criteria need to be tailored to the specific requirements of the DA.
UNEG evaluation criteria tailored to the DA
45. While the criteria of relevance, effectiveness and sustainability do apply to DA projects, the
evaluation questions underneath each of these criteria will need to be adapted to the characteristics
of the DA (see details in box on page 14). The criterion of efficiency is useful in its focus on economic
efficiency and timeliness of the process through which activities are transformed into output level
results, including the use of human and financial resources and aspects of project management. The
criterion of efficiency, however, misses out on other important process issues of DA projects, including
partnerships, human rights and gender equality issues, which will be discussed below.
46. The criterion of impact proves usually less applicable to DA projects as results, in terms of effects on
the lives of people, in particular vulnerable and marginalized groups, would usually only be assessable
sometime after phasing out of the project, with a variety of other intervening factors playing a role.
Given the limited budget and time frame of DA projects, they cannot necessarily be expected to be
able to show impact level changes, i.e. demonstrable improvements in the lives of women and men,
girls and boys within the time frame of the project. Impact assessment usually requires a huge
investment in human and financial resources in order to be able to establish attribution through
project interventions, something beyond the ability of DA projects.
Additional DA required evaluation criteria
47. The UNEG evaluation criteria do not cover all relevant aspects of the DA. Use of the additional DA required evaluation criteria (SDGs, partnerships, human rights and gender equality, and innovation) is, therefore, an essential part of DA project evaluations. These can either be included as separate
criteria for the evaluation, or evaluation questions related to these could, alternatively, be included
under one or more of the UNEG evaluation criteria.
48. Given the importance of the 2030 Agenda and the SDGs, the evaluation of DA projects needs to pay
attention to these. This includes attention to the SDGs and related targets and indicators of those
SDGs relevant to the project concerned as well as attention to the principle of ‘Leaving no one behind’.
This can be achieved under selected evaluation criteria, e.g. through analysis of alignment of the
project with the 2030 Agenda as part of the assessment of the criterion of relevance, or through
analysis of the contribution of the project to SDG targets and indicators as part of the assessment of
the evaluation criterion of effectiveness. Alternatively, SDGs can be used as a separate evaluation
criterion, with specific evaluation questions formulated and analysed as part of the evaluation.
49. Partnerships are an essential element of DA projects, as they can enhance efficiencies and effectiveness of project delivery, by ensuring full utilization of comparative advantages and avoiding duplication of efforts. Partnerships are, therefore, an important and required criterion for the evaluation of DA projects. In the DA context, partnerships typically refer to joint/collaborative implementation of projects amongst DA IEs, other UN agencies as well as sub-regional, regional and global level stakeholders. Direct beneficiaries of DA projects are, however, not referred to as implementing partners.
50. Issues of partnership can be dealt with as part of the criterion of efficiency, looking at aspects of
synergy, or as part of the evaluation criterion of effectiveness, with a focus on the results concerned.
Partnerships can also be used as a separate criterion.
51. Human rights and gender equality are important cross-cutting principles to be incorporated in all UN programming. They are also integrated in the 2030 Agenda and have been incorporated in the evaluation quality assessment framework of the UN Office of Internal Oversight Services (OIOS). It is, therefore, essential to include these issues in the design of a DA project evaluation. This requires explicit attention to the principles of equality, inclusion and non-discrimination as part of the evaluation. Where the project design has clearly included human rights and gender equality issues, a gender analysis has been conducted, and monitoring and reporting have captured related details with disaggregated data, the design of the evaluation can be guided by these aspects.
However, if such details are less available or altogether missing, the reasons need to be understood
and some of these aspects can be included in the evaluation design. The conduciveness of the context
to include human rights and gender related issues in the DA project evaluation will need to be taken
into account as it will affect the feasibility and the level of their inclusion in the evaluation process.8
Human rights and gender aspects can be included in each or a selection of the evaluation criteria, they
can be used as a separate criterion or both options can be combined in the evaluation.
52. With the DA focus on innovation9, it is important for DA project evaluation to pay attention to the extent to which, and the ways in which, innovation has been a feature of the project. As with other thematic issues, this can be done as part of the UNEG evaluation criteria or innovation can be used as a separate evaluation criterion, with specific evaluation questions formulated to assess issues concerned.

DA Requirements
Present the UNEG evaluation criteria (relevance, effectiveness, sustainability and efficiency), tailored to the characteristics of the DA and the project concerned.
Use the additional DA required evaluation criteria (SDGs, partnerships, human rights and gender equality, and innovation) as additional criteria and as an essential part of DA project evaluations, or include questions related to these under one or more of the UNEG evaluation criteria.
f) Evaluation Questions
53. Evaluation questions provide further details on each of the evaluation criteria and specify key
questions that need to be answered through the evaluation, in this way guiding data collection and
analysis.
54. A large number of questions tends to divert the evaluation to address a myriad of issues that do not necessarily provide the opportunity for the evaluation to come up with a set of findings that can be the basis for well-informed conclusions on the topic of the project. With the limited resources that DA project evaluations have, it is important to provide useful and evidence-based answers to a limited number of questions, rather than superficially addressing a wide range of questions. Project evaluations, therefore, should ideally be limited to a maximum of six to seven main evaluation questions, which together need to be able to address the purpose of the evaluation.

DA Requirements
Evaluation questions need to be formulated under each of the evaluation criteria selected; doing so is the responsibility of the evaluation manager, supported by the IE evaluation section.
In situations where an IEC and/or ERG have been established, the questions need to be developed in participation with their members.
In order to focus the evaluation, questions should ideally be limited to a maximum of seven.
8 In order to assess the extent to which a project or programme contributes to gender equality, the UN System Wide Action Plan on Gender Equality and the Empowerment of Women includes the Gender Results Effectiveness Scale, which identifies five different levels of results, which can be of use in DA project evaluations: gender negative, gender blind, gender targeted, gender responsive and gender transformative. United Nations Evaluation Group, 2018. For a more comprehensive overview of possible approaches to address challenges in the evaluation of human rights and gender related cross-cutting issues, see United Nations Evaluation Group, 2011 and 2014, with the latter providing a more detailed approach to inclusion of human rights and gender equality in evaluation.
9 For the purposes of the DA, innovation is thought of as IEs either addressing new topics or using new means of delivering projects (or a combination thereof) that differ significantly from the topics and means of delivering projects (or part of them) that the IE has previously addressed or used for delivery.
Relevance
• To what extent was the project objective aligned with international conventions and inter-
governmental processes?
• To what extent does the project design respond to the needs of Member States?
• What adaptations were made to the design of the project during implementation and were these
justified in the context concerned?
• What lessons and good practices from previous DA projects were used to inform project design?
Efficiency
• What human, financial and in-kind resources were leveraged through contributions of partners?
• To what extent did the project achieve efficiency in implementation through the combination of
project stakeholders involved, making use of comparative advantages and the creation of synergy?
Effectiveness
• To what extent did the selection of participants for training programmes, workshops and study tours contribute to results achieved?
• To what extent and in what ways have training, workshops and study tours contributed to learning of
participants?
• To what extent have participants been able to make use of learnings through training, workshops and
study tours and changed the way in which they conduct their work, in order to enhance results?
• What aspects of policy related change has the project contributed towards?
Sustainability
• Did project design include an approach to scaling up of results and how has this been implemented?
• To what extent and in which ways have national level UN and other national level development
organizations been involved in project implementation and what role can they be expected to play in
sustaining the results achieved through the project at country level?
Partnerships
• To what extent has partnering with other organizations enabled or enhanced reaching of results?
• Has partnering with other organizations resulted in reduction of overlap and increased efficiency?
The 2030 Agenda/ SDGs
• In what ways and to what extent has the project contributed to supporting the principle of leaving no
one behind in the sustainable development process?
• To what extent has the project contributed to reaching targets of selected SDGs?
Human Rights and Gender Equality
• To what extent was a rights-based and gender sensitive approach applied in the design and
implementation of the project, informed by relevant and tailored human rights and gender analysis?
• In what ways were results for disadvantaged and left behind groups included and prioritized in the
design and implementation of the project and were resources provided to enable this?
• To what extent has the project contributed to human rights and gender equality related objectives
and to SDG 5 on Gender Equality and gender objectives in other SDGs and were targets concerned
included in the project results framework?
Innovation
• What innovative aspects of the project (addressing new topics or using new means of delivery or a
combination thereof) proved successful?
• How can innovative aspects of the project that proved successful be scaled up and replicated with
funding from outside the DA?
55. An initial set of evaluation questions is developed as part of the development of the TOR for the DA
project evaluation, a process conducted under the responsibility of the evaluation manager. In order
for the evaluation questions and the results that they produce to be owned by the various project
stakeholders, it is important for the evaluation manager to engage key project stakeholders in the
development of the evaluation questions. In case an IEC and ERG are established, it will be important
to involve their members. Rather than each stakeholder adding the questions that they are most
interested in, a process that usually results in a large number of evaluation questions that are not well
prioritized, it will be useful to inform the process through virtual meeting(s) with key stakeholders to
develop a focused set of questions.
61. If methodological details are not explicitly specified in the TOR, the evaluator can be required to develop (parts of) the methodology during the inception phase of the evaluation. If the latter is the case, the TOR will need to provide the parameters of the methodology, ensuring that the methodological approach and rigor concerned will be able to result in a credible evaluation report in line with IE and DA needs and UNEG norms and standards.12 Further details on the methodology of the evaluation will be discussed as part of the section on the Inception Phase of the evaluation process, with additional details in annex 3.

DA Requirements
Provide details on methodology in line with the purpose and objectives and the specific characteristics of the DA project to be evaluated.
Make use of a theory-based approach, guided by the project results framework.
Make use of a mixed methods approach, including qualitative as well as quantitative data gathering and analysis.
Use a participatory approach to enhance opportunities for generating learning that is shared across stakeholders of the project.

12 United Nations Evaluation Group, 2016, 2017.
Evaluation Report, with an outline of the contents of each in an annex. A realistic timeframe in the
TOR can prevent the need for adaptations during the DA project evaluation process.
68. Any security considerations need to be detailed, specific to the region and countries in which the DA
project operates.
69. The details on the total evaluation budget need to be provided, as well as a breakdown of the budget along DA financial categories. Budgetary details are meant for internal use only. Payment arrangements for consultants need to be included.

DA Requirements
Specify the roles of the evaluation manager, the evaluator and, if applicable, the composition and roles of the Internal Evaluation Committee and the Evaluation Reference Group.
Provide details on the role of the DA project manager in the evaluation process, in line with the requirements for independence of the evaluation.
Include specification of the workplan of the evaluation, including details on deliverables and their timing.
DA Requirements
• Start the desk review and inform the inception phase with the initial results of the desk review
• Acknowledge the purpose of the evaluation and the expected use and users of the evaluation
results as guiding principles to the evaluation process
• Analyse the context of the DA project
• Review and analyse the details on the DA project to be evaluated, including an assessment of the
results framework of the project
• Further specify the stakeholder mapping that is included in the TOR, or develop one when this
was not included, and conduct stakeholder analysis
• Assess the availability and reliability of relevant secondary data
• Review the scope and evaluation objectives and assess their relation to the purpose of the
evaluation
• Review the evaluation criteria and questions and their relation to the evaluation objectives
• Review the methodology for the evaluation as presented in the TOR and further develop and/or
specify it
• Prepare the evaluation matrix
• Prepare a detailed work plan for the implementation of the evaluation
• Develop and discuss the draft inception report with key stakeholders and prepare the final
inception report
a) Desk review
74. The desk review entails a review of relevant secondary resources to inform the inception phase as
well as the remainder of the evaluation process. An initial overview of relevant materials is usually
provided in the TOR, while the evaluator can further expand on this in the inception phase.
c) Contextual Analysis
76. The context describes information on the backdrop of the DA project, including the political, programmatic and governance environment in which the project is situated and in which the evaluation takes place. Specific to the DA projects, the context includes a focus on the topic that the project aims to address. In the inception report, the evaluator needs to demonstrate a clear understanding of the context of the DA project and its importance in terms of the evaluation. This should build on, but also go beyond, the description provided in the TOR.

DA Requirements
The inception report should inform the operationalization of the evaluation with details concerning the context of the evaluation and present relevant context details.
77. It is important to make use of the details of the contextual analysis in other parts of the inception
report and to inform the operationalization of the evaluation with specifics of the context in which
the DA project has been implemented. This can, for example, be in terms of country or site selection
or selection or sampling of participants from trainings conducted, as part of the methodology of the
evaluation.
d) Review the Subject of the Evaluation and the DA Project Results Framework
78. Building on the details provided in the TOR and informed by the desk review, the evaluator in the
inception phase needs to demonstrate a more thorough understanding of the DA project, the
objective it aimed to contribute towards achieving, as well as the ways in which it tried to accomplish
this. Such an understanding needs to be reflected in the description of the subject of the evaluation,
as part of the inception report, with references provided to secondary information consulted.
79. The review of the DA project as the subject of the evaluation needs to include a preliminary analysis. This includes in particular the results framework of the project, which can be analysed in terms of its internal consistency as well as whether expected accomplishments/outcomes can be expected to be achieved in the timeframe provided for the project and given the resources concerned. Moreover, the extent to which the outputs, outcomes and the objective can be considered to form a causal chain of results is important to assess.

DA Requirements
The subject of the evaluation needs to be described in sufficient detail in the inception report, including an initial analysis of the results framework of the DA project, to inform the setup of the evaluation.
of engagement with stakeholders concerned. In particular stakeholders with high levels of both
‘interest’ and ‘influence’ are those that the evaluation needs to fully engage with.
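A minimal sketch of how such an interest/influence analysis could be operationalized follows; the stakeholder names, the 1-to-5 scoring scale, the threshold and the engagement labels are illustrative assumptions rather than DA prescriptions.

# Hypothetical scores from the stakeholder mapping, on a 1-to-5 scale.
stakeholders = [
    {"name": "National statistics office", "interest": 5, "influence": 5},
    {"name": "Regional partner agency", "interest": 2, "influence": 4},
    {"name": "Workshop participants", "interest": 4, "influence": 2},
]

HIGH = 4  # assumed threshold for a 'high' score

def engagement(stakeholder):
    """Suggest an engagement level based on interest and influence scores."""
    if stakeholder["interest"] >= HIGH and stakeholder["influence"] >= HIGH:
        return "engage fully throughout the evaluation"
    if stakeholder["influence"] >= HIGH:
        return "consult and keep closely informed"
    if stakeholder["interest"] >= HIGH:
        return "include in data gathering and debriefs"
    return "keep informed of evaluation results"

for s in stakeholders:
    print(f"{s['name']}: {engagement(s)}")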
effectiveness, and sustainability as well as partnership, the 2030 Agenda/SDGs, human rights and
gender equality and innovation (see the box on page 14).
85. One way to review the evaluation criteria and questions is to identify the nature of the DA project support. For example, for a project that focuses on the adoption of norms and standards and related instruments, the evaluation is usually focused on process and governance issues in the adoption of norms and standards as well as the relevance of the norms, standards and instruments in the context of the project. On the other hand, for a project that supports the government in incorporating a particular norm, standard or instrument in national legislation, policies or development planning, the evaluation would usually need to focus on the capacity changes of the agencies concerned and the level of reflection of the norms and standards in legislation, policies and development programmes. For a project that supports the application of laws, policies and plans that have incorporated norms and standards, the evaluation would need to focus on the actual implementation, including capacities and processes concerned and, if feasible, the results for people, in particular for vulnerable groups.

DA Requirements
Evaluation criteria and questions need to be reviewed and finalized in the inception stage.
Criteria and questions need to be aligned with the evaluation objectives.
Evaluation questions should ideally be limited to a maximum of seven.
86. The evaluation questions, once agreed among stakeholders in the inception phase, become a means
to guide the data gathering and analysis process and the preparation of the draft and final evaluation
report. An important tool that assists the evaluator in this respect is the evaluation matrix, which is a
way to operationalize each of the evaluation questions, so that data can be gathered and analysed to
come to meaningful statements on issues concerned. (See details below and annex 2 for a Format for
the Evaluation Matrix.)
i) Evaluation methodology
87. As part of the inception phase, the evaluation methodology needs to be reviewed by the evaluator
and if needed adapted and further developed in consultation with the evaluation manager, and if
applicable with the IEC and ERG. Discussions need to cover the appropriateness and feasibility of the
methodology, its ability to meet the evaluation purpose and objectives and to answer the evaluation
questions, taking into consideration limitations of budget, time and data sources. The methodology
needs to be finalized as part of the inception phase and agreed upon, with a comprehensive overview
of the methodological approach of the evaluation included in the final inception report.
88. As outlined in the TOR, DA project evaluations make use of a theory-based approach, making use of
the project results framework to assess the output and outcome level changes achieved and the
contribution made by the project. A participatory approach is used in order to engage stakeholders in
the evaluation process, to enhance ownership of evaluation results and to allow for triangulation
across a variety of stakeholders and participants. For data gathering, there is a need to make use of
mixed methods, making use of an appropriate combination of qualitative and quantitative
approaches, while bearing in mind that the use of some of the methods is more extensive and requires
specific capacities of the evaluator concerned. An overview of methods to be used for data gathering
and analysis is presented in annex 3.
89. The inception report needs to provide details on primary data gathering. The approach concerned will
depend on the set-up of the DA project and its implementation at country, regional or global level. In
particular when a limited number of countries are included, with substantial support at country level,
country visits may be useful. Otherwise, visits to regional and global offices of IEs would be a useful
approach for primary data gathering, supplemented by virtual interviews with selected stakeholders.
Participation of the evaluator at the final project workshop or training activity towards the end of the
project provides a useful opportunity for the evaluator to get first-hand experience of a project activity
and to gather data from participants. As part of the methodological details, selection of countries for
primary data gathering, sampling for quantitative and qualitative data gathering, and ways to address
limitations to the methodology need to be detailed, informed by the information presented in the
TOR.
90. Changes to the methodology of the TOR need to be identified and justified and agreed with the
evaluation manager and if applicable with the IEC and ERG during discussions on the draft inception
report. Tools to be used for data gathering, like lists of items for semi-structured interviews and
questionnaires for (mini-)surveys, need to be included in an annex to the inception report.
91. As part of the methodology, there is a need to make explicit how methods included allow for
assessment of human rights and gender equality related issues. This can, among other ways, be
achieved through interviewing stakeholders separately, when there are differences in power, interest
or influence, including separately interviewing supervisors and staff, women and men, girls and boys,
as well as separately interviewing government stakeholders and ultimate project beneficiaries (as
relevant). In order to ensure sufficient attention to both policy and technical issues, it might be useful
to conduct separate interviews with the policy and technical staff concerned of the same agency.
92. In broader terms, the inclusion of human rights and gender equality can be achieved through the inclusion of aspects of human rights and gender equality in evaluation questions, gathering and analysing sex-disaggregated data13 from in particular vulnerable and marginalized groups, use of appropriate methods in ways that respect the rights of participants in the evaluation, and the use of a gender-balanced evaluation team, with the inclusion of capacities on human rights and gender in terms of the requirements of the evaluator.

DA Requirements
Make use of a theory-based approach, guided by the project results framework, including assessment of results and the process through which these were achieved.
Use a participatory approach to enhance opportunities for generating learning that is shared across stakeholders of the project.
Make use of a mixed methods approach, including qualitative as well as quantitative data gathering and analysis.
Combine output level assessment with contribution analysis in the review of outcome level changes of a DA project.
Changes in methodology from the TOR need to be justified in the Inception Report.
Make explicit how human rights and gender equality related data will be gathered and analysed.
• Evaluation objective and scope include human rights and gender equality related issues
• Evaluation criteria and questions specify how to assess human rights and gender equality related issues
• Inclusion of a human rights and gender responsive methodology, data gathering tools and analysis
• The evaluator is qualified with respect to human rights and gender issues
• Evaluation findings, conclusions and recommendations reflect a gender analysis
13 Disaggregated data by sex are important but usually not sufficient to show aspects of gender equality, which also needs to take into account the social positions of the participants concerned and aspects of their representation.
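As a minimal sketch of what gathering and analysing sex-disaggregated data can look like in practice, the following uses hypothetical training records; as footnote 13 notes, counts by sex alone are not sufficient to show gender equality results.

from collections import Counter

# Hypothetical participant records, as might be compiled from monitoring
# reports or a post-training mini-survey.
participants = [
    {"sex": "F", "applied_learning": True},
    {"sex": "F", "applied_learning": False},
    {"sex": "M", "applied_learning": True},
    {"sex": "M", "applied_learning": True},
    {"sex": "F", "applied_learning": True},
]

def disaggregate(records, key="sex"):
    """Tally totals and positive outcomes for each group under `key`."""
    totals, positive = Counter(), Counter()
    for record in records:
        totals[record[key]] += 1
        positive[record[key]] += int(record["applied_learning"])
    return {group: (positive[group], totals[group]) for group in totals}

for group, (yes, n) in disaggregate(participants).items():
    print(f"{group}: {yes}/{n} participants report applying what they learned")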
j) Evaluation Matrix
93. Preparation of the evaluation matrix is an important part of the inception phase. The process of
preparing the evaluation matrix needs to be guided by the evaluation criteria and questions. The
evaluation matrix details for each of the individual evaluation questions a number of assumptions,
which need to be assessed in order to get answers to the questions. For each of these assumptions,
indicators should be identified that can provide information on the assumption and for which sources
of information and methods and tools for data collection and analysis need to be identified. Each of
the evaluation questions can thus be broken down into smaller parts on which data can be gathered. In
this way the matrix can guide data gathering. A format for the evaluation matrix is provided in annex
2.
94. The evaluation matrix needs to be developed by the evaluator during the inception phase, informed by the desk review. In addition to guiding data gathering, the evaluation matrix is an important means to guide data analysis and reporting. The matrix can be used during the data gathering process to record information collected against each of the assumptions and indicators identified in the matrix. When this is done in a consistent way, the evaluation matrix becomes an important tool for data analysis and in the writing of the evaluation report, as it brings together data on each of the evaluation criteria and questions. The same approach can, moreover, be used as a checkpoint to ensure that data and information gathered cover all aspects of the evaluation, across all of the evaluation questions, assumptions and indicators in the evaluation matrix.

DA Requirements
The development of an evaluation matrix provides a means to operationalize the evaluation questions and is required in the inception phase. The matrix is an important means to guide the process of data gathering, data analysis and preparation of the project evaluation report.
95. The indicators in the evaluation matrix should ideally include the indicators of the project results
framework, as well as a number of other relevant indicators.
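To make this structure concrete, the following is a minimal sketch of an evaluation matrix as a data structure, with one hypothetical question; the field names mirror the elements described above, and the 'data' field corresponds to the column added during data gathering.

# Illustrative evaluation matrix: one question broken down into an assumption
# with indicators, sources and methods; 'data' is filled in during data
# gathering and supports the coverage checkpoint described in paragraph 94.
evaluation_matrix = [
    {
        "criterion": "Effectiveness",
        "question": "To what extent have workshops contributed to participants' learning?",
        "assumptions": [
            {
                "assumption": "Participants acquired new skills through the workshops",
                "indicators": ["share of participants reporting new skills"],
                "sources": ["mini-survey", "workshop reports"],
                "methods": ["survey analysis", "desk review"],
                "data": [],
            },
        ],
    },
]

def unanswered(matrix):
    """List assumptions for which no data have been recorded yet."""
    return [
        (question["criterion"], assumption["assumption"])
        for question in matrix
        for assumption in question["assumptions"]
        if not assumption["data"]
    ]

for criterion, assumption in unanswered(evaluation_matrix):
    print(f"No data yet under {criterion}: {assumption}")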
k) Evaluation Workplan
96. In the inception phase, a clear and detailed evaluation work plan needs to be developed, including details of activities to be performed and dates for the various stages of the evaluation process, i.e. the inception phase, the data gathering phase, the analysis and reporting phase and the follow-up phase of the evaluation. With the latter phase in particular aimed at supporting the use of the evaluation results, attention needs to be paid to evaluation use from the start of the evaluation process. The workplan needs to include dates for the draft and final versions of the inception and evaluation reports and any other deliverables that are part of the assignment of the external evaluator. See annex 2 for a format.

DA Requirements
The inception report needs to include a detailed workplan, covering all phases of the evaluation process and specifying dates for the draft and final deliverables of the evaluation.
manager, the evaluator prepares the final inception report. It will be useful to provide an overview of
comments and the responses of the evaluator to each of these as part of the finalization of the
inception report.
98. The presentation of the inception report at the start of the data gathering phase of the evaluation provides an opportunity to get all key stakeholders together at the beginning of the evaluation and to engage them in the process. This also provides an opportunity to discuss the details of the inception report and to address any comments by the evaluator. The final inception report needs to be completed before the start of primary data gathering.

DA Requirements
The draft inception report needs to be discussed with the evaluation manager and key stakeholders and, if applicable, with the members of the IEC and ERG, whose comments, compiled by the evaluation manager, need to be responded to in the preparation of the final inception report.
99. Each of the steps taken in the inception phase provides inputs to the development of the Inception
Report, making use of and building on the details provided in the TOR. The inception report is a means
for the evaluator to show a thorough understanding of the purpose of the evaluation, the DA project
as the subject of the evaluation, the context and to provide methodological and organizational details
on how the purpose of the evaluation will be fulfilled. A detailed overview of DA requirements for the
project inception report is presented in annex 4.
6.1 Findings
106. Findings are factual statements that respond to the evaluation questions or parts thereof and are
informed by evidence and data provided in the evaluation report. The findings respond to the scope
and objective of the evaluation. They reflect appropriate analysis of data resulting in statements that
start answering parts of the evaluation questions. Findings need to include a focus on results, ways in
which these have been achieved and the contributions of the project. Reasons for accomplishments
as well as for lack of progress need to be provided as much as possible.
107. Gaps and limitations in data need to be identified and discussed as well as how these were addressed
and in what way they affected the findings. In addition to expected changes, as included in the results
framework of the project, findings need to pay attention to unexpected changes, which can be positive
as well as negative in relation to the project objective. Findings should cover all the evaluation criteria
and questions included in the evaluation. Specific attention needs to be paid to human rights and
gender equality related issues, in line with their importance in terms of evaluation criteria and
questions.
108. Findings need to make note of adaptations made to the design of the project and its results framework and indicators, something which in particular might occur during the first year of DA project implementation. This is especially important for projects where the IE has been requested by the DA PMT to provide country specific action plans and indicators as part of the first annual project progress report(s). The evaluation needs to consider these adaptations and base the assessment of project results on the revised results framework, including changes to the indicators.

DA Requirements
Findings need to cover all evaluation criteria and respond to all of the evaluation questions. They have to be informed by evidence provided in the evaluation report.
DA project evaluation findings need to include any adaptations made to the DA project design after approval of the project document and during project implementation, especially when country specific action plans and indicators have been requested to be provided as part of the first annual project progress report(s).
Attention needs to be paid to unexpected change(s), in addition to planned results.

14 The OIOS Evaluation Quality Assessment Framework includes details on the requirements of the evaluation report, including details on background, methodology, findings, conclusions and lessons learned, recommendations, gender and human rights, and report structure. A comprehensive overview of UNEG requirements for evaluation reports is provided in: United Nations Evaluation Group, 2010.
6.2 Conclusions
109. Conclusions point out the factors of success and failure of the evaluated intervention, with special attention paid to the intended and unintended results, and more generally to any other strength or weakness; they draw on data collection and analysis undertaken, through a transparent chain of arguments.15 Conclusions need to be grounded in the analysis of the findings and articulate statements at the level of the evaluation questions and the evaluation criteria. They can also refer to issues across evaluation criteria or cross-cutting issues, like human rights and gender equality related issues. Conclusions include strengths and weaknesses of the project that is evaluated, taking into account the viewpoints of a variety of stakeholders. They need to be grounded in the evidence provided in the report. Conclusions provide insights into the issues that the project aims to address, identify solutions to important problems faced and pinpoint ways in which what appears to work well in organizations can be reinforced. Conclusions need to provide linkages to the recommendations.

DA Requirements
Conclusions need to build on analysis of findings and provide insights into the topic that the DA project addresses.
They need to add value rather than being a summary of the findings and provide a connection to recommendations made.
15 OECD Development Assistance Committee, 2002.
6.4 Recommendations
114. Recommendations are defined as proposals aimed at enhancing the effectiveness, quality, or
efficiency of a development intervention; at redesigning the objective; and/or at the reallocation of
resources.19 Recommendations should be linked to the evaluation findings and conclusions and need
to be relevant to the purpose and the objective of the evaluation. They need to address main issues
identified in the evaluation and be informed by the evidence provided in the evaluation report. They
need to be developed with the participation of key stakeholders, with the process of development of
the recommendations made explicit in the report. Each recommendation needs to identify the target
group concerned and include priorities for action. Each of those needs to show an awareness of the
potential constraints in terms of follow-up for the parties concerned. They should address aspects of
sustainability of the results supported and achieved by the project. Possible foci for recommendations,
taking into consideration the particular characteristics of the DA, are presented in the box below.
115. Recommendations need to be pitched at the right level. With DA projects usually not having a second
phase, they should be focused on other ways to make use of the achievements of the DA project and
learnings on the topic concerned by the IE and other relevant stakeholders. Recommendations should
not be pitched at too high a level, like at management aspects of the IE concerned or of the DA PMT,
as the evaluation of a single project would usually provide an insufficient evidence base for such
suggestions. They should, moreover, not be over ambitious, but realistic in their approach and time
frame.
116. Rather than having a myriad of small and unrelated recommendations, their number should be limited to a maximum of five to seven key recommendations, which need to be further elaborated on in terms of opportunities for their implementation.

DA Requirements
Recommendations should focus on use of the results and learnings from the project by the IE and other relevant stakeholders to further develop the topic that the project addressed, especially as DA projects usually don't have a second phase.
Recommendations need to be limited in number to a maximum of 5 to 7, be sufficiently specific and feasible, providing details on implementation, including potential constraints.

19 OECD Development Assistance Committee, 2002.
• Ways in which the IE(s) can make use of the results and learnings of the project in the wider programme of which
the project was a part and in supporting the achievement of the SDGs to which the project contributed
• How the results framework and the wider theory of change developed or used in DA project implementation can
be fine-tuned based on learnings of the DA project
• Ways in which results and learnings of the DA project can be used to inform other countries / (sub-)regions through
south-south learning or otherwise
• Ways in which some of the capacities built through the project at individual and organizational level can be further
institutionalized, like the use of training curricula and toolkits developed
• Ways in which capacities developed of IE(s) and partners can be used to further address the issues concerned
• Identification of funding opportunities for continued support to the topic that the DA project addressed
• Ways in which coordination mechanisms supported during the DA project can be continued and enhanced after
project termination
• Ways in which aspects of the topic addressed through the DA project can be monitored after project termination in
order to generate evidence to inform policy and other decision-making processes
• Ways in which partnerships and networks supported in the implementation of the DA project can continue to
function or be further developed
• Ways in which gender related results can further contribute to gender equality
• Options for follow-up on the introduction of innovative practices by project implementing partners, as well as other
government partners, UN resident agencies, civil society organizations, private sector actors, universities and other
relevant stakeholders
Table 3: Description of Key Components of the Evaluation Report

Findings: Factual statements based on evidence provided in the evaluation report that answer part of the evaluation questions
Conclusions: Synthesis of evaluation findings that deliver concluding statements that provide answers to evaluation questions and criteria and/or to cross-cutting issues and other issues across evaluation criteria
Lessons learned / Good practices: Learning, in particular on the way in which results were achieved, which are considered applicable beyond the context of the specific DA project in which they were developed
119. The use of the evaluation matrix can be a great help in the preparation of the evaluation report, in
particular when details of the data gathered are included in an added column to the matrix, so that all
data get organized in line with the evaluation criteria and questions. In order to enhance the
accessibility of the DA project evaluation reports, the main text of the report (excluding annexes)
should be limited to 40 pages.
120. The draft evaluation report needs to be shared with relevant stakeholders, and if applicable with the members of the IEC and the ERG, in order to obtain their inputs. Comments of the various parties on the evaluation report should focus on factual imprecisions and inconsistencies in the report. Comments provided need to respect the independence of the evaluation and the evaluator in terms of conclusions to be made on the findings identified in the evaluation report. Inclusion of stakeholders in all stages of the evaluation process, including the review of the draft evaluation report, will enhance the ownership of the evaluation results and heighten the likelihood of use of the recommendations.

DA Requirements
The DA project evaluation report needs to provide the details on the results of the evaluation, including findings, conclusions, lessons learned / good practices and recommendations, and be a maximum of 40 pages, excluding annexes.
In addition to the results of the evaluation, the final report of the DA project evaluation needs to include details on the evaluation subject, the context, the purpose of the evaluation and its scope, evaluation objectives, criteria and questions, and details on the methodology used in data gathering and analysis.
9 Annexes
Required:
• DA project results framework
• Stakeholder map
• Documents to be consulted
• UNEG Ethical code of conduct20
• Detailed evaluation workplan
• Format for Inception Report
• Format for Evaluation Report
• Format for Management Response
Optional:
• Additional details to the context
• Additional details to the subject
• Additional details to the methodology

20 UNEG, 2008.
Annex 2 - Formats for use in DA Project Evaluation

Format: Stakeholder mapping (columns)
• Stakeholder
• Stake in the project and the topic that the project addresses
• Level of influence over topic and project / Ways in which affected by topic and project
• Expected use of the evaluation results
• Way(s) to involve this stakeholder in the evaluation process
Format: Evaluation matrix (one row per question)
Evaluation Question 1:
Evaluation Question 2:
Evaluation Question 3:
Format: Detailed evaluation workplan (by phase)
1. Preparatory Phase
2. Inception Phase
3. Data Gathering Phase
4. Analysis and Reporting Phase
5. Follow-Up Phase
Format: Management response
Issue | Details
Recommendations table (columns):
• No.
• Recommendation
• Acceptance (Rationale if rejected)
• Priority
• Party Responsible
• Action Description
• Time frame
• Status of Progress
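Purely as an illustration (not part of the required format), the Python sketch below captures the columns above as a structured record that could be used for tracking follow-up; all field names and values are hypothetical.

from dataclasses import dataclass

# Minimal sketch of one row of a management response tracking table.
# Field names mirror the columns above; the values are invented examples.
@dataclass
class ManagementResponseRow:
    no: int
    recommendation: str
    acceptance: str          # accepted / partially accepted / rejected (with rationale)
    priority: str            # e.g. high / medium / low
    party_responsible: str
    action_description: str
    time_frame: str
    status_of_progress: str = "not started"

row = ManagementResponseRow(
    no=1,
    recommendation="Institutionalize the training curriculum within the ministry.",
    acceptance="Accepted",
    priority="High",
    party_responsible="IE programme officer",
    action_description="Hand over the curriculum and toolkit to the national training institute.",
    time_frame="Within 6 months of report issuance",
)
print(row.no, row.recommendation, row.status_of_progress)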
Data Gathering
1. Desk review
Description: Review of secondary information pertaining to the project (including the project document, monitoring data, project reports and financial data) and information on the topic concerned, including lessons from earlier evaluations and reviews of projects on the same topic.
Use: Needed to understand the context of the project, the stakeholders and the roles concerned; yields in particular information on the evaluation criteria of relevance and efficiency, while monitoring reports can include useful data on results that would need to be triangulated with other sources.
3. Focus group discussions
Description: Discussion-type interviews with a group of stakeholders, about 6 to 10 participants, with similar backgrounds and hierarchy, lasting several hours; allows for discussion of a set of issues across the participants of the group; full participation of all group members needs to be ensured; topics for discussion are informed by the desk review.
Use: An effective means of collecting in-depth qualitative information, including on what worked, what did not, and the reasons concerned; usually does not work with different hierarchical status among group members; ideally two evaluators are needed, one to facilitate the discussion and one to take detailed notes.
4. Mini-survey
Description: A self-administered short online or e-mailed survey of a substantial number of stakeholders or participants, at low cost and with possibly wide reach; substantial non-response can affect the ability to analyse the data quantitatively; short surveys with up to 10 questions can reduce non-response rates.
Use: Used to provide quantitative data and information; online software can usually provide key statistics (see the illustrative sketch after this table); the inclusion of open-ended questions can add some qualitative data, which can counteract the risk of a substantial non-response rate and the related limitations in quantitative analysis of the data; development of different questionnaires for different stakeholder groups is usually required, with some overlap to enhance comparability.
5. Direct observation
Description: Observing the process of project implementation, often through participation in a workshop, training event, meeting or conference. Preparing aspects for guided observation can enhance data gathering.
Use: Can be used to get a better understanding of the workshop, training event, meeting or conference; useful for assessing aspects of the participation of the various stakeholders in the events concerned; can be combined with key informant interviews and focus group discussions with participants.
6. Stakeholder mapping
Description: Identifying stakeholders, their roles and relationships, as well as the drivers of the change concerned and those possibly opposed to it.
Use: Useful in particular when many stakeholders are included in a project; visual mapping can be used to show inter-relationships between stakeholders; to be included as part of the TOR so that it can be specified in the inception phase and beyond.
7. Case studies
Description: An in-depth exploration, from multiple perspectives, of the complexity and uniqueness of a particular policy, institution, programme or system in a real-life context.
Use: The case study is useful for in-depth analysis of specific issues but, with its focus on cases, is less useful for generalising, as it focuses on the concrete and the specific. Need to focus on
8. Knowledge assessment
Description: Assessment of the knowledge and skills gained by individuals who have received training, workshop, coaching or mentoring as part of the project; ideally done before and after the training, workshop, coaching or mentoring; often useful to get information from the supervisor or trainer as well.
Use: Used in the assessment of learning from training, workshop, coaching and mentoring initiatives, going beyond the initial reaction of participants to establish what they learned through the initiative concerned.
9. Benchmarking
Description: Comparing each government’s policies, legislation or development plans against a benchmark or ideal example that fits related UN criteria, norms or standards; in addition to comparing policies or plans, individual countries can be assessed against international criteria, norms and standards, which are used as a benchmark.
Use: UN criteria, norms and standards can be used in the assessment of government legislation, policies and development plans.
11. Country visits
Description: Country site visits give the evaluator an opportunity to conduct face-to-face meetings and interviews and to hear the perspectives of the ultimate beneficiaries / rights holders in one or more countries.
Use: Country visits can help the evaluator to validate information obtained from other sources and methods, and can help to gain a first-hand understanding of the enabling and constraining factors that governments or institutions face.
12. Outcome mapping21
Description: Assessment of the intermediate changes that need to be attained in order to reach an organisation’s vision; Outcome Mapping focuses on the capacities of partner organizations through assessment of changes in the behaviour, relationships, activities or actions of the parties with whom a programme works directly.
Use: The methodology is particularly useful for a project built on longer-term partnerships, based on a shared vision and values amongst participating organisations, with a focus on output- and outcome-level changes and a non-linear view of social change, making use of outcome, strategy and performance journals.
13. Most significant change22
Description: Assessment of changes that have occurred in a targeted group or area and development of a dialogue on the values attached to these changes by key stakeholders, who select the most important story.
Use: A participatory approach in which stories of change are holistic and in which participants provide part of the analysis through their selection of stories. Though the method starts off as qualitative data gathering, it can also be used for quantitative data analysis. The open-ended approach of the method is particularly useful for the assessment of unexpected changes.
21 Earl, S., Carden, F., and Smutylo, T., 2001.
22 Davies, Rick and Jessica Dart, 2005; and Dart, Jessica and Rick Davies, 2003.
14. Process documentation23
Description: The systematic assessment of the outcomes of a set of development activities, in order to understand the processes that led to their results, and consultations with others on these processes, to learn from programme implementation and inform the facilitation and management of supported development processes.
Use: Of particular importance in pilot and other innovative initiatives, as well as in more open-ended programme approaches and policy-level engagement, in which there is a need for incremental learning and to document such processes for future programme initiatives.
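Illustrative sketch referred to under method 4 (mini-survey): a few lines of Python showing the basic survey arithmetic, i.e. the response rate that conditions quantitative analysis and a simple key statistic for one closed question. All numbers are invented.

from statistics import mean

# Hypothetical mini-survey: 120 stakeholders invited, 48 responded.
invited, responses = 120, 48
response_rate = responses / invited
print(f"Response rate: {response_rate:.0%}")  # 40%: substantial non-response

# Key statistic for one closed question (1-5 agreement scale, invented answers):
q1_scores = [4, 5, 3, 4, 2, 5, 4, 3]
print(f"Q1 mean score: {mean(q1_scores):.1f} (n={len(q1_scores)})")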
Data Analysis
1. Qualitative content analysis
Description: Content analysis is a way to conduct systematic text analysis, reducing large amounts of unstructured textual content into manageable data relevant to the evaluation concerned, through a step-by-step process of dividing the material into content-analytical units, making use of analytical categories that are further specified through feedback loops.
Use: Categories for data analysis are developed in inductive and deductive ways, providing the means for coding qualitative data (see the illustrative sketch after this table). The evaluation matrix provides an important means of organizing qualitative data along the categories of qualitative content analysis.
2. Context analysis
Description: Analysis of the context in which a project operates, in particular the political economy context, with a focus on issues of the enabling environment, space for change and the capacities concerned.
Use: Of particular use for initiatives that engage at the policy level, in order to understand aspects of the political economy environment in which the project operates.
3. Policy analysis
Description: Analysis of policy initiatives and their results, making use of: the policy cycle (to understand to what phases of the policy cycle initiatives aim to contribute and the kind of policy results achieved along the stages of the cycle); the type of policy engagement (assessing the audience and influence sought); the theory of change (to analyse the logic of how policy engagement is meant to deliver results); analysis of the means to reach policy change (including the development of the evidence concerned, through knowledge products and otherwise); and partnership analysis (to analyse the DA project’s partnering with other organizations to reach policy objectives).
Use: Analysis of policy engagement and other policy-related initiatives in terms of the stage of the policy cycle, the type of engagement, the means to reach policy change and the partnerships concerned.
4. Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis
Description: Looking at strengths and weaknesses in terms of internal capabilities of the project and implementing organizations, and at opportunities and threats in terms of external factors.
Use: Strengths and opportunities identified will be used to assess aspects to be further developed and reinforced, while weaknesses and threats point to aspects to be addressed or mitigated.
23 Da Silva Wells, Carmen et al., 2011; Joseph, John A, not dated.
5. Results chain analysis
Description: Analysis of the project results framework and the logical sequence between activities, their direct outputs, and outcome-level changes.
Use: This approach provides a framework for assessing whether objectives are likely to be achieved, through monitoring of indicators at the levels of the results framework.
6. Contribution analysis
Description: An approach designed to reduce uncertainty about the contribution an intervention is making to the observed results, through an increased understanding of why the observed results have occurred (or not) and the roles played by the intervention and other internal and external factors. It proceeds through several steps, including analysis of the project theory of change, of whether it was implemented as intended and the anticipated chain of results occurred, of the extent to which the project contributed to outcome-level changes through the realization of output-level results, and of the extent to which other factors influenced the programme’s achievements.
Use: Used in the assessment of the effects of project support on intermediate-level changes to which the project cannot be considered the sole contributor, providing the means to relate output- and outcome-level changes; particularly useful in interventions for which a clear theory of change was formulated, where the analysis provides evidence on the contribution of the initiative concerned.
7. Timeline analysis
Description: Analysis of project implementation from a chronological perspective, linking time-related aspects of the achievement of results with relevant project management and contextual issues in-country and beyond.
Use: This approach can provide insight into the chronological aspects of the achievement of project results and their relation to project implementation and to contextual changes occurring at various points in time.
Format for Inception Report

No | Report Section | Contents

1 Introduction
• Title of the evaluation
• Very short description of the project to be evaluated
• Short reason for the evaluation (will be expanded on under 2 below)
• Short introduction of the context
• Timing of the evaluation

2 Evaluation purpose
• Rationale for the evaluation, why it is needed at this time
• Expected users and the expected use by each of these of the evaluation results

3 Context of the evaluation
• Introduction of the topic of the project to be evaluated
• Details on the topic in the countries/regions covered by the project
• Details on policies, plans and programmes of government and other organizations on the topic concerned, and support provided by other development partners

4 Subject of the evaluation
• The DA project, its objective and how it tries to achieve this
• Coverage in terms of countries / regions and the time frame concerned
• Partners for implementation, including government, other IEs and other UN agencies at country, (sub-)regional and global levels
• Other stakeholders that have an interest in the project and the evaluation
• Human and financial resources of the DA project
• Past evaluations / assessments / studies, including gender and vulnerability assessments

5 Evaluation scope, objectives and questions
• What the evaluation will cover of the subject of the evaluation in terms of project components and activities, coverage of geographical area, time frame and otherwise
• What parts of the subject the evaluation will not cover, and the rationale for this
• Objectives of the evaluation, i.e. what the evaluation will accomplish, including what evaluation criteria will be covered and the rationale for this
• Evaluation questions, organized by evaluation criteria, with the number of questions limited to six or seven
• Evaluation scope, objectives, criteria and questions need to be reviewed in the inception phase by the evaluator and, if needed, adapted in coordination with the evaluation manager and, if applicable, with the IEC and ERG

6 Methodology of the evaluation
• Methodological approach and rationale
• Methods for data gathering and methods for data analysis
• Identification of primary data gathering and rationale for country selection
• Sampling of respondents for qualitative and quantitative data gathering
• Data availability
• Application of a human rights and gender equality approach in the evaluation
• Ethical concerns and how to address these
• Limitations to the methodology and ways to address the challenges identified
• The evaluation methodology needs to be reviewed in this phase by the evaluator and, if needed, adapted/further developed in coordination with the evaluation manager and, if applicable, the IEC and ERG

7 Organization of the evaluation
• Evaluation process and work plan
• Management issues, including roles and responsibilities of the IE, evaluation manager, evaluator and, if applicable, the IEC and ERG
• Evaluation team composition and responsibilities
• Evaluation deliverables, i.e. draft and final evaluation report
• Security considerations

9 Annexes
Required
• TOR
• Detailed results framework of the project
• Stakeholder mapping / analysis
• Evaluation Matrix
• Detailed evaluation workplan
• UNEG Ethical code of conduct
• List of acronyms used
• References to secondary information sources
Optional
• Additional contextual details
• Additional methodological details
• Additional details on the subject of the evaluation
Annex 6 - References
Bamberger, Michael, Jim Rugh and Linda Mabry, RealWorld Evaluation, Working under budget, time,
data and political constraints, London, New Delhi, 2006.
Beer, Tanya, Evaluating Public Policy Advocacy, Center for Evaluation Innovation, Health Summit 2012.
Bester, Angela, Results-based Management in the United Nations Development System, A report
prepared for the United Nations Department of Economic and Social Affairs for the 2016 Quadrennial
Comprehensive Policy Review, January 2016.
Coghlan, Anne T., Hallie Preskill and Tessie T. Catsambas, An Overview of Appreciative Inquiry in
Evaluation, in: Preskill, Hallie and Anne T. Coghlan, Using Appreciative Inquiry in Evaluation. New
Directions for Evaluation, No 100, Winter 2003.
Cross, Harry, Karen Hardee and Norine Jewell, Reforming Operational Policies: A Pathway to Improving Reproductive Health Programs, December 2001.
Dart, Jessica and Rick Davies, A Dialogical, Story-Based Evaluation Tool: The Most Significant Change
Technique. In: American Journal of Evaluation, 24 (2), 2003.
Davies, Rick and Jessica Dart, The ‘Most Significant Change’ (MSC) Technique. A Guide to its Use. April
2005 (available at http://www.mande.co.uk/docs/MSCGuide.pdf).
Da Silva Wells, Carmen, Ewen Le Borgne, Nick Dickinson and Dick de Jong, Documenting Change, An Introduction to Process Documentation, IRC International Water and Sanitation Centre, September 2011.
Earl, S., Carden, F., and Smutylo, T., 2001, Outcome Mapping: Building Learning and Reflection into
Development Programs, Ottawa: International Development Research Centre (available at
http://www.idrc.ca/en/ev-9330-201-1-DO_TOPIC.html).
Development Account Task Team on Evaluation, Preparation of Evaluation Reports for Development
Account Projects, September 2017.
Joseph, John A, Process Documentation, not dated.
Keck and Sikkink, cited in: Jones, Harry, A Guide to Monitoring and Evaluating Policy Influence, Background Note, London, 2011.
Kirkpatrick, Donald L., Evaluating Training Programs: The Four Levels, 2006.
Kirkpatrick, Donald L. and James D. Kirkpatrick, Implementing the Four Levels. A Practical Guide for Effective Evaluation of Training Programs, 2007.
New Zealand Aid, Gender Analysis Guideline, Integrating Gender Equality and Women’s
Empowerment into an Activity, Programme or Policy, September 2012.
ODA, Successful Communication: Planning Tools, Stakeholder Analysis.
OECD Development Assistance Committee, Glossary of Key Terms in Evaluation and Results Based
Management, Paris, 2002.
OIOS, Quality assessment framework.
Patton, Michael Quinn, Utilization-Focused Evaluation, 4th edition, 2008.
Scriven, Michael, The Evaluation of Training, Update 2010.
The Evaluation Exchange, Harvard Family Research Project, Harvard Graduate School of Education, XIII/1, Spring 2007.
UNDP, Capacity development: A UNDP Primer, 2009.