
UN Development Account

Project Evaluation Guidelines

October 2019

Table of Contents
Abbreviations and Acronyms
1. Introduction
2. Characteristics of the DA and DA Projects
   2.1 DA Characteristics
   2.2 Characteristics of DA Projects
   2.3 DA Terminology for Results-Based Management
   2.4 DA Project Evaluations
3. Preparatory Phase of the Evaluation
   3.1 Evaluation Management
   3.2 Terms of Reference
4. Inception Phase of the Evaluation
   4.1 Steps in the Inception Phase of the Evaluation
5. Data Gathering Phase of the Evaluation
6. Analysis and Reporting Phase of the Evaluation
   6.1 Findings
   6.2 Conclusions
   6.3 Lessons Learned / Good Practices
   6.4 Recommendations
   6.5 The Executive Summary of the Evaluation
   6.6 The Writing of the Evaluation Report
7. Follow-Up Phase of the Evaluation
   7.1 Sharing of evaluation results
   7.2 Management response

ANNEXES
Annex 1 - DA Requirements for the Terms of Reference of Project Evaluations
Annex 2 - Formats for use in DA Project Evaluation
Annex 3 - Details on Methods for Data Gathering and Analysis in DA Project Evaluations
Annex 4 - DA Requirements for Inception Reports of Project Evaluations
Annex 5 - Outline of the required elements for Evaluation Reports
Annex 6 - References

LIST OF TABLES
Table 1: Previous and new DA Terminology
Table 2: Description of Key Components of the TOR
Table 3: Description of Key Components of the Evaluation Report

LIST OF BOXES
Box 1: UNEG Definition of Evaluation Criteria


Abbreviations and Acronyms

CDPMO       Capacity Development Programme Management Office (UNDESA)
DA          Development Account
DESA        Department of Economic and Social Affairs
ECA         Economic Commission for Africa
ECE         Economic Commission for Europe
ECLAC       Economic Commission for Latin America and the Caribbean
ERG         Evaluation Reference Group
ESCAP       Economic and Social Commission for Asia and the Pacific
ESCWA       Economic and Social Commission for Western Asia
GA          General Assembly (UN)
IE(s)       Implementing Entity(ies)
IEC         Internal Evaluation Committee
PMT         Programme Management Team
RBM         Results-Based Management
SDG         Sustainable Development Goal
TOR         Terms of Reference
UN          United Nations
UN HABITAT  United Nations Human Settlements Programme
UNCTAD      United Nations Conference on Trade and Development
UNEG        United Nations Evaluation Group
UNEP        United Nations Environment Programme
UNODC       United Nations Office on Drugs and Crime


1. Introduction
1. The United Nations (UN) Development Account (DA) was established in 1997 by the UN General
Assembly as a capacity development programme of the UN Secretariat. The DA supports the
implementation of projects of five global UN Secretariat entities and the five UN Regional
Commissions, with the goal of enhancing capacities of developing countries in priority areas of the
2030 Agenda for Sustainable Development. The DA provides the ten implementing entities (IEs), which
are mostly non-resident in beneficiary countries, with the ability to operationalize their vast
knowledge and know-how and to deliver capacity development support on the ground to selected
stakeholders. In this way, the entities are able to follow up on their normative and analytical work as
well as inter-governmental processes, through concrete projects at multi-country, sub-regional,
regional and global levels.

The DA Evaluation Framework


2. The DA Evaluation Framework was developed in 2019 with input from key stakeholders of the
Account. The framework aims at enhancing the DA evaluation function, orienting it towards learning,
in addition to accountability. The DA Evaluation Framework includes project level evaluations,
programme level evaluations, a rolling evaluation workplan and generation and use of learning
through evaluation.

DA Project Evaluation Guidelines


3. The present Guidelines provide details on the requirements for the evaluation of DA projects. The
Guidelines have been tailored to the specific characteristics of the DA, focusing on the distinctive
requirements for the evaluation of DA projects, in addition to a more generic evaluation perspective.
The Guidelines aim to support the implementation and enhance the quality of DA project evaluations.
They are not meant to be comprehensive and should be regarded as complementing the more general
guidelines on project evaluation conducted in the UN context and specific evaluation policies and
guidance of DA IEs.
4. The Guidelines are in particular meant for use by the IEs whose projects are being evaluated, managers
of DA project evaluations and the IE evaluation sections of which the evaluation managers of DA
projects are part, the independent evaluators who conduct DA project evaluations, as well as the
substantive section(s) of IEs whose projects are being evaluated. Moreover, the Guidelines will be of
use to the DA Programme Management Team (PMT) and other parties with an interest in DA project
evaluations.
5. The Guidelines were prepared by a senior consultant. They were commissioned by the DA Programme
Manager and the process of developing them was managed by the DA PMT. The development of the
Guidelines was informed by a desk review of relevant documentation, analysis of a sample of DA
project evaluation reports and consultations with key DA stakeholders. Draft versions of the
Guidelines were discussed in virtual meetings with DA Network members, IE evaluation specialists and
the DA PMT.

Ways to use the present Guidelines


6. The Guidelines are organized along the five stages of the DA project evaluation process, i.e.
preparation, inception, data gathering, analysis and reporting, and follow-up. This provides the reader
with the opportunity to be informed about the specific DA requirements during the various stages of
DA project evaluations.
7. Though many of the issues regarding project evaluation are relevant in more than one stage of the
evaluation process, each of these is discussed in the phase of the evaluation process most relevant
to the issues concerned. Thus, issues of purpose and context of the evaluation, as well as evaluation

scope and objectives, evaluation criteria and questions, are in particular dealt with in the preparatory
phase, as all these issues need to be included in the TOR of the evaluation. Many of these issues,
however, remain relevant throughout the evaluation process. Aspects of the methodology, the
organization of the evaluation and the work plan are dealt with in the inception phase, as these issues
need to be further developed based on the preliminary setup provided in the TOR, and presented in
the inception report. The analysis and reporting phase focuses on findings, conclusions, lessons
learned and recommendations, as these are specific to this phase. The follow-up phase of the
evaluation concerns the sharing of the results of the evaluation, development of a management
response by IEs and partners targeted by the recommendations, use of the lessons and good practices
in the design of new projects and programmes and use of evaluation results in reporting of the DA to
the UN General Assembly.
8. Throughout these Guidelines use is made of blue boxes to highlight DA specific requirements, while green boxes detail more general good practices and suggestions for DA project evaluations.

DA Requirements
Blue boxes include DA requirements for project evaluations, often related to the specific characteristics of the DA.

Good Practice in Project Evaluation
Green boxes focus on good evaluation practices and other suggestions to support the quality of DA project evaluations.


2. Characteristics of the DA and DA Projects


2.1 DA Characteristics
9. The DA supports the implementation of projects by ten IEs, consisting of the economic and social
entities of the United Nations (UN) Secretariat, including five global UN Secretariat entities, i.e. UN
Department of Economic and Social Affairs (DESA), United Nations Conference on Trade and
Development (UNCTAD), United Nations Environment Programme (UNEP), United Nations Human
Settlements Programme (UN Habitat), and the United Nations Office on Drugs and Crime (UNODC)
and the five UN Regional Commissions, i.e. the Economic Commission for Africa (ECA), the Economic
Commission for Europe (ECE), the Economic Commission for Latin America and the Caribbean (ECLAC),
the Economic and Social Commission for Asia and the Pacific (ESCAP) and the Economic and Social
Commission for Western Asia (ESCWA).
10. The DA projects of the ten IEs are focused in particular on capacity development, policy level
engagement, advocacy for and support to the implementation of international norms and standards
agreed through inter-governmental processes. The Account aims to support innovative approaches to sustainable development. After a successful DA project, the initiatives are meant to be picked
up by the IEs or by project partners with funding from outside of the Account. All projects
implemented through the DA are based on requests from beneficiary countries to the IEs.
11. Until the 11th tranche, new DA tranches were initiated every two years. From 2020 onwards, starting with the 12th tranche, new tranches are launched on an annual basis, with half the number of projects of the previous biennial tranches. Project implementation periods, however, remain four years.

2.2 Characteristics of DA Projects


12. Capacity development is the main focus of DA projects, with capacities being supported at multiple
levels, including the individual, institutional and societal level.1 Many DA projects concern
international norms and standards, agreed through inter-governmental processes. This includes the
adoption of norms and standards, as well as their integration into legislation, policies and
development planning and support to implementation of such legislation, policies and development
plans, at the country, (sub-) regional and global levels.
13. The capacity development support of IEs can take a variety of forms, including organization of
workshops and trainings, support to the development of knowledge products, facilitation of inter-
governmental dialogue and coordination amongst key stakeholders and policy level engagement, at
times including the development of tools and guidelines, in order to facilitate policy implementation.
DA projects often focus on enhanced capacities of policy makers, increased institutional capacities and
strengthened coordination across stakeholders. Ultimate beneficiaries are the women and girls, men
and boys who benefit in terms of the effects on their lives and livelihoods, in particular those of
vulnerable and marginalized groups.
14. Most DA projects focus on multiple countries, often across multiple regions, involving several of the
IEs and including partnerships with national level government, other UN agencies as well as other
development partners, civil society organizations and universities. The involvement of multiple
partners often results in leveraging of additional resources, including human and in kind resources as
well as additional funding.

1 UNDP, 2009.

2.3 DA Terminology for Results-Based Management


15. The DA uses Results-Based Management (RBM) approaches and has recently aligned its results
framework language with terminology commonly used in this connection, as per table 1 below.

Table 1: Previous and new DA Terminology

Previous DA Terminology      New DA Terminology
Main Activity                Output
Expected Accomplishment      Outcome
Objective                    Objective

2.4 DA Project Evaluations


16. Project evaluations are a key component of the DA evaluation function. For the 10th and 11th tranches, all projects will continue to be evaluated as per past practice. Starting with the 12th tranche, a selection of half the projects of each IE within a tranche is to be evaluated. The selection of projects for evaluation within a tranche will be based on a purposive sample2, with random selection of projects for each of the implementing agencies. Projects with a budget of USD 1 million or more will by default be included among the projects selected for a project evaluation.
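To make the selection rule concrete, the sketch below shows one way the per-IE random draw could be implemented. It is an informal illustration only, not an official DA tool; the field names ("ie", "budget_usd") and the rounding-down of "half" are assumptions, as the Guidelines do not specify these details.

```python
import random

def select_projects_for_evaluation(projects, seed=None):
    """Illustrative sketch of the selection rule in paragraph 16: group the
    tranche's projects by implementing entity (IE), always keep projects of
    USD 1 million or more, and fill the rest of each IE's half at random.
    Field names ("ie", "budget_usd") are hypothetical."""
    rng = random.Random(seed)
    by_ie = {}
    for project in projects:
        by_ie.setdefault(project["ie"], []).append(project)

    selected = []
    for ie_projects in by_ie.values():
        mandatory = [p for p in ie_projects if p["budget_usd"] >= 1_000_000]
        optional = [p for p in ie_projects if p["budget_usd"] < 1_000_000]
        # Target half of the IE's projects (rounded down here, an assumption),
        # but never drop a mandatory USD 1 million+ project.
        target = max(len(ie_projects) // 2, len(mandatory))
        selected.extend(mandatory)
        selected.extend(rng.sample(optional, target - len(mandatory)))
    return selected

# Hypothetical usage: two IEs within one tranche.
tranche = [
    {"id": "P1", "ie": "ECA", "budget_usd": 1_200_000},
    {"id": "P2", "ie": "ECA", "budget_usd": 600_000},
    {"id": "P3", "ie": "ECA", "budget_usd": 450_000},
    {"id": "P4", "ie": "ESCAP", "budget_usd": 700_000},
    {"id": "P5", "ie": "ESCAP", "budget_usd": 900_000},
]
print([p["id"] for p in select_projects_for_evaluation(tranche, seed=1)])
```

Grouping by IE before drawing is what makes the sample purposive in the sense of footnote 2, while the draw within each IE remains random.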
17. Project evaluations will include primary data gathering through a possible visit to one or more selected
countries and virtual or face-to-face meetings and interviews with stakeholders, including the option
for the evaluator to participate in the final workshop of the project and collect data from participants.
18. DA project evaluations are conducted towards the end or shortly after the DA project has been
completed, with a focus on the achievements, and learning from the implementation of the project
as well as the ways in which these were achieved. DA project evaluations consist of five phases:
• The preparatory phase, in which the Terms of Reference (TOR) are developed in coordination
with stakeholders, the consultant(s) are identified and hired and secondary resources are
compiled
• The inception phase, in which a desk review is conducted by the evaluator, the TOR is further
operationalized, and a draft and final inception report is produced and discussed with relevant
stakeholders
• The data gathering phase, in which primary and additional secondary data are gathered by
the evaluator and in which an initial analysis is conducted, with preliminary results of the
evaluation validated through discussions with key stakeholders
• The analysis and reporting phase, in which data gathered are further analysed and draft and
final evaluation reports are prepared, informed by comments of relevant stakeholders
• The follow-up phase, in which the results of the evaluation are shared with key and other
relevant stakeholders, a management response is developed by the entities and partners that
have been targeted in the recommendations of the evaluation, and further action is taken
with respect to learnings emanating from the evaluation
19. The next chapters (chapters 3-7) will discuss these phases in more detail.

2 Purposive sampling, also known as selective sampling, is random sampling with the application of certain criteria for selection of the sample. In the case of the DA projects, the IEs are the criterion. This means that projects are organized by IE and, from the projects of each IE in a tranche, a random sample is taken of half of the projects concerned. With the approach to selection of projects to be evaluated of each of the IEs being random, there are no specific selection criteria for DA projects to be evaluated.

3. Preparatory Phase of the Evaluation


20. In the preparatory phase the scene is set for the entire DA project evaluation process. This includes
putting in place the management arrangements of the evaluation, development of the Terms of
Reference (TOR) to guide the entire process, recruitment of the independent evaluator(s) to conduct
the evaluation and compilation of the secondary data sources for the desk review. Below, details are
presented on the management arrangements to be put in place for the evaluation and the
requirements of the TOR and its development. Processes on recruitment of consultants will be specific
to each of the IEs. Requirements for the evaluator(s) in terms of background, competencies and skills
should be provided as part of the details presented in the TOR.

3.1 Evaluation Management


21. Management of the evaluation includes all phases of the evaluation process. In the preparatory phase
it includes the development of the TOR, the recruitment of the evaluation consultant(s) and the
compilation of secondary resources for the desk review.
22. DA project evaluations are to be managed independently from the person(s) directly responsible for
management of the project. Usually, they are managed by the evaluation section of the IE concerned.
This includes the development of the TOR.
23. Evaluation management can be supported by an Internal Evaluation Committee (IEC) as well as by an
Evaluation Reference Group (ERG). The IEC is an internal committee, consisting of staff from the
evaluation section and other substantive section(s) as relevant. The DA project manager can be
included in the IEC as a non-voting member, with a role in terms of facilitation of communication with
DA project stakeholders and provision of relevant secondary documentation and other inputs.
24. The ERG combines internal and external stakeholders and includes the management of the IE
substantive section(s) (but not the project manager), government counterparts, other UN agencies
and other relevant development partners. The ERG provides inputs in particular to the critical points
in the evaluation process, including in the development of the TOR, the inception phase and the
reporting phase, commenting on draft versions of the inception and evaluation reports.
25. The support of an IEC and ERG can reinforce the management of the evaluation process, enhance its
independence and can heighten the credibility of the results of the evaluation. Therefore, the
establishment of an IEC and an ERG is encouraged as important means to guide the
evaluation process and support evaluation management.
26. DA Project evaluations will be conducted by an independent evaluator or evaluation team, ideally gender balanced and, if feasible, consisting of an evaluation specialist as well as a subject matter specialist. The
TOR is an important means in the recruitment process of the evaluator3.
27. In terms of timing of the evaluation, data gathering at country level should not be started before all
project activities have been finalized, with the exception of a possible final workshop or meeting,
which can provide a useful means for the evaluator to participate in one of the project activities and
to interview a variety of key stakeholders and participants at the event. Evaluations need to be
finalized within a timeframe of three months after closure of the DA project, but could be extended
up to six months following agreement with the DA PMT, the period depending on the complexity of the
evaluation and to be specified in the TOR.

3 The term evaluator is used in these Guidelines both for a single evaluator and for situations where the evaluation is conducted by an evaluation team of two persons.


28. For all DA projects, project managers prepare a final project report4. This report provides an overview
of the project and its achievements, possible challenges faced, as well as good practices and lessons
learned, primarily from the perspective of the project manager. The final project report forms an
important input to the project evaluation. A draft of the report should be available to the evaluator
during the evaluation process.

3.2 Terms of Reference


29. The TOR provides the basis for the evaluation and gives direction to its implementation. The
importance of a high-quality TOR cannot be overstated and it is, therefore, important to pay
attention to the requirements of the TOR and its tailoring to the specific characteristics of the DA and
the project concerned. The TOR is usually developed by the evaluation manager, in close coordination
with stakeholders of the project, the substantive section(s) of the IE that manages the project,
including the project manager5, and the IE evaluation section, with the latter providing the evaluation
manager. Moreover, in those situations where an IEC and/or an ERG would be established, the TOR
would be reviewed by their members. A high quality TOR will require multiple iterations.
30. As the TOR provides the direction of the evaluation, the contents of the TOR are critical to the entire
evaluation process. Though the TOR does need to contain a substantial amount of detail, it is
important to keep the TOR relatively short (up to 10 pages, excluding annexes), with main issues in the
main document and details in annexes to the TOR. Annex 1 of these Guidelines provides a
comprehensive overview of the contents required for the TOR of a DA project evaluation. Further details on each of the sections of the TOR are provided below.

Table 2: Description of Key Components of the TOR

TOR section: Description

Evaluation Purpose: Details why the evaluation is conducted, why now, who the main anticipated users of the evaluation results are, and how the results of the evaluation are expected to be used by each of the users
Context and topic of the DA Project: Focuses on the topic that the project aims to address, and the development approaches used to deal with the issues concerned
Subject of the Evaluation: The DA project, of which a short description needs to be provided
Evaluation Scope: Determines the boundaries and specifies the reach of the evaluation, as well as issues that will be left out of the focus of the evaluation
Evaluation Objectives: Refer to what the evaluation needs to accomplish in order to reach its purpose
Evaluation Criteria: Refer to the guiding principles that will be used to gather and analyse data
Evaluation Questions: Provide further details on each of the evaluation criteria and specify key questions that need to be answered through the evaluation, in this way guiding data collection and analysis

a) Evaluation Purpose
31. The purpose of the evaluation details why the evaluation is conducted, why now, who the main
anticipated users of the evaluation results are, and how the results of the evaluation are expected to

4 See the following link for information on the preparation of the final report: https://www.un.org/development/desa/da/static-guidance-public/
5 The role of the project manager is limited to providing relevant information for the development of the TOR, with the evaluation meant to be independent from the management of the project.

be used by each of the users. Detailing the purpose of the evaluation is of critical importance, as it
drives the evaluation, informing the evaluation scope, objectives and the evaluation criteria used and
questions posed (see also the phases below for further details on the evaluation scope, objectives,
criteria and questions).
32. Getting the purpose of the evaluation right is of particular importance for a DA project evaluation,
since most of the projects are one-off initiatives, with usually no follow-up phase through the DA. The
purpose of the evaluation is thus NOT to inform a second or subsequent phase of the same project,
as is often the case in project evaluations of other development partners.
33. It is, therefore, important to identify from the start what the use of the results of the project
evaluation is expected to be. In particular use of the results of the project evaluation by the IEs
concerned is important, as well as use by project partners. Reference needs to be made to any
decision-making processes, at the level of the IEs or otherwise, that the evaluation results could feed
into. It will be useful to specify the kind of recommendations that are expected to result from the evaluation process. Moreover, the UN General Assembly needs to be included as an indirect user of DA project evaluations, with evaluation results used in DA reporting to the UN General Assembly6, in particular in terms of achievements and possibly lessons learned.

DA Requirements
Detail the purpose of the evaluation in relation to the DA project, specifying expected users and their use of the evaluation results.

b) Context and topic of the DA Project


34. Contextual details of TORs of many country level project evaluations focus at the country level,
detailing the country specific aspects of the topic that the project aimed to address. For DA projects,
which often focus on multiple countries and can cover multiple regions or be global in approach,
contextual details are slightly different. In the DA setup, context focuses in particular on the topic that
the project aims to address and the development approaches used to deal with the issues concerned.
This type of contextual information is important in order for the reader to understand the topic that
the project focuses on, including the present state of affairs in tackling the issues concerned. In case
the project focuses on a limited number of countries it is useful to elaborate on key details of the topic
in the selected countries.7
35. Important details that need to be provided in terms of the topic of the DA project include policies, strategies and plans of government(s), government agencies and other stakeholders, as well as support to the issues concerned from other UN agencies and development partners. Such data will enable assessing the DA project and its results as part of a wider array of initiatives of parties working on the same topic. Providing details on what others are doing as part of the context can, moreover, inform the understanding of the partnerships that the project has engaged in.

DA Requirements
Provide information on the topic of the DA project and key development approaches used to address the issues concerned. Include information on relevant government strategies, policies and plans and pay attention to support of development partners and other stakeholders in addressing the topic concerned, in order to understand the contribution of the project vis-à-vis other support.

6 The DA Progress Reports to the General Assembly can be found on the DA website at the following link: https://www.un.org/development/desa/da/static-official-public/
7 Contextual details are typically included in the project document for DA projects and would, thus, be available in the project document.

c) Subject of the Evaluation


36. The subject of the evaluation is the DA project, of which a short description needs to be provided. This
includes the design of the project, any adaptations from the design during the course of project
implementation and the rationale for these adaptations, as well as details on the results framework,
outlining the objective of the project and the ways in which the project aims to contribute towards its
achievement, through output and outcome level changes. Additional details of the results framework
can be included in an annex. With many of the DA projects being part of a wider programmatic
approach, there is a need to include the relationship of the project to the larger IE programme of
which it is a part and how the DA project contributes to the results of the wider programme.
37. The description of the subject of the evaluation needs to make explicit the human rights related
aspects of the DA project. This can include human rights related issues that the DA project relates to,
and its contribution to equity and the principle of ‘leaving no one behind’. Moreover, the description
needs to include how the DA project contributes to gender equality and the empowerment of women
and how results of the project could affect women and girls, as well as men and boys. Reference needs
to be made to any possible human rights and/or gender analysis conducted in relation to the topic of
the DA project, as part of the project or by other stakeholders.
38. The subject of the evaluation also needs to include a stakeholder map, i.e. an overview of stakeholders that have a direct or an indirect stake in the project and its evaluation. This needs to include both the interest and role of stakeholders in the topic of the project and the DA project itself, as well as their expected interest in the evaluation and its results. The early development of the stakeholder map will enable the evaluator to make use of it in the inception phase, further refine it and use the results to inform the evaluation process in terms of who to include in what ways in the data gathering, analysis and dissemination parts of the evaluation. For a format see annex 2.

DA Requirements
• Description of the design of the project and any adaptations made to the design, including the rationale for any adaptation
• Details of coverage of the DA project in terms of region(s) and countries and the rationale concerned, as well as the project results framework
• Explicitly refer to the wider programme that the DA project is part of and the way in which the DA project contributes to the wider programme results framework
• Incorporation of a human rights based approach and gender equity perspective in the project and related analysis conducted to inform project design
• Include a stakeholder map and the interest of stakeholders in the project and its evaluation
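As an informal illustration of what a stakeholder map records (the authoritative format is the one in annex 2), the sketch below represents map entries as simple structured records; all stakeholder names and entries are hypothetical.

```python
# Hypothetical stakeholder map entries, mirroring the elements described in
# paragraph 38: role in the project, expected interest in the evaluation,
# and intended involvement in the evaluation process.
stakeholder_map = [
    {
        "stakeholder": "Ministry of Planning (beneficiary country)",
        "role_in_project": "Government counterpart for policy-level engagement",
        "interest_in_evaluation": "Evidence on capacity gains and sustainability of results",
        "involvement": "Interviews during data gathering; review of preliminary results",
    },
    {
        "stakeholder": "Partner UN agency",
        "role_in_project": "Co-delivery of training workshops",
        "interest_in_evaluation": "Lessons on joint delivery and partnership effectiveness",
        "involvement": "Member of the Evaluation Reference Group",
    },
]

# Print a quick overview of who is involved in which way.
for entry in stakeholder_map:
    print(f"{entry['stakeholder']}: {entry['involvement']}")
```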

d) Evaluation Scope and Objectives


39. The scope of the evaluation determines its boundaries and needs to be made explicit as part of the
TOR. At the centre of the scope of each DA project evaluation stands the specific DA project, its design
and its implementation over time, including the adaptations made to the original design during the
implementation process. The scope specifies the reach of the evaluation in terms of time frame
covered, geographical reach, as well as the stakeholders to be included in the evaluation. It details the
thematic attention of the evaluation, the inclusion of cross-cutting issues and its focus on managerial
and strategic aspects of the project. It also specifies issues that will be left out of the focus of the
evaluation, including the rationale for leaving out such issues.
40. Evaluation objectives refer to what the evaluation needs to accomplish in order to reach its purpose.
The objectives of the evaluation need to be distinguished from the purpose of the evaluation (i.e. why
the evaluation is conducted, see also paragraphs 28-30 above).


41. Generally, DA evaluations have focused on two objectives: accountability and learning. In previous years much focus was placed on accountability; with the introduction of the DA Evaluation Framework, more attention is being placed on learning. Rather than just making mention of
learning and accountability, details on each of these need to be made explicit, as well as the balance
between the two. Learning aspects can be further elaborated upon through details on the need for
the evaluation to provide lessons and good practices on the topic concerned as well as on the need
for recommendations to inform programming on the topic, with the recommendations usually not
aimed at informing a second phase of the project.
42. With learning being a main objective of the evaluation, there is a need to conduct the evaluation in a participatory way, in order to ensure that conclusions and learnings are shared with, agreed to and owned by the stakeholders concerned, which enhances the prospects of the use of the results of the evaluation and the implementation of the recommendations.

DA Requirements
• Provide details on the boundaries of the evaluation, specifying what is included and what will be excluded from the evaluation
• Provide the learning objectives of the evaluation in addition to aspects of accountability and specify the requirements in relation to the topic of the DA project, including the rationale

e) Evaluation Criteria
43. The evaluation criteria refer to the key principles that the evaluation will use in order to gather and
analyse data. The selection of evaluation criteria needs to relate to the evaluation objectives and the
underlying purpose of the evaluation. A set of evaluation criteria was developed by the Development
Assistance Committee (DAC) of the Organization for Economic Cooperation and Development (OECD)
in the 1990s, which have been adopted by the UN Evaluation Group (UNEG). These criteria include:
relevance, efficiency, effectiveness, impact and sustainability (see box 1 below for UNEG’s definitions).
All these criteria, with the exception of impact, need to apply to each DA project evaluation.

Box 1: UNEG Definition of Evaluation Criteria


Relevance: Extent to which the objectives of a development intervention are consistent with beneficiaries’
requirements, country‐needs, global priorities and partners’ and donors’ policies
Efficiency: Measure of how economically resources/inputs (funds, expertise, time, etc.) are converted to results. It is
most commonly applied to the input‐output link in the causal chain of an intervention
Effectiveness: Extent to which the development intervention’s objectives were achieved, or are expected to be
achieved, taking into account their relative importance. Effectiveness assesses the outcome level, intended as an
uptake or result of an output
Impact: Positive and negative, primary and secondary long‐term effects produced by a development intervention,
directly or indirectly, intended or unintended
Sustainability: Continuation of benefits from a development intervention after major development assistance has
been completed. The probability of continued long‐term benefits. The resilience to risk of the net benefit flows over
time
Source: United Nations Evaluation Group, Integrating Human Rights and
Gender Equality in Evaluation – Towards UNEG Guidance, March 2011

44. The evaluation criteria developed by the OECD DAC were geared in particular towards country level
development interventions, often in single sectors and of a technical cooperation nature. As such their
definitions do not necessarily fit with the specific characteristics of DA projects, which are projects at


multi-country, regional, sub-regional or global levels and that focus on capacity development.
Therefore, the evaluation criteria need to be tailored to the specific requirements of the DA.
UNEG evaluation criteria tailored to the DA
45. While the criteria of relevance, effectiveness and sustainability do apply to DA projects, the
evaluation questions underneath each of these criteria will need to be adapted to the characteristics of the DA (see the box of example evaluation questions below). The criterion of efficiency is useful in its focus on economic
efficiency and timeliness of the process through which activities are transformed into output level
results, including the use of human and financial resources and aspects of project management. The
criterion of efficiency, however, misses out on other important process issues of DA projects, including
partnerships, human rights and gender equality issues, which will be discussed below.
46. The criterion of impact proves usually less applicable to DA projects as results, in terms of effects on
the lives of people, in particular vulnerable and marginalized groups, would usually only be assessable
sometime after phasing out of the project, with a variety of other intervening factors playing a role.
Given the limited budget and time frame of DA projects, they cannot necessarily be expected to be
able to show impact level changes, i.e. demonstrable improvements in the lives of women and men,
girls and boys within the time frame of the project. Impact assessment usually requires a huge
investment in human and financial resources in order to be able to establish attribution through
project interventions, something beyond the ability of DA projects.
Additional DA required evaluation criteria
47. The UNEG evaluation criteria do not cover all relevant aspects of the DA. Use of the additional DA required evaluation criteria (SDGs, partnerships, human rights and gender equality, and innovation) is, therefore, an essential part of DA project evaluations. These can either be included as separate
criteria for the evaluation, or evaluation questions related to these could, alternatively, be included
under one or more of the UNEG evaluation criteria.
48. Given the importance of the 2030 Agenda and the SDGs, the evaluation of DA projects needs to pay
attention to these. This includes attention to the SDGs and related targets and indicators of those
SDGs relevant to the project concerned as well as attention to the principle of ‘Leaving no one behind’.
This can be achieved under selected evaluation criteria, e.g. through analysis of alignment of the
project with the 2030 Agenda as part of the assessment of the criterion of relevance, or through
analysis of the contribution of the project to SDG targets and indicators as part of the assessment of
the evaluation criterion of effectiveness. Alternatively, SDGs can be used as a separate evaluation
criterion, with specific evaluation questions formulated and analysed as part of the evaluation.
49. Partnerships are an essential element of DA projects, as they can enhance efficiencies and
effectiveness of project delivery, by ensuring full utilization of comparative advantages and avoiding
duplication of efforts. Partnerships are, therefore, an important and required criterion for the
evaluation of DA projects. In the DA context, partnerships typically refer to joint/collaborative
implementation of projects amongst DA IEs, other UN agencies as well as sub-regional, regional and
global level stakeholders. Direct beneficiaries of DA projects are, however, not referred to as
implementing partners.
50. Issues of partnership can be dealt with as part of the criterion of efficiency, looking at aspects of
synergy or as part of the evaluation criterion of effectiveness, with a focus on results concerned.
Partnerships can also be used as a separate criterion.
51. Human rights and gender equality are important cross-cutting principles to be incorporated in all UN
programming. They are also integrated in the 2030 Agenda and have been incorporated in the
evaluation quality assessment framework of UN Office of Internal Oversight Services (OIOS). It is,
therefore, essential to include these issues in the design of a DA project evaluation. This requires
explicit attention to the principles of equality, inclusion and non-discrimination as part of the


evaluation. In case the project design has clearly included human rights and gender equality issues, a
gender analysis has been conducted and monitoring and reporting have captured related details,
disaggregating data, then inclusion in the design of the evaluation can be guided by these aspects.
However, if such details are less available or altogether missing, the reasons need to be understood
and some of these aspects can be included in the evaluation design. The conduciveness of the context
to include human rights and gender related issues in the DA project evaluation will need to be taken
into account as it will affect the feasibility and the level of their inclusion in the evaluation process.8
Human rights and gender aspects can be included in each or a selection of the evaluation criteria, they
can be used as a separate criterion or both options can be combined in the evaluation.
52. With the DA focus on innovation9, it is important for DA project evaluation to pay attention to the extent to which, and the ways in which, innovation has been a feature of the project. As with other thematic issues, this can be done as part of the UNEG evaluation criteria or innovation can be used as a separate evaluation criterion, with specific evaluation questions formulated to assess issues concerned.

DA Requirements
• Present the UNEG evaluation criteria (relevance, effectiveness, sustainability and efficiency), tailored to the characteristics of the DA and the project concerned
• Use the additional DA required evaluation criteria (SDGs, partnerships, human rights and gender equality, and innovation) as additional criteria and as an essential part of DA project evaluations, or include questions related to these under one or more of the UNEG evaluation criteria

f) Evaluation Questions
53. Evaluation questions provide further details on each of the evaluation criteria and specify key
questions that need to be answered through the evaluation, in this way guiding data collection and
analysis.
54. A large number of questions tends to divert the evaluation to address a myriad of issues that do not necessarily provide the opportunity for the evaluation to come up with a set of findings that can be the basis for well-informed conclusions on the topic of the project. With the limited resources that DA project evaluations have, it is important to provide useful and evidence-based answers to a limited number of questions, rather than superficially addressing a wide range of questions. Project evaluations, therefore, should ideally be limited to a maximum of six to seven main evaluation questions, which together need to be able to address the purpose of the evaluation.

DA Requirements
• Evaluation questions need to be formulated under each of the evaluation criteria selected; doing so is the responsibility of the evaluation manager, supported by the IE evaluation section
• In situations where an IEC and/or ERG have been established, the questions need to be developed in participation with their members
• In order to focus the evaluation, questions should ideally be limited to a maximum of seven

8 In order to assess the extent to which a project or programme contributes to gender equality, the UN System Wide Action Plan on Gender Equality and the Empowerment of Women includes the Gender Results Effectiveness Scale, which identifies five different levels of results that can be of use in DA project evaluations: gender negative, gender blind, gender targeted, gender responsive and gender transformative (United Nations Evaluation Group, 2018). For a more comprehensive overview of possible approaches to address challenges in the evaluation of human rights and gender related cross-cutting issues, see United Nations Evaluation Group, 2011 and 2014, with the latter providing a more detailed approach to inclusion of human rights and gender equality in evaluation.
9 For the purposes of the DA, innovation is thought of as IEs either addressing new topics or using new means of delivering projects (or a combination thereof) that differ significantly from the topics and means of delivering projects (or part of them) that the IE has previously addressed or used for delivery.

Examples of Evaluation Questions to adapt to the specifics of the DA

Relevance
• To what extent was the project objective aligned with international conventions and inter-
governmental processes?
• To what extent does the project design respond to the needs of Member States?
• What adaptations were made to the design of the project during implementation and were these
justified in the context concerned?
• What lessons and good practices from previous DA projects were used to inform project design?
Efficiency
• What human, financial and in-kind resources were leveraged through contributions of partners?
• To what extent did the project achieve efficiency in implementation through the combination of
project stakeholders involved, making use of comparative advantages and the creation of synergy?
Effectiveness
• To what extent did the selection of participants for training programmes, workshops and study tours
contribute to results achieved?
• To what extent and in what ways have training, workshops and study tours contributed to learning of
participants?
• To what extent have participants been able to make use of learnings through training, workshops and
study tours and changed the way in which they conduct their work, in order to enhance results?
• What aspects of policy related change has the project contributed towards?
Sustainability
• Did project design include an approach to scaling up of results and how has this been implemented?
• To what extent and in which ways have national level UN and other national level development
organizations been involved in project implementation and what role can they be expected to play in
sustaining the results achieved through the project at country level?
Partnerships
• To what extent has partnering with other organizations enabled or enhanced reaching of results?
• Has partnering with other organizations resulted in reduction of overlap and increased efficiency?
The 2030 Agenda/ SDGs
• In what ways and to what extent has the project contributed to supporting the principle of leaving no
one behind in the sustainable development process?
• To what extent has the project contributed to reaching targets of selected SDGs?
Human Rights and Gender Equality
• To what extent was a rights-based and gender sensitive approach applied in the design and
implementation of the project, informed by relevant and tailored human rights and gender analysis?
• In what ways were results for disadvantaged and left behind groups included and prioritized in the
design and implementation of the project and were resources provided to enable this?
• To what extent has the project contributed to human rights and gender equality related objectives
and to SDG 5 on Gender Equality and gender objectives in other SDGs and were targets concerned
included in the project results framework?
Innovation
• What innovative aspects of the project (addressing new topics or using new means of delivery or a
combination thereof) proved successful?
• How can innovative aspects of the project that proved successful be scaled up and replicated with
funding from outside the DA?


55. An initial set of evaluation questions is developed as part of the development of the TOR for the DA
project evaluation, a process conducted under the responsibility of the evaluation manager. In order
for the evaluation questions and the results that they produce to be owned by the various project
stakeholders, it is important for the evaluation manager to engage key project stakeholders in the
development of the evaluation questions. In case an IEC and ERG are established, it will be important
to involve their members. Rather than each stakeholder adding the questions that they are most
interested in, a process that usually results in a large number of evaluation questions that are not well
prioritized, it will be useful to inform the process through virtual meeting(s) with key stakeholders to
develop a focused set of questions.

g) Methodology of the Evaluation


56. DA Project evaluations are end-of-project evaluations and in that respect they are summative in
character, aimed at assessment of the results of the intervention for accountability and learning
purposes. As such, they differ from formative evaluation, conducted during an intervention and aimed
at improvement of the intervention and its management.10
57. In their design, DA project evaluations will usually make use of a theory-based approach. A theory-
based approach assesses the extent to which an intervention has contributed to observed results
through the use of a theory of change or results framework, which outlines the causal relations
between activities and their results, while at the same time considering underlying assumptions and
risks in reaching results. This approach includes review of results achieved as well as the process
through which these have been accomplished. A theory-based approach involves the use of a non-
experimental design. This setup suits most of the DA projects, for which experimental and quasi-
experimental approaches are less suitable. In addition to a theory-based approach, the methodology
can make use of other approaches, including gender and human rights responsive evaluation.
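As a minimal sketch of what a theory-based design works with (the content below is hypothetical and not drawn from any actual DA project), a results chain links activities to outputs, outcomes and the objective, with the assumptions the evaluator would test attached to each causal link.

```python
# A hypothetical results chain for a theory-based evaluation design: the
# evaluator assesses contribution by testing each causal link and its
# underlying assumptions, rather than using an experimental comparison.
results_chain = {
    "activities": ["Regional workshops on compiling trade statistics"],
    "outputs": ["Officials trained in improved compilation methods"],
    "outcomes": ["National agencies apply the improved methods"],
    "objective": "Strengthened evidence base for trade policy",
}

# Assumptions attached to each link of the chain, to be checked against
# the evidence gathered during the evaluation.
assumptions = {
    "activities -> outputs": "Appropriate participants are nominated and attend",
    "outputs -> outcomes": "Trained officials remain in post with a mandate to apply new methods",
    "outcomes -> objective": "Policymakers use the improved statistics in decision-making",
}

for link, assumption in assumptions.items():
    print(f"Link to test: {link} | Assumption: {assumption}")
```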
58. DA project evaluations require a mixed methods approach, which enables the assessment of the issues
concerned from a variety of methodological perspectives. Moreover, multiple methods can be used
in a complementary way, each adding specific data and perspectives, which compensates for bias
related to the use of any single method. Application of multiple methods enables the use of
triangulation across methods. In addition to methods for data gathering, attention is required to
methods for data analysis (for details see annex 3).
59. DA project evaluations make use of a participatory approach, including stakeholders in all stages of
the evaluation process. The level of stakeholder participation needs to be made explicit as part of the
methodology, in particular which stakeholders to involve in which ways in the evaluation process, for
which use can be made of the stakeholder mapping. Stakeholder participation strengthens
accountability, enhances ownership of the evaluation process and its findings, heightens credibility of
the evaluation results and increases the likelihood of the implementation of the recommendations.
60. The methodology of the evaluation needs to detail ethical considerations that the evaluation and the
evaluator need to comply with, making use of the UNEG evaluation ethical guidelines.11 While this
applies to all evaluations, this is of particular concern when members of vulnerable and marginalized
groups are participating in the evaluation and the primary data gathering process.

10 Scriven, 1991.
11 United Nations Evaluation Group, March 2008.

61. If methodological details are not explicitly specified in the TOR, the evaluator can be required to develop (parts of) the methodology during the inception phase of the evaluation. If the latter is the case, the TOR will need to provide the parameters of the methodology, ensuring that the methodological approach and rigor concerned can result in a credible evaluation report in line with IE and DA needs and UNEG norms and standards12. Further details on the methodology of the evaluation will be discussed as part of the section on the Inception Phase of the evaluation process, with additional details in annex 3.

DA Requirements
• Provide details on methodology in line with the purpose and objectives and the specific characteristics of the DA project to be evaluated
• Make use of a theory-based approach, guided by the project results framework
• Make use of a mixed methods approach, including qualitative as well as quantitative data gathering and analysis
• Use a participatory approach to enhance opportunities for generating learning that is shared across stakeholders of the project

h) Organization of the Evaluation


62. This section of the TOR describes the organization of the evaluation and provides details on the roles,
responsibilities and lines of authority for all parties involved in the evaluation process. Implementation
arrangements are intended to clarify expectations, prevent ambiguities, and facilitate an efficient and
effective evaluation process.
63. Details need to include the roles and responsibilities of the evaluation manager, the evaluator, and if
applicable the roles and responsibilities of the team leader and member of the team, as well as the
members of the IEC and the ERG and any other relevant stakeholders. In the case of a joint evaluation,
with the participation of multiple IEs, responsibilities of each of the partners and management details
need to be specified, making explicit which elements of the evaluation are joint and which are agency
specific.
64. The organization of the evaluation needs to ensure the independence of the evaluation from the
management of the DA project, including that management of the evaluation cannot be the
responsibility of the DA project manager.
65. The TOR needs to detail the composition of the Evaluation team, including requirements in terms of
background, competencies and skills of team leader (and member if applicable). The type of evidence
(resumes, work samples, references) that will be expected to support claims of knowledge, skills and
experience of candidates for the position of evaluator, needs to be specified. The TOR needs to
explicitly require the independence of the evaluator, for her/him not to have been involved in any
part of the DA project, including its design and implementation as well as any advisory role to the
project that is the subject of the evaluation.
66. Other issues to consider include: lines of authority; processes and responsibilities for approval of
deliverables and other aspects of the implementation of the evaluation; and logistical considerations,
such as whether and what kind of support will be provided for travel arrangements, office space,
supplies, equipment and materials.
67. Details on the organization of the evaluation need to include a detailed Evaluation Workplan (see
Annex 2 for a Format for the Evaluation Workplan), with key issues concerning the timing of the main
phases of the evaluation in the main text and a detailed work plan in an annex. This needs to include
the timing of the various deliverables, including the draft and final Inception Report and the draft and
final Evaluation Report, with an outline of the contents of each in an annex. A realistic timeframe in the
TOR can prevent the need for adaptations during the DA project evaluation process.

12 United Nations Evaluation Group, 2016, 2017.
68. Any security considerations need to be detailed, specific to the region and countries in which the DA
project operates.
69. Details on the total evaluation budget need to be provided, as well as a breakdown of the budget
along DA financial categories. Budgetary details are meant for internal use only. Payment
arrangements for consultants need to be included.

DA Requirements

• Specify the roles of the evaluation manager, the evaluator and, if applicable, the composition and
roles of the Internal Evaluation Committee and the Evaluation Reference Group
• Provide details on the role of the DA project manager in the evaluation process, in line with the
requirements for independence of the evaluation
• Include specification of the workplan of the evaluation, including details on deliverables and their
timing

i) Annexes to the TOR


70. In addition to the annexes that are required, the TOR can include annexes with additional details on
the context, subject and methodology, beyond the specifics provided in the main text of the TOR.

DA Requirements

Annexes to the TOR need to include:


• DA project results framework
• Stakeholder map
• Documents to be consulted
• UNEG Ethical code of conduct
• Detailed evaluation workplan
• Format for Inception Report (see annex 4)
• Format for Evaluation Report (see annex 5)
• Format for Management Response (see annex 2)
Optional:
• Additional details to the context
• Additional details to the subject
• Additional details to the methodology


4. Inception Phase of the Evaluation


71. The inception phase is informed by the desk review of secondary materials and allows the evaluator
to review the details of all aspects of the evaluation process, as presented in the TOR. This process can
be reinforced through an inception mission (if feasible in the context of the evaluation concerned), in
which the evaluator discusses aspects of the setup of the evaluation with the evaluation manager (and,
if applicable, with members of the IEC and the ERG). Such a mission allows for initial data gathering
through interviews and meetings and is particularly helpful in more complex evaluation settings.
72. The evaluation manager provides support to all aspects of the inception phase, including the provision
of compiled relevant secondary data, communication with the independent evaluator, sharing of the
draft Inception Report with key stakeholders, collating comments and sharing these with the evaluator
and sharing the final Inception Report with key stakeholders. It is important for the evaluation
manager to support the setup of a meeting schedule, with inputs and support from the project
manager, for the data gathering phase of the evaluation.

4.1 Steps in the Inception Phase of the Evaluation


73. Several steps need to be taken in the inception phase, often specifying and working out details
provided in the TOR, and further operationalizing the implementation of the evaluation. Each of the
steps will be further detailed below, bearing in mind that the order of the steps can vary based on the
context and the specifics of the DA project concerned.

DA Requirements

• Start the desk review and inform the inception phase with the initial results of the desk review
• Acknowledge the purpose of the evaluation and the expected use and users of the evaluation
results as guiding principles to the evaluation process
• Analyse the context of the DA project
• Review and analyse the details on the DA project to be evaluated, including an assessment of the
results framework of the project
• Further specify the stakeholder mapping that is included in the TOR, or develop one when this
was not included, and conduct stakeholder analysis
• Assess the availability and reliability of relevant secondary data
• Review the scope and evaluation objectives and assess their relation to the purpose of the
evaluation
• Review the evaluation criteria and questions and their relation to the evaluation objectives
• Review the methodology for the evaluation as presented in the TOR and further develop and/or
specify it
• Prepare the evaluation matrix
• Prepare a detailed work plan for the implementation of the evaluation
• Develop and discuss the draft inception report with key stakeholders and prepare the final
inception report


a) Desk review
74. The desk review entails a review of relevant secondary resources to inform the inception phase as
well as the remainder of the evaluation process. An initial overview of relevant materials is usually
provided in the TOR, while the evaluator can further expand on this in the inception phase.

Secondary Information of Use in DA Project Evaluations

Project related data


• DA project concept note, project document, project annual reports, project financial information as
well as project monitoring details and project final report
• TORs of consultancy missions, reports of workshops/trainings, mission reports, presentations
• Minutes of meetings related to project implementation
• Knowledge products, including studies, notes, toolkits etc. developed with support of the DA project

Other relevant data


• Documentation on the issues that the project aims to address, including information on international
meetings and inter-governmental dialogue
• Sustainable Development Goal (SDG) related data including Voluntary National Reports of countries
included in the DA project
• Global level reports on the status of the topic of the DA project
• Relevant regional and country level documentation related to the topic of the project
• Country level national development strategies and plans related to the topic of the project
• Previous evaluations on the topic of the project and any existing baseline studies or other research
documentation concerned
• Regional consultation meetings in relation to the topic of the project
• Relevant human rights, gender equality, capacity and other assessments conducted on issues related
to the topic of the project in the region and countries concerned
• Relevant statistical data regarding the topic of the DA project

b) Purpose of the Evaluation


75. With the purpose of the evaluation guiding the objectives, which in turn guide the evaluation criteria
and questions, it is important for the evaluator, the evaluation manager and, if applicable, the IEC and
the ERG to have a shared understanding of the purpose of the evaluation and to acknowledge its
significance. Whenever changes in evaluation questions, criteria, scope or objectives are contemplated
during the inception phase, the changes concerned need to be in line with the purpose of the
evaluation.

DA Requirements

• The inception report should provide a clear understanding of the purpose of the evaluation and the
expected use of its results in the specific context of the DA project and be informed by the details
provided in the TOR


c) Contextual Analysis
76. The context describes information on the backdrop of the DA project, including the political,
programmatic and governance environment in which the project is situated and in which the
evaluation takes place. Specific to DA projects, the context includes a focus on the topic that the
project aims to address. In the inception report, the evaluator needs to demonstrate a clear
understanding of the context of the DA project and its importance in terms of the evaluation. This
should build on, but also go beyond, the description provided in the TOR.

DA Requirements

• The inception report should inform the operationalization of the evaluation with details concerning
the context of the evaluation and present relevant context details
77. It is important to make use of the details of the contextual analysis in other parts of the inception
report and to inform the operationalization of the evaluation with specifics of the context in which
the DA project has been implemented. This can, for example, concern country or site selection, or the
sampling of participants from trainings conducted, as part of the methodology of the evaluation.

d) Review the Subject of the Evaluation and the DA Project Results Framework
78. Building on the details provided in the TOR and informed by the desk review, the evaluator in the
inception phase needs to demonstrate a more thorough understanding of the DA project, the
objective it aimed to contribute towards achieving, as well as the ways in which it tried to accomplish
this. Such an understanding needs to be reflected in the description of the subject of the evaluation,
as part of the inception report, with references provided to secondary information consulted.
79. The review of the DA project as the subject of the evaluation needs to include a preliminary analysis.
This concerns in particular the results framework of the project, which can be analysed in terms of its
internal consistency, as well as whether the expected accomplishments/outcomes can be expected to
be achieved in the timeframe provided for the project and given the resources concerned. Moreover,
it is important to assess the extent to which the outputs, outcomes and the objective can be considered
to form a causal chain of results.

DA Requirements

• The subject of the evaluation needs to be described in sufficient detail in the inception report,
including an initial analysis of the results framework of the DA project, to inform the setup of the
evaluation

e) Stakeholder Mapping and Analysis


80. The stakeholder map included in the TOR needs to be further developed as part of the inception phase,
in order to inform the evaluation in terms of which stakeholders to include, as well as the way in which
and the extent to which to include each of them. Analysis of stakeholders can be conducted in a variety
of ways. One of these is to organize stakeholders in terms of their interest, i.e. the degree to which
they are expected to be affected by the project, and in terms of their influence, i.e. the influence that
they have over the project and the degree to which they can be expected to be able to support the
achievement of its objective. Different combinations of ‘interest’ and ‘influence’ will require different
types of engagement with the stakeholders concerned. In particular, stakeholders with high levels of
both ‘interest’ and ‘influence’ are those that the evaluation needs to fully engage with.

DA Requirements

• A detailed stakeholder mapping needs to be prepared in the inception phase, analysis of which is
indispensable in deciding which stakeholders to include, in what ways and to what extent, in the
evaluation process
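To make the ‘interest’/‘influence’ logic concrete, the sketch below is a purely hypothetical illustration
(not a DA requirement; the stakeholder names, scores and the 0.5 threshold are all invented) of how an
evaluator might classify stakeholders into engagement approaches:

# Hypothetical sketch of an interest/influence stakeholder analysis.
# Names, scores and the 0.5 threshold are invented for illustration only.

def engagement(interest: float, influence: float, threshold: float = 0.5) -> str:
    """Map interest and influence scores (0 to 1) to an engagement approach."""
    if interest >= threshold and influence >= threshold:
        return "fully engage (interviews, validation, possible ERG involvement)"
    if influence >= threshold:
        return "consult closely (targeted key informant interviews)"
    if interest >= threshold:
        return "keep informed and heard (surveys, group discussions)"
    return "monitor (desk review only)"

stakeholders = {
    "National statistics office": (0.9, 0.8),
    "Regional UN partner office": (0.3, 0.7),
    "Training participants": (0.8, 0.2),
}

for name, (interest, influence) in stakeholders.items():
    print(f"{name}: {engagement(interest, influence)}")

In practice the scoring is a judgment call made together with the evaluation manager; the value of the
exercise lies in making the engagement decision for each stakeholder explicit.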

f) Data Availability and Reliability


81. In order to inform the development of the methodology of the evaluation, there is a need to know
which kinds of data are available, expanding on any information provided in the TOR. This includes the
availability of secondary resources and project monitoring data, as well as baseline data on indicators
of the project’s results framework and other relevant indicators of change, and whether available data
are disaggregated for gender and other aspects of vulnerability and marginalization. In case baseline
data are missing, ways of reconstructing baseline information will need to be included in the setup of
the evaluation methodology. Moreover, data availability is dependent on the readiness of key
informants and groups of project participants to take part in the primary data gathering process of the
evaluation. In addition to the availability of data, it is important at this stage to start providing details
on the reliability of the data concerned, which will affect the use of the data in the evaluation process.

DA Requirements

• At the inception stage, the evaluator needs to get a clear overview of the secondary data available,
including baseline and additional data on indicators of the DA project results framework and the level
of disaggregation of available data. Moreover, opportunities for primary data gathering need to be
identified

g) Evaluation Scope and Objectives


82. The scope and objectives of the evaluation are established in the TOR. These need to be consistent
with one another and feasible within the time frame and the financial and human resources available
for the DA project evaluation. If their assessment in the inception phase highlights critical concerns,
these need to be discussed by the evaluator with the evaluation manager and, if applicable, with the
IEC and ERG. In case there is an inception mission as part of the evaluation, these issues should be
discussed at that stage, so that agreement on any adaptation of the scope and objectives of the
evaluation can be included in the draft and final inception reports, together with a justification of the
alterations concerned, compared to the TOR.

DA Requirements

• Evaluation scope and objectives need to be reviewed in the inception stage and any modifications
agreed with the evaluation manager and included in the Inception Report, with details on the rationale

h) Evaluation Criteria and Questions


83. As part of the inception phase, the evaluation criteria and questions need to be reviewed, in order to
ensure that the criteria reflect the evaluation objectives and that the evaluation questions are limited
in number and in turn sufficiently reflect the criteria used in the evaluation. This is informed by the
desk review and initial (often virtual) discussions with key stakeholders. Evaluation criteria and
questions need to be coherent and the answering of the questions needs to be feasible in terms of
the level of human and financial resources available for the evaluation. In order to focus the
evaluation, the number of evaluation questions should ideally be limited to a total of seven questions.
84. At this stage, the evaluator can propose adaptations to the evaluation questions. This could include,
for example, changes in the clustering of questions around evaluation criteria or reducing the number
of questions to a more realistic level, in line with the time frame and resources of the evaluation.
Changes need to be agreed by the evaluation manager (and if applicable the IEC and ERG), with
adaptations presented in the draft Inception Report, including adequate justification for the
adaptations proposed. For DA specific suggestions for evaluation questions for the criteria of
relevance, efficiency, effectiveness and sustainability, as well as partnership, the 2030 Agenda/SDGs,
human rights and gender equality, and innovation, see the box on page 14.
85. One way to review the evaluation criteria and questions is to identify the nature of the DA project
support. For example, for a project that focuses on the adoption of norms and standards and related
instruments, the evaluation is usually focused on process and governance issues in the adoption of
the norms and standards, as well as on the relevance of the norms, standards and instruments in the
context of the project. On the other hand, for a project that supports the government in incorporating
a particular norm, standard or instrument in national legislation, policies or development planning,
the evaluation would usually need to focus on the capacity changes of the agencies concerned and
the level of reflection of the norms and standards in legislation, policies and development
programmes. For a project that supports the application of laws, policies and plans that have
incorporated norms and standards, the evaluation would need to focus on the actual implementation,
including the capacities and processes concerned and, if feasible, the results for people, in particular
for vulnerable groups.

DA Requirements

• Evaluation criteria and questions need to be reviewed and finalized in the inception stage
• Criteria and questions need to be aligned with the evaluation objectives
• Evaluation questions should ideally be limited to a maximum of seven
86. The evaluation questions, once agreed among stakeholders in the inception phase, become a means
to guide the data gathering and analysis process and the preparation of the draft and final evaluation
report. An important tool that assists the evaluator in this respect is the evaluation matrix, which is a
way to operationalize each of the evaluation questions, so that data can be gathered and analysed to
come to meaningful statements on issues concerned. (See details below and annex 2 for a Format for
the Evaluation Matrix.)

i) Evaluation methodology
87. As part of the inception phase, the evaluation methodology needs to be reviewed by the evaluator
and if needed adapted and further developed in consultation with the evaluation manager, and if
applicable with the IEC and ERG. Discussions need to cover the appropriateness and feasibility of the
methodology, its ability to meet the evaluation purpose and objectives and to answer the evaluation
questions, taking into consideration limitations of budget, time and data sources. The methodology
needs to be finalized as part of the inception phase and agreed upon, with a comprehensive overview
of the methodological approach of the evaluation included in the final inception report.
88. As outlined in the TOR, DA project evaluations make use of a theory-based approach, guided by the
project results framework, to assess the output and outcome level changes achieved and the
contribution made by the project. A participatory approach is used in order to engage stakeholders in
the evaluation process, to enhance ownership of evaluation results and to allow for triangulation
across a variety of stakeholders and participants. For data gathering, there is a need to use mixed
methods, with an appropriate combination of qualitative and quantitative approaches, while bearing
in mind that some of the methods are more demanding and require specific capacities of the evaluator
concerned. An overview of methods to be used for data gathering and analysis is presented in annex 3.
89. The inception report needs to provide details on primary data gathering. The approach concerned will
depend on the set-up of the DA project and its implementation at country, regional or global level. In
particular when a limited number of countries are included, with substantial support at country level,
country visits may be useful. Otherwise, visits to regional and global offices of IEs would be a useful
approach for primary data gathering, supplemented by virtual interviews with selected stakeholders.
Participation of the evaluator in the final project workshop or training activity towards the end of the
project provides a useful opportunity to get first-hand experience of a project activity
and to gather data from participants. As part of the methodological details, selection of countries for
primary data gathering, sampling for quantitative and qualitative data gathering, and ways to address
limitations to the methodology need to be detailed, informed by the information presented in the
TOR.
90. Changes to the methodology of the TOR need to be identified and justified and agreed with the
evaluation manager and if applicable with the IEC and ERG during discussions on the draft inception
report. Tools to be used for data gathering, like lists of items for semi-structured interviews and
questionnaires for (mini-)surveys, need to be included in an annex to the inception report.
91. As part of the methodology, there is a need to make explicit how the methods included allow for the
assessment of human rights and gender equality related issues. This can, among other ways, be
achieved through interviewing stakeholders separately when there are differences in power, interest
or influence, including separately interviewing supervisors and staff, women and men, girls and boys,
as well as separately interviewing government stakeholders and ultimate project beneficiaries (as
relevant). In order to ensure sufficient attention to both policy and technical issues, it might be useful
to conduct separate interviews with the policy and technical staff concerned of the same agency.
92. In broader terms, the inclusion of human rights and gender equality can be achieved through the
inclusion of aspects of human rights and gender equality in the evaluation questions, the gathering
and analysis of sex disaggregated data13 from, in particular, vulnerable and marginalized groups, the
use of appropriate methods in ways that respect the rights of participants in the evaluation, and the
use of a gender balanced evaluation team, with the inclusion of capacities on human rights and gender
in the requirements for the evaluator.

DA Requirements

• Make use of a theory-based approach, guided by the project results framework, including assessment
of results and the process through which these were achieved
• Use a participatory approach to enhance opportunities for generating learning that is shared across
stakeholders of the project
• Make use of a mixed methods approach, including qualitative as well as quantitative data gathering
and analysis
• Combine output level assessment with contribution analysis in the review of outcome level changes
of a DA project
• Changes in methodology from the TOR need to be justified in the Inception Report
• Make explicit how human rights and gender equality related data will be gathered and analysed

Inclusion of Gender Equality and Human Rights

• Evaluation objective and scope include human rights and gender equality related issues
• Evaluation criteria and questions specify how to assess human rights and gender equality related issues
• Inclusion of a human rights and gender responsive methodology, data gathering tools and analysis
• The evaluator is qualified with respect to human rights and gender issues
• Evaluation findings, conclusions and recommendations reflect a gender analysis
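As a minimal sketch of what the analysis of sex disaggregated data can look like in practice (a
hypothetical example, not a DA requirement; the file name, column names and rating scale are
invented), survey responses could be summarized per group so that differences between groups, and
small group sizes, become visible:

# Minimal sketch: disaggregating hypothetical survey data by sex.
# The file name, column names and rating scale are invented for illustration.
import pandas as pd

responses = pd.read_csv("survey_responses.csv")  # e.g. columns: respondent_id, sex, rating

# Mean rating and number of respondents per group; reporting group sizes
# guards against over-interpreting results from very small groups.
summary = responses.groupby("sex")["rating"].agg(["count", "mean"])
print(summary)

As footnote 13 notes, such disaggregation is a starting point rather than a full gender analysis.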

13 Disaggregated data by sex are important but usually not sufficient to show aspects of gender equality, which also needs
to take into account the social positions of the participants concerned and aspects of their representation.

j) Evaluation Matrix
93. Preparation of the evaluation matrix is an important part of the inception phase. The process of
preparing the evaluation matrix needs to be guided by the evaluation criteria and questions. The
evaluation matrix details for each of the individual evaluation questions a number of assumptions,
which need to be assessed in order to get answers to the questions. For each of these assumptions,
indicators should be identified that can provide information on the assumption and for which sources
of information and methods and tools for data collection and analysis need to be identified. Each of
the evaluation questions can thus be broken down into smaller parts on which data can be gathered. In
this way the matrix can guide data gathering. A format for the evaluation matrix is provided in annex 2.
94. The evaluation matrix needs to be developed by the evaluator during the inception phase, informed
by the desk review. In addition to guiding data gathering, the evaluation matrix is an important means
to guide data analysis and reporting. The matrix can be used during the data gathering process to
record the information collected against each of the assumptions and indicators identified in the
matrix. When this is done in a consistent way, the evaluation matrix becomes an important tool for
data analysis and for the writing of the evaluation report, as it brings together data on each of the
evaluation criteria and questions. The same approach can, moreover, be used as a checkpoint to
ensure that the data and information gathered cover all aspects of the evaluation, across all of the
evaluation questions, assumptions and indicators in the evaluation matrix.

DA Requirements

• The development of an evaluation matrix provides a means to operationalize the evaluation questions
and is required in the inception phase. The matrix is an important means to guide the process of data
gathering, data analysis and the preparation of the project evaluation report
95. The indicators in the evaluation matrix should ideally include the indicators of the project results
framework, as well as a number of other relevant indicators.
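Purely as an illustration of how one evaluation question can be operationalized in the matrix (the
question, assumption, indicators and sources below are invented and would differ for each DA project),
a single row block might be structured as follows, with a field reserved for recording evidence during
data gathering, as described in paragraph 94:

# Hypothetical sketch of one row block of an evaluation matrix.
# The question, assumption, indicators and sources are invented examples.
evaluation_matrix = [
    {
        "question": "To what extent did the project strengthen national capacities?",
        "assumption": "Trained staff apply the skills acquired in project trainings",
        "indicators": [
            "share of trained staff reporting use of the skills",
            "number of outputs produced using the new methods",
        ],
        "sources": ["training reports", "interviews with trained staff"],
        "collection": ["desk review", "semi-structured interviews"],
        "analysis": ["descriptive statistics", "qualitative content analysis"],
        "evidence": [],  # to be filled in during the data gathering phase
    },
]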

k) Evaluation Workplan
96. In the inception phase, a clear and detailed evaluation work plan needs to be developed, including
details of the activities to be performed and dates for the various stages of the evaluation process, i.e.
the inception phase, the data gathering phase, the analysis and reporting phase and the follow-up
phase of the evaluation. With the latter phase in particular aimed at supporting the use of the
evaluation results, attention needs to be paid to evaluation use from the start of the evaluation
process. The workplan needs to include dates for the draft and final versions of the inception and
evaluation reports and any other deliverables that are part of the assignment of the external evaluator.
See annex 2 for a format.

DA Requirements

• The inception report needs to include a detailed workplan, covering all phases of the evaluation
process and specifying dates for the draft and final deliverables of the evaluation

l) Preparing the Draft and Final Inception Report


97. The evaluator is responsible for the preparation of the draft inception report and for providing it to
the evaluation manager, who distributes it to relevant stakeholders and, if applicable, to the members
of the IEC and the ERG for comments. Based on discussion of the contents of the draft inception report
and the consolidated comments from all key stakeholders, usually prepared by the evaluation
manager, the evaluator prepares the final inception report. It is useful to provide an overview of the
comments and the responses of the evaluator to each of these as part of the finalization of the
inception report.
98. The presentation of the inception report at the start of the data gathering phase of the evaluation
provides an opportunity to get all key stakeholders together at the beginning of the evaluation and to
engage them in the process. It also provides an opportunity to discuss the details of the inception
report and for the evaluator to address any comments. The final inception report needs to be
completed before the start of primary data gathering.

DA Requirements

• The draft inception report needs to be discussed with the evaluation manager and key stakeholders
and, if applicable, the members of the IEC and ERG, whose comments, compiled by the evaluation
manager, need to be responded to in the preparation of the final inception report
99. Each of the steps taken in the inception phase provides inputs to the development of the Inception
Report, making use of and building on the details provided in the TOR. The inception report is a means
for the evaluator to show a thorough understanding of the purpose of the evaluation, the DA project
as the subject of the evaluation and the context, and to provide methodological and organizational
details on how the purpose of the evaluation will be fulfilled. A detailed overview of the DA
requirements for the project inception report is presented in annex 4.


5. Data Gathering Phase of the Evaluation


100. The data gathering phase of the evaluation includes gathering of primary and additional secondary
data at relevant country, sub-regional, regional and global levels. DA evaluations should include face-
to-face interactions with stakeholders. This could, if relevant and timely, be achieved through
participation of the evaluator in the final workshop or meeting, which is often part of the activities
organized in the final stages of a DA project (typically a regional workshop). Attending the final
workshop or meeting can also be an efficient way to establish contact with a number of key
stakeholders who can be followed up with during the course of the evaluation.
101. DA project evaluations can further include primary data gathering through a possible visit to one or
more countries that were involved in project implementation, in order to gather first-hand
information on results achieved, constraints faced and the ways in which challenges were addressed.
Primary data gathering should include the stakeholders identified in the inception phase through the
stakeholder analysis. This will usually also involve visit(s) to the IEs responsible for the project,
including the project manager. Government and non-governmental partners as well as participants
in project activities will usually be included in data gathering, making use of the methods and tools
presented in the inception report. Efforts should further be made to gather data from stakeholders
who did not participate in the final workshop or meeting, as well as those who did not directly
participate in the project but can bring important perspectives in terms of project achievements,
processes and context.
102. Though the data gathering phase is meant to be implemented in line with the final inception report,
there is a need to remain flexible throughout the implementation of the evaluation. Even with a high
quality TOR and inception report, unexpected issues may occur and will need to be addressed. When
such events occur, it is useful for the evaluator to communicate and discuss with the evaluation
manager, who if applicable could get back to the IEC and, in certain cases, the ERG, in order to get all
stakeholders to agree on how to move forward. In practice, adaptations will usually be minor, with
most of the issues addressed in the preparatory and inception phases of the evaluation.

DA Requirements

• The data gathering phase needs to be conducted in line with the details provided in the inception
report
• Primary data gathering could include possible country visit(s), enabling data gathering on results and
how these were achieved from the perspective of stakeholders who participated as well as of those
who did not participate directly in DA project events, in addition to possible participation of the
evaluator in a final project workshop or meeting
• Any constraints in the implementation of the evaluation need to be discussed with the evaluation
manager and, if needed and applicable, with the IEC and ERG
• Preliminary results of the evaluation need to be validated with key stakeholders, with the feedback
received to be used for the preparation of the draft evaluation report
103. At the end of the data gathering phase, validation of the preliminary results with key stakeholders is
required. This is an important way to get feedback from stakeholders on the initial analysis of the
evaluation findings and provides useful inputs to the preparation of the draft evaluation report.


6. Analysis and Reporting Phase of the Evaluation


104. With much of the detail of the evaluation process already discussed as part of the TOR and the
Inception Report, the consideration of the analysis and reporting phase will focus on those aspects
that are new, including findings, conclusions, lessons learned/good practices, recommendations, the
executive summary and aspects of the writing of the draft and final evaluation report.14
105. Management of the evaluation at this stage is about the review of the draft evaluation report in terms
of compliance with DA requirements by the evaluation manager, and application of the quality
assurance process of the concerned IE. The draft of the evaluation report then needs to be shared
with key stakeholders, with the evaluation manager compiling comments and sending them to the
evaluator, to inform the preparation of the final evaluation report.

6.1 Findings
106. Findings are factual statements that respond to the evaluation questions or parts thereof and are
informed by evidence and data provided in the evaluation report. The findings respond to the scope
and objective of the evaluation. They reflect appropriate analysis of data resulting in statements that
start answering parts of the evaluation questions. Findings need to include a focus on results, ways in
which these have been achieved and the contributions of the project. Reasons for accomplishments
as well as for lack of progress need to be provided as much as possible.
107. Gaps and limitations in data need to be identified and discussed as well as how these were addressed
and in what way they affected the findings. In addition to expected changes, as included in the results
framework of the project, findings need to pay attention to unexpected changes, which can be positive
as well as negative in relation to the project objective. Findings should cover all the evaluation criteria
and questions included in the evaluation. Specific attention needs to be paid to human rights and
gender equality related issues, in line with their importance in terms of evaluation criteria and
questions.
108. Findings need to make note of adaptations made to the design of the project and its results framework
and indicators, something which might occur in particular during the first year of DA project
implementation. This is especially important for projects where the IE has been requested by the DA
PMT to provide country specific action plans and indicators as part of the first annual project progress
report(s). The evaluation needs to consider these adaptations and base the assessment of project
results on the revised results framework, including changes to the indicators.

DA Requirements

• Findings need to cover all evaluation criteria and respond to all of the evaluation questions. They have
to be informed by evidence provided in the evaluation report
• DA project evaluation findings need to include any adaptations made to the DA project design after
approval of the project document and during project implementation, especially when country specific
action plans and indicators have been requested to be provided as part of the first annual project
progress report(s)
• Attention needs to be paid to unexpected change(s), in addition to planned results

14 The OIOS Evaluation Quality Assessment Framework includes details on the requirements of the evaluation report,
including details on background, methodology, findings, conclusions and lessons learned, recommendations, gender and
human rights, and report structure. A comprehensive overview of UNEG requirements for evaluation reports is provided in:
United Nations Evaluation Group, 2010.

6.2 Conclusions
109. Conclusions point out the factors of success and failure of the evaluated intervention, with special
attention paid to the intended and unintended results, and more generally to any other strength or
weakness; they draw on the data collection and analysis undertaken, through a transparent chain of
arguments.15 Conclusions need to be grounded in the analysis of the findings and articulate statements
at the level of the evaluation questions and the evaluation criteria. They can also refer to issues across
evaluation criteria or to cross-cutting issues, like human rights and gender equality related issues.
Conclusions include strengths and weaknesses of the project that is evaluated, taking into account the
viewpoints of a variety of stakeholders, and need to be grounded in the evidence provided in the
report. Conclusions provide insights into the issues that the project aims to address, identify solutions
to important problems faced and pinpoint ways in which what appears to work well in organizations
can be reinforced. Conclusions need to provide linkages to the recommendations.

DA Requirements

• Conclusions need to build on the analysis of findings and provide insights into the topic that the DA
project addresses
• They need to add value rather than being a summary of the findings and provide a connection to the
recommendations made

6.3 Lessons Learned / Good Practices


110. Lessons learned are generalizations based on evaluation experiences with projects, programmes, or
policies that abstract from the specific circumstances of the project to broader situations. Frequently,
lessons highlight strengths or weaknesses in preparation, design, and implementation that affect
performance, outcome, and impact.16 Good practices are ‘well documented and assessed
programming practices that provide evidence of success/impact and which are valuable for
replication, scaling up and further study. They are generally based on similar experiences from
different countries and contexts’.17
111. DA project evaluations are encouraged to include the identification of lessons learned and good
practices, which can be of use beyond the context of the DA project that is being evaluated. Both are
important means for learning and crucial in understanding achievements of results, as well as how
learnings can be applied beyond the contexts in which they have been developed. In this way, lessons
learned and good practices differ from findings, conclusions and recommendations, which are specific
to the context of the DA project under evaluation.
112. The identification of both lessons learned and good practices usually results from qualitative inquiry
as part of the evaluation process, asking why certain results happened and what can be learned from
it, why things worked well and whether these results could also be achieved in other contexts. It is
important to involve key stakeholders in the identification of lessons learned and good practices.
When such detection is done early on in the evaluation process, the evaluator can review these during
the data gathering phase of the evaluation. As learnings often result from trial and error, the evaluator
should also probe on what did not work initially, as this can be an important part of the learning
process behind lessons learned and good practices.18
113. In order for the lessons to be of relevance to users outside the project context, lessons learned and
good practices need to be sufficiently detailed and substantiated, including details of how they were
learned and the evidence that they are based upon. In order to ensure enough attention to the details
of each lesson, they should be limited to a maximum of five. Lessons learned and good practices
provide ways for IEs to integrate the learnings of project evaluations into their on-going and future
programmes. For a format see annex 2.

DA Requirements

• Lessons learned and good practices need to be identified and included in a separate section of the DA
Project Evaluation report
• The lessons and practices need to be applicable beyond the context in which they were learned and be
sufficiently substantiated to be of use to stakeholders outside the project context
• Lessons and practices should ideally be limited to a maximum of five

15 OECD Development Assistance Committee, 2002.
16 Ibid.
17 UNICEF, not dated.
18 United Nations Evaluation Group, 2013.

6.4 Recommendations
114. Recommendations are defined as proposals aimed at enhancing the effectiveness, quality, or
efficiency of a development intervention; at redesigning the objective; and/or at the reallocation of
resources.19 Recommendations should be linked to the evaluation findings and conclusions and need
to be relevant to the purpose and the objective of the evaluation. They need to address the main issues
identified in the evaluation and be informed by the evidence provided in the evaluation report. They
need to be developed with the participation of key stakeholders, with the process of development of
the recommendations made explicit in the report. Each recommendation needs to identify the target
group concerned, include priorities for action and show awareness of the potential constraints in terms
of follow-up for the parties concerned. Recommendations should address aspects of the sustainability
of the results supported and achieved by the project. Possible foci for recommendations, taking into
consideration the particular characteristics of the DA, are presented in the box below.
115. Recommendations need to be pitched at the right level. With DA projects usually not having a second
phase, they should be focused on other ways in which the IE and other relevant stakeholders can make
use of the achievements of the DA project and the learnings on the topic concerned. Recommendations
should not be pitched at too high a level, such as at management aspects of the IE concerned or of the
DA PMT, as the evaluation of a single project would usually provide an insufficient evidence base for
such suggestions. They should, moreover, not be over-ambitious, but realistic in their approach and
time frame.
116. Rather than having a myriad of small and unrelated recommendations, their number should be limited
to a maximum of five to seven key recommendations, which need to be further elaborated in terms of
the opportunities for their implementation.

DA Requirements

• Recommendations should focus on the use of the results and learnings from the project by the IE and
other relevant stakeholders to further develop the topic that the project addressed, especially as DA
projects usually don’t have a second phase
• Recommendations need to be limited in number to a maximum of 5 to 7 and be sufficiently specific
and feasible, providing details on implementation, including potential constraints

19 OECD Development Assistance Committee, 2002.

Examples of Possible Focus for Recommendations of DA Project Evaluations

• Ways in which the IE(s) can make use of the results and learnings of the project in the wider programme of which
the project was a part and in supporting the achievement of the SDGs to which the project contributed
• How the results framework and the wider theory of change developed or used in DA project implementation can
be fine-tuned based on learnings of the DA project
• Ways in which results and learnings of the DA project can be used to inform other countries / (sub-)regions through
south-south learning or otherwise
• Ways in which some of the capacities built through the project at individual and organizational level can be further
institutionalized, like the use of training curricula and toolkits developed
• Ways in which capacities developed of IE(s) and partners can be used to further address the issues concerned
• Identification of funding opportunities for continued support to the topic that the DA project addressed
• Ways in which coordination mechanisms supported during the DA project can be continued and enhanced after
project termination
• Ways in which aspects of the topic addressed through the DA project can be monitored after project termination in
order to generate evidence to inform policy and other decision-making processes
• Ways in which partnerships and networks supported in the implementation of the DA project can continue to
function or be further developed
• Ways in which gender related results can further contribute to gender equality
• Options for follow-up on the introduction of innovative practices by project implementing partners, as well as other
government partners, UN resident agencies, civil society organizations, private sector actors, universities and other
relevant stakeholders

6.5 The Executive Summary of the Evaluation Report


117. The objective of the executive summary is to inform the reader of the key aspects of the evaluation.
In particular, it is meant to enable policy- and decision-makers to learn the main findings, conclusions,
lessons learned and recommendations of the evaluation through reading the summary. To enable this,
the summary needs to be a stand-alone document, which can be understood without reading any
other part of the evaluation report. The summary therefore needs to provide details on the DA project,
the background of the evaluation, as well as its purpose, objectives and scope and the intended users.
Details on methodology will enhance the credibility of the presentation of the key findings, conclusions
and recommendations. The length of the summary should be limited to a maximum of three pages.

DA Requirements

• The summary needs to be a stand-alone document, aimed in particular at policy and decision makers,
in order to enable them to take note of the results of the evaluation

6.6 The writing of the Evaluation Report


118. The writing of the evaluation report is a complex task, as it needs to bring together all of the primary
and secondary data and information gathered, and the analysis conducted, over the entire period of
the evaluation process in a single concise and coherent document. In particular, there is a need for
close analytical relations between the findings, conclusions and recommendations, as well as the
learning identified. For an overview of these key components of the evaluation report, see table 3
below. Usually, several drafts of the evaluation report are needed in order to arrive at a report that is
of the required quality for all stakeholders. A detailed outline of the required elements of the DA
project evaluation report is provided in annex 5.


Table 3: Description of Key Components of the Evaluation Report

Findings: Factual statements based on evidence provided in the evaluation report that answer part of the
evaluation questions

Conclusions: Synthesis of evaluation findings that delivers concluding statements providing answers to
evaluation questions and criteria and/or to cross-cutting issues and other issues across evaluation criteria

Lessons learned / Good practices: Learning, in particular on the way in which results were achieved, which
is considered applicable beyond the context of the specific DA project in which it was developed

Recommendations: Advisory statements concerning actions to be taken by stakeholders of the project,
including timeframes and prioritization

119. The use of the evaluation matrix can be a great help in the preparation of the evaluation report, in
particular when details of the data gathered are included in an added column to the matrix, so that all
data get organized in line with the evaluation criteria and questions. In order to enhance the
accessibility of the DA project evaluation reports, the main text of the report (excluding annexes)
should be limited to 40 pages.
120. The draft evaluation report needs to be shared with relevant stakeholders, and if applicable with the
members of the IEC and the ERG, in order to obtain their inputs. Comments of the various parties on
the evaluation report should focus on factual imprecisions and inconsistencies in the report. Comments
provided need to respect the independence of the evaluation and the evaluator in terms of the
conclusions drawn from the findings identified in the evaluation report. Inclusion of stakeholders in all
stages of the evaluation process, including the review of the draft evaluation report, will enhance the
ownership of the evaluation results and heighten the likelihood that the recommendations will be used.

DA Requirements

• The DA project evaluation report needs to provide the details on the results of the evaluation, including
findings, conclusions, lessons learned / good practices and recommendations, and be a maximum of
40 pages, excluding annexes
• In addition to the results of the evaluation, the final report of the DA project evaluation needs to
include details on the evaluation subject, the context, the purpose of the evaluation and its scope, the
evaluation objectives, criteria and questions, and details on the methodology used in data gathering
and analysis


7. Follow-Up Phase of the Evaluation


121. As identified at the start of the evaluation process, the finalization of the evaluation report needs to
be followed by a follow-up phase. The final evaluation report needs to be shared with the DA PMT,
through the DA focal point in the IE, and with project partners. This phase also includes the
development of a management response.

7.1 Sharing of evaluation results


122. Follow-up to the evaluation process includes the wider dissemination of the evaluation results, in
particular the lessons learned and good practices identified. Entities should publish evaluations as per
their disclosure policies and practices. Further information on the sharing and use of evaluation results
can be found in the UN Development Account Evaluation Framework.

7.2 Management response


123. The evaluation manager is responsible for the overall coordination of the management response
process. The DA project manager is responsible for developing the management response to the
recommendations aimed at the IE. The IE does not necessarily need to agree with each and every
recommendation, but needs to provide a rationale for rejection or indicate why a response would not
be required. Monitoring and follow-up to the implementation of the recommendations need to be
included in the regular follow-up system that IEs have in place as part of their own evaluation function.
124. The evaluation manager further needs to ensure that recommendations aimed at other relevant
stakeholders are conveyed to them. These stakeholders will need to provide management response(s)
in order to address the issues raised. This can include an overall response to the evaluation as well as
a response to each of the recommendations directed at them (see annex 2 for a template). The
management response includes whether the party concerned agrees with the recommendations and
how they will address the issues concerned and manage the implementation of the response. Follow-up
to the implementation of recommendations aimed at other stakeholders is at the discretion of the IE
concerned.

DA Requirements

• IEs are responsible for the communication of evaluation results to project partners, including
recommendations, and for the dissemination of lessons learned and good practices
• Entities should publish evaluations as per their disclosure policies and practices
• IEs and project partners need to develop a management response and follow up on its implementation


Annex 1 - DA Requirements for the Terms of Reference of Project Evaluations

No | TOR Section | Contents

1 Introduction (1 page)
• Title of the DA project evaluation
• Very short description of the DA project to be evaluated
• Short reason for the evaluation (will be expanded under 2 below)
• Short introduction of the context (will be expanded on under 3 below)
• Timing of the evaluation

2 Evaluation purpose (0.5 page)
• Clear rationale for the evaluation, in the context of the DA, including expected users and expected use
by each of the users of the evaluation results
• The kind of recommendations that are expected from the evaluation
• Decision-making processes that the evaluation results need to feed into

3 Context of the evaluation (1.5 pages)
• Introduction to the topic of the project to be evaluated and relevant developments concerned, at
relevant global, regional and country levels
• Details on policies, plans and programmes of government and other organizations on the topic
concerned and relevant support provided by the UN, other development partners and stakeholders

4 Subject of the evaluation (1.5 pages)
• The DA project, its objective, outcomes and outputs and how it tries to realize these
• Coverage in terms of countries / regions and time frame concerned
• Implementing partners, including government, other IEs, other UN agencies, other development
partners at country/(sub-)regional/global levels
• Other stakeholders and their interest in the project and the evaluation
• Human and financial resources available for project implementation
• Past evaluations / assessments / studies including gender analysis and vulnerability assessments

5 Evaluation scope, objectives, criteria and questions (2 pages)
• What the evaluation will cover of the subject of the evaluation in terms of project components and
outputs, geographical area, time frame and otherwise
• What parts of the subject the evaluation will not cover and the rationale concerned
• Objectives of the evaluation, i.e. what the evaluation will accomplish, what evaluation criteria will be
covered and the rationale concerned, the need to identify lessons learned
• Evaluation questions, organized by evaluation criteria, with the number of questions limited to a
maximum of seven

6 Methodology of the evaluation (1.5 pages)
• Methodological approach and rationale, or inclusion of the requirement for the evaluator to (further)
develop the methodology in the inception phase
• Required at this stage:
  o Use of a theory-driven approach in terms of results achieved as well as the process concerned,
    making use of the project results framework
  o Data availability, including annual and terminal project reports, project monitoring data and other
    relevant secondary data related to the project and the topic that it addresses
  o Application of a human rights-based and gender sensitive approach with disaggregation of data by
    gender and other relevant vulnerability criteria
  o Ethical considerations and reference to UNEG ethical guidelines
• Optional at this stage:
  o Specification of methods for data gathering and methods for data analysis
  o Identification of primary data gathering and rationale for country selection
  o Sampling of respondents for quantitative and qualitative data gathering
  o Limitations to the methodology and ways to mitigate these


No TOR Section Contents


7 Organization of the evaluation (1.5 pages)
• Evaluation process and work plan
• Management issues, including roles and responsibilities of the IE, evaluation manager, evaluator and, if applicable, of the members of the IEC and ERG
• Evaluation deliverables and their timing, i.e. draft and final Inception Report, draft and final Evaluation Report and other knowledge products as required
• Quality assurance process used, including quality assessment of deliverables
• Security considerations
• Evaluation team composition, requirements and competencies

8 Evaluation budget (for internal use only) (0.5 pages)
• Total evaluation budget
• Breakdown by DA budget categories

9 Annexes
Required:
• DA project results framework
• Stakeholder map
• Documents to be consulted
• UNEG Ethical code of conduct [20]
• Detailed evaluation workplan
• Format for Inception Report
• Format for Evaluation Report
• Format for Management Response
Optional:
• Additional details to the context
• Additional details to the subject
• Additional details to the methodology

[20] UNEG, 2008.

Annex 2 - Formats for use in DA Project Evaluation


Format for Stakeholder Mapping

Columns:
• Stakeholder
• Stake in the project and the topic that the project addresses
• Level of influence over topic and project / ways in which affected by topic and project
• Expected use of the evaluation results
• Way(s) to involve this stakeholder in the evaluation process

Format for the Evaluation Matrix

Columns (one block of rows per evaluation question):
• Assumption to be assessed
• Indicator(s)
• Sources of information
• Methods and tools for data collection
• Methods and tools for data analysis

Rows, grouped per question:
Evaluation Question 1:
Evaluation Question 2:
Evaluation Question 3:
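
Where useful, the evaluation matrix can also be kept as structured data, so that rows can be filtered by question during data gathering and the annexed matrix regenerated whenever questions or indicators are revised. The following is a minimal sketch in Python, offered as an illustration only: the field names mirror the matrix columns above, while the sample entry and the output file name are hypothetical assumptions, not part of the DA format.

```python
# Minimal sketch: the evaluation matrix as structured rows exported to CSV.
# Field names mirror the matrix columns above; the sample row is hypothetical.
import csv
from dataclasses import asdict, dataclass

@dataclass
class MatrixRow:
    question: str            # evaluation question the row belongs to
    assumption: str          # assumption to be assessed
    indicators: str          # indicator(s)
    sources: str             # sources of information
    collection_methods: str  # methods and tools for data collection
    analysis_methods: str    # methods and tools for data analysis

rows = [
    MatrixRow(
        question="EQ1: To what extent did the project strengthen capacity?",
        assumption="Trained staff apply new skills in their institutions",
        indicators="Share of trainees reporting application of skills",
        sources="Mini-survey; key informant interviews",
        collection_methods="Online survey; semi-structured interviews",
        analysis_methods="Descriptive statistics; qualitative content analysis",
    ),
]

with open("evaluation_matrix.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=list(asdict(rows[0])))
    writer.writeheader()
    writer.writerows(asdict(r) for r in rows)
```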


Format for the Evaluation Workplan

Columns: Specific Activities/Milestones/Deliverables for each Phase of the Evaluation; Dates

1. Preparatory Phase

2. Inception Phase

3. Data Gathering Phase

4. Analysis and Reporting Phase

5. Follow-Up Phase

Format for the Presentation of Lessons Learned and Good Practices

Columns: Issue; Details. The Issue column contains the following rows:
• Lesson learned / good practice in short
• More detailed description of the lesson learned / good practice
• Context in which the learnings were obtained and relevant contextual details concerned
• Details on the lesson/practice and the way in which it was learned, including available evidence

Format for the Management Response

Columns:
• No.
• Recommendation
• Acceptance (rationale if rejected)
• Priority
• Party responsible
• Action description
• Time frame
• Status of progress
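
Because the management response is followed up over time, the table can likewise be kept as structured records so that overdue actions are flagged automatically. The sketch below is illustrative only: the field names mirror the columns above, but the status and priority vocabularies, sample entry and dates are hypothetical assumptions, not DA-prescribed values.

```python
# Minimal sketch: the management response as structured records, flagging
# accepted but uncompleted actions whose time frame has passed. Sample values
# and the status/priority vocabularies are hypothetical.
from dataclasses import dataclass
from datetime import date

@dataclass
class ResponseRow:
    no: int                  # recommendation number
    recommendation: str
    acceptance: str          # "accepted", or rationale if rejected
    priority: str            # e.g. "high" / "medium" / "low"
    party_responsible: str
    action_description: str
    time_frame: date         # target date for completing the action
    status_of_progress: str  # e.g. "not started" / "in progress" / "completed"

def overdue(rows: list[ResponseRow], today: date) -> list[ResponseRow]:
    """Return accepted, uncompleted actions past their time frame."""
    return [r for r in rows
            if r.acceptance == "accepted"
            and r.status_of_progress != "completed"
            and r.time_frame < today]

if __name__ == "__main__":
    rows = [
        ResponseRow(1, "Strengthen results monitoring", "accepted", "high",
                    "Project team", "Revise the monitoring plan",
                    date(2020, 6, 30), "in progress"),
    ]
    for r in overdue(rows, date(2020, 9, 1)):
        print(f"Overdue: recommendation {r.no} ({r.party_responsible})")
```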


Annex 3 - Details on Methods for Data Gathering and Analysis in DA Project Evaluations

Each method is presented below with a description and notes on its use.

Data Gathering

1. Desk review
Description: Review of secondary information pertaining to the project (including the project document, monitoring data, project reports and financial data) and information on the topic concerned, including lessons from earlier evaluations and reviews of projects on the same topic.
Use: Needed to understand the context of the project and the stakeholders and roles concerned; yields, in particular, information on the evaluation criteria of relevance and efficiency, while monitoring reports can include useful data on results, which would need to be triangulated with other sources.

2. Key informant interviews
Description: Interviews of selected stakeholders making use of a list of topics to discuss, either face-to-face or through telephone or Skype, with the topics for discussion informed by the desk review.
Use: An effective means of collecting in-depth qualitative information, including on what worked, what did not and the reasons concerned; face-to-face interviews are usually more effective; setting up interviews takes a lot of time and is usually managed by the evaluation manager rather than the evaluator.

3. Focus group discussions
Description: Discussion-type interviews, for several hours, with a group of about 6 to 10 stakeholders with similar backgrounds and hierarchy; allows for discussion of a set of issues across the participants of the group; full participation of all group members needs to be ensured; topics for discussion informed by the desk review.
Use: An effective means of collecting in-depth qualitative information, including on what worked, what did not and the reasons concerned; usually does not work with different hierarchical status among group members; ideally two evaluators are needed, one to facilitate the discussion and one to take detailed notes.

4. Mini-survey
Description: Self-administered short online or e-mailed survey of a substantial number of stakeholders or participants, at low cost and with possible wide reach; substantial non-response can affect the ability to analyse the data quantitatively; short surveys with up to 10 questions can reduce non-response rates.
Use: Used to provide quantitative data and information; online software can usually provide key statistics; the inclusion of open-ended questions can add some qualitative data, which can counteract the risk of a substantial non-response rate and the related limitations in the quantitative analysis of the data; development of different questionnaires for different stakeholder groups is usually required, with some overlap to enhance comparability (a minimal response-rate and disaggregation sketch follows this list of data gathering methods).

5. Direct observation
Description: Observing the process of project implementation, often through participation in a workshop, training event, meeting or conference; preparing aspects for guided observation can enhance data gathering.
Use: Can be used to get a better understanding of the workshop, training event, meeting or conference; useful for assessing aspects of the participation of the various stakeholders in the events concerned; can be combined with key informant interviews and focus group discussions with participants.

6. Stakeholder mapping
Description: Identifying stakeholders, their roles and relationships, as well as the drivers of, and those possibly opposed to, the change concerned.
Use: Useful in particular when many stakeholders are involved in a project; visual mapping can be useful to show the inter-relationships between stakeholders; to be included as part of the TOR so that it can be specified further in the inception phase and beyond.

7. Case studies
Description: An in-depth exploration, from multiple perspectives, of the complexity and uniqueness of a particular policy, institution, programme or system in a real-life context. A case study usually combines qualitative and quantitative data and analysis and needs to contain a subject as well as an analytical frame (or object); case studies can be illustrative or exploratory, examine a critical instance, focus on an initiative's implementation or effects, or be cumulative when bringing together findings from several case studies.
Use: The case study is useful for in-depth analysis of specific issues but, with its focus on the concrete and the specific, is less useful for generalizing. There is a need to focus on information-rich cases in order to better understand qualitative issues.

8. Knowledge assessment
Description: Assessment of the knowledge and skills gained by individuals who have received training, workshops, coaching or mentoring as part of the project; ideally done before and after the training, workshop, coaching or mentoring; often useful to get information from the supervisor or trainer as well.
Use: Used in the assessment of learning from training, workshop, coaching and mentoring initiatives, going beyond the initial reaction of participants to establish what they learned through the initiative concerned.

9. Benchmarking
Description: Comparing each government's policies, legislation or development plans against a benchmark or ideal example that fits related UN criteria, norms or standards; in addition to comparing policies or plans, individual countries can be assessed against international criteria, norms and standards, which are used as the benchmark.
Use: UN criteria, norms and standards can be used in the assessment of government legislation, policies and development plans.

10. Organizational capacity assessment
Description: Assessment of the capacities of an organization, making use of a range of organizational development criteria, with the organization as the primary unit of analysis.
Use: The assessment can identify existing capacities as well as capacity gaps and can inform an organizational development initiative to enhance organizational performance.

11. Country visits
Description: Country site visits give the evaluator an opportunity to conduct face-to-face meetings and interviews and to hear the perspectives of the ultimate beneficiaries / rights holders in one or more countries.
Use: Country visits can help the evaluator to validate information obtained from other sources and methods, and can help to gain a first-hand understanding of the enabling and constraining factors that governments or institutions face.

12. Outcome mapping [21]
Description: Assessment of the intermediate changes that need to be attained in order to reach an organization's vision; outcome mapping focuses on the capacities of partner organizations through assessment of changes in the behaviour, relationships, activities or actions of the parties with whom a programme works directly.
Use: The methodology is particularly useful for a project built on longer-term partnerships, based on a shared vision and values amongst the participating organizations, with a focus on output- and outcome-level changes and a non-linear view of social change, making use of outcome, strategy and performance journals.

13. Most significant change [22]
Description: Assessment of changes that have occurred in a targeted group or area, and development of a dialogue on the values attached to these changes by key stakeholders, who select the most important story.
Use: A participatory approach in which stories of change are holistic and in which participants provide part of the analysis through their selection of stories. Though the method starts off as qualitative data gathering, it can also be used for quantitative data analysis. The open-ended approach of the method is particularly useful for the assessment of unexpected changes.

[21] Earl, S., Carden, F. and Smutylo, T., 2001.
[22] Davies, Rick and Jessica Dart, 2005; Dart, Jessica and Rick Davies, 2003.


14. Process documentation [23]
Description: The systematic assessment of the outcomes of a set of development activities, in order to understand the processes that led to their results, and consultation with others on these processes, so as to learn from programme implementation and inform the facilitation and management of supported development processes.
Use: Of particular importance in pilot and other innovative initiatives, as well as in more open-ended programme approaches and policy-level engagement, in which there is a need for incremental learning and for documenting such processes for future programme initiatives.

[23] Da Silva Wells, Carmen et al., 2011; Joseph, John A., not dated.

15. Appreciative inquiry
Description: A methodology that builds on an organization's strengths and assets rather than its deficits and shortcomings, identifying positive aspects of the organization and how these can be used to enhance organizational development. What does not work well is included by asking participants what they would wish to be changed in the future.
Use: This method can be applied in participatory evaluations; the framing of evaluation questions in a positive way is conducive to the involvement of participants; the method is particularly useful for determining practical action leading to a desired state of organizational development.
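
As referenced under method 4 (mini-survey), the sketch below illustrates the response-rate check and gender-disaggregated summary described there. It assumes a hypothetical CSV export of responses with "gender" and "q1_rating" columns and a known number of invited stakeholders; none of these names or figures are prescribed by the DA.

```python
# Minimal sketch: response-rate check and gender-disaggregated summary for a
# mini-survey. The file name, column names and invited count are hypothetical.
import csv
from collections import defaultdict

INVITED = 120  # hypothetical number of stakeholders invited to the survey

def summarize(path: str) -> None:
    with open(path, encoding="utf-8") as f:
        responses = list(csv.DictReader(f))
    rate = len(responses) / INVITED
    print(f"Response rate: {rate:.0%} ({len(responses)}/{INVITED})")
    if rate < 0.5:
        # Substantial non-response limits quantitative analysis (see method 4).
        print("Warning: non-response may bias quantitative findings.")

    # Disaggregate a rating question by gender, in line with the human
    # rights-based and gender-sensitive approach required of DA evaluations.
    ratings = defaultdict(list)
    for row in responses:
        try:
            ratings[row["gender"]].append(float(row["q1_rating"]))
        except (KeyError, ValueError):
            continue  # skip incomplete records rather than guessing values
    for group, values in sorted(ratings.items()):
        print(f"{group}: n={len(values)}, mean rating={sum(values)/len(values):.2f}")

if __name__ == "__main__":
    summarize("mini_survey_responses.csv")
```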

Data Analysis

1. Qualitative content analysis
Description: Content analysis is a way to conduct systematic text analysis, reducing large amounts of unstructured textual content into manageable data relevant to the evaluation concerned, through a step-by-step process of dividing the material into content-analytical units, making use of analytical categories that are further specified through feedback loops.
Use: Categories for data analysis are developed in inductive and deductive ways, providing the means for coding qualitative data. The evaluation matrix provides an important means of organizing qualitative data along the categories of the qualitative content analysis (a minimal coding sketch follows this list of data analysis methods).

2. Context analysis
Description: Analysis of the context in which a project operates, in particular the political economy context, with a focus on issues of the enabling environment, space for change and the capacities concerned.
Use: Particularly of use for initiatives that engage at the policy level, in order to understand aspects of the political economy environment in which the project operates.

3. Policy analysis
Description: Analysis of policy initiatives and their results, making use of: the policy cycle (to understand to what phases of the policy cycle initiatives aim to contribute and the kind of policy results achieved along the stages of the cycle); the type of policy engagement (assessing the audience and influence sought); the theory of change (to analyse the logic of how policy engagement is meant to deliver results); analysis of the means to reach policy change (including the development of the evidence concerned through knowledge products and otherwise); and partnership analysis (to analyse the DA project's partnering with other organizations to reach policy objectives).
Use: Analysis of policy engagement and other policy-related initiatives in terms of the stage of the policy cycle, the type of engagement, the means to reach policy change and the partnerships concerned.

4. Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis
Description: Looking at strengths and weaknesses in terms of the internal capabilities of organizations, and at opportunities and threats to highlight external factors.
Use: The strengths and opportunities identified will be used to assess aspects to be further developed and reinforced, while the weaknesses and threats will be used to identify those internal as well as external issues that need to be addressed and mitigated.

5. Results chain analysis
Description: Analysis of the project results framework and the logical sequence between activities, their direct outputs and outcome-level changes.
Use: This approach provides a framework for assessing whether objectives are likely to be achieved, through the monitoring of indicators at the various levels of the results framework.

6. Contribution analysis
Description: An approach designed to reduce uncertainty about the contribution an intervention is making to the observed results, through an increased understanding of why the observed results have occurred (or not) and of the roles played by the intervention and other internal and external factors. It proceeds through several steps, including analysis of the project theory of change, of whether it was implemented as intended and the anticipated chain of results occurred, of the extent to which the project contributed to outcome-level changes through the realization of output-level results, and of the extent to which other factors influenced the programme's achievements.
Use: Used in the assessment of the effects of project support on intermediate-level changes to which the project cannot be considered the sole contributor, providing the means to relate output- and outcome-level changes; particularly useful for interventions for which a clear theory of change was formulated, where the analysis provides evidence on the contribution of the initiative concerned.

7. Timeline analysis
Description: Analysis of project implementation from a chronological perspective, linking time-related aspects of the achievement of results with relevant project management and contextual issues in-country and beyond.
Use: This approach can provide insight into the chronological aspects of the achievement of project results and their relation to project implementation and to contextual changes occurring at various points in time.


Annex 4 - DA Requirements for Inception Reports of Project Evaluations

1 Introduction
• Title of the evaluation
• Very short description of the project to be evaluated
• Short reason for the evaluation (will be expanded on under 2 below)
• Short introduction of the context
• Timing of the evaluation

2 Evaluation purpose
• Rationale for the evaluation and why it is needed at this time
• Expected users and expected use by each of these of the evaluation results
3 Context of the evaluation
• Introduction of the topic of the project to be evaluated
• Details on the topic in the countries/regions covered by the project
• Details on policies, plans and programmes of government and other organizations on the topic concerned and support provided by other development partners
4 Subject of the evaluation
• The DA project, its objective and how it tries to achieve this
• Coverage in terms of countries / regions and time frame concerned
• Partners for implementation, including government, other IEs and other UN agencies at country, (sub-)regional and global levels
• Other stakeholders that have an interest in the project and the evaluation
• Human and financial resources of the DA project
• Past evaluations / assessments / studies, including gender and vulnerability assessments
5 Evaluation scope, objectives and questions
• What the evaluation will cover of the subject of the evaluation in terms of project components and activities, coverage of geographical area, time frame and otherwise
• What parts of the subject the evaluation will not cover and the rationale concerned
• Objectives of the evaluation, i.e. what the evaluation will accomplish, including what evaluation criteria will be covered and the rationale for this
• Evaluation questions, organized by evaluation criteria, with the number of questions limited to six or seven
• The evaluation scope, objectives, criteria and questions need to be reviewed in the inception phase by the evaluator and, if needed, adapted in coordination with the evaluation manager and, if applicable, with the IEC and ERG
6 Methodology of the evaluation
• Methodological approach and rationale
• Methods for data gathering and methods for data analysis
• Identification of primary data gathering and rationale for country selection
• Sampling of respondents for qualitative and quantitative data gathering
• Data availability
• Application of a human rights and gender equality approach in the evaluation
• Ethical concerns and how to address these
• Limitations to the methodology and ways to address the challenges identified
• The evaluation methodology needs to be reviewed in this phase by the evaluator and, if needed, adapted or further developed in coordination with the evaluation manager and, if applicable, the IEC and ERG
7 Organization of the evaluation
• Evaluation process and work plan
• Management issues, including roles and responsibilities of the IE, evaluation manager, evaluator and, if applicable, the IEC and ERG
• Evaluation team composition and responsibilities
• Evaluation deliverables, i.e. draft and final evaluation report
• Security considerations

8 Annexes
Required:
• TOR
• Detailed results framework of the project

• Stakeholder mapping / analysis
• Evaluation Matrix
• Detailed evaluation workplan
• UNEG Ethical code of conduct
• List of acronyms used
• References to secondary information sources
Optional:
• Additional contextual details
• Additional methodological details
• Additional details on the subject of the evaluation


Annex 5 - Outline of the required elements for Evaluation Reports

1 Title and opening pages
• Title of the report, including the DA project that was evaluated and the regions / countries and time frame covered
• Date of the report
• Names and organizations of the evaluator(s)
• Name of the organization commissioning the evaluation
• Acknowledgements
2 Table of contents
• Listing of the contents of the report, including annexes, boxes, figures and tables, with page references

3 List of acronyms and abbreviations
• Listing of all acronyms and abbreviations used in the DA evaluation report

4 Executive Summary
• The summary needs to be a stand-alone section of a maximum of three pages that is able to inform decision-making
• Needs to include a short overview of the project, the purpose, scope and objective of the evaluation and the intended users
• Provides key aspects of the methodology, its limitations and the ways in which these were mitigated
• Summarizes key findings, conclusions, lessons learned / good practices and recommendations
5 Introduction
• Background to the project and the evaluation
• Very short description of the project to be evaluated
• Short reason for the evaluation
• Purpose of the evaluation, including the timing of the evaluation and the expected users and use of the evaluation results
6 Context of the evaluation
• Introduction of the topic of the evaluation and the relevant developments concerned
• Details on the topic in the countries/regions covered by the project
• Details on policies, plans and programmes of government and other organizations on the topic concerned and support provided by other development partners
7 Subject of the evaluation
• The DA project, its objective and how it tries to achieve this
• Coverage in terms of countries / regions and time frame concerned
• Partners for implementation, including government, other IEs and other UN agencies at country/regional level
• Project resources
• Past evaluations / assessments / studies, if relevant, including gender analysis and vulnerability assessment
8 Evaluation scope, objectives and questions
• Scope of the evaluation and the rationale concerned
• Objectives of the evaluation, including what evaluation criteria will be covered
• Evaluation questions, organized by evaluation criteria
9 Methodology of the evaluation
• Methodological approach and rationale, including methods for data gathering and analysis and aspects of data availability and reliability, designed to meet the evaluation purpose, scope and objectives
• Sampling of respondents for qualitative / quantitative data gathering, rationale for the selection of countries for primary data gathering and the process of stakeholder engagement
• Ethical concerns and how these were handled
• Limitations to the methodology and the ways these were mitigated

10 Findings
• Statements related to (parts of) the evaluation questions, organized by evaluation criteria, that are based on evidence presented in the report and that provide answers to (parts of) the evaluation questions
• A clear focus is needed on the results obtained, the ways in which these have been achieved and the contributions of the project
11 Conclusions
• Statements at the level of the evaluation questions and beyond that are grounded in the analysis of the findings. These can include statements at the level of the evaluation criteria, across criteria, as well as related to cross-cutting issues. Conclusions provide added value to the findings
12 Lessons learned / good practices
• Lessons that were learned in the implementation of the DA project and that are useful beyond the context in which they were learned, with sufficient substantiation to be of use to people who do not know the project
• A number of good practices that were tried out and produced results and that can be of use beyond the context in which they were identified, with sufficient substantiation for these to be of use to people who do not know the project
• A maximum of a total of five learnings (including lessons and good practices)
• Equal attention needs to be paid to lessons learned / good practices as to recommendations
13 Recommendations
• A list of a maximum of five to seven actionable recommendations, in order of priority, including the responsible agency/agencies, time frame and aspects of implementation
14 Annexes
• TOR
• Project results framework and additional details on the DA project as needed
• Additional details on the context of the project and the evaluation as needed
• List of persons interviewed and additional details on the methodology as needed
• References of documents reviewed


Annex 6 - References
Bamberger, Michael, Jim Rugh and Linda Mabry, RealWorld Evaluation, Working under budget, time,
data and political constraints, London, New Delhi, 2006.
Beer, Tanya, Evaluating Public Policy Advocacy, Center for Evaluation Innovation, Health Summit 2012.
Bester, Angela, Results-based Management in the United Nations Development System, A report
prepared for the United Nations Department of Economic and Social Affairs for the 2016 Quadrennial
Comprehensive Policy Review, January 2016.
Coghlan, Anne T., Hallie Preskill and Tessie T. Catsambas, An Overview of Appreciative Inquiry in
Evaluation, in: Preskill, Hallie and Anne T. Coghlan, Using Appreciative Inquiry in Evaluation. New
Directions for Evaluation, No 100, Winter 2003.
Cross, Harry, Karen Hardee and Norine Jewell, Reforming Operational Policies: A Pathway to Improving
Reproductive Health Programs, December 2001.
Dart, Jessica and Rick Davies, A Dialogical, Story-Based Evaluation Tool: The Most Significant Change
Technique. In: American Journal of Evaluation, 24 (2), 2003.
Davies, Rick and Jessica Dart, The ‘Most Significant Change’ (MSC) Technique. A Guide to its Use. April
2005 (available at http://www.mande.co.uk/docs/MSCGuide.pdf).
Da Silva Wells, Carmen, Ewen Le Borgne, Nick Dickinson and Dick de Jong, Documenting Change, An
Introduction to Process Documentation, IRC International Water and Sanitation Centre, September
2011.
Earl, S., Carden, F., and Smutylo, T., 2001, Outcome Mapping: Building Learning and Reflection into
Development Programs, Ottawa: International Development Research Centre (available at
http://www.idrc.ca/en/ev-9330-201-1-DO_TOPIC.html).
Development Account Task Team on Evaluation, Preparation of Evaluation Reports for Development
Account Projects, September 2017.
Joseph, John A., Process Documentation, not dated.
Keck and Sikkink, cited in Jones, Harry, A guide to monitoring and evaluating policy influence,
Background Note, London, 2011.
Kirkpatrick, Donald L., Evaluating Training Programs: The Four Levels, 2006.
Kirkpatrick, Donald L. and James D. Kirkpatrick, Implementing the Four Levels: A Practical Guide for
Effective Evaluation of Training Programs, 2007.
New Zealand Aid, Gender Analysis Guideline, Integrating Gender Equality and Women’s
Empowerment into an Activity, Programme or Policy, September 2012.
ODA, Successful Communication: Planning Tools, Stakeholder Analysis.
OECD Development Assistance Committee, Glossary of Key Terms in Evaluation and Results Based
Management, Paris, 2002.
OIOS, Quality assessment framework.
Patton, Michael Quinn, Utilization-Focused Evaluation, 4th edition, 2008.
Scriven, Michael, The Evaluation of Training, Update 2010.
The Evaluation Exchange, Harvard Family Research Project, Harvard Graduate School of Education,
XIII/1, Spring 2007.
UNDP, Capacity development: A UNDP Primer, 2009.


UNICEF, Innovations, lessons learned and good practices, not dated.


United Nations Development Group, Results-based management Handbook, Harmonizing RBM
concepts and approaches for improved development results at country level, October 2011.
United Nations Development Group, Programming Principles, UNDAF Companion Guidance, 2018.
United Nations Development Group, Theory of Change, UNDAF Companion Guidance, 2018.
United Nations Evaluation Group, Guidance on Evaluating Institutional Gender Mainstreaming, New
York, 2018.
United Nations Evaluation Group, Integrating Human Rights and Gender Equality in Evaluation –
Towards UNEG Guidance, March 2011.
United Nations Evaluation Group, Integrating Human Rights and Gender Equality in Evaluation, August
2014.
United Nations Evaluation Group, UNEG Handbook for Conducting Evaluations of Normative Work in
the UN System, November 2013.
United Nations Evaluation Group, UNEG Norms and Standards for Evaluation, 2016, 2017.
United Nations Evaluation Group, UNEG Ethical Guidelines for Evaluation, March 2008.
United Nations Evaluation Group, UNEG Quality Checklist for Evaluation Reports, 2010.
United Nations Evaluation Group, UNEG Quality Checklist for Evaluation Terms of Reference and
Inception Reports, 2010.
