
United Nations

Department of Peacekeeping Operations


Department of Field Support
Ref. 2010.23

Standard Operating Procedure

Monitoring and Evaluation for


Disarmament, Demobilization
and Reintegration

Approved by: Alain Le Roy, Under-Secretary-General for Peacekeeping


Operations
Effective date: 1 June 2010
Contact: DPKO/OROLSI/DDRS
Review date: 1 June 2012

DPKO/DFS STANDARD OPERATING PROCEDURE ON
MONITORING AND EVALUATION FOR
DISARMAMENT, DEMOBILIZATION AND REINTEGRATION

Contents: A. Purpose
B. Scope
C. Rationale
D. Procedures
E. Terms and definitions
F. References
G. Monitoring and compliance
H. Contact
I. History

ANNEXURES
Annex 1: Example of a DDR Indicator Framework
Annex 2: Example of a DDR Indicator Tracking Sheet
Annex 3: Example of a DDR M&E Plan
Annex 4: Example of a DDR Sample Survey

A. PURPOSE
1. The Standard Operating Procedure (SOP) on Monitoring and Evaluation (M&E) for Disarmament,
Demobilization and Reintegration (DDR) provides DPKO staff with guidance on how to plan and
run a Monitoring and Evaluation System for DDR in the broader context of peacekeeping. It
provides a standardized, comprehensive and systematic approach to monitoring and evaluating
both progress and results of DDR.

B. SCOPE
2. The Standard Operating Procedure on Monitoring and Evaluation for DDR applies to DDR
planners (for whom Chapter 1 of ‘Procedures’ is particularly relevant) and to DDR M&E staff (for
whom Chapter 2 will be most significant).

C. RATIONALE
3. Monitoring and Evaluation (M&E) is a crucial tool for effective DDR programme planning and
implementation. It offers the means through which stakeholders1:
 can keep track of what has (or has not) been achieved to date;
 can make immediate adjustments to the DDR programme if necessary;
 can ensure accountability for the effective and efficient use of funds;
 can report credibly and in a timely manner to stakeholders and donors;
 can manage, communicate and discuss information on progress;
 can objectively verify the outcomes and impact of the DDR programme;
 can build an institutional memory, learn and share lessons that can be fed into future
programmes and policies.

1 A listing of definitions/explanations of technical terms is provided towards the end of this SOP, section E.

4. In crisis and post-conflict situations, where the most urgent priorities are to get implementation up
and running, M&E may be overlooked or under-prioritised. M&E has been one of the weakest
areas of DDR programme management. It has been recognised that, while DDR sections in
peacekeeping operations have tried several approaches to M&E, there has been little guidance on
this subject, nor has a single approach been developed across peacekeeping operations.

D. PROCEDURES

Overview
6. Programme managers and DDR M&E staff shall be responsible for key M&E activities:

 During the planning phase, the DDR Section of the Office of the Rule of Law and
Security Institutions shall ensure that an adequate M&E strategy is included in the
DDR programme plan and provide technical backstopping for defining indicators and a
DDR M&E plan. During implementation, the section shall monitor performance and
process indicators, provide technical backstopping for M&E and serve as a central
repository for M&E information.

 During the planning phase, the Head of Component shall establish a M&E mechanism
and identify performance indicators. During implementation, the Head of Component
shall oversee M&E of the DDR programme and clearly define roles, responsibilities and
tasks for M&E staff.

 DDR M&E staff shall run the DDR M&E system throughout programme implementation,
adjusting it, as required, on a regular basis2.

POSITION: DDR Section in the Office of the Rule of Law and Security Institutions
Key responsibilities before implementation:
 ensure that an adequate monitoring and evaluation strategy is included in the DDR programme plan
 provide technical backstopping for defining indicators and a DDR Monitoring and Evaluation Plan
Key responsibilities during implementation:
 monitor performance and process indicators during implementation of the programme plan
 provide technical backstopping for monitoring and evaluations
 serve as central repository for all M&E information

POSITION: Head of DDR Component
Key responsibilities before implementation:
 establish mechanisms to monitor and evaluate the implementation of DDR programmes
 identify performance and process indicators and targets
Key responsibilities during implementation:
 oversee monitoring and evaluations of the DDR programme
 clearly define the roles, responsibilities and tasks for M&E units and M&E staff

POSITION: DDR M&E unit
Key responsibilities before implementation:
 typically not yet established
Key responsibilities during implementation:
 run the DDR Monitoring and Evaluation system
 adjust the M&E system as and when required

2 In addition, DDR M&E staff shall provide feedback to DDR programme managers in HQs in order to improve the standardized mechanisms for monitoring and evaluating DDR programmes.

7. When planning a DDR programme through an Integrated Missions Planning Process (IMPP)
with a UN Country Team (UNCT) presence, the tasks related to M&E before the implementation
shall be the responsibility of the Integrated Mission Task Force (IMTF) and the UNCT planning
team.

8. The steps entailed in setting up and managing an effective M&E system in the DDR context will
be considered in two groups:

 before implementation, 7 steps (primarily the responsibility of the DDR planners in HQs and field operations)
 during implementation, 5 steps (primarily the responsibility of DDR management and M&E staff in field operations)

Steps to be Taken Before Implementation


9. The design of a Monitoring and Evaluation system for DDR is an essential element in planning
a DDR programme.

10. DDR planners in headquarters and field operations shall ensure that the DDR M&E system is
planned in detail, incorporated in the DDR planning document and sufficiently funded.

11. The DDR Section in the Office of the Rule of Law and Security Institutions of DPKO shall provide
technical backstopping to DDR planners to define indicators and draw up a DDR Monitoring and
Evaluation Plan.

12. To do that, programme planners shall follow seven steps:

STEP 1: be clear about DDR results chain

STEP 2: determine performance indicators

STEP 3: determine process indicators

STEP 4: develop an Indicator Framework

STEP 5: plan surveys, studies, evaluations

STEP 6: define capacities required for M&E

STEP 7: summarize it in a DDR M&E Plan

13. The planning process for M&E shall result in a comprehensive ‘DDR Monitoring & Evaluation
(M&E) Plan’, which shall be an integral part of the overall DDR programme plan.

14. Each of these steps is explained and discussed in more detail below; together they form the
basis for effective M&E activities over the course of the DDR Programme Cycle.

STEP ONE - Be clear about the DDR results chain

15. A logically connected chain of results is the basis for monitoring and evaluating DDR
programmes in the peacekeeping context. All stakeholders should have a clear and shared
understanding of the chain. To this end, planners shall ensure that the Results Chain is SMART3
and laid out clearly in the DDR programme plan.

16. A basic Results Chain is defined in the Results-Based Budgeting (RBB) Framework for the
entire mission. The RBB Framework includes a number of specific, often numeric, planned
outputs. For the DDR programme, these outputs lead to an expected accomplishment (or
‘outcome’). Taken together, the expected accomplishments of a Mission contribute jointly to an
overall objective (as defined by the Security Council).

[Results chain diagram: DDR outputs → DDR expected accomplishments → overall objective of the Mission]
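For illustration only, the sketch below (Python) models this results chain as a simple data structure; the class and field names, and the example entries, are assumptions rather than part of the RBB framework.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Output:
    """A planned, often numeric, DDR output."""
    description: str

@dataclass
class ExpectedAccomplishment:
    """The outcome that a set of DDR outputs is expected to lead to."""
    description: str
    outputs: List[Output] = field(default_factory=list)

@dataclass
class MissionObjective:
    """Overall objective defined by the Security Council, to which the
    expected accomplishments of the Mission jointly contribute."""
    description: str
    expected_accomplishments: List[ExpectedAccomplishment] = field(default_factory=list)

# Hypothetical example of a DDR results chain:
ddr_accomplishment = ExpectedAccomplishment(
    "Ex-combatants disarmed, demobilized and reintegrated",
    outputs=[Output("Weapons collected and destroyed"),
             Output("Ex-combatants demobilized and registered")],
)
objective = MissionObjective(
    "Sustainable peace and security",
    expected_accomplishments=[ddr_accomplishment],
)
print(objective.expected_accomplishments[0].outputs[0].description)
```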

17. In addition, more detailed planning documents for the DDR component may exist. These can
take the form of a DDR programme document.

18. All expected accomplishments and outputs of the RBB framework and the DDR programme plan -
including the respective indicators - shall form the basis for DDR monitoring and evaluations. If
the results chain is not sufficiently detailed for effective monitoring, DDR planners (or M&E staff
where planners failed to do this) shall expand it to a sufficient level of detail.

STEP TWO: Determine Performance Indicators

19. In order to effectively monitor and evaluate DDR, each output and every expected accomplishment
shall have a set of performance indicators. Output indicators help to track whether a DDR
programme delivers what was planned and on time. Indicators for expected accomplishments track
medium-term progress, for example in the reintegration of ex-combatants.4

20. DDR planners should consider the following requirements when seeking to identify or refine
meaningful performance indicators:

 An indicator shall be a single unit of information measured over time that helps
show changes in a specific condition. (A poorly defined indicator such as ‘Number and
level of ex-combatants economically active’ could be better expressed as two distinct
indicators: ‘No. of ex-combatants economically active’ and ‘% of economically active
ex-combatants’.)

3 SMART: Specific, Measurable, Achievable, Realistic and Time-bound; see ‘Terms and Definitions’ (section E).
4 The Result-Based Budgeting Framework for a Mission includes indicators on the level of expected accomplishments, but no indicators for the level of outputs (although it uses ‘quantities’). Performance indicators for a DDR programme should include all RBB indicators and additional, DDR-specific indicators for detailed monitoring and evaluations.

 A set of indicators for a specific output or an expected accomplishment should
comprise a mix of quantitative aspects of DDR (directly observable, e.g. ‘% of ex-
combatants who found employment’) with qualitative aspects of DDR (incorporating
judgments or perceptions, e.g. ‘% of ex-combatants satisfied with the transition
package’).

 Indicators should be numeric (e.g. a number, percentage, ratio, etc.). Numeric


indicators tend to be more sensitive to change over time and are typically less
subjective than yes/no indicators (e.g. ‘All ex-combatants successfully reintegrated’).
Numeric indicators can capture quantitative (e.g. ‘% of ex-combatants who receive
reintegration package’) as well as qualitative aspects of DDR (e.g. ‘% of ex-
combatants who say their lives have improved after demobilization’).

 Each output or expected accomplishment should have a set of three to five


indicators. Fewer than this, and it is difficult to capture all key dimensions of a
result. To use more than five indicators is cumbersome, reduces the value of
indicators, and increases the workload for monitoring.

 General indicators should be complemented by additional disaggregated indicators


which show a specific subgroup. For example, the indicator ‘% of ex-combatants that
are economically active’ may be complemented by an additional indicator on ‘% of
disabled ex-combatants that are economically active’.

 Key cross-cutting issues should be reflected through additional, disaggregated


performance indicators, to highlight and track a specific aspect of DDR.5

 To the extent possible, indicators should be disaggregated by gender to capture the


different needs of female and male DDR programme participants.

 Key partners such as national authorities and civil society should be involved in the
process of selecting indicators, as this will serve to foster broader national ownership
and support for DDR, and can help clarify the expected accomplishments and
outputs of the planned DDR programme.

21. Every indicator in a DDR programme should have an indicator baseline stating the pre-
intervention status on a given date, and an indicator target specifying the result to be achieved by
a certain date. Targets should be realistic, and should be agreed upon by key partners and
national authorities. Intermediary targets (also called ‘milestones’) for specific dates within the
programme period are recommended to facilitate monitoring.6

STEP THREE - Determine the Process Indicators

22. Performance indicators capture what has been achieved by the DDR programme. However, the
way a DDR programme operates can materially affect the quality of its outputs and the
likelihood of reaching its expected accomplishment.

5 Cross-cutting issues in the context of DDR in peacekeeping typically include children, youth, health, HIV/AIDS, human rights or cross-border population movements.
6 In the typically volatile and uncertain context of peacekeeping missions, it is possible that an indicator baseline can only be collected after the DDR programme starts operating. In such cases, clear plans to collect the missing baseline data shall be made at the planning stage. By the end of the first year of operations at the latest, all indicators shall have a complete baseline and a target.

23. It is through the use of process indicators that DDR programme managers can track key
information on the way the programme is implemented.

24. At the planning stage, DDR planners shall identify a limited set of process indicators which are
subsequently tracked by DDR M&E staff during implementation.

25. While process indicators largely depend on the local context and the type of DDR, they often relate
to key aspects of the budget (e.g. ‘% of administrative overhead costs compared to overall
expenditure’, ‘% of funds mobilized which are spent’), human resources (e.g. ‘% of human
resource budget spent on international staff’, ‘% of DDR positions which are not filled’, ‘% of DDR
programme staff that completed gender training’) or public information activities for DDR (e.g.
‘Hours of radio programmes on DDR per quarter’).
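As a simple illustration of how such process indicators might be computed from routine administrative data, the following sketch calculates two of the budget-related indicators mentioned above; all figures are hypothetical.

```python
# Hypothetical quarterly figures (in USD) used only to illustrate the calculation.
overall_expenditure = 4_200_000
administrative_overhead = 310_000
funds_mobilized = 6_000_000
funds_spent = 4_200_000

overhead_share = 100 * administrative_overhead / overall_expenditure
spend_rate = 100 * funds_spent / funds_mobilized

print(f"% of administrative overhead costs compared to overall expenditure: {overhead_share:.1f}%")
print(f"% of funds mobilized which are spent: {spend_rate:.1f}%")
```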

STEP FOUR - Develop an Indicator Framework

26. DDR planners shall summarize Performance Indicators and Process Indicators in an indicator
framework.

27. This framework shall be based on the DDR results chain. It defines a set of indicators for each
output and the expected accomplishment.

28. The DDR indicator framework shall specify in full detail every indicator with its baseline and target.
Baselines and targets shall be each expressed as a single number, percentage or word. In
addition, the indicator framework shall specify the month and year for baseline and targets in
brackets. For example: ‘% of 350 targeted communities with at least one DDR-funded reintegration
project’, baseline 0% (01/2010), target 90% (12/2014).

29. The indicator framework shall also clearly specify the portfolio of evidence (often called ‘means
of verification’) for every indicator (i.e. where the data is coming from, what source, or how
determined) and the frequency of data collection for the indicator (how often it will be collected).
Template for DDR Indicator Framework 7

DDR results chain | Indicators | Baseline (month/year) | Target (month/year) | Portfolio of evidence | Frequency of data collection
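The sketch below shows how one row of an indicator framework could be recorded, using the community reintegration example from paragraph 28; the structure, field names and the assumed portfolio of evidence are illustrative only, not a prescribed format.

```python
from dataclasses import dataclass

@dataclass
class IndicatorEntry:
    """One row of a DDR indicator framework (field names are illustrative)."""
    result: str                 # output or expected accomplishment in the DDR results chain
    indicator: str              # single unit of information measured over time
    baseline: str               # single number, percentage or word, with month/year in brackets
    target: str                 # single number, percentage or word, with month/year in brackets
    portfolio_of_evidence: str  # means of verification: where the data comes from
    frequency: str              # how often the data is collected

example_row = IndicatorEntry(
    result="Expected accomplishment: reintegration of ex-combatants",
    indicator="% of 350 targeted communities with at least one DDR-funded reintegration project",
    baseline="0% (01/2010)",
    target="90% (12/2014)",
    portfolio_of_evidence="Implementing partner reports; field visit reports",  # assumed source
    frequency="Quarterly",  # assumed frequency
)
print(example_row.indicator, example_row.baseline, "->", example_row.target)
```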

STEP FIVE - Plan for the surveys, studies and evaluations to be undertaken

30. When drawing up a DDR M&E plan, planners shall provide for in-depth assessments of progress
and success to be carried out, in addition to continuous monitoring. The key tools for such
substantial assessments of DDR are:

 sample surveys
 studies and
 evaluations.

31. These assessments should be planned before the DDR programme begins. This allows a
strategic approach to be adopted, to collect information when needed rather than using an ad-
hoc approach. Naturally, needs will change during the course of the programme, and provision
should be made for conducting alternative or additional studies and surveys in response to
specific issues or problems which may arise during the course of implementation.

32. DDR programmes should commission an evaluation of the DDR programme during the second
year of implementation, and every second year after that. Different types of small sample surveys
should ideally be conducted annually, but at least every second year. Studies should be
conducted when the need for more in-depth information on a particular issue arises.
7 See annex 1 for an example of a DDR Indicator Framework.

33. The approach, timing and scope of surveys, studies and evaluations will depend on the level of
field access expected in the area where the DDR programme plans to operate.

34. A certain degree of flexibility should be accommodated in the plan, as changes in field access
may trigger a revision of the planned mix of surveys, studies and evaluations to be carried out.

SAMPLE SURVEYS
A survey is a system for collecting information from, or about, people, to describe, compare,
or explain their knowledge, attitudes, and behaviour. A sample survey enables a general
conclusion to be reached on the basis of a (statistically significant) sample of a group,
population or process. Such surveys can capture different kinds of information, for example,
the level of satisfaction of ex-combatants with the DDR process, or assessing the success
and sustainability of their reintegration. Even small sample surveys of DDR programme
participants can yield sound statistically valid data for the entire group of programme
participants at reasonable cost. Without the use of sample surveys by DDR programmes, it
is difficult to credibly and objectively establish whether planned accomplishments are met.

EVALUATIONS
Evaluations may be conducted internally by DDR or UN staff, or by external evaluators. An
internal evaluation may be undertaken by the DDR programme or a sub-programme
(‘programme-led evaluation’), or by the DPKO/DFS evaluation team in the Policy, Evaluation
and Training Division, or the Office of Internal Oversight Services (‘evaluation-led
evaluation’). An external evaluation is conducted by entities free from control or influence of
those responsible for the design and implementation of the DDR programme. If the purpose
of an evaluation is to assess accomplishments as credibly and as objectively as possible,
DDR planners shall choose an external evaluation.

STUDIES
To the extent possible, DDR planners may consider specific studies to carry out in-depth
investigations on certain issues, problems or perspectives which may arise during the
implementation of a DDR programme. For example, DDR planners might include a study on
the success of reintegration of ex-combatants with special needs in year 2 of DDR
implementation in order to adjust the specific support to that group. While the need for most
studies will only emerge during the implementation of a DDR programme, provision for
studies shall be made prior to the programme’s start.

35. The DDR Surveys, Studies and Evaluations Plan shall summarize the surveys, studies and
evaluations to be carried out before, during and after the DDR implementation, including those that
are carried out jointly with other organizations.

36. The Surveys, Studies and Evaluations Plan shall specify the following elements for each planned
activity: a) the focus that surveys, studies and evaluations seek to address, b) the type of activity,
c) the tentative timing, and d) the anticipated costs of conducting the activity.
37. Template for DDR Survey, Studies and Evaluations Plan 8

Focus | Type | Timing | Costs
SURVEYS | | |
STUDIES | | |
EVALUATIONS | | |

8 For an example of a DDR Surveys, Studies and Evaluations Plan see Annex 3.

STEP SIX - Define capacities required for M&E

38. As the DDR M&E Plan nears completion, programme planners shall define:

 the human resources needed to effectively carry out M&E


 the budget required for M&E

39. Given the variety of M&E tasks and their central role in DDR management, a DDR programme shall have
a dedicated M&E unit.

40. The head of the M&E unit shall report directly to the DDR Head of Component. A direct reporting line
is paramount for M&E staff to provide DDR managers with accurate and frank information on
progress and challenges, and to avoid a conflict of interest.

41. A DDR M&E unit should consist of at least four staff: a unit head (preferably an internationally
recruited expert to provide safeguards against potential bias) and three M&E officers (preferably
nationally recruited, due to their local knowledge, and to encourage skills development locally in
M&E). For large DDR programmes, the M&E unit should be staffed with three international staff9
and six (preferably nationally recruited) M&E officers.

42. To enhance efficiency and increase the chances for sustainable DDR, planners may consider a
combination of permanent DDR M&E staff with external, part-time M&E expertise. This may
be done through a retainer agreement with external consultancy firms or individuals which can be
called upon at any time.

43. To ensure that the DDR programme has sufficient capacities to carry out M&E activities, the DDR
planning document shall contain separate budget line(s) for M&E activities and M&E staff.10

44. Typically, the costs for M&E staff and for carrying out the activities in a DDR M&E plan depend on
three factors: a) the number and level of dedicated M&E staff, b) the number and type of
evaluations and studies, c) the number and type of primary data collections, such as sample
surveys.

45. An M&E budget of no less than 3% of the overall DDR budget shall be allocated. Depending on the
DDR operation and other local factors, the budget for M&E should normally be between 3% and
7% of the overall DDR budget.
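The arithmetic behind this budget rule is straightforward; the following sketch checks a hypothetical M&E budget against the 3% minimum and the normal 3% to 7% range.

```python
def check_me_budget(me_budget: float, overall_ddr_budget: float) -> str:
    """Classify an M&E budget against the 3% floor and the normal 3%-7% range."""
    share = 100 * me_budget / overall_ddr_budget
    if share < 3:
        return f"{share:.1f}% - below the minimum 3% allocation"
    if share <= 7:
        return f"{share:.1f}% - within the normal 3%-7% range"
    return f"{share:.1f}% - above the normal 3%-7% range"

# Hypothetical figures: a USD 50 million DDR programme with USD 2.1 million for M&E.
print(check_me_budget(2_100_000, 50_000_000))  # 4.2% - within the normal 3%-7% range
```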

46. DDR planners should include funding provision in the budget for surveys, evaluations or studies
which cannot be foreseen at the time of planning, but where a need for such assessments may
arise during the implementation stage.

9 The number of international staff should be reduced once nationally recruited M&E staff is able to take over more responsibilities. The head of the M&E unit, however, should remain an international staff member to safeguard against potential bias and to ensure frank reporting.
10 If a DDR programme is planned through an Integrated Missions Planning Process (IMPP), separate budget lines for M&E should be included in the Mission Budget Report.

STEP SEVEN - Summarize it in a DDR Monitoring and Evaluation Plan

47. Based on steps one to six, DDR planners shall draw up a DDR Monitoring and Evaluation Plan,
which shall be an integral part of the DDR planning document.

48. The DDR Monitoring and Evaluation Plan shall include an appropriate mix of activities to collect
and analyze quantitative and qualitative information on DDR. While quantitative information (e.g.
‘% of ex-combatants who receive reintegration support’) is important to objectively track progress
and success of DDR, it is often insufficient to capture the complexity and progress towards the
expected accomplishment of a DDR programme.

 Quantitative information is usually procured from direct observation, assessment or


measurement. Typically, it is less subjective and therefore more credible than qualitative
information.

In DDR programmes, quantitative information typically refers to the output level of the
result chain (e.g. ‘No. of arms destroyed’, ‘% of child soldiers supported’).

 Qualitative information typically describes people's opinions, knowledge, attitudes or


behaviours. Qualitative tools are more effective in building up understanding on the ‘why’
and ‘how’ of aspects, perceptions, relationships and trends in the DDR process.

In DDR programmes, qualitative information is typically collected through studies,
evaluations, reviews, field visits, focus groups, and the monitoring of implementing
partners. Qualitative information can be text-based or - when collated - expressed as a
number (‘% of female members of armed groups who say that their economic situation
has improved after demobilization’).

49. If a DDR programme is planned through an Integrated Missions Planning Process (IMPP), the
DDR Monitoring and Evaluation Plan shall be part of the integrated Mission plan.

50. The DDR M&E Plan in the DDR planning document shall not exceed 2 pages and shall consist of
three components:11

 The narrative component shall describe how the DDR Programme is to be monitored and
evaluated. It shall address as a minimum, but not be limited to, all the issues (steps 1 to 6)
discussed above. It shall specify who will take what action, and when. It should also outline
how the DDR programme will engage with national partners to strengthen their M&E
capacities. The narrative component shall describe how information on DDR progress and
success is to be obtained, through a careful mix of quantitative and qualitative methods.

 The indicator framework shall summarize the performance and process indicators, their
baselines and their targets, the portfolio of evidence, and the frequency of data updates.
The indicator framework shall take the form of a table as described in step four.

 The surveys, studies and evaluations plan within the M&E Plan shall specify the name,
focus, type, timing and cost for each planned activity. It shall take the form of a table as
described in step five. The DPKO DDR programme manager shall assure its quality and
approve the plan.

11 For a simplified example of a DDR M&E plan see annex 3.

Steps to be Taken During Implementation
50. During the implementation of a DDR programme, monitoring and evaluation entails a cyclical
process of seeking, obtaining, collating, organizing, analysing, interpreting, packaging and
distributing data and information throughout the duration of the DDR programme period.

51. DDR M&E staff shall have overall responsibility for managing or carrying out all M&E activities
planned in the DDR Monitoring & Evaluation Plan, including the DDR surveys, studies and
evaluations plan.

52. In addition, M&E staff shall carry out additional M&E activities, obtain and store data and
information in a DDR Information System, analyze the information, report on it and adjust
the M&E system.

53. Running the Monitoring and Evaluation System for a DDR programme is a continuous process,
in many aspects a cyclical one (observe, assess, advise, adjust, and repeat), and for simplicity’s
sake can be broken down into an initial task, five reiterating steps,
and a final task:

[Diagram: the DDR M&E cycle - Initial task: set up and establish the M&E unit; Step 1: obtain the data; Step 2: review, collate and organize data; Step 3: analyse the data; Step 4: report on progress; Step 5: adjust the M&E system; Final task: handover and closure]

54. As a minimum, DDR M&E staff shall carry out steps 1 to 5 at least quarterly. This allows DDR
M&E staff to provide updated information for quarterly reports. It also ensures that sufficient
updated information and analysis is available for the Mission’s annual performance report.

55. Throughout the DDR programme implementation, the DDR M&E unit shall provide training in
basic M&E to programme staff and partners.

56. The DDR Section in the Office of the Rule of Law and Security Institutions of DPKO shall
provide technical backstopping to the DDR M&E staff throughout the DDR programme cycle
and serve as a central repository for all information related to monitoring and evaluating of
DDR.

57. While the management of risks to the DDR programme is not part of monitoring and
evaluation, the DDR M&E section shall provide information on changes in risks and
assumptions to the Risk Management Officer or focal point of the Mission.12

INITIAL TASK – Set up and establish the M&E unit

58. At the beginning of DDR implementation, DDR management should set up and establish an
M&E unit, based on the budget and the DDR M&E plan included in the DDR planning
documents.

59. Most DDR programmes routinely use one or more databases when processing the
disarmament and demobilization of combatants.13 DDR programmes may use DREAM14 or a
purpose-made database to meet the needs of the DDR programme. When selecting a
database system, attention should be given to its M&E capability. DDR managers and M&E
staff shall ensure that the database contains a reporting module which is capable of
producing continuous, meaningful reports.15

60. To systematically plan its activities during implementation, DDR M&E staff shall draft a rolling
two-year work plan. The work plan shall include details about planned activities on a monthly
basis for the first twelve-month period, and tentative activities for the subsequent twelve
months. The work plan shall be based on the DDR Monitoring and Evaluation Plan drafted
during the planning phase. DDR M&E staff shall update the work plan at least quarterly or
when required by changed circumstances.
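A rolling work plan of this kind can be maintained in a simple spreadsheet or script. The sketch below, offered only as an illustration, generates a 24-month skeleton (planned activities for the first twelve months, tentative entries for the following twelve) that would be rolled forward and refined at each quarterly update.

```python
from datetime import date
from typing import Dict, List

def rolling_work_plan(start: date, months: int = 24) -> List[Dict]:
    """Build a 24-month skeleton: the first 12 months 'planned', the next 12 'tentative'."""
    plan = []
    year, month = start.year, start.month
    for i in range(months):
        plan.append({
            "month": f"{year}-{month:02d}",
            "status": "planned" if i < 12 else "tentative",
            "activities": [],  # to be filled in from the DDR M&E Plan
        })
        month += 1
        if month > 12:
            month, year = 1, year + 1
    return plan

plan = rolling_work_plan(date(2010, 6, 1))
plan[0]["activities"].append("Update Indicator Tracking Sheet")
print(plan[0]["month"], plan[0]["status"], "->", plan[12]["month"], plan[12]["status"])
```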

STEP 1 - Obtain Information and Data

61. Information is key to the M&E process, and there are two sources. Existing data is the
information already available, ready for collection. Being readily available, it will be less
expensive or even free to procure, but may be less relevant.

62. New data is the information actively obtained by DDR M&E staff or jointly with partners. New
data collected for the DDR process is specifically tailored to the needs of the programme, but it
will likely incur costs (time and/or funds) to procure.

63. Before investing significant funds in data collection, DDR M&E staff should assess the use
and quality of the full range of data and information already available.

OBTAIN EXISTING DATA

12 For details on risk management see the DPKO/DFS Policy and Guidelines on Risk Management.
13 The database typically includes socio-economic information on the ex-combatants, their disarmament process and the weapons, the demobilization package and the reintegration support provided.
14 The DREAM (Disarmament, Demobilization, Reintegration and SALW Control) database is a generic DDR software provided by UNDP.
15 If this is not done at the design stage, the reporting shall be customized once the database is already functional.

64. DDR M&E staff shall systematically and completely obtain existing (‘secondary’) data from a
wide range of sources:

 DDR M&E staff shall obtain secondary data from the reporting module of the DDR
database at least on a quarterly basis. 16 In collated form, the DDR database typically
provides ample data for a range of indicators at the output level.

 M&E staff shall obtain, organize and store data and information produced by
implementing partners. This may include detailed data on their operations, studies and
surveys conducted by implementing partners, and reports over and above those
contractually required of them by the DDR programme.

 M&E staff shall make full use of the information, including performance data, readily
available from colleagues within the DDR office. Typically, operational data can be
obtained from DDR Human Resource staff, Finance staff, Public Information staff and
Programme staff who work directly with implementing partners.

 Further, M&E staff shall seek to obtain relevant data and information from outside the
DDR programme. Useful sources will include studies, surveys, articles, statistics, or
indices published by other UN agencies, the government, by other national or international
development or humanitarian organizations, by universities, research organizations, and
national and international media. M&E staff shall seek not only text information, but also
relevant visual information (pictures, maps and videos).

 If the DDR programme is implemented through an Integrated Mission, M&E staff shall
cooperate and coordinate with the UN Country Team, UN agencies and other partners to
obtain and make full use of the data collections they carry out.

ACQUIRE NEW DATA

65. In the volatile environment typically encountered in DDR, relevant or reliable data is often not
readily available. To acquire new (‘primary’) data, DDR M&E staff shall carry out or manage a
mix of data collection activities as suited to the specific DDR programme. This mix may
include a) monitoring of implementing partners, b) sample surveys, c) field visits, d)
programme reviews, e) studies, f) focus groups and g) evaluations. Each element of this mix is
discussed in more detail over the following pages.

M&E activities suitable for collecting primary data on DDR

16 If the need arises for new information based on raw data in the DDR database, DDR M&E staff shall work closely with the technical staff responsible for operating the database to make the changes needed in the reporting module.

[Diagram: key primary sources for DDR - monitoring of implementing partners, sample surveys, field visits, reviews, studies, focus groups and evaluations]

66. If the DDR programme is implemented through an Integrated Mission, DDR M&E staff shall –
to the extent possible – collect new data jointly or in close collaboration with the UNCT or UN
agencies.

Monitor the Implementing Partners

67. Typically, social and economic reintegration support to ex-combatants is provided through
partner organizations. Implementing partners are normally local or international NGOs,
bilateral or multilateral organizations, the private sector or community organizations.

68. As part of overall DDR monitoring, DDR M&E staff shall closely track the work of
implementing partners, based primarily on their reporting and on field visits. As a minimum,
DDR M&E staff shall monitor the compliance of implementing partners, making sure their
contractual reports are submitted on time and are of good quality.

69. To keep track of submission compliance, DDR M&E staff should use a spreadsheet with at
least the following fields:

 date the implementation report is due


 status of the report (not due, received, delayed)
 date the report is received, delay in days
 hyperlink to the report stored on the DDR Information System (see step
two)

70. DDR M&E staff shall aggregate data from the tracking of implementing partners’ reports to obtain
an overall picture of progress. Aggregated data may also be used to calculate key output
indicators and to track overall reporting compliance over time. The following indicators
may be used for that purpose, as illustrated in the sketch below:

 % of reports from implementing partners received on time (before the


deadline stipulated in the contractual agreements)
 % of reports delayed (disaggregated by regions and implementing
partners)

 average no. of days that reports are delayed
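Building on the tracking fields listed in paragraph 69, the following sketch shows how these compliance indicators might be calculated; the partner names, regions and dates are hypothetical.

```python
from datetime import date

# Hypothetical tracking records mirroring the spreadsheet fields in paragraph 69.
reports = [
    {"partner": "NGO A", "region": "North", "due": date(2010, 7, 1), "received": date(2010, 6, 28)},
    {"partner": "NGO B", "region": "South", "due": date(2010, 7, 1), "received": date(2010, 7, 9)},
    {"partner": "NGO C", "region": "North", "due": date(2010, 7, 1), "received": None},  # not yet received
]

received = [r for r in reports if r["received"] is not None]
on_time = [r for r in received if r["received"] <= r["due"]]
delays = [(r["received"] - r["due"]).days for r in received if r["received"] > r["due"]]

# Denominator here is all reports due; a programme may prefer to use received reports only.
print(f"% of reports received on time: {100 * len(on_time) / len(reports):.0f}%")
print(f"% of reports delayed: {100 * len(delays) / len(reports):.0f}%")
avg_delay = sum(delays) / len(delays) if delays else 0.0
print(f"Average no. of days that reports are delayed: {avg_delay:.1f}")
```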

71. In addition to tracking reporting compliance, the DDR programme shall, on a regular basis
(minimum quarterly), analyze and summarize the information and findings from implementing
partners. Using a qualitative approach, the DDR unit shall identify frequent or serious concerns
or challenges, collect lessons learned and look for opportunities where the experience and
expertise of an implementing partner can help another partner.

Manage or Conduct Sample Surveys

72. Due to the need for rapid information in a fast-changing environment, DDR programmes
should make extensive use of rapid, small-scale sample surveys. Typically, these surveys will
use random sampling (also called ‘probability sampling’) and will entail asking a few key
questions of a small random sample of target respondents, e.g.
ex-combatants, their dependents or community leaders. Such small-scale surveys are
significantly more cost-effective than large-scale, technically complex and time-consuming
surveys that may be more appropriate for academic research or special studies.17

73. If random sampling is impractical, DDR programmes may rely instead on small-scale surveys
(also called ‘mini-surveys’) based on non-random sampling (also called ‘purposive sampling’)
to produce relevant numeric data.

74. The socioeconomic information entered in the DDR database should provide the information
required in the identification of random and non-random samples for surveys.

75. Since DDR programmes may not have the technical expertise and experience to design a
credible sample survey and to calculate the sample size, external survey specialists should
be hired to advise on survey design and sample size at the planning stage.
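Survey specialists typically size a random sample using the standard formula for estimating a proportion. The sketch below applies that textbook formula with a finite-population correction; it is a generic illustration, not a sampling design prescribed by this SOP, and the population figure is hypothetical.

```python
import math

def sample_size(population: int, margin_of_error: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Sample size for estimating a proportion at ~95% confidence,
    with a finite-population correction; p = 0.5 is the most conservative choice."""
    n0 = (z ** 2) * p * (1 - p) / (margin_of_error ** 2)  # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                  # finite-population correction
    return math.ceil(n)

# Hypothetical: 4,000 registered ex-combatants, +/-5% margin of error.
print(sample_size(4000))  # roughly 351 respondents
```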

76. Ideally, DDR field staff or other individuals linked to the DDR programme should not collect
survey data themselves. 18 However, using outside staff for data collection might not be
feasible due to limited national capacity or security concerns. In these cases, DDR
programmes may make use of DDR programme or M&E staff as long as they do not collect
data from individuals they previously had contact with or in geographic areas they previously
worked in.

Conduct Field Visits

77. Field visits serve to validate reported progress and results through observation and
interviews. DDR staff and partners shall conduct frequent visits to observe DDR
implementation, to meet with implementing partners at their work sites, and to interview and
observe communities involved in reintegration and regional DDR offices.

78. DDR staff should conduct routine field visits (conducted systematically based on a planned
schedule) and ad-hoc field visits (conducted to investigate a specific problem or a conflict).

17 DDR programmes may consider three instruments for sample surveys: i) interviews, ii) structured record reviews, and iii) structured observations. A fourth instrument for sample surveys, self-administered questionnaires, should not be used, as bias may arise if any ex-combatants are illiterate.
18 Enumerators should have no vested interest in the results of a sample survey, since this might compromise its credibility. In some cases, DDR programmes might be able to outsource sample surveys to local private companies, NGOs or research institutions with experience in sample surveys.

79. Overall coordination for field visits shall be the responsibility of the DDR M&E staff. It is
usually more effective to have a team of two or three people conducting a field visit. Joint field
visits that include government and donor representatives are especially useful, since they can
capture different perspectives of DDR stakeholders.

80. The results of a field visit shall be documented in After Action Reviews19 and shared with
relevant DDR management, staff and other stakeholders.

Conduct Reviews

81. Reviews are internal, informal assessments of the implementation and progress of a
programme or sub-programme. They are normally conducted by DDR staff, M&E staff or a
combination of both.

82. Reviews are conducted to assist programme management in improving the DDR programme.
DDR programme managers may initiate an ad-hoc review for the following reasons:
 to identify and explore key issues or problems in the DDR programme
 when lessons learned need to be analyzed to help formulate changes in the
DDR programme’s plan
 to ensure the DDR programme is progressing in the intended direction20

83. If the DDR programme is part of an integrated mission, reviews should be conducted jointly
by the mission, DPKO, the UNCT and relevant agencies, depending on the requirement.

Conduct or Manage Studies

84. To conduct studies on specific issues related to DDR, DDR M&E staff shall follow the steps
involved in commissioning an external evaluation (if the study is to be conducted by external
specialists) or the steps for conducting an internal evaluation (if the study is to be conducted by
DDR or UN staff).21

85. A particularly appropriate tool for DDR programmes is the case study. M&E staff may use a
case study approach to gather comprehensive, in-depth information about a particular case.
They are useful for DDR M&E to better understand (i) a particular group (e.g. female ex-
combatants, child soldiers, community leaders), (ii) a particular problem related to the DDR
process, or (iii) mechanisms at work in a particular aspect of DDR activities (e.g. the processes
and challenges involved in reintegrating ex-combatants). Case studies are a valid tool to
complement quantitative DDR monitoring or as part of an evaluation or a study.

Conduct or Manage Focus Groups

86. DDR M&E staff may use focus groups to solicit views, perceptions, experiences, insights
and recommendations of DDR programme participants and communities on the DDR
process. A focus group is rapid and low-cost, and is usually moderated by an expert, using
carefully structured but open-ended discussions in small groups. Focus group discussions
complement quantitative monitoring, and are a valued tool for field visits, internal reviews,
studies, or for DDR evaluations.

19 For details on After Action Reviews see Programme Management Guidelines, DPKO/DFS.
20 For details on how to conduct a review see Programme Management Guidelines, DPKO/DFS.
21 For detailed steps on how to conduct an evaluation see DPKO/DFS Programme Management Guidelines.

Conduct or Manage Evaluations

87. Commissioning external evaluations and conducting internal evaluations require
fundamentally different approaches. To conduct an internal evaluation, DDR M&E staff need to
have the expertise required to yield judgements based on solid evidence. To commission an
external evaluation, DDR M&E staff need to properly plan, support and monitor the quality of
the evaluation.22

88. In some cases, DDR planners, management or M&E staff may consider a mixed evaluation,
where DDR staff or UN staff work together with external evaluators. The advantage of a mixed
team is that it combines UN-specific knowledge with an external, objective analysis of progress.
The disadvantage of using a mixed team is that objectivity can be reduced, and subsequently
credibility may be compromised.

22 For detailed steps on how to conduct an internal evaluation see DPKO/DFS Programme Management Guidelines.

STEP 2 – Review, Collate, and Organize Data and Information

89. M&E staff shall review, collate and organize relevant data and information obtained in a DDR
Information System, and shall track progress in respect of DDR outputs and expected
accomplishments, through the use of an ‘Indicator Tracking Sheet’.

MAINTAIN A DDR INFORMATION SYSTEM

90. DDR M&E staff shall review, organize, collate and store, as required, all relevant information
from different data sources in a DDR Information System regularly, diligently and on
an ongoing basis. Given the wealth of material that will accumulate during the course of the
programme, it is essential that careful consideration is given as to how and where information
is to be filed, who is to have access to it (or to parts of it), and how it is to be accessed.

91. DDR management may consider co-hosting the DDR Information System – or parts of it, if
DDR data is sensitive - together with national DDR authorities (for example the National
commission for DDR). This approach will enhance ownership and facilitate the hand-over of
the DDR M&E system after the programme has closed.

92. The DDR Information System shall serve as a one-stop shop for key information on the DDR
process and for preparing reports and public relations work, and shall serve as the
institutional memory of the DDR programme, a key resource for new or temporary personnel
in the event of staff absence or turnover.

93. The DDR Information System shall be responsive, reliable, accountable and accessible. M&E
staff shall determine the most appropriate form and format of the Information System, taking
into account the location, number and computer-competency of those expected to access the
system, as well as security and back-up concerns. Amongst options to be considered are a
common computer directory, a simple intranet (for limited access) and/or an internet
website (for public access).

94. All M&E related data and information shall also be stored and available in a hard-copy filing
system for easy reference and as a back-up for the electronic system.

UPDATE INDICATOR TRACKING SHEET


95. The Indicator Tracking Sheet is an effective means by which progress with respect to
planned outputs and expected accomplishments can be recorded and monitored. In essence, it
is nothing more than a list of all planned outputs and expected accomplishments, the indicators
being used to indicate progress, the (pre-programme) baseline status, the (end-of-programme)
target, and, to be updated regularly, the current status against each of these indicators.

96. The Indicator Tracking Sheet for DDR shall contain the information defined in the indicator
framework at the planning stage: outputs and expected accomplishment, the indicators for
each output and for each expected accomplishment, and the baseline and target for each
indicator. To track progress on indicators over time, the Indicator Tracking Sheet shall include
– as a minimum - columns for every quarter of the DDR programme cycle. If more frequent
entries are required, the Indicator Tracking Sheet may include columns for every month of the
duration of the DDR programme.

97. DDR M&E staff shall update the Indicator Tracking Sheet at least quarterly. If DDR
management requires more frequent updates, DDR M&E staff may update the sheet every
month.23

98. Because indicators should only capture the most relevant perspectives on DDR, M&E staff
should keep the Indicator Tracking Sheet simple. Standard word-processing or spreadsheet
software is usually sufficient to design and update it.

99. The Indicator Tracking Sheet shall have the following format:

Template for DDR Indicator Tracking Sheet 24

DDR results chain | Indicators | Baseline (month/year) | 1st Quarter 1st year | 2nd Quarter 1st year | 3rd Quarter 1st year | etc. | Target (month/year)
Expected accomplishment 1 | | | | | | |
Output 1.1 | | | | | | |
Output 1.2 | | | | | | |
Process indicators | | | | | | |
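A plain spreadsheet or a few lines of script are sufficient to maintain such a sheet. The sketch below records quarterly status values against the baseline and target for one hypothetical indicator, following the template above.

```python
# One hypothetical row of an Indicator Tracking Sheet, updated quarterly.
row = {
    "result": "Output 1.1",
    "indicator": "% of ex-combatants who received a reintegration package",  # illustrative indicator
    "baseline": ("0%", "01/2010"),
    "target": ("100%", "12/2012"),
    "quarters": {},  # e.g. {"Q1 2010": "12%", "Q2 2010": "27%", ...}
}

def update_quarter(tracking_row: dict, quarter: str, status: str) -> None:
    """Record the current status of the indicator for a given quarter."""
    tracking_row["quarters"][quarter] = status

update_quarter(row, "Q1 2010", "12%")
update_quarter(row, "Q2 2010", "27%")
print(row["indicator"], row["quarters"])
```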

STEP 3 – Analyze data and information

100. DDR M&E staff shall analyze and interpret existing data on DDR systematically and on a
regular basis. To do this, M&E staff shall draw on data and information collected in the DDR
Information System and in the Indicator Tracking Sheet.

101. For a credible analysis of DDR monitoring data, DDR M&E staff shall carry out a combination
of quantitative and qualitative analyses.25 The two approaches are complementary and
provide a more credible interpretation of information on progress and success of DDR.

102. Where possible, DDR M&E staff should use triangulation to enhance the credibility and
accuracy of data analysis. Triangulation reduces bias that can arise from relying on a single
source or type of information, a major risk in volatile and complex environments typical for
DDR. For example, DDR M&E staff can use information from focus groups and anecdotal
findings from field visits to verify data from a satisfaction survey of ex-combatants.

103. Where possible, DDR M&E staff should engage relevant staff or stakeholders in the process
of analysis. This has the advantage of enhancing the sense of ownership for the findings that
emerge from the analysis, and may also serve to enhance the capacities of non-M&E staff in
the processes involved in M&E.

23 To update indicators more often may be inefficient, as changes within a typical month are likely to be trivial. To update indicators less frequently may result in outdated information being used in the making of key decisions, a constant risk in the volatile and fast-changing environments faced by many DDR programmes.
24 Find an example of a DDR Indicator Tracking Sheet in Annex 2.
25 Quantitative data analysis is objective and robust, and requires a sound understanding of statistical methods. Qualitative data analysis is based on subjective, rich and in-depth information, typically including the interpretation of narrative reports, field visit reports, focus groups, case studies, anecdotal evidence, etc.

STEP 4 - Report on progress

104. DDR M&E staff shall report their summary findings from data collection and analysis
systematically and frequently.

105. M&E staff should get the right information to the right user in time. For effective reporting
on progress, M&E staff shall take into consideration the information needs of each target group,
and in what form they require the information. Reports based on M&E shall be tailored in
length, presentation, frequency, etc. to the information needs of their users.

106. DDR M&E staff shall produce a quarterly progress report for the head of the DDR
component.

107. If the DDR programme is part of an Integrated Mission, DDR M&E staff shall also report
quarterly to the Integrated Mission Planning Team and the UNCT.

108. DDR M&E staff may also produce, or collaborate in publishing, additional reports to support
DDR management and staff in improving the performance of the DDR programme.

109. DDR M&E staff shall provide the head of the DDR component with data and information for
all mandatory reports of the DDR programme. For DDR programmes, the following reports are
mandatory: reporting on the peacekeeping support account budget, regular budget, voluntary
contributions, programme/sub-programme plans, performance assessments against the
USG/ASG Compacts at headquarters, and HOM/DHOM compacts in field operations.

110. If the DDR programme has dedicated staff for reporting, M&E staff shall provide substantive
data and information on progress to the reporting staff and work closely with them.

111. M&E staff shall base the narrative part of reports on key DDR indicators tracked through the
Indicator Tracking Sheet, interpreting them and making frequent reference to them.

112. DDR M&E and reporting staff shall be careful in reports to distinguish between attribution and
contribution. A claim of attribution is a claim that progress towards an expected
accomplishment of DDR is caused by DDR outputs and activities, and shall be supported with
solid, credible evidence. To claim contribution is to claim nothing more than that DDR outputs
or activities were contributory, amongst others, to an observed change with respect to the
expected accomplishment.

STEP 5 - Adjust the DDR M&E system and the DDR programme

113. Because the need for information may change over the course of the DDR programme cycle,
the DDR M&E system and the set of tools used for M&E shall be adjusted to accommodate
the changing needs of DDR M&E staff, DDR management and DDR stakeholders.

114. DDR M&E staff shall review the effectiveness of the DDR M&E system and make adjustments
at least every six months. More frequent ad-hoc adjustments may be done if required, for
example where field access for monitoring has improved or deteriorated.

115. The DDR Head of Component and DDR managers shall use the information provided through
the DDR monitoring and evaluation system as a basis for making decisions on revising and
adjusting the DDR programme.

FINAL TASK – Closure and Hand Over of the M&E system

116. When a DDR programme is nearing completion and closure, M&E staff shall liaise with DDR
management to prepare for closing and handing over the M&E system, and to plan for
evaluations after programme operations have stopped (‘ex-post evaluations’).

CLOSING THE M&E SYSTEM

117. First, twelve months before the anticipated closing of a DDR programme, DDR M&E staff should
plan for a final evaluation. The final DDR evaluation should take place in the last half year of
full operations.

118. Second, DDR M&E staff shall plan for an orderly and systematic closing of the M&E system
six months before the ceasing of operations, and reflect these activities in the DDR M&E
work plans.

119. Third, DDR M&E staff shall ensure that the DDR M&E system is complete and all relevant
data and information is updated.

120. Fourth, during the last six months of DDR operations, M&E staff shall focus on documenting
lessons learned before and during the implementation of the programme. Lessons learned
should be based on the information produced by the DDR M&E system over the course of the
programme. In addition, DDR M&E staff may consider conducting ‘End of Assignment
Reports’26 with key staff.

121. Fifth, DDR M&E staff shall hand over all documentation of the M&E system to the
Disarmament, Demobilization and Reintegration Section of the Department of
Peacekeeping Operations, which serves as central repository for all DDR M&E data.

HANDING OVER THE M&E SYSTEM

122. If DDR or components of it are to continue in some form under the authority of national
institutions or as part of subsequent UN recovery, reintegration or development
programmes, the M&E unit shall hand over relevant and non-confidential data and information
to these successor institutions or programmes.

PLAN FOR EX-POST EVALUATIONS

123. In a typical DDR programme, the expected accomplishment will only be achieved after the
programme has stopped operating. For example, credible judgements about the sustainability
of the reintegration of all programme participants can only be made after the reintegration
support has stopped.

26 For details on End of Assignment Reports see Programme Management Guidelines, DPKO/DFS.

124. In the last six months of operations, the DDR Head of Component shall – with the technical
expertise of the M&E unit – plan for at least one ex-post evaluation.

125. The DDR Head of Component should, in close coordination with DPKO’s DDR Section,
determine the timing and funding for the ex-post evaluation.

126. Ex-post evaluations should focus on the success and sustainability of the reintegration aspect
of DDR, be external and carried out between 12 and 24 months after the DDR programme
has stopped operating.

E. TERMS AND DEFINITIONS

The following Monitoring and Evaluation terms and definitions are referred to in these guidelines. The
terms are, whenever available, in line with the DPKO/DFS Guidelines on Programme Management
and the official terms and definitions of the Secretariat, DPKO and DFS (indicated by an asterisk (*)).

Accountability*: Accountability is the process whereby public service organizations and individuals
within them are held responsible for their decisions and actions, through a clearly specified and
enforced system of rewards and sanctions.27

Activity*: Action taken to transform inputs into outputs.28

Attribution: A claim that progress towards an expected accomplishment of DDR is caused by DDR
outputs.

Baseline*: Data that describe the situation to be addressed by a programme, sub-programme, or


project and that serve as the starting point for measuring performance. 29

Baseline study: An analysis describing the situation prior to the commencement of the programme or
project or the situation following initial commencement of the DDR programme, to determine baselines,
and to serve as a basis for future analyses.

Budget*: The collective cost of programme or sub-programme resources needed to perform the
specific activities through a defined time cycle.

Contribution: A claim that DDR outputs were contributory, amongst others, to an observed change
with respect to the expected accomplishment.

Deliverables*: Products produced by the programme/sub-programme.

Disaggregation: The separation of aggregate data into its component parts.

Effectiveness: The extent to which expected accomplishments are achieved.

Efficiency*: Measure of how well inputs (funds, expertise, time, etc.) are converted to outputs.30

27 Department of Management
28 ST/SGB/2000/8 (PPBME)
29 Ibid
30 Ibid

Evaluation*: A systematic and objective process seeking to determine the relevance, effectiveness,
and impact of a programme/sub-programme related to its goals and objectives. Evaluation is often
undertaken selectively to answer specific questions to guide decision-makers and/or programme
managers, and to provide information on whether underlying theories and assumptions used in
programme development were valid, what worked and what did not work and why.31

Expected accomplishments*: A desired outcome or result of the DDR programme/sub-programme,
involving benefits to end-users. Accomplishments are the direct consequence or effect of the delivery
of outputs and lead to the fulfilment of the envisaged objective.32

External evaluation: An evaluation that is conducted by entities free from control or influence of those
responsible for the design and implementation of the DDR programme.

Evaluation-led evaluation*: Evaluation undertaken directly by the DPKO/DFS Evaluation Unit in the
Policy, Evaluation and Training Division (DPET).33

Impact*: An expression of the changes produced in a situation as the result of an activity that has
been undertaken. 34 Impact is the longer-term or ultimate effect attributable to a programme, sub-
programme or project, in contrast to an expected accomplishment and output, which are geared to the
timeframe of a plan.35

Implement*: To carry out or put into effect – according to, or by means of – a definite work plan or
procedure.

Implementing partner: In the context of DDR, a partner organization which is contracted by the DDR
programme to carry out specific reintegration activities.

Indicator*: A measure, preferably numeric, of a variable that provides a reasonably simple and
reliable basis for assessing accomplishment, change or performance. It is a unit of information
measured over time that can help show changes in a specific condition.36

Indicator baseline: The measurement of an indicator before a DDR programme starts operating,
expressed as a single number or word.

Indicator target: The expected measurement of an indicator at the end of a DDR programme,
expressed as a single number or word.

Indicator framework: A summary in a single table of the indicators for outputs, for the expected
accomplishment and for the overall objectives, along with indicator baselines and targets, the portfolio
of evidence, the frequency of data collection and the estimated costs.

Indicator Tracking Sheet: A table which is updated frequently with the current status with respect to
each indicator defined in the Indicator Framework.

Input*: Funds, personnel and other resources necessary for producing outputs.37

31 Ibid
32 ST/SGB/2000/8 (PPBME)
33 Policy Directive on DPKO/DFS Headquarters Self-Evaluation
34 ST/SGB/2000/8 Secretary-General’s bulletin: Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation (PPBME)
35 UN OIOS Glossary of Monitoring and Evaluation Terms
36 Ibid
37 Ibid

Internal evaluation: An internal evaluation is conducted by staff within the DDR programme or by
related UN staff. It may be undertaken by the DDR programme or a sub-programme (‘programme-led
evaluation’), or by the DPKO/DFS evaluation team in the Policy, Evaluation and Training Division or
the Office of Internal Oversight Services (‘evaluation-led evaluation’).38

Lessons learned*: The knowledge gained from the process of planning and executing a
programme/sub-programme. A lesson learned is a generalization derived from evaluation
experiences with programmes, sub-programmes or policies that is applicable to a generic situation
rather than to a specific circumstance and has the potential to improve future actions.39

Logical framework (Logframe)*: Management tool used to identify strategic elements of a
programme or sub-programme (objective, expected accomplishments, indicators of accomplishment,
outputs and inputs) and their causal relationships, as well as the assumptions and external factors that
may influence success and failure. It facilitates planning, implementation, monitoring and evaluation of
a programme or sub-programme.40

Mission concept*: Provides political and operational direction, timelines and lead/supporting roles for
priority activities to achieve the mission’s mandate as provided by the Security Council, including the
mission’s priority tasks and related organizational and deployment structure.41

Monitoring*: An assessment by programme managers, team members, M&E staff and audit bodies,
of the progress in achieving the expected accomplishments and delivering the final outputs in
comparison with the commitments set out in the programme/sub-programme budget as approved by
the General Assembly. It provides assurance that the implementation of a programme/sub-programme
is proceeding as planned.42

Monitoring and evaluation*: Monitoring and evaluation together provide the knowledge required for
effective programme management, as well as for reporting and accountability responsibilities.43

Objective*: An objective refers to an overall desired achievement involving a process of change that is
aimed at meeting certain needs of identified end-users within a given period of time.44 A good
objective supports the accomplishment of a goal and meets the criteria of being impact-oriented,
measurable, time-limited, specific and practical.

Outcome*: In the United Nations Secretariat, “outcome” is used as a synonym of an accomplishment
or a result.45

Output*: The final products or deliverables of a programme/sub-programme to stakeholders, which an
activity is expected to produce in order to achieve its objectives. Outputs may include reports,
publications, training, meetings, security services, etc.46

Output indicator: An indicator which tracks the extent to which a planned DDR output has been
delivered.

38 Programme Management Guidelines, DPKO/DFS
39 UN OIOS Glossary of Monitoring and Evaluation Terms
40 Ibid
41 IMPP Guidelines: Role of the Headquarters Integrated Planning for UN Field Presences
42 Ibid
43 Ibid
44 ST/SGB/2000/8 (PPBME)
45 UN OIOS Glossary of Monitoring and Evaluation Terms
46 ST/SGB/2000/8

Peacekeeping support account budget*: Established to provide a flexible mechanism to fund
headquarters capacity to plan, establish and direct field operations.

Performance indicator*: A measure that provides a reliable basis for assessing performance within a
programme/sub-programme.

Primary data: Data which are collected directly by DDR M&E staff, or whose collection is funded and
managed by DDR M&E staff.

Planning*: The process of developing strategies, objectives and work plans to achieve
programme/sub-programme success.

Process indicator: A measure that tracks key aspects of how a DDR programme operates. While a
performance indicator tracks outputs and outcomes, a process indicator typically refers to crucial
issues of the budget, human resources or public information activities of a DDR programme.

Programme*: A programme consists of the activities undertaken by a department or office together
with a coherent set of objectives, expected accomplishments and outputs intended to contribute to one
or more organizational goals established by Member States. The programme is guided by the
mandates entrusted to a department/office by the General Assembly or the Security Council.47

Programme-led evaluation*: Evaluation undertaken by the programme/sub-programme with the
support of the DPKO Evaluation Unit in the Policy, Evaluation and Training Division (DPET).48

Programme manager*: A programme manager is the official responsible for the formulation and
implementation of a programme/sub-programme.49

Programme management*: The centralized and coordinated management of a specific programme
to achieve its strategic goals, objectives and expected accomplishments.

Programme performance report*: The mandated report of the Secretary-General submitted to the
General Assembly biennially reflecting implementation and results for programmes in the Secretariat.50

Programme/sub-programme plan*: A detailed document stating objectives, expected
accomplishments, activities/outputs, performance indicators, responsibilities, and time frames. It is
used as a monitoring and accountability tool to ensure the effective implementation of the
programme/sub-programme. The plan is designed according to the logical framework.51

Project*: Planned activity or a set of planned, interrelated activities designed to achieve certain
specific objectives within a given budget, organizational structure and specified time period. Within the
Secretariat, projects are used in technical cooperation activities. 52 Individual projects within the
programme are managed by project managers. The programme manager is responsible for
overseeing overlap among the programme/sub-programme projects.

Qualitative data: Information concerned with people's opinions, knowledge, attitudes or behaviours.
Qualitative information can consist of numbers or text.

47 UN OIOS Glossary of Monitoring and Evaluation Terms
48 Policy Directive on DPKO/DFS Headquarters Self-Evaluation
49 ST/SGB/2000/8 (PPBME)
50 UN OIOS Glossary of Monitoring and Evaluation Terms
51 Ibid
52 Ibid

Quantitative data: Information measured or measurable by, or concerned with, quantity. Quantitative
data typically consists of numbers.

Result*: The measurable accomplishment/outcome (intended or unintended, positive or negative) of a
programme/sub-programme. In Secretariat practice, “result” is synonymous with accomplishment
and outcome.53

Results chain: A logically connected hierarchy of expected results from a DDR programme. In a DDR
results chain, a set of activities leads to the delivery of an output. A set of outputs should cause the
expected accomplishment for DDR. The expected accomplishment contributes, among other
achievements, to the overall objective of a peacekeeping mission.

Results-based budgeting*: A programme budget process in which: (a) programme formulation
revolves around a set of predefined objectives and expected results; (b) expected results would justify
resource requirements which are derived from and linked to the outputs required to achieve such
results; and, (c) actual performance in achieving results is measured by objective performance
indicators.54

Results-based management*: A management strategy by which the Secretariat ensures that its
processes, outputs and services contribute to the achievement of clearly stated expected
accomplishments and objectives. It is focused on achieving results and improving performance,
integrating lessons learned into management decisions and monitoring of and reporting on
performance.55

Sample: A subset of a larger population deemed large enough to provide a meaningful representation
of the whole population.

Sample survey: A survey carried out on a sample (subset) of a population. The sample should be
large enough, and selected in such a way, that findings from the sample survey closely approximate
those that would have been obtained if the whole population had been surveyed.
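
As an illustration only (the formula below is a standard statistical rule of thumb and is not prescribed by this SOP), the minimum sample size needed to estimate a proportion in a population of size N is often approximated as:

\[ n_0 = \frac{z^2\, p\,(1-p)}{e^2}, \qquad n = \frac{n_0}{1 + \dfrac{n_0 - 1}{N}} \]

where z is the critical value for the chosen confidence level (1.96 for 95 per cent), p is the expected proportion (0.5 if unknown), e is the acceptable margin of error and n is the sample size after the finite population correction. For example, z = 1.96, p = 0.5 and e = 0.05 give an n_0 of approximately 385 before the correction.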

Secondary data: Data which is not collected by DDR M&E staff, and which is readily available.

Self-evaluation*: Self-evaluation is an integral part of the management process and is undertaken by
managers primarily for their own use. The programme manager applies the findings from the self-
evaluation to make necessary adjustments to implementation of the programme/sub-programme, or
they are fed back into the planning and programming process as proposed changes in the design
and/or orientation of the programme/sub-programme.56

SMART*: An acronym often used when creating programme and sub-programme planning elements.
It stands for specific, measurable, achievable, realistic and time-bound.57

53 UN OIOS Glossary of Monitoring and Evaluation Terms
54 Results-Based Budgeting, Secretary-General’s Report (A/53/500)
55 UN OIOS Glossary of Monitoring and Evaluation Terms
56 ST/SGB/2000/8 (PPBME)
57 Specific: planning elements that are related to the mandate; Measurable: quantifiable planning elements that are easily monitored and evaluated for programme/sub-programme success and progress, making it easier to report to stakeholders on progress; Achievable: indicated by planning elements that can happen in the specific period; Realistic: although being ambitious in creating programme/sub-programme goals and objectives is encouraged, managers must ensure that planning elements remain both realistic and achievable; Time-bound: managers must ensure that the objectives they have created are achievable within the necessary time frame.

Stakeholder*: An agency, organization, group or individual interested in a programme/sub-
programme’s end results. Not all stakeholders are involved in completing the actual work of a
programme/sub-programme. Common stakeholders include Member States, DPKO/DFS operations,
host countries, UN system partners, regional organizations and other external partners.58

Survey: A system for collecting information from or about people, to describe, compare, or explain
their knowledge, attitudes, and behaviour.

58 Ibid

F. REFERENCES
Normative or superior references

- Guidelines - Programme Management, DPKO/DFS [draft version: to be finalized in 2010]
- Guidelines - Integrated Planning for UN Field Presence, Role of the Headquarters, DPKO, May 2009
- Integrated Disarmament, Demobilization and Reintegration Standards (IDDRS), UN, 2006
- Integrated Missions Planning Process - Guidelines Endorsed by the Secretary-General in 2006, UN, 2006
- Norms and Standards for Evaluation in the UN System, UNEG, 2005
- Policy - Programme Management Policy, DPKO/DFS [draft version: to be finalized in 2010]
- Policy Directive - DPKO/DFS Mission Evaluation Policy, DPKO/DFS, 2008
- Secretary-General's bulletin, Regulations and Rules Governing Programme Planning, the Programme Aspects of the Budget, the Monitoring of Implementation and the Methods of Evaluation (PPBME) (ST/SGB/2000/8)

Related procedures or guidelines

- Fragile States and Peacebuilding Programs - Practical Tools for Improving Performance and Results, Social Impact, 2009
- Glossary of Monitoring and Evaluation Terms, UN OIOS, Monitoring, Evaluation and Consulting Division
- How to Guide: Monitoring and Evaluation for Disarmament, Demobilization and Reintegration (DDR) Programmes, BCPR, UNDP, 2009
- Monitoring Peace Consolidation: UN Practitioners' Guide to Benchmarking, Peacebuilding Support Office, forthcoming
- Operational Guide to the Integrated Disarmament, Demobilization and Reintegration Standards, UN, 2006
- Thematic Evaluation of DDR in peacekeeping operations, OIOS, 2009

G. MONITORING AND COMPLIANCE


All DPKO staff must comply with this SOP. DPKO programme managers at Headquarters will
monitor the application of this SOP to ensure consistency and compliance.

H. CONTACT
The Disarmament, Demobilization and Reintegration Section of the Office of the Rule of Law and
Security Institutions should be contacted for information about this SOP.

I. HISTORY
This is a new SOP and has not been amended.

APPROVAL SIGNATURE:
DATE OF APPROVAL:

ANNEX 1: EXAMPLE OF A DDR INDICATOR FRAMEWORK*
DDR results chain entries appear as headings; indicator rows list: Indicators | Baseline (month/year) | Target (month/year) | Portfolio of Evidence | Frequency of data collection

Expected accomplishment: Former members of and women associated with armed forces and groups are socially integrated in their communities and economically active
% of ex-combatants who entered a reintegration programme and formally finish it | n/a** | 80% (12/2014) | Quarterly reports of Implementing Partners | quarterly
% of ex-combatants who are very satisfied or satisfied with the support received for reintegration | pending first survey | 70% (12/2014) | Satisfaction survey | annually
% of women associated with armed forces who are very satisfied or satisfied with the support received | pending first survey | 80% (12/2014) | Satisfaction survey | annually
% of all ex-combatants who are economically active at least 6 months after DDR reintegration support stops | pending first survey | 65% (12/2014) | Reintegration survey | every 2 years
% of communities who think that demobilized combatants are well integrated in the community | pending first survey | 80% (12/2014) | Reintegration survey | every 2 years

Output 1: 40,000 combatants of armed forces and groups are demobilized and disarmed
No. of combatants disarmed | 0 (01/2011) | 40,000 (12/2014) | DREAM database | monthly
% of disarmed combatants who are children | n/a*** | n/a*** (12/2014) | DREAM database | monthly
% of targeted units declared non-compliant | 0% (01/2011) | 3% (12/2014) | Quarterly reports of Nat. DDR Commission | quarterly
No. of small arms collected | 0 (01/2011) | 25,000 (12/2013) | DREAM database | monthly
% of collected small arms which are destroyed | n/a | 100% (12/2014) | DREAM database | monthly
% of disarmed ex-combatants who are provided with a demobilization package | n/a | 95% (12/2014) | DREAM database, reintegration module | monthly

Output 2: 35,000 ex-combatants and 3,000 women associated with armed forces are provided with reintegration support
No. of ex-combatants formally entering a reintegration programme | 0 (01/2011) | 35,000 (12/2014) | Monthly reports of Implementing Partners | monthly
% of demobilized ex-combatants who have not started a reintegration programme | n/a | 10.5% (12/2014) | DREAM database | quarterly
Average no. of days between formal demobilization and start of reintegration programme, by quarter | n/a | 14 (12/2014) | DREAM database | quarterly
% of ex-combatants who joined but drop out of a reintegration programme | n/a | 5% (12/2014) | Monthly reports of Implementing Partners | quarterly
No. of women associated with armed forces who are provided with reintegration support | 0 (01/2011) | 3,000 (12/2014) | Monthly reports of Implementing Partners | monthly

Process indicators:
DDR Budget
% of quarterly administrative overhead costs compared to quarterly expenditure | n/a | 5% (12/2014) | DDR accounting system | quarterly
% of funds mobilized which are spent | 2% | 100% (12/2014) | DDR financial reports | quarterly
DDR Human resources
Ratio of international to national DDR staff | 1:5 (01/2011) | 1:20 (12/2014) | DDR HR records | quarterly
% of DDR positions which are vacant | n/a | 2% (12/2014) | DDR HR records | quarterly
* simplified for illustrative purposes; a full DDR indicator framework would typically include more outputs and/or more detailed indicators.
** If DDR operations have not yet started, the baseline for some indicators cannot be calculated yet.
*** In this case, neither baseline nor target is useful. It is, however, important to track the indicator during the programme cycle.

ANNEX 2: EXAMPLE OF A DDR INDICATOR TRACKING SHEET*
(Quarterly columns are filled in, in turn, according to data and reports received each quarter.)

Indicator rows list: Indicators | Baseline (month/year) | 1st Q. 2011 | 2nd Q. 2011 | 3rd Q. 2011 | 4th Q. 2011 | etc. | Target (month/year) | Frequency of collection

Expected accomplishment: Former members of and women associated with armed forces and groups are socially integrated in their communities and economically active
% of ex-combatants who entered a reintegration programme and formally finish it | n/a | n/a | 70% | 70% | 70% | 80% (12/2014) | quarterly
% of ex-combatants who are very satisfied or satisfied with the support received for reintegration | pending first survey | pending first survey | pending first survey | 45% | 45% | 70% (12/2014) | annually
% of women associated with armed forces who are very satisfied or satisfied with the support received | pending first survey | pending first survey | pending first survey | 65% | 65% | 80% (12/2014) | annually
% of all ex-combatants who are economically active at least 6 months after DDR reintegration support stops | pending first survey | pending first survey | pending first survey | pending first survey | 40% | 65% (12/2014) | every 2 years
% of communities who think that demobilized combatants are well integrated in the community | pending first survey | pending first survey | pending first survey | pending first survey | 65% | 80% (12/2014) | every 2 years

Output 1: 40,000 combatants of armed forces and groups are demobilized and disarmed
No. of combatants disarmed (cumulative total) | 0 (01/2011) | 200 | 400 | 1000 | 1200 | 40,000 (12/2014) | monthly
% of disarmed combatants who are children | n/a | 5% | 7% | 12% | 10% | n/a (12/2014) | monthly
% of targeted units declared non-compliant | 0% (2011) | 0% | 0% | 0% | 2% | 3% (2014) | quarterly
No. of small arms collected | 0 (01/2011) | 90 | 190 | 300 | 600 | 25,000 (12/2013) | monthly
% of collected small arms which are destroyed | n/a | 0% | 50% | 70% | 65% | 100% (12/2014) | monthly
% of disarmed ex-combatants who are provided with a demobilization package | n/a | 0% | 90% | 70% | 65% | 95% (12/2014) | monthly

Output 2: 35,000 ex-combatants and 3,000 women associated with armed forces are provided with reintegration support
No. of ex-combatants formally entering a reintegration programme (cumulative total) | 0 (01/2011) | 0 | 250 | 700 | 750 | 35,000 (12/2014) | monthly
% of demobilized ex-combatants who do not enter a reintegration programme | n/a | n/a | 20% | 20% | 20% | 10% (12/2014) | quarterly
Average no. of days between formal demobilization and start of reintegration programme, by quarter | n/a | n/a | 14 | 10 | 16 | 14 (12/2014) | monthly
% of ex-combatants who joined but drop out of a reintegration programme | n/a | 20% | 15% | 15% | 10% | 5% (12/2014) | monthly
No. of women associated with armed forces who are provided with reintegration support (cumulative total) | 0 (01/2011) | 12 | 50 | 110 | 150 | 3,000 (12/2014) | monthly

Process indicators:
% of quarterly administrative overhead costs compared to quarterly expenditure | n/a | 10% | 10% | 10% | 7% | 5% (12/2014) | quarterly
% of funds mobilized which are spent | 0% | 2% | 7% | 10% | 4% | 100% (12/2014) | monthly
Ratio of international to national DDR staff | 1:5 (01/2011) | 1:5 | 1:5 | 1:5 | 1:7 | 1:20 (12/2014) | quarterly
% of DDR positions which are vacant | n/a | 25% | 25% | 25% | 15% | 2% (12/2014) | quarterly

* simplified for illustrative purposes;
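
For illustration only, the quarterly updating of a tracking sheet such as the one above can be supported by a simple spreadsheet or script. The following minimal sketch, written in Python, shows one possible way of recording quarterly values for a cumulative indicator and computing progress against its target; the class and field names are assumptions made for this example and are not prescribed by this SOP, nor do they reflect the structure of the DREAM database.

# Minimal sketch (illustrative only): storing quarterly values for one indicator
# from an Indicator Tracking Sheet and computing progress towards its target.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class TrackedIndicator:
    name: str
    baseline: Optional[float]   # None where the baseline is not yet available ("n/a")
    target: float
    frequency: str              # e.g. "monthly" or "quarterly"
    values: dict = field(default_factory=dict)  # period label -> reported value

    def record(self, period: str, value: float) -> None:
        # Store the value reported for a given period, e.g. "2011-Q1".
        self.values[period] = value

    def progress_towards_target(self) -> Optional[float]:
        # Share of the target reached by the most recent reported value
        # (meaningful for cumulative indicators such as "No. of combatants disarmed").
        if not self.values or self.target == 0:
            return None
        latest = self.values[max(self.values)]
        return latest / self.target

# Example using the cumulative disarmament indicator from the sheet above.
disarmed = TrackedIndicator(
    name="No. of combatants disarmed (cumulative total)",
    baseline=0, target=40_000, frequency="monthly",
)
disarmed.record("2011-Q1", 200)
disarmed.record("2011-Q4", 1_200)
print(f"{disarmed.progress_towards_target():.1%} of target reached")  # prints: 3.0% of target reached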

ANNEX 3: EXAMPLE OF A DDR M&E PLAN*

DDR Monitoring and Evaluation

 The DDR Head of Component will have overall responsibility for monitoring and evaluation.
 The DDR M&E Unit, with five professional staff, will carry out day-to-day monitoring and will
oversee and manage the sample surveys, studies and evaluations. Initially, three
internationally recruited M&E specialists will be hired. The head of the M&E Unit will
report directly to the DDR Head of Component.
 The DDR M&E Unit will work in close collaboration with the National DDR Commission and
build its capacity in monitoring and evaluation.
 All M&E activities will be funded through the DDR budget lines on M&E. Total costs for
M&E over the programme lifespan are estimated at USD 760,000.
 The programme will use a combination of quantitative and qualitative tools to track
progress and success of DDR.

Quantitative tools:
 Ex-combatant database DREAM: The DDR M&E unit will obtain and
analyze data from DREAM on a monthly basis.
 Survey on satisfaction: The DDR programme will track the satisfaction of
ex-combatants with the reintegration package on an annual basis through a
representative sample survey.
 Sample surveys on reintegration: In addition, the programme will assess
the medium-term success of the reintegration of DDR participants in their
communities in year 2 and year 4 of the programme.

Qualitative tools:
 Mid-term and final evaluation: The programme will commission an external,
independent mid-term evaluation at the beginning of year 3 and a final
evaluation in year 5 of the programme.
 Internal evaluations: The DDR programme will internally assess the
efficiency of DDR implementation and identify bottlenecks at the end of
year 1 of the programme. In addition, an internal evaluation in year four will
assess the effectiveness of the work of implementing partners.
 Study on special groups: In year 2, the programme will assess the extent
to which special reintegration packages for children and members of special
groups have been effective, and use the findings to adjust the reintegration
support.
 Field visits: The M&E unit will carry out routine field visits to implementing
partners at least on a monthly basis.

Indicator Framework

 The key tool to aggregate and track progress towards the expected accomplishment and
the two outputs of the programme will be the Indicator Tracking Sheet, using the
indicators defined at the planning stage.

Table: DDR Indicator Framework (see Annex 1)

Surveys, Studies and Evaluation Plan

SURVEYS

Satisfaction survey (type: sample survey; timing: 1st year, 3rd quarter and 2nd year, 4th quarter; cost: $25,000 per round)
 to track the satisfaction of ex-combatants with the reintegration packages

Reintegration survey (type: sample survey; timing: 2nd year, 4th quarter; cost: $85,000)
 to assess how well the reintegration of DDR participants has worked so far
 to assess to what extent communities are supporting reintegration

STUDIES

Study on reintegration of special groups (type: external study; timing: 2nd year, 3rd quarter; cost: $35,000)
 to assess the extent to which special reintegration packages for children and members of special groups have been effective so far

EVALUATION

Internal evaluation on efficiency (type: internal evaluation by the DDR M&E unit; timing: 1st year, 4th quarter; cost: $2,500)
 to assess whether the DDR programme has so far delivered outputs on time and as planned
 to identify the key bottlenecks in DDR implementation

Mid-term evaluation (type: mixed, external and internal; timing: 3rd year, 2nd quarter; cost: $65,000)
 to evaluate to what extent the DDR programme has so far progressed towards its planned accomplishment
 to identify what should be changed to increase the likelihood of achieving the planned accomplishment

Internal evaluation on effectiveness (type: internal evaluation by the DDR M&E unit; timing: 4th year, 1st quarter; cost: $10,000)
 to assess the effectiveness of the reintegration support provided by implementing partners

... etc.

* simplified for illustrative purposes

ANNEX 4: EXAMPLE OF A DDR SAMPLE SURVEY

Design
 Since a baseline study had not been conducted and the DDRR program intervention was
already in progress, a non-equivalent control group, post-test-only quasi-experiment
was considered the best available research design.
 The DDR sample survey was primarily designed to determine the impact of the DDR
programme’s reintegration training.
 In addition, the survey aimed at collecting additional information about conflict, conflict
analysis, reintegration, and reconciliation.

Sampling strategy
 The survey used a nationwide random sampling strategy, targeting a sample of 550 from
an estimated ex-combatant population of 105,000.
 Experimental groups were drawn randomly from the same clusters of geographic locations.
Respondents were divided into two groups: DDR program participants and non-participants.
 Program participants were further divided according to their stage of intervention
(participants who had only been disarmed and demobilized, participants who had enrolled
in but not completed reintegration training, and participants who had completed
reintegration training).
 This method utilized information that was entered into the DDR database.
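
For illustration only: under simplifying assumptions not stated in the survey report (simple random sampling, a 95 per cent confidence level and maximum variability, p = 0.5), the precision implied by a sample of 550 drawn from an estimated population of 105,000 can be sketched as follows. The figures are taken from the bullets above; the actual clustered, quasi-experimental design would yield a different effective precision.

# Minimal sketch (illustrative only): approximate margin of error for a proportion
# estimated from a sample of 550 out of roughly 105,000 ex-combatants.
import math

def margin_of_error(n: int, population: int, p: float = 0.5, z: float = 1.96) -> float:
    # Margin of error for an estimated proportion, with finite population correction.
    se = math.sqrt(p * (1 - p) / n)                       # standard error of the proportion
    fpc = math.sqrt((population - n) / (population - 1))  # finite population correction
    return z * se * fpc

print(f"+/- {margin_of_error(550, 105_000):.1%}")  # prints roughly +/- 4.2%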

Implementation
 Implementation followed a blended approach that combined a quantitative interview
instrument with a qualitative participatory method (focus group discussions).
 A randomized sample of 590 adult former fighters, drawn from across Liberia in February
and March 2006, was interviewed. The interviews collected information on socio-economic
demographics, the ex-combatants' geographic locations during the 14-year civil war, their
actions while serving with their former factions during the conflict, and their knowledge of
and participation in international intervention programs, and captured their social
integration and political expressions.

Key Findings:
 Empirical evidence supports the finding that those former combatants who registered
with the national DDR program and completed a course of reintegration training have
reintegrated more successfully than those ex-combatants who chose not to participate
and reintegrate on their own.

based on:
What the Fighters Say: A Survey of Ex-combatants in Liberia, James Pugel, UNDP/African Network for the Prevention and Protection against Child Abuse and Neglect (ANPPCAN), April 2007, http://www.lr.undp.org/UNDPwhatFightersSayLiberia_Finalv3.pdf
