ForAfrika DMEAL Framework

The DMEAL Framework developed by ForAfrika aims to guide the design, implementation, and assessment of programs to effectively address community needs and promote continuous learning. It emphasizes the importance of community participation and localization in all stages of the program lifecycle to enhance relevance, responsiveness, and sustainability. The framework is part of a broader organizational strategy to ensure quality and efficiency in program management, ultimately striving for self-reliance and prosperity in African communities.


DMEAL Framework

2025

ForAfrika | DMEAL Framework 1


DMEAL Framework

Authors
Fabio Bezerra Correia Lima – Director of DMEAL
Thabo Miya – Programme Officer

Collaborators
Abeba Amene – Chief Programmes Officer
Dr Mary Okumu – Senior Director Technical Unit
Richard Mulandi – Director of Integrated Programming
Narciso Cau – DMEAL Specialist (Mozambique)
Allan Ssebutinde – DMEAL Specialist (Uganda)

Acknowledgements
We would like to express our deepest gratitude to all those who contributed their time,
expertise, and valuable insights in the development of this DMEAL Framework.
Special appreciation goes to the dedicated members of the Global Support Office (GSO)
Technical Unit. Your technical expertise and guidance were invaluable in shaping this robust
tool.
We also wish to thank the teams at the country offices, specifically the directors, programme
managers, and monitoring & evaluation officers. Your hands-on experience and intimate
knowledge of our work’s realities ensured that this guideline is both effective and practical in
measuring our impact at the grassroots level.
We are grateful to the members of the Executive Committee for their leadership and vision.
Your thorough review and final approval of this guideline not only ensures its quality but also
reaffirms our collective commitment to evidence-based decision-making and transparency.
Our gratitude extends to the communities we serve. The trust and co-creation of knowledge
you all facilitate is integral to our work. Your resilience and determination continually inspire
us to improve our methods and understand better the impacts of our work.
Together, we strive to create a world where everyone has the opportunity to lead a sustainable
and self-sufficient life.
Published in April 2023 - Updated in September 2024



Contents
Authors 2
Collaborators 2
Acknowledgements 2

1. Introduction 7
1.1 Background and purpose 7
1.2 Theory of Change 8
1.3 Importance of community participation in the DMEAL process 10
1.4 Project and Programme Management 11
• Project Management 11
• Programme Management 11
1.5 DMEAL in the programme cycle 12
Direct Users of the DMEAL Framework 13

2. Needs Assessment and Community Engagement 14


2.1 Ethical considerations 15
2.2 Preparation and engagement of community members and stakeholders in the assessment
process 16
Preparation and Planning: 17
Stakeholder Engagement: 17
2.3 Identify gaps and opportunities: Inform programme design by addressing community needs
and context 20
Data Collection: 20
Data Analysis: 20
Prioritisation: 21
2.4 Document findings: Use community input, data, and analysis to inform programme design 21
Chapter 2 Summary Table 22

3. Participatory programme design 24


3.1 Define the programme logical framework 24
3.2 Identify indicators and targets 28
3.3 Performance-Based Budgeting 29
Steps to Develop a Performance-Based Budget 29
3.4 Project Performance Management 30
3.5 Plan data collection methods and tools 30
Identification of appropriate data collection methods: 30



Development of data collection tools: 31
Pilot testing of the tools: 31
Chapter 3 Summary Table 32

4. Community-based monitoring 33
4.1 Establish a monitoring plan 33
4.2 Collect data on programme progress 35
4.3 Participatory data verification and improvement 37
4.4 Monitor progress against targets and develop action plans 37
Chapter 4 Summary Table 39

5. Participatory evaluation 41
5.1 Establish an evaluation plan 41
Programme Prioritisation for Evaluation 41
Establishing the Evaluation Plan 42
Types of Evaluations and Evaluation Approaches 43
5.2 Ensure community ownership and empowerment 45
5.3 Evaluation management 46
5.4 Evaluation Action Plan 47
5.5 Internal Evaluations and Reviews 48
5.6 Impact Measurement 49
Chapter 5 Summary Table 51

6. Community-driven accountability 53
6.1 Establish community-based feedback channels 54
Suggested proactive feedback channels 55
Suggested reactive feedback channels 55
6.2 Handling and addressing complaints and grievances 56
6.3 Interpreting and responding to stakeholder feedback 60
6.4 Communicating responses 61
6.5 Foster community ownership 61
6.6 Reporting 62
6.7 Roles & Responsibilities 63

7. Organisational learning and adaptations 66


7.1 Promote a learning culture 66
7.2 Document and disseminate lessons learned 67



7.3 Capacity building 70
7.4 Adapt and improve 72

References 74

Annex A - Programme Good Practices Report 75


Executive Summary 75
1. Programming Framework 75
2. Good/Best practice identification 75
3. Project/Programme Outcomes, Results, Impact, and Evaluation Methodology 76
4. Project/Programme Management Cycle 76
6. Project/Programme Sustainability Measures and Local Ownership 76
7. Conclusions, Recommendations, and Next Steps 76
End notes: 77
Appendices 77

Annex B – Evaluation Quality Checklist 78


Terms of Reference 78
Section A: Context and Description of Object of the Evaluation (Weight 5%) 78
Section B: Evaluation Purpose, Objectives, and Scope (Weight 30%) 79
Section C: Evaluation Framework (Weight 20%) 79
Section D: Evaluability Assessment and Methodology (Weight 25%) 80
Section E: Work plan/Evaluation Management and Structure/Presentation of the ToR (Weight
20%) 81
Inception Report Review 82
Section A: Opening Pages and Introduction (Weight 5%) 82
Section C: Purpose, Objectives, and Scope of the Evaluation (Weight 10%) 83
Section D: Evaluation Framework (Weight 20%) 83
Section E: Methodology (Weight 30%) 84
Section F: Evaluation Workplan (Weight 20%) 85
Section G: Inception Report Structure/Presentation (Weight 5%) 85

Draft Evaluation Report Review 85


Section A: Background (Weight 5%) 86
Section B: Evaluation Purpose, Objectives And Scope (Weight 5%) 86
Section C: Evaluation Methodology (Weight 20%) 87
Section D: Evaluation Findings (Weight 25%) 88
Section E: Evaluation Conclusions & Lessons Learned (Weight 10%) 89



Section F: Recommendations (Weight 15%) 89
Section G: Evaluation Structure/Presentation (Weight 5%) 90
Section H: Evaluation Principles (Weight 10%) 91
Section I: Executive Summary (Weight 5%) 91

Annex C – DMEAL skills auto assessment 92


Core DMEAL Skills 93

Annex D - Programme Logframe Template 96

Annex E – Evaluation Budget 97

Annex F – Performance Management Plan 99

Annex G - Report Quality Assessment Tool 101



1. Introduction
Welcome to the heart of the guideline, where the framework is grounded in ForAfrika’s mission
and context. This document lays the foundation for Design, Monitoring, Evaluation,
Accountability and Learning (DMEAL), introducing its pivotal role within ForAfrika’s operations.
The aim is to elucidate the integral role that DMEAL plays in ForAfrika’s programme
management cycle.

1.1 Background and purpose


In 2021, Joint Aid Management underwent a significant organisational restructuring to ensure
its strategies and policies realise its vision of an “Africa that thrives”. Beyond the rebranding
to ForAfrika as a leading pan-African Non-Governmental Organisation (NGO), the
organisation also developed a Theory of Change (ToC) to define how ForAfrika would support
resilience building for vulnerable households.
Similarly, the organisation developed Programming Frameworks that seek to cascade the ToC
to field operations. The Programming Frameworks define in detail how the organisation will
ensure and deliver depth and impact in its projects/programmes. The frameworks establish
the Programming Principles that underpin all the organisation’s Projects/Programmes. Some
of these include Localisation, Inclusion, Focus on the child, Transformational Development,
Resilience Building, Strategic Partnership with local communities as equal partners and
alignment to global development standards.
Alongside the corporate Theory of Change and Programming Frameworks, ForAfrika has
developed Sector Strategies that guide all its five core priority sectors (Health and Nutrition,
WASH, Food Security and Livelihoods, Education and Economic Empowerment) and enable
its country teams to align their programmes with global/regional/national development plans
and goals.
This DMEAL Framework is an integral part of the programme cycle and adds to the corporate
efforts towards programme quality and efficiency. The purpose of the framework is to guide
ForAfrika in designing, implementing, and assessing our programmes to ensure they
effectively address the needs of the communities we serve, maximise the positive impact of
our interventions, and promote a culture of continuous learning and improvement.
This guidance has been developed to provide a more context-specific, flexible, and
participatory approach to programme management, allowing our teams to respond more
effectively to the dynamic environments in which they operate. The system chosen to
operationalise and manage information generated by this framework is Activity Info, a
programme management platform tailored to ForAfrika’s needs.
The DMEAL framework emphasises the importance of engaging community members and
stakeholders in all aspects of programme design, implementation, monitoring, evaluation, and
learning. By fostering community participation, the framework helps ensure that programmes
are relevant, responsive, and accountable to the needs and priorities of the communities they
serve.



Additionally, the framework supports a culture of learning and adaptation, enabling us to refine
our strategies and interventions based on evidence, feedback, and lessons learned from our
own experiences as well as those of others in the sector. Therefore, the Framework adds
independence and objectivity to the critical role of ForAfrika’s DMEAL function.

1.2 Theory of Change


ForAfrika’s core mission is to provide African communities with ways in which they can create
a sustainable living. As of 2023, the organisation has ongoing operations in Angola,
Mozambique, Rwanda, South Africa, South Sudan, Uganda, Central African Republic and
Ethiopia. ForAfrika has registered affiliate support offices in Canada, Germany, Switzerland,
the United States, the United Kingdom and representatives in Norway and Australia.
Our vision is an Africa that thrives, and our mission is to provide the resources that unlock
the abundance of Africa so every African community can thrive.
Our Theory of Change is rooted in the belief that progress starts from within communities. We
begin by addressing immediate deprivation through relief aid. By providing timely support
during disasters to help meet basic needs, we mitigate suffering while laying the foundations
for recovery.
As the initial crisis subsides, our integrated programming across sectors converges to
restore access to essential services and livelihoods. We collaborate with communities to
rebuild critical WASH, health, education, and economic infrastructure. If critical infrastructure
is rehabilitated to strengthen localisation and resilience, household and community assets will
be restored. This is because context-appropriate, community-managed facilities expand
equitable access to services central to welfare.
In the process, we journey with communities for the long term. We strengthen capabilities
to withstand future shocks through initiatives owned and directed locally. If communities gain
skills and tools to drive solutions tailored to their contexts, they will continually gain confidence
and problem-solving skills to expand their potential. This assumes the community knows
best how to direct resources and innovations to unlock local progress.
As community knowledge, resources and leadership are activated, a collective action and
accountability culture takes root. We nurture partnerships across stakeholders towards
common goals informed by grassroots realities. If collaborative efforts coalesce around
community-identified priorities, synergies will organically boost localised solutions. This
convergence is rooted in localised priorities, tapping insights from those closest to the need.
Through years of sustained, equitable engagement, communities gain the power to direct
their advancement beyond our presence. If community-driven development is incubated to
the point of self-propagation, 20 million Africans will progress towards self-reliance. The
building blocks for continued African prosperity will ripple across communities by catalysing
localised leadership and ingenuity.
We measure results through sustained access to quality services, asset building, and
household resilience indicators. But lasting testimony to our theory of change will be thriving
African communities lifted through their collective strength and vision.

1.3 Importance of community participation in the DMEAL process
Community participation, with an emphasis on localisation, is a crucial aspect of the Design,
Monitoring, Evaluation, Accountability, and Learning (DMEAL) process for ForAfrika. By
actively involving local communities in all stages of the programme lifecycle, ForAfrika can
better address their unique needs, maximising the impact of our programmes. A localisation
approach highlights the importance of local knowledge, resources, and decision-making
power in driving sustainable development.
Here are several reasons why a localisation approach to community participation is essential
in the DMEAL process at ForAfrika:
• Contextual understanding: Engaging local community members improves understanding
of the local context, culture, and dynamics, ensuring tailored interventions that address
specific needs and challenges.

• Local ownership and empowerment: Community participation fosters local ownership and
empowerment, as participants become active stakeholders in the programme’s design,
implementation, monitoring, and evaluation, contributing to long-term success and
sustainability.

• Local capacities and resources: A localisation approach leverages the skills, knowledge,
and resources of local communities, building on their existing capacities and strengths to
support the development of locally driven solutions and promote self-reliance.

• Responsiveness and relevance: Involving local community members ensures


programmes remain responsive to evolving needs and priorities, allowing ForAfrika to
make necessary adjustments and enhance the effectiveness and impact of their
interventions.

• Accountability and transparency: Community participation promotes accountability and


transparency at the local level, enabling community members to hold the organisation
accountable for its commitments and actions, fostering trust and credibility.

• Learning and adaptation: Engaging local community members in monitoring, evaluation,


and learning components facilitates continuous learning and adaptation, enabling the
organisation to refine its strategies and interventions to better meet community needs.

• Sustainability: A localisation approach fosters local ownership, capacity building, and the
development of community-driven solutions, supporting the long-term viability of
interventions even after the organisation’s direct involvement has ended.

In summary, a localisation approach to community participation in the DMEAL process is


essential for ForAfrika, as it enables the organisation to create and manage programmes that
are more relevant, effective, and sustainable, ultimately leading to greater impact and lasting,
positive change in the communities they serve.



1.4 Project and Programme Management

• Project Management
According to the Project Management Institute and PMD Pro 1, a project is defined as “a
temporary endeavour undertaken to create a unique product, service, or result”.
Projects deliver integrated outputs (deliverables), which then result in better outcomes
(results) for communities and other stakeholders (such as donors). Projects are time-bound
and focus on a requirement to deliver specific benefits for communities in ways that are
cost-effective and measurable.
Project Management is the discipline of planning, organising and managing resources to bring
about the successful delivery of specific project goals, outcomes and outputs. The primary
challenge of project management is to achieve each of these, while managing project
constraints related to scope, budget, schedule and quality.

• Programme Management
Programmes are groups of related projects and activities (sometimes referred to as
’component parts of a programme’) that are managed in a coordinated way to achieve an
impact that is greater than if they were managed individually. In other words, the whole
(the benefit of the programme) is greater than the sum of its parts (the projects, activities and
tasks). ForAfrika organises projects into programmes to deliver outcomes that address a broad
range of needs and achieve exponential benefits for the communities in which we work.
Programmes, unlike projects, are generally implemented through a centralised management
system (Country Directors and Programme Managers) in which groups of projects are
coordinated to achieve a programme’s overall strategic objectives and benefits. This approach
enables us to achieve economies of scale and realise incremental change that would not be
possible if projects were managed separately.

Although the focus of this guideline is on the DMEAL process from a programme perspective, all
the tools and rationale presented below can also be applied to project management.

1
Project Management for Development Professionals Guide developed by PM4NGOs. Accessible at
https://www.pm4ngos.org/project-dpro/



1.5 DMEAL in the programme cycle
The programme cycle encompasses all stages of a programme, from the initial planning phase
to the final evaluation and learning stages, and DMEAL activities are integral to each of these.
In the planning phase, the DMEAL process begins with the identification of needs and
community engagement (Chapter 2), which informs the design of the programme.
Participatory programme design (Chapter 3) is used to define programme outputs, select
appropriate indicators, set targets, and plan data collection methods.
During the implementation phase, the DMEAL process continues with community-based
monitoring (Chapter 4). Monitoring activities, such as data collection and progress review, help
to ensure that the programme is on track and that any issues or changes in the context are
identified and addressed promptly.
In the evaluation stage, DMEAL involves participatory processes (Chapter 5) to assess the
effectiveness, impact and sustainability of the programme. Evaluations provide valuable
information for accountability purposes and for learning and improving future programmes.
The accountability needs for quality programming (Chapter 6) are supported by DMEAL
processes throughout the entire cycle, fostering community ownership, transparency, and
responsiveness to concerns or grievances.
Finally, throughout the programme DMEAL supports the learning and adaptation stage
(Chapter 7) by promoting a learning culture within the organisation, documenting and
disseminating lessons learned, building capacity, and using feedback and evaluation findings
to improve and adapt the programme and future initiatives.
This cycle continues to repeat, with each iteration informed by the learning and adaptation
from previous cycles, hence embodying the continuous improvement ethos of DMEAL.
In summary, the DMEAL process within the programme cycle can be represented as follows:



Direct Users of the DMEAL Framework
From global to country level, the cooperation and commitment of each team member play a
vital role in successfully embodying ForAfrika's vision, mission, and values. This guideline is
intended to be more than a tool; it aims to be a living document that empowers staff in their
daily work and contributes to strengthening programmatic cohesion, impact, and
transparency. Here is how we see it being useful for multiple functions:

DMEAL Director (GSO Level): The DMEAL Director is the highest authority in terms of the
Guideline's implementation. Their role includes overseeing the DMEAL practices across the
organisation, ensuring alignment with ForAfrika's values and strategies. They offer guidance
to all levels of staff, facilitate necessary training, ensure resources are allocated for DMEAL,
and promote a culture of continuous learning and improvement.

Programme Quality Manager (GSO level): The Programme Quality Manager ensures that
the quality of programme design and implementation aligns with the guidelines set forth. They
work closely with the DMEAL Director to translate the Guideline into practice and ensure that
the quality standards are met at all stages of the programme cycle. Critically, they are
responsible for ensuring that each programme aligns with ForAfrika's results framework at the
design stage, which means they actively guide and oversee the development and refinement
of programme plans, outcomes, and indicators.

Programme Data Officer (GSO level): The Programme Data Officer's primary role is to manage
and oversee the quality of data collected throughout the programme's cycle. They play an
instrumental role in implementing data-related aspects of the Guideline, such as designing
and maintaining data collection tools, ensuring data quality, and facilitating data analysis and
interpretation.

Programme Integration Director (GSO level): The Programme Integration Director's role is
to ensure that all different programmes and initiatives align with the Guideline. They facilitate
cross-departmental coordination, promote integration of DMEAL practices into all
programmes, and ensure that the DMEAL processes are incorporated in a way that
complements and enhances overall programme performance.

Country Directors: The Country Directors are responsible for implementing the Guideline at
the country level. They oversee their respective teams, ensuring that DMEAL practices are
incorporated into all programmes. They also facilitate coordination with the GSO level staff,
report on DMEAL activities, and ensure that country-level insights feed into organisational
learning.

DMEAL Managers and Officers (country level): The DMEAL Managers and Officers,
hereinafter called DMEAL staff, are the backbone of DMEAL implementation at the country
level. They facilitate data collection, manage data quality, conduct monitoring and evaluation
activities, and report findings in line with the Guideline. They also collaborate closely with
programme managers and community stakeholders to ensure participatory approaches are
utilised. Additionally, DMEAL staff play a key role in ensuring that country-level programme
design and implementation align with ForAfrika's results framework, providing the necessary
support and oversight to ensure this alignment.



Finance Officers (country level): Key personnel in the implementation of performance-based
budgeting, which will provide valuable insights on programme/project efficiency.

Programme Managers and Staff (country level): Programme Managers have a critical role
in implementing the Guideline in their specific programmes. They are responsible for
incorporating DMEAL activities into programme design and implementation, collaborating with
DMEAL Staff to ensure data is collected and used effectively, and promoting participatory
approaches in DMEAL practices.

2. Needs Assessment and Community Engagement


The journey begins with understanding the context – the community’s needs, strengths, and
potential. This chapter outlines the process of conducting a comprehensive needs assessment
while ensuring active community engagement.

A participatory needs assessment is a collaborative process that actively involves community
members, local stakeholders, and other relevant parties in identifying, prioritising, and
analysing the needs, challenges, and assets of a community. This inclusive approach ensures
that the assessment captures diverse perspectives and knowledge, ultimately leading to more
relevant, effective, and sustainable development interventions.

Needs assessment is a pre-requisite for all interventions. A programme should answer to
specific needs of a population, which are justified by concrete evidence.

Why do assessments?
1. Provide better understanding of people’s vulnerabilities, their causes, and their
manifestations.
2. Establish the number and composition of those affected by livelihood shocks.
3. Identify what else is being done (if at all) to address the identified vulnerabilities.
4. Identify stakeholders in the technical or geographic area.
5. Inform project models, implementation strategies and performance targets, among others.
ForAfrika will use secondary data 2 – such as sociodemographic government surveys, situation
analysis by reputable partners, etc – as a first step in narrowing down the areas of operation
within a country and region focusing on a vulnerability ranking. Further identification of
communities’ needs will be achieved through a process of participatory needs assessment,
described below.
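The secondary-data step described above can be sketched as a weighted composite ranking. The regions, indicator names, values, and weights below are hypothetical, purely for illustration of the idea of a vulnerability ranking, not ForAfrika's actual scoring method:

```python
# Illustrative only: rank candidate areas of operation by a weighted
# composite vulnerability score built from (hypothetical) secondary-data
# indicators. A higher score means more vulnerable, hence higher priority.

# Hypothetical normalised indicators (0 to 1) per region, from secondary sources
regions = {
    "Region A": {"food_insecurity": 0.8, "wash_gap": 0.6, "health_access_gap": 0.7},
    "Region B": {"food_insecurity": 0.4, "wash_gap": 0.9, "health_access_gap": 0.5},
    "Region C": {"food_insecurity": 0.3, "wash_gap": 0.2, "health_access_gap": 0.4},
}

# Hypothetical weights reflecting programme priorities (sum to 1)
weights = {"food_insecurity": 0.5, "wash_gap": 0.3, "health_access_gap": 0.2}

def composite_score(indicators: dict) -> float:
    """Weighted sum of normalised indicator values."""
    return sum(weights[name] * value for name, value in indicators.items())

# Most vulnerable regions first
ranking = sorted(regions, key=lambda r: composite_score(regions[r]), reverse=True)
for region in ranking:
    print(f"{region}: {composite_score(regions[region]):.2f}")
```

In practice the indicators would come from referenced government surveys or partner situation analyses, normalised to a common scale before weighting.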
A complete multisectoral assessment should be conducted in cases where there is no prior
surveying and implementation history with a community. Complete assessments use
cross-sectional qualitative surveys to objectively understand needs, which are then evaluated
against humanitarian standards and the sustainable livelihoods index.

2
All secondary data sources should be explicitly referenced.



When the communities have already expressed their priorities, a rapid assessment can be
done to validate if previous findings still apply (at two-year intervals from the execution of the
complete assessment).
Assessment reports should be included with proposals and agreements or shared on
accessible platforms. This practice improves our research quality and our understanding of
partner capabilities, and it helps navigate situations in which funders propose collaborating on
predetermined priorities. By sharing this data with ForAfrika’s project team, we can better
inform and improve our interventions, balancing funder priorities with community needs and
evidence-based practices.

2.1 Ethical considerations


Within the context of needs assessments and any other primary research we undertake,
ForAfrika will be guided by the following considerations:

Minimising Personally Identifiable Information (PII) Data Collection: Personally


identifiable information (e.g., names, phone numbers, personal documents) should only be
collected when essential.

Gender and Representation: Ensure gender balance in the composition of data collection
teams and in participatory activities. At a minimum, all indicators involving numbers or
percentages of people should be disaggregated by gender. The DMEAL team will strive to
include individuals of all age groups and persons with disabilities in routine data collection
efforts.
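As a minimal sketch of the disaggregation requirement above, the snippet below counts participants by gender; the record fields, categories, and values are hypothetical:

```python
from collections import Counter

# Hypothetical participant records from a data collection exercise
records = [
    {"gender": "female", "age_group": "18-35", "disability": False},
    {"gender": "male",   "age_group": "36-59", "disability": True},
    {"gender": "female", "age_group": "60+",   "disability": False},
    {"gender": "female", "age_group": "18-35", "disability": True},
]

def disaggregate(records, field):
    """Count participants by a single disaggregation field."""
    return Counter(r[field] for r in records)

by_gender = disaggregate(records, "gender")
total = sum(by_gender.values())
for gender, count in by_gender.items():
    print(f"{gender}: {count} ({100 * count / total:.0f}%)")
```

The same helper applied to `age_group` or `disability` gives the additional disaggregations the DMEAL team strives to include.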

Informed Consent: Participants must be clearly informed about how their data will be used,
with whom it will be shared, and the process for submitting complaints or addressing concerns.
Consent must be explicitly obtained before data collection begins.

Security and Confidentiality: Personal data, particularly identifiable information, should be


securely stored to prevent unauthorised use. Such data must not be shared with authorities
without prior discussion with the programme team and Country Director (CD).
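One common way to honour both the PII-minimisation and confidentiality points above is to store a salted, keyed hash in place of a respondent's name. This sketch uses Python's standard library; the salt value and its handling are illustrative assumptions, not a prescribed ForAfrika procedure:

```python
import hashlib
import hmac

# Illustrative: replace a respondent's name with a pseudonymous ID so that
# datasets can be linked across survey rounds without storing the name itself.
# The secret salt must be stored separately from the data and access-controlled.
SECRET_SALT = b"replace-with-a-securely-stored-random-value"  # hypothetical

def pseudonym(name: str) -> str:
    """Deterministic, non-reversible respondent ID derived from a name."""
    digest = hmac.new(SECRET_SALT, name.strip().lower().encode("utf-8"),
                      hashlib.sha256).hexdigest()
    return digest[:12]  # shortened for readability in datasets

record = {"respondent_id": pseudonym("Jane Doe"), "district": "Example District"}
print(record)
```

Because the hash is keyed, the mapping cannot be reversed without the salt, yet the same respondent always yields the same ID for longitudinal linkage.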

Ethical Clearance: Before contacting any respondents for participation in any needs
assessments and/or studies which collect data beyond our monitoring activities, ForAfrika
country office teams should submit applications to ethical clearance statutory bodies in their
relevant jurisdiction for the ethical vetting of the proposed research.



Country  Ethics Body
AO       National Ethics Committee for Health Research (CNER - Comissão Nacional de Ética
         para a Pesquisa em Saúde)
CAR      National Ethics Committee for Health Research
ET       Ethiopian National Research Ethics Review Committee (NRERC)
RW       Rwanda National Ethics Committee (RNEC)
MZ       National Bioethics Committee for Health (CNBS - Comité Nacional de Bioética para a
         Saúde)
SA       National Health Research Ethics Council (NHREC)
SS       South Sudan Ministry of Health Research Ethics Committee
UG       Uganda National Council for Science and Technology (UNCST)

While many of these review boards pertain specifically to health, clearances for research in
other fields may be obtained from the internal review boards of most Higher Education
Institutions (HEIs). This is an important layer of compliance for the organisation to
document. Moreover, partnerships with these institutions may be established to support this
and other aspects of our research efforts.
When undertaking ethical clearance for research, always keep in mind the following:
• Identify the Relevant Body: Depending on the nature of your research (e.g., health, social
sciences), you may need to approach different bodies.
• Submit Proposal: Prepare a detailed research proposal, including information on ethical
considerations, and submit it to the appropriate committee or board.
• Follow Local Regulations: Ensure compliance with local laws and guidelines regarding
research ethics.

2.2 Preparation and engagement of community members and


stakeholders in the assessment process
Before conducting the needs assessment, it is essential to define the objectives, scope, and
timeline, as well as identify the key stakeholders and community members who should be
involved. This stage also involves engaging with stakeholders and community members to
ensure that they are informed about the purpose, process, and expected outcomes of the
needs assessment. Building trust and establishing open lines of communication are crucial for
fostering collaboration and active participation.



Preparation and Planning:

1. Mobilise resources: Each Country Office needs to budget for regular community consultations in its country strategy. These consultations build up a bank of community-based evidence on local priorities. Further, every project proposal budget must include resources for a complete or rapid needs assessment.
2. Define objectives, scope, and timeline: Clearly outline the purpose and objectives of the participatory needs assessment. Determine the scope by specifying the geographic area, target population, and thematic focus (related to ForAfrika’s programme areas), and develop a realistic timeline for data collection.
3. Identify stakeholders and participants: Compile a list of key stakeholders, including
local community members, leaders, civil society organisations, government
representatives, and other relevant parties who should be involved in the process.
Consider diversity in terms of gender, age, ethnicity, socio-economic status, and other
factors to ensure a balanced representation of the community. Define your focal area
precisely to draw an appropriate and representative sample.
4. Select participatory tools and methods: Choose appropriate participatory tools and
methods that align with the objectives and context of the needs assessment. Ensure that
the selected methods are culturally sensitive, accessible, and inclusive to facilitate
meaningful participation from all community members.
5. Develop a plan for data collection, quality control, and reporting: Outline the data
collection strategy, including the methods, tools, and techniques to be used, as well as
the roles and responsibilities of the assessment team and community participants. Set up
a plan for how the team will ensure the quality standards set for the data are adhered to
as well as a plan for how the results of the data collection will be reported. Consider
ethical considerations and ensure informed consent from participants.

Stakeholder Engagement:

1. Categorise stakeholder groups to begin engagement planning: Identify which stakeholder groups have high or low capacity to influence project results, as well as which stakeholders have positive or negative interest in the project.
High influence, positive interest: These stakeholders require close engagement and
active management. Prioritise regular communication, involve them in key decisions, and
collaborate closely to leverage their influence and positive interest for project success.
High influence, negative interest: Monitor these stakeholders closely and develop
strategies to mitigate potential risks they may pose. Engage diplomatically to address their
concerns, provide clear information to counter misconceptions, and seek to shift their
interest towards a more positive stance.
Low influence, positive interest: Keep these stakeholders well-informed and engaged
to harness their enthusiasm. Involve them in project activities where appropriate and use
their positive interest to gather valuable feedback and insights.



Low influence, negative interest: Monitor these stakeholders with minimal effort, but be
prepared for potential criticism or opposition. Develop simple communication strategies
to address their concerns and consider ways to improve their perception of the project if
resources allow.
2. Initial contact and relationship-building: Reach out to key stakeholders and community
members to introduce the needs assessment and establish a foundation of trust and
collaboration. This may involve organising preliminary meetings or consultations to
discuss the purpose, objectives, and expected outcomes of the assessment.
3. Information sharing and communication: Clearly communicate the goals, process, and
expectations of the participatory needs assessment to stakeholders and community
members, using accessible language and formats. Establish open lines of communication
for ongoing dialogue and feedback.
4. Capacity building and training: Depending on the complexity of the participatory tools
and methods being used, provide capacity building and training for stakeholders and
community members who will be involved in the needs assessment process. This can
help ensure that all participants have the necessary skills and knowledge to contribute
effectively.
5. Establishing roles and responsibilities: Clearly define and communicate the roles and
responsibilities of stakeholders and community members in the needs assessment
process, ensuring that expectations are well-understood and that all participants are
aware of their contributions.
6. Inclusive participation: Encourage and facilitate the active participation of all community
members, paying particular attention to marginalised or vulnerable groups. This may
involve adapting the participatory methods and tools to be more accessible or providing
additional support to ensure that all voices are heard and valued.
7. Ongoing engagement and feedback: Maintain regular communication with
stakeholders and community members throughout the needs assessment process,
providing updates on progress, challenges, and findings. Seek feedback and input from
participants to ensure that the process remains responsive and relevant to the
community’s needs and priorities.
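The influence/interest categorisation in step 1 can be sketched as a simple lookup. The following Python snippet is an illustrative aid only: the function name, data structures, and example stakeholders are hypothetical, while the four strategies summarise the guidance above.

```python
# Illustrative sketch of the stakeholder influence/interest matrix (step 1).
# The quadrant strategies mirror the guidance above; all names are hypothetical.

def engagement_strategy(influence: str, interest: str) -> str:
    """Map a stakeholder's influence ('high'/'low') and interest
    ('positive'/'negative') to the recommended engagement approach."""
    matrix = {
        ("high", "positive"): "Engage closely: involve in key decisions, communicate regularly",
        ("high", "negative"): "Monitor closely: engage diplomatically, mitigate risks",
        ("low", "positive"): "Keep informed: involve in activities, gather feedback",
        ("low", "negative"): "Monitor with minimal effort: simple communication strategies",
    }
    return matrix[(influence, interest)]

# Hypothetical stakeholder register entries.
stakeholders = [
    {"name": "District health office", "influence": "high", "interest": "positive"},
    {"name": "Local trader association", "influence": "low", "interest": "negative"},
]

for s in stakeholders:
    print(f'{s["name"]}: {engagement_strategy(s["influence"], s["interest"])}')
```

In practice the same mapping would live in a stakeholder identification sheet; the point of the sketch is simply that each quadrant has one agreed strategy.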



Participatory tools and methods: These facilitate the active involvement of community
members in the needs assessment process. These methods help gather valuable insights and
enable better understanding of local contexts and priorities. Here are some participatory tools
and methods ForAfrika can use during the needs assessment, along with concrete examples:
• Focus Group Discussions (FGDs): FGDs are structured group discussions involving a
diverse set of participants who share their experiences, opinions, and perspectives on
specific topics related to the needs assessment. For example, ForAfrika could organise
separate FGDs with women, men, youth, and elderly community members to discuss the
challenges they face in accessing clean water or quality education.
• Key Informant Interviews (KIIs): KIIs are in-depth, semi-structured interviews conducted
with individuals who possess significant knowledge or experience in the community.
ForAfrika could interview community leaders, teachers, healthcare workers, or local
business owners to gather insights into the community's pressing needs and potential
solutions.
• Participatory Mapping: Participatory mapping involves community members in creating
visual representations of their community, highlighting important resources, challenges,
and priorities. ForAfrika could facilitate a mapping exercise where community members
draw or place markers on a large map to identify areas with limited access to healthcare
facilities or areas prone to natural disasters.
• Problem Ranking and Prioritisation: This method engages community members in identifying, ranking, and prioritising their most pressing needs and challenges. ForAfrika
could use a matrix ranking exercise, where participants list problems and rate them
according to criteria such as severity, frequency, or impact, resulting in a ranked list of
priority issues to address.
• Community Surveys: Community surveys involve administering questionnaires to gather
quantitative and qualitative data on community needs, perceptions, and priorities.
ForAfrika could develop a survey that explores household income levels, access to
essential services, and the community's perspectives on the most urgent challenges.
• Participatory Rural Appraisal (PRA): PRA is a set of techniques used to engage local
communities in gathering and analysing information about their needs and resources.
ForAfrika could use methods such as transect walks (systematic walks through the
community with local guides), seasonal calendars (to identify patterns and changes in
resources or needs throughout the year), or Venn diagrams (to visualise relationships and
interactions between different community stakeholders).
• Community Visioning and Action Planning: This method encourages community
members to envision their desired future and develop action plans to achieve their goals.
ForAfrika could facilitate a visioning workshop where participants discuss their hopes and
aspirations for the community, identify the necessary steps and resources, and
collaboratively develop an action plan to address priority needs.
• Photovoice: Community members are asked to represent their community or express their point of view by taking photographs, usually on a specific theme or topic relevant to the project.



• Community Scorecards: A community-led transparency and accountability tool used to
garner public opinion on service delivery.

2.3 Identify gaps and opportunities: Inform programme design by addressing community needs and context
Using participatory tools and methods, gather both qualitative and quantitative data on the
community’s needs, challenges, priorities, and assets. After the data is collected, analyse
it to identify key themes, patterns, and trends. This process should be done collaboratively,
involving community members and stakeholders in the interpretation of the data, ensuring that
their perspectives and insights are incorporated.
Then, based on the findings, work with community members and stakeholders to prioritise
the identified needs and challenges, considering the community’s resources, capacities, and
preferences. This process may involve ranking, scoring, or other consensus-building
techniques to reach an agreement on the most pressing needs.

Data Collection:

• Train data collectors, including community members and assessment team members, on
the selected participatory tools and methods. Ensure they understand the objectives of the
needs assessment, ethical considerations, and how to collect and record data accurately
and respectfully.
• Conduct data collection activities using the selected participatory tools and methods to
capture needs, challenges and assets, and accurately record the data using formats
such as written notes, audio recordings, photographs, or visual representations. Ensure
that data is stored securely and confidentially.
• Regularly monitor the data collection process to ensure that it is progressing according to
the plan, addressing any challenges or issues that may arise.

Data Analysis:

• Gather all collected data and organise it in a structured and accessible format. This may
involve transcribing audio recordings, digitising handwritten notes, or sorting and
categorising data.
• Analyse qualitative data, such as focus group discussions or key informant interviews, by
identifying common themes, patterns, and trends. This may involve using techniques such
as content analysis, thematic analysis, or narrative analysis.
• Analyse the quantitative data, such as survey responses or numerical rankings, using
descriptive or inferential statistical methods, depending on the data and research
questions.



• Compare and contrast the findings from different data sources and methods to ensure
reliability and validity. Identify any discrepancies or inconsistencies and explore possible
explanations.
• Involve community members and stakeholders in the data analysis process, ensuring that
their insights, interpretations, and perspectives are incorporated into the findings.

Prioritisation:

• Share the findings of the data analysis with community members and stakeholders, using
accessible formats and language.
• Organise workshops or meetings to discuss the findings and collaboratively prioritise the
identified needs and challenges. Ensure that diverse community members and
stakeholders are included in these discussions.
• Establish criteria for prioritising needs and challenges, such as urgency, severity,
feasibility, or potential impact. These criteria should be agreed upon by the community
members and stakeholders involved in the process.
• Apply the prioritisation criteria to the identified needs and challenges, either through a
structured process such as matrix ranking or through group discussions and consensus-
building.
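The matrix-ranking approach described above can be illustrated with a short script. All criteria, weights, and scores below are hypothetical examples; in practice they would be agreed with community members and stakeholders during the prioritisation workshop.

```python
# Illustrative matrix ranking: score each identified problem against agreed
# criteria and produce a ranked priority list. All figures are hypothetical.

criteria_weights = {"urgency": 0.4, "severity": 0.4, "feasibility": 0.2}

# Scores (1 = low, 5 = high) as agreed during the workshop.
problems = {
    "Limited access to clean water": {"urgency": 5, "severity": 5, "feasibility": 4},
    "Low adult literacy":            {"urgency": 3, "severity": 4, "feasibility": 3},
    "Poor road conditions":          {"urgency": 2, "severity": 3, "feasibility": 2},
}

def weighted_score(scores):
    """Weighted sum of a problem's scores across all criteria."""
    return sum(scores[c] * w for c, w in criteria_weights.items())

ranked = sorted(problems, key=lambda p: weighted_score(problems[p]), reverse=True)
for rank, problem in enumerate(ranked, start=1):
    print(f"{rank}. {problem} (score: {weighted_score(problems[problem]):.1f})")
```

The same arithmetic can be done on flipchart paper; the value of making the weights explicit is that the community can debate the criteria, not just the final ranking.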

2.4 Document findings: Use community input, data, and analysis to inform programme design
The next step is to compile the findings from the data analysis and prioritisation processes,
ensuring that all relevant information, including qualitative and quantitative data, is included
and presented in a clear and organised manner.
This information will feed into the needs assessment report, which should include the following sections: an executive summary, background information, objectives and methodology, key findings, prioritised needs and challenges, and recommendations for action.
Ensure that the report is written in an accessible and engaging style that effectively
communicates the key messages to a diverse audience.
Once a draft report is finalised, share it with community members and stakeholders, seeking their input and feedback to ensure accuracy, relevance, and inclusiveness. The revised and approved report shall be disseminated to relevant stakeholders, such as community members, local authorities, donors, and partner organisations. Consider using different formats and channels for dissemination, such as printed copies, digital files, or presentations at community meetings, to ensure the widest possible reach.

Note: Securely store and archive the collected data and documentation from the needs assessment process, ensuring that all materials are appropriately safeguarded and can be accessed for future reference or follow-up activities.



The results of the participatory needs assessment will be used to inform the design of interventions and the development of action plans. Engage community members and stakeholders in this process to ensure that their priorities and insights are considered, leading to more relevant and effective programmes.

Chapter 2 Summary Table

Person responsible for all activities below: Programme Quality Manager & DMEAL Staff.

PREPARATION AND PLANNING
• Objective and Scope Definition: Clearly outline the purpose and objectives of the participatory needs assessment. Tools required: meeting space, objectives and scope outline.
• Stakeholder Identification: Compile a list of key stakeholders, including local community members, leaders, etc. Tools required: stakeholder identification sheet.
• Timeline Establishment: Develop a realistic timeline for conducting the needs assessment. Tools required: Gantt chart or other programme management tools.
• Tool and Method Selection: Choose appropriate participatory tools and methods that align with the objectives and context. Tools required: needs assessment tools and methods guide.
• Data Collection Plan Development: Outline the data collection strategy, including the methods, tools, etc. Tools required: data collection plan template.
• Resource Mobilisation: Identify and mobilise the resources needed to conduct the needs assessment. Tools required: budgeting and resource allocation tools.

STAKEHOLDER ENGAGEMENT
• Initial Contact and Relationship Building: Reach out to key stakeholders to introduce the needs assessment and establish trust. Tools required: emails, letters, phone calls.
• Information Sharing and Communication: Communicate the goals, process, and expectations of the participatory needs assessment. Tools required: emails, newsletters, meetings.
• Capacity Building and Training: Provide capacity building and training for stakeholders and community members. Tools required: training materials.
• Roles and Responsibilities Definition: Clearly define and communicate the roles and responsibilities of stakeholders. Tools required: roles and responsibilities chart.
• Inclusive Participation Facilitation: Encourage and facilitate the active participation of all community members. Tools required: meetings, workshops.
• Ongoing Engagement and Feedback: Maintain regular communication with stakeholders and community members. Tools required: emails, newsletters, meetings.

IDENTIFY GAPS AND OPPORTUNITIES
• Data Collection: Train data collectors and conduct data collection activities. Tools required: data collection tools and methods.
• Data Analysis: Gather all collected data and analyse it to identify key themes, patterns, and trends. Tools required: data analysis software.
• Prioritisation: Share the findings and collaboratively prioritise the identified needs and challenges. Tools required: meetings, workshops.

DOCUMENT FINDINGS
• Report Writing: Compile the findings into a needs assessment report, including an executive summary, key findings, etc. Tools required: report writing tools.
• Report Revision and Approval: Share the draft report with community members and stakeholders, seeking their input and feedback. Tools required: emails, meetings.
• Report Dissemination: Disseminate the approved report to relevant stakeholders. Tools required: emails, printed copies, community meetings.
• Programme Design: Use the results of the participatory needs assessment to inform the design of interventions and development of action plans. Tools required: programme design tools, meetings.


3. Participatory programme design
After understanding the community’s needs, it is time to channel these insights into a tangible programme design. Participatory programme design is a collaborative approach which emphasises the active involvement of all stakeholders, particularly community members. It seeks to make use of local knowledge, ensuring that the programme is relevant, feasible, and aligned with community needs, priorities, and capacities. By engaging in decision-making and contributing to the design of the programme, community members take ownership, which can enhance commitment, sustainability, and the overall effectiveness of the programme.
We now move to addressing the needs identified in Chapter 2: defining clear and measurable programme objectives and outcomes, developing a comprehensive logic model, identifying appropriate indicators and targets, and planning suitable data collection methods and tools.
All the products of this chapter shall be logged in ActivityInfo using the Project Details Form.

3.1 Define the programme logical framework


Establishing clear programme objectives and outcomes is a critical phase in the participatory
programme design process. These objectives and outcomes should be rooted in the findings
of the participatory needs assessment, ensuring the programme design responds to
identified community needs and priorities, and are also aligned with ForAfrika’s theory of
change and pillars’ LogFrames. Please see below how we differentiate results by different
levels when programming:

Impact: the long-term results ForAfrika aims to achieve, as defined by our mission and vision: 20 million people reaching self-sufficiency by 2030.

Ultimate Outcomes: the three high-level results which the Theory of Change identifies as steps necessary to achieve the impact. These are currently related to emergency response, building back better, and transformational development.

Intermediate Outcomes: the high-level goals we aim to achieve through each programmatic area, which will enable the ultimate outcomes. These refer to changes at the level of community systems, policies, and collective behaviour. Each intermediate outcome should be achieved by a combination of immediate outcomes. Please refer to the Pillars’ LogFrames to harmonise programme outcomes with our global framework.

Immediate Outcomes: the first level of change we expect to observe in each programme area. They are a consequence of the successful implementation of one or more projects. They normally refer to changes in knowledge, attitudes, and individual practices.

Outputs: the immediate deliverables (not changes) we are promoting – i.e. services and products. When Country Offices design projects, they need to link their outputs to the immediate goals provided in the overarching LogFrame – hence contributing to strategic alignment and reducing fragmentation of efforts.



A SMART description of results helps ensure clarity and focus in your programming efforts.
Specific: Define results precisely, detailing what is to be accomplished. For instance, instead of
saying "Improve literacy rates," consider "Improve literacy rates for women between the ages of
15 and 25 in X community."
Measurable: Determine how success will be measured. What indicators will show progress?
Continuing with the literacy example, "Increase the literacy rate among women aged 15-25 in X
community from 50% to 75% over the next five years."
Achievable: Ensure results are realistic given the resources, time, and circumstances.
Relevant: Align results with community needs and ForAfrika's mission.
Time-bound: Specify a time frame for achieving the result. This not only aids in planning but also
motivates action.

Example
If a participatory needs assessment identified low literacy rates among women as a key challenge
in a community, one of the programme immediate outcomes might be to "Increase the literacy rate
among women aged 15-25 in community X from 50% to 75% over the next five years". This
objective directly addresses the identified need and is SMART.
An associated ultimate outcome might be "Improved economic opportunities for women in X community". The Theory of Change would then map out the steps and conditions needed to move from the outputs (such as "increased school enrolment for girls," "improved school retention rates," or "increased participation in vocational training programmes") to the immediate outcome (increased literacy rates), and on to the ultimate outcome (improved economic opportunities). These steps need to explain why each lower level of results is able to generate the next level of effects.



One method to identify outputs and outcomes is to conduct a participatory workshop with
community members and stakeholders. In this workshop, we can use techniques such as
problem tree analysis3, which visually maps out the causes and effects of identified
problems, helping to identify possible solutions. Similarly, solution tree analysis4 can be used
to map potential solutions (objectives) and their effects (outcomes). These techniques
encourage a collective, comprehensive understanding of problems and their solutions,
ensuring the programme design is rooted in community realities.
A programme logical framework (LogFrame) is a diagrammatic representation that details the
sequence of cause-and-effect actions that describe the core activities of the programme,
connecting investments with results. Follow the steps below to develop the LogFrame:
• Identify the Long-Term Goal: Start by pinpointing the broader societal or systemic
change the programme aims to contribute towards. This should be a long-term goal that
extends beyond the direct influence of the programme and connects to ForAfrika’s theory
of change.
• Backward Mapping and Connecting Outcomes: Next, consider the conditions or
outcomes that must be met to achieve the long-term goal. These are the intermediate and
immediate outcomes. Plot these on a timeline and determine their causal relationship –
how achieving one leads to the realisation of the next.
• Identify Interventions: For each of these outcomes, specify the activities or outputs
necessary to bring them about. These are the practical steps the programme will
undertake, grounded in evidence-based practices or promising innovative approaches.
• List the risks and assumptions: These are the external factors that have the potential to
impact the achievement of your programme objectives but are outside of the programme’s
control. This process helps to reveal potential risks and inform risk management
strategies. A useful practice for identifying risks is to hold routine post-mortems. A post-mortem has the benefit of working from the actual factors that contributed to unsuccessful implementation, and its lessons can improve future project delivery.
• Identify appropriate indicators: For each objective identified in the hierarchy, determine
suitable indicators that will serve as evidence of progress or success. Refer to the next
section on how to develop SMART indicators.
• Define the means of verification: For every indicator, specify the source or method of
data collection. The means of verification should be reliable, practical, and appropriate to
the context.
• Write a narrative: Finally, articulate the Theory of Change in a clear, comprehensive
narrative. This should detail the programme’s long-term goal, the outcomes it seeks to
achieve, the activities it will undertake, and the assumptions it makes. This narrative
ensures shared understanding and buy-in from all stakeholders.

3 https://pmdprostarter.org/problem_tree/
4 https://pmdprostarter.org/objectives_tree/



Programme Logical Framework: a systematic and visual way to present the relationships among the resources you have to operate your programme, the activities you plan to do, and the changes or results you hope to achieve. The logic model links outcomes (both short-term and long-term) with programme activities/processes and the theoretical assumptions/principles of the programme. It has four columns:

RESULTS: The different results that the programme area aims to achieve. Each result should be specific, measurable, achievable, relevant, and time-bound (SMART). Start by listing the ultimate outcome, then follow the sequence presented above. Each result should be inserted on one line, followed by its indicators, means of verification, and assumptions.

INDICATORS: The measures or metrics that will be used to assess progress towards achieving each result. Indicators should be specific, measurable, and relevant to the result. When selecting indicators, choose ones that are easy to measure and report on; this will help ensure that progress is monitored and reported accurately.

MEANS OF VERIFICATION: How data will be collected to assess progress towards achieving each result, including the data sources, tools, or methods that will be used. When identifying means of verification, consider the data sources and methods that will be most reliable and feasible to use. This may include surveys, interviews, observation, or other data collection methods.

RISKS AND ASSUMPTIONS: External factors that need to be present for the results to be achieved. Assumptions should be specific and relevant to the result, as well as realistic and based on available information. Consider both positive and negative external factors that may impact the project's or programme's success.

Theory of Change: takes the Programme Logic Model one step further by explicitly outlining the
causal pathways leading from inputs to outcomes, identifying the preconditions or intermediate
outcomes that must be in place for the final outcomes to be achieved. It also considers the
assumptions underlying these pathways and the external factors that might influence the
achievement of outcomes.
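To make the four-column LogFrame structure concrete, a single result row can be represented as structured data. The sketch below is purely illustrative: the field names and every piece of content are hypothetical examples drawn from the literacy scenario in this chapter, and the completeness check is not part of any ForAfrika tool.

```python
# Illustrative: one LogFrame result row as structured data (hypothetical content).
logframe = [
    {
        "result": ("Immediate outcome: Increase the literacy rate among women "
                   "aged 15-25 in community X from 50% to 75% over five years"),
        "indicators": ["% of women aged 15-25 reading at grade level"],
        "means_of_verification": ["Annual household literacy survey"],
        "assumptions": ["Schools remain accessible and adequately staffed"],
    },
]

# Basic completeness check: every result row has all four columns filled in.
for row in logframe:
    missing = [k for k in ("result", "indicators",
                           "means_of_verification", "assumptions") if not row[k]]
    assert not missing, f"LogFrame row incomplete: {missing}"

print(f"{len(logframe)} LogFrame row(s) complete")
```

The point of the sketch is the discipline it encodes: no result enters the LogFrame without its indicators, means of verification, and assumptions alongside it.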



3.2 Identify indicators and targets
The success of a programme is evaluated through the identification of suitable indicators and the setting of corresponding targets, aligned with the programme’s outputs and outcomes established in Section 3.1. Indicators should capture the transformations expected as a result of programme activities. Coupled with specific targets, they offer a clear, quantifiable means of measuring progress.
This process must involve community members to ensure the identified indicators and targets
hold relevance and meaning within the community context. The following step-by-step guide
elaborates on this procedure:

Identifying Indicators
• Review Programme Outputs and Outcomes: Revisit the defined outputs and outcomes
from the programme design. Each output and outcome should have associated indicators.
• Examine ForAfrika’s Global Indicator Tracking Tool and Pillar LogFrames: Consider
the indicators already established in these tools to promote consistency and comparability
across ForAfrika’s programmes.
• Involve Community Members: Conduct workshops or discussions with community
members to identify potential indicators. Community input ensures the indicators are
understandable, meaningful, and have local relevance.
• Select Indicators: Finalise a list of indicators, making sure each outcome has at least one
corresponding indicator. They can be qualitative (describing qualities or characteristics)
and/or quantitative (numerical data).
• Evaluate selected indicators against SMART, cultural appropriateness, and other
criteria: Indicators should be SMART – Specific, Measurable, Attainable, Relevant, and
Time-bound. They should also be culturally appropriate and feasible to collect with the
resources available. Indicators should also be useful for management, specify the
standards informing their targeting, and emphasise their connection to activities
undertaken.
Example: For a programme aiming to improve literacy rates (outcome), a possible quantitative indicator could be “percentage of children at grade-level reading proficiency,” while a qualitative indicator might be “increased confidence in reading, as reported by students.”

Setting Targets
• Understand Baseline Data: Before setting targets, understand the current situation or
baseline against which progress will be measured. The baseline data for each indicator
can be gathered during the needs assessment phase.
• Consult with Community Members: Community consultation is key in setting targets to
ensure they are realistic and culturally appropriate.



• Set Targets: Targets define the specific, measurable objectives that a programme aims
to achieve within a certain timeframe. They should be ambitious but achievable.
Example: A target for the literacy programme might be “75% of children will reach grade-level
reading proficiency by the end of the school year”.

Remember, indicators and targets may need to be revised as the programme evolves or as
the community’s needs and context change. Regular review ensures that they continue to be
relevant and aligned with the programme’s objectives and outcomes.
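As a simple illustration of how baseline, target, and current values relate, the sketch below computes the share of progress made toward the example literacy target (baseline 50%, target 75%). The function name and the mid-term measurement of 60% are hypothetical, not part of any ForAfrika tool.

```python
# Illustrative: how far along the baseline-to-target path is an indicator?

def progress_toward_target(baseline: float, target: float, current: float) -> float:
    """Return the share of the baseline-to-target distance covered so far,
    e.g. for 'literacy rate among women aged 15-25 in community X'."""
    if target == baseline:
        raise ValueError("Target must differ from baseline")
    return (current - baseline) / (target - baseline)

# Example: baseline 50%, target 75%, hypothetical mid-term measurement 60%.
print(f"{progress_toward_target(50, 75, 60):.0%} of the way to target")  # → 40%
```

This kind of normalisation makes progress comparable across indicators with different units, which is useful when reporting against several targets at once.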

3.3 Performance-Based Budgeting

Developing a performance-based budget is an integral aspect of programme planning and management. This method establishes a concrete connection between financial resources,
programme activities, and projected outputs and outcomes, leading to refined financial
accountability, strategic resource allocation, and increased programme effectiveness. In
addition, it provides essential support for ForAfrika’s DMEAL framework by ensuring that
financial planning and management reflect the necessary resources for the successful
implementation of DMEAL activities.

Understanding Performance-Based Budgeting


Performance-based budgeting is an approach that aligns resources with strategic objectives and
anticipated outcomes. It comprises a detailed financial plan, itemising the cost of each output and activity
in relation to the overall programme objectives. This strategy facilitates the optimal allocation of
resources, fosters improved financial accountability, and bolsters programme efficiency and
effectiveness.

Steps to Develop a Performance-Based Budget

• Identify Activities: After defining the programme/project outcomes and outputs, list all
the activities that are necessary to achieve each output.

• Gather Quotes: Obtain quotes from partner suppliers for each of the identified activities that comprise the project, and compare overall costs across suppliers. These could include personnel costs, materials, equipment, training, and any other costs necessary to implement the activity. This step should incorporate a comprehensive analysis of all potential costs, including both direct and indirect costs such as overheads.

• Link Costs to Outputs: For each activity, identify which outputs it contributes to and link
the costs of the activity to these objectives. This can be done by assigning a percentage
of the cost of each activity to each output based on how much it contributes to the
achievement of that output. For example, if you have an activity that contributes equally



to two outputs, you might assign 50% of the cost of that activity to each output. If another
activity contributes 70% to one output and 30% to another, you would assign costs
accordingly.

• Temporal Distribution: After identifying cost centres and estimating the resources,
distribute the costs over the programme’s lifetime on a monthly basis. This will create a
clear financial timeline for executing the activities and achieving the outputs and
outcomes.
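The cost-linking and temporal-distribution steps above can be sketched in code. All activity names, costs, contribution shares, and the ten-month timeline below are illustrative assumptions, not ForAfrika figures:

```python
# Performance-based budget sketch: allocate each activity's cost across the
# outputs it contributes to, then spread output totals evenly by month.
activities = {
    # activity: (total cost in USD, {output: contribution share})
    "Teacher training":  (20_000, {"Output 1": 0.5, "Output 2": 0.5}),
    "Reading materials": (10_000, {"Output 1": 0.7, "Output 2": 0.3}),
}

output_costs: dict[str, float] = {}
for cost, shares in activities.values():
    assert abs(sum(shares.values()) - 1.0) < 1e-9, "shares must total 100%"
    for output, share in shares.items():
        output_costs[output] = output_costs.get(output, 0.0) + cost * share

months = 10  # assumed programme duration
monthly_plan = {out: total / months for out, total in output_costs.items()}

print({out: round(total) for out, total in output_costs.items()})
# {'Output 1': 17000, 'Output 2': 13000}
print({out: round(amount) for out, amount in monthly_plan.items()})
# {'Output 1': 1700, 'Output 2': 1300}
```

The assertion guards against allocation shares that do not sum to 100%, a common spreadsheet error when budgets are revised.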

3.4 Project Performance Management


Having produced all the outputs as guided by this section, the project team will have enough information to prepare a project performance management plan (see Annex F). The performance management plan schedules the expected progress across project-related activities to ensure the successful implementation of the project, including internal and external obligations. Where projects form part of a programme, a consolidated programme performance management plan covering the aspects highlighted in Annex F should be used, with a detailed per-project view of the project-specific indicators included.

3.5 Plan data collection methods and tools


In order to be prepared for a good monitoring process, the programme team needs to think in advance about how the indicators will be measured throughout the implementation of the intervention. It is important that this process considers community input, ensuring that the chosen methods respect the local culture, context, and ethical considerations. At the same time, the planned tools and methods should account for the data needs of the organisation, especially as they pertain to the disaggregation of the collected data. In particular, the primary disaggregated units of sociodemographic markers required by ForAfrika include:
• Gender (primarily male or female – where culturally appropriate, intersex),
• Age (primarily adult – over 18, or minor – under 18), and
• Disability status.
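As an illustration of this disaggregation, the sketch below tallies hypothetical participant records by the three required markers. The field names, the records themselves, and the treatment of age 18 as adult are assumptions, not a prescribed ForAfrika schema:

```python
from collections import Counter

# Hypothetical participant records; field names are illustrative only.
participants = [
    {"gender": "female", "age": 34, "disability": False},
    {"gender": "male",   "age": 12, "disability": True},
    {"gender": "female", "age": 9,  "disability": False},
]

by_gender = Counter(p["gender"] for p in participants)
# Adults taken here as 18 and over; adjust if the programme defines it differently.
by_age_group = Counter("adult" if p["age"] >= 18 else "minor" for p in participants)
by_disability = Counter(
    "with disability" if p["disability"] else "without disability"
    for p in participants
)

print(by_gender)      # Counter({'female': 2, 'male': 1})
print(by_age_group)   # Counter({'minor': 2, 'adult': 1})
```

Keeping disaggregation in the raw records, rather than only in aggregated totals, preserves the ability to re-cut the data when reporting requirements change.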

Identification of appropriate data collection methods:


Depending on the nature of the indicators, choose one of the data collection methods from the
list below, or from the options listed on pg. 18 in the Needs Assessment section of this
guideline.
• Surveys: An efficient method to amass a significant amount of data. They can be
administered in various forms, such as face-to-face, over the phone, or online. Data
collection taking place in-person will be conducted using mobile devices (smartphones,
tablets, etc.) which are compatible with ForAfrika’s form-building tool, KoboToolbox.



• Interviews: More time-intensive, yet they offer rich, detailed data. They can be structured
(fixed questions), semi-structured (flexible questions) or unstructured (informal and open-
ended discussion).
• Focus Groups: These are beneficial for exploring attitudes, perceptions and ideas in
depth. They facilitate engagement of multiple participants in a discussion and often
stimulate collective debates and conversations.
• Observations: This method can provide first-hand, qualitative data about behaviours and
interactions within the community.
• Document Review: Relevant documents like reports, minutes, and records can be
reviewed to provide a historical context and deeper insights into the community.

Development of data collection tools:


Depending on the method selected, tools will need to be developed or adapted to assist in
data collection. As a start, refer to the Global Monitoring Questions.xlsx for the minimum lines
of questioning to include as they relate specifically to the indicators represented in ForAfrika’s
global Logframe. In order to make sure our data collection remains participatory, ForAfrika
project teams should invite community members to co-design the monitoring tools during the
project’s inception meetings, allowing prospective participants to recommend indicators for
monitoring. In the case of observations, a checklist or a guide could be helpful to ensure
consistency and accuracy. Document review templates can assist in extracting and organising
relevant information efficiently.
Sampling: In order to ensure the generalisability of our survey results, surveys for this purpose
should be prepared using random sampling. Additionally, when the survey is focused on
specific sub-groups within a community’s population, stratified random sampling should be
applied. Depending on our interest in either of the specific sub-sets within our chosen sub-
population, we could apply either proportional or disproportional stratified random sampling 5.
When using online sample calculators, apply a 5% margin of error and a 95% confidence level.
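A common way to operationalise the 5% margin of error and 95% confidence level is Cochran's sample-size formula with a finite population correction, followed by proportional allocation across strata. The population size and strata below are illustrative assumptions:

```python
import math

def sample_size(population: int, margin: float = 0.05,
                z: float = 1.96, p: float = 0.5) -> int:
    """Cochran's formula with finite population correction.

    Defaults match the guidance above: 5% margin of error, 95% confidence
    (z = 1.96), and the most conservative proportion p = 0.5.
    """
    n0 = (z ** 2) * p * (1 - p) / margin ** 2
    return math.ceil(n0 / (1 + (n0 - 1) / population))

def proportional_allocation(strata: dict[str, int], n: int) -> dict[str, int]:
    """Split a total sample size across strata in proportion to their sizes.

    Note: simple rounding may leave the allocations one or two short of n;
    adjust manually if exactness matters.
    """
    total = sum(strata.values())
    return {name: round(n * size / total) for name, size in strata.items()}

n = sample_size(population=2_000)
print(n)  # 323
print(proportional_allocation({"village A": 1_200, "village B": 800}, n))
# {'village A': 194, 'village B': 129}
```

For disproportional stratified sampling, the allocation would instead be set by analytical priority (e.g. oversampling a small but important sub-group) rather than by stratum size.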

Pilot testing of the tools:


Pilot testing the tools before actual data collection is crucial. A pilot test can:
• Uncover potential issues with the tool, allowing adjustments for language, format, or
content.
• Assist in estimating the time required for data collection, thereby aiding logistical planning.

5
ResearchMethod.Net. 2024. Stratified Random Sampling – Definition, Method and Examples.
Available online: Stratified Random Sampling - Definition, Method and Examples (researchmethod.net)
[Accessed 30 September 2024].



• Provide feedback to ensure that the questions or points of observation make sense to the
community members.

Assurance of ethical considerations:


Maintaining ethical standards is non-negotiable during data collection. It is necessary to obtain informed consent from participants before data collection, ensure the confidentiality and privacy of participants’ information, and be transparent about how the data will be used.
Bear in mind, this planning stage should be iterative. Regular reviews and modifications are
necessary to align the data collection strategy with the evolving context of the community and
the programme.

Chapter 3 Summary Table

PROGRAMME LOGICAL FRAMEWORK
• Define Programme Objectives and Outcomes: Set clear objectives and outcomes based on the findings from the needs assessment. (Responsible: Programme Manager and sector specialists. Tools: Workshop space, Problem/Solution tree analysis.)
• Develop Programme LogFrame: Develop a logic model that maps out the sequence of cause-and-effect actions of the programme. (Responsible: Programme Manager and sector specialists, lead; DMEAL Staff, support. Tools: LogFrame template.)
• Review Programme Outputs and Outcomes: Revisit the defined outputs and outcomes and associate each with appropriate indicators. (Responsible: Programme Manager and sector specialists, lead; DMEAL Staff, support. Tools: Programme LogFrame.)

INDICATORS AND TARGETS
• Develop Indicator Criteria: Develop indicators that are SMART - Specific, Measurable, Attainable, Relevant, and Time-bound. (Responsible: DMEAL Staff, lead; Programme Manager and sector specialists, support. Tools: DMEAL Framework.)
• Involve Community Members: Involve community members in the process of identifying potential indicators. (Responsible: Programme Manager and sector specialists. Tools: Community meetings, workshops.)
• Set Targets: Establish specific, measurable targets that the programme aims to achieve within a certain timeframe. (Responsible: Programme Manager and sector specialists, lead; DMEAL Staff, support. Tools: Benchmark reports, baseline data.)

PERFORMANCE-BASED BUDGETING
• Identify Cost Centres: Identify cost centres associated with the programme’s activities and outputs. (Responsible: Programme Manager and sector specialists. Tools: Budgeting tools.)
• Estimate Resources: Estimate resources needed for each output and activity. (Responsible: Programme Manager and sector specialists. Tools: Budgeting tools.)
• Link Costs to Outputs and Outcomes: Associate each cost with its expected output or outcome. (Responsible: Programme Manager and sector specialists. Tools: Budgeting tools, Results framework.)
• Temporal Distribution of Costs: Distribute the costs over the lifetime of the programme. (Responsible: Programme Manager and sector specialists. Tools: Budgeting tools.)

DATA COLLECTION PLANNING
• Identify Data Collection Methods: Choose suitable data collection methods considering the nature of the indicators. (Responsible: DMEAL Staff. Tools: DMEAL Framework.)
• Develop Data Collection Tools: Develop or adapt tools to assist in data collection. (Responsible: DMEAL Staff. Tools: Data collection tools.)
• Pilot Test Tools: Conduct a pilot test of the tools before actual data collection. (Responsible: DMEAL Staff. Tools: Data collection tools.)
• Assure Ethical Considerations: Ensure ethical standards are maintained during data collection. (Responsible: DMEAL Staff. Tools: Ethical standards guide.)
4. Community-based monitoring
After an inclusive and participatory programme design process, the journey does not stop.
The next crucial step is monitoring. This chapter elucidates the significance of a community-
centred approach to monitoring, turning to the community as valuable partners and
contributors in the process.
With Community-Based Monitoring (CBM), not only is the programme data collected more
reliable and relevant, but it also facilitates a shared ownership that helps to strengthen
community engagement, empowerment, and sustainability.

4.1 Establish a monitoring plan


Upon the completion of a participatory programme design that results in well-defined outputs,
indicators, methods, and tools, the focus now shifts to setting up a community-based
monitoring plan. A comprehensive monitoring plan is vital to track the programme’s progress
towards its outputs and allows for timely modifications, if required.



In parallel, it’s critical to integrate a financial dimension into the monitoring plan. This
inclusion will allow for simultaneous tracking of both programme activities and corresponding
expenditures. Monitoring financial indicators alongside programme indicators can provide
insight into cost-effectiveness and resource allocation efficiency. Thus, the monitoring plan
should include key financial indicators that align with the programme’s outputs and outcomes.
The establishment of a monitoring plan demands a thorough understanding of the data collection tools and methods identified in Chapter 3.5. Creating a monitoring plan should follow
a systematic process that includes:
• Identify Roles and Responsibilities: Determine who within the community and the
programme team will be responsible for data collection, storage, analysis, and reporting.
This helps to clarify expectations and ensure accountability.
• Schedule Data Collection: While this should be in line with the programme
implementation timeline and the nature of the indicators (some may require more frequent
monitoring than others), to ensure appropriate time is available for data quality assurance
at GSO level, Country Office data needs to be ready (verified internally) between the 1st and the 15th of each subsequent month of implementation.
• Develop Data Collection Protocols: Refer to the means of verification identified in the
project Logframe. These information sources will become the only viable source
documents from which implementation data can be gleaned. These source documents
should be legible and clear for further use, as with quarterly data verification reports. Within
data collection protocols, make sure you establish procedures for using the identified data
collection tools and methods, including how to ensure data quality, confidentiality, and
ethical considerations. This should also involve training for community members and
programme staff involved in data collection.
• Plan for Data Analysis: Determine how the collected data will be analysed to measure
progress towards the programme outputs. This should include deciding on the statistical
or qualitative analysis techniques to be used and how the community will be involved in
interpreting the data at any joint coordination / feedback meetings.
• Prepare for Reporting and Feedback: Establish procedures for how and when to report
monitoring results to programme staff, community members, and other stakeholders. This
includes determining the formats and channels for reporting (e.g. written reports,
community meetings) and planning for how to use monitoring findings to inform
programme adjustments.

While reflecting on the topics above, make sure the monitoring process is adequately
supported by factoring in the costs associated with each activity within the programme budget.
Consider the following elements when budgeting:
• Personnel: This includes wages for data collectors, analysts, and other staff involved in
the monitoring process.
• Training: Account for costs of training sessions for community members and other data
collectors.



• Equipment and Tools: Consider any hardware or software needed for data collection and
analysis, such as computers, smartphones, data collection applications or data analysis
software. Make sure you include only costs associated with new equipment.
• Data Management: Include costs related to data storage, protection, and maintenance.
• Transport and Logistics: Account for potential travel costs associated with data collection
activities in the community.
• Communications: Include potential costs for disseminating information, such as printing
or online communication tools.
• Contingency: Finally, it’s advisable to include a contingency fund to cover any unexpected
costs that may arise during the monitoring process.
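A minimal sketch of tallying these budget elements follows. All figures, and the 10% contingency rate, are illustrative assumptions; the framework does not prescribe a rate:

```python
# Sketch of a monitoring budget built from the cost elements listed above.
line_items = {
    "Personnel": 6_000,
    "Training": 1_500,
    "Equipment and Tools": 2_000,
    "Data Management": 800,
    "Transport and Logistics": 1_200,
    "Communications": 500,
}

subtotal = sum(line_items.values())
contingency = round(subtotal * 0.10)  # assumed rate; adjust per programme risk
total = subtotal + contingency

print(subtotal, contingency, total)  # 12000 1200 13200
```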

4.2 Collect data on programme progress


Following on from establishing a comprehensive monitoring plan, it’s time to put that plan into
action. This stage of the DMEAL process is all about gathering the data necessary to track
programme progress against the predetermined outputs and outcomes.
Data collection is a collaborative and ongoing activity that should involve all relevant
stakeholders, particularly community members, given the participatory nature of the process.
Moreover, it requires meticulous attention to detail, organisation, and transparency.
Outlined below are steps to undertake data collection for programme progress:
• Prepare for Data Collection: Ensure all individuals involved in data collection are well-
versed with the established monitoring plan. This includes familiarity with the indicators,
data collection methods, tools, minimum prescribed means of verification as per the project
Logframe, and ethical considerations. Any necessary training should be provided to ensure
a consistent and accurate data collection process.
• If means of verification stem from third parties: e.g. hospital records, or ECD stock
registers, make sure the data collection focal point from the ForAfrika project team is well
acquainted with any formal data sharing agreements, key contact persons, database
access credentials, etc.
• If means of verification stem directly from project participants: e.g. signed attendance
registers for training sessions; make sure our data collectors are adequately familiar with
the means of verification as identified in the project Logframe as well as the procedure for
collecting the monitoring data with the project participants in ways which maintain our good
relationship with the participants (i.e. ethically and professionally). They should be
adequately trained. Training should cover the specific data collection methods and tools
to be used, ethical considerations, and the purpose of the data collection. The training
should ensure that all data collectors are confident and capable of carrying out their roles
effectively, thereby improving the quality and reliability of the collected data.
• Implement Data Collection Methods: Carry out the data collection activities as outlined
in the monitoring plan. This might include, but is not limited to, conducting surveys, focus



group discussions, community meetings, interviews, or direct observation. Remember to
employ a participatory approach, ensuring community members are actively involved in
the process.
• Document Collected Data: Maintain detailed and accurate records of the collected data.
The method for this documentation should have been decided upon in the monitoring plan,
whether it be physical notes, digital recordings, photographs, or other formats.
Confidentiality and data protection should be always prioritised.
• Review Collected Data: Review the collected data regularly to ensure its quality and
completeness. Any discrepancies or gaps should be addressed promptly to ensure
accurate and reliable monitoring.
• Load collected data into the ITT: on a monthly basis from the 1st to the 15th of every
month, the monitoring data from the previous month should be loaded into the ITT to
enable the leadership of the organisation to continue monitoring our progress towards set
objectives.
• Record and address any data queries returned by GSO’s DMEAL unit: This will be to
the country office team’s benefit when conducting quarterly data verification checks,
making it easier to identify projects recurrently triggering data issues.
• Store Data Securely: It’s crucial that all data collected, especially the source documents
informing any reported data, is stored safely and securely to maintain confidentiality and
ensure it’s ready for analysis in the following stages of the DMEAL process.
• Share collected data: Invite community members and project participants to co-interpret the findings during feedback sessions, or at the first interaction of the next monitoring month. Document any feedback shared to inform future programming and to share with the community for their own community development planning.

Along with programme progress, data collection efforts should also encompass budget
utilisation. Tracking spending against the budget will highlight any discrepancies between
planned and actual expenditures, thus facilitating timely financial management decisions.
Financial data collection should be done accurately through ForAfrika’s financial system and
compiled in a monitoring spreadsheet which pairs outputs and outcome indicators with their
budgetary performance.



4.3 Participatory data verification and improvement
Data verification is a crucial part of the monitoring process. This entire process should be
conducted by DMEAL team members on a quarterly basis (once every three months). The
primary goal of a data verification process is to empower project teams with the capacity for
reporting high quality data consistently. Additionally, it can also help us identify the root causes
of any data discrepancies as well as to verify the accuracy of historical data.
Based on the feedback shared by the GSO DMEAL unit on the monitoring data submitted, country office teams may identify some recurring areas/projects where the quality of data submissions can be supported. These recurring projects, as stated above, ought to be the focus of the data verification exercise. The exercise itself will also rely on the source documents submitted in support of the reported data.

Data verification formula: %variance = [(reported value – verified value) / verified value] x 100.
The process has two parts6: data verification, and data improvement.
• Part 1 (data verification): Determine the percentage of variance between what was
reported and what has been accounted for in the source documents;
• Part 2 (data improvement): undertake a root cause analysis, identify areas for
improvement, and document an action plan for addressing any issues.
This exercise should be implemented during the first year of project start-up, within 6–12 weeks after beginning data collection. The highest acceptable variance is 5% at a quarterly rate. Any variance greater than 5% will require more frequent iterations of this exercise until 90% of all indicators have a variance of less than 5%. Any identified variances need to be documented and shared with the Chief Programmes Officer and the Director of DMEAL at GSO for approval before the corresponding changes can be effected in the country-specific Indicator Tracking Table.
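Part 1 of the exercise can be sketched as follows. The indicator names and figures are hypothetical, used only to show the variance formula and the 5% threshold in action:

```python
def percent_variance(reported: float, verified: float) -> float:
    """%variance = [(reported value - verified value) / verified value] x 100."""
    return (reported - verified) / verified * 100

# Hypothetical quarterly figures per indicator:
# (reported value, value verified against source documents).
indicators = {
    "children screened": (430, 500),
    "teachers trained":  (102, 100),
    "meals served":      (10_300, 10_250),
}

results = {name: percent_variance(r, v) for name, (r, v) in indicators.items()}
flagged = {name: var for name, var in results.items() if abs(var) > 5}
pass_rate = 100 * sum(abs(v) < 5 for v in results.values()) / len(results)

print(flagged)  # indicators requiring a Part 2 root-cause analysis
print(f"{pass_rate:.0f}% of indicators within the 5% threshold")
```

Indicators in `flagged` would move to Part 2 (root-cause analysis and an action plan), and the exercise would be repeated more frequently until the 90% pass rate is reached.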

4.4 Monitor progress against targets and develop action plans


Monitoring progress against targets is an ongoing process, vital for understanding how well a
programme is performing and whether it is on track to achieve its desired outputs and
outcomes. This process involves comparing collected data against the set targets (identified
in Chapter 3.2), and actively engaging community members in the discussion of findings.
• Assess Progress: On a regular basis (as outlined in the monitoring plan), gather the data
collected and assess it against the pre-established targets. This can involve statistical

6
Okello, F., Kitungulu, B., Kabore, I., Adhikary, R., Merrigan, M., Lew, K., and Etheredge, G. 2013. Participatory
Data Verification and Improvement Tool: Framework and Operational Guide For Implementation. Available online:
FHI [Accessed 4 September 2024]



methods for quantitative indicators, while non-numeric data can be assessed through thematic or content analysis.
on the nature of the programme and the rate of data collection. However, regularity is key
for identifying potential issues early and making timely adjustments.
• Monitoring financial performance involves examining the correlation between funds spent and progress made towards output and outcome targets. This analysis can highlight areas of cost-effectiveness and potential budget reallocations to maximise impact. Additionally, regular financial monitoring can help detect any variances in the budget, enabling swift corrective action. For instance, if we are overspending on a particular activity but under-performing against its targets, it might be worth investigating alternative strategies or resources for greater efficiency.
• Identify Deviations: Identifying any significant deviations from the planned targets is crucial for optimising programme performance. Deviations can occur in two forms: areas where the programme is outperforming expectations, which presents an opportunity to replicate successful strategies elsewhere, and areas where it is falling short. Investigating these deviations helps identify successful strategies to replicate, or areas where modifications are necessary.
• Discuss Findings with Community Members: Organise meetings with community
members to discuss the findings of the monitoring activities. Provide an overview of the
progress towards targets and highlight any areas of concern. Ensure the discussion is
accessible and understandable to all, avoiding technical jargon. Where possible, use visual aids like charts and graphs to present data; this can be effective in making data more accessible and understandable for all stakeholders.
• Interpret Results Together: Interpret the results together with community members. Their
insights and perspectives can provide valuable context to understand the data better. This
discussion can help uncover underlying reasons for observed trends or deviations, which
may not be apparent from the data alone.
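The overspending/under-performing pattern described above can be screened for by comparing budget burn against progress towards targets. The figures and the 20-point tolerance below are illustrative assumptions, not a prescribed threshold:

```python
# Screen for activities whose budget burn runs well ahead of their progress.
activities = [
    # (activity, spent USD, budgeted USD, achieved, target)
    ("Teacher training", 9_000, 10_000, 26, 40),   # 90% spent, 65% of target
    ("Reading clubs",    4_000, 10_000, 35, 40),   # 40% spent, 88% of target
]

flags = []
for name, spent, budget, achieved, target in activities:
    burn = spent / budget * 100
    progress = achieved / target * 100
    if burn - progress > 20:  # spending far ahead of performance
        flags.append(name)
        print(f"Investigate {name}: {burn:.0f}% spent vs {progress:.0f}% of target")
```

A screen like this does not replace judgment; it simply surfaces candidates for the deviation discussions held with community members.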

Following the collective interpretation of results, the next step is developing an action plan
to address any identified deviations or challenges.
• Define Actions: Identify specific actions that need to be taken to address the issues
raised. For instance, if a literacy programme requires more engaging teaching materials
for children, one action might be to collaborate with teachers and students to develop these
resources.
• Assign Responsibilities: For each action, assign a responsible party. This ensures
accountability for the implementation of the action. For example, a specific team within
ForAfrika could be responsible for coordinating the development of the teaching materials.
• Set Timelines: Establish a realistic timeline for the implementation of each action. This
helps maintain momentum and allows for tracking of progress. For instance, the goal might
be to have the new teaching materials ready for use within three months.



Feedback should be provided to all relevant stakeholders, including community members,
programme staff, and donors. This promotes transparency and allows everyone involved to
understand how the programme is progressing and what actions are being taken to address
challenges.

Chapter 4 Summary Table

ESTABLISHING MONITORING PLAN
• Identify Roles and Responsibilities: Determine who within the programme team will be responsible for data collection, storage, analysis, and reporting. (Responsible: Programme Manager, lead; Programme Staff, support. Tools: Programme document.)
• Schedule Data Collection: Decide on the timing and frequency of data collection based on the indicators and targets set in Chapter 3.2. (Responsible: Programme Manager, lead; DMEAL Officer, support. Tools: Programme Results Framework.)
• Develop Data Collection Protocols: Establish procedures for using the identified data collection tools and methods, including how to ensure data quality, confidentiality, and ethical considerations. (Responsible: DMEAL Officer, lead. Tools: Data Collection Tools.)
• Plan for Data Analysis: Determine how the collected data will be analysed to measure progress towards the programme outputs. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Data Analysis Plan.)
• Prepare for Reporting and Feedback: Establish procedures for how and when to report monitoring results to programme staff, community members, and other stakeholders. (Responsible: Programme Manager, lead; DMEAL Officer, lead. Tools: Reporting Template, Reporting Schedule.)

COLLECTING DATA
• Prepare for Data Collection: Ensure all individuals involved in data collection are well-versed with the established monitoring plan. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Monitoring Plan, Data Collection Tools.)
• Train Data Collectors: Once data collectors have been identified, they should be adequately trained. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Training Materials, Data Collection Tools.)
• Implement Data Collection Methods: Carry out the data collection activities as outlined in the monitoring plan. (Responsible: Community Development Officer, lead. Tools: Data Collection Tools, Monitoring Plan.)
• Document Collected Data: Maintain detailed and accurate records of the collected data. (Responsible: Community Development Officer, lead; DMEAL Officer, support. Tools: Data Collection Tools, KoboToolbox.)
• Review Collected Data: Review the collected data regularly to ensure its quality and completeness. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Data Collection Tools, KoboToolbox.)
• Store Data Securely: It’s crucial that all data collected is stored safely and securely to maintain confidentiality. (Responsible: DMEAL Officer, lead; GSO Programme Officer, support. Tools: KoboToolbox.)

MONITORING
• Assess Progress: Regularly gather the data collected and assess it against the pre-established targets. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Data Analysis Tools, Database.)
• Identify Deviations: Identify any significant deviations from the planned targets. (Responsible: DMEAL Officer, lead; Programme Manager, support. Tools: Database, Monitoring Plan.)
• Discuss Findings with Community Members: Organise meetings with community members to discuss the findings of the monitoring activities. (Responsible: Programme Manager, lead; Community Leaders, support. Tools: Communication products, Visual Presentation Tools.)
• Interpret Results Together: Interpret the results together with community members to better understand the data. (Responsible: Programme Manager, lead; DMEAL Officer, support; Programme Quality Manager GSO. Tools: Communication products, Visual Presentation Tools.)

DEVELOP ACTION PLANS
• Define Actions: Identify specific actions that need to be taken to address the issues raised. (Responsible: Programme Manager, lead; Programme Team, support. Tools: Action Plan Template, Meeting Minutes.)
• Assign Responsibilities: For each action, assign a responsible party. (Responsible: Programme Manager, lead; Programme Team, support. Tools: Action Plan Template.)
• Set Timelines: Establish a realistic timeline for the implementation of each action. (Responsible: Programme Manager, lead; Programme Team, support. Tools: Action Plan Template, Project Management Tools.)

REPORTING
• Provide Feedback: Provide feedback to all relevant stakeholders, including community members, programme staff, and donors. (Responsible: Programme Manager, lead; Reporting Staff, support. Tools: Reporting Template, Communication Tools.)


5. Participatory evaluation
Building on the community engagement, programme design, and monitoring strategies
developed in previous chapters, we now delve into the process of participatory evaluation.
This approach to evaluation prioritises the perspectives of community members, emphasising
their participation as key stakeholders in assessing the effectiveness, impact, and
sustainability of the programme.
A comprehensive assessment of the programme will inform future design and implementation, encouraging continuous learning and adaptation. This chapter will guide you through the process of creating an evaluation plan, conducting evaluations with community participation, and utilising participatory evaluation methods. It will also explore the analysis of evaluation findings, developing recommendations, and sharing the results back with the community.
Crucially, this chapter continues to thread through the importance of integrating a financial perspective in the evaluation process. Evaluating both programme results and budget performance is essential for understanding the programme’s cost-effectiveness, ensuring responsible stewardship of resources, and driving continuous improvement.
In line with ForAfrika’s commitment to participatory development, this chapter underscores the
importance of community involvement at all stages of evaluation, empowering community
members to take active roles in shaping and assessing the initiatives that impact their lives.

5.1 Establish an evaluation plan


Establishing a robust evaluation plan is a multifaceted process that calls for meticulous planning, engagement with stakeholders, and pragmatic resource allocation. It’s a key component in fostering accountability, enhancing learning, and enabling continuous improvement of programme delivery. Moreover, a well-structured evaluation plan guides decision-making, optimises resource use, and assesses the programme’s cost-effectiveness, aligning with ForAfrika’s strategic priorities.

Decide on evaluating a programme from the onset, when designing it! As a rule, programmes with budgets over 1 million USD should be evaluated. For programmes with smaller budgets focused on well-tested interventions, invest in a robust monitoring process. Think about real-time and rapid evaluations for large-scale emergency programmes.

Programme Prioritisation for Evaluation

Given resource constraints, it’s impractical to evaluate every programme. Thus, prioritising programmes for evaluation is a critical initial step.
• Assess the Programme’s Budget: Larger budget programmes generally warrant greater
scrutiny due to the substantial resources invested. A higher budget allocation might
indicate the programme’s strategic importance to ForAfrika, thus necessitating a thorough
evaluation.
• Degree of Innovation: Programmes introducing novel approaches or interventions merit
evaluation to gauge their effectiveness and potential scalability. Evaluating them can

ForAfrika | DMEAL Framework 41


generate invaluable learning opportunities, informing future programme design and
implementation.
• Risk Level of the Intervention: Programmes operating in complex or volatile contexts, or
those with potential high-risk outcomes, may require more frequent evaluations to mitigate
risks and adapt strategies. Assessing the risk level can be based on several factors such
as political instability, environmental hazards, or cultural sensitivity.
The process of prioritisation should involve the input of a broad spectrum of stakeholders,
including programme managers, field staff, community members, and funders. It should be
documented and transparent to ensure accountability.
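The prioritisation criteria above (budget, degree of innovation, risk level) can be sketched as a simple screening rule. This is an illustrative sketch only: the 1 million USD threshold comes from the framework, but the 1–5 rating scales and the cut-off of 4 are hypothetical assumptions, not a prescribed ForAfrika formula.

```python
# Illustrative sketch: the budget threshold is from the framework's rule of
# thumb; the 1-5 rating scales and cut-offs are hypothetical assumptions.

def prioritise_for_evaluation(budget_usd, innovation_score, risk_score):
    """Return True if a programme should be shortlisted for evaluation.

    innovation_score and risk_score are stakeholder ratings on a 1-5 scale.
    """
    # Rule of thumb: programmes with budgets over 1 million USD
    # should be evaluated.
    if budget_usd > 1_000_000:
        return True
    # Assumed weighting: highly innovative or high-risk programmes also
    # warrant evaluation even with smaller budgets.
    return innovation_score >= 4 or risk_score >= 4

programmes = [
    ("Water & Sanitation", 1_500_000, 2, 3),
    ("Pilot Agri-Tech",      300_000, 5, 2),
    ("School Feeding",       400_000, 1, 1),
]
shortlist = [name for name, b, i, r in programmes
             if prioritise_for_evaluation(b, i, r)]
print(shortlist)  # ['Water & Sanitation', 'Pilot Agri-Tech']
```

In practice the scores themselves should come from the broad stakeholder consultation described above, and the final shortlist should be documented alongside the rationale for each decision.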

Establishing the Evaluation Plan


Once the programmes for evaluation have been identified, the next step is to develop a
comprehensive evaluation plan.
• Objective Setting: Begin with establishing clear and specific objectives for the evaluation.
Are we evaluating for accountability, learning, or programme improvement? These
objectives should align with ForAfrika’s Theory of Change and the programme’s intended
outcomes.
• Developing Evaluation Questions: Evaluation questions are closely tied to the
objectives, helping to operationalise them by focusing on specific aspects of the
programme to be examined. For example, if an objective is to understand the effectiveness
of a water sanitation programme, an evaluation question could be: “To what extent has the
programme contributed to a reduction in waterborne diseases in the targeted
communities?”
• Ensure Clarity and Relevance: The questions should be clear, concise, and relevant to
the programme’s context. They should be formulated in a way that the answers will provide
valuable information for decision-making, learning, or accountability. The questions should
examine the extent to which results can be attributed to the intervention and its associated
activities, without being biased towards confirming this attribution.
• Cover Key Evaluation Areas: Ideally, the questions should cover key evaluation areas
such as relevance, effectiveness, efficiency, impact, and sustainability. For instance,
questions can range from exploring how well the programme activities are tailored to the
community’s needs (relevance), to what degree the programme is achieving its intended
outcomes (effectiveness), and whether the programme’s results are likely to continue after
the programme ends (sustainability).
• Stakeholder Engagement: Community members, field staff, programme participants, and
other stakeholders should be actively engaged in planning the evaluation. This
participatory approach ensures that the evaluation is contextually relevant, culturally
appropriate, and likely to yield useful information for multiple stakeholders.
To operationalise community engagement, create an Evaluation Reference Group (ERG)
at the planning phase and use it as a consultative body at every step of designing,
conducting and validating evaluation processes and products.



• Evaluation Design: Define the evaluation design, aligning it with the evaluation
objectives. Are we conducting a formative (process) evaluation, summative (outcome or
impact) evaluation, or a developmental evaluation? Will it be a participatory evaluation, a
utilisation-focused evaluation, or perhaps a gender-responsive evaluation?
• Budgeting: A detailed evaluation budget should be developed, ensuring that financial
resources are available for all planned evaluation activities. Be mindful of budgeting for
aspects often overlooked, such as data management and dissemination of evaluation
findings. Integrate the evaluation budget within the overall programme budget. (See
Annex E)
• Timeline and Responsibilities: Define a clear timeline for the evaluation activities,
including data collection, analysis, and reporting. Outline who is responsible for each
activity and ensure that all involved have the necessary skills and resources to fulfil their
roles effectively.
• Data Collection and Analysis: Specify the methods and tools that will be used for data
collection and analysis. Consider using a mix of quantitative and qualitative methods to
capture a comprehensive view of the programme’s performance and impact.
• Reporting and Dissemination: Develop a plan for how the evaluation findings will be
reported and disseminated among stakeholders. This should consider the varying
information needs of different stakeholders, including community members, programme
staff, and donors.
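As an illustration, the plan components listed above could be captured in a single structured record so that nothing is overlooked before approval. The field names and the validation checks below are illustrative assumptions, not a ForAfrika template.

```python
from dataclasses import dataclass, field

# Sketch of an evaluation plan record; field names are assumptions.

@dataclass
class EvaluationPlan:
    programme: str
    objectives: list          # accountability, learning, or improvement
    questions: list           # relevance, effectiveness, efficiency, impact...
    design: str               # e.g. "formative", "summative", "developmental"
    stakeholders: list        # ERG members and other engaged groups
    budget_usd: float
    timeline: dict = field(default_factory=dict)          # activity -> deadline
    responsibilities: dict = field(default_factory=dict)  # activity -> owner
    dissemination: list = field(default_factory=list)     # audiences for findings

    def gaps(self):
        """Flag commonly overlooked components before the plan is approved."""
        missing = []
        if not self.questions:
            missing.append("evaluation questions")
        if not self.dissemination:
            missing.append("dissemination plan")
        if self.budget_usd <= 0:
            missing.append("evaluation budget")
        return missing

plan = EvaluationPlan(
    programme="Water & Sanitation",
    objectives=["learning"],
    questions=["To what extent has the programme reduced waterborne diseases?"],
    design="summative",
    stakeholders=["ERG", "field staff", "donor"],
    budget_usd=25_000,
)
print(plan.gaps())  # ['dissemination plan']
```

A check like `gaps()` mirrors the reminder above to budget for often-overlooked items such as data management and dissemination of findings.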

Types of Evaluations and Evaluation Approaches


Understanding the different types of evaluations and evaluation approaches can help to inform
the development of the evaluation plan and the choice of methods and tools to be used.
Different types of evaluations serve varied purposes and are chosen based on the stage of
the programme and the information required, such as:
• Formative Evaluation: These evaluations are conducted during the implementation of a
programme, providing immediate feedback to improve the programme’s effectiveness. It
focuses on the process of implementation – how services are delivered, if the intended
population is being reached, and any challenges encountered.
• Summative Evaluation: Conducted after the completion of a programme, summative
evaluation assesses the overall effectiveness and impact. It primarily focuses on outcomes
and impacts, tracking changes that have resulted from the programme, and gauging the
extent to which these changes can be attributed to the programme.
• Developmental Evaluation: Suited to innovative and emergent initiatives, where
outcomes are uncertain, and programmes need to adapt to a changing context. This type
of evaluation supports the process of innovation within a programme by providing real-
time feedback, generating learnings, and informing decision-making.
• Impact Evaluation: This type of evaluation focuses on the changes that can be directly
attributed to a programme. It assesses the overall long-term effects, both intended and
unintended, produced by the programme.



• Real-Time Evaluation: Real-Time Evaluations (RTE) are time-sensitive, rapid
evaluations that provide immediate feedback during the implementation phase of a
programme, informing decision-making and improving programme effectiveness in real
time.

The evaluation approach sets the tone of the evaluation and influences the methods used
for data collection and analysis. Below are some of the possible approaches which can be
used:
• Participatory Evaluation: This approach involves stakeholders, particularly those
affected by the programme, in the evaluation process. The aim is to enhance the credibility
and usefulness of the evaluation by ensuring it addresses the concerns and perspectives
of these stakeholders. Participatory evaluation can be empowering and can contribute to
capacity building and sustainability.
• Utilisation-Focused Evaluation: This approach focuses on ensuring the evaluation is
useful and is actually used by stakeholders. It involves identifying the intended users of
the evaluation and their information needs at the beginning of the evaluation process and
then involving these users throughout the process to increase the likelihood that they will
find the evaluation findings credible and relevant and will use them in decision-making.
• Empowerment Evaluation: This approach focuses on helping communities to assess
their own programmes to improve and empower themselves. It focuses on capacity
building and helping the community learn about its programme.
• Gender and Equity-Focused Evaluation: This approach focuses on whether a
programme is reducing inequalities and promoting fairness among genders. It focuses on
whether all genders have been able to access and benefit from the programme equally.
• Outcome Harvesting: This approach is useful in complex scenarios where the
relationship between cause and effect is not fully understood. It involves identifying what
has been changed (outcomes) and then working backward to determine whether and how
an intervention contributed to these changes.
• Theory-Based Evaluation: This approach involves a detailed examination and analysis
of why a programme works or doesn’t work. It includes an investigation of the underlying
theories of change and the assumptions that link the activities of the programme to its
expected outcomes.
• Blue Marble Evaluation: This relatively new approach to evaluation takes a global
perspective, appreciating the complexity, interconnections, and dynamism of today’s
world. Blue Marble Evaluators are guided by principles of interconnectedness, global
thinking, simultaneous attention to universality and uniqueness, and adoption of a worldly
gaze that embraces the global and the local.
The choice of evaluation type and approach should be guided by the purpose of the
evaluation, the questions it seeks to answer, and the needs and context of the stakeholders
involved. It’s important to note that the types and approaches are not mutually exclusive. An
evaluation might often combine elements of different types and approaches to best meet its
objectives.



5.2 Ensure community ownership and empowerment
Participation in evaluation should not be limited to inclusive data collection methods, but
sought throughout the evaluation process to underpin community ownership and
empowerment. Hence, we seek to involve the community, not only as respondents, but also
as integral contributors, making the process more relevant and reflective of the community’s
perspectives and insights.
1. Form an Evaluation Reference Group (ERG): Our main tool to formalise participation,
the ERG is a consultative body comprising community members, programme staff,
government representatives and donors (normally between 5 and 8 individuals). The ERG
serves as a representative group that collaboratively guides the evaluation process.
2. Build Capacity among ERG Members: Provide appropriate training to ERG members
to equip them with the skills necessary to participate effectively in the evaluation process.
Training or briefing processes may broadly cover areas such as data collection
techniques, ethical considerations, and data analysis methods – so that members can
provide quality inputs to the discussion.
3. Engage the ERG: Once formed, actively involve the ERG in defining the purpose and
process of the evaluation. Ensure clear communication to promote understanding and
buy-in.
4. Co-create Evaluation Design: Collaborate with the ERG in developing the evaluation
design. This includes defining evaluation questions, choosing appropriate methods, and
determining data collection and analysis procedures. This process ensures the evaluation
is contextually relevant and valuable to the community.
5. Analyse and Interpret Results Collaboratively: Once data is collected, involve the ERG
in analysing and interpreting the findings. This collaboration enables a richer
understanding of the results and enhances the credibility of the evaluation.
6. Develop Action Plans with ERG: Based on the evaluation findings, work together to
develop action plans for programme improvement. This collaborative decision-making
process reinforces community ownership and fosters a sense of empowerment.
7. Reflect and Learn: Finally, facilitate a reflection and learning process among ERG
members. Encourage open discussion on what worked well, challenges encountered, and
lessons learned that could be applied in future evaluations.
The integration of the ERG within participatory evaluation methods promotes transparency,
mutual learning, and shared ownership of the evaluation process and its outcomes. This
approach empowers the community, strengthens relationships with stakeholders, enhances
accountability, and bolsters the overall effectiveness of ForAfrika’s programmes.



5.3 Evaluation management
This subchapter outlines a step-by-step process for managing evaluations to provide
comprehensive insights into the programme’s effectiveness, impact, and sustainability.
• Identify Need for External Evaluators: Depending on the scope and complexity of the
evaluation, there might be a need to hire external evaluators. This step might be
necessary to ensure impartiality, credibility, or to supplement internal evaluation
capacities.
• Define Evaluators’ Scope of Work: Clearly articulate the evaluators’ roles and
responsibilities, including their tasks, deliverables, and timeline. This scope of work should
be co-created with the ERG to ensure that the community’s needs and interests are
considered. The Terms of Reference for the evaluation should clearly state this.
• Engage an Evaluation Team: When selecting external evaluators, look for a team with
expertise in the programme area, experience in participatory evaluation methods, and
cultural sensitivity. Ensure that the selection process is transparent.
• Inception Phase: Once the evaluation team is selected, ensure that they understand the
programme, its context, and its stakeholders, as well as the evaluation plan. Provide them
with all necessary documentation and introduce them to key personnel, including the
ERG. Working with the evaluation team and the ERG, develop a detailed schedule for the
evaluation process. This should include specific dates for data collection, analysis, report
drafting, review, and dissemination. Ensure this schedule is flexible enough to
accommodate unanticipated delays or changes.
The first product of the evaluation team is an Inception Report, which offers the last
chance to revise the evaluation plan, objectives and methods before kickstarting the data
collection.
• Support Data Collection: Facilitate the evaluators’ data collection activities. This might
involve coordinating interviews or focus groups, providing access to programme
documents, or supporting community-led data collection efforts.
• Ensure Quality and Timeliness of Deliverables: Regularly check in with the evaluation
team to ensure they’re on track to meet their deliverables and that the quality of their work
meets the standards set in their scope of work. Use Annex B – Evaluation Quality Checklist
for quality assessment of evaluation products at each stage of the process.
• Facilitate Feedback and Validation: As the evaluators produce drafts of their findings,
share these with the ERG for feedback and validation. This step ensures that the
community’s perspective is incorporated in the final evaluation report.
• Manage Evaluation Closure: Upon completion of the evaluation, ensure that all
contractual obligations are fulfilled, including the payment of the evaluators and the
delivery of all expected outputs.
Throughout the evaluation management process, communication and transparency are key.
Regular updates should be shared with the ERG and other stakeholders to keep them
informed and involved.

5.4 Evaluation Action Plan


Once an evaluation is complete, its recommendations provide valuable insights for improving
program effectiveness and sustainability. However, these insights can only drive change if
they’re put into action. An Evaluation Action Plan is an instrumental tool for bridging the gap
between evaluation findings and programmatic action.
Here’s a step-by-step guide to develop and manage an Evaluation Action Plan that effectively
addresses evaluation recommendations:
• Analyse and Prioritise Recommendations: The first step is to thoroughly review and
understand the evaluation recommendations. Not all recommendations will have the same
impact or feasibility. Therefore, it’s important to prioritise the recommendations based on
criteria such as urgency, potential for impact, and resource requirements.
• Develop Actionable Steps: For each prioritised recommendation, define specific,
actionable steps that will address the issue or leverage the opportunity identified. Ensure
that each step is clear and measurable, stating what needs to be done, by whom, and by
when.
• Assign Responsibilities: Clear roles and responsibilities are crucial to ensure effective
implementation of the action plan. Assign each action to a specific person or team and
make sure they have the resources and capacity necessary to fulfil their role.
• Set Timelines and Milestones: Develop a timeline for each action, setting out when it
should start and end. Establishing clear milestones will help track progress and ensure
that actions stay on course.
• Monitor Progress: Regularly monitor the implementation of the action plan to ensure that
actions are being carried out as planned. Use this opportunity to identify any challenges
or bottlenecks and adjust the plan as necessary.
• Communicate Progress: Keep all stakeholders informed about the progress of the action
plan. This transparency not only maintains stakeholder engagement, but also contributes
to accountability.



• Review and Learn: Once the action plan has been fully implemented, review the process
and outcomes. This reflective process can yield valuable learnings for future evaluations
and action plans.
By systematically addressing evaluation recommendations through a well-crafted action plan,
ForAfrika can harness the full power of evaluations to drive programme improvement and
innovation.
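The action-plan steps above amount to a small tracking table: each prioritised recommendation is broken into owned, dated actions, and progress is checked against due dates. The sketch below shows one possible shape; the field names, statuses, and sample data are invented for illustration and are not a ForAfrika template.

```python
from datetime import date

# Hypothetical Evaluation Action Plan rows; data and statuses are illustrative.
actions = [
    {"recommendation": "Strengthen hygiene training follow-up",
     "step": "Revise training curriculum",
     "owner": "Programme Manager",
     "due": date(2025, 9, 30),
     "status": "in progress"},
    {"recommendation": "Improve borehole maintenance",
     "step": "Train community maintenance committee",
     "owner": "Field Officer",
     "due": date(2025, 6, 15),
     "status": "open"},
]

def overdue(actions, today):
    """Return actions past their due date that are not yet done."""
    return [a for a in actions
            if a["status"] != "done" and a["due"] < today]

for a in overdue(actions, date(2025, 7, 1)):
    print(f'OVERDUE: {a["step"]} (owner: {a["owner"]})')
```

Running a check like `overdue()` at each monitoring point supports the "Monitor Progress" and "Communicate Progress" steps: bottlenecks surface early and can be reported transparently to stakeholders.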



5.5 Internal Evaluations and Reviews

Evaluations and reviews are essential for continuously monitoring and maintaining programme
quality while ensuring alignment with ForAfrika’s strategic objectives. Unlike external
evaluations, which provide an independent and comprehensive assessment of programme
effectiveness, impact, and sustainability, internal evaluations and reviews offer a valuable
complement, enabling more regular, nimble checks on programme performance.
Internal evaluations are typically lighter exercises, requiring fewer resources than full-fledged
external evaluations. Conducted by DMEAL Staff within ForAfrika, these assessments provide
an opportunity for continuous learning and adaptation throughout the programme cycle.

Types of Internal Evaluations and Reviews


• Process Reviews: These focus on the implementation of programme activities to
determine if they’re being carried out as planned. They can help identify inefficiencies,
bottlenecks or areas of improvement in the process.
• Monitoring Reviews: These provide regular check-ins on the programme’s progress
against its objectives and indicators, using the data collected through the monitoring
process.
• After-Action Reviews (AARs): AARs are debriefing sessions that follow significant
programme activities or milestones. They offer a structured opportunity to reflect on what
happened, why it happened, and how it can be done better next time.

Adapting Evaluation Instructions for Internal Reviews


The general evaluation instructions are highly adaptable and can be scaled down for internal
reviews. Here are some pointers:
• Simplify the Evaluation Plan: For internal evaluations, the plan doesn’t need to be as
extensive. Still, it’s essential to define the purpose, scope, key questions, methods, and
timelines.
• Leverage Existing Data: Maximise the use of monitoring data and other internal
information to inform your review. This reduces the need for extensive new data collection.
• Capitalise on Internal Knowledge: Use the intimate understanding of the programme
among staff members to your advantage. Their insights can provide valuable context and
interpretation of findings.
• Streamline Reporting: Reporting can be less formal than for external evaluations.
However, it’s important to ensure that findings and recommendations are clearly
communicated to those who need to know and can act on them.
• Promote Learning and Adaptation: Internal evaluations offer a safe space to identify and
learn from both successes and failures. Ensure that findings are used to inform programme
decisions and improve future performance.



By regularly conducting these internal assessments, ForAfrika can maintain a pulse on
programme performance, swiftly detect and rectify issues, and continually adapt to evolving
contexts and needs.

5.6 Impact Measurement

A particular type of evaluation conducted by ForAfrika is impact measurement. Guided by the
Impact Measurement Guidelines, it provides a standardised approach to measuring and
evaluating the outcomes and impact of ForAfrika’s programmes, with the ultimate goal of
improving programme effectiveness and achieving positive social, economic, and
environmental change in the communities it serves.
The scope of the impact measurement guideline is specifically focused on measuring self-
sufficiency as a high-level outcome of ForAfrika’s programmes. This includes defining
clear and measurable indicators of self-sufficiency, selecting appropriate data collection
methods, data analysis to assess the level of self-sufficiency, and reporting findings in a way
that is meaningful for organisational learning and accountability.

Self-sufficiency in the context of ForAfrika’s programmes refers to the level of autonomy and
resilience achieved by the people reached through our programmes in meeting their basic needs,
improving their livelihoods, and sustaining positive outcomes without continued external support.

Impact refers to the measurable positive or negative effects that result from the execution of
ForAfrika’s programmes. ForAfrika primarily envisages impact manifesting as:
• Changes in the lives and well-being of targeted individuals, communities, and
ecosystems;
• Improvements in social, economic, and environmental conditions;
• Sustainable positive outcomes that endure beyond the duration of the intervention.

The concept of self-sufficiency implies that individuals and communities are able to access
and manage resources, knowledge, and skills in a way that allows them to meet their
immediate and strategic needs, create sustainable livelihoods, and maintain their well-being
over the long term, without depending on external assistance. Self-sufficiency
encompasses not only economic factors, but also social, environmental, and cultural aspects
that enable individuals and communities to lead dignified, empowered, and sustainable lives,
taking into account local contexts, including cultural norms and behavioural changes.
Our annual impact measurement involves systematically evaluating the extent to which a
programme or intervention has achieved its intended outcomes and contributed to positive
changes in the lives of the target population.



The chosen conceptual model used to guide the measurement of self-sufficiency in ForAfrika’s
programmes is the Sustainable Livelihoods Framework (SLF)7. The SLF is a widely used
framework that provides a holistic and comprehensive approach to understanding and
assessing the various dimensions of individuals’ and communities’ livelihoods, and how they
interact with each other in achieving sustainable livelihood outcomes.
The Sustainable Livelihoods Framework typically consists of five interrelated components:
assets, livelihood strategies, institutions and governance, vulnerability and risks, and livelihood
outcomes. ForAfrika seeks to improve livelihoods by strengthening the range of assets
(resources), both tangible and intangible, that individuals and communities possess or have
access to, enabling them to fully exert their capabilities and achieve self-sufficiency.
In order to quantify changes in self-sufficiency, the organisation has developed a
Sustainable Livelihoods Index for assessing the resilience of households by examining their
access to different livelihood assets. The index is based on the SLF, which identifies five key
assets or capitals that are necessary for achieving sustainable livelihoods. These are: natural
capital, physical capital, human capital, financial capital, and social capital.
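Since the text does not specify how the five capitals are aggregated, the sketch below shows one common way such an index can be computed: each capital is scored on a normalised 0–1 scale and combined as a weighted average. The equal weights and the household scores are assumptions for illustration only, not ForAfrika's actual formula.

```python
# Minimal sketch of a Sustainable Livelihoods Index computation.
# The five capitals come from the SLF; the 0-1 normalisation and equal
# weighting are illustrative assumptions.

CAPITALS = ["natural", "physical", "human", "financial", "social"]

def livelihoods_index(scores, weights=None):
    """Aggregate per-capital scores (each normalised to 0-1) into one index."""
    if weights is None:
        weights = {c: 1 / len(CAPITALS) for c in CAPITALS}  # equal weights
    return sum(scores[c] * weights[c] for c in CAPITALS)

household = {"natural": 0.6, "physical": 0.4, "human": 0.7,
             "financial": 0.3, "social": 0.8}
print(round(livelihoods_index(household), 2))  # 0.56
```

Comparing a household's index between baseline and endline, or across communities, is one way such an index can track movement towards self-sufficiency; low scores on a single capital (here, financial) also point to where support is most needed.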
The guideline approach provides further investigation of ForAfrika’s contributions to social
transformation by incorporating qualitative data collection and triangulation using the
Qualitative Impact Protocol (QuIP) developed by Bath Social & Development Research (Bath
SDR). QuIP is grounded in the belief that participants’ voices and experiences are essential
to understanding the true impact of development programmes. By incorporating qualitative
data, it offers a more comprehensive and in-depth understanding of how interventions affect
people’s lives, considering not only measurable outcomes but also the perceptions, feelings,
and experiences of those involved – complementing the quantitative data collected through
the Sustainable Livelihoods Index (SLI).

Data collected through ‘blindfolded’8 interviews is then analysed using a thematic analysis
approach, which involves identifying patterns and themes in the data that relate to the
monitoring questions and objectives. This process allows for the identification of
commonalities and differences in experiences, as well as the exploration of causal
mechanisms and the influence of contextual factors on programme outcomes. By using QuIP,
the impact measurement process is enriched and provides a more comprehensive
understanding of the effects of ForAfrika’s programmes on the target communities.
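The thematic analysis step described above can be pictured as coding each blindfolded interview against outcome and driver themes, then tallying how often respondents attribute a change to a given driver. The codes and data below are invented for illustration and do not represent the QuIP codebook.

```python
from collections import Counter

# Illustrative coded interviews; themes and attributions are invented.
coded_interviews = [
    {"outcome": "improved income", "driver": "vegetable gardening training"},
    {"outcome": "improved income", "driver": "vegetable gardening training"},
    {"outcome": "improved income", "driver": "remittances"},
    {"outcome": "better health",   "driver": "new water point"},
]

def theme_counts(interviews):
    """Count (outcome, driver) pairs to surface common causal claims."""
    return Counter((i["outcome"], i["driver"]) for i in interviews)

for (outcome, driver), n in theme_counts(coded_interviews).most_common():
    print(f"{n}x: {outcome} <- {driver}")
```

A tally like this highlights both programme-attributed drivers and external ones (such as remittances), which is precisely the kind of contextual influence the QuIP approach is designed to surface.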

7 Scoones, I. (1998) Sustainable Rural Livelihoods: A Framework for Analysis, IDS Working
Paper 72, Brighton: IDS.
8 The blindfolding process in QuIP (Qualitative Impact Protocol) interviews is a key component
of the methodology that enhances the credibility and trustworthiness of the study findings.
Blindfolding refers to withholding specific information about the intervention, programme, or
project being evaluated from the field monitoring officers who conduct the interviews. The
purpose of the blindfolding process is to minimise biases that may influence the way
interviewers ask questions, probe for details, or interpret the responses they receive from
participants.



Chapter 5 Summary Table

EVALUATION PLAN

• Programme Prioritisation for Evaluation
  Description: Prioritise programmes for evaluation based on budget, degree of innovation,
  and risk level.
  Responsible: Country Director (Lead); DMEAL Officer, programme staff, community
  members, and funders (Support)
  Tools: Budget Reports, Programme documents, Risk Assessment Tools

• Establishing the Evaluation Plan
  Description: Develop a comprehensive evaluation plan with clear objectives, evaluation
  questions, stakeholder engagement, evaluation design, budgeting, timeline, data collection
  and analysis plan, and reporting and dissemination plan.
  Responsible: DMEAL Officer (Lead); Programme Manager (Support)
  Tools: Evaluation Plan Template

COMMUNITY OWNERSHIP AND EMPOWERMENT

• Form and Engage an Evaluation Reference Group (ERG)
  Description: Form an ERG and actively involve them in defining the purpose and process
  of the evaluation.
  Responsible: DMEAL Officer (Lead); Programme Manager (Support)
  Tools: Consultative Meeting Space

• Co-create Evaluation Design
  Description: Collaborate with the ERG in developing the evaluation design, including
  defining evaluation questions, choosing appropriate methods, and determining data
  collection and analysis procedures.
  Responsible: DMEAL Officer (Lead); ERG Members (Support)
  Tools: Evaluation Plan

• Build Capacity Among ERG Members
  Description: Provide appropriate training to ERG members to equip them with the skills
  necessary for effective participation in the evaluation process.
  Responsible: DMEAL Officer (Lead); Training Facilitator (Support)
  Tools: Training Materials, Training Space

EVALUATION MANAGEMENT

• Identify Need for External Evaluators
  Description: Identify if there is a need to hire external evaluators based on the scope and
  complexity of the evaluation.
  Responsible: DMEAL Officer (Lead); DMEAL Director (Support)
  Tools: Evaluation Plan

• Define Evaluators’ Scope of Work
  Description: Articulate the evaluators’ roles and responsibilities, tasks, deliverables, and
  timeline.
  Responsible: DMEAL Officer (Lead); DMEAL Director (Support)

• Engage an Evaluation Team
  Description: Select an evaluation team with relevant expertise.
  Responsible: DMEAL Officer (Lead); DMEAL Director (Support)
  Tools: Evaluation ToR

• Inception Phase
  Description: Ensure that the evaluation team understands the programme and the
  evaluation plan.
  Responsible: DMEAL Officer (Lead); Programme Manager (Support)
  Tools: Programme Documentation, Evaluation Plan

• Support Data Collection
  Description: Facilitate the evaluators’ data collection activities.
  Responsible: DMEAL Officer (Lead); Programme Staff (Support)
  Tools: Data Collection Tools

• Ensure Quality and Timeliness of Deliverables
  Description: Regularly check in with the evaluation team to ensure they’re on track to
  meet their deliverables and that the quality of their work meets the standards.
  Responsible: DMEAL Officer (Lead)
  Tools: Evaluation Quality Checklist

• Facilitate Feedback and Validation
  Description: Share drafts of the evaluation findings with the ERG for feedback and
  validation.
  Responsible: DMEAL Officer (Lead); ERG (Support)
  Tools: Draft Evaluation Reports

• Manage Evaluation Closure
  Description: Ensure all contractual obligations are fulfilled upon completion of the
  evaluation.
  Responsible: DMEAL Officer (Lead); DMEAL Director (Support)
  Tools: Contractual Agreement, Final Evaluation Report

EVALUATION ACTION PLAN

• Analyse and Prioritise Recommendations
  Description: Review and prioritise the evaluation recommendations based on criteria
  such as urgency, potential for impact, and resource requirements.
  Responsible: Programme Manager (Lead); ERG (Support)
  Tools: Evaluation Reports, Prioritisation Matrix

• Develop Actionable Steps and Assign Responsibilities
  Description: Define specific, actionable steps to address each prioritised
  recommendation and assign each action to a specific person or team.
  Responsible: Programme Manager (Lead); DMEAL Officer (Support)
  Tools: Action Plan Template

• Set Timelines and Milestones
  Description: Develop a timeline for each action and establish clear milestones.
  Responsible: Programme Manager (Lead)
  Tools: Action Plan Template

• Monitor and Communicate Progress
  Description: Regularly monitor the implementation of the action plan and keep all
  stakeholders informed about the progress.
  Responsible: Programme Manager (Lead); DMEAL Officer (Support)
  Tools: Action Plan

• Review and Learn
  Description: Review the process and outcomes once the action plan has been fully
  implemented.
  Responsible: Programme Manager (Lead); DMEAL Officer (Support)
  Tools: Action Plan


6. Community-driven accountability
ForAfrika is committed to improving accountability in its programming to participants and the
communities in which we work. It requires open channels for communities to communicate
their perceptions about our programmes, and constant feedback to them with solutions to
issues and information on results achieved by our interventions.

Accountability is not only a process of responding to complaints, but also ensuring we clearly
inform our stakeholders what and how we will work, and which results we achieved through
our efforts.

To live up to our commitment to ‘do no harm’ to the communities we operate in, our
accountability mechanisms include:
• Transparency in sharing programme results,
• Participatory and inclusive decision-making to ensure community ownership, and
• Establishment of feedback channels which enable learning and adaptation of
  programmes.

FEEDBACK: Feedback includes both the information that the organisation provides to the
community about the programmes (reporting on results, sharing updates) and the views or
opinions expressed by community members about the programmes (suggestions,
satisfaction, perceptions). Feedback can be both positive (appreciation, satisfaction,
constructive suggestions) and negative (concerns, dissatisfaction, criticism).

COMPLAINTS: Complaints are a subset of feedback. They are expressions of dissatisfaction
about ForAfrika’s programmes, commitments, or conduct. Complaints are generally raised
when stakeholders feel that something has gone wrong or when they want to see a change
occur. They often relate to more serious issues and always require a response or action from
the organisation.



6.1 Establish community-based feedback channels
In this section, we will focus on how to create structures that allow communities to express
their views about our programmes and how to report programme results to these communities.
Our objective is to reinforce a two-way accountability pathway, ensuring both the receipt and
provision of relevant information.
Proactive Feedback Channels are used when ForAfrika carries out random calls to registered participants for verification purposes and to collect their feedback on the quality, quantity and appropriateness of programme service delivery. This means that we choose the participants and stakeholders we want to ask, control the questions being asked, and set the timing of when the information is collected.
Reactive Feedback Channels are mechanisms that ForAfrika provides to participants in its
programmes and other stakeholders to communicate with us, at the time of their choosing.
This includes, for example, suggestion boxes, hotlines, email addresses, and office walk-ins.
Feedback, either positive or negative, is invaluable for improving our programmes. It often
provides insights into minor and major issues, which may have gone unnoticed otherwise.
Here’s how you can establish feedback channels:
• Understand who can give feedback: All stakeholders, including participants, non-
participants, partners, local leaders, NGOs, CBOs, Government, Donors, and staff, are
encouraged to provide feedback and voice their complaints.
• Define what feedback can be about: Feedback should focus on our programmes, our commitments, and our conduct. We must make clear that feedback can lead to changes and improvements.
• Provide accessible channels for feedback: Stakeholders should be able to provide
feedback in writing, verbally, or by phone. Consider utilising suggestion boxes at
community meetings, designated staff members, local leadership, government offices, or
other stakeholders such as partner NGOs.
• Consider the local context: Evaluate the feasibility of different feedback channels based
on the local context, including participants’ access to technology, project length, budget,
and human resources.
• Ensure inclusivity: The chosen feedback channels should be accessible to all
stakeholder groups, particularly the most vulnerable and marginalised.
• Employ a mix of quantitative and qualitative feedback channels: Using a variety of
channels will provide a comprehensive understanding of the community’s views. For
instance, surveys can provide quantitative data, whereas focus group discussions can
provide qualitative insights.



Suggested proactive feedback channels
Surveys: Surveys can be conducted in person, over the phone, or online, allowing for
structured feedback from a large number of participants. They are a great way to gather
quantitative data and are especially useful for collecting demographic information or
measuring levels of satisfaction.
Focus Group Discussions (FGDs): FGDs involve a small group of people whose opinions
or perceptions are studied in-depth. They allow for qualitative data collection, capturing
nuanced insights and detailed information that may not emerge through surveys.
Individual Interviews: Face-to-face or telephone interviews provide an opportunity for in-
depth exploration of individual experiences, opinions, and feedback. They enable a personal,
direct interaction, which can lead to rich qualitative data.
Community Meetings: These are forums where larger groups of stakeholders can gather,
discuss, and share their views about the programs. These meetings foster open dialogue and
can also be used to disseminate information about the programs.
Local Radio: In areas where literacy rates are low or technological reach is limited, local radio
can be a powerful tool for both sharing information and soliciting feedback.
Stakeholders Group Meetings: Meetings with specific stakeholder groups (like local leaders,
NGOs, CBOs) provide focused insights from different perspectives and allow direct
engagement with influential community members.

Suggested reactive feedback channels


Suggestion Boxes: These are physical boxes placed in community spaces where
stakeholders can drop their written feedback anonymously. They are particularly useful for
receiving sensitive feedback or complaints.
Toll-free Hotlines: They provide a direct and immediate way for stakeholders to voice their
complaints or feedback, which is particularly useful for urgent matters or sensitive complaints.
Office Walk-In Hours: Specified office hours when stakeholders can directly visit and share
their feedback or complaints. This allows for face-to-face interaction and immediate response.
SMS Lines: In areas with good mobile network coverage, stakeholders can text their feedback
or complaints. This is a quick and convenient feedback channel.
Social Media: Platforms like Facebook, Twitter, and Instagram can be used for sharing
program updates and receiving feedback. They can reach a wide audience quickly and provide
an interactive way to engage stakeholders.

If possible, ask participants about their preferred feedback channels. These questions could be integrated into a survey conducted at the design or start-up phase of the project. Throughout implementation, regularly verify that the available channels are providing useful and actionable feedback data from the participants.
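For teams that manage their Feedback Registry in a spreadsheet or database, the sketch below illustrates the kind of fields an entry might capture, including whether the channel was proactive or reactive and whether the entry is sensitive. This is an illustrative example only; the field names are assumptions for this sketch, not a prescribed ForAfrika schema.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative shape for a Feedback Registry entry; all field names are
# assumptions for this sketch, not an official ForAfrika data model.
@dataclass
class FeedbackEntry:
    received_on: date
    channel: str              # e.g. "suggestion box", "hotline", "survey"
    channel_kind: str         # "proactive" or "reactive"
    summary: str
    is_sensitive: bool = False   # sensitive entries need restricted handling
    status: str = "open"         # "open" -> "responded" -> "closed"

# Example entry from a reactive channel:
entry = FeedbackEntry(
    received_on=date(2025, 3, 14),
    channel="hotline",
    channel_kind="reactive",
    summary="Question about eligibility criteria for a distribution",
)
assert entry.status == "open"
```

A structure like this makes it straightforward to check, during implementation, which channels are actually producing actionable entries.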



6.2 Handling and addressing complaints and grievances
ForAfrika acknowledges the importance of addressing concerns and grievances in a timely,
sensitive, and effective manner. This process not only pertains to complaints related directly
to ForAfrika’s programmes, commitments, and conduct, but also includes measures for
dealing with issues related to safety and protection of participants and participant
communities.
Stakeholders’ perspectives and feedback collected through proactive channels are easier to handle, as they are actively sought by the project team and answer questions the team itself has designed. The results of stakeholder surveys, focus group discussions, and community meetings should be compiled and discussed within the team to inform programmatic decisions and actions.

ForAfrika takes a survivor-centred approach to safeguarding misconduct and to all mandatory reporting obligations. This approach seeks to foster a supportive environment in which survivors' rights are respected and they are treated with dignity. It also ensures that ForAfrika considers power imbalances, the importance of consent, and developmental stages when following reporting procedures.

The ForAfrika Safeguarding Policy guides operations to implement appropriate policies, procedures and practices that prevent harm to participants. This includes protection from sexual exploitation, abuse and harassment, and protection of vulnerable children, adults, people with disabilities, and diverse genders.

The Safeguarding Policy applies to all involved in providing and receiving humanitarian services
throughout assistance delivery.

A clear referral pathway defining response protocols for the various requests, feedback, and complaints from stakeholders and participants is a prerequisite. It enables appropriate and timely responses to stakeholders.

To create systems to address complaints effectively, we recommend:


• Identify types of complaints: Separate general complaints from sensitive ones that
require the protection of the complainant due to fear of reprisal or embarrassment.
• Define complaint criteria: Develop a list of circumstances under which complaints will be
addressed. This should include situations such as unmet commitments, improper
behaviour, and misuse of authority by staff or partners.
• Provide safe channels for reporting complaints: Establish both internal and external
channels to handle complaints. Make sure stakeholders know that they can approach local
committees, field officers, district team leaders, relief leaders, and the country manager for
assistance.
• Safe Referral: ForAfrika is committed to ensuring the safety and dignity of all
complainants. In cases where a complaint relates to the safety and protection of
participants, ForAfrika will ensure the identity of the complainant is protected. The nature
of the grievance will determine the appropriate channel to which it is referred. Feedback



on how the complaint was resolved will be provided to the complainant, ensuring
transparency in the process.
• Complaint Handling Process: Each Provincial/State Manager will maintain a
comprehensive record of complaints, detailing how and when they were addressed. This
process allows for continual monitoring and evaluation, enhancing the accountability and
transparency of ForAfrika’s operations.
• Channels for Providing Feedback and Raising Complaints: Stakeholders can provide
feedback or raise complaints through various channels, as explained above.
• Process for Raising Complaints: Communities and participants are first encouraged to
raise their concerns or complaints at their local help desk or other locally managed
channel. However, if the issue is sensitive or not adequately addressed, they can escalate
it through the following internal channels: Field Monitors or Officers; Provincial Manager;
Country Director
Please note that referral pathways are defined at country or location level for all reactive client feedback channels in that area, following the guidance above. The following table provides an example of what a simple referral process could look like.

TYPE OF CATEGORY: Request for information
PRIORITY LEVEL: Low or medium
EXPLANATION: An information request about the type of aid and services available at ForAfrika, the location or timing of a specific service, questions about targeting or registration criteria, eligibility criteria, etc.
RECORD: Yes – record in the Feedback Registry.
RESPOND: Before the next programme interaction.
REFER: If you don’t know the answer, refer to the protection actor responsible for information services.

TYPE OF CATEGORY: Request for assistance
PRIORITY LEVEL: Medium or high
EXPLANATION: Request to be included in one of ForAfrika’s aid or service provision programmes.
RECORD: Yes – record in the Feedback Registry.
RESPOND: Before the next programme interaction.
REFER: If you don’t know the answer, refer to the Protection team or relevant programme staff members, and/or discuss during weekly meetings if the request is not sensitive/confidential.

TYPE OF CATEGORY: Programmatic complaint – minor dissatisfaction
PRIORITY LEVEL: Medium
EXPLANATION: Complaint about a missing or delayed aid entitlement, the timing or location of ForAfrika services, or the timelines of ForAfrika’s staff and partners.
RECORD: Yes – record in the Feedback Registry.
RESPOND: Before the next programme interaction.
REFER: If you don’t know how to respond, refer to the relevant programme staff member or discuss during the weekly staff meeting.

TYPE OF CATEGORY: Programmatic complaint – major dissatisfaction
PRIORITY LEVEL: High
EXPLANATION: Complaint about staff attitude, lack of access to an aid distribution or service, lack of access to information, exclusion of a minority/vulnerable group, extortion of aid by a third party (with no ForAfrika involvement), refusal by ForAfrika staff to listen to or acknowledge a complaint, or inability of a client to reach the feedback hotline or any other mechanism.
RECORD: Yes – record in the Feedback Registry.
RESPOND: Provide the status of resolution before the next programme interaction; conclude resolution within x months.
REFER: Refer to the supervisor or programme manager, HR, or the field coordinator for interpretation and action.

TYPE OF CATEGORY: Breach of the ForAfrika safeguarding policy, procedure, or the law (see the ForAfrika Safeguarding Policy)
PRIORITY LEVEL: Critical
EXPLANATION: Allegations of the following involving ForAfrika staff and partners: exploitation and abuse (sexual, economic, other); gender-based violence (GBV); bribery, corruption and kickbacks; child protection concerns; fraud; (sexual) harassment or discrimination; misappropriation or misuse of ForAfrika resources/assets; physical safety risks; procurement fraud; retaliation; threats.
RECORD: Do not record any personal information or details in the Feedback Registry, but take detailed notes for referring or reporting the matter.
RESPOND: Do not try to investigate or handle the complaint. Refer the person to the appropriate support services or law enforcement agencies (depending on the content of the allegations). Report immediately to the Country Management team and HR.
REFER: Refer using the Safeguarding guidelines.

TYPE OF CATEGORY: General feedback/other
PRIORITY LEVEL: Low
EXPLANATION: Opinions, comments, ideas, suggestions, expectations and spam.
RECORD: Yes – record in the Feedback Registry.
RESPOND: Acknowledge and respond if need be.
REFER: Refer (the person or the feedback) to other relevant organisations if the feedback was meant for them or if the request falls within their mandate.

External channels like police and community leaders can also be approached; however, this
does not guarantee that ForAfrika will receive and be able to address the complaint. The
decision to take the complaint outside the organisation should be based on the nature and
severity of the issue.
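For country teams that maintain their referral pathway digitally, the routing rules in the example table above can be expressed as a simple lookup. The sketch below mirrors the example table’s categories and actions; the function, keys, and category labels are illustrative assumptions, not part of any ForAfrika system.

```python
# Illustrative triage sketch mirroring the example referral table above.
# Category labels and actions follow the table; everything else
# (function names, data shapes) is hypothetical.
RULES = {
    "request_for_information": {
        "priority": "low/medium",
        "record_in_registry": True,
        "respond_by": "before the next programme interaction",
        "refer_to": "protection actor responsible for information services",
    },
    "programmatic_complaint_minor": {
        "priority": "medium",
        "record_in_registry": True,
        "respond_by": "before the next programme interaction",
        "refer_to": "relevant programme staff member or weekly staff meeting",
    },
    "programmatic_complaint_major": {
        "priority": "high",
        "record_in_registry": True,
        "respond_by": "status before next interaction; conclude within agreed period",
        "refer_to": "supervisor, programme manager, HR or field coordinator",
    },
    "safeguarding_breach": {
        "priority": "critical",
        # Critical cases must NOT be recorded with personal details:
        "record_in_registry": False,
        "respond_by": "do not investigate; report immediately",
        "refer_to": "Safeguarding guidelines / Country Management team, HR",
    },
}

def triage(category: str) -> dict:
    """Return the handling rule for a feedback category.

    Unknown categories default to general feedback: record and acknowledge.
    """
    return RULES.get(category, {
        "priority": "low",
        "record_in_registry": True,
        "respond_by": "acknowledge and respond if need be",
        "refer_to": "other relevant organisations if within their mandate",
    })

rule = triage("safeguarding_breach")
assert rule["priority"] == "critical"
assert rule["record_in_registry"] is False
```

Encoding the pathway this way keeps the safeguarding exception (no personal details in the registry) explicit rather than relying on staff memory.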



6.3 Interpreting and responding to stakeholder feedback
After stakeholder feedback has been communicated to the relevant staff member, the information needs to be interpreted to inform decisions about how to respond. The following guidelines can help you interpret feedback and decide on a response.
• Identify major trends and outliers: Look at the feedback you have received over the past
week, month, or quarter. Try to identify the major trends. What are people most concerned
about? What are they most confused about? What do they want to see happen? Then
identify the outliers: pieces of feedback that seem to disagree with what the broader group
is saying, or cases where certain stakeholders are talking about topics that no one else is
bringing up.
• Try to understand what is behind the trends: Based on what you know about your
stakeholders, try to explain the trends you see in the data. Which group of stakeholders
may be concerned about an issue, and what experiences may have informed their
perspective? What might be unique about the outliers that would lead them to have
different concerns or opinions from other stakeholders?
• Triangulate: Where possible, bring in other sources of information to help you make sense
of what you are seeing in the feedback data. Are there other reports, records, or resources
that might help you understand the trends and what is behind them? Can other sources of
information further describe and validate the concerns and priorities you are hearing
through stakeholder feedback?
• Weigh your options: Try to outline the possible courses of action that could be taken to
address the various concerns and ideas raised by our stakeholders. Evaluate how feasible
it would be to implement those actions based on your programme’s resources (technical,
human, financial) and mandate. Try to establish whether you can meet the expressed
expectations of all the stakeholders from whom you have heard feedback, or whether there
need to be trade-offs. If you decide there will need to be trade-offs (that is, you cannot
meet everyone’s requests) you will need to consider your operating environment, and
ForAfrika’s strategic objectives to prioritise.
• Decide a course of action: Based on your analysis of the trends, the options you’ve
generated, and the opportunities and constraints you’ve identified, your team needs to
decide. This decision could be made in consultation with a group of stakeholders (such as
a Stakeholder Reference Group – a small group of stakeholders representing the interests
of your target group) or internally with other ForAfrika team members. And, of course, don’t
forget to “close the loop” by informing your stakeholders about the decision and creating
room for discussion.
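Where feedback entries are coded by topic, the “trends and outliers” step above can be partly automated with a simple count. The sketch below is a minimal illustration using invented topic labels and an arbitrary 30% threshold; in practice the data would come from the Feedback Registry.

```python
from collections import Counter

# Hypothetical month of coded feedback entries (topic labels are invented
# for illustration; real entries come from the Feedback Registry).
feedback_topics = [
    "water_point_distance", "water_point_distance", "ration_size",
    "water_point_distance", "registration_criteria", "ration_size",
    "water_point_distance", "staff_conduct",
]

counts = Counter(feedback_topics)
total = len(feedback_topics)

# Major trends: topics raised by a meaningful share of entries
# (30% here is an arbitrary cut-off for the example).
trends = [t for t, n in counts.most_common() if n / total >= 0.3]

# Outliers: topics raised only once, worth an individual look.
outliers = [t for t, n in counts.items() if n == 1]

print("trends:", trends)      # -> trends: ['water_point_distance']
print("outliers:", outliers)  # -> outliers: ['registration_criteria', 'staff_conduct']
```

Counts like these only surface *what* is trending; the “understand what is behind the trends” and triangulation steps remain a human judgement.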



6.4 Communicating responses
Stakeholders’ requests, feedback, and complaints will be responded to as per the timeframes, roles and responsibilities, and referral pathways defined previously. In case of
sensitive feedback, the stakeholder will be given the opportunity to give their consent to follow
up after lodging the initial feedback. The country office will keep in contact with stakeholders
and update them as investigations/action plans progress. Confidentiality and security of the
stakeholders are of high importance and must be considered when communicating the
ForAfrika response.
ForAfrika’s response will be communicated through different means depending on whether
the feedback received was individual or collective, sensitive or non-sensitive.

6.5 Foster community ownership


ForAfrika prioritises fostering community ownership as a core aspect of its programme
implementation. This principle hinges on two key components: transparency through sharing
programme results and involving community members in the decision-making process.
Transparency and Sharing Programme Results:
Transparency is a fundamental tenet of accountability, fostering an effective feedback loop between ForAfrika and the community. To ensure transparent communication of programme results, we recommend the following:
• Identify Appropriate Channels: Selecting channels that are accessible to the community
is vital for effective communication.
• Share Relevant Information: Clear and timely communication of programme design, progress, and results forms the backbone of our transparency efforts. Information sharing includes work plans, progress reports, and survey results.
• Choose the appropriate language: Good communication requires that both sides understand the “code”. Use appropriate language, free of jargon and tailored to the socio-economic and educational context of the audience.
• Promote Active Participation: Encourage community members to engage actively with
shared information. This includes asking questions, providing feedback, and suggesting
improvements.
By following these steps, we not only ensure the accountability of our programmes but also
keep the communities informed about our efforts and results.
Inclusive Decision-Making and Fostering Community Ownership:
Involvement of community members in decision-making processes is crucial for promoting
ownership and ensuring programme success. We recommend involving communities and
stakeholders at every stage of project development – from proposal creation to project closure.



• Consultation and Bottom-Up Approach: Solicit opinions from the community about
proposed projects via local leadership, encouraging a bottom-up approach to decision-
making.
• Local Leadership Endorsement: Community opinions are endorsed through local
leadership and relevant line ministries, ensuring that their voices are heard and
represented.
• Inception Meetings: Before implementing projects in communities, inception meetings
are held to verify that the community’s contributions during project design were included.
These meetings also discuss the project’s selection criteria for participants, its potential
impact on the community, and expected benefits.
By combining transparency with inclusive decision-making, we foster community ownership,
enhancing the effectiveness and sustainability of our programmes.

6.6 Reporting
High-quality reporting is an essential component of our stakeholder engagement. To achieve this, country office teams need to make use of GSO’s report review process, based on the tier assigned to them.
Below is the tiering of country office teams as of 2024:

TIER 1 – South Sudan, South Africa
Criteria:
1. Financial value of more than $5 million cash
2. Widespread geographical presence – at least 50% country coverage
3. Evidence of at least three programming sectors
4. At least 50% non-affiliate/local streams of funding
5. Existence of partnerships with various stakeholders – at least 5
GSO support: Review all mid-year, final and annual reports, irrespective of donor.

TIER 2 – Mozambique, Angola
Criteria:
1. Financial value of less than $3 million cash
2. Growing geographical presence – at least 30% country coverage
3. Evidence of between two and three programming sectors
4. At least 30% non-affiliate/local streams of funding
5. Existence of partnerships with various stakeholders – at least 3
GSO support: Review all mid-year, final and annual reports, irrespective of donor.

TIER 3 – Rwanda, Ethiopia, Central African Republic, Uganda
Criteria:
1. Financial value of less than $1 million cash
2. Limited geographical presence – at least 10% country coverage
3. Works in one programming sector
4. Less than 10% non-affiliate/local streams of funding
5. Existence of partnerships with various stakeholders – at least 2
GSO support: Review all reports, irrespective of donor and report type.

Moreover, for those reports requiring review by GSO before submission to donor, the following
report workflow is applicable:
• 30 days to deadline: Development Unit (DU) sends reminders (or Salesforce auto
reminders)
• 20 days to deadline: Country Offices Share draft report
• 10 days to deadline: GSO (DU, DMEAL, SME) return comments to Country Office
• 8 days to submission: Country Office submits second draft for review
• 2 days to submission: DU submits report to Affiliates/ Donor OR Country Office submits
report to donor as applicable
To ensure this workflow functions, country office teams need to make sure that their grant records and reporting schedules are up to date in Salesforce, and that submitted reports are uploaded there with their status marked appropriately.
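The workflow above translates into concrete calendar dates once a donor deadline is known. The sketch below simply applies the day counts listed in this section; the milestone labels are shorthand for this example, not official terms.

```python
from datetime import date, timedelta

# Offsets (days before the donor deadline) taken from the report
# workflow described above.
MILESTONES = {
    "DU reminder": 30,
    "Country Office first draft": 20,
    "GSO comments returned": 10,
    "Country Office second draft": 8,
    "Submission to affiliate/donor": 2,
}

def report_schedule(deadline: date) -> dict:
    """Map each workflow milestone to its calendar date for a given deadline."""
    return {name: deadline - timedelta(days=d) for name, d in MILESTONES.items()}

schedule = report_schedule(date(2025, 6, 30))
for name, when in schedule.items():
    print(f"{when:%d %b %Y}  {name}")
# e.g. the first draft for a 30 June deadline is due 10 June.
```

A helper like this could feed the Salesforce reminder dates mentioned above, so that reminders and review windows stay consistent across grants.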

6.7 Roles & Responsibilities


Depending on the human and financial resources available within the team and the challenges
and opportunities offered by your operational context, roles and responsibilities will be defined
for different members of the country office team.

Country Directors:
• Provide overall leadership and direction for community accountability
• Allocate sufficient resources (staff, time, budget) for community accountability
mechanisms
• Review and sign off on major decisions related to community feedback

Programme Managers:
• Oversee design and implementation of feedback channels and community engagement



• Lead regular review and analysis of community feedback data
• Make decisions on operational changes based on community input
• Share community feedback trends with country leadership team

Country DMEAL Staff:


• Design surveys, focus groups, and other feedback channels
• Analyse and visualise community feedback data
• Identify major trends and outliers in the data
• Support programme managers in interpreting and responding to feedback

Community Development Officers:


• Serve as main liaison between communities and organisation
• Support community members in providing feedback through designated channels
• Close feedback loops by reporting back responses to communities
• Flag urgent or sensitive issues to programme management

Community Focal Points:


• Represent interests of community members
• Support community members in providing feedback through designated channels,
including receiving and forwarding feedback.
• Share information on community perceptions and priorities
• Validate analysis of feedback trends by country team

Global DMEAL Division:


• Provide guidance and tools for community accountability
• Conduct spot checks on adherence to community accountability policies
• Aggregate and analyse community feedback data across countries
• Identify best practices and areas for improvement across programmes

7. Organisational learning and adaptations
7.1 Promote a learning culture
Promoting a culture of learning within ForAfrika is an important aspect of our learning
framework. A learning culture encourages curiosity, exploration, experimentation, and
reflective practice among our staff. It supports the creation and sharing of knowledge, leading
to improved programme design, implementation, and results.
To promote a learning environment, we advise the following:

• Foster an Open and Supportive Environment: The first step in cultivating a learning
culture is creating an environment that encourages openness, collaboration, and
psychological safety. Staff should feel comfortable asking questions, sharing their
thoughts, and discussing their successes and failures. This requires strong leadership
commitment, inclusive communication practices, and respectful interpersonal dynamics.
For example, country offices might establish a “no-blame culture” where employees feel
safe expressing their opinions and learn from mistakes. If a community programme didn’t
achieve its targets, hold a review meeting in which all team members can discuss what
happened, without fear of being reprimanded.

• Celebrate Learning: Make learning a part of daily routines and celebrate it. Acknowledge staff who take the initiative to learn something new or share their learnings with others. This can be done through shout-outs in team meetings, annual awards ceremonies, and features in the Programme Highlights newsletter and Postcards from the Field.

• Learning as a Core Value: Learning is already embedded into our organisational values,
now it is our responsibility to make it present in work processes. Include it in staff
performance appraisals, meetings, and project management processes. Make it clear that
learning is not an optional activity but a core expectation for everyone in the organisation.

• Provide Learning Resources and Opportunities: Make learning resources easily


available to staff. This could be in the form of internal resources – such as newsletters,
best practices compendiums and programming guidelines – and external resources –
such as books, articles, online courses, or webinars. Provide opportunities for staff to
attend training sessions, conferences, or workshops. Also, consider developing a learning
budget that can support these activities.



• Encourage Reflective Practice: Encourage staff to reflect on their experiences and
learnings regularly. This could be through team debriefs after a project, reflective journals,
or learning logs. This process, often known as an “After-Action Review”, can be a powerful
tool for collective learning and improvement.

• Utilise Technology: Make use of technology to facilitate learning. Take advantage of


collaborative platforms like Microsoft Teams and SharePoint as easy channels for
communication and knowledge management. Communities of practice can be housed in
such platforms, centralising the storage and management of lessons’ documents.

7.2 Document and disseminate lessons learned


Documentation and dissemination of lessons learned are crucial for creating a knowledge-rich
environment within ForAfrika. This process allows us to continuously improve our work,
making us more effective and efficient in achieving our mission.
To systematically document and disseminate lessons learned, we recommend the following:

• Identify Lessons: After each programme, activity, or even significant meeting, take the
time to identify and document what was learned. This could be done in an “After-Action
Review” meeting, as described in the previous section.
Conduct Debriefs: Schedule a debriefing session at the close of each programme
milestone. These meetings should include all team members involved in the programme,
from design to execution. An example might include a meeting post-implementation of a
water sanitation programme to assess the accuracy of the needs assessment and context
mapping at the planning stage of the programme.
Use Structured Questions: Ask pointed questions such as, “What unexpected challenges
did we encounter and how did we adapt?” or “Which strategies led to the most significant
community impact?” This structured questioning can help unearth critical insights.
Include All Stakeholders: Ensure the involvement of community representatives in these
discussions. Their perspectives can bring to light unanticipated impacts or identify locally
relevant best practices.

• Catalogue Best Practices: Use Annex A - Programme Good Practices Report. This tool
allows other teams to learn from and potentially replicate their success.
Identify Key Areas: Choose areas of significant success in the programme. For instance,
if a new method of community engagement led to higher participation rates, this method
should be highlighted.
Write Detailed Descriptions: Document these strategies clearly and comprehensively,
ensuring they can be easily replicated in different contexts. Include tangible details such
as the steps taken, resources needed, and challenges encountered.



Ensure Easy Access: Share the practice in the online best-practices library and highlight it to the GSO team for sharing in the Communities of Practice. There, after thorough scrutiny, opportunities for replication or scale-up will be identified and the necessary support made available.

• Establish Communities of Practice: A community of practice (CoP) is a group of people


who share a common concern or passion for something they do and learn how to do it
better as they interact regularly. For instance, you could create communities of practice
around thematic areas like women’s empowerment, agriculture, or health. These groups
can meet regularly, sharing their experiences, successes, and failures, thereby learning
from one another.
Identify Common Interests: Look for recurring topics of interest or challenge across
programme teams. These could range from community engagement techniques to
innovative monitoring methodologies.
Create Dedicated Groups: Document the constitution of the established CoP, stating the
membership, key focus, objectives, meeting frequency, etc.
Encourage regular knowledge exchange by setting up thematic groups on platforms like
Microsoft Teams or Slack.
Facilitate Regular Meetings: Arrange monthly or quarterly meetings where these groups
share updates, discuss common challenges, and brainstorm solutions together.

• Submit regular contributions to Programmes Highlights: Regular newsletters can be


a fantastic way to share lessons learned, best practices, and innovative ideas across the
organisation. Use the existing platform to showcase key milestones within your team and
contribute to the broader organisational learning project.



Pay attention to the submission periods for the monthly issues below:

RECURRING DMEAL MONTHLY OUTPUTS CALENDAR
[Calendar graphic: a month grid (days 1–31) marking the Programme Highlights submission period, the ITT & Wells submissions period, and the due date for Impact Stories submissions.]

• Share Lessons Broadly: Do not limit your sharing of lessons to internal channels. Share
your successes, failures, and learnings with partners, donors, and other stakeholders.
This could be done through blog posts, annual reports, or webinars.
Write Blog Posts: A blog post might detail the journey of a specific community
programme, highlighting the challenges faced, innovative solutions used, and the impact
achieved.
Include in Reports: Dedicate a section of your annual report to key learnings. Similarly,
when reporting to donors, emphasise what your team has learned and how it will shape
future programmes.
Organise Webinars: These could be targeted to specific audiences. For instance, a
webinar for other NGOs might focus on sharing best practices, while a webinar for
potential donors could highlight the organisation’s commitment to learning and improving.
• Leverage Technology: Utilise collaborative platforms and technologies to make the
sharing of lessons learned easier and more efficient. For instance, a shared online
document can be a useful way for teams to contribute their experiences and insights.
Digitise Key Resources: Making use of the Programmes SharePoint website with
documents sorted into categories like “Best Practices”, “Programme Reports”, and
“Community Feedback”, can help make resources easily accessible for all staff.

By ensuring that experiences, insights, and lessons are captured and shared, we avoid
“reinventing the wheel” and foster an environment of continuous improvement and learning
within ForAfrika.

7.3 Capacity building


Capacity building plays an instrumental role in facilitating an efficient DMEAL process. It
involves improving the abilities and skills of staff, community members, and other stakeholders
so that they can perform their roles effectively across the programme’s design, monitoring,
evaluation, accountability, and learning aspects. The following steps can guide country offices,
with GSO support, in identifying knowledge gaps and strengthening staff capacity to demand,
generate, and use evidence for programming.

Identify capacity needs


The first step involves identifying the capacity needs of staff, community members, and
stakeholders. This involves determining the specific knowledge, skills, or abilities that need to
be enhanced or developed.
• Staff Capacity Assessment: Conduct regular capacity assessments with staff to identify
gaps in knowledge or skills. Use tools such as self-assessment questionnaires,
interviews, or observation. Use the survey in Annex C to conduct regular staff capacity
assessments on the competencies presented in this framework. Example: You might
observe that some staff struggle to use new data analysis software, suggesting a need
for further training in this area.
• Community Capacity Assessment: Engage with community members to understand
their capacity needs. This could involve discussions, surveys, or community meetings.
Example: Community members might express a desire to better understand how to
interpret data from community monitoring exercises.
• Stakeholder Capacity Assessment: Identify the capacity needs of other stakeholders
involved in your programmes. This could involve discussions or feedback sessions.
Example: Local government partners might express a need for training on participatory
programme design techniques.
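Once self-assessment questionnaires come back, the results lend themselves to a simple tally. The sketch below is purely illustrative; the competency names, scores, and the 3.0 threshold are invented assumptions, not ForAfrika standards:

```python
# Illustrative sketch only: summarising staff self-assessment scores
# (1 = no confidence, 5 = fully confident) to flag competency gaps.
# Competency names, scores, and the threshold are invented assumptions.
from statistics import mean

responses = {
    "Logical framework development": [4, 3, 5, 2, 4],
    "Data analysis software": [2, 1, 3, 2, 2],
    "Community feedback handling": [4, 4, 3, 5, 4],
}

GAP_THRESHOLD = 3.0  # assumed cut-off below which training is prioritised

for competency, scores in responses.items():
    avg = mean(scores)
    status = "training needed" if avg < GAP_THRESHOLD else "adequate"
    print(f"{competency}: average {avg:.1f} -> {status}")
```

The same tally could equally be done in a spreadsheet; the point is that gap identification should be systematic rather than anecdotal.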

Given the content of this DMEAL Framework, the following areas of skills would be important
to focus on for capacity building:
DMEAL Skills
• Design: These involve the ability to conceptualise and plan a project or programme
effectively. Key skills here include logical framework development, setting realistic and
measurable objectives, defining indicators, planning for data collection, and setting up
monitoring and evaluation mechanisms. Understanding the Theory of Change and how to
develop performance-based budgets also falls under design skills.
• Monitoring: This requires the ability to track and document the progress of a programme
towards its objectives. Skills involved include data collection, analysis, and interpretation,
using both quantitative and qualitative methods. Monitoring skills also involve
understanding how to use monitoring data to identify issues early and adapt the
programme as necessary.
• Evaluation: Evaluation skills involve conducting systematic assessments to determine the
effectiveness, impact, and sustainability of a programme. Key skills here include
developing an evaluation plan, selecting appropriate evaluation methods and tools,
conducting data analysis, and interpreting findings to draw conclusions and
recommendations. Knowledge of various types of evaluations (such as formative,
summative, impact, and real-time evaluations) and different evaluation approaches (like
participatory, theory-based, or Blue Marble evaluations) is also important.
• Accountability: This involves ensuring transparency, providing information about
programme activities and results, and addressing concerns or grievances in a timely
manner. Key skills include establishing feedback mechanisms, engaging stakeholders,
and responding effectively to feedback or complaints.
• Learning: Learning skills involve using the findings from monitoring and evaluation to
improve programmes and inform future decision-making. This involves skills in analysing
and interpreting data, drawing actionable insights, and applying these insights to
programme design and implementation. Learning skills also encompass fostering a
learning culture within the organisation, documenting and disseminating lessons learned,
and engaging in continuous improvement.
• Facilitation and Engagement: Given the participatory and community-centric nature of
the DMEAL process, skills in facilitating group discussions, engaging community
members, and managing stakeholder relationships would be essential.
• Data Management: Training on effective data management, including collection, storage,
analysis, and reporting, is crucial to ensure high-quality and reliable data throughout the
DMEAL process.
• Communication and Reporting: The ability to communicate findings clearly and
effectively is key to the DMEAL process. This could include written communication (such
as report writing), verbal communication (like presenting findings), and visual
communication (for example, creating graphs or infographics).
• Capacity Building and Training: As part of promoting a learning culture, individuals may
need skills to effectively build the capacity of others. This might involve skills in training
design and delivery, coaching, or mentoring.
• Adaptive Management: Given the emphasis on learning and adaptation in the DMEAL
process, skills in adaptive management – the ability to adjust programmes based on
monitoring and evaluation findings – would be important.
• Ethics and Sensitivity Training: Since DMEAL involves working closely with
communities, it’s important to have training on ethical considerations, respecting cultural
norms, and maintaining sensitivity to community needs and vulnerabilities.
• Use of Technology in DMEAL: With the rise of digital tools in programme management
and DMEAL, training in relevant software or digital platforms would be beneficial.
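As a small illustration of the data management and technology skills listed above, the sketch below shows one basic task: disaggregating monitoring records by sex, as the framework requires elsewhere. The field names and records are invented for the example:

```python
# Illustrative sketch only: disaggregating monitoring records by sex,
# a basic data-management task. Field names and records are invented.
from collections import Counter

records = [
    {"sex": "F", "age_group": "18-35", "attended": True},
    {"sex": "M", "age_group": "36-59", "attended": True},
    {"sex": "F", "age_group": "36-59", "attended": False},
    {"sex": "F", "age_group": "18-35", "attended": True},
]

# Count attendance, broken down by sex
attendance_by_sex = Counter(r["sex"] for r in records if r["attended"])
print(dict(attendance_by_sex))
```

In practice this disaggregation would usually happen in the organisation’s data-collection platform or analysis software; the sketch simply shows the underlying logic.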
Design and implement capacity building initiatives

Once capacity needs are identified, the next step is to design initiatives to meet these needs.

• Training Workshops: Design and deliver training workshops to enhance skills and
knowledge. Tailor these workshops to the needs identified in the assessment phase.
Example: You might run a workshop on using data analysis software for your staff, or a
session on interpreting monitoring data for community members.
• Curate a digital curriculum: A collection of online learning tools and products, approved
by the key stakeholders in staff learning, can decentralise learning.
• Mentorship and Coaching: Implement mentorship or coaching programmes where more
experienced staff can guide and support others. Example: An experienced DMEAL Officer
could mentor a new staff member, helping them understand the complexities of the DMEAL
process.
• Resource Provision: Provide resources such as manuals, online courses, or how-to
guides to support ongoing learning. Example: You might develop a user-friendly guide on
conducting community needs assessments or subscribe to an online learning platform
where staff can access a range of courses.

Monitor and evaluate capacity building:


Finally, monitor and evaluate your capacity building efforts to determine their effectiveness
and to identify areas for improvement.
• Feedback Surveys: Distribute feedback surveys after training sessions or at set intervals
during a mentorship programme.
• Progress Tracking: Regularly check in with individuals to understand if and how they are
applying their new skills.

If done effectively, this process of capacity building leads to a more capable and engaged
team, more active and empowered community members, and more meaningful partnerships
with stakeholders. It will enhance the overall effectiveness and impact of our programmes.

7.4 Adapt and improve


Learning is only as valuable as its application. The essence of DMEAL lies in the ability to
adapt and improve programmes based on monitoring data, evaluation findings, and
community feedback. These insights enable us to make informed decisions and adjustments,
improving our strategies and ultimately, our impact. Here is a step-by-step guide to ensure
this critical process of adaptation and improvement:

Area of Action: Review of Monitoring and Evaluation Findings
Activity: Review monitoring data and evaluation reports to identify trends, outcomes, and challenges.
Person Responsible: Programme Manager & DMEAL Staff
Periodicity: Quarterly
Tools Required: Monitoring and evaluation reports, data analysis software

Area of Action: Reflection and Learning
Activity: Discuss the review findings and community feedback, and document learning points.
Person Responsible: Programme Quality Manager & DMEAL Staff
Periodicity: Quarterly
Tools Required: Meeting space, review findings, community feedback

Area of Action: Identification of Adaptations and Improvements
Activity: Based on the learning, identify and agree on necessary programme adaptations and improvements.
Person Responsible: Programme Quality Manager & DMEAL Staff
Periodicity: Quarterly
Tools Required: Meeting space, improvement suggestions

Area of Action: Action Planning
Activity: Develop an action plan outlining the changes to be made, responsibilities, required resources, and timelines.
Person Responsible: Programme Quality Manager & DMEAL Staff
Periodicity: After each Reflection Session
Tools Required: Action plan template

Area of Action: Implementation of Action Plan
Activity: Implement the action plan according to assigned roles, responsibilities, and timelines.
Person Responsible: Assigned team members
Periodicity: As per Action Plan
Tools Required: As per Action Plan

Area of Action: Monitoring and Review
Activity: After implementation, continue monitoring and conduct a review to assess the effectiveness of the adaptations.
Person Responsible: Programme Manager & DMEAL Staff
Periodicity: Bi-annual
Tools Required: Meeting space, monitoring data, evaluation reports

References
Accountability Mechanism - World Bank. Link
Alkin, M. C. (Ed.). (2013). Evaluation Roots: A Wider Perspective of Theorists’ Views and Influences.
SAGE Publications. Link
Bamberger, M., Vaessen, J., & Raimondo, E. (2016). Dealing with complexity in development
evaluation. SAGE Publications. Link
Better Evaluation. (2017). Developing a Monitoring and Evaluation Plan. Link
Better Evaluation. (2018). Managing an Evaluation. Link
Boud, D., & Hager, P. (2012). Re-thinking continuing professional development through changing
metaphors and location in professional practices. Studies in Continuing Education.
Brest, P., & Harvey, H. (2018). Money Well Spent: A Strategic Plan for Smart Philanthropy. Stanford
University Press.
Chambers, R. (2002). Participatory Workshops: A Sourcebook of 21 Sets of Ideas and Activities.
Earthscan.
Cousins, J. B., & Whitmore, E. (1998). Framing Participatory Evaluation. New Directions for Evaluation,
1998(80), 5–23.
Guijt, I. (2014). Participatory Approaches: Methodological Briefs - Impact Evaluation No. 5. UNICEF.
Link
Howard, G., & Bartram, J. (2010). Vision 2030: The resilience of water supply and sanitation in the face
of climate change. World Health Organisation.
Hughes, K. (2003). The role of citizen participation in development planning and project management.
Economic Development Institute of the World Bank.
Kusek, J. Z., & Rist, R. C. (2004). Ten Steps to a Results-Based Monitoring and Evaluation System.
The World Bank. Link
Needs Assessment Guide - Sphere Handbook. Link
Patton, M. Q. (2011). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation
and Use. The Guilford Press.
Patton, M. Q. (2018). Principles-focused evaluation: The GUIDE. The Guilford Press.
Popp, J., Milward, H. B., MacKean, G., Casebeer, A., & Lindstrom, R. (2014). Inter-organisational
networks: A review of the literature to inform practice. IBM Center for The Business of Government.
Preskill, H., & Catsambas, T. T. (2006). Reframing Evaluation Through Appreciative Inquiry. SAGE
Publications.
UNDP. (2020). Social and Environmental Standards. Link
UNICEF. (2021). Introduction to Programme Monitoring and Evaluation. Link
USAID. (2018). Collaborating, Learning, and Adapting (CLA) Toolkit. Link
USAID. (2019). Performance Monitoring Plan Toolkit. Link
Weiss, C. H. (1998). Evaluation: Methods for Studying Programs and Policies. Prentice Hall.
World Bank. (2020). Beneficiary Monitoring and Evaluation in Large-Scale Projects. Link

Annex A - Programme Good Practices Report

Please use the Word template stored in the DMEAL SharePoint folder when creating your
report. The content below provides guidance only and does not use the recommended final
layout/style for this product.

Project/Programme:
Project/Programme Location:
Duration:
Target population:
Stakeholders: [involved in the project/programme at all levels including donors/government
departments at national and devolved tiers; community level]
ForAfrika personnel: [engaged in the project/programme]
ForAfrika [Country Office name]
Authors: [ForAfrika personnel writing (authors) of the project/programme model/best practice]
[month, year – when the document was prepared]

Executive Summary
• What is the purpose and scope of the project/programme?
• What are the key objectives, findings, and recommendations?
• Which practices were adopted, and why were they considered good or best practices?
• What results were achieved due to these practices?
• What recommendations can be made based on the project/programme’s experience?

1. Programming Framework
• What is the context of the project/programme within the organisation’s broader
programming framework?
• What is the project/programme theory of change?
• Where and when was the initiative implemented?

2. Good/Best practice identification


• Identify the practices considered as good or best and describe how they align with the
overall programme goal.
• Describe, if applicable, any innovations, including their specific names/types and how
these were generated/by whom.

• Describe, if applicable, the processes/dynamics, including challenges and how these were
tackled and with what results (problem-solving approaches).
• Describe, if applicable, how the project/programme identified and addressed/applied
cross-cutting issues (age/gender/disability/environment) as defined by ForAfrika.
• Discuss how these practices met the criteria for good or best practices, providing evidence
where possible.

3. Project/Programme Outcomes, Results, Impact, and Evaluation Methodology
• What were the outcomes, and how did the good or best practices influence them?
• How did the practices contribute to the project’s impact, and what were the key findings?
• What data collection and analysis techniques were used to evaluate these practices?
• Were there any positive or negative unintended outcomes from implementing these
practices?

4. Project/Programme Management Cycle


• Discuss the influence of these practices on the project implementation strategies.
• What technical expertise was required, and how did the identified practices facilitate its
utilisation?
• Discuss the role of these practices in engaging the community/targeted stakeholders.
• How were the practices implemented and monitored, and what effects did they have on
project execution?
• Detail how good or best practices influenced resource allocation.
• How did these practices influence the project execution, including any deviations from the
plan?

5. Project/Programme Sustainability Measures and Local Ownership


• Discuss how the identified practices contributed to defining and achieving project
sustainability.
• How did these practices foster local ownership and self-sufficiency among targeted
stakeholders?

6. Conclusions, Recommendations, and Next Steps


• What are the key conclusions about the identified practices?
• What recommendations can be made for similar programming considerations, based on
the implementation and results of these practices?

• What are the next steps for implementing recommendations and utilising lessons learned
from these practices?

End notes:

1. Please select high-quality pictures for illustrating various sections of the document.
2. Please ensure credit is given for any material/content cited from other sources.
3. Please use diagrams wherever possible to make visual illustrations.
4. While documenting good or best practices, ensure the practice is highlighted, its impact
demonstrated with evidence, and how it aligns with the criteria for good or best practices.
Also, note its potential for replication and adaptation in different contexts.

Appendices

• What supplementary materials are relevant for a comprehensive understanding of the
project/programme?
• Include any relevant data or research that demonstrates the effectiveness of the good or
best practices documented.

Annex B – Evaluation Quality Checklist
The following criteria are adapted from the Global Evaluation Reports Oversight System
(GEROS), a United Nations system that aims to strengthen the evaluation function so that it
meets and exceeds United Nations Evaluation Group (UNEG) norms and standards, the UN
System-Wide Action Plan on gender equality (UN-SWAP), and equity- and human-rights-based
approaches.

Terms of Reference

Section A: Context and Description of Object of the Evaluation (Weight 5%)

Question 1. Do the ToR include sufficient and relevant information on context?

• The most relevant aspects of the economic, social and political context relevant to the
object of the evaluation are well and succinctly explained. (This can include strategies,
policies, goals, frameworks and priorities at the international, national, and organisational
levels).

• The ToR spell out the national and international instruments as well as ForAfrika policies
on human rights, including child rights, and gender equality, and equity/inclusion that are
relevant to the object of the evaluation.

Question 2. Do the ToR include sufficient and relevant information on the object of the
evaluation?
• Overall, the object of the evaluation is briefly and clearly explained (includes objective(s),
interventions, stakeholders, time period, budget, geographic scope, implementation
phase, and linkages to SDGs, ForAfrika outcome areas, and Country Office strategic
priorities).

• The logic model or the theory of change (ToC) is clearly presented. If not, the ToR specify
that the ToC will be reconstructed retroactively.

• Rights holders and duty bearers are clearly identified by type and geographic location
where relevant.

• Key stakeholders, including ForAfrika, are clearly identified and their contribution to the
object of the evaluation is well described.

Section B: Evaluation Purpose, Objectives, and Scope (Weight 30%)

Question 3. Do the ToR specify the purpose of the evaluation, who will use it, and how
it will be used?

• The purpose of the evaluation identified in the TOR clearly states why the evaluation is
being done, including justification for why it is being done at this time.

• The ToR identify the primary intended users and intended uses of the evaluation.

Question 4. Do the ToR present clearly defined, relevant, and feasible objectives?

• The evaluation objectives clearly follow the overall purpose of the evaluation.

• It is clear from the ToR what type of evaluation this is (formative, summative, mid-term,
ex-post, final, etc.).

• Evaluation objectives clearly address gender, equity/inclusion, and child rights either
within objectives or as separate objectives, as appropriate.

Question 5. Do the ToR include a complete description of the scope of the
evaluation?

• The ToR clearly define what will and will not be covered in the evaluation (thematically,
chronologically, geographically), as well as the reasons for this scope (e.g., lack of access
to geographic areas for political or safety reasons at the time of the evaluation, lack of
data/evidence on elements of the intervention).

• The ToR specify the time to be covered by the evaluation.

• The ToR specify the geographical locations to be covered by the evaluation.

• The ToR specify the programmatic scope to be covered by the evaluation.

• The ToR clearly include gender, equity/inclusion, and child rights dimensions in the
evaluation scope, referring to Leave No one Behind.

Section C: Evaluation Framework (Weight 20%)

Question 6. Do the ToR include a comprehensive and tailored set of evaluation
questions within the framework of the evaluation criteria?

• Evaluation questions adequately address all of the OECD-DAC standard evaluation
criteria (relevance, coherence, efficiency, effectiveness, sustainability, and impact (if
relevant)) depending on the evaluation context, and are well balanced.

• Evaluation questions and sub-questions are appropriate for meeting the objectives and
purpose of the evaluation and are aligned with the evaluation criteria.

• Relevance of the object of the evaluation is adequately addressed.

• Coherence of the object of the evaluation is adequately addressed.

• Effectiveness of the object of the evaluation is adequately addressed.

• Efficiency of the object of the evaluation is adequately addressed.

• Sustainability of the object of the evaluation is adequately addressed.

• Impact of the object of the evaluation is adequately addressed, if relevant. If impact is
included but is not relevant at this stage (e.g., in a formative evaluation), the reviewer
should state why.

• Any other relevant criteria are adequately addressed, including humanitarian criteria
(Coverage, Connectedness, Coordination, Protection, Security) in the case of
humanitarian programmes.

• The ToR include an assessment of relevant human rights, including child rights,
equity/inclusion (e.g., disability inclusion, Leave No one Behind), and gender equality
through the selection of the evaluation criteria and questions.

Section D: Evaluability Assessment and Methodology (Weight 25%)

Question 7. Do the ToR specify the methods for data collection and analysis,
including information on the overall methodological design?

• Evaluability assessment: the ToR assess the availability and reliability of data (baseline,
indicator, targets, output and outcome data available through DMEAL) disaggregated by
gender, age, disability/socially excluded groups, etc.

• Existing primary and/or secondary sources of information are identified.

• The overall methodological approach for the evaluation is explained, e.g., theory-based
evaluation, mixed methods, participatory (including vulnerable groups), utilisation-focused,
gender- and human-rights-responsive, equity-focused. The evaluation design is suggested
(e.g., experimental, quasi-experimental, non-experimental).

• Data collection and analysis methods are adequately addressed (primary and secondary
data collection).

• Limitations to the evaluation are adequately covered (methods, sources of information,
disaggregated data, time, budget; if vulnerable groups will not be actively involved,
justify why).

• The ToR specify that the evaluation will follow the ForAfrika Norms and Standards, as
well as ethical guidelines.

• The ToR specify an evaluation approach and data collection and analysis methods that
are human-rights based, including child-rights based and gender sensitive, and for
evaluation data to be disaggregated by sex, ethnicity, age, disability, etc.

• The overall methodology is appropriate to meet the objectives and scope of the evaluation
and to answer the evaluation questions.

Section E: Work plan/Evaluation Management and Structure/Presentation of the
ToR (Weight 20%)

Question 8. Do the ToR present complete information on the work plan and
evaluation management?

• The ToR state the list of products that will be delivered by the evaluation team.

• The ToR describe the structure of the evaluation report and links to the ForAfrika DMEAL
Standards.

• The ToR request an Executive Summary.

• The ToR specify an appropriate length of products to be delivered. (An evaluation report
should be between 40-60 pages.)

• The ToR describe the key stages of the evaluation process (incl. meetings, consultations,
workshops with different groups of stakeholders, key points of interaction with a steering
committee, process for verification of findings, presentation of preliminary findings, etc.).

• Team composition is described, stating any potential conflicts of interest, and
corresponds to the scope and complexity of the evaluation being planned.

• The ToR define the level of expertise needed among the evaluation team on gender
equality and human rights, including child rights, equity/inclusion, and their responsibilities
in this regard, and calls for a gender balanced and culturally diverse team that makes use
of national/regional evaluation expertise.

• Roles and responsibilities of the evaluation team, ForAfrika CO, ForAfrika GSO, steering
committee, evaluation reference group, stakeholders are clearly stated.

• Ethical considerations are addressed.

• The ToR contain a section specifying that contractors are required to “clearly identify any
potential ethical issues and approaches, as well as the processes for ethical review and
oversight of the evaluation process in their proposal”.

• The ToR specify whether the evaluation will have to go through an ethical review board
based on the “Criteria for Ethical Review Checklist” annexed to the ToR.

• If relevant, the ToR explain how official ethical approvals will be received.

• The ToR describe basic steps of the evaluation quality assurance process (e.g., reference
groups, quality review of milestone documents, etc.).

• The ToR describe a proposed approach to dissemination and advocacy for the evaluation
findings and recommendations.

• The number of days allocated to each phase is realistic to produce high quality
deliverables, given the scope of the evaluation being planned.

• The budget allocated to the evaluation is realistic to produce a high-quality evaluation
report, given the scope of the evaluation being planned.

Question 9. Are the ToR clear and do they hold together in a logical way?

• The sections of the ToR hold together in a logically consistent way.

• The style of the ToR is adequate (brief, to the point, logically structured and easy to
understand).

Inception Report Review

Section A: Opening Pages and Introduction (Weight 5%)

Question 1. Do the opening pages and introduction of the Inception Report contain all
the relevant information?

• The introduction contains a short description of the purpose and content of the IR, the key
activities undertaken for its preparation and its place in the evaluation process.

• The introduction highlights any emerging issues that have arisen during the inception
phase (if applicable).

• Basic elements in the opening pages are presented (evaluation title, country, years
covered by the evaluation, name(s) and/or organisation(s) of the evaluator(s), and
commissioning organisation on cover page, list of acronyms, table of contents, including
list of tables and figures).

Section B: Context and Description of the Object of the Evaluation (Weight 10%)

Question 2. Are the context and description of the object of the evaluation clearly
presented?

• Clear and relevant description of the context of the object of the evaluation (i.e. relevant
policy, socio-economic, political, cultural, power/privilege, institutional, international
factors) and how context relates to the implementation of the object of the evaluation.

• Linkages are drawn to the SDGs and relevant targets and indicators for the area being
evaluated.

• The object of the evaluation is briefly and clearly explained (its objectives, stakeholders
involved and their roles, contributions, and stakes, rights holders/participants and their
status and needs, time period, budget, geographic scope, phase of implementation).

• The description of the object of the evaluation makes adequate references to human
rights, gender, and equity/inclusion.

• The logic model or the Theory of Change (ToC) of the object being evaluated is described
to some extent, with the assumption that it will be further refined or finalised in the
Evaluation Report.

Section C: Purpose, Objectives, and Scope of the Evaluation (Weight 10%)

Question 3. Are the purpose, objectives and scope of the evaluation clearly presented?

• The evaluation purpose is clearly presented, including the rationale behind the evaluation,
its intended use and what this use is expected to achieve, its primary intended users and
how they stand to gain or lose from the results of the evaluation.

• The evaluation objectives are clearly presented with reference to any changes made to
the objectives included in the TOR.

• The scope of the evaluation is clearly defined (includes what will and will not be covered,
the geographic location, period, thematic field(s) of intervention/interventions to be
evaluated, levels (regional, country, municipal). Changes from ToR are clearly indicated
and justified.

Section D: Evaluation Framework (Weight 20%)

Question 4. Are the evaluation criteria and questions clearly presented?

• All of the evaluation criteria and questions are listed as per the ToR. If criteria/questions
differ from the ToR, the Inception Report justifies the changes; e.g., efforts to prioritise
questions and reduce the number of questions to address should be noted in the report.

Question 5. Are evaluation findings derived from the conscientious, explicit, and
judicious use of the best available, objective, reliable, and valid data, and from accurate
quantitative and qualitative analysis of evidence?

• The Inception Report links the evaluation criteria and questions to the chosen
methodology through an evaluation matrix that includes indicators, benchmarks,
assumptions and/or other processes from which the analysis can be based and
conclusions drawn, referring to the Convention on the Rights of the Child (CRC), Leave
No one Behind (LNOB), and disability inclusion as appropriate.

• Indicators, data sources, and data collection and methods are identified for each question.

• The indicators chosen are specific, easily measurable, and relevant to the corresponding
evaluation questions and ToC.

• The evaluation questions and indicators include reference to human rights, gender, and
equity dimensions.

Section E: Methodology (Weight 30%)

Question 6. Is the methodology clearly presented, technically sound, logistically
feasible, and appropriate considering the evaluation framework?

• Clear and complete description of a relevant and robust methodological design and set of
methods that are suitable for the evaluation’s purpose, objectives, and scope. Any
adaptations to the methods proposed in the ToR are explained and justified.

• If the evaluation asks attribution questions (outcome or impact level), an appropriate
evaluation design (qualitative or quantitative) to reliably measure attribution is proposed.

• Key data sources are clearly presented and appropriate (includes list of documents for
desk review, the group of stakeholders to be interviewed, available databases, data gaps),
and appear comprehensive and reliable.

• Methodology allows for drawing causal connections between outputs and expected
outcomes.

• The sampling methods described for qualitative data collection are appropriate and
adequate (includes ALL of the following: sample size, the geographic area(s), specific
populations, sampled site/country visits, the rationale/criteria for selection, how
participants/interviewees will be selected, and criteria for selection of countries to be
visited/studied (if applicable)).

• The sampling methods described for quantitative data collection are appropriate and
adequate (includes ALL of the following: sample size, the geographic area(s), specific
populations, sampled site/country visits, the rationale/criteria for selection, how
participants/interviewees will be selected, and criteria for selection of countries to be
visited/studied (if applicable)).

• The data collection tools are linked to the specific evaluation questions (the way in which
the tools are designed should facilitate capturing the information needed to answer the
evaluation questions).

• Questions in interview protocols, discussion guides and questionnaires are robust,
focused, linked to the evaluation matrix and avoid leading questions.

• The Inception Report describes relevant methodological limitations to the evaluation.

• Clear and complete description of evaluation limitations, potential biases and constraints
faced by the evaluation team, and mitigation strategies to be used.

• Explicit and contextualised reference to the obligations of evaluators (independence,
impartiality, credibility, conflicts of interest, accountability).

• Description of ethical safeguards for participants appropriate for the issues described
(respect for dignity and diversity, right to self-determination, fair representation, and
compliance with codes for vulnerable groups, i.e. adherence to ethical principles and
procedures, do no harm, confidentiality and data collection). For those cases where the
evaluation will involve interviewing children, explicit reference is made to procedures for
ethical research involving children.



Section F: Evaluation Workplan (Weight 20%)

Question 7. Is the workplan complete, and does it contain the relevant information?

• The evaluation phases are clearly described, including a timeline with associated
activities, number of days for each team member, locations and deliverables.

• The roles and responsibilities of each member of the evaluation team are clearly
described.

• If the evaluation requires official ethical approval, the process to be followed is clearly
described.

• The Inception Report describes the evaluation quality assurance process.

• The logistics of carrying out the evaluation are discussed (e.g. assistance required from
ForAfrika for interview arrangements, field visits, etc.) and the expected roles and
responsibilities from the commissioning organisation(s) or oversight committee are
adequately explained.

Section G: Inception Report Structure/Presentation (Weight 5%)

Question 8. Do the annexes contain all the relevant elements?

• The evaluation ToR are included in the annexes.

• The following elements are annexed to the Inception Report: logic model/ToC, evaluation
matrix, bibliography, data collection tools (draft interview protocols, survey, case study
formats), list(s) of people to be interviewed and, if applicable and available, the ethical
review board approval form and/or informed consent form.

Question 9. Is the Inception Report coherent and logical?

• Structure is easy to identify and navigate (for instance, with numbered sections, clear titles
and sub-titles, well formatted).

• Inception Report is easy to understand (written in an accessible way for intended
audiences and generally free from grammar, spelling and punctuation errors), and
conveys key information through the use of visual aids (such as infographics, maps,
tables, figures, photos) which are clearly presented, labelled, and referenced in text.

Draft Evaluation Report Review



Section A: Background (Weight 5%)

Question 1. Is the object of the evaluation clearly described?

• Clear and relevant description of the intervention, including location(s), timelines,
cost/budget, and implementation status.

• Clear and relevant description of intended rights holders (participants) and duty bearers
(state and non-state actors with responsibilities regarding the object of the evaluation) by
type (i.e., institutions/organisations, communities, individuals…), by geographic
location(s) (i.e., urban, rural, particular neighbourhoods, towns/cities, sub-regions…) and
in terms of numbers reached, with disaggregation by gender, age, disability… (as
appropriate to the purpose of the evaluation).

Question 2. Is the context of the intervention clearly described?

• Clear and relevant description of the context of the object of the evaluation (i.e. relevant
policy, socio-economic, political, cultural, power/privilege, institutional, international
factors) and how context relates to the implementation of the object of the evaluation.

• Linkages are drawn to the SDGs and relevant targets and indicators for the area being
evaluated.

• Clear and relevant description (where appropriate) of the status and needs of the right
holders/participants of the intervention.

Question 3. Are key stakeholders, their relationships and contributions clearly
identified?

• Identification of implementing agency(ies), development partners, right holders, and
additional duty bearers and other stakeholders; and of linkages between them (e.g.,
stakeholder map) (if relevant).

• Identification of the specific contributions and roles of key stakeholders (financial or
otherwise), including ForAfrika.

Section B: Evaluation Purpose, Objectives And Scope (Weight 5%)

Question 4. Is the purpose of the evaluation clearly described?

• Specific identification of how the evaluation is intended to be used and what this use is
expected to achieve.

• Identification of appropriate primary intended users of the evaluation.



Question 5. Are the objectives and scope of the evaluation clear and realistic?

• Clear and complete description of what the evaluation seeks to achieve by the end of the
process, with reference to any changes made to the objectives included in the ToR and/or
in the Inception Report.

• Clear and relevant description of the scope of the evaluation: what will and will not be
covered (thematically, chronologically, geographically with key terms defined), as well as
the reasons for this scope (e.g., specifications by the ToR and/or inception report, lack of
access to particular geographic areas for political or safety reasons at the time of the
evaluation, lack of data/evidence on particular elements of the intervention).

Question 6. Is the theory of change, results chain or logic well-articulated?

• Clear and complete description of the intervention’s intended results or of the parts of the
results chain that are applicable to, or are being tested by, the evaluation.

• Causal relationship between outputs and outcomes is presented in narrative and graphic
form (e.g., results chain, logic model, theory of change, evaluation matrix).

• For theory-based evaluations, the theory of change or results framework is assessed, and
if requested in the ToR, it is reformulated/improved by the evaluators.

Section C: Evaluation Methodology (Weight 20%)

Question 7. Does the evaluation use questions and evaluation criteria that are
explicitly justified as appropriate for the purpose of the evaluation?

Not all OECD/DAC criteria are relevant to all evaluation objectives and scopes. Standard
OECD DAC Criteria include Relevance; Effectiveness; Efficiency; Sustainability; and Impact.
Evaluations should also consider equity, gender and human rights (these can be
mainstreamed into other criteria). Humanitarian evaluations should consider Coverage;
Connectedness; Coordination; Protection; and Security.
• Evaluation questions and sub-questions are appropriate for meeting the objectives and
purpose of the evaluation and are aligned with the evaluation criteria.

• In addition to the questions and sub-questions, the evaluation matrix includes indicators,
benchmarks, assumptions and/or other processes on which the analysis can be based
and conclusions drawn.

Question 8. Does the report specify methods for data collection, analysis, and
sampling?

• Clear and complete description of a relevant and robust methodological design and set of
data collection methods that are suitable for the evaluation’s purpose, objectives, and
scope.



• Data sources are appropriate, normally including qualitative and quantitative sources
(unless otherwise specified in the ToR), and are all clearly described.

• Sampling strategy is provided, describing how diverse perspectives were captured (or if
not, providing reasons for this).

• Clear and complete description of data analysis methods.

• Methodology allows for drawing causal connections between outputs and expected
outcomes.

• Clear and complete description of evaluation limitations, biases and constraints faced by
the evaluation team and mitigation strategies used.

Question 9. Are ethical issues and considerations described?

• Explicit and contextualised reference to the obligations of evaluators (independence,
impartiality, credibility, conflicts of interest, accountability).

• Description of ethical safeguards for participants appropriate for the issues described
(respect for dignity and diversity, right to self-determination, fair representation, and
compliance with codes for vulnerable groups, i.e. adherence to ethical principles and
procedures, do no harm, confidentiality and data collection).

• If the Evaluation Report required an official ethical approval and informed consent, both
forms are included as an annex in the draft final evaluation report.

Section D: Evaluation Findings (Weight 25%)

Question 10. Do the findings clearly address all evaluation objectives and scope?

• Findings contain sufficient levels of evidence to systematically address all of the
evaluation’s criteria and questions. Gaps in the evidence generated, and mitigation of
bias, are highlighted if relevant.

• If feasible and relevant to the purpose, cost analysis is clearly presented (how costs
compare to similar interventions or standards, most efficient way to get expected results);
if not feasible, an explanation is provided.

• Explicit use of the intervention’s results framework/ToC in the formulation of the findings.

Question 11. Are evaluation findings derived from the conscientious, explicit and
judicious use of the best available, objective, reliable and valid data, and from
accurate quantitative and qualitative analysis of evidence?

• Evaluation uses credible forms of qualitative and quantitative data, presenting both output
and outcome-level data as relevant to the evaluation framework. Triangulation is evident
through the use of multiple data sources.



• Findings are clearly supported by, and respond to, the evidence presented, including both
positive and negative. Findings are based on clear performance indicators, standards,
benchmarks, or other means of comparison as relevant for each question.

• Unexpected effects (positive and negative) are identified and analysed.

• The causal factors (contextual, organisational, managerial, etc.) leading to achievement
or non-achievement of results are clearly identified. For theory-based evaluations,
findings analyse the logical chain (progression, or not, from implementation to results).

Question 12. Does the evaluation assess and use the intervention’s Results Based
Management elements?

• Clear and comprehensive assessment of the intervention’s monitoring system (including
completeness and appropriateness of the results/performance framework, covering
vertical and horizontal logic, and of DMEAL tools and their usage) to support
decision-making.

Section E: Evaluation Conclusions & Lessons Learned (Weight 10%)

Question 13. Do the conclusions present an objective overall assessment of the
intervention?

• Conclusions are clearly formulated and reflect the purpose and objectives of the
evaluation. They are sufficiently forward looking (if a formative evaluation, or if
implementation is expected to continue or to have an additional phase).

• Conclusions are derived appropriately from findings and present a picture of the strengths
and limitations of the intervention that adds insight and analysis beyond the findings.

Question 14. Are logical and informative lessons learned identified? [N/A if lessons are
not presented and not requested in ToR]

• Identified lessons stem logically from the findings, have wider applicability and relevance
beyond the object of the evaluation.

• Lessons are clearly and concisely presented yet have sufficient detail to be useful for
intended audience.

Section F: Recommendations (Weight 15%)

Question 15. Are recommendations well-grounded in the evaluation?

• Recommendations align with the evaluation purpose, are clearly formulated and logically
derived from the findings and/or conclusions.

• Recommendations are useful and actionable for primary intended users and uses
(relevant to the intervention); guidance is given for implementation, as appropriate.



• Process for developing the recommendations is described, and includes the involvement
of duty-bearers, as well as rights holders when feasible (or an explanation is given for why
they were not involved).

Question 16. Are recommendations clearly presented?

• Clear identification of groups or duty-bearers responsible for action for each
recommendation (or clearly clustered group of recommendations). Clear prioritisation
and/or classification of recommendations to support use.

Section G: Evaluation Structure/Presentation (Weight 5%)

Question 17. Does the evaluation report include all relevant information?

• Opening pages include:

Name of evaluated object, timeframe of the object evaluated, date of report, location of
evaluated object, name(s) and/or organisation(s) of the evaluator(s), name of organisation
commissioning the evaluation, table of contents -including, as relevant, tables, graphs,
figures, annexes-; list of acronyms/abbreviations, page numbers.

• Annexes include: terms of reference, evaluation matrix, list of interviewees, results
chain/ToC/logical framework (unless included in report body), list of site visits, data
collection instruments (such as survey or interview questionnaires), list of documentary
evidence.

Other appropriate annexes could include additional details on methodology, information
about the evaluator(s).

Question 18. Is the report logically structured?

• Structure is easy to identify and navigate (for instance, with numbered sections, clear titles
and sub-titles, well formatted).

• Structure follows ForAfrika’s guidelines for evaluation reports: context, purpose,
objectives and methodology would normally precede findings, which would normally be
followed by conclusions, lessons learned and recommendations.

• Report is easy to understand (written in accessible way for intended audience) and
generally free from grammar, spelling and punctuation errors.

• Frequent use of visual aids (such as infographics, maps, tables, figures, photos) to convey
key information. These are clearly presented, labelled, and referenced in text.

• Report is of reasonable length; it does not exceed the number of pages specified in the
ToR, if any.



Section H: Evaluation Principles (Weight 10%)

Question 19. Did the evaluation design and style consider the incorporation of
ForAfrika’s commitment to a human rights-based approach to programming, to gender
equality, and to equity?

• Reference and use of rights-based framework, and/or CRC, and/or CCC, and/or CEDAW
and/or other rights related benchmarks in the design of the evaluation.

• Clear description of the level of participation of key rights holders and duty bearers in the
conduct of the evaluation, including in the development of recommendations (for
example, a reference group is established, stakeholders are involved as informants or in
data gathering).

• Stylistic evidence of the inclusion of these considerations can include using human-rights
language; gender-sensitive and child-sensitive writing; disaggregating data by gender,
age and disability groups; disaggregating data by socially excluded groups.

Question 20. Does the evaluation assess the extent to which the implementation of the
intervention addressed equity?

• Evaluation assesses the extent to which the implementation of the intervention addresses
child rights and Leave No One Behind (gender and other excluded and marginalised
groups). It is disability inclusive.

• A gender-responsive evaluation methodology, methods and tools, and data analysis
techniques are selected.

• The evaluation Findings, Conclusions and Recommendations reflect a gender analysis.

Section I: Executive Summary (Weight 5%)

Question 21. Can the executive summary inform decision-making?

• An executive summary is included that is of appropriate conciseness and depth for key
users (maximum of 5 pages unless otherwise specified in the ToR).

• Includes all necessary elements (overview of the object of the evaluation, evaluation
purpose, objectives and intended audience, evaluation methodology, key conclusions on
findings, lessons learned if requested, and key recommendations) as per ToR.

• Includes all significant information to understand the object of the evaluation and the
evaluation itself AND does not introduce anything new from what is presented in the rest
of the report.



Annex C – DMEAL skills self-assessment
This survey aims to assess skills in the DMEAL framework, including design, monitoring,
evaluation, accountability, and learning. Responses will help identify areas of strength and
areas for improvement, guiding targeted capacity-building efforts to enhance programme
effectiveness.
Access the survey using the link: https://ee.humanitarianresponse.info/x/7t6np0kE

Dear Participant,
Thank you for participating in this survey to assess your skills in the Design, Monitoring,
Evaluation, Accountability, and Learning (DMEAL) framework. Your responses will help us
identify strengths and areas for improvement to enhance our programme effectiveness. The
survey covers DMEAL skills, including design, monitoring, evaluation, accountability, and
learning, as well as other relevant areas such as facilitation, data management, project
management, communication, capacity building, adaptive management, ethics, and
technology.
Your honest feedback will be kept confidential and used for internal assessment and capacity
building purposes.
Let’s begin the survey!

Most of the questions below will use a scale of 1-5 for you to assess your skill level on each
competency.
• 1: Very Low – The individual has minimal knowledge or skills in the particular area.
• 2: Low – The individual has some basic knowledge or skills but requires significant
improvement and guidance.
• 3: Moderate – The individual possesses satisfactory knowledge or skills and can perform
tasks with some level of independence.
• 4: High – The individual has a good level of knowledge or skills and can perform tasks
proficiently with minimal supervision.
• 5: Very High – The individual is highly knowledgeable or skilled and can perform tasks
expertly with a high level of independence.

Identification
• What is your name?
• In what office do you work?
• What is your current position?
• If you selected other in the previous question, please specify below:



Core DMEAL Skills
Design

Using the 1-5 scale described above, please rate your level of proficiency in the following
programme design skills:
• Proficiency in logical framework development. (Logical framework development involves
creating a structured framework that outlines the programme’s goals, objectives,
activities, and indicators.)

• Ability to set realistic and measurable objectives. (Setting objectives that are specific,
achievable, relevant, and time bound.)

• Competence in defining indicators. (Developing indicators that can effectively measure
the programme’s progress and success.)

• Ability to set up monitoring and evaluation mechanisms. (Establishing systems and
processes to track and assess programme progress.)

• Understanding of the Theory of Change. (The ToC explains how and why a programme’s
activities will lead to the desired outcomes and impact.)

• Familiarity with developing performance-based budgets. (Performance-based budgets
align programme resources with the intended results and outcomes.)

Monitoring
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme monitoring skills:
• Ability to collect and analyse quantitative data. (Collecting and analysing numerical data
using appropriate statistical methods.)

• Ability to collect and analyse qualitative data. (Collecting and analysing non-numerical
data, such as interviews or observations.)

• Proficiency in using monitoring data to identify issues early. (Using monitoring data to
detect challenges or deviations from the programme’s intended progress.)

• Competence in leading discussions and using monitoring data to adapt programmes.

Evaluation
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme evaluation skills:
• Ability to develop an evaluation plan. (Creating a comprehensive plan for assessing the
effectiveness, impact, and sustainability of a programme.)

• Competence in selecting appropriate evaluation methods and tools. (Choosing the most
suitable evaluation approaches and techniques based on the programme’s objectives and
available resources.)



• Proficiency in conducting data analysis. (Analysing evaluation data using appropriate
analytical methods and tools.)

• Ability to interpret findings and draw conclusions. (Making sense of evaluation results to
understand the programme’s achievements and areas for improvement.)

• Familiarity with different types of evaluations. (Formative, summative, impact, real-time
evaluations, etc.)

• Familiarity with different evaluation approaches. (Participatory, theory-based, Blue
Marble evaluations, etc.)

Accountability
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme accountability skills:
• Ability to establish feedback mechanisms. (Creating channels for stakeholders to provide
feedback and express concerns.)

• Proficiency in engaging stakeholders. (Involving relevant stakeholders in programme
activities and decision-making processes.)

• Ability to respond effectively to feedback or complaints. (Addressing feedback or
complaints in a timely and appropriate manner.)

Learning
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme learning skills:
• Ability to analyse and interpret data for learning purposes. (Examining data to extract
meaningful insights and lessons.)

• Proficiency in drawing actionable insights from monitoring and evaluation findings.
(Identifying key findings and translating them into practical recommendations for
programme improvement.)

• Ability to apply insights to programme design and implementation. (Using lessons learned
to inform decision-making and enhance programme effectiveness.)

Support Skills
Facilitation and Engagement
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme facilitation skills:
• Ability to facilitate group discussions effectively. (Guiding group discussions to ensure
active participation and meaningful outcomes.)

• Proficiency in engaging community members. (Building positive relationships with
community members and involving them in programme processes.)

• Ability to manage stakeholder relationships. (Maintaining positive and constructive
interactions with programme stakeholders.)



Communication and Reporting Skills
Using the 1-5 scale described above, please rate your level of proficiency in the following
programme communication skills:
• Ability to write clear and informative reports. (Producing well-structured reports that
effectively communicate programme findings and recommendations.)

• Verbal communication skills in presenting findings. (Conveying information verbally in a
clear and engaging manner.)

• Ability to create visual representations of data (graphs, infographics, etc.). (Presenting
data visually to enhance understanding and engagement.)

Capacity Building and Training


Using the 1-5 scale described above, please rate your level of proficiency in the following
programme training skills:
• Ability to design and deliver effective training sessions. (Developing and conducting
training programs that enhance participants’ knowledge and skills.)

• Coaching and mentoring skills. (Guiding and supporting individuals to improve their
performance and professional growth.)

Please use this space to provide any additional comments, suggestions, or feedback you may
have regarding the DMEAL skills assessment, or any other aspect related to programme
effectiveness and capacity building.



Annex D - Programme Logframe Template
Use this Logframe template.xlsx to create the logframe directly in Excel; this makes
visualisation easier. Please save a copy in the specific project/programme folder.

#    | Result Description    | Indicators | Means of Verification | Q1 Target | Q2 Target | Q3 Target | Q4 Target | Y1 Target | Assumptions
1000 | Ultimate Outcome      |
1100 | Intermediate Outcome  |
1110 | Immediate Outcome     |
1111 | Output                |
1112 | Output                |
1120 | Immediate Outcome     |
1121 | Output                |
1122 | Output                |
1200 | Intermediate Outcome  |
1210 | Immediate Outcome     |
1211 | Output                |
1212 | Output                |
2000 | Ultimate Outcome      |
(…)  | (…)                   |

(The Indicators, Means of Verification, target, and Assumptions columns are completed for
each result.)
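The four-digit numbering encodes the results hierarchy: each level below the ultimate outcome fills in one more non-zero digit (1000 → 1100 → 1110 → 1111). As a small illustration of this convention (not part of the template itself), the parent of any sub-level result code can be recovered by zeroing its last non-zero digit:

```python
def parent_code(code: str) -> str:
    """Return the parent of a sub-level logframe result code by zeroing
    its last non-zero digit, e.g. 1111 -> 1110 -> 1100 -> 1000.
    Intended for codes below the top level (not e.g. 1000 itself)."""
    digits = list(code)
    # Scan from the right for the last non-zero digit and zero it out.
    for i in range(len(digits) - 1, -1, -1):
        if digits[i] != "0":
            digits[i] = "0"
            break
    return "".join(digits)

print(parent_code("1212"))  # -> 1210 (the Output's Immediate Outcome)
```
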



Annex E – Evaluation Budget
This template can also be used for budgeting for baseline and yearly impact assessment
exercises.
Item | Role | Unit | Cost per Unit | Quantity | Total Cost

1. Planning and Design
Evaluation Team Meetings | Evaluation Manager | Hours
Evaluation Team Meetings | Evaluation Officer | Hours
Evaluation Framework Design | Evaluation Manager | Hours
Stakeholder Consultation | Evaluation Manager | Hours

2. Data Collection
Data Collection – Surveys | Data Collection Officer | Hours
Data Collection – Interviews | Data Collection Officer | Hours
Cost of Recording/Transcribing Interviews | – | Per interview
Data Collection – Focus Groups | Data Collection Officer | Hours
Cost of Translator | Community Translator | Hours
Cost of Venue for Focus Groups | – | Per group
Cost of Refreshments for Focus Groups | – | Per group
Travel for Household Surveys/Interviews | Data Collection Officer | Km
Travel Costs – Local | – | Trip
Travel Costs – International | – | Trip

3. Data Analysis
Data Analysis Time | Evaluation Officer | Hours
Data Analysis Software | – | Subscription

4. Report Writing and Dissemination
Report Writing | Evaluation Manager | Hours
Report Design and Printing | – | Per report

5. Project Management
Management Time | Evaluation Manager | Hours
Overhead Costs | – | Percentage of total

6. Miscellaneous Expenses
Contingency | – | Percentage of total
Unforeseen Expenses | – | As needed

7. TOTAL BUDGET

(The Cost per Unit, Quantity, and Total Cost columns are left blank for completion.)
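The arithmetic behind the template is straightforward: each line’s Total Cost is its Cost per Unit multiplied by its Quantity, while Overhead and Contingency are typically expressed as percentages of the direct-cost subtotal. A minimal sketch follows; all rates, hours, and percentages below are hypothetical illustrations, not ForAfrika figures:

```python
def line_total(cost_per_unit: float, quantity: float) -> float:
    """Total Cost column for one budget line."""
    return cost_per_unit * quantity

# Hypothetical direct-cost lines (USD)
direct_lines = [
    line_total(25.0, 40),   # Evaluation Team Meetings: 40 hours at 25/hour
    line_total(15.0, 120),  # Data Collection - Surveys: 120 hours at 15/hour
    line_total(8.0, 10),    # Venue for Focus Groups: 10 groups at 8/group
]
subtotal = sum(direct_lines)

overhead = subtotal * 0.10     # hypothetical 10% overhead rate
contingency = subtotal * 0.05  # hypothetical 5% contingency rate
total_budget = subtotal + overhead + contingency
print(round(total_budget, 2))
```

The same structure can be applied to baseline and yearly impact assessment budgets, as noted above.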



Annex F – Performance Management Plan
Introduction
Describe the purpose of the plan, including the project name, its link to donor strategic
objectives, how the plan will be used, and who is responsible for implementing the plan.
Project overview

Project name:
Project code:
Duration: From:            To:
Geographical location: Admin 1:    Admin 2:    Admin 3:    Admin 4:    Admin 4 population:
Approved budget (in USD):
Donor:
Target Participants: Male:    Female:    Boys:    Girls:    PwD:

Theory of change (narrative)


Provide a narrative description of the project’s theory of change.
Roles and Responsibilities
Describe the roles and responsibilities for project monitoring and evaluation activities. Include
responsibilities linked to data collection, data quality checks, data analysis, report writing
and data use.

Role | Responsibilities

Work Breakdown Structure (WBS)


The Work Breakdown Structure decomposes the project into smaller, manageable
components. Each level of the WBS represents an increasingly detailed definition of the
project work.
Use this WBS and Gantt Chart template.xlsx for easy planning.
Data Flow Map
Insert a data flow chart illustrating how data flows from collection through reporting.
Evaluation activities and special studies
Describe plans for project evaluations and/or special studies, including name, timeline, and
key evaluation questions.
Stakeholder Engagement Strategy
Outline the plan for engaging with various stakeholders throughout the project lifecycle,
including communication methods and frequency.
Risk Management
Identify potential risks to the project and outline mitigation strategies.

Mitigation
Risk Likelihood Impact
Strategy

Capacity Building Plan


Describe plans for training and development of team members and local partners.
Sustainability and Exit Strategy
Outline how the project will ensure long-term impact and how it will transition out of the
community.
Logical Framework integrated with Budget Monitoring



Annex G - Report Quality Assessment Tool
Reports are the tangible manifestations of our activities in the field. More than mere project
documentation, they offer an expansive platform to share and present our accomplishments.
Considering that relatively few projects allocate specific resources to report writing, it is all
the more important that everyone involved in development work has a grasp of effective
report writing.
A well-written report is a testament to the quality of project implementation, providing the donor
with a clear picture of how their funding was used. Therefore, understanding how to craft an
engaging, comprehensive report is crucial.
This Report Quality Assessment Tool aims to guide that process, providing benchmarks
against which to measure the effectiveness of a report. It helps assess a report’s ability to
reflect achievements and convey feedback to donors and management throughout the
project’s life cycle. ForAfrika country offices will employ these internal benchmarks to assure
quality and uniformity in project reports.
Scoring
An integral part of this assessment tool is its rating system. Each aspect of the report is
evaluated on a 1-5 scale:
1. Inadequate: The performance or quality is insufficient and falls well below the desired
standard.
2. Requires improvements: The performance or quality needs substantial
enhancements to meet the desired level of expectation.
3. Foundations in place: The necessary groundwork or initial components have been
established, but further development or refinement is needed to reach the desired
level.
4. Progressing: The systems are functioning and operating effectively, meeting the
desired requirements and demonstrating positive performance.
5. Adequate performance: The performance or quality is fully satisfactory, meeting or
exceeding all expectations and requirements.
This scoring allows for nuanced feedback and provides specific areas for improvement. The
scores within each category are averaged to give a category score. The total average score
for the report is then calculated by averaging the category scores. This provides a holistic view
of the report’s quality and effectiveness, offering a valuable quantitative measure for report
assessment and comparison.
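As a worked illustration, the two-step averaging described above can be sketched in a few lines of Python. The category names and scores used here are hypothetical examples, not the tool's official category list.

```python
from statistics import mean

# Hypothetical aspect scores (1-5) grouped by category; the names and
# values are illustrative only, not the official assessment categories.
scores = {
    "Project Overview": [4, 3, 5],
    "Project Implementation": [3, 4, 4, 2],
    "Financial Management": [5, 4],
}

# Step 1: each category score is the average of its aspect scores.
category_scores = {cat: mean(vals) for cat, vals in scores.items()}

# Step 2: the total report score is the average of the category scores
# (not of all aspect scores pooled together), so each category carries
# equal weight regardless of how many aspects it contains.
total_score = mean(category_scores.values())

for cat, score in category_scores.items():
    print(f"{cat}: {score:.2f}")
print(f"Total report score: {total_score:.2f}")
```

Averaging at the category level first means a category with many aspects cannot dominate the overall score.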
Annual vs Monthly Reports
Annual and monthly reports each serve a unique purpose and differ in their scope and detail.
An annual report offers a comprehensive overview of a project’s progress, achievements, and
lessons learned over an entire year. It often includes extensive details on every aspect of the
project, from inception and planning to implementation, monitoring and evaluation,
partnerships, and potential future directions. It aims to provide a complete narrative of the
project’s journey over the year, capturing both successes and challenges.

Impact Measurement 102 GSO | DMEAL


In contrast, a monthly report is a more frequent, concise snapshot of the project’s ongoing
activities. Its primary focus is on the immediate past month’s implementation details. This
includes an outline of the activities conducted, progress towards objectives, any adjustments
made, updates on partnerships, financial management, and a brief look-ahead to the next
month’s goals. Although less comprehensive than an annual report, the monthly report’s
strength lies in its ability to provide timely updates, enabling rapid responses to emerging
issues and continuous improvement in project implementation.

Annual and Final Reports


Project Overview
Executive Summary:
• Capture the main goals and results of the project
• Include the key quantitative and qualitative findings of the report
• Highlight any unique or innovative aspects of the project
• Language must be accessible and compelling to a non-specialist audience
Introduction:
• Describe and define the problem that the project aims to solve
• Justify the relevance of the project
• Explain any changes in the general situation that impact the project
• Describe the key stakeholders
Project Outcomes
Direct Impact:
• Provide specific examples of how the project has made a difference
• Link impacts to the project’s goals and objectives
• Discuss both positive and negative impacts of the project
Indirect Impact:
• Discuss any unexpected or indirect impacts of the project
Feedback from Beneficiaries:
• Highlight how feedback from beneficiaries has been incorporated into the project
Project Implementation
Progress and Activities:
• Present the project’s milestones and deliverables in clear chronological order, allowing
easy comparison with the planned activities
• Provide specific examples of activities conducted
Deviations and Adjustments:
• Explain any deviations from the original plan
• Discuss challenges in programme implementation and how they were mitigated
Partnerships and Collaboration:
• Discuss the process of choosing partners and collaborators
• Provide information about each partner’s expertise and contributions (include
government entities, development partners, local and international NGOs, the private
sector, academia and civil society)
• Discuss the dynamics of the partnership (e.g., challenges, how conflicts were
resolved)

Monitoring and Evaluation:


• Review data collection methods and their reliability
• Discuss any difficulties or limitations in the M&E process
• Present implications of the M&E findings for future project planning
Risk Management:
• Present how risks, both high-probability and high-impact, were identified
• Discuss how risks have changed over time
• Present contingency plans for unmitigated risks
Financial Management
• Compare the planned and actual budget
• Discuss and explain any financial discrepancies or issues
• Include future budget projections or financial needs
• Present any cost-saving measures or financial innovations
Learning and Improvement
• Differentiate between lessons learned from successes and failures
• Reflect on implications of the lessons for future projects
• Identify any systemic or recurring issues
• Support lessons with specific examples or evidence
Sustainability and Continuation
• Present the roles of local stakeholders in ensuring sustainability
• Discuss how the benefits of the project will continue after the project ends
• Identify any potential threats to sustainability
• Discuss measures to institutionalise the benefits of the project (e.g., policy changes,
building local capacities)
Future Planning
Way Forward:



• Discuss any anticipated changes in the project environment or context
• Outline plans for future funding or resource mobilisation
• Present strategic shifts or changes in the project approach
• Review and present planned activities for the next reporting cycle
Potential for Replication or Scale-up:
• Discuss the project’s potential replication or scale-up
• Propose any areas for further research or investigation



Monthly or Quarterly Reports
Monthly reports are concise snapshots of our ongoing work in the field. These reports, while
shorter, still provide valuable information to donors and stakeholders about project progress
and effectiveness.
Project Overview
• Describe and define the problem that the project aims to solve
• Justify the relevance of the project
• Explain any changes in the general situation that impact the project
Progress and Adjustments
• Present the project’s milestones and deliverables in clear chronological order, allowing
easy comparison with the planned activities
• Provide specific examples of activities conducted
• Highlight how feedback from beneficiaries has been incorporated into the project
Challenges and Mitigation Strategies
• Explain any deviations from the original plan
• Discuss challenges in programme implementation and how they were mitigated
Partnerships and Collaboration
• Discuss the process of choosing partners and collaborators
• Provide information about each partner’s expertise and contributions (include
government entities, development partners, local and international NGOs, the private
sector, academia and civil society)
• Review and present updates on partner collaboration and their contributions to the
project during the month
• Discuss the dynamics of the partnership (e.g., challenges, how conflicts were
resolved)
Financial Management
• Compare the planned and actual budget
• Discuss and explain any financial discrepancies or issues
• Include future budget projections or financial needs
• Present any cost-saving measures or financial innovations
Outlook for the Next Month
• Provide a brief yet clear outline of the planned activities and goals for the next month
• Discuss any anticipated changes in the project environment or context
• Outline plans for future funding or resource mobilisation
• Discuss any strategic shifts or changes in the project approach