PCM All Modules Bureau of Planning, September 2022


Capacity Building Training Workshop

for the Staff of Planning Directorates


and Case-teams at Regional Level

Interactive Workshop on
Project Cycle Management
and Monitoring, Evaluation,
and Learning

Masks required at all times, except when drinking
Prepared and Presented by
Abdurahman Mohamed Hereri (PGDip, MPH,
MSc)

SOMALI REGIONAL STATE


BUREAU OF PLANNING
28 - 29 Sept. 2022, Jigjiga, Ethiopia 1
Objectives of the Workshop
• General Objective:
  • Improve the capacity of Bureau of Planning staff to review and understand project proposals, implementation, and performance monitoring, including how to:
    • Review and provide feedback on the administrative, financial, monitoring and evaluation, and learning aspects of project proposals
    • Provide feedback on project implementation, monitoring and evaluation, and learning efforts during implementation

2
Workshop Ground Rules
• Active Participation
• Interactive
• Learn and contribute
• Arrive on time
• No disturbances / phones silent or vibrate
• Let us know if we are going too slowly or too fast
• No side meetings

3
Components/Modules of the Training

Module 1: Project Inception and Appraisal, Planning and Design
Module 2: Project Design and Implementation
Module 3: Monitoring and Evaluation
Module 4: Learning and Knowledge Management
Module 5: Reviewing Forms and Processes

4
MODULE 1:
Project Cycle Initiation
(Appraisal of Proposals,
Design and Planning)

5
Module 1: Initiation,
Design, and Planning
Objectives

• To reach a common understanding of:
  • Phases of project development and implementation
  • Some elements from the PCM Project Evaluation Appraisal Checklist (and implementation)
  • Logic models
  • Sustainability
  • Analysis of the project team
  • Basic concepts of budgeting
6
Project Planning and Design:
Terms to define
(Development) Project:
• Interventions over an established timeline and budget to achieve a goal/objective
• A sequence of tasks that must be completed to attain a certain objective/outcome
• Any temporary endeavor with a definite beginning and end; depending on its complexity, it can be managed by a single person or by hundreds

Project Cycle:
• All steps required for project managers to successfully manage
a project from start to finish
• The sequence of phases through which a project progresses. It
includes initiation, planning, execution, and closure
7
Project Planning and Design:
Terms to define

Project Planning/Design:
• Early stage of the project cycle defining objectives, results/outputs, budget, activity/work plan, targets, and measures

Project Concept Document (PCD):


• The document that is the foundation for making a decision to
initiate a project. It describes the project purpose and
presents a preliminary business case for pursuing the project.
It gives decision makers the opportunity to determine project
viability.
8
Project needs identification
• Project cycles generally start with needs
identification
• Needs are identified broadly in all development projects
• Needs are identified specifically in project proposals
• DFID:
  • Developing a business case
  • Running a feasibility study
  • Drafting a project charter
  • Enlisting & managing stakeholders
  • Selecting the right team & project office
  • Putting the finishing touches
12
Project Evaluation Appraisal Checklist (Concept)
• Why is the project necessary? (What is the problem, and how does this project contribute to the solution?)
• What key performance indicators (KPIs) does the project contribute to?
• State the organization(s), their experience in implementing such projects, and the partners
• Key project activities and the total project budget
• Is the guidance clear enough?

14
Project Cycle Phase 1: Initiation
Background & Government Project Justification – Strategic Alignment with Regional Plans

[Diagram: the SRS 10-year development plan cascades into yearly project proposals; each project proposal links its outcomes (e.g., Outcome 1) back to the plan.]

15
Project Evaluation Appraisal Checklist:
Background, Target Groups, Problem Statement

• Background including situational analysis
   What do you look for?
   What types of analyses might you find or want in a proposal?

16
……..Appraisal
Checklist
Background including
situational analysis
• Relevant current conditions and
context
• Social, economic, political and
cultural
• Target area
• Map
• Location
• Relevant characteristics
• How the area was chosen

17
……..Appraisal Checklist

• Background including situational analysis
  • Target group
    • Detailed description of the size and characteristics of target groups, especially direct beneficiaries
    • What is the difference between direct and indirect beneficiaries?
      • Direct – directly engaged by the project
      • Indirect – benefit secondarily through direct beneficiaries
    • Is the distinction between target area and target groups clear?

18
……..Appraisal Checklist

Background including situational analysis

• Coordination
   Other actors working in the area
   How is overlap avoided?
   Should a description of other actors and/or coordination arrangements be required?
    • Copy and paste is not acceptable
   What additional guidance is needed?

19
……..Appraisal Checklist

Data & Analysis: Examples of potential analyses or assessments

Locations and Actors:
• Target area/group analysis
• Root cause analysis
• Stakeholder analysis
• Gender / social inclusion analysis
• Institutional analysis

Contextual Dynamics:
• Risk analysis
• Political economy analysis
• Cost-benefit / value-for-money analysis
• Sustainability analysis

• What do you typically encounter in target & stakeholder analyses?
• How frequently do you encounter these analyses? How important are they?

20
……..Appraisal Checklist

• Does the proposal justification adequately


“make a case for the project?”
• Might identify the “5 Ws”
• Who?
• What?
• Where?
• When?
• Why?

21
……..Appraisal Checklist
• Prioritization: State targeting priorities from higher to lower
• Criteria for how this prioritization decision was reached “must also be included”
   What criteria do you look for?
  o Highest / largest need
  o Cost / efficiency
  o Highest / largest effect
  o Larger population targeted
  o Stated population preference
• For further thought: Is it enough for a proposal to indicate its prioritization criteria?

22
Project Evaluation Appraisal Checklist:
Goal, Objectives, Aims
• “Describe project goal, objectives, interventions, and critical activities”
• Best to link these to assessment findings or the situational analysis
• What is the difference between goals and objectives?
• Goal: Broad, overarching, visionary statement
   “…general aim that should explain what the core problem is and why the project is important, i.e., what the long-term benefits to the target group are”

23
…………Goal, Objectives, Aims

• Objectives
• More specific and within manageable
interest of project
• What is the difference between general and specific objectives?
 General “a general aim that
should explain what the core
problem is and why the project is
important”
 Specific “a more detailed
breakdown of the goal”

24
…………Goal,
Objectives, Aims
Goals and Objectives
• Are Specific Objectives
S.M.A.R.T?
 Specific
 Measurable
 Achievable
 Relevant
 Time-bound
• Do general or just specific
objectives need to be SMART?
• How do you determine if they are measurable?

25
Project Evaluation Appraisal Checklist
Approach, Intervention, Strategy: Sections

• Approach, Intervention, Strategy
  • Do goals and objectives belong before the approach?
  • Capacity building
    • Provide an institutional assessment to determine the capacity of implementing partners
  • Is capacity building a sub-component of the approach?

27
Project Evaluation Appraisal Checklist
Approach, Intervention, Strategy: Sections

• Approach, Intervention, Strategy
  • How should a project’s “approach or strategy” show “precisely how it will lead to improvement”? (sec 3)
  • One approach is a logic model

28
Checklist: Approach, Intervention, Strategy
Reviewing Logic Models

• What is a logic model?
  • The basic logic underpinning a proposal – it links proposed activities to expected outputs, outcomes, and objectives
  • Includes an explicit or implicit change hypothesis – if X, then Y
  • There are more- and less-detailed approaches

29
…………….
Intervention, Strategy
Reviewing Logic Models

• A logic model should, at minimum:
  • Identify the problem to solve
  • Describe how to solve the problem through hierarchical “if-then” statements reflecting the underlying logic

30
Intervention, Strategy
Logic Model in Results Framework Format

Results / Logical Framework / Chain
• A simple, visual representation of the theory of change conveying cause-effect linkages.
• Results are intended, measurable changes in a variable that affect beneficiaries directly or indirectly.
• Each level identifies the results that are necessary to achieve the level above.
• The linkages should be causal – achievement of one result is necessary for achievement of the other.

Hierarchy of levels (top to bottom): Goal → Objective → Outcome/Result → Output → Inputs

31
Strategy
Logic Model in Results Framework Format

[Diagram: an if-then chain. Inputs lead to Outputs – the short-term results the project should produce immediately. IF the outputs are produced, THEN the Outcomes – the medium- and long-term results the project expects to achieve – follow, leading to the high-level Project Objective and, ultimately, the Aspirational Goal.]

Logic model as a chain: What we invest → What we do → What we get → What we achieve → What we contribute to

32
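To make the if-then reading concrete, here is a minimal sketch in Python of a results chain, using wording borrowed from the results-statement examples on the next slides; the dictionary layout and the sample input/output entries are illustrative assumptions, not a prescribed PCD format.

```python
# A sketch of a results chain as hierarchical "if-then" levels (Python).
# Level names follow the results-framework slides; the example inputs/outputs
# are illustrative assumptions, not prescribed wording.
results_chain = {
    "goal": "Institutional performance improved",        # aspirational, long-term
    "objective": "Increased use of family planning services",
    "outcomes": [                                         # medium- and long-term results
        "Increased availability of contraceptives",
        "Increased routine family planning counseling by health providers",
    ],
    "outputs": [                                          # short-term results produced immediately
        "Health providers trained in family planning counseling",  # illustrative
    ],
    "inputs": ["Budget", "Staff", "Training materials"],  # what we invest (illustrative)
}

def read_logic(chain):
    """Read the chain bottom-up as if-then statements."""
    return (
        f"IF we invest {', '.join(chain['inputs'])}, "
        f"THEN we produce: {'; '.join(chain['outputs'])}. "
        f"IF those outputs are produced, THEN: {'; '.join(chain['outcomes'])}. "
        f"IF those outcomes hold, THEN '{chain['objective']}', "
        f"contributing to the goal '{chain['goal']}'."
    )

print(read_logic(results_chain))
```

Writing the chain out this way makes any missing causal step visible before indicators are attached to each level.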
Checklist: Approach, Intervention, Strategy
Logic models: What do you think of this results statement logic?

POOR – example of a “definitional” relationship:
• Goal: Institution Strengthened
• Outcome: Institutional capacity to deliver goods and services improved

BETTER – example of a stronger causal relationship:
• Goal: Institutional Performance Improved
• Outcome: Skills among front-line health workers for delivering key services for maternal and child health improved

33
………………. Intervention, Strategy
Logic models: What do you think of this results statement logic?

POOR – example of a “categorical” relationship:
• Objective: Increased Use of Essential Family Planning
• Outcome 1: Increased use of Male Condoms
• Outcome 2: Increased use of Female Condoms

BETTER – example of a stronger causal relationship:
• Objective: Increased Use of Family Planning Services
• Outcome 1: Increased availability of contraceptives
• Outcome 2: Increased routine family planning counseling by health providers

34
Checklist: Approach, Intervention, Strategy
Logic Models – Theory of Change

• What is a theory of change (ToC)?
  • A more detailed version of a logic model
  • Includes multiple change hypotheses
  • There are detailed and even-more-detailed versions
  • Performance indicators are sometimes part of detailed results frameworks or theories of change
    • Indicators are discussed in Module 3
• Why might a theory of change be useful in proposal review? In project development?

35
………..Theory of Change

• Creates a common understanding of the project, how to implement it, why, and what to expect
  o Shows causal pathways clearly, sometimes simply
  o Serves as a roadmap toward the interventions required
  o Reveals unplanned assumptions and risks requiring attention
• Foundation for monitoring, evaluation, and learning (MEL)
  o Creates a common understanding of the path to results, to define indicators and factors to monitor beyond the interventions
  o Creates a framework to monitor, evaluate, and learn

36
Risks and Assumptions

• Assumptions highlight factors outside of manageable interest
• Define assumptions as specific and measurable, if possible
• Assumptions highlight key potential risks
• Might be included in the Logic Model/Results Framework as well
• The project should monitor changes in the status of the assumptions
• Might focus on the assumptions with more uncertainty
• Should risks and assumptions belong higher up (perhaps close to Approach, Intervention, or Strategy)?

37
End of MODULE I
Reflection of the participants

38
MODULE II:
Project Design and
Implementation

39
Module 2: Training Objectives

• At the end of this module, you should be able to better understand and review/oversee:
  o Sustainability planning
  o More on proposal appraisal
  o Key priorities of project implementation
  o Planning for project closeout

40
Project Evaluation Appraisal Checklist:
Organization, Team, Management and Personnel

Assessing the proposal’s organization and team
• What are the qualifications/experience of the organization(s)?
  o Capacity, strengths
  o Appropriateness, connection to the community
  o Past experience
  o Implementing partner capacity assessment
• Are roles assigned clearly?
  o “Key responsibilities for each staff person … with reporting relationship and key qualifications (in table form)”
  o Table vs. organizational chart

41
………
Organization, Team, Management and Personnel

Assessing the proposal’s organization and team
• Key staff qualifications
• The PCD requires that capacity building and training be provided to team members where gaps are identified
  o Do you prefer capacity building plans?

42
……… Organization, Team,
Management and Personnel

•“Takes into consideration the opinions of the


target group.”
• How do/should proposals show this?
• Is this the best place in checklist for proposals
to explain planning process?

43
Project Evaluation Appraisal Checklist
Activity Plan (section

• What do/should you look for in an Activity Plan?


• “specific information and explanations of each of
the planned project activities”
• “considerable detail on the beginning and the end
of the project”
• “In general, two main formats…: a simple table and
the Gantt chart.”
• What activities/tasks are required to reach the
goal?
• Who is responsible for carrying out each task?
• What resources are allocated?
• What are the time frames?
• How will progress be monitored?
• What else? 44
Project Evaluation Appraisal Checklist
Activity Plan (section): Simple Table Format Example

Columns: Activity | Starting Date | Ending Date | Resources | Cost | Responsible Person | Remarks

Example activity rows:
• Complete Community Assessment
• Identify and Engage Partners
• Develop Results Statements
• Define Indicators
• Create Population-Level Strategies
• Hold Community Meetings
• Implement health activities to meet community needs

45
Project Evaluation Appraisal
Checklist
Activity Plan Gantt charts

What is a GANTT chart?

46
……..Activity Plan
Gantt charts
• Gantt chart – a planning tool showing timelines and sequencing
   Should display the whole project timeline
   Timelines and deadlines of all tasks
   May show relationships, dependencies, and sequencing among tasks
   Should display phases – visualization through cumulative steps
   How much detail is too little / too much?
• What do you like / not like about the following Gantt charts? (A small plotting sketch follows the example slide.)

47
………….Activity Plan Gantt charts

48
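For reviewers who want to see how such a chart is put together, here is a minimal sketch, assuming Python with matplotlib is available; the task names come from the simple-table example earlier in this module, while the dates and durations are purely illustrative.

```python
# A sketch of a simple Gantt chart (Python + matplotlib). Task names follow the
# simple-table example; start/end dates are illustrative assumptions.
import matplotlib.pyplot as plt
import matplotlib.dates as mdates
from datetime import date

tasks = [
    ("Complete community assessment", date(2022, 10, 1),  date(2022, 10, 31)),
    ("Identify and engage partners",  date(2022, 10, 15), date(2022, 11, 30)),
    ("Develop results statements",    date(2022, 11, 1),  date(2022, 11, 21)),
    ("Define indicators",             date(2022, 11, 15), date(2022, 12, 15)),
]

fig, ax = plt.subplots(figsize=(8, 3))
for row, (name, start, end) in enumerate(tasks):
    # One horizontal bar per task: left edge = start date, width = duration in days.
    ax.barh(row, (end - start).days, left=start, height=0.5)
ax.set_yticks(range(len(tasks)))
ax.set_yticklabels([name for name, _, _ in tasks])
ax.invert_yaxis()  # first task at the top, as in most Gantt charts
ax.xaxis.set_major_formatter(mdates.DateFormatter("%d %b"))
ax.set_title("Activity plan (Gantt view)")
plt.tight_layout()
plt.show()
```

Whether drawn in a spreadsheet or in code, the same information is needed: task, start, end, and (optionally) dependencies.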
Project Evaluation Appraisal Checklist
Activity Plan: Training Plan

• The PCD Evaluation Checklist suggests proposals include a training plan, if available
• The checklist provides no further details, and there is no standard model
• Have you received proposals with training plans?
• What do you think should be in one?

49
Checklist – Sustainability
Phase-out strategy, capacity building

• What do you look for to better ensure sustainability?
• Some ideas to increase sustainability:
  o Clarify collaboration with / involvement of key stakeholders (e.g., national, regional, woreda and city governments; private sector; CSOs; NGOs)
     How and when are they involved?
     During development of interventions?
     Transitioning MEL implementation tools to government counterparts
      • Who would own processes, and when?
• How does the project build skills and capacity locally?

50
……..Phase-out strategy, capacity building

• Some ideas to increase sustainability:
  o Clarify collaboration with / involvement of key stakeholders (e.g., national and local governments; private sector; CSOs; NGOs)
     How and when are they involved?
     During development of interventions?
  • Who would own processes, and when?
  • How does the project build skills and capacity locally?

51
……..Phase-out strategy, capacity building

• Some ideas to increase sustainability:
  • Who implements/continues afterwards?
  • Are interventions tied to a sustainable financing model?
  • How is demand for services or outputs developed?
  • Is there an outcome or M&E indicator linked to sustainability?

52
Evaluation Appraisal Checklist: Budgeting

• Project budget
  o Formats vary by country or region / development partner / non-governmental or civil-society organization
• What are the main categories of budget costs?

53
Project Cycle Phase: Implementation – Resource Plan

• What is a resource plan?
• A resource plan identifies the physical resources required to complete a project, e.g., labor, equipment, and materials, and the unit quantities needed (see the sketch below).
  o Step 1: List the resources required
  o Step 2: Estimate the number of each listed resource required
  o Step 3: Construct a resource schedule
• When have you received / reviewed resource plans separate from the budget?

54
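A minimal sketch, in Python, of how the three steps above could be captured; the resource names, quantities, and periods are illustrative assumptions, not values from any PCD guidance.

```python
# A sketch of the three resource-plan steps (Python). Names, quantities, and
# periods are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Resource:
    name: str        # Step 1: list the resource required (labor, equipment, materials)
    quantity: int    # Step 2: estimate the number of units required
    schedule: str    # Step 3: when the resource is needed (resource schedule)

resource_plan = [
    Resource("Community health workers", 12, "Months 1-12"),
    Resource("Motorbikes",                3, "Months 1-12"),
    Resource("Training manuals",        500, "Month 2"),
]

for r in resource_plan:
    print(f"{r.name}: {r.quantity} ({r.schedule})")
```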
Implementation Oversight

• From an oversight perspective, implementation generally involves:
  o Providing input on / guiding key implementation decisions
  o Tracking progress
     Meetings, workshops, or project reports
     Review actual costs, schedule, and quality of deliverables
     Continuously monitor achievements and results vs. targets from activity plans, training plans, and resource plans
     Use data and other evidence for learning and project adaptation; consider revisiting the results framework/activity plan
     Discussed more in Module 4

56
Implementation Oversight: Project Reporting

• Project Reporting
  • Project reports facilitate understanding of:
    o Resources expended
    o Problems encountered
    o Schedule issues
    o Results
  • Financial, periodic/regular reports, special reports
  • Monitoring and evaluation plans and reporting are covered under Module 3

57
Implementation Oversight:
Reviewing Project Reports

Relevance of project reports – why are projects required to document and share progress reports?
• To document progress against work plans and budgets and identify needed adjustments
• To provide details on outputs, outcomes, and lessons learned
• To increase accountability and transparency
• To aid future program planning

58
Key Elements of Progress Reports

Progress on Implementation
• From the organizational form:
   Progress narrative
   Implementation status
   Plan vs. achievement via table
• Aside from government forms:
   MEL update on data collection and quality

59
Reviewing Project Reports

Budget
 Budget in two sections:
   Plan vs. achievement (prior table)
   Review variations and the reasoning behind them

60
Reviewing Project Reports

Annexes
 A performance management review table
 A financial report summary (if not captured fully in the budget table)
 Success stories
 Anything else?

61
Project Management and
Performance Oversight
Closeout
•What might a project do as part of closeout planning?
• Phaseout strategy
• Elements of phaseout strategy document
 Introduction/overview
 Country performance
 Phaseout plan
 Sustainability
 Budget

62
…….Closeout
• What might a project do as part of closeout planning?
• Reporting
o Project evaluation (mid-term or final, internal or
external)
o Develop lessons learned
o Special studies
o Final report
o Internal post-implementation/after-action reviews
  • Led by the project; PCD may take part
• PCD can review:
  • The project agreement against the final financial report
  • The MEL plan against the reports
63
…….Closeout
What might a project do as part of closeout planning?
 Staffing phase-out plan
 Financial closeout
o Staff severance
o Redirecting leftover funds
o Termination of subcontracts
o Tax payments / refunds
o Audit
o Closing project bank account
 Office closure and furniture / materials
disposition 64
End of MODULE II
Reflection of the participants

65
MODULE III:
Monitoring and Evaluation

66
Module 3: Training Objectives

At the end of this module, you will be able to better understand and apply:
• Monitoring and Evaluation Concepts
• Different Types of Performance Indicators
• Baselines and Targets
• Data Collection Methods
• Data Management and Quality
• Evaluation Approaches

67
Monitoring and
Evaluation in Project
Cycle

• Monitoring and evaluation are


part and parcel of the project
cycle
• They are key to ensure the
success of a project

68
Monitoring and Evaluation in Project Cycle

How do you define monitoring?
• The regular collection and analysis of information to:
  • Assist timely decision making
  • Ensure accountability
  • Provide the basis for evaluation and learning

69
Monitoring and Evaluation in Project Cycle

• Types of monitoring
  • Process monitoring: focuses on activities, outputs, and resources
  • Impact monitoring: deals with changes brought about by the intervention while the project is still in progress
• What type of monitoring does PCD apply?

70
Monitoring and Evaluation in Project Cycle

Evaluation
What do we mean by evaluation?
• Periodic assessment of a project in the context of stated objectives
• May be undertaken internally or externally (independently)
  o What are the tradeoffs between internal and external evaluation?

71
Monitoring and Evaluation in Project Cycle

Types of evaluation
What types of evaluations are there? What other types are there?
• Ex-ante / formative evaluation
• Summative evaluation
• Mid-term or ongoing evaluation
• Terminal / end-of-project evaluation
• Ex-post evaluation
• Meta-evaluation
Monitoring and Evaluation in Project Cycle
What are the differences and complementarity between M&E?

Differences
• Project coverage
  – Monitoring: “How many people or communities were reached?”
  – Evaluation: “Is the project reaching (or did it reach) those with greatest need? If not, why not?”
• Project process
  – Monitoring: “Did project processes complete activities on time and on budget?”
  – Evaluation: “Did the project effectively and appropriately invest its time and budget to conduct its activities?”
• Project results
  – Monitoring: “Was the project successful in achieving the targets set for its intended outcomes?”
  – Evaluation: “How have outcomes achieved varied by different groups within the target area?”

73


Monitoring and Evaluation in Project Cycle
Differences

• Objectives
  – Monitoring: To determine project inputs, activities, and outputs, and to improve progress
  – Evaluation: To determine effects & impacts
• Data to gather
  – Monitoring: Primarily quantitative data
  – Evaluation: Primarily qualitative data
• Tools for data collection
  – Monitoring: Generally short monitoring formats
  – Evaluation: Generally long questionnaires and interview schedules; applies more rigorous procedures, design, and methodology and involves extensive analysis

74
Monitoring and Evaluation in Project Cycle
Differences

• Data gathering
  – Monitoring: During project implementation
  – Evaluation: Generally after project completion
• Frequency of data collection
  – Monitoring: More frequent & routine
  – Evaluation: Less frequent & periodic
• Implementers
  – Monitoring: Project staff & other stakeholders
  – Evaluation: External people, with internal staff & stakeholders
• Use of data
  – Monitoring: Mainly for decision making during project implementation
  – Evaluation: Primarily for planning projects

75
Monitoring and
Evaluation in
Project Cycle

•Complementarity
• Monitoring and evaluation
are highly connected
• Monitoring activities can
generate data that can be
used for evaluation
• Monitoring activities can
track whether the
evaluation
recommendations are
improving the project and
its outcomes
Monitoring and Evaluation in Project Cycle
Complementarity

• Monitoring: Implementation oriented | Evaluation: Policy oriented
• Monitoring: Assesses intermediate results | Evaluation: Explains results
• Monitoring: Focuses on timeliness | Evaluation: Focuses on rigor
• Monitoring: Informs budgeting | Evaluation: Informs broad resource allocation
• Monitoring: Routinely collects data on indicators and compares actual results with targets | Evaluation: Explores unintended results
• Monitoring: Reports progress to managers and alerts them to problems | Evaluation: Provides lessons, highlights significant accomplishments or program potential, and offers recommendations for improvement

77
Monitoring and Evaluation in Project Cycle
Complementarity

[Diagram: Monitoring asks “Is the project going according to plan?” and supports better management of the project. Evaluation asks “What happened? Why? How?” and supports better (re)design of existing and future projects. Both feed Learning (“What are the takeaways?”) and knowledge management.]

78
Monitoring and
Evaluation
Methodologies,
Methods, and
Indicators
Monitoring and Evaluation Methodologies and Methods:
Quantitative vs Qualitative Data Collection Methodology

Q: What’s the difference between quantitative and qualitative methodologies and methods?

• Quantitative: Numerical. Measures something which can be counted or quantified (precisely). May rely on mathematical/statistical analysis.
• Qualitative: Non-numerical. Approximates and characterizes. Often gauges judgments and perceptions. Generally relies upon subjective analysis.

80


Monitoring and Evaluation Methodologies and Methods
Quantitative vs Qualitative

Quantitative Methods:
• Deductive reasoning
• Outsider perspective
• Focus on specific concerns
• Selection of units – often probability sampling methods
• Number of units may be large
• Numeric and visual emphasis in reporting

Qualitative Methods:
• Inductive reasoning
• Insider perspective
• Holistic
• Selection of units – often purposive sampling methods
• Number of units often small
• Narrative and visual emphasis in reporting
• Heuristic (using mental shortcuts) and iterative

81


Monitoring and Evaluation Methodologies and Methods
Quantitative vs Qualitative

Quantitative methods
• Advantages: Scalable; generalizable (sometimes); objective; standardized; suited to ICT 4D
• Disadvantages: Miss the depth and complexity of an issue; not suitable for identifying and exploring unanticipated or unexpected factors

Qualitative methods
• Advantages: Provide depth and detail; create openness; simulate individual experiences; identify the unexpected
• Disadvantages: Results are harder to generalize; data are difficult to collect and analyze; data are susceptible to the hidden bias of collectors and participants; more difficult to transcribe data directly to digital devices

82


Monitoring and Evaluation Methodologies and Methods:
Characteristics of Data Collection Methods

• Data sources: Primary, secondary
• Type of data: Quantitative, qualitative
• Data collection methods:
  • Questionnaires (open-ended, closed-ended)
  • Structured observation
  • Semi-structured interviews
  • Participant observation
  • Key informant interviews
  • Focus group discussions
  • Field observation
  • Collection of administrative records
Monitoring and Evaluation Methodologies and Methods:
Methods

• Data collection methods
  • For each indicator, a project should identify (at a minimum):
    o Data source
    o Frequency and method of data collection
    o Type of data
    o Reporting and analysis frequency
  (A small sketch of these fields follows below.)
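As a minimal sketch, the fields listed above could be captured per indicator as a small record like the following (Python); the class name and the example values are illustrative assumptions, with the indicator wording borrowed from the targets example later in this module.

```python
# A sketch of the minimum data-collection fields per indicator (Python).
# The class and example values are illustrative assumptions, not a required PCD format.
from dataclasses import dataclass

@dataclass
class IndicatorPlan:
    indicator: str
    data_source: str              # where the data originate
    collection_method: str        # e.g., survey, administrative records
    collection_frequency: str     # how often data are collected
    data_type: str                # quantitative or qualitative
    reporting_frequency: str      # how often analyzed and reported

example = IndicatorPlan(
    indicator="% of mothers aged 15-49 reporting use of modern contraceptives",
    data_source="Household survey",
    collection_method="Structured questionnaire",
    collection_frequency="Annual",
    data_type="Quantitative",
    reporting_frequency="Annual report",
)
print(example)
```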
Monitoring and Evaluation in Project Cycle

What are performance indicators?
• A particular characteristic or dimension used to measure intended changes
• Quantitative or qualitative variables that allow the verification of changes produced by the project

85
Monitoring and Evaluation
in Project Cycle

How do we use performance indicators?


• Measure intended changes plausibly as
a result of project
• Observe progress, performance across
levels (input, output, outcome,
objectives)
• Compare baselines and targets

86
Monitoring and Evaluation in
Project Cycle
Characteristics of Good Indicators

• Specific: Indicator has clear and narrow definition


(quantity, quality, target, time & place)

• Measurable: Indicator is easily calculated

• Attainable/ Achievable: Indicators must be attainable


given the budget; time & resources available

• Relevant: Indicator is a valid measure of result

• Time bound: Indicators must identify the timeframe


within which the change is expected to occur

87
Monitoring and Evaluation in Project Cycle
What types of indicators do you know?

• Data type: Quantitative; Qualitative
• Type of result measured: Output; Outcome; Objective (Impact)
• Indirect measures: Composite (index); Proxy; Context
• Nature/formulation: Standard; Custom
• Sensitivity: Gender sensitive / Social inclusion

88


Monitoring and Evaluation
in Project Cycle

Baselines
What are baselines?
• Value/level of a performance indicator before
implementation begins
• Baselines are vital to set realistic performance
indicator targets
• Indicators without baselines are not useful for
tracking performance over time
• Baseline data can be generated from primary or
secondary sources
Monitoring and Evaluation in Project Cycle
Baseline scenarios

1. An indicator has a baseline of zero
2. A performance indicator baseline needs to be collected
3. An established indicator is selected as a performance indicator
4. The activity is implemented on a rolling basis in terms of geography and beneficiaries
5. Multiple implementing partners reporting on the same indicator
6. Implementation has started but a baseline was not collected
7. The indicator has changed

90
Monitoring and Evaluation in Project Cycle
What are targets?
• A target is a specific, planned level of result to be achieved within a specific time frame with a given level of resources
• Targets help determine if progress meets expectations

Example (see the sketch below):
• Result – IR 2: Decreased maternal mortality
• Indicator: By year three, 85% of mothers aged 15-49 report increased use of modern contraceptives
• Baseline: 30% report using contraceptives
• Targets: Year 1 – 50%; Year 2 – 65%; Year 3 – 85%

91
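A minimal sketch, in Python, of how reported results could be compared against the yearly targets in the example above; the baseline and targets are taken from the slide, while the "reported" values are hypothetical.

```python
# A sketch comparing reported results with the yearly targets above (Python).
# Baseline and targets come from the example; "reported" values are hypothetical.
baseline = 30                      # % reporting contraceptive use before the project
targets = {1: 50, 2: 65, 3: 85}    # yearly targets (%)
reported = {1: 48, 2: 60}          # hypothetical monitoring results so far

for year, target in targets.items():
    actual = reported.get(year)
    if actual is None:
        print(f"Year {year}: target {target}% (no data yet)")
        continue
    share_of_gain = 100 * (actual - baseline) / (target - baseline)
    print(f"Year {year}: actual {actual}% vs target {target}% "
          f"({share_of_gain:.0f}% of the targeted gain over baseline)")
```

Measuring achievement as a share of the targeted gain over the baseline avoids overstating progress when the baseline is already high.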


Monitoring and Evaluation in
Project Cycle
When reviewing project targets, PCD can consider:
– Baseline value
– Amount of project funding
– Length of the project
– Historical trends
– Research findings & evidence from similar programs
– Expert judgment & stakeholder consultation
– MoH expectations



Data Quality
Management and
Assessment

How good does


project data
need to be?

93
Monitoring and Evaluation in Project
Cycle
Data Quality and Management

Data Quality and Management: Six Stages

Source Collection Collation

Analysis Reporting Use



Monitoring and Evaluation in
Project Cycle
Data Quality and Management

Data Quality and Management, cont’d…


• Data Sources - Where the data originate, usually a document or a
person
• Data collection- involves obtaining data from the original source and
transferring them into tools from which they can be collated, analyzed,
and reported
• Data collation - is the process of assembling data into a format for the
purpose of analysis using either electronic or manual tools
• Data analysis - is the process of examining, reviewing, and evaluating
data sets to assess a hypothesis
• Reports - are a compilation of descriptive information, presenting data
analysis as useful knowledge
• Utilization – is the process of making timely, data-driven, evidence-
based decisions. This is the goal of any monitoring and evaluation effort



Monitoring and Evaluation in
Project Cycle
Data Quality and Management

What is data quality? What standards of data quality do you know?


• Validity – Measure what they are intended to measure
• Reliability – Stability and consistency of data collection and analysis
methods
• Precision/Accuracy – Sufficient level of detail
• Integrity – Focuses on whether there is improper manipulation of
data
• Consistency – There are no conflicts in information within or
between systems
• Timeliness – Data should be available and up to date
• Completeness – All possible data that is required is present
• Confidentiality – The confidentiality of clients is maintained



Monitoring and Evaluation in Project
Cycle
Data Quality and Management

What should projects do to improve data quality?


• Data is reviewed
• Data collection methods are clearly identified
• Written procedures on data management are available to guide
project and facilitate monitoring by stakeholders
• Qualified MEL personnel manage data instead of other project staff
(without the skills) being given the additional role of managing data
• Data protection is in place to minimize the risk of transcription error
or data manipulation
• Source documents are filed and maintained by project
• Duplicate data are detected through formal data cleaning procedures (see the sketch below)

97
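A minimal sketch of the duplicate-detection point above, assuming Python with pandas is available; the column names and records are illustrative assumptions, not a prescribed reporting format.

```python
# A sketch of formal duplicate detection during data cleaning (Python + pandas).
# Column names and records are illustrative assumptions.
import pandas as pd

records = pd.DataFrame({
    "beneficiary_id": [101, 102, 102, 103],
    "woreda": ["A", "B", "B", "C"],
    "service_date": ["2022-01-10", "2022-01-11", "2022-01-11", "2022-01-12"],
})

# Flag exact duplicate rows so they can be checked against source documents, then drop them.
duplicates = records[records.duplicated(keep="first")]
print(f"{len(duplicates)} duplicate record(s) found")
clean = records.drop_duplicates(keep="first").reset_index(drop=True)
print(clean)
```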


End of MODULE III
Reflection of the participants

98
MODULE IV:
Knowledge Management
(additional topic)

99
Knowledge Management
What is Knowledge Management?
Knowledge management (KM) is about utilizing and
leveraging what is already known to work to achieve the
greatest impact and improve outcomes in organizations and
projects.
KM is the process of…
• Creating
• Capturing
• Managing
• Sharing
• Using
• … the knowledge and
information of an
organization.
Why Knowledge Management?
Some of the common benefits of KM include:
 Improved performance or competitive advantage
 Better and faster decision making
 Increased rate of innovation
 Sharing of data and lessons learned
 Sharing of specialist expertise
 Better communication
 Improved business processes.



Understanding Knowledge

• Explicit knowledge can be more easily conveyed, stored, and accessed.
  Examples: Reports, articles, data, manuals, toolkits, books, forms
• Tacit knowledge is unwritten, unspoken, and nuanced.
  Examples: Knowledge from individual experience, strategic thinking, expertise and competence


Elements of a KM System

• Processes: Creation, capture and use of knowledge, workflows, integration, best practices, content management
• Technology: Data storage and formats, networks, decision tools, automation
• People and Culture: Attitudes, sharing, innovation, skills, teamwork, motivation, vision


Stages of the Knowledge Management Cycle

A continuous cycle:
Knowledge generation & capture → Knowledge organization & synthesis → Knowledge sharing & communication → Knowledge adaptation (use) → back to generation & capture


Knowledge Generation and Capture

Knowledge can be gained in many ways, but your organization should capture the most important information for your work.

Tools and approaches for generation and capture:
• After-action reviews
• Balanced scorecards
• Briefings
• Communities of practice
• Staff experience
• Field visits
• Monitoring, evaluation and learning

See the SDC KM Toolkit in your handouts for more on these tools.


Knowledge Organization & Synthesis

KM aims to create insights and new knowledge as well as synthesize existing knowledge to make it more accessible.

Tools and approaches for organization and synthesis include:
• Searchable databases
• Standard forms
• Directories of best practices
• Job aids
• Fact sheets or summaries
• Policy briefs
• Learning modules

See the SDC KM Toolkit in your handouts for more on these tools.
Knowledge Sharing & Communication

KM fosters knowledge transfer among internal and external stakeholders.

Tools and approaches for sharing and communication include:
• Staff chats or peer-to-peer support
• Staff meetings
• Communities of practice
• Collaboration and networking opportunities
• Brown bags
• Journals
• Forums and training sessions

See the SDC KM Toolkit in your handouts for more on these tools.
Knowledge Adaptation (Use)

Adaptation is when others internalize and use knowledge, reinforcing the value of KM.

Tools and approaches for fostering knowledge adaptation include:
• Facilitation, mentorship, or peer-to-peer support
• Help staff to understand and adapt information to their needs
• Get staff feedback about how to improve your KM system
• Build/maintain a culture of KM
• Model how to use the KM system

See the SDC KM Toolkit in your handouts for more on these tools.
