What Is A Monitoring and Evaluation Plan
A monitoring and evaluation (M&E) plan is a document that helps to track and assess the
results of the interventions throughout the life of a program. It is a living document that should
be referred to and updated on a regular basis. While the specifics of each program’s M&E plan
will look different, they should all follow the same basic structure and include the same key
elements.
An M&E plan will include some documents that may have been created during the program
planning process, and some that will need to be created new. For example, elements such as
the logic model/logical framework, theory of change, and monitoring indicators may have
already been developed with input from key stakeholders and/or the program donor. The
M&E plan takes those documents and develops a further plan for their implementation.
Learning Objectives
After completing the steps for developing an M&E plan, the team will:
• Have defined the program's goals, objectives, and monitoring indicators
• Have a plan for data collection, analysis, and reporting, with clear roles and responsibilities
• Have a plan for disseminating M&E results internally and externally
Prerequisites
How to Develop a Logic Model
Steps
Step 1: Identify Program Goals and Objectives
The first step to creating an M&E plan is to identify the program goals and objectives. If the
program already has a logic model or theory of change, then the program goals are most
likely already defined. However, if not, the M&E plan is a great place to start. Identify the
program goals and objectives.
Defining program goals starts with answering three questions:
1. What problem is the program trying to solve?
2. What steps are being taken to solve that problem?
3. How will program staff know when the program has been successful in solving
the problem?
Answering these questions will help identify what the program is expected to do and how
staff will know whether or not it worked. For example, for a program distributing condoms
to adolescents, the answers to these questions show that the overall program goal is to
reduce the rates of unintended pregnancy and STI transmission in the community.
It is also necessary to develop intermediate outputs and objectives for the program to help
track successful steps on the way to the overall program goal. More information about
identifying these objectives can be found in the logic model guide.
Step 2: Define Indicators
Once the program’s goals and objectives are defined, it is time to define indicators for
tracking progress towards achieving those goals. Program indicators should be a mix of
those that measure process, or what is being done in the program, and those that measure
outcomes.
Process indicators track the progress of the program. They help to answer the question,
“Are activities being implemented as planned?” Some examples of process indicators are:
• Number of trainings held with health providers
• Number of outreach activities conducted at youth-friendly locations
• Number of condoms distributed at youth-friendly locations
• Percent of youth reached with condom use messages through the media
Outcome indicators track how successful program activities have been at achieving
program objectives. They help to answer the question, “Have program activities made a
difference?” Some examples of outcome indicators are:
• Percent of youth using condoms during first intercourse
• Number and percent of trained health providers offering family planning
services to youth
• Number and percent of new STI infections among youth
These are just a few examples of indicators that can be created to track a program’s
success. More information about creating indicators can be found in the How to Develop
Indicators guide.
Step 3: Define Data Collection Methods and Timeline
After creating monitoring indicators, it is time to decide how data will be gathered and
how often each indicator will be recorded. This should be a conversation among program
staff, stakeholders, and donors. These decisions have important implications for how data
will be collected and how the results will be reported.
The source of monitoring data depends largely on what each indicator is trying to measure.
The program will likely need multiple data sources to answer all of the programming
questions. Below is a table that represents some examples of what data can be collected
and how.
Indicator                                                            | Data source(s)                       | Timing
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually
Number and percent of new STI infections among adolescents           | DHS or other population-based survey | Annually
Step 4: Identify Roles and Responsibilities
Data management roles should be decided with input from all team members so everyone
is on the same page and knows which indicators they are assigned. This way, when it is
time for reporting, there are no surprises.
An easy way to put this into the M&E plan is to expand the indicators table with additional
columns for who is responsible for each indicator, as shown below.
Indicator                                                            | Data source(s)                       | Timing   | Data manager
Percent of adolescents reporting condom use during first intercourse | DHS or other population-based survey | Annually | Research assistant
Number and percent of new STI infections among adolescents           | DHS or other population-based survey | Annually | Research assistant
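An indicator table like this can also be kept as structured data so that each team member's assignments are easy to query when reporting comes due. A minimal sketch in Python (the field names and the `assigned_to` helper are illustrative, not part of the M&E plan itself):

```python
# Illustrative sketch: the indicator table held as a list of records,
# so each team member's assignments can be looked up programmatically.
indicators = [
    {
        "indicator": "Percent of adolescents reporting condom use "
                     "during first intercourse",
        "data_source": "DHS or other population-based survey",
        "timing": "Annually",
        "data_manager": "Research assistant",
    },
    {
        "indicator": "Number and percent of new STI infections "
                     "among adolescents",
        "data_source": "DHS or other population-based survey",
        "timing": "Annually",
        "data_manager": "Research assistant",
    },
]

def assigned_to(indicators, manager):
    """Return the indicators a given staff member is responsible for."""
    return [row["indicator"] for row in indicators
            if row["data_manager"] == manager]

print(len(assigned_to(indicators, "Research assistant")))  # 2
```

The same structure can be exported to a spreadsheet for the M&E plan document itself.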
Step 5: Create an Analysis Plan and Reporting Templates
Once all of the data have been collected, someone will need to compile and analyze them
to fill in a results table for internal review and external reporting. This is likely to be an
in-house M&E manager or research assistant for the program.
The M&E plan should include a section with details about what data will be analyzed and
how the results will be presented. Do research staff need to perform any statistical tests to
get the needed answers? If so, what tests are they and what data will be used in them?
What software program will be used to analyze data and make reporting tables? Excel?
SPSS? These are important considerations.
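Whatever software is chosen, the analysis plan can also spell out the arithmetic behind the reporting table. For instance, "percent of target achieved" is the share of the planned change that has been realized so far. A minimal sketch in Python, using hypothetical figures (the function name and numbers are illustrative, not from any program's data):

```python
def percent_of_target_achieved(baseline, current, target):
    """Share of the planned change (baseline -> target) realized so far."""
    planned_change = target - baseline
    if planned_change == 0:
        raise ValueError("target equals baseline; no change was planned")
    return 100 * (current - baseline) / planned_change

# Hypothetical outcome indicator: condom use at first intercourse
# rises from 45% at baseline to 48% in year 1, against a lifetime
# target of 60%.
print(percent_of_target_achieved(baseline=45, current=48, target=60))  # 20.0
```

Writing the formula down in the analysis plan, in whatever tool the team uses, keeps calculations consistent across reporting periods.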
Another good thing to include in the plan is a blank table for indicator reporting. These
tables should outline the indicators, data, and time period of reporting. They can also
include things like the indicator target, and how far the program has progressed towards
that target. An example of a reporting table is below.
Indicator                                                  | Baseline     | Year 1       | Lifetime target          | % of target achieved
Number and percent of new STI infections among adolescents | 11,000 (22%) | 10,000 (20%) | 10% reduction in 5 years | 20%
Step 6: Develop a Dissemination Plan
The final step is to decide how M&E data will be shared and used. Questions to consider
include:
• How will M&E data be used to inform staff and stakeholders about the success
and progress of the program?
• How will it be used to help staff make modifications and course corrections, as
necessary?
• How will the data be used to move the field forward and make program
practices more effective?
The M&E plan should include plans for internal dissemination among the program team, as
well as wider dissemination among stakeholders and donors. For example, a program team
may want to review data on a monthly basis to make programmatic decisions and develop
future workplans, while meetings with the donor to review data and program progress
might occur quarterly or annually. Dissemination of printed or digital materials might
occur at more frequent intervals. These options should be discussed with stakeholders and
the program team early on to set reasonable expectations for data review and to develop
plans for dissemination. If these plans are in place from the beginning and
become routine for the project, meetings and other kinds of periodic review have a much
better chance of being productive ones that everyone looks forward to.
Conclusion
After following these 6 steps, the outline of the M&E plan should look something like this:
1. Introduction to program
• Program goals and objectives
• Logic model/Logical Framework/Theory of change
2. Indicators
• Table with data sources, collection timing, and staff member
responsible
3. Roles and Responsibilities
• Description of each staff member’s role in M&E data collection,
analysis, and/or reporting
4. Reporting
• Analysis plan
• Reporting template table
5. Dissemination plan
• Description of how and when M&E data will be disseminated
internally and externally