
MEL at TAI - Practical Handbook

Version 10/2022

Contents
I. Getting started
   1. What is this Handbook?
   2. What is MEL to TAI?
   3. MEL Key Components
      3.1. TAI's MEL glossary
      3.2. Important considerations
      3.3. MEL Framework
      3.4. MEL timeline
      3.5. Learning questions
      3.6. Evaluation
II. Monthly inputs
   1. Data collection form (v3)
   2. Monthly data collection session
III. Yearly outcomes sensemaking
IV. Synthesis and reflection
   1. Members' attendance analysis
   2. Thematic analysis
   3. Collaboration case notes
   4. Other forms of strategic reflection
V. MEL Reporting and dissemination components
VI. Knowledge management
I. Getting started

1. What is this Handbook?

This handbook sets out the common understanding and current agreed practice of Monitoring, Evaluation, and Learning (MEL) at the Transparency and Accountability Initiative (TAI). Emphasis is placed on implementing MEL processes and related reporting systems.
This Handbook is intended for the TAI Secretariat team but could be useful for other users, in particular other funder collaboratives. It is not intended to serve as a mandatory, "one size fits all" instruction booklet, nor as a theoretical monitoring and evaluation course of any kind.

This work is a continuation of the brilliant work done by previous MEL fellows and the head of Learning
and Impact at TAI.

2. What is MEL to TAI?

We are a collaborative where learning is both a primary function and an intended
outcome. Learning is also the backbone of our three programmatic pillars: What We Fund, How We Fund,
and Funder Landscape.

MEL is not only the nexus between the programmatic and strategic components but also the
backbone of our learning model.

Our processes are linked to our collective strategy: guiding reflection on progress, pursuing learning
questions, and assessing achievement against the collaborative's anticipated contribution to change.
TAI’s MEL practices also serve the collaborative’s streamlined proposal and reporting requirements
across member grants.

At TAI, we believe that a well-designed and well-functioning MEL system paves the way for high-quality
management and accountability.

TAI employs several learning approaches including generating experiential evidence (e.g., annual
Learning Day, and peer Grantmaker practice sharing); collectively shaping work commissioned by
individual members (e.g., peer review of terms of reference or draft content); and tracking and
interpreting TPA field evidence (e.g., member and practitioner calls with evidence producers).

This learning and collaboration among funders, and TAI's contribution to it, are also what we try to
monitor, evaluate, and learn from.
3. MEL Key Components

3.1. TAI’s MEL glossary

MEL system: The who, what, when, how, and why of MEL. The suite of documents, data, practices, and people (what/who) that drive the timely collection, management, and use of data (when/how) to assess, make decisions about, and improve TAI's work, progress, and impact (why).

MEL plan: Document that provides context for the MEL system (who and why) and describes plans for it (what, when, and how). TAI's MEL plan covers monitoring (progress tracking) and learning practices in somewhat more detail than evaluation, as the latter is outsourced to external evaluators.

MEL framework: Document that maps intended results and/or learning questions against indicators or other data points to assess progress or achievement. Results and learning questions are grounded in TAI's strategic framework (What We Fund, How We Fund, and Funder Landscape).

Result:
- Outputs: events and/or products delivered by the Secretariat.
- Outcomes: direct benefits for members, changes in practice, or significant learning resulting from TAI efforts.

Indicator reference sheet: Documentation for each indicator that, at a minimum, defines the indicator language and meaning, notes the data source(s) and frequency of collection, and details the means of verification (how the indicator value will be calculated or presented). This is a key tool for ensuring consistency and data quality over time and across the personnel involved.

Learning questions: Set of questions that prioritize and focus TAI's collective attention on knowledge gaps and/or practice challenges related to our MEL framework. Ideally, each question has a plan for answering it and/or links to emerging activities, as relevant.
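For illustration, an indicator reference sheet can be modeled as a small structured record. The sketch below is in Python; the field names and the sample entry are illustrative assumptions, not TAI's actual template:

```python
from dataclasses import dataclass

@dataclass
class IndicatorReferenceSheet:
    """One indicator's reference documentation (illustrative fields only)."""
    indicator: str               # the exact indicator language
    definition: str              # what each term in the indicator means
    data_sources: list[str]      # where the underlying data come from
    frequency: str               # how often the data are collected
    means_of_verification: str   # how the value is calculated or presented

# Hypothetical entry for one of TAI's aligned indicators
sheet = IndicatorReferenceSheet(
    indicator="% of member survey respondents that report benefitting "
              "from collaborative initiatives",
    definition="Respondents answering 'Agree' or 'Strongly agree' count as benefitting.",
    data_sources=["Member engagement survey"],
    frequency="Annual",
    means_of_verification="Agreeing respondents divided by total respondents",
)
```

Keeping every indicator in one list of such records makes it straightforward to check that each indicator has a source and a frequency before data collection starts.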

3.2. Important considerations

a) MEL scope: TAI offers a variety of activities and products through which members can benefit from and leverage
their collective information, practices, relationships, and/or resources in a structured and sustained
manner. TAI members are incentivized to continuously learn and improve responsive and responsible
Grantmaker practices and strategies (for their beneficiaries). The TAI MEL system's scope therefore
covers the Secretariat team's intervention. Our direct targets are TAI's members, as institutions and
as staff. This means that members' own targets may be indirect and different from the TAI Secretariat's.
b) MEL approach: TAI's MEL framework is built around two approaches: a question-driven
approach and an indicator-driven approach.
o Question-driven approach: TAI's monitoring and evaluation processes are structured around
learning questions that enable a much broader analysis of our thematic areas and of our
intervention's progress and achievements. Answers to these key questions are
generated through analysis of MEL insights throughout the implementation period and
beyond. TAI periodically documents a narrative response to each question based on data
analysis and sensemaking sessions with members.
o Indicator-driven approach: TAI's MEL framework contains two types of indicators:
"general indicators" and "aligned indicators". General indicators are divided between output
indicators (operational level) and outcome indicators (change level). Aligned indicators are
drawn from the general indicators as agreed with funders and are reported periodically to TAI's
steering committee.

c) Yearly calendar: TAI has two important schedules, the work plan schedule and the reporting
schedule. The two differ slightly; see the MEL timeline in section 3.4.
3.3. MEL Framework
Aligned indicators

Each entry below gives a result statement or learning question with its result level, followed by its indicator(s); the source / means of verification and the frequency of collection are given in parentheses.

Collaborative

Goal: TAI platform for learning and collaborative action brings value to members.

Outcome: TAI strengthens mutually beneficial collaboration among members.
- # of documented instances of collaboration among two or more members (collaboration case studies; member engagement survey; annual)
- % of member survey respondents that report benefitting from collaborative initiatives (member engagement survey; annual)

Outcome: TAI participation enhances individual and collective member awareness and learning.
- % of member survey respondents that report sharing TAI experience or content with other funders* (member engagement survey; annual) (*members vs non-members)
- % of member survey respondents that report new or stronger relationships with other funders* through TAI (member engagement survey; annual) (*members vs non-members)
- Increase in the number of new relationships, and strengthening of existing ones, among member staff (network analysis?; twice)

Outcome: TAI core institutional membership size offers value to members.
- Net change in # of core institutional members from the 2017-2019 strategy to the 2020-2024 strategy (TAI governance documentation and grant records; three times: 2019 baseline, midpoint, and end of the 2020-2024 strategy)

Output: Funders within and beyond the current membership are engaged in Secretariat-facilitated activities.
- # of unique funders (institutional program or other organizational unit) represented at Secretariat-facilitated initiatives, disaggregated by membership status and TAI strategic pillar (TAI participation records and activity documentation; rolling basis)
- Non-member funders' perceptions of value from engaging with TAI's initiatives (follow-up interview or online post-event open-ended survey with non-member funders; rolling basis)

Output: TAI annual offerings are adaptive to member and field contexts.
- Secretariat employs inclusive processes to scan member interests to initiate, pause, or conclude collective initiatives (TAI annual work plan, reporting, and programmatic documentation; annual)

Output: TAI collective MEL system produces quality data used by Secretariat and members.
- Secretariat develops and implements the MEL Plan for the new strategy with members (TAI MEL Plan; data collection, reflection, and learning documentation; annual)

What We Fund

Goal: Diverse funder ecosystem supports transparency, participation, and accountability (TPA) outcomes.

Impact: New funders or funder resources support TPA issues or outcomes.
- Progress made on non-member funder engagement plans (non-member TPA-relevant funder engagement plans)

Outcome: Members amplify key messages and evidence-based insights to promote TPA practices.
- # of member (co)commissioned or authored content items disseminated beyond the TAI membership (TAI Weekly newsletter: Member Spotlight section or other featured op-eds, blogs, research; semi-annual)

Learning questions:
- How are funders collaborating with other funders around TPA-relevant issues?
- What opportunities are emerging within and beyond the TPA funding ecosystem for joint action or funding?

How We Fund

Goal: Funder community offers equitable and responsive support to organizations.

Impact: Members strengthen individual or collective quality* of support provided to the TPA field.
- % of members (institutions or programs) that take action to improve at least one quality factor in their TPA Grantmaker practice annually (Secretariat inventory of member TPA Grantmaker practice against quality factors; TAI grants database; annual)

Outcome: TAI-delivered experiences or content inform member Grantmaker approaches or practices*.
- % of member survey respondents that report changes to their Grantmaker approaches or practices as a result of TAI participation (member engagement survey; annual)
- # of financial and non-financial tools introduced or shared through TAI adopted by TAI members (TAI members, Secretariat, and grantee organizations where relevant; method TBD)

Learning questions:
- How are funders sensing and responding to grantee organization needs?
- What might funders do to mitigate unintended burdens or inequities in ecosystem or organizational support?

Funder Landscape

Goal: Funders work together to learn and support transparency, participation, and accountability (TPA) field needs and outcomes.

Impact: Members leverage individual resources towards a shared priority field issue or outcome.
- # of instances where two or more TAI members provide funding for a similar issue or TPA-relevant outcome (TAI member and Secretariat staff, and programmatic or meeting documentation; strategy mid-point evaluation)

Outcome: TAI-delivered experiences or content inform member strategic direction or portfolio funding decisions.
- # of cases member staff identify of TAI-delivered experiences or content informing strategic decisions or portfolio funding decisions (TAI members, Secretariat, and non-member funders where relevant; method TBD; annual)
- % of member survey respondents that report TAI participation informed their strategies or portfolio funding decisions (member engagement survey; annual)

Learning questions:
- What threats are emerging for strategic TPA goals of funders / grantees? (tracked via the change in average CIVICUS Monitor ratings for OGP and EITI countries; CIVICUS Monitor; annual)
- What windows of opportunity are opening for TPA fields and goals? ("TAI will organize ... 4 additional learning events (virtual or in person) e.g. on how to take on board forthcoming ICNL visioning of civil society needs 20 years from now.")
- How might funders support more inclusive learning among and with grantee organizations?

3.4. MEL timeline


Schedule 1:

- January / February / March: monthly data collection sessions; annual survey (N-1); annual FGD (N-1); annual report (N-1)
- April / May / June: monthly data collection sessions; annual member retreat; external evaluation (every three years); collaboration case notes
- July / August / September: monthly data collection sessions; semi-annual survey (N); semi-annual report (N); annual member grants data
- October / November / December: monthly data collection sessions; annual learning days

1 Some elements are indicated for information purposes and might occur in other months depending on the context.
3.5. Learning questions

In the 2020-2025 MEL Plan, TAI documented learning questions and illustrative activities through
which we would aim to address each question. TAI documented some of the key insights in the 2020
Data Placemats. However, this was not a systematic effort; it served more as an illustration of
how we tried to address these questions and of relevant takeaways. Developing a more systematic
approach to answering and refining these questions into a living learning agenda will take time and
may be best placed with future TAI MEL staff. In the interim, the Secretariat team might take stock of
and document key insights for these learning questions on a yearly or semi-annual basis and feature
them in the appropriate member-facing products.
While learning is designed primarily for members, the TAI Secretariat and members are committed to
sharing insights with the field. Products and TAI reports are made public and proactively
disseminated.
The current learning framework (including processes and tools) is undergoing internal reflection and
is expected to change substantively.

Guiding questions by pillar:

What we fund
1- What threats are emerging for strategic TPA goals of funders/grantees?
2- What windows of opportunity are opening for TPA fields and goals?
3- How might funders support more inclusive learning among and with grantee organizations?

How we fund
1- How are funders sensing and responding to grantee organization needs?
2- What might funders do to mitigate unintended burdens or inequities in the ecosystem or organizational support?

Funder landscape
1- How are funders collaborating with other funders around TPA-relevant issues?
2- What opportunities are emerging within and beyond the TPA funding ecosystem for joint action or funding?

3.6. Evaluation

Past practice: TAI has commissioned two external evaluations, one in 2016 (Morris Lipson) and the
second in 2019 (Arabella Advisors), which each had different use cases for the collaborative. In reading
the 2016 evaluation report, it seems the Steering Committee had arrived at some decisions regarding
the future purpose and scope of TAI’s work before commissioning the evaluation. Therefore, the
evaluation served more as a check, or confirmation, of those emerging decisions. The 2019 evaluation
assessed progress on, and ultimately informed the revision of TAI’s strategic framework and
outcomes.

Current intentions: The current MEL Plan notes the intention for an external evaluation to occur every
three years. TAI had intended for the 2019 evaluation to occur at the midpoint of the 2016-2019
Strategy; in practice, however, its timing aligned with the conclusion of that three-year period. The next
evaluation would likely occur in 2024 and would inform the next Strategy.
Evaluation planning: TAI might consider a more open or closed procurement process to identify an
external evaluation partner. Regardless, the Secretariat should plan for a roughly nine-month
evaluation period, with member engagement at each phase:
- Month 0– define purpose and budget for evaluation with Steering Committee, and
consider forming an evaluation advisory group to accompany the process.
- Months 1-3– draft SOW with member input, circulate to candidates, finalize evaluation
partner decision and set up the contract.
- Months 4-6– main data collection period, some early reflection with members.
- Months 7-9– deeper reflection, final analysis, co-creating recommendations / action
plan with members.

II. Monthly inputs

1. Data collection form (v3)

TAI's data collection tool is a spreadsheet form built entirely in Google Sheets,
based on previous versions and on continuous learning from the team's feedback.
[Link]

This data collection tool is comprehensive and replaces the two previous tools (Data
collection Google forms + Individual level data Google docs).

The layout is significantly different: inputs are grouped by change levels (Outputs and Results),
replacing the previous presentation by pillar (What We Fund, How We Fund, Funder Landscape, and
Collaborative). This layout makes it easier and more intuitive to extract data for periodic reporting.
The tool centralizes all collected data in one database and helps users see the bigger
picture of inputs and progress.

One weakness of the previous tool was its inability to support team coordination. Coordination is
now a core feature of this version, through task assignments, messages, tags, etc., and through
real-time online cloud collaboration. This is a great way to send reminders and prepare for the
monthly data collection session beforehand.
The tool is also more visual and intuitive: each tab is divided into blocks (with different colors)
that naturally indicate the filling sequence.


A quality check module was developed so that minimum data quality requirements can be verified
easily and on a regular basis.

2. Monthly data collection session

What?
A call or in-person session during which all team members go through the data collection tool and
reflect together on the inputs to record for the month: bilateral support, strategic outputs, strategic
results, quotes, and TAI-hosted events.
This session is much more than data collection. It is a great opportunity to reflect on our progress and
ensure an in-depth analysis of our overall performance, and it is valuable preparation for our reporting
inputs.

When? Last day of the month (or close)


Who? The secretariat team

How?
During the session, the team goes through each tab, reflects on the inputs to add from the past
month, then adds the necessary data.
The data collection tool was designed to make the experience easier and time-saving, so
many cells are predefined:
1- Double-clicking on a date column opens a calendar.
2- Names, strategic pillars, and initiatives are predefined lists of items. The initiatives list is stored in a
hidden tab.

The last tab, "Events", is a database that keeps track of all participants who have joined TAI-run
events each year (since 2020).

The names database is stored separately in a hidden tab and is updated monthly with new
participants. The list has three columns: Names, Organization, and Membership. The main objective
is to ensure the consistency and accuracy of the data so that insights are meaningful. This sheet is
hidden for protection; ideally, only the MEL fellow can edit it.
During the data collection session, we define who is responsible for recording each event (name and
date) and listing all participants and their respective institutions. Although this task is assigned to one
team member, keeping track of participants' names at TAI-hosted events is teamwork.

Quality check

1- Many columns contain quality check bars that measure the completeness of the column, given
that inputs are mandatory for that column.

2- Many blocks contain data check formulas that automatically detect mistakes, which are flagged
with a "!" mark.
3- For consistency, many cells have data validation conditions: users can only type predefined
elements, and a red sign automatically highlights any mistake.

4- Finally, the data collection tool includes a whole data quality check module that summarizes all the
data quality aspects in each tab. Colors indicate the general quality of the tab, while errors are located
with precision.
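The same kinds of checks can be mirrored outside the spreadsheet, for example when exporting the data. Below is a minimal Python sketch, assuming hypothetical column names and predefined lists (they are not the tool's actual schema), that flags missing mandatory cells and values outside a predefined list, in the spirit of the sheet's "!" marks:

```python
MANDATORY = ["date", "pillar", "name"]                 # hypothetical mandatory columns
ALLOWED = {"pillar": {"What we fund", "How we fund",   # hypothetical predefined lists
                      "Funder landscape", "Collaborative"}}

def check_row(row: dict) -> list[str]:
    """Return '!'-style error messages for one record."""
    errors = []
    for col in MANDATORY:
        if not row.get(col):  # missing or empty mandatory cell
            errors.append(f"! missing value in '{col}'")
    for col, allowed in ALLOWED.items():
        if row.get(col) and row[col] not in allowed:  # value outside predefined list
            errors.append(f"! '{row[col]}' is not a predefined value for '{col}'")
    return errors

rows = [
    {"date": "2022-10-03", "pillar": "How we fund", "name": "Call with member"},
    {"date": "", "pillar": "Hw we fund", "name": "Typo demo"},
]
for i, r in enumerate(rows, start=1):
    for e in check_row(r):
        print(f"row {i}: {e}")
```

Running such a script over an export gives the same pass/fail picture as the sheet's quality check module, which can be useful before periodic reporting.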

Best practice
- Prioritize using the data collection session as a reflective session, in particular at the outcomes
level, rather than as a data-filling session.
- Use the task assignment feature as a reminder for future sessions.
- Set data quality targets periodically to ensure minimum quality requirements are met.

III. Yearly outcomes sensemaking

What?
This process aims to measure TAI's outcome-level progress and to collect both quantitative and
qualitative data for the yearly indicators and change-level assessments.

When? The yearly performance monitoring starts in February.


Who? MEL fellow, MEL consultant, and ED.

How?
The yearly data collection process is divided into three main steps:
Annual survey:

Drawing on Arabella Advisors’ 2019 evaluation survey questions and MEL system memo
recommendations for TAI, the Secretariat developed an annual member survey. The survey provides
data for some yearly and semi-annual MEL indicators and offers insights into patterns in member
perception and experience with TAI. Findings from this survey are aggregated and shared with TAI
members and used to inform key metrics in our MEL Plan. Some findings may be integrated into
TAI's grant reporting.

The annual survey has three targets and therefore three versions:
TAI members, [Link]
TAI non-member funders, [Link]
TAI non-members non-funders [Link]
This annual survey is sent to members' staff and to non-member participants who have been "actively
engaged" with TAI's activities; at TAI we define "actively engaged" as having attended at least two
events. The list of respondents is updated yearly based on our events participation database (see
above).
Part of the survey's results informs TAI's aligned indicators, which are subject to periodic reporting.
The other part feeds a qualitative data collection process.
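The "actively engaged" rule is straightforward to derive from the events participation database. A sketch using Python's standard library, with illustrative attendance records (the names and events are made up):

```python
from collections import Counter

# One record per (participant, event) drawn from the events database (illustrative data)
attendance = [
    ("Amina K.", "Learning Day 2022"),
    ("Amina K.", "Grantmaker practice call"),
    ("Ben T.",   "Learning Day 2022"),
]

# Count how many TAI events each person attended
events_per_person = Counter(name for name, _event in attendance)

# "Actively engaged" = attended at least 2 events -> receives the survey
survey_recipients = sorted(n for n, count in events_per_person.items() if count >= 2)
print(survey_recipients)  # ['Amina K.']
```

Regenerating this list from the events tab each year keeps the respondent list consistent with the participation data rather than with ad-hoc memory.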

The survey's structure

Demographics
- Select the foundation where you worked in yyyy.
- Select the region where you were based in yyyy.

TAI Value Add: Personal
- To what extent do you agree with the following statement: I personally benefited from my participation in TAI's activities and collaborative initiatives.
- In what ways have you personally benefited as a result of your participation in TAI's initiatives?

TAI Value Add: Publications & Events
- Which of the following types of knowledge products published by TAI in yyyy do you consider the most useful? [List of TAI's events/products]
- Please rate TAI's yyyy research products according to their usefulness to you and your work.
- To what extent have you shared TAI written products or reports in yyyy?
- To what extent have you encouraged others to attend TAI events in yyyy?

TAI Value Add: Connection
- To what extent do you agree with the following statements? [Forming new relationships/strengthening existing ones]
- Which of the following TAI members did you engage with in yyyy? [List of members]
- Select the type of TAI funding members you engaged with in yyyy. [Already known/Never knew before/Didn't engage]
- Select the type of non-TAI funders you engaged with in yyyy. [Already known/Never knew before/Didn't engage]

TAI Value Add: Collaboration
- Please select the ways in which you engaged with other TAI members in yyyy. [Exchanging ideas/Exploring shared agendas/Responding to an Ask/Influencing]
- Please select the ways in which you collaborated with other TAI members in yyyy. [Co-investment in research/Synchronized strategy/Co-investment in grantees/Worked on country-level initiative]
- Can you please share an example of an engagement with another TAI member in yyyy?

TAI Strategic Contribution: Grantmaking Practice
- In yyyy, did any TAI initiatives inform changes to your grantmaker approaches or practices?
- If you answered "Yes," could you please share an example?

TAI Strategic Contribution: Portfolio
- In yyyy, did TAI initiatives inform your strategy or portfolio funding decisions?
- Could you please share an example?

TAI Strategic Contribution: Influence
- To what extent do you agree with the following statements about your institution's influence in yyyy? [Increased influence among funders/Influencing grantees/Influencing other funders' strategic priorities]
- If you selected "Agree" or "Strongly Agree" to any of the above, could you please share an example?

Feedback for TAI: Overall Satisfaction
- How satisfied are you with your institution's level of engagement (including dedicated time/resources) in TAI in yyyy?
- Since you selected "dissatisfied" or "very dissatisfied", could you please select the statement that best applies.

Focus group discussion


As part of TAI’s yearly performance tracking, a Focus Group Discussion with TAI’s members takes
place every year right after the annual member survey. The main objective of the FGD is to present
and unpack the main results of the annual survey and make meaning from the findings. The FGD’s
agenda is co-developed by the secretariat’s relevant team members. The questions and sub-
questions depend largely on the learning agenda for each year and collective priorities. Findings
from the FGD are food for reflection during the annual retreat, qualitative insights for TAI’s MEL
framework and inputs for the annual report.

The agenda [Link] and tools may differ from year to year depending on the survey results, the
participants, TAI's overall progress, and the specific focus.

Participants in this exercise are selected by the team in charge, based on strategic criteria and
involvement with TAI.

The session is moderated by either TAI's MEL consultant or the MEL fellow. An agenda is prepared for
the meeting and shared with participants beforehand. It includes the thematic focus and its
sub-questions, and a presentation of the annual survey results described in step 1.
Many tools can be used during this (remote) session, e.g. Jamboard, virtual gallery walk, or
storytelling, depending on the objective, the agenda, and the participants.

Analysis and reporting


The results of both the annual survey and the Focus Group Discussion feed the learning questions
framework; they are compared with previous results and translated into insights, learning, and
conclusions to be shared with TAI members in diverse formats: annual retreat, PowerPoint
presentation, gallery walk, annual report, TAI monthly newsletter, etc.

The results of this process inform part of the "aligned indicators" table:

What we fund:
- # of cases member staff identify of TAI-delivered experiences or content to inform strategic decision or portfolio funding decisions
- % of member survey respondents that report TAI participation informed their strategies or portfolio funding decisions

How we fund:
- % of member survey respondents that report changes to their Grantmaker approaches or practices as a result of TAI participation
- # of financial and non-financial tools introduced or shared through TAI adopted by TAI members

Funder landscape:
- # of unique funders (institutional program or other organizational unit) represented at Secretariat-facilitated initiatives

Collaborative:
- # of documented instances of collaboration among two or more members
- % of member survey respondents that report benefitting from collaborative initiatives
- Net change in # of core institutional members from the 2017-2019 strategy to the 2020-2024 strategy
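Most of the percentage indicators above reduce to the same calculation: respondents agreeing with a statement divided by total respondents. A minimal sketch with illustrative answers (the response scale is an assumption):

```python
# One entry per survey respondent for a single Likert question (illustrative data)
responses = [
    "Strongly agree", "Agree", "Neutral", "Agree", "Disagree",
]

# Count respondents who agree, then express as a percentage of all respondents
agreeing = sum(r in ("Agree", "Strongly agree") for r in responses)
indicator_value = 100 * agreeing / len(responses)
print(f"{indicator_value:.0f}% of respondents report benefitting")  # 60%
```

Fixing this rule once (which answers count as "agree", and whether blanks are excluded from the denominator) in the indicator reference sheet keeps the value comparable year over year.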

Quality check
1- The survey questions inform TAI's indicators as defined in the indicator reference sheet [Link].
This ensures the survey's convergence with the MEL framework and the relevance of the
questions.
2- The process is managed by at least three people with different perspectives and
backgrounds (the MEL consultant, the MEL fellow, and the ED) to reduce bias and the risk of
misinterpretation.

Best practice
- Avoid updating the survey too often, to ensure consistency and comparability.
- Refer to copies of previous survey emails and reminders to members and non-members, which are
stored in Dropbox.
- Allow at least one month for members to respond, and avoid sending out the survey over the holidays.
- Follow up individually with members who have not yet completed the survey.
- Send the survey results to the FGD's participants prior to the call.

IV. Synthesis and reflection

1. Members’ attendance analysis

Because our members' staff are our primary target, we seek to address their specific needs by
adapting our offer (products). We therefore track monthly event participation and translate it into
periodic (yearly) insights that are key ingredients when defining and shaping our products (events).

Our analytics scope covers, but is not limited to:


- Active participation: Who are the most and least active participants? Are there any
commonalities between them? What are the reasons for these discrepancies?
- Periodic evolution: Year-on-year comparison is an important factor in our analysis, since our
objective is to enlarge our internal reach among members' staff and reinforce
consistency in participation at an institutional level.
- External attendance: We also take into consideration external participation from
non-member practitioners and funders, as it is one of our pillars and a strategic target.
- 20%-80% analysis: When analyzing participation data, we look for trends, group
participation rates, and examine whether a small share of participants accounts for
most of the attendance, both for each member and collectively.

Attendance analysis can be conducted in more than one way; however, the analysis possibilities are
sometimes limited by our inability to systematically and accurately collect certain data, such as
participants' geographic location and position, and the type of non-member organizations.
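The 20%-80% check described above can be sketched in a few lines of Python. This is a minimal illustration with a hypothetical sign-in log, not TAI's actual analytics; the data and function names are assumptions.

```python
from collections import Counter

def pareto_share(attendance, top_fraction=0.2):
    """Share of total attendance accounted for by the most active
    top_fraction of participants (a Pareto-style 20%-80% check)."""
    # Count events attended per participant, most active first.
    counts = sorted(Counter(attendance).values(), reverse=True)
    n_top = max(1, round(len(counts) * top_fraction))
    return sum(counts[:n_top]) / sum(counts)

# Hypothetical sign-in log: one entry per participant per event attended.
log = ["ana", "ana", "ana", "ana", "ben", "ben", "chi", "dia", "eli", "fay"]

share = pareto_share(log)
print(f"Top 20% of participants account for {share:.0%} of attendance")  # 40%
```

A share approaching 80% would signal that attendance is concentrated in a few individuals, which is exactly the kind of finding that should feed back into how events are designed and promoted.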

2. Thematic analysis

Blogs

At TAI we use blogs as a channel to publish our learning notes on a regular basis and share knowledge
across the TPA sector. Blog ideas (content) are inspired by members' feedback, direct requests,
burning questions from meetings, TPA news, etc. Blogs are developed either internally by the
Secretariat or co-developed with external experts, practitioners, or members, and they cover many
TPA-related topics such as international trends, latest news, TPA-related concepts, and analysis.

Blogs take different formats depending on the content: notes, interviews, infographics, dynamic
dashboards, etc.

3. Collaboration case notes

What?
Started in 2018, these are a series of brief case studies conducted with external consultant support to
assess TAI collaborative initiatives undertaken in the previous year. Collaboration case notes are brief
narrative and/or visual accounts that document instances of TAI member collaboration. The case
notes generally explain the problem the collaboration is trying to address, detail who participated and
how, what (if anything) was achieved, what was useful (or not), and what lessons could be learned
from the experience, including barriers and enablers to collaboration. These case notes form part of
TAI’s internal monitoring efforts to track and document outcome-level results and learning from our
work.

Moreover, we publish the case notes on our website to make TAI's experience available and to
contribute to the learning (and potentially the practice) of other funder or network collaborative
initiatives. We could do more to use the case notes with TAI members and partners, converting these
analysis and reflection products into a learning and reflection activity. This would help us refine the
collective expectation of the types of collaboration useful for addressing different problems.
Annual collaboration case note 1 [Link]
Annual collaboration case note 2 [Link]
Annual collaboration case note 3 [Link]

How?
The methodology relies heavily on key informant interviews (KIIs) with TAI funder members and non-
member funders, the TAI Secretariat, and other stakeholders involved in the collaboration. These were
usually conducted by an external consultant, although a few of the case notes were written by TAI staff.
Overall, the data collection approach depends on the theme and the internal capacity. The following is
provided for information only:

Theme: Problems or needs to be addressed
Potential approach:
- Document review
- Interviews

Theme: Member collaboration
Potential approach:
- Document review
- Interviews
- Network map
- Roles matrix

Theme: Utility of the collaboration
Potential approach:
- Outcome mapping
- Decision mapping
- Outcome harvesting through document review
- Interviews
- Survey

Questions for the case studies broadly (for information purposes):

• What question(s) or problem(s) did the collaboration initiative seek to address?
- To what extent did these needs evolve during the initiative?
• What did member collaboration look like?
- What major events, deliverables, or other activities occurred during the initiative?
- Over what period of time did the initiative occur?
- Who was and was not involved (within and outside of TAI)?
- What role(s) did members vs the Secretariat vs other stakeholders play?
- How does this align with TAI's anticipated types of collaboration?
• To what extent was the collaboration useful to members?
- How, if at all, did the initiative help to resolve the needs identified?
- What decisions, actions, or other changes or results occurred?
- What factors contributed to the utility of the collaboration?

Evolving the Case Notes

Given the resource-intensive nature of generating these case notes (cost of the external consultant,
member time for interviews, etc.) relative to the level of readership and engagement, the Secretariat
will sunset the case notes in this format.

Instead, we will experiment with a lighter-touch form of data gathering at the end of each significant
collaboration: most likely a call, facilitated by the MEL lead with the relevant member participants, to
talk through outcomes, barriers and enablers to collaboration, and what could have been done more
effectively. This can be documented in a summary document (two pages or less).

4. Other Forms of Strategic reflection

At TAI, the semi-annual and the annual report are our two primary strategic reflection moments. The
Secretariat drafts these reports as a group, presenting indicator values and discussing progress and
learnings, drawing on the strategic and MEL frameworks. Through the process of drafting and editing
this content, the Secretariat team engages in reflection and in surfacing key learning insights to include
in the report. It is far more than assembling pieces into a report; it is an in-depth analysis of "are we
doing the right things?" and "are we doing things right?".
Besides the reporting moments, TAI’s Annual Member Retreat and Learning Day events, whether in
person or virtual, offer a rich environment for targeted and purposeful reflection to build group
cohesion and shared understanding around the past and plan for the future.
The above four key moments of outcome analysis and learning are embedded in our workplan and
MEL schedule, ensuring a continuous learning agenda throughout the year.

The results of these reflection moments are shared with TAI members in diverse formats, depending
mainly on the audience. For example, during the 2021 annual reporting process, the annual member
survey analysis carried out by the Secretariat team and relevant members' staff was translated into a
gallery walk, and the FGD insights into a PowerPoint presentation, at TAI's annual retreat.

V. MEL Reporting and dissemination components

- The semi-annual report is where work plan progress is documented and mid-year results to date are
reported. This report is used for member-required reporting. The semi-annual report presents a subset
of the aligned indicator values (as some of them are measured yearly) and describes progress and
learnings, drawing on both the strategy and MEL frameworks.

- While the semi-annual report focuses on the work plan priorities and progress (or challenges) for key
initiatives, the annual report focuses on strategic progress, outcomes, and insights for the past
calendar year of work. The annual report has more analysis and provides an in-depth description of the
overall progress and learning insights.

What we fund
- # of cases member staff identify of TAI-delivered experiences or content to inform strategic decisions or portfolio funding decisions (annual report)
- % of member survey respondents that report TAI participation informed their strategies or portfolio funding decisions (annual report)

How we fund
- % of member survey respondents that report changes to their grantmaker approaches or practices as a result of TAI participation (annual report)
- # of financial and non-financial tools introduced or shared through TAI adopted by TAI members (annual and semi-annual reports)

Funder landscape
- # of unique funders (institutional program or other organizational units) represented at Secretariat-facilitated initiatives (annual and semi-annual reports)

Collaborative
- # of documented instances of collaboration among two or more members (annual and semi-annual reports)
- % of member survey respondents that report benefitting from collaborative initiatives (annual and semi-annual reports)
- Net change in # of core institutional members from 2017-2019 strategy to 2020-2024 strategy (annual and semi-annual reports)

- Blogs focus on learning and analysis as part of our MEL effort. They are a great channel to share
knowledge across the TPA sector and to publish TPA-related findings and learning notes on a regular basis.

- This MEL handbook is part of TAI's MEL dissemination tools/channels. It sets out TAI's internal
common understanding and agreed MEL mechanisms/tools. It is also a channel to disclose TAI's
vision and practice related to MEL.

- TAI’s annual retreat & the annual learning day are two important internal dissemination
mechanisms. The annual retreat focuses more on assessing overall progress and reflects on evidence-
based planning, while the learning day focuses more on thematic learning and collective knowledge.
MEL Reporting and dissemination framework

Annual report
  Involved stakeholders: ED, Team
  Content: Outcomes and learning
  Key audience: Members, Funders, Public (TPA)
  Frequency: Yearly
  Date: March

Semi-annual report
  Involved stakeholders: ED, Team
  Content: Work plan progress
  Key audience: Members
  Frequency: Bi-annual
  Date: August

Blogs
  Involved stakeholders: Team
  Content: Analysis and learning
  Key audience: Public (TPA)
  Frequency: Quarterly
  Date: -

Handbook
  Involved stakeholders: MEL fellow
  Content: Processes and tools
  Key audience: Secretariat
  Frequency: Yearly
  Date: -

Retreat
  Involved stakeholders: ED, DD, Team
  Content: Planning
  Key audience: SC members, Secretariat
  Frequency: Yearly
  Date: April

Learning day
  Involved stakeholders: DD, SC members, Team
  Content: Learning
  Key audience: Members, Funders, Secretariat
  Frequency: Yearly
  Date: November

VI. Knowledge management
[MEL folder tree with links]

Key shortcuts:

- TAI MEL Plan 2020-2024 [Link]
- TAI Strategy 2020-2024 [Link]
- MEL Framework [Link]
- Monitoring tool [Link]
- Annual member survey [Link]
- Annual non-member funders survey [Link]
- Annual non-member non-funders survey [Link]
- Semi-annual member survey (short version) [Link]
- Annual focus group discussion [Link]
- Annual survey highlights [Link]
- Attendance analysis [Link]
- Consolidated attendance DB (will be merged with monitoring tool) [Link]
- Member grants dashboard [Link]
- Members grants database [Link]
