WHY AM I ALWAYS BEING RESEARCHED?

A GUIDEBOOK FOR COMMUNITY ORGANIZATIONS, RESEARCHERS, AND FUNDERS TO HELP US GET FROM INSUFFICIENT UNDERSTANDING TO MORE AUTHENTIC TRUTH

CHICAGO BEYOND EQUITY SERIES, VOLUME ONE
Executive Summary .........................................................................6
Voices of Support.............................................................................8
Introduction ................................................................................... 10
Thank You.....................................................................................108
Glossary ........................................................................................110
Bibliography.................................................................................. 111
EXECUTIVE SUMMARY
As an impact investor that backs the fight for youth equity, Chicago Beyond has partnered with
and invested in community organizations working towards providing more equitable access and
opportunity to young people across Chicago. In many cases, we have also invested in sizable
research projects to help our community partners grow the impact of their work. Our hope is
that the research will generate learnings to impact more youth in our city and nationwide, and
arm our partners with “evidence” they need to go after more funding for what is working.
Through the course of our investing, another sort of evidence emerged: evidence that the
power dynamic between community organizations, researchers, and funders blocks information
that could drive better decision-making and fuel more investment in communities most in
need. This power dynamic creates an uneven field on which research is designed and allows
unintended bias to seep into how knowledge is generated.
There are many voices in the social impact space who have begun to call out the power
dynamic. “Grantmaking is not trusting of the community, and the community is not trusting
of funders,” Edgar Villanueva, author of Decolonizing Wealth, has said. In annual letters, the
President of the Ford Foundation, Darren Walker, pushes funders to reckon with privilege,
acknowledging that communities closest to the problems possess unique insight into the
solutions. The power dynamic in the social impact space is impeding our collective efforts to
create a better world.
Right or wrong, research can drive decisions. If we do not address the power dynamic in the
creation of research, at best, we are driving decision-making from partial truths. At worst,
we are generating inaccurate information that ultimately does more harm than good in our
communities. This is why we must care about how research is created.
In this publication, we offer “how” we can begin to level the playing field and reckon with
unintended bias when it comes to research. Chicago Beyond created this guidebook to help
shift the power dynamic and the way community organizations, researchers, and funders
uncover knowledge together. It is an equity-based approach to research that offers one way
in which we can restore communities as authors and owners. It is based on the steps and
missteps of Chicago Beyond’s own experience funding community organizations and research,
and the courageous and patient efforts of our partners, the youth they serve, and others with
whom we have learned.
This guide is intended to:

ĥ Fuel community organizations to not only participate in research on their own terms, but to lead it.

ĥ Support researchers in recognizing their immense influence and unintended bias in shaping the questions asked, and the inputs used to answer them.

ĥ Inspire funders to ask hard questions about their agendas, unlock more meaningful knowledge, and therefore achieve greater impact.

We at Chicago Beyond see this publication as a start—it is by no means the answer. We wrestle regularly with operating in this world while envisioning an equitable one.

We ask you to join us in questioning, wrestling with bias, and pushing against “how it has always been done.” We need to collectively move from insufficient understanding to more authentic truth. The stakes are too high for us to do otherwise.
VOICES OF SUPPORT

“We are not asking the right questions. To reverse the cycle of
oppression, we must understand the intersectional complexity
of personal responsibility, programmatic acts of charity, and
the interplay of systems and policies that are designed to
deny opportunity. To evaluate an afterschool program absent
of intersectional context is misleading and provides answers
that are insufficient, and often useless to transforming the
lives of people living in the community. This guide pushes the
conversation, which is exactly what the field needs.”
MICHAEL MCAFEE, PRESIDENT AND CEO, POLICYLINK
“When Dr. King was assassinated, in the 60s, we were doing studies in North
Lawndale and Garfield, and here it is almost sixty years later. We have been studied
for years and years and years—what is the positive outcome we are going to see for
these studies we are always doing? Will it be in these young men’s lifetimes? While
they are still in their twenties?”
“Chicago Beyond has just handed us an incredible tool for funders, researchers and organization leaders. This pushes
the reset button and allows us to rethink evaluation. And let’s face it, the problems we aim to alleviate are too
complicated for us to approach solutions without examining how we are entering the work. This tool gives us questions
to use in our everyday work. Questions like: What if building trust is the variant that indelibly strengthens our
outcomes? How can funders come to the table recognizing our own power to influence, shape and therefore limit the
results? And what if funders had to use the same evaluation approaches and metrics on ourselves that we force others
to submit to? So, let’s rethink. Let’s come to the table recognizing the power source is the ORGANIZATION. They aren’t
the subject, they are the research designer.”
ANGELIQUE POWER, PRESIDENT, THE FIELD FOUNDATION OF ILLINOIS
“For our own kids, we understand the context of their lives, their trajectory over time, and what ‘safe’
and ‘happy’ and ‘realizing potential’ look like in the long term. But for kids in social services, we
typically create some limited data set of how they interacted with one specific service at some
specific point in time or a series of isolated services, and then claim we are ‘evidence-based.’ It is
time to support these kids as we support our own kids and these families as we support our family.
Our goal should be to help them realize their potential. It is time to change what we do, and don’t,
accept as evidence. Is the evidence we are working from the truth of the people studied, or did it get
imposed on them? Is it teaching us about how we support full life and well-being?”
PAULA WOLFF, DIRECTOR, ILLINOIS JUSTICE PROJECT
“As board members or decision-makers at the table, we ask for ‘hard evidence’ and ‘gold-standard
methods’ thinking we are stewarding resources well, but if we don’t take a hard look at the
inequities built into how evidence is getting made, we could be doing the opposite. I appreciate the
thoughtfulness behind ‘Why Am I Always Being Researched?’ and its push for us all to do better
when it comes to equity and access.”
“This guide offers thoughtful advice to practitioners, researchers, and funders on how to combine
resources to conduct high-quality research. Importantly, the guide advances the idea that rigorous
research methods, when applied appropriately, can help us all learn whether interventions are
producing the hoped-for benefits.”
“The questions we are asked, where do they
come from? Whose lab are we in? When I was
doing my master’s I had to do research too. The
starting question was never ‘how is this going
to benefit the people being researched?’”
“There is without a doubt a valuable role for outside evaluation. But—outside evaluation
can be problematic for communities and for community programs when there are
agendas at work beyond their own. We need to remember the first priority is to produce
something that is not just accessible to, but actually valuable to, the community. This
guide is such an important project for this reason.”
UNMI SONG, PRESIDENT, LLOYD A. FRY FOUNDATION
PRIVILEGE SHAPES KNOWLEDGE
As an impact investor that backs the fight for youth equity, Chicago Beyond
has partnered with and invested in community organizations working
towards providing more equitable access and opportunity to young people
across Chicago. In many cases, we have also invested in sizable research
projects to help our partners “validate” what is working, as that evidence
may support further investment, reaching more youth, and spreading what
our partners have learned.
In the course of our work, we have seen community organizations with deep expertise contribute to how research gets produced, researchers with a desire to work with community organizations to define outcomes, and funders committed to advocating for a central role at the research table for community organizations.

On the flip side, we have heard program staff struggle with what the proposed research design means for their relationships in their communities and with their families. We have heard social workers wonder why validating their work to those with power starts with inputs that have structural racism built into them. We have seen funders, researchers and community organizations jump to action when research protocols or timelines are at risk, but stay silent when research does not produce a high quality outcome for the community. We have heard community organizations wonder why researchers who are disconnected from their communities have the privilege of telling them what works and what does not. And in listening to those we serve, Jonte, a young man in a program we invest in, unforgettably asked, “why am I always being researched?”

We recognize our privilege at Chicago Beyond as funders of both community organizations and researchers, and with that the power to shape research and the resulting knowledge produced. Therefore, we recognize the need for us to check our privilege, our power. At Chicago Beyond, we devote considerable focus to identifying and counteracting our own personal biases through anti-racist trainings, calling out inequities, and being in communities. We are a funder whose leadership is majority people of color. Today, our relationships with community partners are intensely human. We dedicate a team of individuals to our partners to provide thought-partnership, strategic support, a sounding board, and more. Our partnerships are multi-year, and are not contingent on research outcomes or compliance checklists. When the research conducted does not match the intent of the research, we look for our accountability even though the power dynamic might have suggested that a funder like Chicago Beyond does not have any.
SO WHAT?
Right or wrong, research can drive decision-making. If we do not address the power dynamic in the creation
of research, at best, we are generating partial truths to inform decision-making in the social impact space. At
worst, we are generating inaccurate information that ultimately does more harm than good in our communities.
This is why we care about how research is created.
This guide was designed to help us level the playing field and reckon with unintended bias when it comes to
research. It is about shifting the way community organizations, researchers, and funders ask for, produce, and use
knowledge. It is about restoring communities as experts. It calls on us all to stop, recognize, and question bias. It is
an equity-based approach to research. It is based on the steps and missteps of Chicago Beyond’s own experience
funding community organizations and research, and the courageous and patient efforts of our partners, the youth
they serve, and others with whom we have learned.
Chicago Beyond created this guide because we see the opportunity for a new path. Shifting the existing power
dynamic offers a way forward: From not enough evidence on what works, to more meaningful knowledge that
supports more meaningful action.
In one research partnership, we:

1. Engaged with our community partner at a human level and built trust.
ö Broke bread and spent time sitting on the stoop, as human beings, sharing our hopes and fears.
ö Participated in a Peace Circle, and asked that the research team participate in a Peace Circle
separately. This opened the door for the community organization to build trust with us and the
researchers (for definition of Peace Circle, see glossary on page 110).
ö Continued to nurture the relationships we built and hold today.
2. Supported access to information on the community organization’s own terms, and helped inform the
organization and community members.
ö Funded access to research based on the community organization’s learning goals.
ö Sat with the community organization’s Executive Director and the research partner to help the community
organization get informed about options for research and the related risks. Still, it was not until later, as
our relationship developed, that we were trusted enough by enough of the community organization for
them to truly recognize what the risks and costs were, and it was not until later that the research felt “real”
to the Executive Director.
ö Understanding that it takes time and multiple engagements to address the information imbalance, we
connected the community organization with another nonprofit going through a similar type of research to
share their perspective.
“When was the last time you were at a gathering and the ‘subjects’ of a study talked about it in glowing terms, because
the research helped the practical work?”
Today, in Chicago, we see new opportunity to shape urban research, for greater equity, and greater impact.
HOW THIS GUIDEBOOK IS ORGANIZED
With each inequity, there are suggestions for potential ways forward for community
organizations, funders, and researchers.
This is followed by specific questions in the guide to help community organizations, researchers,
and funders relate to one another differently, for greater impact. The guide organizes these
questions from start to finish of a research study, so users can add questions to their individual,
or collective, agendas at the appropriate time.
This guide is a start—it is by no means the answer. We wrestle regularly with operating in
this world, while envisioning an equitable one.
ĥ Start with this commitment and find new ways to relate to each other.
ACCESS.

THE CHALLENGE
Conversations about research often happen without community organizations or community at the table, or on an “invitation only” basis on others’ terms.
THE IMPLICATION
When a voice is missing from the table, the answers we get are insufficient. We may perpetuate bias, and fail to
find out.
THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to compromise what matters for the chance to produce research evidence.
ĥ Where possible, speak up to participate—or not participate—in research on your own terms, and shape research to help your community.

Researchers can...
ĥ Design research to serve community purpose.
ĥ Not participate in research that perpetuates the researcher as “brains” and community as “brawn” stereotype.
ĥ Insist that conversations about community happen with community.

Funders can...
ĥ Fund research that community organizations want, need, and are able to lead. Fund research that informs action on root causes.
ĥ Not fund research where the questions asked and the approach hold power dynamics in place.
ĥ Insist that conversations about community happen with community.
Pictured: North Lawndale
INFORMATION.
THE CHALLENGE
Information about research options, methods, inputs, costs, benefits and risks often resides with researchers
and funders, but less often with community. Often, the community does not have enough information to
contribute their wisdom to which questions are asked and why, what outcomes are the focus, and what data
sources are used—or to give informed consent to participate in the first place.
THE IMPLICATION
When one party does not have full information, it is difficult to partner effectively or to get to the full truth.
THE OPPORTUNITY
Community organizations can...
ĥ Get informed. Know your options, know your rights, know the risks.
ĥ Seek and use information to ask questions about methods and inputs.

Researchers can...
ĥ Share information, recognizing that without it, the community organization cannot actually consent to the research.
ĥ Have reciprocal exchange about methods and inputs.

Funders can...
ĥ Ensure accountability for the community organization to understand the options and the risks.
VALIDITY.

THE IMPLICATION
When outside experts hold the authority to produce and interpret knowledge, we diminish the value of
community voice. Without that community wisdom, we accept partial truths as the full picture.
THE OPPORTUNITY
Community organizations can...
ĥ Value the validity of your own voices at the table, especially on the questions, the inputs to answer the questions, and how participants experience the research.
ĥ Build relationship with the researcher.
ĥ Check partial truths.

Researchers can...
ĥ Recognize how the research frameworks, process and inputs reinforce power dynamics, and bring your creativity to making change.
ĥ Build relationship with the community organization. Check partial truths.

Funders can...
ĥ Be accountable for what questions the research you fund asks, and what processes and inputs it is (and is not) validating.
ĥ Create accountability for authentic engagement between community and researchers. This will check partial truths.
OWNERSHIP.
THE CHALLENGE
Community organizations often cede critical decisions about how the study operates, what to measure,
how study tools are developed, and what participants experience, because they do not feel they have equal
ownership in the research. This is solidified in legal agreements and permissions, which may say that the
community organization cannot access the data itself, does not own data produced by the study, or must seek
permission to speak about the research.
THE IMPLICATION
Without shared ownership, the process of research can take from, rather than build up, the community, and the
inputs and answers are incomplete.
THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to cede ownership.

Researchers can...
ĥ Invite co-ownership of research, in your processes and legal agreements.

Funders can...
ĥ Set expectations for co-ownership of research, in processes and legal agreements.
VALUE.

THE CHALLENGE
Research often comes at a high tangible and intangible cost. Community organizations and communities shoulder financial and relational costs that are not explicit, visible, or compensated.
THE IMPLICATION
Uneven accounting allows investment to be high while impact is low.
THE OPPORTUNITY
Community organizations can...
ĥ Get aware of the potential costs—including intangible costs to your participants, organization and community—and advocate for a full accounting.
ĥ Get clear and speak up on how research can produce value for your community.

Researchers can...
ĥ Recognize the full cost of the research—including intangible costs to participants, community organization, and community—and find ways to support it.
ĥ Shape research so producing value for the community is central.

Funders can...
ĥ Account for the full cost of the research—including costs to recruit additional participants, intangible costs borne by participants, staffing at the community organization—and support these costs.
ĥ Insist on clarity: How will the research benefit the community? According to whom?
ACCOUNTABILITY.
THE CHALLENGE
Often, funders and researchers choose whether or not to take responsibility and make changes when the way
research is designed unintentionally creates harm or does not work, while the community organization and
community bear the greatest risk.
Community organizations have to prove their effectiveness and fidelity, while funders and researchers are
exempt from the same scrutiny and vulnerability.
THE IMPLICATION
Without accountability, trust is limited, and the work cannot be as bold. Worse, communities can be harmed.
THE OPPORTUNITY
Community organizations can...
ĥ Build trust-based relationships with the other entities. Hold researchers with whom you have built trust accountable.
ĥ Identify and mitigate risks to you and to your stakeholders.

Researchers can...
ĥ Build trust-based relationships with the other entities. Be accountable to understand the context.
ĥ Own your role in missteps.
ĥ Help identify and mitigate risks.

Funders can...
ĥ Build trust-based relationships with the other entities. Be flexible in timelines so trust can develop. Be accountable to understand the context.
ĥ Own your role in missteps.
ĥ Help identify and mitigate risks.
AUTHORSHIP.

THE CHALLENGE
When data is analyzed and meaning is derived from the research, the power dynamic often mutes voices of those who are marginalized.
THE IMPLICATION
When we restrict authorship and ignore bias, we allow incorrect meanings to be drawn.
THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to cede interpretation and presentation of the results.
ĥ Participate in how results are made into meaning, and shared. Is it contextualized? Can you hear your participants?

Researchers can...
ĥ Invite co-ownership in contextualizing and sharing results. Analyze and frame data with an equity lens for greater impact.

Funders can...
ĥ Set expectations for co-ownership of contextualizing and sharing results. Create accountability for an equity lens for greater impact.
A NOTE ON SOURCES
We credit any wisdom here to the patience and courage of those we have worked with
and learned from.
While there does not yet seem to be a consistent practice or set of protocols for an
equity-based approach to research, we have learned from several sources of wisdom
in addition to experience: ancient and indigenous approaches to knowledge; systems
thinking; community-based participatory research; design thinking; racial equity and
cultural awareness; epistemology (for definitions see Glossary on page 110).
This guide can inform research and its close cousin, evaluation, but uses the word
“research” throughout.
“We have to try to see from participants’
perspectives, to understand the impact
of what we are doing.”
Organizations have experienced benefits from research, such as:

ĥ Better understanding of how the program is working, what parts of the program are working best, or which
participants are best served. Sometimes called an “implementation evaluation,” this research can help
organizations figure out what is core to the program and if it is being delivered as intended. These types of
learnings have helped community organizations improve how they recruit instructors or teachers, how they
recruit participants, what parts of the program to focus on more, or how they measure program quality so that
what matters most, happens most.
Along with the benefits, there have also been costs, such as:

Hidden costs of implementing the research study.
ĥ Research can change the day-to-day processes within a community organization. For example, organizations have had to close the door on providing services to some young people for an extended period, or have had to move from a first-come-first-served model to over-recruiting and then selecting and rejecting participants by lottery, to build up a big enough group of people in the research study (some types of research require a big group of study participants in order to attribute impact to a program).
ĥ Research took much more work and time from program staff than the community organization expected.
ĥ The experience of the research reinforced a sense of powerlessness because the researchers were perceived as “the experts,” or triggered memories of previous exploitative interactions with institutions.

Lost energy and time due to gaps in understanding.
ĥ For example, researchers did not understand the community organization’s recruitment process, and the community organization did not understand the assumptions in the researchers’ math, when setting the recruitment target for the research.

Unmet expectations.
ĥ The research study did not show effectiveness. Only a fraction of randomized controlled trials of interventions—across business, medicine, education, and employment—show a positive impact compared to usual services without the intervention. (You can find more detail on this in the piece from Arnold Ventures, formerly The Laura and John Arnold Foundation, listed in the Bibliography on page 111. “Randomized controlled trials” are defined on page 37.)
ĥ The research study did not show effectiveness because of a problem with the research design. Many studies are not designed well enough to draw a conclusion of whether the program “worked.” For example, a research study could fail to capture the impact of a program because that impact shows up over a much longer timeframe, or across a broader ecosystem, than was feasible to study, or because there were not enough study participants for the research to attribute impact to the program.
ĥ The community organization expected the research would explain how (not just whether) the program worked to help them scale up.
ĥ The research measured an outdated program model because the program evolved while the study design was fixed.
ĥ Easy-to-measure metrics did not capture what mattered to the program, or captured injustices beyond the control of the program or its participants. For example, research comparing arrest rates could capture increases in neighborhood policing, outside of the control of a neighborhood program.
ĥ Even after a very successful study, there continued to be a lot of work and a large gap in time to get funded.
If you do decide to participate in research, there are a range of ways to begin. Your
research can draw on multiple approaches, depending on where you see benefit in
an external perspective or voice, and where your wisdom and capabilities lie. Like all
people, individual researchers may be biased towards approaches they are familiar
with, and their emphasis on first building trust with your organization will vary.
It is important you get informed. Where you feel you can, engage with funders and
researchers and speak up. When you do this, you further relationships and mutual
accountability. This guide can travel the journey with you.
FOR COMMUNITY ORGANIZATIONS

1.
KNOW YOUR ROLE,
KNOW THE RISKS

EQUITY IN HOW YOU START
ĥ Your goals for the research. Start with fill-in-the-blank statements. On a blank sheet of paper, write down a
few outcomes that you would like to see at the end of your research project. Consider: What are the questions
you want to answer? What is your purpose in “filling in the blanks” and answering these questions? How do
the answers support your goals as an organization? Should the sentences focus not only on individual change
but on interpersonal change, change to families, or community change, to honor the work you do? You can
share these statements with researchers when you start working together.
For example… “First, we want to write strong applications for state government funding for violence
prevention, so we want to say our program reduces participants’ violent behavior outside of the program by
[percentage/measure], citing a rigorous outside evaluation. Second, we want to show that it is not just about
the participants, but also the families of participants that grow stronger through our program and become
advocates of change. Third, we would like to identify early indicators that affect whether a participant will
complete the program or not.”
Organizations have found research useful in day-to-day work when it identifies early indicators of the
overall goal, if staff or participants can affect those indicators. Organizations have also found it useful to
ask staff: What 3-4 pieces of information would help you to do your job better?
ĥ Your target audience for the research. The type of data you need to “fill in the blanks,” whether small
scale self-collected data or data from a large scale randomized controlled trial, depends on your audience.
Are you looking to improve the effectiveness of the work, and would this require making changes to services
along the way? Are you looking to evaluate outcomes to make the case to funders—what type of funders? Are
you looking to establish a case for change to a specific government entity?
Next, get to know the research partner and what they are bringing to the table:

ĥ Researchers’ openness to building trust. When you meet researchers, look for the foundations of a trusted relationship. For many community organizations, the instinct based on previous experience is ‘never give data to someone I don’t really know.’ Some community members remember exploitative or unethical interactions with research institutions. Community organizations have shared how rushing for the sake of timelines, rather than taking the time to share history and build relationship with researchers, can be counterproductive.

ĥ Researchers’ motivations. What motivates their interests in this work? What interests them most about this collaboration? Have they spent time in the community where you work?

ĥ Researchers’ agenda. What are their intentions for the research in the context of their professional work—what is their agenda? The value of sharing agendas is in the trust built together. Is the work with you supporting papers they intend to publish? Is it enabling them to fundraise for their institution? Is it meeting the requirements of funding they have already received?

ĥ Researchers’ relevant experiences. What stories can they share of their work that illustrate how they would work with you? How has community participated in identifying the goals of the research in previous projects? In designing the study and in particular how participants experience the study? In developing and testing survey instruments? In collecting data? In interpreting what the data mean and sharing results?

Discuss with the research partner what is realistic:

ĥ Questions asked. How do you prioritize the questions you want to answer through the research? Taking into account the effort and change required to tackle each question and who would need to make that effort (the researcher or the community organization), which questions will you focus on now and which will you defer?

ĥ Information generated.
ö What types of information can the research give you, and on what timeline? When will initial data be available, and what actions or decisions will it make possible?
ö What answers to questions or new learning can you be certain of getting, even in the “worst case,” and what is the potential “best case” of producing the research or evidence you are planning for? What will need to go right for the “best case” to happen?
Few outside of research institutions have this understanding unless they have previously experienced
research projects, but without it, it is difficult for a community organization to participate with full voice.
One method of gathering data may be the ‘gold standard’ on paper, but may not fit your purpose, mission
or organizational stage.
What are the options for the research? Some terms that come up frequently:

PRE/POST STUDY
Shows that a change occurred, but does not show your program caused the change to happen.

QUASI-EXPERIMENTAL STUDY
It does not involve assigning participants to two different groups and studying both groups, and therefore asks less from your organization and your participants. This will reduce your flexibility to change program elements during the period of the research. Will this generate what you are trying to learn? Is this rigorous enough for the audience you want to reach?

MIXED METHODS STUDY
A “mixed methods” approach mixes numbers and stories, and can provide the best, and worst, of both worlds.
ĥ Risks. What is the risk of each possible approach “failing” (i.e., not generating anything of value to your
organization)? For example, if a randomized controlled trial is not sufficiently “powered,” meaning there are
not enough participants in the research, it might not be able to show effectiveness even if your program is
actually effective. If a research study is not measuring the most relevant outcomes for your organization, the
research study might not show the true impact of your organization.
ĥ Costs. What will be the tangible and intangible costs of each possible approach?
ö For example, what additional or different work will staff have to take on?
ö What leadership and change management work will be needed to support your organization with the emotional
realities of new constraints? What will need to be formalized, or made into a manual, for the research to be possible
and will there be resistance to this that will draw on your organization’s energy or time, or strain relationships?
ö What are the specific costs to how you do your work of recruiting participants for the research? There is more
detail in the section called Plan for study recruitment on page 46, but at this stage, understanding the number of
participants your organization will need to recruit for the study to work is important.
To understand this, at the table, you need:
• (1) the voice of operations (for example, outreach, intake) to explain the process from interacting with a
potential participant for the first time all the way through program completion. Where does attrition happen
and how much attrition happens? How do participants graduate from or complete the program, and what
does this mean for how many new participants can be recruited per week or month? Will the study change the
process such that your historical enrollment or attrition might change?
• (2) the voice of your research partner to explain their best estimation of what number of participants
completing the program is likely to be needed for the research study to show an effect that researchers
consider valid.
• (3) to put these pieces together, to understand the cost to your organization and to your participants of recruiting the numbers needed. (A rough numeric sketch of this arithmetic follows this list.)
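
To see how these three voices combine, here is a minimal back-of-the-envelope sketch in Python. Every number in it is invented for illustration; your researcher's estimate and your own intake history would replace them.

    # Hypothetical recruitment-funnel arithmetic; all numbers are invented.
    completers_needed = 150   # (2) researcher's estimate of program completers
                              #     needed for a valid result
    control_share = 0.50      # share of recruits randomized out of the program
    attrition = 0.30          # (1) historical share of enrollees who leave
                              #     before completing the program

    # (3) Work backwards from completers needed to total recruits:
    must_enroll = completers_needed / (1 - attrition)   # 150 / 0.7 -> ~215
    must_recruit = must_enroll / (1 - control_share)    # 215 / 0.5 -> ~429

    monthly_intake = 25       # how many new participants intake can absorb
    months_needed = must_recruit / monthly_intake       # -> ~17 months

    print(f"Recruit roughly {must_recruit:.0f} people over "
          f"{months_needed:.0f} months")

Even this crude version makes visible why a study can reshape recruitment: serving 150 completers in this invented scenario means engaging with roughly three times as many young people.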
ĥ Restrictions. What are the restrictions imposed by each possible approach on your organization?
ö For example, community organizations are often innovators and adaptors, but many research methods
require core elements of the work to remain the same for the duration of the research, which can be years.
Are you willing to not make substantive changes to your program for the period of the study?
ö For example, if you choose a research design with random assignment, while the number of participants
you serve may not change, the number of participants you need to engage with, and your flexibility to
decide who gets services and who does not, may. For programs that have kept an “open door” for young
people who have participated to leave and come back, random assignment may mean closing the door to
services for a group of young people for an extended period. It may mean many youth in the neighborhood
where you live and work interact with your organization, and then get “randomized out” of receiving
service. For programs with age limits, random assignment could close the door to the program for a young
person altogether.
ö What do you compromise if you do not have discretion to decide who gets services?
ĥ Learning from peers. Is it possible to speak with another community organization participating in a study
to ask about their experience and risks to be aware of? Is it possible for key staff from your organization to
listen to how another organization addressed risks outlined here? Community organizations have commented
that this dialogue helped them ask their research partners the right questions. Can the funder or researchers
make connections for you, particularly to a community organization with a similar depth of relationship with
their participants (e.g., a few hours once a week versus a deep ongoing relationship with a participant and
their family)?
ĥ Existing data habits. Does your organization have habits of looking at data, reflecting, and learning? Or do
numbers usually come up only for funding proposals or report-outs to the board? Do you have data infrastructure
you can build from, such as spreadsheets or other ways of collecting information, or habits of using qualitative
and numeric data in day-to-day operations? Are staff already spending time building and maintaining this data
infrastructure, or will you need to identify staff and free up capacity to participate in research?
ĥ Staff’s experiences with research. What are staff members’ appetites for research? Who among your staff,
volunteers, partners, and community members could help shape research?
ĥ Program readiness. Is program infrastructure in place to enable research, for example consistently delivered
key elements of a program? If the research will require new “standards” or “manualization,” which means
writing down how you interact with program participants in a manual and doing it in a consistent way, will
your organization embrace this, or resist it as too institutional? This does not mean every participant must
receive the same services. For example, a study may allow for variation in what goals your participants set,
but measure progress on goals consistently.
“We know it works. So what
are the papers and numbers
going to actually do for us?”
For the executive director/leader of a community organization preparing for the change that a
research effort may bring:
ĥ Listening. Before committing to research, how can you listen to all the reasons for why research should not
happen? Are staff concerned the numbers will not tell the whole story? That there will be copycats? That
their voice will not be heard or valued? While from the researcher or funder’s perspective, the research study
may be separate from the program, for your staff and participants, it is often all part of the work. Several
organizations have shared with us that it was not until these conversations happened that the impact of
the research became “real.” One-on-one conversations with naysayers, for example, staff, participants or
community members, can be a valuable opportunity to understand reasons for reluctance and opposition.
Listening in this way can inform whether and how you engage on the research, improvements to the research
plan, and how you communicate about the research.
ĥ Communicating.
ö How can you educate, support and influence your staff for a successful study? Does the entire team understand
what is happening, why, and how it will impact their work? How, and how often, will you articulate the potential
benefit of the research to your organization’s mission? How will you be open about the challenges and risks, and
about how workloads will shift as a consequence of doing the research? For larger nonprofits, how will you motivate
leaders within your organization to engage in these conversations deliberately with staff on their teams?
ö Who are other stakeholders you need to engage, for example, the community organization’s board, other funders,
community partners, participants, participants’ parents? Will having the researchers speak with these stakeholders
help you?
ĥ Champions.
ö Who, within your organization, tells stories that people hear? Are they informed about and supportive of the
research? How can you engage them?
ö As staff turnover is frequent in many community organizations, are there additional members of your organization
who can be included in the discussions from the start, so understanding of how and why the research began does
not sit solely with one or two people?
2.
COMMUNITY
AND VOICE:
SETTING UP THE STUDY
“Bringing implementors and researchers
in close proximity is so important—so
we can have conversations without fear.
The power dynamic can hold up deeper
collaboration.”
Research is traditionally set up to keep researchers separate and “neutral.” Engaging and getting proximate
is an orientation the community organization and researchers may need to cultivate together, and create new
structures for.
One starting point is face-to-face engagement between researchers and community, participants, staff, and
partners. Another is to start by outlining the potential benefits and harms of the research, and determining how
to hear the voices of those potentially affected through the process.
ĥ Identification of voices needed. What are potential benefits of the research, and who receives them?
What are the potential harms, and who carries them? For example, what processes of your organization may
be changed or interrupted? How will staff relationships with participants be affected? These can be written
out in a T-chart (a piece of paper with a large letter “T” creating two columns) with potential benefits in one
column and potential harms in the other, following the practice of health impact assessments, environmental
impact assessments and racial equity impact assessments. The individuals and groups named on the chart
may suggest the starting list of whose voices are important to hear.
ĥ Reciprocal engagement.
ö How can the researchers and community organization staff, participants, partners, or community members engage
face-to-face, and see each other as humans? Are Peace Circles an appropriate early engagement? Breakfast forums
for partners? Evening forums for community? For researchers, having the opportunity to connect with the work
on a human level can provide perspective and, in the words of one community organization, “a positive lens, not
just focusing on statistics of demise.” For community organizations, engaging with the researchers formally and
informally helps build a relationship of trust.
ö In some cases, you may assemble a community evaluation committee representing those voices, which may make
decisions about or give input into what is researched, the outcome measures used, and which engages regularly
with the research team. The committee may include staff of your organization, researchers, program participants or
community members. What is the mandate of your group? What powers does it truly have? How often will it engage
with the research team to not burden participants or overwhelm the research team, but also to effectively contribute
to the study design and execution? (In the Community-Based Participatory Research approach, this group is called a
Community Action Board and steers both the research and the related action.)
ö How will you engage with all staff who are affected by the research effort to seek input, while the study remains in
its formative stages? How will you engage community members who may be affected by the research effort? Are
community meetings appropriate? What is the mechanism to gather and incorporate input?
ĥ Engagement at the right times. When during the process can you still shape study design? Have you
discussed with the researchers the timing of when they will seek “IRB approval” or register the study’s
“pre-analysis plan?” An “IRB” or Institutional Review Board is an administrative body that confirms that
certain ethical considerations in research are met. A “pre-analysis plan” commits to what the most important
outcomes and approaches will be in your research. Once these steps in the research process occur, you have
limited ability to change the study design while safeguarding the validity of the study.
“Numbers only tell a piece of the story. Some of what we do, you just
can’t quantify—driving a kid to a job interview after hours, where the
bus doesn’t go. It’s unconditional love that is keeping kids alive. But,
there are things that can be quantified.”
Pictured: Youth Advocate Programs, Inc. group outing
Take the example of recidivism. On the one hand, it is a common metric, with existing data sets. This means
it costs less money to track and allows comparison across programs. On the other hand, it does not capture a
program participant’s directional progress, while capturing things outside of the program participant’s control,
such as a concentration or increase of arrest efforts in a community. A piece by Jeffrey Butts and Vincent
Schiraldi on benefits and shortfalls of recidivism as a metric is listed in the Bibliography on page 111.
What is measured can be a productive focus for a community evaluation committee and researchers, each
contributing their expertise.
One example is of a prison-based fatherhood program (you can find the article by Abigail Henson listed at the
end), where dialogue between participants, community organization staff, and researchers changed what the
study measured: the unit was changed from the father to the family; the short-term measures were changed,
from depression and stress, to pride and reconstruction of masculinity (from provider to caregiver); the long-
term measures were expanded, from recidivism, to whether the father-child bond remained active and positive.
ö What effects of the community organization’s work are important to capture in the research? Is the program affecting the
system or the capacity of the community, for example supporting cultural revitalization, changing power relations,
or increasing the capacity of the community to solve problems? Effects of the ecosystem on a participant, effects
of change by the participant on their families or communities, or broader impacts of a program’s work, may be less
simple to research. However, this may be critical to a community organization’s impact where easy-to-research
approaches have not worked.
ö Some community organizations have found that talking with trusted friends and peers has helped them to identify
additional outcomes to measure. Sometimes those outside the day-to-day work can help you to see your impact.
ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics within
your organization in a way that is respectful, and that recognizes that numbers are an incomplete picture?
“If I could start all over, I’d ask ‘What’s
the power number?’”
Several community organizations have noted that a lack of shared understanding between the researchers,
those with program operations expertise, and those speaking for the community organization led to lost effort
and time, and anxiety, in recruiting participants into the research. In addition, optimism for the research
from the community organization’s leadership can contribute to blind spots when it comes to planning for
recruitment and retention, which can then make the research less effective.
ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together the participant group, or
“cohort,” for a research study? Who is closest to the work? Outreach? Social workers? Program directors?
Former participants?
ĥ Recruitment process. What is the detailed recruitment process based on these voices of expertise?
Importantly, the research itself can affect both the recruitment process, and attrition. For example, the
research could change how participants apply to the program, from interested young people applying, to
schools generating lists of young people invited into the program; as a result of this shift, attrition will rise
from what the program has seen in the past. Where does attrition happen in the process and by how much?
What number of participants would need to start the recruitment process to end up with a certain number
of participants completing the program? How frequently will participants move out of the program and what
does this mean for how much time it takes for the target number of participants to complete the program?
What does “completing the program” or “graduating” mean for your organization, operationally?
ĥ Recruitment target. The higher the number of study participants, the easier it is to show a scientifically valid change, but the greater the effort to recruit participants. For the research study to show an effect that researchers consider valid, how many participants does your researcher estimate need to complete the program? Piecing this together with the program’s process from the start of recruitment to program completion, what total number of participants must initially be recruited? Having the researchers and operational team together name different scenarios, then explore the challenges and the manpower related to each, can be a helpful approach. (A rough sample-size sketch, with invented numbers, follows this paragraph.)
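
For readers who want to see the kind of calculation that can sit behind a researcher's estimate of the "power number," below is a minimal sketch using the open-source Python library statsmodels. The effect size, thresholds, and attrition rate are all invented for illustration; your research partner's actual model may be more involved.

    # Rough sample-size ("power") sketch; all assumptions below are invented.
    from statsmodels.stats.power import TTestIndPower

    effect_size = 0.25   # assumed standardized effect of the program (Cohen's d)
    alpha = 0.05         # conventional false-positive threshold
    power = 0.80         # conventional chance of detecting a true effect

    # Completers needed per group (program and comparison) for a two-sample test
    per_group = TTestIndPower().solve_power(
        effect_size=effect_size, alpha=alpha, power=power, ratio=1.0
    )

    attrition = 0.30     # assumed share of recruits lost before completion
    recruits = 2 * per_group / (1 - attrition)

    print(f"~{per_group:.0f} completers per group; ~{recruits:.0f} total recruits")

Note how sensitive the target is to the assumed effect size: halving it roughly quadruples the required group, which is why walking through scenarios together matters.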
ĥ Resources required. What will your organization need to change to achieve the recruitment target? One
community organization believed their social worker could continue to manage recruitment as the research
study began, and later found they needed a dedicated person spending 30 hours per week to adequately
support recruitment and build new referral partnerships.
ĥ Data required. What other data about participants will researchers need to collect, and how will these be
collected by researchers, or by staff and passed to researchers?
ĥ Communications.
ö How will the recruitment target and recruitment process be communicated to recruitment partners, staff, community
or participants?
These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?
ö Are breakfasts or evening community meetings appropriate? Should researchers be present to answer questions? What
about a school-night kick-off? Are one-on-one conversations between researchers and recruitment partners/staff
leading recruitment most appropriate? Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?
“I knew my fathers in the control group
needed to be compensated, and we
made it happen. Their time is valuable,
and they weren’t getting the benefit of
being in the program.”
For the community organization, community members, or research participants planning with
researchers for the study:
ĥ Recruitment strategies. What specific strategies can be used to bring participants into the study and
retain them through the course of the study? There is a “tax” on producing knowledge about those most
marginalized: it is harder for researchers to connect with those with instability in their lives; it is harder to
obtain consent from those who have learned to distrust institutions; it is harder to retain in a research study
those who experience greater barriers. How can social media be helpful? Will multiple redundant strategies
be feasible, to increase success?
The article “Study Retention as Bias Reduction in a Hard-to-Reach Population” by Columbia Professor Bruce
Western and colleagues referenced at the end may aid brainstorming on study recruitment and retention, e.g.,
the timing of financial incentives at the start of a study (if applicable); frequency of contact; back-up contacts
including mothers and supportive secondary contacts.
ĥ Responsive communications. What is the best way to communicate about what matters to participants,
for example:
ö What is the best way to communicate about privacy? Concerns nonprofits have shared, from their participants,
include: Who all will know I am in the study? Will my name be published anywhere? How much will they be in my
life? Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police?
ö What is the best way to communicate about benefits? Participants have asked: How does being part of a study
help me?
ö What is the best way to communicate about expectations? Participants have asked: Am I allowed to
participate in other programs/employment during the study? Can you still help me if I am not in the treatment
group? Can I apply again? Who is going to help me if you can’t?
ö Be aware that there may be misunderstandings about research preceding your study.
ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants). If this is not
an area in which your team is experienced, seek out expertise so you are equipped to answer these questions.
ĥ Discussion topics.
ö Feedback on the research effort so far, and what
needs to be done to address it.
ö How are inequitable approaches, methods,
measures filtering into the study, and what are
opportunities to do differently? For example, are
you unintentionally taking advantage of what
your privilege allows you to do, such as dictating
meeting times and locations? How is your
work creating a way of operating intentionally
distinct from the legacy of “research brain” and
“community brawn?”
ö How are relationships of trust being formed, and
how is the team interacting as equals? Note that
while researchers’, staff, and community members’
roles in the research differ, the point here is that
no one is treated as superior or inferior.
3.
COMMUNITY
AND VOICE:
DURING THE STUDY
ĥ Feedback loop.
ö How are staff, participants’ and community members’ voices being heard during the course of the study? What
feedback is being shared and what can be done to address the feedback and communicate back to those who
shared it? Some organizations have found a biweekly conversation among researchers and those at the community
organization involved in implementing the study tremendously valuable to ask and answer questions and to plan for
each new step in the work in an organic and effective way.
ö Are you remembering to praise and reinforce the specific actions your research partner is taking in service of
equity?
ĥ Contextual awareness.
ö How can the community organization’s leaders and researchers best collaborate so study tools fit the specific
context and also have the validity desired?
ö How will draft study tools be vetted by participants? Are surveys of an appropriate length?
ĥ Execution.
ö Will certain types of data, for example social security numbers, be a challenge to collect in the specific context, and
what are alternatives?
ö If researchers are engaging with a representative group, for example a focus group, is it a meaningful “group?” How
can you and researchers together create the conditions for voices to be heard?
ö Can community members be hired to help collect data or conduct surveys?
4.
EQUITABLE NUMBERS
FOR IMPACT
ĥ Early learnings. What are early learnings from the research and how can these be shared with staff,
participants, community, or partners to show what their hard work is yielding?
ĥ Early action.
ö What improvements to the program are possible during the research, given the context of your organization and the
research design, which may limit what changes can be made?
ö How will those responsible for making the changes engage with the data early on so they can plan for action?
ĥ Sense checking. Is there a role for your organization’s staff in sense-checking early data or preliminary
results? For example, if a researcher is looking for themes in stories from participants, how is the researcher
identifying and interpreting important themes (“coding” qualitative data)? Are preliminary results consistent
with the experience of your organization, or do they look wrong?
[Figure: two charts, each labeled “Average after the program”; an annotation notes that the average grew because of a few participants.]
If your community organization’s mission is to serve participants facing barriers, or if you are innovating and
looking at whom you could serve, a key purpose of the data is to help you understand who you are succeeding
in reaching (and who you are not reaching); who is benefiting most from participating in your program (and who
is benefiting least); and what can be learned from the complex stories beneath these numbers to help you do
your work better.
ĥ Disaggregated data. Against whatever metric you have chosen to measure how your program helps your participants (take, for example, grade point average): how much change does the program create for participants starting out furthest behind, versus participants in the middle, versus participants starting out furthest ahead? (This must be done in a way that continues to protect individuals’ privacy.) A minimal sketch of one way to make this cut appears after this list.
ĥ Counting inequity.
ö Has the correlation between race and outcomes, or class and outcomes, changed as a result of your organization’s
work? This is one way to look at impact for equity.
ö How can your organization use these data and stories to communicate your impact for equity?
ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example the
implications of stress experienced by program staff as a factor in selecting what productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. If you want to produce cost-benefit numbers, how do researchers propose to capture
systemic effects? Accounting for “cost” of the way things are right now only in terms of tax dollars—which
are often actually wages that benefit a different group of people—rather than the actual social cost, can
unwittingly and incorrectly build a case against investment in your work.
ĥ Capturing growth. How does the presentation of the data reveal growth journeys which may not be linear?
Does the comparison point allow for—to use substance abuse terminology—relapse in the context
of recovery?
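To make disaggregation concrete, here is a minimal sketch in Python (not from this guidebook) of one way to cut change on a chosen metric by participants’ starting points. The column names and numbers are hypothetical, and any real cut must still protect privacy, for example by suppressing small groups.

import pandas as pd

# Hypothetical participant records: the chosen metric (here, GPA)
# measured before and after the program.
df = pd.DataFrame({
    "gpa_before": [1.4, 1.8, 2.2, 2.5, 2.9, 3.2, 3.5, 3.8],
    "gpa_after":  [2.2, 2.4, 2.6, 2.7, 3.0, 3.3, 3.5, 3.9],
})

# Split participants into thirds by starting point: furthest behind,
# middle, furthest ahead.
df["starting_group"] = pd.qcut(
    df["gpa_before"], 3,
    labels=["furthest behind", "middle", "furthest ahead"])

# Report the average change per starting group, not one overall average.
df["change"] = df["gpa_after"] - df["gpa_before"]
print(df.groupby("starting_group", observed=True)["change"].mean())

In this made-up data, the “furthest behind” group gains the most, a pattern a single overall average would hide.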
5.
SHARING
RESULTS
a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.
As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.
ĥ Context. What history and explanation of structural and systemic factors is important to frame the challenges the
organization is addressing? To explain fully why the problem exists in the first place and the complexity of root causes
and pathways? How can this show up in the description of the organization’s work and dissemination of the research?
ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work in the
community, or its type of work? How have these narratives served participants well? How have they harmed them, or
reinforced inequities? With this understanding, how can this research be framed to take on unjust narratives? What
cultural context is important to tell?
ĥ Limitations. How are limitations of the data clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? “Objective” data like special education designations, census data,
crime that is measured by arrests, or domestic violence information, can incorporate racialized processes and
lead to incorrect interpretations, without context.
b. Can you hear your participants? Are you signaling that lived
experience is valuable?
Valuing data to achieve an end—whether securing funding, improving programs, sharing learning with the
field, changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring
a participant’s voice requires intention; it may not just happen from documenting a participant’s story,
demographics or outcomes.
ĥ Participant voices. Can you hear your participants in what you, the researchers, the funders, are disseminating about
the research? How are images, stories, numbers resulting from the research effort putting participants at the center in
how they are shared, versus treating them as objects of a study or as tokens to lend credibility?
ĥ Authorship. Is there an opportunity for you or your participants to own elements of or co-author what is
produced, or be editors? Or does it serve you to have an external author, because an external person is
perceived as not biased?
ĥ Respecting experience.
ö How does the presentation of results message to the audience that experience is valuable and valid, rather than
reinforcing the bias that university expertise produces validity?
ö Would participants find that what is being put out is true to their experience?
ĥ Language.
ö Is the language used as easy to understand as possible? Do people from different cultures, with different lived
experiences, with different technical backgrounds understand the results of the research and the “so what” of what
it means, when you test an early draft?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds
can understand what has been counted, how, and based on what judgments? Are data tables and charts readable to
those without research and statistics backgrounds? Are any technical terms used defined in plain language?
“What are the inherent biases we
hold? The quantitative paradigm says
it is ‘controlling for these’. But what
are the biases built in to how your
“knowing” works?”
FOR RESEARCHERS

“People ask: ‘How do I get started?’ How do I encourage voice? How do I use my voice? There is academic theory, but people have challenges doing it. That’s where this work comes in.”
Depending on the kinds of research you do, your research institution, and
your own experience, the action you take will look different.
For example, you may:
ĥ Reflect individually or engage with your colleagues and institution on biases and how these flow into
your research
ĥ Change how you engage with community to identify research questions and study outcomes
ĥ Propose timelines for research differently, for example to support trust-building, or to develop survey
instruments with community input and community testing
ĥ Interrogate numbers and stories you lift up, and use different framing in what you publish
ĥ Evaluate your own work differently, or engage your funder stakeholders differently
Making intentional change can feel messy and uncomfortable. It requires openness to new perspectives
and unlearning old ones. It requires shifting power dynamics, departing from how “it has always been
done.” Starting from relationship and accountability, researchers can unlock immense creativity, to
achieve the promise of what knowledge can yield for communities.
1.
KNOW YOUR ROLE,
KNOW THE RISKS
EQUITY IN HOW YOU START
a. Bring awareness
How research gets done—the approaches, methods, metrics—has a system of assumptions built in.
Some sources of context include MIT historian Craig Wilder’s Ebony & Ivy on the history of elite academic
institutions in justifying inequities, and the Harlem Children’s Zone’s writings on their decades of experience
“being researched” (you can find both in the Bibliography on page 111).
In seeking more equitable approaches, one place to start is understanding context and biases.
ĥ Systemic awareness. What are the assumptions built into the research approaches you use most
frequently? How do these approaches reinforce the privilege of those already powerful? Whose competence,
capacity, or fidelity needs to be proven, and whose is taken on faith?
ĥ Institutional awareness. What is the history of research in this community, and the particular history
for this community organization and these participants? How have research institutions previously been
experienced? How does your organization benefit from this dynamic? What experiences, e.g., being reported
to child services, might a research effort bring to mind, even if not technically related to the research?
ĥ Personal awareness. Everyone comes to their work with some personal knowledge and assumptions. What
methods to create knowledge are you predisposed towards? What are your assumptions about the people
and the context of your work? For example, if employment is a focus of the study, what are your assumptions
about “valid” employment?
ĥ Awareness of what is at stake. From the community organization’s perspective, what benefits may
participants in the research see and how certain are those benefits? What may be the cost to participants, to
the community organization, to the community? What are the risks of producing the research or evidence you
are planning for? From your own perspective, what benefits may the researchers and their research institution
get from the work? How certain are those benefits and what risks do the researchers bear?
ĥ Sharing your agenda. Sharing agendas openly contributes to building trust. How does this work fit into your
professional research agenda? What is your other current research? What are your intentions for the work and
your research institution’s priorities?
ĥ Sharing your relevant experiences. What stories can you share of your work that illustrate how you would
work with the community organization?
“Our study is working well because everybody is unveiling themselves,
taking their degrees off the wall, rolling their sleeves up.”
ĥ Target audience. Where does the organization intend for the research to land, and what type of data and
what research design serve that purpose, with least burden on the organization or its participants? How can
you shape the research design and approaches to serve the purpose intended?
ĥ Organizational context. What are the organization’s existing approaches to reflection and evaluation that precede your collaboration?
ĥ Information generated.
ö When does the organization look to act on learnings? When will initial data be available, and what actions or
decisions by the organization can it enable?
ö What answers to questions or new learning can the organization be certain of getting, even in the “worst case,” and
what is the potential “best case” of producing the research or evidence you are planning for? What will need to go
right for the “best case” to happen?
ĥ Options.
ö What type of study is sufficiently rigorous for the intended audience while minimizing burden on the organization?
What type of study is valuable to future audiences, or enables systemic change if that is envisioned by the
organization? How does understanding how the program is working fit into the picture? Would a qualitative or mixed
methods study suit the organization’s learning objectives? You might share a simple table such as the one that
appears in the community organizations section under Know your options.
ö What approach to research—from traditional, to community-engaged, to full partnership between community and
researcher—is most suitable? (more on Community-Based Participatory Research appears in the glossary on page 110).
ĥ Pressures you face, experience you have. Are there institutional pressures you as a researcher face to
conduct certain types of studies, or are there types of studies where you lack experience, and have you made
those explicit to the organization?
ĥ Benefits, risks, costs. What are the implications—benefits, risks, costs—of each approach, from the
organization’s perspective?
ö What are the benefits of each possible approach? What ongoing benefits could the community organization see
after the research ends?
ö What will the organization be asked to compromise? For example, would the organization not be able to change
key program elements during the period of the research? What would this mean for the organization—recognizing
community organizations are often innovative and adaptive by their nature—and for its participants? Would the
organization’s flexibility to select who receives services be affected? Would the organization need to close the door
to services for a group of people for an extended period? For a program with age limits, could the research design
close the door to the program altogether for a young person? What would the organization compromise if it no longer
had discretion to decide who gets services?
ö If the research will mean that the organization must manualize or standardize its work, will the organization’s context and
culture embrace this, or will there be significant resistance, which will cost energy and time, and strain relationships?
ö What data sets will be used in each option for research you are considering, and how does access to those data
work? Will the community organization be able to access the data after the research study finishes?
ö What may be the costs to the organization and to participants of recruitment for the research? A common topic of
miscommunication is the number of participants the organization will need to recruit for the study to work.
This is covered in more detail below in the section called Plan for study recruitment on page 70.
ĥ Learning from peers. Can you connect the community organization to another organization that has gone
through the type of research being considered? Is it possible for key staff from the community organization to
listen to how another organization addressed risks?
ĥ Role of Principal Investigator. What will the Principal Investigator take responsibility for? Is it appropriate
for someone from the community organization to act as a co-Principal Investigator? Or is research by an
external third party important in this context?
ĥ Property rights. Discuss intellectual property rights and data rights. Who can access the data and when?
When will data be processed and shared—at regular intervals? Who can speak about the data and publish the
data? Whose consent is required, when?
ĥ Signed contract. In signing the memorandum of understanding or similar written contract, bring your
attention to how you are formalizing the above.
ĥ Cost distribution. How can you work with the community organization and community to identify costs
generated by the research, and discuss how these costs can be shared? What will participants or partner
organizations be asked to do differently for the research, and should this be compensated? Will the research
generate new staff responsibilities? Will it generate communications, change management, or other costs to
the organization? Can researchers share in this cost, in kind, or financially? In our experience, researchers
have raised funds, for example to compensate participants for their time, or for stipends for partnering public
schools. Can the funder support these costs?
2.
COMMUNITY
AND VOICE
SETTING UP THE STUDY
One starting point is face-to-face engagement between researchers and community, participants, staff, and
partners. While community-based research practices offer examples of how to structure this engagement,
this is human-to-human work, not a check-the-box exercise to create a particular community hearing or steering committee structure. Spending time with the organization and breaking bread fosters relationships and
understanding that matter at a human level, and equips researchers to recognize—and as a result address—
ways the power dynamic gets in the way of impact.
ĥ Relational engagement.
ö How can you spend time with the organization and the community in which the organization resides, and what
questions can you ask, to begin to build relationships of trust? To signal your intent to listen and be a partner? To
show, not talk about, your humility? To be able to put yourself in the organization’s shoes?
ö How will you, personally, listen for and hear the subjective experiences of those affected by the research? How can
this be built into your research timeline? As one researcher notes, good intentions and interview guides are not by
themselves enough to lead to interactions that promote equity.
ö Are Peace Circles an appropriate early engagement? Breakfast forums for partners? Evening forums for community?
What are the opportunities to engage without agenda, for example at graduation events, community meetings,
performances, where you are not conducting observations?
Often, the framework into which researchers are thrust sets up stakeholders as a group to be “managed” in
order to “reduce non-adherence.” Inclusion can seem “messy.” One opportunity is to start by outlining the
potential benefits and harms of the research with the community organization or community members, and to
determine how to hear the voices of those potentially affected through the process.
ĥ Identification of voices needed. What are potential benefits of the research and to whom do they accrue?
What are the potential harms and to whom do they accrue? For example, what processes of the organization
may be changed or interrupted? How will staff relationships with participants be affected? These can be
written out in a T-chart (a piece of paper with a large letter “T” creating two columns) with potential benefits
in one column and potential harms in the other, following the practice of health impact assessments,
environmental impact assessments and racial equity impact assessments. The individuals and groups named
on the chart may suggest the starting list of whose voices are important to hear.
ĥ Engagement at the right times. When is it particularly important for the community organization to provide
its input, so that there is still flexibility to make changes while producing a valid research study? Have you
explained when you intend to register your pre-analysis plan or seek Institutional Review Board approval, and
what this means for the community organization’s ability to modify the study design?
The “simple” measures may not be the ones that represent real growth and benefit to participants most on
the margins. As large administrative data sets were in many cases built to report on compliance, metrics of
compliance such as arrest rates are, not surprisingly, easier and cheaper to collect.
Researchers can contribute expertise not only on what the common measures are, but also on how inequity is
built into those measures, and what adjuncts and alternatives are feasible, to enable an informed approach to
metrics that furthers—not contradicts—the intent of the research.
One example you could share with the community organization is of a prison-based fatherhood program (you can
find the article by Abigail Henson listed in the Bibliography on page 111), where dialogue between participants,
community organization staff, and researchers changed what the study measured: the unit was changed from
the father to the family; the short-term measures were changed, from depression and stress, to pride and
reconstruction of masculinity (from provider to caregiver); the long-term measures were expanded, from recidivism,
to whether the father-child bond remained active and positive.
ö Is there an opportunity to capture the program’s impact relating to systemic causes of inequity? Are ecosystem
effects of the community organization’s work important to capture in the research? Is the program affecting the
system or the capacity of the community, for example supporting cultural revitalization, changing power relations,
or increasing the capacity of the community to solve problems?
ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics with
the organization in a way that recognizes that numbers are an incomplete picture?
ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together a ‘cohort’ for a research
study? Outreach? Social workers? Program directors? Former participants?
ĥ Recruitment process. What is the detailed process based on these voices of expertise? Importantly, the
research itself can affect both the process, and attrition. For example, the research could change how
participants apply to the program, from interested young people applying, to schools generating lists of young
people invited into the program; as a result of this shift, attrition will rise from what the program has seen
historically. Where does attrition happen in the process and by how much? What number of participants
would need to start the recruitment process to end up with a certain number of participants completing the
program? How frequently will participants move out of the program and what does this mean for how much
time it takes for the target number of participants to complete the program? What does “completing the
program” or “graduating” mean for your organization, operationally?
ĥ Recruitment target and resources required. Based on available information from the community
organization, what is your best estimation of the number of participants necessary to appropriately power the
study? Piecing this together with the program’s process from the start of recruitment to program completion,
what total number of participants must be recruited? Can you share example numbers to illustrate what a
statistically significant change looks like, using the specific context of the nonprofit and a potential study
outcome? Can you share experiences from other studies on what changes were required by the community
organization to achieve this? Having the researchers and operational team together name different scenarios, then explore the challenges and the manpower related to each, can be a helpful approach. (A minimal sketch of a power estimate appears after this list.)
ĥ Data required. What other data about participants will you need to collect, and how will you collect these, or
how will these be collected by staff and passed to you?
ĥ Communications.
ö How will the recruitment target and process be communicated to recruitment partners, staff, community or
participants?
ö Are breakfast or evening community meetings appropriate, with researchers present to answer questions? School
night kick-offs? One-on-one conversations between researchers and recruitment partners/staff leading recruitment?
Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?
These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?
While in researchers’ minds the study may be “separate” from the program, for staff and participants, that
distinction may not have any meaning—it is all part of the work.
ĥ Recruitment strategies. What specific strategies can be used to bring participants into the study and
retain them through the course of the study? There is a “tax” on producing knowledge about those most
marginalized: it is harder for researchers to connect with those with instability in their lives; it is harder to
obtain consent from those who have learned to distrust institutions; it is harder to retain in a research study
those who experience greater barriers. How can social media be helpful? Will multiple redundant strategies
be feasible, to increase success?
The article “Study Retention as Bias Reduction in a Hard-to-Reach Population” by Columbia Professor
Bruce Western and colleagues referenced in the Bibliography on page 111 may aid brainstorming on study
recruitment and retention, e.g., the timing of financial incentives at the start of a study (if applicable);
frequency of contact; back-up contacts including mothers and supportive secondary contacts.
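To make the recruitment target concrete, here is a minimal sketch in Python of a standard power calculation, assuming a two-group comparison of means; the effect size, significance level, and power below are illustrative assumptions, not figures from this guidebook.

from statsmodels.stats.power import TTestIndPower

# Illustrative assumption: the program shifts the outcome by 0.3
# standard deviations relative to the comparison group.
effect_size = 0.3

# Completers needed per group for 80% power at a 5% significance
# level, using a two-sample t-test.
n_per_group = TTestIndPower().solve_power(
    effect_size=effect_size, alpha=0.05, power=0.80)
print(f"About {n_per_group:.0f} completers per group")  # roughly 175

A number like this is only the end of the funnel: pairing it with the attrition questions above shows how many participants must start recruitment, which is usually a much larger number.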
“We are not really rats, but we are lab rats to them.”
ĥ Responsive communications. What is the best way to communicate about what matters to participants,
for example:
ö What is the best way to communicate about privacy? Concerns nonprofits have shared, from their participants,
include: Who all will know I am in the study? Will my name be published anywhere? How much will they be in my
life? Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police?
ö What is the best way to communicate about benefits? Participants have asked: How does being part of a study help me?
ö What is the best way to communicate about expectations? Participants have asked: Am I allowed to participate in
other programs/employment during the study? Can you still help me if I am not in the treatment group? Can I apply
again? Who is going to help me if you can’t?
ö Be aware that there may be misunderstandings about research preceding your study.
ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants). If this is not
an area in which your team is experienced, seek out expertise so you are equipped to answer these questions.
ĥ Cadence. At what cadence should the team reflect on the work and engage on challenges? Where and when will these conversations be held? Who will participate and who will lead?
ĥ Discussion topics.
ö Feedback on the research effort so far, and what needs to be done to address it.
ö How are inequitable approaches, methods, measures filtering into the study, and what are opportunities to do
differently? For example, are you unintentionally taking advantage of what your privilege allows you to do, such as
dictating meeting times and locations? How is your work creating a way of operating intentionally distinct from the
legacy of “research brain” and “community brawn?”
ö How are relationships of trust being formed, and how is the team interacting as equals? Note that while researchers’, staff’s, and community members’ roles in the research differ, the point here is that no one is treated as superior or inferior.
ĥ Closing the loop. How are you communicating as clearly as possible to those providing input what you as researchers
are doing with that input? How are you “closing the loop?”
3.
COMMUNITY
AND VOICE
DURING THE STUDY
ĥ Feedback loop.
ö How are the voices of staff, participants, and community members being heard during the course of the study? What
feedback is being shared and what can be done to address the feedback and communicate back to those who
shared it? If no feedback is being shared, what else can be done to listen? Some organizations have found a
biweekly conversation among researchers and those at the community organization involved in implementing the
study tremendously valuable to ask and answer questions and plan for each new step in the work in an organic and
effective way.
ö How can you continue to spend time with the organization and build relationships? How can you give credit
to the community organization or participants where your partnership is working well, and show—through
actions—humility?
ĥ Contextual awareness.
ö How can the community organization and researchers best collaborate so study tools fit the specific context and
also have the validity desired?
ö How will draft study tools be vetted by participants? For example, are surveys of an appropriate length?
ĥ Execution.
ö What types of data, for example, social security numbers, does the community organization view as likely to be a
challenge to collect in this context, and what are alternatives?
ö If a representative group is assembled, for example a focus group, is it a meaningful “group” in the eyes of the
community organization? How can the community organization help you to create the conditions for voices to
be heard?
ö Can community members be hired to help collect data or conduct surveys?
4.
EQUITABLE NUMBERS
FOR IMPACT
a. Make it useful along the way.
The history of distrust between community and researchers comes in part from community organizations
entering research partnerships expecting the work will improve lives of participants, and later finding the
research may not help them with day-to-day operations or bring resources to their community. How can the
research improve lives of participants as much as possible, as soon as possible?
ĥ Early learnings. What are early learnings from the research and how can these be shared with staff,
participants, community, or partners to show what their hard work is yielding?
ĥ Early action.
ö What improvements to the program are possible during the research, given the context of the organization and the
research design, which may limit what changes can be made?
ö How will those responsible for making the changes engage with the data early on so they can plan for action?
ĥ Sense checking. Is there a role for the organization’s staff in sense-checking early data or preliminary
results? In reviewing how qualitative data are coded? Are preliminary results consistent with the experience
of the organization, or do they look wrong?
[Figure: two charts, each labeled “Average after the program”; an annotation notes that the average grew because of a few participants.]
As researchers are taught, averages can obscure how impact is distributed. Incentives to support participants to reach a certain bar—less recidivism, more college enrollment, more employment—can make it harder to make the case for work that supports participants farther away from that threshold.
For greater impact, researchers can support a more nuanced understanding of who is being successfully
reached and who is left farther behind, who is benefiting most and least, and what can be learned from the
complex stories beneath the numbers.
ĥ Disaggregated data. Against whatever metric has been chosen, for example, GPA, how much change
does the program create for participants starting out furthest behind, versus participants in the middle,
versus participants starting out furthest ahead? (This must be done in a way that continues to protect
individuals’ privacy.)
ĥ Learning from who is benefited least and most.
ö Who is benefited the most by the organization’s work? What are the characteristics—demographic, intersectional,
and situational, for example, housing stability, adult relationships, connectedness—of the small group of
participants benefited most?
ö Who is benefited least by the organization’s work? What are the characteristics of these participants?
ö What can the organization learn from these stories? What can the organization learn from the specific stories
of those benefited most, recognizing the complexity of these success stories, to spur innovation? Are there
adjustments to the program the organization will try out to better serve its target participants?
ĥ Counting inequity. Has the correlation between race and outcomes, or class and outcomes, changed as a
result of the organization’s work?
ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example the
implications of stress experienced by program staff as a factor in selecting what productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. How can analysis capture systemic effects, where cost-benefit numbers are an output of the
work? Accounting for “cost” of the status quo only in terms of tax dollars—which are often actually wages that benefit
a different group of people—rather than the actual social cost, can unwittingly and incorrectly build a case against
investment in community work.
ĥ Capturing growth.
ö How does the presentation of the data reveal growth journeys which may not be linear? Does the comparison point
allow for—to use substance abuse terminology—relapse in the context of recovery?
5.
SHARING RESULTS
a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.
As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.
ĥ Context. What history and explanation of structural and systemic factors is important to frame the
challenges the organization is addressing? To explain fully why the problem exists in the first place and the
complexity of root causes and pathways? How can this show up in the description of the organization’s work
and dissemination of the research?
ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work
in the community, or its type of work? How have these narratives served participants well? How have they
harmed them, or reinforced inequities? With this understanding, how can this research be framed to take on
unjust narratives? What cultural context is important to tell?
ĥ Limitations. How are limitations of the data clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? “Objective” data like special education designations, census data,
crime that is measured by arrests, or domestic violence information, can incorporate racialized processes and
lead to incorrect interpretations, without context.
b. Can you hear the participants? Are you signaling that lived
experience is valuable?
Valuing data to achieve an end—whether securing funding, improving programs, sharing learning with the
field, changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring
a participant’s voice requires intention; it may not just happen from documenting a participant’s story,
demographics or outcomes.
ĥ Participant voices.
ö Can you hear the participants in what you, the community organization, the funders, are disseminating about
the research?
ö How are images, stories, numbers resulting from the research effort putting participants at the center in how they
are shared, versus treating them as objects of study or as tokens to lend credibility? How can you engage the
community organization to further this?
ĥ Authorship. Is there an opportunity for the participants or the community organization to own elements of
or co-author what is produced, or does it serve the community organization to have an external author?
ĥ Respecting experience.
ö How does the presentation of results message to the audience that experience is valuable and valid, rather than
reinforcing the bias that university expertise gives validity?
ö Would participants find that what is being put out is true to their experience?
ĥ Language.
ö Is the language used as easy to understand as possible? Do people from different cultures, with different lived
experiences, with different technical backgrounds understand the results of the research and the “so what” of what
it means, when you test an early draft?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds
can understand what has been counted, how, and based on what judgments? Are data tables and charts legible to
those without research and statistics backgrounds? Are any technical terms used defined in plain language?
ĥ Process. What were lessons from this research on the process of doing research with an equity orientation? Were
objectivity, bias, or rigor affected? Were participation and community involvement affected? Were accountability
or ethics affected? How can you capture this creativity and learning, and share it with your institution and with the
research community? How can you support the community organization in sharing this creativity and learning?
ĥ Endpoints. How were the endpoints of the research different from a traditional approach? Did the level of insight
derived change? Did the usefulness of the outputs to the community organization change? Was the capacity of the
community organization affected? Did members of the community or community organization get interested in evidence
and research? How were you and your capabilities changed?
ĥ Learning and sharing. What could have been done better, and what worked well? What is the feedback from the
community organization and community stakeholders? What is their guidance for future projects? How will you share
your lessons with the community organization? How will your institution internalize these lessons?
Pictured: Englewood
FOR
FUNDERS
Individual funders relate to research differently. Some may fund research to generate
knowledge for policymakers and other funders on what works and what does not.
Others may fund evaluation to assess the impact of their funding, for their or their
boards’ own consumption. Still others may fund research in service of a community
organization’s growth or to change narratives. Those that do not fund research at all
may participate in the research economy by using data to direct their funding or to
summarize the impact of their work.
Depending on your particular relationship with research, the action
you take to unlock more meaningful knowledge, and therefore
greater impact, will look different.
For example, you may:
ĥ Change the research questions you are willing to fund
ĥ Issue Requests For Proposals, or RFPs, for research differently or guide and evaluate your evaluators differently
ĥ Engage with board and staff on internal processes and biases, especially relating to how you use data
ĥ Interrogate numbers and stories you lift up, and use different framing in what you publish
In all cases, it will require challenging what has “always been done.” This may not be tidy, or comfortable. But
starting from accountability and relationship, funders can help to achieve the promise of what knowledge can yield.
Each funder may, in asking the questions below, find answers appropriate to their own work. Chicago Beyond
has shared some of our experience to illustrate.
1.
KNOW YOUR ROLE,
KNOW THE RISKS
EQUITY IN HOW YOU START
a. Bring awareness
How research gets done—the approaches, methods, metrics—has a system of assumptions built in.
In seeking more equitable approaches, one place to start is understanding context and biases.
ĥ Board dynamics and beyond. Who is on your board? Who are the other stakeholders you engage? How do
they create or interpret “authoritative” knowledge? Who does not show up in this? What changes does this
awareness lead to in how you interact with your board, if you have one? For example, is it valuable to have a
board discussion about how research, evaluation, and/or your use of evidence are related to your mission and
to experiment with changes in pursuit of the authentic truth? How can you listen to community organizations
from whom you have collected data, or where you have funded research, to understand their experience?
ĥ Value—to whom? How can you ensure the research produces something of real value to the community?
What is the value of the research being proposed to the “subjects” of the research? What are the benefits of
producing the research or evidence to you as a funder, and to the research institution hired?
ĥ Biases in selecting researchers. How do you select researchers? What assumptions are incorporated into
your selection process? How could you include researchers from communities being researched?
Some sources of context include MIT historian Craig Wilder’s Ebony & Ivy on the history of elite academic
institutions in justifying inequities, and the Harlem Children’s Zone’s writings on their decades of experience
“being researched” (you can find both in the Bibliography on page 111).
b. Understand context
The power dynamic between researchers, funders, and community may lead a community entity to participate
in research that marginalizes the community entity or its participants. Some communities and community
organizations may feel comfortable speaking up and engaging with funders and researchers about the purpose of
the research and how it is conducted. Others may fear that being assertive will jeopardize their funding or support.
ĥ History. What is the history of community-researcher-funder interaction in this community, and the
particular history for this organization and these participants? How have they previously experienced
research institutions? How have funders participated in this dynamic?
ĥ Reality of the community. How can you ensure the work is grounded in the experience of those affected,
to counteract bias? How can you get to know the community organization well enough to be trusted with the
truthful context, or how can you learn from someone who is?
ĥ Clarity about purpose. How will the research help the community? Where does the organization intend for
the research to land, and what type of data and research design serve that purpose, with least burden on the
organization or its participants? Is the community organization served well by research on the mechanisms,
i.e., an implementation study, focused on the “how,” in addition to the outcomes research that is more
frequently funded? Is the community organization served well by research to identify the early indicators of
their overall goal that they can influence?
ĥ Research questions. What are the questions the community organization or community wants to answer?
For example, what are specific sentences the organization wants to “fill in the blanks” on at the end of the
research, and why? Should the sentences focus not just on individual change but on interpersonal change,
change to families, or community change? Are there research questions about root causes that, if evidence
were generated by researchers, would lead to action on systemic inequities?
For example… “First, we want to write strong applications for state government funding for violence
prevention, so we want to say our program reduces participants’ violent behavior outside of the program by
[percentage/measure], citing a rigorous outside evaluation. Second, we want to show that it is not just about
the participants, but also the families of participants that grow stronger through our program and become
advocates of change. Third, we would like to identify early indicators that affect whether a participant will
complete the program or not.”
Organizations have found research useful in day-to-day work when it identifies early indicators of the
overall goal, if staff or participants can affect those indicators. Organizations have also found it useful to
ask staff: What 3-4 pieces of information would help you to do your job better?
What are the options for the research? Here is a simple chart you can share:

PRE/POST STUDY: Shows that change occurred, but does not show your program caused the change to happen.

QUASI-EXPERIMENTAL STUDY: It does not involve assigning participants to two different groups and studying both groups, and therefore asks less from your organization and your participants. This will reduce your flexibility to change program elements during the period of the research. Will this generate what you are trying to learn? Is this rigorous enough for the audience you want to reach?

MIXED METHODS STUDY: A “mixed methods” approach mixes numbers and stories, and can provide the best, and worst, of both worlds.
ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together the participant group, or
“cohort,” for a research study? Who is closest to the work? Outreach? Social workers? Program directors?
Former participants?
ĥ Recruitment process. What is the detailed recruitment process based on these voices of expertise?
Importantly, the research itself can affect both the recruitment process, and attrition. For example, the
research could change how participants apply to the program, from interested young people applying, to
schools generating lists of young people invited into the program; as a result of this shift, attrition will rise
from what the program has seen in the past. Where does attrition happen in the process and by how much?
What number of participants would need to start the recruitment process to end up with a certain number
of participants completing the program? How frequently will participants move out of the program and what
does this mean for how much time it takes for the target number of participants to complete the program?
What does “completing the program” or “graduating” mean for your organization, operationally?
“If I could start all over, I’d ask ‘What’s the power number?’”
ĥ Recruitment target. The higher the number of study participants, the easier it is to show a scientifically
valid change, but the greater the effort to recruit participants. For the research study to show an effect
that researchers consider valid, how many participants does your researcher estimate need to complete
the program? Piecing this together with the program’s process from the start of recruitment to program
completion, what total number of participants must initially be recruited? Having the researchers and operational team together name different scenarios, then explore the challenges and the manpower related to each, can be a helpful approach. (A minimal sketch of this funnel arithmetic appears after this list.)
ĥ Resources required. What will your organization need to change to achieve the recruitment target? One
community organization believed their social worker could continue to manage recruitment as the research
study began, and later found they needed a dedicated person spending 30 hours per week to adequately
support recruitment and build new referral partnerships.
ĥ Data required. What other data about participants will researchers need to collect, and how will these be
collected by researchers, or by staff and passed to researchers?
ĥ Communications.
ö How will the recruitment target and recruitment process be communicated to recruitment partners, staff, community
or participants?
These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?
ö Are breakfast or evening community meetings appropriate? Should researchers be present to answer questions? What
about a school-night kick-off? Are one-on-one conversations between researchers and recruitment partners/staff
leading recruitment most appropriate? Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?
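To make the “power number” conversation concrete, here is a minimal sketch of the funnel arithmetic, working backward from study completers to recruits. Every figure below is a hypothetical assumption, to be replaced with the researchers’ power estimate and the organization’s own recruitment experience.

# Hypothetical funnel: how many people must start recruitment so that
# enough participants complete the program for a valid study.
completers_needed = 350   # assumed figure from the researchers' power estimate
consent_rate = 0.70       # assumed share of recruits who consent to the study
completion_rate = 0.60    # assumed share of enrollees who complete the program

recruits_needed = completers_needed / (consent_rate * completion_rate)
print(f"Start recruitment with about {recruits_needed:.0f} people")  # ~833

Seeing the distance between 350 completers and roughly 833 recruits makes concrete the staffing, referral partnerships, and other costs a funder can plan to support.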
f. Match funding and researchers to the goals; account for all costs
With shared understanding of the purpose of the research, funders can be accountable for aligning support to
these goals.
ĥ Expectations of researchers. Is there an expectation that the research effort will build the capacity
or infrastructure of the community organization? How is this translated into the expectations you set for
researchers?
ĥ Full accounting of costs. What costs to the organization and community will be generated by the research?
As a funder, you can lead the conversation to account for all costs, both direct and indirect, and discuss how
these costs can be shared. What will participants or partner organizations be asked to do differently for the
research? Will the research generate new staff responsibilities? Will it generate communications, change
management, or other costs to the organization?
ĥ Budget and timeline. Overemphasis on budget and timeline can get in the way of equity and impact.
ö When does research fit to best serve the community organization? How can the community organization drive the
timetable? Is program infrastructure in place to enable research, for example, consistently delivered key elements of
the program, or would the organization be better served by research timed after necessary infrastructure is built, or
with some lead time for preparation?
ö Do the budget and timeline support and create incentives for: Building relationships and trust? Developing data
tools with community participation? Researchers and the community organization interpreting the data together?
Collaboration on how data are shared?
ö How can reflection, time for failure, and opportunity to change be built into the research plan?
ö How do the budget and timeline support the fully accounted costs articulated above, for example change
management? When the focus is a more marginalized group, how do the budget and timeline take into account the
additional efforts it will take to overcome the additional challenges—instead of setting expectations according to the
norms of other groups and then treating these individuals as “non-compliant?”
ö What are opportunities for residents of the community to be hired, for example to help collect data or conduct
surveys, and how can you encourage this?
g. Memorandum of understanding
The contractual agreement for the research can undermine the funder’s intention to rebalance power.
ĥ From history to present. Name the local history. How have interactions between this community and
researchers typically worked? Set against that context, how do you envision roles and accountabilities will
work in this research?
ĥ Role of Principal Investigator. What will the Principal Investigator take responsibility for? Is it appropriate
for someone from the community organization to act as a co-Principal Investigator? Or is research by an
external third party important in this context?
ĥ Property rights. Discuss intellectual property rights and data rights. Who can access the data and when?
When will data be processed and shared? Who can speak about the data and publish the data? Whose
consent is required, when?
ĥ Signed contract. In signing the memorandum of understanding or similar written contract, ensure you are
formalizing the above.
2.
COMMUNITY
AND VOICE
a. Create accountability for voice
Bryan Stevenson, founder of the Equal Justice Initiative, writes that “getting proximate” changes our capacity to
make a difference.
Funders can set the tone and expectation of relationships, and, depending on the context, advocate for the
interests of the community organization.
ĥ Reciprocal engagement.
ö How will you as a funder participate in breaking bread and in building relationships?
ö How can you encourage researchers and the community organization to spend time together face-to-face, to
begin to build relationships of trust? To show, not talk about, humility? Are Peace Circles an appropriate early
engagement? Evening forums for community? For researchers, having the opportunity to connect with the work
on a human level can provide perspective, for example on the human cost of research approaches. For community
organizations, engaging with the researchers formally and informally helps build a relationship of trust.
ĥ Identification of voices needed. How can you support the researchers and community organization to identify
those affected by the research, and to create structures and conditions so that their voices can be heard?
ö In some cases, it can be helpful to assemble a committee, which engages regularly to make decisions or give input
into what is researched and the outcome measures used. The committee may include staff of the organization,
researchers, program participants or community members. (For funders familiar with Community-Based
Participatory Research, a Community Action Board is often formed to steer both the research and the related
action. Barbara Israel’s piece, which is cited at the end, surveys several Community Action Boards and describes
varying levels of power, participation, and effectiveness.) It is important to be clear about the mandate of the group:
What powers does it really have?
In doing justice work, the “simple” measures may not be the ones that represent real growth and benefit to
participants most on the margins. As large administrative data sets were in many cases built to report on
compliance, metrics of compliance such as arrest rates are, not surprisingly, easier and cheaper to collect.
Take the example of recidivism. On one hand, it is a common metric, and one for which data sets exist. This
means it costs less money to track and allows comparison across programs. On the other hand, it is binary.
It fails to capture directional progress in desistance from crime, and it captures things outside of the
program participant’s control, such as the intensity of enforcement efforts and the prosecutor-judge-public
defender dynamics that contribute to how pleas are entered. A piece by Jeffrey Butts and Vincent Schiraldi on
the benefits and shortfalls of recidivism as a metric, which might prompt discussion, is listed in the
Bibliography on page 111.
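To make the binary problem concrete, here is a minimal sketch in Python with invented numbers (the participants and offense counts below are hypothetical, not drawn from any study cited here), contrasting the yes/no recidivism flag with a simple directional measure of change:

    # Hypothetical offense counts per year, before and after a program.
    # Every name and number here is invented for illustration.
    histories = {
        "participant_a": {"before": 6, "after": 1},  # sharp decline, but not zero
        "participant_b": {"before": 6, "after": 6},  # no change at all
    }

    for name, h in histories.items():
        recidivated = h["after"] > 0                 # the binary metric
        change = h["after"] - h["before"]            # a directional metric
        print(name, "recidivated:", recidivated, "| change in offenses/year:", change)

    # The binary flag marks both participants identically ("recidivated: True"),
    # while the directional metric shows participant_a moving toward desistance.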
Funders can hold their research accountable for taking an approach to metrics that furthers—not contradicts—
the mission of the work.
One example you could share with the community organization is of a prison-based fatherhood program
(you can find the article by Abigail Henson listed in the Bibliography on page 111), where dialogue between
participants, community organization staff, and researchers changed what the study measured: the unit was
changed from the father to the family; the short-term measures were changed, from depression and stress, to
pride and reconstruction of masculinity (from provider to caregiver); the long-term measures were expanded,
from recidivism, to whether the father-child bond remained active and positive.
ö Effects of the ecosystem on a participant, network effects of change by the participant, or structural impacts of a
program’s work may be less simple to research. However, this may be a critical piece of a community organization’s
impact where easy-to-research approaches have not worked. Are ecosystem effects of the community organization’s
work important to capture in the research? Is a program getting at the structural drivers of inequity? If so, how? Is
the program affecting the system or the capacity of the community, for example supporting cultural revitalization,
changing power relations, or increasing the capacity of the community to solve problems?
ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics in a
way that is respectful, and that validates the intuition of the community organization that numbers are an
incomplete picture?
When those closest to participants, and participants themselves, shape how the study occurs, it can help
community organizations and researchers arrive at more equitable and authentic learning. It can also strengthen
study retention.
ĥ Research design.
ö How can you ensure that the research design, throughout the study and as it ends, is consistent with the mission of
the community organization and the trust it has built in the community? How can you ensure that the impact on the
organization’s reputation in the community, resulting from changes made because of the research study, has been
considered?
ö How is consent best approached? Is consent written in a way that is understood? Is consent presented in a context,
and at a time, when the participant actually has agency to give consent?
ĥ Responsive communications. What is the best way to communicate to participants about privacy,
expectations, and benefits of the study? Concerns nonprofits have shared, from their participants, include:
Who all will know I am in the study? Will my name be published anywhere? How much will they be in my life?
Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police? How does being part of a study help me? For a fuller list of issues, please see page 49.
ĥ Study tools. How can you encourage the community organization and researchers to collaborate so study tools
fit the specific context and also have the validity desired? How will draft study tools be vetted by participants?
ĥ Awareness of researchers. How can funders support the cultural awareness and humility of those
conducting the research? For example, Chicago Beyond has supported researchers’ participation in racial
bias workshops and reflection.
ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants).
FOR FUNDERS
3. EQUITABLE NUMBERS FOR IMPACT
a. Bring attention to who is benefited the most, and who the least
[Figure: two panels, each labeled “Average after the program”; one is annotated, “The average grew because of these participants.”]
Averages can hide whether participants most forced to the margins are left further behind, unaffected, or
helped by a program. When funders focus on aggregate data describing participants reaching a certain bar—
less recidivism, more college enrollment, more employment—it can unwittingly create incentives to focus
on participants starting closest to the threshold. For greater impact, research can support a more nuanced
understanding of who is being successfully reached and who is left farther behind, who is benefiting most
and least, and what can be learned from the complex stories beneath the numbers.
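The point about averages can be made concrete with a minimal sketch in Python. All scores below are invented for illustration; grouping participants into terciles by starting score is one simple way to disaggregate:

    # Hypothetical (score_before, score_after) pairs for nine participants,
    # already sorted by starting score. All numbers are invented.
    participants = [
        (10, 11), (12, 13), (14, 15),   # started furthest behind
        (40, 41), (42, 43), (44, 45),   # started in the middle
        (70, 95), (72, 97), (74, 99),   # started furthest ahead
    ]

    def avg(xs):
        return sum(xs) / len(xs)

    overall_gain = avg([after - before for before, after in participants])
    print("overall average gain:", overall_gain)    # 9.0

    for label, group in [("furthest behind", participants[0:3]),
                         ("middle", participants[3:6]),
                         ("furthest ahead", participants[6:9])]:
        print(label, "average gain:", avg([after - before for before, after in group]))

    # The overall average gain of 9.0 is driven almost entirely by participants
    # who started furthest ahead (+25 each); those furthest behind gained 1 point.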
ĥ Disaggregated data. How can funders bring focus to who, specifically, is impacted and how, while still
appropriately protecting individuals’ privacy? For example, against whatever metric has been chosen, how
can funders create accountability for looking at the change the program creates for participants starting
out furthest behind, versus participants in the middle, versus participants starting out furthest ahead?
ĥ Learning from who is benefited least and most. Who is benefited the most by the organization’s work?
What are the characteristics—demographic, intersectional, and situational (for example, housing stability,
adult relationships, connectedness)—of the small group benefited most? Who is benefited least by the
organization’s work? What are the characteristics of these participants? What can the organization, the
researchers, and the funder learn from these stories?
ĥ Counting inequity. Is the correlation between race and outcomes, or class and outcomes, changed as a
result of the organization’s work?
ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work? How can
funders guard against unwittingly pushing community organizations to focus on “low-hanging fruit”?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example treating
the stress experienced by program staff as a factor in selecting which productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. How can funders create incentives for analysis to capture systemic effects, where
cost-benefit numbers are an output of the work? Accounting for the “cost” of the status quo only in terms of tax
dollars—which are often actually wages that benefit a different group of people—rather than the actual social
cost can unwittingly and incorrectly build a case against investment in community work; a brief illustration
follows this list.
ĥ Capturing growth. How does the presentation of the data reveal growth journeys that may not be
linear? Does the comparison point allow for—to use substance abuse terminology—relapse in the context of
recovery?
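As a minimal sketch of that cost-benefit arithmetic (every figure below is hypothetical and chosen only to show the mechanism), counting only avoided tax outlays can make the same program look like a net loss:

    # Hypothetical annual figures per participant. All numbers are invented.
    status_quo_tax_cost = 35_000     # e.g., incarceration outlays paid from taxes
    status_quo_social_cost = 60_000  # e.g., lost earnings, family disruption, harm
    program_cost = 40_000

    # Narrow framing: weigh program cost against avoided tax outlays only.
    narrow_net_benefit = status_quo_tax_cost - program_cost

    # Fuller framing: include avoided social costs as well.
    full_net_benefit = (status_quo_tax_cost + status_quo_social_cost) - program_cost

    print("narrow net benefit per participant:", narrow_net_benefit)  # -5000
    print("full net benefit per participant:", full_net_benefit)      # 55000

    # Under the narrow framing the program appears to lose money; under the
    # fuller framing the same program is clearly worth the investment.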
4. SHARING RESULTS
a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.
As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.
ĥ Context. What history and explanation of structural and systemic factors is important to frame the
challenges the organization is addressing? To explain fully why the problem exists in the first place and the
complexity of root causes and pathways? How can this show up in the description of the organization’s work
and dissemination of the research?
ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work
in the community, or its type of work? How have these narratives served participants well? How have they
harmed them, or reinforced inequities? With this understanding, how can this research be framed to take on
unjust narratives? What cultural context is important to tell?
ĥ Limitations. How are limitations of the data most clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? Without context, “objective” data like special education designations,
census data, crime that is measured by arrests, or domestic violence information can incorporate racialized
processes and lead to incorrect interpretations.
b. Can you hear the participants? Are you signaling that lived experience
is valuable?
Valuing data to achieve an end—whether future funding, improving programs, sharing learning with the
field, or changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring
a participant’s voice requires intention; it does not just happen from documenting a participant’s story,
demographics, or outcomes. Funders are influential in bringing this to life.
ĥ Participant voices. Can you hear your participants in what you, the community organization, and
researchers, are disseminating about the research? How are images, stories, and numbers resulting from the
research effort putting participants at the center in how they are shared, versus treating them as objects of
study or as tokens to lend credibility?
ĥ Authorship. Is there an opportunity for the participants or the community organization to own elements
of or co-author what is produced, or be editors? Or does it serve the community organization to have an
external author?
ĥ Respecting experience.
ö How does the presentation of results message to the audience that the community’s experience is valuable and
valid (for example, in how the community’s experience is referred to and credited in the presentation), rather than
reinforcing the bias that only university expertise produces validity?
ö Would participants find that what is being put out is true to their experience, not just responsive to
what the funder set out to do, or what the researchers came to ask?
ĥ Language.
ö How can you create an expectation that the language is as easy to understand as possible? When you test an
early draft, do people from different cultures, with different lived experiences, and with different technical
backgrounds understand the results of the research and the “so what” of what it means?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds
can understand what has been counted, how, and based on what judgments? Are data tables and charts legible to
those without research and statistics backgrounds? Are any technical terms defined in plain language?
ĥ Endpoints. How were the endpoints of the research different from a “traditional” approach? Did the level of
insight derived change? Did the usefulness of the outputs to the community organization change? Was the
capacity of the community organization affected? Did members of the community or community organization
become interested in evidence and research? Was the capacity of the researcher affected?
ĥ Learning and sharing. What could have been done better, and what worked well? What is the feedback from
the community organization, its stakeholders, and the researchers? How will you as a funder internalize these
lessons and share them with others?
Thank you for your time, consideration, and use of this guidebook. We see it not as a
solution, but as kindling for something greater, and a new path toward “how” we can
all arrive at a more authentic truth in research. We ask that you share these questions
and ideas with others in the social impact space, and host conversations to address
unintended bias and level the playing field to do the most good for our communities.
THANK YOU
Aimee Stahlberg, Storycatchers Theatre
Aisha Noble, Community member
Andrea Ortez, Partnership for Resilience
Angela Odoms-Young, University of Illinois at Chicago
Angelique Power, The Field Foundation of Illinois
Asiaha Butler, Resident Association of Greater Englewood
Carmelo Barbaro, University of Chicago Poverty Lab
Christopher Sutton, Youth Advocate Programs, Inc.
Clifford Nellis, Lawndale Christian Legal Center
Deborah Gorman-Smith, The University of Chicago School of Social Service Administration
Elena Quintana, Adler University
Evaluation Committee of Lawndale Christian Legal Center
Franklin Cosey-Gay, The University of Chicago School of Social Service Administration
Helene Gayle, The Chicago Community Trust
Jason Quiara, The Joyce Foundation
Jennifer Keeling, Chicago CRED (Creating Real Economic Destiny)
John Rich, Drexel University
Jonte, Storycatchers Theatre alumnus
Karen Jackson, Lawndale Christian Legal Center
Kelly Hallberg, University of Chicago Poverty Lab
Khalfani Myrick, Genesys Works
Kim Cassel, Arnold Ventures
Lina Fritz, OneGoal
Lindsey Nurczyk, OneGoal
Marquell Jones, Lawndale Christian Legal Center
Michael McAfee, PolicyLink
Michelle Adler Morrison, Youth Guidance
Pastor Christopher Harris, Bright Star Community Outreach
Paula Wolff, Illinois Justice Project
Priya Shah, Storycatchers Theatre
Rami Nashashibi, Inner-City Muslim Action Network
Rebekah Levin, Robert R. McCormick Foundation
Robin Steans, Steans Family Foundation
Sana Syed, Inner-City Muslim Action Network
Sheldon Smith, Dovetail Project
Theodore Corbin, Drexel University
Troy Harden, Northeastern Illinois University
Unmi Song, Lloyd A. Fry Foundation
Wendy Fine, Youth Guidance
Wrenetha Julion, Rush University
GLOSSARY
Design thinking: A creative problem-solving process that puts humans at the center and
focuses on what real people actually do.
Peace Circle: A method rooted in Native American practice to address conflict holistically
and solve problems. Peace Circles are a group process that repairs harm, includes offenders
taking responsibility for their actions, and leads to collective healing.
Racial equity and cultural awareness: Racial equity would be achieved if racial identity
did not determine the odds of how one fares. Racial equity work includes dismantling
narratives, attitudes, practices, and policies that allow or reinforce different outcomes by
race. Cultural awareness is awareness of the social systems of meaning and customs of a
group, and includes reflection on your own values, beliefs, and biases.
Systems thinking: A holistic approach to analysis that focuses on the way parts of a
system relate to each other and work over time.
BIBLIOGRAPHY
American Evaluation Association. Public Statement on Cultural Competence in Evaluation. (April 2011).
Butts, Jeffrey A. and Vincent Schiraldi. Recidivism Reconsidered: Preserving the Community Justice Mission of Community Corrections. Program in Criminal Justice Policy and Management, Harvard Kennedy School. (March 2018).
Coalition for Evidence-Based Policy. Randomized Controlled Trials Commissioned by the Institute of Education Sciences Since 2002: How Many Found Positive Versus Weak or No Effects. (July 2013). Available at: http://coalition4evidence.org/wp-content/uploads/2013/06/IES-Commissioned-RCTs-positive-vs-weak-or-null-findings-7-2013.pdf.
Community-Based Participatory Research (CBPR). Foundation for Sustainable Development. (May 2017). www.fsd.org/wp-content/uploads/2017/05/Research-Toolkit.pdf. Sample Terms of Reference for a Community-Based Participatory Research project, from the Wellesley Institute, are appended at the end.
Cram, Fiona. Lessons on Decolonizing Evaluation from Kaupapa Maori Evaluation. Canadian Journal of Program Evaluation, 30.3, 296-312. (2016).
de Jong, Marja A. J. G. et al. Study protocol: Evaluation of a community health promotion program in a socioeconomically deprived city district in the Netherlands using mixed methods and guided by action research. BMC Public Health, 19:72. (2019).
Dean-Coffey, Jara; Casey, Jill; and Caldwell, Leon D. Raising the Bar – Integrating Cultural Competence and Equity: Equitable Evaluation. The Foundation Review, 6(2), 81-94. (2014).
Equitable Evaluation. Equitable Evaluation Framing Paper. (July 2017). drive.google.com/file/d/0BzlHFJSNsW5yY3dGaGN0MTdISEk/view.
“Equity-Centered Design Framework.” Stanford D.school. (26 Apr. 2017). dschool.stanford.edu/resources/equity-centered-design-framework.
equityXdesign. “Racism and Inequity Are Products of Design. They Can Be Redesigned.” Medium. (16 Nov. 2016). medium.com/equity-design/racism-and-inequity-are-products-of-design-they-can-be-redesigned-12188363cc6a.
Fricker, Miranda. Epistemic Injustice: Power and Ethics of Knowing. Oxford University Press. (2007).
Hall, Budd L. and Rajesh Tandon. Decolonization of Knowledge, Epistemicide, Participatory Research and Higher Education. Research for All, 1(1), 6-19. (2017).
Harlem Children’s Zone. Successful Research Collaborations: Rules of Engagement for Community-Based Organizations. New York, NY. (2012). Available at: http://www.hcz.org/images/Rules_of_Engagement_paper.pdf and http://promoseneighborhoodsinstitute.org/Technical-Assistance/Resource-Library/Tools.
Hassmiller, Kristen et al. Extending systems thinking in planning and evaluation using group concept mapping and system dynamics to tackle complex problems. Evaluation and Program Planning, 60, 254-264. (2017).
Henson, Abigail. Strengthening Evaluation Research: A Case Study of an Evaluability Assessment Conducted in a Carceral Setting. International Journal of Offender Therapy and Comparative Criminology, 62(10), 3185-3200. (2018).
Inouye, Traci Endo et al. Commissioning Multicultural Evaluation: A Foundation Resource Guide. The California Endowment. (2005). Available at: https://www.spra.com/wordpress2/wp-content/uploads/2015/12/TCE-Commissining-Multicutural-Eva.pdf.
Israel, Barbara A. et al. Community-Based Participatory Research: Lessons Learned from the Centers for Children’s Environmental Health and Disease Prevention Research. Environmental Health Perspectives, 113(10), 1463-1471. (Oct. 2005).
LaFrance, Joan and Richard Nichols. Reframing Evaluation: Defining an Indigenous Evaluation Framework. The Canadian Journal of Program Evaluation, 23(2), 13-31. (2010).
Lee, Kien. The Importance of Culture in Evaluation: A Practical Guide for Evaluators. The Colorado Trust. (2007). Available at: https://www.communityscience.com/pdfs/CrossCulturalGuide.r3.pdf.
Leiderman, Sally. Evaluation with a Racial Equity Lens Slides. (November 2017). www.capd.org.
Mertens, Donna M. Transformative Mixed Methods: Addressing Inequities. American Behavioral Scientist, 56(6), 802-813. (2012).
O’Fallon, Liam R. and Allen Dearry. Community-Based Participatory Research as a Tool to Advance Environmental Health Sciences. Environmental Health Perspectives, 110(suppl 2), 155-159. (2002).
Patton, Michael Quinn. Utilization-Focused Evaluation. SAGE Publications, Inc. (2008).
Potapchuk, Maggie et al. Flipping the Script: White Privilege and Community Building. MP Associates, Inc. and the Center for Assessment and Policy Development. (2005). Available at: http://www.racialequitytools.org/resourcefiles/potapchuk1.pdf.
Promoting Healthy Public Policy Through Community-Based Participatory Research: Ten Case Studies. PolicyLink, University of California Berkeley School of Public Health, W.K. Kellogg Foundation. Available at: http://www.policylink.org/resources-tools/promoting-healthy-public-policy-through-community-based-participatory-research-ten-case-studies.
Stevenson, Bryan. Just Mercy. Spiegel & Grau. (2015).
Straight Talk on Evidence. How to Solve U.S. Social Problems When Most Rigorous Program Evaluations Find Disappointing Effects. (March 21, 2018). Available at: https://www.straighttalkonevidence.org/2018/03/21/how-to-solve-u-s-social-problems-when-most-rigorous-program-evaluations-find-disappointing-effects-part-one-in-a-series/.
Thomas, Veronica G. and Beverly A. Parsons. Culturally Responsive Evaluation Meets Systems-Oriented Evaluation. American Journal of Evaluation, 38(1). (April 2016).
Wallerstein, Nina and Bonnie Duran. Community-Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity. Am J Public Health, 100(Suppl 1), S40-S46. (April 2010).
Western, Bruce; Braga, Anthony; Hureau, David; and Sirois, Catherine. Study Retention as Bias Reduction in Hard-to-Reach Populations. Proceedings of the National Academy of Sciences, 113(20), 5477-5485. (May 2016). DOI:10.1073/pnas.1604138113.
Wilder, Craig Steven. Ebony & Ivy: Race, Slavery, and the Troubled History of America’s Universities. Bloomsbury Press. (2014).
Pictured: Jonte