
WHY AM I

ALWAYS BEING
RESEARCHED?
A GUIDEBOOK FOR COMMUNITY ORGANIZATIONS, RESEARCHERS,
AND FUNDERS TO HELP US GET FROM INSUFFICIENT
UNDERSTANDING TO MORE AUTHENTIC TRUTH

EQUITY SERIES, VOLUME ONE

Executive Summary .........................................................................6

Voices of Support.............................................................................8

Introduction ................................................................................... 10

How This Guidebook is Organized ................................................. 17

Seven Inequities Held in Place by Power,


Seven Opportunities for Change ................................................... 18

A Note on Sources .........................................................................28

For Community Organizations.......................................................30

For Researchers ............................................................................ 60

For Funders ................................................................................... 84

Let Us Go Forward, Together ....................................................... 106

Thank You.....................................................................................108

Glossary ........................................................................................110

Bibliography.................................................................................. 111

EXECUTIVE SUMMARY

If evidence matters, we must care how it gets made.

As an impact investor that backs the fight for youth equity, Chicago Beyond has partnered with
and invested in community organizations working towards providing more equitable access and
opportunity to young people across Chicago. In many cases, we have also invested in sizable
research projects to help our community partners grow the impact of their work. Our hope is
that the research will generate learnings to impact more youth in our city and nationwide, and
arm our partners with “evidence” they need to go after more funding for what is working.

Through the course of our investing, another sort of evidence emerged: evidence that the
power dynamic between community organizations, researchers, and funders blocks information
that could drive better decision-making and fuel more investment in communities most in
need. This power dynamic creates an uneven field on which research is designed and allows
unintended bias to seep into how knowledge is generated.

There are many voices in the social impact space who have begun to call out the power
dynamic. “Grantmaking is not trusting of the community, and the community is not trusting
of funders,” Edgar Villanueva, author of Decolonizing Wealth, has said. In annual letters, the
President of the Ford Foundation, Darren Walker, pushes funders to reckon with privilege,
acknowledging that communities closest to the problems possess unique insight into the
solutions. The power dynamic in the social impact space is impeding our collective efforts to
create a better world.

Right or wrong, research can drive decisions. If we do not address the power dynamic in the
creation of research, at best, we are driving decision-making from partial truths. At worst,
we are generating inaccurate information that ultimately does more harm than good in our
communities. This is why we must care about how research is created.

In this publication, we offer “how” we can begin to level the playing field and reckon with
unintended bias when it comes to research. Chicago Beyond created this guidebook to help
shift the power dynamic and the way community organizations, researchers, and funders
uncover knowledge together. It is an equity-based approach to research that offers one way
in which we can restore communities as authors and owners. It is based on the steps and
missteps of Chicago Beyond’s own experience funding community organizations and research,
and the courageous and patient efforts of our partners, the youth they serve, and others with
whom we have learned.

This guide is intended to:

ĥ Fuel community organizations to not only participate in research on their own terms, but to lead it.

ĥ Support researchers in recognizing their immense influence and unintended bias in shaping the questions asked, and the inputs used to answer them.

ĥ Inspire funders to ask hard questions about their agendas, unlock more meaningful knowledge, and therefore achieve greater impact.

We at Chicago Beyond see this publication as a start—it is by no means the answer. We wrestle regularly with operating in this world while envisioning an equitable one.

We ask you to join us in questioning, wrestling with bias, and pushing against “how it has always been done.” We need to collectively move from insufficient understanding to more authentic truth. The stakes are too high for us to do otherwise.

The guide begins by naming seven inequities standing in the way of impact, each held in place by power dynamics.

1. Access: Could we be missing out on community wisdom because conversations about research are happening without community meaningfully present at the table?

2. Information: Can we effectively partner to get to the full truth if information about research options, methods, inputs, costs, benefits, and risks is not shared?

3. Validity: Could we be accepting partial truths as the full picture, because we are not valuing community organizations and community members as valid experts?

4. Ownership: Are we getting incomplete answers by valuing research processes that take from, rather than build up, community ownership?

5. Value: What value is generated, for whom, and at what cost?

6. Accountability: Are we holding funders and researchers accountable if research designs create harm or do not work?

7. Authorship: Whose voice is shaping the narrative, and is the community fully represented?



VOICES OF SUPPORT

“We are not asking the right questions. To reverse the cycle of oppression, we must understand the intersectional complexity of personal responsibility, programmatic acts of charity, and the interplay of systems and policies that are designed to deny opportunity. To evaluate an afterschool program absent of intersectional context is misleading and provides answers that are insufficient, and often useless to transforming the lives of people living in the community. This guide pushes the conversation, which is exactly what the field needs.”
MICHAEL MCAFEE, PRESIDENT AND CEO, POLICYLINK

“When Dr. King was assassinated, in the 60s, we were doing studies in North
Lawndale and Garfield, and here it is almost sixty years later. We have been studied
for years and years and years—what is the positive outcome we are going to see for
these studies we are always doing? Will it be in these young men’s lifetimes? While
they are still in their twenties?”

KAREN JACKSON, DIRECTOR OF WORKFORCE DEVELOPMENT, LAWNDALE CHRISTIAN LEGAL CENTER


“Chicago Beyond has just handed us an incredible tool for funders, researchers and organization leaders. This pushes
the reset button and allows us to rethink evaluation. And let’s face it, the problems we aim to alleviate are too
complicated for us to approach solutions without examining how we are entering the work. This tool gives us questions
to use in our everyday work. Questions like: What if building trust is the variant that indelibly strengthens our
outcomes? How can funders come to the table recognizing our own power to influence, shape and therefore limit the
results? And what if funders had to use the same evaluation approaches and metrics on ourselves that we force others
to submit to? So, let’s rethink. Let’s come to the table recognizing the power source is the ORGANIZATION. They aren’t
the subject, they are the research designer.”
ANGELIQUE POWER, PRESIDENT, THE FIELD FOUNDATION OF ILLINOIS

“For our own kids, we understand the context of their lives, their trajectory over time, and what ‘safe’ and ‘happy’ and ‘realizing potential’ look like in the long term. But for kids in social services, we typically create some limited data set of how they interacted with one specific service at some specific point in time or a series of isolated services, and then claim we are ‘evidence-based.’ It is time to support these kids as we support our own kids and these families as we support our family. Our goal should be to help them realize their potential. It is time to change what we do, and don’t, accept as evidence. Is the evidence we are working from the truth of the people studied, or did it get imposed on them? Is it teaching us about how we support full life and well-being?”
PAULA WOLFF, DIRECTOR, ILLINOIS JUSTICE PROJECT

“As board members or decision-makers at the table, we ask for ‘hard evidence’ and ‘gold-standard methods’ thinking we are stewarding resources well, but if we don’t take a hard look at the inequities built into how evidence is getting made, we could be doing the opposite. I appreciate the thoughtfulness behind ‘Why Am I Always Being Researched?’ and its push for us all to do better when it comes to equity and access.”

ARNE DUNCAN, MANAGING PARTNER, EMERSON COLLECTIVE


“This guide offers thoughtful advice to practitioners, researchers, and funders on how to combine
resources to conduct high-quality research. Importantly, the guide advances the idea that rigorous
research methods, when applied appropriately, can help us all learn whether interventions are
producing the hoped-for benefits.”

KIM CASSEL, DIRECTOR OF EVIDENCE-BASED POLICY, ARNOLD VENTURES


“The questions we are asked, where do they come from? Whose lab are we in? When I was doing my master’s I had to do research too. The starting question was never ‘how is this going to benefit the people being researched?’”

ASIAHA BUTLER, FOUNDER AND PRESIDENT, RESIDENT ASSOCIATION OF GREATER ENGLEWOOD


“There is without a doubt a valuable role for outside evaluation. But—outside evaluation
can be problematic for communities and for community programs when there are
agendas at work beyond their own. We need to remember the first priority is to produce
something that is not just accessible to, but actually valuable to, the community. This
guide is such an important project for this reason.”
UNMI SONG, PRESIDENT, LLOYD A. FRY FOUNDATION



INTRODUCTION

A guidebook for community organizations, researchers and funders to help us get from insufficient understanding to more authentic truth.

In the hometown of urban research, Jonte asks aloud, “Why am I always being researched?” His peers are in three studies at once. A grandmother on his block, neighbors, and staff at nonprofits serving him remember being in studies, too.

Jonte is one of thousands in Chicago who, over decades, have participated in research studies with price tags in the millions, all in the name of societal change. And yet, the fruits of those studies have infrequently nourished the neighborhoods where their seeds were planted. Instead, there remains “not enough evidence” about what works, and a deep distrust between community, funder, and researcher, driven by systemic injustices: racism, segregation and disinvestment.

And then, there are questions behind how evidence gets made. What if the structures we use to find what works to improve communities are negatively impacted by the same power dynamics that have propped up those systemic injustices?

At its core, research is an interaction of individuals, human beings, each with their own biases. What if our research questions are unintentionally rooted in bias? Have we stopped to consider whether our inputs are unfairly affecting our intended outcomes?

PRIVILEGE SHAPES KNOWLEDGE
As an impact investor that backs the fight for youth equity, Chicago Beyond
has partnered with and invested in community organizations working
towards providing more equitable access and opportunity to young people
across Chicago. In many cases, we have also invested in sizable research
projects to help our partners “validate” what is working, as that evidence
may support further investment, reaching more youth, and spreading what
our partners have learned.

In the course of our work, we have seen community organizations with deep expertise contribute to how research gets produced, researchers with a desire to work with community organizations to define outcomes, and funders committed to advocating for a central role at the research table for community organizations.

On the flip side, we have heard program staff struggle with what the proposed research design means for their relationships in their communities and with their families. We have heard social workers wonder why validating their work to those with power starts with inputs that have structural racism built into them. We have seen funders, researchers and community organizations jump to action when research protocols or timelines are at risk, but stay silent when research does not produce a high quality outcome for the community. We have heard community organizations wonder why researchers who are disconnected from their communities have the privilege of telling them what works and what does not. And in listening to those we serve, Jonte, a young man in a program we invest in, unforgettably asked, “why am I always being researched?”

We recognize our privilege at Chicago Beyond as funders of both community organizations and researchers, and with that the power to shape research and the resulting knowledge produced. Therefore, we recognize the need for us to check our privilege, our power. At Chicago Beyond, we devote considerable focus to identifying and counteracting our own personal biases through anti-racist trainings, calling out inequities, and being in communities. We are a funder whose leadership is majority people of color. Today, our relationships with community partners are intensely human. We dedicate a team of individuals to our partners to provide thought-partnership, strategic support, a sounding board, and more. Our partnerships are multi-year, and are not contingent on research outcomes or compliance checklists. When the research conducted does not match the intent of the research, we look for our accountability even though the power dynamic might have suggested that a funder like Chicago Beyond does not have any.

SO WHAT?
Right or wrong, research can drive decision-making. If we do not address the power dynamic in the creation
of research, at best, we are generating partial truths to inform decision-making in the social impact space. At
worst, we are generating inaccurate information that ultimately does more harm than good in our communities.
This is why we care about how research is created.

This guide was designed to help us level the playing field and reckon with unintended bias when it comes to research. It is about shifting the way community organizations, researchers, and funders ask for, produce, and use
knowledge. It is about restoring communities as experts. It calls on us all to stop, recognize, and question bias. It is
an equity-based approach to research. It is based on the steps and missteps of Chicago Beyond’s own experience
funding community organizations and research, and the courageous and patient efforts of our partners, the youth
they serve, and others with whom we have learned.

Chicago Beyond created this guide because we see the opportunity for a new path. Shifting the existing power
dynamic offers a way forward: From not enough evidence on what works, to more meaningful knowledge that
supports more meaningful action.

Color of Chicago Beyond’s Experience:


In the North Lawndale neighborhood of Chicago, decades of hosting research studies have been punctuated by
disinvestment. Researchers are not popular, and are rarely seen as separate from their institution and its racial history.
When the leadership of one of Chicago Beyond’s community partners determined that a rigorous research study was
the best way to make the case for change within the juvenile justice system, it brought up fierce distrust among their
staff. To reach a more authentic truth in service of change, we addressed the power dynamic in these ways:

1. Engaged with our community partner at a human level and built trust.
ö Broke bread and spent time sitting on the stoop, as human beings, sharing our hopes and fears.
ö Participated in a Peace Circle, and asked that the research team participate in a Peace Circle
separately. This opened the door for the community organization to build trust with us and the
researchers (for definition of Peace Circle, see glossary on page 110).
ö Continued to nurture the relationships we built and hold today.

2. Supported access to information on the community organization’s own terms, and helped inform the
organization and community members.
ö Funded access to research based on the community organization’s learning goals.
ö Sat with the community organization’s Executive Director and the research partner to help the community
organization get informed about options for research and the related risks. Still, it was not until later, as
our relationship developed, that we were trusted enough by enough of the community organization for
them to truly recognize what the risks and costs were, and it was not until later that the research felt “real”
to the Executive Director.
ö Understanding that it takes time and multiple engagements to address the information imbalance, we
connected the community organization with another nonprofit going through a similar type of research to
share their perspective.



3. Followed our partner’s lead on research design and supported their ownership of the research study.
ö Before setting study design and outcomes, together we determined the primary audiences for the final output of the evaluation, to use the needs of those audiences to inform the study, and to cultivate a network of support. The intention was to ensure the final outcomes would be helpful to our community partner, and lead to the systemic change they hope for.
ö Built upon the community organization’s vision for an “Evaluation Committee” comprised of staff members from all parts of the community organization, as well as researchers, and Chicago Beyond. The group incorporated the community organization’s goals and questions into the study, made critical decisions about the study’s design and is shaping study outcomes.
ö With guidance from the Evaluation Committee, we designed a consent process to inform participants about the research in accessible language before asking if they would participate.

4. Supported our partner’s study rollout and helped address intangible costs.
ö Worked with the Executive Director to map stakeholders and support communication within the organization and with critical organizational partners.
ö Worked with the community organization to develop internal tools to help the organization grow and best leverage its team and their skills.

5. Embraced messiness, interruption and being uncomfortable. Put accountability, not comfort, first.
ö Changed the funding timeline substantially to allow time to understand and prepare, emotionally and organizationally, for the execution of a rigorous evaluation.

WHAT WE MEAN BY EQUITY-BASED APPROACH TO RESEARCH

At Chicago Beyond, we believe that research can drive decision-making. For that reason, we also believe that the creation of research should begin from a place of mutual understanding between community organizations, researchers, and funders. Those involved in the research design must recognize unintended bias to arrive at an authentic truth that does the most good for those being researched.

“Researchers have to understand the reality and the culture of the participating population. For example, asking a question as simple as, ‘Do you put your kids to bed?’ will mean different things to different people. And, may mean something totally different to you now than when you were younger. Growing up, my mom told me to ‘put my butt in bed,’ and that’s what I did. If she were asked that question back then, she might say no, but the reality is, she did put me to bed because she was home caring for her family, making sure we were fed, bathed, and in the bed. She would probably answer ‘no’ because the generalized version is putting your kids to bed, reading to them until they fall asleep, and turning off the light. Researchers have to understand these many nuances to truly assess the population.”

SHELDON SMITH, EXECUTIVE DIRECTOR AND FOUNDER, THE DOVETAIL PROJECT


“When was the last time you were at a gathering and the ‘subjects’ of a study talked about it in glowing terms, because the research helped the practical work?”

RAMI NASHASHIBI, EXECUTIVE DIRECTOR, INNER-CITY MUSLIM ACTION NETWORK (IMAN)

Chicago and Beyond: The History of Urban Research

Chicago changed how urban studies are done.
What is the history of research, in Chicago, nationally, and beyond? Rather than a partnership of equals,
there is a legacy of researcher “brains” and community “brawn.” In many communities, the remembered
history is that when the community and research institution interact, the institution benefits. Countless
research surveys mine communities for the raw material of lived experiences, without yielding much for
the community—or worse. Yet, there remains a lack of evidence about the value of interventions for those
from whom the most has been taken. That “lack of evidence” justifies investing less still. Community
organizations often cannot afford access either to large datasets or to the kinds of researchers that
institutions attract, and their own data and stories have limited influence on decision-making until those
institutions authenticate them. Like most people outside of universities, community organizations may not
be aware of the options in research, or what they may risk for scientific rigor. Some may not dare to assert
themselves, internalizing biases about who the experts are.

Today, in Chicago, we see new opportunity to shape urban research, for greater equity, and greater impact.



“PEOPLE SAY ‘IF YOU’RE NOT AT THE TABLE, YOU’RE ON THE MENU.’ It’s more than that. It is how you are at the table. If you don’t decide what’s on the menu, if you are invited after the menu is set, you are still a guest. We as researchers get funded to be hosts. But in truth, the community should be the hosts, we are guests.”

ANGELA ODOMS-YOUNG, ASSOCIATE PROFESSOR, UIC INSTITUTE FOR HEALTH RESEARCH AND POLICY

HOW THIS GUIDEBOOK IS ORGANIZED

This guide begins by naming seven inequities held in place by power, and calls out how they get in the way of truth and impact.

With each inequity, there are suggestions for potential ways forward for community
organizations, funders, and researchers.

This is followed by specific questions in the guide to help community organizations, researchers,
and funders relate to one another differently, for greater impact. The guide organizes these
questions from start to finish of a research study, so users can add questions to their individual,
or collective, agendas at the appropriate time.

This guide is a start—it is by no means the answer. We wrestle regularly with operating in
this world, while envisioning an equitable one.



SEVEN INEQUITIES HELD IN PLACE BY POWER, SEVEN OPPORTUNITIES FOR CHANGE
Starting commitment

The challenge we face…


The work of changing “how it’s always been done” is hard. The most important thing
for all of us is human engagement and a continuous effort to check our biases.
Making technical changes without this commitment to openness will not work.

Community organizations, researchers, and funders can…


ĥ Bring awareness to your own biases and assumptions.

ĥ Start with this commitment and find new ways to relate to each other.

This guide can help.



ACCESS.
THE CHALLENGE
Access to creating knowledge about communities and the programs that serve them is controlled by people
outside those communities, who also often control the questions asked.

Conversations about research often happen without community organizations or community at the table, or on
an “invitation only” basis on others’ terms.

THE IMPLICATION
When a voice is missing from the table, the answers we get are insufficient. We may perpetuate bias, and fail to
find out.

THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to compromise what matters for the chance to produce research evidence.
ĥ Where possible, speak up to participate—or not participate—in research on your own terms, and shape research to help your community.

Researchers can...
ĥ Design research to serve community purpose.
ĥ Not participate in research that perpetuates the researcher as “brains” and community as “brawn” stereotype.
ĥ Insist that conversations about community happen with community.

Funders can...
ĥ Fund research that community organizations want, need, and are able to lead. Fund research that informs action on root causes.
ĥ Not fund research where the questions asked and the approach hold power dynamics in place.
ĥ Insist that conversations about community happen with community.

Pictured: North Lawndale

INFORMATION.
THE CHALLENGE
Information about research options, methods, inputs, costs, benefits and risks often resides with researchers
and funders, but less often with community. Often, the community does not have enough information to
contribute their wisdom to which questions are asked and why, what outcomes are the focus, and what data
sources are used—or to give informed consent to participate in the first place.

THE IMPLICATION
When one party does not have full information, it is difficult to partner effectively or to get to the full truth.

THE OPPORTUNITY
Community organizations can...
ĥ Get informed. Know your options, know your rights, know the risks.
ĥ Seek and use information to ask questions about methods and inputs.

Researchers can...
ĥ Share information, recognizing that without it, the community organization cannot actually consent to the research.
ĥ Have reciprocal exchange about methods and inputs.

Funders can...
ĥ Ensure accountability for the community organization to understand the options and the risks.



VALIDITY.
THE CHALLENGE
Community organizations and members are often viewed as credible sources when talking about the community,
but they are not viewed as voices with authority to sway those with power. The institutions, frameworks,
methods and data sources seen as most authoritative and valid are often far from community reality.

THE IMPLICATION
When outside experts hold the authority to produce and interpret knowledge, we diminish the value of
community voice. Without that community wisdom, we accept partial truths as the full picture.

THE OPPORTUNITY
Community organizations can...
ĥ Value the validity of your own voices at the table, especially on the questions, the inputs to answer the questions, and how participants experience the research.
ĥ Build relationship with the researcher. Check partial truths.

Researchers can...
ĥ Recognize how the research frameworks, process and inputs reinforce power dynamics, and bring your creativity to making change.
ĥ Build relationship with the community organization.
ĥ Check partial truths.

Funders can...
ĥ Be accountable for what questions the research you fund asks, and what processes and inputs it is (and is not) validating.
ĥ Create accountability for authentic engagement between community and researchers. This will check partial truths.

OWNERSHIP.
THE CHALLENGE
Community organizations often cede critical decisions about how the study operates, what to measure,
how study tools are developed, and what participants experience, because they do not feel they have equal
ownership in the research. This is solidified in legal agreements and permissions, which may say that the
community organization cannot access the data itself, does not own data produced by the study, or must seek
permission to speak about the research.

THE IMPLICATION
Without shared ownership, the process of research can take from, rather than build up, the community, and the
inputs and answers are incomplete.

THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to cede ownership.
ĥ Build your ownership of your study, which starts with knowing what you want to learn, and why. It is likely that your organization has the most at stake.

Researchers can...
ĥ Invite co-ownership of research, in your processes and legal agreements.

Funders can...
ĥ Set expectations for co-ownership of research, in processes and legal agreements.



VALUE.
THE CHALLENGE
Research often fails to teach us what is working or produce value for the community.

At the same time, research often comes at a high tangible and intangible cost. Community organizations and
communities shoulder financial and relational costs that are not explicit, visible, or compensated.

THE IMPLICATION
Uneven accounting allows investment to be high while impact is low.

THE OPPORTUNITY
Community organizations can...
ĥ Get aware of the potential costs—including intangible costs to your participants, organization and community—and advocate for a full accounting.
ĥ Get clear and speak up on how research can produce value for your community.

Researchers can...
ĥ Recognize the full cost of the research—including intangible costs to participants, community organization, and community—and find ways to support it.
ĥ Shape research so producing value for the community is central.

Funders can...
ĥ Account for the full cost of the research—including costs to recruit additional participants, intangible costs borne by participants, staffing at the community organization—and support these costs.
ĥ Insist on clarity: How will the research benefit the community? According to whom?

ACCOUNTABILITY.
THE CHALLENGE
Often, funders and researchers choose whether or not to take responsibility and make changes when the way
research is designed unintentionally creates harm or does not work, while the community organization and
community bear the greatest risk.

Community organizations have to prove their effectiveness and fidelity, while funders and researchers are
exempt from the same scrutiny and vulnerability.

THE IMPLICATION
Without accountability, trust is limited, and the work cannot be as bold. Worse, communities can be harmed.

THE OPPORTUNITY
Community organizations can...
ĥ Build trust-based relationships with the other entities. Hold researchers with whom you have built trust accountable.
ĥ Identify and mitigate risks to you and to your stakeholders.

Researchers can...
ĥ Build trust-based relationships with the other entities. Be accountable to understand the context.
ĥ Own your role in missteps.
ĥ Help identify and mitigate risks.

Funders can...
ĥ Build trust-based relationships with the other entities. Be flexible in timelines so trust can develop. Be accountable to understand the context.
ĥ Own your role in missteps.
ĥ Help identify and mitigate risks.



AUTHORSHIP.
THE CHALLENGE
Often, the power dynamic lifts up the voices of researchers and funders to shape the narrative and pushes
down the voices of the community. Racial dynamics between white researchers and funders, and communities
of color, contribute to the imbalance. Funders are often cast as “outside of the work,” and researchers as
objectively neutral and merely “observing the work.” This does not account for the biases and perspectives
every person brings to the work.

When data is analyzed and meaning is derived from the research, the power dynamic often mutes voices of
those who are marginalized.

THE IMPLICATION
When we restrict authorship and ignore bias, it allows incorrect meanings to be drawn.

THE OPPORTUNITY
Community organizations can...
ĥ Recognize that the power dynamic makes it tempting to cede interpretation and presentation of the results.
ĥ Participate in how results are made into meaning, and shared. Is it contextualized? Can you hear your participants?

Researchers can...
ĥ Invite co-ownership in contextualizing and sharing results. Analyze and frame data with an equity lens for greater impact.

Funders can...
ĥ Set expectations for co-ownership of contextualizing and sharing results. Create accountability for an equity lens for greater impact.

A NOTE ON SOURCES

The content here draws on Chicago Beyond’s role in the power dynamic, experiences with our community and research partners, and experiences others have shared.

We credit any wisdom here to the patience and courage of those we have worked with and learned from.

While there does not yet seem to be a consistent practice or set of protocols for an equity-based approach to research, we have learned from several sources of wisdom in addition to experience: ancient and indigenous approaches to knowledge; systems thinking; community-based participatory research; design thinking; racial equity and cultural awareness; epistemology (for definitions see Glossary on page 110).

This guide can inform research and its close cousin, evaluation, but uses the word “research” throughout.


“We have to try to see from participants’ perspectives, to understand the impact of what we are doing.”

JOHN RICH, MD, DIRECTOR, CENTER FOR NONVIOLENCE AND SOCIAL JUSTICE, SCHOOL OF PUBLIC HEALTH, DREXEL UNIVERSITY, CO-FOUNDER OF HEALING HURT PEOPLE



FOR COMMUNITY ORGANIZATIONS

Before you start: Should we do research?
Research can be empowering, overwhelming, motivating, exhausting, and more. For
a community organization, giving a second thought to why your organization should
take on a research evaluation—beyond the notion that data drive funding—can
enable your organization to not only participate in a study, but to guide it on your
own terms. This makes it more likely to produce knowledge that helps, rather than
harms, your mission.

Consider some of the benefits community organizations have gotten from research:
New knowledge.
ĥ Access to data about program participants they might not otherwise have, building motivation for staff who can
better see their direct impact through this information.

ĥ Better understanding of how the program is working, what parts of the program are working best, or which
participants are best served. Sometimes called an “implementation evaluation,” this research can help
organizations figure out what is core to the program and if it is being delivered as intended. These types of
learnings have helped community organizations improve how they recruit instructors or teachers, how they
recruit participants, what parts of the program to focus on more, or how they measure program quality so that
what matters most, happens most.

“Validating” the model.


ĥ Perceived validation from an external person or organization of whether the program is doing what they believe
it is doing, with those they intended to serve. Sometimes called an “outcomes evaluation,” this research can
provide an evidence base from an independent source.

ĥ Ability to communicate to philanthropy the value of the program.

ĥ Data to drive their ability to apply for government funding.

Preparing for growth.


ĥ If the organization wanted to try growing the program, standardizing it, or building program infrastructure,
committing to research and the pressure of a research timeline can force the organization to make these
activities a priority.

Along with the benefits, there have also been costs, such as:

Hidden costs of implementing the research study.
ĥ Research can change the day-to-day processes within a community organization. For example, organizations have had to close the door on providing services to some young people for an extended period, or have had to move from a first-come-first-served model to over-recruiting and then selecting and rejecting participants by lottery, to build up a big enough group of people in the research study (some types of research require a big group of study participants in order to attribute impact to a program).
ĥ Research took much more work and time from program staff than the community organization expected.
ĥ The experience of the research reinforced a sense of powerlessness because the researchers were perceived as “the experts,” or triggered memories of previous exploitative interactions with institutions.

Lost energy and time due to gaps in understanding.
ĥ For example, researchers did not understand the community organization’s recruitment process, and the community organization did not understand the assumptions in the researchers’ math, when setting the recruitment target for the research.

Unmet expectations.
ĥ The research study did not show effectiveness. Only a fraction of randomized controlled trials of interventions—across business, medicine, education, and employment—show a positive impact compared to usual services without the intervention. (You can find more detail on this in the piece from Arnold Ventures, formerly The Laura and John Arnold Foundation, listed in the Bibliography on page 111. “Randomized controlled trials” are defined on page 37.)
ĥ The research study did not show effectiveness because of a problem with the research design. Many studies are not designed well enough to draw a conclusion of whether the program “worked.” For example, a research study could fail to capture the impact of a program because that impact shows up over a much longer timeframe, or across a broader ecosystem, than was feasible to study, or because there were not enough study participants for the research to attribute impact to the program.
ĥ The community organization expected the research would explain how (not just whether) the program worked, to help them scale up.
ĥ The research measured an outdated program model because the program evolved while the study design was fixed.
ĥ Easy-to-measure metrics did not capture what mattered to the program, or captured injustices beyond the control of the program or its participants. For example, research comparing arrest rates could capture increases in neighborhood policing, outside of the control of a neighborhood program.
ĥ Even after a very successful study, there continued to be a lot of work and a large gap in time to get funded.



This guide, and understanding how other community
organizations are using research to serve their justice
mission, can help you navigate the risks and make your
decision about whether to do research now, or not.
Organizations have said “no” to doing research even where the funding for it
was available, for example, because their partners or participants were tired of
being researched.

If you do decide to participate in research, there are a range of ways to begin. Your
research can draw on multiple approaches, depending on where you see benefit in
an external perspective or voice, and where your wisdom and capabilities lie. Like all
people, individual researchers may be biased towards approaches they are familiar
with, and their emphasis on first building trust with your organization will vary.

It is important you get informed. Where you feel you can, engage with funders and
researchers and speak up. When you do this, you further relationships and mutual
accountability. This guide can travel the journey with you.


1. KNOW YOUR ROLE, KNOW THE RISKS
EQUITY IN HOW YOU START

a. You are in charge of what you want to learn

Owning your research starts with knowing what you want to learn, and why. It is likely that your organization has the most at stake. Your learning agenda can be an asset, and it is yours. Before jumping in, conduct an exercise that has you asking, “what if?” Start by defining:

ĥ Your goals for the research. Start with fill-in-the-blank statements. On a blank sheet of paper, write down a
few outcomes that you would like to see at the end of your research project. Consider: What are the questions
you want to answer? What is your purpose of “filling in the blanks” and answering these questions? How do
the answers support your goals as an organization? Should the sentences focus not only on individual change
but on interpersonal change, change to families, or community change, to honor the work you do? You can
share these statements with researchers when you start working together.

For example… “First, we want to write strong applications for state government funding for violence
prevention, so we want to say our program reduces participants’ violent behavior outside of the program by
[percentage/measure], citing a rigorous outside evaluation. Second, we want to show that it is not just about
the participants, but also the families of participants that grow stronger through our program and become
advocates of change. Third, we would like to identify early indicators that affect whether a participant will
complete the program or not.”

Organizations have found research useful in day-to-day work when it identifies early indicators of the
overall goal, if staff or participants can affect those indicators. Organizations have also found it useful to
ask staff: What 3-4 pieces of information would help you to do your job better?

ĥ Your target audience for the research. The type of data you need to “fill in the blanks,” whether small
scale self-collected data or data from a large scale randomized controlled trial, depends on your audience.
Are you looking to improve the effectiveness of the work, and would this require making changes to services
along the way? Are you looking to evaluate outcomes to make the case to funders—what type of funders? Are
you looking to establish a case for change to a specific government entity?


Next, get to know the research partner and what they are bringing to the table:

ĥ Researchers’ openness to building trust. When you meet researchers, look for the foundations of a trusted relationship. For many community organizations, the instinct based on previous experience is ‘never give data to someone I don’t really know.’ Some community members remember exploitative or unethical interactions with research institutions. Community organizations have shared how rushing for the sake of timelines, rather than taking the time to share history and build relationship with researchers, can be counterproductive.

ĥ Researchers’ motivations. What motivates their interests in this work? What interests them most about this collaboration? Have they spent time in the community where you work?

ĥ Researchers’ agenda. What are their intentions for the research in the context of their professional work—what is their agenda? The value of sharing agendas is in the trust built together. Is the work with you supporting papers they intend to publish? Is it enabling them to fundraise for their institution? Is it meeting the requirements of funding they have already received?

ĥ Researchers’ relevant experiences. What stories can they share of their work that illustrate how they would work with you? How has community participated in identifying the goals of the research in previous projects? In designing the study and in particular how participants experience the study? In developing and testing survey instruments? In collecting data? In interpreting what the data mean and sharing results?

Discuss with the research partner what is realistic:

ĥ Questions asked. How do you prioritize the questions you want to answer through the research? Taking into account the effort and change required to tackle each question and who would need to make that effort (the researcher or the community organization), which questions will you focus on now and which will you defer?

ĥ Information generated.
ö What types of information can the research give you, and on what timeline? When will initial data be available, and what actions or decisions will it make possible?
ö What answers to questions or new learning can you be certain of getting, even in the “worst case,” and what is the potential “best case” of producing the research or evidence you are planning for? What will need to go right for the “best case” to happen?


b. Know your options


One barrier to equal partnership between researchers and those researched is not understanding the
options, and potential benefits and costs of each.

Few outside of research institutions have this understanding unless they have previously experienced
research projects, but without it, it is difficult for a community organization to participate with full voice.
One method of gathering data may be the ‘gold standard’ on paper, but may not fit your purpose, mission
or organizational stage.


What are the options for the research? Some terms that come up frequently:

QUANTITATIVE STUDY
A “quantitative” study focuses on numbers to assess implementation and/or the impact of your organization’s work. For example, a quantitative study might count how many people your organization serves, what services they receive, and whether they have stable housing after receiving services. A quantitative study often uses government data sets, for example from the public school system or the criminal justice system, census tract data, or surveys. A quantitative study can produce data on a large number of participants more cheaply than other approaches. It can show that stories-focused information can be generalized beyond the handful of participants telling the stories.

QUASI-EXPERIMENTAL STUDY (a type of quantitative study)
A “quasi-experimental” study is a type of quantitative study that shows a numerical change occurred, but does not show your program caused the change to happen. It does not involve assigning participants to two different groups and studying both groups, and therefore asks less from your organization and your participants. This will reduce your flexibility to change program elements during the period of the research. Will this generate what you are trying to learn? Is this rigorous enough for the audience you want to reach?

RANDOMIZED CONTROLLED TRIAL (a type of quantitative study)
An “RCT” or “random assignment evaluation” is a type of quantitative study used to show your program caused the change to happen. For example, it would allow a researcher to say “participants in this organization had stable housing more often as a result of their participation.” It involves assigning participants randomly to treatment and control groups, which is effort-intensive (more detail in the sections called Know the risks and costs and Plan for study recruitment below). This will reduce your flexibility to change program elements during the period of the research. It is often favored by public policy-makers. Is it necessary for your goals?

QUALITATIVE STUDY
A “qualitative” study focuses on systematically collecting stories and other non-quantitative information to convey the impact of your organization’s work. A qualitative study may use interviews, focus groups, or observational data, which means a researcher watching or listening to participants and staff members. For example, a qualitative study might summarize what participants are saying has changed in their lives while participating in your program. Particularly when you are trying something where not much is already known, rich qualitative information, even from a smaller number of participants, helps shed light on “why” and “how” your efforts are working, and why participants find it valuable. Case studies can offer rich insight—but are different than a systematic qualitative study that may guide program or policy changes. Community organizations have found this type of research helpful to scaling up their work because it helps you understand what pieces matter most. Qualitative data can guide improvements, for example: criteria in screening tools, characteristics of staff to hire for, service or curriculum improvements. Qualitative data can also suggest internal metrics the organization can use so that operations produce more of what matters. However, qualitative research can take time and be expensive.

MIXED METHODS STUDY
A “mixed methods” approach mixes numbers and stories, and can provide the best, and worst, of both worlds.

c. Know the risks and costs


Ask researchers, and reflect on:

ĥ Risks. What is the risk of each possible approach “failing” (i.e., not generating anything of value to your
organization)? For example, if a randomized controlled trial is not sufficiently “powered,” meaning there are
not enough participants in the research, it might not be able to show effectiveness even if your program is
actually effective. If a research study is not measuring the most relevant outcomes for your organization, the
research study might not show the true impact of your organization.

ĥ Costs. What will be the tangible and intangible costs of each possible approach?
ö For example, what additional or different work will staff have to take on?
ö What leadership and change management work will be needed to support your organization with the emotional
realities of new constraints? What will need to be formalized, or made into a manual, for the research to be possible
and will there be resistance to this that will draw on your organization’s energy or time, or strain relationships?
ö What are the specific costs to how you do your work of recruiting participants for the research? There is more
detail in the section called Plan for study recruitment on page 46, but at this stage, understanding the number of
participants your organization will need to recruit for the study to work is important.
To understand this, at the table, you need:
• (1) the voice of operations (for example, outreach, intake) to explain the process from interacting with a
potential participant for the first time all the way through program completion. Where does attrition happen
and how much attrition happens? How do participants graduate from or complete the program, and what
does this mean for how many new participants can be recruited per week or month? Will the study change the
process such that your historical enrollment or attrition might change?
• (2) the voice of your research partner to explain their best estimation of what number of participants
completing the program is likely to be needed for the research study to show an effect that researchers
consider valid.
• (3) to put these pieces together, to understand the cost to your organization and to your participants of recruiting the numbers needed (see the hypothetical illustration below).
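A hypothetical illustration of this arithmetic (the numbers here are invented for the example): if your research partner estimates that about 300 participants must complete the program for the study to show an effect researchers consider valid, and historically about 40 percent of the people you enroll do not complete, you would need to enroll roughly 300 ÷ 0.6 = 500 participants. If your program typically enrolls 100 new participants a year, that is five years of recruiting—before accounting for any change the study itself makes to your enrollment or attrition.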

ĥ Restrictions. What are the restrictions imposed by each possible approach on your organization?
ö For example, community organizations are often innovators and adaptors, but many research methods
require core elements of the work to remain the same for the duration of the research, which can be years.
Are you willing to not make substantive changes to your program for the period of the study?
ö For example, if you choose a research design with random assignment, while the number of participants
you serve may not change, the number of participants you need to engage with, and your flexibility to
decide who gets services and who does not, may. For programs that have kept an “open door” for young
people who have participated to leave and come back, random assignment may mean closing the door to
services for a group of young people for an extended period. It may mean many youth in the neighborhood
where you live and work interact with your organization, and then get “randomized out” of receiving
service. For programs with age limits, random assignment could close the door to the program for a young
person altogether.

ö What do you compromise if you do not have discretion to decide who gets services?


ĥ Learning from peers. Is it possible to speak with another community organization participating in a study
to ask about their experience and risks to be aware of? Is it possible for key staff from your organization to
listen to how another organization addressed risks outlined here? Community organizations have commented
that this dialogue helped them ask their research partners the right questions. Can the funder or researchers
make connections for you, particularly to a community organization with a similar depth of relationship with
their participants (e.g., a few hours once a week versus a deep ongoing relationship with a participant and
their family)?

d. Research means change, so get ready to lead through change


In deciding how extensive or rigorous a research effort to start, the capacity of your organization—including the leadership’s mindset for learning and for change, the culture of your organization, and the infrastructure in place—matters. These factors affect whether research may feel like some more work, or like a drastic change.

ĥ Existing data habits. Does your organization have habits of looking at data, reflecting, and learning? Or do
numbers usually come up only for funding proposals or report-outs to the board? Do you have data infrastructure
you can build from, such as spreadsheets or other ways of collecting information, or habits of using qualitative
and numeric data in day-to-day operations? Are staff already spending time building and maintaining this data
infrastructure, or will you need to identify staff and free up capacity to participate in research?

ĥ Staff’s experiences with research. What are staff members’ appetites for research? Who among your staff,
volunteers, partners, and community members could help shape research?

ĥ Program readiness. Is program infrastructure in place to enable research, for example consistently delivered
key elements of a program? If the research will require new “standards” or “manualization,” which means
writing down how you interact with program participants in a manual and doing it in a consistent way, will
your organization embrace this, or resist it as too institutional? This does not mean every participant must
receive the same services. For example, a study may allow for variation in what goals your participants set,
but measure progress on goals consistently.



“We know it works. So what
are the papers and numbers
going to actually do for us?”

KAREN JACKSON, DIRECTOR OF WORKFORCE DEVELOPMENT, LAWNDALE CHRISTIAN LEGAL CENTER

For the executive director/leader of a community organization preparing for the change that a
research effort may bring:

ĥ Listening. Before committing to research, how can you listen to all the reasons for why research should not
happen? Are staff concerned the numbers will not tell the whole story? That there will be copycats? That
their voice will not be heard or valued? While from the researcher or funder’s perspective, the research study
may be separate from the program, for your staff and participants, it is often all part of the work. Several
organizations have shared with us that it was not until these conversations happened that the impact of
the research became “real.” One-on-one conversations with naysayers, for example, staff, participants or
community members, can be a valuable opportunity to understand reasons for reluctance and opposition.
Listening in this way can inform whether and how you engage on the research, improvements to the research
plan, and how you communicate about the research.

ĥ Communicating.
ö How can you educate, support and influence your staff for a successful study? Does the entire team understand
what is happening, why, and how it will impact their work? How, and how often, will you articulate the potential
benefit of the research to your organization’s mission? How will you be open about the challenges and risks, and
about how workloads will shift as a consequence of doing the research? For larger nonprofits, how will you motivate
leaders within your organization to engage in these conversations deliberately with staff on their teams?
ö Who are other stakeholders you need to engage, for example, the community organization’s board, other funders,
community partners, participants, participants’ parents? Will having the researchers speak with these stakeholders
help you?

ĥ Champions.
ö Who, within your organization, tells stories that people hear? Are they informed about and supportive of the
research? How can you engage them?
ö As staff turnover is frequent in many community organizations, are there additional members of your organization
who can be included in the discussions from the start, so understanding of how and why the research began does
not sit solely with one or two people?


e. Know your rights


Power dynamics and trust affect how a community organization participates in research, whether the
process feels like justice work or the opposite, and whether the research succeeds.

Some community organizations may feel comfortable taking a direct approach on these issues. Others fear
that being assertive may jeopardize the opportunity.

ĥ From history to present. Name the local history. How have interactions between your community and
researchers typically worked? Set against that context, how do you envision roles and accountabilities
will work in this research?

ĥ Role of Principal Investigator. What will the Principal Investigator take responsibility for? Is it
appropriate for someone from your organization to act as a co-Principal Investigator? In some cases,
this can make the co-leadership and co-ownership of the research clear and help with power dynamics;
in some cases, you may be looking for the perceived validity that an external university researcher brings.

ĥ Property rights. Discuss intellectual property rights and data rights. Who can access the data and
when? When will data be processed and shared—at regular intervals, or not until the end of the project?
Who can speak about the data and publish the data? Whose consent is required, when? Some community
organizations have gotten legal advice and some have asked research institutions and funders to sign
non-disclosure agreements.

ĥ Signed contract. Sign a memorandum of understanding or similar written contract specifically covering
the study, responsibilities, and rights to the data and communications about the data.

ĥ Cost distribution. Discuss with researchers and funders how costs to your organization and community
generated by the research can be shared. Can researchers share in this cost, in kind, or financially?
In our experience, researchers have raised funds, for example to compensate participants for their time,
or for stipends for partnering public schools. Can the funder support these costs?

Color of Chicago Beyond’s Experience:


At Chicago Beyond, experience continues to teach us about the substantial tangible and intangible costs of
collaborations between community organizations and researchers that we and other funders had not accounted for.
Our notes:
1. Being more proximate enables us to learn. Through deep relationships we have seen myriad financial
and intangible costs of doing research from the nonprofit and community’s perspectives. We have built a
“growth team” that in turn builds trusted relationships with a broad array of people within our nonprofit
partner organizations. This intimacy begins in our due diligence process before making the investment,
where our team spends substantial time with, and writes or co-writes the investment proposal in
collaboration with, the nonprofit.
2. Funders have unique opportunities to support and reduce some of the costs. Some examples
from our work: collaboration on the purpose of the research; helping address strategic and operational
challenges such as navigating new recruitment targets; developing communications about the research
for the nonprofit’s staff, board, and community partners; supporting executive directors in their change
management efforts resulting from doing research.
3. We acknowledge that this work is difficult and messy. Timelines and timing of funding may need to
shift, when the cost of not shifting them becomes clear.


2.
COMMUNITY
AND VOICE:
SETTING UP THE STUDY

a. Set up for voice; Consider a community evaluation committee


“Bringing implementors and researchers
in close proximity is so important—so
we can have conversations without fear.
The power dynamic can hold up deeper
collaboration.”

LINA FRITZ, MANAGING DIRECTOR, PROGRAM INNOVATION, ONEGOAL

Research is traditionally set up to keep researchers separate and “neutral.” Engaging and getting proximate
is an orientation the community organization and researchers may need to cultivate together, and create new
structures for.

One starting point is face-to-face engagement between researchers and community, participants, staff, and
partners. Another is to start by outlining the potential benefits and harms of the research, and determining how
to hear the voices of those potentially affected through the process.


ĥ Identification of voices needed. What are potential benefits of the research, and who receives them?
What are the potential harms, and who carries them? For example, what processes of your organization may
be changed or interrupted? How will staff relationships with participants be affected? These can be written
out in a T-chart (a piece of paper with a large letter “T” creating two columns) with potential benefits in one
column and potential harms in the other, following the practice of health impact assessments, environmental
impact assessments and racial equity impact assessments. The individuals and groups named on the chart
may suggest the starting list of whose voices are important to hear.

ĥ Reciprocal engagement.
ö How can the researchers and community organization staff, participants, partners, or community members engage
face-to-face, and see each other as humans? Are Peace Circles an appropriate early engagement? Breakfast forums
for partners? Evening forums for community? For researchers, having the opportunity to connect with the work
on a human level can provide perspective and, in the words of one community organization, “a positive lens, not
just focusing on statistics of demise.” For community organizations, engaging with the researchers formally and
informally helps build a relationship of trust.
ö In some cases, you may assemble a community evaluation committee representing those voices, which may make
decisions about or give input into what is researched and the outcome measures used, and which engages regularly
with the research team. The committee may include staff of your organization, researchers, program participants or
community members. What is the mandate of your group? What powers does it truly have? How often will it engage
with the research team to not burden participants or overwhelm the research team, but also to effectively contribute
to the study design and execution? (In the Community-Based Participatory Research approach, this group is called a
Community Action Board and steers both the research and the related action.)
ö How will you engage with all staff who are affected by the research effort to seek input, while the study remains in
its formative stages? How will you engage community members who may be affected by the research effort? Are
community meetings appropriate? What is the mechanism to gather and incorporate input?

ĥ Engagement at the right times. When during the process can you still shape study design? Have you
discussed with the researchers the timing of when they will seek “IRB approval” or register the study’s
“pre-analysis plan?” An “IRB” or Institutional Review Board is an administrative body that confirms that
certain ethical considerations in research are met. A “pre-analysis plan” commits to what the most important
outcomes and approaches will be in your research. Once these steps in the research process occur, you have
limited ability to change the study design while safeguarding the validity of the study.


b. Participate in deciding what to measure – check the “simple” measures


“Numbers only tell a piece of the story. Some of what we do, you just
can’t quantify—driving a kid to a job interview after hours, where the
bus doesn’t go. It’s unconditional love that is keeping kids alive. But,
there are things that can be quantified.”
CHRISTOPHER SUTTON, DIRECTOR, YOUTH ADVOCATE PROGRAMS, INC.

Pictured: Youth Advocate Programs, Inc. group outing

What you measure is what you incentivize.


The “simple” measures may not be the ones that represent real growth and benefit to participants facing the
biggest barriers. As large administrative data sets were in many cases built to report on compliance, metrics of
compliance such as arrest rates are, not surprisingly, easier and cheaper to collect.

Take the example of recidivism. On the one hand, it is a common metric, with existing data sets. This means
it costs less money to track and allows comparison across programs. On the other hand, it does not capture a
program participant’s directional progress, while capturing things outside of the program participant’s control,
such as a concentration or increase of arrest efforts in a community. A piece by Jeffrey Butts and Vincent
Schiraldi on benefits and shortfalls of recidivism as a metric is listed in the Bibliography on page 111.

What is measured can be a productive focus for a community evaluation committee and researchers, each
contributing their expertise.


ĥ Understanding the risks of possible metrics and data.


ö What are the commonly used metrics for this type of work, and what inequities, historical or present, are built into
them? What assumptions are built into how these metrics are used? Some examples:
• Is measuring participant arrests in the current state of racialized policing an accurate metric?
• In a culture that values nuclear families over independent individuals, is measuring “progress” on
self-actualization the best metric?
• Is it fair to measure housing “overcrowding” for participants from a culture that values living with extended family?
• Is measuring wealth accumulation for families from a culture that values “sending money home” or supporting
extended family members an appropriate metric?
ö What are the benefits to your organization and to your participants of the commonly used metrics?
What are the harms?
ö What are the limitations of the data sets you are working with? How will the research team address these? For
example, department of employment services data do not capture all types of employment, which may result in
undercounting increases in employment among participants in a program.

ĥ Selecting appropriate metrics.


ö What does the work intend to create, in addition to what will be avoided? When do staff and participants perceive
progress and what marks progress in their eyes? How can this be reflected in metrics that are feasible in this
context? For example, are connections to caring adults built through the work important to capture? Expansion of
the participant’s support network? Changes in the participant’s perception of their agency?

One example is a prison-based fatherhood program (you can find the article by Abigail Henson listed at the
end), where dialogue between participants, community organization staff, and researchers changed what the
study measured: the unit was changed from the father to the family; the short-term measures were changed
from depression and stress to pride and reconstruction of masculinity (from provider to caregiver); and the
long-term measures were expanded from recidivism to whether the father-child bond remained active and positive.

ö What effects of the community organization’s work are important to capture in the research? Is the program affecting the
system or the capacity of the community, for example supporting cultural revitalization, changing power relations,
or increasing the capacity of the community to solve problems? Effects of the ecosystem on a participant, effects
of change by the participant on their families or communities, or broader impacts of a program’s work, may be less
simple to research. However, this may be critical to a community organization’s impact where easy-to-research
approaches have not worked.
ö Some community organizations have found that talking with trusted friends and peers has helped them to identify
additional outcomes to measure. Sometimes those outside the day-to-day work can help you to see your impact.

ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics within
your organization in a way that is respectful, and that recognizes that numbers are an incomplete picture?


c. Plan for study recruitment


“If I could start all over, I’d ask ‘What’s
the power number?’”

AIMEE STAHLBERG, ARTISTIC MANAGER, STORYCATCHERS THEATRE

Several community organizations have noted that a lack of shared understanding between the researchers,
those with program operations expertise, and those speaking for the community organization led to lost effort
and time, and anxiety, in recruiting participants into the research. In addition, optimism for the research
from the community organization’s leadership can contribute to blind spots when it comes to planning for
recruitment and retention, which can then make the research less effective.

ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together the participant group, or
“cohort,” for a research study? Who is closest to the work? Outreach? Social workers? Program directors?
Former participants?

ĥ Recruitment process. What is the detailed recruitment process based on these voices of expertise?
Importantly, the research itself can affect both the recruitment process and attrition. For example, the
research could change how participants apply to the program, from interested young people applying, to
schools generating lists of young people invited into the program; as a result of this shift, attrition will rise
from what the program has seen in the past. Where does attrition happen in the process and by how much?
What number of participants would need to start the recruitment process to end up with a certain number
of participants completing the program? How frequently will participants move out of the program and what
does this mean for how much time it takes for the target number of participants to complete the program?
What does “completing the program” or “graduating” mean for your organization, operationally? (A minimal
sketch of the funnel arithmetic appears below.)
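
To make the funnel arithmetic concrete, here is a minimal sketch in Python. Every stage name and pass rate
below is a hypothetical assumption for illustration—substitute what your outreach and intake staff
actually observe:

    # Hypothetical recruitment funnel: how many young people must enter
    # outreach so that a target number complete the program?
    funnel = [
        ("outreach contact -> application", 0.50),  # assumed: half of contacts apply
        ("application -> enrollment", 0.70),        # assumed: 30% attrition at intake
        ("enrollment -> completion", 0.60),         # assumed: 40% leave before finishing
    ]

    target_completers = 100
    needed = float(target_completers)
    for stage, pass_rate in reversed(funnel):
        needed = needed / pass_rate
        print(f"{stage}: about {needed:.0f} must enter this stage")
    # With these assumed rates, roughly 476 outreach contacts yield 100 completers.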


ĥ Recruitment target. The higher the number of study participants, the easier it is to show a scientifically
valid change, but the greater the effort to recruit participants. For the research study to show an effect
that researchers consider valid, how many participants does your researcher estimate need to complete
the program? Piecing this together with the program’s process from the start of recruitment to program
completion, what total number of participants must initially be recruited? Having the researchers and
operational team together name different scenarios, then explore the challenges and the staffing related to
each, can be a helpful approach. (A hedged sketch of how researchers often estimate this “power number”
appears below.)
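
Researchers usually answer the “how many completers” question with a statistical power calculation. A
rough sketch, assuming Python with the statsmodels library; the effect size, power, and alpha below are
illustrative conventions and assumptions, not recommendations for your study—your researcher should
supply values grounded in prior studies:

    # Rough sample-size ("power number") estimate for a two-group comparison.
    from statsmodels.stats.power import TTestIndPower

    n_per_group = TTestIndPower().solve_power(
        effect_size=0.3,  # assumed standardized effect (small-to-moderate)
        power=0.8,        # 80% chance of detecting a true effect
        alpha=0.05,       # 5% false-positive rate
    )
    print(f"About {n_per_group:.0f} completers needed per group")  # ~175 per group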

ĥ Resources required. What will your organization need to change to achieve the recruitment target? One
community organization believed their social worker could continue to manage recruitment as the research
study began, and later found they needed a dedicated person spending 30 hours per week to adequately
support recruitment and build new referral partnerships.

ĥ Data required. What other data about participants will researchers need to collect, and how will these be
collected—by researchers, or by staff and passed to researchers?

ĥ Communications.
ö How will the recruitment target and recruitment process be communicated to recruitment partners, staff, community
or participants?

These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?

ö Are breakfasts or evening community meetings appropriate? Should researchers be present to answer questions? What
about a school-night kick-off? Are one-on-one conversations between researchers and recruitment partners/staff
leading recruitment most appropriate? Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?



“I knew my fathers in the control group
needed to be compensated, and we
made it happen. Their time is valuable,
and they weren’t getting the benefit of
being in the program.”

SHELDON SMITH, EXECUTIVE DIRECTOR AND FOUNDER, THE DOVETAIL PROJECT

Pictured: The Dovetail Project graduation

d. Participate in designing how study participants experience the research
When those closest to participants, and participants themselves, shape how the study occurs, the approach can
become more equitable. Human connection makes visible impacts previously unseen by researchers and funders.

For the community organization, community members, or research participants planning with
researchers for the study:

ĥ Fit of research design and mission.


ö Throughout the study and as it ends, how is the research design consistent with the mission of the community
organization and the trust it has built in the community? For example, what will potential participants see after study
enrollment ends, particularly if the program will not have additional capacity at that point? What is the impact of
changes made because of the research study on the organization’s reputation in the community?
ö How is consent best approached? In this particular context, is it better for participants in the control group to meet
program staff in person and give consent, or to be randomized on a list without ever interacting with staff? Will
the control group, if there is one, be asked for consent? Is consent written in a way that is understood? Is consent
presented in a context/at a time when the participant actually has agency to give consent?


ĥ Recruitment strategies. What specific strategies can be used to bring participants into the study and
retain them through the course of the study? There is a “tax” on producing knowledge about those most
marginalized: it is harder for researchers to connect with those with instability in their lives; it is harder to
obtain consent from those who have learned to distrust institutions; it is harder to retain in a research study
those who experience greater barriers. How can social media be helpful? Will multiple redundant strategies
be feasible, to increase success?

The article “Study Retention as Bias Reduction in a Hard-to-Reach Population” by Columbia Professor Bruce
Western and colleagues referenced at the end may aid brainstorming on study recruitment and retention, e.g.,
the timing of financial incentives at the start of a study (if applicable); frequency of contact; back-up contacts
including mothers and supportive secondary contacts.

ĥ Experience of being in a research study.


ö How will study participants be contacted? How will they be engaged through the study? Who will engage them?
How can the nonverbal cues create the desired experience for study participants? What have participants heard
before about research in their community or by similar institutions? What is different here, and how is that explicitly
communicated in ways most likely to be heard?
ö The assumption in research can be that “nothing is happening” to the control group. But, from the perspective of a
young person going through an application process and being “randomized out” or from the perspective of referral
sources referring many additional people to a community organization, only to have them not receive services,
something has happened. If the study will include a randomized control group, who will communicate to the
control group about the randomization and what will be communicated? How will this feel standing in the young
person’s shoes? If members of the control group have been exposed to other research studies, as is often the case
in Chicago, what narratives is it important to address, e.g., a sense that “randomization” is not actually random? If
young people are randomized out early on in program enrollment, but are aware there is still room left in a sought-
after program, what is staff’s response? If participants in the comparison and treatment groups will interact, for
example, within a school, what communication will aid each group of people?

ĥ Responsive communications. What is the best way to communicate about what matters to participants,
for example:
ö What is the best way to communicate about privacy? Concerns nonprofits have shared, from their participants,
include: Who all will know I am in the study? Will my name be published anywhere? How much will they be in my
life? Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police?
ö What is the best way to communicate about benefits? Participants have asked: How does being part of a study
help me?
ö What is the best way to communicate about expectations? Participants have asked: Am I allowed to
participate in other programs/employment during the study? Can you still help me if I am not in the treatment
group? Can I apply again? Who is going to help me if you can’t?
ö Be aware that there may be misunderstandings about research preceding your study.

ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants). If this is not
an area in which your team is experienced, seek out expertise so you are equipped to answer these questions.


e. Build in time for reflection, failure, change
Meaningful reflection and problem-solving
during the initial steps of a community-research
partnership take time. Build this into the plan.

ĥ Cadence. What is the cadence on which the team should reflect on the work and engage on challenges?
Where and when will these conversations be held? Who will participate and who will lead?

ĥ Discussion topics.
ö Feedback on the research effort so far, and what
needs to be done to address it.
ö How are inequitable approaches, methods,
measures filtering into the study, and what are
opportunities to do differently? For example, are
you unintentionally taking advantage of what
your privilege allows you to do, such as dictating
meeting times and locations? How is your
work creating a way of operating intentionally
distinct from the legacy of “research brain” and
“community brawn?”
ö How are relationships of trust being formed, and
how is the team interacting as equals? Note that
while researchers’, staff, and community members’
roles in the research differ, the point here is that
no one is treated as superior or inferior.


3.
COMMUNITY
AND VOICE:
DURING THE STUDY

a. Engage during the study


While researchers may be primarily responsible for collecting the data, the research process can remain a topic
of joint reflection and improvement. The community organization or community evaluation committee’s specific
understanding of context and people is critical.

ĥ Feedback loop.
ö How are staff, participants’ and community members’ voices being heard during the course of the study? What
feedback is being shared and what can be done to address the feedback and communicate back to those who
shared it? Some organizations have found a biweekly conversation among researchers and those at the community
organization involved in implementing the study tremendously valuable to ask and answer questions and to plan for
each new step in the work in an organic and effective way.
ö Are you remembering to praise and reinforce the specific actions your research partner is taking in service of
equity?

ĥ Listening for clues.


ö How well are efforts to bring participants into the study and engagement with participants during the study
working? What is not working and what can be done better?
ö How well is engagement with those in the control group, if there is one, working? What is not working and what can
be done better?
ö How are inequitable approaches, methods, measures filtering into the research despite the best of intentions and
what can be done to reorient? Where do you need to advocate for more resources or time to be allocated? Who is
not appearing as robustly in early data (for example, young people experiencing housing instability or young people
with Individualized Education Plans), and does this offer a clue to focus problem-solving? Who is not experiencing
being heard?


b. Vet study tools and problem-solve together


In our experience, an important opportunity during the study is the collaboration between community
organization and researchers on study tools such as surveys.

ĥ Contextual awareness.
ö How can the community organization’s leaders and researchers best collaborate so study tools fit the specific
context and also have the validity desired?
ö How will draft study tools be vetted by participants? Are surveys of an appropriate length?

ĥ Execution.
ö Will certain types of data, for example social security numbers, be a challenge to collect in the specific context, and
what are alternatives?
ö If researchers are engaging with a representative group, for example a focus group, is it a meaningful “group?” How
can you and researchers together create the conditions for voices to be heard?
ö Can community members be hired to help collect data or conduct surveys?


4.
EQUITABLE NUMBERS
FOR IMPACT

a. Make it useful along the way


The history of distrust between community and researchers comes in part from community organizations
entering research partnerships expecting the work will improve lives of participants, and later finding the
research may not help them with day-to-day operations or bring resources to their community. How can the
research improve lives of participants as much as possible?

ĥ Early learnings. What are early learnings from the research and how can these be shared with staff,
participants, community, or partners to show what their hard work is yielding?

ĥ Early action.
ö What improvements to the program are possible during the research, given the context of your organization and the
research design, which may limit what changes can be made?
ö How will those responsible for making the changes engage with the data early on so they can plan for action?

ĥ Sense checking. Is there a role for your organization’s staff in sense-checking early data or preliminary
results? For example, if a researcher is looking for themes in stories from participants, how is the researcher
identifying and interpreting important themes (“coding” qualitative data)? Are preliminary results consistent
with the experience of your organization, or do they look wrong?


b. Bring attention to who is benefited the most? Least?


Many times, averages hide whether participants most forced to the margins are left further behind, unaffected,
or helped by a program.

[Chart: “When you disaggregate data, what can you notice?” Two panels contrast an aggregated average with
disaggregated averages. Each compares the average before the program with the average after the program;
disaggregating shows the average grew because of one group of participants, prompting the question: what is
the learning for these participants?]

If your community organization’s mission is to serve participants facing barriers, or if you are innovating and
looking at whom you could serve, a key purpose of the data is to help you understand who you are succeeding
in reaching (and who you are not reaching); who is benefiting most from participating in your program (and who
is benefiting least); and what can be learned from the complex stories beneath these numbers to help you do
your work better.


ĥ Disaggregated data. Take whatever metric you have chosen to measure how your program helps your
participants—for example, a grade point average. How much change does the program create for participants
starting out furthest behind, versus participants in the middle, versus participants starting out furthest ahead?
(This must be done in a way that continues to protect individuals’ privacy. A minimal sketch of this
disaggregation appears below.)
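
As one minimal sketch of what this disaggregation can look like, assuming Python with pandas and entirely
hypothetical, de-identified data (the column names and values are made up for illustration):

    # Disaggregate pre/post change by where participants started.
    import pandas as pd

    df = pd.DataFrame({
        "gpa_before": [1.2, 1.5, 2.4, 2.6, 3.5, 3.7],  # hypothetical values
        "gpa_after":  [2.0, 2.2, 2.6, 2.7, 3.6, 3.7],
    })
    df["starting_group"] = pd.qcut(
        df["gpa_before"], 3,
        labels=["furthest behind", "middle", "furthest ahead"],
    )
    df["change"] = df["gpa_after"] - df["gpa_before"]
    print(df.groupby("starting_group", observed=True)["change"].mean())
    # Here the overall average change (~0.32) is driven almost entirely by the
    # "furthest behind" group (+0.75) -- the aggregate alone would hide this.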

ĥ Learning from who is benefited least and most.


ö Who benefits the most by your organization’s work and is this who your organization intended to serve? What are
the characteristics—demographic, intersectional, and situational, for example, housing stability, adult relationships,
connectedness—of the small group of participants who benefited most?
ö Who is benefited least by your organization’s work? What are the characteristics of this small group of participants
and are these characteristics of participants you intended to serve?
ö What can your organization learn from these stories? Do the greatest success stories, recognizing their complexity,
spur ideas for innovation? Are there adjustments to the program your organization will try out, to better serve your
target participants?

ĥ Counting inequity.
ö Is the correlation between race and outcomes, or class and outcomes, changed as a result of your organization’s
work? This is one way to look at impact for equity; a minimal sketch of the check appears below.
ö How can your organization use these data and stories to communicate your impact for equity?
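
One hedged sketch of this check, again in Python with pandas; the indicator and scores are made-up
illustrations, not real data:

    # Does the association between a demographic factor and the outcome
    # shrink after the program? (All numbers are hypothetical.)
    import pandas as pd

    df = pd.DataFrame({
        "low_income":   [1, 1, 1, 0, 0, 0],
        "score_before": [50, 55, 52, 70, 72, 68],
        "score_after":  [66, 70, 65, 74, 75, 71],
    })
    r_before = df["low_income"].corr(df["score_before"])
    r_after  = df["low_income"].corr(df["score_after"])
    print(f"correlation before: {r_before:.2f}, after: {r_after:.2f}")
    # A correlation that moves toward zero after the program is one signal
    # that the outcome gap tied to income narrowed.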

c. Who is the comparison point?


Just as averages can obscure who, specifically, is and is not benefiting, comparison to an average can be
misleading. For example, comparing to an “average” client load may not be relevant, and may burn staff out,
if the work involves complex trauma. For example, if custodial sentence lengths get shorter because of a
program and there is no change in recidivism, this can represent an improvement, since an increase in
recidivism is normally expected whenever sentence lengths get shorter. For example, a young person dropping out
of a program and re-engaging several times can be an indicator of an organization’s persistence in building
relationships, rather than an efficiency marker to improve.

ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example the
implications of stress experienced by program staff as a factor in selecting what productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. If you want to produce cost-benefit numbers, how do researchers propose to capture
systemic effects? Accounting for the “cost” of the way things are right now only in terms of tax dollars—which
are often actually wages that benefit a different group of people—rather than the actual social cost can
unwittingly and incorrectly build a case against investment in your work. (An illustrative sketch appears
after this list.)

ĥ Capturing growth. How does the presentation of the data reveal growth journeys which may not be linear?
Does the comparison point allow for—to use substance abuse terminology—relapse in the context
of recovery?
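
To illustrate how the choice of cost baseline can flip a cost-benefit conclusion, here is a deliberately
simple sketch in Python; every number below is a made-up assumption, not an estimate from any study:

    # Narrow vs. broad accounting of a program's benefit per participant.
    program_cost = 10_000          # assumed annual cost per participant

    tax_dollars_avoided = 8_000    # narrow: direct tax outlays avoided
    social_costs_avoided = 9_000   # broad: adds avoided social costs
                                   # (lost wages, harm to families, instability)

    narrow_net = tax_dollars_avoided - program_cost
    broad_net = tax_dollars_avoided + social_costs_avoided - program_cost
    print(f"narrow accounting: {narrow_net:+,}")  # -2,000: looks like a bad investment
    print(f"broad accounting:  {broad_net:+,}")   # +7,000: looks like a good one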


5.
SHARING
RESULTS

a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.

As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.

ĥ Context. What history and explanation of structural and systemic factors is important to frame the challenges the
organization is addressing? To explain fully why the problem exists in the first place and the complexity of root causes
and pathways? How can this show up in the description of the organization’s work and dissemination of the research?

ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work in the
community, or its type of work? How have these narratives served participants well? How have they harmed them, or
reinforced inequities? With this understanding, how can this research be framed to take on unjust narratives? What
cultural context is important to tell?

ĥ Limitations. How are limitations of the data clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? “Objective” data like special education designations, census data,
crime that is measured by arrests, or domestic violence information, can incorporate racialized processes and
lead to incorrect interpretations, without context.

b. Can you hear your participants? Are you signaling that lived
experience is valuable?
Valuing data to achieve an end—whether securing funding, improving programs, sharing learning with the
field, changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring
a participant’s voice requires intention; it may not just happen from documenting a participant’s story,
demographics, or outcomes.

ĥ Participant voices. Can you hear your participants in what you, the researchers, the funders, are disseminating about
the research? How are images, stories, numbers resulting from the research effort putting participants at the center in
how they are shared, versus treating them as objects of a study or as tokens to lend credibility?


ĥ Authorship. Is there an opportunity for you or your participants to own elements of or co-author what is
produced, or be editors? Or does it serve you to have an external author, because an external person is
perceived as not biased?

ĥ Respecting experience.
ö How does the presentation of results message to the audience that experience is valuable and valid, rather than
reinforcing the bias that university expertise produces validity?
ö Would participants find that what is being put out is true to their experience?

c. Is it accessible? Can those researched hear the research?


Often research results are shared through academic journals, many of which are accessible only to subscribers.
Prose, data tables and charts may be written so they are inviting only to those with research and statistics
backgrounds. These walls hold up traditional power dynamics of who owns the research, and who is dependent
on others to share it and interpret it.

ĥ Language.
ö Is the language used as easy to understand as possible? Do people from different cultures, with different lived
experiences, with different technical backgrounds understand the results of the research and the “so what” of what
it means, when you test an early draft?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds
can understand what has been counted, how, and based on what judgments? Are data tables and charts readable to
those without research and statistics backgrounds? Are any technical terms used defined in plain language?

ĥ Forums and formats.


ö As the community organization owning the research, how will you collaborate with the researchers on identifying the
format(s) that will make the results most accessible to those affected by the research? To those who have power to
support the work? To your partners or those who are part of your coalition for change?
ö Where—in what forums—will the results of the research be most likely to reach each of these audiences? Does it make
sense to host community discussions to share results? Post video on social media? Design a simple summary that
articulates what was learned and to what end?
ö Revisit the chart you made to identify those potentially affected by the research. Are those people being reached?



FOR
RESEARCHERS

Before you start:


What is the power dynamic?
Because evidence truly matters, we must care how it is made.

The existing power dynamic between community organizations, researchers, and funders is getting in the way
of the scale of impact that we collectively intend.
The power dynamics between specific researchers, funders, and community
organizations will vary. Some funding, including government grants, will impose
restrictions. Some community organizations may feel comfortable asking questions
and asserting their perspectives; others may fear that being assertive may jeopardize
the opportunity and funding. In all cases, researchers wield substantial power in
shaping the questions asked, and the inputs used to answer them.


“What are the inherent biases we hold? The quantitative
paradigm says it is ‘controlling for these.’ But what are
the biases built into how your ‘knowing’ works?”

THEODORE CORBIN, MD, CENTER FOR NONVIOLENCE AND SOCIAL JUSTICE, SCHOOL OF
PUBLIC HEALTH, DREXEL UNIVERSITY, CO-FOUNDER OF HEALING HURT PEOPLE

“People ask: ‘How do I get started? How do I encourage voice?
How do I use my voice?’ There is academic theory, but people
have challenges doing it. That’s where this work comes in.”

ANGELA ODOMS-YOUNG, ASSOCIATE PROFESSOR, UIC INSTITUTE FOR HEALTH RESEARCH AND POLICY

Depending on the kinds of research you do, your research institution, and
your own experience, the action you take will look different.
For example, you may:
ĥ Reflect individually or engage with your colleagues and institution on biases and how these flow into
your research

ĥ Change how you engage with community to identify research questions and study outcomes

ĥ Propose timelines for research differently, for example to support trust-building, or to develop survey
instruments with community input and community testing

ĥ Interrogate numbers and stories you lift up, and use different framing in what you publish

ĥ Evaluate your own work differently, or engage your funder stakeholders differently

Making intentional change can feel messy and uncomfortable. It requires openness to new perspectives
and unlearning old ones. It requires shifting power dynamics, departing from how “it has always been
done.” Starting from relationship and accountability, researchers can unlock immense creativity, to
achieve the promise of what knowledge can yield for communities.


1.
KNOW YOUR ROLE,
KNOW THE RISKS
EQUITY IN HOW YOU START

a. Bring awareness
How research gets done—the approaches, methods, metrics—has a system of assumptions built in.

Some sources of context include MIT historian Craig Wilder’s Ebony & Ivy on the history of elite academic
institutions in justifying inequities, and the Harlem Children’s Zone’s writings on their decades of experience
“being researched” (you can find both in the Bibliography on page 111).

In seeking more equitable approaches, one place to start is understanding context and biases.

ĥ Systemic awareness. What are the assumptions built into the research approaches you use most
frequently? How do these approaches reinforce the privilege of those already powerful? Whose competence,
capacity, or fidelity needs to be proven, and whose is taken on faith?

ĥ Institutional awareness. What is the history of research in this community, and the particular history
for this community organization and these participants? How have research institutions previously been
experienced? How does your organization benefit from this dynamic? What experiences, e.g., being reported
to child services, might a research effort bring to mind, even if not technically related to the research?

ĥ Personal awareness. Everyone comes to their work with some personal knowledge and assumptions. What
methods to create knowledge are you predisposed towards? What are your assumptions about the people
and the context of your work? For example, if employment is a focus of the study, what are your assumptions
about “valid” employment?

ĥ Awareness of what is at stake. From the community organization’s perspective, what benefits may
participants in the research see and how certain are those benefits? What may be the cost to participants, to
the community organization, to the community? What are the risks of producing the research or evidence you
are planning for? From your own perspective, what benefits may the researchers and their research institution
get from the work? How certain are those benefits and what risks do the researchers bear?


“Evidence, without seeing humanity, is folly.”

MICHAEL MCAFEE, PRESIDENT AND CEO, POLICYLINK

“You have to build a relationship. There’s a history. It
takes time, it takes trust, it takes being vulnerable.”

MARQUELL JONES, MAC HOUSE CASE MANAGER, LAWNDALE CHRISTIAN LEGAL CENTER

b. Listen and relate, in order to fit research to purpose


Spending time with, and spending time listening to, the community organization and members of the
community humanizes research and enables you to find the right fit between purpose and research design. This
is the foundation for an equitable engagement.

First, invest in building relationships:


ĥ Building trust. How will you begin to build relationships of trust? How can you spend time in the
community? Could sharing what motivates your interest in the work be valuable? What interests you most
about this collaboration? For many community organizations, the instinct based on previous experience is
“never give data to someone I don’t really know.” Community organizations have shared that rushing for the
sake of timelines, rather than taking the time to share history and build relationships with researchers, can be
counterproductive.

ĥ Sharing your agenda. Sharing agendas openly contributes to building trust. How does this work fit into your
professional research agenda? What is your other current research? What are your intentions for the work and
your research institution’s priorities?

ĥ Sharing your relevant experiences. What stories can you share of your work that illustrate how you would
work with the community organization?


“Our study is working well because everybody is unveiling themselves,
taking their degrees off the wall, rolling their sleeves up.”

CHRISTOPHER SUTTON, DIRECTOR, YOUTH ADVOCATE PROGRAMS, INC.


Next, listen in order to fit research to purpose:


ĥ Goals for the research.
ö What are the organization’s specific intentions for the research? What are the few statements the organization would
like to be able to “fill in the blanks” on at the end of the work, and the organization’s intention in filling out those
blanks? (You can find prompts for the community organization in the section called You are in charge of what you
want to learn, page 34.)
ö What other effects does an organization have on the ecosystem around an individual participant?
ö How could the research capture the benefit of the program over the appropriate duration, given that benefits may
show up over time, and not immediately?

ĥ Target audience. Where does the organization intend for the research to land, and what type of data and
what research design serve that purpose, with least burden on the organization or its participants? How can
you shape the research design and approaches to serve the purpose intended?

ĥ Organizational context. What are the organization’s existing approaches to reflection and evaluation that
precede your collaboration?

ĥ Information generated.
ö When does the organization look to act on learnings? When will initial data be available, and what actions or
decisions by the organization can it enable?
ö What answers to questions or new learning can the organization be certain of getting, even in the “worst case,” and
what is the potential “best case” of producing the research or evidence you are planning for? What will need to go
right for the “best case” to happen?

c. Help create understanding of possible approaches and methods


One practical way researchers can help to address the information imbalance is by helping their community
partners see the possibilities for evaluation, and the potential benefits and costs of each. Without this
understanding, it is difficult for a community organization to participate with full voice. Translating expertise on
what is possible and the implications to an organization’s specific context can be a valuable service.

ĥ Options.
ö What type of study is sufficiently rigorous for the intended audience while minimizing burden on the organization?
What type of study is valuable to future audiences, or enables systemic change if that is envisioned by the
organization? How does understanding how the program is working fit into the picture? Would a qualitative or mixed
methods study suit the organization’s learning objectives? You might share a simple table such as the one that
appears in the community organizations section under Know your options.
ö What approach to research—from traditional, to community-engaged, to full partnership between community and
researcher—is most suitable? (more on Community-Based Participatory Research appears in the glossary on page 110).


ĥ Pressures you face, experience you have. Are there institutional pressures you as a researcher face to
conduct certain types of studies, or are there types of studies where you lack experience, and have you made
those explicit to the organization?

ĥ Benefits, risks, costs. What are the implications—benefits, risks, costs—of each approach, from the
organization’s perspective?
ö What are the benefits of each possible approach? What ongoing benefits could the community organization see
after the research ends?
ö What will the organization be asked to compromise? For example, would the organization not be able to change
key program elements during the period of the research? What would this mean for the organization—recognizing
community organizations are often innovative and adaptive by their nature—and for its participants? Would the
organization’s flexibility to select who receives services be affected? Would the organization need to close the door
to services for a group of people for an extended period? For a program with age limits, could the research design
close the door to the program altogether for a young person? What would the organization compromise if it no longer
had discretion to decide who gets services?
ö If the research will mean that the organization must manualize or standardize its work, will the organization’s context and
culture embrace this, or will there be significant resistance, which will cost energy and time, and strain relationships?
ö What data sets will be used in each option for research you are considering, and how does access to those data
work? Will the community organization be able to access the data after the research study finishes?
ö What may be the costs to the organization and to participants of recruitment for the research? A common topic of
miscommunication is the number of participants the organization will need to recruit for the study to work.

ö To understand this, you need:


• (1) the voice of operations (i.e. outreach, intake) at the community organization to explain the process from
interacting with a potential participant for the first time all the way through program completion. Where
does attrition happen and how much attrition happens? How do participants graduate from or complete the
program, and what does this mean for how many new participants can be recruited per week or month? Will
the study change the process such that your historical enrollment or attrition might change? Are you asking
questions in a context likely to yield realistic estimates—for example, estimates may be skewed if a board
member or a funder is in the room?
• (2) your own best estimation of what number of participants completing the program is likely to be needed
for the research study to show an effect that researchers consider valid.
• (3) to put these pieces together, to understand the cost to the organization and to participants of
recruiting the numbers needed. Developing a few different scenarios may help information flow more freely
(a sketch of such scenarios appears below).

This is covered in more detail below in the section called Plan for study recruitment on page 70.
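
As a sketch of how naming scenarios can work in practice, assuming Python and illustrative completion
rates agreed on with the operations team (all figures below are hypothetical):

    # Compare recruitment needs under different attrition scenarios.
    completers_needed = 120  # researcher's estimate for a valid study

    scenarios = {
        "historical attrition":         0.60,  # completion rate seen pre-study
        "study shifts referral source": 0.45,  # list-based referrals raise attrition
        "pessimistic":                  0.30,
    }
    for name, completion_rate in scenarios.items():
        recruits = completers_needed / completion_rate
        print(f"{name}: recruit about {recruits:.0f} participants")
    # 0.60 -> ~200 recruits; 0.45 -> ~267; 0.30 -> ~400.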

ĥ Learning from peers. Can you connect the community organization to another organization that has gone
through the type of research being considered? Is it possible for key staff from the community organization to
listen to how another organization addressed risks?


d. Rights, ownership and costs


The contractual agreement for the research and how costs are distributed can undermine the intention to shift power.

ĥ Role of Principal Investigator. What will the Principal Investigator take responsibility for? Is it appropriate
for someone from the community organization to act as a co-Principal Investigator? Or is research by an
external third party important in this context?

ĥ Property rights. Discuss intellectual property rights and data rights. Who can access the data and when?
When will data be processed and shared—at regular intervals? Who can speak about the data and publish the
data? Whose consent is required, when?

ĥ Signed contract. In signing the memorandum of understanding or similar written contract, bring your
attention to how you are formalizing the above.

ĥ Cost distribution. How can you work with the community organization and community to identify costs
generated by the research, and discuss how these costs can be shared? What will participants or partner
organizations be asked to do differently for the research, and should this be compensated? Will the research
generate new staff responsibilities? Will it generate communications, change management, or other costs to
the organization? Can researchers share in this cost, in kind, or financially? In our experience, researchers
have raised funds, for example to compensate participants for their time, or for stipends for partnering public
schools. Can the funder support these costs?

ĥ From history to present.


ö After naming the local history and how projects typically work at your institution, how do you envision roles and
accountabilities will work in this research?
ö Beyond the contract, how will you choose to modify your approach and your workplan to balance power rather than
reinforce a power imbalance? How can you ground your work in the experience of those affected to counteract
inherent bias? How can you demonstrate in the words you choose to use, and how you spend your time, that you
recognize that what may be one ‘project’ or a ‘study’ in a research institution’s portfolio may be a life calling for the
staff of the community organization?


2.
COMMUNITY
AND VOICE
SETTING UP THE STUDY

a. Get proximate; Set up for voice


As Bryan Stevenson, founder of the Equal Justice Initiative, writes, “getting proximate” changes our capacity to
make a difference. Traditional research does not have norms for this.

One starting point is face-to-face engagement between researchers and community, participants, staff, and
partners. While community-based research practices offer examples of how to structure this engagement,
this is human-to-human work, not a check-the-box exercise to create a particular community hearing or
steering committee structure. Spending time with the organization and breaking bread fosters relationships and
understanding that matter at a human level, and equips researchers to recognize—and as a result address—
ways the power dynamic gets in the way of impact.

ĥ Relational engagement.
ö How can you spend time with the organization and the community in which the organization resides, and what
questions can you ask, to begin to build relationships of trust? To signal your intent to listen and be a partner? To
show, not talk about, your humility? To be able to put yourself in the organization’s shoes?
ö How will you, personally, listen for and hear the subjective experiences of those affected by the research? How can
this be built into your research timeline? As one researcher notes, good intentions and interview guides are not by
themselves enough to lead to interactions that promote equity.
ö Are Peace Circles an appropriate early engagement? Breakfast forums for partners? Evening forums for community?
What are the opportunities to engage without agenda, for example at graduation events, community meetings,
performances, where you are not conducting observations?

Often, the framework into which researchers are thrust sets up stakeholders as a group to be “managed” in
order to “reduce non-adherence.” Inclusion can seem “messy.” One opportunity is to start by outlining the
potential benefits and harms of the research with the community organization or community members, and to
determine how to hear the voices of those potentially affected through the process.

ĥ Identification of voices needed. What are potential benefits of the research and to whom do they accrue?
What are the potential harms and to whom do they accrue? For example, what processes of the organization
may be changed or interrupted? How will staff relationships with participants be affected? These can be
written out in a T-chart (a piece of paper with a large letter “T” creating two columns) with potential benefits
in one column and potential harms in the other, following the practice of health impact assessments,
environmental impact assessments and racial equity impact assessments. The individuals and groups named
on the chart may suggest the starting list of whose voices are important to hear.


ĥ Forums for engagement.


ö How will you create conditions so voices of those affected by the research are heard? In some cases, you might
help the community organization assemble a community evaluation committee representing these voices. The
committee may make decisions or give input into what is researched and the outcome measures used, and may
engage regularly with you. The committee may include staff of the organization, researchers, program participants
and/or community members. What is the mandate of the group? What powers does it have? How often will it engage
with the research team to not burden participants or overwhelm you, but also to effectively contribute to the study
design and execution? (For researchers familiar with Community-Based Participatory Research, a Community
Action Board is often formed to steer both the research and the related action. Barbara Israel’s piece, which is cited
in the Bibliography on page 111, surveys several Community Action Boards and describes varying levels of power,
participation, and effectiveness).
ö What engagement with a broad group of staff or community members is appropriate while the study remains in its
formative stages? Is it helpful for you as a researcher to participate in a community meeting? Is a mechanism to
gather and incorporate input regularly appropriate?

ĥ Engagement at the right times. When is it particularly important for the community organization to provide
its input, so that there is still flexibility to make changes while producing a valid research study? Have you
explained when you intend to register your pre-analysis plan or seek Institutional Review Board approval, and
what this means for the community organization’s ability to modify the study design?

b. Engage on what gets measured—check the “simple” measures


What you measure is what you incentivize.

The “simple” measures may not be the ones that represent real growth and benefit to participants most on
the margins. As large administrative data sets were in many cases built to report on compliance, metrics of
compliance such as arrest rates are, not surprisingly, easier and cheaper to collect.

Researchers can contribute expertise not only on what the common measures are, but also on how inequity is
built into those measures, and what adjuncts and alternatives are feasible, to enable an informed approach to
metrics that furthers—not contradicts—the intent of the research.

ĥ Illuminating the risks of possible metrics and data.


ö What can you learn and share with the community organization about the commonly used metrics for this type
of work, and what inequities, historical or present, are built into them? Some examples: measuring arrests in
the context of racialized policing; measuring “progress” on self-actualization for participants from a culture that
prizes interdependent families over independent individuals; measuring housing “overcrowding” for participants
from a culture that values extended family; measuring wealth accumulation for families from a culture that values
“sending money home” or supporting extended family members. What are the benefits to the organization and
to the participants it serves of the commonly used metrics? What are the harms? (A piece by Jeffrey Butts and
Vincent Schiraldi on benefits and shortfalls of recidivism as a metric that might prompt discussion is listed in the
Bibliography on page 111).
ö What are the limitations of the data sets you are working with? How will the research team share these with the
community organization, and what can be done to address these? For example, department of employment services
data do not capture all types of employment, which may result in undercounting increased employment among
participants in a program.


ĥ Selecting appropriate metrics.


ö What does the work intend to create, in addition to what will be avoided? When do staff and participants perceive
progress and what marks progress in their eyes? How can this be reflected in metrics that are feasible in this
context? For example, are connections to caring adults built through the work important to capture? Expansion of
the participant’s support network? The participant’s perception of their agency?

One example you could share with the community organization is of a prison-based fatherhood program (you can
find the article by Abigail Henson listed in the Bibliography on page 111), where dialogue between participants,
community organization staff, and researchers changed what the study measured: the unit was changed from
the father to the family; the short-term measures were changed, from depression and stress, to pride and
reconstruction of masculinity (from provider to caregiver); the long-term measures were expanded, from recidivism,
to whether the father-child bond remained active and positive.

ö Is there an opportunity to capture the program’s impact relating to systemic causes of inequity? Are ecosystem
effects of the community organization’s work important to capture in the research? Is the program affecting the
system or the capacity of the community, for example supporting cultural revitalization, changing power relations,
or increasing the capacity of the community to solve problems?

ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics with
the organization in a way that recognizes that numbers are an incomplete picture?

c. Plan for study recruitment


Several community organizations have noted that a lack of shared understanding between the researchers, those
with program operations expertise, and those speaking for the community organization led to lost effort, lost
time, and anxiety in recruiting participants into the research. In addition, the optimism that characterizes most
community organization leaders can contribute to blind spots when it comes to planning for recruitment and
retention, which then impacts how effective the research is.

ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together a ‘cohort’ for a research
study? Outreach? Social workers? Program directors? Former participants?

ĥ Recruitment process. What is the detailed process based on these voices of expertise? Importantly, the
research itself can affect both the process, and attrition. For example, the research could change how
participants apply to the program, from interested young people applying, to schools generating lists of young
people invited into the program; as a result of this shift, attrition will rise from what the program has seen
historically. Where does attrition happen in the process and by how much? What number of participants
would need to start the recruitment process to end up with a certain number of participants completing the
program? How frequently will participants move out of the program and what does this mean for how much
time it takes for the target number of participants to complete the program? What does “completing the
program” or “graduating” mean for your organization, operationally?


ĥ Recruitment target and resources required. Based on available information from the community
organization, what is your best estimation of the number of participants necessary to appropriately power the
study? Piecing this together with the program’s process from the start of recruitment to program completion,
what total number of participants must be recruited? Can you share example numbers to illustrate what a
statistically significant change looks like, using the specific context of the nonprofit and a potential study
outcome? Can you share experiences from other studies on what changes were required by the community
organization to achieve this? Having the researchers and operational team together name different scenarios,
then explore the challenges and the manpower related to each, can be a helpful approach.
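
To make “appropriately power the study” concrete, here is a minimal sketch using the statsmodels power calculator, assuming a two-group comparison. The effect size below is a hypothetical placeholder; in practice it comes from prior studies or a judgment about the smallest change worth detecting.

from statsmodels.stats.power import TTestIndPower

# Hypothetical inputs: a small-to-moderate expected effect (Cohen's d = 0.3),
# the conventional 5% significance level, and 80% power.
completers_per_group = TTestIndPower().solve_power(
    effect_size=0.3,
    alpha=0.05,
    power=0.8,
)
print(f"Participants who must complete the program, per group: {completers_per_group:.0f}")  # roughly 176

Numbers like these, shared in the organization’s specific context, let the operations team see concretely what a valid study asks of them before working backwards through recruitment and attrition.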

ĥ Data required. What other data about participants will you need to collect, and how will you collect these, or
how will these be collected by staff and passed to you?

ĥ Communications.
ö How will the recruitment target and process be communicated to recruitment partners, staff, community or
participants?
ö Are breakfast or evening community meetings appropriate, with researchers present to answer questions? School
night kick-offs? One-on-one conversations between researchers and recruitment partners/staff leading recruitment?
Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?

These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?

d. Engage on how study participants experience the research


The lived experience of a study can change power dynamics or reinforce them. This can be true even with the
best of intentions, and even though the total amount of services provided may not have changed or may even
have increased.

While in researchers’ minds the study may be “separate” from the program, for staff and participants, that
distinction may not have any meaning—it is all part of the work.

The experience of participants in the study also affects study retention.


ĥ Fit of research design and mission.


ö Throughout the study and as it ends, how is the research design consistent with the mission of the community
organization and trust it has built in the community? For example, what will potential participants see after study
enrollment ends, particularly if the program will not have additional capacity at that point? What is the impact of
changes made because of the research study to the organization’s reputation in the community?
ö How is consent best approached? In this particular context, is it better for participants in the control group to meet
program staff in person and give consent, or to be randomized on a list without ever interacting with staff? Will
the control group, if there is one, be asked for consent? Is consent written in a way that is understood? Is consent
presented in a context/at a time when the participant actually has agency to give consent?

ĥ Recruitment strategies. What specific strategies can be used to bring participants into the study and
retain them through the course of the study? There is a “tax” on producing knowledge about those most
marginalized: it is harder for researchers to connect with those with instability in their lives; it is harder to
obtain consent from those who have learned to distrust institutions; it is harder to retain in a research study
those who experience greater barriers. How can social media be helpful? Will multiple redundant strategies
be feasible, to increase success?

The article “Study Retention as Bias Reduction in a Hard-to-Reach Population” by Columbia Professor
Bruce Western and colleagues referenced in the Bibliography on page 111 may aid brainstorming on study
recruitment and retention, e.g., the timing of financial incentives at the start of a study (if applicable);
frequency of contact; back-up contacts including mothers and supportive secondary contacts.

ĥ Experience of being in a research study.


ö How will study participants be contacted? How will they be engaged through the study? Who will engage them?
How can the nonverbal cues create the desired experience for study participants? What have participants heard
before about research in their community or by similar institutions? What is different here, and how is that explicitly
communicated in ways most likely to be heard?
ö The assumption in research can be that “nothing is happening” to the control group. But, from the perspective of a
young person going through an application process and being “randomized out” or from the perspective of referral
sources referring many additional people to a community organization, only to have them not receive services,
something has happened. If the study will include a randomized control group, who will communicate to the
control group about the randomization and what will be communicated? How will this feel standing in the young
person’s shoes? If members of the control group have been exposed to other research studies, as is often the case
in Chicago, what narratives is it important to address, e.g., a sense that “randomization” is not actually random? If
young people are randomized out early on in program enrollment, but are aware there is still room left in a sought-
after program, what is staff’s response? If participants in the comparison and treatment groups will interact, for
example, within a school, what communication will aid each group of people?


“We are not really rats, but we are lab rats to them.”
“How much are they following me?”

STORYCATCHERS THEATRE YOUTH

ĥ Responsive communications. What is the best way to communicate about what matters to participants,
for example:
ö What is the best way to communicate about privacy? Concerns nonprofits have shared, from their participants,
include: Who all will know I am in the study? Will my name be published anywhere? How much will they be in my
life? Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police?
ö What is the best way to communicate about benefits? Participants have asked: How does being part of a study help me?
ö What is the best way to communicate about expectations? Participants have asked: Am I allowed to participate in
other programs/employment during the study? Can you still help me if I am not in the treatment group? Can I apply
again? Who is going to help me if you can’t?
ö Be aware that there may be misunderstandings about research preceding your study.

ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants). If this is not
an area in which your team is experienced, seek out expertise so you are equipped to answer these questions.

e. Build in time for reflection, failure, change


Meaningful reflection and problem-solving during the initial steps of a community-research partnership take
time. Build this into the plan.

ĥ Cadence. What is the cadence on which the team should reflect on the work and engage on challenges? Where and
when will these conversations be held? Who will participate and who will lead?

ĥ Discussion topics.
ö Feedback on the research effort so far, and what needs to be done to address it.
ö How are inequitable approaches, methods, measures filtering into the study, and what are opportunities to do
differently? For example, are you unintentionally taking advantage of what your privilege allows you to do, such as
dictating meeting times and locations? How is your work creating a way of operating intentionally distinct from the
legacy of “research brain” and “community brawn?”
ö How are relationships of trust being formed, and how is the team interacting as equals? Note that while
researchers’, staff’s, and community members’ roles in the research differ, the point here is that no one is treated as
superior or inferior.

ĥ Closing the loop. How are you communicating as clearly as possible to those providing input what you as researchers
are doing with that input? How are you “closing the loop?”

FOR RESEARCHERS

3. COMMUNITY AND VOICE: DURING THE STUDY

a. Engage during the study


Once the study launches, staying engaged with the community organization or community evaluation
committee helps the research continue in a more equitable way.

ĥ Feedback loop.
ö How are staff’s, participants’, and community members’ voices being heard during the course of the study? What
feedback is being shared and what can be done to address the feedback and communicate back to those who
shared it? If no feedback is being shared, what else can be done to listen? Some organizations have found a
biweekly conversation among researchers and those at the community organization involved in implementing the
study tremendously valuable to ask and answer questions and plan for each new step in the work in an organic and
effective way.
ö How can you continue to spend time with the organization and build relationships? How can you give credit
to the community organization or participants where your partnership is working well, and show—through
actions—humility?

ĥ Listening for clues.


ö How are efforts to bring participants into the study and engagement with participants during the study working?
What is not working and what can be done better?
ö Are touchpoints with those in the control group, if there is one, working well? What is not working and what can be
done better?
ö How are inequitable approaches, methods, measures filtering into the research despite the best of intentions
and what can be done to reorient? Who is not appearing as robustly in early data (for example, young people
experiencing housing instability)? Who is not being heard?


b. Collaborate on study tools and problem-solving


Collaboration between the community organization and researchers on study tools such as surveys can
improve both the experience of being surveyed and the quantity and quality of information gathered.

ĥ Contextual awareness.
ö How can the community organization and researchers best collaborate so study tools fit the specific context and
also have the validity desired?
ö How will draft study tools be vetted by participants? For example, are surveys of an appropriate length?

ĥ Execution.
ö What types of data, for example, social security numbers, does the community organization view as likely to be a
challenge to collect in this context, and what are alternatives?
ö If a representative group is assembled, for example a focus group, is it a meaningful “group” in the eyes of the
community organization? How can the community organization help you to create the conditions for voices to
be heard?
ö Can community members be hired to help collect data or conduct surveys?

FOR RESEARCHERS

4. EQUITABLE NUMBERS FOR IMPACT
a. Make it useful along the way.
The history of distrust between community and researchers comes in part from community organizations
entering research partnerships expecting the work will improve lives of participants, and later finding the
research may not help them with day-to-day operations or bring resources to their community. How can the
research improve lives of participants as much as possible, as soon as possible?

ĥ Early learnings. What are early learnings from the research and how can these be shared with staff,
participants, community, or partners to show what their hard work is yielding?

ĥ Early action.
ö What improvements to the program are possible during the research, given the context of the organization and the
research design, which may limit what changes can be made?
ö How will those responsible for making the changes engage with the data early on so they can plan for action?

ĥ Sense checking. Is there a role for the organization’s staff in sense-checking early data or preliminary
results? In reviewing how qualitative data are coded? Are preliminary results consistent with the experience
of the organization, or do they look wrong?


b. Bring attention to who is benefited the most, and the least

[Figure: two bar charts contrasting one aggregated average with disaggregated data. Caption: “When you disaggregate data, what can you notice?” Each chart marks the average before the program and the average after the program; the disaggregated view shows the average grew because of some participants, and asks of the rest, “What is the learning for these participants?”]

As researchers are taught, averages can obscure how impact is distributed. Incentives to support participants
to reach a certain bar—less recidivism, more college enrollment, more employment—can make it harder to make
the case for the work that supports participants farthest from the threshold.

For greater impact, researchers can support a more nuanced understanding of who is being successfully
reached and who is left farther behind, who is benefiting most and least, and what can be learned from the
complex stories beneath the numbers.

ĥ Disaggregated data. Against whatever metric has been chosen, for example, GPA, how much change
does the program create for participants starting out furthest behind, versus participants in the middle,
versus participants starting out furthest ahead? (This must be done in a way that continues to protect
individuals’ privacy.) A minimal sketch of one way to do this appears after this list.
ĥ Learning from who is benefited least and most.
ö Who is benefited the most by the organization’s work? What are the characteristics— demographic, intersectional,
and situational, for example, housing stability, adult relationships, connectedness—of the small group of
participants benefited most?
ö Who is benefited least by the organization’s work? What are the characteristics of these participants?
ö What can the organization learn from these stories? What can the organization learn from the specific stories
of those benefited most, recognizing the complexity of these success stories, to spur innovation? Are there
adjustments to the program the organization will try out to better serve its target participants?

ĥ Counting inequity. Is the correlation between race and outcomes, or class and outcomes, changed as a
result of the organization’s work?
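
As referenced under Disaggregated data above, here is a minimal sketch of disaggregating a chosen metric by participants’ starting point. The data file, column names (gpa_before, gpa_after), and use of pandas are assumptions for illustration, not a prescribed tool.

import pandas as pd

# Hypothetical data: one row per participant, with the chosen metric
# (here GPA) measured before and after the program.
df = pd.read_csv("outcomes.csv")  # assumed columns: gpa_before, gpa_after

# Split participants into thirds by where they started.
df["starting_group"] = pd.qcut(
    df["gpa_before"], q=3,
    labels=["furthest behind", "middle", "furthest ahead"],
)
df["change"] = df["gpa_after"] - df["gpa_before"]

# Average change per starting group, instead of one overall average.
print(df.groupby("starting_group", observed=True)["change"].agg(["mean", "count"]))

A table like this makes visible whether growth is concentrated among participants who started furthest ahead, which a single aggregated average can hide.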


c. Who is the comparison point?


Just as averages can obscure who, specifically, is and is not benefiting, comparison to the average can be
misleading. For example, comparing to an “average” client load may not be relevant and may burn staff out
if the work involves complex trauma. For example, if custodial sentence lengths get shorter because of a
program and there is no change in recidivism, this can be an improvement in recidivism if normally an increase
in recidivism is expected whenever sentence lengths get shorter. For example, a young person dropping out
of a program and re-engaging several times can be an indicator of an organization’s persistence in building
relationships, rather than an efficiency marker to improve.

ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example the
implications of stress experienced by program staff as a factor in selecting what productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. How can analysis capture systemic effects, where cost-benefit numbers are an output of the
work? Accounting for “cost” of the status quo only in terms of tax dollars—which are often actually wages that benefit
a different group of people—rather than the actual social cost, can unwittingly and incorrectly build a case against
investment in community work.

ĥ Capturing growth.
ö How does the presentation of the data reveal growth journeys which may not be linear? Does the comparison point
allow for—to use substance abuse terminology—relapse in the context of recovery?

FOR RESEARCHERS

5. SHARING RESULTS

a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.

As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.

ĥ Context. What history and explanation of structural and systemic factors is important to frame the
challenges the organization is addressing? To explain fully why the problem exists in the first place and the
complexity of root causes and pathways? How can this show up in the description of the organization’s work
and dissemination of the research?

ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work
in the community, or its type of work? How have these narratives served participants well? How have they
harmed them, or reinforced inequities? With this understanding, how can this research be framed to take on
unjust narratives? What cultural context is important to tell?

ĥ Limitations. How are limitations of the data clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? “Objective” data like special education designations, census data,
crime that is measured by arrests, or domestic violence information, can incorporate racialized processes and
lead to incorrect interpretations, without context.


b. Can you hear the participants? Are you signaling that lived
experience is valuable?
Valuing data to achieve an end—whether securing funding, improving programs, sharing learning with the
field, changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring
a participant’s voice requires intention; it does not just happen from documenting a participant’s story,
demographics, or outcomes.

ĥ Participant voices.
ö Can you hear the participants in what you, the community organization, the funders, are disseminating about
the research?
ö How are images, stories, numbers resulting from the research effort putting participants at the center in how they
are shared, versus treating them as objects of study or as tokens to lend credibility? How can you engage the
community organization to further this?

ĥ Authorship. Is there an opportunity for the participants or the community organization to own elements of
or co-author what is produced, or does it serve the community organization to have an external author?

ĥ Respecting experience.
ö How does the presentation of results message to the audience that experience is valuable and valid, rather than
reinforcing the bias that university expertise gives validity?
ö Would participants find that what is being put out is true to their experience?


c. Is it accessible? Can those researched hear the research?


Sharing research results in an accessible way—versus only in journals that require a subscription to access and
a specific technical training to understand—brings the fruits of the research to those who participated in it and
who are affected by it. It is accountable to relationships built, and necessary for the research to power broader
change. Some considerations:

ĥ Language.
ö Is the language used as easy to understand as possible? Do people from different cultures, with different lived
experiences, with different technical backgrounds understand the results of the research and the “so what” of what
it means, when you test an early draft?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds
can understand what has been counted, how, and based on what judgments? Are data tables and charts legible to
those without research and statistics backgrounds? Are any technical terms used defined in plain language?

ĥ Forums and formats.


ö How will you collaborate with the community organization to identify the format(s) that will make the results most
accessible to those affected by the research? To those who have power to support the work? To partners or others in
the community organization’s coalition for change?
ö Where—in what forums—will the results of the research be most likely to reach each of these audiences? Does it
make sense to host community discussions to share results? Post video on social media? Design a simple summary
that articulates what was learned, and to what end?
ö Revisit the chart you may have made early on to identify those potentially affected by the research. Are those people
being reached?
ö Revisit what you learned about local context and history: is the work being shared in a way that feels different from
what came before?

d. Learning for equity


Having applied creativity to conducting the research, take the opportunity to reflect:

ĥ Process. What were lessons from this research on the process of doing research with an equity orientation? Were
objectivity, bias, or rigor affected? Were participation and community involvement affected? Were accountability
or ethics affected? How can you capture this creativity and learning, and share it with your institution and with the
research community? How can you support the community organization in sharing this creativity and learning?

ĥ Endpoints. How were the endpoints of the research different than a traditional approach? Did the level of insight
derived change? Did the usefulness of the outputs to the community organization change? Was the capacity of the
community organization affected? Did members of the community or community organization get interested in evidence
and research? How were you and your capabilities changed?

ĥ Learning and sharing. What could have been done better, and what worked well? What is the feedback from the
community organization and community stakeholders? What is their guidance for future projects? How will you share
your lessons with the community organization? How will your institution internalize these lessons?

Pictured: Englewood
FOR FUNDERS

Before you start:


What is your relationship with research?
Funders are a driver of the economy of research and evaluation: funding the
production of research, incentivizing its creation, shaping its form, consuming
its outputs. Government funding is influenced by the evidence that philanthropic
investments produce.

Individual funders relate to research differently. Some may fund research to generate
knowledge for policymakers and other funders on what works and what does not.
Others may fund evaluation to assess the impact of their funding, for their or their
boards’ own consumption. Still others may fund research in service of a community
organization’s growth or to change narratives. Those that do not fund research at all
may participate in the research economy by using data to direct their funding or to
summarize the impact of their work.

“We need to do a much better job of naming the belief systems which our work privileges, whose knowledge matters most, and why at the end of the day, we do this work at all.”

REFLECTION ON 2018 AMERICAN EVALUATION ASSOCIATION NATIONAL CONFERENCE, FROM EQUITABLE EVALUATION INITIATIVE

Depending on your particular relationship with research, the action
you take to unlock more meaningful knowledge, and therefore
greater impact, will look different.
For example, you may:
ĥ Change the research questions you are willing to fund

ĥ Fund and set timelines for research differently

ĥ Issue Requests For Proposals, or RFPs, for research differently or guide and evaluate your evaluators differently

ĥ Engage with board and staff on internal processes and biases, especially relating to how you use data

ĥ Interrogate numbers and stories you lift up, and use different framing in what you publish

In all cases, it will require challenging what has “always been done.” This may not be tidy, or comfortable. But
starting from accountability and relationship, funders can help to achieve the promise of what knowledge can yield.

Each funder may, in asking the questions below, find answers appropriate to their own work. Chicago Beyond
has shared some of our experience to illustrate.

FOR FUNDERS

1. KNOW YOUR ROLE, KNOW THE RISKS: EQUITY IN HOW YOU START

a. Bring awareness
How research gets done—the approaches, methods, metrics—has a system of assumptions built in.

In seeking more equitable approaches, one place to start is understanding context and biases.

ĥ Board dynamics and beyond. Who is on your board? Who are the other stakeholders you engage? How do
they create or interpret “authoritative” knowledge? Who does not show up in this? What changes does this
awareness lead to in how you interact with your board, if you have one? For example, is it valuable to have a
board discussion about how research, evaluation, and/or your use of evidence are related to your mission and
to experiment with changes in pursuit of the authentic truth? How can you listen to community organizations
from whom you have collected data, or where you have funded research, to understand their experience?

ĥ Value—to whom? How can you ensure the research produces something of real value to the community?
What is the value of the research being proposed to the “subjects” of the research? What are the benefits of
producing the research or evidence to you as a funder, and to the research institution hired?

ĥ Biases in selecting researchers. How do you select researchers? What assumptions are incorporated into
your selection process? How could you include researchers from communities being researched?

Some sources of context include MIT historian Craig Wilder’s Ebony & Ivy on the history of elite academic
institutions in justifying inequities, and the Harlem Children’s Zone’s writings on their decades of experience
“being researched” (you can find both in the Bibliography on page 111).


“You have to build a relationship. There’s a history. It takes time, it takes trust, it takes being vulnerable.”

MARQUELL JONES, MAC HOUSE CASE MANAGER, LAWNDALE CHRISTIAN LEGAL CENTER

b. Understand context
The power dynamic between researchers, funders, and community may lead a community entity to participate
in research that marginalizes the community entity or its participants. Some communities and community
organizations may feel comfortable speaking up and engaging with funders and researchers about the purpose of
the research and how it is conducted. Others may fear that being assertive may jeopardize their funding or support.

ĥ History. What is the history of community-researcher-funder interaction in this community, and the
particular history for this organization and these participants? How have they previously experienced
research institutions? How have funders participated in this dynamic?

ĥ Reality of the community. How can you ensure the work is grounded in the experience of those affected,
to counteract bias? How can you get to know the community organization well enough to be trusted with the
truthful context, or how can you learn from someone who is?


c. Invite community ownership of the questions


Towards authentic truth, funders bear an important responsibility to listen to the community organization and
the community’s specific intent.

ĥ Clarity about purpose. How will the research help the community? Where does the organization intend for
the research to land, and what type of data and research design serve that purpose, with least burden on the
organization or its participants? Is the community organization served well by research on the mechanisms,
i.e., an implementation study, focused on the “how,” in addition to the outcomes research that is more
frequently funded? Is the community organization served well by research to identify the early indicators of
their overall goal that they can influence?
ĥ Research questions. What are the questions the community organization or community wants to answer?
For example, what are specific sentences the organization wants to “fill in the blanks” on at the end of the
research, and why? Should the sentences focus not just on individual change but on interpersonal change,
change to families, or community change? Are there research questions about root causes that, if evidence
were generated by researchers, would lead to action on systemic inequities?

For example… “First, we want to write strong applications for state government funding for violence
prevention, so we want to say our program reduces participants’ violent behavior outside of the program by
[percentage/measure], citing a rigorous outside evaluation. Second, we want to show that it is not just about
the participants, but also the families of participants that grow stronger through our program and become
advocates of change. Third, we would like to identify early indicators that affect whether a participant will
complete the program or not.”

Organizations have found research useful in day-to-day work when it identifies early indicators of the
overall goal, if staff or participants can affect those indicators. Organizations have also found it useful to
ask staff: What 3-4 pieces of information would help you to do your job better?


d. Prompt discussion of possible approaches and methods


ĥ Information about options.
ö What type of study is sufficiently rigorous for the intended audience while minimizing burden on the organization?
What type of study is valuable to future audiences or enables systemic change if that is envisioned by the
organization? How does understanding how the program is working fit into the picture? Would a qualitative or mixed
methods study suit the organization’s learning objectives? You might share a simple table, such as the one that
appears on the next page.
ö What approach to research—from traditional, to community-engaged, to full partnership between community and
researcher—is most suitable? (more on Community-Based Participatory Research appears in the glossary on page 110).
ö What ongoing benefits could the community organization see after the research ends?

ĥ Information about risks, costs and tradeoffs.


ö How can you ensure the community organization understands the implications of possible research approaches, for
the organization? What would the organization be asked to compromise? For example, would the organization not be
able to change key program elements during the period of the research? What would this mean for the organization
and for its participants, since many community organizations are constantly innovating and adapting for greater
impact? Would the organization’s flexibility to select who receives services be affected? Would the organization
need to close the door to services for a group of people for an extended period? For a program with age limits, could
the research design close the door to the program altogether for a young person? According to operations staff or
participants, what may be the impact of the changes contemplated for the research?
ö Who else needs to have this information? Is it helpful for you to support the community organization in
systematically thinking through which stakeholders, from front line staff to board members, it might be important to
communicate with about benefits, risks, costs?
ö Can you connect the community organization to another organization that has gone through the type of research being
considered, particularly a community organization with a similar depth of relationship with their participants (e.g., few
hours once a week versus deep ongoing relationship with a participant and their family)? This may help get around
competitive dynamics between nonprofits, to help the community organization better understand the options available.


What are the options for the research? Here is a simple chart you can share:

QUANTITATIVE STUDY
A “quantitative” study focuses on numbers to assess implementation and/or the impact of your organization’s work.
For example, a quantitative study might count how many people your organization serves, what services they receive, and whether they have stable housing after receiving services.
A quantitative study often uses government data sets, for example from the public school system or the criminal justice system, census tract data, or surveys.
A quantitative study can produce data on a large number of participants more cheaply than other approaches. It can show that stories-focused information can be generalized beyond the handful of participants telling the stories.

QUASI-EXPERIMENTAL STUDY (a type of quantitative study)
A “quasi-experimental” study is a type of quantitative study that shows a numerical change occurred, but does not show your program caused the change to happen.
It does not involve assigning participants to two different groups and studying both groups, and therefore asks less from your organization and your participants.
This will reduce your flexibility to change program elements during the period of the research.
Will this generate what you are trying to learn? Is this rigorous enough for the audience you want to reach?

RANDOMIZED CONTROLLED TRIAL (a type of quantitative study)
An “RCT” or “random assignment evaluation” is a type of quantitative study used to show your program caused the change to happen. For example, it would allow a researcher to say “participants in this organization had stable housing more often as a result of their participation.”
It involves assigning participants randomly to treatment and control groups, which is effort-intensive (more detail in the sections called Know the risks and costs and Plan for study recruitment below).
This will reduce your flexibility to change program elements during the period of the research.
It is often favored by public policy-makers. Is it necessary for your goals?

QUALITATIVE STUDY
A “qualitative” study focuses on systematically collecting stories and other non-quantitative information to convey the impact of your organization’s work.
A qualitative study may use interviews, focus groups, or observational data, which means a researcher watching or listening to participants and staff members.
For example, a qualitative study might summarize what participants are saying has changed in their lives while participating in your program.
Particularly when you are trying something where not much is already known, rich qualitative information, even from a smaller number of participants, helps shed light on “why” and “how” your efforts are working, and why participants find it valuable.
Case studies can offer rich insight—but are different than a systematic qualitative study that may guide program or policy changes.
Community organizations have found this type of research helpful to scaling up their work because it helps you understand what pieces matter most. Qualitative data can guide improvements, for example: criteria in screening tools, characteristics of staff to hire for, service or curriculum improvements. Qualitative data can also suggest internal metrics the organization can use so that operations produce more of what matters.
However, qualitative research can take time and be expensive.

MIXED METHODS STUDY
A “mixed methods” approach mixes numbers and stories, and can provide the best, and worst, of both worlds.


e. Plan for study recruitment


How can funders hold ourselves and researchers accountable for creating a shared understanding of
what study recruitment will require? Several community organizations have noted that a lack of shared
understanding between the researchers, those with program operations expertise, and those speaking for the
community organization led to lost effort, lost time, and anxiety in recruiting participants into the research. In
addition, optimism for the research from the community organization’s leadership can contribute to blind spots
when it comes to planning for recruitment and retention, which can then make the research less effective.

ĥ Voices with expertise. Who needs to be at the table to be able to walk through the recruitment process
in detail from start to finish to really understand what is necessary to put together the participant group, or
“cohort,” for a research study? Who is closest to the work? Outreach? Social workers? Program directors?
Former participants?

ĥ Recruitment process. What is the detailed recruitment process based on these voices of expertise?
Importantly, the research itself can affect both the recruitment process, and attrition. For example, the
research could change how participants apply to the program, from interested young people applying, to
schools generating lists of young people invited into the program; as a result of this shift, attrition will rise
from what the program has seen in the past. Where does attrition happen in the process and by how much?
What number of participants would need to start the recruitment process to end up with a certain number
of participants completing the program? How frequently will participants move out of the program and what
does this mean for how much time it takes for the target number of participants to complete the program?
What does “completing the program” or “graduating” mean for your organization, operationally?


“If I could start all over, I’d ask ‘What’s the power number?’”

AIMEE STAHLBERG, ARTISTIC MANAGER, STORYCATCHERS THEATRE


ĥ Recruitment target. The higher the number of study participants, the easier it is to show a scientifically
valid change, but the greater the effort to recruit participants. For the research study to show an effect
that researchers consider valid, how many participants does your researcher estimate need to complete
the program? Piecing this together with the program’s process from the start of recruitment to program
completion, what total number of participants must initially be recruited? Having the researchers and
operational team together name different scenarios, then explore the challenges and the manpower related to
each, can be a helpful approach.
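
To illustrate “piecing this together” with hypothetical numbers, here is a minimal back-of-the-envelope sketch of working the recruitment funnel backwards from the power number. Every figure below is a placeholder to be replaced with the researcher’s estimate and the program’s own history.

# Hypothetical figures for illustration only.
completers_needed = 350   # the "power number" estimated by the researchers
completion_rate = 0.60    # share of enrollees who complete, per program history
consent_rate = 0.70       # share of referred young people who consent and enroll

enrollees_needed = completers_needed / completion_rate   # about 583 must enroll
referrals_needed = enrollees_needed / consent_rate       # about 833 must be referred

print(f"Enrollees needed: {enrollees_needed:.0f}")
print(f"Referrals needed at the top of the funnel: {referrals_needed:.0f}")

Seeing the top-of-funnel number in black and white is often what reveals the need for dedicated recruitment staffing, as the experience described under Resources required below illustrates.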

ĥ Resources required. What will your organization need to change to achieve the recruitment target? One
community organization believed their social worker could continue to manage recruitment as the research
study began, and later found they needed a dedicated person spending 30 hours per week to adequately
support recruitment and build new referral partnerships.

ĥ Data required. What other data about participants will researchers need to collect, and how will these be
collected by researchers, or by staff and passed to researchers?

ĥ Communications.

ö How will the recruitment target and recruitment process be communicated to recruitment partners, staff, community
or participants?

These types of meetings require planning through an equity lens. What is the best time of day for
community members to attend? Evenings? Weekends? What about childcare? Food? Meeting format?
Seating arrangement?

ö Are breakfast or evening community meetings appropriate? Should researchers be present to answer questions? What
about a school-night kick-off? Are one-on-one conversations between researchers and recruitment partners/staff
leading recruitment most appropriate? Will there be any resources or stipend for recruitment partners?
ö Are the explanations of the research to participants, staff, partners, community in a form that is understood by
these audiences?


f. Match funding and researchers to the goals; account for all costs
With shared understanding of the purpose of the research, funders can be accountable for aligning support to
these goals.

ĥ Selection of researchers. In discussions with researchers you are considering hiring:


ö What experience do they bring to the table about the specific community, program participants, or context?
ö What creativity and experience do they bring to the table in doing research in community-led or participatory ways?
What experience do they bring to the table of recognizing bias and applying that recognition in their work? What
evidence can they offer of this?
ö What are their priorities or the priorities of their institution? How do these match with the community organization’s
approach and priorities?
ö What are a particular research partner’s limitations in the kinds of research they can support, or are incentivized to
produce? What institutional pressures do they face to conduct certain types of studies?

ĥ Expectations of researchers. Is there an expectation that the research effort will build the capacity
or infrastructure of the community organization? How is this translated into the expectations you set for
researchers?

ĥ Full accounting of costs. What costs to the organization and community will be generated by the research?
As a funder, you can lead the conversation to account for all costs, both direct and indirect, and discuss how
these costs can be shared. What will participants or partner organizations be asked to do differently for the
research? Will the research generate new staff responsibilities? Will it generate communications, change
management, or other costs to the organization?

ĥ Budget and timeline. Overemphasis on budget and timeline can get in the way of equity and impact.

ö When does research fit to best serve the community organization? How can the community organization drive the
timetable? Is program infrastructure in place to enable research, for example, consistently delivered key elements of
the program, or would the organization be better served by research timed after necessary infrastructure is built, or
with some lead time for preparation?
ö Do the budget and timeline support and create incentives for: Building relationships and trust? Developing data
tools with community participation? Researchers and the community organization interpreting the data together?
Collaboration on how data are shared?
ö How can reflection, time for failure, and opportunity to change be built in to the research plan?
ö How do the budget and timeline support the fully accounted costs articulated above, for example change
management? When the focus is a more marginalized group, how do the budget and timeline take into account the
additional efforts it will take to overcome the additional challenges—instead of setting expectations according to the
norms of other groups and then treating these individuals as “non-compliant?”
ö What are opportunities for residents of the community to be hired, for example to help collect data or conduct
surveys, and how can you encourage this?


Color of Chicago Beyond’s Experience:


At Chicago Beyond, experience continues to teach us about the substantial tangible and intangible costs of
collaborations between community organizations and researchers that we and other funders had not accounted for.
Our notes:
1. Being more proximate enables us to learn. Through deep relationships we have seen myriad financial
and intangible costs of doing research from the nonprofit and community’s perspectives. We have built a
“growth team,” which in turn builds trusted relationships with a broad array of people within our nonprofit
partner organizations. This intimacy begins in our due diligence process before making the investment,
where our team spends substantial time with, and writes or co-writes the investment proposal in
collaboration with, the nonprofit.
2. Funders have unique opportunities to support and reduce some of the costs. Some examples
from our work: collaboration on the purpose of the research; helping address strategic and operational
challenges such as navigating new recruitment targets; developing communications about the research
for the nonprofit’s staff, board, and community partners; supporting executive directors in their change
management efforts resulting from doing research.
3. We acknowledge that this work is difficult and messy. Timelines and timing of funding may need to
shift, when the cost of not shifting them becomes clear.

g. Memorandum of understanding
The contractual agreement for the research can undermine the funder’s intention to rebalance power.

ĥ From history to present. Name the local history. How have interactions between this community and
researchers typically worked? Set against that context, how do you envision roles and accountabilities will
work in this research?

ĥ Role of Principal Investigator. What will the Principal Investigator take responsibility for? Is it appropriate
for someone from the community organization to act as a co-Principal Investigator? Or is research by an
external third party important in this context?

ĥ Property rights. Discuss intellectual property rights and data rights. Who can access the data and when?
When will data be processed and shared? Who can speak about the data and publish the data? Whose
consent is required, when?

ĥ Signed contract. In signing the memorandum of understanding or similar written contract, ensure you are
formalizing the above.

FOR FUNDERS

2. COMMUNITY AND VOICE
a. Create accountability for voice
Bryan Stevenson, founder of the Equal Justice Initiative, writes, “getting proximate” changes our capacity to
make a difference.

Funders can set the tone and expectation of relationships, and, depending on the context, advocate for the
interests of the community organization.

ĥ Reciprocal engagement.
ö How will you as a funder participate in breaking bread and in building relationships?
ö How can you encourage researchers and the community organization to spend time together face-to-face, to
begin to build relationships of trust? To show, not talk about, humility? Are Peace Circles an appropriate early
engagement? Evening forums for community? For researchers, having the opportunity to connect with the work
on a human level can provide perspective, for example on the human cost of research approaches. For community
organizations, engaging with the researchers formally and informally helps build a relationship of trust.

ĥ Identification of voices needed. How can you support the researchers and community organization to identify
those affected by the research, and to create structures and conditions so that their voices can be heard?
ö In some cases, it can be helpful to assemble a committee, which engages regularly to make decisions or give input
into what is researched and the outcome measures used. The committee may include staff of the organization,
researchers, program participants or community members. (For funders familiar with Community-Based
Participatory Research, a Community Action Board is often formed to steer both the research and the related
action. Barbara Israel’s piece, which is cited at the end, surveys several Community Action Boards and describes
varying levels of power, participation, and effectiveness.) It is important to be clear about the mandate of the group:
What powers does it really have?

ĥ Voice at the right times, and over time.


ö How can you support the community organization’s engagement at the right times to have impact, for example
before IRB approval or before the study’s pre-analysis plan is registered? An “IRB” or Institutional Review Board
is an administrative body that confirms that certain ethical considerations are met. A “pre-analysis plan” commits
to what the most important outcomes and approaches will be in the research. Once these steps in the research
process occur, the community organization has limited ability to change the study design while safeguarding the
validity of the study.
ö How can you check in regularly during the course of the study, to ensure staff's, participants', and community
members’ voices are heard? What feedback is being shared—what is working well and what is not—and how can you
help address that feedback? If no feedback is being shared, what else might be done to listen?


Color of Chicago Beyond’s Experience:


Chicago Beyond hosts quarterly meetings with our nonprofit partners, researchers and our own team. In our experience:
1. These quarterly meetings serve as opportunities for the three parties to step back together from the day
to day of the work, reflect, and solve problems. We facilitate the meetings so it is not one group unilaterally
presenting. Comfort in these forums has built over time.
2. Staff at our nonprofit partners have found they learn from their own colleagues in this reflective space.
3. Researchers have found new insights for their analyses. For example, outreach staff shared that a shift to serve young people pushed further to the margins meant a large proportion of the participants were parents, which had implications for the study's framing.
4. In these forums, Chicago Beyond has been able to support discussions of what preliminary research
findings mean practically, and push for more actionable information sooner.


b. Create accountability for what gets measured


What is measured is what is incentivized.

In doing justice work, the “simple” measures may not be the ones that represent real growth and benefit to
participants most on the margins. As large administrative data sets were in many cases built to report on
compliance, metrics of compliance such as arrest rates are, not surprisingly, easier and cheaper to collect.

Take the example of recidivism. On the one hand, it is a common metric, and one for which data sets exist. This means it costs less money to track and allows comparison across programs. On the other hand, it is binary. It fails to capture directional progress in desistance from crime. And it succeeds in capturing things outside the program participant's control, such as the intensity of enforcement efforts and the prosecutor-judge-public defender dynamics that shape how pleas are entered. A piece by Jeffrey Butts and Vincent Schiraldi on the benefits and shortfalls of recidivism as a metric, listed in the Bibliography on page 111, might prompt discussion.

Funders can hold their research accountable for taking an approach to metrics that furthers—not contradicts—
the mission of the work.

ĥ Understanding the risks of possible metrics and data.


ö What are the commonly used metrics for this type of work, and what inequities, historical or present, are built
into them? What assumptions are built into how these metrics are used? Some examples: measuring arrests in
the context of racialized policing; measuring “progress” on self-actualization for participants from a culture that
prizes interdependent families over independent individuals; measuring housing “overcrowding” for participants
from a culture that values extended family; measuring wealth accumulation for families from a culture that values
“sending money home” or supporting extended family members. What are the benefits to the organization and to the
participants it serves of the commonly used metrics? What are the harms?
ö What are the limitations of the data sets you are working with? How will the research team address these? For
example, department of employment services data do not capture all types of employment, which may result in
undercounting increases in employment among participants in a program.

ĥ Selecting appropriate metrics.


ö What does the work intend to create, in addition to what will be avoided? When do staff and participants perceive
progress and what marks progress in their eyes? How can this be reflected in metrics that are feasible in this
context? For example, are connections to caring adults built through the work important to capture? Expansion of
the participant’s support network? Changes to the participant’s perception of their agency?

One example you could share with the community organization is of a prison-based fatherhood program
(you can find the article by Abigail Henson listed in the Bibliography on page 111), where dialogue between
participants, community organization staff, and researchers changed what the study measured: the unit of analysis was changed from the father to the family; the short-term measures were changed from depression and stress to pride and the reconstruction of masculinity (from provider to caregiver); and the long-term measures were expanded from recidivism to whether the father-child bond remained active and positive.


ö Effects of the ecosystem on a participant, network effects of change by the participant, or structural impacts of a program's work may be less simple to research. However, this may be a critical piece of a community organization's
impact where easy-to-research approaches have not worked. Are ecosystem effects of the community organization’s
work important to capture in the research? Is a program getting at the structural drivers of inequity? If so, how? Is
the program affecting the system or the capacity of the community, for example supporting cultural revitalization,
changing power relations, or increasing the capacity of the community to solve problems?

ĥ Communicating about metrics respectfully. How can you communicate about quantitative metrics in a
way that is respectful, and that validates the intuition of the community organization that numbers are an
incomplete picture?

c. Create accountability to those “being researched”


The experience of a study can change power dynamics or reinforce them. This can be true even with the best
of intentions, and even though the total amount of services provided may not have changed or may even have
increased. While the research study may be “separate” from the program in your mind, for staff and participants
that distinction may not have any meaning—it is all part of the work.

When those closest to participants, and participants themselves, shape how the study occurs, it can help
community organizations and researchers arrive at more equitable and authentic learning. It can also strengthen
study retention.

ĥ Research design.
ö How can you ensure that the research design throughout the study, and as it ends, is consistent with the mission of
the community organization and the trust it has built in the community? How can you ensure that the impact on the organization's reputation in the community resulting from changes made because of the research study has been
considered?
ö How is consent best approached? Is consent written in a way that is understood? Is consent presented in a context, and at a time, when the participant actually has agency to give consent?


ĥ Experience of being in a research study.


ö How are researchers and the community organization proposing to engage so that the community organization’s
staff, the community, or participants have voice in what the research feels like?
ö For example, how will study participants be contacted? How will they be engaged through the study, by whom? What
have participants heard before about research in their community or by similar institutions? What is different here,
and how is that explicitly communicated in ways most likely to be heard?
ö The assumption in research can be that “nothing is happening” to the control group. But, from the perspective of a
young person going through an application process and being “randomized out” or from the perspective of referral
sources referring many additional people to a community organization, only to have them not receive services,
something has happened. If the study will include a randomized control group, who will communicate to the
control group about the randomization and what will be communicated? How will this feel standing in the young
person’s shoes? If members of the control group have been exposed to other research studies, as is often the case
in Chicago, what narratives is it important to address, e.g., a sense that “randomization” is not actually random? If
young people are randomized out early on in program enrollment, but are aware there is still room left in a sought-
after program, what is staff’s response? If participants in the randomized and treatment groups will interact, for
example, within a school, what communication will aid each group of people?

ĥ Responsive communications. What is the best way to communicate to participants about privacy,
expectations, and benefits of the study? Concerns nonprofits have shared, from their participants, include:
Who all will know I am in the study? Will my name be published anywhere? How much will they be in my life?
Do they watch from the cameras in the building? If I am involved in questionable activity, are they going to
report me to the police? How does being part of a study help me? For a fuller list of issues, please see page 49.
ĥ Study tools. How can you encourage the community organization and researchers to collaborate so study tools
fit the specific context and also have the validity desired? How will draft study tools be vetted by participants?
ĥ Awareness of researchers. How can funders support the cultural awareness and humility of those
conducting the research? For example, Chicago Beyond has supported researchers’ participation in racial
bias workshops and reflection.
ĥ Trauma. How do awareness of trauma and the research study’s potential to trigger memories and emotions
shape the work? A study can be an emotional trigger for program participants, and for staff who were
themselves researched or interrogated in other ways in their childhood. Trauma expertise can inform study
outreach and study design to minimize that effect. For example, trauma awareness can shape how staff
participate in the research (and change whether their stress is transmitting fear to participants).


3. EQUITABLE NUMBERS FOR IMPACT
a. Bring attention to who is benefited the most, and the least

[Figure: "When you disaggregate data, what can you notice?" Two charts contrast an aggregated average with disaggregated averages, before and after the program. Disaggregating reveals that the average grew because of one group of participants, prompting the question: what is the learning for the other participants?]

Averages can hide whether participants most forced to the margins are left further behind, unaffected, or
helped by a program. When funders focus on aggregate data describing participants reaching a certain bar—
less recidivism, more college enrollment, more employment—they can unwittingly create incentives to focus
on participants starting closest to the threshold. For greater impact, research can support a more nuanced
understanding of who is being successfully reached and who is left farther behind, who is benefiting most
and least, and what can be learned from the complex stories beneath the numbers.

ĥ Disaggregated data. How can funders bring focus to who, specifically, is impacted and how, while still appropriately protecting individuals' privacy? For example, against whatever metric has been chosen, how can funders create accountability for looking at the change the program creates for participants starting out furthest behind, versus participants in the middle, versus participants starting out furthest ahead? (A minimal sketch of this kind of disaggregation follows this list.)


ĥ Learning from who is benefited least and most. Who is benefited the most by the organization's work? What are the characteristics—demographic, intersectional, and situational (for example, housing stability, adult relationships, connectedness)—of the small group benefited most? Who is benefited least by the organization's work? What are the characteristics of these participants? What can the organization, the researchers, and the funder learn from these stories?

ĥ Counting inequity. Is the correlation between race and outcomes, or class and outcomes, changed as a
result of the organization’s work?

b. Who is the comparison point?


Just as averages can obscure who, specifically, is and is not benefiting, comparison to the average can be misleading. For example, comparing to an "average" client load may not be relevant, and may burn staff out, if the work involves complex trauma. If custodial sentence lengths get shorter because of a program and there is no change in recidivism, this can actually be an improvement, since an increase in recidivism is normally expected whenever sentence lengths get shorter. And a young person dropping out of a program and re-engaging several times can be an indicator of an organization's persistence in building relationships, rather than an efficiency marker to improve.

ĥ Comparison points.
ö For a study comparing outcomes to a benchmark, or analyzing cost and benefit, who and what is proposed as the
comparison? How can comparison points show different perspectives or the complexity of the work? How can
funders guard against unwittingly pushing community organizations to focus on "low-hanging fruit"?
ö Does the comparison point take into account the impacts of systemic and individual traumas, for example the
implications of stress experienced by program staff as a factor in selecting what productivity comparison is relevant?
ö Does the comparison point ring true to those being measured?
ĥ Cost-benefit analysis. How can funders create incentives for analysis to capture systemic effects, where cost-benefit numbers are an output of the work? Accounting for the "cost" of the status quo only in terms of tax dollars—which are often actually wages that benefit a different group of people—rather than the actual social cost, can unwittingly and incorrectly build a case against investment in community work. (A sketch of how the cost frame can flip the conclusion follows this list.)

ĥ Capturing growth. How does the presentation of the data reveal growth journeys, which may not be linear? Does the comparison point allow for—to use substance abuse terminology—relapse in the context of recovery?
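To illustrate the cost-benefit point above, here is a minimal Python sketch of how the choice of cost frame can flip a conclusion. Every figure is hypothetical, invented only to show the mechanics, not to estimate any real program:

# Hypothetical sketch: the cost frame changes the benefit-cost ratio.
# All figures below are invented for illustration; they are not estimates.

program_cost = 1_000_000  # annual cost of the community program

# Narrow frame: count only avoided tax-dollar outlays (e.g., incarceration
# line items). Much of that "cost" is wages paid to a different group of
# people, and this frame ignores harms that never reach a government ledger.
avoided_tax_outlays = 800_000

# Fuller frame: also count avoided social costs borne by participants,
# families, and the community (lost earnings, injury, instability).
avoided_social_costs = 900_000

narrow_ratio = avoided_tax_outlays / program_cost
full_ratio = (avoided_tax_outlays + avoided_social_costs) / program_cost

print(f"Narrow (tax dollars only) benefit-cost ratio: {narrow_ratio:.2f}")  # 0.80
print(f"Fuller (social cost) benefit-cost ratio: {full_ratio:.2f}")         # 1.70

Under the narrow frame, the same program appears to "lose" twenty cents on the dollar; under the fuller frame, it clearly pays for itself. Which costs get counted is a judgment with consequences, and one worth making explicitly.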


4. SHARING RESULTS
a. Is it historical? Is it contextualized?
Numbers, without context, take on the assumptions and biases of their audience. Data sources, without
context, reinforce the structural bias built into them.

As Chimamanda Ngozi Adichie warns, a story is fundamentally shaped by where you begin it. The origins of
inequity are often left out of the story, allowing histories to be laundered, and reinforcing harmful silences in
the narrative.

ĥ Context. What history and explanation of structural and systemic factors is important to frame the
challenges the organization is addressing? To explain fully why the problem exists in the first place and the
complexity of root causes and pathways? How can this show up in the description of the organization’s work
and dissemination of the research?

ĥ Dominant narratives. What narratives have previously described the organization’s participants, its work
in the community, or its type of work? How have these narratives served participants well? How have they
harmed them, or reinforced inequities? With this understanding, how can this research be framed to take on
unjust narratives? What cultural context is important to tell?

ĥ Limitations. How are limitations of the data most clearly communicated? For example, limitations of
administrative data sets, limitations of summarized data, limitations of common metrics, systematic non-
counting, or systematic undercounting? Without context, "objective" data like special education designations, census data, crime that is measured by arrests, or domestic violence information can incorporate racialized processes and lead to incorrect interpretations.

b. Can you hear the participants? Are you signaling that lived experience
is valuable?
Valuing data to achieve an end—whether future funding, improving programs, sharing learning with the
field, changing narratives—is not the same as valuing a human story and experience intrinsically. Honoring a participant's voice requires intention; it does not just happen from documenting a participant's story, demographics, or outcomes. Funders are influential in bringing this to life.

ĥ Participant voices. Can you hear your participants in what you, the community organization, and the researchers are disseminating about the research? How do the images, stories, and numbers resulting from the research put participants at the center in how they are shared, rather than treating them as objects of study or as tokens to lend credibility?

ĥ Authorship. Is there an opportunity for the participants or the community organization to own elements
of or co-author what is produced, or be editors? Or does it serve the community organization to have an
external author?


ĥ Respecting experience.
ö How does the presentation of results message to the audience that the community’s experience is valuable and
valid (for example, in how the community’s experience is referred to and credited in the presentation), rather than
reinforcing the bias that only university expertise produces validity?

ö Would participants find that what is being put out is true to their experience, not just responsive to
what the funder set out to do, or what the researchers came to ask?

c. Is it accessible? Can those researched hear the research?


Sharing research results so that those affected by the research can access them, versus sharing results in
invitation-only forums and subscription-only journals, is part of taking an equity orientation, and necessary for
the research to power broader change. Some considerations for funders:

ĥ Language.
ö How can you create an expectation that the language is as easy to understand as possible? Do people from different
cultures, with different lived experiences, with different technical backgrounds understand the results of the
research and the “so what” of what it means, when you test an early draft? How can you create the expectation that
data and charts are readable to those without research and statistics backgrounds?
ö Are all inputs, calculations, and methods clearly explained, so stakeholders with different technical backgrounds can understand what has been counted, how, and based on what judgments? Are any technical terms defined in plain language?

ĥ Forums and formats.


ö How will you distribute what has been learned in formats that the community organization identifies as most
accessible to those affected by the research? To those who have power to support the work? To partners or others in
the community organization’s coalition for change?
ö How can you help those who were “subjects” of the research share what was learned in the ways they think best?
How can you as a funder amplify these voices?
ö Where, and in what forums, will the results of the research be most likely to reach each of the intended audiences?
Does it make sense for you to support community discussions to share results? Help post video on social media?
Design a simple summary that articulates what was learned, and to what end?
ö Revisiting the list, which you may have encouraged researchers and the community organization to make early on, identifying those potentially affected by the research: are those people being reached, to "close the loop"?

d. Learning for equity


As research you have funded concludes, take the opportunity to reflect:
ĥ Process. What were lessons on the process of funding research with an equity orientation? How was
accountability different? Was the research started at the right time for the community organization to yield
the desired results?

ĥ Endpoints. How were the endpoints of the research different than a “traditional” approach? Did the level of
insight derived change? Did the usefulness of the outputs to the community organization change? Was the
capacity of the community organization affected? Did members of the community or community organization
get interested in evidence and research? Was the capacity of the researcher affected?

ĥ Learning and sharing. What could have been done better, and what worked well? What is the feedback from
the community organization, its stakeholders, and the researchers? How will you as a funder internalize these
lessons and share them with others?



LET US GO FORWARD, TOGETHER

Thank you for your time, consideration, and use of this guidebook. We see it not as a solution, but as kindling for something greater, and a new path toward "how" we can all arrive at a more authentic truth in research. We ask that you share these questions and ideas with others in the social impact space, and host conversations to address unintended bias and level the playing field to do the most good for our communities.

ĥ Share this with your team and encourage reflection and discussion.

ĥ Share these principles on social media. You can pull from the social media suggestions at ChicagoBeyond.org.

ĥ Bring this to your network, in one-on-one conversation with your board members, staff, allies, challengers, and friends.

ĥ Host an event. Visit ChicagoBeyond.org to partner with Chicago Beyond and bring this guidebook to your community.
ĥ Engage in our community of practice. What successes have you had with equity-based research? What has made you uncomfortable, or frustrated you? Send us a note through ChicagoBeyond.org to join the dialogue about our ongoing learning.



THANK YOU

This publication was informed by the experiences, wisdom, and generosity of the following individuals, groups, and organizations. We are especially grateful to our investment partners for their patient and courageous efforts, the youth they serve, and others with whom we have learned.

Aimee Stahlberg, Storycatchers Theatre
Aisha Noble, Community member
Andrea Ortez, Partnership for Resilience
Angela Odoms-Young, University of Illinois at Chicago
Angelique Power, The Field Foundation of Illinois
Asiaha Butler, Resident Association of Greater Englewood
Carmelo Barbaro, University of Chicago Poverty Lab
Christopher Sutton, Youth Advocate Programs, Inc.
Clifford Nellis, Lawndale Christian Legal Center
Deborah Gorman-Smith, The University of Chicago School of Social Service Administration
Elena Quintana, Adler University
Evaluation Committee of Lawndale Christian Legal Center
Franklin Cosey-Gay, The University of Chicago School of Social Service Administration
Helene Gayle, The Chicago Community Trust
Jason Quiara, The Joyce Foundation
Jennifer Keeling, Chicago CRED (Creating Real Economic Destiny)
John Rich, Drexel University
Jonte, Storycatchers Theatre alumnus
Karen Jackson, Lawndale Christian Legal Center
Kelly Hallberg, University of Chicago Poverty Lab
Khalfani Myrick, Genesys Works
Kim Cassel, Arnold Ventures
Lina Fritz, OneGoal
Lindsey Nurczyk, OneGoal
Marquell Jones, Lawndale Christian Legal Center
Michael McAfee, PolicyLink
Michelle Adler Morrison, Youth Guidance
Pastor Christopher Harris, Bright Star Community Outreach
Paula Wolff, Illinois Justice Project
Priya Shah, Storycatchers Theatre
Rami Nashashibi, Inner-City Muslim Action Network
Rebekah Levin, Robert R. McCormick Foundation
Robin Steans, Steans Family Foundation
Sana Syed, Inner-City Muslim Action Network
Sheldon Smith, Dovetail Project
Theodore Corbin, Drexel University
Troy Harden, Northeastern Illinois University
Unmi Song, Lloyd A. Fry Foundation
Wendy Fine, Youth Guidance
Wrenetha Julion, Rush University



GLOSSARY
Ancient and indigenous approaches to knowledge: Indigenous knowledge is passed
through generations, focused on problem solving, and the basis for community decisions.
We have learned from wisdom and approaches including the ancient spiritual philosophy of India and approaches of Native peoples of the Americas, Canada, and New Zealand.

Community-Based Participatory Research: This research approach prioritizes partnership between researcher and community (versus research that is merely
community-placed) and commitment to action (versus leaving action to others after the
research finishes). The Office of Behavioral and Social Science Research at the National
Institutes of Health defines Community-Based Participatory Research as “an applied
collaborative approach that enables community residents to more actively participate
in the full spectrum of research (from conception to design to conduct to analysis to
interpretation to conclusions to communication of results) with a goal of influencing
change in community health, systems, programs or policies.” This research approach
does not assume you can separate a program from the context for purposes of studying it.
Even in Community-Based Participatory Research projects, in practice, the power of the
community varies.

Design thinking: A creative problem-solving process that puts humans at the center and
focuses on what real people actually do.

Epistemology: The theory of how we know what we know.

Peace Circle: A method rooted in Native American practice to address conflict holistically and solve problems. Peace Circles are a group process that repairs harm, includes offenders taking responsibility for their actions, and leads to collective healing.

Racial equity and cultural awareness: Racial equity would be achieved if racial identity
did not determine the odds of how one fares. Racial equity work includes dismantling
narratives, attitudes, practices and policies that allow or reinforce different outcomes by
race. Cultural awareness is awareness of the social systems of meaning and customs of a
group, and includes reflection on your own values, beliefs, and biases.

Statistically significant: In research, something is "statistically significant" if you can feel confident (up to a defined point) that the difference you are seeing is the result of what you are studying and you did not just get lucky or unlucky in who or what got counted. This is different from what "significant" or "important" mean in everyday English.

Structural racism: A system in which public policies, institutional practices, cultural representations, and other norms work in mutually reinforcing ways to perpetuate racial
group inequity. Structural racism is not something that a few people or institutions choose
to, or choose not to, practice. Instead it is a feature of the social, economic and political
systems in which we all exist.

Systems thinking: A holistic approach to analysis that focuses on the way parts of a
system relate to each other and work over time.

BIBLIOGRAPHY
American Evaluation Association. Public Statement on Cultural Competence in Evaluation. (April 2011).

Butts, Jeffrey A. and Vincent Schiraldi. Recidivism Reconsidered: Preserving the Community Justice Mission of Community Corrections. Program in Criminal Justice Policy and Management, Harvard Kennedy School. (March 2018).

Coalition for Evidence-Based Policy. Randomized Controlled Trials Commissioned by the Institute of Education Sciences Since 2002: How Many Found Positive Versus Weak or No Effects. (July 2013). Available at: http://coalition4evidence.org/wp-content/uploads/2013/06/IES-Commissioned-RCTs-positive-vs-weak-or-null-findings-7-2013.pdf.

Community-Based Participatory Research (CBPR). Foundation for Sustainable Development. (May 2017). www.fsd.org/wp-content/uploads/2017/05/Research-Toolkit.pdf. Sample Terms of Reference for a Community-Based Participatory Research project, from the Wellesley Institute, are appended at the end.

Cram, Fiona. Lessons on Decolonizing Evaluation from Kaupapa Maori Evaluation. Canadian Journal of Program Evaluation, 30.3, 296-312. (2016).

de Jong, Marja A. J. G. et al. Study protocol: Evaluation of a community health promotion program in a socioeconomically deprived city district in the Netherlands using mixed methods and guided by action research. BMC Public Health, 19:72. (2019).

Dean-Coffey, Jara; Casey, Jill; and Caldwell, Leon D. Raising the Bar – Integrating Cultural Competence and Equity: Equitable Evaluation. The Foundation Review, 6(2), 81-94. (2014).

Equitable Evaluation. Equitable Evaluation Framing Paper. (July 2017). drive.google.com/file/d/0BzlHFJSNsW5yY3dGaGN0MTdISEk/view.

"Equity-Centered Design Framework." Stanford D.school. (26 Apr. 2017). dschool.stanford.edu/resources/equity-centered-design-framework.

equityXdesign. "Racism and Inequity Are Products of Design. They Can Be Redesigned." Medium. (16 Nov. 2016). medium.com/equity-design/racism-and-inequity-are-products-of-design-they-can-be-redesigned-12188363cc6a.

Fricker, Miranda. Epistemic Injustice: Power and Ethics of Knowing. Oxford University Press. (2007).

Hall, Budd L. and Rajesh Tandon. Decolonization of Knowledge, Epistemicide, Participatory Research and Higher Education. Research for All, 1(1), 6-19. (2017).

Harlem Children's Zone. Successful Research Collaborations: Rules of Engagement for Community-Based Organizations. New York, NY. (2012). Available at: http://www.hcz.org/images/Rules_of_Engagement_paper.pdf and http://promoseneighborhoodsinstitute.org/Technical-Assistance/Resource-Library/Tools.

Hassmiller, Kristen et al. Extending systems thinking in planning and evaluation using group concept mapping and system dynamics to tackle complex problems. Evaluation and Program Planning, 60, 254-264. (2017).

Henson, Abigail. Strengthening Evaluation Research: A Case Study of an Evaluability Assessment Conducted in a Carceral Setting. International Journal of Offender Therapy and Comparative Criminology, 62(10), 3185-3200. (2018).

Inouye, Traci Endo et al. Commissioning Multicultural Evaluation: A Foundation Resource Guide. The California Endowment. (2005). Available at: https://www.spra.com/wordpress2/wp-content/uploads/2015/12/TCE-Commissining-Multicutural-Eva.pdf.

Israel, Barbara A. et al. Community-Based Participatory Research: Lessons Learned from the Centers for Children's Environmental Health and Disease Prevention Research. Environmental Health Perspectives, 113(10), 1463-1471. (Oct. 2005).

LaFrance, Joan and Richard Nichols. Reframing Evaluation: Defining an Indigenous Evaluation Framework. The Canadian Journal of Program Evaluation, 23(2), 13-31. (2010).

Lee, Kien. The Importance of Culture in Evaluation: A Practical Guide for Evaluators. The Colorado Trust. (2007). Available at: https://www.communityscience.com/pdfs/CrossCulturalGuide.r3.pdf.

Leiderman, Sally. Evaluation with a Racial Equity Lens Slides. (November 2017). www.capd.org.

Mertens, Donna M. Transformative Mixed Methods: Addressing Inequities. American Behavioral Scientist, 56(6), 802-813. (2012).

O'Fallon, Liam R. and Allen Dearry. Community-Based Participatory Research as a Tool to Advance Environmental Health Sciences. Environmental Health Perspectives, 110(suppl 2), 155-159. (2002).

Patton, Michael Quinn. Utilization-Focused Evaluation. SAGE Publications, Inc. (2008).

Potapchuk, Maggie et al. Flipping the Script: White Privilege and Community Building. MP Associates, Inc. and the Center for Assessment and Policy Development. (2005). Available at: http://www.racialequitytools.org/resourcefiles/potapchuk1.pdf.

Promoting Healthy Public Policy Through Community-Based Participatory Research: Ten Case Studies. PolicyLink, University of California Berkeley School of Public Health, W.K. Kellogg Foundation. Available at: http://www.policylink.org/resources-tools/promoting-healthy-public-policy-through-community-based-participatory-research-ten-case-studies.

Stevenson, Bryan. Just Mercy. Spiegel & Grau. (2015).

Straight Talk on Evidence. How to Solve U.S. Social Problems When Most Rigorous Program Evaluations Find Disappointing Effects. (March 21, 2018). Available at: https://www.straighttalkonevidence.org/2018/03/21/how-to-solve-u-s-social-problems-when-most-rigorous-program-evaluations-find-disappointing-effects-part-one-in-a-series/.

Thomas, Veronica G. and Beverly A. Parsons. Culturally Responsive Evaluation Meets Systems-Oriented Evaluation. American Journal of Evaluation, 38(1). (April 2016).

Wallerstein, Nina and Bonnie Duran. Community-Based Participatory Research Contributions to Intervention Research: The Intersection of Science and Practice to Improve Health Equity. American Journal of Public Health, 100(Suppl 1), S40-S46. (April 2010).

Western, Bruce; Braga, Anthony; Hureau, David; and Sirois, Catherine. Study Retention as Bias Reduction in Hard-to-Reach Populations. Proceedings of the National Academy of Sciences, 113(20), 5477-5485. (May 2016). DOI: 10.1073/pnas.1604138113.

Wilder, Craig Steven. Ebony & Ivy: Race, Slavery, and the Troubled History of America's Universities. Bloomsbury Press. (2014).



CHICAGOBEYOND.ORG

Pictured: Jonte

Chicago Beyond, 2018


© 2018 by Chicago Beyond. This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-sa/4.0/.

Design by Jessica Morrow | jessicamorrowdesign.com
