World View: Don't Ask If AI Is Good or Fair, Ask How It Shifts Power


A personal take on science and society

By Pratyusha Kalluri
Those who could be exploited by artificial intelligence should be shaping its projects.

Law enforcement, marketers, hospitals and other bodies apply artificial intelligence (AI) to decide on matters such as who is profiled as a criminal, who is likely to buy what product at what price, who gets medical treatment and who gets hired. These entities increasingly monitor and predict our behaviour, often motivated by power and profits.

It is not uncommon now for AI experts to ask whether an AI is ‘fair’ and ‘for good’. But ‘fair’ and ‘good’ are infinitely spacious words that any AI system can be squeezed into. The question to pose is a deeper one: how is AI shifting power?

From 12 July, thousands of researchers will meet virtually at the week-long International Conference on Machine Learning, one of the largest AI meetings in the world. Many researchers think that AI is neutral and often beneficial, marred only by biased data drawn from an unfair society. In reality, an indifferent field serves the powerful.

In my view, those who work in AI need to elevate those who have been excluded from shaping it, and doing so will require them to restrict relationships with powerful institutions that benefit from monitoring people. Researchers should listen to, amplify, cite and collaborate with communities that have borne the brunt of surveillance: often women, people who are Black, Indigenous, LGBT+, poor or disabled. Conferences and research institutions should cede prominent time slots, spaces, funding and leadership roles to members of these communities. In addition, discussions of how research shifts power should be required and assessed in grant applications and publications.

A year ago, my colleagues and I created the Radical AI Network, building on the work of those who came before us. The group is inspired by Black feminist scholar Angela Davis’s observation that “radical simply means ‘grasping things at the root’”, and that the root problem is that power is distributed unevenly. Our network emphasizes listening to those who are marginalized and impacted by AI, and advocating for anti-oppressive technologies.

Consider an AI that is used to classify images. Experts train the system to find patterns in photographs, perhaps to identify someone’s gender or actions, or to find a matching face in a database of people. ‘Data subjects’ — by which I mean the people who are tracked, often without consent, as well as those who manually classify photographs to train the AI system, usually for meagre pay — are often both exploited and evaluated by the AI system.

Researchers in AI overwhelmingly focus on providing highly accurate information to decision makers. Remarkably little research focuses on serving data subjects. What’s needed are ways for these people to investigate AI, to contest it, to influence it or to even dismantle it. For example, the advocacy group Our Data Bodies is putting forward ways to protect personal data when interacting with US fair-housing and child-protection services. Such work gets little attention. Meanwhile, mainstream research is creating systems that are extraordinarily expensive to train, further empowering already powerful institutions, from Amazon, Google and Facebook to domestic surveillance and military programmes.

Many researchers have trouble seeing their intellectual work with AI as furthering inequity. Researchers such as me spend our days working on what are, to us, mathematically beautiful and useful systems, and hearing of AI success stories, such as winning Go championships or showing promise in detecting cancer. It is our responsibility to recognize our skewed perspective and listen to those impacted by AI.

Through the lens of power, it’s possible to see why accurate, generalizable and efficient AI systems are not good for everyone. In the hands of exploitative companies or oppressive law enforcement, a more accurate facial recognition system is harmful. Organizations have responded with pledges to design ‘fair’ and ‘transparent’ systems, but fair and transparent according to whom? These systems sometimes mitigate harm, but are controlled by powerful institutions with their own agendas. At best, they are unreliable; at worst, they masquerade as ‘ethics-washing’ technologies that still perpetuate inequity.

Already, some researchers are exposing hidden limitations and failures of systems. They braid their research findings with advocacy for AI regulation. Their work includes critiquing inadequate technological ‘fixes’. Other researchers are explaining to the public how natural resources, data and human labour are extracted to create AI.

Race-and-technology scholar Ruha Benjamin at Princeton University in New Jersey has encouraged us to “remember to imagine and craft the worlds you cannot live without, just as you dismantle the ones you cannot live within”. In this vein, it is time to put marginalized and impacted communities at the centre of AI research — their needs, knowledge and dreams should guide development. This year, for example, my colleagues and I held a workshop for diverse attendees to share dreams for the AI future we desire. We described AI that is faithful to the needs of data subjects and allows them to opt out freely.

When the field of AI believes it is neutral, it both fails to notice biased data and builds systems that sanctify the status quo and advance the interests of the powerful. What is needed is a field that exposes and critiques systems that concentrate power, while co-creating new systems with impacted communities: AI by and for the people.

Pratyusha Kalluri is a co-creator of the Radical AI Network and an AI researcher at Stanford University in California. e-mail: pkalluri@stanford.edu


