Introduction

Data Capitalism and Algorithmic Racism

Even before the global pandemic drastically increased reliance on communications technology for working, learning, shopping, and socializing at a distance, Americans from all walks of life reported a growing unease about the impact of technology companies on our country.1 Whether it is the gig economy company that tinkers with its incentive algorithms and sends pay plummeting for thousands of “independent contractors,”2 the firm peddling facial recognition technology that disproportionately misidentifies people of color as wanted criminals,3 the video site that promotes inflammatory misinformation guaranteed to generate clicks,4 or the social media giant that lets advertisers exclude Black homebuyers from seeing real estate ads in particular neighborhoods,5 communities across the country are struggling with the effects of unaccountable data* extraction and algorithmic decision-making. Concerns go far beyond worries about personal privacy to fundamental questions of power and control. This paper makes the case that the underlying driver is data capitalism: an economic model built on the extraction and commodification of data and the use of big data and algorithms as tools to concentrate and consolidate power in ways that dramatically increase inequality along lines of race, class, gender, and disability.

[Image: Collage of a historical photograph of enslaved people and an early computer]

At its core, racial inequality is a feature, not a bug, of data capitalism. Indeed, big data is not as novel or revolutionary as it is commonly understood to be. Instead, it is part of a long and pervasive historical legacy and technological timeline of scientific oppression, aggressive public policy, and the most influential political and economic system that has shaped and continues to shape this country’s economy: chattel slavery. Algorithmic racism occurs when contemporary big data practices generate results that reproduce and spread racial disparities, shifting power and control away from Black and brown people and communities.

This report aims to help policymakers, movement leaders, and thinkers better understand and address the challenges posed by data capitalism and the ways it is fundamentally intertwined with systemic racism. The report describes the core problem of data capitalism, surveys its roots in history, and briefly examines how it manifests today in the workplace, consumer marketplace, and public sphere. We show that the evolving system of data capitalism is not the inevitable result of technological progress, but rather the result of policy decisions. This brief will highlight key policy shifts needed to ensure that potent technological tools no longer concentrate the might of a white-dominated corporate power structure, but are instead used in ways that will benefit Black lives. We know that when Black lives truly matter, everyone will benefit. Finally, we conclude with a look at groups mobilizing to challenge data capitalism, vividly illustrating that our current path is not inevitable.

*Please see the glossary at the end of this report for definitions of tech terminology. Words defined in the glossary appear in bold the first time they are used in the text.

What is Data Capitalism?

This report uses the term “data capitalism” to describe an economic model built on the extraction and commodification of data and the use of big data and algorithms as tools to concentrate and consolidate power in ways that dramatically increase inequality along lines of race, class, gender, and disability.

Data capitalism differs from terms such as “surveillance capitalism”6 in its recognition that the central problem is not surveillance technology itself, but the ways technology is deployed to reinforce pre-existing power disparities. This report particularly focuses on disparities in racial and economic power and on exploring how data capitalism is rooted in slavery and white supremacy, even as we recognize that capitalism and white supremacy intersect with other forms of domination, including (hetero) patriarchy, cis-normativity, settler colonialism, and ableism.7

[Image: Datification of slavery]

The Problem of Data Capitalism

The use of data and information systems to subjugate and control has a long history.8 What is new is how companies are deploying emerging technological capabilities in ways that intensify and accelerate existing trends. The first component is ubiquitous surveillance. Tech corporations such as Google and Facebook offer ostensibly free online services while exhaustively monitoring the online activity of users: every search, “like,” clicked link, and credit card purchase. Real-time location tracking by cell phone and surveillance by smart devices—from speakers to doorbells to thermostats to cars—enable corporate data collection offline as well. Public and private surveillance cameras, and increased employer monitoring of workers’ online activity, email, movements, and activity off the job, contribute to pervasive surveillance and unprecedented extraction of personal data, often without people’s awareness. Data is monetized when it is sold and resold, and algorithms are used to aggregate data from different sources to build an increasingly detailed picture of personal habits and preferences, which companies feed into predictive tools that model future outcomes and produce categorizations, scores, and rankings.
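
To make that pipeline concrete, the sketch below is a minimal, purely hypothetical Python example of how records from separate surveillance streams might be joined on a shared identifier and reduced to a single consumer score. The data sources, field names, and weights are invented for illustration and do not describe any actual company’s system.

```python
# Illustrative sketch: records from separate surveillance streams are joined on
# a shared identifier and collapsed into a single consumer "score." Every data
# source, field, and weight here is hypothetical.

# Hypothetical data streams, each keyed by the same device/user identifier.
search_logs = {"user-123": ["payday loans", "used cars near me"]}
location_pings = {"user-123": ["discount-grocery", "bus-stop", "clinic"]}
purchases = {"user-123": [{"item": "money order", "amount": 220.00}]}

def build_profile(user_id):
    """Merge whatever each stream knows about user_id into one profile."""
    return {
        "searches": search_logs.get(user_id, []),
        "places": location_pings.get(user_id, []),
        "purchases": purchases.get(user_id, []),
    }

def score_profile(profile):
    """Reduce the profile to a single number a marketer or lender might buy.
    The weights encode someone's judgment about which behaviors 'matter.'
    """
    score = 600
    if any("loan" in s for s in profile["searches"]):
        score -= 40   # searching for credit is treated as a risk signal
    if "discount-grocery" in profile["places"]:
        score -= 20   # shopping patterns stand in for income
    score -= 10 * len(profile["purchases"])
    return score

profile = build_profile("user-123")
print(score_profile(profile))  # one opaque number that follows the person around
```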

[Image: Hands holding a cell phone displaying a world map]

Big data is used not only to sell targeted advertising, but also to make an increasing array of high-stakes automated decisions around employment, investment, lending, and pricing in the private sphere and consequential government decisions in areas including criminal justice, education, and access to public benefits.

Technological tools amplify the process of data extraction and the use of algorithmic sorting and ranking to categorize and evaluate people for their “worthiness” to access everything from a new job to a home loan to health care coverage under Medicaid. Algorithms are primarily designed by affluent, white, male programmers based on training data that reflects existing societal inequalities. For example, an employer who wanted to retain employees longer found that distance from work was the most significant variable associated with how long workers remained on the job. However, because of enduring patterns of residential segregation based on a legacy of discrimination and redlining, distance from work also correlated strongly with race.9 Without addressing underlying disparities and injustices, automated decisions based on algorithms evaluate people against a disproportionately white, male, able-bodied, middle-class or wealthy, U.S.-citizen norm that is depicted as universal and unbiased. This is the “coded gaze,” a term developed by scholar and artist Joy Buolamwini to describe “algorithmic bias that can lead to social exclusion and discriminatory practices.”10 As activist Hamid Khan observes, it’s “racism in, racism out.”11
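
A minimal simulation with entirely synthetic numbers can make the distance-from-work example concrete: a screening rule that never references race can still reject applicants from one group at a far higher rate once residential segregation makes commute distance a stand-in for race. The groups, commute distributions, and cutoff below are invented for illustration.

```python
# Illustrative simulation of the distance-from-work example: a screening rule
# that never looks at race can still reject one group far more often when
# residential segregation makes commute distance a proxy for race.
# All numbers are synthetic and chosen only to make the pattern visible.

import random

random.seed(0)

def simulate_applicant(group):
    # Assumption for illustration: because of segregation and redlining,
    # applicants in group "A" tend to live farther from the worksite.
    mean_commute = 14 if group == "A" else 7
    return {"group": group, "commute_miles": max(1.0, random.gauss(mean_commute, 3))}

applicants = [simulate_applicant("A") for _ in range(5000)]
applicants += [simulate_applicant("B") for _ in range(5000)]

# The "race-blind" rule learned from retention data: screen out long commutes.
def passes_screen(applicant, cutoff_miles=10):
    return applicant["commute_miles"] <= cutoff_miles

for group in ("A", "B"):
    members = [a for a in applicants if a["group"] == group]
    rate = sum(passes_screen(a) for a in members) / len(members)
    print(f"Group {group}: {rate:.0%} pass the commute screen")

# Typical output: group A passes at a fraction of group B's rate, even though
# race never appears anywhere in the rule.
```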

Yet because numerical rankings and categories are produced by computer algorithms drawing on large data sets, they are often presented and perceived as objective and unbiased, offering a veneer of scientific legitimacy to decisions that amplify and perpetuate racism and other forms of injustice. Using the pretense of science to rationalize racism is a timeworn ploy that harkens back to the 19th century, when discredited ideas of phrenology and physiognomy were deployed to claim that “innate” biological differences justified discrimination and the social inequality that resulted.12  The same dynamic of amplified inequality with a scientific facade shows up today in workplaces, the consumer marketplace, and in the public sphere.

The core problem is not simply one of personal privacy being violated, but of the tremendous power disparities built into the system: While surveillance and data extraction expose every detail of people’s lives, the algorithms used to evaluate people and make decisions that profoundly impact life chances are kept secret and are insulated from challenge or question.

This extreme informational asymmetry consolidates power in the hands of the corporations that control the data and algorithms—further deepening existing inequality. In this section of the paper, we will trace the roots of data capitalism in chattel slavery and its evolution over time, exploring its pernicious consequences in the workplace, consumer marketplace, and public sphere.

What is an Algorithm?

An algorithm is a set of instructions to solve a problem or perform a task. Recipes are algorithms: the ingredients are the inputs, the list of steps to prepare the dish is the process, and the finished dish is the result. But within any problem we’re trying to solve or task we’re trying to perform are decisions about what to optimize for. With our recipe, do we want to focus on making a healthy meal, or a delicious meal regardless of health benefits? That choice informs how we go about looking for a recipe, which recipe we decide to use, and what modifications we might make. Algorithms vary widely in sophistication and complexity, and the process is rarely as simple as raw data going in and outputs, such as scores, ratios, GPS routes, and Netflix recommendations, coming out. History and values ultimately influence both inputs and outputs. Baked into the mathematical formulas of an algorithm, represented by lines of code, are legacies of racist public policy and discrimination dating back to the founding of this country, carried forward through existing data sets that serve as digital artifacts of the past. Data scientists, activists, and practitioners have argued that no algorithm is neutral, and that algorithms are opinions embedded in code.13
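
Staying with the recipe example, the short sketch below (with invented recipes and weights) illustrates how the choice of what to optimize for is itself a value judgment that determines the output.

```python
# Minimal sketch of the recipe example: the "algorithm" is just a ranking rule,
# and which recipe wins depends entirely on what we choose to optimize for.
# The recipes and numbers are invented for illustration.

recipes = [
    {"name": "kale salad", "tastiness": 5, "healthiness": 9},
    {"name": "mac and cheese", "tastiness": 9, "healthiness": 3},
    {"name": "lentil stew", "tastiness": 7, "healthiness": 8},
]

def pick_recipe(recipes, health_weight):
    """Rank recipes by a weighted score. health_weight encodes a value choice:
    0.0 optimizes purely for taste, 1.0 purely for health.
    """
    def score(recipe):
        return (health_weight * recipe["healthiness"]
                + (1 - health_weight) * recipe["tastiness"])
    return max(recipes, key=score)

print(pick_recipe(recipes, health_weight=0.0)["name"])  # mac and cheese
print(pick_recipe(recipes, health_weight=0.8)["name"])  # kale salad
```

The same structure of inputs, a chosen objective, and an output underlies far higher-stakes scoring systems in hiring, lending, and policing.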

Data Capitalism and Algorithmic Racism © 2021 by Data for Black Lives and Demos is licensed under Attribution-NonCommercial-NoDerivatives 4.0 International. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/4.0/



This report has been written in partnership with Demos and Data for Black Lives, a movement of activists, organizers, and mathematicians committed to the mission of using data science to create concrete and measurable change in the lives of Black people.


[Image: Computer window with a text box that reads “Taking our power back,” alongside protest images]