Who Needs Democracy When You Have Data
Here’s how China rules using data, AI, and internet surveillance.
By Christina Larson August 20, 2018
In 1955, science fiction writer Isaac Asimov published a short story about an
experiment in “electronic democracy,” in which a single citizen, selected to
represent an entire population, responded to questions generated by a
computer named Multivac. The machine took this data and calculated the
results of an election that therefore never needed to happen. Asimov’s story
was set in Bloomington, Indiana, but today an approximation of Multivac is
being built in China.
For any authoritarian regime, “there is a basic problem for the center of figuring
out what’s going on at lower levels and across society,” says Deborah
Seligsohn, a political scientist and China expert at Villanova University in
Philadelphia. How do you effectively govern a country that’s home to one in
five people on the planet, with an increasingly complex economy and society, if
you don’t allow public debate, civil activism, and electoral feedback? How do
you gather enough information to actually make decisions? And how does a
government that doesn’t invite its citizens to participate still engender trust and
bend public behavior without putting police on every doorstep?
Hu Jintao, China’s leader from 2002 to 2012, had attempted to solve these
problems by permitting a modest democratic thaw, allowing avenues for
grievances to reach the ruling class. His successor, Xi Jinping, has reversed that
trend. Instead, his strategy for understanding and responding to what is going
on in a nation of 1.4 billion relies on a combination of surveillance, AI, and big
data to monitor people’s lives and behavior in minute detail.
Meanwhile, police are increasingly stopping petitioners from ever reaching Beijing.
“Now trains require national IDs to purchase tickets, which makes it easy for
the authorities to identify potential ‘troublemakers’ such as those who have
protested against the government in the past,” says Maya Wang, senior China
researcher for Human Rights Watch. “Several petitioners told us they have been
stopped at train platforms.” Bloggers, activists, and lawyers are also being
systematically silenced or imprisoned, as if data could give the government the
same information without any of the fiddly problems of freedom.
The idea of using networked technology as a tool of governance in China goes
back to at least the mid-1980s. As Harvard historian Julian Gewirtz explains,
“When the Chinese government saw that information technology was
becoming a part of daily life, it realized it would have a powerful new tool for
both gathering information and controlling culture, for making Chinese people
more ‘modern’ and more ‘governable’—which have been perennial obsessions
of the leadership.” Subsequent advances, including progress in AI and faster
processors, have brought that vision closer.
The most far-reaching of these efforts is the Social Credit System, though a
better translation of the Chinese name might be the “trust” or “reputation”
system. The government plan,
which covers both people and businesses, lists among its goals the
“construction of sincerity in government affairs, commercial sincerity, and
judicial credibility.” (“Everybody in China has an auntie who’s been swindled.
There is a legitimate need to address a breakdown in public trust,” says Paul
Triolo, head of the geotechnology practice at the consultancy Eurasia Group.)
To date, it’s a work in progress, though various pilots preview how it might
work in 2020, when it is supposed to be fully implemented.
Blacklists are the system’s first tool. For the past five years, China’s court
system has published the names of people who haven’t paid fines or complied
with judgments. Under new social-credit regulations, this list is shared with
various businesses and government agencies. People on the list have found
themselves blocked from borrowing money, booking flights, and staying at
luxury hotels. China’s national transport companies have created additional
blacklists, to punish riders for behavior like blocking train doors or picking
fights during a journey; offenders are barred from future ticket purchases for
six or 12 months. Earlier this year, Beijing debuted a series of blacklists to
prohibit “dishonest” enterprises from being awarded future government
contracts or land grants.
“The idea of social credit is to monitor and manage how people and institutions
behave,” says Samantha Hoffman of the Mercator Institute for China Studies in
Berlin. “Once a violation is recorded in one part of the system, it can trigger
responses in other parts of the system. It’s a concept designed to support both
economic development and social management, and it’s inherently political.”
Some parallels to parts of China’s blueprint already exist in the US: a bad credit
score can prevent you from taking out a home loan, while a felony conviction
suspends or annuls your right to vote, for example. “But they’re not all
connected in the same way—there’s no overarching plan,” Hoffman points out.
The opacity of the system makes it difficult to evaluate the effectiveness of
experiments like the social-credit pilot in Rongcheng, a city in Shandong
province. The party has squeezed out almost all critical
voices since 2012, and the risks of challenging the system—even in relatively
small ways—have grown. What information is available is deeply flawed;
systematic falsification of data on everything from GDP growth to hydropower
use pervades Chinese government statistics. Australian National University
researcher Børge Bakken estimates that official crime figures, which the
government has a clear incentive to downplay, may represent as little as 2.5
percent of all criminal behavior.
“It’s not the technology that created the policies, but technology greatly
expands the kinds of data that the Chinese government can collect on
individuals,” says Richard McGregor, a senior fellow at the Lowy Institute and
the author of The Party: The Secret World of China’s Communist Rulers. “The
internet in China acts as a real-time, privately run digital intelligence service.”
Algorithmic policing
Writing in the Washington Post earlier this year, Xiao Qiang, a professor of
communications at the University of California, Berkeley, dubbed China’s
data-enhanced governance “a digital totalitarian state.” The dystopian aspects are
most obviously on display in western China.
Xinjiang (“New Territory”) is the traditional home of a Chinese Muslim
minority known as Uighurs. As large numbers of Han Chinese migrants have
settled in—some say “colonized”—the region, the work and religious
opportunities afforded to the local Uighur population have diminished. One
result has been an uptick in violence in which both Han and Uighur have been
targeted, including a 2009 riot in the capital city of Urumqi in which a
reported 200 people died. The government’s response to rising tensions has not
been to
hold public forums to solicit views or policy advice. Instead, the state is using
data collection and algorithms to determine who is “likely” to commit future
acts of violence or defiance.
In the western city of Kashgar, many of the family homes and shops on main
streets are now boarded up, and the public squares are empty. When I visited in
2013, it was clear that Kashgar was already a segregated city—the Han and
Uighur populations lived and worked in distinct sections of town. But in the
evenings, it was also a lively and often noisy place, where the sounds of the call
to prayer intermingled with dance music from local clubs and the conversations
of old men sitting out late in plastic chairs on patios. Today the city is eerily
quiet; neighborhood public life has virtually vanished. Emily Feng, a journalist
for the Financial Times, visited Kashgar in June and posted photos on Twitter of
the newly vacant streets.
The reason is that by some estimates more than one in 10 Uighur and Kazakh
adults in Xinjiang have been sent to barbed-wire-ringed “reeducation camps”—
and those who remain at large are fearful.
In the last two years thousands of checkpoints have been set up at which
passersby must present both their face and their national ID card to proceed on
a highway, enter a mosque, or visit a shopping mall. Uighurs are required to
install government-designed tracking apps on their smartphones, which
monitor their online contacts and the web pages they’ve visited. Police officers
visit local homes regularly to collect further data on things like how many
people live in the household, what their relationships with their neighbors are
like, how many times people pray daily, whether they have traveled abroad,
and what books they have.
All these data streams are fed into Xinjiang’s public security system, along with
other records capturing information on everything from banking history to
family planning. “The computer program aggregates all the data from these
different sources and flags those who might become ‘a threat’ to authorities,”
says Wang. The precise algorithm is unknown, but it is believed to flag
behaviors such as visiting a particular mosque, owning a lot of books,
buying a large quantity of gasoline, or receiving phone calls or email from
contacts abroad. People it flags are visited by police, who may take them into
custody and put them in prison or in reeducation camps without any formal
charges.