Common-Knowledge Attacks on Democracy

Henry Farrell
Professor of Political Science and International Affairs at George Washington University

Bruce Schneier
Fellow, Berkman Klein Center for Internet and Society
Fellow and Lecturer, Belfer Center and Digital HKS, Harvard Kennedy School

October 2018
Abstract

Existing approaches to cybersecurity emphasize either international state-to-state logics (such as deterrence theory) or the integrity of individual information systems. Neither provides a good understanding of new “soft cyber” attacks that involve the manipulation of expectations and common understandings. We argue that scaling up computer security arguments to the level of the state, so that the entire polity is treated as an information system with associated attack surfaces and threat models, provides the best immediate way to understand these attacks and how to mitigate them. We demonstrate systematic differences between how autocracies and democracies work as information systems, because they rely on different mixes of common and contested political knowledge. Stable autocracies will have common knowledge over who is in charge and their associated ideological or policy goals, but will generate contested knowledge over who the various political actors in society are, and how they might form coalitions and gain public support, so as to make it more difficult for coalitions to displace the regime. Stable democracies will have contested knowledge over who is in charge, but common knowledge over who the political actors are, and how they may form coalitions and gain public support. These differences are associated with notably different attack surfaces and threat models. Specifically, democracies are vulnerable to measures that “flood” public debate and disrupt shared decentralized understandings of actors and coalitions, in ways that autocracies are not.
Introduction
In 2014, presumed Russian hackers sought to compromise key aspects of Ukraine’s elections. Notably, the targets included the systems used to communicate the election results to newspapers. As a Ukrainian official described the attack: “Offenders were trying by means of previously installed software to fake election results in the given region, and in such a way to discredit general election results of elections of the President of Ukraine.”1

In 2016, the Internet Research Agency, a company based in St. Petersburg, began to post false content on US social media that seemed intended to stir up controversy, division, and disagreement on the facts among its readers, to the point of trying to create both protests and counter-protests over the same issues.2 Many scholars doubt whether these attacks had large-scale consequences for behavior,3 but they plausibly worsened a general sense of paranoia, doubt, and confusion among people who were increasingly unsure what their fellow citizens believed, and (as the debate over Internet manipulation began) which of them were fellow citizens, and which foreign trolls or automated processes.4

Both these attacks are attacks on common political knowledge: the consensus beliefs that hold political systems such as democracies together.5 Election security does not simply involve physical infrastructure, such as ballots and polling booths. It also involves roughly consensual expectations about how the system works, who won and who lost, and so on. If an attacker does not penetrate the physical election infrastructure, but does successfully subvert the shared expectations around the election, she can nevertheless succeed.6

To work properly, democracies require this kind of broad agreement across many questions. Democracies delegate core aspects of decision making to ordinary citizens, and to politicians and parties who struggle for citizens’ support. Politicians and the citizens they represent will disagree over many topics. However, if the decentralized system of democracy is not to break down into chaos, then citizens and their representatives have to roughly agree about what they disagree about.7 They have to be able to recognize who the different factions in society are and what their broad purposes are, and to believe that their political opponents will not seek to permanently dominate or destroy them, but instead will be subject to the same democratic limits as they are. Attacks that undermine these collectively held expectations will make it far harder for groups and parties to make coalitions, forge compromises, or engage in the rest of the grind of democratic politics.

Common-knowledge attacks can have critical consequences,8 yet they are a poor fit with conventional national security approaches to cybersecurity. National security officials traditionally think about cybersecurity using Cold War concepts that were developed to understand nuclear weapons. They use ideas such as the offense–defense balance, conventional deterrence theory, and deterrence by denial.9 They focus on the threats posed by nation-state adversaries. They consider how best to mitigate these threats in a low-information environment, both by manipulating information about capabilities and intentions, and — where appropriate — making credible threats against adversarial states.

Technologists, in contrast, start from a very different set of security assumptions. Broadly speaking, they are agnostic about whether the threats come from states or other actors. Instead, they focus on defending specific information systems: modeling potential threats that different kinds of actors might pose to these systems based on their characteristics. They want to understand the attack surface that attackers might exploit, and close off or mitigate the most serious vulnerabilities or the ones that widen the attack surface.10 They try to design secure and reliable systems based on a deep understanding of how attacks and attackers operate.

Neither of these approaches is innately well-suited for analyzing common-knowledge attacks. On one hand, national security officials have a hard time using the traditional concepts of national security theory to analyze the threats exemplified by these attacks. Some aspects, such as the hacking of electoral databases or systems containing sensitive political information, fit traditional notions of state-on-state espionage and covert action. But the ways in which this hacked information has been used to stir up political controversy are a much poorer fit, and efforts to “flood”11 social media with irrelevant and distracting content in order to compromise democratic debate do not fit at all.

1 Shackelford, Scott J., Schneier, Bruce, Sulmeyer, Michael, Boustead, Anne, Buchanan, Ben, Deckard, Amanda N. C. et al. (2017). “Making Democracy Harder to Hack: Should Elections Be Classified as ‘Critical Infrastructure’?” University of Michigan Journal of Law Reform, 50(3), 629–668.
2 See Shane, Scott, and Mazzetti, Mark (2018). “The Plot to Subvert an Election: Unraveling the Russia Story So Far.” New York Times.
3 Benkler, Yochai, Faris, Rob, and Roberts, Hal (2018). Network Propaganda: Manipulation, Disinformation and Radicalization in American Politics. New York: Oxford University Press.
4 Chen, Adrian (2016). “The Real Paranoia-Inducing Purpose of Russian Hacks.” New Yorker.
5 We note for readers familiar with game theory that our understanding of “common knowledge” is less demanding than the formal definition they are familiar with. Everybody does not need to know what everyone else knows, and so on. Instead, for us, common knowledge is the roughly shared set of social beliefs about how the system works, who the actors are, and so on, which helps to order politics. We stress its coordinating role in this paper: other, more sociologically inclined, accounts might stress legitimacy instead. We leave for later debate the extent to which these differing accounts might better or worse capture actual political dynamics.
6 As Bruce Schneier wrote regarding election security: “Elections serve two purposes. The first, and obvious, purpose is to accurately choose the winner. But the second is equally important: to convince the loser.” Schneier, Bruce (2018). “American Elections Are Too Easy to Hack. We Must Take Action Now.” Guardian. See also Shackelford et al., “Making Democracy Harder to Hack.”
7 We deliberately choose not to discuss here the thorny question of how much each individual citizen has to know for democracy to function properly,
and how much of the work can instead be delegated to broader structures, such as political parties.
8 They include the attacks by Russia against the 2016 UK Brexit vote, the 2016 US election, and the 2017 French presidential election.
9 See for example Nye, Joseph S. (2011). “Nuclear Lessons for Cyber Security?” Strategic Studies Quarterly, 5(4), 18–38; Nye, Joseph S. (2017). “Deterrence and Dissuasion in Cyberspace.” International Security, 41(3), 44–71; Lindsay, Jon R. (2013). “Stuxnet and the Limits of Cyber Warfare.” Security Studies, 22(3), 365–404.
10 See for example Schneier, Bruce (2001). Secrets and Lies: Digital Security in a Networked World. Hoboken, NJ: John Wiley; Bellovin, Steven (2015). Thinking Security: Stopping Next Year’s Hackers. Boston: Addison-Wesley Professional; Shostack, Adam (2014). Threat Modeling: Designing for Security. Hoboken, NJ: John Wiley.
11 On flooding, see Roberts, Margaret (2018). Censored: Distraction and Diversion Inside China’s Great Firewall. Princeton, NJ: Princeton University Press.
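The threat-modeling workflow described above (enumerate assets, identify likely attackers, map the attack surface) can be made concrete with a toy sketch. Everything here is illustrative: the asset names, the threat actors, and the `Threat`/`InformationSystem` structures are assumptions for exposition, not part of any standard methodology or tool.

```python
from dataclasses import dataclass, field

@dataclass
class Threat:
    actor: str   # who might attack (state, criminal, troll farm, ...)
    vector: str  # how the attack surface is reached
    impact: str  # which asset or shared expectation is degraded

@dataclass
class InformationSystem:
    name: str
    assets: list
    threats: list = field(default_factory=list)

    def attack_surface(self):
        """The set of distinct vectors an attacker could exploit."""
        return sorted({t.vector for t in self.threats})

# An election treated as an information system: note that a set of
# shared expectations sits alongside the physical assets.
election = InformationSystem(
    name="national election",
    assets=["voter rolls", "tally servers", "results feed to press",
            "shared expectations about who won"],
)
election.threats += [
    Threat("state actor", "tally server compromise", "reported counts"),
    Threat("state actor", "results-feed spoofing", "trust in the announced winner"),
    Threat("troll farm", "social media flooding", "consensus that the process was fair"),
]

print(election.attack_surface())
```

Treating “shared expectations about who won” as a first-class asset is exactly the move the argument calls for: the flooding threat never touches the tally servers, yet it appears on the same attack surface.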
Some national security analysts and scholars use concepts such as information warfare, or — more pejoratively — propaganda. This captures some aspects of these new forms of attack, but does less well at capturing others. These attacks tend to be more aimed at degrading than persuading; that is, at making democratic debate more difficult rather than attempting to change people’s minds in a particular direction. While national security scholars have sought to analyze influence and “chaos” attacks as aspects of a common phenomenon, it is not clear that they fit well together.12 Some of this writing is unduly alarmist: for example, Clint Watts warns of the threat of “Advanced Persistent Manipulators,” claiming that “hacking people’s computers...feels like child’s play compared to the hacking of people’s minds that has occurred on social media platforms the past four years.”13 Finally, it is hard to see how standard approaches to deterrence can provide a plausible solution to these attacks,14 especially when they are not carried out by nation-state adversaries.

On the other side, technologists’ understanding of these attacks is equally flawed. Attacks on election systems or on political parties’ private servers are broadly similar to other attacks on information systems and can partly be mitigated by better threat modeling, better-designed systems, and more extensive security technologies. Some of the specific technological support structures of American democracy (most notoriously, voting machines) are highly vulnerable and in sore need of redesign according to commonly understood security principles.15 However, technologists don’t usually think systematically about broader knowledge systems and expectations. Instead, they focus on narrowly defined traditional information systems, such as servers and individual networks, and have little to say about the consequences of attacks for the broader fabric of democratic societies. While technologists can note the possibility of such effects, they do not have any good means of evaluating the associated risks.

One way to remedy this gap is to extend the logic of national security further, so that it looks to explain and counter a variety of nontraditional and nonmilitary threats that have consequences for freedom, liberty, and democracy. Realist scholars such as Jack Goldsmith have long been skeptical about the US effort to extend its liberal and democratic values via the Internet, believing that this radically underestimates the differences between different nation-states and the capacity of those states to defend their interests against outside incursions.16 More recently, on his own and in collaboration with Stuart Russell, Goldsmith has argued that attacks against democracy demonstrate the profound flaws of the Internet freedom agenda. Specifically, he claims that the US “pro freedom” bias against censorship and regulating the commercial actors who dominate the Internet has created major national security vulnerabilities, and rendered the US incapable of responding to profound new threats.17

Goldsmith’s skepticism about the dominance of commercial actors is well founded. However, the national security perspective is systematically blinkered in ways that make it hard to assess the appropriate means to defend democratic practices against incursions. When viewed from the perspective of national security, most forms of freedom — almost by definition — are also potential vulnerabilities. This means that the national security approach has enormous difficulties in assessing the appropriate trade-offs that are needed to guarantee a well-functioning democracy. Intelligent versions of the national security perspective, such as Goldsmith’s, at least note the need for these trade-offs and the difficulty in striking them. Cruder versions may end up identifying the freedoms that they are purportedly supposed to defend as windows of vulnerability that need to be closed.

In this paper, we argue that extending the technical approach to cybersecurity provides a different — and we believe more useful — way of understanding the problem. Like national security thinkers, we begin from the level of the nation-state. But like technologists, our analysis focuses on the informational aspects of the nation-state. Specifically, we are interested in the different ways that democracies and autocracies organize themselves, and the kinds of coordination that they need to function effectively. The technical approach does not currently address questions of collective knowledge, but there is no reason in principle why it cannot.

Hence, we scale up the technologists’ approach to cybersecurity, so that rather than thinking about specific information systems within democracy, it approaches democracies and autocracies as information systems, and then asks questions such as: what is their respective attack surface, which likely threat models do they face, and how do they (or can they) seek to mitigate risks? The technical approach to cybersecurity is precisely intended to strike trade-offs between ensuring that the information system is usable and accessible, and minimizing and mitigating the inevitable vulnerabilities that go together with usability and accessibility.

We sketch out a simple framework that both identifies key differences between autocratic and democratic systems, and provides a roadmap for future research and policy measures.

As far as we know, no one has previously undertaken this kind of analysis. In one sense, that is surprising, given its plausible relevance and value. In another, it is not surprising at all. There is existing research literature on the informational trade-offs or “dictators’ dilemmas” that autocrats face, in seeking to balance between their own need for useful information and economic growth, and the risk that others can use available information to undermine their rule.18 There is no corresponding literature on the informational trade-offs that democracies face between desiderata like availability and stability.

12 Lin, Herb, and Kerr, Jaclyn (2018). “On Cyber-Enabled Information/Influence Warfare and Manipulation.” In Paul Cornish (Ed.), Oxford Handbook of Cyber Security. New York: Oxford University Press; Paul, Christopher, and Matthews, Miriam (2016). The Russian “Firehose of Falsehood” Propaganda Model: Why It Might Work and Options to Counter It. Santa Monica, CA: Rand Corporation.
13 P. 7, Watts, Clint (2018). “Advanced Persistent Manipulators and Social Media Nationalism: National Security in a World of Audiences” (Aegis Series Papers 1812). Palo Alto, CA: Hoover Institution.
14 See Goldsmith, Jack (2016). “The DNC Hack and the (Lack of) Deterrence.” Lawfare.
15 See Blaze, Matt (2017). Testimony. Proceedings from US House of Representatives Committee on Oversight and Government Reform, Subcommittee on Information Technology and Subcommittee on Intergovernmental Affairs Hearing on Cybersecurity of Voting Machines, Washington DC.
16 See in particular Goldsmith, Jack, and Wu, Tim (2006). Who Controls the Internet: Illusions of a Borderless World. New York: Oxford University Press.
17 See Goldsmith, Jack (2018). The Failure of Internet Freedom. Miami, FL: Knight Foundation; Goldsmith, Jack, and Russell, Stuart (2018). “Strengths Become Vulnerabilities: How a Digital World Disadvantages the United States in Its International Relations” (Hoover Institution Aegis Papers 1806). Palo Alto, CA: Hoover Institution.
18 See Roberts, Censored, pp. 23–25, for discussion of the literature; Kalathil, Shanthi, and Boas, Taylor (2003). Open Networks, Closed Regimes: The Impact of the Internet on Authoritarian Rule. New York: Carnegie Endowment, for an early articulation of the logic of the dictator’s dilemma; and Hollyer, James, Rosendorff, Peter, and Vreeland, James (2018). Information, Democracy, and Autocracy: Economic Transparency and Political (In)Stability. New York: Cambridge University Press, for a useful recent treatment.
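The scaled-up framework can be summarized as a small illustrative table: each regime type depends on a different mix of common and contested political knowledge, so the information that destabilizes it differs too. This is a minimal sketch; the labels are our shorthand for the argument, not a formalization from the paper.

```python
# Toy encoding of the regimes-as-information-systems framework.
# Each regime relies on a different knowledge mix, and an attack
# succeeds by flipping the kind of knowledge the regime relies on.
REGIMES = {
    "democracy": {
        "common_knowledge": ["who the political actors are",
                             "how coalitions form and win support"],
        "contested_knowledge": ["who is in charge", "policy goals"],
        "destabilizing_attack": "flooding shared beliefs about actors and coalitions",
    },
    "autocracy": {
        "common_knowledge": ["who is in charge", "regime ideology and goals"],
        "contested_knowledge": ["who the other actors are",
                                "how coalitions could form"],
        "destabilizing_attack": "making hidden opposition commonly known",
    },
}

def dangerous_information(regime: str) -> str:
    """The attack that undermines this regime's stabilizing knowledge mix."""
    return REGIMES[regime]["destabilizing_attack"]

print(dangerous_information("democracy"))
print(dangerous_information("autocracy"))
```

The symmetry is the point: what democracies hold as common knowledge (the landscape of actors and coalitions) is exactly what autocracies work to keep contested, and vice versa for knowledge of who rules.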
19 On democratic stability and expectations, see Fearon, James (2011). “Self-Enforcing Democracy.” Quarterly Journal of Economics, 126(4), 1661–1708. There is of course a well-established literature in social choice on the impossibility of reaching outcomes that truly reflect people’s individual preferences, given a set of reasonable criteria. There are further inevitable distortions that arise from institutions, political parties, and so on. Nonetheless, democracies roughly reflect people’s different beliefs and perspectives in ways that autocracies do not.
20 North, Douglass, Wallis, John Joseph, and Weingast, Barry R. (2009). Violence and Social Orders: A Conceptual Framework for Interpreting Recorded Human History. Princeton, NJ: Princeton University Press; Hadfield, Gillian, and Weingast, Barry R. (2012). “What is Law? A Coordination Model of the Characteristics of Legal Order.” Journal of Legal Analysis, 4(2), 471–514; Carugati, Federiga, Hadfield, Gillian, and Weingast, Barry R. (2015). “Building Legal Order in Ancient Athens.” Journal of Legal Analysis, 7(2), 291–324.
21 Hardin, Russell (1990). “The Social Evolution of Cooperation.” In Karen Cook and Margaret Levi (Eds.), The Limits of Rationality. Chicago: University of Chicago Press.
This loose agreement on what everyone “knows” coexists with a quite different, and even contrary, form of knowledge: the information dispersed in the political disagreements within a given society, or, as we call it, contested political knowledge.22 This is the political knowledge that emerges from the tensions between the different goals and perspectives of various actors and groups in society. For example, people in a democracy may disagree on questions such as the role that government should play in the economy, or whether there should be tariffs or free trade, or how the government should conduct its foreign policy.

All societies have real or potential political factions and actors, or coalitions of actors, each with its own specific goals. Very often, these goals conflict with each other; for one actor or coalition to achieve its goal is to frustrate another’s. These differing goals are commonly associated with different cognitive styles of problem solving, and different beliefs about what the most important problems are. Politics, then, is the process through which these group conflicts over goals, problem-solving styles, and rankings of problems are expressed, mediated, and suppressed.

… relationship between them. Thus, for example, some strategic accounts of politics focus on the need to generate common expectations that allow for broad social coordination even in decentralized societies.23 Others instead emphasize the degree of diverse knowledge and beliefs within society, and the problems and/or benefits that arise therefrom.24

It is obvious that a society organized around a government cannot survive without common political knowledge. What is less obvious is that contested political knowledge is also valuable. Just as, for scholars of biological evolution, the level of information in a species is contained in its genetic diversity, the extent of reasonable political disagreement in a society is a rough index of the information that society possesses. Complex social problems are best solved when multiple, diverse perspectives can be applied to them, each perspective potentially disclosing an aspect of the problem that is invisible to others.25
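The diversity claim cited to Page and to Lazer and Friedman is, at bottom, a claim about search: problem solvers with different heuristics get stuck at different local optima, so a group pooling diverse heuristics outperforms any single perspective. Here is a minimal sketch with an invented landscape and greedy climbers; the numbers and step sizes are arbitrary assumptions chosen only to illustrate the mechanism.

```python
# Solution quality at each position of a rugged, invented landscape.
LANDSCAPE = [2, 3, 1, 5, 4, 2, 8, 1, 9, 0, 7, 6, 10, 2]

def climb(start: int, steps: list) -> int:
    """Greedy hill climbing: jump by any allowed step size while the
    move strictly improves quality; return the quality where we stall."""
    pos = start
    while True:
        better = [pos + s for s in steps
                  if 0 <= pos + s < len(LANDSCAPE)
                  and LANDSCAPE[pos + s] > LANDSCAPE[pos]]
        if not better:
            return LANDSCAPE[pos]
        pos = max(better, key=lambda p: LANDSCAPE[p])

# One perspective: small steps only -> stuck at the first local peak.
narrow = climb(0, steps=[1])
# A diverse group: each member has a different heuristic; the group
# keeps the best solution any member finds.
diverse = max(climb(0, steps=[1]), climb(0, steps=[2]), climb(0, steps=[3]))
print(narrow, diverse)
```

No single step size reaches the global optimum here either; the point is only that pooling heuristics dominates any one of them, which is the sense in which contested knowledge carries information a society can use.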
22 Here we develop ideas articulated first in Farrell, Henry, and Shalizi, Cosma R. (2015). “Pursuing Cognitive Democracy.” In Danielle Allen and Jennifer Light (Eds.), From Voice to Influence: Understanding Citizenship in a Digital Age (pp. 211–231). Chicago: University of Chicago Press (we are grateful to Cosma for basic insights that inform our broader arguments).
23 See in particular Hadfield and Weingast, “What is Law?” While Hadfield and Weingast acknowledge the importance of “idiosyncratic” understandings as a spur to creative economic interactions, their theory insulates these understandings from the self-enforcing institutions that they see as fundamental to the stability of open access orders.
24 Knight, Jack, and Johnson, James (2011). The Priority of Democracy: Political Consequences of Pragmatism. Princeton, NJ: Princeton University Press; Levy, Jacob (2018). Justice in Babylon. Unpublished paper.
25 Page, Scott (2007). The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies. Princeton, NJ: Princeton University Press; Lazer, David, and Friedman, Allan (2005). “The Parable of the Hare and the Tortoise: Small Worlds, Diversity, and System Performance.” Kennedy School of Government Working Paper No. RWP05-058.
27 Rosenblum, Nancy L. (2010). On the Side of the Angels: An Appreciation of Parties and Partisanship. Princeton, NJ: Princeton University Press; Farrell and Shalizi, “Pursuing Cognitive Democracy.”
28 See Jack Knight, Institutions and Social Conflict (Cambridge University Press, 1992), although note that institutions invariably will involve a rough rather than a complete consensus on what the rules mean. See further Danielle Allen, Henry Farrell, and Cosma Shalizi, An Evolutionary Account of Institutional Change. Unpublished paper.
29 See Przeworski, Adam (2018). Why Bother with Elections? Hoboken, NJ: John Wiley, for a good recent overview of elections and alternation of government.
They will also need to share common knowledge that their domestic adversaries are broadly committed to the democratic process, so that they need not fear indefinite domination or worse when they and their allies lose.

The second involves common knowledge over the range of actors, beliefs, and opinions in the society. If new interests and new parties are to come into being, compete, and either fail or flourish, they will need to have a reasonable understanding of who the other political actors are, what their interests are, and where they clash with or converge with their own. While much attention is paid to the generic costs of collective action, the shared knowledge that allows political actors to identify and coordinate with potential allies, attract voters, and so on is just as important, and perhaps more so.

In successful democratic societies, knowledge is decentralized across the wide variety of collective actors whose consent and willingness to constrain their activities is necessary for the system to work.30 This is essential, since ordinary citizens play a significant role in political decision making instead of just handing authority to a central power elite.31

Autocracies adopt a very different approach to common and contested knowledge. In contrast to democracies, they require common political knowledge about who is in charge, and what their social goals are, as a basic condition of stability. There may be internal contestation between the different factions within the elite, but such contestation is often clandestine, and is carefully insulated from the public realm, so as not to destabilize the shared expectations that anchor regime stability.

This explains the great lengths to which autocracies often go to manipulate shared expectations, and to support useful public beliefs. Autocracies benefit — as democracies do not — from what political scientists have described as “pluralistic ignorance” or “preference falsification,” under which people only have private knowledge of their own political beliefs and wants, without any good sense of the beliefs and wants of others.32

For example, Marc Lynch has noted that the Tunisian autocracy was one of the “most heavily censored states on earth.”33 It relied on an information environment in which public displays of support for the regime were mandated, informing on friends, neighbors and family was common, and dissidents were tortured and punished, so that it was difficult for individuals to know how truly unpopular the regime had become: all they could see was their own private unhappiness, and the public support shown by others.34 Even if an autocratic government is broadly detested, it may remain in power so long as the public does not realize how broadly detested it is.

Autocracies do not require common political knowledge about the efficacy and fairness of elections. In Valerie Bunce’s pungent distinction, while democratic countries provide certainty about the political process and uncertainty over outcomes, authoritarian countries provide uncertainty about process and certainty over outcomes.35 Many authoritarian regimes conduct elections, both as a legitimating sop and to provide themselves with some information as to the distribution of views within their population. However, they typically show no compunction in manipulating the results to ensure that the regime and its supporters triumph. The actual workings of electoral institutions — and representative institutions more generally — are likely to be opaque to ordinary citizens and outsiders. Authoritarian regimes also often ensure that the “rules of the game” of politics are hidden, or open to manipulation or revision, in order to ensure that upstarts can’t use those rules to organize against them.

Autocratic regimes will typically benefit from contested political knowledge about nongovernmental groups and actors in society. Again, efficacious long-term collective action does not merely rest on simple technologies that make it cheaper for people to organize. It also requires a relatively sophisticated understanding of the variety of different political actors both inside and outside the ruling coalition, and the distributed support among the population for these actors and their differing agendas. However, self-organizing collective actors are more likely to be a challenge than a resource to autocratic regimes, since such actors may become powerful enough to form coalitions that challenge the regime, leading to a transition either to another form of rule or a new autocracy with different actors in charge.

Rather than allowing common political knowledge regarding the preferences of the population and the variety of political actors to be shared among actors in a decentralized order, such regimes will try to maintain monopolistic control. This forestalls new collective interests from organizing, and makes it harder for existing interests to coalesce into a challenging coalition. To preserve the stability of their own rule, they will look to prevent independent interests from having sufficient appeal to broad segments of the population, and to prevent the population from being attracted to — and associating themselves with — independent interests.

Hence, they will act to limit common political knowledge about potential groupings in the society, their likely levels of support, and the possible coalitions they can form among each other. Thus, for example, the extensive Chinese social media censorship system is less focused on shaping the expression of public opinion (which may be valuable to the state under some circumstances) than on preventing citizens and others from organizing around particular causes.36

To be sure, autocracies may want accurate information for themselves about political beliefs within the population, so as to keep track of their legitimacy and ensure their long-term stability. Thus, authoritarian regimes such as the former Soviet Union, even while they kept tight control of public information through extensive censorship and surveillance, kept track of public beliefs through extensive survey polling, the results of which were only available to elite party leaders.37 Modern autocracies such as China similarly rely on public opinion surveys, as well as on social media as an index of broad public sentiment.38

Thus, there are crucial differences between democracies’ and autocracies’ respective approaches to information and knowledge. These differences mean that forms of information that may be stabilizing for one may be destabilizing for the other.

30 For relevant models, see James Fearon, “Self-Enforcing Democracy”; Little, Andrew, Tucker, Joshua, and LaGatta, Tom (2015). “Elections, Protest, and Alternation of Power.” Journal of Politics, 77(4), 1142–1156; and Hollyer, Rosendorff, and Vreeland, Information, Democracy, and Autocracy. In all of these models, the possibility of mass protest plays a key role in providing rulers with sufficient incentive to relinquish office.
31 Rosenblum, On the Side of the Angels, stresses the democratic problems associated with defining who the “people” are in exclusionary ways so as to preempt future changes in a democracy’s self-conception.
32 Kuran, Timur (1997). Private Truths, Public Lies: The Social Consequences of Preference Falsification. Cambridge, MA: Harvard University Press; King, Gary, Pan, Jennifer, and Roberts, Margaret (2017). “How the Chinese Government Fabricates Social Media Posts for Strategic Distraction, Not Engaged Argument.” American Political Science Review, 111(3), 484–501.
33 P. 305, Lynch, Marc (2011). “After Egypt: The Limits and Promise of Online Challenges to the Authoritarian Arab State.” Perspectives on Politics, 9(2), 301–310.
34 International Crisis Group (2011). Popular Protest in North Africa and the Middle East (IV): Tunisia’s Way (106). Brussels, Belgium: International Crisis Group.
35 We are grateful to Adam Segal for informing us of this distinction, and Valerie Bunce for confirming it.
36 King, Pan, and Roberts, “How the Chinese Government Fabricates Social Media Posts.”
37 Dimitrov, Martin (2014). “Tracking Public Opinion Under Authoritarianism: The Case of the Soviet Union During the Brezhnev Era.” Russian History, 41(3), 329–353.
38 Dimitrov, Martin (2015). “Internal Government Assessments of the Quality of Governance in China.” Studies in Comparative International Development, 50(1), 50–72.
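The “pluralistic ignorance” mechanism discussed above can be sketched as a toy simulation: citizens privately detest the regime, but each voices dissent only when visible dissent already exceeds a personal comfort threshold, so under enforced public silence the regime observes unanimous support. All the numbers and the single-round threshold rule are illustrative simplifications of Kuran’s cascade models, not a reproduction of them.

```python
import random

def visible_support(n: int, p_dissent: float, threshold: float,
                    seed: int = 0) -> float:
    """Fraction publicly supporting the regime after one round of
    observation: a citizen voices dissent only if their private view is
    dissent AND the share of already-visible dissent exceeds their
    comfort threshold."""
    rng = random.Random(seed)
    private_dissent = [rng.random() < p_dissent for _ in range(n)]
    public_dissent = [False] * n      # censorship: everyone starts silent
    visible = sum(public_dissent) / n  # initially 0.0
    for i in range(n):
        if private_dissent[i] and visible > threshold:
            public_dissent[i] = True
    return 1 - sum(public_dissent) / n

# 70% privately detest the regime, yet nobody dissents first, so
# public support appears unanimous.
print(visible_support(n=1000, p_dissent=0.7, threshold=0.05))
```

Lowering the threshold below zero (i.e., letting some citizens speak regardless of cover) collapses the illusion, which is why even small breaches in censorship can be existentially dangerous for such regimes.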
39 We note in passing that this provides one possible way of understanding the internal (and perhaps in some cases, to some limited degree, external)
attacks on democratic knowledge and expectations that have helped turn countries such as Hungary and Poland into populist democracies.
differ across different democratic regimes. Most notably, where common political knowledge is already frail, it will be easier for adversaries to engineer further attacks.

This difference helps us understand how policy measures that increase the stability of one form of regime may decrease the stability of another. The history of the last two decades has demonstrated how open information flows that benefited democratic regimes were viewed by authoritarian regimes as an existential threat, because they might transform regime-supporting contested political knowledge into regime-threatening common political knowledge. Only recently have we started to understand how the same information flows that benefit autocracies can be weaponized against democracies, turning regime-supporting common political knowledge into regime-undermining contested political knowledge.

Until quite recently, Western academics and policy makers shared a broad consensus about the destabilizing consequences of open information flows for autocratic regimes. This consensus dated from the mid-1990s, when libertarians such as John Perry Barlow claimed that the Internet would undermine tyrannical rule and extend freedom.40 It also extended to the left and the non-libertarian right. In different decades, both Bill Clinton and Hillary Clinton have made similar claims.41 The George W. Bush administration instituted a program of spending millions of dollars to provide technological assistance to anti-censorship activists, which was continued under the Obama administration.42

A proper understanding of the importance of common political knowledge and contested political knowledge helps explain both (a) why open information flows were regarded as an uncomplicatedly good thing by most Western observers, and (b) how they could have specific negative consequences for authoritarian regimes. These flows seemed to support the decentralized common political knowledge of democratic regimes rather than undermining it, providing better information to both political groups and voters about the broad contours of democratic politics, the range of actors, and the public support that they had. Internet communications technologies further provided the means for new groups to identify their shared interests and self-organize, helping the Howard Dean campaign, the left-leaning Netroots, Tea Party Republicans, and Black Lives Matter to circumvent traditional institutional barriers.43
40 Barlow, John Perry (1996). Declaration of the Independence of Cyberspace. San Francisco: Electronic Frontier Foundation (republished).
41 Clinton, William Jefferson (2000). “Remarks at the Paul H. Nitze School of Advanced International Studies.” Johns Hopkins SAIS; Clinton, Hillary Rodham (2012). “Internet Freedom and Human Rights.” Issues in Science and Technology, 28(3), 45–52.
42 Kiggins, Ryan D. (2015). “Open for Expansion: US Policy and the Purpose for the Internet in the Post-Cold War Era.” International Studies Perspectives, 16(1), 86–105; Goldsmith and Russell, “Strengths Become Vulnerabilities.”
43 Johnson, Stephen B. (2008). “Two Ways to Emerge, and How to Tell the Difference Between Them.” In Jon Lebkowsky and Mitch Ratcliffe (Eds.), Extreme Democracy. extremedemocracy.com; Farrell, Henry (2006). “Bloggers and Parties: Can the Netroots Reshape American Democracy?” Boston Review; Carney, Nikita (2016). “All Lives Matter, but So Does Race: Black Lives Matter and the Evolving Role of Social Media.” Humanity and Society, 40(2), 180–199. There is some disagreement about the extent to which the Tea Party was an organic grassroots movement, but see Skocpol, Theda, and Williamson, Vanessa (2012). The Tea Party and the Remaking of American Conservatism. New York: Oxford University Press, on the reliance of local activists on online tools such as MeetUp.
In contrast, such flows had potential destabilizing consequences for authoritarian regimes. The preference falsification that regimes such as Tunisia relied upon could be undone by social media like Facebook, which was not then censored or widely monitored. As the common political knowledge about the regime’s stability started to unravel, it became easier for individuals to come together and challenge it in public.

This was reinforced in Tunisia and elsewhere by the creation of new forms of common political knowledge where previously there had been contested political knowledge. As new technologies substantially lowered the costs of collective action, it became easier (in principle) for people to organize in groupings outside state structures.44 As these groups became more aware of other groups, and their various goals and levels of public support, they could begin to form coalitions, which in time could challenge and even potentially topple the regime. In many cases, it turned out that these coalitions did not lead to a democratic transition, but the prospect of long-term failure provided little comfort to threatened authoritarian leaders. The enthusiasm of democratic leaders for technology-fueled challenges to authoritarianism helped fuel paranoia among leaders who saw themselves as targeted, so that Vladimir Putin, for example, described the Internet as a “CIA project.”45 There is reason to believe that Russia’s hacking attacks during the US elections were in part motivated by the desire for retaliation.46

Contrary to these hopes and fears, the Internet and communications technologies have no inherent bias toward freedom.47 Indeed, authoritarian regimes proved adept at quickly turning new technologies to their purposes. On one hand, they started to use social media as an alternative means of safely gathering information about public preferences.48 On the other, they learned how to shut down and drown out potentially dissident voices.49 Many authoritarian regimes began to supplement fear-based forms of censorship with “friction” aimed at dissuading ordinary members of the public from looking for certain kinds of information through increasing the costs, and “flooding” public forums so as to disrupt decentralized public knowledge building and coalition building.50 They furthermore sought increasingly to exclude foreign NGOs focused on open society- and democracy-related issues from their domestic politics.
44 Shirky, Clay (2008). Here Comes Everybody: The Power of Organizing without Organizations. New York: Penguin.
45 See MacAskill, Ewen (2014). “Putin Calls Internet a ‘CIA Project’ Renewing Fears of Web Breakup.” Guardian.
47 Morozov, Evgeny (2011). The Net Delusion: The Dark Side of Internet Freedom. New York: Public Affairs; Farrell, Henry (2012). “The Consequences of the Internet for Politics.” Annual Review of Political Science, 15(1), 35–52.
48 Gunitsky, Seva (2015). “Corrupting the Cyber-Commons: Social Media as a Tool of Autocratic Stability.” Perspectives on Politics, 13(1), 42–54.
49 Tucker, Joshua A., Theocharis, Yannis, Roberts, Margaret, and Barberá, Paolo. (2017). “From Liberation to Turmoil: Social Media and Democracy.”
Journal of Democracy, 28(4), 46–59.
50 Roberts, Censored.
Adrian Chen describes the results in Russia:

...after speaking with Russian journalists and opposition members, I quickly learned that pro-government trolling operations were not very effective at pushing a specific pro-Kremlin message — say, that the murdered opposition leader Boris Nemtsov was actually killed by his allies, in order to garner sympathy. The trolls were too obvious, too nasty, and too coordinated to maintain the illusion that these were everyday Russians. Everyone knew that the Web was crawling with trolls, and comment threads would often devolve into troll and counter-troll debates. The real effect, the Russian activists told me, was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia.51

The systemic consequences of such measures inside Russia were to make the formation of common political knowledge impossible outside the parameters set by the government. If US libertarians claim that the best antidote to bad speech is more speech, Putin’s government discovered that the best antidote to more speech was even more bad speech. Thus, authoritarian governments such as China, and semi-authoritarian regimes such as Russia, moved quickly to mitigate vulnerabilities in their information systems, through disrupting the ability of both domestic and international actors to turn regime-favoring contested political knowledge into regime-undermining common political knowledge about the genuine state of public beliefs.

Tools such as flooding can stabilize authoritarian and semi-authoritarian regimes, but are likely to disrupt the common knowledge that is necessary to the successful functioning of democracy.

This explains the Russian influence attacks against the US in 2016. There is substantial reason to believe that the information ecology of US democracy had already been substantially weakened by internal forces.52 Specifically, a right-wing media ecology had evolved that was separate from and antagonistic to the mainstream, which rapidly conveyed extreme arguments from the fringes of the system to the center, serving as a force-amplifier of lies.

These media structures plausibly created wide vulnerabilities. However, the information that has emerged via the Mueller indictment of individuals associated with the Internet Research Agency (IRA) and other sources sketches out an account of flooding attacks that fits closely with our arguments.53

First, some of the attacks focused directly on undermining belief in the electoral system. As the Mueller indictment describes the attacker’s intentions: “By in or around May 2014, the ORGANIZATION’s strategy included interfering with the 2016 U.S. Presidential Election,” with the stated goal of “spread[ing] distrust towards the candidates and the political system in general.”
This likely explains why Russian actors helped propagate rumors that Hillary Clinton was guilty of vote fraud as well as probing the vulnerability of online US electoral records. Their probable intentions were not to fix the vote but to create enough paranoia over the possibility that the vote had been fixed that Hillary Clinton’s legitimacy would have been seriously damaged, had she been elected as president. “Guccifer 2.0,” a pseudonymous identity used by Russian intelligence, claimed just before the election that “the Democrats may rig the elections on November 8. This may be possible because of the software installed in the FEC networks by the large IT companies. As I’ve already said, their software is of poor quality, with many holes and vulnerabilities.”54 Plausibly, the attackers did not expect Trump to be elected president. Instead, they wanted a United States that was sufficiently divided against itself that a President Hillary Clinton would have difficulty in governing, let alone taking decisive actions abroad.

Second, other attacks are aimed more generally at creating division between different groups, damaging and breaking up existing coalitions, and preventing new ones from forming. For example, on the Affordable Care Act:

The Russian effort moved easily between supporting and opposing the health law depending on the political moment. Pro-ACA tweets peaked around the spring of 2016, possibly aimed at fostering division between Mrs. Clinton and her presidential primary rival, Sen. Bernie Sanders (I., Vt.). Anti-ACA tweets intensified in mid-2017 as Republicans mounted their push to repeal the law, apparently seeking to capitalize on the emotions generated by that effort. “Let Obamacare crash & burn. Do not bail out insurance companies,” said a tweet from an IRA-linked account called JUSMASXTRT on Aug. 28, 2017.55

Such attacks disrupt democracy by degrading citizens’ and groups’ shared political knowledge about allied and adversarial groups within society, fomenting confusion about the goals of those groups, and the level and kind of support that they enjoy. By increasing the levels of noise, flooding attacks degrade the decentralized common political knowledge that provides people with a rough overall map of politics, and make it more difficult to organize around collective interests or to build coalitions across interest groups. People may also come to believe that fringe beliefs are more widespread in the population than in fact they are, widening the political debate
54 https://guccifer2.wordpress.com/2016/11/04/info-from-inside-the-fec-the-democrats-may-rig-the-elections/.
55 Armour, Stephen, and Overberg, Paul (2018). “Nearly 600 Russia-Linked Accounts Tweeted about the Health Law.” Wall Street Journal.
so that it includes perspectives that enjoy little actual public support. Finally, they may substantially increase paranoia, which further degrades knowledge and makes political action harder. If people believe that they are surrounded by trolls and bots, they are more likely to be distrustful of others (especially others with different beliefs) and less likely to engage in dialogue or effective political action.56
Democracy Defenses
A better understanding of the informational requirements of democracy does not merely help us to understand the attack surface better. It also provides a clearer understanding of how to bolster security in democracies.57 Specifically, it implies a series of broad priorities. We sketch them out in this section in order to spur discussion, which may lead in time to a properly developed policy agenda.

The first among these, and likely the least controversial, is to better defend the common political knowledge that democracies require to function.58 We do not understand enough about the institutions that help support this knowledge, and cannot yet provide a detailed list of defenses. At this point, more research is required (we hope that this paper will help spur such research). Very obviously, voting systems are a crucial source of political knowledge. Not only successful compromise of such systems, but also attacks aimed at weakening public beliefs and expectations surrounding the fairness of these systems, can seriously damage political common knowledge. Thus, it is important to supplement the valuable recommendations of the National Academies of Science for improving the security of the voting system itself with a more specific understanding of the public perceptions and beliefs surrounding voting, so as to frustrate more subtle attacks on expectations.59

However, there are other important sources of common knowledge that present less obvious security risks. For example, the US Census was instituted precisely to provide the public and experts with a common understanding of the demographics of the US population, so as to better aid apportionment of political power and public policy. Attacks that are aimed at weakening the Census — or public expectations surrounding it — may also damage the informational supports of democracy, by excluding portions of the population, by reshaping beliefs about the relative role and influence of different demographic groups in society, and so on. Mapping out other such
56 These effects involve indirect consequences. The direct effects of disinformation on people’s beliefs and behavior may be relatively weak. Barberá, P.
et al. (2018). Social Media, Political Polarization, and Political Disinformation: A Review of the Scientific Literature. Palo Alto, CA: Hewlett Foundation.
57 Our arguments also have implications for understanding autocratic politics, including perhaps developing a better understanding of how different
autocracies will be affected by knowledge attacks. We note this as a suggestion for future research and debate.
58 See also Nye, Joseph S. (2018). “How Sharp Power Threatens Soft Power: The Right and Wrong Ways to Respond to Authoritarian Influence.”
Foreign Affairs.
59 National Academies of Science. (2018). Securing the Vote: Protecting American Democracy. Washington DC: National Academies of Science Press.
institutions, and an agenda of specific measures to protect them, presents an urgent challenge for researchers and policy makers.

The second set of priorities involves the nexus between inside and outside groups. The extensive literature in computer security describes how attacks by outsiders can be greatly facilitated when insiders provide specific information and (sometimes inadvertent) help. While cooperation is generally a good thing, there are also problematic forms of cooperation — and institutional changes that render such problematic forms easier to get away with.

Thus, for example, recent legal changes and changes in interpretation of the law make it far easier for foreign actors to work together with domestic actors in clandestine ways. This may plausibly damage democracy, such as by funding campaigns that are aimed directly at spreading public disinformation. Concerns have been expressed about possible Russian funding for Marine Le Pen’s National Front party in the French general elections in 2017, the “Leave” campaign in the United Kingdom’s Brexit referendum, and other movements that seem likely to contribute to general political instability in rich democracies.60 If these concerns are valid, they highlight specific vulnerabilities that may be widened further by current difficulties in tracking political spending on social media. The general move away from publicly disclosed political funding and toward nontransparent forms of political spending creates obvious vulnerabilities. In future, flooding attacks may combine cross-border and internal campaigns to much greater effect. One obvious implication of our framework is that dark money structures are not only ethically problematic, but create obvious security vulnerabilities, which could be mitigated through far stricter reporting requirements.

Similar arguments apply to large-scale social media companies such as Facebook and YouTube (owned by Google). These companies’ business models make it easier to conduct clandestine information operations with little external visibility. As scholars like Zeynep Tufekci have argued, they also may exacerbate the damage of common-knowledge attacks, such as through algorithms that maximize user “engagement” and hence drive users toward material that reinforces conspiratorial thinking.61 The plausible responses to such problems entail major reform, whether by far greater regulation, the transformation of these companies into public utilities, or their being broken up. Each of these options presents a different set of benefits and drawbacks.

These arguments should not be applied indiscriminately. There are a wide variety of non-problematic and democratically beneficial relations between those who are citizens (insiders) of a given democratic system and those who are outside. As global interdependence increases, creating new problems that span borders, some forms of cross-border cooperation are not only helpful to
60 See Gatehouse, Gabriel (2017). “Marine Le Pen: Who’s Funding France’s Far Right?” BBC News; Reuters (2018). “Brexit-backer Arron Banks Denies Fresh Allegations of Russia Links.”
61 Tufekci, Zeynep (2018). “YouTube. The Great Radicalizer.” New York Times.
democracy, but positively essential for it — global warming being the most obvious example.62

Finally, and potentially most controversially, the computer security literature makes no strong distinction between insider and outsider effects, except insofar as they have different opportunities to compromise the information system. Again, our framework of analysis has broad implications for how to defend democracy, suggesting that institutions that allow insiders to compromise common democratic knowledge can heavily damage democracy. Notably, however, there is a tension between the need to maintain common democratic knowledge of the kinds that we have described, and the need to allow the contested democratic knowledge that is necessary for successful democratic problem solving. We do not even pretend to offer a complete account of how this tension should best be managed. Instead, we point out that certain features of the US political system (e.g., the widely observed disparities of political influence between rich and poor)63 both hamper the contestation and political debate that we argue is necessary to democratic success, and plausibly enhance the risk from insider threats. While reforming these features is an enormously ambitious political agenda, it does not present obvious trade-offs between security and democratic functioning.64
Conclusions
In this paper, we make three basic claims. First, we argue that we currently do not have a good theory of the kinds of influence attacks that have afflicted the US and other democracies over the last few years. Both national security and technical security approaches to cybersecurity have notable deficiencies in understanding how these attacks operate. Second, we argue that substantially — and even radically — expanding the technical security approach provides the best and most appropriate means to developing such a theory. If we treat national political regimes as information systems, we can better understand their attack surfaces and threat models. Third, we use these combinations to explain the different attack surfaces of autocracies and democracies, demonstrating, for example, how measures that improve stability in autocracies may have destabilizing consequences in democracies, and vice versa. We believe that this account better captures the potential policy trade-offs in defending against such attacks than the most plausible alternative — developing and applying the national security perspective.

This last requires more justification. The most comprehensive national security account of influence attacks that we are aware of is Jack
62 Farrell, Henry, and Knight, Jack (2018). John Dewey’s Lessons for Interdependence. Unpublished Paper.
63 Bartels, Larry (2016). Unequal Democracy: The Political Economy of the New Gilded Age. Princeton, NJ: Princeton University Press.
64 Concerns about these disparities have typically been the focus of the left. However, a new body of work builds on classical liberalism to reach a broadly similar set of conclusions. See in particular Lindsey, Brink, and Teles, Steven (2017). The Captured Economy: How the Powerful Enrich Themselves, Slow Down Growth and Increase Inequality. New York: Oxford University Press.
Goldsmith and Stuart Russell’s recent essay, “Strengths Become Vulnerabilities.” As the authors describe their argument:

Our central claim is that the United States is disadvantaged in the face of these soft cyber operations due to constitutive and widely admired features of American society, including the nation’s commitment to free speech, privacy and the rule of law; its relatively unregulated markets; and its deep digital sophistication. These strengths of American society create asymmetric vulnerabilities in the digital age that foreign adversaries, especially in authoritarian states, are increasingly exploiting. ...We do not claim that the disadvantages of digitalization for the United States in its international relations outweigh the advantages. But we do present some reasons for pessimism about the United States’ predicament in the face of adversary cyber operations.

We do not contend that Goldsmith and Russell’s pessimism is completely unwarranted. Defending democracy against these kinds of attacks will be a Herculean labor.65 However, we think that Goldsmith and Russell’s pessimism is exaggerated by the difficulty that the national security perspective has in thinking systematically about the appropriate trade-offs. When the perspective of national security is extended to influence operations (or, as we prefer, common-knowledge attacks), just about every opening looks like a vulnerability.

Goldsmith and Russell do not conclude that this means that all those vulnerabilities need to be closed. Instead, they propose that we are faced with a series of unpleasant trade-offs between what makes American society admirable, and what is necessary to protect it from outside encroachment. However, they have no useful metric to determine how difficult trade-offs ought to be struck. This is in part, we suspect, because the national security approach is not designed for people to think systematically about the internal benefits of openness. Indeed, standard realist accounts assume that what happens within states and what happens between them are analytically entirely separate.

Here, the computer security approach provides a better foundation. Since information systems need to be open to input if they are to be useful, computer security analysts are trained to think systematically about the trade-offs between openness and security and then balance the requisite equities. First, one needs to understand what a given information system is supposed to do. Then, one needs to weigh the forms of input and access that are necessary for functioning against the attack vulnerabilities that different modes of input and access provide. Typically, one cannot provide comprehensive solutions, but — through design and experiment — one can mitigate the vulnerabilities associated with openness to the point that the benefits outweigh the risks. First, one looks to pluck the low-hanging fruit, by closing vulnerabilities that have few or no benefits. Then, one carefully assesses the benefits and drawbacks of the more complex trade-offs between openness and vulnerability.
65 See also Joseph Nye, “How Sharp Power Threatens Soft Power.”
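The triage that computer security analysts apply (understand what the system is for, close openings that deliver no benefit, then carefully weigh the genuinely hard trade-offs) can be caricatured in a few lines of code. This is purely an illustrative sketch of the reasoning pattern, not anything drawn from the paper: the channel names and the numeric benefit/risk scores below are invented for the example.

```python
# Toy sketch of the openness-vs-security triage described in the text.
# Channel names and benefit/risk scores are invented for illustration.

def triage(channels):
    """Sort input channels into three buckets: 'close' (no real benefit,
    so shut them off), 'keep' (benefit clearly outweighs risk), and
    'assess' (hard trade-offs needing case-by-case analysis)."""
    close, keep, assess = [], [], []
    for name, benefit, risk in channels:
        if benefit == 0:
            close.append(name)    # low-hanging fruit: an opening with no upside
        elif benefit > risk:
            keep.append(name)     # openness worth the exposure
        else:
            assess.append(name)   # complex trade-off between openness and risk
    return close, keep, assess

channels = [
    ("anonymous-bulk-ads", 0, 9),      # hypothetical: no benefit, high risk
    ("public-debate", 9, 4),           # hypothetical: essential openness
    ("foreign-campaign-money", 2, 8),  # hypothetical: contested trade-off
]
close, keep, assess = triage(channels)
print(close)   # ['anonymous-bulk-ads']
print(keep)    # ['public-debate']
print(assess)  # ['foreign-campaign-money']
```

The point of the sketch is only that the first pass is cheap (openings with no benefit can simply be closed), while everything else lands in a residual bucket that requires exactly the kind of systematic weighing the authors argue the national security perspective lacks.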
This is why an informational account of democracy is so important to mitigation. Without it, one cannot understand how democracy is supposed to operate and, hence, one cannot assess the trade-offs. The informational understanding that we present here emphasizes the way that democracy can draw on diverse sources of information. This means that democracies can potentially do better than autocracies over the long run, to the extent that they are better able to use the disagreements and diverse information they contain to solve complex collective problems. However, this also confronts them with the serious challenge of ensuring that the common political knowledge that provides stability is not overwhelmed by internal disagreements. Attacks that seek to widen internal disagreement so that it implicates common political knowledge can have very serious consequences.

Obviously, many of our policy priorities flow from this understanding of democracy. A different understanding might be the foundation of different prescriptions. However, we note that our understanding is sufficiently broad to be shared by an emerging set of arguments on the center-right of American debate as well as the left. And even those who disagree sharply with our premises and conclusions may draw some benefit from using a similar approach to analysis to think systematically about the informational foundations of democracy and its relationship to security.
Acknowledgments
We are grateful to Ross Anderson, Yochai Benkler, Sheri Berman, Maria Farrell, Rob Faris, Martha
Finnemore, Art Goldhammer, Anna Grzymala-Busse, Alex Grigsby, Herb Lin, Joseph Nye, Molly
Roberts, and Adam Segal for their helpful comments on earlier versions of this paper. Some of this
material was presented at a meeting of the Twenty-First Century Trust in Strasbourg, France, Septem-
ber 14–17, 2018.
https://cyber.harvard.edu/story/2018-10/common-knowledge-attacks-democracy