
THE LONG FUSE:


MISINFORMATION AND THE 2020 ELECTION


The Long Fuse


Misinformation and the 2020 Election

The Election Integrity Partnership

Digital Forensic Research Lab


Graphika
Stanford Internet Observatory
UW Center for an Informed Public

2021

© 2021 The Election Integrity Partnership

This report is made available under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License:
https://creativecommons.org/licenses/by-nc-nd/4.0/

Identifiers: ISBN 978-1-7367627-1-4 (ebook)

Version 1.3.0 (June 15, 2021)

Cover Illustration and Design by Alexander Atkins Design, Inc.

How to cite this work:


APA Style:

Center for an Informed Public, Digital Forensic Research Lab, Graphika, & Stanford Internet Observatory (2021). The Long Fuse: Misinformation and the 2020 Election. Stanford Digital Repository: Election Integrity Partnership. v1.3.0. https://purl.stanford.edu/tr171zs0069

Chicago Style:

Center for an Informed Public, Digital Forensic Research Lab, Graphika, & Stanford Internet Observatory. The Long Fuse: Misinformation and the 2020 Election, 2021. Stanford Digital Repository: Election Integrity Partnership. v1.3.0. https://purl.stanford.edu/tr171zs0069

Contents

Executive Summary v
Who We Are: EIP and Its Members . . . . . . . . . . . . . . . . . . . . . . vi
What We Did . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vi
Key Takeaways . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . vii
Key Recommendations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . x

Contributors xii

Acknowledgements xiii

1 The Election Integrity Partnership 1


1.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.2 The EIP: Partner Organizations and Structure . . . . . . . . . . . . 2
1.3 The EIP: Goals and Scope . . . . . . . . . . . . . . . . . . . . . . . . . 5
1.4 External Stakeholders . . . . . . . . . . . . . . . . . . . . . . . . . . . 11
1.5 Example Ticket Process . . . . . . . . . . . . . . . . . . . . . . . . . . 18
1.6 Practical Lessons Learned . . . . . . . . . . . . . . . . . . . . . . . . 19
1.7 Reading This Report . . . . . . . . . . . . . . . . . . . . . . . . . . . . 20

2 Data and Summary Statistics 27


2.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27
2.2 Summary Statistics . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3 Platform Responsiveness and Moderation Actions Taken . . . . . 37
2.4 Concerns by Reporting Collaborators . . . . . . . . . . . . . . . . . 42
2.5 Final Observations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43

3 Incidents and Narratives: The Evolution of Election Misinformation 47


3.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47


3.2 Narratives: Methodology and Identification . . . . . . . . . . . . . . 48


3.3 The Evolution of Narratives in the 2020 Election . . . . . . . . . . 49
3.4 Election-Related Violence . . . . . . . . . . . . . . . . . . . . . . . . 97
3.5 Narrative Crossover and Fabrication in Non-English Media . . . . 101
3.6 Fact-Checking Claims and Narratives . . . . . . . . . . . . . . . . . 119
3.7 Final Observations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 122

4 Cross-platform and Participatory Misinformation: Structure and Dynamics 149
4.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 149
4.2 Cross-Platform Information Sharing . . . . . . . . . . . . . . . . . . 150
4.3 Dynamics of 2020 Election Misinformation . . . . . . . . . . . . . . 162
4.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 173

5 Actors and Networks: Repeat Spreaders of Election Misinformation 181


5.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
5.2 Methods for Identifying Repeat Spreaders of False and Misleading
Narratives . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 181
5.3 Most Engaged Incidents . . . . . . . . . . . . . . . . . . . . . . . . . 183
5.4 Political Alignment of Influential Twitter Accounts . . . . . . . . . 184
5.5 Repeat Spreaders . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 187
5.6 An Integrated Look at Repeat Spreaders Across Platforms . . . . . 195
5.7 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 204

6 Policy 211
6.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 211
6.2 Social Media Platform Policy Evolution . . . . . . . . . . . . . . . . 212
6.3 Platform Interventions: Policy Approaches and Application Out-
comes . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 214
6.4 Mis- and Disinformation Problems Without Clear Policy Solutions 220
6.5 Primary Areas for Policy Improvement . . . . . . . . . . . . . . . . . 223
6.6 Platform Policy Moving Forward . . . . . . . . . . . . . . . . . . . . 225

7 Responses, Mitigations and Future Work 233


7.1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 233
7.2 Government . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 234
7.3 Media . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 236
7.4 Social Media Platforms and Technology Companies . . . . . . . . . 237
7.5 Civil Society . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240
7.6 Conclusion . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 240

Appendices 244

A Definitions 245


B Inter-coder reliability 249


B.1 Average Z-scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 249
B.2 Discordant Z-scores . . . . . . . . . . . . . . . . . . . . . . . . . . . . 250
B.3 Concordant Z-scores . . . . . . . . . . . . . . . . . . . . . . . . . . . 250

C Repeat Spreaders—Additional Partisan News Outlets in the Twitter Data 251

D Ticket Analysis Questions 253


D.1 Tier 1 Analysis Questions . . . . . . . . . . . . . . . . . . . . . . . . . 253
D.2 Tier 2 Analysis Questions . . . . . . . . . . . . . . . . . . . . . . . . . 255

E News Articles Citing the Election Integrity Partnership 257

F Methodology for Evaluating Platform Policy 265


F.1 Assessing our methodology . . . . . . . . . . . . . . . . . . . . . . . 272


Executive Summary

On January 6, 2021, an armed mob stormed the US Capitol to prevent the certification of what they claimed was a “fraudulent election.” Many Americans
were shocked, but they needn’t have been. The January 6 insurrection was
the culmination of months of online mis- and disinformation directed toward
eroding American faith in the 2020 election.

US elections are decentralized: almost 10,000 state and local election offices are
primarily responsible for the operation of elections. Dozens of federal agencies
support this effort, including the Cybersecurity and Infrastructure Security
Agency (CISA) within the Department of Homeland Security, the United States
Election Assistance Commission (EAC), the FBI, the Department of Justice, and
the Department of Defense. However, none of these federal agencies has a focus
on, or authority regarding, election misinformation originating from domestic
sources within the United States. This limited federal role leaves a critical gap
for non-governmental entities to fill. Increasingly pervasive mis- and disinfor-
mation, both foreign and domestic, creates an urgent need for collaboration
across government, civil society, media, and social media platforms.

The Election Integrity Partnership, comprising organizations that specialize in understanding those information dynamics, aimed to create a model for
whole-of-society collaboration and facilitate cooperation among partners ded-
icated to a free and fair election. With the narrow aim of defending the 2020
election against voting-related mis- and disinformation, it bridged the gap be-
tween government and civil society, helped to strengthen platform standards
for combating election-related misinformation, and shared its findings with its
stakeholders, media, and the American public. This report details our process
and findings, and provides recommendations for future actions.


Who We Are: EIP and Its Members


The Election Integrity Partnership was formed to enable real-time information
exchange between election officials, government agencies, civil society organiza-
tions, social media platforms, the media, and the research community.1 It aimed
to identify and analyze online mis- and disinformation, and to communicate
important findings across stakeholders. It represented a novel collaboration
between four of the nation’s leading institutions focused on researching mis-
and disinformation in the social media landscape:

• The Stanford Internet Observatory (SIO)

• The University of Washington’s Center for an Informed Public (CIP)

• Graphika

• The Atlantic Council’s Digital Forensic Research Lab (DFRLab)

What We Did
The EIP’s primary goals were to: (1) identify mis- and disinformation before
it went viral and during viral outbreaks, (2) share clear and accurate counter-
messaging, and (3) document the specific misinformation actors, transmission
pathways, narrative evolutions, and information infrastructures that enabled
these narratives to propagate. To identify the scope of our work, we built a
framework to compare the policies of 15 social media platforms2 across four
categories:

• Procedural interference: misinformation related to actual election procedures

• Participation interference: content that intimidates voters, threatens their personal safety, or deters participation in the election process

• Fraud: content that encourages people to misrepresent themselves to affect the electoral process or illegally cast or destroy ballots

• Delegitimization of election results: content aiming to delegitimize election results on the basis of false or misleading claims

The EIP used an innovative internal research structure that leveraged the capa-
bilities of the partner organizations through a tiered analysis model based on
“tickets” collected internally and from our external stakeholders. Of the tickets
we processed, 72% were related to delegitimization of the election.


Key Takeaways
Misleading and false claims and narratives coalesced into the meta-narrative
of a “stolen election,” which later propelled the January 6 insurrection.

• Right-leaning “blue-check” influencers transformed one-off stories, sometimes based on honest voter concerns or genuine misunderstandings, into cohesive narratives of systemic election fraud.

• Warped stories frequently centered on mail-in voting and accusations of found, discarded, or destroyed ballots, particularly in swing states. Misleading framing of real-world incidents often took the form of falsely assigning intent, exaggerating impact, falsely framing the date, or altering locale.

• The meta-narrative of a “stolen election” coalesced into the #StopTheSteal movement, encompassing many of the previous narratives. The narrative appeared across platforms and quickly inspired online organizing and offline protests, leading ultimately to the January 6 rally at the White House and the insurrection at the Capitol.

• Fact-checking of narratives had mixed results; non-falsifiable narratives presented a particular challenge. In some cases, social media platform fact-checks risked drawing further attention to the claims they sought to debunk.

The production and spread of misinformation was multidirectional and participatory.

• Individuals participated in the creation and spread of narratives. Bottom-up false and misleading narratives started with individuals identifying real-world or one-off incidents and posting them to social media. Influencers and hyperpartisan media leveraged this grassroots content, assembling it into overarching narratives about fraud, and disseminating it across platforms to their large audiences. Mass media often picked up these stories after they had reached a critical mass of engagement.

• Top-down mis- and disinformation moved in the opposite direction, with claims first made by prominent political operatives and influencers, often on mass media, which were then discussed and shared by people across social media properties.


Narrative spread was cross-platform: repeat spreaders leveraged the specific features of each platform for maximum amplification.

• The cross-platform nature of misinformation content and narrative spread limited the efficacy of any single platform’s response.

• Smaller, niche, and hyperpartisan platforms, which were often less moder-
ated or completely unmoderated, hosted and discussed content that had
been moderated elsewhere. Parler in particular saw a remarkable increase
in its active user base, as users rejected the “censorship” they perceived
on other platforms.

The primary repeat spreaders of false and misleading narratives were veri-
fied, blue-check accounts belonging to partisan media outlets, social media
influencers, and political figures, including President Trump and his family.

• These repeat spreaders amplified the majority of the investigated incidents aggressively across multiple platforms.

• Repeat spreaders often promoted and spread each other’s content. Once
content from misleading narratives entered this network, it spread quickly
across the overlapping audiences.

Many platforms expanded their election-related policies during the 2020 election cycle. However, application of moderation policies was inconsistent or unclear.

• Platforms took action against policy violations by suspending users or removing content, downranking or preventing content sharing, and applying informational labels. However, moderation efforts were applied inconsistently on and across platforms, and policy language and updates were often unclear.

• Account suspensions and content removal or labeling sometimes contributed to conspiratorial narratives that platforms were “covering up the truth,” entangling platforms with the narratives they wished to eliminate.

• Lack of transparency and access to platform APIs hindered external research into the effectiveness of platform policies and interventions.


Key Recommendations
Federal Government

• Establish clear authorities and roles for identifying and countering election-related mis- and disinformation. Build on the federal interagency
movement toward recognizing elections as a national security priority and
critical infrastructure.

• Create clear standards for consistent disclosures of mis- and disinformation from foreign and domestic sources as a core function of facilitating free and fair elections, including via CISA’s Rumor Control and joint interagency statements.

Congress

• Pass existing bipartisan proposals for increased appropriations marked for federal and state election security.

• Codify the Senate Select Committee on Intelligence’s bipartisan recommendations related to the depoliticization of election security and the behavior of public officials and candidates for federal office noted in Volumes 3 and 5 of the Committee’s report on foreign influence in 2016 elections.

State and Local Officials

• Establish trusted channels of communication with voters. This should include a .gov website and use of both traditional and social media.

• Ensure that all votes cast are on auditable paper records and that efficient,
effective, and transparent post-election audits are conducted after each
election.

Platforms

• Provide proactive information regarding anticipated election misinformation. For example, if researchers expect a narrative will emerge, platforms should explain that narrative’s history or provide fact-checks or context related to its prior iterations.

• Invest in research into the efficacy of internal policy interventions (such as labeling) and share those results with external researchers, civil society, and the public.


• Increase the amount and granularity of data regarding interventions, takedowns, and labeling to allow for independent analysis of the efficacy of these policies.

• Impose clear consequences for accounts that repeatedly violate platform policies. These accounts could be placed on explicit probationary status, facing a mixture of monitoring and sanctions.

• Prioritize election officials’ efforts to educate voters within their jurisdiction and respond to misinformation. This could include the promotion of content from election officials through curation or advertisement credits, especially in the lead-up to Election Day.

Conclusion
The 2020 election demonstrated that actors—both foreign and domestic—remain
committed to weaponizing viral false and misleading narratives to undermine
confidence in the US electoral system and erode Americans’ faith in our democ-
racy. Mis- and disinformation were pervasive throughout the campaign, the
election, and its aftermath, spreading across all social platforms. The Election
Integrity Partnership was formed out of a recognition that the vulnerabilities in
the current information environment require urgent collective action.
While the Partnership was intended to meet an immediate need, the conditions
that necessitated its creation have not abated, and in fact may have worsened.
Academia, platforms, civil society, and all levels of government must be com-
mitted, in their own ways, to truth in the service of a free and open society.
All stakeholders must focus on predicting and pre-bunking false narratives,
detecting mis- and disinformation as it occurs, and countering it whenever
appropriate.


Notes

1. (page vi) “Announcing the EIP,” Election Integrity Partnership, July 27, 2020,
https://www.eipartnership.net/news/announcing-the-eip.
2. (page vi) The platforms evaluated during EIP’s operation include: Facebook,
Instagram, Twitter, YouTube, Pinterest, Nextdoor, TikTok, Snapchat, Parler, Gab,
Discord, WhatsApp, Telegram, Reddit, and Twitch. Twitch was added to our list
during our blog post update in October.


Contributors
The EIP was supported by the following students, staff, and researchers from the four partner
organizations.

Stanford Internet Observatory: Samantha Bradshaw, Daniel Bush, Jack Cable, Caleb Chiam, Elena Cryst, Matt DeButts, Renée DiResta, Emma Dolan, Ayelet Drazen, Jackson Eilers, Ross Ewald, Toni Friedman, Isabella García-Camargo, Josh Goldstein, Shelby Grossman, Sejal Jhawer, Jennifer John, Katie Jonsson, Dylan Junkin, Ananya Karthik, Tara Kheradpir, Soojong Kim, Nazli Koyluoglu, Kevin Lin, Pierce Lowary, Sahar Markovich, Gordon Martinez-Piedra, Miles McCain, Malika Mehrotra, Carly Miller, Nandita Naik, Benjamin Newman, Ana Sofia Nicholls, Shelby Perkins, Ashwin Ramaswami, Cooper Raterink, Cooper Reed, Emily Ross, Abuzar Royesh, Danny Schwartz, Chase Small, Alex Stamos, Gene Tanaka, David Thiel, Julia Thompson, Yesenia Ulloa, Alessandro Vecchiato, Netta Wang, Lyndsea Warkenthien, Alex Zaheer

Graphika: Joseph Carter, Avneesh Chandra, Shawn Eib, Rodrigo Ferreira, Camille François, Thomas Lederer, Erin McAweeney, Vanessa Molter, Morgan Moon, Jack Nassetta, Ben Nimmo, Brian Potochney, Léa Ronzaud, Melanie Smith, Kyle Weiss

UW Center for an Informed Public: Joseph Bak-Coleman, Andrew Beers, Nicole Buckley, Michael Caufield (WSU), Michael Grass, Melinda McClure Haughey, Ian Kennedy, Kolina Koltai, Paul Lockaby, Rachel Moran, Joey Schafer, Emma Spiro, Kate Starbird, Morgan Wack, Jevin West, Tom Wilson, Martin Zhang

DFRLab: Eric Baker, Graham Brookie, Emerson Brooking, Kelsey Henquinet, Alyssa Kann, Ayushman Kaul, Zarine Kharazian, Tessa Knight, Jean le Roux, Jacqueline Malaret, Esteban Ponce de Leon, Max Rizzuto, Iain Robertson, Michael Sheldon, Helen Simpson

This report was edited by Eden Beck and designed by David Thiel. The Election Integrity Partnership
would like to thank Matthew Masterson for additional feedback, and Nate Persily for his support.


Acknowledgements

The Election Integrity Partnership’s partners wish to acknowledge the following organizations for the generous financial support that made this report possible:

Digital Forensic Research Lab, The Atlantic Council


The Digital Forensic Research Lab is part of and funded by the Atlantic Council.
A full list of the Atlantic Council’s donors is available at:
https://www.atlanticcouncil.org/in-depth-research-reports/report/annual-report-2019-2020-shaping-the-global-future-together/

Graphika
Graphika thanks the Omidyar Network for their support on this project.

Stanford Internet Observatory


The Stanford Internet Observatory thanks its operational funders, Craig New-
mark Philanthropies and the William and Flora Hewlett Foundation, for their
ongoing support.

University of Washington Center for an Informed Public


The Center for an Informed Public (CIP) thanks Craig Newmark Philanthropies
and the Omidyar Network for their support of this project. Additional operational
and research support for the CIP is provided by the John S. and James L. Knight
Foundation and the William and Flora Hewlett Foundation. Researchers who
contributed to the EIP also receive partial support from the U.S. National Science
Foundation (grants 1749815 and 1616720), the Eunice Kennedy Shriver National
Institute of Child Health and Human Development (training grant T32 HD101442-
01 to the Center for Studies in Demography & Ecology at the University of
Washington), the University of Washington UW Population Health Initiative, and
Microsoft. A full list of CIP donors is available at: https://www.cip.uw.edu/about/


Chapter 1
The Election Integrity Partnership

1.1 Introduction
The 2016 presidential election in the United States demonstrated to the world the
potential of wide-scale information operations. Since 2016, these efforts have
grown, often aimed at developed democracies and operated by state-sponsored
adversaries and domestic activists alike. Misinformation and disinformation
can disenfranchise voters and diminish trust in the results of electoral contests,
eroding public confidence in the integrity of democratic processes and leader-
ship transitions overall. For the purposes of this report, we use “misinformation”
as an umbrella term to describe false, misleading, or exaggerated information or
claims. We differentiate this from “disinformation,” which is false or misleading
information that is purposefully produced, seeded, or spread, with the intent
to manipulate in service to an objective; the manipulation may also take the
form of leveraging fake accounts or pages. (We define these terms more fully in
Appendix A on page 245: Definitions).
Elections in the United States are highly decentralized.1 Over 10,000 individual
jurisdictions—covering state, county, and municipal levels—are responsible for
administering the vote on Election Day. Voter registration systems and databases
are centralized at the state level in some states and administered by states,
counties, and municipalities in others. Vote casting, in contrast, is organized at
the local level, with each locality responsible for administering ballots, counting
votes, and educating voters about the local system.2 There is no centralized
support to aid this vast number of jurisdictions in identifying and responding to
emerging election-related mis- and disinformation.
In 2020, adding to the complexity, the global COVID-19 pandemic forced rapid
changes to voting procedures. States and counties had to quickly adapt their
electoral processes to new public health guidelines. Existing state laws on election procedure were in many cases not adaptable to the emergency conditions,
leading to late executive and legislative action and court decisions.3
Voters, many of whom were sheltering at home, followed election conversations
on broadcast as well as social media. This included searching for information
about where and how to vote in light of pandemic restrictions.
The initial idea for the Partnership came from four students that the Stanford
Internet Observatory (SIO) funded to complete volunteer internships at the
Cybersecurity and Infrastructure Security Agency (CISA) at the Department of
Homeland Security. Responsibility for election information security is divided
across government offices: CISA has authority to coordinate on cybersecurity
issues related to the election, the FBI to investigate cyber incidents and enforce
election laws, and intelligence agencies to monitor for foreign interference. Yet,
no government agency in the United States has the explicit mandate to monitor
and correct election mis- and disinformation. This is especially true for election
disinformation that originates from within the United States, which would likely
be excluded from law enforcement action under the First Amendment and
not appropriate for study by intelligence agencies restricted from operating
inside the United States. As a result, during the 2020 election, local and state
election officials, who had a strong partner on election-system and overall
cybersecurity efforts in CISA, were without a clearinghouse for assessing mis-
and disinformation targeting their voting operations. The students approached
SIO leadership in the early summer, and, in consultation with CISA and other
stakeholders, a coalition was assembled with like-minded partner institutions.
The Election Integrity Partnership (EIP) was officially formed on July 26,
2020—100 days before the November election—as a coalition of research enti-
ties who would focus on supporting real-time information exchange between
the research community, election officials, government agencies, civil society
organizations, and social media platforms.

1.2 The EIP: Partner Organizations and Structure


The Partnership was formed between four of the nation’s leading institutions
focused on understanding misinformation and disinformation in the social media
landscape: the Stanford Internet Observatory, the University of Washington’s
Center for an Informed Public, Graphika, and the Atlantic Council’s Digital
Forensic Research Lab.
The Stanford Internet Observatory (SIO) was founded in June 2019 to study the
misuse of the internet to cause harm, formulate technical and policy responses
to said misuse, and teach the next generation how to avoid the mistakes of the
past. Founded by former Silicon Valley cybersecurity executive Alex Stamos, the Observatory has a specific interest in applying the learnings of major technology platforms from the 2016 election to prevent a repeat in future years. The Observatory sits at Stanford’s Cyber Policy Center under the direction of Professors Nate Persily and Dan Boneh.

[Figure 1.1: Timeline of the Election Integrity Partnership’s work, June 2020 through February 2021. Planning and preparation: June 23, first discussion; July 9, meeting with CISA to present the EIP concept. External engagements and training: July 27, EIP website launches; August 12, kickoff webinar; August 18, first platform policy blog post. Operational phase (analysts on call 12 hours/day, rising to 16 to 20 hours/day around the election): September 2, operational workflow finalized; September 9, earliest in-person voting starts; October 13, first weekly briefing webinar; October 30, briefing for election officials; November 2-6, election week, with analyst coverage 20 hours/day and daily news briefings. Data cleanup and wrap-up: December 14, Electoral College vote; January 6, Electoral College certification; January 20, Inauguration; report written and published thereafter.]
The Internet Observatory team was led by Assistant Director Elena Cryst, Re-
search Manager Renée DiResta, CTO David Thiel, and Director Alex Stamos. SIO
graduate student Isabella García-Camargo served as the project manager for
the overall Partnership. SIO engaged its team of seven staff researchers and five
postdoctoral scholars from the Stanford Cyber Policy Center, and hired a team
of 38 undergraduate and graduate research assistants from Stanford to serve as
analysts on the project.
The University of Washington Center for an Informed Public (CIP) was founded
in December 2019 with the mission of marshalling the resources of a public
university to address mis- and disinformation through research, education, pol-
icy development, and outreach. The Center’s interdisciplinary faculty brought
deep methodological expertise at systematically analyzing “big” social data at
macro-, meso-, and micro- scales to track the spread of misinformation online,
and contextual expertise in online disinformation.
The CIP contributing team was led by three founding faculty members: Kate Star-
bird, Emma Spiro, and Jevin West. The team also included one affiliate faculty
member, three postdoctoral researchers (all of whom started after the Partner-
ship launched), nine undergraduate and PhD students from the University of
Washington, a data engineer, and a communications specialist.
Graphika is a social media analytics firm trusted by Fortune 500 companies,
human rights organizations, and universities to map and navigate complex social
media landscapes. The company was founded in 2013 by Dr. John Kelly, a pioneer
in this field and source of expert testimony on foreign interference in the 2016
US presidential election before the Senate Select Committee on Intelligence.
Graphika helps partners around the world to discover how communities form
online and map the flow of influence and information within large-scale social
networks. It reports on information operations carried out by various foreign
actors around the world. In addition, Graphika regularly briefs the House and
Senate Intelligence Committees on a range of topics, including the growth of
the QAnon movement and the spread of misinformation around COVID-19.
Graphika’s team was led by their Chief Innovation Officer Camille François and
Head of Analysis Melanie Smith, and included 13 analysts, data scientists, and
open source investigators. This unique combination of skills and expertise
enables Graphika to take an innovative approach to detecting and monitoring
disinformation.
The Digital Forensic Research Lab (DFRLab) was founded at the Atlantic Council in 2016 to operationalize the study of disinformation by exposing falsehoods and fake news, documenting human rights abuses, and building digital resilience
worldwide. Its mission is to identify, expose, and explain disinformation where
and when it occurs using open source research, create a new model of expertise
adapted for impact and real-world results, and forge digital resilience at a time
when humans are more interconnected than at any point in history.
DFRLab’s contributing team was led by Director Graham Brookie and Resi-
dent Fellow Emerson Brooking and included 13 DFRLab research assistants and
communications staff. These professionals brought extensive digital forensic
research experience and language skills to the work of the Partnership.
The EIP was not set up as a legal entity; rather, it was a consortium based on
good-faith agreements. While future models should certainly consider more
formal arrangements, the time-sensitive nature of the project required organi-
zations to rely on interinstitutional trust and rapport built over several years of
collaboration.

1.3 The EIP: Goals and Scope


The stated objective of the EIP was to detect and mitigate the impact of attempts
to prevent or deter people from voting or to delegitimize election results.4 The
EIP was not a fact-checking partnership, and was not focused on debunking
misinformation more generally; our objective explicitly excluded addressing
comments made about candidates’ character or actions and was focused nar-
rowly on content intended to suppress voting, reduce participation, confuse
voters as to election processes, or delegitimize election results without evidence
(see Table 1.1 on the next page).
To determine what was in and out of scope for the EIP, one of our first tasks was
to build a framework that identified potential types of election-related mis- and
disinformation. This process identified four core categories that we defined as
our scope of focus (see Table 1.2 on page 7).

GOALS OF THE ELECTION INTEGRITY PARTNERSHIP

Goal 1: Identify misinformation before it goes viral.
Activities:
• Establish a collaboration between the top misinformation research organizations
• Operationalize the misinformation research process with tiered research and workspace management systems
• Train analysts to identify cross-platform trends for earlier platform notification and action when appropriate
Outputs:
• Flag policy violations to platforms
• Communicate to stakeholders

Goal 2: Share clear, accurate counter-messaging.
Activities:
• Build critical bridges between election officials, platforms, and civil society organizations
• Provide local and state officials with a partner that could research and help mitigate misinformation about their local operations
• Generate rapid research findings that have the ability to disrupt the misinformation environment in real time
Outputs:
• Live media briefings
• Blog posts
• Tweet threads

Goal 3: Increase transparency into what happened during the 2020 elections.
Activities:
• Collect data in real time for empirical analysis that would be difficult to assemble after the fact
• Build an annotated database of archived misinformation content
• Provide visibility into how narratives spread across multiple social media platforms
Outputs:
• Final report
• Dataset of content for future academic use
Table 1.1: Goals of the Election Integrity Partnership.

SCOPE OF THE ELECTION INTEGRITY PARTNERSHIP

Procedural Interference: Misleading or false information about the actual election procedures; content directly related to dates and components of the voting process that prevents people from engaging in the electoral process.
Example content: Content that misleads voters about how to correctly sign a mail-in ballot; content that encourages voters to vote on a different day.

Participation Interference: Content that deters people from voting or engaging in the electoral process, sometimes related to voter suppression or intimidation.
Example content: Content that affects the desire or perceived safety of voters engaging in the electoral process; misleading or false information about the length of lines at a polling station, to deter in-person voting.

Fraud: Content that encourages people to misrepresent themselves to affect the electoral process or illegally cast or destroy ballots.
Example content: Offers to buy or sell votes with cash or gifts; calls for non-citizens to vote.

Delegitimization of Election Results: Content that delegitimizes election results on the basis of false or misleading claims.
Example content: Claims of fraud or malfeasance with inaccurate or missing evidence.
Table 1.2: Scope of the Election Integrity Partnership.
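To make concrete how a rubric like this can be encoded in analyst tooling, the following is a minimal illustrative sketch in Python. It is our assumption of how such tagging might look, not tooling described by the EIP, and the names are hypothetical.

    from enum import Enum

    class ScopeCategory(Enum):
        """The EIP's four in-scope content categories (Table 1.2)."""
        PROCEDURAL_INTERFERENCE = "procedural interference"
        PARTICIPATION_INTERFERENCE = "participation interference"
        FRAUD = "fraud"
        DELEGITIMIZATION = "delegitimization of election results"

    # Example: a post misleading voters about how to sign a mail-in ballot
    # would fall under procedural interference.
    ticket_category = ScopeCategory.PROCEDURAL_INTERFERENCE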

In addition to determining the EIP’s scope, this content-centric framework enabled us to evaluate and compare platform policies across 15 different popular
social media platforms in the US, and to help civil society, government, academia,
and the public better understand what election-related content platforms can
and will moderate.5

Organizational Structure and Workflow Management


One of the innovative aspects of the EIP was its internal research structure,
which had to operationalize the misinformation research process in such a way
as to best leverage the capabilities of the partner organizations. There is often
an abundance of data involved in the analysis of information operations, and
the process of following threads can take weeks or months. In order to meet
the need for real-time or rapid analysis while maintaining the high standard
of investigations that each partner holds itself to, the Partnership developed a
tiered analysis model that leveraged “on-call” staffing of different analyst types.

[Figure 1.2 diagram: Tickets from the four major stakeholder groups (Government, Civil Society, Platforms, Media) entered an intake queue, while analysts’ proactive monitoring fed a separate detection queue. Tier 1, Detection and Intake: on-call data gathering, triage, and response. Tier 2, Assessment: a skilled analyst team maps the network and provides attribution. Tier 3, Mitigation: EIP managers review and determine appropriate partner communication, including via the EIP website and social media.]
Figure 1.2: The EIP internal workflow. Filed tickets moved through the listed queues per the
directional arrows.

The EIP tracked its analysis topics and engaged with outside stakeholder or-
ganizations using an internal ticketing workflow management system. Each
identified informational event was filed as a unique ticket in the system.6 Tickets
were submitted by both trusted external stakeholders (detailed in Section 1.4
on page 11) and internal EIP analysts. For example, an email from an external
stakeholder to the dedicated tip line would automatically generate a ticket to
the internal team for quick response. Similarly, if during online monitoring an
analyst came across a piece of content that might be an instance of election-
related misinformation, that analyst would open a ticket on the case and put it in the analyst queue for investigation. A single ticket could map to one piece of
content, an idea or narrative, or hundreds of URLs pulled in a data dump. The
ticket tracked analysts’ research into this event, comments from platform part-
ners, and other developments. Related tickets were then grouped into distinct
information events or incidents, described more in Chapter 5.7
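As a deliberately simplified illustration of the ticket record described above, consider the following Python sketch. The EIP's actual workflow ran on a Jira-based system; the field names, queue names, and transitions here are hypothetical stand-ins rather than the Partnership's real schema.

    from dataclasses import dataclass, field
    from enum import Enum
    from typing import List, Optional

    class Queue(Enum):
        INTAKE = "intake"          # tickets filed by trusted external stakeholders
        DETECTION = "detection"    # tickets opened by analysts during monitoring
        ASSESSMENT = "assessment"  # Tier 2 deep-dive analysis
        MITIGATION = "mitigation"  # Tier 3 manager review and partner communication
        HOLDING = "holding"        # ongoing monitoring
        CLOSED = "closed"

    @dataclass
    class Ticket:
        ticket_id: str
        source: str                                     # e.g., "tip line email" or "internal analyst"
        urls: List[str] = field(default_factory=list)   # one URL, a narrative, or hundreds of URLs
        incident_id: Optional[str] = None               # related tickets grouped into incidents
        queue: Queue = Queue.INTAKE
        notes: List[str] = field(default_factory=list)  # analyst research, platform comments

        def route_to(self, destination: Queue, note: str = "") -> None:
            """Move the ticket to its next queue, keeping an audit trail."""
            if note:
                self.notes.append(note)
            self.queue = destination

    # An email to the dedicated tip line would auto-generate a ticket like this:
    tip = Ticket(ticket_id="EIP-0001", source="tip line email",
                 urls=["https://example.com/post"])
    tip.route_to(Queue.DETECTION, note="triage: potentially in scope")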

Analysis Tiers
Each ticket traveled through a series of analysis queues before reaching a final
resolution. In the investigation process, analysts completed specific forms
that contained a series of required fields detailing the information incident
and documented essential data such as target audience, subject, engagement,
and spread. The overall research process was broken down into three phases:
detection, assessment, and mitigation.

• Tier 1: Detection — Tier 1 analysts were tasked with conducting the ini-
tial analysis on and archiving of potential incidents. These analysts also
searched for potential in-scope content by tracking public social media
posts to surface incidents. To ensure coverage in the monitoring pro-
cess, each analyst was assigned to a specific state or interest group (see
Section 3.3), which they developed expertise in and followed throughout
the project. These analysts classified tickets as in and out of scope for
further analysis and closed incidents for which further investigation or
external communication was not needed. For in-scope tickets, analysts
went through a systematic process that attempted—where possible—to
assess the veracity of the underlying claims by locating an external fact-
check from election officials, fact-checking organizations, local media,
or mainstream outlets. They also made initial recommendations on the prioritization of tickets, assigning high, medium, and low severity based on the risk of the content itself and on its spread across platforms (a sketch of this step appears after this list).8
• Tier 2: Assessment — This team was staffed by senior analysts from each
partner organization. Analysts used open source intelligence and other
social media analysis methods to delve deeper into the initial analysis
from Tier 1 by determining the suspected origins of a piece of information,
tracking its spread over time, and identifying additional fact-checks as
they became available. Tier 2 analysts also looked for evidence of coordi-
nation, potential foreign interference, or inauthentic dynamics related to
a given incident. This tier of analysts could recommend actions, such as
communication to external partners, as appropriate.
• Tier 3 (Managers): Mitigation — This team consisted of leadership from
each partner organization, who signed off on the communication rec-
ommendations from Tier 2 senior analysts. The manager had the ability
to tag platform partners on a ticket for action. They also communicated


with the EIP’s partners in government, and could request further infor-
mation from election officials if necessary. Once a ticket reached Tier 3,
the manager decided whether to put it into a holding queue for ongoing
monitoring, assign the ticket back to a Tier 2 analyst to produce a public
blog post or Twitter thread discussing the issue, or close a ticket if it had
been resolved.
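As promised above, here is a sketch of the Tier 1 prioritization step. The report says only that severity reflected the risk of the content itself and its spread across platforms; the numeric scales and cutoffs below are assumptions for illustration, not the EIP's published rubric.

    def assign_severity(content_risk: int, cross_platform_spread: int) -> str:
        """Combine content risk and spread (each scored 1-5 here, an assumed
        scale) into the high/medium/low severity used to prioritize tickets."""
        score = content_risk + cross_platform_spread
        if score >= 8:
            return "high"
        if score >= 5:
            return "medium"
        return "low"

    # A low-risk claim that is already spreading widely across platforms:
    print(assign_severity(content_risk=2, cross_platform_spread=5))  # "medium"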

Team members from each of these tiers were divided into on-call shifts. Each
shift was four hours long and led by one on-call manager. It was staffed by a
mix of Tier 1 and Tier 2 analysts in a 3:1 ratio, ranging from five to 20 people.
Analysts were expected to complete between two and five shifts per week. The
scheduled shifts ran from 8:00 am to 8:00 pm PT for most of the nine weeks
of the partnership, ramping up only in the last week before the election from
12-hour to 16- to 20-hour days with all 120 analysts on deck.
A note on fact-checking: the EIP was not a fact-checking organization, and in
preliminary assessments of whether an event in a ticket was potentially misin-
formation, analysts first looked to the work of others. One of the complexities
related to misleading information is that it is not always possible to verify the
claims; professional fact-checkers confronted with these situations may use
labels like “inconclusive” or “partially true” to convey uncertainty where it exists.
Where possible, our analysts identified an external fact-checking source from
news sites, credible fact-checking organizations, or statements from a local
election official when filing tickets. Analysts also used open source investigation
techniques, such as reverse image searches or location identifications, to de-
termine if images or videos tied to an incident were taken out of their original
context. Our analysts identified at least one external fact-check source for
approximately 42% of the in-scope tickets. For some tickets, it was not possible
to find an external fact-check for the content, either because no fact-checker
had yet addressed the issue, or because the information was resistant to simple
verification—for example, content based on unconfirmed or conflicting claims
from a whistleblower, conspiracy theories that claimed invisible forces at work,
and narratives based on factual claims (e.g., discarded ballots) but spread within
misleading frames that exaggerated the potential impact of these events. Addi-
tionally, some tickets were about incitement to violence, which does not lend
itself to fact-checking.

Election Day-Specific Structures


In the week before and after Election Day, EIP monitoring intensified significantly.
Over the two-month-long period from September 3 (the first day of EIP activity)
to November 1, EIP researchers had logged 269 tickets. From November 2 to 4, EIP researchers logged an additional 240 new tickets, while also monitoring and revising old cases as they related to new narratives. This dramatic increase in
tempo required changes to how the EIP identified and evaluated misinformation
incidents.
In order to manage an anticipated increase in incidents on Election Day itself,
the EIP established five working groups, each organized and led by relevant
subject matter experts:

• State and Regional Monitoring focused on monitoring narratives related to polling locations in battleground states, particularly Pennsylvania, Wisconsin, Florida, and Minnesota. Analysts used platform search features coupled with curated CrowdTangle, Twitter, and Junkipedia lists to aid in detection.

• “Targeted Group” Monitoring focused on identifying misinformation that seemed to specifically target an ethnic or diaspora community in the United States. This included content targeting the Black community, which was the subject of extensive disinformation campaigns in 2016, as well as Chinese- and Spanish-language content.

• Influencers and Young Electorate Monitoring focused on first-time voters, particularly members of Generation Z. This work was conducted by way of close analysis of TikTok and Instagram trends.

• Political Extremism Monitoring focused on communities that had previously endorsed political violence, particularly those adjacent to White-identitarian causes. This work was conducted by comprehensive monitoring across 4chan, 8kun, Gab, and Parler. Researchers additionally monitored open Telegram channels and Discord servers linked to extremist causes.

• Livestream Monitoring focused on rapidly identifying trending livestreams, which were anticipated to involve both polling location activity and (later) election night protests. This work required assessing popular livestreams across Facebook Live, Periscope, YouTube Live, and Twitch.

These working groups would provide the foundation of EIP monitoring efforts
in both the Election Day and post-Election Day periods.

1.4 External Stakeholders


The EIP served as a connector for many stakeholders, who both provided inputs to and received outputs from the internal analysis structure described above. External stakeholders included government, civil society, social media companies, and news media entities.
Government and civil society partners could create tickets or send notes to
EIP analysts, and they used these procedures to flag incidents or emerging
narratives to be assessed by EIP analysts. Sometimes the tickets were out of
scope, such as those related to general political misinformation that was not
election related. In these cases, that was communicated to the reporting partner
and the incident was closed. For all that were in scope, the EIP quickly analyzed
the issues and provided outputs to external stakeholders. Some of the cases
flagged by outside partners led to EIP participation in informing the public of a
finding, which was done by way of a rapid-response blog post or Twitter thread,
or a discussion during public media briefings.

[Figure 1.3 diagram: Government, Civil Society, Platforms, and Media. We did not formalize partnerships with media, but we engaged with interested journalists from local and national media organizations.]

Figure 1.3: Major stakeholder groups that collaborated with the EIP.

Government
Given the decentralized nature of election administration, government entities
at the local, state, and federal level are all responsible in some way for election
security and thus for countering election-related mis- and disinformation.
Prior to the 2016 election, the federal government played a very limited role in
election security. Russian interference in the 2016 US presidential election took
the form of several Russia-linked entities engaged in a broad interference effort
that included information operations and targeting of election infrastructure
as well as hack-and-leak attacks. Operatives of the Russia-based Internet Re-
search Agency used social media to degrade Americans’ confidence in their own democratic process. Since 2016, the US government has declared election sys-
tems critical infrastructure and politicians have called for a “whole-of-society”
approach to countering attacks against them.9

EI-ISAC: Coordination Across State And Local Government


After the 2016 election, government entities at all levels stepped up election
security efforts; however, addressing election-related misinformation has re-
mained a gap. For the 2020 election, reporting falsehoods about the election
to social media platforms represented significant logistical and jurisdictional
challenges. The Election Infrastructure Information Sharing and Analysis Center
(EI-ISAC), an independent organization run by the non-profit Center for Internet
Security (CIS) that connects state and local governments as well as relevant
private companies, helps coordinate election security efforts broadly. In this
election cycle, the EI-ISAC served as a singular conduit for election officials to
report false or misleading information to platforms. By serving as a one-stop
reporting interface, the EI-ISAC allowed election officials to focus on detecting
and countering election misinformation while CIS and its partners reported
content to the proper social media platforms. Additionally, the Countering
Foreign Influence Task Force (CFITF), a subcomponent of CISA, aided in the
reporting process and in implementing resilience efforts to counter election
misinformation.
The EIP engaged with government stakeholders primarily to provide analyti-
cal capability and context around election-related misinformation. Content
reported by election officials to the EI-ISAC was also routed to the EIP ticketing
system. This allowed analysts to find similar content, ascribe individual con-
tent pieces to broader narratives, and determine virality and cross-platform
spread if applicable. This analysis was then passed back to election officials
via the EI-ISAC for their situational awareness, as well as to inform potential
counter-narratives. Additionally, if an internally generated EIP ticket targeted a
particular region, analysts sent a short write-up to the EI-ISAC to share with the
relevant election official. This allowed the state or local official to verify or refute
the claim, and enabled analysts to properly assess whether or not the content
violated a platform’s civic integrity policies. In this way, the EIP demonstrated
the upside of using the EI-ISAC coordinating body to connect platforms with
authoritative voices to determine truth on the ground and help election officials
effectively counter viral falsehoods about election infrastructure.

Civil Society
Civil society organizations fill critical roles in promoting civic engagement, and
in organizing and sharing information with their communities. The EIP engaged with civil society organizations to share findings and build perspective across ge-
ographies and demographics. Civil society collaborators submitted tips through
the trusted partner tip line and interacted with the EIP research team through
briefings, partner meetings, and shared findings. The Partnership engaged with
Common Cause,10 national and regional chapters of the NAACP,11 the Healthy
Elections Project,12 the Defending Digital Democracy Project,13 MITRE,14 re-
gional chapters of the AARP,15 and the National Conference on Citizenship16
(the latter two are discussed in more detail below). Some collaborators were
integrated into the Jira platform for tip reporting, while others preferred to
engage in a more informal capacity such as via email. Onboarded members were
able to submit tickets for analysis and receive feedback from the EIP analysts.
The AARP collaboration was maintained by the Center for an Informed Public
and was notable because it involved empowering and training retired adults to
identify false or misleading information as part of a “Factcheck Ambassador”
training program. The EIP worked primarily with the Washington State chapter
of the AARP, but informational training sessions were shared with other chapters
around the country.17
Another noteworthy civil society partner was the National Conference on Cit-
izenship, specifically their Junkipedia team.18 Junkipedia is a research tool
created by the Algorithmic Transparency Institute, a project of the National
Conference on Citizenship, to collect false and misleading social media con-
tent. The tool served dual purposes: first, it connected EIP to content surfaced
through its own network of journalists and reporters, providing visibility into
more geographies and communities; and second, it facilitated research and
detection by EIP analysts, who were able to use Junkipedia’s list feature to track
account activity on TikTok and YouTube.

Media
Carefully considered media coverage debunking false and misleading infor-
mation can help to ensure an informed public and a responsible social media
ecosystem. Although mis- and disinformation monitoring and analysis work is
valuable on its own, communications with media organizations increased the
impact of the EIP’s research. The EIP’s rapid-response research and analysis
work necessitated an adaptive, rapid-response communications strategy in
order to share timely insights and key mis- and disinformation concepts with
journalists and news outlets. One goal was to ensure that misleading narra-
tives were appropriately contextualized in terms of their reach and velocity, to
avoid unnecessarily amplifying something false but very sparse. Investigating
and reporting on mis- and disinformation is complex and comes with unique
challenges.19 The EIP held regular news briefings in which analysts and team
leads prioritized describing and contextualizing the misinformation incidents documented in tickets. Journalists who attended the briefings could then reach,
educate, and inform the communities they served, contextualizing and counter-
ing misleading narratives as they saw fit. Over the time of the EIP’s operation,
this process resulted in over 60 articles that specifically cited the EIP’s work or
its researchers.20

A thoughtful media strategy was key to our reach and impact as an organization.
We met the needs of media stakeholders in three primary ways—public research
briefings, responding to media requests, and in-depth collaborations.

Public Research Briefings

On October 13, 2020, the EIP hosted the first in a series of weekly research
briefings designed to share the Partnership’s rapid-response research and policy
analysis more broadly ahead of Election Day. Before each briefing, the EIP used
its Twitter account, @2020Partnership, to announce the briefing and promote
attendance. These briefings, scheduled for 30 minutes, were hosted virtually
on Zoom and featured short presentations from various EIP researchers and
analysts. Each briefing reserved time for members of news organizations to
ask questions of researchers involved with the Partnership. The briefings were
considered “on the record,” meaning that anything shared or said during the
course of the presentations or from the question-and-answer session could be
used and directly quoted from by journalists for their reporting. The Q&A format
allowed EIP researchers and analysts to cover a lot of ground in a relatively short
amount of time while also allowing journalists to gain additional insights from the
other questions asked by reporters from other news organizations. As interest
in the EIP’s work grew and reports of false and misleading information increased
dramatically in the days leading up to the election, briefings increased from
once a week to several times a week.

The briefings were open to the public. The first briefing hosted approximately
12 journalists, but as interest grew, so did briefing attendance, with an average
of 120 attendees on election week briefings and a peak of 174 attendees at the
briefing the day after the election. After each briefing, the EIP communications
lead followed up with journalists in attendance.

On Election Day, the EIP hosted a morning and afternoon briefing to report on
observations of activity that day. Reporters and editors from outlets including
the Washington Post, the New York Times, the Wall Street Journal, USA Today, MIT
Tech Review, Bloomberg Business, the Associated Press, Reuters, National Public
Radio, Politico, NBC News, The Markup, The Information, PBS NewsHour, BBC
News, Agence France Presse, the Telegraph, and Cyberscoop regularly attended.

Responding to Media Requests


Throughout the course of the EIP’s work ahead of and after Election Day, our
communications lead also fielded inbound requests from the press to assist in
assessing specific developing stories. Some of these journalists were dedicated
to the “misinformation beat,” while others covered peripheral beats such as the
election, politics, technology, etc.
The UW team took the lead in tracking and responding to media requests that
came in across the Partnership and connecting with the appropriate EIP re-
searcher. For instance, journalists interested in misinformation-related policies
developed by social media companies were directed to Stanford Internet Ob-
servatory, which closely monitored and analyzed guidelines put forward by
platforms. Similarly, journalists interested in EIP research about “repeat spread-
ers” on Twitter who regularly shared false claims or misleading information
about voting procedures were connected with members of the UW team, who
were tracking and analyzing how that type of misinformation was shared and
amplified.

In-Depth Collaborations
In the days leading up to the election, the EIP set up collaborations with a
few journalists who had experience covering the “misinformation beat.” These
differed from media requests in the length of engagement; in these cases, we set
up Slack channels and Google documents to think through trends and emerging
data with the journalists, who were also experts in online misinformation. For
instance, the UW team fielded more specialized research requests from NBC
News, which has dedicated numerous newsroom resources to reporting on
mis- and disinformation issues. NBC’s Brandy Zadrozny did some of the most
substantive reporting on election-related mis- and disinformation ahead of
and after Election Day, bolstered by some of the EIP’s specialized research.
Her election week story about election fraud narratives was driven by this in-
depth collaboration.21 Sheera Frenkel of the New York Times spent Election
Day co-located with EIP researchers from the Stanford Internet Observatory,
with COVID-19 precautions in place. She published an early piece about the
emerging “Stop the Steal” narrative, with quotations from an SIO researcher.22
The EIP also spent time assisting a local journalist writing specifically about
election misinformation in Michigan for the Detroit Free Press, whose report-
ing was funded through a short-term grant from the American Press Institute.
The reporter, Ashley Nerbovig, attended numerous research briefings ahead
of Election Day and was interested in the EIP’s “What to Expect” report that
outlined the types of disinformation and misinformation that researchers antic-
ipated would emerge and take root before, during, and after Election Day.23 A
November 17, 2020, article in the Detroit Free Press looked at how many of the
EIP’s pre-election predictions around voting-specific misinformation emerged
in Michigan, where incorrect claims and distorted narratives ran rampant in the
days and weeks that followed voting.24 That Detroit Free Press article, featuring
interviews with EIP researchers, was republished by USA Today25 and other
news publications in the USA Today Network, including the Arizona Republic.
Although many national newsrooms have one or multiple journalists focused
on misinformation, Nerbovig was among the few regional reporters dedicated
to covering misinformation from a local perspective, which encouraged us to
make researchers available to her as she developed her story.

The EIP’s outreach efforts with journalists and media organizations were valuable
because they enabled timely sharing of insights and in-depth analysis with
the public, policymakers, and social media platforms. During uncertain times,
many people turn to journalists. At the same time, journalists themselves were
seeking sound information to better contextualize the dynamics of how mis-
and disinformation are shared and amplified. By connecting journalists to our
research through these media efforts, the EIP was able to have a quick and
widespread impact.

Platforms
The EIP established relationships with social media platforms to facilitate flag-
ging of incidents for evaluation when content or behavior appeared to violate
platform policies (discussed further in Chapter 6). The EIP reached out to a
wide set of social media platforms to engage with the project, and onboarded
those that expressed interest in participating. At the start of the EIP analysis
period, representatives from the onboarded platforms were granted access to
the workspace management system. Analysts conducted their initial assessment
on all tickets, and, if content in a ticket appeared to be a violation of a platform’s
published content policies,26 an analyst or manager added the platform repre-
sentative to the ticket. If questions arose, a manager communicated with the
platform representative in the ticket comments. Analysts put the ticket back in
the queue and updated the ticket to note if the content in question received a
moderation action. If analysts identified the content on a ticket as in scope, but
not in violation of a platform’s published policies, the platform was not tagged.

The EIP onboarded the following social media companies: Facebook and Insta-
gram, Google and YouTube, Twitter, TikTok, Reddit, Nextdoor, Discord, and
Pinterest. These platforms were chosen based on several factors including the
size of the platform itself, as well as the practical research constraints around the
ability to monitor public content on the platform. A platform such as Snapchat,
for example, has a large userbase; however, due to its ephemeral content, we
did not include this platform in our work.

There were additionally several “alt-platforms” that had no moderation policies,
sometimes deliberately so. This included platforms such as Parler, Gab, 4chan,
and a handful of message boards. EIP observed false and misleading content on
these platforms, but had no interactions with any of their representatives.

1.5 Example Ticket Process


To illustrate the scope of collaboration types discussed above, the following case
study documents the value derived from the multistakeholder model that the
EIP facilitated. On October 13, 2020, a civil society partner submitted a tip via
their submission portal about well-intentioned but misleading information in a
Facebook post. The post contained a screenshot (see Figure 1.4).

Figure 1.4: Image included in a tip from a civil society partner.

In their comments, the partner stated, “In some states, a mark is intended
to denote a follow-up: this advice does not apply to every locality, and may
confuse people. A local board of elections has responded, but the meme is
being copy/pasted all over Facebook from various sources.” A Tier 1 analyst
investigated the report, answering a set of standardized research questions,
archiving the content, and appending their findings to the ticket. The analyst
identified that the text content of the message had been copied and pasted
verbatim by other users and on other platforms. The Tier 1 analyst routed
the ticket to Tier 2, where the advanced analyst tagged the platform partners
Facebook and Twitter, so that these teams were aware of the content and could
independently evaluate the post against their policies. Recognizing the potential
for this narrative to spread to multiple jurisdictions, the manager added in the
CIS partner as well to provide visibility on this growing narrative and share the
information on spread with their election official partners. The manager then
routed the ticket to ongoing monitoring. A Tier 1 analyst tracked the ticket until
all platform partners had responded, and then closed the ticket as resolved.
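To make the routing concrete, the following sketch models the ticket lifecycle described above as a small state machine. It is an illustration only: the EIP ran this workflow in Jira, and the class, state, and field names here are hypothetical simplifications rather than the actual Jira schema.

```python
from dataclasses import dataclass, field
from enum import Enum, auto

class Status(Enum):
    TIER1_REVIEW = auto()   # initial assessment and archiving
    TIER2_REVIEW = auto()   # advanced analysis, partner tagging
    MONITORING = auto()     # awaiting partner responses
    RESOLVED = auto()

@dataclass
class Ticket:
    summary: str
    urls: list[str] = field(default_factory=list)
    shared_with: set[str] = field(default_factory=set)  # tagged partners
    status: Status = Status.TIER1_REVIEW

    def escalate(self) -> None:
        """Tier 1 routes an in-scope ticket up to a Tier 2 analyst."""
        self.status = Status.TIER2_REVIEW

    def tag(self, partner: str) -> None:
        """Tagging a partner makes the ticket visible to that organization."""
        self.shared_with.add(partner)

    def monitor(self) -> None:
        self.status = Status.MONITORING

    def resolve(self) -> None:
        """Closed once all tagged platform partners have responded."""
        self.status = Status.RESOLVED

# The October 13 tip, traced through the same path described above.
ticket = Ticket("Misleading ballot-marking advice copied across Facebook")
ticket.escalate()
for partner in ("facebook", "twitter", "cis"):
    ticket.tag(partner)
ticket.monitor()
ticket.resolve()
```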

1.6 Practical Lessons Learned


The EIP was a first-of-its-kind collaboration between multiple stakeholder
types who shared the goal of understanding, and being positioned to rapidly
and effectively counter, election-related misinformation. There were several
key lessons learned that may help inform similar efforts in the
future:

Pre-Election Period

1. Detailed enumeration and comparison of platform policies led to tangible
positive changes. When the EIP was formed in the summer of 2020, no
comprehensive comparison of policies around election-related misinfor-
mation, or civic integrity, had been published. One of the first efforts of the
Partnership was to collect these policies and compare them side-by-side.
That policy comparison improved the EIP’s quality of content analysis and
reporting.

2. Pre-bunking helped journalists contextualize what they were seeing. On
October 26 the EIP published a blog post predicting the manner and focus
of misinformation that its analysts and researchers believed were likely to
pervade social media on Election Day and shortly after.27 This piece was
informed by experience from past elections, and observations accrued dur-
ing the months of monitoring and analysis. Most of the predictions turned
out to be accurate. This post, and the subsequent targeted stakeholder
briefings around it, provided a rare opportunity to “pre-bunk” narratives
before they reached the mainstream. This sort of effort may be useful in
effectively mitigating the effects of misinformation in the future.28
3. Using per-content tickets to represent incidents presented challenges
for tracking larger narratives. As noted in this chapter, the EIP often
started analysis by examining content on a very granular level—a ticket
might initially represent a single social media post. On the positive side,
this approach allowed for nimble Tier 1 analysis, and the Jira platform
allowed for aggregation as needed. On the negative side, this approach
made tracking narratives significantly more difficult, especially those dor-
mant for a period of time before resurfacing in many online locations at
once. Narratives usually spanned multiple types of content pieces across
multiple platforms over a broad period of time. While the EIP analysts
would eventually merge or link tickets into a broader narrative ticket, this
process was labor intensive and ran the risk of content data getting lost
in the effort (a simplified sketch of ticket-to-narrative linking follows below).
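As a hypothetical illustration of the data-model problem, the sketch below shows one way per-content tickets could be linked under a narrative-level ticket so that granular data survives aggregation; it is not the EIP's actual Jira schema.

```python
from dataclasses import dataclass, field

@dataclass
class ContentTicket:
    ticket_id: str
    urls: list[str] = field(default_factory=list)  # individual posts

@dataclass
class NarrativeTicket:
    name: str
    children: list[ContentTicket] = field(default_factory=list)

    def link(self, ticket: ContentTicket) -> None:
        """Linking preserves the per-post tickets instead of overwriting them."""
        self.children.append(ticket)

    def all_urls(self) -> list[str]:
        # Aggregate across platforms and time without losing granular tickets.
        return [url for t in self.children for url in t.urls]
```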

Election Day and Afterward


1. Public briefings and one-on-one media engagement bolstered real-time
information exchange, and helped educate and inform the public. The
EIP’s media briefings were not originally a planned part of the effort. How-
ever, we found that they were of value for enabling journalists to contex-
tualize observed events and trends and communicate them to the larger
public.
2. The cadence and resource demands of rapid analysis increased as the
election cycle progressed, leading to challenges in the logistics of EIP
research. The members of the EIP span the mis- and disinformation
research community, which has primarily focused on retrospective analysis.
In contrast, demands of the EIP publication schedule represented a novel
operational challenge for all organizations involved in a few key ways. First,
the EIP analysis and a commitment to quick turnaround required drawing
conclusions based on rapidly updating information. Second, the EIP’s
regular public briefings required updating conclusions and predictions in
an episodic manner. Third, a COVID-shortened fall academic quarter for
Stanford University and University of Washington student analysts made
it challenging to synchronize work after the Thanksgiving break.

1.7 Reading This Report


This report—the conclusion to the Election Integrity Partnership’s work—
summarizes and details the Partnership’s findings since its formation on July
26, 2020. Chapter 2 lays out the metrics and statistics from EIP’s detection
period, which are the foundation of further analysis. Chapter 3 examines the key
false and misleading narratives that emerged and evolved over the course of the
2020 election and after, and Chapter 4 looks at the tactics used to spread the
narratives across the information ecosystem. We take a broader perspective in
Chapter 5, looking at “repeat spreaders”—individuals, organizations, and media
entities that repeatedly promoted numerous false and misleading narratives.
In Chapter 6, we review social media platforms’ election-related policies and
discuss how those policies matured over time and were applied. We conclude
the report in Chapter 7 by providing policy recommendations, based on the
findings of our work, to government entities, media outlets, platforms, and civil
society organizations.


Notes

1. (page 1) Michael McFaul, ed., Securing American Elections: Prescriptions for
Enhancing the Integrity and Independence of the 2020 U.S. Election and Beyond
(Stanford, CA: Cyber Policy Center, June 2019), https://fsi-live.s3.us-west-1.amazonaws.com/s3fs-public/stanford_cyber_policy_center-securing_american_elections.pdf.

2. (page 1) Herbert Lin, et al., “Increasing the Security of the U.S. Election
Infrastructure” in McFaul, Securing American Elections, 17.

3. (page 2) In Wisconsin, for example, federal district court judge William
Conley ruled to extend the acceptance date of absentee ballots from November
3 to November 9, citing that “Wisconsin’s election system sets [voters] up for
failure in light of the near certain impacts of this ongoing pandemic.” The judge
put his order on hold to give the Wisconsin State Legislature time to appeal.
The Circuit court ultimately overruled the lower court ruling and time ran
out for the Wisconsin legislature to legislate or appeal an exception to state
election law. See Democratic National Committee v. Bostelmann, No. 20-2835
(7th Cir. October 8, 2020); Amy Howe, “Court declines to reinstate COVID-19
accommodations for elections in Wisconsin,” SCOTUSblog, October 26, 2020,
11:28 pm, https://www.scotusblog.com/2020/10/court-declines-to-reinstate-covid-19-accommodations-for-elections-in-wisconsin/.

4. (page 5) “Announcing the EIP,” Election Integrity Partnership, July 27, 2020.

5. (page 8) The platforms we evaluated are: Facebook, Instagram, Twitter,
YouTube, Pinterest, Nextdoor, TikTok, Snapchat, Parler, Gab, Discord, WhatsApp,
Telegram, Reddit, and Twitch. We published our initial evaluation on August 18,
2020, and updates on September 4, September 11, October 14, October 19,
October 27, and October 28, 2020. Twitch was added to our list of evaluated
platforms during our blog post update on October 27. Each update reflected
changes in platforms’ published policies. See “Evaluating Election-Related
Platform Speech Policies,” Election Integrity Partnership, October 28, 2020,
https://www.eipartnership.net/policy-analysis/platform-policies.
6. (page 8) The EIP used Jira Service Desk software for the project. The team
chose Jira because it supported a large team and allowed the addition of work-
flows that require both robust customer management capabilities and organiza-
tional features to reflect the numerous roles needed to respond to any inbound
request. Licenses and technical support were provided under Atlassian’s com-
munity license program.
7. (page 9) See Appendix A on page 245: Definitions for a detailed definition of
both Events and Incidents.
8. (page 9) See Appendix B on page 249 for the Tier 1 and Tier 2 analysis
questions.
9. (page 13) Sean Lyngaas, “Sen. Warner calls for a ‘whole-of-society’ U.S. cyber
doctrine,” CyberScoop, December 7, 2018,
https://www.cyberscoop.com/sen-warner-calls-whole-society-u-s-cyber-doctrine/.

10. (page 14) Common Cause, https://www.commoncause.org/our-work/voting-and-elections/.

11. (page 14) NAACP, https://naacp.org.

12. (page 14) Stanford-MIT Healthy Elections Project, https://healthyelections.org.

13. (page 14) Defending Digital Democracy Project,
https://www.belfercenter.org/project/defending-digital-democracy.

14. (page 14) MITRE, https://www.mitre.org.

15. (page 14) AARP, https://www.aarp.org.

16. (page 14) National Conference on Citizenship, https://ncoc.org.

17. (page 14) Elliot Trotter, “CIP, AARP Washington Factcheck Ambassador
Trainings help retirees sort fact from fiction,” University of Washington Center
for an Informed Public, December 16, 2020,
https://www.cip.uw.edu/2020/12/16/cip-aarp-washington-factcheck-ambassador-trainings/.

18. (page 14) “About Junkipedia,” https://www.junkipedia.org/about.

19. (page 14) Melinda McClure Haughey, et al., “On the Misinformation Beat:
Understanding the Work of Investigative Journalists Reporting on Problematic
Information Online,” Proceedings of the ACM on Human-Computer Interaction
no. 4, Article 133 (October 2020), https://doi.org/.

20. (page 15) See Appendix E on page 257 for a list of media citations.

21. (page 16) Brandy Zadrozny, “Misinformation by a thousand cuts: Varied
rigged election claims circulate,” NBC News online, November 11, 2020,
https://www.nbcnews.com/tech/tech-news/misinformation-thousand-cuts-varied-rigged-election-claims-circulate-n.

22. (page 16) Sheera Frenkel, “The Rise and Fall of the ‘Stop the Steal’ Facebook
Group,” New York Times, November 5, 2020,
https://www.nytimes.com/2020/11/05/technology/stop-the-steal-facebook-group.html.

23. (page 16) Kate Starbird, et al., “Uncertainty and Misinformation: What to
Expect on Election Night and Days After,” Election Integrity Partnership, October
26, 2020, https://www.eipartnership.net/news/what-to-expect.

24. (page 17) Ashley Nerbovig, “‘Not a whole lot of innovation’: 2020 election
misinformation was quite predictable, experts say,” The Detroit Free Press,
November 17, 2020,
https://www.freep.com/story/news/politics/elections/2020-presidential-election-misinformation-predictable-experts.

25. (page 17) Ashley Nerbovig, “‘Not a whole lot of innovation’: 2020 election
misinformation was quite predictable, experts say,” USA Today, November 17,
2020,
https://www.usatoday.com/story/news/politics/elections/2020-presidential-election-misinformation-predictable-experts.

26. (page 17) “Evaluating Election-Related Platform Speech Policies,” Election
Integrity Partnership.

27. (page 19) Kate Starbird, et al., “Uncertainty and Misinformation: What to
Expect on Election Night and Days After,” Election Integrity Partnership, October
26, 2020, https://www.eipartnership.net/news/what-to-expect.

28. (page 20) Brian Freidberg, et al., “A Blueprint for Documenting and
Debunking Misinformation Campaigns,” Nieman Reports (October 20, 2020),
https://niemanreports.org/articles/a-blueprint-for-documenting-and-debunking-misinformation-campaigns/.


Chapter 2
Data and Summary Statistics

2.1 Introduction
The Election Integrity Partnership collected data between September 3, 2020
and November 19, 2020. The dataset we discuss in this part of our report
comes from tickets: the internal reports within the EIP’s system, each of which
identified a unique information event.

Key findings
• We processed 639 in-scope tickets. 72% of these tickets were related to
delegitimizing the election results.
• Twitter, Google, Facebook, and TikTok all had a 75% or higher response
rate (on the EIP Jira ticketing platform) to tickets they were tagged in.
• Our process got tighter—both within the EIP and in terms of our relation-
ship with the platforms—over time, with the time between ticket creation
and platform response dropping substantially as we approached Election
Day.
• 35% of the URLs we shared with Facebook, Instagram, Twitter, TikTok, and
YouTube were either labeled, removed, or soft blocked. Platforms were
most likely to take action on content that involved premature claims of
victory.

Tickets
Most tickets created through the EIP’s work represent a unique piece of mis-
information or disinformation related to election processes. For example, one
ticket was for a Google ad incorrectly claiming that a Florida official had been
caught perpetrating a voter fraud scheme. Other tickets discussed a misinfor-
mation narrative that appeared across several platforms. Some tickets
focused on a single website that was generating a lot of misinformation. Other
tickets discussed incitement to violence—for example, one ticket discussed all
cross-platform instances of a single meme instructing people on how to disguise
themselves ostensibly ahead of a violent rally. Tickets were primarily created by
members of the four core EIP organizations, though 16% of tickets were filed by
the Center for Internet Security (CIS), an election official community partner, in
the form of tips.
Figure 2.1 on the facing page shows an excerpt of an example ticket. This ticket
was created for #Sharpiegate, the narrative that voters were forced to complete
their ballots with Sharpie markers that would invalidate ballots. The “Shared
with” list shows the organizations tagged on this ticket—tagging an organization
is the equivalent of sharing, making the ticket visible to them. The URLs field
includes URLs containing or involved in the spread of the misinformation. We
discuss the dataset composed exclusively of those URLs in this section of the
report as well.
The ticket also has fields for analyst discussion, data that we also extracted and
coded. Figure 2.2 on page 30 shows the discussion for the #Sharpiegate ticket.
This example shows responses from our government partners, who provided
helpful information, and platform responses.
The ticket-level dataset necessarily reflects the biases of those with the au-
thority to create tickets: internal EIP members and external partners. For
example, researchers within the Partnership signed up to monitor particular
topic groups, such as influencer accounts or Spanish-language content (see
Chapter 1, Section 1.3 on page 10 for a list of these groups). Our finite staff
and time meant that we prioritized monitoring some content over others; for
example, our prioritization of swing states over non-swing states may cause
the dataset to understate the amount of misinformation in the latter. Similarly,
we were not able to monitor misinformation in languages not widely spoken in
America, and as a result our dataset likely understates the amount of foreign
language misinformation. While the dataset has these weaknesses, given our
large team and cross-platform monitoring, we believe this dataset is important
and unique, and that it can shed light on key misinformation narratives and
tactics around the election.
In total, the dataset included 639 distinct, in-scope tickets. Following the elec-
tions, we coded the tickets to assess what category of election-related misinfor-
mation they fell under (for example, participation interference or fraud), what
tactics were used (for example, livestream video), what actor was targeted (for
example, poll workers or USPS), what state(s) were targeted, and what part of
the electoral process was discussed (for example, voting by mail). Two members
of the EIP coded each ticket, and a different member reconciled any
discrepancies in coding.

Figure 2.1: An example ticket. We have omitted specific URL information.

The taxonomy, featuring 10 questions and a total of 71 choices, performed
suitably. Intercoder agreement was evaluated with Cohen’s Kappa, a metric used
to judge coder agreement with consideration for random entries by coders.1
Cohen’s Kappa (K) is represented as a range from 0 to 1, where K = 0 indicates
random agreement, and K = 1 indicates total agreement between coders. Our
coding processes and dataset scored K = 0.629, which indicates substantial
agreement and inspires confidence in the final dataset given the thorough recon-
ciliation process that each ticket went through after its initial coding. The mean
percentage agreement across the set was 89.48% with a standard deviation of
0.08%. Given high percentage agreement and a reasonably confident Kappa
score, the codified tickets can be reliably used to evaluate our monitoring efforts.
We provide more details on findings from the inter-coder reliability analysis in
Appendix B on page 249.

Figure 2.2: Discussion on the #Sharpiegate ticket. The commenters include members of the EIP,
government partners, and platform partners.
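As a point of reference, Cohen’s Kappa relates the observed agreement rate p_o to the expected chance-agreement rate p_e. The back-of-envelope below plugs in the aggregate figures reported above, under the simplifying assumption that the mean percentage agreement can stand in for p_o directly:

```latex
\[
\kappa = \frac{p_o - p_e}{1 - p_e}
\quad\Longrightarrow\quad
p_e = \frac{p_o - \kappa}{1 - \kappa}
    \approx \frac{0.8948 - 0.629}{1 - 0.629}
    \approx 0.72
\]
```

In other words, the coders’ raw 89.48% agreement sits against an implied chance-agreement rate of roughly 72%, which is why the Kappa of 0.629 is the more informative number.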


For one of the questions that had lower than normal intercoder agreement—
whether or not the ticket related to fraud—we developed a clearer definition of
fraud and redid the coding for all tickets.
Throughout this chapter we will note some important limitations in the dataset.
For example, when we discuss platform response rates, these are response rates
only from platforms we partnered with. There will be no data for Parler response
rates, for example, because Parler was not an external partner of the EIP.

2.2 Summary Statistics


Overview of Tickets
In this section we present summary statistics from the dataset. Figure 2.3
below shows the number of tickets over time, by ticket category. We processed
142 tickets on Election Day, 22% of all tickets. The Election Day spike is likely
due to a combination of an increase in election-related online conversations on
November 3, significantly more EIP staffing on this day than previous days, and
what may have been our partners’ greater focus on reporting misinformation on
Election Day.

[Figure: ticket category frequency over time, smoothed by week, September 3 to
November 19, 2020 (tickets may have multiple categories). Categories tracked:
call to action, delegitimization, fraud, participation interference, premature
victory, procedural interference.]
Figure 2.3: Ticket category over time. Tickets may have multiple categories.
Out of the 639 tickets, 72% were categorized as delegitimization (content aiming
to delegitimize election results on the basis of false or misleading claims), 21% as
procedural interference (misinformation related to actual election procedures),
and 15% as participation interference (posts that include intimidation to personal
safety or deterrence to participation in the election process). We note that not
all tickets are created equal. Some tickets discussed misinformation that spread
far, while other tickets discussed misinformation that might not have been seen
by many.
While Chapter 3 will discuss the reach of specific narratives, Table 2.1 below
shows the relationship between ticket category and a rough measure of reach
that we estimated during the coding process. It suggests that most categories
of tickets had a similar distribution of reach, with the exception of fraud
narratives, which did not go as widely viral. However, we note that only five of
the tickets are categorized as fraud.

                             High: >100k   Medium: 1k–100k   Low: <1k
                             engagements   engagements       engagements   N/A
  Participation Interference     16%            40%              43%        1%
  Call to Action                 11%            42%              43%        4%
  Premature Victory              12%            52%              36%        0%
  Delegitimization               15%            49%              35%        1%
  Procedural Interference        11%            36%              50%        3%
  Fraud                           0%            20%              60%       20%

Table 2.1: Relationship between ticket category and estimated reach.

Segmentation of Misinformation by Platform and Region


After our last ticket was filed, we coded tickets to assess whether the narrative
appeared on one of the platforms we were tracking; of course, many narratives
appeared on multiple platforms. 77% of tickets appeared on Twitter, 46% on
Facebook, 13% on Reddit, 12% on Instagram, 12% on YouTube, and 8% on TikTok.
Other platforms, including Parler, 4chan, and Telegram, appeared in less than
5% of tickets. While it is useful to know that the tickets we handled were
primarily on the two large platforms—Twitter and Facebook—we caution that
these numbers should not be interpreted as “most misinformation appeared on
Twitter.” Facebook, Twitter, Reddit, and Instagram have reasonably accessible
APIs that made it easier for our team to find misinformation on their platforms.
The low percent of tickets for Parler, which is not as easy to observe, should not
necessarily be interpreted as Parler having less misinformation.
Many of the tickets discussed misinformation that appeared on websites
distinct from social media platforms, such as forums and blogs. The top
misinformation-spreading websites in our dataset were the far-right forum
thedonald.win, moved from the banned subreddit “r/The_Donald,” and thegate-
waypundit[.]com, a far-right news website. 65% of these tickets involved an
exaggeration of the impact of an issue within the election process.
We also coded tickets based on whether they targeted particular states (Fig-
ure 2.4 on the following page). 16% of tickets targeted Pennsylvania, 9% targeted
Michigan, and 7% targeted Washington. Many of our state-specific tickets were
reported by CIS, reflecting the fact that CIS forwarded reports by state and local
election officials, and that certain states sent in many reports while others sent
few or none.

Tickets by Tactics and Targets


We also coded tickets based on what tactics we observed being used:

• 49% of tickets involved an exaggerated issue.

• 26% of tickets involved an electoral process issue incorrectly framed as
partisan.

• 22% of tickets involved misinformation that was shared by verified users.

• 18% of tickets featured content taken out of context from other places or
times to create false impressions of an election issue.

• 17% of tickets involved unverifiable claims, such as friend-of-friend
narratives.

Figure 2.6 on page 35 shows the portion of tickets containing incidents or nar-
ratives that targeted different aspects of the electoral process. Not surprisingly,
tickets about voting by mail dominated tickets in September, while tickets about
ballot counting spiked during the week of the election.

[Figure: bar chart of the percent of tickets by state targeted (tickets can
target multiple states; only top states shown). From most to least targeted:
Pennsylvania, Michigan, Washington, Florida, California, Georgia, Wisconsin,
Arizona, North Carolina, Texas, Nevada, New York, Ohio.]
Figure 2.4: Percent of tickets by state targeted.

[Figure: tactic frequency over time, smoothed by week, September 3 to November
19, 2020 (tickets may have multiple tactics). Tactics tracked: exaggerated
issue, misleading stats, out of context, partisan but it’s not, privileged
info, shared by verified, well-intentioned misinfo.]
Figure 2.5: Tactic frequency over time.

[Figure: electoral process targeted over time, smoothed by week, September 3 to
November 19, 2020 (tickets may target multiple parts of the electoral process).
Targets tracked: ballot counting, drop-off boxes, generic fraud, in-person
voting, paper ballots, vote by mail, voting machines.]
Figure 2.6: Electoral process targeted over time.

Figure 2.7 below shows the actors targeted by the misinformation. The actors
most frequently targeted were political affinity groups (for example, Democrats
or Republicans, or Biden supporters) with 39% of tickets.

[Figure: percent of tickets by actor targeted (tickets can have multiple
actors). From most to least targeted: political affinity groups, government,
voters, poll workers, USPS, non-state political actors, media, the elite,
platforms.]
Figure 2.7: Percent of tickets by actor targeted.

Figure 2.8 below shows the proportion of tickets that made various claims about
the elections. 27% of tickets involved claims about illegal voting.2

[Figure: percent of tickets by claim (tickets can have multiple claims). From
most to least frequent: illegal voting, N/A, stolen election, ballots
lost/found, harvesting, violence, intimidation, ballots rejected, poll
watchers, claim victory.]
Figure 2.8: Percent of tickets by claim.
Last, we coded tickets based on whether they additionally related to COVID-19
narratives, or had an element of foreign interference. Interestingly, just 1% of
tickets related to COVID-19, and less than 1% related to foreign interference.

Tickets by Fact-Checking URLs


As the EIP monitored the information space for mis- and disinformation about
the 2020 election, analysts consulted published fact-checking resources to
assess various claims. 42% of the tickets included fact-checking URLs found by
analysts. The most common fact-checking sources were Twitter threads and
Facebook posts, often from official government accounts, Snopes, PolitiFact,
USA Today, the Washington Post, and CNN (in that order). The remaining 58%
of tickets consisted of misinformation that had low engagement and did not
manage to attract the attention of fact-checkers, as well as misleading claims
that were not easily falsifiable. Additionally, as noted above, some tickets were
about incitement to violence, a topic that does not lend itself to fact-checking.
Many tickets included more than one fact-check URL. In total, the dataset
included 925 fact-checking URLs.


Overall, among our tickets we found that higher engagement posts (those with
more than 100,000 interactions) contained fact-checking URLs more than posts
that had medium to low engagement: 34% of high engagement tickets contained
fact-checking URLs, compared to 25% for medium engagement tickets, and 18%
for low engagement tickets. EIP researchers also examined the relationship
between political ideology and fact-checking, and found that tickets that dis-
cussed only left-leaning accounts were as likely to contain fact-checking URLs
as tickets discussing only right-leaning accounts.

We also analyzed fact-checking frequency and approaches based on a number of
factors, including ticket category. Tickets categorized as “Call to action
for protest or mobilization” (often incitements to violence) were least likely to
include fact-checking URLs; this makes sense, as these types of tickets are less
likely to be appropriate for fact-checking.


2.3 Platform Responsiveness and Moderation Actions Taken
Of our 639 tickets, 363 tagged an external partner organization to either
report the content, provide situational awareness, or suggest a possible need
for fact-checking or a counter-narrative. For the tickets in which an external
organization was tagged, Figure 2.9 below shows which organizations were
tagged.

[Figure: percent of tickets by organization tagged (tickets can have multiple
organizations tagged). From most to least tagged: Twitter, Facebook, EI-ISAC,
Google, TikTok, Reddit, GEC, Common Cause, MITRE, DNC, D3P, Nextdoor, NAACP.]
Figure 2.9: Percent of tickets by organization tagged.
In the case where platforms were tagged, we measured the percent of tickets
that subsequently received a response from the platforms. A response indicated
that the platform confirmed that they were investigating the ticket. We believe
these response rates are lower bounds; it is possible platforms investigated
tickets, but did not respond on the Jira platform. In total, we believe the four
major platforms we worked with all had high response rates to our tickets
(Table 2.2).

             # tickets tagging    # tickets that
             organization         received response    Response rate
  TikTok            40                   36                 90%
  Google            46                   41                 89%
  Twitter          220                  185                 84%
  Facebook         158                  120                 76%

Table 2.2: Response rate by platform.
Figure 2.10 below shows the time between a ticket’s creation and the
platform’s response, over time. This data should be interpreted cautiously, as
often the ticket creator did not tag the platform; rather, a manager tagged the
platform once the ticket was reviewed. So occasionally a ticket was created
but the platform not tagged for several hours, or in some rare cases a few days.
As such, even if the platforms responded minutes after being tagged, and they
often did—particularly on Election Day—this data will not reflect this. However,
the data does suggest that the process got much tighter over time. This likely
reflects that the EIP shortened the time between ticket creation and platform
tagging, and also more engagement from the platforms.
[Figure: median time in hours between ticket creation and platform response,
averaged by week, September 3 to November 19, 2020, for Facebook, Google,
TikTok, and Twitter. Platforms were often not tagged until hours, or sometimes
days, after ticket creation for various reasons.]
Figure 2.10: Median time between ticket creation and platform response.

Each ticket that tagged a platform partner contained a list of URLs containing
the potentially violative content being spread—for example, the URL for a Face-
book post or YouTube video. These lists were typically not comprehensive, but
intended to highlight a few examples should the platforms decide to investigate
further. We developed a web scraping tool that visited each URL to determine
what action the platform (limited to Twitter, TikTok, YouTube, Facebook, and
Instagram) applied to the content, and ran it on all 4,832 URLs from the tickets
on December 7, 2020. The tool evaluated what a US-based individual would see
if they visited each URL using the Chrome browser on a desktop computer. For
Instagram and Facebook, the visitor was logged in to bypass “login walls.” We
found no evidence of different users observing different platform actions, so
the choice of user did not affect results.
The tool grouped each URL into four possible categories: “removed” when the
content was not available (most likely taken down by either the platform or the
original poster themselves); “soft block” when the content was only visible by
bypassing a warning (this action was only detected on Twitter); “label” when the
platform applied some kind of warning label to the content but did not hide the
content; and “none” when the platform took no detectable action. Due to the
opaque nature of platforms’ ranking algorithms, we were not able to directly
detect actions like “downranking.” Moreover, because platforms often employ
aggressive anti-scraping measures and frequently change their interfaces, it is
possible that the scraper incorrectly classified some URLs; in a random sample
of several dozen classified URLs, however, we found no errors. In this section
we will refer to whether or not platforms actioned URLs, but we note that we
cannot distinguish between a platform removing content or a user removing
content.
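As a rough sketch of this classification step, the snippet below shows one way a scraper could bucket URLs. The marker strings and the plain HTTP fetch are hypothetical stand-ins: the EIP tool’s actual detection logic is not reproduced here, and real platform markup differs by site, changes frequently, and typically requires a logged-in browser session (as described above for Facebook and Instagram) to render at all.

```python
import requests  # third-party; pip install requests
from collections import Counter

# Hypothetical marker strings; real platform markup differs and changes often.
REMOVED_MARKERS = ["no longer available", "page not found"]
SOFT_BLOCK_MARKERS = ["view anyway"]  # warning interstitial, Twitter-style
LABEL_MARKERS = ["about mail-in ballots", "election fraud is disputed"]

def classify_url(url: str) -> str:
    """Bucket a URL as 'removed', 'soft block', 'label', or 'none'."""
    try:
        resp = requests.get(url, timeout=15)
    except requests.RequestException:
        return "removed"  # unreachable is indistinguishable from deleted here
    if resp.status_code == 404:
        return "removed"
    page = resp.text.lower()
    if any(marker in page for marker in REMOVED_MARKERS):
        return "removed"
    if any(marker in page for marker in SOFT_BLOCK_MARKERS):
        return "soft block"
    if any(marker in page for marker in LABEL_MARKERS):
        return "label"
    return "none"

if __name__ == "__main__":
    urls = ["https://example.com/post/1", "https://example.com/post/2"]
    tally = Counter(classify_url(u) for u in urls)
    for action, count in tally.most_common():
        print(f"{action}: {count / len(urls):.0%}")
```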
We find, overall, that platforms took action on 35% of URLs that we reported to
them. 21% of URLs were labeled, 13% were removed, and 1% were soft blocked.
No action was taken on 65%. TikTok had the highest action rate: it actioned
(in every case, by removing) 64% of the URLs that the EIP reported to its team.
Figures 2.11 to 2.14 on pages 40–42 show the distribution of platform action
by ticket category, tactic, asset, and claim. Platforms were most likely to take
action on tickets that involved premature claims of victory; they took action on
these tickets about 45% of the time. They also frequently actioned URLs related
to election delegitimization and procedural interference. They were least likely
to take action on URLs about fraud, but we note that less than 1% of the URLs
had this category. URLs with procedural interference were most likely to be
removed.

[Figure: type of action (label, soft block, removal) by ticket category, all
platforms.]
Figure 2.11: Type of action by category.

Platforms were most likely to action URLs that shared misleading statistics, and
most likely to remove phishing content and fake official accounts.

[Figure: type of action (label, soft block, removal) by tactic, all platforms.
Tactics: misleading stats, out of context, partisan but not, well-intentioned
misinfo, exaggerated issue, livestream, privileged info, shared by verified,
phishing, fake official accounts.]
Figure 2.12: Type of action by tactic.

Figure 2.13 shows that there was little variation in platform action rates
across asset types.

[Figure: type of action (label, soft block, removal) by asset, all platforms.
Assets: unique text, template text, image without text, image with text, video,
website.]
Figure 2.13: Type of action by asset.

More than 50% of URLs that contained premature claims of victory, or claims
about the election being stolen, were actioned by platforms. About half of URLs
that contained unfounded claims about ballots being rejected were removed—
the claim with the highest rate of removal after incitement to violence.

[Figure: type of action (label, soft block, removal) by claim, all platforms.
Claims: premature claim of victory, stolen election, ballot rejections, illegal
voting, ballots lost/found, ballot harvesting, none of the above, poll
watchers, violence, intimidation.]
Figure 2.14: Type of action by claim.

2.4 Concerns by Reporting Collaborators

While 79% of tickets were created in-house, CIS reported 16% (N = 101) of our
tickets. Most reports from CIS originated from election officials. Compared to
the dataset as a whole, the CIS tickets were (1) more likely to raise reports about
fake official election accounts (CIS raised half of the tickets on this topic), (2)
more likely to create tickets about Washington, Connecticut, and Ohio, and (3)
more likely to raise reports that were about how to vote and the ballot counting
process—CIS raised 42% of the tickets that claimed there were issues about
ballots being rejected. CIS also raised four of our nine tickets about phishing.
The attacks CIS reported used a combination of mass texts, emails, and spoofed
websites to try to obtain personal information about voters, including addresses
and Social Security numbers. Three of the four impersonated election official
accounts, including one fake Kentucky election website that promoted a narra-
tive that votes had been lost by asking voters to share personal information and
anecdotes about why their vote was not counted. Another ticket CIS reported
included a phishing email impersonating the Election Assistance Commission
(EAC) that was sent to Arizona voters with a link to a spoofed Arizona voting
website. There, it asked voters for personal information including their name,
birthdate, address, Social Security number, and driver’s license number. Other
groups that reported tickets include the State Department’s Global Engagement
Center, MITRE, Common Cause, the DNC, the Defending Digital Democracy
Project, and the NAACP.


2.5 Final Observations


This chapter has focused on our ticket-level dataset, which offers a look at the
work of the EIP through the duration of our activity. In Chapter 3 of this report
we will delve into some of the narratives within the EIP tickets, examining those
that achieved the greatest reach or were instrumental for a significant duration
of the time leading up to, and following, Election Day.


Notes

1. (page 29) Cohen’s Kappa weighs chance in its scoring by evaluating the
probability of agreement and the probability of random agreement: Kappa is
calculated as the probability of agreement minus the probability of random
agreement, divided by one minus the probability of random agreement. With
this in mind, a Kappa value that is less than zero indicates that there is less
agreement than chance and is evidence that the taxonomy or intercoder process
is somehow flawed.
2. (page 35) “Political affinity groups” includes references to “the Democrats” or
“the Republicans” or particular politicians. “Government” refers to any govern-
ment entity. “Non-state political actors” includes groups like Black Lives Matter
or antifa. “The elite” references people like George Soros or Bill Gates. “Plat-
forms” references social media platforms like Facebook. Voters, poll workers,
USPS, and the media are self-explanatory.


Chapter 3
Incidents and Narratives: The
Evolution of Election Misinformation

3.1 Introduction
The 2020 election was the subject of hundreds of false and misleading claims
about voter qualifications, voting processes, and even the basic nature of Amer-
ican democracy. Some claims spread like wildfire across social media only to
fade just as quickly. Others circulated unnoticed for days or weeks before ig-
niting with lasting viral momentum. Sometimes, contradictory claims battled
for supremacy. Other times, they settled into a surreal coexistence. Some
of these claims would ultimately form the foundation of “Stop the Steal”—the
2020 election’s most expansive and enduring misinformation narrative, which
ultimately culminated in the January 6, 2021, insurrection at the US Capitol—
though it was a long and complicated journey.
In this chapter, we examine some of the 2020 election’s most noteworthy pieces
of election-related misinformation, exploring the character of these claims and
charting the messy process by which claims coalesced into broader narratives.
We also trace how one narrative gave way to another, forming a conspiratorial
canon that is likely to persist for many years to come. In order to identify and
differentiate these narratives, we consider the following questions:
What was the first claim that formed the basis of a given narrative? Was there a
precipitating event? How did the story develop? What pieces or types of content
helped shape it? How did the narrative echo and build upon the narratives that
preceded it? How did it bolster the narratives that followed it? Indeed, did it
fade away at all?
We begin the chapter with a discussion of our methodology and definitions.


From there, we explore the evolution of narratives in the 2020 election, following
their progression to the events of January 6. Then, we discuss the spread of
misinformation narratives in non-English communities, focusing on Chinese-
and Spanish-speaking Americans (foreign state-backed actors in the 2020
election are discussed in a separate text box). Finally, we examine the
obstacles these
dynamics posed to fact-checkers, and conclude with observations regarding
the narrative landscape as a whole.
Because the purpose of democratic elections is a transparent, regularized trans-
fer of political power, they are gravely endangered by misinformation narratives.
If citizens are made to feel that a vote was compromised or rigged, then the
election cannot be trusted. If the election cannot be trusted, then (at least in
the mind of the true believer of such narratives) the democracy itself is invalid.
Looking back on the election of 2020 and the January 6 attack, this chapter
addresses the resounding question: how did we get here?

3.2 Narratives: Methodology and Identification


Narratives are stories. They draw from a common set of building
blocks—characters, scenes, and themes—and assemble them in novel ways.
Good narratives inspire suspense and excitement in their audience.1 A suc-
cessful book, for instance, is one whose narrative clings to the imagination of
its reader. Similarly, a successful conspiracy theory is one whose narrative is
especially compelling and emotionally resonant—the audience itself is made to
feel that they are the protagonists in a story that only they can interpret and
understand.
In daily life, the creation of narratives is aided by a parallel process of framing.
Frames are mental schemas that shape how people interpret the world; they
highlight specific pieces of information, as Robert Entman writes, “in such a
way as to promote a particular problem definition, causal interpretation, moral
evaluation and/or treatment recommendation for the item described.”2 Framing,
i.e., the production of frames, is a process of selecting certain information and
providing a kind of scaffolding that shapes how people interpret a series of
events. (The process of framing will be explored in greater detail in Chapter 4.)
Viral misinformation works by decontextualizing and recombining real-world
events into compelling narratives with minimal regard for the truth. Some of
these narratives are “bottom-up,” in which a narrative emerges organically from
the post hoc interpretation of disparate events and claims, often beginning
with a single post by an individual user. Others are “top-down,” consciously
created and first disseminated by one or more powerful media or social media
influencers. Often, the reach and staying power of certain narratives becomes
clear only after the precipitating event has concluded. In complex events like
the 2020 election, multiple narratives can exist side by side, contradicting or
reinforcing each other and receiving widely variable attention.

The Election Integrity Partnership’s initial monitoring for voting-related
misinformation focused on claims, not narratives. Each of the 639 tickets in
the EIP
database was tied to a particular claim: a fake viral video of ballots being burned,
for instance, or an allegation that a Philadelphia poll watcher was improperly
barred from entering a voting precinct.

The work of narrative identification began on November 30, 2020, after the EIP’s
monitoring mission had concluded. We first grouped tickets into “information
cascades,” or incidents, tracing how a single real-world event (like a video of poll
workers collecting ballots in California) could generate a number of different
false claims, spread at different rates on different platforms by different actors.
After that, we grouped similar incidents together, collapsing them into a small
number of distinct narratives. In some cases, the narratives coalesced into
umbrella meta-narratives. These narratives formed the basis of the information
conflict that would consume the 2020 election.

3.3 The Evolution of Narratives in the 2020 Election
The most destructive misinformation narratives came in waves. As fresh events
presented themselves and public attention shifted, old narratives lent their
momentum and “evidence” to new ones; incidents were framed so as to “prime”
audiences to perceive future similar events as part of a broader pattern. This
meant that, while specific falsehoods and delusions might fade, they were never
truly forgotten. This process carried some Americans from their first exposure
to voting-related misinformation in the summer of 2020 all the way through the
violent, far-reaching conspiracy theories that compelled them to storm the US
Capitol on January 6.

In the lead-up to the 2020 election, misinformation centered on mail-in voting:
the destruction and discarding of real ballots and the “discovery” of fake ones.
Such misinformation typically took the form of misleading photos or decontex-
tualized video clips of crumpled mail allegedly found in dumpsters or abandoned
trucks. This misinformation was widely amplified by Republican politicians and
far-right operatives, including by the Trump White House. After the election,
public polling indicated a lack of trust in mail-in voting;3 while it is difficult to
state to what extent that was caused by the media and social media activity,
given the amount of misinformation about the process spread from the start,
this finding is not surprising.

Concurrently, other popular misinformation narratives suggested that the elec-
tion had been “stolen” before it even took place. Concerns about dispropor-
tionate mail-in voting by Democrats and disproportionate in-person voting by
Republicans led partisans on both sides to fear that there would be a manipula-
tion of votes on election night. The Trump campaign primed Republican voters
to expect wrongdoing by calling for an “Army for Trump” to safeguard the polls.
In turn, Democrats worried that polling places might be invaded by far-right
militias. And far-right activists argued that the United States was held in the
grip of a “color revolution” orchestrated by an imagined “Deep State” intent on
stealing the election.

On November 3 and immediately afterward, misinformation shifted to focus
on vote counting and tabulation. This was embodied by the #Sharpiegate
narrative, which alleged that poll workers were giving felt-tip pens to voters
in conservative precincts to render their ballots unreadable. Despite repeated
attempts to debunk it, the narrative found a receptive audience who set to
work flooding all social platforms, mainstream and niche, with the claim. After
#Sharpiegate gained viral traction, it drew hundreds of Trump supporters to
protest outside the recorder’s office of Arizona’s Maricopa County.

As millions of mail-in ballots were slowly counted and voting returns shifted
to favor Joe Biden, this anger and disbelief intensified. A growing swell of
misinformation narratives, including Sharpiegate, coalesced under the hashtag
#StopTheSteal, which spawned a movement of the same name. Some narratives
claimed that hundreds of thousands of deceased citizens had cast Democratic
votes; others suggested that Trump was one lawsuit away from victory. Together,
these narratives infused their followers with a sense of urgency and a call to
action.

As Stop the Steal grew in popularity over the next two months, its allegations
of legal and procedural fraud were supplemented by increasingly colorful, out-
landish conspiracy theories. Some claimed that Trump’s loss had been the work
of a CIA supercomputer commissioned by former President Barack Obama.
Others argued that Trump’s loss had been orchestrated by Dominion Voting
Systems, a company that was (falsely) tied to Bill Gates, George Soros, or even
the government of Venezuela. The more that these narratives took hold, the
further their believers slipped from reality.

Throughout the entire voting period, both Democrats and Republicans had been
consumed by fears of election-related violence—of the Proud Boys targeting
Black Lives Matter protesters or secret “antifa comrades” infiltrating conser-
vative polling locations. Outside of a surge in use of the #civilwar hashtag on
Twitter, however, little of this rhetoric translated into action in the immediate
aftermath of the election. Instead, the creep toward organized violence oc-
curred more slowly. It would explode with fury on January 6, 2021, changing the
course of American politics with it.

Ballot-Related Narratives
Setting the Stage for Ballot Irregularity Claims
The process by which votes were cast in the 2020 election was significantly
influenced by the global COVID-19 pandemic. By September, when the EIP
began monitoring election-related misinformation, nearly 200,000 Americans
had already died from COVID-19.4 In order to prevent COVID-19 transmission
at crowded polling places and to accommodate citizens who preferred not to
come to the polls, a number of states opted to expand the qualifications for
absentee ballots or to alter the vote-by-mail process. For example, dozens of
states significantly increased the use of ballot drop boxes.5

Changes to electoral processes and sometimes-unclear communications
regarding the changes created an ecosystem ripe for mis- and disinformation
around the mechanics of voting. Experts predicted that Democrats would rely
on mail-in voting more than Republicans,6 a reality that resulted in the rapid
politicization of the process and that stymied many attempts to make it clearer
or more accessible.7 Meanwhile, legitimate confusion about voting procedures
offered political activists, influencers, and politicians a receptive environment
to sow doubt in the integrity of the voting process as a whole.

General concerns related to mail-in ballots constituted the most prominent type
of misinformation assessed in the months before Election Day (see Figure 3.1 on
the next page), foreshadowing claims of mass irregularities and “found ballots”
following the election. The EIP processed tickets that included claims of mail
dumping; mistreated, shredded, or dumped ballots; non-eligible people casting
ballots (e.g., dead voters); ballots cast on behalf of others; and voting multiple
times by mail.

In this section we highlight two types of ballot-related narratives: “bottom-up”
misinformation rooted in real-world events reported by concerned individuals,
and “top-down” mis- and disinformation in the form of claims of hidden con-
spiracies first made by influencers and media personalities who had political
or financial incentive to spread falsehoods (see Figure 3.2 on page 53). For the
former, we present some claims related to found, discarded, and destroyed
ballots, primarily isolated instances of wrongdoing reframed and misconstrued
in partisan terms. For the latter, we discuss a video created by Project Veri-
tas (described below), shared widely by right-wing influencers, that claimed
the existence of widespread fraud in the form of ballot harvesting funded and
condoned by political elites.

Figure 3.1: The number of tickets that targeted different parts of the electoral
process over time, smoothed by week, September 3 to November 19, 2020 (tickets
may target multiple parts; categories include ballot counting, drop-off boxes,
generic fraud, in-person voting, paper ballots, vote by mail, and voting machines).
The spike of tickets occurred on Election Day.

Misinformation For and By the People: How Documented Incidents of Found,
Discarded, or Destroyed Ballots Became Narratives
Allegations of mail dumping—real or purported—can be used to mislead au-
diences in service of particular agendas, such as undermining confidence in
mail-in voting or advancing claims that the election is rigged. Narratives around
found, discarded, or destroyed ballots circulated through various platforms
before, during, and after the election. Though it is illegal for US Postal Service
(USPS) letter carriers or related partners to improperly dispose of mail, it does
sometimes occur. Overall, however, the USPS is overwhelmingly secure and
letter carriers face severe penalties for dumping mail, including jail time.8

The incidents in EIP tickets ranged from claims of a handful of ballots found on
the side of the road or under a rock to allegations of hundreds of thousands of
ballots lost at once in Pennsylvania. Mail-dumping narratives also connected
disparate real-world events, pulling them into a broader storyline in which these
were falsely portrayed as frequent occurrences, and in which each individual
incident was cited as further evidence of an irreparably corrupt and broken
system. The EIP team identified five techniques used to leverage these real
incidents for broader purposes:9

Figure 3.2: Pathways of top-down and bottom-up narratives between media and
social media. Top down: mainstream media run stories based on tips, investigations,
or coverage of government and institutions; influencers spread content on social
media, framing narratives with a partisan lens; hyperpartisan media create
commentary that fans amplify; and the general public picks up, shares, and
discusses content within and across communities. Bottom up: the general public
initiates narratives through collective sensemaking, often in groups; influencers
amplify posts and incidents; hyperpartisan media inflect stories and contextualize
claims into narratives; and mainstream media identify popular or trending topics
and prominent incidents, featuring them on broadcast channels such as nightly
news or print media.

• Falsely assigning intent: Acts that are not political are framed as political.
For example, a local mail-dumping event is falsely framed as specifically
targeting voters on one side of the political spectrum or a mail carrier
is identified as “Democratic” or “Republican” to suggest malicious intent.
Other times, too much significance is given to the demographics of the
locality in which an event occurs. Though these cases may at times contain
added falsehoods, often they will rely more on implication than assertion—
and are therefore hard to refute with fact-checking.

• Exaggerating impact: Real-life incidents are highlighted, selectively edited,
or otherwise exaggerated to give a false appearance of substantial impact
on election results or to suggest a widespread pattern of misbehavior.

• Falsely framing the date: Old events are reframed as new occurrences,
such as the recirculation of a 2014 video of a mail carrier dumping mail
accompanied by allegations that this was happening in the final weeks of
the 2020 election.

• Altering locale: Those disseminating the misinformation alter the locale of
an event to make it seem more relevant to an audience. For example, photos
from a Glendale, California, incident are reframed as having happened in a
different local community.

• Strategic amplification: In addition to false framing, the usual amplification
concerns apply, with the potential for honest or not-so-honest mistakes
about intent, actors, times, and locales to be amplified by domestic networks
of politically motivated accounts and possibly even foreign actors.

Allegations of deliberately destroyed ballots took various forms, including claims
of ballot boxes being lit on fire, mail-in votes being shredded, and foreign actors
stealing mailboxes. Occasionally there were legitimate claims, such as accu-
rate reports of attempted arson (one example was in Baldwin Park, California).
However, most were easily disproved falsehoods: for example, claims of shred-
ded ballots for President Trump in Pennsylvania in reality were unaddressed
applications for mail-in ballots.10
The earliest ballot-related story that the EIP collected and analyzed took place
within days of launching our monitoring effort in early September. The incident,
which occurred in Glendale, California, and involved improperly discarded mail,
was incorporated into a broader narrative focused on undermining trust in the
USPS and exaggerating the potential impact on the election of this and similar
events in California, Wisconsin, and other states. Throughout the election,
similar incidents of discarded mail (with and without ballots) were repeatedly
framed as fraud, particularly by hyperpartisan online media, and the specific
claims of individual stories were amplified and woven into other narratives
meant to cast doubt on the integrity of the election.

Glendale, California
In early September, a salon worker in Glendale, California, found multiple bags of
unopened mail in a dumpster and took video footage with her cellphone.11 There
is no evidence that any ballots were among the discarded mail; the American
Postal Workers Union stated the recovered mail would go through a verification
process and be delivered.12 However, politically motivated actors began using the
above techniques of falsely assigning intent, exaggerating impact, and strategic
amplification to falsely frame this situation in such a way as to undermine trust
in mail-in voting.
The incident was picked up by conservative influencers, including Charlie Kirk
and Adam Paul Laxalt. The image below shows a map of popular accounts
tweeting about the Glendale mail-dumping incident. The graph reveals an
imbalance between left- and right-leaning amplification: the conservative side
of the network had more posts than the liberal side and nearly three times as
many retweets. Conservative tweets claimed that this mail-dumping incident
proved that mail-in voting was not secure because of either incompetence or
deliberate sabotage by the USPS and thus should not be allowed. On the liberal

54
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 73 of 292 PageID #:
13642

3.3. The Evolution of Narratives in the 2020 Election

side, influencers promoted a different narrative—that President Trump was
deliberately sabotaging the USPS to reduce the number of Democratic votes—
and stressed the importance of preserving mail-in voting. As people lost faith in
the mail system, some on the left also used the narrative to push people to vote
in person or via drop boxes. This bottom-up misinformation, coming first from
concerned citizens and then amplified by influencers to, in turn, target average
platform users, is a tactic that the EIP would continue to see throughout the
election cycle. Overall, the story impacted the perception of the security of
voting by mail for both liberals and conservatives.
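
The imbalance described above reduces to a simple per-community tally over labeled tweet data. The following is a minimal sketch, not the EIP’s pipeline, assuming a hypothetical glendale_tweets.csv whose community and is_retweet columns mirror the follow-network labels in the figure below:

    # Minimal sketch: count original posts and retweets per political community.
    # glendale_tweets.csv is hypothetical, with columns: user, community, is_retweet (0/1).
    import pandas as pd

    tweets = pd.read_csv("glendale_tweets.csv")
    original_counts = tweets[tweets["is_retweet"] == 0]["community"].value_counts()
    retweet_counts = tweets[tweets["is_retweet"] == 1]["community"].value_counts()

    print(original_counts)  # the conservative side had more posts than the liberal side
    print(retweet_counts)   # and, per the report, nearly three times as many retweets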

Figure 3.3: The network of influential left- and right-leaning tweets and retweets
about the Glendale, CA, mail dump (September 13, 2020), showing conservative
follow networks, liberal follow networks, and mainstream conservative and liberal
clusters; the conservative side of the network had nearly three times as many
retweets. An animated version of this graph can be found in the EIP’s blog post,
“Emerging Narratives Around ‘Mail Dumping’ and Election Integrity.”13


Figure 3.4: Tweets by conservative influencers Matt Oswalt (left) and Charlie Kirk (right) amplifying
false narratives around the mail-dumping incident.

Greenville, Wisconsin
In late September, another incident of discarded mail—this one in Greenville,
Wisconsin—was used to sow doubt in mail-in voting and, in some cases, to claim
voter fraud. Local reporting at the time suggested that, in this case, the discarded
mail did include several ballots.14 (The Wisconsin Election Commission later said
the mail did not include any Wisconsin ballots;15 more recent reporting suggests
there was at least one ballot from Minnesota among the mail.16) However, as
in Glendale, California, strategic partisan actors distorted the significance of
this event through selective amplification, exaggerating impact, and falsely
assigning deliberate intent to purported Biden-supporting USPS workers.17 This
second story appeared within weeks of the first in Glendale. While there were
no absentee ballots in the mail-dumping case in Glendale, the seed had been
planted that voting by mail was not safe and secure. With this second case,
when several absentee ballots were actually found, pundits were able to point
to both cases as support for their claims around voting by mail and, eventually,
a rigged election. Throughout its monitoring period, the EIP saw many isolated
incidents that seeded narratives and that were later drawn upon as “evidence”
to clarify, refine, and reinforce larger narratives—a tactic that seemed to be used
frequently among right-wing influencers and networks.
This narrative spread almost exclusively through conservative networks, pushed
by influencers such as Charlie Kirk, The Gateway Pundit, and Breitbart News.
The graph below reveals how the claim cascaded through the Twittersphere
over time.
Alarmingly, this narrative made it all the way to the White House, when press
secretary Kayleigh McEnany stated “Mass mail-out voting... could damage
either candidate’s chances because it’s a system that’s subject to fraud. In fact,
in the last 24 hours, police in Greenville, Wisconsin, found mail in a ditch, and
it included absentee ballots.”20 The amplification techniques were effective in
sowing distrust in mail-in voting and the USPS at large, despite neither event
posing a real risk to the election results.

Figure 3.5: Network graph showing how narratives about the Greenville, WI,
mail-dumping event (September 25, 2020) spread primarily through politically
conservative and pro-Trump accounts. An animated version of this graph can be
found in the EIP’s blog post, “Emerging Narratives Around ‘Mail Dumping’ and
Election Integrity.”18

Sonoma, California
On September 25, a tweet claiming that over 1,000 ballots had been discovered
in a dumpster in Sonoma, California, further added to the narrative sowing
distrust in mail-in voting. Elijah Schaffer, a conservative influencer and verified
Twitter user, allegedly received photos of the mail-dumping incident. He posted
the photos on Twitter, and other influencers ensured their rapid spread across
conservative social media—including on Gab, Reddit, and Parler.


Figure 3.6: A cumulative graph of tweets about mail dumping in Greenville, Wisconsin. As shown in
the graph, early propagation of the tweet was driven by pro-Trump and MAGA-branded accounts.
Original tweets are green squares; retweets are blue circles. Further tweet content and author
information can be found within a large interactive version on the website in the endnotes.19
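
Cumulative curves like the one in Figure 3.6 can be rebuilt from any timestamped tweet collection. A minimal sketch with pandas, assuming a hypothetical greenville_tweets.csv with a created_at column (the plot call requires matplotlib):

    # Sketch: cumulative tweet count over time, as in Figure 3.6.
    import pandas as pd

    df = pd.read_csv("greenville_tweets.csv", parse_dates=["created_at"])
    df = df.sort_values("created_at").reset_index(drop=True)
    df["cumulative_tweets"] = df.index + 1  # the nth tweet observed so far
    df.plot(x="created_at", y="cumulative_tweets", drawstyle="steps-post")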

The photo used in Schaffer’s tweet was framed as evidence of potential fraud in
the 2020 election. However, the image was of empty envelopes—not ballots—
from 2018 that had been legally discarded.21 Influencers including The Gateway
Pundit, Tim Pool, and Donald Trump Jr. retweeted and quote tweeted Schaffer,
spreading the false narrative that this was an intentional dumping of ballots
with implications for the 2020 election, and reinforcing the larger narrative that
mail-in voting was not secure.

In this case, the misinformation was followed by a prompt fact-check from
Sonoma County on Twitter (see Figure 3.8 on page 60). On September 25, the
same day as Schaffer’s tweet, Sonoma County tweeted in English and Spanish
that the picture had been taken out of context to promote a false narrative
and properly identified the photo as containing empty envelopes from the 2018
election. This timely identification and correction serves as a model for state
and local officials. However, it also demonstrates the challenge in debunking
information that has already gone viral, as the original misinformation had
significantly larger engagement than the subsequent fact-check.

Figure 3.7: Left, earliest tweet on the Sonoma, CA, alleged mail-dumping incident;
right, example of an influencer sharing the tweet with a conspiratorial and
adversarial framing.

In each of these cases of “mail dumping,” a real-world event was falsely framed
to reinforce a broader narrative that undermined faith in the USPS and mail-in
voting. The graph in Figure 3.9 on page 61, showing spikes in Google searches
for “mail dumping” during these periods, suggests the effective amplification of
the narratives.
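
The search-interest series behind Figure 3.9 can also be pulled programmatically. A sketch of retrieving it using the third-party pytrends package (an unofficial Google Trends client; its availability and rate limits are assumptions, not guarantees):

    # Sketch: weekly Google Trends interest for "mail dumping" over the report's window.
    from pytrends.request import TrendReq

    pytrends = TrendReq(hl="en-US")
    pytrends.build_payload(["mail dumping"], timeframe="2020-01-01 2021-01-31")
    interest = pytrends.interest_over_time()  # values scaled 0-100 relative to the peak week
    print(interest["mail dumping"].idxmax())  # expect a peak week in late September 2020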

Misinformation from the Top: Ballot Harvesting Conspiracies


In the previous section, we described ballot incidents in which misleading in-
formation or misinformation based on real-world events emerged, bottom-up,
from ordinary users and was subsequently picked up by influencers and political
elites. In this section, we focus on ballot conspiracy theories—ballot incidents
that were framed as deliberately manipulative, with responsibility ascribed to a
powerful figure. These were often first introduced by elites or influencers, many
of whom had large numbers of followers on social media. Top-down, elite-driven
mis- and disinformation was a prominent feature of the 2020 election cycle; we
will discuss the specific mechanics of “blue-check” (verified) accounts spreading
claims across platforms in Chapter 4.

Figure 3.8: The County of Sonoma’s tweet fact-checking the false claim that ballots
were illegally dumped.

Ballot harvesting is the practice of a third party delivering an absentee or vote-
by-mail ballot on behalf of another voter; rules governing ballot harvesting vary
by state, and in most cases harvesting is not inconsistent with state law.22 Yet
it is both contentious and politicized. Its proponents argue that it increases
access for those who would otherwise have difficulty voting. Its opponents
contend that it increases the potential for fraud and point to historic cases of
wrongdoing.
The contention over ballot harvesting generally splits along party lines with
Democrats supporting the practice and Republicans opposing it. This was
evident in the run-up to the 2020 election as Republican leaders publicly claimed
it was rife with fraud. For example, in April 2020, President Trump tweeted

60
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 79 of 292 PageID #:
13648

3.3. The Evolution of Narratives in the 2020 Election

100

75
Frequency

50

25

0
Apr Jul Oct Jan
Data (2020)

Figure 3.9: Data from Google Trends of “mail dumping” from January 2020 - January 2021.The
two spikes in Google searches for “mail dumping” align with the events described above. The
first spike occurs the week of September 6-12, overlapping with the Glendale, CA, mail-dumping
story. The second spike occurs September 27-October 3, following both the Greenville, WI, and
Sonoma, CA, stories.

that ballot harvesting is “rampant with fraud,” garnering more than 250,000
likes.23 At the Republican National Convention in August 2020, President Trump
told a cautionary tale about the 2018 voter fraud case in North Carolina’s 9th
Congressional District—in which multiple people said a Republican political
operative paid them to collect absentee ballots from voters and falsely witness
a ballot.24 And when the New York Post published the story referenced below,
conservative influencers shared it on social media.

Additionally, in August 2020, the New York Post published an article in which
an unnamed Democratic operative described committing a range of alleged
electoral fraud practices that could impact an election.25 The EIP saw multiple
tickets, for example, alleging “granny farming,” in which workers who are sent to
nursing homes to help residents fill out ballots inappropriately guide the older
person’s vote or assign a vote without their input.

While there have been isolated incidents of actors abusing ballot harvest-
ing, there is no evidence to suggest it contributes to widespread voter fraud.
Nonetheless, confusion around the practice enabled the ballot harvesting trope
to flourish.26


Figure 3.10: President Trump’s April 2020 tweet alleging ballot harvesting is “rampant with fraud.”

Figure 3.11: Retweets by conservative blue-check accounts of the New York Post article alleging
mail-in ballot fraud. These images were saved after the election, which is why the Facebook labels
appear at the bottom of the posts.


Ballot Conspiracies: Project Veritas and Beyond


One example that illustrates how elite-driven mis- or disinformation can quickly
trend or garner engagement before platforms can react was a narrative that
revolved around a Project Veritas video released on September 27, 2020, claim-
ing that Representative Ilhan Omar was connected to wide-scale voter fraud
in Minnesota.27 Project Veritas, founded and led by conservative activist and
provocateur James O’Keefe, describes itself as an activist group that produces
investigative journalism seeking to expose corruption. It primarily targets left-
leaning organizations and political figures it believes to be anti-conservative.
The group is well known for its unorthodox journalistic tactics and style,
including infiltrating organizations and surreptitiously filming. It has faced
legal challenges and backlash for producing deceptively edited videos and em-
ploying unethical tactics while filming undercover. On February 11, 2021, Twitter
permanently suspended Project Veritas’ official Twitter account and temporarily
blocked James O’Keefe’s.28
The EIP observed Project Veritas to be a repeat spreader of false and misleading
narratives about the 2020 election (a designation discussed in more detail in
Chapter 5), generating a number of videos that were surfaced
by EIP monitoring and external partners throughout the course of the 2020
campaign and flagged as false or misleading by third-party fact-checkers.29 In
this specific case, O’Keefe released a 17-minute video along with a message that
began “Breaking: @IlhanMN connected to cash-for-ballots harvesting scheme.”
The video begins with an individual claiming that “money is everything.” He then
says his car is full of absentee ballots—showing what appear to be ballots on his
dashboard. As we describe below, these claims were found to be misleading by
independent fact-checkers.
Despite the unreliability of the Project Veritas video, it quickly gained ground.
The video went viral on multiple social media platforms, driven by right-wing
influencers. On Twitter, within the first 15 minutes after O’Keefe’s posting, at
least eight conservative influencers—including Ryan Fournier, Representative
Paul Gosar, Michelle Malkin, and Cassandra Fairbanks—shared the video. Project
Veritas was soon trending. Notably, Donald Trump Jr. appeared to separately
upload the same O’Keefe video within seven minutes of the original post, as it
was posted without a “From James O’Keefe” label. As the EIP described in a blog
post about the Project Veritas video, this suggested that the Trump campaign
may have had access to the video before the general public, raising questions of
coordination.30
The release of the video also seems to have fueled an increase in the use of
#ballotharvesting on Twitter, spiking after the video was shared on September
27. The hashtag primarily appeared in tweets in pro-Trump networks: more than
8,000 times, compared to 30 times in left-leaning and anti-Trump networks.


Figure 3.12: Timeline of the release of Project Veritas’s video about ballot
harvesting. Plant the seed (Sep 24, 2020): Project Veritas posted a video titled
“Project Veritas to release undeniable video proof of systemic voter fraud
09.28.2020.” Build anticipation (Sep 25-26): Project Veritas posted a second
YouTube video in which O’Keefe goes through multiple online articles that mention
liberals’ fears of O’Keefe’s revelation; O’Keefe ends the video, “there are a bunch
of people who are about to be ‘O’Keefe’d’. Stay tuned.” Video release (Sep 27):
(1) the New York Times released a deep-dive investigation into almost two decades
of President Trump’s taxes; (2) soon after, Mike Lindell, the chair of the Minnesota
Trump campaign with no known affiliation with Project Veritas, announced that
he had spoken with O’Keefe, who now planned to release the video that day at
9:00 pm ET/6:00 pm PT; (3) at 9:00 pm ET Project Veritas released the video, a day
before its widely promoted scheduled release time.

Figure 3.13: Project Veritas videos announcing the release of their video alleging voter fraud.


Figure 3.14: A visualization of the promotion of Project Veritas’s Ballot Harvesting video campaign
on Twitter. Original tweets are green squares; retweets are blue circles; quote tweets are orange
diamonds; replies are yellow circles; and retweets of quote tweets are red circles.

Figure 3.15: James O’Keefe’s Project Veritas video was quickly spread by right-wing influencers,
including Donald Trump Jr. and Ryan Fournier, on Twitter.


Figure 3.16: A cumulative graph of tweets after the release of Project Veritas’s video about ballot
harvesting. Growth of the narrative increased substantially after Donald Trump Jr.’s retweet (large
red circle) and subsequent original tweet (green square).

Meanwhile, on Facebook, according to data gathered using CrowdTangle, posts
with the term “Project Veritas” garnered 2.42 million interactions between
September 27—the date the video was shared—and October 3, 2020. The most
popular of such posts was President Trump’s, in which he shared Breitbart’s
report of the release and said that he hoped the US Attorney in Minnesota had
this under serious review.
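
Interaction totals of this kind can be reproduced against CrowdTangle’s posts/search endpoint. The sketch below is a hedged reconstruction, assuming a valid API token and the response shape CrowdTangle documented at the time; it fetches a single page, so the full 2.42 million figure would require paginating over all matching posts:

    # Sketch: sum Facebook interactions on posts matching "Project Veritas".
    # Assumes a valid CrowdTangle API token; response fields may differ by API version.
    import requests

    resp = requests.get(
        "https://api.crowdtangle.com/posts/search",
        params={
            "token": "YOUR_CROWDTANGLE_TOKEN",  # placeholder
            "searchTerm": "Project Veritas",
            "startDate": "2020-09-27",
            "endDate": "2020-10-03",
            "count": 100,
        },
        timeout=30,
    )
    posts = resp.json()["result"]["posts"]
    total = sum(sum(post["statistics"]["actual"].values()) for post in posts)
    print(total)  # first page only; full totals require pagination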

The video’s ballot harvesting claim was not well supported. As Maggie Astor of
the New York Times described several days later, the video “claimed through
unidentified sources and with no verifiable evidence that Representative Ilhan
Omar’s campaign had collected ballots illegally.”31 Minnesota’s FOX 9 reported
that the central source in the video claimed that Project Veritas offered him
$10,000 and used two separate Snapchat videos to “make it appear as if he
was illegally picking up ballots and offering money for votes.”32 Likewise, USA
Today wrote that “[t]here is no actual proof of fraud or any relationship between
individuals in the video and Omar or her campaign.”33 But the quick virality of
the claim allowed it to take root as a persistent narrative.

Figure 3.17: Left, President Trump’s tweet about the Project Veritas video (bottom)
in the trending list; right, President Trump’s Facebook post. These images were
saved after the election, which is why the Facebook “projected winner” label
appears at the bottom of the post.

The Project Veritas video is notable for two reasons. First, it shows how politically
motivated misinformants can capitalize on confusion; Americans were broadly
unaware of the details of third-party ballot collectors, allowing O’Keefe and right-
wing influencers to fill the gap with misleading and unverified information.

Second, it is an example of false or misleading information that was driven
top-down by verified accounts with large amplification capabilities. The video was
both created by right-wing influencers (O’Keefe and Project Veritas) and initially
disseminated by a network of right-wing social media users with large followings.
Top-down mis- and disinformation is dangerous because of the speed at which
it can spread; if a social media influencer with millions of followers shares a
narrative, it can garner hundreds of thousands of engagements and shares
before a social media platform or fact-checker has time to review its content.


The Impact of Ballot-Related Narratives


Mail dumping and ballot harvesting appeared frequently in the days leading up
to the 2020 election. Exaggerating the effect of found, discarded, and destroyed
ballots in the pre-election period may have laid the foundation for widespread
receptivity to allegations of similar fraud on election night. It can be difficult to
make empirical assessments of how online content affects real-world opinions;
there is usually little more than engagement data such as likes, shares, and
reactions to go on, and those may signal more about pre-existing beliefs than an
audience being persuaded by the evidence presented. However, a Pew survey
from November 2020 indicated widespread concern about mail-in voting:34

• Of the 54% of respondents who voted in person, approximately half cited
concerns about voting by mail as a major reason why they did.
• Only 59% of respondents answered that they were “very confident” that
their vote was accurately counted, as opposed to 71% in 2016, 68% in 2012,
73% in 2008, and 68% in 2004. This is the lowest response for a presidential
or midterm election in 16 years.
• Only 30% of respondents were “very confident” that absentee or mail-in
ballots were counted as intended.

While it is unclear what specific information sources or pre-existing beliefs
shaped public opinion on this issue, what is clear is that a large percentage
of the electorate was open to the claim that mail-in ballots were a potentially
significant source of fraud or irregularities. Vocal holders of these beliefs were
pivotal in shaping the conversation about the legitimacy of the election both on
and after Election Day, as we will explore throughout this chapter.

Election-Theft Narratives
In addition to ballot-specific misinformation, the pre-election period was
marked by narratives that laid the broad groundwork for claims of a stolen
election. This took the form of repeated and baseless allegations that voting
wouldn’t matter at all—that the election result was already decided or would be
decided by political elites looking to undermine democracy. Claims of an im-
pending “steal” were prominent in both left-leaning and right-leaning networks
prior to the election; one side claimed that Trump would steal the election,
the other that Biden would do the same.36 Some of these claims were spread,
top down, through the same network of online influencers as the ballot misin-
formation. Viewed retroactively, these were harbingers of the Stop The Steal
campaign that would grow into a significant movement after the election, before
ultimately erupting into violence.


Figure 3.18: A tweet from right-wing influencer Candace Owens after the election, supporting
claims of a rigged election and broad allegations of election fraud.35

The EIP tickets tracked three distinct narrative threads within the “stolen elec-
tion” meta-narrative prior to the election:

• The Red Mirage/Blue Wave: A weaponization of expert predictions that
election results would shift dramatically over time due to the timing of
counts for mail-in ballots; both the right and left leveraged the expert
predictions to claim the election would be “stolen” by the other side.

• Army for Trump: A real movement by the Trump campaign to solicit evi-
dence of election fraud from Trump’s supporters, based on the premise
that the Democrats were attempting mass voter fraud; this sparked a reac-
tion from the left, which alleged that the Trump campaign was trying to
lay the groundwork to steal the election away from Biden.

• The “Color Revolution”: An idea pushed by far-right activists that began
with the claim that America was experiencing a Deep State-backed color
revolution intended to undermine President Trump’s victory via a coup.

The overall meta-narrative of an impending stolen election, and the repetition
with which it was deployed, provided a frame that could be used to process
future events: any new protest, or newly discovered discarded ballot, could be
processed as additional proof that a “steal” was indeed underway, that there

69
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 88 of 292 PageID #:
13657

3. Incidents and Narratives: The Evolution of Election Misinformation

was a vast conspiracy to steal the election from President Trump, and that the
election results would be illegitimate.

The Red Mirage and the Blue Wave

Changes to voting procedures due to COVID-19 led to expert predictions that
more mail-in voting would take place in 2020 than ever before—and that mail-in
voting would skew heavily Democratic. Many election analysts predicted that this
would lead to an initial “red mirage” followed by a “blue wave” in some states:
the Democratic share (or proportion) of the vote would increase substantially
between early counts from day-of voting and final ballot totals, as mail-in ballots
were processed.37 In the weeks leading up to the election, analysts hypothesized
about which states would see a large blue shift, which would see minimal shifts,
and which might even go in reverse.38

Some right-leaning influencers and communities attempted to frame these
predicted shifts as preemptive evidence of a “stolen election” in two ways. First,
they pushed a false but largely non-conspiratorial narrative that mass aggregate
mail-in ballot fraud by individuals would be responsible for any shift. Second,
some offered a set of false, conspiratorial claims that there would be “ballot-
stuffing” on Election Day, asserting that there would be a systematic, coordinated
effort by local authorities to alter the election night vote via the addition of
forged ballots or the swapping of “real” ballots for fake ones. More conspiratorial
communities, such as QAnon adherents, argued that attempts by the press
and electoral experts to educate voters to anticipate voting shifts were in fact
evidence that elites were strategically planning to steal the election and were
attempting to inoculate voters to that reality (see Figure 3.19 on the next page).

The “red mirage” and “blue wave” narrative of election night shifts did ultimately
come to pass largely as experts predicted, with Biden taking the lead as mail-in
ballots were counted. It has become one of the most enduring narratives under-
pinning claims of a “stolen election,” weaponized by conservative influencers as
evidence that the Democrats supposedly delivered boxes of ballots to polling
places. In the weeks following the election, prior conspiratorial claims to expect
theft evolved into specific allegations of voting machine fraud and “found ballots”
in swing states that President Trump lost. Many of the influencers argued that
the “red mirage” had in fact been a “red tsunami” interrupted by Democratic
manipulation. Further, statistical misinformation (discussed in Chapter 4) began
to appear as influencers alleged that the “blue wave” occurred not because of
the predicted voting behavior and ballot-processing procedures, but rather due
to Democratic interference to steal the election, claiming that it had been an
illegitimate election from the outset.


Figure 3.19: Left, a QAnon-aligned post on Parler alleging that the “red mirage” was part of the
Democratic plans to steal the election; right, an article from conservative blog “Legal Insurrection”
arguing the same.

Army For Trump


Prior to the election, the Trump campaign spread the idea that the election
was going to be stolen by the Democrats, using ballot-fraud claims and other
procedural misinformation, as well as the weaponization of the red mirage/blue
wave prediction, to spark outcry among supporters.

On September 21, 2020, Donald Trump Jr. posted a video on Facebook calling on
supporters to join an “election security operation” the campaign called “Army for
Trump.” Citing concerns that the “Radical Left” was laying the groundwork to
steal the 2020 election, Trump Jr. asked supporters to sign up to join the Trump
campaign’s Election Day team through a site called “DefendYourBallot.” The
website recruited volunteers for general get-out-the-vote activities but also
asked if they had legal expertise and included a form where supporters could
report alleged election incidents directly to the campaign (see Figure 3.20).

This call to action was repeated by President Trump on Twitter and in the first
presidential debate in which he urged supporters to “go into the polls and watch
very carefully” for fraud.39 He also shared the link on his Facebook Page, urging
supporters to “Fight for President Trump”; the post was engaged with over
200,000 times on Twitter (see Figure 3.21 on the following page). Appealing to
volunteers to act as unofficial poll watchers was intended to motivate Trump’s
base, providing additional pathways to participation in the election. It also set


Figure 3.20: The Election Issue Report Form on the “Army for Trump” website.

the stage for untrained volunteers to amass “evidence” to support the type of
narratives discussed in the prior section of this report, in which misleading
claims were leveraged to allege systematic ballot fraud. Although we cannot
tell if the people who shared videos on Election Day and the weeks following
were officially part of the “Army for Trump,” there were multiple incidents in the
EIP ticket database that included video footage of supposed fraud that actually
documented innocuous events (e.g., video and photographic claims of ballot
theft that in fact showed reporters moving camera equipment).40

Figure 3.21: A graph showing Twitter engagement per hour (October 5-8, UTC)
with Trump’s October 5 call for poll watchers, his initial “Army for Trump” tweet,
reproduced below. Retweets in blue; Quote Tweets in orange; Retweets of Quote
Tweets in green.

The “Army for Trump” initiative helped create a vast trove of images, videos,
and stories of purported incidents that could be selectively chosen, falsely
framed, and fed into “voter fraud” narratives. It had one additional impact: it
sparked fear and outrage on the left. Left-leaning influencers claimed


that the “Trump Army” itself was an attempt by the campaign to steal the election.
They framed Trump’s calls to action as having the potential to incite violence at
the polls, fear of which might itself result in voter suppression. An analysis of
100 randomly selected quote tweets reacting to Trump’s call to be a “Trump
Election Poll Watcher” revealed significant concerns about the movement: only
four of the 100 expressed support for the call, while 49 asserted that Trump’s
call had the potential to incite violence at the polls on Election Day (such as the
quote tweets below).

Original Trump tweet: Volunteer to be a Trump Election Poll Watcher.
Sign up today! #MakeAmericaGreatAgain41

Quote tweet 1 (Oct 5, 2020): @jack @Twitter This tweet is encouraging
election violence. “Fight” and “Army” — those are bugle calls, not dog
whistles. Twitter, take down this tweet.42

Quote tweet 2 (Oct 5, 2020): To be clear, the president who has repeat-
edly encouraged political violence, said ”stand by” to heavily-armed
extremist groups, and repeatedly spread lies about voting procedures,
is now calling on his supporters to raise an “Army for Trump” at the
polls. Just so dangerous.43
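
The sampling-and-coding step behind those figures is simple to reproduce. A sketch, assuming the quote tweets were hand-labeled by analysts into a hypothetical coded_quote_tweets.csv:

    # Sketch: draw a reproducible 100-tweet sample and tally hand-coded labels.
    import pandas as pd

    coded = pd.read_csv("coded_quote_tweets.csv")  # columns: text, label (analyst-assigned)
    sample = coded.sample(n=100, random_state=42)  # fixed seed for reproducibility
    print(sample["label"].value_counts())          # e.g., incites_violence, supportive, other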

Left-leaning conversation therefore framed the “Army for Trump” as an attempt
to steal the election through the propagation of fear; this fear was heavily
reflected in mainstream media coverage (see Figure 3.22).

Figure 3.22: Media coverage of “Army for Trump.” Clockwise from top left: New York Times,
Refinery29, Washington Post, and Forbes.

Calls to join the “Army for Trump” thus fed into both left- and right-leaning
narratives. Right-leaning social media accounts pushed the idea that the election
would be stolen, to justify the need for the Army. Left-leaning accounts reframed
these public calls to watch as incitements to violence and as an intimidation
tactic with the potential for voter suppression.

Color Revolution
The narratives that the integrity of the election was being strategically and
intentionally undermined moved into the conspiracy realm with claims that a
Deep State was orchestrating a coup in a “color revolution.” The term was coined
in the late twentieth century to describe revolutions in which repressive regimes
tried to hold on to power after losing an election, spurring widespread domestic
protests for democratic change. But in 2005, autocrats in China and Russia began
to redefine the term away from its popular-activism origins, using it instead to
imply externally imposed regime change—in particular, regime change designed
to look like a popular uprising despite being furtively orchestrated by intelligence
services from Western democracies.44 Occasionally, Russian state media, such
as RT, ran op-eds insinuating that domestic protest movements in the United
States were in fact color revolution regime-change tactics. However, during
the 2020 election, the term was applied to American politics in a somewhat
unexpected way: prominent American conservative influencers suggested that
the US was experiencing a Deep State-backed color revolution intended to
steal the election from President Trump.45 The first major push to introduce
mainstream audiences to the narrative came from former Trump speechwriter
and prominent conservative commentator Darren Beattie, who wrote about the
theory and discussed it in podcasts in conversation with Steve Bannon, Michelle
Malkin, and Adam Townsend. Right-wing news site Revolver.News produced a
detailed series laying out his claims. On September 15, Beattie appeared on
Tucker Carlson Tonight, giving the narrative mainstream attention on a program
with an audience of millions.
The propagation of the color revolution narrative occurred over several months,
waxing and waning in popularity, but gradually gaining adoption as a frame to
explain grass-roots Black Lives Matter protests and voting irregularities as part
of an elaborate plan by Democratic operatives to steal the election.
After Election Day, use of the term “color revolution” spiked a few more times,
driven mostly by videos and posts that echoed the pre-election narrative, alleg-
ing that the “coup” had happened. Two of these spikes of activity, November
29-30 and December 11-14, seemed to revolve around tweets and posts by Lin
Wood, a defender of President Trump who prominently promoted various con-
spiracies to explain Trump’s loss.46
Wood’s tweets expanded the narrative of the color revolution to include possible
foreign interference from China, and went so far as to link COVID-19 to the
broader theme. The claims were shared to Facebook, Parler, and other social
media platforms.


Figure 3.23: Timeline of notable events related to the color revolution narrative
leading up to Election Day. Introducing the narrative (Aug 20, 2019): op-ed on RT,
“‘Color revolution comes home?’ Democrats target Trump with regime-change
tactics”; (Dec 22): first observed blog post broaching the idea of a color revolution
in the United States (with limited social media traction). “Super spreaders”
(Aug 11, 2020): Darren Beattie appears on Steve Bannon’s podcast. Mainstreaming
(Sep 15, 2020): Beattie appears on Tucker Carlson. Normalization (Oct 7, 2020):
“Q” uses the term in a “drop” post, expanding its reach in Q communities;
significant-influencer pickup (i.e., Glenn Beck) and mass-sharing dynamics follow
as users post stories and claims to Facebook Groups.

Figure 3.24: Interactions on posts involving the term “color revolution” post-Election Day, using
CrowdTangle data.

Figure 3.25: Tweets by Lin Wood that were shared on Facebook at the end of November and
mid-December about a “color revolution.”


The broad claim that a color revolution was underway, with nefarious actors
purportedly funding protests and destroying ballots, provided a convenient way
for those seeking to delegitimize the election to connect unconnected events
and to create a compelling villain while doing so. True to top-down narrative
dynamics, social media users who found the story appealing moved the narrative
into a variety of Facebook Groups and Pages through shares.
By promoting this narrative, right-leaning influencers appeared to be priming
their audience to read future events surrounding ballots and protests as po-
tentially part of that revolution. Accordingly, in the unsettled period between
Election Day and when the race was called by national media outlets for Biden
(and, it would turn out, even beyond that), believers of the color revolution
narrative were primed to accept challenges to the integrity of the election. As
we will see as we progress through this chapter, this would eventually manifest
in legal challenges that relied on affidavits from individuals primed to believe
election fraud narratives but with little to no knowledge of the ins and outs
of election procedures. Even on election night itself, the conviction that the
election would be stolen seems to have motivated voters in Arizona to latch
on to one specific claim—felt-tip pens had led to the mass disqualification of
the ballots of Trump voters—that would give rise to an online movement and a
real-world protest.

Case Study 1: #Sharpiegate


At 5:01 am PT on Election Day 2020, a conservative Chicago radio broadcaster
sent a tweet noting that felt-tip pens were bleeding through ballots.47 A few
hours later, at 12:16 pm PT, an anonymous Twitter account sent a tweet ad-
dressing James O’Keefe of Project Veritas: “Sharpie pen issues on Chicago paper
ballots,” it began, and alleged that scanners couldn’t read the ballots because
the markers were bleeding through.
These were the first tweets the EIP observed alleging that the black felt-tip,
Sharpie-brand markers some poll workers were handing out were rendering
ballots unreadable. The concern that ballots marked with Sharpie markers would
not be counted began to make its way around conservative communities on
social media. While the Chicago tweet did not gain much traction, the narrative
quickly spread to Parler and Facebook.
Local news in Chicago picked up the story and attempted to correct the record:
by 5:48 pm PT, CBS 2 had written an article reassuring voters that Sharpies were
just fine.48 The story faded in Chicago, but concerns about Sharpies at polling
places began to migrate across the country, popping up in tweets associated
with geographic locales that had become the focus of the vote count as the
evening progressed.


Figure 3.26: Left and top right, tweets about Sharpie pens on ballots in Chicago; bottom right, a
tweet about the same concern in Arizona.

One of these locales was Arizona. A video that originally appeared on Facebook
went viral: an off-camera videographer (the account name on the video suggests
the videographer is right-wing activist Marko Trickovic)49 is broadcasting his
conversation with a pair of women who are describing voting machines not
reading Sharpie-marked ballots. In the video, the women claimed that Maricopa,
Arizona, poll workers were trying to force voters to use Sharpies despite the
presence of pens. The man recording the video can be heard off camera stating,
“so they’re invalidating votes, is what they’re doing.” As the evening progressed,
and into the next morning, the video was reposted by numerous accounts and
appeared on YouTube, Twitter, Rumble, TikTok, Parler, and Reddit.

After West Coast polls closed and it became apparent that certain swing states—
particularly Arizona—were closer than polls had predicted, the controversy
about Sharpies was offered as an explanation. It became a hashtag, #Sharpiegate,
and various pieces of content alleged that poll workers were handing out the
markers deliberately to Trump supporters to prevent their votes from being
counted. “FRAUD IN ARIZONA. Dems are so desperate,” read one tweet from
10:14 pm PT on Election Day that had over 3,000 likes and retweets. The Maricopa
County Facebook Page seemingly tried to assuage concerns very early on; even
as the debate about Sharpies was largely still a Chicago concern, it posted a
PSA noting that Sharpies worked just fine on Maricopa’s machines.51 Despite
the attempts to debunk the developing controversy, however, concerned and
angry posts continued to appear within pro-Trump communities and channels
on Reddit, TikTok, Twitter, YouTube, Parler, and others.

Figure 3.27: The video of a woman stating that voting machines were not reading
ballots marked with Sharpie pens was shared on Twitter, YouTube, and Rumble.50
The Rumble video text reads “Maricopa County votes need to be counted by hand!
People were given Sharpies instead of ballpoint pens when Arizona voting machines
can’t read felt-tip marker. So ballots were entered into a slot but never counted.”

Fox News called Arizona for Joe Biden at 11:20 pm on election night, but overall
the day ended without a clear winner, as many election experts had predicted.52
President Trump gave a speech in which he noted that the races in Pennsylva-
nia and Michigan were still in play, suggested that Arizona was too, and then
declared:

“We did win this election. So our goal now is to ensure the integrity
for the good of this nation, this is a very big moment. This is a major
fraud on our nation. We want the law to be used in the proper manner.
So we will be going to the U.S. Supreme Court. We want all voting
to stop. We don’t want them to find any ballots at 4 o’clock in the
morning and add them to the list. OK? It’s a very sad, it’s a very sad

78
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 97 of 292 PageID #:
13666

3.3. The Evolution of Narratives in the 2020 Election

moment. To me, this is a very sad moment, and we will win this, and
as far as I’m concerned, we already have won it.”53

The Sharpiegate narrative continued to accelerate the next day, beginning early
in the morning on November 4. Content appeared on Facebook, Twitter, and
Parler alleging that Trump voters specifically had been given Sharpies to in-
validate their ballots. As the day progressed, conservative influencers such as
Charlie Kirk, Dinesh D’Souza, and Steven Crowder asked questions about the
controversy and retweeted claims made by users alleging fraud.
Local media and election officials in other swing states in which Sharpie markers
had been used, including Pennsylvania and Michigan, posted articles addressing
the use of Sharpies in their own jurisdictions, attempting to fact-check what
appeared to be turning into a widely disseminated conspiracy theory. The Cy-
bersecurity and Infrastructure Security Agency (CISA) posted its own update to
its Rumor Control webpage.54 Election officials reported inquiries to the Elec-
tion Infrastructure Information Sharing and Analysis Center (EI-ISAC), noting
that they were seeing posts alleging that voters who used Sharpies would not
have their votes counted. The Michigan Attorney General posted a tweet asking
members of the public to stop making threatening and harassing phone calls to
her staff suggesting they shove Sharpies into inappropriate places.55

Figure 3.28: Graph showing the spread of Sharpiegate tweets (cumulative) before and after Pima
County’s fact-check tweet.

By early evening on November 4, however, the Sharpiegate theory left the realm
of internet chatter and became a live-action rallying cry for Trump supporters
who felt the election had been stolen. Protesters, some open-carrying long
guns, gathered at the Maricopa County Recorder’s Office building, carrying
signs, waving Sharpies, and chanting about election theft. A local pro-Trump
Facebook Page, AZ Patriots, livestreamed the protests for several hours.56
Protesters returned on the evening of November 5 as well. This time, well-
known conspiracy theorist Alex Jones of Infowars showed up, climbed atop a
car, and gave a speech about “meth-head Antifa scum,” George Soros, and stolen
elections, occasionally chanting “1776.”57

Figure 3.29: Alex Jones at a Sharpiegate protest outside the Maricopa County Recorder’s Office
on November 5.58 (AP Photo/Matt York)

Throughout the three-day period in which Sharpiegate was a significant focus,
the social platforms responded primarily by labeling and taking down content.
The AZ Patriots livestreaming Page, which had been embroiled in controversy
over leader Jennifer Harrison livestreaming harassment on several occasions
(generating Facebook strikes), was taken down.59 Other livestreamers, however,
such as Steven Crowder on YouTube, discussed Sharpiegate or ran the footage
from the protests, and did not receive any labeling or contextualization until well
after the fact; as we will discuss in Chapter 4, this is one of the unique challenges
of moderating a livestream compared to a text article.60
Sharpiegate provides a detailed look at how a misunderstood incident about
ballots compounded into a narrative among voters primed to believe that the
election would be stolen. As the ballot counts continued in the days following
Election Day, the predicted blue shift indeed began to materialize, and allega-
tions of fraud and demands for recounts began to increase. Sharpiegate became
one narrative among many that fed into a meta-narrative, the slogan for which
would come to define the 2020 election for many Trump voters: “Stop The
Steal.”61

Figure 3.30: A CrowdTangle dashboard showing results sorted by most interactions for “arizona
sharpie” in public Pages, verified Pages, and public Groups, beginning on November 3, 2020.
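The dashboard in Figure 3.30 was built from a simple keyword search. For
readers who want to reproduce that kind of query, the sketch below shows a
minimal request against CrowdTangle’s /posts/search endpoint in Python; the
endpoint, parameter names, and response shape reflect our understanding of
CrowdTangle’s public API documentation rather than the EIP’s own tooling, and
the API token is a placeholder.

import requests

API_TOKEN = "YOUR_CROWDTANGLE_TOKEN"  # placeholder; search access is assumed

# Search public posts mentioning "arizona sharpie" from November 3 onward,
# sorted by total interactions (the ordering shown in Figure 3.30).
resp = requests.get(
    "https://api.crowdtangle.com/posts/search",
    params={
        "token": API_TOKEN,
        "searchTerm": "arizona sharpie",
        "startDate": "2020-11-03",
        "sortBy": "total_interactions",
        "count": 100,  # maximum page size
    },
    timeout=30,
)
resp.raise_for_status()

# Each post carries per-metric interaction counts under statistics.actual.
for post in resp.json()["result"]["posts"]:
    actual = post.get("statistics", {}).get("actual", {})
    total = sum(v for v in actual.values() if isinstance(v, int))
    print(total, post.get("message", "")[:80])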

Case Study 2: #StopTheSteal

Figure 3.31: Timeline of the Stop the Steal narrative.
- Pre-election (autumn 2016): Roger Stone uses the phrase “Stop the Steal.”
- Pre-election (2020): for several months leading up to the election, pro-Trump
  influencers, pundits, media outlets, and Pres. Trump use the phrase.
- Election (November 4, 2020): Stop the Steal is used in the context of election
  results; it goes viral on social media.
- Post-election (November-December 2020): Stop the Steal protests occur across
  the country.
- Post-election (January 6, 2021): Stop the Steal rally event near the US Capitol
  turns into violent insurrection and breach of the Capitol building.

In the days after the election, “Stop the Steal” grew to encompass not only the
events related to Sharpiegate, but the broader overall theme that the election
had been decided by fraud. This rallying cry was aimed at motivating Republicans
and Trump supporters to halt purported Democratic electoral malfeasance. It
became a hashtag across multiple platforms, an encompassing and enduring
phrase. In the weeks following the election, the narrative took a distinctly
conspiratorial turn.

At its core, #StopTheSteal falsely postulates that Trump actually won the presi-
dential election, that Democrats stole the election, and that it is up to Republican
“patriots” to reverse this—i.e., to stop the Democrats’ theft. In the days following
the November 2020 election, the call was repeated by prominent conservative
influencers (including President Trump),62 and grew into a broad Stop the Steal
movement that attracted a significant presence offline as well. The phrase ap-
peared in real-world protest organizing materials and in signs at protest events.
In mid-December, over a month after Election Day, Stop the Steal rallies were
still occurring in the US; in January 2021, a protest with that slogan erupted into
a violent insurrection at the US Capitol.


The “Stop the Steal” phrase itself was seeded far ahead of the 2020 presiden-
tial election. Conservative political strategist Roger Stone launched a “Stop
the Steal” movement in 2016, according to a CNN article; his Stop the Steal
“voter protection” project was sued in federal court for attempting to intimidate
minority voters.63 However, in a November 2020 blog post on his personal
website, Stone took pains to clarify that he was not “a participant in any of the
organizations that adopted my phrase in this year’s election.”64 In another blog
post he repudiated the CNN article that referenced him, though he shared an
image advertising an Atlanta #StopTheSteal rally supported by StopTheSteal.us
(at that point at the forefront of “Stop the Steal” organizing in 2020).65

Figure 3.32: The Stop the Steal rally advertisement posted on Roger Stone’s website.

#StopTheSteal was used sporadically leading up to Election Day: for example,
after some states (Pennsylvania, Michigan, Wisconsin, and North Carolina) ex-
tended the date by which they would accept mail-in ballots, right-wing Twitter
accounts used the hashtag while denouncing the changes and calling for ac-
tion. Notably, Ali Alexander—a right-wing personality who would later help
organize the Stop the Steal movement and amplify numerous sub-themes and
conspiracies—was one of the actors who tweeted about the four states’ mail-in
ballot extensions using #StopTheSteal on September 22.66 In a retweet of an-
other pro-Trump account, Alexander and the original poster framed the states’
move as “favoring Democrats”: “They’re stealing this election in broad daylight.
Extending mail-in deadlines, harvesting… We need massive #StopTheSteal
protests all across the country!”67

Right-wing media ecosystems were also early adopters of this hashtag. Multiple
September articles on The Gateway Pundit mentioned “Stop the Steal”; one
article included a poll asking readers, “Do you think Democrats are trying to
steal the election?” and another used the hashtag #StopTheSteal in reporting
on Trump supporters allegedly being blocked from observing Philadelphia early
voting locations.

Figure 3.33: Tweets using #StoptheSteal on September 22.68
Members of the Trump family on Twitter, and President Trump himself, also
pushed the concept of a “steal” early in the election season.70 While these tweets
did not all mention #StopTheSteal explicitly, they nonetheless helped foster
the cohesion of several disjointed narratives into one conceptual framework of
election theft.
Prior to the election, early uses of #StopTheSteal touched on a range of
themes described in this chapter: ballot harvesting, mail-in voting, Trump win-
ning but Democrats stealing the race, Army for Trump, the need for poll watchers,
a rigged election, and more. The repeated priming of Trump-supporting audi-
ences to believe that the election had been stolen likely helped to bolster the
Stop the Steal movement as it further bloomed after the election.
On Election Day, as results did not break in the President’s favor, prominent con-
servative influencers quickly took up Stop The Steal. By the evening of November
3 and November 4, verified Twitter users, including recently elected officials and
right-wing media outlets, were repeatedly pushing the Stop the Steal narrative
online. Some of these narratives were accompanied by more specific claims
about individual state irregularities (such as alleging they were not counting
ballots), while others were more general statements that the Democrats could
not have won the election fairly.

Figure 3.34: The Gateway Pundit articles mentioning “Stop the Steal” in September. Left, a
September 22 article and poll. Top right, an article from September 24. Bottom right, an article
from September 29.69

Besides conservative pundits, a handful of conservative politicians began to
amplify #StopTheSteal immediately after the election. One was Marjorie Taylor
Greene, a Republican Congressional candidate in Georgia who won her race for
the House of Representatives. Greene leveraged multiple social media platforms
simultaneously to spread Stop the Steal messages and promote herself. She
posted a Stop the Steal petition on both Facebook and Twitter that, once com-
pleted, redirected to a donation page. The petition spread in various Facebook
Groups, including an anti-Whitmer Michigan Group.

Stop the Steal Groups on Facebook were created at least as early as November 4,
2020. One Group, STOP THE STEAL, quickly swelled to hundreds of thousands of
members. In addition to providing a place where users shared election-related
conspiracy theories, the Group served as a hub to find various Stop the Steal
rally Facebook events across the country, some hosted by other entities. This
primary Group was shut down by Facebook on November 5 at 2:00 pm ET, with
media reports suggesting it was due to content inciting violence;71 data from
an EIP CrowdTangle archive shows that it had at least 7,000 posts with slightly
over 800,000 interactions.

Figure 3.35: A collage of some of the top conservative pundits using #StopTheSteal on November
3 and 4. On TikTok, a user filmed a live video of Charlie Kirk using the hashtag #StopTheSteal,
indicative of the cross-platform nature of this content.

Figure 3.36: Left, a post from Representative Marjorie Taylor Greene, who heavily promoted
#StopTheSteal. In one of her posts, the petition led to her donation site, right.

Figure 3.37: Image of several posts in the STOP THE STEAL Group on Facebook. The event page
of the Group listed upcoming events in different locations hosted by various entities.

Facebook Groups like STOP THE STEAL helped solidify the Stop the Steal move-
ment’s offline component. For example, on November 5, Facebook events were
scheduled for locations including California; Virginia; Washington, DC; Pennsyl-
vania; and Florida. StopTheSteal.us—a website created by Ali Alexander—and its
newsletter also helped to rally people to different locations around the country.
Inflammatory rhetoric was common; for example, in a since-removed tweet on
December 7, Alexander tweeted that he was “willing to give my life for this fight.”
The Arizona Republican Party retweeted, adding, “He is. Are you?”72
Coverage of Stop the Steal in conservative media outlets varied. In the first
two weeks after the election, Fox News had two article headlines mentioning
Stop the Steal in the context of news items (Facebook’s STOP THE STEAL Group
takedown and an incident at a rally).73 In contrast, more niche right-leaning
fringe outlets covered it uncritically, and at times seemingly supportively; for
example, on One America News Network (OANN), coverage of Stop the Steal
included a since-removed article outlining how voters were holding Stop the
Steal rallies in multiple states because of alleged election irregularities.74 The
outlet had steady coverage of the movement, telling viewers how to rally and
broadcasting an exclusive interview with organizers declaring that they would
“Fight on.”75
Stop the Steal rallies at times morphed into broader pro-Trump post-election
protests—for example, the Million MAGA March in DC on November 14 was heav-
ily promoted by Stop the Steal influencers, and the insurrection on January 6 was
promoted by StopTheSteal.us. In an email on December 21 from StopTheSteal.us,
the January 6 protest was heavily advertised, stating, “#StopTheSteal wants
ALL Patriots to stand up with us in D.C. on what should be a historic day, Jan-
uary 6, 2021…StoptheSteal.us stands ready to FIGHT BACK with this historic
protest…we will NOT ALLOW our Republic to be stolen from us!” (bolding
theirs).

Figure 3.38: Image of a December 21 email from the StopTheSteal.us newsletter.

The Stop the Steal movement’s enduring power likely stems from several factors.
The phrase encompasses the various other false claims and narratives pushed
about the election, providing an opportunity for many constituencies to gather,
both virtually and in real life, under one banner. Stop the Steal content
spread not only on Facebook, but also on Twitter, Parler, and Telegram. Because
of the many figures pushing the narrative across social media and on websites,
the movement was robust enough to survive individual takedowns of misleading
electoral content and targeted deplatforming.

#Maidengate
Many narratives co-occurred with Stop The Steal, alleging a variety of forms
of voter fraud. Some of them rehashed allegations made in elections past;

88
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 107 of 292 PageID #:
13676

3.3. The Evolution of Narratives in the 2020 Election

for example, the hashtag #DeadVoters claimed that dead people had voted in
the election via mail-in ballots (this peaked on November 11, 2020). Another
hashtag in this vein was #Maidengate,76 which began on November 9, 2020, via
hashtagged tweets from an account alleging that a Michigan mother’s vote had
been stolen by an impersonator using her maiden name. The poster claimed
to know several people who had discovered that a ballot in their name had
been cast in another state. She described this as intentional fraud, and called
on voters to check if they had been registered in multiple states due to past
addresses or name changes.
The claim of mass manipulation via maiden names, absent any evidence besides
anecdotal hearsay, was subsequently promoted on Twitter by Ali Alexander, who
created a website dedicated to the hashtag to try to collect evidence of voter
fraud. He promoted the Maidengate conspiracy on Periscope, gathering 41,000
viewers. #Maidengate chatter and content from the original tweeter’s website
appeared on Reddit and Facebook77 and the hashtag appeared approximately
1,800 times on Parler. By November 12, the Twitter account was suspended.

Figure 3.39: Left, tweets that precipitated #Maidengate; right, Ali Alexander’s tweet promoting
the Maidengate conspiracy.

#Maidengate went sufficiently viral that it generated attention from major media
outlets focused on debunking election misinformation, including the New York
Times. As the Times noted,

“Soon, the claim that unauthorized people had cast votes under the
maiden names of real voters started trending online. From Monday to
Wednesday morning, more than 75,000 posts pushing #MaidenGate
appeared on Twitter, peaking at 2,000 between 2:10 and 2:15 a.m. on
Tuesday, according to Dataminr, a tool for analyzing social media
interactions. Beyond Twitter, the #MaidenGate rumors spread to
Facebook, YouTube and groups associated with Stop the Steal, which
have promoted the false narrative that Democrats stole the election
from President Trump. But no evidence was offered to support the
#MaidenGate claim in the original tweet. The tweet included no
details on the maiden name that supposedly had been stolen, so
there was no way to verify the claim.”78

We will discuss the specific mechanics of how these types of bottom-up “friend-
of-a-friend” narratives spread in more detail in Chapter 4. We include Maidengate
here as an example of the way in which many sub-components of the Stop the
Steal narrative were based on unverifiable claims recast as facts.
The claims based on alleged voter irregularities, however, were at least rooted
in the realm of the plausible. There was another collection of narratives, repur-
posed to explain how the “steal” took place, that were far afield of mainstream
reality, yet were still amplified on national television by some of President
Trump’s closest advisors: outlandish election conspiracies in which powerful
dark forces purportedly conspired to steal the election using secret “Deep State”
technologies to change votes.

Outlandish Claims: Attempts to Explain the “Steal”


Conspiracy theories have increased in visibility in online social spaces over the
last five years; prominent among them is the cultlike conspiracy theory known
as QAnon, which alleges that President Donald Trump spent much of his presi-
dency battling a cabal of Satan-worshipping pedophile elites. Believers of this
conspiracy are estimated to number in the low millions and many are supporters
of President Trump.79 In this section, we discuss two specific conspiratorial
narratives that attracted significant attention in the weeks and months following
the election: the first, which we will call “Hammer and Scorecard,” began years
prior to the 2020 election. The second, which we will call “Dominion” after the
election technology company that figures prominently in the narrative, rose to
prominence alongside allegations of irregularities in voting machines. However,
it merged with the Hammer and Scorecard theory to create a hybrid conspiracy
that spread throughout pro-Trump social media spaces. After Election Day,
these conspiracies were deployed within the stolen election meta-narrative to
“expose” the machinations behind the purported theft.

Hammer and Scorecard


In 2017, little-known conservative blog TheAmericanReport.org published a
story claiming that a government supercomputer called “The Hammer” was
created in 2009 by the CIA under President Obama.80 The article claimed that
the supercomputer was designed for spying on, and gathering data from, the
American public and conservative politicians, including Donald Trump. This ma-
chine supposedly included an application called “Scorecard,” which was capable
of manipulating election systems by switching votes to preferred candidates.
The claims underlying the story were made starting in 2013 by Dennis Mont-
gomery, described as a “CIA contractor-turned-whistleblower” who claimed to
have built the system. Various election results worldwide, and in the United
States, were attributed to the work of Hammer and Scorecard. As the conspiracy
re-emerged, updated for the events of 2020, fact-checking organizations and
CISA repudiated its claims; some pointed out Montgomery’s “history of deception.”81

Dominion
Early coverage of Dominion Voting Systems occurred within the general dis-
cussion of electoral integrity, though mentions of the company appear to have
taken off in earnest after two actual software glitches on Election Day in Georgia
counties were tied to Dominion software.82

In Morgan County and Spalding County, Georgia, outages in electronic poll
books temporarily prevented voters from using voting machines on Election Day,
resulting in extended voting hours.83 While the electronic poll books (the lists
of eligible voters in a precinct) were manufactured by Knowink, a subcontractor
of Dominion, the usage of Dominion Voting machines in these counties would
later lead to accusations of widespread faults in Dominion’s software.

The next day, a series of reports emerged alleging voting irregularities in Antrim
County, Michigan, again tied to Dominion: as votes were being reported, several
thousand votes in the county were incorrectly reported for Joe Biden rather
than Donald Trump.84 This error was quickly noticed and resolved. While it
would later be attributed by the Michigan Secretary of State to human error,85
narratives soon emerged that Dominion’s software, which was used to tabulate
these results, was responsible for the glitch. Prominent verified influencers on
social media began explicitly linking this incident to a broader conspiratorial
narrative saying Dominion voting systems were manipulating vote counts all
over the country.86

As the Dominion issues were occurring, a since-deleted video grew popular, fea-
turing retired General Tom McInerney claiming the “Scorecard” application had
been used by the Obama-Biden campaign in 2012 to steal votes in Florida, and
was now being deployed by the Biden-Harris campaign in Florida, Georgia, Texas,
Pennsylvania, Wisconsin, Michigan, Nevada, and Arizona. Other YouTube chan-
nels such as SGTreport and CDMedia made similar claims, alleging a conspiracy
to use technology to steal votes.87 The videos spread to Facebook, Twitter,
Reddit, and Parler, and were republished on alternative video platforms such as
Rumble and BitChute. At this point, though, the two narratives were still largely
on separate tracks.
On November 6, GOP Chairwoman Ronna McDaniel alleged that there had been
fraud large enough to overturn Michigan’s election results, citing the Antrim
County reporting error and suggesting that 47 other counties in Michigan using
the same software may have been affected.88 Disputing McDaniel’s claims,
the Michigan Secretary of State released another statement reiterating that
the reporting incident was human error that had been caught by the county’s
processes and quickly resolved, and that no other counties were affected.89
Concurrently, however, conservative media outlets and influencers began noting
that Dominion software was used in 30 states, including all swing states, to imply
nationwide malfeasance on the part of Dominion. Articles in The Gateway
Pundit and Breitbart connected the Michigan and Georgia incidents to suggest
that the two cases were related.90 The Breitbart article received
upwards of 300,000 interactions on Facebook alone, and was posted by President
Trump.91 Similar claims of widespread flaws were shared by influential right-
wing individuals and groups such as The Western Journal and Mike Huckabee,
and in Spanish by Mexican author Alfredo Jalife-Rahme.92

Intersection of the Narratives


The Dominion narrative merged with the Hammer and Scorecard theory after
Trump campaign attorney Sidney Powell went on Fox News with Lou Dobbs
on November 6 and spread a now-disproven theory claiming that the software
glitch that caused erroneous vote counts in Michigan was in fact the deliberate
work of the “Hammer and Scorecard” program.94 Powell, who was later dis-
avowed by the Trump campaign after a series of scathing legal rulings in cases
she helped litigate, gained credence in the Trump orbit for her willingness to
promote unsubstantiated fraud theories.95 Powell claimed that the purported
CIA technology altered 3% of the vote total in pre-election ballots that were
collected digitally.
The converging narratives were amplified by conservative website The Gateway
Pundit, which quoted Powell at length.96 Similar claims appeared on Trump-
supporting media channels such as OANN. While the Dominion and Hammer
and Scorecard narratives initially were amplified together, after November 6 the
Dominion narrative subsumed Hammer and Scorecard (see Figure 3.41): mentions
of the latter dropped off precipitously, while the former remained significant.

Figure 3.40: Tweets pushing the Dominion conspiracy, including one from President Trump.93
Once the Dominion narrative subsumed the Hammer and Scorecard narrative,
Donald Trump and his campaign quickly became the most prolific spreaders.
President Trump first tweeted about Dominion on November 12 (“REPORT:
DOMINION DELETED 2.7 MILLION TRUMP VOTES NATIONWIDE”), and tweeted
dozens more times in the days following. Rudy Giuliani repeated similar claims
on November 11 and the days after.97
Figure 3.41: Mentions of Hammer and Scorecard (blue) were initially linked to mentions of
Dominion (yellow), but were eventually consumed by the Dominion narrative. (Source: Meltwater
Social)

For weeks after the election, the Dominion narrative persisted and was adapted
into ongoing narratives around electoral fraud by a variety of communities. One
video (on YouTube, Rumble, and Reddit) purporting to feature a “smoking gun”
regarding Dominion Voting Systems machines in Pennsylvania was widely shared
by high-profile accounts in Trump- and QAnon-supporting communities nearly
four weeks after the election.98 Another theory suggests Smartmatic, another
technology company, was orchestrating Dominion’s supposed interference.99
Yet another suggests several USB memory cards containing the cryptographic
key to access Dominion Voting Systems were stolen in Philadelphia.100 These
theories, which have been amplified using #StolenUSBs and #Mitattack, were
published by various outlets, including Russian state media outlet Sputnik In-
ternational (which credulously reported the claims of 8kun administrator and
QAnon aficionado Ron Watkins, calling him a “US cyber-security expert”), and
were repeatedly amplified by the President on Twitter.101

The claims became increasingly outlandish. Allegations appeared claiming Do-
minion had ties to individuals frequently scapegoated by conservatives, including
Bill Gates, George Soros, and even members of the Venezuelan government.103
Others alleged Dominion had links to China, posting URLs to the US Patent and
Trademark Office website featuring a licensing agreement between the company
and the bank HSBC.104 The same day that news broke of Russia-attributed
cyberattacks on US government infrastructure using vulnerabilities in SolarWinds
software, The Gateway Pundit published a piece claiming Dominion used the
same software, a claim that was quickly denied by Dominion representatives.105
Both Dominion and Hammer and Scorecard have also been used as key pieces
of evidence for the “Kraken” narrative in which Sidney Powell would “release
the Kraken” by dropping indisputable evidence of voter fraud in lawsuits led by
the President’s legal team, as well as by the broader Stop the Steal movement.106

Figure 3.42: A tweet claiming a link between Dominion voting machines and Smartmatic.102

The Dominion-meets-Hammer and Scorecard narrative has been adopted into
the broader belief systems of various right-wing communities, including the
Proud Boys, the far-right militia group Three Percenters, and the Daily Stormer,
a Neo-Nazi publication.107 #Dominion was used in 1 of every 7 tweets from
QAnon accounts.108 QAnon groups used the hashtag #LordMarkMallochBrown
to demonstrate supposed ties between Dominion software systems and George
Soros. Lord Mark Malloch-Brown is a board member of SGO, the parent company
of Smartmatic, and is also on the board of the Soros-founded Open Society
Foundations.

Figure 3.43: Hashtag use on Twitter for hashtags related to Dominion Voting System fraud
narratives.

The Dominion and Hammer and Scorecard narratives take on additional signifi-
cance for their link to ongoing incidents of real-world harm. Since the election,
Dominion employees have been doxxed, harassed, and threatened by right-wing
influencers and members of the general public.109 In early December a now-
offline website, EnemiesOfThePeople[.]us, was created (later attributed to Iran,
and discussed in our report’s “Foreign State-Backed Actors” section), featur-
ing personal information about multiple Dominion employees with crosshairs
shown over the faces of each targeted individual.110 Most recently, Dominion
has begun to file defamation lawsuits against prominent figures involved in the
perpetuation of the conspiracies we have described, including Rudy Giuliani
and Sidney Powell.111 As of the writing of this report, several of the publications
that aired the claims, such as American Thinker, have retracted them.112


The Hammer and Scorecard and Dominion conspiracies reinforced the Stop The
Steal movement, which ultimately led to violence. The hashtag appeared on the
banner of one of the first websites to announce the January 6 rally in Washington,
DC: “#DONOTCERTIFY #JAN6 #STOPTHESTEAL #WILDPROTEST.”113 And as
the violent insurrectionists breached the Capitol on that day, #StopTheSteal
signs could be seen across the crowd. In the next section, we trace threats of
violence during the 2020 election, leading up to that tragic day.

3.4 Election-Related Violence


The 2020 election season brought with it high tensions, and concerns about
violence were prevalent leading up to, during, and after the election. The EIP
team monitored channels across the political spectrum to identify and report
specific threats of violence. While this violence did not materialize on Election
Day, that relative calm was eclipsed by violent riots on January 6 at the US
Capitol.

The violence at the Capitol can be traced to violent rhetoric curated and iterated
on throughout the pre-election period, on Election Day, and after. Before the
election, both speculation and true threats of violence centered on tensions be-
tween existing groups. For example, while the left theorized about the next steps
of the Proud Boys and similar groups, the right created narratives about “antifa”
and Black Lives Matter (BLM) groups organizing massive violent insurrections.

This dynamic shifted distinctly on Election Day, especially among right-wing
audiences. Content with specific pieces of alleged “evidence” of electoral fraud
was weaponized to support the organization of real-world violence. Additionally,
the ideology behind consolidated movements such as #StopTheSteal spurred
violence specifically toward election officials and vendors, rather than simply
toward “traditional” enemies such as the Democrats and associated organizations
like BLM. This growing distrust
of officials and institutions, regardless of political party affiliation, for their
role in the purportedly “stolen” election culminated in an organized, violent
insurrection on January 6.

Pre-Election Concerns
Prior to the election, the vast majority of violence-related content online consisted
of users predicting unrest on Election Day and calling on other users not to vote
in person. This content circulated among both left-leaning and right-leaning
users, with users differing on who was considered responsible, and who would
be targeted.

97
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 116 of 292 PageID #:
13685

3. Incidents and Narratives: The Evolution of Election Misinformation

Left-leaning social media users circulated false warnings about far-right groups
and militias. One post about concerns that Proud Boys were planning to shoot
BLM protesters received over 278,000 likes (see Figure 3.44). Meanwhile, right-
leaning accounts also posted concerns that left-leaning groups such as antifa,
BLM, and the Sunrise Movement were planning to commit mass violent acts on
Election Day or the days following. For example, in September, right-leaning
accounts spread concern about an image that called for “Antifa comrades” to
dress up as “patriots/Trump supporters” to confuse the police at riots. This
image spread to Facebook, Twitter, and TikTok, garnering high engagement: on
Facebook, there were over 10,000 reactions, 15,000 shares, and 1,000 comments.
The image was subsequently fact-checked by Snopes and Medium and found
to be an internet joke from 2017 that had a second wave of popularity in 2020.114
Heading into Election Day, pro-Trump accounts asked their followers how they
would respond to violence or voter intimidation from the left. Audience responses
indicated that threats of violence and anger were directed at the left and leftist
groups specifically.

Figure 3.44: Posts showing concerns about violence from left-leaning social media users.

Despite the reach and engagement of posts that raised fears about the potential
for violence, the EIP did not uncover any evidence of violent plans, such as from
right-wing Discord channels or Facebook Groups. Given the vague nature of the
claims and the absence of any specific evidence from those who posted concerns
of violence, these posts were non-falsifiable and unsubstantiated. Most of the
spreaders of this type of content appeared to be well-intentioned individuals,

98
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 117 of 292 PageID #:
13686

3.4. Election-Related Violence

including members of purportedly targeted communities who wanted to warn


their communities of an impending danger.115 They encouraged their audiences
to engage with and share their content; the resulting “copypasta” reposts of
the text and images spread the misinformation further and created viral panic
among some online communities.

Figure 3.45: Posts showing concerns about violence from right-leaning social media users.

During and Post-Election


Posts using violent rhetoric or inciting violence after the election significantly
differed from pre-election posts as they turned from fearing violence to coordi-
nating and organizing violence. In addition, posts were linked through larger
narratives, especially election theft, and threats turned their focus to institu-
tions such as voting systems and the government, instead of partisan groups
like antifa or the Proud Boys.
Many violence-related posts from right-leaning accounts became increasingly
tied to claims of election theft or rigging, and at times fed broader rhetoric
about preparing for civil war.
Usage of the specific hashtag “#civilwar” on Twitter grew significantly between
November 1 and November 5, and posts calling for civil war increased as results
that favored Biden were announced. One Twitter user posted “Let’s just fast
forward to #CivilWar and get it over with and take out the filthy Cancerous
#DemocRats and remove them from our society.”
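Trends like the #civilwar spike shown in Figure 3.46 can be measured with a
simple cumulative count over a collection of tweets. The sketch below shows one
way to do this in Python with pandas; the input file and column names
("tweets.csv", "created_at", "text") are hypothetical stand-ins for whatever tweet
export an analyst has on hand, not the EIP’s actual pipeline.

import pandas as pd
import matplotlib.pyplot as plt

# Load a tweet export with a timestamp and the post text (hypothetical file).
tweets = pd.read_csv("tweets.csv", parse_dates=["created_at"])

# Keep only tweets mentioning the hashtag, case-insensitively.
mask = tweets["text"].str.contains("#civilwar", case=False, na=False)
civilwar = tweets.loc[mask].sort_values("created_at")

# Restrict to the November 1-5 window, bucket by hour, and count cumulatively.
window = civilwar[
    (civilwar["created_at"] >= "2020-11-01")
    & (civilwar["created_at"] < "2020-11-06")
]
cumulative = (
    window.set_index("created_at")
    .resample("1H")["text"]
    .count()
    .cumsum()
)

cumulative.plot(title="Cumulative #civilwar tweets, Nov 1-5, 2020")
plt.show()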
In the weeks that followed, the EIP additionally tracked calls for violence against
specific individuals and groups. As discussed in the previous section, employees
of Dominion Voting Systems received targeted harassment including death
threats and doxxing of personal information. Online threats became so common
that Dominion Voting employee Eric Coomer went into hiding.116


Figure 3.46: Cumulative instances of the hashtag #civilwar between November 1 and November
5, 2020.

Figure 3.47: A right-leaning Twitter user calls for civil war against Democrats in response to
alleged electoral fraud.

Figure 3.48: Twitter users call for death or violence against Dominion Voting employees.


Events Surrounding January 6, 2021


On the morning of January 6, 2021, President Trump spoke to his supporters out-
side the White House, repeating multiple falsehoods about how the election had
been stolen from him. In his speech, Trump referred to Democrats as having attempted
“the most brazen and outrageous election theft,” and said, “We will not take it
anymore...We will stop the steal.” He encouraged his followers to march to the
Capitol and “try and give them [Republicans] the kind of pride and boldness
that they need to take back our country.”117 A pro-Trump mob then forcibly
entered the Capitol building and forced Congress to take cover and evacuate.
Five people died as a result of the Capitol breach.118
The violent insurrection against the United States Capitol on January 6 demon-
strated the real-world impact of mis- and disinformation narratives such as
Stop the Steal, and the effect that social media echo chambers can have on
organized violence. While earlier concerns about violence did not materialize,
angry rhetoric was frequent. That anger made its way to the offline world, as so-
cial media users used platforms to coordinate, recruit, and organize real-world
violence. Far-right users used “alt” social media sites, like Gab and Parler, to
openly organize and recruit others to join them, give directions on what streets
to avoid, and post about bringing weapons into the Capitol.119
As the violent mob launched an insurrection against the US Capitol on January 6,
angry comments by pro-Trump protestors filmed in the building, signs carried
by those outside, and calls for violence against elected officials certifying the
vote referenced narratives that we have discussed in this chapter.
In response to mainstream platforms continuing to crack down in the aftermath
of that violence, users moved off of Facebook and Twitter and onto smaller sites
with less regulation, such as Parler, Gab, and Telegram. To what extent these
communities will continue to operate in closed social media networks—the same
networks that consistently proliferated the notion that the election was stolen
from President Trump—remains to be seen.120 Regardless, the attack on the
US Capitol will forever stand as testament to the violence that echo chambers,
online rhetoric, and sustained misinformation can unleash on the world.

3.5 Narrative Crossover and Fabrication in Non-English Media
To this point, we have traced English-language incidents, narratives, and con-
spiracies that shaped the 2020 election. However, although the majority of
EIP tickets concerned election-related misinformation taking place in
English-speaking communities, there are many American communities that
participate in political conversations in languages other than English, and on
apps and chat platforms popular with diaspora communities. In this section we
briefly discuss examples of election-related mis- and disinformation in Chinese-
and Spanish-speaking communities. In both cases, EIP analysts found that the
majority of the observed content consisted of translations of the same narratives that
appeared in English—including those featured in prior sections of this chap-
ter. However, there were also uniquely inflected narratives, outlets, and actors
targeting these distinct communities.

Chinese-Language Misinformation
EIP analysts identified three types of Chinese-language misinformation: (1) mis-
information translated directly from English-language media, (2) misinformation
that originates from English-language media but is substantially altered dur-
ing the adaptation to Chinese-language audiences, and (3) misinformation that
originates from Chinese-language media and users.
Additionally, the EIP identified two actors that were prominent in spreading mis-
and disinformation in the Chinese-language media sphere, with more complex
motives and sophisticated distribution apparatuses: Falun Gong (法輪功), which
owns and operates the Epoch Times, and Guo Wengui (also known as Miles
Guo) and his associated media enterprises, including Himalaya Global and the
GTV/GNews media group.
The more influential of the two is Falun Gong, an exiled, virulently anti-CCP
Chinese religious movement.121 Its media empire consists of the Shen Yun dance
troupe, US and overseas newspapers including the Epoch Times, television net-
works such as New Tang Dynasty TV, and the Sound of Hope Radio Network; the
entire media complex has more than 12 million followers. The group’s ideological
commitments are fluid, save for a long-standing adversarial relationship with
the CCP government, but in recent years have trended in a right-wing direction.
Beginning in 2016, Falun Gong also grew more assertive in domestic politics
in the US, embracing Trump administration rhetoric while pairing its habitual
denunciations of the CCP with accusations that Democrats were colluding with
them.122 In 2020 it published extensively on Hunter Biden’s alleged ties to the
Chinese government.123
The other two entities—Himalaya Global and the GTV/GNews media group—
maintain close connections to exiled billionaire real estate developer-turned-
media tycoon Guo Wengui. Both have forged close ties with domestic US politics
and politicians, in particular former White House chief strategist Steve Bannon.
Himalaya Global rarely produces information on its own.
Instead, its primary focus is on translating information from English-speaking
conservative news sources, including Fox News and Steve Bannon’s War Room.
It also features a channel of Guo’s criticism of the CCP, which is a mixture of

102
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 121 of 292 PageID #:
13690

3.5. Narrative Crossover and Fabrication in Non-English Media

purported whistle-blower statements and conspiracy theories, and reiteration


of his support of Donald Trump.124 The GTV/GNews media group, by contrast,
was founded directly by Guo Wengui, with the goal of “taking down the CCP.”125
GTV/GNews also reposted many of Bannon’s War Room podcasts. During the
2020 election in the US, these two media entities actively reposted mis- and
disinformation on both electoral processes and unverified stories about the
Democratic candidate and his family, particularly on conservative alt-platform
Parler. CCP state-backed media’s contribution to mis- and disinformation is
discussed in the “Foreign State-Backed Actors in the 2020 Election” box later in
this chapter.

Narratives Originating from English-Speaking Sources


Most of the election misinformation that gained widespread reach in the Chinese-
American community stemmed from English-language conservative media sources, and
content closely resembled that source material. Before the election, popular
narratives from English-speaking media that made their way into Chinese-
speaking online communities included accusations of Democrats manipulating
the election, conspiracies surrounding mail-in-ballots, and theories about the
Deep State.126
Typically, Chinese-language content was published soon after its English version.
On November 6, 2020, James O’Keefe of Project Veritas tweeted a video of USPS
workers alleging that the USPS Postmaster in Pennsylvania ordered workers
to fraudulently backdate ballots.127 One day later, the Epoch Times published
a Chinese-language article titled “Penn postal worker allegations: postmaster
falsifies ballot dates.”128 The article summarized the videos posted by James
O’Keefe without providing any new information. Similarly, the English-language
right-wing news site Distributed News published a story on the “Scorecard”
conspiracy described above. Soon after, the story was picked up and word-for-
word republished by Sound of Hope, another media outlet owned and operated
by Falun Gong and with a large online following.129
Occasionally, Chinese-language users altered the message en route to a new
audience. For example, in late October, English-language Twitter user
@ThePubliusUSA posted a video purporting to be shot in a mailroom in Florida’s Biden-
leaning Miami-Dade County, depicting mounds of undelivered ballots alongside
speculation that USPS failures were harming Biden’s chances in the county.130
The video went viral on Twitter before eventually spreading to Weibo, a Beijing-
based Chinese-language social media platform, where a US-based Weibo user,
Xiyatu Zhixia 西雅图夏至 (Seattle Summer Time), translated the description and
shared it with her 119,180 followers. Notably, her interpretation was more cir-
cumspect than the original video’s: “If this story proves true, if these are ballots,
if the same situation is occurring at other post offices, the consequences will be
serious.131 ”


Figure 3.49: Top, English-language speakers post a video purported to be filmed in a Miami
mailroom; bottom, a Weibo user reposts the video, speculating that it might hurt the Democratic
Party.


Narratives Unique to Chinese-Speaking Communities


Chinese-language media did originate its own misinformation, although less
frequently. These narratives often added an angle alleging a covert relationship
between the Democratic candidate (or Party) and the CCP, thereby accus-
ing both the CCP government and the Democratic Party of corrupting the US
election.

For example, a Facebook post from November 6, 2020, by Chen Junjun 陳君君
(Gentleman Chen), captioned as “South Park told the truth eight years ago;
the CCP is behind the Democrat’s mail-in-ballots voter fraud,” featured a 2012
clip from South Park joking that Obama colluded with the Chinese to win the
election.132 The video’s final frames claimed “Joe Biden is stealing the election”
before exhorting viewers to “Support Trump fight back.” A “Himalayan global”
icon in the final frame suggests the user may have lifted the video from Miles
Guo’s media network.

Figure 3.50: A Twitter post accusing China of sending mail-in ballots to the US.133

Very occasionally, Chinese-originated misinformation made new claims about
the US election without a CCP link. On November 6, Epoch Times posted an
article in which Gary Yang, a member of the Michigan Chinese Conservatives
Alliance and a poll watcher at the TCF Center in Detroit, claimed that while he
and another Republican observed an estimated 7,000 to 10,000 ballots being
counted on election night, ballot counters reported 50,000. He also claimed
the staff were deliberately slowing down the counting process.134 Although no
fact-check has addressed this specific claim, there has been no convincing
evidence of large-scale voter fraud in Michigan.

Spanish-Language Misinformation
Narratives Originating from English-Speaking Sources
Similarly to Chinese-language community misinformation, many of the misin-
formation narratives in the Spanish-language community did not originate from
within the community. Most were translated from English and circulated via
prominent platforms like Facebook, Twitter, and YouTube, as well as in closed
group chat platforms like WhatsApp, and efforts often appeared coordinated
across platforms.135 As with Chinese-language misinformation dynamics, the
most prominent and most widely shared narratives were either closely aligned
with or repurposed wholesale from right-wing media outlets. Both grassroots users
initiating bottom-up narratives and verified or large-audience influencers had
key roles to play in the Spanish-language misinformation ecosystem.
Non-verified, grassroots users were an important source of the Spanish-
language misinformation compilations surfaced by the EIP. Q-adherent users
organically “bootstrapped” off English-language theories to present conspir-
atorial threads as intricate as those of their English counterparts. In a single
thread, one such user linked together several false narratives: James O’Keefe’s
Michigan USPS whistleblower story and the Hammer software narrative, both
discussed above, and a generic QAnon rallying cry.136 Twitter placed a label on
the original tweet for the Hammer software claim within this longer thread;
however, the label on this tweet does not automatically translate to Spanish,
even if that is set as the default account language. This follows a broader trend
observed throughout the election season, in which non-English language policy
enforcement fell distinctly behind even when the narratives themselves were
the same across languages.137
Figure 3.51: A QAnon-adherent Twitter user, now suspended, was extremely active during the
election period, collating several English-language misinformation threads into long-form “edu-
cational” posts.

The Spanish-language mis- and disinformation sphere also boasted several large-
scale influencers who paralleled English-language repeat spreaders in dissemi-
nating the top narratives to large audiences. One example is Aliesky Rodriguez,
a Cuban-American Trump supporter living in Florida, who hosts a livestreamed
talk show that has peddled almost every one of the aforementioned narratives
to his nearly 100,000 subscribers. Rodriguez’s videos often received between
50,000 and 110,000 views. For comparison, prominent Spanish-language out-
let Univision Noticias, with more than five million subscribers, often receives
between 5,000 and 30,000 views per video.

Rodriguez’s channel often involves screen sharing and live-translating English-
language content while editorializing. On November 5, Rodriguez was joined
by co-host Amelia Doval for a “live demo” of the dead voter narrative, one of
the theories peddled by English-language repeat spreaders directly after the
elections (see Figure 3.52 on the following page). Rodriguez and Doval exag-
gerated the impact of dead people voting to their Spanish-speaking audience.
In subsequent shows, they covered topics such as Sydney Powell’s “release the
Kraken” statements (described in the Dominion section above), the Supreme
Court rulings on contested election results, and the lead-up to the January 6
insurrection.


Figure 3.52: Aliesky Rodriguez and Amelia Doval push the dead voters narrative. Rodriguez’s
audience often comments on the “deep seeded corruption,” uses proud statements that “AMERICA
is for the patriots,” or pivots into religious supplications for “CELESTIAL AID.”

Figure 3.53: During a November 22 livestream, Rodriguez answered live viewer questions on the
role of Sidney Powell in “dismantling the electoral fraud” against Donald Trump. A key facet of
Rodriguez’s videos is screen sharing and breaking down English-language tweets for his Spanish-
language audience.


These efforts often appeared to be coordinated across channels. For example,
a November 6 video by Rodriguez migrated within moments from his channel
to Mr. Capacho Tv’s channel, one of the most popular sources for Spanish-
language conspiracy theories.

Figure 3.54: Aliesky Rodriguez’s November 6, 2020, video on his YouTube channel appeared
moments later on Mr. Capacho’s channel.

Rodriguez’s channel was neither the only example nor necessarily the most
prominent in the entire Spanish-language misinformation landscape. However,
this example illustrates the larger strategy used by many of his peers in serving
English-originating misinformation narratives to a Spanish-speaking audience.

Narratives Unique to Spanish-Speaking Sources


Several outlets have reported on the different politically motivated disinforma-
tion narratives and QAnon conspiracy theories that spread within the Spanish-
language communities leading into the election.138 The most prominent such
narrative connected Biden to socialism, which may have been intended to dis-
courage Latino voters who fled the socialist regimes in Venezuela, Cuba, and
Nicaragua from voting Democratic. However, since this content was not related
to the election processes themselves, it was deemed out of scope for our overall
EIP investigations.

Non-English Language Misinformation Impact


In both the Spanish- and Chinese-language communities the EIP monitored, the
content that received the most engagement translated claims of fraud
and delegitimization from English into the audience’s native language. While
some original content was certainly present in each community, these narra-
tives were secondary to those based on the “evidence” gathered from prominent
English-language influencers and viral posts. Thus, although it is not a compre-
hensive solution, slowing the spread of English-language misinformation could
still have a significant downstream impact on its virality in non-English language

109
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 128 of 292 PageID #:
13697

3. Incidents and Narratives: The Evolution of Election Misinformation

communities. Platforms can be more proactive at detecting this translation


pipeline, and subsequently labeling this content in the appropriate language.

Culturally significant messages were sometimes added to the misinformation,
complicating the fact-checking process. For Spanish-language users, this con-
tent usually took the form of religious commentary denouncing socialism and
the left, which appeals to Latino audience members who come from religious,
often Catholic, backgrounds and/or who fled a socialist regime in their birth
country. For Chinese-language users, this took the form of alleged collusion with
the Chinese government or the Communist Party. Effective fact-checks were
notably lacking for both of these communities: improvements to this process
should not merely translate the fact-checking content into the correct language,
but also take these cultural aspects into account.

Foreign State-Backed Actors in the 2020 Election

It’s difficult to rigorously compare foreign interference campaigns in the
2016 and 2020 US election cycles, given the enormous differences in
awareness and preparedness between the two cycles.
In 2016, information operations on social media were a true blind spot
for entities charged with protecting the integrity of the election, from
Silicon Valley to Washington. The full scope of the Russian campaigns
targeting the 2016 election only came to light in 2017-2018. By November
2020, a professional field had emerged that focused on ensuring these
operations would be detected and exposed faster. Between December
2019 and Election Day, 12 foreign information operations focused on the
US 2020 elections were detected, attributed, and exposed by platforms,
government entities, and researchers. It is worth noting that this section
covers only the operations that the Partnership investigated during the
height of the electoral period, excluding the handful of foreign information
operations targeting the US 2020 election that had been detected and
deactivated months earlier.
A range of foreign actors were assessed to have a vested interest in the
outcome of the elections, both in terms of the actual result and its recep-
tion by the American public. The Election Integrity Partnership prioritized
monitoring actors based in China, Iran, and Russia during the election
period. Using a combination of investigative methods and ongoing mon-
itoring, the Partnership was able to track the covert and overt efforts
made by foreign actors to influence the US 2020 election.
On the covert side, this notably involved monitoring new or continued
activity from networks previously attributed to Russia, China, and Iran
that were involved in targeting Americans via grey propaganda and social
media engagement. In terms of covert operations, actors
originating in these three countries appeared to take different approaches
to the 2020 US elections. Assets linked to the former Russian Internet
Research Agency (IRA) consistently amplified narratives about electoral
fraud throughout the election and post-election period, primarily through
their presence on alternative tech platforms like Parler and Gab.
On the overt side, a number of different approaches were taken. Live
network maps provided by Graphika revealed that official state outlets
affiliated with Russia, Iran, and China were publishing and commenting on
the subject of the US elections throughout the campaign period. Russian
state media and the social media presences of state officials and institu-
tions were heavily engaged with the topic of the US elections. However,
Chinese and Iranian state outlets were less consistent in their coverage.
Both states adhered to the line that the elections were unimportant for
their countries and would not affect their perspective on the relationship
between themselves and the US. Instead, China and Iran concentrated on
portraying the US as a lawless, “failed state.”

Covert Operations
A variety of operations from state actors and organizations indicated
that there were adversaries interested in targeting the 2020 election.
There were disparate and somewhat unsuccessful attempts to lay the
groundwork for information operations during the 2020 election cycle
using techniques like faux newsrooms, false personas, AI-generated faces,
and the manipulation of unwitting freelancers for reporting.
Russia
Russian efforts to target the US 2020 election can be traced back to earlier
operations exposed in late 2019.139 This section will focus on a small set
of campaigns active around and throughout the height of the electoral
season rather than provide a comprehensive survey of foreign information
operations having targeted the US 2020 election.
On September 1, 2020, Facebook and Twitter announced that they had
received investigative tips from the FBI regarding an IRA-linked website,
“PeaceData,” which recruited US-based freelancers to write articles
for a faux newsroom espousing left-wing political perspectives. Several
platforms removed accounts associated with the operation.140
In early October 2020, Graphika first reported on a set of Pages, profiles,
and a website known as NAEBC, which is attributed to individuals associated
with past IRA activity. This particular effort revolved around a fake
far-right “newsroom” website, NAEBC, which stands for the “Newsroom
for American and European Based Citizens.”141 This operation appeared
to be the right-wing counterpart to the previously discussed “PeaceData”
endeavor. This front media site had associated accounts operating on
Parler and Gab, which functioned as an amplification network posing as
conservative individuals who repeatedly shared its articles. Some of these
personas authored content on the website. However, after the operation
was exposed, the network stopped writing its own articles and instead
focused on sharing content written by genuine, recruited right-wing indi-
viduals as well as content copied from known far-right websites. By the
time of the US election, NAEBC-related assets had been removed from
Twitter, Facebook, and LinkedIn. However, the amplifier accounts on
Parler, Gab, and alternative platforms remained active throughout the
duration of the election, and engaged in discussing the upcoming vote.
NAEBC contributed to many of the narratives discussed in this paper.
During election week, articles posted on the operation’s website included
a report on “massive voter fraud in Wisconsin,” coverage of Republican poll
watchers being “blocked” in Philadelphia, and an article that portrayed
Trump as a sacrificial demigod. These assets also shared a number of
articles and commentary on civil unrest, including an editorial (copied
from a US blog) that claimed, “Our dirty, dangerous, and diseased cities
are now being destroyed by dirty, dangerous, and diseased animals.” After
the election, the NAEBC accounts focused on Dominion voting software,
particularly by claiming the company is tied to antifa. Despite building
up their Parler and Gab presence in an attempt to generate interaction
with memes and photomontages, and increasing their rate of posting
throughout the electoral cycle, Russia-linked covert accounts did not
achieve any significant traction with the targeted communities.
China
As with Russia, networks of political spam accounts tied to
a China-linked coordinated influence operation attempted to engage
with American communities during the 2020 election—and were similarly
unsuccessful. The Spamouflage network, which emerged as a Mandarin-
language cluster of accounts that debuted English content in the summer
of 2020, avoided mentioning the election directly, instead continuing to
propagate content that portrays the US in a negative light.142
The prolific Spamouflage network, which includes a large number of assets
with shallow or non-existent personas reposting and recycling a large
volume of content, has been hit by a series of rolling takedowns since
its exposure on YouTube (its primary platform), Facebook, and Twitter,
forcing it to stand up dozens of new accounts each time. This cycle of
suspensions led to a surge in Spamouflage videos being posted on new
channels in September and October 2020, with up to 15 videos emerging
per day, some of them shared by previously established assets; these videos
did not achieve any significant engagement.
On November 6, after the election had been called, a Spamouflage video
referenced election-related protests in New York the previous day, with-
out mentioning the vote. From November 10 onwards, Spamouflage videos
commented on the election outcome as a further sign of the “impending
collapse of America.” Some videos were particularly hostile toward Trump,
but most were bipartisan in tone and focused on criticizing the entire
structure of US politics. Throughout the election period, Spamouflage
English-language videos contrasted the US response to COVID-19 with
China’s response.
In addition to Spamouflage campaigns, Facebook unveiled a separate net-
work of China-based inauthentic assets, which contained a very small num-
ber of assets supporting President Donald Trump or Joe Biden and a short-
lived Group supporting former presidential candidate Pete Buttigieg.
None of these had much traction by the time the platform took enforce-
ment action.143
In spite of this core difference in approach (Russian assets engaged with
the election directly, while Chinese assets largely avoided it), Russian and
Chinese covert operations both focused on the notion that the US is a “lawless state”
facing an “inevitable civil war.” This theme was also noted by the EIP in
its monitoring of the narratives circulated by official state outlets, and
raises concerns about how covert operations from foreign actors can
leverage the rallying calls of domestic extremist movements—in this case,
accelerationism.
Iran
Iran similarly has a track record of information operations targeting US
communities.144 Consider, for instance, the handful of websites and affiliated
accounts referring to themselves as the “IUVM network” (standing for
“International Union of Virtual Media”), which has run persistent
information operations and triggered multiple waves of enforcement across
platforms. In October, the US Department of Justice seized a significant
share of the domain names these operations used to spread
disinformation.145 However, other Iran-linked campaigns persisted: less
than a month before the US election, the Stanford Internet Observatory
documented and analyzed a Twitter campaign attributed to Iranian actors
in which the operators compromised authentic accounts and created fake ones
to disseminate content supporting Black Lives Matter.146
On October 19 and 20, 2020, voters in multiple states including Alaska
and Florida received emails purporting to be from the far-right group the
Proud Boys, instructing them to vote for Donald Trump or face retalia-
tion. Some of the emails included personal details of the voters in ques-
tion. These emails appeared to come from “info@officialproudboys[.]com,”
though it was later determined that this address had been spoofed and
the emails had been sent from servers in Estonia, Saudi Arabia, and the
United Arab Emirates. In some versions of the email, a video link was also
included; this video purported to show someone accessing voter informa-
tion and claiming to demonstrate a method of casting fake ballots.147 The
EIP obtained several of these emails, including from our partners at the
NAACP.148

Texts sharing screenshots of emails purporting to be from the Proud Boys.

In a remarkably fast public attribution process, on October 22 the
Department of Justice held a press conference attributing this activity to
Iran, though few details were provided.149 During the conference, it was
stated that both Russia and Iran had accessed US voter data; however,
the information contained in the “Proud Boys” emails appeared to have
been gathered from states that have publicly available voter registration
information, meaning this campaign could have been carried out with-
out needing to acquire any private data. The DOJ did not provide any
additional evidence to support this attribution.
A series of websites created in early December displayed an “Enemies of
the People” list containing the personal information of a number of elected
officials and government employees who were countering claims of voter
fraud in the 2020 election; the site also listed employees of election
software manufacturer Dominion, reflecting the allegations promoted by the
Trump legal team and right-wing media. The operators extended their social
media campaign to platforms such as Parler and Gab. The FBI attributed this
activity to Iran, as reported in the Washington Post on December 22.150

Doxxing on the Enemies of the People website.

Overt/Openly Affiliated State Outlets


While covert information operations were scarce, state media propaganda
activities continued to varying degrees. Russian state outlets, including
Kremlin-affiliated media entities, diplomats, and other state representa-
tives, were actively engaged in amplifying some of the most divisive stories
described previously in this chapter, focusing predominantly on promot-
ing Donald Trump and casting doubt on the integrity of the electoral
system. China was relatively quiet for much of this period. Iran, simi-
larly to China, did not spend much time on the election itself; it focused
on portraying the US as a declining power with an electoral outcome of
little consequence.

Russia
Throughout the election period, Russian
state-affiliated outlets (including state representatives) engaged heavily
on the topic of voter fraud.151 In the lead-up to the election, coverage
focused on mail-in ballots, amplifying allegations of interference by USPS
workers alongside accusations of Big Tech “interference” and “censorship.”
As the election approached, a number of the principal
Kremlin-affiliated media outlets amplified domestic disinformation narra-
tives about Joe Biden and his family. For example, in the month prior to
the vote (October 3–November 3), RT (formerly Russia Today) published 52
articles and pieces of video content about Hunter Biden or the Biden fam-
ily more broadly. This tranche of content included op-eds with headlines
like “Blaming Russia for Hunter’s problems was a big misstep, Joe, and it
may prove to be your downfall.” Notably, many of the more aggressive
articles published during this period were opinion pieces posted on the
RT and Sputnik websites rather than directly authored by the outlets.
The EIP, among others (including the Department of Homeland Security),
also documented the concerted effort by Russian state outlets to amplify
disinformation about mail-in voting in the run-up to the election.152 The
Partnership processed over 35 tickets related to Russian outlets spread-
ing election disinformation over the course of the monitoring period.
There was one incident in which accusations of Russian activity required
de-escalation. This incident culminated in the announcement made
by Director of National Intelligence John Ratcliffe on October 22 in which,
alongside attributing the spoofed Proud Boys emails to Iran, he claimed
that Russia had also obtained voter information that could be used to
endanger the election.153 Previous claims on social media, particularly on
Twitter, Facebook, and Reddit, had alleged that registration data for 15
million voters in Florida had been hacked and posted on a Russian forum.
However, the data of concern appeared to be standard public information
made available by the State of Florida, not evidence of a hack.154 Ratcliffe’s
announcement appears to have referenced a different
incident where private voter information was obtained.
Following election day, the focus of Russian state outlets appeared to shift
to delegitimizing the results and alleging fraud on behalf of the Democrats
on a broader, more systemic level. English-speaking followers of these
outlets doubled down on the false Dominion narrative, “whistleblower”
accounts from poll workers in swing states, and claims that the outcome
had been pre-determined by a group of “shadowy elites.” A number of
these narratives continued well into the post-election period. Additionally,
Russian state media spread claims of civil unrest and violent protests. On
Twitter and Facebook, Sputnik claimed that a Black Lives Matter-allied
group threatened violence if Trump did not concede, and RT posted a
documentary-style video pushing a “civil war” narrative. Russian state
media also leveraged livestreamed video of protests and in-the-street
actions from its video agency Ruptly, airing the footage on RT as well as on
lesser-known entities such as Redfish and In The Now.

Left, a tweet by Russian state-backed media property Sputnik claiming Black Lives Matter
groups had threatened violence; right, an RT tweet of a video predicting civil war in
America.

China
Chinese state media and official accounts appeared to be taking a relatively
direct stance toward the topic of the US elections in the months prior to
the vote, but as Election Day drew closer, Chinese state officials and media
agencies grew quiet. After NCSC Director William Evanina’s statement
alleging electoral interference by China, Russia, and Iran (in that order),155
election-related activity from state media and CCP spokespeople declined
significantly. Reporters at state-backed outlets have said that they were
told to ensure coverage was “calm” and “neutral,” and were advised not to
focus on the election.156
In one interview, Fu Cong, Director-General of Arms Control of the Min-
istry of Foreign Affairs, stated “Well, we know that the US general election
is coming very soon. And I don’t want to make any comments that may
be interpreted by the US as interfering in their internal affairs or in their
general election.”157 Following this guidance, the limited coverage that
did exist was even in tone, with the exception of some editorials in state
outlets that argued the election would make little difference to US-China
relations, given what they described as bipartisan hostility toward China.
After the election, state representatives followed Xi Jinping’s lead and
did not acknowledge the results until three weeks after the vote. State
media covered the election results with a cautiously optimistic tone, but
continued disparaging the US overall. In terms of reception, Chinese
citizens tended to celebrate Joe Biden and mock Donald Trump, while
Chinese Americans typically had mixed responses that leaned pro-Biden.
Notably, both Chinese state media and CCP representatives were willing
to forcefully criticize the Trump administration, particularly Secretary
of State Mike Pompeo, but they rarely attacked Trump himself and did
not express any explicit candidate preference. Even during the week
of the election, Foreign Ministry Spokesperson Hua Chunying harshly
criticized the US while avoiding the election itself.158 Throughout the
election period, Hua appeared to shape the narratives and tone that CCP
representatives then echoed. While she has significantly fewer followers
than state media outlets, she is consistently the most-mentioned account
among followers of Chinese outlets and CCP representatives.
Iran
Iranian state-backed outlets frequently used coverage of the US to dimin-
ish the country and cast Iran in a favorable light, but rarely engaged in
what can be classified as the widespread propagation of disinformation.
On occasion, Iranian outlets did publish content designed to attack the
legitimacy of the American electoral process—saying it fell short of its
democratic ideals and was likely to be marred by violence. This at times
involved questioning American democracy altogether—in some cases us-
ing the voice of American academics, “analysts,” activists, or media outlets
to do so.
During election week, Iranian officials sought to undermine the efficacy
of the US system of government, with Ayatollah Ali Khamenei releasing a
speech in which he described the election as a “spectacle” showing the
“ugly face of liberal democracy in the US” where the only certain outcome
is “the political, civil, & moral decline of the US regime,” and furthered the
narrative that the US was facing an existential crisis.159 Broader Iranian
coverage focused on domestic issues like racial disparities and social
divisions, the treatment of protesters and minorities by the police, and
growing fears of civil unrest within the US.
In a similar vein, Tehran-based Mehr News Agency used an October re-
port from the Department of Homeland Security warning about foreign
election interference to suggest that such warnings were “old ways” of
“creating panic” among the American public and were designed to induce
participation in the electoral process—and presumably to lend the elec-
tion a stronger legitimacy.160 In at least one instance, Iranian outlets used
a report from The Hill about concerns over the absentee voting system in
Texas to heighten fears of voter suppression in the US.161
Through quotes from officials and op-ed pieces, Mehr, Fars, Tasnim, and
other Iranian state-backed outlets frequently promoted the notion that
Trump and Biden were roughly equivalent in terms of their antagonism
toward the interests of the Regime and the Iranian people, and so the out-
come of the election was largely irrelevant to Iranian interests. However,
at times these outlets showed a slight preference for a Biden adminis-
tration if only because of President Trump’s open hostility toward the
country. This narrative stayed fairly consistent even in the days following
the election, with only minor adjustments.

3.6 Fact-Checking Claims and Narratives


In some cases, the direction and life cycle of a narrative can be diverted, or
even stopped, by way of authoritative fact-checking. As narratives containing
misinformation and conspiracy theories about the election emerged and spread
on social media, fact-checking by news sites, professional organizations, and
election officials often followed—but their efforts were not uniformly received.
Some high-profile narratives were fact-checked and easily debunked by journal-
ists, government officials, and mainstream media, including EIP partners. Other
false narratives escaped the notice of the fact-checking community for weeks,
or were never fact-checked at all.

In the following section, we examine examples of the fact-checking response to
two of our prominent misinformation case studies from earlier in the chapter:
Sharpiegate and Dominion Voting Systems.


Case Study 1: Fact-Checking Sharpiegate


As the Sharpiegate narrative grew on Election Day and the days immediately
following, government offices and news media began to fact-check the claims.
This was particularly true in Arizona. On November 3, 2020, at 12:09 pm PT,
before polls had even closed, the Maricopa County Elections Department posted
a video debunking these claims to its Facebook account.162 Many commenters
remained unconvinced: some of the most popular comments on the video claimed
that their ballots were canceled, and attributed this to using a Sharpie.
Despite this initial attempt at debunking, posts on social platforms continued
to propagate the misinformation narrative of election fraud based on the
breadth of Sharpie use and the “massive bleed-through” the markers cause.
The next morning, November 4 at 8:50 am PT, Pima County released a tweet
thread citing the Arizona Election Manual, clarifying that all ballots would be
counted regardless of the type of writing implement used.163 Again, many of the
commenters who replied were skeptical at best: comments mostly questioned
why officials would allow the use of felt-tip or Sharpie markers if there was
any chance of ink bleeding through the ballot. Other comments pushed back on the
officials’ claims, asked follow-up questions, and continued to allege that the offi-
cials were guilty of fraud because of the “suspicious” nature of the clarification.
The Maricopa County Board of Supervisors posted an open letter to Maricopa
County voters, articulating that accurate vote counting was a bipartisan com-
mitment, and took on Sharpiegate directly: “sharpies do not invalidate ballots.
We did extensive testing on multiple different types of ink with our new vote
tabulation equipment. Sharpies are recommended by the manufacturer because
they provide the fastest-drying ink. The offset columns on ballots ensure that
any bleed-through will not impact your vote.”164
More fact-checks appeared that same day. Arizona Secretary of State Katie
Hobbs released a Twitter thread debunking Sharpiegate, with a marginally more
positive effect (and over 12,000 engagements), and AZ Family News published a
fact-check linking to Hobbs’s tweet thread and the Maricopa County video.165
But the misleading narrative continued to spread.
Despite these early fact-checks by government officials, the platforms’ responses
to the claims were neither timely nor standardized. On Twitter, some Sharpie-
gate content came down, other tweets were labeled, and still others were left
untouched. Facebook, Instagram, and TikTok had similar responses: labeling
and removing some, but not all, of the Sharpiegate content. The YouTube videos
related to the Sharpiegate narrative were labeled, but none were taken down.
Despite the many efforts made by news outlets and state officials to fact-check
these claims, the narrative spread quickly, and the same misleading content
appeared across multiple platforms. The Sharpiegate narrative reached thou-
sands of individuals and inspired some of them to organize and participate
in real-world protests.166 Despite prompt attempts at debunking these claims,
belief in Sharpiegate persisted, and it was ultimately incorporated into the
broader Stop the Steal narrative that followed.

Case Study 2: Fact-Checking the Dominion Narrative


As the allegations against Dominion Voting Systems moved from Georgia to
Michigan to states across the country, fact-checkers tried to keep up. On
November 6, the Michigan Department of State issued a statement on its web-
site refuting allegations that Dominion Voting Systems was responsible for voter
fraud in Antrim County.167 The statement was subsequently shared by the Michi-
gan Department of State’s Twitter account, with responses in the comments
varying from gratitude for the clarification to outright denial of the Department’s
refutation.168
Similarly, on November 12, CISA released a statement certifying that there was
“no evidence that any voting system deleted or lost votes, changed votes, or was
in any way compromised.”169 CISA’s findings were subsequently corroborated by
the US Department of Justice when Attorney General Bill Barr confirmed that
there was no evidence of widespread voter fraud.170
The narrative also centered on the swing states of Arizona, Georgia, and Penn-
sylvania; in each state, fact-checkers debunked the claims. In Arizona, the
Maricopa County Board of Supervisors refuted claims of voter fraud by Domin-
ion Voting Systems in a public statement.171 The Georgia Secretary of State
released a statement confirming that “the original machine count accurately
portrayed the winner of the election.”172 In Pennsylvania, the state validated the
accuracy of the voting machines and their official tallies, further highlighting
that Dominion Voting machines had not been used in counties such as Allegheny
and Philadelphia—counties that Trump falsely claimed were responsible for
rigging the election.
Dominion Voting Systems released its own statement debunking claims that its
systems were used to switch votes or to fraudulently cast votes. The statement
cited evidence to refute claims of vote manipulation in the same four states:
Arizona, Georgia, Michigan, and Pennsylvania.173
Though false allegations of voter fraud due to Dominion Voting machines were
repeatedly debunked, propagation of misinformation relating to vote tabulation
and voting interference nonetheless appears to have had a significant impact
on how the 2020 election was perceived—social media commentary alleging
malfeasance was extensive and widespread. Nearly a month after the election,
election officials and public officials in Georgia were still holding
press conferences to debunk the misinformation.174 Even beyond that, members
of the Trump administration as well as Trump’s supporters continued to pursue
allegations of fraud related to the Dominion voting machines (discussed further
in Chapter 4), which repeatedly reinforced claims of a rigged election among
supporters. This case was an example of the balancing act that must take
place when fact-checking: because fact-checking can draw further attention to
misinformation or conspiracy, individuals or organizations debunking stories
must take care not to unintentionally amplify narratives that could cause
real-world harm, fear, or suppression.175

3.7 Final Observations


Tickets processed by the Election Integrity Partnership and external organiza-
tions were diverse—focused on different real or purported incidents, in different
states, over the course of months. The Partnership’s breadth of exposure to
election-related narratives provides unique insight into how misinformation
evolved and the themes that cut across these discrete time periods. We conclude
with five reflections on election-related misinformation narratives:
1. Researchers can predict, but not necessarily prevent, these dynamics.
On October 26, 2020, during the pre-election stage, a team of EIP researchers
published a piece, “Uncertainty and Misinformation: What to Expect on Election
Night and Days After.”176 This blog post presented a set of expectations, including
that the winner of the election would not be known on election night, that
red/blue or blue/red shifts would create opportunities for political actors and
conspiracy theorists to delegitimize the election, that voting process failures
would be strategically framed and overemphasized to fit misleading narratives,
and that “bad statistics” would be selectively highlighted.
The EIP post demonstrates the extent to which election-related misinformation
was predictable. As described throughout this chapter, many of these predic-
tions were realized. However, ease of prediction does not necessarily correlate
with ease of prevention. Although the EIP and others published advice for jour-
nalists covering the election and many journalists followed best practices, the
predictable misinformation narratives still played out during and after election
night. Further research should explore the effectiveness of prebunk/inoculation
strategies, clear journalistic coverage, and fact-checking in the 2020 election.
The post also suggests the need for more ambitious models to counter pre-
dictable election-related misinformation, and underscores the difficulty
credible journalists will face in trying to prevent such misinformation
altogether. Platforms also, to our knowledge, did not adequately systematize
the predictability of certain narratives to create preventative policies.
2. Non-falsifiable misinformation poses challenges for platforms.


The election information ecosphere was replete with non-falsifiable claims. For
example, when Project Veritas relies on anonymous whistleblowers, it is difficult
for independent news outlets to determine the veracity of the whistleblowers’
claims. Likewise, when social media users post that a “friend of a friend” experi-
enced or witnessed a particular event, researchers can’t reliably prove that the
claim of an unnamed “friend” is false.
Non-falsifiable narratives erode the information ecosphere; the clarity of fact
and the power of credible voices are muddled by non-falsifiable noise. In the 2020
election, the EIP saw numerous non-falsifiable claims in its tickets—some labeled by
platforms, others not—which contributed to broader narratives that the election
was unreliable or rigged. And when clearly falsifiable narratives were fact-
checked, they still became part of the conspiratorial discourse about election
fraud. Non-falsifiable information created for political gain will continue to be a
challenge for platforms moving forward. But so will clearly falsifiable information,
if platforms do not adequately and consistently take action against false claims.
3. Frames, not just facts, set the course.
Much of the misinformation the EIP observed in the 2020 election—including
non-falsifiable content—relied on framing. As we will describe in Chapter 4,
“frames highlight some bits of information about an item that is the subject of a
communication, thereby elevating them in salience.”177 Whether a mail-dumping
incident is seen as a one-off mistake by a postal service agent or as Democrats
stealing the election, or whether a red mirage/blue wave is evidence of mail-in
ballots arriving after Election Day or a conspiracy at work, depends on how the
event is framed.
Misinformation in the 2020 election cycle shows that how information is pack-
aged largely determines the effect of that information. In Chapter 4, we’ll de-
scribe how different actors use framing techniques to channel information to
align with their priors and their favored outcomes.
4. From online to off—election-related misinformation can have real-world
effects.
One of the biggest challenges in the misinformation research community is how
to measure effects. The baseline is often to use engagement statistics—how
many people like, comment, or share a post, for example. Throughout this
report, we often refer to such engagement statistics. However, there is a gap
between engagement on social media and change in attitudes or behaviors. Just
because someone “likes” a piece of misinformation does not necessarily mean
that they believe it or that it changed their view.
In this election cycle, EIP partners observed misinformation on social media form
the basis of real-world actions—including the formation of activist groups and
protests, and ultimately a violent insurrection at the Capitol. Misinformation in
the pre-election stage undermined confidence in mail-in voting, delegitimizing
the election process and setting the stage for post-election claims that the elec-
tion was stolen. For months, right-wing social media users had been fed online
“evidence” of a rigged election, grievances that coalesced into the #StoptheSteal movement.
Right-wing social media personalities—including individuals who have repeat-
edly been tied to spreading misinformation and conspiracy theories—created a
website and email discussion list for #StoptheSteal supporters to mobilize.178
Over a month after the election, #StoptheSteal events continued to take place
nationwide—some with kinetic effects including stabbings and other violence.179
On January 6, the real-world effects of election-related misinformation reached
fever pitch. Ali Alexander and other right-wing influencers had encouraged
Trump supporters throughout the country to converge on Washington, DC, to
protest in person. That morning, the President told a crowd of supporters that
“this election was stolen from you, from me, from the country” and encouraged
his supporters to march on the Capitol. A group of these protestors—including
white supremacists and QAnon believers—violently broke into the Capitol, killing
Capitol Police officer Brian Sicknick; four others died during the riot. The series
of events shows that online misinformation can engender real-life radicalization
with deadly consequences. Even as some social media platforms removed
content from the day, the stain on American democracy remains.


Notes

1. (page 48) Chip Scanlan, “What is Narrative, Anyway?” Poynter, September 29, 2013, https://www.poynter.org/reporting-editing/what-is-narrative-anyway

2. (page 48) Robert M. Entman, “Framing: Toward Clarification of a Fractured Paradigm,” Journal of Communication 43, no. 4 (December 1993): 51–58, https://doi.org/

3. (page 49) Pew Research Center, “Sharp Divisions on Vote Counts, as Biden Gets High Marks for His Post-Election Conduct: Topline Questionnaire,” Pew Research Center, November 2020, https://www.pewresearch.org/politics/wp-content/uploads/sites/PP_Post-Election-Views_TOPLINE.pdf

4. (page 51) National Center for Health Statistics, “Provisional Death Counts for Coronavirus Disease 2019 (COVID-19): Daily Updates of Totals by Week and State,” Centers for Disease Control and Prevention, updated February 12, 2021, https://www.cdc.gov/nchs/nvss/vsrr/covid19/index.htm

5. (page 51) Richard H. Pildes, “Early and Mail-In Voting for 2020 Election Expands Dramatically Despite Legal Fights,” Wall Street Journal, October 30, 2020, https://www.wsj.com/articles/early-and-mail-in-voting-for-2020-election-expands-dramatically-despite-legal-fights; Alex Hufford, “The Rise of Ballot Drop Boxes Due to the Coronavirus,” Lawfare (blog), August 27, 2020, https://www.lawfareblog.com/rise-ballot-drop-boxes-due-coronavirus

6. (page 51) Luke Broadwater, “Both Parties Fret as More Democrats Request Mail Ballots in Key States,” New York Times, updated October 12, 2020, https://www.nytimes.com/us/mail-voting-democrats-republicans-turnout.html

7. (page 51) Candace Owens (@RealCandaceO), “The Democrats rigged a United States election in the middle of the night by dumping mail-in ballots,” Twitter, November 14, 2020, 8:43 am, https://web.archive.org/web/https://twitter.com/RealCandaceO/status/; Donald J. Trump (@realDonaldTrump), “MAIL-IN VOTING WILL LEAD TO MASSIVE FRAUD AND ABUSE. IT WILL ALSO LEAD TO THE END OF OUR GREAT REPUBLICAN PARTY. WE CAN NEVER LET THIS TRAGEDY BEFALL OUR NATION. BIG MAIL-IN VICTORY IN TEXAS COURT TODAY. CONGRATS!!!” Twitter, May 28, 2020, https://web.archive.org/web/https://twitter.com/realDonaldTrump/status/
8. (page 52) United States Attorney’s Office, District of New Jersey, “Postal Employee Arrested for Dumping Mail, Including Election Ballots Sent to West Orange Residents,” Press Release, October 7, 2020, https://www.justice.gov/usao-nj/pr/postal-employee-arrested-dumping-mail-including-election-ballots-sent-west-orange; United States Attorney’s Office, District of New Jersey, “U.S. Postal Service Letter Carrier Indicted For Allegedly Destroying Mail,” Press Release, July 18, 2013, https://www.justice.gov/usao-ri/pr/us-postal-service-letter-carrier-indicted-allegedly-destroying-mail

9. (page 52) Ian Kennedy, et al., “Emerging Narratives Around ‘Mail Dumping’ and Election Integrity,” Election Integrity Partnership, September 29, 2020, https://www.eipartnership.net/rapid-response/mail-dumping

10. (page 54) Haley Smith, “Ballot box fire in Baldwin Park may be arson, officials say,” Los Angeles Times, October 19, 2020, https://www.latimes.com/california/story/2020-10-19/ballot-box-fire-baldwin-park-may-be-arson; Aaron Sharockman, “Social posters spread election misinformation about ‘shredded’ ballot applications for Donald Trump,” PolitiFact, October 8, 2020, https://www.politifact.com/factchecks/2020/oct/08/blog-posting/social-posters-spread-election-misinformation-abou

11. (page 54) Kelli Dugan, “California mail dump in salon parking lot caught on surveillance video,” KIRO 7 News, Washington, updated September 7, 2020, https://www.kiro7.com/news/trending/california-mail-dump-salon-parking-lot-caught-surveillance-video/

12. (page 54) Topher Gauk-Roger and Stephanie Becker, “Video shows USPS mail being dumped in a California parking lot. A postal union says USPS employees weren’t involved,” CNN, updated September 8, 2020, https://www.cnn.com/us/usps-mail-dump-parking-lot-trnd/index.html

13. (page 55) Ian Kennedy, et al., “Emerging Narratives Around ‘Mail Dumping’ and Election Integrity,” https://www.eipartnership.net/rapid-response/mail-dumping

14. (page 56) Jake Prinsen and Natalie Brophy, “Updated: U.S. Postal Service investigates mail, including absentee ballots, found along road in Outagamie County,” Post-Crescent, updated October 2, 2020, https://www.postcrescent.com/story/news/outagamie-county-absentee-ballots-found-along-road-fox-valley-greenville
15. (page 56) Patrick Marley, “Mail found in Greenville ditch did not include any Wisconsin ballots,” Post-Crescent, updated October 1, 2020, https://www.postcrescent.com/story/news/politics/elections/mail-found-greenville-ditch-did-not-include-any-wisconsin-ballots

16. (page 56) Alice Reid, “Authorities release photos of mail, including ballots, found in ditch last year,” NBC 26 Green Bay, updated February 19, 2021, https://www.nbc26.com/news/local-news/authorities-release-photos-of-mail-including-ballots-found-in-ditch-last-year

17. (page 56) Jim Hoft, “Breaking: US Mail Found in Ditch in Rural Wisconsin — Included Absentee Ballots,” The Gateway Pundit, September 23, 2020, https://web.archive.org/web/https://www.thegatewaypundit.com/breaking-us-mail-found-ditch-greenville-wisconsin-included-absentee-ballots

18. (page 57) Ian Kennedy, et al., “Emerging Narratives Around ‘Mail Dumping’ and Election Integrity,” https://www.eipartnership.net/rapid-response/mail-dumping

19. (page 58) Center for an Informed Public, “Cumulative Graph - Tweets about Mail Dumping Incident in Greenville, Wisconsin,” University of Washington, http://faculty.washington.edu/kstarbi/DitchingMail-WI.html

20. (page 57) The Hill (@thehill), “Kayleigh McEnany: ‘Mass mail-out voting... could damage either candidate’s chances because it’s a system that’s subject to fraud. In fact, in the last 24 hours, police in Greenville, Wisconsin, found mail in a ditch, and it included absentee ballots,’” Twitter, September 24, 2020, 1:51 pm, https://web.archive.org/web/https://twitter.com/thehill/status/

21. (page 58) Samantha Putterman, “Ballots in California were actually old empty envelopes from 2018 election,” PolitiFact, September 28, 2020, https://www.politifact.com/factchecks/2020/sep/28/facebook-posts/ballots-california-dumpster-were-actually-old-empt

22. (page 60) Currently, according to the National Conference of State Legislatures, the practice is legal in 26 states and the District of Columbia.

23. (page 61) Donald J. Trump (@realDonaldTrump), “GET RID OF BALLOT HARVESTING, IT IS RAMPANT WITH FRAUD. THE USA MUST HAVE VOTER I.D., THE ONLY WAY TO GET AN HONEST COUNT!” Twitter, April 14, 2020, https://web.archive.org/web/https://twitter.com/realDonaldTrump/status/

24. (page 61) Caitlin Huey-Burns and Musadiq Bidar, “What is ballot harvesting, where is it allowed and should you hand your ballot to a stranger?” CBS News, September 20, 2020, https://www.cbsnews.com/news/ballot-harvesting-collection-absentee-voting-explained-rules; Dylan Scott, “North Carolina elections board orders new House election after ballot tampering scandal,” Vox, February 21, 2019, https://www.vox.com/policy-and-politics/north-carolina-election-fraud-new-nc-9-election

25. (page 61) Jon Levine, “Confessions of a voter fraud: I was a master at fixing mail-in ballots,” New York Post, April 29, 2020, https://nypost.com/political-insider-explains-voter-fraud-with-mail-in-ballots
26. (page 61) Christopher Wan, Alex Popke, and Haley Schwab, “Ballot Collection,” Stanford-MIT Healthy Elections Project, updated October 22, 2020, https://healthyelections.org/sites/default/files/Ballot_Collection.pdf; Elena Cryst, Tara Kheradpir, and Erin McAweeney, “The Ballot Harvesting Trope,” Election Integrity Partnership, October 27, 2020, https://www.eipartnership.net/rapid-response/ballot-harvesting

27. (page 63) Esme Murphy, “‘Project Veritas’ Report Claims Ilhan Omar Supporters Harvested Ballots,” CBS 4 Minnesota, September 29, 2020, https://minnesota.cbslocal.com/project-veritas-report-accuses-ilhan-omar-supporters-of-illegally-harvesting-ballots

28. (page 63) Brian Michael Goss, “Veritable Flak Mill,” Journalism Studies 19, no. 4 (2018): 548–563, https://doi.org/; Joel Meares, “O’Keefe Teaches Media a Lesson (Again),” Columbia Journalism Review, March 15, 2011, https://archives.cjr.org/campaign_desk/okeefe_teaches_media_a_lesson_again.php; James Poniewozik, “The Twisty, Bent Truth of the NPR-Sting Video,” TIME, March 13, 2011, https://entertainment.time.com/the-twisty-bent-truth-of-the-npr-sting-video; Paul Farhi, “Is it okay for James O’Keefe’s ‘investigative reporting’ to rely on deception?” Washington Post, October 19, 2016, https://www.washingtonpost.com/lifestyle/style/is-it-okay-for-james-okeefes-investigative-reporting-to-rely-on-deception

29. (page 63) Bethania Palma, “Viral Video Spreads Unfounded Claim About Rep. Ilhan Omar and Voter Fraud,” Snopes, updated October 19, 2020, https://www.snopes.com/news/project-veritas-ilhan-omar

30. (page 63) Isabella Garcia-Camargo, et al., “Project Veritas #BallotHarvesting Amplification,” Election Integrity Partnership, September 29, 2020, https://www.eipartnership.net/rapid-response/project-veritas-ballotharvesting

31. (page 66) Maggie Astor, “Project Veritas Video was a ‘Coordinated Disinformation Campaign,’ Researchers Say,” New York Times, September 29, 2020, https://www.nytimes.com/us/politics/project-veritas-ilhan-omar.html

32. (page 66) Tom Lyden, “Subject of Project Veritas voter fraud story says he was offered bribe,” FOX 9 Minnesota, updated October 6, 2019, https://www.fox9.com/news/subject-of-project-veritas-voter-fraud-story-says-he-was-offered-bribe

33. (page 66) Camille Caldera, “Fact check: No proof of ballot fraud scheme, link to Rep. Ilhan Omar,” USA Today, October 16, 2020, https://www.fox9.com/news/subject-of-project-veritas-voter-fraud-story-says-he-was-offered-bribe
34. (page 68) Pew Research Center, “Sharp Divisions on Vote Counts.”

35. (page 69) Candace Owens (@RealCandaceO), “The Democrats rigged a United States election.”

36. (page 68) Joey Garrison, “Biden and Trump each warn that other side may ‘steal’ the election as fight over mail voting rages,” USA Today, updated June 14, 2020, https://www.usatoday.com/story/news/politics/elections/election-2020-biden-says-his-concern-trump-steal-election

37. (page 70) David A. Graham, “The Blue Shift Will Decide the Election,” The Atlantic, August 10, 2020, https://www.theatlantic.com/ideas/archive/brace-blue-shift

38. (page 70) Joey Garrison, “Election week? Maybe not: Arizona, Florida and others could give early indication of who is winning,” USA Today, October 13, 2020, https://www.usatoday.com/story/news/politics/elections/election-2020-mail-voting-shouldnt-delay-results-several-key-states

39. (page 71) Daniella Silva, “Trump’s calls for supporters to watch polls ‘very carefully’ raises concerns of voter intimidation,” NBC News, September 30, 2020, https://www.nbcnews.com/news/us-news/trump-s-call-supporters-watch-polls-very-carefully-raises-concerns-n

40. (page 72) Marisa Iati and Adriana Usero, “A viral video implied a man was illegally moving ballots. It was a photographer and his equipment.” Washington Post, November 5, 2020, https://www.washingtonpost.com/technology/video-claiming-detroit-ballot-fraud-debunked

41. (page 73) Donald Trump (@realDonaldTrump), “Volunteer to be a Trump Election Poll Watcher. Sign up today! #MakeAmericaGreatAgain,” Twitter, October 5, 2020, 8:17 am, https://web.archive.org/web/twitter.com/realdonaldtrump/status/

42. (page 73) @[account name redacted], “@jack @Twitter This tweet is encouraging election violence,” Twitter, October 5, 2020, https://twitter.com/amy_norcal/status/

43. (page 73) Brian Klaas (@brianklaas), “To be clear, the president who has repeatedly encouraged political violence,” Twitter, October 5, 2020, https://twitter.com/brianklaas/status/

44. (page 74) Thomas Wright, “Large-Scale Political Unrest Is Unlikely, But Not Impossible,” The Atlantic, September 26, 2020, https://www.theatlantic.com/ideas/archive/large-scale-political-unrest-unlikely-not-impossible

45. (page 74) Renée DiResta and Isabella Garcia-Camargo, “Laying the Groundwork: Meta-Narratives and Delegitimization Over Time,” Election Integrity Partnership, October 19, 2020, https://www.eipartnership.net/rapid-response/election-delegitimization-meta-narratives

46. (page 74) Jeremy W. Peters and Alan Feuer, “How Richard Jewell’s Lawyer Became a Pro-Trump Conspiracy Theorist,” New York Times, December 29, 2020, https://www.nytimes.com/us/politics/lin-wood-georgia-trump.html

47. (page 76) Amy Jacobson (@AmyJacobson), “If you are voting at LAKEVIEW HS bring your own black pen! Ballots are double sided and the sharpies they provide are bleeding through. Polling Marshal says there’s nothing she can do,” Twitter, November 3, 2020, https://twitter.com/AmyJacobson/status/

48. (page 76) Maria Savedra, “Chicago Voters Using Sharpies Are Concerned about Ink Bleeding Through Ballots,” news segment, CBS 2 Chicago, November 3, 2020, 5:48 pm, https://chicago.cbslocal.com/chicago-voters-using-sharpies-are-concerned-about-ink-bleed-through

49. (page 77) Melissa Blasius, “The origins of Sharpiegate and preventing future voter uncertainty,” ABC 15 Arizona, November 6, 2020, https://www.abc15.com/news/election-2020/the-origins-of-sharpiegate-and-preventing-future-voter-uncertainty

50. (page 78) Caleb Brown (@breeze32), “AZ wtf you doing? #ElectionNight #voterfraud,” Twitter, November 3, 2020, 10:29 pm, https://web.archive.org/web/https://twitter.com/cbreeze/status/

51. (page 77) Maricopa County Elections Department, “You can use a sharpie on your ballot,” Facebook, November 3, 2020, https://www.facebook.com/watch?v=

52. (page 78) Elahe Izadi, “First CNN, then within minutes, most other news organizations called the race for Biden,” Washington Post, November 7, 2020, https://www.washingtonpost.com/media/fox-news-biden-president

53. (page 79) James Walker, “What Donald Trump Said in Premature ‘Victory’ Statement,” Newsweek, November 4, 2020, https://www.newsweek.com/what-donald-trump-said-election-victory-speech-full-transcript

54. (page 79) Rumor Control Page, Cybersecurity and Infrastructure Security Agency, accessed February 10, 2021, https://www.cisa.gov/rumorcontrol

55. (page 79) Dana Nessel (@dananessel), “Dear members of the public: Please stop making harassing & threatening calls to my staff. They are kind, hardworking public servants just doing their job. Asking them to shove sharpies in uncomfortable places is never appropriate & is a sad commentary on the state of our nation,” Twitter, November 5, 2020, 12:00 pm, https://web.archive.org/web/https://twitter.com/dananessel/status/
56. (page 80) John D’Anna and Richard Ruelas, “‘Sharpiegate’ has not halted ballot counting in Arizona, but the debunked theory persists,” AZCentral, updated November 6, 2020, https://www.azcentral.com/story/news/politics/elections/sharpiegate-hasnt-halted-arizona-count-but-theory-persists

57. (page 80) 12 News Arizona, “Trump supporters protest outside Maricopa County elections office for second night,” YouTube video, posted November 5, 2020, https://www.youtube.com/watch?v=; James Walker, “Alex Jones Joins Maricopa County Protest, Yells ‘You Ain’t Stealing S***’ in Wild Rant,” Newsweek, November 6, 2020, https://www.newsweek.com/alex-jones-maricopa-county-protest-wild-rant

58. (page 80) Howard Fischer, “Arizona continues counting as race between Biden, Trump tightens,” Arizona Daily Star, updated December 16, 2020, https://tucson.com/news/state-and-regional/arizona-continues-counting-ballots-as-race-between-biden-trump-tightens/

59. (page 80) Steven Hsieh, “Facebook Temporarily Shut Down Anti-Migrant Group AZ Patriots’ Page,” Phoenix New Times, July 23, 2019, https://www.phoenixnewtimes.com/news/facebook-shuts-down-anti-migrant-group-az-patriots-page

60. (page 80) Samantha Bradshaw, et al., “Election Delegitimization: Coming to You Live,” Election Integrity Partnership, November 17, 2020, https://www.eipartnership.net/rapid-response/election-delegitimization-coming-to-you-live

61. (page 82) DFRLab, “#StopTheSteal: Timeline of Social Media and Extremist Activities Leading to 1/6 Insurrection,” Just Security, February 10, 2021, https://www.justsecurity.org/stopthesteal-timeline-of-social-media-and-extremist-activities-leading-to-1-6-insurrection

62. (page 82) Donald J. Trump (@realDonaldTrump), “Watch: Hundreds of Activists Gather for ‘Stop the Steal’ Rally in Georgia,” Twitter, November 21, 2020, 12:34 am, http://web.archive.org/web/https://twitter.com/realdonaldtrump/status/

63. (page 83) Rob Kuznia, et al., “Stop the Steal’s massive disinformation campaign connected to Roger Stone,” CNN, November 14, 2020, https://www.cnn.com/business/stop-the-steal-disinformation-campaign-invs/index.html

64. (page 83) Roger Stone, “Stop the Steal Diary,” Stone Cold Truth, November 30, 2020, https://stonecoldtruth.com/stop-the-steal-diary
65. (page 83) Roger Stone, “The Lies of CNN,” Stone Cold Truth, November 20, 2020, https://stonecoldtruth.com/the-lies-of-cnn

66. (page 83) Philip Bump, “A decade of wringing money and power out of conservative victimhood nears its apex,” Washington Post, December 8, 2020, https://www.washingtonpost.com/politics/decade-wringing-money-power-out-conservative-victimhood-nears-its-apex

67. (page 83) Jewish Deplorable (@TrumpJew), “They’re stealing this election in broad daylight. Extending mail-in deadlines, harvesting, kicking the Green Party off ballots in key states,” Twitter, September 22, 2020, 12:23 pm, https://web.archive.org/web/https://twitter.com/TrumpJew/status/

68. (page 84) Ali Alexander (@ali), “North Carolina joins Pennsylvania, Michigan and Wisconsin in recent decisions favoring Democrats,” Twitter, September 22, 2020, 12:17 pm, https://web.archive.org/web/https://twitter.com/ali/status/; Jewish Deplorable (@TrumpJew), “PA, WI, MI, and NC extended the mail-in ballot deadline for days AFTER the election...all in the past week,” Twitter, September 22, 11:55 am, http://web.archive.org/web/https://twitter.com/TrumpJew/status/; @[account name redacted], “I’m in the founder’s program and have been ‘screaming,’” http://web.archive.org/web/https://twitter.com/TrumpJew/status/; Jewish Deplorable (@TrumpJew), “They’re stealing this election.”

69. (page 85) Cristina Laila, “BREAKING: North Carolina Joins Michigan and Pennsylvania—Will Accept Ballots for 9 Days After Election and Ballot Harvesting,” The Gateway Pundit, September 22, 2020, https://www.thegatewaypundit.com/breaking-north-carolina-joins-michigan-pennsylvania-will-accept-late-ballots-9-days-election-ballot-harvesting; Jim Hoft, “It’s Now Clear: Trump Will Win the Election — Democrats Will Steal – ‘True the Vote’ Offers ESSENTIAL TIPS on what YOU CAN DO TO STOP THE STEAL,” The Gateway Pundit, September 24, 2020, https://www.thegatewaypundit.com/now-clear-trump-will-win-election-democrats-will-steal-true-vote-offers-essential-tips-can-stop-steal; Cristina Laila, “VIDEO: Trump Observers Are Being Blocked Entry to All Satellite Voting Locations in Philly #StopTheSteal,” The Gateway Pundit, September 29, 2020, https://www.thegatewaypundit.com/trump-observers-blocked-entry-satellite-voting-locations-philly-stopthesteal
70. (page 84) Eric Trump (@EricTrump), “Does it shock anyone that poll
watchers are being blocked and kicked out of voting locations in Philly? We are
in court right now! This corruption must end!” Twitter, September 29, 10:39 am,
https t itter com EricTrump status ;
Donald J Trump (@realDonaldTrump), “Wow. Won’t let Poll Watchers & Security
into Philadelphia Voting Places. There is only one reason why. Corruption!!!
Must have a fair Election,” Twitter, September 29, 2:14 pm,
https eb archive org eb https t itter com
realDonaldTrump status
71. (page 85) Sheera Frenkel, “The Rise and Fall of the ‘Stop the Steal’ Facebook
Group,” New York Times, November 5, 2020, https nytimes com
technology stop-the-steal-facebook-group html
72. (page 87) Ali Alexander (@ali), “I am willing to give my life for this fight,”
Twitter, December 7, 6:23 pm,
https archive is ct C; Arizona Republican party (AZGOP), “He is. Are you?”
Twitter, December 7, 2020, 9:52 pm, https eb archive org eb
https t itter com GOP status
73. (page 87) Vandana Rambaran, “Pro-Trump election Facebook group ‘Stop
the Steal’ banned after calling for violence,” Fox News, November 10, 2020,
https fo ne s com politics pro-trump-election-facebook-group-stop-the-
steal-banned-after-calling-for-violence;
Danielle Wallace, “Trump supporter allegedly rips mask off counter-protester
at FL ‘Stop the Steal’ rally, police investigating,” Fox News, November 10, 2020,
https fo ne s com us florida-trump-supporter-rips-mask-off-pensacola-
stop-the-steal-rally
74. (page 87) OAN Newsroom, “Voters hold ‘stop the steal’ rallies in Mich., Nev.”
One America News Network, updated November 20, 2020,
https archive is WF R
75. (page 87) OAN Newsroom, “Exclusive: Stop the Steal organizers discuss


reelection efforts,” One America News Network, updated December 14, 2020,
https eb archive org eb https oann com e clusive-
stop-the-steal-organi ers-discuss-reelection-efforts
76. (page 89) Reuters Staff, “Fact check: Online rumor of voter fraud scheme
using women’s maiden names is baseless,” Reuters, November 13, 2020,
https reuters com article uk-factcheck-maidengate-near-impossible fact-
check-online-rumor-of-voter-fraud-scheme-using- omens-maiden-names-is-
baseless-idUS T PE
77. (page 89) @[account name redacted], “Bill Gates has owned a product
placement company with offices around the world for 30 years. He also once
owned the rights to original Tiananmen Square photos. He sold them to China,
and now they’re gone,” Reddit, May 20, 2020,
https eb archive org eb https reddit com r
conspiracy comments gnmhd bill gates has o ned a product placement
company
78. (page 90) Davey Alba, “No Proof People Stole Maiden Names to Vote,”
New York Times, November 11, 2020, https nytimes com
technology no-proof-maiden-names-vote html
79. (page 90) Ari Sen and Brandy Zadrozny, “QAnon groups have millions of
members on Facebook, documents show,” NBC News, August 10, 2020,
https nbcne s com tech tech-ne s anon-groups-have-millions-members-
facebook-documents-sho -n
80. (page 91) Mary Fanning and Alan Jones, “Whistleblower Tapes: Trump
Wiretapped ‘A Zillion Times’ By ‘The Hammer,’ Brennan’s and Clapper’s Secret
Computer System,” The American Report, March 17, 2017,
https eb archive org eb https theamericanreport org
histleblo er-tapes-trump- iretapped- illion-times-hammer-brennans-
clappers-secret-computer-system
81. (page 91) Dan Evon, “Is There a ‘Hammer and Scorecard Operation to
Manipulate Vote Counts?” Snopes, November 9, 2020,
https snopes com fact-check hammer-scorecard-vote-counts
82. (page 91) Pat Beall, “Will your ballot be safe? Computer experts sound
warnings on America’s voting machines,” USA Today, updated November 2, 2020,
https usatoday com story ne s investigations computer-
e perts-sound- arnings-safety-americas-voting-machines ;
Kim Zetter, “Cause of Election Day glitch in Georgia counties still unexplained,”
POLITICO, updated November 12, 2020,
https politico com ne s georgia-election-machine-glitch-


83. (page 91) Kim Zetter, “Cause of Election Day glitch in Georgia counties still
unexplained.”
84. (page 91) Max Johnston, “Republicans sound alarm on Antrim County
election results,” Michigan Radio, November 4, 2020,
https michiganradio org post republicans-sound-alarm-antrim-county-
election-results;
“Officials: Clerk error behind county results favoring Biden,” Associated Press,
November 7, 2020,
https apne s com article oe-biden-donald-trump-technology-voting-michigan-
beeef e d eaa db f f
85. (page 91) Jocelyn Benson, “Isolated User Error in Antrim County Does Not
Affect Election Results, Has no Impact on Other Counties or States,” Michigan
Department of State, November 7, 2020, https michigan gov documents
sos ntrim Fact Check pdf
86. (page 91) Matt Schlapp (@mschlapp), “Have Republican votes mysteriously
disappeared in Antrim County Michigan and it has become a strong Biden
territory? Odd that @GovWhitmer owns property in the county - coincidence?
#StopTheSteal,” Twitter, November 4, 2020, 11:34 am,
http eb archive org eb https t itter com mschlapp
status ;
Dennis Lennox (@dennislenoxx), “Trump did not lose #AntrimCounty in
#Michigan. It is now confirmed that 32 other counties use the same software
as Antrim County. He will get 6,000-plus votes out of Antrim. What about the
other counties with the same software?” Twitter, November 4, 2020, 8:55 am,
http eb archive org eb https t itter com dennislenno
status
87. (page 92) CDMedia, “Interview with Source on Election Vote Fraud,” YouTube,
November 5, 2020, https eb archive org eb if https
youtube com atch v ficae feature youtu be
88. (page 92) Craig Mauger and Beth LeBlanc, “Michigan Republicans: Election
fight ‘not over’ despite Biden’s lead,” Detroit News, updated November 6, 2020,
https detroitne s com story ne s local michigan gop-
michigan-discuss-election-status-integrity
89. (page 92) Jocelyn Benson, “False claims from Ronna McDaniel have no merit,”
Michigan Department of State, November 6, 2020,
https michigan gov sos - - - – html
90. (page 92) Jim Hoft, “Update: Corrupted Software that Stole 6,000 Votes
From Trump in Michigan - Shut Down for TWO HOURS in Red Counties in
Georgia on Election Day,” The Gateway Pundit, November 6, 2020,
https thegate aypundit com update-corrupted-soft are-stole-


-votes-trump-michigan-county-shut-t o-hours-red-counties-georgia-election-
day ;
Alana Mastrangelo, “Georgia Counties Using Same Software as Michigan
Counties Also Encounter ‘Glitch,”’ Breitbart, November 7, 2020,
https breitbart com politics georgia-counties-using-same-
soft are-as-michigan-counties-also-encounter-glitch
91. (page 92) Donald J. Trump (@DonaldTrump), “Georgia Counties Using Same
Software as Michigan Counties Also Encounter ‘Glitch’ - What a total mess this
“election” has been!” Facebook, November 7, 2020, https facebook com
DonaldTrump posts
92. (page 92) The Western Journal (@WesternJournal), “The same software was
used in 64 other counties,” Facebook, November 6, 2020,
https facebook com posts ;
Mike Huckabee, “The same software was used in 64 other counties,” Facebook,
November 6, 2020, https facebook com posts
;
Alfredo Jalife-Rahme, “¿Será? ¿Contrataron a Hilldebrando: el gangsteril hermano de la bandida Zavala Gómez del Campo y cuñado de @Felipecalderon? Jajaja” (“Could it be? Did they hire Hildebrando: the gangster brother of the bandit Zavala Gómez del Campo and brother-in-law of @Felipecalderon? Hahaha”),
Facebook, November 9, 2020,
https facebook com posts
93. (page 93) Kyle Becker (@kylenabecker), “The election software system in
Michigan that switched 6,000 votes from Trump to Biden is called ‘Dominion,”’
Twitter, November 6, 2020, 4:14 pm,
https eb archive org eb https t itter com kylenabecker
status ;
GrrrGraphics Cartoons (@GrrrGraphics), “Here we go...The Kraken
has just begun to fight!” Twitter, December 14, 2020, 9:32 am,
https eb archive org eb https grrrgraphics com release-
the-kraken ;
Donald J. Trump (@realDonaldTrump), “REPORT: DOMINION DELETED 2.7
MILLION TRUMP VOTES NATIONWIDE,” Twitter, November 12, 2020, 4:34 pm,
https isdglobal org p-content uploads Screenshot- - - -
at- -e png
94. (page 92) Lou Dobbs (@loudobbs), “Democracy at Stake: @Sidney-
Powell1 @TomFitton discuss the potential for nationwide voter irreg-
ularities & whether state legislatures and courts will uphold the rule
of law. #MAGA #AmericaFirst #DobbsTwitter,” November 6, 2020,
https t itter com loudobbs status ;
Jason Puckett, David Tregde, and Linda S. Johnson, “VERIFY: Software glitch did
not switch votes to Biden, Michigan officials say,” November 12, 2020,
https cnc com article ne s verify michigan-soft are-glitch-didnt-s itch-


votes -d caddb - da- - be- c c f b


95. (page 92) Maggie Haberman and Alan Feuer, “Trump Team Disavows Lawyer
Who Peddled Conspiracy Theories on Voting,” New York Times, November 22,
2020, https nytimes com us politics sidney-po ell-trump html
96. (page 92) Jim Hoft, “Sidney Powell: Likelihood that 3% of Vote
Total Was Changed in Pre-Election Counting Connected to Ham-
mer and Scorecard (Video),” The Gateway Pundit, November 6, 2020,
https thegate aypundit com sidney-po ell-likelihood- -vote-total-
changed-pre-election-counting-connected-hammer-scorecard-video ;
97. (page 93) Rudy W. Giuliani (@RudyGiuliani), “Any information about Dominion?
Strange to pick a foreign company to count the US vote. Something wrong with
this?” Twitter, November 11, 2020, https t itter com RudyGiuliani status
; Rudy W. Giuliani (@RudyGiuliani), “Did you know a foreign
company,DOMINION,was counting our vote in Michigan, Arizona and Georgia
and other states. But it was a front for SMARTMATIC, who was really doing the
computing. Look up SMARTMATIC and tweet me what you think? It will all
come out,” Twitter, November 14, 2020, https t itter com RudyGiuliani status

98. (page 94) 2one3Studio, “Smoking Gun: ES&S Transferring Vote Ratios
between Precincts in PA. - By: Edward Solomon,” Rumble, November 20, 2020,
https eb archive org eb https rumble com vbas t-
smoking-gun-dominion-transferring-vote-ratios-bet een-precincts-in-pa -by-
e html mref gga mrefc ;
r/conservatives, “Is this the smoking gun we’re looking for?: Dominion Trans-
ferring Vote Ratios between Precincts in PA. - By: Edward Solomon,” Reddit,
November 22, 2020, https eb archive org eb https
reddit com r conservatives comments yl t is this the smoking gun ere
looking for dominion ; Lara Logan (@laralogan), “Smoking Gun: ES&S Trans-
ferring Vote Ratios between Precincts in PA. - By: Edward Solomon,” Twitter,
November 22, 2020, https t itter com laralogan status
99. (page 94) Bernard B. Kerik (@BernardKerik), “If Texas is looking for
evidence of voter and election fraud, tune in today and listen to @RudyGiuliani’s
presentation in Atlanta Georgia at 11AM! Midnight ballots, dirty dominion
machines in Smartmatic software, and statistic impossibilities. #StopTheSteaI,”
Twitter, December 10, 5:00 am,
https eb archive org eb https t itter com ernard erik
status
100. (page 94) Dave Janda, “Dr. Keshavarz-Nia: Election Fraud & Cyber Security,”
Operation Freedom Radio Show website, November 25, 2020, https dave anda
com dr-keshavar -nia-election-fraud-cyber-security


101. (page 94) James Tweedie, “Video: US Cyber-Security Expert Exposes Flaws
in Pennsylvania e-Voting Systems,” Sputnik International, updated November 16,
2020, https sputnikne s com us -video-us-cyber-security-
e pert-e poses-fla s-in-pennsylvania-e-voting-systems
102. (page 95) Bernard B. Kerik (@BernardKerik), “If Texas is looking for
evidence of voter fraud, tune in today and listen to @RudyGiuliani’s pre-
sentation,” https eb archive org eb https t itter com
ernard erik status
103. (page 94) Lin Wood (@LLinWood), “Smartmatic was front & center in effort
to steal 2020 election from @realDonaldTrump,” Twitter, November 27, 2020,
8:27 pm,
https eb archive org eb https t itter com LLinWood
status ;
SG (@GREENESJ333) “#Breaking #BreakingNews #Canadian #Journal-
ists Go to #Dominion’s #Canada office where they are on the SAME
FLOOR as #GeorgeSoros’ #TidesFoundation. They use MASKED servers,
addresses, ips which is often Illegal for ‘Non-Profits’ & NGOs. They are
Hiding & took down their Signs,” Twitter, November 23, 2020, 11:03 am,
https t itter com GREE ES status
104. (page 94) Celtic Warrior (@WarriorforGod69), Twitter, November 23, 2020,
2:13 am,
https t itter com WarriorForGod status
105. (page 94) Jim Hoft, “HUGE UPDATE: Dominion Voting Systems Uses
SolarWinds — Same Company CISA in Rare Warning Reported Was Breached,
Compromised and Should Be Disconnected!!” The Gateway Pundit, December
14, 2020,
https thegate aypundit com huge-update-dominion-voting-
systems-uses-solar inds-company- ust-shut-cisa-govt-found-breached ;
Dustin Volz (@dnvolz), “RUMOR CONTROL: ‘Dominion Voting Systems
does not now nor has it ever used the SolarWinds Orion Platform,
which was subject of the DHS emergency directive dated December
13, 2020,’ a Dominion spokeswoman says,” Twitter, December 14, 2020,
https t itter com dnvol status
106. (page 94) GrrrGraphics Cartoons (@GrrrGraphics), “Here we go...The Kraken
has just begun to fight!”
107. (page 96) @ProudBoysUncensored, Telegram, https t me
Proud oysUncensored ; Andrew Anglin, “Trump Calls Out Dominion
Voting Systems for Deleting Nearly 3 Million Trump Votes!” Daily Stormer,
November 12, 2020, https dailystormer su trump-calls-out-dominion-voting-
systems-for-deleting-nearly- -million-trump-votes


108. (page 96) Ben Collins, “QAnon’s Dominion voter fraud conspiracy theory
reaches the president,” NBC News, November 13, 2020,
https nbcne s com tech tech-ne s -fades- anon-s-dominion-voter-fraud-
conspiracy-theory-reaches-n
109. (page 96) Olivia Rubin, Lucien Bruggeman, and Matthew Mosk, “Dominion
employees latest to face threats, harassment in wake of Trump conspiracy,” ABC
News, November 19, 2020,
https abcne s go com Politics dominion-employees-latest-face-threats-
harassment- ake-trump story id ;
Neon Revolt, “Dominion Fraud (1 of 2),” YouTube, November 30, 2020,
https eb archive org eb https youtube com atch
v F sk d k;
Max McGuire and Joe Otto, “Dominion and How Big Tech Helped Steal the
Election,” Conservative Daily Podcast, YouTube, livestreamed November 9, 2020,
https youtube com atch v t ei gErn feature;
Spencer Ko (@spencer_ko), “This feels like the kind of treason punishable by
death. :)” Twitter, November 12, 2020, https t itter com spencer ko status
s
110. (page 96) Ellen Nakashima, Amy Gardner, and Aaron C. Davis, “FBI links
Iran to ‘Enemies of the People’ hit list targeting top officials who’ve refuted
Trump’s election fraud claim,” Washington Post, December 22, 2020,
https ashingtonpost com national-security iran-election-fraud-
violence a e ba- a - eb-a - a d f dff story html
111. (page 96) Katelyn Polantz, “Dominion Voting Systems sues Giuliani for $1.3
billion over ‘Big Lie’ about election fraud,” CNN, updated January 25, 2021,
https cnn com politics dominion-la suit-giuliani inde html
112. (page 96) Thomas Lifson, “Statement,” American Thinker, January 15, 2021,
https americanthinker com blog statement html
113. (page 97) Philip Bump, “When did the Jan. 6 rally become a march to the
Capitol?” Washington Post, February 10, 2021, https ashingtonpost
com politics hen-did- an- -rally-become-march-capitol ; Wild-
Protest homepage, https eb archive org eb http
ildprotest com about
114. (page 98) Nur Ibrahim, “Did an Antifa Flyer Tell ‘Comrades’ to Pose As
Trump Supporters and Riot After Election Day?” Snopes, December 4, 2020,
https snopes com fact-check antifa-flyer-trump-supporters ;
Will Sommer (@willsommer), “How the ‘November 4’ conspiracy the-
ory took over the pro-Trump internet,” Medium, October 31, https
medium com illsommer ho -the-november- -antifa-conspiracy-theory-
took-over-the-pro-trump-internet-a e f d f;


Angelo Fichera, “Antifa ‘False Flag’ is an Old Hoax,” https factcheck org
antifa-false-flag-flyer-is-an-old-hoa .
115. (page 99) Cooper Raterink, et al., “Seeking To Help and Doing Harm: The
Case of Well-Intentioned Misinformation,” Election Integrity Partnership, Octo-
ber 28, 2020,
https eipartnership net rapid-response ell-intentioned-misinformation
116. (page 99) Bente Birkeland and Miles Parks, “The Toll of Conspiracy
Theories: A Voting Security Expert Lives in Hiding,” NPR, December 23,
2020, https npr org the-toll-of-conspiracy-theories-
a-voting-security-e pert-lives-in-hiding
117. (page 101) Donald Trump, “Save America” (speech, Washington, DC, January
6, 2021). Transcript at Rev, https rev com blog transcripts donald-trump-
speech-save-america-rally-transcript- anuary-
118. (page 101) Shelly Tan, Yujin Shin, and Danielle Rindler, “How one of America’s
ugliest days unraveled inside and outside the Capitol,” Washington Post, Jan-
uary 9, 2021, https ashingtonpost com nation interactive capitol-
insurrection-visual-timeline
119. (page 101) Sheera Frenkel, “How the Storming of Capitol Hill Was Organized
on Social Media,” New York Times, January 6, 2021, https nytimes com
us politics protesters-storm-capitol-hill-building html
120. (page 101) Sheera Frenkel, “How the Storming of Capitol Hill Was Organized
on Social Media.”
121. (page 102) “What is Falun Gong?” Economist, accessed February 10,
2021, https economist com the-economist-e plains hat-is-
falun-gong
122. (page 102) Kevin Roose, “‘Epoch Times’: From anti-China tabloid to right-
wing influence,” New York Times China, October 27, 2020, https cn nytimes
com technology epoch-times-influence-falun-gong
123. (page 102) Ji Yun, “Hunter Video leaked; says he’s connected to the ‘Chi-
nese spy chief”’ “亨特音频流出 称自己与「中国间谍头子」来往” Sound of Hope,
October 27, 2020, https soundofhope org post
124. (page 103) Lauren Hilgers, “The Mystery of the Exiled Billionaire Whistle-
Blower,” New York Times Magazine, January 10, 2018,
https nytimes com maga ine the-mystery-of-the-e iled-
billionaire- histleblo er html
125. (page 103) Brian Spegele, Sha Hua, and Aruna Viswanatha, “Fundraising
at Company Tied to Steve Bannon and Guo Wengui Faces Probe,” Wall Street
Journal, August 19, 2020,


https s com articles fundraising-at-company-tied-to-steve-bannon-and-


guo- engui-faces-probe-
126. (page 103) “According to analysis of Hunter Biden’s audio, the deepstate
plans to use Patrick Ho to clean out people who support the CCP,” “香港花生,
``亨特拜登聲帶分析,深層政府計劃由何志平一清洗美國親共派,” YouTube, October
28, 2020
https youtube com atch v r k ap
127. (page 103) James O’Keefe (@JamesOKeefeIII), “BREAKING: Pennsylvania
@USPS Whistleblower Richard Hopkins Goes Public; Confirms Federal Investi-
gators Have Spoken With Him About Postmaster Rob Weisenbach’s Order To
Backdate Ballots To November 3rd, 2020,” Twitter, November 6, 2020,
https t itter com amesO eefeIII status
128. (page 103) Dave Ruo, “Penn postal worker allegations: postmaster falsifies
ballot dates,” Epoch Times, updated November 7, 2020,
https epochtimes com b n htm
129. (page 103) Zhang Yuwen, “Breaking news: The U.S. Democratic Party and
the CIA colluded to use covert procedures to manipulate votes,” Sound of Hope,
November 4, 2020, https soundofhope org post lang b
130. (page 103) A Hopeful Citizen (@ThePubliusUSA), “This is a USPS mail room
in Miami-Dade. It is alleged mail-in ballots have been there for over a week.
That would explain why @JoeBiden has been underperforming in this critical
county,” Twitter, October 30, 2020,
https t itter com ThePubliusUS status
131. (page 103) The followers are counted as of December 17, 2020. Xiyatu Zhixia,
“#AmericanElection In Florida, the Democratic Party faces a huge problem:
Miami’s early voting rate is only 53%! Biden needs a huge margin in this highly-
populated, blue-tinted district in order to offset red votes elsewhere in the
state. Someone sent a video of a Miami post office to the Florida branch of the
Democratic Party, huge numbers of envelopes are piled up, one insider says,
some are over a week old. If this story proves true, if these are ballots, if the same
situation is occurring at other post offices, the consequences will be serious”
(translated from Chinese), Weibo, October 31, 2020, https eibo com
rFt r pn
132. (page 105) Chen Junjun, “South Park told the truth eight years ago; the CCP
is behind the Democrat’s mail-in-ballots voter fraud,” Facebook, November 6,
2020, https facebook com chun posts
133. (page 105) Jian Ren (@JianRen12), “Viral in China Now: SF Express for mail-in
ballots in Georgia,” Twitter, November 4, 2020, https t itter com ianRen
status


134. (page 106) Shi Ping, “Michigan poll watcher broke the news: 7,000 votes
were counted, but 50,000 reported,” Epoch Times, November 6, 2020, https
epochtimes com b n htm
135. (page 106) Our ability to consistently monitor WhatsApp was limited, but
some have reported that it was used to send mass political, and even threatening,
messages to Spanish-language speakers. Based on the 40 in-scope posts the
EIP gathered, we found that YouTube had the highest amount of consistent en-
gagement from Spanish-language speakers, as users readily tuned in for nightly
rundowns of allegations aimed at delegitimizing the election outcome. See Ana
Ceballos and Bianca Padrò Ocasio, “‘It is getting really bad’: Misinformation
spreads among Florida’s Spanish-speaking voters,” Tampa Bay Times, updated
October 30, 2020,
136. (page 106) Saranac Hale Spencer, “Claim of Michigan Postal Fraud is Moot,”
FactCheck.org, November 6, 2020, https factcheck org claim-of-
michigan-postal-fraud-is-moot ; Angelo Fichera and Saranac Hale Spencer, “Bo-
gus Theory Claims Supercomputer Switched Votes in Election,” FactCheck.org,
November 13, 2020, https factcheck org bogus-theory-claims-
supercomputer-s itched-votes-in-election
137. (page 106) Vanessa Molter, “Platforms of Babel: Inconsistent misinformation
support in non-English languages,” Election Integrity Partnership, October 21,
2020, https eipartnership net policy-analysis inconsistent-efforts-against-
us-election-misinformation-in-non-english
138. (page 109) Christopher Bing, Elizabeth Culliford, and Paresh Dave,
“Spanish-language misinformation dogged Democrats in U.S. election,” Reuters,
November 7, 2020,
https reuters com article us-usa-election-disinformation-spanish spanish-
language-misinformation-dogged-democrats-in-u-s-election-idUS ED
139. (page 111) Camille François, Ben Nimmo, and C. Shawn Eib, “The IRA
CopyPasta Campaign,” Graphika, October 21, 2019, https graphika com reports
copypasta .
140. (page 111) “Facebook and Twitter ‘dismantle Russian Network,”’ BBC News,
September 2, 2020, https bbc com ne s orld-us-canada- .
141. (page 112) Graphika Team, “Step into My Parler: Suspected Russian Operation
Targeted Far-Right American Users on Platforms Including Gab and Parler,
Resembled Recent IRA-Linked Operation that Targeted Progressives,” Graphika,
October 1, 2020, https graphika com reports step-into-my-parler .
142. (page 112) Ben Nimmo, C. Shawn Eib, and L. Tamora, “Spamouflage: Cross-
Platform Spam Network Targeted Hong Kong Protests,” Graphika, September
25, 2020, https graphika com reports spamouflage ; Ben Nimmo, et al., “Spam-


ouflage Goes to America: Pro-Chinese Inauthentic Network Debuts English-


Language Videos,” Graphika, August 12, 2020, https graphika com reports
spamouflage-dragon-goes-to-america .
143. (page 113) Ben Nimmo, C. Shawn Eib, and Léa Ronzaud, “Operation Naval
Gazing: Facebook Takes Down Inauthentic Chinese Network,” Graphika, Septem-
ber 22, 2020, https graphika com reports operation-naval-ga ing .
144. (page 113) Emerson T. Brooking and Suzanne Kianpour, “Iranian Digital
Influence Efforts: Guerilla Broadcasting for the Twenty-First Century,” Dig-
ital Forensic Research Lab, 2020, https atlanticcouncil org p-content
uploads IR -DIGIT L pdf.
145. (page 113) Ben Nimmo, et al., “Iran’s IUVM Turns to Coronavirus,” Graphika,
April 15, 2020,
https graphika com reports irans-iuvm-turns-to-coronavirus .
146. (page 114) Carly Miller, et al., “Hacked and Hoaxed: Tactics of an Iran-Linked
Operation to Influence Black Lives Matter Narratives on Twitter,” Stanford
Internet Observatory, October 8, 2020, https cyber fsi stanford edu io ne s
t itter-takedo n-iran-october- .
147. (page 114) Tess Owen and Lorenzo Franceschi-Bicchierai, “‘Proud Boy’ Emails Threatening Florida Voters Appear to Use Spoofed Email Address,” Vice,
October 20, 2020,
https vice com en article a b proud-boys-emails-threatening-florida-
voters-appear-to-use-spoofed-email-address;
Jack Cable and David Thiel, “Analysis of Wednesday’s foreign election interfer-
ence announcement,” Election Integrity Partnership, October 23, 2020,
https eipartnership net rapid-response foreign-election-interference-
announcement.
148. (page 114) One email obtained by the EIP was sent via a PHP script belonging
to the cPanel suite of utilities on the Saudi Arabian host. This script inserted
a header into the resultant email that included the IP address of the calling
system, which matched the address from Proofpoint’s analysis. The address,
195.181.170.244, belongs to Datacamp Limited, but, per PassiveTotal data, is used
as an exit node by Private Internet Access, a prominent VPN service.
149. (page 115) John Ratcliffe and Christopher Wray, FBI News Press Conference
on Election Security, C-Span, October 21, 2020,
https c-span org video - national-security-officials- arn-election-
interference-iran-russia.
150. (page 115) Ellen Nakashima, Amy Gardner, and Aaron C. Davis,
“FBI links Iran to ‘Enemies of the People’ hit list targeting top officials
who’ve refuted Trump’s election fraud claims,” Washington Post, December


22, 2020, https ashingtonpost com national-security iran-election-fraud-


violence a e ba- a - eb-a - a d f dff story html.

151. (page 116) Ben Nimmo and the Graphika Team, “Russian Narratives on
Election Fraud,” Election Integrity Partnership, November 3, 2020,
https eipartnership net rapid-response russian-narratives-on-election-fraud.

152. (page 116) Jonathan Karl (@jonkarl), “A new @DHSgov intelligence bulletin
warns of Russian disinformation on mail-in voting: ‘We assess that Russia is
likely to continue amplifying criticisms of vote-by-mail and shifting voting
processes amidst the COVID-19 pandemic to undermine public trust in the
electoral process,”’ Twitter, September 3, 2020, https t itter com onkarl status
s ; Nimmo, et al., “Russian Narratives on Election
Fraud.”

153. (page 116) Ellen Nakashima, et al., “U.S. government concludes Iran was
behind threatening emails sent to Democrats,” Washington Post, October 20,
2020,
https ashingtonpost com technology proud-boys-emails-
florida .

154. (page 116) Election Integrity Partnership (@2020Partnership), “False claims


are circulating that Florida voter registration data has been hacked and posted
on a Russian forum,” Twitter thread, October 21, 2020, https t itter com
Partnership status s .

155. (page 117) William Evanina, “Statement by NCSC Director William Evanina:
Election Threat Update for the American Public,” Office of the Director of
Intelligence News Release No. 29-20, August 7, 2020,
https dni gov inde php ne sroom press-releases item -statement-
by-ncsc-director- illiam-evanina-election-threat-update-for-the-american-public.

156. (page 118) Tracy Wen Liu, “Chinese Media Told to Stay Calm on Election
Results,” Foreign Policy, November 4, 2020, https foreignpolicy com
election- -china-media-lo -key-coverage .

157. (page 118) Fu Cong, interview with Elena Chernenko, Kommersant, published
by the Ministry of Foreign Affairs of the People’s Republic of China, October 16,
2020, https fmprc gov cn mfa eng b t shtml.

158. (page 118) Hua Chunying (@SpokespersonCHN), “It is so naive to believe


that one could #MakeAmericaGreatAgain through flagrant smear and slander
against China. It’s also very ignorant to think that the #US could sever the flesh-
and-blood bond between the #CPC and the Chinese people by sowing discord
among them,” Twitter, November 1, 2020, https t itter com SpokespersonC
status s .


159. (page 119) Khamenei TV (@Khamenei_tv), “Both US presidential candidates


claimed election fraud: What a democracy!” Twitter, November 6, 2020, https
t itter com hamenei tv status ; Ayatollah Ali Khamenei
(@Khamenei_ir), “What a spectacle! One says this is the most fraudulent election
in US history. Who says that? The president who is currently in office. His
rival says Trump intends to rig the election! This is how #USElections & US
democracy are,” Twitter, November 4, 2020, https t itter com khamenei ir
status s ; Ayatollah Ali Khamenei (@Khamenei_ir),
“The situation in the US & what they themselves say about their elections is a
spectacle! This is an example of the ugly face of liberal democracy in the US.
Regardless of the outcome, one thing is absolutely clear, the definite political,
civil, & moral decline of the US regime,” Twitter, November 7, 2020, https
t itter com khamenei ir status lang en.
160. (page 119) Mehr News Agency, “US accuses Iran, Russia, China prior to
upcoming election,” Mehr Media Group, October 7, 2020,
https en mehrne s com ne s US-accuses-Iran-Russia-China-prior-to-
upcoming-election;


“Homeland Threat Assessment October 2020,” US Department of Homeland


Security, October 6, 2020,
https dhs gov sites default files publications homeland-
threat-assessment pdf.
161. (page 119) “Report: Fears Grow of Voter Suppression in US,” October
8, 2020, FarsNews Agency, https farsne s ir en ne s
Repr-Fears-Gr -f- er-Sppressin-in-US.
162. (page 120) Maricopa County Elections Department, “You can use a sharpie
on your ballot,” Facebook, November 3, 2020, https facebook com atch
v
163. (page 120) Official Pima County (@pimaarizona), “The felt-tip pen ballot con-
troversy burning through social media is false. Don’t get caught up in it. Arizona
ballot tabulating machines can read ballots marked with a felt tip pen. Felt pens
are discouraged because the ink can bleed through,” Twitter thread, November
4, 2020, https t itter com pimaari ona status ;
Office of the Secretary of State, Election Services Division, 2019 Election
Procedures Manual (Arizona Department of State: December 2019),
https a sos gov sites default files ELECTIO S PROCEDURES U L
PPRO ED pdf
164. (page 120) Steve Gallardo, Maricopa County Board of Supervisors, to
Maricopa County Voters, November 4, 2020,
https maricopa gov DocumentCenter ie PR - - - -Letter-to-
oters
165. (page 120) Secretary Katie Hobbs (@SecretaryHobbs), “IMPORTANT: If you
voted a regular ballot in-person, your ballot will be counted, no matter what
kind of pen you used (even a Sharpie)!” Twitter, November 4, 2020, 11:42 am,
https t itter com Secretary obbs status ;
AZFamily.com News Staff, “Sharpies will not invalidate your ballot, Maricopa
County officials say,” Arizona’s Family, November 4, 2020,
https a family com ne s politics election head uarters yes-sharpies-
are-fine-for-maricopa-county-ballots article b - e - eb- ff -
fd a ca html
166. (page 121) CBS 17 WNCN-TV, “‘SharpieGate’ protest in Arizona,” YouTube,
November 7, 2020, https youtube com atch v DTF usIk
167. (page 121) Michigan Department of State, “False claims from Ronna McDaniel
have no merit,” State of Michigan, November 6, 2020, https michigan gov
som - - - – html
168. (page 121) Michigan Department of State (@MichSoS), “False claims from
Ronna McDaniel have no merit,” Twitter, November 6, 2020, 3:38 pm, https


t itter com ichSoS status


169. (page 121) “Joint Statement from Elections Infrastructure Government
Coordinating Council and the Election Infrastructure Sector Coordinating Ex-
ecutive Committees,” Cybersecurity & Infrastructure Security Agency, Novem-
ber 12, 2020, https cisa gov ne s oint-statement-elections-
infrastructure-government-coordinating-council-election
170. (page 121) Michael Balsamo, “Disputing Trump, Barr says no widespread
election fraud,” AP News, December 1, 2020, https apne s com article barr-no-
idespread-election-fraud-b f c a c b a a c f d
171. (page 121) 12 News, “Republican Maricopa County Chairman says ‘no evidence
of fraud or misconduct’ in presidential election,” WQAD8 ABC, updated November
17, 2020, https ad com article ne s politics elections maricopa-county-
chairman-says-no-evidence-of-fraud-or-misconduct-in-presidential-election -
aba - a- f- f -b a f
172. (page 121) Office of Brad Raffensperger, “Historic First Statewide Audit of
Paper Ballots Upholds Result of Presidential Race,” Georgia Secretary of State,
November 19, 2020,
https sos ga gov inde php elections historic first state ide audit of paper
ballots upholds result of presidential race
173. (page 121) Dominion Voting Systems, “Setting the Record Straight: Facts
and Rumors,” February 17, 2021, https dominionvoting com election -
setting-the-record-straight
174. (page 121) Brad Raffensperger, Press Conference of the Georgia Secretary
of State, aired on WJBF-TV, December 7, 2020, YouTube, https youtube
com atch v lgL W F ug feature youtu be t
175. (page 122) Whitney Phillips, “The Forest and the Trees: Proposed Editorial
Strategies,” part 3 in The Oxygen of Amplification (Data and Society Research
Institute: May 22, 2018), https datasociety net p-content uploads -
P RT- O ygen of mplification DS- pdf
176. (page 122) Kate Starbird, et al., “Uncertainty and Misinformation: What to
Expect on Election Night and Days After,” Election Integrity Partnership, October
26, 2020, https eipartnership net ne s hat-to-e pect
177. (page 123) Robert M. Entman, “Framing: Toward clarification of a fractured
paradigm,” Journal of Communication 43, no. 4 (Autumn 1993): 51, https is muni
c el fss aro POL um Entman pdf
178. (page 124) Ali Breland, “Meet the Right-Wing Trolls Behind ‘Stop the Steal,”’
Mother Jones, November 7, 2020, https mother ones com politics
stop-the-steal


179. (page 124) StopTheSteal.us homepage, accessed February 10, 2021,


https eb archive org eb https stopthesteal us inde cfm


Chapter 4
Cross-platform and Participatory Misinformation: Structure and Dynamics

4.1 Introduction
In this chapter, we attempt to understand how false and misleading narratives
about the 2020 election, highlighted in Chapter 3, took shape and spread across
a multiplatform information ecosystem. During the 2020 election, misinforma-
tion was shared across a range of social media—from broadly popular platforms
like Facebook, Instagram, Twitter, and YouTube, to niche sites like Reddit, to
up-and-coming sites like Periscope and TikTok, to “alt-platforms” such as Par-
ler, and to message boards such as the chans or thedonald.win. These diverse
platforms were leveraged in distinct and often complementary ways by those
spreading false and misleading information about the election. Additionally,
algorithmic curation systems shape the dynamics of social networks and the behaviors that manifest across them; engagement begets algorithmic amplification, complicating the story of how content is created, disseminated, and reaches end users.
different platforms involved, and the way information moves between them. We
consider the affordances of their features, which enable communities to form,
and enable individuals to activate those communities.
Many of the misinformation narratives that we articulated in Chapter 3 involved
the active participation of ordinary people. But rank-and-file accounts and
influencers alike strive to capture the attention of larger and larger audiences, in
a bid, ultimately, to gain the power that such attention confers.1 For each social
platform, we consider the “work” that is done to create and spread narratives—what we might infer as tactics as well as other dynamics—to describe how these
false narratives developed, and to highlight the techniques used to produce
them, spread them, and sustain them over time.

4.2 Cross-Platform Information Sharing


Each platform enables different kinds of social and information interactions;
for example, TikTok’s user base has a large youth component, and Parler has
positioned itself as a destination for conservative users who have experienced—
or believe they have experienced—censorship on other platforms.2 Many
of these platforms allow content sharing from other platforms, and from the
broader information space that includes countless websites, from established
news media outlets to conspiracy theory blogs. And though journalists and
researchers sometimes draw a distinction between social media and mass media,
in a broader view, there are myriad connections between them, as, for example,
cable news pundits craft their evening shows based on content that went “viral”
that day on social media.3

In addition, internet usage statistics suggest that most online information participants—or “users”—are not siloed in a single platform, but turn to different platforms for different reasons.4 Political activists and others who wish to shape public opinion also employ multiplatform strategies, leveraging different platforms for different parts of their information strategies, and often intentionally moving content from one platform to another.

To facilitate our study of cross-platform misinformation, we grouped tickets created during our monitoring period into incidents: the information cascades that relate to a specific information event or claim, as described in Chapter 3 and discussed more fully in Chapter 5. We used a mixed-method approach to analysis, combining real-time forensic documentation of individual tickets with follow-up qualitative and quantitative analyses of specific incidents and narratives.
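This grouping step can be illustrated with a short Python sketch. The field names below are hypothetical stand-ins, not the EIP’s actual ticket schema; the point is simply that tickets sharing a normalized claim key form an incident, ordered in time so the cascade can be reconstructed:

    from collections import defaultdict
    from dataclasses import dataclass

    @dataclass
    class Ticket:
        # Hypothetical fields; the real ticket schema is not reproduced here.
        ticket_id: str
        platform: str     # e.g., "twitter", "facebook", "youtube"
        url: str
        claim_key: str    # normalized label for the underlying claim
        created_at: str   # ISO 8601 timestamp, sortable as a string

    def group_into_incidents(tickets):
        """Group tickets into incidents: all tickets documenting the same
        underlying information event or claim, in chronological order."""
        incidents = defaultdict(list)
        for ticket in tickets:
            incidents[ticket.claim_key].append(ticket)
        return {
            claim: sorted(group, key=lambda t: t.created_at)
            for claim, group in incidents.items()
        }

In practice, of course, assigning the claim key is the analytically hard part; the sketch assumes that judgment has already been made.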

We observed that interactions between platforms created emergent cross-platform dynamics. For example, while Facebook was a place to reach large audiences and organize action, Twitter was a place to mobilize and “eventize” longer-form content stored elsewhere. Platform policies shaped some of these dynamics: moderation could lead to inter- and intraplatform spread, as users shared screenshots of deleted content or posted it to platforms with less stringent policies. Below we describe the roles that each platform plays in the election-related mis- and disinformation ecosystem.


[Figure: “Cross-Platform Participatory Misinformation: From Cellphone Snapshot to Nightly News.” A flow diagram in four stages. Creation: content (photo, video, audio) documenting an incident. Dissemination: post content to as many mainstream platforms as possible; if content is taken down, host it on alternative platforms and post links back to mainstream ones; share it within small or close-knit communities on chat platforms and message boards. Amplification: sharing by online communities (Groups & Pages), with influencer and alt-media pickup, crossing a virality threshold. Outcomes: IRL action (protests), legal action (lawsuits), and mainstream coverage.]

Facebook’s Role: Public Posts to Reach Large Audiences; Groups for Organizing Protests

Facebook remains a widely popular social media platform, averaging around 2.7
billion active users across the globe.5 For media outlets, information operators,
and even ordinary people, Facebook represents an opportunity to reach large
audiences. Public Pages can attract millions of followers, turning their creators
into influencers with reach potential on par with some mass media outlets.
Groups can be places where people congregate—in public and “private”—around
a range of affinities. Through sharing functionality, content can move freely
and rapidly between Groups, Pages, and personal accounts and their socially
connected networks. Though our view into Facebook was limited to public
content, we were still able to document the platform’s role in the spread of
several false and misleading narratives.


Facebook Pages as a Place to Reach Massive Audiences


A number of partisan media and other right-wing influencers who appeared in
our data collection used their Facebook Pages to spread false and misleading
information about the election. Often, this was part of a multiplatform media
strategy. On Facebook, this content received significant engagement, including
tens of thousands of reshares for some posts and moving from public Pages to
personal Facebook feeds.

Facebook Groups as a Place to Share Rumors and Organize


Facebook Groups, both public and private, served as virtual places to come to-
gether and share stories of perceived election fraud and to organize a collective
response. Perhaps the most successful was the STOP THE STEAL Facebook
Group (discussed in detail in Chapter 3). The public Group started as a place
to share stories, both first- and secondhand, about a potential “stolen elec-
tion”—stories that were subsequently reshared through Facebook and cross-
posted to other platforms. It grew rapidly, reaching 320,000 users in less than a
day, assisted by cross-posted advertisements from right-wing influencers on
Twitter.6 It, along with other Facebook Groups, quickly evolved into a place to
organize protests; as some of the rhetoric grew violent and election workers
were threatened, Facebook removed STOP THE STEAL less than a day after
launch. Nevertheless, similar groups, albeit at smaller scales, continued to
emerge after this takedown, as people looked for places to gather and ways to
coordinate protest. In one case, a group of individuals organized a peaceful
protest using a private Facebook Group.7 But their call-to-action was spread
publicly and lost contextualizing information along the way, which led to a more
chaotic protest.

Twitter’s Role: Mobilizing Content from Other Platforms; Connecting to Media Outlets and Other Influencers; Networked Framing

Mobilizing Content from Other Platforms
In the cross-platform spread of misinformation about the election, the Twitter
platform served several diverse roles. A primary role was to provide a place
to draw attention to content such as news articles, videos, and livestreams
hosted elsewhere in the media ecosystem. The real-time nature of the platform
provided an opportunity to connect existing content to the current news cycle,
while platform affordances like short-form messaging and hashtag referencing
enabled seemingly disparate narratives to be cross-referenced and integrated from other sources. In particular, cross-posting from YouTube to Twitter was salient in our election integrity incidents, as shown in Figure 4.1.

Figure 4.1: Temporal graph of tweets and retweets linking to prominent YouTube channels over
time, in tweets per 15 minutes, for three prominent repeat spreaders (described in Chapter 5).

By cross-posting their videos to Twitter, repeat spreaders worked to popularize videos alleging election fraud. In some cases, the Twitter spikes align closely with the release of a new video. The tweets linking to Project Veritas, a right-leaning activist media group, follow this pattern—each burst is related to a different video. In other cases, e.g., tweets linking to compilation videos produced by right-leaning CDMedia and Dr. Shiva Ayyadurai (a coronavirus and election-related conspiracy theorist and anti-vaccine activist, also known as Dr. Shiva), the same video is mobilized (re-introduced and widely spread) multiple times. Information cascades related to content from Project Veritas and Ayyadurai are described in Chapter 5.
YouTube was not the only platform to serve as host for long-form videos subse-
quently linked to Twitter to reach a larger audience. For example, Ayyadurai’s
statistics-based content was regularly hosted on Periscope but cross-posted on
Twitter to expand viewership and connect with other incidents using hashtags
and tagging influential users.
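A temporal graph like the one in Figure 4.1 can be derived from a simple table of link-sharing tweets. The sketch below counts tweets per 15-minute window for each linked channel; the column names and sample rows are hypothetical, not our actual pipeline:

    import pandas as pd

    # One row per tweet or retweet that linked to a monitored channel
    # (hypothetical sample; real inputs would come from collected tickets).
    tweets = pd.DataFrame({
        "created_at": pd.to_datetime([
            "2020-11-05 10:01", "2020-11-05 10:07", "2020-11-05 10:31",
        ]),
        "channel": ["Project Veritas", "Project Veritas", "CDMedia"],
    })

    # Count tweets per 15-minute bin for each channel; plotting these
    # series over time yields the spike pattern discussed above.
    counts = (
        tweets.set_index("created_at")
              .groupby("channel")
              .resample("15min")
              .size()
    )
    print(counts)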

Connecting to Influencers
Twitter also allowed prominent spreaders of election-related mis- and disin-
formation to direct the attention of their own large audiences, as well as other
influencers, to a specific piece of content; the content was then amplified across
platforms by this audience of influential users, journalists, and politicians, in-
cluding President Trump, his campaign team, and his family.
The cross-platform nature of this amplification draws attention to the dynamics
of “networked framing” (see box on page 166). Twitter often served as the focal point for these collective narratives. In addition to the size of its audience,
the platform’s mobile connectivity enabled disconnected fraud narratives to be
drawn together and assembled into specific frames (i.e., widespread election
fraud) using content from other websites and social media platforms.

For example, the Hammer and Scorecard/Dominion narratives described in Chapter 3 began with claims of poll glitches in online conversations on websites and Twitter, then spread through YouTube videos and the use of hashtags related to the incident on Twitter and other platforms, such as Parler and Reddit. From there, high-profile accounts drew further attention to the incidents, as did hyperpartisan news websites like The Gateway Pundit, which used Twitter to promote its article discussing the incident.8 This collective Dominion narrative has since grown, subsequently promoted by the Proud Boys, The Western Journal, and Mike Huckabee across a number of platforms, including Facebook, Twitter, Instagram, Telegram, Parler, and Gab.9 On each platform, these narratives remain tethered together by the Twitter hashtags #dominionvotingsystems and #dominionsoftware. As unreliable evidence bounced back and forth between Twitter and other social media platforms, what were initially unremarkable incidents confined to local counties became a national story, much like the Stop The Steal and Sharpiegate narratives.
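This tethering role of a hashtag can be measured directly by extracting hashtags from post text collected on each platform and counting where each tag appears. A minimal sketch, with hypothetical post records:

    import re
    from collections import Counter

    HASHTAG = re.compile(r"#\w+")

    def hashtag_counts_by_platform(posts):
        """Count hashtag use per platform; a tag appearing on several
        platforms is tethering a narrative between them.
        `posts` is an iterable of (platform, text) pairs."""
        counts = {}
        for platform, text in posts:
            tags = [tag.lower() for tag in HASHTAG.findall(text)]
            counts.setdefault(platform, Counter()).update(tags)
        return counts

    posts = [
        ("twitter", "Glitch in Antrim County! #DominionVotingSystems"),
        ("parler", "Same story here #dominionvotingsystems #dominionsoftware"),
    ]
    print(hashtag_counts_by_platform(posts))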

Megathreads
An additional technique unique to Twitter, due to its specific affordances around
threading and content temporality, was the use of “megathreads”—dozens or
even hundreds of tweets connected through reply-chains—to connect a mix
of real incidents as well as false and misleading claims into a long narrative
alleging fraud and attempting to delegitimize the election. One such thread
featured detailed allegations of fraud, state-by-state, through over 100 author-
appended replies to a single tweet, linking to a number of external website
sources and content on other social media platforms. These types of threads leverage platform-specific design affordances: the list-based structure of a megathread allows its visibility and engagement to be renewed each time a new item is added to the list.
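Structurally, a megathread is simply a reply chain rooted in one tweet. A minimal sketch of reconstructing such a thread from collected tweet records (the field names are hypothetical):

    def reconstruct_megathread(tweets_by_id, newest_id):
        """Walk a reply chain upward from the newest tweet to the root,
        then return the thread in root-first (chronological) order."""
        thread = []
        current_id = newest_id
        while current_id is not None:
            tweet = tweets_by_id[current_id]
            thread.append(tweet)
            current_id = tweet.get("in_reply_to")  # None at the root
        thread.reverse()
        return thread

    # Each appended reply bumps the whole thread back into followers'
    # timelines, the recycling effect described above.
    tweets_by_id = {
        "1": {"text": "THREAD: alleged fraud, state by state", "in_reply_to": None},
        "2": {"text": "Pennsylvania: ...", "in_reply_to": "1"},
        "3": {"text": "Georgia: ...", "in_reply_to": "2"},
    }
    print([t["text"] for t in reconstruct_megathread(tweets_by_id, "3")])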

Cross-Platform Sharing to Evade Moderation on Twitter


For both Twitter megathreads and single posts spreading misinformation, the
cross-platform nature of these narratives also limited the efficacy of the plat-
form’s response. We saw numerous cases in which misinformation first shared
on Twitter continued to spread on other platforms even after it was removed—
in some cases, a simple screenshot of the since-removed tweet was shared elsewhere—as illustrated in Figure 4.2 on the facing page.


Figure 4.2: Screenshots of cross-posting on Facebook (left) and Reddit (right).

On the left, we see a Facebook user posting a screenshot of his own Twitter reply to perpetuate a disproven narrative even after the reply was removed from Twitter.
On the right we find a similar instance on Reddit, where a tweet by political
consultant Harlan Hill alleging a stolen election was hidden behind a label on
Twitter but is presented in full on Reddit.

In these ways and others, Twitter served to perpetuate and amplify misinforma-
tion narratives despite efforts to limit its involvement.


YouTube’s Role: A Resource for Livestreams, Compilations, and Mobilizations
Most major platforms now support video sharing; YouTube, however, remains uniquely popular as a host for long-format videos that can be monetized. While search and recommendation functions exist within YouTube,
traffic is often driven from other platforms. During the 2020 election, YouTube
provided a space for video-format misinformation that could be shared easily
across platforms. The platform functioned both to provide official and familiar-
looking “evidence” for misleading narratives and to consolidate otherwise dis-
parate narratives as part of a broader picture.

Compilation and Long-Form Videos


One effective form of YouTube content—in terms of spreading misleading narratives about the election—was the compilation: videos that synthesized content across different events and narratives.10 Though these longer videos may not
have the potential for mass virality, they exist as touchstones for other misin-
formation superspreaders to continuously refer back to—from other locations
in the information ecosystem—as supportive evidence of the veracity of their
narratives.
These YouTube videos presented challenges to media literacy. They were typi-
cally produced by partisan news outlets or users and organizations with a large
presence on other platforms. These groups harnessed high production quality
and verified accounts to create videos that either misled the public through
deceptive editing or compiled multiple false and misleading narratives. Main-
stream, cable, and hyperpartisan news outlets alike host content on YouTube,
and much of it has a similar format, look, and feel. For example, Project Veritas’s
videos often begin with host James O’Keefe sitting in what appears to be a well-
established newsroom, and Shiva Ayyadurai’s videos present him as an expert
source on a television news show.
Another consequence of the long-form, multinarrative nature of YouTube videos
is that misinformation—and even more so, disinformation—can be difficult for
the general public to discern. A video containing several distinct narratives would
require substantial time on the part of a scrupulous viewer to evaluate. This
long-winded approach to misinforming can overwhelm, creating the impression
of election fraud without the viewer critically evaluating, or even remembering,
the slate of “evidence.”

Livestreams
YouTube is also used to build an audience for a unique type of content producer—
the livestreamer.11 Several of the top accounts in our YouTube analyses are conservative influencers who have used YouTube Live to build their followings and subsequently spread mis- and disinformation. These include right-wing pundit Steven Crowder, who hosts a daily livestreamed commentary show, and Dr. Ayyadurai. The YouTube Live feature (and its counterparts on other platforms, such as Facebook Live) creates complex moderation challenges for platforms wishing to minimize misinformation: streams are often boosted in the moment by platform algorithms, yet there is little opportunity to address claims in real time. Videos often persist on the platform permanently, where they continue to rack up views; in their permanent state, however, they may be labeled. The top-viewed video in our data sample, for instance, is a livestream by Steven Crowder titled “Live Updates: Democrats Try to Steal Election!?” that aired on November 4 and has subsequently gained over 5 million views. It was eventually labeled: “Robust safeguards help ensure the integrity of election results.”

Long-Tail Platforms for Unique Formats and Niche Communities
Because mainstream platforms engage in at least some content moderation, their actions feed into narratives of “censorship,” leading some users to seek alternative forums. These range from smaller platforms like TikTok, to almost entirely unmoderated spaces like 8Kun and Discord, to places where moderation is minimal, like Parler12 and some subreddits. The largely unregulated spaces function as a breeding ground for more extreme narratives involving the Deep State, QAnon, and encouragement of political violence. However, these platforms’ relatively small user bases mean that misinformation must leak, or be deliberately ported, into more mainstream sites in order to have impact.

Misinformation Narratives Reappearing on TikTok


One phenomenon we observed was content that originated on other platforms
such as Twitter, Facebook, and Instagram, then reappeared on TikTok. A com-
mon tactic was the use of TikTok’s “green screen” feature, where users create
a video with an uploaded image, screen capture, or video as the background.
For example, as displayed in Figure 4.3 on the next page, tweets sharing misleading graphs aimed at delegitimizing the election results in Michigan and Wisconsin were reshared as backgrounds on TikTok, where users discussed the conspiracies.

Sometimes, content was actioned by one platform while it persisted unactioned on another. Figure 4.4 on page 159 below shows how one user, when TikTok took down a debunked video, used the platform’s green screen function to direct his followers to the same video on Instagram. TikTok and Instagram have since removed both videos.

Figure 4.3: A TikTok user reshares a tweet displaying misleading graphs to support the false narrative that the results in Michigan and Wisconsin have been rigged. The video received 29,000 views, 1,751 comments, and 4,159 shares before being taken down.

Instagramming Screenshots of Posts on Other Platforms


As on TikTok, misleading election content from other platforms later reappeared on Instagram. For example, several of the highly engaged-with
Instagram posts from repeat spreaders consisted of screenshots of tweets—often
tweets authored by other people. Many of these images included additional
visual effects, such as added or crossed-out text, to reinforce, refine, or counter
the meaning or framing in the original content. Some of the most influential
repeat spreaders used Instagram as part of a multiplatform strategy, adapting
their content to Instagram’s image-based format.


Figure 4.4: Cross-platform spread of a now-debunked video. Top left, a video allegedly showing
burning ballots is posted to 8Kun on November 3, 2020, 11:27 am PT. Top right, a screen capture
of the video that was posted to 8Kun. Bottom left, the next day at 2:00 pm PT, TikTok user Cuddy
Camaro (@camarocuddy) posted a video using the 8Kun video as his green screen. In the video,
Camaro states that TikTok won’t let him upload the video, so he directs people to his Instagram
account (@cuddycamaro), where he has posted the video. Bottom right, on Instagram, his post
with the video received over 133,000 views by November 4, 2020, 5:00 pm PT, before it was taken
down a few hours later.


Parler as an Emerging Meeting Place for Right-Wing Influencers and Audiences

Parler was another smaller, emergent platform that came to play a significant role
in the 2020 election as a community for pro-Trump activism and perpetuation of
pro-Trump conspiracy theories post-election.13 Unlike other platforms involved
in the active, participatory cross-platform information flows described in this
section, Parler largely served as an echo chamber set apart from the major
platforms. While content from websites, Twitter, and YouTube was shared to
Parler, the reverse was infrequent.

Parler was established in 2018 as a “censorship-resistant” platform catering
to right-leaning users and funded by conservative donors, including Rebekah
Mercer. Its founders and early adopters—such as part-owner and prominent
pro-Trump political commentator Dan Bongino—recruited its userbase from
right-leaning audiences who had come to feel that mainstream platforms were
censoring them. Several of Parler’s earliest prominent accounts were individuals
who had, in fact, been deplatformed on mainstream social media for specific
rules violations, such as Alex Jones and Roger Stone.14 Users joined in bursts,
often tied to a particular allegation of censorship: in late June 2020, for example,
after Twitter’s application of a fact-check label to President Trump’s tweets
outraged his fan base, and again in October 2020, after mainstream platforms
chose to down-rank or not host private adult content from Hunter Biden’s
laptop. On Parler, such content was easy to find. Parler’s commitment
to “free speech” (and to not fact-checking information)15 meant that some of
the wilder conspiracy theories and rhetoric about stolen elections—particularly
rhetoric with violent undertones—that were contextualized, throttled, or taken
down by major platforms moved freely within the Parler community. Members
of communities on larger platforms, such as Facebook Groups, recognized
this; we observed users within Groups that focused on election rumors and
misinformation encouraging other members to create Parler accounts so that
they could talk about the claims there.

Parler’s user base saw significant growth in the days after the election.16 Many
of its users joined because of their belief in conspiratorial narratives such as
Hammer and Scorecard, which remained popular on Parler nearly two months
after the election. However, Parler lacked certain features, such as Groups and
the ability to sort by top posts, that have made its larger competitors more
effective as places to convene for online activism. After its decision not to
moderate violent content in the days leading up to the January 6 insurrection at
the Capitol, it also struggled to retain hosting: Amazon, Apple, and Google each
took action to remove it from their infrastructure, and it was only back online,
with a new hosting service, as of February 16, 2021.


Messaging Tools
Beyond platforms, false and misleading claims also proliferated via messaging
tools. For example, multiple Miami residents received texts claiming that antifa
and BLM protestors planned to terrorize the Miami area following the elec-
tion. This example highlights how misinformation can be highly localized and
originate from sources other than social or broadcast media.

Figure 4.5: A text sent to some Miami residents falsely warning about antifa and BLM protesters.

Cross-Platform Migration as a Demand-Side Issue


Not only did content move across platforms—users themselves moved as well.
Researchers often focus on the supply side of mis- and disinformation—such as
how misinformation spreads and its prominence during election cycles.17 In the
2020 election, the response of social media users to content moderation
policies—namely, migrating to alternative platforms such as Parler—foregrounded
the demand side of misinformation as well.
In line with their content moderation policies, and as described in Chapters 2
and 6, Twitter and Facebook used labeling and content removal to limit election-
related misinformation on their websites. A subset of social media users re-
sponded to such moderation with claims of liberal censorship, and migrated to
platforms with weaker content moderation policies, like Parler. Parler CEO John
Matze said that more than 4.5 million new people signed up for the platform in
about a week. While it’s yet to be seen whether Parler’s newfound popularity

161
Case 3:22-cv-01213-TAD-KDM Document 209-2 Filed 03/04/23 Page 180 of 292 PageID #:
13749

4. Cross-platform and Participatory Misinformation: Structure and Dynamics

will continue (some evidence suggests Parler has seen a drop in usage from its
pre-election days, and the platform has only recently regained a hosting service
after the major ones dropped it), the migration suggests that content moderation
by the major platforms won’t solve the misinformation crisis entirely.

4.3 Dynamics of 2020 Election Misinformation


The Timeline
Misleading information about the 2020 election followed distinct temporal
dynamics. In Chapter 3 we trace the evolution of the narratives—how stories
created by misinformation echoed past stories and gave momentum to the next
wave; here, we follow how those stories traveled across the election misinformation
landscape over time. During the pre-election period, efforts to preemptively
delegitimize the election often appeared to be top-down, spreading through
right-wing media and accounts of political figures.18 But they were also, in many
cases, decentralized, with one-off incidents bubbling up through social media
before reaching influencers and their large audiences. Together, these dynamics
worked to foment a general distrust in the election.
Election Day served as a day of data collection for partisan actors, who would
later leverage individual tweets and stories as evidence for broader claims.
Motivated by growing fears of a “rigged” election, a large number of people
went to the polls looking for evidence of voting fraud. Many documented their
experiences of perceived and real issues with the voting process on social media,
sharing videos, images, and personal accounts. Politically
motivated individuals watching from home on social media contributed by
amplifying content that aligned with their views or goals.
In the week after Election Day, pro-Trump political operatives, right-wing media
outlets, and other content creators—primarily though not exclusively on the
political right—assembled evidence from Election Day into larger narratives
attempting to delegitimize the results. Armchair statisticians combed avail-
able vote tallies looking for anomalies that could be framed as potential fraud.
YouTube opportunists made long-form videos connecting different incidents to
the “electoral fraud” meta-narrative. Though initially chaotic, the information
space began to concentrate on smaller incidents that were swept into larger
narratives or growing conspiracy theories.
Post-election, false claims and misleading narratives began to coalesce around
allegations of fraud in swing-state cities that favored Biden. Subsequent court
cases seeking to throw out votes in these areas based on the allegations shed
light on the motivation for this refocusing. A common tactic involved linking
statistical evidence with unfounded claims of vote-tabulation fraud. Diffuse pre-


and post-election narratives were blended and presented as walls of evidence.


Donald Trump and members of his legal team were instrumental in pushing
these narratives, strategically employing them in an effort to overturn the results
of the election through legal proceedings. Now, we can see some storylines
have taken root, developing into more hardened conspiracy theories that may
linger for years to come.

One remarkable phenomenon is the persistence of certain narratives—e.g., that
the election would be “rigged”—from the start of our data collection through
the end. These narratives were already prevalent when we began our work in
August, and as we write this report, participation in the narratives challenging
the integrity of the 2020 election is ongoing, with new “evidence” still being
added to the conversation, even as the discourse has converged around a few
specific conspiracy theories. Research suggests that the conspiracy-theory
type of misinformation will have the most staying power—as opposed to more
ephemeral rumors that were quickly determined to be false.19 In particular,
claims that are difficult to verify and theories that are impossible to falsify—for
example, theories that software on voting machines switches votes without
leaving a trace—will likely continue to spread for years to come. These conspiracy
theories can become the tools of future disinformation campaigns, and they
risk long-term effects such as the continued delegitimization of democratic
institutions.

Participatory Mis- and Disinformation


Our analysis demonstrates that the production and spread of misinformation
and disinformation about Election 2020—including false narratives of a “stolen
election”—were participatory. In other words, these dynamics were not simply
top-down from elites to their audiences, but were bottom-up as well, with
members of the “crowd” contributing in diverse ways—from posting raw content,
to providing frames for that content, to amplifying aligned messages from both
everyday members of the crowd and media (including social media) elites.

Repeatedly, our data reveal politically motivated people sincerely introducing
content they mistakenly believed demonstrated real issues with election in-
tegrity: from the user who claimed back in early September that a ballot in their
name had been sent to their parent’s home in another state (weeks before ballots
had actually been mailed out); to the man who thought that old ballots (from
2018) in a dumpster were evidence of 2020 mail-in ballot fraud; to the person
who thought they were capturing video evidence of a poll worker illegally mov-
ing ballots on Election Day (it was a photographer moving his gear); to people
who were given Sharpies to complete their ballots and mistakenly believed their
votes therefore would not be counted.


Well-meaning, though often politically motivated, individuals repeatedly intro-
duced this content into the broader information sphere, often via social media.
In each of these incidents, the person originally reporting the issue (and many of
those who passed it along) may have sincerely thought they had found evidence
of voter fraud. However, it is also likely—especially considering what we know
about confirmation bias20 —that political views and prevailing narratives about
potential election fraud both contributed to these individuals’ misinterpretation
of what they were experiencing and motivated them to share the content.

Networked Framing: How Right-Wing Media and Social Media Influencers
Helped to Frame “Evidence” of Ballot and Voting Issues as “Election Fraud”

In Chapter 3, we noted the role of “framing,” or providing scaffolding for
selected information to shape how people interpret the world, in helping
to create and sustain the false “stolen election” narrative. Traditional
notions of framing often place the power of creating and communicating
frames within the domain of media elites.21 With the rise of participatory
media and disruption of the historical role of “gatekeepers,” researchers
have documented the phenomenon of “networked framing,” where diverse
members of online communities—including political and media elites,
social media influencers, and to some extent anyone with a social media
account—collaborate to create and propagate certain frames.22
In our analyses, we repeatedly saw this kind of networked framing in
action. Diverse social media users—from anonymous accounts with small
followings, to blue-check social media influencers, to accounts associated
with hyperpartisan media outlets—were consistently helping to do the
work of “framing” by assigning intent to, or exaggerating, real-world events
in their posts, in such a way as to fit the narrative of election fraud. Though
networked framing practices could be seen, to some extent, on “both sides”
of the political spectrum, our data show that right-wing networks were
far more active and influential (in terms of dissemination) in discourse
that threatened election integrity (see Chapter 5, Figure 5.1 on page 186).
One example of this networked framing activity occurred in late Septem-
ber 2020, when a batch of mail—originally reported to have absentee
ballots—was discovered in a ditch in Greenville, Wisconsin.23
There was not, nor has there been discovered since, any evidence that
this mail-dumping incident was politically motivated. Despite the lack
of any evidence, this event was quickly picked up and positioned within
the voter fraud frame—and the story eventually propagated widely within
that frame, reinforcing the false perception that mail-in voting was
contributing to widespread election fraud.


The story of ballots in a ditch first appears (in significant numbers) in our
data through an article on The Gateway Pundit,24 which often works by
selecting content from other sources and positioning that content within
their highly political frames. In this case, The Gateway Pundit repurposed
an article from a local (FOX11) news outlet.25 In addition to embedding the
content of that borrowed article in its text, The Gateway Pundit article
added four sentences of original content.
Its first sentence, which appeared above the borrowed content, made
the framing clear. Without any evidence connecting the incident to any-
one with a political motive, The Gateway Pundit’s article began with:
“Democrats are stealing the 2020 election.” Next were two sentences mak-
ing factual claims borrowed from the FOX11 article—that two trays of mail
had been found and that they included absentee ballots. And finally the
article attempted to make a connection between that mail and Democrats
by stating that “The USPS unions support Joe Biden.”
Those four sentences and the borrowed content are the entire article.
Without evidence, it frames the improperly discarded mail as election
“stealing” by Democrats. That article—and therefore that frame—spread
widely on Twitter. It was tweeted/retweeted nearly 25,000 times. In total,
we collected 60,000 tweets that referenced the incident.
The early propagation of the narrative was assisted by @Rasmussen_Poll
(through an original tweet linking to The Gateway Pundit’s article) and
@EricTrump (through a retweet). Other online accounts picked up and
advanced that voter fraud frame, calling it “LEFTIST VOTER FRAUD” and
stating through a hashtag that “#DemocratsAreCheaters.”
A few prominent social media accounts picked up the story with a slightly
more subtle framing. For example, the tweet below, posted by another
verified repeat spreader account, does not explicitly claim voter fraud,
but shapes the interpretive frame toward “voter fraud” — or at the very
least toward doubting the integrity of mail-in voting—by highlighting that
the mail was “FOUND IN DITCH” and that it included “ABSENTEE ballots.”


Tweet from Chuck Callesto framing a mail-dumping issue as an election integrity concern.

This event—and its framing as a “voter fraud” issue—eventually made it into
a public statement by Kayleigh McEnany, White House press secretary.26
This example demonstrates how hyperpartisan media and other promi-
nent social media users on the political right reframed events in misleading
ways to feed false narratives of widespread election fraud by Democrats.
It also reveals another dynamic that we saw repeatedly across these in-
cidents, where local media coverage was opportunistically appropriated
and often recontextualized to fit election fraud narratives.
Similarly, the Sharpiegate narrative (described in detail in Chapter 3.3
on page 49) took shape through networked framing. Early tweets—from
voters in various locations on Election Day—highlighted somewhat open-
ended concerns about Sharpies bleeding through ballots. Tweets and
retweets framing the concerns as potential voter fraud were often gener-
ated by less prominent accounts, including voters describing perceived
issues with their own ballots and “grassroots” political activists relaying
and occasionally reframing those concerns. Often, accounts with smaller
follower numbers would add @mentions of more prominent accounts to
try to gain their attention and potentially gain traction for their content
through a high-profile retweet or quote tweet. Those influencers and
political media elites then used the claims to bolster the “rigged election”
narrative.
Together, these examples show how networked framing—including se-
lecting certain pieces of evidence and placing it within the voter fraud
frame—was not the exclusive terrain of high-profile accounts, but also
incorporated the work of voters motivated to share their experiences
and politically active social media users helping to identify and amplify
potential cases of voter fraud.

For example, President Trump’s many statements (including tweets) about the
election being “rigged” may have sufficiently primed his supporters to be on the
lookout for evidence of election fraud by the time the Trump campaign’s “Army
for Trump” called for them to perform as formal and informal poll observers.
The primary objective of these militarized calls to action was to motivate and
organize the mass collection of purported “evidence” of election fraud. The
social media data we collected reveal a large number of people searching for,
and often mistakenly “finding,” evidence of the election fraud they believed was
occurring—and then, in a case of participatory disinformation, actively sharing
and resharing this kind of content.


Once introduced onto social media, these false eyewitness accounts of “election
fraud” were frequently picked up and amplified by influencers and rank-and-file
accounts alike. Often, the person who introduced the content or another active
social media user would try to call the attention of more prominent influencers
to potentially relevant content by reposting it with tags and/or mentions of
larger-audience accounts. Those more influential accounts—often accounts of
hyperpartisan media, conservative political figures, and other elite right-wing
influencers—played the role of assembling this content to fit the larger narratives
(e.g., a “rigged election”) and of spreading it to increasingly large audiences.

Figure 4.6: Cumulative graph of Sharpie tweets on November 3 (Election Day) and November 4.
Individual tweets are plotted at the time they were shared and sized by the number of followers
of the account posting them. Color and shape represent tweet type: original tweets in green
squares, reply tweets in yellow squares, retweets in blue circles, quote tweets in red diamonds,
and retweets of quote tweets in red circles.

Figure 4.6, the cumulative graph of the early spread of “Sharpiegate” rumors,
shows the process of participatory disinformation. The conversation started
relatively small—with many small-follower accounts often tweeting their own
experiences—and then began to gain traction through quote tweets and retweets
by accounts with increasingly large audiences, eventually taking off with the
help of President Trump’s two adult sons.
In dozens of election-integrity incidents, these false or misleading narratives
eventually reached the inner echelons of the Trump campaign. In a few notable
cases, we saw the narratives move beyond social media into large television
audiences through President Trump’s debate performances.


Friend-of-a-Friend Narratives
One type of participatory misinformation we saw was the “friend-of-a-friend”
story.27 These pieces of evidence, which were often wrapped into larger nar-
ratives about disenfranchisement or election fraud, reference a story that the
person “heard” from someone else, and the content can extend to increasing
degrees of separation—the “friend-of-a-friend.” One story asserted that a per-
son’s friend had voted for Trump and the machine changed her vote to Biden
(see Figure 4.7).

Figure 4.7: A tweet claiming that a voting machine changed their friend’s vote from Trump to
Biden.

This story spread on Facebook and Twitter, and likely appeared elsewhere as
well. We saw a similar dynamic, though to a smaller extent in terms of spread,
around claims that a Trump supporter had been redirected to the wrong polling
location (see Figure 4.8).

Figure 4.8: A tweet claiming that a friend was sent to the wrong voting location.

The spread of these stories has a couple of common drivers. First, “friend” can


take on new meaning in online spaces, where an otherwise unknown person
posting to a Facebook Group can be considered a “friend” whose message is
worth spreading. This can result in well-meaning (or at least not ill-intentioned)
people passing along content (“sharing is caring”) that they think will be
informative or otherwise helpful to others. Second, friend-of-a-friend rumors
can be intentionally copied and pasted, sometimes with small changes to minor
details—referred to as copypasta—to give the sense that a large number of people
have experienced a rare event. The Sharpiegate story also spread through friend-
of-a-friend posts; we collected hundreds of tweets mentioning a friend whose
ballot was cancelled due to the use of Sharpies.

Figure 4.9: A tweet claiming that use of a Sharpie canceled their vote.

In actuality, the online ballot-tracking database showed the status of voters’
mail-in ballots, which were marked as canceled when those voters chose to vote
in person instead.
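
The copypasta pattern described above also lends itself to simple detection heuristics. Below is a minimal sketch, not the EIP’s actual tooling: it flags pairs of posts whose text is nearly identical despite small edits to minor details, using Python’s standard-library sequence matcher. The sample posts and the 0.85 threshold are invented for illustration.

```python
from difflib import SequenceMatcher
from itertools import combinations

# Hypothetical posts: two near-identical "friend" stories with a minor
# detail changed, plus one unrelated post.
posts = [
    "My friend in AZ was given a Sharpie and her ballot got cancelled!",
    "My friend in Arizona was given a Sharpie and her ballot got cancelled!",
    "Polling place was busy but everything went smoothly.",
]

THRESHOLD = 0.85  # similarity ratio above which two posts look copy-pasted

def similarity(a: str, b: str) -> float:
    """Normalized similarity between two strings, in [0, 1]."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

for (i, p1), (j, p2) in combinations(enumerate(posts), 2):
    score = similarity(p1, p2)
    if score >= THRESHOLD:
        print(f"posts {i} and {j} look like copypasta (ratio={score:.2f})")
```

In practice, researchers typically scale this idea up with shingling or hashing rather than pairwise comparison, but the underlying signal is the same: many accounts posting nearly identical first-person stories.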

The Use of Bad Statistics to Sow Doubt in Election Results


Elections produce vast quantities of data, from national Electoral College totals
to fine-grained, precinct-level results. The sociological processes that underlie
voting patterns are complex and varied, shaped by structural features (e.g., the
shape and size of precincts), voting processes (e.g., access and eligibility), and
the political landscape (e.g., candidates and issues). Each of these factors, and
more, introduces patterns and benign irregularities into voting data
that can be difficult or impossible to tease apart.
In the wake of the 2020 election, the scale and irregular nature of voting data
were weaponized to create statistical disinformation in order to undermine con-
fidence in the result. One of the more common tactics was to analyze precinct-
level vote totals using Benford’s law. In brief, Benford’s law makes predictions
about the frequency of first and/or second digits in a dataset. Violations of these
predictions have been used with some success as a tool for detecting financial
fraud, and have gained traction in recent years as a potential mechanism for
detecting electoral fraud, despite well-documented theoretical and practical
limitations.28


Figure 4.10: Ward-level analysis of first digits of vote totals in Milwaukee in the 2020 election
(“Milwaukee ward-level data,” with separate Biden and Trump panels showing counts of first
digits from 1 to 9), redrawn from original data but similar to observed misinformation. The line
indicates Benford’s law, whereas the bars indicate the observed frequencies of first digits from 1
to 9.

One prominent example of disinformation invoking Benford’s law involved the
vote totals for wards in Milwaukee, Wisconsin, as seen in Figure 4.10. While
Trump’s vote totals approximated Benford’s law, Biden’s had a surplus of digits
4 through 6, and a dearth of digits 1 and 2. This was promoted as definitive
evidence of election fraud on both far-right websites and social media platforms
like Reddit, Facebook, and Twitter. However, the true cause was much more
benign. The excess digits are a signal of Biden’s lead and average precinct size,
and not indicative of fraud.29 More generally, Benford’s law is not expected to
be followed when data do not span several orders of magnitude or for voting
processes in general.30
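
To make the Benford comparison concrete: the law predicts that the leading digit d appears with probability log10(1 + 1/d), so 1 should lead about 30.1% of values and 9 only about 4.6%. The sketch below, with invented ward totals, shows the basic calculation; note that totals clustered within a single order of magnitude, as in Milwaukee, will over-represent a few leading digits regardless of fraud.

```python
import math
from collections import Counter

def benford_expected(n_obs):
    """Expected first-digit counts for n_obs values under Benford's law."""
    return {d: n_obs * math.log10(1 + 1 / d) for d in range(1, 10)}

def first_digit_counts(totals):
    """Observed leading-digit counts among positive vote totals."""
    return Counter(int(str(t)[0]) for t in totals if t > 0)

# Invented ward totals clustered in the hundreds: their leading digits
# reflect ward size and vote share, not fraud.
totals = [577, 612, 498, 830, 455, 701, 540, 662, 489, 523]
observed = first_digit_counts(totals)
expected = benford_expected(len(totals))
for d in range(1, 10):
    print(d, observed.get(d, 0), round(expected[d], 2))
```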
The misinformation surrounding Benford’s law follows a familiar pattern. A
statistical model sets up an (often flawed) expectation of how voting data should
appear. Violations of this expectation occur, either due to chance (e.g., checking
many locations), a mismatch between the data and the model’s assumptions, or
an inappropriate application of the statistical model. Ethical, well-meaning statisti-
cians discovering an irregularity would then get to work understanding whether
it arose as a problem with the model (e.g., failing to account for demographics),
the data (e.g., a rounding/processing error), an honest mistake, or, in rare cases,
fraud. In cases of misinformation, irregularities are taken as prima facie proof
of fraud.
In another example, Shiva Ayyadurai posted a statistically flawed analysis, choosing variables


that artificially created the impression that Trump did more poorly than expected
in more Republican areas to suggest voting machines were changing votes to
Joe Biden.31 He further used the imposed negative slope to estimate purported
switched votes, which fed into misleading narratives about Dominion voting
software (discussed in more detail in Chapter 3).
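
The “imposed negative slope” is a mechanical artifact, and a small simulation shows why. The analysis plotted a quantity of the form y = t − x against x (roughly, Trump’s share of individually cast votes minus the straight-ticket Republican share). Regressing t − x on x produces a slope near −1 whenever t is unrelated to x, so a downward trend proves nothing by itself. The sketch below uses entirely synthetic, fraud-free data; the variable names and ranges are ours, chosen for illustration.

```python
import random

random.seed(1)
n = 1000

# Synthetic precincts: straight-ticket Republican share x, and a Trump
# share t (among individually cast votes) drawn independently of x,
# i.e., no fraud and no real relationship between the two.
x = [random.uniform(0.1, 0.9) for _ in range(n)]
t = [random.uniform(0.1, 0.9) for _ in range(n)]

# The analysis in question plotted y = t - x against x.
y = [ti - xi for ti, xi in zip(t, x)]

# Ordinary least-squares slope of y on x.
mean_x = sum(x) / n
mean_y = sum(y) / n
slope = (sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
         / sum((xi - mean_x) ** 2 for xi in x))

print(f"slope of (t - x) vs. x: {slope:.2f}")  # close to -1, with no fraud
```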

These are two of many ways in which election data were weaponized to pro-
mote false narratives of widespread electoral fraud. This tactic is particularly
challenging, as it simultaneously creates the impression of widespread fraud
while leveraging statistical analyses that average citizens cannot reasonably be
expected to critique, leading them to accept claims of technical meddling at face
value. Debunking can be challenging even for statistically proficient academics,
as no affordable academic-facing API exists to gather election data in real time.
We observed that when data are available, they can require unique solutions to
access and clean into a usable format (e.g., scraping PDFs or websites). In many
cases, data were simply unavailable, were of low quality (e.g., just percentages),
or would require ethically or legally questionable scraping. Freed from legal and
data-quality constraints, purveyors of statistical disinformation remain at an
advantage.

Organized Outrage: Online Misinformation’s Offline Impact


The spread and impact of misinformation are not merely confined to the on-
line world. Indeed, many of the narratives we explored explicitly called for,
and resulted in, offline actions. Pre-election, this was seen most clearly in
the #ArmyforTrump hashtag, in which the Trump campaign and right-leaning
influencers directed supporters to sign up to become poll watchers for the
campaign and to submit purported evidence of electoral fraud to the campaign
team. In the weeks after the election, Trump’s legal team repeatedly relied
upon questionable public testimony of fraudulent behavior in legal challenges
filed in courts across the country. These affidavits and public testimonies are the
consequence of the public priming of fraud pushed by the #ArmyforTrump
campaign (along with the many other election fraud narratives discussed in
this paper), and a weaponization of the information elicited through the digital
disinformation campaign.

Similarly, misinformation narratives that arose on Election Day itself led to
in-person organized outrage, as seen most notably around the #Sharpiegate
conspiracy. Despite swift debunking from election officials in Arizona, the
#Sharpiegate theory gained traction across local Facebook Groups and on
Twitter. This culminated in protestors gathering outside the vote counting
center in Maricopa County shouting about the conspiracy and chanting to “stop
the count.”


Similar protests in swing states across the country were coordinated by social
media misinformation campaigns like #Sharpiegate and the more national Stop
The Steal campaign. The protests not only targeted key election sites but
were organized more generally in larger cities across the US, including large-
scale demonstrations in Washington, DC. These in-person events gave new
life to election misinformation, cementing its believability by affording it a
physical presence and further weakening the ability of fact-checkers to counter
its spread. The organized outrage facilitated by blue-check influencers thus
leveraged misinformation to organize mass protests that further delegitimized
the electoral process and its results.

From Manipulating the Information System to Leveraging the Legal System
Mis- and disinformation that originated and spread online eventually gained an
offline presence in the courts. Buried in the litany of lawsuits filed by the Trump
campaign in the post-election period were the same participatory mis- and
disinformation cascades traceable to online right-wing networks. Many of the
same false claims and misleading narratives we covered in our real-time analysis
fed the Trump administration’s meta-narrative of widespread election fraud.
Right-wing groups friendly to the President’s cause filed lawsuits that built on
these narratives as purported evidence of the illegitimacy of the election.
One prominent example is the false—now recanted—affidavit provided by an
Erie, Pennsylvania, post office worker and publicized by Project Veritas.32
Initially, the sworn affidavit alleged that the USPS had repeatedly backdated
ballots, presenting this as evidence of widespread fraud. Later,
the worker went on record with the House Oversight Committee to recant his
allegation (though Project Veritas denies the veracity of his recantation).33 De-
spite the fallout, the air of legitimacy attached to a legal document may benefit
proponents of online disinformation campaigns and reinforce the “truth” of a
particular narrative.34
In another case, also in Pennsylvania, Republican representative Mike Kelly filed
a state lawsuit that challenged the constitutionality of the state’s 2019 mail-in
voting statute. In its statement of facts, the original complaint discussed a
number of unsupported claims that had circulated throughout online commu-
nities, including, for example, claims about “unsolicited ballots” (¶56–57) and
an attempt by the 2019 Pennsylvania legislature to subvert the legitimacy of
future elections by setting in motion a plan to shift to universal mail-in voting
(¶82–84).35 Pennsylvania’s courts rejected his claims, and his subsequent appeal
to the US Supreme Court also failed.36 Had Kelly succeeded in


his quest to invalidate the statute, over two million Pennsylvania ballots would
have been thrown out.
Finally, a handful of legal actions also incorporated bad statistics common among
online proponents of disinformation. One particularly visible example of this
phenomenon comes by way of a complaint filed by Sidney Powell,37 a vocal
Trump supporter,38 in the US District Court for the Northern District of Georgia.
Powell filed similar complaints in other key battleground states; all have since
been dismissed.39 In these complaints, Powell’s team relied on the misinterpre-
tations and/or misrepresentations of deviations from Benford’s Law discussed
above.40 Although experts agree that these deviations are not evidence of elec-
toral fraud,41 the online misinformation transformed into “IRL” disinformation
through Powell’s multiple, failed legal actions.
In sum, popular narratives that emerged from these participatory mis- and
disinformation dynamics were repeatedly mobilized as “evidence” in the courts.
Although the actions brought were often dismissed as baseless, this phenomenon
is unlikely to disappear in years to come.

4.4 Summary
The work of producing and spreading misleading narratives about the 2020
election was cross-platform, leveraging diverse platforms in complementary
ways to seed, amplify, and mobilize content while adapting around efforts by
the platforms to address misinformation. The work was both top-down, with
President Trump and right-wing media establishing the initial frames of “voter
fraud” and “election rigging,” and bottom-up, with armies of volunteers providing
content and analysis to develop specific narratives to fit those frames. With
his many “RIGGED!” tweets, starting long before the election, and his Army for
Trump advertisements, President Trump didn’t just prime his audience to be
receptive to false narratives of election fraud—he inspired them to produce
those narratives and then echoed those false claims back to them. Everyday
people, likely motivated by their political views, went online to share content
highlighting what they believed to be voting irregularities. Hyperpartisan news
and social media influencers played a role in selection, amplification, and framing,
assembling the “evidence” of the crowd to fit their narratives and then mobilizing
that content across platforms. Those narratives led to real-world efforts in the
form of protests and legal action, both of which set the course toward the events
at the US Capitol on January 6.


Notes

1. (page 149) Charlie Warzel, “I Talked to the Cassandra of the Internet Age,” New York Times, February 4, 2021, https://www.nytimes.com/2021/02/04/opinion/michael-goldhaber-internet.html.

2. (page 150) Greg Roumeliotis, et al., “Exclusive: US opens national security investigation into TikTok - sources,” Reuters, November 1, 2019, https://www.reuters.com/article/us-tiktok-cfius-exclusive/exclusive-u-s-opens-national-security-investigation-into-tiktok-sources.

3. (page 150) Yochai Benkler, et al., “Mail-in Voter Fraud: Anatomy of a Disinformation Campaign,” Berkman Center Research Publication No. 2020-6, Berkman Klein Center, October 2, 2020.

4. (page 150) Andrew Perrin and Monica Anderson, “Share of US adults using social media, including Facebook, is mostly unchanged since 2018,” Pew Research Center, April 10, 2019, https://www.pewresearch.org/fact-tank/2019/04/10/share-of-u-s-adults-using-social-media-including-facebook-is-mostly-unchanged-since-2018/; Xuan Zhao, Cliff Lampe, and Nicole B. Ellison, “The Social Media Ecology: User Perceptions, Strategies and Challenges,” Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems (May 2016): 89-100.

5. (page 151) H. Tankovska, “Number of monthly active Facebook users worldwide as of 4th quarter 2020,” Statista, February 2, 2021, https://www.statista.com/statistics/number-of-monthly-active-facebook-users-worldwide/.

6. (page 152) Sheera Frenkel, “The Rise and Fall of the ‘Stop the Steal’ Facebook Group,” New York Times, November 5, 2020, https://www.nytimes.com/2020/11/05/technology/stop-the-steal-facebook-group.html.


7. (page 152) Brandy Zadrozny and Ben Collins, “Facebook anti-lockdown group mobilizes to drive people to vote-counting location in Detroit,” NBC News, November 4, 2020, https://www.nbcnews.com/politics/2020-election/live-blog/2020-11-04-trump-biden-election-results.
8. (page 154) Jim Hoft, “Update: Corrupted Software that Stole 6,000 Votes From Trump in Michigan County — Shut Down for TWO HOURS in Red Counties in Georgia on Election Day,” The Gateway Pundit, November 6, 2020, https://www.thegatewaypundit.com/2020/11/update-corrupted-software-stole-6000-votes-trump-michigan-county-shut-two-hours-red-counties-georgia-election-day/.
9. (page 154) Proud Boys Uncensored (@ProudBoysUncensored), Telegram, https://t.me/ProudBoysUncensored; The Western Journal (@WesternJournal), “The same software was used in 64 other counties,” Facebook, November 6, 2020; Mike Huckabee, “The same software was used in 64 other counties,” Facebook, November 6, 2020.

10. (page 156) This type of video, such as ones shared by CDMedia and Shiva Ayyadurai, bears similarity to the image-based “evidence collages” documented by Krafft and Donovan; see P. M. Krafft and Joan Donovan, “Disinformation by Design: The Use of Evidence Collages and Platform Filtering in a Media Manipulation Campaign,” Political Communication 37, no. 2 (March 2020): 194-214.
11. (page 156) Samantha Bradshaw, et al., “Election Delegitimization: Coming to You Live,” Election Integrity Partnership, November 17, 2020, https://www.eipartnership.net/rapid-response/election-delegitimization-coming-to-you-live.
12. (page 157) Max Aliapoulios et al., “An Early Look at the Parler Online Social Network,” arXiv preprint, January 2021.
13. (page 160) David Thiel, et al., “Contours and Controversies of Parler,” Stanford Internet Observatory, January 28, 2021, https://cyber.fsi.stanford.edu/io/news/sio-parler-contours.
14. (page 160) Casey Newton, “How Alex Jones lost his info war,” The Verge, August 7, 2018, https://www.theverge.com/alex-jones-deplatformed-misinformation-hate-speech-apple-facebook-youtube; Makena Kelly, “Facebook bans Roger Stone after linking him to fake accounts,” The Verge, July 8, 2020, https://www.theverge.com/facebook-roger-stone-instagram-removed-proud-boys-far-right-manipulation.
15. (page 160) Shelley Childers, “Parler: New social media platform with no
fact-checking rises in popularity,” ABC7 Eyewitness News Chicago, November 10,


2020, https://abc7chicago.com/society/new-social-media-platform-with-no-fact-checking-on-the-rise/.
16. (page 160) Thiel, et al., “Contours and Controversies of Parler.”
17. (page 161) Soroush Vosoughi, et al., “The spread of true and false news online,” Science 359, no. 6380 (March 2018): 1146-1151, https://doi.org/10.1126/science.aap9559; Nir Grinberg, et al., “Fake News on Twitter during the 2016 U.S. election,” Science 363, no. 6425 (January 2019): 374-378, https://doi.org/10.1126/science.aau2706.
18. (page 162) Benkler, et al., “Mail-in Voter Fraud: Anatomy of a Disinformation
Campaign.”
19. (page 163) Jim Maddock, et al., “Characterizing Online Rumoring Behavior Using Multi-Dimensional Signatures,” Proceedings of the 18th ACM Conference on Computer Supported Cooperative Work & Social Computing (February 2015).
20. (page 164) D.J. Flynn, Brendan Nyhan, and Jason Reifler, “The Nature and Origins of Misperceptions: Understanding False and Unsupported Beliefs about Politics,” Advances in Political Psychology 38, no. 1 (January 26, 2017): 127-150, https://doi.org/10.1111/pops.12394.
21. (page 164) Pamela J. Shoemaker and Timothy Vos, Gatekeeping Theory (Routledge, 2009).
22. (page 164) Sharon Meraz and Zizi Papacharissi, “Networked Gatekeeping and Networked Framing on #Egypt,” The International Journal of Press/Politics 18, no. 2 (2013): 138-166.
23. (page 164) Ian Kennedy, et al., “Emerging Narratives Around ‘Mail Dumping’ and Election Integrity,” Election Integrity Partnership, September 29, 2020, https://www.eipartnership.net/rapid-response/mail-dumping; WBAY News Staff, “Absentee ballots among mail found along Outagamie County highway,” WBAY Wisconsin, updated September 24, 2020, https://www.wbay.com/absentee-ballots-mail-found-along-outagamie-county-ditch/; Casey Nelson, “Postal Service Investigating Mail Found in Greenville Ditch,” WYDR Wisconsin, September 23, 2020.
24. (page 165) Jim Hoft, “Breaking: US Mail Found in Ditch in Rural Wisconsin - Included Absentee Ballots,” The Gateway Pundit, September 23, 2020, https://www.thegatewaypundit.com/2020/09/breaking-us-mail-found-ditch-greenville-wisconsin-included-absentee-ballots/.
25. (page 165) FOX11 News staff, “Mail found in ditch in Greenville,” WLUK Wisconsin, September 23, 2020, https://fox11online.com/news/local/mail-found-in-ditch-in-greenville.


26. (page 166) Patrick Marley, “Mail found in Greenville ditch did not include any Wisconsin ballots,” Milwaukee Journal Sentinel, updated October 1, 2020, https://www.jsonline.com/story/news/politics/elections/mail-found-greenville-ditch-did-not-include-any-wisconsin-ballots/.
27. (page 168) Kolina Koltai, Jack Nasetta, and Kate Starbird, “‘Friend of a Friend’ Stories as a Vehicle for Misinformation,” Election Integrity Partnership, October 26, 2020, https://www.eipartnership.net/rapid-response/friend-of-a-friend-stories-as-a-vehicle-for-misinformation.
28. (page 169) Joseph Deckert, Mikhail Myagkov, and Peter C. Ordeshook, “Benford’s Law and the Detection of Election Fraud,” Political Analysis 19, no. 3 (Summer 2011): 245-268.
29. (page 170) The median number of votes for Biden was 577 as a consequence
of averaging 73% of the vote in districts with a median size of 829 reported
votes.
30. (page 170) Deckert, et al., “Benford’s Law and the Detection of Election
Fraud.”
31. (page 171) Naim Kabir, “Dr. Shiva Ayyadurai & The Danger Of Data Charlatans: Election Fraud in Michigan? Nope: just how lines work,” Medium, November 11, 2020, https://kabir-naim.medium.com/dr-shiva-ayyadurai-the-danger-of-data-charlatans.
32. (page 172) House Committee on Oversight and Reform (@OversightDems), “BREAKING NEWS: Erie, Pa. #USPS whistleblower completely RECANTED his allegations of a supervisor tampering with mail-in ballots after being questioned by investigators, according to IG,” Twitter thread, November 10, 2020, https://twitter.com/OversightDems; Luke Broadwater, “Postal worker withdraws claim that ballots were backdated in Pennsylvania, officials say,” New York Times, November 10, 2020, https://www.nytimes.com/2020/11/10/technology/postal-worker-withdraws-claim-that-ballots-were-backdated-in-pennsylvania-officials-say.html.
33. (page 172) “The New York Times STRIKES AGAIN: Seven Reporters from the ‘Paper of Record’ Join Forces to Defame Brave USPS Whistleblower Richard Hopkins in Front-Page Piece,” Project Veritas, February 3, 2021, https://www.projectveritas.com/news/the-new-york-times-strikes-again-seven-reporters-from-the-paper-of-record/.
34. (page 172) Isabella Garcia-Camargo, et al., “Project Veritas #BallotHarvesting Amplification,” Election Integrity Partnership, September 29, 2020, https://www.eipartnership.net/rapid-response/project-veritas-ballotharvesting.
35. (page 172) Kelly v. Commonwealth of Pennsylvania, No. 68 MAP 2020 (Pa.
Nov. 28, 2020), https://www.democracydocket.com/wp-content/uploads/sites/Kelly-v-Commonwealth-Complaint-PFR.pdf.


36. (page 172) Appeal of Kelly v. Commonwealth of Pennsylvania, http://www.pacourts.us/assets/files/; Emergency Application for Stay of Court’s Order, Kelly v. Commonwealth of Pennsylvania, December 2, 2020, http://www.pacourts.us/assets/files/; Kelly v. Commonwealth of Pennsylvania, US Supreme Court Order in Pending Case, December 8, 2020, https://www.supremecourt.gov/orders/courtorders/.
37. (page 173) “Pearson v. Kemp (Sidney Powell 2020 Election Lawsuit) - Various Motions & Orders,” Scribd, November 27, 2020, https://www.scribd.com/document/Pearson-v-Kemp-Sidney-Powell-2020-Election-Lawsuit-Various-Motions-Orders.
38. (page 173) Jeremy W. Peters and Alan Feuer, “What We Know About Sidney Powell, the Lawyer Behind Wild Voting Conspiracy Theories,” New York Times, December 8, 2020, https://www.nytimes.com/article/who-is-sidney-powell.html.
39. (page 173) Alison Durkee, “Sidney Powell’s Voter Fraud Claims Fail in All Battleground States as Arizona and Wisconsin Judges Reject Cases,” Forbes, December 9, 2020, https://www.forbes.com/sites/alisondurkee/2020/12/09/sidney-powells-voter-fraud-claims-fail-for-third-time-as-arizona-judge-rejects-case/.
40. (page 173) Joe Bak-Coleman, “Vote Data Patterns used to Delegitimize the Election Results,” Election Integrity Partnership, November 6, 2020, https://www.eipartnership.net/rapid-response/what-the-election-results-dont-tell-us.
41. (page 173) Reuters Staff, “Fact check: Deviation from Benford’s Law does not prove election fraud,” Reuters, November 10, 2020, https://www.reuters.com/article/uk-factcheck-benford/fact-check-deviation-from-benfords-law-does-not-prove-election-fraud.


Chapter 5
Actors and Networks: Repeat Spreaders of Election Misinformation

5.1 Introduction
In this chapter, we look systematically across EIP tickets, tracing content across
platforms to identify “repeat spreaders”—i.e., individuals and organizations who
were repeatedly influential in spreading false and misleading narratives about
the 2020 election. We address the following questions:

• Which Twitter accounts, Facebook Pages/Groups, and YouTube channels were most influential in the spread of these narratives?

• What domains were used to host content that was then mobilized through social media in the spread of those narratives?

• Considering the structure of the online discourse, in which communities (networks of accounts) were these repeat spreaders located?

5.2 Methods for Identifying Repeat Spreaders of False and Misleading Narratives
To identify the repeat spreaders, we draw from three complementary views:
one from our ticketing and analysis process (described in Chapters 1 and 2); a
second through Twitter data EIP partners collected contemporaneously; and a


third through CrowdTangle and Facebook search functionality, collected after
the EIP’s real-time analyses ended.
These complementary views allow us to:

• Identify some of the most influential accounts and most widely shared
domains on two of the most widely used platforms (Facebook and Twitter).

• Explore, through tracing links in our Facebook and Twitter data, how other
widely used social media platforms (like YouTube) fit into these incidents.

• Observe cross-platform connections and sharing practices.

Delineating Election-Integrity Incidents


Through our live ticketing process, analysts identified social media posts and
other web-based content related to each ticket, capturing original URLs (as well
as screenshots and URLs to archived content). In total, the EIP processed 639
unique tickets and recorded 4,784 unique original URLs.
After our real-time analysis phase ended on November 30, 2020, we grouped
tickets into incidents and narratives. We define an incident as an information
cascade related to a specific information event. Often, one incident is equivalent
to one ticket, but in some cases a small number of tickets mapped to the same
information cascade, and we collapsed them. As described in Chapter 3, inci-
dents were then mapped to narratives—the stories that develop around these
incidents—where some narratives might include several different incidents.
For tractability, we limited our analysis in this chapter to 181 tickets mapped
onto 153 incidents that related to the narratives in Chapter 3 and that we
determined either to (1) have relatively large spread (>1,000 tweets) on Twitter
or (2) be of “high priority,” as judged by analysts during our real-time research.
Next, through an iterative process, we identified a keyword-based search string
and a time window for each incident that would allow us to capture a compre-
hensive, low-noise dataset from Twitter, Instagram, and Facebook. We also
collected data for each incident from YouTube using links to that platform from
Twitter.

Collecting Data for Each Incident


Twitter Data Collection
We collected data from Twitter in real time from August 15 through December 12,
2020.1 Using the Twitter Streaming API,2 we tracked a variety of election-related
terms (e.g., vote, voting, voter, election, election2020, ballots), terms related


to voter fraud claims (e.g., fraud, voterfraud), location terms for battleground
states and potentially newsworthy areas (e.g., Detroit, Maricopa), and emergent
hashtags (e.g., #stopthesteal, #sharpiegate). The collection resulted in 859
million total tweets.
From this database, we created a subset of tweets associated with each incident,
using three methods: (1) tweets recorded in our ticketing process, (2) URLs
recorded in our ticketing process, and (3) search strings.
Relying upon our Tier 1 Analysis process (described in Chapter 1), we began with
tweets that were directly referenced in a ticket associated with an incident. We
also identified (from within our Twitter collection) and included any retweets,
quote tweets, and replies to these tweets. Next, we identified tweets in our
collection that contained a URL that had been recorded during Tier 1 Analysis
as associated with a ticket related to this incident. Finally, we used the search
string and time window developed for each incident to identify tweets from
within our larger collection that were associated with each election integrity
incident.
In total, our incident-related tweet data included 5,888,771 tweets and retweets
from ticket status IDs directly, 1,094,115 tweets and retweets collected first from
ticket URLs, and 14,914,478 from keyword searches, for a total of 21,897,364
tweets.
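
To illustrate the third method, here is a minimal sketch of subsetting a tweet collection by a per-incident search string and time window. The incident definition, field names, and sample tweets are hypothetical; the actual pipeline operated over the 859-million-tweet collection described above.

```python
from datetime import datetime, timezone

# Hypothetical incident definition: a set of required terms and a time
# window, mirroring the iterative search-string process described above.
incident = {
    "terms": ["sharpie", "ballot"],  # all terms must appear in the text
    "start": datetime(2020, 11, 3, tzinfo=timezone.utc),
    "end": datetime(2020, 11, 6, tzinfo=timezone.utc),
}

def matches(tweet, incident):
    """True if a tweet falls in the incident window and contains all terms."""
    text = tweet["text"].lower()
    return (incident["start"] <= tweet["created_at"] <= incident["end"]
            and all(term in text for term in incident["terms"]))

tweets = [
    {"text": "They gave us Sharpies and my ballot was rejected!",
     "created_at": datetime(2020, 11, 4, tzinfo=timezone.utc)},
    {"text": "Polls open at 7am tomorrow.",
     "created_at": datetime(2020, 11, 2, tzinfo=timezone.utc)},
]
subset = [t for t in tweets if matches(t, incident)]
print(len(subset))  # 1
```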

Facebook and Instagram Data Collection


To understand how the information ecosystem looks from the perspective of
Facebook and Instagram, we collected public posts through the CrowdTangle API
from Facebook Groups, Facebook Pages, Facebook verified profiles and public
Instagram accounts. We used the same set of incidents, and adapted the search
strings to capture comprehensive, low-noise samples for each incident from
these platforms. We had to adjust the search strings, often adding additional
search criteria (voting- and election-related terms) to bring the results into
alignment with our Twitter data, which was already constrained to voting-
related data.

5.3 Most Engaged Incidents


The 153 incidents examined varied dramatically in spread, ranging from under
1,000 tweets to over 7 million tweets in a single incident. Overall, the majority
of these incidents focused on topics related to delegitimization (110 of the
incidents), although several were associated with participation interference (25
incidents) and procedural interference (23 incidents).3 Table 5.1 enumerates the


ten most prominent incidents (by Twitter spread) with a short description of
each.

5.4 Political Alignment of Influential Twitter Accounts
To understand the social structure of Twitter accounts that posted about the
United States election, we created a network map10 of influential accounts
and the engaged audiences they share, using retweets as a rough measure of
influence. We included two accounts as nodes in our network if at least seven
users in our election-related Twitter streams retweeted both accounts at least
20 times each between September 1 and December 1, 2020. In practice, this
means that accounts are connected to each other if they share a similar audience
of accounts retweeting them. We then identified community clusters within
this network,11 excluding small or unrelated communities.
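
In graph terms, this is a co-retweet projection: accounts become nodes, and an edge joins two accounts whose heavy retweeters overlap sufficiently. A minimal sketch using the thresholds from the text (at least 20 retweets for a user to count toward an account’s audience, and at least seven shared users to form an edge), with toy data, might look like this:

```python
from collections import defaultdict
from itertools import combinations

MIN_RETWEETS = 20      # a user must retweet an account at least this often
MIN_SHARED_USERS = 7   # accounts are linked if this many users retweet both

# retweet_counts[user][account] = times `user` retweeted `account`.
# Toy data; in practice this is built by scanning the full tweet stream.
retweet_counts = {
    "user1": {"acctA": 25, "acctB": 30},
    "user2": {"acctA": 21, "acctB": 22},
}

# Invert: for each account, the set of users who retweet it heavily.
audiences = defaultdict(set)
for user, counts in retweet_counts.items():
    for account, n in counts.items():
        if n >= MIN_RETWEETS:
            audiences[account].add(user)

# Connect accounts whose heavy-retweeter audiences overlap enough.
edges = [
    (a, b, len(audiences[a] & audiences[b]))
    for a, b in combinations(sorted(audiences), 2)
    if len(audiences[a] & audiences[b]) >= MIN_SHARED_USERS
]
print(edges)  # empty for this toy data; real data yields a large graph
```

A standard community-detection algorithm run over the resulting weighted graph then produces clusters like those shown in Figure 5.1.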
As displayed in Figure 5.1 on page 186, this pruning left us with two communities
broadly aligned with the US political right and left. The right-leaning community
was composed of two heavily intertwined communities: (1) prominent right-wing
(pro-Trump) influencers in politics, media, and social media; and (2) a community
of largely anonymous accounts who were active and vocal supporters of Trump,
QAnon, and other right-wing groups. The left-leaning community was focused
around left-leaning politicians, pundits, and mainstream news outlets, with
satellite communities consisting of users with more socialist politics, and a
small group of high-volume, activist users behaviorally similar to the much
larger right-wing activist community.
First, we looked at the incident-sharing behavior of the accounts represented
in this network, using the community structure to draw meaningful differences.
We found that influential accounts associated with the US right shared more
incidents than the left both by absolute number (151 vs. 119 tickets) and by the
total number of times they were retweeted in these incidents (17.8 million vs.
1.9 million retweets). The majority of incidents were primarily spread by the
right-wing communities: right-leaning accounts were retweeted more than left-
leaning accounts in 129 incidents, while left-leaning accounts were retweeted
more in 23.
Many incident-related tweets from left-leaning accounts were attempting to
fact-check, rather than uncritically spread, the false and misleading narratives.
In one of the most extreme examples, a false claim made by Michele Bachmann
that ballots pre-filled in China were being smuggled into the United States
received more spread on the left than the right, solely due to fact-checking be-
havior. Sometimes, the left-leaning accounts’ propensity to fact-check appeared


Dominion Voting Systems: Swing States (7,157,398 related tweets). This incident accused Dominion Voting Systems software of switching votes in favor of Joe Biden, particularly in swing states like Georgia; as of January 2021, Dominion has filed defamation lawsuits against prominent individuals and media that perpetuated this claim, and some have retracted their stories.4

Stop The Steal (2,888,209 related tweets). This broadly defined incident was based on tweets from verified users broadly supporting the #StopTheSteal narrative, which alleged that certain states were not properly counting votes for President Trump.

Sharpiegate (822,477 related tweets). This incident falsely claimed that in-person voters in Arizona (believed to be predominantly supporters of President Trump) were given Sharpies to vote with, which the machines would be unable to read, thus causing their votes to be excluded.

Pennsylvania Poll Watcher (618,168 related tweets). This incident centered on narratives that a GOP-affiliated poll watcher was wrongfully denied entry to a Pennsylvania polling station. This content was then reframed to falsely claim that this was evidence of illegal actions taking place in the polling station. While the video does show a poll watcher being denied, it lacked broader context as to the reason for denial, which was not politically motivated.5

Pennsylvania Postal Whistleblower (591,838 related tweets). This incident centered on footage from Project Veritas showing a postal worker claiming that the post office had ordered him to backdate ballots that arrived after the voting deadline in Pennsylvania. The whistleblower, after being questioned by investigators, later recanted these statements.6

Michigan Poll Watcher Whistleblowers (498,366 related tweets). This incident focused on several whistleblowers from Michigan, some of whom were poll watchers in Wayne County (home to Detroit), alleging, in a video, various illegal actions by poll workers.

Michigan Dead Voters (486,096 related tweets). This incident focused on false claims, based on misinterpretations of information on a Michigan government-affiliated website, that dead and implausibly old people had voted in the 2020 election.7

Sunrise Zoom Calls (475,581 related tweets). This incident centered on misleadingly edited video footage that claimed to show federal employees conspiring with the left-leaning environmental activist organization Sunrise Movement to organize a coup, leak information, and shut down Washington, DC.8

Nevada Whistleblower (415,614 related tweets). This incident claimed that a whistleblower who worked for the Clark County Elections Department (which encompasses portions of Las Vegas) had come forward with a list of various “nefarious behaviors.” These included falsely claiming that illegitimate ballots were being processed and that people were filling out ballots that were not their own near a Biden/Harris campaign van.

Minnesota Ballot Harvesting (415,570 related tweets). This incident, seeded by a Project Veritas video, surfaced otherwise unsupported claims of ballot harvesting in Minnesota and attempted to connect those claims to US Representative Ilhan Omar (see discussion in Chapter 3).9

Table 5.1: Top 10 most-tweeted incidents in our data.


Figure 5.1: A network visualization of influential Twitter accounts from our dataset of election-
related tweets collected from September 1 to December 1, 2020.12 Each node is one Twitter
account, and two nodes are linked together if they are retweeted by the same accounts. Two
nodes are pulled closer together if they share more accounts, and larger nodes are connected to
more accounts. Node colors correspond to automatically determined clusters of users, which
broadly split into right- and left-wing communities. Subcommunities include activist accounts
on both the left and right, and a socialist-leaning cluster on the left.

Sometimes, the left-leaning accounts’ propensity to fact-check appeared
to stall the spread of some misleading incidents, such as when the spread of a
false claim about ballots being unlawfully rejected in Georgia was significantly
slowed after a series of corrective fact-checks. In other incidents, these fact-
checks came too late; a check for a similar false claim about undelivered ballots
in Florida came more than 24 hours after initial spread, and had no discernible
impact on subsequent sharing. There were also instances of misinformation
originating and spreading almost solely via left-leaning accounts, such as a
video of an overflowing ballot room in Miami-Dade implying that Postmaster
General DeJoy was hiding ballots for Biden in the critical county, as well as some
incidents in which both the right and left participated, such as the mail-dumping
incident in Glendale, California, described in Chapter 3.13

Influential accounts on the political right, by contrast, rarely engaged in
fact-checking behavior, and were responsible for the most widely spread incidents
of false or misleading information in our dataset. Right-leaning accounts also
more frequently augmented their misinformation posts with narrative-related
hashtags, such as #StopTheSteal and #DeadVoters, which persisted across
multiple incidents and were shared millions of times in our dataset.14 Most
distinctive about right-leaning accounts, however, was their frequent involvement
in many tickets. Whereas almost all of the most influential left-leaning accounts
were involved in only one or two incidents of false or misleading information,
many right-leaning accounts with large audiences were involved with upwards
of 10, and were often responsible for seeding or catalyzing an incident’s spread
through the conservative, right-wing, and pro-Trump Twitter networks.

5.5 Repeat Spreaders


In this analysis, we attempt to identify entities—e.g., Twitter accounts, Facebook
Pages, and YouTube channels—that played a significant role in the spread of
multiple election integrity incidents, such as the ones identified above in Sec-
tion 5.3 on page 183. Expanding upon our pre-election analysis of influential
Twitter accounts, we refer to these entities as “repeat spreaders.”15

Repeat Spreader Twitter Accounts


First, we look at the most influential Twitter accounts across election integrity
incidents in terms of shaping the flow of information. We identify accounts that
produced highly retweeted original tweets (retweeted more than 1,000 times)
across multiple incidents. Table 5.2 on the next page lists the accounts that
appeared across the most incidents (>=10) along with relevant details for each
account.
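
A minimal sketch of this selection criterion, assuming a pandas DataFrame of original tweets with hypothetical columns account, incident_id, and retweet_count (the EIP's ticket data is structured differently):

    import pandas as pd

    def repeat_spreaders(tweets: pd.DataFrame,
                         min_retweets: int = 1000,
                         min_incidents: int = 10) -> pd.DataFrame:
        # Keep only original tweets that crossed the virality threshold.
        viral = tweets[tweets["retweet_count"] > min_retweets]
        per_account = viral.groupby("account").agg(
            incidents=("incident_id", "nunique"),
            tweets_over_1000=("incident_id", "size"),
            retweets_in_incidents=("retweet_count", "sum"),
        )
        # An account qualifies if it produced such tweets in >= 10 incidents.
        hits = per_account[per_account["incidents"] >= min_incidents]
        return hits.sort_values("incidents", ascending=False)
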
The 21 most prominent repeat spreaders on Twitter—accounts that played a
significant role in disseminating multiple false or misleading narratives that
threatened election integrity—include political figures and organizations, parti-
san media outlets, and social media all-stars. Perhaps a reflection of both the
nature of information threats to election integrity and our process for identify-
ing them (see Chapter 2 for a note on the limitations of our approach), all 21 of
the repeat spreaders were associated with conservative or right-wing political
views and support of President Trump, and all featured in the politically “right”
cluster in our network graph in Figure 5.1 on the facing page. Notably, 15 of the
top spreaders of election misinformation were verified, blue-check accounts.
President Trump and his two older sons figure prominently in the Twitter
dataset. In addition, several GOP political figures, along with leaders of con-
servative political organizations, repeatedly spread misleading narratives on
Twitter.

Columns: Rank | Account | Verified | Incidents | Tweets w/ >1000 Retweets | Followers | Retweets in Incidents | Left or Right
1 RealJamesWoods True 27 36 2,738,431 403,950 Right
2 gatewaypundit True 25 45 424,431 200,782 Right
3 DonaldJTrumpJr True 24 27 6,392,929 460,044 Right
4 realDonaldTrump True 21 43 88,965,710 1,939,362 Right
4 TomFitton True 21 29 1,328,746 193,794 Right
6 JackPosobiec True 20 41 1,211,549 188,244 Right
7 catturd2 False 17 20 436,601 66,039 Right
8 EricTrump True 16 25 4,580,170 484,425 Right
9 ChuckCallesto True 15 17 311,517 117,281 Right
10 charliekirk11 True 13 18 1,915,729 232,967 Right
11 marklevinshow True 12 10 2,790,699 90,157 Right
11 cjtruth False 12 27 256,201 66,698 Right
11 JamesOKeefeIII False 12 64 1,021,505 625,272 Right
11 prayingmedic False 12 26 437,976 57,165 Right
15 RichardGrenell True 11 12 691,441 143,363 Right
15 pnjaban True 11 14 208,484 58,417 Right
17 BreitbartNews True 10 11 1,647,070 38,405 Right
17 TheRightMelissa False 10 31 497,635 73,932 Right
17 mikeroman False 10 10 29,610 128,726 Right
17 robbystarbuck True 10 15 204,355 65,651 Right
17 seanhannity True 10 22 5,599,939 96,641 Right

Table 5.2: Repeat Spreaders: Twitter accounts that were highly retweeted across multiple inci-
dents. Twitter has since suspended the accounts of realDonaldTrump (January 8), The Gateway
Pundit (February 6), cjtruth, and prayingmedic (January 8).16 Account verification status as of
11/10/2020.

Charlie Kirk of Turning Point USA, for example, posted three tweets at critical
times that helped to catalyze the spread of Sharpiegate (see Chapter 3, and
Chapter 4, Figure 4.6 on page 167). James O’Keefe, founder of Project Veritas,
is also a significant repeat spreader. We discuss in more detail the activities
of President Trump and his sons, as well as James O’Keefe and Project Veritas,
below in Section 5.6.
Far-right hyperpartisan media outlets also participated in a wide range of inci-
dents, including The Gateway Pundit, which ranked #2 in the dataset; Breitbart
News; and two Fox News hosts. The Gateway Pundit (Twitter suspended this ac-
count on February 6, 2021) and Breitbart News are examined fully in Section 5.6
on page 195. The remainder of the repeat spreader accounts include a range of
right-wing social media influencers—James Woods, conservative celebrity and
actor, tops the list.
Many of these accounts follow others in this group, and their networks of
followers overlap as well. They also actively promote and spread each others’
content. Once content from misleading narratives entered this right-wing
Twitter network, it often spread quickly across influential accounts and out to
their overlapping audiences, making it very difficult to slow down or correct.

Domains Cited in Incidents (in the Twitter Data)


Next, using the same tweet data, we identified the most prominent domains
across election integrity incidents. We identified domains that were highly
tweeted (linked to by more than 500 tweets or retweets) in multiple incidents.
Table 5.3 lists the domains that appeared across the most incidents (>=7) along
with relevant details for each domain. Domains within this list may be cited for
different reasons—some (the Washington Post, for example) appear in this table
for articles that debunked false claims and narratives.
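
A minimal sketch of this extraction step, assuming shortened links are resolved by following redirects with the requests library (the report's actual resolver is not specified):

    from urllib.parse import urlparse

    import requests

    def final_domain(url: str) -> str:
        """Follow redirects (e.g., t.co or bit.ly shorteners) and return the host."""
        try:
            url = requests.head(url, allow_redirects=True, timeout=5).url
        except requests.RequestException:
            pass  # fall back to the raw URL if the shortener cannot be resolved
        return urlparse(url).netloc.lower()
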
Columns: Rank | Domain | Incidents | # Original Tweets | Total Retweets | ≈% Left Spread | ≈% Right Spread
1 www.thegatewaypundit.com 46 29,207 840,740 0.08% 99.92%
2 www.breitbart.com 26 8,569 394,689 0.94% 99.06%
3 www.youtube.com 21 14,040 269,996 2.51% 97.49%
4 www.washingtonpost.com 18 1,986 74,360 84.76% 15.23%
5 www.foxnews.com 14 1,330 34,143 0.91% 99.09%
6 www.theepochtimes.com 12 2,167 86,325 0.00% 100.00%
7 nypost.com 11 4,513 178,176 2.27% 97.73%
8 www.zerohedge.com 10 1,043 27,687 0.52% 99.48%
8 www.cnn.com 10 1,269 100,642 89.28% 10.71%
10 apnews.com 9 432 13,067 33.84% 66.14%
10 justthenews.com 9 1,035 61,305 0.00% 100.00%
10 www.nytimes.com 9 776 50,021 63.88% 36.11%
10 thedcpatriot.com 9 572 26,417 0.00% 99.99%
14 gellerreport.com 8 516 15,075 0.00% 99.99%
14 thenationalpulse.com 8 770 39,160 0.00% 99.99%
14 nationalfile.com 8 4,443 195,489 0.51% 99.48%
17 www.washingtontimes.com 7 280 11,445 1.45% 98.54%
17 www.pscp.tv 7 2,067 83,269 0.47% 99.53%
17 saraacarter.com 7 531 81,172 1.39% 98.60%
17 www.washingtonexaminer.com 7 1,518 75,939 0.98% 99.02%

Table 5.3: Domains, extracted from tweets, that were highly tweeted (>500) across multiple
incidents. Shortened URLs were followed when possible to extract original domains. The incident
count includes the number of incidents for which the domain was linked to in over 500 tweets
or retweets in our incident-related Twitter data. The original tweets are the count of non-
retweets (including quote tweets and replies) that mentioned the domain within those incidents,
while the total retweets column is a count of the retweets, both from within our incident-linked
Twitter data. Finally, the estimated right/left spread is the proportion of original tweets made
by influential users classified on the ideological spectrum based on our network analysis, above.
Users not included in that network analysis are excluded from the estimate.

The top 20 domains involved in spreading or discussing false or misleading
information included both partisan and mainstream media outlets—which played
markedly different roles in the information incidents (primarily spreading vs.
primarily correcting). The two most significant domains in our incident-related
data belonged to partisan outlets: The Gateway Pundit (www.thegatewaypundit.com)
and Breitbart (www.breitbart.com). Fox News again appears on the list;
other notable partisan news outlets are described in Appendix C on page 251.
A number of “mainstream” media sites also appear in our list of frequent
domains—often picked up within the left-leaning clusters in our network map.
Though this may suggest a somewhat equal share of participation in misinfor-
mation on the political left and right, the majority of stories cited on “the left”
were referenced as fact-checks on the associated incidents or narratives. For
instance, a story from CNN that challenges the Trump campaign’s claims of
deceased voters is representative of the corrective role these sites played within
the spread of these misleading narratives.17
A couple of incidents of false or misleading information did run through the left,
including a story about unauthorized voting boxes being set up by Republicans—
a true story, but one that falsely framed the motive and exaggerated the impact
of such actions. The story was covered by “mainstream” media sites including
AP News, the New York Times, CNN, and the Washington Post, all included in our
list of frequent domains. A discarded-mail incident was framed by the left as
the Trump administration’s effort to harm the mail-in voting process. CNN, in
particular, was cited in that USPS ballot-dumping narrative—though for content
that did not explicitly invoke the election integrity frame.
The presence of both YouTube (youtube.com) and Periscope (pscp.tv) in the
highly tweeted domain list illustrates the cross-platform nature of misleading
election-related narratives.
YouTube data is further discussed below. Interestingly, in our election-integrity
related data, both YouTube and Periscope were primarily tweeted by accounts on
the political right or pro-Trump side of the network (see Figure 5.1 on page 186).
In summary, though a few false or misleading narratives about the integrity of
the 2020 election did run through the left, when we look at the domains that
repeatedly helped to spread—as opposed to correct—election-related misinfor-
mation, we find an array of predominantly right-wing and pro-Trump partisan
media outlets.

Repeat Spreaders: YouTube Channels in the Twitter Data


YouTube played a prominent role in the spread of false and misleading informa-
tion across the election integrity incidents we analyzed, ranking third among
most linked-to domains overall. In at least 44 distinct incidents, YouTube videos
were tweeted more than 10 times.


From our corpus of data, we identified the YouTube channels that were repeat
spreaders within the Twitter discourse—i.e., those that repeatedly used YouTube
to disseminate multiple false and/or misleading narratives. To do this we first
extracted all of the YouTube links from our incident data and used the YouTube
API to determine what channels posted the videos. We then identified channels
that were highly tweeted—linked to more than ten times in an incident—for
multiple election integrity incidents. This provided a corpus of 665 videos
from 411 unique YouTube channels.18 Table 5.4 lists the top 12 repeat spreader
channels (>4 incidents) that arose from this analysis; a sketch of the
channel-lookup step follows the table.

Columns: Rank | Channel | Incidents | Total Tweets | Videos | YouTube Views
1 Project Veritas 7 128,734 26 9,613,437
1 CDMedia 7 258,314 1 691,395
3 Donald J Trump 6 4,338 10 10,849,373
3 One America News Network 6 207,544 15 4,034,274
3 GOP War Room 6 186,106 8 1,732,847
3 Dr. Shiva Ayyadurai 6 196,292 1 1,052,429
7 Gateway Pundit 5 10,015 13 4,085,657
8 NewsNOW from FOX 4 406 7 9,450,514
8 StevenCrowder 4 15,490 3 8,159,462
8 BlazeTV 4 314 6 3,900,083
8 Judicial Watch 4 1,333 7 511,568
8 MR. OBVIOUS 4 283 5 401,481

Table 5.4: Repeat Spreaders: YouTube channels that were highly tweeted (>=10 times/incident)
across multiple (>=4) incidents.
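
A minimal sketch of the channel-lookup step referenced above, assuming the google-api-python-client library; the regular expression and batching are illustrative, while videos().list with part="snippet" is the standard YouTube Data API v3 call:

    import re

    from googleapiclient.discovery import build

    VIDEO_ID = re.compile(r"(?:youtu\.be/|v=)([\w-]{11})")

    def channels_for_links(links, api_key):
        """Map tweeted YouTube URLs to the channels that posted the videos."""
        youtube = build("youtube", "v3", developerKey=api_key)
        ids = sorted({m.group(1) for m in map(VIDEO_ID.search, links) if m})
        channels = {}
        for i in range(0, len(ids), 50):  # the endpoint accepts up to 50 ids per call
            batch = ",".join(ids[i:i + 50])
            response = youtube.videos().list(part="snippet", id=batch).execute()
            for item in response.get("items", []):
                channels[item["id"]] = item["snippet"]["channelTitle"]
        return channels
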

The channels found to be repeat spreaders of false and misleading narratives
through YouTube look similar to the repeat spreaders on Twitter—right-wing
influencers, hyperpartisan media outlets such as One America News Network
(OANN) and The Gateway Pundit, political groups supportive of President Trump
such as Project Veritas, and President Trump himself. These channels attracted
millions of views for content related to known incidents of misinformation
surrounding the 2020 election.
Two channels, those of compilation video creators Dr. Shiva Ayyadurai and
CDMedia, were remarkable in that each appeared in our top repeat spreader list
on the strength of a single video cited across multiple incidents. Dr. Ayyadurai is discussed
as a prominent repeat spreader in Section 5.6 on page 195.

Repeat Spreaders: Facebook Pages & Groups and Instagram


For our Facebook and Instagram analysis, we identified accounts (public Pages
and Groups for Facebook, public accounts for Instagram) that were highly en-
gaged with across multiple incidents. Aligning with the threshold used for
accounts in our Twitter analysis, a post had to receive at least 1,000 likes or
favorites to be counted as part of an incident. In this way, we were looking for
accounts that were influential across incidents. The total engagement column
for Facebook is the sum of likes (and other emotive reactions), comments, and
shares. For Instagram, the total engagements are the sum of favorites and com-
ments. Tables 5.5 and 5.6 on page 194 feature the accounts that appeared across
the most incidents.
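
A minimal sketch of these engagement totals; the field names mirror CrowdTangle-style exports but are assumptions, not the exact schema we used:

    def total_engagement(post: dict, platform: str) -> int:
        """Sum the per-platform engagement fields described above."""
        if platform == "facebook":
            # Likes and other emotive reactions, plus comments and shares.
            return post["reactions"] + post["comments"] + post["shares"]
        if platform == "instagram":
            # Favorites plus comments.
            return post["favorites"] + post["comments"]
        raise ValueError(f"unsupported platform: {platform}")
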
Columns: Rank | Account Name | Facebook Page/Group | # of Incidents | # of Posts | Total Engagement
1 Breitbart Page 8 20 831,452
1 The Silent Majority Page 8 7 69,763
3 Heather Cox Richardson Page 6 8 816,755
3 David J Harris Jr. Page 6 11 282,652
3 James O’Keefe Page 6 20 194,596
3 Project Veritas Page 6 20 165,377
7 NowThis Politics Page 5 11 244,023
7 Team Trump Page 5 5 153,118
7 Ryan Fournier Page 5 6 67,885
7 Wendy Bell Radio Page 5 6 62,020
7 #WalkAway Campaign Group 5 12 51,854
7 StandwithMueller Page 5 7 19,345

Table 5.5: Repeat Spreaders: Facebook Pages and public Groups that were highly engaged with
(>=1000 engagements) across multiple (>=5) incidents.

Facebook
Table 5.5 shows the top 12 public Facebook Pages and Groups that repeatedly
shared content about the incidents in our dataset. From this data, we see that
public Facebook Pages (and not public Facebook Groups) tended to appear more
frequently as repeat spreaders. Only one Facebook Group appeared as a repeat
spreader. This may not be surprising, as many Groups that played a role in the
spread of election-related misinformation are either private (so would not be
accessible via CrowdTangle) or have been removed from Facebook.19 Facebook’s
longer format provided an opportunity for Pages to host long, detailed posts that
contain false claims and misleading narratives that spanned multiple incidents.
Among the repeat spreaders in the Facebook data, we see several familiar names,
including Breitbart, James O’Keefe, and Project Veritas. Short-form videos were
popular on the Facebook account of Team Trump, which does not appear
to be officially associated with the Trump campaign.
Most of the repeat spreaders in the Facebook list are, similar to what we see in
the Twitter and YouTube data, right-leaning and/or Trump-supporting entities.

However, we do see three left-leaning Pages among the group—NowThis Politics,
StandwithMueller, and historian Heather Cox Richardson. The inclusion of all
three is primarily the result of their Pages attempting to fact-check or otherwise
counter false or misleading information about the election. For example, the
NowThis Politics post below attempts to correct post-election misinformation,
quoting the Trump campaign in its text.

Figure 5.2: An example of a NowThis Politics Facebook post discussing Trump campaign claims
included in the data.

Instagram
The Instagram repeat spreaders list (see Table 5.6 on the next page) looks some-
what similar to our Twitter list, containing accounts of partisan media organiza-
tions (e.g., The Gateway Pundit, Breitbart), and public individuals (e.g., James
O’Keefe).


Columns: Rank | Account Name | Verified | # of Incidents | # of Posts | Total Engagement
1 KAGBABE 2.O Not verified 12 33 80,484
2 Breitbart Verified 10 14 670,577
2 The Gateway Pundit Not verified 10 20 132,440
4 James O’Keefe Verified 6 20 410,335
4 Baller Alert Verified 6 7 102,837
6 Michael Hennessey Not verified 5 82 169,623
6 Occupy Democrats Not verified 5 5 51,289
6 Latinos With Trump Not verified 5 14 47,167
6 Ben & Hannah🦅 Not verified 5 11 19,529
6 #HisNameWasSethRich 🐼🇺🇸 Not verified 5 7 18,814

Table 5.6: Repeat Spreaders: Instagram accounts that were highly engaged with
(>=1000 engagements) across multiple (>=5) incidents.

Unlike our Twitter list, most of the other accounts on the Instagram list are
not verified. We see a few new names that we do not see anywhere else, like
KAGBABE 2.O—an anonymous account that showed up in the most incidents—
Baller Alert, Michael Hennessey/Snowflake News, Latinos with Trump, Ben &
Hannah, and HisNameWasSethRich.

An account we see among the Instagram repeat spreaders is the left-leaning
group “Occupy Democrats.” Their Facebook Page also appeared in a few incidents
(though not enough to make the list of top spreaders in Table 5.5). In at least
two cases, Occupy Democrats was picked up in our data for trying to correct
misinformation related to an incident. In others, they spread information that
functioned to fan fears of voter disenfranchisement and intimidation.

For example, a tweet went viral on October 20, 2020, depicting an officer wear-
ing a Trump mask at a polling station in Miami.20 Within an hour, the Miami
Police Department publicly condemned the actions of the officer.21 Despite the
official condemnation, Occupy Democrats reposted the image through both its
Instagram and Facebook accounts. Its posts urged people to report the officer to
the non-emergency police line. Both posts generated substantial engagement. There
is no evidence to support the claim that this was part of an organized police-led
voter intimidation campaign, which appears in the embedded meme in the Oc-
cupy Democrats Facebook post in Figure 5.3 on the facing page. That framing
was both false and, while it likely functioned to rile Occupy Democrats followers
on the left, also carried a risk of suppressing voter turnout by fomenting fears
around voter intimidation at the polls (a concept covered in Chapter 3 with the
“Army for Trump” example).22


Figure 5.3: Screenshots of posts by Occupy Democrats about the incident, with specific instruc-
tions in the Facebook post (right) to call Miami’s non-emergency line to report the officer, both
after Miami PD’s official response.

5.6 An Integrated Look at Repeat Spreaders Across Platforms
In this section, we provide an integrated view, looking at how some of the most
active and prominent repeat spreaders pushed false and misleading narratives
about the election across platforms.

President Trump, His Family, and the Trump Campaign


Though the specific claims and narratives often originated elsewhere, the Trump
family and the Trump campaign regularly amplified incidents of false and mis-
leading information—especially false claims of election fraud—across multiple
platforms. President Trump’s official Twitter account (@realDonaldTrump) par-
ticipated in 21 distinct incidents and was the most highly retweeted in all of our
incident-related data (Twitter permanently suspended his account on January 8,
2021).23 His YouTube channel put out videos that linked to six distinct incidents,
making him tied for third, and that were viewed more than any other repeat
spreader’s videos. And his official Facebook account was the most engaged-with
account in all of our Facebook data.

President Trump’s adult sons Donald Jr. and Eric were involved in 24 and 16
incidents respectively; Donald Jr. was the third most prominent Twitter user
in the incident-related data. Between them, the president, Donald Jr., and Eric
Trump spread and reinforced narratives questioning the security of the mail-in
voting process, ballot harvesting claims, several different narratives about poll
watchers being denied access and other questionable “whistleblower” claims,
and the Dominion conspiracy theory.

These cases capture solely when Donald Trump or his campaign produced
content (posts, videos, tweets) related to an incident. In addition to content
production, the Trump team regularly used retweets to amplify content by
hyperpartisan media outlets and other accounts. Leading up to the election,
we described one incident in which Donald Trump Jr. amplified a ballot har-
vesting narrative produced by Project Veritas (see Figure 3.16 on page 66 in
Chapter 3).24 Similar amplification events occurred involving Dr. Shiva Ayyadu-
rai, The Gateway Pundit, Breitbart, and other hyperpartisan outlets. Owing
to their large following, members of the Trump family—and a broader array
of accounts associated with their campaign—were able to catalyze the spread
of election fraud narratives. Their role in the spread of misinformation was
therefore multidimensional—through both content production and content
amplification.

Their activity also extended beyond social media. Claims of electoral fraud were
pushed by members of the Trump family, the Trump campaign, and other surro-
gates on cable news, through press briefings, and eventually within numerous
court cases. Perhaps the most important role the Trump inner circle played was
to seed and perpetuate the prevailing narrative—the general notion of a “rigged
election.”

The Gateway Pundit


The Gateway Pundit was among the most active spreaders of election-related
misinformation in our analyses. The outlet used a cross-platform strategy,
hosting content on its website and using other channels to promote both its
own and others’ content. It appeared as a top repeat spreader through its
website, its Twitter account, its YouTube channel, and its Instagram account.
(Twitter suspended the account on February 6, 2021).25

Figure 5.4 on the facing page shows the relative engagement with The Gateway
Pundit’s content over time and across platforms within our incident-related
data.
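
The per-day series behind these figures can be sketched as follows, assuming a pandas DataFrame of incident-related posts with hypothetical timestamp, platform, and engagements columns:

    import pandas as pd

    def daily_engagement(posts: pd.DataFrame) -> pd.DataFrame:
        """One row per UTC day, one column per platform."""
        day = pd.to_datetime(posts["timestamp"], utc=True).dt.floor("D")
        return (posts.assign(day=day)
                     .pivot_table(index="day", columns="platform",
                                  values="engagements", aggfunc="sum")
                     .fillna(0))
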

Unlike some of the other entities featured here, The Gateway Pundit was highly
active throughout the election lifecycle, including during the weeks leading
up to the election, when it repeatedly spread content—in distinct information
incidents—that sought to undermine trust in mail-in voting specifically and the
eventual election results more generally. It participated in seeding and spread-
ing misleading information about ballots being harvested, chased, dumped,
stolen, and miscounted. It spread false narratives of election fraud built upon
misinterpretations of statistics and was active in spreading the false Dominion
conspiracy theory.


[Chart: “Gateway Pundit Cross-Platform Activity,” showing engagement per day on YouTube, Instagram, Twitter, and Facebook from September through December 2020.]
Figure 5.4: Engagements per day for The Gateway Pundit. Facebook engagements are in blue,
Twitter retweets in orange, and Instagram likes in green.

On Twitter, The Gateway Pundit’s account was highly retweeted across 26 dif-
ferent incidents (#2 among repeat spreaders). Evidence from our data suggests
that its prominence was due both to production of its own material and to am-
plification (via original and quote tweets) of other partisan content. It repeatedly
interacted with content and accounts of other repeat spreaders and influencers,
including Project Veritas, as shown in Figure 5.5.

Figure 5.5: Quote tweet by @JamesOKeefeIII (the founder of Project Veritas) of a tweet by Jim
Hoft (the operator of @gatewaypundit). Hoft’s tweet links to an article on thegatewaypundit.com,
which promotes a video released by Project Veritas.

Of all the domains linked to in our Twitter data, The Gateway Pundit’s website
was connected to the largest number of incidents (46) while also garnering the
most related original tweets (29,207) and retweets (840,750). Their YouTube
channel appeared in five incidents, and their 13 incident-related videos had
more than 4 million views on YouTube.
The Gateway Pundit was not as visible in the Facebook data we collected, but
its Instagram account was tied for #2 among repeat spreaders, appearing in 10
incidents for 20 posts that received more than 132,000 engagements.

Breitbart News
Breitbart News, a right-wing, online media outlet, was also a cross-platform
repeat spreader—pushing false and misleading narratives about the election
through their website, Twitter account, Facebook Page, and Instagram account.
In terms of number of different false or misleading information incidents that
they helped to spread, they were #1 on Facebook (8 incidents), #2 on Instagram
(10 incidents), and #2 among linked-to websites in the Twitter data (26 incidents).
On Facebook and Instagram, they had the highest engagement among repeat
spreaders.
Breitbart participated in a wide range of ballot-related incidents, such as mail-
dumping and ballot harvesting, voting machine issues, and now-debunked claims
that statistical anomalies suggest widespread election fraud. It both produced
its own content and propagated stories that initially rose to prominence on other
domains. Often, it picked up content found elsewhere online and reframed that
content within its own articles. However, Breitbart tended to be more careful
than The Gateway Pundit and others in how it framed events to subtly connect
them to potential issues of voter fraud without explicitly making those claims.

Newsmax Media
Newsmax Media (formerly NewsMax) is a conservative media outlet that pro-
duces content through its website, cable news channel, and various social media
accounts—including Twitter, YouTube, Facebook, and Instagram. Especially ac-
tive in the aftermath of the election, Newsmax repeatedly posted videos—across
their many media channels—where they hosted guests that made unsupported
and in many cases outright false claims about election fraud. The outlet appears
in several incidents in our data, from Stop The Steal and Sharpiegate to the
Dominion and Hammer and Scorecard conspiracy theories.
The Newsmax website is most visible in our data for seeding a misinformation
incident through a video interview (available on their website) claiming that the
head of the Federal Election Commission, Trey Trainor, believed that voter fraud
was occurring in states still counting ballots. Newsmax also hosted a pundit who
claimed that the Democrats were attempting a “coup” and ran several segments
containing false accusations about companies involved in the manufacture and
software development for voting machines.

Perhaps more interesting than the specific incidents that Newsmax was involved
in spreading is how the media outlet vastly increased its visibility in this discourse
immediately following the election. Figure 5.6 shows engagement (likes and
comments) across platforms with Newsmax content related to incidents of false
or misleading information about the election. Prior to November 3, Newsmax
was not a significant part of these conversations. But after the election, the
media outlet began to gain attention—quite rapidly—for its coverage of election
fraud claims.

[Chart: “Newsmax Cross-Platform Activity,” showing engagement per day on YouTube, Instagram, Twitter, and Facebook from September through November 2020.]
Figure 5.6: Engagements per day for Newsmax in incident-related data. Facebook is in blue,
Twitter retweets in orange, Instagram in green, and YouTube in red.

Gains in engagements on Newsmax’s content were accompanied by gains in
followers for their accounts on various social media platforms—translating to a
potentially long-term visibility increase for the outlet. Figure 5.7 on the next page
shows followers over time for Newsmax’s Twitter and Instagram accounts. Both
demonstrate a sharp increase in early November. The Twitter graph (which we
can generate at a much higher granularity) indicates that the first sharp increase
occurs at about 4:00 am UTC on November 3 (11:00 pm EST on election night).
Much of that may be attributable to their election night projections, including
a tweet erroneously announcing that President Trump had won the state of
Georgia. The @newsmax Twitter account would continue to gain followers
over the course of the post-election period—growing by nearly 300% in two
weeks (from 232,000 on November 2 to 668,000 on November 15)—as their
content began to coalesce around false claims of election fraud. Their Instagram
account saw an even more remarkable gain, from 47,400 followers on October
31 to 318,500 followers on November 14 (a more than sixfold increase).
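
As a quick check of these follower figures:

    # Follower growth ratios over the roughly two-week post-election window.
    twitter_growth = 668_000 / 232_000    # ~2.9x (Twitter, Nov 2 to Nov 15)
    instagram_growth = 318_500 / 47_400   # ~6.7x (Instagram, Oct 31 to Nov 14)
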

Figure 5.7: Newsmax follower growth on Twitter (orange) and Instagram (green).

During the post-election period (November 3 to November 15), Newsmax also
began to promote themselves as a pro-Trump alternative to Fox News, which
was being criticized for, among other things, calling Arizona for Biden. Reflecting
what appears to be a strategy of staking claim to the right-wing and pro-Trump
media market, on November 8 Newsmax bragged that they were the “only major
news network to not call the election.”
Later, Newsmax would be legally pressured to post “clarifications” to many of
the false accusations that they aired.26 But it’s likely that their reputational
boost—in terms of followers on their social media accounts—from posting the
original false claims was not significantly diminished by the later corrections.

Project Veritas
The data show that Project Veritas was a prominent repeat spreader of false
and misleading information about the 2020 election across multiple platforms,
through both the organization’s accounts and the personal accounts of its
founder, James O’Keefe. (Twitter permanently suspended Project Veritas’s of-
ficial account and temporarily locked James O’Keefe’s on February 11, 2021.)
They produced several videos in the form of “investigative reports” that they
hosted on YouTube and their official website. They used their other social media
channels—where they were connected to a network of other large-audience,
blue-check conservative and pro-Trump accounts—to advertise and disseminate
their videos.
As a montage view of their YouTube videos shows, Project Veritas produced
videos that repeatedly challenged the integrity of electoral procedures, elec-
tion and postal service officials, and ultimately the results of the election (see
Figure 5.8 on the facing page).


Figure 5.8: Project Veritas’s YouTube Page containing a number of their investigative reports on
election fraud.

Project Veritas videos maintain a consistent, signature style: they begin with
founder James O’Keefe describing the alleged fraud their video exposes before
moving on to undercover videos or anonymized interviews that are presented as
“proof” of their claims. The videos are highly edited, with often incomplete nar-
ratives. Notably, the subjects of some of the videos Project Veritas released were
found to be unreliable sources—for example, the political operative whistleblow-
ing about alleged ballot-harvesting by the Ilhan Omar campaign later revealed
he was offered a $10,000 bribe to make up the story.27

Though Project Veritas claimed to face deplatforming efforts on Twitter during
the 2020 election cycle (ostensibly for violating Twitter’s civic integrity poli-
cies), they were highly successful at disseminating their content throughout the
2020 election.28 In addition to their engagement on YouTube, the group gained
5.8 million views on videos they uploaded to their Facebook Page in 2020 and
over 12 million views on their Instagram videos. Their success, in part, can be
attributed to James O’Keefe, who uses his personal platform and connections
with other conservative influencers to direct attention to their video content,
hosted across multiple platforms. O’Keefe’s personal Twitter account (@Jame-
sOKeefeIII) appeared in 12 of our election integrity incidents and garnered over
625,000 retweets, primarily for posts promoting Project Veritas’s content.

O’Keefe’s Facebook Pages were often used nearly identically to his Twitter
account, complete with the use of hashtags, short-form statements on particular
incidents, and linked videos, as seen in Figure 5.9 on the next page.


Figure 5.9: Identical posts by James O’Keefe on both Facebook (left) and Twitter (right).

In the lead-up to the election, Project Veritas focused their efforts on sowing
doubt in the integrity of mail-in voting by pushing narratives around ballot
harvesting and what they term “ballot chasing.” They released several videos
on their YouTube channel that claimed various campaigns (of primarily down-
ballot races) were engaging in illegal ballot harvesting and facilitating mail-in
voter fraud, including one accusing Representative Ilhan Omar. Project Veritas
promoted the drop of the video on Twitter prior to releasing it on YouTube
(see Figure 3.14 on page 65 in Chapter 3). Following its release, the video was
linked to by multiple prominent partisan media news sites such as The National
Pulse, whose stories were further amplified by retweets by Donald Trump Jr.
The cross-platform attention drew users to the video on YouTube, resulting in
nearly 1.2 million views. O’Keefe capitalized on the attention garnered by the
video to release multiple subsequent undercover reports on alleged election
fraud. Subsequent videos failed to gain as much traction, but still consistently
garnered at least 100,000 views on YouTube.

After the election, Project Veritas began producing videos of “whistleblowers”
alleging fraudulent behaviors in swing states—this included a video testimonial
from a Pennsylvania postal worker claiming that late ballots were backdated.
O’Keefe tweeted the video (embedded within Twitter, as well as posted to the
YouTube channel) to his one million followers. After the worker recanted his
testimony in an affidavit a few days later, O’Keefe posted a follow-up tweet/video
combination claiming that the whistleblower had been retaliated against by the
USPS.29 Both tweets (see Figure 5.10 below) gained significant traction,
receiving thousands of retweets and likes.

Figure 5.10: Tweets from James O’Keefe, founder of Project Veritas, claiming mail-in voting fraud
in Pennsylvania.

Notably, Twitter did take action on some of the misleading content propagated
by Project Veritas and O’Keefe, occasionally adding labels saying the content was
disputed and eventually suspending the @Project_Veritas account. Yet a lack of
uniformity in policies across platforms, and the group’s significant presence on
multiple social media platforms, mean that most of Project Veritas’s misleading
content remains online in some format.

Dr. Shiva Ayyadurai


Dr. Shiva Ayyadurai played a unique role in promoting electoral misinformation:
his activity began after Election Day and featured almost exclusively content that
misinterpreted and/or misrepresented statistics. He is also an example of over-
lap between producers of coronavirus and electoral misinformation. Ayyadurai’s
platform grew remarkably in 2020, after a video claiming Dr. Anthony Fauci was
part of a Deep State conspiracy to spread coronavirus garnered more than six
million views in a week.30 After the 2020 election, he successfully leveraged
YouTube’s livestreaming feature to produce lengthy videos that proliferated
multiple false narratives alongside dubious statistical “evidence.” His videos
were similarly livestreamed and viewed on Periscope and Facebook.
After a failed primary campaign for the US Senate in September 2020, Ayyadurai
began promoting a conspiracy theory that computer tabulation systems system-
atically switched votes in favor of his opponent. After November 3, he extended
this claim—based on fraught statistical analysis31 —to asserting fraud in the US
presidential election. His argument took several forms, broadly and erroneously
claiming that Trump’s under-performance in areas with more straight-ticket
Republican votes was evidence of a “weighted feature” of tabulation software
favoring Joe Biden. When his arguments were debunked by statisticians, he
shifted his claims or presented a new and equally fraught statistical argument.
His most popular video has gained over 1 million views since he livestreamed it on
November 10. His popularity can, in part, be attributed to sharing of his content
by other misinformation superspreaders, including QAnon-affiliated lawyer
Sidney Powell, who not only tweeted it to her one million-plus followers but
also used Ayyadurai’s arguments as evidence in her so-called “Kraken” lawsuit
attempting to overturn the election results in Georgia, a key swing state (see
Figure 5.11).32

Figure 5.11: Trump legal affiliate Sidney Powell tweets a link to Ayyadurai’s most popular YouTube
video.

5.7 Summary
Our analysis suggests that the primary “influencers” in the online production
and dissemination of false and misleading narratives about the 2020 election
were verified, blue-check accounts belonging to partisan media outlets, social
media influencers, and political figures. Though false narratives occasionally
gained traction on the political left, almost all of the most prominent repeat
spreaders—i.e., the accounts that seeded and disseminated multiple false claims
and narratives—belonged to conservative and/or pro-Trump individuals and
organizations. Members of the Trump campaign, including President Trump
and his adult sons, played a significant role in the spread of these narratives,
which converged around false and misleading claims of voter fraud and sought
to undermine trust in the results of the election. These narratives persisted
throughout our analysis, from August through December, and spread through
and across diverse social media platforms—and through the broader information
ecosystem, including cable news.


Notes

1. (page 182) We continued to track the spread of incidents through December
12, 2020, approximately two weeks after our real-time analysis concluded.

2. (page 182) We ran several collections in parallel, balancing terms across
collections to reduce the impact of rate limits.

3. (page 183) It was possible for one incident to be related to multiple themes
that the EIP defined, which is why these sum to more than 153.

4. (page 185) Nick Corasaniti, “Rudy Giuliani Sued by Dominion Voting Systems
Over False Election Claims,” New York Times, January 25, 2021, https
nytimes com us politics rudy-giuliani-dominion-trump html

5. (page 185) Saranac Hale Spencer, “Overblown Claims of ‘Bad Things’ at Philly
Polls,” FactCheck.org, November 3, 2020, https factcheck org
overblown-claims-of-bad-things-at-philly-polls

6. (page 185) Saranac Hale Spencer, “Pennsylvania Postal Worker Waffles on
Election Fraud Claim,” FactCheck.org, November 12, 2020,
https factcheck org pennsylvania-postal-worker-waffles-on-
election-fraud-claim

7. (page 185) Jonathan Oosting, “Meet Michigan’s ‘dead’ voters. They’re quite
alive despite false fraud claims,” BridgeMichigan, November 10, 2020,
https bridgemi com michigan-government meet-michigans-dead-voters-
theyre-quite-alive-despite-false-fraud-claims

8. (page 185) Samantha Putterman, “Video makes it look like left-leaning groups
plotted post-election coup. That’s not the whole story,” PolitiFact, November
5, 2020, https politifact com article nov video-makes-it-look-left-
leaning-groups-plotted-po


9. (page 185) Bethania Palma, “Viral Video Spreads Unfounded Claims about Rep.
Ilhan Omar and Voter Fraud,” Snopes, updated October 19, 2020,
https snopes com news project-veritas-ilhan-omar

10. (page 184) The network was generated from our larger Twitter data
collection—the 859 million tweets we collected about the election and voting.
This creates a stable network structure onto which we later mapped specific
incidents.

11. (page 184) We used the Louvain method for identifying communities in the
network graph; see Wikipedia, s.v. “Louvain method,” last modified February 9,
2021, 12:45 pm, https en wikipedia org wiki Louvain method

12. (page 186) We used a slightly shorter time window for this part of the
analysis than for calculating the spread of incidents, but due to the high
thresholds for inclusion of nodes and edges, the structure is fairly stable and it
is unlikely that influential nodes would shift from one community to another if
more data were included.

13. (page 186) Ian Kennedy, et al., “Emerging Narratives Around ‘Mail Dumping’
and Election Integrity,” Election Integrity Partnership, September 29, 2020,
https eipartnership net rapid-response mail-dumping

14. (page 187) As we note in the section on Participatory Mis- and Disinformation
in Chapter 4, many of the rank-and-file accounts on the political right viewed
their participation in these false and misleading narratives as helping to expose
wrongdoing, not as spreading misinformation.

15. (page 187) Election Integrity Partnership Team, “Repeat Offenders: Voting
Misinformation on Twitter in the 2020 United States Election,” Election Integrity
Partnership, October 29, 2020,
https eipartnership net rapid-response repeat-offenders

16. (page 188) “Permanent Suspension of @realDonaldTrump,” Twitter blog,
January 8, 2021,
https blog twitter com en us topics company suspension html;
AJ Dellinger, “Twitter suspends ‘Gateway Pundit’ Jim Hoft,” Forbes, February
6, 2021, https forbes com sites ajdellinger twitter-suspends-
gateway-pundit-jim-hoft;
@cjtruth, Twitter profile, https twitter com cjtruth;
Chris Mills Rodrigo, “Twitter permanently suspends Michael Flynn, Sidney Powell
and others,” The Hill, January 8, 2021, https thehill com policy technology
twitter-permanently-suspends-michael-flynn-sidney-powell-and-others

17. (page 190) Holmes Lybrand and Tara Subramaniam, “Fact check: Evidence
undermines Trump campaign’s claims of dead people voting in Georgia,” CNN,
updated November 13, 2020, https cnn com politics trump-
campaign-georgia-dead-voters-fact-check index html
18. (page 191) One limitation of this approach is that it provides a view of YouTube
activity filtered by content shared on Twitter (with at least one retweet). Videos
that were not cross-posted to Twitter are not included.
19. (page 192) Sheera Frenkel, “The Rise and Fall of the ‘Stop the Steal’ Facebook
Group,” New York Times, November 5, 2020, https nytimes com
technology stop-the-steal-facebook-group html
20. (page 194) Davey Alba, “Riled Up: Misinformation Stokes Calls for Violence
on Election Day,” New York Times, updated January 20, 2021,
https nytimes com technology viral-misinformation-violence-
election html
21. (page 194) Miami Police Department (@MiamiPD), “We are aware of the
photograph being circulated of a Miami Police officer wearing a political mask
in uniform. This behavior is unacceptable, a violation of departmental policy,
and is being addressed immediately,” Twitter, October 20, 2020, https twitter
com MiamiPD status
22. (page 194) Rachel Moran, et al., “Left-Leaning Influencers, ‘Mainstream’
Media Play Big Role in Amplifying ‘Army for Trump’ Fears,” Election Integrity
Partnership, October 12, 2020, https eipartnership net rapid-response
army-of-trump
23. (page 195) Twitter, “Permanent Suspension of @realDonaldTrump.”
24. (page 196) Isabella Garcia-Camargo, et al., “Project Veritas #BallotHarvesting
Amplification,” Election Integrity Partnership, September 29, 2020,
https eipartnership net rapid-response project-veritas-ballotharvesting
25. (page 196) Dellinger, “Twitter suspends ‘Gateway Pundit’ Jim Hoft.”
26. (page 200) Jeremy Barr, “Newsmax issues sweeping ‘clarification’ debunk-
ing its own coverage of election misinformation,” Washington Post, Decem-
ber 21, 2020, https washingtonpost com media newsmax-
clarification-smartmatic

27. (page 201) Liban Osman, interview by Tom Lyden, Fox 9 Minneapolis-St.
Paul, on “Subject of Project Veritas voter fraud story says he was offered bribe,”
YouTube, October 6, 2020, https youtube com watch v W FG EE

28. (page 201) Project Veritas (@Project_Veritas), “Twitter is censoring this video
hard,” Twitter, November 12, 2020, 2:11 pm, https twitter com project veritas
status; “Civic integrity policy,” Twitter Help, January
2021, https help twitter com en rules-and-policies election-integrity-policy


29. (page 202) Shawn Boburg and Jacob Bogage, “Postal worker recanted allega-
tions of ballot tampering, officials say,” Washington Post, November 10, 2020,
https washingtonpost com investigations postal-worker-fabricated-ballot-
pennsylvania a c- - eb- - ad b e story html

30. (page 203) Ryan Broderick, “A YouTube Video Accusing Dr. Anthony Fauci
of Being Part of the Deep State Has Been Viewed Over 6 Million Times in a
Week,” BuzzFeed News, April 15, 2020, https buzzfeednews com article
ryanhatesthis youtube-anthony-fauci-deep-state-coronavirus

31. (page 203) Naim Kabir, “Dr. Shiva Ayyadurai and the Danger of Data Charla-
tans,” Medium, https kabir-naim medium com dr-shiva-ayyadurai-the-danger-of-
data-charlatans- f ffe c

32. (page 204) Aaron Keller, “Sidney Powell’s ‘Kraken’ Lawsuit Argues Improba-
bility of ‘High Republican, Low Trump’ Voting Patterns,” Law & Crime, November
30, 2020,
https lawandcrime com -election sidney-powells-kraken-lawsuit-argues-
improbability-of-high-republican-low-trump-voting-patterns


Chapter 6
Policy

6.1 Introduction
Platform policies establish the rules of participation in social media communities.
Recognizing the heightened rhetoric and the use of mis- and disinformation
during the 2020 election, all of the major platforms made significant changes
to election integrity policies, both as the campaigns kicked off and through
the weeks after Election Day—policies that attempted to slow the spread of
specific narratives and tactics that could potentially mislead or deceive the
public, though the efforts were not always successful.
Throughout the election period, a team of EIP analysts evaluated platform
policies within three contexts:

• Actors’ Content and Behavior: The content and behavior that platforms
identify as falling inside or outside their policies.

• Platform Actions: Which moderation strategies are proportionate for
dealing with the actors’ content and behaviors.

• Overall Communication of Platform Policies: How clearly and transparently
policies are communicated to the public.

This chapter begins by briefly reviewing and comparing platform policy iter-
ations before and during the 2020 election. We then describe the primary
platform interventions, their strengths and weaknesses, and how they were
applied to the repeat spreaders in our dataset. From there we discuss misinfor-
mation problems that have no clear-cut policy solutions, and conclude with a
forward-looking assessment of areas for policy improvement.


6.2 Social Media Platform Policy Evolution


Major social media platforms such as Facebook, Twitter, YouTube, Pinterest, and
TikTok introduced changes to their community standards in the months leading
up to the election and in the aftermath. The timeline below shows the four
phases that correspond with larger policy trends across multiple platforms:1

• Phase 0: April 2019–August 2020. Some platforms introduced or updated
their policies on election misinformation. However, the majority of
platforms still had sparse, non-specific, or non-existent policies around
election-related content.

• Phase 1: September 2020. A number of platforms announced the first
updates to election-specific policies: making large additions; adding more
clarity and specificity; or stating clearly that they would label or remove
content that delegitimizes the integrity of the election.

• Phase 2: Early October 2020. A month before the election, platforms
specified the media organizations they would rely on for determining when
races are declared and emphasized removing content that intimidates
voters or incites violence. However, they did not distinguish between
general and specific calls to action.

• Phase 3: Late October 2020. In the days leading up to the election, plat-
forms previewed their Election Day plans. This included providing concrete
examples of what labels on content discussing election results will look
like.

• Phase 4: Early November 2020 (post-election). Platforms released infor-


mation about the content and behavior they saw and their moderation
efforts on and after Election Day; some policies were updated to address
post-election claims of election fraud.

Early in the EIP’s research, we identified specific categories of potential election
misinformation (see Chapter 1) and ranked policy comprehensiveness in each
category.2 Tables 6.1 and 6.2 illustrate the evolution of platforms’ policies: the
first shows coverage in August 2020; the second shows where the policies stood
as of October 28, 2020, right before the election. (Our methodology for platform
evaluations—which focused on formal or publicly stated policies for addressing
election misinformation—can be found in Appendix F.)
There are two key findings from this analysis. First, platforms that already had
election-related policies strengthened them, while platforms that went into the

election without any policies remained without them through the election, with
the exception of Snapchat.3

[Figure 6.1: A timeline of the phases of election policy introduced by the platforms
(Facebook, Pinterest, Nextdoor, Snapchat, TikTok, Twitter, and YouTube) in the
lead-up to and after the 2020 election, spanning April 2019 to December 2020.]

Second, many platform policy updates related to the 2020 election cycle focused
far more on explicit topical content restrictions than on user behavior. After
the discovery of Russian interference in the 2016 election, platforms focused on
behavior, such as coordinated inauthentic behavior, rather than content.4 Even
in 2020, Facebook’s first election policy announcement focused on its efforts
to combat this behavior and “fight foreign interference.”5 Yet much of the
misinformation in the 2020 election was pushed by authentic, domestic actors,
and platforms shifted their focus to address downstream harms related to the
content itself. As a result, most subsequent updates introduced policies related
to specific content categories, such as claims of premature victory or posts that
promote violence at the polls. The iterative nature of platform policies during
the election season also indicates that, even though certain narratives from
previous elections were predicted to reappear in 2020, many platforms did not
proactively adapt their policies to combat them.


             Procedural         Participation                          Delegitimization of
             Interference       Interference       Fraud               Election Results
Facebook     Comprehensive      Comprehensive      Comprehensive       Non-Comprehensive
Twitter      Comprehensive      Comprehensive      Non-Comprehensive   Non-Comprehensive
YouTube      Comprehensive      Non-Comprehensive  Non-Comprehensive   Non-Comprehensive
Pinterest    Non-Comprehensive  Non-Comprehensive  Non-Comprehensive   None
Nextdoor     Comprehensive      Non-Comprehensive  None                None
TikTok       Non-Comprehensive  Non-Comprehensive  Non-Comprehensive   None
Snapchat     *No election-related policies
Parler       *No election-related policies
Gab          *No election-related policies
Discord      *No election-related policies
WhatsApp     *No election-related policies
Telegram     *No election-related policies
Reddit       *No election-related policies
Twitch       *No election-related policies

Table 6.1: The EIP’s evaluation of platform policies as they stood in August 2020. A rating of
“No election-related policies” means the platform has no explicit policy or stance on the issue;
although the platform may have existing policies that address misleading content, we were unable
to evaluate how they might apply in an election-related context. We grouped the 15th platform,
Instagram, with Facebook; however, it is not entirely clear to our team whether every election-related
policy update made by Facebook also applied to Instagram.

6.3 Platform Interventions: Policy Approaches and Application Outcomes
In addition to tracking the evolution of content-based policy changes, the EIP
examined the benefits and drawbacks of the tactics that platforms used to
enforce their new policies: remove, reduce, and inform. These interventions
encompass a spectrum of actions, from removing content and suspending users,
to creating friction, to contextualizing with content labels.
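To make the three tiers concrete, the sketch below expresses them as a single enforcement decision. This is a minimal illustration, not any platform's actual pipeline; the type names, fields, and escalation thresholds are our own assumptions.

```python
# A minimal sketch of the remove / reduce / inform spectrum as an
# enforcement decision. Illustrative only: the field names and the
# escalation thresholds are assumptions, not any platform's real rules.
from dataclasses import dataclass
from enum import Enum
from typing import Optional


class Action(Enum):
    REMOVE = "remove"   # take content down or suspend the account
    REDUCE = "reduce"   # downrank or disable sharing ("friction")
    INFORM = "inform"   # attach a contextual label


@dataclass
class Post:
    incites_violence: bool        # e.g., calls for interference at the polls
    violates_civic_policy: bool   # e.g., delegitimizes election results
    author_prior_strikes: int     # history weighed alongside severity


def decide(post: Post) -> Optional[Action]:
    """Map a policy evaluation to a proportionate intervention tier."""
    if post.incites_violence:
        # Highest-harm content is pruned at the source.
        return Action.REMOVE
    if post.violates_civic_policy:
        # Repeat offenders escalate from labeling toward friction and removal.
        if post.author_prior_strikes >= 3:
            return Action.REMOVE
        return Action.REDUCE if post.author_prior_strikes > 0 else Action.INFORM
    return None  # no violation, no action
```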

Ultimately, we find that platform intervention and users’ responses are part of a
feedback loop: platforms’ observations of actions reveal the need for policies, and
policies impact subsequent actions. From July to November, we watched policy
shape users’ tactics, and users’ tactics impact policy. While this reciprocity can
make it difficult to stop the spread of misinformation, it can also force platforms
to fortify or adapt their policies.


             Procedural         Participation                          Delegitimization of
             Interference       Interference       Fraud               Election Results
Facebook     Comprehensive      Comprehensive      Comprehensive       Comprehensive
Twitter      Comprehensive      Comprehensive      Non-Comprehensive   Comprehensive
YouTube      Comprehensive      Comprehensive      Non-Comprehensive   Non-Comprehensive
Pinterest    Comprehensive      Comprehensive      Comprehensive       Comprehensive
Nextdoor     Non-Comprehensive  Non-Comprehensive  Non-Comprehensive   Non-Comprehensive
TikTok       Comprehensive      Non-Comprehensive  Non-Comprehensive   Non-Comprehensive
Snapchat     Non-Comprehensive  Non-Comprehensive  Non-Comprehensive   Non-Comprehensive
Parler       *No election-related policies
Gab          *No election-related policies
Discord      *No election-related policies
WhatsApp     *No election-related policies
Telegram     *No election-related policies
Reddit       *No election-related policies
Twitch       *No election-related policies

Table 6.2: After multiple iterations of policy updates, the EIP’s final evaluation of platform policies
as of October 28, 2020. Listings in red indicate a change in policy from the start of our monitoring
period. We grouped the 15th platform, Instagram, with Facebook; however, it is not entirely clear
to our team whether every election-related policy update made by Facebook also applied to Instagram.

Platform Moderation Approach: Remove


The most punitive moderation tools at a platform’s disposal are content removal
and account suspensions. “Remove” can be applied to actors for several reasons:
accounts can be suspended for inauthentic identities, coordinated inauthentic
behavior, or repeatedly violating the community guidelines—such as the repeat
spreaders discussed in Chapter 5.
The intention behind this type of moderation is to prune false or misleading
information at its source. It is often used to address content that can have the
greatest real-world harm, and platforms were committed to removing calls for
interference in the election process that may lead to violence. In our dataset
of tickets, incitement to violence had the highest rate of content or account
removal.
Despite seemingly clear policies for penalizing or removing repeat spreaders
and high-profile disinformation actors, platforms shied away from using this
particular intervention. In some cases, this was a result of a variety of
“newsworthiness” exceptions, which allowed some high-profile repeat spreaders,
including politicians, to evade bans.6 Yet many of the repeat spreaders we saw
were not politicians.


Platform Moderation Approach: Reduce


The second moderation intervention is to “reduce” the distribution of policy-
violating content so that fewer users see it—to insert “friction.” This type of
intervention may include methods such as downranking content so that it
appears lower in a user’s feed or preventing sharing capabilities to reduce the
spread of certain content.
Several platforms employed friction leading up to the election. Twitter intro-
duced a series of changes, including turning off the ability to reply, retweet,
or like a tweet that violated the policy.7 Similarly, TikTok redirected search
results and hashtags, such as #RiggedElection and #SharpieGate, that violated
its community guidelines, preventing users from finding others who use the
terms.8 Facebook supplied additional context to content-sharing features by
warning people when they share old content links, a common pattern seen in
misinformation. This Facebook product feature demonstrates how friction can
also go hand in hand with informing users, discussed more below.9
Policies introducing friction can be particularly helpful around networked fram-
ing, where platforms face not one piece of content but rather the conglomeration
of often countless instances of misinformation or hard-to-verify information. If
looked at like a narrative puzzle, individual pieces are less consequential than
the whole image—platforms must have the insight to see the puzzle before
it is formed. By expanding friction policies to address narratives rather than
individual pieces of content, platforms stand a better chance at reducing the
negative impact of networked framing.
Although the EIP does not have insight into how well these friction-inducing
policies reduced the spread of misinformation, Twitter stated that from October
27, 2020, to November 11, 2020, they saw an estimated 29% decrease in quote
tweets of labeled tweets, perhaps due in part to a prompt that warned people
prior to sharing.10
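As a rough illustration of how such friction might be wired in, the toy functions below demote labeled content in a ranking score and strip its one-tap amplification paths. The demotion factor and interaction sets are assumptions for illustration, not drawn from any platform's code.

```python
# A toy model of "friction": labeled posts are downranked and lose
# low-effort resharing paths. The 0.5 demotion factor is an assumption.
def rank_score(base_engagement: float, labeled: bool,
               demotion_factor: float = 0.5) -> float:
    """Apply a multiplicative ranking penalty to labeled posts."""
    return base_engagement * (demotion_factor if labeled else 1.0)


def allowed_interactions(labeled: bool) -> set:
    """Mirror Twitter's disabling of replies, retweets, and likes on
    violating tweets: only quote-with-comment remains available."""
    if labeled:
        return {"quote_with_comment"}
    return {"like", "reply", "retweet", "quote_with_comment"}
```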

Platform Moderation Approach: Inform


Content labels were the most commonly used policy intervention by Facebook
during the 2020 election and were used by Twitter on approximately 300,000
pieces of content.11 Though labels permit policy-violating content to stay on
a platform, they may reduce distribution and alter how users interact with
content.
The EIP observed four distinct issues related to labeling practices during the
2020 election. First, some platforms struggled to apply labels uniformly: content
identical in substance was labeled in some instances but not others. Labels
signal that something may be false or misleading. If some content is unla-


beled, it may give the impression that it might be true—an “implied truth ef-
fect”—unintentionally giving credence to misleading content.12
Lack of uniform labeling leads to another challenge: mislabeling. Some platforms
use automated systems—AI—to detect and label content. However, AI sometimes
fails to distinguish between content that violates policies and content that
does not. For example, Facebook used AI to automatically label most election-
related content with a generic label: “Visit the Voting Information Center for
voting resources and official election updates.” While the AI did label some
content as false, the generic auto-label was applied more frequently. In fact,
content that would more appropriately be labeled as “false” was instead tagged
with the “Voting Information Center” label. The AI’s inability to distinguish
false or misleading content from general election-related commentary may
have diminished the value of Facebook’s labeling policy entirely. In practice,
AI-driven labeling was another weak point in platforms’ policy approach to
identifying misinformation.13
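The failure mode described above can be pictured as a simple thresholding problem. In the hypothetical sketch below (the thresholds and label strings are assumptions), a specific "false" label requires high classifier confidence while a generic label catches anything election-related, so the generic label dominates, just as we observed.

```python
# Hypothetical sketch of threshold-based auto-labeling. Because the
# "false" label needs high confidence and the generic label does not,
# most borderline misinformation receives only the generic label.
from typing import Optional


def choose_label(p_false: float, p_election_related: float) -> Optional[str]:
    """Assumed thresholds: 0.95 for a specific 'false' label, 0.60 for
    the generic voting-information label."""
    if p_false >= 0.95:             # precise but low-recall branch
        return "False information"
    if p_election_related >= 0.60:  # broad catch-all branch
        return ("Visit the Voting Information Center for voting "
                "resources and official election updates.")
    return None
```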
Second, inconsistent label language and placement impedes platforms’ attempts
to reduce the spread of misinformation. Varied language can inspire confusion
and speculation about platforms’ intent, while problematic placement and design
may obscure labels from view.
Inconsistent label language can be particularly problematic, especially against
the backdrop of an ongoing, hyperpartisan battle over content moderation. For
example, in May 2020, Twitter marked a handful of President Trump’s tweets with
a relatively neutral label: “Get the facts about mail-in ballots.” But in October,
when President Trump tweeted similar content, Twitter changed the labels:
in contrast to the previous passive language, Twitter applied a label that read,
“Learn how voting by mail is safe and secure,” complete with an embedded link
to voting resources.14

Figure 6.2: President Trump’s tweets, both violative of Twitter’s civic integrity policy, labeled with
different language.

However, the shift occurred without explanation from Twitter, and repeat


spreaders speculated about Twitter’s purported political agenda in its wake.


While changes in label language are appropriate responses to misinformation,
lack of context and documentation of these changes, or confusing rollouts, may
trigger distrust, leading users and media outlets to speculate about a platform’s
motives rather than consider the veracity of the content.15 Notably, subsequent
updates to Twitter’s label language, such as those responding to official election
results, came with official statements that previewed what these labels would
look like.16
Similarly, label location is a notable design weakness: because location is not
mandated by policy, aesthetics seems to be the primary concern. Thus, some
platforms put labels below the flagged content instead of directly above it.
Because users have varied hardware and personalized software (e.g., text size,
speech-to-text), labels placed on the bottom may appear off-screen—or content
could be screen-captured without its bottom label and shared as if it had not
received a label at all. Further, users may click away from the post before even
seeing a label at the bottom. Although we cannot say with certainty whether
labels effectively deter users’ belief in misinformation, placing labels at the
bottom of misleading posts risks foreclosing any deterrent effect they might
have.
Third, the EIP additionally observed that when platforms were slow to label,
misinformation spread quickly, achieving wide distribution before a platform
took action. Difficulty with fact-checking and verification, among other issues,
often gave repeat spreaders with large followings the space to quickly circulate
false narratives as platforms deliberated the appropriate response. For example,
Twitter permitted a number of Trump’s misinformation-riddled tweets to go
unlabeled for several hours after they appeared on his timeline. Between the
time of posting and the label’s application, Trump’s tweets were retweeted,
quote tweeted, and shared tens of thousands of times.17
Finally, the EIP observed inconsistency of label implementation between plat-
forms, even when they shared similar content-labeling policies. This is one
component of the cross-platform dynamic identified in previous chapters. Ul-
timately, discrepancy in labeling across platforms creates an opportunity for
misinformation to thrive. People use multiple platforms, and they are thus
forced to determine what the presence or absence of a label on one platform
versus another means about the truth of election-related content.

Platform Interventions vs. Repeat Spreaders and Influencers


As Chapters 4 and 5 lay out, the structure of mis- and disinformation includes both
top-down prominent accounts as well as bottom-up participation. In the 2020
election, repeat spreaders played a key role in both elevating crowd-sourced


stories and providing a frame to interpret them. This section highlights how
platform policies set the rules for engagement, and how gaps in policies can be
exploited by repeat spreaders.

Repeat spreaders sometimes face consequences for their violations, such as
content labels or removal, after platforms take the user’s history and the severity
of infringement into account. However, in the dataset of repeat spreaders
introduced in Chapter 5, we saw that very few Twitter accounts were actually
removed—only four, including President Trump’s, as of February 2021—and that
many of these accounts are still active on other platforms. Additionally, the
proliferation of misleading and false narratives suggests that the policy
interventions outlined above were not successful.

Central to this issue is that repeat spreader policies are unclear in two key
ways. First, the majority of platforms do not publicly communicate the number
of offenses a user must commit before the platform will take action on the user’s
entire account (e.g., suspension), not just on their content (e.g., labeling). While
platforms like Facebook have an internal strike system for offenses, at the time
of the election YouTube was the only platform that publicly placed clear limits,
in the form of its three-strike rule.18 This lack of transparency means that we
also do not know what type of action to expect against an account after a
certain number of violations. We do not know, for example, when a suspension
will be temporary versus permanent.

Second, it is also unclear how public interest exemptions may play into repeat
spreader policies. Platforms such as Twitter and Facebook have policies that
exempt certain content posted by elected and government officials from
removal;19 however, we do not know if or when a government official’s account
would be suspended if it repeatedly violates platform policy. For example, Twitter
labeled half of newly elected Representative Marjorie Taylor Greene’s tweets
after the polls closed on Election Day, without moving to suspend her (see
Figure 6.3).20

After the insurrection at the US Capitol on January 6, one of the most prominent
repeat spreaders, President Trump, was suspended from a number of platforms;
Twitter permanently suspended his account on January 8.21 Four days later,
Twitter introduced a detailed strike system specifically for the civic integrity
policy.22 It is unclear whether Twitter has applied this new policy since its
creation, or whether it will expand the strike system to other policy areas, such
as COVID-19 misinformation. However, this new policy reflects a robust
adaptation for responding to repeat spreaders.
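A strike system of the kind YouTube published, and Twitter later adopted for civic integrity, can be stated very compactly. The ledger below is a minimal sketch patterned on a public three-strike rule; the specific escalation steps are assumptions, since most platforms do not publish theirs.

```python
# A minimal strike ledger sketched on the pattern of a public
# three-strike rule. The escalation mapping is an assumption; most
# platforms did not disclose theirs at the time of the election.
from collections import defaultdict

ESCALATION = {1: "warning", 2: "temporary suspension", 3: "permanent ban"}

strikes = defaultdict(int)  # account id -> violation count


def record_violation(account_id: str) -> str:
    """Record one violation and return the resulting enforcement step."""
    strikes[account_id] += 1
    return ESCALATION.get(strikes[account_id], "permanent ban")
```

Publishing even a table this small (how many offenses trigger which action, and for how long) would remove much of the ambiguity described above.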


Figure 6.3: A sample of tweets by Representative Marjorie Taylor Greene on November 4, 2020.
(Note: these are selected tweets, not an image of her timeline. Some of her tweets in this short
time period were not labeled.)

6.4 Mis- and Disinformation Problems Without Clear Policy Solutions
Even with these policies in place and consistently implemented, other obstacles
to preventing and containing the spread of mis- and disinformation remain. As
platforms, researchers, and official policymakers work to protect the integrity
of our elections, it is important to recognize those obstacles for which, at this
moment, there may be no clear policy solution. These include cross-platform
complexities, the use of non-falsifiable content, backlash against platform
interventions (“techlash”), and organized outrage.


Cross-Platform Complexities
Much of what we have discussed up to this point relates to policy challenges
faced by each individual platform. However, as discussed in Chapters 4 and 5,
the platforms, combined, form an information ecosystem through which content
moves; therefore, the cross-platform spread of misinformation cannot be solved
through intervention by one platform alone. Prior to the 2020 election, US
government agencies and several platforms met periodically to communicate
the standards and observations of internal trust and safety teams, which resulted
in a joint statement noting the collaborative work.23 However, while the group
committed to discuss active threats throughout and following the election, it
remained the responsibility of each company to enforce measures to mitigate
misinformation. Ultimately, platforms do not transparently outline nor allow
independent assessment of how they engage in sector-specific, cross-platform
information sharing.
Legal considerations such as user privacy and antitrust law also make this
collaborative environment difficult. Another challenge is that some platforms,
such as Parler and Gab, do not have content moderation policies or even inten-
tions to moderate. Lastly, as legal scholar Evelyn Douek outlines in her work
“The Rise of Content Cartels,” there are drawbacks to private corporations set-
ting the rules of permissible speech across platforms, regardless of how effective
they may be.24

Use of Non-Falsifiable Content


The election information ecosystem was replete with non-falsifiable claims—
such as those from anonymous whistleblowers or a “friend of a friend.” These
claims can be the most difficult to fact-check, and the current policies in place
are insufficient to fully address hard-to-verify content.
Platforms use fact-checking partners to surface and verify false statements,
but unfalsifiable information can easily fall through the cracks. Facebook’s
fact-checking program, for example, identifies and addresses “particularly clear
hoaxes that have no basis in fact”—a relatively strict threshold of falsifiability—
and “is not meant to interfere with individual expression” on the platform.25
The problem lies, however, at the intersection of falsehood and personal
experience, forcing platforms either to over-moderate, at the risk of removing
personal content that is unfalsifiable, or to under-moderate and allow this
potentially misleading material to proliferate. Some platforms, such as TikTok,
are developing mechanisms to limit the distribution of claims that can’t be
verified or for which fact-checking proves inconclusive.26 These mechanisms are important, but
they need to be enforced quickly and at scale. Actors will continue to frame
misinformation as personal and unfalsifiable experiences, some for political


gain, as long as the unverified-content gray area exists in platform policies and
actions.

Techlash Against Policy Interventions


As fact-checking becomes increasingly important to the information ecosystem,
platform interventions have often drawn a “techlash”: accusations of
censorship, mostly from the conservative right.27 For example, after a slew of
Marjorie Taylor Greene’s posts were labeled as disputed and possibly misleading,
as described above, Greene posted a claim that Twitter had “censored” her; she
included a screenshot of the “censored” tweets.28 In some cases, platform fact-
checking labels were weaponized to make the case that platforms allegedly have
political agendas, and thus the fact-checks should be considered untrustworthy
and disregarded. EIP analysts observed that when some accounts were removed,
the account’s followers expressed that the mere fact of its removal was proof
of a greater conspiracy to “cover up the truth.” This appeared to contribute to
meta-misinformation about the intentions of the platforms. Continued lack of
transparency and perceived inconsistencies behind account takedowns may
further entangle platforms with the narratives they hoped to nix.

Figure 6.4: This tweet from Congressman Kevin McCarthy demonstrates the backlash to platform
action against one of President Trump’s tweets (first reported in the Washington Post on June 23,
2020)29

In some respects, these continued claims of platform censorship have fueled
the movement of influencers to smaller, obscure, or specialized platforms like
Parler, where there is less moderation and far fewer fact-checks.


Organized Outrage
Social media plays a critical role in facilitating legitimate protest. However,
features such as Groups, event pages, and hashtags can be used to spread
misinformation and stoke outrage to galvanize offline action. In the 2020 elec-
tion, protesters, motivated by election misinformation and conspiracy theo-
ries, swarmed polling locations and chanted hashtags they read online, such as
#Sharpiegate and #StoptheSteal.
This organized outrage raises the question of how platforms can proactively
identify which hashtags or speech are likely to result in organizing offline action
with the potential for violence. While applying a label can create friction before
content gains enough attention to incite offline action, platforms may struggle
to move beyond the reactive and to have the political and cultural expertise to
quickly and effectively contextualize hashtags, Groups, and event pages.
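One proactive signal platforms could compute is hashtag velocity. The sketch below (both thresholds are assumptions) flags hashtags whose latest hourly volume sharply outpaces their trailing average, so that reviewers with the necessary political and cultural context can triage them before offline mobilization.

```python
# Flag hashtags whose hourly volume is accelerating sharply. Both the
# 3x ratio and the 500-post floor are illustrative assumptions.
def is_accelerating(hourly_counts: list, ratio: float = 3.0,
                    floor: int = 500) -> bool:
    """True if the latest hour clears a volume floor and at least
    triples the trailing average of earlier hours."""
    if len(hourly_counts) < 2:
        return False
    *previous, latest = hourly_counts
    baseline = sum(previous) / len(previous)
    return latest >= floor and baseline > 0 and latest / baseline >= ratio
```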
As the January 6 insurrection at the US Capitol demonstrates, the organizing
leading up to the violent acts took place on multiple platforms. Facebook
Groups provided a unifying structure; comparable features on other large
platforms likewise helped give the outrage shape and form, which persisted
even when a Group was taken down. This event underscores the need for
platforms not only to assess what content is actionable but also to ensure that
their policies are actually implemented.

6.5 Primary Areas for Policy Improvement


In addition to how policies are implemented, platforms’ methods of
communicating those policies and the transparency of their data are critically
important to election-related policy. This section discusses how issues of policy
clarity and transparency at times undermined platforms’ goal of reducing the
spread of mis- and disinformation.

Clarity
It is not enough simply to have a policy and a moderation regime in place; the
community governed by the rules must understand both in order for them to
be most effective. Despite improvements to policy comprehensiveness and
a shift toward some proactive policy implementation ahead of the election,
platforms struggled with straightforward policy language and centralizing all
policy updates. With the exception of a few platforms, such as Twitter and
Pinterest, platforms lacked a centralized location for all of their election-related
policies. Instead, policies were spread across blog posts, excluded from formal
community standards entirely, or disseminated in different sections of platforms’
terms of service. Platforms also failed to announce policy updates uniformly.


Some updates were announced through blog posts, some through the personal
social media accounts of top executives, and some not at all.30
The absence of a central and public mechanism to announce and host policy
changes makes it difficult to track changes over time. Without clear documen-
tation, policy changes run the risk of confusing users as to what is and is not
permissible election-related speech.
The presence of vague and undefined terms in policy language also poses a
clarity problem. For example, in October 2020, TikTok updated its policy to
prohibit any “attempts to intimidate voters or suppress voting.”31 Yet outside of
general incitements to violence, TikTok did not sufficiently define what voter
intimidation or voter suppression looks like on its platform. We recognize an
encouraging trend, however: platforms made progressively more adjustments
to improve clarity (sometimes successfully, sometimes less so) after they first
began updating their policies. Ultimately, a focus on reducing generalized
language and streamlining policy availability is a step in the right direction.

Transparency
Although the EIP could trace content, identify policy shifts, and engage with
stakeholders, we were left trying to answer a particularly important pair of
questions: are the intervention methods effective, and how do platforms
measure that?
While a number of internet platforms adopted election-related content labeling
policies, those labels’ effectiveness in combating false narratives is difficult for
external researchers to assess. As of December 2020,32 most major platforms
had not released data about the volume and consistency of labeled content.
Without information about where labels appeared, who interacted with those
labels, and what those interactions could imply, researchers are left to formulate
a best guess about the effectiveness of platforms’ most substantial intervention
effort. One study asserts that the universality of label application is necessary
to avoid the “implied truth effect”; however, it is impossible to replicate in
the wake of the 2020 election, and restricted access to platform data impedes
any further study. Over the past two years, many platforms have continued
to limit access to and the functionality of their public application interfaces
(APIs),33 and while their large-scale instructed datasets, or adaptive algorithms,
can provide important insights into the online information ecosystem, these
datasets are often compiled behind closed doors. This raises concerns about the
independence, exhaustiveness, and validity of research and monitoring activities
that rely solely on this data.
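The academic-track API mentioned in note 33 illustrates the kind of independent monitoring that broader data access enables. The sketch below counts daily tweet volume for a narrative's hashtag using Twitter's v2 counts endpoint as documented in 2021; the bearer token and query value are placeholders.

```python
# Sketch of independent narrative monitoring via the Twitter v2 API
# (academic research track). Endpoint and parameters follow Twitter's
# 2021 documentation; the token and query are placeholders.
import requests


def narrative_volume(bearer_token: str, query: str = "#SharpieGate") -> list:
    """Return daily tweet counts matching `query` from the full archive."""
    resp = requests.get(
        "https://api.twitter.com/2/tweets/counts/all",
        headers={"Authorization": f"Bearer {bearer_token}"},
        params={"query": query, "granularity": "day"},
    )
    resp.raise_for_status()
    # Each element contains start/end timestamps and a tweet_count.
    return resp.json()["data"]
```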
Increasing transparency in moderation practices will increase public auditability
and the subsequent perceived legitimacy of platform decisions. As the presence
of mis- and disinformation online is not likely to decrease in the coming years,


transparency is a prerequisite for any platform seeking to intervene effectively
against its influence.

6.6 Platform Policy Moving Forward


Policy shapes the propagation of information by impacting what content is
permitted, and to what extent it receives widespread distribution. As we have
discussed, the major social platforms recognized the risks of election misinforma-
tion and adjusted their policies in several key ways to try to prevent misleading
narratives from taking hold, or violence from occurring. They moderated by
removing misleading or false content, reducing its distribution, and informing
and contextualizing content for users. Despite these efforts, accounts with large
and loyal audiences repeatedly took advantage of gaps in platform policy: repeat
spreaders packaged false claims of voter fraud into hard-to-verify narratives
that escaped timely fact-checks, and President Donald Trump himself—covered
under a newsworthiness exemption—was a key player in the incitement that
ultimately led to violence at the Capitol on January 6, 2021.34 The consequences
for repeatedly violating platform policy did not appear to deter these actors, in
part because the consequences themselves were inconsistently applied.
In a remarkable turn of events, Twitter removed the sitting President of the
United States from its platform on January 8. After the insurrection at the
Capitol, platforms suspended President Trump’s account, and thousands of
others, for “risk of future incitement of violence.”35 This action has sparked a
public conversation about policy and power, including a broader discussion of
how to weigh the need to remove accounts spreading misinformation—including,
at times, those of democratically elected politicians—against stifling legitimate
discourse and free expression.
Ultimately, it is impossible to separate the events at the Capitol on January 6 from
the narratives around voter fraud and a rigged election that began much earlier.
As online speech turned into offline action, platform policy was the one line
of defense, outside of the partisan leadership fueling the misinformation, that
could deter this progression. Given the significant decision to suspend a sitting
(albeit outgoing) president’s accounts on Instagram and Facebook indefinitely,
Facebook has referred its action to the Oversight Board.36 The decision will most
likely not only shape future platform policy decisions concerning politicians
in the US but also set a precedent for how to approach the accounts of other
global leaders.
There is no panacea for these policy weaknesses. Content moderation
policies will continue to evolve, as they have after the January 6 insurrection at
the Capitol. The next election will have its own unique set of misinformation
narratives, yet many of the tactics, dynamics, and basic structures of these narra-


tives will likely remain the same. Therefore, platforms must set pre-established,
clear, and transparent rules rather than waiting to react to events as they unfold.
In the next chapter we discuss specific recommendations for policymakers in
light of the narrative, tactical, and policy findings in this report.


Notes

1. (page 212) The platforms evaluated during the EIP’s operation include: Face-
book, Instagram, Twitter, YouTube, Pinterest, Nextdoor, TikTok, Snapchat, Par-
ler, Gab, Discord, WhatsApp, Telegram, Reddit, and Twitch. Twitch was added
to our list during our blog post update in October.
2. (page 212) “Evaluating Election-Related Platform Speech Policies,” Election
Integrity Partnership, October 28, 2020, https://www.eipartnership.net/policy-analysis/platform-policies
3. (page 213) “Community Guidelines,” Snap Inc., accessed February 10, 2021,
https://www.snap.com/en-US/community-guidelines
4. (page 213) Nathaniel Gleicher, “Coordinated Inauthentic Behavior Explained,”
Facebook Newsroom, December 6, 2018, https://about.fb.com/news/.../inside-feed-coordinated-inauthentic-behavior/
5. (page 213) Guy Rosen, et al., “Helping to Protect the 2020 US Elections,”
Facebook News, updated January 27, 2020, https://about.fb.com/news/.../update-on-election-integrity-efforts/
6. (page 215) Platforms such as Facebook, Twitter, YouTube, and TikTok have a
“newsworthiness” policy that allows content otherwise in violation of platforms’
community standards to stay up if it is newsworthy and in the public interest.
On Facebook and Twitter, this exception is limited to posts made by politicians.
On TikTok and YouTube, the scope of this policy is a little more vague, and
generally applies to “educational, documentary, scientific, or artistic content,
satirical content, content in fictional settings, counterspeech, and content in the
public interest that is newsworthy or otherwise enables individual expression
on topics of social importance.” See “Facebook, Elections and Political Speech,”
Facebook News, September 24, 2019, https://about.fb.com/news/.../elections-and-political-speech/;
“About public-interest exceptions on Twitter,” Twitter Help Center, accessed
February 10, 2021, https://help.twitter.com/en/rules-and-policies/public-interest;
and “Community Guidelines,” TikTok, updated December 2020, https://www.tiktok.com/community-guidelines
7. (page 216) Vijaya Gadde and Kayvon Beykpour, “Additional steps we’re taking
ahead of the 2020 US election,” Twitter blog, updated November 2, 2020,
https://blog.twitter.com/en_us/topics/company/2020/2020-election-changes.html
8. (page 216) “Our election integrity efforts,” TikTok Safety Center, accessed
February 10, 2021, https://www.tiktok.com/safety/resources/2020-us-elections;
Sarah Perez, “TikTok takes down some hashtags related to election misinformation,
leaves others,” TechCrunch, November 5, 2020,
https://techcrunch.com/2020/11/05/tiktok-takes-down-some-hashtags-related-to-election-misinformation-leaves-others/
9. (page 216) John Hegeman, “Providing People with Additional Context About
Content They Share,” Facebook Newsroom, June 25, 2020, https://about.fb.com/news/.../more-context-for-news-articles-and-other-content/
10. (page 216) Vijaya Gadde and Kayvon Beykpour, “An update on our work
around the 2020 US election,” Twitter blog, November 12, 2020, https://blog.twitter.com/en_us/topics/company/2020/2020-election-update.html
11. (page 216) “A Look at Facebook and US 2020 Elections,” Facebook, December
2020, https://about.fb.com/wp-content/uploads/.../US-2020-Elections-Report.pdf;
Gadde and Beykpour, “An update on our work around the 2020 US election.”
12. (page 217) Gordon Pennycook, et al., “The Implied Truth Effect: Attaching
Warnings to a Subset of Fake News Headlines Increases Perceived Accuracy
of Headlines Without Warnings,” Management Science 66, no. 11: 4944-4957,
doi.org/10.1287/mnsc.2019.3478
13. (page 217) John Villasenor, “How to deal with AI-enabled disinformation,” AI
in the Age of Cyber-Disorder: Actors, Trends, and Prospects, ed. Fabio Rugge,
ISPI-Brookings Report (Milan: Ledizioni Ledi Publishing, 2020),
https://www.ispionline.it/sites/default/files/pubblicazioni/ispi_report_ai_in_the_age_of_cyber-disorder.pdf
14. (page 217) Donald J. Trump (@realDonaldTrump), “There is NO WAY (ZERO!)
that Mail-In Ballots will be anything less than substantially fraudulent,” Twitter,
May 26, 2020, 5:17 am, https://web.archive.org/web/.../https://twitter.com/realdonaldtrump/status/...;
Donald J. Trump (@realDonaldTrump), “Because of the new and unprecedented
massive amount of unsolicited ballots which will be sent to ‘voters,’” Twitter,
October 2020.


15. (page 218) Twitter Safety (@TwitterSafety), “Our recently updated Civic
Integrity Policy specifically offers guidance on these claims, including that
we will label misleading information or disputed claims that could undermine
faith in the process itself. More in this policy:” Twitter, September 17, 2020,
https://twitter.com/TwitterSafety/status/...
16. (page 218) The Deadline Team, “Twitter Updates Its Warning Labels on
Political Tweets to Reflect Biden Certification,” Yahoo! News, December 20, 2020,
https://news.yahoo.com/twitter-updates-warning-labels-political-....html
17. (page 218) Kate Starbird and Carly Miller, “Examining Twitter’s policy
against election-related misinformation in action,” Election Integrity
Partnership, August 27, 2020, https://www.eipartnership.net/policy-analysis/twitters-policy-election-misinfo-in-action
18. (page 219) “Enforcing Our Community Standards,” Facebook News, August 6,
2018, https://about.fb.com/news/.../enforcing-our-community-standards/;
“Community Guidelines strike basics,” YouTube Help, accessed February 10, 2021,
https://support.google.com/youtube/answer/...?hl=en
19. (page 219) “About public-interest exceptions on Twitter”; “Facebook,
Elections and Political Speech.”
20. (page 219) Brian Fung, “Twitter labels half of congresswoman-elect Marjorie
Taylor Greene’s post-election tweets as misleading,” CNN Business, November
5, 2020, https://www.cnn.com/business/live-news/election-2020-misinformation/...
21. (page 219) “Permanent suspension of @realDonaldTrump,” Twitter, January
8, 2021, https://blog.twitter.com/en_us/topics/company/2020/suspension.html
22. (page 219) “An update following the riots in Washington, DC,” Twitter
Safety, January 12, 2021, https://blog.twitter.com/en_us/topics/company/2021/protecting--the-conversation-following-the-riots-in-washington--.html
23. (page 221) Mike Isaac and Kate Conger, “Google, Facebook and Others
Broaden Group to Secure U.S. Election,” New York Times, updated September
22, 2020, https://www.nytimes.com/.../technology/google-facebook-coalition-us-election.html;
Twitter Public Policy (@Policy), “Today we joined industry peers and US
government partners as we work to counter threats to the online public
conversation ahead of the 2020 US elections,” Twitter, August 12, 2020,
https://twitter.com/Policy/status/...
24. (page 221) Evelyn Douek, “The Rise of Content Cartels,” The Tech Giants,
Monopoly Power, and Public Discourse (Columbia University, Knight First
Amendment Institute: February 11, 2020), https://knightcolumbia.org/content/the-rise-of-content-cartels


25. (page 221) “Fact-Checking on Facebook,” Facebook Business, accessed
February 10, 2021, https://www.facebook.com/business/help/...
26. (page 221) Eric Han, “Supporting our community on Election Day and
Beyond,” TikTok Newsroom, October 28, 2020, https://newsroom.tiktok.com/en-us/supporting-our-community-on-election-day-and-beyond
27. (page 222) Republican Staff of the Committee on the Judiciary, House of
Representatives, Reining in Big Tech’s Censorship of Conservatives, October 6,
2020, https://republicans-judiciary.house.gov/wp-content/uploads/.../Reining-in-Big-Techs-Censorship-of-Conservatives.pdf
28. (page 222) Marjorie Taylor Greene (@mtgreenee), “Twitter censored me
ALL DAY yesterday. Sign your official STOP THE STEAL PETITION ->,” Twitter,
November 5, 2020, 5:21 am, https://web.archive.org/web/.../https://twitter.com/mtgreenee/status/...
29. (page 222) Kevin McCarthy (@GOPLeader), “The President tweets that people
should stop breaking the law and Twitter moves to censor him,” Twitter, June 23,
2020, 12:35 pm, https://twitter.com/GOPLeader/status/...;
Rachel Lerman, “Twitter slaps another warning label on Trump tweet about
force,” Washington Post, June 23, 2020,
https://www.washingtonpost.com/technology/2020/06/23/twitter-slaps-another-warning-label-trump-tweet-about-force/
30. (page 224) Mark Zuckerberg, “The US elections are just two months away,
and with Covid-19 affecting communities across the country, I’m concerned
about the challenges people could face when voting,” Facebook, September 3,
2020, https://www.facebook.com/zuck/posts/...; Vijaya Gadde
(@vijaya), “Today, we’re announcing additional, significant product and
enforcement updates that will increase context and encourage more thoughtful
consideration before Tweets are amplified,” Twitter, October 9, 2020, 9:35 am,
https://twitter.com/vijaya/status/...; Susan Wojcicki (@SusanWojcicki),
“It’s almost Election Day in the U.S. Here’s an update on how we’ve worked
over the years to make YouTube a more reliable source for election-related
news and information and how we’re applying this framework on November
3rd,” Twitter, October 27, 2020, https://twitter.com/SusanWojcicki/status/...

31. (page 224) Eric Han, “Supporting our community on Election Day and
Beyond.”
32. (page 224) On November 12, 2020, Twitter released statistics about the
number of election-related posts it labeled or flagged during the election period.
However, it did not provide a searchable database of labeled tweets. Facebook
revealed some information about user “clicks” on its labels, but this means little
in the absence of publicly available “click” data and reveals nothing about users’
cognitive responses to the messages those labels signal or confer. See Gadde
and Beykpour, “An update on our work around the 2020 US election.”
33. (page 224) In a move away from limiting API access, on January 26, 2021,
Twitter announced an improved API with advanced capabilities for the academic
research community. See Adam Tornes and Leanne Trujillo, “Enabling
the future of academic research with the Twitter API,” Twitter blog, January
26, 2021, https://blog.twitter.com/developer/en_us/topics/tools/2021/enabling-the-future-of-academic-research-with-the-twitter-api.html
34. (page 225) Dan Cooney, “Read Democrats’ full impeachment
brief against Trump for second Senate trial,” PBS NewsHour, February 22,
https://www.pbs.org/newshour/politics/read-full-democrats-case-against-trump-during-second-impeachment-trial
35. (page 225) Twitter Inc., “Permanent suspension of @realDonaldTrump.”
36. (page 225) Nick Clegg, “Referring Former President Trump’s Suspension
From Facebook to the Oversight Board,” Facebook News, January 21, 2021,
https://about.fb.com/news/2021/01/referring-trump-suspension-to-oversight-board/


Chapter 7
Responses, Mitigations and
Future Work

7.1 Introduction
The Election Integrity Partnership was born out of a collective challenge. The re-
sponsibility of mitigating election-related mis- and disinformation is shared, and
thus the observations and recommendations in this chapter span government,
media, social media platforms, and civil society, and the organizing functions
among them.
There isn’t any single catch-all policy that will rid elections—much less
democracy—of false or misleading information. However, institutions and in-
dividuals responsible for election processes, or for portions of the information
ecosystem, can each adopt policies (some modest, some transformative) to
build more resilience to misinformation.
Doing nothing is not an option. A government by and for the people depends on
the people coming together around trustworthy information in order to make
informed decisions—including around electing leaders. There is no doubt about
the causal role that mis- and disinformation about the 2020 US elections played
in the violent insurrection at the United States Capitol on January 6, 2021. Not pursuing
structural policy change will accelerate our country’s slide toward extremism,
erode our shared national and inclusive identity, and propel yet more individuals
toward radicalization via mis- and disinformation. The problem is larger than
elections: it spans politics, self-governance, and critical policy areas, including
public health.
In many ways, the Election Integrity Partnership was inspired by past recom-
mendations for addressing election-related vulnerabilities. For example, the


Senate Select Committee on Intelligence’s second of a five-volume report on


foreign-based disinformation, published in 2019, included a bipartisan recom-
mendation:

The Committee recommends that social media companies work to
facilitate greater information sharing between the public and private
sector, and among the social companies themselves about malicious
activity and platform vulnerabilities that are exploited to spread
disinformation. Formalized mechanisms for collaboration that
facilitate content sharing among the social media platforms in order
to defend against foreign disinformation, as occurred with violent
extremist content online, should be fostered. As researchers
have concluded: “Many disinformation campaigns and cyber threats
do not just manipulate one platform; the information moves across
various platforms or a cyber-attack threatens multiple companies’
network security and data integrity. There must be greater cooperation
within the tech sector and between the tech sector and other
stakeholders to address these issues.”1 (Emphasis added.)

The Election Integrity Partnership was designed to do just that: formalize collab-
oration among organizations to protect against misinformation. The recommen-
dations in this chapter are tailored to the Election Integrity Partnership’s scope,
specifically, identifying and mitigating misinformation related to US elections.
However, many of them have broader potential in building toward a normative
approach for elections, social media, and information access in free and open
societies.

7.2 Government
While the responsibility for accurate information is spread across society, the
responsibility for protecting elections is singularly that of the government. This
set of broad recommendations spans the complex system of state and local
election administration feeding into the federal system, and focuses on the dual
responsibilities of facilitating elections and providing information about them.

The Executive Branch


• Strengthen interagency coordination by elevating election security as a
national security priority and reaffirming the critical infrastructure desig-
nation for election systems, allowing the Cybersecurity and Infrastructure
Security Agency (CISA) to further prioritize resources and support to state
and local officials.


• Solidify clear interagency leadership roles and responsibilities. CISA should
remain the lead on domestic vulnerabilities and coordination with state and
local election officials; the Office of the Director of National Intelligence
should coordinate intelligence assessments and lead the Intelligence
Community on foreign-based threats; the Department of Justice and Federal
Bureau of Investigation should maintain investigation and law enforcement
leadership for domestic and international threats. The Election Assistance
Commission should remain in an amplifying role, pushing best practices
and critical information out broadly to the election community.

• Create standards and mechanisms for consistent disclosures of mis- and
disinformation from foreign and domestic sources, including via CISA’s
Rumor Control and joint interagency statements related to foreign-based
threats.2

• Maintain a threat assessment of the current election mis- and disinformation
state of play, informed by collaboration with social media platforms.
Update this assessment during federal election cycles and release it to
election officials, social media platforms, civil society, and members of the
media.

Congress

• Election security should be prioritized over politics. Make best efforts
to separate the substantive and critical issue of election security from
the electoral politics that every member of Congress is engaged in during
each election. For example, Congress should authorize all non-emergency
election-related bills one year prior to the next regular election.

• Pass existing bipartisan proposals with increased appropriations marked
for federal and state election security, specifically resources for federal
agencies directly engaged in election security and more broadly toward
providing coordinated election security assistance and support to state
and local officials (see next section).

• Codify the Senate Select Committee on Intelligence’s bipartisan
recommendations on depolarization and public official conduct, as noted in
Volumes 3 and 5 of the Committee’s exhaustive report on foreign influence
in the 2016 election.3

• Strengthen digital expertise at federal regulators with election-related
jurisdictions, including the Federal Election Commission and Federal
Communications Commission, to improve enforcement of existing regulations.


State and Local Officials

Prepare a tiered communications plan that includes:

• A start-to-finish story for each voter’s ballot. This should include
information about how to register to vote; ensuring one’s registration is up to
date; where, when, and how to vote; and how votes will be counted and
reported, including the timing of that process.

• Processes for reporting misinformation to social media platforms and
government partners.

Establish trusted channels of communication with voters. This should
include a .gov website and use of both traditional media and social media.
This effort should include:

• A single authoritative source (webpage or social media account) for election
information. That source’s information should be specific to each election
and regularly updated; it should also provide data and evidence regarding
the security and integrity of the election.

• Ensure that all votes cast are on auditable paper records. Post-election
audits should be conducted after each election.

7.3 Media
Traditional media remains the primary means of information distribution in the
United States. As such, newsrooms have an obligation, rooted in traditional
journalism ethics and practices, to accurately and ethically cover election topics,
including election misinformation. This task has been complicated by a loss of
journalism revenue to social media companies and growing competition with
hyperpartisan news sources for reader attention. The following recommen-
dations are for journalists and media professionals covering election-related
misinformation.

Newsrooms

• Prepare journalists to encounter mis- and disinformation. This training
should include accepted definitions, attribution standards, how to avoid
inadvertent amplification, and more.

• Coordinate reporting across beats in the newsroom. Election reporting
relies on a combination of campaign embeds, White House and congressional
reporters, national security reporters, technology reporters, and others.
Organizations should handle misinformation uniformly and professionally.

• Anticipate misinformation (“threatcasting”) and establish guidelines for
combating it (for example, the Washington Post’s guidance on hacked
material or BuzzFeed’s guidance on QAnon descriptions).4

• Formulate proactive communications for instances when genuine reporting
is labeled “fake news” or disinformation. Newsrooms should address
the issue but not accept the premise of the charge.

• For written media, avoid headlines that mischaracterize or hyperbolize
reporting, especially in breaking news events like elections. Include the
fact-check in the headline when possible, e.g., “Trump Falsely Declared
Victory.”

News Studies and Research

• Develop a wider vocabulary for differentiating between traditional news media and hyperpartisan or unreliable news. A new lexicon could help social media sites better label information.

• Develop case studies on misinformation coverage (good and bad) of the 2020 election to educate and inform current and upcoming journalists.

7.4 Social Media Platforms and Technology Companies
In their relatively brief existence, social media platforms have become a critical part of the democratic process, facilitating political organizing, citizen engagement, campaign communications, and overall information access. Mitigating election-related misinformation in this space is particularly challenging given the distributed nature of the social media ecosystem (anyone with internet access can consume content and post their own) and the speed with which unverified or unverifiable information can spread. As it stands, there is a high degree of variance in how social media platforms address misinformation, the resources they devote to combating it, and their technical policy options. Platforms will not be able to root out election-related misinformation entirely, but the following policy recommendations can help. These recommendations are lengthier and more specific than those in the preceding sections because this area currently has the fewest established normative practices.


Accessibility

• Tell users about a platform’s misinformation policies. In addition to the policies themselves, platforms should provide both rationales and case studies. Policies specific to an event or topic (e.g., elections, COVID-19) should be grouped in one location.

• Provide proactive information regarding anticipated election misinformation. For example, if researchers expect a narrative will emerge, platforms should explain that narrative’s history or provide fact-checks or context related to its prior iterations.

Transparency

• Share platform research on misinformation counter-measures with academics, civil society, and the public. Where counter-measures have been effective, reveal that; where they have fallen short, reveal that as well. If efficacy is unknown, take steps to determine it.

• Enable access for external researchers to removed or labeled content, including exhaustive and rapid search capabilities.

• Partner with civil society organizations. Listen to their suggestions and support them when possible.

• Provide greater transparency about why something is removed or censored. Sharing the evidence that supports a takedown would help researchers as well as the public.

Cross-Platform Communications

• Support independent coalitions that track misinformation across platforms. These coalitions can focus on specific topics (such as vaccine disinformation) or regions and can coordinate with government officials and civil society to respond to growing narratives.

Policy on Repeat Spreaders

• Establish clear consequences for accounts that repeatedly violate platform policies. These accounts could be placed on explicit probationary status or subjected to a mixture of monitoring and sanctions.


• Publicize the different thresholds of policy offenses. For example, YouTube and Twitter use a strike system. Any such system should transparently represent to users their current status and should describe which offenses count as strikes, affect monetization, or lead to suspension (a minimal sketch of such a status record follows this list).
• Prioritize quicker action on verified or influential accounts if they have violated platform policies in the past.
• Consider implementing holding areas for content from high-visibility repeat spreaders, where content can be evaluated against policy before posting.
• Reevaluate policies related to blue-check influencers with significant reach, particularly on issues such as incitement to violence. These accounts should arguably be held to stated policies more stringently than the average user, rather than receiving repeated exemptions, because of the amount of attention they command and the action they potentially drive.
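
As a minimal sketch of the kind of transparent strike-status record recommended above (illustrative only: the class names, threshold, and consequence strings are our assumptions, not any platform’s actual schema):

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class Strike:
    issued_on: date
    policy: str       # which published policy the content violated
    consequence: str  # e.g., "warning", "demonetization", "7-day suspension"

@dataclass
class AccountStanding:
    active_strikes: list[Strike] = field(default_factory=list)
    suspension_threshold: int = 3  # assumed threshold, for illustration only

    def summary(self) -> str:
        """Plain-language status a platform could surface to the user."""
        n = len(self.active_strikes)
        remaining = max(self.suspension_threshold - n, 0)
        details = "; ".join(
            f"{s.issued_on}: {s.policy} -> {s.consequence}"
            for s in self.active_strikes
        )
        return f"{n} active strike(s); {remaining} more before suspension. {details}"
```

Surfacing a record like this at every enforcement action, rather than only at suspension, is one way to meet the transparency bar described above.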

Policy Enforcement
• Ensure platform labels are consistently applied to all product features,
including ephemeral content such as stories or livestreams.
• Labels should make clear which policy the content violates.
• Partner with civil society organizations to localize fact-checks and labels, especially in non-English languages or niche communities.
• Apply an interim label to content that is in the queue for fact-checkers,
or is tied to an emerging event, noting that it should be approached with
caution. For content that recurs, a label can link to a page that discusses
previous variations of the claim.
• Anticipate misinformation where possible, particularly surrounding pivotal
events such as elections. Revisit applicable policies in advance.

Election-Specific Policies
• Specify election-specific policies’ duration and geographic jurisdiction.
• For US elections, anticipate state-level premature claims of victory.
• Prioritize election officials’ efforts to educate voters within their jurisdic-
tion and respond to misinformation. This could include the promotion of
content from election officials through curation or advertisement credits,
especially in the lead-up to Election Day.


7.5 Civil Society


Civil society in the United States plays an essential role in the process and functions of elections, as well as in holding accountable the institutions directly responsible for the stewardship of American democracy and the information environment that facilitates it. Civil society includes a wide range of actors from academia, public interest groups, community leaders, faith-based groups, and other non-governmental organizations. Most notably, civil society has led the way in building understanding of, and best practices for addressing, election-related misinformation, and it can continue to play a leading role in building long-term resilience to it.

Overarching
• Disclose methodology and standards for technical research. Incomplete,
misleading, or false findings, even when well intentioned, often exacerbate
the problem, especially in fast-moving information environments around
elections.

• Similar to the recommendation made to media organizations, increase awareness about misinformation and coordinate among civil society groups with varied expertise on elections.

• Where misinformation is pervasive and touches on many topics, clearly communicate the scope of engagement on the issue. As an example, the Election Integrity Partnership’s scope was narrowly focused on misinformation related to the process and results of the 2020 US elections, as opposed to false information in American political discourse more broadly.

7.6 Conclusion
The 2020 election demonstrated that actors—both foreign and domestic—remain
committed to weaponizing viral false and misleading narratives to undermine
confidence in the US electoral system and Americans’ faith in our democracy.
Mis- and disinformation warped the country’s public discourse both before
and after Election Day, spreading through online communities across all social
platforms. Influencers and hyperpartisan media cultivated loyal, polarized audi-
ences, forming echo chambers where narratives of massive fraud and a stolen
election strengthened at each retelling. These narratives have consequences.
On January 6, 2021, President Trump’s supporters stormed the Capitol in an
attempt to prevent the finalization of the Electoral College results and the peace-
ful transition of power. A small group of radicalized citizens had been repeatedly told that the election’s results were fraudulent; they mobilized against their own
democracy while claiming to protect it. A larger group watched those events
and cheered; others concluded, despite MAGA hats and Trump flags, that the
insurrection was the work of their political opponents.
State and local election officials throughout the country and across the polit-
ical spectrum worked hard to counter malign narratives. Tragedies such as
the January 6 insurrection suggest that, despite their best efforts, democratic
processes remain vulnerable. The events, narratives, and dynamics documented
in this report underscore the need for a collective response to the false and
misleading narratives that precipitated the attack.
The EIP was formed out of this conviction—that the challenge of misinforma-
tion is dynamic, networked, and resilient—and that to address it, we need to
act quickly and collectively. While the Partnership was intended to meet an
immediate need, the conditions that necessitated its creation have not abated,
and in fact may have gotten worse. Academia, platforms, civil society, and all
levels of government must be committed, in their own ways, to truth in the
service of a free and open society. All stakeholders should focus on predicting
and pre-bunking false narratives, detecting mis- and disinformation as it occurs,
and countering it when appropriate.
The EIP’s collaborative model was tailored toward a specific event—Election
2020—and designed specifically to aid election officials, election security stake-
holders, and civil society, but we believe the model could have further utility. As
our report reiterates, there are structural dynamics and policy frameworks in the
online information ecosystem that have long lent themselves to the viral spread
of false and misleading information and to the facilitation of polarized communi-
ties; addressing specific content is, in many ways, secondary to addressing these
infrastructure challenges. In the meantime, false and misleading narratives
proliferate about a wide variety of societally impactful topics. Shifting focus to other topics may require modifying the Partnership’s operation, such as reallocating analytical resources and adjusting research cadence; however, the EIP’s novel structure, which enabled rapid-response analysis and a multi-stakeholder reporting infrastructure, could prove effective in many information spaces blighted by pervasive misinformation.
In the end, we hope this report’s enduring value lies not just in its exposition of
this election story, but in its illumination of this overarching story—of declining
trust, weakened gatekeepers, social polarization, and the protean challenge of
viral misinformation amidst a skeptical and networked public. Given the enor-
mity of the challenge, we recognize the need for a whole-of-society response.
The EIP, in its structure and its operations, offered a first measure in service of
that call: it united government, academia, civil society, and industry, analyzing
across platforms, to address misinformation in real time. The lessons from EIP should be both learned and applied. The fight against misinformation is only
beginning. The collective effort must continue.


Notes

1. (page 234) Select Committee on Intelligence, Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, S. Rep. No. 116-XX, volume 2, at 78 (2019), https://www.intelligence.senate.gov/sites/default/files/documents/Report_Volume2.pdf
2. (page 235) “#Protect2020 Rumor vs. Reality,” Cybersecurity and Infrastructure Security Agency, accessed December 10, 2020, https://www.cisa.gov/rumorcontrol
3. (page 235) Select Committee on Intelligence, Russian Active Measures Campaigns and Interference in the 2016 U.S. Election, S. Rep. No. 116-XX, volumes 3 and 5 (2019), https://www.intelligence.senate.gov/publications/report-select-committee-intelligence-united-states-senate-russian-active-measures
4. (page 237) Joe Pompeo, “‘Connect the Dots’: Marty Baron Warns Washington Post Staff About Covering Hacked Materials,” Vanity Fair, September 23, 2020, https://www.vanityfair.com/news/2020/09/marty-baron-warns-wapo-staff-about-covering-hacked-materials; Drusilla Moorhouse and Emerson Malone, “Here’s Why BuzzFeed News is Calling QAnon a ‘Collective Delusion’ From Now On,” BuzzFeed News, September 4, 2020, https://www.buzzfeednews.com/article/drumoorhouse/qanon-mass-collective-delusion-buzzfeed-news-copy-desk


Appendix A
Definitions

Misinformation is information that is false, but not necessarily intentionally false.1 Misinformation is at times used as an umbrella category for false rumors, disinformation, and other types of false and misleading information.
Disinformation is false or misleading information that is purposefully seeded
and/or spread for an objective—e.g., a political or financial objective.2 Disinfor-
mation may mislead through its content, or may work by deceiving its audiences
about its origins, purpose, or the identity of those who produced it. It is often
built around a true or plausible core, layering factual information with small
falsehoods or exaggerations (see Bittman, 1985).3 It also typically functions as a
campaign—a set of information actions, rather than a single piece of content.
The key difference between disinformation and other forms of misinformation
is intent, in that disinformation is intentionally produced and/or spread. Often
as a disinformation campaign progresses, it incorporates unwitting participants
in its production and spread; therefore, not every entity that spreads disinfor-
mation does so with intent to deceive or knowledge that they are spreading
false or misleading content.4
1. Caroline Jack, “Lexicon of lies: Terms for problematic information,” Data & Society Research Institute (2017): 3, 22, https://datasociety.net/pubs/oh/DataAndSociety_LexiconofLies.pdf.
2. Jack, “Lexicon of lies: Terms for problematic information”; Kate Starbird, Ahmer Arif, and Tom Wilson, “Disinformation as collaborative work: Surfacing the participatory nature of strategic information operations,” Proceedings of the ACM on Human-Computer Interaction 3, issue CSCW (November 2019): 1-26, https://doi.org/10.1145/3359229.
3. Ladislav Bittman, The KGB and Soviet Disinformation: An Insider’s View (Washington: Pergamon-Brassey’s, 1985).
4. Bittman, The KGB and Soviet Disinformation: An Insider’s View; Kate Starbird, et al., “Disinformation as collaborative work.”

Voter Fraud is the act of fraudulently voting. It includes voting on behalf of someone else, voting when someone is ineligible, voting multiple times, etc. The term is often used—including within examples in this report—as a catchall for other types of election fraud. Research shows that voter fraud is extremely rare in the United States.5
Election Fraud suggests a more systematic effort to change the results of an
election. It includes orchestrating voter fraud at scale, illegally registering or
illegally assisting large numbers of voters, altering vote counts through auto-
matic or manual means, systematically removing or inserting large numbers of
ballots to affect an election outcome, etc.
Electoral Fraud is a broad term denoting “illegal interference in the process
of voting.”6 Electoral fraud includes ballot stuffing, voter impersonation, vote
buying, voter suppression, fraud by election officials, and various other mech-
anisms of illegally impacting an election. Like “election fraud,” electoral fraud
suggests efforts at a scale that could impact election results.
Voter Suppression is the process of systematically reducing the ability of a spe-
cific group of people to vote. It can work through efforts to make it physically
harder to vote (fewer locations, limited time windows), through legal efforts that
disenfranchise specific groups (e.g., former felons) and through other mecha-
nisms, including intimidation. In the United States, voter suppression efforts
often target Black Americans and other people of color.7
Tickets were internal reports within the EIP system. They were submitted via
“tips” from external partners in the government and civil society, or created
through the EIP internal monitoring process. Once a ticket was submitted,
our Tier 1 analysts would go through a systematic process to document the
claim, determine if it was “in scope,” get a sense of where it was spreading, and
attempt to assess the veracity of the underlying claims by locating an external
fact-check from election officials, fact-checking organizations, local media,
or mainstream outlets. For high priority, in-scope tickets, Tier 2 researchers
conducted additional analysis, which included determining the origins of a
piece of information, tracking its spread over time, and identifying additional
fact-checks as they became available.
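
To make the workflow concrete, here is a minimal sketch of a ticket record and the Tier 1 hand-off (illustrative only: the field names, statuses, and helper are our own shorthand for the process described above, not the EIP’s actual ticketing schema):

```python
from dataclasses import dataclass, field

@dataclass
class Ticket:
    claim: str                  # brief description of the claim
    source: str                 # "external tip" or "internal monitoring"
    in_scope: bool = False
    platforms: list[str] = field(default_factory=list)    # where it is spreading
    fact_checks: list[str] = field(default_factory=list)  # external fact-check URLs
    tier: int = 1               # escalated to 2 for high-priority, in-scope tickets

def tier1_triage(ticket: Ticket, high_priority: bool) -> Ticket:
    """Mirror the Tier 1 steps: document the claim, determine scope,
    record where it is spreading, and attach any external fact-checks."""
    if ticket.in_scope and high_priority:
        ticket.tier = 2  # hand off for origin tracing and spread-over-time analysis
    return ticket
```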
A majority of tickets focused on false and/or misleading claims that functioned
to diminish trust in election results. These included:

• False claims and unsubstantiated conspiracy theories (e.g., that voting software switches votes without a trace).
• Factually valid claims taken out of context and framed in misleading ways to suggest massive voter fraud (e.g., that a large number of ballots had been found in a trash can, when in actuality the ballots were from 2018).

• Content that amplified and exaggerated small issues (e.g., ballots stolen from a mailbox, discarded mail that contained a small number of ballots, issues with individual voting machines) to support the broader (false) narrative that results could not be trusted.

5. “Debunking the Voter Fraud Myth,” Brennan Center for Justice, January 31, 2017, https://www.brennancenter.org/our-work/research-reports/debunking-voter-fraud-myth.
6. Ballotpedia, s.v. “Electoral Fraud,” accessed February 10, 2021, https://ballotpedia.org/Electoral_fraud.
7. ACLU, “Block the Vote: Voter Suppression in 2020,” February 3, 2020, https://www.aclu.org/news/civil-liberties/block-the-vote-voter-suppression-in-2020.

Events are salient occurrences in our physical and/or social worlds. Events are
typically bounded in time. We use this term to distinguish between the actual
event (e.g., Sharpie pens bleeding through ballots) and the information incidents
that feature elements of those events—though they may take shape and spread
at different times.
(Information) Incidents are distinct information cascades that pertain to a
specific event or set of events. We use the term incidents to differentiate
between the original event and the subsequent discussion or discussions of
that event. Incidents often map to one or more narratives, where the details of
an event are mobilized to create or support a specific interpretation—or story
about the meaning—of that event.
Narratives are stories that connect a series of related events or experiences.
Like any good story, narratives typically have characters, scenes, times, and
themes. They provide compelling interpretations that can help people make
sense of events and experiences.
Frames are mental schema that shape how people interpret events. Frames
select and make salient some aspects of a situation—and obscure others. Robert
Entman enumerates four functions of frames: defining a problem, diagnosing a
cause, making a moral judgement, and suggesting remedies.8 Framing is the act
of creating, refining, or challenging a frame. Framing can be used as a strategy
to shape how others interpret a situation.
The “Big Lie”: Over the course of this project, a majority of the tickets we filed and incidents we analyzed were related to a false metanarrative of massive voter fraud (i.e., election fraud). This false metanarrative was introduced prior to our project’s launch and continues to this day. It was present in President Trump’s summer 2020 tweets claiming that the election would be “rigged” against him and in his January 6, 2021, tweets claiming that the election had been stolen from him. It took shape through a variety of false, misleading, and exaggerated claims that functioned generally to sow distrust in the results—and specifically to support the allegation of massive voter fraud functioning to “steal” the election from candidate Trump. Looking across the breadth of the online activity to seed and spread these narratives, our research (and that of others; see Benkler et al.’s 2020 paper9) has interpreted the “Big Lie” to be a participatory disinformation campaign that incorporated the efforts of President Trump, his family and close supporters, members of right-wing media, social media influencers, and his followers (many of them unwitting participants in this campaign).

8. Robert M. Entman, “Framing: Toward Clarification of a Fractured Paradigm,” Journal of Communication 43, no. 4 (December 1993): 51-58, https://doi.org/10.1111/j.1460-2466.1993.tb01304.x.
9. Yochai Benkler et al., “Mail-in Voter Fraud: Anatomy of a Disinformation Campaign,” Berkman Center Research Publication No. 2020-6, Berkman Klein Center, October 2, 2020.


Appendix B
Inter-coder reliability

B.1 Average Z-scores

Survey Question (in descending order by z-score) | Z Score
Other Facets: was there anything else notable about this ticket not already covered above? | 1.013260305
Why was this ticket created? | 0.5973125198
Was there a partisan focus on this ticket? | 0.04870033315
Process-based tags: what part of the electoral process is this ticket about? | 0.04149633474
Specific Claims or Election-related narratives: is there a specific, recognizable claim that was used in this incident? | 0.03772808942
What are the top-level buckets of this incident? Check all that apply. | -0.08453405504
What tactics were used to spread this content? | -0.203993765
What is the estimated number of engagements (cumulative social media shares, retweets, likes, reactions, comments) associated with the ticket? | -0.2505426778
Character-based Tags: who or what is being implicated in this incident? | -0.2664116253
Is this a particularly important ticket that should be included in the final report? | -0.7143693447

Table B.1: The average z-scores for each survey question
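
As a minimal illustration of how per-question z-scores like those above can be computed (a sketch under assumed inputs, not the EIP’s actual reliability pipeline; the agreement metric and names are our own):

```python
import statistics

def average_z_scores(agreement: dict[str, list[float]]) -> dict[str, float]:
    """Standardize per-question coder agreement across questions.

    `agreement` maps each survey question to per-ticket agreement rates
    (e.g., the fraction of coder pairs choosing the same answer). Each
    question's mean agreement is expressed as a z-score relative to the
    mean and standard deviation across all questions, so positive values
    indicate above-average agreement and negative values indicate discord.
    """
    means = {q: statistics.mean(rates) for q, rates in agreement.items()}
    grand_mean = statistics.mean(means.values())
    grand_sd = statistics.stdev(means.values())
    return {q: (m - grand_mean) / grand_sd for q, m in means.items()}

# Toy example (hypothetical numbers, not EIP data):
scores = average_z_scores({
    "Why was this ticket created?": [0.90, 0.80, 0.85],
    "What tactics were used to spread this content?": [0.60, 0.55, 0.70],
    "Was there a partisan focus on this ticket?": [0.75, 0.80, 0.70],
})
```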


B.2 Discordant Z-scores


Survey Question | Choice | Z Score
What tactics were used to spread this content? | This content exaggerates the impact of an issue within the election-process | -2.868919023
Specific Claims or Election-related narratives: is there a specific, recognizable claim that was used in this incident? | None of the above | -2.41007974
Character-based Tags: who or what is being implicated in this incident? | Government Entities | -1.971189991
What are the top-level buckets of this incident? Check all that apply. | Fraud | -1.951240456
Character-based Tags: who or what is being implicated in this incident? | Political affinity group | -1.911341388

Table B.2: From the above questions, choices that experienced the most discord among coders

B.3 Concordant Z-scores


Survey Question | Choice | Z Score
Other Facets: was there anything else notable about this ticket not already covered above? | Foreign interference (Unfounded) | 1.240684993
Other Facets: was there anything else notable about this ticket not already covered above? | Foreign interference (Confirmed) | 1.220735459
Other Facets: was there anything else notable about this ticket not already covered above? | COVID related | 1.200785925
What tactics were used to spread this content? | Use of phishing emails or tests | 1.180836391
What are the top-level buckets of this incident? Check all that apply. | Premature Claims of Victory | 1.140937323

Table B.3: Questions that experienced the most agreement


Appendix C
Repeat Spreaders—Additional Partisan News Outlets in the Twitter Data

The New York Post’s coverage served mainly to introduce narratives involving
election fraud, including reporting on unfounded allegations that deceased vot-
ers in New York had ballots cast on their behalf. Conservative news outlets DC
Patriot (9 incidents) and National Pulse (8 incidents) acted similarly in the pro-
motion of stories revolving around misplaced ballots (DC Patriot) and detailing
previous instances of fraud both domestic and foreign (National Pulse).

JustTheNews, a news site run by conservative commentator John Solomon, produced stories that applied political commentary to narratives asserting election fraud and was involved in spreading the Nevada Whistleblower narrative. URLs from the Washington Times appear in tweets related to three of the top incidents, reflecting their attention to widely followed election conspiracy theories.

Domains associated with political conspiracy theories include ZeroHedge, which appeared in 10 incidents and was involved in the spread of the Color Revolution narrative. The Epoch Times was cited in a range of misleading “voter fraud” narratives, such as alleging that large numbers of people were voting twice and that discarded ballots were evidence of intentional fraud. The website also promoted content related to three large incidents—the Dominion conspiracy theory, and the Sharpiegate and Stop The Steal narratives.

The Fox News website, foxnews.com, was cited in a narrative regarding ballots that went missing in the care of USPS and the spread of Biden’s mis-contextualized statement regarding fraud protections. Articles for which Fox News was cited often presented factual evidence of a real-world event with an underlying subtext of election insecurity or widespread voter fraud that was picked up and made more explicit in the social media sphere. The spin-off site of Fox contributor Sara Carter (saraacarter.com) was involved in seven similar incidents resulting in over 80,000 retweets. Her content was often more explicit in falsely claiming widespread voter fraud—including a highly speculative article (now removed) that helped to feed the Dominion conspiracy theory.


Appendix D
Ticket Analysis Questions

D.1 Tier 1 Analysis Questions


1: Overall Analyst Description: What is the content about? Provide a brief
description of the narrative being pushed and the tactics used to spread it
(platforms, assets, etc.) so that other analysts can understand the content at a
glance.
2: Platform: What platform(s) does the content appear on? Include links or links
to screenshots, if appropriate. What platforms has the content trended on?
3: Language: What language(s) is the content written in?
4: Content Assets: What type of media is included in the content?
Examples:

Contains video
Contains image with text
Contains image without text
Template text (copy-paste)
Unique text

5a: Category: What EIP-defined categories of election interference does it fall under?
Choose all that apply:

Procedural Interference

Participation Interference
Fraud
Delegitimization*

5b: *If it’s delegitimization, what kind is it?


6: Theme: What is the primary topic or theme of the content?

Examples: VoteByMail
USPS

7: Target Community: What specific communities does the content target (if
applicable)?

This refers to the community whose voting ability or trust in the election
process the content is designed to affect—not the community propa-
gating the claim. Target communities can include seniors, teenagers,
Latinx voters, QAnon, far left, far right, etc.

8: State Targeted: What geographical area [state] does the content target (if
applicable)?
9: Account Type or Amplification: What kind of account is primarily responsible
for spreading the content?

Examples:
Politician/candidate for office
Influencer/verified account
Organic account
Seemingly inauthentic account
Anonymous account

10: Reach: What is the reach of the content at this time?

How many shares does it have? How many replies or comments? How
many likes? Use the following as approximate guidelines:

• None: 0 engagements

• Low: 1-10 engagements


• Medium: 10-500 engagements
• High: 500-1000 engagements
• Viral: 1000+ engagements

11: Velocity: What is the velocity of the content?

Is the rate of spread of the content static, growing, or declining? Use the following as approximate guidelines:

• Static: no change to reach


• Growing: reach is growing linearly
• Viral: reach is growing exponentially
• Decreasing: reach is decreasing
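
To make the approximate guidelines in questions 10 and 11 concrete, here is a minimal sketch (hypothetical helper names; boundary values are assigned to the lower bucket, a detail the guidelines above leave unspecified):

```python
def classify_reach(engagements: int) -> str:
    """Bucket cumulative engagements using the Tier 1 guidelines above."""
    if engagements == 0:
        return "None"
    if engagements <= 10:
        return "Low"
    if engagements <= 500:
        return "Medium"
    if engagements <= 1000:
        return "High"
    return "Viral"

def classify_velocity(samples: list[int]) -> str:
    """Label a series of reach measurements taken at regular intervals.

    Heuristic only: roughly constant increments suggest linear growth
    ("Growing"), while accelerating increments suggest exponential
    spread ("Viral").
    """
    if len(samples) < 3:
        return "Static"  # not enough points to judge a trend
    deltas = [b - a for a, b in zip(samples, samples[1:])]
    if all(d == 0 for d in deltas):
        return "Static"
    if all(d < 0 for d in deltas):
        return "Decreasing"
    if all(d2 > d1 > 0 for d1, d2 in zip(deltas, deltas[1:])):
        return "Viral"
    return "Growing"
```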

D.2 Tier 2 Analysis Questions


12: What else do we know about the primary account sharing the content?
Examples:

25,000 followers
Created in 2012

13: What communities are sharing the content?


Examples:

Conspiratorial Instagram pages, Bernie-aligned Facebook groups

14: What was the first account or Page to share the content (if not the account
listed above)?
15: Is there any evidence of coordination or inauthentic activity? Unusual
tactics?
16: To what extent is counter-messaging already underway? Has it been
successful?
17: Any additional notes about the user and related social accounts/websites discussed in the ticket?


Appendix E
News Articles Citing the Election Integrity Partnership

News Articles citing the EIP during the active project period, listed in chrono-
logical order:
Route Fifty | Aug. 12, 2020: “New Coalition Wants to Help in Fight Against Election Misinformation”
https://www.route-fifty.com/tech-data/election-integrity-partnership-misinformation-disinformation

Stanford News | Sept. 28, 2020: “The 2020 U.S. election, issues and challenges”
https://news.stanford.edu/2020-u-s-election-issues-challenges

The New York Times | Sept. 28, 2020: “Editorial: What’s the Plan if Trump Tweets That He’s Won Re-election?”
https://www.nytimes.com/opinion/social-media-trump-election.html

The New York Times | Sept. 29, 2020: “Project Veritas Video Was a ‘Coordinated Disinformation Campaign,’ Researchers Say”
https://www.nytimes.com/us/politics/project-veritas-ilhan-omar.html

Santa Rosa Press-Democrat | Sept. 30, 2020: “A tall tale about election fraud”
https://www.pressdemocrat.com/article/opinion/pd-editorial-a-tall-tale-about-election-fraud

Bloomberg News | Oct. 5, 2020: “Facebook, Twitter Are Failing to Curb Voting-By-Mail Falsehoods”
https://www.bloomberg.com/news/articles/facebook-twitter-are-failing-to-curb-voting-by-mail-falsehoods

Business Day (South Africa) | Oct. 5, 2020: “Facebook, Twitter have hands full with postal voting misinformation”
https://www.businesslive.co.za/bd/world/americas/facebook-twitter-have-hands-full-with-postal-voting-misinformation

The Washington Post | Oct. 8, 2020: “Facebook bans marketing firm running ‘troll farm’ for pro-Trump youth group”
https://www.washingtonpost.com/technology/facebook-bans-media-consultancy-running-troll-farm-pro-trump-youth-group

The Daily Beast | Oct. 13, 2020: “Far-Right Social Media Sites Packed With Foreign Clickbait”
https://www.thedailybeast.com/-in--parler-users-follows-macedonian-clickbait-site

Bloomberg News | Oct. 13, 2020: “Fake News Hub from 2016 Election Thriving Again, Report Finds”
https://www.bloomberg.com/news/articles/fake-news-hub-from-2016-election-thriving-again-report-finds

Associated Press | Oct. 13, 2020: “Report: Social media influencers push voting misinformation”
https://apnews.com/article/election-2020-donald-trump-politics-media-misinformation

NBC News | Oct. 15, 2020: “For Trump’s ‘rigged’ election claims, an online megaphone awaits”
https://www.nbcnews.com/tech/tech-news/trump-s-rigged-election-claims-online-megaphone-awaits

CyberScoop | Oct. 20, 2020: “Why social media disinformation poses such a security threat”
https://www.cyberscoop.com/social-media-disinformation-represents-security-threat

MIT Technology Review | Oct. 21, 2020: “Efforts to undermine the election are too big for Facebook and Twitter to cope with”
https://www.technologyreview.com/how-to-delegitimize-an-election-rigged-misinformation

National Public Radio | Oct. 21, 2020: “Voters In Florida And Alaska Receive Emails Warning ‘Vote For Trump Or Else!’”
https://www.npr.org/voters-in-florida-and-alaska-receive-emails-warning-vote-for-trump-or-else
Fast Company | Oct. 23, 2020: “Facebook is still failing to take down ads that question the election’s integrity”
https://www.fastcompany.com/facebook-is-still-failing-to-take-down-ads-that-question-the-elections-integrity

GeekWire | Oct. 26, 2020: “Scholars tracking social media see efforts to delegitimize election, imperiling democracy”
https://www.geekwire.com/scholars-tracking-social-media-see-efforts-delegitimize-election-imperiling-democracy

Science | Oct. 26, 2020: “As U.S. election nears, researchers are following the trail of fake news”
https://www.sciencemag.org/news/us-election-nears-researchers-are-following-trail-fake-news

MIT Technology Review | Oct. 26, 2020: “What to expect on election day”
https://www.technologyreview.com/what-to-expect-on-election-day-disinformation-results

CyberScoop | Oct. 29, 2020: “Don’t let election-themed misinformation fool you. Here’s what to watch out for.”
https://www.cyberscoop.com/election-trump-twitter-winner-misinformation

KIRO-TV Seattle | Oct. 30, 2020: “UW social media expert: Election misinformation is an ‘attack on democracy’”
https://www.kiro7.com/news/local/uw-social-media-expert-election-misinformation-is-an-attack-democracy

Oklahoma Watch | Oct. 30, 2020: “These Oklahoma Politicians Gave Misinformation a Boost”
https://oklahomawatch.org/these-oklahoma-politicians-gave-misinformation-a-boost

National Public Radio | Nov. 1, 2020: “Researchers Prepare For Deluge Of Election Night Misinformation”
https://www.npr.org/researchers-prepare-for-deluge-of-election-night-misinformation

Stanford News | Nov. 2, 2020: “Disinformation investigators: Stanford students sleuth for false, misleading reports on how to vote”
https://news.stanford.edu/sleuthing-misinformation-voting


The Washington Post | Nov. 2, 2020: “The Post’s View: Election Day promises to be full of misinformation. Here’s how we can all stop its spread.”
https://www.washingtonpost.com/opinions/election-day-promises-to-be-full-of-misinformation-heres-how-we-can-all-stop-its-spread_story.html

The New York Times | Nov. 3, 2020: “After Twitter Labels Trump’s Tweet About Pennsylvania, Its Spread Slows”
https://www.nytimes.com/technology/after-twitter-labels-trumps-tweet-about-pennsylvania-its-spread-slows.html

Protocol | Nov. 3, 2020: “Meet the researchers and activists fighting misinformation”
https://www.protocol.com/election-day-misinfomation-disinformation

The Washington Post | Nov. 4, 2020: “Trump’s early victory declarations test tech giants’ mettle in policing threats to the election”
https://www.washingtonpost.com/technology/misinformation-election-social-text

The Washington Post | Nov. 4, 2020: “Trump’s campaign and family boost bogus conspiracy theories in a bid to undermine vote count”
https://www.washingtonpost.com/technology/election-results-misinformation

Detroit Free Press | Nov. 8, 2020: “Antrim County figures prominently in election conspiracy theory”
https://www.freep.com/story/news/politics/elections/election-misinformation-michigan-vote-antrim-county

Le Monde | Nov. 8, 2020: “US presidential election: ‘Disinformation has taken a leading role’”
https://www.lemonde.fr/pixels/article/election-presidentielle-americaine-la-desinformation-a-pris-un-role-de-premier-plan.html

Reuters | Nov. 8, 2020: “Fact check: Deviation from Benford’s Law does not prove election fraud”
https://www.reuters.com/article/uk-factcheck-benford

Bridge Michigan | Nov. 9, 2020: “Human error, Dominion voting equipment fuel false fraud claims in Michigan”
https://www.bridgemi.com/michigan-government/human-error-dominion-voting-equipment-fuel-false-fraud-claims-michigan
The Washington Post | Nov. 9, 2020: “Twitter and Facebook warning labels aren’t enough to save democracy”
https://www.washingtonpost.com/technology/facebook-twitter-election-misinformation-labels

The Washington Post | Nov. 10, 2020: “Georgia fight shows how Trump’s unfounded election fraud claims are splitting GOP”
https://www.washingtonpost.com/politics/cybersecurity-202-georgia-fight-shows-how-trumps-unfounded-election-fraud-claims-are-splitting-gop

The Washington Post | Nov. 10, 2020: “The Post’s View: Trump is the problem when it comes to disinformation. So what now?”
https://www.washingtonpost.com/opinions/trump-is-the-problem-when-it-comes-to-disinformation-so-what-now_story.html

National Public Radio | Nov. 10, 2020: “From Steve Bannon To Millennial Millie: Facebook, YouTube Struggle With Live Video”
https://www.npr.org/from-steve-bannon-to-millennial-millie-facebook-youtube-struggle-with-live-video

CQ Roll Call | Nov. 10, 2020: “Twitter, Facebook face rocky future post-Donald Trump”
https://www.rollcall.com/twitter-facebook-face-rocky-future-post-donald-trump

Bloomberg News | Nov. 11, 2020: “YouTube Election Loophole Lets Some False Trump-Win Videos Spread”
https://www.bloombergquint.com/onweb/youtube-election-loophole-lets-some-false-trump-win-videos-spread

NBC News | Nov. 11, 2020: “Misinformation by a thousand cuts: Varied rigged election claims circulate”
https://www.nbcnews.com/tech/tech-news/misinformation-thousand-cuts-varied-rigged-election-claims-circulate

The Washington Post | Nov. 12, 2020: “Trump’s attacks on election outcome prolong tech’s emergency measures”
https://www.washingtonpost.com/technology/facebook-ad-ban-lame-duck

Newsweek | Nov. 12, 2020: “Fact Check: Did Dominion Voting Systems Cause Widespread Voter Fraud, As Trump Claims?”
https://www.newsweek.com/fact-check-did-dominion-voting-systems-cause-widespread-voter-fraud-trump-claims

NBC News | Nov. 12, 2020: “Biden picks chief of staff while misinformation wildfire fuels Trump’s refusal to concede”
https://www.nbcnews.com/news/morning-briefing/biden-picks-chief-staff-while-misinformation-wildfire-fuels-trump-s

Americas Quarterly | Nov. 12, 2020: “Misinformation Is Threatening Brazil’s Elections, Too”
https://www.americasquarterly.org/article/misinformation-is-threatening-brazils-elections-too

Politico Morning Tech | Nov. 13, 2020: “Where Biden’s new chief of staff stands on tech”
https://www.politico.com/newsletters/morning-tech/where-bidens-new-chief-of-staff-stands-on-tech

Bloomberg News | Nov. 14, 2020: “‘Follow me on Parler’ is new mantra for users aggrieved by Facebook”

South China Morning Post | Nov. 16, 2020: “Dis-United States: Biden’s team faces reality of rule during Trumpism”
https://www.scmp.com/news/china/diplomacy/article/dis-united-states-bidens-team-faces-reality-rule-during

Detroit Free Press | Nov. 17, 2020: “Russian ballot-stuffing video goes viral again, and other predictable things about 2020 misinformation”
https://www.freep.com/story/news/politics/elections/election-misinformation-predictable

The Associated Press | Nov. 20, 2020: “Who needs Russia? Loudest attacks on US vote are from Trump”
https://apnews.com/article/donald-trump-loudest-attack-us-vote

The New York Times | Nov. 20, 2020: “Trump allies are among the frequent purveyors of election misinformation”
https://www.nytimes.com/technology/trump-allies-are-among-the-frequent-purveyors-of-election-misinformation.html
The New York Times | Nov. 23, 2020: “How Misinformation ‘Superspreaders’ Seed False Election Theories”

Voice of America | Nov. 24, 2020: “Russian Influence Peddlers Carving Out New Audiences on Fringes”
https://www.voanews.com/usa/russian-influence-peddlers-carving-out-new-audiences-fringes

The Wall Street Journal | Dec. 11, 2020: “Social Media in 2020: A Year of Misinformation”
https://www.wsj.com/articles/social-media-in-2020-a-year-of-misinformation-and-disinformation

MLive | Jan. 4, 2021: “Misinformation and conspiracies took starring role in Michigan’s political movements”
https://www.mlive.com/public-interest/misinformation-and-conspiracies-took-starring-role-in-michigans-political-movements.html

Computer Weekly | Jan. 8, 2021: “Tech sector reacts to Trump social media bans”
https://www.computerweekly.com/news/Tech-sector-reacts-to-Trump-social-media-bans

The New York Times | Jan. 8, 2021: “Trump Isn’t the Only One”
https://www.nytimes.com/technology/trump-misinformation-superspreaders.html

National Public Radio | Jan. 8, 2021: “Twitter Permanently Suspends Trump, Citing ‘Risk Of Further Incitement Of Violence’”
https://www.npr.org/twitter-bans-president-trump-citing-risk-of-further-incitement-of-violence

The Missoulian | Jan. 9, 2021: “Riot blame-shifting leaks into Montana social media”
https://missoulian.com/news/state-and-regional/riot-blame-shifting-leaks-into-montana-social-media

The Washington Post | Jan. 16, 2021: “Misinformation dropped dramatically the week after Twitter banned Trump and some allies”
https://www.washingtonpost.com/technology/misinformation-trump-twitter
CNET | Jan. 16, 2021: “After Twitter banned Trump, misinformation plummeted, says report”
https://www.cnet.com/news/after-twitter-banned-trump-misinformation-plummeted-says-report

Variety | Jan. 17, 2021: “After Twitter Banned Donald Trump, Election Misinformation Online Plunged Dramatically”
https://variety.com/digital/news/twitter-ban-trump-election-misinformation-research

Mountain View Voice | Jan. 21, 2021: “Can social media giants stop an insurrection before it happens?”
https://www.mv-voice.com/news/can-social-media-giants-stop-an-insurrection-before-it-happens

Nature | Feb. 4, 2021: “Tracking QAnon: how Trump turned conspiracy-theory research upside down”
https://www.nature.com/articles/

Appendix F
Methodology for Evaluating Platform Policy

In total, we evaluated 15 different platforms1 across four categories meant to partition the space of potential problematic content and behavior: the mechanics of the election (Procedural Interference), the voters themselves (Participation Interference), encouragement of fraud (Fraud), and casting doubt on the integrity of the election outcome (Delegitimization of Election Results). The definitions of these categories are detailed in Chapter 1.
We first determined if the platform stated in its community guidelines whether it
would address election-related content on its platform. While the platforms that
don’t have election-related policies—Parler, Gab, Discord, WhatsApp, Telegram,
Reddit, and Twitch— may use existing policies to address content such as the
encouragement of fraud, we cannot properly evaluate them in an election-
related context. We then rated each platform’s policies as either “None,” “Non-
Comprehensive,” or “Comprehensive,” depending on how specifically it addresses
the content type:

• None: The platform has no explicit policy or stance on the issue.

• Non-Comprehensive: Policy in this category contains indirect language, or uses broad “umbrella” language, such that it is not clear what type of election misinformation and disinformation the policy covers. This is also reserved for policies that give one detailed example such that they cover some, but not all, of a subject.
1. The platforms we evaluated are: Facebook, Instagram, Twitter, YouTube, Pinterest, Nextdoor, TikTok, Snapchat, Parler, Gab, Discord, WhatsApp, Telegram, Reddit, and Twitch. Twitch was added to the list of platforms we evaluated during our blog post update in October.


• Comprehensive: Policy in this category uses direct language and is clear on what type of election misinformation and disinformation the policy covers. It also sufficiently covers the full breadth of the category.

For each of the categories, we defined “Comprehensive” to be:

• Procedural: The policy specifies time, place, or manner (e.g., voting in person and by mail).

• Participation: The policy specifies it will address posts that threaten personal safety or deter participation in the election process, whether through violent or non-violent means.

• Fraud: The policy specifies it will address posts that encourage participat-
ing in the election in an illegal way.

• Delegitimization of Election Results: The policy specifies it will address claims that attempt to delegitimize the election.
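
The rubric reduces to a mapping from platform and category to one of the three ratings. A minimal sketch (illustrative only; the names are our own, and the two example rows are transcribed from the Facebook and Twitter tables below):

```python
from enum import Enum

class Rating(Enum):
    NONE = "None"
    NON_COMPREHENSIVE = "Non-Comprehensive"
    COMPREHENSIVE = "Comprehensive"

CATEGORIES = (
    "Procedural Interference",
    "Participation Interference",
    "Fraud",
    "Delegitimization of Election Results",
)

# Ratings as of October 28, 2020, for two of the 15 platforms evaluated.
evaluations: dict[str, dict[str, Rating]] = {
    "Facebook": {c: Rating.COMPREHENSIVE for c in CATEGORIES},
    "Twitter": {
        "Procedural Interference": Rating.COMPREHENSIVE,
        "Participation Interference": Rating.COMPREHENSIVE,
        "Fraud": Rating.NON_COMPREHENSIVE,
        "Delegitimization of Election Results": Rating.COMPREHENSIVE,
    },
}
```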

The tables in this report have slightly different policy ratings under the category
of fraud from when we first published our analysis in August 2020. There were
many unfounded claims of “election fraud,” but we determined that this fell into
the larger category of delegitimization of election results. Our fraud category
is therefore scoped solely around claims that encourage people to commit
fraud—which appeared only a handful of times during our monitoring period.
Many platforms, including those without election-related policies, have terms of service and community standards stating that the promotion of illegal activity is not allowed on their platforms. However, only Facebook and Pinterest explicitly state that the encouragement of voter fraud is not allowed on their platforms, and they therefore received a rating of “Comprehensive.”
Over the four months of the EIP’s operation, we updated our platform evaluations
to account for policy changes made by the platforms. We frequently checked
for changes in platforms’ community guidelines and followed the platforms’
blog posts, which we considered to be policy statements even though some
of these updates weren’t formally incorporated into the platforms’ community
guidelines. We did not consider policy changes that were stated to the press,
or on social media by executives or employees of the platform. Below is a table
of the corresponding policies for each platform. The colors correspond to new
policies that were introduced between August 2020 and October 28, 2020.


Facebook

Procedural Interference: Comprehensive (Rating did not change during election cycle)
“Misrepresentation of the dates, locations, and times, and methods for voting or voter registration or census participation.”
“Misrepresentation of who can vote, qualifications for voting, whether a vote will be counted, and what information and/or materials must be provided in order to vote.”
“Calls for coordinated interference that would affect an individual’s ability to participate in the census or an election.”
Facebook will remove implicit misrepresentations about voting that may “mislead you about what you need to do to get a ballot.” [Sept. 03]

Participation Interference: Comprehensive (Rating did not change during election cycle; policies updated are shown in red)
“Any content containing statements of intent, calls for action, conditional or aspirational statements, or advocating for high- or mid-severity violence due to voting, voter registration, or the administration or outcome of an election.” [Sept. 03]
“Content stating that census or voting participation may or will result in law enforcement consequences (e.g., arrest, deportation, imprisonment).”
“Content claiming that the US Immigration and Customs Enforcement (ICE) is at a voting location.” [Sept. 03]
“Calls for coordinated interference that would affect an individual’s ability to participate in an election.”
“Explicit claims that people will be infected by COVID (or another communicable disease) if they participate in the voting process.” [Sept. 03]
“Statements of intent or advocacy, calls to action, or aspirational or conditional statements to bring weapons to locations, including but not limited to places of worship, educational facilities, or polling places, or locations used to count votes or administer an election* (or encouraging others to do the same).”
*“For the following content, we may require more information and/or context in order to enforce: Threats against election officials.” [Sept. 03]

Fraud: Comprehensive (Rating did not change during election cycle)
“Offers to buy or sell votes with cash or gifts.”
“Statements that advocate, provide instructions or show explicit intent to illegally participate in a voting or census process.”

Delegitimization of Election Results: Comprehensive (Rating changed from Non-Comprehensive)
“We will attach an informational label to content that seeks to delegitimize the outcome of the election or discuss the legitimacy of voting methods, for example, by claiming that lawful methods of voting will lead to fraud. This label will provide basic authoritative information about the integrity of the election and voting methods.” [Sept. 03]
“Importantly, if any candidate or campaign tries to declare victory before the results are in, we’ll add a label to their post educating that official results are not yet in and directing people to the official results.” [Sept. 03]
“Other misrepresentations related to voting in an official election or census participation may be subject to false news standards, as referenced in section 20” (now section 21).


Twitter

Procedural Interference: Comprehensive (Rating did not change during election cycle)
“Misleading information about procedures to participate in a civic process (for example, that you can vote by Tweet, text message, email, or phone call in jurisdictions where these are not a possibility).”
“Misleading information about requirements for participation, including identification or citizenship requirements”
“Misleading statements or information about the official, announced date or time of a civic process.”
“Misleading claims that polling places are closed, that polling has ended or other misleading information relating to votes not being counted.”
“Misleading claims about long lines, equipment problems, or other disruptions at voting locations during election periods.”
“False or misleading information that causes confusion about the laws and regulations of a civic process, or officials and institutions executing those civic processes.” [Sept. 10]

Participation Interference: Comprehensive (Rating did not change during election cycle)
“Misleading claims about police, or law enforcement activity related to voting in an election, polling places, or collecting census information.”
“Misleading claims about long lines, equipment problems, or other disruptions at voting locations during election periods.”
“Misleading claims about process, procedures, or techniques which could dissuade people from participating.”
“Threats regarding voting locations or other key places or events (note that our violent threats policy may also be relevant for threats not covered by this policy).”
Twitter will remove “Tweets that encourage violence or call for people to interfere with election results or smooth operation of polling places.” [Oct. 9]
“Tweets meant to incite interference with the election process or with the implementation of election results, such as through violent action, will be subject to removal. This covers all Congressional races and the Presidential Election.” [Oct. 9]

Fraud: Non-Comprehensive (Rating did not change during election cycle)
“Illegal or certain regulated goods or services: You may not use our service for any unlawful purpose or in furtherance of illegal activities. This includes selling, buying, or facilitating transactions in illegal goods or services, as well as certain types of regulated goods or services.”

Delegitimization of Election Results: Comprehensive (Rating changed from Non-Comprehensive)
“Misleading claims that polling places are closed, that polling has ended or other misleading information relating to votes not being counted.”
“We also consider whether the context in which media are shared could result in confusion or misunderstanding or suggests a deliberate intent to deceive people about the nature or origin of the content, for example by falsely claiming that it depicts reality.”
“Disputed claims that could undermine faith in the process itself, e.g. unverified information about election rigging, ballot tampering, vote tallying, or certification of election results.” [Sept. 10]
“Misleading claims about the results or outcome of a civic process which calls for or could lead to interference with the implementation of the results of the process, e.g. claiming victory before election results have been certified, inciting unlawful conduct to prevent a peaceful transfer of power or orderly succession.” [Oct. 9]
“People on Twitter, including candidates for office, may not claim an election win before it is authoritatively called. To determine the results of an election in the US, we require either an announcement from state election officials, or a public projection from at least two authoritative, national news outlets that make independent election calls. Tweets which include premature claims will be labeled and direct people to our official US election page.” [Oct. 9]


YouTube

Procedural Interference: Comprehensive (Rating did not change during election cycle)
“Content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting.”
“Incitement to interfere with democratic processes: content encouraging others to interfere with democratic processes, such as obstructing or interrupting voting procedures.”
Examples of content not to post:
• “Deliberately telling viewers an incorrect election date.”
• “Telling viewers they can vote through fake methods like texting their vote to a particular number.”
• “Giving made up voter eligibility requirements like saying that a particular election is only open to voters over 50 years old.”
• “we remove content falsely claiming that mail-in ballots have been manipulated to change the results of an election”

Participation Interference: Comprehensive (Rating changed from Non-Comprehensive)
“Content aiming to mislead voters about the time, place, means or eligibility requirements for voting, or false claims that could materially discourage voting.”
“Incitement to interfere with democratic processes: content encouraging others to interfere with democratic processes, such as obstructing or interrupting voting procedures.”
Examples of content not to post:
• Telling viewers to create long voting lines with the purpose of making it harder for others to vote
• “Claiming that a voter’s political party affiliation is visible on a vote-by-mail envelope.”

Fraud: Non-Comprehensive (Rating did not change during election cycle)
“Don’t post content on YouTube if it fits any of the descriptions noted below. Instructional theft or cheating: Showing viewers how to steal tangible goods or promoting dishonest behavior”

Delegitimization of Election Results: Non-Comprehensive (Rating did not change during election cycle)
Manipulated Media: “Content that has been technically manipulated or doctored in a way that misleads users (beyond clips taken out of context) and may pose a serious risk of egregious harm.”
Example: “Misattributing a 10 year old video that depicts stuffing of a ballot box to a recent election.”
Examples of content not to post:
• False claims that non-citizen voting has determined the outcome of past elections.
• Telling viewers to hack government websites to delay the release of elections results
• Manipulated Media: “Content that has been technically manipulated or doctored in a way that misleads users (beyond clips taken out of context) and may pose a serious risk of egregious harm.”


Pinterest

Procedural Interference: Comprehensive (Rating changed from None)
“False or misleading information about the dates, times, locations and procedure for voting or census participation.”
“Content that misleads voters about how to correctly fill-out and submit a ballot, including a mail-in ballot, or census form.” [Sept. 3]

Participation Interference: Comprehensive (Rating changed from Non-Comprehensive)
“False or misleading content that impedes an election’s integrity or an individual’s or group’s civic participation, including registering to vote, voting, and being counted in a census.”
“False or misleading information about public safety that is intended to deter people from exercising their right to vote or participate in a census.”
“False or misleading information about who can vote or participate in the census and what information must be provided to participate.”
“False or misleading statements about who is collecting information and/or how it will be used.”
“Threats against voting locations, census or voting personnel, voters or census participants, including intimidation of vulnerable or protected group voters or participants.” [Sept. 3]

Fraud: Comprehensive (Rating changed from Non-Comprehensive)
“Content that encourages or instructs voters or participants to misrepresent themselves or illegally participate” [Sept. 3]

Delegitimization of Election Results: Comprehensive (Rating changed from Non-Comprehensive)
“Content apparently intended to delegitimize election results on the basis of false or misleading claims.” [Sept. 3]


Nextdoor

Procedural Interference: Non-Comprehensive (Rating changed from Comprehensive)
“bans any inaccurate content about the time, place, means, or eligibility requirements to vote in any local or national elections in the U.S.”
“False or misleading information that could prevent or discourage people from voting, cause their votes not to be counted, or interfere with the election process.”

Participation Interference: Non-Comprehensive (Rating changed from None)
“False or misleading information that could prevent or discourage people from voting, cause their votes not to be counted, or interfere with the election process.”

Fraud: Non-Comprehensive (Rating did not change during election cycle)
“When offering or seeking goods or services on Nextdoor, make sure that you’re complying with local laws and not engaging in illegal transactions.”

Delegitimization of Election Results: Non-Comprehensive (Rating changed from None)
“False or misleading information that could prevent or discourage people from voting, cause their votes not to be counted, or interfere with the election process.”
“False or misleading claims about the results of an election that could lead to interference with the election process.”

TikTok

Procedural Interference: Non-Comprehensive (Rating did not change during election cycle)
“Content that misleads community members about elections or other civic processes.”
“Claims relating to polling stations on election day that have not yet been verified.”
“Content that misrepresents the date of an election.” [Oct. 7]

Participation Interference: Non-Comprehensive (Rating changed from None)
“Attempts to intimidate voters or suppress voting.”
TikTok will redirect search results with terms associated with “incitement to violence.”
TikTok will block future livestreaming from an account whose livestream “seeks to incite violence or promote hateful ideologies, conspiracies, or disinformation.”
TikTok will add a banner pointing viewers to our election guide content with…“attempts to dissuade people from voting by exploiting COVID-19 as a voter suppression tactic.” [Oct. 7]

Fraud: Non-Comprehensive (Rating did not change during election cycle)
“Content may be removed if it relates to activities or goods that are regulated or illegal in the majority of the region or world, even if the activities or goods in question are legal in the jurisdiction of posting.”

Delegitimization of Election Results: Comprehensive (Rating changed from Non-Comprehensive)
“False claims that seek to erode trust in public institutions, such as claims of voter fraud resulting from voting by mail or claims that your vote won’t count.” [Oct. 7]
“Content that misleads community members about elections or other civic processes.”
“Reviewed content that shares unverified claims, such as a premature declaration of victory before results are confirmed.” [Oct. 7]


Snapchat

Procedural Interference: Non-Comprehensive (Rating changed from No election-related policies)
“We prohibit spreading false information that causes harm or is malicious, such as denying the existence of tragic events, unsubstantiated medical claims, or undermining the integrity of civic processes.”

Participation Interference: None

Fraud: Non-Comprehensive (Rating changed from No election-related policies)
We prohibit the promotion and use of certain regulated goods, as well as the depiction or promotion of criminal activities.

Delegitimization of Election Results: Non-Comprehensive (Rating changed from No election-related policies)
“We prohibit spreading false information that causes harm or is malicious, such as denying the existence of tragic events, unsubstantiated medical claims, or undermining the integrity of civic processes.”

F.1 Assessing our methodology


The purpose of this framework is to provide a clear visualization of civic integrity policies across multiple social media platforms and to create a single standard against which all platforms can be evaluated. The community guidelines and terms of service that govern user content vary widely among platforms and do not use a standardized vocabulary. By directly comparing the language of multiple platforms, the framework provides insight into each platform’s policies, which allowed our analysis to advocate for specific platform-level policy recommendations by highlighting existing shortfalls. Finally, the framework is intended to be a resource for civil society, academia, and citizens to understand what election-related speech popular platforms moderate.
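As a rough illustration of how such a single standard can be operationalized, the short sketch below (our own illustrative Python, not tooling used by the EIP) encodes the final ratings from the tables above as a lookup keyed by platform and category; the category and rating labels are taken from this appendix, while the variable and function names are ours.

    # Illustrative sketch: encode each platform's final rating per policy
    # category (as transcribed from the tables in this appendix) so that
    # all platforms can be compared against one standard.
    CATEGORIES = [
        "Procedural Interference",
        "Participation Interference",
        "Fraud",
        "Delegitimization of Election Results",
    ]

    policy_ratings = {
        "Pinterest": dict.fromkeys(CATEGORIES, "Comprehensive"),
        "Nextdoor": dict.fromkeys(CATEGORIES, "Non-Comprehensive"),
        "TikTok": {
            "Procedural Interference": "Non-Comprehensive",
            "Participation Interference": "Non-Comprehensive",
            "Fraud": "Non-Comprehensive",
            "Delegitimization of Election Results": "Comprehensive",
        },
        "Snapchat": {
            "Procedural Interference": "Non-Comprehensive",
            "Participation Interference": "None",
            "Fraud": "Non-Comprehensive",
            "Delegitimization of Election Results": "Non-Comprehensive",
        },
    }

    def compare(category: str) -> dict:
        """Return each platform's rating for a single category."""
        return {platform: ratings[category]
                for platform, ratings in policy_ratings.items()}

    # Example: compare how platforms rate on delegitimization policies.
    print(compare("Delegitimization of Election Results"))

A lookup of this kind captures only the policy language ratings, not enforcement; the limitations discussed below apply to any such comparison.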
At the same time, there are limitations to this methodology that are equally important to reflect on. First, the framework does not account for the fact that each platform functions differently in the information environment. For example, we did not explore whether messaging platforms such as WhatsApp should have different policies from a video platform like YouTube when it comes to election-related content.
Second, the framework’s rating system was centered on policy language, not on how those policies were applied in practice, which may give a misleading impression that one platform is better than another at mitigating misinformation and disinformation. Although many platform policies are accessible to the general public, platforms also maintain internal guidance that specifies further nuances of their externally facing rules, including how those rules are applied. The opacity of platform decision-making is a further limitation on the accuracy of our framework: some gaps we identified in platform policies could be covered by internal mechanisms, while some apparent strengths could be nullified by a company’s reluctance to enforce at scale. There may be unknown pitfalls in these policies that we cannot see externally.


Lastly, because our categories were created before the election, we did not know how effectively they would capture and describe the content we came across in our monitoring. As we applied these categories in practice, some narrowed while others expanded. For example, the category of Fraud challenged our original definition because the term “fraud” was used broadly to cast doubt on the election; we therefore limited the scope of the Fraud category to a strict definition covering content that encouraged people to commit fraud. Unfounded accusations of fraud instead fell into the Delegitimization category, which, looking back at our data, encompassed the majority of the incidents we monitored. In contrast to the specificity we tried to capture in the other categories, Delegitimization became very expansive.


THE LONG FUSE:


MISINFORMATION AND THE 2020 ELECTION

The Election Integrity Partnership was officially formed
on July 26, 2020 — 100 days before the 2020 presidential
election — as a coalition of research entities that would
focus on supporting real-time information exchange between
the research community, election officials, government agencies,
civil society organizations, and social media platforms. The
Partnership was formed between four of the nation’s leading
institutions focused on understanding misinformation in the
social media landscape: the Stanford Internet Observatory,
Graphika, the Atlantic Council’s Digital Forensic Research Lab,
and the University of Washington’s Center for an Informed Public.
This is the final report of their findings.

ISBN 978-1-7367627-1-4

Cover design & illustration:


Alexander Atkins Design, Inc.
