TikTok Ruling
No. 24-1113
v.
Xiang Li, Elizabeth A. McNamara, Chelsea T. Kelly, James R.
Sigel, Adam S. Sieff, and Joshua Revesz.
Daniel Tenny, Attorney, U.S. Department of Justice,
argued the cause for respondent. With him on the brief were
Brian M. Boynton, Principal Deputy Assistant Attorney
General, Brian D. Netter, Deputy Assistant Attorney General,
Mark R. Freeman, Sharon Swingle, Casen B. Ross, Sean R.
Janda, and Brian J. Springer, Attorneys, Matthew G. Olsen,
Assistant Attorney General for National Security, Tyler J.
Wood, Deputy Chief, Foreign Investment Review Section, and
Tricia Wellman, Acting General Counsel, Office of the
Director of National Intelligence.
Jonathan Berry, Michael Buschbacher, Jared M. Kelson,
James R. Conde, and William P. Barr were on the brief for
amicus curiae American Free Enterprise Chamber of
Commerce in support of respondent.
Attorney General for the State of South Carolina, Marty J.
Jackley, Attorney General, Office of the Attorney General for
the State of South Dakota, Jonathan Skrmetti, Attorney
General, Office of the Attorney General for the State of
Tennessee, and Sean D. Reyes, Attorney General, Office of the
Attorney General for the State of Utah, were on the brief for
amici curiae States of Montana, Virginia, and 19 Other States
in support of respondent.
I. Background 8
A. The TikTok Platform 8
B. The Petitioners 9
C. National Security Concerns 11
D. The Act 15
1. Foreign adversary controlled applications 16
2. Prohibitions 18
3. The divestiture exemption 19
E. Procedural History 20
II. Analysis 20
A. Standing and Ripeness 21
B. The First Amendment 24
1. Heightened scrutiny applies. 24
2. The Act satisfies strict scrutiny. 32
a. The Government’s justifications are compelling. 33
(i) National security justifications 33
(ii) Data collection 38
(iii) Content manipulation 42
b. The Act is narrowly tailored. 48
(i) TikTok’s proposed NSA 49
(ii) Other options 53
(iii) Overinclusive / underinclusive 55
C. Equal Protection 57
D. The Bill of Attainder Clause 59
E. The Takings Clause 63
F. Alternative Relief 64
III. Conclusion 65
GINSBURG, Senior Circuit Judge: On April 24, 2024 the
President signed the Protecting Americans from Foreign
Adversary Controlled Applications Act into law. Pub. L. No.
118-50, div. H. The Act identifies the People’s Republic of
China (PRC) and three other countries as foreign adversaries
of the United States and prohibits the distribution or mainte-
nance of “foreign adversary controlled applications.”1 Its
prohibitions will take effect on January 19, 2025 with respect
to the TikTok platform.
1 A foreign adversary controlled application is defined in § 2(g)(3)
as “a website, desktop application, mobile application, or augmented
or immersive technology application that is operated, directly or indi-
rectly (including through a parent company, subsidiary, or affiliate),
by”:
(A) any of — (i) ByteDance, Ltd.; (ii) TikTok; (iii) a
subsidiary of or a successor to an entity identified in
clause (i) or (ii) that is controlled by a foreign adver-
sary; or (iv) an entity owned or controlled, directly or
indirectly, by an entity identified in clause (i), (ii), or
(iii); or
(B) a covered company that — (i) is controlled by a foreign
adversary; and (ii) that is determined by the President
to present a significant threat to the national security of
the United States following [certain procedures].
I. Background
Chinese engineers, “continually develop[s]” the recommenda-
tion engine and platform source code. As we explain in more
detail below, the recommendation engine for the version of the
platform that operates in the United States is deployed to a
cloud environment run by Oracle Corporation.
B. The Petitioners
provide the following overview of the relevant corporate
relationships.
2 We use “China” when referring to the country and PRC when
referencing its government.
censorship, propaganda, or other malign purposes on TikTok
US.”
3 The Executive refers variously to the President, Executive Branch
agencies, including the intelligence agencies, and officials thereof.
a questionnaire to ByteDance about national security concerns
related to ByteDance’s acquisition of Musical.ly. Thus began a
lengthy investigatory process that culminated on August 1,
2020 with CFIUS concluding that TikTok could not suffi-
ciently mitigate its national security concerns and referring the
transaction to the President. The President, acting on that
referral, ordered ByteDance to divest any “assets or property”
that “enable or support ByteDance’s operation of the TikTok
application in the United States.” Regarding the Acquisition of
Musical.ly by ByteDance Ltd., 85 Fed. Reg. 51297, 51297
(Aug. 14, 2020).
Biden elaborated that “software applications” can provide
foreign adversaries with “vast swaths of information from
users,” and that the PRC’s “access to large repositories” of such
data “presents a significant risk.” Id. President Biden directed
several executive agencies to provide risk mitigation options,
and he asked for recommended “executive and legislative
actions” to counter risks “associated with connected software
applications that are designed, developed, manufactured, or
supplied by persons owned or controlled by, or subject to the
jurisdiction or direction of, a foreign adversary.” Id. The
following year, President Biden signed into law a bill prohibit-
ing the use of TikTok on government devices. See generally
Pub. L. No. 117-328, div. R, 136 Stat. 5258 (2022).
extensive, in-depth discussions with Oracle, the proposed
Trusted Technology Provider, whose responsibility under the
proposed mitigation structure included storing data in the
United States, performing source code review, and ensuring
safety of the operation of the TikTok platform in the United
States.”
Third, the proposal provided for a “trusted third party,”
Oracle, to inspect the source code, including TikTok’s
recommendation engine. It also gave the Government author-
ity, under certain circumstances, to instruct TikTok to shut
down the platform in the United States, which TikTok calls a
“kill switch.”
D. The Act
provision by limiting the ability of foreign adversaries to
collect data directly through adversary controlled applications.
4 The term “covered company” is defined as “an entity that operates
. . . a website, desktop application, mobile application, or augmented
or immersive technology application that”:
(i) permits a user to create an account or profile to gener-
ate, share, and view text, images, videos, real-time
communications, or similar content;
(ii) has more than 1,000,000 monthly active users with
respect to at least 2 of the 3 months preceding the date
on which a relevant determination of the President is
made pursuant to paragraph (3)(B);
(iii) enables 1 or more users to generate or distribute content
that can be viewed by other users of the website, desk-
top application, mobile application, or augmented or
immersive technology application; and
(iv) enables 1 or more users to view content generated by
other users of the website, desktop application, mobile application,
or augmented or immersive technology application.
significant threat to national security. Specifically, it includes
any “covered company” that:
(i) is controlled by a foreign adversary;5 and
(ii) that is determined by the President to present a
significant threat to the national security of the
United States following the issuance of — (I) a
public notice proposing such determination; and
(II) a public report to Congress, submitted not less
than 30 days before such determination, describing
the specific national security concern involved and
containing a classified annex and a description of
what assets would need to be divested to execute a
qualified divestiture.
§ 2(g)(3)(B).
2. Prohibitions
§ 2(a)(2)(B). In both situations, the President can grant a one-
time, 90-day extension under specific circumstances not rele-
vant here. § 2(a)(3).
E. Procedural History
II. Analysis
General from enforcing it. Because the petitioners are bringing
a pre-enforcement challenge to the Act, we must determine the
extent to which this court can consider their claims consistent
with the standing aspect of the “case or controversy” require-
ment of Article III of the Constitution. We conclude that
TikTok has standing to challenge those portions of the Act that
directly affect the activities of ByteDance and its affiliates. We
further conclude that TikTok’s challenge to those portions of
the Act is ripe.
“facially invalid under the First Amendment,” which need not
detain us.6 Creator Reply Br. 30–31.
6 The User Petitioners have not demonstrated that “a substantial
number of” the Act’s “applications are unconstitutional, judged in
relation to the statute’s plainly legitimate sweep.” Moody v.
NetChoice, LLC, 144 S. Ct. 2383, 2397 (2024) (cleaned up). Indeed,
the core of the Act — its application as to TikTok — is valid for the
reasons we explain in this opinion.
take effect by operation of law on January 19, 2025. After that
date, third parties that make the TikTok platform available in
the United States would run a significant risk of incurring
monetary penalties under § 2(d)(1). Even if the Act went unen-
forced, the risk of penalties alone could cause third parties to
suspend support for the TikTok platform, such as by removing
it from online marketplaces, and an injunction would prevent
that harm. TikTok therefore has Article III standing to pursue
its claims.
7 Having concluded that TikTok has standing, we need not separately
analyze whether the User Petitioners have standing to raise the same
claims. See Carpenters Indus. Council v. Zinke, 854 F.3d 1, 9 (D.C.
Cir. 2017) (explaining that “if constitutional standing can be shown
for at least one plaintiff, we need not consider the standing of the
other plaintiffs to raise that claim” (cleaned up)).
B. The First Amendment
Under intermediate scrutiny, the Act complies with the
First Amendment “if it advances important governmental inter-
ests unrelated to the suppression of free speech and does not
burden substantially more speech than necessary to further
those interests.” Turner Broad. Sys., Inc. v. FCC (Turner II),
520 U.S. 180, 189 (1997) (citing United States v. O’Brien, 391
U.S. 367, 377 (1968)). Under strict scrutiny, the Act violates
the First Amendment unless the Government can “prove that
the restriction furthers a compelling interest and is narrowly
tailored to achieve that interest.” Reed v. Town of Gilbert,
576 U.S. 155, 171 (2015) (cleaned up).
Enforcement of a generally applicable law unrelated to
expressive activity does not call for any First Amendment
scrutiny. Id. By contrast, the First Amendment is implicated in
“cases involving governmental regulation of conduct that has
an expressive element,” or when a statute is directed at an
activity without an expressive component but imposes “a
disproportionate burden upon those engaged in protected First
Amendment activities.” Id. at 703–04; see also Alexander v.
United States, 509 U.S. 544, 557 (1993).
The Government suggests that because TikTok is wholly
owned by ByteDance, a foreign company, it has no First
Amendment rights. Cf. Agency for Int’l Dev. v. All. for Open
Soc’y Int’l, Inc., 591 U.S. 430, 436 (2020) (explaining that
“foreign organizations operating abroad have no First
Amendment rights”). TikTok, Inc., however, is a domestic
entity operating domestically. See NetChoice, 144 S. Ct. at
2410 (Barrett, J., concurring) (identifying potential “complexi-
ties” for First Amendment analysis posed by the “corporate
structure and ownership of some platforms”). The Government
does not dispute facts suggesting at least some of the regulated
speech involves TikTok’s U.S. entities. See TikTok App. 811–
12, 817–18 (explaining that promoted videos are “reviewed by
a U.S.-based reviewer,” that an executive employed by a U.S.
entity approves the guidelines for content moderation, and that
the recommendation engine “is customized for TikTok’s vari-
ous global markets” and “subject to special vetting in the
United States”).
neutral or content based. See Turner Broad. Sys., Inc. v. FCC
(Turner I), 512 U.S. 622, 642 (1994) (explaining that “regula-
tions that are unrelated to the content of speech are subject to
an intermediate level of scrutiny, because in most cases they
pose a less substantial risk of excising certain ideas or view-
points from the public dialogue” (citation omitted)). A law is
content based if it “applies to particular speech because of the
topic discussed or the idea or message expressed.” Reed, 576
U.S. at 163. It is facially content based “if it targets speech
based on its communicative content.” City of Austin v. Reagan
Nat’l Advert. of Austin, LLC, 596 U.S. 61, 69 (2022) (cleaned
up). A law that “requires an examination of speech only in
service of drawing neutral, location-based lines” does not
target speech based upon its communicative content. Id.; see
BellSouth Corp. v. FCC (BellSouth I), 144 F.3d 58, 69 (D.C.
Cir. 1998) (applying intermediate scrutiny to a law that
“defines the field of expression to which it applies by reference
to a set of categories that might in a formal sense be described
as content-based”). Facial neutrality, however, does not end the
analysis. Even laws that are facially content neutral are content
based if they (a) “cannot be justified without reference to the
content of the regulated speech” or (b) “were adopted by the
government because of disagreement with the message the
speech conveys.” Reed, 576 U.S. at 164 (cleaned up).
TikTok insists the TikTok-specific provisions nonetheless
require strict scrutiny because they single out a particular
speaker. To be sure, laws that “discriminate among media, or
among different speakers within a single medium, often present
serious First Amendment concerns.” Turner I, 512 U.S. at 659.
“It would be error to conclude, however, that the First
Amendment mandates strict scrutiny for any speech regulation
that applies to one medium (or a subset thereof) but not others.”
Id. at 660; see, e.g., BellSouth I, 144 F.3d at 68 (rejecting argu-
ment that a statute “warrants strict First Amendment review
because it targets named corporations”). Strict scrutiny “is
unwarranted when the differential treatment is justified by
some special characteristic of the particular medium being
regulated.” Turner I, 512 U.S. at 660–61 (cleaned up). As of
now, the TikTok platform is the only global platform of its kind
that has been designated by the political branches as a foreign
adversary controlled application. As explained below, the
Government presents two persuasive national security
justifications that apply specifically to the platform that TikTok
operates. “It should come as no surprise, then, that Congress
decided to impose [certain restrictions] upon [TikTok] only.”
Id. at 661.
(1) to counter the PRC’s efforts to collect great quantities of
data about tens of millions of Americans, and (2) to limit the
PRC’s ability to manipulate content covertly on the TikTok
platform. The former does not reference the content of speech
or reflect disagreement with an idea or message. See Ward,
491 U.S. at 792 (finding justifications offered for a municipal
noise regulation content neutral). The Government’s explana-
tion of the latter justification does, however, reference the
content of TikTok’s speech. Specifically, the Government
invokes the risk that the PRC might shape the content that
American users receive, interfere with our political discourse,
and promote content based upon its alignment with the PRC’s
interests. In fact, the Government identifies a particular topic
— Taiwan’s relationship to the PRC — as a “significant
potential flashpoint” that may be a subject of the PRC’s
influence operations, and its declarants identify other topics of
importance to the PRC. Gov’t Br. 22 (quoting Gov’t App. 7
(Decl. of Asst. Dir. of Nat’l Intel. Casey Blackburn)); see also
Gov’t App. 9, 22.
in that way, the Government’s justification is wholly consonant
with the First Amendment.
8 We agree with our concurring colleague that the Government’s
data-protection rationale “is plainly content-neutral” and standing
alone would at most trigger intermediate scrutiny. Concurring Op.
12–13. As we have explained, however, that is not clear for the
Government’s content-manipulation justification, and no party has
identified any portion of the Act to which the data justification alone
applies. We therefore assume strict scrutiny applies to our review of
the Act in its entirety and consider both justifications under that
standard.
without deciding intermediate scrutiny rather than strict scru-
tiny should be applied, thereby setting “to one side the content
discrimination question”).
these circumstances, the provisions of the Act that are before
us withstand the most searching review.
through the TikTok platform in particular. As Assistant
Director of National Intelligence Casey Blackburn explained,
the “PRC is the most active and persistent cyber espionage
threat to U.S. government, private-sector, and critical
infrastructure networks.” Its hacking program “spans the
globe” and “is larger than that of every other major nation,
combined.” The PRC has “pre-positioned” itself “for potential
cyber-attacks against U.S. critical infrastructure by building
out offensive weapons within that infrastructure.” Consistent
with that assessment, the Government “has found persistent
PRC access in U.S. critical telecommunications, energy, water,
and other infrastructure.” See China Telecom (Ams.) Corp. v.
FCC, 57 F.4th 256, 262–63 (D.C. Cir. 2022) (describing the
Government’s shift in focus from terrorism to PRC “cyber
threats” and the risk posed by use of PRC-connected “infor-
mation technology firms as systemic espionage platforms”).
“The FBI now warns that no country poses a broader, more
severe intelligence collection threat than China.” Id. at 263.
health and genomic data on U.S. persons” by investing in firms
that have or have access to such data. Government
counterintelligence experts describe this kind of activity as a
“hybrid commercial threat.”
are one “part of the PRC’s broader geopolitical and long-term
strategy to undermine U.S. national security.”
The Government also identifies several public reports,
which were considered by the Congress prior to passing the
Act, regarding the risks posed by TikTok.9 For example, a
Government declarant points to “reporting by Forbes
Magazine” to illustrate in part why the Government did not
trust TikTok’s proposed mitigation measures. The reporting
suggested “that ByteDance employees abused U.S. user data,
even after the establishment of TTUSDS,” and drew attention
to “audio recordings of ByteDance meetings” that indicated
“ByteDance retained considerable control and influence over
TTUSDS operations.” In its report recommending passage of
the Act, a committee of the Congress collected “a list of public
statements that have been made regarding the national security
risks posed by . . . TikTok.” H.R. Rep. No. 118-417, at 5–12
(2024). According to the committee, public reporting sug-
gested that TikTok had stored sensitive information about U.S.
persons (including “Social Security numbers and tax identifica-
tions”) on servers in China; TikTok’s “China-based employ-
ees” had “repeatedly accessed non-public data about U.S.
TikTok users”; ByteDance employees had “accessed TikTok
user data and IP addresses to monitor the physical locations of
specific U.S. citizens”; and PRC agents had inspected
“TikTok’s internal platform.” Id. at 7–10.
9 Although our disposition of this case does not turn upon these
reports, the Congress and the President obviously were entitled to
consider such materials when deciding whether to define TikTok as
a foreign adversary controlled application under the Act. Indeed, we
have “approved” the use of similar public materials by the President
when making decisions to designate people or entities under various
national-security related statutes. See Zevallos v. Obama, 793 F.3d
106, 109, 113 (2015) (finding it “clear that the government may
decide to designate an entity based on a broad range of evidence,
including intelligence data and hearsay declarations” (quoting Holy
Land Found. for Relief & Dev. v. Ashcroft, 333 F.3d 156, 162 (2003)
(regarding designation of an entity as a Specially Designated Global
Terrorist))).
username, password, email, phone number, social media
account information, messages exchanged on the platform and,
“with your permission,” your “phone and social network con-
tacts.” Id. TikTok’s “privacy policy” also makes clear that it
uses these data to “infer additional information” about its users.
Given the magnitude of the data gathered by TikTok and
TikTok’s connections to the PRC, two consecutive presidents
understandably identified TikTok as a significant vulnerability.
Access to such information could, for example, allow the PRC
to “track the locations of Federal employees and contractors,
build dossiers of personal information for blackmail, and
conduct corporate espionage.” Addressing the Threat Posed by
TikTok, 85 Fed. Reg. at 48637.
on users’ “precise locations, viewing habits, and private mes-
sages,” including “data on users’ phone contacts who do not
themselves use TikTok.” Gov’t Br. 1; see TikTok Reply Br. 25.
According to a TikTok declarant, the current version of TikTok
can only “approximate users’ geographic locations.” Access to
a user’s contact list, likewise, is currently available only if a
user affirmatively opts in, and it is “anonymized and used only
to facilitate connections with other TikTok users.” TikTok
Reply Br. 25. TikTok further points to other data protections
that it claims to provide, such as storing sensitive user data in
the United States and controlling access to them.
companies “through a web of foreign affiliates.” 77 F.4th at
1163 (cleaned up). The Executive “concluded that China’s
ownership raised significant concerns that the [companies]
would be forced to comply with Chinese government
requests.” Id. (cleaned up). The Government was concerned
that the PRC could “access, monitor, store, and in some cases
disrupt or misroute U.S. communications, which in turn
[would] allow them to engage in espionage and other harmful
activities against the United States.” Id. (cleaned up). The
Executive “further concluded that the [companies] had shown
a lack of candor and trustworthiness” and therefore “nothing
short of revocation would ameliorate the national-security
risks.” Id. This court declined the appellants’ invitation to
“second-guess” the Executive’s judgment regarding the threat
to national security. Id. at 1164. We also upheld the
Executive’s conclusion that the companies’ “untrustworthiness
would make any mitigation agreement too risky” in part
because the Executive could not “comprehensively monitor
compliance” or “reliably detect surreptitious, state-sponsored
efforts at evasion.” Id. at 1165–66. The same considerations
similarly support the Government’s judgment here.
responsible for operating the platform in the United States
“approved sending U.S. data to China several times.” In short,
the Government’s concerns are well founded, not speculative.
a foreign adversary nation. At points, TikTok also suggests the
Government does not have a legitimate interest in countering
covert content manipulation by the PRC. To the extent that is
TikTok’s argument, it is profoundly mistaken. “At the heart of
the First Amendment lies the principle that each person should
decide for himself or herself the ideas and beliefs deserving of
expression, consideration, and adherence. Our political system
and cultural life rest upon this ideal.” Turner I, 512 U.S. at 641.
When a government — domestic or foreign — “stifles speech
on account of its message . . . [it] contravenes this essential
right” and may “manipulate the public debate through coercion
rather than persuasion.” Id.; see also Nat’l Rifle Ass’n of Am.,
602 U.S. at 187 (explaining that at the core of the First
Amendment “is the recognition that viewpoint discrimination
is uniquely harmful to a free and democratic society”).
concerns.” Humanitarian Law Project, 561 U.S. at 35. Rather
than attempting itself to influence the content that appears on a
substantial medium of communication, the Government has
acted solely to prevent a foreign adversary from doing so. As
our concurring colleague explains, this approach follows the
Government’s well-established practice of placing restrictions
on foreign ownership or control where it could have national
security implications. Concurring Op. 2–5; see 47 U.S.C.
§ 310(a)–(b) (restricting foreign control of radio licenses); Pac.
Networks Corp., 77 F.4th at 1162 (upholding the FCC’s deci-
sion to revoke authorizations to operate communications lines);
Moving Phones P’ship v. FCC, 998 F.2d 1051, 1055, 1057
(D.C. Cir. 1993) (upholding the Executive’s application of the
Communications Act’s “ban on alien ownership” of radio
licenses “to safeguard the United States from foreign influence
in broadcasting” (cleaned up)); see also Palestine Info. Off. v.
Shultz, 853 F.2d 932, 936, 945 (D.C. Cir. 1988) (upholding the
Executive’s divestiture order under the Foreign Missions Act
regarding an organization the activities of which “were deemed
inimical to America’s interests”); 49 U.S.C. § 40102(a)(2),
(15) (requiring that a U.S. “air carrier” be “under the actual
control of citizens of the United States”).
TikTok resists this conclusion by emphasizing stray com-
ments from the congressional proceedings that suggest some
congresspersons were motivated by hostility to certain content.
The Supreme Court, however, has repeatedly instructed that
courts should “not strike down an otherwise constitutional
statute on the basis of an alleged illicit legislative motive.”
O’Brien, 391 U.S. at 383; City of Renton v. Playtime Theatres,
Inc., 475 U.S. 41, 47–49 (1986) (rejecting speculation about
the “motivating factor” behind an ordinance justified without
reference to speech); Turner I, 512 U.S. at 652 (similar). The
Act itself is the best evidence of the Congress’s and the
President’s aim. The narrow focus of the Act on ownership by
a foreign adversary and the divestiture exemption provide
convincing evidence that ending foreign adversary control, not
content censorship, was the Government’s objective.
10 The parties offer competing interpretations of this exclusion.
Because we do not doubt the Government’s “proffered . . . interest
actually underlies the law” under either interpretation, we have no
occasion to interpret that provision in this case. Blount v. SEC,
61 F.3d 938, 946 (D.C. Cir. 1995) (quotation omitted).
“purpose-built the Act to ban TikTok because it objects to
TikTok’s content.” TikTok Reply Br. 28.
Second, TikTok contends the Government’s content-
manipulation rationale is speculative and based upon factual
errors. TikTok fails, however, to grapple fully with the
Government’s submissions. On the one hand, the Government
acknowledges that it lacks specific intelligence that shows the
PRC has in the past or is now coercing TikTok into manipulat-
ing content in the United States. On the other hand, the
Government is aware “that ByteDance and TikTok Global have
taken action in response to PRC demands to censor content
outside of China.” The Government concludes that ByteDance
and its TikTok entities “have a demonstrated history of
manipulating the content on their platforms, including at the
direction of the PRC.” Notably, TikTok never squarely denies
that it has ever manipulated content on the TikTok platform at
the direction of the PRC. Its silence on this point is striking
given that “the Intelligence Community’s concern is grounded
in the actions ByteDance and TikTok have already taken over-
seas.” It may be that the PRC has not yet done so in the United
States or, as the Government suggests, the Government’s lack
of evidence to that effect may simply reflect limitations on its
ability to monitor TikTok.
for claiming the recommendation engine is “based in China”
because it now resides in the Oracle cloud. TikTok Reply Br.
21–22. No doubt, but the Government’s characterization is
nonetheless consistent with TikTok’s own declarations.
TikTok’s declarants explained that now and under its proposed
NSA “ByteDance will remain completely in control of
developing the Source Code for all components that comprise
‘TikTok’ . . . including the Recommendation Engine.” They
likewise represent that TikTok presently “relies on the support
of employees of other ByteDance subsidiaries” for code
development. Even when TikTok’s voluntary mitigation
measures have been fully implemented, the “source code
supporting the TikTok platform, including the recommenda-
tion engine, will continue to be developed and maintained by
ByteDance subsidiary employees, including in the United
States and in China.” TikTok is therefore correct to say the
recommendation engine “is stored in the Oracle cloud,” but
gains nothing by flyspecking the Government’s characteriza-
tion of the recommendation engine still being in China.
in the United States. E.g., Pac. Networks Corp., 77 F.4th at
1162; Moving Phones P’ship, 998 F.2d at 1055–56. The
petitioners argue nonetheless that there are less restrictive
alternatives available and contend the Act is fatally both
overinclusive and underinclusive.
courts to substitute their judgments for those of the political
branches on questions of national security. See Hernández v.
Mesa, 589 U.S. 93, 113 (2020). Understandably, TikTok
therefore attempts to couch its disagreement in factual terms.
But TikTok does not present any truly material dispute of fact.
independent of ByteDance, fears TTUSDS could be pressured
to do the latter’s bidding, and doubts TTUSDS could prevent
interference by ByteDance. Indeed, the Government predicts
that “TTUSDS personnel here would not resist demands to
comply” with directives “even if aware of pressure from the
PRC government.” Whether TTUSDS sufficiently mitigates
the risk of PRC interference through ByteDance is ultimately
an issue of judgment, not of fact.
Branch officials “conducted dozens of meetings,” considered
“scores of drafts of proposed mitigation terms,” and engaged
with TikTok as well as Oracle for more than two years in an
effort to work out an acceptable agreement. Here “respect for
the Government’s conclusions is appropriate.” Humanitarian
Law Project, 561 U.S. at 34.
Congress did not reject the proposed NSA for lack of familiar-
ity; like the Executive, the Congress found it wanting.
The first two suggestions obviously fall short. As the
Government points out, covert manipulation of content is not a
type of harm that can be remedied by disclosure. The idea that
the Government can simply use speech of its own to counter
the risk of content manipulation by the PRC is likewise naïve.
Moreover, the petitioners’ attempt to frame the use of
Government speech as a means of countering “alleged foreign
propaganda,” Creator Br. 54, is beside the point. It is the “secret
manipulation of the content” on TikTok — not foreign propa-
ganda — that “poses a grave threat to national security.” Gov’t
Br. 36. No amount of Government speech can mitigate that
threat nearly as effectively as divestiture.
For similar reasons, a limited prohibition addressing
government employees would not suffice. The Government’s
concern extends beyond federal employees to “family mem-
bers or potential future government employees (many of whom
may be teenagers today, a particular problem given TikTok’s
popularity among young people).” Indeed, as the Government
emphasizes, the Congress was legislating “in the interest of all
Americans’ data security.” Gov’t Br. 58. A more limited
prohibition would not be as effective as divestiture.
various platforms including TikTok but does not collect user
data or present an opportunity for PRC manipulation of con-
tent. Given the Government’s well-supported concerns about
ByteDance, it was necessary for the Act to apply to all
ByteDance entities. Moreover, the petitioners fail to demon-
strate that neither of the Government’s two national security
concerns implicates CapCut. We therefore conclude the
TikTok-specific provisions of the Act are not overinclusive.
* * *
To summarize our First Amendment analysis: The
Government has provided two national security justifications
for the Act. We assumed without deciding the Act is subject to
strict scrutiny and we now uphold the TikTok-specific portions
of the Act under each justification. This conclusion is sup-
ported by ample evidence that the Act is the least restrictive
means of advancing the Government’s compelling national
security interests.
C. Equal Protection
TikTok argues that the Act violates its right to the equal
protection of the laws because it singles out TikTok for disfa-
vored treatment relative to other similarly situated platforms.
The Government contends its justifications for the Act satisfy
the requirement of equal protection and adds that TikTok
received more process than a company would receive under the
generally applicable provisions. We conclude the Act is con-
sistent with the requirement of equal protection.
News America does not require strict scrutiny for “statutes
singling out particular persons for speech restrictions”); Cmty-
Serv. Broad. of Mid-Am., Inc., 593 F.2d at 1122 (applying to a
“statute affecting First Amendment rights” an “equal protec-
tion standard [that] is closely related to the O’Brien First
Amendment tests”). Having concluded the relevant parts of the
Act do not violate the First Amendment even when subjected
to heightened scrutiny, we readily reach the same conclusion
when analyzing the Act in equal protection terms.
Am. Pub., Inc., 844 F.2d at 815. That case involved legislation
that regulated waivers of the rule against newspaper-television
cross-ownership in a way that targeted a single person “with
the precision of a laser beam.” Id. at 814. The legislation,
however, bore “only the most strained relationship to the pur-
pose hypothesized by the [Government].” Id. Here, by contrast,
the Act bears directly on the TikTok-specific national security
harms identified and substantiated by the Government.
can fairly be deemed a punishment. We conclude the Act is not
a punishment under any of the three tests used to distinguish a
permissible burden from an impermissible punishment.
reasons the prohibitions of the Act are close analogs to two
categories of legislative action historically regarded as bills of
attainder: confiscation of property and legislative bars to
participation in a specific employment or profession. See
BellSouth II, 162 F.3d at 685 (explaining the historical
understanding of punishment). According to TikTok, the Act
effectively requires TikTok to relinquish its property or see it
rendered useless, and it precludes TikTok from continuing to
participate in a legitimate business enterprise. As already
explained, however, the Act requires a divestiture — that is, a
sale, not a confiscation — as a condition of continuing to
operate in the United States. See BellSouth I, 144 F.3d at 65
(explaining that although “structural separation is hardly
costless, neither does it remotely approach the disabilities that
have traditionally marked forbidden attainders”); see also
Kaspersky Lab, Inc., 909 F.3d at 462–63 (comparing a law
requiring the Government to remove from its systems a Russia-
based company’s software to the business regulations in the
BellSouth cases). Nor is the divestiture requirement analogous
to a legislative bar on someone’s participation in a specific
employment or profession. See Kaspersky Lab, Inc., 909 F.3d
at 462 (rejecting a similar analogy in part “because human
beings and corporate entities are so dissimilar” (cleaned up)).
combinations of business endeavors”). In fact, BellSouth II all
but forecloses TikTok’s argument by recognizing that a
“statute that leaves open perpetually the possibility of
[overcoming a legislative restriction] does not fall within the
historical meaning of forbidden legislative punishment.”
162 F.3d at 685 (quoting Selective Serv. Sys. v. Minn. Pub. Int.
Rsch. Grp., 468 U.S. 841, 853 (1984)) (brackets in original).
The qualified divestiture exemption does just that. It “leaves
open perpetually” the possibility of overcoming the prohibi-
tions in the Act: TikTok can execute a divestiture and return to
the U.S. market at any time without running afoul of the law.
The Act also passes muster under the functional test. For
purposes of this analysis, the “question is not whether a burden
is proportionate to the objective, but rather whether the burden
is so disproportionate that it belies any purported nonpunitive
goals.” Kaspersky Lab, Inc., 909 F.3d at 455 (cleaned up).
Considering our conclusion that the Act passes heightened
scrutiny for purposes of the First Amendment, it obviously
satisfies the functional inquiry here: The Act furthers the
Government’s nonpunitive objective of limiting the PRC’s
ability to threaten U.S. national security through data collection
and covert manipulation of information. The Government’s
solution to those threats “has the earmarks of a rather conven-
tional response to a security risk: remove the risk.” Id. at 457
(cleaned up). In other words, the Government’s attempt to
address the risks posed by TikTok reflects a forward-looking
prophylactic, not a backward-looking punitive, purpose. That
is sufficient to satisfy the functional analysis. See id. at 460
(stating the functional test “does not require that the Congress
precisely calibrate the burdens it imposes to the goals it seeks
to further or to the threats it seeks to mitigate” (cleaned up)).
legislative history,” congressional intent to punish is difficult
to establish. Id. at 463 (cleaned up); see also BellSouth II,
162 F.3d at 690 (“Several isolated statements are not sufficient
to evince punitive intent” (cleaned up)). Indeed, the motiva-
tional test is not “determinative in the absence of unmistakable
evidence of punitive intent.” Id. (cleaned up). TikTok does not
come close to satisfying that requirement. We therefore con-
clude the Act does not violate the Bill of Attainder Clause
under any of the relevant tests.
authorizes a qualified divestiture before (or after) any prohibi-
tions take effect. That presents TikTok with a number of
possibilities short of total economic deprivation. ByteDance
might spin off its global TikTok business, for instance, or it
might sell a U.S. subset of the business to a qualified buyer.
F. Alternative Relief
57 F.4th at 264 (similarly relying on an unclassified record).
Notwithstanding the significant effect the Act may have on the
viability of the TikTok platform, we conclude the Act is valid
based upon the public record.11
III. Conclusion
11 Accordingly, we grant the Government’s motion for leave to file
classified materials and direct the Clerk to file the lodged materials,
though we do not rely on them in denying the petitions.
A.
The first communications medium capable of reaching
mass audiences in real time—radio—was subject to restrictions
on foreign ownership and control from the very outset. The
Radio Act of 1912 required radio operators engaged in
interstate (or international) communications to obtain a license
from the Secretary of Commerce and Labor, but Congress
made licenses available only to U.S. citizens or companies.
Pub. L. No. 62-264, §§ 1–2, 37 Stat. 302, 302–03 (repealed
1927). Congress then extended the restrictions to encompass
foreign control (not just foreign ownership) in the Radio Act of
1927, prohibiting licensing of any company if it had a foreign
officer or director or if one-fifth of its capital stock was in
foreign hands. Pub. L. No. 69-632, § 12, 44 Stat. 1162, 1167
(repealed 1934).
Section 310 continues to restrict foreign control of radio
licenses, including ones used for broadcast communication and
wireless cellular services. See 47 U.S.C. § 310(a)–(b). And
while that provision regulates wireless licenses, limitations on
foreign control also exist for wired transmission lines under
Section 214 of the same law. 47 U.S.C. § 214(a); see also id.
§ 153(11), (50)–(53).
15966 (2021); China Mobile Int’l (USA), 34 FCC Rcd. 3361
(2019). This court has affirmed those FCC decisions. See Pac.
Networks Corp. v. FCC, 77 F.4th 1160 (D.C. Cir. 2023); China
Telecom, 57 F.4th 256.
degree with modern communications media than with
traditional broadcast (e.g., the vastly enhanced potential for
collection of data from and about users).
B.
that regulation of commercial speech has been subject to
intermediate scrutiny even when content based). Content-
neutral laws, on the other hand, present less substantial First
Amendment concerns and so generally trigger, at most,
intermediate scrutiny. See Turner Broad. Sys., Inc. v. FCC,
512 U.S. 622, 642 (1994) (Turner I).
To the extent the PRC or ByteDance might wish to adjust
the content viewed by U.S. users of TikTok, those curation
decisions would be made abroad. See Milch Decl. ¶ 29
(TikTok App. 661) (explaining that TikTok’s proposed
security measures contemplate “continued reliance on
ByteDance engineers for . . . its recommendation engine”).
The PRC and ByteDance thus would lack any First
Amendment rights in connection with any such curation
actions. Agency for Int’l Dev., 591 U.S. at 436. That is true
even though the PRC or ByteDance, in that scenario, would
aim their curation decisions at the United States. The Supreme
Court’s decision in Agency for International Development
demonstrates the point.
documents” exchanged with the U.S. Agency for International
Development. See Agency for Int’l Dev., 570 U.S. at 210.
C.
1.
TikTok also claims that TikTok Inc.’s deployment of the
platform’s recommendation engine in the U.S. is itself an
expressive decision. Even assuming so, after divestment, a
non-Chinese-controlled TikTok could still use the same
algorithm to promote the same exact mix of content presently
appearing on the app. According to TikTok, however, Chinese
law would prevent the export of the algorithm fueling the
recommendation engine without the PRC’s approval, which it
would not grant. TikTok Br. 24. The Act, though, would not
dictate that outcome. Rather, the PRC, backed by Chinese law,
would. And Congress of course need not legislate around
another country’s preferences to exercise its own powers
constitutionally—much less the preferences of a designated
foreign adversary, the very adversary who Congress
determined poses the fundamental threat to national security
prompting the Act in the first place.
2.
As my colleagues explain, the Act’s divestment mandate
rests on two justifications, both of which concern the PRC’s
ability (through its control over ByteDance) to exploit the
TikTok platform in ways inimical to U.S. national security.
See Op., ante, at p. 33. First, the PRC could harvest abundant
amounts of information about the 170 million U.S. app users
and potentially even their contacts. Second, the PRC could
direct the TikTok platform to covertly manipulate the content
flowing to U.S. users. To the government, a foreign
adversary’s ability to acquire sensitive information on
Americans and secretly shape the content fed to Americans
would pose a substantial threat to U.S. national security.
a.
the First Amendment challenge to the bookstore’s closure
without even applying intermediate scrutiny.
b.
It is important to keep in mind, though, that Congress’s
covert-content-manipulation concern does not stand alone.
There is also its distinct data-protection interest that supports
applying (at most) intermediate scrutiny, along with the
consistent regulatory history of restricting foreign control of
mass communications channels that likewise weighs in favor
of intermediate scrutiny. So, the question ultimately is not
whether the covert-content-manipulation concern itself would
occasion applying strict scrutiny, but rather whether it so
strongly and clearly does that it overcomes the other important
considerations counseling against strict scrutiny. I believe it
does not.
Congress would also have concerns about the PRC sowing
discord in the United States by promoting videos—perhaps
even primarily truthful ones—about a hot-button issue having
nothing to do with China. Indeed, because the concern is with
the PRC’s manipulation of the app to advance China’s
interests—not China’s views—one can imagine situations in
which it would even serve the PRC’s interests to augment anti-
China, pro-U.S. content. Suppose, for instance, the PRC
determines that it is in its interest to stir an impression of
elevated anti-China sentiment coming from the United
States—say, to conjure a justification for actions China would
like to take against the United States. That would qualify as
covert content manipulation of the kind that concerned
Congress and supports the Act’s divestment mandate.
restriction. As applied here, what matters is whether a
particular potential curator, the PRC, has the ability to control
(covertly) the content fed to TikTok’s U.S. users, regardless of
what the content may be. True, “laws favoring some speakers
over others demand strict scrutiny” when the “speaker
preference reflects a content preference.” Reed, 576 U.S. at
171 (quoting Turner I, 512 U.S. at 658). But here, the speaker
(non)preference is not grounded in a content preference.
decision “on the narrow ground that the addressee in order to
receive his mail must request in writing that it be delivered.”
Id. at 307. That obligation amounted to “an unconstitutional
abridgement of the addressee’s First Amendment rights,”
because “any addressee is likely to feel some inhibition in
sending for literature which federal officials have condemned
as ‘communist political propaganda.’” Id.
* * *
D.
1.
a.
As TikTok does not dispute, the platform collects vast
amounts of information from and about its American users.
See TikTok App. 820; Privacy Policy, TikTok (Aug. 28, 2024),
https://perma.cc/XE6G-F86Q. The government’s national-
security concerns about the PRC’s access to that data take two
forms. First, the PRC could exploit sensitive data on individual
Americans to undermine U.S. interests, including by recruiting
assets, identifying Americans involved in intelligence, and
pressuring and blackmailing our citizens to assist China.
Second, the vast information about Americans collected by
TikTok amounts to the type of “bulk” dataset that could
“greatly enhance” China’s development and use of “artificial
intelligence capabilities.” Vorndran Decl. ¶ 32 (Gov’t App.
37).
government explains, Congress’s data-security concern arises
against a backdrop of broadscale “overt and covert actions” by
the PRC “to undermine U.S. interests,” Blackburn Decl. ¶ 23
(Gov’t App. 8). Collecting data on Americans is a key part of
that multi-faceted strategy. See id. ¶¶ 31–33 (Gov’t App. 10–
11). The PRC has engaged in extensive efforts to amass data
on Americans for potential use against U.S. interests. Id. ¶ 31
(Gov’t App. 10–11). And the PRC “is rapidly expanding and
improving its artificial intelligence and data analytics
capabilities for intelligence purposes,” enabling it to exploit
access to large datasets in increasingly concerning ways. Id.
¶ 30 (Gov’t App. 10).
b.
(Gov’t App. 38). And were the PRC to exert that kind of covert
control over the content on TikTok, it would be “difficult—if
not impossible—to detect, both by TikTok users and by law
enforcement personnel.” Id. ¶ 34 (Gov’t App. 38). In that
context, Congress’s concern with preventing the PRC’s covert
content manipulation of the platform readily qualifies as an
important, well-founded governmental interest.
objective “to correct the mix of speech that the major social-
media platforms present,” so as “to advance [the states’] own
vision of ideological balance.” Id. at 2407. The Court
explained that such an interest “is very much related to the
suppression of free expression, and it is not valid, let alone
substantial.” Id. (citing Buckley v. Valeo, 424 U.S. 1, 48–49
(1976) (per curiam)).
While that alone sets this case apart from NetChoice, see
144 S. Ct. at 2408 n.10, it also bears emphasis that the laws at
issue in NetChoice did not serve a distinct interest entirely
unrelated to the suppression of free expression. Here, on the
other hand, the Act rests in significant measure on Congress’s
data-protection interest, an interest indisputably having no
relation to the suppression of speech. For that reason as well,
NetChoice poses no obstacle to concluding that the Act serves
important governmental interests for purposes of intermediate
scrutiny.
2.
and covertly manipulating content on the platform. The Act
will bring about the severing of PRC control of the TikTok
platform in the United States, either through a divestment of
that control, or, if no qualifying divestment takes place, through
a prohibition on hosting or distributing a still-PRC-controlled
TikTok in the United States until a qualifying divestment
occurs. The divestment mandate is “not substantially broader
than necessary to achieve” Congress’s national-security
objectives. Ward, 491 U.S. at 800.
Here, Congress reasonably determined that attaining the
requisite degree of protection required mandating a divestment
of PRC control. A “disagreement over the level of
protection . . . to be afforded and how protection is to be
attained” does not constitute a basis for “displac[ing] Congress’
judgment” when applying intermediate scrutiny. Turner II,
520 U.S. at 224. And Congress’s resolution here is in line with
other situations in which national-security concerns can call for
divestment of a foreign country’s control over a U.S. company.
See 50 U.S.C. § 4565(d)(1); H.R. Rep. No. 118-417, at 5–6 &
n.26.
14. With specific regard to the provisions contained in the
proposed NSA, “senior Executive Branch officials concluded
that the terms of ByteDance’s final proposal would not
sufficiently ameliorate those risks.” Newman Decl. ¶ 6 (Gov’t
App. 46). The provisions, in the Executive Branch’s view,
“still permitted certain data of U.S. users to flow to China, still
permitted ByteDance executives to exert leadership control and
direction over TikTok’s US operations, and still contemplated
extensive contacts between the executives responsible for the
TikTok U.S. platform and ByteDance leadership overseas.” Id.
¶ 75 (Gov’t App. 62). And, the Executive Branch assessed, the
NSA “would have ultimately relied on . . . trusting ByteDance”
to comply, but “the requisite trust did not exist.” Id. ¶¶ 75, 86
(Gov’t App. 62, 68).
* * * * *
To give effect to those competing interests, Congress
chose divestment as a means of paring away the PRC’s
control—and thus containing the security threat—while
maintaining the app and its algorithm for American users. But
if no qualifying divestment occurs—including because of the
PRC’s or ByteDance’s unwillingness—many Americans may
lose access to an outlet for expression, a source of community,
and even a means of income.