Facebook, Defamation, and Terrorism:
Who Is Responsible for Dangerous Posts on
Social Media?
Caitlin McKeown*
I. INTRODUCTION
Technology and the Internet have created a whole new virtual world
for the average person to explore. With the click of a button, people can
share their thoughts, photos, and opinions, all while quickly analyzing
information posted by other users, news sites, companies, and
politicians. While the Internet and social media have facilitated
communication and opened the door to virtual exploration, there is an
ever-growing trend of threats to privacy and, more seriously, breaches of
national security. Companies like Facebook have expanded beyond
comprehension, with an average of 1.37 billion daily active users
worldwide as of June 2017.1 It is nearly impossible for Facebook and
similar sites, such as Twitter, Google, and Myspace, to micromanage
individual posts on such a massive, global scale.
* © 2017 Caitlin McKeown. J.D. candidate 2018, Tulane University Law School; B.A.
2013, The University of Chicago. I would like to thank my fellow members of the Tulane Journal
of International and Comparative Law for all of their hard work and encouragement. In addition,
I would like to dedicate my Comment to my parents, who have continuously shown their love and
support throughout law school.
1. Company Info, FACEBOOK NEWSROOM, http://newsroom.fb.com/company-info/ (last
visited Nov. 22, 2017).
As a result, posts promulgating political insurgency, terrorism, and false,
incendiary content have plagued the Internet in recent years, raising the
important question of who is responsible for this information and what
avenues are available to stop it. Unfortunately, the answer remains
unclear both in the United States and abroad.
In the United States, this question is coupled with an analysis of
free speech under the Constitution, as well as a federal immunity statute
known as the Communications Decency Act (CDA), which removes
liability from online publishers for defamatory or incendiary posts made
by third party users.2 In the past year, a variety of lawsuits have been
filed against tech giants like Facebook and Twitter under §§ 2339A and
2339B of the Patriot Act, better known as the Material Support Statute.3
This statute has clashed with the CDA as families of terrorism victims
sue these corporate giants in the hope of holding websites liable for
disseminating terrorist propaganda, albeit unknowingly, through their
platforms.4 However, courts have been wary of accepting these material
support arguments, continuing to favor immunity for corporations as
mere publishers of the content.5
This Comment will examine the origins of the CDA in the United
States alongside developments in antiterrorism legislation, including the
Material Support Statute of the U.S. Patriot Act, and its application to
social media. Specifically, it will analyze the two statutes in detail and
discuss how plaintiffs in antiterrorism litigation are finding new ways to
sue Internet sites under the Material Support Statute, as well as how
these websites rebut such claims by invoking the long-standing immunity
of the CDA. Furthermore, this Comment will examine potential
loopholes around the CDA via suggested changes to both the statute and
the case law, including making websites liable for dangerous content
through site-wide report-and-remove systems, amending the CDA, and
applying the more plaintiff-friendly precedent of the Ninth Circuit.6
Finally, this Comment will examine similar litigation abroad,
specifically a German case filed by a Syrian refugee whose "selfie" with
Chancellor Angela Merkel led to an onslaught of false posts linking him
to terrorist attacks.7
II. BACKGROUND
7. See Melissa Eddy, How a Refugee's Selfie with Merkel Led to a Facebook Lawsuit,
N.Y. TIMES (Feb. 6, 2017), https://www.nytimes.com/2017/02/06/business/syria-refugee-anas-
modamani-germany-facebook.html?_r=0.
8. See Telecommunications Act of 1996, Pub. L. 104-104, 110 Stat. 56, Title V, § 502
(1996).
9. William A. Sodeman, Communications Decency Act (CDA), United States [1996],
ENCYCLOPEDIA BRITANNICA (Nov. 24, 2016), https://www.britannica.com/topic/Communications-
Decency-Act.
10. Id.; see also Telecommunications Act of 1996 § 502(d)(5)(A)-(B), which specifically
states:
[I]t is a defense to a prosecution under [the Act] ... [if] a person (A) has taken, in
good faith, reasonable, effective, and appropriate actions under the circumstances to
restrict or prevent access by minors to a communication specified in such subsections,
which may involve any appropriate measures to restrict minors from such
communications, including any method which is feasible under available technology;
or (B) has restricted access to such communication by requiring use of a verified credit
card, debit account, adult access code, or adult personal identification number.
[T]he CDA lacks the precision that the First Amendment requires when a
statute regulates the content of speech. In order to deny minors access to
potentially harmful speech, the CDA effectively suppresses a large amount
of speech that adults have a constitutional right to receive and to address to
one another. That burden on adult speech is unacceptable if less restrictive
alternatives would be at least as effective in achieving the legitimate
purpose that the statute was enacted to serve.20
In conclusion, the Court found that the CDA did more to hinder free
speech on the Internet than to encourage it.21 This prompted revision of
the statute and the official removal of the term "indecent" from the
CDA's language.22
It is important to note that while the Supreme Court found the CDA
unconstitutional for its broad prohibition on obscene materials in light of
free speech, the CDA still upholds the principle that online defamation is
illegal and punishable by law.23 This portion of the CDA survived the
Supreme Court's gutting of the statute, and most online defamation
claims are governed under its scope. However, the modern § 230 makes
a clear distinction for third party ISPs in defamation claims, a source of
major contention in recent lawsuits addressing issues of individual
privacy and national security online.
20. Id.
21. Id.
22. Id.
23. The Communications Decency Act and Its Effect on Online Libel, REPUTATION
HAWK, https://www.reputationhawk.com/communicationsdecencyact.html (last visited Mar. 21,
2016).
24. See 47 U.S.C. § 230(c)(1) (1998).
25. Id.
26. Timothy L. Alger, The Communications Decency Act: Making Sense of the Federal
Immunity for Online Services, 59 ORANGE COUNTY L. 30, 31 (2017).
27. Id.
even when the provider is put on notice that the content exists.28 While
the CDA does not bar all types of civil and criminal claims, it has
opened the door for ISPs, allowing the unprecedented growth of social
media sites and search engines largely unhindered by any regulation of,
or responsibility for, the content posted throughout their websites.29
When discussing the CDA, it is important to understand the distinction
between interactive computer services (ICS) and information content
providers (ICP), as these have very different implications under the CDA's
scheme.30 While both fall under the general definition of an ISP, they
have distinct differences. An ICS is "any information service, system, or
access software provider that provides or enables computer access by
multiple users to a computer server."31 It is widely accepted that search
engines, like Google, and social media sites, like Facebook and Twitter,
fall under the statute's definition of an ICS.32 In contrast, an ICP is "any
person or entity that is responsible, in whole or in part, for the creation or
development of information provided through the Internet or any other
interactive computer service."33 Essentially, the difference is that the
former merely provides the Internet platform allowing access to the data,
while the latter is responsible for the actual information being created
and distributed. This means that ICPs are not granted federal immunity
under the CDA like ICSs are.34
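To make the statutory scheme just described more concrete, the following is a minimal, purely illustrative sketch in Python of how the ICS/ICP distinction drives the § 230(c)(1) immunity analysis. The entity name, field names, and example facts are hypothetical assumptions introduced for illustration; they are not drawn from the statute's text or from any court's holding.

# Illustrative sketch only: a rough encoding of the Section 230(c)(1)
# analysis described above. The classifications and claim details are
# hypothetical examples, not statements of how any court labeled a party.

from dataclasses import dataclass

@dataclass
class Claim:
    defendant: str
    defendant_is_ics: bool               # interactive computer service (platform)
    content_author_is_third_party: bool  # information provided by another ICP
    treats_defendant_as_publisher: bool  # claim targets publishing or editorial conduct

def cda_immunity_applies(claim: Claim) -> bool:
    """Returns True when the three commonly cited prerequisites for
    Section 230(c)(1) immunity are all present."""
    return (
        claim.defendant_is_ics
        and claim.content_author_is_third_party
        and claim.treats_defendant_as_publisher
    )

# Hypothetical example: a defamation claim against a social media platform
# over a post written entirely by a third-party user.
example = Claim(
    defendant="SocialSite",
    defendant_is_ics=True,
    content_author_is_third_party=True,
    treats_defendant_as_publisher=True,
)
print(cda_immunity_applies(example))  # True: platform treated as an immune publisher

Under these assumptions, a plaintiff defeats immunity only by negating one of the three elements, which is why so much of the litigation discussed below turns on whether the website acted as an ICP or merely as a publisher of another party's content.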
The first major case to address publisher immunity under the CDA,
and the case that set the precedent for CDA litigation for many years,
was Zeran v. America Online, Inc. in the U.S. Court of Appeals for the
Fourth Circuit.35 In Zeran, a man sued AOL for negligence after he
received an array of death threats when an unidentified user falsely
linked his phone number to the sale of offensive apparel relating to the
Oklahoma City bombing.36 The t-shirts and accessories in question were
posted for sale via an AOL bulletin board, where Zeran's phone number
was listed as the contact for sales.37 The post resulted in the local newspaper and
local radio shows condemning the apparel and encouraging readers and
listeners to call the number listed in the post.
28. Id.
29. Id. at 31-32.
30. Id. at 31; see also Spiccia, supra note 6, at 385-86.
31. 47 U.S.C. § 230(f)(2) (1998).
32. Alger, supra note 26, at 31.
33. 47 U.S.C. § 230(f)(3).
34. Alger, supra note 26, at 33.
35. See Zeran v. Am. Online, Inc., 129 F.3d 328, 329 (4th Cir. 1997).
36. Id.
37. Id.
38. Id.
39. Id.
40. Id. at 328.
41. Id.
42. Id. at 330-31.
43. Id. at 332-33.
44. Id. at 330-31.
45. Id. at 331.
46. Id.
47. Id.
48. Doe v. MySpace, Inc., 528 F.3d 415, 416 (5th Cir. 2008).
third party content, with the plaintiff's recourse being to sue the third
party posters themselves.49 In another famous case, Blumenthal v.
Drudge, the court found that even though the ISP in question
maintained minor editorial control over the published content, the
defendant ISP was still immune under § 230.50 In Blumenthal, the
plaintiffs sued Drudge and AOL for a gossip column Drudge wrote and
distributed through AOL's services.51 The court found that while Drudge
was responsible for writing the content, AOL remained immune from the
suit as the publisher of the information, even though it held certain
"editorial rights with respect to the content provided by Drudge and
disseminated by AOL, including the right to require changes in content
and to remove it."52 As such, the court upheld the idea that an ISP can
exercise some editorial control while still maintaining its immune status
under the CDA.53 In sum, years of litigation have distilled CDA immunity
into specific categories that set a high bar for any plaintiff seeking to
prove liability against ISPs in civil suits.54
While efforts to combat the CDA's publisher immunity clause
seemed futile throughout the 1990s and early 2000s, recent developments
in the Ninth Circuit have provided hope for plaintiffs in cases involving
online defamation and predation.55 In 2009, the Ninth Circuit created a
loophole around CDA immunity for ISPs by supporting a claim of
promissory estoppel in Barnes v. Yahoo!, Inc.56 Barnes filed suit against
Yahoo after repeatedly requesting that Yahoo remove pornographic
photos of herself posted by her ex-boyfriend in Yahoo chatrooms.57 After
many attempts to get the photos removed, Yahoo responded that it was in
58. Id.
59. Id.
60. Id. at 1108-09.
61. Alger, supra note 26, at 32-33.
62. See Jane Doe No. 14 v. Internet Brands, Inc., 824 F.3d 848, 848 (9th Cir. 2016).
63. Id. at 851.
64. Id.
65. Id.
66. Alger, supra note 26, at 34; see Fair Hous. Council of San Fernando Valley v.
Roommates.com, LLC, 521 F.3d 1161, 1175 (9th Cir. 2008).
67. Fair Hous., 521 F.3d at 1168.
doing so."94 Instead, the statute limits the plaintiffs' ability to provide
support, whether tangible, like monetary contributions, or intangible,
like training and education via verbal communications.95 Essentially, the
Court reasoned that speech that provides sensitive information or specific
knowledge, or that trains someone in a particular skill, furthers terrorism
in this type of scenario.96
The Holder decision has been widely criticized by activist groups
and politicians alike, who argue that its severe stance toward most types
of generalized assistance and educational training involving these groups
is harmful to civilians directly affected by foreign conflicts, who often
seek out such organizations for shelter and aid.97 Many also argue that it
disrupts potential peace discussions by limiting most types of beneficial
contact with U.S.-designated terrorist groups.98 In the years following the
Holder decision, the rise of the Internet has made it exceptionally simple
for terrorist organizations to disseminate information and garner support
through the web.99 The lack of background checks on social media sites
has allowed terrorist organizations to create numerous profiles throughout
Twitter and Facebook, with one recent study finding that ISIS controlled
approximately 30,000 Twitter accounts in 2014.100 In
light of these recent events, legal scholars have begun to wonder whether
providing social media accounts to terrorists, like those provided via
Facebook and Twitter, should constitute a violation of § 2339B.101
Furthermore, the difficulty in suing a social media giant under § 2339B
of the material support statute is the requirement that the site have the
requisite knowledge of, or some role in, the organization and its illegal
or violent activities, as well as direct contact with the terrorist
organization itself in the creation of the webpage or online profile.102
Naturally, given the massive scale of Internet sites like Facebook and
Twitter, courts have remained more lenient towards online
service providers in this context because of the difficulty, nay
impossibility, of monitoring every individual account and post on such
platforms.
94. Id. at 2730.
95. Id.
96. Id. at 2743.
97. See Adam Liptak, Court Affirms Ban on Aiding Groups Tied to Terror, N.Y. TIMES
(June 21, 2010), http://www.nytimes.com/2010/06/22/us/politics/22scotus.html.
98. Id.
99. RUANE, supra note 54, at 17.
100. Id.
101. Id.
102. Id. at 18-20.
103. Id.
104. See, e.g., Fields v. Twitter, Inc., No. 16-CV-00213-WHO, 2016 WL 6822065, at *1
(N.D. Cal. Nov. 18, 2016).
105. Id.
106. See 18 U.S.C. § 2333 (2009); 18 U.S.C. § 2339A (2009); 18 U.S.C. § 2339B (2015);
47 U.S.C. § 230 (1998).
107. See 18 U.S.C. § 2333(a) (2016).
108. RUANE, supra note 54, at 20.
109. See generally Blumenthal v. Drudge, 992 F. Supp. 44, 52 (D.D.C. 1998).
110. Fields v. Twitter, Inc., No. 16-CV-00213-WHO, 2016 WL 6822065, at *1 (N.D. Cal.
Nov. 18, 2016).
111. RUANE, supra note 54, at 22; see also Fields, 2016 WL 6822065, at *1.
112. See Fields, 2016 WL 6822065, at *1.
113. Id.
114. Id.
115. Id. at *2.
116. Id. at *1-2.
117. Id.
118. RUANE, supra note 54, at 22.
content itself, they were immune under the statute and could not be held
liable in relation to a federal criminal offense, such as those punishable
under §§ 2339A and 2339B.119 The plaintiffs presented two major
arguments in response to Twitter's claims of immunity.120 First,
the plaintiffs argued that content was not the issue in this scenario but
rather Twitter's "provision of services" and the profiles that were freely
given to ISIS.121 Second, they invoked the "direct messaging service" that
Twitter offered, arguing that these private messages do not amount to
published materials and thus fall outside the immunity § 230 provides for
such communications.122 The court rejected both arguments, relying on
prior § 230 case law discussing the broad protection the statute provides
for third party content on a website such as Twitter.123 The court,
invoking case law from the Ninth Circuit, also found that Twitter's
decision to grant accounts to particular individuals for the purposes of
posting content, as well as to remove that content should Twitter see fit,
related to its role as a publisher, even if the accounts in question
belonged to ISIS.124 Furthermore, the court was not
convinced by the plaintiffs' argument regarding private messaging.125 The
court turned to precedent to determine that "in defamation law, the term
'publication' means 'communication [of the defamatory matter]
intentionally or by a negligent act to one other than the person
defamed.'"126 Applying this definition, the court found that Twitter was
still only a publisher of the information conveyed via direct messages,
because whether or not the information was made publicly available was
inconsequential to Twitter's role as an ICS.127 In conclusion, the court
held that Twitter, even in providing accounts to ISIS agents and
permitting private messaging through its direct messaging service, was
still merely acting in its role as a publisher and was immune from
liability for acts of terror committed abroad.128
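The structure of the Fields court's reasoning can be summarized in a short, purely illustrative sketch. The function name and the two claim theories below are hypothetical simplifications of the opinion's logic as described above, not language drawn from the decision itself.

# Illustrative sketch of the reasoning described above: a theory of
# liability is barred by Section 230 when it would treat the platform as
# the publisher of third-party content. The strings are simplified
# paraphrases, not quotations from the opinion.

PUBLISHER_FUNCTIONS = {
    "deciding whether to allow an account to post content",
    "deciding whether to remove or withdraw content",
    "transmitting third-party messages, public or private",
}

def barred_by_section_230(claim_theory: str) -> bool:
    """True when the theory targets conduct the court viewed as publishing."""
    return claim_theory in PUBLISHER_FUNCTIONS

fields_theories = [
    "deciding whether to allow an account to post content",   # provision of accounts
    "transmitting third-party messages, public or private",   # direct messaging service
]

for theory in fields_theories:
    result = "immune (publisher conduct)" if barred_by_section_230(theory) else "potentially actionable"
    print(theory, "->", result)

On this view, both of the plaintiffs' theories collapsed into publisher conduct, which is why the court found immunity despite the ISIS connection.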
In the wake of the Fields case, a variety of other lawsuits alleging
similar arguments and facts have been brought against online
companies, including YouTube, Google, Twitter, and Facebook. Another
129. See Paris Victim's Father Sues Twitter, Facebook, Google Over ISIS, CBS NEWS
(June 15, 2016, 9:50 PM), http://www.cbsnews.com/news/paris-attacks-victim-sues-twitter-
facebook-google-youtube-isis-nohemi-gonzalez/.
130. Id.; see also Jacob Bogage, Family of ISIS Paris Attack Victim Sues Google,
Facebook and Twitter, WASH. POST (June 16, 2016), https://www.washingtonpost.com/news/the-
switch/wp/2016/06/16/family-of-isis-paris-attack-victim-sues-google-facebook-and-twitter/?
utm_term=.7f383919fb8e.
131. Bogage, supra note 130.
132. Id.
133. See Michele Gorman, Pulse Victims' Families Sue Google, Facebook and Twitter
Over ISIS Propaganda, NEWSWEEK (Dec. 20, 2016, 5:24 PM), http://www.newsweek.com/pulse-
victims-families-sue-google-facebook-twitter-isis-534407.
139. See Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d
1157, 1175 (9th Cir. 2008).
140. See Burke, supra note 6, at 257-58; see also Spiccia, supra note 6, at 408.
141. See Burke, supra note 6, at 257-58; see also Spiccia, supra note 6, at 408.
142. Burke, supra note 6, at 257-58.
143. See Alejandro Alba, Facebook Responds to Change.org Petition Criticizing Its Anti-
Terrorism Policies Following Terrorist Attacks, N.Y. DAILY NEWS (Dec. 9, 2015, 2:02 PM),
http://www.nydailynews.com/news/world/facebook-responds-criticism-anti-terrorismpolicy-
article-1.2460276.
144. See Press Release, Facebook, Global Internet Forum to Counter Terrorism To Hold
First Meeting in San Francisco (July 31, 2017), https://newsroom.fb.com/news/2017/07/global-
internet-forum-to-counter-terrorism-to-hold-first-meeting-in-san-francisco/.
145. Fair Hous. Council of San Fernando Valley v. Roommates.com, LLC, 521 F.3d
1157, 1175 (9th Cir. 2008); Burke, supra note 6, at 257-58.
146. Burke, supra note 6, at 257-58.
147. Id.
148. Id.
149. Id. at 256-57.
150. Id.; see Fair Hous., 521 F.3d at 1175.
151. Burke, supra note 6, at 257-58.
152. Id.
153. See 18 U.S.C. § 2339A (2009); 18 U.S.C. § 2339B (2015); 47 U.S.C. § 230 (1998).
154. Burke, supra note 6, at 257.
while there are some more modern takes on the CDA, they remain
incredibly difficult to apply in practice.
Would these new methods of interpretation work for the plaintiffs
in Fields and other suits involving international terrorism? While it is
difficult to say whether Fields would fit perfectly into the above scenario,
these types of lawsuits would most certainly benefit from a change in
legislation or new case precedent. The widows in Fields could connect
the material support statute, §§ 2339A and 2339B, to the Ninth Circuit's
interpretation of websites materially contributing to the posting of illegal
content.162 It would also be far easier for victims of terror and their
families to sue websites like Facebook and Twitter under an amended
CDA with a notice-and-report system.163 In that instance, if Facebook
were made aware of terrorist content posted by an organization such as
ISIS, whether domestically or abroad, failed to remove the page, and the
page or its contents were linked more directly to a specific attack or to
terrorist recruitment efforts, Facebook would be liable, facilitating
recovery for the plaintiffs.164 It will be
interesting to see whether any of these newly filed terrorism-related
lawsuits encourage U.S. courts to adopt the Ninth Circuit's perspective
from cases such as Roommates.com, or whether Congress is persuaded
to amend § 230 of the CDA to adapt to the ever-expanding reach of the
Internet. Furthermore, this issue may need to be addressed globally, as
other countries begin to analyze questions of liability involving
companies with such a strong global presence.
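To illustrate the kind of notice-and-report liability scheme contemplated in the preceding paragraph, the following is a minimal sketch in Python. It assumes a hypothetical reporting workflow, hypothetical field names, and an arbitrary removal window; it does not describe any existing statute, proposed amendment, or platform system.

# Minimal, hypothetical sketch of the amended-CDA liability conditions
# discussed above: liability would attach only if the platform (1) was put
# on notice of terrorist content, (2) failed to remove it within some
# reasonable window, and (3) the content was linked to a specific attack or
# to recruitment efforts. All names and thresholds here are assumptions.

from dataclasses import dataclass
from datetime import timedelta

@dataclass
class ReportedPage:
    reported: bool                 # platform received a report or notice
    removed: bool                  # platform took the page down
    time_since_report: timedelta   # how long the page stayed up after notice
    linked_to_attack: bool         # content tied to a specific attack
    linked_to_recruitment: bool    # content tied to recruitment efforts

REASONABLE_REMOVAL_WINDOW = timedelta(hours=24)  # assumed threshold

def platform_liable(page: ReportedPage) -> bool:
    """Liability under the hypothetical notice-and-report scheme."""
    on_notice = page.reported
    failed_to_remove = (not page.removed) or page.time_since_report > REASONABLE_REMOVAL_WINDOW
    harmful_link = page.linked_to_attack or page.linked_to_recruitment
    return on_notice and failed_to_remove and harmful_link

# Example: a reported recruitment page left up for three days after notice.
example = ReportedPage(
    reported=True,
    removed=False,
    time_since_report=timedelta(days=3),
    linked_to_attack=False,
    linked_to_recruitment=True,
)
print(platform_liable(example))  # True under these assumed conditions

The sketch makes the policy trade-off visible: the knowledge and causation elements would keep platforms from becoming strictly liable for every third party post, while the removal window supplies the incentive to act once put on notice.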
166. Id.
167. Id.
168. Id.; see Eddy, supra note 7.
169. See Eddy, supra note 7.
170. Eddy, supra note 165.
171. Id.
172. Id.
173. Id.
174. Id.
175. Id.
176. Id.
177. Id.
178. Id.
179. Id.
180. Eddy, supra note 7.
181. See Burke, supra note 6, at 257-58; see also Spiccia, supra note 6, at 408.