Fake News Report

ABOUT THE AUTHORS

Ms. Stefania Milan, PhD, is curious about the politics of code, data and platforms. She is Associate Professor of New Media at the
University of Amsterdam, the Netherlands, and Associate Professor of Media Innovations (II) at the University of Oslo, Norway.
Stefania is also the Principal Investigator of the DATACTIVE project (http://data-activism.net), exploring the evolution of citizenship,
activism and governance vis-à-vis datafication and massive data collection, and the Principal of DATACTIVE Ideas Lab, a research
and consultancy firm. As a digital rights advocate, she has been active within the NonCommercial Stakeholder Group at the Internet
Corporation for Assigned Names and Numbers (ICANN) and the Freedom Online Coalition’s Working Group ‘An Internet Free and
Secure’. Stefania is the author of Social Movements and Their Technologies: Wiring Social Change (Palgrave Macmillan, 2013/2016) and
co-author of Media/Society (Sage, 2011). To know more, visit https://stefaniamilan.net.

Ms. Vidushi Marda, B.A., LL.B (Hons) is a lawyer by training. Her research focuses on Internet governance, Internet infrastructure,
and human rights. In particular, Vidushi works on understanding the human rights implications of emerging technologies. For the
last three years, she has worked in international Internet governance bodies such as the Internet Corporation for Assigned Names
and Numbers (ICANN), the Institute of Electrical and Electronics Engineers (IEEE), the Partnership on AI (PAI) and also multilateral
bodies such as the Freedom Online Coalition (FOC). Her research has been cited by the Supreme Court of India in a seminal ruling
on the Right to Privacy in India, and by the House of Lords Select Committee on Artificial Intelligence. Vidushi currently works as
a Programme Officer with ARTICLE 19’s Team Digital where she leads their algorithmic decision making portfolio. At DATACTIVE,
Vidushi is a Research Associate studying the human rights implications of online content regulation.

ABOUT THE IPO

The Internet Policy Observatory (IPO) is a project at the Annenberg School for Communication at the University of Pennsylvania. The overarching goal of the program is to deepen the reservoir of researchers and advocates in regions where Internet freedom is threatened or curtailed and to support the production of innovative, high-quality, and impactful internet policy research. The IPO facilitates collaboration between research and advocacy communities, builds research mentorships between emerging and established scholars, and engages in trainings to build capacity for more impactful digital rights research and advocacy.

Through the IPO’s three-pronged approach, the program seeks to educate a network of advocates and researchers, produce high-impact, locally-relevant research in furtherance of Internet freedom objectives, and help connect researchers and mentors to foster collaboration, mobilization, and increase research impact.

ABOUT THE DATACTIVE IDEAS LAB

The DATACTIVE Ideas Lab is a research & consultancy firm registered in the Netherlands (KvK-nummer 69570132), taking a critical look at the datafication and platformization of society, the governance of internet infrastructure, and people’s empowerment. It is a spin-off of the DATACTIVE project at the University of Amsterdam (European Research Council grant no. 639379). For more information, visit https://stefaniamilan.net/consultancy and contact [email protected]. The research “Content Regulation on and by Platforms: Rethinking Internet Governance vis-à-vis the Platformization of the Web” (Principal Investigator Stefania Milan) was supported by a grant of the Internet Policy Observatory, Annenberg School of Communication at the University of Pennsylvania, as part of the Joint Digital Rights and Internet Freedom Research/Advocacy Projects call 2017. Contact: [email protected] and [email protected].

ACKNOWLEDGEMENTS
The authors thank Ms. Alexandra Deem (University of Amsterdam) for her precious assistance with data collection (and beyond)
and Mr. Sergio Barbosa dos Santos Silva (Universidade de Coimbra) for helping with references. They extend their gratitude also to
Laura Henderson and Monroe Price (Annenberg School, University of Pennsylvania) for their valuable feedback.
Social media platforms are increasingly accused of shaping public debate and engineering people’s behavior
in ways that might undermine the democratic process. In order to vitalize a much-needed multistakeholder
dialogue on corrective measures against the spread of false information, this project has undertaken a
truncated multistakeholder consultation, addressing experts from academia, civil society, governments and
the industry to assess diverging perspectives on institutional proposals, legislative responses, and self-
regulation resolutions that have sprung up around the world. It also asks what new challenges platform
moderation and related “fake news” issues pose to what might be called the “procedural fitness” of the current
multistakeholder internet governance system. Finally, it suggests recommendations for architectural changes
that could promote constructive and inclusive debate on the topic.

Social media platforms are under cross-fire. Their self-attributed role in nurturing a healthy public sphere has been endangered by a number of recent scandals exposing questionable advertising and data re-use practices. In the aftermath of the 2016 US Presidential election, the “fake news” controversy put various social media platforms in the public pillory. More recently, the Cambridge Analytica case drove attention to controversial user targeting practices supported by social media companies. The advent of the “post-truth world”1 has been greeted with an array of alarmed statements, temporary remedies and clumsy solutions: to date, however, none of them has been able to solve the conundrum. Truth be told, social media users, governments and industry all seem to be groping in the dark. Not only are we short of effective fixes to a simultaneously socio-political and techno-legal problem that is a moving target, we also lack the governance mechanisms to find shared solutions. Existent internet governance, industry regulation and self-regulation instruments have proven inadequate to promote a much-needed dialogue that involves all parties with a stake in the issue. The puzzle is to find the solution to a difficult question: how can we regulate the private/public space of social media without infringing on human rights?

Fake news and the data-driven revenue they produce—in the form of clicks and advertising at the core of the so-called “attention economy”—have re-ignited a much-needed debate on the role of commercial actors in shaping public discourse. Many politicians, experts, and those within the general public advocate for dubious content regulation measures. Proposed solutions range from governmental regulation2 to private intervention,3 and include “fact checking”4 by third parties, algorithmic curation,5 or literacy programs.6 A recent law by the German Parliament (Netzwerkdurchsetzungsgesetz or Network Enforcement Act, abbreviated NetzDG) promises to fine social media companies that fail to take down from their platforms hate speech or fake news within 24 hours; the German digital trade association Bitkom has dubbed it a “permanent mechanism of censorship”.7 Content regulation on and by social media platforms, we argue, might turn out to be the wrong fix for the problem: adopted on the spur of the moment and with little or no prior consultation with stakeholders, these measures risk infringing users’ freedom of information and association as well as their right to privacy. We are left with a number of pressing questions, which are also difficult to answer: Who is to decide what is “true” on social media? Are Facebook and its siblings merely neutral “pipes” or are they also producers and distributors of content that should be subject to existing regulation of the press? What does algorithmic content regulation mean for human rights and freedom of expression? Can content regulation measures turn into a kind of dragnet legislation that ultimately hinders freedom of speech?

As fake news has become to many a serious “threat to our lifestyle online”,8 it has brought a great deal of attention to issues of media policy and regulation that are traditionally of low interest to the public.9 Citizens, no matter their expertise or lack thereof, are more and more interested in the functioning of algorithms and the tactics of political communication in the age of the “platformization of the web”. Platforms have become “the dominant infrastructural and economic model of the social web”.10 They increasingly raise concerns about the exercise of human rights on the internet and the implications of false information for national sovereignty and democratic participation. They interrogate the role of private actors in shaping our online life and acting as the arbiters of truth. They proclaim their lost trust in both governments and the industry to protect their data and their rights.11
Meanwhile, experts engaged in these policy debates fail to examine current governance structures to understand whether they are well equipped to address these pressing issues. In fact, current mechanisms in industry regulation and internet governance do not seem to be apt for the task. How can we make sure the voice, preferences and rights of users are taken into account by private and public actors? How can they be encouraged to act upon citizen concerns? The widely celebrated multistakeholder decision-making model, whereby all those affected by an issue are entitled to have a voice, should be tested to determine whether it can be mobilized to be more constructive. It may face a crisis of “procedural fitness” to address issues and policies at stake in the fake news debate—such as the Terms of Service (ToS) in force in privatized spaces such as social media platforms. More questions thus emerge: Can content regulation on and by platforms be informed by multistakeholder perspectives? Do current internet governance frameworks and processes find relevance in the age of platforms? Does multistakeholder decision-making, and civil society participation in particular, provide adequate oversight for private agreements?

This paper seeks to experiment with multistakeholder perspectives as a way of enriching the global debate on misinformation and content regulatory issues. It queries a variety of experts and resources for ways of tackling the problem of content moderation on private platforms, using fake news as a case study. In particular, two dozen experts from academia, the organized civil society, the government and the industry sectors have been interviewed. Further, this paper presents a legal analysis which investigates how three major countries in distinct regions of the world—Brazil, Germany and India—have sought to deal with the problem of fake news. Through these interviews and analysis, the paper explores the current trends in content regulation of social media services and asks whether the rise in popularity and functionalities of private platforms is accompanied by new approaches to governance and the development of adequate safeguards. Based on this evidence, this paper offers a distilled set of recommendations for stakeholders to effectively address the conundrum of fake news and content regulation through governance mechanisms that are inclusive, deliberative, and reflect the complexity of the issue at hand.

METHODS AND DATA

Data for this paper was obtained by triangulating three social science methods: i) a jurisdictional analysis of the latest legislative and quasi-legislative developments in the matter of solutions to fake news in three countries, namely Brazil, Germany and India; ii) twenty in-depth qualitative interviews with experts selected for their role and/or position in relation to the fake news controversy; iii) participant observation in a number of settings where a multistakeholder dialogue on this subject matter emerged, including RightsCon (Brussels, March 2017), the Internet Governance Forum (IGF, Geneva, December 2017), and the Computers, Privacy and Data Protection conference (Brussels, January 2018); iv) desk research addressing a variety of popular sources, including news articles and specialized blog posts. With a view to addressing the issue from a multistakeholder perspective, interviewees have been selected to cover four stakeholder groups, namely academia, civil society, government and policy-makers, and the industry, including platform operators, journalists and software developers. Each stakeholder group was served a distinct questionnaire. Interview transcripts were then qualitatively analyzed by means of discourse analysis. As we encountered consistent difficulties in speaking on-the-record with representatives of governmental bodies and the industry, our sample privileges the sectors of the organized civil society and academia. In order to counterbalance this bias, we also surveyed a selection of official documents and quotes to the press by both governmental and industry representatives.

FAKE NEWS AND THEIR CONSEQUENCES: A SHORT CHRONICLE OF A LONG BATTLE OF AND OVER NARRATIVES

The phrase “fake news” has been used in a plethora of contexts, and it now stands in for a variety of different phenomena, from propaganda to audacious politicking. While various definitions have emerged, it is clear that fake news as a concept consistently overlaps with false or misleading information (“misinformation”), and also with false information purposely spread to deceive people (“disinformation”).12 Lazer et al. argue that fake news is “fabricated information that mimics news media content in form but not in organizational process or intent”.13 According to the Ethical Journalism Network, fake news consists of “information deliberately fabricated and published with the intention to deceive and mislead others into believing falsehoods or doubting verifiable facts”.14 Persily distinguishes four types of fake news according to whether they are exploited as satire, as a profit mechanism, as propaganda, or as reckless reporting.15
We believe, however, that fake news is better understood as a battle of and over narratives. It is a clash of narratives as it contrasts “information about geopolitical viewpoints that are not conformant with the perceived interests of the security apparatus in the state where the alleged fake news is spread”.16 Think for example of the way the notion of fake news is regularly evoked by US President Donald Trump, and the many citizen efforts at debunking distorted information mobilized for political purposes. But it is also a battle over narratives because it constitutes a semantic locus where distinct visions of the world, competing understandings of citizen agency, and divergent definitions of what constitutes truth are confronted.

In this respect, the fake news controversy has put under strain some of the central tenets of liberal democracies: the nature of its “rule of the majority” based on the healthy exchange of distinct views and preferences; the notions of political freedom of citizens seen as autonomous thinkers able to freely exercise their judgement without state interference and empowered to actively participate in civic life; and the idea that (fair, objective and independent) information is a key ingredient of democratic participation.

An old problem in a new digital environment

The term “fake news” gained traction in the aftermath of the US 2016 Presidential election, when Facebook was at the receiving end of scathing criticism for the circulation of fake news on its platform.17 Its use became so widespread as to deserve the nomination as 2016 “word of the year” by the Australian Macquarie Dictionary, on the ground that “it captures an interesting evolution in the creation of deceptive content”.18 However, if we look at the battle of narratives they promote, fake news is essentially just “good old lies,” which have been a part of the fabric of strategic communications throughout the last century.19 It is thus important to historicize the notion of fake news, connecting it to the evolution of misinformation and propaganda, as well as the various attempts at regulating and controlling emergent technologies and therefore, indirectly, the democratic process. The expression was popularized in the first half of the 1900s, to describe the evolution of propaganda techniques in major world conflicts. An analysis of Google Ngram Viewer data, tracking the popularity of strings of words in books published over the last two centuries, shows how this very same phrase started to be used to describe propaganda during World War I, probably reflecting the expansion of propaganda research during this time.20

There is, however, a major difference between present-day fake news and the propaganda of the analogue age: the technological environment that supports the distribution of information and many social relationships today has revolutionized how information can spread. Social media has become a key pathway to news: a 2017 Pew Research survey showed how US adults are as likely to get news from social media as they are from direct visits to news websites.21 And the business model of social media allows for “the use of personal data to target the information to those audiences most susceptible to it”.22 In other words, while misinformation is not new, the means through which it circulates and the rapid pace at which it is shared are altogether novel. As one interviewed expert explained, “traditionally, if you were a manipulator and you wanted to propagandize, you had to find a medium from which you could spread your message without it being obvious. A way to launder propaganda. [On social media], it looks organized and authentic. You are laundering misinformation and disinformation through an apparatus that makes it look authentic”.23 We can identify at least two dimensions of the problem, deeply interwoven: the technological, which has to do with the mechanisms of platforms, and the social, which relates to how trust and belief systems are formed and how partisan divisions among social groups play out.

The harms arising out of fake news are widely debated. The United Nations (UN) Special Rapporteur on Freedom of Opinion and Expression, David Kaye, has repeatedly expressed concern about the potential of disinformation and propaganda to “mislead a population, as well as to interfere with the public’s right to know and the right of individuals to seek and receive, as well as to impart, information and ideas of all kinds, regardless of frontiers, protected under international legal guarantees of the rights to freedom of expression and to hold opinions”.24 However, the assumption that fake news significantly undermines democracy is not a matter of agreement. Economists Allcott and Gentzkow, who studied the possible impact of fake news on voting patterns in the 2016 US Presidential election, concluded that fake news on social media was not as influential as we are led to believe, with television remaining the dominant source of political news and information: “for fake news to have changed the outcome of the election, a single fake article would need to have had the same persuasive effect as 36 television campaign ads”.25

Nonetheless, the spread of misinformation and disinformation can have dramatic consequences. Studies have shown that on Twitter false information, especially when political, is retweeted more rapidly and widely than true information.26 During the 2014 Indonesian presidential election, moderate Muslim President Joko Widodo was subject to smear campaigns on social media portraying him as a Chinese-Christian Communist—a dangerous proposition given that precedent points to grave violence in the context of religious affiliation. Widodo was eventually forced to produce his marriage certificate on Facebook to contain the spread of these allegations.27 Again, in Indonesia in 2017, Widodo’s close aide, Basuki Tjahaja Purnama, or “Ahok”, was the target of an organized online political campaign run by “fake news factories”, accusing him of blasphemy against Islam, effective enough to pressure authorities to put him on trial.28 In November 2017 in India, false information spread on WhatsApp about a salt shortage caused widespread panic and stampedes in grocery shops in at least four Indian states, leading to the death of a woman.29
A swift step towards regulation

While the exact contours of the impact of fake news are still little understood, the move towards regulation has been far swifter. The most debated of such moves is probably the German regulation intended to control the spread of hate speech and fake news online (the aforementioned NetzDG), in force since 1 January 2018.30 In early 2018, French President Emmanuel Macron also promised to impose transparency obligations regarding sponsored content on social media and to give the media watchdog Conseil Supérieur de l’Audiovisuel the power to impose heavy fines on outlets spreading lies or rumors.31 In March 2018, in the run-up to the general election, the government of Malaysia rushed an “Anti-Fake News Bill” into Parliament, allegedly aimed at curtailing political speech. If passed, it would punish perpetrators of “fake news” with up to ten years in prison and a fine equivalent to USD 128,000; publishers would be required to immediately remove the item “after knowing or having reasonable grounds to believe that such publication contains fake news”.32 In China, authorities have been aggressively regulating content on social media platforms for many years, making publishers responsible for ensuring that a wide range of content is proactively deleted, failing which those responsible could be handed jail terms of up to three years.33 However, seemingly in response to the global rise of fake news and regulatory solutions proposed in democratic countries, the Chinese military recently set up an online portal where citizens can report instances of “misinformation” and “fake news” about the military.34 This same strategy is mirrored across the world in Italy, where the Postal Police recently launched a portal to report hoaxes and fake news,35 just a few months after the proposal of a draft law criminalizing fake news.36

Because these initiatives by the legislative and/or executive powers pose dangers to the extent that they place enormous trust and power in either governments or individuals as arbiters of truth and taste, it is useful to increase multistakeholder perspectives that can modulate and improve discourse. The move towards content regulation by governments has been criticized by experts from across disciplines. While the spread of misinformation, leading to erosion of trust in the democratic process, is indeed a problem that needs fixing, history has shown that government regulation of this nature is usually instrumentalized by authorities in power to suppress dissent, silence opposition, manufacture consent,37 and undermine international standards of freedom of expression.38 Experts have also pointed out that the move towards governmental regulation of content can lead to systematic state-led surveillance and censorship, which could significantly undermine democracy.39 It is also critical to note that the blanket term “fake news” can be, and has been, used by governments to undermine legitimate news, independent journalists, and newsrooms when the content in question is inconvenient or uncomfortable for those in power40—in other words, to engage in the battle of narratives mentioned above. To make things worse, governments themselves often delegate regulatory and police functions traditionally considered a matter of public law to private platforms.41

It is not only governments that are in the dark with respect to the underlying problems and the best solutions to adopt; the public seems equally confused. According to a 2016 Pew survey, 45 percent of US adults believe that governments, politicians and elected officials are responsible for fixing the fake news problem. Similar percentages, however, attribute responsibility to the public (43 percent) and to social media and search engines (42 percent).42

The “new governors”

More emphasis on multistakeholder perspectives can provide pushback against governmental regulation of online content and bring into focus the roles and responsibilities of private platforms. Their active role as content curators, lawyer Kate Klonick argues, merits treating them as the “new governors”, that is to say as systems of governance operating outside traditional frameworks for free speech. “Now responsible for shaping and allowing participation in our new digital and democratic culture”, these new governors “have little direct accountability to their users” and their speech moderation practices are opaque.43
In this respect, The Economist magazine has recently encouraged the industry, and Facebook in particular, to take action to “rebuild trust” among the user base, acknowledging that “if the industry does not come up with a joint solution, a government clampdown will become inevitable”.44

The dominance of these platforms as a source of information and their role in shaping public discourse have led to an increasingly strong push to treat social media platforms as media companies and to subject them to the existing press laws, adapted to fit the contours of a new technology—an example is the German NetzDG.45 This is particularly relevant, as social media platforms have very often resisted terming themselves as media companies in a bid to circumvent regulation, and tend to rebrand themselves as technology companies or platforms alone, as a way of positioning themselves as neutral technology-driven facilitators of online content creation and dissemination.46 However, in light of the non-neutral, value-laden choices that these algorithmically curated platforms make on behalf of users, understandings of editorial decision-making and responsibility need to be re-evaluated.47 The decisions that algorithms make are designed by human engineers, from the choice of training data and feature selection, to the definition of “success” for the algorithms that undergird these platforms.48

Business models underlying private platforms are another important piece of the puzzle. Social media are advertisement-driven companies and thus have the “incentive to promote material that grabs attention and to sell ads to anyone”.49 Incentivizing enticing content, prolonging time spent on their platforms, and encouraging user engagement is baked right into the business models, aimed at “keeping users glued to their screens, collecting data about their behavior and convincing advertisers to pay billions of dollars to reach them with targeted ads”.50 These platforms thus organically generate an enabling environment for alarming, sensationalist media designed to trigger clickbait behavior. This might mean that traditional methods of working against propaganda and misinformation, such as ethical codes for journalists, media laws and mass education, are no longer sufficient.51

The ideal mechanism to deal with the problematic distribution of content on platforms is still a matter of debate, also within the industry itself, which largely proceeds by trial and error. Both Facebook and Google were amongst the first to work on tweaking their advertisement business model by prohibiting fake news sites from using their advertisement services.52 In order to avoid pushback for being the final arbiter of truth, Facebook swiftly outsourced fact-checking to third party organizations, which would be prompted by users reporting fake news stories. Similarly, Twitter revisited its community guidelines pertaining to hateful speech and reporting abuse on its platform to discourage the spread of hateful content.53

In the absence of a clear understanding of fake news and the contours of its consequences, regulations being suggested by governing bodies around the world risk being inconsistent with international human rights law. These efforts are also “yet another reminder of the insidious malleability of the concept of ‘fake news,’ which has become a term often used to refer to news that is critical of those in power, rather than news that is deliberately false”.54 Human Rights Watch has denounced the German law as inconsistent with the country’s obligation to respect free speech, because it “can lead to unaccountable, overbroad censorship” and it creates “‘no accountability’ zones, where government pressure to censor evades judicial scrutiny”.55 This view is shared by the UN Special Rapporteur on Freedom of Opinion and Expression, who raised concerns about the implementation of the law and its potential overreach.56 In addition, the law “sets a dangerous precedent for other governments looking to restrict speech online by forcing companies to censor on the government’s behalf”; Russia, the Philippines and Singapore have already pointed to the NetzDG as a positive example.57

To make things worse, users are ultimately made responsible for authorizing data collection in a model dubbed “privacy self-management”.58 However, the Terms of Service, which constitute actual contracts between users and platform operators, fail to provide users with relevant information to make informed decisions. An analysis of the ToS of fifty online platforms revealed that ToS generally lack specific information on aspects considered important to human rights, such as the promotion of the rights to privacy and freedom of expression. ToS tend to be awash with technical and legal terms and difficult to comprehend, and there is an average of three binding documents per platform. While 52 percent of operators proclaim they may remove content without any notice, 36 percent of the sample ignores the issue altogether, suggesting companies prefer not to disclose their policies to users.59 Yet, the obligation to respect human rights does not belong to the state alone, as the UN Guiding Principles on Business and Human Rights remind us.60

The governance deficiency

While the debate drifts organically to a conversation about the perils of content regulation online and its potentially damaging effects on internationally recognized human rights, the discussion of how fixes to fake news are arrived at lingers in the background.
We argue that the issue of “fake news” questions and jeopardizes the multistakeholder governance model. The increased role for tech companies, law enforcement and algorithms in regulating access to information does not resonate with established multistakeholder decision-making mechanisms. As evidenced above, users have little to no say in the content of companies’ ToS, which, despite renewed efforts by the industry, remain largely unintelligible to most users. The government and private sector each seem to engage with the issue and potential solutions through a number of stand-alone trial runs, rarely talking to each other or building collaborative mechanisms. These individual attempts, whether governmental or company policy, are often siloed, and reactive to particular crises or media events, rather than long-term in scope and based on comprehensive research. Governmental representatives often demonstrate in public statements and proposed policies that they do not understand how social media platforms function well enough to sufficiently regulate them.61 While the multistakeholder model is variably considered apt to manage critical infrastructure such as the internet naming system and internet standards,62 regulating fake news is treated as a different matter altogether. To date, no arenas or mechanisms provide for a substantive multistakeholder debate on fake news, online content regulation and their threats to democracy.

Existent multistakeholder internet governance fora intermittently engage with the issue of fake news at the level of content, questioning solutions and raising concerns, but fail to address the issue of procedural fitness of the decision-making model itself. For example, at the 2017 United Nations’ Internet Governance Forum (IGF), a multistakeholder forum for policy dialogue where stakeholders meet annually “on an equal basis and through an open and inclusive process”, misinformation and content regulation were included in the program, but the particular mechanisms through which stakeholders could seek to influence the moderation policies of private actors were not questioned.63 Meanwhile, the IGF Dynamic Coalition on Platform Responsibility—an informal, issue-specific group set up in 2013 to produce “model contractual provisions, which can be incorporated in ToS in order to provide intelligible and solid mechanisms to protect platform-users’ human rights and foster platform providers’ responsibility”—has published a valid set of recommendations on Terms of Service and human rights, addressing the rights to privacy, freedom of expression and due process (December 2017). The section on due diligence standards in content regulation considers the consequences of overzealous content regulation:

“Although there are no rules to determine, in general terms, what kind of speech should or should not be allowed in private online platforms, certain platforms should be seen more as “public spaces” to the extent that occupy an important role in the public sphere (…) As a general rule, any restriction on the kind of content permitted on a particular platform should be clearly stated and communicated within the ToS. In addition, platforms should provide effective mechanisms aimed at signalling and requesting the removal of content that is forbidden under the applicable legitimate laws (e.g. illegal content such as child pornography as well as other kinds of undesirable content, such as hate speech, spam or malware). However, such mechanisms shall be necessary and proportionate to their purpose. It is of utmost importance that the rules and procedures imposing such restrictions are not formulated in a way that might affect potentially legitimate content, as they would otherwise constitute a basis for censorship. To this end, content restriction requests pertaining to unlawful content shall specify the legal basis for the assertion that the content is unlawful; the Internet identifier and description of the allegedly unlawful content; and the procedure to be followed in order to challenge the removal of the content” (pp. 237-238).64

However, neither the IGF nor the Dynamic Coalition has the teeth to enforce suggestions in this time of frenzied reactions by state actors. Thus, if we are to find effective, shared and rights-respecting solutions to fake news, there is a crucial need to better understand what governance model could facilitate a dutiful, regular exchange between stakeholders, in a way in which even users—typically the softest voice—can be heard. The multistakeholder model seems less than adequately equipped to address the regulation of private/public actors like social media platforms. Yet, this problem appears to be consistently evaded by the supporters of multistakeholder governance, including IGF participants. And with the exception of annual events like RightsCon,65 there is a lack of policy fora and decision-making mechanisms where the industry, sovereign when it comes to crafting and enforcing ToS, interacts with users and digital rights advocates to hear concerns and petitions. It is especially hard to identify any existing governance model in which users and civil society groups could be empowered to bind private actors performing certain public functions to respect human rights within their ToS. Thus, it is the purpose of this paper to explore a variety of stakeholder perceptions in order to determine the governance capabilities for such a complicated set of issues.
Regulating fake news: A jurisdictional analysis

To situate these questions in a larger context, we now move to the jurisdictional analysis of how national governments in Germany, India, and Brazil have perceived and reacted to fake news. Although not the first to implement anti-fake news legislation, these countries have been selected because of their centrality and leadership role in the respective regions and for being large, established democracies with (variably) strong rule of law. Democratic countries were privileged in this analysis in order to better understand how democratic governments are carrying out and justifying restrictions to freedom of expression within the context of misinformation and disinformation.66

GERMANY

In Germany, the NetzDG, which stands for “The Act to Improve the Enforcement of Law in Social Networks”, came into full force on January 1, 2018.67 It seeks to regulate the spread of content on social media platforms that have over two million users in Germany. Under the Act, platforms are required to maintain effective and transparent procedures for handling complaints about unlawful content, through which users can flag problematic content. If such content is deemed to be unlawful, it is to be subsequently removed by the platform within 24 hours and up to a maximum of seven days, with some extended flexibility for more complex cases. The Act is confined to social networks only, i.e. those platforms where content of any kind can be shared freely and made public, and not to platforms such as LinkedIn and WhatsApp, which instead permit interpersonal communication and the dissemination of specific types of content. Despite concerns around the chilling effect of such a law on freedom of expression, the German government has repeatedly contended that while freedom of expression is an important facet of German democracy, it ends where criminal law begins.68

The NetzDG holds platforms responsible for looking into the legality of content. Platforms can outsource decision-making to a recognized “self-regulation institution”, but regardless of how the decision is made, the complainant and user must be immediately notified regarding the decision and provided with an explanation. Platforms are also obliged to retain deleted content for at least ten weeks for evidence provision and review purposes, in addition to publishing yearly reports of deleted content with reasons for the same.

If they fail to comply with deadlines, procedures, or reporting obligations, platforms can be subject to heavy fines after the question of the legality of content is decided on by a court ruling. While the maximum penalty for platforms is capped at 50 million euros, there is a tiered model for fines, which has been updated by the Federal Office of Justice since the law’s enactment. First, platforms with over 20 million German users (to date, only Facebook) are to be fined between 2.5 and 40 million euros. Next, platforms with between 4 and 20 million German users (e.g., YouTube and Instagram) are to be fined 1-25 million euros; lastly, platforms with 2-4 million German users, such as Twitter, are to be fined 250,000-15 million euros. Individual employees can be fined up to 400,000 euros in cases of severe or repeated mishandling.

This makes it far more difficult for platforms to be deliberate in their ascertainment of legally acceptable content: the fear of being subject to hefty fines encourages over-compliance. According to Human Rights Watch, hosting companies will have to make difficult determinations of free speech violations “under conditions that encourage suppression of arguably lawful speech. Even courts can find these determinations challenging, as they require a nuanced understanding of context, culture, and law. Faced with short review periods and the risk of steep fines, companies have little incentive to err on the side of free expression”. This proclivity towards caution might have a chilling effect on speech. Further, the law does not provide for “judicial oversight or judicial remedy should a cautious corporate decision violate a person’s right to speak or access information”.69 In other words, platforms are put in the position to be the arbiters of the truth and of law, either because of political pressure or the possibility of monetary fines.

The lack of transparency surrounding Facebook’s take-down policies exacerbates the shortcomings listed above. The company has traditionally moderated content in an opaque fashion.70 This will only be amplified in the future, as Facebook CEO Mark Zuckerberg, in a hearing at the US Congress, hailed “AI tools” as the solution to fake news, misinformation, and hate speech on the platform, without clarifying what standards these tools will adhere to.71 In this respect, the NetzDG does not address the possible impact of these tools—a crucial consideration for addressing content removal in the future.

The NetzDG has been met by extensive opposition both nationally and internationally. According to the Social Media Law Bulletin, amendments have already been proposed to help users whose content has been erroneously deleted and to set up an independent entity to take over the role of deciding what constitutes hate speech from companies.72
It is worth noting that the European Union has preferred a different approach altogether, namely the promotion of self-regulation through a memorandum of understanding with platform operators. The Communication of the European Commission on “Tackling Illegal Content Online: Towards an enhanced responsibility of online platforms” (September 28, 2017) comprises a set of non-binding guidelines which seek a unified and calibrated response by platforms, governments, and stakeholders and attempts to prevent a fractured liability landscape that would result from individual EU Member States setting their own terms for the regulation of online content.73

INDIA

In India, the platform WhatsApp, which counts 160 million Indians as part of its customer base,74 has been identified by policymakers as the primary vehicle responsible for the spread of (mis)information.75 The most bizarre instance of WhatsApp-spread fake news occurred shortly after the government announced the demonetization of existing notes in November 2016. A rumor that India’s new banknotes were equipped with a GPS chip to combat the country’s black economy caused panic across the country, and the story finally had to be clarified by the Reserve Bank of India.76

The Indian government has historically cracked down on WhatsApp groups and content exchanged on chats in the context of national security, particularly in the northernmost region of Kashmir, the sovereignty over which is still under dispute.77 Measures to maintain public order and contain “rumors” have included shutting down mobile Internet in some parts of India.78 In addition to having groups mandatorily register with the government, previous notices from the Indian government have also sought to hold administrators of groups liable for content shared on these mediums.79

In April 2018, India’s Union Ministry of Information and Broadcasting announced in a press release80 the amendment of the Guidelines for Accreditation of Journalists, a move to ensure that journalists accused of reporting fake news would lose press credentials until all complaints against them were verified.81 Following severe backlash across the nation, the Prime Minister ordered a withdrawal of this press release. The Ministry has since stated that it is in the process of working with journalists to “fight the menace of ‘fake news’ and uphold ethical journalism”.82

The Indian approach is unique insofar as it is multi-pronged. While the clampdown on WhatsApp is largely due to the ubiquitous presence of the messaging app, shutdowns of specific platforms beyond WhatsApp, including Facebook and Twitter, have been deployed in the past as a way to contain rumors and public alarm.83 Wider restrictions on journalists, as discussed above, are a knee-jerk reaction largely attributed to impending elections in the country. This approach is also far more ad-hoc, and less formal, than the German example discussed above and the Brazilian example discussed next.

BRAZIL

In December 2017, the Brazilian government established the “Consultative Council on Internet and Elections” under the Superior Electoral Court to monitor and/or block “false news” stories on social media ahead of the presidential elections scheduled for October 2018. The Council is composed of representatives of the judiciary, the army, the ministries of justice and of science and technology, the national intelligence agency and the federal police, the nongovernmental organization SaferNet, devoted to fighting online crime, and researchers from the private university Getúlio Vargas.84 According to one of the experts interviewed, the Council “operated for a couple of months and produced a messy set of regulations for political communication online”, operating “under very heavy pressure to produce silver bullets that could solve the issue of fake news instantly. Part of that was external pressure from society and the media”.85 This external pressure is easy to discern—a BBC World Service survey of 18 countries in 2017 showed that Brazilians were most worried about fake news.86

While the Council’s stated role is to monitor content on social media, there is little clarity on social media platforms’ responsibilities and compliance requirements. Amongst the solutions envisaged by the Council, one concerns “automated and human solutions provided by platforms to remove content online”. Another solution suggested is to enable fact-checking agencies and also the traditional media to contribute to individual decisions on fake news and false information. As one interviewee opined, “There is little described about transparency and accountability mechanisms envisaged for the activities of the tech industry, fact-checkers and traditional media”.87 This is particularly treacherous in a country where the press has been ranked “partly free” by Freedom House (as of 2017), where internet penetration is below 60 percent and the ownership of traditional media is highly concentrated.88 Furthermore, Brazilian politicians and governing bodies have a legacy of proposing problematic measures as solutions to combat fake news, including prison sentences and asset-freezing.
Additionally, the 2017 electoral reform, through law no. 13.488, regulates electoral campaigning online. Among other features, it “allowed sponsored ads on social media platforms and search engines as long as the service was bought from the platform where the ad was to be broadcasted”.89

The Brazilian Federal Police have also expressed their intention to monitor the spread of online political content deemed to be false and punish those guilty of dissemination of such content.90 This intended control applies not only to content on social media platforms but to all political content online. The legal basis for the assumption of such power is questionable and concerning, deriving from a pre-internet censorship law that was framed and implemented during military dictatorship rule in Brazil.

A comparative view

These three cases focus on curbing the spread of fake news and increasing government oversight of content online: in other words, “fake news is misunderstood as a problem of illegal content (…) to be removed from the internet”.91 However, the methods and strategies deployed to achieve this control are varied.

In terms of liability, the German law is squarely focused on penalizing social media companies for failure to curb the spread of fake news on their platforms. As an interviewed expert argued, it “delegates censorship functions to private corporations which is even worse than governmental censorship”.92 According to a civil society representative, this has made the situation worse, as “it has actually fallen on users quite a bit too. When I go to Twitter, I don’t have the option of reporting a tweet under the ToS alone, it automatically goes to the option to reporting it under the German law”.93 However, in India, solutions emphasize penalizing the individual, from WhatsApp group administrators to journalists accused of reporting fake news. The Brazilian model is closer to the Indian approach, as it seeks to punish those guilty of the dissemination of false information. The responsibility of platforms and the liability of social media companies are most aggressively articulated in the German case, whereas the other two cases do not position the platforms as responsible actors within the policy solutions. While the Indian government has begun to confront companies like Facebook on the question of data privacy and protection more formally, the same has not happened with regard to the regulation of content on these platforms.

Another important point of divergence is the form of approaches to mitigation, i.e. approaches to containing the potential spread and repercussions of fake news. Germany’s approach is through legislation, with detailed roles, rules, and responsibilities, although, according to an expert interviewed, “the German response, bluntly put, is just delete a lot of bad stuff and then all will be fine”.94 The Brazilian approach includes the formation of a consultative council within the Superior Electoral Court. Although it also includes members from civil society, this body is far less formal and, as a consequence, would continue to define the rules and responsibilities associated with platform engagement and support on an ongoing basis. The Indian experience has been particularly distinctive—while administrative and governmental efforts in the form of notices, punitive orders, and press releases are more frequent, they either do not hold up for long (as was demonstrated in the case of the press releases issued by the Ministry of Information and Broadcasting) or are eventually rejected by the courts.95 In addition, a common practice to curb the spread of ‘rumors’ online has been to entirely shut off mobile internet services for a sensitive period of time, or in sensitive areas, with a substantial number of complete internet shutdowns.

Finally, the three countries share two major policy shortcomings. First, although all three of our cases are established democracies, there remains a great deal of political and legal ambiguity concerning what constitutes “fake news”—which reveals the proclivity to use the term loosely, and more importantly, politically conveniently. This is particularly worrisome in relation to the duty of the state to promote and protect human rights and freedom of expression. When oversight is sloppy and accountability mechanisms are minimal, measures like those described above naturally lend themselves to abuse by individuals and groups, as well as enterprises willing to overly censor content in order to avoid heavy fines. Secondly, legislators in the three countries reveal a limited understanding of the role of algorithms and automation in the personalization of content on social media, as well as of their use as a content regulation remedy. As a result, the proposed solutions fail to consider key aspects in the processes through which fake news is spread and legitimated. This general lack of knowledge about how social media algorithms function also leads to simple and shortsighted solutions in the face of an ever-evolving and increasingly complex digital ecosystem populated by a variety of actors engaged in the distribution of information and misinformation, including trolls, social bots, or accounts that are automatically run by software mimicking real users.96
Towards increased attention to multistakeholder perspectives on approaches to online content regulation?

Our approach was to query stakeholders on their views on these proposed policy solutions as well as their views on the proper governance mechanisms to debate and respond to misinformation online. According to one expert interviewed, “we are confronted with a new wave of governmental involvement in a media that was more or less free from government control. The question is now what type of control is acceptable for a democratic and free society”.97 Answering such a complex question becomes particularly challenging in the absence of appropriate spaces and mechanisms promoting a multistakeholder exchange of ideas. Responding to this challenge, within this section we sought to create this “imaginary” dialogue between stakeholders by presenting, comparing, and analyzing the views of two dozen experts from the civil society, academia, government institutions and the industry. Areas of concern include the nature of the problem, the adequacy of current regulatory responses and potential alternatives, and the consequences of the fake news controversy for policy and governance more generally.

Social vs. technical

For most of our experts, fake news is not a technical issue: rather, it is a social problem exacerbated by technology (according to one expert, “driven by people but made worse by technology”),98 and as such it combines both social and technical aspects.99 The two are deeply entwined: “without the tech the social problem would be small. Without the social problem, the tech wouldn’t be problematic”.100 The “effects of technology have made it more pronounced (…) But ultimately it’s a problem that we can’t lay at the feet of the technology because how we use technology is largely within our control”.101 It is “a matter of education (as enabling qualified ICT literacy), law (as a means to protect fundamental rights of internet users), and technology (as a way of assuring the transparency of algorithms and development of systems that protect privacy by design and default)”.102 Its causes are to be found largely in the social realm, and have to do with people’s attitudes towards information, their relation to sources and platforms, and their digital literacy (or the lack thereof), but also the way trust and belief systems are fashioned in a digital environment.

At the same time, the problem of fake news involves a key economic dimension, as it is fueled by the “click economy” driven by advertisement-based business models which monetize users’ attention. In this perspective, fake news is merely a way to gain people’s attention and increase the sharing and spread of content. Consequently, propositions that seek to “solve” the problem of fake news should holistically address these three aspects. Technical fixes alone can only contain the way in which fake news travels, while regulatory fixes tend to merely assign blame and penalize those deemed responsible. Neither of these responses comprehensively deals with the complicated ways in which information is consumed and shared in digital spaces from economic, social, technical, and political perspectives. This shared belief that policy responses need to address a multitude of phenomena related to how fake news is spread and consumed underlines our point that an interdisciplinary range of experts should be consulted and involved in governance around the subject.

Many of the experts interviewed from North America commonly echoed the belief that the “fake news problem” does not exist, but that the only real problem is the collective hysteria around the issue, spreading from government officials to the media to their audiences. Since many of these interviewees predicted that the hype around this issue would not last, they cautioned that proposed “solutions” should be postponed until there has been time to better understand the issue and determine if it is indeed an issue that can be addressed through technical, legal or policy levers.

Regulatory innovation to counter fake news

According to the stakeholders interviewed for this project, most of the current measures to deal with fake news, especially top-down government regulation, are questionably effective and severely problematic when it comes to the potential effects on human rights. Additionally, many experts commented on how the fake news controversy brings to the surface and magnifies the tensions between national jurisdictions and a technology which is transnational in nature. There is broad consensus that current normative responses, “broad strokes government regulation”, “technological solutionism”103 and/or “platform deletionism”104 are the wrong way to go. Three main reasons emerged for criticizing these policy responses:
• The experts agreed that most of the solutions proposed thus far are likely to be infeasible. For example, the proposed US "Honest Ads Act", whereby each ad would have to display who paid for it, is likely to be very difficult to implement. In addition, national regulation is usually toothless when problems originate beyond national borders (e.g., Russian servers and ads are impossible to regulate from the US). Moreover, the divergence in visions and the political tensions between countries might make joint measures across national borders impossible to achieve. Furthermore, national socio-cultural contexts differ (e.g., Western vs. non-Western), "so what is legally allowed or forbidden in one country is different in another country".105
• Policymakers and "governments do not understand technology well enough (…) they are not going to be responsive enough because they are not on the battle lines dealing and moderating the content as it comes in".106
• There is the risk of doing too much damage by over-regulation and over-enforcement. For example, according to an institutional representative, "when you start to have categories of content that are not allowed, there is always going to be the tendency to expand them".107 These decisions are very contentious by definition. In addition, as an interviewee from the organized civil society pointed out, content regulation "can become weaponized by different groups against each other".108 Authoritarian governments, at minimum, might be tempted by the instrumentalization of online content regulation legislation. Over-regulation might also have the unwanted effect of curbing competition, as "regulation only adds more compliance costs that the new entrants have to pay to legitimately compete with existing dominant platforms".109 Finally, over-regulation might jeopardize the very nature of the medium: "if what we see on the internet are just the comments that platforms have tacitly approved, it will kill the social significance of the internet as we know it".110

While there was rough consensus among our experts on what constitutes inappropriate regulatory approaches, there is a spectrum of what they consider appropriate regulatory mechanisms and/or innovative solutions. Our experts suggest acting at the level of existing power relations and the broad political economy of information and political communication. More specifically, the following patterns emerged from the interviews:
• Existing legal frameworks are sufficient to address the issue of fake news, and the exact extent to which they are relevant bears closer scrutiny. Many observed how "a lot of the time legislators are so busy pushing through new initiatives that they are not focusing on the opportunities to enforce already existing laws that may just as well be the remedy".111 Instead, those interviewed largely agreed that legislators should first consider what tools they already have at their disposal. Examples of legislation that might apply include "anti-defamation, electoral laws about what can be discussed during election periods, laws on foreign interference and propaganda",112 but also existing regulation of the advertising market.
• A soft approach (e.g., guidelines, norms…) is to be preferred to "hard" laws; communication and cooperation across national borders should be encouraged, also in recognition of the impracticality of certain government solutions constrained by national borders. However, such a soft approach should refrain from "pushing social media platforms into 'voluntarily' policing content".113 "Flexible protocols would allow within a certain corridor a peaceful coexistence among different national jurisdictions"; this, however, requires "a high level of tolerance and mutual recognition of the national differences".114
• Platform self-regulation is the most appropriate regulatory approach according to experts who also expressed their skepticism about whether fake news is a problem (and were largely based in North America). The trust in self-regulation emerges from the belief that platforms want to serve users in the best possible way, and will thus find an approach to regulation that is agreeable to users in the long run, through amending community guidelines and/or ToS, to eventually reach an equilibrium organically. According to one of these experts, however, self-regulation should not result in "social media companies becoming privatized law-enforcers".115
• Fact-checking is considered potentially effective, especially when conducted by third parties, such as civil society organizations, civic projects, non-partisan watchdogs, think tanks and universities, trusted media organizations, or by "open APIs tracing back how information is shared in a social network".116 However, according to one expert interviewed, "fact checking is just a fashionable word for old school journalism".117 The number of fact-checking institutions worldwide has increased by 239 percent since 2014,118 which "indicates that citizens are already participating in solving the problem".119
However, fact-checking should be exercised with caution, as it "won't work on things which are bad journalism, or bad ways of relating to politics, but don't fit into this neat little box of 'this is a lie'".120 The idea was also expressed that the core problem is the lack of faith in the political process, and that this faith needs to be re-established before thinking about fixing misinformation specifically.
• Regulating the advertising market and the way the "click economy" generates value are powerful measures in the hands of both platform operators and governments. Many experts believe that working towards constructing a healthier economy around online advertisements would improve the quality of political communication and of the information circulated, and should be the primary goal of governments.121 Examples include regulating the use of personal data for targeted advertising, e.g. by "limiting the maximum amount of advertisement in social media feeds similar to the limits in broadcasting", and transparency measures "over which messages are targeted to which audiences".122
• Few experts entertained the idea that governments could have a role to play in regulating fake news per se, and emphasized that such regulation should be deliberate and carefully thought out. A pro-government perspective, however, does not automatically entail a pro-regulatory one. There is a counter-discourse that foresees another role for state institutions and regional bodies such as the European Union: that of re-establishing confidence in government institutions as a whole, as a prerequisite to a healthier digital public sphere.
• A neutral third party, such as independent courts or a professional "clearing house",123 as suggested by one expert, is a key player to decide "what is just harmful and what is really illegal", and to avoid "another form of censorship where a lot of controversial content risks being eliminated". Decisions like this cannot be left to governments or companies; the latter, in particular, would be tempted to "take out everything that is critical" to avoid paying fines imposed by governments.124 The involvement of third party checkers would also increase trust in platforms.125
• Empowering users is a recurrent theme from the interviews. It entails, for example, "listening to the users, seeing what their concern is and using that to change their community guidelines".126 It means "giving people the power to control what speech they hear and what they receive, rather than blocking people from speaking in the first place". The assumption is that "the closer to the user the solution is, and the more within the control of the user, the less likely that it is inadvertently going to have human rights spillover effects".127 In addition, assuming that users are powerless, as tends to be the case for much of the public discourse and legislation, is considered patronizing by many of the experts.128 Interviewees noted that community reporting and/or policing has been in place for a long time in many online spaces such as Reddit and Wikipedia, and these kinds of user communities can empower individuals to develop community standards to deal with misinformation. Further, one expert argued that civic projects, startups and civil society organizations uncovering fake news, as well as media projects in under-served languages, need to receive adequate public funding in order to better combat misinformation in a variety of socio-political contexts.129 Related to this, many experts encouraged media education and literacy programs within both formal and informal education settings.130
• Transparency was noted as a key factor in promoting a healthy approach to fake news, especially given that, according to one expert, "platforms already have their own sets of proprietary rules for moderating content and they make none of them transparent".131 The focus of the debate at the government level should shift from content regulation on platforms to more demands for transparency from platform operators on how they operate, how information is circulated, and what kind of content is removed. One interviewee specifically called for government institutions to "mandat[e] more transparency from social networks over the spread of information".132 According to these experts, it is in the platforms' best interests to autonomously decide to practice greater transparency when it comes to data collection and the steps being taken to combat fake news. Transparency should concern the tools used by platforms to regulate content, such as spam and keyword filtering, which tend to be "very imprecise and don't allow for a sufficient level of granularity" (a limitation illustrated in the sketch after this list), as well as community guidelines ("forward consent").133 Finally, algorithmic transparency, inspired by the principles of science and traditional journalism, is needed: otherwise "you are making a knowledge claim in public but not telling people how you came up with that. Show your work, show the algorithm, the data, the code, or at the very least a description of those things".134 The Facebook Newsfeed FYI blog goes in this direction.
• Multistakeholder mechanisms derived from internet governance, rather than top-down initiatives, should form the baseline of any intervention. Norms, in particular, should take the form of "global principles" to be developed in a multistakeholder fashion "to guide all stakeholders including platforms, governments, standard development communities and civil society".135 The next section will more thoroughly delve into these ideas.
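To make the concern about blunt filtering tools more concrete, the illustrative sketch below shows how a naive keyword filter of the kind interviewees criticized can flag legitimate reporting about misinformation alongside the content it is meant to catch, and how even a toy moderation routine can "show its work" by recording the reason for each decision. This is a minimal sketch under stated assumptions: the keyword list, post texts and function names are invented for the example and are not drawn from any platform's actual systems.

# Illustrative only: a toy keyword filter of the kind experts called "very imprecise".
# The keyword list and posts are invented; no real platform rule set is implied.

BLOCKED_KEYWORDS = {"miracle cure", "election rigged"}

def moderate(post: str) -> dict:
    """Flag a post if it contains a blocked keyword, and record why
    (a minimal 'show your work' log in the spirit of algorithmic transparency)."""
    matches = [kw for kw in BLOCKED_KEYWORDS if kw in post.lower()]
    return {
        "post": post,
        "flagged": bool(matches),
        "reason": f"matched keyword(s): {matches}" if matches else "no keyword match",
    }

posts = [
    "Doctors warn this 'miracle cure' is a hoax",          # legitimate debunking
    "Buy the miracle cure they don't want you to know!",   # the intended target
]

for decision in map(moderate, posts):
    print(decision)
# Both posts are flagged: the filter cannot tell debunking from promotion,
# which is exactly the lack of granularity the interviewees pointed to.

The point of the sketch is not the filter itself but the explanation it attaches to each decision: disclosing the rule, or at least a description of it, is what the quoted experts mean by showing "the algorithm, the data, the code".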
After fake news: What is the future for multistakeholder governance?

As privatized infrastructure on the internet continues to dominate and alter the norms that have driven internet governance thus far, there is a need to take stock of how these changes may impact a variety of stakeholders, their role within governance structures, and the multistakeholder model itself. For this reason, we asked experts to specifically think through these issues and how multistakeholder structures could be strengthened within this debate. As noted above, the debate around fake news has contributed to raising public awareness about issues of media policy and governance. The experts interviewed suggested that this interest should be leveraged, and recognized that only a bottom-up, multistakeholder dialogue can produce effective, shared and lasting solutions that respect the rights of users. However, in current discourse, there seems to be only a tenuous link between the fake news debate and internet governance at large. Connecting the dots, our experts were invited to reflect on the governance challenges associated with the fake news controversy. The following areas for analysis were observed:

• There are limited opportunities for citizens, and individual users above all, to have a say in the fake news controversy, beyond raising concern and expressing public outrage. Platform operators autonomously set the rules for participation, which leaves users with little to no chance to make their voice heard outside of representational mechanisms within democratic political structures. As the only outlet for public participation, political processes are generally long-term, messy, and not applicable to users in non-democratic contexts.
• Current internet governance mechanisms are no longer fit for the challenge, due to the privatized nature of social media platform operations and the increased use of private contracting and self-regulation. The multistakeholder model in theory presupposes a balanced playing field in which all stakeholders have equal opportunities to contribute to the debate and governance outcomes. However, it has become clear in many debates, beyond content regulation, that power and influence in these fora are weighted towards those actors with greater resources and those companies who control the platforms and architectures of the internet. This is a source of concern especially for those experts interviewed from academia and civil society.
• Multistakeholder participation is also made more complex by the fake news issue. In particular, the constituency of civil society, which is traditionally internally very diverse, has proven to be even more fragmented when it comes to discussing fixes to fake news, with distinct civil society organizations representing different social groups who might be more or less affected by a particular policy solution. These divisions in turn might jeopardize the ability of civil society at large to influence decision-making at various levels.
• Unequal power relations between stakeholders constitute a problematic aspect of participatory governance, exacerbated by the quasi-monopolistic power of big platform operators. While "it's still better than doing things behind closed doors", often "it is in the end just governments and big platforms and to make things appear more egalitarian you bring in some little NGOs or some academics".136

The following solutions were proposed:

• According to the experts interviewed, the multistakeholder approach is the only viable approach to addressing fake news, as the complexity of the issues at stake calls for shared solutions. In particular, the experts emphasized that it is the responsibility of social media corporations to engage in multistakeholder dialogues, or as one expert articulated, they "have to put themselves into the context of the multistakeholder approach, opening up any decision affecting user rights to the scrutiny of other stakeholders".137 This move could result in a flexible, coordinated response, which is to be preferred to "constitutional" solutionism.
• The multistakeholder community mobilized around internet governance should work to advance and develop forms of "collaborative governance" able to "broaden" the practice "from the few places where it exists at present".138
• Participatory governance in the realm of the digital should be redesigned in view of tilting the balance of power towards the user. As civil society representation is "the weakest element in the multistakeholder model", experts urged those already engaging in internet governance spaces to "look for mechanisms to find new forms of meaningful representation for civil society and users", "avoid the situation where the voice of the users is captured by special interest groups", "enable civil society organizations to play a role", and "find the right balance between representative democracy and participative democracy in cyberspace".139
• As policymakers might have insufficient familiarity with the inner workings of platforms, a "higher level of interaction between lawmakers and code makers" is recommended. These kinds of interactions and attempts towards policymaker education could help lawmakers understand "what can be done and what is impossible"140 as well as the actual technical implications of any proposed corrective measures, and even imagine alternative solutions.
• A multistakeholder exchange could produce protocols or guidelines that are flexible, adaptive, and can be widely applied across sectors. These efforts can also help with learning processes and spreading best practices across disciplines and jurisdictions.141 In addition, any approach needs to take into account the specific needs and preferences of the many different publics who have a stake in the issue, and this approach "is of the utmost importance in some non-Western contexts".142
• Amongst the existing policy fora, the IGF in particular has a role to play, for example by providing shared documents that can inform decision-making at the government level.
• Academia, as part of civil society, can help by "creating awareness, raise the level of sensibility among stakeholders, preparing the stakeholders so that they understand better what are the positions of others (…) As more or less service providers, academics demonstrate various options, but they are not decision makers".143

Recommendations for a balanced approach to fake news

Building on our empirical data, we have compiled a set of recommendations on how to implement a balanced approach to fake news, with particular attention to online content regulation. The recommendations are specific to distinct stakeholder groups, in recognition of the distinct roles each stakeholder group can play in the process.

Introduction

The following recommendations intend to encourage a responsible, coordinated reaction to the issue of "fake news". They put forward measures to make sure that any proposed remedy follows three principles: i) it puts users in the driver's seat, foregrounding their needs and preferences; ii) it respects human rights and promotes their enjoyment also on social media platforms; iii) it is the result of (at least rough) consensus amongst all parties with a stake in the process.

Our starting point is the user, whom we consider to be at the core of the process; other players include state entities, whose duty it is to promote the respect of human rights; platform operators and other private actors such as newsrooms; public education institutions; and organized civil society, including but not limited to digital rights and digital literacy organizations.

These recommendations are to be seen as complementary to other guidelines that affect the operations of the social media industry, and in particular the UN Guiding Principles on Business and Human Rights and the Recommendations on Terms of Service and Human Rights of the IGF Dynamic Coalition on Platform Responsibility.

Governments and regulators

• Any attempt to restrict the freedom of expression must be in adherence with international human rights guidelines and must not restrict legitimate speech.
• Restrictions on freedom of expression should pursue a legitimate aim and be necessary and proportionate to the end goal that they purport to serve. They should be accompanied by adequate redress mechanisms. Any restrictions beyond these constitute over-regulation of content.
• Before pursuing novel, hasty solutions, governments should assess whether pre-existing laws or guidelines could be applied to the digital environment and/or social media corporations.
• Whenever possible, guidelines and "soft approaches," which are easier to adapt over time to reflect technological innovation, are to be preferred to constitutional solutions.
• Governments must not exert pressure on private platforms to censor legitimate speech in the name of curbing fake news, misinformation or hate speech. Instead, they should encourage open and accountable channels of communication with platform operators.
• Rather than placing regulatory responsibility in the hands of private actors, governments should demand transparency from social media platforms, requiring them to be open about their handling of user data and their efforts to curb misinformation or hate speech.
• Regulators and law enforcement should also embrace transparency and release regular reports about their demands to social media platforms, in particular with respect to "notice and take down" requests following court orders (see the sketch after this list for one way such reports could be structured).
• Corrective measures should be sufficiently robust to meet the challenges of rapidly evolving technologies.
• Governments should consider the promotion of competition as a way to ensure a better service for users. Laws setting high regulatory bars in content policing might contribute to consolidating the power of major platforms, making it difficult for smaller companies to break into the market.
• Policymakers, law enforcement and local administrations should receive adequate and up-to-date training to be empowered to understand the mechanisms and the challenges of platforms and online speech.
• Governments should invest in news and digital literacy programs targeting diverse age groups, from school children to adults and the elderly. Schools, libraries, civil society organizations as well as the industry are natural partners in such an endeavor. Literacy programs should be tailored to different needs and adequately funded.
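As a purely illustrative sketch of the reporting recommended above, a machine-readable record of government requests to platforms could be as simple as the following; publishing such records at regular intervals would let researchers and civil society track "notice and take down" activity over time. The field names, the platform name and the figures are invented for the example and are not taken from any existing regulator's template.

# Illustrative only: a minimal, machine-readable entry for a regulator's
# transparency report on takedown requests. Field names and numbers are invented.
import json
from dataclasses import dataclass, asdict

@dataclass
class TakedownReportEntry:
    period: str            # reporting window, e.g. a quarter
    platform: str          # recipient of the requests
    legal_basis: str       # e.g. court order, administrative request
    requests_sent: int
    items_removed: int
    items_rejected: int    # requests the platform declined

entry = TakedownReportEntry(
    period="2018-Q1",
    platform="ExamplePlatform",
    legal_basis="court order",
    requests_sent=42,
    items_removed=30,
    items_rejected=12,
)

print(json.dumps(asdict(entry), indent=2))  # ready to publish as open data

Publishing in an open, structured format of this kind is one possible way to make the recommended reports comparable across regulators and over time.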
Platform operators

• Platform operators should refine their business models to explicitly take into account their corporate responsibility obligations, in light of existing human rights law and the UN Guiding Principles on Business and Human Rights.
• Platform operators should empower their users to make informed choices about their participation, providing adequate, accessible and up-to-date information and educating them on the ways in which their experience on the platform is curated. Users must be able to opt out and take their personal data with them.
• Private platforms should align their Community Guidelines and Terms of Service with international human rights law and state-of-the-art legislation when it comes to the matter of data protection and user rights more in general.
• Platform operators must build a legacy and practice of transparency with respect to the platform's functioning, with the end goal of enhancing public scrutiny and accountability for their actions. Areas of accountability should range from algorithmic curation to revenue from advertisements. Examples of accountability mechanisms include the publication of transparency reports/evidence regarding privacy breaches, third party data re-use, and the occurrence of "fake news," among other issues.
• Platform operators should consider setting up national, independent ethics boards in the countries where they operate, to provide expert oversight in order to facilitate the analysis of the ethical implications of business models and data re-use practices, in accordance with national and international law.
• Platform operators should consider funding news and digital literacy programs, in cooperation with other stakeholders and civil society in particular.

Organized civil society

• Organized civil society should exert consistent pressure on, and work with, technology companies to enhance the transparency and accountability of platforms.
• Organized civil society should embrace its role as watchdog for governments as well as platform operators that erode freedom of expression by placing unreasonable restrictions on legitimate speech.
• Organized civil society should engage with consumers and users in order to better inform them about the information they consume, encouraging and enabling them to take a critical stance towards their "information diets".
• Organized civil society, and digital rights organizations in particular, should support efforts to run periodic independent assessments of the operations of private platforms, including independent "audits" of the algorithms supporting content curation on social media platforms.
• Organized civil society should take an active role in fostering news and digital literacy amongst citizens and users of social media platforms.

IN CONCLUSION

This project set out with the intent of taking stock of the current debate on online content regulation, taking fake news as a case study. It also sought to determine the governance capabilities at our disposal to address such a complicated set of issues at the intersection of the social, technological and legal realms. To this end, we undertook a truncated multistakeholder consultation involving twenty experts from four stakeholder groups, namely academia, civil society, governments and the industry. We also analyzed how three large democracies—Brazil, Germany and India—have responded to the problem of online disinformation through a mix of jurisdictional and other means. We concluded by offering a set of recommendations for a balanced approach to fake news, which foreground an active role for users and the imperative to protect human rights and their enjoyment also on social media platforms. Amongst the proposed measures, we wish to emphasize the enormous potential of media and technology literacy programs: they could help users, on the one hand, to familiarize themselves with the workings of platforms that play such an important role in their lives, and, on the other, to take a critical approach to information consumed online and its sources.

While we, too, agree that multistakeholder involvement is the only way to go if we are to find concerted solutions that are implementable, resilient over time and acceptable to all stakeholders, we raised the problem of the procedural fitness of the multistakeholder model. As it is currently implemented, this model presents multiple shortcomings, not least the power imbalances existing between stakeholders and the inability to shape the rules at work in privately-owned platforms. However, the current excitement around fake news provides a great opportunity to rethink the multistakeholder model, and to include in the effort users who normally remain distant from policy arenas. We believe this is also a good time to encourage the industry to open up to the inputs of other stakeholders, and organized civil society in particular.

The research Content Regulation on and by Platforms: Rethinking Internet Governance vis-à-vis the Platformization of the Web (Principal Investigator Stefania Milan) was supported by a grant of the Internet Policy Observatory, Annenberg School for Communication at the University of Pennsylvania, as part of the Joint Digital Rights and Internet Freedom Research/Advocacy Projects call 2017. Contact: [email protected] and [email protected]. The authors thank Ms. Alexandra Deem (University of Amsterdam) for her assistance with data collection and Mr. Sergio Barbosa dos Santos Silva (Universidade de Coimbra) for helping with references.

The DATACTIVE Ideas Lab is a research & consultancy firm registered in The Netherlands (KvK-nummer 69570132). For more information, visit https://stefaniamilan.net/consultancy
SOURCES

1. "Yes, I'd Lie to You", The Economist, 2016, https://www.economist.com/news/briefing/21706498-dishonesty-politics-nothing-new-manner-which-some-politicians-now-lie-and.
2. Elizabeth MacBride, "Should Facebook, Google Be Regulated? A Groundswell In Tech, Politics and Small Business Says Yes", Forbes, 2017, https://www.forbes.com/sites/elizabethmacbride/2017/11/18/should-twitter-facebook-and-google-be-more-regulated/#7bad9cb41bc5.
3. Eva Galperin, "What the Facebook and Tumblr Controversies can teach us about content moderation", Electronic Frontier Foundation, March 2, 2012, available at https://www.eff.org/deeplinks/2012/03/what-facebook-and-tumbler-controversies-can-teach-us-about-content-moderation.
4. Elle Hunt, "'Disputed by multiple fact checkers': Facebook rolls out new alert to combat fake news", The Guardian, March 22, 2017, available at https://www.theguardian.com/technology/2017/mar/22/facebook-fact-checking-tool-fake-news.
5. Steven Rosenbaum, "The Facebook curation controversy", Forbes, May 31, 2016, available at https://www.forbes.com/sites/stevenrosenbaum/2016/05/31/the-facebook-curation-controversy/#64a7c5497f16.
6. Issie Lapowsky, "In a fake news era, schools teach the ABCs of news literacy", Wired, June 7, 2017, available at https://www.wired.com/2017/06/fake-fact-era-schools-teach-abcs-news-literacy/.
7. Guy Chazan, "Rise of refugee fake news rattles German politics", The Financial Times, February 15, 2017, available at https://www.ft.com/content/11410abc-ef6e-11e6-ba01-119a44939bb6.
8. Interview with civil society representative, February 9 2018.
9. Robert A. Hackett and William K. Carroll (2006). Remaking Media. The Struggle to Democratize Public Communication. New York: Routledge.
10. Anne Helmond (2015). "The Platformization of the Web: Making Web Data Platform Ready", Social Media + Society, 1(2), https://doi.org/10.1177/2056305115603080.
11. Pew Research Center, "Americans' complicated feelings about social media in an era of privacy concerns", March 27, 2018, available at http://www.pewresearch.org/fact-tank/2018/03/27/americans-complicated-feelings-about-social-media-in-an-era-of-privacy-concerns/.
12. For a deeper analysis, see Fabio Giglietto et al, Fakes, News and the Election: A New Taxonomy for the Study of Misleading Information within the Hybrid Media System, presented at the Convegno AssoComPol, December 2016, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2878774.
13. David M. J. Lazer et al., "The science of fake news", Science, March 9, 2018, Vol. 359, Issue 6380, pp. 1094-1096, available at http://science.sciencemag.org/content/359/6380/1094.full.
14. Ethical Journalism Network, "Fake News", available at https://ethicaljournalismnetwork.org/tag/fake-news.
15. Nathaniel Persily, "Can Democracy survive the Internet?", Journal of Democracy, 2017, 28(2), pp. 63-76, https://doi.org/10.1353/jod.2017.0025.
16. Interview with former Member of the European Parliament, February 13 2018.
17. Olivia Solon, "Facebook's fake news: Mark Zuckerberg rejects 'crazy idea' that it swayed voters", The Guardian, November 11, 2016, available at https://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-us-election-mark-zuckerberg-donald-trump.
18. Elle Hunt, "'Fake news' named word of the year by Macquarie Dictionary", The Guardian, January 24, 2017, available at https://www.theguardian.com/australia-news/2017/jan/25/fake-news-named-word-of-the-year-by-macquarie-dictionary.
19. Olivia Solon, "Facebook's fake news: Mark Zuckerberg rejects 'crazy idea' that it swayed voters", The Guardian, November 11, 2016, available at https://www.theguardian.com/technology/2016/nov/10/facebook-fake-news-us-election-mark-zuckerberg-donald-trump.
20. Kalev Leetaru, "Did Facebook's Mark Zuckerberg Coin The Phrase 'Fake News'?", Forbes, February 17, 2017, available at https://www.forbes.com/sites/kalevleetaru/2017/02/17/did-facebooks-mark-zuckerberg-coin-the-phrase-fake-news/#2d740d456bc4.
21. Pew Research Center, "How Americans Encounter, Recall and Act Upon Digital News", February 9, 2017, available at http://www.journalism.org/2017/02/09/how-americans-encounter-recall-and-act-upon-digital-news/.
22. Interview with institutional representative, March 7 2018.
23. Interview with representative of academia, February 9 2018.
24. UN, OSCE, OAS, ACHPR, "Joint Declaration on Freedom of Expression and Fake News, Disinformation and Propaganda", 2017, available at https://www.law-democracy.org/live/wp-content/uploads/2017/03/mandates.decl_.2017.fake-news.pdf.
25. Hunt & Gentzkow, cited in https://www.washingtonpost.com/news/wonk/wp/2017/01/24/real-research-suggests-we-should-stop-freaking-out-over-fake-news/?utm_term=.24b6f8bdf421.
26. Soroush Vosoughi, Deb Roy and Sinan Aral, "The spread of true and false news online", Science, 359(6380), pp. 1146-1151, March 9, 2018, https://doi.org/10.1126/science.aap9559.
27. Yenni Kwok, "Where Memes Could Kill: Indonesia's worsening problem of fake news", Time, January 6, 2017, available at http://time.com/4620419/indonesia-fake-news-ahok-chinese-christian-islam/.
28. Francis Chan, "Indonesian police uncover 'fake news factory'", The Straits Times, September 17, 2017, available at http://www.straitstimes.com/asia/se-asia/indonesian-police-uncover-fake-news-factory.
29. Oliver Smith, "Whatsapp fake news crisis is leading to riots and bloodshed", The Memo, February 13, 2017, available at https://www.thememo.com/2017/02/13/whatsapp-india-fake-news-crisis-is-leading-to-riots-bloodshed/.
30. BBC, "Germany starts imposing hate speech law", January 1, 2018, available at http://www.bbc.com/news/technology-42510868.
31. Joanna Plucinska, "Macron proposes new law against fake news", Politico, January 3, 2018, available at https://www.politico.eu/article/macron-proposes-new-law-against-fake-news/. Also see Aurore Belfrage, "Macron's fake news law will protect democracy", Politico, January 7, 2018, available at https://www.politico.eu/article/macron-fake-news-law-will-protect-democracy/.
32. Jeremy Malcolm, "Malaysia Set to Censor Political Speech as Fake News", Electronic Frontier Foundation, March 27, 2018, available at https://www.eff.org/deeplinks/2018/03/malaysia-set-censor-political-speech-fake-news.
33. Jonathan Kaiman, "China cracks down on social media with threat of jail for online rumours", The Guardian, September 10, 2013, available at https://www.theguardian.com/world/2013/sep/10/china-social-media-jail-rumours.
34. "China Military sets up website to report leaks and fake news", The Straits Times, November 20, 2017, available at http://www.straitstimes.com/asia/east-asia/china-military-sets-up-website-to-report-leaks-fake-news-1.
35. Daniel Funke, "Italians can now report fake news to the police. Here's why that's problematic", Poynter, January 19, 2018, available at https://www.poynter.org/news/italians-can-now-report-fake-news-police-heres-why-thats-problematic.
36. Catherine Edwards, "Italy debates fines and prison terms for people who spread fake news", The Local, February 16, 2017, available at https://www.thelocal.it/20170216/italy-mulls-introducing-fake-news-fines.
37. UN, OSCE, OAS, ACHPR, "Joint Declaration on Freedom of Expression and Fake News, Disinformation and Propaganda", 2017, available at https://www.law-democracy.org/live/wp-content/uploads/2017/03/mandates.decl_.2017.fake-news.pdf.
38. Flemming Rose and Jacob Mchangama, "History proves how dangerous it is to have the government regulate fake news", The Washington Post, October 3, 2017, available at https://www.washingtonpost.com/news/theworldpost/wp/2017/10/03/history-proves-how-dangerous-it-is-to-have-the-government-regulate-fake-news/?utm_term=.acd993e03a89.
39. Pew Research Center, "The Future of Free Speech, Trolls, Anonymity, and Fake News Online", March 29, 2017, available at http://www.elon.edu/docs/e-web/imagining/surveys/2016_survey/Pew%20and%20Elon%20University%20Trolls%20Fake%20News%20Report%20Future%20of%20Internet%203.29.17.pdf.
40. Tarlach McGonagle, "Fake news: False fears or real concerns?", Netherlands Quarterly of Human Rights, 35(4), pp. 203-209, https://doi.org/10.1177/0924051917738685.
41. Luca Belli, Pedro Augusto P. and Nicolo Zingales, "Law of the land or law of the platform? Beware of the privatization of regulation and police", in Platform Regulation. How platforms are regulated and how they regulate us, edited by Belli and Zingales, FGV Direito Rio, pp. 41-64, 2017, available at http://hdl.handle.net/10438/19922.
42. Pew Research Center, "Many Americans Believe Fake News Is Sowing Confusion", December 15, 2016, available at http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/. Americans aged 50 and older are more likely to place a great deal of responsibility on the government, as compared to younger people.
43. Kate Klonick, "The New Governors: The People, Rules, Processes Governing Online Speech", forthcoming in the Harvard Law Review, available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2937985.
44. "What Zuckerberg should do. Facebook faces a reputational meltdown", The Economist, March 22, 2018, available at https://www.economist.com/news/leaders/21739151-how-it-and-wider-industry-should-respond-facebook-faces-reputational-meltdown.
45. Konrad Nicklewicz, "Weeding out fake news: an approach to social media regulation", European View (2017), 16(2), pp. 335, https://doi.org/10.1007/s12290-017-0468-0.
46. Tarleton Gillespie, "The politics of 'platforms'", New Media & Society, 12(3), pp. 347-364, http://dx.doi.org/10.1177/1461444809342738.
47. Phillip M. Napoli and Robyn Caplan, "Why media companies insist they're not media companies, why they're wrong, and why it matters", First Monday, 22(5), May 1, 2017, available at http://firstmonday.org/ojs/index.php/fm/article/view/7051/6124.
48. ARTICLE 19, Submission of Evidence to the House of Lords Select Committee on Artificial Intelligence, September 6, 2017, available at https://www.article19.org/wp-content/uploads/2017/10/ARTICLE-19-Evidence-to-the-House-of-Lords-Select-Committee-AI.pdf.
49. "What Zuckerberg should do. Facebook faces a reputational meltdown", The Economist, March 22, 2018, available at https://www.economist.com/news/leaders/21739151-how-it-and-wider-industry-should-respond-facebook-faces-reputational-meltdown.
50. Ibid.
51. Zeynep Tufekci, "It's the (Democracy Poisoning) Golden Age of Free Speech", Wired, January 16, 2018, available at https://www.wired.com/story/free-speech-issue-tech-turmoil-new-censorship/. From the same author, see also "Facebook's Ad Scandal isn't a 'fail', it's a feature", The New York Times, September 23, 2017, available at https://www.nytimes.com/2017/09/23/opinion/sunday/facebook-ad-scandal.html.
52. Ivana Kottasova, "Facebook and Google to stop ads from appearing on fake news sites", CNN, November 15, 2016, available at http://money.cnn.com/2016/11/15/technology/facebook-google-fake-news-presidential-election/index.html.
53. Thuy Ong, "Twitter starts enforcing new policies on violence, abuse, and hateful conduct", The Verge, December 18, 2017, available at https://www.theverge.com/2017/12/18/16789606/twitter-new-safety-policies-hate-groups.
54. Jeremy Malcolm, "Malaysia Set to Censor Political Speech as Fake News", Electronic Frontier Foundation, March 27, 2018, available at https://www.eff.org/deeplinks/2018/03/malaysia-set-censor-political-speech-fake-news.
55. "Germany: Flawed Social Media Law", Human Rights Watch, February 14, 2018, available at https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law.
56. Mark Scott and Janosch Delcker, "Free Speech v. censorship in Germany", Politico, January 4, 2018, available at https://www.politico.eu/article/germany-hate-speech-netzdg-facebook-youtube-google-twitter-free-speech/.
57. "Germany: Flawed Social Media Law", Human Rights Watch, February 14, 2018, available at https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law.
58. Daniel Solove, "Privacy Self-Management and the Consent Dilemma", 126 Harvard Law Review 1880 (2013), available at https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2171018.
59. See Jamila Venturini et al, "Terms of service and human rights: An analysis of online platform contracts", Editora Revan, 2016, http://bibliotecadigital.fgv.br/dspace/handle/10438/18231.
60. United Nations, "Guiding Principles on Business and Human Rights", available at http://www.ohchr.org/Documents/Publications/GuidingPrinciplesBusinessHR_EN.pdf.
61. Victor Tangermann, "Hearings show Congress doesn't understand Facebook well enough to regulate it", Futurism, April 11, 2018, available at https://futurism.com/hearings-congress-doesnt-understand-facebook-regulation/.
62. See, among others, Milton Mueller (2009). "ICANN Inc.: Accountability and Participation in the Governance of Critical Internet Resources", The Korean Journal of Policy Studies, 24(3), pp. 95-116 & Stefania Milan, "The Fair of Competing Narratives: Civil Society(ies) after NETmundial", Internet Policy Observatory, 10 September 2014, available at http://globalnetpolicy.org/the-fair-of-competing-narratives-civil-societyies-after-netmundial/.
63. See http://www.intgovforum.org/. Emphasizing consensus-based decisions to facilitate buy-in and increase legitimacy, the IGF lacks the ability to produce binding documents. The 2017 IGF gathered in Geneva, Switzerland, December 18-21, 2017. The large bulk of the activity takes place in workshops proposed by the participants and selected by a Multistakeholder Advisory Group (MAG). With the exception of the MAG, the IGF implements an 'open' multistakeholder approach, whereby participant self-selection is expected to balance perspectives. In the 2017 edition, only five out of the about 100 workshops tackled issues related to the fake news controversy; two of such workshops were called for by IGF organizers, and one was organized by the authors of this white paper.
64. Luca Belli and Nicolo Zingales, ed., Platform Regulation. How platforms are regulated and how they regulate us, Rio de Janeiro: FGV Direito Rio, 2017. See also https://www.intgovforum.org/multilingual/content/dynamic-coalition-on-platform-responsibility.
65. RightsCon (https://www.rightscon.org) is organized by the digital rights organization Access Now. In the words of the organizers, it gathers "the world's business leaders, technologists, engineers, investors, activists, human rights experts, and government representatives come together to build partnerships, shape global norms, showcase new technologies, and confront the most challenging issues at the intersection of human rights and technology. More than an event, RightsCon is a global community with thousands of leading voices across stakeholder lines".
66. Admittedly, a comparison including non-democratic countries would provide far starker contrasts.
67. "Act to Improve Enforcement of the Law in Social Networks (Network Enforcement Act)", July 12 2017, available at https://www.bmjv.de/SharedDocs/Gesetzgebungsverfahren/Dokumente/NetzDG_engl.pdf?__blob=publicationFile&v=2.
68. Ivana Kottasova and Nadine Schmidt, "Facebook, Twitter face fines up to $53 million over hate speech", CNN Tech, April 5, 2017, available at http://money.cnn.com/2017/04/05/technology/germany-hate-speech/index.html?iid=EL.
69. "Germany: Flawed Social Media Law", Human Rights Watch, February 14, 2018, available at https://www.hrw.org/news/2018/02/14/germany-flawed-social-media-law.
70. Nick Hopkins, "Revealed: Facebook's Internal Rulebook on sex, terrorism and violence", The Guardian, May 21, 2017, available at https://www.theguardian.com/news/2017/may/21/revealed-facebook-internal-rulebook-sex-terrorism-violence.
71. "Facebook Congressional Testimony: 'AI tools' are not the panacea", ARTICLE 19, April 13, 2018, available at https://www.article19.org/resources/facebook-congressional-testimony-ai-tools-not-panacea/.
72. Sven Jacobs, "Already changes to the new German law on hate speech on social media on the horizon?", Norton Rose Fulbright, March 20, 2018, available at https://www.socialmedialawbulletin.com/2018/03/already-changes-new-german-law-hate-speech-social-media-horizon/.
73. See http://eur-lex.europa.eu/legal-content/EN/TXT/?qid=1506933050778&uri=CELEX:52017DC0555. The European Commission has also sponsored a public consultation on fake news and online information (13 November 2017—23 February 2018), allowing citizens to comment on the definition of fake information online, corrective measures already taken by different actors, and the scope for future actions regarding the issue. See https://ec.europa.eu/info/consultations/public-consultation-fake-news-and-online-disinformation_en.
74. Deepali Moray, "WhatsApp reaches 160 million monthly active users in India; highest in the world", BGR, November 15, 2016, available at http://www.bgr.in/news/whatsapp-reaches-160-million-monthly-active-users-in-india-highest-in-the-world/.
75. Pranav Dixit, "Whatsapp hoaxes are India's own fake news crisis", BuzzFeed, January 19, 2017, available at https://www.buzzfeed.com/pranavdixit/viral-whatsapp-hoaxes-are-indias-own-fake-news-crisis?utm_term=.bfzDD5KANy#.ulGLLdlN6v.
76. The Indian Express, "RBI's new Rs 2000 notes do not have a Nano-GPS chip", The Indian Express, November 13, 2016, available at http://indianexpress.com/article/technology/tech-news-technology/nope-rs-2000-note-does-not-have-a-gps-nano-chip-inside-it/.
77. Pranesh Prakash and Vidushi Marda, "WhatsApp in Kashmir: When Big Brother wants to go beyond watching you", The Scroll, April 28, 2016, available at https://scroll.in/article/807277/whatsapp-in-kashmir-when-big-brother-wants-to-go-beyond-watching-you.
78. Software Freedom Law Centre, Internet Shutdown Tracker, available at https://www.internetshutdowns.in/.
79. "Offensive posts on Whatsapp can land group admin in jail", LiveMint, April 20, 2017, available at https://www.livemint.com/Politics/67gE18Ii7aA8KdE5eos0AJ/Offensive-posts-on-WhatsApp-can-land-group-admin-in-jail.html.
80. "Journalists accused of reporting fake news will lose press credentials till complaint is verified", The Scroll, April 2, 2018, https://scroll.in/latest/874216/journalists-accused-of-reporting-fake-news-will-lose-press-credentials-till-complaint-is-verified.
81. "In name of fake news, government frames rules to blacklist journalists", The New Indian Express, April 3, 2018, available at http://indianexpress.com/article/india/in-name-of-fake-news-government-frames-rules-to-blacklist-journalists-5121246/.
82. "Fake news order: PM Modi wants guideline spiked; Smriti Irani says more than happy to engage with Press", The Indian Express, April 3, 2018, available at http://indianexpress.com/article/india/pm-narendra-modi-directs-information-broadcasting-ministry-to-withdraw-fake-news-order-smriti-irani-5121672/.
83. "India: 20 internet shutdowns in 2017", Human Rights Watch, June 15, 2017, available at https://www.hrw.org/news/2017/06/15/india-20-internet-shutdowns-2017.
84. Taisa Sganzeria, "Brazil Introduces Tougher Regulations on 'Fake News' Ahead of 2018 Elections", Global Voices, December 31, 2017, available at https://advox.globalvoices.org/2017/12/31/brazil-introduces-tougher-regulations-on-fake-news-ahead-of-2018-elections/.
85. Interview with institutional representative, April 24 2018.
86. Rory Cellan-Jones, "Fake news worries 'are growing' suggests BBC poll", BBC, September 22, 2017, available at http://www.bbc.com/news/technology-41319683.
87. Interview with institutional representative, April 24 2018.
88. See https://freedomhouse.org/report/freedom-press/2017/brazil.
89. Interview with institutional representative, April 24 2018.
90. Melanie Ehrenkranz, "Brazil's Federal Police says it will 'punish' creators of 'fake news' ahead of elections", Gizmodo, January 10 2018, available at https://gizmodo.com/brazil-s-federal-police-says-it-will-punish-creators-of-1821945912.
91. Interview with institutional representative, March 7 2018.
92. Interview with representative of academia, February 14 2018.
93. Interview with civil society representative, February 21 2018.
94. Interview with representative of academia, February 16 2018.
95. The Delhi High Court refused to hold a Whatsapp group administrator liable for defamatory statements made by members of the group. See Ashish Bhalla v. Suresh Chawdhary and Ors, Delhi High Court, CS(OS) No.188/2016, November 29 2016, available at http://delhihighcourt.nic.in/dhcqrydisp_o.asp?pn=242183&yr=2016.
96. Renee DiResta et al., "The Bots that are changing politics", Motherboard, November 2 2017, available at https://motherboard.vice.com/en_us/article/mb37k4/twitter-facebook-google-bots-misinformation-changing-politics.
97. Interview with representative of academia, February 14 2018.
98. Interview with representative of academia, February 9 2018.
99. A couple of interviewees likened the current fake news controversy to the advent of the printing press, when the Catholic Church monopolized the ability to publish although the technology had the potential to democratize knowledge and communications. There, the problem was not the technology itself, but the people or the institution trying to capture that technology.
100. Interview with civil society representative, March 14 2018.
101. Interview with civil society representative, February 9 2018.
102. Interview with institutional representative, April 24 2018.
103. Interview with representative of academia, February 9 2018.
104. Interview with representative of academia, February 16 2018.
105. Interview with representative of academia, February 14 2018.
106. Interview with civil society representative, February 9 2018.
107. Interview with institutional representative, February 13 2018.
108. Interview with civil society representative, February 10 2018.
109. Interview with representative of academia, February 16 2018.
110. Interview with representative of academia and civil society, February 17 2018.
111. Interview with institutional representative, January 30 2018.
112. Interview with civil society representative, February 9 2018.
113. Interview with institutional representative, March 7 2018.
114. Interview with representative of academia, February 14 2018.
115. Interview with institutional representative, March 7 2018.
116. Interview with institutional representative, March 7 2018.
117. Interview with civil society representative, March 14 2018.
118. See https://reporterslab.org/fact-checking-triples-over-four-years/.
119. Interview with industry representative, March 7 2018.
120. Interview with representatives from academia, February 16 2018.
121. Interview with representative of academia, February 16 2018.
122. Interview with institutional representative, March 7 2018.
123. Interview with representative of academia, February 16 2018.
124. Interview with representative of academia, February 14 2018.
125. Interview with civil society representative, March 14 2018.
126. Interview with representative of academia, February 16 2018.
127. Interview with civil society representative, February 3 2018.
128. Interview with civil society representative, February 3 2018.
129. Interview with institutional representative, March 7 2018.
130. Interview with representative of academia, February 16 2018.
131. Interview with civil society representative, February 21 2018.
132. Interview with institutional representative, March 7 2018.
133. Interview with civil society representative, February 21 2018.
134. Interview with representative of academia, February 9 2018.
135. Interview with civil society representative, February 15 2018.
136. Interview with representative of academia, February 16 2018.
137. Interview with representative of academia, February 14 2018.
138. The Collaborative Governance Project of the Internet Society goes in this direction: through a mix of training, norm development and the promotion of academic research on the topic, it seeks to 'expand the global knowledge and use of collaborative governance processes to solve problems and develop norms'. See https://www.internetsociety.org/collaborativegovernance/.
139. Interview with representative of academia, February 14 2018.
140. Interview with representative of academia, February 14 2018.
141. Interview with institutional representative, March 7 2018.
142. Interview with representative of academia, February 16 2018.
143. Interview with representative of academia, February 14 2018.