Big Data Surveillance and Security Intelligence
UBC Press gratefully acknowledges the financial support for our publishing program of
the Government of Canada (through the Canada Book Fund) and the British Columbia
Arts Council.
Set in Helvetica Condensed and Minion by Apex CoVantage, LLC
Copy editor: Francis Chow
Proofreader: Judith Earnshaw
Indexer: Delano Aragão Vaz
Cover designer: Alexa Love
UBC Press
The University of British Columbia
2029 West Mall
Vancouver, BC V6T 1Z2
www.ubcpress.ca
14 Metadata – Both Shallow and Deep: The Fraught Key to Big Data
Mass State Surveillance / 253
Andrew Clement, Jillian Harkness, and George Raine
Afterword / 269
Holly Porteous
Contributors / 275
Index / 281
Figures and Tables
Figures
1.1 New Collection Posture NSA slide / 22
7.1 Canadian cyber sensor grid / 133
7.2 Boomerang route originating and terminating in Toronto / 136
14.1 “Context” as metadata category in XKEYSCORE / 261
Tables
7.1 Top border cities for Internet traffic entering Canada / 137
7.2 Top carriers bringing Internet data into Canada / 137
7.3 Principal concentrations of Internet routers by metropolitan area
and carrier / 139
Preface
This book examines one of the most pressing issues in the organization of
society today: the application of new data practices to both global systems and
everyday devices – in this case, those that have to do with national security. The
editors have been involved in research on surveillance in relation to security
for over two decades, but it is difficult to think of innovations as far-reaching
and consequential as those involving so-called big data. This is because they
rest on novel ways of using data, enabled by massive computing power, and
because they touch the lives of everyone. Intelligence gathering in the era of
social media and the Internet of Things cannot but implicate and involve all
citizens, domestically as well as in other countries, not just those conventionally
thought of as suspicious, risky, or threatening.
The book is the product of a large-scale research project involving several
Canadian universities as well as partners and collaborators in other countries,
with its central node at the Surveillance Studies Centre at Queen’s University
in Kingston, Ontario. Other lead researchers are from Université Laval, the
University of Ottawa, and the University of Victoria, along with the University
of St Andrews in Scotland. This project is underwritten by a five-year Partnership
Grant from the Social Sciences and Humanities Research Council (SSHRC), in
which academic researchers work collaboratively with partners from both public
policy and civil society. More specifically, the book comprises expert papers shared at
a very stimulating and fruitful research workshop held in Ottawa, in which both
academics and members of privacy commissions and civil liberties groups
contributed papers for discussion.
We are very grateful to the SSHRC both for the funding opportunity and for
the doors it opened for our research to be carried out in active cooperation
with Canadian government policy and compliance bodies (the Office of the
Privacy Commissioner in Ottawa and the British Columbia Office of the Infor-
mation and Privacy Commissioner) and non-governmental organization
partners (the International Civil Liberties Monitoring Group in Ottawa and the
BC Civil Liberties Association). These partnerships are evident in the chapters
of this book.
Introduction
“Everything has changed” in the wake of big data, declared the Canadian
Communications Security Establishment (CSE).1 While some more skeptical
analysts may raise an eyebrow at this, it is undoubtedly the case that the modes
of analysis of communications described as “big data” have produced huge
changes in the pursuit of national security. Far from being an exclusively Can-
adian phenomenon, this is a global process in which agencies worldwide work
together using new methods of data analytics to try to identify risks to national
security with a view to preventing or pre-empting the possibility that those risks
might become realities. The title of our book, Big Data Surveillance and Security
Intelligence: The Canadian Case, thus sums up a crucially important trend, one
that calls for serious and critical analysis.
While this process is global, one cannot hope to grasp the import and impact
of what is happening by attempting to capture the complete global scene.
Thus, this book is primarily about what is happening in Canada, to follow
how the big data changes came about in one country. Of course, this pulls us
right back into the broader picture because the Canadian experience is deeply
influenced by others, especially through the so-called Five Eyes partnership
of Australia, Canada, New Zealand, the United Kingdom, and the United
States. But the case of Canada remains central to what follows. It is not at all
identical to the experience of others and contains important markers for
ongoing analysis.
First, it is vital that the two terms “security intelligence” and “surveillance”
appear together. The former, a growing activity of any nation-state, requires the
latter, which has to do with the purposeful collection and examination of per-
sonal data. Of course, all manner of information handling may be carried out
for national security, but the aim is to discover human activities potentially
detrimental to the proper functioning of the nation-state. One way this occurs
is through open-source data gathering, changes in which indicate the enor-
mously enlarged scope of personal data gathering and sifting in the twenty-first
century. During the Second World War, “open-source intelligence” referred to
the monitoring of enemy radio stations for clues about hostile activities. Today
it has expanded exponentially to include the monitoring of the Internet and
especially social media. This is the primary origin of the tremendous troves of
data referred to as “big.”
4 David Lyon and David Murakami Wood
Second, as the following chapters indicate, parallel linked trends are apparent
in national security intelligence. One is the expansion of the notion of security
to cover a range of fields not previously designated as such. In the field of
international relations, “securitization” refers to the designation by govern-
ments of their citizens and those of other countries as matters of security. This
means that extraordinary means – rendition to other countries for questioning
or even torture, for example, as happened to Maher Arar and other Canadians
after the terrorist attacks of 11 September 2001 – may be used in the name of
security. Not only persons but also certain kinds of events, such as musical or
athletic activities, or sites, such as sidewalks with no vehicle barriers, may
be newly considered security risks. The second trend is the one already
alluded to, of gathering data from sources that only recently – in the twenty-
first century – have become available.
Each of these trends underscores the importance of discussing the activ-
ities of communications security and security intelligence in Canada and
around the world. And they also point to the need for a deeper understand-
ing of the ways in which surveillance has become such an essential player
in each area, along with others, such as domestic policing, that are also
securitized and also increasingly depend on big data.2 As far as national
security is concerned, the big data connections started to become publicly
clear with the whistle-blowing activities of American security operatives
such as William Binney, Thomas Drake, Mark Klein, and, especially since
2013, Edward Snowden. What they demonstrated was both the widened
sweep of surveillance – often called “mass surveillance” – and the profound
dependence of the national security agencies of all countries on telecom-
munications and Internet corporations for the acquisition and then the
promiscuous use of personal data.
Third, then, is the notion of a big data age. While it may be premature or
misleading to adopt big data as a key descriptor for an historical era, it is none-
theless essential to recognize and assess the impact of big data practices in
multiple areas of corporate and governmental life. If you turn to Google’s
Internet dictionary for a definition of big data, you may find this: “extremely
large data sets that may be analyzed computationally to reveal patterns, trends,
and associations, especially relating to human behavior and interactions.” This
is very instructive because it speaks not only to the volume of data but also to
the dependence on massive computer power, the production of correlations
between disparate factors, and the predominant focus on human activity. What
is missing is a sense of the enthusiasm with which big data “solutions” are often
sought, and the relative lack of critical reflection on the limits on and ethics of
what can be achieved using such methods.
Introduction 5
The “big data” buzzword may have a limited shelf life, but what it points to is
highly significant. At CSE, the relevant phrase is “New Analytic Model” (Chapter 6).
The key idea is to take advantage of the rapidly growing quantities of data
available through the Internet, especially consequent on the rise of social
media. The fact that the platforms learned to monetize what was previously
referred to as “data exhaust” also meant that such data were sought more
vigorously. Rather than relying on conventional modes of analysis, we
can now mine and analyze such data, using algorithms, to discover previously
unnoticed patterns of activity. While big data is often summarized by attributes
such as volume (of data), velocity (speed of analysis), and variety (the expanding
range of usable datasets), its core is practices. As danah boyd and Kate Crawford
note, it is the “capacity for researchers to search, aggregate and cross-reference
large data-sets.”3 They also ask some telling questions about such data
practices.
It is hardly surprising, however, that security intelligence services would wish
to exploit new possibilities for learning from the mass of metadata that actually
answers queries a private detective agency might have – such as location, time,
and type of transaction and communication, along with identifying details of
participants. From about 2009 in Canada, it became clear that legal warrants
were being sought for such data gathering and analysis. However, as hinted at in the
designation of big data as a buzzword, hard ontological and epistemological
questions are easily glossed over. Such practices are all too often marred by what
José van Dijck calls “dataism,” a secular belief in the “objectivity of quantification
and the potential for tracking human behaviour and sociality through online
data,” along with the presentation of such data as “raw material” to be analyzed
and processed into predictive algorithms.4 The potential for new analytics and
the potential problems that arise from this are discussed throughout this book.
It is also worth noting, however, that the processes involved are ones that are
also visible in everyday life, not only in the arcane world of national security
intelligence and surveillance. After all, the data concerned are frequently gleaned
from what David Lyon calls “user-generated surveillance,” referring to the ways
in which contemporary new media encourage the sharing of information and
images and the proliferation of online communications and transactions that
yield highly consequential metadata.5 This parallels Bernard Harcourt’s account,
which explores processes of quotidian self-exposure, where people “follow”
others, “sharing” information as they roam the web and as they themselves are
“followed” by multiple commercial and governmental organizations that obtain
access to the same data.6 The corporate use of such data has reached such
major proportions that theorists such as Shoshana Zuboff dub this “surveil-
lance capitalism,” a process that has far-reaching and profound political,
economic, and social repercussions. This is the crucial context for the present
study.7
The words that form the title of our book sum up what its authors are at pains
to reveal as they debate the big data surveillance involved in today’s security
measures, seen especially in intelligence gathering and analysis dependent on
big data practices. The book does not question the need for forms of national
security or policing as such, but it does raise many questions about the ways in
which new kinds of data production and analysis challenge conventional and
time-honoured practices within such agencies. The book queries the rapid and
wholesale departure from earlier traditions of national security activity, along
with the ethics and legal instruments that have governed these areas in the past,
but it does so with a view to the future of national security, not with a nostalgic
view of what went before. After all, during the twentieth century many questions
were also raised about national security and policing measures. There is a long
history of reasoned and principled assessment of such endeavours, to which
this book makes a further contemporary contribution.
Meanwhile, in the United States arguments about the nation’s own potential
overseas empire beyond the frontier that had its terminus at the Pacific appeared
to have been settled almost by circumstance with the results of the Spanish-
American War and the US acquisition of the Philippines. The United States
instituted an intensive military governance program that conflated crime and
terrorism and saw a “solution” to the problem in counter-insurgency oper-
ations.11 Counter-insurgency thereafter proved a rich vein for new intelligence
services that were created in the twentieth century, notably the Army Intelligence
Agency (later the Defense Intelligence Agency) and the Central Intelligence
Agency (CIA), deployed with debatable success in Europe after the Second
World War via the Gladio network, and over a longer period in Latin America,
following the Monroe Doctrine, which asserted US hegemony over the American
hemisphere. The tactic of essentially manufacturing threats that could be dis-
rupted was also the modus operandi of the Federal Bureau of Investigation
(FBI) on the US mainland.
The postwar period in the United States saw the migration and ascent of
security intelligence from a fringe activity often run largely by enthusiasts and
amateurs to the core of the state during the Cold War, with intelligence agencies – as
expert bureaucracies with the greatest access to secret information, burgeoning
budgets, and the least accountability to elected bodies – constituting the core
of what has been variously called the permanent, secret, or deep state.
With Britain’s economy grievously damaged by the war and its imperial power
broken, the postwar period also saw the United States confirmed as the new
centre of the anglophone world, and military and security intelligence arrange-
ments were among the first to recognize this new reality. Conventional histories
still emphasize NATO cooperation. This was (and is) undoubtedly important,
but in the security intelligence field there is no doubt that secret and unacknow-
ledged accords have long marked the new reality. The signing of the Britain-USA
Agreement (BRUSA) in 1943 set the stage for the UK-USA Agreement (UKUSA)
in 1946, which confirmed American dominance over the still extensive British
imperial intelligence networks. The United Kingdom was vital not just for its
personnel and expertise but also because one major legacy of British imperial
power was its centrality in international undersea cable networks. Any SIGINT
agency with global aspirations was going to want the ability to tap those cables,
and the integration of UKUSA agencies into existing and new cable and domestic
telecommunications systems was a priority from this time.
The United States made it a priority to bring in British white settler-colonial
dominions (Canada, Australia, and New Zealand). For SIGINT, a Canada-USA
Agreement (CANUSA) was signed in 1948, and Australia and New Zealand’s
intelligence services (originally as a single entity) were brought into the emerging
system in 1956. These were the founding “second parties” to what later became
known as the Quadripartite Agreement and later, more informally, the Five
Eyes (FVEY), with the United States as “first party.” The reconstruction of post-
war intelligence agencies in the defeated former Axis powers (West Germany,
Italy, and Japan in particular) was also achieved with US and Allied oversight
(often involving many of the same people who had worked for the former fascist
security intelligence agencies), and the new security intelligence agencies became
“third parties” in the emerging US-dominated international security intelligence
system. Other eventual third parties included members of NATO and countries
propped up with US military aid during the Cold War, such as Thailand and
Israel.
The postwar period also saw the rapid rise of SIGINT, facilitated by both this
international system and the spread of existing technologies like the telephone
and new technologies of computing. The historian of intelligence Bernard Porter
has argued that SIGINT became increasingly central from the 1950s onward,
and that “the future of intelligence clearly lay with ‘Sigint’ and the new
technology.”12
Compared with the United States and the United Kingdom, Canadian par-
ticipation was relatively insignificant during this period. Canadian universities
contributed to CIA research projects in psychological manipulation and torture.
Canadian SIGINT did not really become important until the emergence of the
Communications Branch of the National Research Council in the 1960s and its
eventual more public identity as
the Communications Security Establishment of Canada (CSEC, latterly CSE).13
For much of this time, the most important Canadian contribution consisted of
hosting multiple tiny Arctic installations that formed part of what was called
the DEW (Distant Early Warning) Line and its successors, designed to protect
the United States, and to a lesser extent Canada and Britain, against Soviet
long-range bomber and later ICBM (intercontinental ballistic missile) attack.14
While Canada struggled in many ways to establish a meaningful international
security intelligence role independent of the United States, internally it adopted
tactics that were strongly reminiscent of the FBI’s domestic version of counter-
insurgency. The Royal Canadian Mounted Police (RCMP), by global standards
a rather strange hybrid policing and internal security organization more akin
to London’s Metropolitan Police Service – with its dual local/regional conven-
tional policing and national anti-terrorism/internal security roles – than to any
other organizational equivalent, was infamously revealed to have been involved
in multiple illicit activities in order to uncover enemies within.15
This sparked a national inquiry, the McDonald Commission (Royal Com-
mission of Inquiry into Certain Activities of the RCMP), which led to the
removal of the RCMP’s internal security intelligence role and the creation of
the Canadian Security Intelligence Service (CSIS) in 1984. However, the post-
9/11 environment has once again muddied the never entirely clear blue waters
that separate CSIS’s security intelligence role from the RCMP’s policing mandate.
As CSIS has grown in power and influence, pressure has grown for it to have
more active capabilities to – in the words of the National Security Act, 2017
(formerly Bill C-59) – “disrupt” terrorist plots and so on. In addition, the RCMP
still maintains what can only be described as a political policing role, akin to
the Special Branch of London’s Metropolitan Police, targeting what it describes
as “domestic” or “multi-issue extremists” (who include environmental activists,
Indigenous peoples’ rights organizations, Quebecois separatists, and more).16
As a result, CSIS has developed its own extensive databases, in line with the US
Department of Homeland Security’s “fusion centers,” created after 9/11. There
are multiple proposals in Canada for similar organizations to share data and
cooperate operationally, following a model that was tested for the 2010 Winter
Olympics in Vancouver.17
By 2010, Canada was four years into the decade-long government of Stephen
Harper, who had promised to change Canada “beyond recognition.” This seems
far-fetched, particularly in retrospect and in the long-term historical view taken
here, but there were a number of key changes in the area of security intelligence
with which the country is still wrestling. Harper’s approach in general was to
favour increasing the legal powers of state security intelligence organizations,
with a corresponding reduction in human rights, particularly privacy. Argu-
ments over what became known as “lawful access” were not a consistent factor
over Harper’s ten years in power but became increasingly important, particularly
after 2009.
Successive public safety ministers, Peter Van Loan and Steven Blaney, and
justice ministers Rob Nicholson and Peter MacKay, and above all Vic Toews,
who held both portfolios at different times, joined Harper in these attempts
to enact legislation giving police and security intelligence greater powers.
The initial push in 2009 came through two bills: Bill C-46, the Investigative
Powers for the 21st Century Act, and Bill C-47, the Technical Assistance for
Law Enforcement in the 21st Century Act. Key features, which were to recur
in almost all later bills, centred on allowing warrantless police access to
many aspects of Internet communications. Security fears associated with
the Vancouver Winter Olympics may have had something to do with it, but
terrorism, serious crime, and other threats were cited. Although this pair
of bills did not attract enough support, a second attempt came in 2011 with
Bill C-30, the Protecting Children from Internet Predators Act, whose title
demonstrates the kind of rhetoric deployed to justify lawful access provi-
sions. This again failed.
The Harper government did not limit itself to parliamentary avenues. In 2012,
CSIS was effectively given more leeway with the elimination of the CSIS
Inspector-General’s Office, the body responsible for monitoring CSIS. Instead,
it was to be held accountable by the Security Intelligence Review Committee
(SIRC), which was made up of part-time federal appointees. The federal Office
of the Privacy Commissioner (OPC), although with no direct oversight of the
security intelligence services, is important in setting a tone with regard to privacy
and related human rights; here too Prime Minister Harper attempted to stack
the deck, appointing a career government security lawyer, Daniel Therrien,
when the opportunity arose, over more directly qualified candidates who were
favoured in the privacy law community. Both SIRC and Therrien gradually
discovered their teeth, however, and have bitten back to some extent against
both the government that appointed them and its Liberal successor.
In 2013, there was another attempt to introduce lawful access legislation: Bill
C-13, the Protecting Canadians from Online Crime Act, purportedly to tackle
cyber-bullying but containing provisions that were almost identical to the previ-
ous unsuccessful attempts. It again struggled, although a revised version was
eventually enacted as SC 2014, c 31. Of course, 2013 was a landmark year for
security intelligence because of the revelations of US National Security Agency
(NSA) contractor Edward Snowden. He was neither the first nor the most
highly placed NSA whistle-blower, but the huge cache of internal training
documents and slides he revealed was carefully curated with the assistance of
major newspapers and resonated with the public as no previous revelation had.
Canada initially seemed far from the centre of the action, even though the
documents confirmed that CSE was in fact a long-time junior partner of the
NSA. It was also revealed, however, that CSE had its own mass surveillance
programs. This should have been a national scandal, but for several years after-
wards CSE managed to avoid the consequences of the Snowden revelations that
other Five Eyes agencies faced. Instead, it moved into a slick new building whose
basement would hold massive supercomputing power – essential for the move-
ment to big data analysis.18 Far from CSE being reined in, the year after the
Snowden revelations saw a more concerted attempt to extend its powers.
This time the rationale was “lone wolf” terrorist attacks in Quebec and Ottawa
and the rise of ISIS in the Middle East. Bill C-51 used tactics that the Conserva-
tive government had previously used to bury difficult legislation: it was included
as part of an omnibus bill, making it difficult to deal with the quantity and detail.
Bill C-51 provided CSIS with greater foreign and domestic powers and more
explicit immunity from accountability in the use of these powers. Documents
released in response to access to information requests to the Canadian security
services revealed that the state fears that drove Bill C-51 were much more related
surveillance networks and also, in the twenty-first century, the rise of platforms
and social networking. These generated massive amounts of data as the partici-
pation of ordinary users grew exponentially, which major corporations learned
to harvest for profit. It is these data above all that gave the impetus to big data
intelligence gathering and analysis, which is the subject of the rest of this book.
and David Grondin note the ways in which algorithms form part of the essential
infrastructure of security surveillance, especially as it applies to terrorist finan-
cing and money laundering. In Canada, the Financial Transactions and Reports
Analysis Centre of Canada (FINTRAC) uses algorithm-based alerts to sniff out
suspicious activity and govern the movements of suspects. It is not clear, how-
ever, whether the limitations of such methods are understood by their
practitioners.
Part 2 focuses on “Big Data Surveillance and Signals Intelligence in the Can-
adian Security Establishment.” Chapter 5, by Bill Robinson, carefully tracks the
specificities of Canadian SIGINT history through different phases, showing what
is distinctive about Canadian approaches as well as what Canada shares with other
nations. It is a fascinating and important story, especially as the recently discerned
tendencies of governments to turn their attention to internal “security” have
antecedents. Today’s CSE has grown greatly since the postwar days, however, and
no longer passively receives but actively hunts for data. Among other things, this
makes it more difficult to tell “targets” and “non-targets” apart.
Chapter 6 explores in more detail one recent, crucial aspect of these changes –
the shift within CSE to a “New Analytic Model” starting in 2012. Scott Thompson
and David Lyon use material provided by Access to Information and Privacy
(ATIP) requests to show that, in the same period that the United Kingdom and
United States made similar moves, big data analysis became the order of the
day at CSE. Along with growing dependence on tech corporations and their
software, there is a much greater reliance on computing expertise and
practices – “sandbox play,” for instance. A reduced role for legal and ethical
intervention appears to be the perhaps unintended consequence of these
developments.
Another angle on the work of CSE – the interception of Internet
communications – is investigated by Andrew Clement in Chapter 7. The agency
itself is secretive and, unlike at its US partner the NSA, no truth-tellers have
come forward, as Edward Snowden did, to indicate more precisely what goes on
in the Edward Drake Building in Ottawa. Clement’s evidence strongly suggests
that CSE intercepts Canadians’ domestic Internet communications in bulk – as
do the NSA in the United States and the Government Communications Head-
quarters (GCHQ) in the United Kingdom – which is not legally permitted. The
“airport Wi-Fi” case from 2014 is just the first of several telling examples explored
here. Clement’s case is one that should give pause not only to anyone profes-
sionally concerned with privacy or those seeking stronger digital rights or data
justice but also to CSE and, indeed, every single Canadian citizen.
Part 3 focuses on the “Legal Challenges to Big Data Surveillance in Canada.”
In Chapter 8, Micheal Vonn sets the tone for the section with her analysis of
what can be learned from SIRC reports about the conduct of CSIS, an agency
as shrouded in secrecy as CSE. One report suggests that CSIS data acquisition
practices are “essentially unmoored from law.” Vonn cuts sharply through the
language used by CSIS, showing that data collection is not collection, a threshold
is not a threshold, and guidelines are not guidelines. Is this data collection
proportionate, necessary, and relevant? If not, it may be unconstitutional, and
the Bill C-59 “solution” to these problems may not be a solution.
This issue segues neatly into Craig Forcese’s Chapter 9, which is devoted to
Bill C-59, although readers may conclude that this analysis is slightly more
sanguine about the bill than the previous chapter. Nonetheless, it is a trenchant
critique from a leading legal scholar. He carefully distinguishes, for example,
between surveillance as watching on the one hand and the “potential watching”
enabled by new mass data-gathering methods on the other. The chapter clearly
understands the challenges of big data surveillance but concludes that despite
its limitations, Bill C-59 is a definite improvement on current legal measures.
These difficulties are echoed in a different area of surveillance – policing – that
nonetheless raises issues very similar to those that must be explored with national
security. In Chapter 10, Carrie Sanders and Janet Chan look at how big data
methods are actually used by police (much more is known about their activities
than about CSE and CSIS). Their findings are very instructive for grasping the
scope of this shift not only within the police but also within security agencies.
The connecting word is “security,” which each body has a mandate to protect.
The agencies’ desire to pre-empt and prevent security breaches, such as terror-
ism, is matched by the police claim to be detecting and disrupting crime – in
each case, leaning more heavily on big data. Like some other authors discussing
security agencies in this book, Sanders and Chan query police services’ know-
ledge and capacity to fully understand “the capabilities and limitations of big
data and predictive policing.” Responsible representatives of the security agencies
acknowledge this deficit too.
Part 4 then moves beyond formal legal challenges to consider active “Resist-
ance to Big Data Surveillance” by security intelligence agencies on the part of
civil society. In Chapter 11, Tim McSorley and Anne Dagenais Guertin survey
three revealing cases of resistance to government surveillance in Canada since
2001: Stop Illegal Spying (2013), Stop Online Spying (2011), and the International
Campaign Against Mass Surveillance (2004). They argue that each campaign
did make a difference because each was clearly targeted, created coalitions of
interest, and used diverse tactics to make its claims and mount its cases.
These three case studies are complemented by another – the protests against
government surveillance enshrined in Bill C-51 from 2014. In Chapter 12,
Jeffrey Monaghan and Valerie Steeves see this as unprecedented grassroots
Conclusion
We started with CSE’s assertion that “everything has changed” in an age of big
data. Our brief historical sketch shows that this is far from the first time that
this argument has been made, and that the changes that are undoubtedly occur-
ring have deeper roots themselves, as well as potentially profound consequences.
It is also worth emphasizing that while a turn to a New Analytic Model would
seem to indicate a further shift from what has traditionally been understood as
HUMINT to SIGINT, there are two main caveats. First, security intelligence
agencies, whether SIGINT or HUMINT, are established bureaucratic organiza-
tions subject to the self-perpetuating logic of bureaucracies identified by Max
Weber early in the twentieth century.19 HUMINT agencies persist even as a lot
of what they now do and will increasingly do is indistinguishable technically
from the ostensible function of SIGINT agencies. Second, and despite the first
caveat, a lot of what happens from policing up to national security still has
nothing directly to do with big data. Human sources, tipoffs, infiltration, pro-
vocation, and much more besides remain central modes of operation for security
intelligence and political policing.
Many questions remain as to data’s centrality and “bigness.” As Carrie Sand-
ers shows, in the world of policing, big data practices are often marginalized
compared with these older, more trusted human methods.20 It appears that
this also holds for national security HUMINT work. Perhaps the officers who
doubt big data’s universal usefulness are right to do so: Stéphane Leman-
Langlois is profoundly skeptical about the historical effectiveness of big data
analysis techniques, arguing that most of the lessons that have supposedly
been learned relate to very specific situations that are not applicable in the
national security context.21 The effective use of data is often narrower and
smaller than the hype.
And finally, there are many questions about how the movement towards big
data affects how security intelligence agencies can be controlled and held
accountable, and their powers and reach in some cases rolled back. The cases
offered here present some contradictory lessons. Perhaps it is unlikely that any
legal, regulatory, or political oppositional activities on their own are going to
prevent the accumulation of ever larger collections of data and the application
of ever more “intelligent” analytic techniques, but that does not provide a carte
blanche for the collection of all or any data, for all or any data-mining processes,
and for all or any applications. Above all, the pernicious technocentric story of
endless and unstoppable technical progress must be challenged when it comes
to security intelligence agencies, because their activities can profoundly yet
often silently and secretly affect human rights, civil liberties, and the conditions
for human flourishing in Canada.
Notes
1 CSE refers to this not as big data but as the “New Analytic Model.” See Chapter 6.
2 See, e.g., Chapter 10.
3 danah boyd and Kate Crawford, “Critical Questions for Big Data,” Information, Com-
munication and Society 15, 5 (2012): 662–79.
4 Jose van Dijck, “Datafication, Dataism and Dataveillance,” Surveillance and Society 12, 2
(2014): 197–208.
5 David Lyon, The Culture of Surveillance: Watching as a Way of Life (Cambridge: Polity,
2018).
6 Bernard Harcourt, Exposed: Desire and Disobedience in the Digital Age (Cambridge, MA:
Harvard University Press, 2015).
7 Shoshana Zuboff, The Age of Surveillance Capitalism (New York: Public Affairs, 2018).
8 Clive Ponting, Secrecy in Britain (Oxford: Basil Blackwell, 1990), 10.
9 Such revelations have a long history in themselves; see David Murakami Wood and Steve
Wright, “Before and After Snowden,” Surveillance and Society 13, 2 (2015): 132–38.
10 David Vincent, The Culture of Secrecy in Britain, 1832–1998 (Oxford: Oxford University
Press, 1998).
11 Alfred W. McCoy, Policing America’s Empire: The United States, the Philippines, and the
Rise of the Surveillance State (Madison: University of Wisconsin Press, 2009).
12 Bernard Porter, Plots and Paranoia: A History of Political Espionage in Britain, 1790–1988
(London: Routledge, 1992), ix.
13 See Chapter 5.
14 P. Whitney Lackenbauer and Matthew Farish, “The Cold War on Canadian Soil: Milita-
rizing a Northern Environment,” Environmental History 12, 4 (2007): 920–50.
15 Reg Whitaker, Secret Service: Political Policing in Canada: From the Fenians to Fortress
America (Toronto: University of Toronto Press, 2012).
16 Andrew Crosby and Jeffrey Monaghan, Policing Indigenous Movements: Dissent and the
Security State (Toronto: Fernwood, 2018).
17 For more on CSIS, see Chapter 8.
18 See Chapter 6.
19 Max Weber, Economy and Society, ed. Guenther Roth and Claus Wittich (Berkeley:
University of California Press, 1978).
20 See Chapter 10.
21 See Chapter 3.
Part 1
Understanding Surveillance, Security, and Big Data
1
Collaborative Surveillance with Big Data Corporations
Interviews with Edward Snowden and Mark Klein
Midori Ogasawara
When reporters asked, they [AT&T] would give this strange statement, “we
don’t comment on matters of national security,” which implicates them right
there. National security? I thought you were a telephone company!
– Mark Klein, from my interview in May 2017
One of the most striking facts about today’s security intelligence is its extensive
collaboration with technology companies, which traffic, store, and use people’s
digital footprints, often employing so-called big data practices. It is worth
remembering that people who accused tech companies of cooperating with
governments for state surveillance were usually seen as paranoid or labelled
conspiracy theorists without evidence, until Edward Snowden’s revelations in
June 2013. Although there were a few whistle-blowers prior to Snowden on
tech-intelligence collaborative surveillance, such as former AT&T employee
Mark Klein, their claims were neither understood nor accepted by the public
to the extent that Snowden’s were.1
By disclosing top-secret documents of the US National Security Agency
(NSA), the former contractor unveiled the systematic way tech giants like
Google, Microsoft, Apple, and Facebook have worked with the NSA to provide
volumes of personal data on their customers. Subsequent research by investigative
reporters on the Snowden documents has also shown that major telecommunications
enterprises, such as AT&T and Verizon, have helped the NSA
set up mass surveillance facilities at landing points for transoceanic cables.
Through these specifically named Internet and telecommunications companies
and their documented involvement, people finally realized that governments
could actually seize their personal data and communications, and that it mat-
tered to them. Surveillance became everyone’s issue because most of us were
familiar with Google and Apple and used the private services they provided,
while having only a murky view of state surveillance.
The names of secret partners were deeply hidden. Their extensive cooperation
with security intelligence was the vital key to “Collect It All,” the NSA’s new
imperative, established since the US “War on Terror.”2 In one secret NSA slide,
Figure 1.1 A top secret NSA slide for the 2011 conference of the Five Eyes, showing NSA’s new
organization for “Collect It All.” | Source: Glenn Greenwald, No Place to Hide: Edward Snowden,
NSA, and the US Surveillance State (Toronto: Signal, 2014), 97.
the full circle of the “New Collection Posture” is completed by “Partner It All,”
which in turn enables the starting point, “Sniff It All” (Figure 1.1). But how and why did
private companies become good partners of security intelligence for numerous,
unwarranted wiretappings? What made the two work together for mass surveil-
lance behind the scenes?
This chapter examines the development of collaborative surveillance between
security intelligence and tech companies, and the effect of the collaboration on
the political and judicial spheres. Security intelligence and tech companies
rapidly developed strategic relationships induced by political economic incen-
tives. The main resources for my analysis are two personal interviews I had with
whistle-blowers from both the security intelligence side and the corporate side,
Snowden in 2016 and Klein in 2017. It should be noted that the main focus of
these interviews was on Japan-US relations, because the activities of the NSA
had rarely been contextualized in Japanese politics before. However, Snowden
and Klein explained in the interviews the mechanisms of worldwide surveillance
networks, of which Japan, Canada, and many other countries are part. Because
the worldwide networks are basically American systems, technically supported
by American big data companies, and the NSA shares the data collected from
those networks with other foreign agencies, it can be said that the NSA systems
are also used as a major vehicle for security intelligence in other countries to
obtain data. In this sense, Canada’s foreign intelligence agency, the Communica-
tions Security Establishment (CSE), has been getting more data from the NSA
than the Japanese Ministry of Defense has, as the NSA categorizes Canada in
its closest group of the Five Eyes network (United States, United Kingdom,
Australia, New Zealand, Canada), or the “second parties” in sharing classified
data, while placing Japan among the “third parties” with the rest of the US allies.
Thus, the basic mechanisms of collaborative surveillance between the NSA and
tech companies to be described in this chapter are relevant to Canadian intel-
ligence agencies, to which the NSA and tech companies have been feeding the
data. Furthermore, because of the long-standing Five Eyes relationship, the
NSA’s collaborative techniques with tech companies can imply similar relations
and methods that CSE might have developed with Canadian tech
companies.
Although worldwide surveillance networks, enabled by state collaboration
with private actors, are continuously hidden, the expanded networks have been
affecting global political landscapes and redrawing judicial borders of state
surveillance. To demonstrate this, I will provide a Japanese example, and my
main argument will concern the apparent global tendency to legalize currently
illegal state surveillance of citizens. Snowden elaborated on the NSA’s strategies
to compel the Japanese government to pass certain surveillance legislation while
he worked as the NSA’s secret contractor in Japan. This legislation, called the
Secrecy Act, helped the NSA remain covert and practise further illegal surveil-
lance in Japan, and hid the Japanese government’s involvement.3 Similar stories
about other US allies are unreported as yet. But the Five Eyes and other European
countries have also passed legislation to expand state surveillance over citizens
under the political anti-terror discourse of the past two decades, including
Canada’s Bill C-51 and France’s “Big Brother Laws.”4 Together, they create a
global wave of legalization of previously illegal surveillance of citizens. This
phenomenon strongly resonates with and confirms the NSA’s policy-making
strategies as explained to me first-hand by Snowden.
In what follows, I will first describe a dominant format of the NSA’s dragnet
surveillance, which emerged in the early stages of the War on Terror and which
Klein witnessed in 2004. It was built in part on telecommunications infrastruc-
ture in the private sector and presumably proliferated to other locations, as told
in Snowden’s detailed story of the Special Source Operations (SSO). The part-
nerships with tech companies are shrouded in secrecy, and if the secrecy is
breached, immunity follows retroactively, in order not to punish the NSA
partners that aided the illegal tapping. Second, what Snowden calls “natural
incentives” are present in every chain of collaboration, rather than coercive
orders. I will lay out the political economic interests tying the two entities
together before analyzing two kinds of collaboration, which the NSA categorizes
as “witting” and “unwitting” relationships. In the former, the information and
communications technology (ICT) companies know that they are delivering
data to the NSA; in the latter, the data are delivered without the consent of the
collaborators. The unwitting relationships are often established outside the
United States, through a technological “backdoor,” with foreign intelligence
agencies. The two together keep pushing the boundaries of data collection and
secrecy globally. As a significant outcome of the globalized collaboration, I
examine the Japanese case of surveillance laws – the Secrecy Act, the Wiretapping
Act, and the Conspiracy Act – and discern a judicial trend towards legalizing
previously illegal mass surveillance, in relation to the NSA’s collaborative strat-
egies.5 The early format of retroactive immunity is sustained and reinforced in
the trend, which allows the extra-judicial activities to replace the rule of law.
meant they were collecting everyone’s data. The story about, ‘We’re just getting
international,’ was just a cover story,” said Klein.
Why everybody? Klein also found the points at which the NSA accessed the
networks: peering links. He explained to me:
“Peering links” are a typical term for the links that connect one network with
others, and that’s how you get the Internet. So AT&T’s domestic fiber network
connects to other companies’ networks, like Sprint or what have you, with
peering links so that people who are not AT&T, their messages can get to the
AT&T network. By tapping into peering links, you get a lot more of every-
body’s communications, not just AT&T’s. The fact that they did this in San
Francisco and they were tapping into peering links, people were really upset
when I revealed that.
Klein did not want to be part of the illegal unwarranted wiretapping, but could
not take immediate action for fear of losing his decent job. Later in 2004, the
company offered a financial package for the employees around his age who
wanted to retire. He took this opportunity and retired, taking with him engin-
eering documents that proved how the splitter cabinet was connected to NSA
equipment. He brought the documents to the Electronic Frontier Foundation
(EFF), an advocacy group for privacy and free expression in the electronic age.
In 2006, the EFF sued AT&T on behalf of its customers, for violating privacy
law by collaborating with the NSA in an illegal program to intercept citizens’
communications.8 Klein supported the EFF lawsuit as a witness and his docu-
ments were submitted to the court as evidence in Hepting v AT&T.
The Bush administration quickly intervened in this private case, however.
The government petitioned the court to dismiss the case on the grounds of state
secret privilege. Though the court rejected the petition, the government eventually
succeeded in getting the controversial FISA Amendments Act of 2008 – amending
the Foreign Intelligence Surveillance Act (FISA) of 1978 – enacted by Congress,
under which AT&T was awarded so-called retroactive immunity. Law is in principle
non-retroactive, but this was an exception. The 2008 amendments allow the Attorney
General to require the dismissal of lawsuits over a company’s participation in
the warrantless surveillance program if the government secretly certifies to the
court that the surveillance did not occur, was legal, or was authorized by the
president, whether legal or illegal.9 As a result, in 2009, a federal judge dismissed
Hepting and dozens of other lawsuits over AT&T’s collaboration with the NSA
in illegal wiretapping.
The EFF also sued the NSA and other government agencies in 2008, and
Klein’s documents from AT&T were again provided as evidence that AT&T had
routed copies of Internet traffic to the NSA’s secret room. By then, there was
more public exposure from other whistle-blowers regarding NSA mass surveil-
lance. However, the Obama administration also moved to dismiss this case,
claiming that litigation over the wiretapping program would require the govern-
ment to disclose privileged “state secrets” and that the wiretapping program
was immune from suit, using the same logic the Bush administration had
pursued.10 Though the case is not completely over, Klein believes that President
Obama effectively contained it. “The government has a million ways to kill a
lawsuit against the government. So I don’t hold out any hope for lawsuits,” he
commented.
Klein’s early revelations of NSA mass surveillance systems, in the first few
years after the United States declared the War on Terror in 2001, illuminate two
aspects of the collaboration between the NSA and ICT companies. One is that
the collaboration appears to have been built on the foundation of existing
facilities owned by the telecommunications giants. When I visited 611 Folsom Street
after my interview with Klein, this reality became more tangible, because the
building is located at a busy street corner in downtown San Francisco. The secret
room was set up at a location familiar to locals and visitors, but not many pay
attention to the black building, which has few windows and looms large, like a
huge data box in the city centre. The NSA began a massive wiretap within the
telecom’s facility in this populated area, and the same systems were disclosed
in other AT&T facilities by the New York Times and the Intercept.11 The more
wiretapping points there were, the more personal data were acquired. This
built-in format of dragnet tapping would, as Snowden later revealed, grow
beyond the telecoms’ branches to a worldwide scale.
The other important aspect of the early collaboration between the NSA and
its partners is how devotedly the government protected the private partners
that had delivered the data. The government created a very exceptional, retro-
active law that turned illegal actions of the telecoms into legal ones. Retroactive
immunity nullified the law that banned warrantless wiretapping against citizens,
and the presidential order, whether legal or illegal, replaced the law. No
responsibility or accountability can be demanded if the rules are changed
retroactively after the game has started. Retroactive immunity is fundamentally
opposed to the rule of law and democratic oversight by Congress and the
people. This type of immunity has become an increasingly common strategy
used by governments to expand partnerships for mass surveillance worldwide,
as I discuss later. The strategy has drastically changed the boundaries between
legal and illegal state surveillance. My interview with Mark Klein highlights
the early format that enabled collaborative surveillance, both physically and
judicially.
just like duplicating all data going through the domestic communications cables
in San Francisco.
Why did the NSA develop this method as today’s major source of security
intelligence? The spy agency asked itself that very question in one of the Snowden
documents, which also supplies the answer. “Why SSO?” Because “SSO is ‘Big Data.’” It
can capture up to “60% of content” and “75% of metadata” in a digital data
stream.14 There is no definition of big data in this document, but it is easy to
assume that big data includes not only actual correspondence among people,
such as telephone calls, emails, and text messages, but also human behaviour,
both online and offline, such as what people searched online and where they
went. Big data gives the NSA a wider scope of information with which to predict
what people are likely to do next.
Snowden described the actual process behind the scenes – how the NSA
negotiated with its partners to set up choke points at the landing sites. The NSA
normally pays the company for a room where it puts all its equipment, including
its servers. The NSA attaches special locks to the room and allows only staff
with security clearance to enter. It then asks the company to run the cable
through the NSA’s equipment, to copy all the data it wants. The telecoms do not
concern themselves with what the NSA does with that data. “But, there is no real case here
where these companies don’t know about it,” asserted Snowden.
Why do big data corporations cooperate with these illegal wiretapping pro-
grams? Snowden points out that these companies take the NSA’s requests into
their business calculations. To expand the service areas and access new networks,
the company needs to obtain permissions and approvals from the government.
If the company got into trouble with the NSA by refusing the requests, it might
lose an opportunity to increase profits, resulting in a shrinking of its business.
“So, they’ve got a little bit of leverage with the government,” says Snowden.
Not that they are really threatened by the government, but the government
goes, “Well, it’s in our interests to help them build this new site or do whatever,
because they’ll give us access too.” And so, it’s this kind of culture that nobody
sees, because this all happens secretly. But it’s the nature of how intelligence
works ... you don’t want to think about villains. What you want to think is what
are the natural incentives for all of the people engaged at every level of the pro-
cess? What, if you were in their position, what would you do? And, suddenly, it
becomes obvious why everything works out the way that it does; everybody is
just scratching each other’s back.
encouraged avoiding trouble and following the existing tacit rules, for job secur-
ity and promotion. If any lawmakers attempted to stop the process, they could
find themselves in a politically vulnerable situation. If any incidents occurred,
they could be blamed or retaliated against by the intelligence agency that can
find anyone’s weak spot and leak personal data to achieve its goals. It would be
a safe choice for lawmakers to let the intelligence agency do what it wants, so
they suffer no consequences at all. “It’s not a product of evil ... It’s a product of
incentives in the system. Secrecy corrupts the process of democracy.” Because
few challenge the secret power of security intelligence, the intelligence agencies
transcend democratic power and simultaneously undermine democratic deci-
sion making.
Snowden attributes the rapid expansion of collaborative surveillance to pol-
itical economic incentives active at both ends – the NSA and ICT companies.
On a cautious reading, delivering customers’ data to the NSA does not
directly profit the ICT companies. Rather, it violates customers’ privacy and
other rights, breaches organizational compliance, and may damage business by
harming public trust. The calculus is different, however, when the collaborations are hidden.
Secrecy nudges the ICT companies to avoid problems with the authorities,
and to choose the political economic benefits of conformity. In fact, when the
secret was revealed by the whistle-blowers, angry customers began accusing
the companies, as when the EFF sued AT&T on behalf of its customers. Thus, secrecy
is a necessary condition for “natural incentives” to come into effect and for
the political economic incentives to grow. The NSA and ICT companies can
“scratch each other’s backs” while people don’t know that their data are actually
used in the deal.
If all these processes had been transparent from the very beginning, the
political economic incentives would have been more likely to turn into business
risks because customers would stop using the services or sue the companies.
Collaboration based on illegal surveillance inherently requires secrecy, and
secrecy lets the incentives work for both ends – the government and the private
sector. However, an end to the secrecy initiated by Klein, Snowden, and other
whistle-blowers did not end the collaboration. With secrecy removed, the gov-
ernment took formal action to wrench the legal standard towards illegal prac-
tices, by passing laws to legalize the unprecedented scale of state surveillance.
Instead of secrecy, the government invented and provided retroactive immunity
to its partners, to protect their illegal involvement in mass surveillance
imperative consists of six stages in a cycle: Sniff It All, Know It All, Collect It
All, Process It All, Exploit It All, and Partner It All (see Figure 1.1).15 The last
stage is particularly relevant, as it guarantees the means of collecting personal
data through communication infrastructure as much as possible.
Snowden’s analysis is this: “In many cases, the way it works, the biggest ICT
companies are the ones that are working the most closely with the government.
That’s how they got to be big and that’s why they don’t have any trouble with
regulatory problems, and they got all the permits they need.” The largest Amer-
ican telecoms, AT&T and Verizon, have been reported as having “extreme
willingness to help” the NSA.16 A smooth relationship with the government
helps business grow, so the political economic incentives are compelling. The NSA
aims to eventually set up an SSO at all landing sites of all international trans-
oceanic cables to gear up for Collect It All.
Snowden told me that these SSO collaborations between the government and
businesses are called “witting” relationships in NSA terms: the executives know
that they are working with the intelligence agency and that information is closely
shared with other intelligence agencies, such as the Five Eyes or the second
parties.17 For example, participants in PRISM, such as Microsoft, Facebook,
Google, and Apple, are involved with witting relationships, as are AT&T and
Verizon in the SSO.
Much less known are the “unwitting” relationships, where companies are
unaware that they are providing data to the NSA. These are unilateral programs
in which the NSA sets up tapping devices within ICT equipment and uses them
as a “backdoor” to absorb data to its servers.18 For example, according to Snow-
den, the NSA encourages foreign governments to set up their own network
equipment, whether in Afghanistan or Japan, so the foreign governments order
these high-tech appliances from ICT companies, many of which are based in
the United States. When the equipment is shipped from or transits the United
States, the NSA attaches tapping devices to the product. “So we made them into
a kind of Trojan Horse where you install it on your network and it works per-
fectly for you but it also works for us without you knowing.”
Those more deeply hidden unilateral programs directly assist anti-terror
tactics: identifying targets through location technologies and attacking them
remotely with drones. The point here is that the NSA sneakily develops unwitting
relationships in order to push the limits of witting relationships towards Partner
It All. In fact, the NSA has established both types of relationships with thirty-
three countries called third parties, including Japan, Germany, France, Brazil,
and Mexico.19 These countries are diplomatically US allies, so they often cooper-
ate with NSA surveillance by offering and exchanging data. Simultaneously,
however, the NSA surveils these countries, including German chancellor
On 5 Aug 2011, collection of DNR and DNI traffic at the FAIRVIEW CLIFF-
SIDE trans-pacific cable site resumed, after being down for approximately five
months. Collection operation at CLIFFSIDE had been down since 11 March
2011, due to the cable damage as result of the earthquake off of the coast of
Japan ... FAIRVIEW operations will continue to task collection for all new and
restored circuits.24
The Secrecy Act was needed in this respect, and was proposed to Japanese
counterparts.
Again, they are not doing this for [an] evil purpose. They see this as necessary for
national defense. But the bottom line is, you start to get a group of countries, this
Five Eyes network, that had been creating this system of spying and judicial run-
arounds, legal runarounds for many years. And they start to export it to other
countries. They go, if you pass these laws ... of course, you can rewrite it. You
don’t have to pass the law exactly as we say. But, in our experience, (we say,) this
is what you should aim for, you should do this, you should do this. Those other
countries go, well, hey, we should do this. And, this is exactly what happened
with the state secrets law in Japan. When I was in Japan, we would have Japanese
partners come over to our building at Yokota. They knew we were there. It was
like the world’s greatest secret, because we were sharing information that we
were gaining from all this spying.
And, we would say we can only share information at this level of classification
with you. We use it as sort of a carrot and a stick, where the Japanese military
want this piece of information, or they want that piece of information. Then we
say, well, we can’t give you that because your laws aren’t the way we want them
to be. We’ll give you this smaller piece of something else that might be helpful,
just kind of a token gift to help them save face ... So we would share things at the
secret level in Japan. We said, if you change your laws, we’ll share things at the
top-secret level with you.
These conversations all take place behind the curtain of the intelligence agencies;
I had never heard of them as background to the Secrecy Act. I asked Snowden,
skeptically, how these conversations finally became law and what the diplomatic
process was. He explained that the conversation normally began with division
chiefs of intelligence agencies, then went up to the heads of the agencies and
eventually to the Department of State to formalize the agreement at a policy
level (this process was evidenced by an NSA document disclosed in April 2017).29
The NSA is aware that conducting mass surveillance is illegal according to Jap-
anese constitutional rights and judicial standards. “So it’s in violation of the law.
But, it doesn’t really matter at this point because no one can find out, at least in
terms of the government’s political calculation.” Secrecy, in this case about the
And, then, eventually, on the basis of this, if this program has been continuing
for too long without legal authorization, we’ve proved that you need it. This is
how they get their foot in the door, a kind of press government. And I’m talking
about intelligence agencies in general, not just here in the States. In Japan, they
get their foot in the door by saying we’ve already operationalized this policy
and it allowed us to learn this, this, and do this and this, whatever, but we don’t
have the legal authorization we need to continue this program. So, if you don’t
force a law through your government that authorizes this, we’ll have to shut the
program down.
journalists. This was an extreme drop from 11th place in 2010.31 The index reflects
the serious chilling effect of the Secrecy Act, which induced self-censorship
in the media. The United Nations Special Rapporteur on Freedom
of Expression, David Kaye, visited Japan in 2016 and also reported on the nega-
tive effects of the Secrecy Act.32 The act has surely prevented the public from
knowing about the government’s illegal practices, including its support for the
NSA’s illegal surveillance. And this is not the end of the story.
After passing the Secrecy Act through an undemocratic procedure in the Diet, Prime
Minister Shinzō Abe’s government proposed and enacted two other surveillance
laws. One is a revision of the Wiretapping Act of 2016 that greatly expanded the
categories of crimes subject to police wiretapping investigations. The other is
the 2017 Conspiracy Act. The Conspiracy Act had failed three times in the Diet in the
past because of its extreme stance against privacy and free speech. It criminalizes
people for communications in which they agree to alleged crimes, whether
or not they actually commit any criminal act. It replaces, without declaring so,
Japan’s postwar principle of criminal justice, under which no one is charged
with a crime before taking action. It enables the police to tap potentially everyone’s
conversations, in order to identify who is talking about crimes.33 We have not
heard that the United States has been involved in crafting these two laws, but
they obviously help both the Japanese and American governments enlarge their
surveillance capacities. The Wiretapping Act legitimized the means of wiretap-
ping in criminal investigations, and the Conspiracy Act created the new reason
for the police to monitor everyone’s conversations, including emails, text
messages, video chats, and so on. The serial enactment of the three laws first
protected the NSA’s operations and the Japanese government’s collaboration,
and then gave real teeth to the legal extension of mass surveillance. Together,
the three laws vitiate the once established legal
standards of privacy and right to confidential correspondence in the Japanese
Constitution, and modify them to accommodate the illegal surveillance prac-
tices.34 The previously illegal reality of mass surveillance has now been trans-
formed into lawful data collection, where extra-judicial and judicial power are
indistinguishable.
Significantly, retroactive immunity, the early means of collaborative surveillance
used to protect the NSA’s private partners in the United States, has been
developed into a more legitimate form in Japanese surveillance laws, to legalize
illegal state surveillance. It is important to recognize this transformation because
retroactive immunity is an exceptional measure deployed particularly to absolve
the tech companies of privacy violations. It had to be made retroactive, contrary
to the legislative principle of non-retroactivity; otherwise those companies might
have been found guilty in American courts. Retroactive immunity reveals the
Conclusion
In this chapter, I have described how the ICT industries have worked with the
NSA to collect and deliver data worldwide, based on the accounts of two whistle-
blowers. In summary, the NSA has developed a format of mass surveillance
systems built into existing digital communications infrastructure, from telecoms’
Collaborative Surveillance with Big Data Corporations 37
big data corporations have been changing citizens into consumers, purely and
forcibly.
Thus, political and judicial standards to protect human rights are under attack
from technological solutionism, within the larger picture of a data-driven society. The
collaboration between security intelligence and big data corporations takes
place outside official judicial institutions and takes over political spheres. There
is a fundamental ideological sync between the two in transcending democratic
oversight and subverting judicial rules. Big data is currently exploited in this
depoliticized technological zone, where the two players (and other players too)
envision controlling the future, without political and judicial interventions.
As a closing note, whistle-blowers are a very rare resource for unveiling the
murky military/surveillance-industrial complex today. Their protection is an
urgent matter for any society. On the other hand, Mark Klein told me that
whistle-blowers are not sufficient to bring about political change: “What you
really need is a combination of whistleblowers and a party that’s determined to
bring down the government. And, you need large masses of people angry and
marching the streets.” Even as I was nodding, I was curious about why he decided
to begin this process alone. He answered in a husky voice, “I thought it [war-
rantless wiretapping by the NSA and AT&T] was illegal and immoral and
dangerous, and I thought I had a chance to do it and get away with it ... Well, I
had some principles.” At least, democratic “principles” may offer everyone a
starting point for holding the secretive collaborators of big data surveillance
accountable and countering illegal state practices.
Notes
1 I have to admit with shame that I was part of the ignorant public despite having
researched surveillance studies for several years.
2 Glenn Greenwald, No Place to Hide: Edward Snowden, NSA, and the US Surveillance State
(Toronto: Signal, 2014), 97.
3 Midori Ogasawara, スノーデン、監視社会の恐怖を語る [Snowden talks about
the horrors of the surveillance society: The complete record of an exclusive interview]
(Tokyo: Mainichi Shimbun, 2016).
4 Canada, Bill C-51, An Act to amend the Criminal Code and the Department of Justice Act
and to make consequential amendments to another Act, 2017, https://www.justice.gc.ca/
eng/csj-sjc/pl/cuol-mgnl/c51.html; Ewen MacAskill, “How French Intelligence Agencies
Failed before the Paris Attacks,” Guardian, 15 November 2015, https://www.theguardian.
com/world/2015/nov/19/how-french-intelligence-agencies-failed-before-the-paris
-attacks; Kim Willsher, “France Approves ‘Big Brother’ Surveillance Powers despite UN
Concern,” Guardian, 24 July 2015, https://www.theguardian.com/world/2015/jul/24/
france-big-brother-surveillance-powers; Angelique Chrisafis, “France Considers Extend-
ing National State of Emergency,” Guardian, 22 January 2016, https://www.theguardian.
com/world/2016/jan/22/france-considers-extending-national-state-of-emergency;
alongside each statute: the Solicitor General Enforcement Standards for Lawful
Interception,13 CSE Ministerial Authorizations,14 Canadian Radio-television and
Telecommunications Commission (CRTC) Licensing Requirements, CSIS
Operational Directives,15 classified opinions and recommendations from the
National Security Branch at the Department of Justice,16 redacted jurisprudence
from the Federal Court,17 and reviews determined to be Secret or Top Secret
previously conducted by the Security Intelligence Review Committee (SIRC)
and the Office of the Communications Security Establishment Commissioner
(OCSEC) and now conducted by the National Security Intelligence Review
Agency (NSIRA). All such documentation can significantly affect operations,
even though it flows beneath the visible surface of the law. For example:
• Security of Information Act – provisions on document-handling infractions
[section 4(4)(b)] and offences for unauthorized entry of classified areas (section 6)
• Criminal Code – the investigative powers utilized in national security investiga-
tions, including production orders (section 487.014), tracking devices (section
492.1), wiretaps (section 186),18 and/or search warrants (section 487)19
• CSIS Act – sections 21 and 21.1, providing authorities to investigate threats to
Canadian systems20 and/or alleged involvement of foreign powers
• Customs Act – section 99.2, providing for broad powers of inspection and/or
seizure for any persons leaving or entering Canada
• Canada Evidence Act – section 38, providing for secrecy provisions and sealing
of court records used to cover sensitive evidence revealed, as well as to conceal
particular surveillance techniques
• National Defence Act – section 273.3, allowing CSE to provide specialized lawful
access (SLA) assistance to other federal agencies [paragraph 273.64(1)(c)]21
• PIPEDA – subparagraphs 7(3)(c.1)(i) or 7(3)(d)(ii), wherein companies such
as airlines or telecommunications service providers (TSP) holding information
relevant to an investigation bearing on Canada’s national defence or security are
given specific discretion to disclose information to authorities
• Privacy Act – paragraph 8(2)(c) or (e), whereby other government departments
may also disclose information to investigators, either under a court order or in
the enforcement of any law of Parliament22
• Telecommunications Act [section 15(1)] and Radiocommunication Act [section
5(1)(d)], which set out provisions23 allowing agencies to specify through regu-
lation or standards the maintenance of particular surveillance capabilities on
networks.24
Any or all of the above-noted provisions may figure in the court orders,
ministerial authorizations, supporting opinions, or warrant procedures tied to
either initiating specific surveillance or laying the foundation for approving
46 Christopher Prince
such activities. Arguably, they form the core legal parameters that responsible
ministers, members of the judiciary, or both expect to review before they sign
off on the particulars of intrusive surveillance or authorize broad collection of
digital evidence.25 Specific targets and suspected crimes may change from case
to case, but the underlying legal ground rules should be fairly familiar to
officials, even if they are not widely known outside security intelligence
organizations.26
So while surveillance conducted in the context of security intelligence is
almost always clandestine, it would be difficult to argue that these activities are
somehow unmoored from the law.27 Nor is it automatically malicious for gov-
ernment to keep secrets, if only to safeguard the lives of certain sources and
citizens. In fact, in any given operation conducted in Canada, by the RCMP,
CSIS, or CSE, the overlapping legal requirements accumulate quickly. Despite
commentary to the contrary – that surveillance and security operations some-
how unfold in a legal vacuum – the opposite argument can be made. The legal
framework around national security investigations in Canada offers practical
examples of the surveillant assemblage, given the number of statutes that mutually
reinforce and legitimize each other.28 That theoretical term – developed by Kevin
Haggerty and Richard Ericson – refers to the elaborate arrangement of technical,
legal, personnel, and commercial systems required to operationalize ongoing,
persistent surveillance.
And in the midst of these controversies, our courts have rightly maintained
an independent skepticism towards surveillance solutionism; they have instead
asked government to reconsider first principles. The Supreme Court of Canada
treated these questions in recent cases such as Tse, Wakeling, Spencer, and
TELUS.70 Our Federal Court underscored related points through decisions of
Justices Mosley and Noël.71 The US Courts of Appeals did likewise in their
review of mobile device tracking and in Jones.72 The European Court of Justice
did so in annulling the Safe Harbor Agreement. Proper democratic practices
and constitutional safeguards against intrusive state powers have been insisted
upon, and that is, ultimately, the purpose of public law, underscored by the
very due process discussions that courts around the world have been urging
us to have.
Acknowledgments
The views presented are solely those of the author. Thanks to both Christopher Parsons
and Lisa Austin for kind suggestions.
Notes
1 Henry A. Crumpton, The Art of Intelligence: Lessons from a Life in the CIA’s Clandestine
Service (New York: Penguin, 2012), 7; Heidi Boghosian, Spying on Democracy: Govern-
ment Surveillance, Corporate Power and Public Resistance (San Francisco: City Lights
Books, 2013), 22.
2 Frederick A.O. Schwarz, Democracy in the Dark: The Seduction of Government Secrecy
(New York: New Press, 2015), 36–37.
3 The most recent official apology and work towards compensation concerned the target-
ing and harassment of public servants and military staff who were gay, a practice that
continued into the early 1990s; see John Paul Tasker, “Ottawa to Apologize Formally to
LGBT Community for Past Wrongs,” CBC News, 17 May 2017, http://www.cbc.ca/news/
politics/ottawa-apologize-lgbt-community-past-wrongs-1.4120371.
4 For recent discussion of individual effects, see Fen Osler Hampson and Eric Jardine,
Look Who’s Watching: Surveillance, Treachery and Trust Online (Waterloo, ON: CIGI,
2016).
5 G.T. Marx, Windows of the Soul: Privacy and Surveillance in the Age of High Technology
(Chicago: University of Chicago Press, 2016), 321.
6 Laura K. Donohue, The Future of Foreign Intelligence: Privacy and Surveillance in a Digital
Age (New York: Oxford University Press, 2016), 145.
7 David Anderson, A Question of Trust: Report of the Investigatory Powers Review (Lon-
don: The Stationery Office, 2015), https://terrorismlegislationreviewer.independent.gov.
uk/wp-content/uploads/2015/06/IPR-Report-Web-Accessible1.pdf, 8, 148, 218.
8 2014 FCA 249, Federal Court of Appeal, https://www.canlii.org/en/ca/fca/doc/2014/
2014fca249/2014fca249.pdf.
9 Commission of Inquiry into the Investigation of the Bombing of Air India Flight
182, Air India Flight 182: A Canadian Tragedy, vol 3, The Relationship between Intel-
ligence and Evidence and the Challenges of Terrorism Prosecutions (Ottawa: Public
Works and Government Services Canada, 2010), http://publications.gc.ca/collections/
71 Canadian Press, “CSIS Slammed for End-Running Law to Snoop on Canadians Abroad,”
CBC News, December 2013, http://www.cbc.ca/news/politics/csis-slammed-for-end
-running-law-to-snoop-on-canadians-abroad-1.2472843.
72 “United States v. Jones,” SCOTUSblog (blog), http://www.scotusblog.com/case-files/
cases/united-states-v-jones/; “Police Use of ‘Sting Ray’ Cellphone Tracker Requires
Search Warrant, Appeals Court Rules,” Washington Post, 21 September 2017, https://
www.washingtonpost.com/news/true-crime/wp/2017/09/21/police-use-of-stingray
-cellphone-tracker-requires-search-warrant-appeals-court-rules.
3
Big Data against Terrorism
Stéphane Leman-Langlois
In the realm of national security, big data analytics aim at two things. The
first is pre-emption, or stopping an attack at the preparation stage. This might
involve communications analysis or the recording of purchases of dangerous
materials. It may also be possible to detect behaviour patterns that reveal the
“casing” of particular targets. The second aim is prevention, which needs to
occur further upstream, before violent action is even considered. Usual prevention
tactics involve network disruption or the identification of persons “at risk.”
The analytical tools used to extract “actionable” information from large data-
sets have yet to show results, however. Agencies tasked with maintaining order
and with law enforcement have mostly tackled terrorism through traditional
patrol and investigation techniques, and mostly starting with citizen reports.
Such so-called low policing organizations are quickly adopting high-tech “pre-
dictive policing” approaches that make use of big data analysis, but those are
most useful with common, high-frequency crimes. On the other hand, entities
tasked with national security missions outside of officially defined crimes, or
“high policing” organizations,1 have enthusiastically embraced big data surveil-
lance for the protection of the state, both in detection and interference missions.
The results also remain less than conclusive to date, for reasons listed below.
Meanwhile, since the data in question involve large numbers of citizens, collecting
them has raised many questions about state power and individual autonomy.
Despite the overwhelming costs and the fleeting benefits, however, big data
approaches to terrorism flourish and are definitely here to stay. In this chapter,
I review the problems and then the costs of big data for national security, con-
cluding with the incentives that keep it alive.
of what terrorists, in general, have done when planning past attacks. Unfortu-
nately, this distillation is vulnerable in many ways. First, our knowledge of
terrorist biographies is incomplete at best. Most were collected through state-
ments given to police, court proceedings, and direct or indirect testimony. Each
of these sources is weighted by different, but highly consequential, biases. So
these biographies are reconstructions, framed by the demands of the circum-
stances in which they are produced. This is true of even the most factual aspects
of the stories being told.
Second, much external data on terrorist activity comes from sources whose
objectivity is overestimated. Official reports, stories in the media, and police
reports are in fact filters that process and classify data according to common
(usually wrong) knowledge about terrorism. In the case of police reports, we
must also take into account the dozens of cases involving non-trivial interven-
tion by undercover agents in the plotting.2
In Canada, the Korody-Nuttall case in British Columbia illustrates this problem.
After months of investigation and trial, the court finally found that RCMP
personnel were so closely involved in the plotting of the attack that the two
defendants had been entrapped. Both were immediately freed. Intelligence
reports are also commonly used in the building of terrorism or extremism
models and are often taken as highly reliable. Yet they are often heavily flawed
and rest on highly questionable sources. In reality, untold amounts of “intelli-
gence” are based on show trials held in dictatorships around the world, security
services manipulation, and politically expedient designations used to neutralize
individuals, groups, and populations. For instance, the famous “returnees from
Albania” show trial in Egypt in 1999 gave rise to hundreds of wrongful identi-
fications in the West, including the issuance of security certificates in Canada.
One Mohamed Mahjoub was held in custody for over twelve years as the head
of the Vanguards of Conquest in Canada, an extremely dangerous Egyptian
terror group, which also happens to have never actually existed, according to
expert testimony in the case. Finally, even scientific, apolitical databases are
coloured by politically loaded definitions of “terrorism.” This is not a semantic
problem: it can multiply or divide the number of cases by extreme proportions.3
In turn, this variation is bound to affect any identification of possible “pre-
terrorist” behaviours.
Originally, the next step was to use these historical sources to identify behav-
iours or sequences of behaviours that had preceded past terrorist acts. These,
in turn, would become the red flags that automated search tools would look for
in current masses of data. This is what the Total (later, Terrorism) Information
Awareness (TIA) program at the US Defense Advanced Research Projects Agency
(DARPA) had planned to achieve in the frantic months following 11 September
2001.4 Current big data analytics no longer look for pre-established patterns in
masses of data. Instead, the idea is to develop both the patterns and the search
strategy in combination, in real time, one adjusting the other continuously. Though
this solves the post facto problem, it does almost nothing to counter the “garbage
in, garbage out” problem of unreliable data. At best, it can help flag particularly
egregious unintentional or deliberate errors.
The second strategy is to look at statistically atypical clusters of data traces
and transactions. The need for pre-established patterns disappears, as the target
moves from the needle to the haystack and the goal is to filter out whatever is
statistically normal. Of course, the parameters for “normal” behaviour are also
far from consistent, but outliers are relatively easy to identify. Large amounts
of fertilizer delivered to a suburban bungalow, or multiple one-way airline
tickets purchased at the same time, can stand out.
One approach is to start with a clustering method, first sorting out the base
rates for various behaviours in order to identify statistical outliers. Neural net-
works are especially adept at automatically forming and testing hypotheses
about regularities, through machine learning. This takes many subjective,
cultural, and simply erroneous assumptions, theories, and classifications out of
the loop.
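A minimal sketch of what such base-rate screening amounts to (illustrative only: the behaviour counts, the three-standard-deviation threshold, and the `flag_outliers` helper are assumptions for exposition, not anything deployed by the agencies discussed here):

```python
import statistics

def flag_outliers(counts, threshold=3.0):
    """Flag indices whose value sits more than `threshold` standard
    deviations from the population's base rate (here, the mean)."""
    mean = statistics.fmean(counts)
    stdev = statistics.stdev(counts)
    return [i for i, c in enumerate(counts)
            if stdev > 0 and abs(c - mean) / stdev > threshold]

# 999 ordinary observations clustered around the base rate, plus one
# extreme case - e.g., fertilizer orders per address, as in the text.
orders = [2] * 400 + [3] * 400 + [4] * 199 + [120]
print(flag_outliers(orders))  # → [999]: only the extreme entry stands out
```

As the chapter goes on to argue, the hard part is not flagging the outlier but deciding whether "abnormal" means anything: the same code flags every eccentric-but-innocent case just as readily.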
However, both terrorist and non-terrorist behaviours are extremely diverse
and do not fall in clear categories, meaning that there is no way to identify the
extent of the intersection between the “abnormal but acceptable” and the “ter-
rorist” sets. Because of the rarity of terrorism, and the profusion of eccentric
behaviours, the number of false positives is likely to skyrocket. The result could
easily overwhelm policing organizations, who might be able to follow up only
a very small minority of computerized notifications.
Finally, big data analytics may also seek to identify clandestine, or “dark”
social networks. Such networks are different from those based on family, work
or leisure, friendships, and the like, because they are attempting to evade detec-
tion. This means that their communication structure is significantly different,
and therefore might be detected. At the moment, most link analysis approaches
begin not with big data but with very small data: once a person is labelled a
terrorist, his or her contacts are simply singled out within the masses of captured
metadata (usually with a court-issued warrant). This is at once simpler and
more reliable, but rests heavily on the trustworthiness of the initial labelling.
When successful, this approach significantly narrows down the populations
that should be given additional attention by high policing, national security
intelligence organizations. In that way it is far more efficient, but it is not big
data analysis per se. Big data, or bottom-up/inductive graph analysis, has a dif-
ferent set of hurdles. One common strategy of network analysis is to daisy chain
Math
Using big data analytics to prevent rare events is an overly optimistic scenario
for two other, more fundamental reasons. The first is Bayes’ theorem, which
uses basic probability calculations to underline the impracticality
of such predictions. Simply put, and considering only false positives, even a
near-magical 99 percent accurate algorithm would wrongly identify 1 percent,
or 320,000 random Canadians, as suspects (out of 32 million adults). Again,
this is not a realistic, practical scenario for policing. Of course, that still leaves
false negatives, those whom the algorithm falsely labelled non-dangerous but
who will in fact still commit crimes. Given this, it is difficult to detect any added
value in the use of big data analytics in the so-called War on Terror. In fact, it
is obviously not going to work as the proverbial “added layer” of security or as
an additional tool. In practice, it would drain disproportionate resources away
from the other layers into an effort that is extremely unlikely to pay off.
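The base-rate arithmetic behind this claim is easy to check. A minimal sketch (the figure of 100 actual plotters is an assumption added for illustration; the 32 million adults and 99 percent accuracy are the chapter's own numbers):

```python
# Bayes-style base-rate check on the chapter's "99% accurate" detector.
population = 32_000_000      # adult Canadians, as in the text
actual_plotters = 100        # assumed figure, for illustration only
false_positive_rate = 0.01   # the "near-magical" 99% accurate algorithm
true_positive_rate = 0.99

false_alarms = (population - actual_plotters) * false_positive_rate
caught = actual_plotters * true_positive_rate

# Probability that a flagged person is actually a plotter:
precision = caught / (caught + false_alarms)

print(f"innocent suspects: {false_alarms:,.0f}")  # ~320,000, as in the text
print(f"precision: {precision:.4%}")              # well under 0.1%
```

Even under these generous assumptions the detector produces roughly 3,200 false suspects for every real one, which is precisely the impracticality the Bayes argument points to.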
The other mathematical problem is the explosion of spurious correlations. It
was first raised by data scientist Jeff Jonas6 and has grown into a humour industry
on the web.7 Fortuitous or spurious correlations of variables increase rapidly
with the amount of available data, become plentiful, and eventually form an
entirely new type of haystack. They are not identifiable with statistical tools and
can show extremely high statistical significance. Only human logic can tell that,
for instance, the high correlation between the marriage rate in Kentucky and
the number of drownings after falling out of a fishing boat in the last decade is
probably meaningless (Tyler Vigen has an inventory of over 30,000 such
examples).8 Put another way, just as it has been said that 600 monkeys on 600
typewriters, given enough time, would eventually rewrite the works of Shake-
speare, given enough data it seems that any hypothesis, however eccentric, can
be proven (and, given enough data, in no time at all). Since much data mining
is the identification of patterns and the algorithmic comparison of digital traces
of behaviour, this problem is likely to become intractable with the ever-increasing
mountains of data. This is sure to shatter hopes that big data is a direct path to
reality: on the contrary, the more data, the greater the need for theory.
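The growth of chance correlations with data volume can be reproduced directly. A sketch (purely illustrative: the series are independent random walks that share no causal link whatsoever, so any strong correlation among them is by construction spurious):

```python
import random
import statistics

def random_walk(length, rng):
    """An independent random walk: pure noise, no causal structure."""
    x, out = 0.0, []
    for _ in range(length):
        x += rng.gauss(0, 1)
        out.append(x)
    return out

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

rng = random.Random(42)
for n_series in (10, 100, 300):
    walks = [random_walk(50, rng) for _ in range(n_series)]
    best = max(abs(pearson(walks[i], walks[j]))
               for i in range(n_series) for j in range(i + 1, n_series))
    print(f"{n_series:4d} unrelated series -> strongest pairwise |r| = {best:.2f}")
```

With only a few hundred meaningless series, near-perfect correlations appear among some pair: this is the "entirely new type of haystack" described above, and no statistical test run inside the data can tell them apart from real relationships.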
Civil Liberties
The acquisition and analysis of masses of private data is sure to raise more than
a few eyebrows. It may spur resistance within the public, or at least within
subgroups with sufficient political capital, who might successfully block or
hamper efforts to exploit big data analytics. Such pushback may diminish the
political and social capital of organizations that use or promote big data analytics
as “moral entrepreneurs.” Moral entrepreneurs establish themselves as “owners”
of social problems, in the sense that they have the authority to define them and
to elaborate, to suggest – if not to directly apply – corresponding solutions. Such
organizations are facing a difficult dilemma. On the one hand, there is a risk
that the occurrence of a terrorist attack could diminish the power of their
claimed expertise in national security. On the other, aggressive surveillance and
social control tactics might undermine their legitimacy with the public and
increase their vulnerability to criticism.
preventive investigations led by undercover police. From the start, such inves-
tigations tend to format situations with conventional categories. The resulting
data, then, form a very specific type of social construction determined by
policing objectives. It is difficult to see how any deep learning trained within
this closed universe might avoid assimilating and reproducing the structural
bias.
In the case of algorithmic profiling, we have seen that even the best-designed
system will generate massive numbers of false suspects. This will certainly have
the effect of swamping police and security forces, making them less, not more,
efficient. The year 2014 was an unusually busy one for counterterrorism in
Canada, with two separate “lone wolf ” attacks. Michael Zehaf-Bibeau invaded
the Canadian Parliament with a firearm and Martin Couture-Rouleau hit mil-
itary personnel with his car. This prompted RCMP management to reassign
hundreds of investigators who were working on organized crime to terrorism
investigations. The case of Couture-Rouleau was particularly damaging for
police, who had interviewed him many times in the weeks prior to his attack.
This clearly indicates that a sudden influx of hundreds of leads would either
destabilize the system or require massive new investments in policing.
The glut of false suspects would subject thousands of computer-selected
persons to excessive scrutiny from police organizations. This scrutiny might
extend to the international level. As shown by a recent Federal Court
case (X (Re), 2013 FC 1275), Canadian intelligence agencies have subcontracted
the surveillance of Canadians to foreign Five Eyes partners, this without
informing the court when obtaining the original interception warrants. The law
has since been amended to allow the practice, which should make watching
suspects around the globe much easier. Hundreds of Canadians will be travel-
ling with secret “terrorism” labels affixed to their passports, the consequences
of which are well illustrated by the cases of Canadians Maher Arar, Abdullah
Almalki, Muayyed Nureddin, and Ahmed Abou-Elmaati. Each was the victim
of torture abroad due to misidentification by Canadian police as a terror
suspect.9
As David Lyon notes, data mining and especially clustering and network
analysis amplify social sorting.10 Neutral technology, once deployed in a socially
stratified world, can only reinforce and exaggerate the stratification. Automated
network analysis categorizes users according to their communication habits.
This includes frequency of use, time of day, chain of contacts, duration of calls,
attachments in emails, and so on, and produces a typology based on the esti-
mated risk of belonging to a radical network.
Many chilling effects associated with mass surveillance will also appear.
Research has shown that surveillance modifies the way we express ourselves,
Practical Incentives
Beyond conventional wisdom, there are a few more tangible incentives for the
adoption of massive data analytics in security policing. Most modern organiza-
tions, whether public, private, or hybrid, are already collecting large amounts
of data on every facet of their day-to-day operations. With exponentially more
to come in the near future, most are trying to keep afloat by improving their
data centres or moving them to the cloud. And it so happens that every data
centre management software vendor, as well as every cloud provider, now sells
its services with “big data analytics” included in some way.
Policing organizations have not been immune to either the data glut or the
sales pitch. They contribute to the latter by claiming that better analytics will
lead to better security. This is taking place as most police organizations are in
the course of adopting in part or in whole the language, the tactics, the analytical
methodologies, the policies, and the management style associated with the
“intelligence-led policing” (ILP) model. ILP rests heavily on the efficient use of
the data available to police in order to maximize their impact on crime statistics.
As such, it is made to measure for the world of big data.
One of the most powerful factors of acceptance of big data surveillance is its
claimed security function. For the time being, however, this is more a promise
than an actual result, as security successes are few and far between. The NSA
claims that its massive interception and analysis capabilities have been used in
some fifty-four investigations since 2001. This seems like a rather meagre return on
investment, considering the astounding resources engaged. On close scrutiny,
the technology played what might charitably be called a peripheral role at best.13
Of course, it could be argued that the data involved are simply not yet big enough:
most of the NSA holdings are communications intercepts. The original concept
of the TIA project was to gather all kinds of data, from everywhere, including
Conclusion
It seems rather unlikely that national security will be better protected with the
adoption of big data analytics. The extreme rarity of national security threats
virtually ensures that unwanted consequences will outnumber prevention “suc-
cesses.” What it will achieve is always-on or ambient surveillance, where omni-
present robotic systems collect continuous, fine-grained information about our
daily activities. The promise of complete security through machine analytics
will become impossible to resist, especially if new signal crimes alert public
opinion to the police complaint of “going dark” because of increasingly prevalent
anonymization and encryption in communications.
Notes
1 Jean-Paul Brodeur and Stéphane Leman-Langlois, “Surveillance-Fiction: High and Low
Policing Revisited,” in The New Politics of Surveillance and Visibility, edited by Kevin
Haggerty and Richard Ericson (Toronto: University of Toronto Press, 2006), 171–98.
2 John Mueller, Terrorism since 9/11: The American Cases (Columbus, OH: Mershon Center
for International Security Studies, Ohio State University, 2011).
3 Stéphane Leman-Langlois and Jean-Paul Brodeur, Terrorisme et antiterrorisme au Can-
ada (Montreal: Presses de l’Université de Montréal, 2009).
4 Jean-Paul Brodeur and Stéphane Leman-Langlois, “Surveillance-Fiction,” 171–98.
5 Jennifer Xu and Hsinchun Chen, “The Topology of Dark Networks,” Communications of
the ACM 51, 10 (2008): 58–65.
6 Jeff Jonas and Jim Harper, “Effective Counterterrorism and the Limited Role of Predictive
Data Mining,” Policy Analysis 584 (2006), https://www.cato.org/sites/cato.org/files/pubs/
pdf/pa584.pdf.
7 For instance, “Spurious Correlations,” tylervigen.com, http://www.tylervigen.com/
spurious-correlations.
8 Ibid.
9 The Honourable Frank Iacobucci, Internal Inquiry into the Actions of Canadian Officials
in Relation to Abdullah Almalki, Ahmed Abou-Elmaati and Muayyed Nureddin (Ottawa:
Public Works and Government Services Canada, 2008).
10 David Lyon, “Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique,”
Big Data and Society 1, 13 (2014), https://doi.org/10.1177%2F2053951714541861.
11 See Elizabeth Stoycheff, “Under Surveillance: Examining Facebook’s Spiral of Silence
Effects in the Wake of NSA Internet Monitoring,” Journalism and Mass Communication Quarterly 1, 16 (2016), https://doi.org/10.1177%2F1077699016630255; Jonathon
Penney, “Chilling Effects: Online Surveillance and Wikipedia Use,” Berkeley Technology
Law Journal 31, 1 (2016): 117–83, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2769645.
12 Stoycheff, “Under Surveillance.”
13 Bailey Cahall, David Sterman, Emily Schneider, and Peter Bergen, “Do NSA’s Bulk Sur-
veillance Programs Stop Terrorists?” (policy paper, New America, Washington, DC,
2014), https://www.newamerica.org/international-security/do-nsas-bulk-surveillance
-programs-stop-terrorists/.
14 Chris Anderson, “The End of Theory: The Data Deluge Makes the Scientific Method
Obsolete,” Wired, 27 June 2008, https://www.wired.com/2008/06/pb-theory/.
4
Algorithms as Suspecting Machines
Financial Surveillance for Security Intelligence
Anthony Amicelle and David Grondin
needle in the haystack.”6 To see how that goal is achieved, we investigate the
policies and practices involved in financial surveillance as well as the detection
and reporting of “suspicious transactions.” But who is in charge of finding the
needle of dirty money in the Canadian financial haystack? As Jef Huysmans
suggests in his work on contemporary policing, financial policing is partly
detached from the institution of the police as it deals with the more general
associative practices found in networks of risk knowledge, technologies, and
agencies.7
In Canada, over 31,000 businesses – from the real estate sector to the banking
industry – must comply with legal obligations dealing with dirty money, including reporting suspicious transactions to the relevant state authority, namely,
FINTRAC. As Canada’s financial intelligence unit, “FINTRAC receives reports
from financial institutions and intermediaries, analyzes and assesses the reported
information, and discloses suspicions of money laundering or of terrorist financing activities to police authorities and others as permitted by the Act. FINTRAC
will also disclose to CSIS [Canadian Security Intelligence Service] information
that is relevant to [a] threat to the security of Canada.”8 FINTRAC receives over
200,000 reports of suspicious transactions annually, mainly from banks, which
have become the main source of denunciation to the state,9 practising financial
dataveillance in the name of security through “the systematic use of [financial-
related] personal data systems in the investigation or monitoring of the actions
or communications of one or more persons.”10
In an attempt to meet their legal responsibilities regarding anti–money laundering and counterterrorism financing, numerous banks have implemented
algorithmic infrastructures to monitor suspicious activity and help make sense
of the daily avalanche of transaction-related digital data. Our aim in discussing
the seemingly invisible work of these security/mobility infrastructures is to shed
some light on the impact and productive power of governing with algorithms
in an era characterized by big data surveillance. We seek to show how doing so
changes former practices, highlighting how socio-technical devices have become
part of security apparatuses, as we explore financial big dataveillance practices
through the algorithmic instrumentation that makes everyday surveillance and
denunciation possible and stable over time.
Algorithms cannot be divorced from the conditions under which they are devel-
oped and deployed ... What this means is that algorithms need to be understood
as relational, contingent, contextual in nature, framed within the wider context
of their socio-technical assemblage. From this perspective, “algorithm” is one
element in a broader apparatus which means it can never be understood as a
technical, objective, impartial form of knowledge or mode of operation.18
Louise Amoore and Rita Raley construe the algorithm “as both technical process
and synecdoche for ever more complex and opaque socio-technical assemblages.”19 To put it simply, an algorithm is a mathematical formula or a set of
instructions or rules that enable its users to obtain a specific result using its
computational ability to sift through large amounts of data and accomplish tasks
that human agents could not perform in a timely manner.20 More tellingly,
algorithms have become highly publicized artifacts that populate our everyday
life, with their logic woven “into the very fabric of all social processes, interactions and experiences that increasingly hinge on computation to unfold.”21
Looking at the security/mobility nexus enables us to analyze surveillance and
infrastructure in the digital world. As part of a secretive security infrastructure,
algorithms operate in the background, not unlike the “black box society” that
Frank Pasquale describes in relation to the hidden algorithms that control
money and information, affecting decisions made on the mistaken assumption
that the information they are based on is neutral and technical. Pasquale uses
the black box to recognize both its function as a recording device for monitoring
technical data in transportation (cars, planes, and trains) and its metaphorical
use suggesting the obscurity and opacity of complex systems whose inner workings we cannot easily decipher or reveal.22 Thinking about what algorithms do
with dataveillance means taking into account how vast streams of digital data
about people’s life, conduct, and mobility are sifted and sorted.
Before examining the algorithms in action and looking at their specific findings
beyond those described in the promotional material, let us briefly paint a rough
picture of our case study.
I think the motivation to be compliant is, you know, you want to protect the
bank’s brand. And you want customers and the shareholders to feel as if their
bank is a safe bank in which they can conduct transactions and they go without
having to worry about. You know, is this bank going to be open tomorrow? Are
there issues associated with whom they are lending money to? Do they have good
business practices? And to comply allows us to give that type of confidence to
our customers, internally to our employees and also to our shareholders. We
look at it as a value proposition whereas before compliance was looked as a cost
of doing business or an expense. But, you know, we treat compliance as our
ability to contribute to the brand and to the safety and soundness of the organ-
isation. We think it is a competitive advantage to comply.34
There is still ... hum ... a bit of push back in cost. There is always a bottom line
cost in that, the requirement is you have to do X, Y and Z, now they have to adjust
their system and internal technology in order to meet that requirement ... And they
say, “you know, we cannot just go and do so many technological changes plus we
have a business to run. So, it is not all about you [the regulator] all the time.” So,
it is always about money and it is always about the time to do it [technological
improvement]. And they say, “we can’t do it now, then we will do it in a year.”
Well, a lot can happen in a year. Right? So, they push back on us, and their tech-
nology department pushes back on them, because the priority is making money.
For instance, they have to put a system in to collect service fees. That probably
takes priority over what we want them to do, because it is going to affect their
bottom line. We are costing them money. We don’t make them money. We can
save their reputation or risk, right! But they don’t see it that way. They don’t see
it that way at all, every time. And it is always, and it is not just the banks. It is
all of them [reporting entities]. They all push back because of the finance, the
financial burden. Burden. That’s what it is: a financial burden.36
The work of surveillance is also a cost challenge, in a different way, for bank
tellers, who are the other main – and the traditional – source of internal alerts
in any financial institution. As a former compliance officer we interviewed noted:
The bottom line is always the dollar. So, at the branch level, the customer service
representatives, or the loan manager, their major focus is to make money. To sell
by opening up accounts, sell credit cards, sell mortgages, sell loans, sell this ...
that is their job. It is to make that money. So, they have quota. They will say to
their tellers in one day-time you have to do X amount of loans or X amount of
mortgages, or X amount of credit cards. They push from the top. So you don’t
spend the time that you need to the KYC [Know Your Customer compliance
obligations] because the requirement of making money comes first again. So, if
as a teller I can sell a mortgage, if I can do two deals or sell a credit card in half an
hour that would be great, as oppose [sic] to doing one in an hour because I have
taken the time to have a conversation, do the KYC, file out everything properly.
The branches don’t have the time to do it. They are just under the gun of making
more revenue.37
Initially it took three years just getting all the data feeds right, and going
through the banking system, data and figuring ... and I know we are not the
only one that has this issue ... but actually figuring where all the data is, and
how to get it into the right format, and be able to identify the type of transac-
tion ... So there is a lot of manipulation that goes on in the background to get
the data that you need in the transaction monitoring system, to be able to effec-
tively get meaningful alerts.42
Two years, you are very optimistic! We are talking about banks, they have mil-
lions of customers, whether involved individuals, corporations, trusts, and so
on. And it is even worse when we talk about big financial institutions. The IT
system is not only for the bank but for all the subsidiaries of the group, insur-
ance, life insurance, investment activity and many more, in Canada and abroad.
Thus, the system often begins with the bank and it is gradually extended to the
other subsidiaries too. The task is enormous. It is a work in progress. Even when
you are there, you are still in the process of calibration because there are new
financial products, you have bought a competitor or you have forgotten a market
and so on.43
to help assess whether or not transactions might give rise to reasonable grounds
for suspicion. They are examples of common and industry-specific indicators that may be helpful when evaluating transactions, whether completed or attempted. They include indicators based on certain characteristics that have been linked to money laundering or terrorist activities in the past. These indicators were compiled in consultation with reporting entities, law enforcement
agencies and international financial intelligence organizations.47
There is continuous feedback from the triage group [which receives the automatically generated alerts] to the analytics people [who deal with tuning on-site scenarios]. They say, “You know, we keep seeing this and we keep seeing this. It is nothing, it is nothing. Can we optimize the rules?” That, as a process, continually takes place from that perspective, in order to optimize the rules while, at
the same time, through typologies and other things, we build new rules to bring
in new alerts. It is a continuous process to maintain and keep your rules up to
date.48
It is still the big issue, there is a lot of false positives, and there is always the balance between spending money on improving this system and also the resources
available on improving the system, versus going through the false positives and
spending the time to go through the false positives, so that is an issue and it is
always going to be an issue until I can find someone to give me more money
[laughs]. Anyway, false positives will always stay but the number of false positives, to get that down to a reasonable size so that you are getting alerts on what
you should be getting alerts on, and covering off the activity without just getting
a lot of noise and having to go through that noise, because once the alert is generated, you have to at least look at it. And every single one takes time.50
The problem with actually changing the algorithms is that every time you engage
the vendor it costs money, and the organisation is not willing to pay for extras ...
And that is how the technical model [of the vendors] is set up. It is set up so
that once they become your provider, that is when they start charging you the
extras because they know that you cannot go anywhere else. And to implement a
new system is not cost-effective, especially every couple of years when you really
need to upgrade because the technology is changing. Once you have decided on
one provider, you pretty much are going to stay with them unless there is something really significant that takes place.54
The analytics people, this is where we develop the rules, the thresholds, where we
basically identify alerts that could be cases that could be suspicious transactions.
They actually are the ones who are programming our transactions monitoring
rules, the algorithms. The stats and the math guys who develop the algorithms
for transactions, sanctions, names matching, everything ... My analytics people
are all, not all but most of them, masters of science or PhD analytics, so they
grew either in the computer science or the quantitative sciences world. They are
the data junkies. They love the data, they love to work with the data.55
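The rule-and-threshold approach this interviewee describes can be sketched schematically. In the minimal illustration below, every rule name, threshold (other than FINTRAC's actual $10,000 cash-reporting line), and country code is invented for illustration; real bank monitoring scenarios are proprietary and far more elaborate. The sketch simply shows how such rules turn a stream of transactions into alerts that a triage group must then review, which is why tuning thresholds against false-positive volume becomes a continuous task:

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    client_id: str
    amount: float   # in CAD
    is_cash: bool
    country: str    # counterparty jurisdiction

# Hypothetical monitoring rules of the kind the "analytics people" program
# and tune: each rule is a predicate over a transaction, and every match
# becomes an alert that must be looked at, so thresholds trade detection
# coverage against false-positive volume.
RULES = {
    # FINTRAC's large-cash reporting threshold is $10,000; a monitoring
    # rule might flag cash deposits just under it (possible "structuring").
    "cash_near_threshold": lambda t: t.is_cash and 9000 <= t.amount < 10000,
    # Illustrative high-risk-jurisdiction rule (the country list is invented).
    "high_risk_country": lambda t: t.country in {"XX", "YY"} and t.amount > 5000,
}

def generate_alerts(transactions):
    """Return (rule_name, transaction) pairs; each pair is one alert to triage."""
    return [(name, t) for t in transactions
            for name, rule in RULES.items() if rule(t)]

txns = [
    Transaction("c1", 9500.0, True, "CA"),
    Transaction("c2", 200.0, False, "CA"),
    Transaction("c3", 7500.0, False, "XX"),
]
alerts = generate_alerts(txns)
print([name for name, _ in alerts])  # ['cash_near_threshold', 'high_risk_country']
```

Lowering a threshold (say, flagging cash deposits from $8,000) widens coverage but multiplies alerts the triage group must clear, which is the cost trade-off the compliance officers quoted above keep returning to.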
The “data junkies” are hired to connect with “data-hungry beasts” (i.e., algorithmic devices)56 and refine algorithmic monitoring in accordance with testing
sessions, internal feedback, and new indicators. Although the “white box” or
“open box [transparent] environment” approach is more expensive than black-
box algorithms, its selling point is still about “saving money”: “Can you really
afford a bigger staff to handle growing alert volumes?”57 In either case, the
reduction of false positives often becomes an end in itself, thus obscuring the
original financial surveillance mission for security intelligence.
Conclusion
Ultimately, the automated production of alerts turns out to be more negotiated
than automated, given the competing objectives of making money and fighting
crime and terrorism. The outcome of these negotiations in terms of alerts and
false positives not only is based on the algorithmic design used in transaction
monitoring but depends equally on the quality and verifiability of the data being
processed. The changes that algorithms as logistical media lead to in the inner
workings of financial surveillance are significant.
The algorithmic production of alerts differs from bank tellers’ alerts in many
ways, one of which deserves further attention. In both cases, knowledge about
the client is part of the process of alert production, but there is a difference in
the type of knowledge that can be used. With regard to clients, the form and
content of knowledge may vary from one bank to another, from one type of
clientele or business relationship to another, and from one type of bank employee
to another (i.e., bank tellers in bank branches versus data analysts in headquarters). According to Lyn Lofland and John Dewey, human beings can develop
a knowledge of people (acquaintance-knowledge) or a knowledge about people
(knowledge-about).58 Clive Norris sums up the importance of this distinction
in connection with surveillance and control by noting that the “basis of knowing
of, rather than about, people is face-to-face interaction. When we only know
about people our knowledge is second-hand.”59 De facto, analytics and data
analysts are further removed from the local branch context than the front-line
staff – acting on the basis of face-to-face interaction and face-to-face knowledge
is no longer an option. The resulting surveillance at a distance through data can
be seen either as a critical lack of knowledge or as a key element of objectivity
in controlling mobility.
The first interpretation highlights the inability of analytics “to look at somebody and understand if they are avoiding eye contact, if they are being really
fidgety, or if they are hurrying you along. Those are indicators that something
is not quite right.”60 The second interpretation highlights the connivance bias
that may be introduced by face-to-face interactions:
That is often the case, especially if, in some cases, the managers have more a
relationship model with the clientele. So if you get into, like, wealth management
businesses where the whole nature of the business is the relationship, then they
may take a more personal view, you know, I know the client, the client is good,
I have known him for ten years, blah-blah-blah, there should not be anything
wrong with what they are doing. So you always got to balance that aspect with
what you see and sometimes stark realities have to come out depending on what
you find.61
of both technologies and agents. It also reveals the need to reconcile a technical
problem – i.e., the work of suspecting machines – with a business rationale that
seeks to minimize as much as possible the costs of using such machines (as
banks, after all, want to make money). The challenge we faced in this chapter
was making visible the complex process of algorithms as logistical media, in a
context in which they govern data. Doing so also involved understanding how
this process is part of a human/non-human assemblage, a socio-technical
infrastructure in which the algorithm is embedded. Looking at how suspecting
machines operate enables us to see where power relations and preferences play
a role in what may appear to be neutral technical operations, through human
decisions.
Notes
1 Louise Amoore, Stephen Marmura, and Mark B. Salter, “Smart Borders and Mobilities:
Spaces, Zones, Enclosures,” Surveillance and Society 5, 2 (2008): 96–101.
2 This chapter is drawn from a larger project funded by an Insight Social Sciences and
Humanities Research Council (SSHRC) grant on the central role of algorithms in the
governance of North American borderlands (NAB) and the related policing of mobilities
in the digital age of big data. The project deals with the question of how algorithms “act”
and make a difference in the security process, specifically to what extent algorithms have
come to serve as critical operators and “gatekeepers” given the security/mobility nexus
that has come to define how security is processed and delivered in the digital era.
3 Marieke de Goede and Mara Wesseling, “Secrecy and Security in Transatlantic Terrorism Finance Tracking,” Journal of European Integration 39, 3 (2017): 253–69; Anthony
Amicelle, “The Great (Data) Bank Robbery: The Terrorist Finance Tracking Program
and the Swift Affair,” CERI Questions de recherche/Research Questions 36 (2011): 1–27.
4 European Commission, Joint Report from the Commission and the U.S. Treasury Department
regarding the Value of TFTP Provided Data (Brussels, 2013), 5.
5 Claudia Aradau, “The Signature of Security: Big Data, Anticipation, Surveillance,” Radical Philosophy 191 (2015): 1–8.
6 Julie Conroy, Global AML Vendor Evaluation: Managing Rapidly Escalating Risk (Boston:
Aite, 2015), https://www.aitegroup.com/report/global-aml-vendor-evaluation-managing
-rapidly-escalating-risk.
7 Jef Huysmans, Security Unbound: Enacting Democratic Limits (Abingdon, UK: Routledge,
2014).
8 FINTRAC, “What Is FINTRAC?” 16 August 2019, http://www.fintrac-canafe.gc.ca/
questions/FAQ/1-eng.asp.
9 FINTRAC, FINTRAC Annual Report 2018–19 (Ottawa: FINTRAC, 2019), 37, https://
www.fintrac-canafe.gc.ca/publications/ar/2019/ar2019-eng.pdf.
10 Roger Clarke, “Information Technology and Dataveillance,” Communications of the ACM 31, 5 (1988): 498–512.
11 Ned Rossiter, Software, Infrastructure, Labor: A Media Theory of Logistical Nightmares
(New York: Routledge, 2016), 4–5.
12 Joseph Masco, The Theater of Operations: National Security Affect from the Cold War
to the War on Terror (Durham, NC: Duke University Press, 2014); Brian Larkin, “The
Politics and Poetics of Infrastructure,” Annual Review of Anthropology 42, 1 (2013):
Meadows, and Keith Spiller, The Private Security State? Surveillance, Consumer Data
and the War on Terror (Frederiksberg: Copenhagen Business School Press, 2015); Gilles
Favarel-Garrigues, Thierry Godefroy, and Pierre Lascoumes, “Reluctant Partners? Banks
in the Fight against Money Laundering and Terrorism Financing in France,” Security
Dialogue 42, 2 (2011): 179–96; Lyliya Gelemerova, “On the Frontline against Money-
Laundering: The Regulatory Minefield,” Crime, Law and Social Change 52 (2009): 33–55;
Eric Helleiner, “State Power and the Regulation of Illicit Activity in Global Finance,” in
The Illicit Global Economy and State Power, edited by Peter Andreas and Richard Friman
(Lanham, MD: Rowman and Littlefield, 1999), 53–89; Béatrice Hibou, The Bureaucratization of the World in the Neoliberal Era (London: Palgrave Macmillan, 2015).
34 Interview with a compliance officer, Canada, 2016.
35 Richard Ericson, Crime in an Insecure World (London: Polity, 2007).
36 Interview with a FINTRAC official, Canada, 2015.
37 Interview with a former compliance officer, Canada, 2015.
38 FINTRAC, “Financial Transactions That Must Be Reported,” 2020, https://www.fintrac
-canafe.gc.ca/reporting-declaration/rpt-eng.
39 FINTRAC, FINTRAC Annual Report 2018–19.
40 Anthony Amicelle and Elida Jacobsen, “The Cross-Colonization of Finance and Security
through Lists: Banking Policing in the UK and India,” Environment and Planning D:
Society and Space 34, 1 (2016): 89–106.
41 Anthony Amicelle and Vanessa Iafolla, “Suspicion-in-the-Making: Surveillance and
Denunciation in Financial Policing,” British Journal of Criminology 58, 4 (2018): 845–63.
42 Interview with a bank compliance officer, Canada, 2015.
43 Interview with a FINTRAC official, Canada, 2015.
44 FINTRAC, “Financial Transactions That Must Be Reported,” item 6.1.
45 Amicelle and Iafolla, “Suspicion-in-the-Making.”
46 Ibid.
47 FINTRAC, “Guideline 2: Suspicious Transactions,” item 6.3.
48 Interview with a compliance officer, Canada, 2015.
49 Ibid.
50 Interview with a bank compliance officer, Canada, 2015.
51 Evelyn Ruppert, “The Governmental Topologies of Database Devices,” Theory, Culture
and Society 29, 4–5 (2012): 116–36.
52 Anthony Amicelle, Claudia Aradau, and Julien Jeandesboz, “Questioning Security
Devices: Performativity, Resistance, Politics,” Security Dialogue 46, 4 (2015): 293–306.
53 Ibid., 294.
54 Interview with a bank compliance officer, Canada, 2015.
55 Interview with a bank compliance officer, Canada, 2015.
56 Conroy, Global AML Vendor Evaluation.
57 SAS, “SAS Anti-Money Laundering” (fact sheet, 2016), 2, https://www.sas.com/content/
dam/SAS/en_us/doc/factsheet/sas-anti-money-laundering-105623.pdf; BAE Systems, “NetReveal AML Transaction Monitoring,” https://www.baesystems.com/en/cybersecurity/
product/aml-transaction-monitoring.
58 Lyn H. Lofland, A World of Strangers: Order and Action in Urban Public Space (New York:
Basic Books, 1973); John Dewey, Logic: The Theory of Inquiry (New York: Holt, 1938).
59 Clive Norris, “From Personal to Digital: CCTV, the Panopticon, and the Technological
Mediation of Suspicion and Social Control,” in Surveillance as Social Sorting: Privacy, Risk,
and Digital Discrimination, edited by David Lyon (New York: Routledge, 2003), 251.
60 Interview with a bank compliance officer, Canada, 2015.
61 Ibid.
Part 2
Big Data Surveillance and Signals Intelligence
in Canadian Security Organizations
5
From 1967 to 2017
The Communications Security Establishment’s Transition
from the Industrial Age to the Information Age
Bill Robinson
or both of the partners,12 and some of that material was already being made available to Canada. But the question remained whether more specific Canadian
intelligence needs might be served by direct Canadian intelligence collection.
By April 1972, a survey was underway to “determine the value of arrangements
with the Canadian Overseas Telecommunications Corporation (COTC) by
which we would have the opportunity to examine messages between Canada
and locations abroad which would (1) clarify links between revolutionary
activities abroad, (2) provide information concerning known revolutionary
elements, and (3) contribute intelligence about foreign and economic affairs
of direct interest to Canada.”13 The location of the test operation was the
COTC’s Montreal office,14 which served as the gateway for all Canadian transatlantic telecommunications.
Later in 1972, CBNRC sought the budget authority to place the COTC operation on a full operational footing beginning in fiscal year 1973–74.15 The proposal
was put on hold, however, when cabinet froze the entire Canadian intelligence
program at its existing level pending completion of a review of the program.16
Not long afterwards, passage of the Protection of Privacy Act made the interception of “private communications” (communications with one or more ends in
Canada) illegal except under specific defined circumstances, quashing the
CBNRC’s cable-monitoring plan in the process. Nobody had asked the agency
what effect the new act might have on SIGINT collection:
The Department of External Affairs queried some aspects of the draft bill which
might affect legitimate intelligence interests adversely. Copies of the queries, which
went out over the signature of the Under-Secretary of State for External Affairs
(USSEA), went to the Cabinet Secretary, the Chief of the Defence Staff (CDS)
and the Director General of the Security Service (DGSS) in the RCMP, but not to
Director CB. After receiving answers from Justice, the USSEA told his Minister in
June [1971] that “radio communications would not come within the prohibition”
against the interception of “private communications.” Also, the “use or disclosure”
of communications intercepted outside Canada would not constitute an offence,
because such an act would not be “wilful,” as being based on selection from a mass
of material picked up without any “mens rea.” He also told his Minister that rel-
evant information would be made available to intelligence authorities (presumably
including CB), even if obtained for specifically security purposes under a warrant
issued by the Solicitor General. However, these did not all turn out to be the interpretations of several subsequent Solicitors General and Ministers of Justice.17
Among its other effects, the Protection of Privacy Act denied CBNRC access to
telephone, telegraph, and telex messages transmitted by cable if one or both
I doubt if we can get everything the RCMP and CSE want, but we can certainly
get something if we come up with a practical proposal. My own assessment is
that we could get permission at least to tap the US and UK data banks and lately
I have been wondering whether we could not take some action in Toronto and
Vancouver which would be less difficult to organize and explain than action in
Montreal.19
(Toronto and Vancouver were the locations of the COTC’s other gateways for
cross-border telecommunications.) Within a few months, however, both the
Keable Inquiry and the McDonald Commission had been launched to investigate
the RCMP’s misdeeds.20 In January 1978, the IAC “generally agreed the present
atmosphere associated with the McDonald Commission and the Keable Inquiry
is not conducive at this time for government approval of the [special collection]
project.”21 Nevertheless, it remained on the wish list. Just one month later, in its
Review of 1977 and Forecast for 1978, the IAC put the Interdepartmental Committee on Security and Intelligence on notice that a proposal would likely be
forthcoming: “At present very little intelligence is available which would lend
itself to the production of analyses on foreign attitudes, intentions and activities
relating to Canadian unity. A special collection project may be required to
Since 1990, collection activities under section 16 have gradually increased. The
Committee believes several factors are behind this trend. First, the notion of collecting foreign intelligence in the early years of the Act was novel and untested. It
was only after the signing of the Tri-Ministerial MOU that the details of exactly
how to proceed were established. Second, there has been a growing awareness
within government of the utility of the kind of information that tasking under
section 16 can generate.25
discussed the contribution that could be made by Canada, under the CAN
UKUS agreement, in support of NSA processing of “R” take. He covered the following points: a. Canadian responsibilities under the CANUKUS agreement are
to process all raw voice intercepts in the Soviet Arctic and the northern military
districts. In order to accomplish this, the Canadians use linguists, transcribers
and analysts. At the present time, with the switch of the Soviets away from HF,
this Canadian capacity is not being used. b. The “R” system is now intercepting the troposcatter system in the Soviet Arctic. NSA, therefore, proposes that
Canada be authorized to process this in order to take up the slack resulting from
the HF dry-up.28
CSE probably also helped to process traffic from the CANYON satellites, which
were controlled by NSA.29
Meanwhile, the advent of the supercomputer, heralded by the arrival of the
Cray-1 in 1976, revolutionized UKUSA cryptanalysis. Within a year, NSA began
breaking into high-echelon Soviet encryption systems, obtaining a degree of
access not seen since the end of the Second World War. NSA historian Thomas
R. Johnson wrote that “for the cryptologists, it was their finest hour since 1945.”30
Access to the encryption systems of other countries was even more complete:
a 1979 assessment of NSA’s G Group, which was responsible for non-Soviet
targets, concluded that its cryptanalytic achievements were “at an all-time peak.”31
CSE and the other partners in the UKUSA community were the beneficiaries
of NSA’s successes. But the shrinking importance of their own contributions
also placed the partners under pressure to justify their privileged access to the
awareness in the legal system of the needs of the S&I community and there may
be implications down the road for other projects (e.g., PILGRIM, MADRIGAL).
Our position on ECHELON has been to support the project as a valuable con-
tribution to the overall Canadian and allied effort. We regret that it appears it
will not go forward.37
SIGINT in the Industrial Age meant collecting signals, often high frequency
(HF) signals connecting two discrete and known target points, processing the
often clear text data and writing a report. eSIGINT in the Information Age
means seeking out information on the Global Net, using all available access
techniques, breaking often strong encryption, again using all available means,
defending our nation’s own use of the Global net [sic], and assisting our war
fighters in preparing the battlefield for the cyberwars of the future. The Fourth
Amendment is as applicable to eSIGINT as it is to the SIGINT of yesterday and
today. The Information Age will however cause us to rethink and reapply the
procedures, policies and authorities born in an earlier electronic surveillance
environment ... senior leadership must understand that today’s and tomorrow’s
From 1967 to 2017 99
The term “eSIGINT” fortunately never caught on, but the transition to Infor-
mation Age SIGINT certainly did. In 2000, CSE “embarked upon an important
strategic exercise to identify alternative approaches to delivering its mandate.
As a starting point, it defined its vision thus: ‘to be the agency that masters the
global information network to enhance Canada’s safety and prosperity.’”50 But
its efforts remained hamstrung by a lack of suitable legal authorities. Internet
traffic could be monitored for foreign intelligence purposes, but CSE had to
ensure that no private communications were intercepted by its systems. The
difficulty of determining whether a particular Internet user was in Canada at
the moment of communication made this an extraordinarily challenging task.
The occasional inadvertent intercept might be forgiven, but any lapse in dili-
gence would open the agency to the charge of violating the ban on wilful
interception.51 Moreover, if a private communication did end up inadvertently
intercepted, the information in it could be neither used nor disclosed, even if
it concerned the proverbial ticking time bomb. A new watchdog position, the
CSE Commissioner, was created in 1996 to keep CSE’s compliance with the
law under continuing review. Successive commissioners demonstrated little
inclination to declare CSE in non-compliance,52 but their activities certainly
forced the agency to tighten up its privacy-related practices. Behind-the-scenes
work was begun to draft an Information Age legal regime for CSE, but no bill
was put before Parliament.
In the meantime, budget resources had become tight. Despite having lost its
primary target at the end of the Cold War, CSE avoided the major program cuts
that swept through Ottawa in the mid-1990s, but it did suffer minor cuts. “Pro-
gram integrity” top-ups for fiscal years 2000–01 and 2001–02 enabled the agency
to grow to nearly 950 personnel by mid-2001.53 However, this increase probably
served only as partial compensation for the loss of 771 Communications
Research Squadron, which was disbanded shortly afterwards.54
According to CSE chief Keith Coulter, by the end of the 1990s the agency was
facing a serious erosion of its SIGINT capabilities:
[When] the events of 9/11 took place, CSE was ... facing a tough scenario. Simply
put, in a kind of perfect storm situation, the 1990s saw the global revolution in
communications technologies, resource shortages and the lack of an updated
authority framework combine to create a serious erosion of CSE’s SIGINT
capabilities.55
GCHQ announced its own “Mastering the Internet” project during the same
year,72 and NSA’s SIGINT Mission Strategic Plan FY2008–2013, promulgated in
October 2007, declared that it too sought “to utterly master the foreign intel-
ligence implications of networks and network technology.”73 One way NSA
sought to do so was to collect target communications at the Internet companies
that handled them, through the PRISM program, instead of intercepting them
in the wild. In September 2007, Microsoft became the first participant in PRISM.
Yahoo, Google, Facebook, and others quickly followed.74 Within a few years,
data obtained through PRISM appeared in nearly one in seven of the first-,
second-, and third-party reports produced or received by NSA.75 The NSA vision
also called for improved “collection operations around the globe” to identify
and collect the most important traffic wherever it could be found, using “fast,
flexible, front-end processors [to] spot targets based on patterns, events, or
metadata rather than pre-defined selectors or brute force scanning of
content.”76
The key to understanding and managing the deluge of data streaming through
the Five Eyes worldwide monitoring systems was the collection and analysis of
telephone and Internet metadata, which could be used both to monitor specific
targets and, at least potentially, to identify previously unknown individuals and
activities of intelligence interest. In 2004, CSE began working with the Math-
ematics of Information Technology and Complex Systems (MITACS) consor-
tium, a Canadian network of academia, industry, and the public sector, to
improve the agency’s ability to exploit metadata. A 2006 description of the
MITACS Semi-Supervised Learning in Large Graphs project provides a rare
public look into CSE’s interests:
Much of this research was later placed under the aegis of the Cryptologic
Research Institute (now called the Tutte Institute for Mathematics and Comput-
ing), which was created in 2009 to help CSE bring outside mathematical exper-
tise to bear on cryptanalytic and data-mining questions. The agency acquired
a Cray XMT, a supercomputer optimized for data-mining operations, around
the same time.78 The XMT excels at two types of problems: “The first is the
finding-the-needle-in-a-haystack problem, which involves locating a particular
piece of information inside a huge dataset. The other is the connecting-the-dots
problem, where you want to establish complex relationships in a cloud of seem-
ingly unrelated data.”79
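The "connecting-the-dots" problem is, at bottom, a graph-traversal task. The following sketch is purely illustrative (the identifiers, data, and code are invented for this book and reflect nothing about CSE's actual systems), but it shows how "contact chaining" across communications metadata can be expressed as a breadth-first search outward from a seed identifier:

```python
from collections import deque

def contact_chain(graph, seed, max_hops=2):
    """Return every identifier reachable from `seed` within `max_hops`
    communication links, mapped to its hop count. `graph` maps each
    identifier to the set of identifiers it communicated with."""
    hops = {seed: 0}
    queue = deque([seed])
    while queue:
        node = queue.popleft()
        if hops[node] == max_hops:
            continue  # stop chaining at the hop limit
        for neighbour in graph.get(node, set()):
            if neighbour not in hops:
                hops[neighbour] = hops[node] + 1
                queue.append(neighbour)
    return hops

# Toy "call graph": A contacted B and C; C contacted D; D contacted E.
calls = {"A": {"B", "C"}, "C": {"D"}, "D": {"E"}}
print(sorted(contact_chain(calls, "A").items()))
# → [('A', 0), ('B', 1), ('C', 1), ('D', 2)]; E is three hops out, so excluded
```

Even this toy version shows why hop limits matter: the population swept in grows multiplicatively with each hop from a single seed, which is what makes contact chaining both powerful and privacy-invasive at scale.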
CSE also sought big data analysis techniques that could run on “non-extra-
ordinary” hardware. In late 2012, it began implementing what it called a New
Analytical Model designed to help its intelligence analysts keep up with and
better exploit the metadata and other target-related information, such as finan-
cial and travel data, increasingly available to them.80
Additional growth in CSE’s workforce was also on the agenda. Between 2009
and 2014, the agency grew from 1,650 employees to roughly 2,200, an increase
of 33 percent over five years.81 CSE’s Long-Term Accommodation Project (LTAP)
saw the construction of a brand-new headquarters complex located next to the
CSIS headquarters over the same period. CSE asserts that the Edward Drake
Building contains both the “largest concentration of supercomputers in
Canada”82 and the “largest volume databases in the country.”83
Future Prospects
Whether the Five Eyes partners can truly be said to have “mastered” the Internet
is open to question. In February 2012, however, the NSA felt justified in declar-
ing it had successfully made the transition to the Information Age: “As the world
has changed, and global interdependence and the advent of the information
age have transformed the nature of our target space, we have adapted in innova-
tive and creative ways that have led some to describe the current day as ‘the
golden age of SIGINT.’”84
The gradual spread of encryption in email, web browsing, and messaging
apps – almost certainly accelerated to some degree by the Snowden leaks in
2013 – may have taken some of the lustre off the gold in the years since that
statement. CSE maintains that Edward Snowden’s “unauthorized disclosures
have diminished the advantage that we have had, both in the short term but
more worryingly in the long term.”85 It is certainly the case that encryption is
becoming more common. In 2016, the United Kingdom’s Independent Reviewer
of Terrorism Legislation reported that “about 50% of Internet traffic was now
encrypted, and 100% of emails from major email providers.”86 In June 2017,
Australian attorney-general George Brandis lamented that over 40 percent of
counterterrorism investigations were encountering encrypted communications,
compared with less than 3 percent in mid-2013: “Within a short number of years,
effectively, 100 per cent of communications are going to use encryption ... This
problem is going to degrade if not destroy our capacity to gather and act upon
intelligence unless it’s addressed.”87
Such claims are almost certainly exaggerated, however. Depending on when
and in what form quantum computing makes its appearance, existing Internet
encryption technologies could be rendered obsolete in less than a decade.88 In
the meantime, the continuing migration of Internet traffic to mobile devices,
the pervasive vulnerability of existing software, and the growth of the Internet
of Things may be making targeted surveillance even more difficult to evade – as
long as you can identify your target. The Five Eyes agencies were already work-
ing hard on smartphone exploitation techniques in 2010, and they are likely to
have made progress in the years since.89 As ubiquitous computing becomes ever
more deeply embedded in daily life, and governments and corporations collect
the growing data trails thus generated, big data analysis is also likely to take on
increasing importance, both as a means of identifying and tracking individual
targets and for generating unique intelligence on target activities and
connections.
Between June 2011 and May 2012, NSA and its partners shared approximately
180,000 end product reports, not much lower than the approximately 200,000
shared in 1990. Of those 180,000, 7,511 were issued by Australia, 11,257 were
issued by the United Kingdom, and approximately 150,000 were issued by the
United States.90 A Canadian figure was not provided, but the number was prob-
ably somewhat lower than the 10,000 Canadian reports issued in 1990.91 How-
ever, more recent documents suggest that the Canadian total has returned to
the five-digit range.92 On their face, these numbers do not support the suggestion
that Canada’s SIGINT output has suffered in the post-Snowden era, but it is
possible that their continuing quantity masks a decline in quality.
An Agency Transformed
Today’s CSE is very different from the CBNRC of 1967. It has more than four
times the workforce it had fifty years ago, a much larger budget, and immeasur-
ably greater information collection, storage, and processing capabilities. Its
SIGINT activities remain focused on foreign intelligence, but the agency now
has a much larger role in support of domestic law enforcement and security
agencies than it had in the past. It also has a much more extensive role in infor-
mation security than it had in 1967. CSE is even more tightly bound into the
Five Eyes transnational SIGINT partnership, and it considers that partnership
to be more valuable now than it has ever been.93 The agency has evolved from
a passive collector of radio signals received in Canada to an active hunter of
data stored on information systems or passing through the air or fibre-optic
cables at locations far removed from Canadian soil. It now has legal authority
to intercept Canadian communications during its operations, although it can
target Canadians or persons in Canada only when operating under the aegis of
judicial warrants obtained by the law enforcement and security agencies to
which it provides support. In the course of its activities, it collects and processes
vast amounts of metadata pertaining to Canadians. And, perhaps most import-
ant, it now operates in a domain where foreign communications and Canadian
communications are deeply and inevitably intermixed, its desired targets are
frequently difficult to distinguish from non-targets, and activities such as
terrorism that have an important nexus with Canadian domestic life rank much
higher among its intelligence priorities.
agency to ensure that CSE complies with the law and, since 2017, a new parlia-
mentary review body, the National Security and Intelligence Committee of
Parliamentarians (NSICOP). Further change came in June 2019 with the passage
of Bill C-59, which gave CSE the authority to conduct Computer Network Attack
operations for both defensive and offensive purposes, extended its cybersecurity
mandate to the protection of Canadian private sector infrastructures (subject
to request by those entities), and created an entirely new oversight and review
structure for the agency.94 These changes in priorities, budget resources, legal
authorities, and worldwide communications and information technologies have
transformed the relationship between CSE and Canadians. As this chapter has
demonstrated, early concerns about domestic surveillance by CSE were largely
unfounded, but the potential for Canadians to be drawn into the agency’s dragnet
is now much greater, highlighting the importance of oversight, review, and
transparency measures for preventing abuse of the agency’s extraordinarily
intrusive capabilities.
Notes
1 Matthew Aid, Secret Sentry: The Untold History of the National Security Agency (New
York: Bloomsbury Press, 2009), 139.
2 Kevin O’Neill and Ken Hughes, “History of CBNRC,” Communications Security Estab-
lishment, August 1987, vol 4, ch 14, 43–45, released in redacted form under Access to
Information Request Number A-2015-00045.
3 “The Current Canadian Intelligence Program – Objectives and Activities,” attachment
to draft memorandum to the Cabinet Committee on Security and Intelligence, 20 April
1972, Library and Archives Canada (LAC), RG 25, box 10, file 1-3-12-1. I am indebted
to the Canadian Foreign Intelligence History Project for access to this and the other LAC
documents cited in this chapter.
4 O’Neill and Hughes, “History of CBNRC,” vol 6, ch 25, 8.
5 Aid, Secret Sentry, 17–18.
6 “Supplementary Radio Activities Consolidation Plan,” Department of National Defence,
30 May 1966, released in redacted form under an Access to Information request.
7 O’Neill and Hughes, “History of CBNRC,” vol 1, ch 4, 16–17.
8 Prior to 1956, all Canadian overseas telephone calls were transmitted by high-frequency
radio; the last overseas call transmitted from Canada by commercial radio was made in
1975.
9 Report of the Royal Commission on Security (Abridged) (Ottawa: Queen’s Printer, June
1969), 5.
10 O’Neill and Hughes, “History of CBNRC,” vol 1, ch 2, 24.
11 Claude M. Isbister, “Intelligence Operations in the Canadian Government,” Privy Coun-
cil Office, 9 November 1970, 51, released in redacted form under Access to Information
Request Number A-2011-00010.
12 GCHQ alone processed about 1 million ILC messages a month at this time: O’Neill
and Hughes, “History of CBNRC,” vol 3, ch 13, 16–17. NSA’s effort was similar in scale
– in 1975 it was estimated that 2.8 million of the 2 billion telegrams that passed over
ILC channels every month were forwarded to NSA headquarters, where analysts pro-
cessed about 1 million of them: Letter from Frederick A.O. Schwarz to Thomas Latimer,
Tab A, 16 September 1975, 5, National Security Archive, http://nsarchive2.gwu.edu//
dc.html?doc=4058229-Document-10-Letter-from-Frederick-A-O-Schwarz-to.
13 “The Canadian Intelligence Program,” draft memorandum to the Cabinet Committee
on Security and Intelligence (CCSI), 20 April 1972, 13–14, LAC, RG 25, box 10, file 1-3-
12-1. The final version of this document was considered by the CCSI in May 1972. The
Canadian Overseas Telecommunications Corporation was a Crown corporation that
had a monopoly on overseas telephone, telex, and telegraph services from Canada. It
was renamed Teleglobe Canada in 1975 and was later privatized, eventually becoming
part of Tata Communications.
14 A.F. Hart, “Meeting of Interdepartmental Committee on Security and Intelligence (ICSI)
– Tuesday, May 2, 2:30 pm,” 27 April 1972, LAC, RG 25, box 10, file 1-3-12-1.
15 Ibid.
16 O’Neill and Hughes, “History of CBNRC,” vol 6, ch 25, 13.
17 Ibid., vol 6, ch 26, 44–45. See also vol 1, ch 2, 30.
18 Ibid., vol 6, ch 26, 45–46.
19 Memorandum from John Hadwen to Major-General Reg Weeks, 25 March 1977, LAC,
RG 25, vol 29022, file 29-4-IAC, pt 2.
20 The Quebec government’s Keable Inquiry was announced on 15 June 1977, and the fed-
eral government’s McDonald Commission followed on 6 July 1977.
21 “Extract of the Minutes of the 1th [sic] Meeting of the Intelligence Advisory Committee
Held on Wednesday, 11 January 1978,” Intelligence Advisory Committee, LAC, RG 25,
vol 29022, file 29-4-IAC, pt 3.
22 Intelligence Advisory Committee (IAC) Review of 1977 and Forecast for 1978, Intelli-
gence Advisory Committee, 22 February 1978, LAC, RG 25, vol 29022, file 29-4-IAC,
pt 3.
23 The current Department of Justice view is that it is legal for CSE to receive one-end
Canadian traffic intercepted by Canada’s allies. However, it is not permitted to ask those
allies to target the communications of specific Canadians or persons in Canada except
at the request of a federal law enforcement or security agency operating under a suit-
able warrant. Access to second- and third-party intercepts might explain former CSE
employee Mike Frost’s claim that there was a “French problem” section within CSE dur-
ing the 1970s. See Mike Frost and Michel Gratton, Spyworld: Inside the Canadian and
American Intelligence Establishments (Toronto: Doubleday, 1994), 96.
24 Teleglobe Canada, 37th Annual Report, for the Year Ended December 31, 1986 (Ottawa:
Teleglobe Canada, 1987), 24.
25 Security Intelligence Review Committee, SIRC Report 2001–2002: An Operational Audit
of the Canadian Security Intelligence Service (Ottawa: Public Works and Government
Services Canada, 2002), 14–15, http://www.sirc-csars.gc.ca/pdfs/ar_2001-2002-eng.pdf.
26 O’Neill and Hughes, “History of CBNRC,” vol 3, ch 11, 87; vol 6, ch 26, 43.
27 Ibid., vol 1, ch 4, 20–21.
28 Lieutenant General Donald Bennett, “Executive Session of USIB [US Intelligence
Board], Thursday, 5 November 1970,” Memorandum for the record, 9 November 1970. I
am grateful to the late Jeffrey Richelson for providing a copy of this document to me.
29 According to Jeffrey Richelson, both Canada and the United Kingdom assisted in process-
ing CANYON traffic: Jeffrey Richelson, “Eavesdroppers in Disguise,” AIR FORCE Maga-
zine, August 2012, 58–61, http://www.airforcemag.com/MagazineArchive/Documents/
2012/August%202012/0812eavesdroppers.pdf.
52 But it has happened on one occasion. See Bill Robinson, “CSE Commissioner: CSE
Violated Law,” Lux Ex Umbra (blog), 28 January 2016, https://luxexumbra.blogspot.
ca/2016/01/cse-commissioner-cse-violated-law.html.
53 Keith Coulter, “CSE’s Post-9/11 Transformation” (speech to the Canadian Association
of Security and Intelligence Studies conference, 15 October 2004), Internet Archive,
https://web.archive.org/web/20060502140839/http://www.cse-cst.gc.ca:80/documents/
publications/casis-speech.pdf.
54 Wortman and Fraser, History of Canadian Signals Intelligence and Direction Finding, 131.
771 Communications Research Squadron was disbanded in December 2002: Christine
Grimard, “15 Years of Service Remembered,” Maple Leaf, 9 April 2003.
55 Coulter, “CSE’s Post-9/11 Transformation.”
56 Canada, Bill C-36, An Act to amend the Criminal Code, the Official Secrets Act, the Can-
ada Evidence Act, the Proceeds of Crime (Money Laundering) Act and other Acts, and to
enact measures respecting the registration of charities in order to combat terrorism, 1st
Sess, 37th Parl, LEGISinfo, http://www.parl.ca/LegisInfo/BillDetails.aspx?Language=en
&Mode=1&billId=73328.
57 Canadian Press, “Secretive Federal Spy Agencies Get $47 Million for New Technology,”
19 October 2001.
58 Office of the Communications Security Establishment Commissioner, Annual Report
2013–2014 (Ottawa: Public Works and Government Services Canada, 2014), 32, https://
www.ocsec-bccst.gc.ca/a37/ann-rpt-2013-2014_e.pdf.
59 “Communications Security Establishment (CSE) – Our Good Neighbor to the North,”
7 August 2003, SIDtoday (internal NSA publication), https://theintercept.com/
snowden-sidtoday/3008306-communications-security-establishment-cse-our/.
60 Keith Coulter, testimony to Special Senate Committee on the Anti-terrorism Act, 11
April 2005, Senate of Canada, https://www.sencanada.ca/en/Content/SEN/
Committee/381/anti/07evb-e. In later testimony, he used the figure “over 75%” and included
counter-intelligence activities in the count: Coulter, testimony to Subcommittee on
Public Safety and National Security of the Standing Committee on Justice, Human
Rights, Public Safety and Emergency Preparedness, House of Commons, http://www.
ourcommons.ca/DocumentViewer/en/38-1/SNSN/meeting-11/evidence. In both cases,
the figure likely included the 20–25 percent of the CSE budget then spent on the IT
Security program.
61 John Adams, testimony to the Standing Senate Committee on National Security and
Defence, 30 April 2007, Senate of Canada, https://sencanada.ca/en/Content/Sen/
committee/391/defe/15evb-e.
62 Office of the Communications Security Establishment Commissioner, “Role of the
CSE’s Client Relations Officers and the Operational Policy Section (D2) in the Release
of Canadian Identities,” 30 March 2007, 7, released in redacted form. The actual number
was redacted from the document, but it can easily be seen that it consists of three digits
and begins with a three.
63 Greg Fyffe, “The Canadian Intelligence Community after 9/11,” Journal of Military and
Strategic Studies 13, 3 (Spring 2011): 6.
64 “The Global Network Forum (Update #1),” 22 October 2004, SIDtoday, https://the
intercept.com/snowden-sidtoday/3676087-the-global-network-forum-update-1/; see also
“Coming Soon: A SID Classification Guide,” 1 March 2005, SIDtoday, https://theintercept.
com/snowden-sidtoday/3991126-coming-soon-a-sid-classification-guide/.
65 The directive was signed on 15 March 2004. The first ministerial directive specifically
on metadata was signed on 9 March 2005. According to the latter directive, “metadata
is defined as information associated with a telecommunication to identify, describe,
manage or route that telecommunication or any part of it as well as the means by which
it was transmitted, but excludes any information or part of information which could
reveal the purport of a telecommunication, or the whole or any part of its content.”
66 Communications Security Establishment (CSE), “CSEC SIGINT Cyber Discovery:
Summary of the Current Effort” (slide deck, November 2010), 13, https://christopher-
parsons.com/Main/wp-content/uploads/2015/02/cse-csec-sigint-cyber-discovery.pdf.
67 Keith Coulter, testimony to the Special Senate Committee on the Anti-terrorism
Act, 11 April 2005, Senate of Canada, https://www.sencanada.ca/en/Content/SEN/
Committee/381/anti/07evb-e.
68 Fred Kaplan, Dark Territory: The Secret History of Cyber War (New York: Simon and
Schuster, 2016), 156–57.
69 “Dealing with a ‘Tsunami’ of Intercept,” SIDtoday, 29 August 2006, https://www.eff.org/
files/2015/05/26/20150505-intercept-sidtoday-tsunami-of-intercept-final.pdf.
70 Hilbert and López, “The World’s Technological Capacity,” 63.
71 John Adams, testimony to the Standing Senate Committee on National Security and
Defence, 30 April 2007, Senate of Canada, http://www.parl.gc.ca/Content/SEN/Committee/
391/defe/15evb-e.htm.
72 Christopher Williams, “Jacqui’s Secret Plan to ‘Master the Internet,’” The Register, 3 May
2009.
73 National Security Agency (NSA), “SIGINT Mission Strategic Plan FY2008–2013,” 3
October 2007, 4, https://www.eff.org/files/2013/11/15/20131104-nyt-sigint_strategic_plan.
pdf.
74 NSA, “PRISM/US-984XN Overview” (slide deck, April 2013), 6, https://snowden
archive.cjfe.org/greenstone/collect/snowden1/index/assoc/HASH01f5/323b0a6e.dir/
doc.pdf.
75 NSA, “PRISM Expands Impacts: FY12 Metrics,” 19 November 2012, https://www.aclu.
org/foia-document/prism-expands-impacts-fy12-metrics.
76 NSA, “SIGINT Mission Strategic Plan FY2008–2013,” 8.
77 “Mathematics of Information Technology and Complex Systems: Research,” Inter-
net Archive, https://web.archive.org/web/20070519133815/http://www.iro.umontreal.
ca:80/~bengioy/mitacs/Research.htm.
78 CSE, “CSEC ITS/N2E: Cyber Threat Discovery” (slide deck, 2010), 47, https://christopher-
parsons.com/Main/wp-content/uploads/2015/03/csec-its-dsco-2010-20101026-final.pdf.
79 Michael Feldman, “Cray Pushes XMT Supercomputer into the Limelight,” HPCwire, 26 Jan-
uary 2011, https://www.hpcwire.com/2011/01/26/cray_pushes_xmt_supercomputer_
into_the_limelight/.
80 For further discussion of the New Analytical Model, see Chapter 6.
81 As of 2020, the total had grown to approximately 2,900.
82 “Introduction to CSE Deck,” CSE, November 2015, 3, released in redacted form under
Access to Information Request Number A-2015-00067.
83 CSE, “Experienced Professionals and New Graduates,” 24 April 2012, Internet Archive,
https://web.archive.org/web/20130527193541/http://www.cse-cst.gc.ca/home-accueil/
careers-carrieres/professionals-professionnels-eng.html.
84 SIGINT Strategy 2012–2016 (Fort Meade, MD: National Security Agency, 23 February
2012), 2, https://www.eff.org/files/2013/11/25/20131123-nyt-sigint_strategy_feb_2012.
pdf.
85 “Unauthorized Disclosures,” CERRID #20084275, CSE, contained in briefing binder
prepared for the Chief of CSE in March 2015, released under Access to Information
Request Number A-2015-00021.
86 David Anderson, Report of the Bulk Powers Review (London: Williams Lea Group,
August 2016), 105.
87 David Wroe, “How the Turnbull Government Plans to Access Encrypted Messages,” Sydney
Morning Herald, 11 June 2017.
88 Ian MacLeod, “Quantum Computing Will Cripple Encryption Methods within Decade,
Spy Agency Chief Warns,” Ottawa Citizen, 23 September 2016.
89 Government Communications Headquarters, “Mobile Theme Briefing: May 28 2010”
(slide deck, 28 May 2010), 2, https://christopher-parsons.com/Main/wp-content/uploads/
2014/12/gchq-mobile-theme-briefing.pdf.
90 NSA, “PRISM Based Reporting June 2011–May 2012” (slide deck, 13 June 2012), https://
www.aclu.org/foia-document/prism-based-reporting-june-2011-may-2012. See also NSA,
“PRISM Expands Impacts.”
91 A briefing note produced by CSE appears to show that a four-digit number of reports
were produced in FY 2011–12, i.e., from April 2011 to March 2012: “CSEC Metadata
Collection,” CSEC ref: 1327209, 18 June 2013, released in redacted form under Access to
Information Request Number A-2013-00058.
92 For example, “In 2013–14, CSE issued [redacted five-digit number] intelligence reports
(known as End Product Reports, or EPRs) in line with GC intelligence priorities.”
Annual Report to the Minister of National Defence 2013–2014 (Ottawa: Communications
Security Establishment, 2014), 2, released in redacted form under Access to Information
Request Number A-2015-00086.
93 CSE, “Risk Assessment: Information Sharing with the Second Parties” (draft, 18 Decem-
ber 2015), 3, released in redacted form under Access to Information Request Number
A-2015-00052.
94 For more on Bill C-59 as it pertains to CSE, see Christopher Parsons, Lex Gill, Tamir
Israel, Bill Robinson, and Ronald Deibert, “Analysis of the Communications Security
Establishment Act and Related Provisions in Bill C-59 (An Act respecting national secu-
rity matters), First Reading (December 18, 2017)” (Citizen Lab/Canadian Internet Policy
and Public Interest Clinic report, December 2017), https://citizenlab.ca/wp-content/
uploads/2018/01/C-59-Analysis-1.0.pdf.
6
Pixies, Pop-Out Intelligence, and Sandbox Play
The New Analytic Model and National Security
Surveillance in Canada
Scott Thompson and David Lyon
any privacy legislation? There was a recognition that massive amounts of data
were becoming available, especially following the rapid development of Web
2.0, and that CSE was working towards exploiting those new data sources3 (see
also Chapter 14).
What was not fully recognized, however, was that a wholesale shift to big data
practices was underway, creating a watershed in data handling. As Carrie Sand-
ers and James Sheptycki put it (in relation to policing), this new development
amounts to the “algorithmic administration of populations and territory ...
based on morally neutral technology.”4 This is a new mode of intelligence
gathering and analysis, based in fact on global neoliberalism, as seen, for instance,
in the public-private partnerships that help to drive the direction of the agency.
The disclosures by Edward Snowden in 2013 and later make it very clear that
among national security agencies, and especially at the NSA, there is a shift to
big data practices.5 The attempts to legalize certain data-gathering activities in
Canada, notably in the Anti-terrorism Act, known generally as Bill C-51, display
a desire to normalize big data in this realm.6 While these two factors of data
handling and public-private partnerships are clearest, there are others.
The establishment of the Tutte Institute for Mathematics and Computing in
2009 and its partnership with CSE are a reminder of a long-term association
with top researchers. CSE was created as such in 1975, from the former Com-
munications Branch of the National Research Council. Computer scientists,
mathematicians, engineers, linguists, and analysts are all found at CSE. Hints
of a new approach were available a few years ago. For example, Colin Freeze
found that new recruits at CSE were told not to emulate Hollywood’s James
Bond style of secret agents. Rather, they should act like the “traffic fairy,” a “tiny
pixie who apparently flits through computer traffic in search of secrets.”7
Here we comment first on the nature of the shift from the evolutionary
approach since 1946 to the big data turn starting in 2012. From the 1940s to
1974, the existence of what became CSE was secret, but it is now known that
signals intelligence (SIGINT) was its main mandate from the Second World
War onward. This meant telephone, radio (“wireless”), and telegraph – indeed
any system of communication used to send classified, and thus encoded or
encrypted, information. The primary user of SIGINT was the Department of
National Defence, although various leaks and scandals showed that sometimes
Canadian citizens, including prominent ones, could be in view.
The shift to big data practices depends heavily on very large-scale computing
facilities as well as technical, mathematical, and statistical expertise – hence
CSE now relies not on the tens of workers with which it began in the 1940s
but on more than 2,000 operating staff. At CSE, as elsewhere, big data is not
so much suspicion-driven as data-driven. That is, rather than a process of
targeted scrutiny of groups and individuals, big data engages in what was called,
after Snowden, “mass surveillance,” that is, the monitoring of communications to
discern patterns of relationship that may be deemed “actionable intelligence.”
It should of course be recalled that the data sought and analyzed comprise
millions or billions of bits generated by everyday transactions and commu-
nications: numbers, text, graphics, videos, images, and sensor information.
This is what the “pixie” is trained to sift through.
Second, we note some key features of CSE big data practices and comment
on their broader meanings. The term “big data” usually refers to the volume of
data in use, which can now be handled by large-scale computing facilities that
contribute to the velocity of calculation and analysis possible, along with the
variety of datasets that may be drawn into any given analysis. Each of these
characteristics is visible in the New Analytic Model. Although this name seems
to lend weight to the analytical dimensions of big data, it is no less the case that
CSE depends on new modes of data capture, especially those associated with
“data exhaust” from everyday communications and transactions and so-called
user-generated content from massive social media sources.
Third, we observe that the changes at CSE are not by any means limited to
that body. Rather, some deliberate strategies indicate a plan to encourage many
or even all government departments to use big data methods, with CSE
expertise available to assist. Moreover, through potential
legal changes and alteration of the protocols of data management policy, the
idea is to catalyze a cascading effect of these new practices throughout
government.
In what follows, we show what sorts of pressures and opportunities produced
the radical shift to big data at CSE, the patterns of development that are
becoming discernible, and the likely consequences of the adoption
of big data practices in intelligence services in Canada. This final item has a
necessarily critical edge, but is intended to prompt those engaged with CSE’s
mission to reflect more deeply on certain matters that have to do with ethics,
citizenship, and democratic oversight of which CSE is already aware, as seen in
the ATIP documents made available to us.
1) Provide a greater visibility into all collected data that w[ould] allow for the
rapid development of analytical capabilities to enrich, discover and to analyze
trends and anomalies across all relevant repositories at once; 2) Ensure that the
technological evolution supports the analysts’ quest to explore, experiment, dis-
cover, analyse and assert findings; 3) [REDACTED]; 4) Enable more effective
and efficient implementation and performance of analytic methods by lever-
aging computer automation and machine learning in order to aid in the for-
mulation of reasonable and actionable conclusions from sometimes conflicting
information; 5) Evolve the deliverables beyond static one-way interface and aim
towards an interactive mechanism through cutting edge interactive visualiza-
tion that present valuable insights in impactful ways; 6) Provide the ability to
plan, share and collaborate, monitor and measure performance, analyze results,
predict outcomes and strategize among peers, across teams, across units and,
ideally, set the stage for continued extension of these features.13
Where previously a single analyst or team would work to target, translate, and
report on identified targets, acting in more siloed, specialist areas, the NAE
would call for “a complete reworking of the DGI analyst’s task, working environ-
ment, and skill set,” acknowledging “that the role of analysis needs to undergo
a revolution of sorts, focused on innovation and sharing and collaboration.”14
Analysts would shift “from being a unique service provider, to being just one
perfect; will classify some incorrectly)” – with the NAM approach, the import-
ance of the analyst is highly stressed, as “one needs to start from the types of
questions we want to answer, then obtain the relevant data, even if it seems
expensive to do this at the beginning of the project.”34 Beyond simply involving
the analyst, CSE internal documents on big data and machine learning specific-
ally assert that fully automated approaches are incompatible with effective
knowledge development, noting that
by themselves, data analysis tools will not be useful unless there are analysts
who understand both the tools and the data ... Instead the tools will be part of a
process, which includes people analyzing the results produced by the tools and
using their own knowledge of the business and data to draw conclusions. This
analysis will include using good data visualization and other techniques to bet-
ter understand the results. Usually those conclusions will be hypotheses which
will need to be further tested in an ongoing process where the users continu-
ously refine the results. In CSE’s experience, pursuing the “Star Trek” vision has
consistently led to a dead end.35
• The data will always remain no more than a sample, taken from a certain van-
tage point.
• The newly developed systems are designed to capture certain kinds of data, and
the analytics and algorithms must already have been scientifically tested in spe-
cific ways.
• Data never speak for themselves without being framed in some way and are not
inherently meaningful – the correlations may well be random.44
• Information that “pops out” of the data is inadequate insofar as it lacks contex-
tual and domain-specific knowledge.
Pixies, Pop-Out Intelligence, and Sandbox Play 121
This suggests that the hesitations hinted at in the ATIP documents are appropri-
ate. Deep knowledge of prior modes of intelligence gathering would appear to
be vital for correctly interpreting data generated by the NAM.
Surveillance is always and everywhere a means of making visible certain
individuals or groups such that, in the case of national security intelligence-
gathering operations, they may be assessed for their possible connections with
criminal, violent, or terrorist activities. The shift to the NAM at CSE means that
new kinds of visibility emerge, different from older targeted methods and
involving “bulk” data. This is where the often missing “V” of big data – vulner-
ability – becomes acutely significant.45 Evidence from many other areas of big
data practice suggests that unless extreme care is taken, big data practices can
make some already marginalized or disadvantaged groups more vulnerable
than others.46
In the organization of intelligence work, the shift to big data practices may
be seen in the use of new gaming metaphors, new partnerships, and, significantly,
new concerns about whether the NAM will completely eclipse older practices.
Again, these kinds of shifts are not uncommon in big data practices, especially
in their corporate settings. Gaming is seen as a way of organizing teams of
workers and as a model of how progress is made in the search for actionable
data. This in turn plays into the kinds of partnerships evident among a variety
of players, especially among those who would otherwise lack the resources
to mount big data centres of analysis on their own. Universities and
technology companies frequently form such alliances in order to share ideas
and expertise. Lastly, while there is much hype surrounding big data practices,
some sober assessments acknowledge the need to combine older with newer
methods in order to make reliable claims about what the data indicate.
The gaming metaphors are significant. Those highlighted in the ATIP docu-
ments include the “sandbox” and the “Minecraft” models. Based originally
on a children’s exploratory playspace, a sandbox in software development is
a space set apart from production systems, for learning, testing, and experi-
menting. But the sandbox also hints at limits, the ways in which certain kinds
of experimentation may be constrained to protect items such as data, code,
and servers from potential damage resulting from changes to the system. So
while play is present, in this case it is clearly purposeful play that is in view.
The same is true of the Minecraft analogy, where, in the SIGINT lab, “starting
with a blank slate,” analysts have opportunities to “experiment, explore, create
and innovate in this universe.” The documents state, curiously, that the analysts
are bound “by the rules of physics” (really, no others?) and that within the
“Minecraft” lab they are free to innovate at will without fear of doing
damage.
Conclusion
From the evidence presented here, everything has changed at CSE in the wake
of decisions to switch to big data practices, known at CSE as the New Analytic
Model. The aim is to improve national security in an age of exploding com-
munications media. While some aims of the NAM are worthy ones, given the
urgency of dealing with global crime, violence, and terrorism, the methods and
modes of expertise chosen are very heavily weighted towards mathematics,
computing, and engineering. While this is appropriate for grappling with the
immense quantity of data available, it is not clear that sufficient attention is
being paid to the quality of the intelligence thus gleaned. The problem with
abandoning old methods in favour of new is that the risk of threatening situa-
tions may be misconstrued, with negative consequences for individuals and
groups. In CSE’s zeal to avoid human framing and “bias,” blinkers appear around
the very nature of the data and the inevitability that they are “framed” from the
outset.
Along with this – and possibly CSE is aware of the problem, however dimly –
is the ironic fact that without human involvement in the process of data analysis, the
likelihood of successful utilization of big data for security intelligence and sur-
veillance will remain slim. The focus on algorithms and machines takes attention
away precisely from the crucial matter of context. As a 2017 article states, “these
tools cannot replace the central role of humans and their ability to contextualize
security threats. The fundamental value of big data lies in humans’ ability to under-
stand its power and mitigate its limits.”48 And, we might add, not just any humans
will do. While those trained in mathematics, computing, and engineering are essential
to the task, if these skills are not complemented with ones from, for example, law,
the social sciences, and humanities, the essential task of contextualizing analysis not
only will be poorer but could raise human rights, civil liberties, and privacy problems
for vulnerable individuals, for whole classes of persons, and indeed for everyone.
Notes
1 Communications Security Establishment (CSE), “Opportunities for Professionals –
Foreign Language Intelligence Analysts,” 1 September 2017, https://web.archive.org/web/
20170825040050/https://cse-cst.gc.ca/en/node/1402.
2 Under Canada’s federal Access to Information Act or Privacy Act, individuals (either citi-
zens, permanent residents, or those currently present in Canada) and corporations are
able to make requests to federal government departments and organizations to obtain
documents relating to a given subject (the provinces also have similar legislation to cover
their departments and organizations). The acts do place some limitations regarding what
kinds of information can be released, and it is at times challenging to obtain documents
on subjects like national security or policing. Requests need to be written in such a way
as to avoid asking for current practices or ongoing investigations; they also need to be
made with the proper language or keywords of the institution. The language for the
request upon which much of this chapter is based (#A-2016-00068), for example, was
for all “high level briefing notes, presentations, policy framework documents or reports
specifically related to CSE’s definition of ‘Big Data,’ ‘Knowledge Discovery,’ and/or ‘Data
Mining’ and their impact on the mission.” The collected volume by Jamie Brownlee and
Kevin Walby, Access to Information and Social Justice: Critical Research Strategies for
Journalists, Scholars, and Activists (Winnipeg: ARP Books, 2015), is an excellent start-
ing point for those looking to better understand the use of Access to Information and
Privacy (ATIP) legislation for research, journalism, or social activism in Canada. ATIP
requests can be made online (https://atip-aiprp.apps.gc.ca/atip/welcome.do), and the
preceding two years of completed requests are also catalogued. The Surveillance Stud-
ies Centre, along with Queen’s University, is currently working to develop a repository
where the full texts of completed ATIP requests will be made available to researchers and
the general public, including those used in this chapter.
3 CSE, “NAM Episode II: Rise of the New Analytical Environment, CSE PowerPoint Pre-
sentation Deck,” 193, released under Access to Information Request Number A-2016-
00068 (2016).
4 Carrie Sanders and James Sheptycki, “Policing, Crime and ‘Big Data’: Towards a Critique
of the Moral Economy of Stochastic Governance,” Crime, Law and Social Change 68, 1–2
(2017): 1–15.
5 David Lyon, “Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique,”
Big Data and Society 1, 2 (2014): 1–13.
6 Craig Forcese and Kent Roach, False Security: The Radicalization of Canadian Anti-
Terrorism (Toronto: Irwin Law, 2016).
7 Colin Freeze, “Canadian Spy Manual Reveals How New Recruits Are Supposed to Con-
ceal Their Identities,” Globe and Mail, 22 December 2013.
8 CSE, “NAM Episode II,” 193.
9 CSE, “[REDACTED TITLE], CSE PowerPoint Presentation Deck,” 175, released under
Access to Information Request Number A-2016-00068 (2016); Communications Secu-
rity Establishment, “NAM Episode II,” 193.
10 CSE, “[REDACTED TITLE],” 184.
11 CSE, “Analytic Environment for DGI: Business Case Proposal – Draft,” 98, released
under Access to Information Request Number A-2016-00068 (2016).
12 CSE, “NAM Episode II,” 191.
13 CSE, “Analytic Environment for DGI,” 100–1.
14 CSE, “NAM Episode II,” 193.
15 CSE, “[REDACTED TITLE],” 174.
16 CSE, “NAM Episode II,” 205.
17 CSE, “[REDACTED TITLE],” 189.
18 CSE, “Analysis and Production Evolution: A Simplified Take on the NAM, CSE Power-
Point Presentation Deck,” 162, released under Access to Information Request Number
A-2016-00068 (2016).
19 Ibid.; CSE, “Analytic Environment for DGI,” 96.
20 Ibid., 100.
21 CSE, “NAM Episode II,” 205.
22 Ibid.
23 CSE, “[REDACTED TITLE],” 189.
24 CSE, “Analytic Environment for DGI,” 97.
25 CSE, “Machine Learning, CSE PowerPoint Presentation Deck,” 142, released under
Access to Information Request Number A-2016-00068 (2016).
26 Ibid.
27 Ibid.
28 CSE, “NAM Episode II,” 197–98.
29 Greg Weston, “Inside Canada’s Top-Secret Billion-Dollar Spy Palace,” CBC News, 8 Octo-
ber 2013, http://www.cbc.ca/news/politics/inside-canada-s-top-secret-billion-dollar-spy
-palace-1.1930322.
30 Canadian Council for Public-Private Partnerships, The Canadian Council for Public-
Private Partnerships 2011 National Award Case Study Silver Award for Project Financ-
ing: Communication Security Establishment Canada Long-Term Accommodation Project
(Ottawa: Canadian Council for Public-Private Partnerships, 2011), 8.
31 CSE, “Big Data for Policy Development 2014, CSE PowerPoint Presentation Deck,” 30,
released under Access to Information Request Number A-2016-00068 (2016); CSE, “Big
Data Discussion Paper,” 4–5, released under Access to Information Request Number
A-2016-00068 (2016).
32 CSE, “Big Data Discussion Paper.”
33 CSE, “Machine Learning, CSE PowerPoint Presentation Deck,” 142, released under
Access to Information Request Number A-2016-00068 (2016).
34 Ibid., 129, 142.
The largest, most significant big data surveillance operations are arguably
the global Internet interception and analysis activities of the Five Eyes security
alliance. Consisting of the US National Security Agency (NSA) and its signals
intelligence partners in the United Kingdom, Canada, Australia, and New
Zealand, this alliance is capable of intercepting and analyzing much of the
world’s Internet communications as they flow through major switching centres.
While the public has learned a great deal from whistle-blowers such as Edward
Snowden about how the American and British agencies conduct surveillance
within their domestic networks, Canadians know relatively little about similar
operations by Canada’s own Communications Security Establishment (CSE).
Even at the best of times, secretive security agencies like CSE pose an inherent
dilemma for liberal democracies. Public transparency and accountability of
state institutions are fundamental tenets of democratic governance. This inevit-
ably creates tension with the secrecy that such agencies require to be effective
in their missions around national security. Central to achieving national security
is upholding the democratic rights upon which the integrity of the nation
ultimately depends. If citizens fear that their personal communications may be
intercepted unjustifiably and feel at risk of being treated more as threats than
rights holders, they are more likely to withdraw from the public sphere, with-
hold support for government initiatives, or even subvert them. Pursuing total
secrecy is not a viable long-term option. Security agencies need to be sufficiently
open about their activities to demonstrate that they respect the privacy, freedom
of expression, and other rights of individuals. With an implicit duty of public
candour, the onus is on them to hold themselves accountable and thereby earn
public trust. Achieving an appropriate balance of secrecy and transparency is
thus difficult and contingent.
Being publicly transparent is especially important when there are clear
indications that an agency has violated public trust. The Snowden revelations
of 2013 together with the recently legislated expansion of CSE’s mandates
make questioning CSE’s highly secretive posture as well as its dubious sur-
veillance practices particularly urgent. Yet when leaked documents have
indicated that CSE is conducting domestic Internet surveillance comparable to
Limits to Secrecy 127
that of its Five Eyes partners, its official responses generally go no further than
bland assertions of legal compliance:
Our activities are guided by a robust framework of Ministerial Directives and oper-
ational policies. CSE’s activities, as well as its operational directives, policies and
procedures, are reviewed by the CSE Commissioner, to ensure they are lawful.2
Given that its partner agencies have been caught lying publicly about their
domestic surveillance activities, stretching legal definitions and mandates far
beyond conventional interpretations, and engaging in activities that in Canada
could arguably be considered unconstitutional, CSE’s vague statements provide little
reassurance. Without being more open and offering persuasive details, CSE
invites questions about its integrity as well as whether existing laws and other
regulatory measures are sufficiently robust to keep such a powerful agency
within established democratic norms.
Canadians have shown a keen interest in issues of privacy, freedom of expres-
sion, and other democratic rights in relation to domestic surveillance, law
enforcement, and national security. Recent legislative initiatives around lawful
access (2011) and anti-terrorism (2015) triggered politically potent controversies.
In 2017, Bill C-59 (An Act respecting national security matters)3 raised new con-
cerns and reinvigorated the public debate. At that time CSE exercised secret
powers based on three broad mandates:
A. to acquire and use information from the global information infrastructure
for the purpose of providing foreign intelligence, in accordance with Govern-
ment of Canada intelligence priorities;
B. to provide advice, guidance and services to help ensure the protection of elec-
tronic information and of information infrastructures of importance to the
Government of Canada;
C. to provide technical and operational assistance to federal law enforcement
and security agencies in the performance of their lawful duties.4
Bill C-59 expanded these already formidable powers, adding two mandates that
give CSE brand-new offensive and defensive cyber authorities. It also created
the National Security and Intelligence Review Agency (NSIRA) and the Intel-
ligence Commissioner. With these novel review and oversight bodies, the passage
of Bill C-59 in 2019 did not end the national security debate so much as shift it to
a new, promising phase.
128 Andrew Clement
For insight into these questions, and more generally in order for Canadians to
participate meaningfully in the national security debate, a deeper understanding
of CSE’s practical capabilities is needed.
This chapter seeks to equip Canadians more fully for this ongoing debate.5
The broad aim is to articulate “capability transparency” as a principle that
security agencies such as CSE should be held to as a core feature of their
democratic accountability. It makes the case that CSE has the capability to
intercept Canadians’ domestic Internet communications in bulk. It does not
seek to establish this claim “beyond a reasonable doubt,” as might be expected
of conventional scholarly writing or in a criminal prosecution. Given the
current degree of secrecy and obfuscation, this claim cannot be settled one
way or another solely with publicly available information. Rather, the immedi-
ate goal is more modest – to reach the lower standard of “reasonable suspi-
cion.” This means that where evidence of mass surveillance is ambiguous,
the benefit of the doubt does not go to CSE but instead can add weight to
reasonable suspicion. In light of the discussion above about secretive security
agencies operating within democratic norms, this mode of argumentation
should be sufficient to place the burden of proof squarely on CSE, which
should provide clear public evidence that either contradicts the claims here
interception capabilities that CSE had achieved by 2012. The CBC’s central
claim – that CSE could track, backward and forward in time, individual Canadians
who accessed the Internet via Wi-Fi hotspots as they passed through airports,
hotels, libraries, and other popular locations – was based on a CSE presentation
describing a trial program in which CSE took up the challenge “to develop [a]
new needle-in-a-haystack analytic.”7 This involved linking travellers’ user IDs
with the IP addresses locally assigned to their devices. However, this was done
not by intercepting Wi-Fi signals, as the article ambiguously suggests, but
through the untargeted (i.e., bulk) interception of all Internet traffic passing
through the switching centres of major ISPs. Only a small fraction of this traffic
would include communications originating at public Wi-Fi spots, such as at an
airport. To extract the communications of individuals using the airport Wi-Fi,
analysts looked for originating IP addresses that showed specific characteristic
usage patterns. From this, they developed a signature profile for various kinds
of public Wi-Fi hotspots and inspected all the user IDs of individuals who
moved between them.
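CSE’s actual signature features are not disclosed in the presentation. As a purely hypothetical illustration of what profiling a “public Wi-Fi hotspot” from bulk intercept might involve, one plausible ingredient is the number of distinct user IDs observed behind a single originating IP address (the function name, threshold, and sample data below are all invented for this sketch):

```python
from collections import defaultdict

def hotspot_like(events, min_distinct_ids=50):
    """Flag originating IPs that carry many distinct user IDs -
    one hypothetical ingredient of a public-hotspot signature.

    events: iterable of (originating_ip, user_id) pairs.
    """
    ids_per_ip = defaultdict(set)
    for ip, uid in events:
        ids_per_ip[ip].add(uid)
    return {ip for ip, ids in ids_per_ip.items() if len(ids) >= min_distinct_ids}

# An airport gateway shows hundreds of IDs; a household shows one.
events = [("203.0.113.7", f"user{i}") for i in range(200)]   # hotspot-like
events += [("198.51.100.2", "alice")] * 40                   # single household
print(hotspot_like(events))  # {'203.0.113.7'}
```

A real profile would presumably combine several such features (diurnal rhythm, session churn, device variety), but the basic move – aggregating per originating IP and thresholding – is the same.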
To understand what data CSE had access to from its interception operations
and how it could track individuals requires a closer look at the technical char-
acteristics of Internet communications. An IP address, such as for the router at
a Wi-Fi gateway, is included in the header of every packet originating from or
destined for that router. These IP addresses used for routing fit the conventional
definition of communication metadata, that is, of the “shallow” kind – “infor-
mation used by computer systems to identify, describe, manage or route com-
munications across networks.”8 However, the user IDs that CSE needed for
tracking individuals from one site to another are not found in packet headers
used for routing but only within the packet “payload.”
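The header/payload distinction can be made concrete with a short sketch. The field offsets follow the standard IPv4 header layout; the sample packet bytes are fabricated for illustration:

```python
import socket
import struct

def parse_ipv4_header(packet: bytes):
    """Extract "shallow" routing metadata from an IPv4 packet.

    Source and destination addresses sit at fixed offsets (bytes 12-19)
    of the header; everything after the header is payload, which is
    where identifiers such as cookies or account names would reside.
    """
    version_ihl = packet[0]
    ihl = (version_ihl & 0x0F) * 4          # header length in bytes
    src, dst = struct.unpack("!4s4s", packet[12:20])
    return {
        "src": socket.inet_ntoa(src),
        "dst": socket.inet_ntoa(dst),
        "payload": packet[ihl:],            # user IDs live in here
    }

# A fabricated 20-byte header (most fields zeroed) plus an HTTP payload:
packet = bytes([0x45, 0, 0, 40, 0, 0, 0, 0, 64, 6, 0, 0,
                192, 168, 0, 10,            # source address
                93, 184, 216, 34])          # destination address
packet += b"GET / HTTP/1.1\r\nCookie: uid=42\r\n"
info = parse_ipv4_header(packet)
print(info["src"], info["dst"])  # routing metadata: 192.168.0.10 93.184.216.34
```

Reading `info["src"]` and `info["dst"]` needs nothing beyond the header; recovering the `uid=42` cookie requires looking inside the payload, which is the move that takes an interceptor beyond "shallow" metadata.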
The CSE presentation does not mention what was used as the basis for user
IDs, but almost certainly it included some combination of individualized online
account names (e.g., for accessing Facebook, Yahoo, or Gmail) and “cookies.”
To reliably extract these IDs from Internet communication collected in bulk,
as would have been needed for the experiment described in the presentation,
CSE would have had to use deep packet inspection (DPI) facilities to reassemble
packets into the original full messages for analysis. This is because an account
name or cookie may be broken across more than one packet. In other words,
to achieve this fine degree of granularity, CSE must have access to message
content, not just the metadata needed for routing.
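Why per-packet matching fails, and stream reassembly succeeds, can be shown with a minimal sketch; the segment contents and sequence numbers are invented for the example:

```python
def reassemble(segments):
    """Reorder TCP segments by sequence number and join their payloads."""
    return b"".join(payload for _, payload in sorted(segments))

# A session identifier split across two out-of-order segments:
segments = [(1455, b"ssionid=abc123\r\n"), (1445, b"Cookie: se")]

# Inspecting packets one at a time never sees the identifier ...
print(any(b"sessionid" in payload for _, payload in segments))  # False

# ... but it appears once the stream is reassembled.
stream = reassemble(segments)
print(b"sessionid=abc123" in stream)  # True
```

This is the essential reason deep packet inspection equipment must buffer and reassemble flows rather than filter packet by packet.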
Once user IDs have been extracted from intercepted communications, track-
ing them across the various sites from which they access the Internet, and
especially back in time before becoming an ID of interest, involves a further
impressive but ominous computational feat. Foremost, it requires the ability to
For the CSE presenters, this means that “EONBLUE will be integrated into the
Network ... [to enable] Monitoring Core Infrastructure (Special Source)
extending the reach to view national infrastructure.” Under the catchphrase
“The Network is the Sensor,” the presentation anticipates that for domestic
Figure 7.1 Canadian cyber sensor grid. This is a slide from a top-secret CSE presentation that
shows the location of CSE’s sensors in various parts of the global information infrastructure,
including within the Canadian Internet Space and at entry points into Government of Canada
networks. | Source: “CASCADE: Joint Cyber Sensor Architecture,” https://snowdenarchive.cjfe.
org/greenstone/collect/snowden1/index/assoc/HASH9a6d.dir/doc.pdf.
defence, “the same capabilities [as deployed for foreign signals intelligence
(SIGINT) interception] will be integrated into the CORE of the [Canadian]
Internet.”
Some caveats are in order here about interpreting these CSE documents.
These slide decks are not simply factual reports but are intended in part to
impress colleagues with accomplishments and argue for more resources. And
of course they are intended to accompany a verbal presentation within a wider
security agency milieu to which the public has no access; they assume knowledge
of obscure technical, tradecraft, and codeword terms; and they are several years
old. Capabilities that may have existed at the time may have been discontinued,
and new ones added. Many of the statements in the 2011 and earlier documents
refer to interception capabilities in the planning stage, some of which may
not have come to fruition. However, sensor performance reported in the
2012 document on which the airport Wi-Fi story was based is consistent with
the operational and projected capabilities mentioned previously, and strongly
suggests that CSE may have accomplished many of its goals around intercepting
Internet communications within Canada.
Given CSE’s clear ambitions, growing funding, and expanded collection dur-
ing this recent period, it is reasonable to suspect that by 2020 CSE had success-
fully developed extensive domestic interception capabilities – that is, it is likely
routinely intercepting, filtering, storing, and analyzing Internet communications
in bulk within Canada. To accomplish this, CSE would have needed the co-
operation, whether willing or coerced, of major Canadian telecommunication
providers in installing interception equipment. Furthermore, the data captured
would go well beyond metadata, in the conventional meaning of CSE’s definition
above, and draw on message content. This kind of activity constitutes mass,
population-wide, content-based, suspicion-less surveillance that would intrude
upon the communications of millions of Canadians.
NSA and GCHQ surveillance operations, this interception is likely not done
right at the border but at the first Internet exchange where the cross-border
traffic is next routed to its various destinations.
An obvious way to identify the cross-border switching centres in Canada is
to examine the route maps that various major telecom companies provide
publicly for promoting their Internet businesses. These maps paint a consistent
picture of the main fibre routes that cross the border and where they connect
with the Canadian Internet backbone. Collating the information from the route
maps of major Canadian carriers as well as large foreign carriers providing
transit services in the Canadian market indicates that nine cities host nearly all
the cross-border Internet connections: Victoria, Vancouver, Calgary, Winnipeg,
Windsor, Hamilton, Toronto, Montreal, and Halifax/Dartmouth.
It would be reasonable to expect that CSE would seek to install interception
devices at the main Internet switching centres in each of these cities, and thereby
capture close to 100 percent of inbound Internet traffic. However, especially if
resources are limited, some cities will be prioritized over others, depending on
their relative volumes of traffic and which telecom providers will need to be
enrolled. For this it is helpful to analyze the actual routes that data follow in
practice. In the absence of public reporting of such detailed Internet traffic
statistics, the IXmaps Internet mapping and analysis tool can provide useful
insights.
Figure 7.2 Boomerang route originating and terminating in Toronto. In this map showing
southern Ontario and the northeastern United States, a line, representing a data path, is
drawn from the origin in Toronto to New York City, then to Chicago, and finally to Toronto,
the final destination. Both New York City and Chicago are marked with an icon indicating
that they are sites of NSA surveillance. | Source: Adapted from Andrew Clement and Jonathan
Obar, “Canadian Internet ‘Boomerang’ Traffic and Mass NSA Surveillance: Responding to
Privacy and Network Sovereignty Challenges,” in Law, Privacy and Surveillance in Canada in
the Post-Snowden Era, edited by Michael Geist (Ottawa: University of Ottawa Press, 2015), 21.
Cartography by Eric Leinberger.
show the main Internet routes entering and traversing Canada, and reveal
the main network operators and their switching centres. The following
analysis considers the more than 250,000 traceroutes users contributed to
the database in the twelve-month period from 1 December 2016 to 30 Nov-
ember 2017. Of these, over 75,000 routes entered Canada. Table 7.1 shows in
ranked order the top five metropolitan areas where the first router in Canada
along the route is located. The percentages refer to the proportion of routes
in relation to the number of routes for which a city can be identified. In other
words, over 90 percent of incoming Internet routes make their first Canadian
hop in just these five metropolitan areas before being routed to their ultim-
ate destinations within Canada. It is these cities that CSE would be most likely
Table 7.1
Top border cities for Internet traffic entering Canada
Table 7.2
Top carriers bringing Internet data into Canada
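The ranking exercise behind Table 7.1 can be sketched as a simple count over traceroute records. The data structure and sample routes below are illustrative inventions, not the IXmaps schema:

```python
from collections import Counter

def rank_border_cities(routes):
    """Rank metropolitan areas by how often they host the first
    Canadian router on routes entering Canada.

    routes: iterable of routes, each a list of (city, country) hops
    in path order.
    """
    counts = Counter()
    for hops in routes:
        first_ca = next((city for city, country in hops if country == "CA"), None)
        if first_ca is not None:
            counts[first_ca] += 1
    total = sum(counts.values())
    return [(city, n, 100.0 * n / total) for city, n in counts.most_common()]

# Three illustrative routes entering Canada from the United States:
routes = [
    [("Chicago", "US"), ("Toronto", "CA"), ("Ottawa", "CA")],
    [("New York", "US"), ("Toronto", "CA")],
    [("Seattle", "US"), ("Vancouver", "CA")],
]
for city, n, pct in rank_border_cities(routes):
    print(f"{city}: {n} ({pct:.0f}%)")
```

As in the chapter’s analysis, the percentages are taken relative to routes whose first Canadian city can be identified, which is why they sum to 100.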
Boomerang Routing
It is important to note that within the stream of Internet data entering Canada
from the United States, a significant portion (~10 percent) originated in Canada.
These are known as “boomerang” routes because of their characteristic pattern
of leaving Canada before returning to reach their destination. Figure 7.2 offers
an example of a boomerang route that both originates and terminates in Toronto
but transits via New York City and Chicago. While estimates vary, such routes
constitute approximately one-quarter or more of domestic routes (Canada to
Canada). Furthermore, nearly all Internet users in Canada that visit prominent
Canadian websites, including those of federal and provincial governments, are
likely to have their personal data transmitted in this way.19 This is significant
because CSE may have a legal basis to treat the data it intercepts in these inbound
channels as foreign-based by default, which provides a lower level of protection.
Only packet analysis after interception can indicate whether or not the data
originate with someone in Canada.
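Classifying a traceroute as a boomerang route reduces to a simple test on the sequence of hop locations; the representation below is an illustrative sketch, not IXmaps code:

```python
def is_boomerang(hops):
    """A route is a "boomerang" if it begins and ends in Canada
    but transits at least one hop outside Canada.

    hops: list of (city, country) pairs in path order.
    """
    countries = [country for _, country in hops]
    return (countries[0] == "CA" and countries[-1] == "CA"
            and any(c != "CA" for c in countries))

# The Figure 7.2 pattern: Toronto to Toronto via New York and Chicago.
toronto_via_us = [("Toronto", "CA"), ("New York", "US"),
                  ("Chicago", "US"), ("Toronto", "CA")]
domestic = [("Toronto", "CA"), ("Montreal", "CA")]
print(is_boomerang(toronto_via_us), is_boomerang(domestic))  # True False
```

Note that this classification requires knowing the endpoints’ locations; as the text observes, an interceptor at the border sees only inbound traffic and must inspect packets after capture to tell Canadian-origin boomerang traffic apart from genuinely foreign traffic.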
Table 7.3
Principal concentrations of Internet routers by metropolitan area and carrier
These claims are not conclusive, nor are they allegations of illegal behaviour on
the part of CSE or its telecommunications partners. They are, however, suffi-
ciently well founded to call into serious question CSE’s stock responses regarding
possible domestic surveillance. CSE repeatedly asserts that it follows the law,
but has yet to categorically deny that it intercepts Canadians’ Internet com-
munications in bulk. Since CSE is ultimately answerable to the Canadian public,
the mounting evidence of domestic interception places a clear onus on CSE to
be much more transparent and accountable about its Internet surveillance
capabilities and activities.
In response to the airport Wi-Fi story, CSE mentioned only its foreign intel-
ligence mandate for collecting communications, leaving open the possibility
that it may have been collecting Canadian data used for the tracking experiment
under another of its mandates. Such collection would be consistent with what
we have seen earlier about the integration of sensors in the CASCADE program
to serve multiple mandates. CSE further denied “targeting” Canadians’ com-
munication (i.e., focusing attention on already specified individuals), but what
about data that is inevitably collected “incidentally” as part of full-take intercep-
tion? It also denied “collection,”22 but what does this actually mean? Is it equiva-
lent to interception in the usual sense adopted here, or, following the NSA, does
CSE regard collection as occurring only at the point that an analyst actually
reads or listens to message content?23
An essential move for CSE to maintain the trust of Canadians would be to
state plainly whether or not it has the capability to intercept Canadians’
Internet communications in bulk, whether it is routinely doing so, and under
what mandate(s). If CSE does intercept Canadians’ communications, it would
also need to be clearer about what personal communication data it captures
and what measures are in place to ensure that personal and democratic rights
are well protected. What content is intercepted? If “only” metadata is extracted
and stored, what does this consist of? If not all message content is discarded
immediately, is metadata extraction performed later?24 While beyond the
scope of this chapter, this opens a host of other questions concerning how
CSE processes the data after the point of interception. Where are the data
stored and for how long? What organizational entities have access to them
and for what purposes? What can CSE and its partner agencies infer about
individuals’ lives from the metadata? What “minimization” and other pro-
tective procedures apply? While some of these questions may infringe on the
sources and methods that security agencies legitimately keep secret, there
should at least be some public understanding of how the boundary is drawn.
Most fundamentally, Canadians need demonstrable assurance that any
secrecy or other security measure is consistent with the Canadian Charter
of Rights and Freedoms as well as established international human rights
norms.25 In particular, it should satisfy the four-part constitutional test based
on Oakes, that any secrecy or other security measure be minimal, propor-
tionate, necessary, and effective.26
142 Andrew Clement
Acknowledgments
An earlier extended version of this chapter, with additional technical details and more
extensive endnotes, is available at SSRN, https://ssrn.com/abstract=3206875.
I am grateful for the feedback on a preliminary presentation of this work from partici-
pants in the Security and Surveillance workshop held in Ottawa on 18–19 October 2017.
In particular, my ongoing conversations with Chris Parsons as well as his “Canadian
SIGINT Summaries” (https://christopher-parsons.com/writings/cse-summaries/) have
provided invaluable insights into CSE activities. Bill Robinson, whose Lux Ex Umbra blog
(https://luxexumbra.blogspot.com/) is an important resource for better public understand-
ing of CSE, also contributed helpful feedback on an earlier draft of this chapter. The
Snowden Surveillance Archive (https://snowdenarchive.cjfe.org/greenstone/cgi-bin/
library.cgi), designed and built by archivists George Raine and Jillian Harkness, likewise
assisted in finding and interpreting relevant Snowden documents. I also appreciate the
contributions over many years of Colin McCann and other collaborators in the IXmaps
(https://ixmaps.ca/) research project that provided the basis for the empirical analysis of
Internet routing patterns. IXmaps research has received funding from Canada’s Social
Sciences and Humanities Research Council, the Office of the Privacy Commissioner of
Canada, the Canadian Internet Registration Authority, and the Centre for Digital Rights.
As is the norm for academic research, the views expressed here are those of the author
alone and not of the funding organizations or others who have assisted in the research.
Notes
1 Communications Security Establishment (CSE), “Frequently Asked Questions” (13.
Does CSE target Canadians?), 3 July 2015, https://www.cse-cst.gc.ca/en/about-apropos/
faq#q13.
2 CSE, “CSE Statement re: January 30 CBC Story – January 30, 2014,” 13 December 2016,
https://www.cse-cst.gc.ca/en/media/media-2014-01-30.
3 Bill C-59, An Act respecting national security matters, 1st Sess, 42nd Parl, 2019 (assented
to 21 June 2019), SC 2019, c 13.
4 Communications Security Establishment, “What We Do and Why We Do It,” 1 August
2019, http://www.cse-cst.gc.ca/en/inside-interieur/what-nos.
5 See also Chapters 2, 5, 8, 13, and 14.
6 Greg Weston, “CSEC Used Airport Wi-Fi to Track Canadian Travellers: Edward Snowden
Documents,” CBC News, 31 January 2014, http://www.cbc.ca/news/politics/csec-used
-airport-wi-fi-to-track-canadian-travellers-edward-snowden-documents-1.2517881.
Unless otherwise indicated, quotes in this section are from this article. Since publication
of the article, CSE no longer uses the acronym “CSEC.”
7 CSE, “IP Profiling Analytics & Mission Impacts” (slide deck, 10 May 2012), http://www.
cbc.ca/news2/pdf/airports_redacted.pdf.
8 CSE, “Metadata and Our Mandate,” 1 August 2019, https://www.cse-cst.gc.ca/en/
inside-interieur/metadata-metadonnees. See Chapter 14 in this book for the distinction
between shallow and deep metadata.
9 The slide deck gives the impression that this was another Canadian city, but CSE
denies this. See Amber Hildebrandt, “CSE Worried about How Its Use of Cana-
dian Metadata Might Be Viewed,” CBC News, 22 April 2015, http://www.cbc.ca/
news/canada/cse-worried-about-how-its-use-of-canadian-metadata-might-be-
viewed-1.3040816.
10 Laura Payton, “Spy Agencies, Prime Minister’s Adviser Defend Wi-Fi Data Collec-
tion,” CBC News, 3 February 2014, http://www.cbc.ca/news/politics/spy-agencies-prime
-minister-s-adviser-defend-wi-fi-data-collection-1.2521166.
11 CSE, “What We Do and Why We Do It.”
12 The principal CSE documents are “CSEC Cyber Threat Capabilities/SIGINT and ITS: An
End-to-End Approach” (slide deck, October 2009), https://assets.documentcloud.org/
documents/1690224/doc-6-cyber-threat-capabilities.pdf; and “CASCADE: Joint Cyber
Sensor Architecture” (slide deck, 2011), https://s3.amazonaws.com/s3.documentcloud.
org/documents/1690204/cascade-2011.pdf.
13 For more on CSE’s sensor programs, see Chapter 13.
14 CSE, “CASCADE.”
15 CSE, “What We Do and Why We Do It.”
16 See http://IXmaps.ca.
17 Andrew Clement, “IXmaps – Tracking Your Personal Data through the NSA’s War-
rantless Wiretapping Sites,” in Proceedings of the 2013 IEEE International Symposium
on Technology and Society (ISTAS) (Toronto, 27–29 June 2013), 216–23, doi: 10.1109/
ISTAS.2013.6613122.
18 Mark Klein, Wiring Up the Big Brother Machine ... and Fighting It (Charleston, SC: Book-
Surge, 2009).
19 Andrew Clement and Jonathan Obar, “Canadian Internet ‘Boomerang’ Traffic and Mass
NSA Surveillance: Responding to Privacy and Network Sovereignty Challenges,” in
Law, Privacy and Surveillance in Canada in the Post-Snowden Era, edited by Michael
Geist (Ottawa: University of Ottawa Press, 2015), 13–44. Available for free, open-access down-
load at http://www.press.uottawa.ca/law-privacy-and-surveillance or http://hdl.handle.
net/10393/32424.
20 Canadian Radio-television and Telecommunications Commission (CRTC), “Communi-
cations Monitoring Report 2017: The Communications Industry,” https://crtc.gc.ca/eng/
publications/reports/PolicyMonitoring/2017/cmr3.htm.
report was particularly notable for including the findings of its first exam-
ination into the CSIS data acquisition program.8 SIRC itself has broad
authority to examine information in the control of CSIS, but its public
reports on these examinations are terse, high-level, and tightly edited to
protect national security. Consequently, these reports typically require some
“unpacking” to extract the importance of the information provided by the
review and to use it to sketch more of the picture (or of the likely picture)
of CSIS activities.
SIRC’s 2016 report of its first examination of the CSIS data acquisition
program, in its characteristically measured, understated tone, managed to
convey that CSIS’s activities in this realm were essentially unmoored from
law.
The 2016 SIRC annual report confirms that CSIS has a program of bulk datasets,
which naturally invites the question of what kinds of bulk data collection –
generally construed as “haystack hoovering” just in case any needles might be
found – could meet the test for strict necessity.
152 Micheal Vonn
Despite this [CSIS’s agreeing that section 12 of the act applied], SIRC found no
evidence to indicate that CSIS had appropriately considered the threshold as
required in the CSIS Act.12
Gleanings from the Security Intelligence Review Committee 153
Again, it is helpful to pay careful attention to the language. This was not a matter
of a dispute about how the threshold was interpreted or even how an
interpretation was applied in some cases. This was a case of there being no
evidence to indicate that CSIS had even appropriately considered the threshold.
It is arguable that this is a failure so vast and otherwise inexplicable as to suggest
contempt for the applicable legal requirements. While it is never expressly
stated, the inescapable conclusion of SIRC’s findings is that a very large amount
(and, minus the phonebook, possibly all) of the CSIS bulk data holdings were
not lawful.
SIRC issued recommendations (it has no order-making powers) that CSIS
re-evaluate all its referential bulk datasets; undertake an assessment of its non-
referential datasets to ensure they meet the “strictly necessary” threshold; and
halt all acquisition of bulk datasets until a formal process of assessment exists
to ensure that required standards are being met.13
First, for any bulk information, a clear connection to a threat to the security
of Canada as defined in section 2 of the CSIS Act must be established. Sec-
ond, it must be established that less intrusive means that would satisfy the
intelligence requirements are not available as an alternative to bulk collection,
consistent with the principle of proportionality. Third, if there is no reason-
able alternative to bulk collection, CSIS needs to provide an objective assess-
ment of how closely connected the bulk information is to intelligence of value;
the broader the intended collection, the more strictly CSIS must establish the
connection between the bulk information and the threat-related intelligence.
[Emphasis added]14
It is stated in the 2016 SIRC report’s very brief “CSIS response” sidebar that
CSIS agreed to review its bulk data holdings and to suspend further bulk data
acquisition pending the implementation of its newly approved governance
framework.15 However, the “response” is silent as to whether that governance
framework includes any of the criteria proposed by SIRC. The read-between-
the-lines assessment would be that the SIRC guideline recommendations are
not found in the new governance framework for bulk data acquisitions.
SIRC noted in this report that it had previously seen references in CSIS docu-
ments to “the need to validate the authority to collect and manage the risk of
over-collection by confining collection to that which is ‘strictly necessary.’”16
SIRC reported that it was told that a (then) two-year-old draft of a governance
framework existed but had never been finalized.17 The CSIS “response” statement
would suggest that it is this old governance document – drafted before the
decision in X (Re) – that CSIS agreed to finalize and implement. It is difficult
to understand why SIRC would be publishing its own proposal for criteria to
evaluate bulk data collection in compliance with section 12 of the CSIS Act if
no proposal was required because the CSIS governance framework already
contained these criteria. The suggestion is that the governance framework does
not contain guidance and interpretation of the kind proposed by SIRC.
CSIS uses bulk datasets in multiple ways. They can be used to conduct indices
checks by taking information already connected to a potential threat – such as
an address, phone number or citizen identification number – and using it to
search for “hits” in the data. Datasets can also be used to enhance knowledge
of a target by searching the data for previously undetected trends, links or pat-
terns between and among data points. And data is used to support more focused
inquiries, such as “data mining” to identify leads. Finally, SIRC was told that the
data can be used to try to identify previously unknown individuals of interest
by linking together types of information which have mirrored threat behaviour.
Overall, the addition of more datasets was expected to enrich CSIS’s analytical
capacity and enhance its ability to provide support for CSIS investigations.18
This indicates that CSIS uses bulk datasets in all of the following ways: to
confirm identity, to learn more about targets, to detect networks and patterns,
and to profile.
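The first of these uses, the indices check, is the simplest to illustrate. The sketch below is purely hypothetical – the field names, records, and function are invented for illustration and do not describe any CSIS system: a selector already tied to a potential threat is run against a bulk dataset for hits.

```python
# Invented records standing in for a bulk dataset; no real data.
bulk_dataset = [
    {"name": "A. Example", "phone": "555-0101", "address": "1 Main St"},
    {"name": "B. Example", "phone": "555-0199", "address": "9 Elm St"},
]

def indices_check(dataset, field, selector):
    """Return every record whose `field` matches a known selector."""
    return [record for record in dataset if record.get(field) == selector]

hits = indices_check(bulk_dataset, "phone", "555-0199")
print([h["name"] for h in hits])   # ['B. Example']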
From X (Re) we learn:
[37] In the early 2000’s, the CSIS considered that the information it collected
through investigations was underutilised as it was not processed through mod-
ern analytical techniques ... The ODAC was designed to be “a centre for excel-
lence for the exploitation and analysis” of a number of databases.
...
[41] ... The present reasons should not give the impression that the Court is
well informed of the [redacted] program; only very limited evidence was pro-
vided. Given that the program was still called the ODAC at the time of the appli-
cation, I will use that term and not [redacted].
[42] The ODAC is a powerful program which processes metadata resulting in
a product imbued with a degree of insight otherwise impossible to glean from
simply looking at granular numbers. The ODAC processes and analyses [sic]
data such as (but not limited) to: [redacted]. The end product is intelligence
which reveals specific, intimate details on the life and environment of the per-
sons CSIS investigates. The program is capable of drawing links between vari-
ous sources and enormous amounts of data that no human being is capable of
[redacted].19
Later, in its discussion of the arguments put forward by counsel about the privacy
interests of the data at issue, specifically with respect to “information gleaned
from granular metadata and from the products of its aggregation and analysis,”
the court noted that “the products of the CSIS’s analytical methods are much
more elaborate than methods or types of information at issue in prior Supreme
Court of Canada cases.”20
This signals that we are in a grey zone about what level of constitutional
privacy protection the data – but, more specifically, the data processed in this
way – requires. Whatever else this statement might refer to, it almost certainly
includes national security profiling, which the 2016 SIRC report cited as one of
the uses of the bulk data holdings.
Despite the wide scope for expansion of information sharing under SCISA, SIRC
reports that there has not been a large increase in the volume of sharing with
CSIS. In its 2017 report, SIRC “noted that the volume of exchanges [between
CSIS and other federal institutions] under SCISA has been modest.”26 There were
two agencies – Global Affairs Canada (GAC) and the Canada Revenue Agency
(CRA) – that did most of the SCISA-authorized exchanges with CSIS, and the
2017 SIRC report gives an overview of the information sharing with each.
These brief, high-level overviews are subject to the same limitations as all of
SIRC’s reporting. Even within these limits,
SIRC manages to give a good indication of the formidable entanglement of legal
and policy thresholds at play for information sharing and how these require-
ments from different bodies and in different pieces of legislation are supposed
to mesh, but likely do not.
Walking through this with some of the information provided in the findings
with respect to GAC, we note that CSIS and GAC share information relating to
activities that undermine the security of Canada, as defined in SCISA. However,
CSIS is allowed to “collect” information only “to the extent that it is strictly
necessary,”27 and it is not at all clear that the threshold for information to exit
GAC (under SCISA) and the threshold for information to enter CSIS (under
the CSIS Act) are the same thresholds, and if not, how such incompatibility is
assessed and what happens as a result.
We learn from the 2017 SIRC report that CSIS and GAC signed an information-
sharing agreement in 2016, but that discussions between the agencies are
ongoing, in part due to disagreements on thresholds and triggers for disclosures
of information (CSIS arguing for more, not less, disclosure).28 The 2017 SIRC
report also notes that a small number of disclosures of consular information
were recorded as having been made under the Privacy Act,29 which is perhaps
unsurprising given that legal commentators have often noted that the interplay
between SCISA and the Privacy Act is extremely unclear.30 Alternatively or
additionally, it might signal that targeted information is being sought under
one scheme and bulk data under the other.
From the 2017 report’s findings about the CRA we learn that at least some of
the information that is being shared by CRA with CSIS previously required a
judicially authorized warrant for disclosure to CSIS, and that there is no memo-
randum of understanding in place with respect to information sharing of tax-
payer information. In terms of aligning the thresholds for disclosing information
with the thresholds for receiving information, the Income Tax Act was amended
to allow for the disclosure of information to bodies like CSIS if “there are rea-
sonable grounds to suspect that the information would be relevant to (i) an
investigation of whether the activity of any person may constitute threats to the
security of Canada as defined in section 2 of the Canadian Security Intelligence
Service Act.”31 This is a low threshold requiring only suspicion (not belief) of
information relevant (not necessarily evidence) to an investigation of whether
the activity of any person may constitute threats to the security of Canada (not
the activities of targeted persons, but whether any person’s activities may pos-
sibly justify targeting). There does not appear to be any bar to bulk data sharing
on this threshold.
rationales. The first rationale was that the data are relevant on the basis that the
NSA has analytical tools that are likely to generate investigative leads and that
these tools require bulk data for analysis. Because bulk data is necessary to
operate these analytical tools, the government argued that bulk data is relevant
for the purposes of the statute. The second rationale was that “relevance” should
have a very expansive interpretation drawing on analogous legal contexts, such
as the discovery process. The PCLOB soundly rejected both rationales.
With respect to the argument that relevance is founded in analytical tools
generating the necessity for bulk data, the PCLOB rejected the “elastic defin-
ition”38 that “supplies a license for nearly unlimited governmental acquisition
of other kinds of transactional information.”39
In the Board’s view, this interpretation of the statute is circular and deprives the
word “relevant” of any interpretative value. All records become relevant to an
investigation under this reasoning, because the government has developed an
investigative tool that functions by collecting all records to enable later search-
ing. The implication of this reasoning is that if the government develops an
effective means of search through everything in order to find something, then
everything becomes relevant to its investigations. The word “relevant” becomes
limited only by the government’s technological capacity to ingest information
and sift through it efficiently.40
The PCLOB also rejected the rationale that “relevance” has a particularized
meaning in legal contexts that is more expansive than the ordinary dictionary
definition of the term, arguing that while other legal processes demonstrate that
“relevance” can have legitimately expansive interpretations, no definition can
be so expansive as to effectively be no limit whatsoever.
Simply put, analogous precedent does not support anything like the principle
that necessity equals relevance, or that a body of records can be deemed relevant
when virtually all of them are known to be unrelated to the purpose for which
they are sought.
...
Relevance limitations are a shield that protects against overreaching, not a
sword that enables it.41
The disclosure of the NSA’s mass surveillance of telephone records and the
paradoxical interpretation of relevance that purported to authorize the program
generated controversy on several fronts. The massive sweep of the program
affected the privacy rights of nearly everyone in the United States. But further,
is “strictly necessary” to assist CSIS in the performance of its duties and func-
tions. NSIRA can review these activities.
Publicly Available Datasets are defined in the amendments to the CSIS Act
as datasets “publicly available at the time of collection.”48 CSIS can collect a
Publicly Available Dataset if it is satisfied that it is relevant to the performance
of its duties and functions and is evaluated by a designated employee. CSIS can
query and exploit Publicly Available Datasets and retain the results. NSIRA can
review these activities.
collect “publicly available” datasets (with no definition of that term) on the basis
of a bare “relevance” standard.
As to Canadian Datasets, the personal information they contain is expressly
acknowledged as not directly and immediately relating to activities that represent
a threat to the security of Canada. The test is simply that the results of querying
or exploiting this information could be relevant and that this assessment must
be reasonable.
In theory, the privacy impact of this extremely wide-open funnel could be
slightly moderated by the “evaluation” that requires that irrelevant personal
information be deleted if this can be done without affecting the integrity of
the dataset, but it is difficult to imagine how this would ever be effectively
used. One of the main purposes of the dataset is for “searching the data for
previously undetected trends, links or patterns between and among data
points”50 and providing “products” that “[draw] links between various sources
and enormous amounts of data that no human being is capable of.”51 How
could an evaluator assess whether a piece of personal information is irrelevant
to a big data analytics process in which it is impossible to know the “relevance”
of any particular type of data because the queries aren’t set, predictable, or
even decisions made by humans, in the case of analytics that use machine
learning? In practice, it is likely that “the integrity of the dataset” will be cited
in almost every case as the reason for keeping the dataset intact and not
removing personal information.
The judicial authorization for the retention of the Canadian Datasets sounds
like significant gatekeeping, but in fact it simply compounds the effect of all of
the very low standards that lead up to it. Personal information that does not
directly and immediately relate to activities that represent a threat to the security
of Canada is allowed to be collected if it “could be relevant”; this assessment
must be reasonable and the judge decides whether the dataset can be retained
on the standard that it is “likely to assist.”52
It is only at the point that the now fully approved dataset gets queried or
exploited by CSIS that we see the introduction of the language that guided
the entirety of SIRC’s test, the language and interpretation of “strict necessity.”
NSIRA can review those decisions, and make findings as to reasonableness
and necessity and report those findings, but NSIRA is not likely to be report-
ing non-compliance with the law. To be clear, it is obliged to report even
possible non-compliance with the law to the attorney general, but the law as
it applies to CSIS collection, use, and retention of bulk data is so broadly
permissive that it is unlikely that CSIS will be exceeding such capacious
boundaries with any frequency.
The Board also recommends against the enactment of legislation that would
merely codify the existing program or any other program that collected bulk
data on such a massive scale regarding individuals with no suspected ties to ter-
rorism or criminal activity. While new legislation could provide clear statutory
authorization for a program that currently lacks a sound statutory footing, any
new bulk collection program would still pose grave threats to privacy and civil
liberties.55
Notes
1 For further discussion of the security intelligence practices and agencies in Canada, see the
section “Situating Security Intelligence” in the Introduction to this volume, pages 6–13.
2 Simon Davies, A Crisis of Accountability: A Global Analysis of the Impact of the Snowden
Revelations (Privacy Surgeon, 2014), 24.
3 Canadian Charter of Rights and Freedoms, Part 1 of the Constitution Act, 1982, being
Schedule B to the Canada Act 1982 (UK), 1982, c 11, s 7; British Columbia Civil Liber-
ties Association, Notice of Civil Claim, Supreme Court of British Columbia, 2013, http://
bccla.org/wp-content/uploads/2013/10/2013-10-22-Notice-of-Civil-Claim.pdf.
4 Big Brother Watch and others v United Kingdom (Applications nos 58170/13, 62322/14
and 24960/15) (2018), ECHR 722; Rebecca Hill, “Bulk Surveillance Is Always Bad, Say
Human Rights Orgs Appealing against Top Euro Court,” The Register, 12 December
2018.
5 Canada, Bill C‑59, An Act respecting national security matters, 1st Sess, 42nd Parl, 2017
(first reading 20 June 2017) [Bill C‑59].
6 X (Re), 2016 FC 1105.
7 Canadian Security Intelligence Service Act, RSC 1985, c C‑23 [CSIS Act].
8 Canada, Security Intelligence Review Committee, Annual Report 2015–2016: Maintain-
ing Momentum (Ottawa: Public Works and Government Services Canada, 2016).
9 Privacy International v Secretary of State for Foreign and Commonwealth Affairs & others,
[2017] UKIPTrib IPT_15_110_CH (UK).
10 CSIS Act, s 12(1).
11 Canada, Security Intelligence Review Committee, Annual Report 2015–2016, 24.
12 Ibid., 24.
13 Ibid., 24–25.
14 Ibid., 25.
15 Ibid.
16 Ibid., 24.
17 Ibid.
18 Ibid., 23–24.
19 X (Re), paras 37, 41–42.
20 Ibid., para 79.
21 Anti‑terrorism Act, 2015, SC 2015, c 20.
22 Security of Canada Information Sharing Act, SC 2015, c 20, s 2 [SCISA]. SCISA was later
amended and renamed Security of Canada Information Disclosure Act [SCIDA].
23 Ibid., s 2.
24 Public Safety Canada, National Security Consultations: What We Learned Report
(Hill+Knowlton Strategies, 2017), 8.
25 Ibid., 4.
26 Canada, Security Intelligence Review Committee, Annual Report 2016–2017: Accelerat-
ing Accountability (Ottawa: Public Works and Government Services Canada, 2017), 22.
27 CSIS Act, s 12(1).
28 Canada, Security Intelligence Review Committee, Annual Report 2016–2017, 22–23.
29 Privacy Act, RSC 1985, c P‑21.
30 Canada, Security Intelligence Review Committee, Annual Report 2016–2017, 23.
31 Income Tax Act, RSC 1985, c 1 (5th Supp), s 9(b)(i).
32 Canada, Security Intelligence Review Committee, Annual Report 2016–2017, 24.
33 Ibid., 24.
34 Ibid., 24–25.
35 Canada, Security Intelligence Review Committee, Annual Report 2015–2016, 25.
Canada is remaking its national security law through Bill C-59.1 This legislative
project constitutes the country’s largest national security law reform since 1984
and the creation of the Canadian Security Intelligence Service (CSIS). And with
its 150 pages of complicated legislative drafting, C-59 follows the pattern in other
democracies of codifying once murky intelligence practices into statute.
On the cusp of being enacted in Parliament at the time of this writing, the
bill responds to quandaries common to democracies in the early part of the
twenty-first century. Among these questions: How broad a remit should intel-
ligence services have to build pools of data in which to fish for threats? And
how best can a liberal democracy structure its oversight and review institutions
to guard against improper conduct by security and intelligence services in this
new data-rich environment?
This chapter examines how Bill C-59 proposes to reshape the activities of both
CSIS and the Communications Security Establishment (CSE) in fashions
responding to these dilemmas. Specifically, it highlights C-59’s proposed changes
to CSIS’s capacity to collect bulk data as part of its security intelligence mandate,
and also the new oversight system proposed for CSE’s foreign intelligence and
cybersecurity regimes. The chapter examines the objectives motivating both
sets of changes and suggests that in its architecture C-59 tries to mesh together
the challenges of intelligence in a technologically sophisticated, information-
rich environment with privacy protections derived from a simpler age but
updated to meet new demands.
case of the police) an offence or (in the case of CSIS) a threat to the security of
Canada. Warrants also oblige a large measure of specificity, targeting individuals
(or perhaps classes of individuals) who themselves are linked to these offences
or threats.
Oversight has, therefore, been front-ended, in advance of the intercept or
collection. And authorized information collection has then been relatively
focused. To date, the judicialization model has not accommodated
“bulk powers,” an expression borrowed from the United Kingdom. A bulk power
is one that allows intelligence agencies access to a large quantity of data, most
of which is not associated with existing targets of investigation. In other words,
it is the mass access to data from a population not itself suspected of threat-
related activity. The commonplace example, since Edward Snowden’s revelations,
is Internet or telephony metadata for entire populations of communications
users. But bulk powers can also involve content, and not just the metadata sur-
rounding that content.
Bulk powers are controversial – they are the heart of the post-Snowden pre-
occupations. They inevitably raise new questions about privacy and, in the
Canadian context, Charter rights, not least because bulk powers are irreconcil-
able with the requirements of classic warrants. There is no specificity. By def-
inition, bulk powers are not targeted; they are indiscriminate.
However, whether bulk powers amount to “dragnet” or “mass” surveillance
is a closely contested issue. Collection does not – and likely cannot, given
resource constraints – mean real-time, persistent observation. It does mean,
however, a sea of data that may then be queried and exploited. The distinction
between querying of collected and archived data and a permanent, panoptic
form of surveillance may be a distinction without a difference for members of
the public and privacy advocates, but it is one that David Anderson, former UK
Independent Reviewer of Terrorism Legislation, viewed as compelling in his
2016 report on bulk powers:
It should be plain that the collection and retention of data in bulk does not
equate to so-called “mass surveillance.” Any legal system worth the name will
incorporate limitations and safeguards designed precisely to ensure that access
to stores of sensitive data (whether held by the Government or by communica-
tions service providers [CSPs]) is not given on an indiscriminate or unjustified
basis.15
Put another way, surveillance means “watching,” and not “potential watching.”
And “potential” is controlled by safeguards that mean collection does not morph
seamlessly into watching. This is the philosophy that animated the United
170 Craig Forcese
Kingdom’s 2016 Investigatory Powers Act (IPA),16 and now is reflected also in
Bill C-59. Under the IPA, the world of bulk powers can be divided into bulk
interception, bulk equipment interference, bulk acquisition, and bulk personal
datasets. Canada’s Bill C-59 addresses issues relating to bulk interception and
bulk personal datasets. The bill does two things of note: for both CSE and CSIS,
it superimposes new quasi-judicial controls on collection of certain sorts of
information. For CSIS, it also creates judicial controls on retention, exploitation,
and querying of at least some sorts of information.
within it are not, and are unlikely to become, of interest to the intelligence ser-
vices in the exercise of their statutory functions. Typically these datasets are very
large, and of a size which means they cannot be processed manually.30
The C-59 approach to bulk personal datasets is a response, in part, to the Federal
Court’s 2016 decision on what was known as “ODAC.”31 But it also responds to
a broader concern about the ambit of CSIS’s threat investigation mandate.32 That
mandate is anchored in section 12 of the CSIS Act.
Under its section 12 mandate, CSIS collects, to the extent it is strictly neces-
sary, and analyzes and retains information and intelligence on activities it has
reasonable grounds to suspect constitute threats to the security of Canada. This
passage has several “magic words”: “to the extent that it is strictly necessary”;
“reasonable grounds to suspect”; and “threats to the security of Canada.”
“Threats to the security of Canada” is the only passage defined in the CSIS
Act (in section 2). “Reasonable grounds to suspect” has a generally well-
understood meaning: “suspects on reasonable grounds” is a suspicion based
on objectively articulable grounds that may be lower in quantity or content
than the requirement of reasonable belief, but must be more than a subjective
hunch.33 It amounts to a possibility the threat exists, based on cogent evidence
(and not simply a hunch).
Under section 12, CSIS commences an investigation on this standard. But
where the means of that collection are sufficiently intrusive to trigger section 8
of the Charter or the Part VI Criminal Code prohibition against unauthorized
intercept of private communications (for instance, a wiretap), it must get a
Federal Court warrant. A judge will issue a warrant only if persuaded that CSIS
has reasonable grounds to “believe” that it is required to investigate threats to
the security of Canada.
“Reasonable grounds to believe” is a higher standard than the reasonable
grounds to suspect standard that must be met for CSIS to begin an information
collection investigation under section 12. Sometimes called “reasonable and
probable grounds” in the constitutional case law, reasonable grounds to believe
is less demanding than the criminal trial standard of “beyond a reasonable
doubt.” Instead, it is defined as a “credibly-based probability” or “reasonable
probability.”34
CSIS obtains warrants in a closed-court (i.e., secret) process in which only
the government side is represented. The warrants can, and often do, impose
conditions on CSIS investigations. There are templates for standard warrant
applications. These templates are occasionally updated, a process that requires
CSIS to apply to the Federal Court. The 2016 ODAC case came about through a belated updating process.
Today’s threats to Canada’s national security are fast, complex and dynamic, and
threat actors are highly connected and mobile. The ease of movement across
international borders and spread of social media networks and modern com-
munications technology can be used by individuals and groups seeking to harm
Canada. This creates some very real challenges for CSIS.37
The dilemma lies in reconciling oceans full of data generated by innocents with
the intelligence function of clearing the fog of uncertainty and revealing not
just the known threats but also the unknown threats.
Conclusion
Privacy is among the least absolute rights in international and Canadian human
rights law. It has always been about equilibrium, from its inception as a common
law concept in the eighteenth century. Balancing has depended, in the Canadian
law tradition, on advance oversight by judicial officers. In relation to CSE,
changes in technology have placed that agency out of alignment with this
expectation. Bill C-59 tries to restore a more traditional pattern, albeit in cir-
cumstances where a classic judicial warrant model would prove unworkable.
Meanwhile, CSIS has laboured with an act designed for an analogue period.
It has become an intelligence service largely incapable of fishing in an electronic
sea. The challenge is to permit reasonable fishing but not dragnetting. Bill C-59
attempts to strike this balance by superimposing a judge, not to police CSIS’s
dataset fishing net but rather to determine what information captured within
the net may be retained and analyzed. We are right to be wary of such a system,
since it depends on close adherence to a complicated set of checks and balances.
On the other hand, those checks and balances cannot become so burdensome
that intelligence services are left to obtain, essentially, a warrant to obtain a
warrant.
Put another way, C-59 seeks balance. Not everyone will agree, but in my view
(and subject to my doubts about publicly available datasets), C-59 succeeds
reasonably well in reconciling the “nightwatchman” role of the state’s security
services with the individual’s right to be left alone.
Notes
1 Canada, Bill C‑59, An Act respecting national security matters, 1st Sess, 42nd Parl, 2017,
http://www.parl.ca/DocumentViewer/en/42-1/bill/C-59/first-reading. This chapter deals
with C-59 as it existed at the time of writing: after first reading in the House of Commons.
2 Canadian Security Intelligence Service (CSIS), “Remarks by Jim Judd, Director of CSIS,
at the Global Futures Forum Conference in Vancouver,” 15 April 2008, Internet Archive,
https://web.archive.org/web/20081012174022/http://www.csis-scrs.gc.ca:80/nwsrm/
spchs/spch15042008-eng.asp.
3 RSC 1952, c 96.
4 Under the act, the minister of justice could require a communications agency to pro-
duce any communication “that may be prejudicial to or may be used for purposes that
are prejudicial to the security or defence of Canada.” David C. McDonald, “Electronic
Surveillance – Security Service and C.I.B.,” in Reports of the Commission of Inquiry Con-
cerning Certain Activities of the Royal Canadian Mounted Police (Ottawa: Privy Council
Office, 1981), vol 2–1, 158, http://publications.gc.ca/collections/collection_2014/bcp
-pco/CP32-37-1981-2-1-2-eng.pdf.
5 The document is archived as Canada, “Privy Council Wiretap Order (St-Laurent Gov-
ernment)” (unpublished document, 1951), http://secretlaw.omeka.net/items/show/69,
and was obtained by Dennis Molinaro under the Access to Information Act.
6 McDonald, Reports of the Commission of Inquiry, vol 2–1, 158.
7 RSC, 1985, c C-46.
8 Ibid., s 183.
9 RSC, 1985, c C-23, s 21.
10 [1984] 2 SCR 145.
11 Justice Canada, “Section 8 – Search and Seizure,” https://www.justice.gc.ca/eng/csj-sjc/
rfc-dlc/ccrf-ccdl/check/art8.html.
12 Atwal v Canada, [1988] 1 FC 107 (FCA).
13 2017 FC 1047.
14 Ibid., para 171.
15 David Anderson, Report of the Bulk Powers Review (London: Williams Lea Group,
2016), para 1.9, https://terrorismlegislationreviewer.independent.gov.uk/wp-content/
uploads/2016/08/Bulk-Powers-Review-final-report.pdf.
16 2016, c 25.
17 National Defence Act, RSC, 1985, c N-5, ss 274.61 and 273.64(1) [NDA]. These mandates
are preserved in Bill C-59, Part 3, Communications Security Establishment Act (CSE Act),
ss 2, 16, 17, and 18 [CSE Act].
18 NDA, s 273.64(2); CSE Act, ss 17, 18, 23, and 25.
Bill C-59 and the Judicialization of Intelligence Collection 179
police have an opportunity to make links and connections that used to be more
labor intensive.”7
New technology such as big data analytics promises opportunities for police
services to work more efficiently and effectively by identifying and predicting
crime patterns in large datasets with the hope that such practices will enable
“the opportunity to enter the decision cycle of [their] ... adversaries’ [sic] in
order to prevent and disrupt crime.”8 For example, the Vancouver Police Depart-
ment has implemented a “city-wide ‘Predictive Policing’ system that uses
machine learning to prevent break-ins by predicting where they will occur
before they happen – the first of its kind in Canada.”9 The system is said to have
80 percent accuracy in directing officers to locations of break-ins.10 While
Vancouver is the first service in Canada to implement predictive policing soft-
ware, other services are implementing technologies that enable them to collect
and store large sets of data.11 For example, the Calgary Police Service, the Royal Canadian Mounted Police, the Ontario Provincial Police, and the Winnipeg Police Service have all implemented Mobile Device Identifier technology (which
mimics cellular towers) that enables them to intercept cellphone data.12 Such
examples demonstrate how big data and data analytics are being integrated in
Canadian police services.
At present, the use of big data in policing has been largely limited to the col-
lection and storage of DNA information, mass surveillance, and predictive
policing.13 While there is a lot of speculation about the possibilities (both good
and bad) of these technologies, there is little empirical research available on
how big data is used and experienced in policing.14 Ridgeway identifies specific
applications of big data in American policing, such as pushing real-time infor-
mation to officers, tracking police locations, performance measurement, and
predictive policing, but like others, notes that the “evidence so far is mixed on
whether police can effectively use big data.”15 To date, much of the available
literature on big data and policing is focused on predictive policing and origin-
ates in the United States, United Kingdom, and Australia – where police intel-
ligence frameworks, policies, and practices differ.16 For example, the United
Kingdom has a National Intelligence Model (NIM) that provides police services
with a standardized approach for gathering and analyzing intelligence for
informing strategic and tactical decision making.17 Canada, however, does not
have a standardized model or approach to intelligence work, so significantly
less is known about the use of technology and data science in the Canadian
context.
This chapter presents empirical data on the challenges facing Canadian police
in making effective use of technologies and data science, including big data
technology, for intelligence practices. While Canadian police services are
182 Carrie B. Sanders and Janet Chan
adopting data analytic practices, the extent of their integration and use varies
significantly across the country. Some large services in Canada, for example, are
working actively to implement an organizational intelligence framework that
facilitates the adoption of new technologies and standardized analytic practices
throughout their services, while others, particularly smaller municipal services,
are in the early phases of technological adoption and appropriation that facilitate
data science practices. Drawing directly from interviews with thirteen Canadian
police services, we identify technological, organizational, and cultural challenges
for the integration and effective uptake of big data technologies. It is important
to note that, similar to the variance in the integration and adoption of data
science practices across Canadian services, the extent to which services experi-
ence these challenges will also vary.
Bennett Moses found that most personnel did not have knowledge of, or first-
hand experience with, big data. When asking their participants about the value
and purpose of big data, they found that most law enforcement and security
personnel focused on the value of these technologies for investigative or crime
disruption purposes rather than for predictive purposes or for understanding
broader crime trends.27 Unlike Chan and Bennett Moses, Sarah Brayne’s ethno-
graphic study on the use of big data surveillance by the Los Angeles Police
Department (LAPD) found that the LAPD increasingly used big data analytics
for predictive rather than reactive or explanatory purposes.28 Further, she found
that big data analytics amplify prior surveillance practices, while also facilitating
fundamental transformations in surveillance activities.29 For example, she found
that the integration of big data technologies, such as “alert-based technologies”
instead of the old query-based technologies, made it possible for the LAPD to
surveil an unprecedentedly large number of people – people who would not
normally be found in traditional police records systems.30
Whereas there are only a few empirical studies available on the in situ use of
big data technologies, there are studies that discuss their potential uses. Alex-
ander Babuta, for example, wrote a report exploring the potential uses of big
data analytics in British policing. His report identified four ways in which big
data analytics presently are, or could be, used: (1) predictive crime mapping;
(2) predictive analytics for identifying individuals at risk of reoffending or being
victimized; (3) advanced analytics “to harness the full potential of data collected
through visual surveillance”; and (4) the use of “open-source data, such as that
collected from social media, to gain a richer understanding of specific crime
problems, which would ultimately inform the development of preventative
policing strategies.”31 Yet, like other scholars in the field, he found that the
empirical evidence on the use of big data analytics in these four ways is uneven.32
There have also been studies that identify a number of “fundamental limita-
tions” to the implementation and effective use of big data technologies.33 For
example, Babuta notes that the “lack of coordinated development of technology
across UK policing,” fragmentation of databases and software, lack of organiza-
tional access to advanced analytic tools, and legal constraints governing data
usage impede the successful use of big data.34 In synthesizing the available
research on the uptake and impact of police technology, Chan and Bennett
Moses identify the importance of factors apart from technological capacity – such as police leadership, management of technology, organizational politics, and cultural resistance – for understanding technological adoption and
use.35 The research of Carrie Sanders, Crystal Weston, and Nicole Schott on the
integration of intelligence-led policing practices in Canada demonstrates how
cultural issues and management of innovation issues are intertwined. For
example, in the six Canadian police services studied, they found that the use of
analytic technologies to support intelligence-led policing was more rhetorical
than real. In particular, the “occupational culture of information hoarding ...
has shaped the use and functioning of police innovations.”36 In line with previ-
ous research on the “poorly understood and appreciated” role of crime and
intelligence analysts,37 the lack of knowledge and training about crime analysis
on the part of police managers and officers has left analysts to engage in crime
reporting instead of predictive analytics so that new technologies are used to
support traditional policing practices.38
Methods
The empirical data driving our argument come from sixty-four semi-structured
interviews with personnel from thirteen Canadian police services. Our sample
includes forty-one crime and intelligence analysts, three senior/lead analysts,
six civilian managers of analytic units, three support personnel (including one
junior support analyst and two policy developers), and eleven sworn officers
who work with, or supervise, crime and intelligence analysts. Most of the inter-
views were conducted face to face, while a small portion (10 percent) were
conducted over the telephone. Interviews were supplemented with participation
at the Massachusetts Association of Crime Analysts training conference (2017);
Canadian chapter of the International Association of Crime Analysts training
conference (2017); International Association of Crime Analysts annual training
conference (2017); South Western Ontario Crime Analysis Network (2015); and
two meetings of the Canadian Association of Chiefs of Police (2016 and 2017).
Interviews and field observations were conducted by the lead author and a
doctoral research assistant. All data were stored and analyzed in NVivo 10, a
qualitative data analysis software program, using a constructivist grounded
theory approach.39 The data were thematically coded by the doctoral research
assistant, with continual collaboration and consultation with the lead author.
The lead author then used writing as an analytical device to make sense of, and
theorize, the data. Through analytic memoing,40 we saw many similarities
between the state of police technology and data science in Canadian policing
and that found in the United States, United Kingdom, and Australia. Of interest
for this chapter are the ways in which technological, organizational, and cultural
contexts create challenges for Canadian police in making effective use of big
data technologies. Such challenges include fragmented technological platforms
and information silos; resources and practical barriers; emphasis on tactical
and operational intelligence (i.e., creating workups on offenders or cases) over
strategic intelligence (identifying future offenders or emerging crime trends);
and the uneven professionalization of analytics and user capacities.
The Challenges Facing Canadian Police in Making Use of Big Data Analytics 185
Back in 2007, the Solicitor General Office in [a Canadian province] went and
talked to all of the police agencies and said we would like to get you on the same
software with the same tools ... They selected [a private IT company] to deliver
the computer aided dispatch and records management systems, business intel-
ligence and [intelligence analysis] for doing analytics. We started working on
that project with that vendor in 2010 – by 2012 it lost momentum. Policing is a
world where people like their privacy and they like to do their own thing – like
even within our own walls we have silos – so trying to get twelve agencies to
work together and agree on something is impossible. (I27)44
means that analysts and police personnel have to request access to the social
media computer terminal – requiring them to physically move locations in the
service – or request the assistance of a different analyst. Neither of these was
perceived as an acceptable or ideal option.
Second, legislative barriers to accessing information and scraping data struc-
ture the types of open-source data and analytics that can be utilized for law
enforcement purposes. In services where open-source data were available, there
was often a lack of familiarity with, and awareness of, organizational policies
or legal frameworks for analyzing them. As one analyst explained:
We’re very behind in our policy ... It’s really due to a lack of understanding
from upper management. They are slow to understand that most of our crime
is moving online, or a lot of our evidence is moving to digital. And they don’t
understand the need for a policy ... We’re not CSIS and we don’t have those
resources, but I’m sure that within there, there is what law enforcement can
and cannot do (I6).
This analyst suggested that the lack of organizational policies concerning open-source analytics is the result of a broader problem around organizational
knowledge and understanding of open-source data and the legalities of working
with such data for intelligence purposes.
Lack of organizational training on open-source analytics was also identified
as a challenge to the effective uptake of big data analytics. An analyst explained:
We had this program [redacted] that was social media related ... Unfortunately,
really, I didn’t have the training to be able to fully know what I was doing with
that program. So, as an example, we had a major incident that somebody said
“[CITY] Shopping Centre had a bomb threat.” There had been a number of
tweets, as an example, coming from this area. Can you pinpoint exactly where
these tweets are coming from? I didn’t have the training to do that – which really
to work that – you should really have it (I1).
This analyst explains how her service did provide her with a technological
platform that made possible open-source big data analytics, but the service did
not provide the training required to effectively integrate and use the software.
Our findings mirror those of Jyoti Belur and Shane Johnson, who identified
knowledge and process gaps that inhibit the integration of analysis. Specifically,
they found that “knowledge gaps existed not only on the part of analysts as
individuals and collectively, but also in terms of what policy-makers and senior
leaders knew about analysis and analytical capability of the organization.”48
I don’t feel that I have moved to the analysis of anything – because we start
out and say there are targets and all I do is run history, like I check all of the
databases to identify their address, associates, vehicles and so on and so forth,
but there are times that at least it is not required of me to go further and try to
see the bigger picture, of the group because of the mandate for my team ... we
don’t go up on wire, we do basic investigative files ... I think one area that we are
lacking is looking at the final aspects of it ... there is a bigger network and we
don’t seem to focus on that as much – we are more just reactive – and we do very
low level investigations. (I20)
Using analytics for “targeted” and “investigative purposes,” this analyst argued,
led to “low level” investigations that do not enable police services to identify
and interrupt larger crime trends.
When we inquired about the use of predictive analytics and broader strategic
intelligence analysis, all analysts noted the organizational emphasis on tactical
and operational intelligence over strategic intelligence:
Even [in] our bigger investigations there is less of an appetite [for strategic anal-
ysis] because of the resources that are involved. So, as an analyst, we could pool
all of that information and start seeing the connections and identifying what
works – pull financial[s] of these individuals and see where the money is com-
ing and going ... We don’t have the resources because we don’t have forensic
accountants that are capable. We are requesting regular members to look into
these documents ... so there is no appetite for that ... but they [police service]
don’t consider the impact that it has on the economy ... you can show, go to the
media and say “look at all these drugs and weapons that we have seized and all
this cash”[;] these are tangible things that you can show, but bank documents
and wire transfers, no one cares for the paper. (I20)
I haven’t had to do anything that’s been extensive, yet. But, in the sense of [inter-
nal records management system] being a large database, you have a lot of infor-
mation in there. So, usually I am kind of going through that, yeah. (I3)
We are tactical analysts, and they are strategic analysts, and of course we feed off
of each other ... But as for doing all the research, anything with big data, is done
by [strategic analysts]. We don’t have anything to do with it. (I12)
Thus, there are mixed understandings and definitions of what big data is. Over
half of the analysts interviewed noted that their internal police records systems
constituted big data, while the other half did not believe they worked with big
data. Also of interest in the quotes above is the delineation between the value of big data for strategic versus tactical intelligence analysis. Both participants I1 and I12 believed that they did not access or conduct analysis with big data because they were focused on tactical intelligence. The mixed definitions of what big data is, combined with mixed understandings of how to use it, speak to broader user capacity and capability issues. We believe the current situation is
that a large number of analysts do not have the skills to match the functionality
of the principal software packages supported by, or available to, police services.
Conclusion
The growing complexity of the post-9/11 security environment has blurred the
boundaries of policing and national security intelligence. The changing security
environment, combined with the digital revolution of the 1990s and a permis-
sive legislative environment, has also shaped the ways in which police collect,
store, analyze, and interpret intelligence for the purpose of detecting and dis-
rupting crime.55 There are growing claims in the United States and the United
Kingdom, and more recently in Canada, about a movement towards predictive
policing practices through the integration of big data technologies. The findings
presented in this chapter problematize many of the claims surrounding big data
and predictive policing by identifying how big data technologies are socially
shaped by technological, organizational, and cultural contexts, which impede
their effective integration and uptake by Canadian law enforcement.
The findings raise concerns about police services’ knowledge and capacity to
fully understand and utilize data analytics and big data technologies in police
intelligence work. In fact, the empirical data informing this chapter demonstrate
that few analysts have the technical skills and data literacy to use the technical
tools to their full potential.56 Yet, as Weston, Bennett Moses, and Sanders argue,
“the increasing complexity of data science methods associated with pattern-
recognition and prediction, particularly in the context of large volumes of
diverse, evolving data, requires a relatively high level of technical sophistica-
tion.”57 Andrew Ferguson contends that while big data technology provides
innovative potential for police services, choosing a big data system is a political
decision rather than a policing decision. He argues that police administrators
must be able to (1) identify the risks they are trying to address; (2) defend the
inputs into the system (i.e., data accuracy, methodological soundness); (3) defend
the outputs of the system (i.e., how they will impact policing practice and com-
munity relationships); (4) test the technology in order to offer accountability
and some measure of transparency; and (5) answer whether their use of the
technology is respectful of the autonomy of the people it will impact.58 Our
findings raise serious concerns about the risks associated with low-level data
literacy skills and a police service’s understanding of the capabilities and limita-
tions of big data and predictive policing.
Lastly, throughout this chapter, we have demonstrated different ways in which
police adoption and use of big data and data analytics can be plagued with
“black data” – which results from “data bias, data error, and the incompleteness
of data systems.”59 Further complicating the issues is that algorithmic predictive
technologies can also have technical and subjective bias built in that can, if their
users are not cautious, lead to discriminatory practices.60 While the outputs of predictive technologies can be easily obtained, proprietary algorithms mean that those outputs lack transparency and accountability. Thus, while big data technologies hold great promise for making police services more efficient, effective, and accountable, without strong data literacy and a sophisticated understanding of the politics of effective implementation, their success is questionable and, more importantly, the socio-political implications of their use are difficult to determine.
Notes
1 Richard Ericson and Kevin Haggerty, Policing the Risk Society (Toronto: University of
Toronto Press, 1997).
2 Lucia Zedner, “Pre-Crime and Post Criminology?” Theoretical Criminology 11 (2007):
264.
3 Patrick F. Walsh, Intelligence and Intelligence Analysis (Oxford: Routledge, 2011).
4 Keeley Townsend, John Sullivan, Thomas Monahan, and John Donnelly, “Intelligence-
Led Mitigation,” Journal of Homeland Security and Emergency Management 7 (2010):
1–17.
5 Walsh, Intelligence and Intelligence Analysis, 130.
6 Jerry Ratcliffe, Intelligence-Led Policing (Cullompton: Willan Publishing, 2008), 81.
7 Greg Ridgeway, “Policing in the Era of Big Data,” Annual Review of Criminology 1 (2017):
409.
8 Charlie Beck and Colleen McCue, “Predictive Policing: What Can We Learn from Wal-
Mart and Amazon about Fighting Crime in a Recession?” Police Chief 76 (2009): 19.
9 Matt Meuse, “Vancouver Police Now Using Machine Learning to Prevent Property
Crime: ‘Predictive Policing’ Technology Uses Past Trends to Determine Where Break-
ins Are Likely to Occur,” CBC News, 22 July 2017.
10 Ibid.
11 Ibid.
12 Meghan Grant, “Calgary Police Cellphone Surveillance Device Must Remain Top Secret,
Judge Rules: Alleged Gangsters Barakat Amer and Tarek El-Rafie Were Targets of the
Cellphone Interception Tool,” CBC News, 30 October 2017.
13 Alexander Babuta, “An Assessment of Law Enforcement Requirements, Expectations
and Priorities” (RUSI Occasional Paper, ISSN 2397-0286, 2017).
14 Sarah Brayne, “Big Data Surveillance: The Case of Policing,” American Sociological Review 82, 3 (2017), doi: 10.1177/0003122417725865.
15 Ridgeway, “Policing in the Era of Big Data,” 408.
16 Babuta, “An Assessment of Law Enforcement”; Ridgeway, “Policing in the Era of Big
Data”; Walt Perry, Brian McInnis, Carter Price, Susan Smith, and John Hollywood, Pre-
dictive Policing: The Role of Crime Forecasting in Law Enforcement Operations (Santa
Monica, CA: RAND Corporation, 2013).
17 Jyoti Belur and Shane Johnson, “Is Crime Analysis at the Heart of Policing Practice? A
Case Study,” Policing and Society (2016): 2, doi: 10.1080/10439463.2016.1262364.
18 Brayne, “Big Data Surveillance.”
19 Janet Chan and Lyria Bennett Moses, “Making Sense of Big Data for Security,” British
Journal of Criminology 57 (2017): 299–319; Adam Crawford, "Networked Governance
and the Post-Regulatory State? Steering, Rowing and Anchoring the Provision of Policing
and Security," Theoretical Criminology 10, 6 (2006): 449–79.
The Challenges Facing Canadian Police in Making Use of Big Data Analytics 193
20 Ridgeway, “Policing in the Era of Big Data,” 408.
21 Lisa-Jo Van den Scott, Carrie Sanders, and Andrew Puddephatt, “Reconceptualizing
Users through Rich Ethnographic Accounts,” in Handbook of Science and Technology
Studies, 4th ed., edited by Clark Miller, Ulrike Felt, Laurel Smith-Doerr, and Rayvon
Fouché (Cambridge, MA: MIT Press, 2017).
22 Brayne, “Big Data Surveillance,” 6.
23 Wiebe E. Bijker, “How Is Technology Made? – That Is the Question!” Cambridge Journal
of Economics 34 (2010): 63–76.
24 danah boyd and Kate Crawford, “Critical Questions for Big Data: Provocations for a
Cultural, Technological, and Scholarly Phenomenon,” Information, Communication and
Society 15 (2012): 663.
25 Lyria Bennett Moses and Janet Chan, “Using Big Data for Legal and Law Enforcement
Decisions: Testing the New Tools,” University of New South Wales Law Journal 37 (2014):
652.
26 Chan and Bennett Moses, “Making Sense of Big Data.”
27 Ibid.
28 Brayne, “Big Data Surveillance.”
29 Ibid.
30 Ibid.
31 Babuta, “An Assessment of Law Enforcement.”
32 Ibid.
33 Ibid.; Janet Chan and Lyria Bennett Moses, "Can 'Big Data' Analytics Predict Policing
Practice?” in Security and Risk Technologies in Criminal Justice, edited by S. Hannem
et al. (Toronto: Canadian Scholars’ Press, 2019); Ridgeway, “Policing in the Era of Big
Data.”
34 Babuta, “An Assessment of Law Enforcement.”
35 Chan and Bennett Moses, “Can ‘Big Data’ Analytics Predict Policing Practice?”
36 Carrie B. Sanders, Crystal Weston, and Nicole Schott, “Police Innovations, ‘Secret Squir-
rels’ and Accountability: Empirically Studying Intelligence-Led Policing in Canada,”
British Journal of Criminology 55 (2015): 718.
37 Colin Atkinson, “Patriarchy, Gender, Infantilisation: A Cultural Account of Police Intel-
ligence Work in Scotland,” Australian and New Zealand Journal of Criminology 50, 2
(2017): 234–51; Nina Cope, “Intelligence Led Policing or Policing Led Intelligence? Inte-
grating Volume Crime Analysis into Policing,” British Journal of Criminology 44, 2 (2004):
188–203; Belur and Johnson, “Is Crime Analysis at the Heart of Policing Practice?”
38 Walsh, Intelligence and Intelligence Analysis; Patrick F. Walsh, “Building Better Intelligence
Frameworks through Effective Governance,” International Journal of Intelligence and
Counter Intelligence 28, 1 (2015): 123–42; Anthony A. Braga and David L. Weisburd, Police
Innovation and Crime Prevention: Lessons Learned from Police Research over the Past 20
Years (Washington, DC: National Institute of Justice, 2006); Christopher S. Koper, Cynthia
Lum, and James Willis, “Optimizing the Use of Technology in Policing: Results and Impli-
cations from a Multi-Site Study of the Social, Organizational, and Behavioural Aspects of
Implementing Police Technologies,” Policing 8, 2 (2014): 212–21; Carrie B. Sanders and
Camie Condon, “Crime Analysis and Cognitive Effects: The Practice of Policing through
Flows of Data,” Global Crime 18, 3 (2017): 237–55.
39 Kathy Charmaz, Constructing Grounded Theory: A Practical Guide through Qualitative
Analysis (London: Sage Publications, 2006).
40 Ibid.
194 Carrie B. Sanders and Janet Chan
These three examples have been chosen because they address three different
issues pertaining to mass surveillance: the actions of a specific surveillance
agency, laws granting governments easier access to personal information, and
global systems of mass surveillance. They were also chosen for their differing
approaches: use of the legal arena, mass online mobilization, and an international
coalition of civil society around a text-based campaign.
As we interviewed only one person per campaign, and as impact is sometimes
difficult to determine and attribute accurately, our analysis has clear limitations.
Thus, this chapter is not meant to be read as an authoritative guide to the most
and least effective campaigning methods. As activists, we often do not have the
time to reflect on past experiences, especially those of others. This chapter therefore
aims to present information about past campaigns that we hope will be useful
to today's anti-surveillance campaigners in determining their preferred approach
and actions.
Confronting Big Data 199
It has been nineteen years since Canada’s first Anti-terrorism Act (Bill C-36),10
and national security and surveillance legislation has only continued to multiply:
by our count, no fewer than twelve bills have been passed to bring in new
national security and surveillance powers over that time, culminating with Bill
C-51, Canada’s second Anti-terrorism Act, in 2015.11 In June 2019, Parliament
passed a new piece of national security legislation, Bill C-59 (the National Secur-
ity Act, 2017), which has introduced sweeping changes, including to the powers
and regulations surrounding mass surveillance in Canada.12 Combined with what
we have learned through the revelations of Edward Snowden and other
whistle-blowers and journalists, these changes make this an opportune time to
examine popular responses to surveillance and their impact.
surveillance, particularly on behalf of the United States and the United Kingdom,
but also Canada.21 According to DiPuma, the BCCLA hoped to take advantage
of “an historical moment when people were paying attention to the issue of
mass surveillance.” Further revelations from the Snowden files would eventually
implicate CSE in spying on Canadians as well as engaging in global mass sur-
veillance operations.22
As the suit has yet to go to trial, there is no way to predict the outcome. The
BCCLA is hoping for a favourable judgment, but DiPuma says that it is open
to other outcomes too, including unilateral steps by the government to ensure
that CSE’s activities do not violate Canadians’ rights. “Litigation can add to
the overall policy discussion in a way that affects meaningful change in the
law,” says DiPuma, adding that if the government were to introduce laws that
appropriately changed CSE’s legal framework, the BCCLA would reconsider
its suit.
Impact
It is difficult to attest to the impact of a lawsuit that is still before the courts.
However, DiPuma points to what she sees as some impacts already:
• The suit has contributed to the public policy debate around CSE and govern-
ment surveillance.
• The case has brought public awareness to a secretive regime.
• The BCCLA has learned new details about the operations of CSE, although these
cannot yet be shared publicly.
These impacts are difficult to measure, but there are a few indications we can
look to.
Public Awareness
Through its outreach and public discussions, the BCCLA has found that people
were “shocked” and agreed that change was needed when the details of the case
and the association’s concerns with CSE’s surveillance practices were explained
to them.
In her experience, DiPuma said, there are mixed reactions among the public
to the question of whether or not we need to be concerned about protecting
privacy rights. In this case, however, the issues "resonated deeply." The secrecy
surrounding CSE, DiPuma said, makes it particularly important here to get people
engaged with and informed about the issue.
At the time the lawsuit was filed, the case garnered considerable media cover-
age across the country, including articles in the Globe and Mail, CBC News, and
the Vancouver Sun.23 Each time the case has come up in court, including the
filing of the class action suit and the arguments over disclosure in the summer
of 2016, there was a resurgence in media coverage of CSE and its surveillance
practices.24
202 Tim McSorley and Anne Dagenais Guertin
Although it is impossible to measure the exact impact of this lawsuit on
public awareness, it is safe to say that such mainstream coverage at each stage
of the lawsuit would have educated more than a few people. However, as CSE
remains little known or understood by the Canadian public, it is also safe to
say that much more public education and media coverage is necessary.
Disclosure
Another impact of such a lawsuit is disclosure of information. Information that
to date has remained secret, unpublished, or inaccessible through other means
may come to light through the disclosure process, once it is entered into evidence
and becomes public. The disclosure of these documents helps to inform advocates
and the public at large of new issues and concerns, and can lead to further
action.
The BCCLA’s suit has gone through the disclosure stage, including argu-
ments over whether certain documents deemed sensitive to national security
should be disclosed to the BCCLA and made available to be entered into
evidence. While the bulk of disclosure is still being considered and will be
made public only if and when it is entered into evidence, we have already been
given a glimpse of the kind of information such cases can provide. In June
2016, the Globe and Mail published a comparison of two versions of a docu-
ment detailing metadata collection by CSE.25 The first version, received by the
Globe through access to information requests, is heavily redacted and of
modest value. The second version, received by the BCCLA through disclosure,
is significantly less redacted and contains information on how metadata is
defined and what analysts are able to do with the information. As the Globe
and Mail notes:
The BCCLA version of the document shows how CSE is under orders to indis-
criminately collect and share metadata with allied agencies. But also revealed is
the order to scrub out any “Canadian identifying information” after the fact.26
Policy
As with other impacts, it is difficult to show a direct relation between the pres-
sure that a lawsuit puts on policy-makers and their decisions. This is especially
true when it comes to national security, since so much of what drives govern-
ment decision making is kept confidential. At the same time, one of the BCCLA’s
stated public goals with this suit is to change the practice and law around
government surveillance. The association’s suit also comes at a time of pressure
from other organizations and sectors to reform CSE’s activities to ensure that
they do not violate Canadians’ Charter rights (or engage in mass surveillance
internationally, for that matter).
In June 2017, four years after the BCCLA filed its initial lawsuit, the federal
government proposed a major revamp of CSE’s legislation. Bill C-59, the National
Security Act, 2017, includes the newly titled CSE Act. The act lays out in detail
the policies and procedures that would govern the work of CSE, including
parameters for surveillance and retention of data.27 The bill was granted royal
assent in June 2019.
The creation in the National Security Act, 2017 of a new Intelligence Com-
missioner (IC), with quasi-judicial status (the IC must be a retired Federal
Court judge), to approve surveillance authorizations before they are carried
out may also be seen as a response to some of the issues addressed in the
lawsuit (although without more information it is impossible to point to
causality, and further research would be necessary to reach any conclusion).
Such an approval process could, in theory, provide more certainty that
authorizations are issued on reasonable and probable grounds, or at least
under a clear and articulable legal standard (which, the lawsuit argues, does
not currently occur).
While the BCCLA published a brief on the provisions of Bill C-59, including
those related to CSE, it did not express an opinion on whether these changes
address the concerns brought up in its court challenge.28
Analysis
Litigation can be an effective tactic to protect Charter rights: rather than rely
solely on public pressure and advocacy, Charter challenges can result in concrete
changes in law and put pressure on the government to act before the courts
issue their decision. They can also draw media attention to an issue and provide
an opportunity for public awareness campaigns. At the same time, they can be
resource-intensive and drawn out, and there is little guarantee that a court rul-
ing will support, in full or in part, the goal a campaign hopes to achieve. The
decision then centres on whether to continue to pursue it through the courts
or attempt to bring about change through other means.
The Campaign
Many will remember the fight against lawful access in Canada for Public Safety
Minister Vic Toews’s infamous words just before Bill C-30 was introduced – that
Canadians “can either stand with us or with the child pornographers.”33 Within
a year, the bill was dead.
The battle against lawful access had been ongoing for several years already,
dating back to the 2005 Liberal government. Despite many attempts, no lawful
access legislation was successfully adopted.34 The Conservative party made it a
key plank of its proposed omnibus crime bill, which it promised to enact within
the first 100 days following its election with a parliamentary majority on 2 May
2011.35 That June, however, a coalition of thirty public interest organizations,
businesses, and concerned academics assembled by OpenMedia launched the
Stop Online Spying campaign (www.stopspying.ca), calling for an end to lawful
access legislation and for the Conservatives to exclude it from the proposed
omnibus crime bill.
At the time, OpenMedia was known especially for its Stop the Meter campaign,
which by mid-2011 had racked up more than 500,000 signatures on an e-petition
calling on the government to put an end to usage-based billing (or metered
Internet access). Such a large number of online petition signatures was unheard
of in Canada at the time, and the campaign was a factor in the government's
pushing the CRTC to rule against the practice.36 The campaign's success played
a role in OpenMedia's decision to take on lawful access issues.
OpenMedia believed that fighting lawful access in the same way could suc-
ceed. According to Christopher, the organization set out to bring together a
wide coalition that bridged the political spectrum in order to put pressure on
“Every single provincial privacy commissioner has spoken against this bill,” says
OpenMedia.ca’s Executive Director, Steve Anderson. “Law-abiding Canadians
should be able to use the Internet and mobile devices without Big Telecom and
government looking over their shoulders. These invasive surveillance bills will
transform the Internet into a closed, rigid, and paranoid space.”37
to the Stop the Meter campaign and viewed as key to the new campaign, took
off after Toews’s statement and the momentum generated by the hashtag. By
April, the petition had more than 130,000 signatures.44
Christopher points out that it is important to see this number in context. At
the time, online petitions receiving mass support were something new, especially
in Canada. Thus, as the petition passed each milestone – say, 25,000 signatures –
the media reported on it, resulting in ever-growing momentum. On 30 April,
the petition hit its peak, with 135,000 signatures – fewer than for Stop the
Meter but still significant for a campaign on issues like online surveillance
and lawful access, topics that were not often seen as causing strong public
outcry.
Bill C-30 had passed only first reading, and while opposition parties brought
it up in debate, the government never returned it to the House of Commons.
A year after its introduction, the government withdrew the bill. Justice Minister
Rob Nicholson stated:
We will not be proceeding with Bill C-30 and any attempts we will have to mod-
ernize the Criminal Code will not contain the measures in C-30 – including
the warrantless mandatory disclosure of basic subscriber information, or the
requirement for telecommunications service providers to build intercept capa-
bilities within their systems ... Any modernization of the Criminal Code ... will
not contain those.45
Analysis
While the Stop Online Spying campaign is remembered for the #TellVicEverything
hashtag and the online petition, it was a multi-faceted campaign that also
included coalition building, lobbying, and popular education through online
videos and community screenings. These other tactics arguably helped build
momentum towards the 135,000-signature petition.
Along with the multiple tactics, it is also important to acknowledge what
Christopher called the “moment in time.” This includes the resounding success
of the Stop the Meter campaign (the largest online petition campaign in Canada
to date), the novelty and newsworthiness of a viral online petition, and the
government’s miscalculations, particularly Vic Toews’s “with us or with the child
pornographers” proclamation.
It would appear that OpenMedia and its partners were able to take advantage
of this moment to pressure the government and make it politically unpalatable
for the Conservatives to proceed with Bill C-30. The online petition, with its
135,000 signatures, served as visible evidence of the widespread disapproval of
the bill that was needed to make the government retreat.
Unfortunately, just as there had already been multiple attempts to bring in
lawful access, the Conservative government did not give up. Several months
later, Justice Minister Peter MacKay reintroduced lawful access legislation with
Bill C-13. This time, the government framed the issue as an anti-bullying law to
address cases like those of Amanda Todd and Rehtaeh Parsons – teenage girls
bullied online to the point of suicide. Many, including Amanda Todd’s mother,
criticized the bill for combining two unrelated issues and decried it as a political
manoeuvre.46 The new amendments were scaled back and did not, to the letter,
break Nicholson's earlier promise: the new legislation did not allow for warrantless
access or force Internet service providers (ISPs) to build in technology
that would allow back-door entrance for law enforcement. It did, however, lower
the threshold for approval of certain kinds of access warrants, thereby making
it easier than ever for law enforcement and national security agencies to access
online personal data.
So while the laws have been loosened and the debate continues, it is fair to
say that the Stop Online Spying campaign significantly slowed the advance of
lawful access powers in Canada. It is telling that even in recent debates on lawful access, there
continue to be references to the failure of Bill C-30 and the massive opposition
Canadians have shown to warrantless surveillance.
What Is ICAMS?
In 2004, the ICLMG joined other well-known human rights groups from
around the world – the American Civil Liberties Union (United States),
Statewatch (United Kingdom), and Focus on the Global South (Philippines) –
in launching the International Campaign Against Mass Surveillance, which
calls on governments to end mass surveillance and global registration of entire
populations.
Impacts of ICAMS
The campaign had many positive results and impacts, including:
• the declaration signed by hundreds of organizations and individuals
• the book Illusions of Security
• global partnerships, the creation of a collaborative structure and culture, and
regional networks that have lasted to this day
• the emergence of many individuals and groups working on privacy
• getting privacy groups to start doing policy work
• getting organizations from all sectors to start working on privacy as well
• the creation of a relationship between civil society and the OPC
• the civil society parallel conference
• adoption of the core document as a resolution by the Conference of Data Pro-
tection and Privacy Commissioners
• development of international awareness of mass surveillance
• influence on university researchers and federal opposition parties – for exam-
ple, the Bloc Québécois and the New Democratic Party published a minority
report calling for the abrogation of the Anti-terrorism Act of 2001 because of
ICAMS’s work,51 and the ICLMG was frequently cited by them in the House of
Commons.
commissioners from around the world had adopted it. Thus, the campaign did
not lead to any new government policies. Third, no country adopted the docu-
ment or brought it before the United Nations, so no UN treaty was drafted, let
alone signed. Roch Tassé wonders whether they should have given themselves
more solid international structures or created a more formal international entity.
However, it is difficult to say whether that would have worked or been more
effective, he added.
Finally, the lack of time and resources had a big influence. Often, we feel that
it is imperative to act against a terrible affliction such as mass surveillance, but
resources and time limit what can be done. And although there were many
influential individuals and researchers involved, they could not carry the entire
burden. After a while, they moved on to other things or retired, with no one to
succeed them.
pressure seems improbable or insufficient, and if one has the resources and
the patience for such endeavours, then lawsuits and Charter challenges might
be the better approach.
It would seem that the Stop Online Spying campaign contributed to the
defeat of Bill C-30 thanks to not only its hashtag #TellVicEverything and its
online petition but also its multi-faceted approach, including coalition build-
ing, lobbying, and popular education through online videos and community
screenings. It would also appear that the campaign took advantage of a
moment in time that, unfortunately, cannot be planned and thus cannot be
replicated. As we have seen, no such moment enabled us to avoid the adop-
tion of Bill C-13, a subsequent piece of lawful access legislation. Being able
to recognize such a moment and being prepared to seize it are lessons we
can take from the Stop Online Spying campaign. Although multi-faceted
campaigns and viral online petitions also require resources, they are more
accessible than lawsuits for many individual campaigners and non-profit
organizations.
Mass surveillance being an international issue, an international campaign
might be what is needed. The International Campaign Against Mass Surveil-
lance offers a few lessons for such a project. One is that novel ways of framing
and publicizing the issue of mass surveillance as a serious problem that can be
solved appear to be essential in order to mobilize people. The political climate
also needs to be taken into account in order not to waste precious resources
and energy on governments that refuse to act on the issue. The creation of more
solid international structures or a more formal international entity is a potential
avenue to explore. Finally, a deeper collective reflection on the struggle against
mass surveillance, as well as securing more resources and ensuring the succes-
sion within the movement, appears to be necessary in order to sustain this type
of long-term campaign.
Unfortunately, it remains difficult to quantify the results or qualify any of
these campaigns as complete successes (although campaigns often achieve
partial or incremental reform). Or, perhaps more precisely, while each has
in large part fulfilled its immediate goals, the broader goal of reversing (and
eventually eliminating) intrusive, rights-undermining mass surveillance has
yet to be achieved. There is a strong argument to be made that civil society
campaigns have effectively slowed the growth of mass surveillance for national
security purposes, but if we were to measure whether mass surveillance is
more prevalent now than in 2001, it is clear that despite the revelations of
Edward Snowden and other whistle-blowers, as well as the work of progres-
sive legislators and of civil society groups, mass surveillance has continued
to grow.
The lessons that we can draw from these cases are reflective of campaign
strategies adopted in other sectors as well:
• Have clear, direct targets. Attacking the entire national security apparatus, while
necessary, appears to be aiming at too broad a target. This makes it difficult to
articulate clear goals and engage the public in a focused campaign. Targeting
CSE or lawful access specifically made those campaigns more straightforward, with clear
measures of success.
• Build coalitions across sectors. ICAMS went beyond civil society to build
bridges and allies in the government bureaucracy through privacy commission-
ers’ offices. OpenMedia reached out to conservative libertarians and business
organizations that shared similar concerns over privacy as well as the cost to
industry and users of forcing ISPs to integrate interfaces for government sur-
veillance technology into their own infrastructure.
• Use a diversity of tactics. Although a campaign with more facets demands more
resources (which are often limited for civil society organizations), each of these
campaigns achieved greater impact by multiplying the kinds of tactics and tools
it used, targeted to its specific goal.
Conclusion
As mass surveillance becomes more and more normalized and ubiquitous,
traditional and targeted campaigns and actions seem limited in their outcomes,
especially if one’s desired outcome is the abolition of mass surveillance. More
research and reflection are necessary to identify the real impact of our campaigning
and to develop more radical methods of effecting change. The constant barrage of
new legislation gradually permitting more and more surveillance makes it difficult
to take the time to evaluate the effectiveness of our advocacy, but it also makes
doing so all the more urgent.
A broader research project that investigates both the actions and thought
processes of campaigners as well as those of government officials through more
interviews, access to information requests, and primary source documentation
could help answer some of the outstanding questions raised in this chapter.
Other interesting avenues to explore could be international comparisons of
campaigns targeting similar surveillance concerns in different countries. For
example, the Five Eyes countries often adopt similar laws, so an examination
of successes and failures in different jurisdictions could be enlightening. Overall,
however, we believe these three cases show how important it is for civil liberties
and privacy groups to maintain long-term coordinated resistance to mass sur-
veillance, and provide some indications that such resistance will continue to
grow and hopefully achieve new victories in the coming years.
Notes
1 Tia Dafnos, Scott Thompson, and Martin French, “Surveillance and the Colonial Dream:
Canada’s Surveillance of Indigenous Self-Determination,” in National Security, Surveil-
lance and Terror, edited by Randy K. Lippert, Kevin Walby, Ian Warren, and Darren
Palmer (Basingstoke, UK: Palgrave Macmillan, 2016), 324.
2 Mark Neocleous, War Power, Police Power (Edinburgh: Edinburgh University Press,
2014), 31–35.
3 Cited in ibid., 33–34.
4 Ibid., 34.
5 Gary Kinsman, Dieter K. Buse, and Mercedes Steedman, “Introduction,” in Whose
National Security? Canadian State Surveillance and the Creation of Enemies (Toronto:
Between the Lines, 2000), 1–8; Luis A. Fernandez, Policing Dissent: Social Control and
the Anti-Globalization Movement (New Brunswick, NJ: Rutgers University Press, 2008),
94–107.
6 Paul Bernal, “Data Gathering, Surveillance and Human Rights: Recasting the Debate,”
Journal of Cyber Policy 1, 2 (2016): 243–64; Jon Penney, “Chilling Effects: Online
Surveillance and Wikipedia Use,” Berkeley Technology Law Journal 31, 1 (2016): 117,
SSRN, https://ssrn.com/abstract=2769645; Dafnos, Thompson, and French, “Surveillance
and the Colonial Dream,” 324.
7 Colin J. Bennett, Kevin D. Haggerty, David Lyon, and Valerie Steeves, eds., Transparent
Lives: Surveillance in Canada (Edmonton: Athabasca University Press, 2014), 19–37.
8 Ibid., 12.
9 Gregory S. Kealey, Spying on Canadians: The Royal Canadian Mounted Police Security
Service and the Origins of the Long Cold War (Toronto: University of Toronto Press,
2017), 1–13.
10 Canada, Bill C-36, Anti-terrorism Act, 1st Sess, 37th Parl, 2001, http://www.parl.ca/
DocumentViewer/en/37-1/bill/C-36/royal-assent.
11 Canada, Bill C-51, Anti-terrorism Act, 2015, 2nd Sess, 41st Parl, 2015, http://www.parl.
ca/DocumentViewer/en/41-2/bill/C-51/royal-assent.
12 Canada, Bill C-59, An Act respecting national security matters, 1st Sess, 42nd Parl,
2017, http://www.parl.ca/DocumentViewer/en/42-1/bill/C-59/second-reading [National
Security Act, 2017]; International Civil Liberties Monitoring Group (ICLMG),
“Breaking Down Bill C-59, the New National Security Act,” http://iclmg.ca/issues/
bill-c-59-the-national-security-act-of-2017/.
13 BC Civil Liberties Association (BCCLA), “BCCLA Sues Canadian Government to Stop
Illegal Spying,” https://bccla.org/stop-illegal-spying/protect-our-privacy-case-details/.
14 BCCLA, “Backgrounder on Spying: Civil Liberties Watchdog Sues Surveillance
Agency over Illegal Spying on Canadians,” 1 June 2016, https://bccla.org/wp-content/
uploads/2016/06/2016_06_02_Backgrounder-BCCLA-Sues-CSE-1.pdf.
15 Ibid.
16 BCCLA, “Illegal Spying: BCCLA Files Class Action Lawsuit against Canada’s Electronic
Spy Agency,” 1 April 2014, https://bccla.org/news/2014/04/illegal-spying-bccla-files-class
-action-lawsuit-against-canadas-electronic-spy-agency/.
17 James Keller, “Ottawa Says CSEC’s Collection of Canadians’ Data ‘incidental,’” CTV
News, 24 January 2014, https://www.ctvnews.ca/canada/ottawa-says-csec-s-collection
-of-canadians-data-incidental-1.1655231.
18 BCCLA to the Attorney General of Canada, Statement of Claim to the Defendant, 27
October 2014, https://bccla.org/wp-content/uploads/2014/12/20141027-CSEC-Statement
-of-Claim.pdf.
19 Ibid.
20 BCCLA, “Stop Illegal Spying,” https://bccla.org/stop-illegal-spying/.
21 BCCLA, “Spying in Canada: Civil Liberties Watchdog Sues Surveillance Agency
over Illegal Spying on Canadians,” 23 October 2013, https://bccla.org/wp-content/
uploads/2013/10/Final-Press-Release-Spying-10_21_131.pdf.
22 Greg Weston, “CSEC Used Airport Wi-Fi to Track Canadian Travellers: Edward
Snowden Documents,” CBC News, 31 January 2014, http://www.cbc.ca/news/
politics/csec-used-airport-wi-fi-to-track-canadian-travellers-edward-snowden
-documents-1.2517881; Dave Seglins, “CSE Tracks Millions of Downloads Daily:
Snowden Documents,” CBC News, 27 January 2015, http://www.cbc.ca/news/canada/
cse-tracks-millions-of-downloads-daily-snowden-documents-1.2930120.
23 Colin Freeze and Wendy Stueck, “Civil Liberties Groups Launch Lawsuit against
Canadian Eavesdropping Agency,” Globe and Mail, 22 October 2013, https://www.
theglobeandmail.com/news/national/canadian-eavesdropping-agency-facing-lawsuit
-from-civil-liberties-group/article14984074/; “Canadian Spy Agency Sued for Alleg-
edly Violating Charter,” CBC News, 22 October 2013, http://www.cbc.ca/news/canada/
british-columbia/canadian-spy-agency-sued-for-allegedly-violating-charter-1.2158884;
Gillian Shaw, “BC Civil Liberties Association Launches Lawsuit against Canada’s Electronic
Surveillance Agency,” Vancouver Sun, 22 October 2013, http://vancouversun.com/news/
staff-blogs/bc-civil-liberties-association-launches-lawsuit-against-canadian-government
-over-csec-spying.
24 Liam Britten, “BCCLA Says Warrantless Spying on Canadians Must End,” CBC News, 23 June
2016, http://www.cbc.ca/news/canada/british-columbia/bccla-cse-surveillance-1.3650286.
216 Tim McSorley and Anne Dagenais Guertin
25 “The ‘Top Secret’ Surveillance Directives,” Globe and Mail, 2 June 2016, https://www.the
globeandmail.com/news/national/top-secret-surveillance-directives/article30249860/.
26 Ibid.
27 Monique Scotti, “Here’s What You Need to Know about Canada’s ‘Extraordinarily
Permissive’ New Spying Laws,” Global News, 6 February 2018, https://globalnews.ca/
news/3999947/cse-c59-new-spy-powers-canada/.
28 BCCLA, “Written Submissions of the British Columbia Civil Liberties Association
(‘BCCLA’) to the Standing Committee on Public Safety and National Security regarding
Bill C-59, An Act respecting national security matters,” 30 January 2018, http://www.
ourcommons.ca/Content/Committee/421/SECU/Brief/BR9669809/br-external/British
ColumbiaCivilLibertiesAssociation-e.pdf.
29 Christopher A. Parsons, Lex Gill, Tamir Israel, Bill Robinson, and Ronald J. Deibert,
“Analysis of the Communications Security Establishment Act and Related Provisions in
Bill C-59 (an Act Respecting National Security Matters), First Reading,” Transparency
and Accountability, December 2017, SSRN, https://ssrn.com/abstract=3101557.
30 Canada, Bill C-30, An Act to enact the Investigating and Preventing Criminal Electronic
Communications Act and to amend the Criminal Code and other Acts, 1st Sess, 41st Parl,
2012, http://www.parl.ca/LegisInfo/BillDetails.aspx?Language=E&billId=5375610.
31 Sarah Schmidt and Jason Fekete, “Vic Toews Will ‘Entertain Amendments’ to Online
Surveillance Bill,” National Post, 15 February 2012, http://nationalpost.com/news/
canada/protecting-children-from-internet-predators-act-vic-toews.
32 Gillian Shaw, “Stop Online Spying,” Vancouver Sun, 15 September 2011, https://vancou
versun.com/news/staff-blogs/stop-online-spying-openmedia-ca-launches-campaign
-against-web-surveillance-legislation/.
33 Schmidt and Fekete, “Vic Toews Will ‘Entertain Amendments.’”
34 Philippa Lawson, Moving toward a Surveillance Society: Proposals to Expand ‘Lawful
Access’ in Canada (Vancouver: BCCLA, January 2012), https://bccla.org/wp-content/
uploads/2012/03/2012-BCCLA-REPORT-Moving-toward-a-surveillance-society.pdf.
35 Laura Payton, “Internet Privacy Experts Raise Concerns over Crime Bill,” CBC News, 9
August 2011, http://www.cbc.ca/news/politics/internet-privacy-experts-raise-concerns
-over-crime-bill-1.1090482; Michael Geist, “The Conservatives Commitment to Internet
Surveillance,” Michael Geist (blog), 9 April 2011, http://www.michaelgeist.ca/2011/04/
conservative-lawful-access-commit/.
36 OpenMedia, “A Look Back at Our Stop the Meter Campaign,” https://openmedia.org/en/
ca/look-back-our-stop-meter-campaign.
37 OpenMedia, “Invasive Surveillance Bills Will Cost Canadians in Cash and Civil Liberties,
Says New Coalition,” 22 June 2011, https://openmedia.org/en/press/invasive-surveillance
-bills-will-cost-canadians-cash-and-civil-liberties-says-new-coalition.
38 OpenMedia to Right Honorable Prime Minister Stephen Harper, “RE: Omnibus Crime
Bill,” 9 August 2011, https://assets.documentcloud.org/documents/230754/letter-to-harper
-re-lawfulaccess.pdf.
39 Daniel Tencer, “‘Lawful Access’ Legislation Missing from Omnibus Crime Bill, but
Online Spying Fight Isn’t Over,” Huffington Post Canada, 20 September 2011, https://
www.huffingtonpost.ca/2011/09/20/lawful-access-legislation_n_971965.html.
40 Postmedia News, “Online Surveillance Bill Critics Are Siding with ‘Child Pornogra-
phers’: Vic Toews,” National Post, 14 February 2012, http://nationalpost.com/news/
canada/online-surveillance-bill-critics-are-siding-with-child-pornographers-vic-toews.
41 Melissa Martin, “TellVicEverything an Internet Sensation,” Winnipeg Free Press, 17 Feb-
ruary 2012, https://www.winnipegfreepress.com/local/tell-vic-everything-an-internet
-sensation-139501528.html.
Confronting Big Data 217
42 Steve Anderson, “Stop Online Spying Hits 100k: Canadians Are an Inspiration,” Open-
Media.ca, 17 February 2012, https://openmedia.org/en/stop-online-spying-hits-100k
-canadians-are-inspiration.
43 Ibid.
44 OpenMedia, “A Look Back at Our Stop Spying Campaign against Canada’s Bill C-30,”
https://openmedia.org/en/ca/look-back-our-stop-spying-campaign-against-canadas
-bill-c-30.
45 Canadian Press, “Conservatives Kill Controversial ‘Child Pornographers’ Internet Sur-
veillance Bill,” National Post, 11 February 2013, http://nationalpost.com/news/politics/
conservatives-kill-controversial-internet-surveillance-bill.
46 Evan Dyer, “Cyberbullying Bill Draws Fire from Diverse Mix of Critics,” CBC News,
20 October 2014, http://www.cbc.ca/news/politics/cyberbullying-bill-draws-fire-from
-diverse-mix-of-critics-1.2803637.
47 International Campaign Against Mass Surveillance (ICAMS), The Emergence of a Global
Infrastructure for Mass Registration and Surveillance (ICAMS, April 2005), https://
web.archive.org/web/20070109200500/http://www.i-cams.org/ICAMS1.pdf; ICAMS,
“The Emergence of a Global Infrastructure for Mass Registration and Surveillance: 10
Signposts,” https://web.archive.org/web/20061219231540/http://www.i-cams.org:80/
Surveillance_intro.html; ICAMS, “Campaign Declaration,” https://web.archive.org/
web/20061219231451/http://www.i-cams.org:80/Declaration_Eng.html.
48 Maureen Webb, Illusions of Security: Global Surveillance and Democracy in the Post-9/11
World (San Francisco: City Lights Books, 2007).
49 The Global Privacy Assembly (GPA), 18 May 2020, https://globalprivacyassembly.org/
the-assembly-and-executive-committee/history-of-the-assembly/; International Con-
ference of Data Protection and Privacy Commissioners (ICDPPC), “Resolution on the
Urgent Need for Global Standards for Safeguarding Passenger Data to Be Used by Govern-
ments for Law Enforcement and Border Security Purposes,” 29th International Confer-
ence, Montreal, 2007, http://globalprivacyassembly.org/wp-content/uploads/2015/02/
Resolution-on-Urgent-need-for-global-standards-for-safeguarding-passenger-data-to
-be-used-by-governments-for-law-enforcement-and-border-security-purposes.pdf.
50 ICDPPC, “Resolution on the Urgent Need for Protecting Privacy in a Borderless World,
and for Reaching a Joint Proposal for Setting International Standards on Privacy and
Personal Data Protection,” 30th International Conference, Strasbourg, France, 2008,
http://globalprivacyassembly.org/wp-content/uploads/2015/02/Resoltuion-on-the
-urgent-need-for-protecting-privacy-in-a-borderless-world.pdf.
51 Serge Ménard and Joe Comartin, “Anti-Terrorism Act Dissenting Opinion,” in Rights,
Limits, Security: A Comprehensive Review of the Anti-Terrorism Act and Related Issues,
Report of the Standing Committee on Public Safety and National Security, March 2007,
http://www.ourcommons.ca/DocumentViewer/en/39-1/SECU/report-7/page-69.
12
Protesting Bill C-51
Reflections on Connective Action against Big Data Surveillance
Jeffrey Monaghan and Valerie Steeves
Get a couple of beers in them and [privacy advocates] will fantasize about
what they call the “Privacy Chernobyl” – the one privacy outrage that will
finally catalyze an effective social movement around the issue.
– Philip Agre, cited in The Privacy Advocates:
Resisting the Spread of Surveillance
In October 2014, a homeless man killed a soldier standing guard by the National
War Memorial in downtown Ottawa using an antique single-shot rifle, and then
entered the front entrance of Parliament’s Centre Block. Within minutes, he
was shot and killed by parliamentary security. Days later, taking advantage of
this tragic yet exceptional act of violence, Prime Minister Stephen Harper’s
government proposed sweeping reform to the policing and security powers
contained in the Anti-terrorism Act. The legislation, known as Bill C-51, included
increased powers of surveillance and information sharing, as well as contentious
powers of disruption that would enable judges to sanction, in advance, police
actions that would explicitly violate the Canadian Charter of Rights and
Freedoms.1
Bill C-51 was among a number of surveillance and intelligence-sharing pro-
posals that had been floated to strengthen the ability of security agencies to
engage in big data practices of mass collection for the purpose of future pre-
emptive action through algorithmic induction; indeed, critics suggested that
the proposals were ready-made, simply waiting for the appropriate tragedy to
be officialized.2 In spite of the social backdrop of fear and anxiety, public criti-
cism of Bill C-51 quickly emerged. Cross-country mobilizations included a
number of national days of action that attracted thousands of protest participants
in fifty-five cities across the country.3 Newspapers, television news, and social
media exploded with debate, and thousands of Canadians signed petitions and
open letters of protest, gradually building a critical mass in opposition grounded
loosely on concerns about civil liberties, privacy politics, and policing powers
in the “War on Terror.”
Although an amended version of the bill was enacted into law, the public
response to Bill C-51 stands in stark contrast to the lack of public engagement
in privacy issues noted by Colin Bennett in his study of privacy advocacy. In
the importance of campaigns against lawful access laws as formalizing the hybrid
action network.19 The lawful access debates provide an excellent example of what
Jennifer Earl and Katrina Kimport describe as e-tactics and e-mobilizations.20
However, the movement against lawful access never translated from online
activism to protest mobilizations.
Based on the networks established through earlier campaigning, organizers
were able to quickly come together and establish a new campaign as soon as
the government introduced Bill C-51, using a variety of digitally mediated efforts.
As described by another participant:
[When] Bill C-51 [was announced] that’s when we really kicked into high gear
with the Charter challenge and petition and letter writing campaigns, social
media campaigns, basically all the tactics ... [that] organizations like ours can
use and we’ve been fighting that battle now for four years.21
Almost immediately, the C-51 campaign was also far more heterogeneous,
attracting the participation of activists who were engaged more broadly in
political/economic (e.g., labour unions) and Internet freedom issues
(e.g., net neutrality). As one participant put it, “a lot more groups – a bit bigger
diversity groups came in.”22 This brought with it a concomitant diversity of views
about government surveillance and policing, and opened up more space for
members of the public to engage with the campaign for their own purposes and
from their own personal action frames. It enabled participants to act in concert,
if not collectively, to push back against surveillance and move the opposition
from online campaigning into protest organizing.
We turn now to the factors that supported this shift.
Here’s a statement of CSIS [the Canadian Security Intelligence Service]: “we will
neither confirm nor deny and we will not tell you whether in our opinion we
should get a warrant if we were ever to use one.” I’m sorry, we will not tell you
parliamentarians whether CSIS would get a warrant for the use of IMSI-catchers?
What is that? That is a shocking level of disdain for democratic process.27
The lack of solid information about how and when surveillance was being
mobilized made it particularly difficult to engage a public that was under-
informed and fearful about security:
I mean it’s such a pervasive issue, and the PR machine of the governments and
the spy agencies are so much bigger and they have so much ammunition to
throw out there, especially with, you know, the fear mongering going on in the
United States and around the world, it’s a hard, hard battle to fight.28
Interestingly, our participants indicated that these two factors also combined
to immobilize politicians and shut down dissenting voices among parliamentar-
ians: “There is such fear and so that fear is even affecting our policy makers. So
even if we can see sometimes and we can feel that yes they want to make these
changes, it’s like the political environment is really always pushing that in the
other direction.”29 The creation of “an imbalance towards security” makes it
I mean it’s a weird trade-off where I think that everyone knew that it was hap-
pening, but on the other side of the same coin you have the fact that if you bring
that up, you’re basically a tinfoil hat person. And I think that Edward Snowden
made this more of a socially acceptable thing to talk about versus something that
people didn’t know. People already knew it was happening, but it was something
that they didn’t really want to talk about because they didn’t want to seem like
the crazy person. But now it’s more acceptable to talk about.32
to crack this nut for a long time.”35 The transparency created by the Snowden
revelations provided an important affordance for work on surveillance and
privacy, and was also highlighted specifically as an important factor in the
debates that emerged over Bill C-51: “I think Snowden helped [because it] really
raised the ... salience of those issues [surveillance] to a broader set of people,
which by the time C-51 rolled along ... I think helped [by] going outside the
privacy bucket.”36
That enlarged “privacy bucket” also provided a repertoire to the public at
large. This was particularly true in the use of social media to tweet or share
information regarding the impacts of C-51 in expanding information-sharing
and surveillance practices. Using social media affordances that appeared as a
result of the massive circulation of Snowden-related content, the hybridity of
the C-51 movement created opportunities to leverage the ineffectiveness of the
political environment. Often engagement with capital-P politics is conceived
as an effort to reform or direct the legislative process, or the platforms of political
parties.37 Yet, from the perspective of C-51 movement participants, the political
engagement around the proposed bill created opportunities to engage in public
education and movement building. This was particularly true regarding work
associated with attending parliamentary committee hearings. As with traditional
SMOs, participating in institutional moments of lawmaking allowed C-51 critics
to reach a much broader audience than merely the politicians themselves because
of the media attention the bill attracted. But it also provided opportunities to
leverage this focused attention to educate journalists and members of the public
about the issues.
As one participant said:
It’s not just a resonance ... [it’s that] you get to the education component of it.
Like you do build on it every time. So with C-51 – starting with the initial lawful
access stuff, we had a lot of education to do – like journalists first. Like not –
education’s maybe the wrong word, but like we would have to – like it would
take a while for journalists to get why this is a problem, how it works, etc.,
right? ... but we went through that process and then they get it ... then eventually
they get it.38
I went to the one protest here in [place] and seeing like, you know, people out
on the streets of like [place] and [place] and like small towns right across the
country on Bill C-51 was I think remarkable. So I think there’s a huge public
appetite out there for change.46
And that appetite for change was embraced by a wide variety of people who
were mobilized by their own personal action frames. A participant from a
traditional privacy group noted: “I participated in marches, in protests, in rallies
and they were great, really great, great opportunities and great moments I would
say to see people, not the usual suspects.”47 A participant from a non-privacy
organization agreed:
I mean a number of people from the C-51 campaign who were saying things
like “oh this is my first time getting active in anything political.” The number of
people ... I’ve been to a bunch of different protests in [area] but was the – what
was interesting about the C-51 one was – it definitely didn’t seem to be like the
usual kind of left wing kind of crowd that you would get.48
As a hybrid movement, the Bill C-51 days of action were created through a
network that had been partially solidified from years of privacy activism
related to surveillance, yet it was infused with organizations that had little
history in protesting surveillance or privacy-related issues. Moreover, the
affordances provided by the Snowden revelations, the negative political
environment, and social media allowed for broad resonance and connective
action with participants – not the “usual suspects” but those who had found
what Bennett and Segerberg describe as “personalized” appeals to the protests.
Although SMOs played an enabling role, the protests themselves became
heterogeneous and far more diverse than previous moments of anti-
surveillance activism.
As an example of organizationally enabled connective action, the C-51 move-
ment was highly effective at changing public opinion on the bill. A combination
of the political affordances and the mass mobilizations dramatically altered the
framing that had been initially fostered by the Harper government. One participant recalled:
You know, when C-51 was first brought on the books, something like 80 per cent
of Canadians supported it and then that shifted almost immediately ... it was
almost a complete flip ... the fact that we were able to get so many Canadians so
quickly to change their minds on it is incredible. Like, you don’t see polls slip
like that, ever ... I think the fact that we’ve gotten the media so on board with it
is a huge success.49
So I think we took a rather obscure topic and made it a point of national dis-
cussion ... I think we’re able to sort of shift the frame, not for everyone but for
a lot of people away from pure like security terrorism, to democratic debate,
freedom, which I think is also helpful for a whole series of other debates that
need to happen around, you know, net neutrality and a bunch of other things,
where something that seems kind of obscure or almost benign actually is chal-
lenging, sort of like undercutting a lot of the underpinnings of [the] other types
of freedoms we have.50
While bulk data collection has been taking place for over a decade, security
and policing agencies such as the RCMP, CSE, and CSIS have deliberately misled
the public regarding the scope and objectives of their practices. The same agencies
have also withheld vital information regarding bulk data collection from the
courts. This development was spectacularly revealed in 2016 when the Federal
Court castigated CSIS for not meeting its duty of candour and deliberately
withholding information on bulk data collection for ten years, including the
Indeed, one of the most challenging dynamics in contesting big data is the sup-
posed benefits – or “big data solutionism”56 – espoused by its advocates. Appeals
to big data solutionism are certainly not exclusive to security governance issues,
but the dynamics of secrecy and potential for injustice that are rendered opaque
by appeals to big data are particularly acute with security issues.
Despite the potential of post-privacy attitudes or the discourse of big data
solutionism to obscure the ethical implications of the practices of mass data
collection and algorithmic governance, the connective action that propelled
Bill C-51 mobilizations may provide insight into practices that can challenge
efforts to expand and intensify surveillance in the future. As Bennett and Seger-
berg have noted, the success of connective action results from the ability to “cast
a broader public engagement net using interactive digital media and easy-to-
personalize action themes, often deploying batteries of social technologies to
help [people] spread the word over their personal networks.”57 Shifts in con-
temporary society towards personalized politics present challenges to the more
traditional models and requirements of collective action, and a number of
movements have made use of “personalized varieties of collective action” to
spark public mobilizations.58
The movement from collective to connective action is represented as a move
away from central organizations and a strong collective identity, as well as an
opening of a broader field of political engagement. “Clicktivism,” for example,
has been associated with what Max Halupka characterizes as small, impulsive,
non-ideological political actions, such as clicking “likes” on Twitter or Facebook
in an effort to raise awareness or contribute to social change through one’s
personal social media networks.59 A far cry from the foundations of resource
mobilization theories that require strong group identity and a recognition of
political opportunities almost exclusively tied to capital-P politics, the personal-
ized character of connective action offers more grounds for fluid, spontaneous
political engagement. Yet, while connective action might be highly effective for
contemporary mobilizations, from flash mobs to protest camps, are these new
dynamics of public contestation effective at producing social change?
The broad spectrum of engagement around C-51 shows both an advantage
and a disadvantage of connective action. While personalization can draw people
into spontaneous resistance, the lack of organizational cohesion can result in
quick dissipation and disaggregation unless SMOs adapt to shifting needs and
continue to support new digital debates as they arise. Moreover, a growing area
of concern – and attention – considers how digitally enabled actions have them-
selves become networked sites of mass surveillance, something we need to be
very skeptical about.60 Although the C-51 mobilizations demonstrate how con-
nective action can translate into contentious politics, it is difficult to establish
Notes
1 Craig Forcese and Kent Roach, False Security: The Radicalization of Canadian Anti-
Terrorism (Montreal: Irwin Law, 2015).
2 Ibid.; Tamir Israel and Christopher Parsons, “Canada’s National Security Consultation I:
Digital Anonymity & Subscriber Identification Revisited ... Yet Again” (Canadian Inter-
net Policy and Public Interest Clinic report, 2016), https://cippic.ca/uploads/20161005-
CNSCI-RevisitingAnonymityYetAgain.pdf.
3 Michael Shulman, “Demonstrators Across Canada Protest Bill C-51,” CTV News, 14
March 2015, http://www.ctvnews.ca/politics/demonstrators-across-canada-protest-bill
-c-51-1.2279745.
4 Colin Bennett, The Privacy Advocates: Resisting the Spread of Surveillance (Cambridge,
MA: MIT Press, 2008), 207. See also Colin Bennett and Charles Raab, The Governance
of Privacy (Cambridge, MA: MIT Press, 2006).
5 Participant groups are denoted in the text as Traditional, Internet, and Other. Each inter-
viewee has been given a number, e.g., Traditional 1.
6 W. Lance Bennett and Alexandra Segerberg, “The Logic of Connective Action: Digital
Media and the Personalization of Contentious Politics,” Information, Communication
and Society 15, 5 (2012): 739–68.
7 Ibid.; Jennifer Earl and Katrina Kimport, Digitally Enabled Social Change: Activism in the
Internet Age (Cambridge, MA: MIT Press, 2011); Paolo Gerbaudo, Tweets and the Streets:
Social Media and Contemporary Activism (London: Pluto Press, 2012).
8 Jennifer Earl, “The Future of Social Movement Organizations: The Waning Dominance
of SMOs Online,” American Behavioral Scientist 59, 1 (2015): 35–52.
9 See debates on collective identity: Paolo Gerbaudo, The Mask and the Flag: Populism,
Citizenism, and Global Protest (Oxford: Oxford University Press, 2017); Emiliano Treré,
“Reclaiming, Proclaiming, and Maintaining Collective Identity in the #YoSoy132 Move-
ment in Mexico: An Examination of Digital Frontstage and Backstage Activism through
Social Media and Instant Messaging Platforms,” Information, Communication and Soci-
ety 18, 8 (2015): 901–15.
10 Ian Angus, Emergent Publics: An Essay on Social Movements and Democracy (Winni-
peg: Arbiter Ring, 2001). See also Nancy K. Baym and danah boyd, “Socially Mediated
Publicness: An Introduction,” Journal of Broadcasting and Electronic Media 56, 3 (2012):
320–29.
11 Zeynep Tufekci, Twitter and Tear Gas: The Power and Fragility of Networked Protest (New
Haven, CT: Yale University Press, 2017).
35 Interview, Traditional 1.
36 Interview, Internet 4.
37 See Calo, “Can Americans Resist Surveillance?”
38 Interview, Internet 4.
39 Ibid.
40 Interview, Internet 3.
41 Ibid.
42 Ibid.
43 Interview, Other 3.
44 Interview, Other 2.
45 Interview, Other 1.
46 Interview, Other 3.
47 Interview, Traditional 2.
48 Interview, Other 3.
49 Interview, Other 2.
50 Ibid.
51 Interview, Traditional 1.
52 Jef Huysmans, Security Unbound: Enacting Democratic Limits (New York: Routledge,
2014).
53 Interview, Internet 4.
54 Interview, Traditional 3.
55 Interview, Traditional 5.
56 See Ganaele Langlois, Joanna Redden, and Greg Elmer, “Introduction,” in Compromised
Data – From Social Media to Big Data, edited by Ganaele Langlois, Joanna Redden, and
Greg Elmer (New York: Bloomsbury, 2014), 1–14.
57 Bennett and Segerberg, “The Logic of Connective Action,” 742.
58 Ibid., 743. See also W. Lance Bennett, “The Personalization of Politics: Political Identity,
Social Media, and Changing Patterns of Participation,” Annals of the American Academy
of Political and Social Science 644, 1 (2012): 20–39.
59 See Max Halupka, “Clicktivism: A Systematic Heuristic,” Policy and Internet 6, 2 (2014):
115–32.
60 See Lucas Melgaço and Jeffrey Monaghan, “Introduction: Taking to the Streets in the
Information Age,” in Protests in the Information Age: Social Movements, Digital Practices
and Surveillance, edited by Lucas Melgaço and Jeffrey Monaghan (New York: Routledge,
2018), 1–17.
Part 5
Policy and Technical Challenges of
Big Data Surveillance
13
Horizontal Accountability and Signals Intelligence
Lessons Drawn from Annual Electronic Surveillance Reports
Christopher Parsons and Adam Molnar
Conceptual Terminology
Organizations that act transparently collate and present data to those outside
the organization.5 This disclosure of information can sometimes present data
that are useful for the public.6 Often, organizations act transparently when they
are compelled to present information in a delimited format7 or through their
own methodologies to collate and disclose information.8 In either case, organ-
izations that “behave transparently” may be attempting to engender greater
trust in their practices.9 On this basis, scholars are advised to pay “careful atten-
tion to the human and material operations that go into the production of
transparency”10 because the revelatory character of transparency practices may
be overemphasized absent critique.
One way that governments, in particular, demonstrate transparency is through
the release of statutorily required reports. Electronic surveillance reports are
an attempt to address an inequity in the social contract between governments
and their citizens. By disclosing how regularly government surveillance
practices occur, such reports are thought to guard against disproportionate
state intrusion into the private lives of citizens. In contrast, the absence of
any requirement to disclose these activities, or a failure to release such reports,
can hinder legislatures and the citizenry from holding the government to
account.11 Without information about secretive government practices, the public,
parliamentarians, and other stakeholders cannot evaluate whether government
agencies are using their exceptional powers appropriately and in ways that
cohere with public interpretations and expectations of how the law ought to
legitimate such activities.12
Transparency in government activities is needed to ensure that civic agencies
are held accountable to their minister, to Parliament, and to the public more
broadly. A system of accountability exists “when there is a relationship where
journalists, who subsequently selectively published from what they were given.
One of the most prominent Canadian-focused Snowden disclosures was a
program covernamed CASCADE. CASCADE was operated on non–government
of Canada networks and was designed to analyze network traffic. The analysis
involved discovering and tracking targets, as well as isolating content or metadata
from traffic exposed to the network probes.33 Within the CASCADE program
was a series of differently classified and covernamed network sensors. Some
could capture metadata and content alike (EONBLUE and INDUCTION),
whereas others could solely collect and analyze metadata (THIRD-EYE and
CRUCIBLE).34 All of these sensors relied on deep packet inspection technology,
which enables operators to analyze the metadata and content of unencrypted
communications and take actions on it, such as blocking certain traffic or
modifying other traffic.35
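The mechanics of deep packet inspection can be illustrated with a minimal, hypothetical sketch. None of the names or the flagging rule below come from the Snowden documents or any actual CSE sensor; they are invented solely to show why unencrypted traffic is so exposed: both the "metadata" (routing and header information) and the "content" (payload) of a packet are readable in cleartext, so an inline device can separate the two and act on either.

```python
# Toy illustration of a DPI-style pass over one unencrypted HTTP request.
# Hypothetical code for exposition only; not any real sensor or program.

def inspect_packet(raw: bytes) -> dict:
    """Split an unencrypted HTTP request into metadata and content."""
    # HTTP separates headers from the body with a blank line (CRLF CRLF).
    head, _, body = raw.partition(b"\r\n\r\n")
    lines = head.decode("ascii", errors="replace").split("\r\n")
    request_line, header_lines = lines[0], lines[1:]
    headers = dict(h.split(": ", 1) for h in header_lines if ": " in h)
    return {
        "metadata": {"request": request_line, "headers": headers},
        "content": body,  # payload is readable only because nothing is encrypted
        # An inline sensor could match on content and then drop or modify
        # the traffic; this keyword rule is purely illustrative.
        "flagged": b"malware" in body.lower(),
    }

packet = (b"POST /upload HTTP/1.1\r\nHost: example.org\r\n"
          b"User-Agent: demo\r\n\r\nhello world")
result = inspect_packet(packet)
```

Encryption (e.g., TLS) would hide the body and most headers from such a device, which is why the chapter's distinction between metadata-only sensors and metadata-plus-content sensors turns on whether traffic is unencrypted.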
INDUCTION operated at “Special Source Operations (SSO) sites,” or within
the premises of private Canadian organizations that had consented to CSE’s
activities. CRUCIBLE sensors, similar to INDUCTION sensors, were located
in the pathways of networks that were designated “systems of importance” to
Canada.36 Such systems might belong to defence contractors, extractive resource
companies, banks, or equivalent organizations whose compromise could
detrimentally affect the governance of Canada. These sensors could also collect
the metadata of communications that Canadians, and persons communicating
with Canadians, were engaged in, as well as the metadata of devices that trans-
mitted information into or out of Canada. Other aspects of CASCADE involved
monitoring satellite communications as well as microwave towers that trans-
mitted data.37
The purpose of CASCADE, when combined with an equivalent sensor
network designed to protect the Canadian government’s own networks (cover-
named PHOTONIC PRISM, which was expected to be replaced by EON-
BLUE),38 was to use the entirety of the global information infrastructure as a
means of defence. By tracking threat actors and their activities, CSE intended
to “affect changes at the CORE of the Internet on detection” in collaboration
with its Five Eyes partners. Such changes included modifying traffic routes,
silently discarding malicious traffic, or inserting payloads into communica-
tions traffic to disrupt adversaries.39 To achieve these ends, CASCADE would,
in essence, be situated to grant comprehensive awareness of domestic and foreign
Internet activity throughout the world. The most controversial aspects of this
program in Canada were principally linked to the extensive surveillance of
Canadian-source, Canadian-bound, and Canadian domestic traffic, as well
as CSE’s efforts to work alongside private partners to conduct this global
surveillance activity.
range of activities than many thought was already likely given its scope and
perceived capabilities. While Bill C-59 may retroactively authorize these existing
activities, it has made more explicit the expansive range of CSE’s activities, which
include collecting foreign intelligence through the global information infra-
structure; engaging in cybersecurity and information assurance; conducting
defensive operations to broadly protect federal institutions’ systems and those
deemed of importance to Canada; performing active cyber operations that may
involve degrading, disrupting, influencing, responding to, or interfering with
“the capabilities, intentions or activities” of non-Canadian parties; and provid-
ing technical and operational assistance to LESAs, the Canadian Forces, and
the Department of National Defence.46 There are provisions within the CSE Act that also permit CSE to collect information from any public source,47 including perhaps grey market information brokers, and to interfere with non-democratic foreign elections,48 among other controversial measures.
The program that we have examined in this chapter can be situated within
this expanded mandate. CASCADE could operate simultaneously under the
collection of foreign intelligence, cybersecurity and information assurance,
and (potentially) assistance mandates. Under each of these mandate areas, CSE is permitted to acquire information as required, to provide services meant to guarantee the digital security of different government and non-governmental organizations, and to use collected information as appropriate to assist domestic LESAs or foreign-operating Canadian Forces in acting against parties that threaten Canadian organizations' digital systems. If it obtains authorization, activities in Canada could
extend to active defensive operations. Furthermore, Bill C-59 explicitly
authorizes CSE to infiltrate any part of the global information infrastructure
for the purposes of collecting foreign intelligence. Such intelligence includes the types of attacks being launched towards Canadian networks or systems of interest; the bill also permits private companies to cooperate with CSE and, as such, operate as SSOs. Whereas CSE's current
legislation does not explicitly state the conditions under which it can engage
with private organizations (as envisioned under the CASCADE program),
the cybersecurity authorizations for non-federal infrastructures under Bill
C-59 establish the legislative framework for such cooperation. Notably, C-59
also includes emergency provisions for access to private organizations' infrastructure. These provisions might let CSE gain permission either from the operator of the infrastructure, such as a party running software on, say, computer servers in a shared computing facility, or from the party that owns the servers and leases them to the software-running party.49
This can occur without having to get the activity approved by anyone besides
Conclusion
Patrick Walsh and Seumas Miller have argued that “[t]he Snowden leaks now
provide the opportunity for ‘Five Eyes’ governments to do a root and branch
review of current organizational, ministerial, parliamentary and other standing
oversight bodies to ensure they remain fit for purpose.”61 Goldman has separately
insisted that “although the institutions designed to ensure compliance work
well, these same institutions have difficulty with a broader role.”62 We agree with
these points and argue that a review of the intelligence community and its
transparency and accountability structures must also consider how to empower
stakeholders external to government to better engage in horizontal account-
ability. Indeed, in an environment characterized by rapid technological innova-
tion, extensive legal ambiguities, and associated tensions with traditional liberal
democratic principles, horizontal accountability is an essential component of
meaningful regulation.
In this chapter, we have argued that horizontal accountability can help legit-
imate secretive government activities that are authorized by legislation. We
proposed four separate measures, focused respectively on legal, statistical, narrative, and proportionality dimensions, to enhance the information available to external-to-government
stakeholders. This information could then be taken up and used to understand
and critique some activities, while also ensuring that parties external to govern-
ment could identify and propose solutions to thorny legal issues, could better
explain the protections and safeguards established to protect civil liberties and human rights, and could ensure that the stakeholders they represent are better
informed about the actual, versus hypothetical or hyperbolic, issues linked to
government surveillance activities.
A continuation of the status quo, where citizens are kept in the dark concerning secret intelligence activities and the laws that authorize them, "undermines
the capacity of citizens to determine whether a new balance of security concerns
and basic rights has been struck.”63 The status quo also threatens to magnify the
already disturbing gap between legislation as it is written, as it is interpreted by the Department of Justice and other government national security lawyers, and as
it is acted upon by Communications Security Establishment staff. This gap fun-
damentally threatens the legitimacy, if not the lawfulness, of CSE’s activities. No
government party benefits from the perpetuation of this gap: while it may
be tactically helpful in advancing specific operations or activities, it ultimately
threatens to poison the legitimacy of organizations themselves and, by extension,
turn tactical outputs into components of a broader strategic blunder.
Ultimately, it is only once citizens, often facilitated by academic and civil
society actors, know what is being done in their name, and why and how those
measures are linked to the activities authorized by their legislators, can the
Horizontal Accountability and Signals Intelligence 249
Acknowledgments
Financial support for the research, authorship, and publication of this chapter was provided
by the John D. and Catherine T. MacArthur Foundation.
The authors would like to thank the participants of a national security round table held
at the 2017 Annual Citizen Lab Summer Institute for their insights concerning the CSE
Act and other relevant aspects of Bill C-59. They would also like to thank members of the
Communications Security Establishment for providing numerous briefings about different
aspects of the Establishment’s mandate, challenges it seeks to overcome, and how Bill C-59
might affect its practices.
The authors declare no potential conflicts of interest with respect to the research,
authorship, and/or publication of this chapter.
Notes
1 National Defence Act, RSC 1985, c N-5, ss 273.64(1)(a)–(c).
2 Office of the Communications Security Establishment Commissioner (OCSEC), “Fre-
quently Asked Questions,” 24 February 2017, https://www.ocsec-bccst.gc.ca/s56/eng/
frequently-asked-questions.
3 Ibid.
4 Bill Robinson, “Does CSE Comply with the Law?” Lux Ex Umbra (blog), 14 March 2015,
https://luxexumbra.blogspot.ca/2015/03/does-cse-comply-with-law.html; Ronald Deib-
ert, “Who Knows What Evils Lurk in the Shadows?” OpenCanada.org, 27 March 2015,
https://www.opencanada.org/features/c-51-who-knows-what-evils-lurk-in-the-shadows/;
Greg Weston, Glenn Greenwald, and Ryan Gallagher, “CSEC Used Airport Wi-Fi to
Track Canadian Travellers: Edward Snowden Documents,” CBC News, 30 January 2014,
https://web.archive.org/web/20140131064055/https://www.cbc.ca/news/politics/csec
-used-airport-wi-fi-to-track-canadian-travellers-edward-snowden-documents-1.2517881.
5 Robert Bushman, Joseph Piotroski, and Abbie Smith, “What Determines Corporate
Social Transparency?” Journal of Accounting Research 42, 2 (2004): 207; Sylvester Eijffinger and Petra Geraats, Government Transparency: Impacts and Unintended Consequences (New York: Palgrave Macmillan, 2006).
6 Roger Cotterrell, “Transparency, Mass Media, Ideology and Community,” Journal for
Cultural Research 3, 4 (1999): 414–26.
7 Archon Fung, Mary Graham, and David Weil, Full Disclosure: The Perils and Promise of
Transparency (New York: Cambridge University Press, 2007).
8 Ibid., 7; Christopher Parsons, “The (In)effectiveness of Voluntarily Produced Transpar-
ency Reports,” Business & Society 58, 1 (2019): 103–31, https://journals.sagepub.com/doi/
full/10.1177/0007650317717957.
9 Kent Wayland, Roberto Armengol, and Deborah Johnson, “When Transparency Isn’t Trans-
parent: Campaign Finance Disclosure and Internet Surveillance,” in Internet and Surveillance:
The Challenges of Web 2.0 and Social Media, edited by Christian Fuchs, Kees Boersma, Anders
Albrechtslund, and Marisol Sandoval (New York: Routledge, 2012), 239–54.
10 Hans Krause Hansen, Lars Thoger Christensen, and Mikkel Flyverbom, “Introduction:
Logics of Transparency in Late Modernity: Paradoxes, Mediation and Governance,”
European Journal of Social Theory 18, 2 (2015): 117–31.
11 Douwe Korff, Ben Wagner, Julia Powles, Renata Avila, and Ulf Buermeyer, “Boundaries
of Law: Exploring Transparency, Accountability, and Oversight of Government Surveil-
lance Regimes” (University of Cambridge Faculty of Law Research Paper No. 16/2017, 3
March 2017), SSRN, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2894490.
12 Christopher Parsons and Tamir Israel, “Gone Opaque? An Analysis of Hypo-
thetical IMSI Catcher Overuse in Canada” (Citizen Lab/Canadian Internet Policy
and Public Interest Clinic report, August 2016), https://citizenlab.org/wp-content/
uploads/2016/09/20160818-Report-Gone_Opaque.pdf; Adam Molnar, Christopher Par-
sons, and Erik Zouave, “Computer Network Operations and ‘Rule-with-Law’ in Australia,”
Internet Policy Review 6, 1 (2017): 1–14.
13 Riccardo Pelizzo and Frederick Stapenhurst, Government Accountability and Legisla-
tive Oversight (New York: Routledge, 2013), 2.
14 Andreas Schedler, “Conceptualizing Accountability,” in The Self-Restraining State: Power
and Accountability in New Democracies, edited by Andreas Schedler, Larry Diamond,
and Marc Plattner (Boulder, CO: Lynne Rienner, 1999), 13–28; Andrew Blick and
Edward Hedger, “Literature Review of Factors Contributing to Commonwealth Public
Accounts Committees Effectively Holding Government to Account for the Use of Pub-
lic Resources” (National Audit Office, Overseas Development Institute, 2008); Richard
Mulgan, “The Process of Public Accountability,” Australian Journal of Public Administration 56, 1 (1997): 26–36; Jonathan Anderson, “Illusions of Accountability,” Administrative
Theory & Praxis 31, 3 (2009): 322–39.
15 Dale Smith, The Unbroken Machine: Canada’s Democracy in Action (Toronto: Dundurn,
2017); Bruce Stone, “Administrative Accountability in the ‘Westminster’ Democracies:
Towards a New Conceptual Framework,” Governance 8, 4 (1995): 502–25.
16 Carmen Malena, Reigner Forster, and Janmejay Singh, “Social Accountability: An Intro-
duction to the Concept and Emerging Practice” (Social Development Paper 76, World
Bank, 2004).
17 See, for example, Richard Mulgan, “‘Accountability’: An Ever-Expanding Concept?” Pub-
lic Administration 78, 3 (2000): 555–73; Linda Deleon, “Accountability in a ‘Reinvented’
Government,” Public Administration 76, 3 (1998): 539–58; Amanda Sinclair, “The Cha-
meleon of Accountability: Forms and Discourses,” Accounting, Organizations and Society
20, 2–3 (1995): 219–37; David Corbett, Australian Public Sector Management, 2nd ed.
(Sydney: Allen and Unwin, 1996); James March and Johan Olsen, Democratic Gover-
nance (New York: Free Press, 1995).
18 Smith, The Unbroken Machine; Stone, “Administrative Accountability.”
19 J. Ll. J. Edwards, “Ministerial Responsibility for National Security as It Relates to the
Offices of the Prime Minister, Attorney General and Solicitor General of Canada,” in
The Commission of Inquiry Concerning Certain Activities of the Royal Canadian Mounted
Police (Ottawa: Supply and Services Canada, 1980); Donald Savoie, Breaking the Bargain:
Public Servants, Ministers, and Parliament (Toronto: University of Toronto Press, 2003).
20 Malena, Forster, and Singh, “Social Accountability.”
21 Mark Bovens, “Analyzing and Assessing Accountability: A Conceptual Framework,”
European Law Journal 13, 4 (2007): 447–68.
22 Malena, Forster, and Singh, “Social Accountability”; Maxwell McCombs, Setting the
Agenda: Mass Media and Public Opinion (Hoboken, NJ: John Wiley and Sons, 2014).
23 Bovens, “Analyzing and Assessing Accountability.”
24 Alisdair Roberts, “Transparency in the Security Sector,” in The Right to Know: Transparency for
an Open World, edited by Ann Florini (New York: Columbia University Press, 2007), 309–36.
25 Malena, Forster, and Singh, “Social Accountability.”
26 Roberts, “Transparency in the Security Sector.”
27 Jan Aart Scholte, “Civil Society and Democracy in Global Governance,” Global Gover-
nance 8, 3 (2002): 281–304; Julie Fisher, Non Governments: NGOs and the Political Devel-
opment of the Third World (West Hartford, CT: Kumarian Press, 1998).
28 Jürgen Habermas, “On the Internal Relation between the Rule of Law and Democracy,”
in The Inclusion of the Other: Studies in Political Theory, edited by Ciaran Cronin and
Pablo De Greiff (Cambridge, MA: MIT Press, 1998), 253–64; Jürgen Habermas, “Three
Normative Models of Democracy,” in Cronin and De Greiff, 239–52; Christopher Par-
sons, “Beyond Privacy: Articulating the Broader Harms of Pervasive Mass Surveillance,”
Media and Communication 3, 3 (2015): 1–11.
29 Ben Bowling and James Sheptycki, “Global Policing and Transnational Rule with Law,”
Transnational Legal Theory 6, 1 (2015): 141–73; Molnar, Parsons, and Zouave, “Computer
Network Operations.”
30 Zachary K. Goldman and Samuel J. Rascoff, “The New Intelligence Oversight,” in Global
Intelligence Oversight: Governing Security in the Twenty-First Century, edited by Zachary
K. Goldman and Samuel J. Rascoff (New York: Oxford University Press, 2016); see also
“Intelligence Reform in a Post-Snowden World,” YouTube video, 1:28:12, from a panel
hosted by the Center for Strategic and International Studies, 9 October 2015, https://
www.csis.org/events/intelligence-reform-post-snowden-world-0.
31 National Defence Act; Bill Robinson, “An Unofficial Look inside the Communications
Security Establishment, Canada’s Signals Intelligence Agency,” Lux Ex Umbra (blog), 5
November 2000, http://circ.jmellon.com/docs/html/communications_security_establishment_
unofficial_webpage_020623.html.
32 National Defence Act, ss 273.64(1)(a)–(c).
33 Communications Security Establishment (CSE), “CSEC Cyber Threat Capabili-
ties: SIGINT and ITS: An End-to-End Approach” (slide deck, October 2009), https://
christopher-parsons.com/Main/wp-content/uploads/2015/03/doc-6-cyber-threat
-capabilities-2.pdf.
34 CSE, “CASCADE: Joint Cyber Sensor Architecture,” 2011, Technology, Thoughts and
Trinkets, https://christopher-parsons.com/writings/cse-summaries/#cse-cascade-joint.
35 CSE, “CSEC Cyber Threat Capabilities”; Christopher Parsons, “Deep Packet Inspection in
Perspective: Tracing Its Lineage and Surveillance Potentials” (working paper, New Trans-
parency Project, 2008), http://www.sscqueens.org/files/WP_Deep_Packet_Inspection_
Parsons_Jan_2008.pdf.
36 CSE, “CASCADE: Joint Cyber Sensor Architecture.”
37 Ibid.
38 CSE, “Cyber Network Defence R&D Activities,” 2010, Technology, Thoughts and Trin-
kets, https://christopher-parsons.com/writings/cse-summaries/#cse-cyber-threat-capabilities;
CSE, “CASCADE: Joint Cyber Sensor Architecture.”
39 CSE, “CSEC Cyber Threat Capabilities.”
40 Deibert, “Who Knows What Evils Lurk in the Shadows?”
41 Weston, Greenwald, and Gallagher, “CSEC Used Airport Wi-Fi to Track.”
42 CSE, “CSEC Cyber Threat Capabilities.”
43 Based on discussions between the authors and senior CSE staff, we understand that in
such warranted cases, information is cordoned off from CSE’s more general reposito-
ries and thus inaccessible to many, if not all, CSE staff and operations.
44 “Defence Minister Insists Spy Agency Did Not Track Canadian Travellers,” CTV News,
31 January 2014, http://www.ctvnews.ca/canada/defence-minister-insists-spy-agency
-did-not-track-canadian-travellers-1.1664333; OCSEC, “Frequently Asked Questions.”
See also Craig Forcese, “Faith-Based Accountability: Metadata and CSEC Review,” National
Security Law: Canadian Practice in Comparative Perspective (blog), 13 February 2014,
https://www.craigforcese.com/blog/2014/2/13/faith-based-accountability-metadata
-and-csec-review.html?rq=faith-based%20accountability%3A%20metadata%20
and%20csec%20review.
45 Robinson, “Does CSE Comply with the Law?”
46 Canada, Bill C-59, An Act respecting national security matters, 1st Sess, 42nd Parl, 2017,
pt 3, ss 17–21 [Bill C-59].
47 Ibid., pt 3, s 24(1)(a).
48 Ibid., s 33(1)(b).
49 Ibid., s 41(4).
50 The tabled bill initially included a caveat: the Intelligence Commissioner is not required to first approve emergency authorizations (pt 3, s 42(2)).
51 See, for example, US Office of the Director of National Intelligence (ODNI), “Sta-
tistical Transparency Report Regarding Use of National Security Authorities for
Calendar Year 2016,” April 2016, https://www.dni.gov/files/icotr/ic_transparecy_report_
cy2016_5_2_17.pdf.
52 Kent Roach, “Review and Oversight of Intelligence in Canada: Expanding Accountabil-
ity Gaps,” in Goldman and Rascoff, Global Intelligence Oversight, 181.
53 X (Re), 2014 FCA 249 (CanLII), http://canlii.ca/t/gf63j; X (Re), [2017] 2 FCR 396, 2016
FC 1105 (CanLII), http://canlii.ca/t/gw01x.
54 Roach, “Review and Oversight of Intelligence in Canada,” 187–88.
55 Anne Dagenais Guertin, “Our Analysis of C-22: An Inadequate and Worrisome Bill,”
International Civil Liberties Monitoring Group, 30 June 2016, http://iclmg.ca/our-analysis
-of-c-22-an-inadequate-and-worrisome-bill/; Scott Newark, “Ensuring Independence
for the Parliamentary National Security Committee: A Review of Bill C-22” (Macdon-
ald-Laurier Institute publication, November 2016), http://www.macdonaldlaurier.ca/
files/pdf/MLICommentaryNewark11-16-webV2.pdf.
56 Goldman and Rascoff, “The New Intelligence Oversight,” xxix.
57 Zachary K. Goldman, “The Emergence of Intelligence Governance,” in Goldman and
Rascoff, Global Intelligence Oversight, 219.
58 Daphna Renan, “The FISC’s Stealth Administrative Laws,” in Goldman and Rascoff,
Global Intelligence Oversight, 135. Though beyond the scope of this argument, such pro-
ceedings could also include special advocates as much as possible to avoid ex parte hearings
that might lead to legal interpretations that unduly impact the civil liberties of those
affected by CSE’s surveillance operations.
59 Bill C-59, pt 1, s 3(a), as well as pt 3, ss 13–21.
60 Ibid., pt 3, s 35(1).
61 Patrick F. Walsh and Seumas Miller, “Rethinking the ‘Five Eyes’ Security Intelligence
Collection Policies and Practice Post-Snowden,” Intelligence and National Security 31, 3
(2016): 345–68, 365–66.
62 Goldman, “The Emergence of Intelligence Governance,” 220.
63 Roberts, “Transparency in the Security Sector,” 320.
14
Metadata – Both Shallow and Deep
The Fraught Key to Big Data Mass State Surveillance
Andrew Clement, Jillian Harkness, and George Raine
(apart from any information falling within paragraph (a)).”11 In the United States,
one example of metadata, “Call Detail Records,” is legally defined as “session-
identifying information (including an originating or terminating telephone
number, an International Mobile Subscriber Identity number, or an International
Mobile Station Equipment Identity number), a telephone calling card number,
or the time or duration of a call” and “does not include – (1) the contents ... of
any communication; (2) the name, address, or financial information of a sub-
scriber or customer; or (3) cell site location or global positioning system infor-
mation” (emphasis added).12
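The statutory boundary can be made concrete with a short illustrative sketch (the field names are our own shorthand, not statutory terms), in which a raw carrier record is projected onto the fields the definition includes and stripped of those it expressly excludes:

```python
# Fields within the statutory "call detail record" definition
# (session-identifying numbers, calling card number, time, duration).
CDR_FIELDS = {
    "originating_number", "terminating_number",
    "imsi", "imei", "calling_card_number",
    "call_time", "call_duration",
}

# Fields the definition expressly excludes
# (contents, subscriber identity, location).
EXCLUDED_FIELDS = {
    "contents", "subscriber_name", "subscriber_address",
    "financial_information", "cell_site", "gps",
}

def to_cdr(raw_record: dict) -> dict:
    """Project a hypothetical raw carrier record onto the CDR
    definition, dropping every field outside it."""
    return {k: v for k, v in raw_record.items() if k in CDR_FIELDS}
```

The sketch makes the "surface" reading visible: the record that survives projection carries who contacted whom and when, but nothing the statute counts as content or location.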
The differences in these varying approaches reveal how metadata can imply
very different things across varying communities of practice. In her Introduction
to Metadata 3.0, leading archival scholar Anne J. Gilliland explicates the “widely
used but frequently underspecified term” within the framework of the archival
discipline.13 She notes that the term originated in data management and that today, in practice, metadata is generally “the sum total of what one can say about any
information object at any level of aggregation.” An information object can vary
from a film or book to an email or phone call; “metadata” in this definition therefore
suggests anything one could say about these items, from a title to any salient
feature of the contents. For archivists and information managers, metadata
reflects an information object’s content, context, and structure, and enables
preservation as well as “intellectual and physical access.”14
Despite at times recognizing that communications metadata may reveal a
significant amount of personal information, media and legal definitions of
metadata tend to limit their focus to the specific types of information that can
be read from the “surface” of the information object without delving into the
object’s content. By contrast, the archival definition of metadata, as put forth
by Gilliland, acknowledges that varying levels of aggregation and detail, as well
as relationships between information objects and systems, may impact how one
defines metadata as opposed to data, or context as opposed to content.15 Gilliland
notes that these “distinctions ... can often be very fluid and may depend on how
one wishes to use a certain information object.”16
of] terabytes of low-value data ... to ... a single bit of high-value data.”19 Through
the development of mobile communications technology, an environment has
emerged in which ordinary users, often without realizing it, produce large
amounts of metadata on a daily basis.20 Access to this mass of personal data,
when analyzed through big data analytical techniques and software, allows for
broad and deep access to personal information. Arguably, this access has been
downplayed through the conventional meanings of metadata summarized above,
to the benefit of both corporate business practices and surveillance agencies.21
have a good basis for painting a reliable, if preliminary, picture of how these
agencies discuss and operationalize our topic at hand.
For our study of metadata within the FVEY, we relied extensively on the
Snowden Digital Surveillance Archive, a publicly accessible finding aid to the
full corpus of published Snowden documents and related media articles that
we designed and built, and that is now hosted by the Canadian Journalists for
Free Expression.30 From working with the documents in building the archive,
we developed strong suspicions that internally the FVEY agencies take a much
more expansive view of metadata than suggested by their public statements and
reinforced in the popular media definitions discussed above. In switching to a
research role, we sought to test our suspicions while being open to possible
disconfirming evidence. We initially made use of the archive’s full-text search,
indexing, and document description features to locate documents and stories
relatively dense in details about metadata, and then pursued thematic linkages
between documents, such as by surveillance program, to amplify the contexts
in aid of interpretation.
Metadata is evidently an important topic within the FVEY. A search in the
archive on “metadata” produced 1,644 word hits. Fourteen documents contained
“metadata” in their titles; these we examined first. The surveillance programs,
legal justifications, and internal policies mentioned therein informed further
archival searches. The domain knowledge we had gained from arranging and
describing the Snowden documents also greatly aided our initial searches in
identifying fertile points for research as well as in understanding the documents
we selected.
While we expected that exploring such a conceptually vague and varied
phenomenon as metadata would yield a mix of results, we were struck by the
breadth and heterogeneity of metadata produced by FVEY agencies. For
example, disparate GCHQ surveillance programs harvest hotel room bookings
(ROYAL CONCIERGE), social media activity (STELLARWIND), and text
message geolocation data (in partnership with the NSA under DISHFIRE).
Each program generates different types of metadata, making classification of the agencies’ handling of metadata difficult. To facilitate
comparison of the various agency interpretations of metadata with each other,
with their public statements, and with the various legal definitions in their
respective jurisdictions, we looked at the three most relevant agencies in turn –
NSA, GCHQ, and CSE.31 We also focused on those surveillance programs in
which metadata plays a particularly prominent role, notably XKEYSCORE,
which provides a front-end interface to many signals intelligence databases
around the globe and is accessed by all members of the alliance as well as by
selected third-party agencies.
Metadata – Both Shallow and Deep 259
The key phrase here is “includes all information associated with, but not includ-
ing content.” But what does this actually mean? A colloquial interpretation,
based on the caveat “but not including content,” would confine communications
metadata to the surface features, such as the oft-referenced “outside of the
envelope” data. In that case, the phrase could be reworded as “includes all information associated with the message, but excluding anything based on message content.” However, another plausible but very different interpretation would be
that metadata “includes all information that can be derived from a message and
its content, but excluding the content itself.” This version would be consistent
with metadata in archival theory and practice as described above. It would also
be consistent with the NSA’s often expansive, and secret, interpretation of its
legal mandate, thereby opening the door for the agency to justify unfettered
algorithmic analysis of communications content as metadata extraction. We
find strong support for this latter interpretation when we examine the NSA’s
most prominent analysis engine, XKEYSCORE. Compared with many other
tools and surveillance programs mentioned in the Snowden documents, import-
ant aspects of XKEYSCORE are extensively described, enabling a relatively
comprehensive understanding of its capabilities, scope, and use.
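The difference between the two readings can be sketched in a few lines (an illustration of the conceptual distinction only, using email headers as the "envelope"; none of the field choices are drawn from agency documents):

```python
import email
import hashlib

def shallow_metadata(raw: str) -> dict:
    """The colloquial reading: 'outside of the envelope' fields,
    readable without touching the body at all."""
    msg = email.message_from_string(raw)
    return {"from": msg["From"], "to": msg["To"], "date": msg["Date"]}

def derived_metadata(raw: str) -> dict:
    """The expansive reading: values computed *from* the body,
    even though the body itself is not retained."""
    msg = email.message_from_string(raw)
    body = msg.get_payload()
    return {
        # A hash and a flag are "about" the message, yet each one
        # requires algorithmic analysis of the content to produce.
        "body_sha256": hashlib.sha256(body.encode()).hexdigest(),
        "word_count": len(body.split()),
        "mentions_meeting": "meeting" in body.lower(),
    }
```

Under the second reading, every field returned by `derived_metadata` could be labelled metadata, which is precisely why the caveat "but not including content" settles so little.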
XKEYSCORE is one of the NSA’s most powerful tools and is often in demand
among its trusted “second party” (i.e., other members of the Five Eyes) and
“third party” partners.33 Access to the tool has been shared with the GCHQ,
ASD, CSE, GCSB, the Bundesnachrichtendienst (Germany), and the National Defence Radio Establishment (FRA, Sweden). Described in the Intercept as “the NSA’s
Google,”34 this tool gives analysts unprecedented access to communications
metadata largely harvested by the NSA from fibre-optic cables and cached in
over 700 servers at 150 storage sites scattered throughout the world. An unof-
ficial user’s guide to XKEYSCORE developed by Booz Allen Hamilton gives a
Figure 14.1 “Context” as metadata category in XKEYSCORE. A one-page excerpt from the
classified secret Five Eyes document titled “Guide to Using Contexts in XKS Fingerprints.” It
shows that Communications Content is considered a form of Context of Type “Scan.” | Source:
Snowden Digital Surveillance Archive, https://is.gd/n7brJJ.
Metadata in CSE
In Canada, CSE collects or, more precisely, generates metadata as part of
its mandate, through the National Defence Act, “to acquire and use infor-
mation from the global information infrastructure for the purpose of
providing foreign intelligence.”44 The act broadly defines “global information infrastructure” as “electromagnetic emissions, communications systems,
information technology systems and networks, and any data or technical
information carried on, contained in or relating to those emissions, systems
or networks.” This mandate is limited by “measures to protect the privacy
of Canadians in the use and retention of intercepted information,” as out-
lined in the Criminal Code.45
Unlike other intelligence agencies, the Canadian CSE displays its public
definition of metadata on its website:
In 2016, the director of CSE reiterated this distinction between content and
metadata-as-context in responding to a Toronto Star editorial calling for more
oversight of the agency – “Context, not content.”47 As we saw above, however,
context in the eyes of these agencies is much different from how we convention-
ally understand the term and is deeply tied to communication content. As in
the case of the GCHQ, it is difficult to square this definition with the agency’s
continued use of XKEYSCORE.
One CSE surveillance experiment in particular aptly reveals the power of
metadata in surveillance activities. The leaked document titled “IP Profiling
Analytics & Mission Impacts” describes a trial program where CSE profiled the
users of Wi-Fi networks at an international airport located on Canadian soil.48
The CBC incorrectly reported this as involving the interception of Wi-Fi signals
in the airport, but as analyzed in greater depth in Chapter 7, the actual practices
are far more disturbing. Especially revealing is the statement by John Forster,
then chief of CSE, who told the Senate committee investigating the apparent
violation of Canadian law that the experiment involved no actual monitoring
at an airport, but simply “a snapshot of historic metadata collected from the
global internet ... [as] part of our normal global collection.”49
Through comprehensive capture, analysis, and storage of Internet communi-
cation, CSE spotted visitors to the airport based on the IP address of the airport’s
public Wi-Fi service. Analysts were then able to track individuals to other
locations with identifiable IP addresses, both forward and backward in time,
based on the user IDs extracted from message content. This case illustrates not
only CSE’s expansive interpretation of metadata but also the remarkably broad
scope and fine detail of its domestic surveillance capabilities.
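The tracking logic described in this experiment can be approximated in a brief sketch (entirely hypothetical data and names; the actual analytics were far more sophisticated): given a log of (user ID, IP address, timestamp) sightings, find the IDs seen at a seed IP, then every other IP at which each ID appears, earlier or later in time:

```python
from collections import defaultdict

def build_index(sightings):
    """Index (user_id, ip, ts) sightings by user ID.
    Each sighting is a hypothetical (user_id, ip_address, timestamp) tuple."""
    by_user = defaultdict(list)
    for user, ip, ts in sightings:
        by_user[user].append((ts, ip))
    return by_user

def profile_seed_ip(sightings, seed_ip):
    """Find the user IDs seen at the seed IP (e.g., an airport Wi-Fi
    gateway), then return every other location each one appears at,
    both forward and backward in time."""
    by_user = build_index(sightings)
    seed_users = {u for u, ip, _ in sightings if ip == seed_ip}
    return {u: sorted(t for t in by_user[u] if t[1] != seed_ip)
            for u in seed_users}
```

The sketch also shows why "a snapshot of historic metadata" is no reassurance: once sightings are comprehensively retained, tracking backward in time requires no live monitoring at the airport at all.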
Implications
These conclusions hold implications for the various actors interested in mass
surveillance.
Notes
1 Canadian Journalists for Free Expression, Snowden Digital Surveillance Archive (SDSA),
https://snowdenarchive.cjfe.org.
2 Ashley Burke, “‘Difficult to Determine’ Scope of Privacy Breach in Five Eyes Data Shar-
ing,” CBC News, 23 February 2016, http://www.cbc.ca/news/politics/cse-metadata-five
-eyes-sharing-1.3459717. A nearly identical definition was used again in the following
article: Alison Crawford, “Canada’s Electronic Spy Agency to Get New Rules for Shar-
ing Data with Allies,” CBC News, 29 August 2017, http://www.cbc.ca/news/politics/
sajjan-cse-data-sharing-five-eyes-1.4265583.
3 James Ball, “NSA Stores Metadata of Millions of Web Users for Up to a Year, Secret Files
Show,” Guardian, 30 September 2013, http://www.theguardian.com/world/2013/sep/30/
nsa-americans-metadata-year-documents.
4 Office of the Privacy Commissioner of Canada (OPCC), Metadata and Privacy: A Tech-
nical and Legal Overview (Gatineau, QC: Office of the Privacy Commissioner of Canada,
2014), 9.
5 Criminal Code, RSC 1985, c C-46, pt 6, s 183, https://laws-lois.justice.gc.ca/eng/
acts/C-46/page-41.html#h-118715.
6 As quoted in Craig Forcese, “Laws, Logarithms, Liberties: Legal Issues Arising from
CSE’s Metadata Collection Initiatives,” in Law, Privacy, and Surveillance in the Post-
Snowden Era, edited by Michael Geist (Ottawa: University of Ottawa Press, 2015), 137.
7 Ibid., 137, 148.
8 OPCC, Metadata and Privacy, 10.
9 OPCC, “Backgrounder: Privacy and Canada’s National Security Framework,” 6 December
2016, https://www.priv.gc.ca/en/opc-news/news-and-announcements/2016/bg_161206/.
10 OPCC, Metadata and Privacy, 148.
11 Regulation of Investigatory Powers Act 2000, 2000 c 23, s 21.4.
Metadata – Both Shallow and Deep 267
12 USA Freedom Act, Pub L No 114-23, s 107 (“Definitions”), 129 Stat 268 (2015).
13 Anne J. Gilliland, “Setting the Stage,” in Introduction to Metadata, edited by Murtha Baca
(Los Angeles: Getty Research Institute, 2008), 1.
14 Ibid., 2.
15 Ibid., 14.
16 Ibid., 14–15.
17 David Lyon, “Surveillance, Snowden, and Big Data: Capacities, Consequences, Critique,”
Big Data and Society 1, 2 (2014): 3, 10.
18 Seref Sagiroglu and Duygu Sinanc, “Big Data: A Review,” in 2013 International Confer-
ence on Collaboration Technologies and Systems (CTS) (San Diego: Institute of Electrical
and Electronics Engineers, 2013), 43; Danyel Fisher, Rob DeLine, Mary Czerwinski, and
Steven Drucker, “Interactions with Big Data Analytics,” Interactions 19, 3 (2012): 53.
19 Fisher et al., “Interactions with Big Data Analytics,” 50.
20 Lyon, “Surveillance, Snowden,” 3; Gilliland, “Setting the Stage,” 8.
21 John Laprise, “Exploring PRISMS Spectrum: Privacy in the Information Age,” in The
Turn to Infrastructure in Internet Governance, edited by Francesca Musiani, Derrick L.
Cogburn, Laura DeNardis, and Nanette S. Levinson (New York: Palgrave Macmillan,
2016), 208, 214; Lyon, “Surveillance, Snowden,” 10.
22 Also formerly referred to as the Communications Security Establishment Canada
(CSEC). It is this now unofficial name and acronym that appear most frequently in the
Snowden documents.
23 In the mid-1970s, “the very existence of GCHQ and the [worldwide US/UK] Sigint
network were then closely guarded secrets.” Duncan Campbell, “GCHQ and Me: My
Life Unmasking British Eavesdroppers,” Intercept, 3 August 2015, https://theintercept.
com/2015/08/03/life-unmasking-british-eavesdroppers/.
24 Notably Mark Klein, William Binney, Thomas Drake, and Edward Snowden.
25 Notably James Bamford, James Risen, Eric Lichtblau, Glenn Greenwald, Laura Poitras,
Barton Gellman, and Ryan Gallagher.
26 Zach Whittaker, “NSA Is So Overwhelmed with Data, It’s No Longer Effective, Says
Whistleblower,” ZDNet, 27 April 2016, http://www.zdnet.com/article/nsa-whistleblower
-overwhelmed-with-data-ineffective/.
27 Notably the Guardian, Washington Post, Der Spiegel, Intercept, New York Times. See
SDSA, https://is.gd/ze5urh.
28 See SDSA, http://bit.ly/SnowdenArchive-Surveillance_Programs.
29 As of mid-2018, Snowden Doc Search reported a total of 2,176 documents in its searchable
database, of which 1,571 were individual articles that appeared in SIDToday, the internal news-
letter for the NSA’s Signals Intelligence Directorate, https://search.edwardsnowden.com/.
30 SDSA, https://snowdenarchive.cjfe.org.
31 We exclude Australia's ASD and New Zealand's GCSB from our treatment here because
relatively few Snowden documents relate to these partners and metadata does not
appear prominently among them.
32 This definition appears in several different documents found in the Snowden Digital
Surveillance Archive, e.g., National Security Agency (NSA), “Sharing Communications
Metadata across the U.S. Intelligence Community – ICREACH” (slide deck, 15 March
2007), SDSA, https://is.gd/9j9vRA; and “Memorandum for the Director of National
Intelligence: Sharing Communications Metadata across the Intelligence Community –
Decision Memorandum,” SDSA, https://is.gd/N1z0qR.
33 “XKEYSCORE” (slide deck, 25 February 2008), SDSA, https://is.gd/RLB6U6.
34 Morgan Marquis-Boire, Glenn Greenwald, and Micah Lee, “XKEYSCORE: NSA’s
Google for the World’s Private Communications,” Intercept, 1 July 2015, https://theintercept.
com/2015/07/01/nsas-google-worlds-private-communications/.
35 Booz Allen Hamilton, “The Unofficial XKEYSCORE User Guide,” 10, SDSA, https://
is.gd/QX8VrU.
36 “Email Address vs User Activity” (slide deck, 24 June 2009), slide 2, SDSA, https://
snowdenarchive.cjfe.org/greenstone/collect/snowden1/index/assoc/HASH0164/
d967fedd.dir/doc.pdf.
37 James Ball, “NSA Monitored Calls of 35 World Leaders after US Official Handed Over
Contacts,” Guardian, 24 October 2013, http://www.theguardian.com/world/2013/
oct/24/nsa-surveillance-world-leaders-calls.
38 “Intelligently Filtering Your Data: Brazil and Mexico Case Studies,” SDSA, https://is.gd/
ljFRcC; “3G Impact and Update” (slide deck, November 2009), SDSA, https://is.gd/
gfOjhZ.
39 “XKEYSCORE” (slide deck, 25 February 2008), slide 26, SDSA, https://is.gd/RLB6U6.
40 “Content or Metadata?” SDSA, https://snowdenarchive.cjfe.org/greenstone/collect/snowden1/
index/assoc/HASHd8b5.dir/doc.pdf. We could not find similar technical documents for the
other two major surveillance agencies.
41 Ibid.
42 Ibid.
43 Kari Rea, “Glenn Greenwald: Low-Level NSA Analysts Have ‘Powerful and Invasive’
Search Tool,” ABC News, 28 July 2013, http://abcnews.go.com/blogs/politics/2013/07/
glenn-greenwald-low-level-nsa-analysts-have-powerful-and-invasive-search-tool/.
44 National Defence Act, s 273.64.1.
45 Ibid., s 273.64.2.
46 Communications Security Establishment (CSE), “Metadata and Our Mandate,” June
2017, https://www.cse-cst.gc.ca/en/inside-interieur/metadata-metadonnees.
47 Greta Bossenmaier, letter to the editor, The Star, 3 March 2016, https://www.thestar.com/
opinion/letters_to_the_editors/2016/03/03/metadata-is-crucial-cse-insists.html.
48 Greg Weston, “CSEC Used Airport Wi-Fi to Track Canadian Travellers: Edward Snowden
Documents,” CBC News, 31 January 2014, http://www.cbc.ca/news/politics/csec-used
-airport-wi-fi-to-track-canadian-travellers-edward-snowden-documents-1.2517881.
49 Laura Payton, “Spy Agencies, Prime Minister’s Adviser Defend Wi-Fi Data Collec-
tion,” CBC News, 3 February 2014, http://www.cbc.ca/news/politics/spy-agencies-prime
-minister-s-adviser-defend-wi-fi-data-collection-1.2521166.
50 This terminology of “shallow” versus “deep” metadata is inspired in part by the similar
distinction used in the XKEYSCORE document of 25 February 2008. It also echoes the
“deep packet inspection” techniques employed by FVEY agencies in generating metadata
from intercepted communication traffic. See “XKEYSCORE” (slide deck, 25 February 2008),
slides 9 and 10, SDSA, https://snowdenarchive.cjfe.org/greenstone/collect/snowden1/
index/assoc/HASH56fe.dir/doc.pdf.
51 As quoted in Forcese, “Laws, Logarithms, Liberties,” 137.
52 For example, the OPCC’s 2014 Metadata and Privacy statement could be expanded to
make explicit the forms of deep metadata we highlight above.
Afterword
Holly Porteous
This book raises serious questions about preserving civil liberties and national
security in a big data era. As I reflected on these questions, it became apparent
to me that lingering gaps in our knowledge are forcing us to rely on assumptions
that may prove incorrect should our access to government information and
research output increase over the next few years. What I am proposing here,
building on the excellent contributions of this book's authors, is to identify areas
requiring further research, gaps in our knowledge, and places where we need to
strengthen our collective analysis.
First, delimiting our subject matter more precisely and with the necessary
nuances appears urgent. Reading this book, I was struck by its enormous breadth.
Among other things, it discusses the evolution of Canadian SIGINT collection,
the complexity of Canada’s current national security legal framework, the use
of (or failure to use) big data analytics by Canadian intelligence and law enforce-
ment agencies, mobilization of the public against proposed national security
legislation, and challenges in achieving informed consent to access personal
information.
Going forward, I believe there will be value in selecting elements of this broad
discourse for closer scrutiny. For example, do we wish to examine in greater
detail how and for what purposes the Canadian security and intelligence com-
munity and law enforcement exploit big data? If so, will our goal be to stop these
activities entirely or to identify and recommend measures to mitigate the pot-
entially negative consequences to individuals? Given the enthusiasm for algo-
rithmic approaches by some government agencies, perhaps our interest is in
ensuring that the state exploits big data more effectively to fulfill its duty to
protect its citizenry? Do we wish instead to examine and address the roles and
responsibilities of the private sector and academia in supporting and developing
Canada’s big data policy and capabilities? What about the role of big data and
cybersecurity; specifically, are we interested in examining and commenting on
the growing involvement of SIGINT agencies in defending critical infrastruc-
tures operated by the private sector? What are our views on “outsourcing” aspects
of critical infrastructure protection to private sector actors, such as telecom-
munications providers? Finally, given artificial intelligence’s evolving capabilities,
what are our views on ensuring the interrogability and reliability of currently
deployed technologies and the safety of future technologies being developed in
well-funded labs around the world? Choosing among these questions will help
us better marshal our multidisciplinary effort and resources.
Even a research agenda focused on the first question alone – how the Can-
adian security and intelligence community and law enforcement exploit big
data – suggests to me that knowledge gaps persist and our assumptions need
to be revisited. Let me present some examples to support this claim. The first
example highlights the possibility that, for a SIGINT agency such as the
Communications Security Establishment (CSE), the utility of big data tech-
niques varies according to the intelligence collection context. The purpose
of intelligence collection is generally understood to be the provision of
assessed information on the capabilities and intentions of individuals, groups,
or foreign countries to help state and law enforcement officials make deci-
sions. Capabilities are relatively easy to assess; intentions, not so much.
Humans can say one thing and do another. Their minds are essentially black
boxes. Big data’s pattern matching can offer some insight into hidden inten-
tions but, absent a reliable template for what bad intentions look like (“indi-
cators and warnings,” in the parlance), the results may not be a reliable
predictor of future actions.
The mismatch between what big data analytics can deliver and the intelligence
task at hand is a recurring theme in this book. Big data analytics, it has been
found, are largely ineffective in detecting and preventing terrorist threats. Like-
wise, algorithms may be creating more nightmares for the financial institutions
forced to use them to meet their FINTRAC (Financial Transactions and Reports
Analysis Centre of Canada) reporting requirements than for terrorist financiers.
Finally, the enthusiasm of Canadian police services for big data is not necessarily
matched by an informed understanding about how to integrate associated tools
and techniques.
The story may be different, however, for cyber threats. Readers will recall that
each of the Five Eyes SIGINT agencies, including CSE, also has a cybersecurity
mandate. That means they are in the business of producing intelligence on cyber
threats. Other Canadian agencies also have a mandated interest. For its part,
the Canadian Security Intelligence Service (CSIS) has a requirement to inves-
tigate and assess all threats to the security of Canada, including cyber threats
such as cyber espionage. The RCMP investigates criminal activities, including
cyber crime.
In contrast to terrorists working in the physical world alone (increasingly, a
near-impossible enterprise), cyber threats are bounded by the technical protocols
and characteristics of the global information infrastructure through which they
operate. In the cyber domain, it is possible – though data-intensive – to deter-
mine what normal activities look like. Therefore, it is also possible to identify
anomalous and potentially malicious traffic. Cyber threats are thus conducive
to big data analysis.
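The baseline-and-anomaly logic just described can be sketched in miniature. This is purely illustrative and is not drawn from any agency's practice: the function, the three-standard-deviation threshold, and the toy data are all assumptions, and real network defence systems operate on far richer features than hourly connection counts:

```python
# Illustrative sketch of baselining "normal" traffic and flagging anomalies.
from statistics import mean, stdev

def flag_anomalies(hourly_connection_counts, threshold=3.0):
    """Flag hours whose connection volume deviates from the baseline
    by more than `threshold` standard deviations (a simple z-score test)."""
    mu = mean(hourly_connection_counts)
    sigma = stdev(hourly_connection_counts)
    return [
        (hour, count)
        for hour, count in enumerate(hourly_connection_counts)
        if sigma and abs(count - mu) / sigma > threshold
    ]

# Half a day of ordinary traffic, with one hour of suspicious volume.
baseline = [100, 102, 98, 101, 99, 103, 97, 100, 5000, 101, 99, 102]
print(flag_anomalies(baseline))  # the 5,000-connection hour stands out
```

The point of the sketch is the data appetite it implies: even this crude test requires continuously collected traffic records to establish what "normal" looks like, which is why cyber defence and bulk collection are so tightly coupled.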
Indeed, for intelligence agencies, police, and systems security administrators
more generally, network and host defence has always been about parsing mas-
sive logs of transmissions traffic, hunting for patterns that indicate malicious
activities. Until recently, much of the work in this area has focused on recreating
events after an attack, so-called digital forensics. Now, thanks to advances in
artificial intelligence, it is increasingly realistic to speak of intrusion prevention
through near-real-time, automated responses to suspected malicious traffic.
SIGINT agencies, whose intelligence collection activities have been conducted
through the global information infrastructure for decades, are interested in
marrying their leading-edge capabilities with commercially available big data
tools to achieve intrusion prevention.
So, for cybersecurity, the big data tools are finally catching up to a long-
standing demand. With the explosive growth of the poorly secured Internet of
Things and increasingly aggressive Russian and Chinese cyber attacks, intelli-
gence officials say building automated defence capabilities into our networks
has become a “red lights blinking” undertaking. There is no choice but to fight
their algorithms with our algorithms.
Those who track public statements of Five Eyes SIGINT agency officials know
that cybersecurity now stands equal to, if not greater than, counterterrorism
among national security priorities. Yes, terrorism remains a significant concern
and big data does have some utility in this domain, but SIGINT agencies have
for some time been re-engineering their foreign intelligence collection capabil-
ities to tip and queue automated cyber defence capabilities.
Chapter 8 provides a sense of this policy trajectory. The programs it discusses
indicate that, along with hacking into adversary networks, bulk data collection
and machine learning are viewed as fundamental to national cyber defence.
If collecting bulk datasets for cybersecurity is an inescapable part of the
national security picture, then we must understand the implications. For
example, is the price of better securing our digital lifestyle a reduced expectation
of privacy and an expanded peacetime role for intelligence agencies and the
military? What role will private sector entities such as communications service
providers play in defending Canada’s cyberspace? How can we ensure that data
collected for cybersecurity purposes is stored, used, and disposed of properly?
Can our cyber defence systems be turned against us, either from within or from
without? In a related vein and drawing on observations made in Chapter 5, how
will smaller businesses that can't afford tailored algorithms protect themselves
against cyber threats? Regarding SIGINT agencies’ use of zero-days, through
what mechanisms are vulnerabilities equities issues being addressed and who
information and expertise to recognize poor practice and challenge it? Alterna-
tively, is each element capable of recognizing and supporting sound practices
in this area? If not, what measures would enhance capacity while respecting
secrecy requirements?
Regarding secrecy requirements, I would draw attention to the exceedingly
small pool of individuals in this country who can claim genuine expertise in
the operational aspects of intelligence collection. Most of these individuals are
bound to secrecy for life under the Security of Information Act and they steer
well clear of providing any public commentary that would shed light on intel-
ligence collection sources and methods. This situation presents a challenge for
those who would like to expand their knowledge.
Given the far-reaching policy decisions that Canada is poised to take regarding
a national data strategy, there may be value in examining what is
being said on this matter in our own Parliament and in the legislatures of other
allied jurisdictions. Out of this examination should come a sense of where civil
rights fit into this discussion. Now is also the time to consider what constitutes
an appropriate balance between personal data sovereignty and the public goods
that can accrue from machine learning.
Long before the Snowden leaks, legal scholars were voicing concern about
intelligence agencies’ outsourcing of data collection to the private sector. Indeed,
the two editors of this book have for many years been instrumental in educating
the public about how surveillance assemblages are being created out of these
types of linkages.
With the recent controversy over Facebook's sharing of personal data from
millions of its users' accounts with a political consulting firm that has Canadian
connections, the time is ripe for additional research on how data brokers
operate in this country. Certainly, there is a gap in our knowledge regarding
the specifics of how CSE and CSIS plan to use proposed new authorities to col-
lect and use datasets containing “publicly available” information. A key public
policy question would be whether these new authorities could incentivize
increased private sector data collection.
Of course, it appears that we are all doing our bit to help private sector service
providers collect our personal data. As this book has shown, the death of privacy
often takes place in the fine print. Our “ignoring culture” leads most of us to
hurry past consent form documentation, clicking quickly on the consent button
at the end to gain access to desired services. If being informed about the privacy
implications of gaining access to a “free” service means reading through pages
of impenetrable legalese, nobody wants to be informed. While efforts are being
made to use plain(er) language on consent forms, the documentation sweet
spot may never be found.
Perhaps part of the solution lies in the broader public discourse about the
deleterious effects of surveillance capitalism that giant social media platforms
like Facebook, Twitter, and Google have come to personify. Related to this
discussion is the question of user choice. Increasingly, citizens are being herded
towards a digital monoculture that makes no room for analogue service delivery.
Can people truly consent to sharing their personal data when, to access critical
services such as public transit, they are forced to use a smart card?
In many ways, this book is the result of a breakdown in trust. It would not
exist were it not for elements of the US national security community who did
not like what they were seeing and decided to publicly disclose what they knew.
Though they excoriated Edward Snowden as a traitor, even some senior US
national security officials have come to admit that the ensuing public debate
about previously classified matters has been necessary and useful. Here in
Canada, recent polling indicates that most Canadians trust government to
protect their privacy but still think they should be given more information about
how it collects and uses their personal information. The same polling also shows
that most Canadians don’t know that their own intelligence agencies exist, let
alone what they do.
Here, too, I see an opportunity for research, and thereby education.
Notes
Holly Porteous is an analyst with the Justice and National Security section of the Library
of Parliament’s Parliamentary Information and Research Service. The views she expresses
here are hers alone and do not reflect those of the Library of Parliament.
1 The term “vulnerabilities equities” refers to the choice between enhancing overall cyber-
security through public disclosure of previously unknown exploitable cyber vulnerabil-
ity information and protecting an operational capability by maintaining secrecy.
2 Mike Ananny and Kate Crawford, “Seeing without Knowing: Limitations of the Trans-
parency Ideal and Its Application to Algorithmic Accountability,” New Media and Society
20, 3 (2016): 973–89.
3 See Judea Pearl and Dana Mackenzie, The Book of Why: The New Science of Cause and
Effect (New York: Basic Books, 2018). See also Judea Pearl, “Theoretical Impediments
to Machine Learning with Seven Sparks from the Causal Revolution,” 11 January 2018,
arXiv:1801.04016.
Contributors
was a teaching and research assistant at the Human Rights Research and Educa-
tion Centre of the University of Ottawa. Anne also organizes with Indigenous
Solidarity Ottawa, is an advocate for consent, anti-oppression, and safer spaces,
and writes political screenplays.
Craig Forcese is a full professor at the Faculty of Law (Common Law Section),
University of Ottawa. He is also an adjunct research professor and senior fellow
at the Norman Paterson School of International Affairs, Carleton University
(from 2017 to 2022), and a National Security Crisis Law Fellow, Center on Na-
tional Security and the Law at Georgetown Law (Washington, DC) (from 2017
to 2020). Craig sits on the executive at the Canadian Network for Research on
Terrorism, Security and Society (TSAS), and is a past president of the Canadian
Council on International Law and the Canadian Association of Law Teachers.
Jillian Harkness has a master’s degree from the University of Toronto Faculty
of Information, where she worked as an archival assistant helping to build the
Snowden Digital Surveillance Archive. She continues to explore privacy, tech-
nology, and education in her current role as the Head of Library and Learning
Resources at the United World College of the Adriatic in Duino, Italy.
expansion of video surveillance, 9/11, and the Snowden revelations sit alongside
treatments of the wider meanings of surveillance seen in Surveillance as Social
Sorting (2003) or in Liquid Surveillance (with Zygmunt Bauman, 2013).
Tim McSorley is the national coordinator of the International Civil Liberties
Monitoring Group. He combines his passion for civil liberties and social jus-
tice with his background in journalism, research, and analysis in his work with
colleagues and partners to fight for social change. Previously, he was an editor
with the Media Co-op and The Dominion magazine, and served as coordina-
tor for the Voices-Voix Coalition. He is a graduate of Concordia University in
Montreal, with a degree in journalism and political science.
Adam Molnar is an assistant professor in the Department of Sociology and
Legal Studies at the University of Waterloo, Ontario, where he is also a mem-
ber of the Waterloo Cybersecurity and Privacy Institute. Prior to joining the
department in 2019, he was a lecturer in criminology at Deakin University in
Australia. He completed his PhD at the University of Victoria, British Colum-
bia, and a postdoctoral fellowship at the Queen’s University Surveillance Stud-
ies Centre. Much of his work focuses on socio-legal practices of technology-
led policing and security intelligence, which also considers the implications for
civil liberties, social justice, and the politics of associated regulatory responses.
Jeffrey Monaghan is an associate professor of criminology and sociology at
Carleton University in Ottawa. His research examines the policing of social
movements as well as broader policing and surveillance practices influenced
by the “War on Terror.” Along with Andrew Crosby, he co-authored Policing
Indigenous Movements (2018), which details how policing and other security
agencies have developed a prolific surveillance regime that targets Indigenous
movements as national security threats. He is also the co-editor, with Lucas
Melgaço, of Protests in the Information Age: Social Movements, Digital Practices
and Surveillance (2018).
David Murakami Wood is former Canada Research Chair (Tier II) in Surveil-
lance Studies and associate professor of sociology at Queen’s University. He has a
BA in modern history from Oxford and an MSc and PhD from Newcastle, on the
subject of secret Signals Intelligence bases in North Yorkshire, UK. He is a widely
published specialist in the sociology and geography of surveillance, security, and
global cities, particularly in Japan, where he was a Japan Society for the Promo-
tion of Science Fellow in 2013–14, and most recently a 2019 Japan Foundation
Fellow examining the security preparations for the Tokyo Olympics. He was co-
founder and is co-editor-in-chief of the journal Surveillance & Society, and co-
editor, with Torin Monahan, of Surveillance Studies: A Handbook (2018).
big data policing, 180–81, 187; capabilities bulk collection. See mass surveillance
of, 181, 182–83, 184; cultural challenges Bundesnachrichtendienst (Germany), 259
in, 184; empirical studies on, 181, 183, 184;
legislative and policy barriers in, 187; and cable, traffic, 91, 135, 139, 253. See also
open-source analytics, 187; resources, International Licensed Carriers (ILC);
organizational and personal barriers in, undersea cables
186–87, 189, 190; technological challenges Calgary Police Services, 181
in, 185–86. See also policing Canada Evidence Act, 44
big data surveillance, 28, 57, 73, 78–80, Canada Revenue Agency (CRA), 156, 157
121, 183; and Internet and surveillance Canada-USA Agreement (CANUSA), 8
capabilities, 101–2, 129; and security, Canadian Access to Information Act
112, 116, 118, 129, 169. See also (ATIA), 6
Canadian Security Intelligence Service Canadian Charter of Rights and Freedoms,
(CSIS); Communications Security 150, 199. See also litigation and
Establishment (CSE); Five Eyes surveillance; rule of law; Supreme
(FVEY); National Security Agency Court of Canada
(NSA); “New Analytic Model”; Canadian Overseas Telecommunications
Snowden’s revelations; surveillance; Corporation (COTC), 92
and entries starting with big data Canadian Security Intelligence Service
Bill C-13, 11, 140, 208 (CSIS), 10, 61; and big data/bulk data
Bill C-22, and CSE oversight, 245 collection, 150, 154, 164, 173, 176; and
Bill C-30, 10, 140, 205, 206, 211 collection and retention of associated
Bill C-46. See Investigative Powers for the data, 174–75; creation of, 9–10, 94;
21st Century Act and datasets acquisition, collection,
Bill C-47. See Technical Assistance for Law and usage, 161–62, 163, 174–75; and
Enforcement in the 21st Century Act datasets retention, 174–76; and
Bill C-51, 11, 48, 100, 113, 149, 199; foreign intelligence collection, 94; and
background of, 218; opposition to, information sharing under SCISA,
218–19, 221, 225, 229. See also 155–57; and information sharing between
terrorism agencies, 180; legal challenges and rule
Bill C-59, 10, 48, 106, 149, 164, 199; and of law issues involving, 150, 151, 157,
accountability of security intelligence 164, 172, 174; and money laundering,
agencies, 160–61, 170, 176; and bulk 70; and “non-collected” datasets, 152;
powers, 170, 172, 175, 176, 229; and oversight of, 152–53, 170, 176; and private
CSE’s powers, 127, 203, 204, 243; communication interception, 168;
datasets provisions in, 161, 176, 177; and powers and statutory mandates of, 10, 11,
oversight, 170, 176–77, 199, 203, 246. 151, 163, 172, 173, 270; and warrants, 150,
See also accountability; Communications 168, 173, 199. See also mass surveillance;
Security Establishment (CSE); “New Analytic Model”; security
transparency intelligence; surveillance; and entries
biometric information, 7 starting with big data
Blaney, Steven, 10 Canadian Security Intelligence Service
Booz Allen Hamilton. See Snowden, Act, 44, 150, 151, 173; and information
Edward; Snowden’s revelations collection and retention, 174–75; and
boyd, danah, 120, 182 provisions on datasets, 161–62
Brayne, Sarah, 183 CANUKUS Agreement, 90, 95
Britain-USA Agreement (BRUSA), 8 “capability transparency,” 128, 141, 143–44
British Columbia Civil Liberties CASCADE program. See Communications
Association (BCCLA), 150, 171, 198. Security Establishment (CSE): CASCADE
See also BCCLA v CSE program
Index 283
case law. See litigation and surveillance 262–63; and National Defence Act, 199–
Central Intelligence Agency (CIA), 8 200, 262; oversight of, 106, 127, 170, 172,
Chan, Janet, 182, 183 237–238, 244; private communication
Charkaoui v Canada (Citizenship and interception by, 92, 97, 99, 100, 105, 150,
Immigration), 174–75 170, 172, 199, 241; secrecy culture and, 127,
chilling effect, 62–63, 197, 226 128, 142, 143, 144; statutory mandates of,
Chrétien, Jean, 100 100, 105, 106, 127, 170, 199, 242, 243, 262;
Christopher, David, 205, 206 and Tutte Institute for Mathematics and
civil liberties. See Canadian Charter of Computing (TIMC) partnership, 112,
Rights and Freedoms; litigation and 113, 122. See also mass surveillance; “New
surveillance; rule of law
“clicktivism”, 231. See also resistance, anti-surveillance; social movements
Cold War, 8, 9; end of and shift in intelligence concerns, 97, 257; and state surveillance, 89, 90
Communications Branch of the National Research Council (CBNRC), 89, 113. See also Communications Security Establishment (CSE)
Communications Security Establishment (CSE), 9, 11, 14, 23, 113, 237; and Afghanistan, 100; and big data, 101, 112, 114, 116, 117, 118; and CASCADE program, 132, 141, 241, 242, 243; and collaboration with domestic agencies, 100, 105, 237, 243; and collaboration with private actors, 103, 113, 122, 137, 140, 243; and collaboration with the NSA and FVEY, 95, 100, 112, 140, 257; and collaboration with the RCMP, 93; Commissioner of, 99; and Communications Research Squadron, 97, 99; and cybersecurity, 106, 144, 170, 172, 243, 269, 270; and domestic surveillance, 100, 101, 126, 129–31, 171, 241; domestic surveillance capabilities of, 132, 134, 140, 170, 202, 242, 263; and EONBLUE program, 101, 132, 139, 241; growth of, 96, 97, 99, 100, 101, 104, 144; historical background of, 89, 93, 240; and INDUCTION program, 132, 241; and intelligence focus shift, 100, 112, 113–14, 115; legal challenges and issues involving, 150, 171, 172, 199, 242; and legal compliance, 99, 100, 105–6, 127, 243, 248; and mass surveillance, 129, 130–32, 134, 200, 241; and Mathematics of Information Technology and Complex Systems (MITACS), 103; and metadata, Analytic Model”; security intelligence; surveillance; and entries starting with big data
Communications Security Establishment Act, 162, 203, 204, 243. See also Bill C-59
connective action. See network theory; social movements
Convention on Cyber Crime (Council of Europe’s), 36
counterinsurgency, 8, 9
counterterrorism. See terrorism
court of law. See litigation and surveillance
Crawford, Kate, 120, 182, 272
Cray computers, 90, 95, 96, 103
crime. See financial surveillance; policing; terrorism
Criminal Code, 44, 50, 173, 255, 262, 264
cryptanalysis, 95, 96, 103, 119. See also encryption
Customs Act, 44, 45
cybersecurity, 270–71. See also Communications Security Establishment (CSE): and cybersecurity

data, personal, 3, 21, 254; “raw,” 5, 112, 120. See also entries starting with big data
data analysis, 6, 82. See also entries starting with big data
data collection, in bulk, 64, 112, 128, 149, 169; and “non-collected” datasets, 152. See also entries starting with big data and/or mass surveillance
“data exhaust,” 5, 114
“data junkies.” See big data analytics
data mining, 61, 62, 112, 154, 264
data protection, 36. See also litigation and surveillance; privacy
dataism, 5. See also entries starting with big data
284 Index
datasets, and Bill C-59, 161, 176, 177; definition of, 176. See also big data; Canadian Security Intelligence Service (CSIS); data collection
dataveillance, 69. See also entries starting with big data and/or surveillance
deep packet inspection (DPI), 130, 241, 268n50. See also big data analytics; Communications Security Establishment (CSE): CASCADE program; Communications Security Establishment (CSE): EONBLUE program; mass surveillance; metadata
Defense Advanced Research Projects Agency (DARPA), 58
Department of National Defence, 93, 113; collaboration with the NSA, 95. See also Communications Security Establishment (CSE)
digital rights, 14, 50. See also Canadian Charter of Rights and Freedoms; intelligence and law; litigation and surveillance; rule of law
“digitally networked public sphere,” 220. See also resistance, anti-surveillance; social movements
DiPuma, Caily, 199, 200, 201
Director General for Intelligence, 115. See also Communications Security Establishment (CSE)
Distant Early Warning (DEW), 9
Donahue, Laura, 43

Earl, Jennifer, 222
ECHELON program, 32, 96, 97, 257. See also Cold War; Five Eyes (FVEY)
Emergency Powers Act, 168
“emergent publics,” 220
encryption, 104, 119. See also cryptanalysis
EONBLUE program. See Communications Security Establishment (CSE): EONBLUE program
Ericson, Kevin, 46
espionage. See security intelligence
extremism. See terrorism

Facebook, 38, 130, 231; and collaboration with the NSA, 21, 102; and policing, 183, 188; and surveillance capitalism, 273, 274. See also social media
false positive/negative, 60, 62, 80
Federal Court, 150, 154, 158, 168, 173, 199; and dataset retention, 176–77. See also litigation and surveillance; rule of law
Ferguson, Andrew, 191
financial flow, 69, 73, 76–77. See also financial surveillance; terrorism: financing and financial surveillance of
financial surveillance, 69, 70, 72–73; and monitoring devices and capabilities, 76–80; and suspicion production, 78, 83–84; and terrorism, 74, 77. See also financial flow; terrorism, financing and financial surveillance of
Financial Transactions and Report Analysis Centre of Canada (FINTRAC), 68, 70, 73, 77, 79
fingerprinting, 7
First Nations. See Indigenous peoples
Five Eyes (FVEY), 8–9, 11, 23, 90, 102, 254; and data interception and analytics, 126, 254, 259; and diplomatic surveillance, 260; and mass surveillance, 151, 241, 257; post-9/11 transformation, 101, 180; subcontract on domestic surveillance, 62. See also National Security Agency (NSA); security intelligence
Forcese, Craig, 255
Foreign Intelligence Surveillance Act (FISA), 25, 244
Foreign Intelligence Surveillance Court (FISC), 48
Forster, John, 131. See also Communications Security Establishment (CSE)
Freedom of Information Act, 6
Freeze, Colin, 113

gaming metaphor, 117, 121
Gerbaudo, Paolo, 220
Gilliland, Anne J., 256
Global Affairs Canada (GAC), 156
Goldman, Zachary, 245, 248
Google, 21, 27, 30, 102, 274
Government Communications Headquarters (GCHQ), 14, 102, 135, 258; and international collaboration, 90, 91, 129, 254, 257, 259. See also Five Eyes (FVEY)
Government Communications Security Bureau (GCSB), 257
Regulation of Investigatory Powers Act (RIPA), (UK), 255, 262
Renan, Daphna, 246
resistance, anti-surveillance, 198, 220, 222, 225–28. See also Bill C-51: opposition to; Indigenous movements; International Campaign Against Mass Surveillance (ICAMS); social movements; litigation and surveillance; Stop Illegal Spying campaign; Stop Online Spying campaign
retroactive immunity, 25–26, 29, 35. See also intelligence and law
Ridgeway, Greg, 180, 181, 188
rights, digital. See digital rights
risk assessment, 180
Roach, Kent, 245
Robertson, Gordon, 91
Royal Canadian Military Police, 181
Royal Canadian Mounted Police (RCMP), 9, 10, 12, 46, 229; and collaboration with the CSE, 93; and national security, 58, 62, 180. See also McDonald Commission
Royal Canadian Mounted Police Act, 44
Royal Commission of Inquiry into Certain Activities of the RCMP. See McDonald Commission
Royal Commission on Security. See Mackenzie Commission
rule of law, 43, 105–6, 141, 150, 164; and accountability, 240, 265; and constitutional privacy/search and seizure protections, 155, 168, 171, 172, 176, 200. See also accountability; intelligence and law; litigation and surveillance; secrecy: and rule of law; transparency

sandbox. See gaming metaphor
Sanders, Carrie, 113, 183, 191
satellite, 91, 95; monitoring, 96, 241
Schott, Nicole, 183
search and seizure. See Hunter v Southam; litigation and surveillance; rule of law: and constitutional privacy/search and seizure protections; Supreme Court of Canada
secrecy, 43, 224; as a barrier to public debates on surveillance and security, 223, 273; culture of, 6, 128, 142, 143, 211, 229–30; and legitimacy of intelligence activities, 245, 247, 248, 265; and rule of law, 43, 48, 126, 141, 160, 164, 173, 240. See also accountability; rule of law; security intelligence; transparency
Secrecy Act, 32–34
securitization, 3, 49
security, 64–65, 68, 230; cameras, 63; certificates, 58; expansion of notion of, 4; and mobility, 69; and surveillance, 13, 116, 121. See also security intelligence
security, national, 3, 66, 112, 115, 118, 198. See also Canadian Security Intelligence Service (CSIS); Communications Security Establishment (CSE); security intelligence; terrorism
security intelligence, 3, 5, 6; and cooperation between domestic agencies, 12, 13, 93, 95, 100; and cooperation between states, 12, 13, 23, 30, 37, 89, 93, 95, 101, 102, 112, 180, 257, 259; and information sharing, 155–57, 180; and partnership with private actors, 21, 27–29, 73, 102, 103, 122, 137, 243, 273. See also Five Eyes (FVEY); intelligence agencies; intelligence and law; Snowden’s revelations
Security Intelligence Review Committee (SIRC), 11, 150; mandate of, 150–51; potentially unlawful data collection by the CSIS, 152–153; recommendations and guidelines proposal, 153, 162. See also accountability; transparency
Security of Canada Information Sharing Act (SCISA), 50, 155–56; and Privacy Act, 156
Security of Information Act (SOIA), 44, 143–44, 273
security thickness, 230. See also secrecy
Segerberg, Alexandra, 219, 220, 221, 227, 228
Sensenbrenner, Jim, 160
settler-colonialism, 7, 197. See also policing: and colonialism; surveillance: and colonialism
Sheptycki, James, 113
Signals Intelligence (SIGINT), 6, 8, 115, 257, 271; and CSE, 97, 99, 113, 170; and Department of National Defence, 113; and Internet traffic, 98–99, 100. See also big data analytics; Communications
Toews, Vic, 10, 205, 206
torture, 9, 62, 120n44
Total Information Awareness program, 58, 64
tracking devices, 50, 76
transoceanic cables. See undersea cables
transparency, 44, 50, 126, 141, 202, 224, 238; and government surveillance, 238, 246, 247, 265, 272; and IXmaps, 135, 137, 138; lack of, 29, 48, 50, 128, 143, 168, 192; and legality, 47, 141, 160, 246. See also accountability; “capability transparency”; secrecy; Security Intelligence Review Committee (SIRC)
Trudeau, Justin, 12
Trudeau, Pierre, 91, 94
Tufekci, Zeynep, 220
Tutte Institute for Mathematics and Computing (TIMC), 112. See also Communications Security Establishment (CSE)

UK-USA Agreement (UKUSA), 8, 32, 90; agencies, 95
undersea cables, 8, 21, 27, 90, 149; as source of intelligence, 90–91, 253. See also International Licensed Carriers (ILC); Snowden’s revelations; XKEYSCORE program
US Privacy and Civil Liberties Oversight Board (PCLOB), 158, 159, 164
US SANDKEY, 98
USA PATRIOT Act, 158, 160

van Dijck, Jose, 5
van Loan, Peter, 10
Vancouver Police Department, 181
Varcoe opinion, 168

Walsh, Patrick, 180, 248
War on Drugs, 98
War on Terror, 25; and big data analytics and surveillance, 60, 257; and surveillance legislation, 35. See also terrorism
warrant: CSIS and judicial, 173; and discretionary power, 168, 170, 171, 172, 199; judicial, 5, 25, 50, 105, 107n23, 140, 169, 171, 173; threshold for approval of, 167–68, 173, 208. See also Canadian Charter of Rights and Freedoms; litigation and surveillance; rule of law
warrantless access, 10, 140, 168, 205, 208. See also lawful access; R v Spencer; rule of law; Stop Online Spying campaign
Webb, Maureen, 209
Weston, Crystal, 183, 191
whistle-blower, 4, 6, 11, 21–25, 39, 137
Winnipeg Police Services, 181
wiretapping: illegal, 23–25, 30, 140; programs, 27, 132. See also cable; International Licensed Carriers (ILC); Snowden’s revelations; undersea cables; XKEYSCORE program

X (Re), 2016 FC 1105, 150, 154, 164, 173, 229. See also Canadian Security Intelligence Service (CSIS); litigation and surveillance; Operational Data Analysis Centre (ODAC)
XKEYSCORE program, 258; capabilities of, 259–60. See also cable; Five Eyes (FVEY); metadata; National Security Agency (NSA); undersea cables

Yahoo!, 27, 102, 130

Zedner, Lucia, 180
Zuboff, Shoshana, 5. See also surveillance capitalism