Privacy in
Mobile and Pervasive
Computing
Synthesis Lectures on Mobile
and Pervasive Computing
Editor
Mahadev Satyanarayanan, Carnegie Mellon University
Synthesis Lectures on Mobile and Pervasive Computing is edited by Mahadev Satyanarayanan of
Carnegie Mellon University. Mobile computing and pervasive computing represent major
evolutionary steps in distributed systems, a line of research and development that dates back to the
mid-1970s. Although many basic principles of distributed system design continue to apply, four
key constraints of mobility have forced the development of specialized techniques. These include:
unpredictable variation in network quality, lowered trust and robustness of mobile elements,
limitations on local resources imposed by weight and size constraints, and concern for battery
power consumption. Beyond mobile computing lies pervasive (or ubiquitous) computing, whose
essence is the creation of environments saturated with computing and communication, yet
gracefully integrated with human users. A rich collection of topics lies at the intersections of mobile
and pervasive computing with many other areas of computer science.
All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in
any form or by any means—electronic, mechanical, photocopy, recording, or any other except for brief quotations
in printed reviews, without the prior permission of the publisher.
DOI 10.2200/S00882ED1V01Y201810MPC013
Lecture #13
Series Editor: Mahadev Satyanarayanan, Carnegie Mellon University
Series ISSN
Print 1933-9011 Electronic 1933-902X
Privacy in
Mobile and Pervasive
Computing
Marc Langheinrich
Università della Svizzera Italiana (USI)
Florian Schaub
University of Michigan
Morgan & Claypool Publishers
ABSTRACT
It is easy to imagine that a future populated with an ever-increasing number of mobile and
pervasive devices that record our minute goings and doings will significantly expand the amount
of information that will be collected, stored, processed, and shared about us by both corporations
and governments. The vast majority of this data is likely to benefit us greatly—making our
lives more convenient, efficient, and safer through custom-tailored and context-aware services
that anticipate what we need, where we need it, and when we need it. But beneath all this
convenience, efficiency, and safety lurks the risk of losing control and awareness of what is known
about us in the many different contexts of our lives. Eventually, we may find ourselves in a
situation where something we said or did will be misinterpreted and held against us, even if
the activities were perfectly innocuous at the time. Even more concerning, privacy implications
rarely manifest as an explicit, tangible harm. Instead, most privacy harms manifest as an absence
of opportunity, which may go unnoticed even though it may substantially impact our lives.
In this Synthesis Lecture, we dissect and discuss the privacy implications of mobile and
pervasive computing technology. For this purpose, we not only look at how mobile and pervasive
computing technology affects our expectations of—and ability to enjoy—privacy, but also look
at what constitutes “privacy” in the first place, and why we should care about maintaining it.
We describe key characteristics of mobile and pervasive computing technology and how those
characteristics lead to privacy implications. We discuss seven approaches that can help support
end-user privacy in the design of mobile and pervasive computing technologies, and set forth
six challenges that will need to be addressed by future research.
The prime target audience of this lecture are researchers and practitioners working in mo-
bile and pervasive computing who want to better understand and account for the nuanced pri-
vacy implications of the technologies they are creating. Those new to either mobile and pervasive
computing or privacy may also benefit from reading this book to gain an overview and deeper
understanding of this highly interdisciplinary and dynamic field.
KEYWORDS
mobile computing, pervasive computing, ubiquitous computing, Internet of Things,
privacy, security, privacy-enhancing technology, privacy behavior, privacy engi-
neering
Contents
Preface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . ix
Acknowledgments . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xi
1 Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1
1.1 Lecture Goals and Overview . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
1.2 Who Should Read This . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 4
2 Understanding Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7
2.1 Codifying Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.1.1 Historical Roots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 8
2.1.2 Privacy Law and Regulations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 14
2.2 Motivating Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 21
2.2.1 Privacy Benefits . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 24
2.2.2 Limits of Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 28
2.3 Conceptualizing Privacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.1 Privacy Types . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 31
2.3.2 Privacy Constituents . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 33
2.3.3 Privacy Expectations . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 37
2.3.4 A Privacy Taxonomy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 39
2.4 Summary . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 43
6 Conclusions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 93
Bibliography . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 95
Preface
In this book, we dissect and discuss the privacy implications of mobile and pervasive computing
technology. For this purpose, we not only look at how mobile and pervasive computing technol-
ogy affects our expectations of—and ability to enjoy—privacy, but also look at what constitutes
“privacy” in the first place, and why we should care about maintaining it. The book is structured
as follows.
• Chapter 1: Introduction. This short chapter motivates the need for this book and outlines
its contents.
• Chapter 3: Mobile and Pervasive Computing (MPC). This chapter summarizes the key
defining characteristics of mobile and pervasive computing. While mobile and pervasive
computing systems feature privacy issues inherent in any computer system in general (e.g.,
interconnectivity), aspects such as context awareness and implicit interaction pose new
privacy challenges unique to mobile and pervasive computing.
• Chapter 4: Privacy Implications of MPC. This chapter explores the specific privacy im-
plications of mobile and pervasive computing in order to determine the challenges that
must be addressed in order to create more privacy-friendly mobile and pervasive com-
puting systems. It groups these around three core aspects: (1) the digitization of everyday
life; (2) the ability of automatic data capture; and (3) the ability of using data to predict
behavior. While none of these trends are new, mobile and pervasive computing systems
exacerbate these issues greatly.
• Chapter 5: Supporting Privacy in MPC. This chapter discusses seven key directions and
associated challenges for building privacy-friendly mobile and pervasive computing sys-
tems: (1) privacy-friendly defaults; (2) adequate privacy-risk communication; (3) privacy
management assistance; (4) context-adaptive privacy mechanisms; (5) user-centric privacy
controls; (6) algorithmic accountability; and (7) privacy engineering methodologies. While
there is no silver bullet to remedy all privacy implications of any mobile and pervasive
computing system, the presented approaches constitute an essential toolbox for building
privacy into mobile and pervasive computing systems.
• Chapter 6: Conclusions. This chapter provides a brief outlook and stipulates key chal-
lenges for privacy that the authors see.
Acknowledgments
It may come as no surprise that a project like this always takes longer than one originally antici-
pates. Sometimes much longer. We are thus deeply grateful to Michael Morgan, President and
CEO of Morgan & Claypool Publishers, and Mahadev “Satya” Satyanarayanan, the Mobile and
Pervasive Computing Series Editor, for their patience and unwavering support over the years.
We also greatly benefited from the helpful feedback from both Satya and Nigel Davies, who
read through countless early versions of this lecture and offered important insights on how to
make this text more accessible. All of the remaining issues in this final version are fully our fault!
We also would like to thank all the staff and students at our respective universities who
have supported us in our work, as well as our many collaborators near and far who helped shape
our research and provided us with guidance and inspiration over the years.
CHAPTER 1
Introduction
In 1999, Robert Rivera slipped on some spilled yogurt in a Vons supermarket in Southern Cali-
fornia. With a shattered kneecap as a result, Rivera sought compensation from the supermarket
chain—not only to pay for his medical bills, but also to compensate for the loss of income, as he
had to quit his job due to the injury. However, his effort to negotiate an out-of-court settlement
fell short, according to the LA Times [Silverstein, 1999], when the supermarket’s designated
mediator produced Rivera’s shopping records. Rivera was a regular Vons customer and had used
their loyalty card for several years. The mediator made it clear that should this case go to court,
Vons could use Rivera’s shopping record to demonstrate that he regularly bought large quan-
tities of alcohol—a fact that would surely weaken his case (who is to say that Rivera wasn’t
drunk when he slipped?). While Vons denied any wrongdoing, Rivera claimed that this threat
prompted him to drop the case against the company.
Shopping records are a great example of the minute details that companies are interested
in collecting about their customers. At first glance, it looks like a good deal: in exchange for
swiping a loyalty card at the checkout,1 consumers receive anywhere from small discounts to
substantial savings on their daily grocery shopping bill. The privacy implications seem negligible.
After all, the store already has a record of all items you are buying right there at checkout, so why
worry about the loyalty card that helps you save money? While the difference is not obvious, the
loyalty card allows for much more detailed data collection than just the payment transaction.
Even though it seems as if a regular credit card not issued by the store or other cashless payment
methods would be just as problematic, data flows for such cards are different: the supermarket
only receives information about successful payment, but no direct identifying information about
the customer; similarly, the credit card company learns that a purchase of a certain amount was
made at the supermarket, but not what items were purchased. Only by also swiping a loyalty
card or using a combined credit-and-loyalty card, a store is able to link a customer’s identity to
a particular shopping basket and thus track and analyze their shopping behavior over time.
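To make the difference in these data flows concrete, consider the following minimal sketch (our own illustration, not any retailer's actual system; all names, fields, and functions are hypothetical). A plain card payment leaves the store with little more than an amount, whereas each loyalty-card swipe links the full basket to a stable identifier, so that individual visits accumulate into a longitudinal profile:

```python
from __future__ import annotations

from collections import defaultdict
from dataclasses import dataclass
from datetime import date

@dataclass
class CardPayment:
    """What a plain card payment reveals to the store: an amount and a date,
    but neither the items bought nor a stable customer identity."""
    amount: float
    day: date

@dataclass
class LoyaltyPurchase:
    """What a loyalty-card swipe adds: the basket contents tied to a stable ID."""
    loyalty_id: str
    items: list[str]
    amount: float
    day: date

# Hypothetical store-side profiles: one ever-growing history per loyalty ID.
profiles: dict[str, list[LoyaltyPurchase]] = defaultdict(list)

def record_sale(payment: CardPayment, loyalty: LoyaltyPurchase | None) -> None:
    """Without a loyalty card the sale stays an anonymous amount; with one,
    the basket is linked into a long-term, analyzable purchase history."""
    if loyalty is not None:
        profiles[loyalty.loyalty_id].append(loyalty)

# Two visits by the same customer become one linked history.
record_sale(CardPayment(42.50, date(2019, 3, 1)),
            LoyaltyPurchase("card-123", ["yogurt", "wine", "wine"], 42.50, date(2019, 3, 1)))
record_sale(CardPayment(18.90, date(2019, 3, 8)),
            LoyaltyPurchase("card-123", ["fire starters"], 18.90, date(2019, 3, 8)))

# The store (or anyone it shares the data with) can now query behavior over time.
history = profiles["card-123"]
wine_count = sum(item == "wine" for visit in history for item in visit.items)
print(f"{len(history)} linked visits, {wine_count} wine purchases on record")
```

The point is not the code but the linkage: once every basket carries the same identifier, individually harmless purchases add up to exactly the kind of record that surfaced in the cases above.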
So what is the harm? Most of us might not regularly buy “large quantities” of alcohol, so
we surely would never run into the problem of Robert Rivera, where our data is used “against us”.
Take the case of the U.S.-American firefighter Philip Scott Lyons. A long-time customer of the
Safeway supermarket chain, Lyons was arrested in August 2004 and charged with attempted
arson [Schneier, 2005]. Someone had tried to set fire to Lyons’ house. The fire starter found
at the scene matched fire starters Lyons had previously purchased with his Safeway Club Card.
1 If one uses a store-issued credit card, even that extra step disappears.
Did he start the fire himself? Luckily for Lyons, all charges against him were eventually dropped
in January 2005, when another person confessed to the arson attempt. Yet for over six months,
Lyons was under heavy suspicion of having set fire to his own home—a suspicion particularly
damaging for a firefighter! A similar incident occurred in 2004 in Switzerland, when police
found a supermarket-branded contractor’s tool at the scene of a fire in the Canton of Berne. The
local court forced the corresponding supermarket chain, Migros, to release the names of all 113
individuals who had bought such a tool in their stores. Eventually, all 113 suspects were removed
from the investigation, as no single suspicion could be substantiated [20 Minuten].
In both the Safeway and the Migros cases, all customers who had bought the suspicious
item in question (fire starters and a contractor’s tool, respectively) instantly became suspects in a
criminal investigation. All were ultimately acquitted of the charges against them, although par-
ticularly in the case of firefighter Lyons, the tarnished reputation that goes with such a suspicion
is hard to rebuild. News stories tend to focus on suspects rather than less exciting acquittals—
the fact that one’s name is eventually cleared might not get the same attention as the initial
suspicion. It is also often much easier to become listed in a police database as a suspect, than
to have such an entry removed again after an acquittal. For example, until recently, the federal
police in Switzerland would only allow the deletion of such an entry if the suspect would bring
forward clear evidence of their innocence. If, however, a suspect had to be acquitted simply
through lack of evidence to the contrary—as in the case of the Migros tool—the entry would
remain [Rehmann, 2014].
The three cases described above are examples of privacy violations, even though none of
the data disclosures (Vons’ access of Robert Rivera’s shopping records, or the police access of the
shopping records in the US or in Switzerland) were illegal. In all three cases, data collected for
one purpose (“receiving store discounts”) was used for another purpose (as a perceived threat to
tarnish one’s reputation, or as an investigative tool to identify potential suspects). All supermar-
ket customers in these cases thought nothing about the fact that they used their loyalty cards to
record their purchases—after all, what should be so secret about buying liquor (perfectly legal
if you are over 21 in the U.S.), fire starters (sold in the millions to start BBQs all around the
world) or work tools? None of the customers involved had done anything wrong, yet the data
recorded about them put them on the defensive until they could prove their innocence.
A lot has happened since Rivera and Lyons were “caught” in their own data shadow—the
personal information unwittingly collected about them in companies’ databases. In the 10–15
years since, technology has continued to evolve rapidly. Today, Rivera might use his Android
phone to pay for all his purchases, letting not only Vons track his shopping behavior but also
Google. Lyons instead might use Amazon Echo2 to ask Alexa, Amazon’s voice assistant, to
order his groceries from the comfort of his home—giving police yet another shopping record to
investigate. In fact, voice activation is becoming ubiquitous: many smartphones already feature
2 Amazon Echo is an example of a class of wireless “smart” speakers that listen and respond to voice commands (see
https://www.amazon.com/echo/); Google Home is a similar product from Google (see https://store.google.com/product/google_home).
“always-on” voice commands, which means they effectively listen in on all our conversations in
order to identify a particular activation keyword.3 Any spoken commands (or queries) are sent
to a cloud server for analysis and are often stored indefinitely. Many other household devices
such as TVs and game consoles4 or home appliances and cars5 will soon do the same.
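The general pattern behind such always-on devices can be sketched as follows (a simplified, hypothetical illustration of the architecture just described, with the microphone and keyword spotter simulated; it does not reproduce the implementation of any particular product). Audio is buffered and scanned locally all the time, but once the wake word is detected, the captured speech leaves the device for cloud analysis and possible long-term storage:

```python
from __future__ import annotations

import collections
import itertools

WAKE_WORD = "alexa"      # hypothetical activation keyword
BUFFER_FRAMES = 3        # small rolling buffer kept on the device

def microphone_frames():
    """Stand-in for a continuous audio stream; simulated here as text snippets."""
    yield from ["...", "...", "alexa", "order", "fire", "starters", "..."]

def detect_wake_word(frame: str) -> bool:
    """Stand-in for an on-device keyword spotter (typically a small local model)."""
    return frame == WAKE_WORD

def send_to_cloud(frames: list[str]) -> None:
    """Stand-in for the upload step: real assistants send the captured audio to a
    server for speech recognition, where queries are often retained."""
    print("uploaded for analysis (and possibly indefinite storage):", frames)

def run_assistant() -> None:
    stream = microphone_frames()
    rolling = collections.deque(maxlen=BUFFER_FRAMES)   # always listening locally...
    for frame in stream:
        rolling.append(frame)
        if detect_wake_word(frame):
            # ...but audio only leaves the device once the keyword is spotted;
            # everything said after that point goes to the cloud.
            command = list(itertools.islice(stream, 4))
            send_to_cloud(list(rolling) + command)
            break

run_assistant()
```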
It is easy to imagine that a future populated with an ever-increasing number of mobile and
pervasive devices that record our minute goings and doings will significantly expand the amount
of information that will be collected, stored, processed, and shared about us by both corporations
and governments. The vast majority of this data is likely to benefit us greatly—making our lives
more convenient, efficient, and safer through custom-tailored services that anticipate what we
need, where we need it, and when we need it. But beneath all this convenience, efficiency, and
safety lurks the risk of losing control and awareness of what is known about us in the many
different contexts of our lives. Eventually, we may find ourselves in a situation like Rivera or
Lyons, where something we said or did will be misinterpreted and held against us, even if the
activities were perfectly innocuous at the time. Even more concerning, while in the examples we
discussed privacy implications manifested as an explicit harm, more often privacy harms manifest
as an absence of opportunity, which may go unnoticed even though it may substantially impact
our lives.
2. A large majority of researchers found that others were much more qualified (and required)
to think about privacy: “For [my colleague] it is more appropriate to think about [security and
privacy] issues. It’s not really the case in my case.”
3. Another large number of researchers thought of privacy issues simply as a problem that
could (in the end) be solved trivially: “All you need is really good firewalls.”
4. Several researchers preferred not to think about privacy at all, as this would interfere with
them building interesting systems: “I think you can’t think of privacy... it’s impossible, because
if I do it, I have troubles with finding [a] Ubicomp future.”
With such excuses, privacy might never be incorporated into mobile and pervasive sys-
tems. If privacy is believed to be impossible, someone else’s problem, trivial, or not needed, it
will remain an afterthought without proper integration into the algorithms, implementations,
and processes surrounding mobile and pervasive computing systems. This is likely to have sub-
stantial impact on the adoption and perception of those technologies. Furthermore, privacy laws
and regulation around the world require technologists to pay attention to and mitigate privacy
implications of their systems.
The prime target audience of this lecture are hence researchers and practitioners working
in mobile and pervasive computing who want to better understand and account for the nuanced
privacy implications of the technology they are creating, in order to avoid falling for the fallacies
above. A deep understanding of potential privacy implications will help in addressing them early
on in the design of new systems.
At the same time, researchers working in the areas of privacy and security in general—but
without a background in mobile and pervasive systems—might want to read this lecture in order
to learn about the core properties and the specific privacy challenges within the mobile and per-
vasive computing domains. Last but not least, graduate and undergraduate students interested
in the area might want to read this synthesis lecture to get an overview and deeper understanding
of the field.
CHAPTER 2
Understanding Privacy
In order to be able to appropriately address privacy issues and challenges in mobile and pervasive
computing, we first need to better understand why we—as individuals and as society—might
want and need privacy. What does privacy offer? How does privacy affect our lives? Why is
privacy necessary? Understanding the answers to these questions naturally helps to better un-
derstand what “privacy” actually is, e.g., what it means to “be private” or to “have privacy.” Only
by examining the value of privacy, beyond our maybe intuitive perception of it, will we be able
to understand what makes certain technology privacy invasive and how it might be designed to
be privacy-friendly.
Privacy is a complex concept. Robert C. Post, Professor of Law and former dean of the Yale
Law School, states that “[p]rivacy is a value so complex, so entangled in competing and contradictory
dimensions, so engorged with various and distinct meanings, that I sometimes despair whether it can be
usefully addressed at all” [Post, 2001]. In this chapter, we aim to untangle the many perspectives
on and motivations for privacy. In order to better understand both the reasons for—and the
nature of—privacy, we examine privacy from three perspectives. A first understanding comes
from a historical overview of privacy, in particular from a legal perspective. Privacy law, albeit
only one particular perspective on privacy, certainly is the most codified incarnation of privacy
and privacy protections. Thus, it lends itself well as a starting point. Privacy law also has a rich
history, with different approaches in different cultures and countries. The legal understanding of
privacy has also changed substantially over the years, often because of technological advances.
As we discussed in Chapter 1, technology and privacy are tightly intertwined, as technological
innovations often tend to “change the playing field” in terms of making certain data practices
and incursions on privacy possible that weren’t possible before. Our historic overview hence also
includes key moments that prompted new views on what privacy constitutes.
Our second perspective on privacy then steps back from the codification of privacy and
examines arguments for and against privacy—the motivation for protecting or curtailing privacy.
This helps us to not only understand why we may want privacy, but also what we might lose
without privacy. Is privacy something valuable worth incorporating into technology?
With both the historic backdrop and privacy motivations in mind, we then present con-
temporary conceptualizations of privacy. We will see that there are many views on what privacy
is, which can make it difficult to understand what someone is referring to when talking about
“privacy.” Precision is important when discussing privacy, in order to ensure a common under-
standing rather than arguing based on diverging perspectives on what privacy is or ought to be.
The discussion of different conceptualizations and understandings of privacy is meant to help
us evaluate the often nuanced privacy implications of new technologies.
The poorest man may in his cottage bid defiance to all the forces of the Crown.
It may be frail—its roof may shake—the wind may blow through it—the storm may
enter—the rain may enter—but the King of England cannot enter! — all his forces
dare not cross the threshold of the ruined tenement.
One of the earliest explicit definitions of privacy came from the later U.S. Supreme Court
Justice Louis Brandeis and his colleague Samuel Warren. In 1890, the two published the essay
“The Right to Privacy” [Warren and Brandeis, 1890], which created the basis for privacy tort
law2 in the U.S. legal system. They defined privacy as “the right to be let alone.” The fact that
this definition is so often quoted can probably be equally attributed to it being the first legal
text on the subject and being easily memorizable. While it encompasses in principle all of the
cases mentioned previously, such as peeping toms, eavesdroppers, and trespassers, it is still a
very limited definition of privacy. Warren and Brandeis’ definition focuses on only one particular
“benefit” of privacy: solitude. As we will see later in this chapter, privacy has other benefits
beyond solitude.
1 The common law is the legal system of many Anglo-American countries. It is based on traditions and customs, dating back
to historic England, and heavily relies on precedents. This is in contrast to “civil law” jurisdictions where judgments are
predominantly based on codified rules.
2 In common law jurisdictions, tort law governs how individuals can seek compensation for the loss or harm they experienced
due to the (wrongful) actions of others.
Probably the most interesting aspect of Warren and Brandeis’ work from today’s perspec-
tive is what prompted them to think about the need for a legal right to privacy at the end of the
19th century:
Recent inventions and business methods call attention to the next step which
must be taken for the protection of the person, and for securing to the individual
what Judge Cooley calls the right ‘to be let alone.’ …Numerous mechanical devices
threaten to make good the prediction that ‘what is whispered in the closet shall be
proclaimed from the house-tops’ [Warren and Brandeis, 1890].
Figure 2.1: The Kodak Camera. George Eastman’s “Snap Camera” made it suddenly simple to
take anybody’s image on a public street without their consent.
In this context, Warren and Brandeis’ quote of Luke 12(2–3) (in a translation slightly
different from the Bible [Carroll and Prickett, 2008]) sounds like a prescient description of
the new possibilities of mobile and pervasive computing. Clearly, neither the Evangelist Luke
nor Warren and Brandeis had anything like modern mobile and pervasive computing in mind.
In Warren and Brandeis’ case, however, it actually was a reference to a then novel technology—
photography. Before 1890, getting one’s picture taken usually required visiting a photographer
in their studio and sitting still for a considerable amount of time, otherwise the picture would
be blurred. But on October 18, 1884, George Eastman, the founder of the Eastman Kodak
Company, received U.S. Patent No. 306,594 for his invention of the modern photographic film.
Instead of having to use a large tripod-mounted camera with heavy glass plates in the studio,
everybody could now take Kodak’s “Snap Camera” (see Figure 2.1) out to the streets and take a
snapshot of just about anybody—without their consent. It was this rise of unsolicited pictures,
which more and more often found their way into the pages of the (at the same time rapidly
expanding) tabloid newspapers, that prompted Warren and Brandeis to paint this dark picture
of a world without privacy.
Today’s developments of smartphones, wearable devices, smart labels, memory amplifiers,
and IoT-enabled smart “things” seem to mirror the sudden technology shifts experienced by
Warren and Brandeis, opening up new forms of social interactions that change the way we
experienced our privacy in the past. However, Warren and Brandeis’ “right to be let alone” looks
hardly practical today: with the multitude of interactions in today’s world, we find ourselves
constantly in need of dealing with people (or rather: services) that do not know us in person
and hence require some form of personal information from us in order to judge whether such an
interaction would be beneficial. From opening a bank account, applying for credit, or obtaining a
yearly pass for trains or public transportation, to buying goods online—we constantly
have to “connect” with others (i.e., give out our personal information) in order to participate in
today’s life. Even when we are not explicitly providing information about ourselves, we constantly
leave digital traces. Such traces range from what websites we visit or what news articles we read,
to surveillance and traffic cameras recording our whereabouts, to our smartphones revealing
our location to mobile carriers, app developers and advertisers. Preserving our privacy through
isolation is just not as much of an option anymore as it was over 100 years ago.
Privacy as a Right
Warren and Brandeis’ work put privacy on the legal map, yet it took another half century before
privacy made further legal inroads. After the end of the Second World War, in which Nazi
Germany had used detailed citizen records to identify unwanted subjects of all kinds [Flaherty,
1989], privacy became a key human right across a number of international treaties—the most
prominent being the Universal Declaration of Human Rights, adopted by the United Nations
in 1948, which states in its Article 12 that [United Nations, 1948]:
No one shall be subjected to arbitrary interference with his privacy, family, home
or correspondence, nor to attacks upon his honor and reputation. Everyone has the
right to the protection of the law against such interference or attacks.
Similar protections can be found in Article 8 of the Council of Europe’s Convention of
1950 [Council of Europe, 1950], and again in 2000 with the European Union’s Charter of Fun-
damental Rights [European Parliament, 2000], which for the first time in the European Union’s
history sets out in a single text the whole range of civil, political, economic, and social rights of
European citizens and all persons living in the European Union [Solove and Rotenberg, 2003].
Article 8 of the Charter, concerning the Protection of Personal Data, states the following [Eu-
ropean Parliament, 2000].
1. Everyone has the right to the protection of personal data concerning him or her.
2. Such data must be processed fairly for specified purposes and on the basis of the consent
of the person concerned or some other legitimate basis laid down by law. Everyone has the
right of access to data which has been collected concerning him or her, and the right to
have it rectified.
The rise of the Internet and the World Wide Web in the early 1990s had prompted many
to proclaim the demise of national legal frameworks, as their enforcement in a borderless cy-
berspace seemed difficult at least.3 However, the opposite effect could be observed: at the begin-
ning of the 21st century, many national privacy laws have not only been adjusted to the technical
realities of the Internet, but also received a substantial international harmonization facilitating
cross-border enforcement.
Today, more than 100 years after Warren and Brandeis laid the foundation for modern
data protection laws, two distinctive principles for legal privacy protection have emerged: the
European approach of favoring comprehensive, all-encompassing data protection legislation that
governs both the private and the public sector, and the sectoral approach popular in the United
States that favors sector-by-sector regulation in response to industry-specific needs and concerns
in conjunction with voluntary industry self-regulation. In both approaches, however, privacy
protection is broadly modeled around what is known as “Fair Information Practice Principles.”
1. There must be no personal data record keeping systems whose very existence is secret.
3 In his 1996 “Declaration of Independence of Cyberspace,” John Barlow, co-founder of the Electronic Frontier Foundation
(EFF), declared “Governments of the Industrial World, you weary giants of flesh and steel, I come from Cyberspace, the
new home of Mind. On behalf of the future, I ask you of the past to leave us alone. You are not welcome among us. You
have no sovereignty where we gather” [Barlow, 1996].
2. There must be a way for an individual to find out what information about him is in a record
and how it is used.
3. There must be a way for an individual to prevent information about him that was obtained
for one purpose from being used or made available for other purposes without his consent.
4. There must be a way for an individual to correct or amend a record of identifiable infor-
mation about him.
2. Data Quality Principle. Personal data should be relevant to the purposes for which they are
to be used, and, to the extent necessary for those purposes, should be accurate, complete
and kept up-to-date.
3. Purpose Specification Principle. The purposes for which personal data are collected should be
specified not later than at the time of data collection and the subsequent use limited to the
fulfillment of those purposes or such others as are not incompatible with those purposes
and as are specified on each occasion of change of purpose.
4. Use Limitation Principle. Personal data should not be disclosed, made available or otherwise
used for purposes other than those specified in accordance with the Purpose Specification
principle except:
• Fourth Amendment: The right of the people to be secure in their persons, houses, papers,
and effects, against unreasonable searches and seizures, shall not be violated, and no War-
rants shall issue, but upon probable cause, supported by Oath or affirmation, and particu-
larly describing the place to be searched, and the persons or things to be seized.
• Fifth Amendment: No person shall be […] compelled in any criminal case to be a witness
against himself, nor be deprived of life, liberty, or property, without due process of law;
nor shall private property be taken for public use, without just compensation.
In addition, the Fourteenth Amendment’s due process clause has been interpreted to pro-
vide a substantive due process right to privacy.8
• Fourteenth Amendment: No state shall make or enforce any law which shall abridge the
privileges or immunities of citizens of the United States; nor shall any state deprive any
person of life, liberty, or property, without due process of law; nor deny to any person
within its jurisdiction the equal protection of the laws.
While the U.S. Constitution recognizes an individual right to privacy, the constitution
only describes the rights of citizens in relationship to their government, not to other citizens
or companies9 [Cate, 1997]. So far, no comprehensive legal privacy framework exists in the
United States that equally applies to both governmental and private data processors. Instead,
federal privacy law and regulation follows a sectoral approach, addressing specific privacy issues
that arise in certain public transactions or industry sectors [Solove and Schwartz, 2015].
Privacy with respect to the government is regulated by the Privacy Act of 1974, which
only applies to data processing at the federal level [Gormley, 1992]. The Privacy Act roughly
follows the Fair Information Principles set forth in the HEW report (mentioned earlier in this
section), requiring government agencies to be transparent about their data collections and to
support access rights. It also restricts what information different government agencies can share
about an individual and allows citizens to sue the government for violating these provisions.
Additional laws regulate data protection in other interactions with the government, such as the
Driver’s Privacy Protection Act (DPPA) of 1994, which restricts states in disclosing or selling
personal information from motor vehicle records, or the Electronic Communications Privacy
Act (ECPA) of 1986, which extended wiretapping protections to electronic communication.
8 That the Fourteenth Amendment provides a due process right to privacy was first recognized in concurring opinions of two
Supreme Court Justices in Griswold v. Connecticut. It was also recognized in Roe v. Wade 1973, which invoked the right
to privacy to protect a woman’s right to an abortion, and in Lawrence v. Texas 2003, which invoked the right to privacy
regarding the sexual practices of same-sex couples.
9 An exception is the 13th Amendment, which prohibits slavery and thus also applies to private persons.
Privacy regulation in the private sector is largely based on self-regulation, i.e., industry
associations voluntarily enact self-regulations for their sector to respect the privacy of their
customers. In addition, federal or state privacy laws are passed for specific industry sectors in
which privacy problems emerge. For instance, the Family Educational Rights and Privacy Act
(FERPA) of 1974 regulates student privacy in schools and universities; and the Children’s On-
line Privacy Protection Act (COPPA) of 1998 restricts information collection and use by web-
sites and online services for children under age 13.
The Health Insurance Portability and Accountability Act (HIPAA) of 1996 gives the De-
partment of Health and Human Services rule making authority regarding the privacy of medical
records. The HIPAA Privacy Rule requires privacy notices to patients and patient authorization for
data processing and sharing, limits data processing to what is necessary for healthcare, gives
patients data access rights, and prescribes physical and technical safeguards for health records.
Commonly, federal privacy laws are amended over time to account for evolving privacy issues.
For instance the Genetic Information Nondiscrimination Act (GINA) of 2008 limits the use of
genetic information in health insurance and employment decisions.
Privacy in the financial industry is regulated by multiple laws. The Fair Credit Reporting
Act (FCRA) of 1970 governs how credit reporting agencies can use consumer information. It
has been most recently amended by the Economic Growth, Regulatory Relief, and Consumer
Protection Act of 2018, which, as a reaction to the 2017 Equifax Data Breach, gave consumers
the right to free credit freezes to limit access to their credit reports and thus reduce the risk of
identity theft. The Gramm-Leach-Bliley Act (GLBA) of 1999 requires that financial institu-
tions store financial information in a secure manner and provide customers with a privacy notice
annually, and gives consumers the right to opt out of or limit the sharing of personal information
with third parties.
The Telephone Consumer Protection Act (TCPA) of 1991 provides remedies against repeat
telephone calls by telemarketers and created the national Do Not Call registry.10 The Controlling
the Assault of Non-Solicited Pornography And Marketing (CAN-SPAM) Act of 2003 created
penalties for the transmission of unsolicited email and requires that email newsletters and mar-
keting emails must contain an unsubscribe link. The Video Privacy Protection Act (VPPA) of
1988 protects the privacy of video rental records.
Those federal privacy laws are further complemented by state laws. For instance, many
states have passed RFID-specific legislation that prohibits unauthorized reading of RFID-
enabled cards and other devices (e.g., the state of Washington’s Business Regulation Chap-
ter 19.300 [Washington State Legislature, 2009]). The state of Delaware enacted four privacy
laws in 2015, namely the Online and Personal Privacy Protection Act (DOPPA), the Student
Data Privacy Protection Act (SDPPA), the Victim Online Privacy Act (VOPA), and the Em-
ployee/Applicant Protection for Social Media Act (ESMA).
• adverse publicity which places a person in a false light in the public eye; and
1. Expanded Coverage: As per its Article 3, the GDPR now also applies to companies outside
of the EU who offer goods or services to customers in the EU (“marketplace rule”)—the
1995 Directive only applied to EU-based companies (though it attempted to limit data
flows to non-EU-based companies).
2. Mandatory Data Protection Officers (DPO): Article 37 requires companies whose “core ac-
tivities... require regular and systematic monitoring of data subjects on a large scale” to
designate a DPO as part of their accountability program, who will be the main contact for
overseeing legal compliance.
3. Privacy by Design: Article 25 requires that all data collection and processing must now
follow a “data minimization” approach (i.e., collect only as much data as absolutely necessary),
that privacy is provided by default, and that entities use detailed impact assessment
procedures to evaluate the safety of their data processing (see the brief sketch after this list).
4. Consent: Article 7 stipulates that those who collect personal data must demonstrate that it
was collected with the consent of the data subject, and that this consent was “freely given.”
For example, if a particular piece of data is not necessary for a service, but the service is
withheld from customers who do not provide it, the resulting consent would not qualify as “freely given.”
15 Data transfer between Europe and the U.S. has been regulated by a separate agreement called the Safe Harbor Agreement,
which was later replaced by the EU-US Privacy Shield (see https://www.privacyshield.gov/) [Weiss and Archick,
2016].
5. Data Breach Notifications: Article 33 requires those who store personal data to notify na-
tional data protection authorities if they are aware of a “break-in” that might have resulted
in personal data being stolen. Article 34 extends this to also notify data subjects if the
breach “is likely to result in a high risk to the rights and freedoms of natural persons.”
6. New Subject Rights: Articles 15–18 give those whose data is collected more explicit rights,
such as the right to object to certain uses of their data, the right to obtain a copy of the
personal data undergoing processing, or the right to have personal data being deleted (“the
right to be forgotten”).
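To make the data-minimization and privacy-by-default requirements from item 3 above a bit more tangible, here is a minimal sketch (our own illustration under assumed field names and defaults, not an API mandated by the GDPR) of a sign-up handler that retains only the fields a service actually needs and leaves every optional data use disabled until the user explicitly opts in:

```python
from __future__ import annotations

from dataclasses import dataclass, field

# Only what our hypothetical service strictly needs in order to function.
REQUIRED_FIELDS = {"email", "display_name"}

@dataclass
class PrivacySettings:
    """Privacy by default: every optional processing purpose starts switched off."""
    personalized_ads: bool = False
    location_history: bool = False
    share_with_partners: bool = False

@dataclass
class Account:
    email: str
    display_name: str
    settings: PrivacySettings = field(default_factory=PrivacySettings)

def create_account(submitted: dict) -> Account:
    """Data minimization: keep only the required fields instead of storing
    whatever the sign-up form happened to send along."""
    minimized = {k: v for k, v in submitted.items() if k in REQUIRED_FIELDS}
    missing = REQUIRED_FIELDS - minimized.keys()
    if missing:
        raise ValueError(f"missing required fields: {sorted(missing)}")
    return Account(**minimized)

# Extra data (birthday, phone number, device ID) is never retained, and the
# optional purposes above stay disabled until the user explicitly opts in.
account = create_account({
    "email": "jane@example.org",
    "display_name": "Jane",
    "birthday": "1985-02-14",
    "phone": "+1-555-0100",
    "device_id": "a1b2c3",
})
print(account)
```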
How these changes will affect privacy protection in Europe and beyond will become
clearer over the coming years. When the GDPR finally came into effect in May 2018, its most
visible effect was a deluge of email messages that asked people to confirm that they still wanted
to be on a mailing list (i.e., giving “unambiguous” consent, as per Article 4) [Hern, 2018, Jones,
2018], as well as a pronounced media backlash questioning both the benefits of the regula-
tion [Lobo, 2018] as well as its (seemingly extraordinarily high) costs [Kottasová, 2018]. Many
of the new principles in the GDPR sound simple, but can be challenging to implement in prac-
tice (e.g., privacy by design, the right to erasure). We will discuss some of these challenges in
Chapter 6. Also, the above-mentioned Council of Europe “Convention for the Protection of
Individuals with regard to Automatic Processing of Personal Data” (108/81) [Council of Eu-
rope, 1981] has recently been updated [Council of Europe, 2018] and is now being touted as a
first step for non-EU countries to receive the coveted status of a “safe third country” (adequacy
assessment) [European Commission, 2017] with respect to the new GDPR [Greenleaf, 2018].
17 An active user is someone who has logged in at least once in the last 30 days.
18 As announced at Google’s 2017 I/O developer conference [Popper, 2017].
19 See https://duckduckgo.com/traffic.html.
20 As of January 2017, see https://duckduckgo.com/traffic.html.
21 According to Internetlivestats.com [2018], Google serves over 3.5 billion queries a day. Google does not publicly disclose
the number of queries they serve.
or Instagram? Why would chatting through Signal22 be any better than through WhatsApp?23
Consider the following cases.
• In 2009, U.S. Army veteran turned stand-up comedian Joe Lipari had a bad customer
experience in his local Apple store [Glass, 2010]. Maybe unwisely, Joe went home and
took out his anger via a Facebook posting that quoted a line from the movie he started
watching—Fight Club (based on the 1996 book by Palahniuk [1996]): “And this button-
down, Oxford-cloth psycho might just snap, and then stalk from office to office with
an Armalite AR-10 carbine gas-powered semi-automatic weapon, pumping round after
round into colleagues and co-workers.” Lipari posted the slightly edited variant: “Joe Li-
pari might walk into an Apple store on Fifth Avenue with an Armalite AR-10 carbine gas-
powered semi-automatic weapon and pump round after round into one of those smug, fruity
little concierges.” An hour later, a full SWAT team arrived, apparently alerted by one of
Joe’s Facebook contacts who had seen the posting and contacted homeland security. After
a thorough search of his place and a three-hour interrogation downtown, Joe assumed that
his explanation of this being simply a bad movie quote had clarified the misunderstanding.
Yet four months later, Joe Lipari was charged with two “Class D” felonies—“PL490.20:
Making a terroristic threat” [The State of New York, 2018b] and “PL240.60: Falsely re-
porting an incident in the first degree” [The State of New York, 2018a]—each carrying
prison terms of 5–10 years. Two years and more than a dozen court appearances later the
case was finally dismissed in February 2011.
• In 2012, Leigh Van Bryan and Emily Bunting, two UK residents just arriving in Los
Angeles for a long-planned holiday, were detained in Customs and locked up for 12 hours in
a cell for interrogation [Compton, 2012]. Van Bryan’s name had been placed on a “One
Day Lookout” list maintained by Homeland Security for “intending to come to the US
to commit a crime,” while Bunting was charged for traveling with him. The source of this
were two tweets Van Bryan had made several weeks before his departure. The first read
“3 weeks today, we’re totally in LA pissing people off on Hollywood Blvd and diggin’ Marilyn
Monroe up!”—according to Van Bryan a quote from his favorite TV show “Family Guy.”
The second tweet read “@MelissaxWalton free this week, for quick gossip/prep before I go and
destroy America?” Despite explaining that “destroying” was British slang for “party,” both
were denied entry and put on the next plane back to the UK. Both were also told that
they had been removed from the customary Visa Waiver program that is in place for most
European passport holders and instead had to apply for visas from the U.S. Embassy in
London before ever flying to the U.S. again [Hartley-Parkinson, 2012].
In both cases, posts on social media that were not necessarily secret, yet implicitly assumed
to be for friends only, ended up being picked up by law enforcement, who did not appreciate
22 See https://www.signal.org/.
23 See https://www.whatsapp.com/. Note that since 2016, WhatsApp supports the encryption of all information being
exchanged, though the metadata, i.e., who is chatting with whom and when, is still available to WhatsApp’s owner, Facebook.
the “playful” nature intended by the poster. Did Joe Lipari or Leigh Van Bryan do “something
wrong” and hence have “something to hide”? If not, why should they have anything to fear?
“Knowledge is power” goes the old adage, and as these two stories illustrate, one aspect
of privacy certainly concerns controlling the spread of information. Those who lose privacy will
also lose control over some parts of their lives. In some cases, this is intended. For example,
democracies usually require those in power to give up some of their privacy for the purpose
of being held accountable, i.e., to control this power. Citizens routinely give up some of their
privacy in exchange for law enforcement to keep crime at bay. In a relationship, we usually show
our trust in one another by opening up and sharing intimate details, hence giving the other
person power over us (as repeatedly witnessed when things turn sour and former friends or
lovers start disclosing these details in order to embarrass and humiliate the other).
In an ideal world, we are in control of deciding who knows what about us. Obviously,
this control will have limits: your parents ask you to call in regularly to say where you are; your
boss might require you to “punch in/out” when you arrive at work and leave, respectively; the
tax office may request a full disclosure on your bank accounts in order to compute your taxes;
and police can search your house should they have a warrant24 from a judge.
In the following two sections we look at both sides of the coin: Why do we want privacy,
and why might one not want it (in certain circumstances)? Some of the motivations for privacy
will be distilled from the privacy laws we have seen in the previous section: what do these laws
and regulations attempt to provide citizens with? What are the aims of these laws? By spelling
out possible reasons for legal protection, we can try to better frame both the values and the
limits of privacy. However, many critics argue that too much privacy will make the world a
more dangerous place. Privacy should (and does) have limits, and we will thus also look at the
arguments of those that think we should have less rather than more privacy.
24 Such a warrant should only be issued based on sufficient evidence (“probable cause”).
25 Sun Microsystems was once a key software and hardware manufacturer of Unix workstations. It was acquired by Oracle in
2009.
26 BT was formerly called British Telecom, which was the state-owned telecommunication provider in the UK.
In his book Code and other Laws of Cyberspace [Lessig, 1999], Harvard law professor
Lawrence Lessig tries to discern possible motivations for having privacy27 in today’s laws and
social norms. He lists four major driving factors for privacy.
• Privacy as empowerment: Seeing privacy mainly as informational privacy, its aim is to give
people the power to control the dissemination and spread of information about themselves.
A legal discussion surrounding this motivation revolves around the question of whether per-
sonal information should be seen as private property [Samuelson, 2000], which would
entail the rights to sell all or parts of it as the owner sees fit, or as a “moral right,” which
would entitle the owner to assert a certain level of control over their data even after they
sold it.
• Privacy as utility: From the data subject’s point of view, privacy can be seen as a utility pro-
viding more or less effective protection from nuisances such as unsolicited calls or emails,
as well as more serious harms, such as financial harm or even physical harm. This view
probably best follows Warren and Brandeis’ “The right to be let alone” definition of pri-
vacy, where the focus is on reducing the amount of disturbance for the individual, but can
also be found, e.g., in U.S. tort law (see Section 2.1.1) or anti-discrimination laws.
• Privacy as dignity: Dignity can be described as “the presence of poise and self-respect in
one’s deportment to a degree that inspires respect” [Pickett, 2002]. This not only entails
being free from unsubstantiated suspicions (for example when being the target of a wire
tap, where the intrusion is usually not directly perceived as a disturbance), but rather fo-
cuses on the balance in information available between two people: analogous to having a
conversation with a fully dressed person while being naked oneself, any relationship where
there is a considerable information imbalance will make it much more difficult for those
with less information about the other to keep their poise.
• Privacy as constraint of power: Privacy laws and moral norms to that extent can also be seen
as a tool for keeping checks and balances on a ruling elite’s powers. By limiting information
gathering of a certain type, crimes or moral norms pertaining to that type of information
cannot be effectively enforced. As Stuntz [1995] puts it: “Just as a law banning the use of
contraceptives would tend to encourage bedroom searches, so also would a ban on bedroom
searches tend to discourage laws prohibiting contraceptives” (as cited in Lessig [1999]).
Depending upon the respective driving factor, an individual might be more or less willing
to give up part of their privacy in exchange for a more secure life, a better job, or a cheaper prod-
uct. The ability of privacy laws and regulations to influence this interplay between government
and citizen, between employer and employee, and between manufacturer or service provider and
customer, creates a social tension that requires a careful analysis of the underlying motivations in
27 A similar categorization but centering around privacy harms can be found in Joyee De and Le Métayer [2016].
order to balance the protection of the individual and the public good. An example of how a par-
ticular motivation can drive public policy is anti-spam legislation enacted both in Europe [Eu-
ropean Parliament and Council, 2002] and in the U.S. [Ulbrich, 2003], which provides privacy-
as-a-utility by restricting the unsolicited sending of e-mail. In a similar manner, in March
2004 the Bundesverfassungsgericht (the German Federal Constitutional Court) ruled that a 1998 amend-
ment to Germany’s Basic Law enlarging law enforcement’s access to wire-tapping (“Der Grosse
Lauschangriff”) was unconstitutional, since it violated human dignity [Der Spiegel, 2004].
This realization that privacy is more than simply providing secrecy for criminals is funda-
mental to understanding its importance in society. Clarke [2006] lists five broad driving princi-
ples for privacy.
• Philosophical: A humanistic tradition that values fundamental human rights also rec-
ognizes the need to protect an individual’s dignity and autonomy. Protecting a person’s
privacy is inherent in a view that values an individual for their own sake.
• Psychological: Westin [1967] points out the emotional release function of privacy—
moments “off stage” where individuals can be themselves, finding relief from the various
roles they play on any given day: “stern father, loving husband, car-pool comedian, skilled lathe
operator, union steward, water-cooler flirt, and American Legion committee chairman.”
• Sociological: Societies do not flourish when they are tightly controlled, as countries such
as East Germany have shown. People need room for “minor non-compliance with social
norms” and to “give vent to their anger at ‘the system,’ ‘city hall,’ ‘the boss’:”
The firm expectation of having privacy for permissible deviations is a distin-
guishing characteristic of life in a free society [Westin, 1967].
• Economical: Clarke notes that “all innovators are, by definition, ‘deviant’ from the norms of
the time,” hence having private space to experiment is essential for a competitive economy.
Similarly, an individual’s fear of surveillance—from both private companies and the state—
will dampen their enthusiasm in participating in the online economy.
• Political: The sociological need for privacy directly translates into political effects if people
are not free to think and discuss outside current norms. Having people actively participate
in political debate is a cornerstone of a democratic society—a lack of privacy would quickly
produce a “chilling effect” that directly undermines this democratic process.
As Clarke [2006] points out, many of today’s data protection laws, in particular those
drafted around the Fair Information Principles, are far from addressing all of those benefits,
and instead rather focus on ensuring that the collected data is correct—not so much as to pro-
tect the individual but more so to ensure maximum economic benefits. The idea that privacy is
more of an individual right, a right that people should be able to exercise without unnecessary
burden, rather than simply an economic necessity (e.g., to make sure collected data is correct), is
a relatively recent development. Representative of this paradigm shift was the so-called “census-
verdict” of the German federal constitutional court (Bundesverfassungsgericht) in 1983, which
extended the existing right to privacy of the individual (Persönlichkeitsrecht) with the right of
self-determination over personal data (informationelle Selbstbestimmung) [Mayer-Schönberger,
1998].28 The judgment reads as follows.29
If one cannot with sufficient surety be aware of the personal information about oneself that is known in certain parts of one's social environment …, one can be seriously inhibited in one's freedom of self-determined planning and deciding. A society in
which the individual citizen would not be able to find out who knows what when
about them, would not be reconcilable with the right of self-determination over per-
sonal data. Those who are unsure if differing attitudes and actions are ubiquitously
noted and permanently stored, processed, or distributed, will try not to stand out
with their behavior. …This would not only limit the chances for individual devel-
opment, but also affect public welfare, since self-determination is an essential re-
quirement for a democratic society that is built on the participatory powers of its
citizens [Reissenberger, 2004].
The then-president of the federal constitutional court, Ernst Benda, summarized his private thoughts regarding the decision as follows.30
The problem is the possibility of technology taking on a life of its own, so that
the actuality and inevitability of technology creates a dictatorship. Not a dictatorship
of people over people with the help of technology, but a dictatorship of technology
over people [Reissenberger, 2004].
The concept of self-determination over personal data31 constitutes an important part of
European privacy legislation with respect to ensuring the autonomy of the individual. First, it
extends the Fair Information Principles with a participatory approach, which would allow the
individual to decide beyond a “take it or leave it” choice over the collection and use of his or her
personal information. Second, it frames privacy protection no longer only as an individual right,
but emphasizes its positive societal and political role. It casts privacy not as an individual fancy, but as an obligation of a democratic society, as Julie Cohen notes:
Prevailing market-based approaches to data privacy policy …treat preferences for informational privacy as a matter of individual taste, entitled to no more (and often much less) weight than preferences for black shoes over brown, or red wine over white. But the values of informational privacy are far more fundamental. A degree of freedom from scrutiny and categorization by others promotes important non-instrumental values, and serves vital individual and collective ends [Cohen, 2000].
28 The finding was triggered by the controversy surrounding the national census announcement on April 27, 1983, which chose the unfortunate wording "Totalzählung" and thus resulted in more than a hundred constitutional appeals (Verfassungsbeschwerden) to the federal constitutional court [Reissenberger, 2004].
29 Translation by the authors.
30 Translation by the authors.
31 Often abbreviated to data self-determination.
The GDPR additionally includes a number of protection mechanisms that are designed
to strengthen the usually weak bargaining position of the individual. For example, Article 9 of
the GDPR specifically restricts the processing of sensitive information, such as ethnicity, religious
beliefs, political or philosophical views, union membership, sexual orientation, and health, unless
for medical reasons or with the explicit consent of the data subject.
32 In its Article 23, the GDPR allows member states to limit the applicability of the Regulation in order to safeguard, e.g.,
national or public security.
33 In May 2001, a judge in Texas ordered 21 convicted sex offenders not only to post signs in their front yards, but also place
bumper stickers on their cars stating: “Danger! Registered Sex Offender in Vehicle” [Solove and Rotenberg, 2003].
lifelong imprisonment in order to prevent any repeated offenses.34 A similar lifelong-custody mechanism passed a public referendum in Switzerland in 2004: before a sex offender is released from prison, psychologists must assess their likelihood of relapse. Those with a negative prognosis are then taken directly into lifelong custody.
But it is not only violent crime and homeland security that make people wonder whether the effort spent on protecting personal privacy is worth it. Mundane everyday data in particular, such as shopping lists or one's current location—things that usually manifest themselves in public (in contrast to, say, one's diary, or one's bank account balance and transactions)—seems to warrant no protection whatsoever. In many cases, collecting such data means added convenience, increased savings, or better service for the individual: using detailed consumer shopping profiles, stores will be able to offer special discounts, send only advertisements for items that really interest a particular customer, and provide additional information that is actually relevant to an individual. And, as Lessig remarks, any such data collection is not really about any individual at all: "[N]o one spends money collecting these data to actually learn anything about you. They want to learn about people like you" [Lessig, 1999].
What could be some of the often cited dangers of a transparent society then? What would
be the harm if stores had comprehensive profiles on each of their customers in order to provide
them with better services?
One drawback of more effective advertising is the potential for manipulation: if, for example, one is identified as a mother of teenagers who regularly buys a certain breakfast cereal, a targeted advertisement for a competitor's brand at half the price (or with twice as many loyalty points) might win the kids' favor, thus prompting the mother to switch to the potentially more expensive product (with a higher profit margin). A similar example of "effective advertising" can be found in the Cambridge Analytica scandal of 2018 [Lee, 2018, Meyer, 2018], which saw a political data firm harvest the private profiles of over 50 million Facebook users (mostly without their knowledge) to create "psychographic" profiles that were then sold to several political campaigns (the 2016 Trump campaign, the Brexit "Leave" campaign) in order to target online ads. Presumably, such information allowed those campaigns to identify
voters most open to their respective messages. Profiles allow a process that sociologist David
Lyon calls social sorting [Lyon, 2002]:
The increasingly automated discriminatory mechanisms for risk profiling and
social categorizing represent a key means of reproducing and reinforcing social, eco-
nomic, and cultural divisions in informational societies [Lyon, 2001].
This has implications at both the individual and the societal level. For democratic societies, a thoroughly profiled population exposed to highly targeted political ads may become increasingly divided [The Economist, 2016]. At the individual level, the benefits of profiling depend on one's existing economic and social status. For example, since a small percentage of customers
34 Another problem with this approach is its broad application toward any "sex-offenses:" in some states, this also puts adult homosexuals or underage heterosexual teenagers having consensual sex on such lists.
(whether in supermarkets or airline ticket sales) typically accounts for a large percentage of profits,35 using consumer loyalty cards or frequent flyer miles would allow vendors to more accurately determine whether a certain customer is worth fighting for, e.g., when having to decide if a consumer complaint should receive fair treatment.
This might not only lead to withholding information from customers based on their profiles, but also to holding this information against them, as the example of Ron Rivera in Chapter 1 showed. In a similar incident, a husband's preference for expensive wine, well documented in his supermarket profile, allowed his wife to claim higher alimony after subpoenaing the profile in court. Even if such examples pale in comparison to the huge number of transactions recorded every day worldwide, they nevertheless indicate how this massive collection of mundane everyday facts, which will only increase through the use of mobile and pervasive computing, can ultimately add a significant burden to our lives, as Lessig explains:
The burden is on you, the monitored, first to establish your innocence, and sec-
ond, to assure all who might see these ambiguous facts, that you are innocent [Lessig,
1999].
This silent reversal of the classical presumption of innocence can lead to significant dis-
advantages for the data subject, as the examples of comedian Joe Lipari (page 23), UK couple Leigh Van Bryan and Emily Bunting (page 23), and firefighter Philip Scott Lyons (page 1) have shown. Another example of the sudden significance of these profiles is the fact that shortly af-
ter the September 11 attacks, FBI agents began collecting the shopping profiles and credit card
records of each of the suspected terrorists in order to assemble a terrorist profile [Baard, 2002].36
First reports of citizens who were falsely accused, e.g., because they shared a common name with
a known terrorist [Wired News] or had a similar fingerprint [Leyden, 2004], illustrate how dif-
ficult it can be for an individual to contest findings from computerized investigative tools.
Complete transparency, however, may also help curb governmental power substantially,
according to David Brin, author of the book “The Transparent Society” [Brin, 1998]. In his
book, Brin argues that losing our privacy can ultimately also have advantages: while up to now only the rich and powerful have been able to spy on common citizens at will, new technologies would enable even ordinary individuals to "spy back," to "watch the watchers" in a
society without secrets, where everybody’s actions could be inspected by anybody else and thus
could be held accountable, where the “surveillance” from above could now be counteracted by
“sousveillance” from below [Mann et al., 2003].
Critics of Brin point out that “accountability” is a construct defined by public norms and
thus will ultimately lead to a homogenization of society, where the moral values of the majority
35 The Guardian cites IBM analyst Merlin Stone as saying "In every sector, the top 20% of customers give 80% of the profit" [Guardian].
36 Interestingly enough, the main shopping characteristic of the suspected terrorists wasn't a preference for Middle Eastern food, but rather a tendency to order home-delivery pizza and pay for it by credit card.
will threaten the plurality of values that forms an integral part of any democracy, simply by
holding anybody outside of the norm “accountable” [Lessig, 1999].
The ideal level of privacy can thus take very different shapes, depending on what is technically feasible and socially desirable. The positions raised above can be summarized as follows.
1. Communitarian: Personal privacy needs to be curbed for the greater good of society (trust-
ing the government). Democratic societies may choose to appoint trusted entities to over-
see certain private matters in order to improve life for the majority.
2. Convenience: The advantages of a free flow of information outweigh the personal risks in most cases. Only highly sensitive information, like sexual orientation, religion, etc., might be worth protecting. Semi-public information like shopping habits, preferences, contact information, and even health information might better be publicly known so that one can enjoy the best service and protection possible.
3. Egalitarian: If everybody has access to the same information, it ceases to be a weapon in the hands of a well-informed few. Only when the watchers are being watched is the information they hold about an individual matched by the information the individual holds about them. Eventually, new forms of social interaction will evolve that are built upon these symmetrical information assets.
4. Feasibility: What can technology achieve (or better: prevent)? All laws and legislation re-
quire enforceability. If privacy violations are not traceable, the much-stressed principle of accountability (as developed in the fair information practices) becomes moot.
Figure 2.2: Core Privacy Types and Today’s Challenges. Across history, privacy concerns shifted as
technology made things possible that were not possible before. The four broad types of privacy—
bodily, territorial, communication, and information—are being challenged by new technology:
online profiling, social networking, activity tracking, and biological profiling.
affordable and commonplace,38 which means that biological profiling will challenge our bodily
privacy like never before.
Figure 2.3: Westin's Privacy States (based on Westin [1967]).
Westin [1967] defines four privacy states, or "experiences": Solitude, Intimacy, Reserve, and Anonymity.
Solitude is the “positive” version of loneliness, the act of “being alone without being lonely.”
Solitude plays an important role in psychological well-being, offering benefits such as freedom,
creativity, and spirituality [Long and Averill, 2003].
Intimacy is probably a concept as complex as privacy, as it may refer to “feelings, to ver-
bal and nonverbal communication processes, to behaviors, to people’s arrangements in space, to
personality traits, to sexual activities, and to kinds of long-term relationships” [Reis et al., 1988].
Yet, it is clear that intimacy is an essential component of forming the types of close relationships [Levinger and Raush, 1977] that are vital to our psychological well-being [Baumeister and Leary, 1995]. Gerstein [1978] argues that "intimacy simply could not exist unless people had the opportunity for privacy." Paradoxically, Nisenbaum [1984] finds that even
solitude often helps create intimacy, by prompting “feelings of connection with another per-
son” [Long and Averill, 2003].
Anonymity provides us with what Rössler [2001] calls decisional privacy: “securing the in-
terpretational powers over one’s life.” The freedom to decide for oneself “who do I want to live with;
which job to take; but also: what clothes do I want to wear.” Anonymity thus helps to ensure the
autonomy of the individual, protecting one’s independence in making choices central to person-
hood.
The interplay between solitude, intimacy, and anonymity (and hence autonomy) ultimately
shapes our identity. Westin describes this as follows.
Each person is aware of the gap between what he wants to be and what he actually
is, between what the world sees of him and what he knows to be his much more
complex reality. In addition, there are aspects of himself that the individual does not
fully understand but is slowly exploring and shaping as he develops [Westin, 1967].
According to Arendt [1958], privacy is essential for developing an individual identity be-
cause it facilitates psychological and social depth, and protects aspects of a person’s identity that
cannot withstand constant public scrutiny—be it publicly chastised preferences and practices,
or silly tendencies and other behavior people self-censor when being observed.
Autonomy allows us to be what we want to be; intimacy and solitude help us to explore and shape our "complex reality," be it through intimate exchange with others or in moments spent alone. This not only connects with the previously mentioned emotional release function of privacy, but also with what Westin [1967] calls the "safety-valve" function of privacy, e.g., the "minor non-compliance with social norms" and the ability to "give vent to their anger at 'the system,' 'city hall,' 'the boss'":
The firm expectation of having privacy for permissible deviations is a distinguish-
ing characteristic of life in a free society [Westin, 1967].
In that sense, privacy protects against extrinsic and intrinsic losses of freedom [Nis-
senbaum, 2009, p. 75]. Nissenbaum argues that “privacy is important because it protects the di-
versity of personal choices and actions, not because it protects the freedom to harm others and commit
crimes” [Nissenbaum, 2009, p. 77].
Other privacy constituents can be drawn from its implementation, i.e., how current social,
legal, and technical solutions attempt to provide privacy to citizens, customers, employees, or
users. We can roughly group these approaches into three categories: secrecy, transparency, and
control.
• Secrecy is often equated with privacy, in particular in security scholarship, where privacy is
simply another word for confidentiality. At the outset, it seems as if privacy without secrecy
does not make sense: if others know our information, we have lost our privacy, surely?
Such thinking implies a binary nature of privacy: we either have it or do not have it, based
on the knowledge of others. Similar to Warren and Brandeis’ conception of privacy as
“the right to be let alone,” such a binary view is neither realistic nor practical. When I
confide something in a trusted friend in private, I surely do not expect my disclosure to invalidate my privacy. I instead expect my friend to hold this information in confidence, making it a shared secret between the two of us rather than public information. I expect my doctor, who knows a lot about my health, to keep this information private—in fact, many professionals are bound by law to keep the private information of others secret (e.g., lawyers, clerics). Note that in these cases, secrecy does not entirely vanish; it just includes more people who share the same confidential information. An interesting corner case of secrecy is anonymity, often also called unlinkability. While confidentiality makes it hard for others to learn a certain piece of secret information, unlinkability removes that information's connection to a specific person. As is the case with confidentiality, unlinkability is not a binary value but comes in many different shades. Data minimization is the bridge between the two: it aims to ensure that only those data elements are collected that are essential for a particular purpose—all non-essential information is simply not collected (see the sketch following this list).
• Transparency can be seen as the flip side of secrecy. If secrecy prevents others from knowing
something about me, transparency allows me to know what others know about me. In its
simplest form, transparency requires notice: informing someone about what data is being
collected about them, or why this information is being collected, or what is already known
about them. Transparency is often the most basic privacy requirement, as it thwarts secret
record keeping. While individuals may have no say about their data being collected, they
may at least understand that their information is being recorded and act accordingly (and
thus retain their autonomy). Obviously, transparency by itself does not mean that one’s
privacy is being protected—the fact that one spots a nosy paparazzo taking photos from
afar does not help much once one’s secret wedding pictures appear in a tabloid paper.
• Control is often the key ingredient that links secrecy and transparency: if people can freely
control "when, how, and to what extent information about them is communicated to others" [Westin, 1967], they can decide who they want to take into their confidence and when they would like to keep information private or even remain anonymous. A practical (and
minimal) form of control is “consent,” i.e., a person’s affirmative acknowledgment of a
particular data collection or data processing practice. Consent may be implicit (e.g., by
posting a sign “video surveillance in progress at this premise” by the door through which
visitors enter) or explicit (e.g., by not starting an online data collection practice unless a
person has checked a box in the interface). Implicit and explicit consent are thus seen as giving individuals a choice: if they do not want their data collected, they have the option of not selecting a particular option, or simply not proceeding beyond a particular point (either physical or in a user interface). Together, notice and choice offer what Solove [2013]
calls “privacy self-management.” Such control tools form the basis for many privacy laws
worldwide, as we discussed in Section 2.1. While control seems like the ideal “fix” for
enabling privacy, the feasibility of individuals effectively exercising such control is ques-
tionable. Solove [2013] notes that individuals are often ill-placed to make privacy choices:
“privacy decisions are particularly susceptible to problems such as bounded rationality, the
availability heuristic, and framing effects because privacy is so complex, contextual, and
difficult to conceptualize.” In most cases, people simply “lack enough background knowl-
edge to make an informed choice,” or “there are simply too many entities that collect, use,
and disclose people’s data for the rational person to handle.”
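To make the data minimization and consent mechanisms described in the list above more concrete, the following Python sketch gates each data attribute on a declared purpose and an affirmative opt-in. All field names, purposes, and values are hypothetical illustrations, not a reference implementation of any particular law or system.

# Minimal sketch: explicit consent plus data minimization before collection.
# Field names, purposes, and the consent set are hypothetical.
ESSENTIAL_FIELDS = {
    "order_fulfillment": {"name", "shipping_address"},   # needed to deliver goods
    "product_recommendations": {"purchase_history"},     # optional personalization
}

def collect(profile: dict, purpose: str, consented_purposes: set) -> dict:
    """Return only the fields essential for a purpose the user explicitly opted into."""
    if purpose not in consented_purposes:
        return {}  # no affirmative consent for this purpose: collect nothing
    allowed = ESSENTIAL_FIELDS.get(purpose, set())
    # Data minimization: drop every attribute that is not essential for the stated purpose.
    return {key: value for key, value in profile.items() if key in allowed}

user_profile = {"name": "Alice", "shipping_address": "...",
                "purchase_history": ["..."], "religion": "..."}  # sensitive, never essential here
print(collect(user_profile, "order_fulfillment", {"order_fulfillment"}))
# -> {'name': 'Alice', 'shipping_address': '...'}

The point of the sketch is merely that consent and minimization are enforced before any data leaves the subject, rather than being checked after the fact.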
With these constituents in mind, Gavison [1984] goes on to define privacy as comprising "solitude, anonymity, and control," while Simmel [1968] puts it similarly, yet expands somewhat on Gavison.
Privacy is a concept related to solitude, secrecy, and autonomy, but it is not syn-
onymous with these terms; for beyond the purely descriptive aspects of privacy as iso-
lation from the company, the curiosity, and the influence of others, privacy implies a
normative element: the right to exclusive control of access to private realms [Simmel, 1968].
Contrary to Westin and Rössler, Gavison and Simmel describe privacy not as an independent notion, but rather as an amalgam of a number of well-established concepts, something that constitutes itself only through a combination of a range of factors. While Westin also relates privacy to concepts such as solitude, group seclusion, anonymity, and reserve [Cate, 1997], he calls them privacy states, indicating that these are merely different sides of the same coin.
• Contexts: Contexts describe the general institutional and social circumstances (e.g., health-
care, education, family, religion, etc.) in which information technology is used or infor-
mation exchange takes place. Contexts also include the activities in which actors (in dif-
ferent roles) engage, as well as the purposes and goals of those activities (Nissenbaum calls
this values). Contexts and associated informational norms can be strictly specified or only
sparsely and incompletely defined—for example, the procedure of voting vs. an informal business meeting. Expectations of confidentiality are clearly defined in the first case, but less clear in the second. People often engage in multiple contexts at the same time, which can be associated with different, potentially conflicting informational norms (for instance, talking about private matters at work).
• Actors: Actors are senders, receivers, and information subjects who participate in activities.
Actors fill specific roles and capacities depending on the contexts. Roles define relation-
ships between various actors, which express themselves through the level of intimacy, ex-
pectations of confidentiality, and power dynamics between actors. Informational norms
regulate information flow between actors.
• Attributes: Attributes describe the type and nature of the information being collected,
transmitted, and processed. Informational norms render certain attributes appropriate or
inappropriate in certain contexts. The concept of appropriateness serves to describe which actions and information practices are acceptable (a minimal formalization is sketched after this list).
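Nissenbaum's notion of appropriateness lends itself to a simple formalization, sketched below in Python: an informational norm is modeled as a tuple of context, sender role, receiver role, and attribute, and a concrete flow counts as appropriate only if a norm of its context sanctions it. The contexts, roles, and attributes are invented for illustration, and the sketch deliberately ignores transmission principles and other refinements of the full theory.

# Sketch: judging an information flow against contextual informational norms.
# Contexts, roles, and attributes are illustrative; the full theory is richer than this.
from typing import NamedTuple

class Flow(NamedTuple):
    context: str    # e.g., "healthcare"
    sender: str     # role of the sending actor
    receiver: str   # role of the receiving actor
    attribute: str  # type of information transmitted

NORMS = {
    Flow("healthcare", "patient", "physician", "medical_history"),
    Flow("workplace", "employee", "manager", "work_schedule"),
}

def appropriate(flow: Flow) -> bool:
    """A flow is appropriate if an informational norm of its context sanctions it."""
    return flow in NORMS

print(appropriate(Flow("healthcare", "patient", "physician", "medical_history")))  # True
print(appropriate(Flow("workplace", "employee", "manager", "medical_history")))    # False: norm violation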
Figure 2.4: Four categories of privacy issues, based on Solove's privacy taxonomy [Solove, 2008].
Information Collection
Solove distinguishes two types of information collection: surveillance and interrogation. Surveil-
lance is the passive observation of the data subject by others. He argues that while not all ob-
servations disrupt privacy, continuous monitoring does. Surveillance disrupts privacy because
people may feel anxious and uncomfortable and even alter their behavior when they are being
watched. Covert surveillance has the additional effect that it creates a power imbalance because
the data subject can be observed without being able to see the observer.
Bentham’s Panopticon purposefully leverages this effect in the architectural design of a
prison [Bentham, 1787]. Bentham designed the panopticon as a circular prison with cells on
the outside that are open toward the middle. A guard tower at the center is fitted with small
window slits facing in all directions. Thus, guards could watch any prisoner at any time, while inmates would not know when they were actually being watched.
In a less Orwellian sense, surveillance also includes privacy issues caused by incidental
observations, such as private information visible on a screen or someone observing a typed pass-
word, also known as shoulder surfing [Schaub et al., 2012]. Hawkey and Inkpen [2006] inves-
tigate the dimensions of incidental information privacy.
In contrast, interrogation constitutes active information collection. The data subject is
directly exposed to an inquisitive party, which may pressure the data subject to disclose details.
Similar to surveillance, less evocative interrogation issues also occur in common situations, for example when a questionnaire or a registration form asks for more information than required, or when social pressure leads to revealing information one would otherwise have kept private.
Information Processing
Solove’s second category, information processing, contains five potential privacy harms, which
all occur after information has been collected, and therefore without direct involvement of the
data subject: aggregation, identification, insecurity, secondary use, and exclusion.
Aggregation of information about one person facilitates profiling. While such aggregation
can have benefits, it often violates the data subjects’ expectations in terms of what others should
be able to find out about them. However, the effects of aggregation are typically less direct,
because the data has already been collected previously. The main issue is that multiple innocuous
pieces of information gain privacy sensitivity when combined.
Identification is the process of linking some information to a specific individual, some-
times also called re-identification or de-anonymization. Presumably anonymized data may con-
tain sufficient information to link the provided information back to the individual. For instance,
Sweeney [2002] showed that zip code, gender, and date of birth provided in U.S. census data
are sufficient to uniquely identify 87% of the U.S. population. The risk of de-anonymization has
also been demonstrated in the context of presumably anonymized medical records [Benitez and
Malin, 2010], genome research data [Malin and Sweeney, 2004], location traces [Gruteser and
Hoh, 2005], and home/work location pairs [Golle and Partridge, 2009].
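Sweeney's result is easy to reproduce in spirit. The Python sketch below counts how many records in a small, made-up dataset are unique on the quasi-identifier triple of ZIP code, gender, and date of birth; any record that is unique on this triple can be linked against an outside source (such as a public voter registry) that contains the same attributes plus a name. The records and field names are illustrative assumptions.

# Sketch: measuring re-identifiability via the quasi-identifiers ZIP code, gender, and birth date.
# Records are hypothetical; Sweeney [2002] found this triple unique for ~87% of the U.S. population.
from collections import Counter

records = [
    {"zip": "48109", "gender": "F", "dob": "1985-03-12", "diagnosis": "..."},
    {"zip": "48109", "gender": "M", "dob": "1990-07-01", "diagnosis": "..."},
    {"zip": "48104", "gender": "F", "dob": "1985-03-12", "diagnosis": "..."},
]

def quasi_id(record):
    return (record["zip"], record["gender"], record["dob"])

counts = Counter(quasi_id(r) for r in records)
unique = [r for r in records if counts[quasi_id(r)] == 1]
print(f"{len(unique)} of {len(records)} records are unique on (zip, gender, dob)")
# A unique record can be joined with any identified dataset sharing these attributes,
# re-identifying the supposedly anonymous "diagnosis" entry.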
The lack of proper security (“insecurity”) of data processing and stored data is also a sig-
nificant privacy risk, because it facilitates identity theft and distortion when information about
an individual is more readily accessible by unauthorized entities than it should be. Therefore,
all discussed data protection frameworks place a strong emphasis on the security of collected
personal data.
The term “secondary use” describes any use of collected information beyond the purposes
for which it was collected. As the data subject, by definition, did not consent to the secondary use, it violates privacy expectations. The main issue here is that data subjects might have provided different information had they been aware of the secondary use.
Solove calls the lack of appropriate means for data subjects to learn about the existence of collected personal data, to access this data, and to rectify it "exclusion." Exclusion runs contrary to the data protection principles of participation and transparency.
Information Dissemination
Solove’s third category, information dissemination, summarizes seven privacy issues that concern
the further disclosure or spread of personal information.
Breach of confidentiality is the violation of trust in a specific relationship by revealing se-
cret information associated with that relationship. Disclosure is the dissemination of true in-
formation about a data subject without consent. It violates the data subject’s information self-
determination. Disclosure can adversely affect the data subject’s reputation. Distortion is similar
to disclosure with the difference that false or misleading information about a person is being
willfully disseminated, often with the intention of harming that person’s reputation.
Exposure is very similar to disclosure, but Solove notes that it pertains to revealing physical
or emotional attributes about a person, such as information about the person’s body and health.
Thus, exposure violates bodily privacy and affects the person’s dignity rather than reputation.
Increased accessibility does not directly disclose information to any specific party but makes
it generally easier to access aggregated information about an individual. Although information
might have been previously publicly available, aggregation and increased accessibility increase
the risk of actual disclosure.
Blackmail is the threat to expose private information if the blackmailer’s demands are not
met. Blackmail is a threat of disclosure enabled by a power imbalance created by information ob-
tained by the blackmailer. Appropriation, on the other hand, is the utilization of another person’s
identity for one’s own benefit. It is sometimes also referred to as exploitation.
Invasion
Solove’s fourth category is concerned with privacy invasion, featuring two privacy issues: intru-
sion and decisional interference. While Solove’s other three categories mainly deal with informa-
tion privacy, invasion does not involve personal information directly.
Intrusion is the violation of someone’s personal territory, however that territory may be
defined. One can intrude in someone’s physical territory or private space, but intrusion can also
pertain to disrupting private affairs. Such “realms of exclusion” [Solove, 2008] facilitate interaction
with specific people without interference, and also exist in otherwise public environments, e.g.,
having a private conversation at a restaurant.
Decisional interference is a privacy issue where governmental regulations interfere with
the freedom of personal decisions and self-determination (Rössler’s concept of decisional pri-
vacy), e.g., in the case of sexuality or religious practices. Solove argues that these are not merely
issues of autonomy but are strongly associated with information privacy. The risk of potential
disclosure can severely inhibit certain decisions of an individual.
Solove’s taxonomy provides a comprehensive framework to reason about types of privacy
violations. His goal was to facilitate the categorization of privacy violations in specific cases in order to support appropriate legal regulations and rulings. He deliberately ignores individual privacy preferences in his taxonomy because, according to him, it is virtually impossible to protect individual, varying privacy expectations in a legal framework [Solove, 2008, p. 70]. However, he recognizes
that personal privacy preferences play an important role in shaping individual expectations of
privacy.
2.4 SUMMARY
If there is anything this chapter should have demonstrated, it is that privacy is a complex
concept—hiding many different meanings in many different situations in a simple seven-letter
word. Without a clear understanding of what people expect from “having privacy,” we can-
not hope to create technology that will work accordingly, offering “privacy-aware” or “privacy-
friendly" behavior. Law offers a good starting point for this exploration,40 as it contains a society-sanctioned codification of privacy (such as the GDPR [European Parliament and Council, 2016]) that goes beyond the views and opinions of individual scholars. However,
understanding the raison d’être behind these laws—Why do they exist? What function do they
serve in society?—offers further insights that help us shape the solution space for privacy-aware
technology: privacy means empowerment, dignity, utility, a constraint of power [Lessig, 1999];
privacy functions as an emotional release, a human right, a staple of democracy, or as a driver for
innovation [Clarke, 2006, Westin, 1967]. Last but not least, we discussed various conceptualizations of privacy from the literature (e.g., privacy constituents such as solitude, intimacy, reserve, anonymity, autonomy, and control) that further rounded out the many possible uses for and benefits of privacy, and how the actions of others can affect it (e.g., Solove's privacy taxonomy).
40 Note that our discussion of privacy law in this chapter is only cursory. An excellent overview of U.S. privacy legislation
can be found, e.g., in Gormley [1992], Solove [2006], and Solove and Schwartz [2018]. For an international perspective,
Bygrave [2014] offers a detailed discussion, while Greenleaf [2014] focuses explicitly on Asia, and Burkert [2000] on Europe.
Voss [2017] and De Hert and Papakonstantinou [2016] specifically focus on the GDPR. Greenleaf [2017] offers a current
overview of over 120 national and transnational privacy laws.
CHAPTER 3
3.3 SUMMARY
Modern computing systems in general are characterized by a high degree of interconnectivity.
However, mobile and pervasive computing systems differ significantly from "traditional" computers (e.g., a laptop or a desktop computer) in seven ways.
1. Novel form factors: Computers now not only come as powerful smartphones but are also
embedded in clothing and toys, making it possible to have them with us almost 24 h a day.
3. Always-on sensing: Modern sensors not only use less power than ever before, but also include sophisticated digital signal processors that provide application developers with usable high-level context information (e.g., indoor and outdoor location, physical activity).
4. Software ecosystems: App stores have revolutionized the way we distribute and consume
software. Never before was it easier to bring new software to millions of users, yet users now
need to be better trained to understand the implications of installing untrusted programs.
5. Invisibly embedded: The low costs of computing, communication, and sensing systems have made it possible to make rooms, buildings, and even entire cities "smart." As this ideally happens without distractions (e.g., blinking lights), it will become increasingly difficult to tell augmented (i.e., computerized) from un-augmented spaces.
6. Implicit interaction: The ubiquity of computers has made it possible to create “invisible”
assistants that observe our activities and proactively provide the right service at the right
time. Obviously, this requires detailed and comprehensive observations.
7. Ubiquitous coverage: Embedding computing from small-scale (e.g., in our blood stream)
to large scale (e.g., across an entire metropolitan area) significantly increases both vertical
and horizontal coverage (i.e., across time and space) of our lives.
While today's interconnectivity certainly forms the starting point for most current privacy issues, the characteristics discussed in this chapter are of particular interest when looking at
the privacy implications of mobile and pervasive computing. The next chapter will discuss these
implications in detail.
CHAPTER 4
1 “Media break” is a term from (German) business informatics that describes a missing link in the information flow of a
business.
2 For example, Apple Continuity (https://www.apple.com/macos/continuity/) enables activity transitions among dif-
ferent Apple devices.
3 For instance, NetVends (http://www.netvends.com) offers such remote vending solutions.
Modern sensor technology is key for eliminating media breaks. Today’s sensors use less
and less power, making it possible to both embed them into ever smaller packages (as they do not
need a large battery) and to run them for longer periods of time. For instance, positioning infor-
mation used to be available only through the use of power-hungry GPS sensors, which meant
that consumers had to choose between battery life and detailed localization. Today, those GPS sensors have not only become much more power efficient, but they are also used together with accelerometers to detect when the device is actually moving and hence when position information needs to be updated. Power-efficient WiFi chipsets complement GPS-based
location information—in particular indoors—by using WiFi fingerprinting technology [Husen
and Lee, 2014]. Some sensors can also harvest the required power from the measurement pro-
cess itself, making it possible to forego a battery completely.4 Alternatively, infrastructure-based
sensors “piggyback” onto the power grid or plumbing of a house and infer occupancy information
or individual device use simply by observing consumption patterns [Cohn et al., 2010, Froehlich
et al., 2009, Gupta et al., 2010, Patel et al., 2007, 2008].
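The interplay between a cheap, always-on motion sensor and an expensive positioning sensor can be sketched as a simple duty-cycling loop: the accelerometer is sampled continuously, and the GPS receiver is only activated while movement is detected. The sensor interfaces, thresholds, and intervals in the following Python sketch are assumptions made up for illustration; real platforms expose similar sensor-fusion functionality through their own location APIs.

# Sketch: accelerometer-gated location updates, a common way to save power on mobile devices.
# FakeAccelerometer/FakeGPS stand in for real drivers; thresholds and intervals are illustrative.
import random
import time

MOTION_THRESHOLD = 0.3   # acceleration variance above which the device counts as "moving"
GPS_INTERVAL_S = 30      # how often to take a (power-hungry) GPS fix while moving

class FakeAccelerometer:
    def read_variance(self):
        return random.uniform(0.0, 1.0)   # stand-in for a real low-power sensor reading

class FakeGPS:
    def get_fix(self):
        return (46.0037, 8.9511)          # stand-in latitude/longitude

def track(accelerometer, gps, duration_s=120):
    last_fix = None
    end = time.time() + duration_s
    while time.time() < end:
        if accelerometer.read_variance() > MOTION_THRESHOLD:
            last_fix = gps.get_fix()      # device is moving: a fresh fix is worth the power
            time.sleep(GPS_INTERVAL_S)
        else:
            time.sleep(1)                 # device is stationary: keep the GPS off, reuse last fix
    return last_fix

print(track(FakeAccelerometer(), FakeGPS(), duration_s=5))

Real operating systems implement far more sophisticated fusion, but the basic idea is the same: a low-power sensor decides when the power-hungry, high-resolution sensor needs to run.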
8 In the U.S., the social security number is a de-facto universal identifier that is used as an authentication token in many
situations. Knowing a person’s name and social security number is usually enough to impersonate that person in a range of
situations (e.g., opening accounts, obtaining a credit card, etc.) [Berghel, 2000].
4.3.2 PRIVACY IMPLICATIONS
Information aggregation and comprehensive profiling have multiple potential privacy implica-
tions. An obvious issue is that profiles may contain incorrect information. Factual information
may have been inaccurately captured, associated with the wrong person due to identical or simi-
lar names, or may have been placed out of context as part of aggregation. Inferences made from
collected data may be incorrect and potentially misrepresent the individual (see, e.g., Charette
[2018]). Inaccurate information in an individual’s profile may just be “a nuisance,” such as being
shown improperly targeted ads, but consequences can also be dire, such as having to pay a higher
premium for health insurance, a higher interest rate for a loan, being denied insurance, or being
added to a no-fly list. Some options exist to correct inaccurate information. Credit bureaus pro-
vide mechanisms to access and correct credit reports; online data brokers often provide access
to one's profile and may allow for corrections [Rao et al., 2014]. However, correcting inaccurate information can still be difficult or even impossible, because it is often hard for an individual to determine the original source of the misinformation, especially if it is used and disseminated by multiple data brokers.
Even if information aggregated about an individual is correct, privacy issues arise. Based
on a profile’s information individuals may be discriminated against in obvious as well as less per-
ceptible ways. Price discrimination is a typical example. For instance, the online travel agency
Orbitz was found to display more expensive hotel and travel options to Apple users than Win-
dows users based on the transferred browser information [Mattioli, 2012]. Acquisti and Fong
[2014] studied hiring discrimination in connection with candidate information available on on-
line social networks. They found that online disclosures of personal traits, such as being Christian or Muslim, significantly impacted whether a person was invited for a job interview. It is imagin-
able that inferences about an individual’s health based on prescription drug use, shopping history
(e.g., weight loss pills or depression self-help books) or other indicators may also lead to dis-
crimination. A key approach to addressing the risks stemming from profiling is thus to disclose information about the algorithm(s) used to rank or classify an individual (see Section 5.6).
Large-scale collection and aggregation of information can also lead to inadvertent disclo-
sure of some information about individuals that they would have preferred to keep private. For
instance, Jernigan and Mistree [2009] used social network analysis to predict with high accuracy
a person’s sexual orientation based on their friends’ sexual orientation disclosed on online social
networks. Information shared in mobile messaging apps about when a user is active or “avail-
able to chat” is sufficient to infer a user’s sleep times, chat partners, and activities [Buchenscheit
et al., 2014]. Mobile devices and laptop computers continuously send service announcements
in order to enable interconnectivity and seamless interaction between devices, but may also leak
a user’s presence and identity when devices are connected to open wireless networks [Könings
et al., 2013].
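The kind of inference described by Buchenscheit et al. [2014] requires surprisingly little machinery: given only the "last active" timestamps a messaging app reveals to contacts, the longest overnight gap between activity events is a good proxy for a user's sleep window. The timestamps in the following Python sketch are made up for illustration.

# Sketch: inferring a likely sleep window from the "online" timestamps a messaging app exposes.
# Timestamps are hypothetical; real presence data could be gathered by polling a contact's status.
from datetime import datetime

activity = sorted([
    datetime(2018, 5, 3, 7, 2),   datetime(2018, 5, 3, 8, 15),
    datetime(2018, 5, 3, 12, 40), datetime(2018, 5, 3, 18, 5),
    datetime(2018, 5, 3, 23, 30), datetime(2018, 5, 4, 6, 55),
])

# The longest gap between consecutive activity events is the candidate sleep window.
gap, start, end = max((b - a, a, b) for a, b in zip(activity, activity[1:]))
print(f"Likely asleep from {start:%H:%M} to {end:%H:%M} ({gap.total_seconds() / 3600:.1f} h)")
# -> Likely asleep from 23:30 to 06:55 (7.4 h)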
4.4 SUMMARY
Three fundamental trends in mobile and pervasive computing have a significant impact on our
privacy: the digitization of our everyday life; the continuous data capture with the help of sensors;
and the construction of detailed profiles. While none of these trends are new, mobile and per-
vasive computing exacerbate these issues greatly. As a consequence, we have an ever increasing
amount of information captured about us, often well beyond what is directly needed. The ability
to use advanced sensing to collect and record minute details about our lives forms the basis for
detailed personal profiles, which, with the help of data mining techniques and machine learn-
ing, provide seemingly deep insights into one’s personality and psyche. At the same time, this
information allows for unprecedented levels of personalized systems and services, allowing us to
manage an ever-increasing amount of information at an ever-increasing pace. Left unchecked, however, these powerful services may make us vulnerable to theft, blackmail, coercion, and social injustice. This requires us to carefully balance the amount of "smartness" in a system with usable and useful control tools that fit into our social and legal realities.
CHAPTER 5
1. Privacy provides both individual and societal benefits. Privacy is not a “nice to have” that we
might want to trade in for better shopping experiences or higher efficiency—it is a core re-
quirement of democratic societies that thrive on the individuality of their citizens. Neither
is privacy just sought by criminals, delinquents, or deviants, while “honest” citizens have
"nothing to hide." Privacy supports human dignity, empowers individuals, and can provide checks and balances on powerful societal actors, such as governments and corporate entities.
However, supporting and enforcing privacy comes with costs that individuals often do not
or cannot bear—Solove calls this the “consent dilemma” of privacy self-management. We
need structural support—legal, technical, and social—in order to enable privacy in mobile
and pervasive computing.
Figure 5.1: An example of privacy explanations around a mobile permission dialog from Google's Material Design Guidelines: an on-screen dialog explains to the user why the Notes app wants to record audio (left) before the audio recording permission request is shown (center); if the user has denied the permission, the app provides information on how to enable the feature should it be desired later on (right). Image source: Google [2017].
Many of today's privacy notices do not provide this opportunity. For instance, privacy policies are lengthy documents written in legal language that offer few or no choices regarding specific data practices. Individuals are confronted with a take-it-or-leave-it choice: accept the
whole privacy policy or do not use the service. In reality, this does not provide a meaningful
choice to users [Cate, 2010]. People are forced to accept the privacy policy or terms of service
in order to gain access to the desired service—regardless of their actual privacy preferences. If
there are no granular choices associated with a privacy notice, individuals have little incentive
to read it [Schaub et al., 2017]: it is not worth investing the time to read and understand a
privacy policy if there are no real actions one can take, if the policy can change at any time (most privacy policies include a provision to that effect), and if the policy is intentionally abstract and
ambiguous about what data practices a user is actually subject to [Bhatia et al., 2016, Reidenberg
et al., 2015, 2016]. Abstract and ambiguous descriptions in privacy policies are a consequence
of the trend that many companies try to consolidate all their services under a single privacy
policy. While this makes it easier for the company to provide its privacy-related information in
one place and apply consistent practices across its services, it also means that the privacy policy
necessarily has to be abstract and generic in order to cover all services' data collection, use, and sharing practices. As a result, it is often not clear how a specific service covered by the privacy policy actually collects, processes, or shares personal information.
Vague privacy notices and policies leave individuals helpless and resigned [Madden, 2014,
Turow et al., 2015]. While individuals might care about their privacy, the choice architectures they are presented with force them to accept practices they do not necessarily agree with or may not even be aware of, because the only other option is to completely abstain from the benefits the
service or technology might provide to them [Acquisti et al., 2017, Cate, 2010, Schaub et al.,
2017].
The importance of providing actionable privacy information in order to obtain informed
consent has been recognized by researchers [Cate, 2010, Cranor, 2012, Schaub et al., 2017]
and policy makers [Federal Trade Commission, 2012, 2015, President's Council of Advisors on
Science and Technology, 2014]. Europe’s General Data Protection Regulation therefore places
strong requirements on how consent has to be obtained, mandating that consent must be freely
given, explicit, and specific [European Parliament and Council, 2016]. However, the challenge
is to provide individuals with real and granular privacy choices without overwhelming them with
those choices. Just exposing more and more opt-ins, opt-outs, and privacy settings to users will
not scale, as it will just overwhelm individuals with choices and place the burden for privacy man-
agement solely on them [Solove, 2013]. Obtaining meaningful consent without overwhelming
users is particularly challenging in mobile and pervasive computing, given the often implicit
and imperceptible nature of data collection via sensors embedded into devices and the environ-
ment [Luger and Rodden, 2013b], as well as the fact that privacy is commonly not the user’s
primary task or motivation in using a system [Ackermann and Mainwaring, 2005].
For instance, mobile permissions already require individuals to make dozens if not hun-
dreds of privacy decisions for all the resources the apps on their smartphones want to access.
Based on a sample of 3,760 Android users, Malmi and Weber [2016] found that Android users in 2015 used 83 smartphone apps at least once per month, on average. Given that each app requests five permissions on average [Pew Research Center, 2015], this results in over 400 permission decisions that a typical smartphone user would have to make. Expanding the smartphone
permission approach to online services, wearables, smart home systems and other pervasive com-
puting systems, as well as all data collection, use, sharing, and retention practices would result
in a plethora of privacy settings, choices, and permissions. Even the most interested user would
not be able to consistently manage their privacy across all those choices and contexts [Liu et al.,
2016, Schaub, 2014, Schaub et al., 2015], as well as over time [Luger and Rodden, 2013a].
A potential way forward lies in privacy management assistance solutions that aim to help users more effectively manage their privacy. For instance, machine learning can be leveraged to
create personalized privacy assistants or privacy agents, which learn from an individual's privacy decisions and either provide recommendations for future decisions (in the same context or across contexts) or even automate privacy decisions for the user [Kelley et al., 2008, Knijnenburg and
Kobsa, 2013, Liu et al., 2016, Sadeh et al., 2009, Schaub et al., 2015]. The personalized pre-
diction of privacy settings and preferences has received considerable attention [Cornwell et al.,
2007, Cranshaw et al., 2011, Kelley et al., 2008, Lin et al., 2012, 2014, Sadeh et al., 2009].
There are two general approaches: leveraging an individual’s prior privacy decisions to predict
preferences in new decision contexts or assigning an individual to a profile, a cluster of people
with similar privacy preferences. Such profiles could be based on clustering privacy settings of
other users, learning from privacy settings made by the user’s “friends,” or be curated by ex-
perts [Agarwal and Hall, 2013, Toch, 2014]. These two approaches can also be combined: an
individual can be assigned to a profile initially to bootstrap a privacy assistant and avoid cold-start issues, followed by experiential refinement and privacy preference learning from an individual's
actions over time [Knijnenburg and Kobsa, 2013, Schaub, 2014, Schaub et al., 2015]. For exam-
ple, through a set of questions, the system might learn that a person is highly concerned about
location data. By assigning the person to the “location privacy” cluster, the privacy assistant may
determine that for this person location sharing should either be blocked by default or the user
should be asked when location is being accessed. Based on the person’s privacy settings adjust-
ments over time, the assistant might learn that while the user prefers not to share location in
general, the user regularly grants location access to transportation-related apps (e.g., navigation,
ride sharing, bus schedule) and update the user’s privacy preference profile accordingly.
Reasoning results can either be provided as privacy settings recommendations to users
or settings can be automatically adjusted for the user. Typically, the level of confidence in the
reasoning result determines the level of automation or user involvement in making the privacy
decisions [Bilogrevic et al., 2013, Kelley et al., 2008, Knijnenburg and Kobsa, 2013, Schaub,
2014, Schaub et al., 2015, Toch, 2011]. Determining the appropriate level of automation—with
multiple stages between fully manual and fully automated configuration [Parasuraman et al.,
2000]—for privacy decision making and privacy management assistance is a topic of active re-
search. Enabling auditing of system decisions and giving users controls to adjust and tweak preference models, for instance through overviews of which apps were allowed or denied access to location and other resources [Schaub et al., 2014, Tsai et al., 2017], both improves the privacy assistant and gives users agency over it.
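A minimal Python sketch of the two ideas just described, assigning a new user to a privacy-preference profile and automating a decision only when the prediction is sufficiently confident, might look as follows. The profiles, permissions, grant rates, and the 0.8 threshold are invented for illustration and are not taken from any of the cited systems.

# Sketch: profile-based prediction of permission decisions with confidence-gated automation.
# Profiles, permission names, grant rates, and the threshold are illustrative assumptions.
PROFILES = {                         # fraction of users in each cluster who grant a permission
    "unconcerned":    {"location": 0.95, "contacts": 0.90, "microphone": 0.85},
    "location_wary":  {"location": 0.10, "contacts": 0.70, "microphone": 0.60},
    "fundamentalist": {"location": 0.05, "contacts": 0.05, "microphone": 0.05},
}

def assign_profile(answers: dict) -> str:
    """Pick the profile whose grant rates are closest to a user's onboarding answers."""
    def distance(profile):
        return sum((rate - answers.get(permission, 0.5)) ** 2
                   for permission, rate in PROFILES[profile].items())
    return min(PROFILES, key=distance)

def decide(profile: str, permission: str, threshold: float = 0.8) -> str:
    """Automate only confident predictions; otherwise fall back to asking the user."""
    p_grant = PROFILES[profile][permission]
    if p_grant >= threshold:
        return "auto-grant"
    if p_grant <= 1 - threshold:
        return "auto-deny"
    return "ask user"                # prediction too uncertain to automate

profile = assign_profile({"location": 0.0, "contacts": 0.8})   # answers from a short survey
print(profile, decide(profile, "location"), decide(profile, "contacts"))
# -> location_wary auto-deny ask user

A real assistant would replace the hand-made profiles with clusters learned from data and refine the assignment as the user overrides individual decisions over time.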
While researchers have made progress in accurately predicting individuals’ preferred pri-
vacy settings in specific situations, it often remains a challenge to help individuals make those
settings or automate settings changes for them. Current privacy assistance solutions are either
confined to their application context, in which they learn from users’ behavior within the ap-
plication and may adjust recommendations and settings within that context [Das et al., 2018,
Knijnenburg and Kobsa, 2013, Schaub et al., 2014], or require modifications to the underlying
system layer, e.g., research prototypes with superuser rights hooking into Android’s permission
framework [Liu et al., 2016, Wijesekera et al., 2018].
Researchers have called for privacy settings APIs that would allow privacy assistants to function across contexts [Liu et al., 2016], positing that otherwise we will end up with
siloed and less useful privacy assistants for different systems and services, such as Facebook, your
smartphone apps, or (parts of ) your smart home. Such disjoint privacy assistants would not be
able to leverage and benefit from privacy decisions made in other contexts to improve prediction
accuracy, requiring the re-learning of privacy preferences in different application contexts and
for different privacy assistants. However, given the context-dependent nature of privacy per-
ceptions and preferences [Acquisti et al., 2015, Nissenbaum, 2009], highly specialized privacy
assistants might be able to provide more relevant support in the context they are focusing on
(e.g., smartphone apps) than general privacy assistants that aim to comprehensively model an
individual’s privacy preferences, which may be fraught with uncertainty and context dependency
as behavioral economics research suggests [Acquisti et al., 2015]. Nevertheless, additional research aimed at more deeply understanding the factors that affect privacy decisions and at modeling privacy decision-making processes is essential to further improve privacy management support.
1 An appropriation tort provides for “liability against one who appropriates the identity of another for his own benefit” [Mc-
Clurg, 2003]. It is typically used by celebrities for safeguarding their “brand value” when others try to sell a product or service
based on the celebrity’s likeness (e.g., image).
It is obvious that data that has not been collected (data collection minimization) cannot
be misused. However, given the likelihood that data processors will incentivize data subjects to
consent to broad data collection practices (and thus will be able to build comprehensive profiles),
ensuring subject-controlled use of the data (data processing minimization) is critical.
of each strategy, as provided by Danezis et al. [2015]—they still do not fully address how to integrate privacy engineering into the overall system design process. Kroener and Wright [2014] propose to combine best practices, privacy-enhancing technologies (PETs), and privacy impact
assessments (PIAs) in order to operationalize PbD.
According to Wright [2011], PIAs started to emerge in the mid-1990s in countries around the world. In Europe, the UK privacy commissioner, the Information Commissioner's Office (ICO), published the first handbook on privacy impact assessments in 2007 [ICO, 2007]. The handbook saw a first revision in 2009 and has since been integrated into the more general Guide to the General Data Protection Regulation (GDPR) [ICO, 2018], which the ICO regularly updates. PIAs draw on environmental impact assessments and technology assessments, both of which have a long tradition in the U.S. and Europe. Clarke [2009] offers a
comprehensive history and Wright [2011] provides an overview of PIA requirements around
the world, as well as a detailed description of typical PIA steps [Wright, 2013].
PIAs set out a formal process for identifying (and minimizing) privacy risks by drawing
on input from all stakeholders; the key steps are laid out in the ICO's guidance [ICO, 2018].
Figure 5.2: The PRIPARE Privacy-by-Design Methodology considers both goal-based and risk-based privacy requirement analyses (based on Notario et al. [2015]).
5.8 SUMMARY
In this chapter, we provided an overview of paradigms, mechanisms, and techniques to design and develop privacy-friendly mobile and pervasive computing systems. While there is no silver bullet to remedy all privacy implications of any mobile and pervasive system, the presented
approaches constitute an essential toolbox for building privacy into mobile and pervasive com-
puting systems. An important first step for any system is the integration of privacy-friendly
defaults. Data collection and use should be minimal—no more than necessary for the system to
function—and potentially privacy-invasive or unexpected data practices should not be active by
default but rather be activated by the user in order to help users in constructing a more accurate
mental model of what data a system collects and how that data is used. Just-in-time mobile
permission dialogs are a common example of this approach.
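As a minimal illustration of these two ideas (hypothetical names, not any particular platform's permission API), the following Python sketch keeps potentially invasive features switched off by default and asks for access to sensitive data just-in-time, at the moment the corresponding feature is first used, degrading gracefully if the user declines.

from dataclasses import dataclass, field

@dataclass
class PrivacySettings:
    # Privacy-friendly defaults: nothing invasive is active until the user enables it.
    share_location: bool = False
    personalized_ads: bool = False
    granted: set = field(default_factory=set)  # permissions granted so far

def request_permission(settings: PrivacySettings, permission: str, reason: str) -> bool:
    """Just-in-time prompt: ask only when the permission is actually needed."""
    if permission in settings.granted:
        return True
    answer = input(f"Allow access to {permission}? ({reason}) [y/N] ")
    if answer.strip().lower() == "y":
        settings.granted.add(permission)
        return True
    return False

settings = PrivacySettings()
if request_permission(settings, "location", "to show nearby restaurants"):
    print("fetching nearby restaurants ...")
else:
    print("showing non-personalized results instead")  # graceful degradation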
Rather than just informing users about data practices, such occasions should also be used to communicate associated privacy risks and implications, in order to minimize surprise and help users make meaningful privacy decisions when using mobile and pervasive technologies. Furthermore, any privacy information provided should always be actionable, assisting individuals both in forming privacy decisions and in expressing them through privacy settings and options. At the same time, individuals face an increasing number of privacy decisions. Machine learning can help provide personalized privacy assistance by learning from people's privacy preferences and recommending likely preferred settings to individual users. Context-adaptive privacy mechanisms extend this notion to context-specific privacy assistance, leveraging context-aware sensing infrastructure and computing approaches to dynamically adjust privacy settings to context changes.
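To illustrate the general idea behind such learned assistance (a toy sketch with synthetic data and hypothetical permission names, not the specific method of any published system), one could cluster users' past allow/deny decisions into a small number of privacy profiles and recommend the best-matching profile's majority settings to a new user who has answered only a few questions:

import numpy as np
from sklearn.cluster import KMeans

PERMISSIONS = ["contacts", "location", "microphone", "camera"]

# Rows: users; columns: permissions; 1 = allow, 0 = deny (synthetic toy data).
rng = np.random.default_rng(0)
decisions = rng.integers(0, 2, size=(200, len(PERMISSIONS)))

# Learn a handful of "privacy profiles" from past decisions.
profiles = KMeans(n_clusters=3, n_init=10, random_state=0).fit(decisions)
centers = profiles.cluster_centers_

# A new user has only answered two questions: deny contacts, allow location.
answered = {"contacts": 0, "location": 1}
cols = [PERMISSIONS.index(p) for p in answered]
vec = np.array(list(answered.values()))
best = int(np.argmin(((centers[:, cols] - vec) ** 2).sum(axis=1)))

# Recommend the best-matching profile's majority setting for the rest.
for i, perm in enumerate(PERMISSIONS):
    if perm not in answered:
        suggestion = "allow" if centers[best, i] >= 0.5 else "deny"
        print(f"recommended default for {perm}: {suggestion}")

In a deployed assistant, such recommendations would of course remain suggestions that the user can review and override.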
A privacy challenge that arises with automated systems is that algorithmic decision making can inadvertently codify discriminatory practices and behavior. Thus, a substantial challenge for artificial intelligence research is providing transparency about how automated decisions are made, in an intelligible and correctable fashion. Designers and developers of mobile and pervasive computing systems, which derive much of their benefit from ingesting sensor information and inferring state and required actions, need to consider how they can detect and eliminate undue bias in their decision engines, provide algorithmic transparency to users, offer opportunities for correcting errors, and provide redress processes for harmed users.
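As a rough illustration of the first of these tasks (toy decision log and a hypothetical review threshold; real bias audits require considerably more care), the following sketch compares the rate of positive automated decisions across two groups and flags large gaps for human review.

from collections import defaultdict

# Hypothetical log of (group attribute, automated decision) pairs.
decisions = [
    ("A", 1), ("A", 1), ("A", 0), ("A", 1),
    ("B", 0), ("B", 0), ("B", 1), ("B", 0),
]

totals, positives = defaultdict(int), defaultdict(int)
for group, outcome in decisions:
    totals[group] += 1
    positives[group] += outcome

rates = {g: positives[g] / totals[g] for g in totals}
print("positive-decision rates per group:", rates)

gap = max(rates.values()) - min(rates.values())
if gap > 0.2:  # hypothetical review threshold
    print(f"rate gap of {gap:.2f} exceeds threshold: flag for human review")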
The privacy challenges and approaches discussed put a lot on the plate of a designer or
developer of a mobile or pervasive system or application. Engaging in privacy engineering and
considering privacy from the beginning can help surface privacy issues and challenges early on in the design process, as well as facilitate the consideration of creative privacy approaches and solutions.
Similar to other engineering practices and considerations, relying on established frameworks,
such as privacy impact assessments, can provide structured guidance in the process and enable
reliable and consistent privacy outcomes.
CHAPTER 6
Conclusions
Many privacy experts cheered when the new European privacy law, the General Data Protec-
tion Regulation (GDPR), was adopted on April 14, 2016. The general public, however, hardly
seemed to take notice. Yet when the law finally went into effect on May 25, 2018 (after a prepara-
tion period of two years), it was hard to escape the many news stories and TV specials discussing
its implications. Inboxes across the world filled up with emails from companies and organizations that updated their privacy policies to comply with the new law, or asked recipients to
confirm long-forgotten subscriptions. Is this the watershed moment for privacy, when it finally
moves from legal niche into mainstream public policy, leading to the widespread adoption of
privacy-enhancing technologies in everything from Web servers to mobile devices to smart toys
and smart homes?
As we laid out in this synthesis lecture, easy solutions for addressing privacy in mobile and
pervasive computing might be hard to come by. Privacy is a complex topic with a rich history, intricate socio-political underpinnings, and challenging interactions and dependencies between
technical, legal, and organizational processes. While the GDPR has brought the concept of
“privacy by design” into the spotlight, developing a systematic practice for integrating privacy
measures into systems is still an ongoing challenge. What are the right privacy defaults? What
is the absolute minimal data needed for a particular service? How can one limit the use of data
without restricting future big data services? How does one make complex information and data
flows transparent for users? How can one obtain consent from individuals that is specific and
freely given, without inundating users with prompts and messages? Or should we abandon the
idea of “notice & choice” in a world full of mobile and pervasive computing? What is the right
way to anonymize data? Is anonymization even possible in practical applications, given the abil-
ity to re-identify people by merging multiple innocuous datasets? How can we organize a fair
marketplace around personal data, and what is the value of my data—today, and tomorrow?
Who owns my data, if an “ownership” concept even makes sense for personal data?
The ability of mobile and pervasive computing systems to collect, analyze, and use personal data is ever increasing, with each new generation of technology being smaller, more power-efficient, and more ubiquitous. Despite the already substantial body of research in this area, which we discussed here, ever more research and engineering challenges regarding privacy in mobile
and pervasive computing continue to emerge. However, the welcome thrust from the policy side
through the GDPR may help to further unify the often diverse research efforts in this space.
An interdisciplinary approach, combining research in psychology, economics, law, social science,
and computer science, stands the best chance to make progress in this complex field. Some of
the key challenges that we see are the following.
• Refining privacy primitives. At the outset, research needs to continue investigating the fun-
damental principles of privacy-aware systems and privacy-enhancing technologies, with a
particular focus on big data and anonymization.
• Addressing system privacy. The increasing interconnection of mobile and pervasive com-
puting systems requires effective means of regulating access and use of both personal and
anonymous (but potentially re-identifiable) data.
• Supporting usable privacy. Privacy solutions too often place a burden on users. How can
privacy be understandable (and controllable) for end users, not just lawyers? How can
legal requirements be reconciled with user experience requirements?
• Personalizing privacy. Will we be able to create systems that can adapt to individual privacy
needs without being paternalistic? Can we find solutions that scale to millions yet provide
the right support and effective assistance in managing privacy for an individual?
• Establishing privacy engineering. While early proposals for a privacy-aware design process
exist and privacy engineering is developing as a practice, we need to better understand which process to use when, and how to tailor privacy solutions to the characteristics and requirements of specific applications and their context. There will certainly be no "one-size-fits-all" solution.
• Improving privacy evaluation. Understanding what users want and how well a particular
solution is working are key factors for establishing a rigorous scientific approach to privacy.
We hope that this Synthesis Lecture provides a useful starting point for exploring these
challenges.
Bibliography
Live monitoring helps engine manufacturers track performance. The Telegraph Online, Nov.
2010. URL https://www.telegraph.co.uk/travel/travelnews/8111075/Live-moni
toring-helps-engine-manufacturers-track-performance.html. 63
20 Minuten. Cumulus: Migros musste Daten der Polizei geben. 20 Minuten, Aug. 2004. URL
http://www.20min.ch/news/kreuz_und_quer/story/29046904. 2
G. D. Abowd and E. D. Mynatt. Charting past, present, and future research in ubiquitous
computing. ACM Transactions on Computer-Human Interaction (TOCHI), 7(1):29 –58, 2000.
DOI: 10.1145/344949.344988. 45, 51, 52, 53
A. Acquisti and J. Grossklags. Privacy and rationality in individual decision making. IEEE
Security & Privacy, 3(1):26–33, Jan. 2005. ISSN 1540-7993. DOI: 10.1109/msp.2005.22.
73
A. Acquisti, R. Gross, and F. Stutzman. Face recognition and privacy in the age of augmented
reality. Journal of Privacy and Confidentiality, 6(2):1, 2014. URL http://repository.cmu
.edu/jpc/vol6/iss2/1/. DOI: 10.29012/jpc.v6i2.638. 66
A. Acquisti, L. Brandimarte, and G. Loewenstein. Privacy and human behavior in the age of
information. Science, 347(6221):509–514, Jan. 2015. ISSN 0036-8075, 1095-9203. DOI:
10.1126/science.aaa1465. 3, 71, 73, 78, 80
A. Acquisti, I. Adjerid, R. Balebako, L. Brandimarte, L. F. Cranor, S. Komanduri, P. G. Leon,
N. Sadeh, F. Schaub, M. Sleeper, Y. Wang, and S. Wilson. Nudges for privacy and security:
understanding and assisting users’ choices online. ACM Computing Surveys, 50(3):1–41, Aug.
2017. ISSN 03600300. DOI: 10.1145/3054926. 71, 73, 74, 76, 80
Y. Agarwal and M. Hall. ProtectMyPrivacy: detecting and mitigating privacy leaks on iOS
devices using crowdsourcing. In Proceeding of the 11th Annual International Conference on
Mobile Systems, Applications, and Services, MobiSys ’13, pages 97–110, New York, NY, USA,
2013. ACM. ISBN 978-1-4503-1672-9. DOI: 10.1145/2462456.2464460. 77
I. Altman. The Environment and Social Behavior: Privacy, Personal Space, Territory, Crowding.
Brooks/Cole Publishing company, Monterey, California, 1975. 39, 79
R. Anderson. Security Engineering. Wiley, 2nd edition, 2008. ISBN 978-0-470-06852-6. URL
http://www.cl.cam.ac.uk/~rja14/book.html. 58
APEC. APEC privacy framework. Asia-Pacific Economic Cooperation, 2017. URL https:
//www.apec.org/Publications/2017/08/APEC-Privacy-Framework-(2015). 21
H. Arendt. The Human Condition. University of Chicago Press, Chicago, 1958. DOI:
10.2307/40097657. 34
Article 29 Data Protection Working Party. Advice paper on essential elements of a def-
inition and a provision on profiling within the eu general data protection regulation.
http://ec.europa.eu/justice/data-protection/article-29/documentation/othe
r-document/files/2013/20130513_advice-paper-on-profiling_en.pdf, May 2013.
65
L. Atzori, A. Iera, and G. Morabito. The Internet of Things: a survey. Computer Networks, 54
(15):2787–2805, 2010. DOI: 10.1016/j.comnet.2010.05.010. 49
E. Baard. Buying trouble – your grocery list could spark a terror probe. The Village Voice, July
2002. URL http://www.villagevoice.com/issues/0230/baard.php. 30
M. Backes and D. Markus. Enterprise privacy policies and languages. In Digital Privacy:
Theory, Technologies, and Practices, chapter 7, pages 135–153. Auerbach Publications, 2008.
DOI: 10.1201/9781420052183.ch7. 72
T. M. Banks. GDPR matchup: Canada’s personal information protection and electronic docu-
ments act. IAPP Privacy Tracker, May 2, 2017, 2017. URL https://iapp.org/news/a/
matchup-canadas-pipeda-and-the-gdpr/. 21
D. Barrett. One surveillance camera for every 11 people in Britain, says CCTV survey. The
Telegraph, 2013. http://www.telegraph.co.uk/technology/10172298/One-surveill
ance-camera-for-every-11-people-in-Britain-says-CCTV-survey.html. 21
R. F. Baumeister and M. R. Leary. The need to belong: desire for interpersonal attach-
ments as a fundamental human motivation. Psychological bulletin, 117(3):497, 1995. DOI:
10.1037/0033-2909.117.3.497. 34
R. Beckwith. Designing for ubiquity: the perception of privacy. IEEE Pervasive Computing, 2
(2):40–46, Apr. 2003. ISSN 1536-1268. DOI: 10.1109/mprv.2003.1203752. 49
K. Benitez and B. Malin. Evaluating re-identification risks with respect to the HIPAA privacy
rule. Journal of the American Medical Informatics Association : JAMIA, 17(2):169–77, Jan. 2010.
ISSN 1527-974X. DOI: 10.1136/jamia.2009.000026. 41
J. Bentham. Panopticon. In M. Bozovic, editor, The Panopticon Writings (1995). Verso, London,
1787. 40
H. Berghel. Identity theft, social security numbers, and the Web. Communications of the ACM,
43(2):17–21, Feb. 2000. ISSN 00010782. DOI: 10.1145/328236.328114. 66
K. Bernsmed, I. A. Tøndel, and A. A. Nyre. Design and implementation of a CBR-based privacy
agent. In Seventh International Conference on Availability, Reliability and Security (ARES ’12),
pages 317–326. IEEE, Aug. 2012. ISBN 978-1-4673-2244-7. DOI: 10.1109/ares.2012.60.
80
M. Billinghurst and T. Starner. Wearable devices: new ways to manage information. Computer,
32(1):57–64, Jan. 1999. ISSN 0018-9162. 00272. DOI: 10.1109/2.738305. 46
A. Bogle. Who owns music, video, e-books after you die? Slate.com, Aug. 2014. URL
https://www.slate.com/blogs/future_tense/2014/08/22/digital_assets_and_d
eath_who_owns_music_video_e_books_after_you_die.html?via=gdpr-consent. 60
J. Bohn, V. Coroama, M. Langheinrich, F. Mattern, and M. Rohs. Social, economic, and ethical
implications of ambient intelligence and ubiquitous computing. In W. Weber, J. M. Rabaey,
and E. Aarts, editors, Ambient Intelligence, volume 10, chapter 1, pages 5–29. Springer, 2005.
DOI: 10.1007/3-540-27139-2_2. 62
H. P. Brougham. Historical Sketches of Statesmen Who Flourished in the Time of George III, vol-
ume 1. Lea & Blanchard, Philadelphia, PA, USA, 1839. As quoted in Platt [1989]. 8,
31
L. A. Bygrave. Data Privacy Law - An International Perspective. Oxford University Press, Jan.
2014. ISBN 9780199675555. DOI: 10.1093/acprof:oso/9780199675555.001.0001. 43
R. Caceres and A. Friday. Ubicomp systems at 20: progress, opportunities, and chal-
lenges. IEEE Pervasive Computing, 11(1):14–21, Jan. 2012. ISSN 1536-1268. DOI:
10.1109/mprv.2011.85. 52
R. Carroll and S. Prickett, editors. The Bible. Oxford University Press, Oxford, UK, 2008. ISBN
978-0199535941. DOI: 10.1093/oseo/instance.00016818. 8, 9
J. Cas. Privacy in pervasive computing environments - a contradiction in terms? IEEE
Technology and Society Magazine, 24(1):24–33, Jan. 2005. ISSN 0278-0097. DOI:
10.1109/mtas.2005.1407744. 62
F. H. Cate. The Limits of Notice and Choice. IEEE Security & Privacy, 8(2):59–62, 2010.
ISSN 1540-7993. DOI: 10.1109/msp.2010.84. 73, 75, 76
F. H. Cate. Privacy in the Information Age. The Brookings Institution, Washington, D.C., USA,
online edition, 1997. URL brookings.nap.edu/books/0815713169/html. 15, 37, 121
F. H. Cate. The failure of fair information practice principles. In J. K. Winn, editor, Consumer
Protection in the Age of the ‘Information Economy’, chapter 13, pages 341–378. Routledge, 2006.
URL https://ssrn.com/abstract=1156972. 14
A. Cavoukian. Privacy by Design ... Take the Challenge. Information and Privacy Commissioner
of Ontario, Canada, 2009. URL http://privacybydesign.ca. 87
R. N. Charette. Michigan’s MiDAS unemployment system: algorithm alchemy cre-
ated lead, not gold. IEEE Spectrum, 18(3):6, 2018. URL
https://spectrum.ieee.org/riskfactor/computing/software/michigans-midas-
unemployment-system-algorithm-alchemy-that-created-lead-not-gold. 67, 83
D. Chen, S. P. Fraiberger, R. Moakler, and F. Provost. Enhancing transparency and control
when drawing data-driven inferences about individuals. Big Data, 5(3):197–212, 2016. DOI:
10.1089/big.2017.0074. 84
S. Chitkara, N. Gothoskar, S. Harish, J. I. Hong, and Y. Agarwal. Does this app really
need my location?: context-aware privacy management for smartphones. Proc. ACM Inter-
act. Mob. Wearable Ubiquitous Technol., 1(3):42:1–42:22, Sept. 2017. ISSN 2474-9567. DOI:
10.1145/3132029. 81
G. Chittaranjan, J. Blom, and D. Gatica-Perez. Who’s who with Big-Five: analyzing and clas-
sifying personality traits with smartphones. In 2011 15th Annual International Symposium
on Wearable Computers, pages 29–36. IEEE, June 2011. ISBN 978-1-4577-0774-2. DOI:
10.1109/iswc.2011.29. 66
R. Clarke. Beyond the OECD guidelines: privacy protection for the 21st century. http://ww
w.rogerclarke.com/DV/PP21C.html, Jan. 2000. 11, 14
R. Clarke. What’s ’privacy’? http://www.rogerclarke.com/DV/Privacy.html, Aug. 2006.
DOI: 10.1163/9789004192195_004. 13, 14, 26, 43
R. Clarke. Privacy impact assessment: its origins and development. Computer Law & Security
Review, 25(2):123–135, Jan. 2009. ISSN 0267-3649. DOI: 10.1016/j.clsr.2009.02.002. 88
P. Cochrane. Head to head. Sovereign Magazine, pages 56–57, 2000. URL http://www.coch
rane.org.uk/opinion/papers/prof.htm. 24
J. E. Cohen. Examined lives: Informational privacy and the subject as object. Stanford Law
Review, 52:1373–1437, May 2000. URL http://www.law.georgetown.edu/faculty/j
ec/examined.pdf. As cited in Solove and Rotenberg [2003]. DOI: 10.2307/1229517. 28
G. Cohn, S. Gupta, J. Froehlich, E. Larson, and S. N. Patel. GasSense: appliance-level, single-
point sensing of gas activity in the home. In P. Floréen, A. Krüger, and M. Spasojevic, editors,
Pervasive Computing. Pervasive 2010., pages 265–282, Berlin, Heidelberg, 2010. Springer.
DOI: 10.1007/978-3-642-12654-3_16. 64
A. Compton. British tourists detained, deported for tweeting “destroy America”, Jan.
2012. URL http://www.huffingtonpost.com/2012/01/30/british-tourists-depo
rted-for-tweeting_n_1242073.html. 23
J. Cornwell, I. Fette, G. Hsieh, M. Prabaker, J. Rao, K. Tang, K. Vaniea, L. Bauer, L. F. Cranor,
J. Hong, B. McLaren, M. Reiter, and N. Sadeh. User-controllable security and privacy for
pervasive computing. In Eighth IEEE Workshop on Mobile Computing Systems and Applications
(HotMobile ’07), pages 14–19. IEEE, Mar. 2007. ISBN 0-7695-3001-X. DOI: 10.1109/wm-
csa.2007.4389552. 77
Council of Europe. Convention for the protection of human rights and fundamental freedoms.
CETS 005, Nov. 1950. URL conventions.coe.int/Treaty/en/Treaties/Html/005.
htm. 10
Council of Europe. Resolution (73) 22 on the protection of the privacy of in-
dividuals vis-à-vis electronic data banks in the private sector, 1973. URL
http://www.coe.int/T/E/Legal_affairs/Legal_co-operation/Data_protectio
n/Documents/International_legal_instruments/Resolution%20(73)%2022.asp. 18
Council of Europe. Resolution (74) 29 on the protection of the privacy of indi-
viduals vis-à-vis electronic data banks in the public sector, 1974. URL http:
//www.coe.int/T/E/Legal_affairs/Legal_co-operation/Data_protection/Do
cuments/International_legal_instruments/Resolution%20(74)%2029.asp. 18
Council of Europe. Convention for the protection of individuals with regard to automatic pro-
cessing of personal data. CETS 108, Jan. 1981. URL conventions.coe.int/Treaty/en/
Treaties/Html/108.htm. 18, 20
Council of Europe. Convention 108+ – Modernised convention for the protection of in-
dividuals with regard to automatic processing of personal data. CETS 108+, June 2018.
URL http://rm.coe.int/convention-108-convention-for-the-protection-of-in
dividuals-with-regar/16808b36f1. 20
L. F. Cranor. Necessary but not sufficient: standardized mechanisms for privacy notice and
choice. Journal of Telecommunications and High Technology Law, 10(2), 2012. URL http:
//jthtl.org/content/articles/V10I2/JTHTLv10i2_Cranor.PDF. 72, 76
K. Crawford and J. Schultz. Big data and due process: toward a framework to
redress predictive privacy harms. Boston College Law Review, 55(93), Oct. 2014.
URL https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2325784; http://lawdigitalcommons.bc.edu/bclr/vol55/iss1/4/. 84
M.-F. Cuéllar. Cyberdelegation and the administrative state. Stanford public law working paper,
Stanford University, Oct. 2016. DOI: 10.1017/9781316671641.006. 85
A. Das, M. Degeling, D. Smullen, and N. Sadeh. Personalized privacy assistants for the Internet
of Things. IEEE Pervasive Computing, 2018. DOI: 10.1109/mprv.2018.03367733. 77, 80,
81
P. De Hert and V. Papakonstantinou. The new general data protection regulation: still a sound
system for the protection of individuals? Computer Law & Security Review, 32(2):179–194,
Apr. 2016. ISSN 0267-3649. DOI: 10.1016/j.clsr.2016.02.006. 19, 43
Der Spiegel. Innere Sicherheit: Totes Pferd. Der Spiegel, (11):48, Mar. 2004. URL
http://www.spiegel.de/spiegel/inhalt/0,1518,ausg-1395,00.html. 26
N. Dhingra, Z. Gorn, A. Kener, and J. Dana. The default pull: an experimental demonstration
of subtle default effects on preferences. Judgment and Decision Making, 7(1):69–76, 2012.
ISSN 1930-2975. 71
A. S. Douglas. The U.K. privacy white paper 1975. In Proceedings of the June 7-10, 1976, National
Computer Conference and Exposition, AFIPS ’76, pages 33–38, New York, USA, 1976. ACM.
DOI: 10.1145/1499799.1499806. 58
P. Dourish. What we talk about when we talk about context. Personal and Ubiquitous Computing,
8(1):19–30, Feb. 2004. ISSN 1617-4909. DOI: 10.1007/s00779-003-0253-8. 52
M. R. Ebling and M. Baker. Pervasive tabs, pads, and boards: are we there yet? IEEE Pervasive
Computing, 11(1):42–51, Jan. 2012. ISSN 1536-1268. DOI: 10.1109/mprv.2011.80. 47
A. Etzioni. The Limits of Privacy. Basic Books, New York, USA, 1999. DOI: 10.2307/2654355.
28
I. Glass. This American life: right to remain silent, 2010. URL https://www.thisamerican
life.org/414/transcript. 23
Google. Material design guidelines: Permissions. Website, Sept. 2017. URL https://materi
al.io/design/platform-guidance/android-permissions.html. 74, 75
K. Gormley. 100 years of privacy. Wisconsin Law Review, pages 1335–1442, 1992. ISSN
09547762. URL http://heinonlinebackup.com/hol-cgi-bin/get_pdf.cgi?handle
=hein.journals/wlr1992&section=57. 15, 43
G. Greenleaf. Asian Data Privacy Laws. Oxford University Press, Oct. 2014. ISBN
9780199679669. DOI: 10.1093/acprof:oso/9780199679669.001.0001. 43
G. Greenleaf. Global data privacy laws 2017: 120 national data privacy laws, including Indonesia
and Turkey. Privacy Laws & Business International Report, 2017(145):10–13, Jan. 2017. URL
https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2993035. 20, 24, 43
G. Greenleaf. Convention 108+ and the data protection framework of the EU (speaking notes
for conference presentation at ’Convention 108+ tomorrow’s common ground for protection’).
June 2018. URL https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3202606.
20
U. Greveler, B. Justus, and D. Loehr. Multimedia content identification through smart meter
power usage profiles. In Computers, Privacy and Data Protection, 2012. 61, 64
M. Gruteser and B. Hoh. On the anonymity of periodic location samples. In Second international
conference on Security in Pervasive Computing (SPC ’05). Springer, 2005. DOI: 10.1007/978-
3-540-32004-3_19. 41
Guardian. The card up their sleeve. The Guardian, July 19, 2003. URL http://www.guardian
.co.uk/weekend/story/0,3605,999866,00.html. 30
S. Gupta, M. S. Reynolds, and S. N. Patel. ElectriSense: single-point sensing using emi for
electrical event detection and classification in the home. In K. N. Truong, P. A. Nixon, J. E.
Bardram, and M. Langheinrich, editors, Proceedings of the 12th ACM international conference
on Ubiquitous computing - Ubicomp ’10, pages 139–148, New York, USA, 2010. ACM Press.
ISBN 9781605588438. DOI: 10.1145/1864349.1864375. 64
S. Gürses and J. M. del Alamo. Privacy engineering: shaping an emerging field of research
and practice. IEEE Security & Privacy, 14(2):40–46, Mar. 2016. ISSN 1540-7993. DOI:
10.1109/msp.2016.37. 87
A. Hern. Samsung rejects concern over ’orwellian’ privacy policy. The Guardian, Feb. 9
2015. URL http://www.theguardian.com/technology/2015/feb/09/samsung-reje
cts-concern-over-orwellian-privacy-policy. 3
A. Hern. Technology: most GDPR emails unnecessary and some illegal, say experts, May
2018. URL https://www.theguardian.com/technology/2018/may/21/gdpr-emails-
mostly-unnecessary-and-in-some-cases-illegal-say-experts. 20
HEW Advisory Committee. Records, computers and the rights of citizens – report of the
secretary’s advisory committee on automated personal data systems, records, computers and
the rights of citizens. Technical report, U.S. Department of Health, Education, and Welfare
(HEW), 1973. URL http://aspe.hhs.gov/datacncl/1973privacy/tocprefacemembe
rs.htm; https://epic.org/privacy/hew1973report/. 11, 72, 73
M. Hildebrandt. Profiling and the identity of the European citizen. In Profiling the Euro-
pean Citizen, pages 303–343. Springer Netherlands, Dordrecht, 2008. DOI: 10.1007/978-1-
4020-6914-7_15. 83
K. Hill. Officemax blames data broker for ’daughter killed in car crash’. Forbes Online, Jan. 2014.
URL https://www.forbes.com/sites/kashmirhill/2014/01/22/officemax-blam
es-data-broker-for-daughter-killed-in-car-crash-letter/{#}50f16b0f76cf.
66
J.-H. Hoepman. Privacy design strategies. arXiv preprint arXiv:1210.6621, 9:12, 2012. DOI:
10.1007/978-3-642-55415-5_38. 87
M. N. Husen and S. Lee. Indoor human localization with orientation using WiFi fingerprint-
ing. In Proceedings of the 8th International Conference on Ubiquitous Information Management
and Communication - ICUIMC ’14, pages 1–6, New York, USA, 2014. ACM Press. ISBN
9781450326445. DOI: 10.1145/2557977.2557980. 64
IBM Global Services. IBM multi-national privacy survey. Consumer Report 938568, Harris
Interactive, New York, USA, Oct. 1999. URL http://web.asc.upenn.edu/usr/ogandy
/ibm_privacy_survey_oct991.pdf. 22
International Data Corporation (IDC). New IDC survey finds widespread privacy concerns
among U.S. consumers (IDC US42238617), Jan. 2017. URL https://www.idc.com/getd
oc.jsp?containerId=prUS42253017. [Online; posted 24-January-2017]. 22
H. Ishii and B. Ullmer. Tangible bits: towards seamless interfaces between people, bits and
atoms. In SIGCHI conference on Human factors in computing systems (CHI ’97), pages 234–241,
New York, USA, 1997. ACM. ISBN 0897918029. DOI: 10.1145/258549.258715. 51
W. Jones. Building safer cars. IEEE Spectrum, 39(1):82–85, 2002. ISSN 00189235. DOI:
10.1109/6.975028. 50
S. Joyee De and D. Le Métayer. Privacy Risk Analysis, volume 81 of Synthesis Lectures on In-
formation Security, Privacy, and Trust. Morgan & Claypool, 2016. ISBN 9781627059879.
DOI: 10.2200/S00724ED1V01Y201607SPT017. 25
F. Kargl. Inter-Vehicular Communication. Habilitation thesis, Ulm University, Ulm, Dec. 2008.
53
D. Katz. Top 10 fitness APIs: Apple health, Fitbit and Nike, 2015. URL
https://www.programmableweb.com/news/top-10-fitness-apis-apple-health-
fitbit-and-nike/analysis/2015/04/17. 63
C. Laurant, editor. Privacy and Human Rights 2003. EPIC and Privacy International, London,
UK, 2003. ISBN 1-893044-18-1. URL http://www.privacyinternational.org/surve
y/phr2003/. 8, 32
S. Lederer, J. I. Hong, A. K. Dey, and J. A. Landay. Five pitfalls in the design of privacy. In L. F.
Cranor and S. Garfinkel, editors, Security and Usability, chapter 21, pages 421–446. O’Reilly,
2005. ISBN 0-596-00827-9. 62
T. B. Lee. Facebook’s Cambridge Analytica scandal, explained. Ars Technica Website, Mar.
2018. URL https://arstechnica.com/tech-policy/2018/03/facebooks-cambridg
e-analytica-scandal-explained/. 29
Y.-D. Lee and W.-Y. Chung. Wireless sensor network based wearable smart shirt for ubiquitous
health and activity monitoring. Sensors and Actuators B: Chemical, 140(2):390–395, July 2009.
ISSN 0925-4005. DOI: 10.1016/j.snb.2009.04.040. 50
J. T. Lehikoinen, J. Lehikoinen, and P. Huuskonen. Understanding privacy regulation in ubi-
comp interactions. Personal and Ubiquitous Computing, 12(8):543–553, Mar. 2008. ISSN
1617-4909. DOI: 10.1007/s00779-007-0163-2. 79
L. Lessig. Code and Other Laws of Cyberspace. Basic Books, New York, USA, 1999. DOI:
10.1016/s0740-624x(00)00068-x. 25, 29, 30, 31, 43
G. K. Levinger and H. L. Raush. Close Relationships: Perspectives on the Meaning of Intimacy.
University of Massachusetts Press, 1977. DOI: 10.2307/351497. 34
J. Leyden. FBI apology for Madrid bomb fingerprint fiasco. The Register, May 26, 2004. URL
http://www.theregister.co.uk/2004/05/26/fbi_madrid_blunder/. 30
Y. Li, F. Chen, T. J.-J. Li, Y. Guo, G. Huang, M. Fredrikson, Y. Agarwal, and J. I. Hong.
PrivacyStreams: enabling transparency in personal data processing for mobile apps. Proc.
ACM Interact. Mob. Wearable Ubiquitous Technol., 1(3):76:1–76:26, Sept. 2017. ISSN 2474-
9567. DOI: 10.1145/3130941. 81
J. Lin, S. Amini, J. I. Hong, N. Sadeh, J. Lindqvist, and J. Zhang. Expectation and
purpose: understanding users’ mental models of mobile app privacy through crowdsourc-
ing. In ACM Conference on Ubiquitous Computing (Ubicomp ’12). ACM, 2012. DOI:
10.1145/2370216.2370290. 77
J. Lin, B. Liu, N. Sadeh, and J. I. Hong. Modeling users’ mobile app privacy preferences: restor-
ing usability in a sea of permission settings. In 10th Symposium On Usable Privacy and Security
(SOUPS 2014), pages 199–212, Menlo Park, CA, 2014. USENIX Association. ISBN 978-
1-931971-13-3. URL https://www.usenix.org/conference/soups2014/proceeding
s/presentation/lin. 77, 80
A. R. Lingley, M. Ali, Y. Liao, R. Mirjalili, M. Klonner, M. Sopanen, S. Suihkonen, T. Shen,
B. P. Otis, H. Lipsanen, and B. A. Parviz. A single-pixel wireless contact lens display. Journal
of Micromechanics and Microengineering, 21(12):125014, Dec. 2011. ISSN 0960-1317. DOI:
10.1088/0960-1317/21/12/125014. 50
B. Liu, M. S. Andersen, F. Schaub, H. Almuhimedi, S. A. Zhang, N. Sadeh, Y. Agarwal, and
A. Acquisti. Follow my recommendations: a personalized privacy assistant for mobile app
permissions. In Twelfth Symposium on Usable Privacy and Security (SOUPS 2016), pages 27–
41, Denver, CO, 2016. USENIX Association. ISBN 978-1-931971-31-7. URL https://ww
w.usenix.org/conference/soups2016/technical-sessions/presentation/liu. 76,
77, 80
S. Lobo. Datenschutzgrundverordnung (DSGVO): Wer macht mir die geileren Vorschriften?
Der Spiegel Online, May 2018. URL http://www.spiegel.de/netzwelt/web/datens
chutzgrundverordnung-dsgvo-wer-macht-mir-die-geileren-vorschriften-a-
1206979.html. 20
C. R. Long and J. R. Averill. Solitude: An exploration of benefits of being alone. Journal for the
Theory of Social Behaviour, 33(1):21–44, 2003. DOI: 10.1111/1468-5914.00204. 34
E. Luger and T. Rodden. Terms of agreement: rethinking consent for pervasive comput-
ing. Interacting with Computers, 25(3):229–241, Feb. 2013a. ISSN 0953-5438. DOI:
10.1093/iwc/iws017. 76
E. Luger and T. Rodden. An informed view on consent for ubicomp. In ACM international joint
conference on Pervasive and ubiquitous computing (UbiComp ’13), page 529, New York, USA,
2013b. ACM. ISBN 9781450317702. DOI: 10.1145/2493432.2493446. 76
P. Lukowicz, S. Pentland, and A. Ferscha. From context awareness to socially aware
computing. IEEE Pervasive Computing, 11(1):32–41, 2012. ISSN 1536-1268. DOI:
10.1109/mprv.2011.82. 53, 54
D. Lyon. Terrorism and surveillance: security, freedom, and justice after September 11 2001.
Privacy Lecture Series, November 12, 2001. URL privacy.openflows.org/pdf/lyon_p
aper.pdf. 29
D. Lyon, editor. Surveillance as social sorting: privacy, risk and automated discrimination. Rout-
ledge, 2002. DOI: 10.4324/9780203994887. 29
M. Madden. Public Perceptions of Privacy and Security in the Post-Snowden Era, 2014. URL
http://www.pewinternet.org/2014/11/12/public-privacy-perceptions/. 76, 79
B. Malin and L. Sweeney. How (not) to protect genomic data privacy in a distributed
network: using trail re-identification to evaluate and design anonymity protection sys-
tems. Journal of biomedical informatics, 37(3):179–92, June 2004. ISSN 1532-0464. DOI:
10.1016/j.jbi.2004.04.005. 41
E. Malmi and I. Weber. You are what apps you use: demographic prediction based on user’s
apps. In Tenth International AAAI Conference on Web and Social Media, 2016. 76
E. Mandel. How the Napa earthquake affected bay area sleepers. The Jawbone Blog, Aug. 2014.
URL https://jawbone.com/blog/napa-earthquake-effect-on-sleep/. 62
S. Mann. Wearable computing. In The Encyclopedia of Human-Computer Interaction, chapter 22.
Interaction Design Foundation, 2nd ed. edition, 2013. http://www.interaction-design
.org/books/hci.html. 46
S. Mann, J. Nolan, and B. Wellman. Sousveillance: inventing and using wearable computing
devices for data collection in surveillance environments. Surveillance & Society, 1(3):331–
355, July 2003. URL http://www.surveillance-and-society.org/journalv1i3.htm.
DOI: 10.24908/ss.v1i3.3344. 30
A. Marthews and C. E. Tucker. Government surveillance and internet search behavior. Draft,
SSRN, 2017. DOI: 10.2139/ssrn.2412564. 79
G. T. Marx. Murky conceptual waters: The public and the private. Ethics and Information
Technology, 3(3):157–169, 2001. URL web.mit.edu/gtmarx/www/murkypublicandpriva
te.html. 37, 62
G. T. Marx. Some information age techno-fallacies. Journal of Contingencies and Crisis Man-
agement, 11(1):25–31, Mar. 2003. ISSN 0966-0879. DOI: 10.1111/1468-5973.1101005.
85
D. Mattioli. On Orbitz, Mac users steered to pricier hotels. Wall Street Journal, http://www.ws
j.com/articles/SB10001424052702304458604577488822667325882, August 2012. 67
A. M. McDonald and L. F. Cranor. The cost of reading privacy policies. I/S: A Journal of Law
and Policy for the Information Society, 4(3):540–565, 2008. https://kb.osu.edu/bitstream
/handle/1811/72839/ISJLP_V4N3_543.pdf 70, 73
R. McGarvey. Is your rental car company spying on you and your driving? Here’s how they do
it. TheStreet.com, Mar. 2015. URL https://www.thestreet.com/story/13089306/1/is-
your-rental-car-company-spying-on-you-and-your-driving-heres-how-they-
do-it.html. 60
C. R. McKenzie, M. J. Liersch, and S. R. Finkelstein. Recommendations implicit in policy
defaults. Psychological Science, 17(5):414–420, 2006. DOI: 10.1037/e640112011-058. 74
R. Meyer. Facebook and the Cambridge analytica scandal, in 3 paragraphs. The Atlantic, Mar.
2018. URL https://www.theatlantic.com/technology/archive/2018/03/the-camb
ridge-analytica-scandal-in-three-paragraphs/556046/. 29
Microsoft. Cortana. http://www.microsoft.com/en-us/mobile/campaign-cortana/,
2014. 48
G. R. Milne and M. J. Culnan. Strategies for reducing online privacy risks: why consumers
read (or don’t read) online privacy notices. Journal of Interactive Marketing, 18(3):15–29, Jan.
2004. ISSN 1094-9968. DOI: 10.1002/dir.20009. 70
S. Moncrieff, S. Venkatesh, and G. West. Dynamic privacy assessment in a smart house envi-
ronment using multimodal sensing. ACM Transactions on Multimedia Computing, Communi-
cations, and Applications, 5(2), 2008. DOI: 10.1145/1413862.1413863. 50
Mozilla Foundation. Using geolocation, 2018. URL https://developer.mozilla.org/en-
US/docs/Web/API/Geolocation/Using_geolocation. 63
E. Musk. A most peculiar test drive. Tesla blog, http://www.teslamotors.com/blog/most-
peculiar-test-drive, 2013. 61
National Archives. America’s founding documents. National Archives. URL https://www.
archives.gov/founding-docs. 15
National Police Library – College of Policing. The effects of CCTV on crime. Technical report,
College of Policing, UK, 2013. URL http://library.college.police.uk/docs/what-
works/What-works-briefing-effects-of-CCTV-2013.pdf. 21
S. Nisenbaum. Ways of being alone in the world. The American Behavioral Scientist, 27(6):785,
1984. DOI: 10.1177/000276484027006009. 34
H. Nissenbaum. Protecting privacy in an information age: the problem of privacy in public.
Law and Philosophy, 17(5):559–596, 1998. DOI: 10.2307/3505189. 38
H. Nissenbaum. Privacy as contextual integrity. Washington Law Review, 79(1):119–159, 2004.
URL http://ssrn.com/abstract=534622. 38, 72
H. Nissenbaum. Privacy in Context - Technology, Policy, and the Integrity of Social Life. Stanford
University Press, 2009. ISBN 978-0804752367. DOI: 10.1080/15536548.2011.10855919.
35, 38, 39, 78, 79, 87
H. Nissenbaum. A contextual approach to privacy online. Daedalus, 140(4):32–48, Oct. 2011.
ISSN 0011-5266. DOI: 10.1162/daed_a_00113. 78
P. A. Norberg, D. R. Horne, and D. A. Horne. The privacy paradox: personal information dis-
closure intentions versus behaviors. Journal of Consumer Affairs, 41(1):100–126, 2007. DOI:
10.1111/j.1745-6606.2006.00070.x. 22, 80
R. Nord, M. Barbacci, P. Clements, and R. Kazman. Integrating the architecture trade-
off analysis method (ATAM) with the cost benefit analysis method (CBAM). Technical
report, Software Engineering Institute (SEI), Pittsburgh, PA, USA, 2003. URL http:
//repository.cmu.edu/sei/537/. DOI: 10.21236/ada421615. 89
OECD. OECD guidelines on the protection of privacy and transborder flows of per-
sonal data. The Organisation for Economic Co-operation and Development, Sept.
1980. URL http://www.oecd.org/sti/ieconomy/oecdguidelinesontheprotectiono
fprivacyandtransborderflowsofpersonaldata.htm. 12, 85
OECD. The OECD privacy framework. The Organisation for Economic Co-operation and
Development, 2013. URL http://www.oecd.org/internet/ieconomy/privacy-guidel
ines.htm. 12, 72, 81
R. Parasuraman, T. Sheridan, and C. Wickens. A model for types and levels of human inter-
action with automation. IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems
and Humans, 30(3):286–297, May 2000. ISSN 10834427. DOI: 10.1109/3468.844354. 77,
81
B. Parducci, H. Lockhart, and E. Rissanen. eXtensible access control markup lan-
guage (XACML) version 3.0. Committee Specification, OASIS, 2010. URL ht
tp://docs.oasis-open.org/xacml/3.0/xacml-3.0-core-spec-cs-01-en.pdf. DOI:
10.17487/rfc7061. 72
S. N. Patel, J. A. Kientz, G. R. Hayes, S. Bhat, and G. D. Abowd. Farther than you may think:
an empirical investigation of the proximity of users to their mobile phones. In UbiComp 2006:
Ubiquitous Computing, pages 123–140, 2006. DOI: 10.1007/11853565_8. 47
P. Pelegris, K. Banitsas, T. Orbach, and K. Marias. A novel method to detect heart beat
rate using a mobile phone. In Engineering in Medicine and Biology Society (EMBC),
2010 Annual International Conference of the IEEE, pages 5488–5491, Aug. 2010. DOI:
10.1109/iembs.2010.5626580. 47
J. W. Penney. Chilling effects: Online surveillance and wikipedia use. Berkeley Technology Law
Journal, 31(1):117, 2016. URL https://ssrn.com/abstract=2769645. 79
J. P. Pickett, editor. The American Heritage College Dictionary. Houghton Mifflin Co, 4th edition,
Apr. 2002. 25
S. Platt, Ed. Respectfully Quoted: A Dictionary of Quotations Requested from the Congressional
Research Service. Library of Congress, Washington, DC, 1989. http://www.bartleby.com
/73/ 98
B. Popper. Google announces over 2 billion monthly active devices on Android. The Verge Website
(verge.com), May 2017. URL https://www.theverge.com/2017/5/17/15654454/androi
d-reaches-2-billion-monthly-active-users. 22
R. C. Post. Three concepts of privacy. Georgetown Law Journal, 89:2087, 2001.
URL http://digitalcommons.law.yale.edu/cgi/viewcontent.cgi?article=1184&
context=fss_papers. DOI: 10.1016/j.clsr.2015.05.010. 7
President’s Council of Advisors on Science and Technology. Big data and privacy: a tech-
nological perspective. Report to the President, Executive Office of the President, May
2014. URL https://obamawhitehouse.archives.gov/sites/default/files/micro
sites/ostp/PCAST/pcast_big_data_and_privacy_-_may_2014.pdf 76
Privacy Rights Clearinghouse. A review of the fair information principles: the foundation of
privacy public policy, Feb. 2004. URL http://www.privacyrights.org/ar/fairinfo.ht
m. 11
Privacy Rights Clearinghouse. California medical privacy fact sheet C4: your prescriptions and
your privacy. https://www.privacyrights.org/fs/fsC4/CA-medical-prescription-
privacy, July 2012. 66
W. Prosser. Privacy. California Law Journal, 48:383–423, 1960. As cited in Solove and Roten-
berg [2003]. DOI: 10.2307/3478805. 17
A. Rao, F. Schaub, and N. Sadeh. What do they know about me? Contents and concerns of
online behavioral profiles. In PASSAT ’14: Sixth ASE International Conference on Privacy,
Security, Risk and Trust, 2014. 66, 67
R. Rehmann. Der weg aus der hooligan-datenbank. Tages-Anzeiger, Nov. 2014.
URL http://www.tagesanzeiger.ch/schweiz/standard/Der-Weg-aus-der-
HooliganDatenbank/story/21957458. 2
J. R. Reidenberg, T. Breaux, L. F. Cranor, B. French, A. Grannis, J. T. Graves, F. Liu, A. M.
McDonald, T. B. Norton, R. Ramanath, N. C. Russell, N. Sadeh, and F. Schaub. Dis-
agreeable privacy policies: mismatches between meaning and users’ understanding. Berkeley
Technology Law Journal, 30, 2015. URL http://papers.ssrn.com/abstract=2418297.
DOI: 10.2139/ssrn.2418297. 75
J. R. Reidenberg, J. Bhatia, T. D. Breaux, and T. B. Norton. Ambiguity in privacy policies
and the impact of regulation. The Journal of Legal Studies, 45(S2):S163–S190, 2016. URL
https://doi.org/10.1086/688669. 75
H. T. Reis, P. Shaver, et al. Intimacy as an interpersonal process. Handbook of personal relation-
ships, 24(3):367–389, 1988. 34
C. Roda. Human attention in digital environments. Cambridge University Press, 2011. ISBN
9780521765657. URL http://www.cambridge.org/ch/academic/subjects/psych
ology/cognition/human-attention-digital-environments?format=HB{&}isbn=
9780521765657. DOI: 10.1017/cbo9780511974519. 51
J. Rosen. A watchful state. The New York Times Magazine, Oct. 2001. http://www.nytimes.
com/2001/10/07/magazine/a-watchful-state.html. 21
I. S. Rubinstein. Big data: The end of privacy or a new beginning? International Data Privacy
law, 3(2):74–87, 1 May 2013. DOI: 10.1093/idpl/ips036. 85
S. Ruggieri, D. Pedreschi, and F. Turini. Integrating induction and deduction for finding evi-
dence of discrimination. Artificial Intelligence and Law, 18(1):1–43, Mar. 2010. ISSN 0924-
8463. DOI: 10.1007/s10506-010-9089-5. 85
B. Rössler. Der Wert des Privaten. Suhrkamp Verlag, Frankfurt/Main, Germany, 2001. 34
F. Sadri. Ambient intelligence: a survey. ACM Computing Surveys, 43(4):1–66, Oct. 2011. ISSN
03600300. DOI: 10.1145/1978802.1978815. 50
A. Schmidt, M. Beigl, and H.-W. Gellersen. There is more to context than location. Com-
puters & Graphics, 23(6):893–901, Dec. 1999. ISSN 00978493. DOI: 10.1016/s0097-
8493(99)00120-x. 48
A. Schmidt, B. Pfleging, F. Alt, A. Sahami, and G. Fitzpatrick. Interacting with 21st-century
computers. IEEE Pervasive Computing, 11(1):22–31, Jan. 2012. ISSN 1536-1268. DOI:
10.1109/mprv.2011.81. 45
B. Schneier. Security risks of frequent-shopper cards. Schneier on Security, Feb. 18 2005. URL
https://www.schneier.com/blog/archives/2005/02/security_risks.html. 1
V. Schouberechts. The Post Book – History of the European post in 50 exclusive documents. Lanoo
Publishers, June 2016. 32
B. Schwartz. The social psychology of privacy. American Journal of Sociology, 73(6):741–752,
May 1968. ISSN 0002-9602. DOI: 10.1086/224567. 39
P. M. Schwartz. Property, privacy, and personal data. Harvard Law Review, 117(7):2055–2128,
2004. URL http://ssrn.com/abstract=721642. DOI: 10.2307/4093335. 66
C. Scully, J. Lee, J. Meyer, A. Gorbach, D. Granquist-Fraser, Y. Mendelson, and K. Chon.
Physiological parameter monitoring from optical recordings with a mobile phone. Biomed-
ical Engineering, IEEE Transactions on, 59(2):303–306, Feb. 2012. ISSN 0018-9294. DOI:
10.1109/tbme.2011.2163157. 47
C. Seife. 23andMe is terrifying; but not for the reasons the FDA thinks. Scientific Amer-
ican, Sept. 2013. https://www.scientificamerican.com/article/23andme-is-terr
ifying-but-not-for-the-reasons-the-fda-thinks/. 33
A. Simmel. Privacy. In International Encyclopedia of the Social Sciences, page 480. MacMillan,
1968. As cited in Cate [1997]. 36
H. A. Simon. Models of Bounded Rationality: Empirically Grounded Economic Reason. MIT Press, 1982. 73
D. J. Solove. Privacy self-management and the consent dilemma. Harvard Law Review, 126:
1880–1903, 2013. URL http://ssrn.com/abstract=2171018. 22, 36, 65, 70, 76
D. J. Solove and W. Hartzog. The FTC and the new common law of privacy. Columbia
Law Review, 114(583), 2014. URL https://ssrn.com/abstract=2312913. DOI:
10.2139/ssrn.2312913. 17
D. J. Solove and M. Rotenberg. Information Privacy Law. Aspen Publishers, New York, USA,
1st edition, 2003. 10, 13, 14, 15, 28, 100, 117
D. J. Solove and P. M. Schwartz. Information Privacy Law. Wolters Kluwer, 6th edition, 2018.
ISBN 978-1454892755. 43
P. Sprenger. Sun on privacy: ‘Get over it.’ Wired News, Jan. 1999. URL https://www.wired.
com/1999/01/sun-on-privacy-get-over-it/. 24
M. Staples, K. Daniel, M. Cima, and R. Langer. Application of micro- and nano-
electromechanical devices to drug delivery. Pharmaceutical Research, 23(5):847–863, 2006.
ISSN 0724-8741. DOI: 10.1007/s11095-006-9906-4. 50
T. Starner. Project Glass: An extension of the self. IEEE Pervasive Computing, 12(2):14–16,
Apr. 2013. ISSN 1536-1268. DOI: 10.1109/mprv.2013.35. 50
Statista. Number of apps available in leading app stores as of 3rd quarter 2018.
http://www.statista.com/statistics/276623/number-of-apps-available-in-
leading-app-stores/, October 2018. 48
B. Stone. Amazon erases Orwell books from Kindle devices, July 2009. URL https://www.
nytimes.com/2009/07/18/technology/companies/18amazon.html. 60
W. J. Stuntz. The substantive origins of criminal procedure. The Yale Law Journal, 105(2):393,
Nov. 1995. ISSN 00440094. DOI: 10.2307/797125. 25
L. Sweeney. k-anonymity: a model for protecting privacy. International Journal of Uncer-
tainty, Fuzziness and Knowledge-Based Systems, 10(5):557–570, 2002. ISSN 0218-4885. DOI:
10.1142/s0218488502001648. 41
P. Swire and K. Ahmad. Foundations of Information Privacy and Data Protection: A Survey of
Global Concepts, Laws and Practices. The International Association of Privacy Professionals
(IAPP), 2012. ISBN 978-0-9795901-7-7. 20
O. Tene and J. Polonetsky. Privacy in the age of big data - a time for big decisions. Stanford
Law Review Online, 64(63), 2012. URL https://www.stanfordlawreview.org/online
/privacy-paradox-privacy-and-big-data/. 85
O. Tene and J. Polonetsky. Big data for all: Privacy and user control in the age of analytics.
Northwestern Journal of Technology and Intellectual Property, 11(5):239, Sept. 2013. URL https:
//papers.ssrn.com/sol3/papers.cfm?abstract_id=2149364. 85
Th. Sc. Community. Pervasive Adaptation: The Next Generation Pervasive Computing Research
Agenda. Institute for Pervasive Computing, Johannes Kepler University Linz, Linz, 2011.
ISBN 9783200022706. URL http://perada.eu/essence. 54
The Economist. When smart becomes spooky. The Economist Online, Nov. 2014. URL https:
//www.economist.com/news/2014/11/13/when-smart-becomes-spooky. 61
The Economist. The signal and the noise. Technical report, London, UK, Mar. 2016. URL ht
tps://www.economist.com/sites/default/files/20160326_tech_politics.pdf. 29
The State of New York. Article 240 - NY penal law, 2018a. URL http://ypdcrime.com/pen
al.law/article240.htm. 23
The State of New York. Article 490 - NY penal law, 2018b. URL http://ypdcrime.com/pen
al.law/article490.htm. 23
J. Turow, M. Hennessy, and N. Draper. The tradeoff fallacy: How marketers are misrepresenting
american consumers and opening them up to exploitation. Technical report, Annenberg
School for Communication, University of Pennsylvania, 2015. DOI: 10.2139/ssrn.2820060.
76
C. Ulbrich. Can spam? Or new can of worms? Wired News, Dec. 2003. URL https://www.
wired.com/2003/12/can-spam-or-new-can-of-worms/. 26
United Nations. Universal declaration of human rights. Adopted and proclaimed by General
Assembly resolution 217 A (III) of December 10, 1948. URL http://www.un.org/Overv
iew/rights.html. 10
United States Government. E-government act of 2002. Public Law 107-347, Dec. 2002. URL
https://www.gpo.gov/fdsys/pkg/PLAW-107publ347/content-detail.html. 89
B. Ur, J. Jung, and S. Schechter. Intruders versus intrusiveness: teens’ and parents’ perspectives
on home-entryway surveillance. In UbiComp 2014, pages 129–139. ACM Press, 2014. ISBN
9781450329682. DOI: 10.1145/2632048.2632107. 61
A. Valdez. Everything you need to know about Facebook and Cambridge analytica. Wired,
March 23, 2018, 2018. URL https://www.wired.com/story/wired-facebook-cambri
dge-analytica-coverage/. 79
W. G. Voss. European Union data privacy law reform: general data protection regulation, privacy
shield, and the right to delisting. Business Lawyer,, 72(1):221–233, Jan. 2017. URL https:
//ssrn.com/abstract=2894571. 43
J. Waldo, H. S. Lin, and L. I. Millett. Engaging privacy and information technology in a digital
age. Technical report, National Research Council, 2007. URL http://www.nap.edu/open
book.php?record_id=11896. DOI: 10.29012/jpc.v2i1.580. 62
Y. Wang and M. Kosinski. Deep neural networks are more accurate than humans at detecting
sexual orientation from facial images. Journal of Personality and Social Psychology, 114(2):246–
257, 2018. DOI: 10.31234/osf.io/hv28a. 84
S. D. Warren and L. D. Brandeis. The right to privacy. Harvard Law Review, 4(5):193–220,
Dec. 1890. ISSN 0017811X. DOI: 10.2307/1321160. 8, 9, 33
Washington State Legislature. Electronic communication devices, chapter 19.300 RCW, 2009.
URL http://app.leg.wa.gov/RCW/default.aspx?cite=19.300. 16
M. Weiser. The computer for the 21st century. Scientific American, 265(3):94–104, Jan. 1991.
ISSN 1536-1268. DOI: 10.1038/scientificamerican0991-94. 47, 49, 50, 51, 52, 57, 58
M. Weiser and J. Brown. The coming age of calm technology. In P. J. Denning and R. M.
Metcalfe, editors, Beyond Calculation: The Next Fifty Years of Computing, volume 8. Springer,
1997. DOI: 10.1007/978-1-4612-0685-9_6. 47, 51, 53
M. A. Weiss and K. Archick. U.S. - EU Data privacy: from Safe Harbor to Privacy Shield.
Technical report, Congressional Research Service, Washington, D.C., USA, May 2016. URL
https://epic.org/crs/R44257.pdf. 19
A. F. Westin. Privacy and Freedom. Atheneum, New York, 1967. DOI: 10.2307/1339271. 11,
26, 32, 33, 34, 35, 36, 37, 43, 73
Wired News. Due process vanishes in thin air. Wired News, April 8, 2003. URL http:
//www.wired.com/news/print/0,1294,58386,00.html. 30
Authors’ Biographies
MARC LANGHEINRICH
Marc Langheinrich is full professor in the Faculty of In-
formatics at the Università della Svizzera Italiana (USI) in
Lugano, Switzerland. His research focuses on privacy in mo-
bile and pervasive computing systems, in particular with a view
towards social compatibility. Other research interests include
usable security and pervasive displays. Marc is a member of the
Steering Committee of the UbiComp conference series and
chairs the IoT conference Steering Committee. He has been
a General Chair or Program Chair of most major conferences
in the field—including Ubicomp, PerCom, Pervasive, and the
IoT conference—and currently serves as the Editor-in-Chief
for IEEE Pervasive Magazine. Marc holds a Ph.D. from ETH Zürich, Switzerland. He can be
reached at [email protected]. For more information, see https://uc.inf.usi.ch/.
FLORIAN SCHAUB
Florian Schaub is an assistant professor in the School of Infor-
mation and the Computer Science and Engineering Division
at the University of Michigan. His research focuses on under-
standing and supporting people’s privacy and security behav-
ior and decision making in complex socio-technological sys-
tems. His research interests span privacy, human-computer
interaction, and emergent technologies, such as the Internet
of Things. Florian received a doctoral degree in Computer
Science from the University of Ulm, Germany, and was a
postdoctoral fellow in Carnegie Mellon University’s School
of Computer Science. His research has been honored with
Best Paper Awards at CHI, the ACM SIGCHI Conference
on Human Factors in Computing Systems, and SOUPS, the Symposium on Usable Privacy and Se-
curity. Florian can be reached at [email protected]. For more information, see https:
//si.umich.edu/people/florian-schaub.