
Ethics and Professional Issues

CS210/211 Spring 2024


Lecture 5: Privacy
Reminders
• Weekly assignment 2 is due today
• Weekly assignment 3 has been posted
• The midterm will be in the CBTF March 20th – 22nd
PRIVACY
Three Philosophies
1. Accessibility Privacy (1800s)
– Freedom from unwarranted intrusion
– Physical access to your person
– Fourth Amendment (search and seizure)
2. Decisional Privacy (1900s)
– Freedom from interference in personal affairs
– Personal choices, plans, and decisions are yours
– First Amendment, Griswold v. Connecticut, Roe v. Wade
Three Philosophies
3. Informational Privacy (2000s)
– Control over the flow of personal information
– How can your information be gathered, stored, mined, combined, exchanged?
– Legal decisions ongoing (policy vacuum)
Moor’s Theory of Privacy
• descriptive privacy
– privacy protected by physical means
– descriptive privacy can be lost
• normative privacy
– privacy protected by policies
– normative privacy can be violated

James Moor
Nissenbaum’s Theory of Privacy
• Privacy as “contextual integrity”
– Norms depend on context (e.g., school vs. family)
1. Norms of appropriateness
– Should information be gathered/divulged?
2. Norms of distribution
– Should information be shared?
– Transfers data to a new context
• Violating norms results in a breach
Helen
Nissenbaum
Hartzog’s Theory of Privacy
1. Trust – willingness to make oneself vulnerable to the actions of others
2. Obscurity – an observer does not have the context to make sense of an individual
3. Autonomy – a person can engage in relationships of trust and maintain reliable zones of obscurity

Woodrow
Hartzog
Woodrow Hartzog rejects the idea that there is a bright line between private and public information. What does he believe can give people a degree of privacy, even when they're in public?

a) Obscurity
b) Consent
c) Context
d) Accessibility
Privacy Vocabulary
• Personally identifiable information (PII) - information used to uniquely identify a person
• Nonpublic personal information (NPI) - confidential and private information about a person
• Public personal information (PPI) - public information about a person
• Metadata - data describing other data
THREATS TO PRIVACY
Data gathering
• dataveillance – data surveillance (data brokers)
• $156 billion industry
• Collect data from:
– Internet cookies, search history, IP address, browser fingerprinting
– GPS, WiFi, cell calls, text messages
– Purchases, debt, income, transactions
– Social interactions, network usage, messaging
– Credit history, medical history, academic history
• Acxiom
• Alliance Data (Epsilon)
• BlueKai
• cuebiq
• Datalogix
• Equifax
• Epsilon
• Exact Data
• Exactis
• Experian
• Flurry
• Gravy Analytics
• Instant Checkmate
• LiveRamp
• Paramount Lists
• PeekYou
• pipl
• RapLeaf
• RELX Group (ChoicePoint, Reed Elsevier, Seisint)
• Rocket Fuel
• Spokeo
• Take 5 Solutions
• TransUnion
• “behavioral targeting”
• 96% of American households in database
• Race, gender, phone number, car, education, family, age,
height, income, politics, pet ownership
• 1.1 billion browser cookies
• 500 million people with 1,500 data points per person
Personal Data ≈ Radioactive Waste
• Easy to generate
• Easy to store in the short term
• Harmful if released
• Almost impossible to dispose of
• Requires very long-term planning to manage

Maciej Ceglowski
Exchanging
• Data can be merged and matched
– integrated into a composite
– unique identifiers allow searching across multiple databases
• Potentially violates contextual integrity
• Possible breach of:
a) Norms of appropriateness
b) Norms of distribution
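The merge-and-match idea above can be sketched in a few lines. The identifiers and records below are made up for illustration: two databases gathered in different contexts, joined on a shared unique identifier, yield a composite profile that neither source held alone.

```python
# Hypothetical records: each database was gathered in its own context.
purchases = {  # context: retail
    "id-2345": {"last_purchase": "pregnancy test"},
}
medical = {    # context: healthcare
    "id-2345": {"insurer": "Acme Health"},
}

def merge_on_id(*databases):
    """Combine records that share the same unique identifier."""
    composite = {}
    for db in databases:
        for uid, record in db.items():
            composite.setdefault(uid, {}).update(record)
    return composite

profiles = merge_on_id(purchases, medical)
# The composite crosses contexts: retail data and medical data now sit
# in one profile, a potential breach of both norms of appropriateness
# and norms of distribution.
print(profiles["id-2345"])
```

The join itself is trivial; the ethical weight comes entirely from moving data out of the context in which it was gathered.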
Mining
• data mining indirectly gathers personal information through
pattern matching
• "Big data" "cloud" "deep learning"
• Difficult to regulate
– Patterns are implicit in the data
– Data is typically nonconfidential
– Data is not necessarily exchanged
Deanonymization
• Mining to derive PII from anonymous data
• Workflow:
– 1) identify unique users in the data
– 2) correlate features of individual with public data
• Examples:
– AOL search history
– Netflix subscriber data deanonymized using IMDB
– Surnames inferred from raw DNA sequence data
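The two-step workflow above can be sketched directly. All records here are invented: a combination of quasi-identifiers (ZIP code, birth year, gender) that is unique in the "anonymous" dataset is correlated against a hypothetical public record, such as a voter roll, to recover a name.

```python
# Made-up "anonymized" records: no names, but quasi-identifiers remain.
anonymous = [
    {"zip": "61820", "birth_year": 1990, "gender": "F", "diagnosis": "flu"},
    {"zip": "61820", "birth_year": 1990, "gender": "M", "diagnosis": "asthma"},
]
# Made-up public dataset (e.g., a voter roll) with the same fields plus names.
public = [
    {"name": "Alice Smith", "zip": "61820", "birth_year": 1990, "gender": "F"},
]

def deanonymize(anon_rows, public_rows, keys=("zip", "birth_year", "gender")):
    matches = []
    for row in anon_rows:
        sig = tuple(row[k] for k in keys)
        # Step 1: identify users whose quasi-identifier signature is unique.
        if sum(tuple(r[k] for k in keys) == sig for r in anon_rows) != 1:
            continue
        # Step 2: correlate that signature with the public data.
        for pub in public_rows:
            if tuple(pub[k] for k in keys) == sig:
                matches.append({"name": pub["name"], **row})
    return matches

# Links "Alice Smith" to the flu diagnosis, despite the data being "anonymous".
print(deanonymize(anonymous, public))
```

The same join logic underlies the real cases on the slide: the Netflix dataset was linked to IMDB reviews, and search histories in the AOL release pointed back to individuals.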
PROTECTION
Informed Consent
• Informed consent - the user understands and agrees to how the data is used
• Current uses of information are opaque to the user
– Do you read the TOS?
– Can you predict data mining?
• Acting on presumed consent
– opt-out is the standard default
Privacy-Enhancing Technology (PET)
• End-to-End Encryption (E2EE)
– WhatsApp, iMessage
– Signal, Threema, OTR
– PGP, SSL/TLS
• Anonymizing tools
– Tor, proxies
– Ghostery, Stealthy, uBlock, Disconnect
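The E2EE idea above can be illustrated with a toy sketch. This is not real cryptography (the apps listed use protocols such as the Signal/Double Ratchet protocol, not a one-time-pad XOR); the point is only that the key exists solely on the two endpoint devices, so any relay server in between sees nothing but ciphertext.

```python
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # One-time-pad XOR: illustrative only, not a production cipher.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

message = b"meet at noon"
# The key is shared only between the two endpoints, never with the server.
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)   # this is all the relay server ever sees
assert ciphertext != message
assert decrypt(key, ciphertext) == message  # only a key holder can recover it
```

The design consequence is what makes E2EE a privacy-enhancing technology: the service provider cannot read, mine, or hand over message contents, because it never holds the key.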
Policies
• Self-Regulation
– TRUSTe
Laws
• US Laws
– FCRA (1970), FERPA (1974), ECPA (1986) and HIPAA (1996)
• European Laws
– EU Directive 95/46/EC (1995)
– GDPR
Minimalism
• Design to protect users
• Only collect necessary data
• Store data as briefly as is needed
• Make sure it is deleted

Bruce Schneier
Ethan Zuckerman
What do you think is most important?
a) Privacy Enhancing Technology
b) Industry self-regulation
c) Government policies (laws)
d) Minimalism
Kara Swisher

"[A]nyone who wants to spy needs very little, as all of us continue availing ourselves to tech's many wonders while promiscuously shedding our data… it's up to us to protect ourselves, since there are no federal laws that actually do it."
