Lect4 ch05

The lecture focuses on the importance of privacy in cyberspace, discussing various aspects such as personal privacy definitions, data gathering techniques, and the implications of cybertechnology on privacy. It highlights concerns related to surveillance, data mining, and the merging of personal information across databases, emphasizing the need for privacy protection and legislation. The session also includes discussions on the value of privacy as a social and individual concern, along with practical exercises for students to engage with privacy laws and data protection principles.


Lecture 4: Privacy and Cyberspace

9/2020
Overview
 Aims & requirements:
Providing students with an understanding of cyberethics issues related to privacy.
Requirements:
- Reading materials before the lecture
- Attending required
 Lecturing format:
 Content:
 Privacy in the Digital Age
 What Is Personal Privacy? Why Is Privacy Important?
 Gathering Personal Data: Surveillance, Recording, and Tracking Techniques
 Exchanging Personal Data: Merging and Matching
 Mining: Big Data, Data Mining, and Web Mining
 Protecting Personal Privacy in Public Space
 Privacy-enhancing technologies
 Privacy Legislation and Industry Self‐Regulation
 Discussion: Importance of Privacy?
 Self-study: Data protection techniques
 Exercise: Conduct a survey of Privacy Laws and Data Protection Principles.
 Reading material: Chapter 5, Textbook
1. Privacy and Cyberspace
 Concerns about personal privacy existed
long before the advent of computers and
cybertechnology.
 Prior to the information era, for example,
technologies such as the camera and the
telephone presented challenges for
privacy.
 So we can ask: what, if anything, is special
about the privacy concerns that are
associated with cybertechnology?
1. Privacy and Cyberspace
 Consider the impact that changes involving this
technology have had on privacy with respect to
the:
 amount of personal information that can be
collected,
 speed at which personal information can be
transmitted,
 duration of time that the information can be
retained,
 kind of information that can be acquired and
exchanged.
2. What Is Personal Privacy?
 Although many definitions of privacy have been
put forth, there is no universally agreed upon
definition of this concept.
 To illustrate this point, consider some of the
metaphors that are typically associated with
privacy:
 "lost,"
 "diminished,"
 "intruded upon,"
 "invaded,"
 "violated,"
 "breached," and so forth.
What is Privacy (continued)?
 Privacy is sometimes viewed as an "all-
or-nothing" concept – that is,
something that one either has (totally)
or does not have.
 At other times, privacy is viewed as
something that can be diminished.
 For example, as a repository of personal
information that can be eroded gradually.
Table 5-1: Three Theories of Privacy

Accessibility Privacy: Privacy is defined in terms of one's physically "being let alone," or freedom from intrusion into one's physical space.

Decisional Privacy: Privacy is defined in terms of freedom from interference in one's choices and decisions.

Informational Privacy: Privacy is defined as control over the flow of one's personal information, including the transfer and exchange of that information.
A Comprehensive Account of
Privacy
 Moor (1997) has introduced a theory of
privacy that incorporates important elements
of the non-intrusion, non-interference, and
informational views of privacy.

 According to Moor:
 an individual has privacy in a situation if in that
particular situation the individual is protected from
intrusion, interference, and information access by
others.
Moor’s Theory of Privacy
(continued)
 An important aspect in this definition is
Moor's notion of a situation.
 A situation is left deliberately broad so that it
can apply to a range of contexts or "zones."
 Situations can be "declared private" in a
normative sense.
 For example, a situation can be an "activity," a
"relationship," or the "storage and access of
information" in a computer or on the Internet.
Moor’s Privacy Theory
(continued)
 Moor’s distinction between naturally private
and normatively private situations enables us
to differentiate between the conditions
required for:
 (a) having privacy (in a descriptive sense)
 (b) having a right to privacy
 With this distinction we can differentiate
between a:
 loss of privacy
 violation of privacy
Two Scenarios
 Scenario 1: Someone walks into the
computer lab and sees you using a
computer.
 Your privacy is lost but not violated.
 Scenario 2: Someone peeps through the
keyhole of your apartment door and
sees you using a computer.
 Your privacy is not only lost but is violated.
3. Why is Privacy Important?
 What kind of value is privacy?
 Is it one that is universally valued?
 The relative importance of privacy may
vary considerably among cultures and
generations; however, we will proceed
on the assumption that privacy has
value and thus is important.
Privacy as a Universal Value
 Not valued the same in all cultures.
 Has at least some value in all societies.
 It may be difficult to get universal
agreement on privacy laws and policies
in cyberspace.
Is Privacy an Intrinsic or
Instrumental Value?
 Not valued for its own sake.
 But is more than an instrumental value in the
sense that it is necessary (rather than merely
contingent) for achieving important human
ends.
 Fried – privacy is necessary for human ends
such as trust and friendship.
 Moor – privacy is an expression of the core
value of security.
Privacy as an Important Social
Value
 Privacy is an important social, as well as an
individual, value.
 Regan (1995) points out that we often frame
debates over privacy simply in terms of how to
balance privacy interests as individual goods
against interests involving the larger social
good;
 in such debates, Regan believes, interests
benefiting the social good will generally
override concerns regarding individual privacy.
Three Ways Privacy Is Threatened
by Cybertechnology
 (A) data-gathering techniques used to collect and
record personal information, often without the
knowledge and consent of users.
 (B) data-exchanging techniques used to transfer and
exchange personal data across and between
computer databases, typically without the knowledge
and consent of users.
 (C) data-mining techniques used to search for
patterns implicit in large databases in order to
generate consumer profiles based on behavioral
patterns discovered in certain groups.
4. Gathering Personal Data
 Personal data has been gathered since
Roman times (census data).
 “Dataveillance” – a term coined by
Roger Clarke to capture two techniques
made possible by computer technology:
 (a) data surveillance (data-monitoring);
 (b) data-recording.
Dataveillance (Continued)
 Video cameras monitor an individual's
physical movements – when they shop at
certain department stores.
 Some motorists are now subject to new
schemes of highway surveillance while driving
in their motor vehicles, because of new forms
of scanning devices such as E-ZPASS.
 Even the number of "clickstreams" – keystrokes
and mouse clicks – entered by a Web
site visitor can be monitored and recorded.
Internet Cookies
 “Cookies” are files that Web sites send to and
retrieve from the computer systems of Web
users.
 Cookies technology enables Web site owners
to collect certain kinds of data about the
users who access their sites.
 Because of "cookies technology," information
about an individual's on-line browsing
preferences can be "captured" whenever a
person visits a Web site.
Cookies (Continued)
 The data recorded (via cookies) about the
user is then stored on a file placed on the
hard drive of the user's computer system.
 No other data-gathering mechanism actually
stores the data it collects on the user’s computer.
 The information can then be retrieved from
the user's system and resubmitted to a Web
site the next time the user accesses that site.
 The exchange of data typically occurs without
a user's knowledge and consent.
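The round trip described above can be sketched with Python's standard http.cookies module. The Set-Cookie and Cookie header names are real; the stored preference value is a hypothetical example, not from the lecture.

```python
from http import cookies

# 1. The Web site builds a cookie recording a browsing preference
#    and sends it to the browser in a Set-Cookie header.
jar = cookies.SimpleCookie()
jar["viewed_category"] = "computer-science-books"
set_cookie_header = jar.output(header="Set-Cookie:")

# 2. The browser stores the value in a file on the user's hard
#    drive and resubmits it on the next visit as a Cookie header.
resubmitted = "viewed_category=computer-science-books"

# 3. The site parses the resubmitted header and recovers the
#    user's earlier preference -- typically without the user's
#    knowledge or consent.
returned = cookies.SimpleCookie()
returned.load(resubmitted)
print(returned["viewed_category"].value)  # → computer-science-books
```

Note that the preference survives between visits only because it is stored on the user's own machine, which is what distinguishes cookies from other data-gathering mechanisms.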
Can Cookies be Defended?
 Web sites that use cookies maintain that they
are performing a service for repeat users of a
Web site by customizing a user's means of
information retrieval.
 They also point out that, because of cookies,
they are able to provide a user with a list of
preferences for future visits to that Web site.
Arguments Against Cookies
 Privacy advocates argue that activities
involving the monitoring and recording of an
individual's activities while visiting a Web site,
and the subsequent downloading of that
information onto a user's PC (without
informing the user), violate privacy.
 They also point out that information gathered
about a user via cookies can eventually be
acquired by on-line advertising agencies, who
could then target that user for on-line ads.
RFID technology
 Another mode of surveillance made possible by cybertechnology
involves the use of RFID technology.
 In its simplest form, RFID technology consists of a tag
(microchip) and a reader.
 The tag has an electronic circuit, which stores data, and an antenna that
broadcasts data by radio waves in response to a signal from a reader.
 The reader also contains an antenna that receives the radio signal, and it has a
demodulator that transforms the analog radio information into suitable data for
any computer processing that will be done
 Like Internet cookies and other online data-gathering and
surveillance techniques, RFID clearly threatens individual
privacy.
 But unlike surveillance concerns associated with cookies, which
track a user’s habits while visiting Web sites, RFID technology
can be used to track an individual’s location in the offline world.
5. Exchanging personal data
 In the previous section, we examined ways in which personal
data could be gathered using surveillance techniques and then
recorded electronically in computer databases.
 Other tools have been devised to transfer and exchange those
records across and between computer databases.
 Simply collecting and recording personal data, per se, might not
seem terribly controversial if, for example, the data were never
used, transferred, exchanged, combined, or recombined.
 Some would argue, however, that the mere collection of
personal data is problematic from a privacy perspective,
assuming that if data are being collected, there must be some
motive or purpose for their collection. Of course, the reason, as
many now realize, is that transactions involving the sale and
exchange of personal data are a growing business.
Computerized Merging and
Matching Operations
 Computer merging is a technique of
extracting information from two or more
unrelated databases, which contain data
about some individual or group of individuals,
and incorporating it into a composite file.
 Computer merging occurs whenever two or
more disparate pieces of information
contained in separate databases are
combined.
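A minimal sketch of computer merging, using hypothetical records an individual gave separately to three unrelated organizations. All names, keys, and values here are illustrative placeholders.

```python
# Hypothetical, disparate databases held by three organizations.
lender_db = {"alice": {"income": 85000, "credit_history": "good"}}
insurer_db = {"alice": {"age": 35, "medical_history": "asthma"}}
political_db = {"alice": {"issue_stance": "pro-privacy-regulation"}}

def merge_records(person, *databases):
    """Extract entries about one individual from unrelated databases
    and incorporate them into a single composite file."""
    composite = {}
    for db in databases:
        composite.update(db.get(person, {}))
    return composite

# The composite reveals a combination of facts that no single
# organization was ever authorized to hold on its own.
profile = merge_records("alice", lender_db, insurer_db, political_db)
```

The point of the sketch is that each source database is legitimate in isolation; the privacy problem arises only at the `update` step, where the pieces are combined.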
Computer Merging
 Consider a scenario in which you voluntarily give
information about yourself to three different
organizations.
 First, you give information about your income and
credit history to a lending institution in order to
secure a loan.
 You next give information about your age and
medical history to an insurance company to purchase
life insurance.
 You then give information about your views on
certain social issues to a political organization you
wish to join.
Computer Merging (continued)
 Each organization has a legitimate need for
information to make decisions about you.
 Insurance companies have a legitimate need
to know about your age and medical history
before agreeing to sell you life insurance.
 Lending institutions have a legitimate need to
know information about your income and
credit history before agreeing to lend you
money to purchase a house or a car.
Computer Merging (continued)
 Suppose that, without your knowledge and consent,
information about you contained in the insurance
company's database is merged with information
about you that resided in the lending institution's
database or in the political organization's database.
 You voluntarily gave certain information about
yourself to three different organizations.
 You authorized each organization to have the specific
information you voluntary granted.
 However, it does not follow that you thereby
authorized any one organization to have some
combination of that information.
Computer Merging (continued)
 Case Illustration
 Double-Click, an on-line advertising
company attempted to purchase
Abacus, Inc. an off-line database
company.
 Double-Click would have been able to
merge on-line and off-line records.
Computer Matching
 Computer matching is a technique that
involves the cross-checking of information in
two or more databases that are typically
unrelated in order to produce certain
"matching records" or "hits."
 Matching or cross-referencing records in two
or more databases in order to generate one
or more hits is used for the express purpose
of creating a new file, which typically contains
a list of potential law violators.
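Cross-referencing two databases to generate "hits" reduces, in the simplest case, to a set intersection. The ID numbers below are hypothetical placeholders.

```python
# Hypothetical agency databases keyed by an identifier such as a
# taxpayer ID; neither database alone singles anyone out.
low_income_tax_filers = {"111-22-3333", "222-33-4444", "555-66-7777"}
luxury_car_registrants = {"222-33-4444", "555-66-7777", "888-99-0000"}

# Each record appearing in both databases is a "hit"; together the
# hits form a new file listing potential law violators.
hits = low_income_tax_filers & luxury_car_registrants
print(sorted(hits))  # → ['222-33-4444', '555-66-7777']
```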
Computer Matching
(continued)
 In federal and state government applications,
computerized matching has been used by
various agencies and departments to identify:
 potential law violators;
 individuals who have actually broken the law
or who are suspected of having broken the
law (welfare cheats, deadbeat parents, etc.).
Computer Matching
(continued)
 A scenario could be federal income tax
records matched against state motor
vehicle registration (looking for low
income and expensive automobiles).
 Consider an analogy in physical space in
which your mail is monitored and
secretly matched or opened by
authorities.
Computer Matching
(continued)
 Those who defend matching argue:
 If you have nothing to hide, you have nothing to
worry about.
 Another argument is:
 Privacy is a legal right.
 Legal rights are not absolute.
 When one violates the law (i.e., commits a crime),
one forfeits one's legal rights.
 Therefore, criminals have forfeited their right to
privacy.
Computer Matching
(continued)
 Case illustration involving biometrics:
 At Super Bowl XXXV in January 2001, a facial-
recognition technology was used to scan the
faces of individuals entering the stadium.
 The digitized facial images were then
instantly matched against images contained
in a centralized database of suspected
criminals and terrorists.
 This practice was, at the time, criticized by
many civil-liberties proponents.
6. Mining Personal Data
 Data mining involves the indirect gathering of
personal information through an analysis of
implicit patterns discoverable in data.
 Data-mining activities can generate new and
sometimes non-obvious classifications or
categories.
 Individuals whose data is mined could
become identified with or linked to certain
newly created groups that they might never
have imagined to exist.
Data Mining (Continued)
 Current privacy laws offer individuals little
protection regarding how information about them
that is acquired through data-mining activities
is subsequently used.
 Important decisions can be made about those
individuals based on the patterns found in the
mined personal data.
 So some uses of data-mining technology raise
special concerns for personal privacy.
Data Mining (Continued)
 Unlike personal data that resides in explicit
records in databases, information acquired
about persons via data mining is often
derived from implicit patterns in the data.
 The patterns can suggest "new" facts,
relationships, or associations about that
person, such as that person's membership in
a newly "discovered" category or group.
Data Mining (Continued)
 Much personal data collected and used
in data-mining applications is generally
considered to be neither confidential
nor intimate in nature.
 So there is a tendency to presume that
such data must by default be public
data.
Data Mining (Continued)
 Hypothetical Scenario (Lee):
 Lee is a 35-year old junior executive;
 Lee applies for a car loan;
 Lee has an impeccable credit history;
 A data mining algorithm “discovers” that Lee
belongs to a group of individuals likely to
start their own business and declare
bankruptcy;
 Lee is denied the loan based on data mining.
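A group like the one Lee is assigned to is not stored explicitly in any record; it emerges from patterns across many records. The toy sketch below shows one such pattern-discovery idea (co-occurrence counting); the records and attributes are invented for illustration.

```python
from collections import Counter
from itertools import combinations

# Hypothetical customer records; no single record states the
# "discovered" group, which emerges only from patterns in the data.
records = [
    {"junior_executive", "car_loan", "business_magazine"},
    {"junior_executive", "car_loan", "startup_seminar"},
    {"retiree", "savings_account"},
    {"junior_executive", "car_loan", "startup_seminar"},
]

# Count how often each pair of attributes co-occurs.
pair_counts = Counter()
for rec in records:
    pair_counts.update(combinations(sorted(rec), 2))

# Pairs appearing in at least half of all records become implicit
# "categories" that individuals like Lee can be linked to.
threshold = len(records) / 2
patterns = {pair for pair, n in pair_counts.items() if n >= threshold}
```

Real data-mining algorithms are far more sophisticated, but the privacy concern is the same: the resulting categories are non-obvious and never appeared in any record the individual knowingly disclosed.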
Techniques for Manipulating
Personal Data
Data Merging: A data-exchanging process in which personal data from two or more sources is combined to create a "mosaic" of individuals that would not be discernible from the individual pieces of data alone.

Data Matching: A technique in which two or more unrelated pieces of personal information are cross-referenced and compared to generate a match, or "hit," that suggests a person's connection with two or more groups.

Data Mining: A technique for "unearthing" implicit patterns in large databases or "data warehouses," revealing statistical data that associates individuals with non-obvious groups; user profiles can be constructed from these patterns.
Data Mining on the Internet
(web mining)
 Traditionally, data mining is done in large
“data warehouses” (off-line).
 "Intelligent agents" or "softbots" acting on
behalf of human beings sift through and
analyze the mounds of data on the Internet.
 Metasearch engines "crawl" through the Web
in order to uncover general patterns from
information retrieved from search-engine
requests across multiple Web sites.
Data Mining on the Internet
 Data mining techniques are now also used by
commercial Web sites to analyze data about
Internet users, which can then be sold to third
parties.
 This process is sometimes referred to as "Web
mining," which has been defined as the
application of data mining techniques to
discover patterns from the Web.
 The kinds of patterns discovered from Web
mining can be useful to marketers in
promotional campaigns.
7. Protecting Privacy in Public
Space
 Non-Public Personal Information (or NPI)
refers to sensitive information such as that in
one's financial and medical records.
 NPI has some legal protection.
 Many privacy analysts are now concerned
about a different kind of personal information
– Public Personal Information (or PPI).
 PPI – non-confidential and non-intimate in
character – is also being mined.
PPI
 Why should the collection of PPI, which is
publicly available information about persons,
generate controversies involving privacy?
 It might seem that there is little to worry about.
 For example, suppose someone learns that you are a
student at Rivier, you frequently attend college
basketball games, and you are actively involved in
Rivier's computer science club.
 In one sense, the information is personal because
it is about you (as a person); but it is also about
what you do in the public sphere.
PPI (Continued)
 In the past, it would have been difficult to
make a strong case for such legislation
protecting PPI, because lawmakers and
ordinary persons would have seen no need to
protect that kind of personal information.
 Nissenbaum (1997) believes that our earlier
assumptions about the need to protect
privacy in public are no longer tenable
because of a misleading assumption:
 There is a realm of public information about
persons to which no privacy norms apply.
PPI (Continued)
 Hypothetical Scenario:
 (a) Shopping at Supermart;
 (b) Shopping at Nile.com;
 Reveal problems of protecting privacy in
public in an era of information
technology and data mining.
Search Engines and
Personal Information
 Search facilities can be used to gain
personal information about individuals
(e.g., the Amy Boyer example).
 Your Web activities can be catalogued
(Deja News) and referenced by search
engines.
 Scenario – using a search engine to
locate a friend.
Accessing Public Records via
the Internet
 What are public records?
 Why do we have them?
 Traditionally, they were accessed via
hardcopy documents that resided in
municipal buildings.
 Recall the Amy Boyer case.
 Would it have made a difference?
Accessing Public Records via
the Internet (continued)
 Some “information merchants” believe that
because public records are, by definition,
"public," they must be made available online.
 They reason:
 Public records have always been available to the
public.
 Public records have always resided in public
space.
 The Internet is a public space.
 Therefore, all public records ought to be made
available on-line.
Accessing Public Records via
the Internet (continued)
 Two Case illustrations:
 State of Oregon (Motor Vehicle
Department);
 Merrimack, NH (tax records for city
residents).
8. Privacy-Enhancing
Technologies (PETs)
 Privacy advocates have typically argued for
stronger privacy laws to protect individuals.
 Groups representing the e-commerce sector
have lobbied for voluntary controls and
industry self-regulation as an alternative to
additional privacy legislation.
 Now, some members of each camp support a
compromise resolution to the on-line privacy
debate in the form of privacy-enhancing tools
or PETs.
PETs
 PETs can be understood as tools that
users can employ either to:
 (a) protect their personal identity while
interacting with the Web;
 (b) protect the privacy of communications
(such as e-mail) sent over the Internet.
 An example of (b) is encryption tools that
encode and decode e-mail messages.
 Our main focus in this section is on
whether PETs actually accomplish (a).
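As a toy illustration of (b), a one-time pad shows the encode/decode idea behind encryption tools. Real e-mail encryption PETs use far more sophisticated schemes; this sketch is only conceptual.

```python
import secrets

# Toy one-time-pad sketch of the encode/decode idea behind
# encryption PETs; NOT a real e-mail encryption scheme.
def xor_bytes(data: bytes, key: bytes) -> bytes:
    return bytes(d ^ k for d, k in zip(data, key))

message = b"meet me at noon"
key = secrets.token_bytes(len(message))  # shared secret key

ciphertext = xor_bytes(message, key)     # encode before sending
recovered = xor_bytes(ciphertext, key)   # decode on receipt
assert recovered == message
```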
PETs
 Some PETs enable users to navigate the Internet either
anonymously or pseudonymously; one of the best-known
anonymity tools is available from Anonymizer.com.
 It is important to note that although Anonymizer users
enjoy anonymity while visiting Web sites, they are not
anonymous to Anonymizer.com or to their own ISPs.
 A user’s activities on a Web site can be recorded in server
log files and can thus be traced back to a specific ISP and
IP address.
 To enjoy complete anonymity on the Internet, online
users need tools that do not require them to place their
trust in a single “third party” (such as Anonymizer).
PETs
 Another useful tool is TrackMeNot
(http://cs.nyu.edu/trackmenot/), which was designed to
work with the Firefox Web browser to protect users
against surveillance and data profiling by search engine
companies.
 Rather than using encryption or concealment tools to
accomplish its objectives, TrackMeNot instead uses “noise
and obfuscation.” In this way, a user’s Web searches
become “lost in a cloud of false leads.”
 By issuing randomized search queries to popular search
engines such as Google and Bing, TrackMeNot “hides
users’ actual search trails in a cloud of ‘ghost’ queries.”
This technique makes it difficult for search engine
companies to aggregate the data they collect into accurate
user profiles.
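The "noise and obfuscation" idea can be sketched in a few lines. This is not TrackMeNot's actual code; the decoy pool and function names are invented for illustration.

```python
import random

# Sketch of query obfuscation in the spirit of TrackMeNot: a real
# query is hidden among randomized "ghost" queries, so a search log
# no longer reflects the user's actual interests.
DECOY_POOL = ["weather forecast", "easy recipes", "sports scores",
              "movie times", "gardening tips", "bus schedule", "trivia"]

def obfuscated_queries(real_query, n_decoys=5, rng=random):
    """Mix one real query into a shuffled stream of decoy queries."""
    stream = rng.sample(DECOY_POOL, n_decoys) + [real_query]
    rng.shuffle(stream)
    return stream

# The real query is present but indistinguishable by position.
stream = obfuscated_queries("privacy law thesis topics")
```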
PETs
 Although PETs such as Anonymizer and
TrackMeNot assist users in navigating the Web
with relative anonymity, they are not useful for e-
commerce transactions in which users must reveal
their actual identities.
 Many e-commerce sites now provide users with a
stated privacy policy that is backed by certified
“trustmarks” or trust seals. These trust agreements
between users and e-commerce sites can also be
viewed as PETs in that they are intended to protect
a user’s privacy during a consumer transaction.
PETs (Continued)
 But are they adequate to the task?
 To answer this question, we next
analyze PETs in relation to two specific
challenges:
 Consumer education
 Informed consent
Educating Users About PETs
 How are users supposed to find out about PETs?
 Consider that Web sites are not required to inform
users about the existence of PETs or to make those
tools available to them.
 Furthermore, online consumers must not only
discover that PETs are available, but they must
also learn how to use these tools.
 So at present, responsibility for learning about
PETs and how to use them is incumbent upon
consumers.
 Is it reasonable and is it fair to expect users to be
responsible for these tasks?
Educating Users About PETs
 Recall our earlier discussion of cookies. Although
many Web browsers allow users to reject cookies,
the default is that cookies will be accepted unless the
user explicitly rejects them. But why?
 The Web site could also inform, and possibly educate, the
user about the existence of cookies, and then ask whether
he or she is willing to accept them
 Following Judith DeCew (2006), we could “presume
in favor of privacy” and then develop ways that
would allow individuals to determine for themselves
how and when that presumption should be
overridden. (This is part of a process that DeCew
refers to as “dynamic negotiation.”)
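The difference between today's default and a "presume in favor of privacy" default can be expressed in a few lines; the function and flag names below are hypothetical.

```python
# Sketch contrasting the current browser default (cookies accepted
# unless the user rejects them) with DeCew's "presume in favor of
# privacy" default; names are hypothetical.
def accept_cookie(user_choice=None, presume_privacy=False):
    if user_choice is not None:
        return user_choice       # an explicit, informed choice wins
    return not presume_privacy   # otherwise the default decides

assert accept_cookie() is True                       # today's default
assert accept_cookie(presume_privacy=True) is False  # privacy default
assert accept_cookie(user_choice=True, presume_privacy=True) is True
```

Either way the user can override the default; what changes is which outcome obtains when the user never makes an explicit choice.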
PETs and the Principle of
Informed Consent
 Do PETs adequately support users in making
informed decisions about the disclosure of their
personal data in commercial transactions?
 Traditionally, the principle of informed consent has
been the model, or standard, in contexts involving the
disclosure of one’s personal data.
 However, users who willingly consent to provide
information about themselves for one purpose
(e.g., in one transaction) may have no idea how
that information can also be used in secondary
applications.
PETs and the Principle of
Informed Consent
 Users might enter into an agreement with
Web site owners (if they have a privacy
policy).
 They typically have to "opt out" of having
information collected. (The default practice is
that they have opted in, unless they specify
otherwise.)
 Policies involving PETs cannot protect users
against secondary and future uses of their
information.
9. Privacy Legislation and
Industry Self-Regulation
 We saw in the previous section that even though PETs
offer users a means to protect their identity in certain
kinds of activities, they are not the “magic bullet” many of
their staunchest supporters have suggested.
 Recognizing the limitations of PETs, some privacy
advocates believe that stronger privacy laws will protect
consumers.
 Others in the commercial sector, for example, believe that
additional privacy legislation is neither necessary nor
desirable. Instead, they suggest strong industry controls
regulated by standards.
Example
 An industry-backed (self-regulatory) initiative called TRUSTe was designed to
help ensure that Web sites adhere to the privacy policies they advertise.
 TRUSTe uses a branded system of “trustmarks” (graphic symbols), which
represent a Web site’s privacy policy regarding personal information.
 Trustmarks provide consumers with the assurance that a Web site’s privacy
practices accurately reflect its stated policies.
 Through this PET-like feature, users can file a complaint to TRUSTe if the
Website bearing its trust seal does not abide by the stated policies.
 Any Web site that bears the TRUSTe mark and wishes to retain that seal must
satisfy several conditions:
 The Website must clearly explain in advance its general information-collecting
practices, including which personally identifiable data will be collected, what the
information will be used for, and with whom the information will be shared.
 Web sites that bear a trust seal but do not conform to these conditions can have
their seal revoked. And Websites displaying trust seals, such as TRUSTe, are
subject to periodic and unannounced audits of their sites.
Industry Self-Regulation
 Can industry regulate privacy without
government regulation and privacy
legislation?
 Toysmart case
Toysmart case
 Consider, for example, the case of Toysmart.com, an e-commerce site
that operated in Massachusetts.
 Consumers who purchased items from Toysmart were assured, via an
online trust seal, that their personal data would be protected. The
vendor’s policy stated that personal information disclosed to Toysmart
would be used internally but would not be sold to or exchanged with
external vendors. So, users who dealt with Toysmart expected that their
personal data would remain in that company’s databases and not be
further disclosed or sold to a third party.
 In the spring of 2000, however, Toysmart was forced to file for
bankruptcy.
 In the bankruptcy process, Toysmart solicited bids for its assets, which included its
databases containing the names of customers.
 Were the parties interested in purchasing that information under any obligation to
adhere to the privacy policy that Toysmart had established with its clients?
 If not, whoever either took over Toysmart or purchased its databases, would, in
principle, be free to do whatever they wished with the personal information in them,
despite the fact that such information was given to Toysmart by clients under the
belief that information about them would be protected indefinitely.
Toysmart case
 The Toysmart incident illustrates a situation in which users
had exercised control over their personal information in
one context—that is, in electing whether to disclose
information about themselves to Toysmart in online
transactions—based on specific conditions stated in
Toysmart’s privacy policy.
 However, it also turned out that these individuals were
not guaranteed that the personal information they
disclosed to Toysmart would be protected in the future.
Thus, it would seem that controls beyond those provided
by trustmarks and e-commerce vendors are needed.
Privacy laws and Data
protection Principles
 Many nations have enacted strong
privacy legislation
 EU Directive;
 US (a patchwork of laws).
 EU nations have, through the
implementation of strict data protection
principles, been far more aggressive
than the United States in addressing
privacy concerns of individuals.
Comprehensive Privacy
Proposals
 While there has been no uniform consensus on a
comprehensive privacy policy, especially one that could be
implemented across international borders, there does
seem to be considerable agreement on at least one point:
any comprehensive privacy policy should be as
transparent as possible.
 We might argue for a "co-regulatory" model in which a
successful on-line privacy policy must include:
 strong legislation;
 a privacy oversight commission;
 industry self-regulation.
 These must also be accompanied by privacy-enhancing
technologies.
 A "privacy watchdog agency" and sanctions are also both
needed.
Review Questions
 Identify and briefly describe four ways in
which the privacy threats posed by
cybertechnology differ from those posed
by earlier technologies.
 What is personal privacy, and why is
privacy difficult to define?
 Describe some important characteristics
that differentiate “accessibility privacy,”
“decisional privacy,” and “informational
privacy.”