Data Protection
* Associate Professor of Law, George Washington University Law School. Professor Solove
has discussed many of the problems and solutions herein in his book, The Digital Person: Technology
and Privacy in the Information Age (2004).
** Director, Electronic Privacy Information Center West Coast Office. Chris Hoofnagle has
discussed many of the problems and solutions herein in his articles, Big Brother’s Little Helpers: How
ChoicePoint and Other Commercial Data Brokers Collect and Package Your Data for Law Enforce-
ment, 29 N.C.J INT’L L. & COM. REG. 595 (2004), available at http://ssrn.com/abstract=582302; and Put-
ting Identity Theft on Ice: Freezing Credit Reports to Prevent Lending to Impostors, Jan. 5, 2005, http://
ssrn.com/abstract=679581. Marc Rotenberg of the Electronic Privacy Information Center and Beth
Givens of the Privacy Rights Clearinghouse have provided substantial comments which are incorpo-
rated in an earlier iteration.
I. INTRODUCTION
Privacy protection in the United States has often been criticized, but
critics have too infrequently suggested specific proposals for reform. Re-
cently, there has been significant legislative interest at both the federal
and state levels in addressing the privacy of personal information. This
was sparked when ChoicePoint, one of the largest data brokers in the
United States with records on almost every adult American citizen, dis-
closed that it had sold data on about 145,000 people to fraudulent businesses
set up by identity thieves.1 Other companies announced security
breaches, including LexisNexis, from which personal information belong-
ing to about 310,000 people was improperly accessed.2 Senator Schumer
criticized Westlaw for making available to certain subscribers personal
information including Social Security Numbers (SSNs).3
In the aftermath of the ChoicePoint debacle and other major infor-
mation security breaches, both of us have been asked by Congressional
legislative staffers, state legislative policymakers, journalists, academics,
and others about what specifically should be done to better regulate in-
formation privacy. In response to this question, we believe that it is im-
perative to have a discussion of concrete legislative solutions to privacy
problems.
What appears below is our attempt at such an endeavor. Privacy
experts have long suggested that information collection be consistent
with Fair Information Practices. This Model Regime incorporates many
of those practices and applies them specifically to the context of com-
mercial data brokers such as ChoicePoint. We hope that this will provide
useful guidance to legislators and policymakers in crafting laws and regu-
lations. We also intend this to be a work-in-progress in which we col-
laborate with others. We have welcomed input from other academics,
policymakers, journalists, and experts, as well as from the industries and
businesses that will be subject to the regulations we propose. We have
incorporated criticisms and constructive suggestions, and we will con-
tinue to update this Model Regime to include the comments we find
most helpful and illuminating.
1. Joseph Menn, Did the ChoicePoint End Run Backfire?, L.A. TIMES, Mar. 13, 2005, at C1;
Bob Sullivan, Database Giant Gives Access to Fake Firms: ChoicePoint Warns More than 30,000 They
May Be at Risk, MSNBC, Feb. 14, 2005, http://www.msnbc.msn.com/id/6969799/. In November 2005,
ChoicePoint informed an additional 17,000 individuals that their personal data may also have been
compromised, bringing the total number to 162,000. Michael Hiltzik, Big Data Broker Eyes DMV Re-
cords, L.A. TIMES, Dec. 1, 2005, at C1.
2. David Colker, LexisNexis Breach Is Larger: The company reveals that personal data files on
as many as 310,000 people were accessed, L.A. TIMES, Apr. 13, 2005, at C1.
3. Ken Fireman, Identity Theft Made Easy, Schumer Warns, NEWSDAY, Feb. 25, 2005, at A02.
The database industry has its roots in the rise of consumer reporting
agencies—companies that gather and sell personal information on indi-
viduals for business purposes. The consumer reporting industry began
over a century ago. The first major consumer reporting agency, Retail
Credit Co., was founded in 1899, and over the years, it grew in size and
began selling reports about individuals to insurers and employers.6
By the 1960s, significant controversy surrounded the credit report-
ing agencies. There were questionable practices in the industry, including
requirements that investigators fill quotas of negative information on
data subjects.7 To do this, some investigators fabricated negative infor-
mation; others included incomplete information.8 Additionally, the in-
vestigators were collecting “lifestyle” information on data subjects, in-
cluding their sexual orientation, marital situation, drinking habits, and
cleanliness.9 The credit-reporting agencies were maintaining outdated
information and, in some cases, providing the file to law enforcement
and to unauthorized persons. Individuals had no right to see what was in
their files. Public exposure of the industry resulted in an extensive con-
10. 15 U.S.C. § 1601 (2000).
11. Id. § 1681e(b).
12. Id. § 1681.
13. Id. § 1681b.
14. Id. § 1681g.
15. Id. § 1681i.
16. Id. § 1681m(a).
17. Id. § 1681b(b).
18. Id.
19. Id. § 1681b.
20. Id. § 1681u.
21. PRISCILLA M. REGAN, LEGISLATING PRIVACY: TECHNOLOGY, SOCIAL VALUES, AND PUBLIC POLICY 71–82 (1995).
22. SECRETARY’S ADVISORY COMM. ON AUTOMATED PERSONAL DATA SYS., U.S. DEPT. OF
HEALTH, EDUC. & WELFARE, RECORDS, COMPUTERS AND THE RIGHTS OF CITIZENS (1973), available
at http://epic.org/privacy/hew1973report (follow “Summary and Recommendations” hyperlink).
23. Id.
24. Id.
25. Id.
26. Id.
27. Id.
28. Id.
29. 5 U.S.C. § 552a(d) (2000).
30. Id. § 552a(e); see also Marc Rotenberg, Fair Information Practices and the Architecture of
Privacy (What Larry Doesn’t Get), 2001 STAN. TECH. L. REV. 1.
31. 5 U.S.C. § 552a(b).
32. Id. § 552a(g).
tion, with the FCRA addressing the key private sector uses of personal
data and the Privacy Act addressing public sector uses.
33. ChoicePoint Inc., General Form for Registration of Securities Pursuant to Section 12(b) or
12(g) of the Securities Exchange Act of 1934 (Form 10) (June 6, 1997), available at http://www.sec.gov/
Archives/edgar/data/1040596/0000950144-97-006666.txt.
34. See EPIC ChoicePoint Page, http://www.epic.org/privacy/choicepoint/ (last visited Sept. 13,
2005).
35. ChoicePoint Inc., Annual Report (Form 10-K), at 5 (Mar. 15, 2005), available at http://www.
sec.gov/Archives/edgar/data/1040596/000095014405002577/g93507e10vk.htm#102.
36. ChoicePoint Inc., Annual Report (Form 10-K), at 3 (Mar. 26, 2003), available at http://www.
sec.gov/Archives/edgar/data/1040596/000095014401500429/g68168e10-k.txt.
37. ROBERT O’HARROW, JR., NO PLACE TO HIDE 130 (2005).
38. Testimony of Derek Smith: Before the Subcomm. on Commerce, Trade and Consumer Pro-
tection of the H. Energy and Commerce Comm., 109th Cong. (2005) (testimony of Derek Smith,
Chairman and Chief Executive Officer, ChoicePoint Inc.), available at http://energycommerce.house.
gov/108/Hearings/03152005hearing1455/Smith.pdf.
39. O’HARROW, supra note 37, at 34.
sometimes know what some people read, what they order over the
phone and online, and where they go on vacation.40
LexisNexis is a corporation owned by the United Kingdom-based
Reed Elsevier that offers access to numerous databases and information
retrieval services.41 Through services such as its featured search tool
“SmartLinx,” LexisNexis offers access to SSNs, addresses, licenses, real
estate holdings, bankruptcies, liens, marital status, and other personal in-
formation.42
ChoicePoint, Acxiom, and LexisNexis are three of the larger data
brokers. There are many other companies that comprise this industry.
The database industry provides data to companies for marketing, to the
government for law enforcement purposes, to private investigators for
investigating individuals, to creditors for credit checks, and to employers
for background checks.
The government increasingly has been contracting with data bro-
kers. For example, ChoicePoint has multimillion dollar contracts with at
least thirty-five federal agencies, including the Internal Revenue Service
(IRS) and the FBI.43 The United States Marshals Service uses Lex-
isNexis for “location of witnesses, suspects, informants, criminals, parol-
ees in criminal investigations, location of witnesses, [and] parties in civil
actions.”44 LexisNexis’s Person Tracker Plus Social Security number is a
private library “designed to meet the needs of law enforcement.”45 It
provides information probably derived from credit headers, including
name, SSN, current address, two prior addresses, aliases, birth date, and
telephone number.46 After 9/11, Acxiom positioned itself as “an anti-
terrorism company” by actively pursuing ways to manage personal in-
formation for the government.47 Charles Morgan, Acxiom’s CEO, stated
after 9/11 that “we developed a sense among the leadership at Acxiom
that for this country to be a safer place [the government] had to be able
to work with information better.”48
The government is becoming increasingly interested in data-mining
technologies. Data mining involves searching through repositories of
data to find out new information by combining existing data or to make
The FCRA and the Privacy Act do not adequately address the ac-
tivities of the database industry. The FCRA applies to “any consumer
reporting agency” that furnishes a “consumer report.”54 The definition
of “consumer reporting agency” is any person who “regularly engages”
in collecting information about consumers “for the purpose of furnishing
consumer reports to third parties.”55 This definition turns on the mean-
ing of “consumer report,” which is the key term that defines the scope of
the FCRA. Unfortunately, the FCRA has a poorly drafted definition of
“consumer report” that has allowed some to unduly narrow the FCRA’s
coverage. The FCRA conditions the definition of “consumer report” on
how the information is used. That is, a “consumer report” is any com-
munication bearing on a consumer’s character or general reputation
which is used or collected for credit evaluation, employment screening,
56. Id. § 1681a(d) (emphasis added); see also id. § 1681b(a)(3)(A)–(E). For an account of other
limitations of the FCRA, see Joel R. Reidenberg, Privacy in the Information Economy: A Fortress or
Frontier for Individual Rights?, 44 FED. COMM. L.J. 195, 210–13 (1992).
57. OFFICE OF THE GEN. COUNSEL, NAT’L SEC. LAW UNIT, GUIDANCE REGARDING THE USE OF
CHOICEPOINT FOR FOREIGN INTELLIGENCE COLLECTION OR FOREIGN COUNTERTERRORISM
INVESTIGATIONS 12–13 (2001), available at http://epic.org/privacy/choicepoint/cpfbia.pdf.
58. Id. at 13 (footnote omitted). A strong argument can be made that these interpretations are
flawed. The provisions of the FCRA governing law enforcement access make it clear that Congress
intended procedural safeguards against disclosure of credit information, regardless of its intended use.
As noted in the review of federal privacy law and access to commercial information done by the Cen-
ter for Democracy & Technology, some courts have ruled that when information is collected for con-
sumer reporting purposes, it remains a consumer report, despite the fact that it may be employed for
non-FCRA purposes. James X. Dempsey & Lara M. Flint, Commercial Data and National Security, 72
GEO. WASH. L. REV. 1459, 1477 (2004).
59. See FEDERAL TRADE COMMISSION, INDIVIDUAL REFERENCE SERVICES: A REPORT TO
CONGRESS (1997), http://www.ftc.gov/bcp/privacy/wkshp97/irsdoc1.htm.
60. INDIVIDUAL REFERENCE SERVICES GROUP, IRSG PRINCIPLES, http://www.ftc.gov/bcp/
privacy/wkshp97/irsdoc2.htm#A%20The%20IRSG%20Principles (last visited Sept. 13, 2005).
61. Id.
62. Letter from Gina Moore, ChoicePoint, Inc. to Chris Hoofnagle, Director, Electronic Privacy
Information Center 1 (Feb. 21, 2003), available at http://epic.org/privacy/choicepoint/cp_nooptout.pdf.
63. FEDERAL TRADE COMMISSION, IDENTITY THEFT SURVEY REPORT 4 (Sept. 2003), available
at www.ftc.gov/os/2003/09/synovatereport.pdf. For an excellent account of the rise of identity theft,
see BOB SULLIVAN, YOUR EVIL TWIN: BEHIND THE IDENTITY THEFT EPIDEMIC (2004).
64. 15 U.S.C. § 1681j (2000).
65. SULLIVAN, supra note 63, at 85.
66. IDENTITY THEFT RESOURCE CENTER, IDENTITY THEFT: THE AFTERMATH 2004 16 (Sept.
2004), available at http://www.idtheftcenter.org/aftermath2004.pdf.
67. 5 U.S.C. § 552a(m) (2000).
68. Id.; see, e.g., Modification M001, http://epic.org/privacy/choicepoint/cpusms7.30.02d.pdf.
propose in the pages that follow is designed to address the gaps and limi-
tations in existing law.
1. Universal Notice
a. Problem
b. Legislative Mandate
c. Specific Solution
about children under 13); Video Privacy Protection Act of 1988, 18 U.S.C. §§ 2710–2711 (video re-
cords).
76. 149 N.H. 148, 152–53 (2003).
a. Problem
b. Legislative Mandate
c. Specific Solution
a. Problem
b. Legislative Mandate
c. Specific Solution
78. Id.
a. Problem
b. Legislative Mandate
c. Specific Solution
a. Problem
maintained about them, why it is being kept, how long it is being main-
tained, to whom it is being disseminated, and how it is being used. The
records maintained by these companies can have inaccuracies. This
would not matter much if the information were never used for anything
important. However, the data is being used in ways that directly affect
individuals—by the government for law enforcement purposes and by
private investigators for investigation.
b. Legislative Mandate
There must be a way for individuals to ensure that the personal infor-
mation that data brokers maintain about them is accurate and is not kept
for an unreasonable amount of time.
c. Specific Solution
6. Secure Identification
a. Problem
b. Legislative Mandate
c. Specific Solution
a. Problem
b. Legislative Mandate
c. Specific Solution
a. Problem
b. Legislative Mandate
c. Specific Solution
a. Problem
Public records were once scattered about the country, and finding
out information on individuals involved trekking to or calling a series of
b. Legislative Mandate
There must be a way to regulate access and use of public records
that maximizes exposure of government activities and minimizes the dis-
closure of personal information about individuals.
c. Specific Solution
Background checks are cheaper now than ever before, and as a result
individuals are being screened for even menial jobs. We risk altering our
society into one in which the individual can never escape
a youthful indiscretion or a years-old arrest, even for a minor infraction.
Pre-employment screens are frequently being used by employers even
for jobs that do not involve participating in security-related functions,
handling large sums of money, or supervising children or the elderly.
b. Legislative Mandate
c. Specific Solution
a. Problem
b. Legislative Mandate
83. 29 U.S.C. § 2006 (2000) (listing jobs involving national defense and security).
84. See Daniel J. Solove, Access and Aggregation: Public Records, Privacy and the Constitution,
86 MINN. L. REV. 1137, 1191 (2002).
85. Id.
86. See Remsburg v. Docusearch, 149 N.H. 148, 152–53 (2003).
c. Specific Solution
a. Problem
b. Legislative Mandate
There must be a way to engage in electronic commerce and routine
transactions without losing one’s expectation of privacy in personal data.
c. Specific Solution
tions should exist for reasonable law enforcement needs, including emer-
gency circumstances.
a. Problem
b. Legislative Mandate
There must be a way to ensure that government data mining does
not permit law enforcement to engage in dragnet searches for prospec-
tive crimes. Where data mining is employed, it should occur in as open a
manner as possible and have adequate judicial oversight and public ac-
countability.
c. Specific Solution
89. U.S. GEN. ACCOUNTING OFFICE, DATA MINING: FEDERAL EFFORTS COVER A WIDE RANGE
OF USES 4 (2004).
90. Id. at 2–3; see TECH. AND PRIVACY ADVISORY COMM., SAFEGUARDING PRIVACY IN THE
FIGHT AGAINST TERRORISM ix (2004), available at www.cdt.org/security/usapatriot/20040300tapac.
pdf.
91. Id. at 22.
a. Problem
92. See MARY DEROSA, DATA MINING AND DATA ANALYSIS FOR COUNTERTERRORISM 5
(2004), available at www.cdt.org/security/usapatriot/20040300csis.pdf.
93. See id. at 4–5.
94. Exemption of Federal Bureau of Investigation Systems—Limited Access, 28 C.F.R. § 16.96
(2004).
95. Privacy Act of 1974, 5 U.S.C. § 552a(g)(4) (2000).
96. Doe v. Chao, 540 U.S. 614, 627 (2004).
for violations of the Privacy Act must prove actual loss in order to obtain
minimum damages of $1,000 under the Privacy Act. Although many
plaintiffs whose personal information is leaked by an agency suffer emo-
tional distress, such emotional distress is not sufficient to constitute an
actual loss for many courts. Accordingly, such plaintiffs are left without
a remedy.
b. Legislative Mandate
c. Specific Solution
The Privacy Act must be updated. Over thirty years have gone by
without a major reexamination of the Privacy Act, and one is sorely
needed. Congress should empanel a new Privacy Protection Study
Commission to examine government use of personal information compre-
hensively and make recommendations for legislation to update the Pri-
vacy Act. Specific changes shall include, but shall not be limited to: (1)
limiting the routine use exception; (2) addressing the outsourcing of per-
sonal information processing to private-sector businesses; (3) strengthen-
ing the enforcement provisions of the Act; and (4) overturning Doe v.
Chao97 so that violations of the Act are remedied by minimum-damages
provisions.
b. Legislative Mandate
c. Specific Solution
a. Problem
b. Legislative Mandate
98. New State Ice Co. v. Liebmann, 285 U.S. 262, 311 (1932) (Brandeis, J., dissenting).
c. Specific Solution
IV. COMMENTARY
An earlier iteration of the Model Regime, which was released on
March 10, 2005, received considerable attention. It was discussed in tes-
timony before federal and state legislatures. We received a num-
ber of very thoughtful comments and read many insightful discussions
across the blogosphere.99 The comments we received range from being
very supportive of the Model Regime to being very critical. In this sec-
tion, we respond to some of the comments on and criticisms of the Model
Regime.
A. General Comments
99. Jim Horning, Chief Scientist at McAfee Research (title listed for identification purposes
only), and Rich Kulawiec helped us fix grammatical errors, and we are indebted to them for their kind
efforts.
100. E-mail from Eric Goldman, Assistant Professor, Marquette University School of Law, to
Daniel Solove, Associate Professor, The George Washington University Law School (Mar. 15, 2005,
06:51 PM) (on file with authors).
101. The FTC estimates that identity theft costs businesses about $33 billion each year. FEDERAL
TRADE COMMISSION, IDENTITY THEFT SURVEY REPORT 6 (2003).
102. E-mail from Anonymous, to Daniel Solove, Associate Professor, The George Washington
University Law School (Apr. 4, 2005, 04:41 PM) (on file with authors).
103. Janine Benner et al., Nowhere To Turn: Victims Speak Out on Identity Theft: A
CALPIRG/Privacy Rights Clearinghouse Report (2000), http://privacyrights.org/ar/idtheft2000.htm.
104. Posting of Roy Owens to Schneier on Security: ChoicePoint Says “Please Regulate Me,”
http://www.schneier.com/blog/archives/2005/03/choicepoint_say.html (Mar. 9, 2005, 04:06 PM). Bruce
Schneier is the founder and CTO of Counterpane Internet Security, Inc. and author of numerous
books on data security. See BRUCE SCHNEIER, BEYOND FEAR: THINKING SENSIBLY ABOUT SECURITY
IN AN UNCERTAIN WORLD (2003); BRUCE SCHNEIER, SECRETS AND LIES: DIGITAL SECURITY IN A
NETWORKED WORLD (2000).
105. 15 U.S.C. § 1681h(e).
106. Id. § 1681e(b).
107. E-mail from Eric Grimm, Attorney, Calligaro & Meyering, P.C., to Daniel Solove, Associate
Professor, The George Washington University Law School, and Chris Hoofnagle, Director, Electronic
Privacy Information Center, West Coast Office (Mar. 20, 2005, 04:48 PM) (on file with authors).
108. See Matt Miller, Draft of a Model Privacy Regime (Part One) (Mar. 14, 2005), http://
privacyspot.com/?q=node/view/593; Matt Miller, Draft of a Model Privacy Regime (Part Two) (Mar.
14, 2005), http://privacyspot.com/?q=node/view/595.
109. Matt Miller, Draft of a Model Privacy Regime (Part Two), supra note 108.
110. Title listed for identification purposes only.
111. E-mail from Jim Horning, Chief Scientist, McAfee Research, to Daniel Solove, Associate
Professor, the George Washington University Law School, and Chris Hoofnagle, Director, Electronic
Privacy Information Center (Mar. 15, 2005, 04:43 PM) (on file with authors).
112. 47 U.S.C. § 551(e) (2000).
113. 18 U.S.C. § 2710(e) (2000).
114. 15 U.S.C. § 1681c (2000).
115. Dennis Bailey, The Whole Kit and Caboodle—Solove and Hoofnagle Go for Regime Change,
Mar. 21, 2005, http://www.opensocietyparadox.com/mt/archives/000517.html; see also DENNIS BAILEY,
THE OPEN SOCIETY PARADOX 157–58 (2004).
opment, and many regulations benefit the economy and support innova-
tion where there have been market failures.
Moreover, the free flow of information and maximization of eco-
nomic development are not the only normative ends of our society. In-
formation regulation can serve to promote fairness or prevent a panoply
of types of discrimination. For example, since the 1970s, it has been ille-
gal under federal anti-discrimination laws to exclusively rely upon an ar-
rest record to make hiring decisions, as minorities are more heavily tar-
geted by law enforcement:
Blacks and Hispanics are convicted in numbers which are dispro-
portionate to Whites and . . . barring people from employment
based on their conviction records will therefore disproportionately
exclude those groups. Due to this adverse impact, an employer may
not base an employment decision on the conviction record of an
applicant or an employee absent business necessity.116
Dennis Bailey suggests in his blog that personal information should
generally be public, and that regulation should only focus on the harmful
uses of personal information:
If information is being used to deprive someone of their freedoms,
such as the right to vote or the ability to get a job; or used to de-
fraud someone through identity theft then the full weight of the law
should be applied. But to regulate data simply on the basis of pri-
vacy is not something I support.117
It is not practical to take such an approach. Limiting the public disclo-
sure of certain data as a precaution is a first line of defense for individu-
als such as judges, workers at medical clinics performing abortions, and
domestic violence victims. The use of criminal law alone to address iden-
tity theft has been a failure. Gartner, Inc., a research firm, estimates that
far less than one percent of identity thefts result in a conviction.118 A
United States GAO Report describes in compelling detail the difficulties
with criminal investigation and prosecution of identity theft cases.119
Jim Harper, Director of Information Policy Studies at the Cato In-
stitute, comments that regulation “at its best proscribes a set of actions in
order to prevent harm,” and criticizes regulations that seek to prevent
behavior not tied to “monetary loss, property loss, or mental distress that
causes physical symptoms, loss of work, or destruction of family and pro-
116. Gregory v. Litton Systems, Inc., 472 F.2d 631, 632 (9th Cir. 1972); EEOC, Policy Guidance
No. N-915-061, POLICY GUIDANCE ON THE CONSIDERATION OF ARREST RECORDS IN EMPLOYMENT
DECISIONS UNDER TITLE VII OF THE CIVIL RIGHTS ACT OF 1964, at 4 (1990).
117. Bailey, supra note 115.
118. Stephen Mihm, Dumpster-Diving for Your Identity, N.Y. TIMES MAG., Dec. 21, 2003, at 42.
The figure is less than one conviction for every seven hundred identity thefts.
119. U.S. GEN. ACCOUNTING OFFICE, REPORT TO THE HONORABLE SAM JOHNSON, HOUSE OF
REPRESENTATIVES, IDENTITY THEFT: GREATER AWARENESS AND USE OF EXISTING DATA ARE
NEEDED 17–18 (June 2002), available at http://www.gao.gov/new.items/d02766.pdf.
120. E-mail from Jim Harper, Director of Information Policy Studies, The Cato Institute, to
Daniel Solove, Associate Professor, The George Washington University Law School, and Chris Hoof-
nagle, Director, Electronic Privacy Information Center (Mar. 21, 2005, 3:59 PM) (on file with authors).
121. Samuel D. Warren & Louis D. Brandeis, The Right to Privacy, 4 HARV. L. REV. 193, 193–95
(1890).
122. William L. Prosser, Privacy, 48 CAL. L. REV. 383, 392 (1960).
123. E-mail from Michael Sankey, Chief Executive Officer, BRB Publications, Inc., to Daniel
Solove, Associate Professor, The George Washington University Law School, and Chris Hoofnagle,
Director, Electronic Privacy Information Center (Mar. 3, 2005) (on file with authors).
124. Id.
125. Id.
126. Id.
127. Id.
128. Id.
129. Id.
130. News Release, ChoicePoint, ChoicePoint to Exit Non-FCRA, Consumer-Sensitive Data
Markets; Shift Business Focus to Areas Directly Benefiting Society and Consumers (Mar. 4, 2005),
available at http://www.choicepoint.com/choicepoint/news.nsf/IDNumber/TXK2005-5381565?Open
Document.
131. Id.
132. According to journalist Jonathan Krim: “So far, neither those moves nor revelations of a
series of breaches at major banks and universities has curbed a multi-tiered and sometimes shadowy
marketplace of selling and re-selling personal data that is vulnerable to similar fraud.” Jonathan Krim,
Net Aids Access to Sensitive ID Data, Social Security Numbers Are Widely Available, WASH. POST,
Apr. 4, 2005, at A01.
133. Brian Bergstein, ChoicePoint Tries to Find Its Footing in Anti-Fraud Effort, ASSOC. PRESS,
Sept. 30, 2005.
134. Ann Carrns & Valerie Bauerlein, ChoicePoint Curtails Business, Changes Method to Protect
Data, WALL ST. J., June 24, 2005, at A10.
135. “ChoicePoint will continue to serve most of its core markets and customers, but these actions
will have an impact on the scope of products offered to some customers and the availability of infor-
mation products in certain market segments, particularly small businesses. The transition will begin
immediately and is expected to be substantially completed within 90 days.” News Release, Choice-
Point, supra note 130.
136. See Martin H. Bosworth, USA PATRIOT Act Rewards ChoicePoint: “Identity Verification”
Exposes Consumers to Risks, May 13, 2005, http://www.consumeraffairs.com/news04/2005/patriot01.
html.
137. Id.
138. Bob Sullivan, ChoicePoint Files Found Riddled With Errors: Data Broker Offers No Easy
Way to Fix Mistakes, Either, Mar. 8, 2005, http://msnbc.msn.com/id/7118767.
139. Id.
140. Posting of Richard Smith to Free Republic, http://www.freerepublic.com/forum/a3b1251464594.
htm (May 28, 2001, 06:23 PDT).
141. Sullivan, supra note 138.
Fourth, individuals will have access, but not correction rights, with
respect to ChoicePoint’s unregulated public information reports. Choice-
Point claims that it cannot correct these reports because they are gener-
ated from public records. However, this claim is deceptive—the problem
is that ChoicePoint is mixing up public-record information between indi-
viduals. The public records are correct, but they are attached to people
to whom they do not pertain.
Fifth, nothing binds ChoicePoint to its promise to maintain its re-
formed policies. In recent years, many large companies, including
eBay.com, Amazon.com, Drkoop.com, and Yahoo.com, have changed users’
privacy settings or altered privacy policies to the detriment of users.142
ChoicePoint is legally in a better position to renege on its promises, as it
does not acknowledge a direct relationship with consumers that could be
the basis of a legal action. ChoicePoint’s “consumers” are the businesses
that buy data from the company.
Sixth, ChoicePoint is still reserving the right to sell “sensitive” per-
sonal information to businesses in a large number of contexts. Choice-
Point’s release states that sensitive information will be sold to “[s]upport
consumer-driven transactions where the data is needed to complete or
maintain relationships . . . [p]rovide authentication or fraud prevention
tools to large, accredited corporate customers where consumers have ex-
isting relationships . . . [a]ssist federal, state and local government and
criminal justice agencies in their important missions.”143 These categories
articulated by ChoicePoint are broad and ill-defined. What specifically
falls under “consumer-driven transactions”? When is data “needed to
complete or maintain relationships?” Under this standard, ChoicePoint
can decide what constitutes a consumer benefit. In the past, ChoicePoint
has declared that selling personal information benefits consumers in the
aggregate, and thus individuals should have no right to opt out of Choice-
Point’s databases.144 Simply put, ChoicePoint’s idea of what benefits
consumers differs from what consumers and consumer advocates think
benefits them.
Seventh, the ChoicePoint policy allows the company to sell full re-
ports for anti-fraud purposes. While in theory this exception seems ap-
142. Chris Jay Hoofnagle, Consumer Privacy In the E-Commerce Marketplace, in THIRD ANNUAL
INSTITUTE ON PRIVACY LAW 1339, 1360 (2002), available at http://papers.ssrn.com/sol3/papers.cfm?
abstract_id=494883#PaperDownload.
143. News Release, ChoicePoint, supra note 130.
144. The privacy statement mailed to individuals who request their unregulated AutoTrackXP
report reads in part:
We feel that removing information from these products would render them less useful for impor-
tant business purposes, many of which ultimately benefit consumers. ChoicePoint DOES NOT
DISTRIBUTE NON-PUBLIC INFORMATION (as defined in the Principles) TO THE
GENERAL PUBLIC PURSUANT TO SECTION V(C) OF THE PRINCIPLES. The general
public therefore has NO direct access to or use of NON-PUBLIC INFORMATION (as defined
in the Principles) from ChoicePoint whatsoever.
Letter from Gina Moore to Chris Hoofnagle, supra note 62, at 1.
propriate, almost any transaction can have some fraud risk. If a broad
fraud exemption is maintained, it will allow the sale of reports even when
the fraud risk is minimal or merely a pretext for collecting information for
some other purpose.
Finally, ChoicePoint’s proposal does not at all limit sale of personal
information to law enforcement. The company continues to sell personal
information to 7,000 federal, state, and local law enforcement agencies.145
1. Universal Notice
Rich Kulawiec suggests that the exemption from consent for fraud
investigations is too broad; individuals should receive notice when an in-
vestigation does not result in a finding of fraud.149 Kulawiec argues that
this would prevent baseless uses of data for anti-fraud purposes and al-
low people to correct data that led the business to suspect fraud. We ad-
justed this principle to allow use without consent for “reasonable” fraud
investigations. Providing notice each time a company used data to per-
form some anti-fraud function would be burdensome. We think requir-
ing a condition of reasonableness for investigations will prevent arbitrary
uses of personal information under the exemption.
150. Posting of Curt Sampson to Schneier on Security: Ideas for Privacy Reform, http://www.
schneier.com/blog/archives/2005/03/ideas_for_priva.html (Mar 14, 2005, 10:56 PM).
151. E-mail from Jim Horning, Chief Scientist, McAfee Research, to Daniel Solove, Associate
Professor, The George Washington University Law School, and Chris Hoofnagle, Director, Electronic
Privacy Information Center (Mar. 15, 2005, 07:44 PM) (on file with authors).
152. Posting of Curt Sampson to Schneier on Security: Ideas for Privacy Reform, supra note 150.
153. Id.
154. Privacy Act Notice, 68 Fed. Reg. 37494, 37494 (June 24, 2003).
155. Posting of David Mohring to Schneier on Security: Ideas for Privacy Reform, http://www.
schneier.com/blog/archives/2005/03/ideas_for_priva.html (Mar. 12, 2005, 06:56 AM).
156. Privacy Act Notice, 68 Fed. Reg. at 37496.
157. E-mail from Rich Kulawiec to Daniel Solove and Chris Hoofnagle, supra note 149.
158. 15 U.S.C. § 1681b(b)(2) (2000).
159. Posting of No id please to Schneier on Security: ChoicePoint Says “Please Regulate Me,”
http://www.schneier.com/blog/archives/2005/03/choicepoint_say.html (Mar. 9, 2005, 04:12 PM).
160. Bailey, supra note 115.
161. The Federal Trade Commission has labeled unfair a scheme where marketers sent unwanted
popup messages to users of Microsoft Windows computers and then offered these users software to
block the messages, thereby seeking to profit from the very harm the company caused. See FTC v. D
Squared Solutions, LLC, No. AMD 03-CV3108 (N.D. Md. Nov. 6, 2003).
162. Complaint at Exhibit B, In the Matter of Experian (before the Federal Trade Commission)
(2003), available at http://www.epic.org/privacy/Experian (follow “Exhibit B” hyperlink).
163. 15 U.S.C. § 1681e(b) (2000).
164. Letter from Chris Jay Hoofnagle, Director, Electronic Privacy Information Center, to the
Federal Trade Commission (Sept. 16, 2003), available at http://epic.org/privacy/experian (arguing that
a consumer reporting agency promoting its subscription credit monitoring service by capitalizing on its
own failure to adequately fulfill its duty to maintain maximum possible accuracy violated the Fair
Credit Reporting Act).
165. Posting of Matthew B. to Schneier on Security: Ideas for Privacy Reform, http://www.
schneier.com/blog/archives/2005/03/ideas_for-priva.html (Mar. 15, 2005, 08:30 AM).
166. Id.
167. E-mail from Rich Kulawiec to Daniel Solove and Chris Hoofnagle, supra note 149.
168. E-mail from Anonymous to Daniel Solove, supra note 102.
169. Id.
170. Id.
171. California, Louisiana, Texas, and Vermont have credit freeze laws. See CAL. CIV. CODE
§ 1785.11.2–.11.6 (West 2005); LA. REV. STAT. ANN. § 9:3571.1 (2005); TEX. BUS. & COM. CODE ANN.
§ 20.01 (Vernon 2005); VT. STAT. ANN. tit. 9, § 2480a (2004).
6. Secure Identification
172. E-mail from Michael Sankey to Daniel Solove and Chris Hoofnagle, supra note 123.
173. Id.
174. Posting of Anonymous to Schneier on Security: ChoicePoint Says “Please Regulate Me,”
http://www.schneier.com/blog/archives/2005/03/choicepoint_say.html (Mar. 9, 2005, 06:34 PM).
175. Posting of Gary to Schneier on Security: Ideas for Privacy Reform, http://www.schneier.
com/blog/archives/2005/03/ideas_for_priva.html (Mar. 15, 2005, 04:45 AM). This proposal resembles
in some respect Lynn LoPucki’s proposal to have a public system of identification. Lynn M. LoPucki,
Human Identification Theory and the Identity Theft Problem, 80 TEX. L. REV. 89, 120 (2001). For an
extensive critique of LoPucki’s system, see Daniel J. Solove, Identity Theft, Privacy, and the Architec-
However, the SSN is an individual’s “account number” with the Social Secu-
rity Administration, and making such a financial account number public
could create opportunities for fraud and abuse.
Jim Horning and several others are critical of the use of
passwords: (i) users choose bad passwords, (ii) they tend to use the same
password for many different purposes, and (iii) they forget good pass-
words.176 All of these problems are valid concerns, but the current sys-
tem is itself a password system, and one of the worst that could possibly be
devised. SSNs (i) are used as passwords, (ii) are far from secret, (iii) can
readily be found out with minimal effort, (iv) are very difficult to change,
and (v) are used on countless accounts and record systems. The Model
Regime’s password proposal eliminates several of these problems. First,
passwords will vary with different accounts, so a thief finding out a pass-
word will not unlock everything. Second, passwords can readily be
changed, so once an identity theft is detected, people can render
the stolen password useless. Third, it will be much more difficult for the
average identity thief to guess people’s passwords. Thieves can of course
do this, but it will make identity thefts more difficult. It is difficult to
completely eliminate identity theft, but the Model Regime makes it much
harder to engage in and makes it easier for victims to halt it once it hap-
pens. The problem of forgetting passwords can be addressed by having
people supply answers to questions such as their favorite pet’s name or
favorite color.
We have received some interesting technological solutions for iden-
tification and authentication, but we were reluctant to adopt any without
a more thorough understanding of their implications, feasibility, usabil-
ity, and potential problems.177 We are open to other approaches that
provide more flexibility and security than passwords. For now, we be-
lieve that passwords are an easy measure that will have a significant im-
pact on reducing the incidence and severity of identity theft. While such
a solution is not perfect, its great virtue is that it supplies a substantial
advance in effectiveness with relative simplicity.
ture of Vulnerability, 54 HASTINGS L.J. 1227, 1262–66 (2003). For LoPucki’s response, see Lynn
LoPucki, Did Privacy Cause Identity Theft?, 54 HASTINGS L.J. 1277 (2003).
176. E-mail from Jim Horning to Daniel Solove and Chris Hoofnagle, supra note 110; E-mail
from Scott Minneman, Student, The George Washington University Law School, to Daniel Solove,
Associate Professor, The George Washington University Law School (Mar. 11, 2005, 08:32 PM).
177. For example, Jim Horning suggested “PwdHash” as one possible approach to addressing the
problem of using the same password for different services. Web Password Hashing, http://crypto.
stanford.edu/PwdHash/ (last visited Apr. 3, 2005). PwdHash is an Internet Explorer plug-in “that
transparently converts a user’s password into a domain-specific password.” Web Password Hashing,
supra. This would reduce the risk associated with individuals using the same password at different
Web sites.
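To make the idea concrete, what follows is a minimal sketch, in Python, of a domain-specific password scheme of the kind PwdHash implements. The function name and values below are our own hypothetical illustration, not PwdHash's actual algorithm: one master password is combined with a Web site's domain so that each site receives a different derived password, a credential stolen from one site cannot be replayed at another, and changing the master password changes every derived password at once.

import base64
import hashlib
import hmac

def domain_password(master_password: str, domain: str) -> str:
    # Mix the site's domain into the derivation so that every site
    # receives a distinct, reproducible password derived from one secret.
    digest = hmac.new(master_password.encode("utf-8"),
                      domain.encode("utf-8"),
                      hashlib.sha256).digest()
    # Truncate and encode the digest so it can be typed as a password.
    return base64.urlsafe_b64encode(digest)[:16].decode("ascii")

# The same master password yields unrelated site passwords, so a breach
# at one site does not expose a credential usable elsewhere.
print(domain_password("master-secret-example", "bank.example.com"))
print(domain_password("master-secret-example", "shop.example.net"))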
178. E-mail from Eric Goldman to Daniel Solove, supra note 100.
Jim Horning argues that the background check section is not rele-
vant to the problems we aimed to address in the Model Regime.179 How-
ever, the growth of the commercial data broker industry has been in
large part due to employers performing background checks, even in cases
where the job has no security function. Outside the pre-employment
context, shoddy, electronic-only background checks have become in-
creasingly inexpensive. There has to be some way to put balance back
into this situation and limit the contexts in which background checks are
performed.
Dennis Bailey contends that many nonsecurity jobs could have se-
curity implications; he gives the example of felons working for service
companies who might assault people in their homes.180 We think that
some line must be drawn that establishes categories of jobs that are
available without a background check. Almost any job imaginable has
some security implication, but the connection is often attenuated. We
think the line that we have drawn, one that allows background checks for
caretakers, handlers of large sums of money, and for functions articu-
lated by Congress in the Polygraph Act, properly balances employer and
employee interests.
Michael Sankey of BRB Publications, Inc. criticizes the background
check section, and notes that “the employer can be sued for negligent
hiring if [pre-employment screens] are not done.”181 We think this is pre-
cisely the reason why a line must be drawn. As pre-employment screen-
ing becomes cheaper, it becomes difficult for employers to refrain from
engaging in prescreening. One could foresee the day when even the
most menial job will require a clean record. Obviously, if the Model Re-
gime restricts an employer from conducting a background check for a
particular position, the employer shall not be deemed negligent for not
conducting one.
179. E-mail from Jim Horning to Daniel Solove and Chris Hoofnagle, supra note 110.
180. Bailey, supra note 115.
181. E-mail from Michael Sankey to Daniel Solove and Chris Hoofnagle, supra note 123.
182. E-mail from Jim Horning to Daniel Solove and Chris Hoofnagle, supra note 110.
are frequent users of the information provided by data brokers. Too lit-
tle is known about this industry. Certainly, there are beneficial examples
of private investigators using personal information (e.g., to locate lost
children). But private investigators engage in other practices largely un-
known to the public. Moreover, there are many instances of private in-
vestigators assisting unscrupulous individuals, stalkers, and others bent
on violence. For example, the stalker who murdered Rebecca Schaeffer
obtained her address from a private investigator.183 We believe that be-
cause private investigators engage in significant use of personal informa-
tion, they should be subject to the Model Regime just like other principal
users of such data. Failure to address private investigators would leave a
significant gap in protection.
183. 139 CONG. REC. S15762 (daily ed. Nov. 16, 1993) (statement of Sen. Boxer).
184. E-mail from Michael Sankey to Daniel Solove and Chris Hoofnagle, supra note 123.
185. E-mail from Jim Harper, Director of Information Policy Studies, The Cato Institute, to
Declan McCullagh, Chris Hoofnagle, Director, Electronic Privacy Information Center, and Daniel
Solove, Associate Professor, The George Washington University Law School (Mar. 14, 2005, 01:00
PM) (on file with authors).
186. Bailey, supra note 115.
Data mining regularly occurs, and with good reason, to address crimes
that have already occurred (a form of data matching was used to help
identify the Washington D.C. area sniper, for instance). Data mining
prospectively to interdict future crimes raises profound due process ques-
tions, and it is that practice that we have sought to address in this princi-
ple.
In his book, Dennis Bailey elaborates on his support for govern-
ment data mining and contends that the problems created by data mining
can be minimized by avoiding a “centralized data warehouse.”187 Bailey
observes that the government could search multiple databases with a
subpoena or court order and “[o]nly when a suspicious pattern turned up
would an individual be identified, most likely after court approval was
obtained.”188 This suggestion resembles one made in the Markle Report,
which recommends against centralization such as in the Total Informa-
tion Awareness program.189 According to the Markle Report,
“[a]ttempting to centralize this information is not the answer because it
does not link the information to the dispersed analytical capabilities of
the network.”190 In other words, the Markle Report suggests that the
government enlist the assistance of various companies and other entities
to conduct the data mining. Moreover, the Markle Report recommends
that “personally identifiable data can be anonymized so that personal
data is not seen unless and until the requisite showing . . . is made.”191
The problem with this suggestion is that merely decentralizing the data-
bases does not provide adequate protection when such information can
readily be combined at the push of a button. Such outsourcing of gov-
ernment intelligence functions presents other problems as well, since pri-
vate sector entities lack the openness and accountability of government
as well as the legal limitations on the collection and use of personal data.
Anonymizing the identities of data subjects and searching for patterns
only to identify those suspicious people still involves a dragnet search.
Concealing the names at the stage of the initial pattern analysis will pro-
vide little meaningful protection because it does not change the dragnet
nature of the search and because the search for patterns is conducted by
computers for which the names of the individuals will not be relevant
anyway.
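A deliberately simplified sketch, in Python, illustrates the point; the records, field names, and hashing scheme below are hypothetical and invented for illustration, not drawn from any actual program. If each decentralized database replaces the data subject's identifier with the same deterministic pseudonym, the records can still be joined into a single dossier at the push of a button, and the pattern analysis proceeds exactly as it would over a centralized warehouse:

import hashlib

def pseudonym(ssn: str) -> str:
    # Deterministic hashing hides the raw identifier from human analysts
    # but preserves linkability: the same person receives the same token
    # in every database, so joining the records remains trivial.
    return hashlib.sha256(ssn.encode("utf-8")).hexdigest()[:12]

# Hypothetical records held by three separate, "decentralized" sources.
travel_db = [{"id": pseudonym("123-45-6789"), "flight": "IAD-LHR"}]
finance_db = [{"id": pseudonym("123-45-6789"), "wire_transfer": 9500}]
retail_db = [{"id": pseudonym("123-45-6789"), "purchase": "one-way ticket"}]

# A "dragnet" pattern search simply merges everything sharing a pseudonym.
profiles = {}
for record in travel_db + finance_db + retail_db:
    entry = profiles.setdefault(record["id"], {})
    entry.update({k: v for k, v in record.items() if k != "id"})

# Each merged profile is as revealing as a centralized dossier, even though
# no single database stored a name and no central warehouse exists.
print(profiles)

Only the final step of re-identifying the individuals behind flagged profiles awaits the "requisite showing"; the dragnet itself has already occurred.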
192. E-mail from Jim Harper to Declan McCullagh, Chris Hoofnagle, and Daniel Solove, supra
note 185.
193. Id.
194. E-mail from Eric Goldman to Daniel Solove, supra note 100.
195. E-mail from Edmund Mierzwinski, Consumer Program Director, U.S. PIRG, to Declan
McCullagh, Chris Hoofnagle, Director, Electronic Privacy Information Center, and Daniel Solove,
Associate Professor, The George Washington University Law School (Mar. 20, 2005, 06:05 PM) (on
file with authors).
196. 18 U.S.C. § 2510 (2000).
197. 12 U.S.C. § 3401 (2000).
198. 47 U.S.C. § 551(g) (2000).
199. 18 U.S.C. § 2710(f) (2000).
200. 29 U.S.C. § 2009 (2000).
201. 47 U.S.C. § 227(e) (2000).
202. 18 U.S.C. § 2721(e) (2000).
203. 15 U.S.C. §§ 6807, 6824 (2000).
204. Id. § 1681t.
205. SMITH, supra note 147.
206. E-mail from Edmund Mierzwinski to Declan McCullagh, Chris Hoofnagle, and Daniel Solove, supra note
195.
207. E-mail from Jim Harper to Declan McCullagh, Chris Hoofnagle, and Daniel Solove, supra
note 185.
208. See Posting of Bruce Schneier to Schneier on Security: U.S. Medical Privacy Law Gutted,
http://www.schneier.com/blog/archives/2005/06/us_medical_priv.html (June 7, 2005, 12:15 PM).
209. Posting of Rodolphe Ortalo to Schneier on Security: Ideas for Privacy Reform, http://
www.schneier.com/blog/archives/2005/03/ideas_for_priv.html (Mar. 17, 2005, 04:18 AM).