Privacy Data Protection
Introduction 44
1 The History of a Triumph 44
1.1 The Hesse Act - Setting the Stage 44
1.2 National Approaches 48
1.2.1 Northern lights 48
1.2.2 German perfection 48
1.2.3 French and other Latin approaches 49
1.2.4 The pragmatists - the UK and the Netherlands 50
1.2.5 Wait and See: Italy and Greece 51
1.3 The Race for International Solutions: the Council of Europe and
the OECD 51
1.4 The International Latecomer: The European Community/Union 52
2 Privacy: Values - Approaches in European Legislation 53
2.1 Germany: from the protected sphere to informational self-determination 53
2.2 Other value approaches 55
3 An Attempt at a Functional Perspective 57
3.1 Function and performance 57
3.2 What makes Us different from Them? 60
4 The Crisis of Data Protection 60
4.1 Internal differentiation of data protection 61
4.2 The bureaucratization of data protection 62
4.3 The challenge of the Internet 63
4.4 The need for complementation 64
5 From Here to Where? 65
5.1 Discussing Remedies 65
5.1.1 Technology 65
5.1.2 Trying out new approaches - learning from the US? 65
5.1.3 A new system of information regulation? 66
5.2 A lingering doubt 67
Introduction
Almost thirty years ago, on September 30, 1970, the very first data protection law, the
Hesse Data Protection Act, was passed2.
This act was about differences between local communities and the state administration,
about who should be allowed to buy these new machines, the large computing systems,
1 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the
protection of individuals with regard to the processing of personal data and on the free movement
of such data.
2 Hessisches Datenschutzgesetz (The Hesse Data Protection Act), Gesetz- und Verordnungsblatt I (1970), 625.
and who should decide which programs to run on them; or rather, the act was a follow-up to the law that tried to settle these arguments3. The former law addressed the fear of
local communities of what they saw as the inherent centralizing power of the machine
that would shift their traditional power and influence to the Land [state]4. That law
needed a follow-up because, in that discussion, it had been discovered that the "local community – state administration" conflict was not the only one that needed to be addressed: There was also a conflict between the legislative and the executive body because of the different use each could make of power-enhancing data processing. The legislature was afraid of being cut off from information once it disappeared into the machines of the executive. There was also the conflict between citizens and the power of the state. Citizens had started to worry about these machines and about what would happen to their data inside them; these worries, of course, added to the more general concerns they already had about what would happen to their jobs.
Gradually, the concern for confidentiality became an important argument as well5.
Confidentiality clauses had already been added, in the form of administrative regulations, to one of the other organizational laws of the Laender seeking to implement data processing in public administration, the 1968 law of Schleswig-Holstein. The Hesse
Law of 1970 was different because for the first time these confidentiality clauses were
lifted to the level of a formal law, where they stood side by side with those clauses that
addressed the other power conflicts mentioned above.
While the power conflict part of the Hesse law only resurfaced occasionally in the
legislation of the other states, the confidentiality part of the law set a legislative program
into motion.
The substance of the confidentiality clauses contained in the Hesse Act, however, had
been somewhat thin to start with: There was a default confidentiality rule. Exceptions
from this rule were possible where other regulations permitted distribution, where there
was consent and – what seems to be a very sweeping exception to us today – if
processing was necessary. The data subject had an access right, a correction right, the
possibility of obtaining an injunction and remedies in case of unlawful processing – and
those rights were indeed a breakthrough. Accepted in individual cases under specific
circumstances by the administrative courts, these rights had now been put into the clear
language of a law.
3 Gesetz über die Errichtung der hessischen Zentrale für Datenverarbeitung und kommunale
Gebietsrechenzentren (1969) [Law on establishing the data processing center of the State of Hesse
and of data processing centers of local communities]. See also Hondius 1975, 35.
4 In the constitutional system of Germany after the war a new emphasis had been put on the
executive and rule-making power of the local communities which carried the main burden of
executing state and federal laws, as well as their local bye-laws. This role became expressly
recognized in the German Basic Law [constitution] in Art.28 II (see also footnote below).
5 This conflict awareness started on the states' level because according to the German constitutional
system administrative functions are primarily with the local communities which execute local bye-
laws, the law of the states and the federal law, the federal government having only a relatively
small administrative infrastructure of its own.
The Act was expressly trial-and-error legislation: Half of the 17-article law dealt with
establishing an independent agency or rather an ombudsman
("Datenschutzbeauftragter") to watch over the application of the law and to gather
experience. Citizens had a right to complain to the ombudsman, who would comment
on the complaint and pass it on to the appropriate authority, but who had no
decision-making power.
As limited as this first law was, it already set some of the basic elements for future
legislation:
First of all, the law influenced German and later European terminology: As
"Datenschutzgesetz" (literally translated as "data protection act"), it was a misnomer,
since it did not protect data but the rights of persons whose data was being handled. But
misnomers tend to have a high survival rate.
This legislation also set some basic themes for the forthcoming legislation in Europe:
- the negative default rule
The processing of personal data was seen as interference per se that needed
legitimization.
- the rights of the data subject
For the first time, data subjects had a right of access to information relating to
them without the need to show any reason as to why they wanted access.
- the omnibus approach
Although – due to reasons of legislative competence – the Hesse act could not
cover the private sector, it set out to regulate all of the state public sector (within
its competence).
- the establishing of a privacy protection institution
The Hesse Act expresses a regulatory philosophy which is very common in Germany
and perhaps also in some other European countries: If you establish regulation that
seeks to influence behavior, you cannot (exclusively) rely on litigation to establish that
behavior as a pattern. Litigation is associated with burdens, particularly when this
litigation is supposed to be directed against the state. One needs institutions, an
organizational back-up to take care of one’s interests, even if this body by way of its
own infrastructure is close to the infrastructure of the state6.
The Hesse Act implicitly answered a number of questions which became the
programmatic questions for subsequent legislation in other countries, and the different
answers to these questions would account for the variety of data protection models in
Europe and beyond:
6 The Hesse Act and many, but not all, subsequent laws in Germany tried to avoid this "state
association" by making the agency or commission directly responsible to Parliament.
• Should only electronic processing be addressed?
One leading international firm that preferred to be seen as a national firm wherever it operated regarded addressing only electronic processing as discrimination against computing. Rather than trying to stop data protection all over Europe, it argued for extending the rules to at least all automated processing, if not to manual processing in general7. The extent to which non-automated processing was to be included became one
of the recurrent themes in privacy protection legislation.
• How to keep data (information) processing in line?
The Hesse law still refrained from a registration approach (an approach that never
became very popular in Germany) which would have required the registration of all
processing of personal data (with more or less technical detail). Registration was partly
discussed as a means of gaining an overview of what was going on in data processing.
Partly it was seen as an opportunity to link authorization or licensing processes with
registration. The Hesse act refrained from any of these approaches because some
questioned whether registration would be feasible on a long-term basis. Also, since the Hesse Act only addressed the public sector, registration was not deemed necessary in a context that was already heavily organized and recorded.
• Why limit the approach to the public sector?
This was the most difficult question, and certainly in Germany it was the main reason
why federal legislation followed Laender legislation so late: If the private sector was to
be covered as well – due to the constitutional distribution of legislative competence –
this had to come with a federal law. Whether it would come with a federal law (again in
the German context) would depend on the outcome of the lobbying struggle
surrounding data protection. Behind this struggle was a legal doctrine that was also
occasionally used in the political struggle: the doctrine of horizontal effects. According
to this doctrine, a right that is primarily seen as a right against the state can also become
a right in the private sector if this right reflects an essential element of the social order
as such, which, in turn, is transmitted into civil law through the latter’s general value clauses. This constitutional law element was not to be
underestimated in the political battle because, in Germany, constitutional law is a
cherished tool to settle political disputes. At that stage it already became obvious that,
regardless of the shape of federal legislation, only a constitutional court decision would
end this dispute8.
7 Whether, initially, this extension had been used to show the limits of a regulation that tried to cover information processing is not quite clear. At least those arguing for effective protection were happy to find so important a partner in business circles.
8 A constitutional court decision - as we shall see - finally came and it did not explicitly address the
horizontal effects. But it became sufficiently clear between the lines that the court would apply the
doctrine, if necessary, since the basic right to which data protection was connected by the court
had already received horizontal recognition. Also by that time it would have become politically
impossible to withdraw the private sector from the legislation. What still lingers on as a political
problem today is the private sector’s occasional demands for a less stringent approach. The European Directive – as we shall
1.2 National Approaches
Germany was not the first country with a national data protection law. Although there
had been a government draft already in the early 1970s, the process had been halted
because of the question of whether and how to cover the private sector.
Instead, it was Sweden which passed the first national law in 1973. Sweden had
preceded its legislation with the usual careful analysis in committees. Sweden was also
a special case because, since the late forties, it had had a personal identification number system – the very symbol that would eventually trigger national initiatives at the federal level in Germany. In the sixties, however, it was felt that this identifier would now serve as an integrator for the still largely decentralized files9. Sweden was also the most computerized country in the world (on a per capita basis). Finally, data security played
an important role, since it was linked to national security concerns: Centralized registers
could easily fall into the hands of foreign powers and thus jeopardize Sweden's
neutrality.
This law also finally introduced a concept that would become one of the cornerstones of
early European data protection legislation: the central registration of personal data processing, combined with a licensing procedure, whose public register was open to the scrutiny of citizens and consumers and served as an enforcement tool for data protection agencies.
Sweden would eventually also become the first country to modify a data protection law
in 1979 and 1982, particularly by increasingly limiting registration and licensing
requirements.
Swedish legislation was soon followed by other Scandinavian countries: Denmark, with one law each for the public and private sectors in 1978, and, in the same year, Norway; the Danish laws went into force in 1979 and the Norwegian one in 1980. Finland was a
latecomer in 1988. All of these laws have since been revised – even before the revision
process initiated by the European Directive10.
show – effaces the difference between the public and the private sector in defining its application area. And this approach, too, is occasionally threatened by some commentators – what else – with a challenge before the European Court in Luxembourg.
9 Hondius 1975, 44ff.; Flaherty 1989.
10 For a recent comprehensive overview on the state of privacy legislation around the world: Privacy
Laws & Business no. 43 (1998), 2 ff.
Following the example of states like Hesse, the federal government in 1971 commissioned a research group at the University of Regensburg to produce an expert report on the structure of a possible federal law11.
This work and later expert reports – among them a book which allegedly "invented" the term "informational self-determination"12 – drew heavily on US writers, particularly on Westin13 and Miller14.
This preparatory work became the basis for a government draft which, as pointed out above, was stalled by heavy criticism from the private sector. At the
same time, however, the federal government continued to bring forward projects to
establish and link national data bases. The most important project was linking the states’
population register data banks in a coordinated15 effort, by introducing a general
personal identification number. While the administration was in the midst of
introducing the identifier, the bill that was supposed to provide the legal basis for its
introduction was almost immediately killed – unanimously – in the Legal Committee of the Federal Parliament in May 1976. The message was: no progress without a data protection law at the federal level.
And, indeed, again across party lines, political agreement was reached and in November
1976 the law, covering the public and the private sector, was passed16.
It took another six years and – as with so many political decisions by the legislature in
Germany – a Constitutional Court ruling in 198317 – before the law was on its way to
general acceptance. The substance of this decision will be examined more closely in
part 3 below.
This decision started a wave of new revisions to data protection laws, both at the federal
and state level. Now, in 1999, we await yet another wave of revisions due to the
European Directive.
The origins of data protection in France are – not surprisingly – quite similar. The
Minister of the Interior also tried to combine existing files under a general identifier and
named the project – as it turned out, a little unwisely – SAFARI (système automatisé
pour les fichiers administratifs et le répertoire des individus)18. Even more unwisely, the
administration tried to keep the project secret, until it was revealed in an article in Le
Monde in 1974 entitled: "SAFARI: Chasing French citizens." The writer of that article
later became the first president of the French data protection agency, the Commission
nationale de l'informatique et des libertés (CNIL). Again, past memories in Europe and
Germany came up: The numerical system to be used in the SAFARI project had already
been used by the French statistical office for its own purposes since 1970 and was, in turn, based on a system introduced into the French administration in 194119.
As if this was not enough: in 1973 the administration had started yet another data bank
project – more pleasantly named GAMIN (Gestion automatisée de médecine infantile) – intended to process personal information on children from the moment of birth, right through the schooling system, mainly in order to keep a check on these
children’s medical problems and in order to signal "problematic" children. The
symbolic context of such a data bank – again against the background of a not too distant
history – was even more dramatic.
Both projects helped to push forward the French data protection bill. In 1978, the
French law was published and was put into full operation in 1980. The law emphasized
the role of the CNIL, which secured a special advisory role in the planning of
administrative data systems. The law also relied on registration, and it covered the private sector.
The French example had an impact on other at least partly French-speaking and Latin
countries when writing their drafts: Luxembourg directly followed in 1979, Belgium
followed with a law only in 1992, Spain also only in 1992 and Portugal in 1991 (with
major amendments in 1994).
A cautious approach was taken by the Netherlands, which passed its law after a very long deliberation process in 1988; it went fully into force in 1990. The Dutch introduced "codes of conduct" into their regulatory regime, an approach often quoted – particularly in the current US/European debate – but rarely analyzed in depth as to its actual impact. The law provides for a closely-knit connection between self-regulation and (stronger) state supervision. Under the current changes, however, this model contains elements which might become more important in the future, particularly since the European Directive includes similar elements.
The UK was very cautious in joining the trend. Private members’ bills had been
introduced (e.g. the Data Surveillance Bill by Kenneth Baker, as early as 1969). A
parliamentary committee had been set up, the report of which did at least influence
discussions in other Commonwealth countries (Younger Report 1972).
The basic shift, however, came when, in 1974, the Swedish Data Inspection Board stopped the export of personal data to the UK because of the lack of a data protection law in that country20, and when the work in the Council of Europe in the late seventies and early eighties (see below) confirmed the tendency that interrupting the flow of personal data to countries without – at that time – equivalent protection would become legitimate.
The UK realized the economic implications and finally passed its data protection law in 1984; it became fully effective in 1987.
Italy and Greece were the wisest: Although they discussed drafts for a considerable time, they simply waited until the Directive had taken form and then shaped their national laws accordingly, thus avoiding the need to adapt to the Directive which the other countries now face.
1.3 The Race for International Solutions: the Council of Europe and the
OECD
In the meantime, the international implications had become more and more apparent,
not only with the practice of the Swedish Data Inspection Board. It was inherent to data
processing that it would become an international issue, because it would achieve its full
potential precisely by disregarding geographical limitations.
The Council of Europe, an organization to which more than 40 member states belong today, and which has the promotion of the European Convention on Human Rights (1950) as one of its main objectives, set up a special committee in 1971, subsequently passing two resolutions referring to privacy problems in 197321. It also started work on
a special convention: Particularly the second resolution, pertaining to protection of the
privacy of individuals vis-à-vis electronic data banks in the private sector, would have
lost all credibility had it not led to a broader international consensus22.
In the mid-seventies, the OECD, via its Data Bank Panel, had also started to scrutinize
more closely the privacy problems associated with data banks23. The OECD, by the
structure of its membership, then became the main forum for exchanges between North
America and Europe on privacy legislation, with Asian countries as patient observers24.
20 A detailed description of this case and other Swedish cases can be found in Ellger 1990, 287 ff.
21 Resolution 73(2) and 73(22).
22 Hondius 1975, 66ff.
23 See e.g.: OECD Informatics Studies 10: Policy issues in data protection and privacy. Proceedings
of the OECD seminar 24-26 June 1974. Paris 1976.
24 See, e.g., its report on the 1977 Vienna Conference on Transborder Data Flows and the Protection
of Privacy, published in 1979.
The OECD was able to beat the Council of Europe to the final instrument by four months: The OECD Guidelines on the Protection of Privacy and Transborder Flows of Personal Data were passed in September 1980; the text of the Council of Europe Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data was finalized in January 1981.
Not surprisingly, both texts showed strong similarities since, on the European side, the
OECD and Council of Europe delegations had been made up of more or less the same
persons. And privacy/data protection had by then already produced its core of
international experts who would demonstrate their influence in the drafting of national
laws as well as at the international level, a process which was of course assisted by the
similarity of the problems faced in industrial countries25.
Both texts covered the public and private sectors. In principle, both instruments
legitimized restrictions on transborder data flows on grounds of differences in the levels of privacy protection26. The Council of Europe instrument, however, had a stronger impact since, once ratified, it required member states to adapt their laws. But, of course, there was no requirement for all member states of the Council of Europe to ratify it27. The OECD instrument did include the US, but it was not binding. And the underlying expectation, certainly on the US side, was that if US companies eventually made some sort of declaration that they would adhere to the principles, no national US legislation covering (all of) the private sector would be necessary.
Due to internal differences, but also because of its own regulatory preferences, the (at that time) European Community, or at least its "engine", the European Commission, kept its distance.
On the occasion of the 1977 OECD Conference on Transborder Data Flows,
Christopher Layton, then a Director in the European Commission, declared that the
Community might encourage Member States to arrive at a better harmonization of their national legislation. The Community might recommend that they eventually join a Council of Europe instrument and, perhaps, if the evidence were sufficiently strong, enter a regulation exercise itself28.
The recommendation to join the Convention of the Council of Europe did come. But it
was not enough. Since 197529, the European Parliament had constantly admonished the
25 See for an assessment of this "convergence" process: Bennett 1988 and 1992.
26 No. 17 of the OECD Guidelines and Art. 12 of the Council of Europe Convention.
27 A Directive of the European Union (Community) is an even stronger instrument since it leaves Member States no choice but to change their national legislation (if necessary). This is why, already at that time, many argued for a Directive in addition to the Convention.
28 Layton 1979, 215f.
29 See Simitis et al., Kommentar zum Bundesdatenschutzgesetz, 4. Aufl., Baden-Baden 1992 ff., Rn. 153 ff. zu § 1.
Commission to commence preparation of a Directive. It was only when the "Swedish
incident" (i.e. the interruption of transborder data flow from Sweden to the UK – an
earlier case had involved Germany when it still lacked a federal law) repeated itself
within the European market that serious work on an instrument started: The French
CNIL had intervened in a planned transfer from France to Italy (the "FIAT" case).
Furthermore, the Schengen agreement, an instrument outside Community law proper but politically associated with the Community, demanded data protection
considerations. By then, the European Community had also realized that data protection
could become an argument in discussions with the US on more general issues of data
flows, at a time when the US had started to emphasize national security interests to
embargo certain data flows30.
Finally, in 1990, the Commission came out with its proposal for a directive31 which still
took another five years to be finalized.
The Directive gave up the separation between the public and private sectors by defining
its application area as the area subject to the application of Community law. The
Directive incorporated the basic principles and, in regard to other legal technicalities, it
made sure that national legislators in the Member States could recognize some of those
elements they had introduced into national legislation, such as the internal (company)
data protection officer from German law, the Codes of Conduct from Dutch law, or the
prohibition of generating negative decisions on individuals solely through automated
processes from French law.
The Directive allowed Member States a further three years to adapt their national
legislation, which to date – May 1999 – only a minority of the Member States has
succeeded in doing.
2 Privacy: Values – Approaches in European Legislation

2.1 Germany: from the protected sphere to informational self-determination

In 1983, the German Constitutional Court summarized the underlying value concept of the (German) approach to data protection as follows:
The freedom of the individual to decide for himself is endangered when the individual is uncertain about what is known about him, particularly where what society might view
30 See the OECD Declaration on Transborder Data Flow (Adopted by the Governments of OECD
Member countries on 11th April 1985).
31 COM (90) 314.
as deviant behavior is at stake (the chilling effect)32. The individual therefore has the
right to know and make decisions on the information being processed about him. At the
same time, as a social being, the individual cannot avoid becoming the object of
information processing. However, limitations of his basic rights may only be accepted in the case of an overriding general interest, and only if that interest is incorporated into a law that follows the basic requirements of clarity and proportionality33. To protect these principles, a number of safeguards are required; these consist of data processing principles (correctness, timeliness, purpose limitation, fair and lawful collection), derived rights (access, correction), and organizational safeguards
(independent institutions).
This decision created a strong echo – not only because it had caused a 300 million deutschmark administrative investment to be stopped short.
The interpretative response to that judgement became an impressive spectacle of the art
of legal interpretation: Some commentators interpreted the decision into non-existence,
some even started a discourse on the unconstitutionality of constitutional court
decisions, while others saw it as the landmark of the privacy revolution34.
In my view, there were two elements that made that decision remarkable:
In terms of constitutional politics, the judgement was an amalgamation of liberal and
conservative attitudes towards the new information processing technologies. At the
same time, it presented a modern view on the pervasiveness of information processing
and its impact on the private sphere. The decision managed to leave behind a concept
that the Court had developed in previous decisions: the sphere theory, or the theory of the private self as a passive onion – an understanding according to which the core of one’s personality is surrounded by layers which may only be removed by state intervention, and the closer the intervention comes to the innermost layers, the more pervasive the arguments of legitimation must become. With this decision, in contrast, the Court introduced a more active, more communication-oriented picture of the informational self in its exchanges within a communication society.
The controversies this judgement brought (again) to the surface are still not settled, and tend to resurface whenever new measures of data matching or informational control are introduced into parliament, even subsequent to the Directive. But now, these controversies will increasingly be discussed at the European level, where the European data protection agencies will find a stronger voice via the Working Party of European Data Protection Commissioners established by Art. 29 of the Directive.
The judgement was followed in Germany by a series of sector-specific laws, which led to a revision of the general data protection law. It shifted the burden of
argument to those who sought to limit privacy and it finally confirmed the existence of
data protection agencies as necessary elements of data protection regimes.
This activist view of data protection – with all its limitations – had a strong influence on
the debate in neighboring countries and eventually on the Directive although, by then,
many European countries had already established their specific understanding of data
protection.
2.2 Other value approaches

Perhaps the closest position to the German active position had been that of the French,
where the title of the data protection act of 1978 already presented the core of the value
program: "Informatique et Libertés" – informatics and freedom rights, meaning that
privacy was seen as providing a sort of "mother right" to other rights and liberties.
The UK approach, as we recall from our brief glance into its history, had been mainly
trade oriented.
The problem remained as to how best to operationalize this impulse if it had to be molded into legislation affecting the whole organizational infrastructure of the public and private sectors. In addition, the impression of merely passing symbolic legislation had to be avoided; hence the need for a symbol of effective legislation. And, finally, the whole regime – in the best of Thatcherite traditions – had to be more or less self-financing.
The legislative scheme that the UK came up with proved to be ingenious. Somewhat
simplified, it worked like this: Data processors had to register their processing activities.
For this registration they paid a fee, which was in turn used to pay the registrar. The
registrar made sure that registrants only performed the activities they had registered: he
compared firms’ registrations with their actual practices and his action represented the
extent of data protection. Of course, not everything could be registered; only practices that observed the basic principles could be registered. The scheme also limited
the powers of the registrar. (One of the big concerns in the discussion on data
protection regulation – mainly in Anglo-Saxon countries – had been the question of who
controls the controllers).
Finally, if we look at the Directive in terms of what is understood by privacy, we come across, of course, the market freedoms, which require the free flow of personal data in the internal market, which, in turn, requires the harmonization of the rules on privacy protection in the Member States. But, already in the first recital, the Directive emphasizes
the civil liberties and democratic society elements of the Directive and the second
recital combines both elements in the specific economic civil liberty rhetoric of the
European Union:
"(2) Whereas data-processing systems are designed to serve man; whereas they
must, whatever the nationality or residence of natural persons, respect their
fundamental rights and freedoms, notably the right to privacy, and contribute to
economic and social progress, trade expansion and the well-being of individuals;"
Nevertheless, the civil liberties approach addressed to the Member States is stronger
than in almost any European Directive before, and the European institutions
obviously seek to avoid the impression that civil liberties are but a pretext for improving
the internal market conditions for personal data flows.
This stronger emphasis on basic values in more recent approaches is not only due to the
European Court of Justice in Luxembourg which, by then, had also become more explicit
on the matter.
In my view, the change in attitude is also due – with regard to information and
communication technology and the social impact of these technologies – to a value
competition in which the European Union had started to see itself with the US, and to
which we had alluded in the previous chapter.
At the latest since the White Paper on Growth, Competitiveness and Employment (1993),
the European Union and its spearhead on these issues – DG XIII – had consciously
used the term "information society" to mark a (perceived) difference from the US
approach of the "(super) information highway"35.
The European approach – as the EU perceives itself – is more oriented towards social
values than – again as perceived by the EU – the US economic-technological approach.
Against this background, European concern for data protection has become but a
symbol of the importance of social values. And, because of its symbolic value, it is now
more consciously used, not only in the general directive, but also in other areas: In April
1999, for example, the European Parliament ensured that data protection clauses were
explicitly introduced into the draft electronic commerce directive and the draft of the
directive on electronic signatures at the time these drafts went through their first
reading.
This is not the place to examine to what extent this self-image coincides with reality –
both with regard to the more "social" approach to the information society and with
regard to whether data protection is really such a valid symbol of this approach36.
What seems obvious, however, is that the "information society and data protection"
rhetoric not only served as an element of external differentiation, but also as a term for
drumming up internal support for a faster and smoother integration of information and
communication technology into society – in the Member States themselves and not just
at the European level.
35 The term "information society" was, of course, not introduced in the White Paper. With regard to
the history of this term see e.g. Masuda, Yoneji: The Information Society as Post-Industrial
Society. Washington, DC, 2nd printing 1981; Lyon, David: The Roots of the Information Society
Idea. In: O’Sullivan, Tim; Jewkes, Yvonne (eds.): The Media Studies Reader. London 1997, 384-
402 [from: Lyon, David: The Information Society: Issues and Illusions. New York 1988]; Bjoern-
Andersen, N.; Earl, M.; Holst, O.; Mumford, E. (eds.): Information Society: For Richer, For Poorer.
Amsterdam 1982; Webster 1995. The White Paper had built strongly on the work of the FAST
(Forecasting and Assessment in Science and Technology) Program of the European Community.
As to the early beginnings see also Andersen et al. 1982.
36 See instead e.g. Burkert 1997b.
3 An Attempt at a Functional Perspective
This observation invites a digression to attempt a closer look at the functional side of
data protection legislation in Europe.
Behind values there are functional evolutionary processes at work which help these
values to take a particular form at a given time in a specific social context.
3.1 Function and performance
By function I am referring to what data protection legislation does for the
political/legal system. This also implies some assumptions on how this is achieved
(performance). I am not addressing the question as to how efficient or how
effective this legislation has been in fulfilling these functions. There is, however,
increasing, if still tentative, policy research in this field37.
As a test case for my arguments, I would like to use the current German law and the
European Directive, as some sort of European digest of data protection38.
Data protection legislation was required to address two basic problems:
(a) How to ensure the technology’s smooth introduction and integration: The
process of diffusing the technology into society was somewhat problematic. The
technology had connotations that affected personal identity ("Am I a machine?"). And it
affected distribution patterns of social and political power, reopening political
debates on which (at least) intermediate compromises had been reached39.
In addition, its introduction had to be performed against the background of a global
value competition (the "Cold War", i.e. still East–West, and not yet – as towards the end
of the "Cold War" – Europe–US). In this competition "computer power" became a strange
idol of convergence because it seemed to hold the promise of mastering the
complexities of the market as well as the complexity of planned economies40.
We call this the legitimization task.
(b) The second problem was how – once introduced – to keep the technology
running to the best of its potential. As a technology, it put the emphasis on syntactic
rather than semantic aspects and, as such, was prone to fallacies that required sufficient
control over semantics so as not to jeopardize the introduction process described under
(a). This task we call the efficiency task.

37 Cf. the publications by Charles Raab and Colin Bennett, as, for example, Raab et al. 1998.
38 A more detailed attempt had been undertaken in Burkert 1996b. For a more recent and slightly
different approach that seeks to stay closer to Luhmann's theory of social systems: Donos 1998.
39 Basic political problems (employer/employee relationship, consumer/provider, citizen/State, etc.)
were allowed to return to the political stage "disguised" as technology problems. Information and
communication technology is being discussed as increasing existing power gaps and as inviting a
redistribution of political and social power. Implementing such technology in economic, social and
administrative systems thus requires addressing the legitimacy of these processes by either shifting
power balances or re-legitimizing existing power differences.
40 Hence, in the sixties, the belief on both sides that good government was (also) a management
problem to be solved by mastering complexity and quantity.
In this introduction and stabilization process, data protection became part of social
modernization or, rather, yet another instrument legitimizing modernization. In this
process, procedural aspects had to be emphasized: Charismatic elements were not
available, and scientific proof and economic success were no longer reliable
mechanisms for long-term stabilization – although all of these elements were
occasionally used in the context of the "privacy problem"41.
Since science could no longer be used as a long-term, reliable source of legitimization,
the political system had to resort to combining resources of legitimization:
- emphasizing the democratic process over the material merits of the issue,
- supplementing (not replacing) the democratic process by expert advice,
- supplementing expert advice by elements of direct democracy, and/or "common
sense" and/or "the man/woman in the street".
Data protection law incorporated these techniques:
- By being a law and referring to laws it represented the collective democratic
process:
Therefore, in data protection legislation we find: Processing of personal data is
permitted when it is permitted by a law.
- By referring to a contract it emphasized the individual consensual process:
Therefore, in data protection legislation we find: Processing is permitted if there is
informed consent, if there is a contract, or if there is a contractual relationship.
- By taking the symbolic institutional approach:
Therefore, in data protection legislation we find data protection agencies
combining elements of the "expert", the ombudsman and, in some committees,
representatives of the "people" (in a multitude of variations). Sometimes these
elements are combined by giving the data protection agency specific tasks in the
law-making process (e.g. its opinion on a bill is required).
Ensuring efficiency, however, is a more difficult task, because it has to be achieved
almost behind the backs of users. Users of technology tend to be absorbed by short-term
benefits rather than taking mid- or long-term results into account.
The protection of data processing from the fallacies of its users demands ensuring the
semantic quality of the process, that is:
- optimal representation of information by data in terms of exactness and timeliness, and
- safeguarding its context as far as possible.
The search for optimal representation is expressed in the data quality principles of the
data protection laws.
Context is dependent on purpose. Purpose limitation, one of the key concepts of data
protection laws, helps to guarantee context.
The optimization of semantic quality also necessitates the possibility of external quality
control: Therefore, processing must operate under the constant possibility of access by
the data subject (subject verification). External institutional verification processes (data
protection agencies) further add to efficiency.
This analysis may be extended to other elements of data protection laws. For example,
data protection agencies also provide legitimacy competition (among themselves; with
the courts; with the "law abiding" administration; with self-regulating mechanisms).
The purpose of such an approach is not to completely deny the "value" side of data
protection. It is only to contribute to an explanation as to why, as a legislative program,
data protection has apparently been so successful, at least if one looks at the spread of
the concept across national boundaries.
Nor is the performance of data protection legislation uncontested (as we shall see
below). However, much of this criticism confirms that what is seen as not being
performed (or not performed sufficiently) is modernization, if we regard modernization
as a process of managing paradoxical requirements42.
For example, as regards rule-making for information systems in public
administrations, data protection regulations are a necessary but not a sufficient
condition: it is, in particular, their transparency elements that need to be
extended toward more general structural transparency. At that stage, terms such
as "privacy", "access to information" or "freedom of information" legislation
will turn into historical labels rather than functional descriptions: what is
needed is a comprehensive framework for information processes in public
sector information systems.
I do not wish to give the impression that I am pursuing a sort of intentionalist approach,
arguing that, while politicians preached human and civil rights, they intended to establish
an ever more perfect system of personal data handling. I would argue, however, that,
regardless of intention, some of the systemic necessities have received a greater chance
of success against the odds of complex social interactions, and that these needs found
their way into legislation that tried to ensure at least symbolic protection, as well as
credibility among those who worked with the "machine" – not to mention, of course, that
not all interested parties tend to have the same impact in the legislative process.
42 Cf. van der Loo, Hans; van Reijen, Willem: Modernisierung. Projekt und Paradox. München 1992.
3.2 What makes Us different from Them?
It has also not been the purpose of this diversion to subsume all elements of data
protection laws under the function/performance paradigm. There are elements that stand
by themselves, that explain themselves, if at all, by local rites, historic collective memory,
or simply by obstinate cultural relics. But this is the advantage of the functional approach:
it sharpens one’s view for differences in values. If we assume that the functional
requirements of data protection are more or less the same across societies that have
reached a certain stage of economic and technological development, then the differences
in historic residues should become more apparent.
As an example of such differences between the US and the European/German approach,
the stronger impact of history in Europe may be attributed in particular to the European
experience with totalitarian regimes.
This experience has a two-fold effect: On the one side, there is a strong distrust of the
growing surveillance potential of public institutions; at the same time, there is an
almost paternalistic belief that relief from such institutions can only come from other
institutions that are somewhat related to "public administration"43.
43 From recent publications the situation in the US still seems to be different: While Whitaker 1999
claims the end of privacy, Etzioni goes to considerable lengths to justify why he is giving privacy
a critical assessment (Etzioni 1999, 5ff.).
4 The Crisis of Data Protection
The "new criticism" 44 targets:
- the internal differentiation of data protection,
- the bureaucratization of data protection,
- the apparent incapability of data protection legislation to meet the challenges of
technology (i.e. the Internet),
- the incompleteness of data protection as a regulatory concept.
4.1 Internal differentiation of data protection
While national legislatures are still preparing to make their data protection laws
compatible with the European Union Directive – in Germany’s case, for example, this
will be the third major revision of its national law – it is no longer just omnibus laws
that make up the data protection landscape. We are also faced with a multitude of what
are being called "sector-specific data protection laws". These special laws are either
separate laws, or clauses built into other laws, which address data protection issues in
areas like national security, policing, health and social security, public registers, etc.
These regulations seek to apply the general data protection principles to their specific
contexts although, even in these more specific contexts, they cannot avoid leaving areas
of discretion.
This has been an almost logical development: If the rights of the data subject are to be
limited and if this has to be done by law, as we remember from the basic assumptions of
informational self-determination, then it will be laws which introduce sector-specific
considerations. However, sufficient room must also be left there in order to retain
flexibility for individual cases unless information systems are prescribed in full detail.
With this approach, problem by problem, issue by issue, legislators tend to lose sight of
the overall picture. Legislative techniques of reference and cross-reference add to
complexity. One ends up with a highly differentiated system of rules, if system is still
the right word, which still leaves much discretion to the courts and even broader
discretion (or a broader margin of appreciation) to the administration: complexity
without the benefit of predictability. With this approach, not only is the demand for
clarity abandoned – a demand that had once been an integral part of data protection –
but it is also becoming more and more difficult to obtain an overall view of the extent
and limits of data protection; that is, unless one takes the more sinister view of
legislatures merely "nodding" to the complex demands of the "expert" administration.
The role of the sector-specific legislation in the public sector is taken over in the private
sector by the contract approach: The creation of consent as the all-exonerating
instrument. Data protection clauses have become part of numerous contracts; privacy
policies appear on websites. Not all of these clauses meet the requirements of data
protection principles, and not all of them meet the requirements of consent, or even
informed consent.

44 See e.g. Die Landesbeauftragten für den Datenschutz: 20 Jahre Datenschutz – Individualismus
oder Gemeinschaftssinn. Düsseldorf 1998.
In most cases, however, challenging these clauses of consent or non-consent in the
private sector means activating the courts, which, indeed, is occasionally done by
competitors, rarely by consumer associations, even less frequently by individuals,
and almost never – because this is not foreseen in the regulatory systems in Europe – by
the data protection agencies. Data protection in the private sector has thus been reduced
to a consumer protection issue and is going to share its fate.
This does not exclude occasional eruptions of protest, not unlike the Lotus
MarketPlace reaction in the US. But, usually, these eruptions need argumentative support
from the data protection agencies and, in these cases, privacy concerns tend to get
mixed with other concerns or resentments, in which case data protection issues are
just the catalyst. In Germany this was the case when banks changed their general
contract provisions to introduce a very limited right to give (positive) credit
information to requesting third parties, and when the German Railway
Corporation (which although privatized is still regarded as a public sector
institution) started a campaign (that was ill fated right from the beginning) to
market a card that would have combined credit card and rail card functions. In
both cases, there had already been a pre-existing feeling of discontent with regard
to the services offered by these institutions and to which data protection added
legitimacy. And, in both cases, an explicit act of consent was necessary on the part
of the "consumers" to introduce the scheme.
While more tradition-inclined jurists are finally discovering the attractions of privacy
legislation (the recent Annual Conference of German Jurists even dedicated a whole
session to the issue45) and while some of them are falling back into ancient dreams of
codifying information as such – a sort of Codex Iuris Informationis (this already being
another signifier of decay in legal discussion) – there are increasing doubts about the
effectiveness of data protection regulation, and in particular of data protection agencies.
These doubts arise when one reads through the annual (and increasingly only
biennial) reports of these agencies. Occasionally, they convey the feeling that these
institutions regard themselves as being judged by the number of pages they produce in
these reports. With their diminishing role as authoritative registration agencies, and with
automated processing becoming a banality, many of these agencies have difficulties
adjusting to the new demands placed upon them.
45 To do justice to the social institution of German law, it should be noted that this had already been
the second occasion: At an annual conference exactly 25 years ago, data protection had also been
the subject of a session.
These agencies seem to shift uneasily between the image of data protection
bureaucratization46 and an ombudsman role. Doubts are reinforced in both roles when
one compares the demands they themselves made in their reports with the changes they
actually initiated through their interventions.
Occasionally, this general impression is modified by the particular personality of a
commissioner; above all if he or she knows how to make use of the mass media,
or by the explicit political understanding of his/her role or the role of the agency,
as is the case with, e.g., the French Commission.
This somewhat dissatisfying impression may be part of the role these institutions had to
play in current data protection concepts as described in the functional analysis above.
This may change with the Directive and its implementation, which has given these
agencies a more prominent role. However, at the current stage, dissatisfaction is not only
an appropriate term to describe how these agencies are seen from the outside; it also
reflects to a large extent the image they have of themselves.
4.3 The challenge of the Internet
The Internet poses a double challenge to data protection regulations. There is the
obvious technical challenge, but there is also the more fundamental challenge to
regulatory concepts in general, of which the Internet is only one example.
Current data protection regulations address – with the exception of a few recent
attempts to extend past wordings to current and future problems47 – previous-generation
technology: the arrival of the personal computer, distributed processing, online data
banks and highly formalized electronic data interchange.
The technology of the day is – of course – characterized by increasing network
connectivity, high-speed transmissions and the ubiquity of information processing, for
which the Internet has become the catchword.
Data protection regulation is no longer capable of meeting technical challenges such as
the recording of information consumption behavior in ever greater detail, with a
decreasing chance of the user becoming aware of this monitoring. Data protection
regulation cannot stay ahead of technical developments and is not even able to follow
them at an acceptable distance.
Independently of the Internet, there are also challenges to the more traditional
forms of processing through techniques such as data mining, which finally seem
to justify the idea that one should never throw any data away because
one might need it one day48. This concept provides the strongest challenge yet to
the purpose limitation principle.
The criticism of data protection regulation in view of the Internet also echoes some of
the more fundamental criticism of regulation in general:
- Data protection regulation is no longer capable of getting hold of the
responsible keeper: organizational entities disappear and reconfigure constantly
in the era of virtual international corporate – but also public sector – institutions.
- Data protection – even where it is regionally organized, as at the EU level –
cannot meet the challenges of global data flows.
- Data protection regulation is not fine-tuned to the economic mechanisms of the
use of information and communication technology today.
48 Very instructive on the privacy problems of data mining: Ann Cavoukian, Ph.D. (Information and
Privacy Commissioner/Ontario): Data Mining: Staking a Claim on Your Privacy. January 1998.
(mainly Swedish) concerns about access. Judging from a recent Green Paper, the EU
might even be moving towards a Directive on access to government information.
What is becoming obvious in the process is that:
- data protection legislation already contains structural elements of access as well as
ones pertaining to individual rights;
- data protection and freedom of information complement each other and where one
of the elements is missing the other has to be introduced to strike the proper
balance;
- both types of legislation need harmonization to avoid lacunae and overlaps, as
well as conflicts between supervising agencies.
5 From Here to Where?

5.1 Discussing Remedies
5.1.1 Technology
Right from the outset, the concept of data protection in Europe was not merely a
European affair. American international companies and their subsidiaries, even if they
regarded themselves as European companies, conveyed the American view on
regulation and on privacy regulation in particular. European companies could point to
those companies and issue warnings about their own international competitiveness. The
first hearing of the European Parliament had a strong observer delegation from the US.
The Council of Europe Convention had received special wording to provide for the
unlikely event that the US would join. And, of course, the discourse on how to draft the
Directive was a discourse that involved the US from the beginning. Therefore, it is not
surprising that US concepts found their way into the discussion then and are now –
now that the European point has been made by the Directive – entering more strongly
into European reconsiderations of privacy. Some of these concepts, in my view, have a
chance, and others do not.
Among the concepts being re-introduced more strongly are those favoring more subtle
approaches that involve the regulated entities themselves. Among these, approaches
that seek to combine regulation by independent agencies with self-regulation might
have a better chance. For data protection, one will have to wait and see how such
current approaches are perceived – in the mid-term – in other areas like
telecommunications.
Significantly less attention (although this may not remain so) has been given to the
perception of privacy rights as property rights that can become the subject of
transactions and could thus become marketable.
Rather early in the debate, the European approach seemed to have focused on privacy as
a – basically – inalienable liberty right, comparable perhaps to the personality right
element in some European copyright regimes.
integrating all aspects of information-related regulation. This goes beyond merely
creating a new codex of information law; it would also involve a new methodology to
examine information conflicts.
52 See e.g. the European Parliament, Report A4-0189/98: Resolution on the communication from the
Commission to the Council, the European Parliament, the Economic and Social Committee and
the Committee of the Regions on ensuring security and trust in electronic telecommunication -
towards a European framework for digital signatures and encryption (COM(97)0503 - C4-
0648/97): " 7. Emphasizes, however, the potential benefits offered by the new communication
technologies on strengthening European civil society through the development of a European and
global public space, and therefore calls on the Commission to inform it of any observations
relating to abuse of communications systems, especially through interception, for the purposes of
illegitimate surveillance of citizens and individuals in the EU;"
53 European Parliament: Draft Joint Action, adopted by the Council on the basis of Article K.3 of the
Treaty on European Union, to combat child pornography on the Internet (10850/5/98 - C4-0674/98
- 98/0917(CNS)):"Amendment 16) Article 3 (-1)(new) -1. to ensure that the identity of persons
who have set up an electronic mail (e-mail) address can be ascertained; [...]"
References: