Brkan, Maja - 'Do Algorithms Rule the World? Algorithmic Decision-Making and Data Protection in the Framework of the GDPR and Beyond'
doi: 10.1093/ijlit/eay017
Article
ABSTRACT
The purpose of this article is to analyse the rules of the General Data Protection Regulation (GDPR) and the Directive on Data Protection in Criminal Matters on automated decision-making and to explore how to ensure transparency of such decisions, in particular those taken with the help of algorithms. Both legal acts impose limitations on automated individual decision-making, including profiling. While these limitations of automated decisions might come across as a forceful fortress strongly protecting individuals and potentially even hampering the future development of Artificial Intelligence in decision-making, the relevant provisions nevertheless contain numerous exceptions allowing for such decisions. While the Directive on Data Protection in Criminal Matters worryingly does not seem to give the data subject the possibility to familiarize herself with the reasons for such a decision, the GDPR obliges the controller to provide the data subject with ‘meaningful information about the logic involved’ (Articles 13(2)(f), 14(2)(g) and 15(1)(h)), thus raising the much-debated question whether the data subject should be granted a ‘right to explanation’ of the automated decision. This article seeks to go beyond the semantic question of whether this right should be designated as the ‘right to explanation’ and argues that the GDPR obliges the controller to inform the data subject of the reasons why an automated decision was taken. While such a right would in principle fit well within the broader framework of the GDPR’s quest for a high level of transparency, it also raises several queries: What exactly needs to be revealed to the data subject? How can an algorithm-based decision be explained? The article aims to explore these questions and to identify challenges for further research regarding explainability of automated decisions.
KEYWORDS: General Data Protection Regulation, automated decision-making, right to explanation, algorithmic transparency, Article 22 GDPR, Article 11 Directive 2016/680
INTRODUCTION
‘This is beautiful,’ exclaimed Fan Hui, the multiple European Go champion,
when he saw Google’s AlphaGo making an unconventional move during a Go
* Associate Professor, Faculty of Law, Maastricht University, The Netherlands. E-mail: [email protected]. The author is grateful to the anonymous reviewers for their valuable comments on an earlier version of this article.
© The Author(s) (2019). Published by Oxford University Press. All rights reserved. For permissions, please email: [email protected].
game.1 AlphaGo eventually won the ancient game, much more complex than chess,
and proved that machines could outperform humans . . . again.2 With the rise of in-
1 See Cade Metz, ‘The Sadness and Beauty of Watching Google’s AI Play Go’ <https://www.wired.com/2016/03/sadness-beauty-watching-googles-ai-play-go/> accessed 21 November 2018.
2 For example, AI has also outperformed humans in chess (AI Deep Blue) and Jeopardy (AI Watson).
3 KG Orphanides, ‘Robot Carries out First Autonomous Soft Tissue Surgery’ <http://www.wired.co.uk/article/autonomous-robot-surgeon> accessed 25 July 2018.
4 Communication from the Commission to the European Parliament, the Council, the European Economic
and Social Committee and the Committee of the Regions. A Digital Single Market Strategy for Europe
(COM/2015/192 final).
5 See for example, the European Parliament resolution of 16 February 2017 with recommendations to the Commission on Civil Law Rules on Robotics, 2015/2103(INL); Communication from the Commission to the European Parliament, the European Council, the Council, the European Economic and Social Committee and the Committee of the Regions: Artificial Intelligence for Europe, COM(2018) 237 final; Declaration of cooperation on Artificial Intelligence, signed by EU Member States <https://ec.europa.eu/digital-single-market/en/news/eu-member-states-sign-cooperate-artificial-intelligence> accessed 26 June 2018; Draft AI Ethics Guidelines ‘Artificial intelligence, real benefits’ <https://ec.europa.eu/digital-single-market/en/news/draft-ethics-guidelines-trustworthy-ai> accessed 18 December 2018.
However, the crucial role that the EU plays in this field raises many unanswered
questions. While European companies exponentially use Big Data and AI in their
AUTOMATED DECISION-MAKING
Automated decision-making could be defined as taking a decision without human
intervention; according to the GDPR, ‘automated individual decision-making’ is ‘a
decision based solely on automated processing’.12 The human can of course feed the
system with data—although even this can be an automatic procedure—and interpret
the decision once it is taken. If the automated decision-making does not have any
binding effect on data subjects and does not deprive them of their legitimate rights,
such decision-making is of a low impact. However, when a decision is binding for
6 Ryan Calo, ‘Peeping HALs: Making Sense of Artificial Intelligence and Privacy’ (2010) 2 European Journal
of Legal Studies 3, 168–192.
7 Bryce Goodman and Seth Flaxman, ‘European Union regulations on Algorithmic Decision-making and a
“right to explanation”’ (2016) ICML Workshop on Human Interpretability in Machine Learning, New York
<https://arxiv.org/pdf/1606.08813v3.pdf> accessed 18 July 2017.
8 Andrea Bertolini, ‘Robots as Products: The Case for a Realistic Analysis of Robotic Applications and
Liability Rules’ (2013) 5 LIT 2, 214–47.
9 Brent Daniel Mittelstadt and others, ‘The Ethics of Algorithms: Mapping the Debate’ (2016) 3 Big Data &
Society 1, 1–21.
10 Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation), OJ L 119, 4.5.2016, 1.
11 Directive (EU) 2016/680 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA, OJ L 119, 4.5.2016, 89.
12 art 22(1) of the General Data Protection Regulation.
individuals and affects their rights, by deciding for example whether a client should be awarded credit, a tax return or employment, the law has to provide sufficient safeguards
13 See further on efficiency and fairness in automated decision-making Tal Zarsky, ‘The Trouble with
Algorithmic Decisions: An Analytic Road Map to Examine Efficiency and Fairness in Automated and
Opaque Decision Making’ (2016) 41 Science, Technology, & Human Values 118–32.
14 More on profiling see Mireille Hildebrandt and Serge Gutwirth (eds), Profiling the European Citizen.
Cross-Disciplinary Perspectives (Springer 2008).
15 Frank Pasquale, The Black Box Society. The Secret Algorithms That Control Money and Information
(Harvard University Press 2015); Jacob Loveless and others, ‘Online Algorithms in High-frequency
Trading. The challenges faced by competing HFT algorithms’ (2013) 11 acmqueue 1.
16 Melissa Perry, ‘iDecide: Administrative Decision-Making in the Digital World’ (2017) Australian Law
Journal 29–34.
17 See Trevor Bench-Capon and Thomas F Gordon, ‘Tools for Rapid Prototyping of Legal Case-Based Reasoning’ (2015) ULCS-15-005, University of Liverpool, UK; Giovanni Sartor and Luther Branting (eds), Judicial Applications of Artificial Intelligence (Springer 1998); Angèle Christin, Alex Rosenblat and Danah Boyd, ‘Courts and Predictive Algorithms’ (2015) Data & Civil Rights: A New Era of Policing and Justice <http://www.law.nyu.edu/sites/default/files/upload_documents/Angele%20Christin.pdf> accessed 16 January 2018.
18 On the concept of procedural regularity in automated decision-making, see Joshua A Kroll and others,
‘Accountable Algorithms’ (2017) 165 University of Pennsylvania Law Review 637, 638, 641, 656ff.
19 On algorithmic discrimination see for example Bryce Goodman, ‘Discrimination, Data Sanitisation and
Auditing in the European Union’s General Data Protection Regulation’ (2016) 2 European Data
Protection Law Review 4, 493–506.
20 Thomas H Cormen, Algorithms Unlocked (MIT Press 2013) 1.
support of algorithms. With the increasing use of Big Data and more and more complex decision-making, algorithmic intervention has become almost indispensable.
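
To make the notion concrete, the following minimal Python sketch illustrates a decision ‘based solely on automated processing’ in the sense of Article 22(1) GDPR: a credit application is approved or refused by the system alone, with no human reviewing the outcome before it takes effect. All feature names, weights and the threshold are invented for illustration; real scoring systems are typically far more complex and often far less transparent.

```python
# Hypothetical sketch of a decision 'based solely on automated processing'
# (Article 22(1) GDPR): no human reviews the outcome before it takes effect.
# Feature names, weights and the threshold are invented for illustration.

def automated_credit_decision(applicant: dict) -> str:
    """Approve or refuse credit with no human intervention."""
    # A simple linear score over invented features; production systems may
    # rely on far more complex and less transparent models.
    score = (0.4 * applicant["income_band"]        # e.g. 1 (low) to 5 (high)
             - 0.6 * applicant["missed_payments"]
             + 0.2 * applicant["years_employed"])
    # The outcome takes effect automatically: credit is granted or refused
    # without any human decision-maker in the loop.
    return "approved" if score >= 1.0 else "refused"

print(automated_credit_decision(
    {"income_band": 3, "missed_payments": 2, "years_employed": 4}))
```

A human may well have supplied the input data, but, as noted above, this alone does not amount to human intervention in the decision itself.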
AUTOMATED INDIVIDUAL DECISION-MAKING IN THE EU DATA PROTECTION LEGISLATION
This section analyses the provisions of the GDPR and the Directive on Data Protection in Criminal Matters on automated decision-making and explains when such decision-making is possible and under which conditions. Given that automated decision-making is used much more regularly in commercial settings than for the purposes of law enforcement, the focus of this section—and of this article more generally—is on the analysis of the GDPR provisions rather than those of the said directive. The GDPR regulates automated individual decision-making, including profiling, in its Article 22. The first paragraph of this provision stipulates: ‘[t]he data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.’ Article 11 of the Directive on Data Protection in Criminal Matters takes a comparable stance towards automated individual decision-making by imposing an obligation on Member States to prohibit ‘a decision based solely on automated processing, including profiling, which produces an adverse legal effect concerning the data subject or significantly affects him or her’.
The provision of the Directive on Data Protection in Criminal Matters builds
upon an equivalent provision of the predecessor of this legal act, that is, the Council
Framework Decision 2008/977/JHA.21 Article 22 of the GDPR continues the legacy
of the 1995 Data Protection Directive,22 more precisely its Article 15, according to
which the data subject equally had the right not to be subject to a decision producing
legal effects or significantly affecting her and which was based solely on automated
processing of data. While the wording of the provision did not undergo substantial
21 See art 7 of the Council Framework Decision 2008/977/JHA of 27 November 2008 on the protection of
personal data processed in the framework of police and judicial cooperation in criminal matters, OJ L
350, 30.12.2008, 60.
22 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995, 31.
changes with the adoption of the GDPR,23 the practical importance of the provision
increased with augmented use of automated decision-making. The provision of Data
23 Sandra Wachter, Brent Mittelstadt and Luciano Floridi, ‘Why a Right to Explanation of Automated Decision-Making does not Exist in the General Data Protection Regulation’ (2017) 7 International Data Privacy Law 2, 81 and fn 66, point out that the text of art 22 GDPR has not been changed much compared to art 15 of the Data Protection Directive.
24 Compare Wachter, Mittelstadt and Floridi (n 23) 81, who equally analyse the legislative developments of
art 22 GDPR, but rather from the perspective of how the (non-)inclusion of the right to explanation into
the text of the Regulation developed in the course of the legislative procedure.
25 art 20 of GDPR proposal, COM(2012) 11 final.
26 ibid, art 20(4).
27 art 20 (Profiling), European Parliament legislative resolution of 12 March 2014 on the proposal for a
regulation of the European Parliament and of the Council on the protection of individuals with regard to
the processing of personal data and on the free movement of such data (General Data Protection
Regulation), COM(2012)0011—C7-0025/2012—2012/0011(COD) (Ordinary legislative procedure:
first reading).
28 Position of the Council at first reading with a view to the adoption of a Regulation of the European Parliament and of the Council on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation)—Adopted by the Council on 8 April 2016.
29 Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals
with regard to the processing of personal data and on the free movement of such data (General Data
Protection Regulation), COM(2012) 11 final, 9.
30 Recommendation CM/Rec(2010)13 of the Committee of Ministers to member states on the protection
of individuals with regard to automatic processing of personal data in the context of profiling (Adopted
by the Committee of Ministers on 23 November 2010 at the 1099th meeting of the Ministers’ Deputies).
31 Isak Mendoza and Lee A Bygrave, ‘The Right not to be Subject to Automated Decisions based on Profiling’ University of Oslo Faculty of Law Legal Studies Research Paper Series No 20/2017 <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2964855> accessed 11 July 2018, 7, rightly point out that, as a result of the legislative process leading to the adoption of the GDPR, the data subject does not have the right to object to all profiling, but only to certain types of decisions arising from profiling.
32 ibid.
33 See Chapter III, ‘Rights of the data subject’, s 4, ‘Right to object and automated individual decision-making’, of the GDPR.
Criminal Matters34 appears in the chapter on principles, preceding the one on rights
of the data subject. The latter provision is also phrased rather differently. Apart from
34 See Chapter II, ‘Principles’, of the Directive on Data Protection in Criminal Matters.
35 Wachter, Mittelstadt and Floridi (n 23) 94, point out that Article 22 can be interpreted either as a prohibition or as a right to object and that the two interpretations ‘are differentiated by whether action is required by the data subject to restrict automated decision-making’.
36 See in this sense Wachter, Mittelstadt and Floridi (n 23) 95.
37 WP29, Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation
2016/679, adopted on 3 October 2017; as last revised and adopted on 6 February 2018, 17/EN,
WP251rev.01, 21.
38 ibid., WP29 also stresses that the requirements of Article 22 cannot be avoided by merely ‘fabricating
human involvement’.
39 Compare Wachter, Mittelstadt and Floridi (n 23) 95.
subject40 without providing her with necessary safeguards from paragraph 3 of that
provision. This is because safeguards need to be provided only when one of the
47 For a collective data protection aspect in the age of Big Data analytics see Alessandro Mantelero,
‘Personal Data for Decisional Purposes in the Age of Analytics: From an Individual to a Collective
Dimension of Data Protection’ (2016) 32 Computer Law & Security Review 2, 238–55.
48 According to Recital 26 GDPR and Recital 21 of the Directive on Data Protection in Criminal Matters,
the ‘principles of data protection should . . . not apply to anonymous information’.
49 Personal data are defined in art 4(1) GDPR and 3(1) of the Directive as ‘any information relating to an identified or identifiable natural person’ (emphasis added). On identifiability, see Worku Gedefa Urgessa, ‘The Protective Capacity of the Criterion of “Identifiability” under EU Data Protection Law’ (2016) 2 European Data Protection Law Review 4, 521–31. Furthermore, Moerel and van der Wolk point out that, in the Big Data environment, it is difficult to fully anonymise the data and ‘retain the functionality and usability of the underlying information’; see Lokke Moerel and Alex van der Wolk, ‘Big Data Analytics Under the EU General Data Protection Regulation’ <https://ssrn.com/abstract=3006570> accessed 15 June 2018, 31.
50 Further on tension between Big Data and art 22 GDPR, see Tal Zarsky, ‘Incompatible: The GDPR in the
Age of Big Data’ (2017) 47 Seton Hall Law Review 4, 1017ff.
51 Dennis Broeders and others, ‘Big Data and security policies: Towards a Framework for Regulating the
Phases of Analytics and Use of Big Data’ (2017) 33 Computer Law & Security Review 3, 314, point out
scope of application of GDPR would not only create an enormous imbalance in how
individual and collective automated decisions are treated, but could also open the
Protection in Criminal Matters, the human has to use the machine only as a decision
support. If the human intervention fulfils these criteria and renders the decision not
In some cases, however, the existence of significant effect seems less straightforward. In the GDPR framework, when does sending advertisements by Google and
63 Targeted advertisement will play an insignificant or even no role with regard to the Directive on Data Protection in Criminal Matters as this Directive relates to data protection ‘by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties’ (Article 1(1)).
64 WP29, Guidelines on Automated individual decision-making, 22.
65 Wachter, Mittelstadt and Floridi (n 23) 93, correctly stress that ‘it may cause a burden for the data subject
to prove that processing affects them significantly’.
According to the GDPR, the prohibition from paragraph 1 of that provision does
‘not apply if the decision: (a) is necessary for entering into, or performance of, a con-
66 More on this type of contracts see Lauren Henry Scholz, ‘Algorithmic Contracts’ (2017) Stanford Technology Law Review <https://papers.ssrn.com/sol3/papers.cfm?abstract_id=274770> accessed 10 November 2018.
67 WP29, Guidelines on Automated individual decision-making, 23.
68 ibid.
69 Recital 71 GDPR.
contract with data subject could potentially bring under this exception also targeted
advertising which is done with a direct and unambiguous intention of contract con-
70 Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime, OJ L 119, 4.5.2016, 132.
71 art 6(5) of the Directive (EU) 2016/681 of the European Parliament and of the Council of 27 April 2016 on the use of passenger name record (PNR) data for the prevention, detection, investigation and prosecution of terrorist offences and serious crime, OJ L 119, 4.5.2016, 132; compare also para 2 of this provision; and Mendoza and Bygrave (n 31) 6.
72 In practice, however, many decisions are already taken without human intervention.
73 See § 37 (Automatisierte Entscheidungen im Einzelfall einschließlich Profiling) of the German Gesetz zur Anpassung des Datenschutzrechts an die Verordnung (EU) 2016/679 und zur Umsetzung der Richtlinie (EU) 2016/680 (Datenschutz-Anpassungs- und -Umsetzungsgesetz EU – DSAnpUG-EU).
74 See ibid 106: explanations to § 37.
75 ibid.
76 See § 35a of the German Verwaltungsverfahrensgesetz (VwVfG) and explanation to § 37 of DSAnpUG-
EU above.
comparison, the Dutch law contains a more general implementation clause of Article
22(1) GDPR by allowing automated decision-making if this is necessary to comply
77 See art 40(1) of the Dutch Regels ter uitvoering van Verordening (EU) 2016/679 van het Europees Parlement en de Raad van 27 April 2016 betreffende de bescherming van natuurlijke personen in verband met de verwerking van persoonsgegevens en betreffende het vrije verkeer van die gegevens en tot intrekking van Richtlijn 95/46/EG (Uitvoeringswet Algemene verordening gegevensbescherming).
78 art 7 GDPR.
79 See art 29 Working Party, Guidelines on consent under Regulation 2016/679, adopted on 28 November
2017; as last revised and adopted on 10 April 2018.
80 ibid 18.
81 art 29 Working Party, ‘Advice paper on essential elements of a definition and a provision on profiling within the EU General Data Protection Regulation’, adopted on 13 May 2013 <http://ec.europa.eu/justice/data-protection/article-29/documentation/other-document/files/2013/20130513_advice-paper-on-profiling_en.pdf> accessed 11 November 2018.
82 See Recital 72 GDPR according to which ‘[p]rofiling is subject to the rules of this Regulation governing
the processing of personal data, such as the legal grounds for processing or data protection principles’.
83 Prior to the applicability of the GDPR, countries that specifically allowed for profiling mostly required additional safeguards in this regard. Italy can be used as an example of a country which specifically allowed for profiling, but the data subject had to be notified prior to processing of data aimed at profiling: See Guidelines on online profiling issued by Garante per la protezione dei dati personali; for a summary see <http://blogs.dlapiper.com/iptitaly/?p=56970> accessed 26 May 2018. Moreover, some countries, such as the Netherlands, even allowed for ethnic profiling, which may be problematic both from a data protection and a non-discrimination perspective: For more on this issue see Simone Vromen, ‘Ethnic Profiling in the Netherlands and England and Wales: Compliance with International and European Standards’, Public Interest Litigation Project (PILP-NJCM) / Utrecht University <https://pilpnjcm.nl/wp-content/uploads/2015/06/Research-project-B-FINAL-version.pdf> accessed 16 May 2018.
Should the EU Member States want to make use of fully automated systems of crime prevention or determination of criminal sanctions, akin to systems such as COMPAS84 or
84 COMPAS stands for ‘Correctional Offender Management Profiling for Alternative Sanctions’; see for example Fact Sheet ‘COMPAS Assessment Tool Launched – Evidence-based rehabilitation for offender success’ <https://www.cdcr.ca.gov/rehabilitation/docs/FS_COMPAS_Final_4-15-09.pdf> accessed 29 June 2018. The system was used, for example, in the determination of a six-year prison sentence for Eric Loomis of Wisconsin, who unsuccessfully challenged such determination of his sentence before the Wisconsin Supreme Court. See ‘State v. Loomis: Wisconsin Supreme Court Requires Warning Before Use of Algorithmic Risk Assessments in Sentencing. Recent Case: 881 N.W.2d 749 (Wis. 2016)’ (2017) 130 Harvard L Rev 1530. The US Supreme Court declined to hear the case; see Michelle Liu, ‘Supreme Court Refuses to Hear Wisconsin Predictive Crime Assessment Case’, Milwaukee Journal Sentinel, 26 June 2017 <https://eu.jsonline.com/story/news/crime/2017/06/26/supreme-court-refuses-hear-wisconsin-predictive-crime-assessment-case/428240001/> accessed 29 June 2018.
85 VALCRI stands for ‘Visual Analytics for Sense-Making in Criminal Intelligence Analysis’; <http://valcri.org/about-valcri/> accessed 29 June 2018.
86 Emphasis added.
decision. For example, if the data subject concludes an online contract with dynamic
pricing, it might be impossible to request a human intervention if the website does
to a reasonable assumption that it is the controller itself that needs to take this deci-
sion. Who exactly is in charge of handling such a challenge should be left to the con-
88 art 77 GDPR.
89 art 79 GDPR; art 82 GDPR provides for the right to compensation of damage incurred through infringement of the GDPR.
90 Or another law not specifically adopted for the implementation of the Directive.
91 arts 52 and 54 of the Directive.
92 art 11 of the Position of the Council at first reading with a view to the adoption of a Directive of the European Parliament and of the Council on the protection of natural persons with regard to the processing of personal data by competent authorities for the purposes of the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and on the free movement of such data, and repealing Council Framework Decision 2008/977/JHA—Adopted by the Council on 8 April 2016, ST 5418 2016 REV 1 - 2012/010.
93 art 9(1) of the Proposal for a Directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, COM(2012) 10 final.
94 art 9(1) of the Report on the proposal for a directive of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data by competent authorities for the purposes of prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, and the free movement of such data, COM(2012)0010 – C7-0024/2012 – 2012/0010(COD).
95 art 50(2) of the UK Data Protection Act 2018.
ous “right to explanation”’.106 For reasons elaborated on below, this author considers
that the GDPR provisions should be interpreted in a way so as to provide the data
106 Bryan Casey, Ashkon Farhangi and Roland Vogl, ‘Rethinking Explainable Machines: The GDPR’s “Right to Explanation” Debate and the Rise of Algorithmic Audits in Enterprise’ Berkeley Technology Law Journal, forthcoming <https://ssrn.com/abstract=3143325> accessed 7 July 2018.
107 art 13(2)(f) GDPR.
108 art 14(2)(g) GDPR.
109 art 15(1)(h) GDPR.
110 Emphasis added.
111 Selbst and Powles (n 105) 235–37.
112 Wachter, Mittelstadt and Floridi (n 23) 91.
113 Case C-304/02 Commission v France ECLI:EU:C:2005:444, can be brought forward as an extreme example where the purposeful interpretation led the Court to interpret the word ‘or’ as having a cumulative sense (‘and’); see para 83 of the judgment.
114 Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, OJ L 281, 23.11.1995, 31.
115 More precisely, the CJEU relied on art 12(b) and subparagraph (a) of the first paragraph of art 14 of
Directive 95/46; Case C-131/12 Google Spain and Google ECLI:EU:C:2014:317.
116 Selbst and Powles (n 105) 242.
117 Compare WP29, Guidelines on Automated individual decision-making, 27, in particular on the ‘weight’
of factors ‘on an aggregate level’.
out its content and call it the right of the data subject to be informed about the reasons for the automated decision. In order for this right to be of relevance to the data
118 Joshua A Kroll and others (n 18) 637, 638, 641, 656ff.
119 ibid 657.
120 Wachter, Mittelstadt and Floridi (n 23) 82, emphasis omitted.
121 Emphasis added.
should ‘be sufficiently comprehensive for [her] to understand the reasons for the
decision’.122
data subject was faced merely with a final decision without any reasons relating to
why such a decision was taken.
subject is guaranteed ‘at least’ those safeguards. Nothing in the wording of Article
22(3) would oppose adding an additional safeguard that stems from other provisions
134 Guaranteeing this additional safeguard is of course independent from data controllers’ possibility to voluntarily offer additional safeguards, as advocated by Wachter, Mittelstadt and Floridi (n 23) 91.
135 ibid 82.
136 Edwards and Veale (n 103) 53.
137 It seems that arts 13(2)(f), 14(2)(g) and 15(1)(h) GDPR would apply to a decision that would not fulfil the criteria from art 22(1) and hence be allowed, for example a decision not having a legal or significant effect on the data subject. If the provisions are read in that way, it seems even more unreasonable not to guarantee these rights for automated decisions taken on the basis of art 22(2).
138 Recital 39 GDPR.
139 On the importance of transparency in the GDPR, see for example Sofia Olhede and Russell Rodrigues, ‘Fairness and Transparency in the Age of the Algorithm’ (2017) Significance 8–9; Merle Temme, ‘Algorithms and Transparency in View of the New General Data Protection Regulation’ (2017) 3 European Data Protection Law Review 4, 473; Antoni Roig, ‘Safeguards for the Right not to be Subject to a Decision Based Solely on Automated Processing (Article 22 GDPR)’ (2017) 8 European Journal of Law and Technology 3, 1–17; Emre Bayamlıoglu, ‘Transparency of Automated Decisions in the GDPR: An Attempt for Systemisation’ <https://ssrn.com/abstract=3097653> accessed 10 July 2018. Kaminski points out that the ‘right to explanation is far from the only transparency right’ in the GDPR;
and even propose technical solutions146 that would lead to a higher algorithmic
transparency. Algorithmic transparency is deemed to cover different transparency
the logic involved (Articles 13(2)(f), 14(2)(g) and 15(1)(h)), expressly require that this should be guaranteed in case the automated decision is based on sensitive data.
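
By way of illustration, the sketch below shows one conceivable way in which a controller could generate ‘meaningful information about the logic involved’ and the reasons for a concrete decision, for a deliberately simple linear scoring model such as the credit sketch earlier in this article. The feature names and weights are again invented; for opaque models such as deep neural networks, no comparably direct decomposition of the decision exists, which is precisely the explainability challenge noted in the conclusion below.

```python
# Hypothetical sketch: generating the reasons for an automated decision
# from a simple, interpretable scoring model. All names and numbers are
# invented for illustration; this is not a general solution for opaque models.

WEIGHTS = {"income_band": 0.4, "missed_payments": -0.6, "years_employed": 0.2}
THRESHOLD = 1.0  # invented approval threshold

def explain_decision(applicant: dict) -> str:
    """Report the outcome and each factor's contribution to it."""
    # In a linear model, each feature's contribution to the score is
    # directly readable, so the main reasons can be stated to the data subject.
    contributions = {name: weight * applicant[name]
                     for name, weight in WEIGHTS.items()}
    score = sum(contributions.values())
    outcome = "approved" if score >= THRESHOLD else "refused"
    # Order the factors by the size of their influence on the decision.
    ranked = sorted(contributions.items(), key=lambda kv: abs(kv[1]), reverse=True)
    lines = [f"Decision: {outcome} (score {score:.2f}, threshold {THRESHOLD})"]
    lines += [f"  {name}: contributed {value:+.2f}" for name, value in ranked]
    return "\n".join(lines)

print(explain_decision({"income_band": 3, "missed_payments": 2, "years_employed": 4}))
```

Such a factor-by-factor statement is one possible reading of information that is ‘sufficiently comprehensive’ for the data subject to understand the reasons for the decision, though nothing in the GDPR prescribes this particular technical form.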
CONCLUSION AND FURTHER CHALLENGES
This article analyses the rules of the GDPR and the Directive on Data Protection in Criminal Matters regarding automated decision-making. It establishes that, while these rules clearly give the right to the data subject not to be subjected to a fully
152 Charter of Fundamental Rights of the European Union, OJ C 326, 26.10.2012, 391.
automated decision, including profiling, the exceptions to this right hollow it out to the extent that the exceptions themselves become the rule. Globally, and especially for very fast machine learning,156 it will be close to impossible to explain the reasons behind its decisions.157 It seems that, in order to reach the transparency of an algorithm,
156 Masnick claims that the faster the machine learns, the more difficult it is to understand the reasons behind its decisions; Mike Masnick, ‘Activists Cheer On EU’s “Right To An Explanation” For Algorithmic Decisions, But How Will It Work When There’s Nothing To Explain?’ <https://www.techdirt.com/articles/20160708/11040034922/activists-cheer-eus-right-to-explanation-algorithmic-decisions-how-will-it-work-when-theres-nothing-to-explain.shtml> accessed 10 January 2018.
157 Metz points out that ‘[d]eep neural nets depend on vast amounts of data, and they generate complex
algorithms that can be opaque even to those who put these systems in place.’ See Cade Metz (n 142).
Compare also Bryce Goodman and Seth Flaxman (n 7).
158 Compare ‘Artificial Intelligence, Robotics, Privacy and Data Protection’ Room document for the 38th
International Conference of Data Protection and Privacy Commissioners, October 2016.