
A User’s Guide to Data Protection: Law and Policy


Dedication

To SG & A
A User’s Guide to Data Protection: Law and Policy

Fourth edition

Dr Paul Lambert BA, LLB, LLM, TMA, CTMA, PhD


Adjunct Lecturer, Lawyer, Certified Data Protection Officer, Consultant
Visiting Research Fellow, Institute of Advanced Legal
Studies, University of London
BLOOMSBURY PROFESSIONAL
Bloomsbury Publishing Plc
41–43 Boltro Road, Haywards Heath, RH16 1BJ, UK

BLOOMSBURY and the Diana logo are trademarks of Bloomsbury Publishing Plc
First published in Great Britain 2020
Copyright © Bloomsbury Professional 2020
All rights reserved. No part of this publication may be reproduced or transmitted
in any form or by any means, electronic or mechanical, including photocopying,
recording, or any information storage or retrieval system, without prior permission
in writing from the publishers.
While every care has been taken to ensure the accuracy of this work, no responsibility
for loss or damage occasioned to any person acting or refraining from action as a
result of any statement in it can be accepted by the authors, editors or publishers.
All UK Government legislation and other public sector information used in the
work is Crown Copyright ©. All House of Lords and House of Commons information
used in the work is Parliamentary Copyright ©. This information is reused under the
terms of the Open Government Licence v3.0 (http://www.nationalarchives.gov.uk/
doc/open-government-licence/version/3) except where otherwise stated.
All Eur-lex material used in the work is © European Union,
http://eur-lex.europa.eu/, 1998–2020.

British Library Cataloguing-in-Publication Data


A catalogue record for this book is available from the British Library.
ISBN: PB: 978 1 52651 570 4
ePDF: 978 1 52651 572 8
ePub: 978 1 52651 571 1
Typeset by Compuscript Ltd, Shannon

To find out more about our authors and books visit
www.bloomsburyprofessional.com. Here you will find extracts, author information,
details of forthcoming events and the option to sign up for our newsletters.
Contents

Abbreviations xvii
Table of Cases xxi
Table of Statutes xxv
Table of Statutory Instruments xxix
Table of EU Regulations xxxiii
Table of European Directives xli
Table of Treaties, Conventions and Agreements xliii

Part 1   Data Protection: How to Comply with the Data Protection Regime

Chapter 1   Data Protection 3
What is Data Protection? 3
The Importance of Data Protection 4
The Data Protection Rules 9
Data Protection Rules 11
Summary Data Protection Rules 11
General Criteria for Data Processing 13
Data Protection Overview 14
Lawful Processing 15
Definitions 16

Chapter 2   Sources of Data Protection Law 17
Introduction 17
UK DPA 2018 17
UK Secondary Legislation 18
EU Data Protection Law 22
Case Law 23
ICO Guides 23
ICO Determinations 25
Legal Textbooks 25
Legal Journals 29
EDPB 29


European Data Protection Supervisor 30
Council of Europe 30
Other Data Protection Authorities 31
Other Official Sources 31
Key/Topical Issues 32
Data Protection Websites and Blogs 33
Other Laws 33
Conferences 33
Reference 34

Chapter 3   Definitions 35
Introduction 35
DPA Definitions 35
GDPR Definitions 38
Two Categories of Personal Data 42
Conclusion 46

Chapter 4   History and Data Protection 47
Introduction 47
History of Data Protection 48
Data Protection Act 50
Legal Instruments 51
GDPR 54
Conclusion 58

Chapter 5   Principles 61
Introduction 61
When Data Protection Provisions Apply 61
Fair Processing Requirements 62
Principles of Data Protection 62

Chapter 6   Ordinary Personal Data Lawful Processing Conditions 65
Introduction 65
General Lawful Processing Conditions 65
Special Personal Data Lawful Processing Conditions 66

Chapter 7   Processing Pre-Conditions: Prior Information Requirements and Transparency 69
Introduction 69
Prior Information Requirements under the EU General Data Protection Regulation (GDPR) 69
Conclusion 73


Chapter 8   Exemptions 75
Introduction 75
Exemptions under the DPA 2018 75
Exemptions under the GDPR 76
Conclusion 77

Chapter 9   Individual Data Subject Rights 79
Introduction 79
Principles of Data Protection 80
Rights for Individual Data Subjects 81
Recipients of Right 83
Access Right under the DPA 2018 83
Access Right 85
GDPR: Rectification and Erasure 88
Right to Data Portability 90
Automated Individual Decision Making Right 91
Compensation for Data Subjects 93
DPA: Requiring Data Disclosure 95
Jurisdiction 95
Complaints to ICO 96
Organisational Data Protection Group 96
Court Remedies 98
Different Supervisory Authorities 98
Plaintiff Choice of Jurisdictions and Courts 98
Compensation 98
Penalties 99
Sanctions 99
GDPR: Right to Portability 100
GDPR: Right to Object, Automated Decisions and Profiling 100
GDPR Automated Individual Decision Making, Including Profiling 101
Conclusion 102
Cases to Consider 102

Chapter 10   Time Limits for Compliance 107
Introduction 107
Time Limits 107
Conclusion 110

Chapter 11   Enforcement and Penalties for Non-Compliance 111
Introduction 111
Breaches, Offences and Penalties 111
Criminal Offences 112
Other Consequences of Breach 112
ICO Monetary Penalties 114
GDPR Changes re Fines and Prosecution 114
Civil Sanctions under the DPA 117
Remedies, Liability and Penalties under the GDPR 125
Powers and Functions of the ICO 128
ICO Data Enforcement, Loss/Data Breaches, Fines and Convictions 129
Conclusion 154

Chapter 12   Security of Personal Data 155
Introduction 155
Appropriate Security Measures 156
Ensuring Appropriate Security Measures 157
Security under the EDPB 159
Security under the GDPR 160
Organisational Security Awareness 165
Organisational Security Measures 168
Raising Awareness 170
ICO Guidance 171
ICO and Security Breaches 173
Disposal of Computer Hardware 175
Conclusion 176

Chapter 13   Outsourcing and Data Processors 177
Introduction 177
Processors and Data Security 178
Engaging Processors 178
Relying on Third Party Processors 179
Conclusion 180

Part 2   Inward Facing Organisational DP Obligations

Chapter 14   Processing Employee Personal Data 183
Introduction 183
New Inward Facing Changes 184
Data Protection Officer (DPO) 186
Inward Facing Issues 189
Those Covered 189
Compliance with the Principles of Data Protection 190
Ordinary Personal Data Lawful Processing Conditions 190
Lawful Processing and Organisation’s Employees 191
Special Personal Data Lawful Processing Conditions 192
Special Data Lawful Processing and Organisation’s Employees 193
ICO Codes 194
Employees and Security 195
Data Access Right of Employees 197
Conclusion 197

Chapter 15   Employee Data Protection Rights 199
Introduction 199
The Data Protection Rights of Employees 199
Rights Under the GDPR 200
Conclusion 206

Chapter 16   Employee Considerations 207
Introduction 207
Contract 207
Policies 207
Data Protection Policy 208
Internet Usage Policy 208
Mobile and Device Usage Policies 208
Vehicle Use Policy 208
Transfers of Undertaking 209
Evidence 209
Enforceability 210
Data Breach 210
Employee Data Organisations 210
Location 211
Conclusion 211

Chapter 17   Employee Monitoring Issues 213
Introduction 213
Sample Legal Issues Arising 213
Employee Misuse of Email, Internet, etc 214
Contract 216
Employment Equality 216
Harassment 217
Online Abuse 217
Offline Abuse 217
Child Pornography 218
Dealing with the Employee Risks 218
Employee Corporate Communications Usage Policies 218
Focus of Organisational Communications Usage Policies 219
Data Protection and Employee Monitoring 220
Human Right 221
Application of Data Protection Regime 222
ILO Code 223
EDPB/WP29 223
Employment Contracts, Terms, Policies 226
Processing Compliance Rules 226
The Rights of Employee Data Subjects 228
Monitoring Case 229
Conclusion 231

Part 3   Outward Facing Organisational DP Obligations

Chapter 18   Outward Facing Issues 235
Introduction 235
New Outward Facing Changes 236
Data Protection Officer 239
Data Protection by Design and by Default 241
Types of Outward Facing Personal Data 241
How to be Outward Facing Compliant 242
Compliance with the Outward Facing Principles 243
Customers, etc, and Ordinary Personal Data Lawful Processing Conditions 244
Customers, etc, and Special Personal Data Lawful Processing Conditions 246
Customers, etc, and Security Requirements 248
Direct Marketing 249
Consequences of Non-Compliance 249
Users Versus Customers 250
Conclusion 251

Chapter 19   Data Protection and Privacy by Design 253
Introduction 253
Background 253
Principles of PbD 254
GDPR 255
ICO 256
EDPB 260
Commercial Adoption 260
Conclusion 261

Chapter 20   Enforcement Powers 263
Introduction 263
Enforcement Notices 263
Assessment Notices 265
Powers of Entry and Inspection 265
Request for Audit 265
Information Notices 265
Information Orders 266
Failure to Comply 266
Unlawful Obtaining Etc of Personal Data 266
Re-identifying and De-identified Personal Data 266
Re-identification and Testing 267
Power of ICO to Impose Monetary Penalty 267
Prohibition of Requirement to Produce Certain Records 268
Tasks 268
Powers 270
General Conditions for Imposing Administrative Fines 272
Penalties 274
Conclusion 274

Chapter 21   Transfers of Personal Data 277
Introduction 277
Transfer Ban 278
Adequate Protection Exception 278
Exceptions 279
Creating Adequacy 281
Binding Corporate Rules 281
GDPR: The New Transfers Regime 283
Issues 291
Establishing if the Ban Applies 292
Checklist for Compliance 293
Brexit 294
Conclusion 294

Chapter 22   ePrivacy and Electronic Communications 297
Introduction 297
Background 298
Scope of the ePD 298
Marketing 299
Marketing Protection for Organisations 300
Conclusion 307

Chapter 23   Electronic Direct Marketing and Spam 309
Introduction 309
Direct Marketing (DM) 310
PECR 313
ICO Monetary Penalties 317
Civil Sanctions 317
Call and Fax Opt Out Registers 318
The Spam Problem 319
Related Issues 319
Cases 320
Conclusion 320


Part 4   New UK Regime

Chapter 24   Background to the New UK Regime 323
Introduction 323
Brexit, DPA, and EU 323
Queen’s Speech 324
Background Briefing Document 324
Digital Charter and Internet Safety 325
The Ministerial Statement 326
‘Final’ Brexit Negotiations in Transition Period 328
Brexit Guides 330
Data and the European Union (Withdrawal) Act 2018 335
EUWA Details 336
EUWA Official Explanatory Commentary 339
New ICO Guidance Update 340
Preparation 342
EDPS Guidance 343
Commission 345
EDPB 346
Brexit and EU (Withdrawal Agreement) Act 2020 347
Conclusion 349

Chapter 25   The New Data Protection Act 351
Introduction 351
Repeal 351
Breakdown 351
Specific Changes from GDPR 352
Comment 355
Future 356

Part 5   New EU Regime

Chapter 26   New Regime 359
Introduction 359
Formal Nature of Regulations and Directives 359
Review Policy 360
Importance 362
Fundamental Right 363
Innovations 364
Enhanced Provisions and Changes 367
The New Data Protection Regime 372
Main Provisions and Changes of GDPR 375
Communicating Data Breach to Data Subject 423
Data Protection Officer 423
Conclusion 433


Part 6   Particular Issues

Chapter 27   Data Breach 437
Introduction 437
Data Breach Incidents in Context 437
Notification of a Data Breach to Supervisory Authority 438
Communication of a Data Breach to Data Subject 439
Employee Data Breaches 440
Notification Timelines 440
Notification Processes 440
Data Security Standards 441
Incident Response 442
Conclusion 442

Chapter 28   Data Protection Impact Assessment 443
Data Protection Impact Assessment and Prior Consultation 443
Data Protection Impact Assessment 444
Prior Consultation 450
Conclusion 451

Chapter 29   Social Media 453
Introduction 453
Controllers and Joint Controllers 455
Investigations 456
Social Media and Leveson 457
Social Media Data Transfers: Processors 458
Apps: Social Media Data Transfers 458
Awareness 458
Tagging and Identification 459
Transparency and User Friendly Tools 460
Abuse, Attacks, Threats, Trolling, Victims 460
Employment and Social Media 462
Electronic Evidence 463
The Rights of Data Subjects 464
Consent and Social Media 465
Website Discretion, Liability and Obligations 468
Third Party Controllers and EU Data Subjects 468
Specific Processing Risks 469
Rights 469
GDPR 470

Chapter 30   Leveson, the Press and Data Protection 487
Introduction 487
DPA 1998, s 32 487
Leveson Recommendations 489
Comparison 492
Conclusion 494

Chapter 31   Data Protection Officer 495
Introduction 495
New Role of DPO 495
Tasks and Role 496
Summary 498
Chapter 32   Brexit, Privacy Shield and Schrems 501
Introduction 501
Issues and Questions 503
Privacy Shield 506
Standard Contract Clauses 509
Conclusion 513
Chapter 33   Other Data Protection Issues 515
Introduction 515
New Regime 515
Medical and Health Data 516
Genome Data 517
Body Scanners 518
Investigation, Discovery and Evidence 518
Cloud 519
New Hardware Devices, New Software 519
Internet of Things 520
On-Site/Off-Site 520
Online Abuse 521
Drones 521
Increasing Actions 522
AI, Big Data and Data Ethics 522
Codes and Certification 523
Politics 523
Conclusion 524

Appendices 525
Reference Links 525
Legislative Links 525
Forms and Documents Links 525
Complying with Data Protection 525
Objections to Marketing 526
Audit Checklist 527
Procedures 530

Index 533

‘Our measures are designed to support businesses in their use of data,
and give consumers the confidence that their data is protected and those
who misuse it will be held to account.’1

‘We are pleased the government recognises the importance of data pro-
tection, its central role in increasing trust and confidence in the digi-
tal economy and the benefits the enhanced protections will bring to the
public.’2

‘Technology has … altered the way we work. It is no longer really necessary
to go into “an office” to make viable contribution to an organisation. We can
work at home, in coffee bars, in airports, on trains using wifi and 3G telephone
services utilising an increasingly diverse range of devices from laptops to tablet
computers to smart phones in order to access emails and “attend” online
meetings.’1

‘[P]ersonal data … are more and more processed and archived.’3

The importance of consent4 cannot be overestimated.

‘There is an inherent security issue with many of these online exchanges
which is often overlooked. Emails are not particularly secure.’5

‘A blog [or social networking post] is a bit like a tattoo: a good idea at
the time but you might live to regret it.’6

1 Minister of State for Digital, Matt Hancock, referred to in ‘Government to Strengthen
UK Data Protection Law,’ Department for Digital, Culture, Media & Sport and The
Rt Hon Matt Hancock MP, 7 August 2017.
2 Information Commissioner Elizabeth Denham, referred to in ‘Government to
Strengthen UK Data Protection Law,’ Department for Digital, Culture, Media & Sport
and The Rt Hon Matt Hancock MP, 7 August 2017.
3 Davies, C, Editorial, Communications Law, (2012)(17), pp 38–39.
4 Costa, L and Poullet, Y, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review, (2012)(28), pp 254–262, at 256.
5 Ferretti, F, ‘A European Perspective on Data Processing Consent Through the
Re-Conceptualisation of European Data Protection’s Looking Glass After the Lisbon
Treaty: Taking Rights Seriously,’ European Review of Private Law, (2012)(20),
pp 473–506.
6 Davies, C, Editorial, Communications Law, (2012)(17), pp 38–39.

Abbreviations

These abbreviations shall be used throughout.


DPA 2018: UK Data Protection Act 2018;
DPD: EU Data Protection Directive 1995
(Directive 95/46/EC of the European Parliament
and of the Council of 24 October 1995 on the
protection of individuals with regard to the
processing of personal data and on the free
movement of such data);
GDPR: EU General Data Protection Regulation;
Regulation (EU) 2016/679 of the European
Parliament and of the Council of 27 April 2016
on the protection of natural persons with regard
to the processing of personal data and on the free
movement of such data, and repealing Directive
95/46/EC (General Data Protection Regulation)
(Text with EEA relevance) OJ L 119, 4.5.2016,
p 1–88;
(The GDPR replaces the DPD and is directly
effective throughout the EU without the need for
separate sets of national implementing legislation.
It was published in May 2016 with a two-year
implementation period and applied from 25 May 2018);
ePD: Directive 2002/58/EC of the European Parliament
and of the Council of 12 July 2002 concerning
the processing of personal data and the protection
of privacy in the electronic communications
sector (Directive on privacy and electronic
communications). This is also known as the


ePrivacy Directive (ePD) (being replaced by the
ePrivacy Regulation);
ePR: the ePrivacy Regulation replacing the ePD (Proposal
for a Regulation of the European Parliament
and of the Council concerning the respect for
private life and the protection of personal data in
electronic communications and repealing Directive
2002/58/EC (Regulation on Privacy and Electronic
Communications));
WP29: Article 29 Working Party on Data Protection
(replaced by EDPB);
EDPB: European Data Protection Board (pursuant to the
GDPR);
Personal data: Any data or information identifying or relating to
the individual data subject (see DPD and GDPR
detailed definitions below);
Data Subject: The individual to whom the personal data relates
(see DPD and GDPR detailed definitions below);
Controller: The organisation collecting, processing and
holding the personal data (previously called the
Data Controller) (see DPD and GDPR detailed
definitions below);
Processor: An outsourced third party organisation or related
entity carrying out certain defined outsourced
activities for and on behalf of the main data
controller with the personal data eg outsourced
payroll, outsourced direct marketing, etc.
(previously called the Data Processor) (see DPD
and GDPR detailed definitions below);
DPO: Data Protection Officer;
ICO: Information Commissioner’s Office ie the main
UK data protection supervisory authority;
Member State: means a Member State of the European Union (EU);
PECR: Privacy and Electronic Communications
(EC Directive) Regulations 2003 (as updated
following the ePrivacy Regulation);


PECR Amendment: Privacy and Electronic Communications
(EC Directive) (Amendment) Regulations 2011;
Commission: means the EU Commission;
EEA: means the European Economic Area, comprising
the EU plus Iceland, Liechtenstein and Norway.
(Note, Switzerland has a similar type arrangement
with the EU).

Table of Cases

AB v Bragg Communications (2010) NSSC 215; revs’d 2011 NSCA 26;
revs’d (2012) SCC 46 … 9.10

Bărbulescu v Romania (61496/08) [2016] IRLR 235, 41 BHRC 44 … 9.32; 17.24
Breyer v Germany (Case C-582/14) [2017] 1 WLR 1569, [2017] 2 CMLR 3,
[2017] CEC 691 … 9.32
British Gas v Data Protection Registrar [1997–98] Info TLR 393,
[1998] UKIT DA98 … 9.32; 17.22

CCN Systems v Data Protection Registrar [1991] UKIT DA90 … 9.32
Camera di Commercio, Industria, Artigianato e Agricoltura di Lecce v Manni
(Case C-398/15) [2018] Bus LR 25, [2017] 3 CMLR 18, [2017] CEC 1282 … 9.32
Campbell v Mirror Group Newspapers Ltd [2004] UKHL 22, [2004] 2 AC 457,
[2004] 2 WLR 1232 … 9.32
Commission v Bavarian Lager (Case C-28/08P) [2011] Bus LR 867,
[2010] 6 WLUK 672, [2011] 1 CMLR 1 … 9.32
Common Services Agency v Scottish Information Comr [2008] UKHL 47,
[2008] 1 WLR 1550, [2008] 4 All ER 851 … 9.32

Data Protection Commissioner v Facebook Ireland and Maximillian Schrems,
Schrems II (Case C-311/18) ECLI:EU:C:2020:559, 16 July 2020 … 32.1–32.9
Digital Rights Ireland Ltd v Minister for Communications, Marine & Natural
Resources (Case C-293/12); Proceedings brought by Kärntner
Landesregierung (Case C-594/12) [2015] QB 127, [2014] 3 WLR 1067,
[2014] 2 All ER (Comm) … 9.21, 9.32; 11.03, 11.04; 21.17; 32.1–32.2
Douglas v Hello! (No 6) [2005] EWCA Civ 595, [2006] QB 125,
[2005] 3 WLR 881 … 9.32
Durham County Council v Dunn [2012] EWCA Civ 1654,
[2013] 1 WLR 2305, [2013] 2 All ER 213 … 9.05, 9.32


European Commission v Bavarian Lager Co Ltd (Case C-28/08P)
[2011] Bus LR 867, [2011] 1 CMLR 1, [2011] All ER (EC) 1007 … 9.32

Facebook Ireland (Case C-311/18) (case in progress) … 9.21
Fashion ID GmbH & Co KG v Verbraucherzentrale NRW eV (Case C-40/17)
[2020] 1 WLR 969, [2019] 7 WLUK 445, [2020] 1 CMLR 16 … 26.28; 29.02
Federation of German Consumer Organisations (VZBV) v Facebook
(14 January 2016, German Federal Court of Justice) … 9.10; 11.14; 29.14

GC v Commission Nationale de l’Informatique et des Libertés (CNIL)
(Déréférencement de données sensibles) (Case C-136/17) [2020]
1 WLR 1949, [2020] 1 All ER (Comm) 866, [2019] 9 WLUK 277 … 1.03; 26.60
Google Inc v Agencia Espanola de Proteccion de Datos (AEPD)
(Case C-131/12) [2014] QB 1022, [2014] 3 WLR 659,
[2014] 2 All ER (Comm) 301 … 1.03; 9.01, 9.32; 26.58, 26.60
Google LLC, successor in law to Google Inc v Commission Nationale de
l’Informatique et des Libertés (CNIL) (Case C-507/17) [2020]
1 WLR 1993, [2020] 1 All ER (Comm) 829, [2019] 9 WLUK 276 … 1.03; 26.60
Google v Vidal-Hall [2014] EWHC 13 (QB), [2014] 1 WLR 4155,
[2014] 1 CLC 201; aff’d [2015] EWCA Civ 311, [2016] QB 1003,
[2015] 3 WLR 409 … 9.32; 29.04

Halford v UK (Application 20605/92) [1997] IRLR 471, (1997)
24 EHRR 523, 3 BHRC 31 … 9.32; 17.15

Information Comr v ACT Response (30 October 2018) … 23.07
Information Comr v AMS Marketing (1 August 2018) … 23.07
Information Comr v Avalon Direct (16 April 2019) … 23.07
Information Comr v Black Lion Marketing Ltd (27 March 2020) … 18.12
Information Comr v Bounty (UK) (11 April 2014) … 1.11, 1.19; 9.32
Information Comr v CRDNN Ltd (2 March 2020) … 1.11; 9.32; 18.12
Information Comr v Cathay Pacific Airways Ltd (4 March 2020) … 1.11; 9.32; 17.03
Information Comr v Cullen (24 June 2019) … 1.11; 9.17, 9.32
Information Comr v DM Design Bedrooms (23 November 2018) … 23.07
Information Comr v DSG Retail (9 January 2020) … 1.11; 9.32; 17.03
Information Comr v Davies (Laura) (December 2012) … 9.32
Information Comr v Doorstep Dispensaree Ltd (20 December 2019) … 1.11; 12.02; 17.03
Information Comr v Facebook (2018) … 9.32
Information Comr v Grove Pension Solutions Ltd (26 March 2019) … 9.32
Information Comr v Hudson Bay Finance Ltd (12 August 2019) … 9.05


Information Comr v Independent Inquiry into Child Sexual Abuse (18 July 2017) … 9.32
Information Comr v Independent Inquiry into Child Sexual Abuse
(18 September 2018) … 9.32
Information Comr v London Borough of Lewisham (6 September 2018) … 9.05
Information Comr v Making it Easy Ltd (2 August 2019) … 23.07
Information Comr v Niebel; Information Comr v McNeish [2014]
UKUT 255 (AAC) … 1.11; 9.17, 9.32; 18.12
Information Comr v Prudential (6 November 2012) … 1.11; 9.17, 9.32; 11.12
Information Comr v Secure Home Systems (31 October 2018) … 23.07
Information Comr v Smart Home Protection Ltd (13 June 2019) … 23.07
Information Comr v Shaw (5 December 2019) … 1.11
Information Comr v Sjipsey (2 December 2019) … 1.11
Information Comr v Solartech North East Ltd (23 November 2018) … 23.07
Information Comr v Sony (July 2013) … 9.32
Information Comr v Superior Style Home Improvements Ltd
(17 September 2019) … 23.07
Information Comr v True Visions Productions Ltd (10 April 2019) … 1.10; 9.32; 18.12
Information Comr v Uber (26 November 2018) … 1.11
Information Comr v Yahoo! UK (21 May 2018) … 9.32
Information Comr v Young (13 March 2020) … 9.05

Lindqvist, criminal proceedings against; Lindqvist v Aklagarkammaren i
Jonkoping (Case C-101/01) [2004] QB 1014, [2004] 2 WLR 1385,
[2003] ECR I-12971 … 9.32

McCall v Facebook (No 10-16380) (9th Cir, 20 September 2012) … 9.32; 11.12; 29.03
Maatschap Toeters & MC Verberk v Productschap Vee en Vlees
(Case C-171/03) … 10.01
Microsoft Corpn v McDonald (t/a Bizads) [2006] EWHC 3410 (Ch),
[2007] Bus LR 548, [2007] Info TLR 300 … 9.17, 9.32
Ministerio Fiscal, proceedings brought by (Case C-207/16) [2019] 1 WLR 3121,
[2018] 10 WLUK 1, [2019] 1 CMLR 31 … 1.03
Mosley v Google Inc [2015] EWHC 59 (QB), [2015] 2 CMLR 22,
[2015] EMLR 11 … 9.32
Motion Picture Association v BT (judgment 28 July 2011) … 9.32

NT1 & NT2 v Google LLC [2018] EWHC 799 (QB), [2018]
3 All ER 581, [2018] EMLR 18, [2018] HRLR 13 … 1.03; 9.32; 11.14; 26.60
Nowak v Data Protection Comr (Case C-434/16) [2018] 1 WLR 3505,
[2018] 2 CMLR 21 … 9.32

Puškár v Finančné riaditeľstvo Slovenskej republiky & Kriminálny úrad
finančnej správy (Case C-73/16) … 9.32


Robertson (Brian Reed Beetson) [2001] EWHC Admin 915 … 9.32
Rugby Football Union v Viagogo Ltd [2012] UKSC 55, [2012] 1 WLR 3333,
[2013] 1 All ER 928 … 9.32

Schrems v Data Protection Comr (Case C-362/14) [2016] QB 527,
[2016] 2 WLR 873, [2016] 2 CMLR 2 … 1.19; 9.21, 9.32; 11.04;
26.44, 26.46, 26.48, 26.57; 32.1–32.2

Tele2 Sverige v Swedish Post & Telecom Authority; R (on the application
of Watson) v Secretary of State for the Home Department
(Cases C-203/15 & C-698/15) [2017] QB 771, [2017] 2 WLR 1289,
[2017] 2 CMLR 30 … 9.32

Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v
Wirtschaftsakademie Schleswig-Holstein GmbH (Case C-210/16)
[2019] 1 WLR 119, [2018] 6 WLUK 11, [2018] 3 CMLR 32 … 26.28; 29.02

Valsts policijas Rīgas reģiona pārvaldes Kārtības policijas pārvalde v Rīgas
pašvaldības SIA Rīgas satiksme (Case C-13/16) [2017] 4 WLR 97,
[2017] 3 CMLR 39, [2018] CEC 255 … 9.32
Various Claimants v Wm Morrison Supermarkets plc [2018] EWHC 1123 (QB) … 9.32; 11.12
Various Claimants v Wm Morrisons Supermarket plc [2017] EWHC 3113 (QB),
[2018] 3 WLR 691, [2017] 12 WLUK 9 … 11.13
Verein fur Konsumenteninformation v Amazon EU Sarl (Case C-191/15)
[2017] QB 252, [2017] 2 WLR 19 … 9.32
Von Hannover v Germany (Applications 40660/08 & 60641/08) [2012]
EMLR 16, (2012) 55 EHRR 15, 32 BHRC 527 … 9.32

WM Morrison Supermarkets plc v Various Claimants [2020] UKSC 12,
[2020] 2 WLR 941, [2020] 3 WLUK 454 … 9.32; 11.13; 17.02; 27.05
Weltimmo sro v Nemzeti Adatvédelmi és Információszabadság Hatóság
(Case C-230/14) [2016] 1 WLR 863 … 9.32

XY v Facebook Ireland Ltd [2012] NIQB 96 … 9.10

Table of Statutes

Communications Act 2003 … 2.03
Coroners and Justice Act 2009
  s 124(1)(a)(i) … 30.05
Criminal Damage Act 1971 … 12.20
Criminal Justice Act 1988 … 17.09
Criminal Justice and Immigration Act 2008 … 23.14
  s 77 … 23.14; 30.03, 30.05
  s 78 … 30.03, 30.05
Criminal Justice and Public Order Act 1994 … 12.20; 17.09
Data Protection Act 1984 … 4.02, 4.03
Data Protection Act 1998 … 1.01; 4.02, 4.03; 5.05; 8.01, 8.05; 9.05;
  11.26, 11.29; 12.02; 21.02; 24.04; 25.02; 30.03, 30.06
  s 7 … 30.06
  s 7(9) … 30.06
  s 10 … 30.06
  s 10(4) … 30.06
  s 12 … 30.06
  s 12(8) … 30.06
  s 13 … 30.03, 30.06
  s 14 … 30.06
  s 14(1)–(3) … 30.06
  s 32 … 30.02, 30.03, 30.06
  s 32(1) … 30.06
  s 32(1)(b) … 30.06
  s 32(4) … 30.03, 30.06
  s 32(5)(b) … 30.03, 30.06
  s 44 … 30.03, 30.06
  s 45 … 30.03, 30.06
  s 46 … 30.03, 30.06
  s 52(1) … 30.04
  s 55 … 30.03, 30.05
  Sch 1
    Pt I … 5.05
      para 7 … 12.02
    Pt II
      para 2(1)(a) … 30.03, 30.06
      para 10 … 12.16
Data Protection Act 2018 … 1.01, 1.11, 1.13, 1.18, 1.21; 2.01, 2.02, 2.03,
  2.04, 2.06, 2.15, 2.19; 3.01, 3.05; 4.02, 4.03; 5.01, 5.05; 6.02; 8.01,
  8.02, 8.05; 9.01, 9.05, 9.19; 11.05, 11.12, 11.28, 11.29; 12.17, 12.20;
  18.02; 20.13, 20.15; 24.02, 24.06, 24.07, 24.09, 24.10, 24.11, 24.12,
  24.13; 25.01, 25.03, 25.05, 25.06
  s 1 … 3.02
  s 2(1) … 5.04
  s 3 … 3.02
  s 3(14)(c) … 3.02
  Pt 2 (ss 4–28) … 25.04
  s 5 … 3.02
  s 5(7) … 3.02
  Pt 2 Ch 2 (ss 6–20) … 3.02; 5.02
  s 6 … 3.02
  s 6(2) … 3.02; 25.04
  s 7, 8 … 25.04
  s 9 … 18.02; 25.04; 26.20; 29.14, 29.26
  s 10 … 3.02, 3.05; 14.09
  s 10(2), (3), (5) … 3.05
  s 11 … 3.05; 14.09; 25.04
  s 13–15 … 25.04
  s 17 … 11.29
  s 19 … 18.02; 25.04
  Pt 2 Ch 3 (ss 21–28) … 3.02; 25.03, 25.05
Data Protection Act 2018 – contd
Pt 3 (ss 29–81)���������������������� 3.02; 8.02; s 199, 200���������������������������11.02; 18.02
25.03, 25.05 s 204������������������������������������������������3.02
s 29(2)����������������������������������������������3.02 s 205������������������������������������������������3.02
s 32��������������������������������������������������3.02 s 207�������������������������������������� 5.02; 9.19
s 47������������������������������������������������11.29 s 207(7)��������������������������������������������5.02
s 55������������������������������������������������11.05 s 209������������������������������������ 3.02; 25.04
Pt 4 (ss 82–113)�������������������� 3.02; 8.02; s 210������������������������������������ 3.02; 25.04
25.03, 25.05 Sch 1���������������������������������� 14.09; 18.02
s 82(3)����������������������������������������������3.02 Pt 1 (paras 1–4)��������������� 3.05; 14.02
s 83��������������������������������������������������3.02 Pt 2 (paras 5–28)�������������������������3.05
s 99(3)��������������������������������������������10.01 Pt 3 (paras 29–37)�����������������������3.05
s 115–117���������������������������������������20.17 Sch 2����������������������������������������������25.04
s 121–128��������������������������������������18.02 Pt 1 (paras 1–5)��������������� 8.02; 25.04
s 129����������������������������������������������20.08 Pt 2 (paras 6–15)�����������������������25.04
s 139–141��������������������������������������20.17 Pt 3 (paras 16, 17)����������� 8.02; 25.04
Pt 6 (ss 142–181)����������������11.26; 18.02 Pt 4 (paras 18–25)�����������������������8.02
s 142�����������������������������������11.26; 20.09 Pt 5 (para 26)������������������� 8.02; 25.04
s 143, 144��������������������������������������11.26 Pt 6 (paras 27, 28)���������������������25.04
s 145�����������������������������������11.26; 20.10 Sch 3������������������������������������ 8.02; 25.04
s 146�����������������������������������11.26; 20.04 Pt 1 (para 1)���������������������������������8.02
s 146(8), (9)�����������������������������������10.01 Pt 2 (paras 2–6)���������������������������8.02
s 147�����������������������������������11.26; 20.05 Pt 3 (paras 7–12)�������������������������8.02
s 148�����������������������������������11.26; 20.06 Pt 4 (paras 13–20)�����������������������8.02
s 149����������������������������������������������11.26 Pt 5 (para 21)�������������������������������8.02
s 150����������������������������������������������11.26 Sch 4������������������������������������ 8.02; 25.04
s 150(2)������������������������������������������20.02 Sch 12��������������������������������������������20.17
s 151�����������������������������������11.26; 20.03 Sch 13��������������������������������������������20.17
s 152, 153��������������������������������11.26 Sch 15���������������������������������11.26; 20.07
s 154�����������������������������11.26; 20.07 Sch 17�������������������������������� 20.17; 30.07
s 155�����������������������������������11.26; 20.15 Sch 18����������������������������������������������9.18
s 155(2), (3)�����������������������������������20.15 Sch 19
s 156����������������������������������������������11.26 Pt 1 (paras 1–232)
s 157�����������������������������������11.26; 20.15 para 43�������������������������� 1.01; 4.02
s 158, 159, 165–167����������������������11.26 para 44����������������������������������25.02
s 168�����������������������������������11.12, 11.26 Data Retention and Investigatory
s 168(3)������������������������������������������11.12 Powers Act 2014�������������������������1.03
s 169�����������������������������������11.12, 11.26 European Communities
s 170�����������������������������������11.02; 20.12 Act 1972�������������������������24.14, 24.20
s 171������������������������11.02; 20.13, 20.14 s 2(1)����������������������������������������������24.14
s 172�����������������������������������11.02; 20.14 s 2(2)������������������������������������������������3.02
s 173����������������������������������������������11.02 European Union (Withdrawal)
s 177����������������������������������������������30.07 Act 2018������������������������24.07, 24.11,
s 179����������������������������������������������30.07 24.12, 24.14
s 180������������������������������������������������9.19 s 1��������������������������������������������������24.14
s 184������������������������� 9.18; 18.02; 20.16 s 2��������������������������������������������������24.14
s 185������������������������������������ 9.18; 18.02 s 3���������������������������������������24.13, 24.14
s 186����������������������������������������������18.02 s 3(1)����������������������������������������������24.13
s 187�������������� 9.21; 11.12; 18.02; 25.04 s 3(2)(a)������������������������������24.13, 24.14
s 188–190����������������� 9.21; 11.12; 18.02 s 5��������������������������������������������������24.13
s 196, 197���������������������������11.02; 18.02 s 5(1), (2), (4)��������������������������������24.13
s 198������������������������11.02, 11.14; 18.02 s 6��������������������������������������������������24.13
s 198(1)������������������������������������������11.14 s 6(1)����������������������������������������������24.13

European Union (Withdrawal) Fraud Act 2006�����������������������������������12.20
Act 2018 – contd Human Rights Act 1998��������������������17.14,
s 6(1)(b)�����������������������������������������24.13 17.15;
s 6(2), (4), (7)��������������������������������24.13 24.13; 26.05,
s 7��������������������������������������������������24.13 26.92; 30.02
s 7(1), (2)���������������������������������������24.13 Investigatory Powers Act 2016������������1.03
s 7(5)(f)������������������������������������������24.13 Northern Ireland Act 1998
s 7(6)����������������������������������������������24.13 s 75������������������������������������������������26.84
s 9(1)����������������������������������������������24.13 Protection from Harassment
s 9(3)(e)�����������������������������������������24.13 Act 1997������������������������������������17.06
s 9(4)����������������������������������������������24.13 Protection of Children Act 1978������ 12.20;
s 20�������������������������������������24.13, 24.14 17.09
s 20(4)��������������������������������������������24.13 Theft Act 1968�����������������������������������12.20
Sch 6����������������������������������������������24.14 UNITED STATES
European Union (Withdrawal Controlling the Assault of
Agreement) Act 2020������� 1.13; 2.02; Non-Solicited Pornography
24.07, 24.20; and Marketing (CAN-SPAM)
25.06; 26.02, 26.46 Act 2003������������������������������������23.18

Table of Statutory Instruments

Adoption Agency Regulations 1983, Data Protection (Corporate Finance
SI 1983/1964�������������������������������2.03 Exemption) Order 2000,
Adoption Rules 1984, SI 1984/265������2.03 SI 2000/184���������������������������������2.03
Banking Coordination (Second Council Data Protection (Conditions under
Directive) Regulations 1992, Paragraph 3 of Part II of
SI 1992/3218�������������������������������2.03 Schedule 1) Order 2000,
Civil Procedure Rules 1998, SI 2000/185���������������������������������2.03
SI 1998/3132�������������������������������2.03 Data Protection (Crown Appointments)
Consumer Credit (Credit Reference Order 2000, SI 2000/416�������������2.03
Agency) Regulations 1977, Data Protection (Designated Codes
SI 1977/329���������������������������������2.03 of Practice) Order 2000,
Consumer Credit (Credit Reference SI 2000/418���������������������������������2.03
Agency) Regulations 2000, Data Protection (Designated Codes
SI 2000/290���������������������������������2.03 of Practice) (No 2) Order 2000,
Consumer Protection (Distance Selling) SI 2000/1864�������������������������������2.03
Regulations 2000, Data Protection (Fees under
SI 2000/2334�������������������������������2.03 section 19(7)) Regulations 2000,
Data Protection Act 1998 SI 2000/187���������������������������������2.03
(Commencement) Order 2000, Data Protection (Functions of
SI 2000/183����������������������� 2.03; 4.03 Designated Authority) Order 2000,
Data Protection Act 1998 SI 2000/186���������������������������������2.03
(Commencement No 2) Order 2008, Data Protection (International
SI 2008/1592�������������������������������2.03 Co-operation) Order 2000,
Data Protection Act 2018 SI 2000/190���������������������������������2.03
(Commencement No 1 and Data Protection (Miscellaneous Subject
Transitional and Saving Provisions) Access Exemptions) Order 2000,
Regulations 2018, SI 2000/419���������������������������������2.03
SI 2018/625���������������������������������2.03 Data Protection (Miscellaneous Subject
Data Protection Act 2018 Access Exemptions)
(Commencement No 3) Regulations (Amendment) Order 2000,
2019, SI 2019/1434���������������������2.03 SI 2000/1865�������������������������������2.03
Data Protection (Charges and Information) Data Protection (Monetary Penalties)
(Amendment) Regulations 2019, (Maximum Penalty and Notices)
SI 2019/478���������������������������������2.03 Regulations 2010, SI 2010/31�����2.03
Data Protection (Charges and Data Protection Tribunal (National
Information) Regulations 2018, Security Appeals) Rules 2000,
SI 2018/480���������������������������������2.03 SI 2000/206���������������������������������2.03

Data Protection (Notification and Data Protection (Subject Access)
Notification Fees) Regulations 2000, (Fees and Miscellaneous Provisions)
SI 2000/188���������������������������������2.03 Regulations 2000,
Data Protection (Notification and SI 2000/191���������������������������������2.03
Notification Fees) (Amendment) Data Protection (Subject Access)
Regulations 2001, (Fees and Miscellaneous Provisions)
SI 2001/3214�������������������������������2.03 (Amendment) Regulations 2001,
Data Protection (Notification and SI 2001/3223�������������������������������2.03
Notification Fees) (Amendment) Data Protection (Subject Access
Regulations 2009, Modification) (Social Work)
SI 2009/1677�������������������������������2.03 (Amendment) Order 2005,
Data Protection, Privacy and Electronic SI 2005/467���������������������������������2.03
Communications (Amendments etc) Data Protection Tribunal (Enforcement
(EU Exit) (No 2) Regulations 2019, Appeals) Rules 2000,
SI 2019/485���������������������������������2.03 SI 2000/189���������������������������������2.03
Data Protection, Privacy and Electronic Data Protection Tribunal (National
Communications (Amendments etc) Security Appeals) Rules 2000,
(EU Exit) Regulations 2019, SI 2000/206���������������������������������2.03
SI 2019/419���������������������������������2.03 Data Protection Tribunal
Data Protection (Processing of Sensitive (National Security Appeals)
Personal Data) Order 2000, (Telecommunications) Rules 2000,
SI 2000/417���������������������������������2.03 SI 2000/731���������������������������������2.03
Data Protection (Processing of Sensitive Data Retention (EC Directive)
Personal Data) Order 2006, Regulations 2007,
SI 2006/2068�������������������������������2.03 SI 2007/2199�������������������������������2.03
Data Protection (Processing of Sensitive Education (School Records) Regulations
Personal Data) Order 2009, 1989, SI 1989/1261���������������������2.03
SI 2009/1811�������������������������������2.03 Education (Special Educational
Data Protection (Processing of Needs) Regulations 1994,
Sensitive Personal Data) (Elected SI 1994/1047�������������������������������2.03
Representatives) Order 2002, Electronic Commerce (EC Directive)
SI 2002/2905�������������������������������2.03 Regulations 2002,
Data Protection (Processing of SI 2002/2013������������������� 2.03; 12.20
Sensitive Personal Data) (Elected reg 7, 8�������������������������������������������23.12
Representatives) (Amendment) Electronic Communications
Order 2002, SI 2002/2905�����������2.03 (Universal Service) Order 2003,
Data Protection Registration Fee SI 2003/1904�������������������������������2.03
Order 1991, SI 1991/1142�����������2.03 Environmental Information Regulations
Data Protection (Subject Access 2004, SI 2004/3391���������������������2.03
Modification) (Health) Order 2000, European Union (Withdrawal)
SI 2000/413���������������������������������2.03 Act 2018 (Exit Day) (Amendment)
Data Protection (Subject Access Regulations 2019,
Modification) (Education) SI 2019/718�������������������������������24.13
Order 2000, SI 2000/414�������������2.03 Family Proceedings Courts (Children
Data Protection (Subject Access Act 1989) Rules 1991,
Modification) (Social Work) SI 1991/1395�������������������������������2.03
Order 2000, SI 2000/415�������������2.03 Family Proceedings Rules 1991,
Family Proceedings Rules 1991,
SI 1991/1247�������������������������������2.03
Freedom of Information and Data
Protection (Fees and Appropriate
Limits) Regulations 2004,

Health Professions Order 2001, Police Act 1997 (Criminal Records)
SI 2001/254���������������������������������2.03 (Amendment) Regulations 2007,
Health Service (Control of Patient SI 2007/700���������������������������������2.03
Information) Regulations 2002, Privacy and Electronic Communications
SI 2002/1438�������������������������������2.03 (EC Directive) Regulations 2003,
Information Tribunal (Enforcement SI 2003/2426������������������ 2.03; 11.29;
Appeals) Rules 2005, 12.20, 12.24;
SI 2005/14�����������������������������������2.03 18.12; 23.11,
Information Tribunal (Enforcement 23.13, 23.15;
Appeals) (Amendment) Rules 2002, 24.10
SI 2002/2722�������������������������������2.03 reg 2�����������������������������������������������23.12
Information Tribunal (Enforcement reg 19–21���������������������������������������23.11
Appeals) (Amendment) Rules 2005, reg 22���������������������������������������������23.11
SI 2005/450���������������������������������2.03 reg 22(2), (4)����������������������������������23.12
Information Tribunal (National Security reg 23, 24, 26���������������������������������23.11
Appeals) Rules 2005, Privacy and Electronic Communications
SI 2005/13�����������������������������������2.03 (EC Directive) (Amendment)
INSPIRE Regulations 2009, Regulations 2004,
SI 2009/3157�������������������������������2.03 SI 2004/1039������������������� 2.03; 12.20
Insurance Companies (Third Insurance reg 30(1)������������������������������������������9.17
Directives) Regulations 1994, Privacy and Electronic Communications
SI 1994/1696�������������������������������2.03 (EC Directive) (Amendment)
Investment Services Regulations 1995, Regulations 2011,
SI 1995/3275�������������������������������2.03 SI 2011/1208�������������������������������2.03
Magistrates’ Courts (Adoption) Rules Privacy and Electronic Communications
1984, SI 1984/611�����������������������2.03 (EC Directive) Regulations 2003,
Magistrates’ Courts (Children and SI 2003/2426�������������������������������2.03
Young Persons) Rules 1992, Protection of Freedoms Act 2012
SI 1992/2071�������������������������������2.03 (Destruction, Retention and Use
Magistrates’ Courts (Criminal Justice of Biometric Data) (Transitional,
Act 1991) (Miscellaneous Transitory and Saving Provisions)
Amendments) Rules 1992, (Amendment) Order 2015,
SI 1992/2072�������������������������������2.03 SI 2015/1739�������������������������������2.03
Network and Information Systems Protection of Freedoms Act 2012
(Amendment) Regulations 2018, (Destruction, Retention and Use
SI 2018/629�������������������������������12.22 of Biometric Data) (Transitional,
Network and Information Systems Transitory and Saving Provisions)
Regulations 2018, (Amendment) Order 2016,
SI 2018/506������������������� 12.22; 24.10 SI 2016/682���������������������������������2.03
Open-Ended Investment Companies Protection of Freedoms Act 2012
(Investment Companies with (Destruction, Retention and Use
Variable Capital) Regulations 1996, of Biometric Data) (Transitional,
SI 1996/2827�������������������������������2.03 Transitory and Saving Provisions)
Parental Orders (Human Fertilisation and (Amendment) Order 2018,
Embryology) Regulations 1994, SI 2018/657���������������������������������2.03
SI 1994/2767�������������������������������2.03 Regulations of Investigatory Powers
Pharmacists and Pharmacy (Interception of Communications:
Technicians Order 2007, Code of Practice) Order 2002,
SI 2007/289���������������������������������2.03 SI 2002/1693�������������������������������2.03
Police Act 1997 (Criminal Records) Regulation of Investigatory Powers
Regulations 2002, (Communications Data)
SI 2002/233���������������������������������2.03 Order 2003, SI 2003/3172�����������2.03

Regulation of Investigatory Powers Telecommunications (Data Protection
(Communications Data) (Additional and Privacy) Regulations 1999,
Functions and Amendment) SI 1999/2093�������������������������������2.03
Order 2006, SI 2006/1878�����������2.03 Telecommunications (Data Protection
Rehabilitation of Offenders (Exceptions) and Privacy) (Amendment)
(Amendment) Order 2001, Regulations 2000,
SI 2001/1192�������������������������������2.03 SI 2000/157���������������������������������2.03
Rehabilitation of Offenders (Exceptions) Telecommunications (Lawful
(Amendment) (No 2) Order 2001, Business Practice) (Interception
SI 2001/3816�������������������������������2.03 of Communications) Regulations
Representation of People (England and 2000, SI 2000/2699���������������������2.03
Wales) Regulations 2001, Transfer of Undertakings
SI 2001/341���������������������������������2.03 (Protection of Employment)
Representation of People (England and Regulations 2006,
Wales) (Amendment) Regulations SI 2006/246�������������������� 2.03; 16.08;
2001, SI 2001/1700���������������������2.03 26.84
Representation of People Regulations Unfair Terms in Consumer Contracts
1986, SI 1986/1081���������������������2.03 Regulations 1999,
Telecommunications (Data Protection SI 1999/2083�������������������������������2.03
and Privacy) (Direct Marketing) Waste Electrical and Electronic
Regulations 1998, Equipment Regulations 2006,
SI 1998/3170�������������������������������2.03 SI 2006/3289�������������������������������2.03

Table of EU Regulations

Reg 3886/92 Reg (EU) 2016/679 – contd
art 50a��������������������������������������������10.01 Recital 1��������� 4.05; 14.02; 18.02; 26.08
Reg (EU) 45/2001��������������������������������2.04 Recital 2, 3������������������������� 14.02; 18.02
Reg (EU) 910/2014������������������� 2.04, 2.05; Recital 4�������������������� 4.05; 14.02; 18.02
12.22; 24.10 Recital 5�����������������������������������������14.03
Reg (EU) 2016/679�����������1.01, 1.03, 1.11, Recital 6��������� 4.05; 14.02; 18.02; 21.08
1.12, 1.13, 1.21; 2.01, 2.03, 2.04, 2.06, Recital 7�������������������� 4.05; 10.01; 14.02
2.10, 2.12, 2.15; 3.01, 3.02; 4.02, Recital 8������������������ 14.02; 18.02; 26.08
4.03, 4.05, 4.08, 4.10; 5.01, 5.02, Recital 9�������������������� 4.05; 14.02; 18.02
5.03, 5.04; 6.02; 8.01, 8.02, 8.04, Recital 10������������������� 3.06; 4.05; 14.02;
8.05; 9.01, 9.03, 9.05, 9.06, 9.08, 18.02; 26.22
9.09, 9.10, 9.12, 9.13, 9.15, 9.16, Recital 11���������������� 14.02; 18.02; 26.08
9.19, 9.21, 9.23, 9.25, 9.26, 9.27, Recital 12����������������������������� 4.05; 14.02
9.29, 9.32; 10.01; 11.02, 11.04, Recital 13���������������������������� 4.05; 14.02;
11.06, 11.08, 11.10, 11.12,
11.16, 11.19, 11.22, 11.24, 11.30; Recital 14��������������������������� 14.02; 18.02
12.02, 12.07, 12.15, 12.17, 12.20; Recital 15������������������ 4.05; 14.02; 18.02
14.02, 14.03, 14.06, 14.07, 14.09; Recital 16��������� 4.05; 8.04; 14.02; 18.02
15.05, 15.06, 15.08, 15.10, 15.11, Recital 17��������������������������� 14.02; 18.02
15.12; 17.03; 18.02, 18.03; 19.05, Recital 18������������������������������� 4.05; 8.04
19.10, 19.11; 20.17, 20.18, 20.19, Recital 19��������������������������� 14.02; 18.02
20.20; 21.08; 22.01; 23.01, 23.14; Recital 21������������������������������� 4.05; 8.04
24.02, 24.04, 24.07, 24.09, 24.10, Recital 22��������� 4.05; 9.19; 14.02; 18.02
24.11, 24.12, 24.13, 24.15, 24.16, Recital 23������������������������������� 4.05; 9.19
24.17, 24.19; 25.03, 25.04, 25.05, Recital 26�����������������������������������������4.05
25.06; 26.01, 26.02, 26.03, 26.04, Recital 27������������������������������� 4.05; 8.04
26.05, 26.06, 26.07, 26.09, 26.10, Recital 28, 29�����������������������������������4.05
26.11, 26.14, 26.15, 26.17, 26.19, Recital 30������� 4.05; 14.02; 18.02; 26.08
26.20, 26.23, 26.27, 26.28, 26.29, Recital 32������� 4.05; 14.02; 26.08; 29.14
26.34, 26.38, 26.44, 26.45, 26.46, Recital 33��������������������������� 14.02; 18.02
26.52, 26.56, 26.66, 26.67, 26.70, Recital 34�����������������������������������������4.05
26.77, 26.78, 26.80, 26.81, 26.83, Recital 35, 36������������ 4.05; 14.02; 18.02
26.84, 26.90, 26.92; 27.01, 27.03, Recital 37�����������������������������������������4.05
27.10; 28.03, 28.09, 28.10; 29.01, Recital 38����������������� 4.05; 14.02; 18.02;
29.14, 29.17, 29.23, 29.30, 29.31, 26.21; 29.26
29.44; 31.03, 31.10; 33.02, 33.09, Recital 39����������������� 4.05; 14.02; 18.02;
33.11, 33.14, 33.15 26.08, 26.16

Reg (EU) 2016/679 – contd
Recital 40����������������� 4.05; 14.02; 18.02; Recital 104, 105������ 14.02; 18.02; 26.44
26.08, 26.19; 29.14 Recital 107, 108���������������� 14.02; 18.02;
Recital 41��������������������������� 14.02; 18.02 21.08; 26.44
Recital 42–45���������������������� 4.05; 14.02; Recital 109�������������� 14.02; 18.02; 26.44
18.02; 26.08 Recital 110, 111���������������� 14.02; 18.02;
Recital 46, 47���������������������� 4.05; 14.02; 21.08; 26.44
18.02; 26.08, 26.16 Recital 112��������������� 4.05; 14.02; 18.02;
Recital 48���������������� 14.02; 18.02; 21.08 21.08; 26.44
Recital 49������������������ 4.05; 14.02; 18.02 Recital 113, 114���������������� 14.02; 18.02;
Recital 50������������������ 4.05; 14.02; 18.02 21.08; 26.44
Recital 51–54���������������������� 3.06; 14.02; Recital 115������������������������� 21.08; 26.44
18.02; 26.22 Recital 116������������������������� 14.02; 18.02
Recital 55, 56�����������������������������������3.06 Recital 117���������������� 4.05; 14.02; 18.02
Recital 57������������������ 3.06; 14.02; 18.02 Recital 119������������������������� 14.02; 18.02
Recital 58������������������ 3.06; 14.02; 29.14 Recital 121, 122, 124, 126����������� 14.02;
Recital 59������� 3.06; 10.01; 14.02; 18.02 18.02
Recital 60������������������ 3.06; 14.02; 18.02 Recital 127, 129, 136, 137,
Recital 61�����������������������������������������3.06 138���������������������������� 14.02; 18.02
Recital 63������������������ 3.06; 14.02; 18.02 Recital 139���������������� 4.05; 14.02; 18.02
Recital 64��������������������������� 14.02; 18.02 Recital 140������������������������� 14.02; 18.02
Recital 65������� 3.06; 14.02; 18.02; 29.14 Recital 141������� 4.05; 9.20; 14.02; 18.02
Recital 66��������������������������� 14.02; 18.02 Recital 142����������������� 4.05; 9.21; 14.02;
Recital 68������������������ 3.06; 14.02; 18.02 18.02; 26.08
Recital 69��������������������������� 14.02; 18.02 Recital 143����� 9.22; 14.02; 18.02; 26.06
Recital 70������������������ 3.06; 14.02; 18.02 Recital 144��������������������������� 9.23; 14.02
Recital 71��������������������������� 26.22; 29.14 Recital 145������������������ 4.05; 9.24; 14.02
Recital 72������������������ 3.06; 14.02; 18.02 Recital 146������������������ 4.05; 9.25; 14.02
Recital 73, 74������������ 3.06; 14.02; 18.02
Recital 75������� 3.06; 14.02; 18.02; 29.14 Recital 148����������������������������� 4.05; 9.26
Recital 76������������������ 3.06; 14.02; 18.02 Recital 149���������������������������������������4.05
Recital 77, 78��������������������� 14.02; 18.02 Recital 150���������������������������������������9.27
Recital 79������������������ 3.06; 14.02; 18.02 Recital 153�������������� 14.02; 18.02; 21.08
Recital 80������� 3.06; 14.02; 18.02; 26.22 Recital 154������������������������� 14.02; 18.02
Recital 81������������������ 4.05; 14.02; 18.02 Recital 155���������������� 4.05; 14.02; 18.02
Recital 82�����������������������������������������4.05 Recital 156������������������������� 14.02; 18.02
Recital 83, 84������������ 4.05; 14.02; 18.02 Recital 159���������������� 4.05; 14.02; 18.02
Recital 85, 86���������������������� 4.05; 10.01; Recital 161, 162, 164, 166,
14.02; 18.02 168���������������������������� 14.02; 18.02
Recital 87�����������������������������������������4.05 Recital 171, 173�������� 4.05; 14.02; 18.02
Recital 89������� 4.05; 14.02; 18.02; 26.08 art 1�������������������������� 4.06; 14.02; 18.02;
Recital 90�����������������������������������������4.05 26.06, 26.08, 26.14
Recital 91������������������� 3.06; 4.05; 14.02; art 2�������������������������� 4.06; 14.02; 18.02;
18.02; 26.22 26.06; 26.08, 26.14
Recital 94������������������ 4.05; 14.02; 18.02 art 2(3)���������������������������������������������4.06
Recital 97������� 3.06; 14.02; 18.02; 26.22 art 3�������������������������� 4.06; 14.02; 18.02;
Recital 98������������������ 4.05; 14.02; 18.02 26.08, 26.14, 26.80
Recital 99�����������������������������������������4.05 art 3(2)������������������������������� 26.32; 29.42
Recital 100�������������������������������������14.02 art 4���������������������������� 3.03; 6.02; 14.02;
Recital 101��������������� 4.05; 14.02; 18.02; 18.02; 26.08, 26.15
21.08; 26.44 art 4(7)��������������������������������� 3.02; 25.04
Recital 102�������������� 18.02; 21.08; 26.44 Ch II (arts 5–11)����������������� 4.07; 18.08;
Recital 103�����14.02; 18.02; 21.08; 26.44 25.04; 26.83

Reg (EU) 2016/679 – contd
art 5����������������� 4.07; 5.04; 11.10; 14.02; art 9(2)������������������������ 3.05, 3.06; 26.22
15.02; 17.23; 18.02, 18.08; art 9(2)(a)������������ 7.03, 7.04; 9.10, 9.13,
20.19; 21.15; 26.08, 26.17, 9.16, 9.28; 15.05, 15.08,
26.51, 26.52; 29.13 15.11; 26.54, 26.55, 26.60,
art 5(1)��������������������������������� 4.07; 18.08 26.63, 26.66; 29.28, 29.29,
art 5(1)(c)���������������������������������������26.06 29.33, 29.36, 29.39
art 5(2)��������������������������������� 4.07; 18.08 art 9(2)(b)����������������������������������������3.05
art 6�������������������������� 4.08; 11.10; 14.02; art 9(2)(g)����������������������������� 3.05; 9.16;
18.02; 20.19; 26.08, 26.18 15.11; 26.66; 29.39
art 6(1)������������� 3.06; 4.08; 26.18, 26.24 art 9(2)(h)�������������������� 3.05, 3.06; 9.10;
art 6(1)(a)������������ 6.02; 7.03, 7.04; 9.10, 15.05; 25.04; 26.22,
9.13, 9.28; 15.05, 26.23, 26.60; 29.33
15.08; 26.20, 26.21, art 9(2)(i)������������������� 3.05; 9.10; 15.05;
26.54, 26.60, 26.63; 26.23, 26.60; 29.33
29.26, 29.28, art 9(2)(j)�����������������������������������������3.05
29.33, 29.36 art 9(3)����������������������� 3.05; 9.10; 15.05;
art 6(1)(b)������ 9.28; 15.08; 26.63; 29.36 26.22, 26.23, 26.60; 29.33
art 6(1)(c)����������������� 4.08; 12.14; 26.18, art 9(4)���������������������������������������������3.06
26.41, 26.69; 28.02; 29.43 art 9(5)��������������������������������� 3.06; 26.22
art 6(1)(d)����������������������������������������4.08 art 10���������������������������� 3.05, 3.06; 4.08;
art 6(1)(e)���������� 4.08; 9.15, 9.29; 12.14; 12.13; 14.02, 14.03;
15.10; 25.04; 26.18, 18.02, 18.03; 26.08,
26.41, 26.65, 26.69; 26.18, 26.24, 26.31,
28.02; 29.38, 29.43 26.32, 26.41, 26.69;
art 6(1)(f)������������������������������ 7.03, 7.04; 28.02; 29.42, 29.43; 31.02
9.15, 9.29; 15.10; art 11���������������������� 11.10; 14.02; 18.02;
26.54, 26.55, 26.65; 20.19; 26.08, 26.25, 26.52
29.28, 29.29, 29.38 art 11(1)������������������������������26.25, 26.52
art 6(2)��������������������������������� 4.08; 26.18 art 11(2)����������������������������� 26.25; 29.25
art 6(3)������������������������ 4.08; 8.02; 26.18 Ch III (arts 12–23)������������ 14.02; 15.02;
art 6(4)���������������������������������������������4.08 17.23; 25.04; 26.08,
art 7����������������� 4.09; 6.02; 11.10; 14.02; 26.29, 26.51, 26.83; 29.13
18.02; 20.19; 26.08, Ch III Section 1 (art 12)�������������� 14.02;
26.17, 26.19; 29.14 18.02; 26.08
art 7(1)–(3)��������������� 4.09; 26.19; 29.14 art 12�������������������������� 7.02; 8.04; 11.10;
art 7(4)����������� 4.09; 26.18, 26.19; 29.14 14.02; 15.02; 17.23;
art 8��������������� 6.02; 11.10; 14.02; 18.02; 18.02; 20.19; 21.15;
20.19; 25.04; 26.08, 26.08, 26.51, 26.52;
26.21; 29.14, 29.26 29.13, 29.25
art 8(1)��������������������� 9.10; 15.05; 26.20, art 12(1), (2)���������������������� 26.52; 29.25
26.21, 26.60; 29.26, 29.33 art 12(3)����������������������������� 10.01; 29.25
art 8(2)������������������������������� 26.21; 29.26 art 12(4)����������������������������� 26.52; 29.25
art 8(3)�������������������� 26.20, 26.21; 29.26 art 12(5)�����������������������������������������29.25
art 8(4)�������������������������������������������26.20 art 12(6)–(8)���������������������� 26.52; 29.25
art 9������������������� 3.06; 4.08; 6.03; 11.10; Ch III Section 2 (arts 13–15)��������� 7.03;
14.02, 14.03; 18.02, 18.03; 14.02; 26.08,
26.08, 26.18; 31.02 26.53; 29.27
art 9(1)�������������� 3.05, 3.06; 9.16; 12.13; art 13������������������� 7.02, 7.03; 8.02, 8.04;
15.11; 25.04; 26.22, 26.23, 11.10; 14.02; 17.23; 18.02;
26.31, 26.32, 26.41, 26.66, 20.19; 21.11, 21.13, 21.15; 25.04;
26.69; 28.02; 29.39, 26.08, 26.28, 26.48, 26.50, 26.51,
29.42, 29.43 26.52, 26.54; 29.13, 29.25, 29.28

Reg (EU) 2016/679 – contd
art 13(1)�������������������� 7.03; 26.54; 29.28 art 18����������������� 8.02; 9.11, 9.12; 10.01;
art 13(2)�������������������� 7.03; 26.54; 29.28 11.10; 14.02; 15.02, 15.06,
art 13(3)�������������������� 7.03; 26.54; 29.28 15.07; 17.23; 18.02; 20.18,
art 13(4)����������������������������� 26.54; 29.28 20.19; 21.15; 25.04; 26.08,
art 14������������������� 7.02, 7.04; 8.02, 8.04; 26.25, 26.52, 26.61, 26.62,
11.10; 14.02; 15.02; 26.85; 29.13, 29.25, 29.35
17.23; 18.02; 20.19; art 18(1)–(3)����������������������� 9.11; 15.07;
21.11, 21.13, 21.15; 26.61; 29.34
25.04; 26.08, 26.28, art 19��������������� 8.02; 9.12; 10.01; 11.10;
26.48, 26.50, 26.51, 14.02; 15.02, 15.06; 17.23;
26.52; 29.13, 29.25, 29.29 18.02; 20.18, 20.19; 21.15;
art 14(1)�������������������� 7.04; 26.55; 29.29 25.04; 26.08 26.25, 26.51,
art 14(2)�������������������� 7.04; 26.55; 29.29 26.52, 26.62; 29.13,
art 14(3)�������������������� 7.04; 26.55; 29.29 29.25, 29.35
art 14(4)�������������������� 7.04; 26.55; 29.29 art 19(4)�����������������������������������������15.10
art 14(5)����������� 7.03, 7.04; 26.55; 29.29 art 20�������������������������� 8.02; 9.28; 10.01;
art 15����������������� 8.02, 8.04; 9.06; 10.01; 11.10; 14.02; 15.02; 17.23;
11.10; 14.02; 15.03; 17.23; 18.02; 20.19; 21.15; 25.04;
18.02; 20.19; 21.15; 25.04; 26.08, 26.25, 26.51, 26.52,
26.08, 26.25, 26.51, 26.52, 26.85; 29.13, 29.25
26.85; 29.13, 29.25 art 20(1)��������������������� 9.13, 9.28; 15.08;
art 15(1)������������������� 9.06; 15.03; 25.04; 26.63; 29.36
26.56; 29.30 art 20(2)��������������������� 9.13, 9.28; 15.08;
art 15(2)������������������� 9.06; 15.02, 15.03; 26.63, 26.65; 29.36
25.04; 26.56, 26.57; 29.30 art 20(3), (4)�������������� 9.13, 9.28; 15.08;
art 15(3)������������������� 9.06; 15.03; 25.04; 26.63; 29.36
26.56; 29.30 Ch III Section 4 (arts 21, 22)����������9.14,
art 15(4)��������� 9.06; 15.03; 26.56; 29.30 9.29; 15.09; 26.64
Ch III Section 3 (arts 16–19)��������� 9.08; art 21������������������������ 8.02; 10.01; 11.10;
14.02; 15.04; 26.08, 14.02; 17.23; 18.02;
26.58; 29.31 20.19; 21.15; 25.04;
art 16����������������� 8.02; 9.09, 9.12; 10.01; 26.08, 26.51, 26.52,
11.10; 14.02; 15.02, 26.85; 29.25
15.04, 15.06; 17.23; art 21(1)���������������9.10, 9.11, 9.15, 9.29;
18.02; 20.18, 20.19; 15.05, 15.07, 15.10;
21.15; 25.04; 26.08, 26.60, 26.61, 26.65;
26.25, 26.51, 26.52, 29.33, 29.34, 29.38
26.62, 26.85; 29.13, art 21(2)����������������������� 9.10, 9.15, 9.29;
29.25, 29.32, 29.35 15.05, 15.10; 26.60,
art 17����������������� 8.02; 9.13, 9.28; 10.01; 26.65; 29.33, 29.38
11.10; 14.02; 15.02, art 21(3)��������������������� 9.15, 9.29; 15.10;
15.08; 17.23; 18.02; 26.65; 29.38
20.18, 20.19; 21.15; art 21(4)����������� 9.15, 9.29; 26.65; 29.38
25.04; 26.08, 26.25, art 21(5)��������������������� 9.15, 9.29; 15.10;
26.51, 26.52, 26.63; 26.65; 29.38
art 21(6)��������������������� 9.15, 9.29; 15.10;
art 17(1)��������������������� 9.10, 9.12; 15.05, 26.65; 29.38
15.06; 26.60, 26.62; art 22���������������������� 10.01; 11.10; 14.02;
29.33, 29.35 17.23; 18.02; 20.19;
art 17(2)������������������� 9.10; 15.05; 20.18; 21.11, 21.15; 26.08,
26.60; 29.33 26.48, 26.51, 26.52;
art 17(3)��������� 9.10; 15.05; 26.60; 29.33 29.13, 29.25

art 22(1)�������������� 7.03, 7.04; 9.06, 9.16, art 31(3)(b), (d), (e)�����������������������12.11
9.30; 15.03, 15.11; 25.04; Ch IV Section 2 (arts 32–34)������� 12.07;
26.54, 26.55, 26.56, 26.66; 26.36
29.28, 29.29, 29.30, 29.39 art 32���������������������� 11.10; 12.02, 12.03;
art 22(2)��������������������������������9.16; 15.11 14.02; 18.02; 20.19;
art 22(2)(a)�������������������������� 9.16; 15.11; 26.08, 26.29, 26.80
26.66; 29.39 art 32(1)������������������12.08, 12.11; 26.31,
art 22(2)(b)�������������� 9.16; 15.11; 25.04; 26.37; 29.41
26.66; 29.39 art 32(2)������� 12.08, 12.11; 26.37; 29.41
art 22(2)(c)�������������������������� 9.16; 15.11; art 32(3), (4)����������� 12.08; 26.37; 29.41
26.66; 29.39 art 33���������������������� 10.01; 14.02; 18.02,
art 22(3)��������� 9.16; 15.11; 26.66; 29.39 18.03; 20.19; 26.08,
art 22(4)����������������������� 7.03, 7.04; 9.06, 26.29, 26.70; 27.03
9.16; 15.03, 15.11; 26.54, art 33(1)������������������ 12.09; 26.38; 27.03
26.55, 26.56, 26.66; 29.28, art 33(2)������� 12.09, 12.10; 26.38; 27.03
29.29, 29.30, 29.39 art 33(3)�����������������������������������������27.03
Ch III Section 5 (art 23)����������������21.15 art 33(3)(b)–(d)������� 26.39; 27.04; 29.40
art 23������������ 14.02; 15.02; 18.02; 26.08 art 33(4), (5)����������� 12.10; 26.38; 27.03
art 23(1)����������� 4.08; 8.02; 21.15; 26.06 art 34����������� 10.01; 14.02; 17.23; 18.02;
art 23(2)������������������ 21.15; 26.06, 26.67 20.19; 21.15; 25.04; 26.08,
Ch IV (arts 24–43)������������ 14.02; 18.02; 26.29, 26.52; 29.25; 31.03
25.04; 26.08, 26.25, 26.83 art 34(1)������� 12.11; 26.39; 27.04; 29.40
Ch IV Section 1 (arts 24–31)���������26.25 art 34(2)������������������ 26.39; 27.04; 29.40
art 24����������������������� 14.02; 26.08, 26.80 art 34(3), (4)��������������������� 12.11; 26.39;
art 24(1)������������������������������26.27, 26.28 27.04; 29.40
art 24(2)�����������������������������������������26.27 Ch IV Section 3 (arts 35, 36)������� 12.12;
art 24(3)������������������������������26.27, 26.28 14.02; 18.02; 26.08,
art 25������������� 3.03; 11.10; 17.23; 18.02; 26.13, 26.40, 26.69;
19.04, 19.05; 20.19; 26.08, 28.01; 29.43
26.34, 26.51, 26.67; 29.13 art 35���������������������� 14.02, 14.03; 18.02;
art 25(1), (2)����������� 19.05; 26.34, 26.67 20.19; 26.08, 26.29,
art 25(3)������������������������������26.34, 26.67 26.42, 26.70, 26.88;
art 26������������ 14.02; 18.02; 20.19; 26.08 28.09; 29.44; 31.03
art 26(1), (2)����������������������������������26.28 art 35(1)����������������� 12.13, 12.14; 26.06,
art 27���������������������� 14.02; 18.02; 20.19; 26.41, 26.69; 28.02; 29.43
26.08; 29.23 art 35(2)����������������� 12.13, 12.14; 26.41,
art 27(1)�����������������������������������������26.32 26.69; 28.02; 29.43
art 27(2)������������������ 26.32, 26.34; 29.42 art 35(3)����������������� 12.14; 26.41, 26.69;
art 27(3), (4)���������������������� 26.32; 29.42 28.02; 29.43
art 27(5)�����������������������������������������29.42 art 35(4)����������������� 12.13, 12.14; 20.17;
art 28������������ 14.02; 18.02; 20.19; 26.08 26.41, 26.69; 28.02; 29.43
art 28(1)–(4)����������������������������������26.29 art 35(5)����������������� 12.14; 26.41, 26.69;
art 28(5)������������������������������26.29, 26.32 28.02; 29.43
art 28(6), (7)����������������������������������26.29 art 35(6)����������������� 12.13, 12.14; 26.41,
art 28(8)������������������ 20.17, 20.18; 26.29 26.69; 28.02; 29.43
art 28(9), (10)��������������������������������26.29 art 35(7)–(11)��������� 12.14; 26.41, 26.69;
art 29���������������������� 14.02; 18.02; 20.19; 28.02; 29.43
26.08, 26.30 art 36��������������������������������� 14.02; 18.02,
art 30���������������������� 14.02; 18.02; 20.19; 18.03; 20.18, 20.19;
26.08, 26.50 26.08, 26.29, 26.90
art 30(1)–(5)����������������������������������26.31 art 36(1)���������������������������� 12.15; 26.42,
art 31���������������������������������� 20.19; 26.33 26.70; 28.09; 29.44

art 36(2)����������������� 12.15; 20.17; 26.42, art 42(8)�����������������������������������������26.81
26.70; 28.09; 29.44 art 43����������� 11.10; 14.02; 18.02; 20.17,
art 36(3)����������������� 12.15; 26.42, 26.70; 20.18, 20.19; 26.08, 26.81
28.09; 29.44 art 43(2)(d)������������������������������������20.18
art 36(4)������������������ 12.15; 26.70; 29.44 Ch V (arts 44–50)������������� 21.08; 24.17;
art 36(5)����������������� 12.15; 20.18; 26.42, 26.44, 26.83
26.70; 28.09; 29.44 art 44��������������������������������� 11.10; 14.02;
art 36(7)�����������������������������������������26.42 18.02; 20.19; 21.08;
Ch IV Section 4 (arts 37–39)��������14.02, 26.06, 26.08, 26.45; 32.1
14.03; 18.02, 18.03; art 45��������������������������������� 11.10; 14.02;
26.88; 31.02 18.02; 20.19; 21.13;
art 37���������������������� 14.02; 18.02; 20.19; 26.06, 26.08, 26.50
21.11; 26.08, 26.48 art 45(1)������������������ 21.09; 26.45, 26.46
art 37(1)������� 14.03; 18.03; 31.02, 31.04 art 45(2)����������������������������� 21.09; 26.46
art 37(2), (3)���������������������� 18.03; 31.04 art 45(2)(b)������������������������������������21.09
art 37(4)������� 14.03; 18.03; 28.09; 31.04 art 45(3)����������������� 21.09, 21.10, 21.13;
art 37(5), (6)����������� 14.03; 18.03; 31.05 26.46, 26.47, 26.50
art 37(7)������������������ 14.03; 18.03; 31.06 art 45(4), (5)���������������������� 21.09; 26.46
art 38����������������������� 14.02; 18.02; 20.19 art 45(6), (7)����������������������������������21.09
art 38(1)������� 14.02; 18.02; 26.89; 31.03 art 45(8), (9)���������������������� 21.09; 26.46
art 38(2)������� 14.02; 18.02; 26.89; 31.09 art 46������������������� 3.03; 7.03, 7.04; 9.06;
art 38(3)����������������� 14.02; 18.02; 26.89; 11.10; 14.02; 15.03; 18.02;
31.07, 31.08 20.19; 21.13; 24.17; 26.06,
art 38(4)������� 14.02; 18.02; 26.89; 31.06 26.08, 26.50, 26.54, 26.55,
art 38(5)������������������ 14.02; 18.02; 26.89 26.57; 29.28, 29.29, 29.30
art 38(6)������� 14.03; 18.02; 26.89; 31.03 art 46(1)�������������������� 21.10; 26.47; 32.7
art 39����������� 11.10; 14.02, 14.03; 20.19; art 46(2)����������������������������� 21.10; 26.47
26.08, 26.90; 29.41; 31.05 art 46(2)(c)���������������������������������������32.7
art 39(1)������������������ 18.03; 26.90; 31.03 art 46(2)(d)������������������������������������20.17
art 39(2)������� 14.03; 18.03; 26.90; 31.03 art 46(2)(e)�������������������������������������26.80
Ch IV Section 5 (arts 40–43)��������26.08, art 46(2)(f)�������������������������������������26.81
26.79 art 46(3)������������������ 20.17; 21.10; 26.47
art 40���������������������� 11.10; 12.08; 14.02; art 46(3)(a), (b)������������������������������20.18
18.02; 20.19; 21.10; 26.08, art 46(4), (5)���������������������� 21.10; 26.47
26.29, 26.37, 26.47, 26.69; art 47�������������������������� 7.03, 7.04; 11.10;
28.02; 29.41, 29.43; 33.15 14.02; 18.02; 20.17, 20.18,
art 40(1)����������������������������� 20.17; 26.80 20.19; 21.10; 24.17; 26.06,
art 40(2)–(4)����������������������������������26.80 26.08, 26.47, 26.54, 26.55;
art 40(5)������������������ 20.17, 20.18; 26.80 29.28, 29.29
art 40(6)–(11)���������������������������������26.80 art 47(1)������������������������������21.11; 26.48
art 41������������ 14.02; 18.02; 20.17; 26.08 art 47(1)(d)������������������������������������26.50
art 41(4)������������������������������11.10; 20.19 art 47(2)������������������������������21.11; 26.48
art 42���������������������� 11.10; 14.02; 18.02; art 47(2)(d)–(f)�������������������21.11; 26.48
19.05; 20.18, 20.19; art 47(2)(j)��������������������������21.11; 26.48
21.10; 26.08, 26.27, art 47(4)�����������������������������������������26.48
26.29, 26.37, 26.67; 33.15 art 48���������������������� 11.10; 14.02; 18.02;
art 42(1)����������������������������� 20.17; 26.81 20.19; 21.12; 24.17;
art 42(2)�����������������������������������������26.81 26.06, 26.08, 26.49
art 42(2)(f)�������������������������������������26.81 art 48(3)�����������������������������������������21.11
art 42(3), (4)����������������������������������26.81 art 48(3)(a)�������������������������������������24.17
art 42(5)����������������������������� 20.17; 26.81 art 49���������������������� 11.10; 14.02; 18.02;
art 42(7)������������������ 20.17, 20.18; 26.81 20.19; 26.06, 26.08

art 49(1)��������������������� 7.03, 7.04; 21.13; art 77����������������������������������11.19, 11.20;
26.31, 26.50, 26.54, 26.08, 26.76, 26.77
26.55; 29.28, 29.29 art 77(1)������������������������������11.17; 26.74
art 49(1)(a)–(c), (g)������������ 21.13; 26.50 art 77(2)�������������������11.17, 11.18; 26.74
art 49(2), (3)����������������������������������21.13 art 78�����������������������11.18, 11.20; 14.02;
art 49(4)�����������������������������������������26.50 18.02; 26.08, 26.74, 26.77
art 49(5), (6)���������������������� 21.13; 26.50 art 78(1)�����������������������������������������26.75
art 50���������������������� 21.14; 24.17, 24.18; art 78(2)������������������������������11.08; 26.75
26.06, 26.08 art 78(3), (4)����������������������������������26.75
art 50(1)�����������������������������������������24.17 art 79��������������������������������� 11.20; 14.02;
Ch VI (arts 51–59)�������������������������26.83 18.02; 21.11; 26.08,
art 51����������������������� 14.02; 18.02; 26.08 26.48, 26.77, 26.80
art 52����������������������� 14.02; 18.02; 26.08 art 79(1)�����������������������������������������11.19
art 53����������������������� 14.02; 18.02; 28.09 art 79(2)��������11.08, 11.19; 26.76, 26.78
art 53(1)�����������������������������������������20.19 art 80���������������������� 14.02; 18.02; 20.17;
art 54����������������������� 14.02; 18.02; 26.08 25.04; 26.06, 26.08
art 55������������������������26.38, 26.75, 26.80 art 80(1), (2)�����������������������11.20; 26.77
art 56���������������������� 14.02; 18.02; 26.08, art 81����������������������� 14.02; 18.02; 26.08
26.75, 26.80 art 81(1)–(3)����������������������������������11.21
art 57������������ 14.02; 18.02; 26.08; 28.02 art 82���������������������� 11.20; 14.02; 18.02;
art 57(1)�����������������������������������������20.17 26.08, 26.77, 26.84
art 57(1)(f)�������������������������������������20.17 art 82(1)�������������������11.08, 11.22; 26.78
art 57(2)–(4)����������������������������������20.17 art 82(2)–(5)�����������������������11.08; 26.78
art 58��������������������������������� 12.15; 14.02; art 82(6)�����������������������������������������26.78
18.02; 26.08, art 83����������������������� 11.10, 11.11, 11.25;
26.42, 26.70; 29.44 14.02; 18.02; 20.15,
art 58(1)�����������������������������������������20.18 20.18, 20.19, 20.20;
art 58(1)(e), (f)�������������������������������26.86 26.08, 26.29
art 58(2)��������11.10; 20.17, 20.18, 20.19 art 83(1)�������������������� 3.06; 11.10; 20.19
art 58(2)(a)–(h), (j)������������������������11.10 art 83(1)(a)–(h), (j)������������������������20.19
art 58(3)����������������������������� 20.18; 26.81 art 83(2), (3)�����������������������11.10; 20.19
art 58(4)–(6)����������������������������������20.18 art 83(4)–(6)�����������������������11.10; 20.19
art 59����������������������� 14.02; 18.02; 26.08 art 83(7)–(9)�����������������������11.10; 20.19
Ch VII (arts 60–76)����������� 20.18; 25.04 art 84������������ 14.02; 18.02; 26.08, 26.29
Ch VII Section 1 (arts 60–62)�������26.06 art 84(1), (2)�����������������������11.11; 20.20
art 60–62����������������� 14.02; 18.02; 26.08 Ch IX (arts 85–91)�������������� 4.08; 11.10;
art 63����������������������������������21.10, 21.11; 14.02; 20.19; 26.08,
26.08, 26.41, 26.18, 26.82, 26.83
26.48, 26.69; 29.43 art 85����������������������� 14.02; 18.02; 26.08
art 64, 65����������������� 14.02; 18.02; 26.08 art 85(1)�����������������������������������������26.83
art 68������������������������������������������������2.10 art 85(2)������������������������������� 8.02; 26.83
art 68(1), (3)����������������������������������26.13 art 85(3)�����������������������������������������26.83
art 68(4)�����������������������������������������29.43 art 87����������������������� 14.02; 18.02; 26.08
art 69(1)�����������������������������������������26.13 art 87(2)����������������������������� 21.09; 26.20
art 70������������ 14.02; 18.02; 26.08, 26.13 art 88������������ 14.02; 15.12; 18.02; 26.08
art 71����������������������������������������������26.13 art 88(1)–(3)����������������������������������26.84
art 74(1)�����������������������������������������11.18 art 89������������������������� 8.02; 22.01; 26.08
art 75����������������������������������������������26.80 art 89(1)�������������� 4.07; 5.04; 6.03; 7.04;
Ch VIII (arts 77–84)��������� 11.16; 14.02; 9.02, 9.15, 9.29; 18.08;
15.02; 17.23; 25.04; 26.17, 26.22, 26.55,
18.02; 26.51, 26.60, 26.65, 26.85;
26.73; 29.13 29.29, 29.33

art 89(2)–(4)����������������������������������26.85 art 95���������������������������������� 23.01; 26.08
art 90���������������������������������� 18.02; 26.08 art 96����������������������� 14.02; 18.02, 26.08
art 90(1), (2)����������������������������������26.86 art 98����������������������������������������������26.08
art 91�����������������������������������26.08, 26.52 Reg (EU) 2016/1148����������������������������2.04
Ch X (arts 92, 93)��������������������������26.08 Reg (EU) 2018/151��������� 2.04, 2.05; 12.22
art 92����������������������������������������������26.08 Reg (EU) 2018/1725��������������������������12.13
art 93(2)������������������21.09, 21.10, 21.11; Reg (EU) 2018/1727
26.29, 26.47, 26.48 art 39.4�������������������������������������������19.09
art 93(3)�����������������������������������������21.09 ePrivacy Regulation (proposed)��������11.06;
art 94����������������������� 14.02; 18.02; 26.08 12.20; 15.10;
art 94(1)�����������������������������������������26.12 18.12; 22.22;
art 94(2)������������������������������26.12, 26.13 23.11, 23.14; 24.11

Table of European Directives

Dir 95/46/EC����������� 2.01, 2.12; 4.02, 4.03, Dir 2002/58/EC – contd


4.05, 4.10; 12.22; 14.02; art 5������������������������������������������������22.15
18.02; 22.02, 22.10, 22.11, art 5(1), (2)������������������������������������22.15
22.13; 25.05; 26.01, 26.02, art 5(3)������������������������������� 22.15; 23.15
26.03, 26.06, 26.07, 26.08, art 6������������������������������������������������22.16
26.10, 26.12, 26.20, 26.44, art 6(1)–(6)������������������������������������22.16
26.60; 29.07, 29.14 art 7������������������������������������������������22.17
Recital 53, 54���������������������������������29.17 art 7(1), (2)������������������������������������22.17
art 2(b), (d)��������������������������������������26.06 art 8(1)–(6)������������������������������������22.18
art 4(1)(a)���������������������������������������26.60 art 8(1)–(6)������������������������������������22.18
art 25(6)����������������������������� 21.09; 26.46 art 9������������������������������������������������22.19
art 26(2)������������������ 21.07, 21.10; 26.47 art 9(1), (2)������������������������������������22.19
art 26(4)����������������������������� 21.10; 26.47 art 10�����������������������������������22.12, 22.20
art 29������������������������������������������������2.10 art 11����������������������������������������������22.12
Dir 97/66/EC��������������������������������������22.02 art 12����������������������������������������������22.21
Dir 98/34/EC art 12(1)–(4)����������������������������������22.21
art 1(2)���������������������������������������������3.03 art 13��������������������������������� 22.04, 22.13;
Dir 1999/93/EC������������������������������������2.05 23.03, 23.04
Dir 2000/31/EC��������������� 4.05; 8.04; 29.15 art 13(1)�����������������������������22.04, 22.05,
art 12–15������������������������������������������4.05 22.09, 22.13; 23.03,
Dir 2002/58/EC������� 1.03; 2.04; 4.05; 9.15, 23.04, 23.07, 23.10
9.21, 9.29; 15.10; 17.19; art 13(2)���������������������������� 22.05, 22.13;
22.01, 22.02, 22.03, 22.22; 23.03, 23.06, 23.07
23.01, 23.11, 23.15; 26.08, art 13(3)������� 22.06, 22.13; 23.03, 23.07
26.38, 26.65; 29.38 art 13(4)����������������� 22.07, 22.08, 22.13;
art 1������������������������������������������������22.10 23.03, 23.08, 23.09
art 1(1)–(3)������������������������������������22.10 art 13(5)������� 22.09, 22.13; 23.03, 23.10
art 2������������������������������������������������22.11 art 15(1)�����������������������������������������22.15
art 3������������������������������������������������22.12 Dir 2006/24/EC������������ 1.03; 9.21; 22.02
art 3(1)–(3)������������������������������������22.12 Dir 2009/136/EC����������� 2.04; 22.02; 23.15
art 4������������������������������������������������22.14 Dir 2016/680/EU���������������������� 3.02; 24.10
art 4(1), (2)������������������������������������22.14 Dir 2016/1148/EU��������������������������������2.05

Table of Treaties, Conventions
and Agreements

Convention for the Protection of European Convention on Human


Individuals with Regard to Rights (Rome,
Automatic Processing of 4 November 1950)�������� 12.20; 17.15
Personal Data (Council art 8������������������������� 17.15, 17.24; 30.02
of Europe) (Strasbourg, Treaty of Lisbon amending the Treaty
28 January 1981)����������������������� 2.12; on European Union and the
4.02; 26.03 Treaty establishing the European
art 5–8����������������������������������������������4.02 Community (Lisbon,
Charter of Fundamental Rights 13 December 2007)�������������������26.05
of the European Union art 16(1)������������������������������26.04, 26.05
(2 October 2000)����������������������� 9.26; art 101, 102��������������������������������������9.27
20.18; 26.05 Treaty on European Union
art 7�����������������������������������������32.6–32.7 (Maastricht, 7 February 1992)
art 8�����������������������������������������32.6–32.7 Title V��������������������������������������������22.10
art 8(1)���������������������� 4.05; 26.04, 26.05 Title VI������������������������������������������22.10
art 16(1)�������������������������������������������4.05 Treaty on the Functioning of the
art 47����������������������������� 9.20; 32.6–32.7 European Union (Rome,
EU–US Privacy Shield 25 March 1957)�������������������������22.10
Agreement��������������������� 13.01; 21.17 art 288��������������������������������������������24.13
EU–US Safe Harbour
Agreement����������������������� 1.19; 13.01

Part 1
Data Protection

How to Comply with the Data Protection Regime

Chapter 1
Data Protection

What is Data Protection?


1.01 Data protection aims to protect the private personal information
of individuals. It provides a regulatory protection regime around personal
information, or personal data. Personal data is data or information which
relates to or identifies, directly or indirectly, an individual.
The data protection legal regime governs if, when and how
organisations may collect, use and process personal data. It also provides
for how long they may use such data.
This applies to all sorts of personal information, from general to highly
confidential and sensitive. Examples of the latter include sensitive health
data and details of criminal convictions.
The data protection regime is twofold in the sense of: (a) providing
obligations (which are inward-facing and outward-facing, see below)
which organisations must comply with; and (b) providing individuals,
or Data Subjects as they are known, with various data protection
rights which they or the Information Commissioner’s Office (ICO) can
invoke or enforce as appropriate. Significantly, recent measures expand
the ability to invoke data protection rights to privacy groups and
collective data protection organisations, including in group or class
type actions (see the EU General Data Protection Regulation (GDPR)).
The new Data Protection Act 2018 (DPA 2018) and the new GDPR bring
‘comprehensive reform’1 to the UK and EU data protection regimes.
The previous Data Protection Act 1998 is now repealed.2
Certain specific sections of industry and certain specific activities
(eg data transfers abroad, direct marketing, cookies), also have ­additional
data protection compliance rules.

1 In Brief, Communications Law (2012) (17) 3.


2 Data Protection Act 2018, Sch 19, Pt 1, s 43.


In terms of individuals, they can invoke their rights directly with
organisations, with the ICO, and also via the courts in legal proceedings.
Compensation can also be awarded. In addition, criminal offences can
be prosecuted. Data protection compliance is therefore very important.
As regards implementing compliance frameworks, an organisation
must have defined structures, policies and teams in place to ensure that it
knows what personal data it has, for what purposes, ensure that it is held
fairly, lawfully and in compliance with the data protection regime, and
that it is safely secured against damage, loss and unauthorised access –
an increasing problem. The cost of loss, and of data security breach,
can be significant both financially and publicly. The IBM Cost of Data
Breach Study (2019) puts the average cost at $3.92m. The UK TalkTalk
data breach is estimated to have cost £35m. The Target data breach was
estimated to have cost €162m plus a 5.3 per cent drop in sales. More
recent data breaches have led to drops of 40 per cent in share price value
and even business closures. Breaches which are criminal offences can be
prosecuted. In addition, personal liability can attach to organisational
personnel separately from, and in addition to, liability of the organisation
itself, as well as reprimand or even dismissal.

The Importance of Data Protection


1.02 One Editorial notes that the ‘waves of [data protection concern
are] riding high.’3 The increasing ‘centralisation of information through
the computerisation of various records has made the right of [data
protection and of] privacy a fundamental concern.’4 Data protection is
important, increasingly topical and an issue of legally required
compliance for all organisations. More importantly, it is part of management
and organisational best practice. Individuals, employees and customers
expect that their personal data will be respected by organisations. They
are increasingly aware of their rights, and increasingly they enforce their
rights.
1.03 The coverage of data protection has also increased in the
mainstream media. This is due in part to a large number of data loss and
data breach incidents. These have involved the personal data of millions
of individuals being lost by commercial organisations, but also trusted
government entities.

3 Editorial, S Saxby, Computer Law & Security Review (2012) (28) 251–253, at 251.
4 ‘Personal Data Protection and Privacy,’ Council of Europe, at http://www.coe.int/en/.


Even more recently, the issue of online abuse, which involves amongst
other things privacy and data protection, has also been hitting the
headlines. Tragically, such online abuse can contribute to
actual suicide. This is a particular concern in relation to children and
teenagers, but it can also affect adults.
It is also in the headlines because of public and official data protec-
tion supervisory authority (eg, ICO; Digital, Culture, Media and Sport
(DCMS) Committee) concerns with the problem of data misuse and the
damage of ‘permanent’ data online. The CJEU in the light of such con-
cerns issued an important decision in the Google Spain case involving
the Right to Erasure/Right to be Forgotten (RtbF) directing that certain
personal data online had to be deleted, following a complaint, from inter-
net search engine listing results.5 The High Court recently dealt with
RtbF in NC 1 and NC 2.6 Further RtbF cases include GC and Others,7
and Google v CNIL,8 and no doubt more will follow.
The CJEU also pronounced on the often contentious area of official
data retention. This is the obligation placed by countries on internet
service providers (ISPs) to retain certain customer data in relation to
telephone calls, internet searches, etc, so that (certain) official agencies
can ask to access or obtain copies of such data in the future. Debate
frequently surrounds whether this should be permitted at all, if so, when,
and under what circumstances, how long ISPs must store such data, the
cost of retaining same, etc. The strongest argument for an official data
retention regime may relate to the prevention or investigation of terror-
ism. Serious crime may come next. There are legitimate concerns that
the privacy and data protection costs are such that official data retention,
if permitted, should not extend to ‘common decent crime’ or unlimited
access and unlimited retention. The EU Data Retention Directive has
been held invalid, and similar laws in Member States including the UK
have also been invalidated, in whole or in part.9 No doubt argument, debate

5 See Google Spain SL, Google Inc v Agencia Española de Protección de Datos
(AEPD), Mario Costeja González, Court of Justice (Grand Chamber), Case C-131/12,
13 May 2014. Also see P Lambert, The Right to be Forgotten (Bloomsbury, 2019).
6 NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB).
7 GC and Others v Commission Nationale de l’Informatique et des Libertés (CNIL)
(Déréférencement de données sensibles), CJEU [2019] Case C-136/17.
8 Google LLC, successor in law to Google Inc v Commission Nationale de l’Informatique
et des Libertes (CNIL), CJEU [2019] Case C-507/17.
9 Judgment in Joined Cases C-293/12 and C-594/12, Digital Rights Ireland and
Seitlinger and Others, Court of Justice, 8 April 2014. Directive 2006/24/EC of the
European Parliament and of the Council of 15 March 2006 on the retention of data
generated or processed in connection with the provision of publicly available elec-
tronic communications services or of public communications networks and amending


and research will ensue in relation to data retention. This remains, if
anything, a contentious issue.
While the issue of official data retention is important, it is a separate
subject from data protection. This work focuses on data protection alone.
A further reason why data protection is important, and increasingly
the focus of press attention, is that organisations are increasingly using
data security and respect for data protection as a commercial differentiator
in the marketplace. Apple has repeatedly indicated that it does not operate
a user-data-intrusive model of collecting user data. In fact, it has even
criticised some of its technology competitors. Microsoft has for many
years promoted the data protection and privacy friendly policy of Data
Protection by Design (DPbD) and by default. Post-Snowden, many US
technology companies have been heavily lobbying the US administra-
tion for a roll back of certain activities and practices, particularly those
felt to be extra-judicial and extra-legal, on the basis that it will disadvan-
tage the US-based cloud industry. Many non-US cloud companies have
been highlighting that they are not US-based. There are even calls for
more data privacy laws in the US, including adopting parts of the GDPR
in the US. Many notable US tech companies have also joined the calls
for a comprehensive US federal data privacy law.
1.04 All organisations collect and process personal data. Whether they
are big organisations or new start-ups, they need to comply with the data protection
regime. It should also be borne in mind that even a new technology start-
up can scale relatively quickly to millions of users. Many issues enhance
the importance of getting organisational data protection understanding
and compliance right from day one. These include legal obligations,
director, board and officer obligations, investigations, fines, prosecu-
tions, being ordered to delete databases, adverse publicity, commercial
imperatives and even commercial advantages. If one also considers
some of the recent large scale data breach incidents, there are examples
of chief technology officers, as well as managing directors/CEOs, losing
their positions as a result of such incidents.

Directive 2002/58/EC (OJ 2006 L105, p 54). J Rauhofer and D Mac Sithigh, ‘The
Data Retention Directive Never Existed,’ (2014) (11:1) SCRIPTed 118. The Data
Retention and Investigatory Powers Act 2014 (DRIPA 2014) was invalidated after
challenge by two MPs. This was brought in after the Directive was invalidated. The
second replacement Act (Investigatory Powers Act 2016) was also successfully chal-
lenged by Liberty. In relation to data protection as a fundamental right, see, for exam-
ple S Rodotà, ‘Data Protection as a Fundamental Right,’ in S Gutwirth, Y Poullet,
P de Hert, C de Terwangne and S Nouwt, Reinventing Data Protection? (Springer,
2009) 77. A case dealing with the issue of how serious the level of crime must be in
order for law enforcement authorities to be able to access certain telecoms data is
Ministerio Fiscal, CJEU [2018] Case C-207/16.

The Importance of Data Protection 1.09

Increasingly, data laws seek to place compliance burdens as starting well before any personal data may be collected in the first instance. This
includes data risk assessments and tools such as privacy by design
(PbD), data protection by design (DPbD), and data protection by default.
1.05 In addition, organisations often fail to realise that data protection
compliance is frequently an issue of dual compliance. They need to be
looking both inward and outward. Internally, they have to be data pro-
tection compliant in relation to all of their employees’ (and contractors’)
personal data, which traditionally may have related to HR or personnel
files and employee contracts, but now includes issues of electronic com-
munications, social media, internet usage, filtering, monitoring, abuse,
on-site activity, off-site activity, etc.
1.06 Separately, organisations have to be concerned about other sets of
personal data, such as those relating to persons outside of the organisa-
tion, eg customers, prospects, etc. Comprehensive data protection com-
pliance is also required here. The consequences of non-compliance are significant, regardless of whether the data is inward or outward facing, but naturally outward-facing breach issues tend to get more attention.
1.07 Substantial fines have been imposed in a number of recent cases.
In some instances also, organisations have been ordered to delete their
databases. In a new internet or technology start-up situation, this can be
the company’s most valuable asset. Even in an established organisation,
it can be a valuable asset and mandated deletion by a data regulator is a
significant event.
1.08 Up until recently the issue of data loss was a small story. More
recently, however, the loss of personal data files of tens of millions
of individuals in the UK – and from official governmental sources –
makes UK data loss a front page issue. There is increased scrutiny
from the ICO, and others, and increasing regulation of data security
issues, ­preparedness, and reactivity. Organisations must look at data
security issues with increasing rigour. Organisations can face liability
issues in breach incidents but also in the aggravating situation where a
­vulnerability may have already been highlighted internally but not acted
upon, thus contributing to the breach incident. As well as an official
investigation, fine and sanction, organisations also face liability issues to
users and in some instances potentially to banks and financial interme-
diaries. Organisations must be proactive and compliant. Being reactive
only when a breach event occurs is in many respects too late.
1.09 In the UK and elsewhere there are enhanced obligations to report
data losses; as well as potentially enhanced financial penalties, and in
some instances personal director responsibility for data loss. The need


for compliance is now a boardroom issue and an important issue of


corporate compliance. Proactive and complete data protection ­compliance
is also a matter of good corporate governance, brand loyalty, maintaining
firm value, and a means to ensuring user and customer goodwill.
1.10 The frequency and scale of security breaches, eg Uber, Cathay Pacific, Dixons, Morrisons, Yahoo! (500 million users in one breach, and 3 billion in another), 3 UK, CEX, Bupa, TalkTalk, Target, Sony PlayStation (70 million individuals’ personal data10 in one instance, and 25 million in another11) and Cambridge Analytica, make data security compliance for personal data ever more important. The Revenue and Customs loss of discs containing names, dates of birth, and bank and address details involved 25 million UK individuals.12
1.11 There are many UK cases involving substantial fines for data pro-
tection breaches. Brighton and Sussex University Hospitals NHS Trust
was fined £325,000 by the ICO.13 Zurich Insurance was fined £2.3m for
losing data in relation to 46,000 customers.14 Sony was fined £250,000.
Facebook was fined £500,000 (re Cambridge Analytica). DSG Retail
was fined £500,000 when 14 million customers were affected.15 Cathay
Pacific airlines was also fined £500,000 after data security failings over
four years.16 A pharmacy (Doorstep Dispensaree) was fined £275,000 for
failing to secure 500,000 records of data.17 The rideshare company Uber
was fined £385,000 by the ICO.18 Fines also increase under the GDPR.
Apart from data loss incidents and data security breach incidents, a
range of other compliance failings have also been fined. Bounty (UK)
was fined £400,000 for illegal sharing and disclosure of personal
data.19 CRDNN was fined £500,000 for 193 million illegal automated

10 See, for example, G Martin, ‘Sony Data Loss Biggest Ever,’ Boston Herald, 27 April
2011.
11 See, for example, C Arthur, ‘Sony Suffers Second Data Breach With Theft of 25m
More User Details,’ Guardian, 3 May 2011.
12 See, for example, ‘Brown Apologises for Record Loss, Prime Minister Gordon Brown
has said he “Profoundly Regrets” the Loss of 25 Million Child Benefit Records,’ BBC,
21 November 2007.
13 See, for example, ‘Largest Ever Fine for Data Loss Highlights Need for Audited
Data Wiping,’ ReturnOnIt, at http://www.returnonit.co.uk/largest-ever-fine-for-data-
loss-highlights-need-for-audited-data-wiping.php.
14 See, for example, J Oates, ‘UK Insurer Hit With Biggest Ever Data Loss Fine,’
The Register, 24 August 2010. This was imposed by the Financial Services Authority
(FSA).
15 ICO v DSG Retail Limited [2020] 9/1/2020.
16 ICO v Cathay Pacific Airways Limited [2020] 4/3/2020.
17 ICO v Doorstep Dispensaree [2019] 20/12/2019.
18 ICO v Uber [2018] 26/11/2018.
19 ICO v Bounty (UK) Limited 11/4/2019.


marketing calls.20 The size of the penalty is significant. However, possibly more significant is director liability. Directors and executives can
also be fined. Two individual Spam texters were fined a total of £440,000
in respect of their company.21 They can also be prosecuted, as exempli-
fied when a managing director was recently prosecuted.22 The ICO also
issued a data fine in relation to breach of the data protection regime by
way of incorrect storage and processing when the financial data and files
of individual customers were mixed up. Potentially, a customer could
have suffered financial loss and adverse consequences.23 In another case,
a health trust was fined £225,000 in relation to third party unauthor-
ised access to files.24 Employees have been prosecuted for unauthorised
access to personal records.25 The GDPR and DPA 2018 vastly increase
the potential fines available (up to €20m or 4% of worldwide turnover).
We can expect the level of fines on organisations to increase.
1.12 National data protection authorities are increasingly pro-active
and undertake audits of data protection compliance frameworks, as
well as incidents of breaches.26 The ICO also investigated some of the
­personal data issues in the Leveson Inquiry press phone hacking scan-
dal,27 entitled Operation Motorman.28 With the GDPR, organisations
will also have to engage with the ICO where risk related processing
operations are envisaged and when data protection impact assessments
are carried out – all in advance of any data collections whatsoever
having occurred.

The Data Protection Rules


1.13 Personal data protection is enshrined in the DPA 2018 in the UK,
and the GDPR. (The GDPR is directly effective and applicable in each

20 ICO v CRDNN [2020] 2/3/2020.


21 ICO v Christopher Niebel and Gary McNeish (2012). ICO, ‘Spam Texters Fined Nearly Half a Million Pounds’ at https://ico.org.uk. However, note that this was appealed.
22 ICO v Cullen [2019] 24/6/2019.
23 Prudential was fined £50,000. ICO v Prudential [2012] 6/11/2012.
24 Belfast Health and Social Care (BHSC) Trust. ICO, ‘Belfast Trust Fined £225,000 After Leaving Thousands of Patient Records in Disused Hospital,’ at https://ico.org.uk.
25 For example, ICO, ‘Bank Employee Fined for Reading Partner’s Ex-Wife’s Statements’
at https://ico.org.uk. Also ICO v Shaw [2019] 5/12/2019; ICO v Shipsey [2019]
2/12/2019.
26 Facebook was audited by one of the EU data protection authorities. See https://dataprotection.ie. Note also Europe Against Facebook, at http://europe-v-facebook.org/EN/en.html.
27 At http://www.levesoninquiry.org.uk/.
28 For more details see ‘Operation Motorman – Steve Whittamore Notebooks,’ ICO
website, at https://www.ico.org.uk.


EU state in addition to national legislation). As a consequence of Brexit occurring (in particular following the EU (Withdrawal Agreement)
Act 2020 in January 2020) we might expect further changes, and perhaps
even more confusion.
Outward-Facing Data Protection Compliance
1.14 The data protection regime creates legal obligations which
­organisations must comply with when collecting and processing the
personal data of individuals. ‘[I]f someone can be distinguished from
other people, data protection legislation is applicable.’29 This applies to
customers and prospective customers, hence there are outward-facing
obligations. It can also apply to non-customers who may be using a
­particular website but are not a registered customer, if their personal data
are being collected.
Inward-Facing Data Protection Compliance
1.15 The data protection regime also applies to the organisation in its
dealings regarding the personal data of its employees. Equally, where
the organisation is engaging third party independent contractors but is
collecting, processing and using their personal data, the data protection
regime will also apply. Hence, the data protection regime in relation to
organisations is inward-facing.
1.16 As well as creating legal compliance obligations for organisa-
tions, the data protection regime enshrines certain rights or data protec-
tion rights for individuals in terms of ensuring their ability to know what
personal data are being collected, to consent – or not consent – to the
collection of their personal data, and to control the uses to which their
personal data may be put. There is also a mechanism through which indi-
viduals can complain to Controllers holding their personal data, the ICO,
and also via the courts directly.
Information Commissioner
1.17 In order to ensure that the duties are complied with, and the rights
of individuals vindicated, there is an official data protection supervisory
authority (SA) established to monitor and act as appropriate in relation
to compliance with and the efficient operation of the data protection
regime. This role is fulfilled in the UK by the ICO.

29 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254 at 256.


Data Protection Rules


1.18 Why have a data protection regime? We have a data protection
regime because of the legal, societal, and political recognition that
society respects the personal privacy and informational privacy of
­individuals. In the context of data protection, that means respect for,
control of, and security in relation to informational personal data. The
DPA 2018 protects personal data relating to individuals, which includes
employees, contractors, customers and users.
Data protection exists in order to ensure:
●● protection and regulation of the collection, use, processing and
transfer of personal information;
●● protection in relation to personal information;
●● the consent of individuals is obtained to collect and process personal
data;
●● security in respect of the right to privacy and personal information;
●● protection against privacy and informational privacy abuse;
●● protection against privacy theft and identity theft;
●● protection against unsolicited direct marketing (DM);
●● remedies are available to individual Data Subjects.
The threats to personal data and informational privacy have increased with the ease with which personal data can be collected and transferred electronically. This has increased further with digital technology, computer processing power and the rise of Internet 2.0 and social media.30
The threat also increases as new tech advances create more c­ ategories
of data which can be collected and commercialised (even during a
pandemic).

Summary Data Protection Rules


1.19 Controllers must comply with a number of data protection issues,
perhaps the foremost of which relate to:
●● fairness;
●● transparency;
●● consent;
●● accuracy;
●● security;

30 Note generally, comments of the ICO in relation to Privacy by Design (PbD), and the
report Privacy by Design, at https://www.ico.org.uk.


●● proper procedures for processing;


●● proportionality of the need and use of personal data;
●● risk assessments.
The collection, use and onward transfer of personal data must be fair,
lawful and transparent.
Transparency (and access to a person’s personal data) is increasingly emphasised in relation to the internet and social
media sphere. While this is beginning to be examined and there have
been improvements with some websites, there is still a long way to go.
Some have more improvements to make, even including the implemen-
tation of privacy policies and statements, reporting mechanisms and pro-
cedures. These also need to be assessed at the front and back end. In the
earlier Prudential case where a fine of £50,000 was issued, there was
potential financial loss to the Data Subject. In the recent True Visions
case, individuals were filmed in a maternity ward for a TV programme
without proper diligence, transparency and consent. This resulted in a
fine of £120,000.31 The practice of selling personal data can also have
adverse consequences for individuals. Bounty (UK) was fined for data
selling and data brokering, and also questionable data collection prac-
tices. The ICO imposed a fine of £400,000.32
The personal data must be correct and accurate. The reason is that
damage or harm to the individual Data Subject can be a consequence
of inaccurately held personal data. For example, a credit rating could
be adversely affected through incorrect or wrong personal data records
regarding personal payment histories.
There is a detailed obligation in terms of safeguarding personal data.
Organisations must assess and implement data security measures to pro-
tect personal data. Increasingly, this is also being considered in relation
to the developing cloud environment and the increasing use of Proces-
sors and third parties.33
There has been an obligation on Controllers to register or notify the
ICO as regards their data processing activities.

31 ICO v True Visions Productions [2019] 10 April 2019.


32 ICO v Bounty (UK) Limited [2019] 11 April 2019.
33 In relation to cloud generally, see W Kuan Hon and Christopher Millard, ‘Data
Export Cloud Computing – How Can Personal Data be Transferred Outside the EEA?
The Cloud of Unknowing,’ SCRIPTed (2012) (9:1); G Singh and S Mishra, ‘Cloud
Computing Security and Data Protection: A Review,’ International Journal of Comput-
ers & Technology (2015) (14:7) 5887; F Pfarr, T Buckel and A Winkelmann, ‘Cloud
Computing Data Protection – A Literature Review and Analysis,’ 2014 47th Hawaii
International Conference on System Sciences (2014) 5018.


If personal data is permitted to be transferred to third countries, it must qualify under a specific exemption, as well as the general security
conditions. (Note that the issue of the EU-US Safe Harbour exemption
mechanism is a matter of particular contention and import at this time
given that the Court of Justice has invalidated this mechanism34 – known
as the EU-US Safe Harbour Agreement – and the EU and US had to
negotiate a replacement mechanism to legitimise the controlled transfer
of EU data to the US, known as the EU-US Privacy Shield. The UK now
needs to address how it might legitimately transfer UK and EU data to
the US, which is deemed to have lesser data protection standards than
the UK and EU, if Brexit occurs. This may not be without difficulty,
and the application process for an adequacy decision may take longer
than is ideal from a UK perspective.)
Controllers can have a duty of care to individuals as regards their
personal data being processed by the organisation, particularly if loss or
damage arises.
Controllers and Processors have obligations in certain circumstances
to have legal contracts in place between them. Processors process and
deal with personal data for, and on behalf of, a Controller in relation to specific defined tasks, eg activities such as outsourced payroll, personnel, marketing, market research, customer satisfaction surveys, etc. Additional issues and considerations arise for cloud services and
data protection compliance. From an organisational perspective, it is
sometimes considered that organisational customers have less opportu-
nity to negotiate clauses in cloud service provider contracts, including
Processor and security related contracts. There is, therefore, a greater obligation to be satisfied as to the cloud service provider, where the data is located, and the security measures and security documentation available.

General Criteria for Data Processing


1.20 Generally, in order to lawfully collect and process personal data,
a Controller should be aware that:
●● the individual Data Subject must consent to the collection and
processing of their personal data;
●● the Data Subject may say that they object to processing or continued
processing;

34 Schrems v Commissioner, Court of Justice, Case C-362/14, 6 October 2015. The case
technically related to Prism and Facebook Europe and transfers to the US. However,
the wider import turned out to be the entire EU-US Safe Harbour Agreement and data
transfers to the US.


●● legal data protection requirements are complied with;


●● the prior information requirements, Principles of data protection,
Lawful Processing Conditions, Sensitive Personal Data Lawful
Processing Conditions (in the case of special personal data), and
security obligations are required to be complied with;
●● the rights and interests of the individual Data Subject must be
respected and complied with.
The interests of the Controller can sometimes be relevant in particular
instances in deciding what data processing is necessary and permitted.

Data Protection Overview


1.21 The DPA 2018 and GDPR set out a number of structures, obliga-
tions, rights and implementing criteria which are together the basis of the
legal data protection regime in the UK.
The main criteria and obligations to be respected and complied with
in order to be able to legally collect and process personal data include:
●● the definitions of personal data and the data protection regime;
●● the data Principles of data protection, previously known as the ‘data
quality principles’;
●● the Lawful Processing Conditions;
●● the requirement that processing of personal data be ‘legitimate’
under at least one of the Lawful Processing Conditions;
●● recognising the two categories of personal data covered by the data
protection regime, namely, sensitive personal data and non-sensitive
general personal data;
●● in the case of sensitive personal data, complying with the additional
Special Personal Data Lawful Processing Conditions;
●● ensuring the fair obtaining of all personal data collected and
processed;
●● taking and ensuring appropriate security measures in relation to all
processing activities;
●● implementing formal legal contracts when engaging or dealing
with third party Processors (eg outsourcing data processing tasks or
activities);
●● complying with the separate criteria in relation to automated decision making processes or automated decisions;
●● complying with the legal criteria for direct marketing (DM);
●● a duty of care exists in relation to the individual Data Subjects whose
personal data the organisation is collecting and processing;


●● the jurisdictional transfer of personal data is strictly controlled. Personal data may not be transferred unless specifically permitted
under the data protection regime;
●● access requests, or requests by individuals for copies of their
personal data held by the organisation, must be complied with
(with limited exceptions);
●● identify processing risk situations in advance;
●● implementing internal data protection policies and terms;
●● implementing outward facing data protection policies for customers,
etc;
●● implementing outward facing website privacy statements (generally
a data protection policy covers organisation-wide activities, whereas
a website privacy statement governs only the online collection and
processing of personal data);
●● implementing mobile, computer and internet usage policies;
●● implementing data loss, data breach, incident handling and incident
reporting policies and associated reaction plans;35
●● keeping abreast of the increasing trend towards sector/issue specific
rules eg Spam; direct marketing (DM); industry codes of conduct36
in relation to personal data, etc;
●● complying with new legal developments;
●● complying with the new pre-data requirements eg risk assessments;
data protection by design and default.

Lawful Processing
1.22 There is a prohibition on the collection and processing of personal
data and special personal data unless:
●● the processing complies with the Principles of data protection; and
●● the processing comes within one of a limited number of specified
conditions (the Lawful Processing Conditions);
●● the processing must also comply with the data security requirements.

35 Note, for example, the ICO PECR security breach notifications – guidance for service
providers, at https://www.ico.org.uk.
36 The DPA and the EU data protection regime provide for codes of conduct being agreed
with national data protection authorities such as the ICO in relation to specific industry
sectors.


Definitions
1.23 The data protection regime contains a number of key definitions.
These are central to understanding the data protection regime, and
­ultimately complying with it. These are essentially the building blocks
of the data protection regime. While these can be ‘complex concepts,’37
organisations need to fully understand them. Some examples of the
­matters defined include:
●● Data Subject;
●● Controller;
●● Processor;
●● personal data;
●● processing;
●● special personal data.
The definitions are found in greater detail in Chapter 3.

37 D Hallinan, M Friedewald and P McCarthy, ‘Citizens’ Perceptions of Data Protection and Privacy in Europe,’ Computer Law & Security Review (2012) (28) 263.

16
Chapter 2
Sources of Data Protection Law

Introduction
2.01 Organisations and individuals need to consider a number of
sources of the law and policy underpinning the data protection regime.
In addition, there are a growing number of sources of interpretation and
understanding of data protection law. Reliance on the Data Protection
Act 2018 (DPA 2018) (or the EU General Data Protection Regulation
(GDPR)) alone can, therefore, be insufficient. Data protection is there-
fore arguably quite different from many other areas of legal practice.
In order to fully understand the data protection regime, one has to look
beyond the text, or first principles, of the DPA 2018.
What are the sources of data protection law and policy?

UK DPA 2018
2.02 Primarily, the data protection regime in the UK is governed
by the DPA 2018 and the GDPR (and previously EU Data Protection
Directive 1995 (DPD 1995)). (Note that Brexit will bring an end to the
direct effect of the GDPR in the UK. While this should have ceased from
31 January 2020 given the EU (Withdrawal Agreement) Act 2020 of that
date, the intention appears to be that the GDPR will continue during the
formal transition period for final negotiations with the EU, which are
(currently) scheduled to end on 31 December 2020). In addition, it is also
necessary to have regard to a number of other sources of law, policy and
the interpretation of the data protection regime, as well as any amend-
ments to same. Ultimately, Brexit will necessitate further data protection
regulation in the UK, and this will also be required to be reviewed in
addition to the DPA 2018. (See also Chapter 24 below).


UK Secondary Legislation
2.03 In addition to DPA 2018, various statutory instruments need to be
considered, which include:
●● Adoption Agency Regulations 1983;
●● Adoption Rules 1984;
●● Banking Coordination (Second Council Directive) Regulations
1992;
●● Civil Procedure Rules 1998;
●● Communications Act 2003;
●● Consumer Credit (Credit Reference Agency) Regulations 1977;
●● Consumer Credit (Credit Reference Agency) Regulations 2000
(repealed);
●● Consumer Protection (Distance Selling) Regulations 2000;
●● Data Protection Act 1998 (Commencement) Order 2000;
●● Data Protection Act 1998 (Commencement No 2) Order 2000;
●● Data Protection Act 1998 (Commencement No 2) Order 2008;
●● Data Protection (Corporate Finance Exemption) Order 2000
(repealed);
●● Data Protection (Conditions under paragraph 3 of Part II of
Schedule 1) Order 2000 (repealed);
●● Data Protection (Crown Appointments) Order 2000 (repealed);
●● Data Protection (Designated Codes of Practice) Order 2000;
●● Data Protection (Designated Codes of Practice) (No 2) Order 2000
(repealed);
●● Data Protection (Fees under section 19(7)) Regulations 2000;
●● Data Protection (Functions of Designated Authority) Order 2000
(repealed);
●● Data Protection (International Cooperation) Order 2000 (repealed);
●● Data Protection (Miscellaneous Subject Access Exemptions) Order
2000 (repealed);
●● Data Protection (Miscellaneous Subject Access Exemptions)
(Amendment) Order 2000;
●● Data Protection (Monetary Penalties) (Maximum Penalty and
Notices) Regulations 2010 (repealed);
●● Data Protection (Notification and Notification Fees) Regulations
2000;
●● Data Protection (Notification and Notification Fees) (Amendment)
Regulations 2001;
●● Data Protection (Notification and Notification Fees) (Amendment)
Regulations 2009;
●● Data Protection (Processing of Sensitive Personal Data) Order 2000
(repealed);


●● Data Protection (Processing of Sensitive Personal Data) Order 2006 (repealed);
●● Data Protection (Processing of Sensitive Personal Data) Order 2009
(repealed);
●● Data Protection (Processing of Sensitive Personal Data) (Elected
Representatives) Order 2002;
●● Data Protection (Processing of Sensitive Personal Data) (Elected
Representatives) (Amendment) Order 2002 (repealed);
●● Data Protection Registration Fee Order 1991;
●● Data Protection (Subject Access Modification) (Health) Order 2000
(repealed);
●● Data Protection (Subject Access Modification) (Education) Order
2000 (repealed);
●● Data Protection (Subject Access Modification) (Social Work) Order
2000 (repealed);
●● Data Protection (Subject Access) (Fees and Miscellaneous Provi-
sions) Regulations 2000 (repealed);
●● Data Protection (Subject Access) (Fees and Miscellaneous Provi-
sions) (Amendment) Regulations 2001;
●● Data Protection (Subject Access Modification) (Social Work)
(Amendment) Order 2005;
●● Data Protection Tribunal (Enforcement Appeals) Rules 2000;
●● Data Protection Tribunal (National Security Appeals) Rules 2000;
●● Data Protection Tribunal (National Security Appeals) (Telecommu-
nications) Rules 2000;
●● Data Retention (EC Directive) Regulations 2007;
●● Education (School Records) Regulations 1989;
●● Education (Special Educational Needs) Regulations 1994;
●● Electronic Commerce (EC Directive) Regulations 2002;
●● Electronic Communications (Universal Service) Order 2003;
●● Environmental Information Regulations 2004;
●● Family Proceedings Courts (Children Act 1989) Rules 1991;
●● Family Proceedings Rules 1991;
●● Freedom of Information and Data Protection (Appropriate Limit and
Fees) Regulations 2004;1
●● Health Professions Order 2001;
●● Health Service (Control of Patient Information) Regulations 2002;

1 The Freedom of Information and Data Protection (Appropriate Limit and Fees)
Regulations 2004.


●● Information Tribunal (Enforcement Appeals) Rules 2005;


●● Information Tribunal (Enforcement Appeals) (Amendment) Rules
2002;
●● Information Tribunal (Enforcement Appeals) (Amendment) Rules
2005;
●● Information Tribunal (National Security Appeals) Rules 2005;
●● INSPIRE Regulations;2
●● Insurance Companies (Third Insurance Directives) Regulations
1994;
●● Investment Services Regulations 1995;
●● Magistrates Courts (Adoption) Rules 1984;
●● Magistrates Courts (Children and Young Persons) Rules 1992;
●● Magistrates Courts (Criminal Justice (Children)) Rules 1992;
●● Open Ended Investment Companies (Investment Companies with
Variable Capital) Regulations 1996;
●● Parental Orders (Human Fertilisation and Embryology) Regulations
1994;
●● Pharmacists and Pharmacy Technicians Order 2007;
●● Police Act 1997 (Criminal Records) Regulations 2002;
●● Police Act 1997 (Criminal Records) (Amendment) Regulations
2007;
●● Privacy and Electronic Communications (EC Directive) Regulations
2003 (PECR);3
●● Privacy and Electronic Communications (EC Directive) (Amend-
ment) Regulations 2004;
●● Privacy and Electronic Communications (EC Directive) (Amend-
ment) Regulations 2011 (PECR Amendment Regulations);
●● Regulation of Investigatory Powers (Interception of Communica-
tions: Code of Practice) Order 2002;
●● Regulation of Investigatory Powers (Communications Data) Order
2003;
●● Regulation of Investigatory Powers (Communications Data)
(Additional Functions and Amendment) Order 2006;
●● Rehabilitation of Offenders (Exceptions) (Amendment) Order 2001;
●● Rehabilitation of Offenders (Exceptions) (Amendment) (No 2)
Order 2001;
●● Representation of People (England and Wales) Regulations 2001;

2 The INSPIRE Regulations, 2009.


3 The Privacy and Electronic Communications (EC Directive) Regulations 2003.


●● Representation of People (England and Wales) (Amendment) Regulations 2001;
●● Representation of People Regulations 1986;
●● Telecommunications (Data Protection and Privacy) (Direct Market-
ing) Regulations 1998;
●● Telecommunications (Data Protection and Privacy) Regulations
1999;
●● Telecommunications (Data Protection and Privacy) Regulations
2000;
●● Telecommunications (Data Protection and Privacy) (Amendment)
Regulations 2000;
●● Telecommunications (Lawful Business Practice) (Interception of
Communications) Regulations 2000;
●● Transfer of Undertakings (Protection of Employment) Regulations
2006;
●● Unfair Terms in Consumer Contracts Regulations 1999;
●● Waste Electrical and Electronic Equipment Regulations 2006.
It is also necessary to look out for any repeals, amendments or replace-
ments to the above. In addition, the following need to be considered:
●● Data Protection Act 2018 (Commencement No 3) Regulations 2019;
●● Data Protection, Privacy and Electronic Communications (Amend-
ments etc) (EU Exit) (No 2) Regulations 2019;
●● Data Protection (Charges and Information) (Amendment) Regula-
tions 2019;
●● Data Protection, Privacy and Electronic Communications (Amend-
ments etc) (EU Exit) Regulations 2019;
●● Protection of Freedoms Act 2012 (Destruction, Retention and Use
of Biometric Data) (Transitional, Transitory and Saving Provisions)
(Amendment) Order 2018;
●● Data Protection Act 2018 (Commencement No 1 and Transitional
and Saving Provisions) Regulations 2018;
●● Data Protection (Charges and Information) Regulations 2018;
●● Protection of Freedoms Act 2012 (Destruction, Retention and Use
of Biometric Data) (Transitional, Transitory and Saving Provisions)
(Amendment) Order 2016;
●● Protection of Freedoms Act 2012 (Destruction, Retention and Use
of Biometric Data) (Transitional, Transitory and Saving Provisions)
(Amendment) Order 2015.
Notwithstanding that there may be additional primary legislation in
order to bring the remainder of the GDPR into UK law after direct
effect ceases, and given that the DPA 2018 does not fully incorporate
the GDPR, there is wide provision for secondary legislation generally in


relation to Brexit issues and it is expected that there will be further data
protection specific secondary legislation.

EU Data Protection Law


2.04 The main sources of EU data protection law include:
●● GDPR;
●● ePR (once enacted);
●● Directive 2002/58 on Privacy and Electronic Communications
(ePrivacy Directive) amended by Directive 2009/136 (Cookie
­Regulation Directive);
●● Regulation (EC) No 45/2001 on the protection of personal data by
EU institutions;4
●● Regulation (EU) No 910/2014 of the European Parliament and of
the Council of 23 July 2014 on electronic identification and trust
services for electronic transactions;5
●● Commission Implementing Regulation (EU) 2018/151 laying down
rules for application of Directive (EU) 2016/1148 as regards fur-
ther specification of the elements to be taken into account by digital
service providers for managing the risks posed to the security of
network and information systems.6
It is also necessary to keep abreast of all amendments to the same.
The DPA 2018 and GDPR are arguably the most important develop-
ments in data protection since 1995 (see Part 4) – both inside and outside
of the EU. There are significant implications for organisations and data
protection practice. Of course, that is not to downplay the significance of
Brexit and the changes it brings to the UK data protection regime – and
importantly to UK international relationships relating to data protection
and data transfers.

4 Regulation (EC) No 45/2001 of the European Parliament and of the Council of


18 December 2000 on the protection of individuals with regard to the processing of
personal data by the EU institutions and bodies and on the free movement of such data,
Regulation (EC) No 45/2001, OJ L 8, 12.1.2001.
5 Regulation (EU) No 910/2014 of the European Parliament and of the Council of
23 July 2014 on electronic identification and trust services for electronic transactions
in the internal market and repealing Directive 1999/93/EC.
6 Commission Implementing Regulation (EU) 2018/151 of 30 January 2018 laying
down rules for application of Directive (EU) 2016/1148 of the European Parliament
and of the Council as regards further specification of the elements to be taken into
account by digital service providers for managing the risks posed to the security of
network and information systems and of the parameters for determining whether an
incident has a substantial impact, C/2018/0471.


Case Law
2.05 Increasingly data protection cases (and cases which involve
direct or indirect reference to personal data and information impacting
the data protection regime) are coming to be litigated and determined
before the courts.
One of the reasons is that individuals are increasingly aware of and
concerned about their rights under the data protection regime. A further
reason is that technological developments have enhanced the potential for abuse of personal data, from spam, unsolicited direct marketing (DM), hacking and data loss, phishing, email and internet scams, online abuse, data transfers and access to personal data, with associated litigation.
The case law which is relevant, whether influential or binding, to applying and interpreting the UK data protection regime includes:
●● case studies and documentation from the ICO;
●● case complaints adjudicated by the ICO and or Tribunal;
●● cases in England and Wales;
●● cases in Scotland;
●● cases in Northern Ireland;
●● EU Court of Justice cases (CJEU)(previously ECJ);
●● European Court of Human Rights (ECHR) cases;
●● relevant cases in other EU states and/or Common Law jurisdictions.
Some of the relevant cases are summarised and referenced herein. The increasing impact of cooperation between data regulators, and indeed sometimes joint investigations, should also be considered as these become more influential.
Indeed, data regulator cases elsewhere can also be influential in any subsequent investigation by the ICO of the same issues as they may relate to UK Data Subjects. There are clear examples of corporations being involved in data cases in more than one jurisdiction.

ICO Guides
2.06 The ICO provides a number of guides and interpretations in relation to specific data protection issues and industry sectors. These include guides and guidance for:
●● Brexit;
●● apps, online and electronic devices;
●● audits;
●● Big Data;


●● Cambridge Analytica;
●● CCTV;
●● charity;
●● child data;
●● construction blacklists;
●● credit and finance;
●● crime mapping;
●● criminal, court and police records;
●● data processing;
●● Data Protection by Design (DPbD);
●● data protection by default and by design;
●● Data Protection Officers;
●● data risk assessment and data protection impact assessments;
●● data sharing;
●● data subject access;
●● data protection – general;
●● deletion;
●● the new DPA 2018;
●● Driver and Vehicle Licensing Agency (DVLA);
●● drones;
●● education;
●● elected officials;
●● electoral register;
●● electronic communications and marketing;
●● employment;
●● encryption;
●● finance;
●● fundraising;
●● the GDPR and implications of GDPR for the UK;
●● health;
●● health records;
●● housing;
●● identity theft;
●● identity scanning;
●● international transfer;
●● marketing;
●● media;
●● monetary penalties;
●● online;
●● online and electronic devices;
●● Operation Motorman;
●● personal data;
●● political campaigning and personal data;
●● Principles of data protection;

●● privacy and electronic communications – general;
●● privacy notice;
●● relevant filing system;
●● RFID tags;
●● Sandbox and research in relation to data protection issues;
●● schools, universities and colleges;
●● security;
●● spam;
●● spatial information;
●● telecommunications.7
These will no doubt continue to expand with the increasing importance of the respective data protection issues.

ICO Determinations
2.07 In addition, there is a body of decisions in relation to complaints filed by individuals with the ICO on various issues. These include the ICO’s Office in England and Wales8 and the ICO’s Office in Scotland.9 They can assist in considering identical and similar situations regarding issues of data protection compliance.10

Legal Textbooks
2.08 There are an increasing number of data protection legal textbooks
and guides. Frequently also, IT legal textbooks will have chapters or sec-
tions dedicated to data protection. Some examples of the former include:
●● Data Protection Law and Practice, Rosemary Jay (Fourth Edition);
●● Data Protection Law and Practice, Supplements, Rosemary Jay;
●● The Data Protection Officer, Profession, Rules and Role, Paul
­Lambert (2017);
●● Guide to the General Data Protection Regulation: A Companion
to Data Protection Law and Practice (4th edition), Rosemary Jay,
William Malcolm, Ellis Parry, Louise Townsend, and Anita Bapat
(2017);

7 See https://ico.org.uk.
8 At https://ico.org.uk/.
9 At https://ico.org.uk/about_us/our_organisation/scotland.aspx.
10 In relation to the ICO and National Authorities generally regarding data protection
not DPD; and also Greenleaf, G, ‘Independence of Data Privacy Authorities (Part 1):
International Standards,’ Computer Law & Security Review (2012) (28) 3–13.

●● Understanding the New European Data Protection Rules, Paul Lambert (2017);
●● The Right to be Forgotten, Paul Lambert (2019);
●● Data Protection, Privacy Regulators and Supervisory Authorities,
Paul Lambert (2020);
●● EU Data Protection Law, Denis Kelleher and Karen Murray (2017);
●● Transatlantic Data Protection in Practice, Rolf H. Weber and
Dominic Staiger (2017);
●● Emerging Challenges in Privacy Law: Comparative Perspectives,
David Lindsay, Moira Paterson, and Sharon Rodrick (2014);
●● Data Protection: The New Rules, Ian Long (2016);
●● Data Protection Strategy, Richard Morgan and Ruth Boardman
(2012);
●● Data Protection Law and International Dispute Resolution, Daniel
Cooper and Christopher Kuner (2017);
●● Data Protection Compliance in the UK: A Pocket Guide, Rosemary
Jay and Jenna Clarke (2008);
●● Data Protection: Law and Practice, Rosemary Jay and Angus
Hamilton (2007);
●● Data Protection Law, David Bainbridge (2005);
●● Data Protection Toolkit, Alison Matthews (2014);
●● Reforming European Data Protection Law (Law, Governance and
Technology Series/Issues in Privacy and Data Protection), Serge
Gutwirth and Ronald Leenes (2015);
●● Protecting Privacy in Private International and Procedural Law
and by Data Protection: European and American Developments,
Burkhard Hess (2015);
●● Market Integration Through Data Protection: An Analysis of the
Insurance and Financial Industries in the EU (Law, Governance
and Technology Series), Mario Viola de Azevedo Cunha (2015);
●● The Foundations of EU Data Protection Law, Dr Orla Lynskey
(2015);
●● Data Protection for Photographers: A Guide to Storing and Protect-
ing Your Valuable Digital Assets, Patrick H Corrigan (2014);
●● Data Protection on the Move: Current Developments in ICT and
Privacy/Data Protection, Serge Gutwirth and Ronald Leenes (2016);
●● The Emergence of Personal Data Protection as a Fundamental
Right of the EU, Gloria González Fuster (2014);
●● Privacy in the Age of Big Data: Recognizing Threats, Defending
Your Rights, and Protecting Your Family, Hon Howard A Schmidt
and Theresa M Payton (2015);
●● Data Protection and the Cloud: Are the Risks Too Great?,
Paul Ticher (2015);

●● Data Protection: A Practical Guide to UK and EU Law, Peter Carey (2018);
●● Property Rights in Personal Data, A European Perspective,
Nadezhda Purtova (2012);
●● Data Protection: Legal Compliance and Good Practice for Employ-
ers, Lynda AC Macdonald (2008);
●● Data Protection in the Financial Services Industry, Mandy Webster
(2006);
●● Data Protection and Compliance in Context, Stewart Room (2007);
●● European Data Protection Law: Corporate Compliance and
Regulation, Christopher Kuner (2007);
●● Data Protection for Financial Firms: A Practical Guide to Managing Privacy and Information Risk, Tim Gough, ed (2009);
●● Data Protection for Voluntary Organisations, Paul Ticher (2009);
●● Data Protection for Virtual Data Centers, Jason Buffington (2010);
●● Effective Data Protection: Managing Information in an Era of
Change, Mandy Webster (2011);
●● Data Protection & Privacy: Jurisdictional Comparisons, Monika
Kuschewsky, ed (2012);
●● Implementation of the Data Protection Directive in Relation to
Medical Research in Europe, D Beyleveld et al (2005);
●● Research Ethics Committees, Data Protection and Medical Research
in European Countries, D Beyleveld, D Townend and J Wright
(2005);
●● Data Protection and Employment Practices, Susan Singleton (2005);
●● Data Protection: A Practical for Employers, Engineering Employ-
ers’ Federation (2005);
●● The New Data Protection Liabilities & Risks for Direct Marketers:
Handbook, Rosemary Smith, and Jenny Moseley (2005);
●● Data Protection in the NHS, Murray Earle (2003);
●● Data Protection Strategy, Implementing Data Protection
Compliance, Richard Morgan and Ruth Boardman (2012);
●● Data Protection & the Pensions Industry: Implications of the Data
Protection Act, Clare Fawke & Louise Townsend (2002);
●● Data Protection Law: Approaching its Rationale, Logic and Limits,
Lee A Bygrave (2002);
●● Regulating Spam: A European Perspective After the Adoption of the
E-Privacy Directive, L Asscher, and SA Hoogcarspel (2006);
●● Information and Decisional Privacy, Madeleine Schachter (2003);
●● Privacy on the Line, The Politics of Wiretapping and Encryption,
Whitfield Diffie and Susan Landau (1998);
●● European Data Protection Law: Corporate Regulation and
Compliance, Christopher Kuner (2007).

●● Report on the Implementation of the Data Protection Directive (95/46/EC) (ie the DPD);
●● Encyclopaedia of Data Protection.
Some examples of the latter include:
●● Gringras, The Laws of the Internet, Paul Lambert, editor (2018);
●● Information Technology and Intellectual Property Law, David Bainbridge (2019);
●● EU Internet Law: Regulation and Enforcement, Christiana Markou
and Thalia Prastitou (2017);
●● New Technology, Big Data and the Law (Perspectives in Law, Business and Innovation), Marcelo Corrales and Mark Fenwick (2017);
●● Face Detection and Recognition: Theory and Practice, Asit Kumar
Datta and Madhura Datta (2015);
●● Information Technology Law, Diane Rowland and Uta Kohl (2016);
●● International Handbook of Social Media Laws, Paul Lambert (2014);
●● E-Commerce and Convergence: A Guide to the Law of Digital
Media, Mike Butler ed (2012);
●● Information Technology Law, Ian J Lloyd (2017);
●● Information Technology Law: The Law and Society, Andrew Murray
(2013) (chapters 18 and 19);
●● Concise European IT Law, Alfred Büllesbach et al (2010);
●● Information Technology Law, Andrew Murray (2016);
●● The EU Regulatory Framework for Electronic Communications
Handbook, Michael Ryan (2010);
●● Law and the Internet, Lilian Edwards and Charlotte Waelde eds
(2009) (chapters 14–21);
●● Introduction to Information Technology Law, David Bainbridge
(2008);
●● Internet Law and Regulation, Graham JH Smith (2018);
●● Information Technology Law, Diane Rowland, Elizabeth Macdonald
(2005);
●● Computer Law: The Law and Regulation of Information ­Technology,
Chris Reed and John Angel, eds (2007) (chapters 10 and 11);
●● Outsourcing – The Legal Contract, Rachel Burnett (2005);
●● Outsourcing IT: The Legal Aspects: Planning, Contracting, Manag-
ing and the Law, Rachel Burnett (2009);
●● A Business Guide to Information Security: How to Protect Your
Company’s IT Assets, Reduce Risks and Understand the Law, Alan
Calder (2005);

●● The New Legal Framework for E-Commerce in Europe, L Edwards, ed (2005);
●● Communications Law Handbook, Mike Conradi, ed (2009);
●● Government and Information: The Law Relating to Access, Disclo-
sure and their Regulation, Birkinshaw and Varney (2012).

Legal Journals
2.09 There are also relevant learned journals and articles published in
relation to data protection compliance and developing data protection
issues. Some examples include:
●● Data Protection Law and Practice;
●● Communications Law;
●● Journal of Information Law and Technology;
●● Computers and Law from the Society for Computers and Law, at www.scl.org;
●● SCRIPTed;
●● Computer Law and Security Review;
●● IT Law Today;
●● International Journal for the Data Protection Officer, Privacy
Officer and Privacy Counsel.

EDPB
2.10 In terms of the interpretation and understanding of the data pro-
tection regime in the UK, the European Data Protection Board (EDPB)
(established under Article 68 of the GDPR) is also required to be con-
sulted. (Previously this was effectively the WP29).11 This is an influen-
tial body in relation to addressing and interpreting the data protection
regime as well as problem areas in data protection practice. It is also
influential as it is comprised of members from the respective data protec-
tion authorities in the EU, including the ICO.
WP29 issued working papers, opinions and related documentation,
available at:
●● http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/index_en.htm.

11 WP29 was established under Article 29 of the DPD.

The EDPB is available at https://edpb.europa.eu. It has also issued the following guidance:
●● Information note on data transfers under the GDPR in the event of a
no-deal Brexit – 12/02/2019;
●● Information note on BCRs for companies which have ICO as BCR
Lead Supervisory Authority – 12/02/2019;
●● Data Protection Officer;
●● Data protection impact assessments;
●● High risk processing;
●● Personal data breach notifications;
●● Automated decision-making and profiling;
●● Lead Supervisory Authority.
There are also additional opinions, guidelines, recommendations and best practice documentation, which are very influential as the EDPB is composed of members of the respective data regulators and other officials.

European Data Protection Supervisor


2.11 The European Data Protection Supervisor is also worth
consulting and is arguably increasing in prominence and importance.
Details are at:
●● http://www.edps.europa.eu/EDPSWEB/edps/EDPS.

Council of Europe
2.12 There are various important reference materials in relation to data
protection and privacy emanating from the Council of Europe (with over
45 Member States), such as:
●● Council of Europe Convention on data protection, No 108 of 1981;
●● Recommendation R(85) 20 on Direct Marketing;
●● Recommendation R(86) 1 on Social Security;
●● Recommendation R(97) 1 on the Media;
●● Recommendation R(97) 5 on Health Data;
●● Recommendation CM/Rec (2014)4 on electronic monitoring;
●● Democratic and effective oversight of national security services
(2015);
●● etc.


These and other documents are available at: http://www.coe.int/t/dghl/standardsetting/dataprotection/Documents_TPD_en.asp.
The Council of Europe Convention on data protection12 of 1981 predates the DPD and GDPR and was incorporated into the national law of many EU and other states (40 plus) prior to the DPD. The Council of Europe is also reviewing and updating the Convention.13

Other Data Protection Authorities


2.13 Issues which have not yet been decided or formally reported on in the UK may already have been considered elsewhere. It can therefore be useful to consider the decisions, and the logic behind the decisions, reports and opinions of:
●● Data protection authorities of other EU states and EEA states;
●● Data protection authorities of other states eg Canada.
The European Data Protection Supervisor provides links to the data
protection authorities in the EU at:
●● http://ec.europa.eu/justice/policies/privacy/nationalcomm/index_en.htm.

Other Official Sources


2.14 Related issues can sometimes arise under freedom of information
legislation.
Inquiries and tribunals can also be relevant, such as the Leveson Inquiry, particularly in terms of protection for personal data, security, deliberate breaches,

12 Convention for the Protection of Individuals with regard to Automatic Processing of Personal Data, Council of Europe (1982). ‘Draft Convention for the Protection of Individuals with Regards to Automatic Processing of Personal Data,’ International Legal Materials (1980) (19) 284–298.
13 See S Kierkegaard et al, ‘30 Years On – The Review of the Council of Europe Data
Protection Convention 108,’ Computer Law & Security Review (2011) (27) 223–231;
G Greenleaf, ‘Modernising Data Protection Convention 108: A Safe Basis for a Global
Privacy Treaty,’ Computer Law & Security Review (2013) (29) 430–436; P de Hert and
V Papakonstantinou, ‘The Council of Europe Data Protection Convention Reform:
Analysis of the New Text and Critical Comment on its Global Ambition,’ Computer
Law & Security Review (2014) (30:6) 633.


hacking, etc, and the ICO and Operation Motorman. The report and
recommendations of the Leveson Inquiry are of interest from a data
protection perspective (see below, Part 4).

Key/Topical Issues
2.15 Some of the key developments and issues which also influence
the data protection regime and how it is interpreted include:
●● data transfers, Privacy Shield and other data transfer legitimising
mechanisms;
●● data breach incidents;
●● insurance for data breach incidents;
●● preparedness and team preparations for incidents arising;
●● risk assessments;
●● data protection and privacy impact assessments;
●● mandated Data Protection Officers (DPOs) in organisations;
●● deletion, take down, erasure and the Right to be Forgotten;
●● security requirements for business;
●● employee monitoring and consent;
●● Spam and direct marketing;
●● the relationship between the Controller and the Processor, which needs to be formalised in contract pursuant to the DPA and GDPR;
●● disposal of computer hardware. Particular care is needed when considering the disposal of IT hardware, equipment and software. They may still contain personal data files. This can continue to be the case even when it appears that files have been wiped or deleted. There are many examples of personal data still being accessible even after it is believed to have been deleted and the device handed over to a third party, or worse, sold on. The new recipient could be able to access the original personal data and records. This could quite easily be a breach of a number of principles in the data protection regime. It is always advised to take professional legal, IT and/or forensic advice when considering disposing of computer devices;
●● websites and social media compliance with the data protection
regime;
●● online abuse;
●● Internet of Things (IoT) and devices;
●● location data;
●● metadata;
●● new forms of data and personal data;
●● the ongoing ripples from the Snowden disclosures.
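The hardware-disposal point above can be sketched in code. The following is a minimal, hypothetical Python illustration (not the author's guidance, and not forensic-grade disposal): it assumes a simple flat file and shows the idea of overwriting contents before deletion, since an ordinary delete typically only removes the directory entry while the underlying data may remain recoverable.

```python
import os
import tempfile

def overwrite_then_delete(path: str, passes: int = 1) -> None:
    """Overwrite a file's contents in place, then delete it.

    Illustrative precaution only: journaling file systems, SSD wear
    levelling and backups mean data may survive regardless, which is
    why the text advises professional legal, IT and/or forensic advice.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(b"\x00" * size)  # replace every byte with zeros
            f.flush()
            os.fsync(f.fileno())     # push the overwrite to disk
    os.remove(path)                  # only now remove the directory entry

# Example: a temporary file holding mock personal data.
fd, p = tempfile.mkstemp()
with os.fdopen(fd, "wb") as f:
    f.write(b"name: J Bloggs; email: [email protected]")
overwrite_then_delete(p)
assert not os.path.exists(p)
```

By contrast, a bare `os.remove(p)` would leave the original bytes on disk until the space is reused, which is precisely the risk the bullet describes.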


Data Protection Websites and Blogs


2.16 There are a number of privacy and data protection websites and blogs, such as Datonomy, available at http://www.datonomy.eu/, the Data Protection Forum at www.dpforum.org.uk and the Society for Computers and Law at www.scl.org.

Other Laws
2.17 Other laws can also be relevant in considering personal data and
privacy.14 Examples include:
●● IT law;
●● contract law;
●● consumer law;
●● eCommerce law;
●● financial services law;
●● health and medical law and related obligations;
●● computer crime and theft laws;
●● abuse and harassment laws;
●● insurance law;
●● travel, accommodation and hospitality law;
●● motoring law;
●● vehicle and taxi law;
●● succession law;
●● developing drone laws, licensing and regulations.

Conferences
2.18 There are a variety of conferences, annual events and training
organisations related to data protection. Some will be organised by professional conference firms, while others are organised by non-profit technology, legal or related organisations.

14 See review of particular laws in R Delfino, ‘European Union Legislation and Actions,’
European Review of Contract Law (2011) (7) 547–551, which includes reference to
data protection law.


Reference
2.19 Useful reference material is available as set out below.
The DPA 2018 at: https://www.legislation.gov.uk/ukpga/2018/12/enacted
The ICO is at: https://ico.org.uk/
The EU Commission is at: http://ec.europa.eu/justice/data-protection/index_en.htm
The WP29 and EDPB (https://edpb.europa.eu/)
International Journal for the Data Protection Officer, Privacy Officer
and Privacy Counsel
The European Court of Justice website is at: http://europa.eu/about-eu/
institutions-bodies/court-justice/index_en.htm
Court of Justice (previously ECJ and CJEU) cases15 are at: http://curia.europa.eu/juris/recherche.jsf?language=en
The ECHR website is at: http://echr.coe.int/Pages/home.aspx?p=home

15 M Tzanou, ‘Balancing Fundamental Rights, United in Diversity? Some Reflections on the Recent Case Law of the European Court of Justice on Data Protection,’ CYELP (2010) (6) 53–74.

Chapter 3
Definitions

Introduction
3.01 An understanding of the definitions of the key terms which underpin the legal measures implementing the data protection regime is critical to understanding that regime. The definitions are the building blocks of the data protection regime. They are contained in the Data Protection Act 2018 (DPA 2018) (and the EU General Data Protection Regulation (GDPR)). The definitions in the GDPR (the new EU data protection regime), which the DPA and the UK data protection regime should mirror, should be considered in detail. The GDPR will also be the main EU data protection legal measure for many years to come.
The various definitions are referred to below.

DPA Definitions
3.02 The DPA 2018 (ss 3 and 205) sets out the following definitions:

‘personal data’ means any information relating to an identified or identifiable living individual (subject to s 3(14)(c));

‘identifiable living individual’ means a living individual who can be identified, directly or indirectly, in particular by reference to –
(a) an identifier such as a name, an identification number, location data or an online identifier, or
(b) one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of the individual;

‘processing’, in relation to information, means an operation or set of operations which is performed on information, or on sets of information, such as –
(a) collection, recording, organisation, structuring or storage,
(b) adaptation or alteration,
(c) retrieval, consultation or use,
(d) disclosure by transmission, dissemination or otherwise making available,
(e) alignment or combination, or
(f) restriction, erasure or destruction,
(subject to s 3(14)(c) and ss 5(7), 29(2) and 82(3), which make provision about references to processing in different Parts of the Act);

‘data subject’ means the identified or identifiable living individual to whom personal data relates;

‘controller’ and ‘processor’, in relation to the processing of personal data to which Chapter 2 or 3 of Part 2, Part 3 or Part 4 applies, have the same meaning as in that Chapter or Part (see ss 5, 6, 32 and 83 and see also s 3(14)(d));

‘filing system’ means any structured set of personal data which is accessible according to specific criteria, whether held by automated means or manually and whether centralised, decentralised or dispersed on a functional or geographic basis;

‘the data protection legislation’ means –
(a) the GDPR,
(b) the applied GDPR,
(c) this Act,
(d) Regulations made under this Act, and
(e) Regulations made under s 2(2) of the European Communities Act 1972 which relate to the GDPR or the Law Enforcement Directive;

‘the applied GDPR’ means the GDPR as applied by Chapter 3 of Part 2;
[s 3]

‘biometric data’ means information or personal data resulting from specific technical processing relating to the physical, physiological or behavioural characteristics of an individual, which allows or confirms the unique identification of that individual, such as facial images or dactyloscopic data;

‘data concerning health’ means personal data relating to the physical or mental health of an individual, including the provision of health care services, which reveals information about his or her health status;

‘genetic data’ means personal data relating to the inherited or acquired genetic characteristics of an individual which give unique information about the physiology or the health of the individual and which results, in particular, from an analysis of a biological sample from the individual in question;

‘health record’ means a record which –
(a) consists of data concerning health, and
(b) has been made by or on behalf of a health professional in connection with the diagnosis, care or treatment of the individual to whom the data relates;

‘inaccurate’, in relation to personal data, means incorrect or misleading as to any matter of fact;
[s 205]
Terms used in Chapter 2 of Part 2 of the 2018 Act (General Processing)
have the same meaning in Chapter 2 as they have in the GDPR.1
The definition of ‘controller’ in Article 4(7) of the GDPR has effect
subject to –
(a) subsection 6(2),
(b) section 209, and
(c) section 210.2
Section 10 of DPA 2018 refers to special categories of personal data, previously known as sensitive personal data. The section sets out various changes and specifications in relation to special categories of data, and to data relating to criminal convictions, as they will apply in the UK. These will need to be considered in detail when appropriate.
Special categories of personal data (previously referred to as sensitive personal data) attract higher compliance obligations and conditions.
There are also additional definitions for ‘health professional’ and
‘social work professional’ in s 204.

1 Data Protection Act 2018, s 5.


2 Data Protection Act 2018, s 6.


GDPR Definitions
3.03 Article 4 of the new GDPR sets out the definitions for the new
data protection regime as follows:

‘personal data’ means any information relating to an identified or identifiable natural person (‘Data Subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person;
‘processing’ means any operation or set of operations which is
performed on personal data or on sets of personal data,
whether or not by automated means, such as collection,
recording, organisation, structuring, storage, adaptation
or alteration, retrieval, consultation, use, disclosure
by transmission, dissemination or otherwise making
available, alignment or combination, restriction, erasure
or destruction;
‘restriction of processing’ means the marking of stored personal data with the aim of limiting their processing in the future;
‘profiling’ means any form of automated processing of personal
data consisting of the use of the personal data to
evaluate certain personal aspects relating to a natural
person, in particular to analyse or predict aspects
concerning that natural person’s performance at work,
economic situation, health, personal preferences,
interests, reliability, behaviour, location or movements;
‘pseudonymisation’ means the processing of personal data in such a manner
that the personal data can no longer be attributed to
a specific Data Subject without the use of additional
information, provided that such additional information
is kept separately and is subject to technical and
organisational measures to ensure that the personal data
are not attributed to an identified or identifiable natural
person;
‘filing system’ means any structured set of personal data which are
accessible according to specific criteria, whether
centralised, decentralised or dispersed on a functional or
geographical basis;


‘Controller’ means the natural or legal person, public authority, agency or any other body which alone or jointly with others determines the purposes and means of the processing of personal data; where the purposes and means of processing are determined by EU law or Member State law, the Controller or the specific criteria for its nomination may be provided for by EU law or Member State law;
‘Processor’ means a natural or legal person, public authority, agency
or any other body which processes personal data on
behalf of the Controller;
‘recipient’ means a natural or legal person, public authority,
agency or any other body, to which the personal data
are disclosed, whether a third party or not. However,
public authorities which may receive personal data in
the framework of a particular inquiry in accordance
with EU or Member State law shall not be regarded as
recipients; the processing of those data by those public
authorities shall be in compliance with the applicable
data protection rules according to the purposes of the
processing;
‘third party’ means any natural or legal person, public authority,
agency or body other than the Data Subject, Controller,
Processor and persons who, under the direct authority of
the Controller or the Processor, are authorised to process
the data;
‘the consent’ of the Data Subject means any freely given, specific,
informed and unambiguous indication of the Data
Subject’s wishes by which the Data Subject, by a
statement or by a clear affirmative action, signifies
agreement to the processing of personal data relating to
them;
‘personal data breach’ means a breach of security leading to the
accidental or unlawful destruction, loss, alteration, unauthorised
disclosure of, or access to, personal data transmitted, stored or
otherwise processed;
‘genetic data’ means all personal data relating to the genetic
characteristics of an individual that have been inherited
or acquired, which give unique information about the
physiology or the health of that individual, resulting in
particular from an analysis of a biological sample from
the individual in question;


‘biometric data’ means any personal data resulting from specific
technical processing relating to the physical,
physiological or behavioural characteristics of an
individual which allows or confirms the unique
identification of that individual, such as facial images,
or dactyloscopic data;
‘data concerning health’ means personal data related to the physical
or mental health of an individual, including the provision of health
care services, which reveal information about his or her health status;
‘main establishment’ means:
●● as regards a Controller with establishments in more
than one Member State, the place of its central
administration in the EU, unless the decisions
on the purposes and means of the processing of
personal data are taken in another establishment
of the Controller in the EU and the latter
establishment has the power to have such decisions
implemented, in this case the establishment having
taken such decisions shall be considered as the
main establishment;
●● as regards a Processor with establishments in more
than one Member State, the place of its central
administration in the EU, and, if the Processor
has no central administration in the EU, the
establishment of the Processor in the EU where
the main processing activities in the context of the
activities of an establishment of the Processor take
place to the extent that the Processor is subject to
specific obligations under this Regulation;
‘representative’ means any natural or legal person established in the
EU who, designated by the Controller or Processor in
writing pursuant to Article 27, represents the Controller
or Processor, with regard to their respective obligations
under this Regulation;
‘enterprise’ means any natural or legal person engaged in an
economic activity, irrespective of its legal form,
including partnerships or associations regularly engaged
in an economic activity;
‘group of undertakings’ means a controlling undertaking and its
controlled undertakings;


‘binding corporate rules’ means personal data protection policies
which are adhered to by a Controller or Processor established on
the territory of a Member State of the EU for transfers
or a set of transfers of personal data to a Controller or
Processor in one or more third countries within a group
of undertakings or group of enterprises engaged in a
joint economic activity;
‘supervisory authority’ means an independent public authority which is
established by a Member State pursuant to Article 51 (SA);
‘supervisory authority concerned’ means a supervisory authority which
is concerned by the processing, because:
●● the Controller or Processor is established on the
territory of the Member State of that supervisory
authority;
●● Data Subjects residing in this Member State are
substantially affected or likely to be substantially
affected by the processing; or
●● a complaint has been lodged to that supervisory
authority;
‘cross-border processing of personal data’ means either:
●● processing which takes place in the context of the
activities of establishments in more than one
Member State of a Controller or a Processor in the
EU and the Controller or Processor is established in
more than one Member State; or
●● processing which takes place in the context of the
activities of a single establishment of a Controller
or Processor in the EU but which substantially
affects or is likely to substantially affect Data
Subjects in more than one Member State;
‘relevant and reasoned objection’ means an objection as to whether
there is an infringement of this Regulation or not, or, as the case
may be, whether the envisaged action in relation to
the Controller or Processor is in conformity with the
Regulation. The objection shall clearly demonstrate the
significance of the risks posed by the draft decision as
regards the fundamental rights and freedoms of Data
Subjects and where applicable, the free flow of personal
data within the EU;


‘Information Society service’ means any service as defined by
Article 1(2) of Directive 98/34/EC of the European Parliament and of
the Council of 22 June 1998 laying down a procedure
for the provision of information in the field of technical
standards and regulations and of rules on Information
Society services;
‘international organisation’ means an organisation and its subordinate
bodies governed by public international law or any other body
which is set up by, or on the basis of, an agreement
between two or more countries.

Two Categories of Personal Data


3.04 Organisations need to be familiar with two separate categories of
personal data in relation to their data protection actions and compliance
obligations. This distinction also affects what personal data they may
collect in the first instance.
The first is general personal data. Unless specified otherwise, all per-
sonal data falls into this category. The second category is special personal
data. The importance of special categories of personal data (previously
sensitive personal data) is that it triggers additional and more onerous
obligations of compliance and initial collection conditions.
Why is there a distinction? Certain categories of personal data are
more important, personal, special and sensitive to individuals than
other categories of personal data. This is recognised in the data
protection regime, and additional rules are put in place. First,
special categories of personal data are defined differently. Second,
in order to collect and process special categories of personal data,
an organisation must satisfy additional processing conditions, in
addition to the Principles of data protection and the general Lawful
Processing Conditions, namely complying with the Special Categories
of Personal Data Lawful Processing Conditions.
Special Categories of Personal Data
Special Personal Data under the DPA 2018
3.05 Section 10 of DPA 2018 refers to Special Categories of Personal
Data as follows:
(1) subsections (2) and (3) make provision about the processing of per-
sonal data described in Article 9(1) of the GDPR (prohibition on
processing of special categories of personal data) in reliance on an
exception in one of the following points of Article 9(2) –
(a) point (b) (employment, social security and social protection);


(b) point (g) (substantial public interest);
(c) point (h) (health and social care);
(d) point (i) (public health);
(e) point (j) (archiving, research and statistics).
(2) The processing meets the requirements of point (b), (h), (i) or (j) of
Article 9(2) of the GDPR for authorisation by, or a basis in, the law
of the UK or a part of the UK only if it meets a condition in Part 1
of Schedule 1.
(3) The processing meets the requirement in point (g) of Article 9(2) of
the GDPR for a basis in the law of the UK or a part of the UK only
if it meets a condition in Part 2 of Schedule 1.
(4) Subsection (5) makes provision about the processing of personal
data relating to criminal convictions and offences or related secu-
rity measures that is not carried out under the control of official
authority.
(5) The processing meets the requirement in Article 10 of the GDPR
for authorisation by the law of the UK or a part of the UK only if it
meets a condition in Part 1, 2 or 3 of Schedule 1.
(6) The Secretary of State may by regulations –
(a) amend Schedule 1 –
(i) by adding or varying conditions or safeguards added by
regulations under this section, and
(ii) by omitting conditions or safeguards added by regula-
tions under this section, and
(b) consequentially amend this section.
In addition, s 11 of the DPA 2018 states that for the purposes of
Article 9(2)(h) of the GDPR (processing for health or social care
­purposes, etc) the circumstances in which the processing of personal
data is carried out subject to the conditions and safeguards referred to in
Article 9(3) of the GDPR (obligation of secrecy) include circumstances
in which it is carried out–
(a) by or under the responsibility of a health professional or a social
work professional, or
(b) by another person who in the circumstances owes a duty of confi-
dentiality under an enactment or rule of law.
In Article 10 of the GDPR and DPA 2018, s 10, references to personal
data relating to criminal convictions and offences or related security
measures include personal data relating to–
(a) the alleged commission of offences by the Data Subject, or


(b) proceedings for an offence committed or alleged to have been
committed by the Data Subject or the disposal of such proceedings,
including sentencing.

Special Categories of Personal Data under the GDPR


3.06 Sensitive personal data are referred to in Recitals 10 and 51 of
the new GDPR and special categories of personal data are referred to in
Recitals 10, 51, 52, 53, 54, 71, 80, 91 and 97.
Article 9 of the new GDPR refers to processing of special categories
of personal data. The processing of personal data revealing racial or eth-
nic origin, political opinions, religious or philosophical beliefs, trade-
union membership, and the processing of genetic data, biometric data in
order to uniquely identify a person or data concerning health or sex life
and sexual orientation shall be prohibited (Article 9(1)).
The above shall not apply if one of the following applies:
●● the Data Subject has given explicit consent to the processing of those
personal data for one or more specified purposes, except where EU
law or state law provide that the prohibition referred to in paragraph
1 may not be lifted by the Data Subject; or
●● processing is necessary for the purposes of carrying out the obliga-
tions and exercising specific rights of the Controller or of the Data
Subject in the field of employment and social security and social
protection law in so far as it is authorised by EU law or state law or
a collective agreement pursuant to state law providing for adequate
safeguards for the fundamental rights and the interests of the Data
Subject; or
●● processing is necessary to protect the vital interests of the Data Sub-
ject or of another person where the Data Subject is physically or
legally incapable of giving consent; or
●● processing is carried out in the course of its legitimate activities with
appropriate safeguards by a foundation, association or any other
non-profit-seeking body with a political, philosophical, religious or
trade-union aim and on condition that the processing relates solely
to the members or to former members of the body or to persons who
have regular contact with it in connection with its purposes and that
the data are not disclosed outside that body without the consent of
the Data Subjects; or
●● the processing relates to personal data which are manifestly made
public by the Data Subject; or
●● processing is necessary for the establishment, exercise or defence of
legal claims or whenever courts are acting in their judicial capacity; or


●● processing is necessary for reasons of substantial public interest, on
the basis of EU or state law which shall be proportionate to the aim
pursued, respect the essence of the right to data protection and pro-
vide for suitable and specific measures to safeguard the fundamental
rights and the interests of the Data Subject; or
●● processing is necessary for the purposes of preventive or occupa-
tional medicine, for the assessment of the working capacity of the
employee, medical diagnosis, the provision of health or social care
or treatment or the management of health or social care systems and
services on the basis of EU law or state law or pursuant to contract
with a health professional and subject to the conditions and safe-
guards referred to in Article 9(4); [h] or
●● processing is necessary for reasons of public interest in the area of
public health, such as protecting against serious cross-border threats
to health or ensuring high standards of quality and safety of health
care and of medicinal products or medical devices, on the basis of
EU law or state law which provides for suitable and specific meas-
ures to safeguard the rights and freedoms of the Data Subject, in
particular professional secrecy; or
●● processing is necessary for archiving purposes in the public interest,
or scientific and historical research purposes or statistical purposes
in accordance with Article 89(1) based on EU or state law which
shall be proportionate to the aim pursued, respect the essence of the
right to data protection and provide for suitable and specific meas-
ures to safeguard the fundamental rights and the interests of the Data
Subject (Article 9(2)).
Personal data referred to in Article 9(1) of the new GDPR may be pro-
cessed for the purposes referred to in Article 9(2)(h) when those data are
processed by or under the responsibility of a professional subject to the
obligation of professional secrecy under EU or state law or rules estab-
lished by national competent bodies or by another person also subject to
an obligation of secrecy under EU or state law or rules established by
national competent bodies (Article 9(4)).
States may maintain or introduce further conditions, including limita-
tions, with regard to the processing of genetic data, biometric data or
health data (Article 9(5)).
Article 10 of the new GDPR refers to processing of data relating to
criminal convictions and offences. Processing of personal data relating
to criminal convictions and offences or related security measures based
on Article 6(1) may only be carried out either under the control of official
authority or when the processing is authorised by EU law or state law
providing for adequate safeguards for the rights and freedoms of Data


Subjects. Any comprehensive register of criminal convictions may be
kept only under the control of official authority.

Conclusion
3.07 It is important for organisations to distinguish, in advance of
collecting personal data, whether the proposed data collection relates
to general personal data or special personal data. They also need to
be able to confirm compliance procedures in advance of collecting and
maintaining personal data, particularly sensitive personal data. The
organisation could be asked to demonstrate at a future date that it
obtained consent, and achieved general compliance. If it cannot, it
may have to delete the data, may have committed breaches and offences,
and potentially faces fines and/or being sued by the Data Subject.
Depending on the circumstances, personal liability can also arise.

Chapter 4
History and Data Protection

Introduction
4.01 The legal discussion in relation to privacy is frequently linked
to Warren and Brandeis’s legal article in 1890 entitled ‘The Right to
Privacy’ published in the Harvard Law Review.1 Arguably, data protection
is the modern coalface of the debate in relation to privacy and privacy
protection.2 The data protection regime can be considered as setting
standards in certain areas of informational privacy protection – which
have come to be followed in other jurisdictions internationally beyond
the EU.3 There have also been calls for international level data
protection rules.4 Certainly, if this were to come to pass, it could add
greater certainty for organisations, industry and individual Data
Subjects. Arguably the pressure for better and more international
standards is increasing. While privacy discussion has progressed
steadily since 1890, it has sped up significantly with the advent of the
consumer computer and electronic storage, and has reached sometimes
dizzying speeds with modern internet iterations and associated uses.

1 S Warren and L Brandeis, ‘The Right to Privacy,’ Harvard Law Review (1890)
(IV) 193.
2 See JB Rule and G Greenleaf, eds, Global Privacy Protection – The First Generation
(Cheltenham: Elgar, 2008).
3 See M Birnhack, ‘The EU Data Protection Directive: An Engine of a Global Regime,’
Computer Law & Security Report (2008) (2) 508, at 512.
4 On the topic of international data protection issues, see, for example, C de Terwangne,
‘Is a Global Data Protection Regulatory Model Possible?’ in S Gutwirth, Y Poullet,
P de Hert, C de Terwangne and S Nouwt, Reinventing Data Protection? (Springer,
2009), p 175; and C Kuner, ‘Developing an Adequate Legal Framework for
International Data Transfers,’ in S Gutwirth, Y Poullet, P de Hert, C de Terwangne and
S Nouwt, Reinventing Data Protection? (Springer, 2009), p 263.


History of Data Protection


4.02 The growth of the processing of information relating to
individuals in electronic computer data format from the 1970s onwards
led to ever-increasing concerns regarding such processing. Existing
laws were ‘insufficient to deal with concerns about the amount of
information relating to individuals that was held by organisations in
electronic form.’5
to promote openness and transparency of information held about indi-
viduals, and to protect the privacy and data protection interests and
rights of such individuals.6 In the case of the Data Protection Act 2018
(DPA 2018) there is the added reason of complying with the EU General
Data Protection Regulation (GDPR), and with an eye to post-Brexit
trade and adequacy.
The main EU data protection instrument was the Data Protection
Directive 1995,7 implemented in the UK by DPA 1998, now replaced
across the EU by the GDPR. The new Data Protection Act 2018
(DPA 2018) seeks to mirror, and even go beyond, the GDPR.
However, even prior to the 1995 Directive, concern for informational
privacy in the computer environment was recognised in an early data
protection regime. The Council of Europe proposed and enacted the
Convention for the Protection of Individuals with Regard to Automatic
Processing of Personal Data Done at Strasbourg on the 28 January
1981.8 The UK implemented the Data Protection Act 1984 (DPA 1984).
Ultimately this was replaced with DPA 1998 to implement DPD 1995.
DPD 1995 was also implemented in secondary legislation relating to
data protection.9
There were also a number of official investigations and proposed
bills regarding privacy and data protection. The Younger Committee

5 P Carey, Data Protection, A Practical Guide to UK and EU Law (Oxford: OUP, 2009)
1. Also, D Bainbridge, Data Protection (CLT, 2000) 2.
6 L AC MacDonald, Data Protection: Legal Compliance and Good Practice
for Employers (Tottel, 2008), p 33.
7 RI D’Afflitto, ‘European Union Directive on Personal Privacy Rights and Computer-
ised Information,’ Villanova Law Review (1996) (41) 305–324.
8 Council of Europe Convention for the Protection of Individuals with Regard to
Automatic Processing of Personal Data Done at Strasbourg on the 28 January 1981,
at http://conventions.coe.int/Treaty/en/Treaties/Html/108.htm. Also, ‘Convention for
the Protection of Individuals with Regard to Automatic Processing of Personal Data,’
International Legal Materials (1981) (20) 317–325.
9 See generally P Carey, Data Protection, A Practical Guide to UK and EU Law
(Oxford, OUP, 2009) chapter 1.


on Privacy10 proposed ten recommendations and principles regarding
personal data. It proposed the following principles:
●● information should be regarded as held for a specific purpose and
should not be used, without appropriate authorisation, for other
purposes;
●● access to information should be confined to those authorised to have
it for the purpose for which it was supplied;
●● the amount of information collected and held should be the minimum
necessary for the achievement of a specified purpose;
●● in computerised systems handling information for statistical
purposes, adequate provision should be made in their design and
programs for separating identities from the rest of the data;
●● there should be arrangements whereby a subject can be told about
the information held concerning them;
●● the level of security to be achieved by a system should be specified
in advance by the user and should include precautions against the
deliberate abuse or misuse of information;
●● a monitoring system should be provided to facilitate the detection of
any violation of the security system;
●● in the design of information systems, periods should be specified
beyond which the information should not be retained;
●● data held should be accurate. There should be machinery for the
correction of inaccuracy and updating of information;
●● care should be taken in coding value judgments.
This was followed by the Lindop Committee,11 which considered how best
to protect privacy and personal data. The Committee used the term
‘data privacy’ to mean the individual Data Subject’s right to control
the circulation of data about them.12 While specific recommendations
were made, these were not enacted. Ultimately, DPA 1984 was enacted
following on from the Council of Europe Convention 1981 relating to
data processing.
It is entitled ‘Convention for the Protection of Individuals with regard
to the Automatic Processing of Personal Data.’ The Convention sets out
the following principles and requirements, namely, that personal data
must be:
●● obtained and processed fairly and lawfully;
●● stored for specified and lawful purposes and not used in a way
incompatible with those purposes;

10 Younger Committee on Privacy, Cmnd 5012 (1972).


11 Lindop Committee, Cmnd 7341 (1978).
12 Referred to in P Carey, above, at 3.


●● adequate, relevant and not excessive in relation to the purposes
for which they are stored;
●● accurate and, where necessary, kept up to date;
●● preserved in a form which permits identification of the Data
Subjects for no longer than is required for the purpose for which
those data are stored.
In addition, the Convention provides that:
●● personal data revealing racial origin, political opinions or religious
or other beliefs, as well as personal data concerning health or sexual
life, may not be processed automatically unless domestic law
provides appropriate safeguards. The same shall apply to personal
data relating to criminal convictions;
●● appropriate security measures shall be taken for the protection of
personal data stored in automated data files against accidental
or unauthorised destruction or accidental loss as well as against
unauthorised access, alteration or dissemination;
●● any person shall be enabled,
– to establish the existence of an automated personal data file, its
main purposes, as well as the identity and habitual residence or
principal place of business of the Controller of the file;
– to obtain at reasonable intervals and without excessive delay or
expense confirmation of whether personal data relating to them
are stored in the automated data file as well as communication
to them of such data in an intelligible form;
– to obtain, as the case may be, rectification or erasure of such
data if these have been processed contrary to the provisions
of domestic law giving effect to the basic principles set out in
Articles 5 and 6 of the Convention;
– to have a remedy if a request for confirmation or, as the
case may be, communication, rectification or erasure are not
complied with.13

Data Protection Act


4.03 The DPA 1998 implemented the provisions of the EU DPD
1995 on the protection of individuals with regard to the processing of
personal data and on the free movement of such data. The GDPR replaces
DPD 1995.

13 Convention Articles 5–8.


The DPA 2018 replaces the DPA 1998 in the UK.14
The DPA 2018, and the GDPR, have important implications for
business and organisations which collect, process and deal in information
relating to living individuals, and in particular customers and employees.
They contain stringent data protection measures to safeguard personal
informational privacy and to ensure that personal data is not misused or
used for purposes that are incompatible with data protection legislation.
Further legal developments will continue to occur as a result of Brexit,
including the referred-to UK-GDPR, as the proposed equivalent version
of the GDPR that the UK intends to implement on or before 31 December
2020 – this is the date scheduled for the end of the transition period for
final negotiations and final agreement on the future nature of the rela-
tionship between the UK and EU. (See Chapter 24 for further details).

Legal Instruments
4.04 The introduction or Recitals to the European legal instruments,
while not legally binding like the main text of the provisions, are still
influential in terms of interpreting the data protection regime, and also
highlight some of the history, philosophy and policy behind particular
data protections laws and provisions.
GDPR Recitals
4.05 The final version of the GDPR was agreed in December 2015
by the tripartite group and passed by the EU Parliament in April 2016.
From the date of publication in the Official Journal (4 May 2016) there is
a two-year implementation period for organisations – and states in terms
of any amendments required to national legislation. The Recitals include
the following:
DPD Repealed
DPD 1995 is repealed; Commission decisions based on the DPD 1995
remain until replaced (Recital 171); and Directive 2002/58/EC15 is to be
updated following the GDPR (Recital 173).

14 There was also a previous Data Protection Act in the UK, namely, the Data Protection
Act 1984. The DPA 1998 repealed the DPA 1984, and was brought into force
on 1 March 2000 pursuant to the Data Protection Act (Commencement) Order 2000,
SI 2000/183.
15 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).


WP29/EDPB
Establishment of the European Data Protection Board (EDPB), which is the
effective equivalent of WP29 under the DPD 1995 (Recital 139).
Background and Rationale
Personal data protection is (now) a fundamental right;16 Article 8(1) of
the Charter of Fundamental Rights of the EU and Article 16(1) of the
Treaty (Recital 1); data processing should be designed to serve mankind;
balanced with other fundamental rights; principle of proportionality
(Recital 4); internal market and cross-border transfers (Recital 5); rapid
technological developments and challenges (Recital 6); developments
require a strong and more coherent data protection framework (Recital 7);
the DPD 1995 has not prevented fragmentation in how data protection is
implemented (Recital 9); the need to ensure a consistent and high level
of protection and to remove obstacles to data flows (Recital 10); the
need for a consistent level of protection throughout the EU (Recital 13);
the GDPR protects natural persons and their personal data (Recital 12);
protections should be technologically neutral (Recital 15).
Obligations
Data processing must be lawful and fair (Recital 39); processing necessary
for a contract (Recital 44); processing for a legal obligation (Recital 45);
processing necessary to protect life (Recital 46); the legitimate interests
of the Controller (Recital 47).
Security
Network and information security, Computer Emergency Response
Teams (CERTs) and Computer Security Incident Response Teams
(CSIRTs) (Recital 49); appropriate technical and organisational meas-
ures (Recitals 81, 87); security and risk evaluation (Recital 90); high
risk (Recital 84); impact assessments (Recitals 84, 90, 91, etc); large
scale processing operations (Recital 91); consultations (Recital 94); data
breach and data breach notification (Recitals 85, 86); and Commission
delegated acts.

16 In relation to data protection as a fundamental right, see, for example, S Rodotà,
‘Data Protection as a Fundamental Right,’ in S Gutwirth, Y Poullet, P de Hert,
C de Terwangne and S Nouwt, Reinventing Data Protection? (Springer, 2009), p 77.
It is necessary to reinforce its protection in order to make it effective and not
conditioned by the asymmetries which characterise the relationship between Data
Subject and data controllers.


Processing
Processing; pseudonymised data (Recitals 26, 28, 29); online identifiers
(Recital 30); consent (Recitals 32, 38, 40, 42, 43, 44, 50); lawful processing
and consent (Recital 40); principle of transparency (Recitals 39, 58, 60, 71);
children (Recital 38); processing for (additional) other purposes (Recital 50);
genetic data (Recital 34); health data (Recital 35); sensitive personal data
(Recitals 51, 52, 53, 54); additional identifying information (Recital 57);
processing and electoral activities (Recital 56); religious associations
(Recital 55); processing and direct marketing (Recital 70); right not to
be subject to a decision; automated processing (Recital 68, 71); profiling
(Recitals 70, 72); restrictions on principles and rights (Recital 73); respon-
sibility and liability of Controllers (Recital 74); risks (Recitals 75, 76, 83,
84, etc); Processors and Controllers (Recitals 79, 81, 82, etc); Codes of
Conduct (Recital 98); transparency and certification mechanisms (Recitals
99, 100); penalties and fines (Recital 148); criminal sanctions (Recital 149);
employee data (Recital 155); public authorities; processing in the public
interest or official authority; processing and public interest, scientific and
historical research purposes, statistical purposes, safeguards; archiving
purposes; scientific research purposes; historical research purposes; medi-
cal research purposes; statistical purposes; religious organisations.
Rights
Data Subject rights (Recital 59); principles of fair and transparent pro-
cessing (Recital 60); prior information requirements (Recital 61); right
of access (Recital 63); right of rectification and right to be forgotten
(RtbF) (Recital 65); right to complain (Recital 141); automated process-
ing (Recital 68).
Proceedings
Proceedings against Controllers, Processors and jurisdiction (Recitals
145, 146); damages and compensation (Recital 146); the prevention,
investigation, detection or prosecution of offences.
Establishment
Establishment (Recitals 22, 23, 36); groups of undertakings (Recital 37);
establishment (Recital 36, etc).
Transfers
Cross border data transfers (Recital 101, etc).
ICO and Supervisory Authorities
Supervisory authorities such as the ICO (SAs) (previously called data protection authorities (DPAs)) and the complete independence of SAs (Recitals 117, etc).


New Bodies
Data protection not-for-profit bodies, organisations and associations
(Recital 142).
Notification/Registration Replaced
Replacement of ‘general’ notification/registration requirement (Recital 89).
Exceptions/Exemptions
The GDPR does not address national security (Recital 16); the GDPR should not apply to data processing by a natural person in the course of a purely personal or household activity and thus without a connection with a professional or commercial activity (Recital 18); without prejudice to the eCommerce Directive,17 in particular the eCommerce defences of Articles 12–15 (Recital 21); the GDPR does not apply to the data of the deceased (Recital 27). (See also the fanpage case, ECJ C-210/16, June 2018.)
Lawful Processing and Consent
Lawful processing and consent (Recital 40).
Online Identifiers
Online identifiers (Recital 30).
Sensitive Personal Data
Sensitive personal data and special categories of personal data (Recitals
10, 51, 53).
Children and Personal Data
Processing of children’s personal data (Recitals 38, 58, 65, 71, 75).
Health Data
Health data processing (Recitals 35, 45, 52, 53, 54, 63, 65, 71, 73, 75,
91, 112, 155, 159).

GDPR
General Provisions
4.06 The initial provisions refer to the context of the GDPR, namely,
the subject matter and objectives (Article 1); material scope (Article 2);
and territorial scope (Article 3).

17 Directive 2000/31/EC.


The GDPR applies to the processing of personal data wholly or partly by automated means, and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system (Article 2(1)). Matters not covered or included are referred to in Article 2(2).
Principles of Data Protection
4.07 Chapter II refers to the Principles of data protection. Article 5 of the GDPR sets out the principles relating to personal data processing.
Personal data must be:
(a) processed lawfully, fairly and in a transparent manner in relation to the Data Subject (‘lawfulness, fairness and transparency’);
(b) collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);
(c) adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);
(d) accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);
(e) kept in a form which permits identification of Data Subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by the Regulation in order to safeguard the rights and freedoms of the Data Subject (‘storage limitation’);
(f) processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’) (Article 5(1)).
The Controller shall be responsible for and be able to demonstrate
compliance with Article 5(1) (‘accountability’ principle) (Article 5(2)).


These can be summarised as:


●● processed lawfully, fairly and in a transparent manner (‘lawfulness, fairness and transparency’);
●● for specified, explicit and legitimate purposes and not further processed incompatibly with the purpose (with a carveout for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes) (‘purpose limitation’);
●● adequate, relevant and limited to what is necessary for the purposes (‘data minimisation’);
●● accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);
●● kept in a form which permits identification for no longer than is necessary (with a carveout for archiving purposes in the public interest, or scientific and historical research purposes or statistical purposes, subject to appropriate technical and organisational measures) (‘storage limitation’);
●● appropriate security, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).
The Controller must demonstrate compliance (‘accountability’).
Lawful Processing Conditions
4.08 Article 6 of the new GDPR refers to the lawfulness of processing.
Article 6(1) provides that processing of personal data shall be lawful
only if and to the extent that at least one of the following applies:
●● the Data Subject has given consent to the processing of their personal data for one or more specific purposes;
●● processing is necessary for the performance of a contract to which
the Data Subject is party or in order to take steps at the request of the
Data Subject prior to entering into a contract;
●● processing is necessary for compliance with a legal obligation to
which the Controller is subject;
●● processing is necessary in order to protect the vital interests of the
Data Subject or of another natural person;
●● processing is necessary for the performance of a task carried out in
the public interest or in the exercise of official authority vested in
the Controller;
●● processing is necessary for the purposes of the legitimate interests pursued by the Controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the Data Subject which require protection of personal data, in particular where the Data Subject is a child. This last condition shall not apply to processing carried out by public authorities in the performance of their tasks (Article 6(1)).
States may maintain or introduce more specific provisions to adapt the application of the rules of the GDPR with regard to the processing of personal data for compliance with Article 6(1)(c) and (e) by determining more precisely specific requirements for the processing and other measures to ensure lawful and fair processing, including for other specific processing situations as provided for in Chapter IX (Article 6(2)).
The basis for the processing referred to in Article 6(1)(c) and (e) must be laid down by EU law, or state law to which the Controller is subject.
The purpose of the processing shall be determined on this legal basis or, as regards the processing referred to in Article 6(1)(e), shall be necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the Controller. This legal basis may contain specific provisions to adapt the application of rules of the GDPR, inter alia: the general conditions governing the lawfulness of data processing by the Controller; the type of data which are subject to the processing; the Data Subjects concerned; the entities to, and the purposes for which, the personal data may be disclosed; the purpose limitation; storage periods; and processing operations and processing procedures, including measures to ensure lawful and fair processing, such as those for other specific processing situations as provided for in Chapter IX. EU law or the law of the state must meet an objective of public interest and be proportionate to the legitimate aim pursued (Article 6(3)).
Where the processing for a purpose other than that for which the personal data have been collected is not based on the Data Subject’s consent or on an EU or state law which constitutes a necessary and proportionate measure in a democratic society to safeguard the objectives referred to in Article 23(1), the Controller shall, in order to ascertain whether processing for another purpose is compatible with the purpose for which the personal data are initially collected, take into account, inter alia:
●● any link between the purposes for which the personal data have been collected and the purposes of the intended further processing;
●● the context in which the personal data have been collected, in particular regarding the relationship between Data Subjects and the Controller;
●● the nature of the personal data, in particular whether special categories of personal data are processed, pursuant to Article 9, or whether data related to criminal convictions and offences are processed, pursuant to Article 10;
●● the possible consequences of the intended further processing for Data Subjects;
●● the existence of appropriate safeguards, which may include encryption or pseudonymisation (Article 6(4)).
Consent Conditions
4.09 Article 7 of the new GDPR refers to conditions for consent as
follows. Where processing is based on consent, the Controller shall be
able to demonstrate that the Data Subject has consented to the processing
of their personal data (Article 7(1)).
If the Data Subject’s consent is given in the context of a written declaration which also concerns other matters, the request for consent must be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of the declaration which constitutes an infringement of the GDPR shall not be binding (Article 7(2)).
The Data Subject shall have the right to withdraw their consent at
any time. The withdrawal of consent shall not affect the lawfulness of
processing based on consent before its withdrawal. Prior to giving
consent, the Data Subject shall be informed thereof. It must be as easy to
withdraw consent as to give consent (Article 7(3)).
When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract (Article 7(4)).

Conclusion
4.10 Privacy and data protection are evolving in terms of how technology is changing how personal data are collected, used and processed. The prior data protection legal regime was perceived as requiring updating. The DPD was enacted in 1995, prior to social media, cloud computing, mass data storage, data mining, electronic profiling, Web 2.0 and the threats to data security surrounding personal data. This was even before the spotlight centred on abuse issues. Data protection needs to evolve to deal with, or at least more explicitly deal with, these issues. This is partly the reason for the new GDPR.
This is important for the issues it addresses as well as the current legal provisions it enhances. As a Regulation as opposed to a Directive,

it means that it is directly applicable in the UK without the need for national implementing legislation. However, the UK, like many other states, is also implementing national laws. That takes on added resonance given the advance of Brexit and the date when the GDPR is no longer directly effective in the UK (currently scheduled to be 31 December 2020). The law and practice of the UK will be changed, as will many of the obligations of organisations. This will also differ more for organisations in particular sectors.
There must be better awareness and more hands-on board management responsibility, planning and data protection assessment, including risk assessment, in advance of product or service launch via the Data Protection by Design (DPbD) concept (see Part 4). There is explicit recognition of children under the data protection regime for the first time. Data protection compliance is now required to be much more considered, organised and planned.

Chapter 5
Principles

Introduction
5.01 All organisations which collect and process personal data must
comply with the obligations of the UK data protection regime. It is,
therefore, very important to be familiar with the data protection regime.
This is set out in the new Data Protection Act 2018 (DPA 2018) (and the
EU General Data Protection Regulation (GDPR)).

When Data Protection Provisions Apply


5.02 The circumstances in which UK data protection provisions apply
are important. The data protection regime applies:
●● to the processing of personal data where the Controller is established
in the UK, whether or not the processing takes place in the UK;
●● to the processing of personal data to which Chapter 2 of Part 2 of the
Act (ie the GDPR) applies where:
(a) the processing is carried out in the context of the activities of
an establishment of a Controller and Processor in a country
or territory that is not a member State, whether or not the
processing takes place in such a country or territory,
(b) the personal data relates to a Data Subject who is in the UK
when the processing activities take place, and
(c) the processing activities relate to–
(i) the offering of goods or services to Data Subjects in the
UK, whether or not for payment, or
(ii) the monitoring of Data Subjects’ behaviour in the UK
(s 207).


A person established in the UK includes:


●● an individual ordinarily resident in the UK;
●● a body incorporated under UK law;
●● a partnership formed under UK law;
●● a person within the above bullets who maintains, and carries on activities through, an office, branch or agency or other stable arrangements in the UK (s 207(7)).

Fair Processing Requirements


5.03 The data protection rules provide details as to what constitutes fair processing and identify the information that must be given to Data Subjects, not only where the personal data are obtained directly from the Data Subjects but also where they are obtained indirectly. The rules also refer to the times at which this information needs to be given.
Organisations cannot collect and process personal data unless they:
●● comply with the registration or notification requirements (if still
required; note the GDPR removed registration or notification type
requirements);
●● comply with the Principles of data protection (previously known as
the data quality principles);
●● ensure the processing is carried out in accordance with the Lawful Processing Conditions (and the Special Personal Data Lawful Processing Conditions in the case of special or sensitive personal data);
●● provide specific information to Data Subjects in advance of the collection and processing of personal data, known as the Prior Information Requirements;
●● also comply with security requirements.
These all serve as pre-conditions to the lawful processing of personal data.

Principles of Data Protection


5.04 There are seven Principles of data protection to be complied with
by all Controllers. It can, therefore, be summarised that all personal data
must be:
(1) processed lawfully, fairly and in a transparent manner;
(2) processed for limited purposes;
(3) adequate, relevant and not excessive;


(4) accurate and up to date;
(5) not kept for longer than is necessary;
(6) secure; and
(7) processed in a manner demonstrating compliance with the above.
Section 2(1) of the DPA 2018 states that
‘the GDPR, the applied GDPR and this Act protect individuals with regard to
the processing of personal data, in particular by –
(a) requiring personal data to be processed lawfully and fairly, on the basis
of the data subject’s consent or another specified basis;
(b) conferring rights on the data subject to obtain information about the
processing of personal data and to require inaccurate personal data to be
rectified.’
Article 5 of the new GDPR sets out these principles as follows. Personal
data must be:
●● processed lawfully, fairly and in a transparent manner in relation to the Data Subject (‘lawfulness, fairness and transparency’);
●● collected for specified, explicit and legitimate purposes and not further processed in a manner that is incompatible with those purposes; further processing for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes shall, in accordance with Article 89(1), not be considered to be incompatible with the initial purposes (‘purpose limitation’);
●● adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);
●● accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure that personal data that are inaccurate, having regard to the purposes for which they are processed, are erased or rectified without delay (‘accuracy’);
●● kept in a form which permits identification of Data Subjects for no longer than is necessary for the purposes for which the personal data are processed; personal data may be stored for longer periods insofar as the data will be processed solely for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) subject to implementation of the appropriate technical and organisational measures required by the GDPR in order to safeguard the rights and freedoms of the Data Subject (‘storage limitation’);
●● processed in a manner that ensures appropriate security of the personal data, including protection against unauthorised or unlawful processing and against accidental loss, destruction or damage, using appropriate technical or organisational measures (‘integrity and confidentiality’).


The Controller must be responsible for and be able to demonstrate compliance with the above (‘accountability’).
Interpreting the Principles of Data Protection
5.05 The Data Protection Act 1998 (DPA 1998) contained certain guidance which assisted in interpreting the data protection Principles. This was contained in Schedule 1, Part I, and is not replicated in the DPA 2018.

Chapter 6
Ordinary Personal Data
Lawful Processing Conditions

Introduction
6.01 Organisations, when obtaining and processing personal data, must not mislead and must also comply with a number of prior information requirements in favour of the individual Data Subjects.
Without these, the obtaining and processing would be deemed unfair. In addition, organisations must satisfy one of the Lawful Processing Conditions for lawful processing.

General Lawful Processing Conditions


6.02 The Lawful Processing Conditions are required to be complied with, in addition to the Principles of data protection. Article 6 of the EU General Data Protection Regulation (GDPR), titled ‘lawfulness of processing’, contains the general Lawful Processing Conditions. In order to collect and process personal data, in addition to complying with the above data protection Principles, organisations must comply with or fall within one of the following general personal data Lawful Processing Conditions.
These conditions might be summarised as follows:
●● the Data Subject has consented;
●● necessary for a contract with the Data Subject;
●● necessary for a legal obligation;
●● necessary to protect vital interests;
●● necessary for a public interest or official task;
●● necessary for legitimate interests pursued by the Controller, except where unwarranted.


Article 7 also specifies requirements in relation to conditions for consent. It states that where processing is based on consent, the Controller
must be able to demonstrate that the Data Subject has consented to processing of his or her personal data. If the consent is given in the context of a written declaration which also concerns other matters, the request for consent must be presented in a manner which is clearly distinguishable from the other matters, in an intelligible and easily accessible form, using clear and plain language. Any part of such a declaration which constitutes an infringement of the GDPR shall not be binding. In addition, the Data Subject shall have the right to withdraw his or her consent at any time. The withdrawal of consent shall not affect the lawfulness of processing based on consent before its withdrawal. Prior to giving consent, the Data Subject shall be informed thereof. It shall be as easy to withdraw as to give consent. When assessing whether consent is freely given, utmost account shall be taken of whether, inter alia, the performance of a contract, including the provision of a service, is conditional on consent to the processing of personal data that is not necessary for the performance of that contract.
In relation to children, Article 8 refers to conditions applicable to a child’s consent in relation to information society services. It provides that, where Article 6(1)(a) applies, in relation to the offer of information society services directly to a child, the processing of the personal data of a child shall be lawful where the child is at least 16 years old. Where the child is below the age of 16 years, such processing shall be lawful only if and to the extent that consent is given or authorised by the holder of parental responsibility over the child. However, Member States may provide for a lower age. The UK has decided to provide for the age of 13 in the Data Protection Act 2018 (DPA 2018). It is noted that this is the first time that the data protection regime, at least at the EU level, expressly refers to children and child data. Some might suggest that this has been overdue.

Special Personal Data Lawful Processing Conditions


6.03 In the case of special or sensitive personal data, an organisation must, in addition to complying with the Principles of data protection, be able to comply with or fall within one of the Special Personal Data Lawful Processing Conditions.
Article 9 of the GDPR sets out the conditions relevant for the purposes of the first data protection Principle, in particular in relation to the processing of special personal data. The processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical


beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited. It sets out the following provisions or exceptions to this restriction, namely:
1 The Data Subject has given explicit consent to the processing of the
personal data for one or more specified purpose.
2 The processing is necessary for the purposes of carrying out the obligations and exercising specific rights of the Controller or of the Data Subject in the field of employment and social security and social protection law in so far as it is authorised by Union or Member State law or a collective agreement pursuant to Member State law providing for appropriate safeguards for the fundamental rights and the interests of the Data Subject.
3 The processing is necessary to protect the vital interests of the
Data Subject or of another natural person where the Data Subject is
physically or legally incapable of giving consent.
4 The processing is carried out in the course of its legitimate activities with appropriate safeguards by a foundation, association or any other not-for-profit body with a political, philosophical, religious or trade union aim and on condition that the processing relates solely to the members or to former members of the body or to persons who have regular contact with it in connection with its purposes and that the personal data are not disclosed outside that body without the consent of the Data Subjects.
5 The processing relates to personal data which are manifestly made public by the Data Subject.
6 The processing is necessary for the establishment, exercise or defence of legal claims or whenever courts are acting in their judicial capacity.
7 The processing is necessary for reasons of substantial public interest, on the basis of Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the Data Subject.
8 The processing is necessary for the purposes of preventative or
occupational medicine, for the assessment of the working capacity
of the employee, medical diagnosis, the provision of health or social
care or treatment or the management of health or social care systems
and services on the basis of Union or Member State law or pursuant
to contract with a health professional and subject to the conditions
and safeguards referred to in paragraph 3.


9 The processing is necessary for reasons of public interest in the area of public health, such as protecting against serious cross-border threats to health or ensuring high standards of quality and safety of health care and of medicinal products or medical devices, on the basis of Union or Member State law which provides for suitable and specific measures to safeguard the rights and freedoms of the Data Subject, in particular professional secrecy.
10 The processing is necessary for archiving purposes in the public interest, scientific or historical research purposes or statistical purposes in accordance with Article 89(1) based on Union or Member State law which shall be proportionate to the aim pursued, respect the essence of the right to data protection and provide for suitable and specific measures to safeguard the fundamental rights and the interests of the Data Subject.
These Special Personal Data Lawful Processing Conditions may be summarised as follows:
●● explicit consent;
●● to comply with employment law;
●● to protect vital interests;
●● the processing is carried out by a not-for-profit organisation with
safeguards;
●● the personal data are manifestly made public by the data subject;
●● necessary for the establishment, exercise or defence of legal claims;
●● necessary for public interest;
●● necessary for preventative medicine;
●● necessary for public health;
●● necessary for archiving purposes in the public interest, scientific or
historical research purposes or statistical purposes.
Further rules set out additional obligations in relation to the processing of special personal data.

Chapter 7
Processing Pre-Conditions:
Prior Information Requirements
and Transparency

Introduction
7.01 Organisations, even prior to obtaining and processing personal data, are generally obliged to provide certain information to individual Data Subjects. The data protection regime places great emphasis on the need for transparency. While transparency is of course an ongoing obligation throughout the data lifecycle, there is a particular need to achieve transparency by providing specified details and information prior to the initial data collection. This is in order that Data Subjects can be properly informed, and can decide whether or not to consent to the proposed data collection and data processing. Problems can arise if the Data Subject or a data regulator suggests that there is no valid consent because there was either no prior information provided, or the information provided was incorrect or vague in some way.

Prior Information Requirements under the EU General


Data Protection Regulation (GDPR)
7.02 Articles 12, 13 and 14 of the GDPR make provision in relation to information available and provided to the Data Subject, including where personal data are collected, directly or indirectly.
Directly Obtained Data
7.03 Chapter III, Section 2 of the GDPR refers to information and access to personal data. Article 13 refers to information to be provided where the data are collected from the Data Subject.


Where personal data relating to a Data Subject are collected from the
Data Subject, the Controller shall, at the time when personal data are
obtained, provide the Data Subject with the following information:
●● the identity and the contact details of the Controller and, where applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are intended as well as the legal basis for the processing;
●● where the processing is based on Article 6(1)(f), the legitimate interests pursued by the Controller or by a third party;
●● the recipients or categories of recipients of the personal data, if any;
●● where applicable, the fact that the Controller intends to transfer personal data to a third country or international organisation and the existence or absence of an adequacy decision by the Commission, or in the case of transfers referred to in Article 46 or 47, or the second subparagraph of Article 49(1), reference to the appropriate or suitable safeguards and the means to obtain a copy of them or where they have been made available (Article 13(1)).
In addition to the information referred to in Article 13(1), the Controller shall, at the time when personal data are obtained, provide the Data Subject with the following further information necessary to ensure fair and transparent processing:
●● the period for which the personal data will be stored, or if this is not possible, the criteria used to determine that period;
●● the existence of the right to request from the Controller access to and rectification or erasure of the personal data or restriction of processing concerning the Data Subject or to object to processing as well as the right to data portability;
●● where the processing is based on Article 6(1)(a) or Article 9(2)(a), the existence of the right to withdraw consent at any time, without affecting the lawfulness of processing based on consent before its withdrawal;
●● the right to lodge a complaint with a supervisory authority;
●● whether the provision of personal data is a statutory or contractual requirement, or a requirement necessary to enter into a contract, as well as whether the Data Subject is obliged to provide the data and of the possible consequences of failure to provide such data;
●● the existence of automated decision-making, including profiling, referred to in Article 22(1) and (4), and at least in those cases, meaningful information about the logic involved, as well as the significance and the envisaged consequences of such processing for the Data Subject (Article 13(2)).


Where the Controller intends to further process the data for a purpose
other than that for which the personal data were collected, the Controller
shall provide the Data Subject prior to that further processing with infor-
mation on that other purpose and with any relevant further information
as referred to in Article 13(2) (Article 13(3)).
Article 13(1), (2) and (3) shall not apply where and insofar as the Data
Subject already has the information (Article 13(4)).
Indirectly Obtained Data
7.04 Article 14 refers to information to be provided where the personal
data have not been obtained from the Data Subject.
Where personal data have not been obtained from the Data Subject, the
Controller shall provide the Data Subject with the following information:
●● the identity and the contact details of the Controller and, where
applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are
intended as well as the legal basis for the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipients of the personal data, if any;
●● where applicable, that the Controller intends to transfer personal
data to a recipient in a third country or international organisation
and the existence or absence of an adequacy decision by the Com-
mission, or in case of transfers referred to in Article 46 or 47, or the
second subparagraph of Article 49(1), reference to the appropriate or
suitable safeguards and the means to obtain a copy of them or where
they have been made available (Article 14(1)).
In addition to the information referred to in Article 14(1), the Controller
shall provide the Data Subject with the following information necessary
to ensure fair and transparent processing in respect of the Data Subject:
●● the period for which the personal data will be stored, or if this is not
possible, the criteria used to determine that period;
●● where the processing is based on Article 6(1)(f), the legitimate inter-
ests pursued by the Controller or by a third party;
●● the existence of the right to request from the Controller access to and
rectification or erasure of personal data or restriction of processing
concerning the Data Subject and to object to the processing as well
as the right to data portability;
●● where the processing is based on Article 6(1)(a) or Article 9(2)(a),
the existence of the right to withdraw consent at any time, without
affecting the lawfulness of processing based on consent before its
withdrawal;

●● the right to lodge a complaint to a supervisory authority;
●● from which source the personal data originate, and if applicable,
whether it came from publicly accessible sources;
●● the existence of automated decision-making, including profiling,
referred to in Article 22(1) and (4) and, at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 14(2)).
The Controller shall provide the information referred to in Article 14(1)
and (2):
●● within a reasonable period after obtaining the personal data, but at
the latest within one month, having regard to the specific circum-
stances in which the personal data are processed;
●● if the personal data are to be used for communication with the Data
Subject, at the latest at the time of the first communication to that
Data Subject; or
●● if a disclosure to another recipient is envisaged, at the latest when
the data are first disclosed (Article 14(3)).
Where the Controller intends to further process the personal data for a
purpose other than that for which the data were obtained, the Controller
shall provide the Data Subject prior to that further processing with infor-
mation on that other purpose and with any relevant further information
as referred to in Article 14(2) (Article 14(4)).
Article 14(1) to (4) shall not apply where and insofar as:
●● the Data Subject already has the information; or
●● the provision of such information proves impossible or would
involve a disproportionate effort, in particular for processing for
archiving purposes in the public interest, scientific or historical
research purposes or statistical purposes, subject to the conditions
and safeguards referred to in Article 89(1) or in so far as the obliga-
tion referred to in Article 14(1) is likely to render impossible or seri-
ously impair the achievement of the objectives of that processing.
In such cases the Controller shall take appropriate measures to pro-
tect the Data Subject’s rights and freedoms and legitimate interests,
including making the information publicly available;
●● obtaining or disclosure is expressly laid down by EU or state law
to which the Controller is subject and which provides appropriate
measures to protect the Data Subject’s legitimate interests; or
●● where the data must remain confidential subject to an obligation of
professional secrecy regulated by EU or state law, including a statu-
tory obligation of secrecy (Article 14(5)).


Conclusion
7.05 Organisations also need to be aware that if their data processing,
data collection, or consent gathering processes are ever challenged in
future, they must be in a position to demonstrate not only a record of
the consent, but also a record of all of the information, notices, etc,
as they existed immediately prior to the collection event. This can be
difficult, as the prior information record is separate from the consent
record. In addition, any large organisation will have to maintain these
records for thousands or even millions of individuals. Moreover, these
records and policies will change over time, so the organisation will need
to maintain a record of which version is attached to which consent, and
of when policies or prior information and documentation versions
changed. One can appreciate, therefore, that these required tasks are
not without a level of complexity. Critically, however, if a problem
arises with having prior information, or with properly recording it,
difficult issues arise as to whether any associated consent can be relied
upon.

Chapter 8
Exemptions

Introduction
8.01 In considering compliance obligations it is also important to
consider the exemptions that may apply. These are important as there
are a significant number of adaptations and exemptions from the EU
General Data Protection Regulation (GDPR) in the Data Protection
Act 2018 (DPA 2018).
The final suite of UK data protection legislation, including the so-
called UK-GDPR, will need careful analysis, as it may contain further
amendments to the GDPR beyond those set out in the DPA 2018 itself,
as well as addressing the important issue of exemptions from the GDPR
and/or the data protection regime (whether in whole or in part on
specific issues). These may also be difficult to track given that the
DPA 2018 is already a complex piece of legislation in itself,
encompassing not just data protection and aspects of GDPR data
protection, but other issues in addition.
Separately, the Leveson Report, in considering the journalism
‘exemption’ in the Data Protection Act 1998 (DPA 1998), suggested
that it may have been too widely drafted, and also considered the
history of the particular section. The Report recommended that the DPA
be amended to ensure a more balanced journalism exemption provision.
The interface with the erasure and forgetting right will also be important.

Exemptions under the DPA 2018


8.02 Schedule 2 to the DPA 2018 refers to exemptions from the GDPR
as follows:
●● adaptations and restrictions based on Articles 6(3) and 23(1)
(Part 1, Sch 2);


●● restrictions based on Article 23(1): restrictions of rules in
Articles 13–21 and 34 (Part 2, Sch 2);
●● restriction based on Article 23(1): protection of rights of others
(Part 3, Sch 2);
●● restrictions based on Article 23(1): restrictions of rules in
Articles 13–15 (Part 4, Sch 2);
●● exemptions, etc, based on Article 85(2) for reasons of freedom of
expression and information (Part 5, Sch 2);
●● derogations, etc, based on Article 89 for research, statistics and
archiving (Part 6, Sch 2).
In addition Schedule 3 of DPA 2018 refers to exemptions from the
GDPR: health, social work, education and child abuse data, as follows:
●● GDPR provisions to be restricted (Part 1, Sch 3);
●● health data (Part 2, Sch 3);
●● social work data (Part 3, Sch 3);
●● education data (Part 4, Sch 3);
●● child abuse data (Part 5, Sch 3).
Schedule 4 of the DPA 2018 also refers to exemptions from the GDPR
and specifically disclosure prohibited or restricted by an enactment.
There are also variations made specifically in relation to Part 3 of the
DPA 2018 (law enforcement processing) and Part 4 of the DPA 2018
(intelligence services processing).
Further Exemptions by Order
8.03 There is potential for further amendments or exemptions to be
made by official Order.

Exemptions under the GDPR


8.04 The GDPR does not address national security (Recital 16). The
GDPR should not apply to data processing by a natural person in the
course of a purely personal or household activity, and thus without a
connection with a professional or commercial activity (Recital 18). The
GDPR is without prejudice to the eCommerce Directive,1 in particular
the eCommerce defences of Articles 12–15 (Recital 21). The GDPR does
not apply to the data of the deceased (Recital 27). Certain states might
also seek to apply exemptions in certain respects in relation to parts of
the GDPR.

1 Directive 2000/31/EC.


Conclusion
8.05 The exemptions are important when and where applicable. They
will often be more relevant to particular sectors. Generally, the exemp-
tions appear to be less litigated than other areas of data protection. The
UK’s derogations or amendments from aspects of the GDPR also need
to be considered by particular organisations and sectors depending on
the issue at hand. Organisations need to start with the GDPR, but then
check the DPA 2018 in case there were amendments made which affect
the issue.
In addition, depending on when the particular issue under consid-
eration arose or was initiated, it may come to be considered under the
DPA 1998 (under transitional provisions) or DPA 2018. If certain issues
started prior to the DPA 2018, the transitional provisions may mean that
the old rules continue to apply to the particular issue, as it is treated
as already in process.2
As with any legislation, it is as important for what it encompasses as
for what it may exclude. The post-Brexit UK data protection regime
obviously requires scrutiny of each.

2 See Data Protection Act 2018, Sch 20 – Transitional Provisions.

Chapter 9
Individual Data Subject Rights

Introduction
9.01 The data protection regime provides, or enshrines, a number of
rights to individuals in relation to their informational data and informa-
tional privacy. Transparency and consent are very important aspects of
respecting and enabling such fundamental rights to be vindicated, uti-
lised and enforced by Data Subjects. Individual Data Subjects have a
right of access to personal data. There are also time limits to be com-
plied with by a Controller in relation to replying to a Data Subject access
request (ie a request to access or obtain a copy of their personal data that
the organisation holds).
Individuals also have a right to prevent data processing for direct
marketing (DM) purposes.
The individual Data Subject has a right to prevent processing likely to
cause damage or distress.
A further right relates to automated decision taking, that is, automated
decisions being taken without human oversight or intervention. The
traditional example often used is adverse credit decisions being taken
automatically. However, it can equally encompass adverse decisions and
activities arising from so-called neutral algorithmic processing and the
arranging of information and result outputs. Examples could include
search rankings and priorities; search suggestions; search prompts;
autosuggest; autocomplete; etc. Other examples could arise in relation
to profiling and advertising-related activities.
Importantly, individual Data Subjects have specific rights in relation
to rectification, blocking, erasure and destruction and what is becoming
known as the Right to be Forgotten (RtbF). This has added significance
and attention following the Court of Justice decision in the RtbF
landmark case of Google Spain.1
Individual Data Subjects are also entitled to compensation and dam-
ages, as well as being entitled to complain to the ICO and to the courts
to obtain judicial remedies.
The new EU General Data Protection Regulation (GDPR) refers to
Data Subject rights; principles of fair and transparent processing; prior
information requirements; right of access; right of rectification and right
to be forgotten (RtbF); right to complain to single supervisory author-
ity; automated processing. The difficulty, however, in dealing with
Data Subject rights is that the rights set out in the GDPR have many
derogations, amendments and exemptions set out in the Data Protection
Act 2018 (DPA 2018). These are not always easy to immediately discern
given the complexity of the DPA 2018, including the extensive materials
included in the Schedules to the Act.
Organisations should note that the new data protection regime is
important for creating a number of new rights, and for expanding or
strengthening some of the existing rights. In addition, organisations are
also required to notify the existence of certain rights to Data Subjects.
This can therefore create consequential issues where a required
notification has not been provided to the individual. Ultimately, such
a failing may feature in complaints to the data regulator or in separate
court-related litigation.

Principles of Data Protection


9.02 The Principles of data protection require that personal data are:
(1) processed lawfully, fairly and transparently;
(2) purpose limited;
(3) minimised;
(4) accurate;
(5) storage limited;
(6) secure; and
(7) accountable.
The new GDPR expresses the Principles as requiring that personal data
are:
●● processed lawfully, fairly and in a transparent manner in relation to
the Data Subject (‘lawfulness, fairness and transparency’);
1 Google Spain SL, Google Inc v Agencia Española de Protección de Datos (AEPD),
Mario Costeja González, Court of Justice (Grand Chamber), Case C-131/12, 13 May
2014. This relates to outdated search engine result listings.


●● collected for specified, explicit and legitimate purposes and not fur-
ther processed in a way incompatible with those purposes; further
processing for archiving purposes in the public interest, scientific
and historical research purposes or statistical purposes shall, in
accordance with Article 89(1), not be considered incompatible with
the initial purposes (‘purpose limitation’);
●● adequate, relevant and limited to what is necessary in relation to the
purposes for which they are processed (‘data minimisation’);
●● accurate and, where necessary, kept up to date; every reasonable step
must be taken to ensure that personal data that are inaccurate, having
regard to the purposes for which they are processed, are erased or
rectified without delay (‘accuracy’);
●● kept in a form which permits identification of Data Subjects for no
longer than is necessary for the purposes for which the personal data
are processed; personal data may be stored for longer periods insofar
as the data will be processed solely for archiving purposes in the
public interest, scientific or historical research purposes or statistical
purposes in accordance with Article 89(1) subject to implementation
of the appropriate technical and organisational measures required by
the GDPR in order to safeguard the rights and freedoms of the Data
Subject (‘storage limitation’);
●● processed in a manner that ensures appropriate security of the per-
sonal data, including protection against unauthorised or unlawful
processing and against accidental loss, destruction or damage, using
appropriate technical or organisational measures (‘integrity and
confidentiality’).
The Controller must be responsible for and be able to demonstrate com-
pliance (‘accountability’).

Rights for Individual Data Subjects


9.03 The data protection rules contain a number of important rights for
individuals in respect of their personal data, such as:
●● the prior information that has to be given to them by Controllers;
●● access to personal data;
●● the right to object to processing;
●● not being forced to make an access request as a condition of recruit-
ment, employment or provision of a service;
●● not being subjected to ‘automated decision making processes’.
The rights are expanding and are becoming more explicit. It is important
that organisations keep abreast of the expanding rights and obligations.


The data protection rights enshrined in the data protection regime for
individuals are set out in the data protection Principles and elsewhere in
the data protection rules. They include the following:
●● individuals have a right to be informed by organisations as to their
identity when they are collecting and processing the individual’s
personal data;
●● the organisation must disclose to the individual the purpose for
which it is collecting and processing the individual’s personal
data;
●● if the organisation is forwarding on the personal data to third party
recipients, it must disclose this to the individual as well as iden-
tify the third party recipients. If it is permitted to transfer the per-
sonal data outside of the country, the organisation must then also
identify which third party country will be receiving the personal
data;
●● organisations must answer and comply with requests from the indi-
vidual in relation to their data protection rights.
This includes requests for access to a copy of the personal data held
in relation to the individual. This is known as a personal data access
request.
The rights of Data Subjects can be summarised as including:
●● right of access;
●● right to establish if personal data exists;
●● right to be informed of the logic in automatic decision taking;
●● right to prevent processing likely to cause damage or distress;
●● right to prevent processing for direct marketing;
●● right to prevent automated decision taking;
●● right to damages and compensation;
●● right to rectify inaccurate data;
●● right to rectification, blocking, erasure and forgetting, and
destruction;
●● right to notification of erasure and forgetting to third parties, etc;
●● right to complain to ICO;
●● right to go to court.
As indicated above, there can also be other obligations which may
be interpreted through a rights lens, such as the need for individuals to
be apprised of certain of their rights. A Data Subject may wish to
complain where they have not been furnished with this information.


Recipients of Right
9.04 The data protection rights apply generally in relation to any
individuals whose personal data are being collected and processed.
Specifically, these can include:
●● employees;
●● other workers such as contractors, temps, casual staff;
●● agency staff;
●● ex-employees and retired employees;
●● spouses and family members;
●● job applicants, including unsuccessful applicants;
●● volunteers;
●● apprentices and trainees;
●● customers and clients;
●● prospective customers and clients;
●● suppliers;2
●● members;
●● users;
●● any other individuals.

Access Right under the DPA 2018


9.05 An individual is entitled to be informed by a Controller whether
personal data of which that individual is the Data Subject are being pro-
cessed by or on behalf of that Controller and if that is the case, to be
given by the Controller a description of:
●● the personal data of which that individual is the Data Subject;
●● the purposes for which they are being or are to be processed; and
●● the recipients or classes of recipients to whom they are or may be
disclosed;
●● to have communicated to them in an intelligible form, a copy of the
personal data; and information as to the source of those data.
Where the processing by automatic means of personal data of which
that individual is the Data Subject for the purpose of evaluating matters

2 L AC MacDonald, Data Protection: Legal Compliance and Good Practice for
Employers (Tottel, 2008) 41.

relating to them such as, for example, performance at work, credit
worthiness, reliability or conduct, has constituted or is likely to
constitute the sole basis for any decision significantly affecting them,
the individual is entitled to be informed by the Controller of the logic
involved in that decision-taking.
It is important to consider that being informed only verbally may not
always, or at all, be a proper vindication of the Data Subject’s access
right. This is particularly so where large sets of personal data, or com-
plex details involving personal data, are involved. Generally, personal
data is provided in hardcopy. However, in particular circumstances,
certain Controllers prefer to furnish personal data in electronic format
where large volumes of materials are involved. If a Controller was to
refuse to furnish the personal data required, on the basis that it had ver-
bally communicated same, this would jar with the accepted understand-
ing of the data access right. It could also be argued that the Controller
was not allowing access to the personal data to be effectively and prop-
erly given. Note, however, the case of Durham County Council v Dunn
[2012] EWCA Civ 1654, which while acknowledging the legitimacy of
copies of documents, appears to suggest that other avenues also arise.
An organisation should be cautious in embarking on data access
disclosures other than in a documented and written manner. It is clearly
the case that an organisation has an obligation to be able to demonstrate
its compliance with access requests, which includes maintaining written
procedures and records. Some would also query whether interpreting
the data access right as some sort of data communication right, as
appears to be suggested in part of Durham, is compatible with the Data
Protection Act 1998 (DPA 1998), the DPA 2018 and now the GDPR
(referring to electronic access). Organisations should note that data
access requests can be responded to electronically. Indeed, under the
GDPR there is an enhanced ability for Data Subjects to request that the
responses be electronic.
Recent cases also refer to data access right issues. In one case a
town clerk was prosecuted for frustrating and blocking an access
request, and ultimately facilitating the deletion of records in order that
they not be available to be furnished.3 Hudson Bay Finance was served
with an official Enforcement Notice for failing to fulfil the data access
right.4
The London Borough of Lewisham was also issued with an Enforcement
Notice over data access right issues.5

3 See ICO v Young [2020] 13/3/2020.
4 ICO v Hudson Bay Finance Limited [2019] 12/8/2019.
5 ICO v London Borough of Lewisham [2018] 6/9/2018.

Access Right
Access Right under the GDPR
9.06 Article 15 of the GDPR relates to the right of access for the Data
Subject.
The Data Subject shall have the right to obtain from the Controller
confirmation as to whether or not personal data concerning them are
being processed, and where that is the case, access to the personal data
and the following information:
●● the purposes of the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipient to whom the personal data
have been or will be disclosed, in particular recipients in third coun-
tries or international organisations;
●● where possible, the envisaged period for which the personal data
will be stored, or, if not possible, the criteria used to determine that
period;
●● the existence of the right to request from the Controller rectifica-
tion or erasure of personal data or restriction of the processing
of personal data concerning the Data Subject or to object to such
processing;
●● the right to lodge a complaint with a supervisory authority;
●● where the personal data are not collected from the Data Subject, any
available information as to their source;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 15(1)).
Where personal data are transferred to a third country or to an interna-
tional organisation, the Data Subject shall have the right to be informed
of the appropriate safeguards pursuant to Article 46 relating to the trans-
fer (Article 15(2)).
The Controller shall provide a copy of the personal data undergoing
processing. For any further copies requested by the Data Subject, the
Controller may charge a reasonable fee based on administrative costs.
Where the Data Subject makes the request by electronic means, and
unless otherwise requested by the Data Subject, the information shall be
provided in a commonly used electronic form (Article 15(3)).
The right to obtain a copy referred to in paragraph 3 shall not adversely
affect the rights and freedoms of others (Article 15(4)).


Dealing with Access Requests


9.07 One commentary6 refers to the advisability of having a process
flow chart in place. The main components referred to include:
●● Data Subject calls or asks for personal data;
●● Data Subject access request form issued;
●● Data Subject access request form returned;
●● personal data located, whether by legal department, data protection
personnel, etc;
●● examining same in relation to whether it includes any third party
information, health data or exempt data;
●● data reviewed by legal department and DPO;
●● personal data copy issued to Data Subject.
The ICO provides the following guidance or checklist:7
1 Is this a subject access request?
No – Handle the query as part of the normal course of business.
Yes – Go to 2.
2 Is there enough information to be sure of the requester’s identity?
No – If the requester’s identity is unclear, ask them to provide
­evidence to confirm it. For example, one may ask for a piece of
information held in your records that the person would be expected
to know, such as membership details, or a witnessed copy of their
signature. Once satisfied, go to 3.
Yes – Go to 3.
3 Is any other information required to find the records they want?
No – Go to 4.
Yes – One will need to ask the individual promptly for any other
information reasonably needed to find the records they want. One
might want to ask them to narrow down their request. For example,
if one keeps all customers’ information on one computer system
and suppliers’ information on another, one could ask what relation-
ship they had with the organisation. Or, one could ask when they
had dealings with the organisation. However, they do have the right
to ask for everything held about them and this could mean a very
wide search. The organisation has a time limit within which to
respond to a subject access request after receiving any further
information it needs. Go to 4.

6 R Morgan and R Boardman, Data Protection Strategy (Sweet & Maxwell, 2003) 252.
7 Data Protection Good Practice Note, Checklist for Handling Requests for Personal
Information (subject access requests), ICO, at https://ico.org.uk.


4 Fee?
No – Go to 5.
Yes – Note, that there are restricted circumstances when a fee may
be charged. Go to 5.
5 Does the organisation hold any information about the person?
No – If one holds no personal information at all about the individual
one must tell them this.
Yes – Go to 6.
6 Will the information be changed between receiving the request and
sending the response?
No – Go to 7.
Yes – One can still make routine amendments and deletions to per-
sonal information after receiving a request. However, one must not
make any changes to the records as a result of receiving the request,
even if one finds inaccurate or embarrassing information on the
record. Go to 7.
7 Does it include any information about other people?
No – Go to 8.
Yes – One will not have to supply the information unless the other
people mentioned have given their consent, or it is reasonable to
supply the information without their consent. Even when the other
person’s information should not be disclosed, one should still sup-
ply as much as possible by editing the references to other people.
Go to 8.
8 Is the organisation obliged to supply the information?
No – If all the information held about the requester is exempt, then
one can reply stating that one does not hold any of their personal
information required to be revealed.
Yes – Go to 9.
9 Does it include any complex terms or codes?
No – Go to 10.
Yes – One must make sure that these are explained so the informa-
tion can be understood. Go to 10.
10 Prepare the response.8
Note that data access rights must be fulfilled within a specified time-
frame. The ICO has recently updated its guidance on the operation of the
time limit for fulfilling the data access right. (See Chapter 10 for further
details on the time limits, and the ICO’s updated guidance.)

8 See above.


GDPR: Rectification and Erasure


9.08 Chapter III, Section 3 of the new GDPR refers to rectification
and erasure.
Rectification Right
9.09 The new GDPR provides that the Data Subject shall have the
right to obtain from the Controller without undue delay the rectification
of inaccurate personal data concerning them. Taking into account the
purposes of the processing, the Data Subject shall have the right to have
incomplete personal data completed, including by means of providing a
supplementary statement (Article 16).
Erasure and Forgetting Right
9.10 The new GDPR provides that the Data Subject shall have the
right to obtain from the Controller the erasure of personal data con-
cerning them without undue delay where one of the following grounds
applies:
●● the data are no longer necessary in relation to the purposes for which
they were collected or otherwise processed;
●● the Data Subject withdraws consent on which the processing is based
according to Article 6(1)(a), or Article 9(2)(a), and where there is no
other legal ground for the processing;
●● the Data Subject objects to the processing pursuant to Article 21(1)
and there are no overriding legitimate grounds for the processing, or
the Data Subject objects to the processing pursuant to Article 21(2);
●● the personal data have been unlawfully processed;
●● the personal data have to be erased for compliance with a legal obli-
gation in EU or state law to which the Controller is subject;
●● the data have been collected in relation to the offering of information
society services referred to in Article 8(1) (Article 17(1)).9
Where the Controller has made the personal data public and is obliged
per the above to erase the personal data, the Controller, taking account of
available technology and the cost of implementation, shall take reason-
able steps, including technical measures, to inform Controllers which
are processing the personal data, that the Data Subject has requested the
erasure by such Controllers of any links to, or copy or replication of that
personal data (Article 17(2)).

9 See P Lambert, The Right to be Forgotten (Bloomsbury, 2019).


Article 17(1) and (2) shall not apply to the extent that processing is
necessary:
●● for exercising the right of freedom of expression and information;
●● for compliance with a legal obligation which requires processing by
EU or state law to which the Controller is subject or for the perfor-
mance of a task carried out in the public interest or in the exercise of
official authority vested in the Controller;
●● for reasons of public interest in the area of public health in accord-
ance with Article 9(2)(h) and (i) as well as Article 9(3);
●● for archiving purposes in the public interest, scientific or histori-
cal research purposes or statistical purposes in accordance with
Article 89(1) in so far as the right referred to in para 1 is likely to
render impossible or seriously impair the achievement of the objec-
tives of that processing;
●● for the establishment, exercise or defence of legal claims
(Article 17(3)).
Synodinou10 refers to the ‘right to oblivion’ and notes in relation to her
research that media rights are not immune to the right to be forgotten.
Examples are given where cases have been successful in preventing par-
ticular media stories dragging up past events long after they had occurred,
including court cases.11 Indeed, many countries already obscure party
names from decisions and judgments so as to render them anonymous,
such as in Germany, Austria, Greece, Finland, Belgium, Hungary, the
Netherlands, Poland and Portugal.12 The right to be forgotten has also
been recognised in France and Belgium.13 UK cases have also granted
anonymity to the Plaintiff,14 as did the Canadian Supreme Court in an
online abuse case.15 To be successful, however, it can be important to
change party initials.16

10 Ta-E Synodinou, ‘The Media Coverage of Court Proceedings in Europe: Striking a
Balance Between Freedom of Expression and Fair Process,’ Computer Law & Security
Review (2012) (28) 208–219, at 217. Also Lambert, above.
11 Above at 218.
12 Above at 218 and fn 106, 218.
13 Above at 208–219, at 217.
14 See XY v Facebook, which also said that the website was a publisher. XY v Facebook,
McCloskey J [2012] NIQB 96, 30 November 2012.
15 AB v Bragg Communications, 27 September 2012. At http://scc-csc.lexum.com/scc-
csc/scc-csc/en/item/10007/index.do.
16 For example, while the parties and director defendants in a Berlin Facebook data
protection case were reduced to initials, reducing one party’s name to ‘MZ’ may not have
been fully effective. See The Federal Association of Consumer Organisations and
Consumer Groups, Federal Consumer Association, GB v Facebook Ireland Limited,
MA, JB, DG, PT and the Chairman [names redacted], [redacted].

9.11 Individual Data Subject Rights

Right to Restriction of Processing


9.11 Article 18 of the new GDPR refers to the right to restriction of
processing. The Data Subject shall have the right to obtain from the
Controller the restriction of the processing where one of the following
applies:
●● the accuracy of the data is contested by the Data Subject, for a period
enabling the Controller to verify the accuracy of the personal data;
●● the processing is unlawful and the Data Subject opposes the eras-
ure of the personal data and requests the restriction of their use
instead;
●● the Controller no longer needs the personal data for the purposes
of the processing, but they are required by the Data Subject for the
establishment, exercise or defence of legal claims;
●● the Data Subject has objected to processing pursuant to Article 21(1)
pending the verification whether the legitimate grounds of the Con-
troller override those of the Data Subject (Article 18(1)).
Where processing has been restricted under Article 18(1), such personal
data shall, with the exception of storage, only be processed with the Data
Subject’s consent or for the establishment, exercise or defence of legal
claims or for the protection of the rights of another natural or legal per-
son or for reasons of important public interest of the EU or of a state
(Article 18(2)).
A Data Subject who obtained the restriction of processing pursuant to
Article 18(1) shall be informed by the Controller before the restriction of
processing is lifted (Article 18(3)).
Notification Obligation re Rectification, Erasure
or Restriction
9.12 The new GDPR provides that the Controller shall communicate
any rectification or erasure of personal data or restriction of processing
carried out in accordance with Articles 16, 17(1) and 18 to each recipient
to whom the personal data have been disclosed, unless this proves impos-
sible or involves disproportionate effort. The Controller shall inform
the Data Subject about those recipients if the Data Subject requests it
(Article 19).

Right to Data Portability


9.13 The new GDPR provides that the Data Subject shall have the right
to receive the personal data concerning them, which they have provided
to a Controller, in a structured, commonly used and machine-readable
format and have the right to transmit those data to another Controller
without hindrance from the Controller to which the personal data have
been provided, where:
●● the processing is based on consent pursuant to Article 6(1)(a) or
Article 9(2)(a) or on a contract pursuant to Article 6(1)(b); and
●● the processing is carried out by automated means (Article 20(1)).
In exercising their right to data portability, the Data Subject has the right
to have the personal data transmitted directly from Controller to Control-
ler, where technically feasible (Article 20(2)).
The exercise of this right shall be without prejudice to Article 17. That
right shall not apply to processing necessary for the performance of a
task carried out in the public interest or in the exercise of official author-
ity vested in the Controller (Article 20(3)).
The right referred to in para 1 shall not adversely affect the rights and
freedoms of others (Article 20(4)).
The EDPB endorsed the previous WP29 guidelines on data portability,
namely,
●● Guidelines on the right to data portability under Regulation
2016/679, WP242 rev.01.
The portability right may become increasingly important as users seek
to move from one service provider to another, and in order to do so
(or to do so in a manner where their data can be easily carried over to
the new provider) may have to force the old provider to furnish the
required data.

Automated Individual Decision Making Right


9.14 Chapter III, Section 4 of the new GDPR refers to the right to
object and automated individual decision making.
Right to Object
9.15 The new GDPR provides that the Data Subject shall have the
right to object, on grounds relating to their particular situation, at any
time to the processing of personal data concerning them which is based
on Article 6(1)(e) or (f), including profiling based on those provisions.
The Controller shall no longer process the personal data unless the
Controller demonstrates compelling legitimate grounds for the pro-
cessing which override the interests, rights and freedoms of the Data
Subject or for the establishment, exercise or defence of legal claims
(Article 21(1)).


Where personal data are processed for direct marketing purposes,
the Data Subject shall have the right to object at any time to the
processing of personal data concerning them for such marketing, which
includes profiling to the extent that it is related to such direct marketing
(Article 21(2)).
Where the Data Subject objects to the processing for direct market-
ing purposes, the personal data shall no longer be processed for such
purposes (Article 21(3)).
At the latest at the time of the first communication with the Data
Subject, the right referred to in Article 21(1) and (2) shall be explicitly
brought to the attention of the Data Subject and shall be presented clearly
and separately from any other information (Article 21(4)).
In the context of the use of information society services, and not-
withstanding Directive 2002/58/EC,17 the Data Subject may exercise
their right to object by automated means using technical specifications
(Article 21(5)).
Where personal data are processed for scientific and historical research
purposes or statistical purposes pursuant to Article 89(1), the Data Sub-
ject, on grounds relating to their particular situation, shall have the right
to object to processing of personal data concerning them, unless the pro-
cessing is necessary for the performance of a task carried out for reasons
of public interest (Article 21(6)).
Automated Individual Decision Making, Including Profiling
9.16 The new GDPR provides that Data Subjects shall have the right
not to be subject to a decision based solely on automated processing,
including profiling, which produces legal effects concerning them or
similarly significantly affects them (Article 22(1)).
Article 22(1) shall not apply if the decision:
●● is necessary for entering into, or performance of, a contract between
the Data Subject and a Controller (Article 22(2)(a));
●● is authorised by EU or state law to which the Controller is subject
and which also lays down suitable measures to safeguard the Data
Subject’s rights and freedoms and legitimate interests (Article 22(2)(b)); or
●● is based on the Data Subject’s explicit consent (Article 22(2)(c)).
In cases referred to in Article 22(2) (a) and (c) the Controller shall
implement suitable measures to safeguard the Data Subject’s rights and

17 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).


freedoms and legitimate interests, at least the right to obtain human inter-
vention on the part of the Controller, to express their point of view and
to contest the decision (Article 22(3)).
Decisions referred to in paragraph (2) shall not be based on special cat-
egories of personal data referred to in Article 9(1), unless Article 9(2)(a)
or (g) apply and suitable measures to safeguard the Data Subject’s rights
and freedoms and legitimate interests are in place (Article 22(4)).
The EDPB has adopted and endorsed the following earlier WP29
guidance:
●● Guidelines on automated individual decision making and profiling
for the purposes of Regulation 2016/679, WP251 rev.01.

Compensation for Data Subjects


9.17 This is an area which will continue to grow in importance. The
facts in the Prudential case could have resulted in actual financial loss
and damage to a Data Subject. While the ICO imposed a fine of
£50,000, civil compensation could also have been a possibility.
In another case, a businessman successfully sued a company for
Spamming.18 Another victim of Spam also sued and received £750 for
a single Spam message received.19 Typically, a Spammer sends tens of
thousands if not millions of Spam messages at a time, so per message
fines can add up, even if only a small sample of them result in fines or
awards.
In the US there have been cases by service providers, in addition to
individual recipients of Spam. Microsoft successfully sued a Spam com-
pany in the UK and received £45,000.20
The trend in Spam laws appears to be to permit recipients, service
providers and a regulator to sue Spammers. Indeed,
Regulation 30(1) PECR Amendment provides that ‘A person who suffers
damage by reason of any contravention … by any other person shall be
entitled to bring proceedings for compensation from that person for that
damage.’
Injunctions can be possible for Spam, as well as other areas involving
data protection rights.

18 H Hart, ‘Cutting Down on Spam,’ Solicitors Journal, 10 March 2005, 290.
19 T Young, ‘Courts Put Price on Unsolicited Email,’ Computing, 15 March 2007, 14.
20 Microsoft v Paul Martin McDonald [2006] EWHC 3410 (Ch).


The ICO also fined the promoters behind a Spam company a total of
£440,000.21 A managing director has also been prosecuted personally,
ie separate and apart from the company entity.22
The area of compensation and civil litigation in vindication of data
protection rights will continue to develop. While the scale of monetary
claims, and awards, can differ as between the US and the UK, it is noted
that one defendant was happy to settle one such case for approx $20 m.23
It is possible that Data Subjects, and indeed Data Subjects who have
not suffered financial loss, can still suffer damage and be awarded com-
pensation or damages. Indeed, the Digital, Culture, Media and Sport
(DCMS) Committee when considering social media and related issues
refers to the existence of ‘online harms’.24
As data fines will no doubt increase, so too will the level of compensa-
tion and damages which may arise for an organisation, albeit that in the
context of individual Data Subjects such compensation will likely trail
the level of monetary data fines. Arguably levels of compensation and
damages have been artificially low. This can partly be explained by a
lack of familiarity with the concepts of data protection, and what some
may have seen as a certain novelty element. There have also been some
robust defence arguments raised on occasion. However, there have also
been some untenable defence arguments raised – but which have not
always been recognised as such. On the plaintiff side, it sometimes also
helps not to appear as the proverbial ambulance chaser. One should also
recall that while compensation and damages are provided for, it may
not be beneficial to present this as the first remedy being sought. After
all, there are many valuable reliefs and remedies available which can
be important solutions for individuals. It is appropriate to seek compen-
sation, but it is sometimes wise not to leave this as the only remedy
or primary remedy being sought. Overall, however, one can expect the
jurisprudence of compensation to develop significantly over the coming
years. After all, in many respects research into the damage that
can arise from misuse, loss, and lack of data security is only now
reaching wider understanding.

21 ICO v Niebel, McNeish and Tetrus Telecoms (2012).
22 See ICO v Cullen [2019] 24/6/2019.
23 Facebook was willing to settle a social media privacy and advertising litigation case
for approx $20 million. This related to the unlawful use of personal data in relation to
the Beacon project.
24 Digital, Culture, Media and Sport (DCMS) Committee, Disinformation and ‘Fake
News’: Final Report (18 February 2019). Available at https://www.parliament.uk/
business/committees/committees-a-z/commons-select/digital-culture-media-and-
sport-committee/news/fake-news-report-published-17-19/.


DPA: Requiring Data Disclosure


9.18 Section 184 of the DPA 2018 provides that it is an offence
to require a person to provide or give access to a relevant record
(see Sch 18 of the Act) for employment, continued employment or a
contract requirement.
Section 185 of the DPA 2018 refers to the need for the avoidance of
certain contractual terms relating to health records. The inclusion of such
a term is void.

Jurisdiction
9.19 GDPR Recital 22 states that any processing of personal data
in the context of the activities of an establishment of a Controller or
a Processor in the Union should be carried out in accordance with the
GDPR, regardless of whether the processing itself takes place within
the EU. Establishment implies the effective and real exercise of activ-
ity through stable arrangements. The legal form of such arrangements,
whether through a branch or a subsidiary with a legal personality, is not
the determining factor in this respect.
GDPR Recital 23 states that in order to ensure that natural persons
are not deprived of the protection to which they are entitled under the
GDPR, the processing of personal data of Data Subjects who are in
the EU by a Controller or a Processor not established in the EU should
be subject to the GDPR where the processing activities are related to
the offering of goods or services to such Data Subjects irrespective of
whether connected to a payment. In order to determine whether such a
Controller or Processor is offering goods or services to Data Subjects
who are in the EU, it should be ascertained whether it is apparent that
the Controller or Processor envisages offering services to Data Subjects
in one or more states in the EU. Whereas the mere accessibility of the
Controller’s, Processor’s or an intermediary’s website in the Union, of an
email address or of other contact details, or the use of a language gener-
ally used in the third country where the controller is established, is insuf-
ficient to ascertain such intention, factors such as the use of a language
or a currency generally used in one or more states with the possibility
of ordering goods and services in that other language, or the mentioning
of customers or users who are in the EU, may make it apparent that the
controller envisages offering goods or services to such Data Subjects in
the EU.
Section 180 of the DPA 2018 refers to certain jurisdiction and jurisdic-
tion of court issues. Section 207 refers to the territorial application of the
DPA 2018.


Complaints to ICO
9.20 GDPR Recital 141 states that every Data Subject should have the
right to lodge a complaint with a single supervisory authority, in particu-
lar in the state of their habitual residence, and the right to an effective
judicial remedy in accordance with Article 47 of the Charter if the Data
Subject considers that their rights under the Regulation are infringed or
where the supervisory authority does not act on a complaint, partially
or wholly rejects or dismisses a complaint or does not act where such
action is necessary to protect the rights of the Data Subject. The inves-
tigation following a complaint should be carried out, subject to judicial
review, to the extent that is appropriate in the specific case. The super-
visory authority should inform the Data Subject of the progress and the
outcome of the complaint within a reasonable period. If the case requires
further investigation or coordination with another supervisory authority,
intermediate information should be given to the Data Subject. In order
to facilitate the submission of complaints, each supervisory authority
should take measures such as providing a complaint submission form
which can also be completed electronically, without excluding other
means of communication.

Organisational Data Protection Group


9.21 GDPR Recital 142 states that where a Data Subject considers
that their rights under the Regulation are infringed, they should have
the right to mandate a not-for-profit body, organisation or association
which is constituted according to the law of a state, has statutory objec-
tives which are in the public interest and is active in the field of the
protection of personal data to lodge a complaint on their behalf with a
supervisory authority, exercise the right to a judicial remedy on behalf
of Data Subjects or, if provided for in state law, exercise the
right to receive compensation on behalf of Data Subjects. A state may
provide for such a body, organisation or association to have the right to
lodge a complaint in that state, independently of a Data Subject’s man-
date, and the right to an effective judicial remedy where it has reason to
consider that the rights of a Data Subject have been infringed as a result
of the processing of personal data which infringes the Regulation. That
body, organisation or association may not be allowed to claim compen-
sation on a Data Subject’s behalf independently of the Data Subject’s
mandate.
Indeed, there are an increasing number of data protection and privacy
groups, both nationally and supra-nationally. The Court of Justice


case striking down the Data Retention Directive was taken by one such
group.25
The DPA 2018 at ss 187–190 refers to the representation of Data
Subjects, including in ‘collective proceedings’.
There have already been a number of cases, including notable cases,
where data protection and privacy groups have been involved – if not
leading the respective cases and complaints to data regulators. Digital
Rights Ireland was instrumental in the decision invalidating the Data
Retention Directive.26 The groundbreaking case where the EU–US Safe
Harbours arrangement was invalidated was taken by an individual asso-
ciated with a European privacy group.27 The recent challenge to the
standard contractual clauses data transfer arrangement (in particular
as it relates to certain data transfers from the EU to the US) included
submissions from the Electronic Privacy Information Centre (EPIC),
the Business Software Alliance (BSA), Digitaleurope, and began with a
complaint from Max Schrems (and an associated data protection privacy
group).28
While there are an increasing number of data protection privacy
groups which take or initiate issue-specific complaints, they have not
generally sought to focus on the potential for compensation and damages.
In addition, an interesting point will be the extent to which established
trade unions or representative bodies may seek to be recognised as an
organisation data protection group as envisaged under the GDPR and
related legislation such as the UK-GDPR.

25 Cases C-293/12 and C-594/12, Digital Rights Ireland and Seitlinger and Others, Court
of Justice, 8 April 2014. Directive 2006/24/EC and amending Directive 2002/58/EC.
26 Directive 2006/24/EC of the European Parliament and of the Council of 15 March
2006 on the retention of data generated or processed in connection with the provision
of publicly available electronic communications services or of public communications
networks, and amending Directive 2002/58/EC. This was the case of Judgment of the
Court (Grand Chamber) of 8 April 2014 (requests for a preliminary ruling from the
High Court of Ireland (Ireland) and the Verfassungsgerichtshof (Austria)) – Digital
Rights Ireland Ltd (C-293/12) v Minister for Communications, Marine and Natural
Resources, Minister for Justice, Equality and Law Reform, The Commissioner of the
Garda Síochána, Ireland and the Attorney General, and Kärntner Landesregierung,
Michael Seitlinger, Christof Tschohl and Others (C-594/12), Joined Cases C-293/12
and C-594/12.
27 Schrems v Data Protection Commissioner, CJEU [2015] Case C-362/14.
28 Facebook Ireland and Schrems, AG Opinion, CJEU [2019] Case C-311/18_O
(19 December 2019).


Court Remedies
9.22 GDPR Recital 143 states that each natural or legal person should
have an effective judicial remedy before the competent national court
against a decision of a supervisory authority which produces legal effects
concerning that person. This is, inter alia, distinguished from other
matters such as opinions and non-legal effects.

Different Supervisory Authorities


9.23 While one of the aims of the GDPR was originally to establish a
single supervisory authority with which organisations would have to
deal, a so-called one-stop-shop, as opposed to dealing with data
protection authorities in each state, there is recognition that circumstances
can arise for other authorities to become involved.
Supervisory authorities may cooperate on specific matters.
GDPR Recital 144 states proceedings may be stayed or declined where
there are similar or identical prior proceedings ongoing in another state.

Plaintiff Choice of Jurisdictions and Courts


9.24 GDPR Recital 145 states that for proceedings against a Control-
ler or Processor, the plaintiff should have the choice to bring the action
before the courts of the states where the controller or processor has an
establishment or where the Data Subject resides, unless the Controller is
a public authority of a state acting in the exercise of its public powers.

Compensation
9.25 GDPR Recital 146 states that the Controller or Processor should
compensate any damage which a person may suffer as a result of pro-
cessing that infringes the GDPR. The Controller or Processor should be
exempt from liability if it proves that it is not in any way responsible
for the damage. The concept of damage should be broadly interpreted
in the light of the case law of the Court of Justice in a manner which
fully reflects the objectives of the GDPR. This is without prejudice to
any claims for damage deriving from the violation of other rules in EU
or state law. Processing that infringes the GDPR also includes process-
ing that infringes delegated and implementing acts adopted in accord-
ance with the GDPR and state law specifying rules of the GDPR. Data
Subjects should receive full and effective compensation for the damage
they have suffered. Where Controllers or Processors are involved in the


same processing, each Controller or Processor should be held liable for
the entire damage. However, where they are joined to the same judicial
proceedings, in accordance with state law, compensation may be
apportioned according to the responsibility of each Controller or Pro-
cessor for the damage caused by the processing, provided that full and
effective compensation of the Data Subject who suffered the damage is
ensured. Any Controller or Processor which has paid full compensation
may subsequently institute recourse proceedings against other Control-
lers or Processors involved in the same processing. The developing issue
of compensation and damages will be one of the more followed areas of
the new data protection regime.

Penalties
9.26 GDPR Recital 148 states that in order to strengthen the enforce-
ment of the rules of the GDPR, penalties including administrative fines
should be imposed for any infringement of the GDPR, in addition to,
or instead of appropriate measures imposed by the supervisory author-
ity pursuant to the GDPR. In a case of a minor infringement or if the
fine likely to be imposed would constitute a disproportionate burden to a
natural person, a reprimand may be issued instead of a fine. Due regard
should be given to the nature, gravity and duration of the infringement,
the intentional character of the infringement, actions taken to mitigate
the damage suffered, degree of responsibility or any relevant previous
infringements, the manner in which the infringement became known to
the supervisory authority, compliance with measures ordered against the
Controller or Processor, adherence to a code of conduct and any other
aggravating or mitigating factor. The imposition of penalties including
administrative fines should be subject to appropriate procedural
safeguards in accordance with the general principles of EU law and the
Charter, including effective judicial protection and due process.

Sanctions
9.27 GDPR Recital 150 states that in order to strengthen and
harmonise administrative penalties for infringements of the GDPR, each
supervisory authority should have the power to impose administrative
fines. The GDPR should indicate infringements and the upper limit and
criteria for setting the related administrative fines, which should be deter-
mined by the competent supervisory authority in each individual case,
taking into account all relevant circumstances of the specific situation,
with due regard in particular to the nature, gravity and duration of the


infringement and of its consequences and the measures taken to ensure
compliance with the obligations under the GDPR and to prevent or
mitigate the consequences of the infringement. Where administrative fines
are imposed on an undertaking, an undertaking should be understood
to be an undertaking in accordance with Articles 101 and 102 TFEU for
those purposes. Where administrative fines are imposed on persons that
are not an undertaking, the supervisory authority should take account
of the general level of income in the state as well as the economic situ-
ation of the person in considering the appropriate amount of the fine.
The consistency mechanism may also be used to promote a consistent
application of administrative fines. It should be for the states to deter-
mine whether and to which extent public authorities should be subject to
administrative fines. Imposing an administrative fine or giving a warn-
ing does not affect the application of other powers of the supervisory
authorities or of other penalties under the GDPR.

GDPR: Right to Portability


9.28 GDPR Article 20 refers to the right to data portability. The Data
Subject shall have the right to receive the personal data concerning them,
which they provided to a Controller, in a structured, commonly used and
machine-readable format and have the right to transmit those data to
another Controller without hindrance from the Controller to which the
data have been provided, where:
●● the processing is based on consent pursuant to Article 6(1)(a) or
Article 9(2)(a) or on a contract pursuant to Article 6(1)(b); and
●● the processing is carried out by automated means (Article 20(1)).
In exercising their right to data portability pursuant to para 1, the Data
Subject has the right to have the data transmitted directly from Control-
ler to Controller, where technically feasible (Article 20(2)).
The exercise of this right shall be without prejudice to Article 17. That
right shall not apply to processing necessary for the performance of a
task carried out in the public interest or in the exercise of official author-
ity vested in the Controller (Article 20(3)).
The right referred to in para 1 shall not adversely affect the rights and
freedoms of others (Article 20(4)).

GDPR: Right to Object, Automated Decisions and Profiling


9.29 Chapter III, Section 4 of the GDPR refers to the right to object to
processing, automated individual decisions and to profiling.


The Data Subject shall have the right to object, on grounds relating
to their particular situation, at any time to the processing of personal
data concerning them which is based on Article 6(1)(e) or (f), includ-
ing profiling based on these provisions. The Controller shall no longer
process the personal data unless the Controller demonstrates compelling
legitimate grounds for the processing which override the interests, rights
and freedoms of the Data Subject or for the establishment, exercise or
defence of legal claims (Article 21(1)).
Where personal data are processed for direct marketing purposes,
the Data Subject shall have the right to object at any time to process-
ing of personal data concerning them for such marketing, which
includes profiling to the extent that it is related to such direct marketing
(Article 21(2)).
Where the Data Subject objects to processing for direct marketing
purposes, the personal data shall no longer be processed for such pur-
poses (Article 21(3)).
At the latest at the time of the first communication with the Data Sub-
ject, the right referred to in paras 1 and 2 shall be explicitly brought to
the attention of the Data Subject and shall be presented clearly and sepa-
rately from any other information (Article 21(4)).
In the context of the use of information society services, and not-
withstanding Directive 2002/58/EC,29 the Data Subject may exercise
their right to object by automated means using technical specifications
(Article 21(5)).
Where personal data are processed for scientific and historical research
purposes or statistical purposes pursuant to Article 89(1), the Data Sub-
ject, on grounds relating to their particular situation, shall have the right
to object to processing of personal data concerning them, unless the pro-
cessing is necessary for the performance of a task carried out for reasons
of public interest (Article 21(6)).

GDPR Automated Individual Decision Making, Including Profiling
9.30 The Data Subject shall have the right not to be subject to a deci-
sion based solely on automated processing, including profiling, which
produces legal effects concerning them or similarly significantly affects
them (see Article 22(1)).

29 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).


Conclusion
9.31 The rights are very important for organisations to recognise and
protect, and reflect in the documented processes, procedures and plans of
the organisation. These need to be incorporated from day one, as it may
not be possible to retrospectively become compliant if the initial collec-
tion and processing was illegitimate. This is increasingly significant as
data protection supervisory authorities and Data Subjects become more
proactive and as the levels of fines and penalties increase. It is clear that
the importance attached to data protection has increased significantly,
and so too the obligation for organisational compliance.

Cases to Consider
9.32 ICO complaints, cases and case studies which may be useful for
organisations to consider include:
●● Morrisons (employer liability to employees for an ex-employee's
breach of employees' data) (also described as the first data breach
class action in the UK) (original decision, Court of Appeal, and
ultimately the Supreme Court appeal). (The issue was employer
responsibility under vicarious liability for a data disclosure by way
of upload to the internet by an errant employee. The High Court and
Court of Appeal found for the claimant victim employees whose data
was uploaded; the Supreme Court disagreed. See Chapter 11);30
●● ICO v Cathay Pacific;31
●● ICO v CRDNN;32
●● ICO v DSG Retail;33
●● ICO v Cullen;34
●● ICO v Bounty (UK);35

30 Various Claimants v Morrison Supermarkets PLC, High Court [2018] EWHC 1123
(QB), Justice Langstaff, 16 May 2018. The errant ex-IT employee, Andrew Skelton,
who released the personal data of over 100,000 employees, was convicted and impris-
oned. That Morrisons was vicariously liable (at trial) opened the possibility of liabil-
ity to each of the affected employees. The company lost on appeal to the Court of
Appeal, but prevailed in the Supreme Court: Morrison v Various [2020] UKSC 12
(1 April 2020). A word of caution is still required in relation to assuming that a
company or employer can never be held vicariously liable for problem data issues. See
Chapter 11 for further discussion.
31 ICO v Cathay Pacific Airways Limited [2020] 4 March 2020.
32 ICO v CRDNN Limited [2020] 2 March 2020.
33 ICO v DSG Retail Limited [2020] 9 January 2020.
34 ICO v Cullen [2019] 24 June 2019.
35 ICO v Bounty (UK) Limited [2019] 11 April 2019.


●● ICO v Grove Pension;36


●● ICO v True Vision;37
●● ICO v IICSA;38
●● NT 1 and NT 2 (Right to Be Forgotten vindicated for businessman);39
●● ICO v Yahoo! UK;40
●● ICO v Facebook (re Cambridge Analytica);41
●● ICO v Independent Inquiry into Child Sexual Abuse (IICSA);42
●● Tele 2 Sverige v Swedish Post and Telecom Authority and
Watson v UK;43
●● Nowak v Data Protection Commissioner44 (re exam scripts and what
is personal data);
●● Valsts policijas Rīgas reģiona pārvaldes Kārtības policijas pārvalde
v Rīgas pašvaldības SIA ‘Rīgas satiksme’;45
●● Camera di Commercio, Industria, Artigianato e Agricoltura di
Lecce v Salvatore Manni;46
●● Puškár v Finančné riaditeľstvo Slovenskej republiky and Kriminálny
úrad finančnej správy;47
●● Breyer v Germany;48
●● VKI v Amazon EU;49
●● ICO v Prudential;50

36 ICO v Grove Pension Solutions Limited [2019] 26 March 2019. Refers to reliance on
incorrect 'professional' advice.
37 ICO v True Visions Productions [2019] 10 April 2019.
38 ICO v Independent Inquiry into Child Sexual Abuse (IICSA) [2017] 18 July 2017.
39 NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB) (13 April 2018). Also see
P Lambert, The Right to be Forgotten (Bloomsbury, 2019).
40 ICO v Yahoo! UK, 21 May 2018. There was a fine of £250,000.
41 ICO v Facebook. The ICO imposed a fine of £500,000. ICO v Facebook [2018]
24 October 2018.
42 ICO v Independent Inquiry into Child Sexual Abuse (IICSA) [2018] 18 September
2018.
43 Tele 2 Sverige v Swedish Post and Telecom Authority and Watson v UK, ECJ/CJEU,
Joined Cases C-203/15 & C-698/15, 21 December 2016.
44 Nowak v Data Protection Commissioner, ECJ/CJEU, Second Chamber, Case
C-434/16, 20 December 2017.
45 Valsts policijas Rīgas reģiona pārvaldes Kārtības policijas pārvalde v Rīgas
pašvaldības SIA ‘Rīgas satiksme,’ ECJ/CJEU, Second Chamber, Case C-13/16, 4 May
2017.
46 Camera di Commercio, Industria, Artigianato e Agricoltura di Lecce v Salvatore
Manni, ECJ/CJEU, Second Chamber, Case C-398/15, 9 March 2017.
47 Puškár v Finančné riaditeľstvo Slovenskej republiky and Kriminálny úrad finančnej
správy, ECJ/CJEU, Second Chamber, Case C-73/16, 27 September 2017.
48 Breyer v Germany, ECJ/CJEU, Case C-582/14, 19 October 2016.
49 VKI v Amazon EU, ECJ/CJEU, Case C-191/15, 29 September 2016.
50 ICO v Prudential, ‘Prudential Fined £50,000 for Customer Account Confusion,’ 6
November 2012.


●● Microsoft v Paul Martin McDonald;51


●● ICO v Niebel, ICO v McNeish;52
●● Rugby Football Union v Viagogo Limited;53
●● Durham County Council v Dunn;54
●● British Gas v Data Protection Registrar;55
●● Brian Reed Beetson Robertson;56
●● Campbell v MGN;57
●● CCN Systems v Data Protection Registrar;58
●● Lindqvist v Kammaraklagaren;59
●● Commission v Bavarian Lager;60
●● Common Services Agency v Scottish Information Commissioner;61
●● Douglas v Hello!;62
●● Halford v UK;63
●● Von Hannover v Germany;64
●● Mosley (newspaper case);
●● Motion Picture Association v BT;65
●● WP29 (now EDPB) and Data Protection Authorities/Google
(re Google policy change and breaches);
●● Barclays/Lara Davies prosecution;66
●● ICO v Sony;67
●● Facebook Beacon case (US);68
●● Digital Rights Ireland and Seitlinger and Others;69
●● Schrems v Commissioner;70
●● Tamiz v Google;71

51 [2006] EWHC 3410.


52 28 November 2012.
53 [2012] UKSC 55.
54 [2012] EWCA Civ 1654.
55 [1998] UKIT DA98 – 3/49/2, 4 March 1998.
56 [2001] EWHC Admin 915.
57 [2004] UKHL 22.
58 [1991] UKIT DA90.
59 Case C-101/01, ECR I-12971.
60 Case C-28/08.
61 [2008] UKHL 47.
62 [2005] EWCA Civ 595.
63 [1997] IRLR 47, ECHR.
64 10 February 2012.
65 28 July 2011.
66 ICO v Lara Davies, December 2012.
67 July 2013.
68 McCall v Facebook, 20 September 2012.
69 Joined Cases C-293/12 and C-594/12, 8 April 2014.
70 Case C-362/14, 6 October 2015.
71 [2012] EWHC 449 (QB) 3 December 2012.


●● Google v Vidal-Hall;72
●● Mosley v Google (various);
●● Google Spain SL, Google Inc v Agencia Española de Protección de
Datos, Mario Costeja González;73
●● Weltimmo v Nemzeti Adatvédelmi és Információszabadság
Hatóság;74
●● Bărbulescu v Romania;75
●● Bărbulescu v Romania (this superior ruling overturning the above
erroneous (and controversial) decision).76
●● Google v Vidal-Hall77 (DPA 1998, s 13(2) was declared an invalid
implementation of EU law and an impermissible restriction of the
Data Subject right to sue for damages and compensation, effectively
making Data Subject claims easier). (One issue going forward will
be any challenges to aspects of the so-called UK-GDPR once fully
enacted, on the basis of being incompatible with the GDPR; and/or
any challenge to a successful EU adequacy decision in relation to
data transfers to the UK, on the basis of differences directly in UK
data protection law or indirectly in relation to wider laws (eg human
rights). There have been ongoing comments by politicians that the
human rights legislation may be amended after Brexit78 – notwith-
standing that this may have serious adverse complications for an EU
data protection adequacy decision for the UK.)

72 [2014] EWHC 13 (QB).


73 Case C-131/12.
74 Case C-230/14, 1 October 2015.
75 ECHR, Case No 61496/08, 12 January 2016.
76 Bărbulescu v Romania, ECtHR (Grand Chamber), Case 61496/08, 5 September 2017.
77 [2015] EWCA Civ 311.
78 One recent example is J Stone, ‘Boris Johnson refuses to Commit to Keeping UK in
Human Rights Convention’ Independent, 5 March 2020.

Chapter 10
Time Limits for Compliance

Introduction
10.01 There are various time limits referred to in the data protection
regime for the accomplishment of particular tasks. Depending on the
task at hand, non-compliance with a time limit could amount to a breach
and/or offence.

Time Limits
These are the main time limits and references contained in the GDPR
and the DPA 2018:
●● 72 hours – Breach notification (GDPR, Recital 85): the Controller
should notify a personal data breach to the supervisory authority
without undue delay and, where feasible, not later than 72 hours after
having become aware of it.
●● 72 hours – Breach notification (GDPR, Article 33): the Controller
shall, without undue delay and, where feasible, not later than
72 hours, notify the personal data breach to the supervisory authority.
●● Without undue delay – Breach notification (GDPR, Recital 86): the
Controller should communicate a personal data breach to the data
subject without undue delay; such communications to data subjects
should be made as soon as reasonably feasible.
●● Without undue delay – Breach notification (GDPR, Article 34):
communicate the personal data breach to the data subject without
undue delay.
●● Inform promptly – Breach notification (GDPR, Recital 87): inform
promptly the supervisory authority and the data subject of a data
breach.
●● One month – Data subject rights (GDPR, Recital 59): for rights
requests (including access, rectification or erasure, and the right to
object), the Controller should respond without undue delay and at
the latest within one month.
●● One month – Action taken (GDPR, Article 12(3)): the Controller
shall provide information on action taken on a request under
Articles 15–22 to the data subject without undue delay and in any
event within one month; the Controller shall inform the data subject
of any extension within one month, with reasons for the delay.
●● Without undue delay – Rectification right (GDPR, Article 16): right
of rectification without undue delay.
●● Without undue delay – Forgetting right (GDPR, Article 17): right to
erasure and forgetting without undue delay.
●● 21 days – Objection right (DPA 2018, s 99(3)): right to object to
processing; the Controller must, before the end of 21 days, give
notice to the data subject stating that the Controller has complied,
intends to comply, or the reasons for not complying.
●● Seven days – Assessment (DPA 2018, s 146(8) and (9)): assessment
notice, before seven days, within seven days.


The ICO has updated its guidance in respect of time limits for complying
with various obligations under the data protection rules. This policy change
follows an EU case dealing with time limit issues (albeit in a differ-
ent subject area). The Toeters and Verberk case holds that ‘Article 50a
of Regulation No 3886/92 laying down detailed rules for the applica-
tion of the premium schemes … must be interpreted as meaning that a
premium application may be regarded as having been ‘lodged’ in due
time only if the competent authority received it prior to the expiry of
the time-limit. The fact that the competent authority was in a position to
transmit certain data to the Commission is irrelevant to the calculation
of a time-limit, which must be applied uniformly throughout the Com-
munity in order, particularly, to maintain equality of treatment between
economic operators.’1
The ICO now advises that:
‘If you exercise any of your rights under data protection law, the organisation
you’re dealing with must respond as quickly as possible. This must be no
later than one calendar month, starting from the day they receive the request.
If the organisation needs something from you to be able to deal with your
request (eg ID documents), the time limit will begin once they have received
this.

If your request is complex or you make more than one, the response time
may be a maximum of three calendar months, starting from the day of
receipt.’2

It also describes what is meant by a calendar month, and states that:


‘A calendar month starts on the day the organisation receives the request,
even if that day is a weekend or public holiday. It ends on the corresponding
calendar date of the next month.

However, if the end date falls on a Saturday, Sunday or bank holiday, the
calendar month ends on the next working day.

Also, if the corresponding calendar date does not exist because the following
month has fewer days, it is the last day of the month.’

1 Maatschap Toeters and MC Verberk v Productschap Vee en Vlees, CJEU [2004]


Case C-171/03.
2 ICO, ‘Time Limits for Responding to Data Protection Rights Requests,’ at https://
ico.org.uk/your-data-matters/time-limits-for-responding-to-data-protection-
rights-requests/.
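The ICO's calendar-month rule above can be expressed as a short date calculation. The sketch below is illustrative only, not the ICO's own method: the function name is this example's own, and the bank-holiday list is an assumption that would need to be supplied in practice (eg from the published UK bank holiday dates).

```python
import calendar
from datetime import date, timedelta

def response_deadline(received, bank_holidays=frozenset()):
    """Illustrative sketch of the ICO calendar-month rule.

    The month starts on the day the request is received and ends on the
    corresponding calendar date of the next month; if that date does not
    exist, the last day of the next month is used; if it falls on a
    weekend or bank holiday, the deadline rolls to the next working day.
    """
    year = received.year + received.month // 12
    month = received.month % 12 + 1
    last_day = calendar.monthrange(year, month)[1]
    # Corresponding date next month, clamped to that month's length.
    deadline = date(year, month, min(received.day, last_day))
    # Roll forward past Saturdays (5), Sundays (6) and bank holidays.
    while deadline.weekday() >= 5 or deadline in bank_holidays:
        deadline += timedelta(days=1)
    return deadline

# A request received on 31 January 2020: the corresponding date would be
# 29 February 2020 (a Saturday), so the deadline rolls to Monday 2 March.
print(response_deadline(date(2020, 1, 31)))
```

Under Article 12(3), a complex request may extend the period by a further two months; on this sketch, that extension would be computed the same way from the original deadline.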


Conclusion
10.02 When an organisation fails to comply within the required time-
frame, this in itself is another breach. It may be taken into consideration
by the ICO and/or a court.

Chapter 11
Enforcement and Penalties
for Non-Compliance

Introduction
11.01 There are a series of offences set out in the data protection
legislation. These are designed to ensure compliance with the data protec-
tion regime, from collection to fair use of personal data. Organisations
must fully comply with their obligations. In addition to questions arising in
relation to their continued use of personal data if it has not been collected
fairly, investigations, prosecutions and financial penalties can also arise.
Organisations can be fined up to 20,000,000 EUR or 4% of worldwide
turnover, depending on the breach, or as set out in national legislation as
appropriate.
Offences by organisations can also be prosecuted. Individuals can also
be fined and prosecuted in addition to the organisation.

Breaches, Offences and Penalties


11.02 The data protection regime sets out the rules with which Con-
trollers must comply. Breaches of these rules sometimes involve offences
which are punishable by fines. The offences set out in the data protection
regime include the following:
●● EU General Data Protection Regulation (GDPR) (see below);
●● offence of unlawful obtaining, etc, of personal data (Data Protection
Act 2018 (DPA 2018), s 170);
●● new offence of re-identification of de-identified personal data (DPA
2018, s 171) (also see s 172 re re-identification effectiveness testing
conditions);
●● alterations, etc, of personal data to prevent disclosure to Data Sub-
ject (DPA 2018, s 173).


In addition, offences are also referred to in the supplementary provisions,
as follows:
●● penalties for offences (DPA 2018, s 196);
●● prosecution (DPA 2018, s 197);
●● liability of directors, etc (DPA 2018, s 198);
●● recordable offences (DPA 2018, s 199);
●● guidance about PACE code of practice (DPA 2018, s 200).
See ICO penalties, fines and prosecutions details below.

Criminal Offences
11.03 The organisation can commit criminal offences in relation to its
data processing activities, namely:
●● unlawful obtaining or disclosure of personal data;
●● selling and offering to sell personal data;
●● enforcing individual Data Subject access;
●● disclosure of information;
●● obstructing or failing to assist in the execution of a warrant;
●● processing without a register entry (if still required);
●● failing to notify changes regarding registration (if still required);
●● carrying on assessable processing;
●● failing to make certain particulars available;
●● failing to comply with a notice;
●● making a false statement in response to a notice.

Other Consequences of Breach


11.04 Some of the other consequences of a breach of the data protec-
tion regime include:
●● offences can be committed by the organisation and its officers;
●● processing focused offences;
●● access related offences;
●● offences for non-compliance with ICO and ICO related powers
(eg, enforcement notices; assessments; information notices; entry
and inspection);
●● being sued by individual Data Subjects;
●● potentially being sued by rights-based organisations (or follow-
ing complaints by such entities) and/or in class action type cases
(eg Google re Apple; the DRI case; the Schrems case), which are
likely to increase under the GDPR;


●● publicity of an unwanted nature;


●● adverse publicity;
●● lost or diminished sales and share value;
●● reduced sale price (eg as occurred with Yahoo when a massive data
breach was disclosed).
Offence Regarding Unlawful Obtaining or
Procuring Personal Data
11.05 In one case there were successful convictions regarding blag-
ging and unlawful access to the personal data relating to tenants. The
case involved Philip Campbell Smith, Adam Spears, Graham Freeman
and Daniel Summers. Amongst other things, the ICO again called for
custodial sentences for breach of the data protection regime. This was
also a theme in the Leveson recommendations.
The ICO states:
‘The Department for Work and Pensions hold important information about
each and every one of us. We are very pleased that a DWP staff member was
alert to this attempt to blag information and that the call was halted before it
was too late. The motive behind Mr Braun’s action was financial. He knew
that such an underhand method of obtaining the tenant’s personal information
was illegal but carried on regardless. This case shows that unscrupulous indi-
viduals will continue to try and blag peoples’ details until a more appropriate
range of deterrent punishments is available to the courts. There must be no
further delay in introducing tougher powers to enforce the Data Protection Act
beyond the current “fine only” regime.’1

The ICO also said:


‘The scourge of data theft continues to threaten the privacy rights of the UK
population. Whilst we welcome today’s sentencing of the private investigator,
Graham Freeman, and his three accomplices, the outcome of the case under-
lines the need for a comprehensive approach to deterring information theft.
If SOCA had been restricted to pursuing this case solely using their powers
under the Data Protection Act then these individuals would have been faced
with a small fine and would have been able to continue their activities the very
next day. This is not good enough. Unscrupulous individuals will continue to
try and obtain peoples’ information through deception until there are strong
punishments to fit the crime. We must not delay in getting a custodial sentence
in place for section 55 offences under the Data Protection Act.’2

1 ‘Private Detectives Jailed for Blagging: ICO Statement,’ ICO, 27 February 2012, at
https://ico.org.uk.
2 ‘Private Detectives Jailed for Blagging: ICO Statement,’ ICO, 27 February 2012, at
https://ico.org.uk.


Offences by Direct Marketers under PECR


11.06 Offences arise in relation to activities referred to under the
additional ePrivacy and e-marketing rules, including under the ePrivacy
Regulation and amended UK PECR rules (subject to further amendment
once the EU ePrivacy Regulation is finalised).
Fines can be significant. Organisations could previously be fined up to
£500,000 by the ICO for unwanted marketing phone calls and emails in
accordance with the previous PECR rules. Fines increase significantly in
potential under the new GDPR and ePrivacy Regulation.
The ICO imposes significant fines in relation to Spamming, includ-
ing sometimes the promoters and directors of the company engaged in
Spamming. Similar examples arise in relation to private investigation
firms engaged in breaches. The types of instances where personal liabil-
ity for breaches arise will continue to increase.
Significant financial penalties also frequently arise for illegal market-
ing activities. (See examples below in this chapter.)

ICO Monetary Penalties


11.07 The ICO is empowered to impose monetary penalties. See
examples of some of the recent ICO fines and monetary penalties
referred to below.

GDPR Changes re Fines and Prosecution


11.08 As noted above, the new GDPR sets out significant amendments
to the prosecution and fines regime (see below).
Right to Compensation and Liability
11.09 Any person who has suffered material or non-material damage
as a result of an infringement of the GDPR shall have the right to receive
compensation from the Controller or Processor for the damage suffered
(Article 82(1)).
Any Controller involved in processing shall be liable for the damage
caused by processing which infringes the GDPR. A Processor shall be
liable for the damage caused by processing only where it has not complied
with the obligations of the GDPR specifically directed to Processors or
where it has acted outside or contrary to lawful instructions of the
Controller (Article 82(2)).
A Controller or Processor shall be exempt from liability under
Article 82(2) if it proves that it is not in any way responsible for the
event giving rise to the damage (Article 82(3)).


Where more than one Controller or Processor, or both a Controller


and a Processor are involved in the same processing and where they
are, in accordance with Article 82(2) and (3), responsible for any dam-
age caused by the processing, each Controller or Processor shall be held
liable for the entire damage in order to ensure effective compensation of
the Data Subject (Article 82(4)).
Where a Controller or Processor has, in accordance with Article 82(4),
paid full compensation for the damage suffered, that Controller or
Processor shall be entitled to claim back from the other Controllers or
Processors involved in the same processing that part of the compensation
corresponding to their part of responsibility for the damage in accord-
ance with the conditions set out in Article 82(2) (Article 82(5)).
Court proceedings for the right to receive compensation shall be
brought before the courts competent under the law of the state referred
to in Article 79(2).
Court proceedings against a Controller or Processor can be brought in
the state where they are established or in the state where the Data Subject
resides (Article 79(2)).
This will increasingly be a potential risk issue for companies.
General Conditions for Imposing Administrative Fines
11.10 Each supervisory authority shall ensure that the imposition of
administrative fines pursuant to Article 83 in respect of infringements
of the GDPR referred to in Article 83(4), (5) and (6) shall in each individual
case be effective, proportionate and dissuasive (Article 83(1)).
Administrative fines shall, depending on the circumstances of each
individual case, be imposed in addition to, or instead of, measures
referred to in Article 58(2) (a) to (h) and (j). When deciding whether to
impose an administrative fine and deciding on the amount of the admin-
istrative fine in each individual case due regard shall be given to the
following:
●● the nature, gravity and duration of the infringement taking into
account the nature, scope or purpose of the processing concerned as
well as the number of Data Subjects affected and the level of damage
suffered by them;
●● the intentional or negligent character of the infringement;
●● any action taken by the Controller or Processor to mitigate the
damage suffered by Data Subjects;
●● the degree of responsibility of the Controller or Processor taking
into account technical and organisational measures implemented by
them pursuant to Articles 25 and 32;
●● any relevant previous infringements by the Controller or Processor;


●● the degree of co-operation with the supervisory authority, in order


to remedy the infringement and mitigate the possible adverse effects
of the infringement;
●● the categories of personal data affected by the infringement;
●● the manner in which the infringement became known to the super-
visory authority, in particular whether, and if so to what extent, the
Controller or Processor notified the infringement;
●● where measures referred to in Article 58(2), have previously been
ordered against the Controller or Processor concerned with regard to
the same subject-matter, compliance with these measures;
●● adherence to approved codes of conduct pursuant to Article 40 or
approved certification mechanisms pursuant to Article 42; and
●● any other aggravating or mitigating factor applicable to the cir-
cumstances of the case, such as financial benefits gained, or losses
avoided, directly or indirectly, from the infringement (Article 83(2)).
If a Controller or Processor intentionally or negligently, for the same or
linked processing operations, infringes several provisions of the GDPR,
the total amount of the fine shall not exceed the amount specified for the
gravest infringement (Article 83(3)).
Infringements of the following provisions shall, in accordance with
Article 83(2), be subject to administrative fines up to 10,000,000 EUR,
or in case of an undertaking, up to two percent of the total worldwide
annual turnover of the preceding financial year, whichever is higher:
●● the obligations of the Controller and the Processor pursuant to
Articles 8, 11, 25 to 39, 42 and 43;
●● the obligations of the certification body pursuant to Articles 42 and 43;
●● the obligations of the monitoring body pursuant to Article 41(4)
(Article 83(4)).
Infringements of the following provisions shall, in accordance with
Article 83(2), be subject to administrative fines up to 20,000,000 EUR,
or in case of an undertaking, up to four percent of the total worldwide
annual turnover of the preceding financial year, whichever is higher:
●● the basic principles for processing, including conditions for consent,
pursuant to Articles 5, 6, 7 and 9;
●● the Data Subjects’ rights pursuant to Articles 12–22;
●● the transfers of personal data to a recipient in a third country or an
international organisation pursuant to Articles 44–49;
●● any obligations pursuant to state laws adopted under Chapter IX;
●● non-compliance with an order or a temporary or definitive limitation
on processing or the suspension of data flows by the supervisory
authority pursuant to Article 58(2) or failure to provide access in
violation of Article 58(1) (Article 83(5)).


Non-compliance with an order by the supervisory authority as referred


to in Article 58(2) shall, in accordance with Article 83(2), be subject to
administrative fines up to 20,000,000 EUR, or in case of an undertaking,
up to 4 percent of the total worldwide annual turnover of the preceding
financial year, whichever is higher (Article 83(6)).
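The two-tier ‘whichever is higher’ ceiling in Article 83(4)–(6) can be illustrated with a short calculation. This is a minimal sketch, not legal advice: the function name and the ‘tier’ shorthand are this example's own labels for the Article 83(4) and Article 83(5)/(6) maxima.

```python
def max_admin_fine(turnover_eur, tier):
    """Upper limit of a GDPR administrative fine for an undertaking.

    tier 1: Article 83(4) infringements - up to 10,000,000 EUR or 2% of
            total worldwide annual turnover, whichever is higher.
    tier 2: Article 83(5)/(6) infringements - up to 20,000,000 EUR or 4%
            of total worldwide annual turnover, whichever is higher.
    """
    caps = {1: (10_000_000, 0.02), 2: (20_000_000, 0.04)}
    fixed_cap, pct = caps[tier]
    # 'Whichever is higher' of the fixed cap and the turnover percentage.
    return max(fixed_cap, pct * turnover_eur)

# An undertaking with 1bn EUR turnover: 4% (40m EUR) exceeds 20m EUR,
# so the tier-2 ceiling is 40m EUR.
print(max_admin_fine(1_000_000_000, 2))
```

Note that under Article 83(3), where several provisions are infringed for the same or linked processing operations, the total fine must not exceed the amount specified for the gravest infringement.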
Without prejudice to the corrective powers of supervisory authorities
pursuant to Article 58(2), each state may lay down the rules on whether
and to what extent administrative fines may be imposed on public
authorities and bodies established in that state (Article 83(7)).
The exercise by the supervisory authority of its powers under this
Article shall be subject to appropriate procedural safeguards in conform-
ity with EU law and state law, including effective judicial remedy and
due process (Article 83(8)).
Where the legal system of the state does not provide for administrative
fines, Article 83 may be applied in such a manner that the fine is initi-
ated by the competent supervisory authority and imposed by competent
national courts, while ensuring that those legal remedies are effective
and have an equivalent effect to the administrative fines imposed by
supervisory authorities. In any event, the fines imposed shall be effec-
tive, proportionate and dissuasive. Those states shall notify to the Com-
mission those provisions of their laws which they adopt pursuant to
Article 83 (Article 83(9)).
Penalties
11.11 States shall lay down the rules on penalties applicable to
infringements of the GDPR in particular for infringements which are not
subject to administrative fines pursuant to Article 83, and shall take all
measures necessary to ensure that they are implemented. Such penalties
shall be effective, proportionate and dissuasive (Article 84(1)).
Each state shall notify to the Commission the provisions of its law
which it adopts pursuant to Article 84(1), by 25 May 2018 and, without
delay, any subsequent amendment affecting them (Article 84(2)).

Civil Sanctions under the DPA


11.12 This is an area which will continue to expand in future. Indi-
vidual Data Subjects can also sue for compensation under DPA 2018
(see ss 168, 169). Where a person suffers damage as a result of a failure
by a Controller or Processor to meet their data protection obligations,
then the Controller or Processor may be subject to civil sanctions by the
person affected. Damage suffered by a Data Subject will include damage
to reputation, financial loss and mental distress.


Section 168 recognises Article 82 of the GDPR and the right to com-
pensation for material or non-material damage. In addition, the section
adds that ‘non-material damage’ includes distress. This significantly
clarifies and widens the scope for Data Subject actions, compensation
and damages.
Compensation and damages can also be awarded to Data Subject repre-
sentative bodies (s 168(3)). (Also see representation of Data Subjects in
ss 187–190.)
While there are certain conditions, the primary clause refers to ‘any
contravention’ by the organisation. In terms of the defence, in order to
be able to avail of it, an organisation will have to establish that it has
‘taken such care as in all the circumstances was reasonably required.’
This will vary from organisation to organisation, sector to sector, the
type of personal data involved, the risks of damage, loss, etc, the nature
of the security risks, the security measures and procedures adopted, the
history of risk, loss and damage in the organisation and sector.
Certain types of data will convey inherent additional risks over oth-
ers, such as loss of financial personal data. This can be argued to require
higher obligations for the organisation.
One interesting area to consider going forward is online damage, such
as viral publication, defamation, bulletin boards, discussion forums and
websites (or sections of websites), and social media websites. Where
damage occurs as a result of misuse or loss of personal data or results
in defamation, abuse and threats, liability could arise for the individual
tortfeasors as well as the website.
While there are three eCommerce defences in the eCommerce Direc-
tive, one should recall that the data protection regime (and its civil rights,
sanctions, duty of care and liability provisions) are separate and stand
alone from the eCommerce Directive legal regime. Indeed, even in terms
of the eCommerce defences one should also recall that: (a) an organi-
sation must first fall within an eCommerce defence, and not lose that
defence, in order to avail of it; and (b) there is no automatic entitlement
for an internet service provider (ISP) or website to a global eCommerce
defence, as in fact there is not one eCommerce defence but three specific
defences relating to specific and technical activities. Not all or every ISP
activity will fall into one of these defences. Neither will one activity fall
into all three defences.
It is also possible to conceive of a website which has no take down
procedures, inadequate take down defences, or non-expeditious take
down procedures or remedies, and which will face potential liability
under privacy and data protection as well as eCommerce liability.
For example, an imposter social media profile which contains personal
data and defamatory material could attract liability for the website
­operator under data protection, and under normal liability if none of

118
Civil Sanctions under the DPA 11.13

the eCommerce defences were available or were lost. The later could
occur if, for example, the false impersonating profile was notified to the
website (or it was otherwise aware) but it did not do anything.3
As indicated above, the issue of civil liability and litigation by Data Subjects to enforce their rights will continue. This will in part be fuelled by increased awareness, but also by the increasingly publicised instances of data loss, data breach, damage and abuse involving personal data. The Facebook Beacon settlement, while in the US, for approximately $20 million emphasises the import of individuals seeking to vindicate their privacy and personal data rights. In the UK, while Prudential was fined £50,000, it is not inconceivable that loss, damage and financial loss may have ensued. Consider, further, that this example appears to have related to a small number of individuals. What if thousands of customers suffered financial loss, damage, stress, delay, missed payments, lost flights, lost contracts, etc, as a result of files being mixed up or files and transactions not being completed? These are all things which can occur, whether through process errors or software glitches, and it is not at all far-fetched, as customers of RBS bank will confirm. Google was also sued in relation to a breach involving Apple users. Claimants also sought to hold Morrisons liable for a data breach caused by an errant IT employee (see further discussion below on the Morrisons case).4
While we are well aware of the press phone hacking scandal, the
Leveson Inquiry and the ensuing litigation from many victims, as well as
there being admissions, interception offences, etc, there are also personal
data breach and data protection civil liability issues arising.
Morrisons Case
11.13 The Morrisons case is a further example of the increasing attention being paid to the important issues of data protection and private personal data. It also exemplifies the earlier discussion referring to data protection as being both inward facing as well as outward facing (and which has also been highlighted from the earliest edition of this publication).
This case is also interesting for many other aspects. Primarily, it is
receiving attention because of the claim of vicarious liability as against

3 This is a complex and developing area of law, common law, civil law, Directive,
GDPR and case law, both in the UK and internationally. A full detailed analysis is
beyond this current work.
4 Various Claimants v Morrison Supermarkets plc [2018] EWHC 1123 (QB). The errant ex-IT employee, Andrew Skelton, who released the personal data on over 100,000 employees, was convicted and imprisoned. That Morrisons was held vicariously liable at trial and on appeal in the Court of Appeal opened the possibility of liability to each of the affected employees. The company indicated that it would appeal (and the Supreme Court ultimately differed with the two lower courts).


the employer to other employees resulting from data breach issues. The
case involved a large volume of employee records being posted onto
the internet. The employee records were also distributed to a number
of media outlets – after being further copied or downloaded from the
internet.
There are important issues of liability and responsibility – hence the
emphasis on the ability of individuals to be compensated when affected
by breaches, such as breaches of compliance, rights, data misuse, data
loss and data disclosure.
The issues involved are also demonstrated as being important given the
extensive jurisprudential record of the case. While the Supreme Court5
differed somewhat, the affected employees’ rights were vindicated as
against the employer by the High Court6 and the Court of Appeal.
The case also demonstrates the difficult and complex nature of many modern data protection cases given the interface of law, rights and technology. Sometimes a detailed appreciation – and even expertise – of IT and computing is required in order to appreciate when and how data protection breaches occur.
I should preface these comments by indicating that this is not a full and detailed examination of all issues discussed and raised in the case, nor indeed a review of all of the evidence, testimony and record available to the respective courts. It focuses mainly on the following issues.
First, there may be an immediate impulse on the part of some to suggest that the case (at least in the Supreme Court) rules out the possibility that employers can be held vicariously liable for errant employees' acts regarding personal data, or wider still that no data protection use or misuse can give rise to liability or vicarious liability. Both suggestions would be incorrect. Indeed, the Supreme Court decision focuses only on an employee's acts (apparently) without knowledge of the employer.
The second point is that the appeal(s) appear focused on issues of vicarious liability alone. Other liability issues can arise and may be contested in other circumstances.
This appears to be because issues of primary liability were discounted by the trial court. (While Morrisons appealed the vicarious liability finding against it, it does not seem that the plaintiffs appealed the lack of primary liability finding.) This may well be a live issue in other cases.
While there may or may not be good reasons for the various findings, the record available does raise questions as to whether all appropriate issues referred to were included in the public record. It is unclear whether different outcomes may have been possible with the benefit of further testing of particular issue points in the case – particularly at trial.

5 Morrisons v Various [2020] UKSC 12 (1 April 2020).


6 Various v Morrisons (Rev 1) [2017] EWHC 3133 (QB) (1 December 2017) Langstaff J.


The appellate levels appear to have been deprived of the ability to consider legal and factual circumstances and arguments, which may
have given rise to primary liability or been relevant to such an issue. The
appeals should not be mistaken as deciding and ruling out the potential
for primary liability being contested in an appropriate circumstance.
The points of discussion in this analysis focus primarily on some of the
questions which arise from the decision records as to whether there were
potential gaps or matters not fully available to the court(s) to explore and
consider. There may well be additional points.
The trial court seems to have been inclined to discount the primary liability claim(s) on the basis, in part, of testimony on what is and is not relevant accepted industry practice in data protection and data security. Some of this discussion as recounted in the trial decision raises some questions as to accuracy and generality. A fuller examination of these issues might potentially have persuaded the judge to a different conclusion on primary liability.
This emphasises the need in certain cases for very careful analysis
of forensic and computer technology issues – and up to and including
a variety of different experts on each side to present pertinent evidence.
There is great attention paid to the temporality of the upload to the internet by the errant employee, and while some earlier issues of significance are discussed (at trial), these earlier issues do not appear (from the record available) to have received all the attention that they potentially deserve.
The record would arguably have benefitted from a fuller elucidation as to the exact contracts, policies and terms which existed at the time of the incident(s) at issue. These are highly pertinent to cases involving data breach issues with inward facing personal data.
It is not clear to this reader that the firm was not in substantial and material breach of its own policies. Further, there is suggestion even from the record that there were numerous breaches by the firm. In appropriate circumstances these may be very relevant to primary liability issues.
In this context it is necessary to see what all of the relevant policies say – and do not say – and to have appropriate (even expert) explanation as to the significance of these. While the parties' direct evidence can be useful, it can also be appropriate for a court to have the benefit of some less involved (or dare I say, less biased) independent information on policies, procedures and standards.
A firm may have ten policies in place. Yet if one of the important
core issues at stake should be addressed in a required policy 11 and is
not covered in policies 1–10, that is a potential material failing on the
company’s part. However, these nuances need to be carefully appraised
and considered by the court. It is suggested that the record in this case
demonstrated a number of material gaps or failings. It can sometimes be more important for a court to be aware of the gaps in policies than what current policies may say.
It is also noted that in the present case the company acknowledges that there were policy changes between the date of the events at issue and the trial date. It would be useful to understand the nature of these changes in much greater depth than is presently possible from the current record.
A glaring gap in the record for the court was the exact date and manner of transmission or delivery of the vast payroll documentation to KPMG by the errant employee. The case is also inconclusive on the very important point of how delivery was achieved.
One would normally expect that this would be a central and core issue
for the employer to have to establish. Perhaps it is, but that is not easily
evident.
It is a glaring issue that the firm was not able to establish when the documents were given to KPMG. This is a very important and potentially impactful point. The various decisions make clear that it is acknowledged that when the documents were delivered was not established by the firm. This seems an essential issue to have been proven. In fact, the decision seems to suggest that it was left up to the trial judge to try and guess a time period when delivery may have occurred.
This seems somewhat remiss if that is the case (and raises the follow-on query as to whether it benefited the company not to have this issue clarified with specificity).
That the firm could not establish a definitive date and time is (very)
curious. This in itself demonstrates policy and procedure gaps. Some
may call this a policy and evidence failing – or data security failings.
Why were there no documented procedures to safeguard the data when
transferring it to a third party and to ensure delivery was guaranteed and
recorded? Was there no email before or after delivery to KPMG? Was
there no internal email confirming delivery? If delivery was by hand,
was there no cover letter? Is it secure and data protection compliant to
deliver an electronic storage device with a large set of sensitive data to a
third party without cover documentation? Why was the firm not able to
establish any company-side policy or delivery recordal?
Why was the firm not able to evidence any correspondence or recordal
documentation from KPMG demonstrating when delivery occurred?
Why was there no evidence from KPMG indicating when it opened the data delivered?
Some might suggest that these gaps and lack of clarity are a clear demonstration of the firm breaching primary responsibilities.
One would be surprised if there was not some further potential record
available in relation to the delivery issue. If so, why was this apparently not available to the court? It seems that there was an element of forensics devoted to other issues regarding the errant employee, but there is apparently no record of forensics available to the court dealing with the delivery issues.
The when issue is potentially of large significance to the primary liability issue – and also the vicarious liability issue. The trial judge narrowed the window of delivery (an official task for the employee) to between 15 November and 21 November. The judge also found that the employee also copied or downloaded the data in question to his own USB device (USB2) on 18 November.
So the potentially important question of when delivery occurred comes into play. Did the errant download on 18 November come before delivery to KPMG? Put another way, did the download to USB2 (a personal device) occur prior to delivery to KPMG and therefore during an official authorised activity?
Note that the firm had (a small number of) official, designated USB
devices. Why did data security policy and procedures not pick up and
alert management of (a) an unauthorised device access/connection; and
(b) an unauthorised and very large download (indeed, so large that it
could not be sent by internal email)?
While these are primarily questions which jump out, and certain matters are indeed unclear, it is clear that the respective courts in different circumstances may well have had more to contend with. Indeed, as interesting as the case is already, further factual clarity may have led to the case being of even more importance.
It is also noted that the Supreme Court seems in part concerned primarily to correct misunderstanding or misinterpretation in prior judicial jurisprudence on vicarious liability generally, over and above the data protection issues arising. Even on the record available from this case, companies should not assume that they are immune from primary liability – nor vicarious liability – in appropriate circumstances. The story of primary or vicarious liability is by no means over.
A separate issue also appears to have been suggested, namely that some
distinction needs to be drawn as to when data is copied, and whether the
original data remains intact in situ. The suggestion seems to be that there
is no liability or responsibility to the employer if the original data has
not been deleted. While this aspect of the case requires further review,
some might suggest this as being curious to the extent that it may permit all sorts of mischief. If a hacker hacks into a bank and steals data which is then sold to identity thieves, is the hacker immune because the original data remains at the bank? Is there no remedy against a bad actor ransomware attacker because they are not changing or deleting data, but rather locking the firm, hospital, etc out of the ability to access its own data? Is an employee immune for selling the customer database of an employer to a competitor merely because the original data still sits with
the employer? Is the company always provided with a blanket defence
because an employee or third party attacker copied and did not delete
data? Potential unintended mischiefs may occur if data deletion becomes
a precondition to other legal consequences arising.
Finally, there appears to be some suggestion in the court records which diverges from established data protection rules and indeed official data regulator guidance. It is not clear if this is intended or unintended in the decision, and it may need to be explored further.
This is a significant decision, of course, but the stakes at play will
mean that this will not be the last contested case dealing with these types
of matters.
Director Liability and Offences
11.14 The potential for personal liability (DPA 2018, s 198) is also a significant issue for an organisation and those within positions of authority where the organisation is involved in offences.
Employees and executives can also be liable for offences if they
obtain, disclose or procure the personal data in the organisation for their
own purposes.
Section 198 of the DPA 2018 refers to liability of directors, etc.
Section 198(1) provides that where an offence under the Act has been
committed by a body corporate and is proved to have been committed
with the consent or connivance of or to be attributable to any neglect on
the part of any director, manager, secretary or similar officer of the body
corporate or any person who was purporting to act in any such capacity,
they as well as the body corporate shall be guilty of that offence and be
liable to be proceeded against and punished accordingly.
One interesting case involves using initials in case names.7 This also
occurred in the recent case of NT1 and NT2, both of which related to the

7 In a data protection related case in Berlin, not only was the company sued but the
directors were also personally named and included in the case. One of the reasons
specified was to ensure that the directors made sure that the Board of the company
made the appropriate amendments to ensure the breaches were rectified and would
not re-occur. A further interesting aspect of the case is that while the directors were
included personally, their names were redacted to initials. One such set of initials
was ‘MZ’. See The Federal Association of Consumer Organisations and Consumer
Groups, Federal Consumer Association, GB v Facebook Ireland Limited, MA, JB,
DG, PT and the Chairman MZ, [names redacted], [redacted] Reach, [redacted] Quay,
Dublin, Ireland.


erasure and forgetting right.8 Many cases in European countries have anonymised plaintiffs.
Offences by Processors
11.15 Processors can also commit offences.

Remedies, Liability and Penalties under the GDPR


11.16 Chapter VIII refers to remedies, liability and penalties regarding data protection.
Right to Lodge Complaint with Supervisory Authority
11.17 Without prejudice to any other administrative or judicial remedy, every Data Subject shall have the right to lodge a complaint with a supervisory authority, in particular in the state of their habitual residence, place of work or place of the alleged infringement, if the Data Subject considers that the processing of personal data relating to them does not comply with the GDPR (Article 77(1)).
The supervisory authority with which the complaint has been lodged shall inform the complainant of the progress and the outcome of the complaint, including the possibility of a judicial remedy pursuant to Article 78 (Article 77(2)).
Right to Judicial Remedy against Supervisory Authority
11.18 Without prejudice to any other administrative or non-judicial remedy, each natural or legal person shall have the right to an effective judicial remedy against a legally binding decision of a supervisory authority concerning them (Article 78(1)).
Each Data Subject shall also have the right to an effective judicial remedy where the supervisory authority which is competent does not handle a complaint or does not inform the Data Subject within three months of the progress or outcome of the complaint lodged pursuant to Article 77 (Article 78(2)).
Right to an Effective Judicial Remedy Against Controller
or Processor
11.19 Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory

8 NT 1 & NT 2 v Google LLC [2018] EWHC 799 (QB).


authority under Article 77, each Data Subject shall have the right to an effective judicial remedy if they consider that their rights under the GDPR have been infringed as a result of the processing of their personal data in non-compliance with the GDPR (Article 79(1)).
Proceedings against a Controller or a Processor shall be brought
before the courts of the state where the Controller or Processor has an
establishment. Alternatively, such proceedings may be brought before
the courts of the state where the Data Subject has his or her habitual
residence, unless the Controller or Processor is a public authority of a
state acting in the exercise of its public powers (Article 79(2)).
Representation of Data Subjects
11.20 The Data Subject shall have the right to mandate a not-for-profit body, organisation or association which has been properly constituted according to the law of a state, whose statutory objectives are in the public interest, and which is active in the field of the protection of Data Subjects' rights and freedoms with regard to the protection of their personal data, to lodge the complaint on their behalf, to exercise the rights referred to in Articles 77, 78 and 79 on their behalf and to exercise the right to receive compensation referred to in Article 82 on their behalf where provided for by state law (Article 80(1)).
States may provide that any body, organisation or association referred to in Article 80(1), independently of a Data Subject's mandate, has the right to lodge in that state a complaint with the supervisory authority which is competent pursuant to Article 77 and to exercise the rights referred to in Articles 78 and 79 if it considers that the rights of a Data Subject have been infringed as a result of the processing (Article 80(2)).
There is increasing evidence of more such groups being formed and becoming active in advocacy, filing complaints with data regulators and, one expects, also seeking to develop jurisprudence in relation to compensation and damages. While most of the groups are clearly data protection and privacy focused, it may be that new groups will aim to focus more on compensation issues than advocacy issues. It also remains to be seen if existing bodies seek to cast themselves as partly privacy groups (eg employees' unions).
Suspension of Proceedings
11.21 Where a competent court of a state has information on proceedings concerning the same subject matter as regards processing by the same Controller or Processor that are pending in a court in another state, it shall contact that court in the other state to confirm the existence of such proceedings (Article 81(1)).
Where proceedings concerning the same subject matter as regards
processing by the same Controller or Processor are pending in a court in
another state, any competent court other than the court first seized may
suspend its proceedings (Article 81(2)).
Where those proceedings are pending at first instance, any court other
than the court first seized may also, on the application of one of the
­parties, decline jurisdiction if the court first seized has jurisdiction over
the actions in question and its law permits the consolidation thereof
(Article 81(3)).
Right to Compensation and Liability
11.22 A person who has suffered damage as a result of an infringement of the GDPR shall have the right to receive compensation from the Controller or Processor for the damage suffered (Article 82(1)). (See para 11.09 above for more particular details.)
General Conditions for Imposing Administrative Fines
11.23 An organisation can have fines and penalties imposed for non-compliance. The fines regime is significantly updated under the new data protection regime; in particular, the levels of fines and penalties are increased. See para 11.10 above for further details.
Penalties
11.24 States shall lay down the rules on penalties applicable to
infringements of the GDPR in particular for infringements which are
not subject to administrative fines. Further details are referred to in
para 11.11 above.
Fines and Remedy Levels
11.25 Article 83 relates to administrative fine sanctions. Each supervisory authority is empowered to impose administrative fines, which are meant to be effective, proportionate and dissuasive, and which take account of the nature, gravity and duration of the breach, amongst other factors.
Certain fines can be up to EUR 10,000,000 (or equivalent) or, in the case of an enterprise, up to two per cent of its annual worldwide turnover, whichever is higher. Other fines can amount to EUR 20,000,000 (or equivalent) or four per cent of turnover, whichever is higher. Further details are referred to in para 11.10 above.
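The 'whichever is higher' comparison in Article 83 can be sketched as a simple calculation. This is an illustrative sketch only: the function name and the two-tier flag are assumptions for this example, and any actual fine depends on the factors a supervisory authority must weigh.

```python
def article83_fine_cap(annual_worldwide_turnover_eur: float, higher_tier: bool) -> float:
    """Illustrative maximum administrative fine under GDPR Article 83.

    higher_tier=True models the EUR 20m / 4% tier; False models the
    EUR 10m / 2% tier. The cap is whichever amount is higher.
    """
    fixed_cap = 20_000_000 if higher_tier else 10_000_000
    pct = 0.04 if higher_tier else 0.02
    return max(fixed_cap, pct * annual_worldwide_turnover_eur)

# A firm with EUR 2bn turnover: 4% (EUR 80m) exceeds the EUR 20m floor.
print(article83_fine_cap(2_000_000_000, higher_tier=True))   # 80000000.0
# A small firm: the fixed EUR 10m floor exceeds 2% of turnover.
print(article83_fine_cap(5_000_000, higher_tier=False))      # 10000000
```

The point of the sketch is that the fixed amount operates as a floor, not a ceiling: for large enterprises the turnover percentage governs.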


Powers and Functions of the ICO


11.26 The ICO has a number of promotion, investigation and enforcement powers and functions under the DPA. These include:
● to assess whether especially risky or dangerous types of processing are involved;
● to prepare and publish codes of practice for guidance in applying data protection law in particular areas;
● cases;
● investigations;
● complaints;
● audits;
● fines.
Part 6 of the DPA 2018 refers to enforcement issues. This specifies:
● Information Notices (ss 142–145);
● Assessment Notices (ss 146–147);
● destroying or falsifying information and documents in relation to information notices and assessment notices (s 148);
● Enforcement Notices (ss 149–153);
● powers of entry and inspection (s 154) (also see powers of entry and inspection referred to in Sch 15 of the DPA 2018);
● penalties (ss 155–159);
● remedies in court: compliance orders and compensation (ss 167–169);
● dealing with complaints via Data Subjects (s 165);
● orders to progress complaints (via Data Subjects) (s 166).
Financial Penalties
11.27 The ICO can issue monetary penalty notices or fines. This is
where serious breaches occur which are deliberate and with knowledge
or where there ought to be knowledge of the breach and of the likely
damage.
Inspections
11.28 The ICO may also undertake inspections of organisations. This is where the ICO suspects that there may be a breach of the DPA and it obtains a warrant to search the organisation's premises. In such an instance the ICO may enter, search, inspect, examine, test equipment, and inspect and seize documents, materials, etc. Obstructing such an inspection will be a criminal offence. Legal advice should be sought immediately in relation to such incidents.


ICO Data Enforcement, Loss/Data Breaches, Fines and Convictions
11.29
Issue Date Party Breach Penalty
Nuisance 27/3/20 Black Lion Unsolicited direct £171,000
calls Marketing Ltd marketing calls.
Access 13/3/20 Young Town clerk prosecuted Prosecution
for blocking access
and deleting records to
frustrate access.
Data security 4/3/20 Cathay Pacific Failure to protect and £500,000
Airways Ltd secure customer data,
leading to hack exposure.
Lack of appropriate data
security over four years.
Automated 2/3/20 CRDNN Ltd 193 million automated £500,000
nuisance nuisance spam marketing
calls calls.
Automated 2/3/20 CRDNN Ltd 193 million automated Enforcement
nuisance nuisance spam marketing Notice
calls calls.
Disclosure 15/3/20 Kirk Former social worker Prosecution
breach disclosed data on service
users to third party service
firm.
Data breach 9/1/20 DSG Retail Ltd 14 million customers £500,000
hack exposed after point of sale
hack attack.
Data security 20/12/19 Doorstep 500,000 medical record £275,000
Dispensaree Ltd data documents left in
unsecure manner.
Access 5/12/19 Shaw Former reablement officer Prosecution
breach accessed care record data
without any business
need.
Access 2/12/19 Shipsey Former Social Services Prosecution
breach support officer accessed
care record data without
any business need.
Nuisance 17/9/19 Superior Unsolicited direct £150,000
calls Style Home marketing calls. Call to
Improvements Ltd telephone preference
service (TPS) opted-out
numbers.

129
11.29 Enforcement and Penalties for Non-Compliance

Nuisance 17/9/19 Superior Unsolicited direct Enforcement


calls Style Home marketing calls. Call to Notice
Improvements Ltd TPS opted-out numbers.
Access right 12/8/19 Hudson Bay Enforcement Notice for Enforcement
Finance Ltd failing to fulfil data access Notice
right.
Nuisance 2/8/19 Making it Easy Ltd Unsolicited direct £160,000
calls marketing calls. Calls to
TPS opted-out numbers.
Nuisance 2/8/19 Making it Easy Ltd Unsolicited direct Enforcement
calls marketing calls. Calls to Notice
TPS opted-out numbers.
Disclosure 19/7/19 Life at Parliament Failure to secure customer £80,000
breach View Ltd data; left exposed for two
years.
Access 25/6/19 Metropolitan Police Data access right; repeat Enforcement
Service (MPS) fulfilment breaches. Notice
Nuisance 24/6/19 EE Ltd 2.5 million direct £100,000
texts marketing text
communications.
Collection 24/6/19 Cullen Former managing director Prosecution
breaches. involved in illegal data
Sale collections and data sale
breaches. for insurance claims.
Nuisance 13/6/19 Smart Home Unsolicited direct £90,000
calls Protection Ltd marketing calls. Calls to
TPS opted-out numbers.
Access 7/6/19 Masterson Former customer service Prosecution
breach advisor accessed anti-
social behaviour record
data without any business
need.
Access 6/6/19 Baines Restorative Justice Prosecution
breach Case Worker victim and
offender record data to
personal email without
any business need.
Collection 10/5/19 Her Majesty’s Failure to obtain adequate Enforcement
breach. Revenue and consent to collect caller Notice
Consent Customs (HMRC) personal data.
breach.
Nuisance 7/5/19 Hall and Hanley Sending over 3.5 million £120,000
texts Ltd direct marketing text
communications.

130
ICO Data Enforcement, Loss/Data Breaches, Fines and Convictions 11.29

Nuisance 16/4/19 Avalon Direct Ltd Over 50,000 unsolicited £80,000


calls direct marketing calls.
Calls to TPS opted-out
numbers.
Nuisance 16/4/19 Avalon Direct Ltd Over 50,000 unsolicited Enforcement
calls direct marketing calls. Notice
Calls to TPS opted-out
numbers.
Disclosure 11/4/19 Bounty (UK) Ltd Illegal sharing and £400,000
breach disclosure of personal
data.
Collection 10/4/19 True Visions Illegal filming in £120,000
Consent Productions maternity ward.
Transparency Transparency. Consent.
breaches Prior information.
Disclosure 4/4/19 London Borough of Illegal disclosure of data £145,000
breach. Newham of persons on police
Data security. database.
Nuisance 26/3/19 Grove Pension Two million unsolicited £40,000
emails Solutions Ltd direct marketing emails.
Relying on ‘misleading’
professional advices.
Nuisance 19/3/19 Vote Leave Ltd Political. Unsolicited £40,000
texts direct marketing text
communications.
Compliance 7/2/19 Magnacrest Ltd Prosecution for failure to Prosecution
comply with Enforcement
Notice re data access
right.
Nuisance 1/2/19 Leave.EU Group Political. 300,000 £15,000
marketing Ltd (2) unsolicited
communications.
Nuisance 1/2/19 Leave.EU Group Political. Unsolicited £45,000
marketing Ltd (1) communications without
consent.
Nuisance 1/2/19 Eldon Insurance Unsolicited direct £60,000
emails Services Ltd marketing emails without
(TA GoSkippy consent.
Insurance)
Nuisance 1/2/19 Eldon Insurance Unsolicited direct Enforcement
emails Services Ltd marketing emails without Notice
(TA GoSkippy consent.
Insurance)

131
11.29 Enforcement and Penalties for Non-Compliance

Nuisance 31/1/19 A Green Legal Unsolicited direct £80,000


calls Services Ltd marketing emails without
consent.
Nuisance 31/1/19 A Green Legal Unsolicited direct Enforcement
calls Services Ltd marketing emails without Notice
consent.
Nuisance 22/1/19 NWR Ltd Over 800,000 unsolicited Enforcement
calls direct marketing calls. Notice
Calls to TPS opted-out
numbers.
Nuisance 13/12/18 Tax Returned Ltd 14.8 million unsolicited £200,000
texts direct marketing text
communications.
Nuisance 13/12/18 Tax Returned Ltd 14.8 million unsolicited Enforcement
texts direct marketing text Notice
communications.
Data security. 26/11/18 Uber Failure to secure customer £385,000
Hack. data against hack attack.
Nuisance 26/11/18 Solartech North Almost 75,000 unsolicited Enforcement
calls East Ltd direct marketing calls. Notice
Calls to TPS opted-out
numbers.
Nuisance 23/11/18 DM Design 1.6 million unsolicited £160,000
calls Bedrooms Ltd direct marketing calls.
Calls to TPS opted-out
numbers.
Nuisance 23/11/18 Solartech North Almost 75,000 unsolicited £90,000
calls East Ltd direct marketing calls.
Calls to TPS opted-out
numbers.
Nuisance 23/11/18 DM Design 1.6 million unsolicited Enforcement
calls Bedrooms Ltd direct marketing calls. Notice
Calls to TPS opted-out
numbers.
Various 16/11/18 Metropolitan Police Crime data, profiling, Enforcement
Service (MPS) gangs database matrix. Notice
Processing data for longer
than necessary.
Excessive processing of
personal data.
Processing not fair or
lawful.
Nuisance 31/10/18 Secure Home Almost 85,000 unsolicited £80,000
calls Systems Ltd direct marketing calls.
Calls to TPS opted-out
numbers.

ICO Data Enforcement, Loss/Data Breaches, Fines and Convictions 11.29
Nuisance 30/10/18 ACT Response Ltd Over 495,000 unsolicited £140,000
calls direct marketing calls.
Calls to TPS opted-out
numbers.
Social media 24/10/18 Facebook Re Cambridge Analytica £500,000
data breach political data harvesting
and profiling and data breaches.
Social media 24/10/18 Aggregate IQ Data Data analytics for political Enforcement
data breach Service Ltd profiling and other Notice
and profiling activities.
Nuisance 9/10/18 Boost Finance Ltd Millions of unsolicited £90,000
calls direct marketing calls.
Data security 8/10/18 Heathrow Airport USB lost by employee £120,000
Ltd (HAL) with personal data, lack of
encryption.
Nuisance 1/10/18 Oakland Assist UK Unsolicited direct £150,000
calls Ltd marketing calls.
Data security 28/9/18 BUPA Insurance Data security breaches £175,000
Services Ltd failing to secure personal
data.
Data breach. 20/9/18 Equifax Ltd Failure to secure personal £500,000
Hack. data of 15 million citizens
in hack attack.
Access 6/9/18 London Borough of Enforcement Notice order Enforcement
Lewisham regarding data access Notice
rights.
Nuisance 5/9/18 Everything DM Ltd 1.43 million unsolicited £60,000
emails marketing emails.
Nuisance 5/9/18 Everything DM Ltd 1.43 million unsolicited Enforcement
emails marketing emails. Notice
Collection 9/8/18 Lifecycle Illegal collection, use £140,000
breach. Marketing (Mother and sale of personal data,
Sale breach. and Baby) Ltd including mothers and
children.
Nuisance 1/8/18 AMS Marketing Over 75,000 unsolicited £100,000
calls Ltd direct marketing calls.
Calls to TPS opted-out
numbers.
Compliance 3/7/18 Noble Design and Failure to comply with Prosecution
Build Ltd Information Notice
Sensitive 18/7/18 Independent Inquiry Breach of identity of child £200,000
data breach into Child Sexual sex abuse victims.
Abuse (IICSA)
Spam 28/6/18 Our Vault Ltd Spam calls. £70,000
Spam 20/6/18 BT Email marketing spam. £77,000

Sensitive 11/6/18 Gloucestershire Revealed abuse victim £80,000
data breach Police details in bulk email.
Hacking 7/6/18 British & Foreign Responsible for exposure £100,000
Bible Society of computer system to
hacking attack.
Records, 23/5/18 Bayswater Medical Special personal data and £35,000
security medical records left in
unattended building.
Data breach 21/5/18 Yahoo! UK Mass data breach. ISP. £250,000
Security 21/5/18 University of Serious security breach, £120,000
breach Greenwich data of 20K people,
included sensitive
personal data.
Data breach, 17/5/18 Daniel Short Recruitment agent took Prosecution
security, candidate details for new
offence, job.
unlawful
obtaining
Security, 16/5/18 Crown Prosecution Loss of police interviews £325,000
encryption Service (CPS) on DVD, not encrypted.
Unlawful 23/4/18 Michelle Harrison Hospital health employee Prosecution
access accessing patient records
without need.
Unlawful 16/4/18 Royal Borough Unlawfully identified £120,000
identification of Kensington & people.
Chelsea
Security 5/4/18 Humberside Police Rape victim interview £130,000
recording on disk lost.
Unlawful 7/2/18 Philip Bagnall Unlawfully obtaining Prosecution
access customer data of former
employer.
Consent 31/1/18 Holmes Financial Automated marketing £300,000
Services calls without consent.
Nuisance 15/9/17 Your Money Rights 146 million illegal and £350,000
calls Ltd heavy-handed nuisance
spam calls.
Special health 11/8/17 Brioney Woolfe Former Colchester Prosecuted
data breach. hospital employee at Colchester
Unlawful prosecuted for (a) Magistrates
access and unlawfully obtaining Court.
disclosure sensitive health data of Convicted
friends and (b) unlawfully
disclosing sensitive health
data.
Data breach 11/8/17 TalkTalk Customer data breach. £100,000

Nuisance 3/8/17 Laura Anderson Nuisance calls and breach £80,000
calls TA Virgo Home of do not call register
Improvements telephone preference
service.
Nuisance 3/8/17 HPAS TA Safestyle Nuisance calls and breach £70,000
calls of do not call register
telephone preference
service.
Unlawful 21/7/17 Stuart Franklin Unlawful disclosure of Birmingham
disclosure CVs of applicants to then Magistrates
employer, the Controller, Court.
to third party. Conviction.
Spam email 20/7/17 Moneysupermarket. Price comparison website £80,000
com fined for spam emails.
Spam texts 17/7/17 Provident Personal Spam marketing texts. £80,000
Credit Union
Data breach 3/7/17 Royal Free Sensitive health data. Curiously no
Various unlawful data fine.
breaches by hospital Curiously no
to third party research deletion.
company DeepMind.
Website data 27/7/17 Boomerang Video Website cyber attack. £60,000
breach Did not take preventative
steps.
Nuisance 22/6/17 MyHome Nuisance calls and breach £50,000
calls Installations of do not call register
telephone preference
service.
Marketing 16/6/17 Morrisons Incorrect use of data and £10,500
emails supermarkets customer marketing.
Data breach 12/6/17 Gloucester City Cyber attack and access to £100,000
Council special employee data.
Blagging 8/6/17 Joseph Walker Blagging calls to obtain Prosecution
calls data from insurance
companies re claims and
to resell data.
Data breach 31/5/17 Basildon Borough Unlawfully publishing £150,000
Council special data about a
family in online planning
documents.
Spam texts 17/5/17 Concept Car Credit Spam texts. £40,000
Spam calls 17/5/17 Brighter Home Nuisance calls and breach £50,000
Solutions of do not call register
telephone preference
service.

Unlawful 16/5/17 Sally Anne Day Employee unlawfully Prosecution.
access accessed sensitive patient Conviction.
health data.
Spam texts 16/5/17 Onecom Spam texts. £100,000
Spam calls 10/5/17 Keurboom 99.5m Spam calls. ‘Record’
Communications £400,000
Data breach 4/5/17 Greater Manchester 3 DVDs lost with videos £150,000
Police of special sex crime
victim interviews lost.
Data breach 2/5/17 Construction Failed to protect customer £55,000
Materials Online data.
Spam texts 19/4/17 Moneva Spam texts. £40,000
Various 5/4/17 Great Ormond Sharing records with other £11,000
breaches Street Hospital charities.
Children’s Charity Profiling.
Matching data from other
data sources not provided
directly.
Collection 5/4/17 Battersea Dogs and Finding personal data not £9,000
breach Cats Home provided.
Various 5/4/17 Cancer Research Profiling. £16,000
breaches UK Matching data from other
data sources not provided
directly.
Unlawful 5/4/17 Cancer Support UK Sharing data with others £16,000
disclosure regardless of cause.
Various 5/4/17 Macmillan Cancer Profiling. £14,000
breaches Support Matching data from other
data sources not provided
directly.
Various 5/4/17 NSPCC Profiling. £12,000
breaches Matching data from other
data sources not provided
directly.
Matching 5/4/17 Oxfam Matching data from other £6,000
data sources not provided
directly.
Various 5/4/17 Guide Dogs for Profiling. £15,000
breaches Blind Matching data from other
data sources not provided
directly.
Various 5/4/17 International Fund Sharing data with others £18,000
breaches for Animal Welfare regardless of cause.
Profiling.
Matching data from other
data sources not provided
directly.

Various 5/4/17 Royal British Profiling. £12,000
breaches Legion Matching data from other
data sources not provided
directly.
Various 5/4/17 WWF UK Profiling. £9,000
breaches Matching data from other
data sources not provided
directly.
Unlawful 5/4/17 Eileen McMillan Hospital employee Prosecution
access unlawfully accessed
sensitive health files
on estranged family
members.
Spam texts 30/3/17 PRS Media Spam texts without £140,000
consent of 4.4m people.
Spam calls 30/3/17 Xternal Property Nuisance calls and breach £80,000
of do not call register
telephone preference
service.
Spam emails 27/3/17 Flybe 3.3m spam emails despite £70,000
objection opt outs.
Emails 27/3/17 Honda PECR breach. £13,000
Data breach 20/3/17 Norfolk County Old files including special £60,000
Council data left in cabinet given
to second-hand shop.
Breach 16/3/17 Barrister Senior barrister failed to £1,000
keep client file secure.
Unlawful 16/3/17 Gregory Oram Leaving employee Prosecution
obtaining unlawfully obtaining
personal data from
employer and emailing
to personal email to start
rival recruitment firm.
Spam texts 14/3/17 Munee Hut Outsourcing spam texts. Prosecution.
Also fined
£20,000
Spam calls 9/3/17 Media Tactics 22m spam calls. £270,000
Unlawful 3/3/17 Elaine Lewis Former nurse unlawfully Prosecution
access accessed sensitive
medical records.
Security 28/2/17 HCA International Health company failed to £200,000
keep IVF fertility records
secure.
Spam texts 15/2/17 Digitonomy Spam texts. £120,000
Sale and 2/2/17 Data Supply Sale of records, resulting £20,000
spam Company in purchaser spam texts.

CCTV 1/2/17 Karthikesu Newsagent uses CCTV. Prosecution
Section 17 offence.
Texts 24/1/17 LAD Media Spam texts. £50,000
Unlawful 18/1/17 Rebecca Gray Employee leaving Prosecution
obtaining recruitment company
emailed client personal
data when moving to new
recruitment firm.
Spam calls 16/1/17 IT Project Ltd Nuisance calls and breach £40,000
of do not call register
telephone preference
service.
Data loss 10/1/17 Royal & Sun Data loss of 60,000 £150,000
Alliance customers.
Unlawful 10/1/17 Minty, Leong & Unlawful access Prosecution
access Craddock (blagging) of insurance
data to make claims.
Access 20/12/16 Wainwrights Estate Failure to comply with Prosecution
information notice after
failure to comply with
access request.
Unlawful 9/12/16 British Heart Secretly screened and £18,000
processing Foundation profiled to target for
money.
Unlawful 9/12/16 RSPCA Matching, profiling, £25,000
processing screening for wealth
targeting.
Unlawful 2/12/16 Monnapula Former health employee Prosecution
access unlawfully accessed
special health files of
people she knew.
Spam texts 30/12/16 Oracle Insurance Spam texts. £30,000
Brokers
Spam texts 29/11/16 Silver City Spam texts. £100,000
Unlawful 11/11/16 Severs and Unlawful access. Prosecution
access Billington Blagging.
Blagging
Spam calls 10/11/16 Assist Law Nuisance calls and breach £30,000
of do not call register
telephone preference
service.
Spam texts 7/11/16 Nouveau Finance 2.2m spam texts. £70,000

Unlawful use 2/11/16 Tandon Emailed data to personal Prosecution
email and sold to third
party.
Unlawful 28/10/16 Evans Former NHS employee Prosecution
access unlawfully accessed special
data of girlfriend of
partner.
Unlawful 25/10/16 Wooltorton Former hospital employee Prosecution
access unlawfully accessing
special data of people she
knew.
Spam texts 13/10/16 Rainbow (UK) Ltd Spam texts. £20,000
Data breach 5/10/16 Talk Talk Data breach. £400,000
Cyber attack accessing
customer data.
160,000 customers.
Spam texts 28/9/16 Ocean Finance Spam texts. £130,000
Spam texts 15/9/16 CarFinance 247 Spam texts. £30,000
Texts 8/9/16 Vincent Bond Unsolicited texts. £40,000
Spam calls 8/9/16 Omega 1.6m spam calls. £60,000
Security 25/8/16 Whitehead Nursing Not protecting special £15,000
data.
Data breach 16/8/16 Hampshire County Personal data files found £100,000
Council in disused building.
Unlawful 11/8/16 Regal Chambers Unlawful disclosure of £40,000
disclosure Surgery data of woman and family
to estranged husband.
Information 21/7/16 Clarity Leeds Failure to comply with Prosecution
notice information notice after
failure to comply with
access request.
Spam calls 9/6/16 Advanced VOIP Spam calls. £180,000
Spam texts 9/6/16 Quigley & Carter Spam texts. £80,000
Data breach 8/6/16 Chief Constable of Email identifying sex £150,000
Dyfed-Powys offenders sent to member
of public in error.
Spam calls 16/5/16 Check Point Claims 17.5m spam calls. £250,000
Spam texts 11/5/16 Better for the 500,000 spam Brexit texts £50,000
Country
Unlawful 9/5/16 Chelsea & Revealed emails of HIV £180,000
disclosure Westminster patients.
Hospital

Data breach. 4/5/16 Blackpool Teaching Inadvertently published £185,000
Unlawful Hospitals NHS online workers’
disclosure Foundation Trust confidential data
including their National
Insurance number, date of
birth, religious belief and
sexual orientation.
Nuisance 27/4/16 Nevis Home 2.5 million recorded £50,000
calls Improvements Ltd phone calls.
Data breach/ 21/4/16 Chief Constable of Fined after sensitive £80,000
Unlawful Kent Police personal details of woman
disclosure who accused her partner
of domestic abuse passed
to the suspect.
Trying to buy 7/4/16 David Barlow Former LV employee Prosecution
data without Lewis David Barlow and fine.
consent Lewis prosecuted
at Bournemouth
Magistrates’ Court for
attempting to commit a s
55 offence, by attempting
to obtain personal data
without the controller’s
consent. Pleaded guilty to
trying to get an existing
LV employee to sell him
customer data.
Failure to 5/4/16 Keurboom Both prosecuted at Luton Prosecution
comply with Communications Magistrates’ Court for of company
notice Limited and failing to comply with a and directors.
director Gregory third party information Company
Rudd notice issued by the fined £1,500,
Commissioner in relation plus costs and
to an investigation for director fined
PECR breaches. The £1,000
communication company
pleaded guilty to the s 47
offence.
Nuisance 1/4/16 Advice Direct Ltd Fined by ICO. £20,000
calls with
fake number
Failure to 24/3/16 TalkTalk Telecom Data breach and failure £1,000 per
comply Group Plc to comply with breach reg 5C(2) of
with breach notification rules. PECR
notification Privacy and Electronic
rules Communications (EC
Directive) Regulations
2003.

Nuisance 17/3/16 FEP Heatcare Spam calls. £180,000
calls 2.6m calls.
Access 29/3/16 MI Wealth Failure to comply with Ordered to
request Management Ltd access request comply
Security of 10/3/16 Chief Constable Investigation file lost. Undertaking
data Wiltshire Data breach. to comply
Constabulary with Seventh
Principle
Nuisance 29/2/16 Prodial Ltd Spam calls. £350,000
calls 46m calls.
Spam texts 22/11/15 UKMS Money Spamming. £80,000
Solutions Ltd 1.3m spam texts.
(UKMS)
Spam calls 10/11/15 Oxygen Ltd Unsolicited automated £120,000
marketing calls.
Stolen 4/11/15 Crown Prosecution Stolen laptops with £200,000
laptops Service (CPS) police interview videos
of victims and witnesses,
mostly for ongoing
violent or sexual type
cases.
Spam texts 27/10/15 Help Direct UK Ltd Unsolicited marketing £200,000
text messages.
20/10/15 Pharmacy 2U Ltd Online pharmacy sold £130,000
details of more than
20,000 customers to
marketing companies.
No consent. No
notification to customers.
Information 8/10/15 Nuisance Call Failing to respond to Prosecuted
notice Blocker Ltd information notice.
Automated 30/9/15 Home Energy 6m automated marketing £200,000
marketing & Lifestyle calls. Breach of marketing
calls Management Ltd call regulations.
(HELM)
Unsolicited 16/9/15 Cold Call Unsolicited marketing £75,000
marketing Elimination Ltd calls.
calls
Enforcement 20/8/15 Google Inc Ordered to remove nine Enforcement
notice search results after ICO notice
Right to be ruled the information
Forgotten linked was no longer
relevant.

Nuisance 10/8/15 Point One Nuisance calls. £50,000
calls Marketing Ltd
(previously
Conservo Digital
Ltd)
Lost data 6/8/15 The Money Shop Loss of computer £180,000
equipment with
significant amount of
customer details.
Non- 4/8/15 Consumer Claims Personal injury claims Prosecution.
notification Solutions Ltd telemarketing company. Guilty plea to
the s17 non-
notification
offence
Loss of 18/5/15 South Wales Police Fine for losing a video £160,000
unencrypted recording which formed
DVDs part of the evidence in a
sexual abuse case. Despite
containing a graphic and
disturbing account, DVDs
were unencrypted and left
in a desk drawer.
Non- 16/4/15 Lismore Failing to notify with the Prosecution.
notification Recruitment Ltd ICO. Guilty plea
Spam calls 1/4/15 Direct Assist Ltd Direct marketing calls £80,000
to people without their
consent.
Unauthorised 30/3/15 Serious Fraud Witness in a serious fraud, £180,000
disclosure Office bribery and corruption
investigation mistakenly
sent evidence relating to
64 other people in the case.
Spam texts 24/3/15 Sweet Media Ltd Enforcement notice to Enforcement
stop sending nuisance notice
Spam texts. ICO raid.
SIM cards and computers
seized. Sent over 4.5m
texts.
Spam texts 17/3/15 Help Direct UK Order to stop spam texts. Enforcement
notice
Unlawful 12/3/15 Yasir Manzoor Former customer Prosecution
access service assistant at
Lloyds Banking Group
prosecuted for unlawfully
accessing a former
partner’s bank account.

Unlawful 26/2/15 Bernard Fernandes Former support clerk at Prosecution
access Transport for London
prosecuted for unlawfully
accessing the oyster card
records of family and
neighbours.
Hack 24/2/15 Staysure.co.uk Ltd IT security failings let £175,000
Security hackers access customer
Breach records. More than 5,000
customers had their credit
cards used by fraudsters
after the attack on the
online holiday insurance
company. Hackers
potentially accessed over
100,000 live credit card
details, and customers’
medical details. Credit
card CVV numbers were
also accessible despite
industry rules that they
should not be stored
at all.
Privacy 30/1/15 Google Inc Google Inc signs Privacy
Policy undertaking committing Policy
to making further changes
to its amalgamated
Privacy Policy in order to
ensure compliance with
the first principle of
DPA 1998.
Failure to 6/1/15 Tivium Ltd Failure to respond to Prosecution.
respond to information notice. Fined
information £5000, plus
notice compensation
plus costs.
Spam texts 6/1/15 Optical Express Enforcement notice to Enforcement
(Westfield) Ltd stop Spam texts. notice
Spam calls 27/12/14 Kwik Fix Plumbers Nuisance spam calls £90,000
Ltd targeting vulnerable
victims. In several
cases, the calls resulted
in elderly people being
tricked into paying for
boiler insurance they did
not need.

Failure to 19/12/14 Vodafone Failure to comply £1,000
report breach with the personal
data breach reporting
requirements under the
Privacy and Electronic
Communications
(EC Directive)
Regulations 2003.
Spam texts 5/12/14 Parklife Manchester Unsolicited marketing £70,000
Ltd texts. Texts sent to 70,000
people who bought tickets to last
year’s event, and appeared
on recipients’ mobile phone
as sent by ‘Mum’.
Enforcement 19/11/14 Grampian Health ICO ordered NHS Enforcement
notice Board (NHS Grampian to ensure notice
Data breach Grampian) patients’ information better
protected. Six data breaches
in 13 months where papers
containing sensitive
personal data were left
abandoned in public areas
of a hospital. In one case
the data was found at a
local supermarket.
Unlawful 13/11/14 Harkanwarjit Former West Sussex Prosecution
access Dhanju Primary Care Trust
pharmacist prosecuted
for unlawfully accessing
medical records of
family members, work
colleagues and local
health professionals.
Spam calls 12/11/14 Hot House Roof Enforcement notice to Enforcement
Company stop Spam calls. notice
Company 11/11/14 Matthew Devlin Company director Prosecution
director Matthew Devlin fined
after illegally accessing
Everything Everywhere’s
(EE) customer database,
to target them with
services of his own
telecoms companies.
Hack 5/11/14 Worldview Ltd Serious data breach £7,500
Security when vulnerability on
Breach company’s site allowed
hackers to access full
payment card details of
3,814 customers.

Spam email 21/10/14 Abdul Tayub Enforcement notice. Enforcement
Sending unsolicited notice
marketing mail by
electronic means without
providing information
as to his identity and
without prior consent.
Spam calls 1/10/14 EMC Advisory Spam calls. Also failure £70,000
Services Ltd to ensure persons
registered with TPS, or
who previously asked not
to be contacted, were not
called.
Spam email 12/9/14 All Claims Enforcement notice. Enforcement
Marketing Ltd Spam email and without notice
providing information as
to its identity.
Spam calls 3/9/14 Winchester and Enforcement notice. Spam Enforcement
Deakin Limited calls. Included calls to notice
(also trading as people who had registered
Rapid Legal and with the TPS, or who had
Scarlet Reclaim) asked not to be contacted.
Breach 24/8/14 Ministry of Justice Serious failings in the £180,000
way prisons handled
people’s information.
Unlawful 22/8/14 Dalvinder Singh Santander banker fined Prosecution
access for reading 11 colleagues’
bank accounts, to see
their salary and bonuses.
Failure to 6/8/14 A Plus Recruitment Failure to notify Prosecution
notify with Limited with ICO.
ICO
Failure to 5/8/14 1st Choice Failure to notify Prosecution
notify with Properties (SRAL) with ICO.
ICO
Spam calls 28/7/14 Reactiv Media Ltd Spam calls. Calls to £50,000
people who registered
with TPS.
Hack 23/7/14 Think W3 Limited Online travel services £150,000
Security company. Serious breach
Breach of DPA 1998. Thousands
of people’s data accessed
by hacker.

Company 15/7/14 Jayesh Shah Owner of a marketing Prosecution.
owner company (Vintels) Fined £4000,
Failure to prosecuted for failing to plus costs
notify ICO notify ICO of changes to and victim
of changes his notification. surcharge
to his
notification
Failure to 14/7/14 Hayden Nash Recruitment company. Prosecution
notify with Consultants Failure to notify
ICO with ICO.
Unlawful 10/7/14 Stephen Siddell Former branch manager Prosecution
access for Enterprise Rent-
Unlawful A-Car prosecuted for
sale unlawfully stealing the
records of almost two
thousand customers and
selling them to a claims
management company.
Failure to 9/7/14 Global Immigration Failure to notify Prosecution.
notify with Consultants Ltd with ICO.
ICO
Spam calls 16/6/14 DC Marketing Ltd Enforcement notice. Spam Enforcement
calls. notice
Director 6/6/14 Darren Anthony Director of pensions Prosecution.
Failure to Bott review company Guilty plea
notify with prosecuted for failure to
ICO notify with ICO.
Failure to 5/6/14 API Telecom Failure to comply with Prosecution
comply with information notice.
information
notice
Unlawful 29/5/14 Wolverhampton Investigation into a data Enforcement
disclosure City Council breach at the council that notice
occurred in January 2012,
when a social worker who
had not received data
protection training, sent a
report to a former service
user detailing their time
in care. The social worker
failed to remove highly
sensitive information
about the recipient’s sister
that should not have been
included.
Failure to 13/5/14 QR Lettings Failure to notify with Prosecution
notify with ICO.
ICO

Failure to 25/4/14 Allied Union Ltd Failure to notify with Prosecution
notify with ICO.
ICO
Spam calls 3/4/14 Unsolicited spam calls £50,000
to people who registered
with TPS.
Failure to 25/3/14 Help Direct UK Ltd Failure to notify with Prosecution
notify with ICO.
ICO
Security 19/3/14 Kent Police Highly sensitive and £100,000
Unlawful confidential information,
access including copies of police
interview tapes, were left
in a basement at the former
site of a police station.
Directors and 12/3/14 Boilershield Failure to notify with Prosecution.
company Limited ICO. Both fined
Failure to £1,200, plus
notify with costs and
ICO surcharge
Failure to 11/3/14 Becoming Green Failure to notify with Prosecution
notify with (UK) Ltd ICO.
ICO
Spam calls 10/3/14 Isisbyte Limited Enforcement notice, Spam Enforcement
calls. notice
Spam calls 10/3/14 SLM Connect Enforcement notice. Enforcement
Limited Spam calls. notice
Hack 7/3/14 British Pregnancy Hacker threatened £200,000
Security Advice Service to publish thousands
Breach (BPAS) of names of people
who sought advice on
abortion, pregnancy and
contraception.
Blagging 24/1/14 ICU Investigations Blagging. Six men who Prosecution
Ltd were part of a company
that tricked organisations
into revealing personal
details about customers
were sentenced
for conspiring to breach
the DPA.
Security 11/1/14 Department of A monetary penalty £185,000
Unlawful Justice Northern notice has been served
disclosure Ireland on Department of Justice
Northern Ireland after a
filing cabinet containing
details of a terrorist incident
was sold at auction.

Spam texts 16/12/13 First Financial Millions of spam texts. £175,000
(UK) Ltd
Security. 29/10/13 North East Loss of an unencrypted £80,000
Unlawful Lincolnshire memory device containing
disclosure Council personal data and special
No personal data relating to
encryption 286 children.
Security 22/10/13 Ministry of Justice Failure to keep personal £140,000
Unlawful data securely, after
disclosure spreadsheets showing
prisoners’ details were
emailed to members of
the public in error.
Data breach 2/1/13 Sony Hack breach £250,000
Unlawful 6/11/12 Lara Davies Bank employee unlawfully Court
access obtained access to bank conviction.
Customer statements of her partner’s Fined.
financial data ex-wife. Lost job.
Bank Court prosecution.
employee Pleaded guilty to 11 DPA
offences.
Spam 28/11/12 Christopher Niebel An ICO monetary penalty £300,000
and Gary McNeish, was issued. The company £140,000
joint owners of had sent millions of
Tetrus Telecoms unlawful Spam texts to
the public over the past
three years.
This case involved
the prosecution of
Directors.
Data Loss/ 22/11/12 Plymouth City An ICO monetary penalty £60,000
Data Breach Council issued for a serious
Unlawful breach of the seventh
disclosure data protection principle.
Sensitive A social worker sent
data part of a report relating
to family A, to family B
due to printing issues.
The photocopied report
contained confidential
and highly special
personal data relating to
the two parents and their
four children, including
of allegations of child
neglect in on-going care
proceedings.

Incorrect 6/11/12 Prudential An ICO monetary penalty £50,000
storage and issued after a mix-up
processing over the administration of
Potential loss two customers’ accounts
and damage led to tens of thousands
Financial of pounds, meant for an
institution individual’s retirement
fund, ending up in the
wrong account.
This was the first case
in relation to a fine for
incorrect storage and
processing.
Data Loss/ 25/10/12 Stoke-on-Trent City An ICO monetary penalty £120,000
Data Breach Council issued following a serious
Unlawful breach of the Data
disclosure Protection Act that led
Sensitive to sensitive information
data about a child protection
legal case being emailed
to the wrong person.
Data Loss/ 16/10/12 Greater Manchester An ICO monetary penalty £150,000
Data Breach Police issued after the theft of a
Police memory stick containing
special personal data
from an officer’s home.
The device, which had
no password protection,
contained details of more
than a thousand people
with links to serious crime
investigations.
Data Loss/ 10/10/12 Norwood An ICO monetary penalty £70,000
Data Breach Ravenswood Ltd issued after highly special
Charity information about the care
of four young children
was lost after being left
outside a London home.
This was a charity which
was fined.
Data Loss/ 11/9/12 Scottish Borders An ICO monetary penalty £250,000
Data Breach Council issued after former
employees’ pension
records were found in an
over-filled paper recycle
bank in a supermarket car
park.

Unlawful 6/8/12 Torbay Care Trust An ICO monetary penalty £175,000
disclosure issued after special
Special data personal information
relating to 1,373
employees was published
on the Trust’s website.
Unlawful 12/7/12 St George’s An ICO monetary penalty £60,000
disclosure Healthcare NHS issued after a vulnerable
Special data Trust individual’s special
medical details were sent
to the wrong address.
Data Loss/ 5/7/12 Welcome Financial An ICO monetary £150,000
Data Breach Services Ltd penalty issued following
a serious breach of the
Data Protection Act. The
breach led to the personal
data of more than half a
million customers being
lost.
Data Loss/ 19/6/12 Belfast Health and An ICO monetary £225,000
Data Breach Social Care Trust penalty issued following
Special data a serious breach of the
Data Protection Act. The
breach led to the special
personal data of thousands
of patients and staff being
compromised. The Trust
also failed to report the
incident to the ICO.
Unlawful 6/6/12 Telford & Wrekin An ICO monetary penalty £90,000
disclosure Council issued for two serious
Special data breaches of the seventh
data protection principle.
A social worker sent a
core assessment report
to the child’s sibling
instead of the mother.
The assessment contained
confidential and highly
special personal data.
Whilst investigating the
first incident, a second
incident was reported to
the ICO involving the
inappropriate disclosure
of foster carer names and
addresses to the children’s
mother. Both children had
to be re-homed.

Unlawful 1/6/12 Brighton and An ICO monetary £325,000
disclosure Sussex University penalty issued following
Special data Hospitals NHS the discovery of highly
Security Trust special personal data
belonging to tens of
thousands of patients
and staff – including
some relating to HIV and
Genito Urinary Medicine
patients – on hard drives
sold on an Internet
auction site in October
and November 2010.
Unlawful 21/5/12 Central London An ICO monetary penalty £90,000
disclosure Community issued for a serious
Special data Healthcare NHS contravention of the DPA,
Trust which occurred when
special personal data
was faxed to an incorrect
and unidentified number.
The contravention was
repeated on 45 occasions
over a number of weeks
and compromised 59 Data
Subjects’ personal data.
Data Loss/ 15/5/12 London Borough of An ICO monetary penalty £70,000
Data Breach Barnet issued following the loss
Special data of special information
relating to 15 vulnerable
children or young people,
during a burglary at an
employee’s home.
Unlawful 30/4/12 Aneurin Bevan An ICO monetary penalty £70,000
disclosure Health Board issued following an
Special data incident where a sensitive
report – containing
explicit details relating to
a patient’s health – was
sent to the wrong person.
Unlawful 14/3/12 Lancashire An ICO monetary penalty £70,000
disclosure Constabulary issued following the
Special data discovery of a missing
person’s report containing
special personal
information about a
missing 15 year old girl.

11.29 Enforcement and Penalties for Non-Compliance

Unlawful disclosure / Special data – 15/1/12 – Cheshire East Council – £80,000. An ICO monetary penalty issued after an email containing special personal information about an individual of concern to the police was distributed to 180 unintended recipients.
Data Loss/Data Breach / Special data – 13/2/12 – Croydon Council – £100,000. An ICO monetary penalty issued after a bag containing papers relating to the care of a child sex abuse victim was stolen from a London pub.
Unlawful disclosure / Special data – 13/2/12 – Norfolk County Council – £80,000. An ICO monetary penalty issued for disclosing information about allegations against a parent and the welfare of their child to the wrong recipient.
Unlawful disclosure / Special data – 30/1/12 – Midlothian Council – £140,000. An ICO monetary penalty issued for disclosing sensitive personal data relating to children and their carers to the wrong recipients on five separate occasions. The penalty is the first that the ICO has served against an organisation in Scotland.
Unlawful disclosure / Special data – 6/12/11 – Powys County Council – £130,000. An ICO monetary penalty issued for a serious breach of the Data Protection Act after the details of a child protection case were sent to the wrong recipient.
Unlawful disclosure / Special data – 28/11/11 – North Somerset Council – £60,000. An ICO monetary penalty issued for a serious breach of the Data Protection Act where a council employee sent five emails, two of which contained highly special and confidential information about a child's serious case review, to the wrong NHS employee.


Unlawful disclosure / Special data – 28/11/11 – Worcestershire County Council – £80,000. An ICO monetary penalty issued for an incident where a member of staff emailed highly special personal information about a large number of vulnerable people to 23 unintended recipients.
Unlawful disclosure / Special data – 9/6/11 – Surrey County Council – £120,000. An ICO monetary penalty issued for a serious breach of the Data Protection Act after special personal information was emailed to the wrong recipients on three separate occasions.
Unlawful disclosure / Special Data / Security – 10/5/11 – Andrew Jonathan Crossley, formerly trading as solicitors firm ACS Law – £1,000. An ICO monetary penalty issued for failing to keep special personal information relating to around 6,000 people secure.
Data Loss/Data Breach / Laptop Encryption – 8/2/11 – Ealing Council – £80,000. An ICO monetary penalty issued following the loss of an unencrypted laptop which contained personal information. Ealing Council breached the Data Protection Act by issuing an unencrypted laptop to a member of staff in breach of its own policies.
Data Loss/Data Breach / Laptop Encryption – 8/2/11 – Hounslow Council – £70,000. An ICO monetary penalty issued following the loss of an unencrypted laptop which contained personal information. Hounslow Council breached the Act by failing to have a written contract in place with Ealing Council. Hounslow Council also did not monitor Ealing Council's procedures for operating the service securely.


Conclusion
11.30 One observation on the above is that it is worrying how many problem issues arise with regard to health data, which amounts to special personal data.
It should also be noted that company directors (and other officers) can
be personally prosecuted and fined.
If a request for information or other notice is received from the ICO
it may be appropriate to seek immediate legal advice. The ICO can also
issue enforcement notices. In addition, the ICO may also issue monetary
penalty notices or fines. It is important to note the emphasis now under
the new GDPR regime on teams dealing with the nuanced respective
data protection (and security) issues; DPOs; assessing data protection
risks and dealing with them appropriately; and the related new rules.
Organisations should ensure that proper policies, awareness and ongoing training are in operation across all personnel who have an impact and responsibility regarding data protection operations. It appears that
fines, prosecutions and Data Subject damage and access litigation will
increase.

Chapter 12
Security of Personal Data

Introduction
12.01 ‘Personally identifiable information (PII) data has become
the prime target of hackers and cyber-criminals. It can be exploited in
many ways, from identity theft, spamming and phishing right through to
cyber-espionage.'1 The data protection regime sets information and data
security obligations on all Controllers, as well as Processors. These IT
and personal data security requirements must be complied with. While
data security risks have increased with the internet,2 data security risk
issues are not just limited to the organisation’s internet.
The growing number of data security breaches, including through
inadequate data security, as well as internet usage and social media,
cloud computing and online abuse, will all increase the attention on
security and data protection. Yahoo! has suffered a large number of data
breaches, with the data of 500 million users exposed in one incident, and
over a billion users in another. Uber was fined £385,000 for a data breach
incident. Sony was fined £250,000 in relation to a hacking data breach
incident. Facebook was fined £500,000 regarding abuses of social media
harvested data (Cambridge Analytica). Telco TalkTalk was hacked in a data breach incident. The MD of TalkTalk has said that 'cyber crimi-
nals are becoming increasingly sophisticated and attacks against com-
panies which do business online are becoming more frequent’. Retailers
have been hacked, as evidenced by the recent ICO fine of £500,000 on
DSG Retail. Even children’s toy manufacturer Vtech was hacked, which

1 Y Rozenberg, ‘Challenges in PII Data Protection,’ Computer Fraud & Security (2012)
5–9, at 5.
2 ‘Security of the Internet and the Known Unknowns,’ Communications of the ACM
(2012) (55) 35–37.


carries the added risk to child data as well as parent data. Some organisa-
tions may not have realised that they can be held responsible for errant
third party hack attacks where they have contributed to or enabled the
attack by failure to have adequate data security precautions in place.

Appropriate Security Measures


12.02 Organisations should be aware that data security and appropriate security measures extend beyond IT or technical data security. The £275,000 fine issued to Doorstep Dispensaree by the data regulator arose out of the storing of physical file records in an insecure manner.3
There are specific requirements with regard to the security measures
that need to be implemented under the Data Protection Act 2018 (DPA 2018). Organisations and Controllers must take appropriate security measures
against unauthorised access to, or unauthorised alteration, disclosure or
destruction of, the data, in particular where the processing involves the
transmission of data over a network, and against all other unlawful forms
of processing. As these risks grow,4 so too must the effort of the organi-
sation. It cannot be a case that one solution, on one occasion, will be
sufficient.
The requirements regarding security of processing previously con-
tained in the seventh Data Protection Principle (DPA 1998, Sch 1, Pt I,
para 7) are now expanded under the EU General Data Protection Regu-
lation (GDPR) (Principle 6 and Article 32). The Controller must take
appropriate technical and organisational measures against unauthorised
or unlawful processing of personal data and against accidental loss or
destruction of, or damage to personal data.
The Controller must ensure the following:
‘Taking into account the state of the art, the costs of implementation and the nature, scope, context and purposes of processing as well as the risk of varying likelihood and severity for the rights and freedoms of natural persons, the Controller and the Processor shall implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk.’

The new GDPR places significant emphasis on risk and risk assessment.
Organisations need to be compliant with these obligations.

3 ICO v Doorstep Dispensaree Limited [2019] 20 December 2019.


4 See, for example, ‘The Cybersecurity Risk,’ Communications of the ACM (2012) (55)
29–32.


Ensuring Appropriate Security Measures


12.03 What are security and ‘appropriate technical and organisational
measures’? What must the organisation do? In determining appropriate
technical and organisational measures, a Controller may have regard to:
●● the state of technological development and the cost of implementing
any measures;
●● the harm that might result from such unauthorised or unlawful
processing or accidental loss, destruction or damage; and
●● the nature of the data to be protected.
Article 32 sets out various additional factors.
Employees and Security
12.04 The Controller must take reasonable steps to ensure the
reliability of any employees of the organisation who have access to the
personal data.
Organisations should be aware that they could be held liable, whether
with the ICO or in a civil court, in relation to data breaches and data
security breaches occasioned by their employees. Organisations may
also begin to face claims in relation to online abuse caused, initiated
or participated in by their employees while using the organisation’s
network and devices. Indeed, this applies in relation to their other
servants and agents also.
There can be no full data security policy across the organisation unless
employees embrace the measures taken, and are appropriately trained in
data security issues.
Engaging Data Processors
12.05 Processors5 also have obligations in relation to processing per-
sonal data and security.
Where processing of personal data is carried out by a Processor on
behalf of a Controller, then the Controller must have a contract in writing
with the Processor, containing certain clauses and obligations.
The contracts with Processors must address the following:
●● processing must be carried out in pursuance of a contract in writing;
●● contract must provide that the Processor carries out the processing
only on and subject to the instructions of the Controller;
●● Processor must comply with the above security requirements.

5 Note generally, for example, R Morgan, ‘Data Controllers, Data Processors and
Data Sharers,’ SCL Computers and Law, 4 March 2011.


There are also certain other requirements. The Controller:


●● must ensure Processors provide sufficient guarantees in respect
of the technical security measures and organisational measures
governing the processing;
●● must take reasonable steps to ensure compliance with these measures.
There are specific requirements where Processors are engaged and relied
upon. Where Controllers engage Processors then the Controller must:
●● have a contract in writing or other equivalent form which provides
that the Processor will act only on instructions of the Controller
and that it will comply with the data security measures to which the
Controller is subject;
●● ensure that the Processor provides sufficient guarantees in respect
of the technical security measures and organisational measures it
implements;
●● take reasonable steps to ensure compliance with such matters.
What actions should an organisation take if third parties process their
organisation’s data? Organisations should consider:
●● ideally, that an organisation should have advance procedures and
policies in place to cater for such an eventuality, for example
ensuring that an appropriate senior manager/board member has an
assigned role and takes responsibility for data protection compliance, including dealing with breaches, and also having a documented IT
security incident handling procedure policy in place. Needless to
say, it must be properly policed, implemented, reviewed and updated
appropriately;
●● once a breach does occur, a proper procedure can assist in contain-
ing the breach and recovering from same; assessing the ongoing
risk; notifying the breach as appropriate; evaluating and reacting to the
breach and the risk;
●● all of the appropriate personnel within the organisation should be
aware of the procedures and consulted as appropriate, including for
example managing director, legal, IT, media/press officer;
●● a part of the procedure will have a designated list of external contact
points, some or all of whom may need to be notified and contacted,
as appropriate;
●● obviously the nature of organisations, and of breaches themselves,
differ and the IT security incident handling procedure policy will
need to be tailored. After all, the cause of a data security breach can
be any of a number of potential reasons;


●● also, such a policy while covering breaches of data protection and


privacy, may well encompass wider issues, such as outsourcing,
backups, disaster recovery, business continuity, etc;
●● if an organisation does business in other jurisdictions it may need
to take additional measures. For example, if it owns or has access
to personal information of Californian residents then it may have
a legal duty to notify those individuals if that information (such as
names and credit card details) has been accessed illegally.

Security under the EDPB


12.06 The influential EDPB (previously WP29) has previously pub-
lished a document relating to personal data and information security.6
It refers to the surveillance of employees' electronic communications.
In terms of compliance it raises a number of policy concerns for
organisations to consider when dealing with their policies and activities regarding such personal data. It suggests that organisations ask whether the proposed processing of employees' personal data is:
●● transparent?
●● necessary?
●● fair?
●● proportionate?
●● for what purpose?
It also suggests that organisations should consider if each proposed
processing activity can be achieved through some other less obtrusive
means.
It also refers to informing those outside of the organisation as
appropriate in relation to the proposed processing.
It also refers to issues of proportionality in suggesting that an organi-
sation in processing personal data should adopt the minimum process-
ing necessary, and pursuing a strategy of prevention versus detection.
Detection leans more towards blanket monitoring of employees, whereas
prevention is viewed as more appropriate and employee privacy friendly.
Of course, where some issue comes to the organisation’s attention, they
could then investigate that issue, as opposed to actively monitoring all
employees 24/7. An example would be using IT filtering systems, which
alert the IT manager that there may be an issue to look at further, rather

6 This was published on 29 May 2002.


than the IT manager actively looking at and monitoring all employee


communications.

Security under the GDPR


12.07 Security is particularly emphasised as being important under the
new data protection regime.
The new EU General Data Protection Regulation (GDPR) Recitals refer to security as follows: network and information security; Computer Emergency Response Teams (CERTs) and Computer Security Incident Response Teams (CSIRTs); guidance for appropriate measures; appropriate technical and organisational measures; security and risk evaluation; high risk and impact assessments; large scale processing operations; consultations; data breaches and data breach notification; and Commission delegated acts.
Chapter IV, Section 2 of the new GDPR relates to data security.
Security of Processing
12.08 Taking into account the state of the art, the costs of implemen-
tation and the nature, scope, context and purposes as well as the risk
of varying likelihood and severity for the rights and freedoms of natu-
ral persons, the Controller and the Processor shall implement appropri-
ate technical and organisational measures to ensure a level of security
appropriate to the risk, including inter alia, as appropriate:
●● the pseudonymisation and encryption of personal data;
●● the ability to ensure the ongoing confidentiality, integrity, availabil-
ity and resilience of processing systems and services;
●● the ability to restore the availability and access to personal data in a
timely manner in the event of a physical or technical incident;
●● a process for regularly testing, assessing and evaluating the effec-
tiveness of technical and organisational measures for ensuring the
security of the processing (Article 32(1)).
In assessing the appropriate level of security account shall be taken in
particular of the risks that are presented by data processing, in particular
from accidental or unlawful destruction, loss, alteration, unauthorised
disclosure of, or access to personal data transmitted, stored or otherwise
processed (Article 32(2)).
Adherence to an approved code of conduct pursuant to Article 40 or an
approved certification mechanism pursuant to Article 42 may be used as
an element by which to demonstrate compliance with the requirements
set out in Article 32(1) (Article 32(3)).


The Controller and Processor shall take steps to ensure that any person
acting under the authority of the Controller or the Processor who has
access to personal data shall not process them except on instructions
from the Controller, unless they are required to do so by EU or state law
(Article 32(4)).
Notification of a Data Breach to Supervisory Authority
12.09 In the case of a personal data breach, the Controller shall with-
out undue delay and, where feasible, not later than 72 hours after having
become aware of it, notify the personal data breach to the supervisory
authority, unless the personal data breach is unlikely to result in a risk
for the rights and freedoms of natural persons. Where the notification to
the supervisory authority is not made within 72 hours, it shall be accom-
panied by reasons for the delay (Article 33(1)).
The Processor shall notify the Controller without undue delay after
becoming aware of a personal data breach (Article 33(2)).
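The 72-hour window in Article 33(1) runs from the point the Controller becomes aware of the breach, which is a simple calculation to build into an incident-handling procedure. The sketch below is illustrative only and uses hypothetical timestamps:

```python
# Illustrative sketch: the Article 33(1) notification window runs for
# 72 hours from the point the Controller becomes aware of the breach.
from datetime import datetime, timedelta, timezone

became_aware = datetime(2020, 3, 2, 14, 30, tzinfo=timezone.utc)  # hypothetical
deadline = became_aware + timedelta(hours=72)

# A notification sent after the deadline must be accompanied by reasons
# for the delay (Article 33(1)).
overdue = datetime(2020, 3, 5, 15, 0, tzinfo=timezone.utc) > deadline
```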
Contents of Notification
12.10 The notification must at least:
●● describe the nature of the personal data breach including where
possible, the categories and approximate number of Data Subjects
concerned and the categories and approximate number of data
records concerned;
●● communicate the name and contact details of the DPO or other
contact point where more information can be obtained;
●● describe the likely consequences of the personal data breach;
●● describe the measures taken or proposed to be taken by the
Controller to address the personal data breach, including, where
appropriate, measures to mitigate its possible adverse effects
(Article 33(3)).
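For internal record-keeping purposes, these minimum contents might be captured in a simple structure. The sketch below is illustrative only; the field names are not prescribed by the GDPR and the example values are hypothetical:

```python
# Illustrative sketch: an internal record mirroring the minimum contents
# of a breach notification under Article 33. Field names are illustrative.
from dataclasses import dataclass

@dataclass
class BreachNotification:
    nature: str                 # nature of the personal data breach
    subject_categories: list    # categories of Data Subjects concerned
    approx_subjects: int        # approximate number of Data Subjects
    record_categories: list     # categories of data records concerned
    approx_records: int         # approximate number of data records
    contact_point: str          # DPO or other contact point
    likely_consequences: str    # likely consequences of the breach
    measures: str               # measures taken or proposed, incl. mitigation

# Hypothetical example values.
notification = BreachNotification(
    nature="Misdirected email",
    subject_categories=["patients"],
    approx_subjects=59,
    record_categories=["health records"],
    approx_records=59,
    contact_point="dpo@example.org",
    likely_consequences="Disclosure of special category data",
    measures="Recall requested; recipients asked to delete",
)
```

Keeping such a record also assists with the separate Article 33(5) duty to document breaches so that the supervisory authority can verify compliance.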
Where, and in so far as, it is not possible to provide the information at
the same time, the information may be provided in phases without undue
further delay (Article 33(4)).
The Controller shall document any personal data breaches, comprising
the facts relating to the personal data breach, its effects and the remedial
action taken. This documentation must enable the supervisory authority
to verify compliance with this Article (Article 33(5)).
Communication of a Data Breach to Data Subject
12.11 When the personal data breach is likely to result in a high risk to
the rights and freedoms of individuals the Controller shall communicate


the personal data breach to the Data Subject without undue delay
(Article 34(1)).7
The communication to the Data Subject shall describe in clear and
plain language the nature of the personal data breach and contain at least
the information and recommendations provided for in Article 33(3)(b), (c) and (d) (Article 34(2)).
The communication to the Data Subject referred to in Article 34(1) shall not be required if:
●● the Controller has implemented appropriate technical and organisa-
tional protection measures, and that those measures were applied to
the personal data affected by the personal data breach, in particular
those that render the data unintelligible to any person who is not
authorised to access it, such as encryption; or
●● the Controller has taken subsequent measures which ensure that the
high risk to the rights and freedoms of Data Subjects referred to in Article 34(1) is no longer likely to materialise; or
●● it would involve disproportionate effort. In such a case, there shall
instead be a public communication or similar measure whereby
the Data Subjects are informed in an equally effective manner
(Article 34(3)).
If the Controller has not already communicated the personal data breach
to the Data Subject, the supervisory authority, having considered the
likelihood of the breach to result in a high risk, may require it to do so
or may decide that any of the conditions referred to in Article 34(3) are
met (Article 34(4)).
Data Protection Impact Assessment and Prior Consultation
12.12 Chapter IV, Section 3 of the GDPR refers to Impact Assess-
ments and Prior Consultations.
Data Protection Impact Assessment
12.13 Where a type of processing in particular using new technolo-
gies, and taking into account the nature, scope, context and purposes
of the processing, is likely to result in a high risk for the rights and
freedoms of natural persons, the Controller shall, prior to the process-
ing, carry out an assessment of the impact of the envisaged processing

7 Note, for example, P Wainman, ‘Data Protection Breaches: Today and Tomorrow,’
SCL Computers and Law, 30 June 2012. Also see M Dekker, C Karsberg and B Daskala, Cyber Incident Reporting in the EU (2012).


operations on the protection of personal data. A single assessment may


address a set of similar processing operations that present similar high
risks (Article 35(1)).
The Controller shall seek the advice of the DPO, where designated,
when carrying out a data protection impact assessment (Article 35(2)).
A data protection impact assessment shall in particular be required in
the following cases:
●● a systematic and extensive evaluation of personal aspects relating
to natural persons which is based on automated processing, includ-
ing profiling, and on which decisions are based that produce legal
effects concerning the natural person or similarly significantly affect
the individual;
●● processing on a large scale of special categories of data referred to in
Article 9(1), or of personal data relating to criminal convictions and
offences referred to in Article 10; or
●● a systematic monitoring of a publicly accessible area on a large scale
(Article 35(3)).
The supervisory authority shall establish and make public a list of the
kind of processing operations which are subject to the requirement
for a data protection impact assessment pursuant to Article 35(1).
The supervisory authority shall communicate those lists to the EDPB
(Article 35(4)).
The supervisory authority may also establish and make public a list of the kind of processing operations for which no data protection impact assessment is required. The supervisory authority shall communicate those lists to the EDPB (Article 35(5)).
Prior to the adoption of the lists referred to above, the competent
supervisory authority shall apply the consistency mechanism referred
to in Article 63 where such lists involve processing activities which
are related to the offering of goods or services to Data Subjects
or to the monitoring of their behaviour in several states, or may
substantially affect the free movement of personal data within the EU
(Article 35(6)).
The EDPB has also issued the following document:
●● Recommendation 01/2019 on the draft list of the European Data Protection Supervisor regarding the processing operations subject to the requirement of a data protection impact assessment (Article 39(4) of Regulation (EU) 2018/1725).


Contents of Data Protection Impact Assessment


12.14 The assessment shall contain at least:
●● a systematic description of the envisaged processing operations
and the purposes of the processing, including where applicable the
legitimate interest pursued by the Controller;
●● an assessment of the necessity and proportionality of the processing
operations in relation to the purposes;
●● an assessment of the risks to the rights and freedoms of Data
Subjects;
●● the measures envisaged to address the risks, including safeguards,
security measures and mechanisms to ensure the protection of per-
sonal data and to demonstrate compliance with the GDPR taking
into account the rights and legitimate interests of Data Subjects and
other persons concerned (Article 35(7)).
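These minimum contents lend themselves to a simple internal completeness check before a DPIA is signed off. The following is an illustrative sketch only; the section keys are not prescribed by the GDPR:

```python
# Illustrative sketch: a minimal DPIA skeleton reflecting the Article 35(7)
# minimum contents. Keys are illustrative only, not prescribed by the GDPR.
dpia = {
    "systematic_description": "",     # envisaged operations and purposes,
                                      # including any legitimate interest pursued
    "necessity_proportionality": "",  # assessment relative to the purposes
    "risk_assessment": "",            # risks to Data Subjects' rights and freedoms
    "mitigation_measures": "",        # safeguards, security measures, mechanisms
}

def dpia_complete(d: dict) -> bool:
    """Return True only when every required section has been filled in."""
    return all(d.get(k, "").strip() for k in (
        "systematic_description",
        "necessity_proportionality",
        "risk_assessment",
        "mitigation_measures",
    ))
```

A check of this kind only confirms that each required section is present; the adequacy of the content remains a matter of legal and organisational judgment.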
Compliance with approved codes of conduct referred to in Article 40 by
the relevant Controllers or Processors shall be taken into due account
in assessing the impact of the processing operations performed by such
Controllers or Processors, in particular for the purposes of a data protec-
tion impact assessment (Article 35(8)).
Where appropriate, the Controller shall seek the views of Data
Subjects or their representatives on the intended processing, without
prejudice to the protection of commercial or public interests or the
security of the processing operations (Article 35(9)).
Where the processing pursuant to Article 6(1)(c) or (e) has a legal
basis in EU law, or the law of the state to which the Controller is subject,
and such law regulates the specific processing operation or set of opera-
tions in question, and a data protection impact assessment has already
been carried out as part of a general impact assessment in the context
of the adoption of that legal basis, Article 35(1) to (7) shall not apply,
unless states deem it necessary to carry out such assessment prior to the
processing activities (Article 35(10)).
Where necessary, the Controller shall carry out a review to assess if
the processing of personal data is performed in compliance with the data
protection impact assessment at least when there is a change of the risk
represented by the processing operations (Article 35(11)).
Prior Consultation
12.15 The Controller shall consult the supervisory authority prior
to the processing of personal data where a data protection impact
assessment indicates that the processing would result in a high risk in
the absence of measures taken by the Controller to mitigate the risk
(Article 36(1)).


Where the supervisory authority is of the opinion that the intended


processing referred to in Article 36(1) would infringe the GDPR, in par-
ticular where the Controller has insufficiently identified or mitigated the
risk, the supervisory authority shall, within a period of up to eight weeks
of receipt of the request for consultation give written advice to the Con-
troller, and where applicable the Processor in writing, and may use any
of its powers referred to in Article 58. This period may be extended for
a further six weeks, taking into account the complexity of the intended
processing (Article 36(2)).
When consulting the supervisory authority pursuant to Article 36(1),
the Controller shall provide the supervisory authority with:
●● where applicable, the respective responsibilities of the Controller,
joint Controllers and Processors involved in the processing, in
particular for processing within a group of undertakings;
●● the purposes and means of the intended processing;
●● the measures and safeguards provided to protect the rights and
freedoms of Data Subjects pursuant to the GDPR;
●● where applicable, the contact details of the DPO;
●● the data protection impact assessment provided for; and
●● any other information requested by the supervisory authority
(Article 36(3)).
States shall consult the supervisory authority during the preparation of a
proposal for a legislative measure to be adopted by a national parliament,
or of a regulatory measure based on such a legislative measure, which
relates to processing (Article 36(4)).
Notwithstanding Article 36(1), states’ law may require Controllers to
consult with, and obtain prior authorisation from, the supervisory authority
in relation to the processing by a Controller for the performance of a task
carried out by the Controller in the public interest, including the processing
of such data in relation to social protection and public health (Article 36(5)).

Organisational Security Awareness


12.16 There is a need to ensure security awareness within the organi-
sation. Controllers and Processors must take all reasonable steps to
ensure that persons employed by them and other persons at the place
of work, are aware of and comply with the relevant security measures
(note DPA 1998, Sch 1, Pt II, para 10).
Identifying and Controlling Organisational IT Security
12.17 Who needs to be in charge of IT and data security in an organi-
sation? What organisational security measures should be in place?


The DPA does not require that a named individual be appointed with
responsibility for security compliance. A titled officer is highly recom-
mended and under the GDPR a formal DPO is now a requirement. There
are specific organisational responsibilities, regarding security, which are
set out in the DPA and these include a specific requirement that all staff
are aware of and comply with the security standards set out in the data
protection regime.
These security requirements apply where an organisation collects and
processes any personal data. Typically personal data would include infor-
mation which identifies living individuals such as employees or custom-
ers. Accordingly, for most organisations, data protection compliance is
likely to be an issue that crosses various aspects of the business such as
human resources, IT and customer support. It is not necessarily just an
IT function. An organisational decision as to who should be responsible
for IT and data security should be made with this in mind.
The security standards set out in the Data Protection Act can be
summarised as obliging the organisation to take ‘appropriate’ security
measures to guard against unauthorised access, alteration, disclosure or
destruction of any personal data.
In determining what is ‘appropriate’, the organisation can have regard
to the state of technological development and the cost of implementing
the security measures. However, the organisation is obliged to ensure
that the measures it decides to adopt provide a level of security appropri-
ate to the harm that might result from a security compromise given the
nature of the data concerned. For example, a hospital processing sensi-
tive health data would be expected to adopt a particularly high security
standard while a corner shop processing personal data for a paper round
might be subject to a less onerous standard.
In light of some high profile cases of data theft or loss, readers should
note also that the ICO’s office has taken the view that an appropriate
level of security for laptop computers used in the financial services and
sensitive health industries requires the use of encryption in respect of the
data stored on the hard drive (over and above the use of user name and
password log-in requirements).
In adopting security measures within an organisation, it should be
noted that the legal standard governing personal data is over and above
any other legal obligations of confidentiality which could be owed to
third parties at common law or under a contract which contains confi-
dentiality provisions.
Appraising Employees
12.18 What guidance should be given to an organisation’s employees
regarding IT and data security? An organisation might consider these
issues.


The overriding principle here is that, for it to be effective, an employee
must have clear notice of all aspects of the IT and data security policy.
The following might be included.
General Policy: Seek to reserve the right to appropriately monitor use
of the computer and telephone system, including email. Disclosure of
special or confidential information about the company or personal data
about any individual without authorisation should be forbidden.
Email: Employees should be made aware that email is an essential
business tool but not without its risks. Business communications should
be suitable and not too casual or contain inappropriate material. Personal
use of email must be reasonable.
Internet Access: Downloading of any software without approval
should be forbidden. No improper or illegal activities (hacking, etc)
should be tolerated. Access to certain types of websites (abuse, adult,
criminal, hate, violent, etc) should be expressly restricted. Origination or
dissemination of inappropriate material should be forbidden.
Mobile Telephones and Devices: Downloading of any software with-
out approval should be forbidden. No improper or illegal activities (hack-
ing, etc) should be tolerated. Access to certain types of websites (adult,
criminal, hate, violent, etc) should be expressly restricted. Origination or
dissemination of inappropriate material should be forbidden.
Vehicles: Increasingly tracking and other data from vehicles can be
related to identified employees. It is important for organisations to con-
sider these personal data issues first.
Internet Social Media: Downloading of any software without approval
should be forbidden. No improper or illegal activities (hacking, etc)
should be tolerated. Access to certain types of websites (adult, criminal,
hate, violent, etc) should be expressly restricted. Origination or dissemi-
nation of inappropriate material should be forbidden, eg abuse.
Software Installation & Management: Purchase and/or installation
of software not approved in advance should be forbidden. All software
must be required to be tested before installation on the system to ensure
that it is free from viruses, malware, etc.
Password Security: Clear guidelines as to the use of passwords should
be given including the strength of password required, the importance of
keeping one’s password confidential, the changing of passwords at regu-
lar intervals and (in the light of recent events) the application of password
security to laptops, mobile phones and PDAs. At least, in some busi-
nesses, stronger encryption may be appropriate and employees should
be required to comply with procedures in this regard. Carrying company
data on vulnerable media such as memory sticks should be discouraged.
Connecting Hardware: Employees should be required not to con-
nect any hardware (memory sticks, hard drives, etc) without following
specified procedures and obtaining appropriate authorisation.


Remote Access: For many businesses, remote access, including web-based
remote access, is a vital tool. However, policies should deal with
such matters as accessing remotely from insecure locations such as internet
cafés, and emphasise the importance of logging off effectively.

Organisational Security Measures


12.19 What measures should an organisation take? Some suggestions
may include:
●● ensure that the organisation has the appropriate security equipment
and software to protect the systems if granting third party access. If
providing access to a third party involves the business opening its
systems up to a public network such as the Internet, it is essential
that suitable security measures are taken;
●● obtain a complete listing of the IT provider personnel who have
access to the systems, detail their level and status (full time/contractor),
and ensure that their employment contracts incorporate terms
similar to those above.
Third parties should be granted access to the systems only when:
●● the relevant individuals in the organisation (eg a business manager, in
conjunction with IT staff) have agreed that there is a business need
for such access to be granted. Access should only be for a temporary
period and renewed at regular intervals;
●● third party computer accounts should be monitored (subject to legal
agreements between the organisation and the third party and in
accordance with legal requirements) and disabled as soon as access
is no longer required;
●● ensure that sensitive files are segregated in secure areas/computer
systems and are available only to qualified persons;
●● ensure that the organisation has inventoried the various types of data
being stored and classified it according to how important it is and
how costly it would be for the company if it were lost or stolen;
●● scan computers (especially servers) for unauthorised programs that
transmit information to third parties such as hackers;
●● programs that have a hidden purpose are known as Trojans.
Individuals may have inadvertently or deliberately downloaded free
programs from the Internet that send back information such as pass-
words to hackers or Internet usage information to marketers to build
up trends of people’s tastes;
●● ensure that third parties only have access to information and
computer settings on a ‘need-to-know’ basis;


●● the System Administrator for the computer systems should set up
computer accounts for third parties so that these third parties do not
have access to all information on the systems or have the permission
to install non-standard software and change computer settings;
●● when engaging an external business to destroy records or electronic
media, ensure references are checked. Draft a contract setting out the
terms of the relationship. Ensure that destruction is done on-site and
require that a certificate of destruction be issued upon completion.
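The 'need-to-know' and temporary-account points above can be illustrated with a minimal sketch. The account, role and resource names below are hypothetical illustrations, not drawn from any standard or from the text.

```python
# Minimal sketch of "need-to-know" access checks for third-party accounts:
# access is scoped to listed resources and lapses unless renewed.
from datetime import date

class ThirdPartyAccount:
    def __init__(self, name, allowed_resources, expires):
        self.name = name
        self.allowed_resources = set(allowed_resources)  # need-to-know list
        self.expires = expires                           # temporary access only

    def can_access(self, resource, today=None):
        today = today or date.today()
        if today > self.expires:          # access lapses unless renewed
            return False
        return resource in self.allowed_resources

vendor = ThirdPartyAccount("payroll-vendor", {"payroll-db"}, date(2025, 6, 30))
print(vendor.can_access("payroll-db", today=date(2025, 1, 15)))  # True
print(vendor.can_access("hr-files", today=date(2025, 1, 15)))    # False
print(vendor.can_access("payroll-db", today=date(2025, 7, 1)))   # False (expired)
```

In practice the equivalent controls live in the operating system or directory service; the point of the sketch is only that scope and expiry are checked on every access.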
Breach Laws to Consider
12.20 What laws might affect an organisation if it has an IT or data
security breach? Suggested legal issues and laws to consider include:
●● DPA 2018;
●● further UK data protection legislation in order to enact a UK-GDPR;
●● GDPR;
●● ePrivacy regulation;
●● Electronic Commerce (EC) Regulations 2002;
●● Criminal Damage Act 1971;
●● evidence;
●● Fraud Act 2006;
●● Theft Act 1968;
●● civil liability law;
●● Protection of Children Act 1978;
●● Criminal Justice and Public Order Act 1994;
●● European Convention on Human Rights;
●● PECR/PECR Amendment.
Other issues to consider include:
●● official guidance and current stated positions of regulators and
guidance bodies;
●● international standards organisations eg technical security standards,
etc.
Third Party Security Providers
12.21 What actions should an organisation take if a third party is
responsible for IT security or has access to its IT system? Some issues
to consider include:
●● this is becoming increasingly popular in practice through outsourc-
ing arrangements and vendor agreements;
●● it is prudent to understand the security framework and document
stakeholder exposure throughout it;


●● measures to take with the organisation’s IT provider;


●● ensure that one engages lawyers to draft appropriate contracts that
secure what the IT provider is securing;
●● ensure that data protection and confidentiality clauses are included
to protect all of your intellectual property assets;
●● have the IT provider supply a complete security document outlining
the hardware, topology, software and methodologies deployed;
●● ensure that the IT provider has staff participate in regular training
programs to keep abreast of technical and legal issues;
●● ensure that the IT provider develops a security breach response
plan in the event that your company experiences a data breach; also
develop security guidelines for laptops and other portable comput-
ing devices when transported off-site;
●● have the IT provider ensure that all employees follow strict
password and virus protection procedures, and develop a mechanism
whereby employees are required to change passwords often, using safe
methods, especially for terminating employees;
●● ensure that the IT provider has a records retention/disposal sched-
ule for personally identifiable information, whether stored in paper,
micrographic or magnetic/electronic (computer) media.

Raising Awareness
12.22 An EU Council Resolution8 indicates the recognition and
official concern in relation to the loss of personal data, and the fact
that technology has changed significantly since the EU Data Protection
Directive 1995 (DPD 1995).
It refers to the dangers of information loss, attacks on organisations,
and issues of network and information security for organisations and
individuals.
It indicated that EU states should engage in information campaigns
addressing the issues of security and personal data.
It recognises the need to promote best practice, such as the
internationally recognised standards for IT and security set by international
standards-setting organisations, eg ISO 15408, ISO 27001, PCI DSS,
ISO 31000, BS 10012:2009, etc.
It also recognises the increase in eGovernment and the need to
promote secure eGovernment.
It also notes that with the increase of attacks on the security of personal
data through technology, some of the solutions may also come from
new technologies. Some of these are referred to as privacy enhancing
technologies, or PETs.9 (See PbD and DPbD, Part 4.)

8 EU Council Resolution 2002/C43/02 of 28 January 2002.
In addition it also refers to the increase of mobile and wireless com-
munications technologies and the need for introducing and enhancing
wireless security.
Note also the network and information systems rules applicable to
certain digital service providers.10 Separate rules also relate to electronic
identification and trust services.11
Keeping on top of the increasing threats to information security is an
ongoing task. The threats are ever changing, and so equally should be the
effort to prevent breaches, and to deal with them if and when they occur.
Some of the most current information security threats are highlighted
annually by various IT security vendors as well as police and investiga-
tive officials. Many security risk reports are published annually.

ICO Guidance
12.23 The ICO has also issued a guide entitled A Practical Guide to IT
Security, Ideal for Small Businesses12 which is useful. The ICO has also
commented in relation to encryption issues.13
The ICO also provides the following tips and suggestions14 for main-
taining and implementing IT security for personal data, namely:
For Computer Security
●● install a firewall and virus-checking on the computers;
●● make sure that the operating system is set up to receive automatic
updates;
●● protect the computer by downloading the latest patches or security
updates, which should cover vulnerabilities;
●● only allow staff access to the information they need to do their job
and do not let them share passwords;

9 See, for example, B Beric and G Carlisle, ‘Investigating the Legal Protection of Data,
Information and Knowledge Under the EU Data Protection Regime,’ International
Review of Law, Computers & Technology (2009) (23) 189–201.
10 See Network and Information Systems Regulations 2018 (UK), as amended by the
Network and Information Systems (Amendment) Regulations 2018 (UK), and also the
European Commission Regulation 2018/151.
11 See Regulation (EU) 910/2014 on electronic identification and trust services for
electronic transactions in the internal market.
12 At https://ico.org.uk.
13 ICO, Our Approach to Encryption, at https://ico.org.uk.
14 ICO, Security Measures, at https://ico.org.uk.


●● encrypt any personal information held electronically that would
cause damage or distress if it were lost or stolen;
●● take regular back-ups of the information on your computer system
and keep them in a separate place so that if one loses computers, one
does not lose the information;
●● securely remove all personal information before disposing of old
computers (by using technology or destroying the hard disk);
●● consider installing an anti-spyware tool. Spyware can secretly
monitor one's computer activities. Spyware can be unwittingly
installed within other file and program downloads, and its use
is often malicious. It can capture passwords, banking credentials
and credit card details, and then relay them back to fraudsters.
Anti-spyware helps to monitor and protect the computer from
spyware threats, and it is often free to use and update.
For Using Emails Securely
●● consider whether the content of the email should be encrypted or
password protected. The IT or security team should be able to assist
with encryption;
●● when starting to type in the name of the recipient, some email soft-
ware will suggest similar addresses one has used before. If one has
previously emailed several people whose name or address starts the
same way – eg ‘Dave’ – the auto-complete function may bring up
several ‘Daves’. Make sure to choose the right address before clicking
send;
●● if one wants to send an email to a recipient without revealing their
address to other recipients, make sure to use blind carbon copy
(bcc), not carbon copy (cc). When one uses cc every recipient of the
message will be able to see the address it was sent to;
●● be careful when using a group email address. Check who is in
the group and make sure you really want to send your message to
everyone;
●● if sending a sensitive email from a secure server to an insecure
recipient, security will be threatened. One may need to check that
the recipient’s arrangements are secure enough before sending the
message.
For Using Faxes Securely
●● consider whether sending the information by a means other than fax
is more appropriate, such as using a courier service or secure email.
Make sure to only send the information that is required;
●● make sure to double check the fax number you are using. It is best to
dial from a directory of previously verified numbers;


●● check that you are sending a fax to a recipient with adequate security
measures in place. For example, the fax should not be left uncol-
lected in an open plan office;
●● if the fax is sensitive, ask the recipient to confirm that they are at the
fax machine, they are ready to receive the document, and there is
sufficient paper in the machine;
●● ring up or email to make sure the whole document has been received
safely;
●● use a cover sheet. This will let anyone know who the information is
for and whether it is confidential or sensitive, without them having
to look at the contents.
For Other Security
●● shred all your confidential paper waste;
●● check the physical security of your premises;
●● training staff:
○○ so they know what is expected of them;
○○ to be wary of people who may try to trick them into giving out
personal details;
○○ so that they know they can be prosecuted if they deliberately
give out personal details without permission;
○○ to use a strong password – these are long (at least seven
characters) and have a combination of upper and lower case
letters, numbers and the special keyboard characters like the
asterisk or currency symbols;
○○ not to send offensive emails about other people, their private
lives or anything else that could bring your organisation into
disrepute;
○○ not to believe emails that appear to come from your bank that
ask for your account, credit card details or your password
(a bank would never ask for this information in this way);
○○ not to open Spam – not even to unsubscribe or ask for no more
mailings. Tell them to delete the email and either get Spam
filters on your computers or use an email provider that offers
this service.15
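The strong-password rule in the training list (at least seven characters, mixed case, numbers and special characters) can be expressed as a simple check. This is an illustrative sketch of that rule of thumb only, not an official ICO validation routine, and `string.punctuation` covers ASCII symbols rather than every currency symbol.

```python
import string

def is_strong_password(pw: str) -> bool:
    """Check a password against the rule of thumb above: at least seven
    characters, with upper case, lower case, digits and special characters."""
    return (
        len(pw) >= 7
        and any(c.isupper() for c in pw)
        and any(c.islower() for c in pw)
        and any(c.isdigit() for c in pw)
        and any(c in string.punctuation for c in pw)  # ASCII symbols only
    )

print(is_strong_password("Tr5*dl9z"))  # True - meets all four requirements
print(is_strong_password("password"))  # False - no upper case, digit or symbol
```

Length and variety checks of this kind are a floor, not a ceiling; they do not replace the other password-handling measures (confidentiality, regular change, encryption of devices) discussed earlier in this chapter.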

ICO and Security Breaches


12.24 The ICO advises that under the Regulations (PECR, as updated),
public electronic communications service providers are required to
notify the ICO if a personal data breach occurs. (Note that the ePrivacy
Regulation portends further rules, updates and/or replacement.)

15 ICO, Security Measures, at https://ico.org.uk.
A personal data breach is defined to mean:
‘a breach of security leading to the accidental or unlawful destruction, loss,
alteration, unauthorised disclosure of, or access to, personal data transmitted,
stored or otherwise processed in connection with the provision of a public
electronic communications service.’

The ICO advises16 that the following must be complied with:


Keep a log of personal data breaches
You must keep a record of all personal data breaches in an inventory or log.
It must contain:
●● the facts surrounding the breach;
●● the effects of that breach; and
●● remedial action taken.
We have produced a template log to help you record the information you need.
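The three required elements of the log (facts, effects, remedial action) map naturally onto a simple structured record. The sketch below is illustrative only; the field names follow the list above and are not a prescribed ICO template, and the breach details are invented.

```python
# Minimal breach-log sketch holding the three elements such a log must
# record: the facts surrounding the breach, its effects, and remedial action.
import csv, io
from dataclasses import dataclass, asdict

@dataclass
class BreachRecord:
    date: str
    facts: str            # the facts surrounding the breach
    effects: str          # the effects of that breach
    remedial_action: str  # remedial action taken

log = [
    BreachRecord("2020-03-01",
                 "Laptop containing customer file lost in transit",
                 "Approx 200 customer records potentially exposed",
                 "Device remotely wiped; affected customers notified"),
]

# Export the log, eg for a periodic submission to the regulator.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["date", "facts", "effects", "remedial_action"])
writer.writeheader()
for rec in log:
    writer.writerow(asdict(rec))
print(buf.getvalue())
```

Keeping the log in a structured, exportable form makes the suggested monthly submission to the ICO a matter of exporting the same records rather than writing them up twice.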
Notify breaches to the ICO
You must notify the ICO of any personal data breaches. This notification must
include at least a description of,
●● the nature of the breach;
●● the consequences of the breach; and
●● the measures taken or proposed to be taken by the provider to address
the breach.
To make this process easier, we suggest that you send your log to us on a
monthly basis. This means one will not have to record the information twice
and will meet the organisation’s requirement to notify any security breaches
without unnecessary delay.
However, if the breach is of a particularly serious nature you need to notify us
about the breach as soon as possible, by filling in the security breach notifica-
tion form (PECR).
When thinking about whether a breach is of a serious nature, we recommend
that one considers,
●● the type and sensitivity of the data involved;
●● the impact it could have on the individual, such as distress or embarrass-
ment; and
●● the potential harm, such as financial loss, fraud, theft of identity.
Please email all security breach notifications to us at
[email protected].

16 ICO, Security Breaches, at https://ico.org.uk.


Failure to comply with the requirement to submit breach notifications can
incur a … fine.
For more practical information on how to notify us about PECR security
breaches please see our guidance for service providers.
Notify breaches to your subscribers
You may also need to tell your subscribers. If the breach is likely to adversely
affect their personal data or privacy you need to, without unnecessary delay,
notify them of the breach. You need to tell them,
●● the nature of the breach;
●● contact details for your organisation where they can get more informa-
tion; and
●● how they can mitigate any possible adverse impact of the breach.
You do not need to tell your subscribers about a breach if you can demonstrate
that you have measures in place which would render the data unintelligible
and that those measures were applied to the data concerned in the breach.
If you don’t tell subscribers, the ICO can require you to do so, if it considers
the breach is likely to have an adverse effect on them.17

Disposal of Computer Hardware


12.25 Particular care is needed when considering the disposal of IT
hardware, equipment and software. They may still contain personal data
files. It may also include commercial or other confidential information
relating to the organisation. This can continue to be the case even when
it appears that files have been wiped or deleted. It is always advised to
take professional legal, IT and/or forensic advice.
Disposal can occur from the organisation. However, organisations
need to be aware that there can be a security breach as well a breach of
the data protection regime if devices and hardware are donated to charity
or otherwise given away but there remains corporate and personal data
on the devices. Even when it appears data may be deleted, it can some-
times be recovered with the right skills. There have also been instances
where equipment is meant to be decommissioned and disposed of but
employees or contractors do not fulfil the job, and the devices get into
third party hands with data still on them.
This can raise other issues. If an employee leaves, they will be leav-
ing their company laptop, etc behind. This can be reassigned to another
employee. However, is the new employee appropriately designated to

17 ICO, Security Measures, at https://ico.org.uk.

175
12.25 Security of Personal Data

access to the level and importance of the data on the laptop? Hierarchies
of access control need to be considered.
Particular sensitivity also arises if it is intended to recycle or give to
charity particular devices. Real consideration should be given to the per-
sonal data (and other data) thereon in deciding if and how to so dispose
of such devices.
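Because simple deletion can leave recoverable data, as this section notes, disposal procedures sometimes overwrite file contents before removal. The sketch below illustrates that idea only: on SSDs, journaling filesystems and wear-levelled media an overwrite is not guaranteed to destroy every copy, so certified destruction or full-disk encryption remains the safer route, and the file name used is invented.

```python
import os

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Illustrative only: overwrite a file's bytes with random data before
    unlinking it. Not a guarantee of destruction on SSDs, journaling
    filesystems, or media with wear-levelling."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))  # replace contents in place
            f.flush()
            os.fsync(f.fileno())       # push the overwrite to disk
    os.remove(path)

# Example: create a throwaway record file, then dispose of it.
with open("old_record.tmp", "wb") as f:
    f.write(b"name: J Smith, payroll no 1234")
overwrite_and_delete("old_record.tmp")
print(os.path.exists("old_record.tmp"))  # False
```

The same logic should sit inside a documented decommissioning procedure with a completion check, precisely because the section notes cases where disposal jobs were simply never carried out.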

Conclusion
12.26 Security is a legal data protection compliance requirement, as
well as best business practice. This is one of the more prominent areas
of compliance where time never stands still. The security risks and
needs must constantly be appraised and updated. It is also essential to
ensure that outsourced data processing activities are also undertaken
in an appropriately secure manner. Increasingly the area of appraisals
and procedures surrounding security breaches and data loss instances
is regulated. If such an event arises, the ICO as well as individual Data
Subjects may have to be informed. Liability issues should also be a con-
stant concern for organisations as regards security and risk. Maintaining
data security is an ongoing task. What may be secure and data protection
compliant last month, may not be so next month. Constant vigilance is
required.

Chapter 13
Outsourcing and Data Processors

Introduction
13.01 While many organisations may feel they do not engage third
parties to deal with their personal data, processes and databases, closer
inspection often indicates that this is not correct. Many organisations,
and across all sectors of activity, engage third parties or outsource certain
of their internal processing activities.1
One example is where an organisation may find it more convenient
to outsource its payroll functions to an organisation specialising in such
activities. It is necessary, therefore, that the employee personal data, or
certain of it, is transferred to the third party organisation for processing.
This third party is a data Processor acting for the organisation. A contract
must be in place and appropriate security standards implemented.
Sometimes organisations outsource other activities, such as ­marketing,
recruitment, employment of consultants or agents or part time employees.
Organisations increasingly outsource market research and customer
satisfaction surveys to third parties. These are all Processors where
personal data is involved.
If an organisation must transfer or export personal data outside of the
UK and lawfully permit transfers outside of the transfer restriction, it
must satisfy, as appropriate, one of the following:
●● an export to one of the safe permitted countries or UK equivalent; or
●● an export to the US under the original EU-US Safe Harbour
agreement and the new EU-US Privacy Shield agreement or UK
equivalent;2 or

1 See generally V Alsenoy, ‘Allocating Responsibility Among Controllers, Processors,
and “Everything In Between”: The Definition of Actors and Roles in Directive 95/46/EC,’
Computer Law & Security Review (2012) (28) 25–43; R Morgan, ‘Data Controllers,
Data Processors and Data Sharers,’ SCL Computers and Law, 04 March 2011.
2 Note that the original Safe Harbour agreement was struck down by the Court of Justice.


●● an export pursuant to the accepted standard contractual clauses or
UK equivalent; or
●● an export pursuant to the accepted binding corporate rules (BCRs)
or UK equivalent.
(Note that UK data transfers to the US, and EU data transfers to the UK
are under scrutiny as a result of the Brexit process, and further develop-
ments are expected in this particular area. The Government will need to
avoid conflict issues in order to safely ensure a successful EU adequacy
decision.)

Processors and Data Security


13.02 Processors also have obligations in relation to processing
personal data and security.
Where processing of personal data is carried out by a Processor on
behalf of a Controller, then the Controller must have a contract in writing
in place with the Processor, containing certain clauses and obligations.
The contracts with Processors must address the following:
●● processing must be carried out in pursuance of a contract in writing
or another equivalent form;
●● contract must provide that the Processor carries out the processing
only on and subject to the instructions of the Controller;
●● processor must comply with the ‘security’ requirements (see above).
There are also certain other requirements. The Controller:
●● must ensure Processor provides sufficient guarantees in respect of
the technical security measures and organisational measures govern-
ing the processing;
●● must take reasonable steps to ensure compliance with these measures.
Cloud and data security issues also arise to be considered.

Engaging Processors
13.03 Where processing of personal data is carried out by a Processor
on behalf of a Controller, the Controller should:
●● choose a Processor providing sufficient guarantees in respect of
the technical and organisational security measures governing the
processing to be carried out; and
●● take reasonable steps to ensure compliance with those measures.


Relying on Third Party Processors


13.04 There are specific requirements where Processors are engaged.
Where Controllers engage Processors then the Controller must:
●● have a contract in writing or other equivalent form which provides
that the Processor will act only on the instructions of the Controller and
that it will comply with the data security measures to which the
Controller is subject;
●● ensure the Processor provides sufficient guarantees in respect of the
technical security measures and organisational measures it implements;
●● take reasonable steps to ensure compliance with such matters.
What actions should an organisation take if third parties process their
organisation’s data? Organisations should consider:
●● advanced procedures and policies to ensure and cater for such an
eventuality;
●● ensuring that an appropriate senior manager/board member has an
assigned role and takes responsibility for data protection compliance,
including dealing with breaches, and also having a documented
IT security incident handling procedure policy in place. Needless to
say, it must be properly policed, implemented, reviewed and updated
appropriately;
●● once a breach does occur, a proper procedure can assist in containing
the breach and recovering from same; assessing the ongoing
risk; notifying the breach as appropriate; and evaluating and reacting
to the breach and the risk;
●● all of the appropriate personnel within the organisation should be
aware of the procedures and consulted as appropriate, including for
example managing director, legal, IT, media/press officer;
●● a part of the procedure will have a designated list of external contact
points, some or all of whom may need to be notified and contacted,
as appropriate;
●● obviously the nature of organisations, and of the breaches them-
selves, differ and the IT security incident handling procedure policy
will need to be tailored. After all, the cause of a data security breach
can be any of a number of potential reasons;
●● also, such a policy while covering breaches of data protection and
privacy, may well encompass wider issues, such as outsourcing,
backups, disaster recovery, business continuity, etc;
●● if doing business in other jurisdictions one may need to take addi-
tional measures. For example, if owning or having access to per-
sonal information of Californian residents then one may have a legal
duty to notify those individuals if that information (such as names
and credit card details) has been accessed illegally.


Conclusion
13.05 It is also important that the organisation undertake ongoing
assessments and checks regarding the operation of the data processing
undertaken by the Processor. This should also include the security and
compliance measures. Issues of risk, security breach and data protec-
tion compliance must be assessed on an ongoing basis. The issue of the
relationship and the appropriate contract documentation required when
outsourcing is becoming increasingly complex.

Part 2
Inward Facing Organisational
DP Obligations

Chapter 14
Processing Employee Personal
Data

Introduction
14.01 Organisations sometimes focus on their customer related data
compliance issues. It is important for new and existing organisations
to look inwards, as there are important inward facing data protection
obligations. Personal data includes the personal data of employees also.
The employees of the organisation also have Data Subject rights which
must be respected.
The variety of inward-facing data protection issues which organisa-
tions need to deal with on a daily basis are ever increasing. Some of these
include whether to allow employees to bring their own devices (BYOD)
into the organisation and to permit them to place organisational data
onto such devices. If so permitted, a particular BYOD policy needs to be
considered and implemented. If not permitted, this should be expressly
specified in a transparent manner.
The ability for organisations to track their employees off-site is also
a new consideration. This can include technologies onboard vehicles
as well as satellite and location technologies. It also includes the
organisation's (and employee's BYOD) smart phone (and other) devices.
The issues of monitoring and unjustified monitoring are also big consid-
erations. Organisations should not assume that monitoring, much less all
types of monitoring, of employees is permitted.
Clearly, there are many more means by which new and/or expanded
sets of personal data may be collected and used by an organisation.
However, it is equally necessary to consider the data protection
compliance issues at the earliest opportunity, and what collections are
permissible and which are not.


Organisations should also note that the fields or categories of potential
employee data are ever increasing. They are no longer limited to a hard
copy personnel or HR file, nor limited to the physical location of the
personnel department.

New Inward Facing Changes


14.02 Some of the new EU General Data Protection Regulation
(GDPR) changes which apply to inward facing organisational issues
include:
●● repeal of EU Data Protection Directive 1995 (DPD 1995)
(Article 94);
●● WP29 and European Data Protection Board (Recitals 72, 77, 105,
119, 124, 136, 139, 140, 143, 168; Articles 35, 40–43, 47, 51, 52,
58–62, 64 and 65, 94);
●● background and rationale (Recitals 1–8, 11–13);
●● context objectives, scope of GDPR (Articles 1, 2 and 3);
●● obligations (Recitals 32, 39, 40, 42, 43–47);
●● security (Recitals 2, 16, 19, 39, 49, 50, 52, 53, 71, 73, 75, 78, 81, 83,
91, 94, 104, 112; Articles 2, 4, 5, 9, 10, 30, 32, 35, 40, 45, 47);
●● processing;
●● rights (Recitals 1, 2, 3, 4, 9, 10, 11, 13, 16, 19, 38, 39, 41, 47, 50–54,
57, 59, 63, 65, 66, 68–71, 73–81, 84–86, 89, 91, 94, 98, 102, 104,
108, 109, 111, 113, 114, 116, 122, 129, 137, 139, 141–143, 153–156,
162, 164, 166, 173; Articles 1, 4–7, 9–22);
●● proceedings (Recitals 52, 80, 129, 143–147; Articles 23, 40, 78, 79,
81, 82);
●● establishment (Recitals 22, 36, 52, 65, 100, 117, 122, 124, 126, 127,
145; Articles 3, 4, 9, 17, 18, 21, 37, 42, 49, 54, 56, 57, 60, 62, 65,
70, 79);
●● transfers (Recitals 6, 48, 101–103, 107, 108, 110–114, 153;
Articles 4, 13, 14, 15, 23, 28, 30, 40, 42, 44–49, 70, 83, 85, 88, 96);
●● new bodies (Recital 142; Articles 9, 80);
●● notification/registration replaced (Recital 89);
●● exceptions/exemptions (Recitals 14, 15, 17);
●● lawful processing and consent (Recitals 10, 32, 33, 38–40, 42–46,
49–51, 54, 63, 65, 68, 69, 71, 83, 111, 112, 116, 138, 155, 161, 171;
Articles 4–9, 13, 14, 17, 18, 20, 22, 23, 32, 40, 49, 82, 83);
●● online identifiers (Recitals 30, 64; Articles 4, 87);
●● sensitive and special personal data (Recitals 10, 51, 53, 54, 59, 65,
71, 80, 91, 97; Articles 6, 9, 22, 27, 30, 35, 37, 47);


●● health data (Recitals 35, 45, 52–54, 63, 65, 71, 73, 75, 91, 112,
155, 159; Articles 4, 9, 17, 23, 36, 88);
●● GDPR definitions (Article 4);
●● new processing rules: obligations (Articles 5–11);
●● new (data protection) Principles (Article 5);
●● lawfulness of processing: lawful processing conditions (Article 6);
●● processing special categories of personal data (Article 9);
●● processing re criminal convictions and offences data (Article 10);
●● processing not requiring identification (Article 11);
●● Controllers and Processors (Chapter IV);
●● responsibility of the Controller (Article 24);
●● joint Controllers (Article 26);
●● Processor (Article 28);
●● processing under authority of Controller and Processor (Article 29);
●● records of processing activities (Article 30);
●● representatives of Controllers not established in EU (Article 27);
●● security of processing (Article 32);
●● notifying data breach to supervisory authority (Article 33);
●● communicating data breach to Data Subject (Article 34);
●● data protection impact assessment and prior consultation (Chapter IV,
Section 3);
●● data protection impact assessment (Article 35);
●● prior consultation (Article 36);
●● new Data Protection Officer (DPO) (Chapter IV; Section 4;
Article 37);
●● position (Article 38) and tasks (Article 39) of new DPO;
●● general principle for transfers (Recitals 6, 48, 101–103, 107, 108,
110–114, 153; Articles 4, 13, 14, 15, 23, 28, 30, 40, 42, 44–49, 70,
83, 85, 88, 96; Article 44);
●● transfers via adequacy decision (Article 45);
●● transfers via appropriate safeguards (Article 46);
●● transfers via binding corporate rules (Article 47);
●● transfers or disclosures not authorised by EU law (Article 48);
●● derogations for specific situations (Article 49);
●● new processing rules: Data Subject rights (Recitals 1–4, 9–11, 13,
16, 19, 38, 39, 41, 47, 50–54, 57, 59, 63, 65, 66, 68–71, 73–81,
84–86, 89, 91, 94, 98, 102, 104, 108, 109, 111, 113, 114, 116, 122,
129, 137, 139, 141–143, 153–156, 162, 164, 166, 173; Articles 1,
4–7, 9–22; Chapter III);
●● right to transparency (Recitals 13, 39, 58, 60, 71, 78, 100, 121;
Chapter III, Section 1; Articles 5, 12–14, 26, 40–43, 53, 88);
●● data access rights (Chapter III, Section 2);


●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● rectification and erasure (Chapter III, Section 3);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights against automated individual decision making, including
profiling (Article 22);
●● Data Protection by Design and by Default (DPbD) (Article 25);
●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to Data Subject (Article 34);
●● DPO (contact);
●● remedies, liability and sanctions (Chapter VIII);
●● right to effective judicial remedy against Controller or Processor
(Article 79);
●● representation of Data Subjects (Article 80);
●● right to compensation and liability (Article 82);
●● general conditions for imposing administrative fines (Article 83);
●● penalties (Article 84);
●● specific data processing situations (Chapter IX);
●● processing in employment context (Article 88).
The Data Protection Act 2018 (DPA 2018) at Sch 1, Pt 1 sets out
conditions relating to employment (and other issues), including
conditions relating to employment and special categories of data.
It states that ‘[t]his condition is met if’:
●● ‘the processing is necessary for the purposes of performing or
exercising obligations or rights which are imposed or conferred
by law on the controller or the data subject in connection with
employment …’;
●● ‘when the processing is carried out, the controller has an appropriate
policy document in place …’

Data Protection Officer (DPO)


14.03 There may have been a traditional view within organisations
that the role of the person tasked with dealing with data protection issues
was limited to dealing with outward facing data protection queries such


as access requests, data protection website queries and the like. There
may have been an understanding that personnel managers were respon-
sible for dealing with all employee related queries, including references
to and copies of employee documentation and personal data.
This is no longer the case. Now there must be a designated DPO
appointed in organisations. Furthermore, the role and tasks of the
DPO are not limited to outward facing issues. The DPO will also be
concerned with inward facing issues. Employees and similar internal
facing individuals have data protection rights and will be able to address
queries to the DPO quite separate from the human resource functions.
Therefore, organisations must consider DPO issues and the GDPR
in terms of internal facing functions. Chapter IV, Section 4 of the new
GDPR refers to DPOs and the obligation for organisations to appoint
DPOs.
The Controller and the Processor shall designate a DPO in any case
where:
●● the processing is carried out by a public authority or body, except for
courts acting in their judicial capacity; or
●● the core activities of the Controller or the Processor consist of
processing operations which, by virtue of their nature, their scope
and/or their purposes, require regular and systematic monitoring of
Data Subjects on a large scale; or
●● the core activities of the Controller or the Processor consist of
processing on a large scale of special categories of data pursuant
to Article 9 and data relating to criminal convictions and offences
referred to in Article 10 (Article 37(1)).
A group of undertakings may appoint a single DPO provided that the
DPO is easily accessible from each establishment (Article 37(2)).
Where the Controller or the Processor is a public authority or body,
a single DPO may be designated for several such authorities or bodies,
taking account of their organisational structure and size (Article 37(3)).
In cases other than those referred to in Article 37(1), the Controller
or Processor or associations and other bodies representing categories of
Controllers or Processors may or, where required by EU or state law
shall, designate a DPO. The DPO may act for such associations and other
bodies representing Controllers or Processors (Article 37(4)).
The DPO shall be designated on the basis of professional qualities
and, in particular, expert knowledge of data protection law and practices
and the ability to fulfil the tasks referred to in Article 39 (Article 37(5)).
The DPO may be a staff member of the Controller or Processor, or
fulfil the tasks on the basis of a service contract (Article 37(6)).
The Controller or the Processor shall publish the contact details of the
DPO and communicate these to the supervisory authority (Article 37(7)).


The Controller or the Processor shall ensure that the DPO is involved,
properly and in a timely manner, in all issues which relate to the protec-
tion of personal data (Article 38(1)).
The Controller or Processor shall support the DPO in performing the
tasks referred to in Article 39 by providing resources necessary to carry
out these tasks as well as access to personal data and processing opera-
tions, and to maintain their expert knowledge (Article 38(2)).
The Controller or Processor shall ensure that the DPO does not receive
any instructions regarding the exercise of those tasks. He or she shall not
be dismissed or penalised by the Controller or the Processor for perform-
ing their tasks. The DPO shall directly report to the highest management
level of the Controller or the Processor (Article 38(3)).
Data Subjects may contact the DPO with regard to all issues related to
the processing of their personal data and the exercise of their rights under
the GDPR (Article 38(4)).
The DPO shall be bound by secrecy or confidentiality concern-
ing the performance of their tasks, in accordance with EU or state law
(Article 38(5)).
The DPO may fulfil other tasks and duties. The Controller or Processor
shall ensure that any such tasks and duties do not result in a conflict of
interests (Article 38(6)).
The DPO shall have at least the following tasks:
●● to inform and advise the Controller or the Processor and the employ-
ees who carry out processing of their obligations pursuant to the
GDPR and to other EU or state data protection provisions;
●● to monitor compliance with the GDPR, with other EU or state data
protection provisions and with the policies of the Controller or
Processor in relation to the protection of personal data, including
the assignment of responsibilities, awareness-raising and training of
staff involved in processing operations, and the related audits;
●● to provide advice where requested as regards the data protec-
tion impact assessment and monitor its performance pursuant to
Article 35;
●● to cooperate with the supervisory authority;
●● to act as the contact point for the supervisory authority on issues
related to the processing of personal data, including the prior con-
sultation referred to in Article 36, and to consult, where appropriate,
with regard to any other matter (Article 39(1)).
The DPO shall in the performance of their tasks have due regard to the
risk associated with the processing operations, taking into account the
nature, scope, context and purposes of the processing (Article 39(2)).


Inward Facing Issues


14.04 Organisations must comply with the data protection regime as
regards their inward facing personal data. This primarily relates to the
employees of the organisation. This raises a number of issues both in
terms of whom are the employees, what personal data is being collected
and processed, for what purpose or purposes, where is it located, does
anyone else obtain the data, etc, and how the organisation ensures that it
is data protection compliant in relation to same.
In relation to employee personal data, organisations must ensure com-
pliance with the:
●● Principles of data protection;
●● the Legitimate Processing Conditions;
●● the Special Personal Data Legitimate Processing Conditions;
●● the security requirements;
●● risk assessments and privacy impact assessments;
●● employee (etc) Data Subject rights;
●● new DPO.

Those Covered
14.05 Who is covered by the inward facing organisational data protec-
tion obligations? While full time employees are the most obvious exam-
ple, they are not the only ones. Organisations must consider the inward
facing personal data of:
●● full time employees;
●● part time employees;
●● other workers such as temps and casual staff;
●● agency staff;
●● contractors;
●● ex-employees;
●● retired employees;
●● spouses;
●● job applicants, including unsuccessful applicants;
●● volunteers;
●● apprentices and trainees;
●● work experience staff;
●● indirectly engaged persons, such as actors (as highlighted by the
massive data breach incident involving the Sony film ‘The Interview’,
after which the online publication of internal disparaging emails
regarding certain actors caused significant reputational damage and
potential lawsuits, and cost key personnel their jobs).


Where an organisation engages with the data of any of the above, it will
have to ensure that the data protection regime is complied with.

Compliance with the Principles of Data Protection


14.06 In dealing with employee personal data, as well as potential
employees and any of the categories outlined above, it is necessary for
the organisation to comply with the Principles of data protection.
The data protection Principles apply in the work environment as well
as elsewhere. Employee personal data must be:
(1) processed lawfully, fairly and transparently;
(2) collected for specified, explicit and legitimate purposes;
(3) adequate, relevant and limited to purpose;
(4) accurate and kept up to date;
(5) kept no longer than is necessary;
(6) processed in line with security, integrity and confidentiality;
(7) processed in an accountable manner.
All of the Principles must be complied with. Compliance with the
Principles of data protection applies to all employee and inward-facing
personal data collected and/or processed by an organisation. It also
applies regardless of registration requirements (if any).
These can be summarised in the new GDPR as:
●● lawful, fair and transparent processing;
●● purpose limited processing;
●● data minimisation;
●● ongoing accuracy of data;
●● time storage limitations;
●● appropriate security, integrity and confidentiality.
The Controller must demonstrate compliance and accountability.
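By way of a purely illustrative sketch (the record categories and retention periods below are invented examples, not legal guidance), the storage limitation Principle (‘kept no longer than is necessary’) is often operationalised in practice as a scheduled check of records against a documented retention schedule:

```python
from datetime import date, timedelta

# Hypothetical retention schedule; the periods shown are examples only,
# not legal advice, and would need to be set per organisation.
RETENTION_DAYS = {
    "unsuccessful_applicant": 180,
    "ex_employee_payroll": 6 * 365,
}

def is_due_for_deletion(category: str, collected: date, today: date) -> bool:
    """Return True once a record has exceeded its scheduled retention period."""
    return today - collected > timedelta(days=RETENTION_DAYS[category])

print(is_due_for_deletion("unsuccessful_applicant", date(2024, 1, 1), date(2024, 12, 1)))
```

Any such automated check would operate alongside, and not replace, the organisation’s documented retention policy and human review.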

Ordinary Personal Data Lawful Processing Conditions


14.07 When dealing with the above inward-facing categories of
employees, agents, contractors, etc, the Lawful Processing Conditions
are required to be complied with, in addition to the Principles of data
protection.
The GDPR contains the general Lawful Processing Conditions, eg for
employee non-sensitive personal data. In order to comply with the


general personal data Lawful Processing Conditions, the organisation
must fall within one of the following conditions, namely:
●● the individual whom the personal data is about has consented to the
processing;
●● the processing is necessary for the performance of a contract to
which the Data Subject is a party or in order to take steps at the
request of the Data Subject prior to entering into a contract;
●● the processing is necessary for compliance with a legal obligation to
which the Controller is subject;
●● the processing is necessary to protect the ‘vital interests’ of the
Data Subject or another natural person;
●● the processing is necessary for the performance of a task carried out
in the public interest or in the exercise of official authority vested in
the Controller;
●● the processing is necessary for the ‘legitimate interests’ of the
Controller or a third party, except where overridden by the interests
or fundamental rights of the Data Subject, in particular where the
Data Subject is a child.
Article 7 also sets out conditions which apply in relation to consent. The
Controller, if relying on consent (and if appropriate in the first instance),
must be able to demonstrate that the Data Subject has consented. These
conditions need to be considered carefully in advance.
(There are also additional conditions which apply for particular
children’s consent, eg if information is held in relation to the children of
an employee.)

Lawful Processing and Organisation’s Employees


14.08 Unlike certain other areas, the legitimate processing of
employee personal data does not require employee consent. However,
many organisations would have originally expected and proceeded on
the basis that consent was so required. They would have incorporated
(deemed) consent clauses into employment contracts, etc.
As indicated above, organisations can rely upon the legitimate
processing condition that processing is necessary for the purposes of
legitimate interests pursued by the Controller or by the third party or
parties to whom the data are disclosed, except where the processing is
unwarranted in any particular case by reason of prejudice to the rights
and freedoms or legitimate interests of the Data Subject.


An alternative could be to say that the processing is necessary as
part of a contract to which the employee is a party, in particular the
employment contract.

Special Personal Data Lawful Processing Conditions


14.09 Organisations may also feel the need on occasion to store and
use special personal data in relation to employees, such as medical and
health data. In the case of special personal data, an organisation must,
in addition to satisfying all of the data protection Principles, be able
to comply or fall within one of the Special Personal Data Lawful
Processing Conditions.
The GDPR sets out conditions relevant for the purposes of the process-
ing of Special Personal Data. The organisation should be familiar with
what constitutes Special Personal Data, namely, personal data reveal-
ing racial or ethnic origin, political opinions, religious or philosophical
beliefs, or trade union membership, and the processing of genetic data,
biometric data for the purpose of uniquely identifying a natural person,
data concerning health data or data concerning a natural person’s sex life
or orientation. Such data by default may not be used (such ‘[p]rocessing …
shall be prohibited’). The organisation must fall within one of the
specified exemptions to be able to process Special Personal Data. In the
context of employees and inward-facing personal data, the conditions or
exceptions provided are:
1 the employee Data Subject has given explicit consent to the process-
ing of the personal data for a specified purpose;
2 the processing is necessary for the purposes of carrying out obliga-
tions or rights under employment law;
3 the processing is necessary to protect the vital interests of the
Data Subject or another natural person where the Data Subject is
incapable of consent;
4 the processing is carried out in the course of its legitimate activities
with appropriate safeguards by a foundation, association or not for
profit body and relates to its members;
5 the processing relates to personal data which are manifestly made
public by the Data Subject;
6 the processing is necessary for the establishment, exercise or defence
of legal claims;
7 the processing is necessary for reasons of substantial public inter-
est, on the basis of Union or state law which shall be proportionate
to the aim pursued, respect the essence of the right to data protec-
tion and provide for suitable and specific measures to safeguard the
fundamental rights and the interests of the Data Subject;


8 the processing is necessary for preventative or occupational
medicine, for the assessment of the working capacity of the employee,
medical diagnosis, the provision of health or social care or treatment;
9 the processing is necessary for public health;
10 the processing is necessary for archiving purposes in the public inter-
est, scientific or historical research purposes or statistical purposes.
The DPA 2018 at ss 10 and 11 also sets out further provisions in relation
to Special Personal Data; as well as Sch 1 of the Act.
Furthermore, regulations can also set out further obligations in rela-
tion to the processing of special personal data.

Special Data Lawful Processing and Organisation’s


Employees
14.10 However, where sensitive personal data is involved, the organi-
sation may have to obtain consent, unless otherwise specifically permit-
ted in relation to one of the other particular purpose conditions set out
above.
Consent
14.11 If it is the case that none of the condition criteria apply or permit
a specific data processing use, then the organisation must obtain appro-
priate consent.
In considering the legitimising criteria the organisation should pay
particular attention to the conditional word ‘necessary’. If the processing
is not strictly necessary, and demonstrably so in the event that it ever has
to be justified when subsequently queried, it may be necessary to fall
back on the consent criteria.
Explicit consent, for example, with special personal data usage, can
mean having to obtain written consent. There is also a reference to freely
given, specific, informed and unambiguous consent (see definition of
‘consent’). Where employee consent is utilised or required, an issue can
sometimes arise as to whether it is voluntarily and freely given. If the
employee feels forced into giving consent they, or the ICO, might subse-
quently say that the consent is invalid, thus compromising the legitimacy
of the processing.
Compliance and Policies
14.12 It is essential for all organisations to implement appropriate
written policies. These will vary, as there are different types of policy. In
addition, each policy will be specifically tailored to meet the needs and
circumstances of the particular organisation.


Once finalised, the policy or policies should be properly and effec-


tively communicated to all employees. It may also have to be made
available to others who work in or for the organisation eg contractors.
Those in the organisation with responsibility for dealing with personal
data also need to be particularly familiarised with the policies, security
measures, data protection processes, as well as effectively trained in
relation to all aspects of data protection compliance.
As with all legal, business and organisational policies relating to regu-
latory compliance, data protection policies need to be regularly moni-
tored and updated as requirement dictates. This can be because of legal
changes, ICO guideline changes, business and activity changes, chang-
ing security and risk requirements, changing technology, etc.

ICO Codes
14.13 It is recommended that the organisation consult the most rel-
evant ICO guidance and codes in terms of implementing, or tailoring, IT
data processing and data protection compliance practices.1
Responsible Person for Data Protection Compliance
14.14 Organisations will have to appoint a particular identified person
in the organisation to deal with and be responsible for all data protection
processes and related compliance issues. This includes ensuring that the
organisation is data protection compliant in accordance with the DPA
and the ICO Employment Practices Code.2
In addition to a data protection supervisor or DPO it is also recom-
mended that an individual at Board level be appointed to be responsible
for overseeing and dealing with data protection compliance within the
organisation. The contemporary importance and significance of data pro-
tection ensures that it is now clearly a boardroom issue.
Responsibilities include ongoing compliance, policies, strategies,
processes, training, monitoring, reviewing and updating, coordinating,
engaging appropriate expertise and advice, audits, DPbD, education,
courses, notification and registration, incident response, etc, as regards
data protection.
Personnel Contracts
14.15 The organisation’s contracts of employment need to incorpo-
rate appropriate notifications, consent clauses (only if appropriate), and

1 ICO Employment Practices Code, at https://ico.org.uk.


2 See above.


references to the organisational policies relevant to data protection, such


as corporate communications usage policies, internet policies, devices,
security, breaches, etc. There may also be reference to where these
policies are located and where the latest and most up to date version will
always be available. This might be a staff handbook, intranet, etc.
These documents will all be different depending on the organisation,
the sector and the data protection practices ie the types of personal data,
what it is used for and why. Issues such as compliance, security and
confidentiality should be emphasised.
Training
14.16 Staff induction sessions as well as ongoing staff training
should all incorporate guidance in relation to dealing with and handling
personal data.
Employee Handbooks
14.17 Most organisations provide an employee handbook to their
employees. These handbooks should contain guidance and policies on
data protection, corporate communications, usage policies, etc.
Notices, Communications, Internet, Intranet
14.18 All of these provide an opportunity for the organisation to set
out and reinforce the organisational requirements regarding the data pro-
tection regime. These can be user friendly and instantaneous, and hence
provide an easy means of notifying new and urgent changes, notices, etc.
Personnel Department
14.19 While the personnel department or personnel Director will be
required to be familiar with data protection, the role and responsibility
of operating data protection compliance is not a personnel role.
It needs to be someone outside of the employment and personnel func-
tion. However, the person responsible for data protection will need to
liaise regularly with the personnel department, including training and
keeping personnel staff up to date in relation to new developments in
data protection compliance.

Employees and Security


14.20 The security of employee personal data is an essential obliga-
tion under the data protection regime.
The Controller must take reasonable steps to ensure the reliability of
any employees of the organisation who have access to the personal data.


There are a number of cases where issues arise, including prosecutions,


where an employee has engaged in unauthorised access to personal data
which they had no work function or purpose to do so (eg, snooping);
and also where an employee leaving the organisation takes personal data
seeking to use it in their next employment (eg, customer databases).
Lynda MacDonald3 refers to security, indicating that the following
would be required to be checked, namely:
●● access to buildings, computer rooms and offices where personal data
is held;
●● access to computer and other equipment where unauthorised access
could have a detrimental effect on security;
●● that manual records are put away at night in locked filing cabinets
before the cleaners arrive;
●● that passwords are not written down so that others can access
them;
●● that strict rules are in force within the business about what personal
information can be accessed by which organisations, for example,
the information that line managers can access about their staff may
be different to (and less than) that which a personnel manager can
access;
●● that there is an efficient and effective security system in place to
prevent employees from seeing other employee data;
●● that mechanisms are in place to detect any breach of security, and
procedures in place to investigate such breaches;
●● that all staff have undergone training on the organisation’s data pro-
tection policy and how it works in practice, either as part of their
induction or at specialist sessions which are appropriate to their job
role, position and function.4
Even where an employee is working off-site or at home, or indeed trav-
elling, it is still the organisation’s responsibility to ensure appropriate
security measures and processes are in place.
These scenarios are increasingly prominent, but the organisation
should ensure that they occur on a planned and permissioned basis.
Unregulated activities enhance the level of risk, as well as increasing the
level of unknown, uncontrollable activities.
Stewart Room in Cyber and Data Security Law and Practice also
provides many detailed suggestions in relation to security issues.

3 L AC MacDonald, Data Protection: Legal Compliance and Good Practice for


Employers (Tottel, 2008) 11–12.
4 See above.


Data Access Right of Employees


14.21 Employees are individuals and are hence able to rely upon their
data protection rights, such as requesting a copy of the personal data
relating to them. Where an employee access request is made, the
organisation should assess whether any fee may lawfully be charged
(under the GDPR a first copy must generally be provided free of charge)
and whether any exemptions apply which would prevent a disclosure
being made. In addition, while the employee’s personal data may have
to be disclosed, certain third party personal data may have to be taken
out or redacted. It may also be necessary to seek clarification as to what
particular personal data is being sought, although this should not be
used as a mere mechanism to avoid complying.
Now the access request must be complied with without delay and at
latest within one month (reduced from 40 calendar days).
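Where organisations track these deadlines in case management systems, the one-month period might be approximated programmatically. The following Python sketch is illustrative only (the function name is invented, and the precise legal computation of ‘one month’ should be confirmed separately); it rolls the receipt date forward by one calendar month, falling back to the last day of that month where needed:

```python
from datetime import date
import calendar

def access_request_deadline(received: date) -> date:
    """Illustrative one-calendar-month deadline after receipt of a request.

    If the same day number does not exist in the following month
    (eg 31 January), the last day of that month is used instead.
    """
    year = received.year + received.month // 12
    month = received.month % 12 + 1
    day = min(received.day, calendar.monthrange(year, month)[1])
    return date(year, month, day)

print(access_request_deadline(date(2024, 1, 31)))  # 2024-02-29 (leap year)
```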
MacDonald5 refers to some examples of the types of employee related
personal data that may often be subject to access requests, such as:
●● performance reviews or appraisals;
●● sickness records;
●● warnings or minutes of disciplinary interviews;
●● training records;
●● statements about pay;
●● emails or electronic documents of which they are the subject; and
●● expressions of opinion regarding eg prospects for promotion.
Information relating to automated individual decisions regarding
employees is also relevant and may be requested.

Conclusion
14.22 Organisations need to actively have considered data protec-
tion compliance issues and the recording of compliance even before an
employee is engaged. Data protection and personal data arises at recruit-
ment, selection and interview stages. The organisation must also note
relevant policies. Unless data protection issues and policy issues are
properly incorporated into the employment relationship, the organisa-
tion may be non-compliant. In addition, it may not be able to enforce
and rely upon particular contract terms, policies, employee obligations,
disciplinary rules and such like.

5 See above at 51.

Chapter 15
Employee Data Protection Rights

Introduction
15.01 Just as other parties whose personal data are being collected and
processed have data protection rights, employees also have Data Subject
rights.

The Data Protection Rights of Employees


15.02 The rights of employees can be summarised as including:
●● right of access;
●● right to establish if personal data exists;
●● right to be informed of the logic in automatic decision taking;
●● right to prevent processing likely to cause damage or distress;
●● right to prevent processing for direct marketing;
●● right to prevent automated decision taking;
●● right to compensation (see Data Protection Act 2018 (DPA 2018),
ss 168 and 169);
●● right to rectify inaccurate data;
●● right to rectification, blocking, erasure and forgetting, and
destruction;
●● right to complain to ICO;
●● right to go to court.
Chapter III of the new EU General Data Protection Regulation (GDPR)
refers to the rights of Data Subjects. These include:
●● right to transparency (Article 5; Article 12);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);


●● right to be informed of third country safeguards (Article 15(2));
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights re automated individual decision making, including profiling
(Article 22);
●● DPbD (Article 25);
●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to Data Subject;
●● Data Protection Officer (DPO);
●● remedies, liability and sanctions (Chapter VIII).
However, there are additional rights-related issues to consider. Employ-
ers should recall that there can be obligations to notify Data Subjects of
the existence of certain rights that they can enforce. There is little consid-
eration of how this may apply in terms of relevance to the employment
context.
In addition, employers will recall that there are various record
keeping obligations. Consideration should also be given to how and to
what extent this is applied in the employment context, separate to the
outward-facing data protection rules.

Rights Under the GDPR


Access Right
15.03 There are various rights for individual Data Subjects. How these
can apply in the work environment for employees is set out below.
Article 15 of the new GDPR relates to the right of access for the
employee. Employees and the other categories of person referred to
above are covered by the data access right.
The employee shall have the right to obtain from the Controller
confirmation as to whether or not personal data concerning them are
being processed, and access to the data and the following information:
●● the purposes of the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipient to whom the personal data
have been or will be disclosed, in particular to recipients in third
countries or international organisations;


●● where possible, the envisaged period for which the personal data will
be stored, or if not possible, the criteria used to determine that period;
●● the existence of the right to request from the Controller rectification
or erasure of personal data or restriction of processing of personal
data concerning the employee or to object to the processing;
●● the right to lodge a complaint with an SA;
●● where the personal data are not collected from the employee, any
available information as to their source;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
employee (Article 15(1)).
Where personal data are transferred to a third country or to an interna-
tional organisation, the employee shall have the right to be informed of
the appropriate safeguards pursuant to Article 46 relating to the transfer
(Article 15(2)).
The Controller shall provide a copy of the personal data undergoing
processing. For any further copies requested by the employee, the Con-
troller may charge a reasonable fee based on administrative costs. Where
the Data Subject makes the request in electronic form, and unless other-
wise requested by the employee, the information shall be provided in a
commonly used electronic form (Article 15(3)).
The right to obtain a copy referred to in para (3) shall not adversely
affect the rights and freedoms of others (Article 15(4)).
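The Article 15(1) items above effectively form a checklist that a controller must be able to assemble for any employee access request. As a purely illustrative sketch (not from the text; the record structure and field names are hypothetical, since the GDPR prescribes the information to be provided, not any format), the checklist might be modelled as:

```python
# Illustrative sketch only: the record structure and field names are
# hypothetical; the GDPR prescribes the information, not any format.
import json

def build_access_response(employee_record: dict) -> dict:
    """Assemble the Article 15(1) information to accompany the data copy."""
    return {
        "purposes_of_processing": employee_record["purposes"],
        "categories_of_personal_data": employee_record["categories"],
        "recipients_or_categories_of_recipient": employee_record["recipients"],
        "envisaged_storage_period_or_criteria": employee_record["retention"],
        "rights_notice": ["rectification", "erasure", "restriction", "objection"],
        "right_to_lodge_complaint_with_sa": True,
        "source_if_not_collected_from_employee": employee_record.get("source"),
        "automated_decision_making_including_profiling": employee_record.get(
            "automated_decisions", "none"),
    }

record = {
    "purposes": ["payroll", "performance management"],
    "categories": ["contact details", "pay records"],
    "recipients": ["payroll processor"],
    "retention": "duration of employment plus 6 years",
}
# Article 15(3): where the request is made electronically, the information
# should be provided in a commonly used electronic form (JSON here).
print(json.dumps(build_access_response(record), indent=2))
```

A structure along these lines also doubles as an internal compliance record that the request was answered completely.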
Rectification Right
15.04 Chapter III, Section 3 of the GDPR refers to rectification and
erasure. The employee shall have the right to obtain from the Control-
ler without undue delay the rectification of inaccurate personal data con-
cerning them. Taking into account the purposes of the processing, the
employee shall have the right to have incomplete personal data completed,
including by means of providing a supplementary statement (Article 16).
Right to Erasure (Right to be Forgotten)
15.05 The new GDPR states that the Data Subject shall have the right to
obtain from the Controller the erasure of personal data concerning them
without undue delay and the Controller shall have the obligation to erase
personal data without undue delay where one of the following grounds
applies:
●● the personal data are no longer necessary in relation to the purposes
for which they were collected or otherwise processed;


●● the employee withdraws consent on which the processing is based according to Article 6(1)(a), or Article 9(2)(a), and where there is no
other legal ground for the processing;
●● the Data Subject objects to the processing pursuant to Article 21(1)
and there are no overriding legitimate grounds for the processing, or
the employee objects to the processing pursuant to Article 21(2);
●● the personal data have been unlawfully processed;
●● the personal data have to be erased for compliance with a legal
obligation in EU or state law to which the Controller is subject;
●● the data have been collected in relation to the offering of information
society services referred to in Article 8(1) (Article 17(1)).
Where the Controller has made the personal data public and is obliged
pursuant to Article 17(1) to erase the personal data, the Controller, taking
account of available technology and the cost of implementation, shall
take reasonable steps, including technical measures, to inform Con-
trollers which are processing the personal data that the employee has
requested the erasure by such Controllers of any links to, or copy or
replication of, those personal data (Article 17(2)).
Article 17(1) and (2) shall not apply to the extent that processing is
necessary:
●● for exercising the right of freedom of expression and information;
●● for compliance with a legal obligation which requires processing by EU or state law to which the Controller is subject or for the performance of a task carried out in the public interest or in the exercise of official authority vested in the Controller;
●● for reasons of public interest in the area of public health in accordance with Article 9(2)(h) and (i) as well as Article 9(3);
●● for archiving purposes in the public interest, or scientific and historical research purposes or statistical purposes in accordance with Article 89(1) in so far as the right referred to in Article 17(1) is
likely to render impossible or seriously impair the achievement of
the objectives of that processing; or
●● for the establishment, exercise or defence of legal claims
(Article 17(3)).
Notification Obligation re Rectification,
Erasure or Restriction
15.06 The new GDPR states that the Controller shall communicate
any rectification, erasure or restriction of processing carried out in
accordance with Articles 16, 17(1) and 18 to each recipient to whom
the data have been disclosed, unless this proves impossible or involves
disproportionate effort. The Controller shall inform the employee about
those recipients if the employee requests it (Article 19).
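The Article 19 duty above is, in practice, a loop over the recipients to whom the data were disclosed. A minimal sketch, assuming a hypothetical recipient list and a simple message log in place of a real notification channel:

```python
# Hypothetical sketch: in practice notification might be by letter, email
# or API call; here each notification is simply recorded as a string.
def notify_recipients(recipients, action, data_ref):
    """Record an Article 19-style notification for each recipient."""
    notified = []
    for recipient in recipients:
        notified.append(f"Notified {recipient}: {action} of {data_ref}")
    return notified

messages = notify_recipients(
    ["payroll processor", "benefits provider"],  # hypothetical recipients
    "erasure",
    "employee record ref 123",
)
for message in messages:
    print(message)
```

Keeping such a log also supports the second limb of Article 19: informing the employee about those recipients on request.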


Right to Restriction of Processing


15.07 Article 18 of the new GDPR refers to the right to restriction of
processing. The employee shall have the right to obtain from the Controller
the restriction of the processing where one of the following applies:
●● the accuracy of the data is contested by the employee, for a period
enabling the Controller to verify the accuracy of the data;
●● the processing is unlawful and the employee opposes the erasure of
the personal data and requests the restriction of their use instead;
●● the Controller no longer needs the personal data for the purposes
of the processing, but they are required by the employee for the
establishment, exercise or defence of legal claims;
●● the Data Subject has objected to processing pursuant to Article 21(1)
pending the verification whether the legitimate grounds of the
Controller override those of the employee (Article 18(1)).
Where processing has been restricted under Article 18(1), such per-
sonal data shall, with the exception of storage, only be processed with
the employee’s consent or for the establishment, exercise or defence of
legal claims or for the protection of the rights of another natural or legal
person or for reasons of important public interest of the EU or of a state
(Article 18(2)).
An employee who has obtained the restriction of processing pursuant
to Article 18(1) shall be informed by the Controller before the restriction
of processing is lifted (Article 18(3)).
Right to Data Portability
15.08 The new GDPR states that the employee shall have the right to
receive the personal data concerning them, which they have provided
to a Controller, in a structured, commonly used and machine-readable
format and have the right to transmit those data to another Controller
without hindrance from the Controller to which the personal data have
been provided, where:
●● the processing is based on consent pursuant to Article 6(1)(a) or
Article 9(2)(a) or on a contract pursuant to Article 6(1)(b); and
●● the processing is carried out by automated means (Article 20(1)).
In exercising their right to data portability pursuant to Article 20(1),
the employee has the right to have the data transmitted directly from
Controller to Controller, where technically feasible (Article 20(2)).
The exercise of the right shall be without prejudice to Article 17.
The right shall not apply to processing necessary for the performance
of a task carried out in the public interest or in the exercise of official
authority vested in the Controller (Article 20(3)).


The right shall not adversely affect the rights and freedoms of others
(Article 20(4)).
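Article 20's requirement of a 'structured, commonly used and machine-readable format' is commonly met with formats such as JSON, CSV or XML. A hedged sketch of a JSON and a CSV export, with purely illustrative field names and data (the GDPR does not mandate any particular format or schema):

```python
# Illustrative only: JSON and CSV are simply common machine-readable
# choices; the field names and values below are invented for the example.
import csv
import io
import json

employee_data = {
    "name": "A. Employee",
    "email": "a.employee@example.com",
    "payroll_history": [{"month": "2020-01", "gross": 3000}],
}

# Structured JSON export of the full record
json_export = json.dumps(employee_data, indent=2)

# Flat CSV export of the payroll rows
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=["month", "gross"])
writer.writeheader()
writer.writerows(employee_data["payroll_history"])
csv_export = buffer.getvalue()

print(json_export)
print(csv_export)
```

Either output could then be transmitted to the employee, or directly to another Controller where technically feasible (Article 20(2)).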
Automated Individual Decision Making Right
15.09 Chapter III, Section 4 of the new GDPR refers to the right to
object and automated individual decision making.
Right to Object
15.10 The new GDPR states that the employee shall have the right
to object, on grounds relating to their particular situation, at any time
to the processing of personal data concerning them which is based on
Article 6(1)(e) or (f), including profiling based on these provisions.
The Controller shall no longer process the personal data unless the
Controller demonstrates compelling legitimate grounds for the processing
which override the interests, rights and freedoms of employee or for the
establishment, exercise or defence of legal claims (Article 21(1)).
Where personal data are processed for direct marketing purposes,
the Data Subject shall have the right to object at any time to processing
of personal data concerning them for such marketing, which includes
profiling to the extent that it is related to such direct marketing
(Article 21(2)).
Where the employee objects to processing for direct marketing
purposes, the personal data shall no longer be processed for such
purposes (Article 21(3)).
At the latest at the time of the first communication with the employee,
the right referred to in Article 21(1) and (2) shall be explicitly brought
to the attention of the employee and shall be presented clearly and
separately from any other information (Article 21(4)).
In the context of the use of information society services, and
notwithstanding Directive 2002/58/EC,1 the employee may exercise
their right to object by automated means using technical specifications
(Article 21(5)).
Where personal data are processed for scientific and historical
research purposes or statistical purposes pursuant to Article 89(1), the
employee, on grounds relating to their particular situation, shall have the
right to object to processing of personal data concerning them, unless

1 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).
Also note ePrivacy Regulation and updated ePrivacy rules.


the processing is necessary for the performance of a task carried out for
reasons of public interest (Article 21(6)).
Automated Individual Decision Making, Including Profiling
15.11 The new GDPR states that the employee shall have the right
not to be subject to a decision based solely on automated processing,
including profiling, which produces legal effects concerning them or
similarly significantly affects them (Article 22(1)).
Article 22(1) shall not apply if the decision:
●● is necessary for entering into, or performance of, a contract between the employee and a Controller (Article 22(2)(a)); or
●● is authorised by EU or state law to which the Controller is subject and which also lays down suitable measures to safeguard the employee's rights and freedoms and legitimate interests (Article 22(2)(b)); or
●● is based on the Data Subject's explicit consent (Article 22(2)(c)).
In cases referred to in Article 22(2)(a) and (c) the Controller shall imple-
ment suitable measures to safeguard the employee’s rights and freedoms
and legitimate interests, at least the right to obtain human intervention
on the part of the Controller, to express their point of view and to contest
the decision (Article 22(3)).
Decisions referred to in Article 22(2) shall not be based on special
categories of personal data referred to in Article 9(1), unless Article 9(2)(a)
or (g) applies and suitable measures to safeguard the employee’s rights
and freedoms and legitimate interests are in place (Article 22(4)).
Express Employee Provision
15.12 There are a limited number of situations which are expressly
referred to in the GDPR in addition to the general provisions. One
­specific provision refers to the employment sector. This is referred to in
GDPR Article 88. It provides that:
‘Member States may, by law or by collective agreements, provide for more
specific rules to ensure the protection of the rights and freedoms in respect
of the processing of employees’ personal data in the employment context, in
particular for the purposes of the recruitment, the performance of the con-
tract of employment, including discharge of obligations laid down by law or
by collective agreements, management, planning and organisation of work,
equality and diversity in the workplace, health and safety at work, protection
of employer’s or customer’s property and for the purposes of the exercise and
enjoyment, on an individual or collective basis, of rights and benefits related
to employment, and for the purpose of the termination of the employment
relationship.’


It continues that:
‘Those rules shall include suitable and specific measures to safeguard the data
subject’s human dignity, legitimate interests and fundamental rights, with par-
ticular regard to the transparency of processing, the transfer of personal data
within a group of undertakings, or a group of enterprises engaged in a joint
economic activity and monitoring systems at the work place.’

Conclusion
15.13 Organisations need to respect the data protection rights of
their employees, and even potential employees. In addition, particular
care and attention needs to be paid to access requests from ­employees.
Compliance is an ongoing issue as, for example, new changes and
­business practices will always present new challenges internally.

Chapter 16
Employee Considerations

Introduction
16.01 A large number of issues arise in terms of dealing with employee
personal data, both in terms of audits, access, planning and compliance.

Contract
16.02 The starting point for informing and apprising employees of
the importance of data protection, confidentiality and security in the
organisation, and in relation to the employee’s personal data should be to
begin with the employment contract. There should be clauses referring
to data protection and also to security.

Policies
16.03 There should also be policies relating to data protection furnished to all actual (and prospective) employees. These need to be updated regularly as the issues covered change constantly.
These policies may be separate or may be incorporated into an
employee handbook.
Organisations need to be aware that if an important change is made, unless it is notified and recorded to employees, the organisation may not be able to rely upon the changed clause. For example, if a new activity is banned or regulated in an updated policy but the new policy is not notified to employees, it may be impossible to use the new policy to discipline an errant employee. They and their lawyers will strongly argue that the new policy was not notified, is not part of the employment relationship, and would be unlawful to apply. There are many examples of this problem occurring in practice.


Data Protection Policy


16.04 In considering what policies to implement, the organisation
will have to consider many issues and separate policies, with data
protection being just one. Others include health and safety, environmental,
etc. The data protection policy will perhaps be the most dynamic given the wide range of data, individuals, processes, activities and technologies involved, which are ever changing. Someone within the organisation needs to be responsible for examining and implementing ongoing changes as the requirement arises.

Internet Usage Policy


16.05 In addition to a general data protection policy, further specific
data protection related policies may be required. The foremost of these
relates to the employees’ use of the internet on the organisation’s systems.

Mobile and Device Usage Policies


16.06 Increasingly, employees use not just desktop computers but also laptops, mobile phones, smart phones, handheld devices, BlackBerrys, iPads, etc. Unless explicitly incorporated into one of the above policies, additional, or extended, policies need to be implemented. One particular issue for an organisation to consider is the distinction between employees' own devices, employee devices whose bills are paid by the organisation, and devices supplied and paid for by the organisation. Somewhat different considerations may need to apply to each.

Vehicle Use Policy


16.07 Vehicles must also be considered by organisations in terms of
policy and employee usage. Increasingly tracking and other data from
vehicles can be related to identified employees. Sometimes this may be
as a result of the organisation furnishing a vehicle to the employee for
work related business. However, this need not be the case. Even where
an employee is using their own vehicle, it is possible for the organisation
to be collecting or accessing personal data. In addition, organisations
must also be aware that vehicle technology, both in-built and added,
can result in greater collections of data, including personal data, being
potentially accessible. Tracking, incident and accident reports, etc, are


just some examples. It is important for organisations to consider these personal data issues.

Transfers of Undertakings
16.08 Transfers of undertakings or TUPE,1 as it is known, refers to
organisations being transferred to a new owner, whether in whole or in
part. Situations like this are an everyday part of the commercial business
environment.
The Transfer of Undertakings (Protection of Employment) Regulations
20062 provide certain protections for employees where the business is
being transferred to a new owner. While primarily directed at the protec-
tion of employees’ employment related rights, it is also noteworthy in
that employee personal data will also be transferred to the new employer
or organisation. There are various types of personal data to consider.
The same rights, interests, terms, contracts, etc, as applied with the old employer are to apply with the new employer, which include data protection rights, and possibly corporate communications usage, etc.

Evidence
16.09 Disputes naturally arise between employers and employees from
time to time. In such disputes the organisation may need to rely upon
documents, information and materials which will contain ­personal data.
While this may fit under the legitimate interests and legal interest provi-
sions, employers should be careful about using less obvious personal
data. This might include private emails communications over which an
employee may argue privacy and confidentiality. Particular concerns
arise in cases where covert monitoring has been used. Before commencing covert activities of this type the organisation should seek advice. Otherwise, any evidence gathered may be ruled inadmissible subsequently.
Organisations also need to be careful in that computer, electronic and mobile data can change, expire and/or be amended quickly. It can be important to act quickly. Advance policies, procedures and protocols assist in this regard.

1 See generally, L AC MacDonald, Data Protection: Legal Compliance and Good Practice for Employers (Tottel, 2008) 44–48.
2 SI 2006/246. Please consider any changes, updates and/or replacements as may occur.


Enforceability
16.10 An important issue for all organisations to consider is that if
they ever wish to be able to discipline an errant employee, including up
to dismissal, consequent upon their activities regarding personal data,
internet usage, security breach, etc, the organisation needs to be able to
point to a breach of an explicit contract or policy clause. If the breach is not explicit, or the prohibition is not written into the policy or contract, the employee can argue that there is no breach, perhaps regardless of the seriousness of the instant issue arising.
While online abuse is an issue to be dealt with, organisations need to
be conscious of reputational damage from such incidents. However, they
also need to consider what would happen if any of their employees are
involved, whether by way of participation or orchestration, in an incident
of abuse. One would have to consider what the contracts and policies
say, as well as develop reaction strategies.

Data Breach
16.11 Employee involvement is critical in dealing with – and
preparing for – data breach incidents. Various teams of employees will
be involved.
Notification of Employee Data Breaches
16.12 However, as employee personal data can also be the subject of
a data breach incident, employees may also need to be specifically con-
sidered in this context. For example, they may need to be separately
informed that there is a breach relating to their personal data and what
actions and safeguards are being followed by the organisation to deal
with the issue. If the employees need to take specific actions, they may
also need to be apprised of this possibility. Potential liability issues
may also arise. For example, employees in the massive Sony data breach
incidents may have considered suing Sony for breaches in relation to
their data.

Employee Data Organisations


16.13 It is increasingly possible for an organisation to represent
Data Subjects, in terms of dealing with particular data protection issues
but also in terms of representation and litigation.
While employee unions may represent their members in dealing with
data protection issues, it is possible that employers may also have to deal


with new organisations in relation to employees’ personal data. This may involve breach incidents in particular.

Location
16.14 Increasingly technology permits organisations to track devices
and hence employees. However, a very careful prior analysis needs to
be conducted in advance to consider whether this is strictly needed,
desirable, proportionate, transparent and complies with the data protec-
tion regime. As with many of these new issues, a question of balance
is needed and just because something is technically possible does not
equate with it being prudent or lawful.

Conclusion
16.15 These are all issues that need to be addressed well in advance
of an incident arising. It may be too late to start implementing policies,
etc, after the event.
Employee data protection compliance is a complicated and ongoing
obligation for organisations. It also needs to involve the DPOs, appropri-
ate board member, human resources, IT personnel and legal. On occasion, others may have to become involved as well.
Organisations should also ensure that employees are fully apprised of
the need for respect for personal data, data security, and understanding
of the respective policies to protect personal data. Where an employee
accesses, uses or – worse – discloses personal data, whether of customers
or employees, the employer will be held responsible (notwithstanding
any employee responsibility issues). The data regulator will seek to know
what the employer did or did not do that may have prevented the employee breach from arising. Data security is also very important and ranges from
access controls, segregation, hierarchies of access on a need-to-access
basis, passwords, encryption, device policies, and on-site and off-site
usage rules. These issues, and where risks and gaps may arise, are ongoing
compliance requirements and cannot be left static over time.

Chapter 17
Employee Monitoring Issues

Introduction
17.01 One of the more contentious areas of employee data protection
practice relates to the issue of the monitoring of employee email, inter-
net, etc usage. More recently, concerns have arisen as to the use of tech-
nology to monitor employee activities generally, and to monitor their
location and movement.
Employers are concerned about certain risks that can arise as a result of the activities of employees in the workplace, but sometimes proceed without considering data protection compliance, the need for proportionality, and that while certain monitoring activities may be permissible, others will not be. Employers need to engage in a careful diligence exercise prior to rolling out any monitoring-type activities. It is also unwise to assume that just because one monitoring use or activity may be possible, additional unrelated forms of monitoring technology will also
be legally permitted. Various data regulators receive complaints from
employees that refer to employer monitoring issues, from CCTV, to
email, to location, to biometrics, etc.

Sample Legal Issues Arising


17.02 Some examples of the legal issues and concerns that can arise
for employers and organisations as a result of the actions of their employ-
ees include:
●● vicarious liability;1
●● defamation;
1 Note the recent Morrisons case, where the employer was not held vicariously liable
for an errant employee’s internet upload of employee personal data. See WM Morrison Supermarkets plc v Various Claimants [2020] UKSC 12 (1 April 2020). Employers should not assume generally,
however, that they may never become vicariously liable.


●● copyright infringement;
●● confidentiality breaches and leaks;
●● data protection;
●● contract;
●● inadvertent contract formation;
●● harassment;
●● abuse;
●● discrimination;
●● computer crime;
●● interception offences;
●● criminal damage;
●● data loss;
●● data damage;
●● eCommerce law;
●● arms and dual use good export restrictions;
●● non-fatal offences against the person;
●● child pornography;
●● policies;
●● on-site/off-site;
●● bring your own device (BYOD), etc;
●● online abuse, snooping, etc.
These are examples of an expanding list of concerns for organisations.
Obviously, some of these will be recalibrated in importance depending
on the type of organisation, the business sector and what its activities
are. Confidentiality, for example, may be critically important for certain
organisations, but less important for other organisations.

Employee Misuse of Email, Internet, etc


17.03 Even before the increase in data breach and data loss exam-
ples, there is a significant number of examples where problems have
arisen for organisations as a result of the activities of certain employ-
ees. Increasingly, organisations face the problem that the activity of an
errant employee on the internet, television, or social networks can go
viral instantly on a global scale, with adverse reputational damage for
the organisation.
There are many instances where organisations have felt the need to
dismiss or otherwise discontinue relationships with employees. Some
examples include Equifax, Uber, Yahoo!, Swedish Transport Authority,
Norwich EU Healthcare; Lois Franxhi; Royal & Sun Alliance Liverpool;


C&W Birmingham; Rolls Royce Bristol; Claire Swire/Bradley Chait; Ford; Weil Gotshal; Sellafield, Target, Sony, etc. As these instances are
ever expanding it would be impossible to provide a definitive list.
The frequency and scale of recent breaches of security, eg Yahoo! (500 million users), Cambridge Analytica, TalkTalk, Cathay Pacific, DSG Retail, Sony Playstation (70 million individuals’ personal data2 and
25 million in another3) make the topicality and importance of data secu-
rity compliance ever more important. Even after that, Sony was hacked
again in the example of the film ‘The Interview,’ which was arguably
even more costly and damaging. The largest UK data losses appear to be
Yahoo! and Revenue and Customs (loss of discs with the names, dates
of birth, bank and address details for 25 million individuals).4 Marks and
Spencer was also involved in a data breach as were a number of hotels,
and the mobile company TalkTalk.
There are many UK cases involving substantial fines for data pro-
tection breaches. These include where there was data relating to large
numbers of Data Subjects, to small numbers or even single Data ­Subjects
but which were particularly special or sensitive. These range across
commercial, non-profit and state organisations. The Crown Prosecution
Service (CPS) was fined £325,000. Humberside Police was fined
£130,000 relating to a data loss incident regarding one Data Subject (but
involving particularly special and sensitive data – a rape victim inter-
view on disc). Brighton and Sussex University Hospitals NHS Trust was
fined £325,000 by the ICO in relation to a data loss incident.5 Zurich
Insurance was fined £2.3m for losing data of 46,000 customers.6 Sony
was fined £250,000. Even the British & Foreign Bible Society was fined
£100,000. Note, these are all even prior to the new EU General Data
Protection Regulation (GDPR) level fines and penalties.
More recently the retail group DSG Retail was fined £500,000 for a
breach affecting 14 million individuals;7 airline Cathay Pacific was fined

2 See, for example, G Martin, ‘Sony Data Loss Biggest Ever,’ Boston Herald,
27 April 2011.
3 See, for example, C Arthur, ‘Sony Suffers Second Data Breach With Theft of 25m
More User Details,’ Guardian, 3 May 2011.
4 See, for example, ‘Brown Apologises for Record Loss, Prime Minister Gordon Brown
has said he “Profoundly Regrets” the Loss of 25 Million Child Benefit Records,’ BBC,
21 November 2007, at http://news.bbc.co.uk/2/hi/7104945.stm.
5 See, for example, ‘Largest Ever Fine for Data Loss Highlights Need for Audited
Data Wiping,’ ReturnOnIt, at http://www.returnonit.co.uk/largest-ever-fine-for-data-
loss-highlights-need-for-audited-data-wiping.php.
6 See, for example, J Oates, ‘UK Insurer Hit With Biggest Ever Data Loss Fine,’
The Register, 24 August 2010, at http://www.theregister.co.uk/2010/08/24/data_loss_
fine/. This was imposed by the Financial Services Authority (FSA).
7 ICO v DSG Retail [2020] 9/1/20.

17.03 Employee Monitoring Issues

£500,000 for breaches lasting four years;8 and Doorstep Dispensaree was fined £275,000 for data storage and data security issues.9
HP suspended 150 employees in one instance, which is one of the
potential actions available.
Prudential was also fined in relation to the mishandling of personal
data, while a Barclays employee was fined in court for personal use and
access to customer account personal data.
Senior executives at Sony and Target lost their jobs as a result of data
breach incidents. Some of the examples where employees have lost their
jobs as a result of data breach incidents include senior management as
well as IT and tech executives. It is probable that many more junior
employees have lost jobs and positions out of the headlines.
In a more creative solution, Ford issued a deletion amnesty to 20,000 employees, allowing them to delete potentially objectionable material before a new or amended policy took effect.
Some of the legal concerns referred to above will now be looked at in
greater detail.

Contract
17.04 It is possible for employees to agree and enter into contracts via electronic communications. This is an increasing concern since the legal recognition of electronic contracts in the eCommerce legislation.
This can include the inadvertent creation of legally binding contracts
for the organisation.
In addition, it is possible that an employee may create or agree partic-
ular contract terms, delivery dates, etc, electronically which they would
not otherwise do.
These can be contracts, or terms, which the organisation would not
like to be bound by.
It is also possible that breach of contract issues could arise.

Employment Equality
17.05 It can be illegal to discriminate, or permit discrimination, on
grounds of, for example:
●● gender;
●● race;

8 ICO v Cathay Pacific Airways Limited [2020] 4/3/2020.


9 ICO v Doorstep Dispensaree Limited [2019] 20/12/19.


●● age;
●● sexual orientation;
●● family status;
●● religious beliefs;
●● disability;
●● membership of minority groups.
Instances of such discrimination can occur on the organisation’s computer systems. Even though the organisation may not initially be aware of the instance, it can still face legal consequences.

Harassment
17.06 Harassment via the organisation’s computer systems is also something which could have consequences for an organisation. Examples could include the circulation of written words, pictures or other material which a person may reasonably regard as offensive. The organisation could be held liable for employees’ discriminatory actions unless it takes reasonable steps to prevent them, or to deal with them appropriately once they arise. See the Protection from Harassment Act 1997 (as amended).

Online Abuse
17.07 This is a growing problem. However, it is also an issue for
organisations when their employees are the victims or perpetrators of
online abuse. It is only a matter of time before organisations are sued for
the actions of their employees using the organisation’s systems. Organi-
sations can also assist in tackling these issues, and some indeed have
more of an ability to do so than others.

Offline Abuse
17.08 Offline abuse or offline disparagement, defamation, etc can also
arise. In the Sony data breach incident regarding the film ‘The Interview’
there are examples of employee details being published online after the
breach. In addition, various emails were published in which senior employees were at least disparaging of others directly or indirectly engaged by the organisations. Instances such as this fall short of best practice for both the employees and the organisation, and liability issues can also arise.


Child Pornography
17.09 There is a serious risk and concern for organisations where this
could occur on the organisation’s computer systems or devices. See, for
example, the Protection of Children Act 1978 (as amended), Criminal
Justice Act 1988 (as amended), and the Criminal Justice and Public
Order Act 1994 (as amended).

Dealing with the Employee Risks


17.10 It is important for all organisations to engage in a process of risk assessment of the exposures which can be created as a result of their employees’ use of their computer systems and devices.
Organisations should:
●● identify the areas of risk;
●● assess and evaluate those risks;
●● engage in a process to eliminate/reduce the risks identified as
appropriate;
●● formalise an overall body of employee corporate communications
usage policies;
●● implement appropriate and lawful technical solutions appropriate to
dealing with the identified risks;
●● implement an ongoing strategy to schedule reviews of the new and
emerging risks, and to update and or implement appropriate proce-
dures and policies, including the existing employee corporate com-
munications usage policies;
●● ensure data security for employee-related data;
●● undertake risk assessments and indeed impact assessments, particularly where changes occur in relation to employee-related data;
●● advise all employees of the official policy if and when a breach incident may arise – and for the avoidance of doubt, well in advance of breach problem issues arising;
●● advise and ensure that the specific employees who might have to deal with breach issues directly are fully aware of and trained on their functions. This will be an ongoing task and will also involve different teams of employees (and others who may be external experts and professional advisors, eg IT, IT forensics, lawyers, counsel, public relations, crisis management, etc).

Employee Corporate Communications Usage Policies


17.11 Given the general risks, and the risks specific to an organisation
identified in a review process, it is critical that the organisation implement


appropriate policies in an overall body of employee corporate communications usage policies.
It is important to note that there is no one fixed solution, or one single policy on its own, which is sufficient to deal with these issues. Equally, a single policy on its own would not be sufficient to give comfort to the organisation in terms of (a) dealing with the risks and issues, and (b) ensuring that the organisation has implemented sufficient policies to allow it to act appropriately to meet specific risks, including disposal, as they arise.
For example, if an organisation discovers a particular unauthorised
activity on its computer systems undertaken by one of its employees, eg
illegal file sharing or an employee hosting an illegal gambling website
on its servers, it may decide that it wishes to dismiss the errant employee.
But can it legally dismiss the employee? In the first instance, the organi-
sation should ideally be in the position of being able to point to a specific
clause in the employee’s written contract of employment and a specific
breach, or breaches, of one or more of the organisation’s employee cor-
porate communications usage policies.
If it can, then the likelihood is that the organisation’s legal advisors
would be comfortable recommending that the dismissal is warranted,
justified and lawful. However, if the answer is no, in that there is no writ-
ten clause which is clearly breached as a result of the specific activities,
even though carried out during the hours of employment, at the organi-
sation’s office, on the organisations computer system and utilising the
organisation’s computer servers, the organisation would not have any
legal comfort in dismissing the employee.
The organisation may not be able to obtain pro-dismissal legal advice. If the company decided to proceed with the dismissal in any event, it could run the legal risk that the employee may decide to legally challenge the dismissal under employment law. The employee would
be assisted in that there is no express written clause which is breached.
Unfortunately, the organisation may have to defend such an action on
the basis of common law or implied obligations and duties owed by the
employee. There may be no guarantee of success.

Focus of Organisational Communications Usage Policies


17.12 Organisations have a need for a comprehensive suite of organi-
sational communications usage policies. This includes use of the organi-
sation’s devices, telephone, voicemail, email and Internet. The policies
need to address:
●● security;
●● telephone;
●● email;


●● text;
●● internet;
●● mobile and portable devices;
●● electronic storage devices;
●● home and off-site usage;
●● vehicle usage and location usage;
●● employee Internet of Things (IoT) usage as it may affect the
organisation.
This will involve an ongoing task, not just a one-off exercise. It also
needs coordinated team responsibility.

Key Issues to Organisational Communications Usage Policies
17.13 The key aspects to consider in relation to the employee corpo-
rate communications usage policies, include:
●● ownership;
●● usage;
●● authorisation;
●● confidentiality;
●● authentication;
●● retention and storage;
●● viruses;
●● disciplinary matters;
●● dismissal matters;
●● security;
●● awareness;
●● transparency;
●● breach notification;
●● DPOs;
●● planning and data protection by design and by default.
These will vary depending on each organisation.
From a data protection regime perspective, one of the key issues for an
organisation is the ability to monitor employees if and when needed, as
distinct from general monitoring which may be difficult or not permissible.

Data Protection and Employee Monitoring


17.14 Organisations may wish to monitor for various reasons. Some
of these include:
●● continuity of operations during staff illness;
●● maintenance;

220
Human Right 17.15

●● preventing/investigating allegations of misuse;


●● assessing/verifying compliance with software licensing obligations;
●● complying with legal and regulatory requests for information, etc.
Organisations may also wish to monitor in order to prevent risks and to
identify problems as they arise. When issues arise they will wish to deal
appropriately with the employee.
Employers and agencies recruiting for them need to carefully con-
sider the legality of internet and social media monitoring of employees
and applicants. It is a growing practice by all accounts, and appears more permissible in the US than in the EU. However, organisations should
not assume that they are permitted to monitor, record, keep files, and
make decisions based upon personal information and personal data
gathered online/electronically unknown to the individual employee or
applicant.10
The issue of the right to privacy and data protection of the employee
and the rights and interests of the organisation arise. The issue of the
right to privacy is a complex subject beyond the scope of this discus-
sion. However, it is an expanding area, particularly after the introduc-
tion of the Human Rights Act 1998 (as amended).11 (Note that there
has been some discussion as to whether some or all of the Human
Rights Act may be repealed after Brexit. This issue will require careful
review in order to identify any such changes, and their implications.)
Monitoring raises interlinking issues of privacy, human rights and data
protection.

Human Right
17.15 The following is subject to continuing Brexit considerations.
The Human Rights Act 1998 (as amended)12 means that the European

10 See, for example, C Brandenburg, ‘The Newest Way to Screen Job Applicants: A
Social Networker’s Nightmare,’ Federal Communications Law Journal (2007–2008)
(60) 597; D Gersen, ‘Your Image, Employers Investigate Job Candidates Online More
than Ever. What Can You Do to Protect Yourself?’ Student Law (2007–2008) (36)
24; AR Levinson, ‘Industrial Justice: Privacy Protection for the Employed,’ Cornell
Journal of Law and Public Policy (2009) (18) 609–688; I Byrnside, ‘Six Degrees of
Separation: The Legal Ramifications of Employers Using Social Networking Sites
to Research Applicants,’ Vanderbilt Journal of Entertainment and Technology Law
(2008) (2) 445–477; and M Maher, ‘You’ve Got Messages, Modern Technology
Recruiting Through Text Messaging and the Intrusiveness of Facebook,’ Texas Review
of Entertainment and Sports Law (2007) (8) 125–151.
11 Human Rights Act 1998, at http://www.legislation.gov.uk/ukpga/1998/42/contents.
12 Human Rights Act 1998, at http://www.legislation.gov.uk/ukpga/1998/42/contents.


Convention for the Protection of Human Rights and Fundamental Freedoms13 is incorporated into and applicable in the UK.
Article 8 of the Convention provides that everyone has the right to respect for their private and family life, and their home and correspondence. It states that:
‘(1) Everyone has the right to respect for … private and family life, … home
and … correspondence;
(2) There shall be no interference by a public authority with the exercise of
this right except such as is in accordance with the law and is necessary
in a democratic society in the interests of national security, public safety
or the economic wellbeing of the country, for the prevention of disorder
or crime, for the protection of health or morals, or for the protection of
the rights and freedoms of others.’14

The Halford case15 is an example of the interface of human rights and data protection. The Leveson inquiry on press hacking, and its recommendations, are also relevant in considering these issues.
As noted above, the nature and extent of any changes to human rights
legislation and/or divergencies from EU norms hold large potential
significance.

Application of Data Protection Regime


17.16 In terms of employee monitoring, employers are entitled to
exercise reasonable control and supervision over their employees and
their use of organisational resources.
Employers are also entitled to promulgate policies to protect their
property and good name and to ensure that they do not become inadvert-
ently liable for the misbehaviour of employees.
Equally, however, employees do retain privacy rights and data protec-
tion rights that must be respected by the organisation.
Organisations must consider:
●● the culture of organisation;
●● whether there is an understanding and expectation that employees can use the organisation’s computers for personal purposes;
●● whether the organisation accessing the employee’s communications
without permission would be unfair and unlawful obtaining;
●● history and practice of adherence to existing policies – or, worse,
accepted conduct of non-compliance with existing policies.

13 At http://www.echr.coe.int/Documents/Convention_ENG.pdf. See also Balla, R, ‘Constitutionalism – Reform on Data Protection Law and Human Rights,’ Curentul Juridic (2011) (47) 61–74.
14 At http://www.echr.coe.int/Documents/Convention_ENG.pdf.
15 Halford v UK (1997) IRLR 471, (1997) 24 EHRR 523.


Monitoring involves careful consideration of policy and practice, as well as interrelated legal issues of privacy/human rights/data protection.
Monitoring or accessing employee emails or tracking of employee web
browsing without permission would likely be unfair obtaining. Location
issues are increasingly important – for a number of reasons. In terms of
employees, particular issues arise as to if and when an organisation can,
or should, monitor or record the location of employees eg via their elec-
tronic and hardware devices, etc. It is important, therefore, that organisa-
tions have appropriate mechanisms and policies to reduce the risks, and
then also to deal with issues as they arise. Reactively responding and
trying to set up policies once a problem has arisen should be avoided.
As pointed out above, a policy that is not explicitly in place before the incident (or indeed a bad or outdated policy) cannot be relied upon in invoking particular terms against the employee.

ILO Code
17.17 Organisations should also consider the ILO Code of Practice on Protection of Workers’ Personal Data,16 which recommends that:
●● employees must be informed in advance of reasons, time schedule,
methods and techniques used and the personal data collected;
●● the monitoring must minimise the intrusion on privacy of employees;
●● secret monitoring must be in conformity with legislation or on foot
of suspicion of criminal activity or serious wrongdoing;
●● continuous monitoring should only occur if required for health and
safety or protection of property.

EDPB/WP29
Processing in the Employment Context
17.18 The WP29, now replaced by the EDPB, has long been con-
cerned about employee monitoring and surveillance issues in terms of
privacy and personal data. It issued an Opinion on the Processing of
Personal Data in the Employment Context.17 It states that ‘no business
interest may ever prevail on the principles of transparency, lawful pro-
cessing, legitimisation, proportionality, necessity and others contained in
data protection laws.’

16 At http://www.ilo.org/wcmsp5/groups/public/---ed_protect/---protrav/---safework/documents/normativeinstrument/wcms_107797.pdf.
17 Opinion 8/2001 on the processing of personal data in the employment context, WP48.


It states that when processing workers’ personal data, employers should always bear in mind fundamental data protection principles such
as the following:
●● FINALITY: Data must be collected for a specified, explicit and
legitimate purpose and not further processed in a way incompatible
with those purposes;
●● TRANSPARENCY: As a very minimum, workers need to know
which data is the employer collecting about them (directly or from
other sources), which are the purposes of processing operations
envisaged or carried out with these data presently or in the future.
Transparency is also assured by granting the Data Subject the right
to access to his/her personal data and with the Controllers’ obliga-
tion of notifying supervisory authorities as provided in national law;
●● LEGITIMACY: The processing of workers’ personal data must be
legitimate. Article 7 of the Directive lists the criteria making the
processing legitimate;
●● PROPORTIONALITY: The personal data must be adequate, rel-
evant and not excessive in relation to the purposes for which they
are collected and/or further processed. Assuming that workers have
been informed about the processing operation and assuming that
such processing activity is legitimate and proportionate, such a pro-
cessing still needs to be fair with the worker;
●● ACCURACY AND RETENTION OF THE DATA: Employment
records must be accurate and, where necessary, kept up to date. The
employer must take every reasonable step to ensure that data inac-
curate or incomplete, having regard to the purposes for which they
were collected or further processed, are erased or rectified;
●● SECURITY: The employer must implement appropriate technical
and organisational measures at the workplace to guarantee that the
personal data of his workers is kept secured. Particular protection
should be granted as regards unauthorised disclosure or access;
●● AWARENESS OF THE STAFF: Staff in charge or with responsi-
bilities in the processing of personal data of other workers’ need
to know about data protection and receive proper training. With-
out an adequate training of the staff handling personal data, there
could never be appropriate respect for the privacy of workers in the
workplace.18

18 At Opinion 8/2001 on the processing of personal data in the employment context,


WP48.

224
EDPB/WP29 17.19

In relation to consent, the WP29 adds that where, as a necessary and unavoidable consequence of the employment relationship, an employer has to process personal data, it is misleading if it seeks to legitimise this processing through consent.19 Reliance on consent as a legitimising
mechanism should be confined to cases where the worker has a genuine
free choice and is subsequently able to withdraw the consent without
detriment.20 Consent is thus nuanced, considered, specific, and confined.
More recently the EDPB has issued the following:
●● Guidelines 1/2020 on processing in the context of connected vehi-
cles and mobility related applications;
●● Guidelines 3/2019 on processing of personal data through video
devices.
Electronic Communications
17.19 WP29 also issued an opinion on the review of ePrivacy rules.21
Under the heading Data Protection by Design (DPbD) it advocates
the application of the principle of data minimisation and the deployment
of Privacy Enhancing Technologies (PETs) by Controllers. It also calls
upon legislators to re-enforce this principle.22 More recently the EDPB
issued Guidelines 4/2019 on Article 25 Data Protection by Design and
by Default.
WP29 also issued a working document on the surveillance of elec-
tronic communications in the workplace.23
It also issued an Opinion on the concept of personal data which is also
relevant in considering the definition of personal data.24
In relation to employee health records, also note the Working Docu-
ment on the processing of personal data relating to health in electronic
health records (EHR).25

19 See above.
20 See above.
21 Opinion 2/2008 on the review of the Directive 2002/58/EC on privacy and electronic
communications (ePrivacy Directive), WP150.
22 Opinion 2/2008 on the review of the Directive 2002/58/EC on privacy and electronic
communications (ePrivacy Directive), WP150.
23 Working document on the surveillance of electronic communications in the work-
place, WP55.
24 Opinion 4/2007 on the concept of personal data. At http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2007/wp136_en.pdf.
25 At http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2007/wp131_en.pdf.


When outsourcing and dealing with Processors it is worth considering the Opinion on the concepts of data ‘Controller’ and data ‘Processor.’26
In relation to video surveillance, WP29 issued an Opinion on the pro-
cessing of personal data by means of video surveillance.27

Employment Contracts,Terms, Policies


17.20 As indicated above, organisations need to consider:
●● whether the contract provision has been properly incorporated;
●● issues of limitation of liability;
●● issues of exclusion of warranties;
●● disclaimers;
●● graduated disclaimers;
●● ongoing risk assessment;
●● continual review and updating;
●● risk, risk identification, risk assessment, and risk minimisation;
●● enforceability;
●● etc.

Processing Compliance Rules


17.21 If an organisation is collecting and processing personal data, it
must comply with the data protection regime in respect of:
●● Principles of data protection;
●● non-sensitive personal data Lawful Processing Conditions;
●● special personal data Lawful Processing Conditions;
●● direct marketing (DM) requirements;
●● data security requirements;
●● Processor contract requirements;
●● transfer requirements;
●● risk;
●● risk assessment.

26 WP29, Opinion 1/2010. At http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2010/wp169_en.pdf.
27 WP29, Opinion 4/2004. At http://ec.europa.eu/justice/data-protection/article-29/documentation/opinion-recommendation/files/2004/wp89_en.pdf.

226
Processing Compliance Rules 17.22

Suggested Guidelines
17.22 Some suggested guidelines to consider generally are set out
below. However, it is always suggested that appropriate professional
legal and technical advice be sought in particular circumstances:
●● comply with the processing compliance rules set out above;
●● ensure fair obtaining, collecting and processing of personal data;
●● compliance must be ensured at the time of data capture NOT
subsequently;
●● get it right first time;
●● the lessons of British Gas and other examples are that in a worst case
you may have to delete the database and start again from the begin-
ning, or re-do your collection and notification process;
●● consider opt-in and opt-out consent changes;
●● consider the purpose or purposes for the data collection and which
must be specified;
●● provide information to the Data Subject when collecting;
●● consider whether the Data Subject is the source (direct) or whether a
third-party is the source (indirect);
●● the purpose must be specified and lawful;
●● use or purpose to which the data collected will be put must be clear
and defined. Otherwise it could be deemed too vague and unfair and
ultimately an unfair collection which would undermine the initial
consent given;
●● if disclosure occurs, it must be specified, clear and defined;
●● data security measures must be assessed and implemented;
●● data security includes physical security;
●● data security also includes technical security;
●● measures must be put in place to prevent loss, alteration or
destruction;
●● other legislation may also apply such as in relation to hacking, crim-
inal damage, etc;
●● the personal data must be kept accurate and kept up to date;
●● personal data must be kept for no longer than is necessary;
●● consider that there are many different types of personal data;
●● the organisational or business need requirement must be identified;
●● how personal data is collected must be considered, planned and
recorded;
●● check Principles of data protection;
●● check Lawful Processing Conditions;
●● explicit consent may be required for collecting and processing spe-
cial personal data;


●● identify and deal with new processes and procedures in advance to ensure data protection compliance. One cannot assume that the
organisation can obtain a fair and lawful consent after a go-live.
This is increasingly the case after the GDPR;
●● identify new contacts, contracts, collections and consents in advance
to ensure data protection compliance;
●● policy for protecting and securing employee’s personal data;
●● policy for notifying employees of data breaches involving their
personal data;
●● insurance issues, particularly regarding data breaches;
●● risk assessment and planning issues.

The Rights of Employee Data Subjects


17.23 The rights of Data Subjects can be summarised as including:
●● right of access;
●● right to establish if personal data exists;
●● right to be informed of the logic in automatic decision taking;
●● right to prevent processing likely to cause damage or distress;
●● right to prevent processing for direct marketing;
●● right to prevent automated decision taking;
●● right to compensation;
●● right to rectify inaccurate data;
●● right to rectification, blocking, erasure and destruction;
●● right to complain to ICO;
●● right to go to court.
Chapter III of the new GDPR refers to rights of Data Subjects. These
include:
●● right to transparency (Article 5; Article 12);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights re automated individual decision making, including profiling
(Article 22);
●● DPbD (Article 25);


●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to Data Subject (Article 34);
●● Data Protection Officer (DPO);
●● remedies, liability and sanctions (Chapter VIII).

Monitoring Case
17.24 There has been considerable publicity in relation to the ECHR
case of Bărbulescu v Romania.28 This was frequently reported as hold-
ing that employers can monitor employee communications. The ECHR
press release heading states: ‘Monitoring of an employee’s use of the
internet and his resulting dismissal was justified’. Popular commentary
suggested and implied that all employers can monitor all employee
communications including email and internet. A strong and significant
word of caution should be noted by organisations. The previous edition
noted:
●● the case does not find on a general basis that employers can monitor
all employee email;
●● the case does not find on a general basis that employers can monitor
all employee internet usage;
●● the case does not find on a general basis that employers can monitor
all employee communications;
●● the case technically related to a claim that national laws are insufficient; it is not a case per se between the employee and the employer;
●● there is a significant dissent encompassed within the judgment –
which some will feel is a more correct assessment of the law – and
provides better guidance to employers;
●● the majority decision acknowledges difficult legal and factual issues
and certain matters missing from the claim file from the respective
sides;
●● the employer had accessed two online communications’ accounts of
the employee, one ostensibly for business and one personal;
●● notwithstanding that the personal account was clearly personal and
private, and there being various apparent issues, the majority seemed
to ignore this clear breach by the employer;

28 Bărbulescu v Romania, ECHR, Case No 61496/08, 12 January 2016.


●● details of one or both accounts were then disclosed to other employees in the organisation and discussed by them openly – this further
breach by the employer was also ignored by the majority;
●● in terms of the account established ostensibly for business purposes
and which was accessed by the employer, this contained personal
data and sensitive personal data;
●● the employer was not able to establish any reason or suspicion of
wrongdoing to trigger a wish to access or monitor this account;
●● as such, a considered analysis would suggest that there was no basis
for general monitoring in this instance, and seems to lead to the con-
clusion that there is also a breach by the employer here;
●● this important point is ignored in the majority decision;
●● it is well established and recognised that employees do not give up
their privacy and data protection rights at the door of the employer.
The majority decision appears to ignore and not take into account
that employers must have some genuine reason or suspicion in the
specific case to go beyond the employee’s rights to monitor and
access online communications. The issue related to Yahoo Messen-
ger accounts, not the official company email;
●● it does not appear that specific warnings to employees beyond a
general communications policy (which did not refer to monitoring
issues) was issued, or issued to the employee in question, or at least
this was not established. For some the majority decision ignores this
important point. Again, this leans strongly towards there being an
employer breach.
It was also queried whether a different result would have been achieved
before the EU Court of Justice in this instance.
On balance, it was suggested in the previous edition that employers
should remain cautious against global employee monitoring in the UK
and EU. Employees, supervisory authorities, EDPB and others might
suggest the decision is regressive not progressive, or is simply wrong
or must be interpreted cautiously and confined to unusual facts and an
unusual decision. The logic of the majority decision could be criticised
as being in contrast to the established body of law and legal understanding in relation to these issues. The majority suggests a logic which means that the result justifies the means, and that issues of law, rights and process are irrelevant. The majority do not point to any lawful basis for employer
monitoring and access prior to any problem issue being raised with the
employee. It has to be cautioned that organisations may better be guided
by the dissent analysis than the frequently misunderstood blanket moni-
toring headline of the case.


Since the last edition it can now be noted that the previous level
of understanding has been restored. The case Bărbulescu v Romania
was the subject of a superior decision of the Grand Chamber of the
ECHR in 2017.29 This resulted in a different decision in favour of the
employee. The court said that the employee’s rights were not properly
considered.
The court concluded that the applicant’s communications in the work-
place were covered by the concepts of ‘private life’ and ‘correspond-
ence’ and that, in the circumstances of the case, Article 8 of the
Convention was applicable.
The original courts did not sufficiently examine whether the aim
pursued by the employer could have been achieved by less intrusive
methods than accessing the actual contents of the employee’s com-
munications. The original courts also failed to consider the seriousness
of the consequences of the monitoring and the subsequent discipli-
nary proceedings. The Grand Chamber noted that the employee had
received the most severe disciplinary sanction, namely dismissal. It
held that:
‘the domestic courts failed to determine, in particular, whether the applicant
had received prior notice from his employer of the possibility that his commu-
nications on Yahoo Messenger might be monitored; nor did they have regard
either to the fact that he had not been informed of the nature or the extent
of the monitoring, or to the degree of intrusion into his private life and cor-
respondence. In addition, they failed to determine, firstly, the specific reasons
justifying the introduction of the monitoring measures; secondly, whether the
employer could have used measures entailing less intrusion into the appli-
cant’s private life and correspondence; and thirdly, whether the communica-
tions might have been accessed without his knowledge’.

‘[T]he Court considers that the domestic authorities did not afford adequate
protection of the applicant’s right to respect for his private life and corre-
spondence and that they consequently failed to strike a fair balance between
the interests at stake. There has therefore been a violation of Article 8 of the
Convention’.

Conclusion
17.25 Organisations at different times may feel a temptation to engage
in monitoring of employees. No matter how tempting, this needs to be

29 Bărbulescu v Romania, ECHR, Case No 61496/08, Grand Chamber, 5 September 2017.


checked in order to see if it is possible and to remain data protection
compliant. While there are naturally risks and issues to deal with, the
starting point will always be one of proportionate responses which ensure
data protection compliance. The need for, or more limited circumstances
of, notification to the ICO will need to be fully considered by
organisations. Employee data security breaches will also require the
employer to consider the rules on notifying the ICO of the event and
circumstances of the data breach.

Part 3
Outward Facing Organisational
DP Obligations

Chapter 18
Outward Facing Issues

Introduction
18.01 Beyond the inward facing employee related sphere, organisa-
tions also need to consider the outward facing sphere. For many organi-
sations the outward facing data protection issues frequently dominate.
They can also be the most contentious. These issues raise signifi-
cant data protection concerns and compliance issues to be dealt with.
Because these issues are related to consumers, they are more obvious
to the media and gain more public attention when things go wrong –
whereas employee data issues are less obvious or less transparent.
Some of the queries that can arise include:
●● consumer data rights;
●● data over-collection;
●● data misuse by corporations;
●● hidden collections;
●● hidden uses;
●● consent issues;
●● data breach issues and how breaches arose;
●● whether individuals are notified frequently enough and early enough
of breach events;
●● whether companies take sufficient remedial measures after a breach
arises and whether they should pay breach monitoring and insur-
ance for the individuals affected (eg Equifax data breach and the
monitoring for third party use of breach victim details by errant third
parties);
●● designated Data Protection Officers (DPOs);
●● forms of outward facing personal data to consider;
●● compliance with the data protection regime when dealing with exist-
ing customers;

●● how organisations contact potential customers yet remain data
protection compliant;
●● whether an organisation can engage in direct marketing (DM);
●● whether profiling is possible generally, or even in more limited
circumstances;
●● whether users who are not customers can raise additional issues;
●● whether security considerations still arise;
●● the possible existence of even more security issues to consider;
●● higher security obligations for customers and users;
●● how the erasure and Right to be Forgotten regime works; and
●● the new risk assessment and impact assessment provisions.
The general increase in internet usage, profiling, advertising, marketing
and social media are also important issues to consider. Abuse whether
online or via social media or related websites is increasingly recognised
as a problem to be dealt with. Civil sanctions and actions are likely to
increase in the instances where they need to be invoked. In some instances
and in some jurisdictions organisations may also need to consider crimi-
nal law sanctions.

New Outward Facing Changes


18.02 Some of the new Data Protection Act 2018 (DPA 2018) changes
which apply to outward facing organisational issues include:
●● reducing the child age in relation to information society services
from the EU General Data Protection Regulation (GDPR) from
16 to 13 years of age (DPA 2018, s 9);
●● numerous clarification and changes from the GDPR in terms of
Special Personal Data;
●● variations in relation to rights issues from the GDPR;
●● safeguards and related issues (eg, DPA 2018, s 19);
●● codes of practice (DPA 2018, ss 121–128);
●● enforcement (DPA 2018, Part 6);
●● offences (DPA 2018, ss 196–200);
●● restrictions on requirement to produce certain records (DPA 2018,
s 184);
●● avoidance of certain contractual terms relating to health records
(DPA 2018, s 185);
●● rights issues (DPA 2018, s 186);
●● new representation bodies acting for Data Subjects which will have
to be dealt with (see DPA 2018, ss 187–190);
●● provisions relating to Special Personal Data (eg, DPA 2018, Sch 1).


Some of the new GDPR changes which apply to outward facing organi-
sational issues include:
●● repeal of the EU Data Protection Directive (DPD 1995) (Article 94);
●● WP29 and EDPB (Recitals 72, 77, 105, 119, 124, 136, 140, 143,
168; Articles 35, 40–43, 47, 51, 52, 58–62, 64 and 65);
●● background and rationale (Recitals 1, 8, 11, 13);
●● context, objectives, scope of GDPR (Articles 1, 2 and 3);
●● obligations (Recitals 32, 39, 40, 42, 43–47);
●● security (Recitals 2, 16, 19, 39, 49, 50, 52, 53, 71, 73, 75, 78, 81, 83,
91, 94, 104, 112; Articles 2, 4, 5, 9, 10, 30, 32, 35, 40, 47);
●● processing;
●● rights (Recitals 1, 2, 4, 9, 10, 11, 13, 16, 19, 38, 39, 41, 47, 50–54,
57, 63, 65, 66, 68–71, 73–81, 84–86, 89, 91, 94, 98, 102, 104, 108,
109, 111, 113, 114, 116, 122, 129, 137, 139, 141–143, 153–156, 162,
164, 166, 173; Articles 1, 4–7, 9–22);
●● proceedings (Recitals 52, 80, 129, 143, 147; Articles 23, 40, 78, 79,
81, 82);
●● establishment (Recitals 22, 36, 52, 65, 100, 117, 122, 124, 126, 127;
Articles 3, 4, 9, 17, 18, 21, 37, 42, 49, 54, 56, 57, 60, 62, 65, 70, 79);
●● transfers (Recitals 6, 48, 101–103, 107, 108, 110–114, 153;
Articles 4, 13–15, 23, 28, 30, 40, 42, 44–49, 70, 83, 85, 88, 96);
●● ICO, etc;
●● new bodies (Recital 142; Articles 9, 80);
●● notification/registration replaced (Recital 89);
●● exceptions/exemptions (Recitals 14, 15, 17);
●● lawful processing and consent (Recitals 10, 32, 33, 38–40, 42–46,
49–51, 54, 63, 65, 68, 69, 71, 83, 111, 112, 116, 138, 155, 161, 171;
Articles 4–9, 13, 14, 17, 18, 20, 22, 23, 32, 40, 49, 82, 83);
●● online identifiers (Recitals 30, 64; Articles 4, 87);
●● sensitive and special personal data (Recitals 10, 51, 53, 54, 59, 65,
71, 80, 91, 97; Articles 6, 9, 22, 27, 30, 35, 37, 47);
●● children (Recital 38);
●● health data (Recitals 35, 45, 52–54, 63, 65, 71, 73, 75, 91, 112, 155,
159; Articles 4, 9, 17, 23, 36, 88);
●● GDPR definitions (Article 4);
●● new processing rules: obligations (Articles 5–11);
●● new (data protection) Principles (Article 5);
●● lawfulness of processing: lawful processing conditions (Article 6);
●● child’s consent;
●● processing special categories of personal data (Article 9);
●● processing re criminal convictions and offences data (Article 10);
●● processing not requiring identification (Article 11);

●● Controllers and Processors (Chapter IV);
●● responsibility of the Controller (Article 24);
●● joint Controllers (Article 26);
●● Processor (Article 28);
●● processing under authority of Controller and Processor (Article 29);
●● records of processing activities (Article 30);
●● representatives of Controllers not established in EU (Article 27);
●● co-operation with supervisory authority;
●● security of processing (Article 32);
●● notifying data breach to supervisory authority (Article 33);
●● communicating data breach to Data Subject (Article 34);
●● data protection impact assessment and prior consultation (Chapter
IV, Section 3);
●● data protection impact assessment (Article 35);
●● prior consultation (Article 36);
●● new DPO (Chapter IV; Section 4; Article 37);
●● position (Article 38) and tasks of new Data Protection Officer (DPO)
(Article 39);
●● general principle for transfers (Recitals 6, 48, 101–103, 107, 108,
110–114, 153; Articles 4, 13–15, 23, 28, 30, 40, 42, 44–49, 70, 83,
85, 88, 96; Article 44);
●● transfers via adequacy decision (Article 45);
●● transfers via appropriate safeguards (Article 46);
●● transfers via binding corporate rules (Article 47);
●● transfers or disclosures not authorised by EU law (Article 48);
●● derogations for specific situations (Article 49);
●● new processing rules: Data Subject rights (Recitals 1–4, 9–11, 13,
16, 19, 38, 39, 41, 47, 50–54, 57, 59, 63, 65, 66, 68–71, 73–81,
84–86, 89, 91, 94, 98, 102, 104, 108, 109, 111, 113, 114, 116, 122,
129, 137, 141–143, 153–156, 162, 164, 166, 173; Articles 1, 4–7,
9–22; Chapter III);
●● right to transparency (Recitals 13, 39, 58, 60, 71, 78, 100, 121;
Articles 5; Chapter III, Section 1; Articles 12–14, 26, 40–43, 53, 88);
●● data access rights (Chapter III, Section 2);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● rectification and erasure (Section 3);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);

●● right to object (Article 21);
●● rights against automated individual decision making, including pro-
filing (Article 22);
●● Data Protection by Design and by Default (DPbD) (Article 25);
●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to Data Subject (Article 34);
●● DPO (contact details);
●● remedies, liability and sanctions (Chapter VIII);
●● right to judicial remedy against supervisory authority;
●● right to effective judicial remedy against Controller or Processor
(Article 79);
●● representation of Data Subjects (Article 80);
●● suspension of proceedings (Article 81);
●● right to compensation and liability (Article 82);
●● codes of conduct and certification;
●● general conditions for imposing administrative fines (Article 83);
●● penalties (Article 84);
●● specific data processing situations (Chapter IX);
●● safeguards and derogations: public interest/scientific/historical
research/statistical archiving processing;
●● obligations of secrecy (Article 90).

Data Protection Officer


18.03 Chapter IV, Section 4 of the new GDPR refers to Data Protection
Officers (DPOs) and the obligation for organisations to appoint DPOs.
The Controller and the Processor shall designate a DPO in any case
where:
●● the processing is carried out by a public authority or body, except for
courts acting in their judicial capacity;
●● the core activities of the Controller or the Processor consist of
processing operations which, by virtue of their nature, their scope
and/or their purposes, require regular and systematic monitoring of
Data Subjects on a large scale; or
●● the core activities of the Controller or the Processor consist of pro-
cessing on a large scale of special categories of data pursuant to
Article 9 and data relating to criminal convictions and offences
referred to in Article 10 (Article 37(1)).
A group of undertakings may appoint a single DPO provided that a DPO
is easily accessible from each establishment (Article 37(2)).

Where the Controller or the Processor is a public authority or
body, a single DPO may be designated for several such authorities
or bodies, taking account of their organisational structure and size
(Article 37(3)).
In cases other than those referred to in Article 37(1), the Controller
or Processor or associations and other bodies representing categories of
Controllers or Processors may or, where required by EU or state law
shall, designate a DPO. The DPO may act for such associations and other
bodies representing Controllers or Processors (Article 37(4)).
The DPO shall be designated on the basis of professional quali-
ties and, in particular, expert knowledge of data protection law and
practices and the ability to fulfil the tasks referred to in Article 39
(Article 37(5)).
The DPO may be a staff member of the Controller or Processor, or
fulfil the tasks on the basis of a service contract (Article 37(6)).
The Controller or the Processor shall publish the contact details
of the DPO and communicate them to the supervisory authority
(Article 37(7)).
The Controller or the Processor shall ensure that the DPO is involved,
properly and in a timely manner, in all issues which relate to the protec-
tion of personal data (Article 38(1)).
The Controller or Processor shall support the DPO in performing the
tasks referred to in Article 39 by providing resources necessary to carry
out those tasks and access to personal data and processing operations,
and to maintain their expert knowledge (Article 38(2)).
The Controller or Processor shall ensure that the DPO does not receive
any instructions regarding the exercise of those tasks. He or she shall not
be dismissed or penalised by the Controller or the Processor for perform-
ing their tasks. The DPO shall directly report to the highest management
level of the Controller or the Processor (Article 38(3)).
Data Subjects may contact the DPO on all issues related to the pro-
cessing of the Data Subject’s data and the exercise of their rights under
the GDPR (Article 38(4)).
The DPO shall be bound by secrecy or confidentiality concern-
ing the performance of their tasks, in accordance with EU or state law
(Article 38(5)).
The DPO may fulfil other tasks and duties. The Controller or Proces-
sor shall ensure that any such tasks and duties do not result in a conflict
of interests (Article 38(6)).
The DPO shall have at least the following tasks:
●● to inform and advise the Controller or the Processor and the employ-
ees who are processing personal data of their obligations pursuant to
the GDPR and to other EU or state data protection provisions;

●● to monitor compliance with the GDPR, with other EU or state data
protection provisions and with the policies of the Controller or Pro-
cessor in relation to the protection of personal data, including the
assignment of responsibilities, awareness-raising and training of
staff involved in the processing operations, and the related audits;
●● to provide advice where requested as regards the data protec-
tion impact assessment and monitor its performance pursuant to
Article 35;
●● to cooperate with the supervisory authority;
●● to act as the contact point for the supervisory authority on issues
related to the processing, including the prior consultation referred
to in Article 36, and consult, where appropriate, with regard to any
other matter (Article 39(1)).
The DPO shall in the performance of their tasks have due regard to the
risk associated with the processing operations, taking into account the
nature, scope, context and purposes of processing (Article 39(2)).

Data Protection by Design and by Default


18.04 There are new obligations for organisations. This applies par-
ticularly as regards the introduction of outward facing uses and changes
which involve personal data (see details below).

Types of Outward Facing Personal Data


Customers, Prospects and Users
18.05 What forms of outward facing personal data must organisations
be concerned with? The types of personal data to be considered
relate to:
●● current customers;
●● past customers;
●● prospective customers;
●● users who may not be registered customers;
●● distinguishing adults and children.
Organisations will often be concerned with collecting personal prefer-
ences, account details, etc, from customers and leads. Direct marketing
(DM) and additional avenues for commercialisation are also a frequent
business imperative. Increasingly commercial organisations may be
tempted by profiling technologies. Retailers may also be tempted by

increasing sophistication in camera and visual recognition technology
for commercial uses.1
In addition, there may be individuals who access the organisation’s
website but who are not actual customers. Indeed, they may never
become customers. Yet certain personal data may still be collected by
the organisation. In this instance, it is also necessary for the organisation
to comply with the data protection Principles and related obligations.

How to be Outward Facing Compliant


Customers, Prospects and Users
18.06 How can an organisation focus on the elements that will ensure
data protection compliance when dealing with the personal data of cus-
tomers, prospects and/or users? In terms of personal data collected and
processed relating to the above categories, the following must be com-
plied with, namely, the:
●● prior information requirements;
●● data protection Principles;
●● Lawful Processing Conditions;
●● Special Personal Data Lawful Processing Conditions;
●● security requirements;
●● notification requirements (note new changes in this regard);
●● risk and risk assessments, particularly as regards new processes and
activities;
●● response plans to remediate data breach;
●● response plans to notify the data regulator (or data regulators) and
notify individuals;
●● response plans to assist individuals to monitor and protect against
their data being misused by errant third party attackers.
Particular considerations can also arise in relation to users who may
not be customers in the normal and contractual sense.
Prior Information Requirements
18.07 The organisation is obliged to inform customers, etc, in a fair
and transparent manner that it is intending to collect and process their
personal data. It must identify exactly whom the Controller is, ie the
legal entity collecting and processing the personal data. The categories

1 This issue arose for consideration in the first data protection audit of Facebook by the
Irish supervisory authority. See also, for example, S Monteleone, ‘Privacy and Data
Protection at the time of Facial Recognition: Towards a New Right to Digital Iden-
tity?’ European Journal of Law and Technology (2012) (3:3).

of personal data being collected must be outlined, as well as the purposes
linked to the categories of personal data. If any third parties will
be involved in processing the personal data, these must be referred to and
identified, at least by category, when the personal data is being collected.
Particular information must also be produced where it is envisaged that
the personal data may be transferred outside of the jurisdiction.

Compliance with the Outward Facing Principles


18.08 In dealing with customer personal data, as well as potential cus-
tomers, etc, it is necessary for the organisation to comply with the Prin-
ciples of data protection.
Chapter II of the new GDPR refers to the Principles of data protection
in Article 5 as follows.
Personal data must be:
(a) processed lawfully, fairly and in a transparent manner in relation to
the Data Subject (‘lawfulness, fairness and transparency’);
(b) collected for specified, explicit and legitimate purposes and not
further processed in a manner incompatible with those purposes;
further processing for archiving purposes in the public interest, sci-
entific or historical research purposes or statistical purposes shall, in
accordance with Article 89(1), not be considered incompatible with
the initial purposes (‘purpose limitation’);
(c) adequate, relevant and limited to what is necessary in relation to the
purposes for which they are processed (‘data minimisation’);
(d) accurate and, where necessary, kept up to date; every reasonable
step must be taken to ensure that personal data that are inaccurate,
having regard to the purposes for which they are processed, are
erased or rectified without delay (‘accuracy’);
(e) kept in a form which permits identification of Data Subjects for no
longer than is necessary for the purposes for which the personal data
are processed; personal data may be stored for longer periods inso-
far as the data will be processed solely for archiving purposes in the
public interest, scientific or historical research purposes or statistical
purposes in accordance with Article 89(1) subject to implementation
of the appropriate technical and organisational measures required by
the Regulation in order to safeguard the rights and freedoms of the
Data Subject (‘storage limitation’);
(f) processed in a manner that ensures appropriate security of the per-
sonal data, including protection against unauthorised or unlawful
processing and against accidental loss, destruction or damage, using
appropriate technical or organisational measures (‘integrity and
confidentiality’) (Article 5(1)).

The Controller shall be responsible for and be able to demonstrate
compliance with Article 5(1) (‘accountability’ principle) (Article 5(2)).
These can be summarised as:
●● lawful, fair and transparent processing;
●● use purpose limitation;
●● data minimisation;
●● accuracy principle;
●● storage limitation principle;
●● security, integrity and confidentiality.
The Controller must be able to demonstrate compliance (accountability).
Compliance with the Principles of data protection applies to all cus-
tomers, etc, personal data collected and/or processed by an organisation.
It also applies regardless of registration notification requirements.
This requires that each of the Principles of data protection must be
considered separately and individually assessed by the organisation to
ensure that it can, and will continue to, be in compliance if and when it
collects and processes personal data of customers, etc.
This will involve particular consideration of the contracting model with
customers, how contracts are formed, the instant that they are formed
and how records are maintained in the event that these need to be exam-
ined and evidenced subsequently. Integral to this contract model pro-
cess is how the data protection elements, data protection policies, data
protection clauses, data protection prior information requirements and
customer consent, as appropriate, are complied with. Is the organisation
proposing to ask questions which are not strictly required for the provi-
sion of the service or goods? Are questions being asked which are wholly
unrelated to the actual service or product? If so, the third Principle of
data protection may be breached. If the product or service is time lim-
ited, and no further product or service is envisaged, it would be difficult
for the organisation to permanently keep the personal data. This could
conflict with the second, third and or fifth Principle of data protection.
Depending on the type of service, products and/or relationship with
customers, etc, whether physical or online, even more particular
attention may need to be paid to fulfilling compliance with certain
Principles of data protection.

Customers, etc, and Ordinary Personal Data Lawful


Processing Conditions
18.09 What are the Lawful Processing Conditions when an organisa-
tion is processing personal data regarding customers, etc? The Lawful
Processing Conditions are required to be complied with, in addition to

the Principles of data protection. In order to collect and process
personal data, in addition to complying with the above Principles of data
protection, organisations must comply or fall within one of the follow-
ing general personal data Lawful Processing Conditions in dealing with
customer, etc, personal data:
●● the customer, etc, whom the personal data is about has consented to
the processing for a specified purpose;
●● the processing is necessary for the performance of a contract to
which the customer is party or in order to take steps at the request of
the customer prior to entering into a contract;
●● the processing is necessary for compliance with a legal obligation to
which the Controller is subject;
●● the processing is necessary in order to protect the vital interests of
the customer or of another natural person;
●● the processing is necessary for the performance of a task carried out
in the public interest or in the exercise of official authority vested in
the controller;
●● the processing is necessary for the purposes of the legitimate inter-
ests pursued by the Controller or by a third party, except where such
interests are overridden by the interests or fundamental rights and
freedoms of the customer which require protection of personal data,
in particular where the customer is a child.
Organisations prior to collecting and processing customer, etc, personal
data should identify what personal data they are seeking, why, and which
of the Lawful Processing Conditions they will be complying with. This
will differ according to the organisation, sector and particular activity
envisaged.
Frequently, organisations may wish to rely upon the consent option.
In that instance, there should be documented policies and/or contracts
whereby the customer is informed and apprised of the proposed data
processing and freely consents to same. The manner of consent should be
recorded. If it is not recorded, it could present difficulties subsequently
where it is needed to be proven that consent was obtained for a particular
customer or for a whole database of customers.
A further issue arises. Many organisations over a long period of time
will make changes to their business activities. There may also be changes
to the contract and transaction models which can mean changes to the
processing activities and contracts, policies, etc, relating to data protec-
tion and consent. The date of the changeover, the nature of the changes,
and means of recording and maintaining a record of the consent also
need to be maintained by the organisation in an easily accessible man-
ner. The obligation to keep records is a critical practical and compliance
issue
for organisations.

These issues can be more pronounced for organisations which change
their business models and processes more frequently. Many internet
companies will, therefore, have frequent additional challenges.
As noted above, frequently, an organisation would seek to fall within
the contract, consent or legitimate interests conditions above in relation to
customers, etc.
However, an organisation may seek to maintain compliance based on
the legitimate interests condition. This could mean, on occasion, that an
organisation seeks to legitimise processing activities by saying the cus-
tomer has entered a contract and that the processing is a legitimate and
necessary consequence of that. However, organisations seeking to adopt
this model should seek professional advice beforehand. It is quite easy
for an organisation to feel that it is compliant, but then for additional
activities, commercialisation and/or direct marketing to occur which
may not be transparent, fair or warranted.
In any event, the Principles of data protection must be complied with
regardless of the particular Lawful Processing Condition.

Customers, etc, Special Personal Data Lawful Processing


Conditions
18.10 Organisations generally, or more frequently organisations
involved in particular sectors, may wish to know certain information
which falls within the special personal data categories, eg health data or
sexual data. In the case of customers, etc, special personal data, an
organisation must, in addition to the data protection Principles, be able
to comply or fall within one of the Special Personal Data Lawful Pro-
cessing Conditions.
The Special Personal Data Lawful Processing Conditions require one
of the following:
1 the Data Subject has given explicit consent to the processing of
those personal data for one or more specified purposes, except
where Union or state law provide that the prohibition may not be
lifted by the Data Subject;
2 processing is necessary for the purposes of carrying out the obliga-
tions and exercising specific rights of the Controller or of the Data
Subject in the field of employment and social security and social
protection law in so far as it is authorised by Union or state law or a
collective agreement pursuant to state law providing for appropriate
safeguards for the fundamental rights and the interests of the Data
Subject;

3 processing is necessary to protect the vital interests of the Data
Subject or of another natural person where the Data Subject is
physically or legally incapable of giving consent;
4 processing is carried out in the course of its legitimate activities with
appropriate safeguards by a foundation, association or any other
not-for-profit body with a political, philosophical, religious or trade
union aim and on condition that the processing relates solely to the
members or to former members of the body or to persons who have
regular contact with it in connection with its purposes and that the
personal data are not disclosed outside that body without the consent
of the Data Subjects;
5 processing relates to personal data which are manifestly made pub-
lic by the Data Subject;
6 processing is necessary for the establishment, exercise or defence of
legal claims or whenever courts are acting in their judicial capacity;
7 processing is necessary for reasons of substantial public interest, on
the basis of Union or state law which shall be proportionate to the
aim pursued, respect the essence of the right to data protection and
provide for suitable and specific measures to safeguard the funda-
mental rights and the interests of the Data Subject;
8 processing is necessary for the purposes of preventive or occupa-
tional medicine, for the assessment of the working capacity of the
employee, medical diagnosis, the provision of health or social care
or treatment or the management of health or social care systems and
services on the basis of law or pursuant to contract with a health
professional and subject to the conditions and safeguards referred to
in paragraph 3;
9 processing is necessary for reasons of public interest in the area of
public health, such as protecting against serious cross-border threats
to health or ensuring high standards of quality and safety of health
care and of medicinal products or medical devices, on the basis of
law which provides for suitable and specific measures to safeguard
the rights and freedoms of the data subject, in particular professional
secrecy;
10 processing is necessary for archiving purposes in the public interest,
scientific or historical research purposes or statistical purposes in
accordance with Article 89(1) based on Union or state law which
shall be proportionate to the aim pursued, respect the essence of the
right to data protection and provide for suitable and specific meas-
ures to safeguard the fundamental rights and the interests of the Data
Subject.
Furthermore, regulations enacted also set out further obligations in rela-
tion to the processing of special personal data.

It should be noted that the ability of an organisation to actually come
within one of the Special Personal Data Lawful Processing Conditions is
more restricted than the general personal data conditions. This is because
the data protection regime considers that organisations have less legiti-
mate interest in the processing of special personal data. It also recognises
the greater importance that legislators and individuals attach to the sensi-
tive categories of personal data.

Customers, etc, and Security Requirements


18.11 What security issues arise for customer, etc, related personal
data? Customers, etc, will be concerned to ensure that their personal data
is not inadvertently accessed, disclosed or used in a manner which they
have not consented to and are unaware of. This is a legal requirement
under the data protection regime. Appropriate security measures must
also be established. These obligations are becoming ever more present
with examples of unauthorised access, disclosure, hacks, etc, such as
Yahoo!, Cathay Pacific, Uber, DSG Retail, TalkTalk, Sony, Ashley
­Madison, Vtech (the children’s toy manufacturer), Hyatt hotels, etc.
In the case of outward facing personal data, such categories will
generally present significant differences to employee related personal
data. Generally there will be more customer, etc, related personal data.
It will also be more widely accessible within the organisation. Fewer
people within an organisation need to have access to employee files
and personal data. Customers, etc, personal data tends to be spread
across more than one single location or database. Frequently, the per-
sonal data are obtained through a variety of sources, and which are
more diverse than inward facing personal data. This all means that
there are greater data security issues and access issues to be considered
by the organisation.
Increasingly, attacks and breaches that come to public attention tend
to relate to outward facing personal data. This means that there is greater
reason to maintain and/or enhance these security measures to avoid the
consequences of publicity, official enquiries, enforcement proceedings,
or complaints and/or proceedings from Data Subjects.
Data security is also an issue to carefully consider in relation to the
new and developing areas of cloud computing, outsourcing, and new
models and categories of data processing.
Organisations should be aware of the importance of maintaining sepa-
rate databases and of ensuring that these are not openly accessible.
Encryption, access restrictions, separate locations, etc, are just some of
the security precautions to be considered.
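Purely as an illustrative sketch (not drawn from the legislation or ICO guidance, and using an invented customer record layout), the separation and pseudonymisation precautions described above might be expressed in code as follows, with direct identifiers replaced by keyed hashes and the key held apart from the pseudonymised records:

```python
import hmac
import hashlib

# Hypothetical example: in practice the secret key would be held
# separately (eg in a key vault), never alongside the pseudonymised data.
SECRET_KEY = b"example-key-held-elsewhere"

def pseudonymise(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    Without the key, the original identifier cannot be recovered from
    this token, supporting the database separation described above.
    """
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()

def split_record(record: dict) -> tuple[dict, dict]:
    """Split a customer record into an identity entry and a
    pseudonymised transactional entry, to be stored separately."""
    token = pseudonymise(record["email"])
    identity = {"token": token, "email": record["email"]}
    transactional = {"token": token, "purchases": record["purchases"]}
    return identity, transactional

record = {"email": "[email protected]", "purchases": ["widget"]}
identity, transactional = split_record(record)
assert "email" not in transactional  # no direct identifier in this store
```

The design choice here is that a breach of the transactional store alone exposes no direct identifiers; re-linking requires both the identity store and the separately held key.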


Direct Marketing
18.12 Most commercial organisations will wish to engage in direct
marketing (DM). For some organisations this will be an important core
activity.
The data protection regime has extensive provisions dealing with
direct marketing. There are requirements to inform the Data Subject that
they may object by a written request and free of charge to their personal
data being used for direct marketing (DM) purposes.
If the Controller anticipates that personal data kept by it will be pro-
cessed for purposes of direct marketing it must inform the persons to
whom the data relates that they may object by means of a request in
writing to the Controller and free of charge.
However, a customer is entitled at any time by notice in writing to a
Controller to require the Controller at the end of such period (as is rea-
sonable in the circumstances) to cease, or not to begin, processing for
the purposes of direct marketing (DM) personal data in respect of which
they are the Data Subject.
Two individuals (ie, separate to the organisation) were fined £440,000
by the ICO for sending unsolicited marketing spam messages.2 More
recently, Black Lion Marketing was fined £171,000 for unsolicited
texts;3 CRDNN Limited was fined £500,000 for 193 million unsolicited
calls;4 and True Visions Productions was fined £120,000 for consent and
transparency failings.5
Note the updated rules under the ePrivacy Regulation (and PECR)
once finalised.

Consequences of Non-Compliance
18.13 If the organisation fails to assess and organise compliance pro-
cedures in advance of undertaking the collection and processing of per-
sonal data of customers, etc, it will inevitably be operating in breach of
the data protection regime. Any personal data collected will be unlawfully
held. Equally, if originally compliant but during operation one of the data
protection Principles, Lawful Processing Conditions, and/or security
requirements is breached, the personal data processed will be question-
able, particularly if new personal data is involved or the data are put to
new activities and uses.

2 ICO v Niebel and ICO v McNeish at https://ico.org.uk. Note, case was appealed.
3 ICO v Black Lion Marketing Limited [2020] 27/3/2020.
4 ICO v CRDNN Limited [2020] 2/3/2020.
5 ICO v True Visions Productions Limited [2019] 10/4/19.


What are the consequences? The collection and/or processing are ille-
gal. The organisation, as the Controller, can be the subject of complaints,
investigations and enforcement proceedings from the ICO. Depending
on the severity of the non-compliance, prosecutions can also involve the
directors and employees of the organisation, in addition to the organisa-
tion itself.
If convicted, the organisation could face significant fines. These will
be in addition to the bad publicity and media attention which a prosecu-
tion can bring.
There can be other consequences too. If the organisation relies heavily
on direct marketing (DM), or is a particular type of internet company,
the customer database of personal data can be one of the most signifi-
cant assets of the organisation. If the database is collected in breach of
the data protection regime, the organisation will not be able to establish
compliant data collections and consents. It could, therefore, be ordered
to delete the database.
Alternatively, an organisation may wish to sell its business or to seek
investors for the business. This is frequently the case in the technology
sector. However, as part of a potential purchaser or investor assessing
whether to proceed, it will undertake a due diligence examination of
the processes, procedures and documentation of the organisation. It will
request to see documented evidence of data protection compliance and
that the valuable database, etc, are fully data protection compliant. If this
cannot be established, serious question marks will arise and the transac-
tion may not proceed.

Users Versus Customers


18.14 Do separate issues arise for users and their personal data?
Where someone is a customer, there is more direct contact and therefore
more direct means for the organisation to engage with the customer in
a measured contract formation process. Consequently, it is possible to
ensure compliance through a considered and documented data protection
compliance process and to engage consent appropriately at an early stage
from the individual customer or potential customer.
Users, however, may not be customers. Therefore, customer contact
models which incorporate consent or other data protection legitimising
procedures for customers may leave a gap where non-customer users are
not presented with the same documentation, notices and sign-up docu-
ment sets. Organisations therefore need to consider users as separate
from normal customers. They must assess where the organisation inter-
acts with such users, physically or online, and what personal data may be
collected. Where personal data is collected from users, the organisation


needs to ensure compliance. This may mean that the organisation needs
to have a separate additional set of notices, policies and consent docu-
mentation in relation to users.

Conclusion
18.15 When organisations begin to look outwards, a separate range of
data collection possibilities will arise. The avenues for data collection
are more diverse. In addition, the intended uses to which the organisation
will put this type of personal data will be potentially greater. The data
protection Principles and Lawful Processing Conditions require particu-
lar consideration and configuration to the intended data processing activ-
ities of customer, etc personal data. Different security and enforcement
risks can arise and need to be protected against. It cannot be assumed that
everyone that the organisation may wish to collect personal data from
will be an actual customer. Therefore, organisations need to consider
how to ensure separate consent and notifications to this category of person.
An example of this may be cookies which may obtain personal data.
Even on the narrow issue of cookies, we see that consent and transpar-
ency are no longer linear; to be compliant they must now be multi-faceted
and multi-layered. Consent and compliance from a wider perspective are
also increasingly complex. The many competing interests that must be
met by outward facing data protection compliance make it more diffi-
cult, and also easier to miss a gap and become non-compliant.

Chapter 19
Data Protection and Privacy by
Design

Introduction
19.01 It has been suggested that ‘law should play a more active role in
establishing best practices for emerging online trends.’1 Data Protection by
Design (DPbD) and data protection by default are prime examples. One of
the most important and developing practical areas of data protection is the
concept of DPbD as referred to in the EU General Data Protection Regula-
tion (GDPR). Originally developed as a follow-on from the data protection
legal regime, it is now being recognised more widely, and is also being
explicitly referred to and recognised in primary legislation itself.
DPbD/PbD and data protection by default are important for organisa-
tions both in terms of being a legal obligation but also commercially in
terms of being a competitive advantage.2

Background
19.02 The concept of PbD is complementary to data protection law
and regulation. The idea is acknowledged to have started with Dr Ann
Cavoukian, previously the Information and Privacy Commissioner for
Ontario, Canada. She states that:
‘the increasing complexity and interconnectedness of information technologies
[requires] building privacy right into system design … the concept of

1 W McGeveran, ‘Disclosure, Endorsement, and Identity in Social Marketing,’ Illinois
Law Review (2009) (4) 1105–1166, at 1105.
2 See, for example, A Mantelero, ‘Competitive Value of Data Protection: The Impact
of Data Protection Regulation on Online Behaviour,’ International Data Privacy Law
(2013) (3:4) 229.


Privacy by Design (PbD), … describe[s] the philosophy of embedding privacy
proactively into technology itself – making it the default.’3

Principles of PbD
19.03 The Information and Privacy Commissioner for Ontario refers
to seven principles of PbD.4 These are set out below.
1 Proactive not Reactive; Preventative not Remedial
The Privacy by Design (PbD) approach is characterised by proactive
rather than reactive measures. It anticipates and prevents privacy
invasive events before they happen. PbD does not wait for privacy risks
to materialise, nor does it offer remedies for resolving privacy
infractions once they have occurred – it aims to prevent them from
occurring. In short, PbD comes before-the-fact, not after.
2 Privacy as the Default Setting
One point is certain – the default rules. PbD seeks to deliver the
maximum degree of privacy by ensuring that personal data are
automatically protected in any given IT system or business practice.
If an individual does nothing, their privacy still remains intact. No
action is required on the part of the individual to protect their
privacy – it is built into the system, by default.
3 Privacy Embedded into Design
PbD is embedded into the design and architecture of IT systems and
business practices. It is not bolted on as an add-on, after the fact.
The result is that privacy becomes an essential component of the
core functionality being delivered. Privacy is integral to the system,
without diminishing functionality.
4 Full Functionality – Positive-Sum, not Zero-Sum
PbD seeks to accommodate all legitimate interests and objectives
in a positive-sum ‘win-win’ manner, not through a dated, zero-sum
approach, where unnecessary trade-offs are made. PbD avoids the
pretence of false dichotomies, such as privacy vs security, demon-
strating that it is possible to have both.

3 At http://privacybydesign.ca/about/.
4 At http://www.privacybydesign.ca/content/uploads/2009/08/7foundationalprinciples.pdf.


5 End-to-End Security – Full Lifecycle Protection


PbD, having been embedded into the system prior to the first ele-
ment of information being collected, extends securely throughout
the entire lifecycle of the data involved – strong security measures
are essential to privacy, from start to finish. This ensures that all
data are securely retained, and then securely destroyed at the end of
the process, in a timely fashion. Thus, PbD ensures cradle to grave,
secure lifecycle management of information, end-to-end.
6 Visibility and Transparency – Keep it Open
PbD seeks to assure all stakeholders that whatever the business prac-
tice or technology involved, it is in fact, operating according to the
stated promises and objectives, subject to independent verification.
Its component parts and operations remain visible and transparent,
to users and providers alike. Remember, trust but verify.
7 Respect for User Privacy – Keep it User-Centric
Above all, PbD requires architects and operators to keep the inter-
ests of the individual uppermost by offering such measures as strong
privacy defaults, appropriate notice, and empowering user-friendly
options. Keep it user-centric.5

GDPR
Data Protection by Design (DPbD)
19.04 The Commission proposed an enhanced data protection regime
including DPbD.6 Article 25 of the GDPR refers to data protection by
design and by default. This is an increasingly important area in data
protection.
Data Subject’s rights and freedoms and legitimate interests and com-
pliance increasingly require planning and pre-problem solving.

5 At http://www.privacybydesign.ca/content/uploads/2009/08/7foundationalprinciples.pdf.
6 See GDPR, S Spiekermann, ‘The Challenges of Privacy by Design,’ Communications
of the ACM (2012) (55) 38–40; S Spiekermann and LF Cranor, ‘Engineering Privacy,’
IEEE Transactions on Software Engineering (2009) (35) 67–82; L Tielemans and
M Hildebrandt, ‘Data Protection by Design and Technology Neutral Law,’ Computer
Law and Security Review (2013) (29:5) 509.


Data Protection by Design and by Default (DPbD)


19.05 Article 25 of the new GDPR refers to DPbD and by default.
(Note also the related concept of Privacy by Design (PbD). In some ways
PbD is the impetus for the current DPbD rules).
Taking into account the state of the art, the cost of implementation
and the nature, scope, context and purposes of the processing as well as
the risks of varying likelihood and severity for rights and freedoms of
natural persons posed by the processing, the Controller shall, both at the
time of the determination of the means for processing and at the time of
the processing itself, implement appropriate technical and organisational
measures, such as pseudonymisation, which are designed to implement
data protection principles, such as data minimisation, in an effective
manner and to integrate the necessary safeguards into the processing in
order to meet the requirements of the GDPR and protect the rights of
Data Subjects (Article 25(1)).
The Controller shall implement appropriate technical and organisa-
tional measures for ensuring that, by default, only personal data which
are necessary for each specific purpose of the processing are processed.
This applies to the amount of data collected, the extent of their process-
ing, the period of their storage and their accessibility. In particular, such
measures shall ensure that by default personal data are not made accessi-
ble without the individual’s intervention to an indefinite number of natu-
ral persons (Article 25(2)).
An approved certification mechanism pursuant to Article 42 may be
used as an element to demonstrate compliance with the requirements set
out in this Article.
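By way of an engineering sketch only (Article 25 prescribes no particular code pattern, and the purposes and field names here are invented for the example), data protection by default might be expressed as collecting only the fields necessary for each declared purpose, with restrictive accessibility defaults:

```python
# Hypothetical mapping of processing purposes to the minimum fields needed.
NECESSARY_FIELDS = {
    "order_fulfilment": {"name", "address"},
    "newsletter": {"email"},
}

def minimise(submitted: dict, purpose: str) -> dict:
    """Keep only the personal data necessary for the specified purpose,
    reflecting the data minimisation by default idea in Article 25(2)."""
    allowed = NECESSARY_FIELDS[purpose]
    return {k: v for k, v in submitted.items() if k in allowed}

def default_profile_settings() -> dict:
    """By default, personal data are not made accessible to an indefinite
    number of persons without the individual's own intervention."""
    return {"profile_public": False, "searchable": False}

form = {"name": "A", "address": "B", "email": "C", "phone": "D"}
assert minimise(form, "newsletter") == {"email": "C"}
assert default_profile_settings()["profile_public"] is False
```

The point of the sketch is that the restrictive position is the one reached when nobody acts: extra fields are discarded unless a declared purpose needs them, and visibility must be opted into rather than out of.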
Prior to the new GDPR, the ICO had issued recommendations in rela-
tion to privacy notices, namely:
●● Privacy Impact Assessment Code of Practice (see below).
This now needs to be read in light of the GDPR changes.

ICO
19.06 DPbD is embraced by the ICO in the UK. The ICO refers to
DPbD by saying that ‘Privacy by Design is an approach whereby pri-
vacy and data protection compliance is designed into systems holding
information right from the start, rather than being bolted on afterwards
or ignored, as has too often been the case.’7

7 ICO, Privacy by Design at https://ico.org.uk.


It provides8 the following documents and guidance:


●● Privacy by Design report;9
●● Privacy by Design implementation plan;10
●● Privacy Impact Assessment (PIA) handbook;11
●● ICO technical guidance note on Privacy Enhancing Technologies
(PETs);12
●● Enterprise Privacy Group paper on PETs;
●● HIDE (Homeland security, biometric Identification and personal
Detection Ethics);
●● Glossary of privacy and data protection terms;
●● Privacy Impact Assessments – international study (Loughborough
University).
Privacy by Design Report
19.07 The ICO report on Privacy by Design Foreword notes that:
‘The capacity of organisations to acquire and use our personal details has
increased dramatically since our data protection laws were first passed. There
is an ever increasing amount of personal information collected and held about
us as we go about our daily lives … we have seen a dramatic change in the
capability of organisations to exploit modern technology that uses our infor-
mation to deliver services, this has not been accompanied by a similar drive to
develop new effective technical and procedural privacy safeguards. We have
seen how vulnerable our most personal of details can be and these should not
be put at risk.’13

In the report, Toby Stevens, Director of the Enterprise Privacy Group,
adds that:
‘This report is the first stage in bridging the current gap in the development
and adoption of privacy-friendly solutions as part of modern information sys-
tems. It aims to address the current problems related to the handling of per-
sonal information and put into place a model for privacy by design that will
ensure privacy achieves the same structured and professional recognition as
information security has today.’14

8 See above.
9 See above.
10 ICO, PbD ICO Implementation Plan, at https://ico.org.uk.
11 ICO, Privacy Impact Assessment, at https://ico.org.uk.
12 ICO, Privacy by Design, An Overview of Privacy Enhancing Technologies,
26 November 2008. At https://ico.org.uk.
13 ICO, Privacy by Design (2008). At https://ico.org.uk.
14 See above at 2.


The report describes PbD as follows:


‘The purpose of privacy by design is to give due consideration to privacy
needs prior to the development of new initiatives – in other words, to con-
sider the impact of a system or process on individuals’ privacy and to do this
throughout the systems lifecycle, thus ensuring that appropriate controls are
implemented and maintained.’15

The report refers to the various lifecycles that arise in an organisation.16
These can be products, services, systems and processes:
‘For a privacy by design approach to be effective, it must take into account the
full lifecycle of any system or process, from the earliest stages of the system
business case, through requirements gathering and design, to delivery, testing,
operations, and out to the final decommissioning of the system.

This lifetime approach ensures that privacy controls are stronger, simpler and
therefore cheaper to implement, harder to by-pass, and fully embedded in the
system as part of its core functionality.

However, neither current design practices in the private and public sectors,
nor existing tools tend to readily support such an approach. Current privacy
practices and technologies are geared towards ‘spot’ implementations and
‘spot’ verifications to confirm that privacy designs and practices are correct at
a given moment within a given scope of inspection.’17

ICO Recommendations
19.08 The ICO report makes a number of recommendations in relation
to PbD practice in the UK.18 These are:
‘Working with industry bodies to build an executive mandate for privacy
by design, supported by sample business cases for the costs, benefits and
risks associated with the processing of personal information, and promotion
of executive awareness of key privacy and identity concepts so that privacy is
reflected in the business cases for new systems.

Encouraging widespread use of privacy impact assessments throughout the
systems lifecycle, and ensuring that these assessments are both maintained
and published where appropriate to demonstrate transparency of privacy
controls.

Supporting the development of cross-sector standards for data sharing both
within and between organisations, so that privacy needs are harmonised with

15 See above at 7.
16 ICO, Privacy by Design (2008).
17 ICO, Privacy by Design (2008). At https://ico.org.uk, at 7–8.
18 See above, summarised at 3, and in detail at 22–31.


the pressures on public authorities and private organisations to share personal
information.

Nurturing the development of practical privacy standards that will help
organisations to turn the legal outcomes mandated under data protection laws
into consistent, provable privacy implementations.

Promoting current and future research into PETs that deliver commercial
products to manage consent and revocation, privacy-friendly identification
and authentication, and prove the effectiveness of privacy controls.

Establishing more rigorous compliance and enforcement mechanisms
by assigning responsibility for privacy management within organisations to
nominated individuals, urging organisations to demonstrate greater clarity
in their personal information processing, and empowering and providing the
ICO with the ability to investigate and enforce compliance where required.

The government, key industry representatives and academics, and the ICO are
urged to consider, prioritise and set in motion plans to deliver these recom-
mendations and hence make privacy by design a reality.’19

The report highlights the need and context for PbD in relation to the
many instances of data loss in the UK (and internationally). It states that:
‘Consumer trust in the ability of public authorities and private organisations
to manage personal information is at an all-time low ebb. A stream of high-
profile privacy incidents in the UK over the past year has shaken confidence
in the data sharing agenda for government with associated impacts on high-
profile data management programmes, and businesses are having to work
that much harder to persuade customers to release personal information to
them.’20

PbD is part of the solution whereby ‘the evolution of a new approach to
the management of personal information that ingrains privacy principles
into every part of every system in every organisation.’21
Organisations need to address many key data protection issues,
such as:
‘assessing information risks from the individual’s perspective; adopting
­transparency and data minimisation principles; exploiting opportunities for
differentiation through enhanced privacy practices; and ensuring that privacy
needs influence their identity management agenda (since identity technologies
are invariably needed to deliver effective privacy approaches).’22

19 ICO, Privacy by Design (2008), note emphasis in original.


20 ICO, Privacy by Design (2008). At https://ico.org.uk, at 6.
21 See above.
22 See above.


The ICO has also issued more recent guidance on risk and assessment
issues in light of the GDPR.23

EDPB
19.09 In addition to the above, the EDPB has more recently set out
further guidance documentation, namely:
●● Guidelines 4/2019 on Article 25 Data Protection by Design and by
Default;
●● Recommendation 01/2019 on the draft list of the European Data
Protection Supervisor regarding the processing operations subject to
the requirement of a data protection impact assessment (Article 39.4
of Regulation (EU) 2018/1727).

Commercial Adoption
19.10 Many multinationals and other organisations are embracing
PbD and DPbD. Microsoft, for example, has endorsed PbD for many
years.
The EU data protection authorities, under WP29, and the French
data protection authority (CNIL), while investigating particular policy
amalgamation and changes (in WP29 and Data Protection Authorities/
Google),24 found certain changes to be in breach of data protection law.
Various remedial changes were required. One of these included that
Google incorporate the policy of DPbD into its products and services.
In addition to mandated DPbD per the new GDPR, it may be that spe-
cific DPbD requirements or recommendations come to be included in
individual decisions of SAs when dealing with audits, complaints and
investigations.

23 See ICO, ‘Data Protection Impact Assessments’ at https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-impact-assessments/.
24 See WP29 Letter to Google: http://www.cnil.fr/fileadmin/documents/en/20121016-
letter_google-article_29-FINAL.pdf; Appendix: http://www.cnil.fr/fileadmin/documents/
en/GOOGLE_PRIVACY_POLICY-_RECOMMENDATIONS-FINAL-EN.pdf;
French DP regulator (CNIL) statement: http://www.cnil.fr/english/news-and-events/
news/article/googles-new-privacy-policy-incomplete-information-and-uncontrolled-
combination-of-data-across-ser/.


Conclusion
19.11 An organisation must be proactive and not reactive. Data pro-
tection considerations need to be considered and built in from the earli-
est stage in processes which potentially impact data protection. They
must be transparent and visible. Problem issues are addressed and solu-
tions incorporated into the process design and process cycle so that pre-
problem solving is achieved for personal data. DPbD needs to be built
in, not merely added or considered once a problem arises at the end or
after go-live. However, PbD and DPbD means incorporating these con-
siderations into the whole life cycle and not just at the beginning and
or the end. It is also incorporated into engineering processes and not
just system consideration and data categories. PbD and DPbD is now a
key concept and requirement under the new GDPR. There is increasing
emphasis on privacy engineering as a part of the mechanisms needed
to achieve DPbD. DPbD is one of the more important innovations in
data protection generally. This is reflected in the GDPR. All organisa-
tions will need to appraise themselves of the concept and the regula-
tory compliance issues. The above Google requirement to implement
DPbD is also timely and reflects the importance that enterprise, both
large and small, needs to engage the benefits, as well as the require-
ments, of DPbD.
Privacy impact assessments25 are also referred to in the GDPR and
may also be relevant in the context of DPbD. DPbD and privacy impact
assessments are also relevant in the context of developing cloud services.26
Cloud services also raise important data protection and security consid-
erations and these should be carefully considered by customers as well
as providers.27 WP29 (now the EDPB) has also commented in relation

25 D Wright, ‘The State of the Art in Privacy Impact Assessments,’ Computer Law &
Security Review (2012) (28) 54–61.
26 Cloud and data protection reliability and compliance issues are referred to in R Clarke
‘How Reliable is Cloudsourcing? A Review of Articles in the Technical Media 2005–11,’
Computer Law & Security Review (2012) (28) 90–95. King and Raja also research
the area of the protections of sensitive personal data and cloud computing, see
NJ King and VT Raja ‘Protecting the Privacy and Security of Sensitive Customer Data
in the Cloud,’ Computer Law & Security Review (2012) (28) 308–319; J Peng, ’A New
Model of Data Protection on Cloud Storage,’ Journal of Networks (03/2014) (9:3) 666.
27 See, for example ICO, Guidance on the Use of Cloud Computing, at https://ico.org.
uk. WP29, Opinion 05/2012 on Cloud Computing, WP 196, 1 July 2012; P Lanois,
‘Caught in the Clouds: The Web 2.0, Cloud Computing, and Privacy?,’ Northwestern
Journal of Technology and Intellectual Property (2010) (9) 29–49; FM Pinguelo and
BW Muller ‘Avoid the Rainy Day: Survey of US Cloud Computing Caselaw,’ Boston


to Cloud issues,28 big data issues,29 Internet of Things (IoT),30 drones,31
apps on smart devices,32 cookies,33 device fingerprinting,34 anonymisation
and pseudonymisation techniques,35 purpose limitation,36 smart
devices,37 learning analytics, etc.

College Intellectual Property & Technology Forum (2011) 1–7; IR Kattan, ‘Cloudy
Privacy Protections: Why the Stored Communications Act Fails to Protect the Privacy
of Communications Stored in the Cloud,’ Vandenburg Journal of Entertainment and
Technology Law (2010–2011) (13) 617–656.
28 WP29, Opinion 02/2015 on C-SIG Code of Conduct on Cloud Computing; and
Opinion 05/2012 on Cloud Computing.
29 WP29, Statement of the WP29 on the impact of the development of big
data on the protection of individuals with regard to the processing of their personal
data in the EU, 2014. The ICO also issued guidance on big data and data protection
issues prior to the new GDPR in 2014.
30 WP29, Opinion 8/2014 on the Recent Developments on the Internet of Things.
31 WP29, Opinion 01/2015 on Privacy and Data Protection Issues relating to the Utilisa-
tion of Drones.
32 WP29, Opinion 02/2013 on apps on smart devices.
33 WP29, Cookie sweep combined analysis 2015; Opinion 04/2012 on Cookie Consent
Exemption.
34 WP29, Opinion 9/2014 on the application of Directive 2002/58/EC to device
fingerprinting.
35 WP29, Opinion 05/2014 on Anonymisation Techniques. The ICO has previously
issued guidance on anonymisation techniques prior to the GDPR in an anonymisation
code of practice (2012).
36 WP29, Opinion 03/2013 on purpose limitation.
37 WP29, Opinion 02/2013 on apps on smart devices.

Chapter 20
Enforcement Powers

Introduction
20.01 What happens if an organisation does not comply with the data
protection regime when dealing with customers, etc, personal data?
When things go wrong, there can be legal and publicity conse-
quences for the organisation. The impact of a data protection breach
can mean an immediate cross-team effort to deal with the data protec-
tion breach.
In dealing with an incident, and in planning for compliance with cus-
tomer, etc personal data, organisations should be aware of the various
ICO enforcement powers. These emphasise the importance of conse-
quences for non-compliance. Enforcement proceedings can be issued
by the ICO. Significant fines and penalties can result. Potentially also,
individual customers may decide to sue for damage, loss and breach of
their personal data rights.
Data protection compliance is also an important due diligence issue
when organisations are reviewed at time of sale and purchase, and indeed
at other times also. It can affect a sale or purchase as well as the value
involved. In some instances where there is non-compliance, a customer
database, which in some instances is the most valuable asset of a com-
mercial organisation, may have to be deleted. That is a real cost of
non-compliance and of not getting things right from day one.

Enforcement Notices
20.02 The ICO may issue enforcement notices to organisations. If the
ICO is satisfied that a Controller has contravened or is contravening any
of the data protection Principles, in relation to the use of customer, etc


personal data, the ICO may serve a notice (‘an enforcement notice’). The
enforcement notice will require compliance by the organisation with the
data protection Principles, or principle in question and as specified in
the notice.
Where the ICO is satisfied that a person has failed, or is failing, it may
give the person a written notice which requires the person:
●● to take steps specified in the notice;
●● to refrain from taking steps specified in the notice; or
●● both.
Breaches which might give rise to an enforcement notice can be varied
but include:
●● a breach of the Principles;
●● a breach of rights;
●● a breach of obligations of Controllers and Processors;
●● a breach of a requirement to communicate a data breach to the ICO
or to Data Subjects;
●● a breach of the transfer restrictions.
There are also other types of breaches referred to which can also give
rise to an enforcement notice.
The enforcement notice is, therefore, very wide in terms of what the
ICO can require. It can encompass all types of non-compliance or breach
in relation to customer, etc, personal data. The ICO considers whether
the contravention has caused or is likely to cause any person damage or
distress, in deciding whether to serve an enforcement notice (Data Pro-
tection Act 2018 (DPA 2018), s 150(2)). This would encompass non-
compliance in terms of collecting and processing customer, etc personal
data. However, the ICO may reserve such notices for more serious
instances of non-compliance or breach. It may be argued that an actual
data breach or data loss instance is naturally a serious incident and there-
fore may lean towards investigation and enforcement. This might be
particularly the case if the breach has not been remedied by the time of
notification of the breach to the ICO.
20.03 Section 151 of the DPA 2018 also refers to enforcement notices
in the context of rectification and erasure of personal data. The provisions
are detailed and would require attention when procedures are designed
to comply with the Principles, rights and other requirements of the data
protection legislation. Where a Data Subject feels that there is
non-compliance with their rights and/or a request in the context of rectification or
erasure, there is potential for these issues to arise if the request is being
refused, whether in whole or in part.


Assessment Notices
20.04 Organisations also need to be aware of assessment notices that
may be issued by the ICO. Section 146 of the DPA 2018 relates to assess-
ment notices. The ICO may serve a Controller with a notice (referred to
as an ‘assessment notice’) for the purpose of enabling the ICO to carry
out an assessment of whether the Controller or Processor has complied
or is complying with the data protection legislation.
An assessment notice may require the Controller or Processor to do
any of a range of particular actions, including permitting the ICO to enter
the premises. (Note that the ICO previously had to apply to court for an
order to enter the Cambridge Analytica premises).
Limitations and Restrictions
20.05 DPA 2018, s 147 refers to assessment notices and limitations
and restrictions. These should be consulted in the event of such a notice
arising.
Destroying/Falsifying Information and Documents
20.06 DPA 2018, s 148 refers to particular provisions and conse-
quences which may arise in the event of destroying or falsifying infor-
mation as regards information notices and assessment notices. Various
offences can arise.

Powers of Entry and Inspection


20.07 Section 154 and Sch 15 of the DPA 2018 refer to powers of
entry and inspection.

Request for Audit


20.08 DPA 2018, s 129 refers to a request for a consensual audit,
which organisations can request from the ICO.

Information Notices
20.09 DPA 2018, s 142 refers to information notices. The ICO may
issue an information notice:
●● requiring a Controller or Processor to provide information for the
purposes of carrying out the ICO’s functions; or


●● requiring any person to provide information required for investigating
suspected breaches or offences, or for determining whether the pro-
cessing of personal data is carried out by an individual in the course
of a purely personal or household activity.
It is an offence, in response to an information notice, knowingly or
recklessly to make a statement which is false in a material respect.

Information Orders
20.10 DPA 2018, s 145 refers to information orders. The ICO
can apply to court for an information order where a person fails to
comply with an information notice. The order would essentially direct
that certain action or information be provided.

Failure to Comply
20.11 A failure to comply with a notice, regardless of type, can have
serious consequences. Further investigations and actions, including court
actions directing assistance to the ICO can occur. In addition, depending
on what the ICO finds, penalties, fines and prosecutions can also ensue.

Unlawful Obtaining Etc of Personal Data


20.12 Section 170 of the DPA 2018 refers to unlawful obtaining etc,
of personal data. It is an offence for a person knowingly or recklessly:
●● to obtain or disclose personal data without the consent of the
Controller;
●● to procure the disclosure of personal data to another person without
the consent of the Controller or Processor; or
●● after obtaining personal data, to retain it without the consent of the
person who was the Controller in relation to the personal data when
it was obtained.
This is likely to be utilised increasingly frequently, particularly where
employees or agents seek to use, access, copy or transfer personal data
in a manner other than as envisaged by the Controller.

Re-identifying De-identified Personal Data


20.13 The DPA 2018 also introduces provisions in relation to re-
identifying personal data which was originally de-identified. Section 171


provides that it is an offence for a person knowingly or recklessly to
re-identify information that is de-identified personal data without the con-
sent of the Controller responsible for de-identifying the personal data. This
will be one of the more interesting sections to look at to see how it might
be interpreted and applied by organisations, researchers, the ICO and by
the courts. The extent of the defences specified will need to be considered.

Re-identification and Testing


20.14 There is also a related provision in s 172 of the DPA 2018 relat-
ing to effectiveness testing conditions and re-identification. This follows
on from some of the defence related provisions in s 171. An individual
or organisation will have to be particularly careful and considered (even
in advance) in order to avoid the offences and/or avail of the defences.
Issues of public interest, knowledge and intent may also be involved,
which are always difficult to confirm or assess definitively in advance
in terms of how they may be interpreted by a court.

Power of ICO to Impose Monetary Penalty


20.15 Section 155 of the DPA 2018 refers to the ICO imposing mon-
etary penalties and penalty notices. If the ICO is satisfied that a person:
●● has failed or is failing as specified; or
●● has failed to comply with an information notice, an assessment
notice, or an enforcement notice),
it can require the person to pay the amount specified.
The ICO must have regard to particular issues when deciding to issue
such a notice. A detailed list of such matters is contained in s 155(2)
and (3).
Some examples include:
●● the nature, gravity and duration of the failure;
●● the intentional or negligent character of the failure;
●● any action taken by the Controller or Processor to mitigate the dam-
age or distress suffered by Data Subjects;
●● past failures, etc.
Section 157 refers to the maximum amount of penalties. It states that in
relation to an infringement of the EU General Data Protection Regula-
tion (GDPR), the maximum amount of the penalty that may be imposed
by a penalty notice is the amount specified in GDPR, Article 83, or if an
amount is not specified there, the standard maximum amount.


There are also provisions in relation to penalty amounts as regards
the DPA 2018. The Secretary of State is also permitted to make additional
rules as regards penalty issues.

Prohibition of Requirement to Produce Certain Records


20.16 Section 184 refers to a prohibition on requirements to produce
certain records. It is an offence to require a person to provide or provide
access to relevant records in connection with:
●● the recruitment of an employee;
●● the continued employment of a person; or
●● a contract for the provision of services to them.
It is also an offence to require a person to give or give access to a relevant
record if:
●● the person is involved in the provision of goods, facilities or services
to the public or a section of the public; and
●● the requirement is a condition of providing or offering to provide
goods, facilities or services to the other person or a third party.
Section 185 also provides for the avoidance of certain contractual terms
relating to health records. If prohibited terms are included they are
deemed to be void.

Tasks
20.17 Without prejudice to other tasks set out under the GDPR, each
supervisory authority shall:
●● monitor and enforce the application of the GDPR;
●● promote public awareness and understanding of the risks, rules,
safeguards and rights in relation to processing. Activities addressed
specifically to children shall receive specific attention;
●● advise, in accordance with national law, the national parliament,
the government, and other institutions and bodies on legislative and
administrative measures relating to the protection of natural persons’
rights and freedoms with regard to processing;
●● promote the awareness of Controllers and Processors of their obliga-
tions under the GDPR;
●● upon request, provide information to any Data Subject concern-
ing the exercise of their rights under the GDPR and, if appropriate,
co-operate with the supervisory authorities in other states to this end;


●● handle complaints lodged by a Data Subject, or by a body, organisa-
tion or association in accordance with Article 80, and investigate, to
the extent appropriate, the subject matter of the complaint and inform
the complainant of the progress and the outcome of the investigation
within a reasonable period, in particular if further investigation or
coordination with another supervisory authority is necessary;
●● co-operate with, including sharing information and provide mutual
assistance to other supervisory authorities with a view to ensuring
the consistency of application and enforcement of the GDPR;
●● conduct investigations on the application of the GDPR, including on
the basis of information received from another supervisory authority
or other public authority;
●● monitor relevant developments, insofar as they have an impact on the
protection of personal data, in particular the development of infor-
mation and communication technologies and commercial practices;
●● adopt standard contractual clauses referred to in Articles 28(8)
and 46(2)(d);
●● establish and maintain a list in relation to the requirement for data
protection impact assessment pursuant to Article 35(4);
●● give advice on the processing operations referred to in Article 36(2);
●● encourage the drawing up of codes of conduct pursuant to
Article 40(1) and give an opinion and approve such codes of con-
duct which provide sufficient safeguards, pursuant to Article 40(5);
●● encourage the establishment of data protection certification
mechanisms and of data protection seals and marks pursuant to
Article 42(1), and approve the criteria of certification pursuant to
Article 42(5);
●● where applicable, carry out a periodic review of certifications issued
in accordance with Article 42(7);
●● draft and publish the criteria for accreditation of a body for moni-
toring codes of conduct pursuant to Article 41 and of a certification
body pursuant to Article 43;
●● conduct the accreditation of a body for monitoring codes of con-
duct pursuant to Article 41 and of a certification body pursuant to
Article 43;
●● authorise contractual clauses and provisions referred to in
Article 46(3);
●● approve binding corporate rules pursuant to Article 47;
●● contribute to the activities of the EDPB;
●● keep internal records of breaches of the GDPR and of measures
taken in accordance with Article 58(2);
●● fulfil any other tasks related to the protection of personal data
(Article 57(1)).


Each supervisory authority shall facilitate the submission of complaints
referred to in Article 57(1)(f), by measures such as a complaint submis-
sion form which can also be completed electronically, without excluding
other means of communication (Article 57(2)).
The performance of the tasks of each supervisory authority shall be
free of charge for the Data Subject and, where applicable, for the DPO
(Article 57(3)).
Where requests are manifestly unfounded or excessive, in particu-
lar because of their repetitive character, the supervisory authority may
charge a reasonable fee based on administrative costs, or refuse to act on
the request. The supervisory authority shall bear the burden of demon-
strating the manifestly unfounded or excessive character of the request
(Article 57(4)).
The DPA 2018, ss 115–117, and Schs 12 and 13, also refer to the gen-
eral functions of the ICO. Other sections also specify that certain reports
and reviews must be undertaken by the ICO (eg ss 139–141; Sch 17).

Powers
20.18 Pursuant to the new GDPR, each supervisory authority shall
have the following investigative powers:
●● to order the Controller and the Processor, and, where applicable, the
Controller’s or the Processor’s representative to provide any infor-
mation it requires for the performance of its tasks;
●● to carry out investigations in the form of data protection audits;
●● to carry out a review on certifications issued pursuant to Article 42(7);
●● to notify the Controller or the Processor of an alleged infringement
of the GDPR;
●● to obtain, from the Controller and the Processor, access to all per-
sonal data and to all information necessary for the performance of
its tasks;
●● to obtain access to any premises of the Controller and the Processor,
including to any data processing equipment and means, in accord-
ance with EU law or state procedural law (Article 58(1)).
Each supervisory authority shall have the following corrective powers:
●● to issue warnings to a Controller or Processor that intended process-
ing operations are likely to infringe provisions of the GDPR;
●● to issue reprimands to a Controller or a Processor where processing
operations have infringed provisions of the GDPR;
●● to order the Controller or the Processor to comply with the Data
Subject’s requests to exercise their rights pursuant to the GDPR;


●● to order the Controller or Processor to bring processing operations
into compliance with the provisions of the GDPR, where appropri-
ate, in a specified manner and within a specified period;
●● to order the Controller to communicate a personal data breach to the
Data Subject;
●● to impose a temporary or definitive limitation including a ban on
processing;
●● to order the rectification or erasure of data pursuant to Articles 16, 17
and 18 and the notification of such actions to recipients to whom the
personal data have been disclosed pursuant to Articles 17(2) and 19;
●● to withdraw a certification or to order the certification body to with-
draw a certification issued pursuant to Articles 42 and 43, or to order
the certification body not to issue certification if the requirements for
the certification are not or no longer met;
●● to impose an administrative fine pursuant to Article 83, in addition
to, or instead of measures referred to in this paragraph, depending on
the circumstances of each individual case;
●● to order the suspension of data flows to a recipient in a third country
or to an international organisation (Article 58(2)).
Each supervisory authority shall have the following authorisation and
advisory powers:
●● to advise the Controller in accordance with the prior consultation
procedure referred to in Article 36;
●● to issue, on its own initiative or on request, opinions to the national
parliament, the state government or, in accordance with national
law, to other institutions and bodies as well as to the public on any
issue related to the protection of personal data;
●● to authorise processing referred to in Article 36(5), if the law of the
state requires such prior authorisation;
●● to issue an opinion and approve draft codes of conduct pursuant to
Article 40(5);
●● to accredit certification bodies pursuant to Article 43;
●● to issue certifications and approve criteria of certification in accord-
ance with Article 42(5);
●● to adopt standard data protection clauses referred to in Article 28(8)
and in Article 46(2)(d);
●● to authorise contractual clauses referred to in Article 46(3)(a);
●● to authorise administrative agreements referred to in Article 46(3)(b);
●● to approve binding corporate rules pursuant to Article 47
(Article 58(3)).
The exercise of the powers conferred on the supervisory author-
ity pursuant to this Article shall be subject to appropriate safeguards,
including effective judicial remedy and due process, set out in EU and
state law in accordance with the Charter of Fundamental Rights of the
EU (Article 58(4)).
Each state shall provide by law that its supervisory authority shall
have the power to bring infringements of the GDPR to the attention of
the judicial authorities and where appropriate, to commence or engage
otherwise in legal proceedings, in order to enforce the provisions of the
GDPR (Article 58(5)).
Each state may provide by law that its supervisory authority shall
have additional powers to those referred to in Article 58(1), (2) and (3).
The exercise of these powers shall not impair the effective operation of
Chapter VII (Article 58(6)).

General Conditions for Imposing Administrative Fines


20.19 The new GDPR provides that each supervisory authority shall
ensure that the imposition of administrative fines pursuant to the GDPR
referred to in Article 83(4), (5) and (6) shall in each individual case be
effective, proportionate and dissuasive (Article 83(1)).
Administrative fines shall, depending on the circumstances of each
individual case, be imposed in addition to, or instead of, measures
referred to in Article 58(2)(a) to (h) and (j). When deciding whether to
impose an administrative fine and deciding on the amount of the admin-
istrative fine in each individual case due regard shall be given to the
following:
●● the nature, gravity and duration of the infringement taking into
account the nature, scope or purpose of the processing concerned as
well as the number of Data Subjects affected and the level of damage
suffered by them;
●● the intentional or negligent character of the infringement;
●● action taken by the Controller or Processor to mitigate the damage
suffered by Data Subjects;
●● the degree of responsibility of the Controller or Processor taking
into account technical and organisational measures implemented by
them pursuant to Articles 25 and 32;
●● any relevant previous infringements by the Controller or
Processor;
●● the degree of co-operation with the supervisory authority, in order
to remedy the infringement and mitigate the possible adverse effects
of the infringement;
●● the categories of personal data affected by the infringement;

272

●● the manner in which the infringement became known to the super-
visory authority, in particular whether, and if so to what extent, the
Controller or Processor notified the infringement;
●● where measures referred to in Article 58(2) have previously been
ordered against the Controller or Processor concerned in regard to
the same subject-matter, compliance with these measures;
●● adherence to approved codes of conduct pursuant to Article 40 or
approved certification mechanisms pursuant to Article 42; and
●● any other aggravating or mitigating factor applicable to the cir-
cumstances of the case, such as financial benefits gained, or
losses avoided, directly or indirectly, from the infringement
(Article 83(2)).
If a Controller or Processor intentionally or negligently, for the same or
linked processing operations, infringes several provisions of the GDPR,
the total amount of the administrative fine shall not exceed the amount
specified for the gravest infringement (Article 83(3)).
Infringements of the following provisions shall, in accordance with
Article 83(2), be subject to administrative fines up to 10,000,000 EUR,
or in case of an undertaking, up to two percent of the total worldwide
annual turnover of the preceding financial year, whichever is higher:
●● the obligations of the Controller and the Processor pursuant to
Articles 8, 11, 25–39 and 42 and 43;
●● the obligations of the certification body pursuant to Articles 42
and 43;
●● the obligations of the monitoring body pursuant to Article 41(4)
(Article 83(4)).
Infringements of the following provisions shall, in accordance with
Article 83(2), be subject to administrative fines up to 20,000,000 EUR,
or in case of an undertaking, up to four percent of the total worldwide
annual turnover of the preceding financial year, whichever is higher:
●● the basic principles for processing, including conditions for consent,
pursuant to Articles 5, 6, 7 and 9;
●● the Data Subjects’ rights pursuant to Articles 12–22;
●● the transfers of personal data to a recipient in a third country or an
international organisation pursuant to Articles 44–49;
●● any obligations pursuant to state laws adopted under Chapter IX;
●● non-compliance with an order or a temporary or definitive limitation
on processing or the suspension of data flows by the supervisory
authority pursuant to Article 58(2) or failure to provide access in vio-
lation of Article 58(1) (Article 83(5)).


Non-compliance with an order by the supervisory authority as referred
to in Article 58(2) shall, in accordance with Article 83(2), be subject to
administrative fines up to 20,000,000 EUR, or in case of an undertaking,
up to four percent of the total worldwide annual turnover of the preced-
ing financial year, whichever is higher (Article 83(6)).
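The two penalty tiers above, and the Article 83(6) cap, reduce to a simple ‘higher of’ calculation. As a rough illustration only (the function name `max_fine_eur` and the `higher_tier` flag are this sketch’s own labels, not terms from the GDPR or the DPA 2018), the caps can be computed as follows:

```python
# Illustrative sketch of the GDPR administrative fine caps.
# The percentages and fixed amounts come from Article 83(4)-(6);
# the function name and 'higher_tier' flag are hypothetical labels.

def max_fine_eur(annual_turnover_eur: float, higher_tier: bool) -> float:
    """Return the maximum fine cap in EUR for an undertaking.

    higher_tier=True  -> Article 83(5)/(6): the higher of 20,000,000 EUR
                         or 4% of total worldwide annual turnover.
    higher_tier=False -> Article 83(4): the higher of 10,000,000 EUR
                         or 2% of total worldwide annual turnover.
    """
    if higher_tier:
        return max(20_000_000.0, 0.04 * annual_turnover_eur)
    return max(10_000_000.0, 0.02 * annual_turnover_eur)

# An undertaking with EUR 2bn turnover: 4% (80m) exceeds the 20m floor.
print(max_fine_eur(2_000_000_000, higher_tier=True))   # 80000000.0
# A smaller undertaking: the fixed 20m amount is the higher figure.
print(max_fine_eur(50_000_000, higher_tier=True))      # 20000000.0
```

Note that these are caps, not cumulative totals: under Article 83(3), where several provisions are infringed for the same or linked processing operations, the total fine must not exceed the amount specified for the gravest infringement.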
Without prejudice to the corrective powers of supervisory authorities
pursuant to Article 58(2), each state may lay down the rules on whether
and to what extent administrative fines may be imposed on public
authorities and bodies established in that state (Article 83(7)).
The exercise by the supervisory authority of its powers under this
Article shall be subject to appropriate procedural safeguards in conform-
ity with EU law and state law, including effective judicial remedy and
due process (Article 83(8)).
Where the legal system of the state does not provide for administrative
fines, Article 83 may be applied in such a manner that the fine is initi-
ated by the competent supervisory authority and imposed by competent
national courts, while ensuring that these legal remedies are effective and
have an equivalent effect to the administrative fines imposed by supervi-
sory authorities. In any event, the fines imposed shall be effective, pro-
portionate and dissuasive. Those states shall notify to the Commission
the provisions of their laws which they adopt pursuant to this paragraph
by 25 May 2018 and, without delay, any subsequent amendment law or
amendment affecting them (Article 83(9)).

Penalties
20.20 The new GDPR provides that states shall lay down the rules
on penalties applicable to infringements of the GDPR in particular for
infringements which are not subject to administrative fines pursuant to
Article 83, and shall take all measures necessary to ensure that they are
implemented. Such penalties shall be effective, proportionate and dis-
suasive (Article 84(1)).
Each state must notify to the Commission those provisions of its law
which it adopts pursuant to Article 84(1), by 25 May 2018 and, without
delay, any subsequent amendment affecting them (Article 84(2)).

Conclusion
20.21 Data protection is important. Personal data is considered impor-
tant and sensitive to customers. This should be respected by organisa-
tions. Organisations are not permitted to collect nor process customer,
etc, personal data without being data protection compliant. It is in this


context that there can be severe consequences for organisations for
non-compliance, whether in collecting personal data initially, or in the
subsequent processing of the personal data. The ICO can impose pen-
alties financially, or can prosecute for non-compliance. Alternatively,
enforcement notices can be imposed which specify certain actions that
must be implemented by the organisation. Certain types of organisation
can be the recipient of separate types of notices, namely assessment
notices. In any of these events, customers, etc, will be particularly con-
cerned that their personal data has been collected, is being processed in
a certain manner and/or may have been subject to a breach event. This
can have its own consequences. Overall, it should also be noted that
the consequences of breach or non-compliance are becoming increas-
ingly important as enforcement actions and penalties are increasing
in frequency, number and financial scale. The new regime also further
enhances the enforcement rules. Subject to the above, further Brexit-
related regulations are also expected.

Chapter 21
Transfers of Personal Data

Introduction
21.01 Organisations are under ever increasing pressure to reduce
costs. This can sometimes involve consideration of outsourcing to coun-
tries outside of the jurisdiction. Any transfers of personal data, unless
specifically exempted, are restricted.
In addition, the global nature of commercial activities means that
organisations as part of normal business processes may seek to transfer
particular sets of personal data to group entities whom may be located
outside of the jurisdiction. There can be similar situations where an
organisation wishes to make cross border data flows to agents, partners
or outsourced Processors.
The data protection regime controls and regulates the transfers of
personal data1 from the UK to jurisdictions outside of the jurisdiction.
The transfer of personal data outside of the jurisdiction are known as
trans border data flows (TBDFs) or cross border transfers.2 Frequently
organisations would have transferred personal data to other sections
within their international organisation, such as banks. This could be per-
sonal data in relation to customers as well as employees (eg where the
Personnel or payroll section may be in a different country). This too is
included in the default ban, unless specifically exempted.

1 See A Nugter, Transborder Flow of Personal Data within the EC (Kluwer Law and
Taxation Publishers, 1990).
2 CT Beling, ‘Transborder Data Flows: International Privacy Protection and the Free
Flow of Information,’ Boston College International and Comparative Law Review
(1983) (6) 591–624; ‘Declaration on Transborder Data Flows,’ International Legal
Materials (1985)(24) 912–913; ‘Council Recommendation Concerning Guidelines
Governing the Protection of Privacy and Transborder Flows of Personal Data,’ Inter-
national Legal Materials (1981)(20) 422–450; ‘Draft Recommendation of the Council
Concerning Guidelines Governing the Protection of Privacy and Transborder Flows
of Personal Data,’ International Legal Materials (1980)(19) 318–324.


This trend of transfers has increased, however, as more and more activ-
ity is carried out online, such as eCommerce and social media. Personal
data is frequently transferred or mirrored on computer servers in more
than one country as a matter of apparent technical routine.
However, organisations need to be aware that any transfer of personal
data of UK and EU citizens needs to be in compliance with the data pro-
tection regime. One of the obligations is that transfers of personal data
may not occur.3 This default position can be derogated from if one of a
limited number of criteria are satisfied. If none of the exemption criteria
apply, the default position in the data protection regime applies and the
transfer cannot take place.

Transfer Ban
21.02 Transfers (outside EEA/EU) are prohibited per se. The focus
is directed upon the privacy protection elsewhere and the dangers of
uncontrolled transfers of personal data.
The Data Protection Act 1998 (DPA 1998) stated previously that:
‘Personal data shall not be transferred to a country or territory outside the
European Economic Area [EEA] unless that country or territory ensures an
adequate level of protection for the rights and freedoms of Data Subjects in
relation to the processing of personal data.’

Data protection compliance practice for organisations means that they
will have to include a compliance assessment as well as an assessment
of the risks associated with transfers of personal data outside of the juris-
diction. This applies to transfers from parent to subsidiary or to a branch
office in the same way as a transfer to an unrelated company or entity.
However, different exemptions can apply in different scenarios.

Adequate Protection Exception


21.03 If the recipient country has already been deemed by the UK or
EU to have an adequate level of protection for personal data, then the
transfer is permitted.

3 For one article noting the difficulties that the data protection regime creates in terms
of trans border data flows, see L Kong, ‘Data Protection and Trans Border Data Flow
in the European and Global Context,’ European Journal of International Law (2010)
(21) 441–456.


A transfer can occur where there has been a positive Community find-
ing in relation to the type of transfer proposed. A Community finding
means a finding that a country or territory outside the EEA/EU does, or
does not, ensure an adequate level of protection.
Therefore, if there has been a positive Community finding in relation to a named country outside of the jurisdiction, that country is deemed to have a level of protection in its laws comparable to the UK and EU data protection regime. This makes it possible for organisations to make transfers to that specific country.
The EU Commission provides a list of Commission decisions on the
adequacy of the protection of personal data in named third countries.4
The EU Commission has thus far recognised Andorra, Argentina, Canada (commercial organisations), the Faroe Islands, Guernsey, the Isle of Man, Israel, Japan, Jersey, New Zealand, Switzerland, Uruguay, the EU-US Privacy Shield rules (if signed up and adhered to), and the transfer of air passenger name records to the United States Bureau of Customs and Border Protection (as specified) as providing adequate protection for personal data. This list will expand over time.
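The adequacy ‘whitelist’ described above can be illustrated as a simple lookup. The following is only an illustrative sketch, not legal advice: the set of jurisdictions changes over time, so a real compliance tool would source it from the EU Commission’s published adequacy decisions rather than hard-code it as here, and the function name is a hypothetical assumption.

```python
# Illustrative sketch only: the adequacy 'whitelist' as a simple lookup.
# The list changes over time; a real tool would load it from the EU
# Commission's published decisions, not hard-code it.

ADEQUACY_JURISDICTIONS = {
    "Andorra", "Argentina", "Canada (commercial organisations)",
    "Faroe Islands", "Guernsey", "Isle of Man", "Israel", "Japan",
    "Jersey", "New Zealand", "Switzerland", "Uruguay",
}

def has_adequacy_decision(destination: str) -> bool:
    """Return True if the destination benefits from an adequacy finding,
    meaning a transfer may proceed without further authorisation."""
    return destination in ADEQUACY_JURISDICTIONS

print(has_adequacy_decision("Japan"))   # True
print(has_adequacy_decision("Brazil"))  # False
```

If the destination is not on the list, the organisation must fall back on one of the other exceptions discussed below.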

Exceptions
21.04 If the recipient country’s protection for personal data is not adequate, or not ascertainable, but transfers are still commercially desired, the organisation should ascertain whether the transfer comes within one of the other excepted categories. Transfers of personal data from the UK to outside of the EU/EEA jurisdiction cannot occur unless they fall within one of the transfer exemptions.
The exemptions from the transfer restrictions (where there is a UK equivalent to the EU regime) are:
●● the Data Subject has given consent;
●● the transfer is necessary for performance of contract between Data
Subject and Controller;
●● the transfer is necessary for taking steps at the request of the Data
Subject with a view to entering into a contract with the Controller;
●● the transfer is necessary for conclusion of a contract between
Controller and a person other than Data Subject that is entered
into at request of Data Subject and is in the interests of the Data
Subject;

4 At http://ec.europa.eu/justice/data-protection/international-transfers/adequacy/
index_en.htm.


●● the transfer is necessary for the performance of such a contract;
●● the transfer is required or authorised under any enactment or instru-
ment imposing international obligation on UK;
●● the transfer is necessary for reasons of substantial public interest;
●● the transfer is necessary for purposes of or in connection with legal
proceedings or prospective legal proceedings;
●● the transfer is necessary in order to prevent injury or damage to the
health of the Data Subject or serious loss of or damage to the prop-
erty of the Data Subject or otherwise to protect vital interests;
●● subject to certain conditions the transfer is only part of personal data
on a register established by or under an enactment;
●● the transfer has been authorised by a supervisory authority where the
Controller adduces adequate safeguards;
●● the transfer is made to a country that has been determined by the
EU Commission as having ‘adequate levels of [data] protection’ ie a
Community finding (see above);
●● the transfer is made to a US entity that has signed up to the EU/US ‘safe harbour’ arrangements (although fewer have signed up than originally envisaged);
●● EU Commission contract provisions: the Model Contracts (the EU has set out model contracts which, if incorporated into the data exporter – data importer/recipient relationship, can act as an exemption, thus permitting the transfer to occur).
In determining whether a third country ensures an adequate level of protection, factors taken into account include ‘any security measures taken in respect of the data in that country or territory’.
Further exempted categories include where:
●● the transfer is necessary for obtaining legal advice or in connection with legal proceedings or prospective proceedings;
●● (subject to certain conditions) it is necessary in order to prevent injury or damage to the health or property of the Data Subject, or in order to protect his or her vital interests;
●● the transfer is required or authorised by law;
●● the ICO has authorised the transfer where the Controller has given or adduced adequate safeguards; or
●● the contract relating to the transfer of the data embodies appropriate contract clauses as specified in a ‘Community finding.’
Other potential exempted categories are where:
●● the transfers are substantially in the public interest;
●● the transfers are made in connection with any (legal) claim;
●● the transfers are in the vital interests of the Data Subject;
●● there are public protections;

●● the transfer is necessary for the performance of a contract. (This will depend on the nature of the goods or services provided under the contract. The transfer must be necessary for the benefit of the transferee and not just convenient for the organisation).

Creating Adequacy
Through Consent
21.05 One of the possible transfer solutions is ‘creating adequacy’
through consent.
Through Contract
21.06 One of the other exemptions relates to transfers permitted as
a result of adopting the EU model contracts into the legal relationship
between the data exporter and the data importer/recipient.
Transfers of data to a third country may be made even though there
is not adequate protection in place in the third country, if the Controller
secures the necessary level of protection through contractual obligations.
These contractual protections are the model contract clauses emanating from the Commission. The Commission has issued what it considers to be adequate clauses which, when incorporated into the contract relationship of the data exporter and data importer, provide an adequate level of protection.
Obtaining the consent of pre-existing customers may pose a problem, and in some cases may not be possible or practical. For example, it may not be possible to retrospectively change existing contracts and terms.
However, going forward it may be possible to include ‘transfer’ issues
in any data protection compliance and related models.
(Also note Chapter 24 with regard to Brexit and UK adequacy deci-
sion issues.)

Binding Corporate Rules


21.07 The Commission and the WP295 (now the EDPB) also developed a policy of recognising adequate protection in the policies of

5 WP29, Recommendation 1/2007 on the Standard Application for Approval of Bind-


ing Corporate Rules for the Transfer of Personal Data; Working Document setting
up a table with the elements and principles to be found in Binding Corporate Rules,
WP 153 (2008); Working Document Setting up a framework for the structure of
­Binding Corporate Rules, WP154 (2008); Working Document on Frequently Asked
Questions (FAQs) related to Binding Corporate Rules, WP155 (2008).


multinational organisations transferring personal data which satisfy the determined binding corporate rules (BCR).6 This relates to transfers internally between companies within a related group of large multinational companies. It therefore differs from the model contract clauses above, which generally relate to non-related companies rather than group companies.
Organisations which have contracts, policies and procedures which satisfy the BCR, and are accepted as doing so after a review process with the Commission or one of the national data protection supervisory authorities, can transfer personal data outside of the EU within the group organisation.
WP29 has issued the following documents in relation to the BCR,
namely:
●● Working Document on Transfers of personal data to third coun-
tries: Applying Article 26 (2) of the EU Data Protection Directive to
­Binding Corporate Rules for International Data Transfers (WP74);
●● Model Checklist, Application for approval of Binding Corporate
Rules (WP102);
●● Working Document Setting Forth a Co-Operation Procedure for
Issuing Common Opinions on Adequate Safeguards Resulting From
Binding Corporate Rules (WP107);
●● Working Document Establishing a Model Checklist Application for
Approval of Binding Corporate Rules (WP108);
●● Recommendation on the Standard Application for Approval of
­Binding Corporate Rules for the Transfer of Personal Data;
●● Working Document setting up a table with the elements and princi-
ples to be found in Binding Corporate Rules (WP153);
●● Working Document Setting up a framework for the structure of
Binding Corporate Rules (WP154);
●● Working Document on Frequently Asked Questions (FAQs) related
to Binding Corporate Rules.
The BCR7 appear to be increasingly popular with large multinational organisations in relation to their data processing and data transfer compliance obligations. The ICO also refers to the BCR rules.
It is envisaged that the popularity of the BCR option for exemp-
tion from the data protection regime transfer restrictions will increase.

6 See http://ec.europa.eu/justice/data-protection/international-transfers/binding-corporate-
rules/index_en.htm; L Moerel, Binding Corporate Rules, Corporate Self-Regulation of
Global Data Transfers (OUP, 2012).
7 See also L Moerel, Binding Corporate Rules, Corporate Self-Regulation of Global Data Transfers (OUP, 2012).


However, the review process with the Commission or one of the national
data protection supervisory authorities (such as the ICO) can take some
time given the complexity involved.

GDPR: The New Transfers Regime


Transfers Restricted
21.08 Cross-border data transfers are referred to in Recitals 6, 48, 101–103, 107, 108, 110–115, 153.
Chapter V of the new EU General Data Protection Regulation (GDPR)
refers to data transfers and transfer of personal data to third countries or
international organisations.
Any transfer of personal data which are undergoing processing or are
intended for processing after transfer to a third country or to an interna-
tional organisation shall only take place if, subject to the other provisions
of the GDPR, the conditions laid down in this Chapter are complied
with by the Controller and Processor, including for onward transfers of
personal data from the third country or an international organisation to
another third country or to another international organisation. All provi-
sions in this Chapter shall be applied in order to ensure that the level of
protection of natural persons guaranteed by the GDPR is not undermined
(Article 44).
Transfers on the Basis of an Adequacy Decision
21.09 A transfer of personal data to a third country or an international
organisation may take place where the Commission has decided that the
third country, a territory or one or more specified sectors within that third
country, or the international organisation in question ensures an adequate
level of protection. Such transfer shall not require any specific authorisa-
tion (Article 45(1)).
When assessing the adequacy of the level of protection, the
Commission shall, in particular, take account of the following elements:
●● the rule of law, respect for human rights and fundamental freedoms,
relevant legislation, both general and sectoral, including concerning
public security, defence, national security and criminal law and the
access of public authorities to personal data, as well as the imple-
mentation of such legislation, data protection rules, professional
rules and security measures, including rules for onward transfer of
personal data to another third country or international organisation,
which are complied with in that country or international organisa-
tion, case law, as well as effective and enforceable Data Subject


rights and effective administrative and judicial redress for the Data
Subjects whose personal data are being transferred;
●● the existence and effective functioning of one or more independ-
ent supervisory authorities in the third country or to which an inter-
national organisation is subject, with responsibility for ensuring
and enforcing compliance with the data protection rules, including
adequate enforcement powers, for assisting and advising the Data
Subjects in exercising their rights and for co-operation with the
supervisory authorities of the states; and
●● the international commitments the third country or international
organisation concerned has entered into, or other obligations arising
from legally binding conventions or instruments as well as from its
participation in multilateral or regional systems, in particular in rela-
tion to the protection of personal data (Article 45(2)).
The Commission, after assessing the adequacy of the level of protec-
tion, may decide, by means of implementing act, that a third country,
a territory or one or more specified sectors within a third country, or an
international organisation ensures an adequate level of protection within
the meaning of Article 45(2). The implementing act shall provide for a
mechanism for a periodic review, at least every four years, which shall
take into account all relevant developments in the third country or inter-
national organisation. The implementing act shall specify its territorial
and sectorial application and, where applicable, identify the supervisory
authority or authorities referred to in Article 45(2)(b). The implement-
ing act shall be adopted in accordance with the examination procedure
referred to in Article 93(2) (Article 45(3)).
The Commission shall, on an on-going basis, monitor developments in
third countries and international organisations that could affect the func-
tioning of decisions adopted pursuant to para 3 and decisions adopted
on the basis of Article 25(6) of the EU Data Protection Directive 1995
(DPD) (Article 45(4)).
The Commission shall, where available information reveals, in par-
ticular following the review referred to in Article 45(3), that a third
country, a territory or one or more specified sectors within a third
country, or an international organisation no longer ensures an ade-
quate level of protection within the meaning of Article 45(2), to the
extent necessary, repeal, amend or suspend the decision referred to in
Article 45(3) without retro-active effect. Those implementing acts shall
be adopted in accordance with the examination procedure referred to in
Article 93(2) (Article 45(5)). On duly justified
imperative grounds of urgency, the Commission shall adopt immediately

applicable implementing acts in accordance with the procedure referred to in Article 93(3) (Article 45(5)).
The Commission shall enter into consultations with the third country
or international organisation with a view to remedying the situation giv-
ing rise to the decision made pursuant to Article 45(5) (Article 45(6)).
A decision pursuant to paragraph 5 is without prejudice to transfers of
personal data to the third country, territory or one or more specified sec-
tors within that third country, or the international organisation in ques-
tion pursuant to Articles 46 to 49 (Article 45(7)).
The Commission shall publish in the Official Journal of the EU and on
its website a list of those third countries, territories and specified sectors
within a third country and international organisations for which it has
decided that an adequate level of protection is or is no longer ensured
(Article 45(8)).
Decisions adopted by the Commission on the basis of Article 25(6) of
the DPD 1995 shall remain in force until amended, replaced or repealed
by a Commission Decision adopted in accordance with Article 45(3)
or (5) (Article 45(9)).
(Also note Chapter 24 with regard to Brexit adequacy decision issues.)
Transfers Subject to Appropriate Safeguards
21.10 In the absence of a decision pursuant to Article 45(3), a
Controller or Processor may transfer personal data to a third country
or an international organisation only if the Controller or Processor has
provided appropriate safeguards, and on condition that enforceable Data
Subject rights and effective legal remedies for Data Subjects are avail-
able (Article 46(1)).
The appropriate safeguards referred to in Article 46(1) may be pro-
vided for, without requiring any specific authorisation from a supervi-
sory authority, by:
●● a legally binding and enforceable instrument between public author-
ities or bodies;
●● binding corporate rules in accordance with Article 47;
●● standard data protection clauses adopted by the Commission
in accordance with the examination procedure referred to in
­Article 93(2); or
●● standard data protection clauses adopted by a supervisory authority
and approved by the Commission pursuant to the examination pro-
cedure referred to in Article 93(2); or
●● an approved code of conduct pursuant to Article 40 together with
binding and enforceable commitments of the Controller or Processor
in the third country to apply the appropriate safeguards, including as
regards Data Subjects’ rights; or

●● an approved certification mechanism pursuant to Article 42 together with binding and enforceable commitments of the Controller or Processor in the third country to apply the appropriate safeguards, including as regards Data Subjects’ rights (Article 46(2)).
Subject to the authorisation from the competent supervisory authority,
the appropriate safeguards referred to in Article 46(1) may also be pro-
vided for, in particular, by:
●● contractual clauses between the Controller or Processor and the
Controller, Processor or the recipient of the personal data in the third
country or international organisation; or
●● provisions to be inserted into administrative arrangements between
public authorities or bodies which include enforceable and effective
Data Subject rights (Article 46(3)).
The supervisory authority shall apply the consistency mechanism referred
to in Article 63 in the cases referred to in Article 46(3) (Article 46(4)).
Authorisations by a state or supervisory authority on the basis of Article 26(2) of the DPD 1995 shall remain valid until amended, replaced or repealed, if necessary, by that supervisory authority. Decisions adopted by the Commission on the basis of Article 26(4) of the DPD 1995 shall remain in force until amended, replaced or repealed, if necessary, by a Commission Decision adopted in accordance with Article 46(2) (Article 46(5)).
Binding Corporate Rules
21.11 The competent supervisory authority shall approve binding cor-
porate rules (BCR) in accordance with the consistency mechanism set
out in Article 63, provided that they:
●● are legally binding and apply to and are enforced by every mem-
ber concerned of the group of undertakings or groups of enterprises
engaged in a joint economic activity, including their employees;
●● expressly confer enforceable rights on Data Subjects with regard to
the processing of their personal data;
●● fulfil the requirements laid down in Article 47(2) (Article 47(1)).
The binding corporate rules shall specify at least:
●● the structure and contact details of the concerned group of undertak-
ings or group of enterprises engaged in a joint economic activity and
of each of its members;
●● the data transfers or set of transfers, including the categories of per-
sonal data, the type of processing and its purposes, the type of Data

Subjects affected and the identification of the third country or countries in question;
●● their legally binding nature, both internally and externally;
●● the application of the general data protection Principles, in particular
purpose limitation, data minimisation, limited storage periods, data
quality, data protection by design and by default (DPbD), legal basis
for the processing, processing of special categories of personal data,
measures to ensure data security, and the requirements in respect
of onward transfers to bodies not bound by the binding corporate
rules [d];
●● the rights of Data Subjects in regard to processing and the means to
exercise these rights, including the right not to be subject to deci-
sions based solely on automated processing, including profiling in
accordance with Article 22, the right to lodge a complaint before the
competent supervisory authority and before the competent courts of
the states in accordance with Article 79, and to obtain redress and,
where appropriate, compensation for a breach of the binding corpo-
rate rules [e];
●● the acceptance by the Controller or Processor established on the ter-
ritory of a state of liability for any breaches of the binding corpo-
rate rules by any member concerned not established in the EU; the
Controller or the Processor shall be exempted from this liability, in
whole or in part, only if it proves that that member is not responsible
for the event giving rise to the damage [f];
●● how the information on the binding corporate rules, in particular on
the provisions referred to in this Article 47(2)(d), (e) and (f) is pro-
vided to the Data Subjects in addition to Articles 13 and 14;
●● the tasks of any DPO designated in accordance with Article 37 or
any other person or entity in charge of the monitoring compliance
with the binding corporate rules within the group of undertakings, or
group of enterprises engaged in a joint economic activity, as well as
monitoring the training and complaint handling;
●● the complaint procedures;
●● the mechanisms within the group of undertakings, or group of
enterprises engaged in a joint economic activity, for ensuring the
verification of compliance with the binding corporate rules. Such
mechanisms shall include data protection audits and methods for
ensuring corrective actions to protect the rights of the Data Subject.
Results of such verification should be communicated to the person
or entity referred under point (h) and to the board of the controlling
undertaking or of the group of enterprises engaged in a joint eco-
nomic activity, and should be available upon request to the compe-
tent supervisory authority [i];


●● the mechanisms for reporting and recording changes to the rules and
reporting these changes to the supervisory authority;
●● the co-operation mechanism with the supervisory authority to ensure
compliance by any member of the group of undertakings, or group
of enterprises engaged in a joint economic activity, in particular by
making available to the supervisory authority the results of verifica-
tions of the measures referred to in this Article 47(2)(j);
●● the mechanisms for reporting to the competent supervisory authority
any legal requirements to which a member of the group of undertak-
ings, or group of enterprises engaged in a joint economic activity
is subject in a third country which are likely to have a substantial
adverse effect on the guarantees provided by the binding corporate
rules; and
●● the appropriate data protection training to personnel having perma-
nent or regular access to personal data (Article 47(2)).
There may be future changes and requirements too. The Commission
may specify the format and procedures for the exchange of information
between Controllers, Processors and supervisory authorities for binding
corporate rules. Those implementing acts shall be adopted in accordance
with the examination procedure set out in Article 93(2) (Article 47(3)).
Transfers Not Authorised by EU Law
21.12 Any judgment of a court or tribunal and any decision of an
administrative authority of a third country requiring a Controller or
Processor to transfer or disclose personal data may only be recognised or
enforceable in any manner if based on an international agreement, such
as a mutual legal assistance treaty, in force between the requesting third
country and the EU or a state, without prejudice to other grounds for
transfer pursuant to this Chapter (Article 48).
Derogations
21.13 In the absence of an adequacy decision pursuant to Article 45(3),
or of appropriate safeguards pursuant to Article 46, including binding
corporate rules, a transfer or a set of transfers of personal data to a third
country or an international organisation shall take place only on one of
the following conditions:
●● the Data Subject has explicitly consented to the proposed transfer,
after having been informed of the possible risks of such transfers
for the Data Subject due to the absence of an adequacy decision and
appropriate safeguards; [a]

●● the transfer is necessary for the performance of a contract between the Data Subject and the Controller or the implementation of pre-contractual measures taken at the Data Subject’s request; [b]
●● the transfer is necessary for the conclusion or performance of a con-
tract concluded in the interest of the Data Subject between the Con-
troller and another natural or legal person; [c]
●● the transfer is necessary for important reasons of public interest;
●● the transfer is necessary for the establishment, exercise or defence
of legal claims;
●● the transfer is necessary in order to protect the vital interests of the
Data Subject or of other persons, where the Data Subject is physi-
cally or legally incapable of giving consent;
●● the transfer is made from a register which according to EU or state
law is intended to provide information to the public and which is
open to consultation either by the public in general or by any person
who can demonstrate a legitimate interest, but only to the extent
that the conditions laid down in EU or state law for consultation are
fulfilled in the particular case. [g]
Where a transfer could not be based on a provision in Articles 45
or 46, including the provision of binding corporate rules, and none of
the derogations for a specific situation pursuant to para (1) is applica-
ble, a transfer to a third country or an international organisation may
take place only if the transfer is not repetitive, concerns only a limited
number of Data Subjects, is necessary for the purposes of compelling
legitimate interests pursued by the Controller which are not overridden
by the interests or rights and freedoms of the Data Subject, and the Con-
troller has assessed all the circumstances surrounding the data transfer
and has on the basis of that assessment provided suitable safeguards with
respect to the protection of personal data. The Controller shall inform the
supervisory authority of the transfer. The Controller shall in addition to
the information referred to in Article 13 and Article 14, inform the Data
Subject about the transfer and on the compelling legitimate interests pur-
sued by the Controller (Article 49(1)). [h]
A transfer pursuant to Article 49(1)(g) shall not involve the
entirety of the personal data or entire categories of the personal data
contained in the register. Where the register is intended for consulta-
tion by persons having a legitimate interest, the transfer shall be made
only at the request of those persons or if they are to be the recipients
(Article 49(2)).
Article 49(1)(a), (b) and (c) and para (2) shall not apply to activities
carried out by public authorities in the exercise of their public powers
(Article 49(3)).

The public interest referred to in Article 49(1)(d) shall be recognised in EU law or in the law of the state to which the Controller is subject (Article 49(4)).
In the absence of an adequacy decision, EU law or state law may, for
important reasons of public interest, expressly set limits to the transfer of
specific categories of personal data to a third country or an international
organisation. States shall notify such provisions to the Commission
(Article 49(5)).
The Controller or Processor shall document the assessment as well
as the suitable safeguards referred to in Article 49(1) (second subparagraph) in the records referred to in Article 30 (Article 49(6)).
International Cooperation
21.14 In relation to third countries and international organisations, the
Commission and supervisory authorities shall take appropriate steps to:
●● develop international co-operation mechanisms to facilitate the
effective enforcement of legislation for the protection of personal
data;
●● provide international mutual assistance in the enforcement of legis-
lation for the protection of personal data, including through notifi-
cation, complaint referral, investigative assistance and information
exchange, subject to appropriate safeguards for the protection of
personal data and other fundamental rights and freedoms;
●● engage relevant stakeholders in discussion and activities aimed at
furthering international co-operation in the enforcement of legisla-
tion for the protection of personal data;
●● promote the exchange and documentation of personal data protec-
tion legislation and practice, including on jurisdictional conflicts
with third countries (Article 50).
Restrictions
21.15 Chapter III, Section 5 of the new GDPR refers to restrictions.
EU or state law to which the Controller or Processor is subject may
restrict by way of a legislative measure the scope of the obligations
and rights provided for in Articles 12 to 22 and Article 34, as well as
­Article 5 in so far as its provisions correspond to the rights and obliga-
tions provided for in Articles 12 to 20, when such a restriction respects
the essence of the fundamental rights and freedoms and is a necessary
and proportionate measure in a democratic society to safeguard:
●● national security [a];
●● defence [b];
●● public security [c];

●● the prevention, investigation, detection or prosecution of criminal offences or the execution of criminal penalties, including the safeguarding against and the prevention of threats to public security [d];
●● other important objectives of general public interests of EU or of a
state, in particular an important economic or financial interest of EU
or of a state, including monetary, budgetary and taxation matters,
public health and social security [e];
●● the protection of judicial independence and judicial proceedings;
●● the prevention, investigation, detection and prosecution of breaches
of ethics for regulated professions [g];
●● a monitoring, inspection or regulatory function connected, even
occasionally, to the exercise of official authority in cases referred to
in (a) to (e) and (g);
●● the protection of the Data Subject or the rights and freedoms of
others;
●● the enforcement of civil law claims (Article 23(1)).
In particular, any legislative measure referred to in Article 23(1) shall
contain specific provisions at least, where relevant, as to:
●● the purposes of the processing or categories of processing;
●● the categories of personal data;
●● the scope of the restrictions introduced;
●● the safeguards to prevent abuse or unlawful access or transfer;
●● the specification of the Controller or categories of Controllers;
●● the storage periods and the applicable safeguards taking into account
the nature, scope and purposes of the processing or categories of
processing;
●● the risks for the rights and freedoms of Data Subjects; and
●● the right of Data Subjects to be informed about the restriction,
unless this may be prejudicial to the purpose of the restriction
(Article 23(2)).

Issues
21.16 Certain issues may arise in relation to:
●● What is a ‘transfer’? Is there a difference between transfer versus
transit?
●● ‘Data’ and anonymised data, is there a restriction on transfers of
anonymised data? For example, can certain anonymised data fall
outside the definition of personal data?
●● ‘Third country’: this excludes the EU countries and the EEA countries of Iceland, Norway and Liechtenstein. The number of EU countries has expanded over time. In addition, the list of permitted additional third countries is also expanding over time, as are approved transfer mechanisms (for example, the former Safe Harbour).
●● How EU/US transfers will be resolved after the ECJ/CJEU struck
down the original EU–US Safe Harbour data transfer agreement,
and the challenges faced by the new EU-US Privacy Shield replacement.8
Many organisations used this as the legitimising basis for the lawful
transfer of personal data from the EU to the US.

Establishing if the Ban Applies


21.17 Assessing if the export ban applies includes asking the following questions:
●● does the organisation wish to transfer personal data?
●● is there a transfer to a ‘third country’?
●● does that third country have an adequate level of protection?
●● does it involve transfers to the EU-US Privacy Shield or other transfer mechanism?
●● does it involve white list countries?
●● what constitutes adequacy?
●● what is the nature of the data?
●● how is it to be used?
●● what laws and practices are in place in the third country?
●● is it a transfer by Controller to Processor?
●● are they transfers within an international or multinational company
or group of companies where an internal privacy code or agreement
is in place?
●● are they transfers within a consortium established to process interna-
tional transactions, for example, banking?
●● are they transfers between professionals such as lawyers or accountants where a client’s business has an international dimension?
●● the Commission has identified core principles which must be present
in the foreign laws or codes of practice or regulations in order to
achieve the requisite standard of ‘adequacy’ in third countries –
are these principles present?
●● has the personal data been processed for a specific purpose?
●● is the personal data accurate and up to date?
●● has the Data Subject been provided with adequate information in
relation to the transfer?

8 Schrems v Data Protection Commissioner, Case C-362/14, Court of Justice, 6 October 2015.


●● have technical and organisational security measures been taken by the Controller?
●● is there a right of access to the data by the Data Subject?
●● is there a prohibition on onward transfer of data?
●● is there an effective procedure or mode of enforcement?
Contracts, consent and binding corporate rules take on a new significance
as the concepts of permissibility and adequacy receive renewed scrutiny.
Given the controversy over, and ultimate striking down of, the EU-US Privacy Shield arrangement, all Controllers engaged in processing, and those considering transfers, should very carefully consider all of the implications, current changes and the potential for further change and additional measures in the short to medium term. For example, even when official policymakers had agreed the proposals for the EU-US Privacy Shield, the WP29 (now the EDPB) suggested that further clarity, if not amendment, was needed. The Commission, and the new EDPB, are likely to consider transfer legitimising mechanisms further in future.

Checklist for Compliance


21.18 Organisations should consider the following queries:
●● does the organisation transfer customer, etc, personal data outside of
the jurisdiction?
●● is there a transfer to the US? If so, is the recipient of the data a Pri-
vacy Shield recipient? If yes, then the organisation can transfer the
data. If not, the organisation needs to: (i) assess the adequacy of the
protection measures in place; (ii) see if that transfer is exempted;
(iii) create adequacy through consent or appropriate contract.
●● is the recipient country a Commission permitted white list country?
If so, transfer is permitted.
●● are there data protection rules or codes of practice in place and, if so, do they incorporate adequate and equivalent protections?
●● if there is inadequate protection is it practical to obtain the consent
of the Data Subject?
●● if not, then is it practical to enter into an appropriate contract with
the recipient?
●● are the Commission model data contract clauses available?
The ICO also asks:
1. does the organisation need to transfer personal data abroad?
2. is the organisation transferring the personal data to a country outside
of the jurisdiction or will it just be in transit through a non-EEA
country?


3. has the organisation complied with all the other data protection
Principles?
4. is the transfer to a country on the EU Commission’s white list of
countries or territories (per a Community finding) accepted as pro-
viding adequate levels of protection for the rights and freedoms of
Data Subjects in connection with the processing of their personal
data?
5. if the transfer is to the US, has the US recipient of the personal data
signed up to the EU-US Department of Commerce Privacy Shield
scheme [or any alternative mechanism]?
6. is the personal data passenger name record information (PNR)?
If so, particular rules may apply.
8. can the organisation assess the level of protection for Data Subjects’ rights as ‘adequate in all the circumstances of the case’?
9. if not, can the organisation put in place adequate safeguards to pro-
tect the rights of the Data Subjects whose data is to be transferred?
10. can the organisation rely on another exception from the restriction
on international transfers of personal data?
[Note: No 7 missing in the original]

Brexit
21.19 See Chapter 24 with regard to Brexit and the impact on EU data
transfers to the UK after Brexit is completed. The important issue of a
UK adequacy decision and the complicating issues involved are neces-
sary considerations. Many issues remain unclear at this stage.

Conclusion
21.20 Data protection compliance practice for organisations means
that they will have to include a compliance assessment, as well as an
assessment of associated risks, in relation to potential transfers of per-
sonal data outside of the jurisdiction. This applies to transfers from par-
ent to subsidiary or to a branch office as well as transfers to any unrelated
company or entity.
The ICO also provides useful guidance in relation to:
●● Assessing Adequacy;
●● Model Contract Clauses;
●● Binding Corporate Rules; and
●● Outsourcing.


There is also the following ICO guidance which can be a useful reference for organisations, namely, the:
●● Data Protection Act;
●● International Data Transfers;
●● The Information Commissioner’s Recommended Approach to Assessing Adequacy Including Consideration of the Issue of Contractual Solutions, Binding Corporate Rules.

Organisations should be aware that if they wish to transfer personal data outside of the EEA, additional considerations arise and that, unless there is a specific exemption, the transfer may not be permitted.
Once a transfer possibility arises, the organisation should undertake a
compliance exercise to assess if the transfer can be permitted, and if
so, how. Additional compliance documentation and contracts may be
required.
The increasing locations and methods by which personal data may
be collected by organisations, in particular internet, communications
and social media, will increasingly be scrutinised in terms of transpar-
ency, consent and compliance. Equally, cloud computing is receiving
particular data protection attention. Apps are also being considered as
is the developing area of the Internet of Things (IoT) and related data protection compliance issues.

Chapter 22
ePrivacy and Electronic
Communications

Introduction
22.01 There is increasing use of electronically transmitted personal
data. This is protected and regulated, in certain respects, separate from
the general data protection regime under the General Data Protection
Regulation (GDPR).
While originally the regulation of telecommunications related per-
sonal data centred on telecoms companies, it is now recognised as
encompassing telecoms companies and certain companies engaged in
activities involving the collection or transmission of particular personal
data over electronic communications networks, including the internet.
Organisations concerned with compliance relating to marketing, email
marketing, text marketing, telephone marketing, fax marketing, the
use of location based data, cookies, identification regarding telephone
calls and other telephone issues need to consider the additional data
protection rules.
The GDPR indicates that it shall not impose additional obligations on
natural or legal persons in relation to the processing of personal data in
connection with the provision of publicly available electronic commu-
nications services in public communication networks in the EU in rela-
tion to matters for which they are subject to specific obligations with the
same objective set out in Directive 2002/58/EC1 (Article 95).
However, other Directives and Regulations (in particular the ePrivacy
Regulation once finalised) may be reviewed in terms of amendments
required in order to smoothly comply with the GDPR.

1 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).


Background
22.02 There has been a separation between the data protection of gen-
eral personal data, in EU Data Protection Directive 1995 (DPD 1995),
and the regulation of personal data in (tele)communications networks.
The latter was legislated for in the Telecommunications Data Protection Directive of 1997.2
This was later replaced with the ePrivacy Directive (ePD). The ePD was
amended by Directive 2006/24/EC and Directive 2009/136/EC. Now,
organisations need to be aware of the updated provisions coming with
the ePrivacy Regulation, which replaces the previous rules regarding
ePrivacy, electronic marketing, cookies, etc.
ePD3 concerns the processing and protection of personal data and pri-
vacy in the electronic communications sector. It is also known as the
Directive on privacy and electronic communications, hence ePD. One
of the concerns has been how electronic communications and electronic
information are increasingly used for profiling for marketing purposes,
including by electronic means.4 (Indeed, this is also reflected in the
cookie rules).

Scope of the ePD


22.03 The ePD broadly relates to and encompasses the following,
namely:
●● definitions;
●● security;
●● confidentiality;
●● traffic data;
●● non-itemised bills;
●● call and connected line identification;
●● location data;
●● exceptions;
●● directories;
●● unsolicited communications;
●● technical features.
It also provides rules in relation to:
●● email marketing;
●● text marketing;

2 Directive 97/66/EC.
3 Directive 2002/58/EC of 12 July 2002.
4 See, for example, W McGeveran, ‘Disclosure, Endorsement, and Identity in Social
Marketing,’ University of Illinois Law Review (2009) (4) 1105–1166.


●● telephone marketing;
●● fax marketing;
●● cookies, etc.

Marketing
22.04 How should organisations go about ensuring data protection
compliance for direct marketing (DM)? When is DM permitted? All
organisations should carefully assess compliance issues when consider-
ing any direct marketing activities. Getting it wrong can be costly and
can have ICO enforcement and investigation consequences as well as
penalties.
Article 13 of the ePD refers to unsolicited communications.
Article 13(1) in particular provides that the use of the following, namely:
●● automated calling systems without human intervention (automatic
calling machines);
●● facsimile machines (fax); or
●● electronic mail,
for the purposes of direct marketing may only be allowed in respect of
subscribers who have given their prior consent.
This means that there is a default rule prohibiting DM without prior
consent. Many marketing orientated organisations may be surprised, if
not dismayed by this.
Existing Customers’ Email
22.05 However, in the context of existing customers, there is a pos-
sibility to direct market using emails. Article 13(2) of the ePD provides
that notwithstanding Article 13(1), where an organisation obtains from
its customers their electronic contact details for email, in the context of
the sale of a product or a service, in accordance with the DPD, the organ-
isation may use these electronic contact details for DM of its own similar
products or services provided that customers clearly and distinctly are
given the opportunity to object, free of charge and in an easy manner, to
such use of electronic contact details when they are collected and on the
occasion of each message in case the customer has not initially refused
such use.
Therefore, once the email details are obtained at the time of a prod-
uct or service transaction, it will be possible to use that email for direct
marketing purposes. Conditions or limitations apply however. Firstly,
the organisation is only permitted to market and promote similar prod-
ucts or services. This, therefore, rules out unrelated, non-identical and


non-similar products and services. Secondly, at the time of each subsequent act of DM, the customer must be given the opportunity in an easy
manner to opt-out or cancel the DM. Effectively, they must be taken off
the DM list.
National Opt-out Registers
22.06 Article 13(3) of ePD provides that appropriate measures must
ensure that, free of charge, unsolicited communications for purposes of
direct marketing, in cases other than those referred to in paras 1 and 2,
are not allowed either without the consent of the subscribers concerned
or in respect of subscribers who do not wish to receive these communi-
cations, the choice between these options to be determined by national
­legislation. This means that each state must determine and provide a
means for individuals to opt-out of receiving DM in advance.5
Marketing Emails Must Not Conceal Their Identity
22.07 Article 13(4) of ePD provides that the practice of sending elec-
tronic mail for purposes of DM disguising or concealing the identity of
the sender on whose behalf the communication is made shall be pro-
hibited. This means that organisations cannot conceal their identity if permitted to engage in direct marketing. If these are not complied with,
what might otherwise be permissible DM can be deemed to be imper-
missible. Complaints, investigations or enforcement proceedings can
thus arise.
Marketing Emails Must Have an Opt-Out
22.08 In addition, Article 13(4) of ePD provides that the practice of
sending electronic mail for purposes of DM without a valid address to
which the recipient may send a request that such communications cease,
shall be prohibited. This means that organisations must also include an
easy contact address or other details at which the recipient can contact
if they wish to object to receiving any further DM. If these are not com-
plied with, what might otherwise be permissible DM can be deemed to
be impermissible. Complaints, investigations or enforcement proceed-
ings can thus also arise.

Marketing Protection for Organisations


22.09 Article 13(5) of ePD provides that Article 13(1) and (3) shall
apply to subscribers who are natural persons. States shall also ensure,

5 See ICO, Marketing, at https://ico.org.uk.


in the framework of EU law and applicable national legislation, that the legitimate interests of subscribers other than natural persons with regard
to unsolicited communications are sufficiently protected. This means that
protection from unsolicited DM can also be extended to organisations.
Scope
22.10 ePD Article 1 refers to the scope and aim of the Directive.
Article 1(1) provides that the Directive harmonises the provisions of the
states to ensure an equivalent level of protection of fundamental rights
and freedoms, and in particular the right to privacy, with respect to the
processing of personal data in the electronic communication sector and
to ensure the free movement of such data and of electronic communica-
tion equipment and services in the EU.
Article 1(2) provides that the provisions of the Directive particularise
and complement DPD 1995 for the purposes mentioned in Article 1(1).
Moreover, they provide for protection of the legitimate interests of sub-
scribers who are legal persons.
Article 1(3) provides that the ePD shall not apply to activities
which fall outside the scope of the Treaty establishing the European
Community, such as those covered by Titles V and VI of the Treaty on
EU, and in any case to activities concerning public security, defence,
State security (including the economic well-being of the State when the
activities relate to State security matters) and the activities of the State in
areas of criminal law.
Definitions
22.11 Article 2 of the ePD refers to definitions. The definitions in
GDPR and in the ePD shall apply. The following additional definitions
shall also apply: user, traffic data, location data, communication, call,
consent, value added consent, electronic mail.
Services
22.12 Article 3 relates to the services concerned. Article 3(1) provides
that the ePD shall apply to the processing of personal data in connec-
tion with the provision of publicly available electronic communications services in public communications networks in the EU.
Article 3(2) provides that Articles 8, 10 and 11 shall apply to sub-
scriber lines connected to digital exchanges and, where technically possible and if it does not require a disproportionate economic effort, to
subscriber lines connected to analogue exchanges.
Article 3(3) provides that cases where it would be technically
impossible or require a disproportionate economic effort to fulfil the


requirements of Articles 8, 10 and 11 shall be notified to the Commission by the states.
Unsolicited Communications
22.13 Article 13 refers to unsolicited communications. Article 13(1)
provides that the use of automated calling systems without human inter-
vention (automatic calling machines), facsimile machines (fax) or elec-
tronic mail for the purposes of direct marketing may only be allowed in
respect of subscribers who have given their prior consent.
Article 13(2) provides that notwithstanding Article 13(1), where a
natural or legal person obtains from its customers their electronic con-
tact details for electronic mail, in the context of the sale of a product
or a service, in accordance with DPD 1995, the same natural or legal
person may use these electronic contact details for direct marketing of
its own similar products or services provided that customers clearly and
distinctly are given the opportunity to object, free of charge and in an
easy manner, to such use of electronic contact details when they are col-
lected and on the occasion of each message in case the customer has not
initially refused such use.
Article 13(3) provides that states shall take appropriate measures to
ensure that, free of charge, unsolicited communications for purposes of
direct marketing, in cases other than those referred to in Article 13(1)
and (2), are not allowed either without the consent of the subscribers
concerned or in respect of subscribers who do not wish to receive these
communications, the choice between these options to be determined by
national legislation.
Article 13(4) provides that in any event, the practice of sending elec-
tronic mail for purposes of direct marketing disguising or concealing the
identity of the sender on whose behalf the communication is made, or
without a valid address to which the recipient may send a request that
such communications cease, shall be prohibited.
Article 13(5) provides that Article 13(1) and (3) shall apply to sub-
scribers who are natural persons. States shall also ensure, in the frame-
work of EU law and applicable national legislation, that the legitimate
interests of subscribers other than natural persons with regard to unsolic-
ited communications are sufficiently protected.
Security
22.14 Article 4 relates to security. Article 4(1) provides that the pro-
vider of a publicly available electronic communications service must
take appropriate technical and organisational measures to safeguard
security of its services, if necessary in conjunction with the provider of
the public communications network with respect to network security.


Having regard to the state of the art and the cost of their implementa-
tion, these measures shall ensure a level of security appropriate to the
risk presented. Obviously, these can change over time as risks and as
technology change.
Article 4(2) provides that in case of a particular risk of a breach of the
security of the network, the provider of a publicly available electronic
communications service must inform the subscribers concerning such
risk and, where the risk lies outside the scope of the measures to be taken
by the service provider, of any possible remedies, including an indication
of the likely costs involved.
Confidentiality
22.15 Article 5 refers to confidentiality of the communications.
Article 5(1) provides that states shall ensure the confidentiality of com-
munications and the related traffic data by means of a public commu-
nications network and publicly available electronic communications
services, through national legislation. In particular, they shall prohibit
listening, tapping, storage or other kinds of interception or surveillance
of communications and the related traffic data by persons other than
users, without the consent of the users concerned, except when legally
authorised to do so in accordance with Article 15(1). This paragraph
shall not prevent technical storage which is necessary for the conveyance
of a communication without prejudice to the principle of confidentiality.
Article 5(2) provides that para 1 shall not affect any legally authorised
recording of communications and the related traffic data when carried
out in the course of lawful business practice for the purpose of pro-
viding evidence of a commercial transaction or of any other business
communication.
Article 5(3) provides that states shall ensure that the use of electronic
communications networks to store information or to gain access to infor-
mation stored in the terminal equipment of a subscriber or user is only
allowed on condition that the subscriber or user concerned is provided
with clear and comprehensive information, inter alia about the purposes
of the processing, and is offered the right to refuse such processing by
the Controller. This shall not prevent any technical storage or access for
the sole purpose of carrying out or facilitating the transmission of a com-
munication over an electronic communications network, or as strictly
necessary in order to provide an information society service explicitly
requested by the subscriber or user.
Traffic Data
22.16 Article 6 refers to traffic data. Article 6(1) provides that traf-
fic data relating to subscribers and users processed and stored by the


provider of a public communications network or publicly available electronic communications service must be erased or made anonymous when
it is no longer needed for the purpose of the transmission of a commu-
nication without prejudice to Article 6(2), (3) and (5) and Article 15(1).
Article 6(2) provides that traffic data necessary for the purposes of
subscriber billing and interconnection payments may be processed. Such
processing is permissible only up to the end of the period during which
the bill may lawfully be challenged or payment pursued.
Article 6(3) provides that for the purpose of marketing electronic
communications services or for the provision of value added services,
the provider of a publicly available electronic communications service
may process the data referred to in Article 6(1) to the extent and for the
duration necessary for such services or marketing, if the subscriber or
user to whom the data relate has given his/her consent. Users or sub-
scribers shall be given the possibility to withdraw their consent for the
processing of traffic data at any time.
Article 6(4) provides that the service provider must inform the
subscriber or user of the types of traffic data which are processed
and of the duration of such processing for the purposes mentioned in
Article 6(2) and, prior to obtaining consent, for the purposes mentioned
in Article 6(3).
Article 6(5) provides that processing of traffic data, in accordance
with Article 6(1), (2), (3) and (4), must be restricted to persons acting
under the authority of providers of the public communications networks
and publicly available electronic communications services handling
billing or traffic management, customer enquiries, fraud detection, mar-
keting electronic communications services or providing a value added
service, and must be restricted to what is necessary for the purposes of
such activities.
Article 6(6) provides that Articles 6(1), (2), (3) and (5) shall apply
without prejudice to the possibility for competent bodies to be informed
of traffic data in conformity with applicable legislation with a view to
settling disputes, in particular interconnection or billing disputes.
Non-Itemised Billing
22.17 Article 7 relates to itemised billing. Article 7(1) provides that
subscribers shall have the right to receive non-itemised bills. Article 7(2)
provides that states shall apply national provisions in order to reconcile
the rights of subscribers receiving itemised bills with the right to pri-
vacy of calling users and called subscribers, for example by ensuring that
sufficient alternative privacy enhancing methods of communications or
payments are available to such users and subscribers.


Calling and Connected Line Identification


22.18 Article 8 relates to presentation and restriction of calling and
connected line identification. Article 8(1) provides that where presenta-
tion of calling line identification is offered, the service provider must
offer the calling user the possibility, using a simple means and free of
charge, of preventing the presentation of the calling line identification
on a per-call basis. The calling subscriber must have this possibility on
a per-line basis.
Article 8(2) provides that where presentation of calling line identifica-
tion is offered, the service provider must offer the called subscriber the
possibility, using a simple means and free of charge for reasonable use of
this function, of preventing the presentation of the calling line identifica-
tion of incoming calls.
Article 8(3) provides that where presentation of calling line identifica-
tion is offered and where the calling line identification is presented prior
to the call being established, the service provider must offer the called
subscriber the possibility, using a simple means, of rejecting incoming
calls where the presentation of the calling line identification has been
prevented by the calling user or subscriber.
Article 8(4) provides that where presentation of connected line iden-
tification is offered, the service provider must offer the called subscriber
the possibility, using a simple means and free of charge, of preventing
the presentation of the connected line identification to the calling user.
Article 8(5) provides that Article 8(1) shall also apply with regard to
calls to third countries originating in the EU. Paragraphs (2), (3) and (4)
shall also apply to incoming calls originating in third countries.
Article 8(6) provides that states shall ensure that where presentation
of calling and/or connected line identification is offered, the providers of
publicly available electronic communications services inform the public
thereof and of the possibilities set out in paras (1), (2), (3) and (4).
Location Data other than Traffic Data
22.19 Article 9 of the ePD relates to location data other than traffic
data. Article 9(1) provides that where location data other than traffic
data, relating to users or subscribers of public communications networks
or publicly available electronic communications services, can be pro-
cessed, such data may only be processed when they are made anony-
mous, or with the consent of the users or subscribers to the extent and
for the duration necessary for the provision of a value added service. The
service provider must inform the users or subscribers, prior to obtaining
their consent, of the type of location data other than traffic data which
will be processed, of the purposes and duration of the processing and


whether the data will be transmitted to a third party for the purpose of
providing the value added service. Users or subscribers shall be given
the possibility to withdraw their consent for the processing of location
data other than traffic data at any time.
This is increasingly important as more and more smart phones and
electronic devices permit the capture of location based data relating to
individuals and/or their personal equipment.
Article 9(2) provides that where consent of the users or subscribers
has been obtained for the processing of location data other than traffic
data, the user or subscriber must continue to have the possibility, using a
simple means and free of charge, of temporarily refusing the processing
of such data for each connection to the network or for each transmission
of a communication.
Article 9(3) provides that processing of location data other than traffic data in accordance with Articles 9(1) and (2) must be restricted to
persons acting under the authority of the provider of the public commu-
nications network or publicly available communications service or of the
third party providing the value added service, and must be restricted to
what is necessary for the purposes of providing the value added service.
Exceptions
22.20 Article 10 relates to exceptions.
Directories
22.21 Article 12 refers to directories of subscribers. Article 12(1) pro-
vides that states shall ensure that subscribers are informed, free of charge
and before they are included in the directory, about the purpose(s) of a
printed or electronic directory of subscribers available to the public or
obtainable through directory enquiry services, in which their personal
data can be included and of any further usage possibilities based on
search functions embedded in electronic versions of the directory.
Article 12(2) provides that states shall ensure that subscribers are
given the opportunity to determine whether their personal data are
included in a public directory, and if so, which, to the extent that such
data are relevant for the purpose of the directory as determined by the
provider of the directory, and to verify, correct or withdraw such data.
Not being included in a public subscriber directory, verifying, correcting
or withdrawing personal data from it shall be free of charge.
Article 12(3) provides that states may require that for any purpose of a
public directory other than the search of contact details of persons on the
basis of their name and, where necessary, a minimum of other identifiers,
additional consent be asked of the subscribers.


Article 12(4) provides that Articles 12(1) and (2) shall apply to subscribers who are natural persons. States shall also ensure, in the framework of EU law and applicable national legislation, that the legitimate interests of subscribers other than natural persons with regard to their entry in public directories are sufficiently protected.

Conclusion
22.22 Originally envisaged as relating to telecoms type data only,
this secondary aspect of the data protection regime has expanded in
substance, scope and detail. While much of it is still specific to telecoms
companies and entities involved in the transfer of electronic communica-
tions, certain issues are more generally applicable. The ePD as amended
applies to all organisations who wish to engage in direct marketing
through a variety of means. Compliance is necessary and needs to be
planned in advance. Unless specifically exempted from the default rule, it is difficult to envisage permissible direct marketing (DM).
The ePD and the above comments all need to be read in light of the
final version of the provisions formulated in the new ePrivacy Regulation,
as well as national regulations which may be repealed, amended or
updated as a result.

Chapter 23
Electronic Direct Marketing
and Spam

Introduction
23.01 Direct marketing (DM) tends to be one of the most contentious
areas of data protection practice. It also receives probably most attention,
with the possible exceptions of data breach/data loss and internet/social
media data protection issues.
Most organisations need to engage in DM at some stage, some more
heavily than others. Many organisations may even go so far as to say that
DM is an essential ingredient of continued commercial success.
However, DM is sometimes viewed as Spam and unsolicited commer-
cial communications which are unwanted and also unlawful. The data
protection regime (and eCommerce legal regime) refers to permissible
DM and sets out various obligatory requirements while at the same time
setting a default position of prohibiting non-exempted or non-permitted
electronic direct marketing.
The EU General Data Protection Regulation (GDPR) does not impose
additional obligations on natural or legal persons in relation to the pro-
cessing of personal data in connection with the provision of publicly
available electronic communications services in public communica-
tion networks in the EU in relation to matters for which they are sub-
ject to specific obligations with the same objective set out in Directive
2002/58/EC1 (Article 95 of the GDPR). There may be a review to ensure
that the ePrivacy Directive (ePD) fully complements and does not con-
flict with the GDPR, and if so, amendments may be required.

1 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).


The ePrivacy Regulation, once finalised, will also necessitate further changes and compliance.

Direct Marketing (DM)


23.02 If the organisation anticipates that personal data kept by it will
be processed for the purposes of DM, it must inform the persons to whom
the data relates that they may object by means of a request in writing to
the Controller and free of charge.2
Unsolicited Communications
23.03 Article 13 of the ePD refers to unsolicited communications.
Article 13(1) provides that the use of automated calling systems without
human intervention (automatic calling machines), facsimile machines
(fax) or electronic mail for the purposes of direct marketing may
only be allowed in respect of subscribers who have given their prior
consent.
Article 13(2) provides that notwithstanding Article 13(1), where a
natural or legal person obtains from its customers their electronic con-
tact details for email, in the context of the sale of a product or a service,
in accordance with the DPD, the same natural or legal person may use
these electronic contact details for direct marketing of its own similar
products or services provided that customers clearly and distinctly are
given the opportunity to object, free of charge and in an easy manner, to
such use of electronic contact details when they are collected and on the
occasion of each message in case the customer has not initially refused
such use.
Article 13(3) provides that states shall take appropriate measures to
ensure that, free of charge, unsolicited communications for purposes of
DM, in cases other than those referred to in Article 13(1) and (2), are
not allowed either without the consent of the subscribers concerned or
in respect of subscribers who do not wish to receive these communica-
tions, the choice between these options to be determined by national
legislation.
Article 13(4) provides that in any event, the practice of sending elec-
tronic mail for purposes of DM disguising or concealing the identity
of the sender on whose behalf the communication is made, or without

2 See, for example, L Edwards, ‘Consumer Privacy Law 1: Online Direct Marketing,’
and L Edwards and J Hatcher, ‘Consumer Privacy Law: Data Collection, Profiling and
Targeting,’ each in L Edwards and C Waelde, eds, Law and the Internet (Hart, 2009)
489 et seq, and 511 et seq respectively.


a valid address to which the recipient may send a request that such
communications cease, shall be prohibited.
Article 13(5) provides that Article 13(1) and (3) shall apply to subscribers who are natural persons. States shall also ensure that the legitimate interests of subscribers other than natural persons with regard to unsolicited communications are sufficiently protected.
Marketing Default Position
23.04 How should organisations go about data protection compliance when undertaking DM? When is DM permitted? All organisations should carefully assess compliance issues when considering any DM activities. Getting it wrong can be costly and can have ICO enforcement and investigation consequences, not to mention penalties. Indeed, a penalty of £440,000 was recently imposed in relation to spam DM (see ICO cases chart).
Article 13 of the ePD provides a number of rules in relation to unso-
licited communications. Article 13(1) provides that:
●● automated calling systems without human intervention (automatic
calling machines);
●● facsimile machines (fax); or
●● electronic mail,
for the purposes of DM may only be allowed in respect of subscribers
who have given their prior consent.
Therefore, there is a default rule prohibiting the forms of DM referred
to above without prior consent. Many marketing orientated organisations
may consider this a hindrance to what may have been considered legiti-
mate marketing and business activities.
Limited Direct Marketing Permitted
23.05 Limited direct marketing is permitted, namely, of subscribers or
customers who have given their prior consent. This implies consent in
advance of receiving the DM or simultaneous to the DM.
However, in terms of DM by email, this is further restricted.
Direct Marketing to Existing Customers’ Email
23.06 In the context of existing customers, there is a possibility to
DM using emails. Article 13(2) of the ePD provides that where an
organisation obtains from its customers their electronic contact details
for email, in the context of the sale of a product or a service, in accord-
ance with the DPD, the organisation may use these electronic contact
details for DM of its own similar products or services provided that cus-
tomers clearly and distinctly are given the opportunity to object, free of
charge and in an easy manner, to such use of electronic contact details


when they are collected and on the occasion of each message in case
the customer has not initially refused such use.
Therefore, once the email details are obtained at the time of a product
or service transaction, it will be possible to use that email for direct marketing purposes. Conditions or limitations apply, however. Firstly, the
organisation is only permitted to market and promote similar products
or services. This, therefore, rules out unrelated, non-identical and non-
similar products and services. Secondly, at the time of each subsequent
act of DM, the customer must be given the opportunity in an easy and
accessible manner to opt out or cancel the DM. Effectively, they must be
taken off the organisation’s DM list.
National Marketing Opt-out Registers
23.07 Article 13(3) of ePD provides that states shall take appropri-
ate measures to ensure that, free of charge, unsolicited communications
for purposes of direct marketing, in cases other than those referred to
in Article 13(1) and (2), are not allowed either without the consent of
the subscribers concerned or in respect of subscribers who do not wish
to receive these communications, the choice between these options to
be determined by national legislation. This means that each state must
determine and provide a means for individuals to opt-out of receiving
DM in advance.3
Many of the cases in which marketing-related fines are imposed by the ICO involve unsolicited marketing calls being made to telephone numbers registered on the Telephone Preference Service (TPS), whereby individuals can register their preference not to receive marketing calls. Examples of illegal TPS marketing call breaches include:
●● Superior Style Home Improvements – fined £150,000;4
●● Making it Easy – fined £160,000;5
●● Smart Home Protection – fined £90,000;6
●● Avalon Direct – fined £80,000;7
●● Solartech North East – fined £90,000;8
●● DM Design Bedrooms – fined £160,000;9
●● Secure Home Systems – fined £80,000;10

3 See ICO, Marketing, at https://ico.org.uk.


4 ICO v Superior Style Home Improvements Limited [2019] 17/9/2019.
5 ICO v Making it Easy Limited [2019] 2/8/2019.
6 ICO v Smart Home Protection Limited [2019] 13/6/2019.
7 ICO v Avalon Direct [2019] 16/4/2019.
8 ICO v Solartech North East Limited [2018] 23/11/2018.
9 ICO v DM Design Bedrooms [2018] 23/11/2018.
10 ICO v Secure Home Systems [2018] 31/10/2018.


●● ACT Response – fined £140,000;11


●● AMS Marketing – fined £100,000.12
Deceptive Emails: Marketing Emails Must Not Conceal
Identity
23.08 Article 13(4) of ePD provides that the practice of sending email
for purposes of DM disguising or concealing the identity of the sender on
whose behalf the communication is made shall be prohibited. This means
that organisations cannot conceal their identity if permitted to engage in
direct marketing. If these are not complied with, what might otherwise
be permissible DM can be deemed to be impermissible. Complaints,
investigations or enforcement proceedings can thus arise.
Marketing Emails Must Provide Opt-Out
23.09 In addition, Article 13(4) of ePD provides that the practice of
sending email for purposes of DM without a valid address to which the
recipient may send a request that such communications cease, shall be
prohibited. This means that organisations must also include an easy con-
tact address or other details at which the recipient can contact if they
wish to object to receiving any further DM. If these are not complied
with, what might otherwise be permissible DM can be deemed to be
impermissible. Complaints, investigations or enforcement proceedings
can thus also arise.
Marketing Protection for Organisations
23.10 Article 13(5) of the ePD provides that Article 13(1) and (3) shall
apply to subscribers who are natural persons. States shall also ensure, in
the framework of EU law and applicable national legislation, that the
legitimate interests of subscribers other than natural persons with regard
to unsolicited communications are sufficiently protected. This means that
protection from unsolicited DM can also be extended to organisations.

PECR
23.11 Detailed provisions governing direct marketing by electronic
communications are set out in the PECR (Privacy and Electronic
Communications (EC Directive) Regulations 200313), implementing the

11 ICO v ACT Response [2018] 30/10/2018.


12 ICO v AMS Marketing [2018] 1/8/2018.
13 SI 2003/2426.


ePD in the UK. PECR will likely be updated, amended and/or replaced
once the full extent of the ePrivacy Regulation is known.
It provides rules in relation to automated calling machines; fax; email; unsolicited calls by automated calling machine or fax; unsolicited telephone calls; disguising or concealing identity; contact addresses; opt in/opt out; and the ‘soft opt in.’
PECR is also interesting in that it applies to both legal and natural
persons. Generally, rights are not recognised for organisations in the data
protection regime.
The PECR refers to the implementation of the rules regarding electronic communications and direct marketing (DM). Regulations 22
and 23 refer to email marketing.14
Regulation 22 of the PECR provides as follows:
‘(1) This regulation applies to the transmission of unsolicited communica-
tions by means of electronic mail to individual subscribers.
(2) Except in the circumstances referred to in paragraph (3), a person shall
neither transmit, nor instigate the transmission of, unsolicited commu-
nications for the purposes of direct marketing by means of electronic
mail unless the recipient of the electronic mail has previously notified the
sender that he consents for the time being to such communications being
sent by, or at the instigation of, the sender.
(3) A person may send or instigate the sending of electronic mail for the
purposes of direct marketing where:
(a) that person has obtained the contact details of the recipient of that
electronic mail in the course of the sale or negotiations for the sale
of a product or service to that recipient;
(b) the direct marketing is in respect of that person’s similar products
and services only; and
(c) the recipient has been given a simple means of refusing (free of
charge except for the costs of the transmission of the refusal) the
use of his contact details for the purposes of such direct marketing,
at the time that the details were initially collected, and, where he did
not initially refuse the use of the details, at the time of each subse-
quent communication.
(4) A subscriber shall not permit his line to be used in contravention of
paragraph (2).’
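The Regulation 22 conditions above amount to a simple decision procedure: a default prohibition on unsolicited marketing email, lifted either by prior consent (reg 22(2)) or by the ‘soft opt in’ exception (reg 22(3)). A purely illustrative sketch of that logic follows; it is not legal advice, and every class, field and function name here is hypothetical:

```python
# Purely illustrative sketch of the Regulation 22 logic - not legal advice.
# All class, field and function names are hypothetical.
from dataclasses import dataclass


@dataclass
class Recipient:
    prior_consent: bool          # reg 22(2): has previously notified consent
    details_from_sale: bool      # reg 22(3)(a): details obtained in course of sale/negotiations
    similar_products_only: bool  # reg 22(3)(b): marketing the sender's similar products/services only
    opt_out_offered: bool        # reg 22(3)(c): simple, free refusal offered at collection and in each message
    has_opted_out: bool          # recipient has exercised that refusal


def may_send_marketing_email(r: Recipient) -> bool:
    """Default prohibition; sending is permitted only with prior consent,
    or where every limb of the 'soft opt in' exception is satisfied."""
    if r.prior_consent:
        return True
    return (r.details_from_sale and r.similar_products_only
            and r.opt_out_offered and not r.has_opted_out)
```

On this sketch, a past customer who was offered a free and simple opt-out and has not used it may be emailed about similar products or services only; anyone else requires prior consent.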

Regulation 23 of the PECR provides as follows:


‘A person shall neither transmit, nor instigate the transmission of, a communi-
cation for the purposes of direct marketing by means of electronic mail:
(a) where the identity of the person on whose behalf the communication has
been sent has been disguised or concealed; or

14 Privacy and Electronic Communications (EC Directive) Regulations 2003, at http://


www.legislation.gov.uk/uksi/2003/2426/contents/made.


(b) where a valid address to which the recipient of the communication may
send a request that such communications cease has not been provided.’

Regulation 24 of the PECR is also relevant in that it provides the information which must be furnished with the direct marketing so that the
recipient may contact the organisation to opt-out, namely:
●● the name of the organisation; and
●● either the address of the person or a telephone number on which he
can be reached free of charge.
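By way of illustration only, the Regulation 23 and 24 information requirements (sender identified, free-of-charge contact details, and a valid address for opt-out requests) could be carried in a standard message footer. The organisation details and helper name below are invented:

```python
# Hypothetical illustration of including the reg 23/24 information in a
# marketing message: sender identified, free contact details, opt-out address.
def marketing_footer(org_name: str, address: str, freephone: str,
                     opt_out_email: str) -> str:
    """Build a message footer identifying the sender and giving a valid
    address to which the recipient may send a request that messages cease."""
    return (f"Sent by {org_name}, {address}.\n"
            f"Call us free on {freephone}.\n"
            f"To stop receiving these messages, email {opt_out_email}.")
```

A footer built this way surfaces, in every message, the details a recipient needs in order to identify the sender and object to further DM.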
Regulation 19 of the PECR prohibits the use of automated calling
machines for the purposes of direct marketing.
The use of fax machines for the purposes of direct marketing is
restricted in reg 20 of the PECR.
Regulation 21 of the PECR provides restrictions in relation to direct
marketing by unsolicited telephone calls. An organisation cannot under-
take such direct marketing if the called line is previously notified that
such calls should not be made to that line; or if the called line is listed on
an opt out register (referred to under reg 26 of the PECR).
It is expected that once the EU ePrivacy rules are finalised, the PECR
rules in the UK will be amended or replaced.
ICO PECR Guidance
23.12 The ICO guidance states as follows:
‘The Regulations define electronic mail as “any text, voice, sound, or image
message sent over a public electronic communications network which can
be stored in the network or in the recipient’s terminal equipment until it is
collected by the recipient and includes messages sent using a short message
service” (Regulation 2 “Interpretation” applies).

In other words, email, text, picture and video marketing messages are all
considered to be “electronic mail”. Marketing transmitted in WAP messages
is considered to be “electronic mail”. WAP Push allows a sender to send a
specially formatted SMS message to a handset which, when received, allows
a recipient through a single click to access and view content stored online,
through the browser on the handset.

We consider this rule also applies to voicemail and answerphone messages left by marketers making marketing calls that would otherwise be “live”.
So there are stricter obligations placed on you if you make live calls but then
wish to leave messages on a person’s voicemail or answerphone.

Faxes are not considered to be “electronic mail”. Fax marketing is covered elsewhere in the Regulations. These regulations also do not cover so-called


silent calls or calls where a fax or other electronic signal is transmitted; this is
because no marketing material is transmitted during these calls.’15

It adds that:
‘This is what the law requires:
●● You cannot transmit, or instigate the transmission of, unsolicited market-
ing material by electronic mail to an individual subscriber unless they
have previously notified you, the sender, that they consent, for the time
being, to receiving such communications. There is an exception to this
rule which has been widely referred to as the soft opt in (Regulation 22 (2)
refers).
●● You cannot transmit, or instigate the transmission of, any marketing
by electronic mail (whether solicited or unsolicited) to any subscriber
(whether corporate or individual) where:
○○ Your identity has been disguised or concealed; or
○○ You have not provided a valid address to which the recipient can send an opt-out request.
○○ That electronic mail would contravene regulations 7 or 8 of the Electronic Commerce (EC Directive) Regulations 2002 (SI 2002/2013); or
○○ That electronic mail encourages recipients to visit websites which contravene those regulations (Regulation 23 refers).
●● A subscriber must not allow their line to be used to breach Regulation 22 (2)
(Regulation 22 (4) refers).’16

Offences by Direct Marketers under PECR


23.13 Offences arise in relation to activities referred to under PECR.
These include:
●● sending unsolicited marketing messages to individuals by fax, SMS,
email or automated dialling machine;
●● sending unsolicited marketing by fax, SMS, email or automated
dialling machine to a business if it has objected to the receipt of such
messages;
●● marketing by telephone where the subscriber has objected to the
receipt of such calls;
●● failing to identify the caller or sender or failing to provide a physical
address or a return email address;
●● failing to give customers the possibility of objecting to future email
and SMS marketing messages with each message sent;

15 ICO, Electronic Mail, at https://ico.org.uk.


16 See above.


●● concealing the identity of the sender on whose behalf the marketing


communication was made.
Fines can be significant. Organisations can be fined up to £500,000 by
the ICO for unwanted marketing phone calls and emails in accordance
with the updated and amended PECR. It is also envisaged that fines will
increase significantly under the GDPR and ePrivacy Regulation.

ICO Monetary Penalties


23.14 The ICO was also empowered under the Criminal Justice and
Immigration Act 2008 (CJIA 2008) to impose monetary penalties. Section 77 of the CJIA 2008 provides the power to alter the penalty for unlawfully obtaining etc personal data. Fines and penalties for electronic marketing related offences are being increased under the GDPR
and ePrivacy Regulation.

Civil Sanctions
23.15 Individual Data Subjects can also sue for compensation under
the data protection rules. Where a person suffers damage as a result of a
failure by a Controller or Processor to meet their data protection obliga-
tions, then the Controller or Processor may be subject to civil sanctions
by the person affected. Damage suffered by a Data Subject will include
damage to reputation, financial loss and mental distress.
This will vary from organisation to organisation, sector to sector, the
type of personal data involved, the risks of damage, loss, etc, the nature
of the security risks, the security measures and procedures adopted, the
history of risk, loss and damage in the organisation and sector.
Certain types of data will convey inherent additional risks over others,
such as loss of financial personal data. This can be argued to require
higher obligations for the organisation.
One interesting area to consider going forward is online damage,
such as viral abuse, publication, defamation, bulletin boards, discussion
forums and websites (or sections of websites), and social media web-
sites. Where damage occurs as a result of misuse or loss of personal data
or results in defamation, abuse and threats, liability could arise for the
individual tortfeasors as well as the website.
While there are eCommerce defences in the eCommerce Directive,
one should recall that the data protection regime (and its duty of care and
liability provisions) are separate and stand alone from the ­eCommerce
Directive legal regime. Indeed, even in terms of the eCommerce


defences one should also recall that (a) an organisation must first fall within an eCommerce defence, and not have lost that defence, in order to avail of it; and (b) there is no automatic entitlement of an internet service provider (ISP) or website to a global eCommerce defence, as in fact there is not one eCommerce defence but three specific defences relating to specific and technical activities. Not every ISP activity will fall into one of these defences. Neither will one activity fall into all three defences.
It is also possible to conceive of a website which has no take down
procedures, inadequate take down defences, or non-expeditious take
down procedures or remedies, and which will face potential liability
under privacy and data protection as well as eCommerce liability. For
example, an imposter social media profile which contains abuse, personal data and defamatory material could attract liability for the website operator under data protection, and under normal liability if none of the eCommerce defences were available or if they were lost. The latter could occur if, for example, the false impersonating profile was notified to the website (or it was otherwise aware) but it did not do anything.17
PECR implemented ePD (also known as the ePrivacy Directive)
regarding the protection of privacy in the electronic communications
sector. In 2009 ePD was amended by Directive 2009/136/EC. This
included changes to Article 5(3) of ePD requiring consent for the storage
or access to information stored on a subscriber’s or user’s terminal equipment, ie a requirement for organisations to obtain consent for cookies and
similar technologies.18 The ePrivacy Regulation will add/import updates
and replacements to this area which organisations will need to incorporate and comply with.

Call and Fax Opt Out Registers


23.16 The Telephone Preference Service (TPS) and Fax Preference
Service (FPS), operated by the Direct Marketing Association, are reg-
isters that allow people to register their numbers and their preference to
opt out of receiving unsolicited calls or faxes. Therefore, an organisation proposing to use calls or faxes for DM must first consult these registers to ensure that no contact details are used which are already registered on these registers.
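This screening step can be pictured as a simple filter over the proposed contact list. The sketch below is illustrative only: in practice TPS/FPS data is obtained through licensed access, and the function name and register contents here are invented:

```python
# Illustrative only: screen a proposed call list against opt-out registers
# before any telephone DM campaign. Register contents here are invented;
# real TPS/FPS data must be obtained through the proper licensed channels.
def screen_call_list(call_list, tps_register, internal_objections):
    """Return only the numbers that are neither on the opt-out register
    nor among numbers that have previously objected directly."""
    blocked = set(tps_register) | set(internal_objections)
    return [number for number in call_list if number not in blocked]
```

For example, screening `["0201", "0202", "0203"]` against a register containing `"0202"` and an internal objection list containing `"0203"` leaves only `"0201"` as contactable.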

17 This is a complex and developing area of law, common law, civil law, GDPR, forth-
coming ePrivacy Regulation and case law, both in the UK and internationally. A full
detailed analysis is beyond this current work.
18 The same comments apply, see above.


The Spam Problem


23.17 Spam is just one of a number of names taken to describe the
problem of unsolicited electronic commercial marketing materials.
The Spam name comes originally from a Monty Python comic sketch.
However, electronic Spam is far from comic and costs industry hundreds
of millions of pounds each year in employee lost time and resources, in
lost bandwidth, in reduced capacity and network speed, as well as other
problems.
Spam Internationally
23.18 The growing recognition of the problems caused by Spam, such
as scams, viruses, lost productivity and bandwidth, has meant an increas-
ing number of local and national laws specifically dedicated to prevent-
ing Spam.
In the US, for example, there are a large number of local state laws19 and
national federal laws dedicated to tackling Spam. These include both
new specific laws and Spam specific amendments to pre-existing laws.
A US federal Spam act known as the CAN-SPAM Act was introduced.20

Related Issues
23.19 Certain related issues also arise, which are beyond detailed
analysis presently but which may be used for specific organisations to
consider further.
One example is the increasingly controversial area of profiling, adver-
tising and direct marketing in relation to children.21
Online behavioural advertising (OBA) and the behavioural targeting
of internet advertising is increasingly debated.22
Commentators, and media, often focus on the issue of threats to pri-
vacy, data protection and reputation rights caused by web 2.0 activities
such as social media, search engine services, etc. The question arises as to whether revenue or privacy is better respected by certain online service providers.23

19 See www.spamlaws.com.
20 See VJ Reid, ‘Recent Developments in Private Enforcement of the Can-Spam Act,’
Akron Intellectual Property Journal (2010) (4) 281–307.
21 U Munukutla-Parker, ‘Unsolicited Commercial E-mail, Privacy Concerns Related
to Social Network Services, Online Protection of Children, and Cyberbullying,’ I/S:
A Journal of Law and Policy (2006) (2) 628–650.
22 Deane-Johns, ‘Behavioural Targeting of Internet Advertising,’ Computers and Law
(2009) (20) 22.
23 L Edwards and C Waelde, eds, Law and the Internet (Oxford: Hart, 2009) 539.


Cases
23.20 There are many official cases, prosecutions and fines in relation
to Spam. Some of the largest fines from the ICO have been in relation to
Spam and unsolicited marketing (in addition to the growing area of data
breaches).

Conclusion
23.21 Few organisations will not be interested in DM and advertising. The key is to get it right. The consequences of sending unlawful electronic communications can amount to offences, prosecutions, official enforcement and investigations, as well as being sued or prosecuted. This is consistently an area of focus for ICO investigation. These issues become more important as: (a) increased
competition puts pressure on organisations to engage, or increase, mar-
keting and profiling efforts; and (b) the software, devices and technolo-
gies available to facilitate such efforts are increasing in significance.
Over time, cost efforts also decrease in terms of adopting these new
tools. As always, however, organisations are cautioned to ensure data
protection compliance. They might also, on occasion, pause to question general ethical considerations in certain instances. Depending on the sector, there may also be an obligation to consider and comply with
certain third party organisational rules, particularly as regards direct
marketing issues. Indeed, the new regime also provides the impetus
for representative organisations to increasingly consider industry-wide
data protection Codes of Conduct. These are encouraged by the ICO
and the new data protection regime. It is also important to look out for
the changes that may occur as a result of the ePrivacy Regulation once
available.

Part 4
New UK Regime

Chapter 24
Background to the New UK
Regime

Introduction
24.01 The UK data protection regime is fundamentally changed. It
should not be viewed as a big bang development, however. Rather, as the
ICO has pointed out, it is an ‘evolution’ not a ‘revolution.’1 The previous
data protection regime was over 20 years’ old and required substantial
redevelopment and updating.

Brexit, DPA, and EU


24.02 ‘The [new Act] will bring the European Union’s [GDPR] into
UK law, helping Britain prepare for a successful Brexit.’2 This is only
partly correct, as the Data Protection Act 2018 (DPA 2018) only partly
implements the GDPR.
On the basis that Brexit occurs as presently scheduled, data protection
is one of the more significant Brexit transition period negotiation issues.
Indeed, it is also one of the more significant post-Brexit issues given the commercial significance of maintaining, or restoring, the ability to make commercial data transfers from the EU into the UK. In order to trade with the EU and for EU personal data to be transferred to the UK (as is required in many trade, banking, service, travel and education examples), the UK must deal with the default transfer ban. EU personal

1 Steve Wood, Deputy Commissioner (Policy), ‘GDPR is an Evolution in Data Protec-


tion, Not a Burdensome Revolution,’ ICO, 25 August 2017.
2 ‘A New Law Will Ensure That the United Kingdom Retains Its World-Class Regime
Protecting Personal Data.’ The Queen’s Speech and Associated Background Briefing,
on the Occasion of the Opening of Parliament on Wednesday 21 June 2017 (Prime
Minister’s Office, 10 Downing Street, London, SW1A 2AA) (21 June 2017).


data may only be transferred to countries and/or entities deemed to have data security and data protection standards equivalent to those of the EU data protection regime. With that in mind, it is obvious that the UK wishes to continue to maintain an equivalent EU standard for the protection, collection, use and storage of personal data. Indeed, the government has indicated that it wishes to go even further than the GDPR standards.3
Julian David, the CEO of techUK, states:
‘techUK supports the aim of a Data Protection [Act] that implements GDPR
in full, puts the UK in a strong position to secure unhindered data flows once
it has left the EU, and gives businesses the clarity they need about their new
obligations.’4

Queen’s Speech
24.03 The 2017 Queen’s Speech5 forecast the Data Protection Act 2018
(DPA 2018) changes as follows:
‘A new law [to] ensure that the United Kingdom retains its world-class regime
protecting personal data, and … a new digital charter … to ensure that the
United Kingdom is the safest place to be online.’

Background Briefing Document


24.04 An official government document6 also states that the main purpose is ‘Make our data protection framework suitable for our new digital
age, allowing citizens to better control their data.’ The main benefits are:
●● ‘to meet the manifesto commitments to give people new rights to
“require major social media platforms to delete information held

3 ‘UK’s proposed Data Protection Bill looks to go further than GDPR,’ Information-
Age.com, 8 August 2017. This is also interesting from the perspective of some stand-
ards exceeding the GDPR, but other standards which might be argued to be less than
the GDPR level. Even if there are one or two aspects which are below the GDPR level,
that is very problematic to an adequacy decision application.
4 ‘A New Law Will Ensure That the United Kingdom Retains Its World-Class Regime
Protecting Personal Data.’ The Queen’s Speech and Associated Background Briefing,
on the Occasion of the Opening of Parliament on Wednesday 21 June 2017 (Prime
Minister’s Office, 10 Downing Street, London, SW1A 2AA) (21 June 2017).
5 Queen’s Speech 2017, 21 June 2017.
6 ‘A New Law Will Ensure That the United Kingdom Retains Its World-Class Regime
Protecting Personal Data.’ The Queen’s Speech and Associated Background Briefing,
on the Occasion of the Opening of Parliament on Wednesday 21 June 2017 (Prime
Minister’s Office, 10 Downing Street, London, SW1A 2AA) (21 June 2017).


about them at the age of 18” … and to “bring forward a new data
protection law”.
●● To ensure that our data protection framework is suitable for our new
digital age, and cement the UK’s position at the forefront of tech-
nological innovation, international data sharing and protection of
personal data.
●● To allow police and judicial authorities to continue to exchange
information quickly and easily with our international partners in the
fight against terrorism and other serious crimes.
●● To implement the [GDPR] and the new Directive which applies to
law enforcement data processing, meeting our obligations while we
remain an EU member state and helping to put the UK in the best
position to maintain our ability to share data with other EU member
states and internationally after we leave the EU.’
The main elements of the DPA 2018 are described as:
●● ‘To establish a new data protection regime for non-law enforcement
data processing … The new rules strengthen rights and empower
individuals to have more control over their personal data, including
a right to be forgotten when individuals no longer want their data
to be processed, provided that there are no legitimate grounds for
retaining it.
●● To modernise and update the regime for data processing by law
enforcement agencies. The regime will cover both domestic pro-
cessing and cross-border transfers of personal data.
●● To update the powers and sanctions available to the [ICO].’

Digital Charter and Internet Safety


24.05 The official background briefing document also refers to a
Digital Charter and issues of online safety. It states ‘proposals for a new
digital charter will be brought forward to ensure that the United Kingdom
is the safest place to be online.’ More specifically it states:
●● ‘We will develop a Digital Charter that will create a new frame-
work which balances users’ and businesses’ freedom and security
online.
●● The Charter will have two core objectives: making the UK the best
place to start and run a digital business and the safest place in the
world to be online.7

7 Emphasis added.


●● We will work with technology companies, charities, communities
and international partners to develop the Charter; and we will make
sure it is underpinned by an effective regulatory framework.
●● We are optimistic about the opportunities on offer in the digital age,
but we understand these opportunities come with new challenges
and threats – to our security, privacy, emotional wellbeing, men-
tal health and the safety of our children. We will respond to these
challenges, assuring security and fairness in the new digital age and
strengthening the UK’s position as one of the world’s leading digital
economies.
●● We strongly support a free and open internet. But, as in the offline
world, freedoms online must be balanced with protections to ensure
citizens are protected from the potential harms of the digital world.
We will not shy away from tackling harmful behaviours and harmful
content online – be that extremist, abusive or harmful to children.
And we will make sure that technology companies do more to pro-
tect their users and improve safety online.
●● Many of these challenges are of an international nature, so we will
open discussions with other like-minded democracies and work with
them to develop a shared approach. The Prime Minister has already
started this process, securing an agreement with G7 countries to
strengthen their work with tech companies on this vital agenda.
●● Britain’s future prosperity will be built on our technical capability
and creative flair. Through our Modern Industrial Strategy and digi-
tal strategy, we will help digital companies at every stage of their
growth, including by supporting access to the finance, talent and
infrastructure needed for success and by making it easier for compa-
nies and consumers to do business online.’

The Ministerial Statement


24.06 A Ministerial Statement8 states individuals will ‘have more con-
trol over their personal data and be better protected in the digital age
under [the] new measures.’ The DPA 2018 is indicated to:
●● ensure the public ‘have greater control over personal data – including
right to be forgotten’;
●● ensure a ‘[n]ew right to require social media platforms to delete
information on children and adults when asked.’

8 ‘Government to Strengthen UK Data Protection Law,’ Department for Digital, Culture,
Media & Sport and The Rt Hon Matt Hancock MP, 7 August 2017.


In terms of consent issues it states:


‘The reliance on default opt-out or pre-selected “tick boxes”, which are
largely ignored, to give consent for organisations to collect personal data will
also become a thing of the past.’

It adds that:
‘Businesses will be supported to ensure they are able to manage and secure
data properly. The data protection regulator, the [ICO], will also be given
more power to defend consumer interests and issue higher fines, of up to
£17 million or 4 per cent of global turnover, in cases of the most serious data
breaches.’

The Minister of State for Digital states the DPA 2018 is:
‘designed to support businesses in their use of data, and give consumers the
confidence that their data is protected and those who misuse it will be held to
account.

The new Data Protection [Act] … give[s] us one of the most robust, yet
dynamic, set of data laws in the world. The [DPA 2018] … give[s] people
more control over their data, require[s] more consent for its use, and prepare[s]
Britain for Brexit. We have some of the best data science in the world and this
new law will help it to thrive.’

In addition, it states that the new data protection regime:


●● makes it simpler for individuals to withdraw consent for the use of
their personal data;
●● allows individuals to ask for their personal data held by companies
to be erased;
●● enables parents and guardians to give consent for their child’s data
to be used;
●● requires ‘explicit’ consent to be necessary for processing sensitive
personal data;
●● expands the definition of ‘personal data’ to include IP addresses,
internet cookies and DNA;
●● updates and strengthens data protection law to reflect the changing
nature and scope of the digital economy;
●● makes it easier and free for individuals to require an organisation to
disclose the personal data it holds on them (access issues);
●● makes it easier for customers to move data between service provid-
ers (data portability).
New criminal offences ‘are created to deter organisations from either
intentionally or recklessly creating situations where someone could
be identified from anonymised data.’ Re-identification issues are


increasingly complex and problematic. This is particularly important as
the concept of anonymisation has been seen as one of the tools available
to data protection, data life cycles, security and safety.
The new data protection rules ‘are made clearer for those who handle
data but they will be made more accountable for the data they process
with the priority on personal privacy rights.’ Also,
‘Those organisations carrying out high-risk data processing will be obliged to
carry out impact assessments to understand the risks involved.’

‘Final’ Brexit Negotiations in Transition Period


24.07 As a result of the 23 June 2016 referendum on the UK’s
membership of the EU, the UK Government have negotiated the form
that ‘Brexit’ will take. The important interim exit deal ultimately
resulted in the passing of the EU (Withdrawal Agreement) Act 2020 in
January 2020.
Notwithstanding the interim negotiations being finalised and the EU
(Withdrawal Agreement) Act 2020 being passed, the possibility of a
so-called ‘no deal Brexit’ remains in part as there is still no final deal
with the EU as to what the post Brexit relationship will look like. These
negotiations are continuing and the ‘final’ deal is scheduled to be final-
ised on or before 31 December 2020.
A final deal is not guaranteed, and is by no means certain. The Bank of
England, for example, advised lenders in June to ramp up preparations
for a no-deal Brexit.9
However, regardless of the exit strategy, there are undoubtedly many
areas of EU law that will continue to have an effect on UK business,
for example, because they remain part of UK law itself, because
compliance and/or equivalence is necessary in order to do business
effectively with EU Member States, or because of the inherently
cross-jurisdictional nature of the internet.
The 2017 Queen’s Speech proposed, amongst other legislation, a
European Union (Withdrawal) Bill, a Trade Bill, a Customs Bill and
a Data Protection Bill – all of which may have implications in the
ecommerce/Brexit field.
A draft Brexit Agreement was issued on 14 November 2018, was
approved by the Cabinet, and by EU Member States on 25 November
2018. Parliament approval was ultimately obtained after a general
election and the formation of a new government. (The initial attempt

9 See W Schomberg, ‘Brexit: Bank of England Tells Lenders to Ramp Up Preparations
For No Deal Brexit’ Independent, 3 June 2020.

328
‘Final’ Brexit Negotiations in Transition Period 24.07

to obtain parliamentary approval was shelved in late 2018.) Brexit


Day (also referred to as Exit Day in the European Union (Withdrawal)
Act 2018) (having been triggered per the Article 50 procedure) was orig-
inally 29 March 2019 and was then delayed.
The current position is to have a transition period from Brexit Day
to 31 December 2020. Potentially this may be extended. Indeed, there
have been some statements from the EU side indicating that this may be
necessary given the Covid-19 pandemic.
Various negotiations on a wide range of topics are required during the
transition period. Ultimately, on or prior to 31 December 2020 there is
meant to be a final Withdrawal Treaty. This will govern the contours of
the future relationship between the UK and the EU.
On or before the conclusion of the Withdrawal Treaty, various new
national laws were expected, catering for the new arrangements (and for
equivalency between UK and EU legal norms).
As regards data protection, and in particular catering for ongoing data
transfers, an Adequacy Decision from the EU as regards permitting data
transfers on the basis of UK law being equivalent in terms of personal
data is ideally required. Recall that the Data Protection Act 2018 did not
implement the General Data Protection Regulation 2016/679 into the
UK. While the official position is progressing in terms of a final deal or
treaty.
Prior to the transition period, there was also increased focus on seek-
ing to provide official guidance in terms of the possibility of a no deal
Brexit, as that was a real possibility at that stage.
However, it should be noted that there is also a possibility of a no deal
Brexit if the transition period does not result in a final deal being agreed.
So the previous advice, opinions and documentation regarding a no deal
Brexit prior to the transition period and prior to the EU (Withdrawal
Agreement) Act 2020 remain relevant.
While contacts between the UK and EU are continuing during the
transition period, it is also noted that there is still an element of hard
positions, brinkmanship and diplomacy. In reality, there is no ‘final’ deal
or treaty until there is a ‘final’ deal and final treaty.
It (currently) remains necessary for organisations, both inside and
outside the UK, to consider the potential of a no deal Brexit. Indeed, a
number of external data protection supervisory authorities have sought
to address demand for guidance from parties outside of the UK who may
currently be dealing with entities inside the UK. Organisations inside the
UK have had to anticipate some of the queries that external partners may
have to ask, and plan and respond accordingly.
The issue of Brexit generally, and in particular what will happen in
terms of data protection and data transfers, is an obvious concern for


commercial entities and their advisors. While there are, on the one
hand, potential problems for data transfers in the event of a no deal
Brexit, there are also a number of problem issues even in a deal situ-
ation. Unfortunately, concerns in relation to the possibility of a ‘final’
no deal Brexit, or further extension(s), all lead to uncertainty and
risk.

Brexit Guides
24.08 The ICO issued the following no deal guidance in late 2018:
●● Announcements, descriptions and statements on No Deal Brexit;
●● Six Steps to Take guide on No Deal Brexit;
●● ‘Broader’ guidance document on leaving EU if no withdrawal agree-
ment; and
●● Overview FAQs guidance on No Deal Brexit.
There is also new content and commentary on data protection, transfers
and:
●● the European Union (Withdrawal) Act 2018;
●● Department update.
There is also updated guidance from the ICO, partly in light of the earlier
enhanced possibility of a no deal Brexit arising.
The ICO has issued a number of additional guidance documents to
assist organisations to prepare for the possibility of a no deal Brexit.
These began with a press release on 13 December 2018, and related
statements and electronic update announcements. The release acknowl-
edges that:
‘[w]hile the basis on which the UK will leave the EU has still to be decided,
the ICO has today published new guidance and practical tools to help organi-
sations understand the implications in the event of a no deal.’

The set of ICO documentation includes:


●● Announcements and descriptions (press statement, blog post, etc);
●● Six Steps to Take guide;
●● ‘Broader’ guidance document on leaving EU if no withdrawal agree-
ment; and
●● Overview FAQs guidance.
An ICO blog post is more detailed than the press release. It notes that
the ‘Government has made clear that the [GDPR] will be absorbed into
UK law at the point of exit, so there will be no substantive change.’


However, readers will note that the DPA 2018 does not absorb or implement
the GDPR. The GDPR is (apparently) currently directly effective
during the transition. It would assist to understand what date an official
application for an adequacy decision starts on, and what date examination
may commence.
The blog also suggests that the ‘two-way free flow of personal information
will no longer be the case if the UK leaves the EU without a withdrawal
agreement that specifically provides [and hopefully expressly so
provides] for the continued flow of personal data.’ It may also be that an
official EU Adequacy Decision is needed. However, issues arise as to
when an application for an adequacy decision can be made, and when
the formal review process can commence. Can it be during the transition
period between Brexit Day and Exit Day, or must it wait until Exit Day?
More official certainty would be welcome.
The ICO blog advises organisations to take ‘precautionary prepa-
rations’ to ‘ensure these data flows continue’ in the event of no deal.
However, there remains uncertainty even in the event of a deal. The offi-
cial advice is to ‘carefully consider alternative transfer mechanisms to
maintain data flows.’
One of these is the standard contractual clauses route. It is indicated
that there will be further official guidance and assistance in relation
to these contracts. Another route, generally reserved for multinational
organisations, is approved binding corporate rules. These take time.
However, it is suggested in the guidance that existing binding corporate
rules may need review.
‘Six Steps to Take’ ICO Guide
24.09 The ICO has also issued a six-step assistance guide. It states that
‘[i]f you operate in the EEA, you may need to comply with both the UK
data protection regime and the EU regime after the UK exits the EU. You
may also need to appoint a representative in the EEA.’ The six steps for
‘Leaving the EU’ are:
1. continue to comply;
2. transfers to the UK;
3. transfers from the UK;
4. European operations;
5. documentation;
6. organisational awareness.
Point 1 (Continue to comply) states, inter alia, that:
‘The [DPA 2018] will remain in place. The government intends to bring the
GDPR directly into UK law on exit, to sit alongside it. There will be some
technical adjustments to the UK version of the GDPR so that it works in a


UK-only context – for example, amending provisions referring to EU law and
enforcement cooperation.’

Point 2 (transfers to the UK) states, inter alia, that organisations need to:
‘Review your data flows and identify where you receive data from the EEA,
including from suppliers and processors. Think about what GDPR safeguards
you can put in place to ensure that data can continue to flow once we are
outside the EU.’

It continues that this ‘means the sender needs to make sure there are ade-
quate safeguards in place, or one of the exceptions listed in the GDPR.’
It also refers to the importance of the adequacy decision issue. It states
that:
‘If the EU makes a formal adequacy decision that the UK regime offers an
adequate level of protection, there will be no need for specific safeguards.
However, on exit date there may not be such a decision in place. So you
should plan to implement adequate safeguards.’

Again, while useful, there is no confirmation of when the GDPR direct
effect ends; nor of when adequacy can be applied for, nor of when the
review process can commence. Readers will again note that a normal
adequacy review process can take years – and that is even in the event of
relatively fixed and static laws, not a dynamic, large-scale, law change
environment such as now presents itself.
The ICO guidance advises that ‘[y]ou may want to consider putting
standard contractual clauses (SCCs) in place if you are receiving data
from the EEA.’ Further ICO assistance is also expected.
Point 3 (transfers from the UK) refers to transfers from the UK to the
EU; and also to transfers from the UK to countries outside the European
Economic Area (EEA). In terms of the latter, the guidance states, inter alia,
‘[w]e expect the UK government to confirm that the UK will reflect
existing EU adequacy decisions, approved EU SCCs and BCRs.’ It then
cross-refers to the more detailed guidance. Readers will note that this is
descriptive of some future governmental and legal decisions and rules.
It is as yet unclear specifically what this refers to, or when.
Point 4 (European operations) states, inter alia, that ‘[i]f you operate
across Europe, you should review your structure, processing operations
and data flows to assess how the UK’s exit from the EU will affect the
data protection regimes that apply to you.’
It also refers to data protection regimes and reiterates that organi-
sations may need to comply with the EU and the UK data protection
regimes, highlighting the need for dual compliance exercises. Issues
of branches and establishments within the post-Brexit EU need to be


considered. A further consideration is having to deal with the UK and


one or more respective data protection authorities.
It refers also to lead authority and one-stop-shop, and states, inter
alia, that ‘[i]f the UK is currently your lead [data protection] supervisory
authority, you should review the structure of your European operations
to assess whether you will continue to be able to have a lead authority
and benefit from one-stop-shop.’
The need to appoint a representative located in the EU is also
highlighted.
Point 5 (documentation) highlights the need to review the organisa-
tion’s locations, documentation, flows, processes and relationships. This
applies inward facing and outward facing. Once changes occur there will
be additional changes, such as identifying and differentiating EU and
UK law references in documentation, policies, notices, etc.
Point 6 (organisational awareness) reiterates the need for training
and internal awareness raising processes. This is needed generally, but
also needs to be utilised as Brexit-related data changes arise, as well
as engagement and consultation to identify these issues in advance and
make appropriate preparations. Reference is also made to the need for
organisations to have, and to update, their risk register.
‘Broader’ ICO Guidance
24.10 The ICO also issued a broader Brexit transfer guidance. It high-
lights that the (various) guidance is particularly relevant to organisations
which:
●● operate in the EEA, which includes the EU; or
●● send personal data outside the UK; or
●● receive personal data from the EEA.
It is also indicated to be relevant where the following apply to the
organisation:
●● the Privacy and Electronic Communications (EC Directive)
Regulations 2003 (SI 2003/2426) (PECR);
●● the Network and Information Systems Regulations 2018
(SI 2018/506) (NIS); or
●● EU Regulation 910/2014 on electronic identification and trust
­services for electronic transactions in the internal market (eIDAS).
The guidance refers to the DPA 2018. It points out that it came into force
at the same time as the GDPR, and covers four data protection regimes,
namely:
●● Part 2, Chapter 2: General processing – the GDPR – ‘this chapter
supplements the GDPR so that it operates in a UK context’;


●● Part 2, Chapter 3: Other general processing – ‘this chapter applies a
UK version of the GDPR (the “applied GDPR”) to those areas outside
the scope of EU law, such as defence’;
●● Part 3: Law enforcement processing – ‘this chapter brings into
UK law the EU Data Protection Directive 2016/680 (the Law
Enforcement Directive)’;
●● Part 4: Intelligence services processing.
ICO Overview FAQs Guidance
24.11 The ICO frequently asked questions (FAQs) guidance refers to
the following guidance questions and answers:
●● Will the GDPR still apply if we leave EU without a deal?
●● What will the UK data protection law be if we leave without a deal?
●● Is the old ICO guidance still relevant?
●● Can we still transfer data to and from Europe if we leave without a
deal?
●● Do PECR rules still apply?
●● Do network and information system (NIS) rules still apply?
●● Do the electronic identification and trust services for electronic
transactions in the internal market (eIDAS) rules still apply?
●● Does FOIA still apply?
●● Do the environmental information regulations (EIR) still apply?
●● Will ICO be producing more guidance?
The first question advises that:
‘The GDPR is an EU Regulation and, in principle, it will no longer apply to
the UK if we leave the EU on 29 March 2019 without a deal. However, if you
operate inside the UK, you will need to comply with UK data protection law.
The government intends to incorporate the GDPR into UK data protection law
when we exit the EU – so in practice there will be little change to the core data
protection principles, rights and obligations found in the GDPR.’ [emphasis
added and 29 March 2019 has now been superseded by extensions]

In terms of organisations wishing to continue dealing with the EU,
it adds that the ‘GDPR may also still apply directly to you if you operate in
Europe, offer goods or services to individuals in Europe, or monitor the
behaviour of individuals in Europe.’ The GDPR will also ‘still apply
to any organisations in Europe who send you data, so you may need to
help them decide how to transfer personal data to the UK in line with the
GDPR.’ The guidance cross refers to the six points above.
Question 2 above indicates that the DPA 2018, ‘which currently sup-
plements and tailors the GDPR within the UK, will continue to apply.’
Again, referring to the future intent, it states that the ‘government also
intends to incorporate the provisions of the GDPR directly into UK


law if we leave the EU without a deal, to sit alongside the DPA 2018.’
In addition, the ICO ‘expect the government to use new legislation to
make technical amendments to the GDPR so that it works in a UK-only
context.’
Question four above asks if an organisation can still transfer data to
and from Europe if the UK leaves without a deal. The ICO guidance
states as follows:
‘The government has said that transfers of data from the UK to the European
Economic Area (EEA) will not be restricted. However, if we leave the EU
without a deal, GDPR transfer rules will apply to any data coming from the
EEA into the UK. You need to consider what GDPR safeguards you can put in
place to ensure that data can continue to flow into the UK.’

In terms of question 5, referring to the PECR rules, the ICO states:


‘Yes. The current PECR rules cover marketing, cookies and electronic com-
munications. They derive from EU law but are set out in UK law. They will
continue to apply after we exit the EU.’

Forthcoming EU changes are also noted. ‘The EU is replacing the current
e-privacy law with a new e-privacy Regulation (ePR). The new ePR
is not yet agreed. It is unlikely to be finalised before the UK exits the
EU. This means the ePR will not form part of UK law if we leave without
a deal.’
Question six refers to NIS rules. The guidance advises, inter alia, that
organisations may need to appoint local representatives in the EU, and
review and comply with local NIS rules.
This guidance concludes by saying that ‘[i]n the meantime, given that
we expect UK data protection law to remain aligned with the GDPR, our
Guide to GDPR remains a good source of advice and guidance on how
to comply with UK and EU data protection rules both now and after we
leave the EU.’ The guidance also notes that it will be updated as changes
and new developments occur.
What is not referred to, however, is the European Union (Withdrawal)
Act 2018.

Data and the European Union (Withdrawal) Act 2018


24.12 It is of concern to organisations that they can continue to engage
in data transfers with the EU within the context of this changing data
protection regime and Brexit. As indicated above, however, there is
uncertainty as to (a) whether there will or will not be a deal; (b) when
the direct effect of the GDPR ends in the UK; and (c) when an adequacy
decision review process can commence, and when it will likely end.


(One further issue is that an adequacy decision process is never
guaranteed to result in a successful adequacy recommendation. There are
complex risk issues to consider.)
The author made official enquiries, including in particular when direct
effect ends, and when an adequacy decision review process can com-
mence. While there is no guidance on these particular issues as yet, some
of the other official correspondence is useful to highlight.
Official guidance by way of correspondence from the Department for
Digital, Culture, Media & Sport indicates that:
‘The EU (Withdrawal) Act 2018 (EUWA) retains the GDPR in UK law. The
fundamental principles, obligations and rights that organisations and data
subjects have become familiar with will stay the same. To ensure the UK
data protection framework continues to operate effectively when the UK is no
longer an EU Member State, the government will make appropriate changes
to the GDPR and Data Protection Act 2018 using regulation-making powers
under the EUWA.’

The guidance then directs readers to the ICO website.


The above is helpful as it indicates a possible roadmap ahead in a
deal scenario. However, given the imperative of continuing commer-
cial relations and inherent data transfers to the economy, it may well
be that the EUWA changes will occur even in the event of a no deal
situation.
The complete paragraph plus the statement that the ‘government
will make appropriate changes to the GDPR and to the Data Protection
Act 2018 using regulation-making powers under the EUWA’ suggest
that there will be legislative (primary or secondary) changes to come
as regards data protection. In some respects, it may be preferable to
have a Data Protection Act 2019/2020, but the EUWA seems to suggest
otherwise.
This also may clarify the concerns in terms of when an adequacy pro-
cess can begin. The EUWA may suggest that the process may not com-
mence, or may not be meaningful, before the EUWA changes are made
and become available for external review as part of the normal adequacy
process.

EUWA Details
24.13 As indicated above, the DPA 2018 does not implement the
GDPR in the UK given the current direct effect of the GDPR. Post
Brexit, it may be preferable to have a DPA 2019/20 which does imple-
ment the GDPR. Unfortunately, matters may become more complicated
by inclusion in, and as a consequence of, the EUWA.


We therefore need to refer to the EUWA in more detail. Surprisingly
perhaps, neither the Department nor the ICO directs concerned parties to
the section(s) of the EUWA relevant to the GDPR and data protection
generally.
One of the main aims is to maintain pre-existing EU law already
applicable in the UK after Brexit. There is also power for various
ministerial changes in future. However, data protection does not appear to be
expressly referred to, nor the GDPR, nor transfers.
The EUWA received Royal Assent on 26 June 2018. The defini-
tions are referred to throughout. The following are relevant to consider.
Section 20 refers to interpretation, including:
●● a definition of the Charter of Fundamental Rights;
●● ‘exit day’ (‘means 29 March 2019 at 11.00pm …’ (this has now
been extended by the European Union (Withdrawal) Act 2018
(Exit Day) (Amendment) Regulations 2019 (SI 2019/718) in the
UK and at EU level by European Council Decision 2019/476/EU),
and a further extension to 31 October 2019 was granted provided
the UK holds EU elections (unless the UK leaves without a deal by
1 June 2019, in which case no elections needed to be held, although
it was announced in May 2019 that the EU elections would go
ahead));
●● ‘EU regulation’ (‘means a regulation within the meaning of
Article 288 of the Treaty on the Functioning of the European Union’);
●● ‘EU directive’;
●● ‘retained direct EU legislation’ (‘means any direct EU legislation
which forms part of domestic law by virtue of section (as modified
by or under this Act or by other domestic law from time to time, and
including any instruments made under it on or after exit day)’;
●● ‘withdrawal agreement’.
Section 20(4) provides that
‘A Minister of the Crown may by regulations –

(a) Amend the definition of “exit day” in subsection (1) to ensure that the
day and time specified in the definition are the day and time that the
Treaties are to cease to apply to the United Kingdom, and
(b) Amend subsection (2) in consequence of such amendment.’
Section 9(1) provides that ‘A Minister … may by regulation make such
provision as the Minister considers appropriate for the purposes of
implementing the withdrawal agreement if the Minister considers that
such provision should be in force on or before exit day, subject to the
prior enactment of a statute by Parliament approving the final terms of
withdrawal of the United Kingdom from the EU.’ Note that ‘regulations


under this section may not … amend, repeal or revoke the Human Rights
Act 1998 or any subordinate legislation made under it’ (s 9(3)(e)). Also
‘[n]o regulations may be made under this section after exit day’ (s 9(4)).
Section 2 refers to ‘Saving for EU-derived domestic legislation.’
It includes the following:
‘EU-derived domestic legislation, as it has effect in domestic law immediately
before exit day, continues to have effect in domestic law on and after exit day.'

This potentially includes the DPA 2018. However, it does not, at least
currently, include the GDPR or any UK measure transposing or bringing the
GDPR into UK law.
Section 3, however, may be more pertinent. It states that:
‘Direct EU legislation, so far as operative immediately before exit day, forms
part of domestic law on and after exit day’ (s 3(1)).

This would seem to include an EU regulation – which may then include
the GDPR. However, there remains some uncertainty, not least as the
organisation may have to refer to the DPA 2018, the EUWA, any
ministerial legislation, and any other legislation that may also ensue,
plus the withdrawal agreement itself. Subject to an adequacy decision
issuing, that may also be referred to.
There is also a definition of ‘direct EU legislation’ (s 3(2)(a)). This
includes reference to ‘any EU regulation …’
Section 4 refers to a saver for rights, powers, liabilities, obligations,
restrictions, remedies and procedures which were in effect immediately
before exit day. These are to continue after exit day.
There are also provisions in relation to exceptions to savings and
incorporation (s 5). This refers to the principle of the supremacy of
EU law ceasing for laws after exit day (s 5(1)). However, it may apply
to 'relevant' issues of interpretation (see s 5(2)).
The Charter 'is not part of domestic law on or after exit day' (s 5(4)).
(This may have potential to interact with issues of a future transfer ade-
quacy decision assessment process). However, note that ‘Subsection (4)
does not affect the retention in domestic law on or after exit day in
accordance with this Act of any fundamental rights or principles which
exist irrespective of the Charter (and references to the Charter in any case
law are, so far as necessary for this purpose, to be read as if they were
references to any corresponding retained fundamental rights or princi-
ples)' (s 5(5)). This may require closer examination.
Section 6 refers to interpretation of ‘retained EU law.’ This section
confirms that courts are not bound by ‘any principles laid down, or any
decision made, on or after exit day by the European Court’ (s 6(1)). UK
courts cannot continue to refer matters to the European Court (s 6(1)(b)).
Having said that, UK courts 'may have regard to anything done on or
after exit day by the European court, another EU entity or the EU so far
as it is relevant to any other matter before the court’ (s 6(2)).
Also, the Supreme Court and High Court are 'not bound' by any
retained EU case law (see s 6(4)). 'In deciding whether to depart from
any retained EU case law, the Supreme Court or the High Court … must
apply the same test as it would apply in deciding whether to depart from
its own case law' (s 6(5)).
There are also definitions for ‘retained case law’, ‘retained EU law’,
‘retained general principles of EU law’ amongst others set out in s 6(7).
Section 7 refers to the status of retained EU law. It provides that:
'Anything which –

(a) was, immediately before exit day, primary legislation of a particular
kind, subordinate legislation of a particular kind or another enactment of
a particular kind, and
(b) continues to be domestic law on and after exit day by virtue of section 2,
continues to be domestic law as an enactment of the same kind' (s 7(1)).
Retained direct principal EU legislation cannot be modified other than
as specified (s 7(2)).
There is also a reference to the Human Rights Act 1998 in s 7(5)(f).
It also has a definition of ‘retained direct principal EU legislation’
(s 7(6)).

EUWA Official Explanatory Commentary


24.14 There are also detailed explanatory notes available with the
EUWA. The Overview indicates that the EUWA ‘repeals the European
Communities Act 1972 (ECA) on the day the United Kingdom leaves the
European Union’; and ends the supremacy of EU law. The query arises
as to whether this means the entire EUWA only becomes active 'on' exit
day.
The ‘principal purpose of the Act is to provide a functioning statute
book on the day the UK leaves the EU’ (explanatory note (EN) 10). ‘As
a general rule, the same rules and laws will apply on the day after exit as
on the day before. It will then be for Parliament and, where appropriate,
the devolved legislatures to make any future changes’ (EN10).
It adds that:
‘The approach in the act to preserving EU law is to ensure that all EU laws
which are directly applicable in the UK and all laws which have been made
in the UK in order to implement our obligations as a member of the EU are
converted into domestic law on the day the UK leaves the EU, subject to some
limited exceptions’ (EN 48).

Section 1 repeals the European Communities Act 1972 (EN74). EU
legislation is 'given legal effect in the UK via section 2(1) of the ECA,
which described how such legislation is to have effect “in accordance
with the EU treaties”. It is this which ensures that, for example, EU regu-
lations are directly applicable and fully binding in all member states’
(EN81).
The guidance adds that ‘Section 3 … convert[s] “direct EU legislation”
into domestic legislation at the point of exit’ (EN82). ‘Subsection (1)
therefore provides for the conversion into domestic law of this direct EU
legislation’ (EN83).
‘Subsection (2)(a) converts EU regulations, certain EU decisions and EU ter-
tiary legislation (now known as delegated and implementing acts), as they
have effect immediately before exit day. These terms are defined at section 20.
Section 20 and Schedule 6 provide that certain instruments are exempt EU
instruments. These exemptions reflect that certain EU instruments did not
apply to the UK because the UK did not adopt the Euro, or because the UK
did not participate in certain aspects of the EU acquis, in the area of freedom,
security and justice. EU decisions which are addressed only to a member state
other than the UK are also not converted into domestic law. Additionally, so
far as EU-derived domestic legislation under section 2 reproduces the effect
of an EU regulation, decision or tertiary legislation, these instruments are not
converted under this section. This is to avoid duplication on the statute book
after exit’ (EN84).

New ICO Guidance Update


24.15 Given recent obvious concerns regarding the potential for
a ‘final’ no deal Brexit (including in relation to data protection, data
transfers, etc), the ICO has decided to issue further guidance referring
expressly to the issues arising in a no deal Brexit scenario. This is com-
mercially focused guidance (with the ICO noting that separate guidance
may be issued in future for the benefit of individuals). In particular, this
no deal Brexit guidance (‘Data Protection If There’s No Brexit Deal’) is
aimed at businesses and organisations:
●● operating in the EU or EEA;
●● sending personal data outside the UK; or
●● receiving personal data from the EU or EEA.
The ICO advises organisations to review the guidance on whether the
GDPR applies to them (this does not exclude many at all). The GDPR
is explained, as is the point of direct effect of the GDPR currently. The
ICO comments that ‘[w]hen the UK exits the EU, the EU GDPR will
no longer be law in the UK. The UK government intends to write the
GDPR into UK law, with necessary changes to tailor its provisions for
the UK (the "UK GDPR"). The government has published a "Keeling
Schedule" for the GDPR, which shows planned amendments.'
In planning for a no deal Brexit scenario, the ICO advises consultation
of the following ICO guidance:
●● ‘Data Protection If There’s No Brexit Deal’;
●● ‘International data transfers’;
●● ‘EU representatives’;
●● ‘One Stop Shop regulatory oversight by a lead data protection
authority’.
It is the Government’s intention that ‘the UK GDPR will also apply to
controllers and processors based outside the UK, where their processing
activities relate to:
●● offering goods or services to individuals in the UK; or
●● monitoring the behaviour of individuals taking place in the UK.’
The ICO gives examples where transfers by a UK business to its con-
sumers outside the UK are permitted; while transfers to third party
organisations or service providers (including cloud services) can trigger
the GDPR or UK GDPR.
Those who currently make data transfers to outside of the EEA should
‘already’ have compliance strategies in place.
On Exit Day, organisations need to consider if they are intending to
make restricted transfers outside of the UK (permissible if covered by an
adequacy decision; or an appropriate safeguard; or an exception); and if
it is receiving personal data from outside of the UK.
The ICO also refers to the European Data Protection Board (EDPB)
advice in relation to transfers in the event of a no deal Brexit; the ICO
data transfers guide; as well as future guidance.
Various detailed references and examples are made to adequacy
decisions and appropriate safeguards (eg standard contractual clauses;
binding corporate rules) for restricted transfers. These should be con-
sidered in detail as appropriate. Some of this is contingent on future UK
legislation, however.
As regards those in the UK who propose to continue to receive per-
sonal data from the EU, it is reiterated that the GDPR applies to such
transfers and to the transferor. It is noted that the UK will on Exit Day
become a so-called third country (outside of the EU), in which case
additional restrictions apply on transfers. The ICO has further transfer
guidance, as does the EDPB. Organisations should ‘take a broad inter-
pretation of [what is] a restricted transfer’ (ICO). The EU transferor will
‘only’ be able to make the transfer to the UK transferee organisation if
one of the following apply:
●● There is an EU adequacy decision, or appropriate safeguards are
in place (eg standard contractual clauses). (Additional provisions
may apply to public bodies.) (Unfortunately, from the UK perspec-
tive, '[a]t exit date there may not be an adequacy decision by the
European Commission regarding the UK' (ICO) [that would seem
to be an understatement at present].)
●● An exception outside of the above applies – but which is interpreted
very narrowly and restrictively (such as medical emergencies;
explicit consent; occasional contract transfers; occasional public
interest transfers; occasional legal defence transfers; transfers from
public registers; exceptional compelling legitimate interest trans-
fers). The transferor will decide if applicable, not the UK recipient.
There is particular guidance in relation to transfers from jurisdictions
covered by an EU adequacy decision (as opposed to the EU itself).
Organisations, therefore, need to consider where potential post-Brexit
transfers to the UK may come from.

Preparation
24.16 The ICO advises that preparation is necessary and that organisa-
tions should:
●● assess their data flows and transfers in order to identify problem
restricted transfers in advance;
●● consider if and how transfers may continue, especially in the
absence of a possible future EU adequacy decision, which may not
have issued yet;
●● consider standard contractual clauses with the counterparty (note the
ICO tool available, and draft template contracts);
●● update binding corporate rules, if previously utilised;
●● update documentation and privacy notices;
●● compliance vetting by the transferor entity.
Particular considerations also arise for UK entities which do not have an
office, branch or establishment in the EU or EEA after Exit Day. Other
entities may have appointed a representative in the EU for GDPR pur-
poses. The ICO provides commentary on these situations. Indeed, many
organisations will already have established branches outside the UK or
even moved outside of the UK as a result of preparing for the possibility
of Brexit. GDPR compliance needs to be considered. Details and com-
mentary on these scenarios are referred to in the ICO guidance.

Organisations may need to review their:
●● agreements, contracts and appointments;
●● terms, contracts, policies and websites;
●● rights of data subjects;
●● transfer documentation;
●● data protection impact assessments (DPIAs);
●● Data Protection Officer or Officers;
●● further ICO and other official guidance.

EDPS Guidance
24.17 The European Data Protection Supervisor (EDPS) has also
issued guidance in relation to the possibility of a No Deal Brexit sce-
nario (‘Information Note on International Data Transfers After Brexit,’
16 July 2019).
The EDPS notes that, as matters currently stand, the UK will become
a third country, as referred to under the data protection rules, on
1 November 2019, and transfer restrictions will automatically kick in.
(This is unless the Withdrawal Agreement takes effect prior to that
date, which currently does not seem likely; in that case transfers should
be able to continue until 31 December 2020 (an effective transition
period) pending final arrangements and negotiations. Note that the 2020
deadline may be extended by two years. This seems moot at present,
however.)
The EDPS notes that a No Deal Brexit ‘would have [immediate]
repercussions for the protection of personal data … because … EU …
law, including data protection law, will cease to apply in the UK.’ The
GDPR requirements and restrictions on data transfers to less safe or non-
equivalent third countries will apply (see Chapter V, GDPR). There is a
default transfer ban, unless one of the limited number of transfer mech-
anisms can be established for particular data transfers or transactions
involving personal data.
The level of protection for the data being transferred must not be
undermined (GDPR Article 44).
Businesses and their advisers must therefore consider whether there is
a business need for data transfers which will be adversely affected by a
No Deal Brexit scenario. If so, the organisation needs to assess how best
to establish safeguards and the application of one of the limited number
of transfer mechanisms in this scenario 'to enable the transfer to a third
country’ – namely, the UK in this instance.
The EDPS refers to:
●● adequacy decisions (see GDPR Article 45 regarding adequate levels
of protection);
●● appropriate safeguards (GDPR Article 46), eg,
–– standard contractual clauses expressly adopted by the Com-
mission for such purposes (transfers to controllers 2001/497,
2004/915) (transfers to processors 2010/87);
–– binding corporate rules (BCRs) (applicable to a group of
companies – and importantly approved by one of the respective
data protection supervisory authorities. Note, prior BCRs can
continue to apply but may need certain amendments post the
GDPR. Currently, BCRs may be applied for in the UK, but take
time, and it remains unclear how the ICO could approve BCRs
in a no deal Brexit) (see GDPR Article 47);
–– codes of conduct and certification mechanisms. (However, the
exact nature of how this would work is not fully established
and more guidance is awaited);
–– ad hoc contractual clauses. However, prior authorisation from
a data protection supervisory authority would be necessary
(GDPR Article 46(3)(a));
–– derogations (GDPR Article 49(1)).
The EDPS notes that a common feature in the above is that Data Subject
rights must be enforceable and effective.
In terms of derogations, the EDPS notes that these are exhaustively
mentioned in Article 49:
●● explicit consent of the individual to the transfer – ‘after having been
provided with all necessary information about the risks associated
with the transfer’;
●● where the transfer is necessary for contract performance to which
the Data Subject is a party;
●● where the transfer is necessary to conclude or perform a contract
concluded in the interest of the Data Subject;
●● where the transfer is necessary for important public interest reasons;
●● where the transfer is necessary for legal claims;
●● where the transfer is necessary to protect the vital interests of the
data subject (or someone else) and the data subject is physically or
legally incapable of giving consent;
●● where the transfer is made from a public register.
In what may be an important clarification, the EDPS notes that the
Commission in a position paper (Commission, ‘Position paper on the
Use of Data and Protection of Information Obtained or Processed Before
the Withdrawal Date,’ 20 September 2017) says that personal data
transferred (to the UK) before the Withdrawal Date may continue to be
processed.

However, a note of caution is advised in that data processing activities
must be judged at the time of collection, of processing and of storage.
That the initial collection may have been lawful and permissible does
not absolve an organisation from the assessment of ascertaining a
current lawful basis for the current processing – especially as the
current processing moves further in time away from the initial date of
collection, and more so in the event of secondary uses or purposes of
processing. Later storage also requires a separate assessment of the
basis for ongoing storage.
The EDPS, referring to the Commission Position Paper, states as
follows:
‘The European Commission in the Position Paper on the Use of Data and
Protection of Information Obtained or Processed Before the Withdrawal Date
concludes that UK based controllers and processors may continue to process
personal data transferred before the withdrawal date only if these data enjoy
the protection of EU data protection law. Such protection will be guaranteed,
in the case that a Withdrawal Agreement is put in place.’

The above caution should be noted, however, as anything to do with
Brexit and data transfers is far from simple or assured in the current
climate, and data processing activities can be varied and more nuanced
than a headline description may suggest.
The EDPS acknowledges and cautions that ‘[t]he developments on
this sensitive issue should be closely followed and the EDPS [and oth-
ers] may provide further guidance’ as necessary. After all, this is a very
fluid situation.
The EDPS sets out five steps to prepare for Brexit, which businesses
and organisations should consider, namely:
●● map data processing activities;
●● check the most suitable available data transfer mechanisms;
●● implement the correct data transfer mechanism before 1 November
2019;
●● update internal documentation;
●● update data protection notices.
(These steps are also included in guidance from the European Data
Protection Board (EDPB) of 12 February 2019).

Commission
24.18 In addition to the above Position Paper referred to by the EDPS,
the Commission has also issued a ‘Notice to Stakeholders Withdrawal of
the United Kingdom From the Union and EU Rules in the Field of Data
Protection’ on 9 January 2018.
This notes that, given the Article 50 notice submitted by the UK (on
29 March 2017), EU law ceases to apply in the UK – unless there
is a validated Withdrawal Agreement. (Note that while there has been
some discussion recently as to whether the UK may seek to have a
further extension of the leave date past 1 November 2019, there has
also been some discussion as to whether the Article 50 Notice may be
withdrawn).
The Commission’s ‘Notice to Stakeholders’ notes the ‘considerable
uncertainties’ and that ‘all stakeholders processing personal data are
reminded of legal repercussions, which need to be considered when the
[UK] becomes a third country.’ The Commission highlights the possibil-
ity and need to consider:
●● standard data protection clauses (as approved by the Commission);
●● binding corporate rules (but which generally only apply to certain
large group entities, and in any event require prior approval from a
data protection supervisory authority);
●● approved codes of conduct; and
●● approved certification mechanisms.
The Commission notes that the rules relating to some of these mecha-
nisms have been simplified in the GDPR as compared with the prior
DPD95/46.

EDPB
24.19 The European Data Protection Board (EDPB) issued guidance
entitled ‘Information Note on Data Transfers Under the GDPR in the
Event of a No Deal Brexit’ (12 February 2019). In this scenario it reiter-
ates the need to consider:
●● standard or ad hoc data protection clauses;
●● binding corporate rules;
●● codes of conduct and certification mechanisms; and
●● derogations.
It also includes reference to the five steps referred to above.
It notes that at the time of issue, ‘[a]ccording to the UK Government,
the current practice, which permits personal data to flow freely from
the UK to the EEA, will continue in the event of a no-deal Brexit.’
However, matters have shifted politically since then and the immedi-
ate next steps remain somewhat less certain, which is unfortunate for
business entities which require the ability to plan ahead. The UK
Government's and the ICO's websites should be regularly consulted.
While there is uncertainty, organisations wishing to continue dealing
with, and receiving personal data from, their unconnected business
partners located inside the EU will most likely have to implement
standard contractual clauses, given that binding corporate rules will
only apply to the largest of organisations and in a pre-approved manner.
In addition, there will be increased pressure on parties in the EU wishing
to continue to deal with UK entities to implement appropriate
mechanisms and safeguards from their end, in which case they will be
raising queries directly with their UK counterparties.

Brexit and EU (Withdrawal Agreement) Act 2020


24.20 The EU (Withdrawal Agreement) Act 2020 became law on
23 January 2020. This was ultimately possible pursuant to the preceding
general election and formation of a new government. The so-called Exit
Date is confirmed as 31 January 2020. This now triggers the transition
period during which final negotiations are proceeding between the UK
and the EU with a view to agreeing all outstanding issues and, impor-
tantly, formally what the future relationship between each will be. The
final details of the future formal relationship are meant to be incorpo-
rated into a formal international treaty document.
One of the disadvantages of such an arrangement is that it is neces-
sary to discern the data protection related elements from the vast range
of details documented. A separate danger nationally may be the extent
to which a minister or secondary legislation may be the route for certain
data protection changes to be incorporated into law. The disadvantage
is that there is arguably lesser prominence to data issues than we have
come to expect with primary legislation.
The EU (Withdrawal Agreement) Act 2020 refers to the transition
period as the ‘Implementation Period.’ While the Act refers to ‘Brexit’
and’ Exit Date’, during the Implementation Period or transition period –
really a further negotiating period – it states that EU law continues to
apply in the UK for the moment.
Importantly, the transition period is meant to end on 31 December
2020. If there is no ‘final’ agreement reached on or prior to 31 December
2020, a no deal Brexit situation will arise.
However, as before, there is a potential to extend the deadline. The EU
appears willing to do so, and some have even suggested an extension is
appropriate. Officially, the UK’s position appears to be that an extension
will not be requested.

It should be noted that – some may say strangely – the EU (Withdrawal
Agreement) Act 2020 seeks to prohibit the transition period deadline of
31 December 2020 from being extended. Ministers are prohibited from
asking for an extension. The UK may seek an extension of one or two
years if such a request is made prior to 30 June 2020. The Government,
however, appears to have ruled out making such an extension.
Notwithstanding that an extension request may not be made prior
to June, if both parties agree between July and 31 December 2020 one
might assume that they will find a route to do so. From the UK end, that
may however require an Act of Parliament to amend the EU (Withdrawal
Agreement) Act 2020 as necessary.
There is also a mechanism for the arbitration of any disputes which
may arise in the withdrawal agreement.
The EU (Withdrawal Agreement) Act 2020 operates by repealing the
European Communities Act 1972, which brought the UK into the (then)
EU. However, the carve-out in order to keep EU law during the transition
period operates by reinstating temporarily the European Communities
Act. In light of that, in the event of late extension (between July and
December 2020) it may be that any necessary extension Amendment Act
may be required to provide for the extension, and the continuation, of EU
law during the extension period. It may well be that there is an element
of politics involved in seeking to put pressure on the EU negotiators by
having ruled out an extension at and prior to June.
There is also reference to the Court of Justice of the European
Union and to EU law issues as they apply in the UK. However, the EU
(Withdrawal Agreement) Act 2020 allows all courts – as opposed to just
the Supreme Court (as previously referred to) – to depart from CJEU
jurisprudence (section 26).
It is also worth pointing out that the EU (Withdrawal Agreement) Act
2020 was passed after rejecting a number of House of Lords amend-
ments. The ability of Parliament to scrutinise or be involved in the Brexit
process has also been reduced. Also, certain other provisions in an ear-
lier draft have been removed. However, if there is a final treaty agreed
for the future relationship, it appears that this would indeed be put to
Parliament.
A final point to note is that as a result of the EU (Withdrawal
Agreement) Act 2020 UK officials are no longer represented at EU insti-
tutions, bodies and agencies. One assumes that this means that the ICO
is no longer a member of the EDPB. (The EDPB has not yet responded
to a query in this regard.) However, they may attend certain international
consultations with the EU delegation.

Conclusion
24.21 The above provides an introduction to some of the Brexit con-
siderations and the intentions behind the new UK data protection regime.
The documentation referred to above, in particular the Ministerial
Statement document, provides very detailed background and policy
in relation to the need and benefit of updating the UK data protection
regime and aligning, or continuing to align, with the EU regime.

Chapter 25
The New Data Protection Act

Introduction
25.01 The new UK Data Protection Act is the most important
development in UK data protection law in over 20 years.

Repeal
25.02 The Data Protection Act 1998 (DPA 1998) is repealed – as
specified in Sch 19, para 44 of the Data Protection Act 2018 (DPA 2018).

Breakdown
25.03 The DPA 2018 is long and complex, comprising:
●● 215 sections over 130 pages; and
●● 20 Schedules (comprising 1–9 Parts depending on the Schedule)
over 208 pages.
The Act is also complicated by the fact that it addresses more than just
the EU General Data Protection Regulation (GDPR) itself, and the
consequent repeal of the prior legislation.
In addition to the GDPR related issues, the Act also encompasses the
following:
●● Part 3: Law Enforcement Processing;
●● Part 4: Intelligence Services Processing;
●● Part 2, Chapter 3: applying the GDPR rules via the Act to other
information and data not covered by the GDPR.

Part 3 refers to Law Enforcement Processing and makes provision about
the processing of personal data by competent authorities for law enforce-
ment purposes and implements the EU Law Enforcement Directive in
the UK.
Part 4 refers to Intelligence Services Processing and makes provision
about the processing of personal data by the intelligence services.

Specific Changes from GDPR


25.04 There are many changes from the GDPR. Some of these are
referred to below.
DPA 2018 Changes From GDPR

GDPR | Description | Change | DPA 2018

Art 8 | Age for information society services: 16 | Change to: 13 | s 9
GDPR | GDPR | Extends GDPR to additional information areas | Part 2
Art 4(7) | Meaning of 'Controller' | Change to definition | s 6(2); s 209; s 210
– | Meaning of 'public authority' and 'public body' | Specifying UK public authorities | s 7
Art 6(1)(e) | Lawfulness of processing: public interest, etc | Specifies examples for UK | s 8
Art 9(1) | Special categories of personal data & criminal convictions | Makes provisions in UK: re (b) Employment, social security & social protection; re (g) Substantial public interest; re (h) Health & social care; re (i) Public health; re (j) Archiving, research & statistics | s 10
Art 9(2)(h) | Processing for health or social care purposes | Special categories of personal data: supplementary/health or social care | s 11
Art 15(1)–(3) | Controller obligations | Obligations of credit reference agencies | s 13
Art 22(1); 22(2)(b) | Automated decisions | Makes exception Art 22(1). Automated decision making authorised by law: safeguards | s 14
GDPR | GDPR | Schedules 2, 3, 4 make exemptions from, and restrictions and adaptation on GDPR | s 15
Arts 13–21, & 34 | Prior information requirements; Right of access; Right of rectification; Right of erasure and forgetting; Right to restriction of processing; Notification obligation; Right to data portability; Right to object; Communicating breach to Data Subject | Adaptation or restriction on Arts 13–21, & 34 | Sch 2, Part 1
Arts 13–21, & 34 | Prior information requirements; Right of access; Right of rectification; Right of erasure and forgetting; Right to restriction of processing; Notification obligation; Right to data portability; Right to object; Communicating breach to Data Subject | Restriction on Arts 13–21, & 34 | Sch 2, Part 2
Art 15 | Right of access | Restriction from Art 15 | Sch 2, Part 3
Arts 13–15 | Prior information requirements; Right of access | Restriction on Arts 13–15 | Sch 2, Part 4
Chs II, III, IV, & VII | Principles; Rights of Data Subjects; Controllers & Processors; Cooperation & consistency | Exemptions & derogation on GDPR Chapters II, III, IV, & VII | Sch 2, Part 5
Arts 15, 16, 18, 19, 20 & 21 | Right of access; Right of rectification; Right to restriction of processing; Notification obligation; Right to data portability; Right to object | Derogations from rights in Arts 15, 16, 18, 19, 20 & 21 | Sch 2, Part 6
Arts 13–21 | Prior information requirements; Right of access; Right of rectification; Right of erasure and forgetting; Right to restriction of processing; Notification obligation; Right to data portability; Right to object | Restricting rules in Arts 13–21 to health, social work, education & child abuse data | Sch 3
Arts 13–21 | Prior information requirements; Right of access; Right of rectification; Right of erasure and forgetting; Right to restriction of processing; Notification obligation; Right to data portability; Right to object | Restricting rules in Arts 13–21 | Sch 4
Art 89(1) | Safeguards & derogations re processing for archiving in public interest, scientific or historic research purposes or statistical purposes | Makes provision re archiving, research & statistical purposes: safeguards | s 19
Art 80 | Representation of Data Subjects | Further specifics on bodies representing Data Subjects | s 187
GDPR | GDPR various | Various | Schedules

Comment
25.05 The DPA 2018 is quite a complex piece of legislation and will
require organisations and their representatives to undertake careful
re-reading even to get an overview understanding.
Large elements of the Act will not be relevant in the ordinary course
of dealing for most organisations, in particular Part 3 and Part 4. The
GDPR extended section (Part 2, Chapter 3) (eg, freedom of information)
will also not be directly relevant to most organisations, especially those
in the commercial sphere. Note that while these elements are so encompassed in the Act, they are outside core data protection and are beyond the pre-set scope of this book.
In terms of the remaining data protection sections of the Act, it should be noted that these differ from the DPA 1998. The DPA 1998 was intended
to implement the EU Data Protection Directive 1995 (DPD 1995) into
the UK. EU Directives by their nature need local law implementation.
EU Regulations are different. The GDPR is directly effective in all
states – including the UK. The DPA 2018 is not technically required to
implement the GDPR in the UK as it is already applicable directly.
However, like many other states, the UK recognises that it needs a
new law to repeal the old law (ie the DPA 1998); and in order to deal
with specific local law issues. In addition, the GDPR specifies that
certain provisions from the GDPR can be tailored in each individual
state depending on the local issues and environment. For example,
while the GDPR provides that information society services must have
an age of consent of 16, it provides that states may derogate from
this. The UK utilises this derogation to apply an age of consent of 13
instead of 16. Numerous other derogations and changes are made in
the DPA 2018.


Future
25.06 While the UK was in the EU, the GDPR was directly effective – regardless of the DPA 2018. With the EU (Withdrawal Agreement) Act 2020 of 23 January 2020, Brexit Exit Day has occurred, and thus the direct effect of the GDPR ceases. However, there is a saver insofar as matters appear to continue as before during the transition period over the remainder of 2020. Under the EU (Withdrawal Agreement) Act 2020 and the Withdrawal Agreement, the UK and EU continue negotiations on the nature of their future long-term relations, with the transition period to continue until 31 December 2020.
While the Government has indicated that it does not intend to seek
extensions beyond December, it is possible for there to be extension
to the negotiation period if needed. Given that there have already been
extensions at earlier stages of the Brexit process, it is not out of the ques-
tion that further extensions may occur. Indeed, the current pandemic
might also provide a basis for an extension.
In the Brexit scenario, a further Data Protection Act will be needed to bring into UK law the GDPR provisions which are not already incorporated in the DPA 2018. There would be a political and economic need to ensure that, once the GDPR stops being directly effective, UK law maintains a level of legal equivalence between the UK protections for personal data and those in the EU. If there is no equivalency of protection and rights, transfers of important economic and financial data which incorporate personal data cannot continue to flow between the EU and UK. Many business flows of data would have to cease, as they would be caught by the default ban on transfers from the EU to areas not formally recognised as being of equivalent standards.
A new Data Protection Act may have to follow after the DPA 2018, depending on political developments. In addition, an EU formal adequacy finding may also be required. This particular issue will not be without difficulty in the normal course – and could take years. However, one can assume that the Government would wish to have this included in the terms of any final agreement with the EU, which is currently planned to be concluded in advance of the December 2020 date scheduled for the end of the transition period.
However, predicting the result of political negotiations and associated timeframes is beyond a work such as this – particularly during a period of fraught negotiations, a level of brinkmanship, and an international pandemic.

Part 5
New EU Regime

Chapter 26
New Regime

Introduction
26.01 The EU data protection regime is fundamentally updated and
expanded.1 Many things have changed since the introduction of the EU
Data Protection Directive 1995 (DPD 1995). Data processing activities have changed, and have increased in scale and complexity. New
data processing activities and issues are constantly evolving. The EU
undertook a formal review of the data protection regime. Partly on foot
of the review, it was decided to update DPD 1995, ultimately via the
new EU Regulation, namely the EU General Data Protection Regulation
(GDPR).

Formal Nature of Regulations and Directives


26.02 As noted above, DPD 1995 is replaced by a Regulation, the
GDPR. The final version of the GDPR was agreed in April 2016. However,
it is important to note that an EU Regulation differs from a Directive
under formal EU law.2 A Regulation is immediately directly effective
in all EU states – without the need for national implementing laws.

1 European Commission – Press release, ‘Agreement on Commission’s EU Data Protection Reform Will Boost Digital Single Market,’ Brussels (15 December 2015);
R Graham, ‘Prepare for European Data Protection Reform,’ SCL Computer and Law,
30 November 2011.
2 Generally see, for example, A Biondi and P Eeckhout, eds, EU Law after Lisbon
(2012); N Foster, Foster on EU Law (2011); O’Neill, A, EU Law for UK Lawyers
(2011); Steiner, J, EU Law (2011). Also, HCFJA de Waele, ‘Implications of Replacing
the Data Protection Directive by a Regulation. A Legal Perspective,’ Privacy & Data
Protection (2012) (12) 3.


A Directive requires national implementing measures, whereas a Regulation does not. The GDPR Regulation applies in the UK (prior to the Brexit exit date in January 2020 – and apparently during the transition period3) as in other EU countries.
tion regime as well as commercial practice. The reforms introduced are
described as being ‘comprehensive.’4 Notwithstanding Brexit, the UK is
indicated to maintain a UK version of the GDPR (the UK-GDPR). This
will therefore necessitate further data protection regulation during or after the transition period scheduled for the end of 2020.

Review Policy
26.03 The Commission and others recognised that technology and
commercial changes have meant that the data protection regime required
updating.5 Indeed, the Council of Europe Convention on data protection6
which pre-dates DPD 1995 and which was incorporated into the national
law of many EU (and other states) prior to DPD 1995, is also in the pro-
cess of being updated.7 The WP29 also refers to the need for future data
protection measures in its Opinion regarding The Future of Privacy.8
Indeed, there have also been calls for greater political activism in rela-
tion to particular data protection issues.9 Others10 have also highlighted
new problematic developments in relation to such things as location data
and location based services, which need to be dealt with.11 Online and offline abuse are further issues which need to be addressed.

3 See EU (Withdrawal Agreement) Act 2020, enacted on 23 January 2020.


4 In Brief, Communications Law (2012) (17) 3.
5 For example, see also from a US perspective, N Roethlisberger, ‘Someone is Watching: The Need for Enhanced Data Protection’, Hastings Law Journal (2011) (62:6) 1793.
6 Convention for the Protection of Individuals with regard to Automatic Processing of
Personal Data, Council of Europe (1981).
7 See S Kierkegaard et al, ‘30 Years On – The Review of the Council of Europe Data
Protection Convention 108,’ Computer Law & Security Review (2011) (27) 223–231.
8 The Future of Privacy, WP29, referred to in R Wong, ‘Data Protection: The Future of
Privacy,’ Computer Law & Security Review (2011) (27) 53–57.
9 A Ripoll Servent and A MacKenzie, ‘Is the EP Still a Data Protection Champion? The
Case of SWIFT,’ Perspectives on European Politics & Society (2011) (12) 390–406.
10 C Cuijpers and M Pekarek, ‘The Regulation of Location-Based Services: Challenges
to the European Union Data Protection Regime,’ Journal of Location Based Services
(2011) (5) 223–241.
11 See, for example, M Hildebrandt, ‘Location Data, Purpose Binding and Contextual Integrity: What’s the Message?,’ in L Floridi, ed, Protection of Information and the Right to Privacy – A New Equilibrium (Springer 2014) 31.


Rebecca Wong refers to some of the areas of concern which the GDPR
proposes to address.12 These include:
●● the data protection regime in the online age;
●● social media;
●● cloud computing;
●● minimum/maximum standards;
●● the data protection Principles.13
The Commission states14 that the policy in reviewing the data protection
regime is to:
●● modernise the EU legal system for the protection of personal data,
in particular to meet the challenges resulting from globalisation and
the use of new technologies;
●● strengthen individuals’ rights, and at the same time reduce adminis-
trative formalities to ensure a free flow of personal data within the
EU and beyond;
●● improve the clarity and coherence of the EU rules for personal data
protection and achieve a consistent and effective implementation
and application of the fundamental right15 to the protection of per-
sonal data in all areas of the EU’s activities.16
It is also to enhance consumer confidence in eCommerce.17 In addition,
it should also bring considerable savings to organisations, as the burden of complying with somewhat differing national data protection regimes will be reduced if not eliminated.18

12 R Wong, ‘The Data Protection Directive 95/46/EC: Idealisms and Realisms,’ Interna-
tional Review of Law, Computers & Technology (2012) (26) 229–244.
13 See above.
14 Reform of Data Protection Legal Framework, Commission, Justice Directorate, at
http://ec.europa.eu/justice/data-protection/index_en.htm.
15 In relation to data protection as a fundamental right, see, for example S Rodotà,
‘Data Protection as a Fundamental Right,’ in S Gutwirth, Y Poullet, P de Hert,
C de Terwangne and S Nouwt, Reinventing Data Protection? (Springer, 2009) 77.
A Mantelero notes that ‘If legislators consider data protection as a fundamental right,
it is necessary to reinforce its protection in order to make it effective and not condi-
tioned by the asymmetries which characterize the relationship between Data Subject
and data controllers,’ A Mantelero, ‘Competitive Value of Data Protection: The Impact
of Data Protection Regulation on Online Behaviour,’ International Data Privacy Law
(11/2013) (3:4) 98.
16 Reform of Data Protection Legal Framework, Commission, Justice Directorate, at http://ec.europa.eu/justice/data-protection/index_en.
17 In Brief, Communications Law (2012) (17) 3.
18 See above.


The review19 summarises the need for new data protection rules, as
follows:
‘The current EU data protection rules date from 1995. Back then, the internet
was virtually unknown to most people. Today, 250 million people use the
internet daily in Europe.

Think how that has changed our personal data landscape through the explo-
sion of ecommerce, social networks, online games and cloud computing.

The European Commission has therefore adopted proposals for updating data
protection rules to meet the challenges of the digital age. In particular, the
proposals will strengthen protection of your personal data online.’

These proposals were debated in some detail.20 Commentators describe the new GDPR as ‘a long (and ambitious) text’.21 In addition, the process
which has arrived at this stage is described as being herculean.22 The
details of the changes were agreed in December 2015, and finalised and
enacted in April 2016, with a go live date of May 2018.

Importance
26.04 Costa and Poullet indicate that as the GDPR ‘comes into force,
the document will be the new general legal framework of data protection,
repealing [the DPD] more than twenty-seven years after its adoption.’23
The GDPR, as well as Article 8(1) of the EU Charter of fundamental
rights of 2000 and Article 16(1) of the Lisbon Treaty, reasserts the importance of privacy and data protection ‘as a fundamental right’.24 ‘[E]ffective and more coherent protection’ is required25 (see below).
In terms of policy as between modernising via a Directive or via a
Regulation ‘in order to ensure a full consistent and high level of protection

19 See details of some of the steps and consultations, at http://ec.europa.eu/justice/data-protection/index_en.htm.
20 See Commission, Why Do We Need New Data Protection Rules Now?, at http://ec.europa.eu/justice/data-protection/document/review2012/factsheets/1_en.pdf.
21 P De Hert and V Papakonstantinou, ‘The Proposed Data Protection Regulation
Replacing Directive 95/46/EC: A Sound System for the Protection of Individuals,’
Computer Law & Security Review (2012) (28) 130–142. Note also, IN Walden and
RN Savage, ‘Data Protection and Privacy Laws: Should Organisations be Protected?,’
International and Comparative Law Quarterly (1988) (37) 337–347.
22 See above.
23 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262, at 254.
24 See above, 254.
25 See above.


equivalent in EU states, a Regulation was judged as the adequate solution to ensure full harmonisation’26 throughout the EU. The Commission
may also oversee and monitor the national supervisory authorities.27
Individuals are rarely aware of how their data are collected and
processed while they are surfing on the internet at home, using their
cell phones, walking down a video-surveyed street or with an RFID
tag embedded in their clothes, and so on.28 There is a need for greater
transparency. As regards data processing, ‘transparency translates the
widening of the knowledge about information systems … coupled with
fairness.’29

Fundamental Right
26.05 Personal data protection is now recognised as a fundamen-
tal right for individuals, both in the new GDPR and the EU Charter of
Fundamental Rights of 2000, Lisbon Treaty (Treaty on the Functioning
of the European Union) and the Council of Europe Convention.
‘EU Charter (Article 8(1))

Protection of Personal Data

Everyone has the right to the protection of personal data concerning him or
her.’

‘EU Lisbon Treaty (Article 16(1))

Everyone has the right to the protection of personal data concerning them.’

While the UK is indicated to be adopting a UK-GDPR at the end of Brexit, it is unclear as yet what, if any, relationship this new legislation
will seek to maintain to the Charter or Treaty. There have also been some
suggestions that the UK Human Rights Act 1998 will be repealed after
Brexit, or at least watered down by way of amendment. If there is no
relationship or equivalence to EU norms such as in the Charter, Treaties
or human rights law, this holds the potential for difficulties in the EU rec-
ognising the UK protections for personal data in a formal adequacy deci-
sion. This is notwithstanding the envisaged UK-GDPR. The UK-GDPR

26 See above, 255.


27 See above.
28 See above, 256.
29 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262.


and application for an adequacy decision could thus be undermined by other UK law changes.

Innovations
26.06 Commentators have indicated that parts of the GDPR contain
particular ‘legislative innovation.’30 Some examples of this innovation
are indicated to be the:
●● Principles of data protection;
●● Data Subjects’ rights;
●● Controllers’ and Processors’ obligations;
●● Regulation issues regarding technologies.31
It has been noted that while DPD 1995 emphasises protection for the fundamental rights and freedoms of individuals ‘and in particular their right to privacy,’ the GDPR in Articles 1 and 2 stresses the need to protect the fundamental rights and freedoms of individuals ‘and in particular their right to the protection of personal data.’32
emphasise data protection as a stand-alone concept from privacy, such as
data protection risk assessments and pre-problem solving via data pro-
tection by design (DPbD) and by default.
There is a new consistency mechanism whereby the national data
protection authorities are obliged to cooperate with each other and with
the Commission.33 Two examples given include: data protection assess-
ments; and the obligation of notifying Data Subjects in relation to data
breaches.34
The obligations in terms of sufficient security and data breaches are
more detailed in the GDPR than previously.35 The obligations are now
more detailed than those in relation to telcos and ISPs in the ePD.36 Data
breaches are referred to in the GDPR. In the event of a data breach the
Controller must notify the ICO. In addition the Controller must also
communicate to the Data Subjects if there is a risk of harm to their pri-
vacy or personal data.

30 See above.
31 See above.
32 See above, 255.
33 Chapter VII, section 10, see L Costa and Y Poullet, ‘Privacy and the Regulation of
2012,’ Computer Law & Security Review (2012) (28) 254–262, at 255.
34 See above.
35 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262, at 256.
36 See above.


Data portability is a newly expressed right. It ‘implies the right of Data Subjects to obtain from the Controller a copy of their personal data
in a structured and commonly used format … data portability is a kind of
right to backup and use personal information under the management of
the data Controller. Second, data portability grants the right to transmit
personal data and other information provided by the Data Subject from
one automated processing system to another one … therefore the right
to take personal data and leave.’37
DPD 1995 stated that Controllers must not process personal data excessively. However, this is now more limited.
collection and processing must be transparent, fair and limited to the
minimum.
Broader parameters are contained in the GDPR in relation to consent.
The definition and conditions are broader than previously. The inclusion
of the words freely given, specific, informed and unambiguous appears
more specific than the previous ‘unambiguously’ consented.
The DPD 1995, Article 15 protection in relation to automated individual decisions ‘is considerably enlarged’38 regarding profiling in the GDPR. The
use of, inter alia, the word ‘measure’ in the GDPR as opposed to ‘deci-
sion’ in DPD 1995 now makes the category of activity encompassed
within the obligation much wider.39 There is greater Data Subject protec-
tion. While there were previously two exemptions, in terms of contract
and also a specific law, the GDPR adds a third in terms of consent from
the Data Subject. However, Controllers will need to ensure a stand-alone
consent for profiling separate from any consent for data collection and
processing per se.40
The GDPR also moves significantly further than DPD 1995 in terms
of creating obligations, responsibility and liability on Controllers.41
Appropriate policies must be implemented by Controllers, as well
as compliant data processing, secure data processing, the undertaking of data protection impact assessments, shared liability as between
joint Controllers, appointing representatives within the EU where
the Controllers are located elsewhere and provisions regarding
Processors.42

37 See above, at 256.


38 See above, at 258. Also see Council of Europe Recommendation regarding profiling
(25 November 2010).
39 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262, at 258–259.
40 See above, at 258. Also see Council of Europe Recommendation regarding profiling
(25 November 2010) 259.
41 See above.
42 See above.


While DPD 1995 imposed compensation obligations on Controllers in the case of harm to Data Subjects, the GDPR extends liability to
Processors.43 In addition, where harm is suffered by Data Subjects any
joint Controller and or Processors shall be ‘jointly and severally liable
for the entire amount of the damage.’44
The concepts of data protection by design (DPbD), data protection
by default and impact assessments all emphasise the ability of the data
protection regime to become involved in standards setting and the regulation of particular technologies and technical solutions.45 The former Ontario Information and Privacy Commissioner, Ann Cavoukian, refers to PbD (the precursor to DPbD).46 The GDPR describes it as follows:
‘[h]aving regard to the state of the art and the cost of implementation and
taking account of the nature, scope, context and purposes of the processing
as well as the risks of varying likelihood and severity for rights and freedoms
of individuals posed by the processing, the Controller shall, both at the time
of the determination of the means for processing and at the time of the pro-
cessing itself, implement appropriate technical and organisational measures,
such as pseudonymisation, which are designed to implement data protection
principles, such as data minimisation, in an effective way and to integrate the
necessary safeguards into the processing in order to meet the requirements of
[the GDPR] and protect the rights of Data Subjects.’ (Article 25(1)).

Data protection by default is referred to and defined as follows:


‘The Controller shall implement mechanisms for ensuring that, by default,
only those personal data are processed which are necessary for each specific
purpose of the processing and are especially not collected or retained beyond
the minimum necessary for those purposes, both in terms of the amount of the
data and the time of their storage. In particular, those mechanisms shall ensure
that by default personal data are not made accessible to an indefinite number
of individuals’ (Article 25(2)).

These accord with the general principle of data minimisation, whereby non-personal data should be processed first and, where the collection and processing of personal data is required, it must be the minimum as opposed to the maximum data which is so processed. This is referred to in Article 5(1)(c).

43 See above.
44 See above.
45 See above.
46 For example, Ann Cavoukian, DPA Ontario, Privacy Guidelines for RFID Information Systems, at www.ipc.on.ca, says that privacy and security must be built into the solution from the outset, at the design stage. Referred to ibid.


Data Subjects have more control over their personal data. In the con-
text of social networks, ‘individual profiles should be kept private from
others by default.’47
The concepts of DPbD and data protection by default as provided in the
GDPR are predicted to soon impact upon organisational contracts and
contracting practices relating to data processing activities.48
As mentioned above, one of the new areas is the obligation to engage
in data protection impact assessments. Article 35(1) provides that:
‘[w]here a type of processing in particular using new technologies, and tak-
ing into account the nature, scope, context and purposes of the processing, is
likely to result in a high risk for the rights and freedoms of natural persons,
the Controller shall, prior to the processing, carry out an assessment of the
impact of the envisaged processing operations on the protection of personal
data. A single assessment may address a set of similar processing operations
that present similar high risks.’

This is particularly so where the envisaged processing could give rise to specific risks.
A further addition is the possibility of mass group claims or claims
through representative organisations. This is referred to as ‘collective
redress’ and allows data protection and privacy NGOs to complain to
both the ICO and to the courts (see Recital 143 and Article 80).49 ‘Civil
procedure rules’50 may also need to be introduced.
The regime as regards cross-border data transfers is ‘significantly altered.’51 The relevant rules are included in Articles 44–50.

Enhanced Provisions and Changes


26.07 One of the more important extensions and enhancements
relates to the expanded Right to be Forgotten. The ‘right to be forgot-
ten and to erasure, which consists of securing from the Controller the

47 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262, at 260, and referring to the European Data Protection Supervisor on the Communications from Commission to the European Parliament, the Council, the Economic and Social Committee and the Committee of the
Regions, ‘A Comprehensive Approach on Personal Data Protection in the European
Union,’ at 23.
48 See above, 260.
49 See also Commission on a common framework for collective redress, at http://ec.europa.eu/consumers/redress_cons/collective_redress_en.htm.
50 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262, at 261.
51 See above.


erasure of personal data as well prevention of any further dissemination of his data.’52 (It is also said to interface with the new right to data
portability53).
The erasure and forgetting right is even more enhanced in instances
where the personal data was originally disclosed when the Data Subject
was a child. Some commentators refer to the option of an entire ‘clean
slate.’54
‘The use of data from social networks in employment contexts is a representa-
tive example. Personal data such as photos taken in private contexts have been
used to refuse job positions and fire people. But forgetfulness is larger. It is
one dimension of how people deal with their own history, being related not
only to leaving the past behind but also to living in the present without the
threat of a kind of “Miranda” warning, where whatever you say can be used
against you in the future. In this sense the right to be forgotten is closely
related to entitlements of dignity and self-development. Once again, privacy
appears as the pre-requisite of our liberties, assuring the possibility to freely
express ourselves and move freely on the street …’55

The erasure and forgetting right is most clearly associated and related to
the following in particular:
●● where the personal data are no longer necessary in relation to the
purposes for which they were originally collected and processed
(and the associated finality principle);
●● where the Data Subject has withdrawn their consent for processing;
●● where Data Subjects object to the processing of the personal data
concerning them;
●● where the processing of the personal data does not comply with the
GDPR.56
The GDPR and the erasure and forgetting right ‘amplifies the effective-
ness of data protection Principles and rules.’57
Data Subjects can have their data erased under the right to be forgot-
ten when there is no compliance, in addition to instances where they
simply withdraw their consent.58 User control and Data Subject control
are, therefore, enhanced.

52 See above, at 256.


53 See above.
54 See above, at 257.
55 See above.
56 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
Security Review (2012) (28) 254–262.
57 See above.
58 See above, at 257.


The new right creates compliance obligations, such as:


●● erasing personal data and not processing it further;
●● informing third parties that the Data Subject has requested the dele-
tion of the personal data;
●● taking responsibility for publication by third parties under the Controller’s authority59 (Article 17(2) and (8)).
The GDPR also enhances and expands the various powers of the national
supervisory authorities, such as the ICO.60
Some of the key changes are referred to below. In some respects,
these could also be seen as advantages of the new GDPR data protection
regime:
●● administrative costs are to be reduced with a single EU wide set of
rules and obligations;
●● there may be less need to interact with the ICO, as more responsibil-
ity and accountability is passed to the organisational level;
●● the consent requirement is clarified as meaning explicit consent
(whereas previously there were references to different categories of
consent);
●● rights are improved with easier access to personal data, as well as
its transferability;
●● the enhanced Right to be Forgotten will improve the position of Data
Subjects and the ability to delete and erase data;
●● the EU data protection regime applies to non-EU entities operating
with regard to EU personal data and EU citizens;
●● the national authorities will be able to impose fines based on a percentage of global turnover61 (see below).
The ICO welcomes the GDPR, stating:
‘it strengthens the position of individuals, recognises important concepts such
as privacy by design and privacy impact assessments and requires organisa-
tions to be able to demonstrate that they have measures in place to ensure
personal information is properly protected.’62

The EU Commission’s press statement dated 15 December 2015, issued once the new GDPR text was agreed, notes that the GDPR ‘will enable people to better control their personal data. At the same time modernised and unified rules will allow businesses to make the most of

59 See above.
60 See above, at 260.
61 In Brief, Communications Law (2012) (17) 3.
62 Referred to in In Brief, Communications Law (2012) (17) 3.


the opportunities of the Digital Single Market by cutting red tape and
­benefiting from reinforced consumer trust.’63
The Commission also refers to the benefits of the changes. It states:
‘The reform will allow people to regain control of their personal data. Two-
thirds of Europeans (67%), according to a recent Eurobarometer survey, stated
they are concerned about not having complete control over the information
they provide online. Seven Europeans out of ten worry about the potential use
that companies may make of the information disclosed. The data protection
reform will strengthen the right to data protection, which is a fundamental
right in the EU, and allow them to have trust when they give their personal
data.

The new rules address these concerns by strengthening the existing rights
and empowering individuals with more control over their personal data. Most
notably, these include,
●● easier access to your own data: individuals will have more information
on how their data is processed and this information should be available in
a clear and understandable way;
●● a right to data portability: it will be easier to transfer your personal data
between service providers;
●● a clarified “Right to be Forgotten”: when you no longer want your data
to be processed, and provided that there are no legitimate grounds for
retaining it, the data will be deleted;
●● the right to know when your data has been hacked: For example,
companies and organisations must notify the national supervisory author-
ity of serious data breaches as soon as possible so that users can take
appropriate measures.

Clear modern rules for businesses

In today’s digital economy, personal data has acquired enormous economic significance, in particular in the area of big data. By unifying Europe’s rules
on data protection, lawmakers are creating a business opportunity and encour-
aging innovation.

One continent, one law: The regulation will establish one single set of rules
which will make it simpler and cheaper for companies to do business in the
EU.

One-stop-shop: businesses will only have to deal with one single supervisory
authority. This is estimated to save €2.3 billion per year.
European rules on European soil: companies based outside of Europe will
have to apply the same rules when offering services in the EU.

63 European Commission – Press release, ‘Agreement on Commission’s EU Data Protection Reform Will Boost Digital Single Market,’ Brussels (15 December 2015).


Risk-based approach: the rules will avoid a burdensome one-size-fits-all obligation and rather tailor them to the respective risks.

Rules fit for innovation: the regulation will guarantee that data protection
safeguards are built into products and services from the earliest stage of development (Data protection by Design (DPbD)). Privacy-friendly techniques such as pseudonymisation will be encouraged, to reap the benefits of big data
innovation while protecting privacy.

Benefits for big and small alike

The data protection reform will stimulate economic growth by cutting costs
and red tape for European business, especially for small and medium enter-
prises (SMEs). The EU’s data protection reform will help SMEs break into
new markets. Under the new rules, SMEs will benefit from four reductions
in red tape:
●● No more notifications: Notifications to supervisory authorities are a for-
mality that represents a cost for business of €130 million every year. The
reform will scrap these entirely.
●● Every penny counts: Where requests to access data are manifestly
unfounded or excessive, SMEs will be able to charge a fee for providing
access.
●● Data Protection Officers: SMEs are exempt from the obligation to
appoint a data protection officer insofar as data processing is not their
core business activity.
●● Impact Assessments: SMEs will have no obligation to carry out an
impact assessment unless there is a high risk.

Better protection of citizens’ data

Individuals’ personal data will be better protected, when processed for any law
enforcement purpose including prevention of crime. It will protect everyone –
regardless of whether they are a victim, criminal or witness. All law enforce-
ment processing in the Union must comply with the principles of necessity,
proportionality and legality, with appropriate safeguards for the individuals.
Supervision is ensured by independent national data protection authorities,
and effective judicial remedies must be provided.

The Data Protection Directive for Police and Criminal Justice Authorities
­provides clear rules for the transfer of personal data by law enforcement
authorities outside the EU, to ensure that the level of protection of individuals
guaranteed in the EU is not undermined.’64

64 European Commission – Press release, ‘Agreement on Commission’s EU Data Protection Reform Will Boost Digital Single Market,’ Brussels (15 December 2015).


In one survey, 72 percent of internet users were concerned that they gave away too much personal data.65 However, red tape reduction and economic growth are also commercial issues considered under the new regime. New technological changes are also recognised and encompassed. European harmonisation issues are also a central consideration.

The New Data Protection Regime


Introducing the New GDPR Changes
26.08 Some of the new headline changes include:
●● new and updated fundamental principles, such as fairness and legal-
ity, processing conditions, prior information, consent, transparency,
accuracy, lawfulness of processing, definitions;
●● DPOs;
●● special rules concerning children in relation to information society
services;
●● expanded rights including subject access rights;
●● new enforcement and sanctions provisions;
●● Right to be Forgotten;
●● right to data portability;
●● new security and breach provisions;
●● provisions regarding single supervisory authority or one-stop-shop;
●● risk based approach, risk minimisation, consultations and reporting.
Some of the new detailed changes include:
●● repeal of DPD 1995 (Article 94);
●● WP29 and European Data Protection Board (EDPB) (Articles 35,
40–43, 51, 52, 58–63, 64, 65 and 94);
●● background and rationale (Recitals 1, 8, 11, 13);
●● context objectives, scope of GDPR (Articles 1, 2 and 3);
●● obligations (Recitals 32, 39, 40, 42, 43–47);
●● security (Articles 2, 4, 5, 9, 10, 30, 32, 35, 40, 45 and 47);
●● processing;
●● rights (Articles 1, 4–7, 9–22);
●● proceedings (Articles 23, 40, 78, 79, 81 and 82);
●● establishment (Articles 3, 4, 9, 17, 18, 21, 37, 42, 49, 54, 56, 57, 60,
62, 65, 70 and 79);
●● transfers (Articles 44–50);

65 Eurobarometer, ‘Attitudes on Data Protection and Electronic Identity in the EU’ (June 2011).


●● ICO, etc;
●● new bodies (Recital 142; Articles 9 and 80);
●● notification/registration replaced (Recital 89);
●● exceptions/exemptions (Article 23);
●● lawful processing and consent (Articles 4–9, 13, 14, 17, 18, 20, 22,
23, 32, 40, 49, 82, 83);
●● online identifiers (Recital 30; Articles 4, 87);
●● sensitive and special personal data (Articles 6, 9, 22, 27, 30, 35,
37, 47);
●● children (Articles 6, 8, 12, 40, 57);
●● health data (Articles 4, 9, 17, 23, 36, 88);
●● GDPR definitions (Article 4);
●● new processing rules: obligations (Articles 5–11);
●● new (data protection) Principles (Article 5);
●● lawfulness of processing: legitimate processing conditions
(Article 6);
●● child’s consent: conditions for information society services
(Article 8);
●● processing special categories of personal data (Article 9);
●● processing re criminal convictions and offences data (Article 10);
●● processing not requiring identification (Article 11);
●● Controllers and Processors (Chapter IV);
●● responsibility of the Controller (Article 24);
●● joint Controllers (Article 26);
●● Processor (Article 28);
●● processing under authority of Controller and Processor (Article 29);
●● records of processing activities (Article 30);
●● representatives of Controllers not established in EU (Article 27);
●● co-operation with supervisory authority;
●● security of processing (Article 32);
●● notifying data breach to supervisory authority (Article 33);
●● communicating data breach to Data Subject (Article 34);
●● data protection impact assessment and prior consultation (Chapter IV,
Section 3);
●● data protection impact assessment (Article 35);
●● prior consultation (Article 36);
●● new Data Protection Officer (DPO) (Chapter IV, Section 4;
Article 37);
●● position (Article 38) and tasks of new DPO (Article 39);
●● general principle for transfers (Article 44);
●● transfers via adequacy decision (Article 45);
●● transfers via appropriate safeguards (Article 46);
●● transfers via binding corporate rules (Article 47);
●● transfers or disclosures not authorised by EU law (Article 48);

●● derogations for specific situations (Article 49);
●● new processing rules: Data Subject rights (Chapter III);
●● right to transparency (Article 5; Chapter III, Section 1; Articles 12–14,
26, 40–43, 88);
●● data access rights (Chapter III, Section 2);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● rectification and erasure (Chapter III, Section 3);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights against automated individual decision making, including
­profiling (Article 22);
●● Data protection by design (DPbD) and by default (Article 25);
●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to Data Subject (Article 34);
●● Data Protection Officer (DPO) (contact);
●● remedies, liability and sanctions (Chapter VIII);
●● right to lodge complaint with supervisory authority (Article 77);
●● right to judicial remedy against supervisory authority (Article 78);
●● right to effective judicial remedy against Controller or Processor
(Article 79);
●● representation of Data Subjects (Article 80);
●● suspension of proceedings (Article 81);
●● right to compensation and liability (Article 82);
●● codes of conduct and certification (Chapter IV, Section 5;
Articles 40–43);
●● international co-operation on personal data (Articles 50, 60);
●● Supervisory Authorities (Chapter VI, Section 1; Articles 51–54);
●● general conditions for imposing administrative fines (Article 83);
●● penalties (Article 84);
●● specific data processing situations (Chapter IX);
●● processing and freedom of expression and information (Article 85);
●● processing and public access to official documents (Article 86);
●● processing national identification numbers (Article 87);
●● processing in employment context (Article 88);
●● safeguards and derogations: public interest/scientific/historical
research/statistical processing (Article 89);
●● obligations of secrecy (Article 90);

●● churches and religious associations (Article 91);
●● delegated acts and implementing acts (Chapter X);
●● exercise of the delegation (Article 92);
●● relationship to Directive 2002/58/EC (Article 95);
●● relationship to previously concluded agreements (Article 96);
●● review of other EU data protection instruments (Article 98);
●● restrictions.
New Definitions
26.09 Article 4 of the new GDPR sets out the definitions for the new
data protection regime.
Prior to the new GDPR, the ICO had issued recommendations in relation to personal data, definitions, etc, namely:
●● Determining What is Personal Data;
●● What is ‘Data’ for the Purposes of the DPA?;
●● What is Personal Data? – A Quick Reference Guide.
These now need to be read in light of the GDPR changes.
GDPR Recitals
26.10 The Recitals to the new GDPR are also instructive in relation to
the purposes and themes covered. They include reference to, for exam-
ple: DPD 1995 being repealed; WP29/EDPB; background and rationale;
obligations; security; processing; rights; proceedings; establishment;
transfers; supervisory authorities; new bodies; lawful processing and
consent; online identifiers; special personal data; children; and health
data. The importance of data protection and health data is also referred
to by the OECD.66

Main Provisions and Changes of GDPR


26.11 The main new GDPR provisions and changes are referred to
below.
Repeal of DPD95
26.12 DPD 1995 is repealed. It is repealed with effect from 25 May
2018 (Article 94(1)). References to the repealed DPD 1995 shall be con-
strued as references to the GDPR (Article 94(2)).

66 Strengthening Health Information Infrastructure for Health Care Quality Governance: Good Practices, New Opportunities, and Data Privacy Protection Challenges, OECD (2013).


WP29 and European Data Protection Board


26.13 A new European Data Protection Board (EDPB) is established.
This effectively replaces the WP29. References to the previous WP29
shall be construed as references to the EDPB established by the GDPR
(Article 94(2)). Chapter VII, Section 3 of the GDPR provides for the
EDPB. The EDPB is established as a body of the EU and shall have legal
personality (Article 68(1)). The EDPB shall be composed of the head
of one supervisory authority of each state and of the European Data
Protection Supervisor, or their respective representatives (Article 68(3)).
The EDPB shall act independently when performing its tasks or exercis-
ing its powers pursuant to Articles 70 and 71 (Article 69(1)).
Context of GDPR
26.14 The initial provisions refer to the context of the GDPR, namely,
the subject matter and objectives (Article 1); material scope (Article 2);
and territorial scope (Article 3).
GDPR Definitions
26.15 Article 4 of the new GDPR sets out the important definitions for
the new data protection regime.
New Processing Rules: Obligations
26.16 The new processing rules as set out in the new GDPR regime
are set out below.
The Recitals refer to the following: data processing must be lawful
and fair (Recital 39); processing necessary for a contract (Recital 40);
processing for a legal obligation (Recital 40); processing necessary
to protect life (Recital 46); the legitimate interests of the Controller
(Recital 47).
Principles
26.17 Article 5 of the GDPR sets out the new Principles relating to
data protection. Personal data must be:
●● processed lawfully, fairly and in a transparent manner in relation to
the Data Subject (‘lawfulness, fairness and transparency’);
●● collected for specified, explicit and legitimate purposes and not fur-
ther processed in a manner that is incompatible with those purposes;
further processing for archiving purposes in the public interest, sci-
entific or historical research purposes or statistical purposes shall, in
accordance with Article 89(1), not be considered to be incompatible
with the initial purposes (‘purpose limitation’);

●● adequate, relevant and limited to what is necessary in relation to the purposes for which they are processed (‘data minimisation’);
●● accurate and, where necessary, kept up to date; every reasonable step
must be taken to ensure that personal data that are inaccurate, having
regard to the purposes for which they are processed, are erased or
rectified without delay (‘accuracy’);
●● kept in a form which permits identification of Data Subjects for no
longer than is necessary for the purposes for which the personal data
are processed; personal data may be stored for longer periods insofar
as the personal data will be processed solely for archiving purposes
in the public interest, scientific or historical research purposes or sta-
tistical purposes in accordance with Article 89(1) subject to imple-
mentation of the appropriate technical and organisational measures
required by the GDPR in order to safeguard the rights and freedoms
of the Data Subject (‘storage limitation’);
●● processed in a way that ensures appropriate security of the personal
data, including protection against unauthorised or unlawful pro-
cessing and against accidental loss, destruction or damage, using
appropriate technical or organisational measures (‘integrity and
confidentiality’).
The Controller must be responsible for and be able to demonstrate com-
pliance with the above (‘accountability’).
These can be summarised as:
●● processed lawfully, fairly and transparently (‘lawfulness, fairness
and transparency’);
●● for specified, explicit and legitimate purposes and not further pro-
cessed incompatibly with the purpose; (with a carveout) (‘purpose
limitation’);
●● adequate, relevant and limited to what is necessary for the purposes
(‘data minimisation’);
●● accurate and, where necessary, kept up to date; every reasonable step
must be taken to ensure that personal data that are inaccurate, having
regard to the purposes for which they are processed, are erased or
rectified without delay (‘accuracy’);
●● kept in a form which permits identification for no longer than is
necessary; (with a carveout) (‘storage limitation’);
●● appropriate security, including protection against unauthorised
or unlawful processing and against accidental loss, destruction or
damage, using appropriate technical or organisational measures
(‘integrity and confidentiality’).
The Controller must demonstrate compliance (‘accountability’).


Lawfulness of Processing: Lawful Processing Conditions


26.18 Article 6 of the GDPR sets out the new provisions in relation to
the lawfulness of processing. Processing of personal data shall be lawful
only if and to the extent that at least one of the following applies:
●● the Data Subject has given consent to the processing of their per-
sonal data for one or more specific purposes (see section below at
para 26.19 and also Article 7);
●● processing is necessary for the performance of a contract to which
the Data Subject is party or in order to take steps at the request of the
Data Subject prior to entering into a contract;
●● processing is necessary for compliance with a legal obligation to
which the Controller is subject;
●● processing is necessary in order to protect the vital interests of the
Data Subject or of another natural person;
●● processing is necessary for the performance of a task carried out in
the public interest or in the exercise of official authority vested in
the Controller;
●● processing is necessary for the purposes of the legitimate interests
pursued by the Controller or by a third party, except where such
interests are overridden by the interests or fundamental rights and
freedoms of the Data Subject which require protection of personal
data, in particular where the Data Subject is a child. This shall not
apply to processing carried out by public authorities in the perfor-
mance of their tasks (Article 6(1)).
States may maintain or introduce more specific provisions to adapt the
application of the rules of the GDPR with regard to the processing of
personal data for compliance with Article 6(1)(c) and (e) by determining
more precisely specific requirements for the processing and other meas-
ures to ensure lawful and fair processing including for other specific pro-
cessing situations as provided for in Chapter IX (Article 6(2)). The basis
for the processing referred to in Article 6(1)(c) and (e) must be laid down
by EU law, or state law to which the Controller is subject.
The purpose of the processing shall be determined in this legal basis
or as regards the processing referred to in Article 6(1)(e), shall be neces-
sary for the performance of a task carried out in the public interest or
in the exercise of official authority vested in the Controller. This legal
basis may contain specific provisions to adapt the application of rules
of the GDPR, inter alia the general conditions governing the lawful-
ness of processing by the Controller, the type of data which are subject
to the processing, the Data Subjects concerned; the entities to, and the
purposes for which the data may be disclosed; the purpose limitation;
storage periods and processing operations and processing procedures,

including measures to ensure lawful and fair processing, such as those for other specific processing situations as provided for in Chapter IX.
EU law or the state law must meet an objective of public interest and be
proportionate to the legitimate aim pursued (Article 6(3)).
Where the processing for another purpose than the one for which the
data have been collected is not based on the Data Subject’s consent or on
EU or state law which constitutes a necessary and proportionate meas-
ure in a democratic society to safeguard the objectives referred to in
Article 23(1), the Controller shall, in order to ascertain whether process-
ing for another purpose is compatible with the purpose for which the
personal data are initially collected, take into account, inter alia:
●● any link between the purposes for which the data have been col-
lected and the purposes of the intended further processing;
●● the context in which the personal data have been collected, in par-
ticular regarding the relationship between Data Subjects and the
Controller;
●● the nature of the personal data, in particular whether special catego-
ries of personal data are processed, pursuant to Article 9 or whether
personal data related to criminal convictions and offences are pro-
cessed, pursuant to Article 10;
●● the possible consequences of the intended further processing for
Data Subjects;
●● the existence of appropriate safeguards, which may include encryption or pseudonymisation (Article 6(4)).
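The GDPR does not prescribe any particular pseudonymisation technique. Purely by way of illustration, one common approach is a keyed hash: records remain linkable for analysis, while re-identification requires the ‘additional information’ (the key), which must be kept separately (cf Article 4(5)). The following is a minimal, hypothetical Python sketch; the field names and key handling are assumptions, not anything mandated by the Regulation:

```python
import hmac
import hashlib

def pseudonymise(identifier: str, secret_key: bytes) -> str:
    """Replace a direct identifier with a keyed pseudonym (HMAC-SHA256).

    The pseudonym is stable for a given key, so records can still be
    linked for analysis, but reversing it requires the key, which should
    be held separately from the pseudonymised data set.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# Hypothetical record: the direct identifier is replaced; other fields retained.
key = b"keep-this-key-separate-and-secure"
record = {"name": "Jane Smith", "diagnosis": "asthma"}
pseudonymised = {
    "subject_id": pseudonymise(record["name"], key),
    "diagnosis": record["diagnosis"],
}
```

Because the same key yields the same pseudonym, the controller can link records without holding the identifier alongside the data; destroying or rotating the key severs that link.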
Prior to the new GDPR, the ICO had issued recommendations in relation to marketing and personal data issues, namely:
●● Direct Marketing;
●● Direct Marketing Checklist – Ideal for Small Businesses;
●● Companies Receiving Unwanted Marketing;
●● Guidance on Political Campaigning.
These now need to be read in light of the GDPR changes.
Consent for Processing: Conditions for Consent
26.19 Consent is an important issue under the new GDPR.67 Lawful
processing and consent are referred to in Recital 40. The WP29 also
refers to consent issues.68

67 JP Vandenbroucke and J Olsen, ‘Informed Consent and the New EU Regulation on Data Protection,’ International Journal of Epidemiology (2013) (42: 6) 1891.
68 WP29 Opinion 15/2011 Consent; Working Document 02/2013 providing guidance on
obtaining consent for cookies, 201; Opinion 04/2012 on Cookie Consent Exemption.


Article 7 of the GDPR refers to conditions for consent as follows.


Where processing is based on consent, the Controller shall be able to
demonstrate that the Data Subject has consented to the processing of
their personal data (Article 7(1)).
If the Data Subject’s consent is given in the context of a written decla-
ration which also concerns other matters, the request for consent must be
presented in a manner which is clearly distinguishable from other mat-
ters, in an intelligible and easily accessible form, using clear and plain
language. Any part of such a declaration which constitutes an infringe-
ment of the GDPR shall not be binding (Article 7(2)).
The Data Subject shall have the right to withdraw his or her consent
at any time. The withdrawal of consent shall not affect the lawfulness
of processing based on consent before its withdrawal. Prior to giving
consent, the Data Subject must be informed thereof. It shall be as easy to
withdraw consent as to give consent (Article 7(3)).
When assessing whether consent is freely given, utmost account shall
be taken of the fact whether, inter alia, the performance of a service, is
conditional on consent to the processing of personal data that is not nec-
essary for the performance of this contract (Article 7(4)).
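Article 7(1) obliges the Controller to be able to demonstrate that consent was given, and Article 7(3) requires that withdrawal be as easy as giving consent. In practice this is often supported by an auditable consent record. The following is a minimal, hypothetical sketch only; the field names and structure are assumptions for illustration, not requirements of the GDPR:

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class ConsentRecord:
    """Auditable record evidencing a Data Subject's consent (Article 7(1))."""
    subject_id: str
    purpose: str                        # the specific purpose consented to
    given_at: datetime
    wording_version: str                # which consent wording was presented
    withdrawn_at: Optional[datetime] = None

    def withdraw(self) -> None:
        # Article 7(3): withdrawal must be as easy as giving consent,
        # and does not affect processing carried out before withdrawal.
        self.withdrawn_at = datetime.now(timezone.utc)

    @property
    def active(self) -> bool:
        return self.withdrawn_at is None

# Hypothetical usage: consent given for one purpose, later withdrawn.
record = ConsentRecord("subject-123", "email newsletter",
                       datetime.now(timezone.utc), "v2")
record.withdraw()
```

Recording which version of the consent wording was shown supports the ‘clearly distinguishable … clear and plain language’ requirement of Article 7(2), since the Controller can later evidence exactly what the Data Subject agreed to.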
Children
26.20 The issue of children in the data protection regime has been steadily rising. The increased use of social media and Web 2.0 services enhances the exposures and risks for children and the uninitiated.69
This concern has included children’s groups, regulators and also
the EDPB (previously the WP29). WP29 issued Opinion 2/2009 on the
Protection of Children’s Personal Data (General Guidelines and the
Special Case of Schools) in 2009 and also Working Document 1/2008 on
the Protection of Children’s Personal Data (General Guidelines and the
Special Case of Schools). Schools are being encouraged to be proactive
and to have appropriate codes and policies for children’s social media
and internet usage.70

69 See, for example, D Gourlay and G Gallagher, ‘Collecting and Using Children’s
Information Online: the UK/US Dichotomy,’ SCL Computers and Law, 12 December
2011.
70 Note generally, for example, JS Groppe, ‘A Child’s Playground or a Predator’s Hunting
Ground? – How to Protect Children on Internet Social Networking Sites,’ CommLaw
Conspectus (2007) (16) 215–245; EP Steadman, ‘MySpace, But Who’s Responsibil-
ity? Liability of Social Networking Websites When Offline Sexual Assault of Minors
Follows Online Interaction,’ Villanova Sports and Entertainment Law Journal (2007)
(14) 363–397; DC Beckstrom, ‘Who’s Looking at Your Facebook Profile? The Use of
Student Conduct Codes to Censor College Students’ Online Speech,’ Willamette Law
Review (2008) 261–312.


There is now an explicit acknowledgement of children’s interests in the EU data protection regime, unlike the DPD 1995, which contained no explicit reference. The GDPR refers to a ‘child’ including, for the purposes of consent to information society services, a person below the age of 16 years. This is significant, amongst other things, in relation to consent, contracting, etc. It is also significant for websites which have significant numbers of child users. Until now it was common for certain social networks to purport to accept users only over the age of 13. A child was defined in the original proposal for the GDPR as a person up to 18 years of age.
Therefore, the potential implications will require careful assessment.
This includes, for example, social contracts, consent, notices, terms of
use, contract terms, payment and purchase issues, related process mod-
els, sign-ups, social media, internet, etc. Note, however, that the UK has
adopted a change whereby it specifies the age as being 13 for the UK
(Data Protection Act 2018 (DPA 2018), s 9).
Article 8 of the original proposal for the GDPR contained provisions
in relation to the processing of personal data of a child. The final version
of Article 8(1) provided that where Article 6(1)(a) applies, in relation
to the offer of information society services directly to a child, the pro-
cessing of the personal data of a child shall be lawful where the child
is at least 16 years old. Where the child is below the age of 16 years,
such processing shall be lawful only if and to the extent that consent
is given or authorised by the holder of parental responsibility over the
child. The original Article 8(1) proposal provided that Article 8(1) shall
not affect the general contract law of states, such as the rules on the
validity, formation or effect of a contract in relation to a child. Under the
original Article 8(3) the Commission is empowered to adopt delegated
acts in accordance with Article 86 for the purpose of further specifying
the criteria and requirements for the methods to obtain verifiable con-
sent referred to in Article 8(1). In doing so, the Commission shall con-
sider specific measures for micro, small and medium-sized enterprises
(original draft Article 8(3)). In addition, the Commission may lay down
standard forms for specific methods to obtain verifiable consent referred
to in Article 8(1). Those implementing acts shall be adopted in accord-
ance with the examination procedure referred to in Article 87(2) (origi-
nal draft Article 8(4)).
The explicit reference to children is new, and some would argue overdue. Increasingly, the activities of children on the internet and on social media pose risks and concerns.71 This has been further emphasised of late by tragic events involving online abuse, in particular cyber bullying. Risks arise not only from their activities online (eg inappropriate

71 See, for example, L McDermott, ‘Legal Issues Associated with Minors and Their Use
of Social Networking Sites,’ Communications Law (2012) (17) 19–24.


content, cyber bullying), but also from the collection and use of their personal data online, sometimes without their knowledge or consent. Their personal data and privacy are more vulnerable than those of older people.
It is important for organisations to note the ‘child’ provisions in the GDPR. These will have implications for how organisations:
●● consider the interaction with children and what personal data may be
collected and processed;
●● ensure that there is appropriate compliance for such collection and
processing for children as distinct from adults.
Child’s Consent: Conditions for Information Society Services
26.21 Processing of children’s personal data are referred to in Recital
38. Article 8 of the GDPR makes new provisions in relation to condi-
tions for children’s consent for Information Society services. Where
Article 6(1)(a) applies, in relation to the offer of information society
services directly to a child, the processing of personal data of a child
shall be lawful where the child is at least 16 years old. Where the child
is below the age of 16 years, such processing shall only be lawful if and
to the extent that consent is given or authorised by the holder of parental
responsibility over the child (Article 8(1)).
The Controller shall make reasonable efforts to verify in such cases
that consent is given or authorised by the holder of parental responsi-
bility over the child, taking into consideration available technology
(Article 8(2)). This shall not affect the general contract law of states such
as the rules on the validity, formation or effect of a contract in relation to
a child (Article 8(3)).
Processing Special Categories of Personal Data
26.22 Sensitive personal data are referred to in Recitals 10 and 51 and
special categories of personal data are referred to in Recitals 10, 51, 52,
53, 54, 71, 80, 91 and 97.
Article 9 refers to the processing of special categories of personal data.
The processing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade-union membership, and the processing of genetic data, biometric data for the purpose
of uniquely identifying a natural person, data concerning health or data
concerning a natural person’s sex life and sexual orientation shall be
prohibited (Article 9(1)).72

72 In relation to genetic data generally, see for example D Hallinan, PJA de Hert and
M Friedewald, ‘Genetic Data and the Data Protection Regulation: Anonymity,


The above shall not apply if one of the following applies:


●● the Data Subject has given explicit consent to the processing of those
personal data for one or more specified purposes, except where EU
law or state law provide that the above prohibition may not be lifted
by the Data Subject;
●● processing is necessary for the purposes of carrying out the obliga-
tions and exercising specific rights of the Controller or of the Data
Subject in the field of employment and social security and social
protection law in so far as it is authorised by EU law or state law or
a collective agreement pursuant to state law providing for adequate
safeguards for the fundamental rights and the interests of the Data
Subject;
●● processing is necessary to protect the vital interests of the Data
Subject or of another natural person where the Data Subject is physi-
cally or legally incapable of giving consent;
●● processing is carried out in the course of its legitimate activities with
appropriate safeguards by a foundation, association or any other not-
for-profit-seeking body with a political, philosophical, religious or
trade union aim and on condition that the processing relates solely
to the members or to former members of the body or to persons who
have regular contact with it in connection with its purposes and that
the personal data are not disclosed outside that body without the
consent of the Data Subjects;
●● processing relates to personal data which are manifestly made public
by the Data Subject;
●● processing is necessary for the establishment, exercise or defence
of legal claims or whenever courts are acting in their judicial
capacity;
●● processing is necessary for reasons of substantial public interest, on
the basis of EU or state law which shall be proportionate to the aim
pursued, respect the essence of the right to data protection and pro-
vide for suitable and specific measures to safeguard the fundamental
rights and the interests of the Data Subject;
●● processing is necessary for the purposes of preventive or occupa-
tional medicine, for the assessment of the working capacity of the
employee, medical diagnosis, the provision of health or social care
or treatment or the management of health or social care systems and
services on the basis of EU law or state law or pursuant to contract
with a health professional and subject to the conditions and safeguards referred to in Article 9(3); or

Multiple Subjects, Sensitivity and a Prohibitionary Logic Regarding Genetic Data?,’ Computer Law and Security Review (2013) (29: 4) 317.


●● processing is necessary for reasons of public interest in the area of


public health, such as protecting against serious cross-border threats
to health or ensuring high standards of quality and safety of health
care and of medicinal products or medical devices, on the basis of
EU law or state law which provides for suitable and specific meas-
ures to safeguard the rights and freedoms of the Data Subject, in
particular professional secrecy;
●● processing is necessary for archiving purposes in the public inter-
est, scientific or historical research purposes or statistical purposes
in accordance with Article 89(1) based on EU or state law which
shall be proportionate to the aim pursued, respect the essence of the
right to data protection and provide for suitable and specific meas-
ures to safeguard the fundamental rights and the interests of the Data
Subject (Article 9(2)).
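The Article 9(2) gateway operates cumulatively with Article 6: special category processing needs both a lawful basis and one of the conditions listed above. For organisations that record internally which condition they rely on, the list can be reduced to a simple check — sketched here in Python purely by way of illustration; the condition labels are our own shorthand, not statutory wording.

```python
# Illustrative shorthand labels for the Article 9(2)(a)-(j) conditions that
# lift the Article 9(1) prohibition on special-category processing.
# These names are our own; the GDPR prescribes no such vocabulary.
ARTICLE_9_2_CONDITIONS = {
    "explicit_consent",               # Art 9(2)(a)
    "employment_social_security",     # Art 9(2)(b)
    "vital_interests",                # Art 9(2)(c)
    "not_for_profit_members",         # Art 9(2)(d)
    "manifestly_made_public",         # Art 9(2)(e)
    "legal_claims",                   # Art 9(2)(f)
    "substantial_public_interest",    # Art 9(2)(g)
    "health_or_social_care",          # Art 9(2)(h)
    "public_health",                  # Art 9(2)(i)
    "archiving_research_statistics",  # Art 9(2)(j)
}

def special_category_processing_permitted(conditions_met: set) -> bool:
    """True only if at least one recognised Article 9(2) condition applies."""
    return bool(conditions_met & ARTICLE_9_2_CONDITIONS)
```

A record of this kind demonstrates nothing by itself — the substantive conditions (consent quality, safeguards, proportionality) must still be satisfied — but it supports the accountability documentation discussed later in this chapter.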
Personal data referred to in Article 9(1) may be processed for the pur-
poses referred to in Article 9(2)(h) when those data are processed by or
under the responsibility of a professional subject to the obligation of pro-
fessional secrecy under EU or state law or rules established by national
competent bodies or by another person also subject to an obligation of
secrecy under EU or state law or rules established by national competent
bodies (Article 9(3)).
States may maintain or introduce further conditions, including limita-
tions, with regard to the processing of genetic data, biometric data or
health data (Article 9(5)).
Health Research
26.23 The issue of health research was just one of the matters which
proved to be of some controversy during the discussions for the GDPR.
The final provisions include the following. The special data process-
ing prohibition in Article 9(1) shall not apply in the following health
research instances, namely where:
●● processing is necessary for the purposes of preventive or occupa-
tional medicine, for the assessment of the working capacity of the
employee, medical diagnosis, the provision of health or social care
or treatment or the management of health or social care systems and
services on the basis of EU or state law or pursuant to contract with
a health professional and subject to the conditions and safeguards
referred to in paragraph 3 (Article 9(2)(h));
●● processing is necessary for reasons of public interest in the area of
public health, such as protecting against serious cross-border threats
to health or ensuring high standards of quality and safety of health

care and of medicinal products or medical devices, on the basis of
EU or state law which provides for suitable and specific measures to
safeguard the rights and freedoms of the data subject, in particular
professional secrecy (Article 9(2)(i)).
Personal data referred to in Article 9(1) may be processed for the pur-
poses referred to in Article 9(2)(h) when those data are processed by or
under the responsibility of a professional subject to the obligation of
professional secrecy under Union or state law or rules established by
national competent bodies or by another person also subject to an obli-
gation of secrecy under EU or state law or rules established by national
competent bodies (Article 9(3)).
The EDPB recently issued data protection and health guidance spe-
cific to the Covid-19 pandemic, namely:
●● Guidelines 03/2020 on the processing of data concerning health
for the purpose of scientific research in the context of the Covid-19
outbreak.
Processing re Criminal Convictions and Offences Data
26.24 Article 10 refers to processing of data relating to criminal con-
victions and offences. Processing of personal data relating to crimi-
nal convictions and offences or related security measures based on
Article 6(1) shall be carried out only under the control of official author-
ity or when the processing is authorised by EU law or state law providing
for adequate safeguards for the rights and freedoms of Data Subjects.
Any comprehensive register of criminal convictions may be kept only
under the control of official authority.
Processing Not Requiring Identification
26.25 Article 11 refers to processing not requiring identification. If the
purposes for which a Controller processes personal data do not or do no
longer require the identification of a Data Subject by the Controller, the
Controller shall not be obliged to maintain, acquire or process additional
information in order to identify the Data Subject for the sole purpose of
complying with the GDPR (Article 11(1)).
Where, in such cases the Controller is able to demonstrate that it is
not in a position to identify the Data Subject, the Controller shall inform
the Data Subject accordingly, if possible. In such cases, Articles 15 to 20
shall not apply except where the Data Subject, for the purpose of exer-
cising their rights under these articles, provides additional information
enabling their identification (Article 11(2)).


Controllers and Processors


26.26 Chapter IV of the GDPR refers to Controllers and Processors.
Chapter IV, Section 1 refers to general obligations. The WP29 also refers
to Controller and Processor issues.73
Responsibility of the Controller
26.27 Taking into account the nature, scope, context and purposes of
the processing as well as the risks of varying likelihood and severity for
the rights and freedoms of individuals, the Controller shall implement
appropriate technical and organisational measures to ensure and be able
to demonstrate that the processing of personal data is performed in com-
pliance with the GDPR. Those measures shall be reviewed and updated
where necessary (Article 24(1)).
Where proportionate in relation to the processing activities, the meas-
ures referred to in Article 24(1) shall include the implementation of
appropriate data protection policies by the Controller (Article 24(2)).
Adherence to approved codes of conduct pursuant to Article 40 or an
approved certification mechanism pursuant to Article 42 may be used
as an element to demonstrate compliance with the obligations of the
Controller (Article 24(3)).
Joint Controllers
26.28 Where two or more Controllers jointly determine the purposes
and means of the processing of personal data, they are joint Controllers.
They shall, in a transparent manner, determine their respective responsi-
bilities for compliance with the obligations under the GDPR, in particu-
lar as regards the exercising of the rights of the Data Subject and their
respective duties to provide the information referred to in Articles 13
and 14, by means of an arrangement between them unless, and in so far
as, the respective responsibilities of the Controllers are determined by
EU or state law to which the Controllers are subject. The arrangement
may designate a point of contact for Data Subjects (Article 26(1)).
The arrangement shall duly reflect the joint Controllers’ respective
effective roles and relationships vis-à-vis Data Subjects. The essence
of the arrangement shall be made available for the Data Subject
(Article 26(2)).74

73 WP29 Opinion 1/2010 on the concepts of ‘controller’ and ‘processor.’


74 Generally see J Mäkinen, ‘Data Quality, Sensitive Data and Joint Controllership as
Examples of Grey Areas in the Existing Data Protection Framework for the Internet of
Things,’ Information & Communications Technology Law (2015) (24:3) 262.

Irrespective of the terms of the arrangement referred to in Article 26(1),
the Data Subject may exercise their rights under the GDPR in respect of
and against each of the Controllers (Article 26(3)).
The CJEU has also dealt with the issue of who might be joint con-
trollers, and related issues, in a number of cases, namely the so-called
Facebook fan page case where Facebook and the operator of a fan page
were found to be joint controllers;75 and the Fashion ID case.76
Processor
26.29 Where a processing is to be carried out on behalf of a Controller,
the Controller shall use only Processors providing sufficient guaran-
tees to implement appropriate technical and organisational measures
in such a manner that the processing will meet the requirements of
the GDPR and ensure the protection of the rights of the Data Subject
(Article 28(1)).
The Processor shall not engage another Processor without the prior
specific or general written authorisation of the Controller. In the case of
general written authorisation, the Processor shall inform the Controller
on any intended changes concerning the addition or replacement of other
Processors, thereby giving the Controller the opportunity to object to
such changes (Article 28(2)).
The carrying out of processing by a Processor shall be governed by
a contract or other legal act under EU or state law, that is binding on the
Processor with regard to the Controller and that sets out the subject-
matter and duration of the processing, the nature and purpose of the pro-
cessing, the type of personal data and categories of Data Subjects and the
obligations and rights of the Controller. The contract or other legal act
shall stipulate, in particular, that the Processor:
●● process the personal data only on documented instructions from the
Controller, including with regard to transfers of personal data to a
third country or an international organisation, unless required to do
so by EU or state law to which the Processor is subject; in such a
case, the Processor shall inform the Controller of that legal require-
ment before processing the data, unless that law prohibits such infor-
mation on important grounds of public interest;
●● ensure that persons authorised to process the personal data have
committed themselves to confidentiality or are under an appropriate
statutory obligation of confidentiality;

75 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH, CJEU [2018] Case C-210/16 (5 June 2018).
76 Fashion ID GmbH & Co.KG v Verbraucherzentrale NRW eV, CJEU [2019]
Case C-40/17 (29 July 2019).

●● take all measures required pursuant to Article 32;
●● respect the conditions referred to in Article 28(2) and (4) for engag-
ing another Processor;
●● taking into account the nature of the processing, assist the Controller
by appropriate technical and organisational measures, insofar as
this is possible, for the fulfilment of the Controller’s obligation to
respond to requests for exercising the Data Subject’s rights laid
down in Chapter III;
●● assist the Controller in ensuring compliance with the obligations
pursuant to Articles 32 to 36 taking into account the nature of pro-
cessing and the information available to the Processor;
●● at the choice of the Controller, deletes or returns all the personal data
to the Controller after the end of the provision of services relating
to processing, and deletes existing copies unless EU or state law
requires storage of the personal data;
●● make available to the Controller all information necessary to demon-
strate compliance with the obligations laid down in this Article and
allow for and contribute to audits, including inspections, conducted
by the Controller or another auditor mandated by the Controller.
(With regard to this point, the Processor shall immediately inform
the Controller if, in its opinion, an instruction breaches the GDPR or
EU or state data protection provisions) (Article 28(3)).
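Because Article 28(3) lists the stipulations that the Controller–Processor contract must contain, contract review lends itself to a simple checklist. The sketch below (Python, for illustration only; the clause labels are our own shorthand for the bullets above, not contractual language) shows how a draft might be screened for missing mandatory stipulations.

```python
# Our own shorthand labels for the stipulations Article 28(3) requires in a
# Controller-Processor contract; this is a review aid, not legal advice.
REQUIRED_ART_28_3_CLAUSES = [
    "documented_instructions_only",
    "confidentiality_commitments",
    "article_32_security_measures",
    "sub_processor_conditions",
    "assist_with_data_subject_rights",
    "assist_with_articles_32_to_36",
    "delete_or_return_data_at_end",
    "demonstrate_compliance_and_allow_audits",
]

def missing_clauses(contract_clauses: set) -> list:
    """Return the Article 28(3) stipulations a draft contract still lacks."""
    return [c for c in REQUIRED_ART_28_3_CLAUSES if c not in contract_clauses]
```

A non-empty result flags a contract that, on its face, does not yet meet the Article 28(3) minimum; the substantive adequacy of each clause still requires legal review.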
Where a Processor engages another Processor for carrying out specific
processing activities on behalf of the Controller, the same data protec-
tion obligations as set out in the contract or other legal act between
the Controller and the Processor as referred to in Article 28(3) shall be
imposed on that other Processor by way of a contract or other legal act
under EU or state law, in particular providing sufficient guarantees to
implement appropriate technical and organisational measures in such a
manner that the processing will meet the requirements of the GDPR.
Where that other Processor fails to fulfil its data protection obligations,
the initial Processor shall remain fully liable to the Controller for the
performance of that other Processor’s obligations (Article 28(4)).
Adherence of the Processor to an approved code of conduct pursu-
ant to Article 40 or an approved certification mechanism pursuant to
Article 42 may be used as an element to demonstrate sufficient guaran-
tees referred to in Article 28(1) and (4) (Article 28(5)).
Without prejudice to an individual contract between the Controller
and the Processor, the contract or the other legal act referred to in
Article 28(3) and (4) may be based, in whole or in part, on standard con-
tractual clauses referred to in Article 28(7) and (8), including when they
are part of a certification granted to the Controller or Processor pursuant
to Articles 42 and 43 (Article 28(6)).


The Commission may lay down standard contractual clauses for the
matters referred to in Article 28(3) and (4) and in accordance with the
examination procedure referred to in Article 93(2) (Article 28(7)).
A supervisory authority may adopt standard contractual clauses for the
matters referred to in Article 28(3) and (4) and in accordance with the
consistency mechanism referred to in Article 63 (Article 28(8)).
The contract or the other legal act referred to in Article 28(3) and (4)
shall be in writing, including in an electronic form (Article 28(9)).
Without prejudice to Articles 82, 83 and 84, if a Processor infringes
the GDPR by determining the purposes and means of data processing,
the Processor shall be considered to be a Controller in respect of that
processing (Article 28(10)).
Organisations acting as Processors should recall that while they may
indeed be Processors, they can also at the same time be Controllers in
relation to different sets of personal data.
Processing Under Authority of Controller and Processor
26.30 The Processor and any person acting under the authority of the
Controller or of the Processor who has access to personal data shall not
process them except on instructions from the Controller, unless required
to do so by EU or state law (Article 29).
Records of Processing Activities
26.31 Each Controller and where applicable the Controller’s rep-
resentative, shall maintain a record of processing activities under its
responsibility. This record shall contain all of the following information:
●● the name and contact details of the Controller and where applicable
the joint Controller, the Controller’s representative and the DPO;
●● the purposes of the processing;
●● a description of categories of Data Subjects and of the categories of
personal data;
●● the categories of recipients to whom the personal data have been or
will be disclosed including recipients in third countries or interna-
tional organisations;
●● where applicable, transfers of data to a third country or an interna-
tional organisation, including the identification of that third country
or international organisation and, in case of transfers referred to in
Article 49(1), the documentation of suitable safeguards;
●● where possible, the envisaged time limits for erasure of the different
categories of data;
●● where possible, a general description of the technical and organisa-
tional security measures referred to in Article 32(1) (Article 30(1)).

Each Processor and, where applicable, the Processor’s representative
shall maintain a record of all categories of processing activities carried
out on behalf of a Controller, containing:
●● the name and contact details of the Processor or Processors and
of each Controller on behalf of which the Processor is acting, and
where applicable of the Controller’s or the Processor’s representa-
tive, and the DPO;
●● the categories of processing carried out on behalf of each Controller;
●● where applicable, transfers of data to a third country or an interna-
tional organisation, including the identification of that third country
or international organisation and, in case of transfers referred to in
Article 49(1), the documentation of suitable safeguards;
●● where possible, a general description of the technical and organisa-
tional security measures referred to in Article 32(1) (Article 30(2)).
The records referred to in Article 30(1) and (2) shall be in writing, includ-
ing in an electronic form (Article 30(3)).
The Controller and the Processor and, where applicable, the
Controller’s or the Processor’s representative, shall make the record
available to the supervisory authority on request (Article 30(4)).
The obligations referred to in Article 30(1) and (2) shall not apply to an
enterprise or an organisation employing fewer than 250 persons unless
the processing it carries out is likely to result in a risk to the rights and
freedoms of Data Subject, the processing is not occasional, or the pro-
cessing includes special categories of data as referred to in Article 9(1)
or personal data relating to criminal convictions and offences referred to
in Article 10 (Article 30(5)).
The records rules may be one of the more underappreciated areas of
the new data protection regime, but hold the potential for substantial
bite. Already data regulators are looking at how businesses and organi-
sations document and record how they are compliant. It is perhaps only
a matter of time before party opponents seek to direct attention to this
area also.
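The Article 30(1) information items translate naturally into an internal record structure. By way of illustration only, a sketch in Python follows — the field names are of our own choosing, and the exemption test reflects the Article 30(5) triggers discussed above.

```python
from dataclasses import dataclass, field

# Illustrative record of processing activities holding the Article 30(1)
# information items; field names are ours, not prescribed by the GDPR.
@dataclass
class ProcessingRecord:
    controller_name: str
    controller_contact: str
    purposes: list
    data_subject_categories: list
    personal_data_categories: list
    recipient_categories: list = field(default_factory=list)
    third_country_transfers: list = field(default_factory=list)  # incl. safeguards
    erasure_time_limits: str = "where possible"
    security_measures: str = "where possible (Article 32(1))"

def record_keeping_required(headcount: int, *, risky: bool = False,
                            non_occasional: bool = False,
                            special_or_criminal_data: bool = False) -> bool:
    """Article 30(5): the under-250-employee exemption falls away if any trigger applies."""
    return headcount >= 250 or risky or non_occasional or special_or_criminal_data
```

Note how narrow the exemption is in practice: any non-occasional processing, or any special-category or criminal-offence data, restores the record-keeping obligation regardless of headcount.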
Representatives of Controllers Not Established in EU
26.32 Where Article 3(2) applies, the Controller or the Processor shall
designate in writing a representative in the EU (Article 27(1)).
This obligation shall not apply to:
●● processing which is occasional, does not include, on a large scale,
processing of special categories of data as referred to in Article 9(1)
or processing of personal data relating to criminal convictions and
offences referred to in Article 10, and is unlikely to result in a risk


for the rights and freedoms of natural persons, taking into account
the nature, context, scope and purposes of the processing; or
●● a public authority or body (Article 27(2)).
The representative shall be established in one of the states where the
Data Subjects whose personal data are processed in relation to the offer-
ing of goods or services to them, or whose behaviour is monitored, are
(Article 27(3)).
The representative shall be mandated by the Controller or the
Processor to be addressed in addition to or instead of the Controller or
the Processor by, in particular, supervisory authorities and Data Subjects,
on all issues related to the processing, for the purposes of ensuring com-
pliance with the GDPR (Article 27(4)).
The designation of a representative by the Controller or the Processor
shall be without prejudice to legal actions which could be initiated
against the Controller or the Processor themselves (Article 27(5)).
Cooperation with Supervisory Authority
26.33 The Controller and the Processor and, where applicable, their
representatives, shall co-operate, on request, with the supervisory author-
ity in the performance of its tasks (Article 31).
Data Protection by Design (DPbD) and by Default
26.34 Article 25 refers to data protection by design and by default.
Note also the related concept, or precursor concept, of Privacy by Design
(PbD). In some ways PbD is the impetus for the current DPbD rules.
Taking into account the state of the art and the cost of implementation
and the nature, scope, context and purposes of the processing as well as
the risks of varying likelihood and severity for rights and freedoms of
natural persons posed by the processing, the Controller shall, both at the
time of the determination of the means for processing and at the time of
the processing itself, implement appropriate technical and organisational
measures, such as pseudonymisation, which are designed to implement
data protection Principles, such as data minimisation, in an effective
manner and to integrate the necessary safeguards into the processing in
order to meet the requirements of the GDPR and protect the rights of
Data Subjects (Article 25(1)).
The Controller shall implement appropriate technical and organisa-
tional measures for ensuring that, by default, only personal data which
are necessary for each specific purpose of the processing are processed.
That obligation applies to the amount of data collected, the extent of
their processing, the period of their storage and their accessibility. In
particular, such measures shall ensure that by default personal data are

not made accessible without the individual’s intervention to an indefinite
number of natural persons (Article 25(2)).
An approved certification mechanism pursuant to Article 42 may be
used as an element to demonstrate compliance with the requirements set
out in Article 25(1) and (2) (Article 25(3)).
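The "by default" obligation in Article 25(2) — that only personal data necessary for each specific purpose are processed — can be made concrete with a simple purpose-to-fields filter. The sketch below is ours, for illustration; the purpose map is an assumed example, not anything the GDPR prescribes.

```python
# Illustrative 'data protection by default' filter (Article 25(2)): only the
# fields necessary for the stated purpose are retained from an incoming
# submission. The purpose-to-fields map is an assumption for illustration.
NECESSARY_FIELDS = {
    "newsletter_signup": {"email"},
    "order_fulfilment": {"name", "email", "shipping_address"},
}

def minimise(purpose: str, submitted: dict) -> dict:
    """Drop, by default, everything not needed for this specific purpose."""
    allowed = NECESSARY_FIELDS.get(purpose, set())
    return {k: v for k, v in submitted.items() if k in allowed}
```

The design choice matters: the default is exclusion (an unknown purpose yields nothing), mirroring the Article 25(2) requirement that accessibility beyond the necessary requires the individual's intervention rather than the organisation's convenience.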
Rectification and Erasure
26.35 See details of obligations below.
Data Security
26.36 The Recitals refer to: network and information security;
Computer Emergency Response Teams (CERTs) and Computer Security
Incident Response Teams (CSIRTs); appropriate technical and organisa-
tional measures; security and risk evaluation; high risk and impact assess-
ments; large-scale processing operations; consultations; data breach and
data breach notification; and Commission delegated acts.
Chapter IV, Section 2 of the GDPR refers to security in detail.
Note that the EDPB also registers details in relation to certification,
seals, and related tools, each of which also impacts data security issues.
Security of Processing
26.37 Having regard to the state of the art and the costs of implemen-
tation and taking into account the nature, scope, context and purposes
of the processing as well as the risk of varying likelihood and sever-
ity for the rights and freedoms of natural persons, the Controller and
the Processor shall implement appropriate technical and organisational
measures, to ensure a level of security appropriate to the risk, including,
inter alia, as appropriate:
●● the pseudonymisation and encryption of personal data;
●● the ability to ensure the ongoing confidentiality, integrity, availabil-
ity and resilience of processing systems and services;
●● the ability to restore the availability and access to data in a timely
manner in the event of a physical or technical incident;
●● a process for regularly testing, assessing and evaluating the effec-
tiveness of technical and organisational measures for ensuring the
security of the processing (Article 32(1)).
In assessing the appropriate level of security account shall be taken in
particular of the risks that are presented by processing, in particular from
accidental or unlawful destruction, loss, alteration, unauthorised disclo-
sure of, or access to personal data transmitted, stored or otherwise pro-
cessed (Article 32(2)).

Adherence to an approved code of conduct pursuant to Article 40 or
an approved certification mechanism pursuant to Article 42 may be used
as an element to demonstrate compliance with the requirements set out
in Article 32(1) (Article 32(3)).
The Controller and Processor shall take steps to ensure that any natu-
ral person acting under the authority of the Controller or the Processor
who has access to personal data shall not process them except on instruc-
tions from the Controller, unless he or she is required to do so by EU or
state law (Article 32(4)).
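Pseudonymisation, the first measure Article 32(1) names, is commonly implemented by replacing a direct identifier with a keyed hash, with the key held separately from the pseudonymised data set. The GDPR does not prescribe any particular technique; the Python sketch below shows one common approach, for illustration only.

```python
import hashlib
import hmac

# One common pseudonymisation technique consistent with Article 32(1)(a):
# replacing a direct identifier with a keyed (HMAC-SHA256) hash. The key
# must be held separately from the pseudonymised data set, since anyone
# holding it can link tokens back to identifiers.
def pseudonymise(identifier: str, secret_key: bytes) -> str:
    return hmac.new(secret_key, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()
```

It should be stressed that pseudonymised data remain personal data under the GDPR, since re-identification remains possible for whoever holds the key; pseudonymisation is a safeguard, not an escape from the regime.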
Prior to the new GDPR, the ICO issued recommendations in rela-
tion to security issues, namely:
●● A Practical Guide to IT Security: Ideal for the Small Business;
●● Guidance on Data Security Breach Management;
●● Notification of Data Security Breaches to the Information
Commissioner’s Office;
●● Notification of Privacy and Electronic Communications Security
Breaches;
●● eIDAS;
●● The Guide to NIS;
●● Data Protection Impact Assessments.
These now need to be read in light of the GDPR changes.
Notifying Data Breach to Supervisory Authority
26.38 In the case of a personal data breach, the Controller shall with-
out undue delay and, where feasible, not later than 72 hours after having
become aware of it, notify the personal data breach to the supervisory
authority competent in accordance with Article 55, unless the personal
data breach is unlikely to result in a risk for the rights and freedoms of
natural persons. Where the notification to the supervisory authority is not
made within 72 hours, it shall be accompanied by reasons for the delay
(Article 33(1)).
The Processor shall notify the Controller without undue delay after
becoming aware of a personal data breach (Article 33(2)).
The notification referred to in Article 33(1) must at least:
●● describe the nature of the personal data breach including where pos-
sible, the categories and approximate number of Data Subjects con-
cerned and the categories and approximate number of data records
concerned;
●● communicate the name and contact details of the DPO or other con-
tact point where more information can be obtained;
●● describe the likely consequences of the personal data breach;

●● describe the measures taken or proposed to be taken by the Controller
to address the personal data breach, including, where appropriate,
measures to mitigate its possible adverse effects (Article 33(3)).
Where, and in so far as, it is not possible to provide the information at
the same time, the information may be provided in phases without undue
further delay (Article 33(4)).
The Controller shall document any personal data breaches, comprising
the facts relating to the personal data breach, its effects and the remedial
action taken. This documentation must enable the supervisory authority
to verify compliance with this Article (Article 33(5)).
Organisations should obviously review their security procedures.
However, in accordance with the GDPR obligations and with planned
preparedness, organisations should have procedures for how and when
they notify the ICO. (Depending on the organisation and
sector, other regulators may also need to be notified).
Each organisation should have a prepared notification procedure and
letter for the ICO in the event that it is needed. This may be incorporated
into an overall data breach procedure for the organisation.
WP29 also refers to breach and breach notification issues in one of its
official documents.77
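The Article 33(1) timing rule — notification without undue delay and, where feasible, within 72 hours of awareness, with reasons required for any later notification — is simple enough to operationalise in a breach procedure. A minimal Python sketch, for illustration only:

```python
from datetime import datetime, timedelta, timezone

# Sketch of the Article 33(1) timing rule: notify the supervisory authority
# without undue delay and, where feasible, within 72 hours of becoming
# aware; a later notification must be accompanied by reasons for the delay.
def notification_deadline(became_aware: datetime) -> datetime:
    return became_aware + timedelta(hours=72)

def requires_delay_reasons(became_aware: datetime, notified: datetime) -> bool:
    return notified > notification_deadline(became_aware)
```

The clock runs from *awareness* of the breach, not from its occurrence, which is why incident-detection and escalation procedures (so that the organisation becomes "aware" promptly and the DPO is informed) matter as much as the notification letter itself.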
Communicating Data Breach to Data Subject
26.39 When the personal data breach is likely to result in a high risk
to the rights and freedoms of natural persons, the Controller shall com-
municate the personal data breach to the Data Subject without undue
delay (Article 34(1)).
The communication to the Data Subject referred to in Article 34(1)
shall describe in clear and plain language the nature of the personal data
breach and contain at least the information and measures referred to in
Article 33(3)(b), (c) and (d) (Article 34(2)).
The communication to the Data Subject shall not be required if:
●● the Controller has implemented appropriate technical and organisa-
tional protection measures, and that those measures were applied to
the data affected by the personal data breach, in particular those that
render the data unintelligible to any person who is not authorised to
access it, such as encryption;

77 WP29, Opinion 03/2014 on Personal Data Breach Notification. Also see WP29;
Working Document 01/2011 on the current EU personal data breach framework
and recommendations for future policy developments; Opinion 06/2012 on the draft
Commission Decision on the measures applicable to the notification of personal data
breaches under Directive 2002/58/EC on privacy and electronic communications.


●● the Controller has taken subsequent measures which ensure that the
high risk for the rights and freedoms of Data Subjects referred to in
Article 34(1) is no longer likely to materialise;
●● it would involve disproportionate effort. In such a case, there shall
instead be a public communication or similar measure, whereby
the Data Subjects are informed in an equally effective manner
(Article 34(3)).
If the Controller has not already communicated the personal data breach
to the Data Subject, the supervisory authority, having considered the
likelihood of the personal data breach resulting in a high risk, may
require it to do so or may decide that any of the conditions referred to in
Article 34(3) are met (Article 34(4)).
Each organisation should have a prepared Data Subject notification
procedure and letter/email for the Data Subject in the event that it is
needed. This may be incorporated into an overall data breach procedure
for the organisation.
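The Article 34 decision — whether the Data Subjects themselves must be told — can usefully be reduced to a decision tree in the organisation's breach procedure. The sketch below (Python, with simplified yes/no inputs of our own devising) illustrates the structure of Article 34(1) and the Article 34(3) exemptions; it is a planning aid, not a substitute for assessing each exemption on the facts.

```python
# Sketch of the Article 34 decision: communication to Data Subjects is
# required only for high-risk breaches (Art 34(1)), subject to the three
# Article 34(3) exemptions. Inputs are simplified flags for illustration.
def must_notify_data_subjects(high_risk: bool,
                              data_rendered_unintelligible: bool,
                              risk_no_longer_likely: bool,
                              disproportionate_effort: bool) -> bool:
    if not high_risk:
        return False  # Article 34(1) threshold not met
    if data_rendered_unintelligible:
        return False  # Article 34(3)(a), e.g. encryption applied to the data
    if risk_no_longer_likely:
        return False  # Article 34(3)(b), subsequent measures taken
    if disproportionate_effort:
        return False  # Article 34(3)(c): public communication instead
    return True
```

Note that the disproportionate-effort exemption does not remove the duty to inform — it substitutes a public communication or similar measure — and that under Article 34(4) the supervisory authority may override the organisation's own assessment.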
Data Protection Impact Assessment and Prior Consultation
26.40 Chapter IV, Section 3 of the GDPR refers to impact assess-
ments and prior consultations. The WP29 also refers to impact assess-
ments.78 The ICO also has a guide entitled ‘Data Protection Impact
Assessments’.79
Data Protection Impact Assessment
26.41 Where a type of processing in particular using new technolo-
gies, and taking into account the nature, scope, context and purposes of
the processing, is likely to result in a high risk for the rights and free-
doms of natural persons, the Controller shall, prior to the processing,

78 WP29 Opinion 03/2014 on Personal Data Breach Notification; Opinion 07/2013 on
the Data Protection Impact Assessment Template for Smart Grid and Smart Meter-
ing Systems (‘DPIA Template’) prepared by Expert Group 2 of the Commission’s
Smart Grid Task Force; Opinion 04/2013 on the Data Protection Impact Assessment
Template for Smart Grid and Smart Metering Systems (‘DPIA Template’) prepared
by Expert Group 2 of the Commission’s Smart Grid Task Force; Opinion 9/2011 on
the revised Industry Proposal for a Privacy and Data Protection Impact Assessment
Framework for RFID Applications (and Annex: Privacy and Data Protection Impact
Assessment Framework for RFID Applications); Opinion 5/2010 on the Industry
Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID
Applications; Statement on the role of a risk-based approach in data protection legal
framework, 2014.
79 ICO “Data Protection Impact Assessments,” at https://ico.org.uk/for-organisations/
guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/
accountability-and-governance/data-protection-impact-assessments/.


carry out an assessment of the impact of the envisaged processing operations on the protection of personal data. A single assessment may address
a set of similar processing operations that present similar high risks
(Article 35(1)).
The Controller shall seek the advice of the DPO, where designated,
when carrying out a data protection impact assessment (Article 35(2)).
A data protection impact assessment referred to in Article 35(1) shall
in particular be required in the case of:
●● a systematic and extensive evaluation of personal aspects relating
to natural persons which is based on automated processing, includ-
ing profiling, and on which decisions are based that produce legal
effects concerning the individual person or similarly significantly
affect the natural person;
●● processing on a large scale of special categories of data referred to in
Article 9(1), or of personal data relating to criminal convictions and
offences referred to in Article 10;
●● a systematic monitoring of a publicly accessible area on a large scale
(Article 35(3)).
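Since Article 35(3) makes a DPIA mandatory where any one of these three triggers is present, an initial screening step can be automated in compliance tooling. The Python sketch below uses shorthand labels of our own for the three triggers; it illustrates the any-one-suffices logic only, and does not replace the broader Article 35(1) high-risk assessment or the supervisory authority's Article 35(4) list.

```python
# Our own shorthand labels for the Article 35(3)(a)-(c) triggers that make a
# data protection impact assessment mandatory; any one trigger suffices.
ARTICLE_35_3_TRIGGERS = (
    "systematic_extensive_profiling_with_legal_effects",  # Art 35(3)(a)
    "large_scale_special_or_criminal_data",               # Art 35(3)(b)
    "large_scale_systematic_public_monitoring",           # Art 35(3)(c)
)

def dpia_required(processing_features: set) -> bool:
    """True if any Article 35(3) trigger is present in the described processing."""
    return any(t in processing_features for t in ARTICLE_35_3_TRIGGERS)
```

A negative screening result is not conclusive: processing outside the three named cases may still be "likely to result in a high risk" under Article 35(1), or appear on the list the supervisory authority publishes under Article 35(4).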
The supervisory authority shall establish and make public a list of the
kind of processing operations which are subject to the requirement for a
data protection impact assessment pursuant to Article 35(1). The supervi-
sory authority shall communicate those lists to the EDPB (Article 35(4)).
The supervisory authority may also establish and make public a list
of the kind of processing operations for which no data protection impact
assessment is required. The supervisory authority shall communicate
those lists to the EDPB (Article 35(5)).
Prior to the adoption of the lists referred to in Article 35(4) and (5) the
competent supervisory authority shall apply the consistency mechanism
referred to in Article 63 where such lists involve processing activities
which are related to the offering of goods or services to Data Subjects or
to the monitoring of their behaviour in several states, or may substantially
affect the free movement of personal data within the EU (Article 35(6)).
The assessment shall contain at least:
●● a systematic description of the envisaged processing operations and
the purposes of the processing, including where applicable the legit-
imate interest pursued by the Controller;
●● an assessment of the necessity and proportionality of the processing
operations in relation to the purposes;
●● an assessment of the risks to the rights and freedoms of Data Subjects
referred to in Article 35(1); and
●● the measures envisaged to address the risks, including safeguards,
security measures and mechanisms to ensure the protection of

personal data and to demonstrate compliance with the GDPR taking into account the rights and legitimate interests of Data Subjects and
other persons concerned (Article 35(7)).
Compliance with approved codes of conduct referred to in Article 40 by
the relevant Controllers or Processors shall be taken into due account
in assessing the impact of the processing operations performed by such
Controllers or Processors, in particular for the purposes of a data protec-
tion impact assessment (Article 35(8)).
Where appropriate, the Controller shall seek the views of Data Subjects
or their representatives on the intended processing, without prejudice to
the protection of commercial or public interests or the security of the
processing operations (Article 35(9)).
Where the processing pursuant to Article 6(1)(c) or (e) has a legal
basis in EU law, or the law of the state to which the Controller is sub-
ject, that law regulates the specific processing operation or set of opera-
tions in question, and a data protection impact assessment has already
been carried out as part of a general impact assessment in the context of
the adoption of that legal basis, Article 35(1)–(7) shall not apply, unless
states deem it to be necessary to carry out such an assessment prior to the
processing activities (Article 35(10)).
Where necessary, the Controller shall carry out a review to assess if the processing is performed in accordance with the data protection impact assessment, at least when there is a change of the risk represented by the processing operations (Article 35(11)).
Organisations must assess their obligations in terms of implementing
procedures for, and undertaking of, DPIAs.80
Prior Consultation
26.42 The Controller shall consult the supervisory authority prior to
the processing where a data protection impact assessment as provided
for in Article 35 indicates that the processing would result in a high risk
in the absence of measures taken by the Controller to mitigate the risk
(Article 36(1)).
Where the supervisory authority is of the opinion that the intended processing referred to in Article 36(1) would infringe the GDPR, in particular where the Controller has insufficiently identified or mitigated the risk, it shall within a maximum period of eight weeks following the request for consultation provide written advice to the Controller and, where applicable to the Processor, and may use any of its powers

80 Generally note D Wright and P de Hert, eds, Privacy Impact Assessment (Springer
2012).

referred to in Article 58. This period may be extended by six weeks, taking into account the complexity of the intended processing. The
supervisory authority shall inform the Controller and, where applica-
ble, the Processor, of any such extension within one month of receipt
of the request for consultation together with the reasons for the delay.
Those periods may be suspended until the supervisory authority has obtained information it has requested for the purposes of the consultation (Article 36(2)).
When consulting the supervisory authority pursuant to Article 36(1), the Controller shall provide the supervisory authority with:
●● where applicable, the respective responsibilities of Controller, joint
Controllers and Processors involved in the processing, in particular
for processing within a group of undertakings;
●● the purposes and means of the intended processing;
●● the measures and safeguards provided to protect the rights and free-
doms of Data Subjects pursuant to the GDPR;
●● where applicable, the contact details of the DPO;
●● the data protection impact assessment provided for in Article 35; and
●● any other information requested by the supervisory authority
(Article 36(3)).
States shall consult the supervisory authority during the preparation of a
proposal for a legislative measure to be adopted by a national parliament
or of a regulatory measure based on such a legislative measure, which
relates to the processing (Article 36(4)).
Notwithstanding Article 36(1), states’ law may require Controllers to
consult with, and obtain prior authorisation from, the supervisory author-
ity in relation to the processing by a Controller for the performance of
a task carried out by the Controller in the public interest, including pro-
cessing in relation to social protection and public health (Article 36(5)).
Data Protection Officer
26.43 See obligations below (at para 26.72).
Data Transfers
26.44 Recitals 101–115 refer to cross border data transfers. The WP29
also refers to transfers.81

81 WP29 Statement on the implementation of the judgement of the Court of Justice of the European Union of 6 October 2015 in the Maximilian Schrems v Data Protection Commissioner case (C-362/14); Recommendation 1/2012 on the Standard Application form for Approval of Binding Corporate Rules for the Transfer of Personal Data for Processing Activities; Opinion 3/2009 on the Draft Commission Decision on standard

Chapter V of the GDPR refers to data transfers and transfer of personal data to third countries or international organisations.
Prior to the new GDPR, the ICO issued recommendations in relation to personal data transfer issues, namely:
●● Assessing Adequacy;
●● International Transfers – Legal Guidance;
●● Model Contract Clauses;
●● Outsourcing – A Guide for Small and Medium-Sized Businesses.
These now need to be read in light of the new changes.
General Principle for Transfers
26.45 Any transfer of personal data which are undergoing processing
or are intended for processing after transfer to a third country or to an
international organisation shall take place only if, subject to the other
provisions of the GDPR, the conditions laid down in this Chapter are
complied with by the Controller and Processor, including for onward
transfers of personal data from the third country or an international
organisation to another third country or to another international organisation. All provisions in this Chapter shall be applied in order to ensure that the level of protection of natural persons guaranteed by the GDPR is not undermined (Article 44).
Transfers via Adequacy Decision
26.46 A transfer of personal data to a third country or an international
organisation may take place where the Commission has decided that the
third country, a territory or one or more specified sectors within that third
country, or the international organisation in question ensures an adequate
level of protection. Such transfer shall not require any specific authorisa-
tion (Article 45(1)).
When assessing the adequacy of the level of protection, the
Commission shall, in particular, take account of the following elements:
●● the rule of law, respect for human rights and fundamental freedoms,
relevant legislation, both general and sectoral, including concerning

contractual clauses for the transfer of personal data to processors established in third countries, under Directive 95/46/EC (data controller to data processor); FAQs in order to address some issues raised by the entry into force of the EU Commission Decision 2010/87/EU of 5 February 2010 on standard contractual clauses for the transfer of personal data to processors established in third countries under Directive 95/46/EC 2010; Recommendation 1/2007 on the Standard Application for Approval of Binding Corporate Rules for the Transfer of Personal Data.
public security, defence, national security and criminal law and the
access of public authorities to personal data, as well as the imple-
mentation of this legislation, data protection rules, professional
rules and security measures, including rules for onward transfer of
personal data to another third country or international organisation,
which are complied with in that country or international organisa-
tion, case law, as well as effective and enforceable Data Subject
rights and effective administrative and judicial redress for the Data
Subjects whose personal data are being transferred;
●● the existence and effective functioning of one or more independ-
ent supervisory authorities in the third country or to which an inter-
national organisation is subject, with responsibility for ensuring
and enforcing compliance with the data protection rules, including
adequate enforcement powers for assisting and advising the Data Subjects in exercising their rights and for co-operation with the supervisory authorities of the states; and
●● the international commitments the third country or international
organisation concerned has entered into, or other obligations arising
from legally binding conventions or instruments as well as from its
participation in multilateral or regional systems, in particular in rela-
tion to the protection of personal data (Article 45(2)).
The Commission, after assessing the adequacy of the level of protection, may decide by means of implementing act, that a third country, or a territory or one or more specified sectors within that third country, or an international organisation ensures an adequate level of protection within the meaning of Article 45(2) (Article 45(3)).
The Commission shall, on an on-going basis, monitor developments
in third countries and international organisations that could affect the
functioning of decisions adopted pursuant to Article 45(3) and decisions
adopted on the basis of Article 25(6) of DPD 1995 (Article 45(4)).
The Commission shall publish in the Official Journal of the EU and
on its website a list of those third countries, territories and specified sec-
tors within a third country and international organisations where it has
decided that an adequate level of protection is or is no longer ensured
(Article 45(8)).
Decisions adopted by the Commission on the basis of Article 25(6) of
DPD 1995 shall remain in force until amended, replaced or repealed by
a Commission Decision adopted in accordance with Article 45(3) or (5)
(Article 45(9)).


Note that the EU-US Safe Harbour data transfer regime was struck
down by the Court of Justice in the Schrems case.82 The Safe Harbour
regime was held to be invalid. Notwithstanding the GDPR, the previous Safe Harbour regime needed to be replaced. Negotiations between the EU Commission and the US authorities ensued. Agreement was reached and a replacement arrangement, entitled the EU-US Privacy Shield, was implemented in 2016.
It should also be noted that there have been some concerns that the same or similar reasons for the striking down of the Safe Harbour regime may arise for some of the other transfer legitimising mechanisms. It remains to be seen whether further challenges or concerns will undermine the Privacy Shield, standard clauses, etc.
The issues of data transfers and adequacy decisions have added significance in the UK now that Brexit has been passed by Parliament (in the form of the EU (Withdrawal Agreement) Act 2020) and the transition period is well advanced; the UK is seeking a formal adequacy decision from the EU in order to ease and facilitate the continued transfer of EU data into the UK. (See also Chapter 24.)
Transfers via Appropriate Safeguards
26.47 In the absence of a decision pursuant to Article 45(3), a
Controller or Processor may transfer personal data to a third country
or an international organisation only if the Controller or Processor has
provided appropriate safeguards, and on condition that enforceable Data
Subject rights and effective legal remedies for Data Subjects are avail-
able (Article 46(1)).
The appropriate safeguards referred to in Article 46(1) may be pro-
vided for, without requiring any specific authorisation from a supervi-
sory authority, by:
●● a legally binding and enforceable instrument between public author-
ities or bodies;
●● binding corporate rules in accordance with Article 47;

82 Schrems v Commissioner, Court of Justice, Case C-362/14, 6 October 2015. The case
technically related to Prism and Facebook Europe and transfers to the US. However,
the wider import turned out to be the entire EU-US Safe Harbour Agreement and data
transfers to the US. Note WP29 statement on the case, Statement on the implementa-
tion of the judgement of the Court of Justice of the European Union of 6 October 2015
in the Maximilian Schrems v Data Protection Commissioner case (C-362/14).

●● standard data protection clauses adopted by the Commission in accordance with the examination procedure referred to in
Article 93(2);
●● standard data protection clauses adopted by a supervisory authority
and approved by the Commission pursuant to the examination pro-
cedure referred to in Article 93(2);
●● an approved code of conduct pursuant to Article 40 together with
binding and enforceable commitments of the Controller or Processor
in the third country to apply the appropriate safeguards, including as
regards Data Subjects’ rights; or
●● an approved certification mechanism pursuant to Article 42 together
with binding and enforceable commitments of the Controller or
Processor in the third country to apply the appropriate safeguards,
including as regards Data Subjects’ rights (Article 46(2)).
Subject to the authorisation from the competent supervisory authority,
the appropriate safeguards referred to in Article 46(1) may also be pro-
vided for, in particular, by:
●● contractual clauses between the Controller or Processor and the
Controller, Processor or the recipient of the personal data in the third
country or international organisation; or
●● provisions to be inserted into administrative arrangements between
public authorities or bodies which include enforceable and effective
Data Subject rights (Article 46(3)).
The supervisory authority shall apply the consistency mechanism referred
to in Article 63 in the cases referred to in Article 46(3) (Article 46(4)).
Authorisations by a state or supervisory authority on the basis of Article 26(2) of DPD 1995 shall remain valid until amended, replaced or repealed, if necessary, by that supervisory authority. Decisions adopted by the Commission on the basis of Article 26(4) of DPD 1995 shall remain in force until amended, replaced or repealed, if necessary, by a Commission Decision adopted in accordance with Article 46(2) (Article 46(5)).
Transfers via Binding Corporate Rules
26.48 The competent supervisory authority shall approve binding cor-
porate rules in accordance with the consistency mechanism set out in
Article 63, provided that they:
●● are legally binding and apply to and are enforced by every mem-
ber concerned of the group of undertakings or groups of enterprises
engaged in a joint economic activity, including their employees;

●● expressly confer enforceable rights on Data Subjects with regard to the processing of their personal data; and
●● fulfil the requirements laid down in Article 47(2) (Article 47(1)).
The binding corporate rules shall specify at least:
●● the structure and contact details of the group of enterprises engaged
in a joint economic activity and of each of its members;
●● the data transfers or set of transfers, including the categories of per-
sonal data, the type of processing and its purposes, the type of Data
Subjects affected and the identification of the third country or coun-
tries in question;
●● their legally binding nature, both internally and externally;
●● the application of the general data protection Principles, in particular
purpose limitation, data minimisation, limited storage periods, data
quality, data protection by design (DPbD) and by default, legal basis
for the processing, processing of special categories of personal data,
measures to ensure data security, and the requirements in respect
of onward transfers to bodies not bound by the binding corporate
rules [d];
●● the rights of Data Subjects in regard to processing and the means to
exercise these rights, including the right not to be subject to deci-
sions based solely on automated processing, including profiling in
accordance with Article 22, the right to lodge a complaint with the
competent supervisory authority and before the competent courts of
the states in accordance with Article 79, and to obtain redress and,
where appropriate, compensation for a breach of the binding corpo-
rate rules [e];
●● the acceptance by the Controller or Processor established on the ter-
ritory of a state of liability for any breaches of the binding corpo-
rate rules by any member concerned not established in the EU; the
Controller or the Processor shall be exempted from this liability, in
whole or in part, only if it proves that that member is not responsible
for the event giving rise to the damage [f];
●● how the information on the binding corporate rules, in particular on
the provisions referred to in this Article 47(2)(d), (e) and (f) is pro-
vided to the Data Subjects in addition to Articles 13 and 14;
●● the tasks of any DPO designated in accordance with Article 37 or
any other person or entity in charge of monitoring compliance with
the binding corporate rules within the group of undertakings, or
group of enterprises engaging in a joint economic activity, as well as
monitoring training and complaint handling;
●● the complaint procedures;

●● the mechanisms within the group of undertakings, or group of enterprises engaging in a joint economic activity for ensuring the
verification of compliance with the binding corporate rules. Such
mechanisms shall include data protection audits and methods for
ensuring corrective actions to protect the rights of the Data Subject.
Results of such verification should be communicated to the person
or entity referred under point (h) and to the board of the controlling
undertaking or of the group of enterprises engaged in a joint eco-
nomic activity, and should be available upon request to the compe-
tent supervisory authority;
●● the mechanisms for reporting and recording changes to the rules and
reporting those changes to the supervisory authority;
●● the co-operation mechanism with the supervisory authority to ensure
compliance by any member of the group of undertakings, or group
of enterprises engaging in a joint economic activity, in particular by
making available to the supervisory authority the results of verifica-
tions of the measures referred to in this Article 47(2)(j);
●● the mechanisms for reporting to the competent supervisory authority
any legal requirements to which a member of the group of undertak-
ings, or group of enterprises engaging in a joint economic activity
is subject in a third country which are likely to have a substantial
adverse effect on the guarantees provided by the binding corporate
rules; and
●● the appropriate data protection training to personnel having perma-
nent or regular access to personal data (Article 47(2)).
The Commission may specify the format and procedures for the exchange of information between Controllers, Processors and supervisory authorities for binding corporate rules. Those implementing acts shall be adopted in accordance with the examination procedure set out in Article 93(2) (Article 47(3)).
Given that the EU-US Safe Harbour data transfer regime was struck
down by the Court of Justice in the Schrems case,83 there have been some
concerns that a similar strike down problem could arise for other trans-
fer legitimising mechanisms, including BCRs. The WP29 also refers to
BCR issues, including the Schrems case.84

83 Schrems v Commissioner, Court of Justice, Case C-362/14, 6 October 2015.


84 WP29 Statement on the implementation of the judgement of the Court of Justice of the European Union of 6 October 2015 in the Maximilian Schrems v Data Protection Commissioner case (C-362/14); Explanatory Document on the Processor Binding Corporate Rules, Revised version May 2015 – WP 204; Opinion 02/2014 on a Referential for requirements for Binding Corporate Rules submitted to national Data Protection Authorities in the EU and Cross Border Privacy Rules submitted to APEC CBPR

Transfers or Disclosures Not Authorised by EU Law


26.49 Any judgment of a court or tribunal and any decision of an
administrative authority of a third country requiring a Controller or
Processor to transfer or disclose personal data may only be recognised or
enforceable in any manner if based on an international agreement, such
as a mutual legal assistance treaty, in force between the requesting third
country and the EU or a state, without prejudice to other grounds for
transfer pursuant to this Chapter (Article 48).
Derogations for Specific Situations
26.50 In the absence of an adequacy decision pursuant to Article 45(3),
or of appropriate safeguards pursuant to Article 46, including binding
corporate rules, a transfer or a set of transfers of personal data to a third
country or an international organisation shall take place only on one of
the following conditions:
●● the Data Subject has explicitly consented to the proposed transfer,
after having been informed of the possible risks of such transfers
for the Data Subject due to the absence of an adequacy decision and
appropriate safeguards; [a]
●● the transfer is necessary for the performance of a contract between
the Data Subject and the Controller or the implementation of pre-
contractual measures taken at the Data Subject’s request; [b]
●● the transfer is necessary for the conclusion or performance of a
contract concluded in the interest of the Data Subject between the
Controller and another natural or legal person; [c]
●● the transfer is necessary for important reasons of public interest;
●● the transfer is necessary for the establishment, exercise or defence
of legal claims;
●● the transfer is necessary in order to protect the vital interests of the
Data Subject or of other persons, where the Data Subject is physi-
cally or legally incapable of giving consent;

Accountability Agents; Opinion 07/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (‘DPIA Template’) prepared by Expert Group 2 of the Commission’s Smart Grid Task Force; Opinion 04/2013 on the Data Protection Impact Assessment Template for Smart Grid and Smart Metering Systems (‘DPIA Template’) prepared by Expert Group 2 of the Commission’s Smart Grid Task Force; Working Document 02/2012 setting up a table with the elements and principles to be found in Processor Binding Corporate Rules; Recommendation 1/2012 on the Standard Application form for Approval of Binding Corporate Rules for the Transfer of Personal Data for Processing Activities; Opinion 5/2010 on the Industry Proposal for a Privacy and Data Protection Impact Assessment Framework for RFID Applications.

●● the transfer is made from a register which according to EU or state law is intended to provide information to the public and which is
open to consultation either by the public in general or by any person
who can demonstrate a legitimate interest, but only to the extent
that the conditions laid down in EU or state law for consultation are
fulfilled in the particular case. [g]
Where a transfer could not be based on a provision in Article 45 or
46, including the provisions on binding corporate rules, and none of
the derogations for a specific situation referred to in the first subpara-
graph of this paragraph is applicable, a transfer to a third country or
an international organisation may take place only if the transfer is not
repetitive, concerns only a limited number of data subjects, is neces-
sary for the purposes of compelling legitimate interests pursued by the
Controller which are not overridden by the interests or rights and free-
doms of the Data Subject, and the Controller has assessed all the cir-
cumstances surrounding the data transfer and has on the basis of that
assessment provided suitable safeguards with regard to the protection
of personal data. The Controller shall inform the supervisory author-
ity of the transfer. The Controller shall, in addition to providing the
information referred to in Articles 13 and 14, inform the Data Subject
of the transfer and on the compelling legitimate interests pursued
(Article 49(1)).
A transfer pursuant to Article 49(1)(g) shall not involve the entirety of
the personal data or entire categories of the personal data contained in
the register. Where the register is intended for consultation by persons
having a legitimate interest, the transfer shall be made only at the request
of those persons or if they are to be the recipients (Article 49(2)).
Article 49(1)(a), (b) and (c) and the second subparagraph shall not apply to activities carried out by public authorities in the exercise of their public powers (Article 49(3)).
The public interest referred to in Article 49(1)(d) must be recognised in EU law or in the law of the state to which the Controller is subject (Article 49(4)).
In the absence of an adequacy decision, EU law or state law may, for
important reasons of public interest, expressly set limits to the transfer of
specific categories of personal data to a third country or an international
organisation. States shall notify such provisions to the Commission
(Article 49(5)).
The Controller or Processor shall document the assessment as well as
the suitable safeguards referred to in Article 49(1) in the records referred
to in Article 30 (Article 49(6)).


Data Subject Rights


26.51 The Recitals refer to: Data Subject rights; principles of fair and
transparent processing; prior information requirements; right of access;
right of rectification and right to be forgotten (RtbF); right to complain
to single supervisory authority; automated processing.
Chapter III of the GDPR refers to the rights of Data Subjects. These
include:
●● right to transparency (Article 5; Article 12);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights re automated individual decision making, including profiling
(Article 22);
●● DPbD (Article 25);
●● security rights;
●● data protection impact assessment and prior consultation;
●● communicating data breach to data subject;
●● Data Protection Officer (DPO);
●● remedies, liability and sanctions (Chapter VIII).
Transparency
26.52 Article 5 of the GDPR provides that personal data shall be
‘processed … in a transparent manner in relation to the Data Subject.’
Transparency:
‘require[s] greater awareness among citizens about the processing going on:
its existence, its content and the flows generated in and out by using terminals.

Transparency also relates to security of data and risk management.’85

Some commentators have suggested the GDPR could go further. It has been suggested that 'the greater the flow of information systems the

85 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law &
­Security Review (2012) (28) 254–262, at 256.

more opaque it becomes in modern information systems and with new ICT applications. In that case the right to transparency must increase
alongside these new processes.’86
Article 5 of the new GDPR in the data protection Principles first prin-
ciple now reads that personal data shall be:
‘processed lawfully, fairly and in a transparent manner in relation to the Data
Subject (“lawfulness, fairness and transparency”).’

Article 12 refers to transparent information, communication and modali-


ties for exercising the rights of the Data Subject.
The Controller shall take appropriate measures to provide any infor-
mation referred to in Articles 13 and 14 and any communication under
Articles 15–22, and 34 relating to processing to the Data Subject in a
concise, transparent, intelligible and easily accessible form, using clear
and plain language, in particular for any information addressed specif-
ically to a child. The information shall be provided in writing, or by
other means, including, where appropriate, by electronic means. When requested
by the Data Subject, the information may be provided orally, pro-
vided that the identity of the Data Subject is proven by other means
(Article 12(1)).
The Controller shall facilitate the exercise of Data Subject rights under
Articles 15–22. In cases referred to in Article 11(2) the Controller shall
not refuse to act on the request of the Data Subject for exercising their
rights under Articles 15–22, unless the Controller demonstrates that it is
not in a position to identify the Data Subject (Article 12(2)).
The Controller shall provide information on action taken on a request
under Articles 15–22 to the Data Subject without undue delay and in any
event within one month of receipt of the request. That period may be
extended by two further months when necessary, taking into account the
complexity and the number of the requests. The Controller shall inform
the Data Subject of any such extension within one month of receipt of the
request, together with the reasons for the delay. Where the Data Subject
makes the request by electronic means, the information shall be provided
by electronic means where possible, unless otherwise requested by the
Data Subject (Article 12(3)).
If the Controller does not take action on the request of the Data
Subject, the Controller shall inform the Data Subject without delay and
at the latest within one month of receipt of the request of the reasons for
not taking action and on the possibility of lodging a complaint to a super-
visory authority and seeking a judicial remedy (Article 12(4)).

86 See above, at 256.

Information provided under Articles 13 and 14 and any communication and any actions taken under Articles 15–22 and 34 shall be provided free of charge. Where requests from a Data Subject are manifestly
unfounded or excessive, in particular because of their repetitive charac-
ter, the Controller may either charge a reasonable fee taking into account
the administrative costs for providing the information or the commu-
nication or taking the action requested; or refuse to act on the request.
The Controller shall bear the burden of demonstrating the manifestly
unfounded or excessive character of the request (Article 12(5)).
Without prejudice to Article 11, where the Controller has reasonable
doubts concerning the identity of the natural person making the request
referred to in Articles 15–21, the Controller may request the provision
of additional information necessary to confirm the identity of the Data
Subject (Article 12(6)).
The information to be provided to Data Subjects pursuant to Articles 13
and 14 may be provided in combination with standardised icons in order
to give in an easily visible, intelligible and clearly legible manner, a
meaningful overview of the intended processing. Where the icons are
presented electronically they shall be machine-readable (Article 12(7)).
The Commission shall be empowered to adopt delegated acts in
accordance with Article 92 for the purpose of determining the informa-
tion to be presented by the icons and the procedures for providing stand-
ardised icons (Article 12(8)).
Prior to the new GDPR, the ICO issued recommendations in relation
to privacy notices, namely:
●● Getting It Right: Collecting Information about Your Customers;
●● Privacy Notices Code of Practice.
These now need to be read in light of the GDPR changes.
Data Access Rights
26.53 Chapter III, Section 2 of the GDPR refers to data access. It
should be noted that the prior information requirements might have pre-
viously been viewed as Controller compliance obligations. However,
they are now reformulated as part of the Data Subject rights. This there-
fore is a change in emphasis.
Right to Prior Information: Directly Obtained Data
26.54 Article 13 refers to information to be provided where the data
are collected from the Data Subject. Where personal data relating to a
Data Subject are collected from the Data Subject, the Controller shall, at
the time when personal data are obtained, provide the Data Subject with
the following information:
●● the identity and the contact details of the Controller and, where
applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are
intended as well as the legal basis for the processing;
●● where the processing is based on Article 6(1)(f), the legitimate inter-
ests pursued by the Controller or by a third party;
●● the recipients or categories of recipients of the personal data,
if any;
●● where applicable, the fact that the Controller intends to transfer per-
sonal data to a third country or international organisation and the
existence or absence of an adequacy decision by the Commission, or
in the case of transfers referred to in Articles 46 or 47, or the second
subparagraph of Article 49(1), reference to the appropriate or suit-
able safeguards and the means by which to obtain a copy of them or
where they have been made available (Article 13(1)).
In addition to the information referred to in Article 13(1), the Controller
shall, at the time when personal data are obtained, provide the Data
Subject with the following further information necessary to ensure fair
and transparent processing:
●● the period for which the personal data will be stored, or if that is not
possible, the criteria used to determine that period;
●● the existence of the right to request from the Controller access to and
rectification or erasure of personal data or restriction of processing
concerning the Data Subject or to object to the processing as well as
the right to data portability;
●● where the processing is based on Article 6(1)(a) or Article 9(2)(a),
the existence of the right to withdraw consent at any time, without
affecting the lawfulness of processing based on consent before its
withdrawal;
●● the right to lodge a complaint to a supervisory authority;
●● whether the provision of personal data is a statutory or contractual
requirement, or a requirement necessary to enter into a contract, as
well as whether the Data Subject is obliged to provide the data and
of the possible consequences of failure to provide such data;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 13(2)).
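A Controller drafting a privacy notice might track the Article 13(1) and 13(2) information items in a simple compliance checklist. The sketch below is illustrative only; the item labels are invented shorthand for the obligations listed above, not GDPR-defined terms.

```python
# Hypothetical checklist of the Article 13(1) and 13(2) information items;
# the keys are illustrative labels, not terms defined in the GDPR.
ARTICLE_13_ITEMS = [
    "controller_identity_and_contact",
    "dpo_contact",                          # where applicable
    "purposes_and_legal_basis",
    "legitimate_interests",                 # where Article 6(1)(f) is relied on
    "recipients_or_categories",
    "third_country_transfers",              # where applicable
    "storage_period_or_criteria",
    "data_subject_rights",
    "right_to_withdraw_consent",            # where consent is the basis
    "right_to_complain_to_authority",
    "statutory_or_contractual_requirement",
    "automated_decision_making",
]

def missing_items(notice: dict) -> list:
    # Return the checklist items not yet covered by a draft privacy notice.
    return [item for item in ARTICLE_13_ITEMS if not notice.get(item)]

draft = {"controller_identity_and_contact": "Example Ltd, privacy@example.com",
         "purposes_and_legal_basis": "order fulfilment; Article 6(1)(b)"}
print(missing_items(draft))
```

Whether a given item applies (for example, the DPO contact details or transfer safeguards) still calls for a legal judgement in each case; the checklist only surfaces what has not been addressed.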

Where the Controller intends to further process the data for a purpose
other than that for which the personal data were collected, the Controller
shall provide the Data Subject prior to that further processing with infor-
mation on that other purpose and with any relevant further information
as referred to in Article 13(2) (Article 13(3)).
Article 13(1), (2) and (3) shall not apply where and insofar as the Data
Subject already has the information (Article 13(4)).
Right to Prior Information: Indirectly Obtained Data
26.55 Article 14 refers to information to be provided where the data
have not been obtained from the Data Subject. Where personal data have
not been obtained from the Data Subject, the Controller shall provide the
Data Subject with the following information:
●● the identity and the contact details of the Controller and, where
applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are
intended as well as the legal basis of the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipients of the personal data, if any;
●● where applicable, that the Controller intends to transfer personal data
to a recipient in a third country or international organisation and the
existence or absence of an adequacy decision by the Commission,
or in case of transfers referred to in Article 46 or 47, or the second
subparagraph of Article 49(1), reference to the appropriate or suit-
able safeguards and the means to obtain a copy of them or where
they have been made available (Article 14(1)).
In addition to the information referred to in Article 14(1), the Controller
shall provide the Data Subject with the following information necessary
to ensure fair and transparent processing in respect of the Data Subject:
●● the period for which the personal data will be stored, or if this is not
possible, the criteria used to determine that period;
●● where the processing is based on Article 6(1)(f), the legitimate inter-
ests pursued by the Controller or by a third party;
●● the existence of the right to request from the Controller access to and
rectification or erasure of personal data or restriction of processing
concerning the Data Subject and to object to processing as well as
the right to data portability;
●● where processing is based on Article 6(1)(a) or Article 9(2)(a), the
existence of the right to withdraw consent at any time, without
affecting the lawfulness of processing based on consent before its
withdrawal;

●● the right to lodge a complaint to a supervisory authority;
●● from which source the personal data originate, and if applicable,
whether it came from publicly accessible sources;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 14(2)).
The Controller shall provide the information referred to in Article 14(1)
and (2),
●● within a reasonable period after obtaining the data, but at the latest
within one month, having regard to the specific circumstances in
which the data are processed; or
●● if the data are to be used for communication with the Data Subject,
at the latest at the time of the first communication to that Data
Subject; or
●● if a disclosure to another recipient is envisaged, at the latest when
the data are first disclosed (Article 14(3)).
Where the Controller intends to further process the data for a purpose
other than the one for which the data were obtained, the Controller shall
provide the Data Subject prior to that further processing with informa-
tion on that other purpose and with any relevant further information as
referred to in Article 14(2) (Article 14(4)).
Article 14(1) to (4) shall not apply where and insofar as:
●● the Data Subject already has the information;
●● the provision of such information proves impossible or would
involve a disproportionate effort, in particular for processing for
archiving purposes in the public interest, scientific or historical
research purposes or statistical purposes subject to the conditions
and safeguards referred to in Article 89(1) or in so far as the obliga-
tion referred to in Article 14(1) is likely to render impossible or seri-
ously impair the achievement of the objectives of that processing.
In such cases the Controller shall take appropriate measures to pro-
tect the Data Subject’s rights and freedoms and legitimate interests,
including making the information publicly available;
●● obtaining or disclosure is expressly laid down by EU or state law
to which the Controller is subject and which provides appropriate
measures to protect the Data Subject’s legitimate interests; or
●● where the data must remain confidential subject to an obligation of
professional secrecy regulated by EU or state law, including a statu-
tory obligation of secrecy (Article 14(5)).

Right of Confirmation and Right of Access
26.56 The Data Subject shall have the right to obtain from the
Controller confirmation as to whether or not personal data concerning
them are being processed, and where that is the case, access to the per-
sonal data and the following information:
●● the purposes of the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipients to whom the personal data
have been or will be disclosed, in particular recipients in third coun-
tries or international organisations;
●● where possible, the envisaged period for which the personal data
will be stored, or if not possible, the criteria used to determine that
period;
●● the existence of the right to request from the Controller rectifica-
tion or erasure of personal data or restriction of the processing
of personal data concerning the Data Subject or to object to the
processing;
●● the right to lodge a complaint to a supervisory authority;
●● where the personal data are not collected from the Data Subject, any
available information as to their source;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 15(1)).
Where personal data are transferred to a third country or to an interna-
tional organisation, the Data Subject shall have the right to be informed
of the appropriate safeguards pursuant to Article 46 relating to the trans-
fer (Article 15(2)).
The Controller shall provide a copy of the personal data undergoing
processing. For any further copies requested by the Data Subject, the
Controller may charge a reasonable fee based on administrative costs.
Where the Data Subject makes the request by electronic means, and
unless otherwise requested by the Data Subject, the information shall be
provided in a commonly used electronic form (Article 15(3)).
The right to obtain a copy referred to in Article 15(3) shall not
adversely affect the rights and freedoms of others (Article 15(4)).
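Where an access request is made by electronic means, Article 15(3) points to a ‘commonly used electronic form’. A minimal sketch of such a response might serialise the copy of the personal data, together with some of the Article 15(1) information items, as JSON; the record structure below is invented for illustration and is not a prescribed format.

```python
import json

# Illustrative only: a minimal subject access response covering the copy of
# the data (Article 15(3)) and some of the Article 15(1) information items.
def access_response(personal_data: dict, purposes: list, recipients: list) -> str:
    response = {
        "personal_data": personal_data,
        "purposes_of_processing": purposes,
        "recipients_or_categories": recipients,
    }
    return json.dumps(response, indent=2, ensure_ascii=False)

print(access_response({"name": "A. Example"},
                      ["order fulfilment"],
                      ["payment processor"]))
```

In practice the response would need to cover every applicable Article 15(1) item, and redaction may be needed so that the copy does not adversely affect the rights and freedoms of others (Article 15(4)).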
Prior to the new GDPR, the ICO issued recommendations in relation
to data access issues, namely:
●● Access to Information Held in Complaint Files;
●● Enforced Subject Access (section 56);

●● How to Disclose Information Safely – Removing Personal Data from
Information Requests and Datasets;
●● Regulatory Activity Exemption;
●● Subject Access: Code of Practice;
●● Subject Access: Responding to a Request Checklist.
These now need to be read in light of the GDPR changes.
Right to be Informed of Third Country Safeguards
26.57 Where personal data are transferred to a third country or to an
international organisation, the Data Subject shall have the right to be
informed of the appropriate safeguards pursuant to Article 46 relating
to the transfer (Article 15(2)). This has added relevance as part of the
potential ripples from the Schrems Court of Justice decision.
Rectification and Erasure
26.58 Chapter III, Section 3 of the GDPR refers to rectification and
erasure. This was already an important and topical issue and is now even
more important on foot of the Google Spain case (see below) and also
issues such as online abuse.
Right to Rectification
26.59 The Data Subject shall have the right to obtain from the
Controller without undue delay the rectification of inaccurate personal
data concerning them. Taking into account the purposes of the process-
ing, the Data Subject shall have the right to have incomplete personal
data completed, including by means of providing a supplementary state-
ment (Article 16).87
Right to Erasure (Right to be Forgotten) (RtbF)
26.60 The Data Subject shall have the right to obtain from the
Controller the erasure of personal data concerning them without undue
delay where one of the following grounds applies:
●● the personal data are no longer necessary in relation to the purposes
for which they were collected or otherwise processed;
●● the Data Subject withdraws consent on which the processing is based
according to Article 6(1)(a), or Article 9(2)(a), and where there is no
other legal ground for the processing;
●● the Data Subject objects to the processing of personal data pursu-
ant to Article 21(1) and there are no overriding legitimate grounds

87 See P Lambert, The Right to be Forgotten (Bloomsbury, 2019).

for the processing, or the Data Subject objects to the processing of
personal data pursuant to Article 21(2);
●● the personal data have been unlawfully processed;
●● the personal data have to be erased for compliance with a legal obli-
gation in EU or state law to which the Controller is subject;
●● the data have been collected in relation to the offer of information
society services referred to in Article 8(1) (Article 17(1)).
Where the Controller has made the personal data public and is obliged
pursuant to Article 17(1) to erase the personal data, the Controller, tak-
ing account of available technology and the cost of implementation,
shall take reasonable steps, including technical measures, to inform
Controllers which are processing the personal data that the Data Subject
has requested the erasure by such Controllers of any links to, or copy or
replication of those personal data (Article 17(2)).
Article 17(1) and (2) shall not apply to the extent that processing is
necessary:
●● for exercising the right of freedom of expression and information;
●● for compliance with a legal obligation which requires processing of
personal data by EU or Member state law to which the Controller is
subject or for the performance of a task carried out in the public inter-
est or in the exercise of official authority vested in the Controller;
●● for reasons of public interest in the area of public health in accord-
ance with Article 9(2)(h) and (i) as well as Article 9(3);
●● for archiving purposes in the public interest, or scientific and his-
torical research purposes or statistical purposes in accordance with
Article 89(1) in so far as the right referred to in Article 17(1) is
likely to render impossible or seriously impair the achievement of
the objectives of that processing; or
●● for the establishment, exercise or defence of legal claims
(Article 17(3)).88
In the Google Spain case, the Court of Justice held that:
‘Article 2(b) and (d) of [the DPD] … are to be interpreted as meaning that,
first, the activity of a search engine consisting in finding information pub-
lished or placed on the internet by third parties, indexing it automatically,
storing it temporarily and, finally, making it available to internet users accord-
ing to a particular order of preference must be classified as “processing of

88 In relation to the original draft, etc, see for example, G Sartor, ‘The Right to be Forgot-
ten in the Draft Data Protection Regulation,’ International Data Privacy Law (2015)
(5;1) 64. Also A Mantelero, ‘The EU Proposal for a general data protection regulation
and the roots of the “right to be forgotten,”’ Computer Law and Security Report (2013)
(29:3) 229.

personal data” within the meaning of Article 2(b) when that information con-
tains personal data and, second, the operator of the search engine must be
regarded as the “controller” in respect of that processing, within the meaning
of Article 2(d).

Article 4(1)(a) of [DPD 1995] is to be interpreted as meaning that processing
of personal data is carried out in the context of the activities of an establish-
ment of the controller on the territory of a Member State, within the meaning
of that provision, when the operator of a search engine sets up in a Member
State a branch or subsidiary which is intended to promote and sell advertis-
ing space offered by that engine and which orientates its activity towards the
inhabitants of that Member State.’89

The WP29 has also addressed RtbF issues, including the Google Spain case.
This includes WP29 ‘Guidelines on the implementation of the Court of
Justice of the European Union judgment on “Google Spain and Inc v
Agencia Española de Protección de Datos (AEPD) and Mario Costeja
González.”’
There have also been recent cases, including a case against the French
data regulator (regarding special categories of data)91; a case taken by
the French data regulator against Google (regarding jurisdictional take-
downs);92 and the first UK High Court RtbF cases, NT1 and NT2.93
Note also the EDPB guidance:
●● Guidelines 5/2019 on the criteria of the Right to be Forgotten in the
search engines cases under the GDPR (part 1).

89 Google Spain SL and Google Inc v Agencia Española de Protección de Datos (AEPD)
and Mario Costeja González, Case C-131/12, 13 May 2014. Also see, for example,
P Lambert, The Right to be Forgotten (Bloomsbury, 2019); P Lambert, International
Handbook of Social Media Laws (Bloomsbury 2014); P Lambert, Social Networking:
Law, Rights and Policy (Clarus Press 2014); V Mayer-Schönberger, Delete: the Virtue
of Forgetting in the Digital Age (Princeton, 2009).
90 WP29 Guidelines on the implementation of the Court of Justice of the European
Union judgment on ‘Google Spain and Inc v Agencia Española de Protección de
Datos (AEPD) and Mario Costeja González’ C-131/12; Opinion 8/2010 on
applicable law (WP29 adds as follows: ‘In its judgment in Google Spain the Court of Justice
of the European Union decided upon certain matters relating to the territorial scope of
Directive 95/46/EC. The WP29 commenced an internal analysis of the potential impli-
cations of this judgment on applicable law and may provide further guidance on this
issue during the course of 2015, including, possibly, additional examples.’).
91 GC and Others v Commission Nationale de l’Informatique et des Libertés (CNIL)
(Déréférencement de données sensibles), CJEU [2019] Case C-136/17.
92 Google LLC, successor in law to Google Inc v Commission Nationale de l’Informatique
et des Libertés (CNIL), CJEU [2019] Case C-507/17.
93 NT1 and NT2 v Google LLC [2018] EWHC 799 (QB); [2018] EMLR 18; [2018]
HRLR 13.

Right to Restriction of Processing
26.61 Article 18 refers to the right to restriction of processing. The
Data Subject shall have the right to obtain from the Controller the restric-
tion of the processing where one of the following applies:
●● the accuracy of the personal data is contested by the Data Subject,
for a period enabling the Controller to verify the accuracy of the
personal data;
●● the processing is unlawful and the Data Subject opposes the erasure
of the personal data and requests the restriction of their use instead;
●● the Controller no longer needs the personal data for the purposes
of the processing, but they are required by the Data Subject for the
establishment, exercise or defence of legal claims;
●● the Data Subject has objected to processing pursuant to Article 21(1)
pending the verification whether the legitimate grounds of the
Controller override those of the Data Subject (Article 18(1)).
Where processing of personal data has been restricted under Article 18(1),
such personal data shall, with the exception of storage, only be processed
with the Data Subject’s consent or for the establishment, exercise or
defence of legal claims or for the protection of the rights of another natu-
ral or legal person or for reasons of important public interest of the EU
or of a state (Article 18(2)).
A Data Subject who obtained the restriction of processing pursuant to
Article 18(1) shall be informed by the Controller before the restriction of
processing is lifted (Article 18(3)).
Notifications re Rectification, Erasure or Restriction
26.62 The Controller shall communicate any rectification or erasure
of personal data or restriction of processing carried out in accordance
with Articles 16, 17(1) and 18 to each recipient to whom the personal
data have been disclosed, unless this proves impossible or involves dis-
proportionate effort. The Controller shall inform the Data Subject about
those recipients if the Data Subject requests this (Article 19).
Right to Data Portability
26.63 The Data Subject shall have the right to receive the personal
data concerning him or her, which he or she has provided to a Controller,
in a structured, commonly used, machine-readable format and have the
right to transmit those data to another Controller without hindrance from
the Controller to which the personal data have been provided, where:
●● the processing is based on consent pursuant to Article 6(1)(a) or
Article 9(2)(a) or on a contract pursuant to Article 6(1)(b); and
●● the processing is carried out by automated means (Article 20(1)).

In exercising his or her right to data portability pursuant to Article 20(1),
the Data Subject has the right to have the data transmitted directly from
Controller to Controller where technically feasible (Article 20(2)).
The exercise of this right shall be without prejudice to Article 17. The
right shall not apply to processing necessary for the performance of a
task carried out in the public interest or in the exercise of official author-
ity vested in the Controller (Article 20(3)).
The right shall not adversely affect the rights and freedoms of others
(Article 20(4)).
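Article 20(1)’s ‘structured, commonly used, machine-readable format’ is commonly satisfied in practice with formats such as JSON or CSV. The export helpers below are an illustrative sketch under that assumption, not a prescribed implementation.

```python
import csv
import io
import json

# Illustrative sketch: exporting the data a Data Subject has provided in two
# commonly used, machine-readable formats for an Article 20 request.
def export_json(records: list) -> str:
    return json.dumps(records, ensure_ascii=False)

def export_csv(records: list) -> str:
    buffer = io.StringIO()
    writer = csv.DictWriter(buffer, fieldnames=sorted(records[0]))
    writer.writeheader()
    writer.writerows(records)
    return buffer.getvalue()

records = [{"email": "ds@example.com", "newsletter": "yes"}]
print(export_json(records))
print(export_csv(records))
```

A direct Controller-to-Controller transmission under Article 20(2) would layer a transport mechanism (for example, an authenticated API) on top of such an export, where technically feasible.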
Right Against Automated Individual Decision Making
26.64 Chapter III, Section 4 of the GDPR refers to the right to object
and automated individual decision making.
Right to Object
26.65 The Data Subject shall have the right to object, on grounds relat-
ing to his or her particular situation, at any time to the processing of
personal data concerning him or her which is based on Article 6(1)(e)
or (f), including profiling based on these provisions. The Controller shall
no longer process the personal data unless the Controller demonstrates
compelling legitimate grounds for the processing which override the
interests, rights and freedoms of the Data Subject or for the establish-
ment, exercise or defence of legal claims (Article 21(1)).
Where personal data are processed for direct marketing purposes, the
Data Subject shall have the right to object at any time to the process-
ing of personal data concerning him or her for such marketing, which
includes profiling to the extent that it is related to such direct marketing
(Article 21(2)).
Where the Data Subject objects to the processing for direct market-
ing purposes, the personal data shall no longer be processed for such
purposes (Article 21(3)).
At the latest at the time of the first communication with the Data
Subject, the right referred to in Article 21(1) and (2) shall be explicitly
brought to the attention of the Data Subject and shall be presented clearly
and separately from any other information (Article 21(4)).
In the context of the use of information society services, and notwith-
standing Directive 2002/58/EC,94 the Data Subject may exercise his or
her right to object by automated means using technical specifications
(Article 21(5)).

94 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).

Where personal data are processed for scientific or historical research
purposes or statistical purposes pursuant to Article 89(1), the Data
Subject, on grounds relating to his or her particular situation, shall have
the right to object to processing of personal data concerning him or her,
unless the processing is necessary for the performance of a task carried
out for reasons of public interest (Article 21(6)).
Rights re Automated Individual Decision Making, Including Profiling
26.66 There is widespread concern as to the data protection impact of
increased profiling techniques.95
The new GDPR provides that the Data Subject shall have the right not
to be subject to a decision based solely on automated processing, includ-
ing profiling,96 which produces legal effects concerning him or her or
similarly significantly affects him or her (Article 22(1)).
Article 22(1) shall not apply if the decision:
●● is necessary for entering into, or performance of, a contract between
the Data Subject and a Controller (point (a));
●● is authorised by EU or state law to which the Controller is subject
and which also lays down suitable measures to safeguard the Data
Subject’s rights and freedoms and legitimate interests; or
●● is based on the Data Subject’s explicit consent (point (c))
(Article 22(2)).
In cases referred to in Article 22(2)(a) and (c) the Controller shall
implement suitable measures to safeguard the Data Subject’s rights and
freedoms and legitimate interests, at least the right to obtain human inter-
vention on the part of the Controller, to express his or her point of view
and to contest the decision (Article 22(3)).
Decisions referred to in Article 22(2) shall not be based on special
categories of personal data referred to in Article 9(1), unless Article 9(2)
(a) or (g) applies and suitable measures to safeguard the Data Subject’s
rights and freedoms and legitimate interests are in place (Article 22(4)).
Data Protection by Design and by Default
26.67 Article 25 refers to data protection by design and by default.
Note also the related concept of Privacy by Design (PbD). In some ways
PbD is the impetus or precursor for the current DPbD rules.

95 This is in the EU as well as the US (and elsewhere). From a US perspective, see
N Roethlisberger, ‘Someone is Watching: The Need for Enhanced Data Protection,’
Hastings Law Journal (2011) (62:6) 1793.
96 See, for example, M Hildebrandt, ‘Who is Profiling Who? Invisible Visibility,’ in
S Gutwirth, Y Poullet, P de Hert, C de Terwange and S Nouwt, Reinventing Data
Protection? (Springer, 2009) 239.


Taking into account the state of the art and the nature, scope, context
and purposes of the processing as well as the risks of varying likelihood
and severity for rights and freedoms of natural persons posed by the pro-
cessing, the Controller shall, both at the time of the determination of the
means for processing and at the time of the processing itself, implement
appropriate technical and organisational measures, such as pseudonymi-
sation, which are designed to implement data protection Principles, such
as data minimisation, in an effective way and to integrate the necessary
safeguards into the processing in order to meet the requirements of the
GDPR and protect the rights of Data Subjects (Article 25(1)).
The Controller shall implement appropriate technical and organisa-
tional measures for ensuring that, by default, only personal data which
are necessary for each specific purpose of the processing are processed.
That obligation applies to the amount of data collected, the extent of
their processing, the period of their storage and their accessibility. In
particular, such measures shall ensure that by default personal data are
not made accessible without the individual’s intervention to an indefinite
number of natural persons (Article 25(2)).
An approved certification mechanism pursuant to Article 42 may be
used as an element to demonstrate compliance with the requirements set
out in Article 25(1) and (2) (Article 25(3)).
Note also the EDPB guidance:
●● Guidelines 4/2019 on Article 25 Data Protection by Design and by
Default.
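Read in engineering terms, the default-minimisation duty in Article 25(2) suggests a collection layer that retains, by default, only the fields necessary for the specific purpose. The purpose-to-field mapping below is invented for this sketch; in practice it would reflect the Controller’s own documented purposes and necessity analysis.

```python
# Hypothetical mapping of processing purposes to the fields each one needs;
# both the purposes and the field names are invented for illustration.
NECESSARY_FIELDS = {
    "delivery": {"name", "address"},
    "newsletter": {"email"},
}

def collect_by_default(submitted: dict, purpose: str) -> dict:
    # Keep only the fields necessary for the specific purpose, discarding
    # everything else at the point of collection (Article 25(2)).
    allowed = NECESSARY_FIELDS[purpose]
    return {k: v for k, v in submitted.items() if k in allowed}

form = {"name": "A. Example", "address": "1 Main St",
        "email": "a@example.com", "date_of_birth": "1980-01-01"}
print(collect_by_default(form, "delivery"))
```

The same default-only principle applies beyond collection, to the extent of processing, the storage period and accessibility of the data.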
Security Rights
26.68 See above (paras 26.36 and 26.37).
Data Protection Impact Assessment and Prior Consultation
Data Protection Impact Assessment
26.69 Chapter IV, Section 3 of the GDPR refers to Impact Assessments
and Prior Consultations.
Where a type of processing in particular using new technologies,
and taking into account the nature, scope, context and purposes of the
processing, is likely to result in a high risk for the rights and freedoms
of natural persons, the Controller shall, prior to the processing, carry
out an assessment of the impact of the envisaged processing operations
on the protection of personal data. A single assessment may address
a set of similar processing operations that present similar high risks
(Article 35(1)).

The Controller shall seek the advice of the DPO, where designated,
when carrying out a data protection impact assessment (Article 35(2)).
A data protection impact assessment referred to in Article 35(1) shall
in particular be required in the case of:
●● a systematic and extensive evaluation of personal aspects relating
to natural persons which is based on automated processing, includ-
ing profiling, and on which decisions are based that produce legal
effects concerning the natural person or similarly significantly affect
the natural person;
●● processing on a large scale of special categories of data referred to in
Article 9(1), or of personal data relating to criminal convictions and
offences referred to in Article 10; or
●● a systematic monitoring of a publicly accessible area on a large scale
(Article 35(3)).
The supervisory authority shall establish and make public a list of the
kind of processing operations which are subject to the requirement for a
data protection impact assessment pursuant to Article 35(1). The supervi-
sory authority shall communicate those lists to the EDPB (Article 35(4)).
The supervisory authority may also establish and make public a list
of the kind of processing operations for which no data protection impact
assessment is required. The supervisory authority shall communicate
those lists to the EDPB (Article 35(5)).
Prior to the adoption of the lists referred to in Article 35(4) and (5) the
competent supervisory authority shall apply the consistency mechanism
referred to in Article 63 where such lists involve processing activities
which are related to the offering of goods or services to Data Subjects or
to the monitoring of their behaviour in several states, or may substantially
affect the free movement of personal data within the EU (Article 35(6)).
The assessment shall contain at least:
●● a systematic description of the envisaged processing operations and
the purposes of the processing, including where applicable the legit-
imate interest pursued by the Controller;
●● an assessment of the necessity and proportionality of the processing
operations in relation to the purposes;
●● an assessment of the risks to the rights and freedoms of Data Subjects
referred to in Article 35(1);
●● the measures envisaged to address the risks, including safeguards,
security measures and mechanisms to ensure the protection of per-
sonal data and to demonstrate compliance with the GDPR taking
into account the rights and legitimate interests of Data Subjects and
other persons concerned (Article 35(7)).

Compliance with approved codes of conduct referred to in Article 40 by
the relevant Controllers or Processors shall be taken into due account
in assessing the impact of the processing operations performed by such
Controllers or Processors, in particular for the purposes of a data protec-
tion impact assessment (Article 35(8)).
Where appropriate, the Controller shall seek the views of Data Subjects
or their representatives on the intended processing, without prejudice to
the protection of commercial or public interests or the security of the
processing operations (Article 35(9)).
Where the processing pursuant to Article 6(1)(c) or (e) has a legal
basis in EU law or the law of the state to which the Controller is sub-
ject, that law regulates the specific processing operation or set of opera-
tions in question, and a data protection impact assessment has already
been carried out as part of a general impact assessment in the context of
the adoption of this legal basis, Article 35(1)–(7) shall not apply, unless
states deem it necessary to carry out such assessment prior to the pro-
cessing activities (Article 35(10)).
Where necessary, the Controller shall carry out a review to assess
if the processing is performed in compliance with the data protection
impact assessment at least when there is a change of the risk represented
by the processing operations (Article 35(11)).
Prior Consultation
26.70 The Controller shall consult the supervisory authority prior to
the processing where a data protection impact assessment as provided
for in Article 35 indicates that the processing would result in a high risk
in the absence of measures taken by the Controller to mitigate the risk
(Article 36(1)).
Where the supervisory authority is of the opinion that the intended
processing referred to in Article 36(1) would infringe the GDPR, in
particular where the Controller has insufficiently identified or miti-
gated the risk, it shall within a period of up to eight weeks of receipt
of the request for consultation, provide written advice to the Controller
and, where applicable the Processor, and may use any of its powers
referred to in Article 58. That period may be extended by six weeks,
taking into account the complexity of the processing. The supervi-
sory authority shall inform the Controller, and where applicable the
Processor of any such extension within one month of receipt of the
request for consultation together with the reasons for the delay. Those
periods may be suspended until the supervisory authority has obtained
any information it has requested for the purposes of the consultation
(Article 36(2)).

When consulting the supervisory authority pursuant to Article 36(1),
the Controller shall provide the supervisory authority with:
●● where applicable, the respective responsibilities of Controller, joint
Controllers and Processors involved in the processing, in particular
for processing within a group of undertakings;
●● the purposes and means of the intended processing;
●● the measures and safeguards provided to protect the rights and free-
doms of Data Subjects pursuant to the GDPR;
●● where applicable, the contact details of the DPO;
●● the data protection impact assessment provided for in Article 35; and
●● any other information requested by the supervisory authority
(Article 36(3)).
States shall consult the supervisory authority during the preparation of a
proposal for a legislative measure to be adopted by a national parliament
or of a regulatory measure based on such a legislative measure, which
relates to the processing (Article 36(4)).
Notwithstanding Article 36(1), states’ law may require Controllers to
consult with, and obtain prior authorisation from, the supervisory author-
ity in relation to the processing by a Controller for the performance of
a task carried out by the Controller in the public interest, including pro-
cessing in relation to social protection and public health (Article 36(5)).

Communicating Data Breach to Data Subject


26.71 Individual data subjects are to be notified in the event of a data
breach. Further details are set out above in para 26.39.

Data Protection Officer


26.72 Chapter IV, Section 4 of the new GDPR refers to Data Protection Officers
(DPOs) and the obligation for organisations to appoint DPOs. See
Chapter 31 for particular details.
Remedies, Liability and Sanctions
26.73 Chapter VIII refers to remedies, liability issues and sanctions
regarding data protection.
The Recitals refer to proceedings against Controllers, Processors and
jurisdiction; damages and compensation; the prevention, investigation,
detection or prosecution of criminal offences or the execution of crimi-
nal penalties, including public security.


Right to Lodge Complaint with Supervisory Authority


26.74 Without prejudice to any other administrative or judicial rem-
edy, every Data Subject shall have the right to lodge a complaint with
a supervisory authority, in particular in the state of his or her habitual
residence, place of work or place of the alleged infringement if the Data
Subject considers that the processing of personal data relating to him or
her infringes the GDPR (Article 77(1)).
The supervisory authority with which the complaint has been lodged
shall inform the complainant on the progress and the outcome of the
complaint including the possibility of a judicial remedy pursuant to
Article 78 (Article 77(2)).
Right to Judicial Remedy Against Supervisory Authority
26.75 Without prejudice to any other administrative or non-judicial
remedy, each natural or legal person shall have the right to an effective judicial remedy against a legally binding decision of a supervisory
authority concerning them (Article 78(1)).
Without prejudice to any other administrative or non-judicial remedy,
each Data Subject shall have the right to an effective judicial remedy
where the supervisory authority which is competent in accordance with
Article 55 and Article 56 does not handle a complaint or does not inform
the Data Subject within three months on the progress or outcome of the
complaint lodged under Article 77 (Article 78(2)).
Proceedings against a supervisory authority shall be brought before
the courts of the state where the supervisory authority is established
(Article 78(3)).
Where proceedings are brought against a decision of a supervisory
authority which was preceded by an opinion or a decision of the EDPB
in the consistency mechanism, the supervisory authority shall forward
that opinion or decision to the court (Article 78(4)).
Right to Effective Judicial Remedy Against Controller
or Processor
26.76 Without prejudice to any available administrative or non-judicial remedy, including the right to lodge a complaint with a supervisory authority pursuant to Article 77, each Data Subject shall have the
right to an effective judicial remedy where they consider that their rights
under the GDPR have been infringed as a result of the processing of their
personal data in non-compliance with the GDPR (Article 79(1)).
Proceedings against a Controller or a Processor shall be brought before
the courts of the state where the Controller or Processor has an establish-
ment. Alternatively, such proceedings may be brought before the courts
of the state where the Data Subject has his or her habitual residence, unless the Controller or Processor is a public authority of a state acting
in the exercise of its public powers (Article 79(2)).
Representation of Data Subjects
26.77 The Data Subject shall have the right to mandate a not-for-profit
body, organisation or association, which has been properly constituted
according to the law of a state, has statutory objectives which are in the
public interest and is active in the field of the protection of Data Subjects’
rights and freedoms with regard to the protection of their personal
data to lodge the complaint on his or her behalf, to exercise the rights
referred to in Articles 77, 78 and 79 on his or her behalf and to exercise
the right to receive compensation referred to in Article 82 on his or her
behalf where provided for by state law (Article 80(1)).
States may provide that any body, organisation or association referred
to in Article 80(1), independently of a Data Subject’s mandate, has the
right to lodge, in that state, a complaint with the supervisory author-
ity competent in accordance with Article 77 and to exercise the rights
referred to in Articles 78 and 79 if it considers that the rights of a Data
Subject under the GDPR have been infringed as a result of the process-
ing (Article 80(2)).
Right to Compensation and Liability
26.78 Any person who has suffered material or non-material damage
as a result of an infringement of the GDPR shall have the right to receive
compensation from the Controller or Processor for the damage suffered
(Article 82(1)).
Any Controller involved in the processing shall be liable for the dam-
age caused by the processing which infringes the GDPR. A Processor
shall be liable for the damage caused by the processing only where it
has not complied with obligations of the GDPR specifically directed to
Processors or where it has acted outside or contrary to lawful instruc-
tions of the Controller (Article 82(2)).
A Controller or Processor shall be exempt from liability under
Article 82(2) if it proves that it is not in any way responsible for the
event giving rise to the damage (Article 82(3)).
Where more than one Controller or Processor or both a Controller
and a Processor are involved in the same processing and, where they
are, in accordance with Article 82(2) and (3), responsible for any dam-
age caused by the processing, each Controller or Processor shall be held
liable for the entire damage, in order to ensure effective compensation of
the Data Subject (Article 82(4)).
Where a Controller or Processor has, in accordance with Article 82(4),
paid full compensation for the damage suffered, that Controller or Processor shall be entitled to claim back from the other Controllers or
Processors involved in the same processing that part of the compensation
corresponding to their part of responsibility for the damage in accord-
ance with the conditions set out in Article 82(2) (Article 82(5)).
Court proceedings for exercising the right to receive compensation
shall be brought before the courts competent under the law of the state
referred to in Article 79(2) (Article 82(6)).
Codes of Conduct and Certification
26.79 Chapter IV, Section 5 of the GDPR refers to Codes of Conduct
and Certification.
Codes of Conduct
26.80 The states, the supervisory authorities, the EDPB and the
Commission shall encourage the drawing up of codes of conduct intended
to contribute to the proper application of the GDPR, taking account of
the specific features of the various data processing sectors and the spe-
cific needs of micro, small and medium-sized enterprises (Article 40(1)).
Associations and other bodies representing categories of Controllers
or Processors may prepare codes of conduct, or amend or extend such
codes, for the purpose of specifying the application of the GDPR,
such as:
●● fair and transparent data processing;
●● the legitimate interests pursued by Controllers in specific contexts;
●● the collection of data;
●● the pseudonymisation of personal data;
●● the information of the public and of Data Subjects;
●● the exercise of the rights of Data Subjects;
●● the information provided to, and the protection of children and the
manner in which the consent of the holders of parental responsibility
over children is to be obtained;
●● the measures and procedures referred to in Article 24 and measures
to ensure security of processing referred to in Article 32;
●● the notification of personal data breaches to supervisory authori-
ties and the communication of such personal data breaches to Data
Subjects;
●● transfer of personal data to third countries or international organisa-
tions; or
●● out-of-court proceedings and other dispute resolution procedures
for resolving disputes between Controllers and Data Subjects with
regard to the processing, without prejudice to the rights of the Data
Subjects pursuant to Articles 77 and 79 (Article 40(2)).

In addition to adherence by Controllers or Processors subject to the
Regulation, codes of conduct approved pursuant to Article 40(5) and
having general validity pursuant to Article 40(9) may also be adhered
to by Controllers or Processors that are not subject to this Regulation
pursuant to Article 3 in order to provide appropriate safeguards within
the framework of personal data transfers to third countries or interna-
tional organisations under the terms referred to in Article 46(2)(e). Such
Controllers or Processors shall make binding and enforceable commit-
ments, via contractual or other legally binding instruments, to apply
those appropriate safeguards including with regard to the rights of Data
Subjects (Article 40(3)).
Such a code of conduct shall contain mechanisms which enable the
body referred to in Article 41(1) to carry out the mandatory monitor-
ing of compliance with its provisions by the Controllers or Processors
which undertake to apply it, without prejudice to the tasks and pow-
ers of the supervisory authority competent pursuant to Article 55 or 56
(Article 40(4)).
Associations and other bodies referred to in Article 40(2) which intend
to prepare a code of conduct or to amend or extend an existing code,
shall submit the draft code, amendment or extension to the supervisory
authority which is competent pursuant to Article 55. The supervisory
authority shall provide an opinion on whether the draft code, or, amend-
ment or extension complies with the GDPR and shall approve such draft
code, amendment or extension if it finds that it provides sufficient appro-
priate safeguards (Article 40(5)).
Where the draft code, or amendment or extension is approved, and
where the code of conduct concerned does not relate to processing activ-
ities in several states, the supervisory authority shall register and publish
the code (Article 40(6)).
Where a draft code of conduct relates to processing activities in sev-
eral states, the supervisory authority which is competent shall, before
approving the draft code, amendment or extension, submit it to the
EDPB which shall provide an opinion on whether the draft code, amend-
ment or extension complies with the GDPR or, in the situation referred to
in Article 40(3), provides appropriate safeguards (Article 40(7)).
Where the opinion referred to in Article 40(7) confirms that the draft
code, amendment or extension complies with the GDPR, or, in the situ-
ation referred to in Article 40(3), provides appropriate safeguards, the
EDPB shall submit its opinion to the Commission (Article 40(8)).
The Commission may by way of implementing acts decide that the
approved codes of conduct, amendments or extensions submitted to it
pursuant to Article 40(3) have general validity within the EU. Those
implementing acts shall be adopted in accordance with the examination
procedure set out in Article 93(2) (Article 40(9)).

The Commission shall ensure appropriate publicity for the approved
codes which have been decided as having general validity in accordance
with Article 40(9) (Article 40(10)).
The EDPB shall collate all approved codes of conduct, amendments
and extensions in a register and shall make them publicly available by
way of appropriate means (Article 40(11)).
Certification
26.81 The states, the supervisory authorities, the EDPB and the
Commission shall encourage, in particular at EU level, the establish-
ment of data protection certification mechanisms and of data protection
seals and marks, for the purpose of demonstrating compliance with the
GDPR of processing operations by Controllers and Processors. The spe-
cific needs of micro, small and medium-sized enterprises shall be taken
into account (Article 42(1)).
In addition to adherence by Controllers or Processors subject to the
GDPR, data protection certification mechanisms, seals or marks approved
pursuant to Article 42(5) may be established for the purpose of demon-
strating the existence of appropriate safeguards provided by Controllers
or Processors that are not subject to the GDPR according to Article 3
within the framework of personal data transfers to third countries or
international organisations under the terms referred to in Article 46(2)(f). Such Controllers or Processors shall make binding and enforceable
commitments, via contractual or other legally binding instruments, to
apply those appropriate safeguards, including with regard to the rights of
Data Subjects (Article 42(2)).
The certification shall be voluntary and available via a process that is
transparent (Article 42(3)).
A certification pursuant to this Article does not reduce the responsibil-
ity of the Controller or the Processor for compliance with the GDPR and
is without prejudice to the tasks and powers of the supervisory authori-
ties which are competent pursuant to Article 55 or 56 (Article 42(4)).
A certification pursuant to this Article shall be issued by the certifica-
tion bodies referred to in Article 43, or by the competent supervisory
authority on the basis of the criteria approved by that competent super-
visory authority pursuant to Article 58(3), or by the EDPB. Where the
criteria are approved by the EDPB, this may result in a common certification,
the European Data Protection Seal (Article 42(5)).
The Controller or Processor which submits its processing to the certification mechanism shall provide the certification body or, where applicable, the competent supervisory authority, with all information and access to its processing
activities which are necessary to conduct the certification procedure
(Article 42(6)).

The certification shall be issued to a Controller or Processor for a
maximum period of three years and may be renewed under the same
conditions, provided that the relevant requirements continue to be met.
Certification shall be withdrawn, as applicable, by the certification bod-
ies referred to in Article 43, or by the competent supervisory authority
where the requirements for the certification are not or no longer met
(Article 42(7)).
The EDPB shall collate all certification mechanisms and data protec-
tion seals and marks in a register and shall make them publicly available
by any appropriate means (Article 42(8)).
Specific Data Processing Situations
26.82 Chapter IX of the GDPR refers to provisions regarding specific
data processing situations.
Processing and Freedom of Expression and Information
26.83 States shall by law reconcile the right to the protection of per-
sonal data pursuant to the GDPR with the right to freedom of expression
and information, including the processing of personal data for journalis-
tic purposes and the purposes of academic, artistic or literary expression
(Article 85(1)).
For processing carried out for journalistic purposes or the purpose of
academic, artistic or literary expression, states shall provide for exemptions or derogations from the provisions in Chapter II (Principles),
Chapter III (Rights of the Data Subject), Chapter IV (Controller and
Processor), Chapter V (transfer of personal data to third countries or
international organisations), Chapter VI (independent supervisory
authorities), Chapter VII (cooperation and consistency) and Chapter IX
(specific data processing situations) if they are necessary to reconcile the
right to the protection of personal data with the freedom of expression
and information (Article 85(2)).
Each state shall notify to the Commission those provisions of its law
which it has adopted pursuant to Article 85(2) and, without delay, any
subsequent amendment law or amendment affecting them (Article 85(3)).
Processing in Employment Context
26.84 Issues related to employee personal data are continuing to grow
in importance.97

97 See, for example, Chapters 16–19 and Article 82 of the GDPR.

The new GDPR provides that states may, by law or by collective agreements, provide for more specific rules to ensure the protection of the rights and freedoms in respect of the processing of employees’ personal
data in the employment context, in particular for the purposes of the
recruitment, the performance of the contract of employment, including
discharge of obligations laid down by law or by collective agreements,
management, planning and organisation of work, equality and diversity
in the workplace, health and safety at work, protection of employer’s
or customer’s property and for the purposes of the exercise and enjoy-
ment, on an individual or collective basis, of rights and benefits related to
employment, and for the purpose of the termination of the employment
relationship (Article 88(1)).
These rules shall include suitable and specific measures to safeguard
the Data Subject’s human dignity, legitimate interests and fundamen-
tal rights, with particular regard to the transparency of processing, the
transfer of data within a group of undertakings or group of enterprises
engaged in a joint economic activity and monitoring systems at the work
place (Article 88(2)).
Each state shall notify to the Commission those provisions of its law
which it adopts pursuant to Article 88(1), by 25 May 2018 and, without
delay, any subsequent amendment affecting them (Article 88(3)).
Prior to the new GDPR, the ICO issued recommendations in relation to personal data employment issues, namely:
●● Employment Practices Code;
●● Employment Practices Code – A Quick Guide;
●● Employment Practices Code – Supplementary Guidance;
●● Disclosure of Employee Information under TUPE;
●● Getting it Right: A brief Guide to Data Protection for Small
Businesses;
●● Monitoring under Section 75 of the Northern Ireland Act 1998.
These now need to be read in light of the GDPR changes.
Safeguards and Derogations: Public Interest/Scientific/Historical Research/
Statistical Archiving Processing
26.85 There are also provisions in relation to safeguards and dero-
gations for processing personal data for archiving purposes in the
public interest, scientific or historical research purposes or statistical
purposes.
Processing for archiving purposes in the public interest, scientific or
historical research purposes or statistical purposes, shall be subject to
appropriate safeguards for the rights and freedoms of the Data Subject.
These safeguards shall ensure that technical and organisational measures
are in place in particular in order to ensure the respect of the principle
of data minimisation. These measures may include pseudonymisation,
provided that those purposes can be fulfilled in that manner. Where those
purposes can be fulfilled by further processing which does not permit or
no longer permits the identification of Data Subjects these purposes shall
be fulfilled in that manner (Article 89(1)).
Where personal data are processed for scientific or historical research
purposes or statistical purposes, EU or state law may provide for deroga-
tions from the rights referred to in Articles 15, 16, 18 and 21 subject to
the conditions and safeguards referred to in Article 89(1) in so far as such
rights are likely to render impossible or seriously impair the achievement
of the specific purposes, and such derogations are necessary for the ful-
filment of those purposes (Article 89(2)).
Where personal data are processed for archiving purposes in the pub-
lic interest, EU or state law may provide for derogations from the rights
referred to in Articles 15, 16, 18, 19, 20 and 21 subject to the conditions
and safeguards referred to in Article 89(1) in so far as such rights are
likely to render impossible or seriously impair the achievement of the
specific purposes, and such derogations are necessary for the fulfilment
of these purposes (Article 89(3)).
Where processing referred to in Article 89(2) and (3) serves at the
same time another purpose, the derogations shall apply only to process-
ing for the purposes referred to in those paragraphs (Article 89(4)).
Obligations of Secrecy
26.86 States may adopt specific rules to set out the powers by the
supervisory authorities laid down in Article 58(1)(e) and (f) in relation to
Controllers or Processors that are subject, under EU or state law or rules
established by national competent bodies to an obligation of professional
secrecy or other equivalent obligations of secrecy where this is neces-
sary and proportionate to reconcile the right of the protection of personal
data with the obligation of secrecy. Those rules shall only apply with
regard to personal data which the Controller or Processor has received as
a result of, or has obtained in, an activity covered by that obligation of
secrecy (Article 90(1)).
Each state shall notify to the Commission the rules adopted pursuant
to Article 90(1), by 25 May 2018 and without delay, any subsequent
amendment affecting them (Article 90(2)).
New Data Protection Officer Obligation
Introduction
26.87 Organisations now need to have a designated Data Protection Officer (DPO) to deal with data protection compliance obligations, Data Subject access requests, etc. That is not to say that board and management responsibility for data protection compliance is
in any way removed.
DPO
26.88 Chapter IV, Section 4 of the new GDPR refers to DPOs and the
obligation for organisations to appoint DPOs.
The Controller and the Processor shall designate a DPO in all cases
specified by the GDPR, in particular in Article 37. (For further details
see Chapter 31).
Position of DPO
26.89 The Controller or the Processor shall ensure that the DPO is
involved, properly and in a timely manner in all issues which relate to
the protection of personal data (Article 38(1)).
The Controller or Processor shall support the DPO in performing the
tasks referred to in Article 39 by providing resources necessary to carry
out those tasks as well as access to personal data and processing opera-
tions, and to maintain his or her expert knowledge (Article 38(2)).
The Controller or Processor shall ensure that the DPO does not receive
any instructions regarding the exercise of the tasks. He or she shall not be
dismissed or penalised by the Controller or the Processor for performing
the tasks. The DPO shall directly report to the highest management level
of the Controller or the Processor (Article 38(3)).
Data Subjects may contact the DPO with regard to all issues relating
to the processing of their personal data and to the exercise of their rights
under the GDPR (Article 38(4)).
The DPO shall be bound by secrecy or confidentiality concern-
ing the performance of their tasks, in accordance with EU or state law
(Article 38(5)).
The DPO may fulfil other tasks and duties. The Controller or Processor
shall ensure that any such tasks and duties do not result in a conflict of
interests (Article 38(6)).
Tasks of DPO
26.90 The DPO shall have at least the following tasks:
●● to inform and advise the Controller or the Processor and the employ-
ees who carry out processing of their obligations pursuant to the
GDPR and to other EU or state data protection provisions;
●● to monitor compliance with the GDPR, with other EU or state data
protection provisions and with the policies of the Controller or Processor in relation to the protection of personal data, including
the assignment of responsibilities, awareness-raising and training of
staff involved in processing operations, and the related audits;
●● to provide advice where requested as regards the data protec-
tion impact assessment and monitor its performance pursuant to
Article 39;
●● to cooperate with the supervisory authority;
●● to act as the contact point for the supervisory authority on issues
related to the processing, including the prior consultation referred
to in Article 36, and consult, where appropriate, on any other matter
(Article 39(1)).
The DPO shall in the performance of his or her tasks have due regard to
the risk associated with the processing operations, taking into account
the nature, scope, context and purposes of processing (Article 39(2)).
Importance
26.91 The new profession and new mandated requirement for a DPO
will prove to be one of the game changers in data protection practice
and protection for personal data. While the DPO will be promoting data
protection compliance, adherence to the new data protection regime,
pre-problem solving (such as data protection by design and by default)
and the understanding of data protection throughout the organisation,
the protection of the DPO’s independence will ensure that the tempta-
tion by some in an organisation to ignore or downgrade data protec-
tion in favour of (new) collections and processing, and maximum use
compared to proportionate and limited data use, can be more robustly
resisted. The independence of the DPO is also protected by ensuring that
they report directly to the highest level which will generally be a board
member. This counters junior managers or others seeking to adversely
influence, ignore or overrule the DPO in the proper performance of their
functions and role. See The Data Protection Officer, Profession, Rules
and Role.

Conclusion
26.92 All organisations need to become very familiar with the GDPR.
While in some instances the current compliance mechanisms are con-
tinued, there are many new requirements to compliance. Organisations
need to start now in terms of ensuring preparation and compliance.
Indeed, the most prudent organisations will continually be adopting best practice, and data protection compliance is an area where best practice
has positive benefits above and beyond mere compliance.
While the GDPR is now established, what is less certain is how the
envisaged UK-GDPR will look in final draft and how it will be
applied. Critical to assessing it – especially from an EU perspective –
will be any associated laws or changes, such as the cessation or diminu-
tion of the Human Rights Act 1998.

Part 6
Particular Issues

Chapter 27
Data Breach

Introduction
27.01 Data breach issues are one of the most important and conse-
quential areas of data protection (and security) for organisations to
pay attention to. Given the unfortunate increase in the number of data
breaches, it is inevitable that there will be increasing regulator attention.
The new EU General Data Protection Regulation (GDPR) also increases
data protection security obligations; and the consequence of significant
new fines and penalties where breaches occur.
The new GDPR defines ‘personal data breach’ as a ‘breach of secu-
rity leading to the accidental or unlawful destruction, loss, alteration,
unauthorised disclosure of, or access to, personal data transmitted, stored
or otherwise processed’. The importance attached to dealing with data
breaches and data breach incidents is highlighted in the GDPR. Now
data breaches must be notified to the supervisory authority and the Data
Subjects. (Bear in mind that employees can also be Data Subjects). Data
Subjects can suffer loss and damage if there has been a data breach, and
particularly so if they are unaware of it and are not notified when the
organisation becomes aware in order that, for example, remedial meas-
ures can be undertaken by the Data Subject. For example they may wish
to change passwords or cancel credit cards, depending on the nature of
the breach. Indeed, in some instances organisations may need to recom-
mend remedial or safety measures to Data Subjects after a data breach.

Data Breach Incidents in Context


27.02 The issue and frequency of data breaches and data breach inci-
dents are highlighted in Part 1. In addition, it is clear that national super-
visory authorities take data breaches very seriously. Significant fines are
now regularly levelled at organisations, including large organisations
and public organisations, in relation to data breaches. Additionally, even
smaller organisations have been fined.
The ICO, and other supervisory authorities, can carry out audits and
inspections of organisations, which can include security and data breach
preparedness, as well as the implications which result from a recent
data breach incident. In fact, many data breaches result in media public-
ity for the organisation, in which case the ICO is likely to contact the
organisation.

Notification of a Data Breach to Supervisory Authority


27.03 In the case of a personal data breach, the new GDPR requires
that the Controller shall without undue delay and, where feasible, not
later than 72 hours after having become aware of it, notify the personal
data breach to the supervisory authority competent in accordance with
Article 55, unless the personal data breach is unlikely to result in a risk
to the rights and freedoms of natural persons. Where the notification to
the supervisory authority is not made within 72 hours, it shall be accom-
panied by reasons for the delay (Article 33(1)).
The Processor shall notify the Controller without undue delay after
becoming aware of a personal data breach (Article 33(2)).
The notification referred to in Article 33(1) must at least:
●● describe the nature of the personal data breach including where pos-
sible, the categories and approximate number of Data Subjects con-
cerned and the categories and approximate number of data records
concerned;
●● communicate the name and contact details of the DPO or other con-
tact point where more information can be obtained;
●● describe the likely consequences of the personal data breach;
●● describe the measures taken or proposed to be taken by the Controller
to address the personal data breach, including, where appropriate, to
mitigate its possible adverse effects (Article 33(3)).
Where, and in so far as, it is not possible to provide the information at
the same time, the information may be provided in phases without undue
further delay (Article 33(4)).
The Controller shall document any personal data breaches, comprising
the facts relating to the personal data breach, its effects and the remedial
action taken. This documentation shall enable the supervisory authority
to verify compliance with this Article (Article 33(5)).
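As a purely illustrative sketch (not a compliance tool), the minimum notification contents and the 72-hour timing rule above might be tracked in a simple internal record; the field and method names here are this example's own assumptions, not GDPR terms:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class BreachNotification:
    """Minimal internal record of an Article 33 notification (illustrative only)."""
    aware_at: datetime            # when the Controller became aware of the breach
    notified_at: datetime         # when the supervisory authority was notified
    nature: str                   # nature of the breach (cf Article 33(3))
    data_subjects_affected: int   # approximate number of Data Subjects concerned
    records_affected: int         # approximate number of data records concerned
    dpo_contact: str              # DPO or other contact point
    likely_consequences: str      # likely consequences of the breach
    measures_taken: str           # measures taken or proposed, including mitigation
    delay_reasons: str = ""       # required where notification is later than 72 hours

    def within_72_hours(self) -> bool:
        return self.notified_at - self.aware_at <= timedelta(hours=72)

    def validate(self) -> list:
        """Flag obvious gaps, eg a late notification with no reasons given."""
        issues = []
        if not self.within_72_hours() and not self.delay_reasons:
            issues.append("late notification must be accompanied by reasons for the delay")
        return issues
```

A record of this kind would also support the Article 33(5) documentation duty, since it retains the facts, effects and remedial action for later verification by the supervisory authority.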


Note that data security breach issues can lead to regulator sanction; but
separately and in addition, a failure to notify when required can itself be
a breach of the data protection regime.

Communication of a Data Breach to Data Subject


27.04 When the personal data breach is likely to result in a high risk
to the rights and freedoms of natural persons the Controller shall com-
municate the personal data breach to the Data Subject without undue
delay (Article 34(1)).1
The communication to the Data Subject referred to in Article 34(1)
shall describe in clear and plain language the nature of the personal data
breach and contain at least the information and measures referred to in
Article 33(3)(b), (c) and (d) (Article 34(2)).
The communication to the Data Subject referred to in Article 34(1)
may not be required if:
●● the Controller has implemented appropriate technical and organisa-
tional protection measures, and that those measures were applied to
the personal data affected by the personal data breach, in particular
those that render the personal data unintelligible to any person who
is not authorised to access it, such as encryption;
●● the Controller has taken subsequent measures which ensure that the
high risk to the rights and freedoms of Data Subjects referred to in
Article 34(1) is no longer likely to materialise;
●● it would involve disproportionate effort. In such cases, there shall
instead be a public communication or similar measure whereby
the Data Subjects are informed in an equally effective manner
(Article 34(3)).
If the Controller has not already communicated the personal data breach
to the Data Subject, the supervisory authority, having considered the
likelihood of the personal data breach resulting in a high risk, may
require it to do so or may decide that any of the conditions referred to in
Article 34(3) are met (Article 34(4)).
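The Article 34(1) and 34(3) logic above can be expressed as a short decision sketch (a hypothetical illustration; the parameter names are this example's assumptions, not statutory terms, and a real assessment is a legal judgment, not a boolean):

```python
def data_subject_communication_required(
    high_risk: bool,
    rendered_unintelligible: bool = False,  # eg encryption applied to the affected data
    risk_no_longer_likely: bool = False,    # subsequent measures taken by the Controller
    disproportionate_effort: bool = False,  # public communication used instead
) -> bool:
    """Rough mapping of Article 34(1) and (3); illustrative only, not legal advice."""
    if not high_risk:
        # Article 34(1) is only triggered where a high risk is likely
        return False
    # Any one of the Article 34(3) exceptions can displace individual communication
    if rendered_unintelligible or risk_no_longer_likely or disproportionate_effort:
        return False
    return True
```

Note that under Article 34(4) the supervisory authority may still require communication, so a negative result here is never final.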

1 Note, for example, P Wainman, ‘Data Protection Breaches: Today and Tomorrow,’
SCL Computers and Law, 30 June 2012. Also see Dr M Dekker, C Karsberg and
B Daskala, Cyber Incident Reporting in the EU (2012).


Employee Data Breaches


27.05 Employee involvement is critical in dealing with – and pre-
paring for – data breach incidents. Various teams of employees will be
involved.
However, as employee personal data can also be the subject of a data
breach incident, employees may need to be specifically considered in
this context also. For example, they may need to be separately informed
that there is a breach relating to their personal data and what actions
and safeguards are being followed by the organisation to deal with the
issue. If the employees need to take specific actions, they may also need
to be apprised of this possibility. Potential liability issues may also
arise. For example, employees in the massive Sony data breach incidents
may have considered suing Sony for breaches in relation to their data.
Employees did sue their employer in the Morrison case, albeit unsuc-
cessfully in the specific matter sued for.2 A note of caution should be
applied as to whether, and to what extent, this immunises employers.
This text would suggest that there may still be circumstances in which
an employer may be vicariously liable to employees and third parties.

Notification Timelines
27.06 Organisations will need to assess the categories of personal data
they have that may be involved in a data breach. They also need to assess
what type of organisation or sector they are involved in. These factors
may dictate the time limits that the data protection regime imposes for
notification of breaches. (See Chapter 10).

Notification Processes
27.07 DPOs and organisations need to develop, and update as appro-
priate, breach notification procedures. There needs to be an appropriate
response plan for the different types of breach incident, covering notifi-
cation to the ICO, other regulators where appropriate or required, Data
Subjects, and other organisations, whether partners, Processors, out-
sourced security providers, etc.
Contracts and agreements should also be reviewed to ensure appropri-
ate breach, notification and security provisions.

2 See WM Morrison Supermarkets plc v Various Claimants [2020] UKSC 12 (1 April 2020).


Data Security Standards


27.08 The requirement to comply with the new GDPR regime, to
maintain security, and to prevent and deal with data breaches increas-
ingly directs attention to detailed standards and implementation meas-
ures. One example is compliance with ISO 27001, the international
standard for information security management. In particular it refers to:
●● security policies;
●● organisational information security;
●● HR security issues (including employees, families, contractors pre-
viously and currently);
●● asset identification and control;
●● encryption;
●● physical and environmental security factors;
●● operational security;
●● communications security;
●● acquisition, development and maintenance of systems;
●● supplier relations;
●● security incident management;
●● security issues and business continuity;
●● compliance with internal policies and external issues such as a data
protection regime and other laws.
Also consider ISO 31000, the international risk management standard,
and BS 10012:2009, a standard for personal information management
systems.
Organisations, including Processors, must implement appropriate
security measures, considering:
●● PIAs;
●● anonymising data;
●● deleting after use purpose;
●● confidentiality;
●● integrity;
●● availability;
●● access controls and restrictions;
●● Approved Codes of Conduct and Certification;
●● separating, segregating and securing different data sets;
●● encryption.
There are also various other technical data security standards and guides,
such as ISO 15408; ISO/IEC 27001 and 27002; ANSI/ISA 62443
(formerly ISA-99); IEC 62443; NERC; and NIST; and also the GDPR provi-
sions in relation to data protection impact assessments, certification, and
codes of conduct.


Incident Response
27.09 Some of the incident response and action points include (not in
order of priority):
●● incident detection and reporting;
●● incident notification to organisation (eg notification or demand from
hacker, posting online, etc);
●● internal notification(s);
●● team notifications;
●● risk assessment;
●● PIAs;
●● disciplinary action;
●● hacker-related actions;
●● supervisory authority external breach notification;
●● Data Subject breach notification;
●● customer breach notification.

Conclusion
27.10 Security and data breach issues are significant considerations
for organisations. The new data protection regime re-emphasises this.
Data breach issues should be considered in conjunction with the vari-
ous risk reduction mechanisms referred to under the new GDPR regime,
such as impact assessments, Data Protection by Design and by default,
mandatory breach reporting, mandatory prior consultations with the
supervisory authority in the case of identified high risks, codes of con-
duct and certification mechanisms. The enhanced penalties will also be
applied in the worst breach cases.

Chapter 28
Data Protection Impact
Assessment

Data Protection Impact Assessment and Prior Consultation


28.01 Chapter IV, Section 3 of the EU General Data Protection
Regulation (GDPR) refers to data protection impact assessments (DPIAs)
(sometimes also referred to as privacy impact assessments (PIAs)) and
prior consultations. As a result of the new GDPR regime there is now
a mandatory data protection impact assessment regime. These assess-
ments must be undertaken when data processing activities involve spe-
cific data protection and privacy risks. In particular when new products
and services, or other changes to existing products and services arise, the
organisation should ensure that these activities are the subject of a data
protection impact assessment (DPIA).
These impact assessments will help organisations to identify and
understand current and new risks in proposed processing activities, or
indeed to existing processing activities. Considerations include:
●● identifying when a project involves the collection of new informa-
tion about individuals;
●● identifying whether information about individuals will be disclosed
to organisations or people who have not previously had routine
access to the information;
●● identifying whether the project involves the use of new technology
which may raise privacy and data protection issues, such as over-
reach or privacy intrusion;
●● identifying whether the personal data raises issues or concerns or is
in some way objectionable.


Data Protection Impact Assessment


The New Requirement
28.02 Where a type of processing in particular using new technolo-
gies, and taking into account the nature, scope, context and purposes
of the processing, is likely to result in a high risk to the rights and
freedoms of natural persons, the Controller shall, prior to the processing,
carry out an assessment of the impact of the envisaged processing
operations on the protection of personal data. A single assessment may
address a set of similar processing operations that present similar high
risks (Article 35(1)).
The Controller shall seek the advice of the data protection officer
(DPO), where designated, when carrying out a data protection impact
assessment (Article 35(2)).
A DPIA referred to in Article 35(1) shall in particular be required in
the following cases:
●● a systematic and extensive evaluation of personal aspects relating
to natural persons which is based on automated processing, includ-
ing profiling, and on which decisions are based that produce legal
effects concerning the natural person or similarly significantly affect
the natural person;
●● processing on a large scale of special categories of data referred to in
Article 9(1), or of personal data relating to criminal convictions and
offences referred to in Article 10;
●● a systematic monitoring of a publicly accessible area on a large scale
(Article 35(3)).
The supervisory authority shall establish and make public a list of the
kind of processing operations which are subject to the requirement for
a DPIA pursuant to Article 35(1). The supervisory authority shall com-
municate those lists to the EDPB (Article 35(4)).
The supervisory authority may also establish and make public a list of
the kind of processing operations for which no DPIA is required. The
supervisory authority shall communicate those lists to the EDPB (Article 35(5)).
Prior to the adoption of the lists referred to in Article 35(4) and (5) the
competent supervisory authority shall apply the consistency mechanism
referred to in Article 63 where such lists involve processing activities
which are related to the offering of goods or services to Data Subjects
or to the monitoring of their behaviour in several states, or may sub-
stantially affect the free movement of personal data within the EU
(Article 35(6)).


The assessment shall contain at least:


●● a systematic description of the envisaged processing operations and
the purposes of the processing, including where applicable the legit-
imate interest pursued by the Controller;
●● an assessment of the necessity and proportionality of the processing
operations in relation to the purposes;
●● an assessment of the risks to the rights and freedoms of Data Subjects
referred to in Article 35(1);
●● the measures envisaged to address the risks, including safeguards,
security measures and mechanisms to ensure the protection of per-
sonal data and to demonstrate compliance with the GDPR taking
into account the rights and legitimate interests of Data Subjects and
other persons concerned (Article 35(7)).
Compliance with approved codes of conduct referred to in Article 40 by
the relevant Controllers or Processors shall be taken into due account
in assessing the impact of the processing operations performed by such
Controllers or Processors, in particular for the purposes of a DPIA
(Article 35(8)).
Where appropriate, the Controller shall seek the views of Data Subjects
or their representatives on the intended processing, without prejudice to
the protection of commercial or public interests or the security of the
processing operations (Article 35(9)).
Where the processing pursuant to Article 6(1)(c) or (e) has a legal
basis in EU law, or the law of the state to which the Controller is subject,
that law regulates the specific processing operations in question, and a
DPIA has already been carried out as part of a general impact assessment in
the context of the adoption of this legal basis, Article 35(1)–(7) shall not
apply, unless states deem it necessary to carry out such assessment prior
to the processing activities (Article 35(10)).
Where necessary, the Controller shall carry out a review to assess
if processing is performed in accordance with the DPIA at least when
there is a change of the risk represented by processing operations
(Article 35(11)).
ICO Guidance
28.03 The ICO also refers to these issues, post the GDPR, in guidance.1
It stresses that ‘[a] DPIA should begin early in the life of a project, before

1 ICO, ‘Data Protection Impact Assessments,’ at https://ico.org.uk/for-organisations/
guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/
accountability-and-governance/data-protection-impact-assessments/.


you start your processing, and run alongside the planning and develop-
ment process.’
It refers to the following steps, namely:
●● identify need for a DPIA;
●● describe the processing;
●● consider consultation;
●● assess necessity and proportionality;
●● identify and assess risks;
●● identify measures to mitigate risk;
●● sign off and record outcomes;
●● integrate outcomes into plan;
●● keep under review.
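For tracking purposes, the ICO steps above can be kept as a simple ordered checklist (an illustrative sketch only; the step wording follows the ICO list, while the function name is this example's own):

```python
# The nine ICO steps, in order (wording as per the ICO guidance list above)
DPIA_STEPS = [
    "identify need for a DPIA",
    "describe the processing",
    "consider consultation",
    "assess necessity and proportionality",
    "identify and assess risks",
    "identify measures to mitigate risk",
    "sign off and record outcomes",
    "integrate outcomes into plan",
    "keep under review",
]


def next_step(completed):
    """Return the first ICO step not yet completed, or None when all are done."""
    for step in DPIA_STEPS:
        if step not in completed:
            return step
    return None
```

Because the final step is ‘keep under review’, a DPIA tracked this way is never truly closed; it loops back as the processing changes.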
Reasons for Assessment
28.04 Vodafone refer to the following reasons for assessments:
●● accountability: to demonstrate that the assessment process was
performed appropriately and in accordance with the programme of
assessments agreed with the board sponsor for data protection;
●● provides basis for post-implementation review: to ensure any data
protection risks identified are allocated a business owner and a time-
table for delivering mitigation actions, therefore providing the DPO
with a mechanism for ensuring that the agreed actions are delivered
within agreed timescales;
●● provides a basis for audit: Vodafone distinguishes between a review
which is undertaken by the DPO who is responsible for ensuring
it is implemented and the controls required are delivered, and the
audit which is an objective and neutral assessment undertaken by
the group or local audit function or any other suitably qualified audit
function that is not part of delivering the overall Data Protection
Risk Management System;
●● provides corporate memory: ensuring the information gained is
available to those completing new assessments if original staff have
left or use a part of a subsequent assessment of the same business or
commercial unit or activity;
●● enables the experience gained during the project to be shared with
the future assessment teams and others outside the organisation.2

2 S Deadman and A Chandler, ‘Vodafone’s Approach to Privacy Impact Assessments,’


in D Wright and P de Hert, eds, Privacy Impact Assessment (Springer, 2012) 298.


Nokia also give reasons for undertaking assessments:


●● ‘to measure the implementation of privacy requirements, to get an
understanding of the current status (risk, controls, root causes, etc)’;3
●● the assessment is part of technical and organisational measures. It
assists to find out if new projects follow the data protection require-
ments; project management; communicate fulfilment of require-
ments; help to generate status reports for management teams; an
effective tool for assigning responsibility and fixing problems;
●● the DPIA assessment serves as ‘a repository for information requests
from authorities and consumers. Consumers might ask Nokia where
and for how long their data is stored. A data protection authority might,
for example, ask how consumers are informed about privacy prac-
tices or who the controller of the data is. Privacy assessment might
also be used to prepare notifications for data protection authorities’;4
●● ‘a means to improve general awareness. The assessment process …
builds up competencies and privacy awareness, as it offers an exten-
sive set of questions that might be relevant for privacy compliance.’5
Key Elements of Assessment Report
28.05 Some key elements of an assessment report are as follows:
●● the scope of the assessment undertaken;
●● summary of the consultative process undertaken;
●● the project background paper(s) provided to those consulted
(appendices);
●● analysis of the data protection issues and risks arising from the
assessment;
●● the business case justifying data protection intrusion and implica-
tions, treatment or mitigating action, together with timelines for
implementation;
●● references to relevant laws, codes and guidelines, including internal
local group policies.6
Assessment Characteristics
28.06 Common characteristics of impact assessments:
●● Statement of problem: Is government intervention both necessary
and desirable?

3 T Brautigam, ‘PIA: Cornerstone of Privacy Compliance in Nokia,’ in D Wright and


P de Hert, eds, Privacy Impact Assessment (Springer, 2012) 260–261.
4 See above.
5 See above.
6 S Deadman and A Chandler, above, 299.


●● Definition of alternative remedies: these include different approaches,


such as the use of economic incentives or voluntary approaches;
●● Determination of physical effects of each alternative, including
potential unintended consequences: the net should be cast wide.
Generally speaking, regulations or investments in many areas of
public policy can have social, environmental and other implications
that must be kept in mind;
●● Estimation of benefits and costs of each alternative: benefits should
be quantified and where possible monetised. Costs should be true
opportunity costs and not simply expenditures;
●● Assessment of other economic impacts: including effects on compe-
tition, effects on small firms, international trade implications;
●● Identification of winners and losers: those in the community who
stand to gain and lose from each alternative and, if possible, the
extent of their gains and losses;
●● Communication with the interested public, including the following
activities: notification of intent to regulate, request for compliance
costs and other data, public disclosure of regulatory proposals and
supporting analysis, and consideration of and response to public
comments;
●● A clear choice of the preferred alternative, plus a statement defend-
ing that choice;
●● Provision of a plan for ex post analysis of regulatory outcomes: it is
important to establish a benchmark against which to measure perfor-
mance. Planning is needed to ensure that procedures are in place for
the collection of data to permit such benchmarking.7
Some assessment characteristics distinguishing it from other privacy
related processes include:
●● an assessment focuses on a particular initiative or project;
●● an assessment is performed at depth, through the project life cycle,
and involves engagement with stakeholders;
●● an assessment assesses a project against the needs, expectations
and concerns of all stakeholders, including but not limited to legal
requirements;
●● an assessment assesses all aspects of privacy and data protection;
●● an assessment adopts a multi-perspective approach, taking into
account the costs and benefits as perceived by all stakeholders;

7 OECD, ‘Regulatory Performance: Ex Post Evaluation of Regulatory Tools and Insti-


tutions,’ Working Party on Regulatory Management and Reform, Draft Report by
the Secretariat, OECD, Paris (2004), 7; referred to in D Parker, ‘(Regulatory) Impact
Assessment and Better Regulation,’ in D Wright and P de Hert, eds, Privacy Impact
Assessment (Springer, 2012) 80.


●● an assessment adopts a multi-perspective approach, taking into


account the risks as perceived by all stakeholders;
●● an assessment is a process used to establish what undertakings an
organisation needs to give;
●● an assessment is the process that identifies the problems and identi-
fies solutions to them;
●● an assessment is conducted before and in parallel with a project, and
ensures that harmful and expensive problems that an audit would
later expose are avoided, and that unavoidable negative impacts on
privacy are minimised and harms mitigated.8
Key Steps and Methodologies
28.07 The key steps and methodologies in an assessment include:
●● identifying all of the personal data related to a programme or service
and looking at how it will be used;
●● mapping where personal data is sent after collection;
●● identifying privacy and data protection risks and the level of the
risks;
●● finding methods to eliminate or reduce the risks.9
Some Assessment Issues
28.08 Organisations might consider issues such as the following:
●● preparation;
●● undertaking of the assessment;
●● the timing of the assessment;
●● cost and resourcing the assessment;
●● whom the report is for;
●● issues and problems raised;
●● independence of those undertaking the assessment;
●● any constraints;
●● legal professional privilege and confidentiality;
●● after undertaking the assessment, draft report/comments/final report;
●● whether the assessment is a one-off or an ongoing assessment in a
(rolling) series.

8 R Clarke, ‘PIAs in Australia: A Work-In-Progress Report,’ in D Wright and P de Hert,


eds, Privacy Impact Assessment (Springer, 2012) 121.
9 Office of the Privacy Commissioner of Canada, Fact Sheet on Privacy Impact Assess-
ment. Also note OIPC of Alberta, ‘Commissioner Accepts Privacy Impact assessment
for the Alberta Security Screening Directive,’ press release, 16 January 2003.


Prior Consultation
28.09 The Controller shall consult the supervisory authority prior to
processing where a DPIA as provided for in Article 35 indicates that the
processing would result in a high risk in the absence of measures taken
by the Controller to mitigate the risk (Article 36(1)).
Where the supervisory authority is of the opinion that the intended
processing referred to in Article 36(1) would infringe the GDPR, in par-
ticular where the Controller has insufficiently identified or mitigated
the risk, it shall within a period of up to eight weeks of receipt of the
request for consultation, provide written advice to the Controller and,
where applicable the Processor, and may use any of its powers referred
to in Article 58. That period may be extended for a further six weeks,
taking into account the complexity of the intended processing. The
supervisory authority shall inform the Controller and, where applicable,
the Processor, of any such extension within one month of receipt of the
request for consultation together with the reasons for the delay. Those
periods may be suspended until the supervisory authority has obtained
information it may have requested for the purposes of the consultation
(Article 36(2)).
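The Article 36(2) periods above can be worked through with simple date arithmetic (an illustrative sketch; the function name is this example's own, and the suspension of the periods pending requested information is deliberately not modelled):

```python
from datetime import date, timedelta


def consultation_advice_deadline(received: date, extended: bool = False) -> date:
    """Deadline for the supervisory authority's written advice under Article 36(2).

    Eight weeks from receipt of the consultation request, extendable by a
    further six weeks for complex intended processing.
    """
    weeks = 8 + (6 if extended else 0)
    return received + timedelta(weeks=weeks)
```

For example, a request received on 4 January 2021 would, on this sketch, fall due on 1 March 2021, or 12 April 2021 if extended.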
When consulting the supervisory authority pursuant to Article 36(1),
the Controller shall provide the supervisory authority with:
●● where applicable, the respective responsibilities of the Controller,
joint Controllers and Processors involved in the processing, in par-
ticular for processing within a group of undertakings;
●● the purposes and means of the intended processing;
●● the measures and safeguards provided to protect the rights and free-
doms of Data Subjects pursuant to the GDPR;
●● where applicable, the contact details of the DPO;
●● the DPIA provided for in Article 35; and
●● any other information requested by the supervisory authority
(Article 36(3)).
States shall consult the supervisory authority during the preparation of a
proposal for a legislative measure to be adopted by a national parliament,
or of a regulatory measure based on such a legislative measure, which
relates to processing (Article 36(4)).
Notwithstanding Article 36(1), states’ law may require Controllers to
consult with, and obtain prior authorisation from, the supervisory author-
ity in relation to the processing of personal data by a Controller for the
performance of a task carried out by the Controller in the public interest,
including processing in relation to social protection and public health
(Article 36(5)).


Conclusion
28.10 Carrying out impact assessments and the like helps not only
to identify privacy and data protection problems, which can then be
addressed, but also to raise them at the earliest stage possible – the least
expensive and least problematic time to make remedial changes.
Carrying out such assessments is now a
requirement under the new GDPR regime. This is especially so for high
risk activities and when sensitive personal data may be involved. These
assessments ensure organisations understand the data they hold, and the
problem issues likely to arise. The organisation, its processes, and the
ultimate customer relationship, will all be improved. Impact assessments
are ultimately one of the mechanisms under the new GDPR for assess-
ing, and thus minimising, risk in the personal data environment.
Organisations must now be proactive and assess when processing
activities are likely to raise risks in relation to personal data and pro-
cessing. The DPO and other relevant parties/teams must be involved.
Assessments must be more systematic. Risk identification and evalua-
tion are now key considerations. Measures to mitigate and address risks
must be considered and documented, including risk assessments. In situ-
ations where there are substantial risk issues, it may be necessary to con-
sult with the ICO.

Chapter 29
Social Media

Introduction
29.01 New technologies ‘permit easy dissemination and using of
information. Current ICT allows individuals to share [sometimes
unknowingly] their personal preferences and behaviour information on
an unprecedented scale. This could lead to people losing control of per-
sonal information.’1 The internet is an increasing part of our daily lives.
One of its more popular examples is social media. What are the legal
implications of social media?2 One of the most controversial issues in
relation to social media websites is their data processing and respect for
privacy and personal data.3 This is only part of the story. There are many
discrete issues, such as:
●● employers using social networks to vet and screen job applicants;4
●● employers monitoring their employees’ social media;5

1 T Stanimir, ‘Personal Data Protection and the New Technologies,’ Proceedings of the
International Conference on Information Technologies (2011) 333–344.
2 S Nelson, J Simek and J Foltin, ‘The Legal Implications of Social Networking,’
Regent University Law Review, (2009–2010) (22) 1–34. Also, P Viscounty, J Archie,
F Alemi and J Allen, ‘Social Networking and the Law,’ Business Law Today
(2008–2009) (58) 18.
3 See, for example, P Roth, ‘Data Protection Meets Web 2.0: Two Ships Passing in the
Night,’ UNSW Law Journal (2010) (33) 532–561. NJ Slabbert, ‘Orwell’s Ghost: How
Teletechnology is Reshaping Civil Society,’ CommLaw Conspectus (2007–2008) (16)
349–359.
4 C Brandenburg, ‘The Newest Way to Screen Job Applicants: A Social Networker’s
Nightmare,’ Federal Communications Law Journal (2007–2008) (60) 597. D Gersen,
‘Your Image, Employers Investigate Job Candidates Online More than Ever. What
can You Do to Protect Yourself?’ Student Law (2007–2008) (36) 24; I Byrnside,
‘Six Degrees of Separation: The Legal Ramifications of Employers Using Social
Networking Sites to Research Applicants,’ Vanderbilt Journal of Entertainment and
Technology Law (2008) (2) 445–477.
5 AR Levinson, ‘Industrial Justice: Privacy Protection for the Employed,’ Cornell
­Journal of Law and Public Policy (2009) (18) 609–688.


●● recruiting through social media;6


●● universities monitoring student usage of social media;
●● universities using social media websites to vet applicants;
●● new forms of digital evidence, both criminal and civil.
One example of note is the investigation of particular data protection
issues at Facebook by one of the EU national supervisory authorities.
LinkedIn has also been audited in the EU. Recently, the ICO investigated
the significant implications surrounding Cambridge Analytica; and also
separately, particular Brexit campaign issues.
Prior to the new EU General Data Protection Regulation (GDPR), the
ICO issued recommendations in relation to online and computing
personal data issues, namely:
●● Bring Your Own Device (BYOD) guidance;
●● Cloud Computing;
●● IT Asset Disposal;
●● Personal Information Online Code of Practice;
●● Personal Information Online: Small Business Checklist;
●● Privacy in Mobile Apps: Guidance for App Developers;
●● Social Media and Online Forums – When Does the DPA apply?
These now need to be read in light of the GDPR changes.
The Digital, Culture, Media and Sport (DCMS) Committee, which has
been investigating social media, disinformation and fake news, recom-
mends more scrutiny, responsibility, and detailed regulation for social
media firms, particularly in terms of what it refers to as ‘online harms’.
The final report is entitled ‘Disinformation and “Fake News”: Final
Report’.7 Damian Collins MP, Chair of the DCMS Committee, states
inter alia that:
‘The big tech companies are failing in the duty of care they owe to their users
to act against harmful content, and to respect their data privacy rights.

Companies like Facebook exercise massive market power which enables them
to make money by bullying the smaller technology companies and developers
who rely on this platform to reach their customers.

6 M Maher, ‘You’ve Got Messages: Modern Technology Recruiting Through Text-Messaging
and the Intrusiveness of Facebook,’ Texas Review of Entertainment and Sports
Law (2007) (8) 125–151.
7 Digital, Culture, Media and Sport (DCMS) Committee, ‘Disinformation and “Fake
News”: Final Report’ (18 February 2019). Available at www.parliament.uk/business/
committees/committees-a-z/commons-select/digital-culture-media-and-sport-
committee/news/fake-news-report-published-17-19/.


These are issues that the major tech companies are well aware of, yet continu-
ally fail to address. The guiding principle of the ‘move fast and break things’
culture often seems to be that it is better to apologise than ask permission.

We need a radical shift in the balance of power between the platforms and the
people. The age of inadequate self regulation must come to an end. The rights
of the citizen need to be established in statute, by requiring the tech companies
to adhere to a code of conduct written into law by Parliament, and overseen by
an independent regulator.

We also have to accept that our electoral regulations are hopelessly out of
date for the internet age. We need reform so that the same principles of trans-
parency of political communications apply online, just as they do in the real
world. More needs to be done to require major donors to clearly establish the
source of their funds.

Much of the evidence we have scrutinised during our inquiry has focused on
the business practices of Facebook; before, during and after the Cambridge
Analytica data breach scandal.

We believe that in its evidence to the Committee Facebook has often deliber-
ately sought to frustrate our work, by giving incomplete, disingenuous and at
times misleading answers to our questions.’8

Controllers and Joint Controllers


29.02 Some may have assumed that the social media company
was automatically the Controller in terms of the data held on the site.
However, a CJEU case made clear that such a situation can be more
nuanced. In the Facebook fan page case9 it was held that the operator
of a fan page on Facebook was a Joint Controller, in conjunction with
Facebook, of the respective data held. The court stated that ‘the concept
of “[C]ontroller” within the meaning of that provision encompasses the
administrator of a fan page hosted on a social network.’
The issue of Joint Controllers is also considered in the Fashion ID
case.10 This relates to a website which incorporated a social media
plugin whereby certain data was collected and transmitted to a third
party in addition to the website operator.
One wonders if there may be a potential situation where there are
three Joint Controllers. Consider two companies creating a form of joint

8 Ibid.
9 Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsaka-
demie Schleswig-Holstein GmbH, CJEU [2018] Case C-210/16 (5 June 2018).
10 Fashion ID GmbH & Co.KG v Verbraucherzentrale NRW eV, CJEU [2019]
Case C-40/17 (29 July 2019).


venture, which as part of its activities also takes direction from each
respective entity. Another scenario might be a lecturer who posts on
Facebook or a blogging website. Perhaps the lecturer, university and the
site are each a Joint Controller.

Investigations
29.03 Social media organisations can be officially investigated and
audited much like any other organisation. One audit,11 for example,
reviewed certain specific aspects of social media data protection compli-
ance. This arose after a number of complaints regarding specific aspects
of an organisation’s practices. The following issues were looked at, namely:
●● privacy policies;
●● advertising;
●● access requests;
●● retention;
●● cookies/social plug-ins;
●● third party apps;
●● disclosures to third parties;
●● facial recognition/tag suggest;
●● data security;
●● deletion of accounts;
●● friend finder;
●● tagging;
●● posting on other profiles;
●● credits;
●● pseudonymous profiles;
●● abuse reporting;
●● compliance management/governance.
That is not to suggest that every potential data protection issue was con-
sidered. It was not. Other issues and complaints can arise in future, as
well as further investigations.12

11 Facebook Ireland Limited, Report of Re-Audit, Data Protection Commissioner,
21 September 2012.
12 The Europe Against Facebook group also point out that there are particular issues and
complaints outstanding. See http://www.europe-v-facebook.org/EN/en.html. There
was also a group created in the US by MoveOn.org called Petition: Facebook, Stop
Invading My Privacy, similarly objecting to certain practices of the social media web-
site. A Morganstern, ‘In the Spotlight: Social Network Advertising and the Right of
Publicity,’ Intellectual Property Law Bulletin (2007–2008) (1) 181–198; R Podolny,
‘When “Friends” Become Adversaries: Litigation in the Age of Facebook,’ Manitoba
Law Journal (2009) (33) 391–408; Y Hashemi, ‘Facebook’s Privacy Policy and Its


The audit investigation forced the social media organisation to make
particular changes to certain aspects of its data protection practices.
These are referred to in the reports. The entity in the EU, which was the
entity investigated, was responsible for its data protection compliance for
everywhere outside of the US and Canada. Therefore, these changes
should see an impact even beyond the EU. In addition, it is noted that the
controversial use of facial recognition technology by the organisation
also had to be turned off for users in the EU. This is a direct result of the
complaints and the official audit investigation.
Originally, it did not permit users to delete their accounts. However,
it has now been made clear to the organisation that it must permit users
the right and functional ability to delete their accounts. It will be recalled
that one of the data protection Principles refers to personal data being
held no longer than is necessary. User consent to processing can also be
withdrawn.
The organisation settled litigation in the US relating to an advertising
feature which it had launched and later cancelled. The case was meant
to have been settled for approximately $20m. Note, also that there was a
strong dissenting judgement criticising the settlement as, inter alia, too
low.13

Social Media and Leveson


29.04 Amongst the many witnesses at the Leveson Inquiry were
Facebook, Google and Twitter. One of the headline issues relates to what
activities and services they engage in, respectively, and what they can
and cannot do in terms of specific content. These are controversial and
evolving issues in terms of both data protection compliance as well as
take downs and liability for material on (and via) their websites. This
is an area which will continue to expand. It is a critical area of contention
in litigation (and policy discussion). Sony was fined £250,000
by the ICO and Google was sued by UK Apple users.14 It remains to
be seen if Cambridge Analytica will result in successful Data Subject

Third-Party Partnerships: Lucrativity and Liability,’ BU Journal of Science & Technology
Law (2009) (15) 140–161. Also P Nyoni and M Velempini, ‘Data Protection Laws
and Privacy on Facebook,’ SA Journal of Information Management (2015) (17:1) 1.
13 The Beacon advertising feature. See appeal and lower court in the case of McCall v
Facebook. The appeal case is McCall v Facebook, US Court of Appeals for the Ninth
Circuit. At http://cdn.ca9.uscourts.gov/datastore/opinions/2012/09/20/10–16380.pdf.
See also, for example, W McGeveran, ‘Disclosure, Endorsement, and Identity in
Social Marketing,’ Illinois Law Review (2009) (4) 1105–1166.
14 See Google Inc v Vidal-Hall [2015] EWCA Civ 311 (27 March 2015).


actions, though these may be complicated by the fact that the company has
sought to close down. Facebook was fined £500,000 by the ICO regarding
Cambridge Analytica issues.

Social Media Data Transfers: Processors


29.05 Any social media organisation may quite legitimately need to
engage third parties or outsource particular tasks. However, it is not
always clear that the website will have ensured that an appropriate writ-
ten contract is in place and that appropriate security measures are in place
with the outsourced Processor as regards the personal data received and
processed by it. Users should also be informed of such outsourcing and
be assured of the security measures. Consent, transparency and prior
information are equally important compliance issues.

Apps: Social Media Data Transfers


29.06 Increasingly, social networks provide and facilitate third party
apps or applications on their websites.15 Frequently, as part of this prac-
tice, personal data is disclosed to the third party companies operating or
developing the apps. Unfortunately, there appears to be an overly loose
compliance relationship as regards the transfer and protection of users’
personal data. Sometimes, data would be accessible or transferred with-
out regard to users’ personal data rights, user knowledge, contracts and
security. In addition, there could sometimes be no restriction on the app
developers using the personal data for more than one purpose and for
activities unrelated to the initial intended purpose.
The new GDPR also recognises that apps raise data protection issues.

Awareness
29.07 Increasingly, potential employers, schools and universities use
social media profile information in making assessments on applications
regarding specific individuals. Unfortunately, one of the issues relates
to the consequences of this for individuals, which can sometimes be
adverse.16 In addition, many users, and particularly those of a younger

15 See, for example, Y Hashemi, ‘Facebook’s Privacy Policy and its Third-Party Part-
nerships: Lucrativity and Liability,’ BUJ Science & Technology Law (2009) (15)
140–161.
16 See discussion at L Edwards and C Waelde, eds, above, 481.


age, will not (fully) appreciate that such activities and consequences can
arise from their social media.
There is arguably more to be done by social networks in terms of
informing and apprising users of the issues which can arise. This is
particularly emphasised when children and teenagers are concerned.
WP29, in its Opinion regarding social media, recognises the dangers
arising from apps.17 Compliance with EU General Data Protection
Regulation (GDPR) must be ensured. There is also a UK Home Office
Good Practice Guidance for the Providers of Social Networks.18
Arguably, these could be updated. This is an area where significant ongo-
ing research is needed.
In addition, it should be noted that social media companies have also
recognised that their own awareness and preparedness need to be increased
following the revelations surrounding the attack on the US election of
2016, and the ongoing Cambridge Analytica developments. The ICO is
also investigating the latter and will no doubt issue findings and a report
in due course. It cannot be ruled out at this remove that the ICO may
find sufficient evidence to mount prosecutions in relation to offences
committed.

Tagging and Identification


29.08 It is possible that people are visible online in photographs
uploaded to social networks which they would not want, may be
unaware of, and have not consented to.
In addition, social media websites can permit people to be tagged and
labelled in uploaded photographs, without their consent.
These concerns are further heightened. Facebook developed
enhanced capacity to index and reveal information and photographs
from its website (the Facebook Graph Search tool as well as new facial
recognition applications). There has been some controversy as to which
jurisdictions these types of tools may be applied in, and which not. It is
also noted that users are being requested to submit nude photographs in
order to be scanned in such a manner as to aid in dealing with revenge
porn attacks. By submitting such photographs and presumably furnish-
ing associated specific terms and acknowledgements, there may be an
intention to obtain not just consent, but explicit consent.

17 WP29, Opinion 5/2009 on online social networking, at http://ec.europa.eu/justice/policies/privacy/docs/wpdocs/2009/wp163_en.pdf.
18 At https://www.gov.uk/government/uploads/system/uploads/attachment_data/file/
251456/industry_guidance_social_networking.pdf.


Transparency and User Friendly Tools


29.09 Social media frequently appear to value membership numbers
over fully transparent, obvious and user friendly privacy protection and
complaint tools. The on-site tools (when present) are ‘frequently unobvi-
ous or difficult to use. The cynical might imagine that this is because …
the revenue stream from the SNS comes from third parties – a­ dvertisers –
having access to as much data, on as many profiles, as possible.’19
One partial solution is to have privacy and friend-restricted access as the
default model for social media websites. This is suggested particularly
in relation to children by the Home Office code of conduct for social
networks.20
WP29, in the context of online behavioural advertising, indicates that
an icon in itself can be insufficient. In that context it was particularly
concerned with consent issues. However, significant research remains to
be undertaken in order to properly assess the adequacy of icons, notices,
tools, information, information notices, report buttons, reports processes,
report teams, response times, resolutions times, etc, in relation to online
abuse and social media websites. There is as yet a distinct shortage of
research and literature on this topic. This is despite the media reports of
online abuse and many examples of tragic consequences.

Abuse, Attacks, Threats, Trolling, Victims
29.10 It is clear that these problem issues are significant and appear
to be increasing. Pierre Trudel, for example, notes that the risks to indi-
viduals increase from many online activities, including in relation to data
protection, safety, etc.21
Popular social media websites, such as Twitter, Reddit and now
Facebook, are recognising that more needs to be done to address issues
of online abuse and will progress changes to assist in dealing with the
problem.
The Olympics in London brought to the fore the disadvantages of
social media, where social networks such as Twitter, Facebook, etc,
can be used for abuse.22 There is a growing and troubling number of

19 L Edwards and C Waelde, eds, Law and the Internet (Hart, 2009) 483.
20 In 2008. At https://www.gov.uk/government/uploads/system/uploads/attachment_data/
file/251456/industry_guidance_social_networking.pdf.
21 P Trudel, ‘Privacy Protection on the Internet: Risk Management and Networked Nor-
mativity,’ in S Gutwirth, Y Poullet, P de Hert, C de Terwange and S Nouwt, Reinvent-
ing Data Protection? (Springer, 2009) 317.
22 J Rosenberg, ‘Tom Daley, Twitter Abuse and the Law,’ Guardian, 31 July 2012;
S James, ‘Man Cautioned After Mark Halsey Twitter Abuse,’ Guardian, 27 September


instances of suicides arising as a result of abuse and threats occurring on
social media websites. Social networks, and other websites, also contain
controversial material in relation to self-harm.
YouTube and Google also embrace ‘real name’ posting policies
(in part). Such policies may assist somewhat in reducing trolling and
attacks online, as well as the unfortunate consequences thereof.23
The problem of cyberbullying is incontrovertible, with an increasing
number of high-profile suicides recently. If anyone is still unclear about
the devastation caused to victims and families, the video posted by
Canadian teenager Amanda Todd is required viewing. Indeed, it should
be required viewing for parents, educators and policymakers.
The cyberbullying of teenagers is just one facet of online abuse, which
can range from threats to defamation, harassment to stalking, grooming
and breaches of privacy and data protection. There are also overlaps.
The cyberbullying of children can involve harassment, threats, verbal
abuse and defamation, or in the case of Amanda Todd, privacy and data-
protection breaches later expanding to threats. The solutions are multi-
faceted. Much of the recent commentary points to education, educators
and parents. While this is correct, it is only one aspect.
One point missing from the current discussion is that it is legally pos-
sible to find anonymous abusers. Victims and the police can apply to
court for the disclosure of user details from relevant websites and service
providers. These are frequently known as Norwich Pharmacal orders.
Online abusers can face civil as well as criminal consequences, even if
they are children or teenagers. It goes without saying that the police must
properly follow up reports. It is clear from recent media commentary that
the UK police are proactive and considered in this area.
Much of this abuse occurs on social media websites, but it does also
occur via mobile telephones and smartphones. An article in the Guardian
states that the ‘Mobile internet is now just the internet.’24 Some social
media sites provide ‘tools’ or ‘reporting processes.’ However, many of
them, and indeed other Web 2.0 websites, do not do all they could to
deal with these issues. Even some of the largest social networks have
been slower to implement reporting procedures for certain forms of
abuse than should be the case. Some websites also have particular tools

2012; LL Baughman, ‘Friend Request or Foe? Confirming the Misuse of Internet
and Social Networking Sites by Domestic Violence Perpetrators,’ Widener Law
Journal (2010) (19) 933–966. Generally, also see JJ Ator, ‘Got Facebook?’ GPSolo
(March, 2009) 4–5.
23 Generally note, for example, B Kane, ‘Balancing Anonymity, Popularity, & Micro-
Celebrity: The Crossroads of Social Networking & Privacy,’ Albany Law Journal of
Science and Technology (2010) (20) 327–363.
24 J Naughton, ‘Mobile internet is Now Just the Internet,’ Guardian, 27 December 2015.


available which they use for certain activities, but are reluctant to extend
to abuse victims.
But no matter how many ‘report buttons’ there are on a given web-
site, they are entirely useless without protocols, policies and procedures
behind the scenes to follow through on reports that are made. A com-
plaints procedure is meaningless unless there are enough people investi-
gating abuse reports.
It would be interesting to examine and compare the number of peo-
ple employed in abuse investigation teams across various social media
websites. Should there be a minimum number of employees assigned
per number of users of a social media website? Or should there be a
minimum number of employees per amount of abuse reports made? The
turnover of such websites could easily absorb hiring more staff.
A further point arises regarding social media and related websites.
Some are happy to publish statistics about the level of reports and com-
plaints received relating to copyright infringement. This appears com-
mercially driven. There is significantly less ‘transparency’ as regards the
level of abuse reports and complaints made to social media websites, and
around how, and how quickly, these abuse reports are resolved.
As much as we are presently shocked by the dark side of internet abuse,
cyberbullying and the terrible consequences, it may be that we would be
further shocked at the scale of abuse being reported, when the facility is
available to do so. That may be a useful line of pursuit for anyone who is
officially concerned about this issue. It is also worth considering that what-
ever a website may say at first blush may not always be the whole picture.
We are now beginning to realise that, on occasion, social media and
other websites can have a dark side.
Unfortunately, there are large gaps in our knowledge, research and
understanding of these developing issues. More research is needed to
apprise ourselves of all potential solutions and policy decisions, as well
as to assist websites to fully engage their own (corporate, moral and
legal) responsibilities and functional capabilities.
There are also business case advantages. The EU Commission and
websites have recently developed a code of conduct initiative for online
hate takedowns.

Employment and Social Media


29.11 Employers are increasingly considering access to employees’
social media materials.25 A general policy of such monitoring is not

25 A Blank, ‘On the Precipe of e-Discovery: Can Litigants Obtain Employee Social
Networking Web Site Information Through Employers?’ CommLaw Conspectus


permitted under EU data protection law. Nor can employees be forced
to disclose or permit access to their social media accounts by way of
enforced access. However, it is the case that social media evidential
material is increasingly frequent in employment disputes and employ-
ment litigation. Organisations should seek legal advice in advance of
seeking to utilise or rely upon such materials; otherwise dismissals or
disciplinary actions may be deemed illegal and give rise to litigation.26

Electronic Evidence
29.12 There are growing digital evidence opportunities for litiga-
tion, which can include personal data.27 These can include service of
documents28 and civil discovery.29 Social media have been described as
electronic footprints.30 This can also include employees’ social media
activity.31

(2009–2010) (18) 487–516. Also, I Byrnside, ‘Six Clicks of Separation: The Legal
Ramifications of Employers using Social Networking Sites to Research Applicants,’
Vanderbilt Journal of Entertainment and Technology Law (2008) (10) 445–477.
26 See also M Maher, ‘You’ve Got Messages: Modern Technology Recruiting Through
Text-Messaging and the Intrusiveness of Facebook,’ Texas Review of Entertainment
& Sports Law, (2007) (8) 125–151; C Brandenburg, ‘The Newest Way to Screen Job
Applicants: A Social Networker’s Nightmare,’ Federal Communications Law Journal
(2007–2008) (60) 597–626.
27 K Minotti, ‘The Advent of Digital Diaries: Implications of Social Networking Web
Sites for the Legal Profession,’ South Carolina Law Review (2009) (60) 1057–1074;
JS Wilson, ‘MySpace, Your Space or Our Space? New Frontiers in Electronic Evi-
dence,’ Oregon Law Review (2007) (86) 1201–1240; AC Payne, ‘Twitigation: Old
Rules in a New World,’ Washburn Law Journal (2010) (49) 842–870.
28 AL Shultz, ‘Superpoked and Served: Service of Process via Social Networking
Sites,’ University of Richmond Law Review (2008–2009) (43) 1497–1528. RJ Hedges,
KN Rashbaum and AC Losey, ‘Electronic Service of Process at Home and Abroad: Allowing
Domestic Electronic Service of Process in Federal Courts,’ Federal Courts Law
Review (2010) (4) 54–76.
29 SC Bennett, ‘Civil Discovery of Social Networking Information,’ Southwestern Law
Review (2009–2010) (39) 413–431. Also, DS Witte, ‘Your Opponent Does Not Need
a Friend Request to See Your Page: Social Networking Sites and Electronic Discov-
ery,’ McGeorge Law Review (2009–2010) (41) 891–903; RA Ward, ‘Discovering
Facebook: Social Network Subpoenas and the Stored Communications Act,’ Harvard
Journal of Law & Technology (2011) (24) 563–588.
30 EM Marsico, Jr, ‘Social Networking Websites: Are Myspace and Facebook the Finger-
prints of the Twenty-first Century?’ Widener Law Journal (2010) (19) 967–976. Also,
EE North, ‘Facebook Isn’t Your Space Anymore: Discovery of Social Networking Website,’
University of Kansas Law Review (2009–2010) (58) 1279–1309; TM Williams,
‘Facebook: Ethics, Traps, and Reminders,’ Litigation News (2009–2010) (35) 4.
31 For example, see, L Thomas, ‘Social Networking in the Workplace: Are Private
Employers Prepared to Comply With Discovery Requests for Posts and Tweets?’ SMU
Law Review (2010) (63) 1373–1402.


The Rights of Data Subjects


29.13 It is noted that ‘[o]ne aspect of privacy and data protection is
a remedy against intrusiveness and loss of control of the circulation of
a person’s informational image.’32 In addition, ‘[s]uch intrusiveness
and its loss do not only exist when someone is or can be identified; for
instance, the acts of being observed and being traced are privacy threats,
even without knowing the name of the observed or traced person.’33
Even with social media, the rights of Data Subjects remain. These
rights, in the social media context, can be summarised as including:
●● right of access;
●● right to establish if personal data exists;
●● right to be informed of the logic in automatic decision taking;
●● right to prevent processing likely to cause damage or distress;
●● right to prevent processing for direct marketing;
●● right to prevent automated decision taking;
●● right to compensation;
●● right to rectify inaccurate data;
●● right to rectification, blocking, erasure and destruction;
●● right to complain to ICO;
●● right to go to court.
These expand under the GDPR. Chapter III of the GDPR refers to the
rights of Data Subjects. These include:
●● right to transparency (Article 5; Article 12);
●● right to prior information: directly obtained data (Article 13);
●● right to prior information: indirectly obtained data (Article 14);
●● right of confirmation and right of access (Article 15);
●● right to rectification (Article 16);
●● right to erasure (Right to be Forgotten) (RtbF) (Article 17);
●● right to restriction of processing (Article 18);
●● notifications re rectification, erasure or restriction (Article 19);
●● right to data portability (Article 20);
●● right to object (Article 21);
●● rights re automated individual decision making, including profiling
(Article 22);
●● DPbD (Article 25);
●● security rights;

32 L Costa and Y Poullet, ‘Privacy and the Regulation of 2012,’ Computer Law & Secu-
rity Review (2012) (28) 254–262, at 256.
33 See above.


●● data protection impact assessment and prior consultation;


●● communicating data breach to Data Subject;
●● Data Protection Officer;
●● remedies, liability and sanctions (Chapter VIII).

Consent and Social Media


29.14 It is noted that social media websites:
‘are of course data Controllers, and so under [data protection] law they must,
in pursuit of “fair processing,” gain the consent of their users to process their
personal data – and indeed, “explicit consent” in the case of sensitive personal
data (which abounds on SNA [social networking service] – on almost every
Facebook profile a user reveals his or her race, politics, sexuality or religious
beliefs, since these predetermined fields in the user profile, which most users
fill in without much thought).’34

Frequently, social media websites seek to comply with the consent
requirement by providing that it is ‘required as part of registration before
“admission” to the site is granted. Consent is usually obtained by dis-
playing a privacy policy [or privacy statement] on the site and asking the
user to accede to them by ticking a box. As there is no chance to negoti-
ate, and little or no evidence that users either read or understand these
conditions, it is hard to see how this consent is “free and informed” …
yet business practice for the entire sector seems to regard this consent is
satisfactory.’35
There are other difficulties.
‘A further problem arises where SNS users display facts or images about other
users – eg commonly, photographs featuring multiple persons. Such users
rarely seek prior consent and such software tools as are made available usually
only facilitate post factum removal, for example, of “tags” on Facebook.’36

‘Also, the question arises as to whether ordinary users should be subject to
the full panoply of [data protection] obligations vis-a-vis their peers, and
if they are not, how invasions of privacy by users rather than the site itself
should be controlled. One suggestion has been that sites should be subject
to liability for invasion of privacy if they do not take down expediently on
complaint.’37

34 L Edwards and C Waelde, eds, Law and the Internet (Hart, 2009) 479.
35 See above.
36 See above.
37 See above.

465
29.14 Social Media

Consent is an important issue under the new GDPR.38 Lawful processing
and consent are referred to in Recitals 32 and 40. The WP29 also refers
to consent issues.39
Article 7 of the GDPR refers to conditions for consent as follows.
Where processing is based on consent, the Controller shall be able to
demonstrate that the Data Subject has consented to processing of their
personal data (Article 7(1)).
If the Data Subject’s consent is given in the context of a written dec-
laration which also concerns other matters, the request for consent must
be presented in a manner which is clearly distinguishable from the
other matters, in an intelligible and easily accessible form, using clear
and plain language. Any part of the declaration which constitutes an
infringement of the GDPR shall not be binding (Article 7(2)).
The Data Subject shall have the right to withdraw his or her consent
at any time. The withdrawal of consent shall not affect the lawfulness
of processing based on consent before its withdrawal. Prior to giving
consent, the Data Subject must be informed thereof. It shall be as easy to
withdraw consent as to give it (Article 7(3)).
When assessing whether consent is freely given, utmost account shall
be taken of whether, inter alia, the performance of a contract, including
the provision of a service, is conditional on the consent to the process-
ing of data that is not necessary for the performance of that contract
(Article 7(4)).
The issue of children in the data protection regime has been steadily
rising in prominence. The increased use of social media and Web 2.0
services enhances the exposures and risks for children and the uninitiated.40
Those focusing on these issues have included children’s groups, regulators
and also the EDPB (previously the WP29). WP29 issued Opinion 2/2009 on the Protection
of Children’s Personal Data (General Guidelines and the Special Case of
Schools) in 2009 and also Working Document 1/2008 on the Protection
of Children’s Personal Data (General Guidelines and the Special Case
of Schools) in 2008. Schools are being encouraged to be proactive and
to have appropriate codes and policies for children’s social media and
internet usage.41

38 JP Vandenbroucke and J Olsen, ‘Informed Consent and the New EU Regulation on Data Protection,’ International Journal of Epidemiology (2013) (42: 6) 1891.
39 WP29 Opinion 15/2011 Consent; Working Document 02/2013 providing guidance on obtaining consent for cookies, 2013; Opinion 04/2012 on Cookie Consent Exemption.
40 See, for example, D Gourlay and G Gallagher, ‘Collecting and Using Children’s Infor-
mation Online: the UK/US Dichotomy,’ SCL Computers and Law, 12 December 2011.
41 Note generally, for example, JS Groppe, ‘A Child’s Playground or a Predator’s
Hunting Ground? – How to Protect Children on Internet Social Networking Sites,’
CommLaw Conspectus (2007) (16) 215–245; EP Steadman, ‘MySpace, But Who’s
Responsibility? Liability of Social Networking Websites When Offline Sexual Assault

466
Consent and Social Media 29.14

There is now an express acknowledgement of children’s interests in the EU data protection regime, unlike with the DPD 1995 which con-
tained no explicit reference. The GDPR expressly refers to a ‘child’ for
the first time. This is significant amongst other things in relation to con-
sent, contracting, etc. It is also significant for social networks which have
significant numbers of children. Up until now it was common for certain
social networks to purport to accept users only over the age of 13. Now
that the age of a child in the context of information society services is
considered by the GDPR to be at least 16 years old, it may require care-
ful assessment in relation to social media contracts, terms, processes,
sign ups, etc. Note that states may additionally make provisions in relation to the age of children in this regard in addition to the above, for example, not below the age of 13 years old (see Article 8), which the UK has availed of (see Data Protection Act 2018 (DPA 2018), s 9).
The explicit reference to children is new, and some would argue overdue. Increasingly, the activities of children on the internet and on social media pose risks and concerns.42 This has been further emphasised of late with tragic events involving online abuse, in particular cyber bullying. Risks arise not only from their activities online (eg inappropriate content, cyber bullying) but also from the collection and use of their personal data online, sometimes without their knowledge or consent. Their personal data and privacy are more vulnerable than those of older people.
It is important for organisations to note the definition of ‘child’ in the
GDPR. This will have implications for how organisations:
●● consider the interaction with children and what personal data may be
collected and processed;
●● ensure that there is appropriate compliance for such collection and
processing for children as distinct from adults.
Processing of children’s personal data is referred to in Recitals 58, 65,
71 and 75.
The German Federal Court has affirmed lower court rulings holding unlawful Facebook’s ‘friend finder’ function, which accessed users’ email contact lists and promoted/advertised the service to those contacts even where they were not members of the website.43

of Minors Follows Online Interaction,’ Villanova Sports and Entertainment Law Jour-
nal (2007) (14) 363–397; DC Beckstrom, ‘Who’s Looking at Your Facebook Pro-
file? The Use of Student Conduct Codes to Censor College Students’ Online Speech,’
Willamette Law Review (2008) 261–312.
42 See, for example, L McDermott, ‘Legal Issues Associated with Minors and Their Use
of Social Networking Sites,’ Communications Law (2012) (17) 19–24.
43 Federation of German Consumer Organisations (VZBV) v Facebook, German Federal
Court of Justice, 14 January 2016.


The GDPR can be taken to be part of the increasing willingness to critically assess both the arguments raised by internet companies, or at least
social media internet companies, and the responsibility of such com-
panies. The express inclusion of the right of data protection bodies to
act on behalf of groups of Data Subjects in the GDPR will undoubtedly
increase the potential for more actions targeted at social media internet
companies (and indeed other companies).

Website Discretion, Liability and Obligations


29.15 On occasion certain websites will argue that they are not
responsible for certain activities occurring on their websites. A particular
example is user generated content. One frequent argument relates to the
issue of the limited ISP defences of caching, hosting and mere conduit
in the ECD.44 There is also an argument raised by certain websites that
they can target and deal with European consumers and users, yet are not
responsible in relation to the collection and processing of users’ per-
sonal data. This latter point is based on the argument that if the website
company is based outside of the EU, even though the users are in the
EU, they do not have to comply with the data protection regime. There
is an irony here. These website companies seek to avail of EU law as
regards EU law defences of mere conduit, caching and hosting, yet then
argue their preferred policy of ignoring EU data protection law. Are such
websites immune to the EU data protection regime? Can websites pick
and choose one set of EU laws to benefit them, but ignore entirely other
laws which create rights for their users? Caselaw and the GDPR confirm
compliance liability.
These issues in relation to websites, social media and jurisdiction are
commented upon later.

Third Party Controllers and EU Data Subjects


29.16 DPD 1995 Recital 20 stated that the fact that the processing of
data is carried out by a person established in a third country must not
stand in the way of the protection of individuals provided for in this
Directive. In these cases, the processing should be governed by the law
of the state in which the means used are located, and there should be

44 Directive 2000/31/EC of the European Parliament and of the Council of 8 June 2000
on certain legal aspects of information society services, in particular electronic com-
merce, in the Internal Market (‘Directive on electronic commerce’).


guarantees to ensure that the rights and obligations provided for in this
Directive are respected in practice.

Specific Processing Risks


29.17 DPD 1995 Recital 53 stated that certain processing opera-
tions are likely to pose specific risks to the rights and freedoms of Data
Subjects by virtue of their nature, their scope or their purposes, such as
that of excluding individuals from a right, benefit or a contract, or by
virtue of the specific use of new technologies; it is for states, if they so
wish, to specify such risks in their legislation.
DPD 1995 Recital 54 stated that with regard to all the processing
undertaken in society, the amount posing such specific risks should be
very limited. States must provide that the supervisory authority, or the
data protection official in cooperation with the authority, check such
processing prior to it being carried out. Following this prior check, the
supervisory authority may, according to its national law, give an opinion
or an authorisation regarding the processing. Such checking may equally
take place in the course of the preparation either of a measure of the
national parliament or of a measure based on such a legislative measure,
which defines the nature of the processing and lays down appropriate
safeguards.
The GDPR also continues this trajectory with the obligations to pro-
actively examine risk and gap issues, and to carry out audits and assess-
ments. While most organisations will have engaged in GDPR compliance
exercises leading up to May 2018, it must be stressed that data protection
compliance is an ongoing exercise, not a one-off moment-in-time action. In addition, there is a new emphasis on being able to record and demonstrate compliance, as opposed to merely making statements without appropriate backup.

Rights
Right to Prevent Data Processing Likely to Cause Damage
or Distress
29.18 Data Subjects have a right to prevent processing likely to cause
damage or distress. This can also be considered relevant to social media.
Right to Prevent Data Processing for DM
29.19 Data Subjects have the right to prevent processing for purposes
of direct marketing. This can also be considered relevant to social media.


Compensation for Data Subjects


29.20 Data Subjects are entitled to damages and compensation for
failure to comply with certain requirements, particularly where there
is material or non-material damage. This can be relevant to social
media.
Rectification, Blocking, Erasure and Destruction Rights
29.21 Data Subjects have rights in relation to rectification, block-
ing, erasure and forgetting, and destruction. This can also be relevant to
social media.
Automated Decision Taking/Making Processes
29.22 Automated decision taking/making processes can also be con-
sidered in the social media context. Controllers may not take decisions
which produce legal effects concerning a Data Subject or which other-
wise significantly affect a Data Subject and which are based solely on
processing by automatic means of personal data and which are intended
to evaluate certain personal matters relating to the Data Subject such as performance at work, creditworthiness, reliability or conduct.

GDPR
Definitions and Social Media
29.23 The GDPR is relevant to social media. The GDPR definitions
include:
‘personal data breach’ means a breach of security leading to the accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to, personal data transmitted, stored or otherwise processed;
‘main means:
establishment’ ●● as regards a Controller with establishments in more
than one Member State, the place of its central
administration in the EU, unless the decisions on
the purposes and means of the processing of per-
sonal data are taken in another establishment of the
Controller in the EU and the latter establishment
has the power to have such decisions implemented,
in which case the establishment having taken
such decisions is to be considered to be the main
establishment;


●● as regards a Processor with establishments in more than one Member State, the place of its central admin-
istration in the EU, or, if the Processor has no central
administration in the EU, the establishment of the Pro-
cessor in the EU where the main processing activities
in the context of the activities of an establishment of
the Processor take place to the extent that the Proces-
sor is subject to specific obligations under the GDPR.
‘representative’ means any natural or legal person established in the
EU who, designated by the Controller or Processor in
writing pursuant to Article 27, represents the Controller or
Processor, with regard to their respective obligations under
the GDPR;
‘enterprise’ means any natural or legal person engaged in an
economic activity, irrespective of its legal form, including
partnerships or associations regularly engaged in an
economic activity;
‘group of means a controlling undertaking and its controlled
undertakings’ undertakings.

The originally proposed definition of a child appears deleted in the final version of the GDPR.
However, see reference to children below.
Right to Object, Profiling and Social Media
29.24 Electronic identity or eIDs are also an area being considered.45
Transparency
29.25 Article 12 refers to transparent information, communication and
modalities for exercising the rights of the Data Subject. The Controller
shall take appropriate measures to provide any information referred to in
Articles 15–22 and 34 relating to the processing to the Data Subject in a
concise, transparent, intelligible and easily accessible form, using clear
and plain language, in particular for any information addressed specifi-
cally to a child. When requested by the Data Subject, the information
may be provided orally, as long as the identity of the Data Subject is
proven by other means (Article 12(1)).

45 See, for example, Norberto Nuno Gomes de Andrade, ‘Regulating Electronic Identity
in the European Union: An Analysis of the Lisbon Treaty’s Competences and Legal
Basis for eID,’ Computer Law and Security Review (2012) (28) 153–162, at 153.
European Commission, Communication from the Commission – a Digital Agenda for
Europe (Brussels: European Commission 2010) 11.


The Controller shall facilitate the exercise of Data Subject rights under
Articles 15–22. In cases referred to in Article 11(2), the Controller shall
not refuse to act on the request of the Data Subject for exercising their
rights under Articles 15–22, unless the Controller demonstrates that it is
not in a position to identify the Data Subject (Article 12(2)).
The Controller shall provide information on action taken on a request
under Articles 15–22 to the Data Subject without undue delay and in any
event within one month of receipt of the request. This period may be
extended by two further months where necessary, taking into account the
complexity of the request and the number of the requests. The Controller
shall inform the Data Subject of any extensions within one month of
receipt of the request, together with the reasons for the delay. Where the
Data Subject makes the request by electronic form means, the informa-
tion shall be provided in electronic form where possible, unless other-
wise requested by the Data Subject (Article 12(3)).
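The one-month response period under Article 12(3), extendable by two further months for complex or numerous requests, is in practice a calendar-month computation. The helper below is a hypothetical Python sketch for illustration only; the rule of clamping to the last day of a shorter target month is the author's assumption, and how the period is counted as a matter of law may differ.

```python
from datetime import date

def add_months(d: date, months: int) -> date:
    # Hypothetical helper: add calendar months, clamping the day where
    # the target month is shorter (eg 31 Jan + 1 month -> 28/29 Feb).
    y, m = divmod(d.month - 1 + months, 12)
    year, month = d.year + y, m + 1
    # days per month, with a leap-year check for February
    leap = year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)
    days = [31, 29 if leap else 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31]
    return date(year, month, min(d.day, days[month - 1]))

received = date(2023, 1, 31)
due = add_months(received, 1)        # one month to respond
extended = add_months(received, 3)   # plus two further months if complex
```

A diary system built on a helper of this kind would also need to record the Article 12(3) obligation to notify the Data Subject of any extension, with reasons, within the first month.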
If the Controller does not take action on the request of the Data
Subject, the Controller shall inform the Data Subject without delay and
at the latest within one month of receipt of the request of the reasons for
not taking action and on the possibility of lodging a complaint to a super-
visory authority and seeking a judicial remedy (Article 12(4)).
Information provided under Articles 13 and 14 and any communica-
tion and any actions taken under Articles 15–22 and 34 shall be pro-
vided free of charge. Where requests from a Data Subject are manifestly
unfounded or excessive, in particular because of their repetitive charac-
ter, the Controller may either charge a reasonable fee taking into account
the administrative costs of providing the information or communication
or taking the action requested; or, refuse to act on the request. In these
cases, the Controller shall bear the burden of demonstrating the mani-
festly unfounded or excessive character of the request (Article 12(5)).
Without prejudice to Article 11, where the Controller has reasonable
doubts concerning the identity of the individual making the request
referred to in Articles 15–21, the Controller may request the provision
of additional information necessary to confirm the identity of the Data
Subject (Article 12(6)).
The information to be provided to Data Subjects pursuant to Articles 13
and 14 may be provided in combination with standardised icons in order
to give in an easily visible, intelligible and clearly legible manner a
meaningful overview of the intended processing. Where the icons are
presented electronically they shall be machine-readable (Article 12(7)).
The Commission shall be empowered to adopt delegated acts in
accordance with Article 92 for the purpose of determining the informa-
tion to be presented by the icons and the procedures for providing stand-
ardised icons (Article 12(8)).


Child’s Consent: Conditions for Information Society Services


29.26 Processing of children’s personal data is referred to in
Recital 38. Article 8 of the GDPR makes provisions in relation to con-
ditions for children’s consent for information society services. Where
Article 6(1)(a) applies, in relation to the offer of information society
services directly to a child, the processing of personal data of a child
shall be lawful where the child is at least 16 years old. Where the child
is below the age of 16 years, such processing shall be lawful only if and
to the extent that such consent is given or authorised by the holder of
parental responsibility over the child (Article 8(1)). (States may provide by law for a lower age for those purposes provided that such lower age is not below 13 years (Article 8(1)) – as the UK has done (DPA 2018, s 9).) The Controller shall make reasonable efforts to verify in such cases
that consent is given or authorised by the holder of parental respon-
sibility over the child, taking into consideration available technology
(Article 8(2)). This shall not affect the general contract law of states
such as the rules on the validity, formation or effect of a contract in rela-
tion to a child (Article 8(3)).
Data Access Rights
29.27 Chapter III, Section 2 of the GDPR refers to data access. It
should be noted that the prior information requirements might have pre-
viously been viewed as Controller compliance obligations. However,
they are now reformulated as part of the Data Subject information and
access rights. This would therefore be a change in emphasis.
Right to Prior Information: Directly Obtained Data
29.28 Article 13 refers to information to be provided where the data
are collected from the Data Subject. Where personal data relating to a
Data Subject are collected from the Data Subject, the Controller shall, at
the time when personal data are obtained, provide the Data Subject with
all of the following information:
●● the identity and the contact details of the Controller and, where
applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are
intended as well as the legal basis of the processing;
●● where the processing is based on Article 6(1)(f), the legitimate inter-
ests pursued by the Controller or by a third party;
●● the recipients or categories of recipients of the personal data, if any;
●● where applicable, the fact that the Controller intends to transfer per-
sonal data to a third country or international organisation and the


existence or absence of an adequacy decision by the Commission, or in case of transfers referred to in Article 46 or 47, or Article 49(1)
(second subparagraph), reference to the appropriate or suitable safe-
guards and the means by which to obtain a copy of them or where
they have been made available (Article 13(1)).
In addition to the information referred to in Article 13(1), the Controller
shall, at the time when personal data are obtained, provide the Data
Subject with the following further information necessary to ensure fair
and transparent processing:
●● the period for which the personal data will be stored, or if this is not
possible, the criteria used to determine this period;
●● the existence of the right to request from the Controller access to and
rectification or erasure of personal data or restriction of processing
concerning the Data Subject or to object to the processing as well as
the right to data portability;
●● where the processing is based on Article 6(1)(a) or Article 9(2)(a),
the existence of the right to withdraw consent at any time, without
affecting the lawfulness of processing based on consent before its
withdrawal;
●● the right to lodge a complaint with a supervisory authority;
●● whether the provision of personal data is a statutory or contractual
requirement, or a requirement necessary to enter into a contract, as
well as whether the Data Subject is obliged to provide the data and
of the possible consequences of failure to provide such data;
●● the existence of automated decision making, including profiling,
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 13(2)).
Where the Controller intends to further process the personal data for
a purpose other than the one for which the data were collected, the
Controller shall provide the Data Subject prior to that further process-
ing with information on that other purpose and with any relevant further
information as referred to in Article 13(2) (Article 13(3)).
Article 13(1), (2) and (3) shall not apply where and insofar as the Data
Subject already has the information (Article 13(4)).
Right to Prior Information: Indirectly Obtained Data
29.29 Article 14 refers to information to be provided where the per-
sonal data have not been obtained from the Data Subject. Where personal


data have not been obtained from the Data Subject, the Controller shall
provide the Data Subject with the following information:
●● the identity and the contact details of the Controller and, where
applicable, of the Controller’s representative;
●● the contact details of the DPO, where applicable;
●● the purposes of the processing for which the personal data are
intended as well as the legal basis of the processing;
●● the categories of personal data concerned;
●● the recipients or categories of recipients of the personal data, if any;
●● where applicable, that the Controller intends to transfer personal data
to a recipient in a third country or international organisation and the
existence or absence of an adequacy decision by the Commission,
or in case of transfers referred to in Article 46 or 47, or the second
subparagraph of Article 49(1), reference to the appropriate or suit-
able safeguards and the means to obtain a copy of them or where
they have been made available (Article 14(1)).
In addition to the information referred to in Article 14(1), the Controller
shall provide the Data Subject with the following information necessary
to ensure fair and transparent processing in respect of the Data Subject:
●● the period for which the personal data will be stored, or if this is not
possible, the criteria used to determine that period;
●● where the processing is based on Article 6(1)(f), the legitimate inter-
ests pursued by the Controller or by a third party;
●● the existence of the right to request from the Controller access to and
rectification or erasure of the personal data or restriction of process-
ing of data concerning the Data Subject and to object to the process-
ing as well as the right to data portability;
●● where processing is based on Article 6(1)(a) or Article 9(2)(a), the
existence of the right to withdraw consent at any time, without
affecting the lawfulness of processing based on consent before its
withdrawal;
●● the right to lodge a complaint to a supervisory authority;
●● from which source the personal data originate, and if applicable,
whether it came from publicly accessible sources;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 14(2)).


The Controller shall provide the information referred to in Article 14(1) and (2):
●● within a reasonable period after obtaining the personal data, but at
the latest within one month, having regard to the specific circum-
stances in which the personal data are processed;
●● if the personal data are to be used for communication with the Data
Subject, at the latest at the time of the first communication to that
Data Subject; or
●● if a disclosure to another recipient is envisaged, at the latest when
the personal data are first disclosed (Article 14(3)).
Where the Controller intends to further process the personal data for a
purpose other than that for which the data were obtained, the Controller
shall provide the Data Subject prior to that further processing with infor-
mation on that other purpose and with any relevant further information
as referred to in Article 14(2) (Article 14(4)).
Article 14(1)–(4) shall not apply where and insofar as:
●● the Data Subject already has the information;
●● the provision of such information proves impossible or would
involve a disproportionate effort, in particular for processing for
archiving purposes in the public interest, scientific or historical
research purposes or statistical purposes, subject to the conditions
and safeguards referred to in Article 89(1) or in so far as the obliga-
tion referred to in Article 14(1) is likely to render impossible or seri-
ously impair the achievement of the objectives of that processing.
In such cases the Controller shall take appropriate measures to pro-
tect the Data Subject’s rights and freedoms and legitimate interests,
including making the information publicly available;
●● obtaining or disclosure is expressly laid down by EU or state law
to which the Controller is subject and which provides appropriate
measures to protect the Data Subject’s legitimate interests; or
●● where the personal data must remain confidential subject to an obli-
gation of professional secrecy regulated by EU or state law, includ-
ing a statutory obligation of secrecy (Article 14(5)).
Right of Confirmation and Right of Access
29.30 The Data Subject shall have the right to obtain from the
Controller confirmation as to whether or not personal data concerning
them are being processed, and where that is the case, access to the per-
sonal data and the following information:
●● the purposes of the processing;
●● the categories of personal data concerned;


●● the recipients or categories of recipients to whom the personal data have been or will be disclosed, in particular recipients in third coun-
tries or international organisations;
●● where possible, the envisaged period for which the personal data will
be stored, or if not possible, the criteria used to determine that period;
●● the existence of the right to request from the Controller rectification
or erasure of personal data or restriction of processing of personal
data concerning the Data Subject or to object to such processing;
●● the right to lodge a complaint to a supervisory authority;
●● where the personal data are not collected from the Data Subject, any
available information as to their source;
●● the existence of automated decision making including profiling
referred to in Article 22(1) and (4) and at least in those cases, mean-
ingful information about the logic involved, as well as the signifi-
cance and the envisaged consequences of such processing for the
Data Subject (Article 15(1)).
Where personal data are transferred to a third country or to an interna-
tional organisation, the Data Subject shall have the right to be informed
of the appropriate safeguards pursuant to Article 46 relating to the trans-
fer (Article 15(2)).
The Controller shall provide a copy of the personal data undergoing
processing. For any further copies requested by the Data Subject, the
Controller may charge a reasonable fee based on administrative costs.
Where the Data Subject makes the request by electronic means, and
unless otherwise requested by the Data Subject, the information shall be
provided in a commonly used electronic form (Article 15(3)).
The right to obtain a copy referred to in Article 15(3) shall not
adversely affect the rights and freedoms of others (Article 15(4)).
Prior to the new GDPR, the ICO issued recommendations in relation to data access, namely:
●● Access to Information Held in Complaint Files;
●● Enforced Subject Access (section 56);
●● How to Disclose Information Safely – Removing Personal Data
from Information Requests and Datasets;
●● Regulatory Activity Exemption;
●● Subject Access: Code of Practice;
●● Subject Access: Responding to a Request Checklist.
These now need to be read in light of the GDPR changes.
Rectification and Erasure
29.31 Chapter III, Section 3 of the GDPR refers to rectification and
erasure.


Prior to the new GDPR, the ICO issued recommendations in relation to personal data deletion issues, namely:
●● Deleting Personal Data.
This now needs to be read in light of the GDPR changes.
Right to Rectification
29.32 The Data Subject shall have the right to obtain from the
Controller without undue delay the rectification of personal data con-
cerning them. Taking into account the purposes of the processing, the Data Subject shall have the right to have incomplete personal data completed, including by means of providing a supplementary statement (Article 16).
Right to Erasure (Right to be Forgotten) (RtbF)
29.33 The Data Subject shall have the right to obtain from the
Controller the erasure and forgetting of personal data concerning them
without undue delay and the Controller shall have the obligation to erase
personal data without undue delay where one of the following grounds
applies:
●● the data are no longer necessary in relation to the purposes for which
they were collected or otherwise processed;
●● the Data Subject withdraws consent on which the processing is based
according to Article 6(1)(a), or Article 9(2)(a), and where there is no
other legal ground for the processing;
●● the Data Subject objects to the processing pursuant to Article 21(1)
and there are no overriding legitimate grounds for the processing, or
the Data Subject objects to the processing pursuant to Article 21(2);
●● the personal data have been unlawfully processed;
●● the data have to be erased for compliance with a legal obligation in
EU or state law to which the Controller is subject;
●● the personal data have been collected in relation to the offer of infor-
mation society services referred to in Article 8(1) (Article 17(1)).
Where the Controller has made the personal data public and is obliged
pursuant to Article 17(1) to erase the personal data, the Controller, tak-
ing account of available technology and the cost of implementation,
shall take reasonable steps, including technical measures, to inform
Controllers which are processing the personal data that the Data Subject
has requested the erasure by such Controllers of any links to, or copy or
replication of, that personal data (Article 17(2)).


Article 17(1) and (2) shall not apply to the extent that processing is
necessary:
●● for exercising the right of freedom of expression and information;
●● for compliance with a legal obligation which requires processing of
personal data by EU or state law to which the Controller is subject or
for the performance of a task carried out in the public interest or in
the exercise of official authority vested in the Controller;
●● for reasons of public interest in the area of public health in accord-
ance with Article 9(2)(h) and (i) as well as Article 9(3);
●● for archiving purposes in the public interest, scientific or histori-
cal research purposes or statistical purposes in accordance with
Article 89(1) in so far as the right referred to in Article 17(1) is
likely to render impossible or seriously impair the achievement of
the objectives of that processing; or
●● for the establishment, exercise or defence of legal claims
(Article 17(3)).
Right to Restriction of Processing
29.34 Article 18 refers to the right to restriction of processing. The
Data Subject shall have the right to obtain from the Controller the restric-
tion of the processing where one of the following applies:
●● the accuracy of the data is contested by the Data Subject, for a period
enabling the Controller to verify the accuracy of the data;
●● the processing is unlawful and the Data Subject opposes the erasure
of the personal data and requests the restriction of their use instead;
●● the Controller no longer needs the personal data for the purposes
of the processing, but they are required by the Data Subject for the
establishment, exercise or defence of legal claims;
●● the Data Subject has objected to processing pursuant to Article 21(1)
pending the verification whether the legitimate grounds of the
Controller override those of the Data Subject (Article 18(1)).
Where processing has been restricted under Article 18(1), such personal
data shall, with the exception of storage, only be processed with the Data
Subject’s consent or for the establishment, exercise or defence of legal
claims or for the protection of the rights of another natural or legal per-
son or for reasons of important public interest of the EU or of a state
(Article 18(2)).
A Data Subject who obtained the restriction of processing pursuant to
Article 18(1) shall be informed by the Controller before the restriction of
processing is lifted (Article 18(3)).

479
29.35 Social Media

Notification re Rectification, Erasure or Restriction


29.35 The Controller shall communicate any rectification or erasure
of personal data or restriction of processing carried out in accordance
with Articles 16, 17(1) and 18 to each recipient to whom the personal
data have been disclosed, unless this proves impossible or involves dis-
proportionate effort. The Controller shall inform the Data Subject about
those recipients if the Data Subject requests it (Article 19).
Right to Data Portability
29.36 The Data Subject shall have the right to receive the personal
data concerning them, which they have provided to a Controller, in
a structured and commonly used and machine-readable format and
have the right to transmit those data to another Controller without
hindrance from the Controller to which the data have been provided,
where:
●● the processing is based on consent pursuant to Article 6(1)(a) or
Article 9(2)(a) or on a contract pursuant to Article 6(1)(b); and
●● the processing is carried out by automated means (Article 20(1)).
In exercising their right to data portability pursuant to Article 20(1),
the Data Subject has the right to have the data transmitted directly from
Controller to Controller where technically feasible (Article 20(2)).
The exercise of this right shall be without prejudice to Article 17. That
right shall not apply to processing necessary for the performance of a
task carried out in the public interest or in the exercise of official author-
ity vested in the Controller (Article 20(3)).
The right referred to in Article 20(1) shall not adversely affect the
rights and freedoms of others (Article 20(4)).
Right Against Automated Individual Decision Making
29.37 Chapter III, Section 4 of the GDPR refers to the right to object
and automated individual decision making.
Right to Object
29.38 The Data Subject shall have the right to object, on grounds relat-
ing to their particular situation, at any time to processing of personal
data concerning them which is based on Article 6(1)(e) or (f), includ-
ing profiling based on these provisions. The Controller shall no longer
process the personal data unless the Controller demonstrates compelling
legitimate grounds for the processing which override the interests, rights
and freedoms of the Data Subject or for the establishment, exercise or
defence of legal claims (Article 21(1)).

Where personal data are processed for direct marketing purposes,
the Data Subject shall have the right to object at any time to the pro-
cessing of personal data concerning them for such marketing, which
includes profiling to the extent that it is related to such direct marketing
(Article 21(2)).
Where the Data Subject objects to the processing for direct market-
ing purposes, the personal data shall no longer be processed for such
purposes (Article 21(3)).
At the latest at the time of the first communication with the Data
Subject, the right referred to in Article 21(1) and (2) shall be explicitly
brought to the attention of the Data Subject and shall be presented clearly
and separately from any other information (Article 21(4)).
In the context of the use of information society services, and not-
withstanding Directive 2002/58/EC,46 the Data Subject may exercise
their right to object by automated means using technical specifications
(Article 21(5)).
Where personal data are processed for scientific or historical research
purposes or statistical purposes pursuant to Article 89(1), the Data
Subject, on grounds relating to their particular situation, shall have the
right to object to processing of personal data concerning them, unless
the processing is necessary for the performance of a task carried out for
reasons of public interest (Article 21(6)).
Rights re Automated Individual Decision Making, Including Profiling
29.39 The Data Subject shall have the right not to be subject to a deci-
sion based solely on automated processing, including profiling, which
produces legal effects concerning them or similarly significantly affects
them (Article 22(1)).
Article 22(1) shall not apply if the decision:
●● is necessary for entering into, or performance of, a contract between
the Data Subject and a Controller;
●● is authorised by EU or state law to which the Controller is subject
and which also lays down suitable measures to safeguard the Data
Subject’s rights and freedoms and legitimate interests; or
●● is based on the Data Subject’s explicit consent (Article 22(2)).
In cases referred to in Article 22(2) (a) and (c) the Controller shall
implement suitable measures to safeguard the Data Subject’s rights and
freedoms and legitimate interests, at least the right to obtain human inter-
vention on the part of the Controller, to express his or her point of view
and to contest the decision (Article 22(3)).

46 Directive 2002/58/EC of the European Parliament and of the Council of 12 July 2002
concerning the processing of personal data and the protection of privacy in the elec-
tronic communications sector (Directive on privacy and electronic communications).

Decisions referred to in Article 22(2) shall not be based on spe-
cial categories of personal data referred to in Article 9(1), unless
Article 9(2)(a) or (g) applies and suitable measures to safeguard the
Data Subject’s rights and freedoms and legitimate interests are in place
(Article 22(4)).
Communicating Data Breach to Data Subject
29.40 When a personal data breach is likely to result in a high risk to
the rights and freedoms of natural persons, the Controller shall commu-
nicate the personal data breach to the Data Subject without undue delay
(Article 34(1)).
The communication to the Data Subject shall describe in clear and
plain language the nature of the personal data breach and contain at least
the information referred to in Article 33(3)(b), (c) and (d) (Article 34(2)).
The communication to the Data Subject shall not be required if:
●● the Controller has implemented appropriate technical and organi-
sational protection measures, and those measures were applied to
the personal data affected by the personal data breach, in particular
those that render the personal data unintelligible to any person who
is not authorised to access it, such as encryption;
●● the Controller has taken subsequent measures which ensure that the
high risk for the rights and freedoms of Data Subjects referred to in
Article 34(1) is no longer likely to materialise;
●● it would involve disproportionate effort. In such cases, there shall
instead be a public communication or similar measure whereby
the Data Subjects are informed in an equally effective manner
(Article 34(3)).
If the Controller has not already communicated the personal data breach
to the Data Subject, the supervisory authority, having considered the
likelihood of the breach resulting in a high risk, may require it to do so
or may decide that any of the conditions referred to in Article 34(3) are
met (Article 34(4)).
Security of Processing
29.41 Taking into account the state of the art, the costs of implementa-
tion and the nature, scope, context and purposes of the processing as well
as the risk of varying likelihood and severity for the rights and freedoms
of natural persons, the Controller and the Processor shall implement
appropriate technical and organisational measures, to ensure a level of
security appropriate to the risk, including inter alia, as appropriate:
●● the pseudonymisation and encryption of personal data;
●● the ability to ensure the ongoing confidentiality, integrity, availabil-
ity and resilience of processing systems and services;
●● the ability to restore the availability and access to personal data in a
timely manner in the event of a physical or technical incident;
●● a process for regularly testing, assessing and evaluating the effec-
tiveness of technical and organisational measures for ensuring the
security of the processing (Article 32(1)).
In assessing the appropriate level of security, account shall be taken in
particular of the risks that are presented by data processing, in particular
from accidental or unlawful destruction, loss, alteration, unauthorised
disclosure of, or access to personal data transmitted, stored or otherwise
processed (Article 32(2)).
Adherence to an approved code of conduct pursuant to Article 40 or
an approved certification mechanism pursuant to Article 42 may be used
as an element to demonstrate compliance with the requirements set out
in Article 32(1) (Article 32(3)).
The Controller and Processor shall take steps to ensure that any natu-
ral person acting under the authority of the Controller or the Processor
who has access to personal data shall not process them except on instruc-
tions from the Controller, unless he or she is required to do so by EU or
state law (Article 32(4)).
Representatives of Controllers Not Established in EU
29.42 Where Article 3(2) applies, the Controller or the Processor shall
designate in writing a representative in the EU (Article 27(1)).
This obligation shall not apply to:
●● processing which is occasional, does not include, on a large scale,
processing of special categories of data as referred to in Article 9(1)
or processing of personal data relating to criminal convictions and
offences referred to in Article 10, and is unlikely to result in a risk
for the rights and freedoms of natural persons, taking into account
the nature, context, scope and purposes of the processing; or
●● a public authority or body (Article 27(2)).
The representative shall be established in one of the states where the
Data Subjects whose personal data are processed in relation to the
offering of goods or services to them, or whose behaviour is monitored,
are (Article 27(3)).

The representative shall be mandated by the Controller or the
Processor to be addressed in addition to or instead of the Controller or
the Processor by, in particular, supervisory authorities and Data Subjects,
on all issues related to the processing, for the purposes of ensuring com-
pliance with the GDPR (Article 27(4)).
The designation of a representative by the Controller or the Processor
shall be without prejudice to legal actions which could be initiated
against the Controller or the Processor themselves (Article 27(5)).
Chapter IV, Section 3 of the GDPR refers to Impact Assessments and
Prior Consultations.
Data Protection Impact Assessment
29.43 Where a type of processing in particular using new technolo-
gies, and taking into account the nature, scope, context and purposes
of the processing, is likely to result in a high risk for the rights and
freedoms of natural persons, the Controller shall, prior to the process-
ing, carry out an assessment of the impact of the envisaged processing
operations on the protection of personal data. A single assessment may
address a set of similar processing operations that present similar high
risks (Article 35(1)).
The Controller shall seek the advice of the DPO, where designated,
when carrying out a data protection impact assessment (Article 35(2)).
A data protection impact assessment referred to in Article 35(1) shall
in particular be required in the case of:
●● a systematic and extensive evaluation of personal aspects relating
to natural persons which is based on automated processing, includ-
ing profiling, and on which decisions are based that produce legal
effects concerning the individual or similarly significantly affect the
individual;
●● processing on a large scale of special categories of data referred to in
Article 9(1), or of data relating to criminal convictions and offences
referred to in Article 10;
●● a systematic monitoring of a publicly accessible area on a large scale
(Article 35(3)).
The supervisory authority shall establish and make public a list of the
kind of processing operations which are subject to the requirement
for a data protection impact assessment pursuant to Article 35(1).
The supervisory authority shall communicate those lists to the EDPB
(Article 35(4)).
The supervisory authority may also establish and make public a list
of the kind of processing operations for which no data protection impact
assessment is required. The supervisory authority shall communicate
those lists to the EDPB (Article 35(5)).
Prior to the adoption of the lists referred to above, the competent super-
visory authority shall apply the consistency mechanism referred to in
Article 63 where such lists involve processing activities which are related
to the offering of goods or services to Data Subjects or to the monitoring
of their behaviour in several states, or may substantially affect the free
movement of personal data within the EU (Article 35(6)).
The assessment shall contain at least:
●● a systematic description of the envisaged processing operations and
the purposes of the processing, including where applicable the legit-
imate interest pursued by the Controller;
●● an assessment of the necessity and proportionality of the processing
operations in relation to the purposes;
●● an assessment of the risks to the rights and freedoms of Data Subjects
referred to in Article 35(1); and
●● the measures envisaged to address the risks, including safeguards,
security measures and mechanisms to ensure the protection of per-
sonal data and to demonstrate compliance with the GDPR taking
into account the rights and legitimate interests of Data Subjects and
other persons concerned (Article 35(7)).
Compliance with approved codes of conduct referred to in Article 40 by
the relevant Controllers or Processors shall be taken into due account
in assessing the impact of the processing operations performed by such
Controllers or Processors, in particular for the purposes of a data protec-
tion impact assessment (Article 35(8)).
Where appropriate, the Controller shall seek the views of Data Subjects
or their representatives on the intended processing, without prejudice to
the protection of commercial or public interests or the security of the
processing operations (Article 35(9)).
Where the processing pursuant to Article 6(1)(c) or (e) has a legal
basis in EU law, or the law of the state to which the Controller is subject,
that law regulates the specific processing operation or set of operations
in question, and a data protection impact assessment has already been
carried out as part of a general impact assessment in the context of the
adoption of this legal basis, Article 35(1) to (7) shall not apply unless
states deem it to be necessary to carry out such an assessment prior to the
processing activities (Article 35(10)).
Where necessary, the Controller shall carry out a review to assess
if the processing is performed in accordance with the data protection
impact assessment at least when there is a change of the risk represented
by the processing operations (Article 35(11)).


Prior Consultation
29.44 The Controller shall consult the supervisory authority prior to
the processing where a data protection impact assessment as provided
for in Article 35 indicates that the processing would result in a high risk
in the absence of measures taken by the Controller to mitigate the risk
(Article 36(1)).
Where the supervisory authority is of the opinion that the intended
processing referred to in Article 36(1) would infringe the GDPR, in par-
ticular where the Controller has insufficiently identified or mitigated
the risk, it shall within a maximum period of eight weeks following the
request for consultation give written advice to the Controller and, where
applicable the Processor, and may use any of its powers referred to in
Article 58. This period may be extended by six weeks, taking into account
the complexity of the intended processing. The supervisory authority
shall inform the Controller and, where applicable, the Processor, of any
such extension within one month of receipt of the request for consulta-
tion together with the reasons for the delay. These periods may be sus-
pended until the supervisory authority has obtained any information it
has requested for the purposes of the consultation (Article 36(2)).
When consulting the supervisory authority pursuant to Article 36(1),
the Controller shall provide the supervisory authority with:
●● where applicable, the respective responsibilities of Controller, joint
Controllers and Processors involved in the processing, in particular
for processing within a group of undertakings;
●● the purposes and means of the intended processing;
●● the measures and safeguards provided to protect the rights and free-
doms of Data Subjects pursuant to the GDPR;
●● where applicable, the contact details of the DPO;
●● the data protection impact assessment provided for in Article 35; and
●● any other information requested by the supervisory authority
(Article 36(3)).
States shall consult the supervisory authority during the preparation of a
proposal for a legislative measure to be adopted by a national parliament,
or of a regulatory measure based on such a legislative measure, which
relates to the processing (Article 36(4)).
Notwithstanding Article 36(1), states’ law may require Controllers to
consult with, and obtain prior authorisation from, the supervisory author-
ity in relation to the processing by a Controller for the performance of
a task carried out by the Controller in the public interest, including pro-
cessing in relation to social protection and public health (Article 36(5)).

Chapter 30
Leveson, the Press and Data
Protection

Introduction
30.01 The UK Leveson Report deals in detail with certain data protection
issues, namely the recommendations relating to data protection and
journalism. The evidence and issues are more fully described in Part H,
5 of the Report.

DPA 1998, s 32
30.02 The Data Protection Act 1998 (DPA 1998), s 32, referring to jour-
nalism activities, provided:
‘(1) Personal data which are processed only for the special purposes are
exempt from any provision to which this subsection relates if—
(a) the processing is undertaken with a view to the publication by any
person of any journalistic, literary or artistic material,
(b) the data Controller reasonably believes that, having regard in par-
ticular to the special importance of the public interest in freedom of
expression, publication would be in the public interest, and
(c) the data Controller reasonably believes that, in all the circum-
stances, compliance with that provision is incompatible with the
special purposes.
(2) Subsection (1) relates to the provisions of—
(a) the data protection principles except the seventh data protection
principle,
(b) section 7,
(c) section 10,
(d) section 12, and
(e) section 14(1) to (3).


(3) In considering for the purposes of subsection (1)(b) whether the belief
of a data Controller that publication would be in the public interest was or
is a reasonable one, regard may be had to his compliance with any code of
practice which—
(a) is relevant to the publication in question, and
(b) is designated by the [Secretary of State] by order for the purposes
of this subsection.
(4) Where at any time (‘the relevant time’) in any proceedings against a data
Controller under section 7(9), 10(4), 12(8) or 14 or by virtue of section 13
the data Controller claims, or it appears to the court, that any personal data to
which the proceedings relate are being processed—
(a) only for the special purposes, and
(b) with a view to the publication by any person of any journalistic,
literary or artistic material which, at the time twenty-four hours
immediately before the relevant time, had not previously been pub-
lished by the data Controller,
the court shall stay the proceedings until either of the conditions in
subsection (5) is met.
(5) Those conditions are—
(a) that a determination of the Commissioner under section 45 with
respect to the data in question takes effect, or
(b) in a case where the proceedings were stayed on the making of a
claim, that the claim is withdrawn.
(6) For the purposes of this Act ‘publish’, in relation to journalistic, literary
or artistic material, means make available to the public or any section of the
public.’
Lord Lester of Herne Hill is referred to in the Report (p 1067) as having
‘warned at length that, as drafted and because of cl 31, the DPA failed
to implement the Directive and authorised interference by the press with
the right to privacy in breach of Art 8 of the ECHR.’
At page 1068 of the Report, it refers to:
‘Mr Coppel’s arguments … would be that on the current state of the UK
authorities, s 32 fails to implement the Directive from which it derives, and
is inconsistent with the relevant parts of the ECHR to which it is intended to
give effect, because the relationship between privacy and expression rights
has got out of balance. A proper balance is a fundamental obligation. The UK
is therefore positively required to change the law to restore the balance. That
is indeed Mr Coppel’s own contention: that UK data protection law currently
fails to implement our obligations, and that Lord Lester’s concerns had proved
to be prescient.’

The Report itself then states:


‘2.11 Without going so far as that, even if the current balance were within
the spectrum permitted by our international obligations, the argument could
be expressed in terms that it is at an extreme end of that spectrum, and the
UK can as a matter of law, and should as a matter of policy, restore a more
even-handed approach, not least given the asymmetry of risks and harms as
between the individual and the press.

2.12 Put at its very lowest, the point could be made that the effect of the
development of the case law has been to push personal privacy law in media
cases out of the data protection regime and into the more open seas of the
Human Rights Act. This has happened for no better reason than the slowness
of the legal profession to assimilate data protection law and, in the case of
the judiciary, its greater familiarity with (and, he suggests, perhaps a prefer-
ence for) the latitude afforded by the human rights regime over the specificity
of data protection. But this, the argument goes, is undesirable because the
data protection regime is much more predictable, detailed and sophisticated
in the way it protects and balances rights, and significantly reduces the risks,
uncertainties and expense of litigation concomitant on more open-textured
law dependent on a court’s discretion. Where the law has provided specific
answers, the fine-nibbed pen should be grasped and not the broad brush. The
balancing of competing rights in a free democracy is a highly sophisticated
exercise; appropriate tools have been provided for the job and should be used.’

Leveson Recommendations
To the Ministry of Justice
30.03 The Leveson Report makes the following recommendations:
‘48. The exemption in section 32 of the Data Protection Act 1998 should
be amended so as to make it available only where: (a) the processing of
data is necessary for publication, rather than simply being in fact undertaken
with a view to publication; (b) the data Controller reasonably believes that
the relevant publication would be or is in the public interest, with no special
weighting of the balance between the public interest in freedom of expression
and in privacy; and (c) objectively, that the likely interference with privacy
resulting from the processing of the data is outweighed by the public interest
in publication.

49. The exemption in section 32 of the Data Protection Act 1998 should
be narrowed in scope, so that it no longer allows, by itself, for exemption
from: (a) the requirement of the first data protection principle to process
personal data fairly (except in relation to the provision of information to the
Data Subject under paragraph 2(1)(a) of Part II Schedule 1 to the 1998 Act)
and in accordance with statute law; (b) the second data protection principle
(personal data to be obtained only for specific purposes and not processed
incompatibly with those purposes); (c) the fourth data protection principle
(personal data to be accurate and kept up to date); (d) the sixth data protec-
tion principle (personal data to be processed in accordance with the rights of
individuals under the Act); (e) the eighth data protection principle (restrictions
on exporting personal data); and (f) the right of subject access. The recom-
mendation on the removal of the right of subject access from the scope of
section 32 is subject to any necessary clarification that the law relating to the
protection of journalists’ sources is not affected by the Act.

50. It should be made clear that the right to compensation for distress con-
ferred by section 13 of the Data Protection Act 1998 is not restricted to cases
of pecuniary loss, but should include compensation for pure distress.

51. The procedural provisions of the Data Protection Act 1998 with special
application to journalism in: (a) section 32(4) and (5); (b) sections 44 to 46
inclusive should be repealed.

52. In conjunction with the repeal of those procedural provisions, considera-
tion should be given to the desirability of including in the Data Protection Act
1998 a provision to the effect that, in considering the exercise of any powers
in relation to the media or other publishers, the Information Commissioner’s
Office should have special regard to the obligation in law to balance the public
interest in freedom of expression alongside the public interest in upholding
the data protection regime.

53. Specific provision should be made to the effect that, in considering the
exercise of any of its powers in relation to the media or other publishers, the
Information Commissioner’s Office must have regard to the application to a
data Controller of any relevant system of regulation or standards enforcement
which is contained in or recognised by statute.

54. The necessary steps should be taken to bring into force the amendments
made to section 55 of the Data Protection Act 1998 by section 77 of the Crimi-
nal Justice and Immigration Act 2008 (increase of sentence maxima) to the
extent of the maximum specified period; and by section 78 of the 2008 Act
(enhanced defence for public interest journalism).

55. The prosecution powers of the Information Commissioner should be
extended to include any offence which also constitutes a breach of the data
protection principles.

56. A new duty should be introduced (whether formal or informal) for the
Information Commissioner’s Office to consult with the Crown Prosecu-
tion Service in relation to the exercise of its powers to undertake criminal
proceedings.

57. The opportunity should be taken to consider amending the Data Protection
Act 1998 formally to reconstitute the Information Commissioner’s Office as
an Information Commission, led by a Board of Commissioners with suitable
expertise drawn from the worlds of regulation, public administration, law and
business, and active consideration should be given in that context to the desir-
ability of including on the Board a Commissioner from the media sector.’


To the ICO
30.04 The Leveson Report also makes recommendations to the ICO.
These are:
‘58. The Information Commissioner’s Office should take immediate steps to
prepare, adopt and publish a policy on the exercise of its formal regulatory
functions in order to ensure that the press complies with the legal require-
ments of the data protection regime.

59. In discharge of its functions and duties to promote good practice in areas
of public concern, the Information Commissioner’s Office should take imme-
diate steps, in consultation with the industry, to prepare and issue compre-
hensive good practice guidelines and advice on appropriate principles and
standards to be observed by the press in the processing of personal data. This
should be prepared and implemented within six months from the date of this
Report.

60. The Information Commissioner’s Office should take steps to prepare and
issue guidance to the public on their individual rights in relation to the obtain-
ing and use by the press of their personal data, and how to exercise those
rights.

61. In particular, the Information Commissioner’s Office should take immedi-
ate steps to publish advice aimed at individuals (Data Subjects) concerned
that their data have or may have been processed by the press unlawfully or
otherwise than in accordance with good practice.

62. The Information Commissioner’s Office, in the Annual Report to Parlia-
ment which it is required to make by virtue of section 52(1) of the Act, should
include regular updates on the effectiveness of the foregoing measures, and
on the culture, practices and ethics of the press in relation to the processing
of personal data.

63. The Information Commissioner’s Office should immediately adopt the
Guidelines for Prosecutors on assessing the public interest in cases affecting
the media, issued by the Director of Public Prosecutions in September 2012.

64. The Information Commissioner’s Office should take immediate steps to
engage with the Metropolitan Police on the preparation of a long-term strat-
egy in relation to alleged media crime with a view to ensuring that the Office
is well placed to fulfil any necessary role in this respect in the future, and in
particular in the aftermath of Operations Weeting, Tuleta and Elveden.

65. The Information Commissioner’s Office should take the opportunity to
review the availability to it of specialist legal and practical knowledge of the
application of the data protection regime to the press, and to any extent neces-
sary address it.

66. The Information Commissioner’s Office should take the opportunity to
review its organisation and decision-making processes to ensure that large-
scale issues, with both strategic and operational dimensions (including the
relationship between the culture, practices and ethics of the press in relation
to personal information on the one hand, and the application of the data pro-
tection regime to the press on the other) can be satisfactorily considered and
addressed in the round.’

Increased Sentencing for Data Breach


30.05 The Leveson Report also makes further recommendations on the law.
These are:
‘67. On the basis that the provisions of s 77–78 of the Criminal Justice and
Immigration Act 2008 are brought into effect, so that increased sentencing
powers are available for breaches of s 55 of the Data Protection Act 1998.

68. The Secretary of State for Justice should use the power vested in him by
s 124(1)(a)(i) of the Coroners and Justice Act 2009 to invite the Sentencing
Council of England and Wales to prepare guidelines in relation to data protec-
tion offences (including computer misuse).’

Comparison
30.06 A comparison of the DPA 1998 and the Leveson comments is
set out below.
DPA 1998 (s 32):

(1) Personal data which are processed only for the special purposes are exempt from any provision to which this subsection relates if—
 (a) the processing is undertaken with a view to the publication by any person of any journalistic, literary or artistic material,
 (b) the data Controller reasonably believes that, having regard in particular to the special importance of the public interest in freedom of expression, publication would be in the public interest, and
 (c) the data Controller reasonably believes that, in all the circumstances, compliance with that provision is incompatible with the special purposes.
(2) Sub-s (1) relates to the provisions of—
 (a) the data protection principles except the seventh data protection principle,
 (b) s 7,
 (c) s 10,
 (d) s 12, and
 (e) s 14(1) to (3).
(3) In considering for the purposes of sub-s (1)(b) whether the belief of a data Controller that publication would be in the public interest was or is a reasonable one, regard may be had to his compliance with any code of practice which—
 (a) is relevant to the publication in question, and
 (b) is designated by the [Secretary of State] by order for the purposes of this sub-section.
(4) Where at any time (‘the relevant time’) in any proceedings against a data Controller under s 7(9), 10(4), 12(8) or 14 or by virtue of s 13 the data Controller claims, or it appears to the court, that any personal data to which the proceedings relate are being processed—
 (a) only for the special purposes, and
 (b) with a view to the publication by any person of any journalistic, literary or artistic material which, at the time twenty-four hours immediately before the relevant time, had not previously been published by the data Controller,
the court shall stay the proceedings until either of the conditions in sub-s (5) is met.
(5) Those conditions are—
 (a) that a determination of the Commissioner under s 45 with respect to the data in question takes effect, or
 (b) in a case where the proceedings were stayed on the making of a claim, that the claim is withdrawn.
(6) For the purposes of this Act ‘publish’, in relation to journalistic, literary or artistic material, means make available to the public or any section of the public.

Leveson:

48 The exemption in s 32 of the Data Protection Act 1998 should be amended so as to make it available only where: (a) the processing of data is necessary for publication, rather than simply being in fact undertaken with a view to publication; (b) the data Controller reasonably believes that the relevant publication would be or is in the public interest, with no special weighting of the balance between the public interest in freedom of expression and in privacy; and (c) objectively, that the likely interference with privacy resulting from the processing of the data is outweighed by the public interest in publication.
49 The exemption in s 32 of the Data Protection Act 1998 should be narrowed in scope, so that it no longer allows, by itself, for exemption from: (a) the requirement of the first data protection principle to process personal data fairly (except in relation to the provision of information to the Data Subject under paragraph 2(1)(a) of Part II Schedule 1 to the 1998 Act) and in accordance with statute law; (b) the second data protection principle (personal data to be obtained only for specific purposes and not processed incompatibly with those purposes); (c) the fourth data protection principle (personal data to be accurate and kept up to date); (d) the sixth data protection principle (personal data to be processed in accordance with the rights of individuals under the Act); (e) the eighth data protection principle (restrictions on exporting personal data); and (f) the right of subject access. The recommendation on the removal of the right of subject access from the scope of s 32 is subject to any necessary clarification that the law relating to the protection of journalists’ sources is not affected by the Act.
50 It should be made clear that the right to compensation for distress conferred by s 13 of the Data Protection Act 1998 is not restricted to cases of pecuniary loss, but should include compensation for pure distress.
51 The procedural provisions of the Data Protection Act 1998 with special application to journalism in: (a) s 32(4) and (5); (b) ss 44–46 inclusive should be repealed.
52 In conjunction with the repeal of those procedural provisions, consideration should be given to the desirability of including in the Data Protection Act 1998 a provision to the effect that, in considering the exercise of any powers in relation to the media or other publishers, the Information Commissioner’s Office should have special regard to the obligation in law to balance the public interest in freedom of expression alongside the public interest in upholding the data protection regime.
53 Specific provision should be made to the effect that, in considering the exercise of any of its powers in relation to the media or other publishers, the Information Commissioner’s Office must have regard to the application to a data controller of any relevant system of regulation or standards enforcement which is contained in or recognised by statute.

Conclusion
30.07 The Data Protection Act 2018 (DPA 2018), s 177 makes provision for the ICO to produce guidance on the steps needed in seeking redress against media organisations. In addition, DPA 2018, s 179 requires the Secretary of State to issue reports on the effectiveness of media-related dispute resolution procedures. Schedule 17 also refers to reviews of processing of personal data for journalism purposes.

Chapter 31
Data Protection Officer

Introduction
31.01 Many organisations are now required to have a Data Protection
Officer (DPO). In addition, the role and task requirements are now
more explicit. The DPO must also have an appropriate independence
in their activities and cannot be compromised or dictated to in a manner
which undermines their role and duties in relation to personal data. It is
now clear that the profession of the independent and expert DPO has
arrived.

New Role of DPO


31.02 Chapter IV, Section 4 of the new EU General Data Protection
Regulation (GDPR) refers to the new role and requirement of DPOs and
the obligation for organisations to appoint DPOs.
The Controller and the Processor shall designate a DPO in any case
where:
●● the processing is carried out by a public authority or body, except for
courts acting in their judicial capacity;
●● the core activities of the Controller or the Processor consist of pro-
cessing operations which, by virtue of their nature, their scope and/
or their purposes, require regular and systematic monitoring of Data
Subjects on a large scale;
●● the core activities of the Controller or the Processor consist of pro-
cessing on a large scale of special categories of data pursuant to
Article 9 and personal data relating to criminal convictions and
offences referred to in Article 10 (Article 37(1)).


Tasks and Role


31.03 The Controller or the Processor shall ensure that the DPO is
involved, properly and in a timely manner in all issues which relate to
the protection of personal data (Article 38(1)).
The DPO shall have at least the following tasks:
●● to inform and advise the Controller or the Processor and the employ-
ees who carry out processing of their obligations pursuant to the
GDPR and to other EU or state data protection provisions;
●● to monitor compliance with the GDPR, with other EU or state data
protection provisions and with the policies of the Controller or
Processor in relation to the protection of personal data, including
the assignment of responsibilities, awareness-raising and train-
ing of staff involved in the processing operations, and the related
audits;
●● to provide advice where requested as regards the data protec-
tion impact assessment and monitor its performance pursuant to
Article 35;
●● to cooperate with the supervisory authority;
●● to act as the contact point for the supervisory authority on issues
related to the processing, including the prior consultation referred
to in Article 36, and consult, as appropriate, on any other matter
(Article 39(1)).
The DPO shall in the performance of their tasks have due regard to the
risk associated with the processing operations, taking into account the
nature, scope, context and purposes of the processing (Article 39(2)).
The DPO may fulfil other tasks and duties. The Controller or Proces-
sor shall ensure that any such tasks and duties do not result in a conflict
of interests (Article 38(6)).
The DPO will also supervise, advise and/or assist in relation to PIAs
and monitor performance and make recommendations. They will also
monitor and deal with any requests from the supervisory authority and
also be the designated contact for the supervisory authority.
It is important that the DPO has regard to the risks that may arise
specific to the organisation in relation to personal data and processing
issues (including security issues). Depending on the sector, there may
also be a Code of Conduct and/or certification issues for the DPO to be
also concerned with. Data protection seals and certification are meant to
help organisations demonstrate compliance.


Group DPO
31.04 A group of undertakings may appoint a single DPO provided
that a DPO is easily accessible from each establishment (Article 37(2)).
Where the Controller or the Processor is a public authority or body,
a single DPO may be designated for several such authorities or bodies,
taking account of their organisational structure and size (Article 37(3)).
In cases other than those referred to in Article 37(1), the Controller
or Processor or associations and other bodies representing categories of
Controllers or Processors may or, where required by EU or state law
shall, designate a DPO. The DPO may act for such associations and other
bodies representing Controllers or Processors (Article 37(4)).
Qualifications and Expertise of DPO
31.05 The DPO shall be designated on the basis of professional
qualities and, in particular, expert knowledge of data protection law
and practices and the ability to fulfil the tasks referred to in Article 39
(Article 37(5)).
The DPO may be a staff member of the Controller or Processor, or
fulfil the tasks on the basis of a service contract (Article 37(6)).
Contact Details
31.06 The Controller or the Processor shall publish the contact
details of the DPO and communicate these to the supervisory authority
(Article 37(7)).
Data Subjects may contact the DPO on all issues related to the pro-
cessing of the Data Subject’s personal data and the exercise of their
rights under the GDPR (Article 38(4)).
Reporting
31.07 The DPO shall directly report to the highest management level
of the Controller or the Processor (Article 38(3)).
Independent in Role and Tasks
31.08 The Controller or Processor shall ensure that the DPO does not
receive any instructions regarding the exercise of their tasks. He or she
shall not be dismissed or penalised by the Controller or the Processor for
performing their tasks (Article 38(3)).


Resources
31.09 The Controller or Processor shall support the DPO in perform-
ing the tasks referred to in Article 39 by providing resources necessary
to carry out those tasks and access to personal data and processing opera-
tions, and to maintain his or her expert knowledge (Article 38(2)).

Summary
31.10 In summary, organisations must designate a DPO to:
●● monitor internal compliance with the GDPR regime and rules;
●● ensure governance of the organisation’s data management;
●● draft and update compliant data protection policies;
●● implement systems, changes and functions in terms of being
compliant.
The DPO should be qualified and have particular expertise in data pro-
tection law and practice. They need to be able to fulfil their tasks in
compliance and conformity with the GDPR. It appears they may be an
employee or a contractor.
The DPO details must be made available publicly and the supervisory
authority (such as the ICO) should be notified.
The organisation must involve the DPO in a timely manner in relation
to all issues in relation to the protection of personal data and Data Sub-
ject issues. Proper and adequate resources must be supplied to the DPO
by the organisation in order that they can undertake their tasks. There is
an obligation that the DPO has independence in their role and functions,
and that they cannot be controlled or micromanaged or instructed in rela-
tion to their tasks.
The DPO will report to the Board or highest management level as
appropriate. This also emphasises the increasing importance attached to
data protection understanding and compliance.
The DPO advises the organisation and employees in relation to their
data protection obligations under national law and the GDPR. They will
also monitor compliance with the data protection legal regime as well as
internal policies. They will also be involved in assigning responsibilities,
raising awareness and staff education and training.
DPOs should be highlighting the changes and the new GDPR to the
organisation. Key issues need to be identified to appropriate manage-
ment. New and ongoing change and compliance issues need appropriate
resourcing. The DPO should assess what personal data the organisation
collects and processes, for what purpose, and where it is located and
secured. Particular attention is needed with regard to outsourcing issues


and contracts with Processors. Contracts, including service level agreements in relation to IT systems, cloud, etc may be assessed. The various IT hardware, software and systems that employees use need to be considered.
The structure of the organisation needs to be considered, as well as
jurisdiction and location issues. The life cycles, storage and disposal of
personal data is also an important consideration for the new DPO.
The processes, policies and documentation must be maintained by the
organisation, which places particular obligations on the DPO to consider
the different documentation sets.
For further details see The Data Protection Officer, Profession, Rules
and Role.1 The ICO also provides commentary on the importance of the
DPO.2

1 Paul Lambert, The Data Protection Officer, Profession, Rules and Role (Routledge,
Taylor & Francis, 2017).
2 ICO, ‘Data Protection Officers,’ at https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance/data-protection-officers/.

Chapter 32
Brexit, Privacy Shield and
Schrems

Introduction
32.1 The recent Schrems PRISM case1 (Schrems II) is important to
consider. This is a request for a preliminary ruling under Article 267
TFEU from the High Court (in Ireland) in proceedings before it involv-
ing privacy campaigner Max Schrems, Facebook and the Irish Data Pro-
tection Commission. A range of other interested parties, including the
US Government, EU national states, and other privacy, data protection
and industry parties, have also made submissions.
Maximilian Schrems, in what was an evolving complaint, effectively sought to have the local data regulator (now the Data Protection Commission) impose a ban or restriction on the EU Facebook entity, based in Ireland, transferring the personal data of EU citizens to the US Facebook entity, because such data may become subject to access by (certain) US investigative agencies, in particular the CIA and FBI. The case refers in particular to US surveillance known as signals intelligence provisions under the Foreign Intelligence Surveillance Act (FISA) (specifically section 702) and a particular Executive Order (also known as a Presidential Directive) (EO 12333).
From one perspective the case can be seen as a data retention-type case
as opposed to a core (commercial) data protection case. (As indicated
earlier, data retention often proves to be controversial and is sometimes

1 Data Protection Commissioner v Facebook Ireland and Maximillian Schrems, Schrems II, Case C-311/18, ECLI:EU:C:2020:559, 16 July 2020.


also challenged2.) Regardless, the decision already has important impli-


cations for companies and citizens in the UK, and (other) EU member
states – not to mention the Commission and US authorities. Significant
data transfers occur between the EU and US, and a large number (over
5,300) of US companies have registered in the US as being compliant
with the EU–US Privacy Shield, and no doubt made representations to
that effect to their EU business partners. (There are also potential knock-
on implications for the separate Swiss–US Privacy Shield, which is to a
large extent modelled on the EU–US Privacy Shield).
The issues at stake in Schrems II are the interpretation and legality of the:
●● standard contractual clause data transfers (ie the EU SCC Decision
providing for such contracts for data transfers); and
●● Privacy Shield data transfers (ie the EU Privacy Shield Decision
providing for such data transfers).
As indicated earlier, data transfers of personal data from inside the EU to
outside it are subject to a default transfer ban (see GDPR Art 44). Only if
a particular transfer can fit within one of the limited exemption or legiti-
mising mechanisms may a data transfer become permissible.
The EU recognises a mechanism for data transfers from the EU to third
countries based on standard contractual clauses (SCC) per the SCC Deci-
sion. These standard contracts are expressly provided for in the official
decision,3 and provide parties with a lawful mechanism for individualised data transfers by way of contracts incorporating particular clauses and safeguards specified by the Commission (the so-called SCC Decision).4
The new Schrems decision upholds the validity of the SCC Decision –
and hence the existing data transfer contracts and clauses which are com-
pliant with the SCC Decision. Data transfers can continue insofar as they
are compliant with this transfer mechanism – but importantly new such
contracts may have to be read in light of this new decision (see below).
New data contracts will become more involved and complicated as a result of the decision. Indeed, it would also be wise, if not recommended, for those with existing SCC contracts in place, particularly older ones, to review them. It may be that some or all existing SCC contracts may benefit from being updated or have further provisions added by way of supplementary agreement.

2 The earlier Data Retention Directive, for example, was declared invalid in the Digital Rights Ireland case: Case C-293/12.
3 In fact, and technically speaking, there are two such standard contract mechanisms. One refers to data transfers from a Controller to a Controller, and the other refers to data transfers from a Controller to a Processor. While only one of the SCC mechanisms is the focus in this case, there is growing commentary that the same principles elucidated in the case should apply to both mechanisms.
4 See the interpretation and validity of Commission Decision 2010/87/EU of 5 February 2010 on standard contractual clauses for the transfer of personal data to processors established in third countries under Directive 95/46 (OJ 2010 L 39, p 5), as amended by Commission Implementing Decision (EU) 2016/2297 of 16 December 2016 (OJ 2016 L 344, p 100). This is the Commission Decision 2010/87.
Recall that the Privacy Shield was effectively a negotiated agreement
between EU and US authorities providing a lawful data transfer mecha-
nism for all data transfers by organisations signing up to the Privacy
Shield registration process and thereby agreeing to regulate their data
transfers in accordance with the Privacy Shield rules.
The need for the Privacy Shield arose as the result of an earlier data
transfer mechanism known as the Safe Harbour arrangement being
declared invalid.5 The replacement Privacy Shield Decision6 was effectively being challenged in this new Schrems case.

Issues and Questions


32.2 There is obviously a long-running background to this decision.
This runs from the EU–US provisions providing for lawful commercial
data transfers from the EU to the US, to the Snowden disclosures, to the
Schrems I7 and Digital Rights Ireland cases.8 As with most cases, it is
important to consider the specific issues asked of the court as the ulti-
mate decision is tailored to, if not contingent on, these specific questions.
The questions before the EU court from the referring national court
(para 68) are:
(Q1) In circumstances in which personal data is transferred by a pri-
vate company from a European Union (EU) Member State to
a private company in a third country for a commercial purpose
pursuant to [the SCC Decision] and may be further processed in
the third country by its authorities for purposes of national secu-
rity but also for purposes of law enforcement and the conduct of
the foreign affairs of the third country, does EU law (including
the Charter) apply to the transfer of the data notwithstanding the
provisions of Article 4(2) TEU in relation to national security and the provisions of the first indent of Article 3(2) of Directive [95/46] in relation to public security, defence and State security?

5 Schrems I, Case C-362/14, EU:C:2015:650.
6 Commission Implementing Decision (EU) 2016/1250 of 12 July 2016 pursuant to Directive 95/46 on the adequacy of the protection provided by the EU–US Privacy Shield (OJ 2016 L 207, p 1).
7 Schrems I, Case C-362/14, EU:C:2015:650.
8 Digital Rights Ireland, Case C-293/12, [2015] QB 127, [2014] 3 WLR 1067, [2014] 2 All ER (Comm) 1.
(Q2)
(a) In determining whether there is a violation of the rights of an
individual through the transfer of data from the [European
Union] to a third country under the [SCC Decision] where
it may be further processed for national security purposes, is
the relevant comparator for the purposes of [Directive 95/46]:
(i) the Charter, the EU Treaty, the FEU Treaty, [Directive
95/46], the [European Convention for the Protection of
Human Rights and Fundamental Freedoms, signed at
Rome on 4 November 1950] (or any other provision of
EU law); or
(ii) the national laws of one or more Member States?
(b) If the relevant comparator is (ii), are the practices in the con-
text of national security in one or more Member States also
to be included in the comparator?
(Q3) When assessing whether a third country ensures the level of protec-
tion required by EU law to personal data transferred to that country
for the purposes of Article 26 of [Directive 95/46], ought the level of
protection in the third country be assessed by reference to:
(a) the applicable rules in the third country resulting from its
domestic law or international commitments, and the practice
designed to ensure compliance with those rules, to include
the professional rules and security measures which are com-
plied with in the third country; or
(b) the rules referred to in (a) together with such administrative,
regulatory and compliance practices and policy safeguards,
­procedures, protocols, oversight mechanisms and non-judi-
cial remedies as are in place in the third country?
(Q4) Given the facts found by the High Court in relation to US law,
if personal data is transferred from the European Union to the
United States under [the SCC Decision] does this violate the
rights of individuals under Articles 7 and/or 8 of the Charter?
(Q5) Given the facts found by the High Court in relation to US law,
if personal data is transferred from the European Union to the
United States under [the SCC Decision]:
(a) does the level of protection afforded by the United States
respect the essence of an individual’s right to a judicial rem-
edy for breach of his or her data privacy rights guaranteed by
Article 47 of the Charter?
If the answer to Question 5(a) is in the affirmative:
(b) are the limitations imposed by US law on an individual’s right
to a judicial remedy in the context of US national security
proportionate within the meaning of Article 52 of the Charter and do not exceed what is necessary in a democratic society
for national security purposes?
(Q6)
(a) What is the level of protection required to be afforded to per-
sonal data transferred to a third country pursuant to standard
contractual clauses adopted in accordance with a decision of
the Commission under Article 26(4) [of Directive 95/46] in
light of the provisions of [Directive 95/46] and in particular
Articles 25 and 26 read in the light of the Charter?
(b) What are the matters to be taken into account in assessing
whether the level of protection afforded to data transferred
to a third country under [the SCC Decision] satisfies the
requirements of [Directive 95/46] and the Charter?
(Q7) Does the fact that the standard contractual clauses apply as
between the data exporter and the data importer and do not bind
the national authorities of a third country who may require the
data importer to make available to its security services for further
processing the personal data transferred pursuant to the clauses
provided for in [the SCC Decision] preclude the clauses from
adducing adequate safeguards as envisaged by Article 26(2) of
[Directive 95/46]?
(Q8) If a third country data importer is subject to surveillance laws
that in the view of a data protection authority conflict with the
[standard contractual clauses] or Article 25 and 26 of [Directive
95/46] and/or the Charter, is a data protection authority required
to use its enforcement powers under ­Article 28(3) of [Directive
95/46] to suspend data flows or is the exercise of those powers
limited to exceptional cases only, in light of recital 11 of [the
SCC Decision], or can a data protection authority use its discre-
tion not to suspend data flows?
(Q9)
(a) For the purposes of Article 25(6) of [Directive 95/46], does
[the Privacy Shield Decision] constitute a finding of gen-
eral application binding on data protection authorities and
the courts of the Member States to the effect that the United
States ensures an adequate level of protection within the
meaning of Article 25(2) of [Directive 95/46] by reason of
its domestic law or of the international commitments it has
entered into?
(b) If it does not, what relevance, if any, does the Privacy
Shield Decision have in the assessment conducted into the
adequacy of the safeguards provided to data transferred to
the United States which is transferred pursuant to the [SCC
Decision]?


(Q10) Given the findings of the High Court in relation to US law, does
the provision of the Privacy Shield ombudsperson under Annex
A to Annex III to the Privacy Shield Decision when taken in con-
junction with the existing regime in the United States ensure that
the US provides a remedy to data subjects whose personal data is
transferred to the United States under the [SCC Decision] that is
compatible with Article 47 of the Charter?
(Q11) Does the [SCC Decision] violate Articles 7, 8 and/or 47 of the
Charter?
As sometimes happens, the answers ultimately provided are grouped together rather than given individually.
Interestingly, the referring court asks ‘Does the [SCC Decision] violate … the Charter?’ It does not ask ‘Does the [Privacy Shield Decision] violate … the Charter?’ Yet the Court strikes down and invalidates the Privacy Shield nonetheless.
While the referring national court had already made certain adverse findings in relation to US law and the protections and remedies available in the US for EU citizens in relation to their data which has been transferred to the US, it has been suggested that only the EU court could actually invalidate an EU law decision (such as the SCC Decision or the Privacy Shield Decision).
The decision also exemplified how important data protection cases can sometimes take a long, and sometimes surprising, route. Despite previous criticisms, it was actually the Irish Data Protection Commission which had taken an adverse view of the standing of the Privacy Shield and which initiated the High Court case (because of the prior Schrems complaint). Schrems and Facebook actually became technical defendants in the case. It is also worth pointing out that when the Schrems complaint originated, the Privacy Shield itself was not yet in existence.

Privacy Shield
Decision
32.3 The court invalidated the Privacy Shield Decision in what is the most directly and immediately impactful part of the judgment. The
official summary states that:
‘[T]he Court examines the validity of Decision 2016/1250 in the light of the
requirements arising from the GDPR, read in the light of the provisions of
the Charter guaranteeing respect for private and family life, personal data
­protection and the right to effective judicial protection. In that regard, the Court
notes that that decision enshrines the position, as did Decision 2000/520, that

the requirements of US national security, public interest and law enforcement have primacy, thus condoning interference with the fundamental rights of persons whose data are transferred to that third country. In the view of the Court,
the limitations on the protection of personal data arising from the domestic
law of the United States on the access and use by US public authorities of
such data transferred from the European Union to that third country, which
the Commission assessed in Decision 2016/1250, are not circumscribed in a
way that satisfies requirements that are essentially equivalent to those required
under EU law, by the principle of proportionality, in so far as the surveillance
programmes based on those provisions are not limited to what is strictly nec-
essary. On the basis of the findings made in that decision, the Court pointed
out that, in respect of certain surveillance programmes, those provisions do
not indicate any limitations on the power they confer to implement those
programmes, or the existence of guarantees for potentially targeted non-US
persons. The Court adds that, although those provisions lay down require-
ments with which the US authorities must comply when implementing the
surveillance programmes in question, the provisions do not grant data subjects
actionable rights before the courts against the US authorities.’9

It continues that:
‘As regards the requirement of judicial protection, the Court holds that,
contrary to the view taken by the Commission in Decision 2016/1250, the
Ombudsperson mechanism referred to in that decision does not provide data
subjects with any cause of action before a body which offers guarantees sub-
stantially equivalent to those required by EU law, such as to ensure both the
independence of the Ombudsperson provided for by that mechanism and the
existence of rules empowering the Ombudsperson to adopt decisions that
are binding on the US intelligence services. On all those grounds, the Court
declares Decision 2016/1250 invalid.’10

The court decision at para 5 (p 44) finds that:


‘Commission Implementing Decision (EU) 2016/1250 of 12 July 2016 pursu-
ant to directive 95/46/EC of the European Parliament and of the Council on
the adequacy of the protection provided by the EU – US Privacy Shield is
invalid.’

Implications
32.4 The immediate impact is that companies seeking to rely on Pri-
vacy Shield as the lawful mechanism for the transfer of data to the US

9 ‘The Court of Justice invalidates Decision 2016/1250 on the adequacy of the protec-
tion provided by the EU–US Data Protection Shield. However, it considers that Com-
mission Decision 2010/87 on standard contractual clauses for the transfer of personal
data to processors established in third countries is valid,’ Court of Justice of the Euro-
pean Union, press release No 91/20, 16 July 2020.
10 ibid.

32.4 Brexit, Privacy Shield and Schrems

can no longer rely on it. (There is a technical issue, however, in that
the recipient companies in the US who have already registered with US
authorities for Privacy Shield are possibly as a matter of US law still
bound by their committed obligations.)
New companies seeking to legitimise data transfer from the EU to the
US must identify a Plan B given that the often preferred Plan A is now
invalidated.
It may come to pass that the EU and US agree a new Privacy Shield
2.0 or some other transfer arrangement, but there is no public timeline
for this, and already some commentators have suggested that it may not
be feasible prior to the 2020 US presidential election.
Existing US companies may need, therefore, to engage with their EU
partners to identify alternative legal routes where available. This may
mean consideration of the standard contractual clauses route (see further
below) which has been recognised as legally valid by the court.
Brexit
32.5 One Brexit-related impact is that the decision puts an even greater
spotlight on political negotiations and the preferred Brexit solution of an
EU Adequacy Decision finding. The case also suggests to some that such
an application may ultimately be unsuccessful.
The decision is helpful, therefore, to the extent that it re-emphasises that
UK and UK-based firms might wish to avail of the standard contracts
route to ensure lawful data transfers to the UK.
Given that the Privacy Shield up to now would have encompassed UK
data transfers to the US, the question then arises as to what the impact
is for ongoing or future UK–US transfers. This question becomes more
complicated in the Brexit context. Firms will need to consider whether
they can demonstrate lawful receipt of data from the EU, and then
whether they are permitted to send some or all of this data to the US.
Where data originates in the UK and might be sent to the US, this raises
other interesting considerations. On the one hand, it is UK data, arguably
unaffected by EU considerations. From another perspective, any com-
pany which has EU dealings may not wish for a variety of reasons to be
sending even UK data to the US in a manner which is less than equiva-
lent to that expected under EU rules. Additionally, if a practice were to
build up of unregulated or less than equivalent data transfer between the
UK and US, this could become yet another problem in terms of the UK’s
(anticipated) application for an Adequacy Decision.
Further considerations arise also in terms of ascertaining the type of
company or indeed industry sector involved, as certain companies may
be treated differently for the purposes of US law and US law safeguards.

Standard Contract Clauses
Decision
32.6 The official statement from the court states that ‘the Court of Jus-
tice finds that examination of Decision 2010/87 in the light of the Charter
of Fundamental Rights has disclosed nothing to affect the validity of that
decision.’11 It continues that:
‘The Court considers that the validity of that decision is not called into ques-
tion by the mere fact that the standard data protection clauses in that decision
do not, given that they are contractual in nature, bind the authorities of the
third country to which data may be transferred. However, that validity, the
Court adds, depends on whether the decision includes effective mechanisms
that make it possible, in practice, to ensure compliance with the level of pro-
tection required by EU law and that transfers of personal data pursuant to
such clauses are suspended or prohibited in the event of the breach of such
clauses or it being impossible to honour them. The Court finds that Decision
2010/87 establishes such mechanisms. In that regard, the Court points out,
in particular, that that decision imposes an obligation on a data exporter and
the recipient of the data to verify, prior to any transfer, whether that level of
protection is respected in the third country concerned and that the decision
requires the recipient to inform the data exporter of any inability to com-
ply with the standard data protection clauses, the latter then being, in turn,
obliged to suspend the transfer of data and/or to terminate the contract with
the former.’12

Paragraph 4 (p 44) of the ultimate court ruling finds as follows:
‘Examination of European Commission Decision 2010/87/EU of 5 February
2010 on standard contractual clauses for the transfer of personal data to
processors established in third countries under Directive 95/46/EC of the
European Parliament and of the Council, as amended by Commission Imple-
menting Decision (EU) 2016/2297 of 16 December 2016 in light of Articles 7,
8 and 47 of the Charter of Fundamental Rights has disclosed nothing to affect
the validity of that decision.’

Implications
32.7 The direct implication is that pre-existing data transfer arrange-
ments based on the legal premise of the standard contracts are (currently)
valid and safe, and may continue. Businesses

11 ‘The Court of Justice invalidates Decision 2016/1250 on the adequacy of the protec-
tion provided by the EU–US Data Protection Shield However, it considers that Com-
mission Decision 2010/87 on standard contractual clauses for the transfer of personal
data to processors established in third countries is valid,’ Court of Justice of the Euro-
pean Union, press release No 91/20, 16 July 2020.
12 ibid.


also have the assurance now that the standard contracts are available as
a prime means of creating future lawful data transfers.
Businesses will also be assured by the fact that the standard contract
rules apply to data transfers to any third country outside of the EU
and so will be helpful in regulating and permitting international data
transfers – not just to the US. (Of course, some transfers may not require
standard contracts if the EU has already made an Adequacy Decision
permitting lawful data transfers to specific named countries based on the
EU’s assessment that the laws and protection of the recipient country are
already equivalent to those in the EU. One reason for standard contracts
being so important, however, is that there are more countries without an
Adequacy Decision than with one.)
Notwithstanding the assurance provided by the decision for the legality
and continued existence of standard contracts transfer mechanism, the
decision holds very important practical and legal implications in terms
of how to apply and operate the standard contracts.
Rightly or wrongly, there is a suggestion that standard contracts were
not always operated with the respect they deserved. After all,
they are legal documents and seek to regulate important aspects of data
protection law. The criticism is that sometimes a company may sign a
standard contract in a less than considered manner, place it in a drawer
and then (almost) forget about it.
Not so anymore.
The court makes express statements as to how seriously the standard
contracts must be treated and the careful consideration and examinations
which firms must undertake in order to assess whether a standard contract
will work and whether it may be undermined by the legal rules in the
recipient country. There are more detailed exercises which must be under-
taken by both the proposed data exporter and the proposed data recipient.
The implication is that without examination of safeguard issues and pro-
vision for safeguard mechanisms, it may be that the proposed contracts
cannot be entered into. Another potential implication is that without
appropriate consideration of safeguard issues for a proposed contract,
the validity of the contract if executed may be called into question.
While the above applies to new standard contracts, the question which
must also be considered by those with pre-existing standard contracts
is whether there is a risk that their contracts could be invalidated
because there is insufficient consideration and provision for safe-
guards. It is prudent, it is suggested, to at least review pre-existing
contracts and to ask the safeguard questions which this aspect of the
case gives rise to.


The court makes a number of references to safeguards and what exercises
and consideration must occur before entering into (valid) standard
contracts for data transfers. The court states that, for example:
‘It follows that the standard data protection clauses adopted by the Commis-
sion on the basis of Article 46(2)(c) of the GDPR are solely intended to provide
contractual guarantees that apply uniformly in all third countries to controllers
and processors established in the European Union and, consequently, inde-
pendently of the level of protection guaranteed in each third country. In so far
as those standard data protection clauses cannot, having regard to their very
nature, provide guarantees beyond a contractual obligation to ensure compli-
ance with the level of protection required under EU law, they may require,
depending on the prevailing position in a particular third country, the adoption
of supplementary measures by the controller in order to ensure compliance
with that level of protection.’ (para 133)

The judgment continues:
‘In that regard, as the Advocate General stated in point 126 of his Opinion, the
contractual mechanism provided for in Article 46(2)(c) of the GDPR is based
on the responsibility of the controller or his or her subcontractor established
in the European Union and, in the alternative, of the competent supervisory
authority. It is therefore, above all, for that controller or processor to verify, on
a case-by-case basis and, where appropriate, in collaboration with the recipi-
ent of the data, whether the law of the third country of destination ensures
adequate protection, under EU law, of personal data transferred pursuant to
standard data protection clauses, by providing, where necessary, additional
safeguards to those offered by those clauses.’ (para 134)

And states that:
‘That validity depends, however, on whether, in accordance with the require-
ment of Article 46(1) and Article 46(2)(c) of the GDPR, interpreted in the
light of Articles 7, 8 and 47 of the Charter, such a standard clauses decision
incorporates effective mechanisms that make it possible, in practice, to ensure
compliance with the level of protection required by EU law and that transfers
of personal data pursuant to the clauses of such a decision are suspended or
prohibited in the event of the breach of such clauses or it being impossible to
honour them.’ (para 137)

The court refers, inter alia, to ‘supplementary measures’, ‘additional
safeguards’, and ‘effective mechanisms … to ensure’. Therefore, some
of the practical and legal exercises to consider when proposing to enter
new standard contracts (or reviewing old ones) include but are by no
means limited to:
●● how an EU Data Subject may exercise their rights after a data transfer;
●● whether there are ‘additional safeguards to those offered by those
clauses’;


●● what those ‘additional safeguards to those offered by those clauses’
are;
●● whether these ‘additional safeguards’ are strong and sufficient;
●● whether these ‘additional safeguards’ are weak and insufficient;
●● whether there is a local law issue in the recipient country which con-
flicts with the standard contracts and/or the rights of Data Subjects
and their ability to protect their data or seek remedies;
●● whether any local law problem issues can be worked around in the
standard contract so as to still sufficiently protect the rights of Data
Subjects.
Some might also suggest asking how an EU Data Subject may exercise
their rights before such a transfer occurs at all. There are, after all, detailed
prior information, transparency, and fairness rules to be complied with.
Brexit
32.8 As is pointed out elsewhere in this edition, there are significant
issues facing UK politicians in terms of assuring businesses – particu-
larly City financial firms and international firms – that Brexit will be as
smooth as possible and will avoid any unnecessary speedbumps. There
would certainly be adverse consequences if, for example, UK business
could not continue to trade as normal and receive EU data as part of its
normal activities. There is an implication that post-Brexit the UK will be
a deemed third country outside of the EU, and as such the default data
transfer ban kicks in. The UK will no longer be able to receive EU data
as it is not (yet) a recipient country which the EU has assessed and vali-
dated as being safe and equivalent for receiving EU data. Such an
assessment needs to be confirmed in a formal Adequacy Decision.
While applying for, processing and receiving an EU Adequacy Deci-
sion is time-consuming and complex, some commentators are suggest-
ing after Schrems II that the UK will not be successful at all in obtaining
an Adequacy Decision.
Mr Schrems, while deliberately avoiding a detailed view on Brexit,
does state that after the decision there ‘may be a huge problem for UK
adequacy’.13
In that scenario, and even in prudent preparation for Brexit itself, it may
be incumbent on firms to look to standard contracts as their means to
legally continue to have data interactions with EU countries. Schrems II
offers assurance in this regard that this is a viable route – even if the
complexity and level of detail required is now much more onerous.

13 Interview with IAPP, 17 July 2020.


Perhaps the lesson is that while a successful Adequacy Decision is
aspired to, one cannot hold one’s breath waiting on uncertain political
and administrative processes. The standard contracts become a matter of
primary practical importance. (Note also that for certain large organisa-
tions the additional route for inter-group international data transfers may
lead towards the Binding Corporate Rules (BCRs) transfer mechanism
for certain transfers.)

Conclusion
32.9 There are also other issues disclosed in the decision, such as the
ability of national data protection regulators to regulate data transfers.
The decision suggests that there may indeed be positive obligations on
such authorities to be proactive in policing data transfers which may not
be adhering to or be permitted under the data protection regime.
However, the issues referred to above will need to be care-
fully monitored for further official guidance and potential official deci-
sions on data transfers. At the same time, those proposing to engage in
data transfers will need to examine the feasibility of contract mechanisms
(and also BCR mechanisms) as appropriate. In addition, those with pre-
existing executed standard contracts arrangements would be prudent to
review their existing documentation. It may be that contracts or practice
documentation might be enhanced further.
Vice-President Jourová states that ‘[w]hen personal data travels abroad
from Europe, it must remain safe’, but acknowledges that ‘it is essential
to have a broad toolbox for international transfers while ensuring a high
level of protection for personal data’.14
The EU and US authorities are already in contact and will discuss the
avenues available. It may, for example, be possible to consider a Privacy
Shield 2.0 or other updated arrangement.
Commissioner Reynders adds that:
‘In the meantime, transatlantic data flows between companies can continue
using other mechanisms for international transfers of personal data available
under the GDPR.
‘We will work together with the national data protection authorities to ensure a swift and

14 ‘Opening remarks by Vice-President Jourová and Commissioner Reynders at the
press point following the judgment in case C-311/18 Facebook Ireland and Schrems,’
Statement, 16 July 2020. At https://ec.europa.eu/commission/presscorner/detail/en/
statement_20_1366.


coordinated response to the judgement. This is essential to provide European
citizens and businesses with legal certainty.’15

The Vice-President sets out priorities for considering a future replace-
ment, namely:
●● guaranteeing the protection of personal data transferred across the
Atlantic;
●● working constructively with our American counterparts with an aim
of ensuring safe transatlantic data flows;
●● working with the European Data Protection Board and national data
protection authorities to ensure our international data transfer tool-
box is fit for purpose.16
In addition, the Commission is already ‘working intensively to ensure
that this toolbox is fit for purpose, including the modernisation of the
Standard Contractual Clauses’.17 So we can also expect that new more
enhanced clauses will be released sometime in the future.
Just to add one note of caution, however, there is a potential sting in the
tail in the judgement to the extent that the court said that the (current)
‘[e]xamination of … standard contractual clauses … has disclosed noth-
ing to affect the validity of that decision.’ That is not a guarantee of per-
manent immunity from a future adverse decision. Indeed, Mr Schrems
already seemed to suggest after the judgment that if the clauses were
seriously challenged they may fall too.18
Controllers and Processors should note that current best practice never
stands still. A contract or practice which was legal last year may have to
be reviewed, updated or replaced today. Legal compliance is an ongoing
task – and as the judgment confirms, can become more onerous over time.
This is further emphasised by the judgment’s requirement that the
national data protection supervisory authorities must be (more) proactive
in enforcing the rules of data protection.19 From Mr Schrems’ perspec-
tive, this was the most important issue decided.
Depending on the circumstances, there may also arise situations where
companies might have to consider whether data already transferred to the
US may have to be returned or deleted, and the documentation surround-
ing such deletion.

15 Ibid.
16 Ibid.
17 Ibid.
18 The comment is that they ‘would be splattered in pieces too if they were.’ Interview
with IAPP, 17 July 2020.
19 See, for example, decision [3], p 44.

514
Chapter 33
Other Data Protection Issues

Introduction
33.01 Data protection covers many separate but important topics.
Many of these are directly relevant to many organisations. Unfortunately
they cannot all be adequately covered in a book such as this. However, it
may assist to briefly refer to some of these issues, and to convey at least
to some extent the wide scope of the data protection regime. Quite sim-
ply, data protection affects every business, organisation and individual.

New Regime
33.02 The new GDPR overhauls and modernises the data protection
regime throughout the EU (and elsewhere). UK organisations will be
affected and will have to prepare for the new regime. What does the
new data protection regime under the EU GDPR do? Some
of the specific changes and updates for organisations are highlighted in
Part 5 of the book.
The whole area of transfers of personal data outside of the EEA
(TBDFs or data transfers) is regularly changing, for example, as new
countries are added to a white list of permitted export countries after
having been examined on behalf of the EU Commission. There are also
other changes such as contractual clauses and binding corporate rules
(BCR) (and noting related updates in relation to Privacy Shield, and the
debate as to whether there are knock-on consequences for contractual
clauses and BCRs). If an organisation needs to consider the possibil-
ity of data transfer exports to non-EEA countries, the most up-to-date
transfer rules should be assessed, and appropriate pro-
fessional advice obtained. It may be necessary to have specific legal contracts
in place. These rules may also be sector specific for certain industries


eg airlines flying to the US from Europe. The EDPB (previously WP29) is
also an important resource for organisations.
Further topical issues are regularly being analysed by the EDPB
(previously WP29). These issues may be consulted at http://ec.europa.
eu/justice/data-protection/article-29/index_en.htm, and also https://edpb.
europa.eu/.
The Information Commissioner’s Office website should also be con-
sulted regularly, at https://ico.org.uk/.

Medical and Health Data
33.03 Medical and health data comprise one of the categories of spe-
cial personal data. Hence, there are greater conditions and compliance
obligations. There is also a greater need for higher data security meas-
ures. The concerns in relation to medical and health data increase once
such data is held in electronic form and electronic databases. There is a
need for enhanced practical and security procedures.1
There are various and increasing forms of recording personal health
and related data regarding individuals, held in databases or biobanks.2
One of the concerns is also the increasing possibility for profiling indi-
viduals from bio-informatic and genetic data.3 The issue of consent in
relation to bio data and biobanks is an issue of increasing concern.4
There is controversy in relation to the apparent transfer of sensitive
medical health data relating to 1.6 million patients. This involves current,
historical and live feed patient data. The transfer is being made by the
Royal Free NHS Trust to DeepMind and/or Google UK Limited and/or
a third party, the details of which are redacted in an Information Sharing
Agreement signed by one party on 29 September 2015.5 The exact
details are somewhat unclear. It is clear, however, that serious ques-
tions arise as to how the arrangement could be data protection compliant
based on the details currently available, including lack of transparency in

1 I Sandea, ‘Analysis of the Legal Aspects Concerning Data Protection in Electronic


Medical Registry,’ Applied Medical Informatics (2009) (25) 16–20.
2 LA Bygrave, ‘The Body as Data? Biobank Regulation via the “Back door” of Data
Protection Law,’ Law, Innovation & Technology (2010) (2) 1–25.
3 IM Azmi, ‘Bioinformatics and Genetic Privacy: The Impact of the Personal Data
Protection Act 2010,’ Computer Law & Security Review (2011) (27) 394–401.
4 See, for example, J Taupitz and J Weigel, ‘The Necessity of Broad Consent and
Complementary Regulations for the Protection of Personal Data in Biobanks: What
Can We Learn from the German Case?’ Public Health Genomics (2012) (15) 263–271.
5 ‘DeepMind Rapped,’ New Scientist, 8 July 2017; Powles, Julia, and Hodson, Hal,
‘Google DeepMind and Healthcare in an Age of Algorithms,’ Health and Technology
(2017) (7:4) 351–367.


particular to the patients whose special personal data are involved. There
does not appear to have been any opportunity or mechanism for patients
to opt-in or to opt-out before any proposed data transfers. The purpose
of the transfer and the details leading up to the transfer and arrangement
are also unclear, which also undermines the possibility of fair, compliant
processing. Fundamentally, the entire project relates to a new secondary
use in relation to the medical data in question, which requires proper con-
sideration and data protection compliance, which is not evident from the
documentation thus far available. The data referred to is also of such a
nature as to appear to go beyond the purported purpose thus far disclosed.
One would expect any medical data being transferred for research –
in particular names, addresses and other non-necessary information – to
be redacted and/or pseudonymised prior to transfer, which does not
appear to have happened and which is not required in the Information
Sharing Agreement.
While Big Data health projects can have benefits, and such projects
can be worth the endeavour even without a successful end-resulting
health benefit, they all need to be data protection compliant. Serious
questions remain in relation to this project, and thus far compliance has
not been demonstrated. In fact, the information disclosed in the Data
Sharing Agreement and an official NHS Q&A document raise more
questions than answers. On the basis of these documents, the project
appears to be non-compliant with data protection law.
Recently the ICO has commented on issues of data protection, contact
tracing, and mobile applications as they relate to Covid-19.6

Genome Data
33.04 A related and growing area is genomic,7 genome research, and
the implications for individuals’ privacy and personal data. Individu-
als may be concerned with what happens with their own DNA gene
sequence, information regarding predispositions to diseases, how this
may affect them, and how doctors, employers, insurers, and government
may access and use such personal data. One resource relating to this area
is The Governance of Genetic Information, Who Decides? by Widdows
and Mullen.8

6 ICO, ‘Apple and Google Joint Initiative on COVID-19 Contact Tracing Technology,’
ICO (17 April 2020).
7 L Curren et al, ‘Identifiability, Genomics, and UK Data Protection Law,’ European
Journal of Health Law (2010) (17) 329–344.
8 H Widdows and C Mullen, eds, ‘Frontmatter, The Governance of Genetic Information,
Who Decides?’ Cambridge Law, Medicine and Ethics (2009).


Body Scanners
33.05 The introduction of body scanning technology in airports has
been controversial.9 While the prime argument in favour relates to air-
line security and terrorism, not everyone is convinced, and those chal-
lenged to produce evidence of successful attacks prevented, have been
less forthcoming.
The main controversy centres on the ability of the body scanners to
provide a complete, graphic, internal and intrusive image of a person’s
naked body once they walk through the scanner. There are of course
different types and different settings. However, the introduction of body
scanners is a perfect example of the introduction of a new technology
without considering the privacy and data protection implications in
advance. Later versions of body scanners have been developed which
produce a line image drawing, not a biological, naked representation.
They are equally capable of highlighting contraband material. Privacy
designed body scanners can be equally effective. While the original ver-
sions of body scanners were problematic, the more recent versions dem-
onstrate that technology can be much more privacy- and data protection
friendly when properly designed. The further lesson is that these consid-
erations should be incorporated at the design stage – not after the product
launches on the market.

Investigation, Discovery and Evidence
33.06 The issue of electronic evidence is important and critical,
whether for the organisation or the Data Subject wishing to use such
evidence. It is recommended that organisations consider these issues
proactively in advance rather than hoping to be able to deal with them
adequately in a reactive manner.
The new data protection regime also sets higher obligations in terms
of the need for more extensive record keeping. While data regulators are
already looking at some of these new sets of documentation, it is only
a matter of time before data plaintiffs and privacy groups focus more
attention on these documents.
The Morrisons case record seems to suggest certain avenues where
further documentation may have assisted the court.10

9 See, for example, O Mironenko, ‘Body Scanners Versus Privacy and Data Protection,’
Computer Law & Security Review (2011) (27) 232–244.
10 Various v Morrison (rev 1) [2017] EWHC 3113 (QB) (1 December 2017); Various
v Morrison [2020] UKSC 12 (1 April 2020).


Recently, the Information Commissioner commented that ‘[o]ur work
needs to meet the civil and criminal standards of evidence gathering and
recovery if it is to be useful.’11 Businesses taking and defending claims
in relation to data protection will face increasing calls to ensure appro-
priate diligence in the data records existing and available, and of the data
gaps that may exist.

Cloud
33.07 The popularity of cloud computing and virtualisation services
with users, enterprise and increasingly official organisations is increas-
ing. However, there are real concerns in relation to privacy, data pro-
tection, data security,12 continuity, discovery, liability, record keeping,
etc.13 One commentator refers to cloud computing as ‘the privacy storm
on the horizon.’14 Any organisation considering cloud services needs to
carefully consider the advantages, disadvantages, assessments and con-
tract assurances that will be required. Such organisations, as well as the
service operators, also need to assure themselves as to how they ensure
data protection compliance.

New Hardware Devices, New Software
33.08 The arrival of new devices, from smartphones, combined devices
and even communications devices on devices (such as RFID tags),

11 Information Commissioner, ‘ICO opening remarks – The Committee on Civil Liberties,
Justice and Home Affairs (LIBE) of the European Parliament – Hearing on the
Facebook/Cambridge Analytica case’ (4 June 2018).
12 See for example, C Soghoian, ‘Caught in the Cloud: Privacy, Encryption, and
Government Back Doors in the Web 2.0 Era,’ Journal of Telecommunications &
High Technology Law (2010) (8) 359–424. Also note U Pagallo, ‘Robots in the Cloud
with Privacy: A new Threat to Data Protection?’ Computer Law & Security Report
(2013)(29:5) 501.
13 ICO, Guidance on the Use of Cloud Computing, at https://ico.org.uk; Article 29
Working Party, Opinion 05/2012 on Cloud Computing, WP 196, 1 July 2012; P Lanois,
‘Caught in the Clouds: The Web 2.0, Cloud Computing, and Privacy?’ Northwestern
Journal of Technology and Intellectual Property (2010) (9) 29–49; FM Pinguelo and
BV Muller, ‘Avoid the Rainy Day: Survey of US Cloud Computing Caselaw,’ Boston
College Intellectual Property & Technology Forum (2011) 1–7; IR Kattan, ‘Cloudy
Privacy Protections: Why the Stored Communications Act Fails to Protect the Privacy
of Communications Stored in the Cloud,’ Vandenburg Journal of Entertainment and
Technology Law (2010–2011) (13) 617–656.
14 AC DeVere, ‘Cloud Computing: Privacy Storm on the Horizon?’ Albany Law Journal
(2010) (20) 365–373.


emphasise that organisations need to be much more aware and considered
in their policies and risk assessments under the data protection
regime. Gaming and new headset devices (eg VR) will increase and will
raise many new issues, including for personal rights, consent, represen-
tations and personal data.
The software used in new facial recognition technology is just one
example of how device development and software development will
continually bring new services, but also new data protection challenges.
The ICO points out that facial recognition – including live facial recog-
nition (LFR) – needs to be appreciated as containing not just personal
data, but also biometric data. The ICO also reiterates that compliance
with the data rules applies to law enforcement authorities as well as more
commercial entities.15

Internet of Things
33.09 The beginning of the so-called Internet of Things (IoT) or
connected devices era is now well heralded. However, the full
consideration of the data protection implications has yet to be fully out-
lined. Organisations need to appreciate the implications for employees,
users and also their compliance systems. Manufacturers are assisted in
identifying and reducing these risks by the new risk and assessment tools
of the EU General Data Protection Regulation (GDPR).

On-Site/Off-Site
33.10 Organisations have to tackle the issues presented by employees not just working on-site, but also travelling and working at home or in other locations off-site. This trend will likely continue even after the Covid-19 pandemic. It can affect, for example, the security of, and the security risks facing, the personal data collected and processed by the organisation. It also means that devices may be taken off-site and/or that third-party devices may be used to access the organisation’s systems remotely.
Note also the recent EDPB document:
●● Guidelines 3/2019 on processing of personal data through video
devices.

15 ICO, ‘The Use of Live Facial Recognition Technology by Law Enforcement in Public
Places’, 31 October 2019.


Online Abuse
33.11 The increasingly evident problems of online abuse, such as cyberbullying, trolling, defamation, and the copying and use of personal data to abuse and blackmail children, teenagers, etc, are issues which need to be considered by all organisations, as well as by policymakers. Pierre Trudel, for example, notes that the risks to individuals increase with many online activities, including in relation to data protection, safety, etc.16
The Information Commissioner comments more generally that
websites ‘can be held liable for their misuse of personal data.’ She adds
that ‘[t]hese organisations have control over what happens with an
­individual’s personal data and how it is used to filter content – they con-
trol what we see, the order in which we see it, and the algorithms that
are used to determine this. Online platforms can no longer say that they
are merely a platform for content; they must take responsibility for the
provenance of the information that is provided to users.’17
‘Data crimes are real crimes,’ states the Information Commissioner.18

Drones
33.12 As much as there is a new GDPR data protection regime, there are also new rules and regulations being developed in relation to drones. Drones now have to be registered in more and more jurisdictions. However, the privacy and data protection implications, while highlighted in general discussion, are not yet adequately encompassed in express privacy and data protection rules. There will be increasing calls to do so, as well as increasing examples of why such an accommodation is needed. A number of headline issues are raised by drones, including privacy and data protection, drones as transport or delivery vehicles, and interference with and the malicious use of drones (such as at airports).
Note also the recent EDPB document:
●● Guidelines 1/2020 on processing personal data in the context of
connected vehicles and mobility related applications.

16 P Trudel, ‘Privacy Protection on the Internet: Risk Management and Networked Normativity,’ in S Gutwirth, Y Poullet, P de Hert, C de Terwange and S Nouwt, Reinventing Data Protection? (Springer, 2009) 317.
17 Information Commissioner, ‘ICO opening remarks – The Committee on Civil Liberties, Justice and Home Affairs (LIBE) of the European Parliament – Hearing on the Facebook/Cambridge Analytica case’ (4 June 2018).
18 Ibid.


Increasing Actions
33.13 There will be increasing enforcement and fines facing organisa-
tions when compliance goes wrong and also when data breach incidents
arise. In addition to the actions of regulators such as the ICO, there will
also be increasing actions from individual Data Subjects, class actions
and representative organisations. While this might occur most frequently
where financial data or special data are concerned, it will not be limited
to these areas. For example, there are already examples of data breach
and data loss in relation to Internet of Things devices and services which
open up new areas of exposure.

AI, Big Data and Data Ethics


33.14 The government agreed to a proposal from the Science and Technology Committee to establish a ‘Council of Data Science Ethics’. The government is developing an ‘ethical framework for government data science’. The Committee issued a report entitled ‘The Big Data Dilemma’ in February 2016, to which the government issued a response.
In addition, the government also indicates that it will not presently introduce criminal penalties for serious data breaches, and will consider developments, including the new GDPR. The ICO had previously called for criminal sanctions on a number of occasions.
The government also confirms that it will not introduce compulsory data protection audits of local authorities, as there is already ongoing progress in that area.
Nicola Blackwood, MP, the Chair of the Committee, states that:
‘Big Data has enormous potential to improve public services and business
productivity, but there are also justified privacy concerns when personal
data is used in new applications, services and research. Getting the balance
between the benefits and the risks right is vital.

I am pleased therefore that the Government has accepted our call to set up
a “Council of Data Science Ethics” to address the growing legal and ethical
challenges associated with balancing privacy, anonymisation of data, security
and public benefit.’19

19 ‘Government Agree to set up “Council of Data Ethics,”’ The Science and Technology
Committee, available at http://www.parliament.uk/business/committees/committees-
a-z/commons-select/science-and-technology-committee/news-parliament-2015/
big-data-dilemma-government-response-15-16/. The report is at http://www.publications.


Codes and Certification


33.15 Two of the innovations contained within the GDPR relate to the possibility of organisations demonstrating their compliance (whether in whole or in part) by complying with established Codes of Practice (see GDPR Article 40) and/or with particular Certification mechanisms (see GDPR Article 42).
The ICO has recently announced that it is now ready to accept inquiries and applications for the recognition of such codes and certification, and of the respective bodies which would administer and operate such schemes.20
This is likely to be one of the growing areas of data protection
compliance.

Politics
33.16 The Cambridge Analytica scandal was a shock to many, in terms of how so much private personal data could be collected without knowledge, transparency or consent, and then used for data analytics purposes with a view to political user profiling and, ultimately, political influence and manipulation. Various actions were initiated by data regulators as well as policymakers. The Information Commissioner has indicated that the ICO’s investigation was the largest data regulatory investigation thus far, and involved detailed cooperation with other data regulators internationally. The Commissioner comments that ‘we have seen that the behavioural advertising ecosystem has been applied across to political campaigning to influence how we vote. I am deeply concerned that this has happened without due legal or ethical consideration of the impacts to our democratic system.’21
While a number of significant data fines have been issued and detailed reports have been published, one should not assume that there are no more shoes to drop, nor that further investigations and problem issues will not need to be dealt with.

parliament.uk/pa/cm201516/cmselect/cmsctech/468/468.pdf; the government response


is at http://www.publications.parliament.uk/pa/cm201516/cmselect/cmsctech/
992/99202.htm. The report is officially titled House of Commons Science and Tech-
nology Committee, The Big Data Dilemma, Fourth Report of Session 2015–16.
The government response was published on 26 April 2016.
20 ICO, ‘ICO Codes of Conduct and Certification Schemes Open for Business,’ ICO
(28 February 2020).
21 Information Commissioner, ‘ICO opening remarks – The Committee on Civil Liberties, Justice and Home Affairs (LIBE) of the European Parliament – Hearing on the Facebook/Cambridge Analytica case’ (4 June 2018).


Conclusion
33.17 Data protection compliance is never a one-size-fits-all exercise or a single, one-time policy document. The nature of what amounts to personal data, and the activities for which such data can be processed, are ever changing. Those within an organisation, therefore, need to be constantly alert to compliance issues and changes. Organisations also need to be constantly alert to new issues and dangers generally, and also to those hot button issues specific to their sector. Significantly greater attention and resources are needed to deal with security, data breaches and data protection compliance.
The Information Commissioner states, ‘[t]he spotlight is well and truly on data protection … We need a sustained willingness by citizens to exercise their data protection rights. We need data protection authorities unafraid to use our new tools, sanctions and fining powers.’ Businesses and organisations – and their advisors – need to be ready to meet this new challenge.

Appendices

Reference Links
Information Commissioner’s Office:
https://ico.org.uk/
European Data Protection Board:
https://edpb.europa.eu/
Society of Computers and Law:
http://www.scl.org
International Journal for the Data Protection Officer, Privacy Officer
and Privacy Counsel:
www.idpp.info

Legislative Links
Data Protection Act 2018:
https://www.legislation.gov.uk/ukpga/2018/12/enacted
General Data Protection Regulation:
http://ec.europa.eu/justice/data-protection/reform/files/regulation_oj_en.pdf

Forms and Documents Links


Notification self-assessment guides:
https://ico.org.uk

Complying with Data Protection


All organisations collect and process at least some personal data as
defined under the Data Protection Act 2018 (DPA 2018) and the data


protection regime. Therefore, an organisation must ensure it only collects and processes personal data if complying with:
●● the obligation to only collect and process personal data if in compli-
ance with the DPA 2018;
●● the updated data protection Principles;
●● the updated Lawful Processing Conditions;
●● the updated Special Personal Data Lawful Processing Conditions;
●● the separate consideration to the processing of children’s personal
data;
●● the security and risk conditions;
●● the data breach notification conditions and incident management;
●● the new Data Protection by Design and by default obligations;
●● the new accountability obligations;
●● data protection impact assessment and risk assessment;
●● the personal data outsourcing and data processor conditions;
●● the personal data transfer ban and restrictions and standards;
●● the individual data subject rights, including access, deletion, right to
be forgotten, etc;
●● queries, audits and investigation orders from the ICO;
●● the time limits for undertaking various tasks and obligations.

Objections to Marketing
Jay1 provides the following suggestions for organisations when dealing with access requests/marketing objections, namely:
●● does the objection relate to marketing or another form or processing?
●● does it relate to direct marketing within the definition?
●● is it in writing or sent electronically or oral?
●● if it is not in writing or electronic, is it appropriate to deal with it as sent, or should the individual be required to put it in writing or send it by electronic means?
●● at which branch or office was it received?
●● on what date was the request received?
●● has the individual making the request given an intelligible name and
address to which the controller can respond?
●● when does the period for response expire?
●● what time scale has the individual specified to stop marketing
processing on a notice?
●● what marketing processing is affected?

1 Jay and Hamilton, Data Protection: Law and Practice, (London: Sweet and Maxwell,
2007), pp 436–437.


●● is more time needed to comply with the requirement?


●● how is the marketing data held? Is it manual or automated or some
of both?
●● does the objection apply to manual or automated?
●● has the individual described the data?
●● has the individual described the processing?
●● has the individual explained why unwarranted damage and distress
would be caused?
●● on what grounds are the data being processed?
●● can one of the primary grounds be claimed?
●● does the notice amount to a revocation of an existing consent?
●● is it possible to comply with the objection?
●● is processing about others affected?
●● would compliance mean system changes?

Audit Checklist
Morgan and Boardman2 refer to audits. Their checklists include, for example:
Extent of Audit
●● what parts of the organisation and its files and systems have been audited and for what reason?
●● are they likely to be sufficient to give an indication of the organisa-
tion’s overall data protection compliance or not?
Classes of Personal Data Audited
●● computer
●● email
●● other letter/memo files
●● internet
●● intranet
●● manual (relevant filing system)
●● video (scanned images, CCTV, photographic, film)
●● audio (contract, training, voicemail)
●● biometric
●● other categories of personal data (eg tachograph)
Types of Personal Data

2 Morgan, R, and Boardman, R, Data Protection Strategy, Implementing Data


Protection Compliance, (London: Sweet and Maxwell, 2012), pp 78 onwards.


●● is there personal data or not?


●● if yes, is the personal data sensitive personal data?
●● or accessible personal data?
●● is any of the personal data confidential? eg medical/financial? If
so, what are the consequences of this and what are individuals told
about this?
●● is there any automatic or automated processing of personal data? If
so, what are the consequences of this and what are individuals told
about this?
●● are there any cookies? If so, how are they used?
Types of Data Subject
●● staff
●● retired staff
●● staff spouses or other family members
●● members of the public
●● customers
●● prospective customers
●● business contacts, which may include,
$$ sole traders or partnerships, who are identifiable individuals and
so data subjects; or
$$ managers or other individual office holders in those bodies the
organisation has contracts with?
‘Owner’ of Personal Data
●● who?
●● any ‘private’ files?
●● how are private emails handled?
●● what, if any, of the data is processed by a data processor?
●● what personal data is received from another organisation?
●● what personal data is shared with one or more other organisations?
●● does the organisation process data itself on behalf of others, as a data
processor?
Purposes of Processing
●● how and why is the personal data collected?
●● what information is given to individuals about this, how and when?
●● how well does the data’s use match the purposes for which it is
collected?
●● does the organisation carry out direct marketing? If so, how?
●● does it subscribe to ‘preference’ service?
●● what or whom ensures that the personal data are accurate?


●● what are the consequences if the personal data is inaccurate?


●● how long is it kept?
●● what happens to it when it is no longer used?
●● what are the criteria for its destruction?
●● is it in fact destroyed: if so, who does this and how do they do it?
Contracts
●● data processing contracts
●● data sharing contracts
●● contracts involving personal data outside of the EEA
●● fair obtaining warranties
Security of Personal Data
●● physical security
●● staff security
●● what standards accreditation does the organisation have?
●● system security – eg passwords, firewalls, etc
●● what about portable media eg Laptops, memory sticks, CDs, etc?
●● procedures for sending secure faxes – and emails – for sensitive
­personal data?
●● is there any use of wireless transmission wi-fi?
●● security with data processors
●● how were they selected?
●● was their security checked?
●● how is it guaranteed by the data processor and documented?
●● is there any ongoing review processing?
Sending Outside EEA?
●● files?
●● emails?
●● internet material?
●● intranet material?
Cookies
●● which?
●● from where?
●● how intrusive?
Note also, of course, that the updates and clarifications of the new EU General Data Protection Regulation (GDPR) require separate additional audit and record processes for data protection impact assessments and risk assessments.


Procedures
Morgan and Boardman3 refer to processes and procedures in relation to compliance and ongoing compliance. The queries checklist includes, for example:
Processes and Procedures
●● what industry/trade association guidance is available?
●● what published processes and procedures are there in respect of personal data?
●● how are they brought to the attention of staff and others?
●● how enforced?
●● how updated?
●● who is responsible for this?
●● how does he/she fit into the organisation’s structure?
Data Protection Notification/Registration
●● has the organisation notified?
●● if not, why does it consider itself exempt?
●● is the notification consistent with the personal data identified by the
audit?
●● purposes?
●● is the notification up to date?
●● how is the notification kept up to date?
Data Subjects’ Rights
●● what procedures are in place to deal with requests in connection with data subjects’ rights?
●● how has it worked so far?
●● any problems with identifying a particular individual in the personal
data?
●● are the Information Commissioner’s Codes (eg CCTV, employment
practice, data sharing) followed?
People
●● who is in charge of data protection?
●● what resources does he/she have?
●● how is he/she supported by senior management?
●● what if any disciplinary action has been taken in respect of data
protection?

3 Morgan, R, and Boardman, R, Data Protection Strategy, Implementing Data


Protection Compliance, (London: Sweet and Maxwell, 2012), pp 80 onwards.


Information Commissioner’s Office (ICO)


●● apart from notification/registration, has the organisation ever had
any dealings with the ICO? If so, what and when and with what
particular results?
Obviously in future, risk issues and risk assessments will be very
important.

Index

[all references are to paragraph number]

A
Abuse
  employee monitoring, 17.07
  generally, 33.11
  social media, 29.10
Access right
  dealing with requests, 9.07
  GDPR, and, 9.06
  generally, 9.05
Accuracy
  overview, 1.19
Adequate protection
  transfer of personal data, 21.03
Anonymised data
  transfer of personal data, 21.16
Assessment notices
  destroying or falsifying information and documents, 20.04
  generally, 20.04
  limitations, 20.05
  restrictions, 20.05
Audits
  request, 20.08
Automated calling systems
  ePrivacy Directive, 22.04
Automated decision taking
  data subjects’ rights
    generally, 9.16, 9.29–9.30, 26.66
    introduction, 9.14
    right to object, 9.15, 9.29
  employee data protection regime
    introduction, 15.09
    object, to, 15.10
    profiling, 15.11
  social media
    generally, 29.39
    introduction, 29.22, 29.37
    right to object, 29.38

B
Binding corporate rules
  definition, 3.03
  generally, 21.11
  WP29 documents, 21.07
Biometric data
  definition, 3.03
Blocking
  generally, 9.01
  social media, 29.21
Blogs
  see Websites and blogs
Body scanners
  generally, 33.05
Breaches of rules
  data processing activities, 11.03
  generally, 11.02
  other consequences, 11.04
Brexit
  conclusion, 24.21
  EDPB guidance, 24.19
  EDPS guidance, 24.17
  employee monitoring and human rights, 17.14–17.15
  EU Commission’s ‘Notice to Stakeholders’, 24.18
  EU (Withdrawal) Act 2018
    data, and, 24.12
    details, 24.13
    Official Explanatory Commentary, 24.14
  EU (Withdrawal Agreement) Act 2020, 24.07, 24.20
  ‘final’ negotiations in transition period, 24.07
  GDPR, and, 1.13, 2.02–2.04, 4.10
  generally, 2.02, 2.04, 8.05, 24.02

Brexit – contd
  ICO guides
    ‘broader’ guidance, 24.10
    further guidance, 24.15
    generally, 24.08
    overview FAQs guidance, 24.11
    ‘six steps to take’, 24.09
  ‘no deal’
    EDPB guidance, 24.19
    generally, 24.07
    ICO guides, 24.08, 24.09, 24.11, 24.15
    preparation, ICO advice, 24.16
  transfers of personal data, 21.19, 24.02

C
Caching
  social media, 29.15
Calling line identification
  generally, 22.18
Case law
  sources of law, 2.05
Certification
  generally, 26.81, 33.15
Child pornography
  employee monitoring, 17.09
Children
  conditions for consent, 26.21
  generally, 26.20
Civil sanctions
  director liability and offences, 11.14
  generally, 11.12
  Morrisons case, 11.13
Cloud computing and services
  generally, 33.07
  privacy by design, 19.11
Codes of Conduct
  generally, 26.80, 33.15
Communication of data breach to data subject
  employees, and, 16.12
  generally, 26.39, 26.71, 27.04
  process, 27.07
  social media, 29.40
  time limits, 27.06
Community findings
  transfer of personal data, 21.03
Compensation
  damage as result of infringement of
    generally, 9.25, 11.09, 11.22
  data subjects rights, 9.17, 9.25, 29.20
  generally, 26.78
Complaints to ICO/SA
  choice of jurisdiction and court, 9.24
  compensation, 9.25
  court remedies, 9.22
  different SAs, 9.23
  generally, 9.20
  organisational privacy groups, 9.21
  penalties, 9.26
  portability, 9.28
  sanctions, 9.27
Compliance
  civil sanctions
    director liability, 11.14
    generally, 11.12
    Morrisons case, 11.13
    processors, 11.15
  conclusion, 11.30
  DPA, under, 11.12–11.15
  GDPR, under, 11.08–11.11, 11.16–11.25
  ICO, and, 11.07, 11.26–11.28, 11.29
  introduction, 11.01
  time limits, 10.01–10.02
Concealing identity
  electronic mail, by, 22.07
  generally, 23.08
Conferences
  sources of law, 2.18
Confidentiality
  ePrivacy Directive, 22.15
Connected devices era
  generally, 33.09
Connected line identification
  generally, 22.18
Consent
  see also Data subjects consent
  children, 26.21
  conditions, 4.09, 26.19
  ePrivacy Directive, 22.11
  generally, 26.19
  introduction, 4.09
  overview, 1.19
  social media, 29.14
  transfer of personal data, 21.05
Contracts
  transfer of personal data, 21.06
Contracts of employment
  employee monitoring, 17.20
Controllers
  co-operation with SA, 26.33
  definition
    DPA, 3.02
    GDPR, 3.03

Controllers – contd
  introduction, 26.26
  joint
    generally, 26.28
    social media, 29.02
  processing under authority, 26.30
  records of activities, 26.31
  representatives not established in EU
    generally, 26.32
    social media, 29.42
  responsibility, 26.27
  social media, 29.02
Cookies
  EU law, 2.04
Corporate communications usage policy
  content, 17.12
  generally, 17.11
  key issues, 17.13
Council of Europe
  sources of law, 2.12
Covid-19
  EDPB guidance on health research, 26.23
  medical and health data, 33.03
Criminal convictions
  GDPR, and, 26.24
Criminal Justice and Immigration Act 2008
  enforcement and penalties for non-compliance, 11.07
Cross-border processing
  definition, 3.03
Cross-border transfers
  see Trans border data flows (TBDFs)
Cyber bullying
  social media, 29.10

D
Data
  transfer of personal data, 21.16
Data breach
  communication to data subject
    employees, and, 16.12
    generally, 26.39, 26.71, 27.04
    process, 27.07
    social media, 29.40
    time limits, 27.06
  conclusion, 27.10
  context, 27.02
  definition, 3.03, 27.01
  employees, and
    employee data organisations, 16.13
    generally, 27.05
    introduction, 16.11
    notification, 16.12
  incident response, 27.09
  introduction, 27.01
  Leveson Inquiry, and, 30.05
  location, 16.14
  notification to ICO, 12.24
  notification to SA
    employees, and, 16.12
    generally, 26.38, 27.03
    process, 27.07
    time limits, 27.06
  process for notification, 27.07
  security standards, 27.08
  social media, 29.23
  time limits for notification, 27.06
Data concerning health
  DPA, 3.02
  GDPR, 3.03
Data controller
  DPA, 3.02
  GDPR, 3.03
Data disclosure
  generally, 9.18
Data ethics
  generally, 33.14
Data portability
  generally, 9.13, 9.28, 26.63
Data processing
  see also Processing
  enforcement and penalties for non-compliance, 11.03
Data processors
  definition
    DPD, 3.02
    GDPR, 3.03
  enforcement and penalties for non-compliance, 11.15
  outsourcing, and
    conclusion, 13.05
    engagement, 13.03
    introduction, 13.01
    processing personal data and security, 13.02
    reliance on third parties, 13.04
Data protection
  additional issues
    AI, Big Data and Data Ethics, 33.14
    body scanners, 33.05
    cloud, 33.07
    codes and certification, 33.15
    conclusion, 33.17

Data protection – contd
  additional issues – contd
    drones, 33.12
    GDPR, 33.02
    generally, 33.01
    genome data, 33.04
    increasing actions, 33.13
    Internet of Things, 33.09
    investigation, discovery and evidence, 33.06
    medical and health data, 33.03
    new hardware devices and software, 33.08
    online abuse, 33.11
    on-site/off-site, 33.10
    politics, 33.16
  background to UK regime, 24.01–24.21
  compliance
    inward-facing, 1.15–1.16
    outward-facing, 1.14
    regulatory authority, 1.17
  data processing criteria, 1.20
  definitions, 1.23
  EU regime see EU General Data Protection Regulation (GDPR)
  historical background
    conclusion, 4.10
    DPA, 4.03
    GDPR, 4.05–4.09
    generally, 4.02
    introduction, 4.01
    legal instruments, 4.04
  importance, 1.02–1.12
  Information Commissioner, 1.17
  legitimate processing, 1.22
  meaning, 1.01
  media coverage, 1.03
  overview, 1.21
  purpose, 1.18
  regulatory regime, 1.13
  sources of law
    case law, 2.05
    conferences, 2.18
    Council of Europe, 2.12
    DPA 2018, 2.02
    EDPB, 2.10
    EU law, 2.04
    European Data Protection Supervisor, 2.11
    ICO determinations, 2.07
    ICO guides, 2.06
    introduction, 2.01
    legal journals, 2.09
    legal textbooks, 2.08
    other authorities, 2.13
    other laws, 2.17
    other official sources, 2.14
    reference material, 2.19
    secondary legislation, 2.03
    websites and blogs, 2.16
  summary of rules, 1.19
Data Protection Act (DPA) 2018
  background, 24.01–24.06
  changes from GDPR, 25.04
  comment, 25.05
  definitions
    general list, 3.02
    special personal data, 3.05
  future, 25.06
  generally, 1.13, 25.01
  outward facing obligations, 18.02
  overview, 1.21
  repealing, 25.02
  sources of law, 2.02
  structure, 25.03
Data Protection Acts
  access right
    dealing with requests, 9.07
    generally, 9.05
  adequacy of protection, 12.03
  background, 4.02–4.03
  civil sanctions
    director liability and offences, 11.14
    generally, 11.12
    Morrisons case, 11.13
    processor offences, 11.15
  commencement, 4.02
  compensation for data subjects, 9.17, 29.20
  data access
    dealing with requests, 9.07
    generally, 9.05
  data disclosure, 9.18
  data protection principles, 5.01
  data subject rights
    access, 9.05–9.07
    data disclosure, 9.18
    dealing with requests, 9.07
    introduction, 9.01
    jurisdiction, 9.19
    representation of data subjects, 9.21
  dealing with access requests, 9.07
  destruction
    director liability, 11.14
    social media, 29.21

Data Protection Acts – contd
  enforcement and penalties for non-compliance
    civil sanctions, 11.12
    criminal offences, 11.02–11.06
    director liability, 11.14
    Morrisons case, 11.13
    processor liability, 11.15
  engaging data processors, 12.05
  exemptions
    conclusion, 8.05
    generally, 8.02–8.03
    introduction, 8.01
  generally, 4.03
  jurisdiction, 9.19
  level of security, 12.03
  Leveson Inquiry, and
    comparison table, 30.06
    generally, 30.02
  offences
    breaches of rules, and, 11.02–11.04
    data processing activities, and, 11.03
    direct marketers, by, 11.06
    director liability, 11.14
    introduction, 11.01
    other consequences of breach, and, 11.04
    processors, by, 11.15
    unlawful obtaining or procuring personal data, 11.05
  principles, 5.01
  processor liability, 11.15
  reliability of employees, 12.04
  repeal, 25.02
  representation of data subjects, 9.21
  security of personal data
    adequacy of protection, 12.03
    appropriate measures, 12.02–12.03
    engaging data processors, 12.05
    level of security, 12.03
    reliability of employees, 12.04
  social media
    automated decision taking/making, 29.22
    blocking, 29.21
    compensation for data subjects, 29.20
    destruction, 29.21
    erasure, 29.21
    prevent processing for direct marketing purposes, 29.19
    prevent processing likely to cause damage or distress, 29.18
    rectification, 29.21
  transfer of personal data, 21.03
Data protection authorities
  see Information Commissioner’s Office
Data protection by default
  generally, 1.04
Data protection by design (DPbD)
  see also Privacy by design (PbD)
  background, 19.02
  conclusion, 19.11
  EDPB, and, 19.09
  generally, 1.04, 19.04–19.05
  ICO, and, 19.06–19.08
  introduction, 19.01
  multinationals, and, 19.10
  overview, 18.04
  principles, 19.03
Data protection compliance
  inward-facing, 1.15–1.16
  outward-facing, 1.14
  regulatory authority, 1.17
Data Protection Directive (DPD)
  exemptions, 8.15
  repeal, 26.12
  social media
    specific processing risks, 29.17
    third party controllers, 29.16
Data protection impact assessments (DPIA)
  characteristics, 28.06
  conclusion, 28.10
  elements, 28.05
  generally, 26.69, 28.01
  ICO guidance, 26.40, 28.03
  introduction, 26.41
  issues for consideration, 28.08
  methodologies, 28.07
  prior consultation, 28.09
  privacy by design, 19.08
  purpose, 28.04
  requirement, 28.02
  security of personal data
    contents, 12.14
    generally, 12.13
    introduction, 12.12
  social media, 29.43
  steps and methodologies, 28.07
Data protection officers (DPO)
  conclusion, 26.91
  contact details, 31.06
  expertise, 31.05
  functions, 26.90, 31.03
  generally, 26.89, 31.02
  groups, and, 31.04
  independence, 31.08

Data protection officers (DPO) – contd
  introduction, 26.72, 31.01
  inward-facing issues, 14.03
  overview, 26.88
  qualifications, 31.05
  reporting, 31.07
  resources, 31.09
  role, 26.89, 31.02
  summary, 31.10
  tasks, 26.90, 31.03
Data protection principles
  accurate and up to date, 5.04
  adequate, relevant and not excessive, 5.04
  application, 5.02
  compliance, processing of employee personal data, 14.06
  correct, 5.04
  data subject rights, and
    generally, 9.02
    introduction, 5.04
  DPA, 5.01
  EU regime, and
    generally, 26.17
    introduction, 4.07
    overview, 26.06
  fair processing, 5.04
  generally, 5.04
  interpretation, 5.05
  introduction, 5.01
  lawful, fair and transparent processing, 5.04
  manner demonstrating compliance with, 5.04
  not excessive, 5.04
  not kept for longer than is necessary, 5.04
  overview, 4.07
  processed for limited purposes, 5.04
  processed in line with data subject’s rights, 5.04
  relevant, 5.04
  secure, 5.04
  summary, 5.04
  transfer of personal data, 21.02
  up to date, 5.04
Data quality principles
  see Data protection principles
Data risk assessments and tools
  see Data protection by default; Data protection by design (DPbD); Privacy by design (PbD)
Data subject
  definition, 3.02
Data subject rights
  access, of
    dealing with requests, 9.07
    GDPR, and, 9.06, 26.53
    generally, 9.05
  automated decision taking/making processes
    generally, 9.16, 9.29–9.30, 26.66
    introduction, 9.14
    right to object, 9.15, 9.29
    social media, 29.22
  case law, 9.32
  choice of jurisdiction and court, 9.24
  compensation
    generally, 9.25
    social media, 29.20
  complaints to ICO/SA
    choice of jurisdiction and court, 9.24
    compensation, 9.25
    court remedies, 9.22
    different SAs, 9.23
    generally, 9.20
    organisational privacy groups, 9.21
    penalties, 9.26
    portability, 9.28
    sanctions, 9.27
  conclusion, 9.31
  confirmation, 26.56
  data disclosure, 9.18
  data portability, 9.13, 9.28, 26.63
  data processing
    direct marketing, for, 29.19
    likely to cause damage or distress, 29.18
  data protection principles, and
    generally, 9.02
    introduction, 5.04
  dealing with requests, 9.07
  direct marketing
    generally, 9.01
    preventing data processing, 29.19
  directly obtained data, 26.54
  disclosure of data, 9.18
  DPA
    access, 9.05–9.07
    data disclosure, 9.18
    dealing with requests, 9.07
    introduction, 9.01
    jurisdiction, 9.19
    representation of data subjects, 9.21
  employees
    data protection rights see Employee data protection rights
    monitoring, 17.23

Data subject rights – contd
  erasure
    generally, 9.10, 26.60
    introduction, 9.08
    notification, 9.12
    social media, 29.21
  generally, 9.03
  indirectly obtained data, 26.55
  information as to third country safeguards, 26.57
  introduction, 9.01, 26.51
  jurisdiction, 9.19
  notification of rectification, erasure or restriction, 9.12, 26.62
  objections
    individual decision making, to, 9.15
    processing and profiling, to, 9.29, 26.65
  organisational privacy groups, 9.21
  overview, 26.06
  penalties, 9.26
  portability, 9.13, 9.28, 26.63
  preventing data processing
    direct marketing, for, 29.19
    likely to cause damage or distress, 29.18
  prior information, 26.54–26.55
  profiling, 9.16, 9.29–9.30, 26.66
  recipients, 9.04–9.05
  rectification
    generally, 9.09, 26.58–26.59
    introduction, 9.08
    notification, 9.12
    social media, 29.21
  restriction of processing
    generally, 9.11, 26.61
    introduction, 9.08
    notification, 9.12
  right to be forgotten
    generally, 9.10, 26.60
    introduction, 9.01
    social media, 29.33
  sanctions, 9.27
  social media
    access, 29.27–29.30
    automated decision taking/making processes, 29.22
    automated individual decisions, 29.37–29.39
    blocking, 29.21
    children, 29.26
    communication of data breach to data subject, 29.40
    confirmation, 29.30
    data access, 29.27
    data portability, 29.36
    data protection impact assessment, 29.43
    definitions, 29.23
    destruction, 29.21
    directly obtained data, 29.28
    erasure, 29.21, 29.31, 29.33
    generally, 29.13
    indirectly obtained data, 29.29
    notification of rectification, erasure or restriction, 29.35
    objection, 29.24, 29.38
    preventing data processing, 29.18–29.19
    prior consultation, 29.44
    prior information, 29.28–29.29
    privacy impact assessment, 29.43
    profiling, 29.24, 29.39
    rectification, 29.21, 29.31–29.32
    representatives of controllers not established in EU, 29.42
    restriction of processing, 29.34
    right to be forgotten, 29.33
    security of processing, 29.41
    transparency, 29.25
  third country safeguards, 26.57
  transparency, 26.52
  types, 26.51
Data subject’s consent
  see also Consent
  transfer of personal data, 21.05
Data transfers
  see Trans border data flows (TBDF)
Definitions
  conclusion, 3.07
  DPA
    general list, 3.02
    special personal data, 3.05
  GDPR
    general list, 3.03
    special personal data, 3.06
  general personal data, 3.04
  introduction, 3.01
  overview, 1.23
  personal data, 3.04
  special personal data, 3.05–3.06
Destruction
data subject, 29.40 director liability, 11.14
compensation, 29.20 social media, 29.21

Digital Charter Direct marketing – contd


online safety, 24.05 origins of name, 23.17
Direct marketers PECR 2003
offences, 11.06 civil sanctions, 23.15
Direct marketing generally, 23.11
automated calling systems, 23.03 ICO guidance, 23.12
call opt-out registers, 23.16 offences, 23.13
case law, 23.20 prior consent, 23.05
civil sanctions, 23.15 related issues, 23.19
concealing identity Telephone Preference Service (TPS)
electronic mail, by, 22.07 generally, 23.16
generally, 23.08 illegal marketing call breaches,
conclusion, 23.21 23.07
deceptive emails, 23.08 transparency of identity
default position, 23.04 electronic mail, by, 22.07
electronic mail, by generally, 23.08
concealing identity, 22.07 unsolicited communications, 23.03
‘electronic mail’, 23.12 Director liability
existing customers’ email, 22.05 offences under DPA, 11.14
general rule, 22.04 Directories
opt-out option, 22.08 ePrivacy Directive, 22.22
opt-out registers, 22.06 Discovery
organisations, and, 22.09–22.21 generally, 33.06
transparency of identity, 22.07 social media, 29.12
existing customers’ email Disposal of hardware
electronic mail, by, 22.05 security of personal data, 12.25
generally, 23.06 Drones
fax opt-out registers, 23.16 generally, 33.12
Fax Preference Service (FPS), 23.16
GDPR, 23.01
generally, 23.02–23.20 E
ICO, and EDPB
monetary penalties, 23.14 see European Data Protection
PECR guidance, 23.12 Board (EDPB)
implementing regulations Electronic communications
civil sanctions, 23.15 direct marketing
generally, 23.11 see also Electronic direct
ICO guidance, 23.12 marketing
offences, 23.13 conclusion, 23.21
international approaches, 23.18 generally, 23.02–23.20
introduction, 23.01 introduction, 23.01
monetary penalties, 23.14 ePrivacy
offences, 23.13 see also ePrivacy Directive (ePD)
online behavioural advertising, and, 23.19 2002/58
opt-out option background, 22.02
calls and fax, 23.16 conclusion, 22.22
electronic mail, by, 22.08 ePD, 22.03–22.21
generally, 23.09 introduction, 22.01
opt-out registers introduction, 22.01
electronic mail, by, 22.06 Electronic direct marketing
generally, 23.07 automated calling systems, 23.03
organisations, and call opt-out registers, 23.16
electronic mail, by, 22.09–22.21 case law, 23.20
generally, 23.10 civil sanctions, 23.15

Electronic direct marketing – contd Electronic direct marketing – contd


concealing identity Telephone Preference Service (TPS)
electronic mail, by, 22.07 generally, 23.16
generally, 23.08 illegal marketing call breaches,
conclusion, 23.21 23.07
deceptive emails, 23.08 transparency of identity
default position, 23.04 electronic mail, by, 22.07
electronic mail, by generally, 23.08
concealing identity, 22.07 unsolicited communications, 23.03
‘electronic mail’, 23.12 Electronic evidence
existing customers’ email, 22.05 generally, 33.06
general rule, 22.04 social media, 29.12
opt-out option, 22.08 ‘Electronic mail’
opt-out registers, 22.06 ePrivacy Directive, 22.11
organisations, and, 22.09–22.21 Email misuse
transparency of identity, 22.07 entering into contracts, 17.04
existing customers’ email generally, 17.03
electronic mail, by, 22.05 Employee appraisal
generally, 23.06 security of personal data, 12.18
fax opt-out registers, 23.16 Employee considerations
Fax Preference Service (FPS), 23.16 see also Employee data protection
GDPR, 23.01 rights
generally, 23.02–23.20 conclusion, 16.15
ICO, and data breach
monetary penalties, 23.14 employee data organisations,
PECR guidance, 23.12 16.13
implementing regulations generally, 27.05
civil sanctions, 23.15 introduction, 16.11
generally, 23.11 notification, 16.12
ICO guidance, 23.12 data protection policy, 16.04
offences, 23.13 device usage policy, 16.06
international approaches, 23.18 employment contracts, 16.02
introduction, 23.01 enforceability, 16.10
monetary penalties, 23.14 evidence, 16.09
offences, 23.13 Internet usage policy, 16.05
online behavioural advertising, and, 23.19 introduction, 16.01
opt-out option location, 16.14
calls and fax, 23.16 mobile usage policy, 16.06
electronic mail, by, 22.08 notification of data breach, 16.12
generally, 23.09 policies
opt-out registers data protection, 16.04
electronic mail, by, 22.06 device usage, 16.06
generally, 23.07 enforceability, 16.10
organisations, and Internet usage, 16.05
electronic mail, by, 22.09–22.21 introduction, 16.03
generally, 23.10 mobile usage, 16.06
origins of name, 23.17 vehicle use, 16.07
PECR 2003 social media, 29.11
civil sanctions, 23.15 transfers of undertaking, 16.08
generally, 23.11 use of employee data organisations,
ICO guidance, 23.12 16.13
offences, 23.13 vehicle use policy, 16.07
prior consent, 23.05 Employee data protection rights
related issues, 23.19 access, 14.21, 15.03

Employee data protection rights – contd Employee monitoring – contd


automated decision taking human rights, 17.15
introduction, 15.09 Internet misuse
object, to, 15.10 entering into contracts, 17.04
profiling, 15.11 generally, 17.03
conclusion, 15.13 ILO Code, 17.17
data portability, 15.08 introduction, 17.01
erasure issues arising, 17.02
generally, 15.05 misuse of electronic communications
notification obligation, 15.06 entering into contracts, 17.04
GDPR, under generally, 17.03
access, 15.03 offline abuse, 17.08
automated decision taking, online abuse, 17.07
15.09–15.11 policies, corporate communications,
data portability, 15.08 17.11–17.13
erasure, 15.05 processing compliance rules, 17.21
express employee provision, 15.12 purpose, 17.14
introduction, 15.02 risk assessment
object, to, 15.10 generally, 17.10
rectification, 15.04 policies, 17.11–17.13
restriction of processing, Romanian/ECHR case, 17.24
15.06–15.07 WP29 Opinion, 17.19
to be forgotten, 15.05 Employee personal data
introduction, 15.01 access requests, right of employees,
list, 15.02 14.21, 15.03
object, to, 15.10 ‘bring your own devices’ (BYOD),
profiling, 15.11 14.01
rectification communications, 14.18
generally, 15.04 compliance
notification obligation, 15.06 accountability of controller, 14.06
restriction of processing data protection principles, and,
generally, 15.07 14.06
notification obligation, 15.06 responsible person, 14.14
summary, 15.02 written, 14.12
to be forgotten, 15.05 conclusion, 14.22
Employee monitoring consent
application of data protection regime, generally, 14.11
17.16 ordinary data legitimate processing,
child pornography, 17.09 14.08
conclusion, 17.25 sensitive data legitimate processing,
contracts of employment, and, 17.20 14.11
corporate communications usage policy coverage, 14.05
content, 17.12 data protection officers (DPO),
generally, 17.11 14.03
key issues, 17.13 data protection principles, 14.06
data protection, and, 17.14 employee handbooks, 14.17
data subjects rights, 17.23 employees covered, 14.05
EDPB Opinion, 17.18 employment contracts, 14.15
email misuse GDPR, 14.02
entering into contracts, 17.04 HR Department, 14.19
generally, 17.03 ICO Codes, 14.13
equality, 17.05 Internet, 14.18
guidelines, 17.22 Intranet, 14.18
harassment, 17.06 introduction, 14.01

Employee personal data – contd Enforcement for non-compliance – contd


inward-facing issues, 14.04 assessment notices
legitimate processing generally, 20.04
ordinary personal data, 14.07–14.08 limitations, 20.05
sensitive personal data, 14.09–14.12 restrictions, 20.05
notices, 14.18 audit request, 20.08
ordinary data legitimate processing breaches of rules, and
consent clauses, 14.08 data processing activities, 11.03
generally, 14.07 generally, 11.02
policies, 14.12 other consequences, 11.04
processing civil sanctions
access requests, right of employees, director liability and offences, 11.14
14.21 generally, 11.12
communications, 14.18 Morrisons case, 11.13
compliance policies, 14.12 processor offences, 11.15
conclusion, 14.22 compensation right, 11.09, 11.22
consent, 14.11 complaints against SA, 11.17
data protection principles, 14.06 conclusion, 11.30
employee handbooks, 14.17 conditions for imposing fines
employment contracts, 14.15 GDPR, under, 11.10, 11.24
GDPR, 14.02 generally, 20.19
HR Department, 14.19 criminal offences
ICO Codes, 14.13 breaches of rules, and, 11.02–11.04
Internet, 14.18 data processing activities, and, 11.03
Intranet, 14.18 direct marketers, by, 11.06
introduction, 14.01 director liability, 11.14
inward-facing issues, 14.04 introduction, 11.01
legitimate processing, 14.07–14.12 other consequences of breach, and,
notices, 14.18 11.04
persons covered, 14.05 processors, by, 11.15
policies, 14.12 unlawful obtaining or procuring
responsible person for compliance, personal data, 11.05
14.14 data processing activities, and, 11.03
security, 14.20 de-identified personal data, 20.13
training, 14.16 director liability, 11.14
responsible person for compliance, enforcement notices
14.14 generally, 20.02
rights rectification and erasure, 20.03
access requests, 14.21, 15.03 entry powers, 20.07
conclusion, 15.13 failure to comply with notices, 20.11
GDPR, under, 15.03–15.12 fines, 11.25
introduction, 15.01 GDPR, under
list, 15.02 compensation right, 11.09, 11.22
security, 14.20 complaints against SA, 11.17
sensitive data legitimate processing conditions for imposing
compliance, 14.12 administrative fines, 11.10,
consent clauses, 14.11 11.23
generally, 14.09 fines, 11.25
policies, 14.12 generally, 11.08
social media, 29.11 introduction, 11.01
training, 14.16 judicial remedy against controller or
written policies, 14.12 processor, 11.19
Enforcement for non-compliance judicial remedy against SA, 11.18
see also Enforcement powers level of remedy and fines, 11.25

Enforcement for non-compliance – contd Enforcement for non-compliance – contd


GDPR, under – contd powers – contd
liability for damage, 11.09, 11.22 information orders, 20.10
penalties, 11.11, 11.24 inspection, 20.07
remedies, 11.16–11.25 introduction, 20.01
representation of data subjects, 11.20 monetary penalties, 20.15
ICO, and notices, 20.02–20.11
case penalties, 11.29 penalties, 20.20
financial penalties, 11.27 powers of SAs, 20.18
functions, 11.26 production of records, 20.16
inspections, 11.28 re-identification, 20.13–20.14
monetary penalties, 11.07 tasks of SAs, 20.17
powers, 11.26 unlawful obtaining of personal data,
information notices 20.12
generally, 20.09 processors, 11.15
special, 20.10 production of records, 20.16
information orders, 20.10 re-identification
inspection powers, 20.07 de-identified personal data, 20.13
introduction, 11.01 testing, 20.14
judicial remedy representation of data subjects, 11.20
controller or processor, against, Supervisory Authorities
11.19 functions, 20.17
SA, against, 11.18 generally, 20.18
level of remedy and fines, 11.25 judicial remedy against, 11.18
liability for damage, 11.09, 11.22 tasks, 20.17
monetary penalties, 20.15 testing, 20.14
Morrisons case, 11.13 unlawful obtaining of personal data,
notices 20.12
assessment notices, 20.04–20.06 unwanted marketing communications,
failure to comply, 20.11 11.01
generally, 20.02–20.03 Enforcement notices
information notices, 20.09–20.10 erasure, 20.03
offences failure to comply, 20.11
generally, 11.02–11.06 generally, 20.02
introduction, 11.01 rectification, 20.03
organisations, by, 11.01 Enforcement powers
penalties assessment notices
case, 11.29 destroying or falsifying information
GDPR, under, 11.11, 11.24 and documents, 20.04
financial, 11.27 generally, 20.04
ICO, 11.07, 11.27, 11.29, 20.15 limitations, 20.05
monetary, 11.07, 20.15 restrictions, 20.05
powers audit request, 20.08
assessment notices, 20.04–20.06 conclusion, 20.21
audit request, 20.08 conditions for imposing fines, 20.19
conclusion, 20.21 enforcement notices
conditions for imposing fines, 20.19 failure to comply, 20.11
enforcement notices, 20.02–20.03 generally, 20.02
entry, 20.07 rectification and erasure, 20.03
failure to comply with notices, entry powers, 20.07
20.11 failure to comply with notices, 20.11
functions of SAs, 20.17 functions of SAs, 20.17
GDPR, and, 20.17–20.20 GDPR, and, 20.17–20.20
information notices, 20.09 ICO penalties, 20.15

Enforcement powers – contd ePrivacy Directive (ePD)


information notices 2002/58 – contd
failure to comply, 20.11 existing customers’ email, 22.05
generally, 20.09 fax machines, 22.04
special, 20.10 interpretation, 22.11
information orders, 20.10 introduction, 22.01–22.02
inspection powers, 20.07 location data
introduction, 20.01 definition, 22.11
monetary penalties, 20.15 generally, 22.19
notices marketing
assessment notices, 20.04–20.05 concealing identity, 22.07
enforcement notices, 20.02–20.03 existing customers’ email, 22.05
failure to comply, 20.11 general rule, 22.04
information notices, 20.09–20.10 opt-out option, 22.08
penalties, 20.15 opt-out registers, 22.06
production of records, 20.16 organisations, and, 22.09–22.21
re-identification, 20.13–20.14 transparency of identity, 22.07
special information orders non-itemised billing, 22.17
failure to comply, 20.11 opt-out
generally, 20.10 general rule, 22.08
supervisory authorities (SAs) registers, 22.06
administrative fines, 20.19 organisations, and, 22.09–22.21
functions and tasks, 20.17 public security, and, 22.10
powers, 20.18 purpose
tasks of SAs, 20.17 generally, 22.10
unlawful obtaining of personal data, introduction, 22.03
20.12 scope
Entry powers generally, 22.10
enforcement, 20.07 introduction, 22.03
ePrivacy Directive (ePD) 2002/58 marketing, 22.04–22.21
aim, 22.10 security of data, 22.14
automated calling systems, 22.04 services, 22.12
automatic call forwarding, 22.21 sources of law, and, 2.04
background, 22.02 state security, and, 22.10
‘call’, 22.11 structure, 22.03
calling line identification, 22.18 traffic data
‘communication’, 22.11 definition, 22.11
conclusion, 22.22 generally, 22.16
confidentiality, 22.15 transparency of identity, 22.07
connected line identification, 22.18 unsolicited communications
‘consent’, 22.11 generally, 22.13
content, 22.03 marketing, 22.04
definitions, 22.11 ‘user’, 22.11
directories, 22.22 ‘value added service’, 22.11
‘electronic mail’, 22.11 Equality
electronic mail marketing employee monitoring, 17.05
concealing identity, 22.07 Erasure
existing customers’ email, 22.05 enforcement notices, 20.03
general rule, 22.04 generally, 9.10, 26.60
opt-out option, 22.08 introduction, 9.08
opt-out registers, 22.06 notification, 9.12
organisations, and, 22.09–22.21 social media, 29.21, 29.31, 29.33
transparency of identity, 22.07 Ethics
excluded activities, 22.10 generally, 33.14

EU Charter of Fundamental Rights EU General Data Protection Regulation


generally, 26.05 (GDPR) – contd
EU Council Resolution data breach – contd
security of personal data, 12.22 notification to SA, 12.09–12.10,
EU Data Protection Directive (DPD) 26.38, 27.03
exemptions, 8.15 process for notification, 27.07
repeal, 26.12 security standards, 27.08
social media time limits for notification, 27.06
specific processing risks, 29.17 data protection by design, 19.04–19.05,
third party controllers, 29.16 26.34, 26.67
EU General Data Protection Regulation data protection impact assessment
(GDPR) characteristics, 28.06
access right, 9.06 conclusion, 28.10
background, 4.02 contents, 12.14
Brexit, and, 1.13, 2.02–2.04, 4.10 elements, 28.05
certification, 26.81, 33.15 generally, 12.13, 26.69, 28.01–28.02
changes introduced, 26.08 ICO guidance, 26.40, 28.03
children introduction, 26.41
conditions for consent, 26.21 issues for consideration, 28.08
generally, 26.20 methodologies, 28.07
Codes of Conduct, 26.80, 33.15 prior consultation, 12.15, 28.09
communication of data breach purpose, 28.04
to data subject, 12.11, 26.39, security of personal data, 12.16
26.71 steps and methodologies, 28.07
compensation, 26.78 data protection officers
conclusion, 26.92 conclusion, 26.91
conditions for consent contact details, 31.06
children, 26.21 expertise, 31.05
generally, 26.19 functions, 26.90, 31.03
introduction, 4.09 generally, 26.89, 31.02
consent groups, and, 31.04
conditions, 4.09 independence, 31.08
generally, 26.19 introduction, 26.72, 31.01
context, 26.14 inward-facing issues, 14.03
controllers overview, 26.88
co-operation with SA, 26.33 qualifications, 31.05
introduction, 26.26 reporting, 31.07
joint, 26.28 resources, 31.09
processing under authority, 26.30 role, 26.89, 31.02
records of activities, 26.31 summary, 31.10
representatives not established in tasks, 26.90, 31.03
EU, 26.32 data protection principles
responsibility, 26.27 generally, 26.17
co-operation with SA, 26.33 introduction, 4.07
criminal convictions data, 26.24 overview, 26.06
data breach data subject rights
communication to data subject, access, 9.06, 26.53
12.11, 26.39, 26.71, 27.04 automated individual decisions, 9.16,
conclusion, 27.10 9.29–9.30, 26.66
context, 27.02 choice of jurisdiction and court, 9.24
definition, 27.01 compensation, 9.25
employees, and, 27.05 complaints to ICO/SA, 9.20–9.27
incident response, 27.09 confirmation, 26.56
introduction, 27.01 data portability, 9.13, 9.28, 26.63

EU General Data Protection Regulation EU General Data Protection Regulation


(GDPR) – contd (GDPR) – contd
data subject rights – contd employee data protection rights – contd
data protection principles, 9.02 express employee provision, 15.12
directly obtained data, 26.54 object, to, 15.10
erasure, 9.10, 26.60 object, to, 15.10
indirectly obtained data, 26.55 restriction of processing,
information as to third country restriction of processing,
safeguards, 26.57 15.06–15.07
introduction, 9.01, 26.51 to be forgotten, 15.05
jurisdiction, 9.19 enforcement for non-compliance
notification of rectification, erasure compensation right, 11.09, 11.22
or restriction, 9.12, 26.62 complaints against SA, 11.17
objection to individual decision conditions for imposing
making, 9.15 administrative fines, 11.10,
objection to processing and profiling, 11.23
9.29, 26.65 fines, 11.25
overview, 26.06 generally, 11.08
penalties, 9.26 introduction, 11.01
portability, 9.13, 9.28, 26.63 judicial remedy against controller
prior information, 26.54–26.55 or processor, 11.19
profiling, and, 9.16, 9.29–9.30, 26.66 judicial remedy against SA, 11.18
rectification, 9.09, 26.58–26.59 level of remedy and fines, 11.25
restriction of processing, 9.11, 26.61 liability for damage, 11.09, 11.22
sanctions, 9.27 penalties, 11.11, 11.24
social media see Social media remedies, 11.16–11.25
to be forgotten, 26.60 representation of data subjects, 11.20
transparency, 26.52 enforcement powers, 20.17–20.20
types, 26.51 enhanced provisions and changes, 26.07
data transfers erasure, 26.35
adequacy decision, 21.09, 26.46 European Data Protection Board, 26.13
appropriate safeguards, 21.10, 26.47 exemptions
binding corporate rules, 21.11, 26.48 conclusion, 8.05
derogations, 21.13, 21.15, 26.50 generally, 8.04
general principle, 21.08, 26.45 introduction, 8.01
generally, 21.08, 26.44 formal nature, 26.02
international cooperation, 21.14 freedom of expression, and, 26.83
not authorised by EU law, 21.12, fundamental rights, and, 26.05
26.49 generally, 26.01, 33.02
principle, 26.45 health research, 26.23
restrictions, 21.15, 26.50 impact assessment
subject to appropriate safeguards, characteristics, 28.06
21.10, 26.47 conclusion, 28.10
definitions contents, 12.14
general list, 3.03 elements, 28.05
generally, 26.09 generally, 12.13, 26.69, 28.01–28.02
special personal data, 3.06 ICO guidance, 26.40, 28.03
direct effect, 26.02 introduction, 26.41
employee data protection rights issues for consideration, 28.08
access, 15.03 methodologies, 28.07
automated decision taking, prior consultation, 12.15, 28.09
15.09–15.11 purpose, 28.04
data portability, 15.08 security of personal data, 12.16
erasure, 15.05 steps and methodologies, 28.07

EU General Data Protection Regulation EU General Data Protection Regulation


(GDPR) – contd (GDPR) – contd
importance, 26.04 prior information
innovations, 26.06 directly obtained data, 7.03
joint controllers, 26.28 generally, 7.02
judicial remedy indirectly obtained data, 7.04
controller/processor, against, 26.76 introduction, 7.01
SA, against, 26.75 outward-facing organisational
legitimate processing conditions obligations, 18.07
generally, 26.18 social media, 29.13, 29.28–29.29
introduction, 4.08 privacy by design, 19.04–19.05, 26.34,
liability, 26.78 26.67
lodging complaints with SA, 26.74 privacy impact assessments
material scope, 26.14 characteristics, 28.06
objectives, 26.14 conclusion, 28.10
overview, 1.01 contents, 12.14
penalties for non-compliance elements, 28.05
compensation right, 11.09, 11.22 generally, 12.13, 26.69,
complaints against SA, 11.17 28.01–28.02
conditions for imposing ICO guidance, 26.40, 28.03
administrative fines, 11.10, introduction, 26.41
11.23 issues for consideration, 28.08
fines, 11.25 methodologies, 28.07
generally, 11.08 prior consultation, 12.15, 28.09
introduction, 11.01 purpose, 28.04
judicial remedy against controller security of personal data, 12.16
or processor, 11.19 steps and methodologies, 28.07
judicial remedy against SA, 11.18 processing
level of remedy and fines, 11.25 authority, under, 26.30
liability for damage, 11.09, 11.22 children, 26.20–26.21
penalties, 11.11, 11.24 conditions, 4.08, 26.18–26.19
remedies, 11.16–11.25 consent, 4.09, 26.19
representation of data subjects, controller authority, under, 26.30
11.20 criminal convictions data, 26.24
personal data breach derogations, 26.85
communication to data subject, direct marketing, for, 29.19
12.11, 26.39, 26.71, 27.04 employment context, in, 26.84
conclusion, 27.10 freedom of expression, and,
context, 27.02 26.83
definition, 27.01 health research, 26.23
employees, and, 27.05 introduction, 26.16
incident response, 27.09 lawfulness, 26.18
introduction, 27.01 legitimate conditions, 4.08, 26.18
notification to SA, 12.09–12.10, not requiring identification, 26.25
26.38, 27.03 likely to cause damage or distress,
process for notification, 27.07 29.18
security standards, 27.08 offences data, 26.24
time limits for notification, 27.06 pre-conditions, 7.01–7.04
principles principles, 4.07, 26.17
generally, 26.17 processor authority, under, 26.30
introduction, 4.07 records of activities, 26.31
overview, 26.06 safeguards, 26.85
prior consultation, 12.15, 26.42, 26.70, special categories of personal data,
28.09 26.22

EU General Data Protection Regulation EU General Data Protection Regulation


(GDPR) – contd (GDPR) – contd
processors territorial scope, 26.14
co-operation with SA, 26.33 time limits for compliance, 10.01
generally, 26.29 transfer of personal data
introduction, 26.26 adequacy decision, 21.09, 26.46
processing under authority, 26.30 appropriate safeguards, 21.10,
records of activities, 26.31 26.47
representatives not established in binding corporate rules, 21.11,
EU, 26.32 26.48
recitals, 4.05, 26.10 derogations, 21.13, 21.15, 26.50
records of processing activities, 26.31 general principle, 21.08, 26.45
rectification, 26.35 generally, 21.08, 26.44
remedies international cooperation, 21.14
compensation, 26.78 not authorised by EU law, 21.12,
introduction, 26.73 26.49
judicial remedy against principle, 26.45
controller/processor, 26.76 restrictions, 21.15, 26.50
judicial remedy against SA, 26.75 subject to appropriate safeguards,
liability, 26.78 21.10, 26.47
lodging complaints with SA, 26.74 WP29, and, 26.13
representation of data subjects, 26.77 EU law
representation of data subjects, 26.77 sources of law, 2.04
representatives not established in EU, European Commission
26.32 Brexit and ‘Notice to Stakeholders’,
review policy, 26.03 24.18
right to be forgotten European Data Protection Board
employees, 26.17 (EDPB)
erasure, 26.60 adoption/endorsement of WP29, 9.16
generally, 26.07 data protection by design/privacy by
social media, 29.33 design, 19.09
scope, 4.06, 26.14 employee monitoring, 17.18
secrecy, 26.86 generally, 26.13
security of personal data guidance
communication of data breach to Brexit, 24.19
data subject, 12.11, 26.39, Covid-19 health research, 26.23
26.71 data protection by design and by
generally, 12.07, 26.36 default, 26.67
impact assessment, 12.12–12.14, drones, 33.12
26.41, 26.69, 28.01–28.10 right to be forgotten, 26.60
notification of data breach, video devices, 33.10
12.09–12.10, 26.38 security of personal data
prior consultation, 12.15, 26.42, data protection impact assessment,
26.70, 28.09 12.13
processing, of, 12.08, 26.37 generally, 12.06
social media social media, 29.14
consent, 29.14 sources of law, 2.10
data subject rights see Data subject European Data Protection Supervisor
rights (EDPS)
sources of law, and, 2.04 Brexit guidance, 24.17
special categories of personal data, sources of law, 2.11
26.22 Evidence
structure, 4.05 generally, 33.06
subject matter, 26.14 social media, 29.12

Exemptions General Data Protection Regulation


conclusion, 8.05 (GDPR) – contd
generally, 8.02–8.03 controllers – contd
introduction, 8.01 records of activities, 26.31
Existing customers’ email representatives not established
electronic mail, by, 22.05 in EU, 26.32
generally, 23.06 responsibility, 26.27
co-operation with SA, 26.33
criminal convictions data, 26.24
F data breach
Facebook communication to data subject,
Schrems, 32.1 12.11, 26.39, 26.71, 27.04
Fairness conclusion, 27.10
overview, 1.19 context, 27.02
Fax machines definition, 27.01
ePrivacy Directive, 22.04 employees, and, 27.05
Fax Preference Service (FPS) incident response, 27.09
direct marketing, 23.16 introduction, 27.01
Filing system notification to SA, 12.09–12.10,
DPD, 3.02 26.38, 27.03
GDPR, 3.03 process for notification, 27.07
Freedom of expression security standards, 27.08
GDPR, and, 26.83 time limits for notification, 27.06
data protection by design, 19.04–19.05,
26.34, 26.67
G data protection impact assessment
General Data Protection Regulation characteristics, 28.06
(GDPR) conclusion, 28.10
access right, 9.06 contents, 12.14
background, 4.02 elements, 28.05
Brexit, and, 1.13, 2.02–2.04, 4.10 generally, 12.13, 26.69, 28.01–28.02
certification, 26.81, 33.15 ICO guidance, 26.40, 28.03
changes introduced, 26.08 introduction, 26.41
children issues for consideration, 28.08
conditions for consent, 26.21 methodologies, 28.07
generally, 26.20 prior consultation, 12.15, 28.09
Codes of Conduct, 26.80, 33.15 purpose, 28.04
communication of data breach to data security of personal data, 12.16
subject, 12.11, 26.39, 26.71 steps and methodologies, 28.07
compensation, 26.78 data protection officers
conclusion, 26.92 conclusion, 26.91
conditions for consent contact details, 31.06
children, 26.21 expertise, 31.05
generally, 26.19 functions, 26.90, 31.03
introduction, 4.09 generally, 26.89, 31.02
consent groups, and, 31.04
conditions, 4.09 independence, 31.08
generally, 26.19 introduction, 26.72, 31.01
context, 26.14 inward-facing issues, 14.03
controllers overview, 26.88
co-operation with SA, 26.33 qualifications, 31.05
introduction, 26.26 reporting, 31.07
joint, 26.28 resources, 31.09
processing under authority, 26.30 role, 26.89, 31.02

General Data Protection Regulation General Data Protection Regulation


(GDPR) – contd (GDPR) – contd
data protection officers – contd data transfers – contd
summary, 31.10 principle, 26.45
tasks, 26.90, 31.03 restrictions, 21.15, 26.50
data protection principles subject to appropriate safeguards,
generally, 26.17 21.10, 26.47
introduction, 4.07 definitions
overview, 26.06 general list, 3.03
data subject rights generally, 26.09
access, 9.06, 26.53 special personal data, 3.06
automated individual decisions, 9.16, direct effect, 26.02
9.29–9.30, 26.66 employee data protection rights
choice of jurisdiction and court, 9.24 access, 15.03
compensation, 9.25 automated decision taking,
complaints to ICO/SA, 9.20–9.27 15.09–15.11
confirmation, 26.56 data portability, 15.08
data portability, 9.13, 9.28, 26.63 erasure, 15.05
data protection principles, 9.02 express employee provision, 15.12
directly obtained data, 26.54 object, to, 15.10
erasure, 9.10, 26.60 object, to, 15.10
indirectly obtained data, 26.55 restriction of processing,
information as to third country restriction of processing,
safeguards, 26.57 15.06–15.07
introduction, 9.01, 26.51 to be forgotten, 15.05
jurisdiction, 9.19 enforcement for non-compliance
notification of rectification, erasure compensation right, 11.09, 11.22
or restriction, 9.12, 26.62 complaints against SA, 11.17
objection to individual decision conditions for imposing
making, 9.15 administrative fines, 11.10,
objection to processing and profiling, 11.23
9.29, 26.65 fines, 11.25
overview, 26.06 generally, 11.08
penalties, 9.26 introduction, 11.01
portability, 9.13, 9.28, 26.63 judicial remedy against controller
prior information, 26.54–26.55 or processor, 11.19
profiling, and, 9.16, 9.29–9.30, 26.66 judicial remedy against SA, 11.18
rectification, 9.09, 26.58–26.59 level of remedy and fines, 11.25
restriction of processing, 9.11, 26.61 liability for damage, 11.09, 11.22
sanctions, 9.27 penalties, 11.11, 11.24
to be forgotten, 26.60 remedies, 11.16–11.25
transparency, 26.52 representation of data
types, 26.51 subjects, 11.20
data transfers enforcement powers, 20.17–20.20
adequacy decision, 21.09, 26.46 enhanced provisions and changes, 26.07
appropriate safeguards, 21.10, 26.47 erasure, 26.35
binding corporate rules, 21.11, 26.48 European Data Protection Board, 26.13
derogations, 21.13, 21.15, 26.50 exemptions
Facebook, 32.1 conclusion, 8.05
general principle, 21.08, 26.45 generally, 8.04
generally, 21.08, 26.44 introduction, 8.01
international cooperation, 21.14 formal nature, 26.02
not authorised by EU law, 21.12, freedom of expression, and, 26.83
26.49 fundamental rights, and, 26.05

General Data Protection Regulation General Data Protection Regulation


(GDPR) – contd (GDPR) – contd
  generally, 26.01, 33.02
  health research, 26.23
  impact assessment
    characteristics, 28.06
    conclusion, 28.10
    contents, 12.14
    elements, 28.05
    generally, 12.13, 26.69, 28.01–28.02
    ICO guidance, 26.40, 28.03
    introduction, 26.41
    issues for consideration, 28.08
    methodologies, 28.07
    prior consultation, 12.15, 28.09
    purpose, 28.04
    security of personal data, 12.16
    steps and methodologies, 28.07
  importance, 26.04
  innovations, 26.06
  joint controllers, 26.28
  judicial remedy
    controller/processor, against, 26.76
    SA, against, 26.75
  legitimate processing conditions
    generally, 26.18
    introduction, 4.08
  liability, 26.78
  lodging complaints with SA, 26.74
  material scope, 26.14
  objectives, 26.14
  overview, 1.01
  penalties for non-compliance
    compensation right, 11.09, 11.22
    complaints against SA, 11.17
    conditions for imposing administrative fines, 11.10, 11.23
    fines, 11.25
    generally, 11.08
    introduction, 11.01
    judicial remedy against controller or processor, 11.19
    judicial remedy against SA, 11.18
    level of remedy and fines, 11.25
    liability for damage, 11.09, 11.22
    penalties, 11.11, 11.24
    remedies, 11.16–11.25
    representation of data subjects, 11.20
  personal data breach
    communication to data subject, 12.11, 26.39, 26.71, 27.04
    conclusion, 27.10
    context, 27.02
    definition, 27.01
    employees, and, 27.05
    incident response, 27.09
    introduction, 27.01
    notification to SA, 12.09–12.10, 26.38, 27.03
    process for notification, 27.07
    security standards, 27.08
    time limits for notification, 27.06
  principles
    generally, 26.17
    introduction, 4.07
    overview, 26.06
  prior consultation, 12.15, 26.42, 26.70, 28.09
  prior information
    directly obtained data, 7.03
    generally, 7.02
    indirectly obtained data, 7.04
    introduction, 7.01
    outward-facing organisational obligations, 18.07
    social media, 29.13, 29.28–29.29
  privacy by design, 19.04–19.05, 26.34, 26.67
  privacy impact assessments
    characteristics, 28.06
    conclusion, 28.10
    contents, 12.14
    elements, 28.05
    generally, 12.13, 26.69, 28.01–28.02
    ICO guidance, 26.40, 28.03
    introduction, 26.41
    issues for consideration, 28.08
    methodologies, 28.07
    prior consultation, 12.15, 28.09
    purpose, 28.04
    security of personal data, 12.16
    steps and methodologies, 28.07
  processing
    authority, under, 26.30
    children, 26.20–26.21
    conditions, 4.08, 26.18–26.19
    consent, 4.09, 26.19
    controller authority, under, 26.30
    criminal convictions data, 26.24
    derogations, 26.85
    direct marketing, for, 29.19
    employment context, in, 26.84
    freedom of expression, and, 26.83
    health research, 26.23

    introduction, 26.16
    lawfulness, 26.18
    legitimate conditions, 4.08, 26.18
    likely to cause damage or distress, 29.18
    not requiring identification, 26.25
    offences data, 26.24
    pre-conditions, 7.01–7.04
    principles, 4.07, 26.17
    processor authority, under, 26.30
    records of activities, 26.31
    safeguards, 26.85
    special categories of personal data, 26.22
  processors
    co-operation with SA, 26.33
    generally, 26.29
    introduction, 26.26
    processing under authority, 26.30
    records of activities, 26.31
    representatives not established in EU, 26.32
  recitals, 4.05, 26.10
  records of processing activities, 26.31
  rectification, 26.35
  remedies
    compensation, 26.78
    introduction, 26.73
    judicial remedy against controller/processor, 26.76
    judicial remedy against SA, 26.75
    liability, 26.78
    lodging complaints with SA, 26.74
    representation of data subjects, 26.77
  representation of data subjects, 26.77
  representatives not established in EU, 26.32
  review policy, 26.03
  right to be forgotten
    employees, 26.17
    erasure, 26.60
    generally, 26.07
  scope, 4.06, 26.14
  secrecy, 26.86
  security of personal data
    communication of data breach to data subject, 12.11, 26.39, 26.71
    generally, 12.07, 26.36
    impact assessment, 12.12–12.14, 26.41, 26.69, 28.01–28.10
    notification of data breach, 12.09–12.10, 26.38
    prior consultation, 12.15, 26.42, 26.70, 28.09
    processing, of, 12.08, 26.37
  social media
    consent, 29.14
    data subject rights see Data subject rights
  sources of law, and, 2.04
  special categories of personal data, 26.22
  structure, 4.05
  subject matter, 26.14
  territorial scope, 26.14
  time limits for compliance, 10.01
  transfer of personal data
    adequacy decision, 21.09, 26.46
    appropriate safeguards, 21.10, 26.47
    binding corporate rules, 21.11, 26.48
    derogations, 21.13, 21.15, 26.50
    general principle, 21.08, 26.45
    generally, 21.08, 26.44
    international cooperation, 21.14
    not authorised by EU law, 21.12, 26.49
    principle, 26.45
    restrictions, 21.15, 26.50
    subject to appropriate safeguards, 21.10, 26.47
  WP29, and, 26.13
Genetic data
  DPA, 3.02
  GDPR, 3.03
Genome data
  generally, 33.04
Google
  privacy by design, 19.10
Group of undertakings
  definition for social media purposes, 29.23

H
Harassment
  employee monitoring, 17.06
Hardware devices
  generally, 33.08
Health data
  generally, 33.03
Health research
  GDPR, and, 26.23

Hosting
  social media, 29.15
Human rights
  employee monitoring, 17.15

I
ICO
  see Information Commissioner's Office (ICO)
ILO Code
  employee monitoring, 17.17
Impact assessments
  characteristics, 28.06
  conclusion, 28.10
  contents, 12.14
  elements, 28.05
  generally, 12.13, 26.69, 28.01–28.02
  ICO guidance, 26.40, 28.03
  introduction, 12.12, 26.41
  issues for consideration, 28.08
  methodologies, 28.07
  prior consultation, 12.15, 28.09
  purpose, 28.04
  security of personal data, 12.16
  steps and methodologies, 28.07
Information Commissioner
  introduction, 1.17
Information Commissioner's Office (ICO)
  Brexit advice/guidance see Brexit
  complaints by data subjects
    choice of jurisdiction and court, 9.24
    compensation, 9.25
    court remedies, 9.22
    different SAs, 9.23
    generally, 9.20
    organisational privacy groups, 9.21
    penalties, 9.26
    portability, 9.28
    sanctions, 9.27
  data protection impact assessment guidance, 26.40, 28.03
  direct marketing
    monetary penalties, 23.14
    PECR guidance, 23.12
  enforcement for non-compliance
    case penalties, 11.29
    financial penalties, 11.27
    functions, 11.26
    inspections, 11.28
    monetary penalties, 11.07
    powers, 11.26
  enforcement powers
    see also Enforcement powers
    penalties, 20.15
  monetary penalties, 11.07, 20.15
  privacy by design
    introduction, 19.06
    Privacy by Design Report (2008), 19.07
    recommendations, 19.08
  security of personal data
    guidance, 12.23
    security breaches, 12.24
  social media, 29.01
  sources of law
    determinations, 2.07
    guides, 2.06
  time limits for compliance, 10.01–10.02
Information Society service
  definition, 3.03
Inspection powers
  enforcement, 20.07
International organisation
  definition, 3.03
Internet misuse
  entering into contracts, 17.04
  generally, 17.03
Internet of Things (IoT)
  generally, 33.09
Investigations
  generally, 33.06
  social media, 29.03
Inward-facing organisational obligations
  'bring your own devices' (BYOD), 14.01
  data protection officers (DPO), 14.03
  employee considerations
    conclusion, 16.15
    data breach, 16.11–16.13
    data protection policy, 16.04
    device usage policy, 16.06
    employment contracts, 16.02
    enforceability, 16.10
    evidence, 16.09
    Internet usage policy, 16.05
    introduction, 16.01
    mobile usage policy, 16.06
    notification of data breach, 16.12
    policies, 16.03–16.07
    transfers of undertaking, 16.08
    use of employee data organisations, 16.13

  employee data protection rights
    conclusion, 15.13
    GDPR, under, 15.03–15.12
    introduction, 15.01
    list, 15.02
  employee monitoring
    application of data protection regime, 17.16
    child pornography, 17.09
    conclusion, 17.25
    contracts of employment, and, 17.20
    corporate communications usage policy, 17.11–17.13
    data protection, and, 17.14
    data subjects rights, 17.23
    EDPB Opinion, 17.18
    email misuse, 17.03
    equality, 17.05
    guidelines, 17.22
    harassment, 17.06
    human rights, 17.15
    Internet misuse, 17.03
    ILO Code, 17.17
    introduction, 17.01
    issues arising, 17.02
    misuse of electronic communications, 17.03–17.04
    offline abuse, 17.08
    online abuse, 17.07
    policies, 17.11–17.13
    processing compliance rules, 17.21
    purpose, 17.14
    risk assessment, 17.10–17.11
    Romanian/ECHR case, 17.24
    WP29 Opinion, 17.19
  GDPR, 14.02
  introduction, 14.01
  issues, 14.04
  processing employee's personal data
    access requests, right of employees, 14.21
    communications, 14.18
    compliance policies, 14.12
    conclusion, 14.22
    consent, 14.11
    data protection principles, 14.06
    employee handbooks, 14.17
    employment contracts, 14.15
    GDPR, 14.02
    HR Department, 14.19
    ICO Codes, 14.13
    Internet, 14.18
    Intranet, 14.18
    introduction, 14.01
    inward-facing issues, 14.04
    legitimate processing, 14.07–14.12
    notices, 14.18
    persons covered, 14.05
    policies, 14.12
    responsible person for compliance, 14.14
    security, 14.20
    training, 14.16

J
Joint controllers
  GDPR, and, 26.28
  social media, 29.02
Journalism activities
  Leveson Inquiry, and, 30.02
Journals
  sources of law, 2.09
Judicial remedy
  controller/processor, against, 26.76
  SA, against, 26.75
Jurisdiction
  complaints by data subjects, 9.24
  data subject rights, 9.19

L
Lawful processing
  see Processing
Legal instruments
  history of data protection, 4.04
Legal journals
  sources of law, 2.09
Legal textbooks
  sources of law, 2.08
Legitimate processing
  general conditions, 6.02
  generally, 4.08, 26.18
  introduction, 6.01
  overview, 1.22
  special personal data conditions, 6.03
Leveson Inquiry
  conclusion, 30.07
  data breach sentencing, 30.05
  DPA, s 32
    comparison table, 30.06
    generally, 30.02

  introduction, 30.01
  journalism activities, 30.02
  Recommendations
    data breach, as to, 30.05
    ICO, to, 30.04
    MoJ, to, 30.03
  social media, 29.04
  sources of law, 2.14
Lisbon Treaty
  generally, 26.05
Litigation
  social media, 29.11
Location data
  definition, 22.11
  generally, 22.19
Lodging complaints with SA
  GDPR, and, 26.74

M
Main establishment
  definition
    generally, 3.03
    social media purposes, 29.23
Medical data
  generally, 33.03
Mere conduit
  social media, 29.15
Microsoft
  privacy by design, 19.10
Misuse of electronic communications
  entering into contracts, 17.04
  generally, 17.03
Monetary penalty
  generally, 20.15
Monitoring employees
  application of data protection regime, 17.16
  child pornography, 17.09
  conclusion, 17.25
  contracts of employment, and, 17.20
  corporate communications usage policy
    content, 17.12
    generally, 17.11
    key issues, 17.13
  data protection, and, 17.14
  data subjects rights, 17.23
  EDPB Opinion, 17.18
  email misuse
    entering into contracts, 17.04
    generally, 17.03
  equality, 17.05
  guidelines, 17.22
  harassment, 17.06
  human rights, 17.15
  Internet misuse
    entering into contracts, 17.04
    generally, 17.03
  ILO Code, 17.17
  introduction, 17.01
  issues arising, 17.02
  misuse of electronic communications
    entering into contracts, 17.04
    generally, 17.03
  offline abuse, 17.08
  online abuse, 17.07
  processing compliance rules, 17.21
  purpose, 17.14
  risk assessment
    generally, 17.10
    policies, 17.11–17.13
  Romanian/ECHR case, 17.24
  WP29 Opinion, 17.19
Multinationals
  privacy by design, 19.10

N
Non-compliance
  civil sanctions
    director liability, 11.14
    generally, 11.12
    Morrisons case, 11.13
    processors, 11.15
  conclusion, 11.30
  DPA, under, 11.12–11.15
  GDPR, under, 11.08–11.11, 11.16–11.25
  ICO, and, 11.07, 11.26–11.28, 11.29
  introduction, 11.01
Non-itemised billing
  ePrivacy Directive, 22.17
Notification of data breach
  contents, 12.10
  data subject, to, 12.11
  SA, to, 12.09
  time limits, 10.01–10.02

O
Offences
  breaches of rules, and, 11.02–11.04
  data processing activities, and, 11.03
  direct marketers, by, 11.06

  director liability, 11.14
  introduction, 11.01
  organisations, by, 11.01
  other consequences of breach, and, 11.04
  processors, by, 11.15
  unlawful obtaining or procuring personal data, 11.05
Offline abuse
  employee monitoring, 17.08
Off-site employees
  generally, 33.10
Online abuse
  employee monitoring, 17.07
  generally, 33.11
  social media, 29.10
Online behavioural advertising (OBA)
  direct marketing, 23.19
Online safety
  Digital Charter, 24.05
Opt-out
  calls and fax, 23.16
  ePrivacy Directive
    general rule, 22.08
    registers, 22.06
  generally, 23.09
  registers
    electronic mail, by, 22.06
    generally, 23.07
Organisational obligations
  'bring your own devices' (BYOD), 14.01
  data protection officers (DPO)
    inward-facing issues, 14.03
    outward-facing issues, 18.03
  employee considerations
    conclusion, 16.15
    data breach, 16.11–16.13
    data protection policy, 16.04
    device usage policy, 16.06
    employment contracts, 16.02
    enforceability, 16.10
    evidence, 16.09
    Internet usage policy, 16.05
    introduction, 16.01
    location, 16.14
    mobile usage policy, 16.06
    notification of data breach, 16.12
    policies, 16.03–16.07
    transfers of undertaking, 16.08
    use of employee data organisations, 16.13
    vehicle use policy, 16.07
  employee data protection rights
    conclusion, 15.13
    GDPR, under, 15.03–15.12
    introduction, 15.01
    list, 15.02
  employee monitoring
    application of data protection regime, 17.16
    child pornography, 17.09
    conclusion, 17.25
    contracts of employment, and, 17.20
    corporate communications usage policy, 17.11–17.13
    data protection, and, 17.14
    data subjects rights, 17.23
    EDPB Opinion, 17.18
    email misuse, 17.03
    equality, 17.05
    guidelines, 17.22
    harassment, 17.06
    human rights, 17.15
    Internet misuse, 17.03
    ILO Code, 17.17
    introduction, 17.01
    issues arising, 17.02
    misuse of electronic communications, 17.03–17.04
    offline abuse, 17.08
    online abuse, 17.07
    policies, 17.11–17.13
    processing compliance rules, 17.21
    purpose, 17.14
    risk assessment, 17.10–17.11
    Romanian/ECHR case, 17.24
    WP29 Opinion, 17.19
  introduction, 14.01
  inward-facing
    data protection officers, 14.03
    GDPR, 14.02
    introduction, 14.01
    issues, 14.04
  outward-facing
    compliance, 18.06–18.11
    conclusions, 18.15
    consequences of non-compliance, 18.13
    customers, prospects and users, 18.05–18.11
    data protection by design, 18.04
    data protection officers, 18.03
    data protection principles, 18.08
    direct marketing, 18.12
    DPA 2018 changes, 18.02

    introduction, 18.01
    legitimate processing conditions, 18.09–18.10
    prior information requirements, 18.07
    security requirements, 18.11
    types of relevant data, 18.05
    users vs customers, 18.14
  processing employee's personal data
    access requests, right of employees, 14.21
    communications, 14.18
    compliance policies, 14.12
    conclusion, 14.22
    consent, 14.11
    data protection principles, 14.06
    employee handbooks, 14.17
    employment contracts, 14.15
    GDPR, 14.02
    HR Department, 14.19
    ICO Codes, 14.13
    Internet, 14.18
    Intranet, 14.18
    introduction, 14.01
    inward-facing issues, 14.04
    legitimate processing, 14.07–14.12
    notices, 14.18
    persons covered, 14.05
    policies, 14.12
    responsible person for compliance, 14.14
    security, 14.20
    training, 14.16
Outsourcing
  data processors, and
    conclusion, 13.05
    engagement, 13.03
    introduction, 13.01
    processing personal data and security, 13.02
    reliance on third parties, 13.04
Outward-facing organisational obligations
  compliance
    consequences of non-compliance, 18.13
    data protection principles, 18.08
    generally, 18.06
    legitimate processing conditions, 18.09–18.10
    methods, 18.06
    prior information requirements, 18.07
    security requirements, 18.11
  conclusions, 18.15
  consequences of non-compliance, 18.13
  customers, prospects and users, 18.05
  data protection by design (DPbD)
    background, 19.02
    conclusion, 19.11
    EDPB, and, 19.09
    GDPR, and, 19.04–19.05
    ICO, and, 19.06–19.08
    introduction, 19.01
    multinationals, and, 19.10
    overview, 18.04
    principles, 19.03
  data protection officers, 18.03
  direct marketing, 18.12
  DPA 2018 changes, 18.02
  enforcement powers
    assessment notices, 20.04–20.05
    audit request, 20.08
    conclusion, 20.21
    conditions for imposing fines, 20.19
    enforcement notices, 20.02–20.03
    entry, 20.07
    failure to comply with notices, 20.11
    functions of SAs, 20.17
    GDPR, and, 20.17–20.20
    information notices, 20.09
    information orders, 20.10
    inspection, 20.07
    introduction, 20.01
    monetary penalties, 20.15
    notices, 20.02–20.11
    penalties, 20.20
    powers of SAs, 20.18
    production of records, 20.16
    special information orders, 20.10
    tasks of SAs, 20.17
    unlawful obtaining of personal data, 20.12
  electronically transmitted personal data
    background, 22.02
    conclusion, 22.22
    ePD, 22.03–22.21
    introduction, 22.01
  GDPR, 18.02
  introduction, 18.01
  legitimate processing conditions
    ordinary personal data, 18.09
    sensitive personal data, 18.10

  prior information requirements, 18.07
  privacy by design (PbD)
    background, 19.02
    conclusion, 19.11
    EDPB, and, 19.09
    GDPR, and, 19.04–19.05
    ICO, and, 19.06–19.08
    introduction, 19.01
    multinationals, and, 19.10
    overview, 18.04
    principles, 19.03
  security requirements, 18.11
  trans border data flows
    Brexit, 21.19
    checklist for compliance, 21.18
    conclusion, 21.20
    DPA, 21.03
    establishing application of the ban, 21.17
    GDPR, 21.08–21.14, 21.15
    introduction, 21.01
    issues arising, 21.16
    transfer ban, 21.02
    WP29, 21.07
  types of relevant data, 18.05
  users vs customers, 18.14

P
Penalties
  financial, 11.27
  GDPR, under, 11.11, 11.24
  generally, 20.20
  ICO, 11.27, 11.29, 20.15
  monetary, 11.07, 20.15
  processing, 11.03
  processors, 11.15
  spam, 23.14
Personal data
  breach see Personal data breach
  categories, 3.04–3.06
  definition
    DPD, 3.02
    GDPR, 3.03
  employees see Employee personal data
  legitimate processing
    general conditions, 6.02
    introduction, 6.01
    special personal data conditions, 6.03
  security see Security of personal data
  special personal data, 3.05–3.06
  transfers see Trans border data flows (TBDFs)
Personal data breach
  communication to data subject
    employees, and, 16.12
    generally, 12.11, 26.39, 26.71, 27.04
    process, 27.07
    time limits, 27.06
  conclusion, 27.10
  context, 27.02
  definition
    generally, 3.03, 27.01
    social media purposes, 29.23
  employees, and
    employee data organisations, 16.13
    generally, 27.05
    introduction, 16.11
    notification, 16.12
  incident response, 27.09
  introduction, 27.01
  notification to ICO, 12.24
  notification to SA
    contents, 12.10
    employees, and, 16.12
    generally, 12.09, 26.38, 27.03
    process, 27.07
    time limits, 27.06
  process for notification, 27.07
  security standards, 27.08
  time limits for notification, 27.06
Personally identifiable information (PII) data
  security see Security of personal data
Policies
  corporate communications
    content, 17.12
    generally, 17.11
    key issues, 17.13
Politics
  generally, 33.16
Portability
  generally, 9.13, 9.28, 26.63
Powers of entry and inspection
  generally, 20.07
Principles of data protection
  see Data protection principles
Prior consultation
  data protection impact assessments, 12.15, 26.42, 26.70, 28.09, 29.44

Prior information
  conclusion, 7.05
  directly obtained data
    generally, 7.03
    social media, 29.28
  generally, 7.02
  indirectly obtained data
    generally, 7.04
    social media, 29.29
  introduction, 7.01
  outward-facing organisational obligations, 18.07
  social media
    directly obtained data, 29.28
    generally, 29.13
    indirectly obtained data, 29.29
Privacy by design (PbD)
  background, 19.02
  cloud services, and, 19.11
  commercial adoption, 19.10
  conclusion, 19.11
  data protection by design and by default, 19.04–19.05
  EDPB, and, 19.09
  GDPR, and, 19.04–19.05
  generally, 1.04, 19.01
  Google, and, 19.10
  ICO guidance
    introduction, 19.06
    Privacy by Design Report (2008), 19.07
    recommendations, 19.08
  impact assessments, 19.08
  meaning, 19.02
  Microsoft, and, 19.10
  multinationals, and, 19.10
  overview, 18.04
  principles, 19.03
  security, and, 19.03
  transparency, and, 19.03
  WP29, and, 19.10
Privacy impact assessments (PIAs)
  characteristics, 28.06
  conclusion, 28.10
  elements, 28.05
  generally, 26.69, 28.01–28.02
  ICO guidance, 26.40, 28.03
  introduction, 26.41
  issues for consideration, 28.08
  methodologies, 28.07
  prior consultation, 28.09
  purpose, 28.04
  security of personal data
    contents, 12.14
    generally, 12.13
    introduction, 12.12
  steps and methodologies, 28.07
Privacy shield
  decision in Schrems, 32.3
  implications, 32.4
  Brexit, 32.5
Processing
  authority, under, 26.30
  children, 26.20–26.21
  conditions
    directly obtained data, 7.03
    generally, 7.02
    indirectly obtained data, 7.04
    introduction, 7.01
    outward-facing organisational obligations, 18.07
  consent
    generally, 26.19
    introduction, 4.09
  controller authority, under, 26.30
  criminal convictions data, 26.24
  definition
    DPD, 3.02
    GDPR, 3.03
  directly obtained data
    generally, 7.03
    social media, 29.28
  employee monitoring, 17.21
  employee's personal data
    access requests, right of employees, 14.21
    communications, 14.18
    compliance policies, 14.12
    conclusion, 14.22
    consent, 14.11
    data protection principles, 14.06
    employee handbooks, 14.17
    employment contracts, 14.15
    GDPR, 14.02, 26.84
    HR Department, 14.19
    ICO Codes, 14.13
    Internet, 14.18
    Intranet, 14.18
    introduction, 14.01
    inward-facing issues, 14.04
    legitimate processing, 14.07–14.12
    notices, 14.18

    persons covered, 14.05
    policies, 14.12
    responsible person for compliance, 14.14
    security, 14.20
    training, 14.16
  enforcement for non-compliance, 11.03
  freedom of expression, and, 26.83
  GDPR
    authority, under, 26.30
    children, 26.20–26.21
    conditions, 4.09, 26.18–26.19
    consent, 4.09, 26.19
    controller authority, under, 26.30
    criminal convictions data, 26.24
    definition, 3.03
    derogations, 26.85
    directly obtained data, 7.03, 29.28
    employment context, in, 26.84
    freedom of expression, and, 26.83
    health research, 26.23
    indirectly obtained data, 7.04, 29.29
    introduction, 26.16
    lawfulness, 26.18
    legitimate conditions, 4.08, 26.18
    not requiring identification, 26.25
    offences data, 26.24
    pre-conditions, 7.01–7.04
    principles, 26.17
    processor authority, under, 26.30
    records of activities, 26.31
    safeguards, 26.85
    special categories of personal data, 26.22
  general criteria, 1.20
  health research, 26.23
  indirectly obtained data
    generally, 7.04
    social media, 29.29
  lawfulness, 26.18
  legitimacy, 1.22
  legitimate processing
    general conditions, 6.02
    generally, 4.08, 26.18
    introduction, 6.01
    overview, 1.22
    special personal data conditions, 6.03
  not requiring identification, 26.25
  offences data, 26.24
  overview, 1.19
  penalties for non-compliance, 11.03
  pre-conditions
    directly obtained data, 7.03
    generally, 7.02
    indirectly obtained data, 7.04
    introduction, 7.01
    outward-facing organisational obligations, 18.07
  preventing processing
    direct marketing, for, 29.19
    likely to cause damage or distress, 29.18
  principles, 26.17
  prior information
    directly obtained data, 7.03
    generally, 7.02
    indirectly obtained data, 7.04
    introduction, 7.01
    outward-facing organisational obligations, 18.07
    social media, 29.13, 29.28–29.29
  processor authority, under, 26.30
  records of activities, 26.31
  safeguards, 26.85
  security of personal data, 12.08
  social media
    direct marketing, 29.19
    likely to cause damage or distress, 29.18
    security of processing, 29.41
  special categories of personal data, 26.22
Processors
  definition
    DPD, 3.02
    GDPR, 3.03
  enforcement for non-compliance, 11.15
  GDPR, and
    co-operation with SA, 26.33
    definition, 3.03
    generally, 26.29
    introduction, 26.26
    processing under authority, 26.30
    records of activities, 26.31
    representatives not established in EU, 26.32
  outsourcing, and
    conclusion, 13.05
    engagement, 13.03
    introduction, 13.01

    processing personal data and security, 13.02
    reliance on third parties, 13.04
  penalties for non-compliance, 11.15
Production of records
  enforcement powers, 20.16
Profiling
  data subjects' rights, 9.16, 9.29–9.30, 26.66
  definition, 3.03
Proportionality
  overview, 1.19
Pseudonymisation
  definition, 3.03
Public security
  ePrivacy Directive, 22.10

R
Recipient
  DPD, 3.02
  GDPR, 3.03
Records
  processing activities, 26.31
Rectification
  generally, 9.09, 26.35, 26.58–26.59
  introduction, 9.08
  notification, 9.12
  social media, 29.21, 29.31–29.32
Reference material
  sources of law, 2.19
Re-identification
  de-identified personal data, 20.13
  testing, 20.14
Remedies
  compensation, 26.78
  introduction, 26.73
  judicial remedy against controller/processor, 26.76
  judicial remedy against SA, 26.75
  liability, 26.78
  lodging complaints with SA, 26.74
  representation of data subjects, 26.77
Representation of data subjects
  GDPR, and, 26.77
Representative
  definition
    generally, 3.03
    social media purposes, 29.23
Representatives not established in EU
  see Controllers
Restriction of processing
  definition, 3.03
  generally, 9.11, 26.61
  introduction, 9.08
  notification, 9.12
Right to be forgotten (Rtf)
  employees, 26.17
  generally, 9.10, 26.07, 26.60
  introduction, 9.01
  social media, 29.33
Risk assessments
  employee monitoring
    generally, 17.10
    policies, 17.11–17.13
  overview, 1.19

S
Scanners
  generally, 33.05
Schrems
  privacy shield, 32.3–32.5
    decision, 32.3
    implications, 32.4
    Brexit, 32.5
  standard contract clauses, 32.6
    decision, 32.6
    implications, 32.7
    Brexit, 32.8
Secondary legislation
  sources of law, 2.03
Secrecy
  GDPR, and, 26.86
Security
  see also Security of personal data
  data breach, and
    communication to data subject, 12.11, 26.39, 26.71, 27.04
    conclusion, 27.10
    context, 27.02
    definition, 27.01
    employees, and, 27.05
    incident response, 27.09
    introduction, 27.01
    notification to SA, 12.09–12.10, 26.38, 27.03
    process for notification, 27.07
    scale and frequency of breaches, 1.10–1.12
    security standards, 27.08
    time limits for notification, 27.06
  ePrivacy Directive, 22.14

  overview, 1.19
  privacy by design, 19.03
Security of personal data
  appraising employees, 12.18
  appropriate measures
    ensuring, 12.03
    generally, 12.02
  communication of data breach to data subject
    employees, and, 16.12
    generally, 12.11, 26.39, 26.71, 27.04
    process, 27.07
    time limits, 27.06
  conclusion, 12.26
  data breach, and
    communication to data subject, 12.11, 26.39, 26.71, 27.04
    conclusion, 27.10
    context, 27.02
    definition, 27.01
    employees, and, 27.05
    incident response, 27.09
    introduction, 27.01
    notification to ICO, 12.24
    notification to SA, 12.09–12.10, 26.38, 27.03
    process for notification, 27.07
    scale and frequency of breaches, 1.10–1.12
    security standards, 27.08
    time limits for notification, 27.06
  data protection impact assessment
    characteristics, 28.06
    conclusion, 28.10
    contents, 12.14
    elements, 28.05
    generally, 26.69, 28.01–28.02
    ICO guidance, 26.40, 28.03
    introduction, 26.41
    issues for consideration, 28.08
    methodologies, 28.07
    prior consultation, 28.09
    purpose, 28.04
    security of personal data, 12.12–12.14
    steps and methodologies, 28.07
  disposal of hardware, 12.25
  employees, 14.20
  ensuring appropriate measures, 12.03
  ePrivacy Directive, 22.14
  EU Council Resolution, 12.22
  GDPR, under
    communication of breach to data subject, 12.11, 26.39, 26.71, 27.04
    generally, 12.07, 26.36
    impact assessment, 12.12–12.14, 26.41, 26.69, 28.01–28.10
    notification of breach to SA, 12.09–12.10, 26.38, 27.03
    prior consultation, 12.15, 26.42, 26.70, 28.09
    processing, of, 12.08, 26.37
  ICO, and
    guidance, 12.23
    security breaches, 12.24
  impact assessment
    characteristics, 28.06
    conclusion, 28.10
    contents, 12.14
    elements, 28.05
    generally, 26.69, 28.01–28.02
    ICO guidance, 26.40, 28.03
    introduction, 26.41
    issues for consideration, 28.08
    methodologies, 28.07
    prior consultation, 28.09
    purpose, 28.04
    security of personal data, 12.12–12.14
    steps and methodologies, 28.07
  introduction, 12.01
  notification of data breach
    data subject, to, 12.11, 26.39, 26.71, 27.04
    SA, to, 12.09–12.10, 26.38, 27.03
  notification of data breach to SA
    employees, and, 16.12
    generally, 26.38, 27.03
    process, 27.07
    time limits, 27.06
  privacy impact assessment
    characteristics, 28.06
    conclusion, 28.10
    contents, 12.14
    elements, 28.05
    generally, 26.69, 28.01–28.02
    ICO guidance, 26.40, 28.03
    introduction, 26.41
    issues for consideration, 28.08
    methodologies, 28.07
    prior consultation, 28.09
    purpose, 28.04

    security of personal data, 12.12–12.14
    steps and methodologies, 28.07
  processing, of
    generally, 12.08
    processors' obligations, 13.02
    social media, 29.41
  raising awareness, 12.22
  surveillance of employee's electronic communications, 12.06
  third party providers, 12.21
  WP29, and, 12.06
Sensitive personal data
  see Special personal data
Service of documents
  social media, 29.12
Social media
  abuse, 29.10
  access right
    confirmation, 29.30
    directly obtained data, 29.28
    indirectly obtained data, 29.29
    introduction, 29.27
    prior information, 29.28–29.29
  automated decision taking/making
    generally, 29.39
    introduction, 29.22, 29.37
    right to object, 29.38
  awareness, 29.07
  blocking, 29.21
  caching, and, 29.15
  children, 29.26
  communication of data breach to data subject, 29.40
  confirmation, 29.30
  consent, and, 29.14
  controllers, 29.02
  cyber bullying, 29.10
  data access, 29.27
  data portability, 29.36
  data protection impact assessment, 29.43
  data subjects' rights see Data subject rights
  data transfers
    Apps, 29.06
    processors, 29.05
  destruction, 29.21
  digital evidence, as, 29.12
  directly obtained data, 29.28
  DPD
    specific processing risks, 29.17
    third party controllers, 29.16
  electronic evidence, as, 29.12
  employment, and, 29.11
  'enterprise', 29.23
  erasure, 29.21, 29.31, 29.33
  evidence, as, 29.12
  'group of undertakings', 29.23
  hosting, and, 29.15
  ICO, and, 29.01
  indirectly obtained data, 29.29
  introduction, 29.01
  investigations, 29.03
  joint controllers, 29.02
  Leveson Inquiry, and, 29.04
  litigation, and, 29.12
  'main establishment', 29.23
  mere conduit, and, 29.15
  notification of rectification, erasure or restriction, 29.35
  objection, 29.24, 29.38
  'personal data breach', 29.23
  preventing processing
    direct marketing purposes, 29.19
    likely to cause damage or distress, 29.18
  prior consultation, 29.44
  prior information, 29.28–29.29
  privacy impact assessment, 29.43
  profiling, 29.24, 29.39
  rectification, 29.21, 29.31–29.32
  reporting procedures, 29.10
  'representative', 29.23
  representatives of controllers not established in EU, 29.42
  restriction of processing, 29.34
  right to be forgotten, 29.33
  security of processing, 29.41
  service of documents, 29.12
  tagging, 29.08
  third party controllers, 29.16
  threats, 29.10
  transparency, 29.09, 29.25
  trolling, 29.10
  user friendly tools, 29.09
  user generated content, 29.15
  victims, 29.10
  website discretion, liability and obligations, 29.15
  WP29, and, 29.14
Software
  generally, 33.08
Sources of law
  case law, 2.05
  conferences, 2.18

  Council of Europe, 2.12
  DPA 2018, 2.02
  EDPB guidance, 2.10
  EU law, 2.04
  European Data Protection Supervisor, 2.11
  ICO determinations, 2.07
  ICO guides, 2.06
  introduction, 2.01
  legal journals, 2.09
  legal textbooks, 2.08
  other authorities, 2.13
  other laws, 2.17
  other official sources, 2.14
  reference material, 2.19
  UK law
    DPA 2018, 2.02
    secondary legislation, 2.03
  websites and blogs, 2.16
Spam
  automated calling systems, 23.03
  call opt-out registers, 23.16
  case law, 23.20
  civil sanctions, 23.15
  concealing identity, 23.08
  conclusion, 23.21
  deceptive emails, 23.08
  default position, 23.04
  'electronic mail', 23.12
  existing customers' email, 23.06
  fax opt-out registers, 23.16
  Fax Preference Service (FPS), 23.16
  GDPR, 23.01
  generally, 23.02–23.20
  ICO, and
    monetary penalties, 23.14
    PECR guidance, 23.12
  implementing regulations
    civil sanctions, 23.15
    generally, 23.11
    ICO guidance, 23.12
    offences, 23.13
  international approaches, 23.18
  introduction, 23.01
  monetary penalties, 23.14
  offences, 23.13
  online behavioural advertising, and, 23.19
  opt-out
    calls and fax, 23.16
    generally, 23.09
    national registers, 23.07
  organisations, and, 23.10
  origins of name, 23.17
  PECR 2003
    civil sanctions, 23.15
    generally, 23.11
    ICO guidance, 23.12
    offences, 23.13
  prior consent, 23.05
  related issues, 23.19
  Telephone Preference Service (TPS)
    generally, 23.16
    illegal marketing call breaches, 23.07
  transparency, 23.08
  unsolicited communications, 23.03
Special information orders
  failure to comply, 20.11
  generally, 20.10
Special personal data
  DPA, 3.05
  GDPR, 3.06
  legitimate processing conditions, 6.03
State security
  ePrivacy Directive, 22.10
Supervisory authorities (SAs)
  complaints by data subjects
    choice of jurisdiction and court, 9.24
    compensation, 9.25
    court remedies, 9.22
    different SAs, 9.23
    generally, 9.20
    organisational privacy groups, 9.21
    penalties, 9.26
    portability, 9.28
    sanctions, 9.27
  definition, 3.03
  enforcement powers
    administrative fines, 20.19
    functions and tasks, 20.17
    powers, 20.18
  functions and tasks, 20.17
Surveillance of employee's electronic communications
  security of personal data, 12.06

T
Tagging
  social media, 29.08
Telephone Preference Service (TPS)
  direct marketing
    generally, 23.16
    illegal marketing call breaches, 23.07

565
Index

Textbooks Trans border data flows


sources of law, 2.08 (TBDFs) – contd
Third country not authorised by EU law, 21.12, 26.49
transfer of personal data, 21.16 principle, 26.45
Third parties restrictions, 21.15, 26.50
definition, 3.03 social media
security of personal data, 12.21 Apps, 29.06
Third party controllers processors, 29.05
social media, 29.16 subject to appropriate safeguards,
Time limits 21.10, 26.47
communication to data subject of data ‘third county’, 21.16
breach, 27.06 ‘transfer’, 21.16
compliance, 10.01–10.02 transfer prohibition
Traffic data establishing application, 21.17
definition, 22.11 exceptions, 21.03–21.07
generally, 22.16 generally, 21.02
Trans border data flows (TBDFs) WP29, 21.07
adequate protection exception Transfers of personal data
GDPR, 21.09, 26.46 see Trans border data flows (TBDFs)
generally, 21.03 Transparency
‘anonymised data’, 21.16 direct marketing
appropriate safeguards, 21.10, 26.47 electronic mail, by, 22.07
binding corporate rules generally, 23.08
generally, 21.11, 26.48 ePrivacy Directive, 22.07
introduction, 21.07 overview, 1.19
Brexit see Brexit prior information requirements see
checklist for compliance, 21.18 Prior information
Community findings, and, 21.03 privacy by design, 19.03
conclusion, 21.20 social media, 29.09, 29.25
consent, 21.05 Treaty on the Functioning of the
contract, 21.06 European Union (TFEU)
‘data’, 21.16 generally, 26.05
data protection principles, and, 21.02 Trolling
data subject’s consent, 21.05 social media, 29.10
derogations, 21.13, 21.15, 26.50
DPA, 21.03
establishing application of the ban, U
21.17 Unlawful obtaining or procuring
exemptions personal data
adequate protection, 21.03 enforcement powers, and, 20.12
binding corporate rules, 21.07 generally, 11.05
consent, 21.05 Unsolicited communications
contract, 21.06 direct marketing, 23.03
generally, 21.04 generally, 22.13
Facebook, 32.1–32.7 marketing, 22.04
general principle, 21.08, 26.45 Unsolicited electronic commercial
general rule marketing
establishing application, 21.17 automated calling systems, 23.03
exceptions, 21.03–21.07 call opt-out registers, 23.16
generally, 21.02 case law, 23.20
generally, 21.08, 26.44 civil sanctions, 23.15
international cooperation, 21.14 concealing identity, 23.08
introduction, 21.01 conclusion, 23.21
issues arising, 21.16 deceptive emails, 23.08

566
Index

Unsolicited electronic commercial Unsolicited electronic commercial


marketing – contd marketing – contd
default position, 23.04 prior consent, 23.05
‘electronic mail’, 23.12 related issues, 23.19
existing customers’ email, 23.06 Telephone Preference Service (TPS)
fax opt-out registers, 23.16 generally, 23.16
Fax Preference Service (FPS), 23.16 illegal marketing call breaches, 23.07
GDPR, 23.01 transparency, 23.08
generally, 23.02–23.20 unsolicited communications, 23.03
ICO, and Unwanted marketing communications
monetary penalties, 23.14 enforcement and penalties for
PECR guidance, 23.12 non-compliance, 11.01
implementing regulations User generated content
civil sanctions, 23.15 social media, 29.15
generally, 23.11
ICO guidance, 23.12
offences, 23.13 V
international approaches, 23.18 ‘Value added service’
introduction, 23.01 ePrivacy Directive, 22.11
monetary penalties, 23.14
offences, 23.13
online behavioural advertising, and, W
23.19 Websites and blogs
opt-out social media, 29.15
calls and fax, 23.16 sources of law, 2.16
generally, 23.09 WP29
national registers, 23.07 EDPB adopting/endorsing, 9.16
organisations, and, 23.10 employee monitoring, 17.19
origins of name, 23.17 GDPR, and, 26.13
PECR 2003 generally, 26.13
civil sanctions, 23.15 privacy by design, 19.10
generally, 23.11 security of personal data, 12.06
ICO guidance, 23.12 social media, 29.14
offences, 23.13 transfer of personal data, 21.07

567
