Data Protection

The document discusses the General Data Protection Regulation (GDPR) and data protection law. It outlines key concepts such as the definition of personal data and sensitive personal data. It discusses the core principles of data protection like lawful and fair processing. It describes what constitutes a data breach and the roles of data controllers and processors. It provides examples of decisions by the Irish Data Protection Commission involving fines against companies like Twitter, WhatsApp, Meta, and TikTok for GDPR violations. It also summarizes an Irish court case around the processing of CCTV footage.


Data Protection

1. Data Protection Law – The GDPR


What is the GDPR?
- The General Data Protection Regulation (GDPR) is the central piece of data
protection legislation for the residents of EU Member States. The GDPR is given
effect in this jurisdiction by the Data Protection Act 2018.

Aims of the GDPR:


- The GDPR’s central goal is to provide protection for, and enable the free movement
of, personal data. It aims to provide for a more uniform interpretation and
application of data protection standards across the EU.
- It relates only to the data of "natural persons" – i.e. companies/organisations do not
have any data protection rights under the GDPR.

The basic concepts:


1) What is personal data?
- Article 4(1) of the GDPR provides that personal data means "any information relating
to an identified or identifiable natural person (‘data subject’); an identifiable natural
person is one who can be identified, directly or indirectly, in particular by reference
to an identifier such as a name, an identification number, location data, an online
identifier or to one or more factors specific to the physical, physiological, genetic,
mental, economic, cultural or social identity of that natural person..."

Examples of personal data include:


 a name and surname;
 a home address;
 an email address such as [email protected] (but not a generic
email)
 an identification card number;
 location data (for example the location data function on a mobile phone);
 an Internet Protocol (IP) address;
 an examination script (Nowak v Data Protection Commissioner case)

2) What is sensitive personal data?


- Article 9 of the GDPR provides that sensitive personal data is “personal data
revealing racial or ethnic origin, political opinions, religious or philosophical beliefs,
or trade union membership ... data concerning health or data concerning a natural
person's sex life or sexual orientation....”
 The processing of sensitive personal data is generally prohibited, though it
may be processed if the data subject has given their explicit consent, they
have made the data public themselves, or the processing is in the public
interest.

3) The core principles


- Article 5 of the GDPR requires that personal data be:
 Obtained and processed fairly and lawfully and in a transparent manner;
 Stored for specified and legitimate purposes, and not used in a way
incompatible with those purposes;
 Adequate, relevant and not excessive in relation to the purposes for which
it's stored;
 Accurate and, where necessary, kept up to date;
 Preserved in a form that permits identification of the data subjects for no
longer than is required for the purpose for which that data is stored;
 Processed in a manner that ensures appropriate security of the personal
data.

4) When can data be lawfully processed?


- Article 6 of the GDPR sets out the following six bases for processing of personal data:
 The processing has been consented to.
 In order to carry out a contract.
 In order for an organisation to meet a legal obligation.
 Where the processing is necessary to protect the vital interests of a person.
 Where the processing is necessary for the performance of a task carried out
in the public interest.
 Where the processing is in the legitimate interests of a company/organisation
or individual.

5) Examples of how data protection rights can be breached:


- There is a myriad of ways in which a data breach can occur. For example:
 A company's database being hacked and private information being accessed;
 Personal data being accidentally destroyed;
 Personal data being thrown out without being shredded;
 Sending an email to the wrong recipient with information about a third party;
 Posting a compromising photograph on the internet of someone without
their permission;
 Using data to create targeted advertising without the person's consent
 Placing content on a website about a person without their consent – see the
CJEU decision in C-101/01 Bodil Lindqvist (2003). The Court held that "the act
of referring, on an internet page, to various persons and identifying them by
name or by other means, for instance by giving their telephone number or
information regarding their working conditions and hobbies, constitutes 'the
processing of personal data wholly or partly by automatic means' within the
meaning of Article 3(1) of Directive 95/46."

6) Data controller v Data processor


- Data controllers have special responsibilities on account of their position, defined by
Article 4(7) of the GDPR as “the natural or legal person ... which, alone or jointly with
others, determines the purposes and means of the processing of personal data...”
- Article 4(8) of the GDPR provides that a data processor is: “a natural or legal person,
public authority, agency or other body which processes personal data on behalf of
the controller.”
 In many instances, the data controller will process the data itself and will not
avail of a separate data processor. There are two basic conditions for
qualifying as processor:
i. being a separate legal entity with respect to the controller, and
ii. processing personal data on the controller’s behalf.

The Data Protection Commission


- The Office of the Data Protection Commission was created by the Data Protection
Act 2018. The Commission may comprise no more than three Commissioners,
to be appointed by the Government. Currently, there is only one Commissioner in this
jurisdiction.
- An individual who wishes to make a complaint in respect of the processing of their
personal data has two options. They can make a complaint to the DPC, which has the
options described below, or they can bring proceedings against the relevant data
controller or processor in the Circuit Court or High Court.

Making a complaint to the Data Protection Commission:


- Once a complaint is made to the DPC, an authorised officer(s) will conduct an
investigation into the complaint. Their powers include the right to require the
production of documents and the taking of statements under oath, and the officer(s)
will then produce a report for the Commission.
- The Commission may act on this report or conduct further investigations. The data
controller or processor may appeal the DPC's decision to either the Circuit or High
Court.
- The options open to the Commissioner, should they find that the data controller or
processor has breached the provisions of the Act, are to impose a fine or to apply to
the High Court for an order requiring the controller/processor to either restrict or
suspend all data processing. The DPC is entitled to impose fines of up to a maximum
of either €20m, or 4% of the undertaking's annual worldwide turnover, whichever is
the greater.

Examples of DPC decisions:
- In December 2020, the DPC fined Twitter €450k, the first fine it imposed on a social
media platform. It related to private tweets which had inadvertently become public
due to a bug, which Twitter reported.
- In September 2021, the DPC fined WhatsApp €225m, as the company violated
provisions of the GDPR through the way it processed users’ and non-users’ data, as
well as in the way it processed and shared data with other companies owned by its
global social media parent company.
- In March 2022, the DPC fined Meta €17m, having held that the company was not
able to demonstrate that it had appropriate security measures in place to protect
users' data in 2018.
- In September 2023, the DPC fined TikTok €345m, having held that the company had
failed to adequately protect the security of data relating to children. Most notable
was the finding in relation to the "public by default" setting.

2. Data Protection Law – Civil Remedies


Doolin v Data Protection Commissioner [2022] IECA 117
- In Doolin, the Court of Appeal considered the issue of what may constitute the
unlawful processing of data in relation to CCTV footage.
- Mr. Doolin worked for Our Lady's Hospice in Harold's Cross, Dublin. Around the time
of the Paris terrorist attacks in November 2015, someone had carved the words "Kill
all whites, Isis is my life" into a table in the staff tearoom. CCTV footage was viewed
in order to try to ascertain who might have done so. Crucially, the Hospice's CCTV
policy stated that "The purpose of the system is to prevent crime and promote staff
security and public safety."
- The footage showed Mr. Doolin walking in and out of the tearoom at times when he
was supposed to be on duty, and an investigation was launched on the back of this
footage as to whether he was taking unauthorised breaks. He was found to have
done so and was disciplined.
- Mr. Doolin brought a complaint to the Data Protection Commissioner on the basis
that his data had been processed for the purposes of disciplinary proceedings
without his consent, as its use against him was not for the specified security and
safety purposes.
- Both the DPC and the Circuit Court rejected his complaint, but the High Court upheld
it. The DPC then appealed the decision of the High Court to the Court of Appeal.
- The Court of Appeal held that the data had been processed three times – when it was
collected, when watched in relation to the security incident, and when watched in
relation to the unauthorised taking of breaks. It held that the processing of the data
for the latter was incompatible with the stated purpose of the CCTV being used only
for security purposes. It held that there was no evidence that the taking of
unauthorised tea breaks was a "security issue."

Damages for breach of data protection rights
- A person can bring a civil claim against their employer for a breach of their data
protection rights. Article 82(1) GDPR provides that:
“Any person who has suffered material or non-material damage as a result of an
infringement of this Regulation shall have the right to receive compensation from
the controller or processor for the damage suffered”
- Under the heading “Judicial remedy for infringement of relevant enactment”, section
117 of the 2018 Act provides:
"A data protection action shall be deemed, for the purposes of every enactment and
rule of law, to be an action founded on tort."

The Irish Position:


- An issue on which clarity is awaited concerns the type of damages which may be
recovered on account of a breach of data protection rights. The GDPR states that
damages may be awarded for “material or non-material damage”, with section
117(10) of the 2018 Act also clarifying that “‘damage’ includes material and non-
material damage.”
- The problem, however, is that existing Irish jurisprudence is inconsistent with the
GDPR, as current (pre-GDPR) case law states that damages may not be awarded for
mere distress arising out of a breach of data protection rights. This is on account of
the decision of the High Court in Collins v FBD Insurance.
- The issue is currently being considered in a High Court appeal in Keane v CSO, which
relates to a claim brought by a census enumerator against the Central Statistics
Office after her personal data was accidentally made public. The case will consider
whether damages are recoverable for purely non-material harm, and also whether a
PIAB authorisation is required before such proceedings can be brought.

Collins v FBD Insurance [2013] IEHC 137


- In Collins, the plaintiff was awarded €15,000 in damages in the Circuit Court for
distress, following a breach of the Data Protection Acts 1988-2003 during the
processing of an insurance claim. The defendant insurance company appealed to the
High Court, submitting that while there had been a breach of the plaintiff’s data
protection rights, no damage had flowed from that breach.
- The court examined the right to compensation under s 7 of the Data Protection Act
1988 and held that damages were not available for mere distress; instead, a claimant
must show that they have suffered pecuniary loss or acute psychological damage in
order to be able to claim compensation. The Court held that:
"In this case, where it is clear that there is no strict liability ... it necessarily follows
that a claimant must establish that the breach has caused the claimant damage if
that claimant is to be entitled to damages ... In general, an entitlement to damages
for distress, damage to reputation or upset, are not recoverable save where extreme
distress results in actual damage, such as a recognisable psychiatric injury."

UK Jurisprudence is inconsistent with Collins:
- Collins is inconsistent with UK jurisprudence which was also based on pre-GDPR law.
In Vidal-Hall v Google [2016] QB 1003 the plaintiffs claimed that
Google had breached their data protection rights by collecting information relating
to their browsing history, without their knowledge or consent, for the purposes of
generating online advertisements. The Court rejected the reasoning of Collins, and
held that data protection legislation, read together with the Charter, should be
interpreted as including the right to obtain compensation for non-pecuniary loss.
- The important recent decision in Richard Lloyd v Google LLC [2021] UKSC 50 sought
to extend liability even further, as damage had not even been pleaded in those
proceedings. Instead, the claimants attempted to establish that the very loss of
control over personal data constitutes a breach of their data protection rights.
Overturning the decision of the Court of Appeal, the Supreme Court refused
permission to serve Google outside the jurisdiction, holding that data protection
rights were not actionable per se under the pre-GDPR legislation.

3. Data Protection Law – Data Processing on the Internet


Are search engine providers and social media platforms "data controllers"?
- There is no doubt that search engines and social media platforms, regardless of
where they are based, are subject to the provisions of the GDPR. This is because of
the use they make of their users' data for the purposes of generating advertising
revenue. The fact that advertisements are targeted at specific users based on their
profile, and the interests they have revealed through their browsing history, means
that such platforms “monitor” the data subjects for the purposes of Recital 24:
“In order to determine whether a processing activity can be considered to monitor
the behaviour of data subjects, it should be ascertained whether natural persons are
tracked on the internet including potential subsequent use of personal data
processing techniques which consist of profiling a natural person, particularly in
order to take decisions concerning her or him or for analysing or predicting her or his
personal preferences, behaviours and attitudes.”

The need to obtain clear consent for processing:


- A significant decision in relation to the GDPR and what constitutes consent was
made by the French data protection agency “CNIL” when it fined Google €50m for a
breach of the Regulation. The complaints, brought on behalf of French users of
Google's Android operating system, claimed that Google lacked a valid legal basis to
process the personal data of the users of its services, particularly for ads
personalization purposes.
- Central to the finding was whether or not Google obtained “specific” consent, as
explicitly required by Articles 4 and 6. While users were given the option of
customising the degree of processing that they agreed to, the CNIL criticised how
difficult it was for users to become aware of this option. The user was obliged to
follow a long and complex list of options and click-throughs in order to customise
their settings, with some boxes pre-checked, which is not consistent with specific
consent.

The use of cookies:


- Cookies are small text files which are left by a website on the visitor's
computer. They allow the website to remember the user's identity,
performing such tasks as storing payment methods used. Cookies are therefore
vital to the efficient functioning of websites, as they speed up the interactions with
users who have previously visited.
- More importantly for the website operator, they crucially provide the operator with
information about the user's online activities which can subsequently be used for
targeting advertising purposes.
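As a purely illustrative technical aside (not part of the case law discussed here), the mechanism can be sketched with Python's standard http.cookies module: a server "leaves" a cookie by sending a Set-Cookie header, and the browser returns it on every later visit to that site. The cookie name and values below are hypothetical.

```python
from http.cookies import SimpleCookie

# A server sets a cookie via a Set-Cookie header; the browser stores it
# and sends it back with subsequent requests, letting the site recognise
# the returning visitor.
cookie = SimpleCookie()
cookie["session_id"] = "abc123"          # hypothetical visitor identifier
cookie["session_id"]["max-age"] = 3600   # stored for one hour; Planet49
                                         # requires this duration be disclosed

# The header value the server would emit:
print(cookie["session_id"].OutputString())
# → session_id=abc123; Max-Age=3600
```

Note that the storage duration ("Max-Age") and any third-party access are exactly the details the Planet49 judgment below says must be clearly explained to the user.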

C-673/17 Bundesverband der Verbraucherzentralen und Verbraucherverbände – Verbraucherzentrale Bundesverband e.V. v Planet49 GmbH

- The case involved an online lottery organised by Planet49, which presented an
internet user who wished to participate with two checkboxes which had to be ticked
or un-ticked before they could hit the ‘participation button’.
- One of the checkboxes, asking the user to accept being contacted by a range of firms
for promotional offers, needed to be clicked to show consent. The other checkbox,
however, which asked the user to agree to cookies being installed on their
computer, was already ticked, requiring the user to click on it to remove their
consent.
- The Court held that consent must be active rather than passive, and that pre-checked
boxes are therefore not valid. It held that the average user could not be expected to
be familiar with how cookies work, and it must be clearly explained what they are
consenting to, including the length of time that a cookie will be stored for, and
whether third parties are being given access to such cookies.

4. The Right to be Forgotten

The origin of the right
C-131/12 Google Spain SL, Google Inc. v Agencia Española de Protección de Datos
(AEPD), Mario Costeja González ("Google Spain")
- The right predates the GDPR and was first considered in detail under the previous EU
data protection legislation in 'Google Spain'.
- Mario Costeja González is a Spanish lawyer who, in 1998, had been the subject of
attachment proceedings, when real estate property he owned was auctioned off in
satisfaction of a social security debt. When Mr. González's name was entered into
Google's search engine, links to two newspaper reports, which covered the
proceedings, were presented to users.
- In 2009, he requested that the articles which appeared in the newspaper's online
archive be removed, and in 2010 that Google remove links that referred to the
proceedings. He did so on the basis that the proceedings concerning him had been
fully resolved for a number of years and that reference to them was now irrelevant.
When they were not removed, he complained to the Spanish Data Protection
Agency.
- The Data Protection Authority rejected the complaint in respect of the newspaper
archive but upheld the complaint pertaining to the search engine results. Google
appealed this decision to the national court, which made a reference to the CJEU.

Search engine results and a data subject's right to privacy:


- “Processing of personal data ... by the operator of a search engine is liable to affect
significantly the fundamental rights to privacy and to the protection of personal data
when the search ... is carried out on the basis of an individual's name, since that
processing enables any internet user to obtain through the list of results a structured
overview of the information relating to that individual that can be found on the
internet – information which potentially concerns a vast number of aspects of his
private life and which, without the search engine, could not have been
interconnected or could have been only with great difficulty ...
Furthermore, the effect of the interference with those rights of the data subject is
heightened on account of the important role played by the internet and search
engines in modern society, which render the information contained in such a list of
results ubiquitous."

Guidelines as to when the right should be allowed


- In Google Spain, the Court held that even if the underlying article was accurate and
lawfully published, search engine results which linked to it could be held to violate a
data subject's rights. It identified certain factors which should be considered in any
right to be forgotten application:
"The (data subject's) rights override, as a rule, not only the economic interest of
the operator of the search engine but also the interest of the general public in
finding that information upon a search relating to the data subject’s name.
However, that would not be the case if it appeared, for particular reasons, such as
the role played by the data subject in public life, that the interference with his
fundamental rights is justified by the preponderant interest of the general public in
having, on account of inclusion in the list of results, access to the information in
question.”
- Following the decision in Google Spain, the Article 29 Working Party produced a set
of Guidelines to assist in the interpretation of the judgment. These included:
 Does the search result come up against a search on the data subject’s name?
 Is the data subject a public figure?
 Is the data accurate? 'Accurate' means accurate as to a matter of fact. There
is a difference between a search result that clearly relates to one person’s
opinion of another person and one that appears to contain factual
information.
 Does the search result link to information that puts the data subject at risk?
 Was the original content published in the context of journalistic purposes?
 Does the data relate to a criminal offence?

Savage v Data Protection Commissioner [2018] IEHC 122


- In September 2014, Mark Savage contacted Google Ireland about a search engine
result (the heading of a Reddit.com discussion which described him as homophobic);
Google refused to take it down. He complained to the Data Protection
Commissioner, claiming that the search engine result constituted a statement of fact
that he was homophobic, an allegation which he denied. As such, it should be
considered to be inaccurate data for the purposes of the Data Protection Acts 1988-
2003.

- The Commissioner rejected the complaint, deciding it was accurate because
“accurate means accurate as a matter of fact, and this link remains accurate in that it
represents the opinions expressed of you by a user of the relevant forum.”
- Savage appealed to the Circuit Court, which upheld his complaint. The Court found
that the snippet was “not accurate by virtue of the fact that it is simply not clear that
it is the original poster expressing his or her opinion, but rather bears the
appearance of a verified fact.” It ordered that the search engine result be edited so
that it is clear this was an expression of opinion. This decision was appealed to the
High Court by both Google and the DPC.
- The High Court noted that "(Google) does not carry out any editing function in
respect of its activities. It is an automated process where individual items of
information on the internet are collated automatically and facilitate the user
searching particular topics or names. To mandate a search engine company to place
parenthesis around a URL heading would oblige it to engage in an editing process
which is certainly not envisaged in the Google Spain decision."
- The core reason for the High Court upholding the appeal was explained in just one
paragraph: “The learned Circuit Court Judge in applying the jurisprudence of Google
Spain had a duty to consider the underlying article the subject of the search. The
Circuit Court did refer to this matter by indicating that if that Reddit.com discussion
was considered, it would become clear that the original post by Soupynorman was
an expression of opinion. The learned Circuit Court Judge was incorrect in law to
consider the URL heading in isolation."
- While Google Spain did involve a consideration of the underlying article, it is
questionable whether the CJEU’s judgment imposes a duty on a court to do so, as
suggested by the High Court. In Mr. Gonzalez's case, the underlying article was
considered by the domestic courts because he had requested that both it, and the
search engine link which referenced it, be deleted. Mr. Savage made no such request
of the Data Commissioner, limiting his request to the correction of the search engine
result.
- In fact, Savage does not fit neatly under the “Right to be Forgotten” jurisprudence of
Google Spain, and the weight that was given to the CJEU judgment in the three
decisions of the Data Protection Commissioner, Circuit Court and High Court, is open
to question. Mr. Gonzalez's issue was that the personal data which Google processed
was no longer relevant. Savage, on the other hand, was not looking for accurate
information about him to be “forgotten”. Instead, he was looking for inaccurate
information to be corrected.

NT1 & NT2 v Google LLC [2018] EWHC 799 (QB)


- Decided just two months after Savage, this case was much more on point with the
central issue in Google Spain. Interestingly, both applicants were anonymised, with
the English High Court stating that "(it) is required to ensure that these claims do
not give the information at issue the very publicity which the claimants wish to limit."

- In the late 1990s, NT1 was convicted of criminal conspiracy to defraud the Inland
Revenue through false accounting running into millions of pounds, for which he
received a four-year custodial sentence.
- NT2's offences were, by comparison, relatively minor. In the early part of this
century, he was sentenced to six months' imprisonment (though released on licence
after six weeks) for authorising a firm to conduct computer hacking and phone
tapping to find out who was engaged in hostile activity against his company.
NT1:
- In refusing NT1's application, the Court gave some weight to his evidence, most
notably his perceived lack of remorse for his criminal wrongdoing, connecting his
lack of contrition, and continued refusal to admit his crimes, with the evaluation of
the degree to which the continued publicising of his criminal history by Google could
be considered relevant.
- Notice was taken of NT1's continued business activities, and his continued attempts
to portray himself online and via social media as having an unblemished business
record. This prompted the Court to take a less favourable stance towards his
attempts to have his previous convictions for fraud "forgotten."
NT2:
- NT2’s crime was a more minor one and his sentence much shorter. It is noticeable,
however, that the Court placed emphasis on NT2’s acceptance of wrongdoing, and
his remorse when giving evidence. These were factored into an assessment of
whether the data processed by Google was still relevant.
- The Court explained why NT2's application found favour, and NT1's did
not: “NT2 has frankly acknowledged his guilt and expressed genuine remorse.
There is no evidence of any risk of repetition. His current business activities are in a
field quite different from that in which he was operating at the time ... There is no
real need for anybody to be warned about that activity.”

The basics of the right in the GDPR


- Articles 16 and 17 of the GDPR deal with the Right to Rectification, and the Right to
Erasure respectively. The latter is more commonly referred to by the name given to
it in parentheses in the heading of Article 17 – the “Right to be Forgotten.”
- It is important to stress that these are two separate Articles, dealing with two
separate factual scenarios. The Right to Rectification arises when the data relating to
the data subject is “inaccurate.” The Right to be Forgotten, however, relates to the
data subject's right to seek erasure of personal data when it meets any of the six
criteria listed in Article 17(1)(a)-(f):
a) the personal data are no longer necessary in relation to the purposes for which
they were collected or otherwise processed;
b) the data subject withdraws consent on which the processing is based according to
point (a) of Article 6(1), or point (b) of Article 9(2), and where there is no other legal
ground for the processing;
c) the data subject objects to the processing pursuant to Article 21(1) and there are
no overriding legitimate grounds for the processing, or the data subject objects to
the processing pursuant to Article 21(2);
d) the personal data have been unlawfully processed;
e) the personal data have to be erased for compliance with a legal obligation in
Union or Member State law to which the controller is subject;
f) the personal data have been collected in relation to the offer of information
society services referred to in Article 8(1).

General points about the right to be forgotten


- The first step is to seek voluntary de-listing by Google itself. Since 2014, Google has
received just over 11,500 requests from Irish citizens to have results de-listed. They
have complied with just under 50% of such requests.
- The right relates to having a search engine result removed, not the underlying
article. Different considerations will exist for each, as clearly explained in Google
Spain. In ML and WW v Germany [2018] ECHR 554, the Strasbourg Court considered
a refusal to have two half-brothers, who had been convicted of murder in 1993 and
released on probation in 2007, anonymised in a news report on a German radio
station's website.
- The ECtHR upheld the decision as not violating their Article 8 right to privacy,
stressing the fact that they were not simply looking for search engine links to be
removed, that their convictions were not "spent", and that they were continuing to
protest their innocence.
- The interaction with Spent Convictions legislation (considered in the NT1 & NT2
case) is an interesting one and doesn't yet seem to have been considered in this
jurisdiction. If a conviction is considered "spent" after 7 years, is it just that it should
still be produced in search engine results?
- When de-listing is agreed to or ordered, it should ideally be done on a world-wide
basis – i.e. include ".com" domain names as well as individual Member State ones –
though the CJEU recently held in C-507/17 Google v CNIL that EU law does not
require de-listing to be carried out on versions of the search engine accessible from
outside the EU Member States.
