
GDPR & Generative AI

A Guide for Customers

May 2024

Contents
Executive Summary....................................................................................................................................3
Introduction.................................................................................................................................................5
Part 1: Responsibly using AI - Microsoft’s AI journey and leveraging our tools and resources.......6
Responsible AI............................................................................................................................................. 6
Tools, Commitments, and Resources to Assist your AI Deployment........................................................... 7
Part 2: The GDPR Compliance Framework in the Context of AI............................................................8
What is the GDPR and who does it apply to?............................................................................................. 8
Leverage established principles to comply with regulatory frameworks when using AI solutions............. 8
Who is responsible for GDPR compliance when using AI and cloud services?........................................... 9
Compliance with the GDPR is a shared responsibility................................................................................. 9
How does Microsoft support customers with their GDPR compliance obligations?................................... 9
Protecting the data of our customers - Microsoft’s privacy commitments in the AI era.......................... 10
Key obligations under the GDPR in the context of generative AI services................................................ 11
• Articles 12 to 14 of the GDPR (Transparency).................................................................................... 11
• Articles 15 to 21 of the GDPR (Data Subject Rights).......................................................................... 11
• Article 28 of the GDPR (Processor Obligations)................................................................................. 12
• Article 32 of the GDPR (Technical and Organizational Security Measures)........................................ 13
• Article 35 of the GDPR (Data Protection Impact Assessments)......................................................... 14
• Articles 44 to 50 of the GDPR (Transfers of Personal Data to Third Countries).................... 15
How does the GDPR interact with the AI Act?.......................................................................................... 16
Our continued compliance with data protection regulation and open dialogue with key regulators in
Europe and across the globe..................................................................................................................... 16
Part 3: Copilot for Microsoft 365........................................................................................................... 17
What is Copilot for Microsoft 365 and how does it work?........................................................................ 17
How does Copilot for Microsoft 365 use personal data?.......................................................................... 18
Security for Copilot for Microsoft 365....................................................................................................... 19
EU Data Boundary and Data Residency..................................................................................................... 19
Part 4: Azure OpenAI Service.................................................................................................................. 20
What is Azure OpenAI Service and how does it work?............................................................................. 20
Preventing abuse and harmful content generation.................................................................................. 22
How does the Azure OpenAI Service use personal data?......................................................................... 23
Security for Azure OpenAI......................................................................................................................... 24
EU Data Boundary and Data Residency..................................................................................................... 24
Part 5: Conclusion.................................................................................................................................... 25
Appendix 1: Business opportunities arising from generative AI........................................................ 26
Appendix 2: Frequently Asked Questions (FAQs)................................................................................. 29
Appendix 3: Additional Resources......................................................................................................... 34

Executive Summary

• The use cases for generative AI present an exciting opportunity to improve the quality of services and operational efficiency. At Microsoft we want to empower our customers to harness the full potential of new technologies like generative artificial intelligence (generative AI), while complying with their obligations under the General Data Protection Regulation (GDPR).

• Microsoft is committed to ensuring its AI systems are developed responsibly and in a way that is worthy of people’s trust. We drive this commitment according to six key principles which align closely with the fundamental principles set out in Article 5 of the GDPR.

• When considering GDPR compliance in the context of using generative AI services, the fundamental principles of the GDPR apply in the same manner as they do for processing personal data in any other context (e.g. the use of cloud services). So, while AI technology may be new, the principles, and accordingly the processes for risk assessment and compliance with the GDPR, remain the same. Hence, to ensure GDPR compliance, organizations should be confident to approach Microsoft’s AI services in the same way as they have approached using other cloud services.

• Microsoft’s existing privacy commitments, including those provided in Microsoft’s Data Protection Addendum, extend to our AI commercial products. Customers¹ can rest assured that the privacy commitments they have long relied on when using our enterprise cloud products also apply to Copilot for Microsoft 365 and the Azure OpenAI Service. Customers can therefore be confident that their valuable data is safeguarded by industry-leading data governance and privacy practices in the most trusted cloud on the market today.

• There are a number of key obligations under the GDPR which organizations need to consider when using generative AI services. In this paper we have included details of these obligations and the associated support and resources which Microsoft can offer, including in relation to international transfers of personal data, transparency, data subject rights, processor obligations, technical and organizational security measures, and DPIAs.

• Our customers’ data belongs to our customers. Microsoft does not claim ownership of any customer prompts or output content created by Microsoft’s generative AI solutions. In addition, no Customer Data (including prompts or output content) is used to train foundation models without customer permission.

• As the regulatory landscape evolves and we innovate to provide new kinds of AI solutions, Microsoft will continue to offer industry-leading tools, resources and support to demonstrate our enduring commitment to meeting the needs and demands of our customers in their AI journey.

¹ This guide applies to the use of our paid enterprise services for Copilot for Microsoft 365 and the Azure OpenAI Service. Any references in this guide to “customers” are intended to refer to private corporate entities and/or businesses. The contents of this paper are not applicable to consumers or individuals using Microsoft solutions in their personal capacity. Microsoft has also produced a version of this white paper for public sector customers, a copy of which can be accessed at the following link: GDPR and Generative AI - A Guide for the Public Sector.

Introduction
In today’s rapidly evolving business landscape, industries are increasingly pressured to innovate, achieve greater efficiency, and enhance customer experiences. This is driving organizations to seek a competitive edge by utilizing the potential of generative AI solutions. By automating routine tasks, providing deep analytical insights, and enabling real-time decision-making, generative AI solutions can help businesses stay competitive and responsive to market dynamics.

There is no doubt that AI is poised to shape the future of how organizations operate. The business value of AI is clear: it helps organizations operate efficiently, perform better, achieve more, and gain the insights required to make better decisions. In addition, investment in AI solutions has been shown to positively impact an organization’s bottom line.²

Generative AI solutions can optimize your organization at every level and uncover new valuable opportunities within your business. Delivering this type of impact with AI innovation needs to be balanced by ensuring that your organization selects efficient and trustworthy AI solutions and that these are implemented in a responsible and secure manner, taking into account the need to safeguard personal data.

At Microsoft we want to empower our customers to harness the full potential of new technologies like generative AI, while complying with their obligations under the General Data Protection Regulation (GDPR) to ensure the privacy and security of their data.

We have a long-standing practice of protecting our customers’ information. Our approach to Responsible AI is built on a foundation of privacy, and we remain dedicated to upholding core values of privacy, security, and safety in all our generative AI products and solutions. As the use of AI solutions expands, our customers can be confident that their valuable data is safeguarded by industry-leading data governance and privacy practices in one of the most trusted clouds on the market today. Customers can rest assured that the privacy commitments they have long relied on when using our enterprise cloud products also apply to our enterprise generative AI solutions that are backed by Microsoft’s Data Protection Addendum, including Copilot for Microsoft 365 and Azure OpenAI Service.

As an industry and thought leader in AI, we have developed this paper to address specific concerns relating to the GDPR-compliant use of Copilot for Microsoft 365 and the Azure OpenAI Service for customers in Europe, and to demonstrate how our AI solutions can be embraced in a GDPR-compliant manner.

² For every $1 a company invests in AI, it realizes an average return of $3.50, and it takes on average 14 months for organizations to realize a return on their AI investment. Source: IDC, The Business Opportunity of AI, November 2023.
This paper is set out as follows:

Part 1
examines the meaning of responsible AI, the six key principles and approach to responsible AI that guide Microsoft’s development of AI products, and demonstrates the tools and resources Microsoft offers to assist your AI deployment.

Part 2
shifts focus to the structure and requirements of the GDPR and how Microsoft can support customers to embrace our AI solutions while continuing to meet their compliance obligations under the GDPR.

Parts 3 and 4
are dedicated to an in-depth exploration of Copilot for Microsoft 365 and the Azure OpenAI Service, and how these services can be utilized in compliance with the GDPR.

Part 5
concludes the paper, reflecting on the insights shared and the future trajectory of AI and data protection regulation.

Appendix 1
showcases some of the exciting opportunities that generative AI presents for businesses across various industries.

Appendix 2
addresses some frequently asked questions (FAQs) that customers have in relation to embracing AI in a GDPR-compliant manner.

Appendix 3
provides links to additional resources which customers can reference to supplement and expand their understanding of the information provided in this paper.

Part 1:
Responsibly using AI -
Microsoft’s AI journey and leveraging our tools and resources

Responsible AI

AI has the potential to transform your business, from streamlining employee tasks to accelerating service/product delivery. The growing interest in generative AI is clear. However, with great power comes great responsibility, and it is therefore essential that AI is developed and deployed responsibly. Microsoft has taken a principled role in this area with the development of comprehensive AI responsibility policies and tools, grounded on work we have been doing for many years.

The responsible use of AI is, of course, a topic which businesses around the world have actively addressed in recent years. Through leading discussions, developing approaches and strategies, and implementing these in their operations, the use of AI to responsibly deliver more productive, efficient and innovative products and/or services is on the rise.

Learn more about Governing AI: A Blueprint for the Future

At Microsoft, we are committed to making sure AI systems are developed responsibly and in a way that is worthy of people’s trust. We drive this commitment according to six key principles which align closely with the fundamental principles set out in Article 5 of the GDPR:

• Fairness: AI systems should be designed to treat all individuals fairly, without bias or discrimination.

• Reliability and safety: AI systems should be reliable and safe, with built-in mechanisms to prevent errors and minimize harm.

• Accountability: The creators of AI tools and the developers who leverage them should be accountable for their systems.

• Privacy and security: AI systems should respect individuals’ privacy and data security.

• Inclusiveness: AI systems should be designed to be accessible and usable by everyone, including individuals with disabilities.

• Transparency: AI systems should be transparent and explainable, with clear documentation of their functionality and decision-making processes.

These principles can be used by customers to evaluate AI systems and processes in use or under consideration in the context of the GDPR, as explored in Part 2 below. Within Microsoft, we have established our Office of Responsible AI, which sets AI governance policies for the entire company, advises our senior leadership team on AI issues, and enables engineering and compliance teams across the company to build according to responsible AI principles, all while ensuring that as a corporation we continue to examine and improve our ethical stance as new capabilities and challenges arise.

Learn more about Microsoft’s principles and approach to Responsible AI

In May 2024, we published our inaugural Responsible AI Transparency Report, which builds on our internal Microsoft Responsible AI Standard. This report provides insight into how we build applications that use generative AI; make decisions and oversee the deployment of those applications; support our customers as they build their own generative applications; and learn, evolve, and grow as a responsible AI community.

Organizations should develop and be governed by responsible AI strategies, and these strategies should incorporate principles, practices, tools, and governance to enable those across the organization to assess, adopt, and manage their use of AI.

When potential risks are understood and carefully managed, organizations can realize the promise of AI. Forward-looking leaders will ensure that their commitment to responsible AI is not an afterthought but is baked into their organization’s innovation pipeline. This allows businesses to harness the power of AI to improve their products and/or services and generate more profitable outcomes. You can find several exciting examples of how to use generative AI in Appendix 1.
Tools, Commitments, and Resources to Assist your AI Deployment

To support our customers and empower their compliant use of AI, Microsoft offers a range of solutions, tooling, and resources to assist in their AI deployment. These range from comprehensive transparency documentation to a suite of tools for data governance, risk, and compliance assessment. Dedicated programs such as our industry-leading AI Assurance Program and AI Customer Commitments further broaden the support we offer customers in addressing their needs.

Microsoft’s AI Assurance Program helps customers ensure that the AI applications they deploy on our platforms meet the legal and regulatory requirements for responsible AI. The program includes support for regulatory engagement and advocacy, risk framework implementation, and the creation of a customer council.

For decades we’ve defended our customers against intellectual property claims relating to our products. Building on our previous AI Customer Commitments, Microsoft announced our Customer Copyright Commitment, which extends our intellectual property indemnity support to both Copilot for Microsoft 365 and our Azure OpenAI Service. Now, if a third party sues a customer for copyright infringement for using Copilot for Microsoft 365 or the Azure OpenAI Service, or for the output they generate, we will defend the customer and pay the amount of any adverse judgments or settlements that result from the lawsuit, as long as the customer has used the guardrails and content filters we have built into our products.

Microsoft has also developed a range of solutions to support our customers with data governance, with Microsoft Purview. You can find further detail on how Microsoft Purview can support compliance with GDPR in Part 2.

Part 2:
The GDPR Compliance
Framework in the Context of AI

What is the GDPR and who does it apply to?

The General Data Protection Regulation, also known as the “GDPR”,³ sets an important bar globally for privacy rights, information security, and compliance. At Microsoft, we value privacy as a fundamental right, and we believe that the GDPR plays an important role in protecting and enabling the privacy rights of individuals.

Microsoft is committed to its own compliance with the GDPR, and to providing an array of products, features, documentation, and resources to support our customers in meeting their compliance obligations under the GDPR.

The GDPR is in force in the UK and all EU countries and imposes a set of data protection rules on the processing of personal data, with the goal of protecting the fundamental rights of data subjects, creating a level playing field for the processing of personal data, and furthering the internal market.

Any organization that processes the personal data of data subjects residing in Europe is subject to the GDPR. National laws also incorporate data protection rules and guidelines, which are generally adapted to meet and/or exceed the requirements of the GDPR.

Leverage established principles to comply with regulatory frameworks when using AI solutions

When we think about the GDPR in the context of leveraging generative AI and taking advantage of the opportunities presented by this technology, the starting point is that the fundamental principles of the GDPR still apply in the same manner as they do for processing personal data in any other context, including when using the cloud. So, while the AI technology may be new, the principles, and accordingly the processes for risk assessment and compliance with the GDPR, remain the same.

It is also helpful to recognize that the GDPR was drafted to be technology-agnostic and therefore does not prevent organizations from embracing opportunities to use generative AI. As such, applying established GDPR assessment processes is a great way for organizations to harness the revolutionary potential of AI and deliver great outcomes, while safeguarding people’s privacy and wellbeing. Microsoft has a long-standing history of collaborating with and assisting its customers in pursuit of their digital transformation priorities while complying with the requirements of the GDPR, including in relation to the transition from on-premises to cloud computing. Customers can approach Microsoft’s generative AI solutions by leveraging the approach they have taken when using our cloud services.

Cloud computing is essential for accessing this potentially groundbreaking AI technology, and the hyper-scale cloud is, therefore, the foundation for deploying AI. Azure’s enterprise-grade protections, which form part of Copilot for Microsoft 365 and the Azure OpenAI Service, provide a strong foundation upon which customers can build their data privacy, security, and compliance systems to confidently scale AI while managing risk and ensuring compliance with the GDPR.
³ For the purpose of this paper, any references to the EU GDPR also apply to the UK GDPR.
Who is responsible for GDPR compliance when using AI and cloud services?

Under the GDPR, there are two key parties, each with a separate set of compliance responsibilities:

• The Data Controller: The data controller decides why and how personal data is processed and is the entity that is the principal subject of the obligations imposed by the GDPR. Many of these obligations apply from the moment this entity starts to collect personal data about individuals.

• The Data Processor: In contrast, under the GDPR, the data processor is essentially a subcontractor to the data controller, processing personal data on behalf of and upon instruction from the data controller.

Organizations can act as both data controllers and data processors in the GDPR context. When using Microsoft’s generative AI services, Microsoft’s Product Terms indicate whether Microsoft is providing an Online Service as a data processor or a data controller. Most of the Online Services, including generative AI services, are provided by Microsoft as a data processor and are governed by the Data Protection Addendum. For further details on specific products and services, consult the Microsoft Product Terms.

Compliance with the GDPR is a shared responsibility

GDPR compliance is a shared responsibility. Microsoft is committed to complying with all laws and regulations which are applicable to Microsoft and its generative AI tools and services, including the GDPR.

As a Microsoft customer, you will need to determine how these tools and services will be used and what personal data will be processed, to enable you to ensure you are using such tools in a compliant manner.

To assist you with that, we have designed our generative AI tools and services with privacy and data protection in mind, and we provide our customers with information, features, and contractual commitments to support you in your compliance and accountability obligations under the GDPR. The following sections in this Part 2 delve into this in more detail and provide you with information to support your assessment of the use of Microsoft’s generative AI tools and services in compliance with the GDPR.

How does Microsoft support customers with their GDPR compliance obligations?

As more businesses seek to leverage generative AI, many are looking to Microsoft not only as a service provider, but as a trusted partner on the journey to meeting their compliance obligations under the GDPR.

The first step towards compliance is understanding how Microsoft’s generative AI services work, including how they process personal data. Our comprehensive transparency documentation and information help you understand how our AI tools work and what choices our customers can make to influence system performance and behaviour.

In Part 3 and Part 4 of this paper we provide specific information and links to additional resources which you can use to help enhance your understanding of these products and services.

Jump to Part 3 to find out more about Copilot for Microsoft 365

Jump to Part 4 to find out more about Azure OpenAI Service

This knowledge provides the foundation for compliance with a number of key obligations under the GDPR. We will explore these key obligations and the associated support that Microsoft offers customers later in this Part 2, but first we will address the seven core privacy commitments which Microsoft offers to its customers in the AI era.

Protecting the data of our customers – Microsoft’s privacy commitments in the AI era

Microsoft’s existing privacy commitments extend to our AI commercial products, as explained in a blog post from our Chief Privacy Officer Julie Brill. You can rest assured that the privacy commitments you have long relied on when using our enterprise cloud products also apply to our enterprise generative AI solutions that are backed by Microsoft’s Data Protection Addendum, including Copilot for Microsoft 365 and Azure OpenAI Service.

The following seven commitments apply to “Customer Data”, which is defined in Microsoft’s Product Terms as all data, including all text, sound, video, or image files, and software, that are provided to Microsoft by, or on behalf of, our customers through use of an online service. All inputs (including prompts)⁴ and output content⁵ are Customer Data. In accordance with Microsoft’s Data Protection Addendum, the customer “retains all right, title and interest in and to Customer Data”.

1. We will keep your organization’s data private.

Your data remains private when using Copilot for Microsoft 365 and Azure OpenAI Service and is governed by our applicable privacy and contractual commitments, including the commitments we make in Microsoft’s Data Protection Addendum and Microsoft’s Product Terms.

2. You are in control of your organization’s data.

Your data is not used in undisclosed ways or without your permission. You may choose to customize your use of Copilot for Microsoft 365 or Azure OpenAI Service, opting to use your data to fine-tune models for your organization’s own use. If you do use your organization’s data to fine-tune, any fine-tuned AI solutions created with your organization’s data will be available only to you.

3. Your access control and enterprise policies are maintained.

To protect privacy within your organization when using enterprise products with generative AI capabilities, your existing permissions and access controls will continue to apply to ensure that your organization’s data is displayed only to those users to whom you have given appropriate permissions.

4. Your organization’s data is not shared.

Microsoft does not share your data with third parties without your permission. Your data, including the data generated through your organization’s use of Copilot for Microsoft 365 or Azure OpenAI Service – such as prompts and responses – is kept private and is not disclosed to third parties.

5. Your organization’s data privacy and security are protected by design.

Security and privacy are incorporated through all phases of design and implementation of Copilot for Microsoft 365 and Azure OpenAI Service. As with all our products, we provide a strong privacy and security baseline and make available additional protections that you can choose to enable. As external threats evolve, we will continue to advance our solutions and offerings to ensure world-class privacy and security in Copilot for Microsoft 365 and Azure OpenAI Service, and we will continue to be transparent about our approach.

6. Your organization’s data is not used to train foundation models.

Microsoft’s generative AI solutions, including Copilot for Microsoft 365 and Azure OpenAI Service capabilities, do not use Customer Data to train foundation models without your permission. Your data is never available to OpenAI or used to improve OpenAI models.

7. Our products and solutions comply with global data protection regulations.

The Microsoft AI products and solutions you deploy are compliant with today’s global data protection and privacy regulations. As we continue to navigate the future of AI together, including the implementation of the EU AI Act and other global laws, organizations can be certain that Microsoft will be transparent about our privacy, safety, and security practices. We will comply with global laws that govern AI, and back up our promises with clear contractual commitments.

You can find additional details about how Microsoft’s privacy commitments apply to Azure OpenAI and Copilot for Microsoft 365 here and in the FAQ: Protecting the Data of our Commercial and Public Sector Customers in the AI Era.

⁴ “Inputs” means all Customer Data that the customer provides, designates, selects, or inputs for use by a generative artificial intelligence technology to generate or customize an output, including any customer prompts.
⁵ “Output Content” means any data, text, sound, video, image, code, or other content generated by a model in response to Input.
Key obligations under the GDPR in the context of generative AI services

There are a number of obligations under the GDPR which organizations need to consider when procuring generative AI services. This section considers some of the key obligations and what associated support and resources Microsoft can offer to your organization to help you comply.

Articles 12 to 14 of the GDPR (Transparency)

Articles 12 to 14 of the GDPR require data controllers to provide data subjects with certain key information about how their personal data will be used. This information must be provided in a concise, transparent, intelligible, and easily accessible form, using clear and plain language. This information is often provided in the form of a privacy notice. If you deploy a new technology (such as Copilot for Microsoft 365 or Azure OpenAI Service) and intend to use such technology in a way that is not reflected in your existing privacy notices, then you will need to update your privacy notices to reflect these new processing activities.

How we help you comply: The information set out in this paper and available in our transparency resources noted below is intended to assist your understanding of how Copilot for Microsoft 365 and Azure OpenAI Service process data and the extent to which additional information (if any) needs to be communicated to data subjects. Additional product-specific information is available at Data, Privacy and Security for Azure OpenAI Service; Data, Privacy and Security for Microsoft Copilot for Microsoft 365; Copilot in Dynamics 365 and Power Platform; and FAQs for Copilot data security and privacy for Dynamics 365 and Power Platform.

Articles 15 to 21 of the GDPR (Data Subject Rights)

Under the GDPR, data controllers must ensure they are in a position to comply with their obligation to respond to requests from data subjects relating to the exercise of their rights under Articles 15 to 21 of the GDPR, with appropriate assistance from data processors where necessary.

How we help you comply: In the “Data Subjects Rights; Assistance with Requests” section of Microsoft’s Data Protection Addendum, Microsoft commits to make available to customers (in a manner consistent with the functionality of the services and Microsoft’s role as a data processor) the ability to fulfil requests from data subjects exercising their rights under the GDPR.

If Microsoft receives such a request directly from a data subject in situations where it is processing personal data on behalf of your organization, it will redirect the data subject to submit its request to your organization instead. You are responsible for responding to any such requests, but Microsoft will comply with reasonable assistance requests in this respect.

Microsoft has developed additional solutions to assist its customers when responding to data subject rights requests, such as Microsoft Purview and Purview eDiscovery. The features of these products empower our customers to proactively govern their AI usage and adhere to evolving regulatory requirements. This can be valuable, for instance, to improve efficiency in responding to and actioning requests in relation to the “right to access personal data” and the “right to be forgotten” that apply under Articles 15 and 17 of the GDPR.

Learn more about Microsoft Purview and its features and how these tools can assist you in the deployment of Microsoft’s generative AI solutions.

Article 28 of the GDPR
(Processor Obligations)

The GDPR requires that, where an organization acts as a data controller, it only uses data processors that provide sufficient guarantees to meet key requirements of the GDPR when processing personal data on its behalf. These key requirements are described in Article 28 of the GDPR and include that data processors commit to:

• only use subprocessors with the consent of the data controller and remain liable for subprocessors;

• process personal data only on instructions from the data controller, including with regard to transfers;

• ensure that persons who process personal data are committed to confidentiality;

• implement appropriate technical and organizational measures to ensure a level of personal data security appropriate to the risk;

• assist the data controller in its obligations to respond to data subjects' requests to exercise their GDPR rights;

• meet the GDPR's breach notification and assistance requirements;

• assist the data controller with data protection impact assessments and consultation with supervisory authorities;

• delete or return personal data at the end of provision of services; and

• support the data controller with evidence of compliance with the GDPR.

How we help you comply: Microsoft provides the contractual commitments required of data processors in Article 28 of the GDPR to its customers in Microsoft's Data Protection Addendum (DPA). You can find these specific commitments in the attachment to the DPA labelled "European Union General Data Protection Regulation Terms", in addition to the main body of the DPA addressing in detail the substantive requirements under the GDPR, including under Article 28.

In this context, it is important to emphasise that the GDPR does not require data controllers to create and use their own data protection terms with their data processors. The European Data Protection Board (EDPB) itself recognises that it is compliant to use a cloud provider's standard terms, subject to their compliance with GDPR Article 28.⁶

A hyperscale cloud provider serves all of its customers uniformly. The contractual structure must accurately reflect how the processor's services operate and protect personal data. Uniformity is standard in cloud services and makes cloud services more manageable, scalable, secure, and affordable than on-site solutions. In a multi-tenant service, a change imposed by one customer may affect all customers using the service. This can be problematic if customers have inconsistent or mutually exclusive requirements. In addition, introducing different security measures or standards for different customers may undermine the security of Microsoft's services as a whole. It is therefore not feasible for Microsoft to change its operational processes or create bespoke contractual commitments and/or contractual structure for individual customers.

In light of this, customers need to understand that creating their own data processing terms when working with hyperscale cloud providers may prevent them from leveraging the rich innovation of cloud-based generative AI services.

⁶ Guidelines 07/2020 on the concepts of controller and processor in the GDPR.
Article 32 of the GDPR
(Technical and Organizational Security Measures)

Article 32 of the GDPR requires data controllers and data processors to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk, taking into account the nature, scope, context and purposes of the processing of personal data. These measures should address the risks associated with accidental or unlawful destruction, loss, alteration, unauthorised disclosure of, or access to personal data transmitted, stored or otherwise processed.

How we help you comply: In the "Data Security" section of Microsoft's Data Protection Addendum, Microsoft contractually commits to implement and maintain appropriate technical and organizational measures to protect "Customer Data" and "Personal Data" against accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, such data transmitted, stored or otherwise processed. Those technical measures are set forth in Microsoft's Security Policy and comply with ISO 27001, ISO 27002 and ISO 27018. Microsoft also contractually commits to encrypting 'Customer Data' (including any 'Personal Data' contained therein), in transit (including between Microsoft data centers) and at rest. Appendix A – Security Measures to Microsoft's Data Protection Addendum also contains comprehensive commitments from Microsoft regarding the security of Customer Data, including in relation to the Organization of Information Security, Asset Management, Human Resources Security, Physical and Environmental Security, Communications and Operations Management, Information Security Incident Management and Business Continuity Management.

The technical, organizational, and security measures described above apply to any Customer Data that customers provide or create when using Copilot for Microsoft 365 and Azure OpenAI Service. You can refer to the information set out above to demonstrate the commitment and measures taken by Microsoft to protect Customer Data (including personal data).

Jump to Part 3 to find out more about Security for Copilot for Microsoft 365.

Jump to Part 4 to find out more about Security for Azure OpenAI Service.

Article 35 of the GDPR
(Data Protection Impact Assessments)

Article 35 of the GDPR requires data controllers to undertake a data protection impact assessment (DPIA) when processing personal data is likely to result in a high risk to the rights and freedoms of data subjects (particularly when this involves using new technologies).

When assessing whether a DPIA is required, data controllers need to take into account the nature, scope, context and purposes of the processing. Therefore, whether a DPIA is required for the use of Copilot for Microsoft 365 and Azure OpenAI Service will depend on the particular use case and type of personal data which you wish to process using these services.

Learn more about when a DPIA must be completed.

Even if it is not legally required, a DPIA is good practice and can help you work through the specific data protection risks associated with the implementation of Copilot for Microsoft 365 and/or Azure OpenAI Service for a specific use case. Preparing a DPIA may also assist you in meeting your accountability obligations under Article 5(2) of the GDPR.

A DPIA must contain at least:

(a) a systematic description of the envisaged processing operations and the purposes of the processing;

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;

(c) an assessment of the risks to the rights and freedoms of data subjects; and

(d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR, taking into account the rights and legitimate interests of data subjects and other persons concerned.

Learn more about the contents of a DPIA.

How we help you comply: The information contained in this paper and the additional resources to which it refers can assist you with completing a DPIA. In particular, the information in:

• Part 3 and Part 4 relating to how Copilot for Microsoft 365 and Azure OpenAI Service process data will assist with completing the elements described in (a) above; and

• the sections on technical and organizational measures for both Copilot for Microsoft 365 and Azure OpenAI Service will assist with completing the elements described in (d) above.

The assessments described in (b) and (c) will vary on a case-by-case basis depending on the use case and the nature, scope and content of the personal data involved, and will need to be undertaken by you.

Learn more about Data Protection Impact Assessments for the GDPR.

Articles 44 to 50 of the GDPR
(Transfers of Personal Data to Third Countries)

The GDPR permits personal data to be transferred to a third country outside of the EU or EEA (including the US) where certain conditions have been satisfied. These conditions include where there has been an adequacy decision by the European Commission or where appropriate additional safeguards (such as the EU Standard Contractual Clauses) have been put in place.

For customers in the UK, the UK GDPR permits personal data to be transferred to a third country outside of the UK (including the US) where certain conditions have been satisfied. These conditions include where there has been an adequacy decision by the UK Secretary of State or where appropriate additional safeguards (such as the International Data Transfer Addendum to the EU Commission Standard Contractual Clauses ("UK Addendum")) have been put in place.

How we help you comply: All transfers of personal data by Microsoft outside of the UK, EU or EEA will be subject to a valid transfer mechanism under the GDPR, including transfers to the US.

The EU Commission and the UK Secretary of State have announced adequacy decisions finding that (for the purpose of Article 45 of the GDPR) the US ensures an appropriate level of protection for personal data transferred from the UK or EU to organizations in the US that are certified to the EU-U.S. Data Privacy Framework. Microsoft is certified under the EU-U.S. Data Privacy Framework and abides by the commitments it entails. Microsoft is committed to embracing the framework and will go beyond it by meeting or exceeding all the requirements this framework outlines for our customers.

Microsoft also continues to utilize the EU Standard Contractual Clauses and UK Addendum globally where appropriate for transfers from the UK or EU, or onward transfers – to the benefit of our customers and their legal certainty around transfers that originate in the EU.

In addition to Microsoft's compliant data transfer mechanisms, Microsoft has established the EU Data Boundary, making robust commitments to store and process customers' data within the EU as specified in Microsoft's Data Protection Addendum and the Microsoft Product Terms, reducing transfers of personal data to third countries and thereby simplifying GDPR compliance. Both Copilot for Microsoft 365 and Azure OpenAI Service are EU Data Boundary services.

The EU Data Boundary is a geographically defined boundary (consisting of the countries in the EU and the European Free Trade Association) within which Microsoft has committed to store and process Customer Data (including any personal data) for certain enterprise online services. The EU Data Boundary uses or may use Microsoft datacenters announced or currently operating in Austria, Belgium, Denmark, Finland, France, Germany, Greece, Ireland, Italy, Netherlands, Norway, Poland, Spain, Sweden, and Switzerland. In the future, Microsoft may establish datacenters in additional countries located in the EU or EFTA to provide EU Data Boundary Services.

There are limited exceptions to the EU Data Boundary that may result in Microsoft processing Customer Data (including personal data) outside of the EU Data Boundary. Where this is the case, Microsoft relies on compliant data transfer mechanisms as set out in the GDPR. Further details relating to these limited circumstances can be found in the Microsoft Product Terms.

Learn more about the EU Data Boundary.

Jump to Part 3 to find out more about Data Residency for Copilot for Microsoft 365.

Jump to Part 4 to find out more about Data Residency for Azure OpenAI Service.
How does the GDPR interact with the AI Act?

The GDPR and the AI Act are intended to be complementary and operate alongside each other, providing a regulatory framework for AI products and services. The GDPR, which regulates the processing of personal data by data controllers and data processors, focuses on data privacy and aims to give individuals control over their personal data.

The AI Act, which applies to providers, importers, distributors, users, and others involved in the AI lifecycle, aims to ensure that AI systems that are used in the EU respect fundamental rights, safety, and ethical principles, as well as address certain risks related to the most highly capable general-purpose AI models.

Find out more about the AI Act and its interaction with the GDPR in Appendix 2: Frequently Asked Questions (FAQs).

Our continued compliance with data protection regulation and open dialogue with key regulators in Europe and across the globe

As privacy and data protection laws advance, norms and requirements evolve in Europe and across the globe; you can be certain that Microsoft will be transparent about our privacy, safety, and security practices. We will comply with laws in Europe and globally that govern AI, and back up our promises with clear contractual commitments.

Beyond adhering to the GDPR and other regulatory requirements applicable to us, Microsoft prioritizes an open dialogue with its customers, partners, and regulatory authorities to better understand and address evolving privacy and data protection concerns.

We continue to work closely with data protection authorities and privacy regulators around the world to share information about how our AI systems work, thereby fostering an environment of trust and cooperation.

Part 3:
Copilot for Microsoft 365

Understanding the potential of generative AI services and how these products and services operate and use personal data is the foundation for compliance with a number of obligations under the GDPR. This Part 3 provides information and links to various external resources which can help you understand how Copilot for Microsoft 365 operates, and provides key information about the product and its features which can be used to assist with completion of a DPIA or other data protection assessment/analysis.

What is Copilot for Microsoft 365 and how does it work?

Copilot for Microsoft 365 is an AI-powered productivity tool that uses "Large Language Models (LLMs)" to work alongside popular Microsoft 365 apps such as Word, Excel, PowerPoint, Outlook, Teams, and more. Copilot for Microsoft 365 provides real-time, intelligent assistance which enables users to enhance their creativity, productivity, and skills.

Copilot for Microsoft 365 is built on top of the same cloud infrastructure as its Microsoft 365 applications, and applies the same principles of confidentiality and privacy to Customer Data that Microsoft has leveraged for years. Copilot for Microsoft 365 adheres to all existing privacy, security, and compliance commitments that apply to Microsoft 365, including Microsoft's GDPR commitments as set out in Microsoft's Data Protection Addendum and in relation to the EU Data Boundary.

Copilot for Microsoft 365 uses the organizational content in your Microsoft 365 tenant, including users' calendars, emails, chats, documents, meetings, contacts, and more, only in accordance with existing access permissions. The richness of the Copilot for Microsoft 365 experience depends on the data sources indexed by Microsoft 365. Customers with the most abundant data in Microsoft 365 (Exchange, OneDrive, SharePoint, Teams) will get the best results from Copilot. With access to comprehensive organizational data, Copilot can suggest more relevant and personalised content based on the user's work context and preferences.

Copilot responds to prompts from your users. A "prompt" is the term used to describe how you ask Copilot for Microsoft 365 to do something for you — such as creating, summarising, editing, or transforming. Think about prompting like having a conversation, using plain but clear language and providing context like you would with an assistant.

When Copilot for Microsoft 365 uses content from the organization's Microsoft 365 tenant to augment the user's prompt and enrich the response, as described above, this is called "grounding". Grounding is different to training. No Customer Data is being used to train the LLM. In fact, the LLM is stateless, meaning that it retains no information about the prompt that was submitted to it, nor any Customer Data that was used to ground it, nor any responses it provided.

Copilot for Microsoft 365 leverages an instance of a foundation LLM hosted in Azure OpenAI. Copilot for Microsoft 365 does not interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API). OpenAI is not a sub-processor to Microsoft, and Customer Data – including the data generated through your organization's use of Copilot for Microsoft 365 such as prompts and responses – is not shared with third parties without your permission.

To get the best responses and the most out of Copilot for Microsoft 365, it's important that you input suitable prompts and avoid certain common pitfalls. Learn more about the skill of prompting: the art and science of prompting (the ingredients of a prompt) and prompting do's and don'ts.

Copilot for Microsoft 365 is:

• built on Microsoft's comprehensive approach to security, compliance, and privacy;

• designed to protect tenant, group, and individual data; and

• committed to responsible AI.

Get an inside look at how LLMs work when you use them with your data in Microsoft 365. Learn more about Copilot for Microsoft 365.

Learn about how Copilot can be used in your favourite Microsoft apps by visiting the Copilot Lab.

You can also find more detailed information about Copilot for Microsoft 365 in our Learn portal.
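To make the grounding and statelessness points above concrete, the pattern can be sketched in a few lines. This is an illustrative sketch only, not Microsoft's implementation: the document structure, the `ground_prompt` function, and the keyword "relevance" filter are hypothetical stand-ins for the Microsoft 365 index and retrieval pipeline.

```python
# Illustrative sketch of the grounding pattern described above.
# Not Microsoft's implementation; data shapes and names are hypothetical.

def ground_prompt(user_prompt: str, tenant_docs: list, user_id: str) -> str:
    """Assemble a grounded prompt from tenant content the user may access."""
    # Grounding respects existing access permissions: only content the user
    # can already read is eligible to be added as context.
    accessible = [d for d in tenant_docs if user_id in d["allowed_users"]]

    # Naive keyword filter, standing in for real relevance ranking.
    prompt_words = set(user_prompt.lower().split())
    relevant = [d for d in accessible
                if prompt_words & set(d["text"].lower().split())]

    context = "\n".join(d["text"] for d in relevant)
    # The grounded prompt is assembled fresh for every request; the LLM that
    # receives it is stateless and retains nothing between calls.
    return f"Context:\n{context}\n\nUser request:\n{user_prompt}"

docs = [
    {"text": "Q3 budget review meeting notes", "allowed_users": ["alice"]},
    {"text": "Confidential salary data", "allowed_users": ["bob"]},
]
grounded = ground_prompt("Summarise the budget meeting", docs, user_id="alice")
```

Because only permitted, relevant content is spliced into each request, the model sees tenant data transiently, one prompt at a time; this is the sense in which grounding differs from training.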
Copilot and your privacy

• Copilot in Windows — Learn more about how Copilot uses your data to assist you on your Windows device: Learn more about your data and privacy.

• Copilot Pro (home users) — Learn more about how Copilot uses your data in Microsoft 365 apps at home: Read about Microsoft 365 apps and your privacy.

• Copilot for Microsoft 365 (IT Pros/admins) — Learn more about how your organizational data is used and protected when using Copilot with Microsoft 365: Get details about data, privacy, and security.

How does Copilot for Microsoft 365 use personal data?

Copilot for Microsoft 365 provides value by connecting Microsoft's LLMs to your organizational data. Copilot for Microsoft 365 accesses content and context to generate responses anchored in your organizational data, such as user documents, emails, calendar, chats, meetings, and contacts. Copilot for Microsoft 365 combines this content with the user's working context, such as the meeting a user is currently attending, email exchanges the user had on a topic, or chat conversations the user had in a given period. Copilot for Microsoft 365 uses this combination of content and context to help provide accurate, relevant, and contextual responses to the user's prompts.

Copilot for Microsoft 365 can reference web content from the Bing search index to ground user prompts and responses. Based on the user's prompt, Copilot for Microsoft 365 determines whether it needs to use Bing to query web content to help provide a relevant response to the user. Controls are available for admins to manage the use of web content.

Abuse monitoring for Copilot for Microsoft 365 occurs in real-time, without providing Microsoft any standing access to Customer Data, either for human or for automated review. While abuse moderation, which includes human review of content, is available for Azure OpenAI Service, this is not required for Copilot for Microsoft 365.

Microsoft will collect and store data about user interactions with Copilot for Microsoft 365. This will include the user's prompt, how Copilot responded, and the information used to ground Copilot's response ("Content Interactions"). Customer admins can view, manage, and search your organization's Content Interactions. It may be necessary to update the privacy notices for your organization's users to ensure they appropriately capture any processing of personal data by admins in this context. See Part 2 for further details of the transparency obligations under the GDPR.

It is important for Microsoft that our customers' data belongs to our customers. Microsoft does not claim ownership of the content created by Copilot for Microsoft 365. All Content Interactions, including user prompts and any output data/content, qualify as "Customer Data" under our Product Terms and Microsoft's Data Protection Addendum.

All Customer Data processed by Copilot for Microsoft 365 is processed and stored in alignment with the contractual commitments that apply to your organization's other content in Microsoft 365.

Copilot for Microsoft 365 does not use Customer Data to train foundation models without the customer's permission.

Security for Copilot for Microsoft 365

As noted in Part 2, the GDPR requires data controllers and data processors to implement appropriate technical and organisational measures to ensure a level of security for any personal data which they process.

The same security and compliance terms apply, by default, to Copilot for Microsoft 365 as already apply for your organization's use of Microsoft 365. Copilot for Microsoft 365 is hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. Copilot for Microsoft 365 was built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

Copilot for Microsoft 365 also respects each user's access permissions to any content that it retrieves. This is important because Copilot for Microsoft 365 will only generate responses based on information the particular user has permission to access.

Microsoft already implements multiple forms of protection to help prevent customers from compromising Microsoft 365 services and applications or gaining unauthorized access to other tenants or the Microsoft 365 system itself. Below are a few examples of those forms of protection:

• Logical isolation of Customer Data within each tenant for Microsoft 365 services is achieved through Microsoft Entra authorization and role-based access control. Learn more about Microsoft 365 isolation controls.

• Microsoft uses rigorous physical security, background screening, and a multi-layered encryption strategy to protect the confidentiality and integrity of Customer Data.

• Microsoft 365 uses service-side technologies that encrypt Customer Data both at rest and in transit, including BitLocker, per-file encryption, Transport Layer Security (TLS), and Internet Protocol Security (IPsec). To learn more about encryption in Microsoft 365, see Encryption in the Microsoft Cloud.

• For content accessed through Copilot for Microsoft 365 plug-ins, encryption can exclude programmatic access, thus preventing the plug-in from accessing the content. Learn more about Configure usage rights for Azure Information Protection.

• As generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modeling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

• We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modeling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and ensure we have proper mitigation strategies in place.

• Your control over your organization's data is reinforced by Microsoft's commitment to comply with broadly applicable privacy laws including the GDPR and privacy standards, such as ISO/IEC 27018, the world's first international code of practice for cloud privacy.

Learn more about Data, Privacy, and Security for Copilot for Microsoft 365.

EU Data Boundary and Data Residency

As we explained in Part 2 of this paper, Copilot for Microsoft 365 is an EU Data Boundary Service.

Learn more about the EU Data Boundary.

When you store data generated by Copilot for Microsoft 365 in Microsoft 365 products that already have data residency commitments under the Product Terms, then the applicable commitments will be upheld. Copilot for Microsoft 365 has been added as a covered workload in the data residency commitments in the Microsoft Product Terms. Microsoft Advanced Data Residency (ADR) and Multi-Geo Capabilities offerings also include data residency commitments for Copilot for Microsoft 365 customers.

Part 4:
Azure OpenAI Service

Understanding how generative AI products and services operate and use personal data is the foundation for compliance with a number of obligations under the GDPR. This Part 4 provides information and links to various external resources which can help you understand how Azure OpenAI Service operates, and provides key information about the service and its features which can be used to assist with completion of a DPIA or other data protection assessment/analysis.

What is Azure OpenAI Service and how does it work?

Azure OpenAI Service is a cloud-based platform that enables customers to build and deploy their own generative AI applications leveraging the power of AI models. Azure OpenAI Service provides customers with access to a set of LLMs for the development of generative AI experiences.

From generating realistic images and videos to enhancing customer experiences, generative AI has proven to be a versatile tool across various industries. The models underlying Azure OpenAI Service can be easily adapted to your specific task, including: content design, creation and generation; summarization; semantic search; natural language to code translation; accelerated automation; personalised marketing; chatbots and virtual assistants; product and service innovation; language translation and natural language processing; fraud detection and cybersecurity; predictive analytics and forecasting; creative writing; and medical research and diagnosis.

Azure OpenAI Service is fully controlled by Microsoft. Microsoft hosts the OpenAI models in Microsoft's Azure environment, and the service does not interact with any services operated by OpenAI (e.g. ChatGPT or the OpenAI API). OpenAI owns and trains the foundation LLMs which Microsoft uses, and Microsoft has a license to offer services that rely on these foundation LLMs.

OpenAI is not a sub-processor to Microsoft, and customer data – including the data generated through your organization's use of Azure OpenAI Service, such as prompts and responses – is kept private and is not shared with third parties without your permission.

Learn more about the underlying LLMs that power the Azure OpenAI Service.
Azure OpenAI Service can be used in the following ways:

• Prompt engineering: Prompt engineering is a technique that involves designing prompts for LLMs. Prompts are submitted by the user, and content is generated by the service, via the completions, chat completions, images, and embeddings operations. This process improves the accuracy and relevance of responses, optimizing the performance of the model. Learn more about prompt engineering.

• Azure OpenAI On Your Data: When using the "on your data" feature, the service retrieves relevant data from a configured Customer Data store and augments the prompt to produce generations that are grounded with your data.

Azure OpenAI "on your data" enables you to run supported LLMs on your organization's data without needing to train or fine-tune models. Running models on Customer Data enables you to analyze your data with greater accuracy and speed. By doing so, you can unlock valuable insights that can help you make better decisions, identify trends and patterns, and optimize your operations.

One of the key benefits of Azure OpenAI "on your data" is its ability to tailor the content of conversational AI. Because the model within Azure OpenAI Service has access to, and can reference, specific sources to support responses, answers are not only based on its pre-trained knowledge but also on the latest information available in the designated data source. This grounding data also helps the model to avoid generating responses based on outdated or incorrect information. Learn more about Azure OpenAI On Your Data.

• Azure OpenAI fine-tuning: You can provide your own training data consisting of prompt-completion pairs for the purposes of fine-tuning an OpenAI model. This process fine-tunes an existing LLM using example data. Fine-tuning refers to the process of retraining pre-trained models on specific datasets, typically to improve model performance on specific tasks or introduce information that wasn't well represented when the base model was originally trained. The outcome is a new "custom" LLM that has been optimized for the customer using the provided examples.

Training data and fine-tuned models:

1. Are available exclusively for use by your organization.

2. Are stored within the same region as the Azure OpenAI resource.

3. Can be deleted by the customer at any time.

When you upload custom data to fine-tune the results of the LLM, both the Customer Data and the results of the fine-tuned model are maintained in a protected area of the cloud, stored in your tenant – accessible only by your organization and separated by robust controls to prevent any other access. The Customer Data and results can additionally be encrypted by either Microsoft-managed or customer-managed encryption keys in a Bring Your Own Key format if a customer so chooses. Learn more about Azure OpenAI fine-tuning.

In most instances, Microsoft can support and troubleshoot any problems with the service without needing access to any Customer Data (such as the data that was uploaded for fine-tuning). In the rare cases where access to Customer Data is required, whether in response to a customer-initiated support ticket or a problem identified by Microsoft, you can assert control over access to that data by using Customer Lockbox for Microsoft Azure. Customer Lockbox gives customers the ability to approve or reject any access request to their Customer Data.

Whether content is used to ground prompts using the "on your data" feature, or whether content is used to build a fine-tuned model, the Customer Data is not being used to train the foundation LLM. In fact, the LLM is stateless, meaning that it retains no information about the prompt that was submitted to it, nor any Customer Data that was used to ground it, nor any responses it provided. The LLM is not trained and does not learn at any point during this process; it is exactly the same foundational model even after millions of prompts are run through it.

You can find detailed information about Azure OpenAI Service through the Azure OpenAI Service documentation, quickstarts and API reference guides.

21
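The shape of an "On Your Data" grounded request can be sketched in code. This is a minimal illustration only: it builds the request payload rather than calling the service, the deployment, endpoint, and index names are hypothetical, and the field layout follows the Azure-specific "data_sources" extension to the chat completions request; check the current Azure OpenAI reference before relying on exact field names.

```python
# Sketch: building an Azure OpenAI "On Your Data" chat request payload.
# Deployment, endpoint, and index names are hypothetical placeholders;
# the "data_sources" field layout is an assumption to verify against the
# current Azure OpenAI API reference.

def build_grounded_request(question: str, search_endpoint: str, index_name: str) -> dict:
    """Return chat-completion kwargs that ground the answer in a designated
    Azure AI Search index instead of relying only on pre-trained knowledge."""
    return {
        "model": "my-gpt-4-deployment",  # your Azure OpenAI deployment name
        "messages": [{"role": "user", "content": question}],
        # extra_body carries the Azure-specific grounding extension
        "extra_body": {
            "data_sources": [
                {
                    "type": "azure_search",
                    "parameters": {
                        "endpoint": search_endpoint,
                        "index_name": index_name,
                        # keep credentials out of source control in practice
                        "authentication": {"type": "system_assigned_managed_identity"},
                    },
                }
            ]
        },
    }

request = build_grounded_request(
    "What does our retention policy say about support tickets?",
    "https://contoso-search.search.windows.net",
    "policies-index",
)
# These kwargs would be passed to AzureOpenAI(...).chat.completions.create(**request)
print(request["extra_body"]["data_sources"][0]["type"])
```

Because the grounding data stays in the designated search index, the payload only points at the source; nothing is copied into the model itself.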
Preventing abuse and harmful content generation

To reduce the risk of harmful use of Azure OpenAI Service, both content filtering and abuse monitoring features are included.

Content filtering is the process by which responses are synchronously examined by automated means to determine if they should be filtered before being returned to a user. This examination happens without the need to store any data, and with no human review of the prompts (i.e. the text provided by users as requests) or the responses (i.e. the data delivered back to the user).

Learn more about content filtering.

Abuse monitoring is conducted by a separate process. The data retained for this purpose may be accessed only by authorized Microsoft personnel to assist with debugging and to protect against abuse or misuse of the system. The human reviewers are authorized Microsoft employees who access the data via point-wise queries using request IDs, Secure Access Workstations (SAWs), and Just-In-Time (JIT) request approval granted by team managers.

Learn more about abuse monitoring.

This human review may create a challenge for customers, who need to strike a balance between the safety of the system and the risks of external access – even under controlled conditions. To accommodate that balance, Microsoft offers limited access features that allow approved customer use cases to opt out of these human review and data logging processes.

Some customers may want to use Azure OpenAI Service for a use case that involves the processing of sensitive, highly confidential, or legally regulated input data, but where the likelihood of harmful outputs and/or misuse is low. These customers may conclude that they do not want, or do not have the right, to permit Microsoft to process such data for abuse detection as described above, due to their internal policies or applicable law. To address these concerns, Microsoft allows customers who meet additional Limited Access eligibility criteria and attest to specific use cases to apply to disable the Azure OpenAI content management features by completing this form.

If Microsoft approves a customer's request to disable abuse monitoring, then Microsoft does not store any prompts and completions associated with the approved Azure subscription for which abuse monitoring is configured off. In this case, because no prompts and completions are stored at rest in the service results store, the human review process is not possible and is not performed.
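When content filtering blocks a prompt or completion, the service returns an error that an application can inspect before deciding what to show the user. The sketch below parses an error body of the general shape the service documents for content filtering (an error code of "content_filter" with per-category results); the exact payload layout is an assumption to verify against the content filtering documentation, and the sample body is invented for illustration.

```python
# Sketch: inspecting a content-filter rejection from Azure OpenAI.
# The payload shape (code "content_filter" plus per-category results under
# innererror) is an assumption based on the documented error format.

def filtered_categories(error_body: dict) -> list[str]:
    """Return the names of content categories that triggered filtering."""
    error = error_body.get("error", {})
    if error.get("code") != "content_filter":
        return []  # not a content-filter error
    results = error.get("innererror", {}).get("content_filter_result", {})
    return sorted(name for name, result in results.items() if result.get("filtered"))

# Invented sample body in the assumed shape:
sample = {
    "error": {
        "code": "content_filter",
        "message": "The response was filtered.",
        "innererror": {
            "code": "ResponsibleAIPolicyViolation",
            "content_filter_result": {
                "hate": {"filtered": False, "severity": "safe"},
                "violence": {"filtered": True, "severity": "medium"},
            },
        },
    }
}

print(filtered_categories(sample))  # → ['violence']
```

A caller might use this to log the triggering category and return a generic refusal message, rather than surfacing the raw service error to end users.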
How does the Azure OpenAI Service use personal data?

The diagram below illustrates how your organization's data is processed by Azure OpenAI Service. This diagram covers three different types of processing:

1. How Azure OpenAI Service processes your prompts to generate content (including when additional data from a connected data source is added to a prompt using Azure OpenAI "On Your Data").

2. How Azure OpenAI Service creates a fine-tuned (custom) model with your training data.

3. How Azure OpenAI Service and Microsoft personnel analyze prompts, completions, and images for harmful content and for patterns suggesting the use of the service in a manner that violates the Code of Conduct or other applicable product terms.

Customer prompts (inputs) and completions (outputs), embeddings, and training data:

• are NOT available to other customers.

• are NOT available to OpenAI.

• are NOT used to train foundation models without the customer's permission.

• are NOT used to improve any Microsoft or 3rd party products or services.

• are NOT used for automatically improving Azure OpenAI models for your use in your resource (the models are stateless unless you explicitly fine-tune models with your training data).

Customer fine-tuned Azure OpenAI models are available exclusively for your organization's use.
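As a concrete illustration of the training data mentioned in point 2, fine-tuning input is typically supplied as a JSONL file of example exchanges. The sketch below writes two invented examples in the chat-style format commonly used for fine-tuning recent models; the exact schema accepted by your model version should be checked against the Azure OpenAI fine-tuning documentation.

```python
# Sketch: preparing a JSONL training file for Azure OpenAI fine-tuning.
# The chat-message schema shown is the commonly used fine-tuning format;
# verify the exact schema for your model version. Examples are invented.
import json

examples = [
    {
        "messages": [
            {"role": "system", "content": "You answer HR policy questions."},
            {"role": "user", "content": "How many vacation days do new hires get?"},
            {"role": "assistant", "content": "New hires receive 25 days per year."},
        ]
    },
    {
        "messages": [
            {"role": "system", "content": "You answer HR policy questions."},
            {"role": "user", "content": "Can unused days carry over?"},
            {"role": "assistant", "content": "Up to 5 unused days carry over."},
        ]
    },
]

with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")  # one JSON object per line

# The file would then be uploaded to your Azure OpenAI resource and
# referenced when creating a fine-tuning job.
print(sum(1 for _ in open("training_data.jsonl", encoding="utf-8")))  # → 2
```

Consistent with the processing described above, such training data stays within the customer's resource and produces a custom model available only to that organization.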
Security for Azure OpenAI

As noted in Part 2 of this paper, the GDPR requires data controllers and data processors to implement appropriate technical and organisational measures to ensure a level of security for any personal data which they process.

Security is built in throughout the development lifecycle of all of our enterprise services (including those that include generative AI technology), from inception to deployment.

Azure OpenAI Service is hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. These services were built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

As generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modelling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modelling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and confirm we have proper mitigation strategies in place.

Learn more about data, privacy and security for Azure OpenAI Service.

EU Data Boundary and Data Residency

Azure OpenAI Service is an EU Data Boundary service. For the purpose of interpreting the "EU Data Boundary Services" section of the Product Terms, Azure OpenAI Service is an Azure service that enables deployment in a region within the EU Data Boundary.

Learn more about the EU Data Boundary.

In relation to:

• Azure OpenAI On Your Data feature: Any data sources you provide to ground the generated results remain stored in the data source and location you designate. No data is copied into the Azure OpenAI service.

• Training data and fine-tuned (custom) LLMs: These are stored within the same region as the Azure OpenAI resource in the customer's Azure tenant.

• Abuse monitoring for customers who use Azure OpenAI Service in Europe: This review is conducted exclusively by Microsoft employees in the European Economic Area. The data store where prompts and completions are stored is logically separated by customer resource (each request includes the resource ID of the customer's Azure OpenAI resource). A separate data store is located in each region in which Azure OpenAI Service is available, and a customer's prompts and generated content are stored in the Azure region where the customer's Azure OpenAI Service resource is deployed, within the Azure OpenAI Service boundary.
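In practice, residency within the EU Data Boundary follows the Azure region chosen when the resource is provisioned. As a rough provisioning sketch (the resource-group and account names are hypothetical, and the flags should be checked against current Azure CLI documentation), an Azure OpenAI resource could be created in an EU region such as Sweden Central like this:

```shell
# Sketch: creating an Azure OpenAI resource in an EU region so that prompts,
# completions, and fine-tuned models are stored in that region.
# Names are hypothetical; verify flags against current az CLI documentation.
az cognitiveservices account create \
  --name contoso-openai-eu \
  --resource-group contoso-rg \
  --kind OpenAI \
  --sku S0 \
  --location swedencentral \
  --custom-domain contoso-openai-eu
```

The `--location` value is what anchors the deployment, and therefore the stored data, to a region inside the EU Data Boundary.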
Part 5:
Conclusion

Microsoft runs on trust. We are committed to security, privacy, and compliance across everything we do, and our approach to generative AI is no different. As an industry leader in the provision of generative AI solutions, we are trusted by customers across the world and adhere to the strictest privacy and security standards in the industry. We provide superior products and services to our customers, thereby facilitating continued progress towards their digital transformation goals.

Furthermore, we have been intentional about signalling to our customers our willingness and commitment to get our data protection and privacy settings right to ensure compliance with the GDPR. We demonstrate this commitment through our contracts, extensive technical documentation (providing details about our data processes and activities), and the implementation of technical and organizational safeguards to mitigate residual privacy and security risks. This is backed by consistent engagement with regulatory and industry stakeholders whom we partner with on our journey towards responsibility, accountability, and integrity in the delivery of generative AI solutions at scale.

As the regulatory landscape evolves and we innovate to provide new kinds of AI solutions, we are keenly aware that organizations will continue to look to us to help decipher and operationalize the requirements of new and existing data protection frameworks. Microsoft will continue to offer industry-leading tools, transparency resources and support, and we look forward to the opportunity to continue to demonstrate our enduring commitment to meeting the needs and demands of our European customers in their AI journey.
Appendix 1:
Business opportunities arising from generative AI

The availability of generative AI solutions has served as an accelerator to the consideration of generative AI use cases. This Appendix sets out several relevant areas of impact for consideration by businesses.

AI Transformation Opportunities

The integration of generative AI into business operations is driven by several key opportunities:

Enrich Employee Experiences: Automating routine and time-consuming tasks frees up human resources to focus on more strategic initiatives. AI-driven processes reduce human error and increase the precision of outputs, from financial forecasting to legal compliance checks.

Reinvent Customer Engagement: By providing personalized experiences and rapid responses to customer inquiries, AI can help improve overall customer satisfaction and loyalty.

Reshape Business Processes: As businesses grow, AI can easily scale to handle increased data and transaction volumes, ensuring consistent performance without proportional increases in operational costs.

Bend the Curve on Innovation: AI facilitates the exploration of new business models and services by leveraging AI to identify trends, predict market movements, and customize offerings.

This introduction sets the stage for a detailed exploration of generative AI's specific applications within different industries, demonstrating how its capabilities are not just theoretical but have practical and transformative impacts on business operations.
General Use Cases for Copilot for Microsoft 365

Copilot for Microsoft 365 is designed to enhance operational efficiencies and decision-making across a wide range of industries. This section outlines the most popular and universally applicable use cases for Copilot for Microsoft 365, demonstrating its flexibility and the value it adds to any business operation.

• Automated Customer Support: advanced virtual assistants and chatbots that manage customer inquiries, provide real-time support, and resolve issues autonomously. This reduces response times, increases customer satisfaction, and decreases the operational costs associated with maintaining large customer service teams.

• Document Automation and Management: create, format and manage documents. Copilot for Microsoft 365 can generate reports, draft correspondence, and prepare presentations based on user inputs. This enhances productivity and ensures consistency across all business communications, allowing staff to focus on more strategic tasks.

• Data Analysis and Insights Generation: analyze large datasets to identify trends, perform predictive analytics, and generate actionable insights, which are crucial for decision-making. This helps businesses make informed decisions based on data-driven insights, optimizing operations and improving strategic planning.

• Workflow and Process Automation: automate repetitive and time-consuming tasks such as data entry, scheduling, and process tracking, integrating seamlessly with existing systems to streamline workflows. This increases operational efficiency, reduces human error, and frees up employees to focus on higher-value activities.

• Personalized Content and Recommendations: Copilot for Microsoft 365 tailors content and recommendations to individual users based on their behaviors, preferences, and past interactions, commonly used in sectors like e-commerce, media, and content delivery. This enhances user engagement and satisfaction, leading to increased loyalty and revenue from personalized experiences.

Department and Employee Specific Use Cases

We have developed the Microsoft Copilot Scenario Library to provide guidance by department and employee-specific scenarios to get inspired, empower your workforce and realize value from your Copilot for Microsoft 365 investment. Find more examples by department and role at the following links:

Use Cases for Finance Department

Use Cases for Human Resources Department

Use Cases for Information Technology

Use Cases for Marketing Department

Use Cases for Sales
Industry-Specific Use Cases

This section explores the specific applications of Copilot for Microsoft 365 in three critical industries: legal, banking, and healthcare. By highlighting targeted use cases, we demonstrate Copilot's effectiveness in addressing industry-specific challenges and enhancing core operations.

1. Legal Industry Use Cases

• Contract Review and Analysis: automates the review process by comparing contract clauses against legal standards and previous contracts. This increases efficiency, reduces human error, and ensures compliance with legal standards.

• Litigation Support: assists in organizing and analyzing vast amounts of case-related data to support litigation processes. This saves time and enhances the preparation and presentation of legal arguments.

• Compliance Monitoring: continuously scans for changes in legislation to help firms remain compliant with all relevant laws. This reduces the risk of legal penalties and enhances the firm's reputation for diligence.

2. Banking Industry Use Cases

• Fraud Detection: utilizes AI to monitor transactions in real-time and identify patterns indicative of fraudulent activity. This minimizes financial losses and protects customer trust.

• Risk Assessment: analyzes customer data to predict and mitigate potential risks in lending and investments. This enhances the bank's ability to manage and mitigate risk effectively.

• Regulatory Compliance Tracking: keeps track of all regulatory requirements and ensures the bank complies with financial regulations. This avoids legal penalties and maintains operational integrity.

3. Healthcare Industry Use Cases

• Patient Data Management: manages and secures vast amounts of patient data, facilitating easy access for healthcare providers. This improves the efficiency and confidentiality of patient care.

• Diagnostic Assistance: provides support in diagnosing diseases by analyzing patient data and medical imagery. This enhances the accuracy of diagnoses and the effectiveness of treatment plans.

• Remote Patient Monitoring: monitors patients remotely using data from wearable devices, providing real-time health updates to providers. This reduces hospital readmissions and allows for proactive healthcare management.
Appendix 2:
Frequently Asked Questions (FAQs)

How is my organization's data protected when I use Microsoft's Generative AI Services?

Microsoft runs on trust. We are committed to security, privacy, and compliance across everything we do, and our approach to generative AI is no different.

Privacy is built into our approach to Responsible AI and we will continue to uphold our core values of privacy, security, fairness, accountability, transparency, reliability, inclusiveness and safety in our AI products and solutions.

In Part 2 of this paper, we outline seven commitments that demonstrate our continued commitment to protecting our customers' data when they use our Generative AI services:

• We will keep your organization's data private.

• You are in control of your organization's data.

• Your access control and enterprise policies are maintained.

• Your organization's data is not shared without your permission.

• Your organization's data privacy and security are protected by design.

• Your organization's data is not used to train foundation models without your permission.

• Our products and solutions continue to comply with global data protection regulations.

What is generative AI and what are the different types of AI models Microsoft uses?

Generative AI is a type of artificial intelligence that can create new things, like pictures, text, or speech, that are similar to examples it has seen before. It does this by learning from a set of examples, figuring out the patterns and rules that make them similar, and then using those patterns and rules to make new examples that are similar to the ones it learned from. It's different from other types of AI because it can create new things, instead of just recognizing or classifying things it has seen before.

Microsoft's Azure OpenAI Service and Copilot for Microsoft 365 allow customers to leverage OpenAI's models, including GPT-3, GPT-4, and Codex, in the Microsoft environment. These models are commonly referred to as "foundation models," which are generally understood to be large-scale AI models that are trained on vast quantities of primarily unlabeled data at scale (usually by self-supervised learning), and can be adapted with minimal fine-tuning for a range of different downstream tasks.
What are the differences between cloud and generative AI services from a GDPR perspective?

The obligations under the GDPR which apply to using cloud computing services are the same as those which apply to using generative AI services. The GDPR requires a risk-based approach to be taken towards the implementation and use of any new technologies.

The level of risk involved will depend on the nature, scope, context, and purpose for which personal data will be used. When using cloud services and/or generative AI services, an organization will need to consider what technical and organisational measures are in place to protect and safeguard the use of personal data, and ensure that it has appropriate contractual commitments and operational processes in place to comply with its obligations under the GDPR.

Find out more about how Microsoft can assist customers in undertaking this assessment when they are looking to use Copilot for Microsoft 365 and/or Azure OpenAI Service in Part 2 of this paper.

What are the key obligations of the GDPR that apply to generative AI systems?

The obligations under the GDPR will apply whenever a generative AI system uses or otherwise processes personal data.

Key obligations which organizations should consider when procuring and/or implementing generative AI systems include:

• consider whether you need to update your privacy notices to reflect any new processing activity or to clarify activities (Articles 12 – 14);

• ensure you have processes in place to enable you to comply with data subject rights requests (Articles 15 – 21 of the GDPR);

• ensure that any agreement you have with a data processor complies with Article 28 of the GDPR, including in relation to security measures and international transfers;

• consider whether you need to conduct a data protection impact assessment (DPIA) (Article 35 of the GDPR); and

• ensure that all transfers of data outside of the UK, EU or EEA are made subject to a valid transfer mechanism (Articles 44 – 50).

Learn more about how Microsoft assists customers in meeting these obligations in Part 2 of the paper.
How does the GDPR interact with the AI Act?

The AI Act is a new law currently being put in place in the EU to regulate AI systems. It will apply to providers, importers, distributors, users, and others involved in the AI lifecycle, aiming to ensure that AI systems that are used in the EU respect fundamental rights, safety, and ethical principles, as well as address certain risks related to the most highly capable general-purpose AI models.

The GDPR and the AI Act are intended to be complementary and operate alongside each other, providing a regulatory framework for AI products and services.

The GDPR, which regulates the processing of personal data by data controllers and data processors, focuses on data privacy and aims to give individuals control over their personal data. Under the AI Act most of the regulatory burden will fall on providers of high-risk AI systems and general-purpose AI (GPAI) models.

Although the GDPR and the AI Act are different in their scope and purpose, they interact with each other in several ways. For example:

• The GDPR requires data controllers to conduct a DPIA in certain circumstances. The AI Act refers to this obligation and requires users of high-risk AI systems to use certain mandatory user-facing information to comply with their DPIA obligations under the GDPR.

• The GDPR applies where personal data is processed to train an AI system or where an AI system is being used to process personal data.

Adopting the measures outlined in this paper for GDPR compliance is therefore complementary to the AI Act and the associated obligations that will apply under this new legislation.

At Microsoft, we are committed to compliance with the EU AI Act. Our multi-year effort to define, evolve, and implement our Microsoft Responsible AI Standard and internal governance has strengthened our readiness. As final requirements under the EU AI Act are defined in more detail, we look forward to working with policymakers to ensure feasible implementation and application of the rules, to demonstrating our compliance, and to engaging with our customers and other stakeholders to support compliance across the ecosystem.

How does Microsoft comply with applicable law?

Microsoft's AI products and solutions are designed and built for compliance with applicable data protection and privacy laws today, including the GDPR.

Microsoft's approach to protecting privacy in AI is underpinned by a commitment to compliance with existing and emerging regulatory and legal obligations globally. We will continue to support meaningful privacy and AI regulation, and believe that the best way to make rapid progress on needed guardrails for AI is to lean in to existing legal protections, approaches and regulatory tools that could be applied to protecting privacy and safety in these systems today.

Does Microsoft share Customer Data with OpenAI/ChatGPT?

No. Your organization's Customer Data, including prompts (inputs) and completions (outputs), your embeddings, and any training data you might provide to the Microsoft Online Services, are not available to OpenAI.

Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft's Azure environment and Azure OpenAI Service does not interact with any services operated by OpenAI (e.g., ChatGPT, or the OpenAI API). OpenAI is not a sub-processor to Microsoft.

Learn more about the underlying OpenAI models that power Azure OpenAI Service.

Can I share confidential information with Microsoft's Generative AI services?

Yes. When using Azure OpenAI or Copilot for Microsoft 365, customers may confidently share their confidential information. The foundation models that are accessed via Azure OpenAI Service and Copilot for Microsoft 365 do not use Customer Data for training without permission. These foundation models are stateless and do not store any data, including prompts that a customer inputs and completions that the model outputs. Customers can also trust that their confidential information will not be transmitted to other customers.
How does Microsoft protect security in this new era of AI?

Security is built in throughout the development lifecycle of all of our enterprise services (including those that include generative AI technology), from inception to deployment.

Azure OpenAI Service and Copilot for Microsoft 365 are hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. These services were built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

Because generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modeling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modeling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and ensure we have proper mitigation strategies in place.

Learn more about Security for Copilot for Microsoft 365 in Part 3 of this paper, and about Security for Azure OpenAI Service in Part 4 of this paper.

Are data transfers to countries outside of the UK, EU or EEA allowed under the GDPR?

Yes, personal data can be transferred to countries outside the UK, EU or EEA where certain conditions are met, including where: (a) there is an adequacy decision by the European Commission or the UK Secretary of State (Article 45 of the GDPR); or (b) the transfer is subject to additional safeguards, which include the EU Standard Contractual Clauses and the UK IDTA (Article 46 of the GDPR).

Microsoft's transfers of personal data outside of the UK, EU or EEA utilize valid transfer mechanisms under the GDPR, including EU-U.S. Data Privacy Framework certification and EU Standard Contractual Clauses as appropriate.

Find out more about how Microsoft approaches data transfers to third countries in Part 2 of this paper.

Where will my data be stored and processed?

Your data residency choices will be respected when you use Microsoft's Generative AI products and services that offer local storage and/or processing capabilities.

Azure OpenAI Service and Copilot for Microsoft 365 will process and store your data within EU/EFTA for EU Data Boundary (EUDB) customers, as set forth in the Product Terms and the EU Data Boundary Transparency Documentation.

Do organizations need to develop a customized data protection addendum (DPA)?

No, the GDPR does not require that each data controller has a customized data protection addendum with their data processors. Microsoft's Data Protection Addendum is compliant with the requirements of Article 28 of the GDPR.

It is not viable for hyperscale cloud providers to offer different terms for different customers, as it is the uniformity of the services which makes cloud services more manageable, scalable, secure and affordable than on-site solutions. In addition, introducing different security measures or standards for different customers could undermine the security of Microsoft's services as a whole. It is therefore not feasible for Microsoft to change its operational processes or create bespoke contractual commitments and/or contractual structure for every customer.

Find out more about Microsoft's data processor obligations in Part 2 of this paper.
How can customers set up their use of generative AI services to be compliant with the GDPR?

The GDPR requires data controllers to consider data protection issues at every stage of their processing activities, from the initial design to final implementation.

The risks associated with the use of generative AI will vary depending on the specific use case and the related nature, sensitivity, and volume of personal data that will be used in connection with that use case.

One way you can demonstrate compliance with the GDPR is to complete a data protection impact assessment (DPIA) relating to specific use cases for generative AI solutions. A DPIA helps organizations identify and reduce the data protection risks. A DPIA is legally required where the processing activity is likely to result in a high risk to the rights and freedoms of data subjects. Even if it is not legally required, a DPIA is good practice and can help you work through the specific data protection risks associated with how you wish to implement generative AI for a specific use case. Find out more about DPIAs in Part 2 of this paper.

Can customers comply with the GDPR when using a public cloud to use generative AI services?

Microsoft's public cloud services have been developed to ensure they can be used by customers in compliance with the GDPR (and many customers already make use of these services). The information set out in this paper and contained in the Product Terms and Data Protection Addendum can be used by you to undertake an appropriate risk-based assessment of any proposed use of Copilot for Microsoft 365 and Azure OpenAI Service so as to demonstrate compliance with the relevant requirements of the GDPR.

How can organizations comply with their transparency obligations under the GDPR when deploying AI technologies?

Articles 12 to 14 of the GDPR require organizations to provide data subjects with certain key information about how their personal data will be used. This information is often provided in the form of privacy notices. If you deploy a new technology (such as Copilot for Microsoft 365 or Azure OpenAI Service) and intend to use such technology in a way that is not reflected in your existing privacy notices, then you will need to update your privacy notices to reflect these new processing activities.

The information set out in this paper is intended to assist you to understand how Copilot for Microsoft 365 and Azure OpenAI Service use data and to determine what information needs to be communicated to data subjects.
Appendix 3:
Additional Resources

Microsoft is committed to providing our customers with clear information about how we use and share data, and choices they have in managing their data. This Appendix sets out additional resources which you can reference to supplement and expand on the information set out in this paper.

Responsible AI

• Empowering responsible AI practices
• Governing AI: A Blueprint for the Future
• Microsoft's principles and approach to Responsible AI
• Microsoft Responsible AI Standard
• Responsible AI Transparency Report

Microsoft's Customer Commitments

• AI Assurance Program and AI Customer Commitments
• Customer Copyright Commitment
• Protecting the data of our commercial and public sector customers in the AI era
• FAQ: Protecting the Data of our Commercial and Public Sector Customers in the AI Era

Understanding Generative AI

• The underlying LLMs that power Microsoft's generative AI solutions
• The art and science of prompting (the ingredients of a prompt)
• Prompting do's and don'ts

Data Protection Addendum and Product Terms

• Data Protection Addendum
• Microsoft Product Terms

Data Residency Commitments

• The EU Data Boundary
• EU Data Boundary Transparency Documentation
• Advanced Data Residency (ADR)
• Multi-Geo Capabilities

Data Protection Impact Assessments (DPIA)

• DPIAs and their contents
• Data Protection Impact Assessments for the GDPR

AI for Business

• AI Solutions for Organizations
• AI driven businesses surge ahead of competition
• AI business value and benefits
• The business opportunity of AI

Copilot for Microsoft 365

• Copilot for Microsoft 365
• Copilot Lab
• Copilot for Microsoft 365 Documentation
• Data, Privacy, and Security for Copilot for Microsoft 365
• FAQs for Copilot data security and privacy
• Microsoft 365 isolation controls
• Encryption in the Microsoft Cloud
• Microsoft Copilot Scenario Library

Azure OpenAI Service

• Azure OpenAI Service - Documentation, quickstarts and API reference guides
• Configure usage rights for Azure Information Protection
• Data, privacy and security for Azure OpenAI Service
• Prompt Engineering
• Azure OpenAI On Your Data
• Azure OpenAI fine tuning
• Content filtering
• Abuse monitoring
• Enterprise security for Azure Machine Learning
© Microsoft Corporation 2024. All rights reserved.

Microsoft makes no warranties, express or implied, in this document. This document is for informational purposes only and provided "as-is." The document may not contain the most up-to-date information or guidance. Information and views expressed in this document, including references to any of our terms, URLs and other references, may change without notice. You bear the risk of using it. This document is not legal or regulatory advice and does not constitute any warranty or contractual commitment on the part of Microsoft. You should seek independent legal advice on your legal and regulatory obligations.

This document does not provide you with any legal rights to any intellectual property in any Microsoft product. You may copy and use this document for your internal, reference purposes.