GDPR & Generative AI - A Guide For Customers
May 2024
Contents
Executive Summary....................................................................................................................................3
Introduction.................................................................................................................................................5
Part 1: Responsibly using AI - Microsoft’s AI journey and leveraging our tools and resources.......6
Responsible AI............................................................................................................................................. 6
Tools, Commitments, and Resources to Assist your AI Deployment........................................................... 7
Part 2: The GDPR Compliance Framework in the Context of AI............................................................8
What is the GDPR and who does it apply to?............................................................................................. 8
Leverage established principles to comply with regulatory frameworks when using AI solutions............. 8
Who is responsible for GDPR compliance when using AI and cloud services?........................................... 9
Compliance with the GDPR is a shared responsibility................................................................................. 9
How does Microsoft support customers with their GDPR compliance obligations?................................... 9
Protecting the data of our customers - Microsoft’s privacy commitments in the AI era.......................... 10
Key obligations under the GDPR in the context of generative AI services................................................ 11
• Articles 12 to 14 of the GDPR (Transparency).................................................................................... 11
• Articles 15 to 21 of the GDPR (Data Subject Rights).......................................................................... 11
• Article 28 of the GDPR (Processor Obligations)................................................................................. 12
• Article 32 of the GDPR (Technical and Organizational Security Measures)........................................ 13
• Article 35 of the GDPR (Data Protection Impact Assessments)......................................................... 14
• Articles 44 to 50 of the GDPR (Transfers of Personal Data to Third Countries).................... 15
How does the GDPR interact with the AI Act?.......................................................................................... 16
Our continued compliance with data protection regulation and open dialogue with key regulators in
Europe and across the globe..................................................................................................................... 16
Part 3: Copilot for Microsoft 365........................................................................................................... 17
What is Copilot for Microsoft 365 and how does it work?........................................................................ 17
How does Copilot for Microsoft 365 use personal data?.......................................................................... 18
Security for Copilot for Microsoft 365....................................................................................................... 19
EU Data Boundary and Data Residency..................................................................................................... 19
Part 4: Azure OpenAI Service.................................................................................................................. 20
What is Azure OpenAI Service and how does it work?............................................................................. 20
Preventing abuse and harmful content generation.................................................................................. 22
How does the Azure OpenAI Service use personal data?......................................................................... 23
Security for Azure OpenAI......................................................................................................................... 24
EU Data Boundary and Data Residency..................................................................................................... 24
Part 5: Conclusion.................................................................................................................................... 25
Appendix 1: Business opportunities arising from generative AI........................................................ 26
Appendix 2: Frequently Asked Questions (FAQs)................................................................................. 29
Appendix 3: Additional Resources......................................................................................................... 34
Executive Summary
• The use cases for generative AI present an exciting opportunity to improve the quality of services and operational efficiency. At Microsoft we want to empower our customers to harness the full potential of new technologies like generative artificial intelligence (generative AI), while complying with their obligations under the General Data Protection Regulation (GDPR).

• Microsoft is committed to ensuring its AI systems are developed responsibly and in a way that is worthy of people’s trust. We drive this commitment according to six key principles which align closely with the fundamental principles set out in Article 5 of the GDPR.

• When considering GDPR compliance in the context of using generative AI services, the fundamental principles of the GDPR apply in the same manner as they do for processing personal data in any other context (e.g. the use of cloud services). So, while AI technology may be new, the principles and accordingly the processes for risk assessment and compliance with the GDPR remain the same. Hence, to ensure GDPR compliance, organizations should be confident to approach Microsoft’s AI services in the same way as they have approached using other cloud services.

• There are a number of key obligations under the GDPR which organizations need to consider when using generative AI services. In this paper we have included details of these obligations and the associated support and resources which Microsoft can offer, including in relation to international transfers of personal data, transparency, data subject rights, processor obligations, technical and organizational security measures, and DPIAs.

• Our customers’ data belongs to our customers. Microsoft does not claim ownership of any customer prompts or output content created by Microsoft’s generative AI solutions. In addition, no Customer Data (including prompts or output content) is used to train foundation models without customer permission.

• As the regulatory landscape evolves and we innovate to provide new kinds of AI solutions, Microsoft will continue to offer industry-leading tools, resources and support to demonstrate our enduring commitment to meeting the needs and demands of our customers in their AI journey.
1 This guide applies to the use of our paid enterprise services for Copilot for Microsoft 365 and the Azure OpenAI Service. Any references in this guide to “customers” are intended to refer to private corporate entities and/or businesses. The contents of this paper are not applicable to consumers or individuals using Microsoft solutions in their personal capacity. Microsoft has also produced a version of this white paper for public sector customers, a copy of which can be accessed at the following link: GDPR and Generative AI - A Guide for the Public Sector.
Introduction
In today’s rapidly evolving business landscape, industries are increasingly pressured to innovate, achieve greater efficiency, and enhance customer experiences. This is driving organizations to seek a competitive edge by utilizing the potential of generative AI solutions. By automating routine tasks, providing deep analytical insights, and enabling real-time decision-making, generative AI solutions can help businesses stay competitive and responsive to market dynamics.

There is no doubt that AI is poised to shape the future of how organizations operate. The business value of AI is clear: it helps organizations operate efficiently, perform better, achieve more, and gain the insights required to make better decisions. In addition, investment in AI solutions has been shown to positively impact an organization’s bottom line.2

Generative AI solutions can optimize your organization at every level and uncover new valuable opportunities within your business. Delivering this type of impact with AI innovation needs to be balanced by ensuring that your organization selects efficient and trustworthy AI solutions and that these are implemented in a responsible and secure manner, taking into account the need to safeguard personal data.

At Microsoft we want to empower our customers to harness the full potential of new technologies like generative AI, while complying with their obligations under the General Data Protection Regulation (GDPR) to ensure the privacy and security of their data.

We have a long-standing practice of protecting our customers’ information. Our approach to Responsible AI is built on a foundation of privacy, and we remain dedicated to upholding core values of privacy, security, and safety in all our generative AI products and solutions. As the use of AI solutions expands, our customers can be confident that their valuable data is safeguarded by industry-leading data governance and privacy practices in one of the most trusted clouds on the market today. Customers can rest assured that the privacy commitments they have long relied on when using our enterprise cloud products also apply to our enterprise generative AI solutions that are backed by Microsoft’s Data Protection Addendum, including Copilot for Microsoft 365 and Azure OpenAI Service.

As an industry and thought leader in AI, we have developed this paper to address specific concerns relating to the GDPR-compliant use of Copilot for Microsoft 365 and the Azure OpenAI Service for customers in Europe, and to demonstrate how our AI solutions can be embraced in a GDPR-compliant manner.
2 For every $1 a company invests in AI, it is realizing an average return of $3.50, and it takes on average 14 months for organizations to realize a return on their AI investment. Source: IDC, The Business Opportunity of AI, November 2023.
This paper is set out as follows:
Part 1 examines the meaning of responsible AI, the six key principles and approach to responsible AI that guide Microsoft’s development of AI products, and demonstrates the tools and resources Microsoft offers to assist your AI deployment.

Part 5 concludes the paper, reflecting on the insights shared and the future trajectory of AI and data protection regulation.

Appendix 1
Part 1:
Responsibly using AI -
Microsoft’s AI journey and leveraging our tools and resources
Part 2:
The GDPR Compliance
Framework in the Context of AI
What is the GDPR and who does it apply to?

The General Data Protection Regulation, also known as the “GDPR”,3 sets an important bar globally for privacy rights, information security, and compliance. At Microsoft, we value privacy as a fundamental right, and we believe that the GDPR plays an important role in protecting and enabling the privacy rights of individuals.

Microsoft is committed to its own compliance with the GDPR, and to providing an array of products, features, documentation, and resources to support our customers in meeting their compliance obligations under the GDPR.

The GDPR is in force in the UK and all EU countries and imposes a set of data protection rules on the processing of personal data, with the goals of protecting the fundamental rights of data subjects, creating a level playing field for the processing of personal data, and furthering the internal market.

Any organization that processes the personal data of data subjects residing in Europe is subject to the GDPR. National laws also incorporate data protection rules and guidelines; these are generally adapted to meet and/or exceed the requirements of the GDPR.

Leverage established principles to comply with regulatory frameworks when using AI solutions

When considering GDPR compliance in the context of using generative AI services, the fundamental principles of the GDPR apply in the same manner as they do for processing personal data in any other context, including when using the cloud. So, while the AI technology may be new, the principles and accordingly the processes for risk assessment and compliance with the GDPR remain the same.

It is also helpful to recognize that the GDPR was drafted to be technology-agnostic and therefore does not prevent organizations from embracing opportunities to use generative AI. As such, applying established GDPR assessment processes is a great way for organizations to harness the revolutionary potential of AI and deliver great outcomes, while safeguarding people’s privacy and wellbeing. Microsoft has a long-standing history of collaborating with and assisting its customers in pursuit of their digital transformation priorities while complying with the requirements of the GDPR, including in relation to the transition from on-premises to cloud computing. Customers can approach Microsoft’s generative AI solutions by leveraging the approach they have taken when using our cloud services.

Cloud computing is essential for accessing this potentially groundbreaking AI technology, and the hyper-scale cloud is, therefore, the foundation for deploying AI. Azure’s enterprise-grade protections, which form part of Copilot for Microsoft 365 and the Azure OpenAI Service, provide a strong foundation upon which customers can build their data privacy, security, and compliance systems to confidently scale AI while managing risk and ensuring compliance with the GDPR.

3 For the purpose of this paper, any references to the EU GDPR also apply to the UK GDPR.
Who is responsible for GDPR compliance when using AI and cloud services?

Under the GDPR, there are two key parties, each with a separate set of compliance responsibilities:

• The Data Controller: The data controller decides why and how personal data is processed and is the entity that is the principal subject of the obligations imposed by the GDPR. Many of these obligations apply from the moment this entity starts to collect personal data about individuals.

• The Data Processor: In contrast, under the GDPR, the data processor is essentially a subcontractor to the data controller, processing personal data on behalf of and upon instruction from the data controller.

Organizations can act as both data controllers and data processors in the GDPR context. When using Microsoft’s generative AI services, Microsoft’s Product Terms indicate whether Microsoft is providing an Online Service as a data processor or a data controller. Most of the Online Services, including generative AI services, are provided by Microsoft as a data processor and are governed by the Data Protection Addendum. For further details on specific products and services, consult the Microsoft Product Terms.

How does Microsoft support customers with their GDPR compliance obligations?

As more businesses seek to leverage generative AI, many are looking to Microsoft not only as a service provider, but as a trusted partner on their journey to meeting their compliance obligations under the GDPR.

The first step towards compliance is understanding how Microsoft’s generative AI services work, including how they process personal data. Our comprehensive transparency documentation and information help you understand how our AI tools work and what choices our customers can make to influence system performance and behaviour.

In Part 3 and Part 4 of this paper we provide specific information and links to additional resources which you can use to help enhance your understanding of these products and services.

Jump to Part 3 to find out more about Copilot for Microsoft 365.

Jump to Part 4 to find out more about Azure OpenAI Service.

This knowledge provides the foundation for compliance with a number of key obligations under the GDPR. We will explore these key obligations and the associated support that Microsoft offers customers later in this Part 2, but first we will address the seven core privacy commitments which Microsoft offers to its customers in the AI era.
Protecting the data of our customers – Microsoft’s privacy commitments in the AI era

Microsoft’s existing privacy commitments extend to our AI commercial products, as explained in a blog post from our Chief Privacy Officer Julie Brill. You can rest assured that the privacy commitments you have long relied on when using our enterprise cloud products also apply to our enterprise generative AI solutions that are backed by Microsoft’s Data Protection Addendum, including Copilot for Microsoft 365 and Azure OpenAI Service.

The following seven commitments apply to “Customer Data”, which is defined in Microsoft’s Product Terms as all data, including all text, sound, video, or image files, and software, that are provided to Microsoft by, or on behalf of, our customers through use of an online service. All inputs (including prompts)4 and output content5 are Customer Data. In accordance with Microsoft’s Data Protection Addendum, the customer “retains all right, title and interest in and to Customer Data”.

1. We will keep your organization’s data private.

Your data remains private when using Copilot for Microsoft 365 and Azure OpenAI Service and is governed by our applicable privacy and contractual commitments, including the commitments we make in Microsoft’s Data Protection Addendum and Microsoft’s Product Terms.

2. You are in control of your organization’s data.

Your data is not used in undisclosed ways or without your permission. You may choose to customize your use of Copilot for Microsoft 365 or Azure OpenAI Service, opting to use your data to fine-tune models for your organization’s own use. If you do use your organization’s data to fine-tune, any fine-tuned AI solutions created with your organization’s data will be available only to you.

3. Your access control and enterprise policies are maintained.

To protect privacy within your organization when using enterprise products with generative AI capabilities, your existing permissions and access controls will continue to apply to ensure that your organization’s data is displayed only to those users to whom you have given appropriate permissions.

4. Your organization’s data is not shared.

Microsoft does not share your data with third parties without your permission. Your data, including the data generated through your organization’s use of Copilot for Microsoft 365 or Azure OpenAI Service – such as prompts and responses – is kept private and is not disclosed to third parties.

5. Your organization’s data privacy and security are protected by design.

Security and privacy are incorporated through all phases of design and implementation of Copilot for Microsoft 365 and Azure OpenAI Service. As with all our products, we provide a strong privacy and security baseline and make available additional protections that you can choose to enable. As external threats evolve, we will continue to advance our solutions and offerings to ensure world-class privacy and security in Copilot for Microsoft 365 and Azure OpenAI Service, and we will continue to be transparent about our approach.

6. Your organization’s data is not used to train foundation models.

Microsoft’s generative AI solutions, including Copilot for Microsoft 365 and Azure OpenAI Service capabilities, do not use Customer Data to train foundation models without your permission. Your data is never available to OpenAI or used to improve OpenAI models.

7. Our products and solutions comply with global data protection regulations.

The Microsoft AI products and solutions you deploy are compliant with today’s global data protection and privacy regulations. As we continue to navigate the future of AI together, including the implementation of the EU AI Act and other global laws, organizations can be certain that Microsoft will be transparent about our privacy, safety, and security practices. We will comply with global laws that govern AI, and back up our promises with clear contractual commitments.

You can find additional details about how Microsoft’s privacy commitments apply to Azure OpenAI and Copilot for Microsoft 365 here and in the FAQ: Protecting the Data of our Commercial and Public Sector Customers in the AI Era.

4 “Inputs” means all Customer Data that the customer provides, designates, selects, or inputs for use by a generative artificial intelligence technology to generate or customize an output, including any customer prompts.

5 “Output Content” means any data, text, sound, video, image, code, or other content generated by a model in response to Input.
Key obligations under the GDPR in the context of generative AI services

There are a number of obligations under the GDPR which organizations need to consider when procuring generative AI services. This section considers some of the key obligations and what associated support and resources Microsoft can offer to your organization to help you comply.

Articles 12 to 14 of the GDPR (Transparency)

How we help you comply: The information set out in this paper and available in our transparency resources noted below is intended to assist your understanding of how Copilot for Microsoft 365 and Azure OpenAI Service process data and the extent to which additional information (if any) needs to be communicated to data subjects. Additional product-specific information is available at Data, Privacy and Security for Azure OpenAI Service; Data, Privacy and Security for Microsoft Copilot for Microsoft 365; Copilot in Dynamics 365 and Power Platform; and FAQs for Copilot data security and privacy for Dynamics 365 and Power Platform.

Articles 15 to 21 of the GDPR (Data Subject Rights)

Under the GDPR, data controllers must ensure they are in a position to comply with their obligation to respond to requests from data subjects relating to the exercise of their rights under Articles 15 to 21 of the GDPR, with appropriate assistance from data processors where necessary.

How we help you comply: Microsoft has developed additional solutions to assist its customers when responding to data subject rights requests, such as Microsoft Purview and Purview eDiscovery. The features of these products empower our customers to proactively govern their AI usage and adhere to evolving regulatory requirements. This can be valuable, for instance, to improve efficiency in responding to and actioning requests in relation to the “right to access personal data” and the “right to be forgotten” that apply under Articles 15 and 17 of the GDPR.

Learn more about Microsoft Purview and its features and how these tools can assist you in the deployment of Microsoft’s generative AI solutions.
Article 28 of the GDPR (Processor Obligations)

The GDPR requires that, where an organization acts as a data controller, it only uses data processors to process personal data on its behalf where those processors provide sufficient guarantees to meet key requirements of the GDPR. These key requirements are described in Article 28 of the GDPR and include that data processors commit to:

• only use subprocessors with the consent of the data controller and remain liable for subprocessors;

6 Guidelines 07/2020 on the concepts of controller and processor in the GDPR.
Article 32 of the GDPR (Technical and Organizational Security Measures)

Article 32 of the GDPR requires data controllers and data processors to implement appropriate technical and organizational measures to ensure a level of security appropriate to the risk, taking into account the nature, scope, context and purposes of the processing of personal data. These measures should address the risks associated with accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to personal data transmitted, stored or otherwise processed.

How we help you comply: In the “Data Security” section of Microsoft’s Data Protection Addendum, Microsoft contractually commits to implement and maintain appropriate technical and organizational measures to protect “Customer Data” and “Personal Data” against accidental or unlawful destruction, loss, alteration, unauthorized disclosure of, or access to, such data transmitted, stored or otherwise processed. Those technical measures are set forth in Microsoft’s Security Policy and comply with ISO 27001, ISO 27002 and ISO 27018. Microsoft also contractually commits to encrypting ‘Customer Data’ (including any ‘Personal Data’ contained therein), in transit (including between Microsoft data centers) and at rest. Appendix A – Security Measures to Microsoft’s Data Protection Addendum also contains comprehensive commitments from Microsoft regarding the security of Customer Data, including in relation to the Organization of Information Security, Asset Management, Human Resources Security, Physical and Environmental Security, Communications and Operations Management, Information Security Incident Management, and Business Continuity Management.

The technical, organizational, and security measures described above apply to any Customer Data that customers provide or create when using Copilot for Microsoft 365 and Azure OpenAI Service. You can refer to the information set out above to demonstrate the commitment and measures taken by Microsoft to protect Customer Data (including personal data).

Jump to Part 3 to find out more about Security for Copilot for Microsoft 365.
Article 35 of the GDPR (Data Protection Impact Assessments)

Article 35 of the GDPR requires data controllers to undertake a data protection impact assessment (DPIA) when processing personal data is likely to result in a high risk to the rights and freedoms of data subjects (particularly when this involves using new technologies).

When assessing whether a DPIA is required, data controllers need to take into account the nature, scope, context and purposes of the processing. Therefore, whether a DPIA is required for the use of Copilot for Microsoft 365 and Azure OpenAI Service will depend on the particular use case and type of personal data which you wish to process using these services.

Learn more about when a DPIA must be completed.

A DPIA must contain at least:

(a) a systematic description of the envisaged processing operations and the purposes of the processing;

(b) an assessment of the necessity and proportionality of the processing operations in relation to the purposes;

(c) an assessment of the risks to the rights and freedoms of data subjects; and

(d) the measures envisaged to address the risks, including safeguards, security measures and mechanisms to ensure the protection of personal data and to demonstrate compliance with the GDPR, taking into account the rights and legitimate interests of data subjects and other persons concerned.

Learn more about the contents of a DPIA.

Even if it is not legally required, a DPIA is good practice and can help you work through the specific data protection risks associated with the implementation of Copilot for Microsoft 365 and/or Azure OpenAI Service for a specific use case. Preparing a DPIA may also assist you in meeting your accountability obligations under Article 5(2) of the GDPR.

How we help you comply: The information contained in this paper and the additional resources to which it refers can assist you with completing a DPIA. In particular, the information in:

• Part 3 and Part 4 relating to how Copilot for Microsoft 365 and Azure OpenAI Service process data will assist with completing the elements described in (a) above; and
Articles 44 to 50 of the GDPR (Transfers of Personal Data to Third Countries)

The GDPR permits personal data to be transferred to a third country outside of the EU or EEA (including the US) where certain conditions have been satisfied. These conditions include where there has been an adequacy decision by the European Commission or where appropriate additional safeguards (such as the EU Standard Contractual Clauses) have been put in place.

For customers in the UK, the UK GDPR permits personal data to be transferred to a third country outside of the UK (including the US) where certain conditions have been satisfied. These conditions include where there has been an adequacy decision by the UK Secretary of State or where appropriate additional safeguards (such as the International Data Transfer Addendum to the EU Commission Standard Contractual Clauses (“UK Addendum”)) have been put in place.

How we help you comply: All transfers of personal data by Microsoft outside of the UK, EU or EEA will be subject to a valid transfer mechanism under the GDPR, including transfers to the US.

The EU Commission and the UK Secretary of State have announced adequacy decisions finding that (for the purpose of Article 45 of the GDPR) the US ensures an appropriate level of protection for personal data transferred from the UK or EU to organizations in the US that are certified to the EU-U.S. Data Privacy Framework. Microsoft is certified under the EU-U.S. Data Privacy Framework and the commitments it entails. Microsoft is committed to embracing the framework and will go beyond it by meeting or exceeding all the requirements this framework outlines for our customers.

Microsoft also enables customers to store and process their data within the EU as specified in Microsoft’s Data Protection Addendum and the Microsoft Product Terms, reducing transfers of personal data to third countries and thereby simplifying GDPR compliance for such transfers. Both Copilot for Microsoft 365 and the Azure OpenAI Service are EU Data Boundary services.

The EU Data Boundary is a geographically defined boundary (consisting of the countries in the EU and the European Free Trade Association) within which Microsoft has committed to store and process Customer Data (including any personal data) for certain enterprise online services. The EU Data Boundary uses or may use Microsoft datacenters announced or currently operating in Austria, Belgium, Denmark, Finland, France, Germany, Greece, Ireland, Italy, Netherlands, Norway, Poland, Spain, Sweden, and Switzerland. In the future, Microsoft may establish datacenters in additional countries located in the EU or EFTA to provide EU Data Boundary Services.

There are limited exceptions to the EU Data Boundary that may result in Microsoft processing Customer Data (including personal data) outside of the EU Data Boundary. Where this is the case, Microsoft relies on compliant data transfer mechanisms as set out in the GDPR. Further details relating to these limited circumstances can be found in the Microsoft Product Terms.

Learn more about the EU Data Boundary.

Jump to Part 3 to find out more about Data Residency for Copilot for Microsoft 365.

Jump to Part 4 to find out more about Data Residency for Azure OpenAI Service.
How does the GDPR interact with the AI Act?

The GDPR and the AI Act are intended to be complementary and operate alongside each other, providing a regulatory framework for AI products and services. The GDPR, which regulates the processing of personal data by data controllers and data processors, focuses on data privacy and aims to give individuals control over their personal data.

The AI Act, which applies to providers, importers, distributors, users, and others involved in the AI lifecycle, aims to ensure that AI systems that are used in the EU respect fundamental rights, safety, and ethical principles, as well as address certain risks related to the most highly capable general-purpose AI models.

Find out more about the AI Act and its interaction with the GDPR in Appendix 2: Frequently Asked Questions (FAQs).

Our continued compliance with data protection regulation and open dialogue with key regulators in Europe and across the globe

As privacy and data protection laws advance and norms and requirements evolve in Europe and across the globe, you can be certain that Microsoft will be transparent about our privacy, safety, and security practices. We will comply with laws in Europe and globally that govern AI, and back up our promises with clear contractual commitments.

Beyond adhering to the GDPR and other regulatory requirements applicable to us, Microsoft prioritizes an open dialogue with its customers, partners, and regulatory authorities to better understand and address evolving privacy and data protection concerns.

We continue to work closely with data protection authorities and privacy regulators around the world to share information about how our AI systems work, thereby fostering an environment of trust and cooperation.
16
Part 3:
Copilot for Microsoft 365
Understanding the potential of generative AI services and how these products and services operate and use personal data is the foundation for compliance with a number of obligations under the GDPR. This Part 3 provides information and links to various external resources which can help you understand how Copilot for Microsoft 365 operates, and provides key information about the product and its features which can be used to assist with completion of a DPIA or other data protection assessment/analysis.

What is Copilot for Microsoft 365 and how does it work?

Copilot for Microsoft 365 is an AI-powered productivity tool that uses Large Language Models (LLMs) to work alongside popular Microsoft 365 apps such as Word, Excel, PowerPoint, Outlook, Teams, and more. Copilot for Microsoft 365 provides real-time, intelligent assistance which enables users to enhance their creativity, productivity, and skills.

Copilot for Microsoft 365 is built on top of the same cloud infrastructure as the Microsoft 365 applications, and applies the same principles of confidentiality and privacy to Customer Data that Microsoft has leveraged for years. Copilot for Microsoft 365 adheres to all existing privacy, security, and compliance commitments that apply to Microsoft 365, including Microsoft's GDPR commitments as set out in Microsoft's Data Protection Addendum and in relation to the EU Data Boundary.

Copilot for Microsoft 365 uses the organizational content in your Microsoft 365 tenant (including users' calendars, emails, chats, documents, meetings, contacts, and more) only in accordance with existing access permissions. The richness of the Copilot for Microsoft 365 experience depends on the data sources indexed by Microsoft 365. Customers with the most abundant data in Microsoft 365 (Exchange, OneDrive, SharePoint, Teams) will get the best results from Copilot. With access to comprehensive organizational data, Copilot can suggest more relevant and personalised content based on the user's work context and preferences.

Copilot responds to prompts from your users. A "prompt" is the term used to describe how you ask Copilot for Microsoft 365 to do something for you, such as creating, summarising, editing, or transforming. Think about prompting like having a conversation, using plain but clear language and providing context like you would with an assistant.

When Copilot for Microsoft 365 uses content from the organization's Microsoft 365 tenant to augment the user's prompt and enrich the response, as described above, this is called "grounding". Grounding is different to training: no Customer Data is used to train the LLM. In fact, the LLM is stateless, meaning that it retains no information about the prompt that was submitted to it, nor any Customer Data that was used to ground it, nor any responses it provided.

Copilot for Microsoft 365 leverages an instance of a foundation LLM hosted in Azure OpenAI. Copilot for Microsoft 365 does not interact with any services operated by OpenAI (e.g. ChatGPT, or the OpenAI API). OpenAI is not a sub-processor to Microsoft, and Customer Data (including the data generated through your organization's use of Copilot for Microsoft 365, such as prompts and responses) is not shared with third parties without your permission.

To get the best responses and the most out of Copilot for Microsoft 365, it's important that you input suitable prompts and avoid certain common pitfalls. Learn more about the skill of prompting: the art and science of prompting (the ingredients of a prompt) and prompting do's and don'ts.

Copilot for Microsoft 365 is:

• built on Microsoft's comprehensive approach to security, compliance, and privacy;

• designed to protect tenant, group, and individual data; and

• committed to responsible AI.

Get an inside look at how LLMs work when you use them with your data in Microsoft 365. Learn more about Copilot for Microsoft 365.

Learn about how Copilot can be used in your favourite Microsoft apps by visiting the Copilot Lab.

You can also find more detailed information about Copilot for Microsoft 365 in our Learn portal.
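The grounding flow described above can be pictured with a short sketch. This is not Microsoft's implementation: the names, data structures, and retrieval logic below are hypothetical and only illustrate the pattern the text describes, in which only content the user already has permission to access is retrieved, the prompt is augmented with it, and a stateless model retains nothing between requests.

```python
from dataclasses import dataclass, field

@dataclass
class Document:
    """Illustrative stand-in for a piece of tenant content."""
    title: str
    text: str
    allowed_users: set = field(default_factory=set)  # existing access permissions

def ground_prompt(user: str, prompt: str, tenant_docs: list) -> str:
    """Hypothetical grounding step: augment the user's prompt with tenant
    content, honouring the user's existing access permissions."""
    # Only content the requesting user can already access is retrieved.
    context = [d.text for d in tenant_docs if user in d.allowed_users]
    # The augmented prompt would then be sent to a stateless LLM, which
    # retains nothing about the prompt, the grounding data, or the response.
    return prompt + "\n\nContext:\n" + "\n".join(context)

docs = [
    Document("Q3 plan", "Roadmap details...", allowed_users={"alice"}),
    Document("HR file", "Confidential notes...", allowed_users={"hr-team"}),
]
grounded = ground_prompt("alice", "Summarise the Q3 plan", docs)
# The "HR file" content is excluded: alice has no permission to access it.
```

The key design point mirrored here is that permission trimming happens before the model ever sees the data, which is why Copilot can only generate responses from information the particular user is entitled to access.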
Copilot and your privacy

• Copilot in Windows: Learn more about how Copilot uses your data to assist you on your Windows device. Learn more about your data and privacy.

• Copilot Pro (home users): Learn more about how Copilot uses your data in Microsoft 365 apps at home. Read about Microsoft 365 apps and your privacy.

• Copilot for Microsoft 365 (IT Pros/admins): Learn more about how your organizational data is used and protected when using Copilot with Microsoft 365. Get details about data, privacy, and security.
How does Copilot for Microsoft 365 use personal data?

Copilot for Microsoft 365 provides value by connecting Microsoft's LLMs to your organizational data. Copilot for Microsoft 365 accesses content and context to generate responses anchored in your organizational data, such as user documents, emails, calendar, chats, meetings, and contacts. Copilot for Microsoft 365 combines this content with the user's working context, such as the meeting a user is currently attending, email exchanges the user had on a topic, or chat conversations the user had in a given period. Copilot for Microsoft 365 uses this combination of content and context to help provide accurate, relevant, and contextual responses to the user's prompts.

Copilot for Microsoft 365 can reference web content from the Bing search index to ground user prompts and responses. Based on the user's prompt, Copilot for Microsoft 365 determines whether it needs to query web content through Bing to help provide a relevant response to the user. Admin controls are available to manage the use of web content.

Abuse monitoring for Copilot for Microsoft 365 occurs in real-time, without providing Microsoft any standing access to Customer Data, either for human or for automated review. While abuse moderation, which includes human review of content, is available for Azure OpenAI Service, it is not required for Copilot for Microsoft 365.

Microsoft will collect and store data about user interactions with Copilot for Microsoft 365. This will include the user's prompt, how Copilot responded, and the information used to ground Copilot's response ("Content Interactions"). Customer admins can view, manage, and search your organization's Content Interactions. It may be necessary to update the privacy notices for your organization's users to ensure they appropriately capture any processing of personal data by admins in this context. See Part 2 for further details of the transparency obligations under the GDPR.

It is important to Microsoft that our customers' data belongs to our customers. Microsoft does not claim ownership of the content created by Copilot for Microsoft 365. All Content Interactions, including user prompts and any output data/content, qualify as "Customer Data" under our Product Terms and Microsoft's Data Protection Addendum.

All Customer Data processed by Copilot for Microsoft 365 is processed and stored in alignment with the contractual commitments that apply to your organization's other content in Microsoft 365.

Copilot for Microsoft 365 does not use Customer Data to train foundation models without the customer's permission.
Security for Copilot for Microsoft 365
As noted in Part 2, the GDPR requires data controllers and data processors to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk for any personal data which they process.

The same security and compliance terms apply, by default, to Copilot for Microsoft 365 as already apply to your organization's use of Microsoft 365. Copilot for Microsoft 365 is hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. Copilot for Microsoft 365 was built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

Copilot for Microsoft 365 also respects each user's access permissions to any content that it retrieves. This is important because Copilot for Microsoft 365 will only generate responses based on information the particular user has permission to access.

Microsoft already implements multiple forms of protection to help prevent customers from compromising Microsoft 365 services and applications or gaining unauthorized access to other tenants or the Microsoft 365 system itself.

• Logical isolation of Customer Data within each tenant for Microsoft 365 services is achieved through Microsoft Entra authorization and role-based access control. Learn more about Microsoft 365 isolation controls.

• Microsoft uses rigorous physical security, background screening, and a multi-layered encryption strategy to protect the confidentiality and integrity of Customer Data.

• Microsoft 365 uses service-side technologies that encrypt Customer Data both at rest and in transit, including BitLocker, per-file encryption, Transport Layer Security (TLS), and Internet Protocol Security (IPsec). To learn more about encryption in Microsoft 365, see Encryption in the Microsoft Cloud.

• Your control over your organization's data is reinforced by Microsoft's commitment to comply with broadly applicable privacy laws, including the GDPR, and privacy standards such as ISO/IEC 27018, the world's first international code of practice for cloud privacy.

• For content accessed through Copilot for Microsoft 365 plug-ins, encryption can exclude programmatic access, thus limiting the plug-in from accessing the content. Learn more about configuring usage rights for Azure Information Protection.

• As generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modeling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

• We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modeling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and ensure we have proper mitigation strategies in place.

Learn more about Data, Privacy, and Security for Copilot for Microsoft 365.

As we explained in Part 2 of this paper, Copilot for Microsoft 365 is an EU Data Boundary Service.

Learn more about the EU Data Boundary.

When you store data generated by Copilot for Microsoft 365 in Microsoft 365 products that already have data residency commitments under the Product Terms, the applicable commitments will be upheld.

Copilot for Microsoft 365 has been added as a covered workload in the data residency commitments in the Microsoft Product Terms. The Microsoft Advanced Data Residency (ADR) and Multi-Geo Capabilities offerings also include data residency commitments for Copilot for Microsoft 365 customers.
Part 4:
Azure OpenAI Service
Azure OpenAI Service can be used in the following ways:

• Prompt engineering: Prompt engineering is a technique that involves designing prompts for LLMs. Prompts are submitted by the user, and content is generated by the service via the completions, chat completions, images, and embeddings operations. This process improves the accuracy and relevance of responses, optimizing the performance of the model. Learn more about prompt engineering.

• Azure OpenAI On Your Data: When using the "on your data" feature, the service retrieves relevant data from a configured Customer Data store and augments the prompt to produce generations that are grounded with your data.

Azure OpenAI "on your data" enables you to run supported LLMs on your organization's data without needing to train or fine-tune models. Running models on Customer Data enables you to analyze your data with greater accuracy and speed. By doing so, you can unlock valuable insights that can help you make better decisions, identify trends and patterns, and optimize your operations.

One of the key benefits of Azure OpenAI "on your data" is its ability to tailor the content of conversational AI. Because the model within Azure OpenAI Service has access to, and can reference, specific sources to support responses, answers are based not only on its pre-trained knowledge but also on the latest information available in the designated data source. This grounding data also helps the model avoid generating responses based on outdated or incorrect information.

Learn more about Azure OpenAI On Your Data.

• Azure OpenAI fine-tuning: You can provide your own training data, consisting of prompt-completion pairs, for the purposes of fine-tuning an OpenAI model. This process fine-tunes an existing LLM using example data. Fine-tuning refers to the process of retraining pre-trained models on specific datasets, typically to improve model performance on specific tasks or to introduce information that wasn't well represented when the base model was originally trained. The outcome is a new "custom" LLM that has been optimized for the customer using the provided examples.

Training data and fine-tuned models:

1. Are available exclusively for use by your organization.

2. Are stored within the same region as the Azure OpenAI resource.

3. Can be deleted by the customer at any time.

When you upload custom data to fine-tune the results of the LLM, both the Customer Data and the results of the fine-tuned model are maintained in a protected area of the cloud, stored in your tenant, accessible only by your organization and separated by robust controls to prevent any other access. The Customer Data and results can additionally be encrypted by either Microsoft-managed or customer-managed encryption keys in a Bring Your Own Key format if a customer so chooses.

In most instances, Microsoft can support and troubleshoot any problems with the service without needing access to any Customer Data (such as the data that was uploaded for fine-tuning). In the rare cases where access to Customer Data is required, whether in response to a customer-initiated support ticket or a problem identified by Microsoft, you can assert control over access to that data by using Customer Lockbox for Microsoft Azure. Customer Lockbox gives customers the ability to approve or reject any access request to their Customer Data.

Learn more about Azure OpenAI fine-tuning.

Whether content is used to ground prompts using the "on your data" feature, or to build a fine-tuned model, the Customer Data is not used to train the foundation LLM. In fact, the LLM is stateless, meaning that it retains no information about the prompt that was submitted to it, nor any Customer Data that was used to ground it, nor any responses it provided. The LLM is not trained and does not learn at any point during this process; it is exactly the same foundation model even after millions of prompts are run through it.

You can find detailed information about Azure OpenAI Service through the Azure OpenAI Service documentation, quickstarts, and API reference guides.
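As a concrete illustration of the prompt-completion training format mentioned above, fine-tuning data is typically supplied as a JSONL file, one JSON record per line. The file name and examples below are hypothetical, and the exact schema a given model expects should be checked against the Azure OpenAI fine-tuning documentation; this sketch only shows the general shape of the data.

```python
import json

# Hypothetical prompt-completion pairs for a ticket-classification task.
examples = [
    {"prompt": "Classify this ticket: 'Cannot sign in'", "completion": "authentication"},
    {"prompt": "Classify this ticket: 'Invoice total is wrong'", "completion": "billing"},
]

# Fine-tuning data is written as JSONL: one self-contained JSON object per line.
with open("training_data.jsonl", "w", encoding="utf-8") as f:
    for example in examples:
        f.write(json.dumps(example) + "\n")

# Sanity-check that every line parses back into a prompt-completion record.
with open("training_data.jsonl", encoding="utf-8") as f:
    records = [json.loads(line) for line in f]
```

A file like this is what a customer would upload to their own Azure OpenAI resource; as the text notes, it remains the customer's data and is used only to produce the customer's custom model, not to train the foundation LLM.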
Preventing abuse and harmful content generation

To reduce the risk of harmful use of Azure OpenAI Service, both content filtering and abuse monitoring features are included.

Content filtering is the process by which responses are synchronously examined by automated means to determine if they should be filtered before being returned to a user. This examination happens without the need to store any data, and with no human review of the prompts (i.e. the text provided by users as requests) or the responses (i.e. the data delivered back to the user).

Learn more about content filtering.

Abuse monitoring is conducted by a separate process. The data used for abuse monitoring may be accessed only by authorized Microsoft personnel to assist with debugging and to protect against abuse or misuse of the system. The human reviewers are authorized Microsoft employees who access the data via pointwise queries using request IDs, Secure Access Workstations (SAWs), and Just-In-Time (JIT) request approval granted by team managers.

Learn more about abuse monitoring.

This human review may create a challenge for customers, who need to strike a balance between the safety of the system and the risks of external access, even under controlled conditions. To accommodate that balance, Microsoft offers limited access features that allow approved customer use cases to opt out of these human review and data logging processes.

Some customers may want to use Azure OpenAI Service for a use case that involves the processing of sensitive, highly confidential, or legally regulated input data but where the likelihood of harmful outputs and/or misuse is low. These customers may conclude that they do not want, or do not have the right, to permit Microsoft to process such data for abuse detection, as described above, due to their internal policies or applicable law. To address these concerns, Microsoft allows customers who meet additional Limited Access eligibility criteria and attest to specific use cases to apply to disable the Azure OpenAI content management features by completing this form.

If Microsoft approves a customer's request to disable abuse monitoring, then Microsoft does not store any prompts and completions associated with the approved Azure subscription for which abuse monitoring is configured off. In this case, because no prompts and completions are stored at rest in the service results store, the human review process is not possible and is not performed.
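The synchronous content-filtering behaviour described above can be pictured with a simplified sketch. This is an illustrative stand-in rather than Microsoft's filter: the real service uses trained classifier models across several harm categories, whereas the categories, threshold, and scoring below are invented. The point it demonstrates is the flow itself, in which a completion is examined in-line, either returned or withheld, and nothing is persisted.

```python
from typing import Optional

# Invented severity threshold for illustration only.
SEVERITY_THRESHOLD = 0.5

def classify(text: str) -> dict:
    """Hypothetical stand-in for an automated harm classifier."""
    flagged = "attack" in text.lower()
    return {"violence": 0.9 if flagged else 0.0, "hate": 0.0}

def filter_response(completion: str) -> Optional[str]:
    """Synchronously examine a completion before it is returned to the user.
    Nothing is stored and no human reviews the text."""
    scores = classify(completion)
    if any(score >= SEVERITY_THRESHOLD for score in scores.values()):
        return None  # withhold the filtered completion instead of returning it
    return completion
```

Because the check happens synchronously inside the request path, filtering can operate without any standing data store, which is the property the text contrasts with the separate abuse-monitoring process.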
How does the Azure OpenAI Service use personal data?
The diagram below illustrates how your organization's data is processed by Azure OpenAI Service. This diagram covers three different types of processing:

1. How Azure OpenAI Service processes your prompts to generate content (including when additional data from a connected data source is added to a prompt using Azure OpenAI "On Your Data").

2. How Azure OpenAI Service creates a fine-tuned (custom) model with your training data.

3. How Azure OpenAI Service and Microsoft personnel analyze prompts, completions, and images for harmful content and for patterns suggesting the use of the service in a manner that violates the Code of Conduct or other applicable product terms.

Customer prompts (inputs) and completions (outputs), embeddings, and training data:

• are NOT available to other customers.

• are NOT available to OpenAI.

• are NOT used to train foundation models without the customer's permission.

• are NOT used to improve any Microsoft or third-party products or services.

• are NOT used for automatically improving Azure OpenAI models for use in your resource (the models are stateless unless you explicitly fine-tune models with your training data).

Customer fine-tuned Azure OpenAI models are available exclusively for your organization's use.
Security for Azure OpenAI

As noted in Part 2 of this paper, the GDPR requires data controllers and data processors to implement appropriate technical and organisational measures to ensure a level of security appropriate to the risk for any personal data which they process.

Security is built in throughout the development lifecycle of all of our enterprise services (including those that include generative AI technology), from inception to deployment.

Azure OpenAI Service is hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. These services were built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

As generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modelling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modelling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and confirm we have proper mitigation strategies in place.

Learn more about data, privacy, and security for Azure OpenAI Service.

EU Data Boundary and Data Residency

Azure OpenAI Service is an EU Data Boundary service. For the purpose of interpreting the "EU Data Boundary Services" section of the Product Terms, Azure OpenAI Service is an Azure service that enables deployment in a region within the EU Data Boundary.

Learn more about the EU Data Boundary.

In relation to:

• The Azure OpenAI On Your Data feature: Any data sources you provide to ground the generated results remain stored in the data source and location you designate. No data is copied into the Azure OpenAI service.

• Training data and fine-tuned (custom) LLMs: These are stored within the same region as the Azure OpenAI resource in the customer's Azure tenant.

• Abuse monitoring for customers who use Azure OpenAI service in Europe: This review is conducted exclusively by Microsoft employees in the European Economic Area. The data store where prompts and completions are stored is logically separated by customer resource (each request includes the resource ID of the customer's Azure OpenAI resource). A separate data store is located in each region in which Azure OpenAI service is available, and a customer's prompts and generated content are stored in the Azure region where the customer's Azure OpenAI service resource is deployed, within the Azure OpenAI service boundary.
Part 5:
Conclusion
Appendix 1:
Business opportunities arising from generative AI
General Use Cases for Copilot for Microsoft 365

Copilot for Microsoft 365 is designed to enhance operational efficiencies and decision-making across a wide range of industries. This section outlines the most popular and universally applicable use cases for Copilot for Microsoft 365, demonstrating its flexibility and the value it adds to any business operation.

• Personalized Content and Recommendations: Copilot for Microsoft 365 tailors content and recommendations to individual users based on their behaviors, preferences, and past interactions, commonly used in sectors like e-commerce, media, and content delivery. This enhances user engagement and satisfaction, leading to increased loyalty and revenue from personalized experiences.
Industry-Specific Use Cases

This section explores the specific applications of Copilot for Microsoft 365 in three critical industries: legal, banking, and healthcare. By highlighting targeted use cases, we demonstrate Copilot's effectiveness in addressing industry-specific challenges and enhancing core operations.

1. Legal Industry Use Cases

• Contract Review and Analysis: automates the review process by comparing contract clauses against legal standards and previous contracts. This increases efficiency, reduces human error, and ensures compliance with legal standards.

• Litigation Support: assists in organizing and analyzing vast amounts of case-related data to support litigation processes. This saves time and enhances the preparation and presentation of legal arguments.

• Compliance Monitoring: continuously scans for changes in legislation to help firms remain compliant with all relevant laws. This reduces the risk of legal penalties and enhances the firm's reputation for diligence.

2. Banking Industry Use Cases

• Fraud Detection: utilizes AI to monitor transactions in real-time and identify patterns indicative of fraudulent activity. This minimizes financial losses and protects customer trust.

• Risk Assessment: analyzes customer data to predict and mitigate potential risks in lending and investments. This enhances the bank's ability to manage and mitigate risk effectively.

• Regulatory Compliance Tracking: keeps track of all regulatory requirements and ensures the bank complies with financial regulations. This avoids legal penalties and maintains operational integrity.

3. Healthcare Industry Use Cases

• Patient Data Management: manages and secures vast amounts of patient data, facilitating easy access for healthcare providers. This improves the efficiency and confidentiality of patient care.

• Diagnostic Assistance: provides support in diagnosing diseases by analyzing patient data and medical imagery. This enhances the accuracy of diagnoses and the effectiveness of treatment plans.

• Remote Patient Monitoring: monitors patients remotely using data from wearable devices, providing real-time health updates to providers. This reduces hospital readmissions and allows for proactive healthcare management.
Appendix 2:
Frequently Asked Questions (FAQs)
What are the differences between cloud and generative AI services from a GDPR perspective?

The obligations under the GDPR which apply to using cloud computing services are the same as those which apply to using generative AI services. The GDPR requires a risk-based approach to be taken towards the implementation and use of any new technologies.

The level of risk involved will depend on the nature, scope, content, and purpose for which personal data will be used. When using cloud services and/or generative AI services, an organization will need to consider what technical and organisational measures are in place to protect and safeguard the use of personal data, and whether it has appropriate contractual commitments and operational processes in place to comply with its obligations under the GDPR.

Find out more about how Microsoft can assist customers in undertaking this assessment when they are looking to use Copilot for Microsoft 365 and/or Azure OpenAI Service in Part 2 of this paper.

What are the key obligations of the GDPR that apply to generative AI systems?

The obligations under the GDPR will apply whenever a generative AI system uses or otherwise processes personal data.

Key obligations which organizations should consider when procuring and/or implementing generative AI systems include:

• consider whether you need to update your privacy notices to reflect any new processing activity or to clarify activities (Articles 12 – 14);

• ensure you have processes in place to enable you to comply with data subject rights requests (Articles 15 – 21 of the GDPR);

• ensure that any agreement you have with a data processor complies with Article 28 of the GDPR, including in relation to security measures and international transfers;
How does the GDPR interact with the AI Act?

The AI Act is a new law currently being put in place in the EU to regulate AI systems. It will apply to providers, importers, distributors, users, and others involved in the AI lifecycle, aiming to ensure that AI systems used in the EU respect fundamental rights, safety, and ethical principles, and to address certain risks related to the most highly capable general-purpose AI models.

The GDPR and the AI Act are intended to be complementary and operate alongside each other, providing a regulatory framework for AI products and services.

The GDPR, which regulates the processing of personal data by data controllers and data processors, focuses on data privacy and aims to give individuals control over their personal data. Under the AI Act, most of the regulatory burden will fall on providers of high-risk AI systems and general-purpose AI (GPAI) models.

Although the GDPR and the AI Act are different in their scope and purpose, they interact with each other in several ways. For example:

• The GDPR requires data controllers to conduct a DPIA in certain circumstances. The AI Act refers to this obligation and requires users of high-risk AI systems to use certain mandatory user-facing information to comply with their DPIA obligations under the GDPR.

Adopting the measures outlined in this paper for GDPR compliance is therefore complementary to the AI Act and the associated obligations that will apply under this new legislation.

At Microsoft, we are committed to compliance with the EU AI Act. Our multi-year effort to define, evolve, and implement our Microsoft Responsible AI Standard and internal governance has strengthened our readiness. As final requirements under the EU AI Act are defined in more detail, we look forward to working with policymakers to ensure feasible implementation and application of the rules, to demonstrating our compliance, and to engaging with our customers and other stakeholders to support compliance across the ecosystem.

How does Microsoft comply with applicable law?

Microsoft's AI products and solutions are designed and built for compliance with applicable data protection and privacy laws today, including the GDPR.

Microsoft's approach to protecting privacy in AI is underpinned by a commitment to compliance with existing and emerging regulatory and legal obligations globally. We will continue to support meaningful privacy and AI regulation, and believe that the best way to make rapid progress on needed guardrails for AI is to lean into existing legal protections, approaches, and regulatory tools that could be applied to protecting privacy and safety in these systems today.

Does Microsoft share Customer Data with OpenAI/ChatGPT?

No. Your organization's Customer Data, including prompts (inputs) and completions (outputs), your embeddings, and any training data you might provide to the Microsoft Online Services, is not available to OpenAI.

Azure OpenAI Service is fully controlled by Microsoft; Microsoft hosts the OpenAI models in Microsoft's Azure environment, and Azure OpenAI Service does not interact with any services operated by OpenAI (e.g., ChatGPT, or the OpenAI API). OpenAI is not a sub-processor to Microsoft.

Learn more about the underlying OpenAI models that power Azure OpenAI Service.

Can customers share confidential information when using Azure OpenAI or Copilot for Microsoft 365?

Yes. When using Azure OpenAI or Copilot for Microsoft 365, customers may confidently share their confidential information. The foundation models that are accessed via Azure OpenAI Service and Copilot for Microsoft 365 do not use Customer Data for training without permission. These foundation models are stateless and do not store any data, including prompts that a customer inputs and completions that the model outputs. Customers can also trust that their confidential information will not be transmitted to other customers.
How does Microsoft protect security in this new era of AI?

Security is built in throughout the development lifecycle of all of our enterprise services (including those that include generative AI technology), from inception to deployment.

Azure OpenAI Service and Copilot for Microsoft 365 are hosted in Azure infrastructure and protected by some of the most comprehensive enterprise compliance and security controls in the industry. These services were built to take advantage of the security and compliance features that are already well-established in Microsoft's hyperscale cloud. This includes prioritization of reliability, redundancy, availability, and scalability, all of which are designed into our cloud services by default.

Because generative AI systems are also software systems, all elements of our Security Development Lifecycle apply: from threat modeling to static analysis, secure build and operations, use of strong cryptography, identity standards, and more.

We've also added new steps to our Security Development Lifecycle to prepare for AI threat vectors, including updating the Threat Modeling SDL requirement to account for AI and machine learning-specific threats. We put our AI products through AI red teaming to look for vulnerabilities and ensure we have proper mitigation strategies in place.

Learn more about Security for Copilot for Microsoft 365 in Part 3 of this paper, and about Security for Azure OpenAI Service in Part 4 of this paper.

Where will my data be stored and processed?

Your data residency choices will be respected when you use Microsoft's Generative AI products and services that offer local storage and/or processing capabilities.

Azure OpenAI Service and Copilot for Microsoft 365 will process and store your data within EU/EFTA for EU Data Boundary (EUDB) customers, as set forth in the Product Terms and the EU Data Boundary Transparency Documentation.

Do organizations need to develop a customized data protection addendum (DPA)?

No, the GDPR does not require that each data controller have a customized data protection addendum with its data processors. Microsoft's Data Protection Addendum is compliant with the requirements of Article 28 of the GDPR.

It is not viable for hyperscale cloud providers to offer different terms to different customers, as it is the uniformity of the services that makes cloud services more manageable, scalable, secure, and affordable than on-site solutions. In addition, introducing different security measures or standards for different customers could undermine the security of Microsoft's services as a whole. It is therefore not feasible for Microsoft to change its operational processes or create bespoke contractual commitments and/or contractual structures for every customer.

Find out more about Microsoft's data processor obligations in Part 2 of this paper.

Are data transfers to countries outside of the UK, EU or EEA allowed under the GDPR?
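Customers who want to operationalize the data-residency commitments discussed in this FAQ can add a guard to their own provisioning scripts. The following Python sketch is illustrative only: the region set below is a hypothetical example, not the official EU Data Boundary definition, which is set out in the EU Data Boundary Transparency Documentation.

```python
# Illustrative only: this set is NOT the authoritative EU Data Boundary
# region list; consult the EU Data Boundary Transparency Documentation.
EU_EFTA_REGIONS = {
    "westeurope", "northeurope", "francecentral", "germanywestcentral",
    "swedencentral", "switzerlandnorth", "norwayeast", "italynorth",
    "polandcentral", "spaincentral",
}

def assert_in_eu_data_boundary(region: str) -> None:
    """Fail fast in a provisioning pipeline if a chosen Azure region
    would place data outside the EU/EFTA set for an EUDB customer."""
    if region.lower() not in EU_EFTA_REGIONS:
        raise ValueError(f"Region {region!r} falls outside the EU/EFTA set")

assert_in_eu_data_boundary("swedencentral")  # EU region: passes silently
```

A guard like this complements, but does not replace, the contractual residency commitments in the Product Terms.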
Appendix 3:
Additional Resources
Microsoft is committed to providing our customers with clear information about how we use and share data, and the choices they have in managing their data. This Appendix sets out additional resources which you can reference to supplement and expand on the information set out in this paper.
© Microsoft Corporation 2024. All rights reserved.
Microsoft makes no warranties, express or implied, in this document. This document is for informational
purposes only and is provided “as-is.” The document may not contain the most up-to-date information or guidance. Information and views expressed in this document, including references to any of our terms, URLs and other references, may change without notice. You bear the risk of using it. This document is not legal or regulatory
advice and does not constitute any warranty or contractual commitment on the part of Microsoft. You should seek
independent legal advice on your legal and regulatory obligations.
This document does not provide you with any legal rights to any intellectual property in any Microsoft product.
You may copy and use this document for your internal, reference purposes.